WO2018124688A1 - Drone control device for collision avoidance - Google Patents


Info

Publication number
WO2018124688A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
image
omnidirectional
collision avoidance
distance
Prior art date
Application number
PCT/KR2017/015463
Other languages
French (fr)
Korean (ko)
Inventor
이선구
이충구
Original Assignee
이선구
이충구
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이선구, 이충구
Publication of WO2018124688A1 publication Critical patent/WO2018124688A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/04Anti-collision systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • the present invention relates to a collision avoidance drone control device that provides a function of avoiding collision with surrounding objects using an omnidirectional camera system.
  • Recently, drones (unmanned aerial vehicles) have been commercialized and are being used in various fields such as camera photography.
  • These drones are small unmanned aircraft and are generally operated manually, receiving an operator's control signals wirelessly.
  • This method of operating a drone is inefficient in that the operator must always be present, and it also carries the risk that an accident may occur through operator error.
  • To complement this method of operation, drone control systems have been developed that install a distance sensor and a camera on the drone to detect nearby objects and send an alarm to the remote controller when an object with a risk of collision is detected during remote control, as well as unmanned drone control systems that control the drone to fly while avoiding surrounding objects.
  • In detail, omnidirectional camera technology has been applied to vehicle-related fields such as vehicle periphery monitoring and to content production for virtual reality, and research and development are actively under way.
  • Korean Patent Publication No. 10-2015-0025452 (July 12, 2005) is a prior art document related to the present invention; it discloses a flight-capable omnidirectional photographing system.
  • In that invention, the surroundings of the drone are photographed in the air using a plurality of single-viewing-angle cameras and combined into a single 360-degree omnidirectional video, so that a wide photographing range can be expressed as one picture and positions that are difficult to access can be photographed easily, providing a flight-capable omnidirectional photographing system that can produce a variety of videos.
  • In detail, the photographing unit of that system may be provided with six short-viewing-angle camera modules installed in the flying unit; the six camera modules are arranged in the front, rear, left, right, up, and down directions, so that four horizontal photographing areas and two vertical photographing areas can be captured simultaneously. The system then overlaps the common photographing areas of the photographs to generate one 360-degree omnidirectional photograph.
  • Accordingly, the prior invention requires a plurality of cameras installed facing different directions to photograph all directions, and the boundaries between the images stitched from the different cameras appear unnatural to the observer.
  • Moreover, the omnidirectional photographing systems currently mounted on drones must use a plurality of cameras, or a camera with a plurality of external lenses, and merely transmit the captured omnidirectional image to the user's terminal; the image is not used to detect objects at all.
  • The present invention solves the above problems by using a sensor unit that captures all directions with a single lens to accurately detect objects around the drone, and provides a collision avoidance drone control device that autonomously controls the drone to avoid a detected object.
  • A drone control device for collision avoidance according to an embodiment includes: a driving unit for driving the rotary blades of the drone; a sensor unit including at least one omnidirectional camera that captures omnidirectional images with a single external lens; and a processor that image-processes the omnidirectional image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone falls below a predetermined distance, controls the driving unit so that the drone makes an emergency avoidance of the object.
  • The processor may divide the omnidirectional image at intervals along the arc direction and detect objects around the drone by image-processing the divided images.
  • The processor may segment the omnidirectional image such that the segmentation angle gradually increases with distance from the image region that matches the moving direction of the drone.
  • the sensor unit may include a first omnidirectional camera disposed under the drone and a second omnidirectional camera disposed under the first omnidirectional camera.
  • the first omnidirectional camera and the second omnidirectional camera may be stereo cameras.
  • The processor may include an image preprocessor for preprocessing the omnidirectional images to obtain a stereo image, a disparity calculator for stereo-matching the stereo image to obtain disparity information, a segmentation unit for separating the background and foreground of the stereo image based on the disparity information, and an object identification unit for detecting at least one object in the separated foreground.
  • The processor may calculate a first disparity from the first image of the first omnidirectional camera, calculate a second disparity from the second image of the second omnidirectional camera, and calculate the distance between the object and the drone in inverse proportion to the sum of the first and second disparities.
  • The processor may correct the first disparity according to the distance and direction by which the point where the subject is projected in the first image is separated from the focal position, correct the second disparity according to the distance and direction by which the point where the subject is projected in the second image is separated from the focal position, and calculate the distance between the object and the drone based on the corrected first and second disparities.
  • The processor may also convert the first image of the first omnidirectional camera into a first rectangular panoramic image, convert the second image of the second omnidirectional camera into a second rectangular panoramic image, calculate the first disparity from the first rectangular panoramic image and the second disparity from the second rectangular panoramic image, and calculate the distance between the object and the drone based on the first and second disparities.
  • The collision avoidance control apparatus obtains an omnidirectional image by photographing all directions around the drone with a single external omnidirectional lens, image-processes the omnidirectional image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone is less than a predetermined distance, controls the propellers so that the drone makes an emergency avoidance of the object, providing an emergency collision avoidance function that keeps the drone flying safely.
  • Such a collision avoidance control device can monitor all directions around the drone at once using only one, or at most two, omnidirectional cameras; manufacturing cost is reduced because multiple cameras or distance sensors are unnecessary, and since the entire omnidirectional image is captured through a single external omnidirectional lens, there is little distortion and the position of an object can be detected accurately.
  • Furthermore, the collision avoidance control apparatus may analyze the omnidirectional image in consideration of the characteristics of an image photographed through a single external omnidirectional lens, detecting the position of an object photographed in the omnidirectional image more accurately.
  • FIG. 1 is a side view of a drone equipped with a collision avoidance drone control device according to an exemplary embodiment of the present invention.
  • FIG. 2 is a block diagram of a collision avoidance drone control device according to an embodiment of the present invention.
  • FIG. 3 is a cross-sectional view of the omnidirectional camera of the sensor unit according to an embodiment of the present invention.
  • FIG. 4 is a cross-sectional view of the omnidirectional camera of the sensor unit according to another embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a process of providing an emergency object avoidance function by a collision avoidance drone control device according to an exemplary embodiment of the present invention.
  • FIG. 6A is an example of a segmented omnidirectional image according to an embodiment of the present invention.
  • FIG. 6B is an example of a segmented omnidirectional image according to another embodiment of the present invention.
  • FIG. 7 is an internal block diagram of a processor according to an embodiment of the present invention.
  • FIG. 8 is a conceptual diagram illustrating a method of measuring a distance between an object and a drone by a processor according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a process of controlling the focus of the omnidirectional camera by the processor according to an embodiment of the present invention.
  • FIG. 10 is a cross-sectional view of the omnidirectional camera of the sensor unit according to another embodiment of the present invention.
  • FIG. 11 is a side view of a drone equipped with a collision avoidance drone control device according to another embodiment of the present invention.
  • FIG. 12 is an internal configuration diagram of the lower and upper omnidirectional cameras.
  • FIG. 13 is a block diagram of a collision avoidance drone control device according to another embodiment of the present invention.
  • FIGS. 14 and 15 are cross-sectional views of a liquid lens.
  • FIG. 16 shows how a processor sets an optimal movement path of a drone according to an embodiment of the present invention.
  • FIG. 1 is a side view of a drone equipped with a collision avoidance drone control device according to an exemplary embodiment of the present invention.
  • A drone may include a body 10 forming the outer shape of the drone, propellers 20 rotatably disposed on the body 10, and a collision avoidance drone control device that detects objects around the drone and controls the driving of the propellers 20 to avoid a detected object.
  • The collision avoidance control apparatus obtains an omnidirectional image by photographing all directions around the drone with a single external lens, image-processes the omnidirectional image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone is less than a predetermined distance, controls the propellers 20 so that the drone makes an emergency avoidance of the object, thereby providing an emergency collision avoidance function.
  • Such a collision avoidance control device can monitor all directions around the drone at once using only one, or at most two, omnidirectional cameras 160; manufacturing cost is reduced because multiple cameras or distance sensors are unnecessary, and since the entire omnidirectional image is captured through a single external omnidirectional lens 200a, there is little distortion and the position of an object can be detected accurately.
  • Furthermore, the collision avoidance control apparatus may analyze the omnidirectional image in consideration of the characteristics of an image captured through the single external omnidirectional lens 200a, detecting the position of an object photographed in the omnidirectional image more accurately.
  • FIG. 2 is a block diagram of a collision avoidance drone control device according to an embodiment of the present invention.
  • A collision avoidance control apparatus may include an input unit 110, a communication unit 120, a driving unit 130, a memory 140, a power supply unit 150, a sensor unit 160, and a processor 170; in particular, the sensor unit 160 may include at least one omnidirectional camera 200 and other sensors.
  • The units of the collision avoidance control device shown in FIG. 2 are not all essential to implementing the device; depending on the embodiment, some of the units may be omitted, and other units not shown may be further included.
  • the collision avoidance drone control apparatus may include an input unit 110 that detects a user input.
  • For example, the input unit 110 may detect an execution input for turning the power of the collision avoidance drone control device on or off, or an input for turning the emergency object avoidance function on or off.
  • The input unit 110 may include at least one of a gesture input unit that detects a user gesture (for example, an optical sensor), a touch input unit that detects a touch (for example, a touch sensor, a touch key, or a push key), and a microphone that detects a voice input; a user input may also be detected through the user terminal 600 or a remote controller.
  • the collision avoidance drone control device may include a communication unit 120 for wireless communication with the user's terminal.
  • The communication unit 120 may transmit the omnidirectional image captured by the omnidirectional camera 200, and the drone surrounding information obtained by analyzing it, to the terminal.
  • Also, the communication unit 120 may receive a remote drone control signal input by the user through the terminal, and the processor 170 may control the driving unit 130 according to the received remote control signal, providing a remote control function with which the user can steer the drone remotely. However, if there is a risk of collision with an object even during remote control, the processor 170 may execute collision avoidance control in preference to the remote control signal.
  • In particular, the collision avoidance drone control apparatus may transmit the omnidirectional image so that the entire surroundings of the drone can be checked in a single image, improving user convenience.
  • the communication unit 120 may exchange data with a terminal in a wireless manner.
  • The wireless data communication method may be any of various methods such as Bluetooth, WiFi, WiFi Direct, APiX, LTE, or NFC.
  • Next, the collision avoidance drone control device may include a driving unit 130 that supplies power to the propellers 20 and controls the rotational direction of the propellers 20 to control the moving direction and moving speed of the drone.
  • Although the collision avoidance drone control device has been described as directly including the driving unit 130, an embodiment is also possible in which the drone has a separate driving unit 130 and the collision avoidance drone control device controls it by transmitting control signals to the driving unit 130 through an interface.
  • the driver 130 includes a power source driver 130, and the power source driver 130 may perform electronic control of a power source in the drone.
  • the power source driver 130 may control the motor. Thereby, the rotation speed, torque, etc. of a motor can be controlled.
  • the steering driver 130 may perform electronic control of a steering apparatus in the drone.
  • The steering driver 130 may change the direction of travel of the drone through the inclination of the propellers 20 and separate power control of the individual propellers 20.
  • When a collision-risk object is detected, the processor 170 may determine a traveling direction that avoids the object and control the steering driver 130 so that the drone advances in the detected traveling direction.
  • Next, the collision avoidance drone controller may include a memory 140, and the memory 140 may store various data for the overall operation of the controller, such as programs for processing or control by the processor 170.
  • the memory 140 may store a plurality of application programs or applications that are driven by the collision avoidance drone controller, data for operating the collision avoidance drone controller, and instructions.
  • the application program may be stored in the memory 140 and may be driven by the processor 170 to perform an operation (or function) of the collision avoidance drone controller.
  • the memory 140 may store data for identifying an object included in the omnidirectional camera 200 image.
  • the memory 140 may store data for identifying what the object corresponds to by a predetermined algorithm when a predetermined object is detected in the image acquired through the camera.
  • For example, the memory 140 may store data for identifying, by a predetermined algorithm, what an object corresponds to when a predetermined object such as a tall building, an aircraft, or a bird is included in the image acquired by the camera.
  • the collision avoidance drone control apparatus includes a power supply unit 150, and the power supply unit 150 may supply power required for operation of each component under the control of the processor 170.
  • the power supply unit 150 may be a battery inside the drone.
  • the collision avoidance drone control device may include a sensor unit 160 composed of at least one omnidirectional camera 200.
  • FIG. 3 is a cross-sectional view of the omnidirectional camera 200 of the sensor unit 160 according to an embodiment of the present invention.
  • the sensor unit 160 may include one omnidirectional camera 200.
  • The omnidirectional camera 200 includes an omnidirectional lens 200a and a barrel 280 at whose front the omnidirectional lens 200a is installed and inside which a lens array 290 is formed. The omnidirectional lens 200a includes a refraction part 240 formed so that incident light is refracted, a horizontal part 260 formed horizontally at the end of the refraction part 240, a reflection coating 250 formed on the horizontal part 260 to reflect the incident light arriving from the inner reflection coating 230 and provide it to the inner recess 210, an inner recess 210 that refracts the light reflected via the reflection coating 250 toward the lens array 290, an inner refractive portion 220 formed concave at the end of the inner recess 210, and an inner reflection coating 230 formed on the inner refractive portion 220 to reflect the incident light entering through the refraction part 240 toward the reflection coating 250.
  • the refracting part 240 is formed so that incident light is refracted, and the horizontal part 260 may be formed horizontally at the end of the refracting part 240.
  • The reflection coating 250 may be formed on the horizontal portion 260 to reflect the incident light arriving from the inner reflection coating 230 and provide it to the inner recess 210.
  • In addition, an inner recess 210 may be formed on the inner surface of the omnidirectional lens 200a to refract the incident light reflected via the reflection coating 250 toward the lens array 290.
  • An inner refractive portion 220 may be formed concave at the end of the inner recess 210.
  • Also, an inner reflection coating 230 may be formed on the inner refractive portion 220 to reflect the incident light entering through the refraction part 240 toward the reflection coating 250.
  • In this way, the omnidirectional lens 200a may be configured with two refractive surfaces and two reflective coatings.
  • Since the omnidirectional lens 200a itself is spherical rather than aspherical, it is easy to machine and simple to manufacture, providing convenience in production and a reduction in manufacturing cost.
  • That is, forming the refraction part 240, the inner refractive portion 220, and the inner recess 210 with spherical surfaces solves the machining difficulty of aspherical surfaces while still providing an omnidirectional photographing effect.
  • the omnidirectional lens 200a may be installed at the front of the barrel 280, and a lens array 290 may be formed inside the barrel 280.
  • The inclination of the inner refractive portion 220 with respect to the virtual central axis 270 of the omnidirectional lens 200a may be steeper than that of the refraction part 240; the inclination angle of the inner refractive portion 220 is formed steeper than that of the refraction part 240 to satisfy a viewing range of 35 to 85 degrees.
  • The reason for forming the inner recess 210 in the inner central portion of the omnidirectional lens 200a is focusing: when incident light enters the lens array 290 via the inner recess 210, the light can be collected and focused.
  • An image is formed according to the amount and reflection of light, so the optical system can be designed and manufactured as one complete body; that is, since the reflection of light through the single overall shape is organic, the lens can be machined as a complete body from the initial design stage.
  • FIG. 4 is a cross-sectional view of the omnidirectional camera 200 of the sensor unit 160 according to another embodiment of the present invention.
  • the sensor unit 160 may include a first omnidirectional camera 200 and a second omnidirectional camera 201 disposed to be spaced apart from each other.
  • the description of each of the first and second omnidirectional cameras 200 and 201 overlaps with the description of the omnidirectional camera 200 described above, and thus a detailed description thereof will be omitted.
  • The first omnidirectional camera 200 may be disposed under the drone to photograph all directions; at this time, the region above, covered by the drone, and the region directly below, covered by the reflective coating 250, may not be photographed.
  • the second omnidirectional camera 201 may be disposed to extend to the reflective coating 250 of the first omnidirectional camera 200.
  • one end of the barrel 281 of the second omnidirectional camera 201 may be disposed to overlap with the reflective coating 250 of the first omnidirectional camera 200.
  • That is, the first omnidirectional camera 200 and the second omnidirectional camera 201 are spaced apart by a predetermined distance in the vertical direction (up-down direction), so the two cameras can capture two omnidirectional images having vertical parallax with each other.
  • In other words, the sensor unit 160 may be configured as an omnidirectional stereo camera by positioning the first omnidirectional camera 200 and the second omnidirectional camera 201 at different positions so as to photograph omnidirectional images with parallax.
  • Therefore, the sensor unit 160 can photograph all directions around the drone with the two omnidirectional cameras 200 and 201 and accurately measure the distance between the drone and an object captured in the images.
  • the sensor unit 160 of the collision avoidance drone control device may further include at least one sensor of a motion sensor, an altitude sensor, a distance sensor, a balance sensor, or a vision sensor.
  • The motion sensor may be at least one of a magnetometer, an accelerometer, a gyroscope, and a combined acceleration-gyro sensor.
  • the motion information of the drone measured by the motion sensor may be used for drone control for collision avoidance.
  • The magnetometer is a sensor that functions as a compass, measuring magnetic north to obtain the drone's heading information; the processor may combine GPS position information, the magnetometer's heading information, and the accelerometer's motion information to obtain the drone's motion information.
  • The accelerometer may be a three-axis accelerometer that measures acceleration along the x, y, and z axes to obtain motion information such as the drone's speed, position, tilt, and orientation. Such motion information may be used by the processor 170 to keep the drone in a stable posture.
  • The gyroscope may be a three-axis gyroscope that measures angular velocity about the x, y, and z axes to acquire the drone's inclination information.
  • The acceleration-gyro sensor may be a sensor in which the three-axis accelerometer and the three-axis gyroscope are combined, and may further include a temperature sensor, to acquire motion information.
  • the altitude sensor may acquire the altitude of the drone by measuring atmospheric pressure, and may supplement the altitude information of the drone measured through GPS.
  • the GPS sensor may measure the position, coordinates, and altitude of the drone through the satellite signal. It can be understood that such a GPS sensor is included in the communication unit.
  • the apparatus may further include a distance sensor measuring a distance between the drone and the external object.
  • The distance sensor may measure the distance between the drone and an external object through the output of an ultrasonic or laser pulse and its return time, as illustrated below. This distance information may be further used by the processor 170 to control the drone.
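  • As a worked illustration of the time-of-flight principle such a distance sensor uses (a minimal sketch; the numbers and the 343 m/s speed of sound are assumptions, not values from the patent):

```python
# The emitted pulse travels to the object and back, so the one-way distance
# is half the round trip; 343 m/s is the speed of sound in air at ~20 degC.
def ultrasonic_distance_m(round_trip_s, speed_m_s=343.0):
    return speed_m_s * round_trip_s / 2.0

print(ultrasonic_distance_m(0.0291))  # ~4.99 m to the obstacle
```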
  • the apparatus may further include a vision sensor for capturing an external image and analyzing the photographed image pattern to determine the presence or absence of an obstacle.
  • This vision sensor functions similarly to the omnidirectional camera, but may be driven when hovering at an altitude of 10 m or less to check for obstacles and prevent collisions.
  • the collision avoidance drone control device may include a processor 170 that controls the overall operation of each unit in the collision avoidance drone control device.
  • The processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the sensor unit 160 may photograph an omnidirectional image around the drone. (S101)
  • In detail, when the sensor unit includes one omnidirectional camera 200, it can detect the light incident through the single external omnidirectional lens 200a to obtain one omnidirectional image around the drone; when it includes two omnidirectional cameras 200, each detects the light incident through its single external omnidirectional lens 200a, so two omnidirectional images having parallax can be obtained.
  • The obtained omnidirectional image may be circular, and may have a donut shape due to the reflective coating 250.
  • the processor 170 may divide the obtained omnidirectional image. (S102)
  • This division prevents an excessive load on the processor 170 that would occur if the entire high-volume omnidirectional image were image-processed at once.
  • In detail, the processor 170 may divide the omnidirectional image into a plurality of images and process the divided images sequentially, minimizing the instantaneous processing load. Also, the processor 170 can precisely detect objects within each divided image, improving the resolution of the sensing information around the drone.
  • In addition, the processor 170 may convert the sector-shaped split images into rectangular images, match the rectangular images, and thereby convert the omnidirectional image into a panoramic image.
  • the processor 170 may divide the omnidirectional image to have equal intervals in the arc direction.
  • the processor 170 may determine the priorities of the divided images, sequentially process the divided images according to the priorities, and analyze the objects included in the divided images.
  • For example, the processor 170 may analyze the split image on the drone's moving-direction side with the highest priority, and analyze split images adjacent to it with lower priority.
  • the processor 170 may divide the omnidirectional image into different sizes.
  • In detail, the processor 170 may designate a main surveillance region in the omnidirectional image and divide the main surveillance region into a plurality of images, improving resolution so that objects in the main surveillance region are detected accurately.
  • For example, the processor 170 may divide the omnidirectional image so that the region matching the drone's moving direction D has a small division angle θ1, and the division angle increases, up to a large angle θ5, with distance from the moving-direction split image.
  • That is, the processor 170 finely divides the image region on the moving-direction D side to increase resolution and accurately detect objects in the moving direction, while coarsely dividing image regions farther from the moving direction to detect dangerous objects roughly.
  • When a dangerous object is detected in such a region, the processor 170 may subdivide the corresponding region of the omnidirectional image into small portions to detect the dangerous object accurately, as in the sketch below.
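  • A minimal sketch of such direction-weighted division, assuming NumPy is available, that the lens center is known, and that the base angle and growth factor below are illustrative choices (the patent fixes neither):

```python
import numpy as np

def split_angles(heading_deg, base=20.0, growth=1.5, total=360.0):
    """Wedge boundary angles (degrees): the wedge centered on the drone's
    heading is narrowest, and each wedge farther from it is wider."""
    bounds = [heading_deg - base / 2.0]
    width = base
    while bounds[-1] - bounds[0] < total:
        bounds.append(min(bounds[-1] + width, bounds[0] + total))
        width *= growth  # division angle grows away from the heading
    return bounds

def wedge_mask(shape, center, a0, a1):
    """Boolean mask of pixels whose polar angle lies in [a0, a1) degrees."""
    h, w = shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    ang = (np.degrees(np.arctan2(yy - center[1], xx - center[0])) - a0) % 360.0
    return ang < ((a1 - a0) % 360.0 or 360.0)

# Usage: build one mask per wedge and process the heading-side wedge first.
bounds = split_angles(heading_deg=90.0)
# masks = [wedge_mask(frame.shape, (cx, cy), bounds[i], bounds[i + 1])
#          for i in range(len(bounds) - 1)]
```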
  • the processor 170 may extract an object photographed in the omnidirectional image. (S103)
  • First, a method of sensing a photographed object when the sensor unit 160 consists of one omnidirectional camera 200 will be described.
  • When the processor 170 extracts an object using one omnidirectional camera 200, an object entering the surveillance area causes a change in the video frames captured by the omnidirectional camera 200, which appears as a change in the pixel values of the frame region where the object is located.
  • Accordingly, the processor 170 may compare the pixel values of the pixels constituting the previous image frame and the current image frame, and determine a region where the pixel values change to be an object image.
  • Also, a criterion for extracting an object from the current image frame may be set using a plurality of previous image frames. That is, the processor 170 generates a background image from a plurality of previous image frames that temporally precede the current frame, and extracts objects based on the difference between the pixel values of the background image and those of the current frame. A region whose pixel values do not change over the plurality of previous frames is regarded as background where no object exists, so the processor 170 builds the background image from it. Treating regions whose pixel values differ between the background image and the current frame as object images extracts objects more accurately than comparing only the previous frame with the current frame; a sketch of this approach follows.
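  • A minimal sketch of this background-image method, assuming NumPy, grayscale frames of a fixed size, and an illustrative history length and threshold (the patent specifies neither):

```python
import numpy as np

class BackgroundObjectExtractor:
    """Builds a background from temporally preceding frames and flags
    pixels of the current frame that deviate from it as object pixels."""

    def __init__(self, history=30, diff_threshold=25.0):
        self.frames = []
        self.history = history
        self.threshold = diff_threshold

    def extract(self, frame):
        f = frame.astype(np.float32)
        if len(self.frames) < self.history:
            self.frames.append(f)           # still collecting history
            return np.zeros(frame.shape, dtype=bool)
        # Pixels that never change across the previous frames dominate the
        # per-pixel median, so the median acts as the background image.
        background = np.median(np.stack(self.frames), axis=0)
        mask = np.abs(f - background) > self.threshold
        self.frames.pop(0)
        self.frames.append(f)
        return mask                          # True where an object appeared
```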
  • When the sensor unit 160 is configured as an omnidirectional stereo camera, it can detect objects even more precisely.
  • a method of extracting an object photographed in the omnidirectional stereo image will be described in detail with reference to FIG. 7.
  • Referring to FIG. 7, the processor 170 may include an image preprocessor 410, a disparity calculator 420, a segmentation unit 432, an object detector 434, an object verification unit 436, an object tracking unit 440, and an application unit 450.
  • An image may be processed in the order of the image preprocessor 410, the disparity calculator 420, the object detector 434, the object tracking unit 440, and the application unit 450, but the order is not limited thereto.
  • the image preprocessor 410 may receive two images from the omnidirectional stereo camera and perform preprocessing.
  • In detail, the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, and camera gain control on the two images. Accordingly, an image sharper than the stereo image captured by the omnidirectional camera 200 may be obtained.
  • The disparity calculator 420 receives the images signal-processed by the image preprocessor 410, performs stereo matching on the two received images, and obtains a disparity map according to the stereo matching; that is, disparity information on the stereo image around the drone may be obtained.
  • the stereo matching may be performed in units of pixels of stereo images or in units of predetermined blocks.
  • the disparity map may refer to a map in which stereo parallax information of stereo images, that is, left and right images, is numerically represented.
  • the segmentation unit 432 may perform segmentation and clustering on at least one of the images based on the disparity information from the disparity calculator 420.
  • the segmentation unit 432 may separate the background and the foreground from at least one of the stereo images based on the disparity information.
  • For example, a region of the disparity map whose disparity information is at or below a predetermined value may be computed as background and excluded, so that the foreground is relatively separated.
  • As another example, a region of the disparity map whose disparity information is at or above a predetermined value may be computed as foreground and extracted, thereby separating the foreground.
  • By separating the foreground and background based on the disparity information extracted from the stereo image in this way, the signal processing speed and signal processing amount can be reduced in the subsequent object detection; a sketch follows.
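  • A minimal sketch of this disparity-based foreground separation using OpenCV's semi-global matcher, assuming the two omnidirectional views have already been rectified into row-aligned rectangular images (for the vertically stacked cameras this implies a rotation first) and that the matcher settings and threshold are illustrative:

```python
import cv2
import numpy as np

def foreground_from_stereo(left_gray, right_gray, fg_threshold=16.0):
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=64,  # multiple of 16
                                    blockSize=7)
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # Large disparity = close to the cameras -> foreground; at or below the
    # threshold is treated as background and excluded from detection.
    return disp, disp > fg_threshold
```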
  • the processor 170 may calculate a positional relationship (eg, distance) between the detected object and the drone based on the disparity information extracted from the stereo image, and a detailed method will be described later.
  • the object detector 434 may detect the object based on the image segment from the segmentation unit 432.
  • the object detector 434 may detect an object with respect to at least one of the images based on the disparity information.
  • the object detector 434 may detect an object with respect to at least one of the images.
  • an object can be detected from the foreground separated by image segments.
  • the object verification unit 436 may classify and verify the separated object.
  • The object verification unit 436 may use an identification method using a neural network, a support vector machine (SVM) method, identification by AdaBoost using Haar-like features, or a histograms of oriented gradients (HOG) technique; a sketch of the HOG route follows.
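  • A minimal sketch of the HOG-plus-SVM route named above, using OpenCV only; the 64x64 window and the pre-trained classifier file "obstacle_svm.xml" are hypothetical placeholders, not artifacts of the patent:

```python
import cv2
import numpy as np

# Standard HOG layout: 64x64 window, 16x16 blocks, 8x8 stride/cells, 9 bins.
hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)
svm = cv2.ml.SVM_load("obstacle_svm.xml")  # hypothetical trained classifier

def verify_object(gray_patch):
    """Classify one foreground patch; returns the predicted class label."""
    patch = cv2.resize(gray_patch, (64, 64))
    features = hog.compute(patch).reshape(1, -1).astype(np.float32)
    _, label = svm.predict(features)
    return int(label[0, 0])
```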
  • Meanwhile, the object verification unit 436 may verify objects by comparing the detected objects with objects stored in the memory 140.
  • the object verification unit 436 can identify the surrounding aircraft, buildings, dangerous areas, and the like located near the drone.
  • The object tracking unit 440 may track the verified object; for example, it may sequentially verify the object in the acquired stereo images, calculate the motion or motion vector of the verified object, and track the movement of the object based on the calculated motion or motion vector. As a result, surrounding vehicles, high-rise buildings, dangerous areas, and the like located around the drone can be tracked; a minimal sketch follows.
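  • A minimal sketch of tracking a verified object across frames by its mask centroid (NumPy assumed); the motion vector it returns is the quantity the later collision logic consumes:

```python
import numpy as np

def track(prev_centroid, object_mask):
    """Return the object's new centroid and its frame-to-frame motion."""
    ys, xs = np.nonzero(object_mask)       # pixels of the verified object
    if len(xs) == 0:
        return None, None                  # object lost in this frame
    centroid = (float(xs.mean()), float(ys.mean()))
    motion = (centroid[0] - prev_centroid[0], centroid[1] - prev_centroid[1])
    return centroid, motion
```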
  • Next, the application unit 450 may calculate the risk of collision based on the various objects located around the drone, such as surrounding vehicles, high-rise buildings, and dangerous areas, and may calculate the possibility of collision with a vehicle or a building.
  • Then, based on the calculated risk and collision possibility, the application unit 450 may output a message informing the user of this information as drone surrounding information, or may generate a control signal for attitude control or travel control of the drone as drone control information.
  • The image preprocessor 410, the disparity calculator 420, the segmentation unit 432, the object detector 434, the object verification unit 436, the object tracking unit 440, and the application unit 450 may be internal components of an image processing unit in the processor 170.
  • According to an embodiment, the processor 170 may include only some of the image preprocessor 410, the disparity calculator 420, the segmentation unit 432, the object detector 434, the object verification unit 436, the object tracking unit 440, and the application unit 450.
  • Next, the processor 170 may calculate the positional relationship between the detected object and the drone. (S104) This step may be executed simultaneously with the step of extracting the object from the omnidirectional image.
  • In detail, the processor 170 may compare the first omnidirectional image captured by the first omnidirectional camera 200 with the second omnidirectional image captured by the second omnidirectional camera 201 and calculate the separation distance to an object.
  • the processor 170 may obtain the disparity information generated due to the parallax difference in the first omnidirectional image and the second omnidirectional image.
  • At this time, the processor 170 may correct the disparity information in consideration of the distortion of the omnidirectional image captured through the single external omnidirectional lens 200a.
  • In the omnidirectional image, an object photographed near the center appears closer than its actual distance, and an object photographed near the outer periphery appears farther than its actual distance.
  • Accordingly, the processor 170 can accurately measure the distance between the object and the drone by correcting the disparity information of the stereo image to reflect this distortion characteristic of the omnidirectional image.
  • Equation 1 below is a formula for measuring the distance between the drone and an object based on the stereo image:

    distance = (f × L) / (dr + dl)    [Equation 1]

    f: focal length
    L: baseline between the first omnidirectional camera 200 and the second omnidirectional camera 201
    dr: distance from the point at which the subject is projected on the first image plane to the focal position (first disparity)
    dl: distance from the point at which the subject is projected on the second image plane to the focal position (second disparity)
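  • A worked example of Equation 1 with assumed values (none of these numbers come from the patent): a 2 mm focal length, a 60 mm vertical baseline, and disparities of 0.025 mm and 0.015 mm place the object 3 m away.

```python
f = 2.0     # focal length (mm)
L = 60.0    # baseline between the two omnidirectional cameras (mm)
dr = 0.025  # first disparity (mm)
dl = 0.015  # second disparity (mm)

distance = f * L / (dr + dl)
print(distance)  # 3000.0 mm, i.e. the object is about 3 m from the drone
```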
  • That is, the processor 170 may calculate the first disparity from the first image 200i of the first omnidirectional camera 200, calculate the second disparity from the second image 201i of the second omnidirectional camera 201, and calculate the distance between the object and the drone in inverse proportion to the sum of the first and second disparities.
  • Here, the first disparity means the distance from the point at which the subject is projected in the first image 200i to the focal position, and the second disparity means the distance from the point at which the subject is projected in the second image 201i to the focal position.
  • Since the first disparity and the second disparity contain distortion generated by the single external omnidirectional lens of the omnidirectional camera 200, correction may be performed.
  • In detail, the processor 170 may correct the first disparity according to the distance and direction by which the point where the subject is projected in the first image 200i is separated from the focal position, correct the second disparity according to the distance and direction by which the point where the subject is projected in the second image 201i is separated from the focal position, and calculate the distance between the object and the drone based on the corrected first and second disparities.
  • For example, if the position where the subject is projected in the first image 200i is close to the outer periphery of the image, the object may have been photographed as farther than its actual position, so the processor 170 may correct the first disparity by adding a correction variable α (a negative number) proportional to the distance between the focal position and the subject's projection position.
  • Likewise, if the position where the subject is projected in the second image 201i is close to the center of the image, the object may have been photographed as closer than its actual position, so the processor 170 may correct the second disparity by adding a correction variable β (a positive number) proportional to the distance between the focal position and the subject's projection position; a sketch follows.
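  • A minimal sketch of this radial correction; the linear model and the coefficient k are assumptions standing in for whatever calibration of the single-lens distortion is actually used:

```python
import math

def corrected_disparity(d, proj_xy, focus_xy, k=1e-5, sign=+1):
    """Add a correction proportional to how far the subject's projection
    lies from the focal position: sign=-1 near the rim (imaged too far,
    alpha is negative), sign=+1 near the center (imaged too near, beta
    is positive)."""
    r = math.dist(proj_xy, focus_xy)  # separation from the focal position
    return d + sign * k * r

d1 = corrected_disparity(0.025, proj_xy=(310, 40), focus_xy=(320, 240),
                         sign=-1)   # first image: subject near the rim
d2 = corrected_disparity(0.015, proj_xy=(322, 236), focus_xy=(320, 240),
                         sign=+1)   # second image: subject near the center
```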
  • Alternatively, the processor 170 may convert the first image 200i of the first omnidirectional camera 200 into a first rectangular panoramic image, convert the second image 201i of the second omnidirectional camera 201 into a second rectangular panoramic image, calculate the first disparity from the first rectangular panoramic image and the second disparity from the second rectangular panoramic image, and calculate the distance between the object and the drone based on the first and second disparities. That is, the distortion of the omnidirectional image may be corrected in advance, in the step of converting the circular omnidirectional image into a rectangular image.
  • In this way, the processor 170 can accurately calculate the distance between the object and the drone in consideration of the distortion generated in the omnidirectional stereo image; one way to unroll the circular image is sketched below.
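  • A minimal sketch of unrolling the circular (donut-shaped) frame into a rectangular panorama with OpenCV before matching; the center, radius, and output size are assumptions about the lens layout:

```python
import cv2

def to_panorama(circular_img, center=(320.0, 240.0), radius=240,
                out_size=(256, 1024)):
    # warpPolar maps radius to columns and angle to rows, so out_size is
    # (radius_samples, angle_samples); the rotation afterwards lays the
    # 360 degrees of angle along the panorama's horizontal axis.
    pano = cv2.warpPolar(circular_img, out_size, center, radius,
                         cv2.WARP_POLAR_LINEAR)
    return cv2.rotate(pano, cv2.ROTATE_90_COUNTERCLOCKWISE)
```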
  • the processor 170 may detect an object at risk of collision according to the positional relationship between the drone and the object. (S105)
  • For example, when the distance between the drone and an object falls below a predetermined distance, the processor 170 may determine the object to be a collision-risk object.
  • In addition, the processor 170 may determine a collision-risk object by also considering whether the rate of change of the distance between the drone and the object is negative (approaching) or positive (receding); a sketch of this test follows.
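  • A minimal sketch of this two-part test; the 5 m threshold and the closing-speed limit are illustrative assumptions:

```python
def is_collision_risk(distance_m, prev_distance_m, dt_s,
                      min_distance_m=5.0, max_closing_speed_m_s=-2.0):
    """Dangerous if the object is inside the threshold distance, or if the
    distance is shrinking fast enough (negative rate = approaching)."""
    rate = (distance_m - prev_distance_m) / dt_s
    return distance_m < min_distance_m or rate < max_closing_speed_m_s
```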
  • When a collision-risk object is detected, the processor 170 may calculate a collision avoidance direction and control the drone through the driver 130 to make an emergency turn in the collision avoidance direction.
  • the processor 170 may control the drone in preference to the remote control signal.
  • That is, the processor 170 may generate an emergency drone control signal that takes precedence over the remote control signal and transmit it to the driver 130 to control the propellers 20, thereby protecting the drone more safely.
  • FIG. 9 is a flowchart illustrating a process of controlling the focus of the omnidirectional camera 200 by the processor 170 according to an embodiment of the present invention.
  • The processor 170 may store in the memory 140 a constant speed that the drone maintains, and automatically control the focus of the omnidirectional camera 200 according to the altitude that changes after the drone lifts off. (S203)
  • In detail, the processor 170 calculates the altitude of the drone from the time elapsed since lift-off and the predetermined speed, calculates the focus for photographing the ground according to that altitude, and controls the omnidirectional camera 200 according to the calculated focus, as sketched below.
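  • A minimal sketch of this open-loop focus schedule; the climb speed and the altitude-to-focus table are hypothetical placeholders for the values the device would store in memory:

```python
import time

CLIMB_SPEED_M_S = 1.5  # constant climb speed stored in the memory 140
FOCUS_TABLE = [(5.0, 0.2), (15.0, 0.5), (50.0, 0.8), (float("inf"), 1.0)]

def focus_for_altitude(liftoff_time_s):
    """Estimate altitude from elapsed time at the stored climb speed and
    return the focus setting for photographing the ground from there."""
    altitude = CLIMB_SPEED_M_S * (time.time() - liftoff_time_s)
    for max_alt, focus in FOCUS_TABLE:
        if altitude <= max_alt:
            return focus
```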
  • the omnidirectional camera 200 may further include a focus control unit for controlling the focus.
  • In detail, the omnidirectional camera 200 may further include, together with the same configuration as the above-described embodiment, a focus lens for controlling focus and a focus controller for controlling the focus lens.
  • For example, the omnidirectional camera 200 may include a fluid focus lens 320 disposed in the inner recess and containing different liquids, and a focus controller 310 including a current injection unit that contracts or expands the fluid focus lens 320.
  • The fluid focus lens 320 may include two kinds of liquid that do not mix with each other.
  • The two liquids may have different refractive indices, one conducting electricity and the other not.
  • The focus controller 310 injects current into one of the two liquids through the current injection unit; the conductive liquid changes in volume as current is injected, so the other liquid contracts and expands like the cornea of the human eye, and the focus of the omnidirectional camera 200 can thereby be controlled.
  • The fluid focus lens 320 may be disposed in a shape corresponding to the inner recess 210 of the single external lens (the omnidirectional lens 200a) of the omnidirectional camera 200 to uniformly control the focus of the entire omnidirectional image.
  • the processor 170 may detect the tracking object, calculate the distance to the tracking object, and then control the focus according to the calculated distance change. (S205)
  • Therefore, the processor 170 can accurately photograph a moving object, analyze the photographed image, and accurately calculate the distance to the photographed object, further improving the performance of the emergency collision avoidance function.
  • FIG. 11 is a side view of a drone equipped with a collision avoidance drone control device according to another exemplary embodiment of the present invention, and FIG. 12 is an internal configuration diagram of the lower and upper omnidirectional cameras.
  • A collision avoidance drone control device mounted on a drone may include a lower omnidirectional camera 160a for sensing the area below and around the drone, and an upper omnidirectional camera 160b for sensing the area above and around the drone.
  • The upper omnidirectional camera 160b may have the same configuration as the omnidirectional camera of the above-described embodiment, except that a window tinting process is applied to the surface of the single external omnidirectional lens 200a.
  • The processor detects an object photographed by the upper omnidirectional camera 160b, measures the distance between the object and the drone, and, when the detected object is within a predetermined distance, treats it as a dangerous object and performs emergency avoidance control to move the drone downward.
  • Likewise, the processor detects an object photographed by the lower omnidirectional camera 160a, measures the distance between the object and the drone, and, when the detected object is within a predetermined distance, treats it as a dangerous object and performs emergency avoidance control to move the drone upward.
  • FIG. 13 is a block diagram of a collision avoidance drone control device according to still another embodiment of the present invention, and FIGS. 14 and 15 are cross-sectional views of a liquid lens.
  • A collision avoidance control apparatus according to this embodiment may include an input unit 110, a communication unit 120, a driving unit 130, a memory 140, a power supply unit 150, a sensor unit 160, and a processor 170.
  • the sensor unit 160 may include at least one omnidirectional camera 200 and a liquid lens 121.
  • The liquid lens 121 includes a first protective glass 121a and a second protective glass 121b, an oil layer 121c and an aqueous solution layer 121d stacked between the first and second protective glasses 121a and 121b, first and second electrode portions 121e and 121f disposed at the periphery, and first and second insulating portions 121g and 121h disposed between the first electrode portion 121e and the second electrode portion 121f to insulate the voltage.
  • When power is supplied from the outside to the first and second electrode portions 121e and 121f of the liquid lens 121, the radius of curvature and the thickness of the oil layer 121c change, so the focal position of the light passing through the liquid lens 121 can be changed.
  • For example, as the applied voltage increases, the radius of curvature and the thickness of the oil layer 121c may increase, so the focal length can be shortened by increasing the voltage.
  • the liquid lens 121 does not require a separate servo motor for the flow of the lens, thereby greatly reducing the manufacturing cost. In addition, precise refractive index change is possible.
  • The liquid lens 121 may be located between the lens array 290 and the omnidirectional lens 200a, at the rear end of the lens array 290, or between any two lenses of the plurality of lenses in the lens array 290.
  • The processor 170 may control the power applied to the liquid lens 121 to vary the refractive index of the liquid lens 121. Therefore, by shifting focus from far to near, an approaching object can be checked precisely; when focusing at a far distance, the angle of view narrows in the direction of travel, and the lens focuses about 500 times per second so that objects can be identified by image analysis.
  • Meanwhile, the collision avoidance drone control apparatus may calculate an optimal movement path based on the image photographed through the camera and map information received through the communication unit 120.
  • the processor 170 may include a movement path setting unit 171 for calculating the optimum movement path.
  • Obstacle 1 may be photographed within the area of the image that the drone's camera can capture; therefore, the collision avoidance drone controller can set an optimal route to the destination while avoiding obstacle 1. However, since the area the drone can photograph is limited, it may be difficult to calculate a movement path outside the photographing area.
  • the collision avoidance drone control device may receive a map image matching the image through the communication unit 120.
  • The collision avoidance drone controller may match the captured image with the received map image to determine where the drone is currently located in the map image, and then set an optimal route to the destination while avoiding obstacles identified through the map image.
  • That is, obstacle 2 can be identified in advance through the map information, and an optimal movement path can be set even for areas outside the photographing area, as in the sketch below.
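  • A minimal sketch of route setting over a map grid in which obstacle 1 (seen by the camera) and obstacle 2 (known only from the received map) are both marked; plain breadth-first search stands in for whichever planner the movement path setting unit 171 actually uses:

```python
from collections import deque

def plan_path(grid, start, goal):
    """grid[r][c] == 1 marks an obstacle cell; returns a list of cells
    from start to goal avoiding obstacles, or None if no route exists."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:                       # walk the parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cur
                queue.append(nxt)
    return None

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],   # cells blocked by obstacles 1 and 2
        [0, 0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 3)))
```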
  • the present invention can also be embodied as computer readable codes on a computer readable recording medium.
  • Computer-readable recording media include all kinds of recording devices that store data readable by a computer system. Examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet).
  • the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • functional programs, codes, and code segments for implementing the present invention can be easily inferred by programmers in the art to which the present invention belongs.
  • The present invention provides a device that is mounted on a drone and controls its movement so as to avoid collisions while the drone moves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)

Abstract

A drone control device for collision avoidance, according to an embodiment, comprises: a driving unit for driving a rotor blade of a drone; a sensor unit including an omnidirectional camera for acquiring an omnidirectional image by photographing all directions of the drone with a single external omnidirectional lens; and a processor, which performs image processing on the omnidirectional image so as to detect an object positioned around the drone, measures the distance between the drone and the object detected on the basis of at least two omnidirectional images, and controls the driving unit such that the drone urgently avoids the object when the distance between the object and the drone is less than or equal to a predetermined distance.

Description

충돌 회피용 드론 제어장치 Collision Avoidance Drone Control
The present invention relates to a collision avoidance drone control device that provides a function of avoiding collisions with surrounding objects using an omnidirectional camera system.
Recently, drones (unmanned aerial vehicles) have been commercialized and are being used in various fields such as camera photography.
These drones are small unmanned aircraft that are generally operated manually, receiving the operator's control signals wirelessly.
This mode of operation is inefficient in that the operator must always be present, and also carries risk in that an accident may occur through the operator's mistake.
To complement this mode of operation, drone control systems are being developed that install a distance sensor and a camera on the drone to detect objects around it and either send an alarm to the remote pilot when an object with a risk of collision is detected during remote control, or control the drone to fly so as to avoid the surrounding objects.
However, since a drone can move in any direction, building such a control system requires multiple distance sensors and multiple narrow-angle cameras to detect objects located all around the drone, which increases implementation cost and detracts from the drone's appearance.
Meanwhile, the usefulness of 360-degree (omnidirectional) cameras has recently been attracting attention.
In detail, omnidirectional camera technology is being applied to vehicle-related fields such as vehicle surroundings monitoring and to content production for virtual reality, and research and development are proceeding actively.
In addition, techniques that combine an omnidirectional camera with a drone to implement a flying omnidirectional imaging system have also been proposed recently.
In detail, Korean Patent Publication No. 10-2015-0025452 (July 12, 2005) is a prior art document related to the present invention, and discloses a flying omnidirectional imaging system.
The prior art invention photographs all sides of the drone in the air using a plurality of narrow-angle cameras and then combines the results into a single 360-degree omnidirectional video, so that a wide shooting range can be expressed in one picture and shots can be taken easily even from hard-to-reach positions, providing a flying omnidirectional imaging system capable of producing a variety of videos.
In detail, the imaging unit of the flying omnidirectional imaging system may be provided with six narrow-angle camera modules mounted on the flight unit; the six camera modules are arranged in the front, rear, left, right, upward, and downward directions, respectively, so that four lateral shooting areas and two vertical shooting areas can be photographed simultaneously. The system then overlaps the common shooting areas of the omnidirectional photographs to generate a single 360-degree omnidirectional photograph.
Accordingly, the prior art invention requires a plurality of cameras installed facing different directions in order to photograph all directions, and in the process of stitching the images captured by the different cameras, the boundaries between images appear unnatural to the observer. That is, the omnidirectional imaging systems currently mounted on drones must use a plurality of cameras, or a camera including a plurality of external lenses, and merely transmit the captured omnidirectional image to the user's terminal without using it at all to detect objects.
In addition, as described above, distortion arises in the process of stitching images captured by different cameras, so the stitched image cannot be trusted.
The present invention has been made to solve the above problems, and seeks to provide a collision avoidance drone control device that precisely detects objects around the drone using a sensor unit photographing all directions through a single lens, and controls the drone autonomously so as to avoid the detected objects.
A collision avoidance drone control device according to an embodiment includes: a driving unit that drives the rotor blades of the drone; a sensor unit including at least one omnidirectional camera that photographs all directions through a single external lens to acquire an omnidirectional image; and a processor that performs image processing on the omnidirectional image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and controls the driving unit so that the drone urgently avoids the object when the distance between the object and the drone is less than or equal to a predetermined distance.
In this case, the processor may divide the omnidirectional image at intervals along the arc direction and perform image processing on the divided images to detect objects around the drone.
In addition, the processor may divide the omnidirectional image such that the division angle gradually increases with distance from the omnidirectional image region matching the drone's direction of movement.
The sensor unit may include a first omnidirectional camera disposed under the drone and a second omnidirectional camera disposed under the first omnidirectional camera, and the first and second omnidirectional cameras may form a stereo camera.
The processor may include an image preprocessor that preprocesses the omnidirectional images to obtain stereo images, a disparity calculator that stereo-matches the stereo images to obtain disparity information, a segmentation unit that separates the background and foreground of the stereo images based on the disparity information, and an object verification unit that detects at least one object in the separated foreground.
The processor may also calculate a first disparity from a first image of the first omnidirectional camera, calculate a second disparity from a second image of the second omnidirectional camera, and calculate the distance between the object and the drone in inverse proportion to the sum of the first disparity and the second disparity.
The processor may also correct the first disparity according to the distance and direction by which the point where the subject is projected in the first image is offset from the focal position, correct the second disparity according to the distance and direction by which the point where the subject is projected in the second image is offset from the focal position, and calculate the distance between the object and the drone based on the corrected first and second disparities.
The processor may also convert the first image of the first omnidirectional camera into a first rectangular panoramic image, convert the second image of the second omnidirectional camera into a second rectangular panoramic image, calculate the first disparity from the first rectangular panoramic image, calculate the second disparity from the second rectangular panoramic image, and calculate the distance between the object and the drone based on the first and second disparities.
A collision avoidance control device according to an embodiment acquires an omnidirectional image by photographing all directions of the drone through a single external omnidirectional lens, processes the image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone is less than or equal to a predetermined distance, controls the propellers so that the drone urgently avoids the object, thereby providing an emergency collision avoidance function that lets the drone fly safely.
In addition, since such a collision avoidance control device can monitor all sides of the drone at once using only one, or at most two, omnidirectional cameras, it does not require multiple cameras or distance sensors, which reduces manufacturing cost; and because the entire omnidirectional image is captured through a single external omnidirectional lens, distortion is low and the position of an object can be detected precisely.
Furthermore, the collision avoidance control device according to an embodiment analyzes the omnidirectional image in consideration of the characteristics of an image captured through a single external omnidirectional lens, so the position of an object captured in the omnidirectional image can be detected even more precisely.
FIG. 1 is a side view of a drone equipped with a collision avoidance drone control device according to an embodiment of the present invention.
FIG. 2 is a block diagram of a collision avoidance drone control device according to an embodiment of the present invention.
FIG. 3 is a cross-sectional view of the omnidirectional camera of the sensor unit according to an embodiment of the present invention.
FIG. 4 is a cross-sectional view of the omnidirectional camera of the sensor unit according to another embodiment of the present invention.
FIG. 5 is a flowchart illustrating the process by which a collision avoidance drone control device according to an embodiment of the present invention provides an emergency object avoidance function.
FIG. 6A is an example of a divided omnidirectional image according to an embodiment of the present invention, and FIG. 6B is an example of a divided omnidirectional image according to another embodiment of the present invention.
FIG. 7 is an internal block diagram of a processor according to an embodiment of the present invention.
FIG. 8 is a conceptual diagram illustrating how a processor according to an embodiment of the present invention measures the distance between an object and the drone.
FIG. 9 is a flowchart illustrating the process by which a processor according to an embodiment of the present invention controls the focus of the omnidirectional camera.
FIG. 10 is a cross-sectional view of the omnidirectional camera of the sensor unit according to yet another embodiment of the present invention.
FIG. 11 is a side view of a drone equipped with a collision avoidance drone control device according to another embodiment of the present invention.
FIG. 12 is an internal configuration diagram of the lower and upper omnidirectional cameras.
FIG. 13 is a block diagram of a collision avoidance drone control device according to yet another embodiment of the present invention.
FIGS. 14 and 15 are cross-sectional views of a liquid lens.
FIG. 16 shows a processor according to an embodiment of the present invention setting an optimal movement path for the drone.
Hereinafter, a collision avoidance drone control device according to embodiments of the present invention will be described in detail with reference to the drawings. The embodiments introduced below are provided as examples so that the spirit of the present invention can be sufficiently conveyed to those skilled in the art. The present invention is therefore not limited to the embodiments described below and may be embodied in other forms. In the drawings, the size and thickness of devices may be exaggerated for convenience. Like reference numerals refer to like elements throughout the specification.
The advantages and features of the present invention, and methods of achieving them, will become apparent with reference to the embodiments described below in detail in conjunction with the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below and may be implemented in various different forms; these embodiments are provided only so that the disclosure of the present invention is complete and fully conveys the scope of the invention to those of ordinary skill in the art to which the present invention belongs, and the present invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout. In the drawings, the size and relative size of layers and regions may be exaggerated for clarity.
The terminology used herein is for describing the embodiments and is not intended to limit the present invention. In this specification, the singular also includes the plural unless specifically stated otherwise. As used herein, "comprise" and/or "comprising" do not exclude the presence or addition of one or more components, steps, operations, and/or elements other than those mentioned.
FIG. 1 is a side view of a drone equipped with a collision avoidance drone control device according to an embodiment of the present invention.
Referring to FIG. 1, the drone according to an embodiment includes a body 10 forming the drone's exterior, propellers 20 disposed on the body 10 and rotatable radially, and a collision avoidance drone control device that detects objects around the drone and drive-controls the propellers 20 so as to avoid them.
In detail, the collision avoidance control device according to an embodiment acquires an omnidirectional image by photographing all directions of the drone through a single external lens, processes the image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when that distance is less than or equal to a predetermined distance, controls the propellers 20 so that the drone urgently avoids the object, thereby providing an emergency collision avoidance function.
Such a collision avoidance control device can monitor all sides of the drone at once using only one, or at most two, omnidirectional cameras 200, so it does not require multiple cameras or distance sensors, reducing manufacturing cost; and because the entire omnidirectional image is captured through the single external omnidirectional lens 200a, distortion is low and the position of an object can be detected precisely.
Furthermore, the collision avoidance control device according to an embodiment analyzes the omnidirectional image in consideration of the characteristics of an image captured through the single external omnidirectional lens 200a, so the position of an object captured in the omnidirectional image can be detected even more precisely.
Hereinafter, each component of the collision avoidance drone control device will be described in more detail.
FIG. 2 is a block diagram of a collision avoidance drone control device according to an embodiment of the present invention.
Referring to FIG. 2, the collision avoidance control device according to an embodiment includes an input unit 110, a communication unit 120, a driving unit 130, a memory 140, a power supply unit 150, a sensor unit 160, and a processor 170; in particular, the sensor unit 160 may include at least one omnidirectional camera 200 and other sensors. However, the units shown in FIG. 2 may not all be essential to implementing the collision avoidance control device; depending on the embodiment, some of them may be omitted, and other units not shown may be further included.
First, the collision avoidance drone control device may include an input unit 110 that detects user input.
For example, the input unit 110 may detect an execution input that turns the power of the collision avoidance drone control device on or off, or an execution input that turns its emergency drone avoidance function on or off.
The input unit 110 may include at least one of a gesture input unit that detects user gestures (for example, an optical sensor), a touch input unit that detects touch (for example, a touch sensor, a touch key, or a mechanical push key), a user terminal 600, a microphone that detects voice input, a remote controller, and a mobile terminal, and may thereby detect user input.
The collision avoidance drone control device may also include a communication unit 120 that communicates wirelessly with the user's terminal.
In an embodiment, the communication unit 120 may transmit to the terminal the omnidirectional image captured by the omnidirectional camera 200, or drone surroundings information obtained by analyzing that image.
Also, in an embodiment, the communication unit 120 may receive a remote drone control signal that the user inputs through the terminal, and the processor 170 may control the driving unit 130 according to the received signal, providing a remote control function that lets the user pilot the drone remotely. However, if there is a risk of collision with an object even during remote control, the processor 170 may execute collision avoidance control in preference to the remote control signal.
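For illustration only, this priority rule can be sketched as follows; the threshold value, command representation, and function name are hypothetical assumptions, not part of the disclosure:

```python
EMERGENCY_DISTANCE_M = 2.0  # hypothetical "predetermined distance"

def select_command(remote_cmd, avoidance_cmd, nearest_object_distance_m):
    """Pass the user's remote command through, except when an object is
    within the predetermined distance, in which case the collision
    avoidance command takes priority over the remote control signal."""
    if (nearest_object_distance_m is not None
            and nearest_object_distance_m <= EMERGENCY_DISTANCE_M):
        return avoidance_cmd
    return remote_cmd
```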
That is, the collision avoidance drone control device may transmit the omnidirectional image so that the user can check all directions of the drone in a single image, improving user convenience.
The communication unit 120 may exchange data with the terminal wirelessly; various data communication schemes such as Bluetooth, WiFi, Direct WiFi, APiX, LTE, or NFC may be used.
Next, the collision avoidance drone control device may include a driving unit 130 that supplies power to the propellers 20 and controls their direction of rotation, thereby controlling the drone's direction and speed of movement.
Although the embodiment describes the collision avoidance drone control device as directly including the driving unit 130, an embodiment is also possible in which the drone has a separate driving unit 130 and the collision avoidance drone control device is connected to it through an interface, controlling the driving unit 130 by transmitting control signals.
The driving unit 130 includes a power source driver, which may perform electronic control of the power source in the drone. For example, when an electric motor is the power source, the power source driver may control the motor, thereby controlling the motor's rotation speed, torque, and the like.
The steering driver may perform electronic control of the steering apparatus in the drone. In detail, the steering driver may change the drone's heading through the inclination of the propellers 20 and power control of the individual propellers 20.
In an embodiment, when a collision-risk object is detected, the processor 170 may determine a heading that avoids the object and control the steering driver so that the drone proceeds in the determined heading.
The collision avoidance drone control device may also include a memory 140, and the memory 140 may store various data for the overall operation of the device, such as programs for processing or control by the processor 170.
The memory 140 may also store a number of application programs driven by the collision avoidance drone control device, along with data and instructions for its operation. Such application programs may be stored in the memory 140 and driven by the processor 170 to perform an operation (or function) of the device.
In an embodiment, the memory 140 may store data for identifying objects included in images from the omnidirectional camera 200. For example, when a predetermined object is detected in an image acquired through the camera, the memory 140 may store data for identifying, by a predetermined algorithm, what the object corresponds to.
For example, when an image acquired through the camera includes a predetermined object such as a high-rise building, an aircraft, or a bird, the memory 140 may store data for identifying, by a predetermined algorithm, what the object corresponds to.
The collision avoidance drone control device also includes a power supply unit 150, which, under the control of the processor 170, may supply the power required for the operation of each component. The power supply unit 150 may be a battery inside the drone.
The collision avoidance drone control device may also include a sensor unit 160 composed of at least one omnidirectional camera 200.
FIG. 3 is a cross-sectional view of the omnidirectional camera 200 of the sensor unit 160 according to an embodiment of the present invention.
Referring to FIG. 3, the sensor unit 160 according to an embodiment may include one omnidirectional camera 200.
In detail, the omnidirectional camera 200 according to an embodiment may include: an omnidirectional lens 200a composed of a refracting portion 240 formed so that incident light is refracted, a horizontal portion 260 formed horizontally at the end of the refracting portion 240, a reflective coating 250 formed on the horizontal portion 260 that reflects the incident light arriving from the inner reflective coating and provides it to the inner concave portion, an inner concave portion 210 that refracts the incident light reflected by the reflective coating 250 once more so as to provide it to the lens array 290, an inner refracting portion 220 formed concavely at the end of the inner concave portion 210, and an inner reflective coating 230 formed on the inner refracting portion that reflects the incident light arriving from the refracting portion 240 toward the reflective coating 250; and a barrel 280, at the front of which the omnidirectional lens 200a is installed and inside of which the lens array 290 is formed.
With only a single external omnidirectional lens 200a configured as above, a 360-degree viewing angle can be secured.
In detail, the refracting portion 240 is formed so that incident light is refracted, and the horizontal portion 260 may be formed horizontally at the end of the refracting portion 240. The reflective coating 250 may be formed on the horizontal portion 260 so as to reflect the incident light arriving from the inner reflective coating 230 and provide it to the inner concave portion 210.
In addition, the inner concave portion 210, which refracts the incident light reflected by the reflective coating 250 once more to provide it to the lens array 290, is formed on the inner surface at the central portion of the omnidirectional lens 200a, and the inner refracting portion 220 may be formed concavely at the end of the inner concave portion 210.
At this point, the inner reflective coating 230, which reflects the incident light arriving from the refracting portion 240 toward the reflective coating 250, may be formed on the inner refracting portion 220.
As described above, two refracting surfaces and two reflective coating surfaces may be configured.
In addition, since the omnidirectional lens 200a itself is spherical rather than aspherical, it is easy to machine and its manufacturing process is simple, providing manufacturing convenience and lower manufacturing cost.
That is, the refracting portion 240, the inner refracting portion 220, and the inner concave portion 210 have spherical surfaces, which solves the machining difficulty that aspherical surfaces pose while at the same time enabling omnidirectional imaging.
The omnidirectional lens 200a is installed at the front of the barrel 280, and the lens array 290 may be formed inside the barrel 280. In addition, relative to the virtual central axis 270 of the omnidirectional lens 200a, the slope of the inner refracting portion 220 may be steeper than that of the refracting portion 240; the inclination angle of the inner refracting portion is made steeper than that of the refracting portion in order to satisfy a field of view in the range of 35 to 85 degrees.
Meanwhile, the inner concave portion 210 is formed at the inner central portion of the omnidirectional lens 200a in order to achieve focus. In detail, when incident light enters the lens array 290, the inner concave portion 210 gathers the light so that it is brought into focus.
In the case of an ordinary optical lens, an image is formed according to the amount and reflection of light, so the optical system is designed and manufactured as one integrated body. That is, since the light reflections arising from the overall shape are interdependent, the lens is machined as a complete body from the initial design stage.
FIG. 4 is a cross-sectional view of the omnidirectional cameras of the sensor unit 160 according to another embodiment of the present invention.
Referring to FIG. 4, the sensor unit 160 according to this embodiment may include a first omnidirectional camera 200 and a second omnidirectional camera 201 spaced apart from each other. Since the description of each of the first and second omnidirectional cameras 200 and 201 overlaps the description of the omnidirectional camera 200 above, it is omitted here.
In detail, the first omnidirectional camera 200 may be disposed under the drone and photograph in all directions. In this arrangement, the upper side hidden by the drone and the portion of the lower side hidden by the reflective coating 250 may not be captured.
The second omnidirectional camera 201 may be disposed so as to extend from the reflective coating 250 of the first omnidirectional camera 200. In detail, one end of the barrel 281 of the second omnidirectional camera 201 may be disposed to overlap the reflective coating 250 of the first omnidirectional camera 200.
That is, the first omnidirectional camera 200 and the second omnidirectional camera 201 are spaced a predetermined distance apart in the vertical direction, so the two cameras capture two omnidirectional images having vertical parallax between them.
In more detail, the sensor unit 160 according to this embodiment may be configured as an omnidirectional stereo camera in which the first omnidirectional camera 200 and the second omnidirectional camera 201 are placed at different positions so as to capture omnidirectional images with parallax.
Therefore, the sensor unit 160 according to this embodiment can capture images of all directions around the drone with just the two omnidirectional cameras 200 and, at the same time, precisely measure the distance between the drone and objects captured in the images.
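For reference, in the classical pinhole stereo model the distance Z to a point relates to its image disparity d by Z = f·B/d, where f is the focal length and B the baseline between the two cameras. A minimal sketch for the vertically stacked pair follows; it ignores the additional corrections the omnidirectional lens geometry would require, and all parameter values are assumptions:

```python
def depth_from_vertical_stereo(disparity_px, focal_length_px, baseline_m):
    """Classical stereo relation Z = f * B / d applied to a vertically
    stacked camera pair (a simplified model of the actual geometry)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# e.g. a 10-pixel vertical disparity with f = 800 px and B = 0.05 m
# yields an estimated distance of 4.0 m
print(depth_from_vertical_stereo(10.0, 800.0, 0.05))
```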
A detailed description of how objects around the drone are detected using the images captured by the omnidirectional cameras 200 is given later.
Next, the sensor unit 160 of the collision avoidance drone control device may further include at least one of a motion sensor, an altitude sensor, a distance sensor, a balance sensor, and a vision sensor.
First, the motion sensor may be at least one of a magnetometer, an accelerometer, a gyroscope, and an acceleration-gyro sensor. The drone motion information measured by these motion sensors can be used for drone control for collision avoidance, among other purposes.
In detail, the magnetometer is a sensor that functions as a compass: it measures magnetic north to obtain the drone's heading information, and the processor may combine GPS position information, the magnetometer's heading information, and the accelerometer's movement information to obtain the drone's motion information.
The accelerometer may be a three-axis accelerometer that measures acceleration along the x, y, and z axes to obtain motion information such as the drone's speed, position, inclination, twist, and changes of direction. The processor 170 may use this motion information to keep the drone in a stable attitude.
The gyroscope may be a three-axis gyroscope that measures angular velocity about the x, y, and z axes to obtain the drone's inclination information.
The acceleration-gyro sensor is a sensor combining the three-axis accelerometer and the three-axis gyroscope, and may further include a temperature sensor, obtaining motion information.
Next, the altitude sensor may measure atmospheric pressure to obtain the drone's altitude, and may supplement the altitude information measured through GPS.
Next, the GPS sensor may measure the drone's position, coordinates, and altitude through satellite signals. Such a GPS sensor may also be understood as belonging to the communication unit.
The sensor unit may further include a distance sensor that measures the distance between the drone and an external object. For example, the distance sensor may emit ultrasound or laser light and measure the distance from the time the reflection takes to return. This distance information may additionally be used by the processor 170 in controlling the drone.
The sensor unit may also include a vision sensor that captures external images and analyzes the captured image patterns to determine whether obstacles are present. Such a vision sensor has a function similar to the omnidirectional camera, but is driven when the drone hovers at an altitude of 10 m or less, checking for obstacles and helping to prevent collisions.
Next, the collision avoidance drone control device may include a processor 170 that controls the overall operation of each unit within the device.
The processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
Hereinafter, the process by which a collision avoidance drone control device composed of the units above provides the emergency collision avoidance function is described in detail with reference to FIG. 5.
First, the sensor unit 160 may capture an omnidirectional image of the drone's surroundings (S101).
In an embodiment in which the sensor unit 160 consists of one omnidirectional camera 200, that camera detects the light incident through its single external omnidirectional lens 200a to acquire an omnidirectional image of the drone's surroundings.
In another embodiment, in which the sensor unit 160 includes an omnidirectional stereo camera, the two omnidirectional cameras each detect the light incident through their single external omnidirectional lenses 200a to acquire two omnidirectional images with parallax between them.
The acquired omnidirectional image may be circular, and may have a donut shape because of the reflective coating 250.
Next, the processor 170 may divide the acquired omnidirectional image (S102).
An embodiment that image-processes the entire omnidirectional image at once to detect objects is also possible, but dividing is intended to keep the processor 170 from being overloaded when the entire high-volume omnidirectional image is processed at once. In detail, the processor 170 may divide the omnidirectional image into a plurality of images and image-process the divided images sequentially, minimizing the instantaneous processing load.
Moreover, as the number of divisions of the omnidirectional image increases, the processor 170 can detect objects lying between the divided images more precisely, so the resolution of the drone's surroundings-sensing information improves.
The processor 170 may also convert the fan-shaped divided images into rectangular images and stitch the rectangular images together, converting the omnidirectional image into a panoramic image.
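As an illustrative sketch, converting the circular (donut-shaped) image into a rectangular panorama can be done by resampling along rays from the image center; the geometry parameters below (image center, inner and outer radii, output size) are assumptions for illustration:

```python
import numpy as np

def unwrap_to_panorama(omni, center, r_min, r_max, out_w=1024, out_h=256):
    """Resample a donut-shaped omnidirectional image into a rectangular
    panorama by nearest-neighbour sampling along radial rays."""
    cy, cx = center
    thetas = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_min, r_max, out_h)
    ys = (cy + radii[:, None] * np.sin(thetas[None, :])).astype(int)
    xs = (cx + radii[:, None] * np.cos(thetas[None, :])).astype(int)
    ys = np.clip(ys, 0, omni.shape[0] - 1)
    xs = np.clip(xs, 0, omni.shape[1] - 1)
    return omni[ys, xs]  # rows = radius (elevation), columns = azimuth
```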
For example, referring to FIG. 6A, the processor 170 may divide the omnidirectional image at equal intervals along the arc direction.
The processor 170 may then determine the priority of the divided images and image-process them sequentially, starting from the highest-priority divided image, to analyze the objects contained in each.
In detail, the processor 170 may analyze the divided image on the side of the drone's direction of movement with the highest priority, and analyze the divided images adjacent to it afterward.
According to another embodiment, the processor 170 may divide the omnidirectional image into regions of different sizes.
In detail, the processor 170 may designate a main surveillance region within the omnidirectional image and divide that region into many images, raising the resolution so that objects in the main surveillance region are detected precisely.
For example, referring to FIG. 6B, the processor 170 may divide the omnidirectional image so that the region matching the drone's direction of movement (D) has a small division angle (θ1) and the division angle grows (up to θ5) with distance from the movement-direction divided image.
That is, the processor 170 divides the image region on the movement-direction (D) side finely, raising the resolution to detect objects in the direction of movement precisely, while dividing the image regions far from the direction of movement coarsely to detect dangerous objects roughly.
In addition, when the processor 170 detects a dangerous object approaching the drone in the omnidirectional image, it may divide the image region on that side finely so as to detect the dangerous object precisely.
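A minimal sketch of such non-uniform division follows; the minimum and maximum sector widths and the linear growth rule are assumptions, not values from the disclosure:

```python
def division_angles(heading_deg, theta_min=20.0, theta_max=60.0):
    """Sketch of non-uniform sector division: narrow sectors near the
    drone's direction of movement, widening sectors away from it.
    Returns (start_deg, end_deg) sectors covering 360 degrees."""
    sectors, angle = [], 0.0
    while angle < 360.0:
        # angular offset of the sector start from the heading, in [0, 180]
        offset = abs((angle + 180.0 - heading_deg) % 360.0 - 180.0)
        width = theta_min + (theta_max - theta_min) * offset / 180.0
        width = min(width, 360.0 - angle)  # do not overshoot a full circle
        sectors.append((angle, angle + width))
        angle += width
    return sectors
```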
Next, the processor 170 may extract the objects captured in the omnidirectional image (S103).
First, a method of detecting captured objects when the sensor unit 160 consists of one omnidirectional camera 200 is described.
One way the processor 170 can extract objects using a single omnidirectional camera 200 is to use only the change in pixel values between a consecutive previous image frame and the current image frame. When an object enters a monitored region where no object previously existed, a change occurs in the whole image frame captured by the omnidirectional camera 200, appearing as a change in pixel values in the region of the frame where the entering object is located. The processor 170 can therefore compare the pixel values of the pixels constituting the previous and current image frames and determine the region where the pixel values changed to be the object image.
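For illustration, the frame-difference extraction described above can be sketched as follows, assuming grayscale frames held as NumPy arrays; the function name and threshold value are hypothetical:

```python
import numpy as np

def object_mask_frame_diff(prev_frame, curr_frame, threshold=25):
    """Pixels whose value changed by more than a threshold between
    consecutive frames are treated as belonging to an entering object.
    Frames are grayscale uint8 arrays of identical shape."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)  # 1 where change occurred
```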
As another method of extracting objects, a criterion for extracting an object from the current image frame may be set using a plurality of previous image frames. That is, the processor 170 generates a background image for object extraction based on a plurality of previous image frames that temporally precede the current frame, and extracts objects based on the differences between the pixel values of the background image and those of the current frame. A region whose pixel values do not change across the previous frames is regarded as background containing no object, so the processor 170 uses it to generate the background image. Determining the region where pixel values changed between the background image and the current frame to be the object image allows the object to be extracted more accurately than comparing only the previous and current frames.
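A minimal sketch of this background-based extraction follows, using a running average as one common way to accumulate previous frames into a background model; alpha and the threshold are assumed tuning parameters:

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background model; regions that stay constant
    across frames converge to the background estimate. Initialize with
    background = frame.astype(np.float32)."""
    return (1.0 - alpha) * background + alpha * frame.astype(np.float32)

def object_mask_background(background, frame, threshold=25):
    """Pixels deviating from the background model are treated as object."""
    diff = np.abs(frame.astype(np.float32) - background)
    return (diff > threshold).astype(np.uint8)
```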
When the sensor unit 160 is configured as an omnidirectional stereo camera, objects can be detected even more precisely. A method of extracting objects captured in omnidirectional stereo images is described in detail below with reference to FIG. 7.
Referring to FIG. 7, as an example of an internal block diagram of the processor 170, the processor 170 may include an image preprocessor 410, a disparity calculator 420, an object detector 434, an object tracking unit 440, and an application unit 450. In FIG. 7 and the description below, images are described as being processed in the order of the image preprocessor 410, the disparity calculator 420, the object detector 434, the object tracking unit 440, and the application unit 450, but the processing order is not limited thereto.
The image preprocessor 410 may receive the two images from the omnidirectional stereo camera and perform preprocessing.
Specifically, the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on the two images, thereby obtaining images sharper than the stereo images as captured by the omnidirectional cameras 200.
The disparity calculator 420 may receive the images signal-processed by the image preprocessor 410, perform stereo matching on the two received images, and obtain a disparity map according to the stereo matching. That is, disparity information can be obtained for stereo images of the drone's surroundings.
Stereo matching may be performed per pixel of the stereo images or per predetermined block. The disparity map refers to a map that expresses numerically the binocular parallax information between the stereo images, that is, between the left and right images.
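As an illustrative sketch of block-based stereo matching, OpenCV's StereoBM can be applied to two rectangular panoramas unwrapped from the vertically stacked cameras; because StereoBM assumes a horizontal baseline, the panoramas are transposed before matching. The input arrays and parameter values are assumptions:

```python
import cv2
import numpy as np

def disparity_map(pano_top, pano_bottom):
    """Block matching on two grayscale uint8 panoramas with vertical
    parallax; transposing turns the vertical baseline into the
    horizontal one StereoBM expects."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disp = matcher.compute(np.ascontiguousarray(pano_top.T),
                           np.ascontiguousarray(pano_bottom.T))
    return disp.T.astype(np.float32) / 16.0  # StereoBM is 16x fixed-point
```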
The segmentation unit 432 may perform segmentation and clustering on at least one of the images based on the disparity information from the disparity calculator 420.
Specifically, the segmentation unit 432 may separate the background and the foreground in at least one of the stereo images based on the disparity information.
For example, a region of the disparity map in which the disparity information is at or below a predetermined value may be computed as background and excluded, thereby relatively separating the foreground. Alternatively, a region in which the disparity information is at or above a predetermined value may be computed as foreground and extracted, thereby separating the foreground.
By separating the foreground from the background in this way, based on the disparity information extracted from the stereo images, the signal processing speed and volume can be reduced during subsequent object detection.
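A minimal sketch of this disparity-threshold separation follows; the cutoff value is an assumed tuning parameter:

```python
import numpy as np

def split_foreground(disparity_map, cutoff):
    """Large disparity corresponds to a near (foreground) point, small
    disparity to a far (background) point; threshold at the cutoff."""
    foreground = disparity_map >= cutoff
    background = ~foreground
    return foreground, background
```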
Meanwhile, the processor 170 may calculate the positional relationship (for example, the distance) between the drone and a detected object based on the disparity information extracted from the stereo images; the detailed method is described later.
Next, the object detector 434 may detect objects based on the image segments from the segmentation unit 432.
That is, the object detector 434 may detect objects in at least one of the images based on the disparity information.
Specifically, the object detector 434 may detect objects in at least one of the images, for example from the foreground separated by the image segmentation.
Next, the object verification unit 436 may classify and verify the separated objects.
To this end, the object verification unit 436 may use an identification method based on a neural network, a support vector machine (SVM) technique, identification by AdaBoost using Haar-like features, or a histograms of oriented gradients (HOG) technique.
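As one illustrative combination of the techniques named above, HOG features can be fed to an SVM classifier. The sketch below assumes labelled training patches (for example, of birds or buildings) exist; they are not part of the disclosure, and the window size is OpenCV's default:

```python
import cv2
import numpy as np
from sklearn.svm import SVC

hog = cv2.HOGDescriptor()  # default 64x128 detection window

def hog_features(patch):
    """Resize a grayscale uint8 patch to the HOG window and describe it."""
    patch = cv2.resize(patch, (64, 128))
    return hog.compute(patch).ravel()

def train_verifier(patches, labels):
    """Fit a linear SVM on HOG features of labelled candidate patches."""
    X = np.stack([hog_features(p) for p in patches])
    clf = SVC(kernel="linear")
    clf.fit(X, labels)
    return clf
```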
Meanwhile, the object verification unit 436 may verify a detected object by comparing it against the objects stored in the memory 140.

For example, the object verification unit 436 may identify nearby aircraft, buildings, danger zones, and the like located around the drone.

The object tracking unit 440 may track the verified objects. For example, it may identify the object in the sequentially acquired stereo images, compute the motion or motion vector of the identified object, and, based on the computed motion or motion vector, track the movement of that object. This makes it possible to track nearby aircraft, high-rise buildings, danger zones, and the like located around the drone.
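For illustration only, frame-to-frame tracking by motion vector reduces to the following sketch, assuming object centroids have already been matched across frames (all names are hypothetical):

    def update_track(prev_center, curr_center):
        # Motion vector of one matched object between consecutive frames.
        vx = curr_center[0] - prev_center[0]
        vy = curr_center[1] - prev_center[1]
        predicted = (curr_center[0] + vx, curr_center[1] + vy)  # naive next-frame estimate
        return (vx, vy), predicted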
Next, the application unit 450 may compute a collision risk based on the various objects located around the drone, such as nearby aircraft, high-rise buildings, and danger zones; it may likewise compute the probability of collision with an aircraft or a building.

The application unit 450 may then, based on the computed risk, collision probability, slip status, or the like, output a message or similar notification informing the user of this information, as drone-surroundings information. Alternatively, it may generate a control signal for attitude control or flight control of the drone, as drone control information.

Meanwhile, the image preprocessor 410, the disparity calculator 420, the segmentation unit 432, the object detector 434, the object verification unit 436, the object tracking unit 440, and the application unit 450 may be internal components of an image processing unit within the processor 170.

Depending on the embodiment, the processor 170 may include only some of the image preprocessor 410, the disparity calculator 420, the segmentation unit 432, the object detector 434, the object verification unit 436, the object tracking unit 440, and the application unit 450.
Next, the processor 170 may calculate the positional relationship between a detected object and the drone (S104). This step may be executed simultaneously with the step of extracting objects from the omnidirectional image.

In detail, the processor 170 may compare the first omnidirectional image captured by the first omnidirectional camera 200 with the second omnidirectional image captured by the second omnidirectional camera 201 to calculate the direction to the object, the distance to the object, and the like.

More specifically, the processor 170 may obtain the disparity information that arises from the parallax difference between the first omnidirectional image and the second omnidirectional image.

The processor 170 may then correct the disparity information to account for the distortion introduced when the omnidirectional image is captured through the single external omnidirectional lens 200a. In detail, an object imaged near the center of an omnidirectional image appears closer than its actual distance, while an object imaged near the outer periphery appears farther than its actual distance.

Therefore, if the disparity information of the stereo image is not corrected, an error arises in the computed distance.

The processor 170 according to the embodiment can measure the distance between an object and the drone precisely by correcting the disparity information of the stereo image to reflect the distortion characteristics of the omnidirectional image.
Equation 1 is the formula for measuring the distance between the drone and an object based on the stereo image.
[Equation 1]   distance = (f · L) / ((dr + α) + (dl + β))
Here, f is the focal length; L is the baseline, that is, the spacing between the first omnidirectional camera 200 and the second omnidirectional camera 201; dr is the distance from the point where the subject is projected to the focal position of the first image plane (the first disparity); dl is the distance from the point where the subject is projected to the focal position of the second image plane (the second disparity); α is the correction variable that corrects dr; and β is the correction variable that corrects dl.
In detail, referring to FIG. 8, the processor 170 may calculate the first disparity from the first image 200i of the first omnidirectional camera 200, calculate the second disparity from the second image 201i of the second omnidirectional camera 201, and calculate the distance between the object and the drone in inverse proportion to the sum of the first disparity and the second disparity.

Here, the first disparity means the distance from the point where the subject is projected to the focal position of the plane of the first image 200i, and the second disparity means the distance from the point where the subject is projected to the focal position of the plane of the second image 201i.

As described above, because the first disparity and the second disparity embed the distortion produced by the single external circular lens of the omnidirectional camera 200, each may be corrected.

In detail, the processor 170 may correct the first disparity according to the distance and direction by which the subject's projected point in the first image 200i is offset from the focal position, correct the second disparity according to the distance and direction by which the subject's projected point in the second image 201i is offset from the focal position, and calculate the distance between the object and the drone based on the corrected first disparity and the corrected second disparity.

More specifically, if the position at which the subject is projected in the first image 200i is close to the outer periphery of the image, the object will have been imaged as farther away than its actual position, so the processor 170 may correct the first disparity by adding a correction variable α (a negative number) proportional to the distance between the focal position of the first image 200i and the subject's projected position.

Conversely, if the position at which the subject is projected in the second image 201i is close to the center of the image, the object will have been imaged as closer than its actual position, so the processor 170 may correct the second disparity by adding a correction variable β (a positive number) proportional to the distance between the focal position of the second image 201i and the subject's projected position.
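Putting Equation 1 and the two corrections together, a hedged numerical sketch might look as follows; every value and constant here is an assumption for illustration, not taken from the patent:

    def corrected_distance(f, L, dr, dl, alpha, beta):
        # Equation 1: distance is inversely proportional to the corrected disparity sum.
        return (f * L) / ((dr + alpha) + (dl + beta))

    # Hypothetical correction variables, each proportional to the projection's
    # offset from the focal position (0.01 is an assumed tuning constant).
    offset1, offset2 = 320.0, 40.0            # offsets in the first/second image, in pixels
    alpha = -0.01 * offset1                   # negative: periphery reads as too far
    beta = 0.01 * offset2                     # positive: center reads as too near
    d = corrected_distance(f=700.0, L=0.12, dr=14.2, dl=15.1, alpha=alpha, beta=beta)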
Meanwhile, the processor 170 may instead convert the first image 200i of the first omnidirectional camera 200 into a first rectangular panoramic image, convert the second image 201i of the second omnidirectional camera 201 into a second rectangular panoramic image, calculate the first disparity from the first rectangular panoramic image, calculate the second disparity from the second rectangular panoramic image, and calculate the distance between the object and the drone based on the first disparity and the second disparity. That is, the distortion of the omnidirectional image may be corrected in advance, at the step of converting the omnidirectional image into a rectangular image.
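As an illustration of that conversion step (the patent specifies no method), a circular omnidirectional frame can be unwrapped into a rectangular panorama with OpenCV's polar remapping; the optical center and image-circle radius below are assumptions:

    import cv2

    img = cv2.imread("omnidirectional.png")     # hypothetical circular omnidirectional frame
    h, w = img.shape[:2]
    center = (w / 2.0, h / 2.0)                 # assumed optical center
    radius = min(w, h) / 2.0                    # assumed image-circle radius

    # Unwrap the image circle: output rows sweep the angle, columns sweep the radius.
    polar = cv2.warpPolar(img, (0, 0), center, radius, cv2.WARP_POLAR_LINEAR)
    panorama = cv2.rotate(polar, cv2.ROTATE_90_COUNTERCLOCKWISE)  # put the angle on the x-axis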
In this way, the processor 170 can precisely calculate the distance between an object and the drone while taking into account the distortion present in the omnidirectional stereo images.

Next, the processor 170 may detect objects at risk of collision according to the positional relationship between the drone and each object (S105).

In detail, when the distance between the drone and an object is within a predetermined distance, the processor 170 may judge that object to be a collision-risk object.

In doing so, the processor 170 may also consider whether the rate of change of the distance between the drone and the object is negative or positive when judging collision risk.

Then, upon detecting a collision-risk object, the processor 170 may calculate a collision avoidance direction and, through the drive unit 130, control the drone to make an emergency turn in that avoidance direction (S106).

In this case, the processor 170 may control the drone with priority over the remote control signal.

That is, the processor 170 may generate an emergency drone control signal that takes precedence over the remote control signal and transmit it to the drive unit 130 to control the propellers 20, thereby protecting the drone more safely.
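A minimal sketch of steps S105-S106 as just described, with the safety distance, the closing-rate test, and the priority rule written out; the threshold and function names are assumptions:

    def is_collision_risk(distance, prev_distance, safe_distance=5.0):
        # Inside the safety radius and closing in (negative rate of change of distance).
        return distance <= safe_distance and (distance - prev_distance) < 0

    def select_command(remote_cmd, emergency_cmd):
        # The emergency avoidance signal preempts the remote control signal.
        return emergency_cmd if emergency_cmd is not None else remote_cmd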
FIG. 9 is a flowchart illustrating a process by which the processor 170 according to an embodiment of the present invention controls the focus of the omnidirectional camera 200.

Because the distance between objects and the drone changes as the drone keeps moving, the focus of the omnidirectional camera 200 needs to be controlled.

Referring to FIG. 9, first, when the drone begins to lift off and captures omnidirectional video, the distance between the drone and the ground changes, so the focus of the omnidirectional camera 200 needs to be controlled (S201, S202).

In detail, the processor 170 may store the drone's ascent speed in the memory 140 and, once the drone begins to lift off, automatically control the focus of the omnidirectional camera 200 according to the changing altitude (S203).

For example, when the drone ascends at a given speed, the processor 170 may calculate the drone's altitude from that speed and the time elapsed since liftoff, calculate the focus needed to image the ground at that altitude, and control the focus of the omnidirectional camera 200 according to the calculated focus.
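For illustration (all values assumed), this focus schedule reduces to dead-reckoning the altitude from the stored ascent speed and elapsed time, then looking up a focus setting:

    def focus_for_ground(ascent_speed, elapsed_s, focus_table):
        # Dead-reckon the height above ground, then pick the nearest tabulated focus setting.
        altitude = ascent_speed * elapsed_s
        return min(focus_table, key=lambda row: abs(row[0] - altitude))[1]

    # Hypothetical table of (altitude in m, focus setting) pairs.
    setting = focus_for_ground(1.5, 4.0, [(2, 0.2), (5, 0.5), (10, 0.8)])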
To this end, the omnidirectional camera 200 may further include a focus controller that controls the focus.

In detail, referring to FIG. 10, the omnidirectional camera 200 according to another embodiment may, in addition to the same configuration as the embodiment described above, further include a focus lens for controlling the focus and a focus controller that controls the focus lens.

In detail, the omnidirectional camera 200 may include a fluid-focus lens 320 disposed in the inner concave portion and containing two different liquids, and a focus controller 310 including a current injector that contracts or expands the fluid-focus lens 320.

A FluidFocus lens may contain two kinds of liquid that do not mix with each other. The two liquids have different refractive indices, and one conducts electricity while the other does not. The focus controller 310 injects current into one of the two liquids through the current injector; the volume of the conductive liquid changes, causing the other liquid to contract and expand much like the cornea of the human eye, and the focus of the omnidirectional camera 200 is thereby controlled.

Because the fluid-focus lens 320 is shaped to match the inner concave portion 210 of the single external lens (the omnidirectional lens 200a) of the omnidirectional camera 200, it has the advantage of controlling the focus of the entire omnidirectional image uniformly.
Meanwhile, the processor 170 may detect a tracked object, calculate the distance to that object, and then control the focus according to the change in the calculated distance (S205).

Through such control, the processor 170 can image a moving object accurately and, by analyzing the captured video, calculate the distance to the imaged object precisely, further improving the performance of the emergency collision avoidance function.
Meanwhile, since the drone can move upward, there is a need to detect objects in the region above the drone.

FIG. 11 shows the side of a drone equipped with a collision avoidance drone control device according to another embodiment of the present invention, and FIG. 12 is an internal configuration diagram of the lower and upper omnidirectional cameras.

Referring to FIGS. 11 and 12, a collision avoidance drone control device mounted on a drone according to another embodiment may include a lower omnidirectional camera 160a for sensing the region around and below the drone, and an upper omnidirectional camera 160b for sensing the region around and above the drone.

The upper omnidirectional camera 160b may have the same configuration as the omnidirectional camera of the embodiment described above.

However, because the upper omnidirectional camera 160b receives direct sunlight, the surface of its single external omnidirectional lens 200a is preferably window-tinted. That is, the upper omnidirectional camera 160b is identical to the omnidirectional camera described above, differing only in that the surface of the single external omnidirectional lens 200a is window-tinted.

The processor may detect objects imaged by the upper omnidirectional camera 160b, measure the distance between each object and the drone, and, when a detected object is within a predetermined distance, treat it as a dangerous object and perform emergency avoidance control that moves the drone downward.

Conversely, the processor may detect objects imaged by the lower omnidirectional camera 160a, measure the distance between each object and the drone, and, when a detected object is within a predetermined distance, treat it as a dangerous object and perform emergency avoidance control that moves the drone upward.
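A minimal sketch of this two-camera vertical avoidance rule, assuming per-camera minimum distances are already available (the threshold is arbitrary):

    def vertical_avoidance(upper_min_dist, lower_min_dist, safe=3.0):
        # Move away from whichever hemisphere holds the nearest dangerous object.
        if upper_min_dist <= safe:
            return "descend"   # threat above: emergency move downward
        if lower_min_dist <= safe:
            return "ascend"    # threat below: emergency move upward
        return "hold"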
FIG. 13 is a block diagram of a collision avoidance drone control device according to still another embodiment of the present invention, and FIGS. 14 and 15 are cross-sectional views of a liquid lens.

Referring to FIGS. 13 to 15, a collision avoidance control device according to still another embodiment of the present invention includes an input unit 110, a communication unit 120, a drive unit 130, a memory 140, a power supply unit 150, a sensor unit 160, and a processor 170; in particular, the sensor unit 160 may include at least one omnidirectional camera 200 and a liquid lens 121.
The liquid lens 121 may include a first protective glass 121a, a second protective glass 121b, an oil layer 121c and an aqueous solution layer 121d stacked between the first and second protective glasses 121a and 121b, first and second electrode portions 121e and 121f disposed at their periphery for applying a voltage, and first and second insulating portions 121g and 121h that insulate the first and second electrode portions 121e and 121f from each other.

When externally supplied power is applied to the first and second electrode portions 121e and 121f, the radius of curvature and the thickness of the oil layer 121c change, shifting the focal position of the light that passes through the liquid lens 121.

More specifically, when a predetermined voltage is applied to the first and second electrode portions 121e and 121f, the radius of curvature and the thickness of the oil layer 121c may increase, and the focal length can be shortened by increasing the magnitude of the voltage.

Because the liquid lens 121 requires no separate servo motor to move the lens, manufacturing cost is greatly reduced; precise variation of the refractive index is also possible.

The liquid lens 121 may be positioned between the lens array 290 and the omnidirectional lens 200a, behind the lens array 290, or between any two lenses within the lens array 290.

The processor 170 may control the power applied to the liquid lens 121 to vary the refractive index of the liquid lens 121. This allows an approaching object to be identified precisely by focusing from far to near; when focused far, the angle of view narrows in the direction of travel, and by refocusing roughly 500 times per second the object can be identified through image analysis.
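Purely as a sketch of that far-to-near focus sweep (the roughly 500 Hz rate comes from the text; the lens driver interface and everything else here are hypothetical):

    import time

    def sweep_focus(lens, analyze, steps=10, rate_hz=500):
        # Step the liquid lens from far (0.0) to near (1.0), analyzing a frame at each stop.
        for i in range(steps):
            lens.set_power(i / (steps - 1))   # hypothetical driver call varying lens power
            analyze(lens.capture())           # hypothetical frame grab feeding image analysis
            time.sleep(1.0 / rate_hz)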
Meanwhile, referring to FIG. 16, the collision avoidance drone control device according to an embodiment may calculate an optimal movement path from the images captured by the camera and the map information received through the communication unit 120. To this end, the processor 170 may include a movement path setting unit 171 that calculates the optimal movement path.

In detail, looking at FIG. 16, obstacle 1 may appear within the region the drone's camera can image. The collision avoidance drone control device can therefore set an optimal path to the destination that avoids obstacle 1. However, because the region the drone can image is limited, it may be difficult to calculate a movement path for areas beyond the imaging region.

To prepare for this, the collision avoidance drone control device may receive, through the communication unit 120, a map image that matches the captured image. The collision avoidance drone control device may then match the captured image against the received map image to fix where on the map image the drone currently is, and thereafter use the map image to set an optimal path to the destination while avoiding obstacles.

That is, as shown in FIG. 16, obstacle 2 cannot be imaged from the drone, so the device can identify obstacle 2 in advance from the map information and set an optimal movement path even for areas beyond the imaging region.
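As an illustrative sketch only (the patent prescribes no algorithm), obstacles seen by the camera and obstacles known from the map can be merged into one occupancy grid and searched with a standard shortest-path routine:

    from collections import deque

    def plan_path(grid, start, goal):
        # Breadth-first search over an occupancy grid (True = obstacle cell).
        rows, cols = len(grid), len(grid[0])
        came, queue = {start: None}, deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came[cell]
                return path[::-1]   # cells from start to goal
            r, c = cell
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                        and not grid[nxt[0]][nxt[1]] and nxt not in came):
                    came[nxt] = cell
                    queue.append(nxt)
        return None   # no obstacle-free route within the grid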
Meanwhile, the present invention can also be embodied as computer-readable code on a computer-readable recording medium. Computer-readable recording media include all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and they also include implementation in the form of carrier waves (for example, transmission over the Internet). The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. Functional programs, code, and code segments for implementing the present invention can readily be inferred by programmers in the technical field to which the present invention belongs.
Although the detailed description of the present invention above refers to preferred embodiments of the present invention, those skilled in the art, or those with ordinary knowledge of the relevant technical field, will understand that the present invention can be variously modified and altered without departing from the spirit and technical scope of the present invention as set forth in the claims below. Accordingly, the technical scope of the present invention should not be limited to the contents of the detailed description of the specification but should be defined by the claims.
The present invention is a device that is mounted on a drone and controls the drone's movement so as to avoid collisions while the drone is moving.

Claims (10)

  1. A collision avoidance drone control device comprising:
    a drive unit that drives the rotary wings of a drone;
    a sensor unit including an omnidirectional camera that captures the full surroundings of the drone through a single external omnidirectional lens to obtain omnidirectional images; and
    a processor that image-processes the omnidirectional images to detect objects located around the drone,
    measures the distance between a detected object and the drone based on at least two of the omnidirectional images, and,
    when the distance between the object and the drone is at or below a predetermined distance, controls the drive unit so that the drone makes an emergency avoidance of the object.
  2. The collision avoidance drone control device of claim 1, wherein the processor:
    divides the omnidirectional image at intervals along the circumferential direction; and
    image-processes the divided images to detect the objects around the drone.
  3. The collision avoidance drone control device of claim 2, wherein the processor divides the omnidirectional image such that the division angle is small in the region matching the drone's direction of travel and grows progressively larger with distance from the region matching the direction of travel.
  4. The collision avoidance drone control device of claim 1, wherein the sensor unit includes a first omnidirectional camera disposed on the lower part of the drone and a second omnidirectional camera disposed below the first omnidirectional camera, the first omnidirectional camera and the second omnidirectional camera together forming an omnidirectional stereo camera.
  5. The collision avoidance drone control device of claim 4, wherein the processor includes:
    an image preprocessor that preprocesses the omnidirectional images to obtain stereo images;
    a disparity calculator that stereo-matches the stereo images to obtain disparity information;
    a segmentation unit that separates the background and foreground of the stereo images based on the disparity information; and
    an object verification unit that detects at least one object in the separated foreground.
  6. The collision avoidance drone control device of claim 5, wherein the processor:
    calculates a first disparity from a first image of the first omnidirectional camera;
    calculates a second disparity from a second image of the second omnidirectional camera; and
    calculates the distance between the object and the drone in inverse proportion to the sum of the first disparity and the second disparity.
  7. The collision avoidance drone control device of claim 6, wherein the processor:
    corrects the first disparity according to the distance and direction by which the point where the subject is projected in the first image is offset from the focal position;
    corrects the second disparity according to the distance and direction by which the point where the subject is projected in the second image is offset from the focal position; and
    calculates the distance between the object and the drone based on the corrected first disparity and the corrected second disparity.
  8. The collision avoidance drone control device of claim 1, wherein the sensor unit further includes at least one of a motion sensor, an altitude sensor, a distance sensor, a balance sensor, or a vision sensor.
  9. The collision avoidance drone control device of claim 8, wherein the distance sensor is an ultrasonic sensor that obtains distance information to an external object by emitting ultrasonic waves and measuring the time for them to be reflected, and
    the processor controls the drive unit, based on the distance information, so that the drone makes an emergency avoidance of the object.
  10. The collision avoidance drone control device of claim 1, further comprising a communication unit that receives map information including imagery from an external server,
    wherein the processor maps the captured omnidirectional images onto the map information and calculates a movement path based on the mapped map information.
PCT/KR2017/015463 2016-12-26 2017-12-26 Drone control device for collision avoidance WO2018124688A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2016-0179069 2016-12-26
KR1020160179069A KR101895343B1 (en) 2016-12-26 2016-12-26 Collision avoidance apparatus for vehicles

Publications (1)

Publication Number Publication Date
WO2018124688A1 true WO2018124688A1 (en) 2018-07-05

Family

ID=62709810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/015463 WO2018124688A1 (en) 2016-12-26 2017-12-26 Drone control device for collision avoidance

Country Status (2)

Country Link
KR (1) KR101895343B1 (en)
WO (1) WO2018124688A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110544009A (en) * 2019-07-26 2019-12-06 中国人民解放军海军航空大学青岛校区 Aviation organic coating aging damage quantitative evaluation method based on digital image processing

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102526857B1 (en) * 2018-07-16 2023-04-28 한국전자통신연구원 A method and apparatus for detecting unmanned aerial vehicle using stereo camera and additional camera
KR102167332B1 (en) 2018-10-19 2020-10-19 안병열 Drone
KR102171043B1 (en) * 2019-02-28 2020-10-28 한국과학기술원 Method for detecting and avoiding obstacles through image processing of aircraft
KR102281164B1 (en) 2019-11-20 2021-07-23 한국생산기술연구원 Apparatus and method for recognizing line using mutual relation of linear characteristic information
KR20210065459A (en) 2019-11-27 2021-06-04 김민호 Evasion flight control method of dron for flight obs/tacle
KR102305307B1 (en) 2019-11-27 2021-09-27 김민호 Evasion flight control sys/tem of dron for flight obs/tacle based on air injection
KR102391771B1 (en) 2020-04-02 2022-04-27 함영국 Method for operation unmanned moving vehivle based on binary 3d space map
KR102131377B1 (en) 2020-04-17 2020-07-08 주식회사 파블로항공 Unmanned Vehicle for monitoring and system including the same
KR102316012B1 (en) * 2020-05-26 2021-10-22 (주)파이온시스템즈 Apparatus and method for determining possibility of collision with flying object in front of drone using camera image provided in drone
KR20230157192A (en) 2022-05-09 2023-11-16 청주대학교 산학협력단 Method, apparatus and system for intellignece shield using uav swarm
KR102638951B1 (en) 2023-08-09 2024-02-21 (주)모빌리티원 Heterogeneous drone/robot Unified Remote-Control system and method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070106473A1 (en) * 2005-01-24 2007-05-10 Bodin William K Navigating a uav with obstacle avoidance algorithms
US20110164108A1 (en) * 2009-12-30 2011-07-07 Fivefocal Llc System With Selective Narrow FOV and 360 Degree FOV, And Associated Methods
KR20130037697A (en) * 2013-03-12 2013-04-16 노인철 The conflict prevention system and methods of unmanned aerial vehicle
KR20150113586A (en) * 2014-03-31 2015-10-08 세종대학교산학협력단 Multi rotor unmanned aerial vehicle, autonomous flight control method augmented by vision sensor thereof and record media recorded program for implement thereof
KR20160083774A (en) * 2015-01-02 2016-07-12 (주)창조인프라 Drone with self camera and photographer chasing function

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110544009A (en) * 2019-07-26 2019-12-06 中国人民解放军海军航空大学青岛校区 Aviation organic coating aging damage quantitative evaluation method based on digital image processing
CN110544009B (en) * 2019-07-26 2022-12-09 中国人民解放军海军航空大学青岛校区 Aviation organic coating aging damage quantitative evaluation method based on digital image processing

Also Published As

Publication number Publication date
KR20180075111A (en) 2018-07-04
KR101895343B1 (en) 2018-09-05

Similar Documents

Publication Publication Date Title
WO2018124688A1 (en) Drone control device for collision avoidance
WO2020071839A1 (en) Ship and harbor monitoring device and method
US10904430B2 (en) Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle
WO2017008224A1 (en) Moving object distance detection method, device and aircraft
WO2018110848A1 (en) Method for operating unmanned aerial vehicle and electronic device for supporting the same
US11030760B2 (en) Image processing device, ranging device and method
WO2017066927A1 (en) Systems, methods, and devices for setting camera parameters
WO2016106715A1 (en) Selective processing of sensor data
WO2017008206A1 (en) Dual lens system having a light splitter
WO2015093828A1 (en) Stereo camera and vehicle comprising same
WO2019017592A1 (en) Electronic device moved based on distance from external object and control method thereof
WO2016143983A1 (en) Method and device for radiating light used to capture iris
WO2020171512A1 (en) Electronic device for recommending composition and operating method thereof
WO2020091262A1 (en) Method for processing image by using artificial neural network, and electronic device supporting same
WO2015093823A1 (en) Vehicle driving assistance device and vehicle equipped with same
US11107245B2 (en) Image processing device, ranging device, and method
WO2020141827A1 (en) Optical system, and camera module comprising same
WO2020071823A1 (en) Electronic device and gesture recognition method thereof
WO2019143050A1 (en) Electronic device and method for controlling autofocus of camera
WO2023008791A1 (en) Method for acquiring distance to at least one object located in any direction of moving object by performing proximity sensing, and image processing device using same
WO2016186319A1 (en) Vehicle driving assisting device and vehicle
WO2016105074A1 (en) Lens optical system
WO2020189909A2 (en) System and method for implementing 3d-vr multi-sensor system-based road facility management solution
WO2018135745A1 (en) Method and device for genertating image for indicating object on periphery of vehicle
WO2020138760A1 (en) Electronic device and control method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17888914

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/09/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17888914

Country of ref document: EP

Kind code of ref document: A1