WO2023003100A1 - Control apparatus for tracking and photographing subject, drone, and method for operating control apparatus - Google Patents

Control apparatus for tracking and photographing subject, drone, and method for operating control apparatus

Info

Publication number
WO2023003100A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
subject
camera
angle
location coordinates
Prior art date
Application number
PCT/KR2021/017690
Other languages
French (fr)
Korean (ko)
Inventor
정승호
정승현
Original Assignee
주식회사 아르고스다인
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 아르고스다인 filed Critical 주식회사 아르고스다인
Publication of WO2023003100A1 publication Critical patent/WO2023003100A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 39/00 Aircraft not otherwise provided for
    • B64C 39/02 Aircraft not otherwise provided for characterised by special use
    • B64C 39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B 15/006 Apparatus mounted on flying objects

Definitions

  • An embodiment of the present invention relates to a subject tracking and shooting control apparatus, a drone, and an operating method thereof, and more particularly to a control apparatus, drone, and operating method for tracking and photographing a subject by controlling the angle and zoom of a camera mounted on the drone based on the positions of the drone and the subject.
  • An unmanned aerial vehicle is a single-use or reusable powered aircraft that carries no pilot, is sustained by aerodynamic lift, flies autonomously or under remote control, and can carry weapons or general cargo.
  • Among unmanned aerial vehicles, very small ones are called drones.
  • Drones are used across many industries. They were traditionally used for military and hobby purposes, but their applications have recently broadened from transportation to film and broadcasting, and aircraft of various sizes and performance levels are being developed to suit each purpose.
  • In particular, drones are deployed in areas inaccessible to humans, such as jungles, remote regions, volcanic areas, natural disaster sites, and nuclear power plant accident zones.
  • A camera gimbal is a device that electronically compensates for shake when shooting while moving, without rigidly fixing the camera. It usually has two or three axes and operates by feeding tilt information from a gyro sensor to motors that hold the camera in position.
  • When a drone in flight photographs a subject, the relative positions of the camera and the subject change continuously, so it is not easy to keep the subject in the center of the image. The present invention was devised to solve this problem, and its object is to provide an apparatus, drone, and method for actively tracking and photographing a subject from a drone in flight.
  • To this end, a subject tracking and shooting control method performed by a drone includes: receiving, from a server or a user terminal, a specific area set by a user in an image captured by the drone, and recognizing the specific area as a subject; acquiring the location coordinates of the drone and the location coordinates of the subject; calculating, based on those coordinates, a direction angle of the camera that positions the subject at the center of the image; calculating a magnification of the camera based on the same coordinates; and driving a camera gimbal according to the calculated direction angle and adjusting the camera zoom according to the calculated magnification.
  • In one embodiment, calculating the direction angle of the camera includes calculating the sum of the angle (α) between north and the drone's heading and the angle (θ) between north and the subject, both measured about the drone's location coordinates, as the yaw angle of the camera gimbal, and calculating the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as the roll angle of the camera gimbal.
  • In one embodiment, the magnification of the camera is calculated based on the distance (D) between the drone and the subject, the length (D1) of the subject as measured by the camera, and the camera's angle of view.
  • In one embodiment, the length (D1) of the subject as measured by the camera is calculated by the following equation:
  • [Equation 1] D1 = D2 × cos(∠B + ∠A - 90°), where D2 is the diagonal length of the subject, ∠A is the direction angle (azimuth) of the camera, and ∠B is the diagonal direction angle of the specific area serving as the subject.
  • In one embodiment, in acquiring the location coordinates, the drone's coordinates are obtained from a sensor unit mounted on the drone, and the subject's coordinates are received from a server or a user terminal.
  • A drone according to another embodiment includes: an image acquisition unit that acquires images captured by a mounted camera; a camera gimbal that controls the direction angle of the camera; a communication unit that transmits the captured images and receives, from a server or user terminal, a specific area set by a user in a captured image; an object recognition unit that recognizes the set specific area as a subject; a sensor unit that generates location coordinates using a sensor mounted on the drone; and a shooting control unit that calculates the direction angle and magnification of the camera based on the drone's location coordinates and the subject's location coordinates obtained from the server or user terminal through the communication unit, and adjusts the camera gimbal and the camera zoom accordingly.
  • In one embodiment, the shooting control unit calculates the sum of the angle (α) between north and the drone's heading and the angle (θ) between north and the subject, both measured about the drone's location coordinates, as the yaw angle of the camera gimbal, and calculates the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as the roll angle of the camera gimbal.
  • In one embodiment, the shooting control unit calculates the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject as measured by the camera, and the camera's angle of view.
  • In one embodiment, the specific area is a rectangle, the location coordinates of the subject are those of the subject's center, and the diagonal length (D2) of the subject is received from a server or user terminal through the communication unit.
  • A subject tracking and shooting control apparatus for a drone includes: a communication unit that receives, from the drone, images captured by a camera mounted on the drone and the drone's location coordinates; an input unit through which a user sets a specific area in a captured image; an object recognition unit that recognizes the set specific area as a subject; a calculation unit that calculates the direction angle and magnification of the camera based on the drone's location coordinates and the subject's location coordinates obtained through the input unit; and a signal generation unit that generates control signals for controlling the direction angle and magnification of the camera and transmits them to the drone through the communication unit.
  • In one embodiment, the calculation unit calculates the sum of the angle (α) between north and the drone's heading and the angle (θ) between north and the subject, both measured about the drone's location coordinates, as the yaw angle of the camera gimbal, and calculates the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as the roll angle of the camera gimbal.
  • In one embodiment, the calculation unit calculates the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject as measured by the camera, and the camera's angle of view.
  • In one embodiment, the calculation unit computes the drone's expected location coordinates at an expected time on its flight path, and calculates the camera's direction angle and magnification based on those expected coordinates and the subject's coordinates.
  • FIG. 1 is a schematic diagram for showing a drone equipped with a camera according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically showing the configuration of a drone having a subject tracking and photographing function according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a subject tracking and shooting control method of a drone according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram for explaining a subject tracking and shooting control method of a drone according to an embodiment of the present invention.
  • FIGS. 5 and 6 are conceptual diagrams for explaining a method of calculating the direction angle of a camera according to an embodiment of the present invention.
  • FIG. 7 is a conceptual diagram for explaining a method of calculating a zoom magnification of a camera according to an embodiment of the present invention.
  • FIG. 8 is a diagram schematically showing the configuration of a subject tracking and shooting control device according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram for showing a drone equipped with a camera according to an embodiment of the present invention.
  • As shown in FIG. 1, a camera 10 mounted on one side or the underside of the drone 1 captures image information that can be used to recognize a specific object as a subject and actively track and photograph it. Here, active tracking shooting means tracking and photographing the subject by controlling the shooting direction angle of the camera 10, based on the positions of the drone 1 and the subject, so that the camera 10 stays focused on the subject. In this way, the subject remains in the center of the captured image even while the drone 1 and/or the subject move.
  • In the drawing, the camera 10 is shown mounted on the front and underside of the drone 1, but it is not limited thereto and may be mounted on the rear or top of the drone 1 depending on the tracking target and environment.
  • The drone 1 may store images captured by the camera 10 or transmit them in real time to a preset external device.
  • FIG. 2 is a block diagram schematically showing the configuration of a drone 1 having a subject tracking and photographing function according to an embodiment of the present invention.
  • As shown in FIG. 2, the drone 1 largely includes a sensor unit 105, a flight unit 110, an image acquisition unit 120, an object recognition unit 130, and a shooting control unit 140, and may further include a communication unit 150 and a storage unit 160.
  • The sensor unit 105 may include a gyro sensor and a GPS sensor; it measures the acceleration, rotation angle, and position of the drone 1 and transmits the measurements to the flight unit 110 and the shooting control unit 140.
  • The gyro sensor collects angular velocity information of the drone 1 and is preferably a 3-axis sensor; that is, it may collect angular velocity information about the x, y, and z axes.
  • The angular velocities about the three axes detected by a 3-axis gyro sensor are called roll, pitch, and yaw, respectively: rotation about the x-axis is roll, rotation about the y-axis is pitch, and rotation about the z-axis is yaw.
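  • As a rough illustration of this roll/pitch/yaw convention, the Python sketch below accumulates 3-axis gyro rates into an attitude estimate; the data layout and the plain Euler integration are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Attitude:
    roll: float   # rotation about the x-axis, degrees
    pitch: float  # rotation about the y-axis, degrees
    yaw: float    # rotation about the z-axis, degrees

def integrate_gyro(att: Attitude, rates_dps: tuple, dt: float) -> Attitude:
    """Accumulate 3-axis gyro angular rates (deg/s) over dt seconds.

    Plain Euler integration drifts over time; real flight stacks fuse the
    gyro with accelerometer/GPS data, so treat this only as an illustration
    of the axis convention described above.
    """
    return Attitude(att.roll + rates_dps[0] * dt,
                    att.pitch + rates_dps[1] * dt,
                    att.yaw + rates_dps[2] * dt)
```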
  • The GPS sensor may acquire location coordinates periodically or aperiodically to recognize the current location of the drone 1.
  • The flight unit 110 generates driving commands according to the flight path of the drone 1 and, according to those commands, drives the plurality of motors that turn the propellers.
  • A driving command may specify movement direction, movement distance, rotation, ascent and descent, and landing and takeoff.
  • The drone can fly along a flight path set by autonomous flight software (FMS), which may set the path based on 3D coordinate information describing the terrain over which the drone 1 will fly.
  • The flight path may be created from the current location and the destination, or driving commands may be received directly from an external device (e.g., a controller) through the communication unit 150 to drive the plurality of motors.
  • The current location and destination may be received from a user terminal or server, as sketched below.
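  • As a minimal sketch of creating a path from a current location and a destination, the code below linearly interpolates waypoints between the two; the waypoint count, the tuple layout, and the straight-line assumption are illustrative and ignore the terrain and zone data mentioned above.

```python
def straight_line_path(start, goal, n_waypoints=20):
    """Interpolate waypoints from the current location to the destination.

    start and goal are (x, y, altitude) tuples in a local metric frame.
    A real planner would follow the 3D terrain data and zone restrictions;
    this linear interpolation is only a sketch.
    """
    return [
        tuple(s + (g - s) * i / (n_waypoints - 1) for s, g in zip(start, goal))
        for i in range(n_waypoints)
    ]
```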
  • In one embodiment, the flight unit 110 may calculate and track the position of the tracking-target object recognized by the object recognition unit 130 (described later) using a TLD (Tracking-Learning-Detection) learning algorithm, and at the same time generate driving commands for steering the unmanned aerial vehicle accordingly.
  • The image acquisition unit 120 includes the camera 10 and the camera gimbal 20, and may obtain images by photographing a subject while in flight.
  • The camera gimbal 20 can adjust the shooting direction angle of the camera and can move the center of the image by rotating within its rotation-angle range.
  • The camera gimbal 20 may operate by having the gyro sensor feed tilt information to motors that are driven to control the camera's shooting angle.
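  • A gimbal loop of that kind can be sketched as a simple per-axis feedback controller; the proportional form and the gain below are illustrative assumptions, since the patent does not specify the control law.

```python
def stabilize_axis(commanded_deg, measured_deg, k_p=2.0):
    """One gimbal axis: drive the motor against the tilt the gyro reports.

    commanded_deg - angle the shooting control unit wants for this axis
    measured_deg  - current angle derived from the gyro's tilt information
    Returns a motor drive command proportional to the remaining error.
    """
    error = commanded_deg - measured_deg
    return k_p * error
```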
  • In one embodiment, the camera gimbal 20 controls the shooting direction angle of the camera 10 according to control signals from the shooting control unit 140.
  • The camera 10 adjusts its zoom according to control signals from the shooting control unit 140 to photograph the subject.
  • The image acquisition unit 120 may store captured images in the storage unit 160 or transmit them in real time to a preset external device through the communication unit 150.
  • The object recognition unit 130 receives a specific area set through a user terminal or server via the communication unit 150 (described later) and recognizes the set specific area as a subject.
  • The specific area is a rectangle or a square.
  • The subject may be any object that can be photographed.
  • A subject drawn as a rectangle may also represent a rectangular geographical area to be monitored. For example, if an area requiring high military security must be monitored continuously, that area can be set as the subject and monitoring operations performed continuously.
  • In one embodiment, the location coordinates of the subject are the coordinates of the four vertices of the rectangle, and the diagonal length (D2) of the subject is calculated from them.
  • Alternatively, the location coordinates of the subject are the coordinates of the center point of the specific area, in which case the diagonal length (D2) may be input from a user terminal or server.
  • To actively track and photograph the subject, the shooting control unit 140 calculates the direction angle and zoom value of the camera 10 based on the position of the drone 1 acquired by the sensor unit 105 and the position of the tracking target obtained from a user terminal or server through the communication unit 150, and adjusts the tilting angle of the camera gimbal 20 and the zoom of the camera 10 according to the result.
  • The communication unit 150 transmits and receives information to and from external devices wirelessly. For example, it may receive zone information, a driving route, and driving commands from a controller or server, and transmit captured images to a server or user terminal.
  • The zone information may include flight restriction zone (A) information and access restriction distance information.
  • Wireless communication methods such as GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), CDMA2000, EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced) may be used.
  • Alternatively, wireless Internet technologies may be used, for example: WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA, HSUPA, LTE, LTE-A, and 5G.
  • The storage unit 160 stores various data for subject tracking and shooting, as well as images captured by the camera 10.
  • The storage unit 160 may provide the stored information at the request of other components, such as the shooting control unit 140.
  • FIG. 3 is a flowchart for explaining a subject tracking and shooting control method of the drone 1 according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram for explaining the method.
  • FIGS. 5 and 6 are conceptual diagrams for explaining how the direction angle of the camera is calculated.
  • FIG. 7 is a conceptual diagram for explaining how the magnification of the camera is calculated.
  • In step S110, a specific area set by the user in an image captured by the drone 1 is received from a server or a user terminal, and the specific area is recognized as the subject.
  • In one embodiment, while the drone 1 flies along its flight path, it photographs the surroundings of the path and provides the images to a server or user terminal.
  • Through the server or user terminal, the user sets a specific area in a captured image, and that area is recognized as the subject.
  • In another embodiment, the drone 1 flies around previously stored or transmitted subject position coordinates and photographs the surroundings.
  • In another embodiment, the drone 1 may recognize the tracking target using feature points in a captured image: by checking the similarity between feature points extracted from the image and the feature points of a preset subject, a candidate whose similarity exceeds a preset threshold may be recognized as the subject. Even when the drone 1 recognizes a subject this way, the subject may be confirmed through a user terminal or server.
  • During this process, the unmanned aerial vehicle may photograph while rotating in place.
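  • One plausible realization of this feature-point similarity check uses OpenCV's ORB detector and brute-force matching; the patent does not name a specific detector or threshold, so all concrete values here are illustrative.

```python
import cv2

def matches_preset_subject(frame, subject_template, min_matches=25):
    """Return True if the frame contains the preset subject.

    Extracts ORB feature points from both images, matches descriptors with
    a Hamming-distance brute-force matcher, and counts sufficiently similar
    pairs against a preset threshold.
    """
    orb = cv2.ORB_create()
    _, des_subject = orb.detectAndCompute(subject_template, None)
    _, des_frame = orb.detectAndCompute(frame, None)
    if des_subject is None or des_frame is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_subject, des_frame)
    good = [m for m in matches if m.distance < 40]  # similarity cutoff
    return len(good) >= min_matches
```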
  • In step S120, the location coordinates of the drone 1 and the location coordinates of the subject are acquired.
  • The drone's coordinates may be obtained from the sensor unit mounted on the drone 1, and the subject's coordinates may be received from a server or user terminal.
  • In another embodiment, the subject's coordinates may be obtained from previously stored map information and the location coordinates matched to it.
  • The drone 1 may match the recognized subject against the stored map based on feature-point similarity and look up the location coordinates of the matched position on the map.
  • In one embodiment, the drone 1 receives the coordinates of the center point of the specific area together with its diagonal length; in another embodiment, it receives the coordinates of all four vertices of the specific area, from which the center coordinates and the diagonal length can be calculated, as sketched below.
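  • A minimal sketch of deriving the center coordinates and the diagonal length (D2) from the four vertices, assuming the vertices are given in order around the rectangle and expressed in a local metric frame (converting GPS latitude/longitude into such a frame is assumed to happen elsewhere):

```python
import math

def center_and_diagonal(vertices):
    """vertices: four (x, y) corners listed in order around the rectangle."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = vertices
    center = ((x0 + x1 + x2 + x3) / 4.0, (y0 + y1 + y2 + y3) / 4.0)
    d2 = math.hypot(x2 - x0, y2 - y0)  # opposite corners -> diagonal D2
    return center, d2
```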
  • In step S130, the sum of the angle (α) between north and the heading of the drone 1 and the angle (θ) between north and the subject, both measured about the drone's location coordinates, is calculated as the yaw angle of the camera gimbal.
  • The angle (α) is the angle between an imaginary line (l1) pointing north from the drone 1 and an imaginary line (l2) pointing along the heading of the drone 1.
  • The angle (θ) is the angle between the imaginary line (l1) and an imaginary line (l3) pointing from the drone 1 toward the subject.
  • In addition, the arctangent of the altitude difference (h_d) between the subject and the drone 1 divided by the distance (d) from the drone 1 to the subject is calculated as the pitch angle (θ_pitch) of the camera gimbal, i.e., θ_pitch = arctan(h_d / d).
  • The distance (d) from the drone 1 to the subject is calculated with the distance formula between two points, using the position coordinates of the drone 1 and of the subject.
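  • The two gimbal angles can be sketched as below; the local east/north frame, the atan2-based bearing, and the degree units are assumptions for illustration, and the yaw is formed as the sum α + θ exactly as the text states.

```python
import math

def gimbal_angles(drone_xy, heading_deg, subject_xy, drone_alt, subject_alt):
    """Yaw and pitch commands for the camera gimbal, per the text above."""
    dx = subject_xy[0] - drone_xy[0]   # east offset to the subject
    dy = subject_xy[1] - drone_xy[1]   # north offset to the subject
    theta = math.degrees(math.atan2(dx, dy))  # angle between north and subject
    yaw = heading_deg + theta                 # alpha + theta, per the text
    d = math.hypot(dx, dy)                    # distance formula between points
    h_d = drone_alt - subject_alt             # altitude difference
    pitch = math.degrees(math.atan2(h_d, d))  # arctan(h_d / d)
    return yaw % 360.0, pitch
```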
  • In step S140, the magnification is calculated based on the location coordinates of the drone 1, obtained from the sensing values of the sensor unit 105, and the location coordinates of the subject.
  • Specifically, the shooting control unit 140 calculates the magnification of the camera based on the distance (D) between the drone 1 and the subject, the horizontal length (D1) of the subject as measured by the camera, and the camera's angle of view.
  • The length (D1) of the subject as measured by the camera is calculated by Equation 1 below:
  • [Equation 1] D1 = D2 × cos(∠B + ∠A - 90°), where D2 is the diagonal length of the subject, ∠A is the direction angle (azimuth) of the camera, and ∠B is the diagonal direction angle of the specific area serving as the subject.
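  • Equation 1 translates directly into code; the degree-based trigonometry below is a one-line sketch of it.

```python
import math

def apparent_length_d1(d2, angle_a_deg, angle_b_deg):
    """Equation 1: D1 = D2 * cos(angle B + angle A - 90 degrees)."""
    angle_c = angle_b_deg + angle_a_deg - 90.0  # angle C, per Equation 4
    return d2 * math.cos(math.radians(angle_c))
```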
  • The angle of view of the camera differs by camera type and may be pre-stored or input from a user terminal or server.
  • As shown in FIG. 7, let (M1) be the line pointing from the drone 1 in the camera direction, and let (M2) be a line perpendicular to (M1) along which the length (D1) of the subject appears in the image captured by the camera of the drone 1.
  • An imaginary rectangle (stuv) whose sides are parallel to the line (M2) is aligned so that it shares one vertex (o) with the specific area (opqr).
  • The angle (∠A) is the angle between an imaginary line pointing north from the drone 1 and an imaginary line pointing in the camera direction, i.e., the azimuth of the camera.
  • The angle (∠B), the azimuth of the diagonal, is the angle between the diagonal (oq) of the rectangular specific area (opqr) recognized as the subject and the side it meets.
  • the angle ( ⁇ F) between the change op of the specific area opqr and the side st of the virtual rectangle stuv is equal to 90°-camera azimuth angle ( ⁇ A) (Equation 3 below).
  • Equation 4 the sum of the angle ( ⁇ C) between the side (st) of the imaginary rectangle (stuv) and the diagonal (oq) of the specific area (opqr) and the angle ( ⁇ F) is equal to the azimuth ( ⁇ B) of the diagonal . If this is expressed as an equation, it is as shown in Equation 4 below.
  • the length of the subject measured by the camera (D1) the length of the diagonal of the subject (D2) ⁇ cosC, if the angle ( ⁇ C) is expressed as the azimuth of the diagonal ( ⁇ B) + the azimuth of the camera ( ⁇ A) - 90 °, math
  • the length D1 of the subject measured by the camera in Equation 1 is derived.
  • The shooting control unit 140 then calculates the magnification (M) according to Equation 5, based on the horizontal length (D1) of the subject obtained from Equation 1 and the angle of view of the camera (θ_camera), and adjusts the zoom accordingly.
  • In Equation 5, A denotes a predetermined constant.
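  • The body of Equation 5 appears only as an image in the source and is not reproduced here, so the sketch below is a hypothetical reconstruction under one natural reading: zoom in until the subject's projected length (D1) spans the frame at distance (D), scaled by the predetermined constant A.

```python
import math

def zoom_magnification(distance_d, d1, fov_deg, a=1.0):
    """Hypothetical form of Equation 5 (the original equation is not
    reproduced in the source text).

    scene_width is the width covered at distance D with the native field
    of view; dividing by D1 gives the zoom factor that makes the subject
    fill the frame, times the predetermined constant A.
    """
    scene_width = 2.0 * distance_d * math.tan(math.radians(fov_deg) / 2.0)
    return a * scene_width / d1
```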
  • In step S150, the camera gimbal 20 is driven according to the calculated direction angle of the camera, and the camera zoom is adjusted according to the magnification (M) of the camera 10.
  • Specifically, a first control signal for driving the camera gimbal 20 is generated from the calculated direction angle, and a second control signal for adjusting the camera zoom is generated from the magnification of the camera 10.
  • The first control signal is transmitted to the camera gimbal 20 to control the direction angle of the camera, and the second control signal is transmitted to the camera 10 to control its zoom.
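  • Pulling the earlier sketches together, one control iteration might look like the following; the drone_state/subject/camera objects and the apply/set_zoom interfaces are hypothetical stand-ins, not a real drone API.

```python
import math

def control_step(drone_state, subject, camera):
    """Compute gimbal angles and magnification, then send both signals.

    Reuses gimbal_angles, apparent_length_d1, and zoom_magnification from
    the sketches above.
    """
    yaw, pitch = gimbal_angles(drone_state.xy, drone_state.heading_deg,
                               subject.xy, drone_state.alt, subject.alt)
    d = math.hypot(subject.xy[0] - drone_state.xy[0],
                   subject.xy[1] - drone_state.xy[1])
    d1 = apparent_length_d1(subject.d2, yaw, subject.diag_azimuth_deg)
    m = zoom_magnification(d, d1, camera.fov_deg)
    camera.gimbal.apply(yaw, pitch)  # first control signal: direction angle
    camera.set_zoom(m)               # second control signal: zoom
```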
  • FIG. 8 is a diagram schematically showing the configuration of a subject tracking and shooting control device according to an embodiment of the present invention.
  • As shown in FIG. 8, the subject tracking and shooting control apparatus 200 may include a communication unit 210, an input unit 220, an object recognition unit 230, a calculation unit 240, and a signal generation unit 250.
  • The subject tracking and shooting control apparatus 200 may further include an output unit (not shown).
  • The communication unit 210 transmits and receives information to and from the drone 1.
  • The communication unit 210 receives images captured by a camera mounted on the drone 1 and the location coordinates of the drone 1.
  • The received images and the location coordinates of the drone 1 may be transmitted to the object recognition unit 230 and the calculation unit 240.
  • Signals generated by the signal generation unit 250 may be transmitted to the camera gimbal and the camera mounted on the drone 1.
  • The input unit 220 converts a user's input operation into an input signal and transmits it to other components, for example the object recognition unit 230 and the calculation unit 240.
  • The input unit 220 may be implemented as, for example, a keyboard, a mouse, a touch sensor on a touch screen, a touch pad, a keypad, voice input, or any other input processing device available now or in the future.
  • The input unit 220 receives, for example, the user's selection of a specific area in a captured image shown on the output unit (not shown).
  • To set the specific area, the input unit 220 provides an interface, receives the specific area through that interface, and transmits it to the object recognition unit 230.
  • The calculation unit 240 calculates the direction angle and magnification of the camera based on the location coordinates of the drone 1 received through the communication unit 210 and the location coordinates of the subject received through the input unit 220.
  • The calculation unit 240 calculates the direction angle of the camera in the same way as step S130 of FIG. 3, so a detailed description is omitted.
  • Likewise, the calculation unit 240 calculates the magnification in the same way as step S140 of FIG. 3, so a detailed description is omitted.
  • The signal generation unit 250 generates control signals according to the direction angle and magnification calculated by the calculation unit 240 and transmits them to the drone 1 through the communication unit, thereby controlling the direction angle of the camera gimbal and the zoom of the camera.
  • In one embodiment, the subject tracking and shooting control apparatus 200 receives the flight path in advance and, before the drone reaches each point on the path, completes the calculations for controlling the direction angle of the camera gimbal and the camera zoom based on the expected position coordinates of the drone 1 and the position coordinates of the subject. The apparatus can thus predict in advance when the drone 1 will arrive at each expected position on the flight path and control the gimbal direction angle and camera zoom at the predicted times.
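  • A sketch of that precomputation, assuming constant ground speed between waypoints and reusing the earlier helpers; the waypoint layout and the returned schedule format are illustrative choices.

```python
import math

def precompute_schedule(waypoints, speed_mps, subject, camera_fov_deg):
    """Return (expected_time, yaw, pitch, magnification) per waypoint.

    waypoints are (x, y, altitude) tuples along the known flight path;
    expected arrival times follow from the assumed constant speed.
    """
    schedule, t, heading = [], 0.0, 0.0  # initial heading assumed due north
    for i, wp in enumerate(waypoints):
        if i > 0:
            prev = waypoints[i - 1]
            leg = math.hypot(wp[0] - prev[0], wp[1] - prev[1])
            t += leg / speed_mps
            heading = math.degrees(math.atan2(wp[0] - prev[0],
                                              wp[1] - prev[1]))
        yaw, pitch = gimbal_angles((wp[0], wp[1]), heading,
                                   subject.xy, wp[2], subject.alt)
        d = math.hypot(subject.xy[0] - wp[0], subject.xy[1] - wp[1])
        d1 = apparent_length_d1(subject.d2, yaw, subject.diag_azimuth_deg)
        m = zoom_magnification(d, d1, camera_fov_deg)
        schedule.append((t, yaw, pitch, m))
    return schedule
```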
  • Each block of the flowchart illustrations, and combinations of blocks, can be implemented by computer program instructions. These instructions may be loaded onto a processor of a general-purpose computer, special-purpose computer, or other programmable data processing equipment, so that the instructions executed by the processor create means for implementing the functions specified in the flowchart block(s).
  • These computer program instructions may also be stored in a computer-usable or computer-readable memory that can direct a computer or other programmable data processing equipment to function in a particular manner, such that the instructions stored in that memory produce an article of manufacture including instruction means that implement the functions specified in the flowchart block(s). The instructions may also be loaded onto a computer or other programmable data processing equipment to cause a series of operational steps to be performed, producing a computer-implemented process such that the executed instructions provide steps for implementing the functions specified in the flowchart block(s).
  • Each block may represent a module, segment, or portion of code comprising one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of order; for example, two blocks shown in succession may be executed substantially concurrently, or in reverse order, depending on the functionality involved.
  • The term 'unit' used in this embodiment means software or a hardware component such as an FPGA or ASIC, and a 'unit' performs certain roles.
  • However, 'unit' is not limited to software or hardware.
  • A 'unit' may be configured to reside in an addressable storage medium and to run on one or more processors. Thus, as an example, 'unit' encompasses components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
  • Components and 'units' may be combined into fewer components and 'units' or further separated into additional components and 'units'.
  • In addition, components and 'units' may be implemented to run on one or more CPUs in a device or a secure multimedia card.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Studio Devices (AREA)
  • Human Computer Interaction (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present specification relates to a control apparatus for tracking and photographing a subject by using a drone, the drone, and a method for operating the control apparatus. A control method for tracking and photographing a subject by using a drone comprises the steps of: receiving, from a server or a user terminal, a specific region set by a user in an image captured by the drone, and recognizing the specific region as a subject; obtaining location coordinates of the drone and location coordinates of the subject; calculating a direction angle of a camera on the basis of the location coordinates of the drone and the location coordinates of the subject; calculating a magnification of the camera on the basis of the location coordinates of the drone and the location coordinates of the subject; and driving a camera gimbal according to the calculated direction angle of the camera, and adjusting camera zoom according to the magnification of the camera.

Description

Subject tracking and shooting control apparatus, drone, and operating method thereof
An embodiment of the present invention relates to a subject tracking and shooting control apparatus, a drone, and an operating method thereof, and more particularly to a control apparatus, drone, and operating method for tracking and photographing a subject by controlling the angle and zoom of a camera mounted on the drone based on the positions of the drone and the subject.
An unmanned aerial vehicle is a single-use or reusable powered aircraft that carries no pilot, is sustained by aerodynamic lift, flies autonomously or under remote control, and can carry weapons or general cargo. Among unmanned aerial vehicles, very small ones are called drones.
Drones are used across many industries. They were traditionally used for military and hobby purposes, but their applications have recently broadened from transportation to film and broadcasting, and aircraft of various sizes and performance levels are being developed to suit each purpose. In particular, drones are deployed in areas inaccessible to humans, such as jungles, remote regions, volcanic areas, natural disaster sites, and nuclear power plant accident zones.
A camera gimbal is a device that electronically compensates for shake when shooting while moving, without rigidly fixing the camera. It usually has two or three axes and operates by feeding tilt information from a gyro sensor to motors that hold the camera in position.
When a drone in flight photographs a subject, the relative positions of the camera and the subject change continuously, so it is not easy to keep the subject in the center of the image.
The present invention was devised to solve this problem, and its object is to provide an apparatus, drone, and method for actively tracking and photographing a subject from a drone in flight.
To this end, a subject tracking and shooting control method performed by a drone according to an embodiment of the present invention includes: receiving, from a server or a user terminal, a specific area set by a user in an image captured by the drone, and recognizing the specific area as a subject; acquiring the location coordinates of the drone and the location coordinates of the subject; calculating, based on those coordinates, a direction angle of the camera that positions the subject at the center of the image; calculating a magnification of the camera based on the same coordinates; and driving a camera gimbal according to the calculated direction angle and adjusting the camera zoom according to the calculated magnification.
In one embodiment, calculating the direction angle of the camera includes calculating the sum of the angle (α) between north and the drone's heading and the angle (θ) between north and the subject, both measured about the drone's location coordinates, as the yaw angle of the camera gimbal, and calculating the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as the roll angle of the camera gimbal.
In one embodiment, the magnification of the camera is calculated based on the distance (D) between the drone and the subject, the length (D1) of the subject as measured by the camera, and the camera's angle of view.
In one embodiment, the length (D1) of the subject as measured by the camera is calculated by the following equation:
[Equation 1] D1 = D2 × cos(∠B + ∠A - 90°)
In Equation 1, D2 is the diagonal length of the subject, ∠A is the direction angle (azimuth) of the camera, and ∠B is the diagonal direction angle of the specific area serving as the subject.
In one embodiment, in acquiring the location coordinates of the drone and the subject, the drone's coordinates are obtained from a sensor unit mounted on the drone, and the subject's coordinates are received from a server or a user terminal.
A drone according to another embodiment of the present invention includes: an image acquisition unit that acquires images captured by a mounted camera; a camera gimbal that controls the direction angle of the camera; a communication unit that transmits the captured images and receives, from a server or user terminal, a specific area set by a user in a captured image; an object recognition unit that recognizes the set specific area as a subject; a sensor unit that generates location coordinates using a sensor mounted on the drone; and a shooting control unit that calculates the direction angle and magnification of the camera based on the drone's location coordinates and the subject's location coordinates obtained from the server or user terminal through the communication unit, and adjusts the camera gimbal and the camera zoom accordingly.
In one embodiment, the shooting control unit calculates the sum of the angle (α) between north and the drone's heading and the angle (θ) between north and the subject, both measured about the drone's location coordinates, as the yaw angle of the camera gimbal, and calculates the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as the roll angle of the camera gimbal.
In one embodiment, the shooting control unit calculates the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject as measured by the camera, and the camera's angle of view.
In one embodiment, the specific area is a rectangle, the location coordinates of the subject are those of the subject's center, and the diagonal length (D2) of the subject is received from a server or user terminal through the communication unit.
A subject tracking and shooting control apparatus for a drone according to another embodiment of the present invention includes: a communication unit that receives, from the drone, images captured by a camera mounted on the drone and the drone's location coordinates; an input unit through which a user sets a specific area in a captured image; an object recognition unit that recognizes the set specific area as a subject; a calculation unit that calculates the direction angle and magnification of the camera based on the drone's location coordinates and the subject's location coordinates obtained through the input unit; and a signal generation unit that generates control signals for controlling the direction angle and magnification of the camera and transmits them to the drone through the communication unit.
In one embodiment, the calculation unit calculates the sum of the angle (α) between north and the drone's heading and the angle (θ) between north and the subject, both measured about the drone's location coordinates, as the yaw angle of the camera gimbal, and calculates the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as the roll angle of the camera gimbal.
In one embodiment, the calculation unit calculates the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject as measured by the camera, and the camera's angle of view.
In one embodiment, the calculation unit computes the drone's expected location coordinates at an expected time on its flight path, and calculates the camera's direction angle and magnification based on those expected coordinates and the subject's coordinates.
According to the present invention, a drone in flight can automatically and actively track and photograph a subject without user control.
FIG. 1 is a schematic diagram showing a drone equipped with a camera according to an embodiment of the present invention.
FIG. 2 is a block diagram schematically showing the configuration of a drone having a subject tracking and shooting function according to an embodiment of the present invention.
FIG. 3 is a flowchart for explaining a subject tracking and shooting control method of a drone according to an embodiment of the present invention.
FIG. 4 is a schematic diagram for explaining a subject tracking and shooting control method of a drone according to an embodiment of the present invention.
FIGS. 5 and 6 are conceptual diagrams for explaining a method of calculating the direction angle of a camera according to an embodiment of the present invention.
FIG. 7 is a conceptual diagram for explaining a method of calculating the zoom magnification of a camera according to an embodiment of the present invention.
FIG. 8 is a diagram schematically showing the configuration of a subject tracking and shooting control apparatus according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. The configuration of the present invention and its effects will be clearly understood through the following detailed description. Before the detailed description, note that identical components are denoted by the same reference numerals wherever possible, even across different drawings, and that detailed descriptions of known components are omitted where they might obscure the gist of the present invention.
도 1은 본 발명의 실시예에 따른 카메라를 장착한 드론을 나타내기 위한 개략적인 도면이다.1 is a schematic diagram for showing a drone equipped with a camera according to an embodiment of the present invention.
도 1에 도시된 바와 같이, 드론(1)의 일측면 또는 하방에 설치되는 무인항공기 탑재용 카메라(10)가 촬영하는 영상정보를 이용하여 특정물체를 피사체로 인식하고 능동적으로 추적 촬영을 할 수 있다. 여기서 능동적 추적 촬영이란 드론(1)의 위치와 피사체의 위치에 기초하여 카메라(10)가 피사체에 초점을 맞추어 촬영할 수 있도록 카메라(10)의 촬영 방향각을 제어하여 피사체를 추적 촬영하는 것을 말한다. 이를 통해 드론(1) 및/또는 피사체가 움직이는 상황에도 피사체는 촬영된 영상의 중앙부분에 위치할 수 있게 된다.As shown in FIG. 1, it is possible to recognize a specific object as a subject and actively take a tracking shot using image information taken by an unmanned aerial vehicle camera 10 installed on one side or below the drone 1. there is. Here, active tracking shooting refers to tracking and shooting a subject by controlling the shooting direction angle of the camera 10 so that the camera 10 can focus on the subject based on the position of the drone 1 and the position of the subject. Through this, even when the drone 1 and/or the subject move, the subject can be located in the center of the captured image.
도면에는 상기 카메라(10)가 드론(1)의 전방 및 하방에 장착되는 것으로 표시되었으나 이에 한정하지 않고 추적대상 및 환경에 따라 드론(1)의 후방이나 상방에 장착될 수 있다.In the drawing, the camera 10 is shown to be mounted on the front and bottom of the drone 1, but is not limited thereto and may be mounted on the rear or top of the drone 1 depending on the tracking target and environment.
일실시예에 따른 드론(1)은 카메라(10)에 의해 촬영되는 영상을 저장하거나 실시간으로 미리 설정된 외부 장치에 전송할 수 있다.The drone 1 according to an embodiment may store an image captured by the camera 10 or transmit it to a preset external device in real time.
도 2는 본 발명의 일실시예에 따른 피사체 추적 촬영 기능을 갖는 드론(1)의 구성을 개략적으로 나타낸 블럭도이다.2 is a block diagram schematically showing the configuration of a drone 1 having a subject tracking and photographing function according to an embodiment of the present invention.
도 2에 도시한 바와 같이, 본 발명의 일실시예에 따른 드론(1)은 크게 센서부(105), 비행부(110), 영상획득부(120), 물체인식부(130) 및 촬영제어부(140)를 포함하고, 통신부(150) 및 저장부(160)를 더 포함할 수 있다.As shown in FIG. 2, the drone 1 according to an embodiment of the present invention largely includes a sensor unit 105, a flight unit 110, an image acquisition unit 120, an object recognition unit 130, and a shooting control unit. 140, and may further include a communication unit 150 and a storage unit 160.
센서부(105)는 자이로 센서 및 GPS 센서 등을 포함할 수 있고 드론(1)의 가속도, 회전각 및 위치 등을 측정하여 비행부(110) 및 촬영제어부(140)로 전달한다.The sensor unit 105 may include a gyro sensor and a GPS sensor, and measures acceleration, rotation angle, position, and the like of the drone 1 and transmits the measurements to the flight unit 110 and the shooting control unit 140 .
자이로 센서는 드론(1)의 각속도 정보를 수집한다. 자이로 센서는 3축 센서인 것이 바람직하다. 즉, 자이로 센서는 x축, y축, z축의 3축의 각속도 정보를 수집할 수 있다. 3축 자이로 센서에서 감지하는 x축, y축 및 z축의 3축의 각 속도를 각각 롤(Roll), 피치(Pitch), 요(Yaw)라고 한다. 구체적으로 x축을 중심으로 회전하는 것을 롤(Roll), y축을 중심으로 회전하는 것을 피치(Pitch), z축을 중심으로 회전하는 것을 요(Yaw)라고 한다.The gyro sensor collects angular velocity information of the drone 1. The gyro sensor is preferably a 3-axis sensor. That is, the gyro sensor may collect angular velocity information of three axes of x-axis, y-axis, and z-axis. The angular velocity of the three axes of x, y, and z axes detected by the 3-axis gyro sensor is called roll, pitch, and yaw, respectively. Specifically, rotation around the x-axis is called roll, rotation around the y-axis is called pitch, and rotation around the z-axis is called yaw.
GPS 센서는 드론(1)의 현위치를 인지하기 위해, 위치좌표를 주기적 또는 비주기적으로 획득할 수 있다.The GPS sensor may acquire location coordinates periodically or non-periodically to recognize the current location of the drone 1.
비행부(110)는 드론(1)의 비행경로에 따라 구동명령을 생성하고, 생성된 구동명령에 따라 프로펠러를 구동하는 복수의 모터를 구동시킨다. 여기서 구동명령은 이동방향, 이동거리, 회전, 고도의 상승 및 하강, 착륙 및 이륙을 포함할 수 있다.The flight unit 110 generates a driving command according to the flight path of the drone 1 and drives a plurality of motors that drive the propeller according to the generated driving command. Here, the driving command may include movement direction, movement distance, rotation, elevation and descent, landing and takeoff.
비행경로는 자율비행 소프트웨어(FMS)에 따라 설정되는 비행경로를 따라 비행할 수 있다. 자율비행 소프트웨어에 따르면 드론(1)이 비행할 지형을 나타내는 3차원 좌표 정보에 기반하여 비행경로가 설정될 수 있다. 비행경로는 현재 위치와 목적지 위치에 기초하여 생성할 수 있고, 통신부(150)를 통해 외부 장치(예를 들어 컨트롤러)로부터 직접 구동명령을 전달받아 상기 복수의 모터를 구동시킬 수 있다. 현재 위치와 목적지 위치는 사용자 단말 또는 서버로부터 수신받아 설정할 수 있다.The flight path can be flown along the flight path set according to the autonomous flight software (FMS). According to the autonomous flight software, a flight path may be set based on 3D coordinate information indicating a terrain on which the drone 1 will fly. The flight path may be created based on the current location and the destination location, and a driving command may be directly received from an external device (eg, a controller) through the communication unit 150 to drive the plurality of motors. The current location and the destination location may be received and set from the user terminal or server.
일실시예에서, 비행부(110)는 후술하는 물체인식부(130)에서 인식한 상기 추적대상 물체를 TLD(Tracking Learning Detection) 학습 알고리즘 방식으로 위치를 산출하여 추적함과 동시에 이에 대응되게 무인항공기를 구동시키기 위한 구동명령을 생성할 수도 있다.In one embodiment, the flight unit 110 calculates and tracks the location of the tracking target object recognized by the object recognition unit 130 described later using a TLD (Tracking Learning Detection) learning algorithm method, and at the same time, the unmanned aerial vehicle corresponding to it A driving command for driving may be generated.
영상획득부(120)는 카메라(10) 및 카메라 짐볼(20)을 포함하며, 비행중인 상태에서 피사체를 촬영하여 영상을 획득할 수 있다. 카메라 짐볼(20)은 카메라의 촬영 방향각 조절이 가능하며, 카메라 짐볼(20)의 회전 각도 내에서 회전함으로써 영상의 중심을 이동시킬 수 있다. 카메라 짐볼(20)은 자이로 센서에서 기울기 정보를 모터로 전달하여 카메라의 촬영각도를 제어하도록 모터를 구동시키는 방식으로 작동될 수 있다.The image acquisition unit 120 includes a camera 10 and a camera gimbal 20, and may obtain an image by photographing a subject while in flight. The camera gimbal 20 can adjust the shooting direction angle of the camera, and can move the center of an image by rotating within the rotation angle of the camera gimbal 20 . The camera gimbal 20 may operate in such a way that the gyro sensor transfers tilt information to the motor and drives the motor to control the shooting angle of the camera.
일실시예에서, 카메라 짐볼(20)은 촬영제어부(140)의 제어 신호에 따라 카메라(10)의 촬영 방향각을 제어한다. 또한, 카메라(10)는 촬영제어부(140)의 제어 신호에 따라 카메라(10)의 줌을 조정하여 피사체를 촬영한다.In one embodiment, the camera gimbal 20 controls a photographing direction angle of the camera 10 according to a control signal of the photographing controller 140 . In addition, the camera 10 adjusts the zoom of the camera 10 according to the control signal of the photographing controller 140 to photograph the subject.
영상획득부(120)는 촬영된 영상을 저장부(160)에 저장하거나 통신부(150)를 통해 실시간으로 미리 설정된 외부 장치에 전송할 수 있다.The image acquisition unit 120 may store the captured image in the storage unit 160 or transmit the captured image to a preset external device in real time through the communication unit 150 .
The object recognition unit 130 receives a specific area, set from a user terminal or a server, through the communication unit 150 described later, and recognizes the set specific area as the subject. The specific area is a rectangle or a square.
In one embodiment of the present invention, the subject may be any of the various objects that can be photographed. In the detailed description below, a subject drawn as a quadrangle may represent a quadrangular region (a geographical area) to be monitored. For example, when an area requiring high military security must be monitored continuously, that area may be set as the subject and monitoring operations may be performed on it continuously.
The location coordinates of the subject are the coordinates of the four vertices of the quadrangle, and the diagonal length (D2) of the subject is calculated from those location coordinates. Alternatively, in one embodiment of the present invention, the location coordinates of the subject are the coordinates of the center point of the specific area; in that case, the diagonal length (D2) of the subject may be input from a user terminal or a server.
To actively track and photograph the subject, the shooting control unit 140 computes the direction angle and zoom value of the camera 10 based on the position of the drone 1 obtained by the sensor unit 105 and the position of the tracking target object obtained from a user terminal or a server through the communication unit 150, and adjusts the tilting angle of the camera gimbal 20 and the zoom of the camera 10 according to the result.
The communication unit 150 exchanges information with external devices by wireless communication. For example, it may receive zone information, a route, and commands for the flight from a controller or a server, and may transmit captured images to a server or a user terminal.
Here, the zone information may include flight restriction zone (A) information and approach restriction distance information.
Usable wireless communication methods include GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), CDMA2000 (Code Division Multiple Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced).
Wireless Internet technology may also be used as the wireless communication method. Examples include WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA, HSUPA, LTE, LTE-A, and 5G. In particular, transmitting and receiving data over a 5G network enables faster response.
The storage unit 160 stores various data for subject tracking and shooting, together with the images captured by the camera 10. The storage unit 160 may provide the stored information at the request of other components such as the shooting control unit 140.
Hereinafter, as another embodiment, the subject tracking and shooting control method of the drone 1 having the configuration described above will be described in detail with reference to FIGS. 3 to 7.
FIG. 3 is a flowchart illustrating a subject tracking and shooting control method of the drone 1 according to an embodiment of the present invention; FIG. 4 is a schematic diagram illustrating the method; FIGS. 5 and 6 are conceptual diagrams illustrating how the direction angle of the camera is calculated according to an embodiment of the present invention; and FIG. 7 is a conceptual diagram illustrating how the magnification of the camera is calculated according to an embodiment of the present invention.
As shown in the figures, in the subject tracking and shooting control method of the drone 1 according to an embodiment of the present invention, first, in step S110, a specific area set by the user on an image captured by the drone 1 is received from a server or a user terminal, and the specific area is recognized as the subject.
In one embodiment, as shown in FIG. 4, while the drone 1 flies along its flight path it photographs the surroundings of the path, acquires images, and provides them to a server or a user terminal. Through the server or user terminal, the user designates a specific area in a captured image, and the subject is thereby recognized.
In another variation, the drone 1 flies around previously stored or received subject location coordinates and photographs the surroundings. The drone 1 may recognize the tracking target object using feature points in the captured image: the similarity between feature points extracted from the image and feature points of a preset subject is checked, and a tracking target whose similarity is equal to or greater than a preset level is recognized as the subject. Even when the drone 1 recognizes the subject from feature points in this way, the subject may still be confirmed through a user terminal or a server.
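A minimal sketch of such a feature-point similarity check, assuming ORB descriptors and a brute-force Hamming matcher; the reference image, the descriptor-distance cutoff, and the similarity threshold are illustrative assumptions, not values given in this specification.

    import cv2

    orb = cv2.ORB_create()
    ref = cv2.imread("subject_reference.png", cv2.IMREAD_GRAYSCALE)   # preset subject
    frame = cv2.imread("drone_frame.png", cv2.IMREAD_GRAYSCALE)       # captured image

    kp_ref, des_ref = orb.detectAndCompute(ref, None)
    kp_frm, des_frm = orb.detectAndCompute(frame, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_frm)

    # Treat the fraction of close matches as a crude similarity score.
    good = [m for m in matches if m.distance < 40]     # cutoff: assumed
    similarity = len(good) / max(len(kp_ref), 1)
    if similarity > 0.3:                               # preset threshold: assumed
        print("tracking target recognized as subject")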
When the drone 1 cannot find the tracking target object in the image during flight, the unmanned aerial vehicle may be made to rotate in place while continuing to photograph.
Next, in step S120, the location coordinates of the drone 1 and the location coordinates of the subject are acquired. The location coordinates of the drone 1 may be obtained from the sensor unit mounted on the drone 1, and the location coordinates of the subject may be received from a server or a user terminal.
In another variation, the location coordinates of the subject may be obtained from previously stored map information and the location coordinates matched to that map. For example, the drone 1 may match a subject recognized by feature-point similarity against a previously stored map to identify its position, and then look up the location coordinates matched to the identified position on the map.
One embodiment illustrates the drone 1 receiving the center-point coordinates of the specific area and the diagonal length of the specific area; in another embodiment, the location coordinates of all four vertices of the specific area are received, and the center-point coordinates and the diagonal length are calculated from them.
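A minimal sketch of that variant, assuming the four vertices are already expressed in a local metric (east, north) frame in meters; the conversion from GPS latitude/longitude to such a frame is omitted, and the coordinates are illustrative.

    import math

    vertices = [(0.0, 0.0), (40.0, 0.0), (40.0, 30.0), (0.0, 30.0)]   # o, p, q, r (assumed)

    # Center point as the mean of the four vertices.
    cx = sum(x for x, _ in vertices) / 4.0
    cy = sum(y for _, y in vertices) / 4.0

    # Diagonal oq of the quadrangle opqr, by the two-point distance formula.
    (ox, oy), (qx, qy) = vertices[0], vertices[2]
    d2 = math.hypot(qx - ox, qy - oy)          # D2 = 50.0 m for these vertices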
Referring to FIG. 5, the sum of the angle (α) between north and the direction of travel of the drone 1 and the angle (θ) between north and the subject, taken about the location coordinates of the drone 1, is computed as the yaw angle of the camera gimbal. In other words, the angle (α) with the direction of travel is the angle between an imaginary straight line (l1) pointing north from the drone 1 and an imaginary straight line (l2) pointing in the direction of travel of the drone 1, and the angle (θ) with the subject is the angle between the imaginary straight line (l1) pointing north from the drone 1 and an imaginary straight line (l3) pointing from the drone 1 toward the target.
Also, referring to FIG. 6, the arctangent of the altitude difference (h_d) between the subject and the drone 1 divided by the distance (d) from the drone 1 to the subject is computed as the pitch angle (∠θ_pitch) of the camera gimbal (see Equation 1 below). The distance (d) from the drone 1 to the subject is calculated from the location coordinates of the drone 1 and of the subject using the two-point distance formula.
[Equation 1]
∠θ_pitch = arctan(h_d / d)
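A computational sketch of the yaw and pitch rules above, assuming positions in a local east-north-up frame in meters and a heading angle measured clockwise from north; the yaw follows the sum formulation of this specification, and all numeric values are illustrative.

    import math

    def gimbal_angles(drone_xyz, subject_xyz, alpha_deg):
        dx = subject_xyz[0] - drone_xyz[0]    # east offset to the subject
        dy = subject_xyz[1] - drone_xyz[1]    # north offset to the subject
        h_d = drone_xyz[2] - subject_xyz[2]   # altitude difference

        # theta: angle between north (l1) and the drone-to-subject line (l3).
        theta_deg = math.degrees(math.atan2(dx, dy)) % 360.0

        # Yaw as formulated above: the sum of alpha and theta.
        yaw_deg = (alpha_deg + theta_deg) % 360.0

        # Pitch per Equation 1: arctangent of altitude difference over distance.
        d = math.hypot(dx, dy)
        pitch_deg = math.degrees(math.atan2(h_d, d))
        return yaw_deg, pitch_deg

    yaw, pitch = gimbal_angles((0, 0, 120), (300, 400, 0), alpha_deg=30.0)
    # d = 500 m, h_d = 120 m: theta ≈ 36.9°, yaw ≈ 66.9°, pitch ≈ 13.5°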
Next, in step S140, the magnification may be calculated from the location coordinates of the drone 1, obtained from the sensed values of the sensor unit 105 provided in the drone 1, and the location coordinates of the subject.
The shooting control unit 140 according to one embodiment of the present invention calculates the magnification of the camera based on the distance (D) between the drone 1 and the subject, the horizontal length (D1) of the subject measured by the camera, and the camera's angle of view.
The length (D1) of the subject measured by the camera is calculated based on Equation 2 below.
[Equation 2]
D1 = D2 × cos(∠A + ∠B − 90°)
In Equation 2 above, D2 denotes the diagonal length of the subject, ∠A denotes the direction angle of the camera, and ∠B denotes the diagonal direction angle of the specific area serving as the subject.
The camera's angle of view may differ from one camera model to another, and may be stored in advance or input from a user terminal or a server.
Referring to FIG. 7, the derivation of Equation 2 proceeds as follows: an imaginary rectangle (stuv), whose two sides are the straight line (M1) pointing from the drone 1 in the direction of the camera and the perpendicular straight line (M2) whose length equals the length (D1) of the subject in the image captured by the drone's camera, is positioned so as to coincide with one vertex (o) of the specific area (opqr).
The angle between an imaginary straight line pointing north from the drone 1 and an imaginary straight line pointing in the direction of the camera is called the camera azimuth (∠A).
The angle between the diagonal (oq) of the quadrangular specific area (opqr) recognized as the subject and the side it meets is called the azimuth (∠B) of the diagonal.
The angle (∠F) between the side op of the specific area (opqr) and the side (st) of the imaginary rectangle (stuv) equals 90° minus the camera azimuth (∠A) (Equation 3 below).
[Equation 3]
∠F = 90° − ∠A
Also, the sum of the angle (∠C) between the side (st) of the imaginary rectangle (stuv) and the diagonal (oq) of the specific area (opqr) and the angle (∠F) equals the azimuth (∠B) of the diagonal. Expressed as an equation, this gives Equation 4 below.
[Equation 4]
∠C + ∠F = ∠B
From Equation 4, ∠B − ∠C = ∠F, and since ∠F = 90° − ∠A, it follows that ∠B − ∠C = 90° − ∠A, and hence ∠C = ∠A + ∠B − 90°.
Finally, starting from the relation D1 = D2 × cos∠C for the length of the subject measured by the camera, substituting ∠C = ∠A + ∠B − 90° yields Equation 2 for the length (D1) of the subject measured by the camera.
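A one-function sketch of Equation 2; the numeric angles and diagonal length are illustrative assumptions.

    import math

    def measured_length(d2_m, angle_a_deg, angle_b_deg):
        angle_c = angle_a_deg + angle_b_deg - 90.0      # ∠C = ∠A + ∠B − 90°
        return d2_m * math.cos(math.radians(angle_c))   # D1 = D2 × cos∠C

    d1 = measured_length(d2_m=50.0, angle_a_deg=70.0, angle_b_deg=40.0)
    # ∠C = 20°, so D1 ≈ 50 × cos 20° ≈ 47.0 m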
Based on the horizontal length (D1) of the subject obtained from Equation 2 and the camera's angle of view (θ_camera), the shooting control unit 140 according to one embodiment of the present invention may adjust the magnification (M) as in Equation 5 below.
[Equation 5]
Figure PCTKR2021017690-appb-I000006
In Equation 5 above, A denotes a predetermined constant value.
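The exact form of Equation 5 survives here only as an image. Purely to illustrate the quantities involved, the sketch below assumes a zoom law that raises the magnification until the subject's angular width fills a constant fraction A of the camera's angle of view; this assumed formula is not the one given in this specification.

    import math

    def magnification(dist_d_m, d1_m, fov_deg, A=0.8):
        # Angular width of the subject as seen from distance D (assumed model).
        subject_angle = 2.0 * math.degrees(math.atan2(d1_m / 2.0, dist_d_m))
        return max(1.0, A * fov_deg / subject_angle)

    m = magnification(dist_d_m=500.0, d1_m=47.0, fov_deg=60.0)
    # the subject spans ≈ 5.4°, so M ≈ 8.9× with A = 0.8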
Next, in step S150, the camera gimbal 20 is driven according to the calculated direction angle of the camera, and the camera zoom is adjusted according to the magnification M of the camera 10.
A first control signal for driving the camera gimbal 20 is generated according to the calculated direction angle of the camera, and a second control signal for adjusting the camera zoom is generated according to the magnification of the camera 10. The first control signal is delivered to the camera gimbal 20 to control the direction angle of the camera, and the second control signal is delivered to the camera 10 to control its zoom.
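A minimal sketch of the two control signals; the message classes and the send_to_gimbal / send_to_camera transports are hypothetical stand-ins, since this specification does not fix a message format.

    from dataclasses import dataclass

    @dataclass
    class GimbalCommand:        # first control signal
        yaw_deg: float
        pitch_deg: float

    @dataclass
    class ZoomCommand:          # second control signal
        magnification: float

    def dispatch(yaw_deg, pitch_deg, magnification, send_to_gimbal, send_to_camera):
        send_to_gimbal(GimbalCommand(yaw_deg, pitch_deg))
        send_to_camera(ZoomCommand(magnification))

    dispatch(66.9, 13.5, 8.9, print, print)   # stub transports for illustration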
FIG. 8 is a diagram schematically showing the configuration of a subject tracking and shooting control apparatus according to an embodiment of the present invention.
Referring to FIG. 8, the subject tracking and shooting control apparatus 200 according to an embodiment of the present invention may include a communication unit 210, an input unit 220, an object recognition unit 230, a calculation unit 240, and a signal generation unit 250. The subject tracking and shooting control apparatus 200 may further include an output unit (not shown).
The communication unit 210 exchanges information with the drone 1. For example, the communication unit 210 receives images captured by the camera mounted on the drone 1 and the location coordinates of the drone 1. The received images and location coordinates may be passed to the object recognition unit 230 and the calculation unit 240. The communication unit 210 may also deliver signals generated by the signal generation unit 250 to the camera gimbal and the camera mounted on the drone 1.
The input unit 220 converts the user's input actions into input signals and passes them to other components, for example the object recognition unit 230 and the calculation unit 240.
The input unit 220 may be implemented, for example, as a keyboard, a mouse, a touch sensor on a touch screen, a touch pad, a keypad, voice input, or any other input processing device available now or in the future. The input unit 220 receives, for example, a specific area designated by the user on a captured image shown on the output unit (not shown).
In one embodiment, to receive the specific area, the input unit 220 provides an interface for setting the specific area, receives the selection of the specific area through that interface, and passes it to the object recognition unit 230.
The calculation unit 240 calculates the direction angle and the magnification of the camera based on the location coordinates of the drone 1 received through the communication unit 210 and the location coordinates of the subject received through the input unit 220.
The method by which the calculation unit 240 computes the direction angle of the camera is similar to that of step S130 of FIG. 3, so a detailed description is omitted.
The method by which the calculation unit 240 computes the magnification is similar to that of step S140 of FIG. 3, so a detailed description is omitted.
The signal generation unit 250 generates control signals according to the direction angle and magnification of the camera computed by the calculation unit 240 and delivers them to the drone 1, thereby controlling the direction angle of the camera gimbal and the zoom of the camera.
In another embodiment, the subject tracking and shooting control apparatus 200 receives the flight path in advance and, before the drone 1 reaches each point on the path, completes the computation for controlling the camera gimbal's direction angle and the camera zoom from the drone's expected location coordinates and the subject's location coordinates. It can thereby predict in advance the time at which the drone 1 will reach each expected position on the flight path, and control the direction angle of the camera gimbal and the zoom of the camera at the predicted time.
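A sketch of that look-ahead embodiment, assuming a constant flight speed and reusing the gimbal_angles() helper from the sketch after Equation 1; both assumptions are for illustration only.

    import math

    def schedule(waypoints_xyz, subject_xyz, speed_mps, heading_deg=0.0):
        plan, t = [], 0.0
        for i, wp in enumerate(waypoints_xyz):
            if i > 0:
                t += math.dist(waypoints_xyz[i - 1], wp) / speed_mps  # predicted arrival
            yaw, pitch = gimbal_angles(wp, subject_xyz, heading_deg)  # precomputed angles
            plan.append((t, yaw, pitch))
        return plan   # later: apply (yaw, pitch) when the clock reaches t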
It will be understood that each block of the flowchart illustrations, and combinations of blocks, can be implemented by computer program instructions. These computer program instructions may be loaded into a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, so that the instructions executed via the processor of the computer or other programmable data processing apparatus create means for performing the functions specified in the flowchart block(s). These computer program instructions may also be stored in a computer-usable or computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, so that the instructions stored in that memory produce an article of manufacture including instruction means that perform the functions specified in the flowchart block(s). The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus, so that a series of operational steps is performed on the computer or other programmable data processing apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable data processing apparatus provide steps for executing the functions specified in the flowchart block(s).
Each block may also represent a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations the functions noted in the blocks may occur out of order; for example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in reverse order, depending on the functionality involved.
The term 'unit' ('~부') used in this embodiment refers to software or a hardware component such as an FPGA or an ASIC, and a 'unit' performs certain roles. However, 'unit' is not limited to software or hardware. A 'unit' may be configured to reside on an addressable storage medium and may be configured to execute on one or more processors. Thus, by way of example, a 'unit' includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided within the components and 'units' may be combined into a smaller number of components and 'units' or further separated into additional components and 'units'. Furthermore, the components and 'units' may be implemented to execute on one or more CPUs within a device or a secure multimedia card.
Those of ordinary skill in the art to which this specification pertains will understand that it may be practiced in other specific forms without changing its technical idea or essential features. The embodiments described above are therefore to be understood as illustrative in all respects and not restrictive. The scope of this specification is defined by the claims that follow rather than by the detailed description above, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as falling within the scope of this specification.
The specification and drawings disclose preferred embodiments of this specification, and although specific terms are used, they are used only in a general sense to readily explain the technical content of this specification and to aid understanding of the invention, not to limit its scope. It is obvious to those of ordinary skill in the art to which this specification pertains that other modifications based on the technical idea of this specification may be practiced in addition to the embodiments disclosed herein.

Claims (15)

  1. A subject tracking and shooting control method performed by a drone, the method comprising:
    receiving, from a server or a user terminal, a specific area set by a user on an image captured by the drone, and recognizing the specific area as a subject;
    acquiring location coordinates of the drone and location coordinates of the subject;
    calculating a direction angle of a camera for positioning the subject at the center of an image, based on the location coordinates of the drone and the location coordinates of the subject;
    calculating a magnification of the camera based on the location coordinates of the drone and the location coordinates of the subject; and
    driving a camera gimbal according to the calculated direction angle of the camera, and adjusting a camera zoom according to the magnification of the camera.
  2. The method of claim 1, wherein calculating the direction angle of the camera comprises:
    calculating, about the location coordinates of the drone, the sum of the angle (α) between north and the direction of travel of the drone and the angle (θ) between north and the subject as a yaw angle of the camera gimbal; and
    calculating the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as a roll angle of the camera gimbal.
  3. The method of claim 1, wherein calculating the magnification of the camera comprises calculating the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject measured by the camera, and the angle of view of the camera.
  4. The method of claim 3, wherein the length (D1) of the subject measured by the camera is calculated based on the following equation:
    [Equation 1]
    D1 = D2 × cos(∠A + ∠B − 90°)
    where D2 denotes the diagonal length of the subject, ∠A denotes the direction angle of the camera, and ∠B denotes the diagonal direction angle of the specific area serving as the subject.
  5. The method of claim 1, wherein, in acquiring the location coordinates of the drone and the location coordinates of the subject, the location coordinates of the drone are obtained from a sensor unit mounted on the drone, and the location coordinates of the subject are received from a server or a user terminal.
  6. A drone comprising:
    an image acquisition unit that acquires images captured by a camera mounted on the drone;
    a camera gimbal that controls a direction angle of the camera;
    a communication unit that transmits the captured images and receives, from a server or a user terminal, a specific area set by a user on the captured images;
    an object recognition unit that recognizes the set specific area as a subject;
    a sensor unit that generates location coordinates using a sensor mounted on the drone; and
    a shooting control unit that calculates a direction angle and a magnification of the camera based on the location coordinates of the drone and the location coordinates of the specific area, and adjusts the camera gimbal and the zoom of the camera according to the calculated direction angle and magnification.
  7. The drone of claim 6, wherein the shooting control unit calculates, about the location coordinates of the drone, the sum of the angle (α) between north and the direction of travel of the drone and the angle (θ) between north and the subject as a yaw angle of the camera gimbal, and
    calculates the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as a roll angle of the camera gimbal.
  8. The drone of claim 6, wherein the shooting control unit calculates the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject measured by the camera, and the angle of view of the camera.
  9. The drone of claim 8, wherein the shooting control unit calculates the length (D1) of the subject measured by the camera based on the following equation:
    [Equation 1]
    D1 = D2 × cos(∠A + ∠B − 90°)
    where D2 denotes the diagonal length of the subject, ∠A denotes the direction angle of the camera, and ∠B denotes the diagonal direction angle of the specific area serving as the subject.
  10. The drone of claim 9, wherein the specific area is a quadrangle, the location coordinates of the subject are the location coordinates of the center of the subject, and
    the diagonal length (D2) of the subject is received from a server or a user terminal through the communication unit.
  11. A subject tracking and shooting control apparatus for a drone, comprising:
    a communication unit that receives, from the drone, images captured by a camera mounted on the drone and location coordinates of the drone;
    an input unit that receives a specific area set by a user on the captured images;
    an object recognition unit that recognizes the set specific area as a subject;
    a calculation unit that calculates a direction angle and a magnification of the camera based on the location coordinates of the drone and the location coordinates of the subject received through the input unit; and
    a signal generation unit that generates control signals for controlling the direction angle and magnification of the camera and transmits them to the drone through the communication unit.
  12. The apparatus of claim 11, wherein the calculation unit calculates, about the location coordinates of the drone, the sum of the angle (α) between north and the direction of travel of the drone and the angle (θ) between north and the subject as a yaw angle of the camera gimbal, and
    calculates the arctangent of the altitude difference (h_d) between the subject and the drone divided by the distance (d) from the drone to the subject as a roll angle of the camera gimbal.
  13. The apparatus of claim 11, wherein the calculation unit calculates the magnification of the camera based on the distance (D) between the drone and the subject, the length (D1) of the subject measured by the camera, and the angle of view of the camera.
  14. The apparatus of claim 11, wherein the calculation unit calculates the length (D1) of the subject measured by the camera based on the following equation:
    [Equation 1]
    D1 = D2 × cos(∠A + ∠B − 90°)
    where D2 denotes the diagonal length of the subject, ∠A denotes the direction angle of the camera, and ∠B denotes the diagonal direction angle of the specific area serving as the subject.
  15. The apparatus of claim 11, wherein the calculation unit calculates, based on the flight path of the drone, expected location coordinates for expected times on the flight path, and calculates the direction angle and magnification of the camera based on the calculated expected location coordinates and the location coordinates of the subject.
PCT/KR2021/017690 2021-07-20 2021-11-29 Control apparatus for tracking and photographing subject, drone, and method for operating control apparatus WO2023003100A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0094894 2021-07-20
KR1020210094894A KR102571330B1 (en) 2021-07-20 2021-07-20 Control apparatus for subject tracking shooting, drone and operation method thereof

Publications (1)

Publication Number Publication Date
WO2023003100A1 true WO2023003100A1 (en) 2023-01-26

Family

ID=84980259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/017690 WO2023003100A1 (en) 2021-07-20 2021-11-29 Control apparatus for tracking and photographing subject, drone, and method for operating control apparatus

Country Status (2)

Country Link
KR (1) KR102571330B1 (en)
WO (1) WO2023003100A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200056068A (en) * 2018-11-14 2020-05-22 이병섭 System for tracking an object in unmanned aerial vehicle based on mvs
KR102179676B1 (en) * 2019-07-31 2020-11-17 주식회사 아르고스다인 Method and system for determining position of autonomous mobile
KR20210078164A (en) * 2019-12-18 2021-06-28 엘지전자 주식회사 User end, system and method for controlling a drone
KR102278467B1 (en) * 2019-04-29 2021-07-19 주식회사 에프엠웍스 Method and apparatus of real-time tracking a position using drones, traking a position system including the apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102335167B1 (en) * 2015-03-17 2021-12-03 삼성전자주식회사 Image photographing apparatus and method for photographing thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200056068A (en) * 2018-11-14 2020-05-22 이병섭 System for tracking an object in unmanned aerial vehicle based on mvs
KR102278467B1 (en) * 2019-04-29 2021-07-19 주식회사 에프엠웍스 Method and apparatus of real-time tracking a position using drones, traking a position system including the apparatus
KR102179676B1 (en) * 2019-07-31 2020-11-17 주식회사 아르고스다인 Method and system for determining position of autonomous mobile
KR20210078164A (en) * 2019-12-18 2021-06-28 엘지전자 주식회사 User end, system and method for controlling a drone

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BARTAK ROMAN; VYKOVSKY ADAM: "Any Object Tracking and Following by a Flying Drone", 2015 FOURTEENTH MEXICAN INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (MICAI), IEEE, 25 October 2015 (2015-10-25), pages 35 - 41, XP032879614, ISBN: 978-1-5090-0322-8, DOI: 10.1109/MICAI.2015.12 *

Also Published As

Publication number Publication date
KR102571330B1 (en) 2023-08-29
KR20230013875A (en) 2023-01-27

Similar Documents

Publication Publication Date Title
US20210065400A1 (en) Selective processing of sensor data
JP5618840B2 (en) Aircraft flight control system
JP5775632B2 (en) Aircraft flight control system
KR102280688B1 (en) Controller for Unmanned Aerial Vehicle
WO2017034252A1 (en) Location guidance control method for unmanned aerial vehicle, using image information
WO2018059398A1 (en) Method, apparatus, and system for controlling multi-rotor aircraft
JP6583840B1 (en) Inspection system
WO2019198868A1 (en) Mutual recognition method between unmanned aerial vehicle and wireless terminal
CN105045293A (en) Cradle head control method, external carrier control method and cradle head
JP2019073096A (en) Overhead line image capturing system and overhead line image capturing method
JP2023100642A (en) inspection system
KR101796478B1 (en) Unmanned air vehicle capable of 360 degree picture shooting
JP6482855B2 (en) Monitoring system
WO2021251441A1 (en) Method, system, and program
WO2023003100A1 (en) Control apparatus for tracking and photographing subject, drone, and method for operating control apparatus
JP6681101B2 (en) Inspection system
CN115065816B (en) Real geospatial scene real-time construction method and real-time construction device
Moraes et al. Autonomous Quadrotor for accurate positioning
KR20190123095A (en) Drone-based omni-directional thermal image processing method and thermal image processing system therefor
WO2021014752A1 (en) Information processing device, information processing method, and information processing program
WO2019093692A1 (en) Method and electronic device for controlling unmanned aerial vehicle comprising camera
KR20220080907A (en) Flight method for unmanned aerial vehicle tracking objects based on pixel coordinates
Evangeliou et al. Development of a versatile modular platform for aerial manipulators
JPWO2021064982A1 (en) Information processing device and information processing method
JP6681102B2 (en) Inspection system

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE