KR20170019777A - Apparatus and method for controling capturing operation of flying bot - Google Patents

Apparatus and method for controling capturing operation of flying bot Download PDF

Info

Publication number
KR20170019777A
Authority
KR
South Korea
Prior art keywords
flying
bot
shooting
camera
image
Prior art date
Application number
KR1020150114035A
Other languages
Korean (ko)
Inventor
김태형
이가민
이은주
김성진
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020150114035A priority Critical patent/KR20170019777A/en
Publication of KR20170019777A publication Critical patent/KR20170019777A/en

Links

Images

Classifications

    • H04N5/23216
    • B: Performing Operations; Transporting
    • B64: Aircraft; Aviation; Cosmonautics
    • B64C: Aeroplanes; Helicopters
    • B64C39/00: Aircraft not otherwise provided for
    • B64C39/02: Aircraft not otherwise provided for, characterised by special use
    • B64C39/06: Aircraft not otherwise provided for, having disc- or ring-shaped wings
    • H04N5/225

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a photographing control apparatus and control method for a flying bot, capable of effectively photographing a user-selected region by using a flying bot equipped with a camera. The apparatus includes a flying bot equipped with at least one camera; and a flying bot controller that displays an image captured by the camera of the flying bot on a display unit and controls the movement and rotation of the flying bot according to a photographing area set by the user on the captured image displayed on the display unit.

Description

TECHNICAL FIELD [0001] The present invention relates to an image capture control apparatus for a flying bot and a control method thereof.

BACKGROUND OF THE INVENTION Field of the Invention [0001] The present invention relates to an imaging control apparatus and a control method thereof for a flying bot capable of effectively photographing a user-selected region by using a flying bot equipped with a camera.

A "flying bot" is an abbreviation of "flying robot" and means a small aircraft that can be controlled by radio waves without a pilot on board. One well-known example of a flying bot is the drone. Flying bots can be equipped with a high-resolution camera or various sensors, so that information collected in the air, or in places difficult for the user to access, can be transmitted over wireless communication to another device such as a mobile terminal or other communication device.

However, since a flying bot is controlled by radio waves, the user must manually steer both the bot and the camera mounted on it, so designating a photographing area requires considerable piloting skill.

Accordingly, control devices and control methods that can more efficiently control not only the basic flight function of a flying bot but also the various functions added to it have been actively studied.

It is therefore an object of the present invention to provide an image capture control apparatus and method for a flying bot capable of controlling image capture by the flying bot in various imaging modes.

It is another object of the present invention to provide a photographing control apparatus and method for a flying bot in which the user can intuitively set a photographing area when photographing with the flying bot.

According to an aspect of the present invention, there is provided an imaging control apparatus for a flying bot, including: a flying bot equipped with at least one camera; and a flying bot controller that displays an image captured by the camera of the flying bot on a display unit and controls the movement and rotation of the flying bot according to a photographing area set by the user on the captured image displayed on the display unit.

According to another aspect of the present invention, there is provided an imaging control method for a flying bot, the method comprising: moving the flying bot by determining its moving direction or rotating direction according to an imaging mode; performing an initial photographing by controlling a camera mounted on the flying bot, receiving the initial image from the flying bot, and displaying it on a display unit; sensing a photographing area set by the user on the initial image displayed on the display unit; and controlling the movement and rotation of the flying bot according to the set photographing area so as to photograph that area again.
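As a rough illustration, the steps of the claimed method can be sketched as the following flow. The `bot`, `display`, and `get_user_region` objects and their methods are hypothetical stand-ins introduced here for illustration; they are not identifiers from the patent.

```python
def photographing_sequence(bot, display, get_user_region):
    """Hypothetical sketch of the claimed control method:
    (1) move the bot per the imaging mode, (2) take and display an
    initial shot, (3) let the user set a photographing area on it,
    (4) move/rotate the bot to cover that area and shoot again."""
    bot.move_for_mode()               # step 1: move per the imaging mode
    first = bot.capture()             # step 2: initial photographing
    display.show(first)               #         shown on the display unit
    region = get_user_region(first)   # step 3: user-set photographing area
    bot.reframe(region)               # step 4: control movement/rotation
    return bot.capture()              #         and photograph the area again
```

The key point of the sequence is that all piloting after the initial shot is derived from the region the user marks, not from manual stick input.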

The present invention is advantageous in that the photographing area is set intuitively for both close-up and remote photographing, so that the user can conveniently photograph only the desired area without manually piloting the flying bot.

In addition, the present invention has the advantage that, by controlling the position and rotating direction of the flying bot according to the various shooting modes, it can perform not only general shooting but also special shooting (moving pictures, continuous shooting, 3D shooting, and scanning).

BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a block diagram illustrating a flying bot control apparatus and a flying bot according to an embodiment of the present invention.
FIGS. 2A, 2B, and 2C are conceptual diagrams illustrating examples of a flying bot control apparatus and a flying bot according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating the concept of performing a photographing operation by controlling a flying bot 180 using a smartphone.
FIG. 4 is a flowchart showing a photographing control method of a flying bot according to an embodiment of the present invention.
FIG. 5 shows an embodiment of the method of controlling the shooting area of a flying bot according to an embodiment of the present invention in a general shooting mode.
FIG. 6 shows another embodiment of the method of controlling the shooting area of a flying bot according to an embodiment of the present invention in the general shooting mode.
FIGS. 7A and 7B illustrate a method of selecting an image capturing area on a captured image.
FIG. 8 shows an embodiment of the method of controlling the shooting area of a flying bot according to an embodiment of the present invention in an autonomous shooting mode.
FIG. 9 shows another embodiment of the method of controlling the shooting area of a flying bot according to an embodiment of the present invention in the autonomous shooting mode.
FIG. 10 shows an embodiment of the method of controlling the imaging area of a flying bot according to an embodiment of the present invention in a moving image and continuous shooting mode.
FIG. 11 shows an embodiment of the method of controlling the imaging area of a flying bot according to an embodiment of the present invention in a 3D imaging mode.
FIG. 12 shows an embodiment of the method of controlling the imaging area of a flying bot according to an embodiment of the present invention in a scan mode.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like or similar elements are denoted by the same or similar reference numerals and redundant description thereof is omitted. The suffixes "module" and "part" used for components in the following description are given or used interchangeably only for ease of drafting the specification and do not by themselves carry distinct meanings or roles. In describing the embodiments, detailed descriptions of related known art are omitted where they would obscure the gist of the embodiments disclosed herein. The accompanying drawings are provided only to aid understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by the drawings and should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.

In the following description, it is assumed that the flying bot is a quadcopter having four rotors. However, it goes without saying that the flying bot may be a multicopter having two, six, or eight rotors as well as four. Further, the present invention can be applied not only to flying bots that use rotors but also to flying bots using other propulsion methods, for example jet propulsion.

First, referring to FIG. 1, FIG. 1 is a block diagram illustrating a flying bot control apparatus 100 and a flying bot 180 according to an embodiment of the present invention.

The flying bot control apparatus 100 may include a control unit 110, a camera 120, a communication unit 130, a sensing unit 140, an output unit 150, a user input unit 160, a memory 170, and a power supply unit (not shown). The components shown in FIG. 1 are not essential for implementing the flying bot control apparatus 100, so the flying bot control apparatus 100 described in this specification may have more or fewer components than those listed above.

More specifically, the camera 120 may receive a video signal. The camera 120 may include at least one of a camera sensor (e.g., CCD or CMOS), a photo sensor (or image sensor), and a laser sensor.

The camera 120 and the laser sensor may be combined with each other to sense a user's gaze or a user's touch. The photo sensor may be stacked on a display device and may be configured with photo diodes and transistors (TRs) mounted in rows and columns, scanning the movement of a sensing object according to an electrical signal that varies with the amount of light applied to the photo diodes. Accordingly, the photo sensor can sense the direction of the user's face and the user's gaze from the received image.

The communication unit 130 may communicate with the flying bot 180 either directly, using a wireless signal of a predetermined frequency band, or via a repeater of a predetermined wireless communication system. The communication unit 130 may include at least one communication module for communicating with the flying bot 180.

The communication unit 130 may further include a module for acquiring the position of the flying bot control apparatus 100. A typical example of such a location information acquisition module is a Global Positioning System (GPS) module. Using the GPS module, the flying bot control apparatus 100 can acquire its own position from signals transmitted by GPS satellites.

The sensing unit 140 may include various sensors for controlling the flying bot and at least one sensor for sensing the environment around the flying bot control apparatus 100. For example, the sensing unit 140 may include at least one of an optical sensor (e.g., the camera 120) for recognizing the user's gaze, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an inertial sensor, and a direction sensor. Meanwhile, the control unit 110 of the flying bot control apparatus 100 disclosed in this specification can combine and utilize information sensed by at least two of these sensors.
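As one concrete illustration of combining information from two of these sensors, a controller implementation might fuse the gyroscope sensor and an acceleration sensor with a complementary filter to estimate tilt. This is a common sensor-fusion technique used only as a sketch here, with assumed units (degrees, degrees per second, seconds); it is not an implementation taken from the patent.

```python
import math

def fused_tilt(prev_deg, gyro_dps, accel_x, accel_z, dt, alpha=0.98):
    """Complementary filter: integrate the gyroscope rate (accurate
    short-term, but drifts over time) and blend in the accelerometer's
    gravity-derived angle (noisy, but drift-free) to estimate tilt."""
    gyro_deg = prev_deg + gyro_dps * dt                      # gyro integration
    accel_deg = math.degrees(math.atan2(accel_x, accel_z))   # gravity angle
    return alpha * gyro_deg + (1 - alpha) * accel_deg
```

Called once per sensor sample, the filter keeps the fast response of the gyroscope while the small accelerometer weight (1 - alpha) slowly pulls the estimate back toward the true gravity direction.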

The user input unit 160 receives information from the user. When information is input through the user input unit 160, the control unit 110 may transmit to the flying bot 180 a signal for controlling the operation of the flying bot 180 corresponding to the input information. The user input unit 160 may include mechanical input means (for example, mechanical keys such as buttons located on the front, rear, or side, a dome switch, a jog wheel, or a jog switch) and touch-type input means. The touch-type input means may comprise, for example, a virtual key, soft key, or visual key displayed on a touch screen through software processing, or a touch key disposed on a part other than the touch screen. The virtual key or visual key can be displayed on the touch screen in various forms, for example as a graphic, text, an icon, a video, or a combination thereof.

The output unit 150 may include at least one of a display unit 151 and an audio output unit 152 for generating visual or auditory output. The display unit 151 may form a mutual layer structure with a touch sensor, or be formed integrally with one, to realize a touch screen. Such a touch screen functions as the user input unit 160 providing an input interface between the flying bot control apparatus 100 and the user, and at the same time can provide an output interface between them.

In addition, the memory 170 stores data supporting the various functions of the flying bot control apparatus 100. The memory 170 may store a plurality of application programs (applications) driven by the flying bot control apparatus 100, together with data and commands for controlling the operation of the flying bot 180. At least some of these application programs may be downloaded from an external server via wireless communication, and at least some may reside on the flying bot control apparatus 100 from the time of shipment. An application program may be stored in the memory 170 and installed on the flying bot control apparatus 100 so that the operation of the flying bot 180 is controlled by the control unit 110.

In addition to operations related to the application programs, the control unit 110 typically controls the overall operation of the flying bot control apparatus 100. The control unit 110 processes signals, data, and information input or output through the above-described components, or drives an application program stored in the memory 170, to provide the user with the information necessary to control the flying bot 180, and may transmit a signal for controlling the flying bot 180 according to the user's input or settings.

For example, the control unit 110 may determine the flying direction, flying speed, and altitude of the flying bot 180 according to the user's input through the user input unit 160. The control unit 110 may also recognize the position and state of the flying bot 180 from signals received from it and, according to the recognized result, provide the user with various control modes for controlling its flight.

For example, the control unit 110 may provide the user with a control mode in which the user controls the flying bot 180 from the bot's own viewpoint (hereinafter referred to as the first-person control mode), or a control mode in which the user observes the flying bot 180 from a third party's viewpoint and controls it (hereinafter referred to as the third-person control mode). Alternatively, the control unit 110 may provide the user with a control mode in which the movement of the flying bot 180 is controlled according to a position designated by the user (hereinafter referred to as the positioning mode).

In this case, the control unit 110 may display, through the output unit 150, various information for controlling the flying bot 180 according to whichever of the various control modes has been selected by the user or set automatically. For example, when the control mode of the flying bot 180 is the first-person control mode, the control unit 110 may display on the display unit 151 the image received from the camera provided on the flying bot 180, and may output information about the altitude of the flying bot 180 and the weather environment it senses (for example, wind speed) through at least one of the display unit 151 and the audio output unit 152. Alternatively, when the control mode of the flying bot 180 is the positioning mode, the control unit 110 may display the current position of the flying bot 180 on the display unit 151. Under any of these control modes, the control unit 110 may control the movement and flying state of the flying bot 180 according to the user's input.
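The mode-dependent choice of what to show through the output unit 150 can be sketched as a simple dispatch. The mode strings and dictionary field names below are illustrative assumptions for this sketch, not identifiers from the patent.

```python
def display_payload(mode, telemetry):
    """Select the information shown to the user for the current
    control mode, following the behaviour described above."""
    if mode == "first_person":
        # camera image plus altitude and sensed weather (e.g. wind speed)
        return {"video": telemetry["camera_frame"],
                "altitude": telemetry["altitude"],
                "wind_speed": telemetry["wind_speed"]}
    if mode == "positioning":
        # the bot's current position, e.g. shown on a map view
        return {"position": telemetry["gps"]}
    # third-person mode: the user watches the bot directly, minimal overlay
    return {"altitude": telemetry["altitude"]}
```

A real controller would refresh this payload continuously as telemetry arrives from the bot communication link.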

Under the control of the control unit 110, a power supply unit (not shown) receives power from an external power source and supplies it to the internal components of the flying bot control apparatus 100. Such a power supply unit includes a battery, which may be an internal battery or a replaceable battery.

Meanwhile, the flying bot 180 may include a bot controller 182, a bot camera 183, a bot communication unit 184, a flight driving unit 185, and a bot sensing unit 186. The components shown in FIG. 1 are not required to implement the flying bot 180, so the flying bot 180 described herein may have more or fewer components than those listed above.

More specifically, among the components, the bot camera 183 can receive a video signal. The bot camera 183 may include a camera sensor such as a CCD or a CMOS, and may capture an image sensed by the camera sensor under the control of the bot controller 182.

The bot sensing unit 186 may include at least one sensor for sensing the environment surrounding the flying bot 180. For example, the bot sensing unit 186 may include at least one of a proximity sensor, an illumination sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, an inertial sensor, a motion sensor, and an infrared sensor, and may further include an environmental sensor (for example, a barometer, hygrometer, thermometer, or anemometer). Representative sensors among these are described as follows.

First, the proximity sensor can detect whether another object is within a certain distance of the flying bot 180. For example, the proximity sensor can detect an object near or approaching the flying bot 180 using a laser, ultrasonic waves, infrared rays, or the like.

In addition, the gyroscope sensor, the direction sensor, and the inertial sensor sense the direction of rotation when the flying bot 180 rotates, as well as the direction in which it is heading. The motion sensor can detect the current flying state, tilted state (tilt), rotation, or moving state of the flying bot 180; the acceleration sensor can detect its flying speed; and the magnetic sensor can sense its current altitude. In addition, the environmental sensor can detect various weather conditions around the flying bot, such as air pressure, humidity, temperature, and wind speed.

The bot communication unit 184 may include at least one communication module for communicating with the flying bot control apparatus 100 using a radio signal of a predetermined frequency band. The bot communication unit 184 passes signals received from the flying bot control apparatus 100 to the bot controller 182 and, under the control of the bot controller 182, may transmit the images captured by the bot camera 183 and the values sensed by the bot sensing unit 186 to the flying bot control apparatus 100.

In addition, the bot communication unit 184 may further include a module for acquiring the position of the flying bot 180, such as a GPS module. Using the GPS module, the flying bot 180 can acquire its own position from signals transmitted by GPS satellites. The acquired position information of the flying bot 180 may then be transmitted to the flying bot control apparatus 100 through the bot communication unit 184.

The flight driving unit 185 enables the flying bot 180 to fly under the control of the bot controller 182. The flight driving unit 185 may cause the flying bot 180 to fly in the direction, and at the altitude, controlled by the bot controller 182.

The bot controller 182 controls the overall operation of the flying bot 180. The bot controller 182 controls the flight driving unit 185 so that the flying bot 180 flies at the altitude and in the direction indicated by the control signal received through the bot communication unit 184. For example, the bot controller 182 controls the moving direction and rotating direction of the flying bot 180 according to a control signal received from the flying bot control apparatus 100, so that the flying bot 180 flies according to the input made through the user input unit 160 of the flying bot control apparatus 100.
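A single step of the bot controller applying one received control signal to its flight state might look like the following sketch. The signal and state field names, and the heading convention (0 degrees = north, increasing clockwise), are assumptions made for illustration only.

```python
import math

def apply_control(signal, state):
    """Update heading, altitude, and position from one control signal
    received over the bot communication unit. Heading 0 deg is taken
    as +y (north); altitude is clamped at ground level."""
    heading = (state["heading"] + signal.get("rotate_deg", 0.0)) % 360
    altitude = max(0.0, state["altitude"] + signal.get("climb_m", 0.0))
    rad = math.radians(heading)
    dist = signal.get("forward_m", 0.0)
    return {"heading": heading,
            "altitude": altitude,
            "x": state["x"] + dist * math.sin(rad),   # east component
            "y": state["y"] + dist * math.cos(rad)}   # north component
```

In an actual bot this update would be replaced by commands to the flight driving unit, but the mapping from control signal to motion is the same idea.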

Also, the bot controller 182 can transmit to the flying bot control apparatus 100 the various signals it requires. These signals may be values sensed by at least one of the sensors of the bot sensing unit 186, images captured by the bot camera 183, or position information of the flying bot 180 obtained through the bot communication unit 184.

FIGS. 2A, 2B, and 2C illustrate examples of the flying bot 180 and of a control apparatus 100 capable of controlling it, according to the embodiment of the present invention.

Referring first to FIG. 2A, FIG. 2A shows an example of a flying bot 180 according to an embodiment of the present invention.

Referring to FIG. 2A, the flying bot 180 according to an embodiment of the present invention may include at least one rotor 185a, 185b, 185c, 185d and a bot camera 183. The bot controller 182 of the flying bot 180 controls the driving of the rotors 185a, 185b, 185c, and 185d according to control signals received from the flying bot control apparatus 100, so that the flying bot 180 may rotate or move in a given direction, or may be tilted beyond a predetermined angle (tilt state).

Further, the bot controller 182 can capture images through the bot camera 183 and obtain various sensing values of the surrounding environment through the various sensors. At least two bot cameras 183 may be provided, in which case the bot controller 182 can simultaneously capture images received from at least two directions. When there are at least two bot cameras 183, the bot controller 182 can designate any one of them as the main camera, and the direction in which the flying bot 180 moves may be determined according to the direction of that main camera.
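One plausible rule for designating the main camera among two or more bot cameras is to pick the one whose mount bearing is angularly closest to the intended shooting direction. This selection rule, and the camera dictionary shape, are assumptions consistent with (but not specified by) the paragraph above.

```python
def pick_main_camera(cameras, target_bearing_deg):
    """Return the camera whose mount bearing is angularly closest to
    the desired shooting direction; bot movement can then follow
    the chosen camera's axis."""
    def gap(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)   # shortest angular distance on the circle
    return min(cameras, key=lambda cam: gap(cam["bearing"], target_bearing_deg))
```

The wrap-around in `gap` matters: a camera mounted at 0 degrees is only 10 degrees away from a 350-degree target, not 350.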

In FIG. 2A, the number of rotors is four. However, the flying bot 180 to which the present invention is applicable is not limited to four rotors: it may have fewer or more than four, or may not use rotors at all, for example by using a jet propulsion method.

Meanwhile, FIGS. 2B and 2C show various examples in which the flying robot control apparatus 100 according to the embodiment of the present invention is implemented.

Referring to FIG. 2B, which shows an example of the flying bot control apparatus 100 according to an embodiment of the present invention, the apparatus includes a display unit 151 and at least one user input key 160a, 160b, 160c capable of receiving signals for controlling the flying bot 180 from the user. The keys include a movement key 160a for controlling the moving direction of the flying bot 180, a rotation key 160b for controlling its tilting or rotating state, and an altitude key 160c for controlling its altitude. It should be understood that the present invention is not limited to the keys 160a, 160b, and 160c shown in FIG. 2B, and may be configured with more or fewer keys than shown.

Meanwhile, the control unit 110 may display various image information for controlling the flying bot 180 on the display unit 151. For example, the control unit 110 may display on the display unit 151 images captured by the bot camera 183 of the flying bot 180 and values sensed by its various sensors, such as the altitude (ALT) 170a and wind speed 170b. The control unit 110 can also recognize the user's gaze and change the flight control mode of the flying bot 180 based on how long that gaze is held. The flying bot control apparatus 100 according to the embodiment of the present invention may use various means for detecting the user's gaze: for example, it may use the camera 120, in which case the control unit 110 can detect, from the image captured through the camera 120, the user's gaze directed at the flying bot control apparatus 100 and the time for which the user gazes at it.

For example, the control unit 110 may determine the moving direction or rotating direction of the flying bot 180 according to the flight control mode determined from the user's gaze recognition result or from the position or state of the flying bot 180. That is, based on the currently set flight control mode, the control unit 110 can determine the moving direction and/or rotating direction of the flying bot 180 from the direction of the bot camera 183 of the flying bot 180, the position of the user, and the input of the movement key 160a.
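How a directional key press resolves to an actual bearing depending on the flight control mode can be sketched as follows. The convention that "forward" follows the bot camera's heading in first-person mode and the user's line of sight to the bot otherwise is a simplified reading of the description above; the mode names and the (east, north) vector convention are assumptions.

```python
import math

def move_vector(key, mode, camera_heading_deg, user_to_bot_bearing_deg):
    """Map a forward/right/back/left key press to a world-frame unit
    vector. First-person mode: relative to the bot camera's heading.
    Otherwise (e.g. third-person): relative to the user's view of the bot."""
    offsets = {"forward": 0, "right": 90, "back": 180, "left": 270}
    base = camera_heading_deg if mode == "first_person" else user_to_bot_bearing_deg
    rad = math.radians((base + offsets[key]) % 360)
    return (math.sin(rad), math.cos(rad))   # (east, north); bearing 0 = north
```

The same key press thus moves the bot in different world directions depending on the active control mode, which is exactly why the controller must track the mode alongside the key input.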

Meanwhile, as shown in FIG. 2B, the flying bot control apparatus 100 according to the embodiment of the present invention may be a device designed specifically for controlling the flying bot 180, but it may also be implemented in the form of an application or program installed in a mobile terminal such as a smartphone. In this case, when the application or program is activated, the smartphone can operate as the flying bot control apparatus 100 according to an embodiment of the present invention and control the flying bot 180.

FIG. 2C shows an example of a smartphone 200 on which such an application or program runs, when the flying bot control apparatus 100 according to the embodiment of the present invention is implemented in the form of an application or program.

Referring to FIG. 2C, when a flying bot control application or program according to an embodiment of the present invention is executed, the control unit of the smartphone 200 may display various image information for controlling the flying bot 180 through the display unit 251 of the smartphone 200. For example, the control unit of the smartphone 200 may display various graphic objects 260a, 260b, and 260c on the display unit 251 for controlling the flying state of the flying bot 180.

These graphic objects may include a first virtual key 260a corresponding to the movement key 160a for controlling the moving direction of the flying bot 180, a second virtual key 260b corresponding to the rotation key 160b for controlling its rotating direction or tilting state, and a third virtual key 260c corresponding to the altitude key 160c for controlling its altitude. The control unit of the smartphone 200 may, of course, display on the display unit 251 more or fewer virtual keys for controlling the flying bot 180.

Meanwhile, when the flying bot control application or program according to the embodiment of the present invention is executed, the control unit of the smartphone 200 can recognize the user's gaze using the camera provided on the front side of the smartphone 200, and can change the flight control mode of the flying bot 180 based on how long that gaze is held.

For example, the control unit of the smartphone 200 may determine the moving direction, rotating direction, and the like of the flying bot 180 according to the flight control mode determined from the user's gaze recognition result or from the position or state of the flying bot 180. That is, based on the currently set flight control mode, the control unit of the smartphone 200 can determine the moving direction and/or rotating direction of the flying bot 180 from the direction of the bot camera 183 of the flying bot 180 or from the input of the first virtual key 260a.

The control unit of the smartphone 200 may display various image information related to the currently determined flight control method on the display unit 251. For example, the control unit of the smartphone 200 may display on the display unit 251 the values sensed by the various sensors of the flying bot 180 and the images captured by the bot camera 183.

The above description has covered both a flying bot control device 100 designed specifically for controlling the flying bot 180 and an application or program for controlling the flying bot 180 according to the flying bot control method of the embodiment of the present invention. In the following description, it is assumed that the flying bot control apparatus 100 according to the embodiment of the present invention takes the form of such an application or program executed on the smartphone 200. In this case, the smartphone 200 is the flying bot control apparatus 100 according to an embodiment of the present invention, and the control unit, display unit 251, and camera 220 of the smartphone 200 correspond respectively to the control unit 110, display unit 151, and camera 120 of the flying bot control apparatus 100.

Hereinafter, embodiments related to a control method that can be implemented in the flying bot control apparatus 100 configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

Hereinafter, the operation of setting and controlling a photographing area will be described with reference to a smartphone 200 on which an application or program for controlling the flying bot 180 runs.

In the present invention, the flying bot controller may be implemented as a mobile terminal, a smartphone, or a dedicated mobile flying bot control device.

FIG. 3 is a flowchart illustrating the concept of performing a photographing operation by controlling the flying bot 180 using a smartphone.

Referring to FIG. 3, the control unit of the smartphone 200 according to the embodiment of the present invention can control the flight of the flying bot 180 according to an initially set control method when the flying bot 180 is driven (S100). This initial control method may be predetermined by the user, preset at the factory for the flying bot 180 and the flying bot controller 100, or preset in the smartphone application.

The initial basic control method may be a third-person control mode. Here, the third-person control mode may be a flight control method in which the user controls the flying state of the flying bot 180 while observing it with the naked eye.

When the flying bot 180 starts to fly, the bot controller 182 of the flying bot 180 can transmit information sensed about various situations around the flying bot 180 to the flying bot controller 100 or the smartphone 200. Here, the information received from the flying bot 180 may include values sensed by the sensors of the flying bot 180 or the position of the flying bot 180. In this case, the control unit of the smartphone 200 may determine the location and state of the flying bot 180 from the received information (S110).

Meanwhile, the controller may determine the flight control method of the flying bot 180 based on at least one of the position and the state of the flying bot 180. Alternatively, the flight control method of the flying bot 180 may be determined according to the result of sensing, through the camera, whether the user's gaze is directed at the smartphone 200.

On the other hand, the control unit can detect the separation distance between the flying bot 180 and the smartphone 200 from the position of the flying bot 180, and may determine a more appropriate flight control method based on the detected separation distance.

The user can select a predetermined shooting mode through the smartphone 200 during the flight of the flying bot 180 (S120), and the smartphone control unit transmits a control signal related to the shooting mode to the flying bot 180. The shooting mode includes all camera shooting modes, such as a self-portrait mode, a terrain shooting mode, a moving picture shooting mode, and a 3D shooting mode.

The bot controller 182 of the flying bot 180 determines the flight control method according to the selected shooting mode, determines the front, rear, left, and right directions of the flying bot 180 accordingly, and moves the flying bot 180 in the corresponding direction (S130). For example, if the flight control method of the flying bot 180 is changed to the 'self-portrait mode', the smartphone control unit determines the direction in which the bot camera 183 of the flying bot 180 faces as 'forward', and can determine 'rear', 'left', and 'right' based on the determined 'forward'. In addition, the control unit can determine the moving direction and the rotating direction of the flying bot 180 according to the changed flight control method.

In addition, the smartphone control unit may transmit to the flying bot 180 a signal for controlling the sensors that sense the motion, inertia, and rotation of the flying bot 180 (i.e., the motion sensor, the inertial sensor, and the gyroscope sensor) so that the rotational state and the tilting state of the flying bot 180 are synchronized based on the determined 'forward'. Accordingly, the flying bot 180 can synchronize its moving direction, rotating direction, and tilt state with the keys of the user input unit of the smartphone, according to the flight control method of the selected shooting mode.

That is, the control unit of the smartphone 200 can determine the moving direction or the rotating direction of the flying bot 180 based on the currently set shooting mode. The control unit of the smartphone 200 may control the direction of the bot camera 183 of the flying bot 180 based on the currently set flight control mode, or determine the moving direction and/or rotating direction of the flying bot 180 according to the input of the four-direction virtual key 260a.
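The direction-resolution logic described above — a mode-dependent 'forward' reference combined with a four-direction virtual-key input — can be sketched as follows. The mode names, key names, and command structure here are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: resolve a 4-way virtual-key press into a movement
# command relative to the 'forward' direction that the shooting mode defines.
MODE_HEADING = {
    "self_portrait": "camera_faces_user",       # 'forward' = toward the user
    "terrain": "camera_faces_ground",
    "general": "camera_faces_travel_dir",
}

KEY_TO_MOVE = {"up": "forward", "down": "rear", "left": "left", "right": "right"}

def command_for(mode: str, key: str) -> dict:
    """Build a bot command: the key press is interpreted relative to the
    'forward' reference chosen by the currently set shooting mode."""
    return {
        "forward_reference": MODE_HEADING.get(mode, "camera_faces_travel_dir"),
        "move": KEY_TO_MOVE[key],
    }
```

The same table-driven shape would extend naturally to rotation keys or tilt inputs.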

The control unit of the smartphone 200 can detect the position of the flying bot 180 based on various information (e.g., signal strength) about the current state of the flying bot 180 transmitted from the flying bot 180. The control unit can then detect the distance between the flying bot 180 and the smartphone 200 based on the detected position.
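The patent does not state how distance is derived from signal strength; one common approach is the log-distance path-loss model, sketched here with assumed calibration values:

```python
import math  # kept for clarity; the model itself only needs exponentiation

def estimate_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Estimate bot-to-phone distance (meters) from received signal strength
    using the log-distance path-loss model.

    tx_power_dbm: RSSI expected at 1 m (hypothetical calibration constant).
    path_loss_exp: environment-dependent exponent (~2 in free space,
    roughly 2.7-4 in cluttered environments).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))
```

For example, with these assumed constants, a reading equal to the 1 m calibration value yields 1 m, and each 25 dB drop multiplies the estimate by ten.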

The control unit of the smartphone 200 may display various image information related to the currently determined flight control method on the display unit 251. For example, the control unit of the smartphone 200 may display values sensed by the various sensors of the flying bot 180 and images captured by the bot camera 183 on the display unit 251.

Accordingly, when the flying bot 180 moves to a position and direction determined according to the photographing mode, the bot camera 183 operates according to the control signal transmitted from the control unit of the smartphone 200 and photographs the subject (S140). The photographed image is transmitted to the smartphone 200 through the bot communication unit 184 (S150) and displayed on the display unit 251 of the smartphone 200 of FIG. 2C.

FIG. 4 is a flowchart illustrating a photographing control method of a flying bot according to an embodiment of the present invention.

As shown in FIG. 4, the flying bot 180 performs an initial flight under the control of the smartphone control unit. When the user selects a shooting mode through the menu of the smartphone 200 in this state, the flying bot 180 moves to a position corresponding to the selected shooting mode, moves according to a control signal corresponding to the selected shooting mode, and performs an initial photographing. The photographed image is transmitted to the smartphone 200 and displayed on the display unit 251 (S200).

At this time, the smartphone control unit stores in the memory the camera pose at the time of the initial photographing received from the flying bot 180.

The user can select a photographing area for re-photographing in the initial photographed image displayed on the display unit 251 (S210). The photographing area can be set directly by the user with a touch gesture, or by moving the guide information displayed on the display unit 251 when the re-photographing menu is selected.

Once the photographing area for re-photographing is selected, the smartphone control unit calculates a camera pose for photographing the selected area (S220) and transmits a control signal corresponding to the calculated camera pose to the flying bot 180 (S230).

In one embodiment, the camera pose for photographing the selected area may be calculated by comparing the photographed image according to the stored camera pose with the selected photographing area. The control signal may indicate the moving direction and/or rotating direction of the flying bot 180 or the bot camera 183, obtained by comparing the initial photographing screen with the selected photographing area with respect to the direction of the current camera, the direction of the bot camera 183 of the flying bot 180, the user's position, or the four reference directions (east, west, south, north).
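One way to realize this comparison, assuming a simple pinhole camera model, is to convert the selected sub-rectangle's offset from the image center into yaw/pitch rotations and its size ratio into a move-closer/move-back factor. The field of view value and the exact mapping below are assumptions for illustration, not the patent's actual computation:

```python
import math

def repose_for_region(img_w, img_h, region, hfov_deg=70.0):
    """Given a selected sub-rectangle of the initial image, return the
    (yaw_deg, pitch_deg, zoom) adjustments needed to re-frame just it.
    region = (x, y, w, h) in pixels; pinhole model with assumed FOV.
    """
    x, y, w, h = region
    f = (img_w / 2) / math.tan(math.radians(hfov_deg / 2))  # focal length, px
    cx = x + w / 2 - img_w / 2   # region center offset from image center
    cy = y + h / 2 - img_h / 2
    yaw = math.degrees(math.atan2(cx, f))     # rotate right if region is right of center
    pitch = -math.degrees(math.atan2(cy, f))  # tilt up if region is above center
    zoom = img_w / w                          # >1: move closer; <1: move back
    return yaw, pitch, zoom
```

Selecting the full frame yields no rotation and unit zoom, while a small region on the right half yields a positive yaw and a zoom factor above one.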

Accordingly, the flying bot 180 moves or rotates itself or the bot camera 183 according to the control signal transmitted from the smartphone control unit, and photographs the selected photographing area again (S240, S250). The movement includes at least one of the front, rear, left, and right directions of the flying bot 180.

FIG. 5 shows an embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the general shooting mode.

Referring to FIG. 5, in the general photographing mode, the bot controller 182 can move and/or rotate the flying bot 180 in position and moving direction under the control of the smartphone 200, and the direction in which the bot camera 183 faces changes accordingly. Therefore, after the movement, the bot camera 183 can capture images in different directions according to the user's control, and the smartphone 200 receives the captured initial image and displays it on the display unit 251.

The user can decide on re-photographing by viewing the initial photographed image displayed on the display unit 251. In this case, the user can select the photographing area 50 by inputting a touch gesture (touch & drag) on the initial photographed image, or can select the area 50 by adjusting the size of the guide area displayed on the display unit 251 when the re-photographing menu is selected. In this case, the photographing area 50 is selected to be smaller than the photographed image area. In particular, the control unit can compare the shape of the input touch gesture with previously stored shapes (for example, graphic forms) and, based on similarity, automatically display the photographing area 50 in the form corresponding to the input touch gesture.
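The gesture-to-shape matching could, for instance, normalize the drawn stroke and each stored template to a unit box and pick the template with the smallest mean point distance. This is a simplified sketch under assumed data shapes (equal-length point lists); the patent does not specify the similarity measure:

```python
import math

def match_shape(stroke, templates):
    """Return the name of the stored template shape most similar to the
    drawn stroke. stroke and each template are equal-length lists of
    (x, y) points, normalized to a unit bounding box before comparison."""
    def normalize(pts):
        xs, ys = [p[0] for p in pts], [p[1] for p in pts]
        w = (max(xs) - min(xs)) or 1.0
        h = (max(ys) - min(ys)) or 1.0
        return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in pts]

    s = normalize(stroke)

    def mean_dist(name):
        t = normalize(templates[name])
        return sum(math.hypot(a[0] - b[0], a[1] - b[1])
                   for a, b in zip(s, t)) / len(s)

    return min(templates, key=mean_dist)
```

A production recognizer would also resample and rotate strokes (as in the $1 recognizer family), but the thresholding idea is the same.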

In the present invention, the photographing mode may be selected directly by the user, but may also be selected automatically based on predetermined conditions. For example, the smartphone control unit can determine the shooting mode based on the analysis result or position of the image received from the flying bot 180. For example, the control unit may analyze the object and background of the received image, automatically change to the general shooting mode when a plurality of persons are captured, and change to the self-portrait mode when a single person is captured. When mainly background or scenery is captured, the mode can be automatically changed to the panorama mode or the landscape photographing mode.
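The automatic mode selection described above amounts to a small decision rule; the following is a hedged sketch with assumed thresholds and mode names (the patent gives only the qualitative rules):

```python
def auto_shooting_mode(num_people, altitude_m=0.0, high_altitude=30.0):
    """Pick a shooting mode from simple image-analysis results.
    num_people: persons detected in the received image.
    altitude_m / high_altitude: assumed altitude rule for autonomous mode.
    """
    if altitude_m >= high_altitude:
        return "autonomous"       # high-altitude shooting -> autonomous mode
    if num_people == 1:
        return "self_portrait"    # single person captured
    if num_people > 1:
        return "general"          # multiple persons captured
    return "landscape"            # no people: scenery/panorama-style mode
```

In practice `num_people` would come from a face or person detector running on the transmitted frames.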

When the photographing area 50 is selected, the smartphone control unit calculates a camera pose for the selected photographing area based on the camera pose of the initial photographed image, and transmits the calculated camera pose to the flying bot 180.

The bot controller 182 moves and/or rotates the flying bot 180 to a position and direction corresponding to the camera pose transmitted from the smartphone 200, then controls the bot camera 183 to re-photograph the selected area. The smartphone 200 receives the re-photographed image and displays it on the display unit 251.

FIG. 6 shows another embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the general shooting mode. The embodiment shown in FIG. 6 is an example in which re-photographing is performed when the initial photographed image covers an area narrower than the area desired by the user.

As shown in FIG. 6, the user inputs a touch gesture on the initial photographed image to select the photographing area 50. If the selected photographing area 50 is larger than the size of the currently displayed initial photographed image, the user can reduce the initial photographed image and then select a photographing area 50 larger than the reduced image.

As in FIG. 5, the smartphone control unit calculates a camera pose for the selected photographing area on the basis of the camera pose for the initial photographed image, and transmits it to the flying bot 180.

Accordingly, the bot controller 182 moves and/or rotates the flying bot 180 to a position and direction corresponding to the camera pose transmitted from the smartphone 200, then controls the bot camera 183 to re-photograph the selected area 50. The re-photographed image is transmitted to the smartphone 200 and displayed on the display unit 251.

FIGS. 7A and 7B show an embodiment of a method of selecting a photographing area on a captured image.

As described above, if the user is not satisfied with the initial photographed image, the user can select (set) a photographing area 50 of a predetermined size on the initial photographed image and perform re-photographing.

In this case, as shown in FIG. 7A, the photographing area 50 can be enlarged or reduced by dragging the initially displayed photographing area 50a, or the initially displayed photographing area 50a may be left as it is while the user zooms the initial photographed image in or out. It will be appreciated that both methods have the same effect.

As described above, the smartphone control unit can automatically determine the shooting mode based on the analysis result or position of the image received from the flying bot 180. For example, when a shooting command is input while the flying bot 180 is located at a high altitude, the control unit automatically changes the mode to the autonomous shooting mode and transmits a control signal for autonomous shooting to the flying bot 180.

FIG. 8 shows an embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the autonomous shooting mode.

Referring to FIG. 8, the flying bot 180 captures an initial photograph at a high altitude by controlling the bot camera 183 and transmits it to the smartphone 200. The smartphone control unit displays the transmitted initial photographed image on the display unit 251 and stores the camera pose for the initial photographed image in the memory.

If an area larger than the desired space is photographed, as confirmed from the initial photographed image, the user can select a smaller photographing area 50 by dragging the desired area. The selected photographing area 50 is set along the trajectory of the dragged finger, and its size and shape can be adjusted by subsequent touch-drag inputs.

When the photographing area 50 is selected, the smartphone control unit calculates position information for the selected photographing area 50 and transmits it to the flying bot 180. The flying bot 180 photographs while scanning the photographing area 50, and the plurality of photographed images are displayed on the display unit 251 of the smartphone.
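Scanning the selected area while photographing implies some coverage path; a common choice is a boustrophedon (lawnmower) sweep, sketched here in an assumed local ground frame — the patent does not specify the planner:

```python
def scan_waypoints(x0, y0, x1, y1, lane_spacing):
    """Generate a lawnmower sweep over the rectangular photographing area
    (x0, y0)-(x1, y1) so the bot can capture it as a series of images.
    lane_spacing controls overlap between adjacent passes."""
    waypoints, y, left_to_right = [], y0, True
    while y <= y1:
        # Alternate sweep direction on each lane to minimize travel.
        row = [(x0, y), (x1, y)] if left_to_right else [(x1, y), (x0, y)]
        waypoints.extend(row)
        left_to_right = not left_to_right
        y += lane_spacing
    return waypoints
```

The lane spacing would normally be derived from the camera footprint at the flight altitude so consecutive image strips overlap.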

FIG. 9 shows another embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the autonomous shooting mode.

Referring to FIG. 9, the flying bot 180 captures an initial photograph at a high altitude by controlling the bot camera 183 and transmits it to the smartphone 200. The smartphone control unit displays the transmitted initial photographed image on the display unit 251 and stores the camera pose for the initial photographed image in the memory.

If a region narrower than the desired space is photographed, as confirmed from the initial photographed image, the user can select a larger photographing area 50 by dragging the desired area. At this time, the user can set the photographing area 50 after reducing the initial photographed image. The photographing area 50 is set along the shape in which the finger is dragged, and its size and shape may be adjusted by subsequent touch-drag inputs; that is, it may take a regular figure shape or an irregular curved shape.

When the photographing area 50 is selected, the smartphone control unit calculates position information for the selected photographing area 50 and transmits it to the flying bot 180. The flying bot 180 photographs while scanning the photographing area 50, and the plurality of photographed images are displayed on the display unit 251 of the smartphone.

FIG. 10 is a view illustrating an embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the moving picture and continuous shooting mode.

Referring to FIG. 10, in the moving picture and continuous shooting mode, the user can designate the photographing area 50 in the image captured by the bot camera 183 of the flying bot 180. When the photographing area 50 is designated, the smartphone control unit rotates the flying bot 180 around the object in the photographing area 50 and shoots a moving picture of the object at various angles. At this time, the smartphone control unit extracts the feature points of the object in the photographing area 50 and flies the flying bot 180 so that new feature points of the object can be extracted.
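Rotating the bot around the object to film it from various angles can be modeled as waypoints on a circle, each with a yaw that faces the object. The step count and coordinate frame below are illustrative assumptions:

```python
import math

def orbit_waypoints(center_x, center_y, radius, steps=12):
    """Waypoints (x, y, yaw_deg) on a circle around the object in the
    designated area; yaw at each point faces the circle center so the
    camera keeps the object in frame."""
    pts = []
    for i in range(steps):
        ang = 2 * math.pi * i / steps
        x = center_x + radius * math.cos(ang)
        y = center_y + radius * math.sin(ang)
        yaw_deg = (math.degrees(ang) + 180.0) % 360.0  # point back at center
        pts.append((x, y, yaw_deg))
    return pts
```

A feature-point-driven variant would advance to the next waypoint whenever the tracker reports too few new features, matching the behavior described above.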

In the present invention, at least two bot cameras 183 may be provided. In this case, the bot controller 182 can simultaneously photograph images in at least two directions. When there are at least two bot cameras 183, the bot controller 182 can determine any one of them as a main camera, and the direction in which the flying bot 180 moves may be determined according to the direction in which the main camera faces. The bot camera 183 may include a depth camera or a 3D scan camera.

FIG. 11 shows an embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the 3D shooting mode, and FIG. 12 shows an embodiment of a method of controlling a photographing area of a flying bot according to an embodiment of the present invention in the scan mode.

Referring to FIG. 11, the user sets a rectangular parallelepiped or spherical three-dimensional photographing area 60 while the initial photographed image is displayed on the display unit 251. When the 3D photographing area 60 is set, the smartphone control unit controls the flying bot 180 to rotate at a predetermined distance from the 3D photographing area 60, thereby capturing a three-dimensional image through the depth camera mounted on the flying bot 180.

Referring to FIG. 12, when the scan area 70 is set in the 3D scan mode, the smartphone control unit controls the flying bot 180 to capture a three-dimensional scan image through the 3D scan camera mounted on the flying bot 180. At this time, the smartphone control unit may detect a lack of scan data for each voxel in the scan area and control an additional scan.
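Detecting a "lack of scan data for each voxel" can be as simple as thresholding a per-voxel sample count and rescanning the deficient cells; the threshold and data layout here are assumptions:

```python
def undersampled_voxels(sample_counts, min_samples=3):
    """Return voxel indices whose scan-data count falls below a threshold,
    so the controller can schedule an additional scan pass over them.
    sample_counts: dict mapping (i, j, k) voxel index -> number of depth
    samples accumulated for that voxel."""
    return sorted(v for v, n in sample_counts.items() if n < min_samples)
```

The returned indices could then be converted back to world positions and fed to the same waypoint planner used for the initial scan.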

The present invention is not limited to the smartphone example. For example, the same description applies to the case where a dedicated flying bot controller 100 is configured to control the shooting operation of the bot camera 183 of the flying bot 180.

As described above, the present invention is advantageous in that the photographing area is set intuitively when performing close-up and remote photographing using a flying bot, so that the user can conveniently photograph only a desired area without directly operating the flying bot.

In addition, the present invention is advantageous in that it can perform not only general shooting but also special shooting (moving picture, continuous shooting, 3D shooting, and 3D scanning) by controlling the position and rotating direction of the flying bot according to the various shooting modes.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

50, 50a: photographing area 60: three-dimensional photographing area
70: scan area 100: flying bot controller
180: flying bot 182: bot controller
183: bot camera 200: smartphone
251: display unit

Claims (19)

1. A photographing control apparatus of a flying bot, comprising:
a flying bot equipped with at least one camera; and
a flying bot control unit configured to display an image captured by the camera of the flying bot on a display unit, and to control the movement and rotation of the flying bot according to a photographing area set by the user on the photographed image displayed on the display unit.
2. The apparatus according to claim 1,
wherein the photographing area is set in a region input by the user with a touch gesture, or by guide information automatically displayed when the re-photographing menu is selected.
3. The apparatus according to claim 1,
wherein the photographing area is set to be smaller or larger than the initially photographed image, and
wherein the photographing area is set to a regular figure shape or an irregular curved shape.
4. The apparatus according to claim 1,
wherein the photographing area includes a three-dimensional photographing area of a rectangular parallelepiped or spherical shape.
5. The apparatus according to claim 1,
wherein the control unit determines and controls the moving direction or the rotating direction of the flying bot based on the shooting mode.
6. The apparatus according to claim 1,
wherein the flying bot control apparatus is implemented as a mobile terminal, a smartphone, or a dedicated flying bot controller.
7. The apparatus according to claim 1,
wherein the control unit stores a camera pose for the initial photographed image captured by the camera of the flying bot, and calculates a camera pose for the selected photographing area using the stored camera pose for the initial photographed image.
8. The apparatus according to claim 7,
wherein the flying bot moves and/or rotates to a position and direction corresponding to the camera pose transmitted from the flying bot control unit, and re-photographs the set photographing area using the camera.
9. The apparatus according to claim 1,
wherein the control unit automatically determines the shooting mode based on the analysis result of the initial photographed image received from the flying bot or on the distance.
10. The apparatus according to claim 1,
wherein the control unit changes the flight control method of the flying bot based on a result of further sensing the current altitude of the flying bot and the weather condition sensed by the flying bot.
11. The apparatus according to claim 5,
wherein the shooting mode includes a normal shooting mode, an autonomous shooting mode, a moving picture and continuous shooting mode, a 3D shooting mode, and a scan mode, and wherein, in the moving picture and continuous shooting mode, the 3D shooting mode, and the scan mode, the flying bot captures an image of a photographing area or an object to be photographed.
12. A photographing control method of a flying bot, the method comprising:
determining a moving direction or a rotating direction of the flying bot according to the shooting mode and moving the flying bot;
performing an initial photographing by controlling a camera mounted on the flying bot, receiving the initial photographed image from the flying bot, and displaying it on the display unit;
sensing a photographing area set by the user on the initial photographed image displayed on the display unit; and
controlling the movement and rotation of the flying bot according to the set photographing area to re-photograph the set photographing area.
13. The method according to claim 12,
wherein the photographing area is set in a region input by the user with a touch gesture, or by guide information automatically displayed when the re-photographing menu is selected.
14. The method according to claim 12,
wherein the photographing area is set to be smaller or larger than the initially photographed image, and
wherein the photographing area is set to a regular figure shape or an irregular curved shape.
15. The method according to claim 12,
wherein the photographing area includes a three-dimensional photographing area of a rectangular parallelepiped or spherical shape.
16. The method according to claim 12, further comprising storing a camera pose for the initial photographed image captured by the camera of the flying bot,
wherein, in the re-photographing step, the camera pose for the selected photographing area is calculated using the stored camera pose for the initial photographed image.
17. The method according to claim 16, wherein the flying bot
moves and/or rotates to a position and direction corresponding to the camera pose transmitted from the flying bot control device, and re-photographs the set photographing area using the camera.
18. The method according to claim 12, further comprising automatically determining the shooting mode based on the analysis result of the initial photographed image received from the flying bot or on the distance.
19. The method according to claim 12,
wherein the shooting mode includes a normal shooting mode, an autonomous shooting mode, a moving picture and continuous shooting mode, a 3D shooting mode, and a scan mode, and
wherein, in the moving picture and continuous shooting mode, the 3D shooting mode, and the scan mode, the flying bot captures an image of a photographing area or an object to be photographed.
KR1020150114035A 2015-08-12 2015-08-12 Apparatus and method for controling capturing operation of flying bot KR20170019777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150114035A KR20170019777A (en) 2015-08-12 2015-08-12 Apparatus and method for controling capturing operation of flying bot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150114035A KR20170019777A (en) 2015-08-12 2015-08-12 Apparatus and method for controling capturing operation of flying bot

Publications (1)

Publication Number Publication Date
KR20170019777A true KR20170019777A (en) 2017-02-22

Family

ID=58314669

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150114035A KR20170019777A (en) 2015-08-12 2015-08-12 Apparatus and method for controling capturing operation of flying bot

Country Status (1)

Country Link
KR (1) KR20170019777A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102402949B1 (en) * 2021-07-28 2022-05-30 주식회사 네스앤텍 Acquisition method of image information with improved precision

Similar Documents

Publication Publication Date Title
CN110692027B (en) System and method for providing easy-to-use release and automatic positioning of drone applications
US11340606B2 (en) System and method for controller-free user drone interaction
US11649052B2 (en) System and method for providing autonomous photography and videography
US11632497B2 (en) Systems and methods for controlling an image captured by an imaging device
US20200346753A1 (en) Uav control method, device and uav
US20180181119A1 (en) Method and electronic device for controlling unmanned aerial vehicle
CN111596649B (en) Single hand remote control device for an air system
US10191487B2 (en) Control device and control method for flying bot
JP2003267295A (en) Remote operation system
CN107205111B (en) Image pickup apparatus, mobile apparatus, image pickup system, image pickup method, and recording medium
CN110771137A (en) Time-delay shooting control method and device
WO2018112848A1 (en) Flight control method and apparatus
JP2018129063A (en) Method for controlling unmanned aircraft, unmanned aircraft, and system for controlling unmanned aircraft
JP6685742B2 (en) Operating device, moving device, and control system thereof
JP6855616B2 (en) Operating devices, mobile devices, and their control systems
KR20170019777A (en) Apparatus and method for controling capturing operation of flying bot
JP2017112438A (en) Imaging system and control method therefor, communication device, mobile imaging device, program
KR20180060403A (en) Control apparatus for drone based on image
CN115437390A (en) Control method and control system of unmanned aerial vehicle