KR20170080031A - Control apparatus for controlling the movement of the object - Google Patents

Control apparatus for controlling the movement of the object Download PDF

Info

Publication number
KR20170080031A
Authority
KR
South Korea
Prior art keywords
control device
displacement
mode
calculated
function
Prior art date
Application number
KR1020150191184A
Other languages
Korean (ko)
Other versions
KR101927810B1 (en)
Inventor
홍유정
Original Assignee
홍유정
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 홍유정 filed Critical 홍유정
Priority to KR1020150191184A priority Critical patent/KR101927810B1/en
Publication of KR20170080031A publication Critical patent/KR20170080031A/en
Application granted granted Critical
Publication of KR101927810B1 publication Critical patent/KR101927810B1/en

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/005Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with signals other than visual, e.g. acoustic, haptic
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a control device capable of controlling a moving object. The control device for controlling the motion of an object according to the present invention includes a sensing unit for sensing a predetermined user's motion applied to the control device; a control unit for calculating a displacement of the control device caused by the user's motion when the predetermined user's motion is applied, and for generating a control signal for controlling the object based on the calculated displacement of the control device; and a wireless communication unit for transmitting the control signal to the object.

Description

CONTROL APPARATUS FOR CONTROLLING THE MOVEMENT OF THE OBJECT

The present invention relates to a control device capable of controlling a moving object.

A drone is an unmanned aerial vehicle that can be controlled by radio waves. Drones are equipped with cameras, sensors, and communication systems, and range in weight from 25 grams to 1,200 kilograms. Drones were first created for military use, but have recently become available at low cost, so that individuals can afford to purchase them. In addition, they are used in various areas such as high-resolution photography, delivery, pesticide spraying, and air-quality measurement.

Such a drone can be controlled by a separate controller such as a smartphone or a remote controller.

FIG. 1 is a conceptual diagram showing an embodiment of a conventional drone control device.

Referring to FIG. 1, the forward and backward movement, left and right movement, left and right turning, ascent and descent, and the like of the drone can be controlled by using the sticks on the left and right sides.

However, this control method is not intuitive, and a user needs a lot of practice to freely manipulate the drone. Also, as drones are developed to enable precise control such as acrobatic flight, the complexity of the control method is increasing.

The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a control device that controls the movement of an object based on the movement of the control device caused by a user's gesture.

Specifically, it is intended to provide users with an intuitive object control environment, and further to provide an object control environment that allows fine adjustment and improves control accuracy.

According to an aspect of the present invention, there is provided a control device for controlling movement of an object, the control device comprising: a sensing unit for sensing a predetermined user's motion applied to the control device; a control unit for calculating a displacement of the control device caused by the user's motion when the predetermined user's motion is applied, and for generating a control signal for controlling the object based on the calculated displacement of the control device and positional information of the object; and a wireless communication unit for transmitting the control signal to the object.

In an embodiment, the control unit may set the transmission of the control signal to an OFF state so that the object maintains a hovering state, based on a predetermined user input.

In still another embodiment, the apparatus may further include a user input unit for selecting either a first mode in which the displacement of the object is calculated based on the calculated displacement of the control device, or a second mode in which the velocity of the object is calculated based on the calculated displacement of the control device.

In another embodiment, in the first mode, the control unit may calculate the displacement of the object by applying a predetermined ratio to the calculated displacement of the control device.

In another embodiment, the user input unit can set the ratio based on a predetermined user input.

In another embodiment, in the second mode, the controller may set a zero point based on a predetermined user input, perform calibration with respect to the zero point, and calculate the velocity of the object with respect to the calibrated reference.

In another embodiment, in the second mode, the control unit may calculate the velocity of the object based on the absolute value of the calculated displacement of the control device.

In another embodiment, when the predetermined user's motion is applied, the control unit may calculate the rotation direction and the rotation angle of the control device caused by the user's movement, and may calculate the rotation direction and the rotation angle of the object based on the calculated rotation direction and rotation angle of the control device.

In another embodiment, the control unit may calculate a rotation direction and a rotation angle of at least one camera provided in the object, based on the calculated rotation direction and rotation angle of the control device.

In yet another embodiment, the apparatus may further include a vibration output unit that outputs a vibration in a predetermined manner based on a predetermined condition being sensed.

The effect of the object motion control apparatus according to the present invention will be described as follows.

According to at least one of the embodiments of the present invention, the movement of a three-dimensionally moving object such as a drone can be controlled solely by the movement of the controller, thereby providing intuitive control to the user.

Further, fine control is possible, and the accuracy of moving-object control can be improved.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 is a conceptual diagram showing an embodiment of a conventional drone control device.
FIG. 2 is a block diagram for explaining a control apparatus for controlling the motion of an object according to the present invention.
FIG. 3 is a conceptual diagram for explaining the principle of the first mode.
FIG. 4 is a conceptual diagram for explaining the principle of the second mode.
FIG. 5 is a conceptual diagram for explaining an embodiment in which the first mode is selected.
FIG. 6 is a conceptual diagram for explaining an embodiment in which the second mode is selected.
FIG. 7 is a conceptual diagram for explaining an embodiment for calculating the displacement of the control device.
FIGS. 8 and 9 are conceptual diagrams for explaining an embodiment in which a ratio for calculating the displacement of an object in the first mode is set.
FIGS. 10 and 11 are conceptual diagrams for explaining an embodiment related to input of a control signal.
FIG. 12 is a conceptual diagram for explaining an embodiment for rotating an object in the first mode or the second mode.
FIG. 13 is a conceptual diagram for explaining an embodiment in which vibration is output when an obstacle close to an object is detected in the first mode or the second mode.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant descriptions thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, a detailed description of related known arts will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and that the present invention covers all modifications, equivalents, and alternatives falling within its spirit and scope.

Hereinafter, a control device for controlling the movement of an object according to the present invention will be described with reference to the drawings.

Specifically, the object includes all objects capable of three-dimensional movement such as a UAV (drone), and the controller can be implemented as a controller capable of intuitively controlling the movement of the object. In the embodiment, the case where the object is a drone is explained, but the present invention is not limited thereto.

FIG. 2 is a block diagram for explaining a control apparatus for controlling the motion of an object according to the present invention.

Referring to FIG. 2, the control apparatus 100 includes a user input unit 110, a sensing unit 120, a controller 130, and a wireless communication unit 140. It may further include a vibration output unit 150.

The sensing unit 120 may sense a predetermined user's motion applied to the controller 100. For this purpose, the sensing unit 120 may include at least one of an acceleration sensor, a magnetic sensor, an impedance sensor, a hybrid sensor combining an impedance sensor and a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an infrared sensor, an ultrasonic sensor, and an optical sensor (e.g., a camera).

Alternatively, the controller 100 may include at least one of the sensing unit and any other device capable of acquiring position information of the control device.

Meanwhile, the sensing unit 120 may combine and use information sensed by at least one of the sensors.

When the predetermined user's motion is applied, the control unit 130 calculates the displacement of the control device 100 caused by the user's movement, and generates a control signal for controlling the object 200 based on the calculated displacement of the control device 100.

In the embodiment, the movement of the object 200 can be controlled by calculating the displacement, speed, and the like of the object 200 based on the calculated displacement of the control device 100.

The wireless communication unit 140 may transmit the control signal to the object 200. In an embodiment, when switching to a particular mode, a control signal may not be transmitted to the object 200.

The vibration output section 150 can output vibration in a predetermined manner based on a preset condition being detected. As an embodiment, when an obstacle close to the drone 200 is detected, a vibration of a predetermined intensity can be output.

The user input unit 110 may be implemented by at least one push button, a touch button, a scroll button, or the like, and may receive various input values that can be set by the user.

At least one push button, a touch button, a scroll button, and the like may be disposed on the front, back, side, and the like of the controller 100.

In an embodiment, the user input unit 110 may be used to select either a first mode (absolute coordinate mode) in which the displacement of the object 200 is calculated based on the calculated displacement of the control device 100, or a second mode (relative coordinate mode) in which the velocity of the object 200 is calculated based on the calculated displacement of the control device 100.

Hereinafter, specific embodiments will be described.

The controller 130 may set the transmission of the control signal to the OFF state so that the object 200 maintains the hovering state, based on a predetermined user input being applied.

In yet another embodiment, the user input unit may be used to select either the first mode, in which the displacement of the object 200 is calculated based on the calculated displacement of the control device 100, or the second mode, in which the velocity of the object 200 is calculated based on the calculated displacement of the control device 100.

In another embodiment, in the first mode, the control unit 130 may calculate the displacement of the object 200 by applying a predetermined ratio to the calculated displacement of the control device 100.

In another embodiment, the user input unit 110 can set the ratio based on a predetermined user input.

In another embodiment, in the second mode, the controller 130 may set a zero point based on a predetermined user input, perform calibration with respect to the zero point, and calculate the velocity of the object 200.

In another embodiment, the control unit 130 may calculate the velocity of the object 200 based on the absolute value of the calculated displacement of the control device 100 in the second mode.

In another embodiment, when the predetermined user's motion is applied, the control unit 130 may calculate the rotation direction and the rotation angle of the controller 100 caused by the user's movement, and may calculate the rotation direction and the rotation angle of the object 200 based on the calculated rotation direction and rotation angle of the control device 100.

The control unit 130 may calculate the rotation direction and the rotation angle of at least one camera provided in the object 200, based on the calculated rotation direction and rotation angle of the control device 100.

In another embodiment, the apparatus may further include a vibration output unit 150 that outputs vibration in a preset manner, based on a predetermined condition being sensed.

On the other hand, the control apparatus 100 according to the present invention can be implemented to select either the absolute coordinate control mode (first mode) or the relative coordinate control mode (second mode). Alternatively, only the first mode or only the second mode may be implemented.

As an embodiment, the user input unit 110 may be used to select either the first mode, in which the displacement of the object 200 is calculated based on the calculated displacement of the control device 100, or the second mode, in which the velocity of the object 200 is calculated based on the calculated displacement of the control device 100.

The control unit 130 can calculate the displacement of the object 200 by applying a predetermined ratio to the calculated displacement of the control device 100 in the first mode.

FIG. 3 is a conceptual diagram for explaining the principle of the first mode (absolute control).

Referring to FIG. 3, the first mode refers to a control method based on absolute coordinates.

As an embodiment, the displacement of the controller 100 caused by the user's motion in the first mode can be measured. Subsequently, the displacement of the drone 200 can be calculated by applying a linear transformation (linear function) to the measured displacement values of the controller 100.
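As an illustration of this absolute-coordinate mapping, the following minimal sketch (in Python, not part of the patent) applies a constant scale factor as the linear function to a measured controller displacement in order to obtain the drone's target displacement; the vector layout, the function name, and the scale value are assumptions made for illustration only.

def drone_displacement(controller_displacement, scale=1.0):
    # Map a measured controller displacement (dx, dy, dz) to a drone displacement
    # by applying a constant scale factor (the linear function of the first mode).
    dx, dy, dz = controller_displacement
    return (scale * dx, scale * dy, scale * dz)

# Example: the controller moved 0.2 m forward and 0.1 m up; with scale=10
# the drone is commanded to move 2.0 m forward and 1.0 m up.
print(drone_displacement((0.2, 0.0, 0.1), scale=10.0))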

In the second mode, the controller 130 may set a zero point based on a predetermined user input, perform calibration with respect to the zero point, and calculate the velocity of the object 200.

In an embodiment, the control unit 130 may calculate the velocity of the object 200 based on the absolute value of the calculated displacement of the control device 100 in the second mode.

FIG. 4 is a conceptual diagram for explaining the principle of the second mode (relative control).

Referring to FIG. 4, the second mode refers to a control scheme based on relative coordinates based on a zero point.

As an embodiment, after the origin coordinates are set in a virtual frame (space) in the second mode, the displacement of the controller 100 due to the movement of the user can be measured. Then, the speed of the drones 200 can be calculated with the absolute value of the measured displacement of the controller 100. To this end, a control signal for controlling the motor output of the drone 200 may be transmitted.
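A minimal sketch of this relative-coordinate scheme is given below; it assumes a calibrated zero point and computes a speed proportional to the magnitude of the controller's offset from that point. The gain value, the function name, and the hover behavior at the zero point are illustrative assumptions rather than details specified in the patent.

import math

def drone_velocity(controller_position, zero_point, gain=2.0):
    # Offset of the controller from the calibrated zero point.
    offset = [p - z for p, z in zip(controller_position, zero_point)]
    magnitude = math.sqrt(sum(c * c for c in offset))
    if magnitude == 0.0:
        return 0.0, (0.0, 0.0, 0.0)          # at the zero point: hover
    direction = tuple(c / magnitude for c in offset)
    return gain * magnitude, direction        # speed grows with the offset

speed, direction = drone_velocity((0.3, 0.0, 0.0), (0.0, 0.0, 0.0))
print(speed, direction)                       # 0.6 in the +X direction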

As described above, the user input unit 110 may be implemented as a push button, a scroll button, or the like in order to select the first mode or the second mode.

FIG. 5 is a conceptual diagram illustrating an embodiment in which a first mode is selected, and FIG. 6 is a conceptual diagram illustrating an embodiment in which a second mode is selected.

Referring to FIGS. 5 and 6, a user can select the first mode (absolute coordinate mode) or the second mode (relative coordinate mode) by placing a scroll button provided on the controller 100 to the right (500) or to the left (600).

Referring to FIG. 5, the controller 100 may be moved (510) by a first distance in a first direction in the absolute coordinate mode, and then moved (520) by a second distance in a second direction. Accordingly, the drone 200 is moved (530) in the first direction by the first distance or by a distance proportional to the first distance (a distance to which a constant ratio is applied), and is then moved (540) in the second direction by the second distance or by a distance proportional to the second distance.

Referring to FIG. 6, the controller 100 may be moved (610) by a first distance in a first direction in the relative coordinate mode. Thus, the drone 200 moves in the first direction. At this time, the movement (620) is performed at a speed proportional to the absolute value of the first distance (or at a speed to which a certain ratio is applied).

Hereinafter, an embodiment related to setting a reference axis (X, Y, Z axis) related to the movement of the object 200 will be described.

FIG. 7 is a conceptual diagram for explaining an embodiment for calculating the displacement of the control device.

Referring to FIG. 7, a plurality of reference points 710, 720, and 730 can be set, and the displacement of the controller 100 can be measured (calculated) based on the set reference points 710, 720, and 730.

As an example, the plurality of reference points 710, 720, 730 may be set as triangular shaped points existing on a part of the user's body. To this end, the user may wear a suit in which the sensor is disposed at a position corresponding to the plurality of reference points 710, 720, 730, or attach the sensor directly. As a result, the displacement value of the controller 100 can be measured more accurately.

In another embodiment, a single reference point may be set. Specifically, a suit on which one sensor is disposed can be worn, or the sensor can be attached directly, and the point corresponding to the position of the sensor can be set as the reference point.

As another embodiment, user input for axis setting can be received. Specifically, in the relative coordinate mode, the zero point can be set by pressing the push button. Thereafter, the controller 100 may be moved in a predetermined direction to set the X axis or the Y axis. At this time, the Z axis may be set to the gravity direction.
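A minimal sketch of this axis-setting step follows. It assumes that the Z axis is simply the vertical (opposite-to-gravity) direction, that the X axis is the horizontal direction in which the controller is moved after the zero point is set, and that the Y axis completes a right-handed frame; the function name and the vector layout are illustrative assumptions.

def build_reference_frame(zero_point, moved_point):
    # Z axis: opposite to the gravity direction (straight up), assumed here as +Z.
    z_axis = (0.0, 0.0, 1.0)
    # X axis: horizontal direction in which the controller was moved
    # after the zero point was set by pressing the push button.
    dx = moved_point[0] - zero_point[0]
    dy = moved_point[1] - zero_point[1]
    norm = (dx * dx + dy * dy) ** 0.5
    x_axis = (dx / norm, dy / norm, 0.0)
    # Y axis: completes a right-handed frame (Z x X).
    y_axis = (-x_axis[1], x_axis[0], 0.0)
    return x_axis, y_axis, z_axis

print(build_reference_frame((0.0, 0.0, 0.0), (0.5, 0.1, 0.0)))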

On the other hand, the user input unit 110 can set a ratio to be applied to the calculated displacement of the control device 100, based on a predetermined user input.

FIGS. 8 and 9 are conceptual diagrams for explaining an embodiment in which a ratio for calculating the displacement of an object in the first mode (for example, the constant value of the linear transformation) is set.

Referring to FIGS. 8 and 9, the user can set the constant value in the first mode (absolute coordinate mode) by placing the scroll button provided on the controller 100 at the lowermost position (800) or the uppermost position (900). In particular, the constant value can act as a frame scale (expansion or contraction) for the motion of the drone 200.

Referring to FIG. 8, when the scroll button is placed at the lowermost position (800), the level is set to the lowest level. When piloting at the lowest level, the drone 200 can be manipulated more finely.

Referring to FIG. 9, when the scroll button is placed at the uppermost position (900), the level is set to the highest level. When piloting at the highest level, the drone 200 can move at a higher speed.

In an embodiment, when the controller 100 moves (810) by a first distance in a first direction, at the lowest level the drone 200 moves in the first direction by the first distance or by a value obtained by reducing the first distance at a predetermined ratio. On the other hand, at the highest level, the drone 200 can move in the first direction by a value obtained by applying a predetermined ratio to the first distance.

As another embodiment, at the highest level, the drone 200 can move at a higher speed, by a value obtained by extending the first distance at a predetermined ratio in the first direction. That is, at the highest level, it can move farther than at the lowest level during the same time.
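The following minimal sketch illustrates how such a level setting might act as a scale factor on the controller displacement, so that the same controller motion yields finer movement at the lowest level and larger, faster movement at the highest level; the particular level-to-scale table and the function name are assumptions for illustration only.

LEVEL_SCALE = {1: 0.25, 2: 0.5, 3: 1.0, 4: 2.0, 5: 4.0}   # lowest .. highest level

def scaled_drone_distance(controller_distance, level):
    # The same controller motion maps to a smaller distance at a low level
    # (fine manipulation) and a larger distance at a high level (faster flight).
    return LEVEL_SCALE[level] * controller_distance

print(scaled_drone_distance(1.0, 1))   # 0.25: finer movement
print(scaled_drone_distance(1.0, 5))   # 4.0: moves farther in the same time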

On the other hand, even when a movement is not intended to control the drone 200, the drone 200 may be moved unnecessarily by the user's movement. In order to prevent this, the transmission of the control signal can be set to an OFF state.

The control unit 130 may set the transmission of the control signal to the OFF state so that the object 200 maintains the hovering state, based on a predetermined user input being applied.

FIGS. 10 and 11 are conceptual diagrams for explaining an embodiment related to the input restriction of a control signal.

Referring to FIGS. 10 and 11, the user can set the ON/OFF state of the transmission of the control signal using the push button 1000 provided on the controller 100.

As an embodiment, when no push input is applied to the push button 1000, the transmission of the control signal is set to the OFF state, so that the drone 200 maintains the hovering state (stopped state).

On the other hand, when a push input is applied to the push button 1000, the transmission of the control signal is set to the ON state, and the drone 200 moves. That is, the transmission state of the control signal can be switched ON/OFF each time a single push input is applied to the push button 1000.

As another embodiment, the transmission of the control signal can be set to the ON state only while the push button 1000 is continuously pressed. In this case, the displacement caused by the user's motion is converted into the displacement of the drone 200 by moving the controller while holding down the push button 1000.
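The following minimal sketch illustrates both gating variants described above (toggling on each push, or transmitting only while the button is held); the class name, the method names, and the radio object with a transmit method are illustrative assumptions and not part of the patent.

class ControlSignalGate:
    # Gates transmission of control signals so that unintended user motion
    # does not move the drone; while OFF, the drone keeps hovering.
    def __init__(self):
        self.transmitting = False

    def toggle(self):
        # Variant 1: each single push of the button flips the ON/OFF state.
        self.transmitting = not self.transmitting

    def set_held(self, button_held):
        # Variant 2: transmit only while the push button is held down.
        self.transmitting = button_held

    def send(self, radio, control_signal):
        # 'radio' is an assumed object exposing a transmit(...) method.
        if self.transmitting:
            radio.transmit(control_signal)   # controller displacement becomes drone motion
        # otherwise nothing is sent and the drone maintains its hovering state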

On the other hand, when the predetermined user's motion is applied, the control unit 130 may calculate the rotation direction and the rotation angle of the control device 100 caused by the user's movement, and may calculate the rotation direction and the rotation angle of the object 200 based on the calculated rotation direction and rotation angle of the control device 100.

In another embodiment, the controller 130 may calculate the rotation direction and rotation angle of at least one camera provided on the object 200, based on the calculated rotation direction and rotation angle of the control device 100.

FIG. 12 is a conceptual diagram for explaining an embodiment for rotating an object in the first mode or the second mode.

Referring to FIG. 12, when the controller 100 rotates counterclockwise by a predetermined angle 1210, the drone 200 also rotates counterclockwise by a predetermined angle 1220.

As another embodiment, scale adjustment can be performed on the rotation angle by setting the level, as described with reference to FIGS. 8 and 9.

Specifically, at the first level, when the controller 100 rotates by 15 degrees, the drone 200 also rotates by 15 degrees. In contrast, at the second level, when the controller 100 rotates by 15 degrees, the drone 200 can rotate by 30 degrees.
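A minimal sketch of this level-dependent rotation scaling is shown below; the level-to-gain table and the function name are assumptions for illustration of the 15-degree and 30-degree example above.

ROTATION_GAIN = {1: 1.0, 2: 2.0}   # level 1: same angle, level 2: doubled angle

def drone_rotation(controller_angle_deg, level, direction="ccw"):
    # The drone turns in the same direction as the controller, with the
    # angle scaled according to the selected level.
    return direction, ROTATION_GAIN[level] * controller_angle_deg

print(drone_rotation(15.0, 1))   # ('ccw', 15.0)
print(drone_rotation(15.0, 2))   # ('ccw', 30.0)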

As another embodiment, the trajectory in which the drone 200 rotates can be set in stages according to the level.

As another embodiment, the drone 200 may be rotated by the scroll button. Specifically, the drone 200 can be rotated by the angle through which the scroll button is turned. Thus, the accuracy of rotation can be improved.

As another embodiment, the rotation of the drone 200 may be performed not only about the up/down axis but also about the left/right or front/rear axis. For this purpose, a separate axis selection button may be provided.

As another embodiment, a flip about each axis can also be performed, based on a predetermined user input (e.g., an input pressing a separate selection button or the like) being applied.

In another embodiment, when a push input is applied to a separate camera rotation button, the camera provided in the drone 200 may rotate according to the rotation of the controller 100. That is, the user may rotate either the drone 200 itself or the camera provided in the drone 200.

On the other hand, the vibration output unit 150 can output the vibration in a preset manner based on the detection of the predetermined condition.

FIG. 13 is a conceptual diagram for explaining an embodiment in which vibration is output when an obstacle close to an object is detected in the first mode or the second mode.

Referring to FIG. 13, when a sensing function is added to the drone 200, vibration may be output when the drone 200 approaches a structure or another object (obstacle) within a predetermined distance range. As an example, the closer the drone is to the obstacle, the greater the intensity of the output vibration.
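The following minimal sketch illustrates such proximity feedback, assuming a fixed sensing range and a linear intensity profile; the range, the maximum intensity, and the function name are illustrative assumptions rather than values given in the patent.

def vibration_intensity(obstacle_distance, sensing_range=5.0, max_intensity=1.0):
    # No vibration outside the sensing range; inside it, intensity grows
    # linearly as the drone gets closer to the obstacle.
    if obstacle_distance >= sensing_range:
        return 0.0
    closeness = 1.0 - obstacle_distance / sensing_range
    return max_intensity * closeness

print(vibration_intensity(4.0))   # 0.2  (weak vibration, obstacle still far)
print(vibration_intensity(0.5))   # 0.9  (strong vibration, obstacle very close)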

In another embodiment, when a push input is applied to an additional button provided on the controller 100, the movement path of the drone 200 may be changed to the opposite direction.

On the other hand, the control device may be provided with one or more function button keys. The one or more function button keys are disposed in an area not covered by the palm when the user grasps the main body. Each function button key is linked to a function for automatically controlling the object. When a function button key is pressed, the control device transmits a control command corresponding to that function to the object, so that the function set on the pressed function button key is performed by the object.

Although the control device is provided with one or more function button keys, there are limitations in executing the various functions related to the object (or the drone) with only the function button keys provided in the control device. In addition, considering the characteristics of a control device such as a portable device, the size of the control device is limited, and it is impossible to implement all the functions that the object can perform with only one or more function button keys.

In order to solve such a problem, the control device according to the present invention can change the function set in the function button key to another function based on the user input. For example, in a state where the first function is set to the first function button key, the control device can set a second function different from the first function to the first function button based on a user input. Thereafter, when the first function button is pressed, a control command corresponding to the second function is transmitted to the object, so that the object performs the second function.
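A minimal sketch of such remappable function button keys is shown below; the class name, the default assignments, the command format, and the radio object with a transmit method are illustrative assumptions, not elements defined in the patent.

class FunctionButtons:
    # Each function button key stores a function identifier; the assignment
    # can be changed by user input (e.g., through a connected terminal UI).
    def __init__(self):
        self.assignments = {1: "sync", 2: "back_home"}   # assumed defaults

    def remap(self, key, new_function):
        # Set a different function on an existing button key.
        self.assignments[key] = new_function

    def press(self, key, radio):
        # Pressing a key sends the command for whatever function is currently assigned.
        radio.transmit({"command": self.assignments[key]})

buttons = FunctionButtons()
buttons.remap(1, "follow_me")   # key 1 now triggers the follow-me function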

When the control device according to the present invention is connected to a terminal having a display, a user interface for changing functions provided in the control device may be output to the display. The user can confirm one or more function button keys provided in the control device and the respective functions through the user interface, and change the set functions to other functions.

The functions that can be set on the function button keys may include at least one of a sync function, a display function, a flip function, a back-home function, a back-fast function, a follow-me function, a rounding function, a boost function, a dodge function, and a landing function.

The sync function is a function of changing the control reference axis so that the direction in which the object advances is aligned with the direction in which the user's line of sight is oriented.

The display function is a function of moving an object according to a programmed movement trajectory.

The flip function is a function of rotating a virtual axis set on an object. For example, the object may be rotated by 180 degrees and turned over.

The back-home function is a function of returning the object to the position where the control device is located.

The back-fast function is a function of moving the object to a predetermined position.

The follow-me function is a function of following the control device while maintaining at least one of a constant distance and a constant altitude from the control device. At least one of the distance and the altitude may vary depending on user input.

The rounding function is a function in which the object circles around a specific point. For example, it may be a function of circling at a predetermined distance around the position where the control device is located. The predetermined distance may vary depending on user input.

The boost function is a function of rapidly increasing the speed of the object up to a predetermined speed.

The dodge function is a function of automatically avoiding an obstacle within a predetermined distance, regardless of the control of the control device.

The landing function is a function of landing safely on the ground while descending slowly.

On the other hand, various functions related to objects are continuously being developed, and users may wish to apply a new function to the control device.

To meet this need, the control device may be connected to a terminal or a server by wire or wirelessly to download a new function, and the downloaded function may be set on a function button key included in the control device. For example, the user can download a new function from a predetermined server using the terminal and connect the control device to the terminal to install the new function on the control device. Subsequently, the user can assign the new function to any one of the function button keys disposed on the controller so that the new function can be executed by that function button key.

The effect of the object motion control apparatus according to the present invention will be described as follows.

According to at least one of the embodiments of the present invention, the movement of a three-dimensionally moving object such as a drone can be controlled solely by the movement of the controller, thereby providing intuitive control to the user.

Further, fine control is possible, and the accuracy of moving-object control can be improved.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and the like, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit 130 of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

Claims (10)

A control device for controlling movement of an object,
A sensing unit for sensing movement of a predetermined user applied to the control device;
A control unit for calculating a displacement of the control device caused by the user's motion when the predetermined user's motion is applied, and for generating a control signal for controlling the object based on the calculated displacement of the control device; and
A wireless communication unit for transmitting the control signal to the object.
The control device according to claim 1,
wherein the control unit sets the transmission of the control signal to an OFF state so that the object maintains a hovering state, based on a predetermined user input being applied.
The control device according to claim 1,
further comprising a user input unit for selecting either a first mode in which a displacement of the object is calculated based on the calculated displacement of the control device, or a second mode in which a velocity of the object is calculated based on the calculated displacement of the control device.
The control device according to claim 3,
wherein, in the first mode, the control unit calculates the displacement of the object by applying a predetermined ratio to the calculated displacement of the control device.
The control device according to claim 4,
wherein the user input unit sets the ratio based on a predetermined user input being applied.
The control device according to claim 3,
wherein, in the second mode, the control unit sets a zero point based on a preset user input, performs calibration with respect to the zero point, and calculates the velocity of the object.
The control device according to claim 6,
wherein, in the second mode, the control unit calculates the velocity of the object based on an absolute value of the calculated displacement of the control device.
The control device according to claim 1,
wherein the control unit calculates a rotation direction and a rotation angle of the control device caused by the user's motion when the predetermined user's motion is applied, and calculates a rotation direction and a rotation angle of the object based on the calculated rotation direction and rotation angle of the control device.
The control device according to claim 8,
wherein the control unit calculates a rotation direction and a rotation angle of at least one camera provided on the object, based on the calculated rotation direction and rotation angle of the control device.
The control device according to claim 1,
further comprising a vibration output unit for outputting vibration in a predetermined manner based on a predetermined condition being sensed.
KR1020150191184A 2015-12-31 2015-12-31 Control apparatus for controlling the movement of the object KR101927810B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150191184A KR101927810B1 (en) 2015-12-31 2015-12-31 Control apparatus for controlling the movement of the object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150191184A KR101927810B1 (en) 2015-12-31 2015-12-31 Control apparatus for controlling the movement of the object

Publications (2)

Publication Number Publication Date
KR20170080031A true KR20170080031A (en) 2017-07-10
KR101927810B1 KR101927810B1 (en) 2018-12-11

Family

ID=59356298

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150191184A KR101927810B1 (en) 2015-12-31 2015-12-31 Control apparatus for controlling the movement of the object

Country Status (1)

Country Link
KR (1) KR101927810B1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101194321B1 (en) * 2011-12-22 2012-10-26 김철우 Universal motion controller
KR101471852B1 (en) * 2013-12-02 2014-12-12 경상대학교산학협력단 Smart Device, Apparatus for Providing Robot Information, Method for Generating Trajectory of Robot, and Method for Teaching Work of Robot

Also Published As

Publication number Publication date
KR101927810B1 (en) 2018-12-11

Similar Documents

Publication Publication Date Title
US11644832B2 (en) User interaction paradigms for a flying digital assistant
US11573562B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
US11347244B2 (en) Image space motion planning of an autonomous vehicle
KR102236339B1 (en) Systems and methods for controlling images captured by an imaging device
US9928649B2 (en) Interface for planning flight path
US9947230B2 (en) Planning a flight path by identifying key frames
CN113448343B (en) Method, system and readable medium for setting a target flight path of an aircraft
TWI598143B (en) Following remote controlling method for aircraft
KR20180075191A (en) Method and electronic device for controlling unmanned aerial vehicle
WO2016168722A1 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
WO2015013979A1 (en) Remote control method and terminal
KR101887314B1 (en) Remote control device and method of uav, motion control device attached to the uav
US12079388B2 (en) Barometric sensing of arm position in a pointing controller system
US12007763B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
KR101927810B1 (en) Control apparatus for controlling the movement of the object
WO2012096282A1 (en) Controller, model device and control method
EP3518063A1 (en) Combined video display and gimbal control
KR20190076407A (en) Remote control device and method of uav
AU2014202714A1 (en) Aircraft

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant