KR20170091321A - Unmanned security robot - Google Patents

Unmanned security robot

Info

Publication number
KR20170091321A
Authority
KR
South Korea
Prior art keywords
unit
unmanned security
security robot
intruder
destination
Prior art date
Application number
KR1020160012184A
Other languages
Korean (ko)
Inventor
김갑순
김동영
윤건우
최치훈
신성일
김현우
Original Assignee
경상대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 경상대학교산학협력단
Priority to KR1020160012184A priority Critical patent/KR20170091321A/en
Publication of KR20170091321A publication Critical patent/KR20170091321A/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/002Manipulators for defensive or military tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/14Central alarm receiver or annunciator arrangements

Abstract

An unmanned security robot is provided. The unmanned security robot includes a sensing unit that includes a sensor and detects an intruder, a driving unit that moves the unmanned security robot, and a control unit that generates map information about the surrounding environment using the sensor and controls the driving unit to move within the surrounding environment based on the generated map information.

Description

UNMANNED SECURITY ROBOT

BACKGROUND OF THE INVENTION

The present invention relates to an unmanned security robot, and more particularly, to an unmanned security robot that generates map information about a surrounding environment, moves within that environment, and detects an intruder while moving around.

Recently, automation systems have been widely used in various fields. Among them, the unmanned security industry is also developing in relation to the security of private property and social infrastructure. With increasing demand for unmanned security systems, consumer complaints about unmanned security systems are also increasing.

For example, since currently used unmanned security systems rely on a thermal sensor, they cannot accurately distinguish an intruder from other heat-emitting objects.

In addition, since currently used unmanned security devices are fixed at a single point, blind spots exist, leaving the system vulnerable to crimes committed through those blind spots.

Therefore, it is necessary to increase the intruder recognition rate of unmanned security systems and to minimize their blind spots.

SUMMARY OF THE INVENTION

The present invention has been made to solve the above problems, and it is an object of the present invention to provide an unmanned security robot that generates map information and detects an intruder while moving within the surrounding environment based on the generated map information.

According to an aspect of the present invention, there is provided an unmanned security robot including: a sensing unit including a sensor and detecting an intruder; a driving unit for moving the unmanned security robot; and a control unit for generating map information about the surrounding environment using the sensor and controlling the driving unit to move within the surrounding environment based on the generated map information.

The control unit may control the driving unit to move to a destination corresponding to the received user command when a user command for designating a destination is received through the communication unit.

The controller may determine the shortest path between the current position and the destination based on the coordinates corresponding to the current position of the unmanned security robot and the coordinates corresponding to the destination, and control the driving unit to move to the destination along the shortest path.

The controller may control the sensing unit to acquire at least one of an RGB image, a depth image, and an infrared image using a sensor, and generate map information based on the acquired image.

In addition, the controller may determine the current position of the unmanned security robot based on the map information.

The control unit may control the communication unit to transmit intruder detection information including at least one of an intruder recognition time and an intruder recognition position to an external device when the intruder is detected.

As described above, according to various embodiments of the present disclosure, a user can be provided with an unmanned security robot that increases the intruder recognition rate and minimizes the blind spot.

FIG. 1 is a block diagram schematically showing the configuration of an unmanned security robot according to an embodiment of the present disclosure;
FIG. 2 is a block diagram illustrating the configuration of an unmanned security robot in detail according to an embodiment of the present disclosure;
FIGS. 3A to 3C are diagrams illustrating various embodiments in which an unmanned security robot moves, according to various embodiments of the present disclosure;
FIGS. 4A and 4B are diagrams for explaining various embodiments in which an unmanned security robot detects an intruder according to an embodiment of the present disclosure; and
FIG. 5 is a flowchart illustrating a security method of an unmanned security robot according to an embodiment of the present disclosure.

The terms used in the embodiments of the present disclosure will be briefly described, and the embodiments will be described in detail.

The terms used in the embodiments of the present disclosure have been selected, as far as possible, from general terms currently in wide use in light of the functions in this disclosure, but they may vary depending on the intentions of those skilled in the art, precedents, the emergence of new technologies, and the like. In certain cases, some terms have been arbitrarily selected by the applicant, and in those cases their meaning is described in detail in the description of the corresponding embodiments. Therefore, the terms used in the embodiments should be defined based on their meaning and the overall content of the embodiments, not simply on their names.

In the embodiments of the present disclosure, terms including ordinal numbers such as 'first' and 'second' may be used to describe various elements, but the elements are not limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present disclosure, a first component may be referred to as a second component, and similarly, a second component may also be referred to as a first component. The term 'and/or' includes any combination of a plurality of related listed items, or any one of a plurality of related listed items.

Moreover, in the embodiments of the present disclosure, the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise.

Also, in the embodiments of the present disclosure, terms such as "comprise" or "have" are intended to specify the presence of a feature, number, step, operation, component, part, or combination thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Further, in the embodiments of the present disclosure, a 'module' or 'unit' performs at least one function or operation, and may be implemented in hardware, software, or a combination of hardware and software. In addition, a plurality of 'modules' or a plurality of 'units' may be integrated into at least one module, except for a 'module' or 'unit' that needs to be implemented by specific hardware, and may be implemented by at least one processor.

Also, in the embodiments of the present disclosure, when a portion is referred to as being "connected" with another portion, this includes not only the case where it is "directly connected" but also the case where it is "electrically connected" with another element interposed therebetween.

Further, in the embodiments of the present disclosure, the user input may include at least one of a touch input, a bending input, a voice input, a button input, and a multimodal input, but is not limited thereto.

Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their contextual meaning in the related art, and are not to be interpreted in an ideal or overly formal sense unless expressly so defined in the present application.

Hereinafter, the present disclosure will be described with reference to the drawings. FIG. 1 is a diagram showing the configuration of an unmanned security robot according to an embodiment of the present disclosure. As shown in FIG. 1, the unmanned security robot 100 includes a sensing unit 110, a driving unit 120, and a control unit 130.

The sensing unit 110 senses the surroundings of the unmanned security robot 100. In particular, the sensing unit 110 may include various sensors such as a gyro sensor capable of sensing movement of the unmanned security robot 100, an acceleration sensor, an ultrasonic sensor capable of sensing the periphery of the unmanned security robot 100, a pressure sensor, a noise sensor, a thermal sensor, a camera, and the like. The sensing unit 110 may sense information about the surrounding environment for generating a map of that environment through the various sensors it includes. The sensing unit 110 may also sense an intruder using a camera, a thermal sensor, or the like.

The driving unit 120 is driven according to a driving control signal of the control unit 130 to move the unmanned security robot 100 in a predetermined direction. In particular, the driving unit 120 may be driven according to a driving control signal of the control unit 130 so that the unmanned security robot 100 moves within the surrounding environment according to a preset pattern or a user's destination setting.

The control unit 130 controls the overall operation of the unmanned security robot 100. In particular, the control unit 130 generates map information about the surrounding environment using the sensor included in the sensing unit 110, and controls the driving unit 120 to move within the surrounding environment based on the generated map information. The control unit 130 may also detect an intruder through the sensing unit 110.

First, the control unit 130 can generate map information about the surrounding environment using the sensor included in the sensing unit 110. Specifically, the control unit 130 may control the sensing unit 110 to acquire at least one of an RGB image, a depth image, and an infrared image using a plurality of sensors. Then, the control unit 130 can generate map information based on the acquired image.
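To make the mapping step concrete, the sketch below back-projects a depth image into a 3D point cloud using the standard pinhole camera model. The intrinsics (fx, fy, cx, cy) and the synthetic depth image are assumed values for illustration; the patent does not specify a camera model.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an Nx3 point cloud
    using the standard pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # horizontal offset from the optical axis
    y = (v - cy) * z / fy          # vertical offset from the optical axis
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]      # drop pixels with no depth reading

# Example: a synthetic 4x4 depth image and assumed intrinsics.
depth = np.full((4, 4), 2.0)       # every pixel reads 2 m
points = depth_to_points(depth, fx=525.0, fy=525.0, cx=2.0, cy=2.0)
print(points.shape)                # (16, 3)
```

Point clouds obtained this way (or from laser distance measurement) are the raw material for the plane-feature extraction described later.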

The control unit 130 can determine the current position of the unmanned security robot 100 based on the generated map information.

The control unit 130 may control the driving unit 120 to move within the surrounding environment based on the generated map information. Specifically, when a user command for designating a destination is received through a communication unit (not shown), the control unit 130 may control the driving unit 120 so that the unmanned security robot 100 moves to the destination corresponding to the received user command. Although this embodiment describes the user command for designating a destination as being received through a communication unit (not shown), the present invention is not limited thereto, and the user command may instead be input through an input unit (not shown) of the unmanned security robot 100. Here, the external device that transmits the user command for designating a destination to the unmanned security robot 100 may be any of various electronic devices such as a smart phone, a desktop PC, a notebook PC, and the like.

The control unit 130 can determine the shortest path between the current location and the destination based on the coordinates corresponding to the current location of the unmanned security robot 100 and the coordinates corresponding to the destination. The control unit 130 may control the driving unit 120 to move to a destination corresponding to the received user command along the determined shortest path.
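The patent does not name a path-planning algorithm, so the following is a minimal sketch assuming an A* search over a 2D occupancy grid derived from the map information; the grid encoding, the 4-connected neighborhood, and the Manhattan heuristic are illustrative choices, not the claimed method.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 2D occupancy grid (0 = free, 1 = obstacle).
    start/goal are (row, col) cells; returns the path as a list of cells."""
    def h(a, b):                      # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    open_set = [(h(start, goal), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:         # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:              # reconstruct the path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc), goal), ng, (nr, nc), cell))
    return None                        # no route found

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))    # e.g. [(0,0),(0,1),(0,2),(1,2),(2,2),(2,1),(2,0)]
```

With unit step costs and an admissible heuristic, the returned route is a shortest path around the obstacle row.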

The control unit 130 may detect the intruder through the sensing unit 110.

Hereinafter, various embodiments of the present disclosure will be described with reference to FIGS. 2 to 4B. FIG. 2 is a block diagram showing in detail the configuration of the unmanned security robot 100 according to an embodiment of the present disclosure. As shown in FIG. 2, the unmanned security robot 100 includes a display unit 210, an audio output unit 220, a communication unit 230, a storage unit 240, a sensing unit 250, an input unit 260, a driving unit 270, and a control unit 280.

FIG. 2 illustrates various components for the case where the unmanned security robot 100 has various functions such as a moving function and an intruder detection function. Therefore, depending on the embodiment, some of the components shown in FIG. 2 may be omitted or changed, and other components may be further added.

The display unit 210 displays at least one of a video frame, processed by an image processing unit (not shown) from image data received through an image receiving unit (not shown), and various screens generated by the graphics processing unit 283. In particular, the display unit 210 may display intruder detection information according to a user input.

The audio output unit 220 is configured to output not only various kinds of audio data on which processing operations such as decoding, amplification, and noise filtering have been performed by an audio processing unit (not shown), but also various notification sounds and voice messages. In particular, the audio output unit 220 may output an intruder detection warning sound under the control of the control unit 280. Meanwhile, the audio output unit 220 may be implemented as a speaker, but this is only an example; it may also be implemented as an output terminal capable of outputting audio data.

The communication unit 230 is configured to communicate with various types of external devices according to various communication methods. The communication unit 230 may include various communication chips such as a Wi-Fi chip, a Bluetooth chip, an NFC chip, and a wireless communication chip. The Wi-Fi chip, the Bluetooth chip, and the NFC chip communicate using the Wi-Fi method, the Bluetooth method, and the NFC method, respectively. Among these, the NFC chip refers to a chip operating in the NFC (Near Field Communication) system using the 13.56 MHz band among various RF-ID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860 to 960 MHz, and 2.45 GHz. When a Wi-Fi chip or a Bluetooth chip is used, various connection information such as an SSID and a session key may be transmitted and received first, a communication connection may then be established using that information, and various information may subsequently be transmitted and received. The wireless communication chip refers to a chip that performs communication according to various communication standards such as IEEE, ZigBee, 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), LTE (Long Term Evolution), and the like.

In particular, the communication unit 230 can receive a user command from an external device. Then, the communication unit 230 can transmit the intruder detection information to the external device.

The storage unit 240 stores various modules for driving the unmanned security robot 100. For example, the storage unit 240 may store software including a base module, a sensing module, a communication module, a presentation module, a web browser module, and a service module. Here, the base module processes signals transmitted from each piece of hardware included in the unmanned security robot 100 and transfers the processed signals to an upper-layer module. The sensing module collects information from various sensors and analyzes and manages the collected information, and may include a face recognition module, a voice recognition module, a motion recognition module, and an NFC recognition module. The presentation module constructs the display screen, and may include a multimedia module for reproducing and outputting multimedia contents and a UI rendering module for performing UI and graphics processing. The communication module performs communication with the outside. The web browser module accesses a web server by performing web browsing. The service module includes various applications for providing various services. In particular, the storage unit 240 may store a preset pattern along which the unmanned security robot 100 moves within the surrounding environment.

As described above, the storage unit 240 may include various program modules, but it goes without saying that the various program modules may be partially omitted, modified or added depending on the type and characteristics of the unmanned security robot 100.

In an embodiment of the present invention, the term storage unit 240 may include the ROM 282 and the RAM 281 in the control unit 280, or a memory card (not shown; e.g., a micro SD card or a memory stick) mounted on the unmanned security robot 100.

The sensing unit 250 senses the surrounding environment of the unmanned security robot 100. In particular, the sensing unit 250 may include various sensors and cameras, such as a gyro sensor capable of sensing movement of the unmanned security robot 100, an acceleration sensor, an ultrasonic sensor, a pressure sensor, a noise sensor, and a thermal sensor. The sensing unit 250 may sense information about the surrounding environment for generating a map of that environment through the various sensors it includes. The sensing unit 250 may also sense an intruder using a camera, a thermal sensor, or the like.

The input unit 260 receives a user command for controlling the unmanned security robot 100. In particular, the input unit 260 may receive a user command for displaying intruder detection information, and may receive a user command for specifying a destination. The input unit 260 may include various input devices such as a touch input unit, a button, a voice input unit, a motion input unit, a keyboard, and a mouse to receive a user command.

The driving unit 270 is driven according to a driving control signal of the control unit 280 to move the unmanned security robot 100 in a predetermined direction. In particular, the driving unit 270 may be driven according to a driving control signal of the control unit 280 so that the unmanned security robot 100 moves within the surrounding environment according to a preset pattern or a user's destination setting.

The control unit 280 controls the overall operation of the unmanned security robot 100 using various programs stored in the storage unit 240.

As shown in FIG. 2, the control unit 280 includes a RAM 281, a ROM 282, a graphics processing unit 283, a main CPU 284, first to n-th interfaces 285-1 to 285-n, and a bus 286. The RAM 281, the ROM 282, the graphics processing unit 283, the main CPU 284, the first to n-th interfaces 285-1 to 285-n, and the like can be connected to one another via the bus 286.

The ROM 282 stores a command set for booting the system and the like. When a turn-on command is input and power is supplied, the main CPU 284 copies the O/S stored in the storage unit 240 to the RAM 281 according to the command stored in the ROM 282, and executes the O/S to boot the system. When booting is completed, the main CPU 284 copies the various application programs stored in the storage unit 240 to the RAM 281 and executes the application programs copied to the RAM 281 to perform various operations.

The graphics processing unit 283 generates a screen including various objects such as a point, an icon, an image, and text using an operation unit (not shown) and a rendering unit (not shown). The operation unit calculates attribute values, such as coordinate values, shape, size, and color, with which each object is to be displayed according to the layout of the screen, using the control command received from the input unit. The rendering unit generates screens of various layouts including the objects based on the attribute values calculated by the operation unit. The screen generated by the rendering unit is displayed in the display area of the display unit 210.

The main CPU 284 accesses the storage unit 240 and performs booting using the O/S stored in the storage unit 240. The main CPU 284 also performs various operations using the various programs, contents, and data stored in the storage unit 240.

The first through n interfaces 285-1 through 285-n are connected to the various components described above. One of the interfaces may be a network interface connected to an external device via a network.

In particular, the control unit 280 generates map information on the surrounding environment using the sensor included in the sensing unit 250, and controls the driving unit 270 to move within the surrounding environment based on the generated map information. The control unit 280 can detect the intruder using the sensing unit 250.

First, the control unit 280 generates map information about the surrounding environment using the sensor included in the sensing unit 250. Specifically, when point data for the peripheral area is acquired through the sensing unit 250 at a preset period, the control unit 280 may store the acquired point data in the storage unit 240. In some cases, the control unit 280 may store the point data acquired at the previous position together with the newly acquired point data in the storage unit 240. The control unit 280 can calculate plane feature components based on the data stored in the storage unit 240, or derive the parameters necessary for matching. The control unit 280 may then extract plane features using the data stored in the storage unit 240 and the derived parameters. Here, the point data may be data representing three-dimensional (x, y, z) coordinates as a point cloud of the surrounding environment acquired through laser distance measurement, an infrared camera, or the like.
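One common way to extract a plane feature from such point data is a RANSAC-style fit, sketched below; the patent does not fix the extraction method, and the iteration count and inlier threshold here are assumed values.

```python
import numpy as np

def fit_plane_ransac(points, iters=200, threshold=0.02, rng=None):
    """Extract the dominant plane from an Nx3 point cloud by RANSAC:
    repeatedly fit a plane to 3 random points and keep the candidate
    with the most inliers. Returns (unit normal n, offset d, inlier
    mask) for the plane n . p + d = 0."""
    rng = rng or np.random.default_rng(0)
    best = (None, None, np.zeros(len(points), dtype=bool))
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                      # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n.dot(sample[0])
        inliers = np.abs(points @ n + d) < threshold
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best

# Example: a noisy floor plane (z ~ 0) plus scattered clutter points.
rng = np.random.default_rng(1)
floor = np.column_stack([rng.uniform(0, 5, 500), rng.uniform(0, 5, 500),
                         rng.normal(0, 0.005, 500)])
clutter = rng.uniform(0, 5, (50, 3))
n, d, mask = fit_plane_ransac(np.vstack([floor, clutter]))
print(np.round(n, 2), mask.sum())            # normal ~ [0, 0, +/-1], ~500 inliers
```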

In addition, the control unit 280 may generate a map optimized for the surrounding environment by data association between the data acquired as the unmanned security robot 100 moves and the planar features. In particular, according to one embodiment, when the unmanned security robot 100 moves through the same area again, the control unit 280 can optimize the map information about the surrounding environment by associating the newly acquired data with the existing map information.
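A minimal sketch of what such data association might look like for plane features, assuming planes are represented as (unit normal, offset) pairs as in the previous sketch; the matching thresholds are illustrative assumptions.

```python
import numpy as np

def associate_planes(map_planes, new_planes, ang_tol=0.97, d_tol=0.05):
    """Merge newly extracted planes into the existing map: a new plane
    matches a stored one when their unit normals are nearly parallel
    (dot product above ang_tol) and their offsets are close. Matches
    are averaged; unmatched planes are appended as new map features."""
    merged = list(map_planes)
    for n_new, d_new in new_planes:
        for i, (n_old, d_old) in enumerate(merged):
            if np.dot(n_new, n_old) > ang_tol and abs(d_new - d_old) < d_tol:
                n_avg = n_old + n_new
                merged[i] = (n_avg / np.linalg.norm(n_avg), (d_old + d_new) / 2)
                break
        else:
            merged.append((n_new, d_new))   # unseen plane: extend the map
    return merged
```

Re-observing the same area thus refines existing plane estimates instead of duplicating them, which is the optimization effect described above.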

The control unit 280 can sense the surroundings using a laser distance measuring sensor, an infrared camera, or the like, and determine the current position based on the obtained map information of the surrounding environment.
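The position determination could, for instance, be a coarse grid search that scores candidate poses by how well the current scan aligns with the occupancy grid; this is an assumed approach, since the patent only states that the current position is determined from the map information.

```python
import numpy as np

def localize(grid, scan_points, candidates):
    """Score candidate (x, y, theta) poses by how many scan points,
    transformed into map coordinates, land on occupied cells of the
    occupancy grid, and return the best-scoring pose."""
    best_pose, best_score = None, -1
    for x, y, th in candidates:
        c, s = np.cos(th), np.sin(th)
        # rotate then translate the (N, 2) scan into the map frame
        pts = scan_points @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
        cells = np.floor(pts).astype(int)
        inside = ((cells[:, 0] >= 0) & (cells[:, 0] < grid.shape[0]) &
                  (cells[:, 1] >= 0) & (cells[:, 1] < grid.shape[1]))
        score = grid[cells[inside, 0], cells[inside, 1]].sum()
        if score > best_score:
            best_pose, best_score = (x, y, th), score
    return best_pose
```

A finer search (or a proper scan matcher) would refine the pose further; the sketch only shows the scoring idea.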

For example, as shown in FIG. 3A, when the unmanned security robot 100 is located in the surrounding environment 300, the control unit 280 may sense the surrounding environment 300 using the sensors included in the sensing unit 250 and generate map information about it. The control unit 280 can then determine the current position of the unmanned security robot 100 based on the surroundings detected using the laser distance measuring sensor and the infrared camera, and the acquired map information.

The control unit 280 may control the driving unit 270 to move the unmanned security robot 100 in the surrounding environment 300 according to a predetermined pattern. When a user command for designating a destination is received through the communication unit 230, the controller 280 may control the driver 270 to move the unmanned security robot 100 to a destination.

As shown in FIG. 3B, when a user command for designating a destination is received through the communication unit 230 while the unmanned security robot 100 moves within the surrounding environment 300, the control unit 280 can control the driving unit 270 to move the unmanned security robot 100 to the destination 310. However, the present invention is not limited to this; a user command for designating a destination may also be input directly through the input unit 260 of the unmanned security robot 100.

As shown in FIG. 3C, when a user command for designating a destination is received through the communication unit 230 while the unmanned security robot 100 moves within the surrounding environment 300, the control unit 280 determines the shortest path 320 based on the coordinates corresponding to the current position of the unmanned security robot 100 and the coordinates corresponding to the position of the destination, and can control the driving unit 270 so that the unmanned security robot 100 moves along the determined path. However, the present invention is not limited to this; a user command for designating a destination may also be input directly through the input unit 260 of the unmanned security robot 100.

The control unit 280 may detect an intruder through the sensing unit 250 while moving within the surrounding environment. Specifically, the control unit 280 can transmit a real-time image acquired through the camera to the external device while moving within the surrounding environment. When an intruder detection notification is received from the external device through the communication unit 230, the control unit 280 may store intruder detection information, including the intruder detection time and the intruder detection position, in the storage unit 240.

For example, as shown in FIG. 4A, the real-time image acquired through the camera while the unmanned security robot 100 moves within the surrounding environment 400 can be transmitted to the external device. When a notification that the intruder 410 has been detected is received from the external device through the communication unit 230, the control unit 280 may store intruder detection information, including the detection time and the detection location of the intruder 410, in the storage unit 240. The control unit 280 may also control the communication unit 230 to transmit the intruder detection information indicating that an intruder has been detected to at least one external apparatus.
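A sketch of how the intruder detection information (detection time and location) might be recorded and pushed to external devices; the REST endpoint, the `requests` transport, the log file name, and the field names are assumptions, since the patent leaves the transport unspecified.

```python
import json
import time
from dataclasses import dataclass, asdict

import requests  # assumed transport; the patent does not fix a protocol

@dataclass
class IntruderDetection:
    detected_at: float        # Unix timestamp of the intruder recognition time
    location: tuple           # (x, y) position on the generated map

def report_intrusion(detection, endpoints):
    """Store the detection record locally (the storage unit's role) and
    push it to every registered external device (the communication
    unit's role); the endpoint URLs are hypothetical."""
    record = asdict(detection)
    with open("detections.log", "a") as log:          # local storage
        log.write(json.dumps(record) + "\n")
    for url in endpoints:                             # notify each device
        requests.post(url, json=record, timeout=5)

report_intrusion(IntruderDetection(time.time(), (3.2, 1.7)),
                 ["http://192.168.0.10/alerts"])      # hypothetical address
```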

As shown in FIG. 4B, when a user command for confirming the intruder detection information is received, the control unit 280 may control the display unit 210 to display an intruder detection notification 420. The intruder detection notification 420 may include a detection message such as "An intruder was detected" 430, detection time information such as "Detection time: 00:00" 440, and detection location information such as "Detection location: point A" 450. On the other hand, even when there is no user command for confirming the intruder detection information, the control unit 280 may control the display unit 210 to display the intruder detection notification 420 at a preset time, at a predetermined period, or when an intruder detection event occurs. For example, the control unit 280 may control the display unit 210 to display the intruder detection notification 420 at 9 a.m. every day.

Hereinafter, a security method of the unmanned security robot 100 according to an embodiment of the present disclosure will be described with reference to FIG.

First, the unmanned security robot 100 generates map information about the surrounding environment (S510). Specifically, the unmanned security robot 100 can generate map information about the surrounding environment using a plurality of sensors. The unmanned security robot 100 may acquire at least one of an RGB image, a depth image, and an infrared image using the plurality of sensors, and generate map information based on the acquired image. In addition, the unmanned security robot 100 can determine its current position based on the generated map information.

Then, the unmanned security robot 100 moves within the surrounding environment based on the generated map information (S520). Specifically, when a user command for designating a destination is received, the unmanned security robot 100 can move to the destination corresponding to the received user command. Meanwhile, although it has been described that the user command for designating the destination is received through the communication unit, the user command may instead be input directly through the input unit of the unmanned security robot 100. In particular, the unmanned security robot 100 can determine the shortest path between its current position and the destination based on the coordinates corresponding to the current position and the coordinates corresponding to the destination, and can move along that path to the destination corresponding to the received user command.

Then, the unmanned security robot 100 can detect an intruder (S530).
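Putting steps S510 to S530 together, a minimal control loop might look like the following; every `robot.*` method is a hypothetical stand-in for the units described above, and `astar` reuses the path-planning sketch given earlier.

```python
def security_loop(robot):
    """Minimal sketch of the FIG. 5 flow: build the map once (S510),
    then keep moving (S520) and watching for intruders (S530)."""
    grid = robot.build_map()                          # S510: map the surroundings
    while True:
        goal = robot.pending_destination() or robot.next_patrol_point()
        path = astar(grid, robot.current_cell(), goal)    # S520: plan and move
        for cell in path or []:
            robot.drive_to(cell)
            if robot.intruder_detected():                 # S530: detect
                robot.report_intrusion()
                break
```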

According to various embodiments of the present disclosure as described above, the user can be provided with the unmanned security robot that increases the recognition rate of the intruder and minimizes the blind spot.

Meanwhile, the above-described method can be written as a program executable by a computer and implemented on a general-purpose digital computer that runs the program using a computer-readable recording medium. In addition, the data structures used in the above-described method can be recorded on a computer-readable recording medium through various means. The computer-readable recording medium includes storage media such as magnetic storage media (e.g., ROM, floppy disks, hard disks, and the like) and optical reading media (e.g., CD-ROMs, DVDs, and the like).

It will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered from an illustrative rather than a restrictive point of view. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

100: Unmanned security robot
110, 250: Sensing unit
120, 270: Driving unit
130, 280: Control unit

Claims (6)

An unmanned security robot comprising:
a sensing unit including a sensor and configured to detect an intruder;
a driving unit configured to move the unmanned security robot; and
a control unit configured to generate map information about a surrounding environment using the sensor and to control the driving unit to move within the surrounding environment based on the generated map information.
The unmanned security robot according to claim 1, further comprising a communication unit,
wherein the control unit controls the driving unit to move to a destination corresponding to a received user command when a user command for designating the destination is received through the communication unit.
The unmanned security robot according to claim 2,
wherein the control unit determines the shortest path between the current position of the unmanned security robot and the destination based on the coordinates corresponding to the current position and the coordinates corresponding to the destination, and controls the driving unit to move to the destination along the shortest path.
The unmanned security robot according to claim 1,
wherein the control unit controls the sensing unit to acquire at least one of an RGB image, a depth image, and an infrared image using the sensor, and generates the map information based on the acquired image.
The unmanned security robot according to claim 1,
wherein the control unit determines the current position of the unmanned security robot based on the map information.
The unmanned security robot according to claim 1, further comprising a communication unit,
wherein the control unit controls the communication unit to transmit intruder detection information including at least one of an intruder recognition time and an intruder recognition position to an external device when the intruder is detected.


KR1020160012184A 2016-02-01 2016-02-01 Unmanned security robot KR20170091321A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160012184A KR20170091321A (en) 2016-02-01 2016-02-01 Unmanned security robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160012184A KR20170091321A (en) 2016-02-01 2016-02-01 Unmanned security robot

Publications (1)

Publication Number Publication Date
KR20170091321A true KR20170091321A (en) 2017-08-09

Family

ID=59652743

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160012184A KR20170091321A (en) 2016-02-01 2016-02-01 Unmanned security robot

Country Status (1)

Country Link
KR (1) KR20170091321A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210013898A (en) 2019-07-29 2021-02-08 (주) 퓨처로봇 Multi-functional robot with time-based control and method of controlling the same



Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application