KR20130085653A - Method for providing route guide using augmented reality and mobile terminal using this method - Google Patents

Method for providing route guide using augmented reality and mobile terminal using this method

Info

Publication number
KR20130085653A
Authority
KR
South Korea
Prior art keywords
information
emergency escape
unit
terminal
mobile device
Prior art date
Application number
KR1020120006572A
Other languages
Korean (ko)
Inventor
김연구
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020120006572A priority Critical patent/KR20130085653A/en
Publication of KR20130085653A publication Critical patent/KR20130085653A/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3407 - Route searching; Route guidance specially adapted for specific applications
    • G01C21/3438 - Rendez-vous, i.e. searching a destination where several users can meet, and the routes to this destination for these users; Ride sharing, i.e. searching a route such that at least two users can share a vehicle for at least part of the route
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3602 - Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • G01C21/3605 - Destination input or retrieval
    • G01C21/362 - Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 - Services making use of location information
    • H04W4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H04W4/024 - Guidance services

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Telephone Function (AREA)

Abstract

PURPOSE: A route guidance method using augmented reality and a mobile device using the same are provided to effectively evacuate a user during an emergency such as a natural disaster. CONSTITUTION: A wireless communication unit (110) identifies the location of the mobile device and obtains the information necessary to escape from an emergency. A control unit (180) launches an emergency evacuation application and generates AR (Augmented Reality) content including the information necessary to escape from an emergency at the mobile device's current location. A display unit (150) displays the AR content in a camera view mode. [Reference numerals] (110) Wireless communication unit; (111) Broadcast reception unit; (112) Telecommunication module; (113) Wireless internet module; (114) Short-distance communication module; (115) Location information module; (120) A/V input unit; (121) Camera; (122) Mic; (130) User input unit; (140) Sensing unit; (150) Output unit; (151) Display unit; (152) Sound output unit; (153) Alarm unit; (154) Haptic module; (160) Memory; (170) Interface unit; (180) Control unit; (181) Multimedia module; (190) Power supply

Description

Method for providing route guide using augmented reality and mobile terminal using this method

The present invention relates to a method of providing route guidance using augmented reality and to a mobile terminal using the method.

Terminals may be divided into mobile/portable terminals and stationary terminals according to their mobility. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

Such a terminal has various functions and may take the form of a multimedia device with multiple capabilities, such as capturing still images and video, playing music or video files, gaming, and receiving broadcasts.

In order to support and enhance the functionality of such terminals, improvements to the structural and/or software parts of the terminal may be considered.

In general, an existing terminal capable of route guidance guides the user to a requested destination based on terrain and objects modeled from previously stored geographic information. Because such guidance provides only previously acquired content, changes to streets and buildings may not be conveyed to the user.

In addition, when a natural disaster or other emergency occurs, the user must set the route manually. Furthermore, in situations where the information and route for the user's current location cannot be determined, no route guidance service can be provided, so it is difficult for the user to receive guidance toward an emergency escape route or a destination.

An object of the present invention is to provide a method, and a terminal using the same, that can effectively provide route guidance for emergency escape to a user at the time of an emergency such as a natural disaster or accident.

Another object of the present invention is to provide a method, and a terminal using the same, by which the user can effectively identify his or her location and state using augmented reality information and easily receive route guidance for emergency escape.

To achieve the above objects, the present invention provides a method comprising the steps of executing an emergency escape application, identifying the region where the terminal is located, determining whether emergency escape route information exists for the identified region, generating augmented reality (AR) content including the emergency escape route information, and displaying the generated augmented reality content in a camera view mode.

In another aspect, the present invention provides a mobile terminal comprising a wireless communication unit that obtains information on the area in which the terminal is located and emergency escape route information for that area, a control unit that executes the emergency escape application and generates augmented reality content including the emergency escape route information corresponding to the location to which the terminal moves, and an output unit that displays the generated augmented reality content in a camera view mode.

The route guidance method and the mobile terminal using the same according to an embodiment of the present invention allow the user to grasp his or her state and location even in an emergency situation and to easily receive movement route information through augmented reality technology.

FIG. 1 is a block diagram illustrating a mobile terminal according to an exemplary embodiment of the present invention.
FIG. 2 is a flowchart illustrating an operation of providing route guidance in a mobile terminal according to an embodiment of the present invention.
FIG. 3 is an exemplary view of a screen displayed in a camera view mode when providing route guidance according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an operation of providing route guidance in a mobile terminal according to another exemplary embodiment of the present invention.
FIG. 5 is an exemplary view of a screen displayed in a camera view mode when providing route guidance according to another embodiment of the present invention.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of preparing the specification and do not themselves carry distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a navigation device, and the like. However, those skilled in the art will understand that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to a mobile terminal.

Next, a structure of a mobile terminal according to an embodiment of the present invention will be described with reference to FIG. 1.

FIG. 1 is a block diagram illustrating a mobile terminal according to an exemplary embodiment of the present invention.

The mobile terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, and a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules for enabling wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast related information, or a server that receives previously generated broadcast signals and/or broadcast related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal. In particular, the broadcast channel according to an embodiment of the present invention may carry a broadcast signal for disaster broadcasting from the broadcast management server. Therefore, the mobile terminal 100 may receive broadcast information on a disaster situation or an emergency situation.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may also be configured to suit other broadcasting systems in addition to the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages. In particular, the mobile communication module 112 according to an embodiment of the present invention may receive an information message about a disaster situation, an accident, or the like from the base station, an external terminal, or a server, or transmit such a message to a preset destination.

The wireless Internet module 113 is a module for wireless Internet access and may be built in or externally attached to the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like. In particular, since Wi-Fi carries identifier information, various information available in the area where the terminal is located can be received through the identifier of a Wi-Fi-capable transceiver. According to an embodiment of the present invention, emergency escape route information for the area where the terminal is located may be obtained using a wireless Internet module capable of Wi-Fi communication.
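As a rough illustration of this idea (not part of the patent), the Kotlin sketch below assumes a hypothetical lookup service keyed by the identifier of a nearby Wi-Fi transceiver, e.g. its BSSID; the `EscapeRouteService` interface and the data classes are invented for illustration only.

```kotlin
// Hypothetical sketch (not from the patent): resolving area information and
// escape-route data from the identifier of a nearby Wi-Fi transceiver.
// The data classes and the lookup service are assumptions for illustration.
data class AreaInfo(val name: String, val latitude: Double, val longitude: Double)

data class EscapeRoute(
    val shelterName: String,
    val distanceMeters: Int,
    val waypoints: List<Pair<Double, Double>>   // (latitude, longitude) pairs
)

interface EscapeRouteService {
    // Assumed remote lookup keyed by a Wi-Fi access point's unique identifier (e.g., BSSID).
    fun lookupArea(apIdentifier: String): AreaInfo?
    fun lookupEscapeRoutes(apIdentifier: String): List<EscapeRoute>
}

fun fetchEscapeInfoViaWifi(
    apIdentifier: String,
    service: EscapeRouteService
): Pair<AreaInfo, List<EscapeRoute>>? {
    val area = service.lookupArea(apIdentifier) ?: return null  // area unknown to the service
    val routes = service.lookupEscapeRoutes(apIdentifier)       // may be empty if none registered
    return area to routes
}
```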

The short range communication module 114 refers to a module for short range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for obtaining the position of the mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module.

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video call mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, and the orientation of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, it may sense whether the slide phone is opened or closed. In addition, whether power is supplied by the power supply unit 190 and whether the interface unit 170 is coupled to an external device may be sensed. Meanwhile, the sensing unit 140 may include a proximity sensor 141.

The output unit 150 is for generating output related to the visual, auditory or tactile sense and includes a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154 .

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

In the camera view mode, the display unit 151 may display video and images that include route information and emergency escape information for the area where the terminal is located.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another, or may be disposed integrally with one another, and may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a touch sensor) form a mutual layer structure (hereinafter referred to as a touch screen), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 is touched. In the case of the display unit 151 including the touch sensor, path information may be displayed in various directions and screens according to an input signal requested by a user. In addition, various modes such as emergency message transmission and call mode switching may be performed according to a user input.

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal or in the vicinity of the touch screen, which is surrounded by the touch screen. The proximity sensor 141 refers to a sensor that detects the presence of an object approaching a predetermined detection surface or an object existing in the vicinity of the detection surface without mechanical contact using an electromagnetic force or an infrared ray. The proximity sensor 141 has a longer life than the contact type sensor and its utilization is also high.

Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of the pointer from the change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position at which a proximity touch is made on the touch screen means the position at which the pointer corresponds vertically to the touch screen during the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 also outputs sound signals related to functions (e.g., call signal reception sound, message reception sound, etc.) performed in the mobile terminal 100. [ The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than the video signal or the audio signal, for example, vibration. The video signal or the audio signal may be output through the display unit 151 or the audio output module 152 so that they may be classified as a part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as the effect of a pin array moving vertically with respect to the contacted skin surface, a jetting or suction force of air through an injection or inlet port, brushing against the skin surface, contact of an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscular sensation of a finger or arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with a web storage that performs the storage function of the memory 160 on the Internet.

The memory 160 may store route information, status information, and the like corresponding to the area information on where the terminal is located, obtained from the location information module 115 of the wireless communication unit 110. In addition, the memory 160 may store state information and location information of the terminal obtained using various wireless communication modules such as the mobile communication module 112 and the wireless Internet module 113, as well as route information for emergency escape.

The interface unit 170 serves as a path for communication with all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, supplies power to each component in the mobile terminal 100, or transmits internal data to the external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the right to use the mobile terminal 100, and includes a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal; for example, it performs the control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The controller 180 may generate AR content by combining the information on the region where the terminal 100 is located, acquired through the wireless communication unit 110, with the emergency escape route information. The generated augmented reality content may include information on the region or place where the terminal is located and route information for emergency escape when a disaster occurs in that region or place.
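As a rough illustration of this combination (not part of the patent), the Kotlin sketch below merges assumed region-information and escape-route types into a single AR content object and, under the assumption that routes carry coordinate waypoints, picks the route closest to the current position; the `ArContent` type, its field names, and the naive planar distance are all invented for illustration and reuse the `AreaInfo`/`EscapeRoute` types from the earlier sketch.

```kotlin
import kotlin.math.hypot

// Sketch only: combine region information with escape-route information into one
// AR content object and pick the route whose first waypoint is closest to the
// current position. ArContent and its fields are assumptions; reuses the
// AreaInfo/EscapeRoute types from the earlier sketch.
data class ArContent(
    val area: AreaInfo,                 // where the terminal is located
    val nearestRoute: EscapeRoute?,     // escape route chosen for display
    val otherRoutes: List<EscapeRoute>, // remaining candidate routes
    val disasterText: String?           // e.g., a received disaster notification message
)

fun buildArContent(
    area: AreaInfo,
    routes: List<EscapeRoute>,
    currentLat: Double,
    currentLon: Double,
    disasterText: String?
): ArContent {
    // Naive planar distance to each route's first waypoint; a real implementation
    // would use geodesic distance and the full route geometry.
    val sorted = routes.sortedBy { route ->
        route.waypoints.firstOrNull()
            ?.let { (lat, lon) -> hypot(lat - currentLat, lon - currentLon) }
            ?: Double.MAX_VALUE
    }
    return ArContent(area, sorted.firstOrNull(), sorted.drop(1), disasterText)
}
```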

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units designed to perform the functions described herein. In some cases, such embodiments may be implemented by the controller 180 itself.

In a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2 is a flowchart illustrating an operation of providing route guidance in a mobile terminal according to an exemplary embodiment of the present invention.

Referring to FIGS. 2 and 3, when a message is received (step 202), the wireless communication unit 110 of the mobile terminal 100 parses the sender and the content of the received message (step 204).

If the controller 180 of the mobile terminal 100 determines, by parsing the sender or the content of the received message, that the received message is a disaster notification message indicating a disaster situation, it executes the emergency escape application, which provides emergency escape route information according to the disaster occurrence (step 206).
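One plausible way to realize steps 202 through 206 is sketched below in Kotlin (not taken from the patent): the message shape, the emergency sender prefixes, the disaster keywords, and the `launchEmergencyEscapeApp` callback are all assumptions made for illustration.

```kotlin
// Sketch of steps 202-206 (assumptions only): parse the sender and content of a
// received message and, if it looks like a disaster notification, launch the
// emergency escape application.
data class SmsMessage(val sender: String, val body: String)   // assumed message shape

// Assumed heuristics; the patent does not specify sender prefixes or keywords.
private val emergencySenders = listOf("911", "112", "Emergency Alert")
private val disasterKeywords = listOf("earthquake", "tsunami", "flood", "fire", "evacuate")

fun isDisasterMessage(msg: SmsMessage): Boolean =
    emergencySenders.any { msg.sender.startsWith(it) } ||
        disasterKeywords.any { msg.body.contains(it, ignoreCase = true) }

fun onMessageReceived(msg: SmsMessage, launchEmergencyEscapeApp: () -> Unit) {
    // Steps 202-204: the message has been received and parsed into sender/body.
    if (isDisasterMessage(msg)) {
        launchEmergencyEscapeApp()   // step 206: run the emergency escape application
    }
}
```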

When the emergency escape application is executed, the location information module 115 of the wireless communication unit 110 is used to check the area where the mobile terminal 100 is located.

The controller 180 determines whether emergency escape route information for the region or place where the mobile terminal 100 is located, obtained using the location information module 115, is stored in the memory 160 (step 210).

If the memory 160 holds no emergency escape route information for the area or place where the terminal is located, the controller 180 can download the emergency escape route information for the current location through the wireless communication unit 110 (step 212).

On the other hand, when the emergency escape route information for the region or place where the terminal is located is present in the memory 160, the controller 180 loads it from the memory 160 (step 214). The loaded emergency escape route information may be combined with the information on the region where the terminal is located to generate AR content.
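Steps 210 through 214 amount to a cache-or-download decision; the Kotlin sketch below expresses that decision with an assumed cache interface and an assumed downloader (reusing the `EscapeRoute` type from the earlier sketch), neither of which is specified by the patent.

```kotlin
// Sketch of steps 210-214 (assumptions only): prefer escape-route information
// already stored in memory and fall back to downloading it. Reuses the
// EscapeRoute type from the earlier sketch; the cache and downloader
// interfaces are invented for illustration.
interface EscapeRouteCache {
    fun load(areaName: String): List<EscapeRoute>?     // null if nothing stored for the area
    fun save(areaName: String, routes: List<EscapeRoute>)
}

interface EscapeRouteDownloader {
    fun download(areaName: String): List<EscapeRoute>  // e.g., via the wireless communication unit
}

fun obtainEscapeRoutes(
    areaName: String,
    cache: EscapeRouteCache,
    downloader: EscapeRouteDownloader
): List<EscapeRoute> {
    cache.load(areaName)?.let { return it }            // step 214: load from memory
    val downloaded = downloader.download(areaName)     // step 212: download for the current location
    cache.save(areaName, downloaded)
    return downloaded
}
```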

The controller 180 executes a camera view mode for displaying the generated AR content (step 215). The display unit 151 may display the AR content in the camera view mode (step 218).

The AR content provided in the camera view mode in the display unit 151 may appear as shown in the screen example of FIG. 3.

The AR content 300 including the emergency escape route information displayed in the camera view mode may include information 310 and an image 320 concerning the region or place where the terminal is located, acquired through the wireless communication unit 110. In addition, the image may display a shelter 321 close to the current location or the distance 322 along a movable route. The AR content may also include various external environment information 330, such as climate information for the region where the terminal, that is, the user, is located. The AR content display screen 300 may display, as text 340, a disaster text message received through the wireless communication unit 110 or continuously received situation information. In addition, according to a user input, various menus 350 may be displayed for receiving an input signal requesting a switch from the emergency escape route information providing mode to another mode or an exit from the mode.
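Grouping the on-screen elements of FIG. 3 into one view model might look like the Kotlin sketch below; the flat structure and field names are assumptions for illustration (reusing the `ArContent` type from the earlier sketch), and the reference numerals appear only in comments.

```kotlin
// Sketch of the overlay items enumerated for FIG. 3; the grouping into one flat
// view model and the field names are assumptions. Reuses the ArContent type
// from the earlier sketch; reference numerals appear only in comments.
data class EscapeOverlay(
    val placeInfo: String,          // 310: information on the region or place
    val shelterLabel: String,       // 321: nearby shelter marked on the image
    val distanceMeters: Int,        // 322: distance along a movable route
    val environmentInfo: String,    // 330: external environment info, e.g. climate
    val tickerText: String,         // 340: received disaster text / situation updates
    val menuActions: List<String>   // 350: e.g. "Switch mode", "Exit"
)

fun overlayFrom(content: ArContent, climate: String): EscapeOverlay =
    EscapeOverlay(
        placeInfo = content.area.name,
        shelterLabel = content.nearestRoute?.shelterName ?: "No shelter data",
        distanceMeters = content.nearestRoute?.distanceMeters ?: 0,
        environmentInfo = climate,
        tickerText = content.disasterText ?: "",
        menuActions = listOf("Switch mode", "Exit")
    )
```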

FIG. 4 is a flowchart illustrating an operation of providing route guidance in a mobile terminal according to another exemplary embodiment of the present invention. FIG. 5 is an exemplary view of a screen displayed in a camera view mode when providing route guidance according to another exemplary embodiment.

In another embodiment of the present invention, a method of providing an emergency escape route in a situation in which information on the area where the terminal is located cannot be obtained will be described with reference to FIGS. 4 and 5.

Referring to FIGS. 4 and 5, when a message is received (step 402), the wireless communication unit 110 of the mobile terminal 100 parses the sender and the content of the received message (step 404).

If the controller 180 of the mobile terminal 100 determines, by parsing the sender or contents of the received message, that the received message is a disaster notification message informing of a disaster situation, it executes the emergency escape application (step 406).

When the emergency escape application is executed, the location information module 115 of the wireless communication unit 110 is used to check the area where the mobile terminal 100 is located. This embodiment is described using an example in which the regional information on where the mobile terminal 100 is located cannot be checked.

Therefore, if the regional information cannot be checked, the controller 180 determines whether emergency escape route information corresponding to a location designated by the user is stored in the memory 160 (step 410).

If the emergency escape route information for the location requested by the user does not exist in the memory 160, the controller 180 determines whether Wi-Fi communication is possible. If communication is impossible, the controller 180 switches to an SOS mode for signaling the emergency to the outside, and the mobile terminal enters a power-saving mode that reduces the paging search time (step 414), so that the mobile terminal remains available to the user for as long as possible in the emergency.

On the other hand, when Wi-Fi communication is possible, the controller 180 may communicate with a nearby Wi-Fi transceiver identified by its Wi-Fi identifier to download information on the current location or emergency escape route information corresponding to the current location. The emergency escape route information may be information provided by the place or venue that includes the Wi-Fi-capable transceiver. In addition to Wi-Fi communication, various other wireless communication modules capable of receiving emergency escape route information may be used.
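The fallback flow of FIG. 4 can be summarized as the Kotlin sketch below (an assumption-laden illustration, not the patent's implementation): it reuses the `EscapeRoute` and `EscapeRouteCache` types sketched earlier and invents a `WifiLink` interface, an `Outcome` type, and the mode-switching callbacks.

```kotlin
// Sketch of the fallback flow of FIG. 4 (assumptions only): no area information is
// available, so try cached routes for a user-designated location, then a nearby
// Wi-Fi transceiver, and otherwise switch to SOS and power-saving modes.
// Reuses EscapeRoute/EscapeRouteCache from the earlier sketches; WifiLink,
// Outcome, and the mode-switching callbacks are invented for illustration.
sealed class Outcome {
    data class RoutesReady(val routes: List<EscapeRoute>) : Outcome()
    object SosAndPowerSaving : Outcome()   // step 414: signal for help, conserve battery
}

interface WifiLink {
    fun isAvailable(): Boolean
    fun nearbyIdentifier(): String?                        // e.g., identifier of a reachable transceiver
    fun downloadRoutes(identifier: String): List<EscapeRoute>
}

fun resolveWithoutAreaInfo(
    userDesignatedArea: String?,
    cache: EscapeRouteCache,
    wifi: WifiLink,
    enterSosMode: () -> Unit,
    enterPowerSavingMode: () -> Unit
): Outcome {
    // Step 410: use routes stored for a user-designated location, if any.
    userDesignatedArea?.let { cache.load(it) }?.let { return Outcome.RoutesReady(it) }

    // No stored data: try to download via a nearby Wi-Fi transceiver.
    if (wifi.isAvailable()) {
        wifi.nearbyIdentifier()?.let { id ->
            return Outcome.RoutesReady(wifi.downloadRoutes(id))
        }
    }

    // Step 414: communication impossible, switch to SOS mode and power saving.
    enterSosMode()
    enterPowerSavingMode()
    return Outcome.SosAndPowerSaving
}
```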

The controller 180 can generate AR content by using the obtained emergency escape route information. The controller 180 can execute the camera view mode (step 418) and display the generated AR content (step 420).

FIG. 5 is an exemplary diagram of a case in which the emergency escape route information obtained through Wi-Fi communication is generated as AR content and provided in the camera view mode.

In the AR content 500 including the emergency escape route information displayed in the camera view mode, information 510, the current location 511, an arrow 512, and the location of a shelter 513 may be displayed. In addition, the emergency escape route information may be provided as text 520 when the AR content is displayed. Also, according to a user input, various menus 530 may be displayed to receive a signal requesting a switch from the emergency escape route information providing mode to another mode or an exit from the mode.

In the embodiments of the present invention, the emergency escape application is executed, for example, in response to a received disaster text message; however, the emergency escape application may also be executed at the user's request. In addition, rather than only in a disaster situation, escape route information for an emergency occurring along the movement route may be provided based on the current location and destination information at the user's request.

According to an embodiment of the present invention, the above-described method can be implemented as processor-readable code on a medium on which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementation in the form of a carrier wave (e.g., transmission over the Internet).

The configurations and methods of the embodiments described above are not limited in their application to the mobile terminal described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Claims (9)

A method of providing route guidance in a mobile terminal, the method comprising:
executing an emergency escape application;
identifying an area in which the terminal is located;
determining whether emergency escape route information exists for the identified area;
generating augmented reality (AR) content including the emergency escape route information; and
displaying the generated augmented reality content in a camera view mode.
The method of claim 1,
wherein the executing of the emergency escape application comprises
receiving a message from the outside and parsing the sender or the content of the received message to automatically execute the application if the message is a disaster message.
The method of claim 1,
wherein, if the emergency escape route information does not exist in the terminal, the emergency escape route information corresponding to the identified area where the terminal is located is downloaded from the outside.
The method of claim 1,
wherein the augmented reality content displayed in the camera view mode
includes at least one of surrounding image information, location information, and emergency escape direction information for the area where the terminal is located.
The method of claim 1,
wherein, if the area where the terminal is located cannot be confirmed, the emergency escape route information for that area is downloaded over Wi-Fi using a unique Wi-Fi identifier.
A mobile terminal for providing route guidance, comprising:
a wireless communication unit which obtains information on an area where the terminal is located and obtains emergency escape route information for the area;
a control unit which executes an emergency escape application and generates augmented reality content including the emergency escape route information corresponding to a location to which the terminal moves; and
an output unit which displays the generated augmented reality content in a camera view mode.
The mobile terminal of claim 6,
wherein the control unit
parses a message received through the wireless communication unit and executes the emergency escape application if the parsed message includes disaster information.
The mobile terminal of claim 6,
wherein the augmented reality content
includes at least one of surrounding image information, location information, and emergency escape direction information for the area where the terminal is located.
The mobile terminal of claim 6,
further comprising a memory unit which stores information on the region where the terminal is located and emergency escape route information for the region.
KR1020120006572A 2012-01-20 2012-01-20 Method for providing route guide using augmented reality and mobile terminal using this method KR20130085653A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120006572A KR20130085653A (en) 2012-01-20 2012-01-20 Method for providing route guide using augmented reality and mobile terminal using this method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120006572A KR20130085653A (en) 2012-01-20 2012-01-20 Method for providing route guide using augmented reality and mobile terminal using this method

Publications (1)

Publication Number Publication Date
KR20130085653A true KR20130085653A (en) 2013-07-30

Family

ID=48995780

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120006572A KR20130085653A (en) 2012-01-20 2012-01-20 Method for providing route guide using augmented reality and mobile terminal using this method

Country Status (1)

Country Link
KR (1) KR20130085653A (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101504612B1 (en) * 2014-09-17 2015-03-20 주식회사초록물고기아트텍 Emergency evacuation information system using augmented reality and information system thereof
WO2016043393A1 (en) * 2014-09-17 2016-03-24 주식회사 초록물고기아트텍 Emergency evacuation guidance system and guidance method using augmented reality
WO2016190679A1 (en) * 2015-05-27 2016-12-01 한국과학기술정보연구원 Method for displaying disaster and safety information, and portable device
US20180143326A1 (en) * 2015-05-27 2018-05-24 Korea Institute Of Science & Technology Information Method for displaying disaster and safety information, and portable device
AU2016267916B2 (en) * 2015-05-27 2018-10-25 Korea Institute Of Science & Technology Information Method for displaying disaster and safety information, and portable device
US10234563B2 (en) 2015-05-27 2019-03-19 Korea Institute Of Science & Technology Information Method for displaying disaster and safety information, and portable device
CN108151709A (en) * 2016-12-06 2018-06-12 百度在线网络技术(北京)有限公司 Localization method and device applied to terminal
KR101968607B1 (en) * 2017-11-03 2019-04-12 주식회사 유진코어 System and method for broadcating real-time disaster using mobile device
KR102012681B1 (en) * 2018-05-23 2019-08-21 최성은 Operation management system in case of emergency disaster and Drive method of the Same
KR20190138222A (en) 2018-06-04 2019-12-12 이형주 Evacuation route guidance system in a building based on augmented reality
CN113631886A (en) * 2019-03-26 2021-11-09 斯纳普公司 Augmented reality guided interface
WO2021172970A3 (en) * 2020-02-28 2021-10-21 유상규 Portable device camera function capable of storing location information, address transmission method using same, and navigation device configuration and content provision method
KR102228175B1 (en) * 2020-07-22 2021-03-16 (주)메가플랜 Emergency route guidance method using augmented reality and electronic device having same
WO2022019392A1 (en) * 2020-07-22 2022-01-27 (주)메가플랜 Emergency path guiding method using augmented reality, and electronic device including same
KR102349293B1 (en) * 2021-11-09 2022-01-11 대한민국 Method and system for evacuation route guidance of occupants using augmented reality of mobile devices
US11995736B2 (en) 2021-11-09 2024-05-28 National Disaster Management Research Institute Method and system for evacuation route guidance of occupants using augmented reality of mobile devices
WO2023146198A1 (en) * 2022-01-25 2023-08-03 삼성전자 주식회사 Electronic device and method for controlling output device

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination