KR102067059B1 - Terminal and method for controlling the same - Google Patents

Terminal and method for controlling the same

Info

Publication number
KR102067059B1
Authority
KR
South Korea
Prior art keywords
external terminal
terminal
map
controller
chat
Prior art date
Application number
KR1020130107787A
Other languages
Korean (ko)
Other versions
KR20150029088A (en)
Inventor
민경석
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020130107787A
Publication of KR20150029088A
Application granted
Publication of KR102067059B1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 - Services making use of location information
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 - Route searching; Route guidance
    • G01C 21/3407 - Route searching; Route guidance specially adapted for specific applications
    • G01C 21/3438 - Rendez-vous, i.e. searching a destination where several users can meet, and the routes to this destination for these users; Ride sharing, i.e. searching a route such that at least two users can share a vehicle for at least part of the route
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06Q - DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/01 - Social networking

Abstract

The present invention relates to a mobile terminal, and a control method thereof, capable of simultaneously displaying the location information and chat contents of the mobile terminal and a chat partner on a map of the area where the mobile terminal and the chat partner are located.

Description

Mobile terminal and its control method {TERMINAL AND METHOD FOR CONTROLLING THE SAME}

The present invention relates to a portable terminal, and a method of controlling the same, that enable the terminal to be used with greater consideration of user convenience.

Terminals may be divided into mobile (portable) terminals and stationary terminals depending on whether they can be moved. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can carry them directly.

As terminal functions diversify, such a terminal is being implemented as a multimedia player with composite functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.

To support and expand such terminal functions, improvements to the structural and/or software parts of the terminal may be considered.

Conventionally, when sharing location information between terminals, the terminals agree to provide location information to each other and then exchange it through text messages.

However, this conventional approach has the inconvenience of providing only simple text-based location information through the transmission and reception of text messages.

SUMMARY OF THE INVENTION An object of the present invention is to provide a mobile terminal, and a control method thereof, capable of simultaneously displaying the location information and chat contents of the mobile terminal and a chat partner on a map of the area where the mobile terminal and the chat partner are located.

According to an aspect of the present invention, there is provided a portable terminal comprising: a wireless communication unit configured to transmit and receive messages including text-based chat contents with one or more external terminals registered in a messenger; a display unit configured to display the chat contents with the external terminal on a screen; a memory having map data including at least one region; and a controller configured to, when the text-based chat mode is switched to a map-based chat mode, search the map data for a map depicting an area that includes the current locations of the portable terminal and the external terminal and display it on the screen, display first and second items representing the portable terminal and the external terminal at points corresponding to their respective current locations, display on the first item the chat contents that the portable terminal transmits to the external terminal, and display on the second item the chat contents that the external terminal transmits to the portable terminal.

In addition, a method of controlling a mobile terminal according to the present invention includes: transmitting and receiving messages including text-based chat contents with at least one external terminal registered in a messenger; displaying the chat contents with the external terminal on a screen; switching the text-based chat mode to a map-based chat mode; retrieving, from provided map data, a map depicting an area that includes the current locations of the mobile terminal and the external terminal; displaying the retrieved map on the screen; displaying first and second items representing the mobile terminal and the external terminal at points on the map corresponding to their respective current locations; and displaying on the first item the chat contents transmitted by the mobile terminal to the external terminal, and displaying on the second item the chat contents transmitted by the external terminal to the mobile terminal.
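
To make the structure of the map-based chat screen concrete, the following is a minimal data-model sketch of what such a screen must hold: one item per participant, each carrying a terminal identifier, a map coordinate, and the latest chat bubble. All names (GeoPoint, MapChatItem, MapChatScreen) are hypothetical and not taken from the patent.

```kotlin
// Minimal, hypothetical data model for the map-based chat screen described above.
data class GeoPoint(val latitude: Double, val longitude: Double)

data class MapChatItem(
    val terminalId: String,             // "ME" for the local portable terminal, otherwise a partner id
    val displayName: String,            // conversation name or phonebook contact name
    var location: GeoPoint,             // current location shown on the map
    var lastChatBubble: String? = null  // latest chat content rendered as a speech bubble
)

data class MapChatScreen(
    val mapRegion: Pair<GeoPoint, GeoPoint>,  // bounding box of the area covering all participants
    val items: List<MapChatItem>              // first item = local terminal, rest = external terminals
)
```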

The mobile terminal and its control method according to the present invention simultaneously display the location information and chat contents of the mobile terminal and the chat partner on a map of the area where the mobile terminal and the chat partner are located, thereby providing the effect of chatting and sharing locations with each other at the same time.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a front perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a rear perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a control process of a mobile terminal according to the present invention.
FIGS. 5 to 19 are explanatory diagrams showing a control process of a portable terminal according to the present invention.

Hereinafter, a portable terminal according to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of writing this specification and do not in themselves have distinct meanings or roles.

The portable terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), navigation, and the like.

However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, etc., except when applicable only to portable terminals.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention. The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.

The broadcast receiving module 111 receives a broadcast signal and/or broadcast related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided in the mobile terminal 100 for simultaneous reception of at least two broadcast channels or for switching between broadcast channels.

The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.

The broadcast associated information refers to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), Convergence of Broadcasting and Mobile Service (DVB-CBMS), Open Mobile Alliance-BroadCAST (OMA-BCAST), China Multimedia Mobile Broadcasting (CMMB), Mobile Broadcasting Business Management System (MBBMS), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the above-described digital broadcasting systems but also for other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network such as, but not limited to, Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), and Wideband CDMA (WCDMA). The wireless signals may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless internet module 113 refers to a module for wireless internet access and may be internal or external to the mobile terminal 100. Wireless internet technologies include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), GSM, CDMA, WCDMA, and Long Term Evolution (LTE), but are not limited thereto.

In view of the fact that wireless Internet access by Wibro, HSDPA, GSM, CDMA, WCDMA, LTE, and the like is made through a mobile communication network, the wireless internet module 113 that performs wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 112.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for obtaining the location of the mobile terminal, and a representative example thereof is a GPS (Global Positioning System) module. According to the current technology, the GPS module 115 calculates distance information from three or more satellites and accurate time information, and then applies trigonometric methods to the calculated information, thereby accurately calculating three-dimensional location information according to latitude, longitude, and altitude. Currently, a method of calculating position and time information using three satellites and then correcting the error of the calculated position and time information using another satellite is widely used. In addition, the GPS module 115 may calculate speed information by continuously calculating the current position in real time.
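
As a rough illustration of the trigonometric step, the sketch below solves a simplified 2D trilateration problem from three reference points and measured distances; it is a minimal sketch under that simplification, not the GPS module's actual algorithm, and the names are hypothetical.

```kotlin
// Simplified 2D trilateration sketch (hypothetical, not the GPS module's implementation):
// subtracting the circle equations (x - xi)^2 + (y - yi)^2 = di^2 pairwise gives a 2x2
// linear system that is solved with Cramer's rule.
data class Point2D(val x: Double, val y: Double)

fun trilaterate(p1: Point2D, d1: Double, p2: Point2D, d2: Double, p3: Point2D, d3: Double): Point2D {
    val a1 = 2 * (p2.x - p1.x); val b1 = 2 * (p2.y - p1.y)
    val c1 = d1 * d1 - d2 * d2 - p1.x * p1.x + p2.x * p2.x - p1.y * p1.y + p2.y * p2.y
    val a2 = 2 * (p3.x - p2.x); val b2 = 2 * (p3.y - p2.y)
    val c2 = d2 * d2 - d3 * d3 - p2.x * p2.x + p3.x * p3.x - p2.y * p2.y + p3.y * p3.y
    val det = a1 * b2 - a2 * b1
    require(det != 0.0) { "Reference points must not be collinear" }
    return Point2D((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
}
```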

Referring to FIG. 1, the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode. The processed image frame may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.

The microphone 122 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data. The processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output in the call mode. The microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a button 136 and a touch sensor (static pressure/capacitance) 137 positioned on the front, rear, or side surfaces of the portable terminal 100 and, although not shown, may further include a keypad, a dome switch, a jog wheel, and a jog switch.

The sensing unit 140 detects the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal, and the acceleration/deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the mobile terminal 100 is a slide phone, it may sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141 (described later).

The output unit 150 is used to generate output related to sight, hearing, or touch, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, a user interface (UI) or a graphic user interface (GUI) related to a call is displayed. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays a photographed and / or received image, a UI, and a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays can be configured to be transparent or light transmissive so that the outside can be seen through them. This may be referred to as a transparent display. A representative example of the transparent display is the TOLED (Transparent OLED). The rear structure of the display unit 151 may also be configured as a light transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the implementation form of the mobile terminal 100. For example, the plurality of display units may be spaced apart or integrally disposed on one surface of the portable terminal 100, or may be disposed on different surfaces.

When the display unit 151 and the touch sensor 137 form a mutual layer structure or are formed integrally (hereinafter referred to as a touch screen), the display unit 151 may be used as an input device in addition to an output device. For example, when the touch sensor 137 has the form of a touch film, a touch sheet, or a touch pad, it may be stacked on the display unit 151 to form a layer structure, or it may be included in the structure of the display unit 151 to be formed integrally with it.

The touch sensor 137 may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated at a specific portion of the display unit 151 into an electrical input signal. The touch sensor 137 may be configured to detect not only the position and area of the touch but also the pressure at the touch.

When there is a touch input to the touch sensor 137, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can determine which area of the display unit 151 has been touched.

The proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen or near the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays. Proximity sensors have a longer life and higher utilization than touch sensors.

Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen so that it is recognized as being located on the touch screen without actually contacting it is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch by the pointer on the touch screen means the position at which the pointer is perpendicular to the touch screen during the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed by the portable terminal 100. The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. The video signal or the audio signal may also be output through the display unit 151 or the audio output module 152, so in this case the display unit 151 and the audio output module 152 may be classified as a kind of alarm unit 153.

The haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as the effect of a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through a nozzle or inlet, a grazing effect on the skin surface, contact with an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 may not only deliver the haptic effect through direct contact, but may also be implemented so that the user can feel the haptic effect through muscle sensation in a finger, an arm, or the like. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store programs for the processing and control of the controller 180 and may also perform a function of temporarily storing input/output data (for example, a phone book, messages, audio, still images, and videos). The memory 160 may also store the frequency of use of each piece of data (for example, the frequency of use of each phone number, message, and multimedia item).

In addition, the memory 160 may store data on vibration and sound of various patterns output when a touch is input on the touch screen, and store map data including a plurality of regions according to the present invention.

The memory 160 may include at least one type of storage medium among flash memory, a hard disk, multimedia card micro type memory, card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and transfers it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video input/output (I/O) ports, an earphone port, and the like.

The identification module is a chip that stores various types of information for authenticating the usage authority of the mobile terminal 100, and includes a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through a port.

The interface unit may serve as a passage through which power from a cradle is supplied to the portable terminal 100 when the portable terminal 100 is connected to an external cradle, or as a passage through which various command signals input by the user from the cradle are delivered to the portable terminal. The various command signals or power input from the cradle may operate as signals for recognizing that the portable terminal is correctly mounted in the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the control and processing related to voice calls, data communications, video calls, and the like. The controller 180 may include a multimedia module 181 for playing multimedia. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component. The power supply unit 190 may include, for example, a battery, a connection port, a power supply control unit, and a charging monitoring unit.

The battery may be a built-in battery made to be chargeable, and may be detachably coupled to the terminal body for charging. The connection port may be configured as an example of the interface 170 to which an external charger for supplying power for charging the battery is electrically connected.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more functions and operations described herein. Software code may be implemented in software applications written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2 is a front perspective view of an example of a mobile terminal according to the present invention.

The disclosed portable terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto and may be applied to various structures, such as a slide type, a folder type, a swing type, a swivel type, or a structure in which two or more bodies are coupled so as to be relatively movable.

The body of the portable terminal 100 includes cases 101, 102, and 103 forming an appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are built in the space formed between the front case 101 and the rear case 102.

For parts that need to be detachable, such as an auxiliary storage medium 165 like a USIM card or a memory card, the case may include an opening so that they can be inserted from the outside. A slot may be formed on the side of the case so that the auxiliary storage medium 165 can be inserted into and mounted on the side of the portable terminal 100, or card slots 166 and 167 may be formed so that it can be mounted on the surface of the rear case 102.

The cases 101, 102, and 103 may be formed by injecting synthetic resin, or may be formed to have a metal material such as stainless steel (STS) or titanium (Ti).

The terminal cases 101 and 102 may include a display unit 151, an audio output unit 152, a camera 121, a user input unit 130/131, 132, a microphone 122, an interface 170, and the like.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in regions adjacent to one end of both ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in regions adjacent to the other end. The user input unit 132 and the interface 170 may be disposed on side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131, 132, and 133. The manipulation units 131, 132, 133 may also be collectively referred to as a manipulating portion.

The content input by the first or second manipulation units 131 and 132 may be set in various ways. For example, the first manipulation unit 131 receives commands such as start, end, and scroll, the second manipulation unit 132 receives commands such as adjusting the volume of the sound output from the sound output unit 152, and the third manipulation unit 133 may receive commands such as activating/deactivating the touch recognition mode of the display unit 151.

The manipulation units 131, 132, and 133 may employ a button mechanism that recognizes a command when the user applies pressure, or, like the display unit 151, may include a touch sensor 137 so that a user's command can be input by touch alone.

FIG. 3 is a rear perspective view of the portable terminal shown in FIG. 2.

Referring to FIG. 3, a camera 121 ′ may be additionally mounted on the rear of the terminal body, that is, the rear case 102. The camera 121 ′ has a photographing direction substantially opposite to that of the camera 121 (see FIG. 2), and may be a camera having the same or different pixels as the camera 121.

For example, it is desirable for the camera 121 to have a low pixel count, since it typically photographs the user's face and transmits it to the counterpart during a video call, and for the camera 121' to have a high pixel count, since it often photographs a general subject that is not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

The flash 123 and the mirror 124 may be further disposed adjacent to the camera 121 ′. The flash 123 shines light toward the subject when the subject is photographed by the camera 121 '. The mirror 124 allows the user to see his / her own face or the like when photographing (self-photographing) the user using the camera 121 '.

The sound output unit 152 'may be further disposed on the rear surface of the terminal body. The sound output unit 152 ′ at the rear of the body may implement a stereo function together with the sound output unit 152 (see FIG. 2) at the front of the body, and may be used to implement a speakerphone mode during a call.

The antenna 116 for receiving a broadcast signal may be additionally disposed on the side of the terminal body. The antenna 116 constituting a part of the broadcast receiving module 111 (refer to FIG. 1) may be installed to be pulled out of the terminal body.

Hereinafter, an embodiment of a control method that may be implemented in the mobile terminal 100 configured as described above will be described with reference to FIG. 4.

For convenience of description, it is assumed that the portable terminal mentioned below includes at least some of the components shown in FIG. 1. Specifically, the portable terminal according to the present invention includes a display unit 151, a memory 160, and a controller 180 among the components shown in FIG. 1. Since the portable terminal according to the present invention can be more easily implemented when the display unit 151 is the touch screen 151, it will be assumed below that the display unit 151 is a 'touch screen'.

Meanwhile, in the embodiments of the present invention, a user's touch action refers to a touch gesture implemented by performing a contact touch or a proximity touch on the touch screen type display unit 151, and a touch input refers to an input received by such a touch gesture.

Depending on the action, touch gestures may include tapping, touch & drag, flicking, long touch, multi-touch, pinch-in, and pinch-out.

Tapping is an operation of lightly pressing and releasing the display unit 151 once, and means a touch gesture such as a mouse click in a general personal computer.

Touch & drag is an operation of touching the display unit 151, moving to a specific position while maintaining the touch, and then releasing the touch. When an object is dragged, it may be continuously moved and displayed along the drag direction.

Flicking refers to an operation of touching the display unit 151 and then stroking in a specific direction (up, down, left, right, or diagonal) at a specific speed (or intensity). When a touch input is received by flicking, the portable terminal 100 performs a specific operation based on the flicking direction, speed, and the like.

A long touch refers to an operation of continuously touching the display unit 151 for a preset time or longer.

In addition, the multi-touch refers to an operation of simultaneously touching a plurality of points of the display unit 151.

Pinch-in refers to an operation of dragging a plurality of multi-touched pointers on the display unit 151 in directions toward each other. That is, a drag starts from at least one of the multi-touched points on the display unit 151 and proceeds in a direction in which the multi-touched points move closer to each other.

Pinch-out refers to an operation of dragging a plurality of multi-touched pointers on the display unit 151 in directions away from each other. That is, a drag starts from at least one of the multi-touched points on the display unit 151 and proceeds in a direction in which the multi-touched points move away from each other.
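
The distinction between pinch-in and pinch-out can be made from how the distance between two pointers changes; the sketch below is a hypothetical classifier (the names and the slop threshold are illustrative, not from the patent).

```kotlin
import kotlin.math.hypot

// Hypothetical two-pointer gesture classifier: compares the distance between two pointers
// at touch-down and at touch-up to decide pinch-in vs. pinch-out.
data class Pointer(val x: Float, val y: Float)

enum class PinchGesture { PINCH_IN, PINCH_OUT, NONE }

fun classifyPinch(
    downA: Pointer, downB: Pointer,   // pointer positions when the multi-touch started
    upA: Pointer, upB: Pointer,       // pointer positions when the touch was released
    slop: Float = 24f                 // minimum distance change (px) treated as a pinch
): PinchGesture {
    val startDist = hypot(downA.x - downB.x, downA.y - downB.y)
    val endDist = hypot(upA.x - upB.x, upA.y - upB.y)
    return when {
        endDist < startDist - slop -> PinchGesture.PINCH_IN   // pointers moved closer
        endDist > startDist + slop -> PinchGesture.PINCH_OUT  // pointers moved apart
        else -> PinchGesture.NONE
    }
}
```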

Referring to FIGS. 4 to 19, the process according to the present invention of chatting and sharing locations at the same time, by simultaneously displaying the location information and chat contents of the mobile terminal and the chat partner on a map of the area where the mobile terminal and the chat partner are located, will now be described in detail.

FIG. 4 is a flowchart illustrating a control process of a mobile terminal according to the present invention.

FIGS. 5 to 19 are explanatory diagrams showing a control process of a portable terminal according to the present invention.

Referring to FIGS. 4 to 19, the controller 180 of the mobile terminal 100 runs a messenger for group chat with one or more external terminals according to a user's request, and displays the text-based chat contents display screen of the running messenger on the touch screen 151.

The messenger provides a text-based chat mode and a map-based chat mode according to the present invention.

When a chat group including one or more external terminals is set by the user through the messenger [S110], the controller 180 transmits and receives messages including chat contents with the external terminals set in the chat group, and displays the chat contents included in the transmitted and received messages on the chat contents display screen of the messenger [S120].

At this time, if a command for switching the text-based chat mode to a map-based chat mode is input, the controller 180 switches the text-based chat mode to the map-based chat mode [S130] and searches the map data provided in the memory 160 for a map depicting an area that includes the current location of the mobile terminal 100 and the current locations of the external terminals [S140].

That is, when the text-based chat mode is switched to the map-based chat mode, the controller 180 determines the current location of the mobile terminal 100 by driving the location information module 115, transmits the current location information of the mobile terminal 100 to the external terminals through the wireless communication unit 110, requests and receives the current location information of the external terminals from the external terminals, and then, based on the determined current location of the mobile terminal 100 and the received current locations of the external terminals, searches the map data for a map depicting an area that includes all of these current locations.

The controller 180 displays the retrieved map as the background of the chat contents display screen of the messenger [S150], and displays, in the displayed map, items representing the mobile terminal 100 and the external terminals at the points corresponding to the current positions of the mobile terminal 100 and the external terminals, respectively [S160].

In addition, the controller 180 displays, in the form of a speech bubble, the chat contents transmitted from the user of the mobile terminal 100 to the users of the external terminals on the item representing the mobile terminal 100 among the displayed items [S170], and displays, in the form of speech bubbles, the chat contents transmitted by the users of the external terminals to the user of the mobile terminal 100 on the items representing the external terminals among the displayed items [S180].
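
As a rough illustration of steps S130 to S180, the sketch below (reusing the GeoPoint and MapChatScreen types from the earlier data-model sketch) strings the operations together; the helper functions passed in are hypothetical stand-ins for the location information module, the wireless communication unit, and the map-data search.

```kotlin
// Hypothetical sketch of the S130-S180 flow; the helper lambdas stand in for the location
// information module, the wireless communication unit, and the map-data search in memory.
fun switchToMapChatMode(
    screen: MapChatScreen,
    getCurrentLocation: () -> GeoPoint,
    requestPartnerLocations: () -> Map<String, GeoPoint>,
    findMapCovering: (List<GeoPoint>) -> Pair<GeoPoint, GeoPoint>,
    render: (MapChatScreen) -> Unit
): MapChatScreen {
    val myLocation = getCurrentLocation()                  // obtain the local terminal's position
    val partnerLocations = requestPartnerLocations()       // request/receive partner positions
    val allPoints = listOf(myLocation) + partnerLocations.values
    val region = findMapCovering(allPoints)                // S140: map covering every participant

    val items = screen.items.map { item ->
        when (item.terminalId) {
            "ME" -> item.copy(location = myLocation)       // S160: place the local terminal's item
            else -> item.copy(location = partnerLocations[item.terminalId] ?: item.location)
        }
    }
    val updated = MapChatScreen(mapRegion = region, items = items)
    render(updated)                                        // S150-S180: draw map, items, bubbles
    return updated
}
```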

For example, when the touch key 411 to which the map-based chat mode switching function is assigned is touched on the messenger screen of the text-based chat mode environment, as illustrated in (a) of FIG. 5, the controller 180 switches the text-based chat mode 200 to the map-based chat mode 300.

In addition, as shown in (b) of FIG. 5, when the controller 180 receives, from an external terminal that is a specific chat partner, a message requesting that the current text chat mode be switched to the map chat mode, the controller 180 displays the received message, and when the touch key 412 for switching the text chat mode to the map chat mode is touched, the controller 180 switches the text-based chat mode 200 to the map-based chat mode 300, as illustrated in (c) of FIG. 5.

When switched to the map chat mode 300, the controller 180, as shown in (c) of FIG. 5, searches the map data contained in the memory 160 for the map 310 of the region that includes the current locations of the portable terminal 100 and the external terminals, and displays it as the background of the messenger screen.

In addition, the controller 180 displays the first item 211, representing the user ("ME") of the mobile terminal 100, at the point in the map 310 corresponding to the current location of the mobile terminal 100, and displays the second and third items 221 and 231, representing the users ("LEE" and "TOM") of the external terminals, at the points corresponding to the current locations of those users, respectively.

In addition, the controller 180 displays, on the first item 211 in the form of a speech bubble, the chat contents 212 that the user ("ME") of the mobile terminal 100 transmits to the users "LEE" and "TOM" of the external terminals, and displays, on the second and third items 221 and 231 in the form of speech bubbles, the chat contents 222 and 232 that the users "LEE" and "TOM" of the external terminals each transmit to the user ("ME") of the portable terminal 100.

In this case, the controller 180 may display, in the first item 211, the conversation name of the user ("ME") of the mobile terminal 100 or the contact name registered in the phone book of the memory 160, and may display, in the second and third items 221 and 231, the conversation names of the users "LEE" and "TOM" of the corresponding external terminals or their contact names already registered in the phone book.

In addition, as shown in FIG. 6, the controller 180 can display the movement paths of the mobile terminal 100 and the external terminals on the map 310, respectively.

As shown in (a) and (b) of FIG. 6, after switching to the map-based chat mode, the controller 180 periodically determines the location of the mobile terminal 100 and the locations of the external terminals; whenever these positions change, it displays the first to third items 211, 221, and 231 representing the portable terminal 100 and the external terminals at the changed positions, and displays on the map 310 the path 213 along which the portable terminal 100 moved from its previous position to its current position and the paths 223 and 233 along which the external terminals moved from their previous positions to their current positions, respectively.
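
A rough sketch of this periodic path tracking, reusing the GeoPoint type from the earlier data-model sketch, might keep one growing polyline per participant (the class and its names are hypothetical):

```kotlin
// Hypothetical path-tracking sketch: append each new position to a per-terminal polyline
// so the map can draw how every participant has moved since the map chat mode started.
class MovementTracker {
    private val paths = mutableMapOf<String, MutableList<GeoPoint>>()

    // Record the latest position of a terminal; returns the full path so far for drawing.
    fun update(terminalId: String, position: GeoPoint): List<GeoPoint> {
        val path = paths.getOrPut(terminalId) { mutableListOf() }
        if (path.lastOrNull() != position) path.add(position)  // skip unchanged positions
        return path
    }
}
```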

Next, as shown in FIG. 7, when the first item 211 representing the mobile terminal 100 is selected, the controller 180 displays on the map 310 a list of one or more editing functions associated with the map 310, and the user can edit the map 310 in a desired manner by using an editing function from the list.

As illustrated in (a) of FIG. 7, when the first item 211 is pre-touched, the controller 180 displays on the map 310 the list 420 of one or more editing functions 421, 422, 423, 424, and 425 associated with the map 310, as illustrated in (b) of FIG. 7.

For example, the list 420 includes a map capture function 421; when the map capture function 421 is selected, the map 310, including the first to third items 211, 221, and 231 and the chat contents 212, 222, and 232, is captured.

As another example, the list 420 includes a map storage function 422; when the map storage function 422 is selected, the map 310, including the first to third items 211, 221, and 231 and the chat contents 212, 222, and 232, is stored in the memory 160.

As another example, the list 420 includes a map sharing function 423; when the map sharing function 423 is selected, an image of the map 310, including the first to third items 211, 221, and 231 and the chat contents 212, 222, and 232, is transmitted to and shared with the external terminals, or, if the map sharing function 423 is selected and the contact of a specific counterpart terminal is designated, the image of the map 310 may be transmitted to and shared with that specific counterpart terminal.

As another example, the list 420 includes a map zoom-in/zoom-out function 424; when the map zoom-in/zoom-out function 424 is selected and a pinch-in touch gesture is input on the map 310, the controller 180 zooms the map 310 out according to the degree of the pinch-in, and when a pinch-out touch gesture is input on the map 310, it zooms the map 310 in according to the degree of the pinch-out.

As another example, the list 420 may include a chat partner add function 425; when the chat partner add function 425 is selected, a list of chat contacts registered in the messenger, or an input window for the contact/chat ID of the chat partner to be added, is displayed, and the chat partner selected through the list or entered through the input window is added to the current chat group.
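
Handling a selection from this editing-function list can be pictured as a simple dispatch over the chosen entry; the sketch below uses hypothetical action names that mirror functions 421 to 425 and is not the patent's implementation.

```kotlin
// Hypothetical dispatch over the map editing functions 421-425 described above.
enum class MapEditFunction { CAPTURE, STORE, SHARE, ZOOM, ADD_PARTNER }

fun onEditFunctionSelected(
    function: MapEditFunction,
    captureMap: () -> ByteArray,                   // renders map + items + bubbles to an image
    storeImage: (ByteArray) -> Unit,               // saves into the terminal's memory
    shareImage: (ByteArray, List<String>) -> Unit, // transmits to external terminals / a contact
    enablePinchZoom: () -> Unit,                   // arms pinch-in (zoom out) / pinch-out (zoom in)
    showAddPartnerDialog: () -> Unit,              // contact list or contact/chat-ID input window
    partnerIds: List<String>
) {
    when (function) {
        MapEditFunction.CAPTURE -> captureMap()                       // 421: capture the map
        MapEditFunction.STORE -> storeImage(captureMap())             // 422: save the captured image
        MapEditFunction.SHARE -> shareImage(captureMap(), partnerIds) // 423: send to chat partners
        MapEditFunction.ZOOM -> enablePinchZoom()                     // 424: pinch zooming
        MapEditFunction.ADD_PARTNER -> showAddPartnerDialog()         // 425: add a chat partner
    }
}
```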

Next, as shown in (a) of FIG. 8, when the first item 211 representing the user of the mobile terminal 100 is touched, the controller 180 may display on the map 310 the input window 430 for chat contents to be transmitted to the external terminals, as shown in (b) of FIG. 8.

Next, as illustrated in (a) of FIG. 9, when the first item 211 among the first to third items 211, 221, and 231 displayed in the map 310 is touched and then the second and third items 221 and 231 are touched, the controller 180 connects a multi-party call (a voice call or a video call) between the mobile terminal 100 and the external terminals through the wireless communication unit 110, as shown in (b) of FIG. 9.

In addition, if one of the second and third items is selected after the first item 211 is touched, the controller 180 may connect a voice call or a video call between the mobile terminal 100 and the external terminal corresponding to the selected item.

Next, as illustrated in (a) of FIG. 10, when the first item 211 representing the user of the mobile terminal 100 is touched, the controller 180 may display the memo input window 440, as illustrated in (b) of FIG. 10, and when memo contents are input through the memo input window 440, the map 310 including the input memo contents may be stored. In addition, the input memo contents may be transmitted to and shared with the external terminals.

Next, as illustrated in (a) of FIG. 11, when the first item 211 is touched and a specific point on the map 310 to be set as the destination 451 is touched, the controller 180 identifies the route 452 from the position of the mobile terminal 100 indicated by the first item 211 to the position corresponding to the destination 451, and may display the identified route 452 on the map 310, as shown in (b) of FIG. 11. In this case, the controller 180 may transmit, to at least one of the external terminals, at least one of the location information corresponding to the specific point and the location information corresponding to the route 452, and share it.
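
A real implementation would query a road-network routing engine for the route 452; as a stand-in, the sketch below (hypothetical, reusing GeoPoint) interpolates a straight-line route between the terminal's position and the touched destination and bundles the data that could be shared with an external terminal.

```kotlin
// Hypothetical destination-route sketch: a straight-line interpolation stands in for a
// real road-network route search; the result bundles what could be shared with partners.
data class RouteShare(val destination: GeoPoint, val routePoints: List<GeoPoint>)

fun buildRouteShare(current: GeoPoint, destination: GeoPoint, steps: Int = 10): RouteShare {
    val points = (0..steps).map { i ->
        val t = i.toDouble() / steps
        GeoPoint(
            latitude = current.latitude + (destination.latitude - current.latitude) * t,
            longitude = current.longitude + (destination.longitude - current.longitude) * t
        )
    }
    return RouteShare(destination, points)
}
```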

Next, as shown in (a) of FIG. 12, when the first item 211 and at least one of the second and third items 221 and 231 (the third item 231 in this example) are touched and then a specific point on the map 310 to be set as the appointment place 461 is touched, the controller 180 transmits the location information corresponding to the touched appointment place 461 to the external terminal corresponding to the third item 231 to announce the location of the appointment place 461, as shown in (b) of FIG. 12.

In addition, the controller 180 identifies the first path 462, from the location of the mobile terminal 100 at the time the first item 211 is touched to the location corresponding to the appointment place 461, and the second path 463, from the location of the external terminal at the time the third item 231 is touched to the location corresponding to the appointment place 461, and displays the identified first and second paths 462 and 463 on the map 310.

Next, as shown in (a) of FIG. 13, when the destination 451 of the user of the mobile terminal 100 has been set on the map 310 by the process of FIG. 11 and the first item 211 and the third item 231 are touched, the controller 180 determines the estimated arrival time based on the route from the location of the mobile terminal 100 at the time the first item 211 is touched to the location corresponding to the destination 451, as shown in (b) of FIG. 13.

If the mobile terminal 100 does not reach the location of the destination 451 within the estimated arrival time, the controller 180 automatically transmits, to the contact of the external terminal corresponding to the touched third item 231 or to a preset public-office contact (for example, "112" or "119"), a message indicating that an emergency situation has occurred to the user of the mobile terminal 100.

Subsequently, as shown in (a) of FIG. 14, when the first item 211 and the third item 231 are touched, the controller 180 checks, as shown in (b) of FIG. 14, whether a message including chat contents is received from the external terminal corresponding to the third item 231 through the wireless communication unit 110 within a preset time, and if no message is received within the preset time, it automatically transmits, through the wireless communication unit 110, a message indicating that an emergency situation has occurred to the user of the external terminal corresponding to the third item 231 to the contact (for example, "112" or "119") of the predetermined public office.
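
Both of these safety checks (FIG. 13 and FIG. 14) reduce to watching a deadline; the hypothetical sketch below shows one way such a timeout could trigger the automatic emergency message (the class, names, and callback are illustrative only).

```kotlin
import java.util.Timer
import kotlin.concurrent.schedule

// Hypothetical deadline-watch sketch for the emergency checks of FIGS. 13 and 14:
// if the expected event (arrival at the destination, or a chat message from the partner)
// has not happened when the deadline expires, send the automatic emergency message.
class EmergencyWatch(private val sendEmergencyMessage: (recipientContact: String) -> Unit) {
    private val timer = Timer(true)  // daemon timer

    fun watch(deadlineMillis: Long, eventHappened: () -> Boolean, recipientContact: String) {
        timer.schedule(deadlineMillis) {
            if (!eventHappened()) sendEmergencyMessage(recipientContact)
        }
    }
}
```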

Next, as shown in (a) of FIG. 15, when the first item 211 and the third item 231 are touched, the controller 180 provides a 1:1 whisper chat function between the user of the mobile terminal 100 corresponding to the first item 211 and the user of the external terminal corresponding to the third item 231, as shown in (b) of FIG. 15.

Next, as shown in (a) of FIG. 16, when the first item 211 and the third item 231 are touched, the controller 180, as shown in (b) of FIG. 16, identifies the movement route to, and the estimated arrival time at, an intermediate point having the shortest distance between the position of the mobile terminal 100 corresponding to the first item 211 and the position of the external terminal corresponding to the third item 231, and displays information 470, indicating the identified movement route and estimated arrival time, at the position corresponding to the intermediate point in the map 310. In addition, the controller 180 may transmit the determined position information of the intermediate point to the external terminal corresponding to the third item 231 and share it.
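
Under the simplifying assumption that the intermediate point is just the geographic midpoint between the two terminals (the patent leaves the exact criterion to the route search), a minimal sketch reusing GeoPoint looks like this:

```kotlin
// Hypothetical intermediate-point sketch: takes the geographic midpoint of the two terminals
// as the meeting point; a real implementation would pick a point along road-network routes.
fun intermediatePoint(mine: GeoPoint, partner: GeoPoint): GeoPoint = GeoPoint(
    latitude = (mine.latitude + partner.latitude) / 2.0,
    longitude = (mine.longitude + partner.longitude) / 2.0
)
```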

Next, as shown in (a) of FIG. 17, when the third item 231 is touched, the controller 180 deletes or withdraws the external terminal corresponding to the third item 231 from the current chat group, as shown in (b) of FIG. 17.

Next, as shown in (a) of FIG. 18, when the first item 211 and the third item 231 are touched, the controller 180, as shown in (b) of FIG. 18, drives the camera 121 of the mobile terminal 100, displays the real-time street image 481 around the mobile terminal 100 input through the camera 121 on the first item 211 in thumbnail form, and transmits the street image 481 through the wireless communication unit 110 to the external terminal corresponding to the third item 231 so that the external terminal can display the real-time street image 481 around the portable terminal 100.

In addition, the controller 180 transmits, through the wireless communication unit 110, a signal requesting a real-time street image 482 around the external terminal to the external terminal corresponding to the third item 231, receives through the wireless communication unit 110 the real-time street image 482 around the external terminal captured by the camera of that terminal, and displays the received real-time street image 482 on the third item 231 in thumbnail form.
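
This two-way exchange of camera thumbnails can be pictured as a small request/response pair; the sketch below is hypothetical and stubs out the actual camera and radio transport.

```kotlin
// Hypothetical street-image exchange sketch: one side streams its own camera frame to the
// partner and requests the partner's frame in return; transport and camera APIs are stubbed.
data class StreetImageFrame(val terminalId: String, val jpegBytes: ByteArray)

interface StreetImageChannel {
    fun send(frame: StreetImageFrame)                       // push my camera frame to the partner
    fun requestFrame(partnerId: String): StreetImageFrame   // ask the partner for its frame
}

fun exchangeStreetImages(
    channel: StreetImageChannel,
    myId: String,
    partnerId: String,
    captureMyFrame: () -> ByteArray,                        // stub for the local camera 121
    showThumbnail: (onItemOf: String, jpegBytes: ByteArray) -> Unit
) {
    channel.send(StreetImageFrame(myId, captureMyFrame()))  // FIG. 18: share my surroundings
    val partnerFrame = channel.requestFrame(partnerId)      // request the partner's surroundings
    showThumbnail(partnerId, partnerFrame.jpegBytes)        // thumbnail on the partner's item
}
```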

Lastly, as shown in (a) of FIG. 19, when the first to third items 211, 221, and 231 are touched and a specific point on the map 310 to be set as the appointment place 491 is touched, the controller 180 transmits the location information corresponding to the touched appointment place 491 to the external terminals corresponding to the second and third items 221 and 231 to announce the location of the appointment place 491, as illustrated in (b) of FIG. 19.

In addition, the controller 180 identifies the first route 492, from the location of the mobile terminal 100 at the time the first item 211 is touched to the location corresponding to the appointment place 491, the second route 493, from the location of the external terminal at the time the second item 221 is touched to the location corresponding to the appointment place 491, and the third route 494, from the location of the external terminal at the time the third item 231 is touched to the location corresponding to the appointment place 491, and displays the identified first to third routes 492, 493, and 494 on the map 310.

It will be apparent to those skilled in the art that the present invention can be embodied in other specific forms without departing from the spirit and essential features of the present invention.

Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
111: broadcast receiving unit 112: mobile communication module
113: wireless internet module 114: short-range communication module
115: location information module 120: A / V input unit
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output unit
151: display unit 152: sound output module
153: alarm module 154: haptic module
155: Projector Module 160: Memory
170: interface unit 180: control unit
181: multimedia module 190: power supply

Claims (19)

  1. A portable terminal comprising:
    a wireless communication unit configured to transmit and receive a message including text-based chat contents with at least one external terminal registered in a messenger;
    a display unit configured to display chat contents with the external terminal on a screen;
    a memory having map data including at least one region; and
    a controller configured to, when the text-based chat mode is switched to a map-based chat mode, search the map data for a map indicating a region including current locations of the portable terminal and the external terminal and display the map on the screen,
    display first and second items respectively representing the portable terminal and the external terminal at points in the map corresponding to the current locations of the portable terminal and the external terminal,
    display chat contents transmitted by the portable terminal to the external terminal on the first item, and
    display chat contents transmitted from the external terminal to the portable terminal on the second item.
  2. The portable terminal of claim 1, further comprising a location information unit configured to acquire a current location of the portable terminal,
    wherein, when the text-based chat mode is switched to the map-based chat mode, the controller obtains the current location of the portable terminal through the location information unit, provides the current location of the portable terminal to the external terminal through the wireless communication unit, and requests and obtains the current location of the external terminal from the external terminal through the wireless communication unit.
  3. Claim 3 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller displays a conversation name or a contact name of the user of the mobile terminal and of the user of the external terminal on the first and second items, respectively.
  4. Claim 4 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller displays the corresponding chat contents in the form of speech bubbles on the first and second items.
  5. Claim 5 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller displays respective movement paths of the mobile terminal and the external terminal on the map based on the current locations of the mobile terminal and the external terminal.
  6. Claim 6 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller, when the first item is selected, performs at least one of capturing, storing, sharing, and zooming in or out of the map.
  7. Claim 7 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller, when the first item is selected, displays on the map an input window for inputting chat contents to be transmitted to the external terminal.
  8. Claim 8 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller, when the first and second items are selected, connects one of a voice call and a video call with the external terminal through the wireless communication unit.
  9. Claim 9 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the controller, when the first item is selected, displays on the map an input window for inputting memo contents.
  10. Claim 10 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein, when the first item is selected and a specific point to be set as a destination is selected on the map, the controller identifies a path from a position corresponding to the first item to a position corresponding to the specific point and displays the identified path on the map.
  11. Claim 11 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 10, wherein the controller transmits at least one of location information corresponding to the selected specific point and location information corresponding to the identified path to the external terminal through the wireless communication unit.
  12. Claim 12 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein, when the first and second items are selected and a specific point to be set as an appointment place with the external terminal is selected on the map, the controller provides location information on the selected specific point to the external terminal through the wireless communication unit, identifies a path from a location corresponding to the first item to a location corresponding to the specific point, and displays the identified path on the map.
  13. Claim 13 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the memory stores at least one of a contact of the external terminal and a contact of a public office for reporting that an emergency situation has occurred to the user of the mobile terminal, and
    wherein, when the first and second items are selected, the controller determines an estimated arrival time from the current location of the mobile terminal to a preset destination and, if the mobile terminal is not located at the destination within the determined estimated arrival time, automatically transmits a message indicating that an emergency situation has occurred to the user of the mobile terminal to at least one of the contact of the external terminal and the contact of the public office through the wireless communication unit.
  14. Claim 14 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein the memory stores at least one contact of a public office for reporting that an emergency situation has occurred to the user of the external terminal, and
    wherein, when the first and second items are selected, the controller checks whether a message including chat contents is received from the external terminal through the wireless communication unit within a preset time and, if no message including chat contents is received from the external terminal, automatically transmits a message indicating that an emergency situation has occurred to the user of the external terminal to the contact of the public office through the wireless communication unit.
  15. Claim 15 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein, when the first and second items are selected, the controller performs a one-to-one (1:1) chat only with the external terminal corresponding to the second item among two or more external terminals.
  16. Claim 16 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein, when the first and second items are selected, the controller determines a movement path having the shortest distance from the current location of the mobile terminal to the current location of the external terminal and an estimated arrival time via the movement path, and displays information indicating the determined movement path and the estimated arrival time on the map.
  17. Claim 17 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, wherein, when the second item is selected, the controller deletes the external terminal corresponding to the second item from the chat group or causes it to leave the chat group.
  18. Claim 18 was abandoned upon payment of the registration fee.
    The mobile terminal of claim 1, further comprising a camera,
    wherein the controller, when the first and second items are selected, transmits images input through the camera to the external terminal in real time through the wireless communication unit, and receives, through the wireless communication unit, images input in real time through a camera of the external terminal and displays the received images.
  19. A method of controlling a mobile terminal, the method comprising:
    transmitting and receiving a message including text-based chat contents with at least one external terminal registered in a messenger;
    displaying chat contents with the external terminal on a screen;
    switching a text-based chat mode to a map-based chat mode;
    retrieving, from pre-installed map data, a map representing a region including the current locations of the mobile terminal and the external terminal;
    displaying the retrieved map on the screen;
    displaying first and second items respectively representing the mobile terminal and the external terminal at points on the map corresponding to the current locations of the mobile terminal and the external terminal; and
    displaying chat contents transmitted by the mobile terminal to the external terminal on the first item, and displaying chat contents transmitted from the external terminal to the mobile terminal on the second item.
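
Outside the claims themselves and purely for illustration, the watchdog behavior recited in claim 14, where an emergency notification is sent automatically when no chat message arrives from the partner within a preset time, could be sketched as follows. EmergencyContact, MessageSender, and the 30-minute default are assumptions, not elements of the claims:

```kotlin
// Hypothetical sketch of the claim-14 style watchdog: if the chat partner stays
// silent longer than a preset limit, send an emergency message to a stored contact.

import java.time.Duration
import java.time.Instant

data class EmergencyContact(val name: String, val address: String)

// Stand-in for the wireless communication unit.
interface MessageSender {
    fun send(to: EmergencyContact, body: String)
}

class ChatWatchdog(
    private val sender: MessageSender,
    private val contact: EmergencyContact,
    private val silenceLimit: Duration = Duration.ofMinutes(30) // assumed default
) {
    private var lastMessageAt: Instant = Instant.now()

    /** Called whenever a chat message is received from the external terminal. */
    fun onChatMessageReceived(now: Instant = Instant.now()) {
        lastMessageAt = now
    }

    /** Called periodically; reports an emergency if the partner has been silent too long. */
    fun check(now: Instant = Instant.now()) {
        if (Duration.between(lastMessageAt, now) > silenceLimit) {
            sender.send(
                to = contact,
                body = "No chat message has been received from the chat partner for " +
                        "${silenceLimit.toMinutes()} minutes; an emergency may have occurred."
            )
        }
    }
}
```

In practice the check interval and the silence limit would come from the preset time recited in the claim rather than the fixed default used here.
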
KR1020130107787A 2013-09-09 2013-09-09 Terminal and method for controlling the same KR102067059B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130107787A KR102067059B1 (en) 2013-09-09 2013-09-09 Terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130107787A KR102067059B1 (en) 2013-09-09 2013-09-09 Terminal and method for controlling the same

Publications (2)

Publication Number Publication Date
KR20150029088A KR20150029088A (en) 2015-03-18
KR102067059B1 true KR102067059B1 (en) 2020-01-16

Family

ID=53023654

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130107787A KR102067059B1 (en) 2013-09-09 2013-09-09 Terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR102067059B1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101666625B1 (en) * 2015-03-20 2016-10-14 신명진 Messaging Service System for Location Based by Multi Users and Method thereof
KR101648515B1 (en) * 2015-03-31 2016-08-16 (주)테슬라시스템 Position Information Sharing Method Based on Meeting Time and Place
KR101710886B1 (en) * 2015-11-06 2017-02-28 주식회사 하오문 Method for processing notice message by monitoring
WO2018030557A1 (en) 2016-08-10 2018-02-15 라인 가부시키가이샤 Messenger service method, system and recording medium for providing output effect
KR101983082B1 (en) * 2016-11-21 2019-05-30 파파야 주식회사 Personal Information Appratus and System for Sharing Map-Based User Creating Contents for SNS and Operating Method for the Same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100923387B1 (en) * 2008-02-22 2009-10-23 삼성에스디에스 주식회사 Information sharing system of map upside locate foundation using mobile terminal and method thereof
KR101340206B1 (en) * 2012-02-14 2013-12-10 (주)카카오 Instant messaging service method using location information

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100899819B1 (en) 2006-12-19 2009-05-27 서태섭 Method of providing communicating service using position information of user terminals
US20090254840A1 (en) 2008-04-04 2009-10-08 Yahoo! Inc. Local map chat

Also Published As

Publication number Publication date
KR20150029088A (en) 2015-03-18

Similar Documents

Publication Publication Date Title
US9405455B2 (en) Mobile terminal and controlling method thereof
US10241743B2 (en) Mobile terminal for matching displayed text with recorded external audio and method of controlling the mobile terminal
KR102011457B1 (en) Mobile terminal and control method thereof
KR102040611B1 (en) Mobile terminal and controlling method thereof
US9363359B2 (en) Mobile terminal and method for controlling the same
KR101315957B1 (en) Mobile terminal and control method thereof
KR102080741B1 (en) Mobile terminal and control method thereof
KR102182160B1 (en) Mobile terminal and method for controlling the same
KR101860341B1 (en) Mobile terminal and control method for the same
US9996249B2 (en) Mobile terminal and method of controlling the mobile terminal
KR101071843B1 (en) Mobile terminal and method for controlling the same
US9626083B2 (en) Mobile terminal and controlling method of a locked screen
KR101935039B1 (en) Mobile terminal and method for controlling of the same
US20140325428A1 (en) Mobile terminal and method of controlling the mobile terminal
KR101886753B1 (en) Mobile terminal and control method thereof
KR101510484B1 (en) Mobile Terminal And Method Of Controlling Mobile Terminal
US9563350B2 (en) Mobile terminal and method for controlling the same
KR102029242B1 (en) Method of controling mobile terminal
US9130893B2 (en) Mobile terminal and method for displaying message thereof
KR101726790B1 (en) Mobile terminal and control method for mobile terminal
KR101781852B1 (en) Mobile terminal and method for controlling the same
KR101608532B1 (en) Method for displaying data and mobile terminal thereof
KR101531192B1 (en) Mobile Terminal And Method Of Displaying Map Using Same
US9665268B2 (en) Mobile terminal and control method thereof
EP2672682B1 (en) Mobile terminal and controlling method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right