KR20100105191A - Mobile terminal and information displaying method thereof - Google Patents


Publication number
KR20100105191A
Authority
KR
South Korea
Prior art keywords
channel
information
keyword
mobile terminal
extracted
Prior art date
Application number
KR1020090024089A
Other languages
Korean (ko)
Inventor
김용신
장두이
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020090024089A
Publication of KR20100105191A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/432 Content retrieval operation from a local storage medium, e.g. hard-disk
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/482 End-user interface for program selection
    • H04N21/4823 End-user interface for program selection using a channel name
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/61 Network physical structure; Signal processing
    • H04N21/6106 Network physical structure; Signal processing specially adapted to the downstream path of the transmission network

Abstract

The present invention provides a method of displaying information on a mobile terminal, the method comprising: receiving channel information through a wireless communication unit of the mobile terminal; extracting a first keyword from the received channel information; determining whether the extracted first keyword matches a second keyword stored in the mobile terminal; and, when the first and second keywords match, receiving and displaying the channel corresponding to the channel information from which the first keyword was extracted. According to the mobile terminal and the information displaying method of the present invention, by displaying the channel corresponding to channel information that includes a keyword matching a stored keyword, the information the user wants to obtain can be easily filtered and displayed.

Description

MOBILE TERMINAL AND INFORMATION DISPLAYING METHOD THEREOF

The present invention relates to a mobile terminal and, more particularly, to a mobile terminal and an information displaying method thereof that display a channel corresponding to channel information containing a keyword that matches a stored keyword, so that information the user wants to obtain can be easily filtered and displayed.

As terminals such as personal computers, laptops, and mobile phones diversify in function, they are being implemented as multimedia devices with complex capabilities such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.

Terminals may be divided into mobile terminals and stationary terminals according to their mobility. The mobile terminal may be divided into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.

To support and expand the functions of such terminals, improvements to the structural and/or software components of the terminal may be considered.

Recently, as various terminals including mobile terminals have come to provide complex and varied functions, many of them support a location-based service (LBS) function.

A location-based service refers to any of various information services provided in connection with a measured location, after the location of the mobile terminal has been determined using a mobile communication network or satellite signals. The location information acquired through the LBS function is used in a variety of applications.

Furthermore, various terminals including mobile terminals have recently added functions for acquiring necessary information through a communication network.

An object of the present invention is to provide a mobile terminal and an information displaying method thereof that display a channel corresponding to channel information containing a keyword that matches a stored keyword, so that information the user wants to obtain can be easily filtered and displayed.

The technical problems to be solved by the present invention are not limited to those mentioned above; other technical problems not mentioned here will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, a method of displaying information in a mobile terminal comprises: receiving channel information through a wireless communication unit of the mobile terminal; extracting a first keyword from the received channel information; determining whether the extracted first keyword matches a second keyword stored in the mobile terminal; and, when the first and second keywords match, receiving and displaying the channel corresponding to the channel information from which the first keyword was extracted.
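
The claimed flow can be summarized as a simple filtering loop. The following is an illustrative sketch only, not code from the patent; the names (`ChannelInfo`, `extract_keywords`, `matching_channels`) and the whitespace-based keyword extraction are all assumptions made for the example.

```python
# Hypothetical sketch of the claimed method: receive channel information,
# extract a keyword, compare against keywords stored on the terminal,
# and select the matching channels for display.
from dataclasses import dataclass

@dataclass
class ChannelInfo:
    channel_id: str
    description: str  # text from which the first keyword is extracted

def extract_keywords(info: ChannelInfo) -> set[str]:
    # Trivial stand-in for the patent's keyword-extraction step.
    return {word.lower() for word in info.description.split()}

def matching_channels(channel_infos, stored_keywords):
    """Return channels whose extracted keywords match a stored keyword."""
    stored = {k.lower() for k in stored_keywords}
    return [info for info in channel_infos
            if extract_keywords(info) & stored]

infos = [ChannelInfo("ch7", "breaking news earthquake alert"),
         ChannelInfo("ch9", "evening drama rerun")]
print([c.channel_id for c in matching_channels(infos, ["earthquake"])])
# → ['ch7']
```

In the patent's terms, the stored list plays the role of the "second keyword" in memory, and each extracted description word is a candidate "first keyword".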

In addition, a mobile terminal according to an embodiment of the present invention includes a wireless communication unit; a display unit; and a controller that receives channel information through the wireless communication unit, extracts a first keyword from the received channel information, and, when the extracted first keyword matches a second keyword stored in the memory, receives the channel corresponding to the channel information from which the first keyword was extracted and displays it on the display unit.

The mobile terminal and the information displaying method according to the present invention provide the following effects.

First, by displaying the channel corresponding to channel information that includes a keyword matching a stored keyword, the information the user wants to obtain can be easily filtered and displayed.

Second, by setting priorities for displaying channels, a channel the user is more interested in can be displayed more prominently.

Third, conditions such as the time and place at which information obtained through a channel is displayed can be set, preventing a channel from being received and displayed in situations where it is not needed.

Fourth, a channel containing specific information, such as natural disaster information, can be displayed even when user-set conditions would otherwise suppress it, thereby serving a public interest purpose.
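
The second through fourth effects can be sketched together as one selection routine. This is a hypothetical illustration, not the patent's implementation: the function name, the hour-window condition, and the use of a "disaster" keyword as the public-interest override are all assumptions.

```python
# Hypothetical sketch of the effects above: priority ordering, a user-set
# time condition, and an override for disaster-information channels.
def select_channels(matches, *, now_hour, conditions, priorities):
    """matches: list of (channel_id, keywords) tuples."""
    start, end = conditions["hours"]      # user-set display time window
    in_window = start <= now_hour < end

    selected = []
    for channel_id, keywords in matches:
        is_emergency = "disaster" in keywords  # bypasses user conditions
        if in_window or is_emergency:
            selected.append(channel_id)

    # Higher-priority channels (lower number) are displayed first.
    selected.sort(key=lambda c: priorities.get(c, 99))
    return selected

result = select_channels(
    [("ch7", {"disaster"}), ("ch9", {"sports"})],
    now_hour=2,                      # outside the allowed window
    conditions={"hours": (8, 22)},
    priorities={"ch7": 1, "ch9": 2},
)
print(result)  # → ['ch7']: only the disaster channel bypasses the condition
```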

The above objects, features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. Like numbers refer to like elements throughout. In addition, when it is determined that the detailed description of the known function or configuration related to the present invention may unnecessarily obscure the subject matter of the present invention, the detailed description thereof will be omitted.

Hereinafter, a mobile terminal according to the present invention will be described in more detail with reference to the accompanying drawings. The suffixes "module" and "unit" for components used in the following description are given or used in consideration of ease of specification, and do not have distinct meanings or roles from each other.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), navigation, and the like.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory unit 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, digital broadcast signals can be received using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), media forward link only (MediaFLO), digital video broadcast-handheld (DVB-H), and integrated services digital broadcast-terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may also be adapted to other broadcasting systems that provide broadcast signals, in addition to the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory unit 160.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module 113 is a module for wireless Internet access and may be embedded in the mobile terminal 100 or externally attached. Wireless Internet technologies include wireless LAN (Wi-Fi), wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for confirming or obtaining the location of the mobile terminal. The location information module 115 may obtain location information by using a Global Navigation Satellite System (GNSS), a cell ID method, and a WLAN connection location detection method. Hereinafter, the location information module 115 will be described for each method of obtaining location information.

First, the location information module 115 may obtain location information using a global navigation satellite system (GNSS). Here, GNSS is a term describing radio navigation satellite systems that orbit the earth and transmit signals from which certain types of radio navigation receivers can determine their position on or near the earth's surface. GNSS includes the Global Positioning System (GPS) operated by the United States, Galileo operated by Europe, the Global Orbiting Navigational Satellite System (GLONASS) operated by Russia, COMPASS operated by China, and the Quasi-Zenith Satellite System (QZSS) operated by Japan.

As a representative example of GNSS, the location information module 115 may be a Global Positioning System (GPS) module. The GPS module calculates distance information from three or more satellites to a point (object) and the time at which the distance information was measured, and then applies trigonometry to the calculated distance information to obtain three-dimensional position information for the point in terms of latitude, longitude, and altitude. Furthermore, a method of calculating position and time information using three satellites and then correcting the error of the calculated position and time information using another satellite is also used. The GPS module continuously calculates the current position in real time and uses it to compute velocity information.
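
The geometry behind this can be shown in a reduced form. The sketch below is a two-dimensional analogue of the trilateration described above, assumed for illustration only: real GPS solves in three dimensions and additionally estimates the receiver clock bias, which is why a fourth satellite is used for error correction.

```python
# 2D trilateration sketch: recover (x, y) from distances to three known
# anchor points. Subtracting the circle equations pairwise yields a
# linear 2x2 system, which is solved by Cramer's rule.
import math

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for (x, y) given three anchor points and measured distances."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Anchors at known positions; distances measured to the unknown point (3, 4).
pos = trilaterate((0, 0), (10, 0), (0, 10),
                  5.0, math.sqrt(65), math.sqrt(45))
print(pos)  # → approximately (3.0, 4.0)
```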

Second, the location information module 115 may obtain location information using a cell-ID method. The cell-ID method uses a plurality of base stations 200 of a wireless communication system (for example, see FIG. 5A); representative examples include mobile communication systems using the CDMA, GSM, or WCDMA communication methods. The location may be determined using the identification information (for example, the base station ID) of one or more base stations 200 near the mobile terminal 100 and the strength of the signal received from each base station. A location may be determined from one or two base stations, or more precisely by triangulation using three base stations. That is, different cell-ID methods may be used depending on the number of base stations involved, and the error range decreases as the number of base stations increases.
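
One simple way to combine base-station identity and signal strength, sketched below under stated assumptions: the function name is invented, and the weighted-centroid rule (stronger signal treated as closer) is only one of several cell-ID variants; the patent does not specify a formula.

```python
# Hypothetical cell-ID position estimate: average the known base-station
# positions, weighted by received signal strength.
def cell_id_estimate(stations):
    """stations: list of (x, y, signal_strength) for detected base stations."""
    total = sum(s for _, _, s in stations)
    x = sum(px * s for px, _, s in stations) / total
    y = sum(py * s for _, py, s in stations) / total
    return x, y

# One station: the estimate is simply that cell's location (coarse).
print(cell_id_estimate([(2.0, 3.0, 1.0)]))                 # → (2.0, 3.0)
# Three stations: the weighted estimate narrows the error range.
print(cell_id_estimate([(0, 0, 3.0), (6, 0, 1.0), (0, 6, 2.0)]))  # → (1.0, 2.0)
```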

Third, the location information module 115 may obtain location information using a wireless LAN (WLAN) connection location detection method. FIG. 5B schematically illustrates an example of a WLAN system communicating with the mobile terminal illustrated in FIG. 1. In general, a WLAN system may include a plurality of access points (APs) corresponding to endpoints of a backbone network, and the system can recognize information about the location of each access point. The location information of each access point may be managed per specific area or per access point. The mobile terminal 100 may perform wireless communication with an access point using the wireless LAN method, and may thereby obtain location information about its current position.

Referring to FIG. 1, the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frame may be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory unit 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the terminal.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. The microphone 122 may implement various noise removal algorithms to remove noise generated while receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal, and may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like. The sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its location, the presence or absence of user contact, its orientation, and its acceleration/deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to sight, hearing, or touch, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, it displays a user interface (UI) or graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, it displays photographed and/or received images together with the UI or GUI. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display. Some of these displays may be configured to be transparent or light-transmissive so that the outside can be seen through them; such a display may be called a transparent display, a representative example being a transparent LCD. The rear structure of the display unit 151 may also be light-transmissive, allowing the user to see an object located behind the terminal body through the area occupied by the display unit 151.

There may be two or more display units 151 according to the implementation form of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another, or may be disposed integrally with one another, and may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a touch sensor) form a mutual layer structure (hereinafter, a touch screen), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. As a result, the controller 180 can determine which area of the display unit 151 has been touched.
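
The signal path just described can be sketched as follows. This is an illustrative model, not firmware from the patent; the class names, the region table, and the coordinate ranges are all invented for the example.

```python
# Hypothetical model of the touch path: sensor signal -> touch controller
# -> controller 180, which resolves the touched display region.
from dataclasses import dataclass

@dataclass
class TouchSignal:
    x: int
    y: int
    pressure: float

class TouchController:
    def process(self, signal: TouchSignal) -> dict:
        # Translate the raw sensor signal into coordinate data.
        return {"pos": (signal.x, signal.y), "pressure": signal.pressure}

class Controller180:
    # Illustrative screen regions as (x0, y0, x1, y1) rectangles.
    REGIONS = {"menu": (0, 0, 100, 50), "canvas": (0, 50, 100, 200)}

    def on_touch(self, data: dict) -> str:
        x, y = data["pos"]
        for name, (x0, y0, x1, y1) in self.REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name          # the touched area of the display
        return "none"

ctrl = Controller180()
print(ctrl.on_touch(TouchController().process(TouchSignal(40, 120, 0.7))))
# → canvas
```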

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal or in the vicinity of the touch screen, which is surrounded by the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays. Proximity sensors have a longer life and higher utilization than touch sensors. Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen so that it is recognized without actually contacting the screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen is the position at which the pointer is perpendicular to the touch screen when the proximity touch is made.

The proximity sensor detects a proximity touch and a proximity touch pattern (eg, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory unit 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 outputs an acoustic signal related to a function (e.g., a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100. The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying occurrence of an event of the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. The video signal or the audio signal can also be output through the display unit 151 or the audio output module 152.

The haptic module 154 generates various tactile effects that the user can feel; vibration is a representative example. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled; for example, different vibrations may be combined and output, or output sequentially. In addition to vibration, the haptic module 154 may generate various other tactile effects, such as stimulation by an arrangement of pins moving vertically against the skin surface, the jet force or suction force of air through a nozzle or inlet, brushing against the skin surface, contact with an electrode, electrostatic force, and the sensation of warmth or coolness reproduced using a heat-absorbing or heat-generating element.

The haptic module 154 may not only deliver the haptic effect through direct contact, but also implement the haptic effect through the muscle sense of the user's finger or arm. Two or more haptic modules 154 may be provided according to a configuration aspect of the mobile terminal 100.

The memory unit 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.). The memory unit 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.

The memory unit 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory unit 160 over the Internet.

The interface unit 170 serves as a passage to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and delivers it to the components inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like. The identification module is a chip that stores various types of information for authenticating the usage authority of the mobile terminal 100, and includes a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device having an identification module (hereinafter, an "identification device") may be manufactured in a smart card format, and may therefore be connected to the terminal 100 through a port. When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or through which various command signals input by the user at the cradle are delivered to the mobile terminal. Such command signals or power input from the cradle may also operate as signals for recognizing that the mobile terminal has been correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal; for example, it performs the control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback, which may be implemented within the controller 180 or separately from it. The controller 180 may also perform a pattern recognition process capable of recognizing handwriting input and drawing input on the touch screen as text and images, respectively.

The power supply unit 190 receives external and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof. For a hardware implementation, the embodiments described may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions; in some cases, such embodiments may be implemented by the controller 180.

For a software implementation, embodiments such as procedures or functions may be implemented with separate software modules, each of which performs at least one function or operation. The software code may be implemented by a software application written in a suitable programming language, stored in the memory unit 160, and executed by the controller 180.

2A is a front perspective view of an example of a mobile terminal or a portable terminal according to the present invention.

The disclosed portable terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto and may be applied to various structures, such as slide, folder, swing, and swivel types, in which two or more bodies are coupled so as to be relatively movable.

The body includes a case (casing, housing, cover, etc.) that forms the exterior. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are built into the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection-molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti). The display unit 151, the sound output unit 152, the camera 121, the user input units 131 and 132, the microphone 122, the interface 170, and the like may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like are disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132.

The manipulation units 131 and 132 may be collectively referred to as a manipulating portion, and may employ any manner in which the user operates them with a tactile feeling.

Content input by the first or second manipulation units 116 and 117 may be set in various ways. For example, the first manipulation unit 116 may receive commands such as start, end, and scroll, and the second manipulation unit 117 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121 ′ may be additionally mounted on the rear of the terminal body, that is, the rear case 102. The camera 121 'may have a photographing direction substantially opposite to the camera 121 (see FIG. 2A), and may be a camera having different pixels from the camera 121.

For example, the camera 121 preferably has a low pixel count, since in a video call it captures the user's face and transmits the image to the counterpart, whereas the camera 121' preferably has a high pixel count, since it typically captures a general subject that is not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

A flash 123 and a mirror 124 may be further disposed adjacent to the camera 121'. The flash 123 shines light toward the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to see his or her own face when taking a self-portrait with the camera 121'.

The sound output unit 152 'may be further disposed on the rear surface of the terminal body. The sound output unit 152 ′ may implement a stereo function together with the sound output unit 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

In addition to an antenna for calls and the like, a broadcast signal reception antenna 124 may be additionally disposed on the side of the terminal body. The antenna 124, constituting a part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be pulled out of the terminal body. A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be embedded in the terminal body or may be directly detachable from the outside of the terminal body.

The rear case 102 may be further equipped with a touch pad 135 for sensing a touch. Like the display unit 151, the touch pad 135 may also be configured to be light-transmissive. In this case, if the display unit 151 is configured to output visual information from both sides, the visual information may also be recognized through the touch pad 135, and the information output on both surfaces may be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135, so that a touch screen may be disposed on the rear case 102 as well.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to the rear of the display unit 151, and may have a size equal to or smaller than that of the display unit 151.

Hereinafter, the operation of the display unit 151 and the touch pad 135 will be described with reference to FIGS. 3A and 3B.

3A and 3B are front views of a portable terminal for explaining an operation state of the portable terminal according to the present invention.

Various types of visual information may be displayed on the display unit 151. This information may be displayed in the form of characters, numbers, symbols, graphics, or icons.

In order to input such information, at least one of the letters, numbers, symbols, graphics, or icons may be displayed in a predetermined arrangement so as to be implemented in the form of a keypad. Such a keypad may be called a so-called " soft key ".

3A illustrates receiving a touch applied to a softkey through the front of the terminal body.

The display unit 151 may operate as a single whole area or may be divided into a plurality of areas and operated. In the latter case, the plurality of areas may be configured to operate in association with each other.

For example, an output window 151a and an input window 151b are displayed on the upper and lower portions of the display unit 151, respectively. In the input window 151b, a soft key 151c displaying a number for inputting a telephone number or the like is output. When the softkey 151c is touched, a number or the like corresponding to the touched softkey is displayed on the output window 151a. When the first manipulation unit 116 is operated, a call connection to the telephone number displayed on the output window 151a is attempted.

3B illustrates receiving a touch applied to a softkey through the rear of the terminal body. If FIG. 3A is a portrait in which the terminal body is arranged vertically, FIG. 3B illustrates a landscape in which the terminal body is arranged horizontally. The display unit 151 may be configured to convert the output screen according to the arrangement direction of the terminal body.

3B shows that the text input mode is activated in the mobile terminal. The display unit 151 displays the output window 135a and the input window 135b. In the input window 135b, a plurality of softkeys 135c on which at least one of letters, symbols, and numbers are displayed may be arranged. The softkeys 135c may be arranged in the form of a QWERTY key.

When the soft keys 135c are touched through the touch pad 135, letters, numbers, symbols, etc. corresponding to the touched soft keys are displayed on the output window 135a. As described above, the touch input through the touch pad 135 has an advantage of preventing the softkey 135c from being blocked by the finger when touched, as compared with the touch input through the display unit 151. When the display unit 151 and the touch pad 135 are transparent, the fingers located at the rear of the terminal body can be visually checked, and thus more accurate touch input is possible.

In addition to the input methods disclosed in the above embodiments, the display unit 151 or the touch pad 135 may be configured to receive a touch input by scrolling. By scrolling the display unit 151 or the touch pad 135, the user may move an object displayed on the display unit 151, for example, a cursor or a pointer located on an icon. Further, when a finger is moved on the display unit 151 or the touch pad 135, the path along which the finger moves may be visually displayed on the display unit 151. This may be useful for editing an image displayed on the display unit 151.

One function of the terminal may be executed when the display unit 151 (touch screen) and the touch pad 135 are touched together within a predetermined time range. Such a simultaneous touch may occur when the user clamps the terminal body between the thumb and the index finger. The function may be, for example, activation or deactivation of the display unit 151 or the touch pad 135.

The proximity sensor 141 described with reference to FIG. 1 will be described in more detail with reference to FIG. 4.

4 is a conceptual diagram illustrating a proximity depth of a proximity sensor.

As shown in FIG. 4, when a pointer such as a user's finger is close to the touch screen, the proximity sensor 141 disposed in or near the touch screen detects this and outputs a proximity signal.

The proximity sensor 141 may be configured to output different proximity signals according to a distance between the proximity touched pointer and the touch screen (hereinafter, referred to as “proximity depth”).

The distance at which the proximity signal is output when the pointer approaches the touch screen is referred to as a detection distance. The proximity depth can be determined by using a plurality of proximity sensors having different detection distances and comparing the proximity signals output from the respective sensors.

In FIG. 4, for example, a cross section of a touch screen on which three proximity sensors capable of sensing three proximity depths are disposed is illustrated. Of course, proximity sensors capable of sensing fewer than three, or four or more, proximity depths may also be used.

In detail, when the pointer is completely in contact with the touch screen (d0), it is recognized as a contact touch. When the pointer is positioned within a distance of less than d1 from the touch screen, it is recognized as a proximity touch of a first proximity depth. When the pointer is spaced apart by a distance of d1 or more and less than d2, it is recognized as a proximity touch of a second proximity depth. When the pointer is spaced apart by a distance of d2 or more and less than d3, it is recognized as a proximity touch of a third proximity depth. When the pointer is located at a distance of d3 or more from the touch screen, the proximity touch is recognized as released.
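The depth-classification rule above can be sketched in code. This is an illustrative sketch only; the concrete threshold values and return labels are assumptions, since the text does not fix actual distances.

```python
# Sketch of the proximity-depth classification described above.
# The thresholds D1 < D2 < D3 (here in cm) are illustrative
# assumptions, not values specified in the disclosure.
D1, D2, D3 = 1.0, 2.0, 3.0  # detection-distance thresholds

def proximity_depth(distance_cm: float) -> str:
    """Map a pointer-to-screen distance to the recognized input."""
    if distance_cm <= 0.0:             # d0: full contact
        return "contact touch"
    if distance_cm < D1:               # less than d1
        return "proximity touch, depth 1"
    if distance_cm < D2:               # d1 <= distance < d2
        return "proximity touch, depth 2"
    if distance_cm < D3:               # d2 <= distance < d3
        return "proximity touch, depth 3"
    return "proximity touch released"  # at or beyond d3
```

A controller implementing this mapping can then dispatch a different operation per depth, as described for the controller 180.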

Accordingly, the controller 180 may recognize the proximity touch as various input signals according to the proximity distance and proximity position of the pointer with respect to the touch screen, and may perform various operation controls according to the various input signals.

Meanwhile, the mobile terminal 100 shown in FIG. 1 may be configured to operate in communication systems that transmit data through frames or packets, including wireless/wired communication systems and satellite communication systems. Such communication systems may use different air interfaces and/or physical layers. FIG. 5A is a block diagram of a CDMA wireless communication system in communication with the mobile terminal shown in FIG. 1.

Examples of the air interfaces used by the communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE) of UMTS, and the Global System for Mobile communications (GSM). In the following description, a CDMA communication system is used for convenience, but the present invention is not limited thereto and may be applied to other system types.

Referring to FIG. 5A, a CDMA wireless communication system includes a plurality of mobile terminals 100, a plurality of base stations 200, base station controllers (BSCs) 210, and mobile switching centers (MSCs) 220. The mobile switching center 220 is configured to be connected to a public switched telephone network (PSTN) 230, and is also configured to be connected with the base station controllers 210. The base station controllers 210 are connected to the base stations 200 via backhaul lines. The backhaul lines may be configured according to E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL, as known to those skilled in the art. It will be apparent to those skilled in the art that the system may include two or more base station controllers 210.

Each of the base stations 200 may comprise one or more sectors, each sector including an omnidirectional antenna or an antenna tuned to a specific radial direction from the base station 200. Alternatively, each sector may include two antennas for diversity reception. Each base station 200 is configured to accommodate a plurality of frequency assignments, each of which may have a specific spectrum (e.g., 1.25 MHz, 5 MHz).

The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The base stations 200 may also be referred to as Base Station Transceiver Subsystems (BTSs). In some cases, the term "base station" may be used to collectively refer to a base station controller 210 and one or more base stations 200. The base stations may also be called "cell sites". Alternatively, individual sectors of a given base station 200 may be referred to as cell sites.

The terrestrial DMB transmitter 240 may broadcast to mobile terminals 100 operating in the system. The broadcast receiving module 111 of the mobile terminal 100 is generally configured to receive broadcast signals transmitted by the DMB transmitter 240. This may be similarly applied to other types of broadcast and multicast signaling as described above.

5A illustrates various Global Positioning System (GPS) satellites 250. Such satellites 250 may track the location of some or all mobile terminals 100. Although two satellites are shown, it is apparent to those skilled in the art that the location information can be obtained from more or fewer satellites. Other types of location techniques (eg, location techniques that can be used in place of or in addition to GPS technology) can be used. If desired, some or all of the GPS satellites 250 may be configured to support satellite DMB transmission separately or additionally.

During operation of the wireless communication system, the base stations 200 receive reverse-link signals from several mobile terminals 100. The mobile terminals 100 may be in a call, sending a message, or performing another communication. Each reverse-link signal received by a base station is processed within that base station, and the resulting data is transmitted to the connected base station controller 210. The base station controller 210 provides call resource allocation and mobility management functionality, including the orchestration of soft handoffs between the base stations 200. The base station controllers 210 also send the received data to the mobile switching centers (MSCs) 220, which provide additional routing services for interfacing with the public switched telephone network (PSTN) 230. Similarly, the PSTN 230 interfaces with the mobile switching center 220, the mobile switching center interfaces with the base station controllers 210, and the base station controllers 210 in turn control the base stations 200 to transmit forward-link signals to the mobile terminals 100.

Hereinafter, embodiments of the present invention will be described. In the present invention, for convenience of description, it is assumed that the display unit 151 is a touch screen 151. As described above, the touch screen 151 may perform both an information display function and an information input function. However, it should be clear that the present invention is not limited thereto. In addition, the touch described below may include both a proximity touch and a direct touch.

In addition, the memory 160 may store a map. The map may be stored in the memory 160 in advance or received from the outside through the wireless communication unit 110. When the map is received from the outside, it may be stored in the memory 160 permanently or temporarily.

6A and 6B are flowcharts illustrating an operation process of a mobile terminal according to an embodiment of the present invention. Hereinafter, specific operations on the flowcharts shown in FIGS. 6A and 6B will be described with reference to the related figures.

As shown in these figures, the operation of the mobile terminal 100 according to an embodiment of the present invention may include a step (S10) of receiving channel information (CIA, CIB, CIC of FIG. 8).

The channel information (CIA, CIB, CIC in FIG. 8) is included in the header or tag of each channel (CN in FIG. 8), and is a portion indicating the attribute of each channel (CN in FIG. 8). The attributes of each channel (CN of FIG. 8) will be described in detail in the corresponding sections. The channel information (CIA, CIB, CIC of FIG. 8) may be obtained from a wireless network to which the mobile terminal 100 currently belongs, as shown in FIG. 7.

FIG. 7 is a diagram illustrating a relationship between a location of a mobile terminal and a wireless network according to FIGS. 6A and 6B.

Each of the wireless networks A, B, and C may use a short-range communication technology such as Bluetooth, Shared Wireless Access Protocol (SWAP), infrared communication (IrDA), Radio Frequency Identification (RFID), or ZigBee. Each of the short-range wireless networks A, B, and C may communicate with the short-range communication module 114 included in the mobile terminal 100.

Each of the wireless networks A, B, and C may cover a different area, and these areas may overlap one another in certain regions. Specifically, the second and third wireless networks B and C may exist within the region of the first wireless network A, which covers a relatively wide area. In this case, the wireless networks A, B, and C with which the mobile terminal 100 can communicate through the short-range communication module 114 may differ according to the location of the mobile terminal 100.

That is, when the mobile terminal 100 is located at the first position P1, it belongs only to the area of the first wireless network A and thus can receive only the first channel (CNA of FIG. 8) transmitted by the first wireless network A. However, when the mobile terminal 100 moves to the second position P2, it can receive the first and second channels (CNA and CNB of FIG. 8) originating from the first and second wireless networks A and B, and when it moves to the third position P3, it can receive all of the first, second, and third channels (CNA, CNB, CNC of FIG. 8). Meanwhile, although FIG. 7 illustrates the case where there are three wireless networks A, B, and C, the number of wireless networks is not limited thereto.

8 is a diagram schematically showing a channel according to an embodiment of the present invention, and FIGS. 9A to 9C are views showing a more specific configuration of each channel shown in FIG. 8. As shown in these figures, each channel CN may include channel information (CIA, CIB, CIC) and a channel body (CBA, CBB, CBC).

The channel information (CIA, CIB, CIC) is an area indicating the attribute of each channel CN. That is, it indicates to which field the information contained in each channel CN belongs. For example, the channel information (CIA, CIB, CIC) may indicate what kind of information the channel CN contains, such as weather information, traffic information, shopping information, advertising information, natural disaster information, lost-child information, crime occurrence information, or fire occurrence information.

When the first channel CNA is related to weather information, as shown in FIG. 9A, the first channel information CIA may include information for extracting a 'first keyword' of 'weather'. The text 'weather' may be directly included in the first channel information CIA, or the first channel information CIA may indicate, using a predetermined symbol or number, that the channel contains information related to 'weather'. When the second channel CNB is related to traffic information, as shown in FIG. 9B, the second channel information CIB may include information for extracting a first keyword of 'traffic'. Likewise, as illustrated in FIG. 9C, the third channel information CIC may include information for extracting a first keyword of 'shopping'. FIGS. 9A to 9C illustrate a single first keyword per channel for convenience of description, but two or more related first keywords may be included.

The channel bodies (CBA, CBB, CBC) are the areas that substantially contain the information. Specifically, the first channel body CBA of the first channel CNA may include at least one piece of content information CTI and a content CT corresponding to each piece of content information CTI. That is, as shown in FIG. 9A, when the first channel CNA is a channel related to weather, the content information CTI included in the first channel body CBA may be 'Seoul', 'Suwon', and the like, each of which may be a 'third keyword'. Likewise, as shown in FIG. 9B, if the second channel CNB is a channel related to traffic, the content information CTI included in the second channel body CBB may be 'Sadang', 'Seocho', and the like, each of which may be a 'third keyword'. In addition, as shown in FIG. 9C, if the third channel CNC is a channel related to shopping, the content information CTI included in the third channel body CBC may be 'homepage', 'discount coupon', and the like, each of which may be a 'third keyword'.
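The channel structure described above, channel information plus a channel body of content-information/content pairs, can be sketched as a simple data model. All class and field names here are illustrative assumptions, not part of the disclosed format.

```python
# Illustrative data model for a channel (CN): the channel
# information (CIA/CIB/CIC) carries the first keyword(s), and the
# channel body (CBA/CBB/CBC) holds content items, each tagged with
# content information (CTI) from which third keywords are extracted.
from dataclasses import dataclass

@dataclass
class Content:
    info: str     # content information (CTI), e.g. "discount coupon"
    payload: str  # the content (CT) itself

@dataclass
class Channel:
    info: list[str]      # channel information -> first keywords
    body: list[Content]  # channel body -> content items

    def first_keywords(self) -> list[str]:
        return self.info

    def third_keywords(self) -> list[str]:
        return [c.info for c in self.body]

# Example mirroring the shopping channel of FIG. 9C
shopping = Channel(
    info=["shopping"],
    body=[Content("homepage", "http://example.com"),
          Content("discount coupon", "10% off")],
)
```

The keyword-matching steps described below then operate on the lists returned by `first_keywords()` and `third_keywords()`.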

When the channel information (CIA, CIB, CIC) of the channels (CN) transmitted from the wireless networks (A, B, C) with which the mobile terminal 100 can currently communicate is received in step S10, a step (S20) of extracting the first keyword from the channel information (CIA, CIB, CIC) may proceed.

As described above, the first keyword is information such as "weather", "traffic", "shopping", etc. extracted from the channel information CIA, CIB, CIC indicating the attribute of each channel CN.

When the first keyword is extracted, the step S30 of determining whether the channel display condition is satisfied may proceed.

The user may wish that channels not be displayed at certain times, such as during a meeting or at bedtime, or at specific places, such as the office or school. The condition that sets a channel not to be displayed at a specific time and place in this way is the channel display condition. As shown in FIG. 10, the display unit 151 may display a channel search time setting menu M1 and a channel search place setting menu M2. Accordingly, the user can select the times and places at which the channel search is to be performed.

If the channel display condition is not satisfied, step S40 may be performed to determine whether the received channel information corresponds to a specific channel.

A channel carrying information of great public interest, such as a natural disaster information channel announcing the occurrence of a natural disaster, a lost-child information channel for a specific region, a crime occurrence information channel announcing that a crime has occurred, or a fire occurrence information channel announcing that a fire has occurred, may be displayed even if it does not satisfy the channel display condition set by the user. That is, even when the user has set the channel search time from 10:00 to 18:00 as shown in FIG. 10, information on a natural disaster information channel received at 9:30 may still be displayed, as shown in FIG. 11.
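The decision at steps S30 and S40 (display when the user's channel display condition is met, or unconditionally for a public-interest channel) can be sketched as follows. The keyword set and the time window are illustrative assumptions taken from the example above.

```python
# Sketch of the display decision: a channel passes when its first
# keyword marks a public-interest channel (S40), or when the
# user-set channel search time window is satisfied (S30).
PUBLIC_INTEREST = {"natural disaster", "lost child", "crime", "fire"}

def should_display(first_keyword: str, hour: int,
                   start: int = 10, end: int = 18) -> bool:
    if first_keyword in PUBLIC_INTEREST:  # S40: specific channel
        return True
    return start <= hour < end            # S30: display condition
```

With the 10:00-18:00 window of FIG. 10, a shopping channel received at 9:30 is suppressed, while a natural disaster channel received at the same time is still shown.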

When the channel display condition is satisfied or the received channel information corresponds to a specific channel, a step (S50) of determining whether the extracted first keyword and a second keyword stored in the mobile terminal 100 are identical to each other may be performed.

The first keyword is information such as "weather", "traffic", "shopping", etc. extracted from channel information (CIA, CIB, CIC) indicating the attributes of each channel CN.

The second keyword is a keyword previously input and stored by the user; that is, it represents the information the user wants to obtain when searching for channels. A plurality of second keywords may be input, and they may be input in a hierarchical form of higher-level and lower-level concepts as necessary. For example, 'weather' may be input as a higher-level concept and a specific region as a lower-level concept.

As shown in FIG. 12A, a first keyword extracted from channel information CIA, CIB, and CIC received in a specific region may be 'weather', 'traffic', or 'shopping'. In addition, the second keyword entered by the user may be 'shopping' or 'discount coupon'. In this case, when comparing the first keyword and the second keyword, it can be seen that the keywords 'shopping' coincide with each other.
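The comparison in step S50 amounts to intersecting the two keyword sets. A minimal sketch using the keywords from the example above:

```python
# Sketch of step S50: intersect the first keywords extracted from
# received channel information with the user's stored second
# keywords; a non-empty intersection selects channels to receive.
first_keywords = {"weather", "traffic", "shopping"}  # from CIA/CIB/CIC
second_keywords = {"shopping", "discount coupon"}    # stored by the user

matches = first_keywords & second_keywords  # {'shopping'}
```

Any channel whose channel information yielded a keyword in `matches` is then received and displayed in step S60.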

When there are keywords that match each other by comparing the first and second keywords, a step (S60) of receiving and displaying a channel corresponding to the channel information from which the first keyword is extracted may be performed. On the other hand, as shown in Figure 6b, step S60 may include a step (S62) for extracting a third keyword from the content information (CTI).

As described above, the content information CTI may include specific keywords for the content CT included in each channel CN. Referring to FIG. 9C, the first keyword that may be extracted from the channel information CIC of the third channel CNC may be 'shopping'. Meanwhile, the third channel CNC may include a content CT containing a website address and a content CT containing a discount coupon. The header or tag portion of each content CT may include content information CTI indicating the characteristics of the information contained in that content CT, and the third keywords extracted from the content information CTI may be 'homepage' or 'discount coupon'.

When the third keyword is extracted, a step S64 may be performed to determine whether the third keyword and the second keyword coincide with each other.

That is, as shown in FIG. 12B, when the extracted third keywords are 'homepage' and 'discount coupon' and the stored second keywords are 'shopping' and 'discount coupon', comparing them shows that the keyword 'discount coupon' coincides.
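Steps S62 through S66 can likewise be sketched as filtering content items by their content-information keyword. The content payloads here are illustrative assumptions.

```python
# Sketch of steps S62-S66: compare third keywords (from content
# information) with the stored second keywords, and keep only the
# matching content for display.
contents = {"homepage": "http://example.com",
            "discount coupon": "10% off coupon"}   # CTI -> CT
second_keywords = {"shopping", "discount coupon"}  # stored by the user

to_display = {k: v for k, v in contents.items() if k in second_keywords}
```

Only the content whose content information matched a second keyword (the discount coupon in this example) is shown in step S66.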

If the third keyword and the second keyword coincide with each other, the step S66 of displaying the content corresponding to the third keyword may be performed.

When, following the first keyword, the third keyword also coincides with the second keyword input by the user, the content CT of the channel CN is likely to be information the user wants to receive. Accordingly, as shown in FIG. 13, the content CT may be displayed on the display unit 151 so that the user can recognize and use the information.

14 is a flowchart illustrating a process of setting priorities of keywords in a mobile terminal according to an embodiment of the present invention, and FIGS. 15A and 15B are diagrams illustrating operations of the mobile terminal according to FIG. 14.

As shown in these figures, the process of setting priorities of the second keywords previously input to the mobile terminal may include a step (S72) of displaying the input second keywords according to their current priority.

That is, as illustrated in FIG. 15A, the second keywords previously input by the user may be displayed on the display unit 151. Each of the second keywords, such as 'natural disaster', 'weather', 'traffic', and 'discount coupon', may be displayed together with its current priority.

When the second keywords are displayed, a step (S74) of receiving a user's manipulation signal for each second keyword and a step (S76) of changing the priority of the second keywords according to the input manipulation signal may be performed.

The manipulation signal may be input through the various user input units 130 provided in the mobile terminal 100; specifically, it may be input through a user's touch operation as shown in FIG. 15A. That is, the user selects a second keyword whose priority is to be changed and drags it upward or downward, thereby changing the priority of the second keyword as shown in FIG. 15B.

When the change of priority is finished, a step (S78) of storing the second keywords based on the changed priority may be performed. Once the priority is set, channels may subsequently be searched for and displayed according to that priority.
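The drag-based reordering of steps S72 through S78 can be sketched as moving an entry within an ordered keyword list. The index positions below are illustrative.

```python
# Sketch of steps S72-S78: the stored second keywords form an
# ordered list (S72); a drag (S74) moves one entry to a new
# position (S76); the reordered list is then stored (S78).
def move_keyword(keywords: list[str], src: int, dst: int) -> list[str]:
    """Return a copy of the list with the entry at src moved to dst."""
    reordered = keywords.copy()
    reordered.insert(dst, reordered.pop(src))
    return reordered

priorities = ["natural disaster", "weather", "traffic", "discount coupon"]
# e.g. dragging "discount coupon" from position 3 up to position 1
updated = move_keyword(priorities, 3, 1)
```

The resulting order is what the channel search of FIG. 16 would use when deciding which channels to place in the most visible positions.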

FIG. 16 illustrates a case where channels are searched for and displayed based on the priority set in FIGS. 15A and 15B. That is, the channels may be displayed according to the priority set in FIG. 15B, so that information of interest to the user is displayed in a more visible position. In addition, when the priority of a channel is higher than a predetermined standard, the user's attention may be drawn by repeatedly displaying the channel or by generating a sound.

The information displaying method of a mobile terminal according to the present invention described above may be recorded on a computer-readable recording medium as a program to be executed on a computer, and may be implemented through software. When implemented in software, the constituent means of the present invention are code segments that perform the necessary work. The program or code segments may be stored on a processor-readable medium, or transmitted as a computer data signal combined with a carrier wave over a transmission medium or network.

A computer-readable recording medium includes all kinds of recording devices that store data readable by a computer system. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, DVD±ROM, DVD-RAM, magnetic tape, floppy disks, hard disks, and optical data storage devices. The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner.

As described above, the present invention is not limited to the described embodiments, and various modifications and changes can be made without departing from the spirit and scope of the present invention. Accordingly, such modifications or variations are intended to fall within the scope of the appended claims.

1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

2A is a front perspective view of a mobile terminal according to an embodiment of the present invention.

2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention.

3A and 3B are front views of a portable terminal for explaining an operation state of the portable terminal according to the present invention.

4 is a conceptual diagram illustrating a proximity depth of a proximity sensor.

5A is a block diagram of a CDMA wireless communication system in communication with the mobile terminal shown in FIG. 1.

FIG. 5B is a diagram schematically illustrating an example of a wireless LAN system communicating with the mobile terminal illustrated in FIG. 1.

6A and 6B are flowcharts illustrating an operation process of a mobile terminal according to an embodiment of the present invention.

FIG. 7 is a diagram illustrating a relationship between a location of a mobile terminal and a wireless network according to FIGS. 6A and 6B.

8 is a diagram schematically illustrating a channel according to an embodiment of the present invention.

9A to 9C are diagrams illustrating a more specific configuration of each channel shown in FIG. 8.

FIG. 10 is a diagram illustrating a state of setting channel display conditions in the mobile terminal according to FIGS. 6A and 6B.

FIG. 11 is a view illustrating a natural disaster information channel displayed in the mobile terminal according to FIGS. 6A and 6B.

12A and 12B illustrate a comparison process between keywords acquired in a channel and keywords stored in a mobile terminal according to an embodiment of the present invention.

FIG. 13 is a diagram illustrating a state in which a channel is received and displayed in the mobile terminal according to FIGS. 6A and 6B.

14 is a flowchart illustrating a process of setting priorities of keywords in a mobile terminal according to an embodiment of the present invention.

15A and 15B illustrate an operation of the mobile terminal according to FIG. 14.

FIG. 16 is a diagram illustrating a state in which a searched channel is displayed according to the keyword priority set according to FIG. 14.

Claims (20)

1. An information display method of a mobile terminal, the method comprising: receiving channel information through a wireless communication unit of the mobile terminal; extracting a first keyword from the received channel information; determining whether the extracted first keyword matches a second keyword stored in the mobile terminal; and, when the first and second keywords match, receiving and displaying a channel corresponding to the channel information from which the first keyword was extracted.

2. The method of claim 1, wherein the channel further includes at least one content item and content information indicating a property of each content item, and wherein receiving and displaying the channel comprises: when the first and second keywords match, extracting a third keyword from the content information included in the channel corresponding to the channel information from which the first keyword was extracted; determining whether the extracted third keyword matches the stored second keyword; and, when the second and third keywords match, displaying the content corresponding to the content information from which the third keyword was extracted.

3. The method of claim 2, wherein receiving and displaying the channel comprises: determining a priority held by the second keyword; and displaying a channel corresponding to a second keyword having a higher priority ahead of a channel corresponding to a second keyword having a lower priority.

4. The method of claim 2, wherein receiving and displaying the channel comprises: determining a priority held by the second keyword; and repeatedly displaying a channel corresponding to a second keyword whose priority is higher than a set criterion.

5. The method of claim 1, further comprising, before determining whether the first and second keywords match, determining whether a channel display condition stored in the mobile terminal is satisfied, wherein determining whether the keywords match is performed when the channel display condition is satisfied.

6. The method of claim 5, wherein the channel display condition includes at least one of a displayable time range of the channel and a displayable area range of the channel.

7. The method of claim 5, wherein, when the received channel information corresponds to a specific channel, determining whether the keywords match is performed even if the channel display condition is not satisfied.

8. The method of claim 7, wherein the specific channel is at least one of a natural disaster information channel, a lost child information channel, a crime occurrence information channel, and a fire occurrence information channel.

9. The method of claim 1, wherein the channel is at least one of a weather information channel, a traffic information channel, an advertisement information channel, a shopping information channel, a natural disaster information channel, a lost child information channel, a crime occurrence information channel, and a fire occurrence information channel.

10. The method of claim 1, wherein the wireless communication unit is a short-range communication module comprising at least one of a Bluetooth module, a SWAP (Shared Wireless Access Protocol) module, and an infrared (IrDA) communication module.

11. A mobile terminal comprising: a wireless communication unit; a display unit; a memory; and a controller configured to receive channel information through the wireless communication unit, extract a first keyword from the received channel information, and, when the extracted first keyword matches a second keyword stored in the memory, receive a channel corresponding to the channel information from which the first keyword was extracted and display the channel on the display unit.

12. The mobile terminal of claim 11, wherein the channel further includes at least one content item and content information indicating a property of each content item, and wherein the controller is configured to, when the first and second keywords match, extract a third keyword from the content information included in the channel corresponding to the extracted channel information and, when the extracted third keyword matches the stored second keyword, display on the display unit the content corresponding to the content information from which the third keyword was extracted.

13. The mobile terminal of claim 12, wherein the controller is configured to determine a priority held by the second keyword and display the channel corresponding to a second keyword having a higher priority on the display unit ahead of the channel corresponding to a second keyword having a lower priority.

14. The mobile terminal of claim 12, wherein the controller is configured to determine the priority held by the second keyword and repeatedly display on the display unit the channel corresponding to a second keyword whose priority is higher than a set criterion.

15. The mobile terminal of claim 11, wherein the controller is configured to determine whether the first and second keywords match when a channel display condition stored in the memory is satisfied.

16. The mobile terminal of claim 15, wherein the channel display condition includes at least one of a displayable time range of the channel and a displayable area range of the channel.

17. The mobile terminal of claim 15, wherein the controller is configured to determine whether the first and second keywords match even if the channel display condition is not satisfied, when the received channel information corresponds to a specific channel.

18. The mobile terminal of claim 17, wherein the specific channel is at least one of a natural disaster information channel, a lost child information channel, a crime occurrence information channel, and a fire occurrence information channel.

19. The mobile terminal of claim 11, wherein the channel is at least one of a weather information channel, a traffic information channel, an advertisement information channel, a shopping information channel, a natural disaster information channel, a lost child information channel, a crime occurrence information channel, and a fire occurrence information channel.

20. The mobile terminal of claim 11, wherein the wireless communication unit comprises at least one of a Bluetooth module, a SWAP module, and an infrared (IrDA) communication module.
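The claimed method amounts to a small filtering pipeline: check a stored channel display condition (with an override for emergency channels), match a keyword extracted from the received channel information against the terminal's stored keywords, and order matching channels by keyword priority. The sketch below is a hypothetical illustration of that flow, not the patent's implementation; all names, data shapes, and the dictionary-based channel representation are assumptions.

```python
# Hypothetical sketch of the claimed keyword-matching flow (claims 1 and 3-8).
# Channel information is modeled as a plain dict; all keys are illustrative.

EMERGENCY_CHANNELS = {"natural_disaster", "lost_child", "crime", "fire"}

def extract_keyword(channel_info):
    """Claim 1: extract a 'first keyword' from received channel information."""
    return channel_info.get("keyword")

def condition_satisfied(channel_info, now_hour, region):
    """Claims 5-6: a display condition limits displayable time and area ranges."""
    cond = channel_info.get("display_condition", {})
    start, end = cond.get("hours", (0, 24))
    areas = cond.get("areas")  # None means no area restriction
    return start <= now_hour < end and (areas is None or region in areas)

def should_display(channel_info, stored_keywords, now_hour, region):
    """Claims 1, 5, 7: match keywords; emergency channels skip the condition."""
    if channel_info.get("type") not in EMERGENCY_CHANNELS:
        if not condition_satisfied(channel_info, now_hour, region):
            return False
    return extract_keyword(channel_info) in stored_keywords

def order_channels(channel_infos, keyword_priority, now_hour, region):
    """Claim 3: display channels with higher-priority keywords first
    (lower number = higher priority in this sketch)."""
    matched = [ci for ci in channel_infos
               if should_display(ci, set(keyword_priority), now_hour, region)]
    return sorted(matched, key=lambda ci: keyword_priority[extract_keyword(ci)])
```

In use, a terminal would keep `keyword_priority` in its memory (claim 14's "set criterion" would be a threshold on the same values) and run `order_channels` over each batch of received channel information before deciding what to render.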
KR1020090024089A 2009-03-20 2009-03-20 Mobile terminal and information displaying method thereof KR20100105191A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090024089A KR20100105191A (en) 2009-03-20 2009-03-20 Mobile terminal and information displaying method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020090024089A KR20100105191A (en) 2009-03-20 2009-03-20 Mobile terminal and information displaying method thereof

Publications (1)

Publication Number Publication Date
KR20100105191A true KR20100105191A (en) 2010-09-29

Family

ID=43009357

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090024089A KR20100105191A (en) 2009-03-20 2009-03-20 Mobile terminal and information displaying method thereof

Country Status (1)

Country Link
KR (1) KR20100105191A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012102507A2 (en) * 2011-01-28 2012-08-02 건아정보기술 주식회사 Motion-recognizing customized advertising system
WO2012102507A3 (en) * 2011-01-28 2012-11-22 건아정보기술 주식회사 Motion-recognizing customized advertising system


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination