KR20130000280A - Mobile device and control method for the same - Google Patents

Mobile device and control method for the same

Info

Publication number
KR20130000280A
Authority
KR
South Korea
Prior art keywords
area
information
displayed
page
search
Prior art date
Application number
KR1020110060878A
Other languages
Korean (ko)
Inventor
박도영
박아현
지현호
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020110060878A priority Critical patent/KR20130000280A/en
Publication of KR20130000280A publication Critical patent/KR20130000280A/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PURPOSE: A portable electronic device and a control method thereof are provided to search for information related to an object included in a set area of a displayed page, thereby supplying a search function together with page display.

CONSTITUTION: A display unit (151) includes first and second areas and displays a page in the first area; an area is set on the page. A control unit (180) searches for information related to an object displayed in the set area and displays a search result corresponding to the searched information in the second area. When the area is set on the page, the control unit recognizes the object displayed in the area and sets the recognized object as a search word.

[Reference numerals] (110) Wireless communication unit; (111) Broadcast receiving module; (112) Mobile communication module; (113) Wireless Internet module; (114) Short-range communication module; (115) Position information module; (120) A/V input unit; (121) First camera; (122) Second camera; (123) Third camera; (124) Microphone; (130) User input unit; (140) Sensing unit; (150) Output unit; (151) Display unit; (152) Sound output module; (153) Alarm unit; (154) Haptic module; (160) Memory; (170) Interface unit; (180) Control unit; (181) Multimedia module; (190) Power supply unit

Description

MOBILE DEVICE AND CONTROL METHOD FOR THE SAME

The present invention relates to a portable electronic device capable of data retrieval and a control method thereof.

Portable electronic devices are electronic devices that are easy to carry and have one or more functions such as voice and video calling, information input/output, and data storage.

As their functions diversify, portable electronic devices are increasingly realized in the form of multimedia players with complex functions such as capturing photos or videos, playing music or video files, gaming, receiving broadcasts, and connecting to the Internet.

In order to implement complex functions in multimedia devices, various new attempts have been made in terms of hardware or software.

In order to support and enhance the functions of such a portable electronic device, improvement of the structural part and/or software part of the terminal may be considered.

When a user of a portable electronic device wants to search for information related to text on a page containing text and images, such as an e-book, the user must run a web browser separately from the page and type in a search term, which is inconvenient.

An object of the present invention is to provide a portable electronic device, and a method of controlling the same, that can conveniently retrieve information associated with a displayed page while the page remains on screen.

Another object of the present invention is to provide a portable electronic device and a control method thereof by which a user can use various kinds of information in one place, by displaying information related to the displayed page together with the page.

A portable electronic device according to an embodiment of the present invention includes a display unit formed to include a first area and a second area, which displays a page in the first area, and a controller configured, when an area is set on the page, to search for information associated with an object displayed in the set area and to display at least one search result corresponding to the searched information in the second area.

In an embodiment, if an area is set on the page, the controller recognizes an object displayed in the set area, and sets at least one of the recognized objects as a search word.

In an embodiment, if the recognized object is an image or a video, the controller sets a keyword included in the image or video as the search word.

The controller may search for information associated with the object in a database accessible through a search engine using the set search word.

In an embodiment, if a plurality of objects are recognized, the recognized objects are set together as the search word.
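As an illustration of the behavior described above (recognized objects, including keywords extracted from images or videos, are combined into a single search word and submitted to a searchable database), the following sketch shows one possible realization. The object representation, the `build_search_word` helper, and the toy `search` function are assumptions for illustration, not part of the disclosed device:

```python
def build_search_word(recognized_objects):
    """Combine one or more recognized objects into a single search word.

    Per the embodiment, text objects contribute their text, image/video
    objects contribute an included keyword, and plural objects are set
    together as one search word.
    """
    texts = [obj["keyword"] if obj["type"] in ("image", "video") else obj["text"]
             for obj in recognized_objects]
    return " ".join(texts)

def search(search_word, database):
    """Toy stand-in for a database reachable through a search engine:
    return entries whose text contains every term of the search word."""
    terms = search_word.lower().split()
    return [entry for entry in database
            if all(t in entry.lower() for t in terms)]

# Hypothetical usage: a text object plus an image whose keyword was recognized.
objects = [{"type": "text", "text": "Mona Lisa"},
           {"type": "image", "keyword": "Louvre"}]
word = build_search_word(objects)
results = search(word, ["The Mona Lisa hangs in the Louvre.",
                        "The Louvre is in Paris."])
```

Only entries matching every term of the combined search word are returned, which mirrors the claim that plural recognized objects are used together rather than searched separately.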

The display unit may include a first area and a second area, display the page in the first area, and display the retrieved information in the second area.

The control unit may display, in a third area of the display unit, a search result corresponding to at least one item selected by the user from the information displayed in the second area.

The second area may be configured to receive a selection of at least one item of the displayed information based on a touch input.

In addition, when touch inputs at two different points in the second area are detected, an area is set; when the set area is dragged to a third area, or a touch input different from the setting touch input is detected on the set area, the information included in the set area is selected.
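The two-point area setting and in-area selection described above can be sketched as follows. This is a minimal illustration under assumed screen coordinates; the rectangle model and the item representation are hypothetical:

```python
def area_from_touches(p1, p2):
    """Set a rectangular area from touch inputs at two different points,
    regardless of the order in which the points are given."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def items_in_area(area, items):
    """Select the displayed items whose anchor point falls inside the area."""
    left, top, right, bottom = area
    return [it for it in items
            if left <= it["x"] <= right and top <= it["y"] <= bottom]

# Hypothetical usage: two touch points span the area; one item lies inside it.
area = area_from_touches((120, 40), (20, 90))
hits = items_in_area(area, [{"name": "word", "x": 50, "y": 60},
                            {"name": "figure", "x": 300, "y": 60}])
```

Normalizing the two points with min/max means the user may touch the corners in either order, which matches the claim's "two different points" without prescribing a gesture direction.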

The search result may be at least one of text, an image, or a video corresponding to the selected information, or a page linked to the information.

In addition, search results associated with objects displayed in different set areas may be accumulated and displayed in the third area.

The control unit may sort the search results included in the third area by at least one of the page including the object, the result format, and the selection date.
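The sorting behavior described above can be sketched as follows (an illustrative assumption: each accumulated result carries the page containing its object, its format, and its selection date as sortable fields):

```python
from datetime import date

def sort_results(results, keys=("page", "format", "selected_on")):
    """Sort accumulated search results by any combination of the page that
    contains the object, the result format, and the selection date."""
    return sorted(results, key=lambda r: tuple(r[k] for k in keys))

# Hypothetical accumulated results from different set areas.
results = [
    {"page": 3, "format": "image", "selected_on": date(2011, 6, 22)},
    {"page": 1, "format": "text",  "selected_on": date(2011, 6, 23)},
    {"page": 1, "format": "image", "selected_on": date(2011, 6, 21)},
]
by_page = sort_results(results)                      # page, then format, then date
by_date = sort_results(results, keys=("selected_on",))
```

Passing a different `keys` tuple selects which of the claimed criteria drive the ordering, so one function covers "at least one of" the page, format, and selection date.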

When any one of the search results displayed in the third area is selected, the control unit may display an object corresponding to the selected search result in the first area, or may display the selected search result transparently, overlapping the page displayed in the first area so that the page remains visible.

According to an embodiment, when at least one item of the information displayed in the second area is selected by the user, the controller may highlight the corresponding object displayed on the page so that it is distinguished from other objects.

In an embodiment, when a touch input on the highlighted object is detected, the controller may display a search result that is associated with the object and corresponds to the information selected by the user either on the page or in a third area different from the first and second areas.

In addition, when a touch input on the highlighted object is detected, the controller displays, on the page, a list or icons of the information that is associated with the object and was selected by the user; when at least one item of the information list or an icon is selected, the search result corresponding to the selection is displayed in the first area or in a third area different from the first and second areas.

The display unit may be configured to allow touch input, and the area may be set by touch inputs detected at two different points of the display unit.

The control unit may recognize an object included in the set area when a touch input different from the touch input for setting the area is applied to the display unit.

In addition, in one embodiment, when any one of the area where the page is displayed, the area where the information retrieved by the controller is displayed, and the area where a search result corresponding to the information is displayed is changed in size, generated, or deleted, the other areas are changed in size in conjunction with the change, generation, or deletion.
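One way the linked resizing described above might work is sketched below. The proportional-rebalancing policy and the named areas are assumptions for illustration; the embodiment only requires that the remaining areas change in conjunction with the change, generation, or deletion:

```python
def rebalance(areas, total=100):
    """After an area is resized, created, or deleted, resize the remaining
    areas in proportion so that together they still fill the display."""
    current = sum(areas.values())
    return {name: round(size * total / current) for name, size in areas.items()}

# Hypothetical split: page, information, and search-result areas share the width.
areas = {"page": 50, "info": 25, "result": 25}
del areas["result"]           # the search-result area is deleted...
areas = rebalance(areas)      # ...and the other areas grow in conjunction
```

The same `rebalance` call also handles generation (add a new entry, then rebalance) and explicit resizing (change one entry, then rebalance the rest).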

A control method of a portable electronic device having a display unit including a first area and a second area according to an embodiment of the present invention comprises: recognizing an object included in an area set on a page displayed in the first area; setting the recognized object as a search word; searching for information associated with the set search word; and displaying at least one search result corresponding to the information in the second area.

In addition, the displaying may include selecting at least one item of the information and displaying a search result corresponding to the selected information in an area different from the areas where the page and the information are displayed.

According to an embodiment, when at least one item of the information is selected, an object associated with the selected information is highlighted so as to be distinguished from other objects, and when a touch input is detected on the highlighted object, a search result corresponding to the selected information is displayed.

The search result corresponding to the selected information may be displayed transparently over the page so that the page remains identifiable, or in an area different from the page.

The search result may be displayed as at least one of text, an image, a video, and a page linked to the search result corresponding to the selected information.

The portable electronic device and its control method according to an embodiment of the present invention can provide a search function to the user at the same time as displaying a page, by searching for information associated with an object included in an area set in the displayed page.

In addition, the portable electronic device and its control method according to an embodiment of the present invention can provide the user with information associated with an object by presenting selected items of the searched information in association with the object included in the page.

FIG. 1 is a block diagram illustrating a portable electronic device according to an embodiment disclosed herein.
FIG. 2 is a flowchart illustrating a control method of a portable electronic device according to an embodiment disclosed herein.
FIGS. 3A and 3B are conceptual diagrams of a portable electronic device according to an embodiment disclosed herein.
FIG. 4 is a conceptual diagram illustrating a method of selecting information in a portable electronic device according to an embodiment disclosed herein.
FIGS. 5A to 5F are conceptual diagrams illustrating a method of displaying a search result in a portable electronic device according to an embodiment disclosed herein.
FIGS. 6A and 6B are conceptual diagrams illustrating a display method of a portable electronic device according to an embodiment disclosed herein.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably merely for ease of drafting the specification, and do not in themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, a detailed description of related known arts will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. In addition, it should be noted that the accompanying drawings are provided only for easy understanding of the embodiments disclosed in the present specification, and the technical idea disclosed herein should not be construed as limited by the accompanying drawings.

The portable electronic device described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except for features applicable only to portable electronic devices.

FIG. 1 is a block diagram illustrating a portable electronic device according to an embodiment disclosed herein.

The portable electronic device 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, so a portable electronic device having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable electronic device 100 and a wireless communication system, or between the portable electronic device 100 and a network in which the portable electronic device 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The mobile communication module 112 is configured to implement a video call mode and a voice call mode. The video call mode refers to a state of talking while viewing a video of the other party, and the voice call mode refers to a state in which a call is made without viewing the other party's video. In order to implement the video communication mode and the voice communication mode, the mobile communication module 112 is configured to transmit and receive at least one of voice and image.

The wireless Internet module 113 refers to a module for wireless Internet access and may be embedded in or external to the portable electronic device 100. Wireless Internet technologies may include Wireless LAN (WLAN/Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for obtaining the location of the portable electronic device, and a representative example thereof is a Global Positioning System (GPS) module.

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in the video call mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable electronic device 100, such as the open/closed state of the portable electronic device 100, the position of the portable electronic device 100, the presence or absence of user contact, the orientation of the portable electronic device, and the acceleration or deceleration of the portable electronic device, and generates a sensing signal for controlling the operation of the portable electronic device 100. For example, when the portable electronic device 100 is in the form of a slide phone, whether the slide phone is opened or closed may be sensed. In addition, whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled to an external device may be sensed. The sensing unit 140 may include a proximity sensor 141.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the portable electronic device 100. For example, when the portable electronic device is in a call mode, the portable electronic device displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the portable electronic device 100 is in a video call mode or a photographing mode, it displays a photographed and / or received image, a UI, or a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

Some of these displays may be of a transparent or light-transmissive type so that the outside can be seen through them. Such a display may be referred to as a transparent display, and a typical example of the transparent display is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

Two or more display units 151 may exist according to the implementation form of the portable electronic device 100. For example, in the portable electronic device 100, a plurality of display units may be spaced apart or integrally disposed on one surface, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 151 can also be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
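The resolution step described above (the controller learning which area of the display unit was touched) can be sketched as a simple hit test. The rectangular layout and the area names are hypothetical assumptions for illustration:

```python
def touched_area(point, layout):
    """Resolve a touch coordinate, as reported by the touch controller, to
    the display-unit area it falls in (None when no area matches)."""
    x, y = point
    for name, (left, top, right, bottom) in layout.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

# Hypothetical layout: first area (page) on the left, second (results) on the right.
layout = {"first": (0, 0, 300, 480), "second": (300, 0, 480, 480)}
hit = touched_area((350, 100), layout)
```

Real touch controllers deliver richer data (pressure, contact area, multiple pointers); this sketch keeps only the position-to-area mapping the passage describes.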

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the portable electronic device surrounded by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object in the vicinity, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan than a contact sensor, and its utility is also higher.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of a pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of positioning a pointer over the touch screen so that it is recognized without contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen means the position on the touch screen that vertically corresponds to the pointer when the pointer is proximity-touched.
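The distinction between a proximity touch and a contact touch can be sketched as a classification on the pointer's distance from the screen surface. The distance thresholds here are assumptions for illustration, not values from the disclosure:

```python
def classify_touch(distance_mm, contact_threshold=0.0, proximity_range=10.0):
    """Classify a pointer event: a pointer touching the screen is a
    'contact touch'; a pointer hovering within sensing range is a
    'proximity touch'; anything farther is not sensed."""
    if distance_mm <= contact_threshold:
        return "contact touch"
    if distance_mm <= proximity_range:
        return "proximity touch"
    return None  # pointer too far away to be sensed
```

A proximity sensor additionally reports a pattern (distance, direction, speed, and so on); this sketch covers only the contact-versus-proximity classification the passage defines.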

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output a sound signal related to a function performed by the portable electronic device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable electronic device 100. Examples of events generated in the portable electronic device include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. Since the video signal or audio signal may also be output through the display unit 151 or the sound output module 152, these may be classified as a part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through an injection or inlet port, grazing of the skin surface, contact with an electrode, electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sensation of the finger or arm. Two or more haptic modules 154 may be provided according to a configuration aspect of the portable electronic device 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The portable electronic device 100 may operate in association with a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a passage to all external devices connected to the portable electronic device 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the portable electronic device 100, or transmits data inside the portable electronic device 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip that stores various information for authenticating the usage authority of the portable electronic device 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the port.

When the portable electronic device 100 is connected to an external cradle, the interface unit 170 may serve as a passage through which power from the cradle is supplied to the portable electronic device 100, or as a passage through which various command signals input by the user at the cradle are transmitted to the portable electronic device. The various command signals or the power input from the cradle may operate as signals for recognizing that the portable electronic device is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the portable electronic device, for example, performing related control and processing for voice communication, data communication, video communication, and the like. The controller 180 may include a multimedia module 181 for playing multimedia. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

 In addition, the control unit 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for carrying out the described functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein.

The software code may be implemented as a software application written in a suitable programming language. The software code is stored in the memory 160 and can be executed by the controller 180.

In addition, the controller 180 may search for information related to the object displayed on the display unit 151.

In order to retrieve information associated with an object, the controller 180 searches for information associated with an object displayed in an area set on a page displayed on the display unit 151.

The controller 180 recognizes an object displayed in the set area and sets the recognized object as a search word. The controller 180 then searches for information on the set search word in a database accessed through a search engine accessible via the wired/wireless Internet, or in a database stored in the memory 160 of the portable electronic device.

When the search for the information is completed by the controller 180, the controller 180 displays a search result corresponding to at least one of the information in one area of the display unit 151.

Hereinafter, a method of searching for and displaying information associated with an object in the portable electronic device according to the present invention will be described with reference to FIGS. 2 and 3A and 3B.

2 is a flowchart illustrating a method for controlling a portable electronic device according to the present invention, and FIGS. 3A and 3B are conceptual views of the portable electronic device according to one embodiment disclosed herein.

The portable electronic device 100 (refer to FIG. 3A) according to an exemplary embodiment of the present disclosure includes a display unit 151 (refer to FIG. 3A) disposed on one surface, for example, the front surface, and the display unit 151 is configured to allow touch input.

The display unit 151 may be formed to include a first area 210 and a second area 220, and may include a plurality of areas in addition to the first and second areas.

The control method first displays a page 211 including content such as text, an image, and a video on the display unit 151 in the first area 210 (S100). Here, the page includes at least one of text, an image, a video, an icon, and the like, and may be, for example, a page constituting an e-book, a web page, or the like. That is, the page contains information, and may be a single page or may be composed of a plurality of pages.

For example, referring to FIG. 3A, the displayed page 211 (hereinafter referred to as the first page) is 'page 12', and the e-book or web page including the first page 211 may be composed of a plurality of pages (for example, page 1 to page 50). In addition, the first page 211 includes objects consisting of text and an image.

Next, as described above, when a region is set by a user's setting on a page including an object such as text, an image, and a video, the controller 180 recognizes an object included in the set region (S200).

Here, an object is composed of an image, a video, text, an icon, and the like, and is a target to which the user's actions are applied; the user sets an area to include at least one object by using a touch input.

A method of setting an area using a touch input to the display unit 151 will be described with reference to FIG. 3A. The controller 180 senses touch inputs at two different points of the display unit 151, and sets the space formed by the two points as the area.

The two points may be a first point at which the touch input starts and a second point at which the touch input ends, that is, a touch-and-drag input, or they may be separate touch inputs, for example single-touch inputs, at two points spaced apart from each other. For example, as illustrated in FIG. 3A, an area including a text object marked 'smartphone' is set by the user's drag touch input.
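The two-point area setting described above can be sketched roughly as follows. This is an illustrative sketch only; the `Rect` and `area_from_points` names are assumptions, not part of the specification.

```python
# Illustrative sketch: derive the set area from two touch points, treating
# them as opposing vertices (touch-and-drag start/end, or two single touches).
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        # A point lies in the area if it falls within both axis ranges.
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def area_from_points(p1: tuple, p2: tuple) -> Rect:
    """Build the area regardless of the order or direction of the two points."""
    (x1, y1), (x2, y2) = p1, p2
    return Rect(min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

Because the minimum and maximum of each coordinate are taken, the same area results whether the drag runs top-left to bottom-right or the reverse.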

As described above, when a region is set on the first page 211 by a user's touch input, the controller 180 determines what the object displayed on the set region is.

Accordingly, as shown in FIG. 3A, the controller 180 recognizes a text object called 'smart phone' displayed in the set area.

In addition, as illustrated in FIG. 3B, a plurality of objects may be included in the set area, for example, an area defined by two opposing vertices based on touch inputs at points spaced apart from each other. When an image object is included in the area together with the text object 'smartphone', the controller 180 recognizes the plurality of objects together.

In addition, the controller 180 may distinguish the function of recognizing an object included in an area set on the first page 211 from other functions (for example, an area copy function and an area cut function). In this case, after the area is set by the user's touch input, if a touch input different from the touch input for setting the area is detected, the controller 180 may be controlled to recognize the object included in the set area. Here, the other touch input may be a single touch, a double touch, or a long touch.

As described above, when the object included in the area set by the user on the first page 211 is recognized (S200), the controller 180 sets the recognized object as a search word (S300).

As described above, when a plurality of objects are recognized in the set area, the controller 180 simultaneously sets the recognized plurality of objects as a search word.

Meanwhile, if the object displayed in the set area is a text object as shown in FIG. 3A, the controller 180 directly sets the text object as a search word; if an image object is included as shown in FIG. 3B, the controller 180 sets a keyword included in the image object as a search word.

Here, the keyword is information associated with the object, consisting of text that summarizes the object. The inclusion of a keyword in an object may also be expressed as 'information associated with the object has been tagged'.

A tag is a set of keywords in which words indicating the characteristics, meaning, title, etc. of an object are input. Tag information may be input for each object (image or video) by the user; even when it is not input by the user, the object itself may include tag information. Tag information may also be expressed as metadata, that is, data describing, for example, an image object, and the tag or keyword is used to find information efficiently.

Accordingly, as shown in FIG. 3B, when an image object is included in an area, the controller 180 extracts a keyword or tag included in the image object and sets the extracted keyword as a search word. For example, the image object illustrated in FIG. 3B includes a keyword 'LG', and the controller 180 sets 'LG' as a search word along with the text object 'Smartphone' included in the area.
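The search-word setting step described above can be sketched as follows. This is a minimal sketch; the dictionary-based object model and the `tags` field are illustrative assumptions, not taken from the specification.

```python
# Illustrative sketch: build search words from the recognized objects (S300).
# Text objects are used directly; image/video objects contribute the keywords
# tagged in their metadata.
def build_search_words(objects):
    """objects: list of dicts like {"type": "text", "value": "smartphone"}
    or {"type": "image", "tags": ["LG"]} (hypothetical object model)."""
    words = []
    for obj in objects:
        if obj["type"] == "text":
            words.append(obj["value"])
        elif obj["type"] in ("image", "video"):
            # Use the tag/keyword metadata attached to the object, if any.
            words.extend(obj.get("tags", []))
    return words
```

With the FIG. 3B example, `build_search_words([{"type": "text", "value": "smartphone"}, {"type": "image", "tags": ["LG"]}])` yields `["smartphone", "LG"]`, matching the description of setting both the text object and the image keyword as search words.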

As described above, when the object included in the area set on the first page 211 is recognized by the controller 180 (S200, see FIG. 2) and the recognized object is set as a search word (S300), the controller 180 searches for information associated with the object using the set search word (S400).

The controller 180 inputs the search word into a search engine accessible through the Internet and searches for information associated with the search word in a database accessed through the search engine, or retrieves information from a database stored in the memory 160 (refer to FIG. 1).

Meanwhile, the database and the search engine used to search for the search word can be changed by the user's settings, and a plurality of search engines may be set to be used at the same time.

Here, a search engine is a search tool or service that searches for information on the Internet using a search word, and may be, for example, Google, Naver, Yahoo, or the like.
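The configurable dispatch of a search word to one or more engines, or to a local database, might be sketched as follows. The engine callbacks and the in-memory dictionary standing in for the memory 160 are hypothetical, not from the specification.

```python
# Illustrative sketch: run the set search word against configurable search
# engines and, optionally, a local database (S400).
def search(word, engines, local_db=None):
    """engines: mapping of engine name -> callable(word) -> list of results.
    local_db: optional mapping of word -> list of results (memory analog).
    Returns (source, result) pairs so the origin of each hit is preserved."""
    results = []
    for name, query in engines.items():
        results.extend((name, hit) for hit in query(word))
    if local_db is not None:
        results.extend(("local", hit) for hit in local_db.get(word, []))
    return results
```

Because `engines` is just a mapping, the user-changeable engine settings described above reduce to adding or removing entries, and several engines can be queried in one call.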

Meanwhile, as described above, when a search for the set search word is executed in the search engine or the internal database, as shown in FIGS. 3A and 3B, the controller 180 displays the searched information in the second area 220, which is separated from the first area 210 in which the first page 211 is displayed.

The controller 180 may always display the searched information on the display unit 151 together with the first page 211, or may display it at the moment an area is set, an object is recognized, or a search is executed by the user.

In addition, when the area including the retrieved information is displayed, the size of the first area in which the first page 211 is displayed on the display unit 151 may be changed in conjunction with the display of the area in which the information is displayed.

In addition, the areas where the first page 211 and the information are displayed may be displayed to overlap each other or may be set to have respective areas.

Referring to FIGS. 3A and 3B, the display unit 151 initially includes only the first area 210 in which the first page 211 is displayed; after a search for an object is executed, it is converted to also include the second area 220 in which the searched information is displayed.

The controller 180 sets the first area 210 and the second area 220 so that they do not overlap each other, and when the second area 220 is displayed, the size of the first area 210 is changed in conjunction with it.

Meanwhile, the information displayed in the second area 220 is related to the object included in the area set on the first page, and may be any kind of data capable of expressing content associated with the object, such as text, an image, a video, or an icon.

In addition, the information displayed in the second area 220 may be summarized information linked with detailed information, and the second area 220 may display information including a plurality of items or a list.

Meanwhile, a search result corresponding to at least one piece of the information searched in relation to the object by the controller 180 (S400, see FIG. 2) and displayed in the second area 220 is displayed in one area of the display unit 151 (S500).

Here, the area of the display unit 151 in which the search result is displayed may be the first area 210 in which the first page 211 is displayed, or, as shown in FIGS. 3A and 3B, a third area 230 distinguished from the first area 210 in which the first page 211 is displayed and from the second area 220 in which the information is displayed.

In addition, the search result displayed in one area (e.g., the third area 230) of the display unit 151 corresponds to at least one piece of information selected by the user from among the information searched by the controller 180.

The search result may include at least one of text, an image, a video, and a page linked to the information, and may be displayed in the same manner as the searched information. In addition, when linked information is included in the searched information, that link information is also included in the search result displayed in the third area 230.

In addition, the information included in the third area 230 continues to be displayed even if the first page 211 displayed in the first area 210 is changed, and is stored so that it can be used again even after the display of the first page 211 is finished.

Meanwhile, when the selection of information in the second area displaying the searched information is completed, the second area may disappear according to a setting of the user or the controller 180. In this case, the sizes of the first and third areas may be changed in conjunction with the change of the second area.

As such, in a portable electronic device according to an embodiment of the present disclosure, a user may easily search for information associated with an object included on a page through area setting, and the found information is displayed and stored in one area of the display unit so that it is available in one place.

Hereinafter, a method of receiving at least one of the information displayed in the second area 220 by the user will be described with reference to FIG. 4.

4 is a conceptual diagram illustrating a method for selecting information in a portable electronic device according to an embodiment of the present disclosure.

As described above, when a search for an object is performed by the controller 180 (see FIG. 1) and the information obtained by the search is displayed in one area (i.e., the second area) of the display unit 151, a search result corresponding to the displayed information is displayed in another area (i.e., the third area) of the display unit.

As described above, the search results displayed on the third area 230 are at least one of a text, an image, a video, and a page linked to the information. In addition, the search results displayed in the third area 230 are based on the user's selection of information displayed in the second area 220.

Looking at the method of selecting at least one piece of the information displayed in the second area 220, as shown in FIG. 4, the information may be selected based on a touch input to the display unit 151, or, other than by a touch input, may be selected through the user input unit 130 (see FIG. 1).

When the information displayed in the second area 220 is selected using a touch input, the controller 180 (see FIG. 1) determines, upon detecting a touch input on displayed information, that the touched information has been selected by the user. The touch input may be a single touch, a double touch, or a long touch, and may be a touch input for selecting one of a plurality of pieces of information divided into items, or a touch input for selecting the information included in an area set using two different points.

For example, as shown in FIG. 4, the controller 180 treats a touch input defining the area 260 set by two different points as a touch input for selecting the information included in that area.

As described above, when information is selected using two different touch points, the area may include one information, and as shown in FIG. 4, a plurality of information may be included.
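The selection of every information item falling inside an area set by two points, as in FIG. 4, might be sketched as follows. The item labels and on-screen positions are illustrative assumptions.

```python
# Illustrative sketch: select all displayed information items whose position
# falls inside the area spanned by two touch points.
def select_in_area(items, p1, p2):
    """items: list of (label, (x, y)) display positions (hypothetical layout).
    p1, p2: the two touch points defining the selection area."""
    (x1, y1), (x2, y2) = p1, p2
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    return [label for label, (x, y) in items
            if left <= x <= right and top <= y <= bottom]
```

As the description notes, the same area may capture one item or several; the sketch simply returns all items inside the spanned rectangle.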

In addition, when the above-described touch input is detected, the selected information may be immediately displayed in the third area 230 without any further confirmation step; alternatively, for example, a selection button may be provided, and the information may be displayed in the third area 230 only when the selection button is selected.

In another embodiment, as shown in FIG. 4, when the selected area 260 or information is dragged to another area separated from the first area 210 and the second area 220, the controller 180 ) May generate the third region 230 and display the dragged information on the third region 230. If the third region 230 is already displayed, the dragged information is included in the displayed third region 230.

In addition, when the selected information is dragged to the first area 210, the selected information is displayed on the first page 211 displayed in the first area 210.

In addition, the controller 180 may highlight or enlarge the set area or the touch detected information to be distinguished from other information for which the touch input is not detected so that the user can identify it.

Referring to another embodiment of selecting information by using a touch input, as shown in FIG. 4, the controller 180 displays, in one region of the second area 220 separated from the region where the information is displayed, a selection window 250 through which each piece of information can be selected, and receives the user's selection of information through the selection window 250.

The selection window 250 may be displayed in a shape such as a rectangle or a circle. When a touch input is detected on the selection window 250, a check indicator is displayed on the selection window 250 where the touch input was detected. As described above, when a touch input on the selection window 250 is detected, the controller 180 can directly display a search result related to the information corresponding to the selection window 250 in the third area 230. Alternatively, the controller 180 may provide a selection button or a scrap button distinguished from the selection window 250, and when a user's input on the button is detected, display a search result corresponding to the selected information in the third area 230.

In the above, the method for receiving at least one of the information retrieved by the controller 180 from the user has been described.

Hereinafter, a method of displaying a search result corresponding to the selected information on a displayed page (e.g., the first page 211) and in another area different from the page (e.g., the third area 230) will be described with reference to FIGS. 5A to 5C.

The controller 180 links the search result with the associated object or page so that the search result can be viewed together with them, and displays the search result so that the user can recognize it.

In an embodiment, when information associated with the objects displayed on the page is searched and at least one piece of the searched information is selected, the controller 180 indicates, with respect to the object, that there is selected information and a corresponding search result.

Referring to FIG. 5A, when information on the text object 'smartphone' on the displayed first page 211 has been searched and the user has selected related information, the controller 180 distinguishes the 'smartphone' text object from other objects by highlighting it, enlarging it, displaying a specific mark, or changing its display color.

In addition, as illustrated in FIG. 5A, when a touch input is detected on the distinguished text object, a list or icons of the search results selected by the user may be displayed in relation to the text object. When any one item of the search result list is selected by the user, the controller 180 displays the selected search result in the first area 210 or in another area of the display unit 151 separated from it.

 As described above, through the display that distinguishes a specific object from other objects, the user may recognize that there is a search result related to the specific object.

In addition, as shown in FIG. 5B, the controller 180 may indicate, on the 'smartphone' text object displayed on another page 219 (e.g., page 20) as well as on the 'smartphone' text object displayed on the first page 211 (e.g., page 12), that there is a search result found in relation to the text object included in page 12.

Accordingly, the search results related to the 'smartphone' text object searched on page 12 may also be used on page 20.

As described above, the controller 180 may use the search result with any text object identical to the text object set as the search word, on any page, regardless of the specific page containing that text.

In addition, as illustrated in FIG. 5B, when the distinguished text object, for example 'smartphone', is selected, the controller 180 displays a search result corresponding to the object.

As shown in FIG. 5C, the search result is displayed either to overlap the first page in the first area 210 in which the first page 211 is displayed, or in an area distinguished from the first area 210, for example, the third area 230.

First, when the first page 211 and a search result are displayed to overlap, the search result is displayed transparently so that the first page 211 can still be identified.

In addition, the transparency of the overlapped search result may be adjusted by the user, for example using an adjustment bar 295, and the position at which the search result is displayed in the first area 210 may be changed by the user's setting. The search result may also be dragged by the user to another area different from the first area 210 (not shown); in this case, the controller 180 generates the third area 230 and displays the search result in the generated third area 230.

Next, when a touch input selecting a distinguished object, for example 'smartphone', is detected, the controller 180 may display a search result related to the object in the third area 230 separated from the first area 210. When the object is selected, the third area 230 may be displayed together with the first area 210, or may be generated by the controller based on the user's object selection input.

Next, referring to FIG. 5D, a description will be given of a method in which objects included in the page displayed in the first area 210 and search results displayed in the third area 230 are displayed on the display unit 151 in association with each other.

As described above, in the third area 230, search results corresponding to the information selected by the user from among the information searched in relation to the object are displayed. The controller 180 displays the search results in the third area 230 together with the objects associated with them. For example, if any one of the search results displayed in the third area 230 is selected, the controller 180 changes the output range of the currently displayed page so that the object linked to the search result is displayed.

For example, referring to FIG. 5D, if the object 'smartphone' included in the first page 211 (page 12) is related to the search results '2' and '3' in the third area 230, then when the user selects the search result '2', the controller changes the output range of the page so that the object 'smartphone' is displayed in the first area 210 (page 13 => page 12).

As another embodiment, when the search result '2' is selected from among the search results displayed in the third area 230, the controller 180 displays the first page 211 containing the 'smartphone' object corresponding to the search result '2' in the first area 210, and displays the search result '2' so as to overlap the first page 211.

As such, the controller 180 (refer to FIG. 1) according to the present invention may set the search result selected by the user to be linked with the object associated with it, so that the user may grasp information related to the object at a glance.

5E and 5F, the third area 230 in which the search results are displayed will be described in detail.

The third area 230 is an area containing search results corresponding to the information selected by the user. In the third area 230, not only search results related to objects included in the currently displayed page, but also search results related to objects searched in the past, may be accumulated.

If a touch input is detected on any one of the search results displayed in the third area 230, the controller 180 may provide information about that search result. This may include the object associated with the search result, the date on which the search result was selected, other search results selected together with it, and other search results selected with respect to the same object.

In addition, the search results displayed in the third area 230 may be displayed sorted by a predetermined criterion; for example, the search results may be sorted and displayed according to file format, object, page, selection date, or the like.
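Sorting the accumulated results by one of the criteria above might look like the following minimal sketch; the record fields are illustrative assumptions, not from the specification.

```python
# Illustrative sketch: sort accumulated search results by a chosen criterion
# (file format, associated object, page, or selection date).
def sort_results(results, key):
    """results: list of dicts with hypothetical 'format', 'object',
    'page', and 'date' fields; key: the criterion to sort by."""
    allowed = {"format", "object", "page", "date"}
    if key not in allowed:
        raise ValueError("unsupported sort key: " + key)
    return sorted(results, key=lambda r: r[key])
```

Restricting `key` to the named criteria mirrors the predetermined sort options described above, while `sorted` leaves the stored result list itself unmodified.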

In addition to the search results selected by the user, the third area 230 includes 'memo' and 'add' icons 270 and 280 so that the user may add arbitrary content. When data such as a note, an image, or a video is added to the third area 230 through the 'memo' and 'add' icons 270 and 280, an object may be selected on the page displayed in the first area 210 and linked with the added data.

The data, such as search results, included in the third area 230 may contain not only data related to the page displayed in the first area 210, for example 'e-book A', but also data related to another 'e-book B'. In addition, an application may be provided so that the data included in the third area 230 can be viewed separately from the page, and that data may be freely used by the portable electronic device.

Next, with reference to FIGS. 6A and 6B, the manner in which the display unit 151 displays the page, the information searched in relation to an object included in the page, and the search results comprising some of that information will be examined.

As shown in FIGS. 6A and 6B, a page is displayed in the first area 210, the searched information is displayed in the second area 220, and search results corresponding to the information selected in the second area 220 are displayed in the third area 230.

When any one of the first to third areas 210 to 230 according to the present invention is changed in size, created, or deleted by the user or the controller 180 (see FIG. 1), the sizes of the other areas displayed on the display unit 151 are changed in conjunction with the change, creation, or deletion.

For example, referring to FIG. 6A, first, the first area 210 displaying the first page 211 is shown on the display unit 151; when the controller 180 finds information related to an object included in the first page, it displays the second area 220 in which the found information is shown. In this case, the controller 180 reduces the size of the first area 210 so that the generated second area 220 and the first area 210 can be displayed on one screen of the display unit 151.
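A rough sketch of this linked resizing, assuming a simple side-by-side layout with hypothetical pixel widths (a real implementation would drive the platform's layout system instead):

```python
# Illustrative sketch: shrink the first area in conjunction with showing the
# second area so both fit on one screen.
def layout(screen_width, show_second, second_width=0):
    """Return (first_width, second_width) occupying the screen side by side."""
    if not show_second:
        # Only the page area is displayed; it takes the whole screen.
        return screen_width, 0
    # Clamp the second area to the screen, then give the rest to the first.
    second = min(second_width, screen_width)
    return screen_width - second, second
```

The two widths always sum to the screen width, so showing or hiding the second area automatically changes the size of the first area in conjunction, as described above.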

In addition, the size of the first region 210 may be adjusted by a drag touch input, for example, and may be variously adjusted by providing a separate size setting button.

In addition, referring to FIG. 6B, when any one area is deleted or resized by the user or the controller 180 while the first to third areas 210 to 230 are displayed, the sizes of the other areas are enlarged in response to the change or deletion.

Referring to an embodiment in which the areas are resized or deleted by a user's setting, for example a setting according to a touch input, first, an icon for reducing or deleting each area may be provided, and when the icon is selected, the command corresponding to the icon is executed.

In addition, the controller 180 may sense touch inputs at two points of the display unit 151, for example by two fingers, and resize or delete the areas according to a change in the distance between the two fingers (zoom in and zoom out). An area may also be enlarged, reduced, or deleted by using a specific touch input such as a double touch.
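The distance-based zoom in/out could be sketched as follows; the deletion threshold value is an illustrative assumption, not from the specification.

```python
# Illustrative sketch: scale an area by the change in distance between two
# touch points (pinch zoom in/out), deleting it when pinched below a threshold.
import math

def pinch_scale(start_pts, end_pts):
    """Each argument is a pair of (x, y) touch points; returns the ratio of
    the ending finger distance to the starting finger distance."""
    def dist(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)
    return dist(end_pts) / dist(start_pts)

def apply_pinch(size, start_pts, end_pts, delete_below=0.25):
    """Return the new area size, or None if the area should be deleted
    (hypothetical rule: pinched in below the delete_below ratio)."""
    scale = pinch_scale(start_pts, end_pts)
    if scale < delete_below:
        return None
    return size * scale
```

Moving the fingers apart yields a scale above 1 (zoom in), moving them together a scale below 1 (zoom out), and a very tight pinch is treated as a delete gesture under the assumed threshold.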

As such, the size of the regions displayed on the display unit may be freely modified, and may be set to be generated or destroyed as a search is executed or selection is completed.

In the above-described embodiments, the method of displaying the first area 210 in which the page is displayed, the second area 220 in which the searched information is displayed, and the third area 230 in which the search results are displayed has been described as an example; the division of the areas in which the page, information, and search results are displayed on the display unit 151 may be modified in various ways.

As described above, the portable electronic device and the control method thereof according to an embodiment of the present invention search for information associated with an object included in an area set on a displayed page, thereby providing a search function while simultaneously displaying the page to the user.

In addition, the portable electronic device and the control method thereof according to an embodiment of the present invention provide the information selected from among the searched information in association with the object included in the page, so that the user can conveniently use the information associated with the object.

The portable electronic device described above is not limited to the configuration and method of the embodiments described above; all or part of the embodiments may be selectively combined so that various modifications can be made.

Claims (23)

A display unit formed to include a first area and a second area, and configured to display a page in the first area and to set an area on the page; And
And a controller configured to search for information associated with an object displayed in an area set on the page, and to display at least one search result corresponding to the searched information in the second area.
The apparatus of claim 1, wherein the control unit
And when an area is set on the page, recognizes an object displayed on the set area, and sets at least one of the recognized objects as a search word.
The method of claim 2,
And the controller sets a keyword included in the image or the video as a search word if the recognized object is the image or the video.
4. The apparatus of claim 3, wherein the control unit
And searching for information associated with the object in a database accessible through a search engine using the set search word.
The method of claim 2,
And the plurality of recognized objects are set together as a search word when the recognized objects are plural.
The method of claim 1,
And the controller is configured to display a search result corresponding to at least one selected by a user among information displayed in the second area, in the third area of the display unit.
The method of claim 6, wherein the second region is
And at least one of the displayed information is selected based on a touch input.
The method of claim 7, wherein
If a touch input to two different points in the second area is detected, an area is set, and if the set area is dragged to a third area or a touch input different from the touch input is detected on the set area, the information included in the set area is selected.
The method according to claim 6,
The search result is at least one of a text, an image, a video corresponding to the selected information and a page linked to the information.
The method according to claim 6,
And the search results associated with objects displayed in different setting areas are accumulated and displayed in the third area.
The method of claim 10, wherein the control unit
And the search results included in the third area are arranged by at least one of a page, a format, and a selection date including an object.
The apparatus of claim 6, wherein, if any one of the search results displayed in the third area is selected, the controller
displays an object corresponding to the selected search result in the first area, and
overlaps the selected search result with the first area, displaying it transparently so that the page displayed in the first area remains identifiable.
The apparatus of claim 1,
wherein, if at least one item of the information displayed in the second area is selected by the user,
the controller highlights the object displayed on the page so that it is distinguished from other objects.
The apparatus of claim 13, wherein,
when a touch input to the highlighted object is detected, the controller displays a search result associated with the object and corresponding to the information selected by the user on the page, or in a third area different from the first and second areas.
The apparatus of claim 13, wherein the controller,
when a touch input to the highlighted object is detected, displays on the page an information list or icon associated with the object and with the information selected by the user, and,
when at least one of the information list or the icon is selected, displays a search result corresponding to the selected information list or icon in the first area or in a third area different from the first and second areas.
The apparatus of claim 1, wherein the display unit is formed to allow a touch input,
and the area is set by a touch input detected at two different points on the display unit.
The apparatus of claim 16, wherein the controller
recognizes an object included in the set area in response to a touch input, applied to the display unit, that is different from the touch input for setting the area.
The apparatus of claim 1,
wherein, if any one of the area where the page is displayed, the area where the information retrieved by the controller is displayed, and the area where the search result corresponding to the information is displayed is changed, created, or deleted by the user, the sizes of the other areas are changed in conjunction with the change, creation, or deletion.
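Claim 18 only requires that resizing, creating, or deleting one area adjusts the others "in conjunction"; one plausible policy is to rescale the remaining areas in proportion to their previous widths so the display stays filled. A hypothetical sketch of that policy:

```python
def relayout(widths, changed, new_width, total):
    """Resize `changed` to new_width and scale the remaining areas in
    proportion to their previous widths so they still fill `total`.
    Deleting an area corresponds to new_width == 0."""
    rest = {name: w for name, w in widths.items() if name != changed}
    leftover = total - new_width
    old_rest = sum(rest.values())
    layout = {name: round(w * leftover / old_rest) for name, w in rest.items()}
    layout[changed] = new_width
    return layout
```

For example, widening the page area shrinks the information and result areas by the same proportion.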
A control method of a portable electronic device having a display unit including a first area and a second area, the method comprising:
recognizing an object included in an area set on a page displayed in the first area;
setting the recognized object as a search word and searching for information associated with the set search word; and
displaying at least one search result corresponding to the information in the second area.
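The three steps of claim 19 can be sketched end to end, with an in-memory list of documents standing in for the search engine's database (all structures hypothetical; the claim does not constrain them):

```python
def control_method(page_objects, area, database):
    """Claim 19 as a pipeline: (1) recognize objects whose position lies
    in the set area, (2) use them as search words against a database,
    (3) return the results destined for the second area."""
    ax1, ay1, ax2, ay2 = area
    words = [text for (x, y, text) in page_objects      # step (1)
             if ax1 <= x <= ax2 and ay1 <= y <= ay2]
    return [doc for doc in database                     # steps (2)-(3)
            if any(word in doc for word in words)]
```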
The method of claim 19,
wherein the displaying comprises:
receiving a selection of at least one item of the information; and
displaying a search result corresponding to the selected information in a third area separate from the first and second areas.
The method of claim 19,
wherein, when at least one item of the information is selected, an object associated with the selected information is highlighted so as to be distinguished from other objects, and,
when a touch input to the highlighted object is detected, a search result corresponding to the selected information is displayed.
The method of claim 21,
wherein the search result corresponding to the selected information is displayed on the page transparently so that the page remains identifiable, or in an area different from the page.
The method of claim 21, wherein the search result is displayed as at least one of text, an image, a video, and a page linked to the search result corresponding to the selected information.
KR1020110060878A 2011-06-22 2011-06-22 Mobile device and control method for the same KR20130000280A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110060878A KR20130000280A (en) 2011-06-22 2011-06-22 Mobile device and control method for the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110060878A KR20130000280A (en) 2011-06-22 2011-06-22 Mobile device and control method for the same

Publications (1)

Publication Number Publication Date
KR20130000280A true KR20130000280A (en) 2013-01-02

Family

ID=47833900

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110060878A KR20130000280A (en) 2011-06-22 2011-06-22 Mobile device and control method for the same

Country Status (1)

Country Link
KR (1) KR20130000280A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123283A (en) * 2013-04-24 2014-10-29 Huawei Technologies Co., Ltd. Method, device and system for searching for target data
KR20160016860A (en) * 2014-03-17 2016-02-15 Baidu Online Network Technology (Beijing) Co., Ltd. Search recommendation method and apparatus
KR20160018564A (en) * 2014-03-21 2016-02-17 Baidu Online Network Technology (Beijing) Co., Ltd. Method and device for search and recommendation
KR20160020166A (en) * 2014-08-13 2016-02-23 Samsung Electronics Co., Ltd. Electronic apparatus and screen displaying method thereof
US10635297B2 (en) 2013-04-02 2020-04-28 Facebook, Inc. Interactive elements in a user interface
KR20200048757A (en) * 2018-10-30 2020-05-08 Samsung SDS Co., Ltd. Search method and apparatus thereof
KR20220037696A (en) * 2020-09-18 2022-03-25 NAVER Corporation Method and system for providing relevant information

Similar Documents

Publication Publication Date Title
KR101830653B1 (en) Mobile device and control method for the same
KR101861698B1 (en) Mobile device and control method for the same
KR101859100B1 (en) Mobile device and control method for the same
US9817798B2 (en) Method for displaying internet page and mobile terminal using the same
US9973612B2 (en) Mobile terminal with touch screen and method of processing data using the same
US20110083078A1 (en) Mobile terminal and browsing method thereof
KR20140112851A (en) Mobile terminal and control method for the mobile terminal
KR20130005174A (en) Mobile device and control method for the same
KR20130010684A (en) Mobile terminal and method for controlling display thereof
KR20110127853A (en) Mobile terminal and method for controlling the same
KR20130000280A (en) Mobile device and control method for the same
KR101294306B1 (en) Mobile device and control method for the same
KR20110013606A (en) Method for executing menu in mobile terminal and mobile terminal thereof
KR102187569B1 (en) Mobile terminal and control method for the mobile terminal
KR20140113155A (en) Mobile device and control method for the same
KR20140100315A (en) Mobile terminal and control method thereof
KR102032407B1 (en) Mobile terminal and method for controlling of the same
KR20140025048A (en) Terminal and operating method thereof
US10133457B2 (en) Terminal for displaying contents and operating method thereof
US20160196058A1 (en) Mobile terminal and control method thereof
KR101572035B1 (en) Mobile terminal and control method thereof
KR102020325B1 (en) Control apparatus of mobile terminal and method thereof
KR20100117417A (en) Method for executing application in mobile terminal and mobile terminal using the same
KR20120116288A (en) Mobile device and bookmark method for mobile device
KR20140073990A (en) Method for paragraph jump service using tag and terminal therefor

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination