WO2014142412A1 - Mobile device and control method for the same - Google Patents

Mobile device and control method for the same

Info

Publication number
WO2014142412A1
WO2014142412A1 (PCT/KR2013/009830)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
instance
message
subject
controller
Prior art date
Application number
PCT/KR2013/009830
Other languages
French (fr)
Inventor
Seyoung LEE
Original Assignee
Lg Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2014142412A1 publication Critical patent/WO2014142412A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 Circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72436 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for text messaging, e.g. SMS or e-mail
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/04 Changes in size, position or resolution of an image
    • G09G2340/0464 Positioning
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present disclosure relates to a mobile device capable of outputting instance messages and a control method thereof.
  • Mobile devices can be easily carried and have one or more functions such as supporting voice and video telephony calls, inputting and/or outputting information, storing data, and the like.
  • the mobile device can capture still images or moving images, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated multimedia player.
  • the improvement of structural or software elements of the terminal may be taken into consideration to support and enhance the functions of the mobile device.
  • an object of embodiments of the present disclosure is to provide a mobile device for more easily retrieving a user's desired specific instance message and/or its related instance messages among instance messages transmitted and received in the past and a control method thereof.
  • Another object of embodiments of the present disclosure is to provide a mobile device for displaying only related instance messages on a display screen when it is desired to transmit and receive a newly prepared instance message associated with instance messages transmitted and received in the past and a control method thereof.
  • a mobile device may include a wireless communication unit configured to transmit and receive an instance message; a display unit configured to sense a touch and to display the instance message transmitted and received through the wireless communication unit; and a controller configured to classify the instance message into a message belonging to a specific subject and display the message belonging to the specific subject to be visually distinguished from the other instance messages when a first touch gesture to the instance message is sensed and the specific subject is selected according to a second touch gesture.
  • the second touch gesture may be a touch to a selection menu popped up according to the first touch gesture, and the controller may classify the instance message into a message belonging to the selected subject when the specific subject is selected according to the touch.
  • the controller may classify the transmitted instance message into a message belonging to the specific subject.
  • the second touch gesture may be a touch and drag applied to the instance message.
  • the controller may classify the instance message into a message belonging to the specific subject when the touch and drag is sensed being moved into a region associated with the specific subject.
  • the region associated with the specific subject may be either one of a region in which an object indicating the specific subject is displayed and a region in which a message belonging to the specific subject is displayed.
  • the controller may switch the current screen of the display unit to a screen on which a list of the object indicating the specific subject is displayed when a third touch gesture is sensed.
  • the controller may perform at least one of the functions associated with a subject corresponding to the specific object.
  • the controller may classify the received instance message into a message belonging to the specific subject, and when the third touch gesture is sensed, the controller may further display an indicator indicating the reception of the received instance message on an object indicating the specific subject.
  • the controller may classify the pre-formatted data contained in the transmitted and received instance message into a message belonging to a specific subject containing only the pre-formatted data.
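The subject-classification behavior described in the paragraphs above can be sketched as follows. This is a hypothetical Python illustration, not the patent's implementation; the subject names, the "Unclassified" bucket, and the date/time pattern used to detect pre-formatted data are all assumptions.

```python
import re
from collections import defaultdict

class MessageStore:
    # Assumed rule: a message containing pre-formatted data (here, a
    # time or date pattern) is auto-filed under a dedicated subject.
    PREFORMATTED = re.compile(r"\b\d{1,2}:\d{2}\b|\b\d{4}-\d{2}-\d{2}\b")

    def __init__(self):
        # Messages keyed by subject; "Unclassified" holds the rest.
        self.by_subject = defaultdict(list)

    def add(self, text):
        # Auto-classify pre-formatted data into its own subject.
        if self.PREFORMATTED.search(text):
            self.by_subject["Schedules"].append(text)
        else:
            self.by_subject["Unclassified"].append(text)

    def classify(self, text, subject):
        # Invoked after the second touch gesture selects a subject.
        self.by_subject["Unclassified"].remove(text)
        self.by_subject[subject].append(text)

store = MessageStore()
store.add("Lunch tomorrow?")
store.add("Meeting at 14:30")       # auto-filed under "Schedules"
store.classify("Lunch tomorrow?", "Food")
```

Messages filed under a subject could then be rendered visually distinguished (e.g. highlighted) from the remaining messages, as the claims describe.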
  • the controller may classify total instance messages into each reference range determined according to the fourth touch gesture.
  • the controller may switch the current screen of the display unit to a screen on which at least one image object indicating a reference range determined by the fourth touch gesture is displayed.
  • the fourth touch gesture may be a gesture in which at least two touch positions applied to the display unit are closer to each other, and the controller may control the reference range to be varied according to the extent of the touch positions being closer to each other.
  • the controller may switch the current screen of the display unit to a screen on which the instance message is displayed.
  • the fifth touch gesture may be a gesture in which at least two touch positions applied to the display unit are away from each other, and the controller may control the instance message display range to be varied according to the extent of the touch positions being away from each other.
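The fourth and fifth touch gestures above (pinch-in to coarsen the reference range, pinch-out to return toward individual messages) can be illustrated with a small sketch. The mapping of pinch extent to day/week/month ranges, and the thresholds, are assumptions for illustration rather than values from the patent.

```python
# Hypothetical mapping from a two-finger gesture to the "reference
# range": the closer the two touch positions move together, the
# coarser the grouping of total instance messages.
def reference_range(start_distance, end_distance):
    shrink = (start_distance - end_distance) / start_distance
    if shrink <= 0:
        return "message"   # pinch-out: back toward individual messages
    if shrink < 0.33:
        return "day"
    if shrink < 0.66:
        return "week"
    return "month"

print(reference_range(300, 250))  # slight pinch-in -> "day"
```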
  • the controller may include a setting unit for setting at least one search keyword, and when the set search keyword is contained in an instance message within a specific reference range, the controller may further display an indicator indicating the search keyword on an image object indicating the specific reference range.
  • the controller may switch the current screen of the display unit to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
  • the controller may switch the current screen of the display unit to a screen on which instance messages with the corresponding reference range from an instance message containing the touch sensed indicator are displayed.
  • the controller may display the instance message in a first region of the display unit and display a search window for retrieving information associated with the instance message in a second region distinguished from the first region.
  • the controller may receive the object as a search keyword for retrieving an instance message associated with the object.
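The behavior just described, using a touched object (such as a word in a message or the sender's name) as a search keyword over past instance messages, might look like this minimal sketch; the message format and the matching rule (case-insensitive substring over text and sender) are assumptions.

```python
# Hypothetical keyword search over transmitted/received messages.
def search_messages(messages, keyword):
    kw = keyword.lower()
    return [m for m in messages
            if kw in m["text"].lower() or kw in m["sender"].lower()]

msgs = [
    {"sender": "Tom", "text": "Movie tonight?"},
    {"sender": "Ann", "text": "Send me the movie file"},
    {"sender": "Tom", "text": "ok"},
]
print(len(search_messages(msgs, "movie")))  # prints 2
```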
  • a control method of a mobile device may include displaying an instance message transmitted and received through a wireless communication unit; sensing a first touch gesture to the instance message; classifying the instance message into a message belonging to a subject selected according to a second touch gesture when the first touch gesture is sensed; and displaying the message belonging to the selected subject to be visually distinguished from the other instance messages.
  • the method may further include sensing a third touch gesture to the display unit; displaying a list of objects indicating a prestored subject when the third touch gesture is sensed; and performing at least one of the functions associated with a subject corresponding to the specific object when a touch to the specific object is sensed on the list.
  • the method may further include sensing a fourth touch gesture to the display unit; and classifying total instance messages into each reference range determined according to the fourth touch gesture when the fourth touch gesture is sensed.
  • the method may further include displaying a search window for retrieving information associated with the instance message in a second region distinguished from the first region in which the instance message is displayed.
  • transmitted and received instance messages may be classified into each subject selected according to a touch, thereby allowing the user to more easily retrieve the user's desired specific instance message and/or instance messages associated therewith.
  • the screen may be switched to display total instance messages for each reference range, for example, each day range, determined according to a touch, thereby allowing the user to intuitively and quickly retrieve instance messages transmitted and received on his or her desired specific date.
  • a search window capable of directly retrieving information associated with an instance message in a region distinguished from a region displayed with instance messages may be displayed according to a touch, thereby allowing the user to directly enter his or her desired search criterion.
  • FIG. 1 is a block diagram illustrating a mobile device according to an embodiment of the present disclosure
  • FIG. 2 is a view for explaining a control method of a mobile device according to an embodiment of the present disclosure
  • FIGS. 3A and 3B are views for explaining a control method of a mobile device according to another embodiment different from FIG. 2;
  • FIGS. 4A and 4B are conceptual views for explaining the flow chart in FIG. 2 according to an embodiment of the present disclosure;
  • FIGS. 5A and 5B are views for explaining a method for displaying a list of objects indicating a previously registered subject according to an embodiment of the present disclosure
  • FIG. 5C is a view for explaining a method of selecting a specific subject from the previously registered subject list to add a new message according to an embodiment of the present disclosure
  • FIG. 6 is a view for explaining a method of performing a delete function for a specific subject on a list of objects indicating a previously registered subject according to an embodiment of the present disclosure
  • FIG. 7 is a view for explaining a method of displaying the reception of an instance message belonging to a specific subject and a method for changing an attribute for a specific subject from a list of objects indicating a previously registered subject according to an embodiment of the present disclosure
  • FIG. 8 is a view for explaining a method of classifying an instance message into a preset subject when the instance message containing a pre-formatted data is transmitted and received according to an embodiment of the present disclosure
  • FIG. 9A is a view for explaining a method of classifying total instance messages into each reference range determined based on a touch according to an embodiment of the present disclosure
  • FIGS. 9B and 9C are views for explaining a method of displaying the corresponding instance message according to a touch to at least one of an image object indicating the each reference range and a preset search keyword displayed on the image object according to an embodiment of the present disclosure
  • FIGS. 10A through 10C are views for explaining a method of classifying image objects indicating the each reference range into each narrower reference range and displaying its related instance message according to an embodiment of the present disclosure
  • FIGS. 11A and 11B are views for explaining a method of displaying a search window for retrieving information associated with an instance message in a region distinguished from a region in which the instance message is displayed according to an embodiment of the present disclosure.
  • FIGS. 11C and 11D are views for explaining a method of entering a keyword to the search window using at least one of an object contained in an instance message and an object indicating the sender of the instance message.
  • a mobile device disclosed herein may include a portable phone, a smart phone, a laptop computer, a digital broadcast mobile device, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and the like.
  • a configuration according to the following description may be applicable to a stationary terminal such as a digital TV, a desktop computer, and the like, excluding constituent elements particularly configured for mobile purposes.
  • FIG. 1 is a block diagram illustrating a mobile device according to an embodiment disclosed in the present disclosure.
  • the mobile device 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • However, the constituent elements illustrated in FIG. 1 are not necessarily required, and the mobile device may be implemented with a greater or smaller number of elements than those illustrated.
  • the wireless communication unit 110 may include one or more modules allowing radio communication between the mobile device 100 and a wireless communication system, or allowing radio communication between the mobile device 100 and a network in which the mobile device 100 is located.
  • the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, a location information module 115, and the like.
  • At least one instance message may be transmitted and received through the wireless communication unit 110.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast managing entity may indicate a server which generates and transmits a broadcast signal and/or broadcast associated information or a server which receives a pre-generated broadcast signal and/or broadcast associated information and sends them to the mobile device.
  • the broadcast signal may be implemented as a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, among others.
  • the broadcast signal may further include a data broadcast signal combined with a TV or radio broadcast signal.
  • broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast service provider, and the like.
  • the broadcast associated information may be provided via a mobile communication network, and received by the mobile communication module 112.
  • broadcast associated information may be implemented in various formats.
  • broadcast associated information may include Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H), and the like.
  • the broadcast receiving module 111 may be configured to receive digital broadcast signals transmitted from various types of broadcast systems. Such broadcast systems may include Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), Integrated Services Digital Broadcast-Terrestrial (ISDB-T) and the like.
  • the broadcast receiving module 111 may be configured to be suitable for every broadcast system transmitting broadcast signals as well as the digital broadcasting systems.
  • Broadcast signals and/or broadcast associated information received via the broadcast receiving module 111 may be stored in a memory 160.
  • the mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signals may include audio call signals, video call signals, or various formats of data according to the transmission and reception of text/multimedia messages.
  • the mobile communication module 112 may be configured to implement a video communication mode and a voice communication mode.
  • the video communication mode refers to a configuration in which communication is made while viewing the image of the counterpart, whereas the voice communication mode refers to a configuration in which communication is made without viewing the image of the counterpart.
  • the mobile communication module 112 may be configured to transmit or receive at least one of audio or video data to implement the video communication mode and voice communication mode.
  • the wireless Internet module 113 refers to a module for supporting wireless Internet access, and may be built-in or externally installed on the mobile device 100.
  • wireless Internet access techniques that may be used include WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
  • the short-range communication module 114 refers to a module for supporting a short-range communication.
  • short-range communication technologies that may be used include Bluetooth™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBee, and the like.
  • the location information module 115 is a module for checking or acquiring the location of the mobile device; a Global Positioning System (GPS) module is a representative example.
  • the A/V (audio/video) input unit 120 receives an audio or video signal.
  • the A/V (audio/video) input unit 120 may include a camera 121 and a microphone 122.
  • the camera 121 processes image frames, such as still or moving images, obtained by an image sensor in a video phone call or image capturing mode.
  • the processed image frame may be displayed on a display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 or transmitted to an external device through the wireless communication unit 110. Furthermore, the user's location information or the like may be produced from image frames acquired from the camera 121. Two or more cameras 121 may be provided according to the use environment.
  • the microphone 122 receives an external audio signal in a phone call mode, a recording mode, a voice recognition mode, and the like, and processes the audio signal into electrical voice data.
  • the processed voice data may be converted and output in a format transmittable to a mobile communication base station through the mobile communication module 112 in the phone call mode.
  • the microphone 122 may implement various types of noise canceling algorithms to cancel noise generated in a procedure of receiving the external audio signal.
  • the user input unit 130 may generate input data to control an operation of the terminal.
  • the user input unit 130 may include a keypad, a dome switch, a touch pad (pressure/capacitance), a jog wheel, a jog switch, and the like.
  • the sensing unit 140 detects a current status of the mobile device 100 such as an opened or closed configuration of the mobile device 100, a location of the mobile device 100, a presence or absence of user contact with the mobile device 100, an orientation of the mobile device 100, an acceleration/deceleration of the mobile device 100, and the like, so as to generate a sensing signal for controlling the operation of the mobile device 100.
  • the sensing unit 140 may sense whether a sliding portion of the mobile device is open or closed.
  • Other examples include the sensing unit 140 sensing the presence or absence of power provided by the power supply unit 190, or the presence or absence of a coupling between the interface unit 170 and an external device.
  • the output unit 150 is configured to generate an output associated with visual sense, auditory sense or tactile sense, and may include a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, and the like.
  • the display unit 151 may display (output) information processed in the mobile device 100. For example, when the mobile device 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphic User Interface (GUI) associated with a call. When the mobile device 100 is in a video call mode or image capturing mode, the display unit 151 may display a captured image and/or received image, a UI or GUI.
  • the display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and an e-ink display.
  • Some of those displays may be configured as a transparent or optically transparent type to allow viewing of the exterior through the display unit; such displays may be called transparent displays.
  • An example of a typical transparent display is the transparent OLED (TOLED). Under this configuration, a user can view an object positioned at a rear side of the mobile device body through the region occupied by the display unit 151 of the mobile device body.
  • Two or more display units 151 may be implemented according to a configured aspect of the mobile device 100. For instance, a plurality of the display units 151 may be arranged on one surface to be spaced apart from or integrated with each other, or may be arranged on different surfaces.
  • When the display unit 151 and a touch sensitive sensor (hereinafter, referred to as a "touch sensor") have an interlayer structure (hereinafter, referred to as a "touch screen"), the display unit 151 may be used as an input device in addition to an output device.
  • the touch sensor may be implemented as a touch film, a touch sheet, a touch pad, and the like.
  • the touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 151, or a capacitance occurring from a specific part of the display unit 151, into electric input signals.
  • the touch sensor may be configured to sense not only a touched position and a touched area, but also a touch pressure at which a touch object body is touched on the touch sensor.
  • When a touch input is sensed by the touch sensor, the corresponding signals are transmitted to a touch controller.
  • the touch controller processes the signal(s), and then transmits the corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched.
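The signal flow described above (touch sensor to touch controller to controller 180, which resolves which region of the display unit 151 was touched) can be sketched as follows; the event fields and the region layout are assumptions chosen for illustration.

```python
from dataclasses import dataclass

# Hypothetical packaged touch event, as a touch controller might
# deliver it: position, touched area, and touch pressure.
@dataclass
class TouchSignal:
    x: int
    y: int
    area: int
    pressure: float

def region_of(signal, regions):
    # regions: name -> (x0, y0, x1, y1) rectangle; assumed layout.
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= signal.x < x1 and y0 <= signal.y < y1:
            return name
    return None

regions = {"message_list": (0, 0, 480, 700), "input_bar": (0, 700, 480, 800)}
print(region_of(TouchSignal(100, 750, 12, 0.4), regions))  # prints input_bar
```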
  • a proximity sensor 141 may be arranged at an inner region of the mobile device 100 surrounded by the touch screen, or adjacent to the touch screen.
  • the proximity sensor 141 may be provided as an example of the sensing unit 140.
  • the proximity sensor 141 refers to a sensor that senses the presence or absence of an object approaching a surface to be sensed, or an object disposed adjacent to a surface to be sensed, by using an electromagnetic field or infrared rays without mechanical contact.
  • the proximity sensor 141 has a longer lifespan and more enhanced utility than a contact sensor.
  • the proximity sensor 141 may include an optical transmission type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and so on.
  • When the touch screen is implemented as a capacitance type, the proximity of an object having conductivity (hereinafter, referred to as a "pointer") to the touch screen is sensed by changes of an electromagnetic field.
  • In this case, the touch screen (touch sensor) may be categorized as a proximity sensor.
  • hereinafter, a behavior in which the pointer is positioned proximate to the touch screen without contact will be referred to as a "proximity touch", whereas a behavior in which the pointer substantially comes in contact with the touch screen will be referred to as a "contact touch".
  • the proximity sensor senses a proximity touch, and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output onto the touch screen.
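The proximity/contact distinction above can be illustrated with a toy classifier over a sensed pointer distance; the distance thresholds are assumptions, not values from the disclosure.

```python
# Hypothetical classification of a pointer event from a sensed
# distance: at or below the contact threshold it is a contact touch,
# within sensing range a proximity touch, otherwise nothing.
def touch_type(distance_mm, contact_threshold_mm=0.5, sense_range_mm=30):
    if distance_mm <= contact_threshold_mm:
        return "contact touch"
    if distance_mm <= sense_range_mm:
        return "proximity touch"
    return None
```

A real proximity sensor would additionally report the pattern attributes listed above (direction, speed, time, moving status) alongside the distance.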
  • the audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160, in a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, a broadcast reception mode, and so on.
  • the audio output module 152 may output audio signals relating to the functions performed in the mobile device 100 (e.g., sound alarming a call received or a message received, and so on).
  • the audio output module 152 may include a receiver, a speaker, a buzzer, and so on.
  • the alarm unit 153 outputs signals notifying the occurrence of events in the mobile device 100.
  • the events occurring in the mobile device 100 may include a call received, a message received, a key signal input, a touch input, and so on.
  • the alarm unit 153 may output not only video or audio signals but also other types of signals, such as signals notifying the occurrence of events in a vibration manner. Since the video or audio signals can be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be categorized as part of the alarm unit 153.
  • the haptic module 154 generates various tactile effects that a user can feel.
  • a representative example of the tactile effects generated by the haptic module 154 includes vibration.
  • Vibration generated by the haptic module 154 may have a controllable intensity, a controllable pattern, and so on. For instance, different vibrations may be output in a synthesized manner or in a sequential manner.
  • the haptic module 154 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moving with respect to a skin being touched, air injection force or air suction force through an injection hole or a suction hole, touch by a skin surface, presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or a heat emitting device, and the like.
  • the haptic module 154 may be configured to transmit tactile effects through a user’s direct contact, or a user’s muscular sense using a finger or a hand.
  • two or more haptic modules 154 may be implemented according to the configuration of the mobile device 100.
  • the memory 160 may store a program for the processing and control of the controller 180. Alternatively, the memory 160 may temporarily store input/output data (e.g., phonebook, messages, still images, videos, and the like). Also, the memory 160 may store data related to various patterns of vibrations and sounds output upon touch input on the touch screen.
  • the memory 160 may be implemented using any type of suitable storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-only Memory (EEPROM), Programmable Read-only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like.
  • the mobile device 100 may operate in association with a web storage which performs the storage function of the memory 160 on the Internet.
  • the interface unit 170 may generally be implemented to interface the mobile device with external devices connected to the mobile device 100.
  • the interface unit 170 may allow a data reception from an external device, a power delivery to each component in the mobile device 100, or a data transmission from the mobile device 100 to an external device.
  • the interface unit 170 may include, for example, wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for coupling devices having an identification module, audio Input/Output (I/O) ports, video I/O ports, earphone ports, and the like.
  • the identification module may be configured as a chip for storing various information required to authenticate an authority to use the mobile device 100, which may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and the like.
  • the device having the identification module (hereinafter, referred to as "identification device") may be implemented in a type of smart card. Hence, the identification device can be coupled to the mobile device 100 via a port.
  • the interface unit 170 may serve as a path for power to be supplied from an external cradle to the mobile device 100 when the mobile device 100 is connected to the external cradle or as a path for transferring various command signals inputted from the cradle by a user to the mobile device 100.
  • Such various command signals or power inputted from the cradle may operate as signals for recognizing that the mobile device 100 has accurately been mounted to the cradle.
  • the controller 180 typically controls the overall operations of the mobile device 100. For example, the controller 180 performs the control and processing associated with telephony calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 which provides multimedia playback.
  • the multimedia module 181 may be configured as part of the controller 180 or as a separate component.
  • the controller 180 may classify instance messages transmitted and received through the wireless communication unit 110 based on a specific reference.
  • the power supply unit 190 receives external and internal power to provide power required for various components under the control of the controller 180.
  • For a hardware implementation, the embodiments described herein may be implemented using application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units designed to perform the functions described herein.
  • the embodiments such as procedures or functions described in the present disclosure may be implemented with separate software modules.
  • Each of the software modules may perform at least one function or operation described in the present disclosure.
  • Software codes can be implemented by a software application written in any suitable programming language.
  • the software codes may be stored in the memory 160 and executed by the controller 180.
  • the controller 180 of the mobile device 100 capable of including at least one of the foregoing constituent elements according to an embodiment of the present disclosure may sense a first touch gesture to an instance message, and classify the touch sensed instance message into a message belonging to a subject selected according to a second touch gesture. Furthermore, the controller 180 may display messages belonging to the selected subject to be visually distinguished from the other instance messages.
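The flow described above, in which a first touch gesture marks an instance message and a second touch gesture selects the subject it is classified under, might be sketched as follows. All class and method names are illustrative assumptions, not the disclosed implementation.

```python
class MessageClassifier:
    """Models the controller's two-gesture classification of instance messages."""

    def __init__(self):
        self.subjects = {}        # subject name -> list of classified messages
        self._pending = None      # message marked by the first touch gesture

    def on_first_gesture(self, message: str) -> None:
        """First touch gesture, e.g. a long-press on the message region."""
        self._pending = message

    def on_second_gesture(self, subject: str) -> None:
        """Second touch gesture, e.g. a touch on a popped-up selection menu entry."""
        if self._pending is None:
            return
        self.subjects.setdefault(subject, []).append(self._pending)
        self._pending = None

    def messages_for(self, subject: str) -> list:
        """Messages belonging to a subject, to be displayed visually distinguished."""
        return self.subjects.get(subject, [])
```

Visual distinction (tree-shaped guideline, highlight, color) would then be applied to everything returned by `messages_for`.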
  • the instance message may be one through which two or more talkers carry out real-time text communication over a network such as the Internet, and such a service may be referred to as a messenger.
  • the instance message may be displayed on the counterpart's screen immediately when sent.
  • the touch gesture may include touch gestures with a predetermined scheme, for example, all touch gestures using a long-press touch, a short-press touch, a touch-up, a touch-down, a touch and drag, a flicking or drag touch input, a proximity touch, and other user motions.
  • a touch gesture to the instance message may be a long-press touch to the instance message.
  • first touch gesture and the second touch gesture may indicate touch gestures implemented in different schemes.
  • for instance, the conversation matter (i.e., a conversation topic) may correspond to the subject.
  • the transmitted and received instance message itself may be registered as the subject or any object received through user input may be also registered as the subject.
  • the selected subject may be a subject selected from the previously registered subjects according to a touch or may be a newly registered subject according to a touch.
  • classifying the touch sensed instance message into a message belonging to the selected subject may denote storing, managing and displaying instance messages previously contained in the selected subject along with the touch sensed instance message.
  • displaying the instance message to be visually distinguished from the other instance messages may denote distinguishing it, for example, by connecting messages belonging to the selected subject with the same tree-shaped guideline, performing the same edge processing, performing highlight processing, displaying them in the same color, or changing the shape or size of a message when it belongs to a specific subject.
  • the controller 180 may switch a current screen displayed on the display unit 151 to a screen displayed with a list of objects indicating a previously registered subject according to a third touch gesture.
  • the third touch gesture may be one of the foregoing touch gestures with a predetermined scheme.
  • the third touch gesture may be a touch gesture with a different scheme from the first and the second touch gesture, for example, a touch to a region displayed with a specific control key (for example, "view subject” touch key) or a flicking or drag touch input applied to one region of the display unit 151.
  • the controller 180 may perform at least one function associated with a subject corresponding to the touch sensed specific subject.
  • the at least one function may include a function of displaying instance messages belonging to a subject, an edit function for the subject itself, for example, an attribute setting and change function of the subject such as a main subject, a secret subject or the like, a subject share function, a reception notification setting function of an instance message belonging to the subject, a bookmark add and edit function for link address information contained in a specific instance message contained in the subject, and the like.
  • the controller 180 may classify total transmitted and received instance messages into each reference range determined according to a fourth touch gesture.
  • the fourth touch gesture may denote a touch gesture with the foregoing predetermined scheme, which is distinguished from the foregoing first, second and third touch gestures.
  • the reference range may denote a range classified according to a time, date or period during which instance messages are transmitted and received, for example.
  • the reference range determined according to a touch gesture may denote that a time range during which instance messages are classified can be varied according to the direction and length of a drag when the touch gesture is a touch and drag touch input, for example.
  • Total instance messages may be classified and displayed for each date or classified and displayed for each longer period range (for example, week unit) according to the drag length of a touch and drag touch input, for example.
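The grouping described above, where the drag length of a touch and drag input varies the classification time range (day unit for a short drag, a longer period such as a week unit for a longer drag), might be sketched as follows. The 100-pixel threshold and the function names are assumptions.

```python
from datetime import date

def bucket_for(ts: date, drag_length_px: int) -> str:
    """Map a message timestamp to a reference-range label based on drag length."""
    if drag_length_px < 100:                      # short drag: classify per date
        return ts.isoformat()
    year, week, _ = ts.isocalendar()              # longer drag: classify per week
    return f"{year}-W{week:02d}"

def group_messages(messages, drag_length_px):
    """messages: list of (timestamp, text) pairs; returns bucket -> [text]."""
    groups = {}
    for ts, text in messages:
        groups.setdefault(bucket_for(ts, drag_length_px), []).append(text)
    return groups
```

Extending the drag further could switch to month or year units by adding thresholds in `bucket_for`.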
  • the controller 180 may further include a setting unit 181 for setting at least one search keyword for retrieving an instance message.
  • the controller 180 may display an indicator indicating the detected search keyword on an object indicating the corresponding reference range.
  • the controller 180 may display the instance message in a first region of the display unit 151 and display a search window for retrieving information associated with the instance message in a second region distinguished from the first region.
  • the information associated with the instance message may include all information such as an instance message writer, information of a specific object contained in the instance message and a time during which the instance message is transmitted and received, and the like, for example.
  • embodiments disclosed herein may provide an interface capable of classifying transmitted and received instance messages into each subject selected according to a touch, classifying them into each reference range determined according to a touch or allowing the user to directly enter the related keyword, thereby providing an environment capable of more easily and speedily retrieving a specific instance message and/or its related instance messages desired to be retrieved by the user.
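The keyword-based retrieval just described, where set search keywords are matched against the messages of each reference range and an indicator is shown on ranges that contain a hit, might be sketched as follows; the data shapes are assumptions.

```python
def find_keyword_ranges(grouped, keywords):
    """grouped: reference-range label -> list of message texts.
    Returns {range label: set of keywords detected in that range},
    i.e. which range objects should carry a keyword indicator."""
    hits = {}
    for label, messages in grouped.items():
        found = {kw for kw in keywords
                 for text in messages if kw.lower() in text.lower()}
        if found:
            hits[label] = found
    return hits
```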
  • the mobile device 100 may include a display unit 151 (refer to FIG. 3) disposed at one surface thereof, for example, front surface thereof, and the display unit 151 is configured to enable a touch input.
  • FIG. 2 is a view illustrating a control method for classifying and displaying instance messages transmitted and received by the mobile device 100 according to an embodiment of the present disclosure into each subject selected according to a touch.
  • FIGS. 4A and 4B are conceptual views for explaining the control method in FIG. 2.
  • the mobile device 100 displays an instance message transmitted and received through the wireless communication unit 110 on the display unit 151 (S200).
  • the instance message may be transmitted and received through a messenger provided from the mobile device 100 according to the present disclosure or a message related application (for example, Kakao Talk, Daum People, WhatsApp) downloaded to the mobile device 100. Furthermore, the instance message may be transmitted and received between two talkers, and transmitted and received in the form of group chatting by two or more talkers.
  • instance messages displayed on the display unit 151 may be transmitted and received by a plurality of talkers B, C and D 411, 413, 417, and instance messages being transmitted and received 412, 414, 416, 418, 420 may be comprised of data with various formats including text, images, videos and the like.
  • the controller 180 may sense a first touch gesture to a specific instance message, for example, the instance message 416, in a state that the instance message is displayed (S210).
  • the first touch gesture may be a touch gesture for making a long press to a region in which the specific instance message 416 is displayed, but may not be necessarily limited to this.
  • the controller 180 classifies the instance message 416 into a message belonging to a subject selected according to a second touch gesture (S220).
  • the second touch gesture may be a touch gesture with a scheme which is distinguished from the foregoing first touch gesture, for example, a touch and drag to the instance message or a touch to a selection menu popped up according to the first touch gesture.
  • FIG. 4A illustrates an embodiment of selecting the subject when the second touch gesture is a touch to a selection menu popped up according to the first touch gesture.
  • the controller 180 may sense a touch to a selection menu popped up according to the first touch gesture, and when a specific subject is selected according to the touch to a selection menu, the controller 180 may classify the instance message into a message belonging to the selected subject.
  • a selection menu 430 including one or more selectable objects is popped up.
  • a selectable object in the selection menu 430 may include an object displayed with text "register as a subject", "insert message into the existing subject” and the like, for example. Furthermore, subjects previously registered as a lower item of the object displayed with text "insert message into the existing subject", for example, “overseas travel", "project A” and the like may be displayed.
  • When a touch to "register as a subject" in the selection menu 430 is sensed, the controller 180 creates a new subject, and classifies the instance message 416 sensed by the first touch gesture into a message belonging to the new subject. At this time, information associated with the created subject may be stored in the memory 160 of the mobile device 100.
  • the controller 180 creates and registers text "Let us meet in April" contained in the instance message 416 sensed by the first touch gesture as a new subject.
  • the controller 180 may classify the transmitted instance message into a message belonging to the specific subject. In other words, the controller 180 automatically classifies instance messages transmitted after selecting the specific subject as messages belonging to the selected subject.
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages belonging to the subject "overseas travel" are displayed. Then, when a new instance message is transmitted, the controller 180 may immediately classify the relevant instance message into an instance message belonging to the subject "overseas travel".
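The automatic classification just described, where messages transmitted after a subject is selected are classified under it without further gestures, might be sketched as follows; names are illustrative assumptions.

```python
class ActiveSubjectClassifier:
    """Once a subject is selected, subsequently transmitted messages join it."""

    def __init__(self):
        self.subjects = {}   # subject -> list of classified messages
        self.active = None   # currently selected subject, if any

    def select_subject(self, subject: str) -> None:
        self.active = subject
        self.subjects.setdefault(subject, [])

    def send(self, message: str) -> str:
        """Transmit a message; auto-classify it while a subject is active."""
        if self.active is not None:
            self.subjects[self.active].append(message)
        return message
```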
  • FIG. 4B illustrates an embodiment of selecting the subject when the second touch gesture is a touch and drag to the instance message.
  • the controller 180 can sense a touch and drag to the instance message 416, and when the touch and drag being moved to a region associated with the specific subject is sensed, the controller 180 classifies the instance message 416 into an instance message belonging to the specific subject.
  • an object 450 indicating a specific subject previously registered and/or selected according to a touch may be displayed on the display unit 151.
  • an object 450 indicating the previously registered subject, for example, an object displayed with the text "Current subject: Let us meet in April", is displayed in one region (upper region) of the display unit 151.
  • the object 450 may include a subject comprised of at least one of various texts and images.
  • the controller 180 may edit at least one of texts and images contained in the object 450 through user input. Furthermore, the controller 180 may control the object 450 to disappear from the display unit 151 at normal times, and control the object 450 to appear when a touch is sensed at one region (upper region) of the display unit 151.
  • the controller 180 senses a first touch gesture to the instance message 420 with an image type, and senses a touch and drag using the instance message 420 as a touch starting point.
  • when the touch and drag is moved to a region displayed with the object 450, the controller 180 classifies the instance message 420 into a message belonging to the subject corresponding to the object 450.
  • the region associated with the subject may further include a region displayed with a message belonging to a specific subject in addition to a region in which an object indicating the specific subject is displayed on the display unit 151.
  • the controller 180 classifies the instance message 420 into a message belonging to the subject 450. Furthermore, even when a touch and drag using the instance message 420 as a touch starting point being moved to a region in which another instance message belonging to the subject 450 is displayed is sensed, the controller 180 classifies the instance message 420 into a message belonging to the subject 450.
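The drop-target test described above — a drag ending inside the subject object's region, or inside the region of a message already belonging to that subject, classifies the dragged message — might be sketched as follows. Regions are modeled as assumed (x, y, w, h) rectangles.

```python
def in_rect(point, rect):
    """True when a touch point falls inside an on-screen rectangle."""
    px, py = point
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

def drop_target_subject(drop_point, subject_rects, member_msg_rects):
    """subject_rects: subject -> rect of its displayed object;
    member_msg_rects: subject -> rects of messages already in that subject.
    Returns the subject the drag landed on, or None."""
    for subject, rect in subject_rects.items():
        if in_rect(drop_point, rect):          # dropped onto the subject object
            return subject
    for subject, rects in member_msg_rects.items():
        if any(in_rect(drop_point, r) for r in rects):
            return subject                      # dropped onto a member message
    return None
```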
  • the controller 180 displays a message belonging to the selected subject to be visually distinguished from the other instance messages (S230).
  • the controller 180 may display messages belonging to the selected subject to be distinguished, for example, by connecting them with the same tree-shaped guideline, performing the same edge processing, performing highlight processing, displaying the same color for messages belonging to the selected subject, or changing the existing shape or size to be distinguished from the other instance messages.
  • instance messages 416, 420 classified into messages belonging to the subject 450 are connected to each other and displayed through tree-shaped guidelines 455, 465 coming out of the subject 450, and are thus distinguished from the other instance messages 412, 414, 418.
  • the controller 180 may display a subject selected in a preview format around the region in which the touch sensed instance messages are displayed to be distinguished from the other instance messages.
  • transmitted and received instance messages may be classified into each subject selected according to a touch and messages belonging to the selected subject may be displayed to be distinguished from the other instance messages, thereby providing an environment capable of retrieving and managing instance messages for each subject as described below.
  • FIG. 3A is a view for explaining a control method for displaying a list of objects indicating a registered subject
  • FIGS. 5A and 5B are conceptual views for explaining the control method in FIG. 3A.
  • the mobile device 100 displays an instance message transmitted and received through the wireless communication unit 110 on the display unit 151 (S300).
  • the controller 180 senses a third touch gesture to the display unit 151 (S310).
  • the third touch gesture may be a touch to a control key 570 displayed in one region of the display unit 151 as illustrated in FIG. 5A, for example.
  • the controller 180 may control the control key 570 to disappear from the display unit 151 at normal times, and control the control key 570 to appear on the display unit 151 when there exists a user input with a predetermined scheme.
  • the third touch gesture may be a touch gesture for flicking a touch applied to one region of the display unit 151 in the predetermined direction as illustrated in FIG. 5B, but may not be necessarily limited to this.
  • the third touch gesture indicates a touch gesture with a scheme which is distinguished from the foregoing first touch gesture and second touch gesture.
  • the controller 180 may display a list of objects indicating a previously registered subject (S320).
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which a list of objects indicating the specific subject is displayed.
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which a list 580 of objects indicating a previously registered subject is displayed.
  • the list 580 may include objects 581 to 584 indicating one or more previously registered subjects as illustrated in the drawing.
  • the subject-1 581 is a subject for "test”
  • the subject-2 582 is a subject for "overseas travel”
  • the subject-3 583 is a subject for "project A”
  • the subject-4 584 is a subject for "project B”.
  • the objects 581 to 584 indicating previously registered subjects may further include image information indicating a subject in addition to text information.
  • the list 580 of objects indicating the subject may be displayed in the order of storing them in the memory 160. Furthermore, the shape, order and the like of the displayed objects 581 to 584 may be changed on the list 580 through user input.
  • the controller 180 may sense a touch to a specific object on the list, and accordingly, the controller 180 may perform at least one function associated with a subject corresponding to a specific object to which a touch is made.
  • the controller 180 may display at least one instance message belonging to the corresponding specific subject on the display unit 151. Furthermore, for example, when a long-press touch to a specific object indicating a specific subject is sensed, the controller 180 may pop up a control menu window for selectively performing an edit function to the corresponding specific subject itself, for example, functions such as change object, delete subject, send subject, change subject attribute, add bookmark and the like.
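The dispatch just described — a short-press touch on a subject object displays its messages, while a long-press pops up the edit control menu — might be sketched as follows. The 0.5 s long-press threshold is an assumption.

```python
def dispatch_subject_touch(press_seconds: float,
                           long_press_threshold: float = 0.5) -> str:
    """Choose the action for a touch on a subject object by press duration."""
    if press_seconds >= long_press_threshold:
        # change object, delete subject, send subject, change attribute, add bookmark
        return "popup_edit_menu"
    return "show_subject_messages"
```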
  • FIG. 5C is a conceptual view for explaining a function of displaying an instance message belonging to a specific subject selected from the foregoing subject list and a control method of inserting a new message thereto according to an embodiment of the present disclosure.
  • the controller 180 displays a screen displayed with the list 580 on the display unit 151 as illustrated in FIG. 5C.
  • the controller 180 may immediately display the list 580 screen using a specific control key or the like even without passing through a state that the transmitted and received instance message is displayed on the display unit 151.
  • the controller 180 displays instance messages belonging to a subject corresponding to the specific object on the display unit 151. For example, when a short-press touch to the subject-2 582 is sensed as illustrated in FIG. 5C, instance messages 512-514 belonging to the subject-2 582 are displayed on the display unit 151.
  • When a new message, for example, "I like it, too" 516, is written through user input in a state that instance messages 512-514 belonging to the subject-2 582 are displayed, and a transmit command is entered through the transmit key 575, the controller 180 automatically classifies the transmitted new message into an instance message belonging to the subject-2 550.
  • the controller 180 may further form a guideline 575 that connects the guidelines 555, 565 for other instance messages to the new instance message 516.
  • FIG. 6 is a view for explaining a method of performing a function of deleting a specific subject selected from the foregoing subject list according to an embodiment of the present disclosure.
  • the controller 180 may pop up a control menu 660 including a function key that can be selected for a subject corresponding to the relevant specific object.
  • the popped up control menu 660 may include a subject delete function, a subject share function, a main subject register function, a secret subject register function, a bookmark function, and the like, but may not be necessarily limited to this, and may further include other functions which are not described herein. Furthermore, the controller 180 may perform a control command for adding a user's desired new function to the control menu 660 or deleting an unused function through user input.
  • the controller 180 deletes the selected subject-2 682 from the list 680. Accordingly, instance messages belonging to the subject-2 682 can no longer be retrieved by subject. However, they remain as they are in the original instance message record.
  • the controller 180 may delete the indication (for example, tree shaped guideline) for distinguishing instance messages belonging to the relevant subject from the other instance messages as well.
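The delete behavior described above — removing a subject drops its entry and its distinguishing guideline indication, while the original chronological message record is left untouched — might be sketched as follows; the data shapes are assumptions.

```python
def messages_in(record, subjects, subject):
    """Retrieve a subject's messages by index into the untouched record."""
    return [record[i] for i in subjects.get(subject, [])]

def delete_subject(record, subjects, subject):
    """record: full chronological list of messages (never modified);
    subjects: subject -> list of message indices into record.
    Deleting only removes the subject grouping, not the messages."""
    subjects.pop(subject, None)
    return record, subjects
```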
  • the controller 180 may transmit at least part (instance messages to which a share is limited can be excluded) of instance messages belonging to the selected subject to another mobile device 100 through the wireless communication unit 110.
  • the related instance messages can be sent immediately; thus, the transmitter side does not have to collect and transmit the related content one by one, and the receiver side receives the conversations in the order they were transmitted and received, thereby allowing the receiver to easily and quickly grasp the subject-related content.
  • the controller 180 may further display an indicator (for example, shape change of an object itself indicating a text, image or subject, or the like) indicating a main subject for the selected subject on the list 680.
  • the controller 180 may display a main subject release function on the popped up control menu 660.
  • the controller 180 may configure reception notification for instance messages belonging to the relevant subject in a different manner. For example, when an instance message belonging to a subject registered as a main subject is received, the controller 180 may display it in a different reception notification mode or reduce the notification interval and the number of notifications.
  • FIG. 7 is a view for explaining a method of performing a function of registering a specific subject selected from the foregoing subject list 780 as a secret subject according to an embodiment of the present disclosure.
  • the controller 180 may allow the control menu 760 to be popped up.
  • the controller 180 may further display an indicator 764 indicating secret subject registration on the subject-2 782 as illustrated in FIG. 7.
  • the indicator 764 may include images, text and the like, but may not be necessarily limited to this, and may include any visual indication that can be distinguished from the other subjects 781, 783, 784.
  • the controller 180 may switch the current screen of the display unit 151 to a screen 765 for requesting a preset password input prior to displaying instance messages belonging to the subject-2 782.
  • the controller 180 may change a preset password through user input.
  • the controller 180 may reinforce the display limitation of instance messages belonging to the subject-2 782 step by step.
  • the controller 180 may display a secret subject release function on the popped up control menu 760.
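The secret-subject gate described above — messages of a subject registered as secret are displayed only after a preset password is entered and matched — might be sketched as follows. Hashing with SHA-256 and all names are assumptions; an actual device would rely on the platform's credential storage and constant-time comparison.

```python
import hashlib

class SecretSubjects:
    def __init__(self):
        self._hashes = {}    # subject -> SHA-256 digest of its preset password
        self.messages = {}   # subject -> list of instance messages

    def register_secret(self, subject: str, password: str) -> None:
        """Register a subject as a secret subject with a preset password."""
        self._hashes[subject] = hashlib.sha256(password.encode()).hexdigest()

    def view(self, subject: str, password: str = None):
        """Return messages, or request a password for a secret subject."""
        h = self._hashes.get(subject)
        if h is None:                              # not a secret subject
            return self.messages.get(subject, [])
        if password is None or \
           hashlib.sha256(password.encode()).hexdigest() != h:
            return "password required"             # screen 765 in the figure
        return self.messages.get(subject, [])
```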
  • the controller 180 may display a web page corresponding to specific link information contained in the selected subject on the display unit 151.
  • the controller 180 implements a web browser application and displays a web page screen corresponding to specific link information contained in the selected subject on the display unit 151.
  • the controller 180 may automatically classify the received instance message into a message belonging to a specific subject. Furthermore, the controller 180 may display an indication (for example, tree shaped guideline) for distinguishing the received instance message from the other instance messages on the display unit 151.
  • the controller 180 may display an indicator indicating the reception on the display unit 151. Furthermore, when an instance message belonging to a specific subject is received, the controller 180 may output predetermined reception notification through the audio output module 152.
  • the controller 180 classifies the received instance message into a message belonging to the specific subject, and further displays an indicator 790 indicating the reception of the received instance message on objects 781-784, respectively, indicating the relevant specific subject.
  • the indicator 790 may be a text or image indicating the number of the received instance messages belonging to a specific subject as illustrated in FIG. 7, but may not be necessarily limited to this.
  • the controller 180 may display the received instance message along with messages belonging to the relevant subject on the display unit 151.
  • the controller 180 may control the indicator 790 to disappear from the display unit 151.
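The reception indicator behavior above — each received message belonging to a subject increments a per-subject counter shown on its object, and the indicator disappears once the subject's messages are displayed — might be sketched as follows; names are illustrative.

```python
class ReceptionIndicator:
    """Per-subject unread counter, like indicator 790 on the subject objects."""

    def __init__(self):
        self.unread = {}   # subject -> count of received, not-yet-viewed messages

    def on_receive(self, subject: str) -> None:
        self.unread[subject] = self.unread.get(subject, 0) + 1

    def on_open(self, subject: str) -> None:
        """Subject's messages displayed: the indicator disappears."""
        self.unread.pop(subject, None)

    def badge(self, subject: str) -> int:
        return self.unread.get(subject, 0)
```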
  • FIG. 8 illustrates a method of classifying and displaying instance messages containing data with a predetermined format into messages belonging to a specific subject when they are transmitted and received, according to an embodiment of the present disclosure.
  • the transmitted and received instance messages displayed on the display unit 151 may include instance messages 814, 822 with a text format, instance messages 812, 820 with an image file format, and instance messages 816, 818 with a video file format.
  • the controller 180 may classify only instance messages containing data with a specific format in a separate manner.
  • the controller 180 may extract instance messages with a predetermined specific format, and classify the extracted messages into a previously generated or automatically generated specific subject.
  • the controller 180 may include a detection unit (not shown) for extracting data with a predetermined specific format from the transmitted and received instance messages.
  • the specific format may include information with a specific format contained in text, for example, link address information, phone number information, and address information and the like as well as an image file format, and a video file format.
  • the controller 180 may extract instance messages 812, 820 with an image file format and instance messages 816, 818 with a video file format, respectively, from the transmitted and received instance messages 812, 814, 816, 818, 820, 822, to classify them into a separate subject.
  • the controller 180 may sense a third touch gesture applied to the display unit 151, for example, a touch to the control key "view subject", and display the subject list by further adding an object 883 indicating a video and an object 884 indicating an image.
  • the controller 180 may switch the current screen of the display unit 151 into a screen on which only instance messages 812, 820 with an image file format are displayed.
  • an object 850 displayed with a subject "image" may be displayed in one region of the display unit 151. Furthermore, the display unit 151 may connect and display instance messages 812, 820 with an image file format belonging to the subject using a tree-shaped guideline 860.
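The format-based classification described above amounts to a simple grouping step. The following is an illustrative Python sketch, not the disclosed implementation; the message identifiers reuse the figure's reference numerals purely for illustration, and the function and format labels are assumptions:

```python
from collections import defaultdict

# Hypothetical transcript: (message_id, data_format) pairs, with numerals
# borrowed from FIG. 8 for illustration only.
messages = [
    (812, "image"), (814, "text"), (816, "video"),
    (818, "video"), (820, "image"), (822, "text"),
]

def classify_by_format(messages, formats=("image", "video")):
    """Extract only messages whose data format is in `formats` and group
    them into per-format subjects (e.g. an automatically generated
    "image" subject and a "video" subject)."""
    subjects = defaultdict(list)
    for msg_id, fmt in messages:
        if fmt in formats:
            subjects[fmt].append(msg_id)
    return dict(subjects)

grouped = classify_by_format(messages)
```

Here text messages fall outside the selected formats and remain unclassified, while image and video messages each form a separate subject.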
  • FIG. 3B is a view illustrating a control method for classifying total instance messages into each reference range determined according to a touch.
  • the mobile device 100 displays an instance message transmitted and received through the wireless communication unit 110 on the display unit 151 (S400).
  • the controller 180 senses a fourth touch gesture to the display unit 151 (S410).
  • the fourth touch gesture may be a touch to a control key displayed in one region of the display unit 151, and may be a flicking touch input or drag touch input applied to one region of the display unit 151 in a specific direction.
  • it may not be necessarily limited to this, and may include all touch gestures with a predetermined different scheme which is distinguished from the foregoing first through third touch gestures.
  • the controller 180 classifies total instance messages into each reference range determined according to the fourth touch gesture (S420).
  • the reference range denotes grouping by a predetermined unit according to the time at which instance messages are transmitted and received.
  • the reference range may be grouped by a date unit or grouped by a week unit.
  • determining a reference range according to the fourth touch gesture denotes that the time range used for classification is varied according to, for example, the direction and drag length of the touch gesture.
  • a time range for classifying instance messages can be determined according to a touch.
  • the meaning of determining a reference range according to a fourth touch gesture may denote that instance messages are classified into each predetermined time range, for example, each time range with a fixed week unit.
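The grouping into reference ranges, whether a date unit or a week unit, can be sketched as follows. This is an illustrative Python sketch; the function name and the choice of each week's Monday as the bucket key are assumptions, not part of the disclosure:

```python
from datetime import date, timedelta

def classify_by_reference_range(dates, unit="day"):
    """Group message dates into reference ranges: one bucket per day,
    or one bucket per week (keyed by that week's Monday)."""
    groups = {}
    for d in dates:
        # week unit: subtract the weekday index to land on Monday
        key = d if unit == "day" else d - timedelta(days=d.weekday())
        groups.setdefault(key, []).append(d)
    return groups

# Dates taken from the figures: Thursday, January 10 and Saturday,
# January 12, 2013 fall in the same week starting Monday, January 7.
msg_dates = [date(2013, 1, 10), date(2013, 1, 10), date(2013, 1, 12)]
by_day = classify_by_reference_range(msg_dates, "day")
by_week = classify_by_reference_range(msg_dates, "week")
```

With a date unit the messages split into two buckets; with a week unit they collapse into a single bucket.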
  • FIG. 9A is a view for explaining a control method of classifying total instance messages into each reference range determined according to a touch according to an embodiment of the present disclosure.
  • the controller 180 may sense a fourth touch gesture to the display unit 151, and switch the current screen of the display unit 151 into a screen displayed with at least one image object indicating a reference range determined according to the fourth touch gesture.
  • the fourth touch gesture indicates a gesture in which at least two touch positions applied to the display unit 151 are closer to each other. Furthermore, the at least two touch positions applied to the display unit 151 may be applied to the background screen other than a region displayed with an instance message, for example.
  • the controller 180 may set the reference range narrowly, for example to a date unit, when the distance between touch positions applied to the display unit 151 is relatively wide. Furthermore, the controller 180 may set the reference range widely, for example to a week unit, when the distance between touch positions applied to the display unit 151 is relatively narrow. Here, the distance between touch positions applied to the display unit 151 is determined in a relative manner.
  • the controller 180 may flexibly set the reference range based on a touch direction of the fourth touch gesture, a distance between the touch positions, a number of times for which the touch is repeated, and the like, and instance messages are classified according to the set reference range.
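One possible reading of this flexible setting is a mapping from the pinch distance and the number of repeated gestures to a grouping unit. The sketch below is purely illustrative: the threshold value, the unit ladder, and the function name are invented for the example and are not part of the disclosure:

```python
def reference_unit_for_pinch(pinch_distance, repeat_count=1, threshold=200):
    """Map a pinch-in gesture to a grouping unit: a relatively wide pinch
    keeps the finer 'day' unit, a narrow one starts at 'week', and each
    repeated pinch broadens the range by one more step."""
    units = ["day", "week", "month"]
    idx = 0 if pinch_distance >= threshold else 1
    # each repetition of the fourth touch gesture broadens the range
    idx = min(idx + repeat_count - 1, len(units) - 1)
    return units[idx]
```

A wide single pinch thus yields per-day buckets, a narrow one per-week buckets, and repeating the gesture broadens the range further, as in FIGS. 10A through 10C.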
  • the controller 180 may classify instance messages in the range of a date unit to display a list 940 of image objects indicating each date unit range on the display unit 151.
  • Text information including year, month, day and weekday may be further displayed on image objects 941 to 946 displayed on the display unit 151.
  • the controller 180 may further display a specific keyword on the displayed image objects 941 to 946.
  • the controller 180 may further include a setting unit 181 for setting at least one search keyword.
  • the controller 180 may further include a detection unit (not shown) for detecting the set search keyword from instance messages.
  • a keyword denotes text information that summarizes an instance message, as an object contained in the instance message. Furthermore, being contained in an instance message may be expressed as "information associated with an object contained in an instance message being tagged".
  • a tag is a sort of set of keywords in which words indicating the feature, meaning, title, subject, and the like of instance messages are previously stored, and an object contained in an instance message may include tag information even when the tag information is not directly entered through user input.
  • the tag information may be expressed as metadata, that is, data describing an object (for example, an image object) that is used to find the tag or keyword efficiently.
  • the controller 180 may further display an indicator indicating a search keyword set to an image object corresponding to the specific reference range.
  • the indicator may be displayed along with other images (for example, thumbnail image) in addition to text indicating a search keyword.
  • the image object 941 among the image objects 941 to 946 as illustrated in FIG. 9A may include a preset keyword "meeting" detected from instance messages transmitted and received on "Thursday, January 10, 2013".
  • the image object 943 may include a preset keyword “meeting” detected from instance messages transmitted and received on "Saturday, January 12, 2013”.
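Detecting a preset search keyword within each reference-range group, so that an indicator such as "meeting" can be drawn on the corresponding image object, might look like the following illustrative sketch; the function and field names are assumptions, not part of the disclosure:

```python
def keyword_indicators(grouped_messages, keywords=("meeting",)):
    """For each reference-range bucket (range key -> list of message
    texts), list which preset search keywords occur in its messages,
    so an indicator can be drawn on the matching image object."""
    indicators = {}
    for range_key, texts in grouped_messages.items():
        found = [k for k in keywords if any(k in t for t in texts)]
        if found:
            indicators[range_key] = found
    return indicators

# Example buckets echoing FIG. 9A's dates and message texts.
groups = {
    "2013-01-10": ["Good morning", "meeting at 5 o'clock"],
    "2013-01-11": ["see you"],
}
marked = keyword_indicators(groups)
```

Only the January 10 bucket would carry the "meeting" indicator; buckets without a detected keyword display no indicator.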
  • the controller 180 may switch the current screen of the display unit to a previous screen displayed with the instance message.
  • the fifth touch gesture indicates a gesture in which at least two touch positions applied to the display unit 151 are away from each other. Furthermore, the at least two touch positions applied to the display unit 151 may be applied to the background screen other than a region displayed with an instance message, for example.
  • the controller 180 may control such that a range in which the instance message is displayed is varied according to the extent of touch positions applied to the display unit 151 being away from each other.
  • the controller 180 may display the range displayed with the instance message in a narrow manner. Furthermore, when a distance between touch positions applied to the display unit 151 is relatively narrow, the controller 180 may display the range displayed with the instance message in a wide manner.
  • a distance between touch positions applied to the display unit 151 is determined in a relative manner.
  • FIGS. 9B and 9C are conceptual views for explaining a method of displaying the corresponding instance message according to a touch applied to an image object in a state in which the screen of the display unit 151 is switched to a screen displayed with the image objects 941 to 946.
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
  • the controller 180 displays instance messages transmitted and received on "Thursday, January 10, 2013" on the display unit 151 from the first instance message of the relevant day.
  • the controller 180 may display only some instance messages from the first instance message of the relevant day.
  • FIG. 9B illustrates a view in which instance messages are displayed from the instance message 912 "Good morning” transmitted for the first time at "9:20 a.m.” on "Thursday, January 10, 2013".
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages are displayed from an instance message containing the touch sensed indicator.
  • FIG. 9C illustrates a view in which instance messages are displayed from an instance message "meeting at 5 o'clock" 914 received at "3:45 p.m.” on "Thursday, January 10, 2013".
  • FIGS. 10A through 10C are views for explaining a method of classifying image objects indicating each reference range into a broader reference range according to the number of touches and displaying instance messages related thereto according to an embodiment of the present disclosure.
  • the controller 180 may set a reference range for classifying instance messages in a broader manner. Even in this case, a distance between touch positions applied to the display unit 151 may be relatively determined as described above. In other words, the controller 180 may control a range displayed with the instance message to be varied according to the extent of touch positions applied to the display unit 151 being closer to each other.
  • the number of image objects 1041' to 1043' indicating a reference range determined according to the repeated fourth touch gestures is reduced and the classification reference range displayed on the image objects is further broadened.
  • the classification reference range of an image object displayed for the first time according to the repeated fourth touch gestures is set in a broader manner from "Thursday, January 10, 2013" to "December 31, 2012 to January 6, 2013".
  • the controller 180 switches the current screen of the display unit 151 to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
  • the controller 180 displays instance messages transmitted and received during "December 31, 2012 to January 6, 2013" from the first instance message of the relevant range on the display unit 151.
  • the instance messages are displayed from the instance message "Good morning” 1012 transmitted at "9:22 a.m.” on “Monday, January 7, 2013” in the time order.
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages with the corresponding reference range are displayed from an instance message containing the touch sensed indicator.
  • the controller 180 displays instance messages transmitted and received during "January 7, 2013 to January 13, 2013" from the first instance message containing the search keyword "meeting” on the display unit 151.
  • the instance messages are displayed from the instance message "meeting at 5 o'clock" 1014 received at "3:45 p.m.” on "Thursday, January 10, 2013" in the time order.
  • the controller 180 may switch the current screen of the display unit 151 to a screen on which at least one image object indicating each classified reference range is displayed.
  • An image object displayed on the display unit 151 may provide a search reference in the time unit for retrieving the user's desired specific instance message from instance messages transmitted and received in the past.
  • Embodiments described herein may be implemented together with the embodiments of classifying instance messages into each subject and the embodiments of classifying instance messages into each reference range determined according to a touch, or in an independent manner.
  • FIGS. 11A and 11B are views for explaining a control method of displaying a search window for retrieving information associated with an instance message.
  • FIGS. 11C and 11D are views for explaining a control method of entering a keyword to the search window.
  • the controller 180 may sense a change of inclination of the mobile device 100 to determine the screen direction of the current display unit 151.
  • the controller 180 may display a search window for retrieving an instance message or information associated with the instance message in at least one of the left and right regions of the display unit 151.
  • the controller 180 may display a search window for retrieving an instance message or information associated with the instance message in at least one of the upper and lower regions of the display unit 151.
  • the screen direction illustrated in FIG. 11A is a portrait (vertical) mode.
  • the screen direction illustrated in FIG. 11B is a landscape (horizontal) mode.
  • the controller 180 may display a search window including a predetermined keyword input bar 1173 and a search execution key 1175 in the second region distinguished from the first region displayed with an instance message. At this time, at least part of the first region displayed with the instance message may be reduced in size compared with its size prior to applying the touch input.
  • the controller 180 may sense a flicking or drag touch input in the first direction (upward) applied to one region (for example, upper/lower region) of the display unit 151, and accordingly, display the search window in the second region (for example, upper/lower region of the display unit 151) distinguished from the first region displayed with an instance message.
  • the controller 180 may replace an instance message input window with a search window if a predetermined criterion is satisfied.
  • the predetermined criterion may be a touch gesture with a predetermined scheme to the display unit 151.
  • when the instance message input window is replaced with a search window, it may be implemented to deactivate the "send command" function of the "send" key displayed on the existing instance message input window and activate a "search command" function.
  • the "key” may display the text information ("send” or "search") of the currently activated function.
  • only one input window may be provided and a plurality of control keys (for example, send key, search key) may be provided.
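The toggling between the "send" and "search" functions of a single input window's key can be modeled as a small state machine. The class below is an illustrative sketch under that reading, not the disclosed implementation; all names are invented for the example:

```python
class MessageInputWindow:
    """A single input window whose action key toggles between a 'send'
    function and a 'search' function; the key label always shows the
    currently activated function."""

    def __init__(self):
        self.mode = "send"  # the message input window is shown first

    def toggle_mode(self):
        # deactivate the current command and activate the other one,
        # e.g. when a predetermined touch gesture is sensed
        self.mode = "search" if self.mode == "send" else "send"
        return self.mode

    def key_label(self):
        # text information of the currently activated function
        return self.mode
```

The same window thus serves both roles, with the key's displayed text tracking whichever command is active.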
  • although a search window according to an embodiment of the present disclosure is illustrated in FIG. 11A, the foregoing instance message input window can of course be displayed as well.
  • the controller 180 may display an instance message input window in the first region of the display unit 151 and the search window in the second region distinguished from the first region.
  • the controller 180 may display the first and the second region by controlling the location and size thereof according to a predetermined criterion.
  • the controller 180 may control the search window to disappear from the display unit 151.
  • the controller 180 may sense a flicking or drag touch input in the second direction (left direction) applied to one region (for example, left/right region) of the display unit 151, and accordingly, display the search window in the second region (for example, left/right region of the display unit 151) distinguished from the first region displayed with an instance message.
  • the controller 180 may display the search window for the first time.
  • the controller 180 may display a touch keyboard for entering a keyword in the lower region of the input bar 1173.
  • a keyword can then be entered using the touch keyboard.
  • the controller 180 may retrieve an instance message or information associated with an instance message matching the received keyword, and display a search result on the display unit 151.
  • the controller 180 may pop up a selection menu (not shown) presented with a specific transmitter, a specific search keyword and the like according to a touch input (for example, long-press touch input) applied to a region displayed with the keyword input bar 1173.
  • when a specific transmitter, for example, is selected in a state in which the selection menu (not shown) is displayed, the controller 180 may retrieve instance messages transmitted by the selected transmitter or information associated with them.
  • the controller 180 may control the displayed search window to disappear from the display unit 151.
  • in the foregoing, a method of displaying a search window for retrieving transmitted and received instance messages has been described with reference to FIGS. 11A and 11B.
  • a control method of entering a keyword to the displayed search window will be described with reference to FIGS. 11C and 11D.
  • the controller 180 may display the instance message 1114 or a specific object, for example, "meeting at 5 o'clock" or "meeting", contained in the instance message 1114 on the keyword input bar 1173.
  • the controller 180 retrieves information associated with the instance message 1114 and displays the search result on the display unit 151.
  • information associated with the instance message 1114 may include other instance messages with the same or similar format to the instance message 1114, a next instance message displayed with a specific object contained in the instance message 1114, and the like, but may not be necessarily limited to them.
  • the controller 180 displays the transmitter 1113 of the instance message 1114 on the keyword input bar 1173.
  • the controller 180 displays information associated with the transmitter 1113 of the instance message 1114 as a search result on the display unit 151.
  • information associated with the transmitter 1113 of the instance message 1114 may be other instance messages transmitted by the transmitter 1113, for example, but may not be necessarily limited to them.
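Retrieving instance messages by a dragged-in keyword and/or a selected transmitter, as in FIGS. 11C and 11D, reduces to a simple filter over the transcript. The following sketch is illustrative only, using hypothetical (sender, text) pairs and invented names:

```python
def search_messages(messages, keyword=None, sender=None):
    """Filter a transcript of (sender, text) pairs by an optional keyword
    (e.g. an object dragged into the search window) and/or a sender
    (e.g. a transmitter dragged onto the keyword input bar)."""
    results = []
    for s, text in messages:
        if keyword is not None and keyword not in text:
            continue
        if sender is not None and s != sender:
            continue
        results.append((s, text))
    return results

# Hypothetical transcript echoing the figures' example message.
transcript = [
    ("Kim", "meeting at 5 o'clock"),
    ("Lee", "ok"),
    ("Kim", "lunch?"),
]
by_keyword = search_messages(transcript, keyword="meeting")
by_sender = search_messages(transcript, sender="Kim")
```

Dragging the "meeting" object would yield the single matching message, while selecting the transmitter would yield every message that transmitter sent.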
  • the controller 180 may change the size and display direction of the first region displayed with an instance message and the second region displayed with a search window, and an object, a user interface or the like contained therein, based on a touch input.
  • a search window capable of directly retrieving information associated with an instance message in a region distinguished from a region displayed with instance messages may be displayed according to a touch, thereby allowing the user to directly enter the search criterion of the instance message.
  • transmitted and received instance messages may be classified into each subject selected according to a touch, thereby allowing the user to more easily retrieve the user's desired specific instance message and/or instance messages associated therewith.
  • the screen may be switched to display total instance messages for each reference range, for example, each day range, determined according to a touch, thereby allowing the user to intuitively and quickly retrieve instance messages transmitted and received on his or her desired specific date.
  • a search window capable of directly retrieving information associated with an instance message in a region distinguished from a region displayed with instance messages may be displayed according to a touch, thereby allowing the user to directly enter his or her desired search criterion.

Abstract

The present disclosure discloses a mobile device and a control method thereof. According to embodiments of the present disclosure, the mobile device may include a wireless communication unit configured to transmit and receive an instance message, a display unit configured to enable a touch, and display the instance message transmitted and received through the wireless communication unit, and a controller configured to classify the instance message into a message belonging to a specific subject and display the message belonging to the specific subject to be visually distinguished from the other instance messages when a first touch gesture to the instance message is sensed and the specific subject is selected according to a second touch gesture. Accordingly, the transmitted and received instance messages can be classified into each subject selected according to a touch, thereby allowing the user to more easily retrieve his or her desired specific instance message and/or its related instance messages.

Description

MOBILE DEVICE AND CONTROL METHOD FOR THE SAME
The present disclosure relates to a mobile device capable of outputting instance messages and a control method thereof.
Mobile devices can be easily carried and have one or more functions such as supporting voice and video telephony calls, inputting and/or outputting information, storing data, and the like.
As such devices become multifunctional, a mobile device can capture still images or moving images, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated multimedia player.
Moreover, the improvement of structural or software elements of the terminal may be taken into consideration to support and enhance the functions of the mobile device.
On the other hand, when a user wishes to retrieve instance messages transmitted and received in the past, or to follow an instance message with related content among instance messages sent and received using such a mobile device, there has been the inconvenience that the screen must be scrolled for a long time to retrieve a specific instance message, and that a newly prepared instance message is mixed with other messages, thereby disrupting the flow of the conversation.
Accordingly, an object of embodiments of the present disclosure is to provide a mobile device for more easily retrieving a user's desired specific instance message and/or its related instance messages among instance messages transmitted and received in the past and a control method thereof.
Furthermore, another object of embodiments of the present disclosure is to provide a mobile device for displaying only related instance messages on a display screen when it is desired to transmit and receive a newly prepared instance message associated with instance messages transmitted and received in the past and a control method thereof.
A mobile device according to an embodiment of the present disclosure may include a wireless communication unit configured to transmit and receive an instance message; a display unit configured to enable a touch, and display the instance message transmitted and received through the wireless communication unit; and a controller configured to classify the instance message into a message belonging to a specific subject and display the message belonging to the specific subject to be visually distinguished from the other instance messages when a first touch gesture to the instance message is sensed and the specific subject is selected according to a second touch gesture.
According to an embodiment, the second touch gesture may be a touch to a selection menu popped up according to the first touch gesture, and the controller may classify the instance message into a message belonging to the selected subject when the specific subject is selected according to the touch.
According to an embodiment, when a new instance message is transmitted in a state that a specific subject is selected according to the second touch gesture, the controller may classify the transmitted instance message into a message belonging to the specific subject.
According to an embodiment, the second touch gesture may be a touch and drag to the instance message, and the controller may classify the instance message into an instance message belonging to the specific subject when the touch and drag being moved into a region associated with the specific subject is sensed.
According to an embodiment, the region associated with the specific subject may be either one of a region in which an object indicating the specific subject is displayed and a region in which a message belonging to the specific subject is displayed.
According to an embodiment, the controller may switch the current screen of the display unit to a screen on which a list of the object indicating the specific subject is displayed when a third touch gesture is sensed.
According to an embodiment, when a touch to the specific subject of the list is sensed in a state of being switched to a screen on which the list is displayed, the controller may perform at least one of functions associated with a subject corresponding to the specific object.
According to an embodiment, when an instance message belonging to a specific subject is received, the controller may classify the received instance message into a message belonging to the specific subject, and when the third touch gesture is sensed, the controller may further display an indicator indicating the reception of the received instance message on an object indicating the specific subject.
According to an embodiment, when an instance message containing pre-formatted data is transmitted and received, the controller may classify the pre-formatted data contained in the transmitted and received instance message into a message belonging to a specific subject containing only the pre-formatted data.
According to an embodiment, when a fourth touch gesture to the display unit is sensed, the controller may classify total instance messages into each reference range determined according to the fourth touch gesture.
According to an embodiment, when a fourth touch gesture to the display unit is sensed, the controller may switch the current screen of the display unit to a screen on which at least one image object indicating a reference range determined by the fourth touch gesture is displayed.
According to an embodiment, the fourth touch gesture may be a gesture in which at least two touch positions applied to the display unit are closer to each other, and the controller may control the reference range to be varied according to the extent of the touch positions being closer to each other.
According to an embodiment, when a fifth touch gesture is sensed in a state of being switched to a screen on which the image object is displayed, the controller may switch the current screen of the display unit to a screen on which the instance message is displayed.
According to an embodiment, the fifth touch gesture may be a gesture in which at least two touch positions applied to the display unit are away from each other, and the controller may control the instance message display range to be varied according to the extent of the touch positions being away from each other.
According to an embodiment, the controller may include a setting unit for setting at least one search keyword, and when the set search keyword may be contained in an instance message with a specific reference range, the controller may further display an indicator indicating the search keyword on an image object indicating the specific reference range.
According to an embodiment, when a touch is sensed at any one of the image objects, the controller may switch the current screen of the display unit to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
According to an embodiment, when a touch to the indicator is sensed, the controller may switch the current screen of the display unit to a screen on which instance messages with the corresponding reference range from an instance message containing the touch sensed indicator are displayed.
According to an embodiment, the controller may display the instance message in a first region of the display unit and display a search window for retrieving information associated with the instance message in a second region distinguished from the first region.
According to an embodiment, when a touch to an object contained in the instance message being moved to a region displayed with the search window is sensed, the controller may receive the object as a search keyword for retrieving an instance message associated with the object.
Furthermore, a control method of a mobile device according to an embodiment of the present disclosure may include displaying an instance message transmitted and received through a wireless communication unit; sensing a first touch gesture to the instance message; classifying the instance message into a message belonging to a subject selected according to a second touch gesture when the first touch gesture is sensed; and displaying the message belonging to the selected subject to be visually distinguished from the other instance messages.
According to an embodiment, the method may further include sensing a third touch gesture to the display unit; displaying a list of objects indicating a prestored subject when the third touch gesture is sensed; and performing at least one of functions associated with a subject corresponding to the specific object when a touch to the specific object is sensed on the list.
According to an embodiment, the method may further include sensing a fourth touch gesture to the display unit; and classifying total instance messages into each reference range determined according to the fourth touch gesture when the fourth touch gesture is sensed.
According to an embodiment, the method may further include displaying a search window for retrieving information associated with the instance message in a second region distinguished from the first region in which the instance message is displayed.
As described above, according to a mobile device and a control method thereof according to an embodiment of the present disclosure, transmitted and received instance messages may be classified into each subject selected according to a touch, thereby allowing the user to more easily retrieve the user's desired specific instance message and/or instance messages associated therewith.
Furthermore, the screen may be switched to display total instance messages for each reference range, for example, each day range, determined according to a touch, thereby allowing the user to intuitively and quickly retrieve instance messages transmitted and received on his or her desired specific date.
Furthermore, a search window capable of directly retrieving information associated with an instance message in a region distinguished from a region displayed with instance messages may be displayed according to a touch, thereby allowing the user to directly enter his or her desired search criterion.
FIG. 1 is a block diagram illustrating a mobile device according to an embodiment of the present disclosure;
FIG. 2 is a view for explaining a control method of a mobile device according to an embodiment of the present disclosure;
FIGS. 3A and 3B are views for explaining a control method of a mobile device according to another embodiment different from FIG. 2;
FIGS. 4A and 4B are conceptual views for explaining the flow chart in FIG. 2 according to an embodiment of the present disclosure;
FIGS. 5A and 5B are views for explaining a method for displaying a list of objects indicating a previously registered subject according to an embodiment of the present disclosure;
FIG. 5C is a view for explaining a method of selecting a specific subject from the previously registered subject list to add a new message according to an embodiment of the present disclosure;
FIG. 6 is a view for explaining a method of performing a delete function for a specific subject on a list of objects indicating a previously registered subject according to an embodiment of the present disclosure;
FIG. 7 is a view for explaining a method of displaying the reception of an instance message belonging to a specific subject and a method for changing an attribute for a specific subject from a list of objects indicating a previously registered subject according to an embodiment of the present disclosure;
FIG. 8 is a view for explaining a method of classifying an instance message into a preset subject when the instance message containing a pre-formatted data is transmitted and received according to an embodiment of the present disclosure;
FIG. 9A is a view for explaining a method of classifying total instance messages into each reference range determined based on a touch according to an embodiment of the present disclosure;
FIGS. 9B and 9C are views for explaining a method of displaying the corresponding instance message according to a touch to at least one of an image object indicating the each reference range and a preset search keyword displayed on the image object according to an embodiment of the present disclosure;
FIGS. 10A through 10C are views for explaining a method of classifying image objects indicating the each reference range into each narrower reference range and displaying its related instance message according to an embodiment of the present disclosure;
FIGS. 11A and 11B are views for explaining a method of displaying a search window for retrieving information associated with an instance message in a region distinguished from a region in which the instance message is displayed according to an embodiment of the present disclosure; and
FIGS. 11C and 11D are views for explaining a method of entering a keyword to the search window using at least one of an object contained in an instance message and an object indicating the sender of the instance message.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings, and the same or similar elements are designated with the same reference numerals regardless of the drawing numbers, and their redundant description will be omitted. A suffix "module" or "unit" used for constituent elements disclosed in the following description is merely intended for easy description of the specification, and the suffix itself does not give any special meaning or function. In describing the present invention, moreover, the detailed description will be omitted when a specific description of publicly known technologies to which the invention pertains is judged to obscure the gist of the present invention. Also, it should be noted that the accompanying drawings are merely illustrated to easily explain the concept of the invention, and therefore, the technological concept disclosed herein should not be construed as limited by the accompanying drawings.
A mobile device disclosed herein may include a portable phone, a smart phone, a laptop computer, a digital broadcast mobile device, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and the like. However, it would be easily understood by those skilled in the art that a configuration according to the following description may also be applicable to a stationary terminal such as a digital TV, a desktop computer, and the like, excluding constituent elements particularly configured for mobile purposes.
FIG. 1 is a block diagram illustrating a mobile device according to an embodiment disclosed in the present disclosure.
The mobile device 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. However, the constituent elements as illustrated in FIG. 1 are not necessarily required, and the mobile device may be implemented with a greater or smaller number of elements than those illustrated.
Hereinafter, the foregoing constituent elements will be described in sequence.
The wireless communication unit 110 may include one or more modules allowing radio communication between the mobile device 100 and a wireless communication system, or allowing radio communication between the mobile device 100 and a network in which the mobile device 100 is located. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, a location information module 115, and the like.
At least one instance message may be transmitted and received through the wireless communication unit 110.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel.
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast managing entity may indicate a server which generates and transmits a broadcast signal and/or broadcast associated information or a server which receives a pre-generated broadcast signal and/or broadcast associated information and sends them to the mobile device. The broadcast signal may be implemented as a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, among others. The broadcast signal may further include a data broadcast signal combined with a TV or radio broadcast signal.
Examples of broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast service provider, and the like. The broadcast associated information may be provided via a mobile communication network, and received by the mobile communication module 112.
The broadcast associated information may be implemented in various formats. For instance, broadcast associated information may include Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H), and the like.
The broadcast receiving module 111 may be configured to receive digital broadcast signals transmitted from various types of broadcast systems. Such broadcast systems may include Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), Integrated Services Digital Broadcast-Terrestrial (ISDB-T) and the like. Of course, the broadcast receiving module 111 may be configured to be suitable for every broadcast system transmitting broadcast signals as well as the digital broadcasting systems.
Broadcast signals and/or broadcast associated information received via the broadcast receiving module 111 may be stored in a memory 160.
The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signals may include audio call signals, video call signals, or various formats of data according to the transmission and reception of text/multimedia messages.
The mobile communication module 112 may be configured to implement a video communication mode and a voice communication mode. The video communication mode refers to a configuration in which communication is made while viewing the image of the counterpart, and the voice communication mode refers to a configuration in which communication is made without viewing the image of the counterpart. The mobile communication module 112 may be configured to transmit or receive at least one of audio and video data to implement the video communication mode and the voice communication mode.
The wireless Internet module 113 refers to a module for supporting wireless Internet access, and may be built-in or externally installed on the mobile device 100. Examples of such wireless Internet access techniques may include WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), Wibro (Wireless Broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
The short-range communication module 114 refers to a module for supporting short-range communication. Examples of such short-range communication technologies may include Bluetooth™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBee, and the like.
The location information module 115 is a module for checking or acquiring the location of the mobile device, and a Global Positioning System (GPS) module is a representative example thereof.
Subsequently, referring to FIG. 1, the A/V (audio/video) input unit 120 receives an audio or video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still or moving images, obtained by an image sensor in a video phone call or image capturing mode. The processed image frames may be displayed on a display unit 151.
The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to an external device through the wireless communication unit 110. Furthermore, the user's location information or the like may be produced from image frames acquired from the camera 121. Two or more cameras 121 may be provided according to the use environment.
The microphone 122 receives an external audio signal through a microphone in a phone call mode, a recording mode, a voice recognition mode, and the like, and processes the audio signal into electrical voice data. The processed voice data may be converted and outputted into a format that is transmittable to a mobile communication base station through the mobile communication module 112 in the phone call mode. The microphone 122 may implement various types of noise canceling algorithms to cancel noise generated in a procedure of receiving the external audio signal.
The user input unit 130 may generate input data to control an operation of the terminal. The user input unit 130 may be configured by including a keypad, a dome switch, a touch pad (pressure/capacitance), a jog wheel, a jog switch, and the like.
The sensing unit 140 detects a current status of the mobile device 100 such as an opened or closed configuration of the mobile device 100, a location of the mobile device 100, a presence or absence of user contact with the mobile device 100, an orientation of the mobile device 100, an acceleration/deceleration of the mobile device 100, and the like, so as to generate a sensing signal for controlling the operation of the mobile device 100. For example, when the mobile device 100 is a slide phone type, the sensing unit 140 may sense whether a sliding portion of the mobile device is open or closed. Other examples include sensing functions, such as the sensing unit 140 sensing the presence or absence of power provided by the power supply unit 190, and the presence or absence of a coupling between the interface unit 170 and an external device.
The output unit 150 is configured to generate an output associated with visual sense, auditory sense or tactile sense, and may include a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, and the like.
The display unit 151 may display (output) information processed in the mobile device 100. For example, when the mobile device 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphic User Interface (GUI) associated with a call. When the mobile device 100 is in a video call mode or image capturing mode, the display unit 151 may display a captured image and/or received image, a UI or GUI.
The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and an e-ink display.
Some of those displays may be configured with a transparent or optically transparent type to allow viewing of the exterior through the display unit; such displays may be called transparent displays. A typical example of the transparent displays may include a Transparent OLED (TOLED), and the like. Under this configuration, a user can view an object positioned at a rear side of the mobile device body through a region occupied by the display unit 151 of the mobile device body.
Two or more display units 151 may be implemented according to a configured aspect of the mobile device 100. For instance, a plurality of the display units 151 may be arranged on one surface to be spaced apart from or integrated with each other, or may be arranged on different surfaces.
On the other hand, when the display unit 151 and a touch sensitive sensor (hereinafter, referred to as a "touch sensor") have an interlayer structure (hereinafter, referred to as a "touch screen"), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may be implemented as a touch film, a touch sheet, a touch pad, and the like.
The touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 151, or a capacitance occurring from a specific part of the display unit 151, into electric input signals. The touch sensor may be configured to sense not only a touched position and a touched area, but also a touch pressure at which a touch object body is touched on the touch sensor.
When there is a touch input to the touch sensor, the corresponding signals are transmitted to a touch controller. The touch controller processes the signal(s), and then transmits the corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched.
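The signal path described above (touch sensor, touch controller, then controller 180 resolving the touched region) may be sketched, for illustration only, as follows. All class names, coordinates, and the raw-event format are hypothetical and not taken from the specification:

```python
# Hypothetical sketch of the touch signal path: the touch controller turns a
# raw sensor reading into a touch event, and the main controller determines
# which registered display region the event falls into.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: int           # touched position in pixels (assumed units)
    y: int
    pressure: float  # touch pressure reported by the sensor

class TouchController:
    """Processes raw sensor signals into touch events for the controller."""
    def process(self, raw):
        x, y, pressure = raw
        return TouchEvent(x, y, pressure)

class Controller:
    """Resolves which registered region of the display unit was touched."""
    def __init__(self, regions):
        # name -> (left, top, right, bottom) rectangle on the display
        self.regions = regions

    def region_of(self, event):
        for name, (l, t, r, b) in self.regions.items():
            if l <= event.x < r and t <= event.y < b:
                return name
        return None

controller = Controller({"message_416": (0, 300, 480, 360)})
event = TouchController().process((120, 320, 0.4))
print(controller.region_of(event))  # the touch lands inside message 416's region
```

In this sketch, the `region_of` lookup corresponds to the controller 180 sensing "which region of the display unit 151 has been touched".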
Referring to FIG. 1, a proximity sensor 141 may be arranged at an inner region of the mobile device 100 surrounded by the touch screen, or adjacent to the touch screen. The proximity sensor 141 may be provided as an example of the sensing unit 140. The proximity sensor 141 refers to a sensor for sensing the presence or absence of an object approaching a surface to be sensed, or an object disposed adjacent to a surface to be sensed, by using an electromagnetic field or infrared rays without a mechanical contact. The proximity sensor 141 has a longer lifespan and more enhanced utility than a contact sensor.
The proximity sensor 141 may include an optical transmission type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and so on. When the touch screen is implemented as a capacitance type, the proximity of an object having conductivity (hereinafter, referred to as a "pointer") to the touch screen is sensed by changes of an electromagnetic field. In this case, the touch screen (touch sensor) may be categorized as a proximity sensor.
Hereinafter, for the sake of brief explanation, a behavior in which the pointer is positioned to be proximate to the touch screen without contact will be referred to as a "proximity touch", whereas a behavior in which the pointer substantially comes in contact with the touch screen will be referred to as a "contact touch". The position corresponding to the proximity touch of the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen upon the proximity touch of the pointer.
The proximity sensor senses a proximity touch, and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output onto the touch screen.
The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160, in a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, a broadcast reception mode, and so on. The audio output module 152 may output audio signals relating to the functions performed in the mobile device 100 (e.g., sound alarming a call received or a message received, and so on). The audio output module 152 may include a receiver, a speaker, a buzzer, and so on.
The alarm 153 outputs signals notifying the occurrence of events from the mobile device 100. The events occurring from the mobile device 100 may include a call received, a message received, a key signal input, a touch input, and so on. The alarm 153 may output not only video or audio signals, but also other types of signals, such as signals notifying the occurrence of events in a vibration manner. Since the video or audio signals can be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be categorized as part of the alarm 153.
The haptic module 154 generates various tactile effects which a user can feel. A representative example of the tactile effects generated by the haptic module 154 is vibration. Vibration generated by the haptic module 154 may have a controllable intensity, a controllable pattern, and so on. For instance, different vibrations may be output in a synthesized manner or in a sequential manner.
The haptic module 154 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moving with respect to a skin being touched, air injection force or air suction force through an injection hole or a suction hole, touch by a skin surface, presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or a heat emitting device, and the like.
The haptic module 154 may be configured to transmit tactile effects through a user’s direct contact, or a user’s muscular sense using a finger or a hand. The haptic module 154 may be implemented in two or more in number according to the configuration of the mobile device 100.
The memory 160 may store a program for the processing and control operations of the controller 180. Alternatively, the memory 160 may temporarily store input/output data (e.g., phonebook, messages, still images, videos, and the like). Also, the memory 160 may store data related to various patterns of vibrations and sounds outputted upon the touch input on the touch screen.
The memory 160 may be implemented using any type of suitable storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-only Memory (EEPROM), Programmable Read-only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like. Also, the mobile device 100 may operate in association with a web storage which performs the storage function of the memory 160 on the Internet.
The interface unit 170 may generally be implemented to interface the mobile device with external devices connected to the mobile device 100. The interface unit 170 may allow a data reception from an external device, a power delivery to each component in the mobile device 100, or a data transmission from the mobile device 100 to an external device. The interface unit 170 may include, for example, wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for coupling devices having an identification module, audio Input/Output (I/O) ports, video I/O ports, earphone ports, and the like.
On the other hand, the identification module may be configured as a chip for storing various information required to authenticate an authority to use the mobile device 100, which may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and the like. Also, the device having the identification module (hereinafter, referred to as "identification device") may be implemented in a type of smart card. Hence, the identification device can be coupled to the mobile device 100 via a port.
Furthermore, the interface unit 170 may serve as a path for power to be supplied from an external cradle to the mobile device 100 when the mobile device 100 is connected to the external cradle or as a path for transferring various command signals inputted from the cradle by a user to the mobile device 100. Such various command signals or power inputted from the cradle may operate as signals for recognizing that the mobile device 100 has accurately been mounted to the cradle.
The controller 180 typically controls the overall operations of the mobile device 100. For example, the controller 180 performs the control and processing associated with telephony calls, data communications, video calls, and the like. The controller 180 may include a multimedia module 181 which provides multimedia playback. The multimedia module 181 may be configured as part of the controller 180 or as a separate component.
The controller 180 may classify instance messages transmitted and received through the wireless communication unit 110 based on a specific reference.
The power supply unit 190 receives external and internal power to provide power required for various components under the control of the controller 180.
Various embodiments described herein may be implemented in a computer or similar device readable medium using software, hardware, or any combination thereof.
For hardware implementation, the embodiments may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units designed to perform the functions described herein. In some cases, such embodiments may be implemented in the controller 180 itself.
For software implementation, the embodiments such as procedures or functions described in the present disclosure may be implemented with separate software modules. Each of the software modules may perform at least one function or operation described in the present disclosure.
Software codes can be implemented by a software application written in any suitable programming language. The software codes may be stored in the memory 160 and executed by the controller 180.
Furthermore, the controller 180 of the mobile device 100 capable of including at least one of the foregoing constituent elements according to an embodiment of the present disclosure may sense a first touch gesture to an instance message, and classify the touch sensed instance message into a message belonging to a subject selected according to a second touch gesture. Furthermore, the controller 180 may display messages belonging to the selected subject to be visually distinguished from the other instance messages.
Here, the instance message may be one through which two or more talkers use real time text communication using a network such as the Internet, and may be referred to as a messenger. The instance message may be displayed on the counterpart's screen immediately when sent.
Furthermore, the touch gesture may include touch gestures with a predetermined scheme, for example, all touch gestures using a long-press touch, a short-press touch, a touch-up, a touch-down, a touch and drag, a flicking or drag touch input, a proximity touch, and other user motions. For example, a touch gesture to the instance message may be a long-press touch to the instance message.
Furthermore, the first touch gesture and the second touch gesture may indicate touch gestures implemented in different schemes.
Furthermore, for the subject, when the main content and/or meaning contained in the transmitted and received instance messages has a commonly related conversation matter, the conversation matter may correspond to the subject. However, the subject may not be necessarily limited to this; the transmitted and received instance message itself may be registered as the subject, or any object received through user input may also be registered as the subject.
Furthermore, the selected subject may be a subject selected from the previously registered subjects according to a touch or may be a newly registered subject according to a touch.
Furthermore, classifying the touch sensed instance message into a message belonging to the selected subject may denote storing, managing and displaying instance messages previously contained in the selected subject along with the touch sensed instance message.
Furthermore, displaying the instance message to be visually distinguished from the other instance messages may denote distinguishing the instance message from the other instance messages, for example, by connecting the same tree-shaped guideline, performing the same edge processing, performing a highlight processing, displaying the same color for messages belonging to the selected subject, or through a change of the shape or size when the message belongs to a specific subject.
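The classification and visual distinction described above may be sketched, purely for illustration, as follows. The data structures and the textual highlight marker are assumptions; the specification itself describes only the behavior:

```python
# Hypothetical sketch: messages classified into a subject are stored together
# and rendered visually distinguished (here, marked with "*") from the rest.
class MessageStore:
    def __init__(self):
        self.messages = []   # list of (message_id, text) in display order
        self.subjects = {}   # subject name -> set of classified message ids

    def add_message(self, msg_id, text):
        self.messages.append((msg_id, text))

    def classify(self, msg_id, subject):
        """Classify a message into a subject, creating the subject if new."""
        self.subjects.setdefault(subject, set()).add(msg_id)

    def render(self, selected_subject):
        """Mark messages belonging to the selected subject."""
        members = self.subjects.get(selected_subject, set())
        lines = []
        for msg_id, text in self.messages:
            marker = "*" if msg_id in members else " "
            lines.append(f"{marker} {text}")
        return lines

store = MessageStore()
store.add_message(412, "How about April?")
store.add_message(416, "Let us meet in April")
store.classify(416, "Let us meet in April")   # register message 416 as a subject
print("\n".join(store.render("Let us meet in April")))
```

The marker stands in for the edge processing, highlighting, or coloring options listed above; any such visual treatment can be substituted in `render`.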
Furthermore, the controller 180 may switch a current screen displayed on the display unit 151 to a screen displayed with a list of objects indicating a previously registered subject according to a third touch gesture.
Here, the third touch gesture may be one of the foregoing touch gestures with a predetermined scheme. Furthermore, the third touch gesture may be a touch gesture with a different scheme from the first and the second touch gesture, for example, a touch to a region displayed with a specific control key (for example, "view subject" touch key) or a flicking or drag touch input applied to one region of the display unit 151.
When a touch to a specific object of the list is sensed in a state that the current screen of the display unit 151 is switched to a screen displayed with a list of objects indicating the subject as described above, the controller 180 may perform at least one function associated with a subject corresponding to the touch-sensed specific object.
Here, the at least one function may include a function of displaying instance messages belonging to a subject, an edit function for the subject itself, for example, an attribute setting and change function of the subject such as a main subject, a secret subject or the like, a subject share function, a reception notification setting function of an instance message belonging to the subject, a bookmark add and edit function for link address information contained in a specific instance message contained in the subject, and the like.
Furthermore, the controller 180 may classify total transmitted and received instance messages into each reference range determined according to a fourth touch gesture.
Here, the fourth touch gesture may denote a touch gesture with the foregoing predetermined scheme, and a touch gesture distinguished from the foregoing first, second, third touch gestures.
Here, the reference range may denote a range classified according to a time, date or period during which instance messages are transmitted and received, for example.
Furthermore, the reference range determined according to a touch gesture may denote that the time range by which instance messages are classified can be varied according to the direction and length of a drag when the touch gesture is a touch and drag input, for example. Total instance messages may be classified and displayed for each date, or classified and displayed for each longer period range (for example, a week unit), according to the drag length of a touch and drag input, for example.
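The variable reference range described above may be sketched as follows, under assumed names and an assumed pixel threshold for the drag length:

```python
# Hypothetical sketch: a short drag groups the total instance messages by day,
# a longer drag by week. The 100-pixel threshold is an illustrative assumption.
from datetime import date

def reference_range(drag_length_px):
    """Map the drag length of a touch-and-drag input to a range unit."""
    return "week" if drag_length_px >= 100 else "day"

def classify_by_range(messages, unit):
    """messages: list of (date, text); returns {range key: [texts]}."""
    groups = {}
    for d, text in messages:
        key = d.isoformat() if unit == "day" else f"{d.year}-W{d.isocalendar()[1]}"
        groups.setdefault(key, []).append(text)
    return groups

messages = [
    (date(2013, 4, 1), "Let us meet in April"),
    (date(2013, 4, 2), "Where shall we go?"),
    (date(2013, 4, 8), "Booked the tickets"),
]
print(classify_by_range(messages, reference_range(40)))   # classified per date
print(classify_by_range(messages, reference_range(160)))  # classified per week
```

A short drag thus yields three date groups, while a longer drag collapses the first two messages into a single week group.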
Furthermore, the controller 180 may further include a setting unit 181 for setting at least one search keyword for retrieving an instance message.
When it is detected that the search keyword set by the setting unit 181 is contained in an instance message with any one of the reference ranges determined according to the touch gesture, the controller 180 may display an indicator indicating the detected search keyword on an object indicating the corresponding reference range.
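The keyword detection and indicator display may be sketched as follows. The dictionary shapes and the form of the indicator are illustrative assumptions:

```python
# Hypothetical sketch: for each reference range, detect which of the set
# search keywords occur in its messages, and attach the found keywords as an
# indicator on the object representing that range.
def detect_indicators(ranges, keywords):
    """ranges: {range label: [message texts]}; returns {label: found keywords}."""
    indicators = {}
    for label, texts in ranges.items():
        found = {kw for kw in keywords for text in texts if kw in text}
        if found:
            indicators[label] = sorted(found)
    return indicators

ranges = {
    "2013-04-01": ["Let us meet in April", "Where shall we go?"],
    "2013-04-08": ["Booked the tickets"],
}
print(detect_indicators(ranges, {"April", "tickets"}))
```

Each entry of the result corresponds to an indicator that would be displayed on the image object indicating the relevant reference range.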
Furthermore, the controller 180 may display the instance message in a first region of the display unit 151 and display a search window for retrieving information associated with the instance message in a second region distinguished from the first region.
Here, the information associated with the instance message may include all information such as an instance message writer, information of a specific object contained in the instance message and a time during which the instance message is transmitted and received, and the like, for example.
In this manner, embodiments disclosed herein may provide an interface capable of classifying transmitted and received instance messages into each subject selected according to a touch, classifying them into each reference range determined according to a touch or allowing the user to directly enter the related keyword, thereby providing an environment capable of more easily and speedily retrieving a specific instance message and/or its related instance messages desired to be retrieved by the user.
Hereinafter, a method of classifying and displaying instance messages transmitted and received by the mobile device 100 into each subject according to a touch will be described with reference to FIGS. 1, 2, 4A and 4B.
The mobile device 100 according to embodiments disclosed herein may include a display unit 151 (refer to FIG. 3) disposed at one surface thereof, for example, front surface thereof, and the display unit 151 is configured to enable a touch input.
FIG. 2 is a view illustrating a control method for classifying and displaying instance messages transmitted and received by the mobile device 100 according to an embodiment of the present disclosure into each subject selected according to a touch, and FIGS. 4A and 4B are conceptual views for explaining the control method in FIG. 2.
First, referring to FIG. 2, the mobile device 100 displays an instance message transmitted and received through the wireless communication unit 110 on the display unit 151 (S200).
The instance message may be transmitted and received through a messenger provided from the mobile device 100 according to the present disclosure or a message related application (for example, Kakao Talk, Daum People, WhatsApp) downloaded from the mobile device 100. Furthermore, the instance message may be transmitted and received between two talkers, and transmitted and received in the form of group chatting by two or more talkers.
Referring to FIG. 4A, instance messages displayed on the display unit 151 may be transmitted and received by a plurality of talkers B, C and D 411, 413, 417, and the transmitted and received instance messages 412, 414, 416, 418, 420 may comprise data in various formats including text, images, videos, and the like.
The controller 180 may sense a first touch gesture to a specific instance message, for example, the instance message 416, in a state that the instance message is displayed (S210).
Here, the first touch gesture may be a touch gesture for making a long press to a region in which the specific instance message 416 is displayed, but may not be necessarily limited to this.
When a first touch gesture is sensed as described above, the controller 180 classifies the instance message 416 into a message belonging to a subject selected according to a second touch gesture (S220).
Here, the second touch gesture may be a touch gesture with a scheme which is distinguished from the foregoing first touch gesture, for example, a touch and drag applied to the instance message, or a touch to a selection menu popped up according to the first touch gesture.
FIG. 4A illustrates an embodiment of selecting the subject when the second touch gesture is a touch to a selection menu popped up according to the first touch gesture.
The controller 180 may sense a touch to a selection menu popped up according to the first touch gesture, and when a specific subject is selected according to the touch to a selection menu, the controller 180 may classify the instance message into a message belonging to the selected subject.
Specifically, as illustrated in FIG. 4A, when a first touch gesture to the instance message 416 is sensed, a selection menu 430 including one or more selectable objects is popped up.
A selectable object in the selection menu 430 may include an object displayed with text "register as a subject", "insert message into the existing subject" and the like, for example. Furthermore, subjects previously registered as a lower item of the object displayed with text "insert message into the existing subject", for example, "overseas travel", "project A" and the like may be displayed.
When a touch to "register as a subject" in the selection menu 430 is sensed, the controller 180 creates a new subject, and classifies the instance message 416 sensed by the first touch gesture into a message belonging to the new subject. At this time, information associated with the created subject may be stored in the memory 160 of the mobile device 100.
Specifically, when a first touch gesture to the instance message 416 is sensed in FIG. 4A, and then a touch to the object "register as a subject" among the objects of the popped up selection menu 430 is applied, the controller 180 creates and registers text "Let us meet in April" contained in the instance message 416 sensed by the first touch gesture as a new subject.
On the other hand, when the second touch gesture is a touch to a selection menu popped up according to the first touch gesture, and a new instance message is transmitted in a state that a specific subject is selected according to the second touch gesture, the controller 180 may classify the transmitted instance message into a message belonging to the specific subject. In other words, the controller 180 automatically classifies instance messages transmitted after selecting the specific subject as messages belonging to the selected subject.
Specifically, if a first touch gesture to the instance message 416 is sensed in FIG. 4A and a touch to "overseas travel" which is a lower item of the object "insert message into the existing subject" among the objects of the popped up selection menu 430 is applied, then the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages belonging to the subject "overseas travel" are displayed. Then, when a new instance message is transmitted, the controller 180 may immediately classify the relevant instance message into an instance message belonging to the subject "overseas travel".
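The classification flow described above can be sketched as follows. This is an illustrative sketch only; the class and method names are hypothetical and not part of the disclosed embodiment:

```python
# Hypothetical sketch of the controller's subject classification:
# "register as a subject" creates a new subject from the message text,
# "insert message into the existing subject" files the message under an
# existing subject, and messages transmitted while a subject is selected
# are auto-classified into that subject.
class SubjectRegistry:
    def __init__(self):
        self.subjects = {}           # subject name -> list of messages
        self.current_subject = None  # subject selected via the second touch gesture

    def register_as_subject(self, message):
        # The message text itself becomes the new subject.
        self.subjects[message] = [message]
        self.current_subject = message

    def insert_into_subject(self, message, subject):
        self.subjects.setdefault(subject, []).append(message)
        self.current_subject = subject

    def transmit(self, message):
        # New messages sent while a subject is selected are auto-classified.
        if self.current_subject is not None:
            self.subjects[self.current_subject].append(message)
        return message
```
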
Referring to FIG. 4B, FIG. 4B illustrates an embodiment of selecting the subject when the second touch gesture is a touch and drag to the instance message.
The controller 180 can sense a touch and drag to the instance message 416, and when the touch and drag being moved to a region associated with the specific subject is sensed, the controller 180 classifies the instance message 416 into an instance message belonging to the specific subject.
Specifically, as illustrated in FIG. 4B, an object 450 indicating a specific subject previously registered and/or selected according to a touch may be displayed on the display unit 151. In FIG. 4B, an object 450 indicating the previously registered subject, for example, an object 450 displayed with text "Current subject: Let us meet in April", is displayed in one region (upper region) of the display unit 151. Here, the object 450 may include a subject comprised of at least one of various texts and images.
The controller 180 may edit at least one of texts and images contained in the object 450 through user input. Furthermore, the controller 180 may control the object 450 to disappear from the display unit 151 at normal times, and control the object 450 to appear when a touch is sensed at one region (upper region) of the display unit 151.
Subsequently, referring to FIG. 4B, in a state that the object 450 indicating a previously registered subject is displayed on the display unit 151, the controller 180 senses a first touch gesture to the instance message 420 of an image type, and senses a touch and drag using the instance message 420 as a touch starting point. When the sensed touch and drag being moved to a region associated with a subject, for example, a region in which the object 450 is displayed is sensed, the controller 180 classifies the instance message 420 into a message belonging to the subject indicated by the object 450.
Here, the region associated with the subject may further include a region displayed with a message belonging to a specific subject in addition to a region in which an object indicating the specific subject is displayed on the display unit 151.
For example, in FIG. 4B, when a touch and drag using the instance message 420 as a touch starting point being moved to a region in which the subject 450 is displayed is sensed, the controller 180 classifies the instance message 420 into a message belonging to the subject 450. Furthermore, even when a touch and drag using the instance message 420 as a touch starting point being moved to a region in which another instance message belonging to the subject 450 is displayed is sensed, the controller 180 classifies the instance message 420 into a message belonging to the subject 450.
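The drag-to-classify step above amounts to hit-testing the drop point of the drag against the regions associated with each subject (the subject object's own region or the region of any message already belonging to it). A minimal sketch, with hypothetical rectangle coordinates:

```python
# Illustrative hit-test: the drop point of a touch-and-drag is matched
# against rectangular regions associated with subjects. Returns the
# subject whose region contains the point, or None.
def subject_for_drop(drop_point, regions):
    """regions: list of (subject, (x1, y1, x2, y2)) rectangles."""
    x, y = drop_point
    for subject, (x1, y1, x2, y2) in regions:
        if x1 <= x <= x2 and y1 <= y <= y2:
            return subject
    return None
```
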
When an instance message is classified as described above, the controller 180 displays a message belonging to the selected subject to be visually distinguished from the other instance messages (S230).
Specifically, the controller 180 may display messages belonging to the selected subject to be distinguished, for example, through connecting them with the same tree shaped guideline, performing the same edge processing, performing a highlight processing, displaying the same color for messages belonging to the selected subject, or changing the existing shape or size to be distinguished from the other instance messages, and the like.
For example, as illustrated in FIG. 4B, instance messages 416, 420 classified into messages belonging to the subject 450 are connected to each other and displayed through tree shaped guidelines 455, 465 coming out of the subject 450, and thus distinguished from the other instance messages 412, 414, 418.
Furthermore, according to another embodiment, when a touch (for example, proximity touch) to a region in which instance messages are displayed on the display unit 151 is sensed, the controller 180 may display a subject selected in a preview format around the region in which the touch sensed instance messages are displayed to be distinguished from the other instance messages.
As described above, according to the foregoing embodiment, transmitted and received instance messages may be classified into each subject selected according to a touch and messages belonging to the selected subject may be displayed to be distinguished from the other instance messages, thereby providing an environment capable of retrieving and managing instance messages for each subject as described below.
From now on, a control method for registering, displaying, editing and managing instance messages classified as described above for each subject will be described with reference to FIGS. 1, 3A, 5A and 5B. A method of registering a subject has been described in the above, and thus will be omitted herein.
FIG. 3A is a view for explaining a control method for displaying a list of objects indicating a registered subject, and FIGS. 5A and 5B are conceptual views for explaining the control method in FIG. 3A.
Referring to FIG. 3A, first, the mobile device 100 displays an instance message transmitted and received through the wireless communication unit 110 on the display unit 151 (S300).
In a state that the instance message is displayed, the controller 180 senses a third touch gesture to the display unit 151 (S310).
Here, the third touch gesture may be a touch to a control key 570 displayed in one region of the display unit 151 as illustrated in FIG. 5A, for example. The controller 180 may control the control key 570 to disappear from the display unit 151 at normal times, and control the control key 570 to appear on the display unit 151 when there exists a user input with a predetermined scheme. Furthermore, the third touch gesture may be a touch gesture for flicking a touch applied to one region of the display unit 151 in the predetermined direction as illustrated in FIG. 5B, but may not be necessarily limited to this. Furthermore, the third touch gesture indicates a touch gesture with a scheme which is distinguished from the foregoing first touch gesture and second touch gesture.
When a third touch gesture to the display unit 151 is sensed as described above, the controller 180 may display a list of objects indicating a previously registered subject (S320).
In other words, when a third touch gesture is sensed, the controller 180 may switch the current screen of the display unit 151 to a screen on which a list of objects indicating the specific subject is displayed.
Specifically, when a touch input to the control key 570 "view subject" displayed in one region of the display unit 151 is sensed as illustrated in FIG. 5A or a touch input for flicking in a predetermined direction (for example, left direction) to one region of the display unit 151 is sensed as illustrated in FIG. 5B, the controller 180 may switch the current screen of the display unit 151 to a screen on which a list 580 of objects indicating a previously registered subject is displayed.
Here, the list 580 may include objects 581 to 584 indicating one or more previously registered subjects as illustrated in the drawing. For example, referring to FIG. 5A, the subject-1 581 is a subject for "test", and the subject-2 582 is a subject for "overseas travel", and the subject-3 583 is a subject for "project A", and the subject-4 584 is a subject for "project B". The objects 581 to 584 indicating previously registered subjects may further include image information indicating a subject in addition to text information.
The list 580 of objects indicating the subject may be displayed in the order of storing them in the memory 160. Furthermore, the shape, order and the like of the displayed objects 581 to 584 may be changed on the list 580 through user input.
In a state that the list 580 is displayed on the display unit 151 as described above, the controller 180 may sense a touch to a specific object on the list, and accordingly, the controller 180 may perform at least one function associated with a subject corresponding to a specific object to which a touch is made.
For example, when a short-press touch to a specific object indicating a specific subject is sensed, the controller 180 may display at least one instance message belonging to the corresponding specific subject on the display unit 151. Furthermore, for example, when a long-press touch to a specific object indicating a specific subject is sensed, the controller 180 may pop up a control menu window for selectively performing an edit function to the corresponding specific subject itself, for example, functions such as change object, delete subject, send subject, change subject attribute, add bookmark and the like.
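The short-press/long-press dispatch described above can be sketched as a simple duration check. The threshold value is an assumption for illustration, not a value disclosed herein:

```python
# Hypothetical dispatch of a touch on a subject object in the list:
# a short press opens the subject's messages, a long press opens the
# control menu for editing the subject itself.
LONG_PRESS_MS = 500  # assumed threshold

def on_subject_touch(press_duration_ms):
    if press_duration_ms >= LONG_PRESS_MS:
        return "show_control_menu"    # change/delete/send subject, etc.
    return "show_subject_messages"    # display messages of the subject
```
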
Hereinafter, functions associated with the foregoing subject will be described in more detail with reference to FIGS. 5C, 6 and 7.
FIG. 5C is a conceptual view for explaining a function of displaying an instance message belonging to a specific subject selected from the foregoing subject list and a control method of inserting a new message thereto according to an embodiment of the present disclosure.
When a third touch gesture to the display unit 151 is sensed in a state that a transmitted and received instance message is displayed on the display unit 151, the controller 180 displays a screen displayed with the list 580 on the display unit 151 as illustrated in FIG. 5C. On the other hand, according to another embodiment, the controller 180 may immediately display the list 580 screen using a specific control key or the like even without passing through a state that the transmitted and received instance message is displayed on the display unit 151.
When a first touch to a specific object of the list 580 displayed on the display unit 151 is sensed, the controller 180 displays instance messages belonging to a subject corresponding to the specific object on the display unit 151. For example, when a short-press touch to the subject-2 582 is sensed as illustrated in FIG. 5C, instance messages 512-514 belonging to the subject-2 582 are displayed on the display unit 151.
When a new message, for example, "I like it, too" 516, is written through user input in a state that instance messages 512-514 belonging to the subject-2 582 are displayed, and a transmit command is entered through the transmit key 575, the controller 180 automatically classifies the transmitted new message into an instance message belonging to the subject-2 550.
Furthermore, the controller 180 may further form a guideline 575 that connects guidelines 555, 565 for other instance messages to a new instance message 516.
FIG. 6 is a view for explaining a method of performing a function of deleting a specific subject selected from the foregoing subject list according to an embodiment of the present disclosure.
As illustrated in FIG. 6, when a second touch (for example, long-press touch) to a specific object 682 of the list 680 is sensed in a state that the list 680 is displayed on the display unit 151, the controller 180 may pop up a control menu 660 including a function key that can be selected for a subject corresponding to the relevant specific object.
The popped up control menu 660 may include a subject delete function, a subject share function, a main subject register function, a secret subject register function, a bookmark function, and the like, but may not be necessarily limited to this, and may further include other functions which are not described herein. Furthermore, the controller 180 may perform a control command for adding a user's desired new function to the control menu 660 or deleting an unused function through user input.
When a subject delete function is selected from the control menu 660, the controller 180 deletes the selected subject-2 682 from the list 680. Accordingly, instance messages belonging to the subject-2 682 cannot be retrieved any more. However, they remain as they are in the original instance message record.
When the selected specific subject is deleted as described above, the controller 180 may delete the indication (for example, tree shaped guideline) for distinguishing instance messages belonging to the relevant subject from the other instance messages as well.
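The delete behavior described above — the subject and its grouping indication are removed, but the underlying conversation record is untouched — can be sketched as follows (an illustrative sketch; names are hypothetical):

```python
# Hypothetical delete: the subject mapping (and hence its guideline
# indication) is removed, while the original message record is unchanged.
def delete_subject(subject, subjects, record):
    """subjects: dict of subject -> messages; record: full conversation."""
    subjects.pop(subject, None)  # subject no longer retrievable via the list
    return record                # conversation record remains as it was
```
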
Furthermore, though not shown in the drawing, according to an embodiment, when a subject share function is selected from the popped up control menu 660, the controller 180 may transmit at least part (instance messages to which a share is limited can be excluded) of instance messages belonging to the selected subject to another mobile device 100 through the wireless communication unit 110.
Due to this, when a third party wants to receive information on a specific subject, the related instance messages can be sent immediately. Thus, the transmitter side does not have to make an effort to collect the related content one by one and transmit it, and the receiver side receives the conversations in the order in which they were transmitted and received, thereby allowing the subject related content to be grasped easily and quickly.
Furthermore, though not shown in the drawing, according to an embodiment, when a main subject register function is selected from the popped up control menu 660, the controller 180 may further display an indicator (for example, shape change of an object itself indicating a text, image or subject, or the like) indicating a main subject for the selected subject on the list 680.
On the other hand, when the object selected from the list 580 is a subject previously registered as a main subject, the controller 180 may display a main subject release function on the popped up control menu 660.
When it is registered as a main subject as described above, the controller 180 may configure reception notification for instance messages belonging to the relevant subject in a different manner. For example, when an instance message belonging to a subject registered as a main subject is received, the controller 180 allows it to be displayed in another reception notification mode, or reduces the notification interval and number of times.
Due to this, when an instance message belonging to a subject registered as a main subject is received, convenience may be provided to the user, who can immediately check the message in a selective manner.
Subsequently, FIG. 7 is a view for explaining a method of performing a function of registering a specific subject selected from the foregoing subject list 780 as a main subject according to an embodiment of the present disclosure.
As illustrated in FIG. 7, when a second touch (for example, long-press touch) to a specific object 782 on the list 780 is sensed in a state that a screen displayed with the list 780 is output on the display unit 151, the controller 180 may allow the control menu 760 to be popped up.
When a secret subject register function is selected from selectable functions displayed on the control menu 760, the controller 180 may further display an indicator 764 indicating secret subject registration on the subject-2 782 as illustrated in FIG. 7.
The indicator 764 may include images, text and the like, but may not be necessarily limited to this, and may include any visual indication that can be distinguished from the other subjects 781, 783, 784.
When a first touch to the subject-2 782 registered as a secret subject is sensed, the controller 180 may switch the current screen of the display unit 151 to a screen 765 for requesting a preset password input prior to displaying instance messages belonging to the subject-2 782. Here, the controller 180 may change a preset password through user input. Furthermore, when wrong passwords are entered by exceeding a predetermined number of times, the controller 180 may reinforce the display limitation of instance messages belonging to the subject-2 782 step by step.
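The secret-subject gate described above can be sketched as a password check with an attempt counter. The lockout-after-three-attempts policy below is an assumption used to illustrate "reinforcing the display limitation step by step"; the disclosure does not fix a specific number:

```python
# Hypothetical sketch of the secret subject: messages are shown only after
# the preset password is entered; repeated wrong entries tighten the limit
# (here modeled as a lockout after a fixed number of failed attempts).
class SecretSubject:
    MAX_ATTEMPTS = 3  # assumed limit, not from the disclosure

    def __init__(self, password):
        self.password = password
        self.failed = 0

    def unlock(self, attempt):
        if self.failed >= self.MAX_ATTEMPTS:
            return "locked"            # reinforced display limitation
        if attempt == self.password:
            self.failed = 0
            return "show_messages"
        self.failed += 1
        return "retry"
```
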
On the other hand, when an object selected from the list 780 is previously registered as a secret subject, the controller 180 may display a secret subject release function on the popped up control menu 760.
The foregoing function provides convenience to the user by limiting another person's reading of instance messages belonging to a subject registered as a secret subject.
Furthermore, though not shown in the drawing, according to an embodiment, when a bookmark function is selected from the popped up control menu 760, the controller 180 may display a web page corresponding to specific link information contained in the selected subject on the display unit 151. In other words, when a bookmark function is selected from the popped up control menu 760, the controller 180 implements a web browser application and displays a web page screen corresponding to specific link information contained in the selected subject on the display unit 151.
Due to this, it may be possible to provide the user with the convenience of more quickly retrieving a specific web page associated with a specific subject.
On the other hand, there is a case where an instance message belonging to a specific subject is immediately received from another mobile device 100.
In this case, the controller 180 may automatically classify the received instance message into a message belonging to a specific subject. Furthermore, the controller 180 may display an indication (for example, tree shaped guideline) for distinguishing the received instance message from the other instance messages on the display unit 151.
Furthermore, when an instance message belonging to a specific subject is received, the controller 180 may display an indicator indicating the reception on the display unit 151. Furthermore, when an instance message belonging to a specific subject is received, the controller 180 may output predetermined reception notification through the audio output module 152.
For example, referring to FIG. 7, when an instance message belonging to a specific subject on the list 780 is received, the controller 180 classifies the received instance message into a message belonging to the specific subject, and further displays an indicator 790 indicating the reception of the received instance message on the corresponding one of the objects 781-784 indicating the relevant specific subject. Here, the indicator 790 may be a text or image indicating the number of received instance messages belonging to a specific subject as illustrated in FIG. 7, but may not be necessarily limited to this.
When a touch is applied to the indicator 790 indicating the reception of an instance message, the controller 180 may display the received instance message along with messages belonging to the relevant subject on the display unit 151. When check for the received instance message is carried out, the controller 180 may control the indicator 790 to disappear from the display unit 151.
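The indicator behavior above is essentially a per-subject unread counter: it increments on reception and disappears once the subject is checked. A minimal sketch (hypothetical names):

```python
# Hypothetical per-subject unread counter backing the indicator 790:
# reception increments the count; checking the subject clears it, which
# corresponds to the indicator disappearing from the display.
class UnreadIndicator:
    def __init__(self):
        self.counts = {}  # subject -> number of unchecked received messages

    def on_receive(self, subject):
        self.counts[subject] = self.counts.get(subject, 0) + 1

    def on_check(self, subject):
        self.counts.pop(subject, None)  # indicator disappears after checking
```
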
On the other hand, there is a case where the user wants to retrieve only data with a specific format from the transmitted and received instance messages. In this connection, FIG. 8 illustrates a method of classifying and displaying instance messages containing data with a predetermined format into messages belonging to a specific subject when they are transmitted and received according to an embodiment of the present disclosure.
As illustrated in FIG. 8, the transmitted and received instance messages displayed on the display unit 151 may include instance messages 814, 822 with a text format, instance messages 812, 820 with an image file format, and instance messages 816, 818 with a video file format.
The controller 180 may classify only instance messages containing data with a specific format in a separate manner.
Specifically, the controller 180 may extract instance messages with a predetermined specific format, and classify the extracted messages into a previously generated or automatically generated specific subject. To this end, the controller 180 may include a detection unit (not shown) for extracting data with a predetermined specific format from the transmitted and received instance messages.
Here, the specific format may include information with a specific format contained in text, for example, link address information, phone number information, and address information and the like as well as an image file format, and a video file format.
For example, in FIG. 8, the controller 180 may extract instance messages 812, 820 with an image file format and instance messages 816, 818 with a video file format, respectively, from the transmitted and received instance messages 812, 814, 816, 818, 820, 822, to classify them into a separate subject.
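The detection unit's format classification can be sketched as below. The regular expressions are illustrative patterns for link addresses and phone numbers, not patterns disclosed in the specification:

```python
import re

# Hypothetical sketch of the detection unit: classify each message by
# format, detecting link addresses and phone numbers inside text messages
# in addition to image and video file formats.
LINK_RE = re.compile(r"https?://\S+")
PHONE_RE = re.compile(r"\b\d{2,4}-\d{3,4}-\d{4}\b")

def classify_format(message):
    """message: dict with 'type' ('text'/'image'/'video') and 'text'."""
    if message["type"] in ("image", "video"):
        return message["type"]
    if LINK_RE.search(message["text"]):
        return "link"
    if PHONE_RE.search(message["text"]):
        return "phone"
    return "text"
```
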
The controller 180 may sense a third touch gesture applied to the display unit 151, for example, a touch to the control key "view subject", and display the list 880 with an object 883 indicating video and an object 884 indicating image further added thereto.
When a touch to an object indicating an image 884 is sensed on the displayed list 880, the controller 180 may switch the current screen of the display unit 151 into a screen on which only instance messages 812, 820 with an image file format are displayed.
In FIG. 8, an object 850 displayed with a subject "image" may be displayed in one region of the display unit 151. Furthermore, the display unit 151 may connect and display instance messages 812, 820 with an image file format belonging to the subject using a tree shaped guideline 860.
In the above, embodiments for classifying, editing, managing and retrieving instance messages transmitted and received at the mobile device 100 for each subject according to a touch have been described. However, it may not be necessarily limited to the foregoing embodiments, and other functions associated with the selected subject and other user interfaces for them can be also implemented.
From now on, a control method of classifying instance messages transmitted and received at the mobile device 100 into each reference range determined according to a touch will be described with reference to FIGS. 1, 3B, 9A through 9C, and 10A through 10C.
First, FIG. 3B is a view illustrating a control method for classifying total instance messages into each reference range determined according to a touch.
Referring to FIG. 3B, first, the mobile device 100 displays an instance message transmitted and received through the wireless communication unit 110 on the display unit 151 (S400).
In a state that the instance message is displayed, the controller 180 senses a fourth touch gesture to the display unit 151 (S410).
Here, the fourth touch gesture may be a touch to a control key displayed in one region of the display unit 151, and may be a flicking touch input or drag touch input applied to one region of the display unit 151 in a specific direction. However, it may not be necessarily limited to this, and may include all touch gestures with a predetermined different scheme which is distinguished from the foregoing first through third touch gestures.
When a fourth touch gesture is sensed as described above, the controller 180 classifies total instance messages into each reference range determined according to the fourth touch gesture (S420).
Here, the reference range denotes grouping by each predetermined unit according to a time during which instance messages are transmitted and received. For example, the reference range may be grouped by a date unit or grouped by a week unit.
Furthermore, determining a reference range according to a fourth touch gesture denotes that each classified time range varies according to, for example, the direction, drag length and the like of the touch gesture. In other words, the time range for classifying instance messages can be determined according to a touch.
On the other hand, according to another embodiment, the meaning of determining a reference range according to a fourth touch gesture may denote that instance messages are classified into each predetermined time range, for example, each time range with a fixed week unit.
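The grouping by reference range can be sketched as below, with a date unit grouping by calendar day and a week unit grouping by ISO week. The timestamps and key formats are illustrative assumptions:

```python
from datetime import date

# Hypothetical sketch of classifying messages into reference ranges:
# "day" groups by calendar day, "week" by ISO (year, week) pair.
def group_by_range(messages, unit="day"):
    """messages: list of (date, text) tuples."""
    groups = {}
    for d, text in messages:
        key = d.isoformat() if unit == "day" else tuple(d.isocalendar())[:2]
        groups.setdefault(key, []).append(text)
    return groups
```
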
More specific examples will be described in detail with reference to FIGS. 9A through 9C, and 10A through 10C.
First, FIG. 9A is a view for explaining a control method of classifying total instance messages into each reference range determined according to a touch according to an embodiment of the present disclosure.
In a state that the transmitted and received instance messages are displayed, the controller 180 may sense a fourth touch gesture to the display unit 151, and switch the current screen of the display unit 151 into a screen displayed with at least one image object indicating a reference range determined according to the fourth touch gesture.
Here, the fourth touch gesture indicates a gesture in which at least two touch positions applied to the display unit 151 are closer to each other. Furthermore, the at least two touch positions applied to the display unit 151 may be applied to the background screen other than a region displayed with an instance message, for example.
The controller 180 may narrowly set the reference range determined according to a touch to a date unit when a distance between touch positions applied to the display unit 151 is relatively wide. Furthermore, the controller 180 may widely set the reference range determined according to a touch to a week unit when a distance between touch positions applied to the display unit 151 is relatively narrow. Here, a distance between touch positions applied to the display unit 151 is determined in a relative manner.
Furthermore, the controller 180 may flexibly set the reference range based on a touch direction of the fourth touch gesture, a distance between the touch positions, a number of times for which the touch is repeated, and the like, and instance messages are classified according to the set reference range.
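The mapping from pinch distance to reference-range unit described above can be sketched with a single threshold. The pixel threshold is an assumption; as the text notes, the distance is determined in a relative manner in practice:

```python
# Hypothetical mapping: a relatively wide gap between the two touch
# positions selects the finer (date) unit, a relatively narrow gap the
# coarser (week) unit.
PINCH_THRESHOLD_PX = 200  # assumed value for illustration

def range_unit_for_pinch(distance_px):
    return "day" if distance_px >= PINCH_THRESHOLD_PX else "week"
```
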
As illustrated in FIG. 9A, when a gesture in which at least two touch positions applied to the display unit 151 are closer to each other is sensed in a state that instance messages 912, 914, 916 are displayed on the display unit 151, the controller 180 may classify instance messages in the range of a date unit to display a list 940 of image objects indicating each date unit range on the display unit 151.
Text information including year, month, day and weekday may be further displayed on image objects 941 to 946 displayed on the display unit 151.
Furthermore, the controller 180 may further display a specific keyword on the displayed image objects 941 to 946. To this end, the controller 180 may further include a setting unit 181 for setting at least one search keyword. Furthermore, the controller 180 may further include a detection unit (not shown) for detecting the set search keyword from instance messages.
Here, a keyword denotes information comprised of text that summarizes an instance message, as an object contained in the instance message. Furthermore, being contained in an instance message may be expressed as "information associated with an object contained in an instance message being tagged".
A tag is a sort of keyword set in which words indicating the feature, meaning, title, subject, and the like of instance messages are previously stored, and an object itself contained in an instance message may include tag information even when tag information is not directly entered through user input or not entered by the user. Furthermore, the tag information may be expressed as metadata, wherein the metadata is data for effectively finding the tag or keyword, for example, data for explaining an image object.
When a search keyword set by the setting unit 181 is contained in instance messages with a specific reference range, the controller 180 may further display an indicator indicating the set search keyword on an image object corresponding to the specific reference range. At this time, the indicator may be displayed along with other images (for example, a thumbnail image) in addition to text indicating the search keyword.
For example, the image object 941 among the image objects 941 to 946 as illustrated in FIG. 9A may include a preset keyword "meeting" detected from instance messages transmitted and received on "Thursday, January 10, 2013", and the image object 943 may include a preset keyword "meeting" detected from instance messages transmitted and received on "Saturday, January 12, 2013".
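The keyword indicator logic above — for each reference range, check whether a preset search keyword appears in any of that range's messages — can be sketched as follows (hypothetical names):

```python
# Hypothetical sketch: given messages already grouped by reference range,
# return the ranges whose messages contain the preset search keyword,
# i.e. the image objects on which the indicator should be displayed.
def keyword_indicator(groups, keyword):
    """groups: dict of range key -> list of message texts."""
    return {key: keyword for key, texts in groups.items()
            if any(keyword in t for t in texts)}
```
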
In this manner, when a fifth touch gesture to the display unit 151 is sensed in a state that the current screen of the display unit 151 is switched to a screen displayed with an image object, the controller 180 may switch the current screen of the display unit 151 to a previous screen displayed with the instance message.
Here, the fifth touch gesture indicates a gesture in which at least two touch positions applied to the display unit 151 are away from each other. Furthermore, the at least two touch positions applied to the display unit 151 may be applied to the background screen other than a region displayed with an instance message, for example.
The controller 180 may control such that a range in which the instance message is displayed is varied according to the extent of touch positions applied to the display unit 151 being away from each other.
For example, when a distance between touch positions applied to the display unit 151 is relatively wide, the controller 180 may display the range displayed with the instance message in a narrow manner. Furthermore, when a distance between touch positions applied to the display unit 151 is relatively narrow, the controller 180 may display the range displayed with the instance message in a wide manner. Here, a distance between touch positions applied to the display unit 151 is determined in a relative manner.
FIGS. 9B and 9C are conceptual views for explaining a method of displaying the corresponding instance message according to a touch applied to the image object in a state that the screen of the display unit 151 is switched to a screen displayed with the image objects 941 to 946.
In an embodiment according to the present disclosure, when a touch to any one of the displayed image objects is sensed, the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
For example, when a touch to the image object 941 illustrated in FIG. 9B is sensed, the controller 180 displays instance messages transmitted and received on "Thursday, January 10, 2013" on the display unit 151 from the first instance message of the relevant day. At this time, when there are a lot of instance messages transmitted and received on the relevant day, the controller 180 may display only some instance messages from the first instance message of the relevant day. FIG. 9B illustrates a view in which instance messages are displayed from the instance message 912 "Good morning" transmitted for the first time at "9:20 a.m." on "Thursday, January 10, 2013".
In an embodiment according to the present disclosure, when a touch to the indicator displayed on the output image object is sensed, the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages are displayed from an instance message containing the touch sensed indicator.
For example, when a touch to the indicator indicating a search keyword "meeting" 950 on the image object 941 illustrated in FIG. 9C is sensed, the controller 180 displays instance messages transmitted and received on "Thursday, January 10, 2013" from the first instance message containing the search keyword "meeting". In other words, FIG. 9C illustrates a view in which instance messages are displayed from an instance message "meeting at 5 o'clock" 914 received at "3:45 p.m." on "Thursday, January 10, 2013".
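Displaying messages from the first one containing the touched indicator's keyword amounts to a simple scan of the time-ordered message list. The sketch below assumes a hypothetical message structure (a list of dictionaries with a "text" field), which is not specified in the disclosure:

```python
def messages_from_keyword(messages, keyword):
    """Return the time-ordered messages starting at the first one containing keyword.

    `messages` is assumed to be sorted in transmission/reception order.
    """
    for i, msg in enumerate(messages):
        if keyword in msg["text"]:
            return messages[i:]  # display from the first match onward
    return []  # no message in the range contains the keyword
```

In the FIG. 9C example, a touch on the "meeting" indicator would start the display at the message "meeting at 5 o'clock" and continue with all later messages.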
FIGS. 10A through 10C are views for explaining a method of classifying image objects indicating each reference range into a broader reference range according to the number of touches, and displaying instance messages related thereto, according to an embodiment of the present disclosure.
First, when the fourth touch gesture is sensed again in a state in which the screen of the display unit 151 has been switched to a screen on which image objects 1041 to 1046 are displayed according to the fourth touch gesture, the controller 180 may set a broader reference range for classifying instance messages. Even in this case, the distance between the touch positions applied to the display unit 151 may be determined in a relative manner as described above. In other words, the controller 180 may vary the classification range according to the extent to which the touch positions applied to the display unit 151 move closer to each other.
For example, in FIG. 10A, the number of image objects 1041' to 1043' indicating a reference range determined according to the repeated fourth touch gestures is reduced, and the classification reference range displayed on each image object is further broadened: the classification reference range of the first displayed image object is broadened from "Thursday, January 10, 2013" to "December 31, 2012 to January 6, 2013".
When a touch is sensed at any one of the displayed image objects, the controller 180 switches the current screen of the display unit 151 to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
For example, when a touch to the image object 1042' illustrated in FIG. 10B is sensed, the controller 180 displays, on the display unit 151, instance messages transmitted and received during "January 7, 2013 to January 13, 2013", starting from the first instance message of the relevant range. In other words, the instance messages are displayed in time order from the instance message "Good morning" 1012 transmitted at "9:22 a.m." on "Monday, January 7, 2013".
When a touch is sensed at any one of indicators displayed on the image objects, the controller 180 may switch the current screen of the display unit 151 to a screen on which instance messages with the corresponding reference range are displayed from an instance message containing the touch sensed indicator.
For example, when a touch to the indicator "meeting" 1050'b among the indicators 1050' displayed on the image object 1042' illustrated in FIG. 10C is sensed, the controller 180 displays, on the display unit 151, instance messages transmitted and received during "January 7, 2013 to January 13, 2013", starting from the first instance message containing the search keyword "meeting". In other words, in FIG. 10C, the instance messages are displayed in time order from the instance message "meeting at 5 o'clock" 1014 received at "3:45 p.m." on "Thursday, January 10, 2013".
In this manner, when instance messages are classified into predetermined reference ranges, the controller 180 may switch the current screen of the display unit 151 to a screen on which at least one image object indicating each classified reference range is displayed. An image object displayed on the display unit 151 may thus provide a time-based search reference for retrieving the user's desired specific instance message from instance messages transmitted and received in the past.
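The classification that the image objects represent can be illustrated as grouping time-ordered messages into day or week buckets, with a repeated fourth touch gesture moving to the broader bucket size (fewer image objects, each covering a wider range). The message format and bucket-key scheme below are assumptions for the example, not part of the disclosure:

```python
from collections import defaultdict
from datetime import date

def classify_by_range(messages, level):
    """Group messages into reference ranges: 'day' or 'week' buckets.

    Repeating the fourth touch gesture would move from 'day' to 'week'
    (broader ranges, fewer image objects). Each message is assumed to be
    a dict with a datetime.date under the "date" key.
    """
    buckets = defaultdict(list)
    for msg in messages:
        d = msg["date"]
        if level == "day":
            key = d.isoformat()                      # e.g. "2013-01-10"
        elif level == "week":
            year, week, _ = d.isocalendar()          # ISO year/week numbering
            key = f"{year}-W{week:02d}"              # e.g. "2013-W02"
        else:
            raise ValueError(f"unknown level: {level}")
        buckets[key].append(msg)
    return dict(buckets)
```

With the dates used in the figures, day-level classification yields one bucket per calendar day, while week-level classification merges "Monday, January 7" and "Thursday, January 10" into a single weekly range, mirroring how the broadened image objects absorb several days each.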
Hereinafter, a method of displaying a search window for retrieving information associated with an instance message in a region distinguished from a region in which the instance message is displayed, and a method of retrieving using the displayed search window, will be described with reference to FIGS. 1, 2, and 11A through 11D. The embodiments described herein may be implemented together with the foregoing embodiments of classifying instance messages by subject and by reference range determined according to a touch, or in an independent manner.
First, FIGS. 11A and 11B are views for explaining a control method of displaying a search window for retrieving information associated with an instance message, and FIGS. 11C and 11D are views for explaining a control method of entering a keyword to the search window.
The controller 180 may sense a change of inclination of the mobile device 100 to determine the current screen direction of the display unit 151.
When the determined screen direction of the display unit 151 is a horizontal mode, the controller 180 may display a search window for retrieving an instance message or information associated with an instance message in at least one of the left and right regions of the display unit 151.
When the determined screen direction of the display unit 151 is a vertical mode, the controller 180 may display a search window for retrieving an instance message or information associated with an instance message in at least one of the upper and lower regions of the display unit 151.
In other words, when the screen direction is changed, the region in which the search window is displayed is also changed in a corresponding manner.
For example, the screen direction illustrated in FIG. 11A is a vertical mode, and the screen direction illustrated in FIG. 11B is a horizontal mode.
In a state in which the displayed screen direction has been determined, when a flicking or drag touch input entered to the display unit 151 in a predetermined direction is sensed, the controller 180 may display a search window including a predetermined keyword input bar 1173 and a search execution key 1175 in a second region distinguished from a first region in which an instance message is displayed. At this time, at least part of the first region may be reduced compared to its size prior to applying the touch input.
For example, when the determined screen direction is in a vertical mode as illustrated in FIG. 11A, the controller 180 may sense a flicking or drag touch input in the first direction (upward) applied to one region (for example, upper/lower region) of the display unit 151, and accordingly, display the search window in the second region (for example, upper/lower region of the display unit 151) distinguished from the first region displayed with an instance message.
On the other hand, according to another embodiment, the controller 180 may replace the instance message input window with a search window if a predetermined criterion is satisfied. Here, the predetermined criterion may be a touch gesture of a predetermined scheme applied to the display unit 151. When the instance message input window is replaced with a search window in this manner, it may be implemented such that the "send command" function of the "send" key displayed on the existing instance message input window is deactivated and a "search command" function is activated. At this time, the key may display text information ("send" or "search") indicating the currently activated function. On the other hand, according to another example, only one input window may be provided together with a plurality of control keys (for example, a send key and a search key).
Furthermore, only a "search window" according to an embodiment of the present disclosure is illustrated in FIG. 11A, but the foregoing instance message input window can, of course, be displayed as well.
Furthermore, the controller 180 may display an instance message input window in the first region of the display unit 151 and the search window in the second region distinguished from the first region. In addition, the controller 180 may control the location and size of the first and second regions according to a predetermined criterion. Moreover, the controller 180 may control the search window to disappear from the display unit 151.
On the other hand, as a result of sensing a change of inclination, when the determined screen direction is in a horizontal mode as illustrated in FIG. 11B, the controller 180 may sense a flicking or drag touch input in the second direction (left direction) applied to one region (for example, left/right region) of the display unit 151, and accordingly, display the search window in the second region (for example, left/right region of the display unit 151) distinguished from the first region displayed with an instance message.
Furthermore, according to another embodiment, when a touch gesture of a scheme other than the flicking or drag touch input, for example, a touch input to the scroll bar, is sensed, the controller 180 may likewise display the search window.
When a touch to the input bar 1173 of the displayed search window is sensed, the controller 180 may display a touch keyboard for entering a keyword in a region below the input bar 1173. There is no restriction on the keywords that can be entered using the touch keyboard. When a predetermined keyword (for example, text, a numeral, an image, etc.) is entered to the keyword input bar 1173 of the displayed search window and a touch to the search execution key 1175 is sensed, the controller 180 may retrieve an instance message or information associated with the received keyword, and display a search result on the display unit 151.
Furthermore, the controller 180 may pop up a selection menu (not shown) presenting a specific transmitter, a specific search keyword, and the like according to a touch input (for example, a long-press touch input) applied to the region in which the keyword input bar 1173 is displayed. When a specific transmitter, for example, is selected in a state in which the selection menu (not shown) is displayed, the controller 180 may retrieve instance messages transmitted by the selected transmitter or information associated therewith.
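The search behavior described above, retrieving messages by an entered keyword, by a selected transmitter, or by both criteria combined, can be sketched as a simple filter. The message fields used here are hypothetical:

```python
def search_messages(messages, keyword=None, transmitter=None):
    """Retrieve messages matching an entered keyword and/or a selected transmitter.

    Either criterion may be supplied alone or combined; a message must
    satisfy every supplied criterion to be included in the result. The
    "text"/"from" message structure is an illustrative assumption.
    """
    results = []
    for msg in messages:
        if keyword is not None and keyword not in msg["text"]:
            continue  # keyword criterion supplied but not matched
        if transmitter is not None and msg["from"] != transmitter:
            continue  # transmitter criterion supplied but not matched
        results.append(msg)
    return results
```

Selecting only a transmitter from the popped-up menu corresponds to calling the filter with `transmitter` alone, while entering text in the keyword input bar corresponds to supplying `keyword` alone.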
On the other hand, if no keyword is entered to the displayed search window for a predetermined period of time, the controller 180 may control the displayed search window to disappear from the display unit 151.
In the above, a method of displaying a search window for retrieving transmitted and received instance messages has been described with reference to FIGS. 11A and 11B. Hereinafter, a control method of entering a keyword to the displayed search window will be described with reference to FIGS. 11C and 11D.
First, referring to FIG. 11C, when a drag touch input applied to a specific instance message 1114 is terminated on the keyword input bar 1173 in a state that a search window is displayed in the second region distinguished from the first region displayed with instance messages, the controller 180 may display the instance message 1114 or a specific object, for example, "meeting at 5 o'clock" or "meeting", contained in the instance message 1114 on the keyword input bar 1173.
When a touch input is applied to the search execution key 1175 in a state in which the instance message 1114 or a specific object contained in the instance message 1114 is displayed on the keyword input bar 1173, the controller 180 retrieves information associated with the instance message 1114 and displays the search result on the display unit 151.
Here, information associated with the instance message 1114 may include other instance messages with the same or a similar format to the instance message 1114, a next instance message displaying a specific object contained in the instance message 1114, and the like, but is not necessarily limited thereto.
Furthermore, referring to FIG. 11D, when a drag touch input applied to the transmitter 1113 of the instance message 1114 is terminated on the keyword input bar 1173 in a state that a search window is displayed in the second region distinguished from the first region displayed with the instance message, the controller 180 displays the transmitter 1113 of the instance message 1114 on the keyword input bar 1173.
When a touch input is applied to the search execution key 1175 in a state in which the transmitter 1113 of the instance message 1114 is displayed on the keyword input bar 1173, the controller 180 displays information associated with the transmitter 1113 of the instance message 1114 as a search result on the display unit 151. Here, information associated with the transmitter 1113 of the instance message 1114 may be, for example, other instance messages transmitted by the transmitter 1113, but is not necessarily limited thereto.
Furthermore, the controller 180 may change the size and display direction of the first region displayed with an instance message and the second region displayed with a search window, and an object, a user interface or the like contained therein based on a touch input.
In this manner, a search window capable of directly retrieving information associated with an instance message may be displayed, according to a touch, in a region distinguished from the region in which instance messages are displayed, thereby allowing the user to directly enter a search criterion for the instance messages.
As described above, according to a mobile device and a control method thereof according to an embodiment of the present disclosure, transmitted and received instance messages may be classified into each subject selected according to a touch, thereby allowing the user to more easily retrieve the user's desired specific instance message and/or instance messages associated therewith.
Furthermore, the screen may be switched to display total instance messages for each reference range, for example, each day range, determined according to a touch, thereby allowing the user to intuitively and quickly retrieve instance messages transmitted and received on his or her desired specific date.
Furthermore, a search window capable of directly retrieving information associated with an instance message in a region distinguished from a region displayed with instance messages may be displayed according to a touch, thereby allowing the user to directly enter his or her desired search criterion.
Moreover, it will be apparent to those having ordinary skill in the art to which the invention pertains that the invention can be embodied in other specific forms without departing from the concept and essential characteristics thereof. It should be understood that the foregoing embodiments are merely illustrative but not restrictive in all aspects. The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning, scope and equivalent concept of the claims should be construed to be embraced by the scope of the present invention.

Claims (23)

  1. A mobile device, comprising:
    a wireless communication unit configured to transmit and receive an instance message;
    a display unit configured to enable a touch, and display the instance message transmitted and received through the wireless communication unit; and
    a controller configured to classify the instance message into a message belonging to a specific subject and display the message belonging to the specific subject to be visually distinguished from the other instance messages when a first touch gesture to the instance message is sensed and the specific subject is selected according to a second touch gesture.
  2. The mobile device of claim 1, wherein the second touch gesture is a touch to a selection menu popped up according to the first touch gesture, and
    the controller classifies the instance message into a message belonging to the selected subject when the specific subject is selected according to the touch.
  3. The mobile device of claim 1, wherein the second touch gesture is a touch and drag to the instance message, and
    the controller classifies the instance message into an instance message belonging to the specific subject when the touch and drag being moved into a region associated with the specific subject is sensed.
  4. The mobile device of claim 3, wherein the region associated with the specific subject is either one of a region in which an object indicating the specific subject is displayed and a region in which a message belonging to the specific subject is displayed.
  5. The mobile device of claim 1, wherein when a new instance message is transmitted in a state that a specific subject is selected according to the second touch gesture, the controller classifies the transmitted instance message into a message belonging to the specific subject.
  6. The mobile device of claim 1, wherein the controller switches the current screen of the display unit to a screen on which a list of the object indicating the specific subject is displayed when a third touch gesture is sensed.
  7. The mobile device of claim 6, wherein when a touch to a specific object of the list is sensed in a state of being switched to a screen on which the list is displayed, the controller performs at least one of functions associated with a subject corresponding to the specific object.
  8. The mobile device of claim 6, wherein when an instance message belonging to a specific subject is received, the controller classifies the received instance message into a message belonging to the specific subject, and when the third touch gesture is sensed, the controller further displays an indicator indicating the reception of the received instance message on an object indicating the specific subject.
  9. The mobile device of claim 6, wherein when an instance message containing pre-formatted data is transmitted and received, the controller classifies the pre-formatted data contained in the transmitted and received instance message into a message belonging to a specific subject containing only the pre-formatted data.
  10. The mobile device of claim 1, wherein when a fourth touch gesture to the display unit is sensed, the controller classifies total instance messages into each reference range determined according to the fourth touch gesture.
  11. The mobile device of claim 10, wherein when a fourth touch gesture to the display unit is sensed, the controller switches the current screen of the display unit to a screen on which at least one image object indicating a reference range determined by the fourth touch gesture is displayed.
  12. The mobile device of claim 11, wherein the fourth touch gesture is a gesture in which at least two touch positions applied to the display unit are closer to each other, and
    the controller controls the reference range to be varied according to the extent of the touch positions being closer to each other.
  13. The mobile device of claim 11, wherein when a fifth touch gesture is sensed in a state of being switched to a screen on which the image object is displayed, the controller switches the current screen of the display unit to a screen on which the instance message is displayed.
  14. The mobile device of claim 13, wherein the fifth touch gesture is a gesture in which at least two touch positions applied to the display unit are away from each other, and
    the controller controls the instance message display range to be varied according to the extent of the touch positions being away from each other.
  15. The mobile device of claim 11, wherein the controller comprises a setting unit for setting at least one search keyword, and
    when the set search keyword is contained in an instance message with a specific reference range, the controller further displays an indicator indicating the search keyword on an image object indicating the specific reference range.
  16. The mobile device of claim 11, wherein when a touch is sensed at any one of the image objects, the controller switches the current screen of the display unit to a screen on which instance messages with a reference range corresponding to the touch sensed image object are displayed.
  17. The mobile device of claim 15, wherein when a touch to the indicator is sensed, the controller switches the current screen of the display unit to a screen on which instance messages with the corresponding reference range from an instance message containing the touch sensed indicator are displayed.
  18. The mobile device of claim 1, wherein the controller displays the instance message in a first region of the display unit and displays a search window for retrieving information associated with the instance message in a second region distinguished from the first region.
  19. The mobile device of claim 18, wherein when a touch to an object contained in the instance message is moved to a region displayed with the search window, the controller receives the object as a search keyword for retrieving an instance message associated with the object.
  20. A control method of a mobile device, the method comprising:
    displaying an instance message transmitted and received through a wireless communication unit;
    sensing a first touch gesture to the instance message;
    classifying the instance message into a message belonging to a subject selected according to a second touch gesture when the first touch gesture is sensed; and
    displaying the message belonging to the selected subject to be visually distinguished from the other instance messages.
  21. The method of claim 20, further comprising:
    sensing a third touch gesture to the display unit;
    displaying a list of objects indicating a prestored subject when the third touch gesture is sensed; and
    performing at least one of functions associated with a subject corresponding to the specific object when a touch to the specific object is sensed on the list.
  22. The method of claim 20, comprising:
    sensing a fourth touch gesture to the display unit; and
    classifying total instance messages into each reference range determined according to the fourth touch gesture when the fourth touch gesture is sensed.
  23. The method of claim 22, further comprising:
    displaying a search window for retrieving information associated with the instance message in a second region distinguished from the first region in which the instance message is displayed.
PCT/KR2013/009830 2013-03-15 2013-11-01 Mobile device and control method for the same WO2014142412A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130028197A KR20140113155A (en) 2013-03-15 2013-03-15 Mobile device and control method for the same
KR10-2013-0028197 2013-03-15

Publications (1)

Publication Number Publication Date
WO2014142412A1 true WO2014142412A1 (en) 2014-09-18

Family

ID=51537032

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/009830 WO2014142412A1 (en) 2013-03-15 2013-11-01 Mobile device and control method for the same

Country Status (2)

Country Link
KR (1) KR20140113155A (en)
WO (1) WO2014142412A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106161769A (en) * 2015-05-14 2016-11-23 Lg电子株式会社 Mobile terminal and control method thereof
CN106170033A (en) * 2015-05-22 2016-11-30 Lg电子株式会社 Watch type mobile terminal and control method thereof
US10257670B2 (en) 2015-04-16 2019-04-09 Samsung Electronics Co., Ltd. Portable device and method for providing notice information thereof
CN113325978A (en) * 2021-05-28 2021-08-31 维沃移动通信(杭州)有限公司 Message display method and device and electronic equipment

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016167612A1 (en) * 2015-04-16 2016-10-20 삼성전자 주식회사 Electronic device for providing notification information, and notification information provision method therefor
CN105260088B (en) * 2015-11-26 2020-06-19 北京小米移动软件有限公司 Information classification display processing method and device
WO2020096087A1 (en) 2018-11-09 2020-05-14 라인플러스 주식회사 Method, system, and non-transitory computer-readable recording medium for managing message group

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080301165A1 (en) * 2007-05-04 2008-12-04 Samsung Electronics Co., Ltd. Apparatus for processing time-base data and method thereof
US7882189B2 (en) * 2003-02-20 2011-02-01 Sonicwall, Inc. Using distinguishing properties to classify messages
US20110045803A1 (en) * 2009-08-19 2011-02-24 Samsung Electronics Co., Ltd. Method of informing occurrence of a missed event and mobile terminal using the same
US20110084921A1 (en) * 2009-10-08 2011-04-14 Lg Electronics Inc. Mobile terminal and data extracting method in a mobile terminal
US8225216B2 (en) * 2007-04-30 2012-07-17 Samsung Electronics Co., Ltd Image production system, apparatus, and method using user data of mobile communication terminal



Also Published As

Publication number Publication date
KR20140113155A (en) 2014-09-24

Similar Documents

Publication Publication Date Title
WO2014142412A1 (en) Mobile device and control method for the same
WO2018034402A1 (en) Mobile terminal and method for controlling the same
WO2016036192A1 (en) Image display apparatus and image display method
WO2015037794A1 (en) Mobile terminal and control method for the mobile terminal
WO2015072677A1 (en) Mobile terminal and method of controlling the same
WO2016137167A1 (en) Terminal
WO2014157885A1 (en) Method and device for providing menu interface
WO2015083969A1 (en) Mobile terminal and method for controlling the same
WO2015050345A1 (en) Control apparatus for mobile terminal and control method thereof
WO2015122590A1 (en) Electronic device and method for controlling the same
WO2014025186A1 (en) Method for providing message function and electronic device thereof
WO2015056854A1 (en) Mobile terminal and control method for the mobile terminal
WO2015056844A1 (en) Mobile terminal and control method thereof
WO2017043784A1 (en) Mobile terminal and method for controlling the same
WO2015199280A1 (en) Mobile terminal and method of controlling the same
WO2012046890A1 (en) Mobile terminal, display device, and method for controlling same
WO2015088166A1 (en) Mobile terminal and method for operating rear surface input unit of same
WO2017052043A1 (en) Mobile terminal and method for controlling the same
WO2014112678A1 (en) Method for providing shopping information using mobile terminal, and user interface for providing shopping information using mobile terminal
WO2015068911A1 (en) Mobile terminal and method of controlling the same
WO2017105018A1 (en) Electronic apparatus and notification displaying method for electronic apparatus
WO2018105834A1 (en) Terminal and method for controlling the same
WO2016076570A1 (en) Display apparatus and display method
WO2015068872A1 (en) Electronic device and method for controlling of the same
WO2015105257A1 (en) Mobile terminal and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13877751

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13877751

Country of ref document: EP

Kind code of ref document: A1