KR20140073232A - Mobile terminal and Operating Method for the Same - Google Patents


Info

Publication number
KR20140073232A
Authority
KR
South Korea
Prior art keywords
input
emoticon
user
portable terminal
unit
Prior art date
Application number
KR1020120141228A
Other languages
Korean (ko)
Inventor
김태호
황정환
김재동
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020120141228A
Publication of KR20140073232A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on GUIs based on specific properties of the displayed interaction object, using icons
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

A method for operating a portable terminal according to an embodiment of the present invention includes the steps of: displaying a message input window; sensing a motion input of a user; determining one or more emoticons corresponding to the sensed motion input from among previously stored emoticon data, based on at least one of the type of the sensed motion input, its input time, and its input intensity; and displaying the determined emoticon on the message input window. This allows emoticons to be input and transmitted quickly and conveniently, improving user convenience.

Description

Technical Field [0001] The present invention relates to a mobile terminal and an operating method thereof.

BACKGROUND OF THE INVENTION [0002] The present invention relates to a portable terminal and an operation method thereof, and more particularly, to a portable terminal capable of quickly and conveniently inputting and transmitting emoticons, and an operation method of such a portable terminal.

A portable terminal is a portable device that can perform voice and video calls, input and output information, and store data while being carried. As the functions of portable terminals have diversified, they have come to support complex functions such as capturing photos and videos, playing music or video files, playing games, receiving broadcasts, and accessing the wireless Internet, and have evolved into comprehensive multimedia devices.

To implement such complex functions in a portable terminal realized in the form of a multimedia device, various attempts have been made in terms of both hardware and software. One example is the user interface environment that allows users to easily and conveniently search for or select functions.

On the other hand, various social network services, chat services, and message services are increasingly used through mobile terminals.

Accordingly, it is an object of the present invention to provide a mobile terminal with improved user convenience and an operation method thereof.

It is also an object of the present invention to provide a mobile terminal and an operation method thereof that can input and transmit emoticons more quickly and conveniently.

A method of operating a portable terminal according to an exemplary embodiment of the present invention includes: displaying a message input window; detecting a motion input of a user; determining, based on at least one of the type of the detected motion input, its input time, and its input intensity, one or more emoticons corresponding to the detected motion input from among previously stored emoticon data; and displaying the determined emoticons on the message input window.
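The determination step above can be sketched as a lookup keyed by the motion's type and intensity. The data structure, thresholds, emoticon mappings, and the duration rule below are illustrative assumptions, not taken from the disclosure:

```python
# Illustrative sketch: mapping a sensed motion input to pre-stored emoticon
# data by type, input time, and input intensity. All values are assumptions.
from dataclasses import dataclass

@dataclass
class MotionInput:
    kind: str        # e.g. "shake", "tap", "tilt"
    duration: float  # input time, in seconds
    intensity: float # normalized 0.0 - 1.0

# Hypothetical pre-stored emoticon data: (kind, intensity threshold) -> emoticon.
EMOTICON_TABLE = {
    ("shake", 0.0): ":)",
    ("shake", 0.7): ":D",   # a stronger shake maps to a stronger emotion
    ("tap",   0.0): ":(",
    ("tap",   0.7): ">:(",
}

def determine_emoticon(motion: MotionInput) -> str:
    """Pick the emoticon whose kind matches and whose intensity threshold is
    the highest one not exceeding the sensed intensity."""
    candidates = [
        (threshold, emoticon)
        for (kind, threshold), emoticon in EMOTICON_TABLE.items()
        if kind == motion.kind and threshold <= motion.intensity
    ]
    if not candidates:
        return ""  # no emoticon corresponds to this motion input
    emoticon = max(candidates)[1]
    # A long input could further modify the result; here it just repeats it.
    return emoticon * (2 if motion.duration > 1.0 else 1)

print(determine_emoticon(MotionInput("shake", 0.3, 0.9)))  # :D
```

A table keyed on discrete thresholds keeps the mapping editable as stored data, which matches the claim's reference to "previously stored emoticon data".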

A portable terminal according to an exemplary embodiment of the present invention includes: a display unit for displaying a message input window; a sensing unit for sensing a motion input of the user; and a controller for determining, based on at least one of the type of the motion input, its input time, and its input intensity, one or more emoticons corresponding to the sensed motion input from among stored emoticon data, wherein the controller controls the determined emoticon to be displayed on the message input window.

According to the embodiments of the present invention, emoticons can be input and transmitted more quickly and conveniently, thereby improving user convenience.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.
FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention.
FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an operation method of a portable terminal according to an embodiment of the present invention.
FIGS. 5 to 14 are drawings referred to in describing various embodiments of a method of operating a portable terminal according to the present invention.
FIG. 15 is a block diagram of a portable terminal according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The portable terminal described in the present specification may be any of various devices such as a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a camera, a tablet computer, an e-book terminal, and the like. In addition, the suffixes "module" and "part" attached to components in the following description are given merely for convenience of description and carry no special significance or role by themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention. Referring to FIG. 1, the portable terminal according to an embodiment of the present invention will be described in terms of its functional components.

Referring to FIG. 1, the portable terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. When such components are implemented in practical applications, two or more components may be combined into one component, or one component may be divided into two or more components as necessary.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 113, a wireless Internet module 115, a short distance communication module 117, and a GPS module 119.

The broadcast receiving module 111 receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel. At this time, the broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may refer to a server for generating and transmitting at least one of a broadcast signal and broadcast related information and a server for receiving at least one of the generated broadcast signal and broadcast related information and transmitting the broadcast signal to the terminal.

The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal. The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network, in which case it can be received by the mobile communication module 113. Broadcast-related information can exist in various forms.

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, the broadcast receiving module 111 can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). In addition, the broadcast receiving module 111 may be configured to be suitable not only for such digital broadcasting systems but for all broadcasting systems that provide broadcast signals. The broadcast signals and/or broadcast-related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 113 transmits and receives a radio signal to at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module 115 refers to a module for wireless Internet access and may be built into the portable terminal 100 or attached externally. WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and Near Field Communication (NFC) may be used as the short distance communication technology.

The GPS (Global Positioning System) module 119 receives position information from a plurality of GPS satellites.

The A/V (audio/video) input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 123. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can then be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted externally through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the configuration of the terminal.

The microphone 123 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 113 and output. The microphone 123 may use a variety of noise reduction algorithms for eliminating noise generated while receiving an external sound signal.

The user input unit 130 generates key input data that the user inputs to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, or a touch pad (static pressure / capacitance) capable of receiving commands or information through a user's pressing or touching operation. The user input unit 130 may also be implemented as a jog wheel that rotates a key, a jog switch, a joystick, a finger mouse, or the like. In particular, when the touch pad forms a mutual layer structure with the display unit 151 described later, it may be called a touch screen.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100 and the position of the portable terminal 100, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is a slide phone, the sensing unit 140 can sense whether the slide phone is opened or closed. It can also handle sensing functions related to whether the power supply unit 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, a motion sensor 145, and the like. The proximity sensor 141 can detect, without mechanical contact, an object approaching the portable terminal 100 or an object existing near the portable terminal 100. The proximity sensor 141 can detect a nearby object by using a change in an alternating magnetic field, a change in a static magnetic field, or a rate of change in capacitance. It can also detect whether the user is holding the surface of the portable terminal 100. Two or more proximity sensors 141 may be provided depending on the configuration.

The pressure sensor 143 can detect whether pressure is applied to the portable terminal 100, the magnitude of that pressure, and the like. The pressure sensor 143 may be installed at a portion of the portable terminal 100 where detection of pressure is required depending on the use environment.

When the pressure sensor 143 is installed on the display unit 151, a touch input through the display unit 151 can be distinguished from a pressure touch input, in which a greater pressure than that of an ordinary touch input is applied. Also, the magnitude of the pressure applied to the display unit 151 during a pressure touch input can be determined from the signal output by the pressure sensor 143.
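As a hedged illustration of the discrimination described above, a controller might compare the pressure sensor's output against a threshold. The normalization and the cutoff value are assumptions, not values from the disclosure:

```python
# Hypothetical discrimination between a plain touch and a "pressure touch"
# from the (assumed normalized, 0.0-1.0) output of a pressure sensor.
PRESSURE_TOUCH_THRESHOLD = 0.5  # assumed cutoff, not specified in the text

def classify_touch(pressure: float) -> str:
    """Return 'none', 'touch', or 'pressure touch' for one sensor reading."""
    if pressure <= 0.0:
        return "none"
    return "pressure touch" if pressure >= PRESSURE_TOUCH_THRESHOLD else "touch"

def pressure_magnitude(pressure: float) -> float:
    """Magnitude applied during a pressure touch; 0.0 for an ordinary touch."""
    return pressure if pressure >= PRESSURE_TOUCH_THRESHOLD else 0.0
```

For example, `classify_touch(0.8)` would be treated as a pressure touch, while `classify_touch(0.3)` is an ordinary touch.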

If the pressure sensor 143 is disposed at the outer periphery of the portable terminal 100, it can detect whether the user is holding the portable terminal 100 by sensing the pressure.

The motion sensor 145 detects the position and movement of the portable terminal 100 using an acceleration sensor, a gyro sensor, or the like. An acceleration sensor that can be used for the motion sensor 145 is a device that converts an acceleration change in one direction into an electric signal and is widely used along with the development of MEMS (micro-electromechanical systems) technology.

Acceleration sensors come in various types, from those that measure large accelerations, such as sensors built into automobile airbag systems, to those that measure minute accelerations, such as sensors that recognize fine motions of the human hand for use as input means in games. Acceleration sensors are usually constructed with two or three axes mounted in one package, and depending on the usage environment, only a single axis (the Z axis) may be required. Therefore, when an X-axis or Y-axis acceleration sensor is to be used instead of a Z-axis sensor for some reason, the acceleration sensor may be mounted upright on the main substrate using a separate piece of substrate.

The gyro sensor is a sensor for measuring the angular velocity, and it can sense the direction of rotation with respect to the reference direction.
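A minimal sketch of how acceleration and angular-velocity samples might be reduced to a motion type and intensity for the emoticon determination described earlier. The units, thresholds, and the shake/tilt classification rule are assumptions for illustration only:

```python
# Illustrative sketch: deriving a motion "kind" and a peak intensity from raw
# accelerometer and gyro samples. Thresholds and units are assumptions.
import math

def motion_features(accel_samples, gyro_samples):
    """accel_samples: list of (ax, ay, az) in m/s^2, gravity removed;
    gyro_samples: list of angular velocities in rad/s about one axis.
    Returns (kind, peak_acceleration)."""
    peak_accel = max(math.sqrt(ax * ax + ay * ay + az * az)
                     for ax, ay, az in accel_samples)
    peak_gyro = max(abs(w) for w in gyro_samples)
    # A rotation-dominated input is treated as a tilt, otherwise as a shake.
    kind = "tilt" if peak_gyro > 2.0 and peak_accel < 5.0 else "shake"
    return kind, peak_accel
```

The peak acceleration here plays the role of the "input intensity" referred to in the claims; a real implementation would likely filter and debounce the samples first.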

Meanwhile, the sensing unit 140 may include sensors for user authentication. For example, when user authentication is performed through the user's biometric information, sensors for recognizing body temperature, a fingerprint, an iris, a face, and the like may be provided; a sensor corresponding to the user authentication method set in the portable terminal 100 may be provided.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display unit 151, an audio output module 153, an alarm unit 155, and a haptic module 157.

The display unit 151 displays and outputs the information processed by the portable terminal 100. For example, when the portable terminal 100 is in a call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the portable terminal 100 is in a video call mode or a photographing mode, the captured or received images can be displayed individually or simultaneously, along with the UI or GUI.

Meanwhile, as described above, when the display unit 151 and the touch pad form a mutual layer structure to constitute a touch screen, the display unit 151 can be used not only as an output device but also as an input device through which information can be entered by a user's touch.

If the display unit 151 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like. In this case, the touch screen panel is a transparent panel attached to the outside and can be connected to the internal bus of the portable terminal 100. The touch screen panel monitors contact, and when there is a touch input, it sends the corresponding signals to the touch screen panel controller. The touch screen panel controller processes the signals and transmits the corresponding data to the controller 180, so that the controller 180 can determine whether a touch input has occurred and which area of the touch screen was touched.
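The signal path above can be sketched as two stages: the panel controller converts raw contact signals into coordinates, and the main controller hit-tests them against screen regions. All class names, region coordinates, and signal fields below are assumptions for illustration:

```python
# Minimal sketch of the described signal path: touch screen panel signals ->
# panel controller (coordinates) -> controller 180 (which area was touched).
class TouchScreenPanelController:
    def process(self, raw_signal):
        # Hypothetical conversion of a raw panel signal into an event dict.
        return {"touched": True, "x": raw_signal["x"], "y": raw_signal["y"]}

class Controller:
    def __init__(self, regions):
        self.regions = regions  # region name -> (x0, y0, x1, y1)

    def handle(self, event):
        # Determine which area of the touch screen was touched, if any.
        for name, (x0, y0, x1, y1) in self.regions.items():
            if x0 <= event["x"] <= x1 and y0 <= event["y"] <= y1:
                return name
        return None

panel_controller = TouchScreenPanelController()
controller = Controller({"message_input_window": (0, 400, 320, 480)})
event = panel_controller.process({"x": 100, "y": 450})
print(controller.handle(event))  # message_input_window
```

Splitting the conversion and the hit-testing mirrors the division of labor between the panel controller and the controller 180 in the text.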

The display unit 151 may be formed of electronic paper (e-paper). Electronic paper is a kind of reflective display and, like conventional paper and ink, has excellent visual characteristics such as high resolution, a wide viewing angle, and a bright white background. Electronic paper can be implemented on any substrate, such as plastic, metal, or paper; it retains an image even after power is cut off, so the battery life of the portable terminal 100 can be kept long. As the electronic paper, charged hemispherical twist balls may be used, or electrophoresis or microcapsules may be used.

In addition, the display unit 151 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode display, a flexible display, and a 3D display. There may also be two or more display units 151 depending on the implementation form of the portable terminal 100. For example, the portable terminal 100 may include both an external display unit (not shown) and an internal display unit (not shown).

The audio output module 153 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 153 also outputs sound signals related to functions performed in the portable terminal 100, for example, a call signal reception tone and a message reception tone. The audio output module 153 may include a speaker, a buzzer, and the like.

The alarm unit 155 outputs a signal for notifying the occurrence of an event in the portable terminal 100. Examples of such events include call signal reception, message reception, and key signal input. The alarm unit 155 outputs the notification signal in a form other than an audio or video signal, for example, as vibration. The alarm unit 155 can output a signal when a call signal or a message is received and, when a key signal is input, can output a signal as feedback to the key signal input. The user can recognize the occurrence of an event through the signal output by the alarm unit 155. A signal notifying the occurrence of an event in the portable terminal 100 may also be output through the display unit 151 or the audio output module 153.

The haptic module 157 generates various tactile effects that the user can feel. A typical example of a haptic effect generated by the haptic module 157 is a vibration effect. When the haptic module 157 generates vibration as a haptic effect, the intensity and pattern of the vibration can be varied, and different vibrations may be synthesized and output together or output sequentially.

In addition to vibration, the haptic module 157 can generate various other tactile effects, such as the effect of a pin array moving vertically against the contacted skin surface, the effect of air stimulus through an injection port or suction port, the effect of brushing against the skin surface, the effect of stimulation through contact with an electrode, the effect of stimulation by electrostatic force, and the effect of reproducing a cold or warm sensation using an element capable of absorbing or generating heat. The haptic module 157 can be implemented not only to transmit a tactile effect through direct contact but also so that the user can feel the tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 157 may be provided depending on the configuration of the portable terminal 100.

The memory 160 may store programs for the processing and control of the controller 180 and may also temporarily store input or output data (e.g., a phone book, messages, still images, and the like).

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM. In addition, the portable terminal 100 may operate web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as an interface with all external devices connected to the portable terminal 100. Examples of external devices connected to the portable terminal 100 include a wired/wireless headset, an external charger, a wired/wireless data port, a memory card, a SIM (Subscriber Identification Module) card, a UIM card, an audio input/output (I/O) terminal, a video I/O terminal, and an earphone. The interface unit 170 can receive data or power from such an external device and deliver it to each component in the portable terminal 100, and can transmit data in the portable terminal 100 to the external device.

When the portable terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input by the user through the cradle are transmitted to the portable terminal 100.

The controller 180 typically controls the operation of each of the above units to control the overall operation of the portable terminal 100, for example, voice communication, data communication, and video communication. In addition, the controller 180 may include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be configured as hardware within the controller 180 or as software separate from the controller 180.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The portable terminal 100 having such a configuration can be configured to operate in any communication system capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

FIG. 2 is a front perspective view of a portable terminal according to an exemplary embodiment of the present invention, and FIG. 3 is a rear perspective view of the portable terminal shown in FIG. 2. Hereinafter, the portable terminal according to the present invention will be described with reference to FIGS. 2 and 3 in terms of its external components. For convenience of description, among the various types of portable terminals, such as folder, bar, swing, and slider types, a bar-type portable terminal having a front touch screen will be described as an example. However, the present invention is not limited to a bar-type portable terminal and can be applied to all types of portable terminals, including the types mentioned above.

Referring to FIG. 2, the case constituting the outer appearance of the portable terminal 100 is formed by the front case 100-1 and the rear case 100-2. Various electronic components are incorporated in the space formed by the front case 100-1 and the rear case 100-2.

The display unit 151, a first audio output module 153a, a first camera 121a, and first through third user input units 130a, 130b, and 130c are disposed in the main body, specifically, in the front case 100-1. A fourth user input unit 130d, a fifth user input unit 130e, and the microphone 123 may be disposed on a side surface of the rear case 100-2.

The display unit 151 may be constructed such that a touch pad overlaps it in a layer structure so that the display unit 151 operates as a touch screen, allowing information to be input by a user's touch.

The first acoustic output module 153a may be implemented in the form of a receiver or a speaker. The first camera 121a may be implemented in a form suitable for capturing an image or a moving image of a user. The microphone 123 may be implemented in a form suitable for receiving a user's voice, other sounds, and the like.

The first through fifth user input units 130a, 130b, 130c, 130d, and 130e and the sixth and seventh user input units 130f and 130g described below may be collectively referred to as the user input unit 130, and any manner may be employed as long as it is operated in a tactile manner.

For example, the user input unit 130 may be implemented as a dome switch or a touch pad capable of receiving commands or information by a user's pressing or touching operation, or as a wheel, jog, or joystick type that rotates a key, or the like. Functionally, the first through third user input units 130a, 130b, and 130c are for inputting commands such as start, end, and scroll, and the fourth user input unit 130d is for inputting a selection of an operation mode or the like. In addition, the fifth user input unit 130e may operate as a hot key for activating a special function in the portable terminal 100.

Referring to FIG. 3, a second camera 121b may be additionally mounted on the rear surface of the rear case 100-2, and sixth and seventh user input units 130f and 130g and the interface unit 170 may be disposed on the side surface of the rear case 100-2.

The second camera 121b has a photographing direction substantially opposite to that of the first camera 121a and may have a different number of pixels from the first camera 121a. A flash (not shown) and a mirror (not shown) may additionally be disposed adjacent to the second camera 121b. In addition, another camera may be installed adjacent to the second camera 121b for use in capturing three-dimensional stereoscopic images.

The flash illuminates the subject when the subject is photographed by the second camera 121b. The mirror enables the user to illuminate the user's own face or the like when the user intends to photograph (self-photograph) himself / herself using the second camera 121b.

A second sound output module (not shown) may be further disposed in the rear case 100-2. The second sound output module may implement the stereo function together with the first sound output module 153a, and may be used for talking in the speakerphone mode.

The interface unit 170 can be used as a path for exchanging data with an external device. In addition to an antenna for communication, an antenna (not shown) for receiving broadcast signals may be disposed in one area of the front case 100-1 or the rear case 100-2. The antenna may be installed so as to be drawn out from the rear case 100-2.

A power supply unit 190 for supplying power to the portable terminal 100 may be mounted on the rear case 100-2. The power supply unit 190 may be a rechargeable battery, for example, and may be detachably coupled to the rear case 100-2 for charging or the like.

On the other hand, in the present embodiment, the second camera 121b and the like are disposed in the rear case 100-2, but the present invention is not limited thereto.

Also, even if the second camera 121b is not separately provided, the first camera 121a may be formed to be rotatable so that photographing is possible up to the photographing direction of the second camera 121b.

FIG. 4 is a flowchart illustrating an operation method of a portable terminal according to an embodiment of the present invention, and FIGS. 5 to 14 are drawings referred to in describing various embodiments of the method of operating the portable terminal according to the present invention.

Referring to the drawings, a message input window may be displayed on the display unit 151 upon execution of a message service such as a social network service, a text message service, or a multimedia message service (S410).

Alternatively, the message input window may be displayed on the display unit 151 without executing a specific message service (S420).

In this case, after inputting a message, the user can determine the usage form of the input message, such as e-mail, memo, social network service, text message service, or multimedia message service.

The word "emoticon" is a combination of "emotion" and "icon"; emoticons combine letters, symbols, and numbers to express various emotions.

Meanwhile, as the use of social network services, chat services, and message services increases, emoticons are provided in a variety of forms, such as pictures, characters, and animations, beyond simple combinations of letters and symbols, and their types and numbers are growing.

However, the user has to go through many steps to input such an emoticon, which is inconvenient. The number of users of various SNSs is increasing, and emoticons that express emotion are used more and more in place of simple text, but the method of selecting and transmitting an emoticon is not simple.

FIG. 5 shows an example of a process of transmitting emoticons while using a predetermined social network service.

Referring to FIG. 5, when the user desires to input an emoticon while executing a social network service, a message service, or the like, the user must first select the emoticon input icon 510. In addition, the user must search for and select emoticons to input among many emoticons.

On the other hand, the emoticons that can be input may be classified according to predetermined criteria such as emoticon provider, emotion, and status, and such classification can help the user find them. In the example of FIG. 5, the emoticons are classified into a plurality of groups, and the user can select a predetermined emoticon group through the tab menu 520.

When the user selects one of the tab menus 520, the emoticons 530 included in the selected item are displayed on the screen. When the number of emoticons 530 is large, it is possible to check and select emoticons which are not displayed on a screen by scrolling or the like.

Thereafter, when the user selects the emoticon 531 to be input, the selected emoticon is displayed in the message input window 540. The emoticon can then be transmitted to the counterpart according to the user's transmission instruction 550, and the transmitted emoticon can be displayed on the display unit 151.

Although FIG. 5 shows the emoticons 530 included in the selected item 521 as remaining displayed for convenience of explanation, the display of the emoticons 530 may be ended after the user's selection or after transmission of the emoticon.

Meanwhile, as in the case of transmitting the emoticon described with reference to FIG. 5, the user has to perform an operation of five or more steps in order to transmit the emoticon.

Therefore, the present invention aims to provide a method for inputting emoticons more quickly and conveniently.

The mobile terminal 100 according to an exemplary embodiment of the present invention may sense a user's motion input through the sensing unit 140 (S430). The sensing unit 140 may include at least one of an acceleration sensor, a gyro sensor, and a pressure sensor, and may sense the motion input through these sensors.

In addition, the portable terminal 100 may sense a touch input through the sensing unit 140 or the touch screen display unit 151.

On the other hand, the controller 180 determines one or more emoticons corresponding to the sensed motion input from previously stored emoticon data, based on at least one of the type, input time, and input intensity of the sensed motion input (S440).

In the storage unit 160, emoticons matched according to the type, the input time, and the input intensity of the detected motion input are stored, and the data can be stored and managed in the form of an emoticon table.

The emoticon table may include reference value information on the types, intensities, and directions of motion inputs, for digitizing the user's various emotional states into data so that the emotional state can be determined. In addition, the emoticon table can associate and store the emoticon corresponding to each reference value. The emoticon table may also include effect-related data so that various effects, such as a vibration effect and a sound effect associated with the emoticon, can be output.
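The emoticon table described above can be sketched as a simple lookup structure. The schema, motion names, threshold values, and emoticon strings below are hypothetical illustrations, not values taken from the patent:

```python
# Hypothetical sketch of the emoticon table stored in the storage unit 160:
# each entry keys a motion type and a reference value to an emoticon and
# optional effect-related data (vibration, sound).
EMOTICON_TABLE = [
    {"motion": "nod",        "min_count": 1, "emoticon": ":)",  "effects": []},
    {"motion": "shake",      "min_count": 2, "emoticon": ":(",  "effects": ["vibration"]},
    {"motion": "tight_grip", "min_count": 1, "emoticon": ">:(", "effects": ["vibration", "sound"]},
]

def lookup_emoticon(motion_type, count):
    """Return (emoticon, effects) for a sensed motion, or None if no match."""
    for entry in EMOTICON_TABLE:
        if entry["motion"] == motion_type and count >= entry["min_count"]:
            return entry["emoticon"], entry["effects"]
    return None
```

For example, `lookup_emoticon("shake", 3)` would return `(':(', ['vibration'])`, while a single shake would not reach the reference count and would return `None`.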

The control unit 180 can identify the emoticon corresponding to the motion input by calling the emoticon table, and can determine the effect information that can be added. On the other hand, each table may be a standardized table for compatibility with other electronic devices. In the case of tables of different standards, it is also possible to convert them. For example, it is also possible to convert a table received through a network and store it in the storage unit 160.

The control unit 180 may analyze the detected motion input and search for the emoticon of the emotional state most similar to the motion input. For example, the user's motion can be analyzed to determine the user's emotional state, such as surprise, joy, depression, or sadness, and at least one emoticon corresponding to the determined emotional state can be provided to the user.

Accordingly, the portable terminal 100 according to the embodiment of the present invention senses a motion input of the user through the sensing unit 140 and, based on at least one of the type, input time, and input intensity of the sensed motion input, can identify one or more emoticons corresponding to the sensed motion input from the previously stored emoticon data.

The determined emoticon can be displayed on the message input window of the display unit 151 (S450). In this way, the present invention allows the emoticon to be input more quickly and easily, thereby improving the user's convenience.

Referring to FIGS. 6A to 6C, according to the execution of a social network service, text message service, or multimedia message service, the display unit 151 can display a message input window 610, an emoticon input icon 620, a transmission menu 630, and a message screen 640 on which transmitted and received messages are displayed.

By selecting the emoticon input icon 620, the user can input the emoticon according to a conventional method.

Alternatively, the control unit 180 may determine the motion input of the user through the sensing unit 140 or the like, and the user may input the emoticon through one step of the motion input.

Referring to FIG. 6B, when the user performs an operation 650 of hitting the portable terminal 100, the motion sensor 145 can detect the movement of the portable terminal 100 according to the operation 650 using an acceleration sensor, a gyro sensor, and the like. In particular, the acceleration sensor can recognize minute motions of the hand and measure small acceleration values, which can be used as an input means for a game or the like.

In the emoticon discrimination step S440, the controller 180 can discriminate the emoticon matching the rotation direction, the number of times, the acceleration change, and the grip pressure change of the motion input.

For example, when the motion input time is short and the position and movement of the portable terminal 100 change only once, the controller 180 can determine that the user's motion input is the operation 650 of hitting the portable terminal 100.
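The kind of discrimination described here — a short input with a single abrupt change being treated as a "hit", repeated changes as a "shake" — can be sketched as follows. The sample format, thresholds, and gesture names are assumptions for illustration, not values from the patent:

```python
def classify_motion(samples, dt=0.01, spike_threshold=15.0, max_duration=0.3):
    """Classify accelerometer magnitude samples (gravity removed).

    A single short spike above spike_threshold within max_duration seconds
    is treated as a 'hit'; several spikes as a 'shake'; otherwise 'none'.
    All thresholds are illustrative assumptions.
    """
    spikes = 0
    prev_above = False
    for a in samples:
        above = a > spike_threshold
        if above and not prev_above:  # count each rising edge once
            spikes += 1
        prev_above = above
    duration = len(samples) * dt
    if spikes == 1 and duration <= max_duration:
        return "hit"
    if spikes >= 2:
        return "shake"
    return "none"
```

A five-sample burst with one spike (e.g. `[0, 2, 20, 3, 0]`) would classify as `"hit"`, while three separated spikes would classify as `"shake"`.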

Also, whether or not the user is touching the terminal can be referred to in determining the operation. The motion input sensing step (S430) can detect whether a touch input is made through the display unit 151, as well as the intensity and holding time of the touch input, and can also detect the touch area and shape.

Referring to FIG. 6C, the controller 180 can control the emoticon 531 corresponding to the discriminated operation 650, from among the stored emoticons, to be displayed in the message input window 610. Accordingly, the emoticon that required an operation of five or more steps, as described with reference to FIG. 5, can be input with a single operation.

Hereinafter, the operation method of the mobile terminal according to an embodiment of the present invention may further include a step of receiving a transmission command (S460) and a step of transmitting data based on the emoticon to a receiving terminal or a service server (S470).

The user can input a transmission command to transmit the emoticon displayed in the message input window if the emoticon to be inputted is correct. The portable terminal 100 can transmit data based on the emoticon to the receiving terminal or the service server according to the transmission instruction of the user.

Alternatively, the emoticon determined according to the embodiment may be automatically transmitted without the user's instruction.

Meanwhile, the portable terminal 100 may transmit the selected emoticon data itself, or may transmit data including the related information so that the receiving terminal can output emoticon which is the same as or similar to the selected emoticon.

In addition, the transmitting step (S470) may further transmit data for generating a vibration effect and a sound effect associated with the emoticon.

The receiving terminal can not only display the emoticon based on the data, received directly or via the service provider, but can also output vibration or sound effects.
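One possible shape for the transmitted data — an emoticon identifier plus optional vibration and sound effect fields that the receiving terminal resolves locally — might look like the following. The field names and JSON encoding are assumptions for illustration, not part of the patent:

```python
import json

def build_emoticon_message(emoticon_id, vibration_ms=0, sound_id=None):
    """Build a hypothetical wire payload carrying an emoticon plus effects.

    The receiving terminal looks up emoticon_id locally, so the same or a
    similar emoticon can be shown even if the image itself is not sent.
    """
    payload = {"type": "emoticon", "emoticon_id": emoticon_id}
    if vibration_ms:  # effect-related data for the haptic module
        payload["vibration_ms"] = vibration_ms
    if sound_id is not None:  # effect-related data for the sound output
        payload["sound_id"] = sound_id
    return json.dumps(payload, sort_keys=True)
```

Sending only an identifier instead of the image data keeps the message small, at the cost of requiring compatible emoticon tables on both terminals — which is consistent with the standardized, convertible tables described earlier.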

According to another aspect of the present invention, there is provided a method of operating a mobile terminal, the method comprising: displaying a corresponding emoticon when a plurality of emoticons corresponding to a user's motion input are present; And selecting at least one of the displayed emoticons.

According to an embodiment of the present invention, the method may further include activating a sensing unit for sensing the motion input when the message input window is displayed or a related function such as a social network service or a message service is executed.

That is, in a normal use environment, at least some of the sensors in the sensing unit are turned off and only minimal operation is performed; when necessary, the sensors and functions required for sensing the user's motion input are activated, so that power consumption can be managed efficiently.

On the other hand, the control unit 180 may discriminate one or more emoticons corresponding to the sensed motion input from the stored emoticon data based on at least one of the type, input time, and input intensity of the sensed motion input. In addition, the controller 180 may select emoticons matched with the movement direction, the rotation direction, the number of times, the acceleration change, and the grip pressure change of the motion input.

FIGS. 7A to 11B show various examples of motion inputs according to embodiments of the present invention and the emoticons discriminated and transmitted according to those motion inputs.

Since an emoticon is an image, letter, number, symbol, or a combination thereof expressing an emotion, a motion representing the user's emotion can be matched to it.

For example, referring to FIGS. 7A and 7B, when the user moves the portable terminal 100 in a nodding motion, an emoticon indicating acceptance or consent can be selected and transmitted. In this case, the sensing unit 140 may sense the rotation and the angle of the portable terminal in the up-and-down direction, and the controller 180 may determine the operation of nodding the portable terminal 100 and the corresponding emoticon.

Referring to FIGS. 8A and 8B, when the user moves the portable terminal 100 from side to side as if shaking one's head, an emoticon indicating rejection can be selected and transmitted. In this case, the sensing unit 140 can sense the rotation and the angle of the portable terminal in the horizontal direction, and the controller 180 can determine the user's motion input and the corresponding emoticon.

Referring to FIGS. 9A and 9B, when the user moves the portable terminal 100 as if expressing a confused state of mind, the control unit 180 can control an emoticon indicating a confused or embarrassed emotional state to be selected and transmitted.

When the control unit 180 detects a plurality of circling motions rotating in the same direction, it can control an emoticon indicating an embarrassed emotional state to be selected and transmitted.

Referring to FIGS. 10A and 10B, when the user tightly grips a specific portion of the portable terminal 100, the controller 180 may control an emoticon indicating an angry emotional state to be selected and transmitted.

The pressure sensor 143 can detect whether the pressure is applied to the portable terminal 100, the magnitude of the pressure, and the like. Therefore, by setting the predetermined operation to be performed only when the magnitude of the detected pressure is equal to or greater than a predetermined value, a malfunction can be prevented while quickly performing the corresponding operation.

On the other hand, the pressure sensor 143 may be installed in any part of the portable terminal 100 where pressure sensing is required according to the use environment. For example, it may be disposed at the rim of the portable terminal 100, on the left and right edges where the user is likely to hold the terminal, on one side surface, or in a partial area of the side surface such as the bottom area.

The control unit 180 can determine that the user's emotional state is an angry state and select the corresponding emoticon when the pressure detected by the pressure sensor 143 exceeds a predetermined reference value or the rate of pressure increase is equal to or higher than the reference value. Further, the effect-related data can be transmitted so that the receiving-side terminal outputs a vibration effect or the like.
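A minimal sketch of this pressure-based discrimination, assuming sampled grip-pressure values and illustrative thresholds (neither the units nor the reference values are specified in the patent):

```python
def detect_anger(pressures, dt=0.05, abs_threshold=8.0, rate_threshold=40.0):
    """Return True when grip pressure exceeds a reference value, or when
    it rises faster than rate_threshold units/second.

    pressures: list of pressure samples taken dt seconds apart.
    All thresholds are illustrative assumptions.
    """
    for i, p in enumerate(pressures):
        if p >= abs_threshold:  # absolute reference value exceeded
            return True
        if i > 0 and (p - pressures[i - 1]) / dt >= rate_threshold:
            return True  # rate of pressure increase exceeded
    return False
```

Gating on both the absolute magnitude and the rate of increase mirrors the malfunction-prevention idea above: a gentle, sustained hold does not trigger the emoticon, while a sudden tight squeeze does.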

Referring to FIGS. 11A and 11B, when the user touches a specific portion of the portable terminal 100, particularly the display unit 151, the controller 180 can determine the contact area and the shape of the contact portion and control a corresponding emoticon to be selected and transmitted. Further, effect-related data can be transmitted so that the receiving-side terminal outputs a sound effect or the like.


Referring to FIG. 12, a plurality of emoticons 1211 and 1212 corresponding to a user's motion input may be displayed in the message input window 1210. The user may transmit all of the plurality of displayed emoticons 1211 and 1212, or may select and transmit some emoticons.

FIG. 13A shows an example in which a plurality of emoticons 1311 and 1312 corresponding to a user's motion input are displayed in a message input window 1310, and each of the emoticons 1311 and 1312 includes a delete menu. The user can delete the corresponding emoticon by touching its delete menu.

FIG. 13B shows an example in which a plurality of emoticons 1321 and 1322 corresponding to a user's motion input are displayed in an area other than the message input window 1320. When the user selects an emoticon to be transmitted from among the plurality of emoticons 1321 and 1322, the selected emoticon can be displayed in the message input window 1320.

Meanwhile, the emoticon display step (S450) may display the emoticon with a size or a number proportional to at least one of the input time and the input intensity of the sensed motion input.

Referring to FIG. 14 (a), the number of input emoticons can be increased in proportion to the duration of the user's motion input, such as the holding time of a touch input.

Referring to FIG. 14 (b), the display size of the emoticon can be increased in proportion to the duration of the user's motion input, such as the holding time of a touch input.
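The proportional display behavior of FIG. 14 can be sketched as a simple mapping from hold duration to repeat count and display size; all constants here are illustrative assumptions, not values from the patent:

```python
def emoticon_repeat_and_scale(hold_seconds, base_size=24,
                              per_second_count=2, per_second_scale=0.5,
                              max_count=10, max_scale=3.0):
    """Map a touch-hold duration to (repeat count, pixel size).

    Both grow in proportion to the input time, capped so a very long
    hold does not flood the message input window.
    """
    count = min(max_count, 1 + int(hold_seconds * per_second_count))
    scale = min(max_scale, 1.0 + hold_seconds * per_second_scale)
    return count, int(base_size * scale)
```

A tap yields one emoticon at the base size; a two-second hold yields five emoticons at double size, and growth stops at the caps.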

FIG. 15 is a block diagram of a portable terminal according to an embodiment of the present invention. More specifically, an example 180a of the control unit 180 of FIG. 1 is shown in detail.

Referring to FIG. 15, the control unit 180a may include an input determination unit 1510, an input processing unit 1520, an emoticon processing unit 1530, a sound effect processing unit 1540, and a vibration effect processing unit 1550.

When such components are implemented in practical applications, two or more components may be combined into one component, or one component may be divided into two or more components as necessary. Further, the control unit 180a may be provided separately.

The input determination unit 1510 can determine whether a user action has been input through each input means. For example, it may determine whether there is a motion input or a touch input through the sensing unit 140 and the display unit 151. In addition, image data acquired from the camera 121 may be input.

The control unit 180a may activate the sensing unit 140 to sense motion input according to execution of a social network service, a message service, and the like.

Meanwhile, the input determination unit 1510 may include elements that operate according to the input type. For example, the input determination unit 1510 may include a touch input determination unit 1511, a grip input determination unit 1512, and a motion input determination unit 1513.

When a motion event occurs, the sensed raw data may be filtered, corrected, and transmitted to the input processing unit 1520.

The input processing unit 1520 can determine the input time and the input intensity for various effects, and may include an input time processing unit 1521 and an input intensity processing unit 1522.

The emoticon processing unit 1530 may include an emoticon type selection unit 1531 for selecting the emoticon corresponding to the motion input by searching the memory 160, and an emoticon number/size calculation unit 1532 for determining the emoticon number and emoticon size for various effects.

The input processing unit 1520 can recognize motions and convert each motion into a representative value. A gesture recognition algorithm may be used, and the number of representative values may match the number of emoticons. Alternatively, each representative value may correspond to a plurality of emoticons.

The emoticon processing unit 1530 may map each representative value to a corresponding emoticon one-to-one, or may select a plurality of emoticons satisfying a predetermined condition.
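This representative-value-to-emoticon mapping can be sketched as follows; the gesture names and emoticon identifiers are hypothetical illustrations:

```python
# Hypothetical mapping from gesture representative values to candidate
# emoticons, as maintained by the emoticon processing unit 1530.
GESTURE_EMOTICONS = {
    "nod":   ["ok", "thumbs_up"],
    "shake": ["no"],
    "swirl": ["dizzy", "confused"],
}

def emoticons_for_gesture(representative_value, pick_all=False):
    """One-to-one mapping returns the first candidate only; one-to-many
    (pick_all=True) returns every emoticon matching the gesture, letting
    the user choose among them as in FIGS. 12 and 13."""
    candidates = GESTURE_EMOTICONS.get(representative_value, [])
    if pick_all:
        return candidates
    return candidates[:1]
```

An unrecognized representative value simply yields an empty list, leaving the message input window unchanged.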

The sound effect processor 1540 may include a sound effect selector 1541 for selecting a corresponding sound effect and a sound effect time / size processor 1542 for determining time and size for various effects.

The vibration effect processing unit 1550 may include a vibration effect selecting unit 1551 for selecting a corresponding vibration effect and a vibration effect time / size processing unit 1552 for determining a time / size for various effects.

The transmitting terminal can also output the effect through the acoustic output module 153 and the haptic module 157.

The memory 160 may store reference value information for discriminating emoticons based on the motion input, and may store various emoticons.

On the other hand, the wireless communication unit 110 can transmit data based on the determined emoticon to the receiving terminal or the service provider under the control of the control unit 180a.

According to the present invention, it is possible to detect various motions such as shaking the portable terminal, hitting the portable terminal, up/down rotation, left/right rotation, tight gripping, and the like.

According to the embodiment of the present invention, it is possible to input and transmit the emoticons more quickly and conveniently, thereby improving convenience for the user.

The portable terminal according to the present invention and its operating method are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications are possible.

Meanwhile, the present invention can be implemented as a processor-readable code in a processor-readable recording medium provided in a portable terminal. The processor-readable recording medium includes all kinds of recording apparatuses in which data that can be read by the processor is stored. Examples of the recording medium readable by the processor include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and also a carrier wave such as transmission over the Internet. In addition, the processor readable recording medium may be distributed over networked computer systems so that code readable by the processor in a distributed manner can be stored and executed.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the present invention.

110: wireless communication unit 120: A / V input unit
130: user input unit 140: sensing unit
150: output unit 151: display unit
160: memory 170: interface section
180: control unit 190: power supply unit

Claims (11)

Displaying a message input window;
Sensing a user's motion input;
Determining at least one emoticon corresponding to the sensed motion input from previously stored emoticon data based on at least one of the type, the input time, and the input intensity of the sensed motion input; And
And displaying the determined emoticon on the message input window.
The method according to claim 1,
Wherein the motion input sensing step senses the motion input through a sensing unit including at least one of an acceleration sensor, a gyro sensor, and a pressure sensor.
The method according to claim 1,
Wherein the emoticon discriminating step selects an emoticon matched with a movement direction, a rotation direction, a number of times, an acceleration change, and a grip pressure change of the motion input.
The method according to claim 1,
Wherein the emoticon display step displays the size or the number of the emoticon differently in proportion to at least one of an input time and an input intensity of the sensed motion input.
The method according to claim 1,
Wherein the motion input sensing step senses the shape of the touch input input through the display unit.
The method according to claim 1,
And selecting at least one of the displayed emoticons.
The method according to claim 1,
Receiving a transmission command;
And transmitting data based on the emoticons to a receiving terminal or a service server.
8. The method of claim 7,
Wherein the transmitting step further transmits data for generating a vibration effect and a sound effect associated with the emoticon.
The method according to claim 1,
Executing a social network service or a text message service or a multimedia message service.
10. The method of claim 9,
And activating a sensing unit for sensing the motion input.
A display unit for displaying a message input window;
A sensing unit sensing a motion input of the user;
And a controller for discriminating at least one emoticon corresponding to the sensed motion input from previously stored emoticon data based on at least one of the type, the input time, and the input intensity of the sensed motion input,
Wherein the control unit controls the display unit to display the determined emoticons on the message input window.


KR1020120141228A 2012-12-06 2012-12-06 Mobil terminal and Operating Method for the Same KR20140073232A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120141228A KR20140073232A (en) 2012-12-06 2012-12-06 Mobil terminal and Operating Method for the Same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120141228A KR20140073232A (en) 2012-12-06 2012-12-06 Mobil terminal and Operating Method for the Same

Publications (1)

Publication Number Publication Date
KR20140073232A true KR20140073232A (en) 2014-06-16

Family

ID=51126833

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120141228A KR20140073232A (en) 2012-12-06 2012-12-06 Mobil terminal and Operating Method for the Same

Country Status (1)

Country Link
KR (1) KR20140073232A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016190640A1 (en) * 2015-05-22 2016-12-01 삼성전자 주식회사 Method for controlling mobile terminal, and mobile terminal
WO2018043998A1 (en) * 2016-09-05 2018-03-08 삼성전자 주식회사 Electronic device and control method therefor
KR102017857B1 (en) * 2018-05-18 2019-09-03 주식회사 카카오 Operating method of terminal for displaying tilting emoticon and the terminal thereof
US10425523B2 (en) 2015-05-22 2019-09-24 Samsung Electronics Co., Ltd. Method for controlling mobile terminal, and mobile terminal
US20230214107A1 (en) * 2014-09-02 2023-07-06 Apple Inc. User interface for receiving user input
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11918857B2 (en) 2016-06-11 2024-03-05 Apple Inc. Activity and workout updates


Similar Documents

Publication Publication Date Title
KR101252169B1 (en) Mobile terminal and operation control method thereof
KR101678812B1 (en) Mobile terminal and operation control method thereof
KR101685363B1 (en) Mobile terminal and operation method thereof
KR102058043B1 (en) Image display apparatus and method for operating the same
EP2746979A1 (en) Mobile device having face recognition function using additional component and method for controlling the mobile device
KR101657549B1 (en) Mobile terminal and control method thereof
KR20140073232A (en) Mobil terminal and Operating Method for the Same
US20160004388A1 (en) Operation method of portable terminal
US20140145966A1 (en) Electronic device with touch input display system using head-tracking to reduce visible offset for user input
KR20150068175A (en) Operating Method for Mobil terminal
KR20140074651A (en) Mobile terminal and operation method thereof
KR20150039999A (en) Mobile terminal and operation method thereof
KR101626307B1 (en) Mobile terminal and operation control method thereof
KR20140064116A (en) Mobil terminal and operating method for the same
KR101883179B1 (en) Mobile terminal and operation method thereof
KR20120001477A (en) Mobile terminal and operation method thereof
KR101661974B1 (en) Mobile terminal and operation method thereof
KR20140040457A (en) Mobile terminal and operating method for the same
KR101982775B1 (en) Mobile terminal and Operationg method thereof
KR101685349B1 (en) Mobile terminal and operation method thereof
KR101635559B1 (en) Mobile terminal and operation method thereof
EP2735942A1 (en) Electronic device with touch input display system using head-tracking to reduce visible offset for user input
KR20150081694A (en) Mobile terminal and operation method thereof
KR101749531B1 (en) Mobile terminal and operation method thereof
KR101521218B1 (en) Mobile terminal and information control method using 3-axis coordinate in mobile terminal

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination