KR20170038569A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same

Info

Publication number
KR20170038569A
Authority
KR
South Korea
Prior art keywords
content
mobile terminal
wheel
user
touch
Prior art date
Application number
KR1020150138136A
Other languages
Korean (ko)
Inventor
이현철
추지민
조세현
이상혁
이철배
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020150138136A
Publication of KR20170038569A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output

Abstract

The present invention relates to a mobile terminal that provides content through an external speaker, and a control method thereof. The mobile terminal includes a display unit; a wheel that surrounds the outer periphery of the display unit and is formed to be rotatable; a sensing unit that senses the rotational movement of the wheel and a touch by a user's finger applied to the wheel; and a control unit that detects specific user information among a plurality of pieces of user information based on an input pattern corresponding to the touch applied to the wheel and the rotational movement of the wheel, and displays on the display unit a content list including at least one content mapped to the detected user information. Accordingly, the present invention can provide content suited to each user without a login procedure.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a mobile terminal providing contents through an external speaker and a control method thereof.

A terminal can be divided into a mobile (portable) terminal and a stationary terminal depending on whether it can be moved. A mobile terminal can be further divided into a handheld terminal and a vehicle-mounted terminal depending on whether the user can carry it directly.

The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally offer electronic game play or multimedia player functions. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts and video or television programs.

As such functions diversify, the terminal is implemented in the form of a multimedia device with complex functions such as capturing photos or videos, playing music or video files, gaming, and receiving broadcasts.

To support and enhance such terminal functions, improvements to the structural and/or software aspects of the terminal may be considered.

Meanwhile, such a terminal can provide various functions by communicating with an external device. For example, the terminal can provide content by communicating with and making use of an external device.

Thus, a terminal that provides contents using an external device needs to be improved in terms of software to efficiently provide contents.

An object of the present invention is to provide content suitable for each user of a mobile terminal.

It is another object of the present invention to provide a method of sharing contents among a plurality of mobile terminals in various ways.

A mobile terminal according to an exemplary embodiment of the present invention includes a display unit; a wheel that surrounds the outer periphery of the display unit and is formed to be rotatable; a sensing unit that senses the rotational movement of the wheel and a touch by a user's finger applied to the wheel; and a control unit that detects specific user information among a plurality of pieces of user information based on an input pattern corresponding to the sensed rotational movement of the wheel and the touch by the user's finger applied to the wheel, and displays on the display unit a content list including at least one content mapped to the detected user information.

In one embodiment, the controller selects any one of at least one content included in the content list in response to the rotational movement of the wheel.
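
The patent does not specify how rotation maps to selection; as a minimal illustrative sketch (all names are hypothetical), a count of rotation steps could move a cursor through a circular content list:

```python
def select_by_rotation(content_list, current_index, rotation_steps):
    """Move the selection cursor through a circular content list.

    rotation_steps: positive for clockwise, negative for counterclockwise.
    (Hypothetical sketch; the patent does not define the mapping.)
    """
    if not content_list:
        raise ValueError("content list is empty")
    return (current_index + rotation_steps) % len(content_list)

songs = ["song A", "song B", "song C", "song D"]
i = select_by_rotation(songs, 0, 2)    # two clockwise detents -> index 2
j = select_by_rotation(songs, i, -3)   # three counterclockwise -> wraps to 3
```

Wrapping with the modulo operator matches the circular shape of the wheel: rotating past the end of the list returns to its beginning.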

In one embodiment, the mobile terminal further includes a camera for capturing an image of the user's face, and the control unit detects the user's emotional state based on the captured face image and selects specific content from among the at least one content included in the content list based on the detected emotional state.

In one embodiment, the mobile terminal further includes a wireless communication unit for communicating with an external speaker located within a predetermined range of the mobile terminal's position. When a plurality of external speakers are located within the predetermined range, the control unit communicates with the external speaker closest to the mobile terminal and transmits at least one content included in the content list to that external speaker through the communication, so that the content is audibly output from the external speaker.
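
As a minimal sketch of the nearest-speaker selection described above (positions, units, and names are assumptions for illustration, not part of the patent):

```python
import math

def nearest_speaker(terminal_pos, speakers, max_range):
    """Pick the external speaker closest to the terminal within max_range.

    speakers: dict of name -> (x, y) position. Returns None when no speaker
    is in range. (Illustrative sketch; coordinates are assumed to be known.)
    """
    best_name, best_dist = None, max_range
    for name, pos in speakers.items():
        dist = math.dist(terminal_pos, pos)
        if dist <= best_dist:
            best_name, best_dist = name, dist
    return best_name

speakers = {"living room": (1.0, 0.0), "kitchen": (4.0, 3.0)}
print(nearest_speaker((0.0, 0.0), speakers, 10.0))  # living room
```

In practice the distance estimate would come from radio signal strength or a positioning module rather than known coordinates; the selection logic is the same.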

In one embodiment, the control unit detects position information of the external speaker when communicating with the external speaker, and controls the communication so that specific content is output from the external speaker based on the detected position information.

In one embodiment, when the position of the mobile terminal is changed, the control unit releases the communication with the external speaker with which it was communicating.

In one embodiment, the control unit communicates with a new external speaker based on the changed position of the mobile terminal and transmits the at least one content to the new external speaker.

In one embodiment, the control unit detects specific user information based on an input pattern corresponding to a rotational speed of the wheel and a position at which a user's touch is sensed on the wheel.
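
A rough sketch of this pattern-based user detection (the matching rule and tolerance are assumptions; the patent only states that touch position and rotation speed form the identifying pattern):

```python
def detect_user(input_pattern, registered_users, speed_tol=0.5):
    """Match an observed (touch_zone, rotation_speed) pattern to a user.

    registered_users: dict of name -> (touch_zone, typical_speed).
    A user matches when the touch zone is identical and the rotation
    speed is within speed_tol of the registered value. Returns None
    on no match. (Hypothetical matching rule for illustration.)
    """
    zone, speed = input_pattern
    for name, (reg_zone, reg_speed) in registered_users.items():
        if zone == reg_zone and abs(speed - reg_speed) <= speed_tol:
            return name
    return None

users = {"alice": ("upper-left", 2.0), "bob": ("lower-right", 0.8)}
print(detect_user(("upper-left", 2.3), users))  # alice
```

A tolerance on speed is needed because a user's rotation speed varies between attempts; the touch zone, by contrast, is treated as discrete.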

In one embodiment, the mobile terminal further includes a wireless communication unit for communicating with an external device. When communication with the external device is performed, the control unit outputs synthesized content obtained by synthesizing the first content currently being reproduced with second content being reproduced on the external device.
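
As an illustrative sketch of synthesizing two contents into one (averaging samples is one simple mixing rule; the patent does not specify how the synthesis is performed):

```python
def synthesize(first_samples, second_samples):
    """Mix two audio sample sequences into one by averaging.

    The shorter stream is padded with silence (0). Averaging keeps the
    mixed signal in the same amplitude range as the inputs.
    (Assumed mixing rule, for illustration only.)
    """
    n = max(len(first_samples), len(second_samples))
    a = list(first_samples) + [0] * (n - len(first_samples))
    b = list(second_samples) + [0] * (n - len(second_samples))
    return [(x + y) / 2 for x, y in zip(a, b)]

mixed = synthesize([0.2, 0.4, 0.6], [0.6, 0.0])
print(mixed)  # approximately [0.4, 0.2, 0.3]
```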

A method of searching for content in a mobile terminal having a rotatable wheel, according to another embodiment of the present invention, includes: sensing a touch by a user's finger applied to the wheel and a rotational movement of the wheel; detecting an input pattern corresponding to the sensed touch and rotational movement; detecting specific user information among a plurality of pieces of user information based on the detected input pattern; and displaying, on the display unit, a content list including at least one content mapped to the detected user information.
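
The claimed method steps can be sketched end-to-end as follows (function and variable names, and the exact-match rule, are illustrative assumptions):

```python
def content_search(touch_zone, rotation_speed, user_profiles, content_map):
    """End-to-end sketch of the claimed method: form an input pattern,
    detect the user it matches, and return that user's content list.

    user_profiles: name -> (touch_zone, rotation_speed) registered pattern.
    content_map: name -> list of content titles mapped to that user.
    """
    pattern = (touch_zone, round(rotation_speed, 1))   # steps 1-2: input pattern
    for name, registered in user_profiles.items():     # step 3: detect user
        if registered == pattern:
            return content_map.get(name, [])           # step 4: content list
    return []

profiles = {"alice": ("top", 1.0)}
contents = {"alice": ["jazz playlist", "news briefing"]}
print(content_search("top", 1.04, profiles, contents))  # ['jazz playlist', 'news briefing']
```

Rounding the measured speed before comparison is a stand-in for the tolerance a real sensing pipeline would apply.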

According to the present invention, the user can be recognized through the touch applied to the wheel and the rotational movement of the wheel. Thus, the present invention can provide contents suitable for each user without performing a login procedure.

In addition, the present invention can control the content displayed on the display unit through various operations using the wheel. Through this, the user can conveniently control the content through a simple operation using the wheel.

1A is a block diagram illustrating a mobile terminal according to the present invention.
1B and 1C are conceptual diagrams showing the structure of a mobile terminal according to the present invention.
2 is a conceptual diagram illustrating a state where a mobile terminal and an external speaker are connected to each other according to the present invention.
3 is a flowchart illustrating a control method of providing contents in a mobile terminal according to the present invention.
4A and 4B are conceptual diagrams illustrating a method of authenticating a user through a user operation applied to a wheel in a mobile terminal related to the present invention.
5A and 5B are conceptual diagrams illustrating a method of controlling content through a rotational movement of a wheel in a mobile terminal according to the present invention.
6 is a conceptual diagram illustrating a method of controlling contents through a touch operation on a wheel in a mobile terminal according to the present invention.
7 is a conceptual diagram showing a method of executing different functions associated with each of a plurality of areas included in a wheel.
8A and 8B are conceptual diagrams showing a method of controlling reproduction of content being reproduced by using a touch operation applied to a wheel.
9A and 9B are conceptual diagrams showing a method of controlling reproduction of content being reproduced by using a touch operation applied to a wheel.
10A and 10B are conceptual diagrams illustrating a method of reproducing specific contents using position information of an external speaker communicating with a mobile terminal.
11A and 11B are conceptual diagrams illustrating a method of sharing content among a plurality of mobile terminals providing content.
12 is a conceptual diagram illustrating a method of controlling a plurality of mobile terminals when the mobile terminal performs communication with a plurality of mobile terminals.
13 is a conceptual diagram illustrating a method of synthesizing contents being reproduced in a plurality of mobile terminals connected to one speaker and outputting the synthesized contents as one content.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals designate identical or similar elements, and redundant descriptions thereof are omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known technologies are omitted when it is determined that they would obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein and do not limit the technical idea disclosed herein; the invention should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinal numbers, such as "first" and "second", may be used to describe various elements, but the elements are not limited by these terms. Such terms are used only to distinguish one element from another.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

Mobile terminals described in this specification may include mobile phones, smartphones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices (e.g., smartwatches, smart glasses, and head-mounted displays (HMDs)).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and digital signage, except where a configuration applies only to mobile terminals.

1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention in different directions.

The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 1A are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. Voice data or image data collected by the input unit 120 may be analyzed and processed into a user's control command.

The sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information of the mobile terminal, and user information. For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor). Meanwhile, the mobile terminal disclosed in this specification may combine and utilize information sensed by at least two of these sensors.

The output unit 150 is for generating output related to sight, hearing, or touch, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and a light output unit 154. The display unit 151 may form a mutual layer structure with a touch sensor or be formed integrally therewith, thereby implementing a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and at the same time may provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port. In response to an external device being connected to the interface unit 160, the mobile terminal 100 may perform appropriate control related to the connected external device.

In addition, the memory 170 stores data supporting various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs or applications running on the mobile terminal 100, data for operation of the mobile terminal 100, and commands. At least some of these applications may be downloaded from an external server via wireless communication. Also, at least a part of these application programs may exist on the mobile terminal 100 from the time of shipment for the basic functions (e.g., telephone call receiving function, message receiving function, and calling function) of the mobile terminal 100. Meanwhile, the application program may be stored in the memory 170, installed on the mobile terminal 100, and may be operated by the control unit 180 to perform the operation (or function) of the mobile terminal.

In addition to the operations related to the application program, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may process or process signals, data, information, and the like input or output through the above-mentioned components, or may drive an application program stored in the memory 170 to provide or process appropriate information or functions to the user.

In addition, the controller 180 may control at least some of the components illustrated in FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other in order to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with one another to implement the operation, control, or control method of a mobile terminal according to the various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.

Hereinafter, the various components of the mobile terminal 100 will be described in detail with reference to FIG. 1A.

First, referring to the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000, EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like).

The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 performing such wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 112.

The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies. The short-range communication module 114 may support, through short-range wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located. The short-range wireless area network may be a wireless personal area network.

Here, the other mobile terminal 100 may be a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with the mobile terminal 100 according to the present invention. The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Furthermore, if the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the control unit 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can answer the call through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.

The location information module 115 is a module for acquiring the position (or current position) of the mobile terminal; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire its position using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire its position based on information from a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. If necessary, the location information module 115 may additionally or alternatively perform any function of the other modules of the wireless communication unit 110 to obtain data on the position of the mobile terminal. The location information module 115 is a module used to acquire the position (or current position) of the mobile terminal, and is not limited to a module that directly calculates or acquires the position of the mobile terminal.

Next, the input unit 120 is for inputting image information (or signals), audio information (or signals), data, or information input from a user. For input of image information, the mobile terminal 100 may include one or a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Meanwhile, the plurality of cameras 121 provided in the mobile terminal 100 may be arranged to form a matrix structure, and through the cameras 121 forming the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire left and right images for realizing a stereoscopic image.

The microphone 122 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 100. Meanwhile, the microphone 122 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include a mechanical input means (or a mechanical key, e.g., a button located on the front, rear, or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch-type input means. As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or visual key can be displayed on the touch screen in various forms, for example, as graphics, text, an icon, a video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a corresponding sensing signal. The control unit 180 may control the driving or operation of the mobile terminal 100 or may perform data processing, function or operation related to the application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing in the vicinity, using the force of an electromagnetic field or infrared rays. The proximity sensor 141 may be disposed in an inner area of the mobile terminal covered by the touch screen described above, or in the vicinity of the touch screen.

Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is of a capacitive type, the proximity sensor 141 may be configured to detect the proximity of a conductive object by a change in the electric field according to the proximity of that object. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

Meanwhile, for convenience of description, an action in which an object approaches the touch screen without contacting it and is recognized as being located on the touch screen is referred to as a "proximity touch", and an action in which an object actually contacts the touch screen is referred to as a "contact touch". The position at which an object is proximity-touched on the touch screen means the position at which the object vertically corresponds to the touch screen when the object is proximity-touched. The proximity sensor 141 can detect a proximity touch and proximity touch patterns (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, and proximity touch movement state). Meanwhile, the control unit 180 processes data (or information) corresponding to the proximity touch operation and proximity touch pattern sensed through the proximity sensor 141, and can further output visual information corresponding to the processed data on the touch screen. Furthermore, the control unit 180 can control the mobile terminal 100 so that different operations or data (or information) are processed depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
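
A small sketch of dispatching different operations for a proximity touch versus a contact touch on the same point, as described above (handler names and actions are hypothetical):

```python
def handle_touch(point, kind, handlers):
    """Dispatch different actions for a proximity touch vs. a contact
    touch on the same point. handlers maps the touch kind
    ('proximity' or 'contact') to a function of the point.
    """
    if kind not in handlers:
        raise ValueError(f"unknown touch kind: {kind}")
    return handlers[kind](point)

handlers = {
    "proximity": lambda p: f"preview item at {p}",   # hover-like behavior
    "contact": lambda p: f"open item at {p}",        # actual selection
}
print(handle_touch((120, 80), "proximity", handlers))  # preview item at (120, 80)
```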

The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods such as a resistive type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance occurring at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen is touched on the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 180. In this way, the control unit 180 can know which area of the display unit 151 has been touched. Here, the touch controller may be a component separate from the control unit 180, or may be the control unit 180 itself.

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

On the other hand, the touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches applied to the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of a sensing object using ultrasonic waves. Meanwhile, the control unit 180 can calculate the position of a wave generating source from information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave generating source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave generating source can be calculated using light as a reference signal and the difference between its arrival time and the time at which the ultrasonic wave arrives.
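The calculation described above can be sketched as follows. This is an illustrative example only, not the patent's implementation: since light arrives effectively instantly, the light arrival time marks the emission instant, and each ultrasonic arrival time yields a range; with three sensors at known positions, a 2D position follows by subtracting circle equations. The sensor layout, function names, and the speed-of-sound constant are assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed)

def distance_from_arrival(t_light, t_ultrasound):
    """Range to one ultrasonic sensor: light marks the emission time,
    the ultrasound's extra travel time gives the distance."""
    return SPEED_OF_SOUND * (t_ultrasound - t_light)

def locate_2d(sensors, t_light, t_ultra):
    """Trilaterate a 2D source position from three sensors at known
    positions by subtracting circle equations to obtain a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = sensors
    d1, d2, d3 = (distance_from_arrival(t_light, t) for t in t_ultra)
    # (x - xi)^2 + (y - yi)^2 = di^2 ; subtracting pairs cancels x^2 + y^2
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

In practice the sensed times would be noisy and more sensors with a least-squares fit would be used; the two-equation solve above only illustrates the time-difference principle.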

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of the sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element and is configured to scan the movement of a sensing object close to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns, and scans the content placed on the photo sensor using an electrical signal that changes according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change in light, and position information of the sensing object can be obtained through this calculation.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven by the mobile terminal 100, or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

Also, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 can generate various tactile effects, such as an arrangement of pins moving vertically with respect to the contacted skin surface, a spraying or suction force of air through a jet orifice or suction opening, a brush against the skin surface, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or emitting heat.

The haptic module 153 can not only transmit a tactile effect through direct contact, but can also be implemented so that the user feels the tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output from the light output unit 154 is implemented by the mobile terminal emitting light of a single color or a plurality of colors toward the front or rear surface. The signal output may be terminated when the mobile terminal detects that the user has confirmed the event.

The interface unit 160 serves as a passage for all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various kinds of information for authenticating the usage right of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 160.

The interface unit 160 may serve as a passage through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 170 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 170 over the Internet.

Meanwhile, as described above, the control unit 180 controls operations related to application programs and, typically, the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the control unit 180 can execute or release a lock state that restricts input of a user's control command to applications.

In addition, the control unit 180 performs control and processing related to voice communication, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the control unit 180 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the control unit 180 and supplies the power required for the operation of each component. The power supply unit 190 includes a battery; the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging or the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery wirelessly without using the connection port. In this case, the power supply unit 190 can receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Referring to FIGS. 1B and 1C, a mobile terminal according to the present invention may include a main body 200, a display unit 251 disposed on the front surface of the main body, and a wheel 201 configured to surround the display unit 251.

The body 200 has a circular shape and may be formed of various materials. For example, various materials may include metals, synthetic resins, and the like.

The display unit 251 may be disposed on the front surface of the main body so as to display visual information. Also, the display unit 251 may be formed in a circular shape.

The display unit 251 may further include a touch sensor to detect a touch input of a user applied to the display unit 251.

The wheel 201 may be formed in a hollow shape (or a donut or ring shape) so as to surround the outer periphery of the display unit 251. In this case, the shape of the wheel 201 may be determined according to the shape of the display unit 251. For example, referring to FIG. 1B, since the display unit 251 is circular, the wheel 201 may be ring-shaped.

The wheel 201 can be formed to move based on the operation of the user. For example, the wheel 201 can rotate based on an external force applied from a user.

Further, the wheel 201 may further include a touch sensor to sense a touch applied to the surface of the wheel. Such a touch sensor can form a layer structure with the surface of the wheel 201.

Accordingly, the user can generate a control command by rotating the wheel 201 or by touching the surface of the wheel 201.

Hereinafter, a control method of a mobile terminal having at least one of the above-described components will be described with reference to the drawings. In the following descriptions of the drawings, the drawings are referenced starting from the drawing on the left, in clockwise or top-to-bottom order.

FIG. 2 is a conceptual diagram illustrating a state in which a mobile terminal according to the present invention and an external speaker are connected to each other.

The mobile terminal according to the present invention can perform communication with an external speaker through short-range communication. In the short-distance communication, the scheme described above with reference to FIG. 1A may be used.

Referring to FIG. 2, when performing communication with the external speaker, the mobile terminal can audibly output specific contents through an external speaker.

Here, the specific content may be sound content. The specific contents may be stored in the memory 170 of the mobile terminal or may be received in real time on the mobile terminal through communication from an external server (or an external device).

Hereinafter, a method of providing specific content to be output through an external speaker will be described. FIG. 3 is a flowchart illustrating a control method of providing content in a mobile terminal according to the present invention. FIGS. 4A and 4B are conceptual diagrams illustrating a method of authenticating a user through a user operation applied to the wheel in a mobile terminal related to the present invention.

A mobile terminal according to the present invention can detect an input pattern composed of a touch of the user's finger applied to the wheel and a rotational movement of the wheel (S410).

The control unit 180 may sense at least one of an input corresponding to a touch input by the user's finger applied to the wheel and an input corresponding to the rotational movement of the wheel.

To this end, the mobile terminal may further include a sensing unit for sensing the rotational movement of the wheel and the touch of the user's finger applied to the wheel. For example, the sensing unit may include a gyro sensor and a touch sensor.

Here, the rotation of the wheel may be in either a clockwise or counterclockwise direction. In addition, the user can rotate the wheel 201 by applying an external force to it.

The control unit 180 may sense an input corresponding to the rotational motion of the wheel 201. The input corresponding to the rotational motion may differ according to the rotational properties (e.g., rotational speed, rotational length, etc.) of the wheel 201. For example, a rotational movement having a first rotational speed may correspond to a first input, and a rotational movement having a second rotational speed may correspond to a second input.

In addition, the wheel 201 may be divided into a plurality of areas in order to sense a touch applied to the surface of the wheel 201. The plurality of regions may be partitioned based on predetermined reference points. In this case, the controller 180 can detect the touch position based on the position of the area where the touch is detected among the plurality of areas.

In addition, the wheel 201 may further include a skin-contact sensor for sensing an electrical change on the skin surface. In this case, when the wheel 201 is touched, the control unit 180 can detect the skin electrical conductivity of the finger touching the wheel 201.

The skin electrical conductivity is a value that numerically represents the electrical change of the skin. This skin electrical conductivity may change according to the emotional state of the user. For example, if the user is in a stressed or excited state, a skin electrical conductivity higher than normal can be detected.

When the rotation of the wheel 201 and a touch on the surface of the wheel 201 are detected, the control unit 180 can detect an input pattern composed of the input corresponding to the rotational motion of the wheel 201 and the input corresponding to the touch applied to its surface.

The input pattern may be a combination of an input corresponding to the rotational motion of the wheel 201 and an input corresponding to a touch applied to the surface of the wheel 201. For example, the input pattern may be an input pattern corresponding to a rotation movement rotating at a first speed in a state where a touch is detected in a specific area.
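The combination just described can be sketched as follows. This is only an illustrative model, not the patent's implementation: the region identifiers, the speed threshold, and the `WheelPattern` structure are all assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class WheelPattern:
    touched_regions: frozenset  # wheel surface regions where touches were sensed
    speed_class: str            # "first" (slow) or "second" (fast) rotation input

def classify_speed(deg_per_sec, threshold=180.0):
    """Map a rotational speed to the first or second input (threshold assumed)."""
    return "first" if deg_per_sec < threshold else "second"

def detect_pattern(touched_regions, deg_per_sec):
    """Combine the touch-position input and the rotation input into one pattern."""
    return WheelPattern(frozenset(touched_regions), classify_speed(deg_per_sec))
```

For instance, a rotation at 90 degrees per second while areas 202a and 202c are touched would yield one pattern, and the same touch at 300 degrees per second a different one.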

The input pattern may vary depending on the user. That is, each user can have a unique input pattern. Therefore, the control unit 180 can distinguish the user based on the input pattern.

This unique input pattern can be determined by the size of the user's hand, the propensity of the user, and the like.

For example, as shown in the first drawings of FIGS. 4A and 4B, since the size of an adult's hand differs from that of a child's hand, the child may apply a touch to two specific areas 202a and 202c, while the adult may apply a touch to two other areas 202a and 202d.

As another example, although not shown, a person with an impatient personality may rotate the wheel 201 rapidly, while a person with a calm personality may rotate the wheel 201 more slowly. As yet another example, although not shown, the preferred touch positions may differ for each user.

For example, as shown in the first drawing of FIG. 4A, a first user may have a first input pattern of rotating the wheel at a first speed while a touch is sensed at a first position of the wheel 201. As another example, as shown in the first drawing of FIG. 4B, a second user may have a second input pattern of rotating the wheel at a second speed while a touch is sensed at a second position of the wheel 201.

Also, the controller 180 may detect an input pattern based on the input corresponding to the rotational movement, the input corresponding to the touch applied to the surface of the wheel 201, and an input corresponding to the skin electrical conductivity. A detailed description thereof is replaced by the foregoing description.

The mobile terminal according to the present invention can detect specific user information among a plurality of user information based on an input pattern (S420).

When an input pattern is detected, the control unit 180 can detect specific user information among a plurality of user information based on the detected input pattern.

A plurality of pieces of user information may be stored in the memory 170 by the user. Here, the user information may include ID, identification information, password information, log-in information, and the like.

The control unit 180 may determine an input pattern to be mapped to specific user information. That is, the controller 180 can detect an input pattern of a specific user using the mobile terminal in real time and map the detected input pattern to a specific user. Accordingly, when the specific input pattern is received, the control unit 180 can detect the specific user information mapped to the specific input pattern.
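The mapping of input patterns to user information (steps S410 and S420) can be sketched as a simple registry lookup. This is a hedged illustration only; the registry, its keys, and the user-info fields are hypothetical, and a real implementation would tolerate noisy pattern matches rather than require exact keys.

```python
user_registry = {}  # input-pattern key -> user information (ID, login info, ...)

def enroll(pattern_key, user_info):
    """Learn a mapping while a known user operates the terminal (S420)."""
    user_registry[pattern_key] = user_info

def authenticate(pattern_key):
    """Return the user information mapped to a detected input pattern,
    or None so the terminal falls back to default content or an
    external-device login (as described later in the text)."""
    return user_registry.get(pattern_key)
```

With this shape, detecting user A's pattern logs in user A, while an unrecognized pattern yields `None` and no login is performed.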

The mobile terminal according to the present invention may display a content list including at least one content mapped to specific user information on the display unit (S430).

The control unit 180 may display the content list including at least one content mapped to specific user information on the display unit 251 when specific user information is detected.

To this end, the control unit 180 can perform login based on the specific user information. Here, login is an operation by which the mobile terminal acquires, through the ID and password of a specific user, the right to access information that only the user having that user information may access.

When the login is performed, the control unit 180 may detect a content list including at least one content mapped to specific user information.

The at least one content may be content stored on an external server (e.g., a web server) or content stored on the memory 170 of the mobile terminal.

The external server may be a content server for storing contents. Such an external server may have storage space for storing a plurality of contents (for example, a cloud, a web hard, etc.).

When the login is performed, the control unit 180 may obtain the right to access at least one content stored in the external server. In this case, the control unit 180 may display the at least one content on the display unit 251. For example, as shown in FIG. 4A, when specific user information is detected based on the first input pattern, the control unit 180 can perform login based on that user information. Thereafter, the control unit 180 may display a content list including at least one sound content mapped to the information of user A on the display unit 151.

The at least one content may be content preset by a specific user. For example, a user can select at least one content among a plurality of contents stored in an external server. In this case, the controller 180 may map the selected at least one content to specific user information. Accordingly, the control unit 180 may display at least one content mapped to the specific user information on the display unit 151, when the login is performed, based on the specific user information.

Alternatively, the at least one content may be determined by the usage pattern of a specific user. In this case, the control unit 180 may detect the usage pattern of a specific user and map at least one content to the specific user information based on the detected pattern. Here, the usage pattern may include information such as frequently played content, recently played content, genres of content, makers of content, and the like. Therefore, the user can receive content suited to his or her taste without a separate control command.
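One way such a usage-pattern-based list could be derived is sketched below. The event format and the ranking rule (play count, ties broken by recency) are illustrative assumptions; the patent only names the kinds of information a usage pattern may include.

```python
from collections import Counter

def build_content_list(play_events, limit=5):
    """Rank content ids from a playback history (oldest event first),
    combining 'frequently played' and 'recently played' cues."""
    freq = Counter(play_events)
    recency = {cid: i for i, cid in enumerate(play_events)}  # later index = newer
    # Sort by play count, breaking ties in favour of more recently played content.
    ranked = sorted(freq, key=lambda cid: (freq[cid], recency[cid]), reverse=True)
    return ranked[:limit]
```

For a history where content "a" was played three times and most recently, "a" heads the list; genre or maker preferences could be added as further sort keys in the same way.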

Also, the control unit 180 may provide different content for each user mapped to an input pattern. For example, as shown in FIG. 4B, when the information of another user B is mapped to the detected input pattern, the control unit 180 may display at least one content mapped to that other user's information on the display unit 151.

On the other hand, if the specific user information is not detected based on the detected input pattern, the controller 180 may not perform login. In this case, the control unit 180 may not display the content list including at least one content mapped to the specific user information on the display unit 151.

In this case, the control unit 180 can display preset default content on the display unit 151. The default content may be content that is not mapped to specific user information, that is, content provided by an external server or content set as default at the time of factory initialization of the mobile terminal.

In addition, if the login is not performed, the user can perform login through an external electronic device (e.g., a smart phone or a tablet PC) performing close-range communication with the mobile terminal.

More specifically, when there is an external electronic device that is performing close-range communication with the mobile terminal, the user can operate the external electronic device to perform the login. This login process can be accomplished by inputting an ID and a password in a conventional manner.

When login is performed on the external electronic device, the external electronic device can transmit the login information to the mobile terminal. Accordingly, the mobile terminal can perform login.

The control unit 180 may control an external speaker so that the at least one content is audibly output based on a control command of the user after the content list including the at least one content is displayed. The external speaker may be an electronic device performing short-distance communication with the mobile terminal.

The control unit 180 may communicate with the external speaker through the wireless communication unit 110, either in response to a control command of the user or automatically. For example, the control unit 180 may perform communication with a specific external speaker based on a user's control command for communicating with that speaker. As another example, when an external speaker whose identification information is stored is located within a preset range based on the position of the mobile terminal, the control unit 180 can automatically communicate with it.

Accordingly, the present invention can audibly output various contents provided by the mobile terminal through the external speaker.

In addition, in the present invention, if the mobile terminal itself is provided with a speaker, the at least one content can also be output through the built-in speaker instead of an external speaker.

In the foregoing, a method of recognizing a user by using a user's input pattern applied to the mobile terminal and providing contents suitable for each user has been described. In this way, the user can conveniently receive customized content without an existing login process.

Hereinafter, a method of controlling content through the rotational movement of the wheel and a touch applied to the wheel will be described. FIGS. 5A and 5B are conceptual diagrams illustrating a method of controlling content through a rotational movement of the wheel in a mobile terminal according to the present invention. FIG. 6 is a conceptual diagram illustrating a method of controlling content through a touch operation on the wheel in a mobile terminal according to the present invention.

The control unit 180 can control the content list including at least one content displayed on the display unit 151 based on the rotational movement of the wheel and the touch applied to the wheel.

More specifically, based on a rotational movement applied to the wheel and/or a touch on the wheel, the control unit 180 may move between content lists, scroll through at least one content included in a content list, or select specific content.

For example, as shown in FIG. 5A, the control unit 180 may switch from the currently displayed content list (e.g., a play list) to a different content list (e.g., a frequently-listened list) based on the rotational movement of the wheel 201. That is, in the present invention, the operation of moving between content lists may be performed according to the rotational movement of the wheel 201.

In addition, the control unit 180 can control the direction of movement between the content lists according to the rotational direction of the rotational motion applied to the wheel 201. For example, as shown in FIG. 5A, when the first content list is displayed and the wheel 201 is rotated in a first direction, the control unit 180 may switch from the first content list to the second content list. Conversely, as shown in FIG. 5B, when the second content list is displayed and the wheel 201 is rotated in a second direction opposite to the first direction, the control unit 180 may switch from the second content list back to the first content list.

In addition, the control unit 180 may scroll at least one content included in the content list based on a drag applied to the wheel 201.

For example, as shown in the first and second drawings of FIG. 5A, the control unit 180 may scroll between the at least one content based on a drag applied to the wheel 201.

Also, although not shown, the control unit 180 may scroll the content in a direction corresponding to the direction in which the drag is applied to the wheel 201. Accordingly, the user can move between content lists by rotating the wheel, and after moving to the desired content list, can retrieve the desired content through a touch.

Meanwhile, the control unit 180 can visually distinguish the currently selectable content from the remaining content included in the content list. For example, as shown in the first drawing of FIG. 6, the control unit 180 may display the one selectable content among the three contents differently from the other contents.

The controller 180 may move the selectable content in response to the dragging of the wheel. For example, as shown in the second diagram of FIG. 6, the controller 180 may move the selectable content from B.xxx to D.xxx in response to a drag touch on the wheel.

As shown in the third drawing of FIG. 6, the controller 180 can select specific contents in response to the short touch being applied to the wheel while specific contents are set as selectable contents. In this case, although not shown, the control unit 180 can reproduce the specific content through the external speaker.
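The wheel-driven browsing model described above can be sketched as follows. This is an illustrative state machine, not the patent's implementation: rotating the wheel moves between content lists, dragging along the wheel surface moves the selectable highlight, and a short touch selects the highlighted content. The class and method names are assumptions.

```python
class WheelBrowser:
    def __init__(self, content_lists):
        self.content_lists = content_lists  # list of content lists (ids/names)
        self.list_index = 0                 # which content list is displayed
        self.cursor = 0                     # currently selectable content

    def rotate(self, direction):
        """Wheel rotation: +1 for the first direction, -1 for the opposite
        direction; switches between content lists."""
        self.list_index = (self.list_index + direction) % len(self.content_lists)
        self.cursor = 0

    def drag(self, steps):
        """Drag touch on the wheel surface: scrolls the selectable highlight."""
        items = self.content_lists[self.list_index]
        self.cursor = (self.cursor + steps) % len(items)

    def short_touch(self):
        """Short touch: selects (e.g., plays) the highlighted content."""
        return self.content_lists[self.list_index][self.cursor]
```

For example, dragging two steps in the list `["A.xxx", "B.xxx", "C.xxx"]` highlights `C.xxx`, and a short touch then selects it; a rotation switches to the next list and resets the highlight.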

In the above, a method of controlling contents through various operations applied to the wheel has been described. Accordingly, the user can easily control the content displayed on the display unit without covering the display unit. The present invention is not limited to the content control method, and various functions such as volume control, content reproduction control, and frequency control can be controlled through various operations applied to the wheel. Such a control method is similar to the above-described method, so that description thereof will be omitted.

Hereinafter, a method of executing different functions associated with each of a plurality of areas included in the wheel will be described. FIG. 7 is a conceptual diagram showing a method of executing different functions associated with each of a plurality of areas included in the wheel.

The control unit 180 can associate a different function with each of the plurality of areas included in the wheel 201.

The different functions are functions executable in the mobile terminal, for example, a favorites function, a recent play list function, a playback control function, a power on/off function, a display on/off function, and a volume control function. The favorites function may be a function of playing the content that is played most frequently. The recent play list function may be a function of displaying a list of recently played content, counted from the point in time at which a control command is received. The playback control function may include pausing playback, stopping playback, starting playback, rewinding, fast-forwarding, and the like.

These different functions may be associated with specific areas by the user, or may be set in advance at the factory before the mobile terminal is shipped.

For example, referring to FIG. 7, when the plurality of areas 202a, 202b, 202c, 202d, and 202e included in the wheel 201 are defined sequentially from the first area 202a to the fifth area 202e, the control unit 180 can associate the favorites function with the fifth area 202e.

As another example, although not shown, the controller 180 may associate a recent playlist function with the first area 202a and a play stop function with the second area 202b.

The control unit 180 can display an execution screen of the function associated with a specific area on the display unit 151 in response to a touch applied to that area among the plurality of areas 202a, 202b, 202c, 202d, and 202e. For example, as shown in the first drawing of FIG. 7, in response to a touch applied to the fifth area 202e, the control unit 180 can display an execution screen of the favorites function on the display unit 251.

In addition, in response to a touch applied to the specific area, the control unit 180 can reproduce the content included in the execution screen of the function associated with that area.
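The area-to-function association can be sketched as a simple dispatch table. This is a hedged illustration: the region ids follow FIG. 7, but the handlers and their return values are hypothetical placeholders for the real functions (favorites, recent play list, playback stop, etc.).

```python
# Hypothetical handlers standing in for the functions named in the text.
region_functions = {
    "202a": lambda: "recent play list shown",
    "202b": lambda: "playback stopped",
    "202e": lambda: "favorites shown",
}

def on_region_touch(region_id):
    """Execute the function associated with the touched wheel area, if any;
    areas without an association produce no action."""
    handler = region_functions.get(region_id)
    return handler() if handler else None
```

A user-configurable setup would simply write new entries into `region_functions`, matching the text's note that associations may be made by the user or preset at the factory.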

In the foregoing, a method of executing a function associated with a specific area has been described. Through this, the user can perform various functions through a simple operation of touching the wheel.

Hereinafter, a method of selecting specific content among a plurality of contents based on the emotional state of the user in a mobile terminal related to the present invention will be described. FIGS. 8A and 8B are conceptual diagrams illustrating a method of selecting specific content among a plurality of contents based on a user's emotional state in a mobile terminal according to the present invention.

The control unit 180 of the mobile terminal according to the present invention can determine the emotional state of the user through various pieces of sensing information detected by the sensing unit.

More specifically, the control unit 180 may determine the emotional state of the user based on at least one of the skin electrical conductivity of the user's finger applied to the wheel 201, the rotational speed of the wheel 201, and the face image of the user input through the camera 221 provided in the mobile terminal. The emotional state of the user may include states such as joy, happiness, depression, sadness, and stress.

For example, the control unit 180 may determine that the user's emotional state is stress when the skin electrical conductivity of the user's finger has a specific pattern and the rotational speed of the wheel is equal to or greater than a predetermined speed. As another example, when the control unit 180 determines that the face image of the user looking at the front of the display unit 151 is a smiling face, it may determine that the user's emotional state is joy.
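The two rules just described can be sketched as a small rule-based classifier. This is a hedged illustration only: the threshold value, the pattern label, and the expression label are all assumed, since the patent does not specify them.

```python
# Hedged sketch of the rule-based emotion determination described above.
# The threshold, pattern labels, and expression labels are illustrative assumptions.

FAST_ROTATION_THRESHOLD = 180.0  # degrees per second (assumed value)

def determine_emotion(skin_conductivity_pattern, wheel_speed, face_expression):
    """Return an estimated emotional state from the sensed inputs.

    skin_conductivity_pattern: label of the detected conductivity pattern (or None)
    wheel_speed: rotational speed of the wheel
    face_expression: label produced by a face-analysis step ("smile", ...) or None
    """
    # Rule 1: specific conductivity pattern + fast rotation -> stress
    if skin_conductivity_pattern == "stress_pattern" and wheel_speed >= FAST_ROTATION_THRESHOLD:
        return "stress"
    # Rule 2: smiling face image -> joy
    if face_expression == "smile":
        return "joy"
    return "neutral"  # fallback when no rule matches

print(determine_emotion("stress_pattern", 200.0, None))  # stress
print(determine_emotion(None, 0.0, "smile"))             # joy
```

The determined state would then index into a mapping from emotional states to content, as the following paragraphs describe.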

The control unit 180 can select specific content from a content list including at least one content based on the determined emotional state of the user. More specifically, the control unit 180 may select the specific content that is mapped to the determined emotional state. The control unit 180 can then reproduce the selected content through the external speaker.

For example, as shown in the first drawing in FIG. 8A, the controller 180 can sense the skin electrical conductivity of the user's finger applied to the wheel 201. Thereafter, the controller 180 may determine the emotional state of the user based on the skin electrical conductivity.

Thereafter, as shown in the second drawing of FIG. 8A, the control unit 180 may reproduce the specific content "IF you" based on the skin electrical conductivity of the user's finger applied to the wheel 201.

The mobile terminal may further include a camera 221 whose photographing direction faces the front of the display unit 151. In this case, as shown in the first drawing of FIG. 8B, the control unit 180 can receive, through the camera 221, a face image of a user looking at the display unit 151.

The control unit 180 can determine the emotional state of the user based on the input face image. Thereafter, the control unit 180 can reproduce specific content based on the determined emotional state. For example, as shown in the second drawing of FIG. 8B, the control unit 180 may select and reproduce the specific content "IF you" based on the determined emotional state of the user.

Meanwhile, although not shown, the controller 180 may determine the emotional state of the user through a wearable device associated with the mobile terminal. In this case, the controller 180 can receive the user's biometric information (e.g., heart rate information, fingerprint information, etc.) from the wearable device and determine the emotional state of the user.

Accordingly, the user can be provided with contents suitable for his / her emotional state without applying a separate control command to the mobile terminal.

Hereinafter, a method of controlling reproduction of content being played back by using a touch operation applied to the wheel will be described. FIGS. 9A and 9B are conceptual diagrams illustrating a method of controlling reproduction of content being played back by using a touch operation applied to the wheel.

While specific content is being reproduced through the external speaker, the control unit 180 may control reproduction of that content through a touch operation of the user applied to the wheel 201.

More specifically, in response to a user's touch operation performed during playback of specific content, the control unit 180 may perform at least one of pausing, stopping, moving to the previous or next song, rewinding, and fast-forwarding the specific content.

The user's touch operation may be a drag touch, a multi-touch, a multi-drag touch, a short touch, a long touch, a double touch, or the like, which is applied to the wheel 201.

For example, as shown in the first drawing of FIG. 9A, the controller 180 may sense a drag touch in a first direction applied to the wheel 201 during reproduction of specific content called "Lemon Pie". In this case, as shown in the second drawing of FIG. 9A, the controller 180 may move to the content "I miss you" in response to the drag touch in the first direction.

Likewise, as shown in the third drawing of FIG. 9A, in response to a drag touch in a second direction opposite to the first direction, the controller 180 may move back to the previous content, "Lemon Pie".

As another example, as shown in the first drawing of FIG. 9B, the controller 180 can detect a multi-drag touch applied to the wheel 201 during reproduction of specific content called "Lemon Pie". The multi-drag touch may be a touch in which at least two touch points are dragged simultaneously.

In this case, as shown in the second drawing of FIG. 9B, the controller 180 may rewind "Lemon Pie" in response to the multi-drag touch in the first direction.

Likewise, as shown in the third drawing of FIG. 9B, the controller 180 can fast-forward "Lemon Pie" in response to a multi-drag touch in the second direction opposite to the first direction.
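The gesture-to-command behavior in the examples above amounts to a lookup from (gesture type, direction) to a playback action. A minimal sketch, assuming hypothetical gesture and command names:

```python
# Illustrative mapping of wheel touch gestures to playback commands,
# following the drag / multi-drag examples above. All names are assumptions.

gesture_actions = {
    ("drag", "first"): "next_track",        # single drag in the first direction
    ("drag", "second"): "previous_track",   # single drag in the opposite direction
    ("multi_drag", "first"): "rewind",      # two-finger drag, first direction
    ("multi_drag", "second"): "fast_forward",
}

def handle_wheel_gesture(gesture_type, direction):
    """Return the playback command for a gesture applied to the wheel."""
    return gesture_actions.get((gesture_type, direction), "ignore")

print(handle_wheel_gesture("multi_drag", "second"))  # fast_forward
```

Unrecognized gestures fall through to "ignore", so adding new gesture types (short touch, long touch, double touch) only requires new table entries.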

Accordingly, the user can easily control playback of the specific content being played back from the external speaker through the mobile terminal.

Hereinafter, a method of reproducing specific contents using location information of an external speaker communicating with a mobile terminal will be described. 10A and 10B are conceptual diagrams illustrating a method of reproducing specific contents using position information of an external speaker communicating with a mobile terminal.

The mobile terminal according to the present invention may further include a wireless communication unit 110. In this case, the controller 180 may perform communication with an external speaker through the wireless communication unit 110. For example, the control unit 180 may communicate with the external speaker using a Bluetooth communication method.

Alternatively, the control unit 180 may communicate with the external speaker by wire.

The external speaker communicating with the mobile terminal may be an electronic device located within a predetermined range based on the position of the mobile terminal. That is, the controller 180 may communicate with an external speaker located within a predetermined range based on the position of the mobile terminal.

Also, the control unit 180 may communicate with the external speaker in response to a control command of the user or automatically. For example, when an external speaker whose identification information is previously stored in the memory 170 is detected among the external speakers located within a preset range based on the position of the mobile terminal, the control unit 180 may automatically communicate with that external speaker.

If a plurality of external speakers located within the preset range based on the position of the mobile terminal and having identification information previously stored in the memory 170 are detected, the control unit 180 may perform communication with at least one of the plurality of external speakers.

More specifically, when a plurality of external speakers are detected, the control unit 180 may display, on the display unit 151, an identification information list including the identification information of the plurality of external speakers. In this case, the user can select one or more external speakers to communicate with from among the external speakers included in the identification information list. Thereafter, the controller 180 may perform communication with the selected one or more external speakers.

Alternatively, when a plurality of external speakers located within the predetermined range based on the position of the mobile terminal and having identification information previously stored in the memory 170 are detected, the control unit 180 may communicate with the external speaker closest to the mobile terminal.

More specifically, the control unit 180 can detect the relative distance between each of the plurality of speakers and the mobile terminal based on a beacon signal from each speaker. Thereafter, the controller 180 can perform communication with the external speaker having the shortest relative distance.

If there are external speakers having the same relative distance, the controller 180 may let the user select one, or may perform communication with one of them based on a predetermined priority. Here, the predetermined priority may be set based on communication connection history information, such as the most frequently connected speaker, or may be set directly by the user.
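The shortest-distance selection with a priority tie-break can be sketched as follows. The data shapes are assumptions; in practice the distances would be estimated from the beacon signals described above.

```python
# Sketch of selecting an external speaker by shortest beacon-estimated distance,
# with a predetermined-priority tie-break. Field names are illustrative assumptions.

def choose_speaker(speakers):
    """speakers: list of dicts with 'id', 'distance', and 'priority' keys.

    A lower priority value means more preferred (e.g. most frequently
    connected, or ranked first by the user).
    """
    if not speakers:
        return None
    # Sort by distance first; ties are broken by the predetermined priority.
    best = min(speakers, key=lambda s: (s["distance"], s["priority"]))
    return best["id"]

speakers = [
    {"id": "living_room", "distance": 3.0, "priority": 2},
    {"id": "bedroom", "distance": 3.0, "priority": 1},  # same distance, higher priority
    {"id": "kitchen", "distance": 5.5, "priority": 0},
]
print(choose_speaker(speakers))  # bedroom
```

Here "living_room" and "bedroom" are equidistant, so the priority decides; "kitchen" is preferred by priority but loses on distance, matching the order of criteria in the text.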

On the other hand, the controller 180 may change an external speaker for performing communication based on the location of the mobile terminal.

More specifically, when the position of the mobile terminal changes from the first position to the second position, the controller 180 can cancel the communication with the external speaker performing the communication.

Then, the control unit 180 can perform communication with a new external speaker located within a predetermined range based on the second position. When there is content being played back on the external speaker whose communication was released, the controller 180 may transmit that content to the new external speaker so that playback of the current content continues on the new external speaker.

Accordingly, the controller 180 can continuously reproduce the content being reproduced through another external speaker even when the position of the mobile terminal is changed and communication with the external speaker is interrupted.
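A minimal sketch of this hand-off, assuming hypothetical class and method names: release the old speaker session and resume the same content on the newly connected speaker.

```python
# Minimal sketch of the speaker hand-off described above: when the terminal
# moves, the current session is released and the same content resumes on a
# new speaker. The class and method names are illustrative, not from the patent.

class SpeakerSession:
    def __init__(self, speaker_id):
        self.speaker_id = speaker_id
        self.now_playing = None

    def play(self, content):
        self.now_playing = content

def hand_off(current, new_speaker_id):
    """Move playback from the current session to a new speaker session."""
    new_session = SpeakerSession(new_speaker_id)
    if current and current.now_playing:
        new_session.play(current.now_playing)  # continue the same content
    return new_session

old = SpeakerSession("living_room")
old.play("Lemon Pie")
new = hand_off(old, "bedroom")
print(new.now_playing)  # Lemon Pie
```

A fuller implementation would also carry the playback position so the content resumes mid-track rather than from the beginning.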

The controller 180 can reproduce specific contents included in the mobile terminal by using an external speaker for performing communication.

At this time, the controller 180 may detect position information of an external speaker performing communication.

More specifically, the memory 170 may store information that maps specific position information to each piece of identification information of an external speaker. In this case, when the controller 180 performs communication with a specific external speaker, it may detect the position information mapped to that speaker.

In addition, the controller 180 can reproduce specific contents based on at least one of position information and time information of an external speaker performing communication.

More specifically, the controller 180 may map specific content to the location information of the external speaker based on history information of reproducing content. For example, when there is history information in which the user has repeatedly reproduced specific content at the "living room" location, the controller 180 can map that content to the "living room" location. Alternatively, the user may directly map specific content to a specific location.

Alternatively, the controller 180 may map specific content to a specific location based on both location and time information. For example, when there is history information in which a morning-call content has been repeatedly reproduced at 7:00 a.m. at the "bedroom" location, the controller 180 may play the morning-call content when the current time is 7:00 a.m. and the mobile terminal is detected to be located in the bedroom.

In addition, the controller 180 may map specific content to specific location information based on information in which another mobile terminal communicating with the mobile terminal has mapped specific content to a specific location. For example, the control unit 180 may receive, from another mobile terminal whose identification information is stored in the memory 170, information on a specific location and the specific content mapped to it, and store that information in the memory 170.

Thereafter, the controller 180 may automatically reproduce specific content at a specific location.
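The location-based and location-plus-time mappings above can be expressed as a two-level lookup: prefer an exact (location, hour) match, then fall back to a location-only entry. The keys and content names below are illustrative assumptions.

```python
# Illustrative lookup of content mapped to a (location, hour) pair, following
# the "morning call in the bedroom at 7:00 am" example above. All entries
# are hypothetical; a hour of None marks a location-only mapping.

content_map = {
    ("living room", None): "favorite playlist",  # location-only mapping
    ("bedroom", 7): "morning call",              # location + time mapping
}

def pick_content(location, hour):
    """Prefer a location+time mapping, then fall back to location-only."""
    return content_map.get((location, hour)) or content_map.get((location, None))

print(pick_content("bedroom", 7))       # morning call
print(pick_content("living room", 20))  # favorite playlist
```

An unmapped location simply yields nothing, in which case the terminal would fall back to whatever the user plays manually.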

For example, as shown in FIG. 10A, when the position of a specific speaker is detected as "living room", the control unit 180 can reproduce the specific content mapped to the "living room". As another example, as shown in FIG. 10B, when the position of a specific speaker is detected as "bedroom", the controller 180 can reproduce the specific content mapped to the "bedroom".

In the above, a method of reproducing specific contents according to at least one of the position and the time of the external speaker communicating with the mobile terminal has been described. Through this, the user can be provided with the most suitable content based on at least one of the current position and the time.

Hereinafter, a method of sharing content among a plurality of mobile terminals providing content will be described. 11A and 11B are conceptual diagrams illustrating a method of sharing content among a plurality of mobile terminals providing content.

The control unit 180 can share content with an external mobile terminal through communication. Here, sharing content may mean reproducing or storing, on the mobile terminal through communication, content that is stored in or being reproduced by the external mobile terminal, or conversely storing content being reproduced by the mobile terminal in the memory of the external mobile terminal.

More specifically, the control unit 180 can store the identification information of each of a plurality of mobile terminals in the memory 170. Here, the plurality of mobile terminals may include a smart phone, a PC, a tablet PC, and the like. Hereinafter, the identification information of each of the plurality of mobile terminals will be referred to as a "friend list".

As shown in FIG. 11A, the controller 180 may transmit source information of a specific content to a mobile terminal corresponding to specific identification information selected by the user, among a plurality of identification information included in the friend list. Here, the source information is information necessary for reproducing the content, and may be content information or URL address information corresponding to the content.

Likewise, as shown in FIG. 11B, the controller 180 may receive source information of specific content from the mobile terminal corresponding to the specific identification information. When receiving the source information from that mobile terminal, the controller 180 can reproduce the corresponding content through the external speaker. For example, when receiving URL address information of the specific content, the controller 180 can access the URL address through communication and reproduce the corresponding content through the external speaker.

Meanwhile, although not shown, the controller 180 can receive the contents of the mobile terminal corresponding to the specific identification information selected by the user in real time. In this case, when the new content is stored or reproduced in the mobile terminal corresponding to the specific identification information, the controller 180 can store or reproduce the new content in real time. Thus, the user can share a playlist of a specific user similar to his / her taste in real time.

In the above, a method of sharing content with an external mobile terminal has been described. Through this, the user can receive more various contents.

Hereinafter, a method for controlling a plurality of mobile terminals when the mobile terminal communicates with a plurality of mobile terminals will be described. 12 is a conceptual diagram illustrating a method for controlling a plurality of mobile terminals when the mobile terminal communicates with a plurality of mobile terminals.

The controller 180 of the mobile terminal according to the present invention can perform communication with a plurality of mobile terminals simultaneously. In this case, the control unit 180 may control all of the plurality of mobile terminals performing the communication.

For example, as shown in FIG. 12, when the controller 180 communicates with a first device 200b and a second device 200c, it may cause the first device 200b to reproduce specific content while causing the second device 200c to stop reproduction of the specific content.

As another example, although not shown, the control unit 180 can control the first and second devices 200b and 200c to simultaneously reproduce the same content.

Accordingly, the user can conveniently control a plurality of mobile terminals through one device.

Hereinafter, a method of synthesizing contents being reproduced by a plurality of mobile terminals connected to one speaker and outputting the synthesized contents as one content will be described. 13 is a conceptual diagram illustrating a method of synthesizing contents being reproduced in a plurality of mobile terminals connected to one speaker and outputting the synthesized contents as one content.

When a plurality of mobile terminals according to the present invention perform communication with one speaker, the contents being reproduced by each of the plurality of mobile terminals can be transmitted to the external server.

Hereinafter, for convenience, the plurality of mobile terminals will be described as the first device 200a and the second device 200b. However, the present invention can be equally applied to two or more devices.

The user can control the reproduction speed, the reproduction position, and the like of the contents reproduced in each of the first and second devices 200a and 200b using the wheels of the first device 200a and the second device 200b. That is, the user can control the content reproduced by each device through each device.

The first device 200a and the second device 200b may transmit the content controlled by the user to an external server.

In this case, the external server can generate one content by synthesizing the content being reproduced by each of the plurality of mobile terminals. In addition, the external server may transmit the synthesized content to the first device 200a and the second device 200b such that the synthesized content is reproduced from the one speaker.

 The first device 200a and the second device 200b can output the synthesized content from one speaker.

For example, as shown in FIG. 13, the first device 200a and the second device 200b can communicate with one speaker 1200. At this time, the first device 200a and the second device 200b can each transmit the content being played back to the external server 1210.

Thereafter, the first device 200a and the second device 200b may receive, from the external server 1210, the synthesized content obtained by synthesizing the content being played back by each device. The first device 200a and the second device 200b may then control the speaker 1200 through communication so that the synthesized content is output through the speaker 1200.

Therefore, the user can generate and output the synthesized contents in real time.

According to the present invention, the user can be recognized through the touch applied to the wheel and the rotational movement of the wheel. Thus, the present invention can provide contents suitable for each user without performing a login procedure.

In addition, the present invention can control the content displayed on the display unit through various operations using the wheel. Through this, the user can conveniently control the content through a simple operation using the wheel.

The present invention described above can be embodied as computer-readable codes on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the control unit 180 of the terminal. Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

Claims (10)

1. A mobile terminal comprising:
a display unit;
a wheel rotatably surrounding an outer periphery of the display unit;
a sensing unit configured to sense a rotational movement of the wheel and a touch by a user's finger applied to the wheel; and
a control unit configured to detect specific user information among a plurality of pieces of user information based on an input pattern corresponding to the sensed rotational movement of the wheel and the touch by the user's finger applied to the wheel,
and to display a content list including at least one content mapped to the detected specific user information on the display unit.
2. The mobile terminal according to claim 1,
wherein the control unit
selects one of the at least one content included in the content list in response to the rotational movement of the wheel.
3. The mobile terminal according to claim 1, further comprising:
a camera configured to photograph a face image of the user,
wherein the control unit
detects the emotional state of the user based on the face image of the user, and
selects specific content from the at least one content included in the content list based on the detected emotional state.
4. The mobile terminal according to claim 1, further comprising:
a local communication unit configured to communicate with an external speaker located within a predetermined range based on the position of the mobile terminal,
wherein the control unit
communicates with the external speaker located closest to the mobile terminal when a plurality of external speakers are located within the predetermined range, and
transmits at least one content included in the content list to the external speaker through the communication so that the at least one content is audibly output from the external speaker.
5. The mobile terminal according to claim 4,
wherein the control unit
detects position information of the external speaker when communicating with the external speaker, and
outputs specific content from the external speaker based on the position information of the external speaker.
6. The mobile terminal according to claim 4,
wherein the control unit
releases the communication with the external speaker performing the communication when the position of the mobile terminal is changed.
7. The mobile terminal according to claim 6,
wherein the control unit
performs communication with a new external speaker based on the changed position of the mobile terminal, and
transmits the at least one content to the new external speaker so that the new external speaker outputs the at least one content.
8. The mobile terminal according to claim 1,
wherein the control unit
detects the specific user information based on an input pattern corresponding to a rotational speed of the wheel and a position on the wheel at which the user's touch is sensed.
9. The mobile terminal according to claim 1, further comprising:
a wireless communication unit configured to perform communication with an external device,
wherein the control unit,
when the communication with the external device is performed, outputs synthesized content obtained by synthesizing first content currently being reproduced and second content reproduced by the external device.
10. A method for searching contents of a mobile terminal having a rotatably formed wheel, the method comprising:
detecting an input pattern corresponding to a touch of a user's finger applied to the wheel and a rotational movement of the wheel;
detecting specific user information among a plurality of pieces of user information based on the detected input pattern; and
displaying a content list including at least one content mapped to the detected specific user information on a display unit.
KR1020150138136A 2015-09-30 2015-09-30 Mobile terminal and method for controlling the same KR20170038569A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150138136A KR20170038569A (en) 2015-09-30 2015-09-30 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150138136A KR20170038569A (en) 2015-09-30 2015-09-30 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20170038569A true KR20170038569A (en) 2017-04-07

Family

ID=58583811

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150138136A KR20170038569A (en) 2015-09-30 2015-09-30 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20170038569A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114088063A (en) * 2021-10-19 2022-02-25 青海省交通工程技术服务中心 Pier local scour terrain measurement method based on mobile terminal
CN114088063B (en) * 2021-10-19 2024-02-02 青海省交通工程技术服务中心 Pier local scour terrain measurement method based on mobile terminal

Similar Documents

Publication Publication Date Title
KR20160014226A (en) Mobile terminal and method for controlling the same
KR20160150421A (en) Mobile terminal and method for controlling the same
KR20150095124A (en) Mobile terminal and control method for the mobile terminal
KR20170059760A (en) Mobile terminal and method for controlling the same
KR20170058758A (en) Tethering type head mounted display and method for controlling the same
KR20180017638A (en) Mobile terminal and method for controlling the same
KR20170001219A (en) Mobile terminal and method for unlocking thereof
KR20160016397A (en) Mobile terminal and method for controlling the same
KR101685361B1 (en) Mobile terminal and operation method thereof
KR20170058756A (en) Tethering type head mounted display and method for controlling the same
KR20170052190A (en) Terminal device and controlling method thereof
KR20160006518A (en) Mobile terminal
KR20150145893A (en) Mobile terminal and the control method thereof
KR20170038569A (en) Mobile terminal and method for controlling the same
KR20170035755A (en) Mobile terminal and method for controlling the same
KR20160125647A (en) Mobile terminal and method for controlling the same
KR20160024272A (en) Mobile terminal and control method thereof
KR20160031336A (en) Mobile terminal and method for controlling the same
KR20160032915A (en) Mobile terminal
KR20150141084A (en) Mobile terminal and method for controlling the same
KR20180031238A (en) Mobile terminal and method for controlling the same
KR20180079051A (en) Mobile terninal and method for controlling the same
US10284503B2 (en) Mobile terminal and control method thereof
KR20160025279A (en) Mobile terminal
KR20160093499A (en) Mobile terminal and method for controlling the same