KR20140133073A - Method of operating a Mobile Terminal - Google Patents

Method of operating a Mobile Terminal

Info

Publication number
KR20140133073A
Authority
KR
South Korea
Prior art keywords
mobile terminal
content
screen
user
external device
Prior art date
Application number
KR1020130052634A
Other languages
Korean (ko)
Inventor
김문정
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020130052634A priority Critical patent/KR20140133073A/en
Publication of KR20140133073A publication Critical patent/KR20140133073A/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643: Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)

Abstract

According to an embodiment of the present invention, a method of operating a mobile terminal comprises the steps of: entering a content sharing mode in which content is shared with a first external device; receiving a predetermined touch; and displaying, on a display unit, a content sharing controller screen including objects representing the content being executed by the mobile terminal and by the first external device. Accordingly, the terminal can share data with other electronic devices more conveniently, perform multitasking, and improve user convenience.

Description

TECHNICAL FIELD The present invention relates to a method of operating a mobile terminal.

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to a mobile terminal and an operation method thereof, and more particularly, to a mobile terminal and an operation method thereof that can improve a user's convenience.

A mobile terminal is a portable device that can be carried and has one or more functions such as performing voice and video calls, inputting and outputting information, and storing data. As the functions of mobile terminals have diversified, they have come to support complex functions such as capturing still images and video, playing music or video files, gaming, receiving broadcasts, wireless Internet access, and transmitting and receiving messages, so that the mobile terminal has been implemented in the form of a comprehensive multimedia player. Mobile terminals implemented as multimedia devices are being improved in various ways, in terms of both hardware and software, in order to implement such complex functions.

It is an object of the present invention to provide a mobile terminal and an operation method thereof that can improve user convenience.

Another object of the present invention is to provide a mobile terminal and a method of operating the same that can share data with other electronic devices and perform multitasking more conveniently.

According to an aspect of the present invention, there is provided a method of operating a mobile terminal, including: entering a content sharing mode for sharing content with a first external device; receiving a predetermined touch; and displaying, on a display unit, a content sharing controller screen including objects representing the content being executed by the mobile terminal and by the first external device.

According to the present invention, it is possible to share data with other electronic devices more conveniently, perform multitasking, and improve user convenience.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a front perspective view of the mobile terminal of FIG. 1.
FIG. 3 is a rear perspective view of the mobile terminal shown in FIG. 2.
FIG. 4 is a flowchart illustrating a method of operating a mobile terminal according to an embodiment of the present invention.
FIGS. 5 to 13 are diagrams referred to for explaining various embodiments of the method of operating a mobile terminal according to the present invention.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

Examples of the mobile terminal described in this specification include a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a camera, a tablet computer, an e-book terminal, and the like. In addition, the suffixes "module" and "unit" attached to components in the following description are given merely for convenience of description and do not carry any special meaning or role by themselves. Accordingly, the terms "module" and "unit" may be used interchangeably.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention. Referring to FIG. 1, a mobile terminal according to an embodiment of the present invention will be described in terms of its functional components.

Referring to FIG. 1, the mobile terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. When these components are implemented in actual applications, two or more components may be combined into one component, or one component may be divided into two or more components, as necessary.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 113, a wireless Internet module 115, a short distance communication module 117, and a GPS module 119.

The broadcast receiving module 111 receives at least one of a broadcast signal and broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may refer to a server that generates and transmits at least one of a broadcast signal and broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to the terminal.

The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal. The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network, in which case it can be received by the mobile communication module 113. Broadcast-related information can exist in various forms.

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). In addition, the broadcast receiving module 111 may be configured to be suitable not only for these digital broadcasting systems but also for all broadcasting systems that provide broadcast signals. The broadcast signal and/or broadcast-related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 113 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the radio signals may include various types of data according to voice call signal, video call signal, or text/multimedia message transmission and reception.

The wireless Internet module 115 is a module for wireless Internet access, and the wireless Internet module 115 can be built in or externally attached to the mobile terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and Near Field Communication (NFC) may be used as the short distance communication technology.

A GPS (Global Positioning System) module 119 receives position information from a plurality of GPS satellites.

The A/V (audio/video) input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 123. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can then be displayed on the display 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the terminal.

The microphone 123 receives an external audio signal in an audio reception mode, for example, a call mode, a recording mode, or a voice recognition mode, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 113 and then output. The microphone 123 may use various noise reduction algorithms for eliminating noise generated while receiving an external audio signal.

On the other hand, a plurality of microphones 123 may be arranged at different positions. The audio signal received at each microphone can be processed in the controller 180 or the like.

The user input unit 130 generates key input data that the user inputs to control the operation of the terminal. The user input unit 130 may include a key pad, a dome switch, and a touch pad (static pressure or capacitance type) capable of receiving a command or information by the user's pressing or touching operation. The user input unit 130 may also be implemented as a jog wheel or jog-type device that rotates a key, a joystick, or a finger mouse. In particular, when the touch pad forms a mutual layer structure with the display 151 described later, it may be called a touch screen.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/close state of the mobile terminal 100 and the position of the mobile terminal 100, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, it is possible to sense whether the slide phone is opened or closed. The sensing unit 140 can also handle sensing functions related to whether the power supply unit 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, a motion sensor 145, and the like. The proximity sensor 141 can detect an object approaching the mobile terminal 100 or the presence or absence of an object in the vicinity of the mobile terminal 100 without mechanical contact. The proximity sensor 141 can detect a nearby object by using a change in the alternating magnetic field or a change in the static magnetic field, or a rate of change in capacitance. The proximity sensor 141 may be equipped with two or more sensors according to the configuration.

The pressure sensor 143 can detect whether pressure is applied to the mobile terminal 100 and the magnitude of that pressure. The pressure sensor 143 may be installed at a portion of the mobile terminal 100 where detection of pressure is required, depending on the use environment. When the pressure sensor 143 is installed on the display 151, a touch input through the display 151 and a pressure-touch input, in which a pressure larger than that of the touch input is applied, can be distinguished. In addition, the magnitude of the pressure applied to the display 151 during a pressure-touch input can be determined from the signal output from the pressure sensor 143.

The motion sensor 145 senses the position or movement of the mobile terminal 100 using an acceleration sensor, a gyro sensor, or the like. An acceleration sensor that can be used in the motion sensor 145 is a device that converts an acceleration change in one direction into an electric signal, and such sensors have come into wide use with the development of micro-electromechanical systems (MEMS) technology.

Acceleration sensors range from sensors that measure large acceleration values, such as those built into automobile airbag systems, to sensors that measure minute acceleration values, such as those that recognize fine motions of a human hand for use as a game input means. Acceleration sensors are usually constructed by mounting two or three axes in one package, and depending on the use environment only one axis, for example the Z axis, may be required. Therefore, when an acceleration sensor in the X-axis or Y-axis direction is to be used instead of the Z-axis direction for some reason, the acceleration sensor may be mounted upright on the main substrate using a separate piece of substrate.

The gyro sensor is a sensor for measuring the angular velocity, and it can sense the direction of rotation with respect to the reference direction.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display 151, an audio output module 153, an alarm unit 155, and a haptic module 157.

The display 151 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in the call mode, a user interface (UI) or graphic user interface (GUI) related to the call is displayed. When the mobile terminal 100 is in the video call mode or the photographing mode, captured or received images can be displayed individually or simultaneously, and the corresponding UI or GUI is displayed.

Meanwhile, as described above, when the display 151 and the touch pad form a mutual layer structure to constitute a touch screen, the display 151 may be used not only as an output device but also as an input device capable of receiving information by the user's touch.

If the display 151 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like. In this case, the touch screen panel is a transparent panel attached to the outside of the display and can be connected to the internal bus of the mobile terminal 100. The touch screen panel monitors touch contacts and, when there is a touch input, sends the corresponding signals to the touch screen panel controller. The touch screen panel controller processes the signals and transmits the corresponding data to the controller 180, so that the controller 180 can determine whether a touch input has occurred and which area of the touch screen has been touched.
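For illustration only (this sketch is not part of the original disclosure), the touch-event flow described above, in which the touch screen panel reports a contact, the panel controller converts it into coordinate data, and the controller 180 decides whether and where a touch occurred, might be modeled roughly as follows; all class and function names are hypothetical.

```kotlin
// Hypothetical sketch of the touch-event flow described above:
// touch screen panel -> touch screen panel controller -> controller (180).

data class TouchSignal(val rawX: Int, val rawY: Int, val pressed: Boolean)

data class TouchEvent(val x: Int, val y: Int)

// Plays the role of the touch screen panel controller: converts raw panel
// signals into coordinate data for the main controller.
class TouchScreenPanelController(private val onEvent: (TouchEvent) -> Unit) {
    fun onPanelSignal(signal: TouchSignal) {
        if (signal.pressed) onEvent(TouchEvent(signal.rawX, signal.rawY))
    }
}

// Plays the role of controller 180: decides whether a touch occurred and
// which (illustrative) area of the screen it falls into.
class MainController {
    fun handleTouch(event: TouchEvent) {
        val area = if (event.y < 100) "status bar" else "content area"
        println("Touch at (${event.x}, ${event.y}) in $area")
    }
}

fun main() {
    val controller = MainController()
    val panelController = TouchScreenPanelController(controller::handleTouch)
    panelController.onPanelSignal(TouchSignal(rawX = 40, rawY = 60, pressed = true))
}
```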

The display 151 may be configured as electronic paper (e-paper). Electronic paper is a kind of reflective display and has excellent visual characteristics, such as high resolution, a wide viewing angle, and a bright white background, like conventional paper and ink. Electronic paper can be implemented on any substrate, such as plastic, metal, or paper; it maintains a displayed image even after power is cut off, so that the battery life of the mobile terminal 100 can be maintained longer. As the electronic paper, hemispherical twist balls filled with electric charge may be used, or an electrophoresis method and microcapsules may be used.

In addition, the display 151 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. There may also be two or more displays 151 according to the implementation form of the mobile terminal 100. For example, the mobile terminal 100 may be provided with both an external display (not shown) and an internal display (not shown).

The audio output module 153 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 153 also outputs audio signals related to functions performed in the mobile terminal 100, for example, a call signal reception sound and a message reception sound. The audio output module 153 may include a speaker, a buzzer, and the like.

The alarm unit 155 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events that occur in the mobile terminal 100 include call signal reception, message reception, and key signal input. The alarm unit 155 outputs a signal for notifying the occurrence of an event in a form other than an audio or video signal, for example, as a vibration. The alarm unit 155 can output a signal when a call signal or a message is received. Also, when a key signal is input, the alarm unit 155 can output a signal as feedback to the key signal input. The user can recognize the occurrence of an event through the signal output by the alarm unit 155. A signal for notifying the occurrence of an event in the mobile terminal 100 may also be output through the display 151 or the audio output module 153.

The haptic module 157 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 157 is a vibration effect. When the haptic module 157 generates vibration as a tactile effect, the intensity and pattern of the generated vibration can be varied, and different vibrations can be synthesized and output together or output sequentially.

In addition to vibration, the haptic module 157 can generate various tactile effects, such as an effect of stimulation by a pin arrangement moving vertically with respect to the contacted skin surface, an effect of stimulation by air injected or sucked through an injection port or a suction port, an effect of stimulation through contact with an electrode, an effect of stimulation by electrostatic force, and an effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat. The haptic module 157 can be implemented not only to transmit tactile effects through direct contact but also to allow the user to feel tactile effects through the kinesthetic sense of a finger or arm. Two or more haptic modules 157 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store programs for processing and control by the controller 180 and may temporarily store input or output data (e.g., a phone book, messages, still images, and moving images).

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM. In addition, the mobile terminal 100 may operate web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as an interface with all external devices connected to the mobile terminal 100. Examples of the external devices connected to the mobile terminal 100 include a wired/wireless headset, an external charger, a wired/wireless data port, a memory card, a SIM (Subscriber Identification Module) card, a UIM (User Identity Module) card, an audio input/output (I/O) terminal, a video I/O terminal, and an earphone. The interface unit 170 may receive data or power from such an external device and transmit it to the respective components in the mobile terminal 100, and may transmit data in the mobile terminal 100 to the external device.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input from the cradle by the user are transmitted to the mobile terminal 100.

The controller 180 typically controls the operation of each of the above units and controls the overall operation of the mobile terminal 100, for example, performing control and processing related to voice calls, data communication, and video calls. In addition, the controller 180 may include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be implemented as hardware in the controller 180, or as software separate from the controller 180. Meanwhile, the controller 180 may include an application processor (not shown) for running applications, or an application processor (not shown) may be provided separately from the controller 180.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The mobile terminal 100 having such a configuration can be configured to be operable in communication systems capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

FIG. 2 is a front perspective view of the mobile terminal of FIG. 1, and FIG. 3 is a rear perspective view of the mobile terminal shown in FIG. 2. Hereinafter, referring to FIGS. 2 and 3, a mobile terminal according to the present invention will be described in terms of its external components. For convenience of description, among various types of mobile terminals such as folder type, bar type, swing type, and slider type, a bar-type mobile terminal having a front touch screen will be described as an example. However, the present invention is not limited to a bar-type mobile terminal but can be applied to all types of mobile terminals, including the above-mentioned types.

Referring to FIG. 2, the case constituting the appearance of the mobile terminal 100 is formed by the front case 100-1 and the rear case 100-2. Various electronic components are incorporated in the space formed by the front case 100-1 and the rear case 100-2.

The display 151, a first sound output module 153a, a first camera 121a, and first to third user input units 130a, 130b, and 130c are disposed in the main body, specifically in the front case 100-1. A fourth user input unit 130d, a fifth user input unit 130e, and the microphone 123 may be disposed on a side surface of the rear case 100-2.

The display 151 may be configured such that a touch pad overlaps it in a layer structure so that the display 151 operates as a touch screen, allowing information to be input by the user's touch.

The first sound output module 153a may be implemented in the form of a receiver or a speaker. The first camera 121a may be implemented in a form suitable for capturing an image or a moving image of the user. The microphone 123 may be implemented in a form suitable for receiving the user's voice, other sounds, and the like.

The first to fifth user input units 130a, 130b, 130c, 130d, and 130e and the sixth and seventh user input units 130f and 130g described below may be collectively referred to as the user input unit 130, and any manner of operation may be employed as long as it operates in a tactile manner.

For example, the user input unit 130 may be implemented as a dome switch or a touch pad capable of receiving a command or information by the user's pressing or touching operation, or may be implemented as a wheel or jog type that rotates a key, a joystick, or the like. In functional terms, the first to third user input units 130a, 130b, and 130c are for inputting commands such as start, end, and scroll, and the fourth user input unit 130d is for inputting a selection of an operation mode or the like. In addition, the fifth user input unit 130e may operate as a hot key for activating a special function in the mobile terminal 100.

Referring to FIG. 3, a second camera 121b and a fourth microphone 123d may be additionally mounted on the rear surface of the rear case 100-2, and the sixth and seventh user input units 130f and 130g and the interface unit 170 may be disposed on the side surface of the rear case 100-2.

The second camera 121b has a photographing direction substantially opposite to that of the first camera 121a and may have a different pixel resolution from the first camera 121a. A flash (not shown) and a mirror (not shown) may be additionally disposed adjacent to the second camera 121b. In addition, another camera may be installed adjacent to the second camera 121b and used for capturing three-dimensional stereoscopic images.

The flash illuminates the subject when the subject is photographed by the second camera 121b. The mirror allows the user to see his or her own face when taking a self-portrait using the second camera 121b.

A second sound output module (not shown) may be further disposed in the rear case 100-2. The second sound output module may implement the stereo function together with the first sound output module 153a, and may be used for talking in the speakerphone mode.

The interface unit 170 can be used as a path for exchanging data with an external device. An antenna for receiving broadcast signals (not shown) may be disposed in one area of the front case 100-1 and the rear case 100-2 in addition to the antenna for communication. The antenna may be installed to be capable of being drawn out from the rear case 100-2.

A power supply unit 190 for supplying power to the mobile terminal 100 may be mounted on the rear case 100-2. The power supply unit 190 may be a rechargeable battery, for example, and may be detachably coupled to the rear case 100-2 for charging or the like.

The fourth microphone 123d may be disposed on the front surface of the rear case 100-2, that is, on the rear surface of the mobile terminal 100 for audio signal collection.

On the other hand, in the present embodiment, the second camera 121b and the like are disposed in the rear case 100-2, but the present invention is not limited thereto.

Also, even if the second camera 121b is not separately provided, the first camera 121a may be formed to be rotatable so that it can photograph in the photographing direction of the second camera 121b.

FIG. 4 is a flowchart illustrating a method of operating a mobile terminal according to an embodiment of the present invention, and FIGS. 5 to 13 are diagrams referred to for explaining various embodiments of the method of operating a mobile terminal according to the present invention.

Referring to the drawings, a mobile terminal according to an embodiment of the present invention may enter a content sharing mode in which content is shared with a first external device (S410). The mobile terminal may enter the content sharing mode based on a command input by the user, for example, a command input to execute a content sharing application.

The content sharing mode may mean a mode in which the mobile terminal can transmit content to an external device connected over a wireless network or receive content transmitted from an external device. In addition, the content sharing mode may include a mirroring mode, in which the mobile terminal and an external device display the same screen, and a dual screen dual play (DSDP) mode, in which content reproduced by one of a plurality of connected devices can be used on another device while each device performs a different task.
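As a minimal illustrative sketch (not part of the disclosure), the difference between the mirroring mode and the DSDP mode described above could be modeled as follows; the Kotlin names and structure are assumptions made for the example.

```kotlin
// Hypothetical model of the two content sharing modes described above.

enum class SharingMode { MIRRORING, DSDP }

data class SharedContent(val name: String)

data class ConnectedDevice(val name: String, var playing: SharedContent? = null)

class ContentSharingSession(val mode: SharingMode, val devices: MutableList<ConnectedDevice>) {
    // In mirroring mode every device shows the same content; in DSDP mode
    // each device can run a different task.
    fun play(content: SharedContent, target: ConnectedDevice) {
        when (mode) {
            SharingMode.MIRRORING -> devices.forEach { it.playing = content }
            SharingMode.DSDP -> target.playing = content
        }
    }
}

fun main() {
    val phone = ConnectedDevice("mobile terminal")
    val tv = ConnectedDevice("TV")
    val session = ContentSharingSession(SharingMode.DSDP, mutableListOf(phone, tv))
    session.play(SharedContent("movie A"), tv)        // TV plays A
    session.play(SharedContent("web browser"), phone) // phone runs a different task
    println("${tv.name}: ${tv.playing?.name}, ${phone.name}: ${phone.playing?.name}")
}
```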

A mobile terminal according to an exemplary embodiment of the present invention may be connected to another external terminal, a PC, a TV, or the like in a wired or wireless manner to share screens and contents.

For example, the mobile terminal according to the embodiment of the present invention can share the screen it is displaying or the content it is playing with a connected external device, and can transmit data for displaying that screen on the external device.

In addition, the mobile terminal according to the embodiment of the present invention can be synchronized with an external device capable of displaying a web page screen or a messenger screen. The web page screen or the messenger screen can be displayed through a web browser or a messenger application. The mobile terminal according to the embodiment of the present invention can receive data about the web page or messenger screen currently displayed on the synced external terminal, the web page history, or text input on the web page or messenger screen, and can display the web page screen, the messenger screen, and the like on the display 151 based on the received data.

Meanwhile, data for content sharing may be directly transmitted to an external device via wired or wireless communication according to various communication standards, or may be transmitted to an external device through a separately connected web server.

Meanwhile, when the mobile terminal receives a predetermined touch (S420), the mobile terminal displays, on the display 151, a content sharing controller screen including objects representing the content being executed by the mobile terminal and by the first external device (S430).

Here, the controller screen is a screen for manipulating the overall operations related to content sharing, and the user can call up the controller screen by inputting the predetermined touch.

Upon receiving an additional input for operating the controller screen (S440), the controller 180 can control the mobile terminal 100 to perform the corresponding operation.

Through the controller screen, the user can set and change the shared content, the devices, the playback state, and the like, and can thus perform multitasking operations more conveniently while sharing data such as content and screens (S450).
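The S410 to S450 flow of FIG. 4 might be summarized by the following illustrative sketch; the function names are hypothetical and only mirror the steps described in the text.

```kotlin
// Illustrative sketch of the S410-S450 flow in FIG. 4 (not the actual implementation).

class MobileTerminalSharingFlow {
    private var inSharingMode = false
    private var controllerScreenVisible = false

    fun enterContentSharingMode() {               // S410
        inSharingMode = true
    }

    fun onPredeterminedTouch() {                  // S420
        if (inSharingMode) showControllerScreen() // S430
    }

    private fun showControllerScreen() {
        controllerScreenVisible = true
        println("Controller screen shown with objects for each device's content")
    }

    fun onControllerScreenInput(action: String) {    // S440
        if (controllerScreenVisible) perform(action) // S450
    }

    private fun perform(action: String) {
        println("Performing: $action (e.g. pause shared content, switch task)")
    }
}

fun main() {
    val flow = MobileTerminalSharingFlow()
    flow.enterContentSharingMode()
    flow.onPredeterminedTouch()
    flow.onControllerScreenInput("pause A task on TV and share B task")
}
```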

FIG. 5 is a diagram illustrating a conventional DSDP mode and multitasking flow when a mobile terminal and a TV are used, and FIG. 6 is a diagram illustrating a DSDP mode and multitasking flow when a mobile terminal and a TV according to an embodiment of the present invention are used.

Referring to FIG. 5, the user can share a first task screen (task A) executed by the mobile terminal with a first external device, the TV. The mobile terminal may share the A task screen by mirroring, or play back the content corresponding to the A task and then change to the DSDP mode, or may start content sharing in the DSDP mode from the beginning.

When the user operates the mobile terminal to perform task B, a second task executed by multitasking in addition to task A, the TV can still maintain the playback state of the A task screen, for example, the A movie being played back.

Thereafter, if the user wants to share the B task execution screen of the mobile terminal with the TV, the sharing of the A task must first be ended. In addition, there is the inconvenience that the operations related to the content sharing mode must be performed again from the beginning in order to use the A task once more after sharing the B task.

As an example, when the A task is a movie and the B task is an SNS or web browser screen, the user can share the movie from the mobile terminal with the TV and watch it on the TV's large screen. In addition, the user can use the SNS or the web browser on the mobile terminal while watching the movie on the TV. If the user then wants to view the SNS or web browser screen on the large screen of the TV, the user can share the SNS or web browser screen instead.

However, there is the inconvenience that the content sharing operation must be performed again from the beginning in order to view the movie the user was watching on the TV screen again.

On the other hand, according to an embodiment of the present invention, it is possible to switch between multitasking content sharing jobs without ending the existing content sharing.

Referring to FIG. 6, when the user operates the mobile terminal to perform task B, a second task executed by multitasking after sharing the first A task, the TV can still maintain the A task screen, for example, the playback state of the A movie.

Thereafter, when the user wishes to share the B task execution screen of the mobile terminal with the TV, the user invokes the controller screen, changes only the playback state of the A task to pause, and shares the B task without completely terminating the A task. The controller screen will be described in detail later with reference to FIGS. 7 to 11.

On the other hand, after sharing the B task, the user can easily switch back to the A task screen by calling the controller screen again.

Accordingly, it is possible to reduce unnecessary operation depth in the multitasking flow when the mobile terminal is connected to another device, and to return to the previous state without having to end the second task and repeat the content sharing operation from the beginning. In addition, each of the devices connected to the mobile terminal can return to a previous state in which different content was being used in the DSDP mode.

In addition, it is possible to maintain a full screen state of task screens shared among a plurality of devices without performing steps such as adjusting a task screen size or terminating a task.

In addition, there is no need for a separate home screen and operation for controlling content on the connected device.

Referring to FIG. 7(a), the predetermined touch input for calling the controller screen may be a flicking input 711 of touching and dragging the mobile terminal screen (e.g., a touch screen) with two fingers while a predetermined screen 710, such as a home screen or a lock screen, is displayed.

Meanwhile, the predetermined touch input for calling the controller screen illustrated in FIG. 7(a) is only an example for explaining the present invention, and the present invention is not limited thereto. For example, the predetermined touch input for calling the controller screen may be a touch input using three fingers, or may be freely set or changed by the manufacturer or the user.
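A two-finger flick of the kind described above could be detected roughly as in the following sketch; the pointer model and the drag-distance threshold are assumptions, not values taken from the disclosure.

```kotlin
// Hypothetical two-finger downward flick detector for invoking the controller screen.

data class Pointer(val id: Int, val startY: Float, val endY: Float)

class ControllerScreenGestureDetector(
    private val minDragDistance: Float = 200f,   // assumed threshold
    private val onControllerScreenRequested: () -> Unit
) {
    // Called once the gesture ends, with the pointers that took part in it.
    fun onGestureFinished(pointers: List<Pointer>) {
        val twoFingers = pointers.size == 2
        val allDraggedDown = pointers.all { it.endY - it.startY >= minDragDistance }
        if (twoFingers && allDraggedDown) onControllerScreenRequested()
    }
}

fun main() {
    val detector = ControllerScreenGestureDetector(onControllerScreenRequested = {
        println("Show translucent controller screen (720)")
    })
    detector.onGestureFinished(
        listOf(Pointer(0, startY = 100f, endY = 400f), Pointer(1, startY = 110f, endY = 420f))
    )
}
```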

Meanwhile, the controller screen may be displayed in a translucent state. Because the controller screen is displayed in a semitransparent state, the user can see the screen in use and the controller screen at the same time. As illustrated in FIGS. 7(b) and 7(c), the controller screen 720 may be created as a semi-transparent layer similar to a notification panel, or may be displayed on a translucent layer, and it can be displayed as if it descends onto the screen following the direction of the flicking input 711.

Meanwhile, according to an embodiment, when there is another touch input for calling the controller screen while the semi-transparent controller screen 720 is displayed, a further controller screen can be displayed as an additional layer of the multi-layer screen.

According to an embodiment of the present invention, the controller screen can be entered from any screen, for example while a task is playing on the mobile terminal.

In addition, according to the embodiment of the present invention, even if content or an application (app) is being used in full screen, they do not interfere with each other.

In addition, according to the embodiment of the present invention, since a plurality of layers are provided, complicated tasks can be separated according to the type of DSDP connection, and switching between tasks is easy.

FIGS. 8 to 11 show examples of controller screens.

FIG. 8 shows an example of the controller screen 800.

The controller screen 800 can be displayed in a translucent state so that the content being shared with the connected external devices can be checked without affecting the screen currently being executed by the mobile terminal.

The controller screen 800 may further include content usage history information of the mobile terminal and the first external device.

In addition, the controller screen 800 may further include content usage history information of other external devices currently connected to the mobile terminal.

The controller screen 800 may include the names and device state information 811, 821, and 831 of the currently connected external devices. In addition, the controller screen 800 may include information about the content 810 and 820 being reproduced by each device and the paused content 830.

In addition, the controller screen 800 may include the name 822 of the content and / or the application currently being provided.

Accordingly, through the images included in the controller screen 800, the user can check at a glance the status of the currently connected devices and the content being played.
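The information the controller screen 800 is described as containing (device name and state, the content object currently shown, and a usage history per device) could be modeled, purely as an assumption for illustration, as follows.

```kotlin
// Illustrative data model for the controller screen (800); all names are assumptions.

enum class PlaybackState { PLAYING, PAUSED, STOPPED }

data class ContentObject(val title: String, val appName: String, val thumbnail: String)

data class DeviceEntry(
    val deviceName: String,                                // e.g. "TV" (811, 821, 831)
    var state: PlaybackState,
    var current: ContentObject,                            // object shown at the top (810, 820, 830)
    val history: ArrayDeque<ContentObject> = ArrayDeque()  // most recent first
)

data class ControllerScreen(val translucent: Boolean = true, val devices: List<DeviceEntry>)

fun main() {
    val tv = DeviceEntry(
        deviceName = "TV",
        state = PlaybackState.PLAYING,
        current = ContentObject("Movie A", "Video player", "thumb_a.png")
    )
    val screen = ControllerScreen(devices = listOf(tv))
    screen.devices.forEach { println("${it.deviceName}: ${it.current.title} (${it.state})") }
}
```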

In addition, when one of the contents is touched, menu buttons such as play, pause, rewind, and fast forward that can control the reproduction of the corresponding content can be displayed.

Accordingly, when the user touches the content shared on a connected device, information such as the content name and application name can be checked, and commands such as play, pause, and stop for the content of the connected device can be conveniently input.

Meanwhile, the controller screen 800 can display the objects representing the content being executed by the mobile terminal and by the first external device at the top of the respective content usage history information, and other connected devices can be displayed in the same manner.

Meanwhile, the object representing the content being executed by the mobile terminal and the first external apparatus may be a thumbnail image.

Meanwhile, the history of content that has been played on the connected devices can be easily checked through the controller screen 800 according to the embodiment of the present invention, for example by a flicking input.

Upon receiving a flicking input on the content usage history information, the controller 180 may change the display order of the content usage history information according to the direction of the flicking input.

Referring to FIG. 9, the content usage history information is displayed stacked in reverse order of use. Accordingly, the B content currently being executed is displayed at the top, and the A content used before it can be displayed below.

When the user flicks to the right, the A content moves to the top position and the B content moves down. In this state, when the user selects play, the A content is reproduced.

Therefore, the controller screen according to the embodiment of the present invention provides the history of each individual device to the user, and the user can flick through the history to browse it and change content.
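The reordering behaviour of FIG. 9, where a flick brings an older history entry to the top, can be sketched as a simple stack operation; the list-based model below is an illustrative assumption.

```kotlin
// Illustrative sketch: a flick on the history raises the selected entry to the top.

fun <T> MutableList<T>.bringToTop(index: Int) {
    if (index in indices) add(0, removeAt(index))
}

fun main() {
    // Index 0 is the item currently shown at the top of the stack.
    val tvHistory = mutableListOf("B content (playing)", "A content")
    // A flick to the right selects the older entry and raises it.
    tvHistory.bringToTop(1)
    println(tvHistory) // [A content, B content (playing)] -> pressing play now plays A
}
```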

Meanwhile, upon receiving a touch input 1050 on an object representing the content being executed by either the mobile terminal or the first external device, the controller 180 can display menus 1010, 1020, and 1030 for controlling stopping and reproduction of the content.

For example, when the user touches the content 820 for about 2 to 3 seconds, menu buttons such as play, pause, rewind, and fast forward can be displayed. In addition, information about the content being executed can be displayed at the lower end of the content 820.

In addition, a Close button 1040 that can stop content sharing can be displayed.
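The long-touch behaviour described above (a touch of about 2 to 3 seconds revealing play, pause, rewind, fast forward, and a Close button) might be sketched as follows; the threshold value and names are assumptions.

```kotlin
// Illustrative long-touch handling for a content object on the controller screen.

data class PlaybackMenu(val buttons: List<String>)

class ContentObjectTouchHandler(private val longPressMillis: Long = 2000) {
    // Returns the playback control menu only when the touch was long enough.
    fun onTouch(pressedMillis: Long): PlaybackMenu? =
        if (pressedMillis >= longPressMillis)
            PlaybackMenu(listOf("play", "pause", "rewind", "fast forward", "close sharing"))
        else null
}

fun main() {
    val handler = ContentObjectTouchHandler()
    println(handler.onTouch(pressedMillis = 2500)) // menu shown after a 2-3 second touch
    println(handler.onTouch(pressedMillis = 300))  // short tap: no menu
}
```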

Meanwhile, upon receiving an input of dragging one of the contents included in the controller screen onto one of the devices, the controller 180 can control the dragged content to be shared with that device and reproduced on it.

Referring to FIG. 11, when the user searches the history of the TV and drags and drops the B content onto the mobile terminal, the A content that was being played back by the mobile terminal is accumulated in the history, and the content being played can be changed. Thus, the user can conveniently change the shared content.
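The drag-and-drop behaviour of FIG. 11, in which the previously playing content is kept in the history and the dragged content takes its place, could look roughly like the following sketch; all names are illustrative.

```kotlin
// Illustrative sketch of dropping a content object onto a device.

data class Device(
    val name: String,
    var current: String?,
    val history: ArrayDeque<String> = ArrayDeque()
)

fun dropContentOnDevice(content: String, target: Device) {
    target.current?.let { previous ->
        // The previously playing content is paused and kept in the history stack.
        target.history.addFirst("$previous (paused)")
    }
    target.current = content
}

fun main() {
    val phone = Device("mobile terminal", current = "A content")
    dropContentOnDevice("B content", phone)
    println("${phone.name} now plays: ${phone.current}, history: ${phone.history}")
}
```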

Referring to FIGS. 12 and 13, an exemplary embodiment in which the mobile terminal 100 shares content with a TV 1200 will be described. First, as shown in FIG. 12(a), the mobile terminal 100 and the TV 1200 may mirror content A, or the TV 1200 may play content A in the DSDP mode.

As shown in FIG. 12(b), the user can switch to the home screen by pressing the home key/button of the mobile terminal 100, or can execute another application B. The user can call the controller screen to change the screen displayed by the TV 1200.

FIGS. 13(a) and 13(b) show an example of the controller screen. The user can pause the A content on the TV and then drag and drop content B, so that the TV 1200 changes to reproduce content B as shown in FIG. 12(c).

Alternatively, even if the user does not pause the A content, when the user drags the B content to the TV, the A content can automatically switch to a paused state and the content being reproduced can be changed.

The mobile terminal and its operating method according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the method of operating a mobile terminal according to an embodiment of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the mobile terminal. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments, and it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Wireless communication unit: 110
A/V (Audio/Video) input unit: 120
User input unit: 130
Sensing unit: 140
Output unit: 150
Memory: 160
Interface unit: 170
Controller: 180
Power supply unit: 190

Claims (9)

Entering a content sharing mode for sharing content with a first external device;
Receiving a predetermined touch; And
And displaying, on a display unit, a content sharing controller screen including objects representing the content being executed by the mobile terminal and by the first external device.
The method according to claim 1,
Wherein the controller screen further includes content usage history information of the mobile terminal and the first external device.
3. The method of claim 2,
Wherein the controller screen further includes content usage history information of another external device currently connected to the mobile terminal.
3. The method of claim 2,
Wherein the controller screen displays objects representing the content being executed by the mobile terminal and by the first external device at the top of the respective content usage history information.
3. The method of claim 2,
Receiving a flicking input for the content usage history information;
And changing the display order of the content usage history information according to the direction of the flicking input.
The method according to claim 1,
Wherein the object representing the content being executed by the mobile terminal and the first external device is a thumbnail image.
The method according to claim 1,
Receiving a touch input for an object representing content being executed by the mobile terminal and the first external device;
And displaying a menu for controlling stopping and playback of the content based on the touch input.
The method according to claim 1,
Receiving an input for dragging one of the contents included in the controller screen to one of the devices;
And sharing the dragged content with the device onto which the content is dragged.
The method according to claim 1,
Wherein the controller screen is displayed in a translucent state.
KR1020130052634A 2013-05-09 2013-05-09 Method of operating a Mobile Terminal KR20140133073A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130052634A KR20140133073A (en) 2013-05-09 2013-05-09 Method of operating a Mobile Terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130052634A KR20140133073A (en) 2013-05-09 2013-05-09 Method of operating a Mobile Terminal

Publications (1)

Publication Number Publication Date
KR20140133073A true KR20140133073A (en) 2014-11-19

Family

ID=52453828

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130052634A KR20140133073A (en) 2013-05-09 2013-05-09 Method of operating a Mobile Terminal

Country Status (1)

Country Link
KR (1) KR20140133073A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170027435A (en) * 2015-09-02 2017-03-10 엘지전자 주식회사 Electronic device and method for controlling the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination