KR20140146759A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same

Info

Publication number
KR20140146759A
KR20140146759A (application KR20130069425A)
Authority
KR
South Korea
Prior art keywords
area
terminal
screen
image
touched
Prior art date
Application number
KR20130069425A
Other languages
Korean (ko)
Inventor
이정빈
임소연
이지선
박진우
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR20130069425A
Publication of KR20140146759A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/14Handling requests for interconnection or transfer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/38Information transfer, e.g. on bus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a mobile terminal, and a control method thereof, that can easily transmit an image of content displayed on a first terminal to a second terminal, or an image of content displayed on the second terminal to the first terminal, by placing the first terminal on the screen of the second terminal and touching or dragging it there.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a portable terminal, and a control method thereof, that enhance user convenience in using the terminal.

A terminal such as a personal computer, a notebook computer, or a mobile phone can be configured to perform various functions. Examples of such functions include data and voice communication, capturing still images or video through a camera, voice recording, music file playback through a speaker system, and displaying images or video. Some terminals include additional functionality for executing games, and others are implemented as multimedia devices. Moreover, recent terminals can receive broadcast or multicast signals so that a user can view video or television programs.

In general, terminals may be divided into mobile terminals and stationary terminals according to whether they are movable. Mobile terminals may be further divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry the terminal directly.

Meanwhile, in order to acquire a screen image displayed on an external terminal, the user of a portable terminal conventionally must operate the external terminal so that it transmits the desired image to the portable terminal by wireless communication, must transfer the image from the external terminal to the portable terminal through a data cable, or must directly capture the screen image displayed on the external terminal with the camera of the portable terminal.

SUMMARY OF THE INVENTION
It is an object of the present invention to provide a mobile terminal, and a control method thereof, in which, when a first terminal and a second terminal, both portable terminals, are communicatively connected and the first terminal is touched to a partial area of the screen of the second terminal, an image of the content displayed in an area of the first terminal's screen associated with that partial area is displayed in the partial area of the second terminal's screen.

It is another object of the present invention to provide a mobile terminal, and a control method thereof, in which, when the first terminal is touched to a partial area of the screen of the second terminal, communication is established between the first and second terminals, an image of the content displayed in the touched partial area is received from the second terminal, and the received image is displayed in an area of the first terminal's screen associated with the partial area.

It is a further object of the present invention to provide a mobile terminal, and a control method thereof, in which, when the first and second terminals are communicatively connected and the second terminal is touched to a partial area of the screen of the first terminal, an image of the content displayed in the touched partial area is displayed in an area of the second terminal's screen associated with that partial area.

According to an aspect of the present invention, there is provided a portable terminal including: a display unit that displays specific content on a screen; a wireless communication unit that communicates with at least one external terminal; and a controller that, when the portable terminal is touched to a partial area of the screen of the external terminal while communication with the external terminal is connected, transmits an image displayed in an area of the display unit's screen associated with that partial area through the wireless communication unit so that the image is displayed in the partial area of the external terminal.

According to another aspect of the present invention, there is provided a method of controlling a portable terminal, the method including: connecting the portable terminal for communication with at least one external terminal; displaying specific content on a screen of the portable terminal; detecting whether the portable terminal is touched to a partial area of the screen of the external terminal while communication with the external terminal is established; and, when the portable terminal is detected as touched to the partial area, transmitting an image displayed in an area of the portable terminal's screen associated with the partial area so that the image is displayed in the partial area of the external terminal.
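The connect / display / detect / transmit steps recited above can be illustrated with a minimal sketch. This is a hypothetical simulation written for this description, not the patented implementation; the class, the screen-as-dictionary model, and all method names are assumptions made only for illustration.

```python
# Hypothetical sketch of the claimed control flow: connect, display content,
# detect a touch of this terminal on part of the external screen, and
# transmit the image shown in the associated area of this screen.

class PortableTerminal:
    def __init__(self, screen):
        # `screen` maps area names to the image (content) displayed there.
        self.screen = screen
        self.peer = None  # external terminal once communication is connected

    def connect(self, external):
        """Step 1: establish communication with an external terminal."""
        self.peer = external
        external.peer = self

    def on_touched_at(self, partial_area):
        """Steps 3-4: when this terminal is detected as touched to a partial
        area of the external screen, transmit the image displayed in the
        associated area of this terminal's screen to the external terminal."""
        if self.peer is None:
            raise RuntimeError("communication is not connected")
        image = self.screen[partial_area]        # image in the associated area
        self.peer.receive(partial_area, image)   # display it on the peer

    def receive(self, partial_area, image):
        self.screen[partial_area] = image


first = PortableTerminal({"top-left": "photo.jpg"})
second = PortableTerminal({"top-left": "(empty)"})
first.connect(second)               # communication connected
first.on_touched_at("top-left")     # first terminal touched on second's screen
print(second.screen["top-left"])    # prints photo.jpg
```

Here the "associated area" is modeled simply as the area with the same name on both screens; the patent leaves the association between the touched area and the source area more general.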

According to another aspect of the present invention, there is provided a portable terminal including: a touch screen; a wireless communication unit that communicates with at least one external terminal displaying specific content on a screen; and a controller that, when the portable terminal is touched to a partial area of the screen of the external terminal while communication with the external terminal is connected, receives an image of the content displayed in the partial area from the external terminal through the wireless communication unit, and displays the received image in an area of the touch screen associated with the partial area.

According to another aspect of the present invention, there is provided a method of controlling a portable terminal, the method including: connecting the portable terminal for communication with at least one external terminal displaying specific content on a screen; detecting whether the portable terminal is touched to a partial area of the screen of the external terminal while communication with the external terminal is established; receiving an image of the content displayed in the partial area from the external terminal when the portable terminal is detected as touched to the partial area; and displaying the received image in an area of the portable terminal's screen associated with the partial area.

According to another aspect of the present invention, there is provided a portable terminal including: a touch screen that displays specific content on a screen; a wireless communication unit that communicates with at least one external terminal; and a controller that, when the external terminal is touched to a partial area of the screen of the touch screen while communication with the external terminal is connected, transmits an image of the content displayed in the touched partial area through the wireless communication unit so that the image is displayed in an area of the external terminal's screen associated with that partial area.

According to another aspect of the present invention, there is provided a method of controlling a portable terminal, the method including: connecting the portable terminal for communication with at least one external terminal; displaying specific content on a screen of a touch screen provided in the portable terminal; detecting whether the external terminal is touched to a partial area of the screen of the touch screen while communication with the external terminal is connected; and, when the external terminal is touched to the partial area, transmitting an image of the content displayed in the touched partial area so that the image is displayed in an area of the external terminal's screen associated with that partial area.

A portable terminal and a control method thereof according to the present invention provide the advantage that, by placing a first terminal, which is a portable terminal, on a second terminal and touching or dragging it there, an image of the content displayed on the first terminal can easily be transmitted to the second terminal, or an image of the content displayed on the second terminal can easily be transmitted to the first terminal.

FIG. 1 is a block diagram illustrating a portable terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of a portable terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
FIG. 3 is a block diagram illustrating an external terminal according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating a system composed of a portable terminal and one or more external terminals according to the present invention.
FIG. 5 is a flowchart illustrating a first embodiment of an image transmission process between first and second terminals according to the present invention.
FIGS. 6 to 8 are diagrams illustrating the first embodiment of the image transmission process between the first and second terminals according to the present invention.
FIG. 9 is a flowchart illustrating a second embodiment of the image transmission process between the first and second terminals according to the present invention.
FIGS. 10 to 16 are diagrams illustrating the second embodiment of the image transmission process between the first and second terminals according to the present invention.
FIG. 17 is a flowchart illustrating a third embodiment of the image transmission process between the first and second terminals according to the present invention.

Hereinafter, a portable terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably merely for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

The portable terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV or a desktop computer, except where a configuration is applicable only to a portable terminal.

1 is a block diagram of a portable terminal according to an embodiment of the present invention.

The portable terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential; a portable terminal having more, or fewer, components may alternatively be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable terminal 100 and a wireless communication system, or between the portable terminal 100 and a network in which the portable terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a position information module 115.

Under the control of the controller 180, the wireless communication unit 110 can search for at least one external terminal available for a communication connection, connect communication with any one of the searched external terminals, transmit data provided in the portable terminal 100 to the connected external terminal, and receive data from the connected external terminal.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms. For example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to suit not only the digital broadcasting systems described above but also other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

In addition, under the control of the controller 180, the mobile communication module 112 may connect communication with at least one external terminal, transmit data provided in the portable terminal 100 to the connected external terminal, and receive data from the connected external terminal.

The wireless Internet module 113 refers to a module for wireless Internet access, and may be built in or externally mounted in the mobile terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

In accordance with the present invention, under the control of the controller 180, the wireless Internet module 113 may search for at least one external terminal available for a communication connection, connect communication with any one of the searched external terminals, transmit data provided in the portable terminal 100 to the connected external terminal, and receive data from the connected external terminal.

The short-range communication module 114 refers to a module for short-range communication. Short-range communication technologies such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA) may be used.

In addition, according to the present invention, under the control of the controller 180, the short-range communication module 114 may search for at least one external terminal available for a communication connection, connect communication with any one of the searched external terminals, transmit data provided in the portable terminal 100 to the connected external terminal, and receive data from the connected external terminal.

The position information module 115 is a module for acquiring the position of the portable terminal; a representative example is a Global Positioning System (GPS) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and then applies triangulation to the calculated information to accurately compute three-dimensional position information in latitude, longitude, and altitude. A method of calculating position and time information using three satellites, and then correcting the error of the calculated position and time information using one more satellite, is widely used. In addition, the GPS module 115 can calculate speed information by continuously computing the current position in real time.
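As a toy illustration of the triangulation step described above, the sketch below solves the two-dimensional analogue of the problem: given three known anchor positions and the measured distances to each, it recovers the receiver position by subtracting the circle equations pairwise and solving the resulting linear system. This is a simplification for illustration only; real GPS positioning works in three dimensions and also solves for the receiver clock error.

```python
def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Recover (x, y) from three anchor positions and measured distances.

    Subtracting the circle equation of anchor 1 from anchors 2 and 3
    eliminates the quadratic terms, leaving two linear equations.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y

# A point at (3, 4) measured from anchors at (0,0), (10,0), and (0,10):
px, py = trilaterate_2d((0, 0), (10, 0), (0, 10), 5.0, 65 ** 0.5, 45 ** 0.5)
print(round(px, 6), round(py, 6))  # prints 3.0 4.0
```

The denominators are nonzero as long as the three anchors are not collinear, which is the planar analogue of the satellite-geometry requirement.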

Referring to FIG. 1, the audio/video (A/V) input unit 120 is for inputting an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or videos, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the usage environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal.

The user input unit 130 may receive from the user a signal designating two or more contents among the displayed contents according to the present invention. A signal for designating two or more contents may be received via the touch input, or may be received via the hard key and soft key input.

The user input unit 130 may receive an input from the user for selecting the one or more contents. In addition, an input for generating an icon related to a function that the portable terminal 100 can perform can be received from the user.

The user input unit 130 may include a directional keypad, a keypad, a dome switch, a touchpad (static pressure / capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100, its position, whether the user is in contact with it, and its orientation, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is in the form of a slide phone, the sensing unit 140 can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141, which will be described later in relation to the touch screen.

The output unit 150 is for generating output related to the senses of sight, hearing, or touch, and may include a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) the information processed in the portable terminal 100. For example, when the portable terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received image, UI, or GUI is displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 depending on the implementation form of the portable terminal 100. For example, a plurality of display units may be disposed on one surface of the portable terminal 100, spaced apart or formed integrally, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
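The final step, resolving the coordinates reported by the touch controller into a screen area, can be sketched as a simple grid hit test. This is illustrative only; the grid model and function name are assumptions, since the patent does not specify how areas of the display are defined.

```python
def touched_region(x, y, width, height, rows=2, cols=2):
    """Map a touch coordinate to a (row, col) grid cell of the display,
    so the controller can tell which area of the screen was touched."""
    if not (0 <= x < width and 0 <= y < height):
        return None  # touch landed outside the display area
    row = int(y * rows // height)
    col = int(x * cols // width)
    return (row, col)

# A touch at (300, 100) on a 480x800 display falls in the top-right cell.
print(touched_region(300, 100, width=480, height=800))  # prints (0, 1)
```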

The proximity sensor 141 may be disposed in an inner area of the portable terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface, or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact-type sensor, and its utility is also higher.

Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by the change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of description, the act of bringing a pointer close to the touch screen without contact so that the pointer is recognized as being positioned on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position on the touch screen at which a proximity touch is made by the pointer means the position at which the pointer vertically faces the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
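By way of illustration, distinguishing a proximity touch from a contact touch, and deriving one element of the proximity touch pattern (the approach speed), might look like the following sketch. The threshold values and function names are assumptions made for this example, not values taken from the patent.

```python
def classify_touch(distance_mm, contact_threshold_mm=0.5, proximity_range_mm=30.0):
    """Classify a sensed pointer distance as a contact touch, a proximity
    touch, or no touch at all (thresholds are illustrative assumptions)."""
    if distance_mm <= contact_threshold_mm:
        return "contact touch"
    if distance_mm <= proximity_range_mm:
        return "proximity touch"
    return "none"

def proximity_speed(d0_mm, d1_mm, dt_s):
    """Approach speed of the pointer between two distance samples (mm/s);
    a positive value means the pointer is moving toward the screen."""
    return (d0_mm - d1_mm) / dt_s

print(classify_touch(12.0))              # prints proximity touch
print(proximity_speed(20.0, 12.0, 0.1))  # prints 80.0
```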

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the portable terminal 100 (e.g., a call signal reception tone, a message reception tone, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events occurring in the portable terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may also output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, as vibration. Since a video or audio signal can also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be regarded as a kind of alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 are controllable; for example, different vibrations may be combined and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various other tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection or suction port, a brush against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact, but also so that the user can feel the tactile effect through a muscular sense of a finger or an arm. Two or more haptic modules 154 may be provided depending on the configuration of the portable terminal 100.

The projector module 155 is a component for performing an image projection function using the portable terminal 100. According to a control signal of the controller 180, it displays, on an external screen or wall, an image identical to, or at least partially different from, the image displayed on the display unit 151.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image producing means (not shown) that produces the image to be output using the light generated by the light source, and a lens (not shown) that enlarges the image and outputs it to the outside at a predetermined focal distance. The projector module 155 may further include a device (not shown) capable of mechanically moving the lens or the entire module to adjust the image projection direction.

The projector module 155 can be classified into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module, which enlarges and projects an image generated by reflecting the light from the light source off a DMD (Digital Micromirror Device) chip, can be advantageous for miniaturizing the projector module 155.

Preferably, the projector module 155 may be provided on the side surface, the front surface, or the back surface of the portable terminal 100 in the longitudinal direction. It goes without saying that the projector module 155 may be provided at any position of the portable terminal 100 as needed.

The memory 160 may store a program for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a phonebook, messages, audio, still images, electronic books, moving pictures, and the like). The memory 160 may also store the frequency of use of each item of data (for example, the frequency of use of each telephone number, each message, and each piece of multimedia). In addition, the memory 160 may store data on the vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The portable terminal 100 may also operate in association with a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the portable terminal 100. The interface unit 170 receives data or power from an external device and transmits it to each component in the portable terminal 100, or allows data in the portable terminal 100 to be transmitted to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various information for authenticating the usage right of the portable terminal 100, and includes a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the port.

When the portable terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input by the user from the cradle are transmitted to the portable terminal. The various command signals or power input from the cradle may also operate as a signal for recognizing that the portable terminal is correctly mounted on the cradle.

In addition, the interface unit 170 may establish wired communication with at least one external terminal through a data cable, transmit data provided in the portable terminal 100 to the connected external terminal, or receive data from the connected external terminal.

The controller 180 typically controls the overall operation of the portable terminal. For example, it performs control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

FIG. 2A is a perspective view of a portable terminal according to an embodiment of the present invention.

The disclosed mobile terminal 100 has a bar-shaped main body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (casing, housing, cover, and the like) forming the appearance of the terminal. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, the interface 170, and the like may be disposed in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion.

The commands input through the first or second operation unit 131 or 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may be additionally mounted on the rear surface of the terminal body, that is, on the rear case 102. The camera 121' has a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A), and may have the same or a different pixel count from that of the camera 121.

For example, the camera 121 preferably has a low pixel count sufficient to photograph the user's face and transmit it to the other party during a video call, while the camera 121' preferably has a high pixel count, since it usually photographs a general subject that is not immediately transmitted. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to view his or her own face when taking a self-portrait using the camera 121'.

An acoustic output module 152 'may be additionally disposed on the rear side of the terminal body. The sound output unit 152 'may implement the stereo function together with the sound output module 152 (see FIG. 2A), and may be used for the implementation of the speakerphone mode during a call.

In addition to the antenna for communication, a broadcast signal receiving antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving unit 111 (see FIG. 1), may be installed to be able to be drawn out from the terminal body.

A power supply unit 190 for supplying power to the portable terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing a touch. Like the display unit 151, the touch pad 135 may be of a light-transmitting type. In this case, if the display unit 151 is configured to output visual information on both sides (that is, toward both the front and rear of the portable terminal), that information can also be recognized through the touch pad 135. The information output on both sides may be entirely controlled by the touch pad 135.

Meanwhile, a display dedicated to the touch pad 135 may be separately installed, so that a touch screen may also be disposed in the rear case 102.

The touch pad 135 operates in conjunction with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to and behind the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code is stored in the memory 160 and can be executed by the controller 180.

The portable terminal referred to herein may include at least one of the components shown in FIG. 1. In addition, the controller 180 can control the individual operation of each component, or the linked operation of a plurality of components, in order to perform an operation using those components (e.g., the touch screen, the wireless communication unit, the memory, and the like).

Hereinafter, with reference to FIG. 3 through FIG. 17, the image transmission process between the wireless terminals according to the present invention will be described in detail.

Referring to FIG. 3, an external terminal 700, which establishes communication with the portable terminal 100 and outputs data received from the portable terminal 100 or transmits data to the portable terminal 100, will now be described in detail.

FIG. 3 is a block diagram illustrating an external terminal according to an embodiment of the present invention.

Referring to FIG. 3, an external terminal 700 according to the present invention includes a wireless communication unit 710, a camera 720, an input unit 730, a memory 740, a speaker 750, a display unit 760, and a control unit 770.

Of course, in addition to the above-described components, the external terminal 700 may equally be equipped with the components mentioned for the portable terminal 100. That is, the external terminal 700 may itself be a portable terminal having the same components as the portable terminal 100 shown in FIG. 1.

The wireless communication unit 710 communicates with the portable terminal 100 according to the present invention, and may receive, from the portable terminal 100, data to be displayed or output on the external terminal 700. The data received from the portable terminal 100 may include at least one of image data of the content executed in the portable terminal 100 and video/audio data of the content reproduced or executed in the portable terminal 100.

The wireless communication unit 710 may transmit a part or all of the content displayed on the display unit 760 to the portable terminal 100.

When the portable terminal 100 is placed on and touches a part of the screen of the touch-screen type display unit 760, the wireless communication unit 710 may, under the control of the control unit 770, transmit touch area information including the area and position touched by the portable terminal 100 to the portable terminal 100.

When the portable terminal 100 is placed on the screen of the touch-screen type display unit 760 and dragged in a specific direction, the wireless communication unit 710 may, under the control of the control unit 770, transmit drag-touch area information including the area and position of the dragged region to the portable terminal 100.
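The touch area information and drag-touch area information described above can be modeled as a simple message. The following is a minimal sketch in Python; the field names (`x`, `y`, `width`, `height`, `dragged`) are illustrative assumptions, not taken from the specification:

```python
from dataclasses import dataclass


@dataclass
class TouchAreaInfo:
    """Area and position of the region touched (or dragged over) by the
    first terminal, in pixel coordinates of the second terminal's
    display unit 760. Field names are hypothetical."""
    x: int       # left edge of the touched region
    y: int       # top edge of the touched region
    width: int   # width of the touched region
    height: int  # height of the touched region
    dragged: bool = False  # True for drag-touch area information

# Example: the first terminal touches the left half of a 1280x800 screen
info = TouchAreaInfo(x=0, y=0, width=640, height=800)
```

In this sketch, a plain touch and a drag touch share one message shape, distinguished only by the `dragged` flag, which mirrors how the specification treats the two cases with parallel wording.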

Like the wireless communication unit 110 of the portable terminal 100 of FIG. 1, the wireless communication unit 710 may include a mobile communication module and a short-range communication module that enable communication between the portable terminal 100 and the external terminal 700.

For example, the portable terminal 100 and the external terminal 700 may be connected to each other through a communication method of any one of mobile communication, wireless Internet communication, Bluetooth, and DLNA.

The camera 720 processes image frames, such as still images or moving images, obtained by its image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 760. The camera 720 may be driven according to a command signal received from the portable terminal 100 through the wireless communication unit 710; in this case, the preview image input after the camera is driven may be transmitted to the portable terminal 100 through the wireless communication unit 710. As described above, two or more cameras 720 may be provided depending on the use environment.

The input unit 730 generates key signals for controlling the operation of the external terminal 700. The input unit 730 may include a keypad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, a mouse, and the like.

The memory 740 may store a program for the operation of the external terminal 700, as well as various data such as moving picture files, audio files, and image files.

The speaker 750 outputs the audio file stored in the memory 740 and the audio data received from the portable terminal 100 through the wireless communication unit 710.

The display unit 760 displays information processed by the external terminal 700. The display unit 760 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display. Also, the display unit 760 may be combined with a touch sensor to form a touch screen.

In addition, the display unit 760 displays image data (including still images and moving images) received from the portable terminal 100 on the screen.

The control unit 770 controls the overall operation of the external terminal 700, displays image data received from the portable terminal 100 on the screen of the display unit 760, and transmits an image of the content displayed on the screen of the display unit 760 to the portable terminal 100.

The configuration of the external terminal 700 according to the present invention has been described above.

FIG. 4 illustrates a communication connection and data sharing process between the portable terminal 100 and the external terminal 700 configured as described above.

FIG. 4 is an explanatory view of a system including a portable terminal and one or more external terminals according to the present invention.

First, FIG. 4A shows a system including a portable terminal 100 and one external terminal 700.

Referring to FIG. 4A, the portable terminal 100 searches for an external terminal 700 capable of communication through the wireless communication unit 110, establishes communication with the found external terminal 700, and may transmit all or part of the content displayed on its screen to the external terminal 700.

For example, the portable terminal 100 and the external terminal 700 can communicate with each other through any one of mobile communication, wireless Internet communication, and short-range communication. When short-range communication is used, the portable terminal 100 and the external terminal 700 can establish communication through a method such as Bluetooth or DLNA.

Referring to FIG. 4B, the portable terminal 100 may search for two or more external terminals 700-1 to 700-n and establish communication with each of the found external terminals 700-1 to 700-n.

Hereinafter, an image transmission process between terminals according to the present invention will be described in detail with reference to FIG. 5 through FIG. 17.

In the present invention, different terms are used for the portable terminal 100 and the external terminal 700 only to distinguish the subject of each operation; the constituent elements of the portable terminal 100 and the external terminal 700 may be the same.

In the following description, the portable terminal 100 and the external terminal 700 are referred to as a first terminal and a second terminal, respectively. Of course, the operation of the first terminal 100 described below can be performed in the same manner by the second terminal 700, which has the same components as the first terminal 100, and vice versa.

[First Embodiment]

The first embodiment of the present invention relates to a process in which, when the first terminal 100 is placed on and touches a part of the screen of the second terminal 700, or is dragged while touching the screen, an image of the content displayed on the screen of the first terminal 100 is transmitted to and displayed on the second terminal 700.

Hereinafter, a first embodiment of the present invention will be described in detail with reference to FIGS. 5 to 8.

FIG. 5 is a flowchart illustrating an image transmission process between a first terminal and a second terminal according to the first embodiment of the present invention.

FIGS. 6 to 8 are diagrams illustrating the first embodiment of the image transmission process between the first and second terminals according to the present invention.

Referring to FIGS. 5 to 8, the controller 180 of the first terminal 100 establishes communication with the second terminal 700 through the wireless communication unit 110, and displays the content selected or designated by the user on the screen of the touch screen 151 [S120].

At this time, the content includes all data that can be executed or displayed on the first terminal 100, and may include, for example, broadcast data, moving pictures, music, pictures, games, documents, maps, navigation, menu functions, an idle screen, a home screen, and the like.

The controller 180 detects whether the first terminal 100 touches a part of the screen of the second terminal 700 while communication with the second terminal 700 is established [S130].

That is, the controller 180 detects whether the first terminal 100 has been placed on and touched a part of the screen of the second terminal 700, and whether the first terminal 100 has then been dragged in a specific direction while touching the screen.

At this time, the first terminal 100 can recognize, by means of the second terminal 700, that it has touched the screen of the second terminal 700.

That is, while communication with the first terminal 100 is established, when the first terminal 100 is placed on and touches a partial area of the screen of the display unit 760, the second terminal 700 detects the touch, generates touch area information including the detected touch area and position, and transmits it to the first terminal 100.

When the touch area information is received from the second terminal 700 through the wireless communication unit 110, the controller 180 of the first terminal 100 determines that the first terminal 100 has been placed on and touched a partial area of the screen of the second terminal 700.

Further, while communication with the first terminal 100 is established, when the first terminal 100 touches a part of the screen of the display unit 760 and is then dragged, the second terminal 700 detects the drag touch, generates drag-touch area information including the detected touch area and position, and transmits it to the first terminal 100.

When the drag-touch area information is received from the second terminal 700 through the wireless communication unit 110, the controller 180 of the first terminal 100 determines that the first terminal 100 has been dragged while touching the screen of the second terminal 700.

If the controller 180 detects that the first terminal 100 has touched the screen of the second terminal 700 [S140], it captures or crops an image of the content displayed in the region of the touch screen 151 associated with the touched area of the screen of the second terminal 700 (hereinafter referred to as the 'associated area'), and transmits the captured or cropped image to the second terminal 700 through the wireless communication unit 110 [S150].

That is, the first terminal 100 transmits the image so that the image displayed in the associated area is displayed in the area of the screen of the second terminal 700 touched by the first terminal 100.

At this time, the controller 180 transmits the image to the second terminal 700 together with a command signal instructing that the image be displayed in the area of the second terminal 700 touched by the first terminal 100; upon receiving the image together with the command signal, the second terminal 700 displays the received image in the touched area according to the command signal.
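The capture-or-crop step described above amounts to cutting the associated region out of the content image before transmission. The following is an illustrative sketch only, not the terminal's actual implementation; a real device would copy pixels from its display buffer, and the `crop_region` helper and its rectangle parameters are assumptions:

```python
def crop_region(image, x, y, width, height):
    """Cut the associated area out of a content image.

    `image` is a row-major grid of pixels (list of lists); the returned
    sub-image stands in for what the first terminal would transmit to
    the second terminal for display in the touched area.
    """
    return [row[x:x + width] for row in image[y:y + height]]

# 4x4 content image with distinct pixel values 0..15
content = [[r * 4 + c for c in range(4)] for r in range(4)]
# Crop the right half (columns 2-3), full height
patch = crop_region(content, x=2, y=0, width=2, height=4)
```

Cropping before transmission means only the pixels destined for the touched area cross the wireless link, which matches the specification's intent of sending the associated area rather than the whole screen.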

In this case, when the first terminal 100 is placed on and touches a partial area of the screen of the second terminal 700, the associated area may be the area within the screen of the touch screen 151 that is at the position opposite to the partial area and has the same size as the partial area.

That is, when the touch area information about the partial area is received from the second terminal 700, the controller 180 determines, as the associated area, the area within the screen of the touch screen 151 that is at the position opposite to the partial area and has the same size as the partial area.

For example, if the part of the second terminal 700 touched by the first terminal 100 is the left region of its screen, the controller 180 determines, as the associated area, the right region of the screen of the touch screen 151 that is opposite to and has the same size as the left region.

Further, when the first terminal 100 touches the screen of the second terminal 700 and is then dragged in a specific direction, the associated area for the dragged area may be the area within the screen of the touch screen 151 that is at the position opposite to the dragged area and has the same size as the dragged area.

That is, when the drag-touch area information about the dragged area is received from the second terminal 700, the controller 180 determines, based on the drag-touch area information, the area within the screen of the touch screen 151 that is at the position opposite to the dragged area and has the same size as the dragged area as the associated area.

For example, if the area of the second terminal 700 over which the first terminal 100 was dragged extends from the left edge to the center of the screen, the controller 180 determines, as the associated area, the area of the touch screen 151 extending from the right edge to the center, which is opposite to and has the same size as the dragged area.
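The associated-area rule described above (opposite position, same size) can be expressed as a horizontal mirror of the touched rectangle. A minimal sketch, assuming both screens share the same width, rectangles given as (x, y, width, height), and only the horizontal case of the examples above (a vertical drag, as in FIG. 8, would mirror the y coordinate analogously):

```python
def associated_area(touch_rect, screen_width):
    """Mirror the touched rectangle to the opposite side of the screen.

    touch_rect: (x, y, width, height) of the region of the second
    terminal's screen touched by the first terminal. Returns the
    associated area within the first terminal's touch screen:
    same size, horizontally opposite position.
    """
    x, y, width, height = touch_rect
    mirrored_x = screen_width - (x + width)  # same distance from the right edge
    return (mirrored_x, y, width, height)

# The left half of a 640-pixel-wide screen maps to the right half
print(associated_area((0, 0, 320, 480), screen_width=640))
```

So a touch on the left region selects content from the right region of the first terminal, exactly the pairing the example above describes.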

As described above, when the area of the touch screen 151 associated with the touched part of the screen of the second terminal 700 is determined, the controller 180 captures or crops the image displayed in the associated area and transmits it to the second terminal 700 together with a command signal instructing that the captured or cropped image be displayed within the touched part with a visual effect of gradually filling the area.

For example, as shown in FIG. 6A, when the right side of the first terminal 100 is placed on and touches the left area 811 of the screen 810 of the second terminal 700, the controller 180 receives, from the second terminal 700, touch area information including the area and position of the left area 811 touched by the first terminal.

Based on the received touch area information, the controller 180 transmits the image 211 of the content 210 displayed in the right region of the touch screen 151, which is opposite to and has the same size as the left area 811 of the second terminal 700, to the second terminal 700 so that the image 211 is displayed in the left area 811 of the second terminal 700.

The control unit 770 of the second terminal 700 receives the image 211 from the first terminal 100 and displays the received image 211 in the left area 811 touched by the first terminal 100, either immediately or, as shown in FIGS. 6 (b) to 6 (d), with a visual effect in which the received image 211 gradually fills the left area 811.
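The "gradually filling" visual effect of FIGS. 6 (b) to 6 (d) can be approximated by revealing the received image in slices over several animation steps. The following sketch only computes the reveal schedule; the step count and the column-by-column direction are illustrative assumptions, not details taken from the specification:

```python
def fill_steps(image_width, steps):
    """Number of image columns revealed at each animation step, so the
    received image appears to gradually fill the touched area."""
    return [round(image_width * (i + 1) / steps) for i in range(steps)]

# Reveal a 320-pixel-wide image over 4 steps
print(fill_steps(320, 4))
```

At each step the second terminal would draw the first N columns of the received image into the touched area, reaching the full width at the final step.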

Next, as shown in FIG. 7 (a), when the first terminal 100 is dragged from the left area of the screen 810 of the second terminal 700 toward the right, the controller 180 receives, from the second terminal 700, drag-touch area information including the area and position of the area 812 over which the first terminal 100 was dragged.

Based on the received drag-touch area information, the controller 180 then transmits the image 211 of the content 210 displayed in the region of the touch screen 151 that is opposite to and has the same size as the drag-touched area 812 of the second terminal 700, to the second terminal 700 so that the image 211 is displayed in the drag-touched area 812 of the second terminal 700.

The control unit 770 of the second terminal 700 receives the image 211 from the first terminal 100 and displays it in the area 812 over which the first terminal 100 was dragged, either immediately or, as shown in FIGS. 7 (b) to (d), with a visual effect in which the received image 211 gradually fills the drag-touched area 812.

Next, as shown in FIG. 8 (a), when the first terminal 100 is dragged upward from the lower end area of the screen 810 of the second terminal 700, the controller 180 receives, from the second terminal 700, drag-touch area information including the area and position of the area 812 over which the first terminal 100 was dragged.

Based on the received drag-touch area information, the controller 180 then transmits the image 211 of the content 210 displayed in the region of the touch screen 151 that is opposite to and has the same size as the drag-touched area 812 of the second terminal 700, to the second terminal 700 so that the image 211 is displayed in the drag-touched area 812 of the second terminal 700.

[Second Embodiment]

The second embodiment of the present invention relates to a process in which, when the first terminal 100 is placed on and touches a part of the screen of the second terminal 700, or is dragged while touching the screen, the first terminal 100 receives, from the second terminal 700, an image of the content displayed in the touched or dragged area and displays the received image.

Hereinafter, a second embodiment of the present invention will be described in detail with reference to FIGS. 9 to 16.

FIG. 9 is a flowchart illustrating an image transmission process between a first terminal and a second terminal according to the second embodiment of the present invention.

FIGS. 10 to 16 are diagrams illustrating an image transmission process between the first and second terminals according to the second embodiment of the present invention.

Referring to FIGS. 9 to 16, the controller 180 of the first terminal 100 establishes communication, through the wireless communication unit 110, with the second terminal 700 that displays specific content on its screen [S210].

The controller 180 detects whether the first terminal 100 touches a part of the screen of the second terminal 700 while communication with the second terminal 700 is established [S220].

That is, the controller 180 detects whether the first terminal 100 has been placed on and touched a part of the screen of the second terminal 700, and whether the first terminal 100 has then been dragged in a specific direction while touching the screen.

At this time, the first terminal 100 can recognize, by means of the second terminal 700, that it has touched the screen of the second terminal 700.

That is, while communication with the first terminal 100 is established, when the first terminal 100 touches a part of the screen of the display unit 760 on which the content is displayed, the second terminal 700 detects the touch, generates touch area information including the detected touch area and position, and transmits it to the first terminal 100.

When the touch area information is received from the second terminal 700 through the wireless communication unit 110, the controller 180 of the first terminal 100 determines that the first terminal 100 has been placed on and touched a partial area of the screen of the second terminal 700.

Further, while communication with the first terminal 100 is established, when the first terminal 100 touches a part of the screen of the display unit 760 on which the content is displayed and is then dragged, the second terminal 700 detects the drag touch, generates drag-touch area information including the detected touch area and position, and transmits it to the first terminal 100.

When the drag-touch area information is received from the second terminal 700 through the wireless communication unit 110, the controller 180 of the first terminal 100 determines that the first terminal 100 has been dragged while touching the screen of the second terminal 700.

As described above, when the controller 180 determines, based on the touch area information or drag-touch area information received from the second terminal 700, that the first terminal 100 has touched the screen of the second terminal 700 [S230], it receives, from the second terminal 700, the image displayed in the area touched by the first terminal 100 [S240].

At this time, the touch area information or the drag touch area information described above may be received together with the image.

Based on the received touch area information or drag-touch area information, the controller 180 determines the area of the screen of the first terminal 100 associated with the area of the second terminal 700 touched by the first terminal 100, and displays the received image in that associated area [S250].

In this case, when the first terminal 100 is placed on and touches a partial area of the screen of the second terminal 700, the associated area may be the area within the screen of the touch screen 151 that is at the position opposite to the partial area and has the same size as the partial area.

That is, when the touch area information about the partial area is received from the second terminal 700, the controller 180 determines whether the touch area of the touch screen 151 is in a position opposite to the partial area on the screen of the touch screen 151, An area having the same area as the partial area is determined as the associated area.

For example, if a portion of the second terminal 700 that is touched by the first terminal 100 is a left side region of the screen, the controller 180 is opposite to the left side region in the screen of the touch screen 151 , And determines the right side area having the same area as the left side area as the related area.
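The "opposite position, same size" rule described above amounts to mirroring the touched rectangle about the vertical centerline of the screen. A minimal sketch, assuming simple (x, y, width, height) rectangles in screen pixels (the patent does not fix a coordinate convention):

```python
def associated_area(touch, screen_width):
    """Mirror a touched rectangle (x, y, w, h) about the vertical centerline
    of a screen `screen_width` pixels wide, keeping its size unchanged."""
    x, y, w, h = touch
    mirrored_x = screen_width - x - w  # opposite side of the screen, same width
    return (mirrored_x, y, w, h)

# A touch covering the left half of a 720-px-wide screen maps to the right half.
assert associated_area((0, 0, 360, 1280), 720) == (360, 0, 360, 1280)
# A drag from the left edge to the center maps to the right-edge-to-center area.
assert associated_area((0, 200, 360, 400), 720) == (360, 200, 360, 400)
```

A vertical drag (bottom area upward, as in FIG. 12) would mirror about the horizontal centerline by the same construction.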

If the first terminal 100 touches the screen of the second terminal 700 and is then dragged in a specific direction, the associated area within the screen of the touch screen 151 may be an area having the same size as the dragged area, located at the position opposite to the dragged area within the screen of the display unit 151.

That is, when the drag-touch area information on the dragged area is received from the second terminal 700, the controller 180 determines, based on that information, the area of the touch screen 151 that is located opposite to the dragged area and has the same size as the dragged area, as the associated area.

For example, if the area of the second terminal 700 dragged by the first terminal 100 extends from the left side of the screen to its center, the controller 180 determines the area extending from the right side to the center, which has the same size, as the associated area.

As described above, when the area associated with the touched portion of the screen of the second terminal 700 is determined within the screen of the touch screen 151, the controller 180 displays the image received from the second terminal 700 with a visual effect in which it gradually fades into the associated area.

As shown in FIG. 10(a), while the first terminal 100 is in communication with the second terminal 700, the first terminal 100 displays its screen 220 in a semi-transparent form, as shown in FIG. 10(b), to inform the user that a screen capture of the second terminal 700 is possible by using the first terminal 100.

As shown in FIG. 10(c), when the first terminal 100 is placed on and touches a partial area 221 of the screen on which the content 820 of the second terminal 700 is displayed, it receives from the second terminal 700 the image 821 displayed in the partial area 221 together with the touch area information including the area and position of the partial area 221.

Based on the received touch area information, the control unit 180 displays the received image 821 in the area of the screen of the touch screen 151 associated with the partial area 221, with a visual effect in which the image 821 gradually fills the associated area.
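The "gradually fills" effect is not specified further; one common realization is a per-frame alpha ramp applied to the image as it is composited into the associated area. A minimal sketch (the linear ramp is an assumption; real UI toolkits offer equivalent interpolators):

```python
def fade_in_alphas(num_frames):
    """Per-frame alpha values for gradually fading an image into the
    associated area: a linear ramp from fully transparent to fully opaque."""
    if num_frames < 2:
        return [1.0]
    step = 1.0 / (num_frames - 1)
    return [round(i * step, 3) for i in range(num_frames)]

assert fade_in_alphas(5) == [0.0, 0.25, 0.5, 0.75, 1.0]
```

The controller would draw the received image once per frame with the corresponding alpha until the effect completes.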

As shown in FIG. 11(a), when the first terminal 100 is dragged from the left area toward the right area of the screen on which the content 820 of the second terminal 700 is displayed, it receives from the second terminal 700 the image 822 displayed in the drag-touched area, together with the drag-touch area information including the area and position of the dragged region.

Based on the received drag-touch area information, the control unit 180 displays the received image 822 in the area of the screen 220 of the touch screen 151 associated with the drag-touched area, with a visual effect in which the image gradually expands within the associated area, as shown in FIGS. 11(b) to 11(d).

As shown in FIG. 12(a), when the first terminal 100 is dragged upward from the bottom area of the screen on which the content 820 of the second terminal 700 is displayed, it receives from the second terminal 700 the image 823 displayed in the drag-touched area, together with the drag-touch area information including the area and position of the drag-touched region.

Based on the received drag-touch area information, the control unit 180 displays the received image 823 in the area of the screen 220 of the touch screen 151 associated with the drag-touched area, with a visual effect in which the image gradually expands within the associated area, as shown in FIGS. 12(b) to 12(d).

As shown in FIG. 13(a), while communication with the first terminal 100 is established, the second terminal 700 displays on its screen a list 310 including two or more menus associated with the image 824 of the content displayed in the area 221 touched by the first terminal 100.

In one example, the list 310 includes a first menu for sharing the image 824, a second menu for printing the image 824, a third menu for capturing the area in which the image 824 is displayed from the entire image of the content 820, and a fourth menu for cropping the image 824 from the entire image of the content 820.

That is, when the first menu is selected, the second terminal 700 transmits the image 824 to the first terminal 100 to which communication is connected. When the second menu is selected, the second terminal 700 prints the image 824 through an external printer communicably connected to the second terminal 700. When the third menu is selected, the second terminal 700 captures and stores the area 221 in which the image 824 is displayed from the entire image of the content 820, and transmits the stored image 824 to the first terminal 100. When the fourth menu is selected, the second terminal 700 crops and stores the area 221 in which the image 824 is displayed from the entire image of the content 820, and transmits the stored image 824 to the first terminal 100.
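The four menus of list 310 form a simple dispatch over the selected index. A sketch with stub handlers (the action tags and arguments are illustrative only; a real device would transmit over the wireless link, spool to the printer, or write to storage):

```python
def handle_menu_selection(menu, image, full_image=None):
    """Dispatch the four menus of list 310: share, print, capture, crop.
    Handlers are stubs that return an action tag and the data acted on."""
    actions = {
        1: lambda: ("share", image),        # send to the connected terminal
        2: lambda: ("print", image),        # route to an external printer
        3: lambda: ("capture", full_image), # capture the region from the full image
        4: lambda: ("crop", full_image),    # crop the region from the full image
    }
    return actions[menu]()

assert handle_menu_selection(1, "img824")[0] == "share"
assert handle_menu_selection(4, "img824", "full820") == ("crop", "full820")
```

Keeping the handlers in a table makes it straightforward to extend the list with further menus, as the "two or more menus" wording allows.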

As shown in FIG. 13(b), while communication with the second terminal 700 is established, when the first terminal 100 receives the image 824 displayed in the area of the second terminal 700 touched by the first terminal 100, it displays a list 320 of two or more menus associated with the received image 824 on the screen of the touch screen 151.

In one example, the list 320 includes a first menu for sharing the image 824, a second menu for printing the image 824, a third menu for storing the image 824, and a fourth menu for cropping or capturing the image 824.

That is, when the first menu is selected, the first terminal 100 transmits the received image 824 to another external terminal. When the second menu is selected, the first terminal 100 prints the image 824 through an external printer communicably connected to the first terminal 100. When the third menu is selected, the first terminal 100 stores the received image 824 in the memory 160. When the fourth menu is selected and a desired area is designated within the image 824, the first terminal 100 may crop or capture the image displayed in the designated area from the image 824.

As shown in FIG. 14(a), while communication with the second terminal 700 is established, when the first terminal 100 receives from the second terminal 700 the image 824 displayed in the touched area 221 of the screen of the second terminal 700, it displays the received image 824 on the screen of the touch screen 151.

As shown in FIG. 14(b), the first terminal 100 zooms the image 824 in or out in response to a pinch-in/out touch gesture input on the screen.

For example, FIG. 14(b) shows a reduced image 332 corresponding to a pinch-in touch gesture input by the user.
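A pinch gesture is conventionally converted to a zoom factor from the ratio of the current distance between the two touch points to their distance when the gesture began. A minimal sketch (the clamp bounds are illustrative assumptions, not values from the disclosure):

```python
def pinch_scale(start_distance, current_distance, min_scale=0.25, max_scale=4.0):
    """Scale factor for a pinch gesture: the ratio of the current distance
    between the two touch points to the distance at gesture start, clamped
    to a sensible range. A ratio below 1.0 (pinch in) reduces the image."""
    scale = current_distance / start_distance
    return max(min_scale, min(max_scale, scale))

assert pinch_scale(200, 100) == 0.5   # pinch in  -> reduced image, as in FIG. 14(b)
assert pinch_scale(100, 200) == 2.0   # pinch out -> enlarged image
assert pinch_scale(100, 1000) == 4.0  # clamped to the maximum
```

Platform gesture detectors report an equivalent incremental scale factor per frame; clamping prevents runaway zoom from noisy touch input.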

As shown in FIG. 15(a), while communication with the second terminal 700 is established, when the first terminal 100 touches a partial area 221 on the screen of the second terminal 700, it receives from the second terminal 700 the entire image of the content 820 displayed on the screen of the second terminal 700 together with the touch area information of the partial area 221.

Based on the received touch area information, the first terminal 100 captures or crops a first image 824 corresponding to the partial area 221 from the entire image, displays the first image 824 on the screen, and also displays a minimap 340 that identifies the area occupied by the first image 824 within the entire image.

That is, the minimap 340 displays the entire image of the content 820, distinguishing the portion 824A corresponding to the first image 824 from the portion 824B that does not correspond to it.

The user can then drag or touch on the minimap 340 to capture or crop a desired portion of the entire image of the content 820 and display it on the screen.

That is, as shown in FIG. 15(a), when a drag touch is input on the minimap 340 and the portion 824A corresponding to the first image 824 is moved to another portion 825A in which a second image 825 is displayed, the first terminal 100 captures or crops the second image 825 from the entire image and displays it on the screen, as shown in FIG. 15(b), while distinguishing on the minimap 340 the portion 825A corresponding to the second image 825 from the portion 825B that does not correspond to it.
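Moving the highlighted portion on the minimap 340 requires mapping a rectangle dragged in minimap coordinates back to the corresponding crop rectangle in the full content image. A minimal sketch, assuming the minimap is a uniformly scaled thumbnail of the entire image (the patent does not specify the scaling):

```python
def minimap_to_full(rect, minimap_size, full_size):
    """Map a viewport rectangle (x, y, w, h) dragged on the minimap to the
    corresponding crop rectangle in the full content image."""
    mx, my = minimap_size
    fx, fy = full_size
    x, y, w, h = rect
    sx, sy = fx / mx, fy / my  # per-axis scale from minimap to full image
    return (int(x * sx), int(y * sy), int(w * sx), int(h * sy))

# Dragging the highlighted portion to the right half of a 100x100 minimap
# selects the right half of a 1000x1000 full image.
assert minimap_to_full((50, 0, 50, 100), (100, 100), (1000, 1000)) == (500, 0, 500, 1000)
```

The resulting rectangle is what the terminal would capture or crop from the entire image to obtain the second image 825.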

Next, FIG. 16 shows a case in which the content displayed on the screen of the second terminal 700 is a web page 830, part of whose content 831 is not displayed on the screen.

As shown in FIGS. 16(a) and 16(b), when the first terminal 100 touches a partial area (the upper area) of the screen of the web page 830, is dragged to the bottom area of the screen of the web page 830, and the touch is then held for a predetermined time, the second terminal 700 scrolls the web page 830 so that the content 831 below the bottom of the screen is displayed on the screen, as shown in FIG. 16(c).

The second terminal 700 then captures the entire image of the web page 830, including the content 831 brought onto the screen, and transmits the captured image to the first terminal 100.

FIG. 16(d) shows the first terminal 100 displaying the entire image of the web page 830 received from the second terminal 700.
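Capturing a page taller than the viewport is typically done by scrolling in viewport-sized steps and stitching the captured strips together. A toy sketch of that scroll-and-stitch loop (the `grab` callback standing in for the platform's screen-capture call is a hypothetical name):

```python
def capture_full_page(page_height, viewport_height, grab):
    """Scroll-and-stitch capture of a page taller than the viewport:
    `grab(offset)` returns the strip of rows visible at scroll offset
    `offset`. Consecutive strips are concatenated into one full image."""
    strips, offset = [], 0
    while offset < page_height:
        height = min(viewport_height, page_height - offset)
        strips.append(grab(offset)[:height])  # trim the last, partial strip
        offset += height
    return [row for strip in strips for row in strip]

# Toy page: each "row" is just its index; grabbing returns the visible rows.
page = list(range(25))
grab = lambda off: page[off:off + 10]
assert capture_full_page(25, 10, grab) == page
```

With fixed-position page elements a real implementation must also deduplicate overlapping rows, which this sketch omits.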

[Third Embodiment]

In the third embodiment of the present invention, when the second terminal 700 is placed on and touches, or is drag-touched across, a part of the screen on which the content of the first terminal 100 is displayed, an image of the content displayed in the touched area is transmitted to the second terminal 700 and displayed there.

Hereinafter, the third embodiment of the present invention will be described in detail with reference to FIG. 17.

FIG. 17 is a flowchart illustrating a third embodiment of the image transmission process between the first and second terminals according to the present invention.

Referring to FIG. 17, the controller 180 of the first terminal 100 establishes communication with the second terminal 700 through the wireless communication unit 110 [S310] and displays the content selected or designated by the user [S320].

At this time, the content includes any data that can be executed or displayed on the first terminal 100; for example, broadcast data, moving pictures, music, pictures, games, documents, maps, navigation, menu functions, an idle screen, a home screen, and the like.

While communication with the second terminal 700 is established, the controller 180 detects whether the second terminal 700 touches a part of the screen on which the content of the first terminal 100 is displayed [S330].

That is, the controller 180 detects whether the second terminal 700 is placed on and touches a part of the screen of the first terminal 100, or whether the second terminal 700 touches a part of the screen of the first terminal 100 and is then dragged in a specific direction.
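Distinguishing the two cases — a stationary placed touch versus a drag — can be done from the displacement of the touch points over time. A minimal sketch, assuming a sampled sequence of (x, y) contact points (the 24-pixel threshold is an illustrative value, not one from the disclosure):

```python
def classify_touch(path, drag_threshold=24):
    """Classify a sequence of (x, y) touch points as a stationary 'touch'
    or a 'drag', using the largest displacement from the first point."""
    x0, y0 = path[0]
    moved = max(abs(x - x0) + abs(y - y0) for x, y in path)
    return "drag" if moved >= drag_threshold else "touch"

assert classify_touch([(100, 100), (102, 101)]) == "touch"
assert classify_touch([(100, 100), (160, 100), (220, 100)]) == "drag"
```

Platform touch frameworks expose an equivalent "touch slop" threshold for exactly this distinction.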

If it is detected that the second terminal 700 touches the screen of the first terminal 100 [S340], the controller 180 captures or crops an image of the content displayed in the touched area and transmits the captured or cropped image to the second terminal 700 so that it is displayed in the area of the second terminal 700 associated with the touched area [S350].

At this time, if the second terminal 700 is placed on and touches a partial area of the screen of the first terminal 100, the controller 180 transmits the touch area information, including the area and position of the touched region, to the second terminal 700 together with the image. Based on the received touch area information, the second terminal 700 displays the received image in the area of its screen associated with the partial area.

At this time, the associated area of the second terminal 700 with respect to the partial area of the screen of the first terminal 100 may be an area having the same size as the partial area, located at the position opposite to the partial area within the screen of the second terminal 700.

For example, if the portion of the first terminal 100 touched by the second terminal 700 is the left-side region of the screen, the second terminal 700 determines the right-side region of its screen, which is opposite to the left-side region and has the same size, as the associated area.

When the second terminal 700 touches the screen of the first terminal 100 and is then dragged in a specific direction, the controller 180 transmits the drag-touch area information of the dragged area to the second terminal 700 together with the image, and the second terminal 700 displays the received image in the area of its screen associated with the dragged area, based on the received drag-touch area information.

At this time, the associated area of the second terminal 700 with respect to the dragged area of the screen of the first terminal 100 may be an area having the same size as the dragged area, located at the position opposite to the dragged area within the screen of the second terminal 700.

For example, if the area of the first terminal 100 dragged by the second terminal 700 extends from the left side of the screen to its center, the second terminal 700 determines the area extending from the right side to the center, which has the same size, as the associated area.

As described with reference to FIGS. 6 to 8, when the image and its touch area information or drag-touch area information are received from the first terminal 100 [S350], the second terminal 700 gradually displays the image, based on the touch area information or drag-touch area information, in the area associated with the area of the screen of the first terminal 100 in which the image was displayed.

As shown in FIG. 13(b), when the image is received from the first terminal 100, the second terminal 700 displays on its screen a list including two or more menus associated with the received image.

For example, the list may include a first menu for sharing the image, a second menu for printing the image, a third menu for storing the image, and a fourth menu for cropping or capturing the image.

That is, when the first menu is selected, the second terminal 700 transmits the received image to another external terminal. When the second menu is selected, the second terminal 700 prints the image through an external printer communicably connected to the second terminal 700. When the third menu is selected, the second terminal 700 stores the received image in the memory 740. When the fourth menu is selected and a desired area is designated within the image, the second terminal 700 may crop or capture the image displayed in the designated area from the image.

As shown in FIG. 14, when the image is received from the first terminal 100, the second terminal 700 displays the received image on its own screen and zooms the image in or out in accordance with a pinch-in/out touch gesture input on the screen.

As shown in FIG. 15, when the second terminal 700 touches a part of the screen of the first terminal 100 while communication with the first terminal 100 is established, it receives the entire image of the content displayed on the screen of the first terminal 100 together with the touch area information of the partial area, captures or crops a first image corresponding to the partial area from the entire image based on the received touch area information, displays it on the screen, and displays a minimap identifying the area occupied by the first image within the entire image.

When a drag touch is input on the minimap and the portion corresponding to the first image is moved to another portion in which a second image is displayed, the second terminal 700 captures or crops the second image from the entire image, displays it on the screen, and distinguishes on the minimap the portion corresponding to the second image from the portion that does not correspond to it.

It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

The present invention described above can be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the control unit 180 of the terminal.

Accordingly, the above description should not be construed as limiting in any respect and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

The above-described portable terminal and its control method are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

100: mobile terminal 110: wireless communication unit
111: broadcast receiver 112: mobile communication module
113: wireless Internet module 114: short-range communication module
115: Position information module 120: A / V input section
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output section
151: Display unit 152: Acoustic output module
153: Alarm module 154: Haptic module
155: projector module 160: memory
170: interface unit 180: control unit
181: Multimedia module 190: Battery

Claims (22)

A display unit displaying specific content on a screen;
A wireless communication unit for communicating with at least one external terminal; and
A controller which, when the portable terminal touches a part of the screen of the external terminal while communication with the external terminal is connected, transmits, through the wireless communication unit, an image displayed in an area of the screen of the display unit associated with the partial area, so that the image is displayed in the partial area of the external terminal.
The mobile terminal according to claim 1,
Wherein the control unit captures the image displayed in the associated area and transmits the captured image so that the captured image gradually appears in the partial area of the external terminal.
The mobile terminal according to claim 1,
Wherein the controller determines that the portable terminal has touched the partial area when touch area information including the area and position of the partial area is received from the external terminal through the wireless communication unit.
The mobile terminal according to claim 3,
Wherein the control unit determines, based on the touch area information, the area having the same size as the partial area, at the position opposite to the partial area on the screen of the display unit, as the associated area.
The mobile terminal according to claim 1,
Wherein, when the mobile terminal is drag-touched on the screen of the external terminal, the controller transmits, through the wireless communication unit, an image displayed in an area of the screen of the display unit associated with the drag-touched area, so that the image is displayed in the drag-touched area of the screen of the external terminal.
The mobile terminal according to claim 5,
Wherein the controller determines that the mobile terminal has been drag-touched on the screen of the external terminal when drag-touch area information including the area and position of the drag-touched region is received from the external terminal through the wireless communication unit.
The mobile terminal according to claim 6,
Wherein the control unit determines, based on the drag-touch area information, the area having the same size as the drag-touched area, at the position opposite to the drag-touched area on the screen of the display unit, as the associated area.
Connecting the portable terminal with at least one external terminal;
Displaying specific content on a screen of the portable terminal;
Detecting whether the portable terminal touches a part of the screen of the external terminal while communication with the external terminal is established; and
Transmitting an image displayed in an area of the screen of the portable terminal associated with the partial area, when the portable terminal is detected as touching the partial area, so that the image is displayed in the partial area of the external terminal.
A touch screen;
A wireless communication unit for communicating with at least one external terminal displaying specific content on a screen; and
A controller which, when the portable terminal touches a part of the screen of the external terminal while communication with the external terminal is connected, receives an image of the content displayed in the partial area from the external terminal through the wireless communication unit and displays it in an area of the screen of the touch screen associated with the partial area.
The mobile terminal according to claim 9,
Wherein the control unit displays the received image with a visual effect in which it gradually fades into the associated area.
The mobile terminal according to claim 9,
Wherein the control unit further receives touch area information including the area and position of the partial area from the external terminal and, based on the received touch area information, determines the area having the same size as the partial area, at the position opposite to the partial area on the screen of the touch screen, as the associated area.
The mobile terminal according to claim 9,
Wherein, when the mobile terminal is drag-touched on the screen of the external terminal, the control unit receives the image displayed in the drag-touched area from the external terminal through the wireless communication unit and displays it in the area of the screen of the touch screen associated with the drag-touched area.
The mobile terminal according to claim 12,
Wherein the control unit further receives drag-touch area information including the area and position of the drag-touched region from the external terminal and, based on the received drag-touch area information, determines the area having the same size as the drag-touched area, at the position opposite to the drag-touched area on the screen of the touch screen, as the associated area.
The mobile terminal according to claim 9,
Wherein the control unit receives from the external terminal the entire image of the specific content displayed on the screen of the external terminal, crops or captures the portion corresponding to the image displayed in the partial area from the received entire image, and displays it in the associated area.
The mobile terminal according to claim 14,
Wherein, when the associated area is drag-touched while the image is displayed in the associated area, the control unit crops or captures the portion of the entire image extending from the associated area to the drag-touched area.
The mobile terminal according to claim 9,
Wherein, when a touch gesture corresponding to enlargement or reduction is input on the image while the image is displayed in the associated area, the control unit enlarges or reduces the image according to the touch gesture.
The mobile terminal according to claim 9,
Wherein the specific content is a web page of which some content is not displayed on the screen of the external terminal, and
Wherein, when the portable terminal is drag-touched from a part of the web page to the bottom part of the web page, the controller receives, through the wireless communication unit, the entire image of the web page including the content from the external terminal and displays it.
Connecting communication between the portable terminal and at least one external terminal displaying specific content on its screen;
Detecting whether the portable terminal touches a part of the screen of the external terminal while communication with the external terminal is established;
Receiving an image of the content displayed in the partial area from the external terminal when the portable terminal is detected as touching the partial area; and
Displaying the received image in an area of the screen of the portable terminal associated with the partial area.
A touch screen displaying specific content on a screen;
A wireless communication unit for communicating with at least one external terminal; and
A controller which, when the external terminal touches a part of the screen of the touch screen while communication with the external terminal is connected, transmits, through the wireless communication unit, an image of the content displayed in the touched partial area, so that the image is displayed in an area of the external terminal associated with the partial area.
The mobile terminal according to claim 19,
Wherein the control unit captures the image displayed in the partial area and transmits the captured image so that the captured image is displayed with a gradually appearing visual effect on the screen of the external terminal.
The mobile terminal according to claim 19,
Wherein the control unit transmits the image of the content displayed in the drag-touched area to the external terminal when the external terminal is drag-touched on the screen of the touch screen.
Connecting the portable terminal with at least one external terminal;
Displaying specific content on a screen of a touch screen of the portable terminal;
Detecting whether the external terminal touches a part of the screen of the touch screen while communication with the external terminal is established; and
Transmitting an image of the content displayed in the touched partial area, when the external terminal is detected as touching the partial area, so that the image is displayed in an area of the screen of the external terminal associated with the partial area.
KR20130069425A 2013-06-18 2013-06-18 Mobile terminal and method for controlling the same KR20140146759A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20130069425A KR20140146759A (en) 2013-06-18 2013-06-18 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130069425A KR20140146759A (en) 2013-06-18 2013-06-18 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20140146759A true KR20140146759A (en) 2014-12-29

Family

ID=52675826

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130069425A KR20140146759A (en) 2013-06-18 2013-06-18 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20140146759A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160102834A (en) * 2015-02-23 2016-08-31 엘지전자 주식회사 Terminal and operating method thereof
JP2019061244A (en) * 2018-10-19 2019-04-18 シャープ株式会社 Main display device, display system, terminal display device, and display method
WO2020218852A1 (en) * 2019-04-23 2020-10-29 구윤택 Portable terminal and method for providing user interface of portable terminal using compatibility
KR20200123984A (en) * 2019-04-23 2020-11-02 구윤택 Mobile communication device and method for providing user interface using mutual compatibility thereof
CN113728352A (en) * 2019-04-23 2021-11-30 丘伦宅 Portable terminal and user interface providing method of portable terminal using phase matching degree

Similar Documents

Publication Publication Date Title
KR101608770B1 (en) Mobile terminal and method for controlling the same
KR101527037B1 (en) Mobile terminal and method for controlling the same
KR101832959B1 (en) Mobile device and control method for the same
KR101701852B1 (en) Mobile terminal and method for controlling the same
KR101608761B1 (en) Mobile terminal and method for controlling the same
KR101701839B1 (en) Mobile terminal and method for controlling the same
KR20150019792A (en) Mobile terminal
KR102065410B1 (en) Mobile terminal and controlling method thereof
KR101695812B1 (en) Mobile terminal and method for controlling the same
KR101578005B1 (en) Mobile terminal and method for inputting user action using camera in mobile terminal
KR101692729B1 (en) Mobile terminal, and method for producing and obtaining message about outside object
KR101752417B1 (en) Mobile terminal and method for controlling device
KR101674213B1 (en) Method for managing picked-up image in mobile terminal and mobile terminal using the same
KR20120122314A (en) Mobile terminal and control method for the same
KR20150012945A (en) Mobile terminal and method for controlling the same
KR20140146759A (en) Mobile terminal and method for controlling the same
KR101549005B1 (en) Mobile terminal and method for controlling the same
KR101622216B1 (en) Mobile terminal and method for controlling input thereof
KR101650579B1 (en) Method for providing messenger using augmented reality in mobile terminal
KR101604698B1 (en) Mobile terminal and method for controlling the same
KR101638913B1 (en) Mobile terminal and method for controlling the same
KR101823479B1 (en) Mobile terminal and method for controlling the same
KR101531507B1 (en) Mobile terminal and method for controlling display thereof
KR101649637B1 (en) Mobile terminal and method for controlling the same
KR20150050052A (en) Mobile terminal and method for controlling of the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination