KR20170011798A - Mobile terminal and operating method thereof - Google Patents

Mobile terminal and operating method thereof Download PDF

Info

Publication number
KR20170011798A
Authority
KR
South Korea
Prior art keywords
mobile terminal
screen
mode
image
divided
Prior art date
Application number
KR1020150105086A
Other languages
Korean (ko)
Inventor
허수영
송영훈
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150105086A priority Critical patent/KR20170011798A/en
Priority to PCT/KR2015/008542 priority patent/WO2017018573A1/en
Publication of KR20170011798A publication Critical patent/KR20170011798A/en

Links

Images

Classifications

    • H04M1/72519
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Abstract

A method of operating a mobile terminal having a display unit according to an exemplary embodiment of the present invention includes receiving a screen division input for dividing the screen of the display unit, entering a first operation mode when the screen is off, and displaying items corresponding to the first operation mode on each of the screens divided according to the screen division input.

Description

MOBILE TERMINAL AND OPERATING METHOD THEREOF

The present invention relates to a mobile terminal and a method of operating the same.

Terminals can be divided into mobile (portable) terminals and stationary terminals depending on whether they can be moved. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

The functions of mobile terminals are diversifying. Examples include data and voice communication, still-image and video capture through a camera, voice recording, music file playback through a speaker system, and the output of images or video on a display unit. Some terminals additionally support electronic gaming or multimedia player functions. In particular, modern mobile terminals can receive broadcast and multicast signals that provide visual content such as video or television programs.

As these functions diversify, such a terminal is implemented in the form of a multimedia player with complex functions such as capturing still images or video, playing music or video files, playing games, and receiving broadcasts.

An object of the present invention is to provide a mobile terminal and a method of operating the same, which can promptly display an item corresponding to a specific operation mode by receiving a screen division input.

A method of operating a mobile terminal having a display unit according to an exemplary embodiment of the present invention includes receiving a screen division input for dividing the screen of the display unit; entering a first operation mode when the screen is off; and displaying items corresponding to the first operation mode on each of the screens divided according to the screen division input.

According to another aspect of the present invention, a mobile terminal includes a display unit and a control unit that receives a screen division input for dividing the screen of the display unit, enters a first operation mode when the screen is off, and controls the display unit to display items corresponding to the first operation mode on each of the screens divided according to the screen division input.

The mobile terminal may further include a front camera and a rear camera. The first operation mode may be a mode for executing the front camera and the rear camera, and the control unit may control the display unit to display a preview image obtained through the front camera on the first divided screen and a preview image obtained through the rear camera on the second divided screen.

The mobile terminal may further include a first rear camera and a second rear camera. The first operation mode may be a mode for executing the first rear camera and the second rear camera, and the control unit may control the display unit to display a preview image obtained through the first rear camera on the first divided screen and a preview image obtained through the second rear camera on the second divided screen.
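The screen-to-camera pairing in the two paragraphs above (front/rear, or first/second rear camera) amounts to assigning each divided screen its own preview source. A minimal sketch, with hypothetical names not taken from the patent:

```python
def assign_previews(divided_screens, cameras):
    """Pair each divided screen with the preview of one camera, in order.

    Illustrative model only; it works for a (front, rear) configuration
    as well as a (first rear, second rear) configuration.
    """
    return {screen: f"preview:{camera}"
            for screen, camera in zip(divided_screens, cameras)}
```

For example, `assign_previews(["first_screen", "second_screen"], ["front", "rear"])` maps the first divided screen to the front-camera preview and the second to the rear-camera preview.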

The first operation mode may be a mode for providing information on recently executed applications, and the control unit may control the display unit to display an execution screen of each recently executed application on each of the divided screens.

The control unit may enter a second operation mode when the screen is on, and may control the display unit to display items corresponding to the second operation mode on each of the screens divided according to the screen division input.

The second mode of operation may be a mode for displaying information associated with information displayed on the screen.
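Taken together, the mode selection described above (a first operation mode when the screen is off, a second when it is on) can be sketched as a simple function. This is an illustrative model only; the function and mode names are hypothetical, not from the patent:

```python
def handle_division_input(screen_on: bool) -> str:
    """Return the operation mode entered when a screen-division input arrives.

    Illustrative model of the behavior described in the claims; the mode
    names are placeholders.
    """
    if not screen_on:
        # Screen off: enter the first operation mode (e.g., dual-camera
        # preview or a recent-applications view on the divided screens).
        return "first_operation_mode"
    # Screen on: enter the second operation mode (display information
    # associated with what is currently shown on the screen).
    return "second_operation_mode"
```

The point of the sketch is that the same gesture is overloaded by screen state, so no extra menu navigation is needed to reach either mode.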

When an execution window of a first application is displayed on the screen before the division, the control unit may control the display unit to display the execution window of the first application and an execution window of a second application associated with the first application on each of the divided screens.

According to at least one embodiment of the present invention, the user can perform multitasking through a dual screen simply by applying a division input on the screen.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
FIG. 2 is a ladder diagram for explaining an operation method of a mobile terminal according to an embodiment of the present invention.
FIGS. 3A and 3B are views illustrating a process of displaying, on the preview screen, information about the parties corresponding to the retrieved mobile terminals according to an embodiment of the present invention.
FIGS. 4A to 4D are diagrams illustrating a process of sharing an image being captured by each mobile terminal according to an embodiment of the present invention.
FIGS. 5A and 5B are diagrams illustrating a process of displaying an image being shot by the other party on the entire area of the preview screen according to an embodiment of the present invention.
FIGS. 6A to 6D are views for explaining a process of collecting and displaying, on a preview screen, images being shot by each mobile terminal according to an embodiment of the present invention.
FIGS. 7A and 7B illustrate a process of automatically classifying an image being shot by the other party in response to a request to select a part of a subject included in the image displayed on the preview screen, according to an embodiment of the present invention.
FIGS. 8A and 8B are views for explaining an embodiment in which an image taken by the other party is recommended, according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating an example in which, when the composition of a first image taken by the user is similar to the composition of a second image taken by the other party, the first image and the second image are displayed together, according to an embodiment of the present invention.
FIGS. 10A and 10B are views for explaining a process of stopping the sharing of a captured image with a counterpart terminal according to an embodiment of the present invention.
FIGS. 11A to 11C are diagrams illustrating an embodiment that additionally provides various modes in the photographing mode, according to an embodiment of the present invention.
FIG. 12 is a flowchart illustrating an operation method of a mobile terminal according to another embodiment of the present invention.
FIG. 13 is a diagram illustrating a process of receiving a screen division input in a screen-off state according to an embodiment of the present invention.
FIGS. 14A and 14B are diagrams for explaining an example in which a divided-screen camera mode is executed according to the screen division input.
FIG. 15 illustrates an embodiment of displaying execution screens of recently executed applications according to the screen division input.
FIGS. 16A and 16B illustrate functions that can be performed when a screen division input is received while the screen of the display unit is turned on.
FIGS. 17A to 17C illustrate functions that can be performed when a screen division input is received while the screen of the display unit is turned on.
FIGS. 18A and 18B are diagrams for explaining an embodiment of performing various functions according to an input received on the divided screen.
FIGS. 19A and 19B are diagrams for explaining another embodiment of performing various functions according to an input received on the divided screen.
FIGS. 20A and 20B are diagrams for explaining another embodiment of performing various functions according to an input received on the divided screen.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which like reference numerals designate identical or similar elements and redundant descriptions thereof are omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art is omitted when it is determined that it may obscure the gist of those embodiments. The accompanying drawings are intended only to aid understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by the drawings and should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are used to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and digital signage, except where a configuration is applicable only to mobile terminals.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, and a power supply unit 190. The components shown in FIG. 1 are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include a camera 121 or an image input unit for inputting a video signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. The voice data or image data collected by the input unit 120 may be analyzed and processed according to a user's control command.

The sensing unit 140 may include at least one sensor for sensing at least one of information in the mobile terminal, surrounding environment information, and user information. For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (e.g., the microphone 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor). Meanwhile, the mobile terminal disclosed in this specification may combine and utilize information sensed by at least two of these sensors.

The output unit 150 may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154 to generate output related to sight, hearing, or touch. The display unit 151 may form a mutual layer structure with a touch sensor, or may be formed integrally with it, to realize a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I/O port, a video I/O port, and an earphone port. When an external device is connected to the interface unit 160, the mobile terminal 100 may perform appropriate control related to the connected external device.

In addition, the memory 170 stores data supporting various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs or applications running on the mobile terminal 100, data for operation of the mobile terminal 100, and commands. At least some of these applications may be downloaded from an external server via wireless communication. Also, at least a part of these application programs may exist on the mobile terminal 100 from the time of shipment for the basic functions (e.g., telephone call receiving function, message receiving function, and calling function) of the mobile terminal 100. Meanwhile, the application program may be stored in the memory 170, installed on the mobile terminal 100, and may be operated by the control unit 180 to perform the operation (or function) of the mobile terminal.

In addition to the operations related to the application program, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may process or process signals, data, information, and the like input or output through the above-mentioned components, or may drive an application program stored in the memory 170 to provide or process appropriate information or functions to the user.

In addition, the controller 180 may control at least some of the components illustrated in FIG. 1 in order to drive an application program stored in the memory 170. In addition, the controller 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other for driving the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

At least some of these components may operate in cooperation with one another to implement the operation, control, or control method of a mobile terminal according to the various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.

Hereinafter, the components listed above will be described in more detail with reference to FIG. 1 before explaining various embodiments implemented through the mobile terminal 100 as described above.

First, referring to the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technology standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A)).

The wireless signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A). The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

In that the wireless Internet module 113 performs wireless Internet access through a mobile communication network when that access is provided by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A, the wireless Internet module 113 may be understood as a kind of mobile communication module 112.

The short-range communication module 114 is for short-range communication, and may support it using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies. Through short-range wireless area networks, the short-range communication module 114 may support wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network on which another mobile terminal 100 (or an external server) is located. The short-range wireless area network may be a short-range wireless personal area network.

Here, the other mobile terminal 100 may be a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with the mobile terminal 100 according to the present invention. The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. If the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the control unit 180 may transmit at least part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received at the mobile terminal 100, the user can answer the call via the wearable device, and when a message is received at the mobile terminal 100, the user can check the message via the wearable device.

The location information module 115 is a module for obtaining the position (or current position) of the mobile terminal; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire its position using signals transmitted from GPS satellites. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire its position based on information from a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. If necessary, the location information module 115 may additionally or alternatively perform a function of another module of the wireless communication unit 110 to obtain data on the location of the mobile terminal. The location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains that location.

Next, the input unit 120 is for inputting image information (or signals), audio information (or signals), data, or information input from a user. For the input of image information, the mobile terminal 100 may include one or more cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by its image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. A plurality of cameras 121 provided in the mobile terminal 100 may be arranged in a matrix structure, and through cameras 121 in such a matrix structure, a plurality of pieces of image information with various angles or focal points may be input to the mobile terminal 100. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire the left and right images for realizing a stereoscopic image.

The microphone 122 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 100. Meanwhile, the microphone 122 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include a mechanical input means (e.g., a button located on the rear or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like) and a touch-type input means. For example, the touch-type input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or visual key can be displayed on the touch screen in various forms, for example as a graphic, text, an icon, a video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a corresponding sensing signal. The control unit 180 may control the driving or operation of the mobile terminal 100 or may perform data processing, function or operation related to the application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects the presence of an object approaching a predetermined detection surface, or the presence of an object in the vicinity of the detection surface, without mechanical contact by using electromagnetic force or infrared rays. The proximity sensor 141 may be disposed in the inner area of the mobile terminal or in proximity to the touch screen, which is covered by the touch screen.

Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the proximity sensor 141 can be configured to detect the proximity of a conductive object by the change in the electric field as the object approaches. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

On the other hand, for convenience of explanation, the act of recognizing that an object is positioned over the touch screen in proximity without touching it is referred to as a "proximity touch," while the act of an object actually touching the touch screen is called a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object corresponds vertically to the touch screen. The proximity sensor 141 can detect a proximity touch and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, and the like). Meanwhile, the control unit 180 processes data (or information) corresponding to the proximity touch operation and proximity touch pattern sensed through the proximity sensor 141, and may further output visual information corresponding to the processed data on the touch screen. Furthermore, the control unit 180 can control the mobile terminal 100 so that different operations or data (or information) are processed depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
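The proximity-touch/contact-touch distinction above can be modeled as a threshold on the sensed distance between the object and the touch screen. The function name and the 30 mm proximity range below are illustrative assumptions, not values from the patent:

```python
def classify_touch(distance_mm: float,
                   proximity_range_mm: float = 30.0) -> str:
    """Classify a sensed object by its distance from the touch screen.

    Illustrative model: zero distance means the object actually touches
    the screen (contact touch); anything within the proximity range but
    not touching is a proximity touch; beyond that, no touch is reported.
    """
    if distance_mm <= 0.0:
        return "contact_touch"
    if distance_mm <= proximity_range_mm:
        return "proximity_touch"
    return "no_touch"
```

A controller following the paragraph above would then dispatch different operations depending on which of the two touch types this classification returns.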

The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor, such as a finger, a touch pen, a stylus pen, or a pointer.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 180, so the control unit 180 can know which area of the display unit 151 has been touched. Here, the touch controller may be a component separate from the control unit 180, or may be the control unit 180 itself.

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches on the touch screen, such as a short touch (tap), a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
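A controller distinguishing some of the touch types listed above might, for instance, look at the finger count, the movement, and the duration of the touch. The thresholds and names below are arbitrary illustrations, not values from the patent:

```python
def classify_gesture(duration_ms: int, moved_px: int, fingers: int = 1) -> str:
    """Roughly classify a completed touch event.

    Illustrative sketch only: real touch frameworks use tuned thresholds
    and track velocity for flicks and swipes, which are omitted here.
    """
    if fingers >= 2:
        return "multi_touch"       # two or more simultaneous contacts
    if moved_px > 20:
        return "drag_touch"        # noticeable movement while touching
    if duration_ms >= 500:
        return "long_touch"        # held in place past the long-press time
    return "short_touch"           # brief, stationary tap
```

The same measurements extend naturally to the other listed types, e.g. comparing inter-finger distance over time for pinch-in versus pinch-out.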

The ultrasonic sensor can recognize position information of a sensing object using ultrasonic waves. Meanwhile, the control unit 180 can calculate the position of a wave-generating source from information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave-generating source can be calculated by taking the arrival time of the light as a reference signal and using the time difference between it and the arrival time of the ultrasonic wave.
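Because light reaches the optical sensor almost instantly, its arrival time approximates the emission time, and the distance to the wave source follows from the ultrasonic delay alone. A worked sketch (the speed of sound is taken as roughly 343 m/s in air at room temperature; the names are illustrative, not from the patent):

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at ~20 °C

def source_distance_m(t_light_s: float, t_ultrasound_s: float) -> float:
    """Distance to the wave source from light/ultrasound arrival times.

    The light arrival time serves as the reference (emission) time, since
    light's travel time is negligible at these ranges; the ultrasonic
    wave's extra delay multiplied by the speed of sound gives the distance.
    """
    return SPEED_OF_SOUND_M_PER_S * (t_ultrasound_s - t_light_s)
```

For example, an ultrasonic wave arriving 10 ms after the light corresponds to a source about 3.43 m away; combining such distances from several ultrasonic sensors yields the source's position.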

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of a sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns, and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change in light, and the position information of the sensing object can be obtained through this calculation.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

Also, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 can generate various tactile effects, such as an arrangement of pins moving vertically against the skin surface in contact, a jetting or suction force of air through an injection port or a suction port, a graze against the skin surface, and an effect of reproducing a cold sensation using a heat-absorbing or heat-emitting element.

The haptic module 153 can not only transmit a tactile effect through direct contact, but can also be implemented so that a user can feel a tactile effect through the muscles of a finger or an arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output from the light output unit 154 is implemented as the mobile terminal emits light of a single color or of a plurality of colors toward the front or rear surface. The signal output may be terminated when the mobile terminal detects that the user has checked the event.

The interface unit 160 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, receives power and delivers it to each component in the mobile terminal 100, or transmits data in the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip that stores various kinds of information for authenticating the usage authority of the mobile terminal 100, and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 160.

The interface unit 160 may serve as a path through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the control unit 180 and may temporarily store input/output data (e.g., a phone book, messages, still images, moving pictures, etc.). The memory 170 may store data on vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 170 over the Internet.

Meanwhile, as described above, the control unit 180 controls operations related to application programs and the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the control unit 180 can set or release a lock state that restricts input of a user's control commands to applications.

In addition, the control unit 180 performs control and processing related to voice calls, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the control unit 180 may control any one of, or a combination of, the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the control unit 180 and supplies the power necessary for the operation of each component. The power supply unit 190 includes a battery; the battery may be an internal battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging or the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port. In this case, the power supply unit 190 can receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Next, a communication system that can be implemented through the mobile terminal 100 according to the present invention will be described.

First, the communication system may use different wireless interfaces and/or physical layers. For example, wireless interfaces usable by the communication system may include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE) and Long Term Evolution-Advanced (LTE-A)), and Global System for Mobile communications (GSM).

Hereinafter, for convenience of description, the description will be limited to CDMA. However, it is apparent that the present invention is applicable to all communication systems, including an Orthogonal Frequency Division Multiplexing (OFDM) wireless communication system as well as a CDMA wireless communication system.

A CDMA wireless communication system includes at least one terminal 100, at least one base station (BS) (also referred to as a Node B or an Evolved Node B), at least one base station controller (BSC), and a mobile switching center (MSC). The MSC is configured to be coupled to a Public Switched Telephone Network (PSTN) and to the BSCs. The BSCs may be paired with the BSs via backhaul lines. A backhaul line may be provided according to at least one of E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Thus, a plurality of BSCs may be included in a CDMA wireless communication system.

Each of the plurality of BSs may include at least one sector, and each sector may include an omnidirectional antenna or an antenna pointing in a particular direction radially away from the BS. Each sector may also include two or more antennas of various types. Each BS may be configured to support a plurality of frequency assignments, and each of the plurality of frequency assignments may have a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).

The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. A BS may be referred to as a base transceiver subsystem (BTS). In this case, a combination of one BSC and at least one BS may be referred to as a "base station". The base station may also be referred to as a "cell site". Alternatively, each of the plurality of sectors for a particular BS may be referred to as a plurality of cell sites.

A broadcast transmission unit (BT) transmits a broadcast signal to terminals 100 operating in the system. The broadcast receiving module 111 shown in FIG. 1 is provided in the terminal 100 to receive a broadcast signal transmitted by the BT.

In addition, a Global Positioning System (GPS) may be associated with the CDMA wireless communication system to identify the location of the mobile terminal 100. The satellites aid in locating the mobile terminal 100. Useful location information may be obtained with two or more satellites. Here, the position of the mobile terminal 100 can be tracked using not only GPS tracking technology but also any technique capable of tracking its location. In addition, at least one of the GPS satellites may alternatively or additionally be responsible for satellite DMB transmission.

The location information module 115 included in the mobile terminal is for detecting, computing, or identifying the location of the mobile terminal, and may include a Global Position System (GPS) module and a WiFi (Wireless Fidelity) module. Optionally, the location information module 115 may perform any of the other functions of the wireless communication unit 110 to obtain data relating to the location of the mobile terminal, in addition or alternatively.

The GPS module 115 calculates distance information from three or more satellites and accurate time information, and then applies triangulation to the calculated information to accurately calculate three-dimensional current position information in terms of latitude, longitude, and altitude. At present, a method of calculating position and time information using three satellites and correcting errors in the calculated position and time information using one more satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position in real time. However, it is difficult to accurately measure the position of the mobile terminal using the GPS module in a shadow area of satellite signals, such as indoors. Accordingly, a WiFi Positioning System (WPS) can be utilized to compensate for GPS positioning.
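The distance-based positioning above can be illustrated in two dimensions. The following sketch is my own simplification, not the GPS module's actual algorithm: given three anchor positions and measured distances, subtracting the circle equations pairwise yields a linear system in (x, y).

```python
# Illustrative 2-D trilateration sketch (planar analogue of the satellite
# distance-based positioning described above).

def trilaterate_2d(anchors, dists):
    """anchors: three (x, y) positions; dists: measured distances to each.
    Returns the (x, y) position consistent with all three distances."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Linear system A @ [x, y] = b from (circle1 - circle2), (circle1 - circle3)
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21  # nonzero when the anchors are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# A receiver at (3, 4), measured from anchors at (0,0), (10,0), (0,10):
print(trilaterate_2d([(0, 0), (10, 0), (0, 10)], [5.0, 65**0.5, 45**0.5]))
```

Real GPS additionally solves for the receiver clock bias, which is why a fourth satellite is used to correct the position and time errors, as noted above.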

The WiFi Positioning System (WPS) is a technology for tracking the location of the mobile terminal 100 using a WiFi module included in the mobile terminal 100 and a wireless access point (AP) that transmits or receives wireless signals to or from the WiFi module, and refers to a location tracking technique based on a wireless local area network (WLAN) using WiFi.

The WiFi location tracking system may include a Wi-Fi location server, a mobile terminal 100, a wireless AP connected to the mobile terminal 100, and a database in which certain wireless AP information is stored.

The mobile terminal 100 connected to the wireless AP can transmit the location information request message to the Wi-Fi location server.

The Wi-Fi location server extracts information of the wireless AP connected to the mobile terminal 100 based on the location information request message (or signal) of the mobile terminal 100. The information of the wireless AP connected to the mobile terminal 100 may be transmitted to the Wi-Fi location server through the mobile terminal 100, or may be transmitted from the wireless AP to the Wi-Fi location server.

The information of the wireless AP extracted based on the location information request message of the mobile terminal 100 may include at least one of a MAC address, an SSID (Service Set IDentification), a Received Signal Strength Indicator (RSSI), a Reference Signal Received Power (RSRP), a Reference Signal Received Quality (RSRQ), channel information, privacy, network type, signal strength, and noise strength.

As described above, the Wi-Fi location server can receive the information of the wireless AP connected to the mobile terminal 100 and extract, from a pre-established database, the wireless AP information corresponding to the wireless AP to which the mobile terminal is connected. The information of the wireless APs stored in the database may include information such as the MAC address, SSID, channel information, privacy, network type, coordinates of the wireless AP, the name of the building in which the AP is located, detailed indoor location information (where available), the address of the AP owner, and a telephone number. At this time, in order to exclude wireless APs provided using a mobile AP or an illegal MAC address from the positioning process, the Wi-Fi location server may extract only a predetermined number of wireless AP information entries in descending order of RSSI.
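The RSSI-ordered filtering step can be sketched as follows (function name and data layout are my own illustration). RSSI values are negative dBm, so "strongest" means closest to zero.

```python
# Illustrative sketch: keep only the n strongest APs by RSSI, as a server
# might do to exclude weak, mobile, or spoofed entries from positioning.

def strongest_aps(ap_list, n):
    """ap_list: [(mac_address, rssi_dbm), ...]; return the top-n entries
    in descending order of signal strength."""
    return sorted(ap_list, key=lambda ap: ap[1], reverse=True)[:n]

scanned = [("aa:aa", -70), ("bb:bb", -40), ("cc:cc", -55)]
print(strongest_aps(scanned, 2))
```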

Then, the Wi-Fi location server can extract (or analyze) the location information of the mobile terminal 100 using at least one piece of wireless AP information extracted from the database, comparing the stored wireless AP information with the received wireless AP information.

As a method for extracting (or analyzing) the position information of the mobile terminal 100, a Cell-ID method, a fingerprint method, a triangulation method, and a landmark method can be utilized.

The Cell-ID method determines the position of the wireless AP with the strongest signal, among the neighboring wireless AP information collected by the mobile terminal, as the position of the mobile terminal. The implementation is simple, incurs no additional cost, and can acquire location information quickly, but it has the disadvantage that positioning accuracy is lowered when the installation density of wireless APs is low.
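A minimal sketch of the Cell-ID method, assuming each collected entry pairs a known AP position with a measured RSSI (data layout is my own illustration):

```python
# Illustrative sketch: the Cell-ID method simply adopts the position of the
# strongest neighboring AP as the terminal's position.

def cell_id_position(aps):
    """aps: [((x, y), rssi_dbm), ...]; return the position of the AP with
    the strongest signal (RSSI closest to zero)."""
    return max(aps, key=lambda ap: ap[1])[0]

print(cell_id_position([((0, 0), -80), ((5, 5), -42), ((9, 1), -60)]))
```

The one-line core makes the stated trade-off visible: no calibration or extra infrastructure is needed, but the answer can never be more precise than the spacing between APs.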

The fingerprint method selects reference positions in a service area, collects signal strength information there, and estimates the position from the signal strength information transmitted by the mobile terminal, based on the collected information. To use the fingerprint method, the propagation characteristics must be stored in a database in advance.
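The estimation step can be sketched as a nearest-fingerprint lookup. The data layout and Euclidean distance metric are my own assumptions; deployed systems use more robust matching.

```python
# Illustrative sketch of fingerprint positioning: compare a live RSSI
# measurement against pre-collected fingerprints at reference positions
# and return the closest match.

def fingerprint_position(db, measurement):
    """db: {position_name: {mac: rssi_dbm}} collected in advance;
    measurement: {mac: rssi_dbm} reported by the terminal."""
    def distance(fingerprint):
        shared = fingerprint.keys() & measurement.keys()
        return sum((fingerprint[m] - measurement[m]) ** 2 for m in shared) ** 0.5
    return min(db, key=lambda pos: distance(db[pos]))

db = {"roomA": {"m1": -40, "m2": -70}, "roomB": {"m1": -75, "m2": -45}}
print(fingerprint_position(db, {"m1": -42, "m2": -68}))
```

This also shows why the database must be built in advance: without the per-position fingerprints there is nothing to match against.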

The triangulation method calculates the position of the mobile terminal based on the coordinates of at least three wireless APs and the distances between the mobile terminal and those APs. To measure the distance between the mobile terminal and a wireless AP, signal strength converted into distance information, the Time of Arrival (ToA) of a signal, the Time Difference of Arrival (TDoA) of signals, the Angle of Arrival (AoA) of a signal, and the like may be used.
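As one illustration of converting signal strength into distance information, a log-distance path-loss model is commonly used. The calibration constant (RSSI at 1 m) and the path-loss exponent below are assumed example values, not from the patent.

```python
# Illustrative sketch: RSSI -> approximate distance via the log-distance
# path-loss model. tx_power_dbm is the RSSI measured at 1 m (assumed
# calibration value); path_loss_exp ~= 2.0 models free space.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

# Every 20 dB of extra loss multiplies the distance by 10 (free space):
print(rssi_to_distance(-40.0), rssi_to_distance(-60.0))
```

The distances obtained this way (or from ToA/TDoA timing) would then feed the kind of three-anchor position solve sketched earlier for satellite positioning.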

The landmark method measures the position of the mobile terminal using a landmark transmitter whose location is known.

Various algorithms can be utilized as a method for extracting (or analyzing) the location information of the mobile terminal.

The extracted location information of the mobile terminal 100 is transmitted to the mobile terminal 100 through the Wi-Fi location server, so that the mobile terminal 100 can acquire its location information.

The mobile terminal 100 may be connected to at least one wireless AP to obtain location information. At this time, the number of wireless APs required to acquire the location information of the mobile terminal 100 may be variously changed according to the wireless communication environment in which the mobile terminal 100 is located.

Next, an operation method of a mobile terminal according to an embodiment of the present invention will be described with reference to FIG.

FIG. 2 is a ladder diagram for explaining an operation method of a mobile terminal according to an embodiment of the present invention.

Hereinafter, each of the first mobile terminal 100_1, the second mobile terminal 100_2, and the other mobile terminals may include all of the components described in FIG. 1.

The control unit 180 of the first mobile terminal 100_1 displays the image of a subject on the preview screen of the display unit 151 (S101). The control unit 180 may cause the first mobile terminal 100_1 to enter a photographing mode according to a user input. The photographing mode may include a moving image photographing mode for photographing a subject in real time to acquire a moving image, and a still photographing mode for photographing the subject to obtain a photograph. The control unit 180 can display the image of the subject on the preview screen as the first mobile terminal 100_1 enters the photographing mode.

The control unit 180 obtains a search input for searching for other mobile terminals located in the vicinity of the first mobile terminal 100_1 (S103). In one embodiment, the search input may be a touch input that selects the preview screen for a predetermined time or longer.

In yet another embodiment, the search input may be an input for selecting a search button displayed on the preview screen.

The control unit 180 may search for one or more other mobile terminals located in the vicinity of the first mobile terminal 100_1 in response to the obtained search input. The control unit 180 may search for one or more mobile terminals located in the vicinity through the short distance communication module 114. Here, the short distance communication standard may be any one of the Wi-Fi and Bluetooth standards, but this is merely an example.

The control unit 180 displays a plurality of items corresponding to each of the other mobile terminals searched in response to the obtained search input through the display unit 151 (S105). In one embodiment, each of the plurality of items may be an icon representing a user of each of the retrieved mobile terminals.

In one embodiment, each of the plurality of items may be an icon representing the user of each mobile terminal that, among the retrieved mobile terminals, corresponds to a contact stored in the memory 170. Steps S101 to S105 will be described in the following drawings.

In the following embodiments, sharing of a moving picture being captured between the first mobile terminal 100_1 and the second mobile terminal 100_2 is described as an example, but the embodiments may also be applied to sharing a captured photograph.

FIGS. 3A and 3B are views illustrating a process of displaying, on the preview screen, information about the counterparts corresponding to the retrieved mobile terminals according to an embodiment of the present invention.

Referring to FIG. 3A, an image 310 of a subject can be displayed on a preview screen 300. In FIGS. 3A and 3B, the first mobile terminal 100_1 may operate in the moving image photographing mode, and the image 310 of the subject may be an image being captured in the moving image photographing mode. When an input touching one point on the preview screen 300 for a predetermined time or longer is received, the control unit 180 may control the display unit 151 to display a plurality of items 321 to 325 on one area 320 of the preview screen 300. Each of the plurality of items 321 to 325 may represent one of the users of the retrieved other mobile terminals. Each item may include identification information that identifies the user of another mobile terminal. The identification information may include one or more of an image of the user, the name of the user, and the contact of the user.

Meanwhile, referring to FIG. 3B, the preview screen 300 may include a search button 311. The search button 311 may be a button for retrieving, among the other mobile terminals searched, only the mobile terminals corresponding to the contacts stored in the memory 170. When a search input selecting the search button 311 is received, the control unit 180 may, in response to the search input, control the display unit 151 to display a plurality of items 321 to 325 corresponding to the mobile terminals that correspond to the contacts stored in the memory 170 among the searched other mobile terminals.

In one embodiment, when the search input is received, the control unit 180 may power on the short distance communication module 114. That is, when the short distance communication module 114 is powered off, the control unit 180 can turn it on as the search input is received.

FIG. 2 will be described again.

The control unit 180 receives a request for selecting one or more of the plurality of items (S107), and transmits, to the second mobile terminal 100_2 corresponding to the selected item, a sharing request for sharing the image being captured by each mobile terminal (S109). It is assumed that the second mobile terminal 100_2 also operates in the moving image photographing mode and captures the image of a subject in real time.

The control unit 180 may transmit the sharing request to the second mobile terminal 100_2 corresponding to the selected item through the short distance communication module 114, according to the request to select one of the plurality of items.

The control unit 180 of the first mobile terminal 100_1 receives the image being shot by the second mobile terminal 100_2 in response to the sharing request (S111), and displays the received image on the preview screen (S113).

Likewise, the control unit 180 of the second mobile terminal 100_2 receives the image being captured by the first mobile terminal 100_1 in response to the sharing request (S111), and displays the received image on the preview screen (S113).

When the second mobile terminal 100_2 receives the sharing request from the first mobile terminal 100_1, the second mobile terminal 100_2 may display an acceptance message window asking whether to accept the sharing of the image of the subject being captured. When the second mobile terminal 100_2 accepts the sharing of the image of the subject being captured, the second mobile terminal 100_2 can transmit the data of the image being captured to the first mobile terminal 100_1 in real time, and the first mobile terminal 100_1 may likewise transmit data on the image it is capturing to the second mobile terminal 100_2. Steps S107 to S113 will be described with reference to the following figures.
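The handshake in steps S107 to S113 can be sketched in miniature. All class and method names below are hypothetical illustrations; the patent does not specify a wire format or API, and the user's accept/reject choice is stood in for by a flag.

```python
# Hypothetical sketch of the sharing handshake (S107-S113): request, accept
# or reject, then mutual real-time exchange of the frames being captured.

class Terminal:
    def __init__(self, name):
        self.name = name
        self.preview = []           # images currently shown on the preview screen
        self.accept_sharing = True  # stands in for the user's accept/reject choice

    def request_share(self, other):
        """S107/S109: send a sharing request; on acceptance, each side
        receives the other's live frame and shows it (S111/S113)."""
        if other.receive_share_request(self):
            self.preview.append(other.current_frame())
            other.preview.append(self.current_frame())
            return True
        return False

    def receive_share_request(self, sender):
        # A real terminal would show the acceptance message window of FIG. 4B.
        return self.accept_sharing

    def current_frame(self):
        return f"frame-from-{self.name}"

a, b = Terminal("A"), Terminal("B")
print(a.request_share(b), a.preview, b.preview)
```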

4A to 4D are diagrams illustrating a process of sharing an image being photographed by each mobile terminal according to an embodiment of the present invention.

Referring to FIG. 4A, when the control unit 180 of the first mobile terminal 100_1 receives a request to select a specific item 321 among the plurality of items 321 to 325 displayed on the preview screen 300, it may transmit a sharing request to the second mobile terminal 100_2 of the user corresponding to the selected specific item 321.

Referring to FIG. 4B, a preview screen 500 of the second mobile terminal 100_2 is shown. The preview screen 500 may display an image 510 of the subject being captured by the second mobile terminal 100_2. Here, it is assumed that the second mobile terminal 100_2 is operating in the moving image photographing mode. The second mobile terminal 100_2 may display, on the preview screen 500, a share acceptance inquiry message window 520 for accepting or rejecting the sharing of the image being captured, in accordance with the sharing request received from the first mobile terminal 100_1. The share acceptance inquiry message window 520 may include a message informing that a sharing request has arrived, information about the user who transmitted the sharing request, a thumbnail image of the subject being captured by the first mobile terminal 100_1 of that user, a reject button for rejecting the sharing request, and an accept button for accepting it. The information about the user who transmitted the sharing request may include one or more of that user's name and an image of the user. When a request to select the accept button is received in FIG. 4B, the second mobile terminal 100_2 can transmit the image 510 of the subject it is capturing to the first mobile terminal 100_1 in real time. When the second mobile terminal 100_2 accepts the sharing request, the first mobile terminal 100_1 can transmit the image 310 of the subject it is capturing to the second mobile terminal 100_2.

Referring to FIG. 4C, the first mobile terminal 100_1 receives an image 330 of the subject being captured by the second mobile terminal 100_2 and may display the received image 330 on one area of the preview screen 300. The control unit 180 can display the image 310 being captured through the camera on the entire area of the preview screen 300, and can display the image 330 received from the second mobile terminal 100_2 on one area of the preview screen 300. The control unit 180 can receive the image 330 of the subject in real time from the second mobile terminal 100_2 and display it in thumbnail form. The control unit 180 may control the display unit 151 to display, on the preview screen 300, sharing state information 340 indicating that the image being captured is shared with the second mobile terminal 100_2. The sharing state information 340 may include the name of the sharing counterpart, an image of the counterpart, and the like.

Referring to FIG. 4D, the second mobile terminal 100_2 receives an image 530 of the subject being captured by the first mobile terminal 100_1 and may display the received image 530 on one area of the preview screen 500. The second mobile terminal 100_2 can receive the image 530 of the subject in real time from the first mobile terminal 100_1 and display it in thumbnail form. The second mobile terminal 100_2 may control the display unit 151 to display, on the preview screen 500, sharing state information 540 indicating that the image being captured is shared with the first mobile terminal 100_1. The sharing state information 540 may include the name of the sharing counterpart, an image of the counterpart, and the like.

The user can view the image that he or she is capturing and the image captured by the counterpart at a glance.

FIG. 2 will be described again.

In response to a request to select the image received from the second mobile terminal 100_2 displayed on the preview screen, the control unit 180 edits and displays the image being captured by the second mobile terminal 100_2 (S115). This will be described in the following drawings.

FIGS. 5A and 5B are diagrams illustrating a process for displaying an image being shot by the other party on the entire area of the preview screen according to an embodiment of the present invention.

In FIG. 5A, the preview screen 300 of the first mobile terminal 100_1 is shown. An image 310 captured by the first mobile terminal 100_1 and an image 330 received from the second mobile terminal 100_2 may be displayed on the preview screen 300. The image 310 captured by the first mobile terminal 100_1 may be displayed on the entire area of the preview screen 300, and the image 330 received from the second mobile terminal 100_2 may be displayed on one area of the preview screen 300. Referring to FIG. 5B, when a request to select the image 330 received from the second mobile terminal 100_2 is received, the control unit 180 can enlarge and display the image 330 received from the second mobile terminal 100_2 on the entire area of the preview screen 300, and can display the image 310 captured by the first mobile terminal 100_1 in a reduced size on one area of the preview screen 300.

According to another embodiment of the present invention, a user can view the video he or she is capturing and the video captured by the counterpart on a split screen.

FIGS. 6A to 6D are views for explaining a process of collecting and displaying images being shot in each mobile terminal on a preview screen according to an embodiment of the present invention.

Referring to FIG. 6A, the preview screen 300 of the first mobile terminal 100_1 displays a first image 310 being captured by the first mobile terminal 100_1 and a second image 330 being captured by the second mobile terminal 100_2. In addition, the preview screen 300 may further display a first user icon 350 that identifies the user of the first mobile terminal 100_1. When the second image 330 is selected and an input dragging it is received, the control unit 180 may control the display unit 151 to display the first image 310 in one area of the preview screen 300 and the second image 330 in the remaining area of the preview screen 300, as shown in FIG. 6B. Here, the two areas may have the same size. In the remaining area of the preview screen 300, a second user icon 370 identifying the user of the second mobile terminal 100_2 may be further displayed. A storage icon 351 may be further displayed on the preview screen 300 of FIG. 6B. When a request to select the storage icon 351 is received, the control unit 180 may store the first image 310 and the second image 330 being displayed on the preview screen 300 as one file.

Referring to FIG. 6C, a preview screen 300 displaying the images being captured by each of the first mobile terminal 100_1, the second mobile terminal 100_2, and a third mobile terminal is shown. In FIG. 6C, the first mobile terminal 100_1 receives and displays the image being captured by each of the second mobile terminal 100_2 and the third mobile terminal. That is, in the first area of the preview screen 300, the first image 310 being captured by the first mobile terminal 100_1 and the first user icon 350 identifying the user of the first mobile terminal 100_1 are displayed. In the second area of the preview screen 300, the second image 330 being captured by the second mobile terminal 100_2 and the second user icon 370 identifying the user of the second mobile terminal 100_2 may be displayed. In the third area of the preview screen 300, a third image 380 being captured by the third mobile terminal and a third user icon identifying the user of the third mobile terminal may be displayed.

FIG. 6D illustrates a case where four persons are being photographed. On the preview screen 300, a fourth user icon 391 identifying the user of a fourth mobile terminal and a fourth image 390 being captured by the fourth mobile terminal may be further displayed.

In FIGS. 6C and 6D, the storage icon 351 may also be further displayed. Through the storage icon 351, the user can save the images being captured by the four persons, including himself or herself, as a single file.

According to another embodiment of the present invention, in response to a request to select a part of a subject included in an image being displayed on the preview screen, the images being captured by the other parties can be automatically classified.

FIGS. 7A and 7B are diagrams for explaining a process of automatically classifying the images being captured by the other parties in response to a request to select a part of a subject included in an image being displayed on the preview screen, according to an embodiment of the present invention.

Referring to FIG. 7A, the first image 310 being captured by the first mobile terminal 100_1 is displayed in the entire area of the preview screen 300 of the first mobile terminal 100_1, and in a partial area 320 of the preview screen 300, the images 330, 390, and 380 being captured by each of the other mobile terminals may be displayed. Each of the images 330, 390, and 380 may be a photograph. When a request to select the face region 319 of a specific person among the persons included in the first image 310 is received, the control unit 180 can classify and display, among the plurality of images 330, 390, and 380, the images corresponding to the selected face region 319. As shown in FIG. 7B, the controller 180 recognizes the person corresponding to the selected face region 319 and can display, in the partial area 320, the images 330 and 390 that include the recognized person among the plurality of images 330, 390, and 380. In addition, the controller 180 may display an indicator indicating the face of the person in each of the images 330 and 390 including the person. Furthermore, the control unit 180 may automatically store only the images 330 and 390.
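The classification step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `recognize_persons` stands in for whatever face-recognition backend the terminal uses, and the image identifiers simply follow the reference numerals of FIG. 7A.

```python
# Illustrative sketch (assumptions only): classify shared streams by whether
# they contain the person selected via a face region, as in FIGS. 7A-7B.

def classify_by_selected_face(selected_person, images, recognize_persons):
    """Return (matching, non_matching) image lists for the selected person."""
    matching, non_matching = [], []
    for image in images:
        # recognize_persons is a stand-in for a real face-recognition call.
        if selected_person in recognize_persons(image):
            matching.append(image)
        else:
            non_matching.append(image)
    return matching, non_matching

# Toy stand-in recognizer: maps image id -> set of person ids.
persons_in = {330: {"A", "B"}, 390: {"A"}, 380: {"C"}}
matching, dimmed = classify_by_selected_face(
    "A", [330, 390, 380], lambda img: persons_in[img])
```

With the toy data above, images 330 and 390 would be kept in the partial area 320 while image 380 would be a candidate for dimming.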

If, at the time point when the face region 319 is selected, there is no image including the person corresponding to the face region 319, the images 330, 390, and 380 can be dimmed. When the corresponding person later appears in a specific image, the control unit 180 may display an indicator indicating that the person has appeared in that image.

According to another embodiment of the present invention, an image captured by the other party in the same space and with the same composition may be recommended.

FIGS. 8A and 8B are diagrams for explaining an embodiment in which an image captured by a counterpart is recommended, according to an embodiment of the present invention.

According to the embodiment of the present invention, the sharing of captured images with the other party can be performed through short-range communication. Accordingly, there is a high probability that the user and the other party are located in the same space. The control unit 180 compares the image captured by the counterpart terminal with the image captured by its own terminal, and if the quality of the image captured by the counterpart terminal is better, the control unit 180 can display the counterpart's image in the entire area of the preview screen. For example, the quality of an image may be determined by the composition of the image.

Referring to FIG. 8A, the first image 310 being captured by the first mobile terminal 100_1 is displayed in the entire area of the preview screen 300, and the images 330, 390, and 380 being captured by the other terminals are displayed in a partial area 320. If the third image 380 has a better quality than the first image 310, the controller 180 may control the display unit 151 to display the third image 380 in the entire area of the preview screen 300 and display the first image 310 in the partial area 320 of the preview screen 300, as shown in FIG. 8B.

According to another embodiment of the present invention, when it is confirmed that the composition of a first image captured by the user is similar to the composition of a second image captured by the other party, the first image and the second image can be stored together.

FIG. 9 is a diagram illustrating an example in which, when the composition of a first image captured by the user is similar to the composition of a second image captured by the other party according to an embodiment of the present invention, the first image and the second image are stored together.

Referring to FIG. 9, the preview screen 300 of the first mobile terminal 100_1 displays the first image 310 being captured by the first mobile terminal 100_1. The first mobile terminal 100_1 can share images with the second mobile terminal 100_2. When the composition of the second image 410 captured by the second mobile terminal 100_2 is similar to that of the first image 310 and the shooting time of the first image 310 is similar to that of the second image 410, the control unit 180 may display a simultaneous storage message window 400 indicating that the second image 410 can be stored together with the first image 310. The simultaneous storage message window 400 may include the second image 410 being captured by the second mobile terminal 100_2, a reject button 420 for rejecting simultaneous storage, and an accept button 430 for accepting simultaneous storage.

When receiving the second image 410, the control unit 180 may also receive information on the shooting start time of the second image 410 and may use that information for comparison with the shooting start time of the first image 310. The control unit 180 may compare the position of the subject included in the first image 310 with the position of the subject included in the second image 410 to determine whether the two images have a similar composition.
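The decision described above can be sketched as a simple predicate. This is an assumption-laden illustration, not the patent's algorithm: the time-gap threshold, the offset threshold, and the representation of the subject as a normalized center point are all hypothetical choices made for the example.

```python
# Illustrative sketch (hypothetical thresholds): offer "store together" only
# when shooting start times are close and the main subjects sit in similar
# positions, as described for FIG. 9.

def should_offer_joint_storage(start_a, start_b, subject_a, subject_b,
                               max_time_gap=5.0, max_offset=0.2):
    """start_*: shooting start times in seconds; subject_*: normalized
    (x, y) center of the main subject in each frame."""
    time_close = abs(start_a - start_b) <= max_time_gap
    offset = ((subject_a[0] - subject_b[0]) ** 2 +
              (subject_a[1] - subject_b[1]) ** 2) ** 0.5
    composition_close = offset <= max_offset
    return time_close and composition_close

# Similar start times and nearly identical subject positions -> offer both.
offer = should_offer_joint_storage(100.0, 102.5, (0.5, 0.4), (0.52, 0.41))
```

When the predicate is true, the terminal would show the simultaneous storage message window 400; otherwise no prompt is displayed.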

Next, a process of stopping the sharing of a captured image with the counterpart terminal will be described.

FIGS. 10A and 10B are views for explaining a process of stopping the sharing of a captured image with a counterpart terminal according to an embodiment of the present invention.

The preview screen of FIG. 10A is the same as that of FIG. 6A, so the overlapping description is omitted. When the second image 330 is selected and an input for dragging the selected second image 330 downward is received, the controller 180 can disconnect from the second mobile terminal 100_2. As the connection with the second mobile terminal 100_2 is released, the control unit 180 may control the display unit 151 so that the second image 330 disappears from the preview screen 300, as shown in FIG. 10B.

FIGS. 11A to 11C are diagrams illustrating embodiments in which various modes are additionally provided in the photographing mode, according to an embodiment of the present invention.

Referring to FIG. 11A, the mobile terminal 100 operates in a shooting mode and displays a video 610 obtained through a camera on a preview screen 600. When a request to select one point 611 of the preview screen 600 for a predetermined time or longer is received, the controller 180 can provide a theme mode while maintaining the shooting mode, as shown in FIG. 11B. The theme mode may be a mode that provides a function of adding background music, or of automatically taking a still picture at predetermined time intervals, while shooting a moving picture. Under the theme mode, the control unit 180 can control the display unit 151 to display a mode providing area 620 providing various modes, a music providing area 630 providing information on the background music, and a clip button 640. The mode providing area 620 may include a theme icon 621 indicating that the terminal is currently operating in the theme mode. The music providing area 630 may include a change button 631 for providing and setting additional background music. The clip button 640 may be a button for automatically photographing a subject at regular time intervals under the theme mode.

Referring again to FIG. 11A, when a request to select one point 611 of the preview screen 600 for a certain period of time or longer is received, the control unit 180 can provide a together mode while maintaining the shooting mode. The together mode may be a mode for providing a function of shooting a picture or a moving picture jointly with another party located nearby. The control unit 180 can display a mode providing area 620, a user icon 651 for identifying the user, a counterpart icon 653 for identifying the counterpart located nearby, and a guide area 650 for guiding the area to be shot together. The mode providing area 620 may include a together icon 621 indicating that the terminal is currently operating in the together mode. The control unit 180 can transmit information on the guide area 650 to the counterpart terminal connected via short-range communication. The guide area 650 can guide the user's mobile terminal and the counterpart mobile terminal to photograph the same subject.

Next, an operation method of the mobile terminal according to another embodiment of the present invention will be described.

FIG. 12 is a flowchart illustrating an operation method of a mobile terminal according to another embodiment of the present invention.

The control unit 180 of the mobile terminal 100 receives a screen division input for dividing the screen displayed by the display unit 151 (S301). In one embodiment, the screen division input may be an input that moves a touch from the bezel on which the display unit 151 is mounted onto the screen of the display unit 151.

After receiving the screen division input, the controller 180 determines whether the screen displayed by the display unit 151 is off (S303). If the screen is off, the controller 180 causes the mobile terminal 100 to enter a first operation mode (S305). In one embodiment, the screen-off state may be a state in which power is not applied to the display unit 151 and the screen is not turned on.

In one embodiment, the first operation mode may be a mode for executing the photographing function of the camera 121 provided in the mobile terminal 100. When the mobile terminal 100 includes a front camera and a rear camera, the first operation mode may be a divided shooting camera mode in which the front camera and the rear camera are turned on simultaneously. More specifically, the first operation mode may be a mode for displaying a front image captured by the front camera and a rear image captured by the rear camera on each of the screens divided according to the screen division input.

In another embodiment, the first operation mode may be a mode for displaying execution screens of recently executed applications. For example, the first operation mode may be a mode for displaying the execution screen of each of the two most recently executed applications on the mobile terminal 100 on each of the divided screens.

In yet another embodiment, the first operation mode may be a mode for displaying execution views of the most recently executed application. For example, if the most recently executed application is an Internet application for web access, the first operation mode may be a mode for displaying the web site screens recently accessed through the Internet application.

The control unit 180 displays a specific item corresponding to the first operation mode on each of the divided screens in response to the received screen division input (S307).

In one embodiment, the control unit 180 may divide the screen of the display unit 151 based on the screen division input.

In one embodiment, the particular item may be an item that depends on the nature of the first mode of operation.

In one embodiment, the specific item corresponding to the first mode of operation may be a preview image of the camera.

In yet another embodiment, the particular item corresponding to the first mode of operation may be an execution screen of a recently executed application.

In yet another embodiment, the particular item corresponding to the first mode of operation may be recently played media content. Steps S301 to S307 will be described with reference to the following drawings.

FIG. 13 is a diagram illustrating a process of receiving a screen division input in a screen-off state according to an embodiment of the present invention.

Referring to FIG. 13, the screen 1000 displayed by the display unit 151 may be in a screen-off state in which nothing is displayed. The screen 1000 displayed by the display unit 151 may have a rectangular shape. The screen 1000 may be a rectangle comprising a first edge 151a, a second edge 151b, a third edge 151c, and a fourth edge 151d. The screen division input may be an input that moves (or drags) a touch input received at one point of a particular edge to another point on another edge. For example, as shown in FIG. 13, the screen division input may be an input that drags a touch from one point A of the first edge 151a to another point B of the second edge 151b parallel to the first edge 151a. The control unit 180 may divide the screen 1000 along the line on which the screen division input moves.
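The division step can be sketched as a small geometry routine. This is a minimal model built on assumptions: the drag path is approximated by the midpoint of its endpoints, and the coordinate convention (origin at the top-left, edges at 0 and `width`/`height`) is a hypothetical choice for the example.

```python
# Illustrative sketch (assumed model): divide a rectangular screen along the
# path of a drag that runs between two opposite, parallel edges, as in FIG. 13.

def split_regions(width, height, start, end):
    """start/end: (x, y) touch points. Returns two region rectangles
    (x, y, w, h): a horizontal split for a left-to-right drag, a vertical
    split for a top-to-bottom drag."""
    (x0, y0), (x1, y1) = start, end
    if x0 == 0 and x1 == width:          # drag from left edge to right edge
        cut = (y0 + y1) // 2             # approximate the path by its mean y
        return (0, 0, width, cut), (0, cut, width, height - cut)
    if y0 == 0 and y1 == height:         # drag from top edge to bottom edge
        cut = (x0 + x1) // 2
        return (0, 0, cut, height), (cut, 0, width - cut, height)
    raise ValueError("drag must run between two opposite edges")

# A level drag across the middle of a 1080x1920 screen yields two halves.
top, bottom = split_regions(1080, 1920, (0, 960), (1080, 960))
```

Each returned rectangle would then host one preview image or application window in the embodiments that follow.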

Next, functions that can be performed according to the screen division input in the screen off state will be described.

FIGS. 14A and 14B are diagrams for explaining an example in which the divided shooting camera mode is executed according to the screen division input.

FIG. 14A shows a state in which the divided shooting camera mode is executed according to the screen division input described with reference to FIG. 13. The control unit 180 may divide the screen 1000 into a first divided screen 1100 and a second divided screen 1200 according to the screen division input received in the screen-off state. A preview image 1110 of the rear camera provided in the mobile terminal 100 may be displayed on the first divided screen 1100, and a preview image 1210 of the front camera provided in the mobile terminal 100 may be displayed on the second divided screen 1200. That is, when receiving the screen division input in the screen-off state, the control unit 180 can turn on the rear camera and the front camera simultaneously and control the display unit 151 to display a preview image on each divided screen.

With the screen off, the user can quickly turn on both the front and rear cameras with a single drag of a touch input.

FIG. 14B shows a state in which the mobile terminal 100 enters the divided shooting camera mode when the mobile terminal 100 has two rear cameras. The control unit 180 may divide the screen 1000 into a third divided screen 1300 and a fourth divided screen 1400 when receiving the screen division input in the screen-off state. A first preview image 1310 of the first rear camera provided in the mobile terminal 100 may be displayed on the third divided screen 1300, and a second preview image 1410 of the second rear camera provided in the mobile terminal 100 may be displayed on the fourth divided screen 1400. That is, when the screen division input is received in the screen-off state, the controller 180 turns on the first rear camera and the second rear camera simultaneously and controls the display unit 151 to display a preview image on each divided screen.

Next, FIG. 15 will be described.

FIG. 15 illustrates an embodiment of displaying execution screens of recently executed applications according to the screen division input.

Referring to FIG. 15, the controller 180 may divide the screen 1000 into a fifth divided screen 1510 and a sixth divided screen 1530 according to the screen division input. The control unit 180 may control the display unit 151 to display each of the recently executed applications on each divided screen according to the screen division input. For example, the fifth divided screen 1510 may display the execution screen of a recently executed mobile message application, and the sixth divided screen 1530 may display the execution screen of a recently executed Internet application. For example, the mobile message application corresponding to the fifth divided screen 1510 may have been executed more recently than the Internet application corresponding to the sixth divided screen 1530.

FIG. 12 is described again.

When the screen displayed by the display unit 151 is on (S309), the control unit 180 causes the mobile terminal 100 to enter a second operation mode (S311). In one embodiment, the second operation mode may be a mode for providing, on the divided screens, information related to the information displayed on the screen of the display unit 151.
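The screen-state branch of FIG. 12 (S303 selecting the first operation mode, S309 selecting the second) reduces to a small dispatch. This is an illustrative sketch only; the mode names are placeholders for whichever concrete modes an embodiment chooses.

```python
# Illustrative sketch: after a screen division input, the operation mode is
# chosen by the screen state alone (FIG. 12, steps S303-S311).

def select_operation_mode(screen_on):
    """Return the operation mode entered after a screen division input."""
    return "second_mode" if screen_on else "first_mode"

mode_when_off = select_operation_mode(screen_on=False)  # e.g., divided shooting camera mode
mode_when_on = select_operation_mode(screen_on=True)    # e.g., related-information mode
```

The items displayed on each divided screen (S307 or S313) then depend on which mode was selected here.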

The control unit 180 displays a specific item corresponding to the second operation mode on each of the divided screens in response to the received screen division input (S313).

Steps S309 to S313 will be described with reference to the following drawings.

FIGS. 16A and 16B illustrate functions that can be performed when a screen division input is received in a state in which the screen of the display unit is turned on.

Referring to FIG. 16A, the screen 1000 of the display unit 151 is on, and the screen 1000 in the on state may include an execution window 1610 of a mobile message application. In this case, when a screen division input that halves the execution window 1610 of the mobile message application is received, the control unit 180 can divide the screen 1000 into a seventh divided screen 1630 and an eighth divided screen 1650. The seventh divided screen 1630 may include the execution window 1631 of the mobile message application that was being displayed before the screen division input was received. A preview image 1651 of the camera 121 according to the execution of a camera application can be displayed on the eighth divided screen 1650. Here, the camera 121 may be a front camera, but this is merely an example. The control unit 180 may display, on the divided screen 1650, information related to the information displayed on the screen 1000 before the screen division input. That is, in the execution window 1631 of the mobile message application, the other party has asked that a photograph be taken and sent directly. Accordingly, if the mobile message application is being executed before the screen division input, the execution screen of a camera application related to the mobile message application may be displayed on the divided screen 1650 according to the screen division input.
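The second operation mode's choice of what to show on the new divided screen can be sketched as a lookup from the foreground application to a related application. The mapping below is entirely hypothetical except for the message-to-camera pairing, which follows the FIG. 16B example.

```python
# Illustrative sketch (hypothetical pairings): in the second operation mode,
# the app launched on the new divided screen is chosen by its relation to the
# app already on screen, as in FIG. 16B.

RELATED_APP = {
    "mobile_message": "camera",  # conversation asks for a photo (FIG. 16A)
    "internet": "memo",          # purely hypothetical pairing for illustration
}

def app_for_divided_screen(foreground_app):
    """Return the related app to launch on the newly divided screen."""
    # Fall back to the foreground app itself when no relation is known.
    return RELATED_APP.get(foreground_app, foreground_app)

companion = app_for_divided_screen("mobile_message")
```

Under this sketch, dividing the screen over a message conversation would place the camera preview on the second divided screen, matching the embodiment above.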

FIGS. 17A to 17C illustrate functions that can be performed when a screen division input is received in a state in which the screen of the display unit is turned on.

Referring to FIG. 17A, a preview image 1710 acquired through the camera 121 is displayed on the screen 1000 of the display unit 151. Here, it is assumed that the camera 121 is a first rear camera.

In addition, the screen 1000 may further display a message window 1720 that guides the screen division input. As shown in FIG. 17B, when a screen division input that moves a touch input from one point of the first edge constituting the screen 1000 to another point of the second edge is received, the controller 180 can divide the screen 1000 into two divided screens 1810 and 1820 based on the input. The preview image 1710 acquired through the first rear camera may be displayed on the divided screen 1810, and a preview image 1711 acquired through a second rear camera may be displayed on the divided screen 1820. That is, the controller 180 may operate the second rear camera according to the screen division input during the operation of the first rear camera, and display the preview images obtained from the rear cameras on the respective divided screens.

Referring to FIG. 17B, when a screen division input that moves a touch input from one point of the fourth edge to another point of the third edge is received, the controller 180 can divide the screen 1000 into four divided screens 1830, 1840, 1850, and 1860, as shown in FIG. 17C. On the divided screen 1830, the preview image 1710 obtained through the first rear camera described with reference to FIG. 17B may be displayed. On the divided screen 1840, the preview image 1711 obtained through the second rear camera described with reference to FIG. 17B may be displayed. A preview image 1712 obtained through a third rear camera may be displayed on the divided screen 1850, and a preview image 1713 obtained through a fourth rear camera may be displayed on the divided screen 1860. That is, the controller 180 can operate the third and fourth rear cameras according to the additional screen division input during the operation of the first and second rear cameras, and display the preview images acquired from the rear cameras on the respective divided screens.

Next, an embodiment in which various functions are performed according to the input received on the divided screen will be described.

FIGS. 18A and 18B are diagrams for explaining an embodiment for performing various functions according to the input received on the divided screen.

Referring to FIG. 18A, the screen 1000 includes a first divided screen 1810 including a first preview image 1710 obtained through the first rear camera, and a second divided screen 1820 including a second preview image 1711 obtained through the second rear camera. When an input for touching the first preview image 1710 is received, the controller 180 may capture the first preview image 1710 to acquire a picture or a moving picture corresponding to the first preview image 1710. That is, the controller 180 may perform a still-picture shooting function or a moving-picture shooting function according to an input for touching the first divided screen 1810.

Referring to FIG. 18B, when an input for flicking the first divided screen 1810 from left to right (or from right to left) is received, the controller 180 can perform a color filter function to change the color of the first preview image 1710.

FIGS. 19A and 19B are diagrams for explaining another embodiment of performing various functions according to inputs received on a divided screen.

Referring to FIG. 19A, when a flicking input from the lower side to the upper side (or from the upper side to the lower side) is received on the first divided screen 1810, the controller 180 can operate the front camera so that a preview image 1910 obtained through the front camera is displayed on the first divided screen 1810. That is, according to the flicking input, the control unit 180 can operate the front camera in a state in which self-photographing is possible.
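The per-divided-screen gestures described across FIGS. 18A to 19B amount to a gesture-to-action dispatch. The gesture and action names below are assumptions made for the sketch; the pairings themselves follow the figures.

```python
# Illustrative sketch (assumed names): dispatch gestures on a divided screen
# to camera actions, per FIGS. 18A (tap), 18B (horizontal flick), and
# 19A (vertical flick).

def handle_split_screen_gesture(gesture):
    """Map a gesture on a divided screen to a camera action."""
    actions = {
        "tap": "capture_photo_or_video",             # FIG. 18A
        "flick_horizontal": "apply_color_filter",    # FIG. 18B
        "flick_vertical": "switch_to_front_camera",  # FIG. 19A
    }
    return actions.get(gesture, "ignore")

action = handle_split_screen_gesture("flick_vertical")
```

Unrecognized gestures fall through to "ignore", leaving the divided screen unchanged.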

FIGS. 20A and 20B are diagrams for explaining another embodiment of performing various functions according to inputs received on a divided screen.

Referring to FIG. 20A, the first divided screen 1810 may include a first option button 1911, and the second divided screen 1820 may include a second option button 1913. When the first option button 1911 is selected, the controller 180 can control the display unit 151 to display, on the first divided screen 1810, function buttons 1921 for applying various effects to the preview image 1710.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may include the control unit 180 of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

Claims (14)

A method of operating a mobile terminal having a display unit, the method comprising:
receiving a screen division input for dividing a screen of the display unit;
causing the mobile terminal to enter a first operation mode when the screen is in a screen-off state in which the screen is not turned on; and
displaying items corresponding to the first operation mode on each of the screens divided according to the screen division input.
The method according to claim 1,
wherein the first operation mode operates a front camera and a rear camera provided in the mobile terminal, and
wherein the displaying comprises:
displaying a preview image obtained through the front camera on a first divided screen; and
displaying a preview image obtained through the rear camera on a second divided screen.
The method according to claim 1,
wherein the first operation mode operates a first rear camera and a second rear camera provided in the mobile terminal, and
wherein the displaying comprises:
displaying a preview image obtained through the first rear camera on a first divided screen; and
displaying a preview image obtained through the second rear camera on a second divided screen.
The method according to claim 1,
wherein the first operation mode is a mode for providing information on recently executed applications, and
wherein the displaying comprises displaying an execution screen of each of the recently executed applications on each of the divided screens.
The method according to claim 1, further comprising:
causing the mobile terminal to enter a second operation mode when the screen is on; and
displaying items corresponding to the second operation mode on each of the screens divided according to the screen division input.
The method of claim 5,
wherein the second operation mode is a mode for displaying information associated with the information displayed on the screen.
The method according to claim 6,
wherein displaying the items corresponding to the second operation mode comprises displaying an execution window of a first application and an execution window of a second application associated with the first application on each of the divided screens when the execution window of the first application is displayed on the screen before the screen is divided.
A mobile terminal comprising:
a display unit; and
a control unit configured to receive a screen division input for dividing a screen of the display unit, to cause the mobile terminal to enter a first operation mode when the screen is in a screen-off state in which the screen is not turned on, and to control the display unit to display items corresponding to the first operation mode on each of the screens divided according to the screen division input.
The mobile terminal of claim 8, further comprising a front camera and a rear camera,
wherein the first operation mode operates the front camera and the rear camera, and
wherein the control unit controls the display unit to display a preview image obtained through the front camera on a first divided screen and a preview image obtained through the rear camera on a second divided screen.
The mobile terminal of claim 8, further comprising a first rear camera and a second rear camera,
wherein the first operation mode operates the first rear camera and the second rear camera, and
wherein the control unit controls the display unit to display a preview image obtained through the first rear camera on a first divided screen and a preview image obtained through the second rear camera on a second divided screen.
The mobile terminal of claim 8,
wherein the first operation mode is a mode for providing information on recently executed applications, and
wherein the control unit controls the display unit to display an execution screen of each of the recently executed applications on each of the divided screens.
The mobile terminal of claim 8,
wherein, when the screen is in a screen-on state in which the screen is on, the control unit causes the mobile terminal to enter a second operation mode and controls the display unit to display items corresponding to the second operation mode on each of the screens divided according to the screen division input.
The mobile terminal of claim 12,
wherein the second operation mode is a mode for displaying information associated with the information displayed on the screen.
The mobile terminal of claim 13,
wherein the control unit controls the display unit to display an execution window of a first application and an execution window of a second application associated with the first application on each of the divided screens when the execution window of the first application is displayed on the screen before the screen is divided.
KR1020150105086A 2015-07-24 2015-07-24 Mobile terminal and operating method thereof KR20170011798A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150105086A KR20170011798A (en) 2015-07-24 2015-07-24 Mobile terminal and operating method thereof
PCT/KR2015/008542 WO2017018573A1 (en) 2015-07-24 2015-08-14 Mobile terminal and method for operating same


Publications (1)

Publication Number Publication Date
KR20170011798A true KR20170011798A (en) 2017-02-02

Family

ID=58154261

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150105086A KR20170011798A (en) 2015-07-24 2015-07-24 Mobile terminal and operating method thereof

Country Status (1)

Country Link
KR (1) KR20170011798A (en)
