KR20140136854A - Application operating method and electronic device implementing the same - Google Patents

Application operating method and electronic device implementing the same

Info

Publication number
KR20140136854A
Authority
KR
South Korea
Prior art keywords
window
background application
foreground
application
application window
Prior art date
Application number
KR20130124868A
Other languages
Korean (ko)
Inventor
박영주
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to US14/283,986 priority Critical patent/US20140351729A1/en
Publication of KR20140136854A publication Critical patent/KR20140136854A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The specification discloses an electronic device having a multitasking function. A method of operating the electronic device disclosed herein includes the steps of: displaying a window of a foreground application; displaying at least one window of background applications on at least a part of the window of the foreground application; detecting a user input selecting one of the at least one window of the background applications; and assigning a foreground authority to a background application corresponding to the selected window to update the selected window.

Description

TECHNICAL FIELD [0001] The present invention relates to an application operating method and an electronic apparatus for implementing the same.

This specification discloses an electronic device capable of multitasking.

In recent years, electronic devices such as smart phones and tablet PCs have come to support multitasking, which allows users to perform multiple tasks simultaneously. For example, a user can read an article or play a game using an electronic device. If a short message is received by the electronic device at this time, the electronic device can notify the user that the short message has been received. In response to the user's request, the electronic device can display a window of the message application and transmit a reply message entered through the window. After the message has been transmitted, the electronic device may re-display the window of the previous application (i.e., the article- or game-related window) in response to the user's request. However, when another short message is received, the user needs to call up the message application again in order to reply to it. In this way, the user needs to switch applications to perform the desired tasks, and such switching may be inconvenient for the user.

It is an object of the present invention to provide an apparatus and a method for temporarily displaying a window of a background application on a part of the window of a foreground application so that a task of the background application can be performed.

Here, the background and foreground applications may be applications in execution mode. The execution mode may mean that the application has been loaded from the auxiliary memory into the main memory and is being executed by the operating system. The foreground application may be an application having the access right to the screen; in other words, the foreground application may be the application whose task is handled with the highest priority.
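As a point of reference for the discussion that follows, the short Java sketch below models the foreground/background distinction described above. The class and field names are illustrative assumptions for this example only and are not taken from the patent.

```java
import java.util.ArrayList;
import java.util.List;

public class AppStateModel {
    // Both states are "execution mode": the app is loaded into main memory and running.
    enum State { FOREGROUND, BACKGROUND }

    static class App {
        final String name;
        State state = State.BACKGROUND;
        App(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        List<App> running = new ArrayList<>();
        App game = new App("Game");
        App messenger = new App("Messenger");
        running.add(game);
        running.add(messenger);

        // The foreground app holds the access right to the screen,
        // i.e. its task is handled with the highest priority.
        game.state = State.FOREGROUND;

        for (App a : running) {
            System.out.println(a.name + " -> " + a.state);
        }
    }
}
```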

A method of operating an electronic device of the present disclosure includes: displaying a window of a foreground application; displaying at least one background application window in a portion of the foreground application window; detecting a user input selecting one of the at least one background application window; and updating the selected window by granting foreground permission to the background application of the selected window.

An electronic apparatus of the present disclosure includes: a display unit that displays a window of an application; an input unit that detects user input; a task manager configured to perform an operation of displaying a window of a foreground application, an operation of displaying at least one background application window in a portion of the foreground application window, an operation of detecting a user input selecting one of the at least one background application window, and an operation of granting foreground authority to the background application of the selected window to update the selected window; and at least one processor for executing the task manager.

The method and apparatus according to the present disclosure may display at least one background application window in a portion of a foreground application window and may allow one of the background applications to temporarily perform a foreground operation by granting it foreground permission.

FIG. 1 is a block diagram of an electronic device according to an embodiment of the present disclosure.
FIG. 2 is a flowchart for explaining an example of a procedure for temporarily granting foreground authority to a background application and performing an operation.
FIG. 3 is a flowchart for explaining another example of a procedure for temporarily granting foreground authority to a background application and performing an operation.
FIG. 4 is a flowchart for explaining an example of a procedure for replacing the foreground application.
FIGS. 5A, 5B, 5C, and 5D are screens for explaining an example of a procedure for interacting with a message application.
FIGS. 6A and 6B are screens for explaining an example of a procedure for interacting with a plurality of applications.
FIG. 7 is a flowchart for explaining an example of a procedure in which the window of a background application to which foreground authority is temporarily granted is updated.

The electronic device according to the present disclosure may be a computing device such as a smart phone, a camera, a tablet PC, a notebook PC, a desktop PC, a media player (e.g., an MP3 player), a PDA, a gaming terminal, or a wearable computer. Further, the electronic device according to the present disclosure may be a home appliance (e.g., a refrigerator, a TV, a washing machine, etc.) in which such a computing device is embedded.

The electronic device according to the present disclosure may display a "background interface that includes at least one background application window" on a portion of the foreground application window in response to a user request (e.g., tapping a message reception notification displayed on the screen). The electronic device may grant foreground privilege to the background application of the window selected in the background interface; that is, the electronic device can update and display the selected background application window. The electronic device may stop displaying the background interface in response to a user request (e.g., tapping the foreground application window); that is, the electronic device may return the foreground privilege to the original foreground application. Thus, the electronic device can provide the user with an interaction in which a background application temporarily receives foreground privilege and performs its operation.

Various embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. In describing the embodiments, descriptions of techniques that are well known in the art to which the present disclosure pertains and are not directly related to the present disclosure may be omitted. Further, detailed description of components having substantially the same configuration and function can be omitted. In the drawings, some of the elements may be exaggerated, omitted, or schematically illustrated.

FIG. 1 is a block diagram of an electronic device according to an embodiment of the present disclosure.

Referring to FIG. 1, an electronic device 100 includes a display unit 110, a key input unit 120, a wireless communication unit 130, an audio processing unit 140, a speaker 141, a microphone 142, a receiver 143, an earphone 144, a memory 150, and a control unit 160.

The display unit 110 may display various information on the screen under the control of the controller 160, in particular an application processor (AP). For example, the control unit 160 processes (e.g., decodes) information and stores it in a memory (e.g., a frame buffer); a plurality of application windows may be stored in the frame buffer. The display unit 110 can convert the data stored in the frame buffer into an analog signal and display it on the screen. For example, the display unit 110 may display the window of a foreground application among the plurality of application windows. In addition, the display unit 110 may display a background interface on a part of the foreground application window. A background application window selected by the user in the background interface can be updated and stored in the frame buffer, and the display unit 110 can then display the updated background application window on a part of the foreground application window.

The display unit 110 may be a liquid crystal display (LCD), an active matrix organic light emitting diode (AMOLED), a flexible display, or a transparent display.

When power is supplied to the display unit 110, the display unit 110 can display a lock image on the screen. If a user input (e.g., a password) for unlocking is detected while the lock image is being displayed, the controller 160 can release the lock. When the lock is released, the display unit 110 may display, for example, a home image on the screen under the control of the controller 160, instead of the lock image. The home image may include a background and icons displayed thereon. The icons can indicate applications or content (e.g., photo files, video files, recorded files, documents, messages, etc.). When a user input for executing an application icon is detected, the control unit 160 can execute the application and control the display unit 110 to display its window on the screen. Meanwhile, the screen may be referred to by a name associated with what is displayed on it: for example, a screen on which a lock image is displayed, a screen on which a home image is displayed, and a screen on which an execution image (i.e., a window) of an application is displayed may be referred to as a lock screen, a home screen, and an execution screen, respectively.

The touch panel 111 is installed on the screen of the display unit 110; that is, the display unit 110 may include the touch panel 111 as an input unit. For example, the touch panel 111 may be an add-on type located on the screen of the display unit 110, an on-cell type, or an in-cell type.

The touch panel 111 may include a hand touch panel, which may be a capacitive touch panel. The hand touch panel may include a plurality of scan input ports (hereinafter, scan ports) and a plurality of sense output ports (hereinafter, sensing ports). The hand touch panel can generate sensing information (e.g., a change in electrostatic capacitance) in response to the touch of a conductive object (e.g., a finger) according to a scan control signal from the touch screen controller of the control unit 160, and can transmit the sensing information to the touch screen controller through the sensing ports.

The touch panel 111 may include a pen touch panel (e.g., a digitizer sensor substrate). The pen touch panel may be an Electro-Magnetic Resonance (EMR) type. Accordingly, the pen touch panel can generate sensing information in response to hovering or a touch of a pen specially designed to form a magnetic field, and can transmit the sensing information to the touch screen controller of the controller 160. Here, the pen may have a button; for example, when the user depresses the button, the magnetic field generated by the coil of the pen may change. The pen touch panel may generate sensing information in response to the change in the magnetic field and may transmit the sensing information to the touch screen controller of the controller 160.

The key input unit 120 may include at least one capacitive touch key. The touch key may generate a key event in response to a touch of a conductive object, and may transmit the generated key event to the control unit 160. The key input unit 120 may further include a key other than the touch key. For example, the key input unit 120 may include at least one dome key. When the user depresses the dome key, the dome key is deformed and contacts the printed circuit board, so that a key event is generated on the printed circuit board and can be transmitted to the controller 160. Meanwhile, the key of the key input unit 120 may be referred to as a hard key, and the key displayed on the display unit 110 may be referred to as a soft key.

The wireless communication unit 130 can perform voice communication, video communication, or data communication with an external device through a network under the control of the controller 160. The wireless communication unit 130 may include a mobile communication module (e.g., a 3rd-generation, 3.5th-generation, or 4th-generation mobile communication module), a digital broadcasting module (e.g., a DMB module), and a short-range communication module (e.g., a Wi-Fi module, a Bluetooth module, or an NFC (Near Field Communication) module).

The audio processor 140 is connected to the speaker 141, the microphone 142, the receiver 143, and the earphone 144, and performs input and output of audio signals (e.g., voice data) for voice recognition, voice recording, voice modulation, and digital recording. The audio processing unit 140 receives an audio signal (e.g., voice data) from the control unit 160, D/A-converts the received audio signal into an analog signal, amplifies it, and outputs it to the speaker 141, the receiver 143, or the earphone 144. The earphone 144 can be connected to and disconnected from the electronic device 100 via an ear jack. When the earphone 144 is connected to the audio processing unit 140, the audio processing unit 140 can output an audio signal to the earphone 144. When the communication mode is the speaker mode, the audio processing unit 140 can output an audio signal to the speaker 141; when the communication mode is the receiver mode, the audio processing unit 140 can output the audio signal to the receiver 143. The speaker 141, the receiver 143, and the earphone 144 convert an audio signal received from the audio processing unit 140 into a sound wave and output it, and the microphone 142 converts a sound wave transmitted from a person or another sound source into an audio signal. Meanwhile, the earphone 144 may be a four-pole earphone, that is, an earphone having a microphone. The audio processor 140 A/D-converts the audio signal received from the microphone 142 or the microphone of the earphone 144 into a digital signal and transmits it to the controller 160.

The audio processing unit 140 can provide the user with auditory feedback (e.g., voice or sound) related to the display of a background application window under the control of the control unit 160. For example, when at least one background application window is displayed on a part of the foreground application window, the audio processing unit 140 can reproduce voice data or sound data to inform the user of this. When the display of the background application window is terminated, the audio processing unit 140 can reproduce voice data or sound data to indicate this. When one of the displayed background application windows is set as the foreground application window, the audio processing unit 140 can likewise reproduce voice data or sound data to indicate this.

The memory 150 may store data generated according to the operation of the electronic device 100 or received from an external device through the wireless communication unit 130 under the control of the controller 160. The memory 150 may include a buffer as a temporary data store. The memory 150 may also store various setting information for setting the usage environment of the electronic device 100 (e.g., screen brightness, whether to vibrate when a touch occurs, whether to rotate the screen automatically, and the like). Accordingly, the control unit 160 can operate the electronic device 100 by referring to the setting information.

The memory 150 may store various programs for operating the electronic device 100, such as a boot program, one or more operating systems, applications 151_1 through 151_N, and a window resource manager 152 that manages the resources of the application windows. For example, if the operating system is Linux, the window resource manager 152 may be an X server. In particular, the memory 150 may store a task manager 153.

The task manager 153 may be set to perform an operation of displaying a "background interface including at least one background application (hereinafter, 'app') window" on a part of the foreground application window in response to a request to display a background app window, an operation of requesting the app of the window selected in the background interface to update its window, and an operation of displaying the window updated by the app. That is, the task manager 153 may grant foreground privilege to the app of the selected window temporarily (for a short session).

The task manager 153 may also be set to perform an operation of changing the foreground application in response to a replace request while the background interface is being displayed, and an operation of displaying another background app window in response to a move request while the background interface is being displayed.

The task manager 153 may include a touch event handler 153a, a window event handler 153b, and a task display module 153c. The touch event handler 153a may be set to perform an operation of transferring a touch event to the window resource manager 152. The window event handler 153b may be set to perform an operation of acquiring information of the updated background application window and controlling the task display module 153c to display it. The task display module 153c may be set to perform an operation of displaying the updated background application window.
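The sketch below illustrates, under stated assumptions, how these three sub-modules of the task manager 153 could cooperate: the touch event handler forwards a touch, a (stand-in) window resource manager reports the updated window back, and the window event handler asks the task display module to show it. The interfaces, classes, and method names are assumptions made for this example, not the patent's actual implementation.

```java
public class TaskManagerSketch {

    interface WindowResourceManager {          // stands in for element 152
        void deliverTouch(String app, int x, int y);
    }

    static class TouchEventHandler {           // 153a: forwards touch events
        private final WindowResourceManager wrm;
        TouchEventHandler(WindowResourceManager wrm) { this.wrm = wrm; }
        void onTouch(String app, int x, int y) { wrm.deliverTouch(app, x, y); }
    }

    static class TaskDisplayModule {           // 153c: draws the updated window
        void show(String app, byte[] windowPixels) {
            System.out.println("displaying updated window of " + app
                    + " (" + windowPixels.length + " bytes)");
        }
    }

    static class WindowEventHandler {          // 153b: receives window-update events
        private final TaskDisplayModule display;
        WindowEventHandler(TaskDisplayModule display) { this.display = display; }
        void onWindowUpdated(String app, byte[] windowPixels) {
            display.show(app, windowPixels);
        }
    }

    public static void main(String[] args) {
        TaskDisplayModule display = new TaskDisplayModule();
        WindowEventHandler windowEvents = new WindowEventHandler(display);
        // A fake resource manager that immediately reports an updated window back.
        TouchEventHandler touch = new TouchEventHandler(
                (app, x, y) -> windowEvents.onWindowUpdated(app, new byte[320 * 240]));
        touch.onTouch("Messenger", 40, 80);
    }
}
```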

The memory 150 may include a main memory and an auxiliary memory. The main memory may be implemented by, for example, a RAM. The auxiliary memory may be implemented as a disk, a RAM, a ROM, a flash memory, or the like. The main memory may store various programs loaded from the auxiliary memory, such as a boot program, operating systems, and applications. When power from the battery is supplied to the controller 160, the boot program is first loaded into the main memory. The boot program loads the operating system into the main memory, and the operating system loads the applications into the main memory. The control unit 160 (for example, an AP (Application Processor)) accesses the main memory, decodes the instructions (routines) of a program, and executes functions according to the decoding result. That is, various programs can be loaded into the main memory and run as processes.

The control unit 160 controls the overall operation of the electronic device 100 and the signal flow between the internal components of the electronic device 100, processes data, and controls the supply of power from the battery to the components. The control unit 160 may include a touch screen controller 161 and an application processor (AP) 162.

The touch screen controller 161 receives sensing information from the touch panel 111 and analyzes it to recognize that a touch, hovering, pressing of the pen button, or the like has occurred. The touch screen controller 161 can determine a hovering area on the touch screen in response to the hovering and calculate hovering coordinates (x_hovering, y_hovering) in the hovering area. The touch screen controller 161 may pass a hovering event containing the calculated hovering coordinates to the application processor (AP) 162. The hovering event may also include a depth value; for example, the hovering event may include three-dimensional hovering coordinates (x, y, z), where the z value can mean depth. The touch screen controller 161 can determine a touch area on the touch screen in response to a touch and calculate touch coordinates (x_touch, y_touch) in the touch area. The touch screen controller 161 can transmit a touch event including the calculated touch coordinates to the application processor 162. The touch screen controller 161 may also forward a pen button event to the application processor 162 in response to the pressing of the pen button.

The application processor 162 may receive a touch screen event (e.g., a hovering event, a touch event, a pen button event, etc.) from the touch screen controller 161 and perform a function corresponding to the touch screen event.

When hovering coordinates are received from the touch screen controller 161, the application processor 162 determines that the pointing device is hovering over the touch screen; when hovering coordinates are no longer received from the touch panel 111, it can determine that the hovering has been released. In addition, the application processor 162 may determine that a hovering movement of the pointing device has occurred if the hovering coordinates change and the amount of change exceeds a predetermined movement threshold. In response to the hovering movement of the pointing device, the application processor 162 can calculate the position change amount (dx, dy) of the pointing device, the moving speed of the pointing device, and the trajectory of the hovering movement. The application processor 162 may also determine a hovering gesture on the touch screen based on the hovering coordinates, whether the hovering of the pointing device is released, whether the pointing device moves, the position change amount of the pointing device, and the moving speed of the pointing device. Here, the hovering gesture may include, for example, a drag, a flick, a pinch-in, a pinch-out, and the like.

The application processor 162 determines that the pointing device has touched the touch panel 111 when touch coordinates are received from the touch screen controller 161; if touch coordinates are no longer received from the touch panel 111, it can determine that the touch of the pointing device has been released. In addition, the application processor 162 may determine that a touch movement of the pointing device has occurred when the touch coordinates change and the amount of change exceeds a predetermined movement threshold. In response to the touch movement of the pointing device, the application processor 162 can calculate the position change amount (dx, dy) of the pointing device, the moving speed of the pointing device, and the trajectory of the touch movement. The application processor 162 can also determine a touch gesture on the touch screen based on the touch coordinates, whether the touch of the pointing device is released, whether the pointing device moves, the position change amount of the pointing device, and the moving speed of the pointing device. Here, the touch gesture may be a touch, a multi-touch, a tap, a double tap, a long tap, a drag, a flick, a press, a pinch-in, a pinch-out, and the like.
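As a rough illustration of this kind of gesture determination, the sketch below classifies a completed touch as a tap, drag, or flick from the coordinate change and its duration. The thresholds, class, and method names are assumptions made for this example and are not taken from the patent.

```java
public class TouchGestureSketch {

    enum Gesture { TAP, DRAG, FLICK }

    static Gesture classify(int downX, int downY, int upX, int upY, long durationMs) {
        int dx = upX - downX;
        int dy = upY - downY;
        double distance = Math.sqrt((double) dx * dx + (double) dy * dy);

        final double MOVE_THRESHOLD_PX = 10.0;   // below this, the touch did not "move"
        final double FLICK_SPEED_PX_MS = 1.0;    // fast movement is treated as a flick

        if (distance < MOVE_THRESHOLD_PX) {
            return Gesture.TAP;
        }
        double speed = distance / Math.max(durationMs, 1);
        return speed >= FLICK_SPEED_PX_MS ? Gesture.FLICK : Gesture.DRAG;
    }

    public static void main(String[] args) {
        System.out.println(classify(100, 100, 103, 101, 120));  // TAP
        System.out.println(classify(100, 100, 260, 110, 400));  // DRAG
        System.out.println(classify(100, 100, 400, 120, 150));  // FLICK
    }
}
```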

The application processor 162 may receive a key event from the key input unit 120 and perform a function corresponding to the key event.

The application processor 162 may execute various programs stored in the memory 150. That is, the application processor 162 can load various programs from the auxiliary memory to the main memory and operate them as a process. In particular, the application processor 162 can operate the task manager 153 as a process.

Meanwhile, the control unit 160 may further include various processors in addition to the application processor 162. For example, the control unit 160 may include a graphics processing unit (GPU) for graphics processing. When the electronic device 100 is equipped with a mobile communication module (for example, a 3rd-generation, 3.5th-generation, or 4th-generation mobile communication module), the control unit 160 may further include a communication processor (CP) responsible for processing the mobile communication. Each of the above-described processors may be integrated in a single package in which two or more independent cores (e.g., quad-core) form a single integrated circuit; for example, the application processor 162 may be integrated as a multi-core processor. The above-described processors may also be integrated on a single chip (SoC) or packaged in multiple layers.

Meanwhile, the electronic device 100 may further include components not mentioned above, such as a GPS receiving module, a vibration motor, a camera, an acceleration sensor, a gyro sensor, and a proximity sensor. When the electronic device 100 is set to the automatic rotation mode, the controller 160 analyzes the sensing information collected from the sensors to calculate the posture of the electronic device 100 and determines accordingly whether the display operates in landscape mode or portrait mode.

FIG. 2 is a flowchart for explaining an example of a procedure for temporarily granting foreground authority to a background application and performing an operation.

Referring to FIG. 2, in operation 210, the control unit 160 checks whether a user input requesting display of a background application window is detected. Here, the user input may be a specific touch gesture. When a touch gesture is detected, the controller 160 compares the touch gesture with a preset value and checks whether the detected touch gesture is a user input requesting display of the background application window. For example, a pinch-in can be set as the user input for displaying the background app window; of course, other touch gestures or specific hovering gestures may be set instead. Meanwhile, the user input requesting display of the background application window may be an input selecting a specific icon displayed on the screen (for example, the user tapping a specific icon). The user input may also be a key event, or a voice command event input via the microphone 142 or the microphone of the earphone 144.

If a user input requesting display of the background app window is detected in operation 210, the control unit 160 controls the display unit 110 to display a background interface on a part of the foreground application window in operation 220. The background interface may include at least one of the background application windows stored in a memory (e.g., a frame buffer). The foreground app window is the window that was displayed on the screen before the user input was detected; that is, the foreground application window is the window of the application having the access right to the screen. For example, the foreground app window can be a lock image, a home image, a game image, a web page, a document, and the like. A plurality of foreground application windows may also be displayed on the screen: for example, when foreground privileges are granted to a plurality of applications, the screen may be divided into a plurality of areas and a foreground application window may be displayed in each area. Meanwhile, when a user input requesting a change of the background application window is detected, the control unit 160 may control the display unit 110 to display another background application window in response to the user input. For example, when a flick or drag occurs in the background interface, the window of application A disappears and the window of application B can be displayed.

In operation 230, the control unit 160 checks whether a user input selecting a background app window in the background interface is detected. Here, the user input may be a tap on that window. The user input may also be a voice command event input via the microphone 142 or the microphone of the earphone 144.

When a user input selecting a background application window is detected in operation 230, the control unit 160 temporarily grants foreground permission to the application of the selected window in operation 240; that is, the control unit 160 updates the selected window. For example, when the window of a messenger is selected, the control unit 160 determines whether new information (e.g., a message, announcement, or update) has been received for the messenger. If it is determined that there is new information, the control unit 160 may control the display unit 110 to display the new information in the window.

In operation 250, the control unit 160 checks whether a user input requesting execution of a function is detected. If a user input requesting execution of a function is detected in operation 250, the controller 160 performs the requested function in operation 260. For example, when an input window is selected in the background application window, the control unit 160 may control the display unit 110 to display a keypad on a part of the window. A message entered via the keypad can be displayed in the input window. When transmission of the message is selected (for example, a tap on the send button), the control unit 160 may control the wireless communication unit 130 to transmit the message displayed in the input window to the device of the other party in the chat. After operation 260 is performed, the process may return to operation 250. If no user input requesting execution of a function is detected in operation 250, the process may proceed to operation 270.

In operation 270, the controller 160 checks whether a user input requesting termination of the background interface is detected. For example, when the user taps the foreground application window, the display of the background interface is terminated and the process can be ended; alternatively, the process may return to operation 210. If no user input requesting termination of the background interface is detected in operation 270, the process may return to operation 250.

If no user input selecting a background app window is detected in operation 230, the process may proceed to operation 280. In operation 280, the control unit 160 checks whether a user input requesting termination of the background interface is detected. If a user input requesting termination of the background interface is detected, the display of the background interface is terminated and the process can be ended; alternatively, the process may return to operation 210. The control unit 160 also returns the foreground permission to the foreground application. If no user input requesting termination of the background interface is detected, the process may return to operation 230.

Meanwhile, the background interface may be terminated automatically if no user input is detected within a predetermined time (for example, one minute) from the time it was displayed. The process may then return to operation 210.
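The sketch below condenses the FIG. 2 flow into a few methods: the background interface is shown over part of the foreground window, a selected background app is temporarily given foreground privilege, and dismissing the interface restores the original foreground app. The class, method, and application names are illustrative assumptions for this example rather than anything defined by the patent.

```java
import java.util.List;

public class TemporaryForegroundSketch {

    static String foregroundApp = "Game";           // app currently holding screen access
    static boolean backgroundInterfaceShown = false;

    // Operation 220: show the background interface over part of the foreground window.
    static void showBackgroundInterface(List<String> backgroundApps) {
        backgroundInterfaceShown = true;
        System.out.println("background interface over " + foregroundApp + ": " + backgroundApps);
    }

    // Operations 230-240: a tap on a background window grants it foreground privilege temporarily.
    static void selectBackgroundWindow(String app) {
        if (!backgroundInterfaceShown) return;
        System.out.println("temporary foreground privilege -> " + app + " (window updated)");
    }

    // Operations 270/280: dismissing the interface restores the original foreground app.
    static void dismissBackgroundInterface() {
        backgroundInterfaceShown = false;
        System.out.println("foreground privilege restored to " + foregroundApp);
    }

    public static void main(String[] args) {
        // Operation 210: e.g. a pinch-in gesture on the foreground window is detected.
        showBackgroundInterface(List.of("Messenger", "Browser", "Music"));
        selectBackgroundWindow("Messenger");   // e.g. a tap on the messenger window
        dismissBackgroundInterface();          // e.g. a tap on the foreground window
    }
}
```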

FIG. 3 is a flowchart for explaining another example of a procedure for temporarily granting foreground authority to a background application and performing an operation.

Referring to FIG. 3, in operation 310, the control unit 160 checks whether a user input requesting display of a background application window is detected. For example, an indicator associated with a background application may be displayed on the screen along with the foreground application window: when a message, update information, notice, or the like is received from the outside through the wireless communication unit 130, an indicator indicating the corresponding background application may be displayed on the screen. The user input can be a tap on such an indicator.

If a user input requesting display of the background app window is detected in operation 310, the controller 160 updates one of the background application windows in operation 320. Here, the window to be updated may be the window of the background application corresponding to the indicator selected by the user.

In operation 330, the control unit 160 may control the display unit 110 to display the updated background application window on a part of the foreground application window.

In operation 340, the control unit 160 checks whether a user input requesting execution of a function is detected. If a user input requesting execution of a function is detected in operation 340, the controller 160 performs the requested function in operation 350. After operation 350 is performed, the process may return to operation 340. If no user input requesting execution of a function is detected in operation 340, the process may proceed to operation 360.

In operation 360, the control unit 160 checks whether a user input requesting termination of the background application window is detected. For example, when the user taps the foreground application window, the display of the background application window is terminated and the process can be ended; alternatively, the process may return to operation 310. The control unit 160 also returns the foreground permission to the foreground application. If no user input requesting termination of the background application window is detected in operation 360, the process may return to operation 340.

FIG. 4 is a flowchart for explaining an example of a procedure for replacing the foreground application.

Referring to FIG. 4, in operation 410, the controller 160 checks whether a user input requesting display of a background app window is detected. If a user input requesting display of the background app window is detected in operation 410, the controller 160 controls the display unit 110 to display a background interface on a part of the foreground application window in operation 420. Meanwhile, when a user input requesting a change of the background application window is detected, the control unit 160 may control the display unit 110 to display another background application window in response to the user input. For example, when a flick or drag occurs in the background interface, the window of application A disappears and the window of application B can be displayed. In addition, the controller 160 may temporarily grant foreground privileges to any one of the displayed background app windows in response to the user's request.

In operation 430, the control unit 160 checks whether a user input selecting a background application window in the background interface is detected. Here, the user input may be a double tap on that window. The user input may also be a voice command event input via the microphone 142 or the microphone of the earphone 144.

If a user input selecting a background app window is detected in operation 430, the controller 160 newly sets the application of the selected window as the foreground application in operation 440. The control unit 160 may control the display unit 110 to display the window of the new foreground application on the screen. When operation 440 is completed, the process can be ended; alternatively, the process may return to operation 410.

If no user input selecting a background app window is detected in operation 430, the process may proceed to operation 450.

In operation 450, the control unit 160 checks whether a user input requesting termination of the background interface is detected. If a user input requesting termination of the background interface is detected, the display of the background interface is terminated and the process can be terminated. Alternatively, the process may return to operation 410. If no user input requesting termination of the background interface is detected, the process may return to operation 430.

FIGS. 5A, 5B, 5C, and 5D are screens for explaining an example of a procedure for interacting with a message application. Here, the display may operate in a portrait mode.

Referring to FIG. 5A, the window of application A can be displayed on the screen as the foreground application window. Referring to FIG. 5B, when a user input requesting display of the background application window occurs while application A is being displayed, the window 520 of application C can be displayed on a part of the window of application A. Further, the window of application A may be displayed blurred. In addition, the window 510 of application B and the window 530 of application D can be partially displayed on the left and right sides of the screen. Application C is a messaging application, and its window 520 can be selected (e.g., tapped) by the user. The control unit 160 may then temporarily grant foreground authority to application C in response to the selection. If the input window 521 of application C is selected, the control unit 160 may control the display unit 110 to display a keypad on a part of the window, and a message input via the keypad can be displayed in the input window 521. If transmission of the message is selected, the controller 160 may control the wireless communication unit 130 to transmit the message displayed in the input window 521 to the device of the other party in the chat. Referring to FIG. 5C, the controller 160 may control the display unit 110 to display the transmitted message 522. Referring to FIG. 5D, the window 520 of application C may be selected (e.g., double-tapped) by the user. Application C can then be set as the foreground application; accordingly, the window 520 of application C can be displayed on the entire screen as the foreground application window, and application A is set as a background application.

FIGS. 6A and 6B are screens for explaining an example of a procedure for interacting with a plurality of applications. Here, the display may operate in a landscape mode.

Referring to FIG. 6A, the window of application A can be displayed on the screen as the foreground application window. When a user input requesting display of the background application window occurs while application A is being displayed, a window 610 of application B and a window 620 of application C can be displayed on a part of the window of application A. The displayed background application windows 610 and 620 may temporarily be granted foreground rights. Referring to FIG. 6B, information may be exchanged between background applications. For example, the user may touch the message 621 in the window 620 of application C with the pointing device, move the pointing device to the window 610 of application B, and then release the touch. In response to this drag-and-drop operation, the control unit 160 copies the message 621, stores it in a memory (e.g., a clipboard), and pastes it into the window 610 of application B.
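A minimal sketch of this drag-and-drop exchange is shown below, assuming a simple clipboard-like buffer between the two background windows. The classes and method names are illustrative assumptions only and do not reflect the patent's actual implementation.

```java
import java.util.ArrayList;
import java.util.List;

public class DragDropSketch {

    static class AppWindow {
        final String app;
        final List<String> messages = new ArrayList<>();
        AppWindow(String app) { this.app = app; }
    }

    private static String clipboard;   // temporary store for the dragged content

    static void dragAndDrop(AppWindow source, int messageIndex, AppWindow target) {
        clipboard = source.messages.get(messageIndex);  // copy on touch-and-move
        target.messages.add(clipboard);                 // paste on touch release
    }

    public static void main(String[] args) {
        AppWindow appC = new AppWindow("C");   // e.g. window 620
        AppWindow appB = new AppWindow("B");   // e.g. window 610
        appC.messages.add("See you at 7?");

        dragAndDrop(appC, 0, appB);
        System.out.println("window of app B now shows: " + appB.messages);
    }
}
```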

FIG. 7 is a flowchart for explaining an example of a procedure in which the window of a background application to which foreground authority is temporarily granted is updated.

Referring to FIG. 7, in operation 710, the task manager 153 recognizes touch coordinates in the background application window. At this time, the background app window is displayed on a part of the foreground app window and is therefore displayed smaller than its originally set size. Accordingly, in operation 720, the task manager 153 converts the touch coordinates by referring to the reduction rate of the background application window; that is, the recognized touch coordinates are converted to fit the originally set size of the window. In operation 730, the task manager 153 transfers the converted touch coordinates to the window resource manager 152. In operation 740, the window resource manager 152 transmits the converted touch coordinates to the background application 151. In operation 750, the background application 151 updates its window using the converted touch coordinates; for example, when the converted touch coordinates correspond to a request to display the keypad, the background application 151 includes a keypad in the window. In operation 760, the background application 151 delivers a window update event to the window resource manager 152. The window update event includes the updated window; if the operating system is Linux, the window update event may be referred to as a damage event. In operation 770, the window resource manager 152 forwards the window update event to the task manager 153. In operation 780, the task manager 153 receives the updated window (i.e., the background application window) from the window resource manager 152, reduces the updated window with reference to the reduction ratio, and displays it on the screen.
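The coordinate conversion of operations 710-730 can be illustrated with a minimal sketch: a touch on the reduced background window is scaled back to the window's original coordinate space before being handed to the background app. The sketch assumes the touch coordinates are already relative to the top-left corner of the reduced window; the class and method names are illustrative only.

```java
public class CoordinateConversionSketch {

    // Convert a touch point on the reduced window into original-size coordinates.
    // reductionRate is the displayed size divided by the original size, e.g. 0.25.
    static int[] toOriginal(int touchX, int touchY, double reductionRate) {
        int originalX = (int) Math.round(touchX / reductionRate);
        int originalY = (int) Math.round(touchY / reductionRate);
        return new int[] { originalX, originalY };
    }

    public static void main(String[] args) {
        // A 1080x1920 window shown at one quarter of its size (270x480).
        double reductionRate = 0.25;
        int[] converted = toOriginal(135, 240, reductionRate);
        System.out.println("converted touch: (" + converted[0] + ", " + converted[1] + ")");
        // -> (540, 960): the centre of the reduced window maps to the centre of the original.
    }
}
```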

The method according to the present disclosure as described above can be recorded on a computer-readable recording medium implemented with program instructions that can be executed through various computers. The recording medium may include program commands, data files, data structures, and the like. The program instructions may be those specially designed and constructed for the present invention, or may be those known and available to those skilled in computer software. The recording medium may include a magnetic medium such as a hard disk, a floppy disk, or a magnetic tape, an optical medium such as a CD-ROM or a DVD, a magneto-optical medium such as a floptical disk, and a hardware device such as a ROM, a RAM, or a flash memory. The program instructions may include not only machine language code such as that generated by a compiler, but also high-level language code that can be executed by the computer using an interpreter or the like.

The method and apparatus according to the present disclosure are not limited to the above-described embodiments, and can be variously modified and practiced within the scope of the technical idea of the present disclosure.

100: Electronic device
110: Display portion 111: Touch panel
120: key input unit 130: wireless communication unit
140: audio processor 150: memory
151_1 ~ 151_N: Applications
152: Window resource manager 153: Task manager
153a: Touch event handler 153b: Window event handler
153c: Task display module
160: controller 161: touch screen controller
162: Application processor

Claims (14)

A method of operating an electronic device,
displaying a window of a foreground application;
displaying at least one background application window in a portion of the foreground application window;
detecting a user input selecting one of the at least one background application window; and
updating the selected window by granting a foreground privilege to the background application of the selected window.
The method according to claim 1,
further comprising: in response to a user input requesting termination of the background application window, terminating the display of the at least one background application window and re-granting foreground privilege to the foreground application.
The method according to claim 1,
further comprising: detecting a second user input selecting one of the at least one background application window; and
setting the background application of the window selected by the second user input as the foreground application.
The method according to claim 1,
further comprising: displaying another background application window in a portion of the foreground application window in response to a user input requesting a change of the background application window.
The method according to claim 1,
wherein the electronic device has a touch screen,
the method further comprising displaying information of a first background application window in a second background application window in response to a touch gesture of a pointing device with respect to the touch screen.
6. The method of claim 5,
further comprising simultaneously displaying the first background application window and the second background application window in a portion of the foreground application window.
An electronic device comprising:
a display unit for displaying a window of an application;
an input unit for detecting user input;
a task manager configured to perform an operation of displaying a window of a foreground application, an operation of displaying at least one background application window in a portion of the foreground application window, an operation of detecting a user input selecting one of the at least one background application window, and an operation of granting foreground authority to the background application of the selected window to update the selected window; and
at least one processor for executing the task manager.
8. The electronic device of claim 7,
wherein the task manager is further configured to perform an operation of terminating the display of the at least one background application window and re-granting foreground permission to the foreground application in response to a user input requesting termination of the background application window.
8. The electronic device of claim 7,
wherein the task manager is further configured to perform an operation of detecting a second user input selecting one of the at least one background application window and an operation of setting the background application of the window selected by the second user input as the foreground application.
8. The electronic device of claim 7,
wherein the task manager is further configured to display another background application window in a part of the foreground application window in response to a user input requesting a change of the background application window.
8. The electronic device of claim 7,
wherein the input unit includes a touch panel provided on the display unit, and
the task manager is further configured to display information of a first background application window in a second background application window in response to a touch gesture of a pointing device with respect to the touch screen of the display unit.
12. The electronic device of claim 11,
wherein the task manager is further configured to simultaneously display the first background application window and the second background application window in a part of the foreground application window.
8. The electronic device of claim 7,
wherein the task manager is further configured to perform an operation of recognizing touch coordinates in a background application window displayed smaller than its originally set size, an operation of converting the recognized touch coordinates to the original size, an operation of transmitting the converted touch coordinates to the corresponding background application, and an operation of receiving the updated window from the background application and displaying it.
8. The electronic device of claim 7,
wherein the at least one processor comprises an application processor.
KR20130124868A 2013-05-21 2013-10-18 Application operating method and electronic device implementing the same KR20140136854A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/283,986 US20140351729A1 (en) 2013-05-21 2014-05-21 Method of operating application and electronic device implementing the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361825725P 2013-05-21 2013-05-21
US61/825,725 2013-05-21

Publications (1)

Publication Number Publication Date
KR20140136854A true KR20140136854A (en) 2014-12-01

Family

ID=52456979

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130124868A KR20140136854A (en) 2013-05-21 2013-10-18 Application operating method and electronic device implementing the same

Country Status (1)

Country Link
KR (1) KR20140136854A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102022920B1 (en) 2019-06-25 2019-09-19 주식회사 태성 Roll-to-roll Horizontal Continuous Plating Equipment
CN114518924A (en) * 2022-01-29 2022-05-20 苏州达家迎信息技术有限公司 Page display method, device, equipment and storage medium for mobile client
CN114518924B (en) * 2022-01-29 2024-02-02 苏州达家迎信息技术有限公司 Page display method, device and equipment of mobile client and storage medium

Similar Documents

Publication Publication Date Title
US20140351729A1 (en) Method of operating application and electronic device implementing the same
KR102032449B1 (en) Method for displaying image and mobile terminal
EP2778881B1 (en) Multi-input control method and system, and electronic device supporting the same
US20150012881A1 (en) Method for controlling chat window and electronic device implementing the same
KR102044826B1 (en) Method for providing function of mouse and terminal implementing the same
KR102010955B1 (en) Method for controlling preview of picture taken in camera and mobile terminal implementing the same
KR102064952B1 (en) Electronic device for operating application using received data
US20130106700A1 (en) Electronic apparatus and input method
WO2014109502A1 (en) Touch event processing method and portable device implementing the same
CN103677711A (en) Method for connecting mobile terminal and external display and apparatus implementing the same
EP2746924B1 (en) Touch input method and mobile terminal
KR20130133980A (en) Method and apparatus for moving object in terminal having touchscreen
US20140240257A1 (en) Electronic device having touch-sensitive user interface and related operating method
KR102095912B1 (en) Operating Method of Secure Indicator and Electronic Device supporting the same
KR20140019530A (en) Method for providing user's interaction using mutil touch finger gesture
US20150128031A1 (en) Contents display method and electronic device implementing the same
KR20140034100A (en) Operating method associated with connected electronic device with external display device and electronic device supporting the same
US20140164186A1 (en) Method for providing application information and mobile terminal thereof
KR20150002312A (en) Page display method and electronic device implementing the same
KR102030669B1 (en) Login management method and mobile terminal for implementing the same
KR20140105354A (en) Electronic device including a touch-sensitive user interface
KR102015349B1 (en) Call switching method and mobile terminal
KR20140136854A (en) Application operating method and electronic device implementing the same
KR20140032851A (en) Touch input processing method and mobile device
KR102076193B1 (en) Method for displaying image and mobile terminal

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination