KR101957173B1 - Method and apparatus for providing multi-window at a touch device - Google Patents

Method and apparatus for providing multi-window at a touch device

Info

Publication number
KR101957173B1
Authority
KR
South Korea
Prior art keywords
window
application
screen
execution
multi
Prior art date
Application number
KR1020120105898A
Other languages
Korean (ko)
Other versions
KR20140039575A (en)
Inventor
황대식
정혜순
김정훈
이동준
오종화
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사 filed Critical 삼성전자 주식회사
Priority to KR1020120105898A
Publication of KR20140039575A
Application granted
Publication of KR101957173B1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques using icons
    • G06F 3/0484 - Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/0486 - Drag-and-drop
    • G06F 3/0487 - Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04803 - Split screen, i.e. subdividing the display area or the window area into separate subareas

Abstract

The present invention relates to multi-window support in a touch device, in which one screen is divided into at least two windows according to a predetermined division method so that a plurality of applications can be used more efficiently on a single screen. A method of executing applications in a touch device comprises: displaying an execution screen of a first application on a full screen; receiving an execution event input for executing a second application; configuring a multi-window according to the predetermined division method when the execution event is released in a specific window; and displaying the screens of the first application and the second application independently through the respective divided windows.

Description

METHOD AND APPARATUS FOR PROVIDING MULTI-WINDOW AT A TOUCH DEVICE

The present invention relates to a method and apparatus for operating functions of a touch device, and more particularly, to a method and apparatus for providing a multi-window in a touch device so that a plurality of applications can be used more effectively through multiple divided windows.

With the recent development of digital technology, portable terminals capable of mobile communication and personal information processing, such as mobile communication terminals, PDAs (Personal Digital Assistants), electronic notebooks, smart phones, and tablet PCs, have been released in various forms. Such mobile terminals have reached a stage of mobile convergence in which they no longer stay within their traditional domains but cover the functions of other terminals. Typically, a mobile terminal provides communication functions such as voice and video calls, message transmission and reception functions such as SMS (Short Message Service), MMS (Multimedia Message Service), and e-mail, as well as music playback, Internet, messenger, and social networking service (SNS) functions.

Meanwhile, because a portable terminal has a narrow screen, it is typically configured to show only one application view at a time, and only a few exceptional applications are displayed in a fixed pop-up form. Therefore, even when a plurality of applications are executed simultaneously, only the one application view selected by the user is provided on the current screen. That is, the related art has the problem that a plurality of applications cannot be used effectively.

It is an object of the present invention to provide a method and an apparatus for providing a multi-window in a touch device capable of realizing a multi-window environment composed of at least two divided windows in a single system of a touch device.

It is another object of the present invention to provide a method and apparatus for supporting a multi-window environment in a touch device that maximizes the usability of the touch device by allowing a plurality of applications to be easily arranged and executed through a screen divided into at least two windows.

It is still another object of the present invention to provide a method and apparatus for supporting a multi-window environment in a touch device that enables simpler layout changes and more convenient user operation when working with a plurality of applications in a multi-window environment.

It is another object of the present invention to provide a method and apparatus for supporting a multi-window environment in a touch device that minimizes the burden and hassle of user operation in a multi-window environment and enables free adjustment of the windows for a plurality of applications.

It is another object of the present invention to provide a method and apparatus for supporting a multi-window environment in a touch device that can support a user with more information and various user experiences by implementing a multi-window environment in a touch device.

It is another object of the present invention to provide a method and apparatus for supporting a multi-window environment capable of improving user convenience and usability of a touch device by implementing an optimal environment for supporting a multi-window environment in a touch device.

According to another aspect of the present invention, there is provided a method for executing an application in a touch device, the method comprising: displaying an execution screen of a first application on a full screen; Receiving an input of an execution event for executing a second application; Configuring a multi-window according to a predetermined partitioning method when the execution event is released in a specific window; And displaying the screen of the first application and the second application independently through each of the divided windows.
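The claimed flow above (a first application shown full screen, an execution event for a second application, and a split configured when the event is released in a specific window) can be sketched as a small model. This is an illustrative sketch only, not the patent's implementation: the class and method names, the vertical split, and the fractional window coordinates are all assumptions.

```python
# Hypothetical model of the claimed method. Windows are (x, y, w, h) tuples
# expressed as fractions of the full screen; the split method is predefined.

class MultiWindowModel:
    def __init__(self):
        self.windows = []                 # list of (app, region) pairs
        self.split_method = "vertical"    # the predetermined division method

    def show_full_screen(self, app):
        """Display the first application's execution screen on the full screen."""
        self.windows = [(app, (0.0, 0.0, 1.0, 1.0))]

    def release_execution_event(self, app, drop_x):
        """Configure the multi-window when the execution event is released."""
        first_app, _ = self.windows[0]
        if self.split_method == "vertical":
            left = (0.0, 0.0, 0.5, 1.0)
            right = (0.5, 0.0, 0.5, 1.0)
            # the second app takes the half where the event was released
            if drop_x < 0.5:
                self.windows = [(app, left), (first_app, right)]
            else:
                self.windows = [(first_app, left), (app, right)]

model = MultiWindowModel()
model.show_full_screen("video")
model.release_execution_event("messenger", drop_x=0.8)
print(model.windows)
```

After the release, each application's screen is held in its own divided window, matching the independent display step of the claim.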

According to another aspect of the present invention, there is provided a method of executing an application in a touch device, the method comprising: executing a first application corresponding to a user selection and displaying the first application on a full screen through one window; Receiving a first event input for selecting and moving a second application while the first application is running; Determining a predetermined multi-window division method and an area where the first event is input; Outputting a feedback on a window in which the second application is executable corresponding to the division method and the area where the first event is input; Receiving a second event input for executing the second application; Configuring a multi-window in response to the second event input; And displaying the screen of the first application and the screen of the second application independently through corresponding windows separated by the multi-window.
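The feedback step described above, indicating which divided window would host the dragged second application, might be realized as a simple hit test against the predetermined division method. The function name, the normalized coordinates, and the two split methods shown here are hypothetical, not taken from the patent.

```python
# Hedged sketch: map the current drag position to the candidate window,
# so the UI can highlight where the second application would be placed.

def feedback_window(split_method, x, y):
    """Return the candidate window for the dragged app (for highlight feedback)."""
    if split_method == "vertical":
        return "left" if x < 0.5 else "right"
    if split_method == "horizontal":
        return "top" if y < 0.5 else "bottom"
    raise ValueError(f"unknown split method: {split_method}")

assert feedback_window("vertical", 0.8, 0.3) == "right"
```

The second event (the drop) would then finalize the layout using the window this function reported.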

According to another aspect of the present invention, there is provided a method for executing an application in a touch device, the method comprising: displaying an execution screen of a first application on a full screen; Sliding the tray including the execution icon of the application according to a user input while the first application is running; Selecting an execution icon of the second application in the tray and receiving an input to drag the icon into the entire screen; Receiving an input that the execution icon drops in a specific window during dragging; Executing the second application in response to a drop input of the execution icon; Dividing the entire screen into windows for screen display of the first application and the second application; And displaying a screen of the second application through the specific window in which the execution icon is dropped, and displaying a screen of the first application through another divided window.
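The tray-based sequence above (slide the tray out, drag an icon from it, drop it in a window, then divide the screen) can be modeled as a small state machine. All class, method, and window names here are invented for illustration; the patent does not prescribe this structure.

```python
# Illustrative state machine for the tray drag-and-drop flow.

class TraySession:
    def __init__(self, running_app):
        self.running_app = running_app    # first app, currently full screen
        self.tray_open = False
        self.dragging = None
        self.layout = {"full": running_app}

    def slide_tray(self):
        """Slide in the tray holding the application execution icons."""
        self.tray_open = True

    def drag_icon(self, app):
        """Pick up the second application's icon from the tray."""
        if not self.tray_open:
            raise RuntimeError("tray must be open before dragging")
        self.dragging = app

    def drop(self, window):
        """Drop in 'left' or 'right': the dropped app owns that window,
        and the first app moves to the other divided window."""
        other = "left" if window == "right" else "right"
        self.layout = {window: self.dragging, other: self.running_app}
        self.dragging = None

s = TraySession("browser")
s.slide_tray()
s.drag_icon("memo")
s.drop("left")
print(s.layout)
```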

According to an embodiment of the present invention, there is provided a computer-readable recording medium having recorded thereon a program for causing a processor to execute the method.

According to an aspect of the present invention, there is provided an apparatus for displaying a screen interface according to a multi-window environment, the apparatus comprising: a touch screen that displays a plurality of applications through a plurality of windows and receives event inputs for operating the plurality of applications; and a control unit that controls the execution of the plurality of applications in the multi-window environment and controls at least two applications selected by the user from among the plurality of applications to be displayed independently through the plurality of windows.

According to an aspect of the present invention, there is provided a recording medium storing a program that, in a state where an execution screen of a first application is displayed on a full screen, receives an input of an execution event for executing a second application, configures a multi-window according to a predetermined division method when the execution event is released in a specific window, and processes the screens of the first application and the second application so that they are displayed independently through the divided windows.

The foregoing is a somewhat broad summary of the features and technical advantages of the present invention so that those skilled in the art may better understand the detailed description that follows. Additional features and advantages of the present invention, which form the subject matter of the claims, will be better understood from the following detailed description of the invention.

As described above, according to the method and apparatus for providing a multi-window in a touch device proposed in the present invention, a user can use a plurality of applications simultaneously in a predetermined split screen or free-style layout in a simple manner. For example, if the user wants to split the screen and use a multi-window while an application is running in full screen, the user can drag an additional application from the tray and drop it at a predetermined or freely chosen position, so that a plurality of applications can be operated simultaneously.

Further, according to the present invention, a user can easily arrange and check a plurality of applications on a single screen through the multi-window, can freely change each of the multi-window's windows to a desired layout, and is thereby relieved of the burden and hassle of operating a plurality of applications.

According to the present invention, a multi-window environment can provide the user with more information and various user experiences. In addition, the user can use the multi-window environment to work on various applications simultaneously even on the narrow screen of a touch device. For example, the user can watch a video on one part of the screen of the touch device while simultaneously performing other tasks such as writing a message or composing an e-mail.

Therefore, according to the present invention, by providing an optimal environment for supporting a multi-window in a touch device, it is possible to improve user convenience as well as the usability and competitiveness of the touch device. The present invention can be easily implemented in all types of touch devices and various corresponding devices.

FIG. 1 is a view schematically showing a configuration of a touch device according to an embodiment of the present invention.
FIG. 2 is a view schematically illustrating a screen interface of a touch device according to an embodiment of the present invention.
FIG. 3 is a view schematically illustrating an operation of operating a multi-window in a touch device according to an embodiment of the present invention.
FIG. 4 is a view schematically illustrating an operation of dividing a multi-window in a touch device according to an embodiment of the present invention.
FIGS. 5 to 12 are views illustrating examples of operation screens for operating a tray for quick execution of an application in a multi-window environment according to an embodiment of the present invention.
FIGS. 13 to 17 are views illustrating examples of operation screens for operating a plurality of applications in a multi-window environment according to an embodiment of the present invention.
FIGS. 18 to 23 are views illustrating an operation example of operating a plurality of applications in a multi-window environment according to an embodiment of the present invention.
FIGS. 24 to 29 are views illustrating an operation example of operating a keypad for text input in a multi-window environment according to an embodiment of the present invention.
FIG. 30 is a view illustrating an operation example of operating a plurality of applications in a multi-window environment according to an embodiment of the present invention.
FIGS. 31 to 34 are views illustrating examples of operation screens for providing information on a plurality of applications executed in a multi-window environment in a touch device according to an embodiment of the present invention.
FIG. 35 is a flowchart illustrating a method of switching to a multi-window environment and executing an additional application in a touch device according to an embodiment of the present invention.
FIG. 36 is a flowchart illustrating a method of executing an additional application in a multi-window environment in a touch device according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention are described in detail with reference to the accompanying drawings. In the drawings, the same components are denoted by the same reference numerals wherever possible. Detailed descriptions of well-known functions and constructions that may obscure the gist of the present invention are omitted. In other words, only the parts necessary for understanding the operation according to the embodiments of the present invention are described, and descriptions of other parts are omitted so as not to obscure the gist of the present invention.

The present invention relates to a method and apparatus for providing a multi-window in a touch device, in which the screen of the touch device is divided into at least two windows according to a predefined division method so that a plurality of applications can be used more efficiently on a single screen.

According to an embodiment of the present invention, when an additional application is selected and dragged in the touch device, the predetermined screen division method is determined, and feedback may be provided to the user indicating the corresponding window, among the windows divided within the single screen, in which the additional application can be executed. This informs the user where the additional application to be executed will be located. According to an embodiment of the present invention, when the additional application is executed at the position selected by the user, the screen of the application can be displayed fitted to the size of that window.

Hereinafter, a configuration of a touch device and an operation control method according to an embodiment of the present invention are described with reference to the following drawings. It should be noted that the configuration of the touch device and its operation control method are not limited to the following description and can be applied to various embodiments based on the embodiments described below.

FIG. 1 is a view schematically showing a configuration of a touch device according to an embodiment of the present invention.

Referring to FIG. 1, the touch device of the present invention includes a wireless communication unit 110, a user input unit 120, a display unit 130, an audio processing unit 140, a storage unit 150, an interface unit 160, a controller 170, and a power supply unit 180. The touch device of the present invention is not limited to the components shown in FIG. 1 and may be implemented with more or fewer components.

The wireless communication unit 110 may include one or more modules enabling wireless communication between the touch device and a wireless communication system, or between the touch device and a network in which another device is located. For example, the wireless communication unit 110 may include a mobile communication module 111, a wireless local area network (WLAN) module 113, a short-range communication module 115, a position calculation module 117, a broadcast receiving module 119, and the like.

The mobile communication module 111 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and various servers (e.g., an integration server, a provider server, a content server, etc.) on a mobile communication network. The wireless signals may include voice call signals, video call signals, and various types of data for transmitting and receiving text/multimedia messages. The mobile communication module 111 may be connected to at least one of the various servers under the control of the controller 170 to receive an application available in the touch device according to a user's selection.

The wireless LAN module 113 is a module for wireless Internet access and for forming a wireless LAN link with another touch device, and may be built into or externally attached to the touch device. Wireless Internet technologies such as Wi-Fi, WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) can be used. The wireless LAN module 113 can be connected to at least one of the various servers under the control of the controller 170 to receive an application available in the touch device according to a user's selection. In addition, when a wireless LAN link with another touch device is formed, the wireless LAN module 113 may transmit an application selected by the user to, or receive one from, the other touch device.

The short-range communication module 115 is a module for short-range communication. Bluetooth, Bluetooth Low Energy (BLE), Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and NFC (Near Field Communication) can be used as short-range communication technologies. Also, when short-range communication with another touch device is connected, the short-range communication module 115 may transmit an application selected by the user to, or receive one from, the other touch device.

The position calculation module 117 is a module for acquiring the position of the touch device; a representative example is a Global Positioning System (GPS) module. The position calculation module 117 calculates distance information and accurate time information from three or more base stations and then applies trigonometry to the calculated information to compute the current three-dimensional position in terms of latitude, longitude, and altitude. Alternatively, the position calculation module 117 can calculate position information by continuously receiving the current position of the touch device from three or more satellites in real time. The position information of the touch device can be obtained by various methods.
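The trigonometric position calculation mentioned above can be illustrated with a simplified 2D trilateration: given three anchor points at known positions and measured distances to each, the two circle-difference equations are solved for (x, y). This is a sketch only; real GPS positioning works in 3D and must also solve for receiver clock bias.

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) from three anchors and measured distances (2D sketch)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise yields two linear equations.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1          # non-zero when anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Anchors at (0,0), (4,0), (0,4); distances measured from the point (1, 1):
pos = trilaterate((0, 0), math.sqrt(2), (4, 0), math.sqrt(10), (0, 4), math.sqrt(10))
print(pos)
```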

The broadcast receiving module 119 receives a broadcast signal (e.g., a TV broadcast signal, a radio broadcast signal, a data broadcast signal, etc.) and/or broadcast-related information (e.g., information related to a broadcast channel, a broadcast program, or a broadcast service provider) from an external broadcast management server through a broadcast channel (e.g., a satellite channel or a terrestrial channel).

The user input unit 120 generates input data for the user to control the operation of the touch device. The user input unit 120 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like. The user input unit 120 may be implemented as buttons on the outside of the touch device, and some buttons may be implemented as a touch panel.

The display unit 130 displays (outputs) information processed in the touch device. For example, when the touch device is in a call mode, a user interface (UI) or graphical user interface (GUI) related to the call is displayed. When the touch device is in a video call mode or a shooting mode, the display unit 130 displays a captured and/or received image, or the corresponding UI and GUI. In the present invention, the display unit 130 displays execution screens for the various functions (or applications) executed in the touch device through one or more windows. In particular, the display unit 130 provides at least two divided screen regions according to a predetermined division method, each of which serves as one window. That is, the display unit 130 supports screen display corresponding to a multi-window environment and displays execution screens for a plurality of applications through the multi-windows, i.e., the divided regions. At this time, the display unit 130 can simultaneously display the screen of one window and the screen of another window side by side. The display unit 130 may also display a separator that separates the windows of the multi-window (i.e., the divided regions), a tray (or application launcher) containing execution icons of applications, and a floating keypad (or touch keypad) that can be moved freely within the entire screen area of the display unit 130. The display unit 130 receives user input on the full screen or on an individual window screen and transmits an input signal corresponding to the user input to the controller 170. The display unit 130 also supports switching between landscape-mode display and portrait-mode display according to the rotation direction (or orientation) of the touch device. Examples of screens on which the display unit 130 operates in the present invention are described below.
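Since the display unit supports both landscape and portrait display of the divided regions, the window geometry must be recomputed on rotation. The sketch below assumes, purely for illustration, that the split always runs along the longer screen axis; the pixel sizes and the splitting rule are not taken from the patent.

```python
# Hypothetical sketch: recompute the two multi-window regions for the
# current screen dimensions. Each region is an (x, y, w, h) rectangle.

def split_regions(width, height):
    """Return two window rects split along the longer axis of the screen."""
    if width >= height:                     # landscape: side-by-side windows
        half = width // 2
        return [(0, 0, half, height), (half, 0, width - half, height)]
    half = height // 2                      # portrait: stacked windows
    return [(0, 0, width, half), (0, half, width, height - half)]

print(split_regions(1280, 800))   # landscape layout
print(split_regions(800, 1280))   # portrait layout after rotation
```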

The display unit 130 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, an active matrix OLED (AMOLED) display, a flexible display, a bended display, and a 3D display. Some of these displays may be implemented as transparent or light-transmissive displays through which the outside can be seen.

In addition, when the display unit 130 and a touch panel that detects touch operations form a layered structure (hereinafter referred to as a 'touch screen'), the display unit 130 may be used as an input device in addition to an output device. The touch panel may be configured to convert a change in the pressure applied to a specific portion of the display unit 130, or in the capacitance generated at a specific portion of the display unit 130, into an electrical input signal. The touch panel can be configured to detect not only the touched position and area but also the pressure of the touch. When there is a touch input to the touch panel, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and then transmits the corresponding data to the controller 170. Thus, the controller 170 can determine which area of the display unit 130 has been touched.
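The touch-panel path above (a pressure or capacitance change at a panel location converted into an electrical input signal for the controller) can be modeled as a thresholding step followed by a coordinate mapping. The threshold value, the cell-to-pixel scale, and the function name are all invented for illustration.

```python
# Hedged model of the touch panel to controller path.

TOUCH_THRESHOLD = 30          # minimum capacitance delta counted as a touch (assumed)
CELL_SIZE_PX = 40             # pixels covered by one sensor cell (assumed)

def to_touch_event(cell_row, cell_col, delta):
    """Convert a raw sensor-cell reading into (x, y, pressure), or None for noise."""
    if delta < TOUCH_THRESHOLD:
        return None                                   # below threshold: not a touch
    x = cell_col * CELL_SIZE_PX + CELL_SIZE_PX // 2   # cell centre in screen pixels
    y = cell_row * CELL_SIZE_PX + CELL_SIZE_PX // 2
    return (x, y, delta)

print(to_touch_event(2, 5, 80))   # a real touch with its position and pressure
print(to_touch_event(2, 5, 10))   # below threshold, ignored
```

The controller would receive the resulting (x, y, pressure) data and determine which window of the display was touched.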

The audio processing unit 140 transmits audio signals received from the controller 170 to the speaker 141 and transmits audio signals, such as voice, received from the microphone 143 to the controller 170. Under the control of the controller 170, the audio processing unit 140 converts voice/sound data into audible sound output through the speaker 141, and converts audio signals such as voice received from the microphone 143 into digital signals transmitted to the controller 170.

The speaker 141 can output audio data received from the wireless communication unit 110 or stored in the storage unit 150 in a call mode, a recording mode, a media content playback mode, a shooting mode, and a multi-window mode. The speaker 141 may also output sound signals related to functions performed in the touch device (e.g., receiving or originating a call connection, music file playback, video file playback, external output, etc.).

The microphone 143 receives external sound signals in a call mode, a recording mode, a voice recognition mode, and a shooting mode and processes them into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 111 and then output. Various noise reduction algorithms may be implemented in the microphone 143 to remove noise generated while receiving external sound signals.

The storage unit 150 may store programs for the processing and control of the controller 170, and may temporarily store input/output data (e.g., telephone numbers, messages, audio, media contents such as music and video files, applications, and the like). The storage unit 150 may store the usage frequency (e.g., application usage frequency, media content playback frequency, and usage frequency of phone numbers, messages, and multimedia), importance, priority, and preference associated with each touch device function. The storage unit 150 may also store data on vibrations and sounds of various patterns output when a touch is input on the touch screen. In particular, the storage unit 150 may store division information for the screen division method used for multi-window operation, information on the applications registered in the tray, information on the applications executed in the multi-window, and the like.

The storage unit 150 may include at least one of a flash memory type, a hard disk type, a micro type, and a card type (for example, an SD card or an XD card) memory, a RAM (Random Access Memory), an SRAM (Static RAM), a ROM (Read-Only Memory), a PROM (Programmable ROM), an EEPROM (Electrically Erasable PROM), an MRAM (Magnetic RAM), a magnetic disk type, and an optical disk type of memory. The touch device may also operate in association with a web storage that performs the storage function of the storage unit 150 on the Internet.

The interface unit 160 serves as a path for communication with all external devices connected to the touch device. The interface unit 160 receives data or power from an external device and transfers it to each component in the touch device, or transmits data in the touch device to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output port, a video input/output port, an earphone port, and the like may be included in the interface unit 160. That is, the interface unit 160 includes interfaces for wired or wireless connection to external devices.

The controller 170 controls the overall operation of the touch device. For example, it performs control related to voice communication, data communication, video communication, application operation according to multi-window environment, and the like. The control unit 170 may have a separate multimedia module (not shown) for multi-window function operation. In the present invention, the multimedia module (not shown) may be implemented in the controller 170 or separately from the controller 170.

In particular, the controller 170 controls a series of operations for supporting the multi-window function according to an embodiment of the present invention. For example, the control unit 170 controls the execution of a plurality of applications in the multi-window environment, and controls the screens of at least two applications, selected by the user from among the plurality of running applications, to be displayed independently through the respective windows.

For example, the control unit 170 may receive an input of an execution event for executing a second application while displaying the execution screen of a first application on the full screen. The control unit 170 can control feedback output to the window at the moved position while the execution event is moved without being released. When the execution event is released in the specific window to which it has been moved, the control unit 170 forms a multi-window according to a predefined partitioning method and controls the screens of the first application and the second application to be displayed through the divided, independent windows.
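The drag-and-release behavior described above can be modeled roughly as follows. This is a minimal illustrative sketch: the function name, the fixed 50/50 partition, and the application identifiers are assumptions for illustration, not details taken from the patent.

```python
# Hypothetical sketch of splitting a full-screen window when an execution
# event (a dragged icon) is released; names and the 50/50 split are assumed.

def split_on_drop(current_app, dropped_app, drop_y, screen_height):
    """Split the screen at a predefined position (here 50/50) and give the
    dropped application the half that contains the release point; the
    previously full-screen application keeps the other half."""
    separator_y = screen_height // 2
    if drop_y < separator_y:
        return {"top": dropped_app, "bottom": current_app, "separator_y": separator_y}
    return {"top": current_app, "bottom": dropped_app, "separator_y": separator_y}
```

Dropping a map application icon on the lower half of a 1280-pixel-tall screen, for instance, would leave the Internet application in the upper window and assign the map application to the lower window.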

When receiving an input requesting execution of an additional application while displaying the screens of a plurality of applications through the multi-window, the control unit 170 controls the execution of the additional application through the window selected for executing it. At this time, the control unit 170 moves the application that was previously executed through the selected window to the background, and controls the screen of the additional application to be displayed through the selected window.
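The background handling described here can be pictured as a per-window application stack, where launching a new application pushes the previous foreground application into the background. The sketch below is a hypothetical model; all names are invented for illustration.

```python
class WindowSlot:
    """Illustrative model (not from the patent): each multi-window slot keeps
    a stack of applications; the top entry is the foreground application and
    the rest are treated as running in the background."""

    def __init__(self):
        self._stack = []

    def launch(self, app):
        # The previously foreground app implicitly moves to the background.
        self._stack.append(app)

    @property
    def foreground(self):
        return self._stack[-1] if self._stack else None

    @property
    def background(self):
        return list(self._stack[:-1])

# Example: a note application launched into the window that was running
# the Internet application sends the Internet application to the background.
upper = WindowSlot()
upper.launch("internet")
upper.launch("note")
```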

Also, the controller 170 can control the display of the tray, the separator, and the floating keypad provided in the screen interface according to the multi-window environment of the present invention, and their free movement within the screen. In particular, the controller 170 can determine (change) the size of each window of the multi-window environment according to the movement of the separator.

The detailed control operation of the controller 170 will be described in the operation example of the touch device and the control method thereof with reference to the drawings described later.

The power supply unit 180 receives external power and internal power under the control of the controller 170 and supplies power necessary for operation of the respective components.

Meanwhile, the various embodiments described in the present invention can be implemented in a recording medium that can be read by a computer or a similar device, using software, hardware, or a combination thereof. According to a hardware implementation, the embodiments described in the present invention may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), processors, microcontrollers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 170 itself. According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein.

Here, an embodiment of the present invention includes a computer-readable recording medium having recorded thereon a program that receives an input of an execution event for executing a second application while the execution screen of a first application is displayed on the full screen, outputs feedback to the window at the moved position while the execution event is moved without being released, and, when the execution event is released in the specific window to which it has been moved, configures a multi-window according to a predefined partitioning method so that the screens of the first application and the second application are displayed independently through the divided windows.

The touch device of the present invention shown in FIG. 1 may include all information communication devices, multimedia devices, and application devices thereof that include an AP (Application Processor), a GPU (Graphic Processing Unit), and/or a CPU (Central Processing Unit). For example, the touch device may include a mobile terminal operating based on communication protocols corresponding to various communication systems, a tablet PC (Personal Computer), a smart phone, a digital camera, a PMP (Portable Multimedia Player), a media player, a portable game terminal, and a PDA (Personal Digital Assistant).

FIG. 2 is a schematic view illustrating a screen interface of a touch device according to an exemplary embodiment of the present invention.

Referring to FIG. 2, the screen interface for supporting a multi-window environment in the touch device of the present invention includes execution areas 210 and 230 for displaying the execution screens of applications divided within one screen, and a separator 200 for separating the at least two execution regions 210 and 230 and supporting adjustment of the window sizes of the execution regions 210 and 230. Each of the execution regions 210 and 230 divided according to the multi-window environment supports navigation, scrolling, and text input independently according to its executing application.

In addition, the screen interface of the present invention provides a tray 300 for more conveniently supporting the execution of applications in the windows divided in the multi-window. The tray 300 includes execution icons (or shortcut icons) 400 of all applications installed and executable in the touch device, or of user-defined applications. The tray 300 may slide in to be displayed on the screen, or slide out to be hidden from the screen. In addition, the tray 300 may include a handle item 350 for receiving a user command for the slide-in and slide-out operations.

The tray 300 supports scrolling of the execution icons 400 within it, and the execution icons 400 in the tray 300 can be modified, added, or deleted according to the user's selection. In FIG. 2, the tray 300 is provided as a single column; however, the tray 300 may be implemented in various forms, such as two or three columns.

Referring to FIG. 2, the screen of the touch device is divided into two execution regions (windows) 210 and 230 through one separator 200. However, according to an embodiment of the present invention, the screen may be divided into up to N regions (N being a natural number), the maximum being proportional to the screen size. Accordingly, one or more separators 200 may be provided corresponding to the number of divisions of the screen, that is, to the division mode in which the multi-window is configured. For example, as shown in FIG. 2, one separator 200 operates when the screen is divided into two execution regions, and two separators 200 can operate when it is divided into three execution regions. If the screen is divided into four execution regions, two or three separators 200 can operate depending on how the regions are divided.
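The relationship between the number of execution regions and the number of separators can be sketched as follows, assuming a simple grid partition. The grid model itself is an illustrative assumption; the text above only fixes the counts for the two-, three-, and four-region cases.

```python
def separator_count(rows, cols):
    """Separators needed for a rows x cols grid partition: (rows - 1)
    horizontal plus (cols - 1) vertical separators. Purely illustrative;
    non-grid layouts may differ."""
    return (rows - 1) + (cols - 1)
```

This reproduces the counts in the text: two regions (1x2) need one separator, three regions (1x3) need two, and four regions need two (2x2) or three (1x4) depending on how they are divided.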

3 is a diagram schematically illustrating an operation of operating a multi-window in a touch device according to an embodiment of the present invention.

Referring to FIG. 3, the screen example of reference numeral 301 shows the touch device executing an Internet application. In particular, the screen of reference numeral 301 shows a state in which the Internet application is displayed on the full screen through one window.

The screen example of reference numeral 303 shows the touch device executing two applications through a multi-window. For example, the user may additionally execute a map application while the full screen of the Internet application is displayed. Then, as shown in the screen example of reference numeral 303, one screen is divided by the separator 200 into two different execution areas with two windows, and the screens of the Internet application and the map application are displayed through their respective windows. As described above, according to an embodiment of the present invention, a plurality of applications can be operated simultaneously by dividing the screen into at least two regions.

The screen example of reference numeral 305 shows the size of each window being changed according to a user operation. For example, the user can move (touch and drag) the separator 200 to adjust the window sizes of the execution area in which the Internet application is running and the execution area in which the map application is running. According to an embodiment of the present invention, when the window sizes are adjusted by moving the separator 200, the screen size of each application may be changed appropriately according to the changed window size of its execution region.
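The window resizing driven by the separator can be sketched as below. The minimum window size is an assumed parameter, not something specified in the text, and the function name is illustrative.

```python
def move_separator(new_y, screen_height, min_window=100):
    """Clamp the dragged separator position so neither window shrinks below
    an assumed minimum, then return the resulting (upper, lower) window
    heights for a two-window vertical split."""
    y = max(min_window, min(screen_height - min_window, new_y))
    return y, screen_height - y
```

Dragging the separator far past the screen edge would thus leave the squeezed window at its minimum height rather than collapsing it entirely.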

4 is a diagram schematically illustrating an operation of dividing a multi-window in a touch device according to an embodiment of the present invention.

Referring to FIG. 4, the screen example of reference numeral 401 illustrates a case in which the screen is divided into two windows for a multi-window environment, showing the A application and the B application being executed through the two windows divided by one separator 200.

The screen examples of reference numerals 403 and 405 illustrate cases in which the screen is divided into three windows for a multi-window environment, showing the A application, the B application, and the C application being executed through the three windows divided by two separators 200.

As shown in the screen examples of reference numerals 403 and 405, the screen division in the present invention can take various forms according to user definition, and such division methods can be defined in advance.
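Predefined division methods such as those of reference numerals 401, 403, and 405 could be stored as a table of normalized window rectangles; the concrete layouts, names, and coordinate convention below are illustrative assumptions, not values from the patent.

```python
# Hypothetical partition table: each entry is a list of (x0, y0, x1, y1)
# rectangles in normalized screen coordinates.
PARTITIONS = {
    "two_stacked": [(0.0, 0.0, 1.0, 0.5), (0.0, 0.5, 1.0, 1.0)],
    "three_rows":  [(0.0, 0.0, 1.0, 1/3), (0.0, 1/3, 1.0, 2/3), (0.0, 2/3, 1.0, 1.0)],
    "three_mixed": [(0.0, 0.0, 1.0, 0.5), (0.0, 0.5, 0.5, 1.0), (0.5, 0.5, 1.0, 1.0)],
}

def window_rects(method, width, height):
    """Scale a predefined partitioning method to the device screen size."""
    return [(int(x0 * width), int(y0 * height), int(x1 * width), int(y1 * height))
            for (x0, y0, x1, y1) in PARTITIONS[method]]
```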

5 to 12 are views illustrating an example of an operation screen for operating a tray for quick execution of an application in a multi-window environment according to an embodiment of the present invention.

Referring to FIGS. 5 to 12, FIG. 5 illustrates a screen example of the touch device displaying an idle screen (or home screen). In the screen example of FIG. 5, the idle screen is displayed on the full screen, but the execution screen of a specific application may instead be displayed on the full screen. In particular, FIG. 5 illustrates the touch device operating in the normal mode, before the multi-window environment is operated. That is, according to an embodiment of the present invention, the touch device can be operated in either the multi-window mode or the normal mode.

While the idle screen is displayed, the user can cause the tray 300 to be displayed on the idle screen as shown in FIG. 6. For example, the user can request the multi-window mode through a menu manipulation of the touch device, a function key selection for executing the multi-window mode, or a preset touch event (for example, a gesture of a specific pattern). Then, the touch device can activate the tray 300 and display it in a predetermined area on the idle screen, as shown in FIG. 6. For example, the tray 300 may be provided at the left frame of the rectangular frame of the basic window displaying one full screen. The tray 300 may be provided in an overlay form through a separate layer over the currently displayed screen.

While the tray 300 is displayed on the idle screen, the user can input a movement event (e.g., touch and drag) for moving the tray 300 to another area of the screen, as shown in FIG. 7. For example, the user can touch a portion of the tray 300 and input a movement event that drags it toward the opposite side of the screen (toward the right frame of the window). Then, the touch device may provide a UI or GUI in which the tray 300 is released from the left frame according to the movement event and moves together with the user's drag. At this time, the touch device can switch the direction of the handle item 350 of the tray 300 when the tray 300 is moved in a specific direction (relative to the center of the screen) by the user's drag. That is, the handle item 350 for sliding the tray 300 on the screen may differ depending on the region where the tray 300 is positioned. For example, the direction of the handle item 350 shown in FIG. 6 may change to that of the handle item 350 shown in FIG. 7 according to the movement of the tray 300.

The user can move the tray 300 close to a desired area and release the entered movement event, as shown in FIG. 8. That is, the drag input for moving the tray 300 can be released. Then, the touch device can determine the area to which the tray 300 has been moved and display the tray 300 in the determined area. For example, the tray may be provided at the right frame of the window as shown in FIG. 8. That is, when the user input for moving the tray 300 is released, the touch device displays the screen as shown in FIG. 8; the screen with the tray 300 shown in FIG. 6 switches, according to the movement of the tray 300, to the screen shown in FIG. 8. Here, the touch device can determine the placement area of the tray 300 according to the degree of movement of the tray 300. For example, the tray 300 can be placed at the window frame that is closest to the area to which it has been moved (based on the user input point on the tray 300). That is, within the rectangular frame of the window, the tray 300 is disposed at the left frame when released close to the left frame, at the right frame when released close to the right frame, at the upper frame when released close to the upper frame, and at the lower frame when released close to the lower frame.
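The nearest-frame docking rule described above can be sketched as a simple distance comparison. This is an illustrative model; beyond proximity to a frame, the placement logic is not specified, and the names are assumptions.

```python
def nearest_frame(x, y, width, height):
    """Pick the window frame (left/right/top/bottom) closest to the point
    where the tray drag was released; the tray docks to that frame."""
    distances = {"left": x, "right": width - x, "top": y, "bottom": height - y}
    return min(distances, key=distances.get)
```

For example, releasing the drag near the right edge of a 720x1280 screen would dock the tray at the right frame.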

The tray 300 arranged in this way is shown in FIG. 6 (disposed at the left frame), FIG. 8 (disposed at the right frame), FIG. 9 (disposed at the upper frame), and FIG. 10 (disposed at the lower frame). That is, according to an embodiment of the present invention, the tray 300 can be rearranged in real time according to user input, as shown in FIGS. 6 to 10.

FIG. 11 shows an example of a slide-out screen, that is, a screen in which the tray 300 disposed at the lower frame has slid out. As shown in FIG. 11, when the tray 300 slides out, the tray 300 is not displayed on the screen, and only the handle item 350 of the tray 300 may be displayed. In the present invention, the slide-out of the tray 300 may be performed by a user input using the handle item 350, or automatically when no user input occurs for a predetermined time in the slide-in state. The tray 300 can also slide out automatically when a specific execution icon 400 in the tray 300 is selected by a user input, as in the operations described below.

Referring to FIG. 11, when the user touches the handle item 350 while the tray 300 is slid out and moves it (drag, flick, etc.) toward the inside of the screen, the tray 300 can be slid in.

FIG. 12 illustrates an example of a screen when the touch device performs screen display in the landscape mode after rotation from the portrait-mode display shown in FIGS. 6 to 11. Here, when the touch device is switched from the landscape mode to the portrait mode, or from the portrait mode to the landscape mode, the tray 300 may be provided at the position corresponding to its orientation in the previous mode. For example, if the tray 300 was placed at the frame on the user's left when viewing the screen in the landscape mode, the tray 300 can automatically be arranged at the frame on the user's left when the device switches to the portrait mode. That is, the tray can be provided at the same position from the user's viewpoint regardless of mode switching.
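Keeping the tray at the same position from the user's viewpoint across rotation amounts to converting a view-space edge into a panel edge for the current rotation. The mapping below is an assumed sketch: rotation is measured clockwise from the natural portrait orientation, and all names are illustrative.

```python
# Assumed mapping: which physical panel edge sits at each user-view edge
# for a given clockwise device rotation.
VIEW_TO_PANEL = {
    0:   {"left": "left",   "top": "top",    "right": "right",  "bottom": "bottom"},
    90:  {"left": "bottom", "top": "left",   "right": "top",    "bottom": "right"},
    180: {"left": "right",  "top": "bottom", "right": "left",   "bottom": "top"},
    270: {"left": "top",    "top": "right",  "right": "bottom", "bottom": "left"},
}

def panel_edge(view_edge, rotation_cw):
    """Return the panel edge where the tray must be drawn so it stays at
    the same edge from the user's point of view."""
    return VIEW_TO_PANEL[rotation_cw % 360][view_edge]
```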

As shown in FIG. 12, the screen of each application in the divided execution regions is rotated and provided according to the mode change, while the window sizes separated by the separator 200 can maintain their previous state.

13 to 17 are diagrams illustrating exemplary operation screens for operating a plurality of applications in a multi-window environment according to an embodiment of the present invention.

Referring to FIGS. 13 to 17, FIG. 13 illustrates an example of a screen of the touch device executing an application (for example, an Internet application) on the full screen. Here, as shown in FIG. 13, the tray 300 may be in an activated state but slid out and hidden from the screen, with only its handle item 350 displayed on the screen.

While the Internet application is displayed, the user can select (touch and drag) the handle item 350 and slide the tray 300 onto the screen as shown in FIG. 14. The touch device displays a screen as shown in FIG. 14 when a user input on the handle item 350 is sensed while the tray 300 is slid out. That is, the screen of the touch device shown in FIG. 13 switches, according to the user input, to the screen shown in FIG. 14.

When the tray 300 is displayed, the user may select the execution icon 410 of an application to be additionally executed in the multi-window environment from among the application execution icons 400 previously registered in the tray 300, and input an event that moves it onto the screen, as shown in FIG. 15. For example, the user can select the execution icon 410 of a map application in the tray 300 and input an event that moves (drags) it into the screen while keeping the touch.

Then, the touch device displays the execution icon 410 being moved into the screen in response to the user input, as shown in FIG. 15. At this time, the touch device checks the area where the execution icon 410 is located and the partitioning method, as shown in FIG. 15, and outputs feedback to the user about the execution area in which the application of the execution icon 410 would be executed. The feedback may be expressed in various ways that the user can perceive intuitively, such as focusing the window where the execution icon 410 is located, highlighting only that window, or changing the color of the window.
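The drop-target feedback above requires determining which window currently contains the dragged icon. A minimal hit-test sketch follows; the window names and rectangle representation are assumptions for illustration.

```python
def window_under_point(windows, x, y):
    """Hit-test which window contains the dragged icon's position so that
    window can be highlighted as drop-target feedback. `windows` maps a
    window name to an (x0, y0, x1, y1) rectangle in screen coordinates."""
    for name, (x0, y0, x1, y1) in windows.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None  # icon is outside every window (e.g., still over the tray)
```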

When the execution icon 410 in the tray 300 enters the screen according to a user input, the space where the execution icon 410 was located in the tray 300 may be provided as a blank space through a fade-out effect in the UI or GUI. When the execution icon 410 moves out of the tray 300 and enters the screen, the tray 300 may slide out. That is, the screen of the touch device shown in FIG. 15 switches, according to the user input, to the screen shown in FIG. 16.

In the present invention, the blank processing is provided for the user's intuitive perception. When the tray 300 slides out, that is, when the screen switches from FIG. 15 to FIG. 16, the space blanked by the departure of the execution icon 410 may regain its original shape. That is, the space where the execution icon 410 was located may be provided filled with the corresponding icon again, as in the screen example of FIG. 18 described later.

In FIGS. 15 and 16, the multi-window environment is divided into two execution regions, namely two windows such as an upper window and a lower window. FIG. 15 shows the execution icon 410 positioned in the upper window according to a user input; when the user moves toward the lower side of the screen while maintaining the touch on the execution icon 410, the lower window may appear in focus.

The user may move toward the lower side of the screen while maintaining the touch on the execution icon 410, as shown in FIG. 16, and then input an event that releases the touch on the execution icon 410 in the lower window. For example, the user may release (drag and drop) the touch input on the execution icon 410 while the execution icon 410 has been dragged and moved to the lower window.

Then, the touch device executes the application of the execution icon 410 (i.e., the map application) in response to the user input, as shown in FIG. 17, and displays its execution screen in the lower window. At this time, if a previous application such as the Internet application was running on the full screen and the execution of an additional application such as the map application is detected, the touch device divides the full screen into two separate execution areas through the separator 200. The touch device then displays the screen of the additional application (i.e., the map application) through the window (lower window) of the execution area where the execution icon 410 was released, and displays the screen of the previous application (i.e., the Internet application) through the other window (upper window).

At this time, when the additional application is executed, the touch device displays its screen at an appropriate size corresponding to the size of the window (lower window) of the execution area in which it is executed. In addition, depending on the characteristics of the applications, the touch device may display the screen of the previous application as a full screen or a partial screen within the window (upper window) of one divided execution area, and display the screen of the additional application as a full screen or a partial screen within the window (lower window) of the other divided execution area.

For example, when the previous application and the additional application have playback characteristics, such as a video, the touch device changes each screen to an appropriate size corresponding to the size of the corresponding window (upper window, lower window) of the divided execution area, and can display the playback screen on the full area of that window. Alternatively, when the applications have text or list attributes, such as the Internet, the touch device can display the previous application and the additional application in a partial region of the screen corresponding to the size of the corresponding window (upper window, lower window) of the divided execution area.

As described above, according to the embodiments of the present invention, when an application is executed in the touch device, the execution screen of the first application can first be displayed on the full screen. Then, while the first application is displayed on the full screen, an execution event input for executing the second application (for example, a user input that selects the execution icon 400 in the tray 300 and moves it onto the screen) can be received from the user. While the execution event moves within the screen without being released, feedback can be output to the window at the position to which the execution event has moved (i.e., the position to which the execution icon 400 is being moved (dragged)). When the execution event is released in a specific window (for example, when the execution icon 400 dragged into a specific window area is dropped), a multi-window is configured according to a predefined partitioning method, and the screens of the first application and the second application can be displayed independently through the divided windows.

18 to 23 are diagrams illustrating an operation example of operating a plurality of applications in a multi-window environment according to an embodiment of the present invention.

Referring to FIGS. 18 to 23, FIG. 18 illustrates a state in which the touch device displays the screens of different applications through the respective windows of the two divided execution areas, as shown in FIG. 17, and the tray 300 has been slid in according to a user input.

While the tray 300 is displayed, the user may select the execution icon 430 of an application to be additionally executed (e.g., a note application) from among the execution icons 400 previously registered in the tray 300 through the operations described above, and input an event that moves it onto the screen.

Referring to FIG. 19, the touch device displays the execution icon 430 being moved within the screen in response to the user input, and outputs feedback to the user for the window in which the execution icon 430 is located according to its movement. The slide-out operation of the tray 300 according to the movement of the execution icon 430, and the execution of the application (for example, the note application) of the execution icon 430, correspond to the operations described above. In FIG. 19, the touch input on the execution icon 430 is moved to the upper window of the screen and released (dragged and dropped) there.

Then, the touch device executes the application of the execution icon 430 (i.e., the note application) in response to the user input and displays its execution screen in the upper window, as shown in FIG. 20. At this time, the touch device moves the application previously executed through the upper window (for example, the Internet application) to the background, and displays the newly requested application screen (e.g., the note application) through the upper window. The application allocated to the lower window (for example, the map application) continues to be executed, and its screen (current progress screen) continues to be displayed through the lower window.

As described above, according to the exemplary embodiments of the present invention, a user input for executing an additional application can be received while the screens of a plurality of applications are displayed through the multi-window. Then, the touch device executes the additional application through the window selected by the user for its execution. Here, when the additional application is executed, the application previously executed through the selected window is moved to the background, and the screen of the additional application can be displayed through the selected window.

Meanwhile, in the state shown in FIG. 20, the user can change the window sizes of the two execution regions divided through the separator 200 in real time. FIGS. 21 to 23 show operations of changing the window sizes of the divided execution regions of the touch device according to user input.

On the displayed screen, the user can select the separator 200 and input an event that moves it in a specific direction (for example, upward or downward), as shown in FIG. 21. For example, as shown in FIG. 21, the user may touch the separator 200 and input an event that drags it toward the lower side of the screen while keeping the touch.

Then, the touch device displays the separator 200 being moved in response to the user's input, as shown in FIG. 21. At this time, as shown in FIG. 21, the touch device can display only the moving state of the separator 200 according to the user input while maintaining the current state of the application screen displayed through each window. Alternatively, according to an embodiment of the present invention, depending on the window size adjustment method, the application screens may be adaptively changed in real time according to the window sizes as the separator 200 moves in response to the user's input.

After moving the separator 200 according to the desired size ratio of each window, the user can input an event that releases the touch input on the separator 200. For example, the user can release (drag and drop) the touch input on the separator 200 while the separator 200 has been dragged and moved toward the lower window.

Then, as shown in FIG. 22, the touch device changes the size of each window according to the movement of the separator 200 in response to the user's input. At this time, the touch device changes the display state of the application screen allocated to each window (for example, the upper window and the lower window) according to the changed window size. For example, as shown in FIG. 22, the application screen displayed in the upper window may reveal previously hidden content as its window size expands, while the application screen displayed in the lower window may be provided with its displayed area reduced.

FIG. 23 shows the opposite case of FIG. 22, an example in which the separator 200 is moved toward the upper side of the screen according to a user input, so that the size of the upper window is reduced and the size of the lower window is expanded.

FIGS. 24 to 29 are views illustrating an operation example of operating a keypad for text input in a multi-window environment according to an embodiment of the present invention.

As shown in FIGS. 24 to 29, the present invention provides a touch keypad (hereinafter, a floating keypad) 500 distinct from a general touch keypad, for efficient operation of the multi-window environment. That is, according to an embodiment of the present invention, a touch keypad operated in the normal mode, which provides the screen of one application as the full screen, and the floating keypad 500 operated in the multi-window mode, which provides the screens of a plurality of applications as individual screens through screen division, can be provided separately. In the present invention, the floating keypad 500 is not fixed in a predefined area like a general touch keypad, but is a touch keypad that can be moved freely within the screen of the touch device in response to user input. In the screen interface according to the present invention, the floating keypad 500 can be provided when text input is requested in the application of a specific window selected by the user from among the applications of the plurality of windows divided in the multi-window (for example, by a user input that selects a text input window).

Referring to FIGS. 24 to 29, FIG. 24 illustrates an example of a screen of a touch device displaying the screens of different applications through the respective windows of two divided execution regions.

Referring to FIG. 25, the user can execute the floating keypad 500 while a plurality of applications are displayed according to the multi-window environment as shown in FIG. 24. For example, the user may input a function key selection for menu operation of the touch device, a function selection for executing the floating keypad 500, or a touch event set for executing the floating keypad 500 (for example, a gesture having a specific pattern such as a figure or character). In particular, in the present invention, the floating keypad 500 can be automatically executed and provided on the screen when a text input window capable of receiving text is selected in an application screen running in a window of each divided execution area.

As shown in FIG. 25, the touch device can activate the floating keypad 500 in one area of the screen being operated by the multi-window. For example, the floating keypad 500 may be activated at a position where its lower end adheres to the lower frame of the screen. In the present invention, the floating keypad 500 may be provided as a separate layer overlaid on the screens of the multi-window.

While the floating keypad 500 is displayed on the screen, the user can input a movement event (e.g., touch and drag) for moving the floating keypad 500 to another area of the screen, as shown in FIG. 26. For example, the user may touch a portion of the floating keypad 500 and drag it toward another area of the screen (e.g., toward the top of the screen). Then, the touch device may provide a UI or GUI in which the floating keypad 500 is released from the lower frame according to the movement event and moves together with the user's drag.

The user can move the floating keypad 500 to a desired position as shown in FIG. 27 and then release the entered movement event; that is, the drag input for moving the floating keypad 500 can be released. Then, the touch device can display the floating keypad 500 at the position where the drag input was released.

Meanwhile, according to an embodiment of the present invention, while the floating keypad 500 is provided, user input can be made both in the windows of the divided execution regions and on the floating keypad 500. At this time, user input for the floating keypad 500 is received within the area occupied by the floating keypad 500, and user input for a window is received in the remaining area.
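The routing rule in the paragraph above — touches inside the keypad's area go to the keypad, touches elsewhere go to the underlying window — can be sketched as a hit test. The rectangles and window names below are hypothetical examples, not values from the disclosure.

```python
def route_input(x, y, keypad_rect, windows):
    """Decide which target receives a touch at (x, y).

    keypad_rect: (left, top, right, bottom) of the floating keypad 500.
    windows: mapping of window name -> (left, top, right, bottom).
    The keypad overlays the windows, so its area is tested first.
    """
    def inside(rect):
        l, t, r, b = rect
        return l <= x < r and t <= y < b

    if inside(keypad_rect):
        return "keypad"
    for name, rect in windows.items():
        if inside(rect):
            return name
    return None
```

A touch inside the keypad's rectangle is consumed by the keypad even when that rectangle overlaps a window, which matches the overlay layering described above.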

The user can perform text input using the floating keypad 500 while the floating keypad 500 is displayed as shown in FIG. 27. For example, assume that the user enters text on the screen of an application running in the upper window. In this case, the user can select the upper window (in particular, select an area capable of receiving text, for example a text input window, in the application screen of the upper window) and then select the desired character buttons on the floating keypad 500.

Referring to FIGS. 27 and 28, the user selects the text input window 610 on the screen of the application running in the upper window, thereby establishing a state in which text input is possible. Then, the user can sequentially input the buttons to which the characters p, s, and y are assigned in order to enter "psy" using the floating keypad 500. As shown in FIGS. 27 and 28, the touch device may display the corresponding characters in the text input window 610 in response to the user input.

Referring to FIG. 28, while text (e.g., "psy") is entered in the text input window 610 of the application running in the upper window, the touch device may provide a recommendation area 620 on a new layer that recommends retrieved results corresponding to the entered text. Here, the recommendation area 620 may be overlaid on the screen of the application and provided underneath the floating keypad 500; that is, the floating keypad 500 may remain disposed at the uppermost position.

Alternatively, the text input to the text input window 610 may be entered directly on the same layer as the application screen. For example, in the case of a text input window in which recipient information is entered, such as in the mail application being executed in the lower window, only the input result may be displayed through the text input window of the application screen, without a separate new layer.

With the recommendation area 620 displayed underneath the floating keypad 500 as shown in FIG. 28, the user can select any one of the recommended results, or input a command to perform a search for the text entered in the text input window 610. The resulting screen is shown in FIG. 29. That is, the screen of the touch device shown in FIG. 28 is switched as shown in FIG. 29 according to the user input.

Referring to FIG. 29, when the user inputs text through the floating keypad 500 and executes a function of the application (e.g., search execution, mail transmission, memo storage, or message transmission) according to a user input, the touch device can remove the floating keypad 500 from the screen and provide the execution result in the window of the application that executed the function. For example, referring to FIGS. 28 and 29, a search result for 'psy' entered in the application of the upper window can be provided through the upper window.

FIG. 30 is a diagram illustrating an operation example of operating a plurality of applications in a multi-window environment according to an embodiment of the present invention.

Referring to FIG. 30, a screen example is shown in which the touch device displays the screens of different applications through the respective windows of two execution regions, while a specific setting of one of the windows is being changed according to a user input.

According to the embodiment of the present invention, function setting can be supported independently for each divided window. That is, the function setting corresponding to the characteristics of the execution application of the window selected by the user among the windows of the divided execution regions can be changed.

For example, the user can select the left window among the windows of the divided execution regions and then operate a setting function (for example, a function key provided for volume control). Then, the touch device identifies the characteristics of the application being executed in the left window. The touch device then displays the volume setting item 700 on the screen according to the characteristics of the identified application (for example, media characteristics such as a moving picture) and provides feedback on the setting value changed according to the user input. At this time, if the user has defined a screen brightness setting for the media characteristic, a screen brightness setting item (not shown) may be provided on the screen instead of the volume setting item 700, and feedback for changing the screen brightness according to the user input is provided. A setting change for the application executed in the right window can also be performed in the same manner.

Here, when a function setting is changed in a specific window according to user input as described above, the setting can be made independently for each window. For example, when the volume or the screen brightness is set in the left window, the setting value may be reflected only in the left window.
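The per-window independence of settings described above can be modeled with one settings table per divided window, so a change made while one window is selected never leaks into the other. The setting names and default values below are illustrative assumptions.

```python
class WindowSettings:
    """Keeps an independent settings table for each divided window."""

    DEFAULTS = {"volume": 50, "brightness": 50}  # assumed defaults

    def __init__(self, window_names):
        self._tables = {name: dict(self.DEFAULTS) for name in window_names}

    def change(self, window, key, value):
        # Reflect the new value only in the selected window.
        self._tables[window][key] = value

    def get(self, window, key):
        return self._tables[window][key]
```

For example, raising the volume while the left window is selected leaves the right window's volume at its previous value.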

FIGS. 31 to 34 are diagrams illustrating an example of an operation screen for providing information on a plurality of applications executed in a multi-window environment in a touch device according to an embodiment of the present invention.

Referring to FIGS. 31 to 34, FIG. 31 illustrates a screen example of the touch device when it is displaying a list of a plurality of applications executed in the multi-window environment. As shown in FIG. 31, a list of the applications the user has executed in the multi-window environment may be provided on the full screen according to a user selection. The user may input a function key selection for menu operation of the touch device, a function selection for displaying the list, or a touch event set for displaying the list (for example, a gesture having a specific pattern such as a figure or character). Then, the touch device can display a list of the currently executed applications (including those executing in the background) through a set UI or GUI as shown in FIG. 31.

As shown in FIG. 31, the applications that the user has executed in the multi-window environment and that are currently executing may be provided in a specific arrangement. For example, the applications may be arranged in the order in which they were executed, or in a random order. FIG. 31 shows a state in which a list of the Email application 910, the Video Player application 920, the Note application 930, the Map application 940, and the Play Store application 950 is being displayed.

As shown in FIGS. 32 and 33, the remaining applications hidden from view (e.g., the Gmail application 960, the Wi-Fi application 970, and the Phone application 980) may appear unfolded according to the user's scroll (or navigation) control. That is, the list shown in FIG. 31 includes other applications that are hidden and not displayed on the screen. The number of applications included in the initial list can be set appropriately in consideration of the user's intuitiveness and the screen size of the touch device, and when the number of executing applications is larger than the set number, the excess applications can be hidden.

Here, among the applications of the list, the largest information display area is allocated to the application disposed at the lower side (for example, the Video Player application 920), and the information display areas may be provided in a form that is gradually reduced toward the upper side. Therefore, in the case of an application located at the top (e.g., the Play Store application 950), only a status bar for identifying the application may appear.

Referring to FIG. 31, the application disposed in the lowermost region and displaying only a status bar (e.g., the Email application 910) may be the application most recently executed by the user, or at least one designated application. As described above, the application arranged in the lowermost region can be fixed in the corresponding region regardless of the user's scroll control, and this fixed placement may be disabled according to a user setting.

The list screen for the executing applications of the present invention includes a command area 800 for supporting editing of the executing applications in the list (application scrolling, ending application execution, application search, and the like). In particular, the list screen includes a scroll item 850 for controlling the scrolling (or unfolding) of the applications in the list. That is, the user can scroll the applications in the list through user input using the scroll item 850. The touch device provides a UI or GUI in which the information of the overlapping applications is unfolded according to the user input method on the scroll item 850. At this time, when the user input is a single tap, the list scrolls one step in response to each input; when the user input maintains the input state (touch) on the scroll item 850, the touch device continuously controls automatic scrolling while the input is maintained.
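One way to model a single scroll step driven by the scroll item 850 — the bottom-most entry is pushed off-screen while a hidden application unfolds, and the application pinned in the lowermost region never moves — is sketched below. The data layout and the end from which hidden entries appear are assumptions made for illustration, not details taken from the disclosure.

```python
def scroll_step(visible, hidden, pinned):
    """One scroll step of the execution-application list.

    visible: entries currently shown, top to bottom (excluding `pinned`).
    hidden:  entries not yet shown, in the order they should appear.
    pinned:  the application fixed in the lowermost region (e.g. Email),
             unaffected by scrolling.
    """
    visible = list(visible)
    hidden = list(hidden)
    if visible:
        visible.pop()                      # bottom-most entry is pushed off-screen
    if hidden:
        visible.insert(0, hidden.pop(0))   # next hidden entry unfolds at the top
    return visible, hidden, pinned
```

Repeating the step (as when the touch on the scroll item 850 is held) eventually unfolds every hidden application while the pinned entry stays in place.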

Meanwhile, the user can select (touch) the scroll item 850 while the list is displayed as shown in FIG. 31, and maintain the input. Then, in response to the user input on the scroll item 850, the touch device displays a screen in which the information of the applications is unfolded from top to bottom, as shown in FIGS. 32 to 34. That is, the list screen of the touch device shown in FIG. 31 is switched as shown in FIGS. 32 to 34 according to the user input.

As shown in FIGS. 32 to 34, in response to a user input using the scroll item 850, a UI or GUI can be provided in which the Video Player application 920 is pushed downward and gradually disappears from the screen, while the information of the other applications disposed above it is gradually unfolded and sequentially pushed downward. When the list is scrolled according to the scroll control, the hidden applications (for example, the Gmail application 960 (FIG. 33), the Wi-Fi application 970 (FIG. 34), and the Phone application 980 (FIG. 34)) may sequentially appear on the screen. At this time, as shown in FIGS. 32 to 34, the Email application 910 can remain fixed and continuously displayed at its position.

Meanwhile, the user can select the item of a specific application while the list is displayed or during scroll control as shown in FIGS. 31 to 34. Then, the touch device can display the selected application on the full screen. Alternatively, when user input through the scroll item 850 continues until all applications included in the list have been scrolled, that is, when all applications in the list are unfolded as shown in FIGS. 31 to 34, a designated application (e.g., the application fixedly placed at the lowermost side, such as the Email application 910) may automatically be displayed on the full screen.

FIG. 35 is a flowchart illustrating a method of operating a multi-window environment in a touch device according to an embodiment of the present invention. In particular, FIG. 35 illustrates an operation example of switching to a multi-window environment while operating one window.

Referring to FIG. 35, the controller 170 executes an application corresponding to a user selection (hereinafter referred to as a first application) (step 3501) and controls the screen display of the executed first application (step 3503). At this time, the controller 170 controls the full-screen display of the first application through one window.

When receiving an execution wait event input for executing an additional application (hereinafter referred to as a second application) while the first application is running, the control unit 170 determines a predetermined multi-window partitioning method (step 3505). In the present invention, the execution wait event may indicate an event for entering the multi-window environment by additionally executing another application while the user is executing and displaying an application. In particular, the execution wait event may be an event in which the user activates (slides out) the tray 300 on the screen, selects the execution icon of the application to be additionally executed in the activated tray 300, and moves (drags) it into the screen.

The control unit 170 tracks the position of the execution icon when the execution icon is moved out of the tray 300 and enters the screen (step 3509). Here, the control unit 170 can identify, through the position tracking of the execution icon, the window at the position to which the execution icon has been moved.

In step 3511, the controller 170 controls feedback output to the window of the execution region in which the additional application is executable, according to the partitioning method determined in the above-described manner and the position of the execution icon. That is, while the execution icon is moved across the full screen by the drag, the control unit 170 may control feedback output to the specific window at the position where the execution icon is being dragged. For example, the control unit 170 may focus the window at the position to which the execution icon has been moved.

When an execution event of the second application by the execution icon is input (step 3513), the control unit 170 divides the screen (step 3515) and controls the execution of the second application (step 3517). The execution event may be an event that drops the execution icon in one area of the screen. The control unit 170 identifies the area where the execution icon was moved and the execution event was generated (for example, the area where the dragged execution icon was dropped), divides the entire screen used for the first application, and determines the window (i.e., execution region) for the screen display of the second application among the divided regions.

When executing the second application, the controller 170 controls a screen display of an appropriate size corresponding to the window size of the divided execution area (i.e., the execution area in which the second application is executed) (step 3519). Here, the controller 170 may display the screen of the first application as a full screen or a partial screen within the window of one divided execution area (e.g., the upper window), and display the screen of the second application as a full screen or a partial screen within the window of the other divided execution area (e.g., the lower window). For example, when the first application and the second application are applications having playback characteristics, such as a moving picture, the controller 170 may change each screen to an appropriate size corresponding to the window size of the divided execution area and display it as a full screen within the window. Alternatively, if the first application and the second application are applications having text or a list, such as the Internet, the controller 170 may display a partial screen corresponding to the window size of the divided execution region. That is, according to an embodiment of the present invention, the screen of the first application and the screen of the second application can be displayed independently in their corresponding windows by implementing the multi-window environment.

That is, the control unit 170 can execute the second application in response to the drop input when an input to drop the execution icon in a specific window is received during dragging. At this time, when the second application is executed, the controller 170 may divide the entire screen into windows for the screen display of the first application and the second application. The control unit 170 may display the screen of the second application through the specific window in which the execution icon was dropped, and display the screen of the first application through the other divided window.
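The drop handling of steps 3513 to 3517 can be sketched as follows. A two-way vertical split at the midline is assumed here for simplicity; the disclosure allows other predefined partitioning methods, and the window names are illustrative.

```python
def handle_drop(screen_h, drop_y, first_app, second_app):
    """Divide the full screen of `first_app` when the execution icon of
    `second_app` is dropped at vertical position drop_y.

    The second application takes the window in which the icon was
    dropped; the first application keeps the other window.
    """
    if drop_y < screen_h // 2:
        return {"upper": second_app, "lower": first_app}
    return {"upper": first_app, "lower": second_app}
```

Dropping the icon in the upper half assigns the second application to the upper window and moves the first application's screen to the lower window, and vice versa.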

FIG. 36 is a flowchart illustrating a method of operating a multi-window environment in a touch device according to an embodiment of the present invention. In particular, FIG. 36 shows an operation example of executing an additional application while operating a multi-window.

Referring to FIG. 36, the controller 170 may receive an input to select an additional application for further execution while displaying the screens of a plurality of applications by multi-window (step 3601). That is, according to an embodiment of the present invention, it is possible to additionally execute another application while independently displaying the screens of a plurality of different applications through the respective divided windows in the multi-window environment.

When an input for selecting an additional application is received in the multi-window environment, the controller 170 determines the partitioning method and the currently executing windows (hereinafter, execution windows) (step 3605). For example, the control unit 170 checks, through predefined division information, into how many windows the screen is divided for the multi-window environment, and determines how many windows are currently in use.

The controller 170 compares the division information with the number of execution windows to determine whether the number of execution windows corresponds to the maximum value set in the predefined division information (step 3607). For example, the controller 170 may determine whether the predefined partition information is 3 and the number of currently executing windows corresponds to three.

If the number of execution windows does not correspond to the maximum set in the partition information (NO in step 3607), the controller 170 controls the execution of the corresponding operation (step 3609). For example, as described above, the control unit 170 may control additional screen division for executing the additional application, thereby executing the additional application and displaying the screens of the plurality of applications. This may correspond to the operation of controlling the execution of an additional application by screen division of the full screen, as described in the example of FIG.

If the number of execution windows corresponds to the maximum set in the partition information (YES in step 3607), the control unit 170 tracks and determines the position of the user input for selecting the execution area in which to execute the additional application (step 3611). For example, when the user selects the execution icon of the application to be additionally executed in the tray 300 and moves it into the screen, the controller 170 can track the position to which the execution icon is moved.

The control unit 170 provides feedback on the execution region in which the additional application is executable, in response to the determined position (step 3613). For example, when the execution icon is moved out of the tray 300 and enters the screen, the control unit 170 may focus the window at the position to which the execution icon has been moved.

When an execution event for the additional application is input (step 3615), the controller 170 executes the additional application and controls background processing of the previous application that was executing in the execution area (step 3617). For example, when executing the additional application in response to a user input, the control unit 170 processes in the background the application that was previously executing in the window selected for the additional application, and displays the screen of the additional application through that window. That is, the control unit 170 can keep the previous application allocated to the corresponding window executing as a background process, replacing only the screen displayed in the window.
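Steps 3607 to 3617 — refusing further division once the partition maximum is reached and instead replacing the foreground application of the chosen window while the previous one keeps running in the background — can be sketched as follows. The class and its API are illustrative assumptions, not the patent's implementation.

```python
class MultiWindowManager:
    def __init__(self, partition_max=2):   # assumed partition information
        self.partition_max = partition_max
        self.windows = {}      # window id -> foreground application
        self.background = []   # applications still running but not displayed

    def launch(self, window, app):
        """Run `app` in `window`. If the window already hosts an
        application, only the displayed screen is replaced; the previous
        application continues as a background process."""
        if window not in self.windows and len(self.windows) >= self.partition_max:
            raise ValueError("partition maximum reached; no further division")
        prev = self.windows.get(window)
        if prev is not None:
            self.background.append(prev)   # keep the previous app running
        self.windows[window] = app
```

Launching a third application into an already-full two-window layout therefore reuses the selected window and demotes its previous occupant to the background rather than creating a new window.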

When executing the additional application, the control unit 170 controls a screen display corresponding to the window size of the execution region in which the additional application is executed (step 3619). For example, the control unit 170 may display the screen of the additional application as a full screen or a partial screen within the window of the execution area. Here, if the additional application has a playback characteristic, such as a moving picture, the control unit 170 changes the screen to an appropriate size corresponding to the window size of the execution area and then displays the playback screen as a full screen within the window; if the additional application has a text or list characteristic, such as the Internet, it can be displayed as a partial screen corresponding to the window size of the execution region.
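The sizing rule of step 3619 — playback-type applications are resized to fill the window, while text- or list-type applications show a partial screen — reduces to a small decision function. The characteristic labels used here are illustrative assumptions.

```python
def display_mode(characteristic):
    """Choose how an application's screen fills its window (step 3619)."""
    if characteristic == "playback":       # e.g. a moving-picture player
        return "resized full screen"
    return "partial screen"                # e.g. text or a list, such as the Internet
```

The same rule applies whether the window came from an initial split (FIG. 35) or from replacing an application at the partition maximum (FIG. 36).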

The embodiments of the present invention described above may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable recording medium. The computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the recording medium may be those specially designed and constructed for the present invention, or may be those known and available to those skilled in computer software.

The computer-readable recording medium includes magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM (Compact Disc Read Only Memory) and DVD (Digital Versatile Disc); magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM (Read Only Memory), RAM (Random Access Memory), and flash memory. The program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

It is to be understood that the foregoing description of the present invention is exemplary and explanatory and is intended to provide further explanation of the invention as claimed. Accordingly, all changes or modifications derived from the technical idea of the present invention should be construed as falling within the scope of the present invention.

110: wireless communication unit 120: user input unit
130: Display section 140: Audio processing section
150: storage unit 160: interface unit
170: control unit 180: power supply unit
200: separator 300: tray
350: handle items 400, 410, 430: execute icon
500: Floating keypad 850: Scroll item

Claims (35)

  1. A method for executing an application of an electronic device,
    Displaying an execution screen of the first application on a single window on the touch screen;
    Receiving an input for displaying an icon display area including a plurality of execution icons each corresponding to the application while the execution screen of the first application is continuously displayed in the single window;
    Receiving an input for selecting and dragging an execution icon corresponding to a second application from the icon display area while the icon display area is displayed and the execution screen of the first application is continuously displayed in the single window; And
    In response to releasing the execution icon corresponding to the dragged second application onto a predefined area on a second side of the touch screen, providing the multi-window by dividing the touch screen such that the execution screen of the first application is displayed in a first window of the multi-window on a first side of the touch screen and an execution screen of the second application is displayed in a second window of the multi-window on the second side of the touch screen,
    Wherein the first side of the touch screen is opposite to the second side of the touch screen,
    wherein the first window of the multi-window occupies an area of the touch screen other than the area occupied by the second window of the multi-window.
  2. The method according to claim 1,
    further comprising: in response to the execution icon corresponding to the second application being dragged from the icon display area to the predefined area of the touch screen, displaying feedback indicating a window area for the second application to which the execution icon is dragged.
  3. 3. The method of claim 2,
    And the predefined area is located outside the icon display area.
  4. The method according to claim 1,
    further comprising displaying the execution screen of the first application to correspond to the size of the first window of the multi-window, or displaying the execution screen of the second application to correspond to the size of the second window of the multi-window.
  5. The method according to claim 1,
    And displaying the icon display area on the execution screen of the first application when receiving an input for displaying the icon display area.
  6. The method according to claim 1,
    Wherein the first window of the multi-window and the second window of the multi-window are separated by a separator.
  7. In an electronic device,
    touch screen; And
    A processor,
    The processor comprising:
    Displaying an execution screen of the first application on a single window on the touch screen,
    Receiving an input for displaying an icon display area including a plurality of execution icons each corresponding to an application while the execution screen of the first application is continuously displayed in the single window,
    Receiving an input for selecting and dragging an execution icon corresponding to a second application from the icon display area while the icon display area is displayed and the execution screen of the first application is continuously displayed in the single window,
    in response to releasing the execution icon corresponding to the dragged second application onto a predefined area on a second side of the touch screen, provide the multi-window by dividing the touch screen such that the execution screen of the first application is displayed in a first window of the multi-window on a first side of the touch screen and an execution screen of the second application is displayed in a second window of the multi-window on the second side of the touch screen,
    Wherein the first side of the touch screen is opposite to the second side of the touch screen,
    wherein the first window of the multi-window occupies an area of the touch screen other than the area occupied by the second window of the multi-window.
  8. 8. The method of claim 7,
    The processor comprising:
    in response to the execution icon corresponding to the second application being dragged from the icon display area to the predefined area of the touch screen, display feedback indicating a window area for the second application to which the execution icon is dragged.
  9. 9. The method of claim 8,
    And the predefined area is located outside the icon display area.
  10. 8. The method of claim 7,
    The processor comprising:
    display the execution screen of the first application to correspond to the size of the first window of the multi-window, or display the execution screen of the second application to correspond to the size of the second window of the multi-window.
  11. 8. The method of claim 7,
    The processor
    And to display the icon display area on the execution screen of the first application when receiving an input for displaying the icon display area.
  12. 8. The method of claim 7,
    Wherein the first window of the multi-window and the second window of the multi-window are separated by a separator.
  13. A computer-readable recording medium storing a program for executing a process,
    The process comprises:
    Displaying an execution screen of the first application on a single window on the touch screen;
    Receiving an input for displaying an icon display area including a plurality of execution icons each corresponding to the application while the execution screen of the first application is continuously displayed in the single window;
    Receiving an input for selecting and dragging an execution icon corresponding to a second application from the icon display area while the icon display area is displayed and the execution screen of the first application is continuously displayed in the single window; And
    In response to releasing the execution icon corresponding to the dragged second application onto a predefined area on a second side of the touch screen, providing the multi-window by dividing the touch screen such that the execution screen of the first application is displayed in a first window of the multi-window on a first side of the touch screen and an execution screen of the second application is displayed in a second window of the multi-window on the second side of the touch screen,
    Wherein the first side of the touch screen is opposite to the second side of the touch screen,
    Wherein the first window of the multi-window occupies the second window of the multi-window and another area of the touch screen.
  14. The computer-readable recording medium of claim 13,
    wherein the process further comprises:
    in response to the execution icon corresponding to the second application being dragged from the icon display area to the predefined area of the touch screen, displaying feedback indicating a window area for the second application while the execution icon is being dragged.
  15. The computer-readable recording medium of claim 14,
    wherein the predefined area is located outside the icon display area.
  16. The computer-readable recording medium of claim 13,
    wherein the process further comprises displaying the execution screen of the first application so as to correspond to the size of the first window of the multi-window, or displaying the execution screen of the second application so as to correspond to the size of the second window of the multi-window.
  17. The computer-readable recording medium of claim 13,
    wherein the icon display area is displayed on the execution screen of the first application when the input for displaying the icon display area is received.
  18. The computer-readable recording medium of claim 13,
    wherein the first window of the multi-window and the second window of the multi-window are separated by a separator.
  19. An electronic device comprising:
    a touch screen; and
    a processor,
    wherein the processor is configured to:
    display an execution screen of a first application on the touch screen in a single window manner,
    receive an input for displaying an icon display area including a plurality of execution icons each corresponding to an application while the execution screen of the first application is continuously displayed in the single window manner,
    receive an input for selecting and dragging an execution icon corresponding to a second application from the icon display area while the icon display area is displayed on the execution screen of the first application continuously displayed in the single window manner,
    in response to the execution icon corresponding to the second application being dragged to a predefined area on a second side of the touch screen, display feedback indicating a window area for the second application while the execution icon is being dragged and not yet released, and
    in response to the dragged execution icon corresponding to the second application being released while the feedback indicating the window area is displayed, divide the touch screen so that the execution screen of the first application is displayed on a first window of a multi-window on a first side of the touch screen and an execution screen of the second application is displayed on a second window of the multi-window on the second side of the touch screen,
    wherein the first side of the touch screen is opposite to the second side of the touch screen.
  20. The electronic device of claim 19,
    wherein the processor is further configured to:
    display the execution screen of the first application so as to correspond to the size of the first window of the multi-window, or display the execution screen of the second application so as to correspond to the size of the second window of the multi-window.
  21. delete
  22. delete
  23. delete
  24. delete
  25. delete
  26. delete
  27. delete
  28. delete
  29. delete
  30. delete
  31. delete
  32. delete
  33. delete
  34. delete
  35. delete
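The drag-to-split behavior recited in claims 13 and 19 above (drag an execution icon to a predefined area on one side of the screen, show feedback for the prospective window, and split the screen on release) can be sketched as follows. This is a minimal illustrative model only, not the patented implementation; the class, method names, and the "right half" drop-area threshold are all hypothetical assumptions introduced for the sketch.

```python
class MultiWindowScreen:
    """Illustrative model of the claimed drag-to-split multi-window flow.

    All names and thresholds are hypothetical; the claims define behavior,
    not an API.
    """

    def __init__(self, width: int, first_app: str):
        self.width = width
        # Initially the first application occupies a single full-screen window.
        self.windows = {"single": first_app}
        self.feedback_visible = False

    def drag_icon(self, x: int) -> bool:
        # While the dragged icon hovers over the predefined area on the
        # second (here: right) side, show feedback indicating the window
        # area the second application would occupy.
        self.feedback_visible = x > self.width // 2
        return self.feedback_visible

    def release_icon(self, x: int, second_app: str) -> dict:
        # Releasing the icon inside the predefined area divides the screen:
        # the first application keeps the first (left) window and the
        # dropped application gets the second (right) window.
        if self.drag_icon(x):
            first_app = self.windows.get("single") or self.windows["first"]
            self.windows = {"first": first_app, "second": second_app}
        self.feedback_visible = False
        return self.windows


screen = MultiWindowScreen(width=1280, first_app="browser")
screen.drag_icon(1000)                 # feedback shown over the right half
result = screen.release_icon(1000, "notes")
# result: {"first": "browser", "second": "notes"}
```

Releasing the icon outside the predefined area leaves the single-window state unchanged, matching the claims, which attach the split only to a release in the predefined area.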
KR1020120105898A 2012-09-24 2012-09-24 Method and apparatus for providing multi-window at a touch device KR101957173B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120105898A KR101957173B1 (en) 2012-09-24 2012-09-24 Method and apparatus for providing multi-window at a touch device

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
KR1020120105898A KR101957173B1 (en) 2012-09-24 2012-09-24 Method and apparatus for providing multi-window at a touch device
PCT/KR2013/008551 WO2014046525A1 (en) 2012-09-24 2013-09-24 Method and apparatus for providing multi-window in touch device
ES13185767T ES2706010T3 (en) 2012-09-24 2013-09-24 Procedure and apparatus for executing applications on a touch device
EP18214482.4A EP3493042A1 (en) 2012-09-24 2013-09-24 Method and apparatus for executing applications in a touch device
US14/035,266 US20140089833A1 (en) 2012-09-24 2013-09-24 Method and apparatus for providing multi-window in touch device
AU2013318697A AU2013318697B2 (en) 2012-09-24 2013-09-24 Method and apparatus for providing multi-window in touch device
EP13185767.4A EP2725466B1 (en) 2012-09-24 2013-09-24 Method and apparatus for executing applications in a touch device
CN201310439519.5A CN103677627A (en) 2012-09-24 2013-09-24 Method and apparatus for providing multi-window in touch device

Publications (2)

Publication Number Publication Date
KR20140039575A KR20140039575A (en) 2014-04-02
KR101957173B1 true KR101957173B1 (en) 2019-03-12

Family

ID=49263142

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120105898A KR101957173B1 (en) 2012-09-24 2012-09-24 Method and apparatus for providing multi-window at a touch device

Country Status (7)

Country Link
US (1) US20140089833A1 (en)
EP (2) EP2725466B1 (en)
KR (1) KR101957173B1 (en)
CN (1) CN103677627A (en)
AU (1) AU2013318697B2 (en)
ES (1) ES2706010T3 (en)
WO (1) WO2014046525A1 (en)

Families Citing this family (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8555201B2 (en) * 2008-06-05 2013-10-08 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
USD735736S1 (en) * 2012-01-06 2015-08-04 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US9460029B2 (en) 2012-03-02 2016-10-04 Microsoft Technology Licensing, Llc Pressure sensitive keys
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technology Licensing, LLC Flexible hinge spine
KR20140146992A (en) * 2012-03-27 2014-12-29 엘지전자 주식회사 Optimization of application execution based on length of pulled out flexible display screen
USD742396S1 (en) * 2012-08-28 2015-11-03 General Electric Company Display screen with graphical user interface
AU2013356799B2 (en) * 2012-12-06 2019-08-08 Samsung Electronics Co., Ltd. Display device and method of controlling the same
USD737841S1 (en) * 2013-03-14 2015-09-01 Microsoft Corporation Display screen with graphical user interface
USD735749S1 (en) * 2013-03-14 2015-08-04 Microsoft Corporation Display screen with graphical user interface
JP6163839B2 (en) * 2013-04-09 2017-07-19 富士通株式会社 Electronic equipment and copy control program
JP6132644B2 (en) * 2013-04-24 2017-05-24 キヤノン株式会社 Information processing apparatus, display control method, computer program, and storage medium
USD739873S1 (en) * 2013-06-10 2015-09-29 Huawei Technologies Co., Ltd. Display screen with icon
KR20150026360A (en) 2013-09-02 2015-03-11 삼성전자주식회사 Method and apparatus for providing multiple applications
USD748673S1 (en) * 2013-09-03 2016-02-02 Samsung Electronics Co., Ltd. Display screen portion with icon
USD751603S1 (en) * 2013-09-03 2016-03-15 Samsung Electronics Co., Ltd. Display screen portion with icon
USD748140S1 (en) * 2013-09-03 2016-01-26 Samsung Electronics Co., Ltd. Display screen portion with icon
AU353073S (en) * 2013-09-03 2013-12-23 Samsung Electronics Co Ltd Display screen with icon for an electronic device
US9846532B2 (en) * 2013-09-06 2017-12-19 Seespace Ltd. Method and apparatus for controlling video content on a display
USD751082S1 (en) * 2013-09-13 2016-03-08 Airwatch Llc Display screen with a graphical user interface for an email application
JP2015066979A (en) * 2013-09-26 2015-04-13 ヤマハ発動機株式会社 Display system for marine vessel and small marine vessel with the same
USD760780S1 (en) * 2013-09-30 2016-07-05 Terumo Kabushiki Kaisha Display screen with icon
US9841944B2 (en) * 2013-10-28 2017-12-12 Lenovo (Beijing) Co., Ltd. Method for processing information and electronic apparatus
KR20150080756A (en) * 2014-01-02 2015-07-10 삼성전자주식회사 Controlling Method For Multi-Window And Electronic Device supporting the same
EP2930049B1 (en) * 2014-04-08 2017-12-06 Volkswagen Aktiengesellschaft User interface and method for adapting a view on a display unit
KR20150135837A (en) * 2014-05-26 2015-12-04 삼성전자주식회사 Electronic Apparatus and Method for Management of Display
CN103995722B (en) * 2014-05-26 2017-08-25 天津三星通信技术研究有限公司 The method of simultaneously open multiple windows on the screen and equipment
US9785340B2 (en) 2014-06-12 2017-10-10 Apple Inc. Systems and methods for efficiently navigating between applications with linked content on an electronic device with a touch-sensitive display
US9648062B2 (en) * 2014-06-12 2017-05-09 Apple Inc. Systems and methods for multitasking on an electronic device with a touch-sensitive display
KR20150144665A (en) * 2014-06-17 2015-12-28 엘지전자 주식회사 Mobile terminal
USD774062S1 (en) 2014-06-20 2016-12-13 Google Inc. Display screen with graphical user interface
USD756398S1 (en) * 2014-06-23 2016-05-17 Google Inc. Portion of a display panel with an animated computer icon
USD754184S1 (en) * 2014-06-23 2016-04-19 Google Inc. Portion of a display panel with an animated computer icon
CN104049866B (en) * 2014-06-25 2019-06-21 努比亚技术有限公司 The implementation method and device of a kind of mobile terminal and its split screen
WO2016004116A1 (en) * 2014-06-30 2016-01-07 Reliance Jio Infocomm Usa, Inc. System and method for providing a user-controlled overlay for user interface
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US9787576B2 (en) 2014-07-31 2017-10-10 Microsoft Technology Licensing, Llc Propagating routing awareness for autonomous networks
KR20160020738A (en) * 2014-08-14 2016-02-24 삼성전자주식회사 Electronic Device And Method For Providing User Interface Of The Same
CN104168515A (en) * 2014-08-21 2014-11-26 三星电子(中国)研发中心 Intelligent television terminal and screen control method thereof
USD851118S1 (en) * 2014-09-02 2019-06-11 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US10338765B2 (en) 2014-09-05 2019-07-02 Microsoft Technology Licensing, Llc Combined switching and window placement
KR20160032604A (en) * 2014-09-16 2016-03-24 삼성전자주식회사 Electronic Device having Independent screen configurations
CN105487742B (en) * 2014-09-18 2019-06-18 北京三星通信技术研究有限公司 The display methods and device of more application widgets
CN104360787A (en) * 2014-10-17 2015-02-18 联想(北京)有限公司 Display method and electronic device
US10073976B2 (en) 2014-10-24 2018-09-11 Samsung Electronics Co., Ltd. Application executing method and device, and recording medium thereof
US20160209973A1 (en) * 2015-01-21 2016-07-21 Microsoft Technology Licensing, Llc. Application user interface reconfiguration based on an experience mode transition
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
USD795917S1 (en) 2015-05-17 2017-08-29 Google Inc. Display screen with an animated graphical user interface
USD770530S1 (en) * 2015-05-27 2016-11-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
CN104965697A (en) * 2015-05-29 2015-10-07 深圳市金立通信设备有限公司 Window display method and terminal
CN105094733A (en) * 2015-06-30 2015-11-25 努比亚技术有限公司 Method and device for split screen display
CN105573740A (en) * 2015-06-30 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Split-screen display mode operation method and terminal
CN104991704A (en) * 2015-07-06 2015-10-21 魅族科技(中国)有限公司 Screen-splitting method for terminal and terminal
USD808421S1 (en) * 2015-07-07 2018-01-23 Google Llc Display screen or portion thereof with a transitional graphical user interface component for identifying current location
KR20170008041A (en) * 2015-07-13 2017-01-23 엘지전자 주식회사 Mobile terminal and control method thereof
CN104991705A (en) * 2015-07-16 2015-10-21 魅族科技(中国)有限公司 Interface display method and terminal
KR20170058152A (en) * 2015-11-18 2017-05-26 삼성전자주식회사 Electronic apparatus and method for configuring of display thereof
CN105511778A (en) * 2015-11-25 2016-04-20 网易(杭州)网络有限公司 Interaction method device for controlling display of multiple game scenes
CN105426150B (en) * 2015-11-27 2018-10-23 青岛海信电器股份有限公司 A kind of multimedia messages playback method and device
US20170199771A1 (en) * 2016-01-08 2017-07-13 Nasdaq, Inc. Systems and methods for calendar synchronization with enterprise web applications
USD793440S1 (en) 2016-01-26 2017-08-01 Google Inc. Display screen with transitional graphical user interface
USD792462S1 (en) 2016-01-26 2017-07-18 Google Inc. Display screen with transitional graphical user interface for image navigation and selection
CN105912192A (en) * 2016-03-31 2016-08-31 联想(北京)有限公司 Display control method and electronic equipment
CN105955802A (en) * 2016-04-21 2016-09-21 青岛海信移动通信技术股份有限公司 Application operation method for mobile terminal, and mobile terminal
CN105892823B (en) * 2016-04-27 2019-06-11 宇龙计算机通信科技(深圳)有限公司 A kind of multiwindow edit methods, system and mobile terminal
CN106020592A (en) * 2016-05-09 2016-10-12 北京小米移动软件有限公司 Split screen display method and device
CN106055252B (en) * 2016-05-30 2019-04-30 努比亚技术有限公司 Mobile terminal and its split screen display available processing method
USD808428S1 (en) 2016-06-29 2018-01-23 Quantcast Corporation Display screen or portion thereof with icon
KR20180041911A (en) * 2016-10-17 2018-04-25 삼성전자주식회사 Electronic device and method of controlling display in the electronic device
CN106502513A (en) * 2016-10-31 2017-03-15 珠海我爱拍科技有限公司 Split-screen display technology based on android system
CN106534914A (en) * 2016-10-31 2017-03-22 努比亚技术有限公司 Split screen display device, mobile terminal and method
KR20180067855A (en) * 2016-12-13 2018-06-21 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN106843638A (en) * 2016-12-26 2017-06-13 北京奇艺世纪科技有限公司 Control method and device for video playing terminal and video playing terminal
US10203982B2 (en) * 2016-12-30 2019-02-12 TCL Research America Inc. Mobile-phone UX design for multitasking with priority and layered structure
USD823871S1 (en) * 2017-02-03 2018-07-24 Google Llc Display screen with animated graphical user interface
CN109324846A (en) * 2017-07-28 2019-02-12 北京小米移动软件有限公司 Application display method and device, storage medium
WO2019061541A1 (en) * 2017-09-30 2019-04-04 华为技术有限公司 Method for editing main screen, graphical user interface and electronic device
KR20190089374A (en) * 2018-01-22 2019-07-31 삼성전자주식회사 Elelctronic device for controlling a plurality of applications

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070097092A1 (en) 2005-10-31 2007-05-03 Samsung Electronics Co., Ltd. Method of using a touch screen and user interface apparatus employing the same
US20120176322A1 (en) 2011-01-07 2012-07-12 Qualcomm Incorporated Systems and methods to present multiple frames on a touch screen

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994024657A1 (en) * 1993-04-20 1994-10-27 Apple Computer Inc. Interactive user interface
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
US7081887B2 (en) * 2002-12-19 2006-07-25 Intel Corporation Method and apparatus for positioning a software keyboard
KR100831721B1 (en) * 2006-12-29 2008-05-22 엘지전자 주식회사 Apparatus and method for displaying of mobile terminal
US8549429B2 (en) * 2007-01-25 2013-10-01 Sharp Kabushiki Kaisha Multi-window management apparatus and program, storage medium and information processing apparatus
JP5253937B2 (en) * 2008-09-08 2013-07-31 株式会社エヌ・ティ・ティ・ドコモ Information processing apparatus and program
KR101548958B1 (en) * 2008-09-18 2015-09-01 삼성전자주식회사 A method for operating control in mobile terminal with touch screen and apparatus thereof.
KR101546782B1 (en) * 2008-10-02 2015-08-25 삼성전자주식회사 Apparatus and method for composing idle screen in a portable terminal
KR20100048297A (en) * 2008-10-30 2010-05-11 에스케이텔레시스 주식회사 Screen controlling apparatus and method thereof for mobile terminal
KR101640460B1 (en) * 2009-03-25 2016-07-18 삼성전자 주식회사 Operation Method of Split Window And Portable Device supporting the same
FR2953590B1 (en) * 2009-12-03 2012-08-03 Mobile Devices Ingenierie Information device for vehicle driver and method for controlling such a device.
EP2354914A1 (en) * 2010-01-19 2011-08-10 LG Electronics Inc. Mobile terminal and control method thereof
US9367205B2 (en) * 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
DE202011110735U1 (en) * 2010-04-06 2015-12-10 Lg Electronics Inc. Mobile terminal
US9483175B2 (en) * 2010-07-26 2016-11-01 Apple Inc. Device, method, and graphical user interface for navigating through a hierarchy
KR101657122B1 (en) * 2010-09-15 2016-09-30 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20120131483A1 (en) * 2010-11-22 2012-05-24 International Business Machines Corporation Drag-and-drop actions for web applications using an overlay and a set of placeholder elements
KR101767504B1 (en) * 2010-12-01 2017-08-11 엘지전자 주식회사 Mobile terminal and operation method thereof
KR101251761B1 (en) * 2011-05-13 2013-04-05 주식회사 케이티 Method for Data Transferring Between Applications and Terminal Apparatus Using the Method
US20130057587A1 (en) * 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US9182935B2 (en) * 2011-09-27 2015-11-10 Z124 Secondary single screen mode activation through menu option
KR20130054074A (en) * 2011-11-16 2013-05-24 삼성전자주식회사 Apparatus displaying event view on splited screen and method for controlling thereof
KR101888457B1 (en) * 2011-11-16 2018-08-16 삼성전자주식회사 Apparatus having a touch screen processing plurality of apllications and method for controlling thereof
DE102013002891A1 (en) * 2013-03-22 2014-09-25 Volkswagen Aktiengesellschaft An information reproduction system for a vehicle and method for providing information to the user of a vehicle

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070097092A1 (en) 2005-10-31 2007-05-03 Samsung Electronics Co., Ltd. Method of using a touch screen and user interface apparatus employing the same
US20120176322A1 (en) 2011-01-07 2012-07-12 Qualcomm Incorporated Systems and methods to present multiple frames on a touch screen

Also Published As

Publication number Publication date
KR20140039575A (en) 2014-04-02
ES2706010T3 (en) 2019-03-27
AU2013318697B2 (en) 2018-09-20
CN103677627A (en) 2014-03-26
EP3493042A1 (en) 2019-06-05
EP2725466B1 (en) 2018-12-26
US20140089833A1 (en) 2014-03-27
AU2013318697A1 (en) 2015-02-26
WO2014046525A1 (en) 2014-03-27
EP2725466A1 (en) 2014-04-30

Similar Documents

Publication Publication Date Title
US10345992B2 (en) Method for displaying unread message contents and electronic device thereof
JP5912083B2 (en) User interface providing method and apparatus
JP5743232B2 (en) Application operation method and apparatus for touch device having touch-based input interface
AU2013201324B2 (en) Electronic device and method of controlling the same
US9338749B2 (en) Mobile terminal and battery power saving mode switching method thereof
EP3019944B1 (en) User terminal device for supporting user interaction and methods thereof
US8904311B2 (en) Method, apparatus, and computer program product for implementing a variable content movable control
US9983664B2 (en) Mobile device for executing multiple applications and method for same
EP2615535B1 (en) Mobile terminal and method of controlling the same
KR101704531B1 (en) Method and apparatus for displaying text information in mobile terminal
JP5974068B2 (en) Terminal and display method thereof
AU2013215357B2 (en) Navigating among content items in a browser using an array mode
KR20130076397A (en) Method and apparatus for multi-tasking in a user device
US20130227413A1 (en) Method and Apparatus for Providing a Contextual User Interface on a Device
US20140013254A1 (en) System and method for rearranging icons displayed in a graphical user interface
US9645730B2 (en) Method and apparatus for providing user interface in portable terminal
EP2624119B1 (en) Electronic device and method of controlling the same
KR20130093043A (en) Method and mobile device for user interface for touch and swipe navigation
KR101640463B1 (en) Operation Method And Apparatus For Portable Device
EP2809055A2 (en) Method and apparatus for controlling screen display using environmental information
US9703456B2 (en) Mobile terminal
US10152196B2 (en) Mobile terminal and method of operating a message-based conversation for grouping of messages
US9736218B2 (en) Device, system and method for processing character data
KR101660746B1 (en) Mobile terminal and Method for setting application indicator thereof
KR101788049B1 (en) Mobile terminal and method for controlling thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant