CN107967087B - Display apparatus and method of controlling the same - Google Patents


Info

Publication number: CN107967087B
Authority
CN
China
Prior art keywords: window, application, display, application window, controller
Legal status: Active
Application number: CN201711096847.4A
Other languages: Chinese (zh)
Other versions: CN107967087A
Inventors: 金永振, 金刚兑, 朴大旭, 金泰秀, 崔祯桓, 金圣熙
Current Assignee: Samsung Electronics Co Ltd
Original Assignee: Samsung Electronics Co Ltd
Priority claimed from KR1020130022422A (external priority; KR102172792B1)
Application filed by Samsung Electronics Co Ltd
Publication of CN107967087A
Application granted
Publication of CN107967087B

Classifications

    • G: Physics; G06: Computing, Calculating or Counting; G06F: Electric Digital Data Processing
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/04817: GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment (e.g. desktop elements such as windows or icons, or a cursor's changing behaviour or appearance), using icons
    • G06F3/04845: GUI interaction techniques for the control of specific functions or operations (e.g. selecting or manipulating an object, an image or a displayed text element), for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04886: GUI interaction techniques using a touch-screen or digitiser (e.g. input of commands through traced gestures), by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A display device having a touch screen and running at least one application, and a method for controlling the same, are provided. The method includes: receiving an application run command for running at least one application; determining at least one of a size and a position of a window in which to run the at least one application according to a position at which the application run command is input; and displaying the window according to the at least one of the size and the position of the window.

Description

Display apparatus and method of controlling the same
The present application is a divisional application of the invention patent application filed on December 6, 2013, with application number 201380071613.8, entitled "Display device and method of controlling display device".
Technical Field
The present disclosure relates to a display apparatus and a method for controlling the same. More particularly, the present disclosure relates to a display device displaying a window in which an application is executed and a method for controlling the same.
Background
Desktop computers are equipped with at least one display device (e.g., a monitor). Similarly, mobile devices with touch screens (e.g., mobile phones, smart phones, tablet Personal Computers (PCs), etc.) are also equipped with display devices.
A user of a desktop computer may divide the screen of the display device to suit the task environment (e.g., split the screen horizontally or vertically and open a plurality of windows in the divided regions). When running a web browser, the user may scroll a web page up or down with the page up and page down keys on the keyboard. If the user uses a mouse instead of a keyboard, the user can scroll the web page up or down by selecting a scroll bar at one side of the web page with the mouse cursor. The user may also move to the top of the web page by selecting a "top" button displayed as text or an icon at the bottom of the web page.
Compared to desktop computers, mobile devices have a small screen size and limited means of input. Therefore, it is difficult to divide the screen on a mobile device.
A variety of applications may be run on a mobile device. These include basic applications installed by the manufacturer during the manufacturing process and additional applications downloaded from an application sales website. Additional applications may be developed by ordinary users and registered with the application sales website, so anyone can freely sell applications he or she has developed to mobile users through such a website. Currently, tens of thousands to hundreds of thousands of free or paid applications are available for each mobile device product.
Disclosure of Invention
Technical problem
While mobile devices are provided with many applications that stimulate user interest and meet user needs, mobile devices have limitations in display size and User Interface (UI) due to their portable size. As a result, users feel inconvenienced when running multiple applications on their mobile devices. For example, when a user runs an application on a mobile device, the application is displayed across the entire display area. If the user wants to run another application while the current application is running, the user needs to first end the ongoing application and then select a run key to run the desired application. That is, in order to run multiple applications, the user has to repeat a cumbersome cycle of running and terminating each application in turn. A method of simultaneously running a plurality of applications on a mobile device has yet to be provided.
As described above, although mobile devices are provided with many applications that stimulate user interest and meet user needs, mobile devices have limitations in display size and UI due to their portable size. As a result, users feel inconvenienced when running multiple applications in their mobile devices.
Therefore, there is a need to develop a method for displaying multiple windows on a single display. Further, a method is needed for easily invoking a plurality of windows and for conveniently arranging the windows once they have been invoked.
More specifically, when a plurality of overlapping windows is displayed, a mechanism is needed for switching from the currently displayed window to another, lower-priority window.
The above information is presented as background information only to aid in understanding the present disclosure. It is not determined or claimed whether any of the above information may be used as prior art with respect to the present disclosure.
Technical scheme
Aspects of the present disclosure are to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a display apparatus that runs a plurality of windows in various sizes on a single display and facilitates switching from one window to another lower-level window, and a method for controlling the same.
According to an aspect of the present disclosure, there is provided a method for controlling a display device that has a touch screen and runs at least one application. The method includes: receiving an application run command for running at least one application; determining at least one of a size and a position of a window in which to run the at least one application according to a position at which the application run command is input; and displaying the window according to the at least one of the size and the position of the window.
According to another aspect of the present disclosure, a display apparatus is provided. The display device includes a touch screen configured to receive an application run command for running at least one application, and a controller configured to: determine at least one of a size and a position of a window in which to run the at least one application according to a position at which the application run command is input; and control display of the window on the touch screen according to the at least one of the size and the position of the window.
According to another aspect of the present disclosure, a method for running an application in a display device including a touch screen is provided. The method includes: displaying a running window of an application in each of a plurality of regions of the touch screen; displaying a button on at least one boundary separating the plurality of regions; receiving an input selecting the button; and displaying, in a specific region among the plurality of regions, a list of at least one application running in that region according to the received input.
According to another aspect of the present disclosure, a method for running an application in a display device including a touch screen is provided. The method includes: displaying a running window of an application in each of a plurality of regions of the touch screen; displaying a button on at least one boundary separating the plurality of regions; displaying a list of at least one application execution icon in a partial region of the touch screen; receiving a drag input that drags an application execution icon from the list; determining a region in which to run a new application according to the end position of the drag input and the position of the button; and displaying an execution window of the application corresponding to the application execution icon in the determined region.
According to another aspect of the present disclosure, a display apparatus is provided. The display device includes: a touch screen configured to display a running window of an application in each of a plurality of regions, display a button on at least one boundary separating the plurality of regions, and receive an input selecting the button; and a controller configured to display, in a specific region among the plurality of regions, a list of at least one application running in that region according to the received input.
According to another aspect of the present disclosure, a display apparatus is provided. The display device includes a touch screen configured to: display a running window of an application in each of a plurality of regions; display a button on at least one boundary separating the plurality of regions; display a list of at least one application execution icon in a partial region of the touch screen; and receive a drag input that drags an application execution icon from the list; and a controller configured to: determine a region in which to run a new application based on the end position of the drag input and the position of the button, and control the touch screen to display an execution window of the application corresponding to the application execution icon in the determined region.
Other aspects, advantages and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the disclosure.
Drawings
The above and other aspects, features and advantages of certain embodiments of the present disclosure will become more apparent from the following description when taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram of a display device according to an embodiment of the present disclosure;
FIGS. 2a, 2b, 2c, 2d, 2e, 2f, 2g, 2h, 2i, 2j, and 2k illustrate a window operation method according to an embodiment of the present disclosure;
FIGS. 3a, 3b, 3c, 3d, 3e, 3f, 3g, 3h, and 3i illustrate action stacks managed in a display device according to an embodiment of the present disclosure;
FIG. 4a is a flowchart illustrating a method for controlling a display device according to an embodiment of the present disclosure;
FIG. 4b is a flowchart illustrating a method for controlling a display device according to an embodiment of the present disclosure;
FIG. 5 shows a display order (Z-order) of windows according to an embodiment of the present disclosure;
FIGS. 6a, 6b, 6c, and 6d illustrate application execution methods according to embodiments of the present disclosure;
FIG. 7 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present disclosure;
FIGS. 8a, 8b, 8c, 8d, 8e, 8f, 8g, 8h, 8i, 8j, 8k, 8l, and 8m illustrate a method for displaying multiple windows according to an embodiment of the present disclosure;
FIGS. 9a, 9b, 9c, 9d, 9e, 9f, 9g, and 9h illustrate layouts according to embodiments of the present disclosure;
FIGS. 10a, 10b, 10c, and 10d illustrate screens of a display device according to an embodiment of the present disclosure;
FIGS. 11a, 11b, and 11c illustrate screens of a display device according to an embodiment of the present disclosure;
FIGS. 12a, 12b, and 12c illustrate screens of a display device according to an embodiment of the present disclosure;
FIGS. 13a, 13b, and 13c illustrate screens of a display device according to an embodiment of the present disclosure;
FIGS. 14a, 14b, and 14c illustrate screens of a display device according to an embodiment of the present disclosure;
FIGS. 15a, 15b, and 15c illustrate screens of a display device according to an embodiment of the present disclosure;
FIGS. 16a, 16b, 16c, and 16d illustrate screens of a display device according to an embodiment of the present disclosure;
FIG. 17 illustrates a screen of a display device according to an embodiment of the present disclosure;
FIGS. 18a and 18b illustrate a 9-region splitting mode according to an embodiment of the present disclosure;
FIG. 19 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present disclosure;
FIG. 20 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present disclosure;
FIGS. 21a, 21b, and 21c illustrate screens of a display device according to an embodiment of the present disclosure;
FIG. 22 illustrates an action stack according to an embodiment of the present disclosure;
FIGS. 23a and 23b illustrate screens of a display device describing a Z-order change according to an embodiment of the present disclosure;
FIG. 24 illustrates an action stack according to an embodiment of the present disclosure;
FIGS. 25a and 25b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIG. 26 illustrates an action stack according to an embodiment of the present disclosure;
FIGS. 27a and 27b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIG. 28 illustrates an action stack according to an embodiment of the present disclosure;
FIGS. 29a and 29b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIG. 30 illustrates an action stack according to an embodiment of the present disclosure;
FIGS. 31a and 31b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIG. 32 illustrates an action stack according to an embodiment of the present disclosure;
FIGS. 33a and 33b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIG. 34 illustrates an action stack according to an embodiment of the present disclosure;
FIG. 35 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present disclosure;
FIGS. 36a, 36b, and 36c illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIGS. 37a, 37b, and 37c illustrate action stacks according to embodiments of the present disclosure;
FIGS. 38a, 38b, and 38c illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure;
FIGS. 39a, 39b, and 39c illustrate action stacks according to an embodiment of the present disclosure;
FIGS. 40a, 40b, 40c, 40d, 40e, 40f, 40g, 40h, 40i, 40j, and 40k illustrate a method for displaying an application execution window according to an embodiment of the present disclosure;
FIGS. 41a, 41b, 41c, 41d, 41e, and 41f illustrate action stacks according to various embodiments of the present disclosure;
FIG. 42 is a flowchart illustrating a method for running an application in a display device according to an embodiment of the present disclosure;
FIGS. 43a and 43b illustrate a method for controlling a display area of an application launch window using a center button according to an embodiment of the present disclosure;
FIGS. 44a, 44b, 44c, 44d, 44e, 44f, 44g, 44h, 44i, 44j, 44k, 44l, 44m, 44n, 44o, 44p, 44q, 44r, 44s, 44t, 44u, 44v, and 44w illustrate a method for running multiple applications according to an embodiment of the present disclosure;
FIGS. 45a, 45b, 45c, 45d, 45e, 45f, 45g, 45h, 45i, and 45j illustrate action stacks according to embodiments of the present disclosure;
FIG. 46 is a flowchart illustrating a method for providing a user interface on which to run an application in a display device according to an embodiment of the present disclosure;
FIG. 47 is a flowchart illustrating a method for running an application in a display device according to an embodiment of the present disclosure;
FIG. 48 is a block diagram of a display device according to an embodiment of the present disclosure; and
FIGS. 49a, 49b, 49c, and 49d are diagrams illustrating a method for displaying buttons according to an embodiment of the present disclosure.
Throughout the drawings, like reference numerals will be understood to refer to like parts, components and structures.
Detailed Description
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. The following description includes various specific details to aid understanding, but these specific details should be construed as merely illustrative. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to bibliographic meanings, but are used by the inventors solely for the purpose of providing a clear and consistent understanding of the disclosure. Accordingly, it will be apparent to those skilled in the art that the following descriptions of the various embodiments of the present disclosure are provided for illustration only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, when referring to "a component surface," one or more such surfaces are included.
By the term "substantially" it is meant that the recited characteristic, parameter or value need not be implemented exactly, but that deviations or variations may occur, including for example tolerances, measurement errors, measurement accuracy limits and other factors known to the person skilled in the art, which in any case do not exclude the effect of the characteristic which is intended to be provided.
When ordinal numbers such as first, second, etc. are used to describe a plurality of elements, the elements are not limited by the term. The term is used to distinguish one component from another. For example, within the scope and spirit of the present disclosure, a first component may be termed a second component, or vice versa. The term "and/or" is meant to include one of or a combination of a plurality of the described associated items.
The terminology used herein is provided to describe various embodiments and is not intended to be limiting of the disclosure. As used herein, the singular encompasses the plural unless the context clearly dictates otherwise. In this specification, the terms "comprising" or "having" are not to be construed as necessarily including all of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification. Rather, it should be understood that there are possibilities to omit or add one or more features, numbers, steps, operations, components, parts, or combinations thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the term belongs. Further, terms defined in a general dictionary will be understood to have the same meaning as the context of the prior art. Unless expressly defined herein, terms are not to be construed in an idealized or overly formal sense.
Fig. 1 is a block diagram of a display device according to an embodiment of the present disclosure.
Referring to fig. 1, the display apparatus 100 may be connected to an external apparatus (not shown) through the mobile communication module 120, the sub communication module 130, or the connector 165. The term "external device" encompasses a variety of devices, such as another device (not shown), a mobile phone (not shown), a smartphone (not shown), a tablet Personal Computer (PC) (not shown), a server (not shown), and so forth.
The display device 100 includes a touch screen 190 and a touch screen controller 195. The display apparatus 100 further includes a controller 110, a mobile communication module 120, a sub communication module 130, a multimedia module 140, a camera module 150, a Global Positioning System (GPS) module 155, an input/output (I/O) module 160, a sensor module 170, a memory (storage) 175, and a power supply 180. The sub communication module 130 includes at least one of a Wireless Local Area Network (WLAN) module 131 and a short range communication module 132. The multimedia module 140 includes at least one of a broadcast communication module 141, an audio play module 142, and a video play module 143. The camera module 150 includes at least one of the first camera 151 and the second camera 152, and the I/O module 160 includes at least one of a button 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, and a keypad 166.
The controller 110 may include a Central Processing Unit (CPU) 111, a Read Only Memory (ROM) 112 storing a control program for controlling the display apparatus 100, and a Random Access Memory (RAM) 113 serving as a storage space for operations performed by the display apparatus 100. The CPU 111 may include one or more cores. The CPU 111, ROM 112, and RAM 113 may be connected to one another through an internal bus.
The controller 110 may control the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, the memory 175, the power supply 180, the touch screen 190, and the touch screen controller 195.
The mobile communication module 120 connects the display apparatus 100 to an external apparatus through mobile communication via one or more antennas (not shown) under the control of the controller 110. The mobile communication module 120 transmits or receives a wireless signal to or from a mobile phone (not shown), a smart phone (not shown), a tablet PC (not shown), or another device (not shown) having a phone number input to the display device 100 for a voice call, a video call, a Short Message Service (SMS), or a Multimedia Message Service (MMS).
The sub communication module 130 may include at least one of a WLAN module 131 and a short range communication module 132.
The WLAN module 131 may connect to the Internet, under the control of the controller 110, at a location where a wireless Access Point (AP) (not shown) is installed. The WLAN module 131 supports the Institute of Electrical and Electronics Engineers (IEEE) WLAN standard IEEE 802.11x. The short-range communication module 132 may conduct short-range wireless communication between the display apparatus 100 and an image forming apparatus (not shown) under the control of the controller 110. The short-range communication may conform to Bluetooth, Infrared Data Association (IrDA), ZigBee, and the like.
The display apparatus 100 may include at least one of the mobile communication module 120, the WLAN module 131, and the short range communication module 132 according to its capability. For example, the display device 100 may include a combination of the mobile communication module 120, the WLAN module 131, and the short range communication module 132 according to its capability.
The multimedia module 140 may include a broadcast communication module 141, an audio play module 142, or a video play module 143. The broadcast communication module 141 may receive a broadcast signal, such as a television broadcast signal, a radio broadcast signal, or a data broadcast signal, and additional broadcast information, such as an Electronic Program Guide (EPG) or an Electronic Service Guide (ESG), from a broadcasting station through a broadcast communication antenna (not shown) under the control of the controller 110. The audio play module 142 may play stored or received digital audio files (e.g., files having extensions such as mp3, wma, ogg, or wav) under the control of the controller 110. The video play module 143 may play stored or received digital video files (e.g., files having extensions such as mpeg, mpg, mp4, avi, mov, or mkv) under the control of the controller 110. The video play module 143 can also play digital audio files.
The multimedia module 140 may include an audio play module 142 and a video play module 143 without the broadcast communication module 141. Alternatively, the audio play module 142 or the video play module 143 of the multimedia module 140 may be incorporated into the controller 110.
The camera module 150 may include at least one of a first camera 151 and a second camera 152, which capture still images or video under the control of the controller 110. The first camera 151 or the second camera 152 may include an auxiliary light source (e.g., a flash (not shown)) that provides the light intensity needed to capture an image. The first camera 151 may be disposed on the front surface of the display apparatus 100, and the second camera 152 may be disposed on the rear surface of the display apparatus 100. Alternatively, the first camera 151 and the second camera 152 may be arranged close to each other (e.g., the distance between them may be between 1 cm and 8 cm) in order to capture a three-dimensional still image or video.
The GPS module 155 may receive signals from a plurality of GPS satellites (not shown) in Earth orbit and calculate the position of the display device 100 based on the Times of Arrival (ToAs) of the satellite signals at the display device 100.
I/O module 160 may include at least one of a plurality of buttons 161, a microphone 162, a speaker 163, a vibrating motor 164, a connector 165, and a keypad 166.
The buttons 161 may be formed on a front surface, a side surface, or a rear surface of the housing of the display device 100, and may include an on/off button (not shown), a volume button (not shown), a menu button, a home button, a back button, and a search button.
The microphone 162 receives voice or sound and converts the received voice or sound into an electrical signal under the control of the controller 110.
The speaker 163 may output sounds corresponding to various signals (e.g., wireless signals, broadcast signals, digital audio files, digital video files, photo-taking, etc.) received from the mobile communication module 120, the sub communication module 130, the multimedia module 140, and the camera module 150 to the outside of the display apparatus 100. The speaker 163 may output a sound corresponding to a function (e.g., a button operation sound, a ring back tone for a call, etc.) performed by the display apparatus 100. One or more speakers 163 may be disposed at a suitable location or locations of the housing of the display device 100.
The vibration motor 164 may convert the electrical signal into mechanical vibration under the control of the controller 110. For example, when the display apparatus 100 receives an incoming voice call from another mobile apparatus (not shown) in a vibration mode, the vibration motor 164 operates. One or more vibration motors 164 may be installed inside the housing of the display device 100. The vibration motor 164 may operate in response to a user touch on the touch screen 190 and continuous movement of the touch on the touch screen 190.
The connector 165 may be used as an interface to connect the display device 100 to an external device (not shown) or a power source (not shown). The connector 165 may transmit data stored in the memory 175 to an external device via a cable connected to the connector 165 under the control of the controller 110, or may receive data from the external device via the cable. The display device 100 may receive power from a power supply or charge a battery (not shown) via a cable connected to the connector 165.
The keypad 166 may receive key input from a user for controlling the display device 100. The keypad 166 includes a physical keypad (not shown) formed on the display device 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad may be omitted depending on the capability or configuration of the display apparatus 100.
The sensor module 170 includes at least one sensor (not shown) for detecting a state of the display apparatus 100. The sensor module 170 may include a proximity sensor that detects whether a user approaches the display device 100, an illuminance sensor that detects an amount of ambient light around the display device 100, or a motion sensor that detects a motion (e.g., rotational acceleration, vibration, etc.) of the display device 100. At least one sensor may detect a state of the display apparatus 100, generate a signal corresponding to the detected state, and transmit the generated signal to the controller 110. Sensors may be added to the sensor module 170 or removed from the sensor module 170 depending on the capabilities of the device 100.
The memory 175 may store signals or data that are input and output according to operations of the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, and the touch screen 190, under the control of the controller 110. The memory 175 may also store programs and applications for controlling the display apparatus 100 or the controller 110.
The term "memory" encompasses the memory 175 installed in the display device 100, the ROM 112 and RAM 113 within the controller 110, and a memory card (not shown), such as a Secure Digital (SD) card or a memory stick. The memory 175 may include a nonvolatile memory, a volatile memory, a Hard Disk Drive (HDD), or a Solid State Drive (SSD).
The power supply 180 may supply power to one or more batteries (not shown) mounted in a housing of the display device 100 under the control of the controller 110. One or more batteries supply power to the display device 100. Further, the power supply unit 180 may supply power received from an external power supply (not shown) to the display apparatus 100 via a cable connected to the connector 165.
The touch screen 190 may provide a user with a User Interface (UI) corresponding to various services (e.g., call, data transmission, broadcast, photography, etc.). The touch screen 190 may send analog signals corresponding to at least one touch on the UI to the touch screen controller 195. The touch screen 190 may receive at least one touch input through a body part (e.g., a finger) of a user or a touch input device (e.g., a stylus pen). The touch screen 190 may also receive a touch input signal corresponding to a continuous motion of one touch among the one or more touches. The touch screen 190 may send analog signals corresponding to the input touch to the touch screen controller 195.
As used in this description, "touch" may include contactless touch (i.e., a detectable gap between the touch screen 190 and a portion of the user or a touch input device is 1mm or less), and need not be limited to contact between the touch screen 190 and a body portion of the user or a touch input tool. The gap detectable by the touch screen 190 may vary depending on the capabilities or configuration of the display device 100.
For example, the touch screen 190 may be implemented as a resistive type, a capacitive type, an infrared type, or an acoustic wave type.
The touch screen controller 195 converts analog signals received from the touch screen 190 into digital signals (e.g., X and Y coordinates). The controller 110 may control the touch screen 190 using digital signals received from the touch screen controller 195. For example, the controller 110 may control selection or execution of a shortcut icon (not shown) displayed on the touch screen 190 in response to a touch. The touch screen controller 195 may be incorporated into the controller 110.
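The signal path just described (an analog touch sample comes in, digital X and Y coordinates go out, and the controller reacts to them) can be pictured with a minimal sketch. This is an illustration only, assuming a normalized analog sample and a callback-style controller; the type and function names below are assumptions for this example and are not the patent's, nor any real touch-controller API.
```kotlin
// Illustrative sketch only: converting an analog touch sample to digital
// coordinates and handing them to the controller. All names are hypothetical.
data class TouchPoint(val x: Int, val y: Int)

class TouchScreenController(private val onTouch: (TouchPoint) -> Unit) {
    // Convert a normalized analog sample (0.0..1.0 on each axis) to screen coordinates.
    fun onAnalogSample(nx: Double, ny: Double, width: Int, height: Int) {
        val point = TouchPoint((nx * width).toInt(), (ny * height).toInt())
        onTouch(point)   // hand the digital coordinates to the controller
    }
}

fun main() {
    val controller = TouchScreenController { p -> println("touched at ${p.x}, ${p.y}") }
    controller.onAnalogSample(0.25, 0.75, width = 1280, height = 800)
}
```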
Fig. 2a, 2b, 2c, 2d, 2e, 2f, 2g, 2h, 2i, 2j, and 2k illustrate a window operation method according to an embodiment of the present disclosure. Those skilled in the art will readily understand that the display apparatus 200 may be any one of the display apparatus 100 shown in fig. 1, a standard TV (television), an internet TV, a medical data display apparatus, and the like. Thus, any device may be used as the display device as long as it is equipped with means for displaying the presented image.
Referring to fig. 2a, the display device 200 may define a plurality of window display areas 201, 202, 203, and 204 on the touch screen. For example, a controller (not shown) may configure the first window display area 201, the second window display area 202, the third window display area 203, and the fourth window display area 204. The controller may set a first boundary line 211 between the first window display area 201 and the second window display area 202, a second boundary line 212 between the third window display area 203 and the fourth window display area 204, a third boundary line 213 between the first window display area 201 and the third window display area 203, and a fourth boundary line 214 between the second window display area 202 and the fourth window display area 204. The first boundary line 211 and the second boundary line 212 may be connected as a single line, and the third boundary line 213 and the fourth boundary line 214 may be connected as a single line. The controller configures the first to fourth window display areas 201, 202, 203 and 204 so that they do not overlap with each other. Referring to fig. 2a, for example, the controller defines the first window display region 201 at an upper left corner, the second window display region 202 at an upper right corner, the third window display region 203 at a lower left corner, and the fourth window display region 204 at a lower right corner. The controller divides the screen into left and right halves by first and second boundary lines 211 and 212, and divides the screen into upper and lower halves by third and fourth boundary lines 213 and 214.
The controller displays the center button 220 at the intersection where the first boundary line 211 and the second boundary line 212 intersect the third boundary line 213 and the fourth boundary line 214. The center button 220 may be a function key that changes the size of an application display area or shifts the display apparatus 200 to a window repositioning mode.
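As a rough illustration of the four-way split and the center button just described, the following sketch models the window display areas as rectangles whose shared corner follows the center button; dragging the button resizes all four areas at once. The patent does not specify an implementation, so the names (Rect, ScreenLayout, moveCenter) and the clamping behaviour are assumptions made for this example.
```kotlin
// Hedged sketch of a screen divided into four non-overlapping window display
// areas around a movable center point, as in FIG. 2a. Names are illustrative.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

class ScreenLayout(private val width: Int, private val height: Int) {
    // The center button sits where the four boundary lines meet.
    var center = Pair(width / 2, height / 2)

    // Areas 201..204: upper-left, upper-right, lower-left, lower-right.
    fun windowDisplayAreas(): List<Rect> {
        val (cx, cy) = center
        return listOf(
            Rect(0, 0, cx, cy),          // first window display area (201)
            Rect(cx, 0, width, cy),      // second window display area (202)
            Rect(0, cy, cx, height),     // third window display area (203)
            Rect(cx, cy, width, height)  // fourth window display area (204)
        )
    }

    // Dragging the center button moves the shared corner of all four areas.
    fun moveCenter(x: Int, y: Int) {
        center = Pair(x.coerceIn(0, width), y.coerceIn(0, height))
    }
}
```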
The controller controls display of a window in each of the window display areas 201, 202, 203, and 204 to run an application in the window. For example, as shown in fig. 2b, 2c, 2d, 2e, 2f, 2g, 2h, 2i, 2j, and 2k, the controller controls the display of a window in each of the window display areas 201, 202, 203, and 204.
A window may include an execution screen of a specific application and a title bar for the executed application. Objects related to the application may be displayed on the execution screen of the application. The objects may take a variety of forms, such as text, graphics, icons, buttons, check boxes, photos, videos, web pages, maps, and so forth. When a user touches an object, a function or event corresponding to the touched object may be performed in the application. An object may be referred to as a view, depending on the Operating System (OS). The title bar may include at least one control key for controlling the display of the window. For example, the at least one control key may include a window minimize button, a window maximize button, and a window close button.
An application is a program written independently by the manufacturer of the display device 200 or by an application developer. Therefore, running one application does not require another application to be run first, and even when one application ends, another application can continue to run.
Applications are configured independently of one another, in contrast to a combined-function application (or dual application) designed by adding some functions available in other applications (e.g., a memo function, a message sending/receiving function, etc.) to one application (e.g., a video application). Unlike existing applications, a combined-function application is a single application configured to include a plurality of functions. Therefore, a combined-function application provides only limited functions compared to the variety of existing individual applications, and the user must additionally purchase such a new combined-function application.
Referring to fig. 2b, the controller controls display of the first window 230, in which a launcher application (application L) is run, in the first window display area 201. As shown in fig. 2b, the launcher application L displays available application icons 231, 232, 233, 234, 235, 236, 237, and 238. Upon receiving an application run command through a touch on one of the application icons 231, 232, 233, 234, 235, 236, 237, and 238, the launcher application L displays the application corresponding to the touched icon in one of the first to fourth window display areas 201, 202, 203, and 204.
Fig. 3a, 3b, 3c, 3d, 3e, 3f, 3g, 3h and 3i illustrate action stacks managed in a display device according to an embodiment of the present disclosure.
Referring to fig. 3a, the controller generates and manages a launcher application stack 301 in the action stack in response to the execution of the launcher application.
Referring to fig. 2c and 2d, the user 1 may touch the icon 232 representing the application B. When the icon 232 representing the application B is touched, the controller controls to display a second window 240 in which the application B is executed in the second window display area 202. The controller may display windows in the first to fourth window display regions 201, 202, 203 and 204 in order. For example, the controller may control the display of new windows in the clockwise order of the second window display area 202, the third window display area 203, and the fourth window display area 204. The clockwise window display order is one example of controlling the display of new windows, and thus, the controller may control the display of new windows in a counterclockwise order. The order in which new windows are displayed in the window display areas 201, 202, 203, and 204 may be changed.
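The placement rule just described (new windows go to the display areas in a fixed, changeable rotation) could be sketched as follows. The class and method names are assumptions for illustration; the order is passed in because, as noted above, it may be changed, and when no area is free the patent instead reuses the area of the lowest-Z-order window, as the later figures show.
```kotlin
// Rough sketch of the rotation described above: a new window goes to the next
// display area in a configurable order, skipping areas already in use.
class WindowAreaRotation(private val rotation: List<Int>) {
    fun nextFreeArea(occupiedAreas: Set<Int>): Int? =
        rotation.firstOrNull { it !in occupiedAreas }
}

fun main() {
    // Order taken from the description above (areas 202, 203, then 204).
    val rotation = WindowAreaRotation(listOf(202, 203, 204))
    println(rotation.nextFreeArea(setOf(202)))             // 203
    println(rotation.nextFreeArea(setOf(202, 203, 204)))   // null: no free area left
}
```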
FIG. 3b shows an action stack corresponding to the window displayed in fig. 2d. The controller generates an application B stack 302 in the action stack in response to the execution of application B. The controller places the last-run application B stack 302 on top of the launcher application stack 301. This may imply that the Z-order of application B (which may also be described as its order, rank, or priority) is higher than the Z-order of the launcher application, application L.
Referring to fig. 2e, the user 1 may touch an icon 233 corresponding to the application C.
Fig. 3c shows an action stack corresponding to the window shown in fig. 2e. Because, as shown in fig. 2e, user 1 entered an application run command to the launcher application (application L), it can be seen from fig. 3c that the Z-order of the launcher application L is higher than the Z-order of application B.
Referring to fig. 2f, when the icon 233 representing the application C is touched, the controller controls the display of the third window 250 in which the application C is executed in the fourth window display area 204.
Fig. 3d shows an action stack corresponding to the window shown in fig. 2f. The controller generates an application C stack 303 in the action stack in response to the execution of application C. The controller places the last-run application C stack 303 on top of the launcher application stack 301. This may imply that the Z-order of application C is higher than the Z-order of the launcher application, application L.
Referring to fig. 2g, user 1 may touch an icon 234 representing application D.
Fig. 3e shows an action stack corresponding to the window shown in fig. 2g. Because, as shown in fig. 2g, user 1 entered an application run command to the launcher application (application L), it can be seen from fig. 3e that the Z-order of the launcher application L is higher than the Z-order of application C.
Referring to fig. 2h, when the icon 234 representing the application D is touched, the controller controls the display of the fourth window 260 in which the application D is executed in the third window display area 203.
FIG. 3f shows an action stack corresponding to the window shown in fig. 2h. The controller generates an application D stack 304 in the action stack in response to the execution of application D. The controller places the last-run application D stack 304 on top of the launcher application stack 301. This may imply that the Z-order of application D is higher than the Z-order of the launcher application, application L.
Referring to fig. 2i, the user 1 can operate the application B.
FIG. 3g shows an action stack corresponding to the window shown in FIG. 2 i. The controller places application B stack 302 on top of the action stack in response to user input to application B.
Referring to fig. 2j, the user 1 may touch the icon 235 representing the application E.
Fig. 3h shows an action stack corresponding to fig. 2j. Because, as shown in fig. 2j, user 1 entered an application run command to the launcher application (application L), it can be seen from fig. 3h that the Z-order of the launcher application L is higher than the Z-order of application D.
When the icon 235 representing the application E is touched, referring to fig. 2k, the controller controls the display of a fifth window 270 in which the application E is executed in the fourth window display area 204. In the absence of an empty window display area, the controller may refer to the action stack shown in FIG. 3 h. The controller may determine the application with the lowest Z-order in the action stack. For example, the controller may determine that the Z-order of application C is lowest in the action stack of fig. 3 h. The controller controls the display of a fifth window 270 running application E in the fourth window display area 204 instead of the window of application C having the lowest Z-order.
Fig. 3i shows an action stack corresponding to the window shown in fig. 2k. The controller generates an application E stack 305 in the action stack in response to the execution of application E. The controller places the last-run application E stack 305 on top of the launcher application stack 301. This may imply that the Z-order of application E is higher than the Z-order of the launcher application, application L.
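Taken together, figs. 2b to 2k and 3a to 3i describe a simple most-recently-used discipline: launching or touching an application moves its stack entry to the top, and when no display area is free, the window of the bottom-most (lowest-Z-order) entry is replaced. A minimal sketch of that bookkeeping is shown below; the class and method names are assumptions made for this example rather than anything the patent defines.
```kotlin
// Minimal sketch of the action-stack behaviour walked through in FIGS. 3a-3i.
class ActionStack {
    private val stack = ArrayDeque<String>()   // front of the deque = highest Z-order

    // Running or touching an application moves its entry to the top of the stack.
    fun bringToTop(app: String) {
        stack.remove(app)
        stack.addFirst(app)
    }

    // Z-order index: 0 is the topmost (most recently used) entry.
    fun zOrder(app: String): Int = stack.indexOf(app)

    // When no window display area is empty, the window of the application with
    // the lowest Z-order (the bottom of the stack) is replaced by the new window.
    fun lowestZOrderApp(): String? = stack.lastOrNull()
}

fun main() {
    val actions = ActionStack()
    // FIGS. 2b-2k: launcher L, then B, C, D are run; B is touched; then L again.
    listOf("L", "B", "L", "C", "L", "D", "B", "L").forEach { actions.bringToTop(it) }
    println(actions.lowestZOrderApp())   // prints "C": its display area is reused for E
    actions.bringToTop("E")              // application E now has the highest Z-order
}
```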
Fig. 4a is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present disclosure.
Referring to fig. 4a, a display apparatus may run a plurality of applications in operation S401. For example, the display device may run the application in response to an application run command triggered by a user touch on an icon representing the application. The display device, in particular, a window manager of the display device, may generate a window in which an application is run.
The display device may determine a layout for arranging the windows. The layout defines a window display area in which windows can be arranged. For example, two modes are available for layout, namely split mode and freestyle mode.
In split mode, the screen is divided in such a way that the plurality of windows are displayed without overlapping. For example, if the display device displays a first window and a second window, the display device may divide a screen, such as the touch screen, according to a set layout and define each divided screen portion as a window display area. The display device may display a window in each window display region. Since each window display region is a separate screen segment, the display device can display a plurality of windows without overlapping.
The display device may assign a plurality of windows to one window display area in the split mode. For example, the display device may assign a first window and a second window to a first window display area. In this case, the display device may compare the Z-order (order, level, position in the stack) of the first window and the second window. The display device may display the first window in the first window display area if the Z-order of the first window is higher than the Z-order of the second window. In this case, although the display device manages the second window to be arranged in the first window display area, the display device does not display the second window in the first window display area.
On the other hand, in the free-form mode, a plurality of windows may be displayed being overlapped according to their display priority levels. For example, if the display area of a first window overlaps the display area of a second window, the display device may compare the Z-order of the first window and the second window. The Z-order of windows may refer to the display order of the windows. For example, if the Z-order of the first window is higher than the Z-order of the second window, the display device may control to display the first window instead of the second window in the overlapping portion.
In split mode, multiple layouts are available, such as 2 up/down region split layouts, 2 left/right region split layouts, 3 region split layouts, 4 region split layouts, and so on. In operation S405, the display device may determine whether the layout of the window is in the split mode or the free-form mode. If the layout is in the split mode, the display device may further determine whether the layout is a 2 up/down region split layout, a 2 left/right region split layout, a3 region split layout, or a 4 region split layout.
Once the mode of the layout is determined in operation S405, the display apparatus may determine the window position in the layout in operation S407. In the case of 2 upper/lower area layouts, the display apparatus may determine that the first window and the third window are arranged in the upper window display area and the second window is arranged in the lower window display area. Alternatively, in the freestyle mode, the display device may determine a coordinate area for the first window and a coordinate area for the second window.
The display apparatus may determine a Z-order of a plurality of applications in operation S409 and may display a plurality of windows based on the Z-order of the applications in operation S411. For example, in the case of 2 upper/lower region splitting modes, the display apparatus may compare the Z-order of the first window and the third window. Further, the display device may control to display windows having a relatively higher Z-order in the corresponding window display regions. In the freestyle mode, the display device may compare the Z-order of the first window and the second window and may control the window having the relatively higher Z-order displayed in the overlap region.
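For the free-form case in particular, the comparison in operations S409 to S411 amounts to picking, at every point covered by more than one window, the window whose Z-order is highest. The sketch below illustrates that rule only; the types and the convention that 0 denotes the topmost Z-order are assumptions of this example, not definitions from the patent.
```kotlin
// Rough sketch of the free-form rule described above: where windows overlap,
// the window with the higher Z-order is the one that is drawn.
data class FloatingWindow(
    val name: String,
    val left: Int, val top: Int, val right: Int, val bottom: Int,
    val zOrder: Int                       // 0 = topmost, larger = further back
)

// Returns the window visible at a given point, i.e. the topmost window covering it.
fun windowShownAt(x: Int, y: Int, windows: List<FloatingWindow>): FloatingWindow? =
    windows
        .filter { x in it.left until it.right && y in it.top until it.bottom }
        .minByOrNull { it.zOrder }
```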
Fig. 4b is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present disclosure.
Referring to fig. 4b, the display apparatus may run a plurality of applications in operation S401. For example, an application launch command may be triggered by a drag gesture that drags an icon representing the application to a point at which a window for the application is to be displayed. Drag gesture input is one example of an application launch command, and thus, an application may be launched in a variety of ways. Those skilled in the art will readily appreciate that the present disclosure is not limited to a particular application operating method.
In operation S421, the display apparatus may determine whether the current layout is in the free-form mode. In the case of the free-form mode layout, the display apparatus may determine a Z-order of each of windows in which a plurality of applications are running in operation S423. In operation S425, the display apparatus may display windows according to the Z-order of the windows.
In the case of the split mode layout in operation S421, the display apparatus may arrange windows in the window display area in operation S431. Further, in operation S433, the display apparatus may determine a Z-order of windows in each window display region. For example, the display device may determine the Z-order of the windows as shown in Table 1.
[Table 1]
Window | Window display area (page) | Z-order
A      | 1                          | 1
B      | 2                          | 5
C      | 3                          | 6
D      | 2                          | 2
E      | 1                          | 3
F      | 4                          | 4
As described above, the display device can control the first window display area to display window A, which has the relatively higher Z-order, instead of window E. The display device may control the second window display area to display window D, which has the relatively higher Z-order, instead of window B. Further, the display device may display window C in the third window display area and window F in the fourth window display area. In other words, in operation S435, the display apparatus may display, in each window display region, the window having the highest Z-order among the windows allocated to that region.
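A compact way to see operations S431 to S435 is to group the assigned windows by display area and keep only the top entry of each group. The sketch below reuses the assignments of Table 1; the type names are illustrative, and, matching the table, a smaller Z-order number here means a higher (more forward) position.
```kotlin
// Hedged sketch of the split-mode selection: per display area, show only the
// window with the highest Z-order (represented by the smallest number).
data class AssignedWindow(val name: String, val area: Int, val zOrder: Int)

fun visibleWindows(windows: List<AssignedWindow>): Map<Int, AssignedWindow> =
    windows.groupBy { it.area }
        .mapValues { (_, inArea) -> inArea.minByOrNull { it.zOrder }!! }

fun main() {
    val table1 = listOf(
        AssignedWindow("A", 1, 1), AssignedWindow("B", 2, 5),
        AssignedWindow("C", 3, 6), AssignedWindow("D", 2, 2),
        AssignedWindow("E", 1, 3), AssignedWindow("F", 4, 4)
    )
    println(visibleWindows(table1).mapValues { it.value.name })
    // {1=A, 2=D, 3=C, 4=F}: windows A, D, C, and F are displayed, as described above
}
```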
Fig. 5 illustrates a display order (Z-order) of windows according to an embodiment of the present disclosure.
Referring to fig. 5, the Z-order of the screen may be divided into N layers, the Nth layer being an upper layer placed on the (N-1)th layer. A window may exist in each layer, and an application may run in that window. For example, when a first application is run, it runs in a window of the first layer; when a second application is run, it runs in a window of the second layer; and when a third application is run, it runs in a window of the third layer. Thus, the first, second, and third layers are created hierarchically. The last-created layer is at the top of the layer stack and is therefore displayed at the top of the screen. For example, a plurality of windows (a) to (d) may be displayed superimposed on the main screen: the first window (a) is displayed overlapping the second window (b), the third window (c), and the fourth window (d); the second window (b) is displayed overlapping the third window (c) and the fourth window (d); and the third window (c) is displayed overlapping the fourth window (d). When the plurality of windows (a) to (d) are displayed in an overlapping manner, the order in which the windows (a) to (d) are displayed is the Z-order of the windows (a) to (d). The Z-order may be the order in which windows are displayed along the Z-axis. The layer view (e) may be a screen that hierarchically displays the Z-order of the windows. The Z-order may also be referred to as the display order.
Fig. 6a, 6b, 6c, and 6d illustrate an application running method according to an embodiment of the present disclosure. More specifically, fig. 6a, 6b, 6c and 6d illustrate a method for running an application in a free-form mode layout.
Referring to fig. 6a, 6b, 6c, and 6d, the display apparatus 600 displays a window display area 620. The display apparatus 600 displays the tray 610 accommodating the available application icons 611, 612, 613, 614, 615, 616, and 617 to the left of the window display area 620. The user 10 may operate the display device 600 to run the first application A1. For example, in fig. 6b, the user 10 may make a drag gesture 625 of dragging the icon 611 representing the first application A1 to a first point in the window display area 620. The display device 600 may display the first window 630 at the first point in the window display area 620 in response to the drag gesture 625 and run the first application A1 in the first window 630. The first window 630 may be displayed in a default size and shape or in the size and shape that the user 10 set before the application was last terminated.
The user 10 may operate the display device 600 to additionally run the third application A3. For example, as shown in fig. 6c, the user 10 may make a drag gesture 635 that drags the icon 613 representing the third application A3 to a second point in the window display area 620. The display device 600 may display the third window 640 at the second point in the window display area 620 in response to the input run command (i.e., the drag gesture 635) and run the third application A3 in the third window 640. The third window 640 may be displayed in the default size and shape or in the size and shape set by the user 10 before the application was last terminated. Because the third window 640 is the last window to which the user 10 applied a gesture input, the controller (not shown) may assign a higher task priority level to the third application A3 than to the first application A1. Accordingly, the controller may control the third application A3 to be displayed above the first application A1.
Fig. 7 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present invention.
Referring to fig. 7, in operation S701, a display apparatus may display at least one icon representing an application. For example, the display device may display a tray containing at least one icon in a portion of the touch screen.
In operation S703, the display device may receive a drag gesture input when the user drags the icon to a first point where the window is to be disposed. The display device may recognize a dragging gesture from the icon to the first point as a command for running the application corresponding to the icon. More specifically, in operation S705, the display device may determine a location of a first point at which the drag gesture has ended on the layout. For example, if the split mode has been set for the layout, the display device may determine a window region on the layout to which the first point corresponds.
In operation S707, the display apparatus may determine at least one of a size and a position of the window according to a position of the first point on the layout. In operation S709, the display apparatus may display the window according to the determined size and/or position.
Fig. 8a, 8b, 8c, 8d, 8e, 8f, 8g, 8h, 8i, 8j, 8k, 8l, and 8m illustrate a method for displaying multiple windows according to an embodiment of the present disclosure.
Referring to fig. 8a, 8b, 8c, 8d, 8e, 8f, 8g, 8h, 8i, 8j, 8k, 8l, and 8m, the display apparatus 800 displays a menu screen 817. The menu screen 817 may be a running screen of a launcher and may include icons representing applications. In addition, the menu screen 817 may include information on the current time and may further include a widget. The display device 800 displays a tray 810 containing available icons 811, 812, 813, 814, 815, and 816 to the left of the touch screen.
As shown in fig. 8b, the user 10 may operate the display device 800 to run the first application A. For example, as shown in fig. 8c, the user 10 may touch an icon 811 representing the first application A and drag the touched icon 811 onto the menu screen 817. A controller (not shown) may control the display of the icon 811 at the drag point. The controller may further control the display of a ghost view 818 at the drag point. The ghost view 818 is a preview of the size and shape of the window in which the first application A will run, so that the user 10 can select the window position. Because no window is displayed yet, the controller may display the ghost view 818 in full screen. As described below, the controller may control the display of the full-screen ghost view when no window is yet displayed on the touch screen. If a single window is already displayed on the touch screen, the controller may display the ghost view in a size and shape corresponding to half of the touch screen. If two windows are already displayed on the touch screen, the controller may display the ghost view in a size and shape corresponding to half of one of the two windows on the touch screen. If three windows are already displayed on the touch screen, the controller may display the ghost view in a size and shape corresponding to half of the largest of the three windows.
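The ghost-view sizing rule described above can be sketched as follows (a hypothetical Size type and pixel values; the split direction is simplified to a horizontal halving here, whereas in the description it depends on where the icon is dragged):

```kotlin
// Sketch of the ghost-view sizing rule (hypothetical Size type and pixel values).
data class Size(val width: Int, val height: Int)

fun ghostViewSize(screen: Size, openWindows: List<Size>): Size = when {
    openWindows.isEmpty() -> screen                                // no window yet: full screen
    openWindows.size == 1 -> Size(screen.width, screen.height / 2) // one window: half of the screen
    else -> {
        // Two windows: half of one of them; three windows: half of the largest.
        val reference = openWindows.maxByOrNull { it.width * it.height }!!
        Size(reference.width, reference.height / 2)
    }
}

fun main() {
    val screen = Size(1280, 800)
    println(ghostViewSize(screen, emptyList()))              // Size(width=1280, height=800)
    println(ghostViewSize(screen, listOf(Size(1280, 800))))  // Size(width=1280, height=400)
    println(ghostViewSize(screen, listOf(Size(1280, 400), Size(1280, 400), Size(640, 400))))
}
```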
The controller may recognize the drag gesture as a command for running a new application. The controller may generate a first window 819 for running the first application a. As shown in fig. 8d, the controller may control the display of the first window 819 in a full screen.
The user 10 may operate the display apparatus 800 to additionally run the second application B. For example, as shown in fig. 8e, the user may touch an icon 812 representing the second application B and drag the touched icon 812 towards the lower half of the first window 819 as shown in fig. 8f. The controller may control the display of the icon 812 at the dragged point. Further, the controller may control the display of the ghost view 823 at the drag point. As previously described, because the single window 819 is already displayed on the touch screen, the controller may control the ghost view 823 to be displayed in a size and shape corresponding to half of the touch screen. Although not shown, if the user 10 drags the touched icon 812 to the upper half of the touch screen, the controller controls the display of the ghost view 823 in the upper half of the touch screen. Displaying the ghost view in the lower half of the touch screen is only one example, and thus, the controller may instead divide the touch screen into left and right halves and control the ghost view to be displayed in one of the left and right halves of the touch screen.
If the user ends the drag in the lower half of the touch screen as shown in fig. 8f, the controller determines that a new application launch command has been received. As shown in fig. 8g, the controller controls the second window 830 to be displayed in the lower half of the touch screen, in correspondence with the ghost view 823 shown in fig. 8 f. Further, the controller reduces the size and shape of the first window 819 to the first window 820 so that the first window 820 may be displayed in the upper half of the touch screen. The controller generates and displays a center button 825 at the boundary between the first window 820 and the second window 830.
The user 10 may operate the display apparatus 800 to additionally run the third application C. For example, as shown in fig. 8h, the user may touch an icon 813 representing the third application C, and drag the touched icon 813 to the right portion of the first window 820 as shown in fig. 8i. The controller may control the icon 813 to be displayed at the dragged point. In addition, the controller may control the ghost view 827 to be displayed at the dragged point. As previously described, since the two windows 820 and 830 are already displayed on the touch screen, the controller may control the display of the ghost view 827 in a size and shape corresponding to half of the first window 820. Although not shown, if the user 10 drags the touched icon 813 to the left portion of the first window 820, the controller controls the ghost view 827 to be displayed in the left half of the first window 820. Displaying the ghost view 827 in the right half of the first window 820 is only one example, and thus, the controller may instead divide the first window 820 into an upper half and a lower half and control the ghost view 827 to be displayed in one of the two halves of the first window 820. Alternatively, the controller may determine the size and shape of the ghost view 827 with respect to the center button 825 and display the ghost view 827 accordingly.
If the user ends the drag in the right portion of the first window 820 as shown in fig. 8i, the controller determines that a new application launch command has been received. As shown in fig. 8j, the controller controls the third window 840 to be displayed in the right half of the first window 820, corresponding to the ghost view 827 shown in fig. 8i. Alternatively, the controller may control the display of the third window 840 in correspondence with the position of the center button 825. Thus, as more applications are selected to run, portions of the screen may be progressively subdivided to allocate a portion of the screen to each running application.
Further, the controller shrinks the size and shape of the first window 820 in correspondence with the creation of the third window 840. For example, the controller may control the first window 820 to be displayed in an area other than the display area of the third window 840.
The user 10 may operate the display apparatus 800 to additionally run the fourth application D. For example, as shown in fig. 8k, the user may touch an icon 814 representing the fourth application D, and drag the touched icon 814 to the right portion of the second window 830 as shown in fig. 8l. The controller may control the icon 814 to be displayed at the dragged point. Further, the controller may control the ghost view 831 to be displayed at the dragged point. As previously described, since the three windows 820, 830, and 840 are already displayed on the touch screen, the controller may control the display of the ghost view 831 in a size and shape corresponding to half of the second window 830. Although not shown, if the user 10 drags the touched icon 814 to the left portion of the second window 830, the controller controls the display of the ghost view 831 in the left half of the second window 830. Displaying the ghost view 831 in the right half of the second window 830 is only one example, and thus, the controller may instead divide the second window 830 into an upper half and a lower half and control the ghost view 831 to be displayed in one of the two halves of the second window 830. Alternatively, the controller may determine the size and shape of the ghost view 831 relative to the center button 825 and display the ghost view 831 accordingly.
If the user ends the drag in the right portion of the second window 830 as shown in fig. 8l, the controller determines that a new application launch command has been received. As shown in fig. 8m, the controller controls the fourth window 850 to be displayed in the right half of the second window 830, corresponding to the ghost view 831 shown in fig. 8l. Alternatively, the controller may control the display of the fourth window 850 in correspondence with the position of the center button 825.
Further, the controller shrinks the size and shape of the second window 830 in correspondence with the creation of the fourth window 850.
As described above, the display device may control the display of a window in the window display area where the drag gesture ends. In fig. 8a, 8b, 8c, 8d, 8e, 8f, 8g, 8h, 8i, 8j, 8k, 8l and 8m, the windows are displayed in the same size at different positions. With reference to fig. 9a, 9b, 9c, 9d, 9e, 9f, 9g, 9h, 10a, 10b, 10c, 10d, 11a, 11b, 11c, 12a, 12b, 12c, 13a, 13b, 13c, 14a, 14b, 14c, 15a, 15b, 15c, 16a, 16b, 16c, 16d and 17, various embodiments of configuring windows at different positions and different sizes will be described below.
Fig. 9a, 9b, 9c, 9d, 9e, 9f, 9g, and 9h illustrate layouts according to embodiments of the present disclosure.
Fig. 9a shows a full screen layout for the case where the split mode is not set. In fig. 9a, the display device defines a first window display area 901 throughout the entire screen.
Fig. 9b shows the input area 902 corresponding to the first window display area 901.
Fig. 9c shows a screen layout in 2 upper/lower area splitting modes. In fig. 9c, the display apparatus may divide a screen into an upper area and a lower area and define a first window display area 911 and a second window display area 912 in the upper area and the lower area, respectively.
Fig. 9d shows the input areas in the 2 up/down region split mode. The first input region 913 may correspond to the first window display area 911, and the third input region 915 may correspond to the second window display area 912. The second input region 914 may correspond to the boundary between the first window display area 911 and the second window display area 912. For example, when the user makes a drag gesture to drag an icon to the first input region 913, the display device may display a window in the first window display area 911 shown in fig. 9c. When the user drags an icon to the third input region 915, the display device may display a window in the second window display area 912 shown in fig. 9c. When the user drags an icon to the second input region 914, the display device may display a window throughout the entirety of the first window display area 911 and the second window display area 912 shown in fig. 9c.
Fig. 9e shows the screen layout in 2 left/right region split mode. In fig. 9e, the display device may divide the screen into a left area and a right area and define a first window display area 921 and a second window display area 922 in the left area and the right area, respectively.
Fig. 9f shows the input area in 2 left/right region split mode. The first input region 923 may correspond to a first window display area 921 and the third input region 925 may correspond to a second window display area 922. The second input region 924 may correspond to a boundary between the first window display region 921 and the second window display region 922. For example, when the user makes a drag gesture of dragging an icon to the first input region 923, the display device may display a window in the first window display area 921 shown in fig. 9 e. For example, when the user drags an icon to the third input region 925, the display device may display a window in the second window display area 922 shown in fig. 9 e. For example, when the user drags an icon to the second input region 924, the display device may display windows throughout the entirety of the first window display area 921 and the second window display area 922 shown in fig. 9 e.
Fig. 9g illustrates a layout in 4 region splitting mode according to an embodiment of the present disclosure, and fig. 9h illustrates an input region defined according to the layout of the 4 region splitting mode illustrated in fig. 9 g.
Referring to fig. 9g and 9h, the display device defines first to fourth window display regions 931, 932, 933, and 934. Accordingly, the user can operate the display device to run a window in any one of the first to fourth window display regions 931, 932, 933, and 934. For example, when the user drags an icon representing an application to the second input region 942, the display device may display a window aligned with the second window display area 932. If the user completes the drag gesture at the boundary between the first display area 931 and the second display area 932, the display device may display a window throughout the entirety of the first window display area 931 and the second window display area 932. For example, the display device may define a first input region 941 corresponding to the first window display area 931 and a second input region 942 corresponding to the second window display area 932. The display device may further define a fifth input region 945 at the boundary between the first window display area 931 and the second window display area 932. Similarly, the display device may define a third input region 943 and a fourth input region 944 corresponding to the third window display region 933 and the fourth window display region 934, respectively. The display device may further define a sixth input region 946 at the boundary between the first window display region 931 and the third window display region 933, a seventh input region 947 at the boundary between the second window display region 932 and the fourth window display region 934, and an eighth input region 948 at the boundary between the third window display region 933 and the fourth window display region 934. The display device may further define a ninth input region 949 at the intersection where the first to fourth window display regions 931, 932, 933, and 934 meet. When the drag gesture ends in a particular input region, the display device determines the window display area for displaying the window based on the mapping shown in table 2.
[ Table 2]
Input region in which the drag gesture ends    Window display area(s) in which the window is displayed
First input region 941                         First window display area 931
Second input region 942                        Second window display area 932
Third input region 943                         Third window display area 933
Fourth input region 944                        Fourth window display area 934
Fifth input region 945                         First and second window display areas 931 and 932
Sixth input region 946                         First and third window display areas 931 and 933
Seventh input region 947                       Second and fourth window display areas 932 and 934
Eighth input region 948                        Third and fourth window display areas 933 and 934
Ninth input region 949                         First to fourth window display areas 931 to 934
As described above, the display device may define input regions for determining the application display area based on where the drag gesture ends. More specifically, the display device may define an input region corresponding to a boundary between a plurality of window display regions or an input region corresponding to an intersection at which a plurality of window display regions meet. When the drag gesture ends in an input region corresponding to a boundary between window display areas, the display device may display a window throughout all of those window display areas. When the drag gesture ends in an input region corresponding to an intersection where multiple window display regions meet, the display device may display a window throughout all of those window display regions. In this manner, the display device may display windows at different locations and in different sizes. The above-described configuration of displaying windows at different positions in different sizes is described in more detail with reference to fig. 10a, 10b, 10c, 10d, 11a, 11b, 11c, 12a, 12b, 12c, 13a, 13b, 13c, 14a, 14b, 14c, 15a, 15b, 15c, 16a, 16b, 16c, 16d, and 17. More specifically, fig. 10a, 10b, 10c, 10d, 11a, 11b, 11c, 12a, 12b, 12c, 13a, 13b, 13c, 14a, 14b, 14c, 15a, 15b, 15c, 16a, 16b, 16c, 16d, and 17 show the layout in the 4-region splitting mode. Accordingly, the following description also refers to the layout and input regions of fig. 9g and 9h.
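A minimal sketch of this mapping for the 4-region split mode follows (the coordinates, the boundary tolerance, and the numbering of display areas 1 to 4 for regions 931 to 934 are illustrative assumptions, not values given in the disclosure):

```kotlin
// Sketch of the Table 2 mapping for the 4-region split mode. Display areas
// 931 to 934 are numbered 1 to 4 here.
import kotlin.math.abs

data class Point(val x: Int, val y: Int)

fun targetDisplayAreas(p: Point, width: Int, height: Int, tolerance: Int = 40): Set<Int> {
    val midX = width / 2
    val midY = height / 2
    val onVerticalBoundary = abs(p.x - midX) <= tolerance    // boundary between left and right areas
    val onHorizontalBoundary = abs(p.y - midY) <= tolerance  // boundary between upper and lower areas
    return when {
        onVerticalBoundary && onHorizontalBoundary -> setOf(1, 2, 3, 4)      // ninth input region
        onVerticalBoundary -> if (p.y < midY) setOf(1, 2) else setOf(3, 4)   // fifth / eighth input region
        onHorizontalBoundary -> if (p.x < midX) setOf(1, 3) else setOf(2, 4) // sixth / seventh input region
        p.x < midX && p.y < midY -> setOf(1)
        p.x >= midX && p.y < midY -> setOf(2)
        p.x < midX -> setOf(3)
        else -> setOf(4)
    }
}

fun main() {
    println(targetDisplayAreas(Point(640, 780), 1280, 800))   // [3, 4]: spans the two lower areas
    println(targetDisplayAreas(Point(1000, 200), 1280, 800))  // [2]: upper-right area only
    println(targetDisplayAreas(Point(640, 400), 1280, 800))   // [1, 2, 3, 4]: full screen
}
```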
Fig. 10a, 10b, 10c, and 10d illustrate screens of a display device according to an embodiment of the present disclosure.
Referring to fig. 10a, 10b, 10c, and 10d, the controller controls the display of the window display area 1000 and the tray 1010 accommodating the available icons 1011, 1012, 1013, 1014, 1015, 1016, and 1017 representing the applications. The controller may always display the tray 1010. Alternatively, the controller may display the tray 1010 only when a tray call command is received. The tray call command may be generated in response to an edge flick received from the left side of the touch screen. Those skilled in the art will readily appreciate that the present disclosure is not limited to the type of input that triggers the tray call command. Assume that the display device is displaying a first window running application A in the window display area 1000.
Referring to fig. 10b, the user 10 may make a drag gesture 1021 dragging the icon 1016 representing the application F to a first point 1027 in the lower half of the window display area 1000. In fig. 10c, the controller may determine a window display area. In the case of a 4-zone layout, the controller may determine the input zone in which the dragging gesture 1021 ends. For example, if the first point 1027 is located in the eighth input region 948 of fig. 9h, the controller may determine to display the F window 1024 throughout the entirety of the third window display region 933 and the fourth window display region 934 as shown in table 2. Thereafter, the controller may display the ghost view 1023 in the determined area.
The user 10 may determine whether the window is to be displayed at a desired position by viewing the ghost view 1023. The user 10 may release the drag gesture 1021, and the F window 1024 may be displayed throughout all of the third and fourth window display areas 933 and 934 as shown in fig. 10d. Since the F window 1024 is displayed, the controller may reduce the size of the A window 1000 to half of the screen and display the reduced A window 1000. The controller may scale down the A window 1000 at the same horizontal-to-vertical ratio or at a new horizontal-to-vertical ratio.
Fig. 11a, 11b, and 11c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, fig. 11a, 11b and 11c illustrate subsequent operations for the operations of fig. 10a, 10b, 10c and 10 d.
Referring to fig. 11a, the display device displays the A window 1000 and the F window 1024 in the upper and lower halves of the screen, respectively, in the split mode. The user 10 may operate the display apparatus to additionally run the application E. The user 10 may make a drag gesture 1032 by dragging the icon 1015 representing the application E to a second point 1033.
Referring to fig. 11b and 11c, the controller may determine an input region corresponding to the second point 1033. If the controller determines that the second point 1033 corresponds to the eighth input region 948 shown in fig. 9h, the controller may determine to display the E window 1034 throughout the entirety of the third and fourth window display regions 933 and 934, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1031.
The user 10 may determine whether the window is to be displayed at a desired position by viewing the ghost view 1031. The user 10 may release the drag gesture 1032. The E window 1034 may be displayed throughout the entirety of the third and fourth window display regions 933 and 934.
Fig. 12a, 12b, and 12c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, fig. 12a, 12b and 12c illustrate subsequent operations for the operations of fig. 11a, 11b and 11 c.
Referring to fig. 12a, the display device displays an a window 1000 and an E window 1034 in the upper and lower halves of the screen, respectively, in a split mode. The user 10 may operate the display device to additionally run the application G. The user 10 may make a drag gesture 1041 that drags the icon 1017 representing application G toward the third point 1042.
Referring to fig. 12b and 12c, the controller may determine an input region corresponding to the third point 1042. If the controller determines that the third point 1042 corresponds to the ninth input region 949 shown in fig. 9h, the controller may determine to display the G window 1044 throughout the entirety of the first to fourth window display areas 931 to 934, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1043.
The user 10 may determine whether the window is to be displayed at a desired position by viewing the ghost view 1043. The user 10 may release the drag gesture 1041. The G window 1044 may be displayed in full screen, as shown in fig. 12c.
Fig. 13a, 13b, and 13c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, fig. 13a, 13b and 13c illustrate subsequent operations for the operations of fig. 12a, 12b and 12 c.
Referring to fig. 13a, 13b, and 13c, the display apparatus is displaying the G window 1044. The user 10 may make a drag gesture 1051 to drag the icon 1012 representing application B to a fourth point 1052 in the lower half of the G window 1044, as shown in fig. 13b. When the controller determines that the fourth point 1052 corresponds to the eighth input region 948 shown in fig. 9h, the controller may determine to display the B window 1054 throughout all of the third window display area 933 and the fourth window display area 934, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1053.
The user 10 can determine whether the window will be displayed at a desired position by viewing the ghost view 1053. The user 10 may release the drag gesture 1051. The B window 1054 may be displayed throughout the entirety of the third window display region 933 and the fourth window display region 934 as shown in fig. 13c. Because the B window 1054 is displayed, the controller may shrink the G window 1044 to half of the screen and display the shrunken G window 1044 in the upper half of the screen.
Fig. 14a, 14b, and 14c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, fig. 14a, 14b and 14c illustrate subsequent operations for the operations of fig. 13a, 13b and 13 c.
Referring to fig. 14a, the display apparatus displays the G window 1044 and the B window 1054 in the upper half and the lower half of the screen, respectively, in the split mode. The user 10 may operate the display device to additionally run the application C. The user 10 may make a drag gesture 1061 to drag the icon 1013 representing application C to a fifth point 1062.
Referring to fig. 14b and 14c, the controller may determine an input area corresponding to the fifth point 1062. If the controller determines that the fifth point 1062 corresponds to the second input area 942 shown in fig. 9h, the controller may determine to display the C window 1064 in the second window display area 932, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1063.
The user 10 can determine whether the window will be displayed at a desired position by viewing the ghost view 1063. The user 10 may release the drag gesture 1061. The C window 1064 may be displayed in the second window display area 932, as shown in fig. 14C.
Fig. 15a, 15b, and 15c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, fig. 15a, 15b and 15c illustrate subsequent operations for the operations of fig. 14a, 14b and 14 c.
Referring to fig. 15a, the display apparatus displays a G window 1044, a B window 1054, and a C window 1064 in a 3-region splitting mode. The user 10 may operate the display device to additionally run the application D. The user 10 may make a drag gesture 1071 that drags the icon 1014 representing application D to the sixth point 1072.
Referring to fig. 15b and 15c, the controller may determine an input area corresponding to the sixth point 1072. If the controller determines that the sixth point 1072 corresponds to the fourth input area 944 shown in FIG. 9h, the controller may determine to display a D window 1074 in the fourth window display area 934, as shown in Table 2. Accordingly, the controller may display the determined window display area as the ghost view 1073.
The user 10 can determine whether the window will be displayed at a desired position by viewing the ghost view 1073. The user 10 may release the drag gesture 1071. The D window 1074 may be displayed in the fourth window display area 934 as shown in fig. 15 c.
Fig. 16a, 16b, 16c, and 16d illustrate screens of a display device according to an embodiment of the present disclosure.
Fig. 16a illustrates a screen of a display device according to an embodiment of the present disclosure. More specifically, fig. 16a illustrates a subsequent operation to that used for fig. 15a, 15b and 15 c.
Referring to fig. 16a, the display apparatus displays a G window 1044, a B window 1054, a C window 1064, and a D window 1074 in a 4-region splitting mode. The user 10 may operate the display device to additionally run the application H. The user 10 may make a drag gesture that drags the icon 1018 representing the application H to the seventh point 1081.
Referring to fig. 16a, the controller may determine an input area corresponding to the seventh point 1081. If the controller determines that the seventh point 1081 corresponds to the fifth input area 945 shown in fig. 9H, the controller may determine to display the H window 1083 in the first window display area 931 and the second window display area 932, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1082. Icons 1015, 1016, 1017, 1018, 1019, 1020, and 1021 representing applications E through K may be arranged in the tray 1010. For example, the user 10 may input an upward drag gesture across the tray 1010 such that hidden icons 1018, 1019, 1020, and 1021 representing applications H through K may be exposed in the tray 1010.
The user 10 can determine whether the window will be displayed at the desired location by viewing the ghost view 1082. The user 10 may release the drag gesture. The H window 1083 may be displayed in the first window display area 931 and the second window display area 932, as shown in fig. 16 a.
Fig. 16b illustrates a screen of a display device according to an embodiment of the present disclosure. More specifically, fig. 16b illustrates a subsequent operation to that used for fig. 15a, 15b and 15 c.
Referring to fig. 16B, the display apparatus displays a G window 1044, a B window 1054, a C window 1064, and a D window 1074 in a 4-region splitting mode. The user 10 may operate the display device to additionally run the application H. The user 10 may make a drag gesture by dragging the icon 1018 representing application H to an eighth point 1084.
Referring to fig. 16b, the controller may determine an input area corresponding to the eighth point 1084. If the controller determines that the eighth point 1084 corresponds to the sixth input area 946 shown in fig. 9H, the controller may determine that the H window 1086 is displayed in the first window display area 931 and the third window display area 933, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1085.
The user 10 can determine whether the window will be displayed at the desired location by viewing the ghost view 1085. The user 10 may release the drag gesture. The H window 1086 may be displayed in the first window display area 931 and the third window display area 933, as shown in fig. 16 b.
Fig. 16c illustrates a screen of a display device according to an embodiment of the present disclosure. More specifically, fig. 16c illustrates a subsequent operation to that used for fig. 15a, 15b and 15 c.
Referring to fig. 16C, the display apparatus displays a G window 1044, a B window 1054, a C window 1064, and a D window 1074 in a 4-region splitting mode. The user 10 may operate the display device to additionally run the application H. The user 10 may make a drag gesture that drags the icon 1018 representing the application H to the ninth point 1087.
Referring to fig. 16c, the controller may determine an input area corresponding to the ninth point 1087. If the controller determines that the ninth point 1087 corresponds to the eighth input area 948 shown in fig. 9H, the controller may determine that the H window 1089 is displayed in the third window display area 933 and the fourth window display area 934, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1088.
The user 10 can determine whether the window will be displayed at the desired location by viewing the ghost view 1088. The user 10 may release the drag gesture. The H window 1089 may be displayed in the third window display area 933 and the fourth window display area 934, as shown in fig. 16 c.
Fig. 16d illustrates a screen of a display device according to an embodiment of the present disclosure. More specifically, fig. 16d illustrates a subsequent operation to that used for fig. 15a, 15b and 15 c.
Referring to fig. 16D, the display apparatus displays a G window 1044, a B window 1054, a C window 1064, and a D window 1074 in a 4-region splitting mode. The user 10 may operate the display device to additionally run the application H. The user 10 may make a drag gesture that drags the icon 1018 representing application H to the tenth point 1090.
Referring to fig. 16d, the controller may determine an input zone corresponding to the tenth point 1090. If the controller determines that the tenth point 1090 corresponds to the seventh input field 947 shown in fig. 9H, the controller may determine to display the H window 1092 in the second window display area 932 and the fourth window display area 934, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1091.
The user 10 may determine whether the window is to be displayed at a desired position by viewing the ghost view 1091. The user 10 may release the drag gesture. The H window 1092 may be displayed in the second window display area 932 and the fourth window display area 934 as shown in fig. 16 d.
Fig. 17 illustrates a screen of a display device according to an embodiment of the present disclosure. More specifically, FIG. 17 illustrates subsequent operations for the operations of FIGS. 15a, 15b, and 15 c.
Referring to fig. 17, the display apparatus displays a G window 1044, a B window 1054, a C window 1064, and a D window 1074 in a 4-region splitting mode. The user 10 may operate the display device to additionally run the application H. The user 10 may make a drag gesture of dragging the icon 1018 representing the application H to the eleventh point 1093.
Referring to fig. 17, the controller may determine an input region corresponding to the eleventh point 1093. If the controller determines that the eleventh point 1093 corresponds to the ninth input region 949 shown in fig. 9h, the controller may determine to display the H window 1095 in the first to fourth window display areas 931 to 934, as shown in table 2. Accordingly, the controller may display the determined window display area as the ghost view 1094.
The user 10 may determine whether the window is to be displayed at a desired location by viewing the ghost view 1094. The user 10 may release the drag gesture. The H window 1095 may be displayed full screen.
As described above, the display device may provide windows in different sizes at different locations according to the end point of the drag gesture. Although the 4-region splitting mode has been described above, the above description can be extended to a 9-region splitting mode and the like.
Fig. 18a and 18b illustrate a 9-zone splitting mode according to an embodiment of the present disclosure.
Referring to fig. 18a and 18b, the display device may define 9 split window display areas. Further, the display device may define input regions A, C, E, K, M, O, U, W, and Y corresponding to the respective window display areas, input regions B, D, F, H, J, L, N, P, R, T, V, and X corresponding to boundaries between window display areas, and input regions G, I, Q, and S corresponding to intersections where window display areas meet. When the end point of the drag gesture is located in an input region corresponding to a boundary between window display areas, the display device may display the window throughout the entirety of the adjoining window display areas. When the end point of the drag gesture is located in an input region corresponding to an intersection at which window display areas meet, the display device may display the window throughout the entirety of the window display areas that meet at that intersection. In this manner, the display device may display windows at different locations and in different sizes depending on the end point of the drag gesture.
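The same boundary/intersection rule can be sketched for an N x N split (N = 3 corresponds to the 9-region mode); the grid-line tolerance and the coordinates below are illustrative assumptions, not values given in the disclosure:

```kotlin
// Sketch generalizing the boundary/intersection rule to an N x N split.
// Cells are (row, column) pairs.
import kotlin.math.abs

data class Point(val x: Int, val y: Int)

fun spannedCells(p: Point, width: Int, height: Int, n: Int = 3, tol: Int = 30): Set<Pair<Int, Int>> {
    val cellW = width / n
    val cellH = height / n
    // Indices of the cells the coordinate belongs to; a coordinate within `tol`
    // of an inner grid line belongs to the cells on both sides of that line.
    fun indicesAlong(coord: Int, cellSize: Int): Set<Int> {
        val result = mutableSetOf((coord / cellSize).coerceIn(0, n - 1))
        val nearestLine = (coord + cellSize / 2) / cellSize
        if (nearestLine in 1 until n && abs(coord - nearestLine * cellSize) <= tol) {
            result.add(nearestLine - 1)
            result.add(nearestLine)
        }
        return result
    }
    val rows = indicesAlong(p.y, cellH)
    val cols = indicesAlong(p.x, cellW)
    return rows.flatMap { r -> cols.map { c -> r to c } }.toSet()
}

fun main() {
    println(spannedCells(Point(100, 100), 900, 900))  // one cell
    println(spannedCells(Point(300, 100), 900, 900))  // two cells across a vertical boundary
    println(spannedCells(Point(300, 600), 900, 900))  // four cells around an intersection
}
```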
Fig. 19 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present invention.
Referring to fig. 19, in operation S1901, a display apparatus may display at least one icon representing an application. In operation S1903, the display device may receive a drag gesture to drag an icon to a first point. In operation S1905, the display device may determine a location of an end of the drag gesture on the layout.
In operation S1907, the display device may determine whether the drag gesture ends at a boundary between the window display regions. If the drag gesture ends at the boundary between the window display regions, the display device may display the window throughout the entirety of the window display regions in operation S1909.
In operation S1911, the display device may determine whether the drag gesture ends at an intersection where the window display regions meet. If the drag gesture ends at the intersection where the window display regions meet, the display device may display the window throughout the entirety of the window display regions in operation S1913.
In operation S1915, the display device may determine whether the drag gesture ends in the window display area. If the drag gesture ends in the window display area, the display device may display a window in the window display area in operation S1917.
Fig. 20 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present invention.
Referring to fig. 20, the controller may set the layout in the split mode in operation S2001. In operation S2003, the controller may define a plurality of window display regions according to the split mode. Further, in operation S2005, the controller may assign a plurality of windows to the window display area. More specifically, the controller may assign a plurality of windows to one window display area.
In operation S2007, the controller may control to display a window having the highest Z-order among windows allocated to the window display regions in each window display region. For example, if a plurality of windows are allocated to the first window display region, the controller may control to display a window having the highest Z-order among the allocated windows.
In operation S2009, the controller may determine whether a Z-order change command has been received through the touch screen. The Z-order change command is a command requesting to change the Z-order of the window. For example, the Z-order change command may be triggered by a flick gesture on the touch screen. When a Z-order change command is received in operation S2009, the controller may change at least one Z-order and display a window based on the changed Z-order in operation S2011.
Fig. 21a, 21b, and 21c illustrate screens of a display device according to an embodiment of the present disclosure.
Referring to fig. 21a, the display device 2100 displays a first window 2101 for running application a in a first area, a second window 2102 for running application B in a second area, a third window 2103 for running application C in a third area, and a fourth window 2104 for running application D in a fourth area. The display device 2100 further displays a center button 2110.
FIG. 22 illustrates an action stack according to an embodiment of the present disclosure.
Referring to fig. 22, a controller (not shown) may manage the action stack on the left. The controller may manage the Z-order of windows in the order of applications C, F, G, A, D, B, J, K, H, M, L and I. The controller assigns windows for applications C, F and G to the third region, A, J and L to the first region, D, H and M to the fourth region, and B, K and I to the second region.
The controller detects the applications assigned to the first region and compares the Z-orders of the detected applications. The controller may determine that application A has the highest Z-order in the first region. Accordingly, the controller controls the first window 2101, in which the application A runs, to be displayed in the first region. The controller detects the applications assigned to the second region and compares the Z-orders of the detected applications. The controller may determine that application B has the highest Z-order in the second region. Accordingly, the controller controls the second window 2102, in which the application B runs, to be displayed in the second region. The controller detects the applications assigned to the third region and compares the Z-orders of the detected applications. The controller may determine that application C has the highest Z-order in the third region. Accordingly, the controller controls the third window 2103, in which the application C runs, to be displayed in the third region. The controller detects the applications assigned to the fourth region and compares the Z-orders of the detected applications. The controller may determine that application D has the highest Z-order in the fourth region. Accordingly, the controller controls the fourth window 2104, in which the application D runs, to be displayed in the fourth region.
With continued reference to fig. 21a, the user 1 may input a Z-order change command to the third area. For example, user 1 may make a flick right gesture 2120 across the third region. The controller may recognize this flick right gesture 2120 as a Z-order change command. The flick right gesture 2120 may be set as a command for assigning the highest Z-order to the application with the lowest Z-order in the window display area. The Z-order change is reflected in the action stack on the right shown in fig. 22. Note from the action stack on the right shown in fig. 22 that application G is located at the top of the action stack. The flick right gesture 2120 is only one example of a command for changing the Z-order in a window display area, and thus, those skilled in the art will readily appreciate that a Z-order change command may be defined by gestures made in directions other than to the right. For example, the Z-order change command may be defined by a variety of gestures, such as a flick left gesture, a flick down gesture, a flick up gesture, and so forth. Further, many other gestures besides flick gestures, including tilt, drag, pan, and the like, may be defined as Z-order change commands, and these examples should not be construed as limiting the present disclosure. Herein, the rightward direction may be referred to as a first direction, and the first direction is not limited to the rightward direction.
Referring to fig. 21b and 21c, the controller may control a fifth window 2113 to be displayed in the third area in order to run the application G. When another Z-order change command is received through a flick right gesture across the third region, the controller may determine that application F now has the lowest Z-order in the third region, as shown in fig. 21c. The controller may then control a sixth window 2123 to be displayed in the third area in order to run the application F.
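A minimal sketch of this flick-right handling, assuming the action stack is a plain list whose first element has the highest Z-order; the application names and the region assignment follow fig. 22, but the data structures themselves are illustrative:

```kotlin
// Sketch of the flick-right handling: the application with the lowest Z-order
// in the flicked region is moved to the top of the action stack.
fun flickRight(stack: MutableList<String>, regionOf: Map<String, Int>, region: Int) {
    val lowest = stack.lastOrNull { regionOf[it] == region } ?: return  // lowest Z-order in the region
    stack.remove(lowest)
    stack.add(0, lowest)                                                // now the highest Z-order overall
}

fun main() {
    val stack = mutableListOf("C", "F", "G", "A", "D", "B", "J", "K", "H", "M", "L", "I")
    val regionOf = mapOf(
        "C" to 3, "F" to 3, "G" to 3,   // third region
        "A" to 1, "J" to 1, "L" to 1,   // first region
        "D" to 4, "H" to 4, "M" to 4,   // fourth region
        "B" to 2, "K" to 2, "I" to 2    // second region
    )
    flickRight(stack, regionOf, region = 3)
    println(stack.first())   // G: now shown in the third region
    flickRight(stack, regionOf, region = 3)
    println(stack.first())   // F: a second flick brings the new lowest application to the top
}
```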
Fig. 23a and 23b illustrate screens of a display device describing Z-order change according to an embodiment of the present disclosure. FIG. 24 illustrates an action stack according to an embodiment of the present disclosure. FIG. 23a may be substantially the same as FIG. 21c, and the left action stack shown in FIG. 24 may depict the Z-order of the windows displayed in FIG. 23 a.
Referring to fig. 23a, the user 1 may input a Z-order change command to the first region by making a flick right gesture 2121 across the first region. A controller (not shown) may determine that the application L has the lowest Z-order in the first zone. The controller may assign the highest Z-order to application L as shown in the action stack on the right shown in fig. 24.
Referring to fig. 23b, the controller may control a seventh window 2131 to be displayed in the first region in order to run the application L.
Fig. 25a and 25b illustrate screens of a display device describing Z-order change according to an embodiment of the present disclosure. FIG. 26 illustrates an action stack according to an embodiment of the present disclosure.
Referring to fig. 25a and 25b, the user may input a Z-order changing command to the second area by making a flick gesture 2130 to the left across the second area. A controller (not shown) may recognize the flick gesture to the left as a Z-order change command. The flick gesture to the left may be set as a command to assign the lowest Z-order to the application with the highest Z-order in the window display area. The leftward direction may be referred to as a second direction.
Thus, the controller may assign the lowest Z-order to the application with the highest Z-order in the second region, as shown in the action stack on the right of fig. 26. Since the application B is assigned the lowest Z-order, the application K comes to have the highest Z-order in the second area.
Accordingly, the controller may display an eighth window 2142 in the second region to run the application K, as shown in fig. 25 b. The resulting action stack changes are shown in FIG. 26.
As previously described, the controller may recognize a flick right gesture as a command to assign the highest Z-order to the application with the lowest Z-order in the window display area. Further, the controller may recognize a flick left gesture as a command to assign the lowest Z-order to the application with the highest Z-order, so that the application with the second highest Z-order in the window display area is displayed. Therefore, the user can easily switch to the screen of the application having the lowest or highest Z-order.
Fig. 27a and 27b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure. FIG. 28 illustrates an action stack according to an embodiment of the present disclosure.
Referring to fig. 27a and 27B, the display device 2100 displays a first window 2101 to run an application a in a first area, a second window 2102 to run an application B in a second area, a third window 2103 to run an application C in a third area, and a fourth window 2104 to run an application D in a fourth area. The display device 2100 may manage an action stack as shown in fig. 28. The user 1 can input a Z-order changing command to the third area. For example, user 1 may make a right tilt gesture 2700 when touching the third region as indicated by reference numeral 2701. The controller may recognize the touch and tilt right gestures as Z-order change commands.
The controller may change the Z-order in the action stack shown in fig. 28 based on the Z-order change command. The change is the same as that previously described with reference to fig. 22 and thus will not be described again here. The controller may control a fifth window 2113 to be displayed in the third area in order to run the application G, as shown in fig. 27b.
Fig. 29a and 29b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure.
Referring to fig. 29a and 29B, the display device 2100 displays a first window 2101 to run an application a in a first area, a second window 2102 to run an application B in a second area, a third window 2103 to run an application C in a third area, and a fourth window 2104 to run an application D in a fourth area. The display device 2100 may manage an action stack as shown in fig. 30. The user 1 can input a Z-order changing command to the second area. For example, when the second area is touched as indicated by reference numeral 2901, the user 1 may make a tilt gesture 2900 to the left. The controller may recognize the touch and tilt gesture to the left as a Z-order change command.
The controller may change the Z-order in the action stack shown in fig. 30 based on the Z-order change command. The change is the same as that previously described with reference to fig. 26 and thus will not be described again here. The controller may control the eighth window 2142 to be displayed in the second region in order to run the application K, as shown in fig. 29b.
Fig. 31a and 31b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure. FIG. 32 illustrates an action stack according to an embodiment of the present disclosure.
Referring to fig. 31a, 31b, and 32, the display apparatus 2100 displays a first window 2101 to run an application A in a first area, a second window 2102 to run an application B in a second area, a third window 2103 to run an application C in a third area, and a fourth window 2104 to run an application D in a fourth area. The display device 2100 can manage an action stack as shown in fig. 32. The user 1 can input a Z-order changing command to the third area. For example, user 1 may touch a point 3100 in the third region and make a flick gesture 3101 toward the right edge of the third region. The controller may recognize the flick gesture toward the right edge as a Z-order change command.
The controller may change the Z-order in the action stack shown in fig. 32 based on the Z-order change command. The change is the same as that previously described with reference to fig. 22 and thus will not be described again here. The controller may control a fifth window 2113 to be displayed in the third area in order to run the application G, as shown in fig. 31b.
Fig. 33a and 33b illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure. FIG. 34 illustrates an action stack according to an embodiment of the present disclosure.
Referring to fig. 33a, 33b, and 34, the display apparatus 2100 displays a first window 2101 to run an application A in a first area, a second window 2102 to run an application B in a second area, a third window 2103 to run an application C in a third area, and a fourth window 2104 to run an application D in a fourth area. The display device 2100 may manage an action stack as shown in fig. 34.
The user 1 can input a Z-order changing command to the second area. For example, user 1 may touch a point 3300 in the second area and make a flick gesture 3301 toward the left edge of the second area. The controller may recognize the flick gesture toward the left edge as a Z-order change command.
The controller may change the Z-order in the action stack shown in fig. 34 based on the Z-order change command. The change is the same as that previously described with reference to fig. 26 and thus will not be described again here. The controller may control the eighth window 2142 to be displayed in the second region in order to run the application K, as shown in fig. 33b.
A method of changing the Z order in the split mode has been described so far. Now, a description will be given of a method of changing the Z order in the free-form mode.
Fig. 35 is a flowchart illustrating a method for controlling a display apparatus according to an embodiment of the present invention.
Referring to fig. 35, the controller may set a layout in the free-form mode in operation S3501. The controller may receive an application execution command, and thus, a plurality of windows for executing a plurality of applications may be generated in operation S3503. The controller may determine a Z-order of each of the plurality of windows in operation S3505 and may display the windows based on their Z-order in operation S3507.
In operation S3509, the controller may determine whether a Z-order change command has been received. When the Z order change command is received, the controller may control the overlapping windows to be displayed according to the changed Z order in operation S3511.
Fig. 36a, 36b, and 36c illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure and fig. 37a, 37b, and 37c illustrate action stacks according to an embodiment of the present disclosure.
Referring to fig. 36a, 36b, 36c, 37a, 37b, and 37c, the controller may manage the action stack shown in fig. 37a. For example, the controller may assign the lowest Z-order to window A running application A, the middle Z-order to window B running application B, and the highest Z-order to window C running application C. Accordingly, the controller may display the windows in the order of windows C 3631 and 3632, windows B 3621 and 3622, and windows A 3611 and 3612, as shown in fig. 36a.
User 1 may enter a Z-order change command. When the title bar 3631 of the window C is touched as indicated by reference numeral 3641, the Z-order change command may be triggered by a pinch-out gesture 3642 to the left. A pinch-out may be a gesture that spreads two touch points apart. In this case, the Z-order of windows C 3631 and 3632 may be reset to the lowest Z-order. As a result, the controller assigns the middle Z-order to window A, the highest Z-order to window B, and the lowest Z-order to window C. That is, the controller may assign the lowest Z-order to window C while increasing the Z-order of each of the other windows by 1. Accordingly, the controller may control the windows to be displayed in the order of windows B 3621 and 3622, windows A 3611 and 3612, and windows C 3631 and 3632. The pinch-out gesture is only one example of a Z-order change command, and thus, as is readily understood by those skilled in the art, a Z-order change command can be triggered by a variety of gestures including flick, drag, edge flick, touch and tilt, and pan.
User 1 may enter another Z-order change command. When the application execution screen 3622 of the window B is touched as indicated by reference numeral 3651, the Z-order change command may be triggered by a pinch-out gesture 3652 to the left. In this case, the Z-order of windows B 3621 and 3622 may be reset to the lowest Z-order. As a result, the controller assigns window A the highest Z-order, window B running application B the lowest Z-order, and window C running application C the middle Z-order. That is, the controller may assign the lowest Z-order to window B while increasing the Z-order of each of the other windows by 1. Accordingly, the controller may control the windows to be displayed in the order of windows A 3611 and 3612, windows C 3631 and 3632, and windows B 3621 and 3622.
Fig. 38a, 38b, and 38c illustrate screens of a display device describing a Z-order changing command according to an embodiment of the present disclosure and fig. 39a, 39b, and 39c illustrate action stacks according to an embodiment of the present disclosure.
Referring to fig. 38a, 38b, 38c, 39a, 39b, and 39c, the controller may manage the action stack shown in fig. 39a. For example, the controller may assign the lowest Z-order to window A running application A, the middle Z-order to window B running application B, and the highest Z-order to window C running application C. Accordingly, the controller may display the windows in the order of windows C 3631 and 3632, windows B 3621 and 3622, and windows A 3611 and 3612, as shown in fig. 38a.
The user 1 can input a Z-order change command. When the title bar 3631 of the window C is touched as indicated by reference numeral 3841, the Z-order change command may be triggered by a pinch-in gesture 3842 from the left. A pinch-in may be a gesture that narrows the distance between two touch points. In this case, the Z-order of windows A 3611 and 3612, which have the lowest Z-order, may be reset to the highest Z-order, while the Z-order of each of the other windows is lowered by 1. Thus, the controller may assign window A the highest Z-order, window B running application B the lowest Z-order, and window C running application C the middle Z-order, as shown in fig. 39b.
As shown in fig. 38b, the controller may control the windows to be displayed in the order of windows A 3611 and 3612, windows C 3631 and 3632, and windows B 3621 and 3622. The user 1 can input another Z-order change command. When the title bar 3631 of the window C is touched as indicated by reference numeral 3851, the Z-order change command may be triggered by a pinch-in gesture 3852 from the left. In this case, the Z-order of windows B 3621 and 3622, which have the lowest Z-order, may be reset to the highest Z-order, while the Z-order of each of the other windows is lowered by 1. As a result, the controller may assign the middle Z-order to window A running application A, the highest Z-order to window B running application B, and the lowest Z-order to window C running application C, as shown in fig. 39c.
Accordingly, the controller may control the windows to be displayed in the order of windows B 3621 and 3622, windows A 3611 and 3612, and windows C 3631 and 3632.
As described above, when a Z-order change command is received as a pinch-out gesture, the controller may assign the lowest Z-order to the window having the highest Z-order. Conversely, when a Z-order change command is received as a pinch-in gesture, the controller may assign the highest Z-order to the window having the lowest Z-order. Since the Z-order of the windows can be changed by a simple user manipulation, a desired window can easily be brought to the top layer of the screen. Therefore, user convenience can be improved.
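The reassignment rule above can be sketched roughly as follows. This Kotlin fragment is an illustrative model only, not part of the disclosure; the window names and the map-based Z-order representation are assumptions made for the example.

```kotlin
enum class Pinch { OUT, IN }

// zOrder maps each window name to its Z-order; 0 is the lowest, size - 1 the highest.
fun changeZOrder(zOrder: MutableMap<String, Int>, gesture: Pinch) {
    val highest = zOrder.size - 1
    when (gesture) {
        // Pinch-out: the window with the highest Z-order is sent to the bottom,
        // and every other window is raised by 1.
        Pinch.OUT -> {
            val top = zOrder.entries.first { it.value == highest }.key
            for ((w, z) in zOrder.toMap()) if (w != top) zOrder[w] = z + 1
            zOrder[top] = 0
        }
        // Pinch-in: the window with the lowest Z-order is brought to the top,
        // and every other window is lowered by 1.
        Pinch.IN -> {
            val bottom = zOrder.entries.first { it.value == 0 }.key
            for ((w, z) in zOrder.toMap()) if (w != bottom) zOrder[w] = z - 1
            zOrder[bottom] = highest
        }
    }
}

fun main() {
    val z = mutableMapOf("A" to 0, "B" to 1, "C" to 2)  // C on top, as in fig. 38a
    changeZOrder(z, Pinch.IN)                           // A is brought to the top
    println(z)                                          // {A=2, B=0, C=1}
}
```

Running the example reproduces the transition from fig. 39a to fig. 39b in this model: window A moves to the top while windows B and C each drop by one level.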
Fig. 40a, 40b, 40c, 40d, 40e, 40f, 40g, 40h, 40i, 40j, and 40k illustrate a method for displaying an application execution window according to an embodiment of the present disclosure.
Referring to fig. 40a, 40b, 40c, 40d, 40e, 40f, 40g, 40h, 40i, 40j, and 40k, the display device 4200 may define a plurality of regions 4201, 4202, 4203, and 4204 on the touch screen. For convenience in describing fig. 40a and the subsequent drawings, the plurality of regions 4201, 4202, 4203, and 4204 are referred to as a first region 4201, a second region 4202, a third region 4203, and a fourth region 4204 (region 1, region 2, region 3, and region 4), respectively. The first region 4201 and the third region 4203 together form a fifth region (region 5, not shown), and the second region 4202 and the fourth region 4204 together form a sixth region (region 6, not shown). The first region 4201 and the second region 4202 together form a seventh region (region 7, not shown), and the third region 4203 and the fourth region 4204 together form an eighth region (region 8, not shown). The first to fourth regions 4201, 4202, 4203, and 4204 together form a region F. A first boundary 4211 may be disposed between the first region 4201 and the second region 4202, a second boundary 4212 may be disposed between the third region 4203 and the fourth region 4204, a third boundary 4213 may be disposed between the first region 4201 and the third region 4203, and a fourth boundary 4214 may be disposed between the second region 4202 and the fourth region 4204. The first boundary 4211 and the second boundary 4212 may form a single line, and the third boundary 4213 and the fourth boundary 4214 may form a single line. The first to fourth boundaries 4211 to 4214 are not necessarily explicitly displayed; instead, they may be rendered as dotted lines. The controller (not shown) may configure the first to fourth regions 4201, 4202, 4203, and 4204 in such a manner that they do not overlap. For example, as shown in fig. 40a, the controller may configure the first region 4201 in the upper left corner, the second region 4202 in the upper right corner, the third region 4203 in the lower left corner, and the fourth region 4204 in the lower right corner. The controller may divide the screen into left and right portions by the first and second boundaries 4211 and 4212, and into upper and lower portions by the third and fourth boundaries 4213 and 4214.
The touch screen may display a center button 4220 at the intersection where the first to fourth boundaries 4211 to 4214 meet. The center button 4220 may be a function key for changing the size of the areas in which application execution windows are displayed or for setting an operation mode for controlling the execution windows.
The controller may control the touch screen in such a way that an application execution window is displayed in each of the plurality of regions. For example, the controller may control the touch screen such that a window running an application, that is, an application execution window, is displayed in each of the regions 4201, 4202, 4203, and 4204, as shown in fig. 40b, 40c, 40d, 40e, 40f, 40g, 40h, 40i, 40j, and 40k.
Objects related to an application may be displayed on the execution screen of the application. The objects may take a variety of forms, such as text, graphics, icons, buttons, check boxes, photos, videos, web pages, maps, and so forth. When a user touches an object, a function or event corresponding to the touched object may be run in the application. Depending on the OS, an object may be referred to as a view. For example, to control the display of the execution window, at least one of a capture button that captures the execution window, a minimize button that minimizes the size of the execution window, a maximize button that maximizes the size of the execution window, and an exit button that ends the execution window may be displayed.
Referring to fig. 40b, the controller may control icons 4231, 4232, 4233, 4234, 4235, 4236 and 4237 representing executable applications to be displayed on the touch screen. The display device 4200 may run application a. As shown in fig. 40b, in response to the execution of the application a, the controller may control the touch screen to display an execution window 4230 of the application a in the first region 4201. Further, the controller may control icons 4231, 4232, 4233, 4234, 4235, 4236, and 4237 representing executable applications to be displayed at specific positions of the touch screen. When a touch input to one of the icons 4231, 4232, 4233, 4234, 4235, 4236 and 4237 is received, that is, when an input of selecting an icon representing an application to be executed is received, the display device 4200 may display an execution window of the application corresponding to the selected icon in one of the first to fourth areas 4201, 4202, 4203 and 4204.
The controller may display an indicator 4221 indicating an active region on the displayed center button 4220. The active area may be an area in which a last running application or a last application selected by a user is displayed. The user-operated application may be considered a user-selected application.
Indicator 4221 may be implemented in a variety of ways to indicate the location of the active region. For example, at least a portion of the application execution window displayed in the active region may be displayed in an area overlapping between the application execution window of the active region and the center button 4220. Alternatively, an arrow indicating the direction of the active region may be displayed on the center button 4220.
The active region may be determined based on an action stack. The last-run application or the last user-selected application may be at the top of the action stack. The display device 4200 may determine, as the active region, the region in which the running window of the application at the top of the action stack is displayed. The active region may also be referred to as a focus region. For example, in fig. 40b, the indicator 4221 may indicate the first region 4201.
Fig. 41a, 41b, 41c, 41d, 41e, and 41f illustrate action stacks according to various embodiments of the present disclosure.
Referring to FIG. 41a, an action stack managed by the display device 4200 is shown. The controller may generate and manage an action 4301 for application A in the action stack in response to the execution of application A.
Referring to fig. 40c, the user 1 may touch the icon 4232 representing application B. When the icon 4232 representing application B is touched, the controller may control the touch screen to display the run window 4240 of application B in the second region 4202, as shown in fig. 40d. The controller may determine the region in which a new run window is displayed according to a particular order. For example, the controller may control new run windows to be displayed in the order of the second region 4202, the third region 4203, and the fourth region 4204. This display order is merely an example, and the order in which new run windows are displayed in the regions 4201, 4202, 4203, and 4204 may be changed according to various embodiments of the present disclosure.
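The region-assignment order just described can be illustrated as a simple lookup. The following Kotlin sketch is purely illustrative; the region numbers and the occupancy set are assumptions rather than anything defined in the disclosure.

```kotlin
// New run windows fill region 2, then region 3, then region 4, per the example above.
val launchOrder = listOf(2, 3, 4)

// Returns the first region in the launch order that is not yet occupied, or null if none is free.
fun nextRegion(occupied: Set<Int>): Int? = launchOrder.firstOrNull { it !in occupied }

fun main() {
    println(nextRegion(setOf(1)))     // 2 — application B goes to the second region
    println(nextRegion(setOf(1, 2)))  // 3 — the next application goes to the third region
}
```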
Since the run window 4240 of the application B is displayed in the second region 4202, the indicator 4221 may indicate the second region 4202 in fig. 40 d.
Referring to FIG. 41b, an action stack corresponding to FIG. 40d is shown. The controller generates an action 4302 for application B in the action stack in response to the running of application B. The controller may place the action 4302 of the last-run application B on top of the action 4301 of application A.
Referring to fig. 40e, the user 1 may touch the icon 4233 representing application C. When the icon 4233 representing application C is touched, the controller may control the touch screen to display the run window 4250 of application C in the fourth region 4204, as shown in fig. 40f. Since the running window 4250 of application C is displayed in the fourth region 4204, the indicator 4221 may indicate the fourth region 4204.
FIG. 41c shows an action stack corresponding to FIG. 40 f. The controller generates an action 4303 for application C in an action stack in response to the running of application C. The controller places the action 4303 of the last running application C on top of the action stack.
Referring to fig. 40g, the user 1 may touch the icon 4234 representing application D. When the icon 4234 representing application D is touched, the controller may control the touch screen to display the run window 4260 of application D in the third region 4203, as shown in fig. 40h. Since the execution window 4260 of application D is displayed in the third region 4203, the indicator 4221 on the center button 4220 may indicate the third region 4203.
FIG. 41d shows an action stack corresponding to FIG. 40 h. The controller generates an action 4304 for application D in an action stack in response to the running of application D. The controller places the action 4304 of the last running application D on top of the action stack.
Referring to fig. 40i, the user 1 can operate the application B. FIG. 41e shows an action stack corresponding to FIG. 40 i. The controller moves the action 4302 of application B to the top of the action stack in response to user input to the run window 4240 of application B.
When receiving the user input to the run window 4240 of the application B, the controller may determine the second region 4202 as the active region, as shown in fig. 40 i. Thus, the indicator 4221 on the center button 4220 may indicate the second region 4202.
Referring to fig. 40j, the user 1 may touch the icon 4235 representing application E. When the icon 4235 representing application E is touched, the controller controls the touch screen to display the run window 4270 of application E in the fourth region 4204, as shown in fig. 40k. Since there is no empty region, the controller may refer to the action stack shown in fig. 41e. The controller may select the application whose action is lowest in the action stack, and may display the run window 4270 of application E in the fourth region 4204 in place of the run window of application C.
FIG. 41f shows an action stack corresponding to FIG. 40 k. The controller generates an action 4305 for application E in an action stack in response to the running of application E. The controller places the action 4305 of the last running application E on top of the action stack.
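The action-stack bookkeeping walked through in figs. 40 and 41 can be summarized with a small sketch. The following Kotlin class is a hypothetical model, not the controller's actual implementation; the method names and the (application, region) pairs are assumptions made for illustration.

```kotlin
// A simplified model of the action stack: the last element of the list is the top of the stack.
class ActionStack {
    private val stack = mutableListOf<Pair<String, Int>>()  // (application name, region)

    // Launching an application pushes its action onto the top of the stack.
    fun launch(app: String, region: Int) { stack.add(app to region) }

    // Operating or selecting a running window moves its action to the top.
    fun select(app: String) {
        val i = stack.indexOfFirst { it.first == app }
        if (i >= 0) stack.add(stack.removeAt(i))
    }

    // The active (focus) region is the region of the application on top of the stack.
    fun activeRegion(): Int? = stack.lastOrNull()?.second

    // When no region is empty, reuse the region of the lowest action in the stack,
    // following the rule stated in the description above.
    fun regionToReuse(): Int? = stack.firstOrNull()?.second
}

fun main() {
    val s = ActionStack()
    s.launch("A", 1); s.launch("B", 2); s.launch("C", 4); s.launch("D", 3)
    s.select("B")               // the user operates application B (fig. 40i)
    println(s.activeRegion())   // 2 — the indicator points at the second region
}
```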
Fig. 42 is a flowchart illustrating a method for running an application in a display device according to an embodiment of the present invention.
Referring to fig. 42, in operation S4410, the display apparatus may run a plurality of applications. For example, a display device may run an application in response to receipt of user input on an icon representing the application.
In operation S4420, the display apparatus may determine a layout for arranging the execution windows of the applications. The layout defines the areas in which the run windows can be arranged. For example, various layouts are available, including a two-region top/bottom split layout, a two-region left/right split layout, a three-region split layout, a four-region split layout, and so on.
In operation S4430, the display apparatus may determine window positions in the layout. For example, in the case where a two-region left/right split layout consisting of the fifth region and the sixth region is defined, the display apparatus may allocate the running windows of a web browser and a phonebook application to the fifth region and the running window of a video playing application to the sixth region.
In operation S4440, the display apparatus may display the plurality of execution windows according to the priority levels of the applications. For example, if the execution windows of the web browser and the phonebook application are allocated to the fifth region, the execution window of whichever of the web browser and the phonebook application has the higher priority may be displayed in the fifth region.
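The priority-based display of operation S4440 can be illustrated with a short sketch. The Kotlin fragment below is a hypothetical model; the App type, its priority field, and the region numbers are assumptions, not part of the disclosed method.

```kotlin
data class App(val name: String, val priority: Int)

// For each region, show the running window of the highest-priority application assigned to it.
fun displayPlan(layout: Map<Int, List<App>>): Map<Int, App?> =
    layout.mapValues { (_, apps) -> apps.maxByOrNull { it.priority } }

fun main() {
    // Two-region left/right split: region 5 holds a web browser and a phonebook, region 6 a video player.
    val layout = mapOf(
        5 to listOf(App("web browser", 2), App("phonebook", 1)),
        6 to listOf(App("video player", 1))
    )
    println(displayPlan(layout))  // region 5 shows the web browser, region 6 the video player
}
```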
Fig. 43a and 43b illustrate a method for controlling a display area of an application execution window using a center button according to an embodiment of the present disclosure.
Referring to fig. 43a, the display device 4500 may define a first region 4501, a second region 4502, a third region 4503, a fourth region 4504, a first boundary 4505, a second boundary 4507, a third boundary 4506, and a fourth boundary 4508, which should not be construed as limiting the present disclosure. The display device 4500 can define regions and boundaries in a variety of ways.
The display device 4500 may display a center button 4220 on at least one boundary. For example, if the first boundary 4505, the second boundary 4507, the third boundary 4506, and the fourth boundary 4508 are defined, the display device 4500 may display the center button 4220 at the intersection point where the first boundary 4505, the second boundary 4507, the third boundary 4506, and the fourth boundary 4508 meet, as shown in fig. 43a. In another example, if the display device 4500 defines fifth and sixth regions (not shown) and the first and second boundaries 4505 and 4507, the display device 4500 may display the center button 4220 on the first boundary 4505 or the second boundary 4507.
Referring to fig. 43b, if the user 10 touches the center button 4220 and drags it, the display device 4500 may move the center button 4220 to the dragged position. As the center button 4220 moves, the display device 4500 may change the sizes and positions of the boundaries and of the areas in which the application execution windows are displayed.
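A minimal sketch of how a dragged center-button position could re-partition the screen into the four corner regions is given below. The coordinate model, the screen size, and the Rect type are illustrative assumptions, not the device's actual layout code.

```kotlin
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

// The center button at (x, y) splits a width x height screen into four corner regions.
fun regionsFor(x: Int, y: Int, width: Int, height: Int): Map<Int, Rect> = mapOf(
    1 to Rect(0, 0, x, y),           // first region: upper left
    2 to Rect(x, 0, width, y),       // second region: upper right
    3 to Rect(0, y, x, height),      // third region: lower left
    4 to Rect(x, y, width, height)   // fourth region: lower right
)

fun main() {
    // Dragging the button left of the screen midpoint widens regions 2 and 4.
    println(regionsFor(300, 540, 1080, 1920)[2])  // Rect(left=300, top=0, right=1080, bottom=540)
}
```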
Fig. 44a, 44b, 44c, 44d, 44e, 44f, 44g, 44h, 44i, 44j, 44k, 44l, 44m, 44n, 44o, 44p, 44q, 44r, 44s, 44t, 44u, 44v and 44w illustrate a method for running multiple applications according to an embodiment of the present disclosure.
Referring to fig. 44a, 44b, 44c, 44d, 44e, 44f, 44g, 44h, 44i, 44j, 44k, 44l, 44m, 44n, 44o, 44p, 44q, 44r, 44s, 44t, 44u, 44v, and 44w, while application A is running, the display device 4600 may display a list 4610 of at least one application. The application list 4610 lists applications that can be run. For example, icons 4611, 4612, 4613, 4614, 4615, 4616, and 4617 representing executable applications may be listed in the application list 4610.
Fig. 45a, 45b, 45c, 45d, 45e, 45f, 45g, 45h, 45i, and 45j illustrate action stacks according to embodiments of the present disclosure.
Referring to FIG. 45a, an action stack corresponding to FIG. 44a is shown. Since the execution window of application A is displayed in area F, which occupies the entire screen, the controller (not shown) generates an action for application A, as shown in fig. 45a.
The user 10 can operate the display device 4600 to additionally run application B. For example, as shown in fig. 44b, the user 10 may touch the icon 4612 representing application B, and drag the touched icon 4612 to the sixth area 4623, as shown in fig. 44c.
If the drag input ends in the sixth area 4623, the display device 4600 runs application B corresponding to the selected icon 4612. Further, while displaying application B in the sixth area 4623, the display device 4600 moves application A, which was displayed in area F, to the fifth area 4619.
Finally, a running window 4620 of application a is displayed in the fifth area 4619, and a running window 4630 of application B is displayed in the sixth area 4623.
The display device 4600 may display the center button 4622 on the boundary between the fifth area 4619 and the sixth area 4623. The display device 4600 may also display, on the center button 4622, an indicator 4621 indicating the running window 4630 of the last-run application B. The indicator 4621 may indicate the area in which the running window of the application placed on top of the action stack is displayed.
FIG. 45b shows an action stack corresponding to FIG. 44d. The action of application A, which was displayed in area F, is changed so that application A is displayed in the fifth area 4619. Since application B is executed, an action for application B is generated, and application B is allocated to the sixth area 4623. The action of application B is placed on top of the action stack.
Referring to fig. 44e, the user 10 may move the center button 4622 of the display. As shown in fig. 43a and 43b, along with the movement of the center button 4622, the size of the area in which the execution window of the application is displayed can be changed.
Subsequently, as shown in fig. 44f, the user 10 may touch the icon 4613 representing application C, and drag the touched icon 4613 to the fourth area 4627, as shown in fig. 44g. The size of the fourth area 4627 may be determined based on the position of the center button 4622. When the dragging of the icon 4613 representing application C ends in the fourth area 4627, the display device 4600 may execute application C. As shown in fig. 44h, the display device 4600 may display the running window 4640 of application C in the fourth area 4627. The display device 4600 may display, on the center button 4622, the indicator 4621 indicating, as the active area, the area in which the execution window 4640 of application C is displayed.
FIG. 45c shows an action stack corresponding to FIG. 44h. The controller (not shown) generates an action for application C in response to the running of application C. Application C is allocated to the fourth area 4627. Since the area in which the running window 4630 of application B was displayed is split, application B is allocated to the second area in fig. 44e.
Referring to fig. 44i, the user 10 may control the sizes of the areas displaying the execution windows 4620 to 4640 of the applications by applying an input that moves the center button 4622.
As shown in fig. 44j, the user 10 may touch the icon 4614 representing the application D, and drag the touched icon 4614 to the third area 4631 as shown in fig. 44 k.
If the dragging of the touched icon 4614 representing application D ends in the third area 4631, the display device 4600 may execute application D. As shown in fig. 44l, the display device 4600 may display the running window 4650 of application D in the third area 4631. The display device 4600 may display, on the button 4622, the indicator 4621 indicating, as the active area, the area in which the execution window 4650 of application D is displayed.
FIG. 45d shows an action stack corresponding to FIG. 44l. The controller generates an action for application D in response to the running of application D. Application D is allocated to the third area 4631. Since the area in which the execution window 4620 of application A was displayed is split, application A is allocated to the first area in fig. 44i.
The user 10 may touch the icon 4615 representing application E and drag the touched icon 4615 to the border area 4659, as shown in fig. 44m. The display device 4600 may define the border area 4659 to include the boundary 4685.
If the dragging of the touched icon 4615 representing application E ends in the border area 4659, the display device 4600 may execute application E. The display device 4600 may arrange the execution window 4660 of application E in the seventh area, which includes the first area and the second area adjacent to the boundary 4685 included in the border area 4659, as shown in fig. 44n. The display device 4600 may display, on the button 4622, the indicator 4621 indicating, as the active area, the area in which the execution window 4660 of application E is displayed.
FIG. 45e shows an action stack corresponding to FIG. 44n. The controller generates an action for application E in response to the execution of application E. Application E is allocated to the seventh area, and applications A and B, which were displayed in the first area and the second area included in the seventh area, are placed in the seventh area.
The user 10 may touch the icon 4616 representing the application F and drag the touched icon 4616 to the second area 4661, as shown in fig. 44 o.
If the dragging of the touched icon 4616 representing the application F ends in the second area 4661, the display device 4600 may execute the application F. As shown in fig. 44p, the display device 4600 may display a running window 4670 of the application F in the second region 4661.
Referring to fig. 45f, the controller may generate an action for application F, which is allocated to the second area. Since the seventh area is split, application A, application B, and application E, which were displayed in the seventh area, may be allocated to the first area.
The touch screen may receive an input from the user 10 to select the execution window 4660 of the application E, as shown in fig. 44 p.
Referring to fig. 45g, in response to selection of the execution window 4660 of the application E, the controller may move the action of the application E to the top of the action stack. The display device 4600 may display an indicator 4621 indicating the position of the execution window 4660 on the button 4622.
Referring to fig. 44q, the display device 4600 may receive an input from the user 10 selecting the button 4622. For example, the user 10 may touch the button 4622. In response to receiving the input selecting the button 4622, the display device 4600 may display a list of the icons 4611, 4612, and 4615 representing the applications allocated to the first area, which is the currently active area. For example, with reference to the action stack shown in fig. 45g, the display device 4600 may display, in the first area, icons representing application A, application B, and application E allocated to the first area.
In response to receiving the input selecting the button 4622, the display device 4600 may further display icons 4691, 4692, and 4693 representing operations related to the execution window of the application displayed in the first area.
When an input selecting the icon 4611 representing application A from among the icons displayed in the first area is received, the display device 4600 may display the running window 4620 of application A in the first area, as shown in fig. 44s.
FIG. 45h shows an action stack corresponding to FIG. 44 s. In response to receiving an input selecting the icon 4611 representing application A, the controller may move the actions of application A to the top of the action stack.
Referring to fig. 44t, when an input selecting the center button 4622 is received, a list of the icons 4611, 4612, and 4615 representing the applications allocated to the first area, which is the active area, may be displayed. Further, a drag input may be received from the user 10 dragging the icon 4612 representing application B to the area in which the execution window 4640 of application C is displayed. When the drag input is completed, the display device 4600 may display the running window 4630 of application B in the fourth area, as shown in fig. 44u. The display device 4600 may display, on the button 4622, the indicator 4621 indicating the position of the execution window 4630 of application B.
FIG. 45i shows an action stack corresponding to FIG. 44u. Because the running window 4630 of application B is displayed in the fourth area, the controller updates the area to which application B is allocated to the fourth area and moves the action of application B to the top of the action stack.
Referring to fig. 44v, when an input selecting the center button 4622 is received, icons 4691, 4692, and 4693 representing operations related to the running window of the application displayed in the first region, which is the active area, may be further displayed. The operations related to the running window of an application may perform a variety of additional functions. For example, the icons representing operations related to the application execution window may include at least one of an exit button 4691 that ends the execution window, a maximize button 4692 that displays the execution window in full screen, and a capture button 4693 that captures the execution window, which should not be construed as limiting the present disclosure. When an input selecting the exit button 4691 is received from the user 10, the controller may end the execution window of application A, as shown in (b) of fig. 44v.
FIG. 45j shows an action stack corresponding to FIG. 44v. When the running window of application A is terminated, the action of application A may be removed from the action stack.
Referring to fig. 44w, when an input selecting the maximize button 4692 is received from the user 10, the display device 4600 may display the running window 4660 displayed in the active area in full screen on the touch screen.
The display device 4600 may capture the active run window 4660 when an input is received from the user 10 selecting the capture button 4693.
Fig. 46 is a flowchart illustrating a method for providing a user interface on which an application is executed in a display device according to an embodiment of the present invention.
Referring to fig. 46, in operation S4810, a display device may display a running window of an application in a plurality of regions defined on a touch screen. The display apparatus may further display a button on at least one boundary between the plurality of regions in operation S4820.
The display device may display an indicator on the button to indicate the active region. The active region may refer to the last region selected from among the plurality of regions. Further, the active region may refer to a region in which the displayed execution window is in a state controllable by user input.
In operation S4830, the display device may receive an input selecting the button. When the input selecting the button is received, the display apparatus may display a list of applications in a specific area in operation S4840. Herein, the specific area may be the active area.
The application list may list at least one icon representing at least one application. When an input selecting one of the applications included in the application list is received, the display apparatus may display the running window of the selected application in the specific area. When a drag input dragging an icon included in the application list is received, the display apparatus may display the running window of the application corresponding to the dragged icon in the area to which the icon was dragged.
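The two ways of reopening an application from the list, tapping an icon versus dragging it, can be sketched as follows. The event types, function names, and region identifiers in this Kotlin fragment are illustrative assumptions only.

```kotlin
sealed class ListInput {
    data class Tap(val app: String) : ListInput()
    data class Drag(val app: String, val dropRegion: Int) : ListInput()
}

// Returns the application to display and the region in which its run window is shown.
fun handleListInput(input: ListInput, activeRegion: Int): Pair<String, Int> = when (input) {
    is ListInput.Tap -> input.app to activeRegion       // tap: run window shown in the active (specific) region
    is ListInput.Drag -> input.app to input.dropRegion  // drag: run window shown where the icon was dropped
}

fun main() {
    println(handleListInput(ListInput.Tap("A"), activeRegion = 1))               // (A, 1)
    println(handleListInput(ListInput.Drag("B", dropRegion = 4), activeRegion = 1))  // (B, 4)
}
```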
The display device may further display icons representing operations related to the execution window of the application displayed in the specific area. To control the display of the execution window, the icons representing these operations may include at least one of a capture button that captures the execution window, a minimize button that minimizes the size of the execution window, a maximize button that maximizes the size of the execution window, and an exit button that ends the execution window.
Fig. 47 is a flowchart illustrating a method for running an application in a display device according to an embodiment of the present invention.
Referring to fig. 47, in operation S4910, a display device may display a running window of an application in a plurality of regions defined on a touch screen. The display device may further display a button on at least one boundary between the plurality of regions in operation S4920.
In operation S4930, the display device may display a list of at least one application execution icon in a partial area of the touch screen.
In operation S4940, the display device may determine an area in which a new application is executed based on a position to which the application execution icon is dragged and a position of the button. The execution region of the new application is a region in which an execution window of an additionally executed application will be displayed.
If the dragged position falls within a boundary region that includes at least one boundary, the run region of the new application may be determined to include the regions adjacent to the at least one boundary.
Subsequently, in operation S4950, the display device may display a running window of the application in the determined region.
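Operation S4940 can be illustrated with a hedged sketch that maps the drop position of a dragged icon, relative to the button, either to a single corner region or to a combined region spanning a boundary. The border-area width and the region numbering (1 to 8, following the naming used with fig. 40a) are assumptions made for the example.

```kotlin
import kotlin.math.abs

data class Point(val x: Int, val y: Int)

fun targetRegion(drop: Point, center: Point, borderWidth: Int = 40): Int = when {
    // Drop near the vertical boundary: span the upper or lower half (regions 7 / 8).
    abs(drop.x - center.x) < borderWidth -> if (drop.y < center.y) 7 else 8
    // Drop near the horizontal boundary: span the left or right half (regions 5 / 6).
    abs(drop.y - center.y) < borderWidth -> if (drop.x < center.x) 5 else 6
    // Otherwise the drop lands inside one of the four corner regions.
    drop.x < center.x && drop.y < center.y -> 1
    drop.x >= center.x && drop.y < center.y -> 2
    drop.x < center.x -> 3
    else -> 4
}

fun main() {
    println(targetRegion(Point(540, 200), Point(540, 960)))   // 7 — upper combined region
    println(targetRegion(Point(900, 1500), Point(540, 960)))  // 4 — lower-right corner region
}
```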
Fig. 48 is a block diagram of a display device according to an embodiment of the present disclosure.
Referring to fig. 48, the display device 5000 may include: a touch screen 5010 configured to display running windows of applications in a plurality of areas, to display a button on at least one boundary between the plurality of areas, and to receive an input selecting the button; and a controller 5020 configured to control the touch screen 5010 to display, in a specific area selected from among the plurality of areas, a list of at least one application running in the specific area, based on the received input.
The specific area may include an active area that is controllable by user input. The active area may be the last area selected from among the plurality of areas.
In addition, the controller 5020 may control the touch screen 5010 to display an indicator indicating an active area on the button.
Fig. 49a, 49b, 49c, and 49d are diagrams illustrating a method for displaying buttons according to an embodiment of the present disclosure.
Referring to fig. 49a, the display device 5100 may display a button 5122 on a boundary separating areas in which the execution windows of a plurality of applications are displayed. In addition, the display device 5100 may define an arrangement line 5120 according to the layout of the application execution windows. The arrangement line 5120 may include a dotted line and the outline of the touch screen.
The display device 5100 may further define an arrangement area 5110. The arrangement line 5120 may be included in the arrangement region 5110.
As shown in fig. 49a, the arrangement line 5120 and the arrangement region 5110 may be determined according to the number and positions of the execution windows of the applications displayed on the touch screen. For example, if the layout is a two-region top/bottom split layout, a two-region left/right split layout, a three-region split layout, or a four-region split layout, the arrangement line 5120 and the arrangement region 5110 may be defined according to that layout.
Referring to fig. 49b, when an input moving the button 5122 into the arrangement region 5110 on the touch screen is received, the display device 5100 may move the button 5122 to the portion of the arrangement line 5120 closest to the last position of the button 5122.
The display device 5100 may determine the areas in which the running windows of the applications are displayed based on the position of the button 5122 on the arrangement line 5120. Accordingly, the display device 5100 may arrange the display areas of the application execution windows.
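A minimal sketch of snapping the released button to the nearest point of an arrangement line is given below, assuming a single vertical arrangement line and simple screen coordinates; none of these names come from the disclosure.

```kotlin
data class ButtonPos(val x: Double, val y: Double)

// Nearest point on a vertical arrangement line x = lineX, clamped to the screen height.
fun snapToLine(release: ButtonPos, lineX: Double, height: Double): ButtonPos =
    ButtonPos(lineX, release.y.coerceIn(0.0, height))

fun main() {
    println(snapToLine(ButtonPos(520.0, 830.0), lineX = 540.0, height = 1920.0))
    // ButtonPos(x=540.0, y=830.0) — the button settles onto the arrangement line
}
```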
Referring to fig. 49c, the display device 5100 may define an arrangement point 5130 at a specific position of the arrangement line 5120. When an input to the button 5122 is received (e.g., two consecutive touches on the button 5122), the display device 5100 may move the button 5122 to the arrangement point 5130.
Referring to fig. 49d, if the button 5122 is moved in the manner shown in figs. 49a, 49b, and 49c, the movement may follow the function shown in fig. 49d. For example, when the center button 5122 moves to the arrangement line 5120 or the arrangement point 5130, the button 5122 may pass through the arrangement line 5120 or the arrangement point 5130 and then return to it.
Embodiments of the present disclosure provide a display apparatus that can easily switch from one window to another lower-priority window while a plurality of windows are running on a single display, and a method for controlling the same. Thus, a user can use multiple applications in multiple windows simultaneously. Further, even if a plurality of windows are displayed in an overlapping manner, the currently displayed window can easily be switched to another lower-priority window. Accordingly, in an environment in which a plurality of windows are displayed on a screen, a user can run a window of a desired size at a desired position with improved convenience.
It will be understood that the various embodiments of the present disclosure may be implemented in hardware, software, or a combination thereof. The software may be stored in a volatile or non-volatile storage device such as a ROM, whether or not the data is erasable or rewritable, in a memory such as a RAM, a memory chip, a device, or an integrated circuit, or on an optically or magnetically recordable, machine-readable (e.g., computer-readable) storage medium such as a Compact Disc (CD), a Digital Versatile Disc (DVD), a magnetic disk, or a magnetic tape. Further, embodiments of the present disclosure may be implemented in a computer or portable terminal having a controller and a memory, the memory being adapted to store a program or programs including commands for implementing the embodiments of the present disclosure. Accordingly, the present disclosure includes a program having code for implementing an apparatus or method defined by the claims and a machine-readable storage medium storing the program. The program may be transferred electronically through a medium such as a communication signal transmitted via a wired or wireless connection, and the present disclosure includes equivalents thereof.
The device may receive the program from a program providing device through a wired or wireless connection and store it. The program providing device may include a program containing commands for implementing the embodiments of the present disclosure, a memory for storing information used for the embodiments of the present disclosure, a communication module for communicating with the mobile device through a wired or wireless connection, and a controller for transmitting the program to the mobile device automatically or upon request.
While the disclosure has been particularly shown and described with reference to the particular exemplary embodiments disclosed, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.

Claims (12)

1. An electronic device, comprising:
a touch display; and
one or more processors configured to control to:
displaying, on the touch display, a center button at least partially overlapping a boundary of the first application window and a boundary of the second application window in the split view, wherein a portion of the boundary of the first application window and a portion of the boundary of the second application window meet,
receiving, through the touch display, a first touch input on the center button for displaying a plurality of options for performing a plurality of operations related to the active first application window, wherein the plurality of options includes at least an option for ending the first application window;
displaying the plurality of options based on receiving a first touch input, wherein the plurality of options are displayed in the split view only in the active first application window,
receiving, through the touch display, a second touch input on an option for ending the first application window among the plurality of options, and
Based on receiving a second touch input on the option for ending the first application window, ending the first application window.
2. The electronic device of claim 1,
wherein the active first application window is visually indicated as an active window among the first application window and the second application window based on the indicator on the center button.
3. The electronic device of claim 1, wherein the first application window is active among the first application window and the second application window based on one of: a first application displayed on the first application window is executed at a more recent time than a second application displayed on the second application window, or the first application window is selected at a more recent time than the second application window.
4. The electronic device of claim 1, wherein the one or more processors are further configured to control to: displaying at least one icon in the active first application window on the touch display, wherein the at least one icon represents at least one application executable in the active first application window, respectively, while maintaining the split view.
5. A method for displaying an application window on a touch display of an electronic device, the method comprising:
displaying, on the touch display, a center button at least partially overlapping a boundary of the first application window and a boundary of the second application window in the split view, wherein a portion of the boundary of the first application window and a portion of the boundary of the second application window meet;
receiving, through the touch display, a first touch input on the center button for displaying a plurality of options for performing a plurality of operations related to the active first application window, wherein the plurality of options includes at least an option for ending the first application window;
based on receiving the first touch input, displaying the plurality of options, wherein the plurality of options are displayed in the split view only in the active first application window;
receiving, through the touch display, a second touch input on an option for ending the first application window among the plurality of options; and
based on receiving a second touch input on the option for ending the first application window, ending the first application window.
6. The method of claim 5,
wherein the active first application window is visually indicated as an active window among the first application window and the second application window based on the indicator on the center button.
7. The method of claim 5, wherein the first application window is active among the first application window and the second application window based on one of: a first application displayed on the first application window is executed at a more recent time than a second application displayed on the second application window, or the first application window is selected at a more recent time than the second application window.
8. The method of claim 5, further comprising:
displaying at least one icon in the active first application window on the touch display, wherein the at least one icon represents at least one application executable in the active first application window, respectively, while maintaining the split view.
9. A non-transitory computer-readable storage medium storing instructions configured to, when executed, cause one or more processors to control to:
displaying, on the touch display, a center button at least partially overlapping a boundary of the first application window and a boundary of the second application window in the split view, wherein a portion of the boundary of the first application window and a portion of the boundary of the second application window meet,
receiving, through the touch display, a first touch input on the center button for displaying a plurality of options for performing a plurality of operations related to the active first application window, wherein the plurality of options includes at least an option for ending the first application window,
displaying the plurality of options based on receiving a first touch input, wherein the plurality of options are displayed in the split view only in the active first application window,
receiving, through the touch display, a second touch input on an option for ending the first application window among the plurality of options, and
Based on receiving a second touch input on the option for ending the first application window, ending the first application window.
10. The non-transitory computer-readable storage medium of claim 9, wherein the active first application window is visually indicated as an active window among the first application window and the second application window based on the indicator on the center button.
11. The non-transitory computer-readable storage medium of claim 9, wherein the first application window is active among the first application window and the second application window based on one of: a first application displayed on the first application window is executed at a more recent time than a second application displayed on the second application window, or the first application window is selected at a more recent time than the second application window.
12. The non-transitory computer-readable storage medium of claim 9, further storing instructions configured to, when executed, cause the one or more processors to control to:
displaying at least one icon in the active first application window on the touch display, wherein the at least one icon represents at least one application executable in the active first application window, respectively, while maintaining the split view.
CN201711096847.4A 2012-12-06 2013-12-06 Display apparatus and method of controlling the same Active CN107967087B (en)

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
US201261734097P 2012-12-06 2012-12-06
US61/734,097 2012-12-06
US201261737540P 2012-12-14 2012-12-14
US61/737,540 2012-12-14
US201261740887P 2012-12-21 2012-12-21
US61/740,887 2012-12-21
KR10-2013-0012019 2013-02-01
KR20130012019 2013-02-01
KR1020130022422A KR102172792B1 (en) 2012-12-06 2013-02-28 Display apparatus and method for controlling thereof
KR10-2013-0022422 2013-02-28
KR20130099927 2013-08-22
KR10-2013-0099927 2013-08-22
CN201380071613.8A CN104956301B (en) 2012-12-06 2013-12-06 The method for showing equipment and control display equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201380071613.8A Division CN104956301B (en) 2012-12-06 2013-12-06 The method for showing equipment and control display equipment

Publications (2)

Publication Number Publication Date
CN107967087A CN107967087A (en) 2018-04-27
CN107967087B true CN107967087B (en) 2021-08-17

Family

ID=53054284

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201711096847.4A Active CN107967087B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201380071613.8A Active CN104956301B (en) 2012-12-06 2013-12-06 The method for showing equipment and control display equipment
CN201910525895.3A Active CN110427130B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201910525925.0A Active CN110413191B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN201380071613.8A Active CN104956301B (en) 2012-12-06 2013-12-06 The method for showing equipment and control display equipment
CN201910525895.3A Active CN110427130B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201910525925.0A Active CN110413191B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same

Country Status (3)

Country Link
CN (4) CN107967087B (en)
AU (1) AU2013356799B2 (en)
BR (1) BR112015012539B1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511778A (en) * 2015-11-25 2016-04-20 网易(杭州)网络有限公司 Interaction method device for controlling display of multiple game scenes
KR20170141453A (en) * 2016-06-15 2017-12-26 에스케이플래닛 주식회사 Method for analyzing interest using scroll pattern and apparatus using the same
CN106202909A (en) * 2016-07-06 2016-12-07 沈阳东软医疗系统有限公司 A kind of image processing method and device
CN106403985A (en) * 2016-09-06 2017-02-15 深圳格兰泰克汽车电子有限公司 Vehicle-mounted navigation split-screen display method and device
WO2018076328A1 (en) * 2016-10-31 2018-05-03 北京小米移动软件有限公司 Split-screen display method and apparatus
CN106874097A (en) * 2017-02-28 2017-06-20 努比亚技术有限公司 The multi-screen display method and device of a kind of terminal screen
CN111090366B (en) * 2017-05-15 2021-08-31 苹果公司 Method for multitasking, storage medium and electronic device
DK180117B1 (en) 2017-05-15 2020-05-15 Apple Inc. Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touchsensitive display
CN110928612B (en) * 2018-09-20 2022-08-19 网易(杭州)网络有限公司 Display control method and device of virtual resources and electronic equipment
CN109558051B (en) * 2018-11-21 2021-07-20 连尚(新昌)网络科技有限公司 Switching processing method and device of multifunctional page and computer readable storage medium
CN111212261B (en) * 2018-11-22 2021-07-20 浙江宇视科技有限公司 Scene switching method and device
CN109725979A (en) * 2019-01-28 2019-05-07 联想(北京)有限公司 A kind of display control method and electronic equipment
EP3929715B1 (en) 2019-02-22 2024-02-07 Sony Group Corporation Information processing device and information processing method
CN110203786A (en) * 2019-06-05 2019-09-06 上海三菱电梯有限公司 A kind of elevator display apparatus and lift facility
CN112289339A (en) * 2020-06-04 2021-01-29 郭亚力 System for converting voice into picture
CN113535060B (en) * 2021-07-07 2024-04-26 深圳康佳电子科技有限公司 Split screen implementation method and device and storage medium
CN115700459A (en) * 2022-10-31 2023-02-07 北京小米移动软件有限公司 Split screen control method and device and storage medium

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02150919A (en) * 1988-12-01 1990-06-11 Fujitsu Ltd Display system for state display row at the time of dividing and displaying
US6212577B1 (en) * 1993-03-03 2001-04-03 Apple Computer, Inc. Method and apparatus for improved interaction with an application program according to data types and actions performed by the application program
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
AR029671A1 (en) * 2000-06-12 2003-07-10 Novartis Ag COLOR CONTACT LENS WITH MORE NATURAL APPEARANCE AND METHOD FOR MANUFACTURING IT
US7694233B1 (en) * 2004-04-30 2010-04-06 Apple Inc. User interface presentation of information in reconfigured or overlapping containers
KR20070001771A (en) * 2005-06-29 2007-01-04 정순애 Control method of screen data
US8645853B2 (en) * 2006-11-03 2014-02-04 Business Objects Software Ltd. Displaying visualizations linked to one or more data source queries
US8549429B2 (en) * 2007-01-25 2013-10-01 Sharp Kabushiki Kaisha Multi-window management apparatus and program, storage medium and information processing apparatus
KR101450584B1 (en) * 2007-02-22 2014-10-14 삼성전자주식회사 Method for displaying screen in terminal
CN101308416B (en) * 2007-05-15 2012-02-01 宏达国际电子股份有限公司 User interface operation method
CN101515227B (en) * 2008-02-20 2011-05-25 联想(北京)有限公司 Window management method and computer
US8229410B2 (en) * 2008-06-30 2012-07-24 Qualcomm Incorporated Methods for supporting multitasking in a mobile device
KR101548958B1 (en) * 2008-09-18 2015-09-01 삼성전자주식회사 A method for operating control in mobile terminal with touch screen and apparatus thereof.
US8302026B2 (en) * 2008-11-28 2012-10-30 Microsoft Corporation Multi-panel user interface
US20100179674A1 (en) * 2009-01-15 2010-07-15 Open Labs Universal music production system with multiple modes of operation
KR101640460B1 (en) * 2009-03-25 2016-07-18 삼성전자 주식회사 Operation Method of Split Window And Portable Device supporting the same
US8627228B2 (en) * 2009-05-24 2014-01-07 International Business Machines Corporation Automatic sash configuration in a GUI environment
US9152299B2 (en) * 2009-10-08 2015-10-06 Red Hat, Inc. Activity management tool
US8208964B2 (en) * 2009-10-30 2012-06-26 Cellco Partnership Flexible home page layout for mobile devices
EP2354914A1 (en) * 2010-01-19 2011-08-10 LG Electronics Inc. Mobile terminal and control method thereof
JP5800501B2 (en) * 2010-03-12 2015-10-28 任天堂株式会社 Display control program, display control apparatus, display control system, and display control method
TW201133329A (en) * 2010-03-26 2011-10-01 Acer Inc Touch control electric apparatus and window operation method thereof
DE202011110735U1 (en) * 2010-04-06 2015-12-10 Lg Electronics Inc. Mobile terminal
US20120144331A1 (en) * 2010-12-03 2012-06-07 Ari Tolonen Method for Arranging Application Windows on a Display
KR101788051B1 (en) * 2011-01-04 2017-10-19 엘지전자 주식회사 Mobile terminal and method for controlling thereof
KR20120095155A (en) * 2011-02-18 2012-08-28 박철 Operation method of personal portable device having touch panel
CN102646010A (en) * 2011-02-22 2012-08-22 中兴通讯股份有限公司 Software switching method and device
CN102736903A (en) * 2011-04-08 2012-10-17 腾讯科技(深圳)有限公司 Method and device for managing widgets based on intelligent terminal desktop
KR101199618B1 (en) * 2011-05-11 2012-11-08 주식회사 케이티테크 Apparatus and Method for Screen Split Displaying
CN102780932B (en) * 2011-05-13 2016-08-03 上海信颐电子科技有限公司 Multiwindow player method and system
KR101841590B1 (en) * 2011-06-03 2018-03-23 삼성전자 주식회사 Method and apparatus for providing multi-tasking interface
CN102521034B (en) * 2011-12-27 2014-05-07 惠州Tcl移动通信有限公司 Multitask management method and multitask management system based on android system
CN102664747B (en) * 2012-03-27 2015-01-07 易云捷讯科技(北京)有限公司 Cloud calculating platform system
KR101957173B1 (en) * 2012-09-24 2019-03-12 삼성전자 주식회사 Method and apparatus for providing multi-window at a touch device

Also Published As

Publication number Publication date
CN104956301A (en) 2015-09-30
BR112015012539B1 (en) 2022-03-03
CN104956301B (en) 2019-07-12
CN110413191B (en) 2022-12-23
CN110427130B (en) 2023-07-21
CN110413191A (en) 2019-11-05
CN110427130A (en) 2019-11-08
BR112015012539A2 (en) 2017-07-11
AU2013356799A1 (en) 2015-05-14
AU2013356799B2 (en) 2019-08-08
CN107967087A (en) 2018-04-27
BR112015012539A8 (en) 2019-10-01

Similar Documents

Publication Publication Date Title
CN107967087B (en) Display apparatus and method of controlling the same
EP3690624B1 (en) Display device and method of controlling the same
US20210342059A1 (en) Display device and method of controlling the same
US10671282B2 (en) Display device including button configured according to displayed windows and control method therefor
CN105683894B (en) Application execution method of display device and display device thereof
EP2690542B1 (en) Display device and control method thereof
CN104903830B (en) Display apparatus and control method thereof
CN110362246B (en) Method of controlling electronic device, and storage medium
KR20200043325A (en) Display apparatus and method for controlling thereof
US11604580B2 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
EP2753053B1 (en) Method and apparatus for dynamic display box management
KR102301053B1 (en) Display apparatus and method for controlling thereof
US9886167B2 (en) Display apparatus and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant