WO2014088374A1 - Display device and control method therefor

Display device and control method therefor

Info

Publication number
WO2014088374A1
WO2014088374A1 (PCT/KR2013/011306)
Authority
WO
WIPO (PCT)
Prior art keywords
display
center button
plurality
windows
window
Prior art date
Application number
PCT/KR2013/011306
Other languages
French (fr)
Korean (ko)
Inventor
이승운 (Lee Seung-woon)
김강태 (Kim Kang-tae)
김영진 (Kim Young-jin)
박대욱 (Park Dae-wook)
최정환 (Choi Jung-hwan)
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201261734097P priority Critical
Priority to US61/734,097 priority
Priority to US201261737540P priority
Priority to US61/737,540 priority
Priority to US61/740,887 priority
Priority to US201261740887P priority
Priority to KR10-2013-0011933 priority
Priority to KR20130011933 priority
Priority to KR10-2013-0096206 priority
Priority to KR1020130096206A priority patent/KR20140073399A/en
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority claimed from US14/649,451 external-priority patent/US20150325211A1/en
Publication of WO2014088374A1 publication Critical patent/WO2014088374A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports

Abstract

Disclosed is a method for controlling a display device comprising a touch screen. The control method according to the present invention comprises: displaying, on the touch screen, a plurality of windows, each executing an application, such that the windows do not overlap; displaying a center button disposed at the point where the boundary lines separating the plurality of windows intersect; receiving, through the touch screen, a window size change command for changing the size of at least one of the plurality of windows; changing the size of at least one of the plurality of windows according to the command and displaying the resized window; and controlling some of the plurality of windows not to be displayed while enlarging and displaying the remaining windows.

Description

Display device and control method

The present invention relates to a display apparatus and a control method thereof, and more particularly, to a display apparatus and a control method for controlling the display of a window in which an application is executed.

A desktop computer has at least one display device (e.g., a monitor). Mobile devices that use a touch screen (e.g., mobile phones, smartphones, or tablet PCs) have a single display device.

A desktop computer user may divide the screen of the display apparatus according to the working environment, for example horizontally or vertically, and work with a plurality of windows. When a web browser is running, the user may press the page up or page down key on the keyboard to move up or down a web page. When using a mouse instead of a keyboard, the user may select the scroll bar at the side of the web page with the mouse cursor to move up or down the page, or may select the "top" button, represented as text or an icon at the bottom of the web page, to jump to the top of the page.

Mobile devices have a smaller screen size and more limited input methods than desktop computers, which makes it difficult to use them with a split screen.

In addition, a mobile device may execute various applications, such as basic applications installed by the device manufacturer and additional applications downloaded from an application sales site on the Internet. Additional applications may be developed by ordinary users and registered at the sales site, so anyone can freely sell applications to mobile device users through it. As a result, tens of thousands to hundreds of thousands of applications are currently available for mobile devices, for free or for a fee.

As such, various applications that stimulate consumers' curiosity and satisfy their needs are provided on mobile devices; however, since a mobile device is manufactured to be portable, its display size and user interface (UI) are limited. Accordingly, users experience inconvenience when executing a plurality of applications on a mobile device.

Accordingly, a technique for displaying a plurality of windows on one display is required, and, beyond easily launching a plurality of windows, a technique that makes it easy to rearrange the windows after they are launched is also required.

The present invention has been made in response to the technical demand described above, and provides a display apparatus and a control method capable of executing a plurality of windows on one display and adjusting the window arrangement afterwards.

To achieve the above, a control method of a display apparatus including a touch screen according to an embodiment of the present invention comprises: displaying on the touch screen a plurality of windows, each executing an application, such that the windows do not overlap each other; displaying a center button disposed at the intersection of a plurality of boundary lines separating the plurality of windows; receiving a window size change command for changing the size of at least one of the plurality of windows; changing and displaying the size of at least one of the plurality of windows in response to the window size change command; and controlling some of the plurality of windows not to be displayed while enlarging and displaying the remaining windows.
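The last step of this method, ceasing to display some windows and enlarging the remaining ones, can be illustrated with a minimal sketch. The function and field names below are assumptions for illustration, not terms from the claims:

```python
def relayout(windows, screen_width):
    """Stop displaying hidden windows and enlarge the remaining ones
    so that they fill the freed horizontal space equally."""
    visible = [w for w in windows if w["visible"]]
    if not visible:
        return []
    share = screen_width // len(visible)
    x = 0
    for w in visible:
        # Each remaining window is enlarged and repositioned, non-overlapping.
        w["x"], w["width"] = x, share
        x += share
    return visible

windows = [
    {"app": "A", "visible": True},
    {"app": "B", "visible": False},  # this window is no longer displayed
    {"app": "C", "visible": True},
]
shown = relayout(windows, 1280)
assert [w["app"] for w in shown] == ["A", "C"]
assert all(w["width"] == 640 for w in shown)
```

In this sketch the windows split the freed space equally; the patent leaves the exact enlargement policy open.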

Meanwhile, a display apparatus according to another aspect of the present invention includes: a touch screen which displays a plurality of windows, each executing an application, such that the windows do not overlap each other, and which displays a center button disposed at the intersection of a plurality of boundary lines separating the plurality of windows; and a controller which, when a window size change command for changing the size of at least one of the plurality of windows is input, controls the touch screen to change and display the size of at least one of the plurality of windows in response to the command, and controls some of the plurality of windows not to be displayed while enlarging and displaying the remaining windows.

Meanwhile, a method of controlling the display of a touch screen according to another aspect of the present invention comprises: dividing the touch screen into a plurality of windows; displaying a center button disposed at the intersection of a plurality of boundary lines separating the divided plurality of windows; and changing the size of at least one of the plurality of windows in response to movement of the center button.
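The aspect above, dividing the touch screen by boundary lines, placing a center button at their intersection, and resizing windows as the center button moves, can be sketched as follows. This is a simplified geometric model under assumed names, not the claimed implementation:

```python
class SplitScreen:
    """Minimal model of a screen split into four regions by a movable center point."""

    def __init__(self, width, height):
        self.width = width
        self.height = height
        # The center button sits at the intersection of the boundary lines.
        self.center = (width // 2, height // 2)

    def regions(self):
        """Return the four non-overlapping regions as (left, top, right, bottom)."""
        cx, cy = self.center
        return {
            "first": (0, 0, cx, cy),                      # upper left  (area 201)
            "second": (cx, 0, self.width, cy),            # upper right (area 202)
            "third": (0, cy, cx, self.height),            # lower left  (area 203)
            "fourth": (cx, cy, self.width, self.height),  # lower right (area 204)
        }

    def move_center(self, x, y):
        """Moving the center button resizes all four windows at once."""
        self.center = (max(0, min(x, self.width)), max(0, min(y, self.height)))

screen = SplitScreen(1280, 800)
screen.move_center(320, 400)  # drag the center button toward the upper left
assert screen.regions()["first"] == (0, 0, 320, 400)
assert screen.regions()["fourth"] == (320, 400, 1280, 800)
```

Because all four regions are derived from the single center point, moving the button shrinks some windows and enlarges the others simultaneously, which matches the behavior described in the embodiments.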

According to various embodiments of the present disclosure, a display apparatus and a control method thereof may be provided which execute a plurality of windows on one display and allow the window arrangement to be adjusted after execution. Accordingly, the user can simultaneously use a plurality of applications displayed in the plurality of windows. In addition, the user can easily manipulate the arrangement of the plurality of applications, maximizing user convenience, and can easily switch between a full-screen mode and a split mode.

FIG. 1 is a schematic block diagram showing an apparatus according to an embodiment of the present invention.

FIGS. 2A to 2K are conceptual views illustrating a window execution method according to an embodiment of the present invention.

FIGS. 3A to 3I are conceptual diagrams of an activity stack corresponding to various embodiments of the present disclosure.

FIG. 4 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

FIGS. 5A to 5G are conceptual views illustrating a window size change according to another embodiment of the present invention.

FIG. 6 is a flowchart illustrating a window size changing method according to an embodiment of the present invention.

FIGS. 7A to 7E are conceptual diagrams of a display apparatus according to an exemplary embodiment.

FIG. 8 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

FIGS. 9A to 9D are conceptual views of a display apparatus according to another exemplary embodiment.

FIGS. 10A to 10C are conceptual views illustrating a display apparatus displaying a center button according to various embodiments of the present disclosure.

FIG. 11 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

FIGS. 12A to 12D are conceptual views illustrating a method of executing a full screen mode according to another embodiment of the present invention.

FIG. 13 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

FIGS. 14A and 14B are conceptual views of a display apparatus for describing an embodiment of regenerating and displaying a center button.

FIG. 15 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

Hereinafter, exemplary embodiments according to the present invention will be described in detail with reference to the accompanying drawings. However, the present invention is not restricted or limited by these exemplary embodiments. Like reference numerals in the drawings denote members that perform substantially the same function.

FIG. 1 is a schematic block diagram showing an apparatus according to an embodiment of the present invention.

Referring to FIG. 1, the display apparatus 100 may be connected to an external device (not shown) using a mobile communication module 120, a sub-communication module 130, and a connector 165. The external device includes another device (not shown), a mobile phone (not shown), a smartphone (not shown), a tablet PC (not shown), and a server (not shown).

Referring to FIG. 1, the display apparatus 100 includes a touch screen 190 and a touch screen controller 195. In addition, the display apparatus 100 may include a controller 110, the mobile communication module 120, the sub-communication module 130, a multimedia module 140, a camera module 150, a GPS module 155, an input/output module 160, a sensor module 170, a storage unit 175, and a power supply unit 180. The sub-communication module 130 includes at least one of a wireless LAN module 131 and a short-range communication module 132, and the multimedia module 140 includes at least one of a broadcast communication module 141, an audio play module 142, and a video play module 143. The camera module 150 includes at least one of a first camera 151 and a second camera 152, and the input/output module 160 includes at least one of a button 161, a microphone 162, a speaker 163, a vibration motor 164, the connector 165, and a keypad 166.

The controller 110 may include a CPU 111, a ROM 112 which stores a control program for controlling the display apparatus 100, and a RAM 113 which stores signals or data input from outside the display apparatus 100 or is used as a storage area for tasks performed by the display apparatus 100. The CPU 111 may include a single core, dual cores, triple cores, or quad cores. The CPU 111, the ROM 112, and the RAM 113 may be connected to each other through an internal bus.

The controller 110 may control the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supply unit 180, a first touch screen 190a, a second touch screen 190b, and the touch screen controller 195.

The mobile communication module 120 connects the display apparatus 100 to an external device through mobile communication using one or a plurality of antennas (not shown) under the control of the controller 110. The mobile communication module 120 transmits/receives wireless signals for voice calls, video calls, text messages (SMS), or multimedia messages (MMS) with a mobile phone (not shown), a smartphone (not shown), a tablet PC, or another device (not shown) whose phone number is input to the display apparatus 100.

The sub-communication module 130 may include at least one of the WLAN module 131 and the short-range communication module 132. For example, it may include only the WLAN module 131, only the short-range communication module 132, or both the WLAN module 131 and the short-range communication module 132.

The WLAN module 131 may be connected to the Internet, under the control of the controller 110, at a place where a wireless access point (AP) (not shown) is installed. The WLAN module 131 supports the WLAN standard (IEEE 802.11x) of the Institute of Electrical and Electronics Engineers (IEEE). The short-range communication module 132 may wirelessly perform short-range communication between the display apparatus 100 and an image forming apparatus (not shown) under the control of the controller 110. Short-range communication methods may include Bluetooth, infrared data association (IrDA), and ZigBee.

The display apparatus 100 may include at least one of the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132 according to performance. For example, the display apparatus 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132 according to performance.

The multimedia module 140 may include a broadcast communication module 141, an audio play module 142, or a video play module 143. The broadcast communication module 141 may receive, under the control of the controller 110, a broadcast signal (e.g., a TV broadcast signal, a radio broadcast signal, or a data broadcast signal) and broadcast supplementary information (e.g., an electronic program guide (EPG) or an electronic service guide (ESG)) transmitted from a broadcast station through a broadcast communication antenna (not shown). The audio play module 142 may play a stored or received digital audio file (e.g., a file with the extension mp3, wma, ogg, or wav) under the control of the controller 110. The video play module 143 may play a stored or received digital video file (e.g., a file with the extension mpeg, mpg, mp4, avi, mov, or mkv) under the control of the controller 110. The video play module 143 may also play a digital audio file.

The multimedia module 140 may include the audio play module 142 and the video play module 143 without the broadcast communication module 141. In addition, the audio play module 142 or the video play module 143 of the multimedia module 140 may be included in the controller 110.

The camera module 150 may include at least one of the first camera 151 and the second camera 152 for capturing a still image or a video under the control of the controller 110. The first camera 151 or the second camera 152 may include an auxiliary light source (e.g., a flash (not shown)) that provides the amount of light required for photographing. The first camera 151 may be disposed on the front surface of the display apparatus 100, and the second camera 152 may be disposed on the rear surface. Alternatively, the first camera 151 and the second camera 152 may be disposed adjacent to each other (e.g., with a distance between them greater than 1 cm and less than 8 cm) to capture a three-dimensional still image or a three-dimensional video.

The GPS module 155 receives radio waves from a plurality of GPS satellites (not shown) in Earth orbit, and may calculate the position of the display apparatus 100 using the time of arrival of the radio waves from the GPS satellites to the display apparatus 100.

The input / output module 160 may include at least one of a plurality of buttons 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, and a keypad 166.

The button 161 may be formed on the front, side, or rear of the housing of the display apparatus 100, and may include at least one of a power/lock button (not shown), a volume button (not shown), a menu button, a home button, a back button, and a search button.

The microphone 162 generates an electric signal by receiving a voice or sound under the control of the controller 110.

The speaker 163 may output, under the control of the controller 110, sound corresponding to various signals (e.g., a wireless signal, a broadcast signal, a digital audio file, a digital video file, or photo capture) of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, or the camera module 150 to the outside of the display apparatus 100. The speaker 163 may also output sound corresponding to a function performed by the display apparatus 100 (e.g., a button operation sound or a call connection sound corresponding to a phone call). One or more speakers 163 may be formed at an appropriate position or positions of the housing of the display apparatus 100.

The vibration motor 164 may convert an electrical signal into mechanical vibration under the control of the controller 110. For example, when the display apparatus 100 in vibration mode receives a voice call from another device (not shown), the vibration motor 164 operates. One or more vibration motors may be formed in the housing of the display apparatus 100. The vibration motor 164 may operate in response to a user's touch on the touch screen 190 and to a continuous movement of a touch on the touch screen 190.

The connector 165 may be used as an interface for connecting the display apparatus 100 to an external device (not shown) or a power source (not shown). Under the control of the controller 110, data stored in the storage unit 175 of the display apparatus 100 may be transmitted to, or data may be received from, an external device (not shown) through a wired cable connected to the connector 165. Through the wired cable connected to the connector 165, power may be input from a power source (not shown) or a battery (not shown) may be charged.

The keypad 166 may receive a key input from the user for the control of the display apparatus 100. The keypad 166 includes a physical keypad (not shown) formed on the display apparatus 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad (not shown) formed in the display apparatus 100 may be excluded according to the performance or structure of the display apparatus 100.

The sensor module 170 includes at least one sensor that detects a state of the display apparatus 100. For example, the sensor module 170 may include a proximity sensor that detects whether the user approaches the display apparatus 100, an illuminance sensor that detects the amount of light around the display apparatus 100, or a motion sensor (not shown) that detects motion of the display apparatus 100 (e.g., rotation of the display apparatus 100, or acceleration or vibration applied to the display apparatus 100). At least one sensor may detect the state, generate a signal corresponding to the detection, and transmit the signal to the controller 110. Sensors of the sensor module 170 may be added or removed according to the performance of the display apparatus 100.

The storage unit 175 may store, under the control of the controller 110, signals or data input/output in correspondence with operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, and the touch screen 190. The storage unit 175 may store a control program and applications for controlling the display apparatus 100 or the controller 110.

The term "storage" includes the storage unit 175, the ROM 112 and the RAM 113 in the controller 110, and a memory card (not shown) (e.g., an SD card or a memory stick) mounted in the display apparatus 100. The storage may include a nonvolatile memory, a volatile memory, a hard disk drive (HDD), or a solid state drive (SSD).

The power supply unit 180 may supply power to one or a plurality of batteries (not shown) disposed in the housing of the display apparatus 100 under the control of the controller 110. One or more batteries (not shown) supply power to the display apparatus 100. In addition, the power supply unit 180 may supply power input from an external power source (not shown) to the display apparatus 100 through a wired cable connected to the connector 165.

The touch screen 190 may provide a user interface corresponding to various services (eg, a call, data transmission, broadcasting, and photography). The touch screen 190 may transmit an analog signal corresponding to at least one touch input to the user interface to the touch screen controller 195. The touch screen 190 may receive at least one touch through a user's body (eg, a finger including a thumb) or a touchable input means (eg, a stylus pen). In addition, the touch screen 190 may receive a continuous movement of one touch among at least one touch. The touch screen 190 may transmit an analog signal corresponding to continuous movement of an input touch to the touch screen controller 195.

In the present invention, the touch is not limited to contact between the touch screen 190 and the user's body or a touchable input means, and may include non-contact input (e.g., when the detectable distance between the touch screen 190 and the user's body or touchable input means is 1 mm or less). The detectable distance on the touch screen 190 may vary according to the performance or structure of the display apparatus 100.

The touch screen 190 may be implemented by, for example, a resistive method, a capacitive method, an infrared method, or an ultrasonic wave method.

The touch screen controller 195 converts an analog signal received from the touch screen 190 into a digital signal (e.g., X and Y coordinates) and transmits it to the controller 110. The controller 110 may control the touch screen 190 using the digital signal received from the touch screen controller 195. For example, the controller 110 may cause a shortcut icon (not shown) displayed on the touch screen 190 to be selected or executed in response to a touch. The touch screen controller 195 may also be included in the controller 110.

FIGS. 2A to 2K are conceptual views illustrating a window execution method according to an embodiment of the present invention. Those skilled in the art will readily understand that the display apparatus 200 may be implemented in various forms, such as a general TV, an Internet TV, or a medical data display apparatus, as well as the mobile device described above with reference to FIG. 1. That is, any device provided with means capable of displaying a rendered image may serve as the display device.

As illustrated in FIG. 2A, the display apparatus 200 may set a plurality of window display spaces 201 to 204 on the touch screen. In more detail, the controller (not shown) may set a first area 201, a second area 202, a third area 203, and a fourth area 204. Although FIG. 2A illustrates an example in which four windows are set on the touch screen, this is only an example; the present invention may set at least two windows on the touch screen. The controller (not shown) may set a first boundary line 211 between the first area 201 and the second area 202, a second boundary line 212 between the third area 203 and the fourth area 204, a third boundary line 213 between the first area 201 and the third area 203, and a fourth boundary line 214 between the second area 202 and the fourth area 204. Here, the first boundary line 211 and the second boundary line 212 may form one line segment, and the third boundary line 213 and the fourth boundary line 214 may form one line segment. The controller (not shown) sets the first area 201 to the fourth area 204 so that they do not overlap each other. For example, as shown in FIG. 2A, the controller (not shown) sets the first area 201 on the upper left, the second area 202 on the upper right, the third area 203 on the lower left, and the fourth area 204 on the lower right. The controller (not shown) sets the first and second boundary lines 211 and 212 to divide the screen left and right, and sets the third and fourth boundary lines 213 and 214 to divide the screen up and down.

The controller (not shown) may display the center button 220 at the point where the first and second boundary lines 211 and 212 and the third and fourth boundary lines 213 and 214 cross each other. In FIGS. 2A to 2K, the center button 220 is shown as a rectangle, but this is merely an exemplary embodiment; the center button may have a polygonal, circular, or elliptical shape as well. The center button may be a function key providing a function of changing the size of the application display spaces or of entering a window position change mode.

The controller (not shown) may control one window executing an application to be arranged and displayed in each of the areas 201 to 204. Alternatively, the controller (not shown) may execute at least one application in each or all of the areas 201 to 204. For example, the controller (not shown) controls a window to be displayed in each of the areas 201 to 204, as shown in FIGS. 2B to 2K.

Here, a window may be an area including an execution screen of a specific application, together with a title bar and a control area for the executed application. Objects related to the application may be displayed on the execution screen of the application. An object may take various forms, such as text, a figure, an icon, a button, a check box, a photo, a video, a web page, a map, and the like. When a user touches an object, a function or event predetermined for the object may be performed in the corresponding application. An object may be called a view, depending on the operating system. The title bar may include at least one control key for controlling the display of the window; for example, the control keys may be a window minimize button, a window maximize button, and a window close button.
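The window structure described above, an application's execution screen plus a title bar with minimize, maximize, and close control keys, could be modeled roughly as below; the class and method names are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass

@dataclass
class Window:
    """A window: one application's execution screen plus title-bar controls."""
    app_name: str
    minimized: bool = False
    maximized: bool = False
    closed: bool = False

    # The three title-bar control keys named in the text.
    def minimize(self):
        self.minimized, self.maximized = True, False

    def maximize(self):
        self.minimized, self.maximized = False, True

    def close(self):
        self.closed = True

w = Window("video player")
w.maximize()
assert w.maximized and not w.minimized
w.close()
assert w.closed
```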

Meanwhile, the applications are programs implemented independently of each other by the display device manufacturer or by application developers. Accordingly, executing one application does not require another application to be executed in advance, and even if one application terminates, the others may continue to run.

Since the applications are programs implemented independently of each other, they are distinguished from a composite function application (or dual application), in which some functions provided by other applications (e.g., a memo function or a message transmission/reception function) are added to one application (e.g., a video application). Such a composite function application is a single application newly produced to have various functions and differs from the existing individual applications: it provides only limited functions rather than the diverse functions of the existing applications, and the user bears the burden of separately purchasing such a new composite function application.

Referring to FIG. 2B, the controller (not shown) controls a window 230 executing a launcher application to be displayed on the first area 201. The launcher application displays icons 231 to 238 of executable applications, as shown in FIG. 2B. When an application execution command touching one of the icons 231 to 238 is input, the launcher application displays the application corresponding to the touched icon in one of the first to fourth areas 201 to 204, which are the window display spaces.

FIG. 3A is a conceptual diagram of an activity stack managed by the display device. The controller (not shown) generates and manages a launcher application stack 301 on the activity stack in response to the execution of the launcher application.

In FIG. 2C, the user 1 may touch the icon corresponding to the B application. When the icon corresponding to the B application is touched, the controller (not shown) may control a second window 240 executing the B application to be displayed on the second area, as shown in FIG. 2D. The controller (not shown) may determine the window display spaces in which new windows are displayed in a predetermined order; for example, it may control new windows to be displayed in clockwise order through the second, fourth, and third areas. This order is merely an example: the controller (not shown) may also control new windows to be displayed counterclockwise, and the order in which new windows are displayed in the window display spaces can be changed.
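In the figures, new windows are placed in the second, then the fourth, then the third area (FIGS. 2D, 2F, and 2H), clockwise from the upper right. A minimal sketch of such a placement rule, under assumed names, might look like:

```python
# Clockwise order over the four window display spaces of FIG. 2A:
# first (upper left) -> second (upper right) -> fourth (lower right) -> third (lower left)
CLOCKWISE = ["first", "second", "fourth", "third"]

def next_region(occupied):
    """Return the next region, clockwise, that is not yet occupied by a window."""
    for region in CLOCKWISE:
        if region not in occupied:
            return region
    return None  # all regions occupied

occupied = {"first": "launcher"}
region = next_region(occupied)   # the B application goes to the second area
assert region == "second"
occupied[region] = "B"
assert next_region(occupied) == "fourth"
```

A counterclockwise or user-defined order, as the text allows, would simply substitute a different region list.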

FIG. 3B is a conceptual diagram of an activity stack corresponding to FIG. 2D. The controller (not shown) generates the B application stack 302 on the activity stack in response to the execution of the B application. Meanwhile, the controller (not shown) arranges the stack 302 of the most recently executed B application on the stack 301 of the launcher application. This may mean that the activity stack rank of the B application is higher than the activity stack rank of the launcher application.

In FIG. 2E, the user 1 may touch an icon 233 corresponding to the C application. FIG. 3C is a conceptual diagram of an activity stack corresponding to FIG. 2E. As shown in FIG. 2E, since the user inputs an application execution command to the launcher application, it can be confirmed that the activity stack rank of the launcher application is higher than the activity stack rank of the B application.

When the icon 233 corresponding to the C application is touched, the controller (not shown) controls to display the third window 250 for executing the C application on the fourth region as shown in FIG. 2F.

FIG. 3D is a conceptual diagram of an activity stack corresponding to FIG. 2F. The controller (not shown) generates the C application stack 303 on the activity stack in response to the execution of the C application. Meanwhile, the controller (not shown) arranges the stack 303 of the most recently executed C application on the stack 301 of the launcher application. This may mean that the activity stack rank of the C application is higher than the activity stack rank of the launcher application.

In FIG. 2G, the user 1 may touch an icon 234 corresponding to the D application. FIG. 3E is a conceptual diagram of an activity stack corresponding to FIG. 2G. As shown in FIG. 2G, since the user inputs an application execution command to the launcher application, it can be confirmed that the activity stack rank of the launcher application is higher than the activity stack rank of the C application.

When the icon 234 corresponding to the D application is touched, the controller (not shown) controls to display the fourth window 260 executing the D application on the third area as shown in FIG. 2H.

FIG. 3F is a conceptual diagram of an activity stack corresponding to FIG. 2H. The controller (not shown) generates the D application stack 304 on the activity stack in response to the execution of the D application. Meanwhile, the controller (not shown) arranges the stack 304 of the most recently executed D application on the stack 301 of the launcher application. This may mean that the activity stack rank of the D application is higher than the activity stack rank of the launcher application.

Meanwhile, the user 1 may operate the B application as shown in FIG. 2I. FIG. 3G is a conceptual diagram of an activity stack corresponding to FIG. 2I. The controller (not shown) places the stack 302 of the most recently operated B application at the highest level in response to the user input to the B application.

In FIG. 2J, the user 1 may touch an icon 235 corresponding to the E application. FIG. 3H is a conceptual diagram of an activity stack corresponding to FIG. 2J. As shown in FIG. 2J, since the user inputs an application execution command to the launcher application, it can be confirmed that the activity stack rank of the launcher application is higher than the activity stack rank of the D application.

When the icon 235 corresponding to the E application is touched, the controller (not shown) controls the display to show the fifth window 270 for executing the E application on the fourth area, as shown in FIG. 2K. The controller (not shown) may refer to the activity stack of FIG. 3H when there is no empty window display space. The controller (not shown) may identify the application having the lowest activity stack rank among the activity stacks. For example, in FIG. 3H, the controller (not shown) may confirm that the activity stack rank of the C application is the lowest. The controller (not shown) therefore controls the display to show the fifth window 270 for executing the E application on the fourth area, where the third window 250 executing the C application, the application with the lowest activity stack rank, was displayed.
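The reuse rule described above (when no window display space is free, place the new window where the lowest-ranked application is displayed) can be sketched as follows; all identifiers are illustrative assumptions, and the stack ordering mirrors the FIG. 3H state in which the C application is lowest.

```python
def region_to_reuse(activity_stack, region_of):
    """activity_stack: applications ordered top (most recent) to bottom.
    region_of: maps an application to the region it currently occupies.
    Returns the region of the lowest-ranked application that owns a region."""
    for app in reversed(activity_stack):   # walk upward from the stack bottom
        if app in region_of:
            return region_of[app]
    return None

# FIG. 3H-like ordering: launcher on top, C at the bottom of the stack.
stack = ["Launcher", "B", "D", "C"]
regions = {"Launcher": "first", "B": "second", "C": "fourth", "D": "third"}
print(region_to_reuse(stack, regions))     # fourth: C's region is reused for E
```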

FIG. 3I is a conceptual diagram of an activity stack corresponding to FIG. 2K. The controller (not shown) generates the E application stack 305 on the activity stack in response to the execution of the E application. Meanwhile, the controller (not shown) arranges the stack 305 of the most recently executed E application on the stack 301 of the launcher application. This may mean that the activity stack rank of the E application is higher than the activity stack rank of the launcher application.

FIG. 4 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

For example, the display apparatus may set a plurality of window display spaces as illustrated in FIG. 2A (S401). The display apparatus executes a launcher application capable of executing a new application in one of the plurality of window display spaces (S403). More specifically, the display device displays a window for executing the launcher application in one of the plurality of window display spaces.

The display device may receive a new application execution command on a window for executing a launcher application (S405). For example, the display device may receive a new application execution command for touching an icon corresponding to the application as shown in FIG. 2C.

The display device may display a window for executing the new application in one of the window display spaces other than the window display space in which the launcher application is executed (S407).
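The S401 to S407 flow above can be sketched as follows; this is a minimal illustration under assumed names (the space list, the launcher's space, and the fill order are not specified by the flowchart itself).

```python
# Hypothetical sketch of S401-S407: spaces are set (S401), the launcher
# occupies one of them (S403), and a requested application (S405) opens
# in a different window display space (S407).
SPACES = ["first", "second", "third", "fourth"]

def open_app(windows, app, launcher_space="first"):
    """Place `app` in a free space other than the launcher's space."""
    for space in SPACES:
        if space != launcher_space and space not in windows:
            windows[space] = app
            return space
    return None                       # no empty space available

windows = {"first": "Launcher"}       # S403: launcher in the first space
print(open_app(windows, "B"))         # second
print(open_app(windows, "C"))         # third (fill order is illustrative)
```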

FIGS. 5A to 5G are conceptual views illustrating a window size change according to another embodiment of the present invention.

FIG. 5A is a conceptual diagram of a display apparatus according to an embodiment of the present disclosure. The touch screen is divided into four regions, and a different window is displayed in each region. The controller (not shown) of the display apparatus 500 may set a layout that divides the screen into four equal parts. In more detail, the controller (not shown) may set the first region 501, the second region 502, the third region 503, and the fourth region 504. The controller (not shown) may also set a first boundary line 511 between the first region 501 and the second region 502, a second boundary line 512 between the third region 503 and the fourth region 504, a third boundary line 513 between the first region 501 and the third region 503, and a fourth boundary line 514 between the second region 502 and the fourth region 504.

The controller (not shown) controls the display to show one window executing an application in each of the regions 501 to 504. For example, the controller (not shown) displays the first window for executing the A application on the first area 501, the second window for executing the B application on the second area 502, the third window for executing the C application on the third area 503, and the fourth window for executing the D application on the fourth area 504.

In addition, the controller (not shown) may execute and display at least one application in each or all of the regions 501 to 504.

The controller (not shown) displays the center button 510 at the portion where the first and second boundary lines 511 and 512 and the third and fourth boundary lines 513 and 514 cross each other. In FIGS. 5A to 5G, the center button 510 is shown as rectangular, but this is only an exemplary embodiment; the center button may also have a polygonal, circular, or elliptical shape.

As in FIG. 5B, the user 1 may input, as a window size change command, a first gesture from the center button 510 to a first end point 522 (for example, a left drag gesture 521, that is, a gesture 521 in a direction away from the point where the second area and the fourth area are located). The controller (not shown) controls the display to change the position of the center button 510 to the first end point 522, as shown in FIG. 5C. In addition, the controller (not shown) may control the boundary lines 511 to 514 to be reset and displayed based on the center button 510. For example, the controller (not shown) resets the first boundary line 511 upward from the center button 510, the second boundary line 512 downward from the center button 510, the third boundary line 513 to the left of the center button 510, and the fourth boundary line 514 to the right of the center button 510. The controller (not shown) then controls the display to show the changed sizes of the first to fourth regions 526 to 529 based on the reset boundary lines 511 to 514. That is, in the embodiment of FIG. 5B, in contrast to FIG. 2C, the sizes of all the window display areas may be changed together.
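The boundary reset described above can be sketched by treating each region as the rectangle between the moved center point and its corner of the screen, so that all four sizes change together. The coordinate convention and names below are assumptions for illustration.

```python
def regions_from_center(w, h, cx, cy):
    """Return each quadrant's (width, height) for a center button at (cx, cy)
    on a w x h touch screen, FIG. 5C-style: boundary lines radiate from the
    button upward, downward, left, and right."""
    return {
        "first":  (cx, cy),          # upper-left region
        "second": (w - cx, cy),      # upper-right region
        "third":  (cx, h - cy),      # lower-left region
        "fourth": (w - cx, h - cy),  # lower-right region
    }

# Dragging the button left (cx shrinks) narrows the left regions and widens
# the right ones, all four sizes changing together as in FIG. 5B.
print(regions_from_center(800, 600, 200, 300))
```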

FIG. 5D is a conceptual diagram illustrating a method of changing a window size according to another embodiment of the present invention. As shown in FIG. 5D, the user may input, as the window size change command, a second gesture from the center button 510 to a second end point 536 (for example, an upward drag gesture 535, that is, a gesture 535 in a direction away from the point where the third area 503 and the fourth area 504 are located). The controller (not shown) controls the display to change the position of the center button 510 to the second end point 536, as shown in FIG. 5E. In addition, the controller (not shown) may control the boundary lines 511 to 514 to be reset and displayed based on the center button 510, and controls the display to change the sizes of the first to fourth regions 531 to 534 based on the reset boundary lines 511 to 514.

FIG. 5F is a conceptual diagram illustrating a method of changing a window size according to another embodiment of the present invention. As shown in FIG. 5F, the user may input, as the window size change command, a third gesture from the center button 510 to a third end point 565 (that is, an upper-left drag gesture 564, a gesture 564 in a direction away from the fourth area 504). The controller (not shown) controls the display to change the position of the center button 510 to the third end point 565, as shown in FIG. 5G. In addition, the controller (not shown) may control the boundary lines 511 to 514 to be reset and displayed based on the center button 510, and controls the display to change the sizes of the first to fourth regions 541 to 544 based on the reset boundary lines 511 to 514. When the center button 510 is moved in a direction away from the point where the fourth area 504 is located (for example, by the third gesture), the size of the fourth area is enlarged and the size of the first area 501 is reduced. The rate at which the first area is reduced may be the same as the rate at which the fourth area is enlarged.

FIG. 6 is a flowchart illustrating a window size changing method according to an embodiment of the present invention.

For example, the display apparatus may display a plurality of windows by setting a plurality of window display spaces, such as the first to fourth regions in FIG. 5A (S601). For example, the display apparatus may set a plurality of window display spaces by setting at least one boundary line. Here, the center button may be formed at the intersection of the at least one boundary line.

When the plurality of window display spaces are set, the display device may receive a window size change command for dragging the center button (S603). The display apparatus moves the center button to the drag end point and resets the boundary lines based on the moved center button (S605). The display device then resets and displays the window sizes based on the reset at least one boundary line (S607).

FIGS. 7A to 7E are conceptual diagrams of a display apparatus according to an exemplary embodiment.

As shown in FIG. 7A, the display apparatus 700 may arrange and display the first window 701 to the fourth window 704 in the first area to the fourth area, respectively. The display apparatus 700 may display boundary lines separating the windows, and the center button 710 may be displayed at the intersection of those boundary lines. In FIGS. 7A to 7D, the center button 710 is shown as rectangular, but this is only an example; the center button may also have a polygonal, circular, or elliptical shape.

Meanwhile, the user 1 may input a drag gesture 720 starting from the center button 710 and moving to the left. As illustrated in FIG. 7B, the controller (not shown) may control the display to reduce the horizontal lengths of the first window 701 and the third window 703 while simultaneously enlarging the horizontal lengths of the second window 702 and the fourth window 704. The controller (not shown) may also control the center button 710 to move to, and be displayed at, the point touched by the drag gesture 720.

On the other hand, as shown in FIG. 7C, when the drag gesture 720 reaches the boundary of the touch screen and the center button 710 thus reaches the boundary of the touch screen, the controller (not shown) may control the display not to show the first window 701 and the third window 703. The controller (not shown) may control the display to further enlarge the horizontal length of the second window 712 so that it occupies the upper half, and to further enlarge the horizontal length of the fourth window 714 so that it occupies the lower half. In addition, the controller (not shown) may display only the fourth boundary line 714 and control the remaining boundary lines not to be displayed.

In FIG. 7D, the user may end the input of the drag gesture. The controller (not shown) may determine that the drag gesture input has ended based on the release of the touch of the drag gesture. The controller (not shown) may then control the center button 710 not to be displayed, as shown in FIG. 7E. For example, the controller (not shown) may control the center button 710 not to be displayed after a preset time has elapsed since the drag gesture input ended.

Meanwhile, the controller (not shown) divides the touch screen into upper and lower parts for display when the position of the center button 710 is on the left or right boundary line of the touch screen, and divides the touch screen into left and right parts for display when the position of the center button 710 is on the upper or lower boundary line of the touch screen. In more detail, when the center button 710 is moved to the left boundary of the touch screen by a left drag gesture, or to the right boundary of the touch screen by a right drag gesture, the controller divides the touch screen into upper and lower parts for display. When the center button 710 is moved to the upper boundary of the touch screen by an upward drag gesture, or to the lower boundary of the touch screen by a downward drag gesture, the controller divides the touch screen into left and right parts for display.
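The edge rule above can be sketched as a small decision function; the coordinate convention and return labels are illustrative assumptions.

```python
def split_mode(w, h, cx, cy):
    """Decide the screen split for a center button at (cx, cy) on a
    w x h touch screen, per the edge rule described above."""
    if cx == 0 or cx == w:
        return "top-bottom"    # button on the left or right boundary line
    if cy == 0 or cy == h:
        return "left-right"    # button on the upper or lower boundary line
    return "quadrants"         # button inside the screen: four regions

print(split_mode(800, 600, 0, 300))    # top-bottom
print(split_mode(800, 600, 400, 600))  # left-right
print(split_mode(800, 600, 400, 300))  # quadrants
```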

FIG. 8 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

The display device may display a plurality of windows on the touch screen (S801). In addition, the display device may display a center button for adjusting a window size at a point where boundary lines for distinguishing a plurality of windows cross each other (S803).

The display device may receive a window size change command for dragging the center button (S805). For example, the window size change command may be a drag gesture starting from the center button. The display device may control to change and display the sizes of the windows based on the input window size change command.

In operation S807, the display device may determine whether the center button has been dragged to the touch screen boundary line. When a drag gesture to the touch screen boundary is input (S807-Yes), the display device may control some windows not to be displayed (S811). More specifically, the display device may enlarge and display the other windows while reducing some windows until they are no longer displayed.

On the other hand, if a drag gesture to the touch screen boundary line is not input (S807-No), the display device may move the center button to the end point of the drag gesture and display it there (S809). Meanwhile, the display device may display the center button while moving it in correspondence with the touch point of the drag gesture.

The display device may reset and display the boundary line based on the moved center button (S813), and change and display the window size correspondingly (S815).

FIGS. 9A to 9D are conceptual views of a display apparatus according to another exemplary embodiment.

As shown in FIG. 9A, the display apparatus 900 may arrange and display the first window 901 to the fourth window 904 in the first to fourth areas, respectively. The display apparatus 900 may display boundary lines separating the windows, and may display the center button 910 at their intersection. In FIGS. 9A to 9C, the center button 910 is shown as rectangular, but this is only an exemplary embodiment; the center button may also have a polygonal, circular, or elliptical shape.

Meanwhile, the user 1 may input a drag gesture 920 starting from the center button 910 and moving to the upper left. As illustrated in FIG. 9B, the controller (not shown) may control the display to enlarge both the horizontal length and the vertical length of the fourth window 904. When the center button 910 is moved to the corner of the touch screen, the controller (not shown) may display the fourth window 904 on the full screen; the fourth window 904 is the window farthest from that corner among the plurality of windows displayed on the touch screen. The remaining areas (the first to third areas) are then not displayed. In addition, the controller (not shown) may control the center button 910 to move to, and be displayed at, the point touched by the drag gesture 920.

On the other hand, as shown in FIG. 9C, when the drag gesture 920 reaches the boundary of the touch screen and the center button 910 thus reaches the edge of the touch screen, the controller (not shown) may control the display not to show the first window 901, the second window 902, and the third window 903. The controller (not shown) may control the display to enlarge the horizontal and vertical lengths of the fourth window 904 so that it fills the entire touch screen, and may control the boundary lines not to be displayed.
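The corner rule above (the window farthest from the corner reached by the center button fills the screen) can be sketched as follows; the helper name and the window-center coordinates are illustrative assumptions.

```python
def farthest_window(corner, centers):
    """corner: (x, y) corner reached by the center button.
    centers: maps each window name to its center point (x, y).
    Returns the window farthest from that corner."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return max(centers, key=lambda name: dist2(corner, centers[name]))

centers = {"first": (200, 150), "second": (600, 150),
           "third": (200, 450), "fourth": (600, 450)}
print(farthest_window((0, 0), centers))   # fourth: farthest from upper-left
```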

In FIG. 9D, the user may end the input of the drag gesture. The controller (not shown) may determine that the drag gesture input has ended based on the release of the touch of the drag gesture, and may then control the center button 910 not to be displayed, as shown in FIG. 9D. For example, the controller (not shown) may control the center button 910 not to be displayed after a preset time has elapsed since the drag gesture input ended.

FIGS. 10A to 10C are conceptual views illustrating a display apparatus displaying a center button according to various embodiments of the present disclosure.

As shown in FIG. 10A, the display apparatus 1000 may display the center button 1010 at the center of the touch screen. In FIGS. 10A to 10C, the center button 1010 is shown as rectangular, but this is only an example; the center button may also have a polygonal, circular, or elliptical shape. In addition, the controller (not shown) controls the display to show the first boundary line 1011, the second boundary line 1012, the third boundary line 1013, and the fourth boundary line 1014 connected to the center button 1010. In particular, the controller (not shown) may determine the location of the center button 1010 on the touch screen. For example, in FIG. 10A, the controller (not shown) may recognize that the center button 1010 is positioned at the center of the touch screen, that is, at the 50:50 horizontal division point. The controller (not shown) may also determine the positions of the first boundary line 1011 and the second boundary line 1012 based on the position of the center button on the touch screen. For example, based on the center button being located at the 50:50 horizontal division point of the touch screen, the controller (not shown) may control the first boundary line 1011 and the second boundary line 1012 to be displayed connected to the 50:50 horizontal division point of the center button 1010.

FIG. 10B is a conceptual diagram of a display apparatus according to another embodiment of the present invention. As shown in FIG. 10B, the display apparatus 1000 may display the center button 1010 at the 30:70 horizontal division point of the touch screen. The controller (not shown) may determine the location of the center button 1010 on the touch screen. In addition, based on the center button 1010 being located at the 30:70 horizontal division point of the touch screen, the controller (not shown) may control the first boundary line 1011 and the second boundary line 1012 to be displayed connected to the 30:70 horizontal division point of the center button 1010.
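The split-point bookkeeping of FIGS. 10A and 10B can be sketched as a ratio computation; the function name and the percentage convention are assumptions for illustration.

```python
def horizontal_ratio(w, cx):
    """Return the left:right division ratio (in percent) for a center
    button at horizontal position cx on a touch screen of width w."""
    left = round(100 * cx / w)
    return (left, 100 - left)

print(horizontal_ratio(1000, 500))   # (50, 50): FIG. 10A, button at center
print(horizontal_ratio(1000, 300))   # (30, 70): FIG. 10B
```

The boundary lines connected to the button would then be drawn through this same division point, so that the lines and the button stay attached as the button moves.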

Meanwhile, as shown in FIG. 10C, when the center button 1010 is disposed at the boundary line of the touch screen, the controller (not shown) may control not to display the boundary line.

FIG. 11 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

The display device may display a plurality of windows (S1101). The display device may display a plurality of boundary lines that divide the plurality of windows. In addition, the display device may display a center button for controlling window display at a point where a plurality of boundary lines intersect (S1103).

The display device may receive a window size change command for dragging the center button (S1105). The display device can determine the position of the center button on the touch screen, and may reset and display the boundary lines in correspondence with that position (S1107). For example, the display device may recognize that the center button is located at a division point of a specific ratio of the touch screen, and may display the boundary lines so that they connect to that division point of the center button. In addition, the display device may change and display the window sizes in correspondence with the final position of the center button (S1109).

FIGS. 12A to 12D are conceptual views illustrating a method of executing a full screen mode according to another embodiment of the present invention.

As shown in FIGS. 12A and 12C, the display apparatus 1200 may display a center button of any shape, such as the center button 1212 of FIG. 12A or the center button 1210 of FIG. 12C, in contact with the boundary line of the touch screen. In FIGS. 12A to 12D, the center button 1210 is shown as rectangular or circular, but this is only an exemplary embodiment; the center button may have various shapes, such as a polygonal or elliptical shape, as well as a rectangular or circular shape. The present invention may also include buttons of various other types that the user is free to move on the touch screen. The second boundary line 1212 and the fourth boundary line 1214 may be displayed connected to the center button 1210. Accordingly, a problem may occur in that the fourth window 1204 is not displayed on the full screen.

Accordingly, the controller (not shown) may receive an additional drag gesture even after the center button 1210 contacts the boundary of the touch screen. As illustrated in FIGS. 12B and 12D, the controller (not shown) may display only a part of the center button 1210 in response to the additional drag gesture. In addition, the controller (not shown) may control the boundary lines not to be displayed when the center of the center button 1210 is not displayed. Accordingly, the fourth window 1204 may be displayed on the full screen.

FIG. 13 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

The display device may display a plurality of windows (S1301). The display device may display a plurality of boundary lines that divide the plurality of windows. In addition, the display device may display a center button for controlling window display at a point where a plurality of boundary lines intersect (S1303).

The display apparatus may receive a window size change command for dragging the center button (S1305). The display device may display the center button at the point where the drag gesture is touched.

The display device may determine whether the center button has reached and touched the boundary of the touch screen (S1307). When the center button is positioned in contact with the touch screen boundary (S1307-Yes), only a part of the center button may be displayed and the boundary lines may not be displayed (S1309); in this case, the display device can display a specific window on the full screen. On the other hand, if the center button does not touch the boundary of the touch screen (S1307-No), the display device may change the window sizes in correspondence with the final position of the center button and display them (S1311).
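The S1307 branch above can be sketched as follows; the function name, the returned state flags, and the coordinate convention are illustrative assumptions, not part of the flowchart.

```python
def on_drag_end(w, h, cx, cy):
    """S1307 decision for a center button released at (cx, cy) on a
    w x h touch screen."""
    at_edge = cx <= 0 or cx >= w or cy <= 0 or cy >= h
    if at_edge:
        # S1309: show only part of the button, hide the boundary lines,
        # and let a specific window occupy the full screen.
        return {"button": "partial", "boundaries": False, "full_screen": True}
    # S1311: resize the windows around the button's final position.
    return {"button": "full", "boundaries": True, "full_screen": False}

print(on_drag_end(800, 600, 800, 300))   # button on the right boundary
print(on_drag_end(800, 600, 400, 300))   # button inside the screen
```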

FIGS. 14A and 14B are conceptual views of a display apparatus for describing an embodiment of regenerating and displaying the center button.

In FIG. 14A, the display device may display a specific window 1404 on the full screen. In the full screen mode, the display device may control the center button and the boundary lines not to be displayed. Meanwhile, the user 1 may want the center button to be displayed again, and may therefore input a center button generation command. In FIG. 14A, the center button generation command may be a double edge flick 1401 and 1402 input simultaneously from the upper boundary and the left boundary of the touch screen. The double edge flick input simultaneously from the upper boundary and the left boundary of the touch screen is merely exemplary, however, and there is no limitation on the boundary of the touch screen at which the double edge flick is input. In addition, those skilled in the art will be able to change the center button generation command in various ways besides the double edge flick, and the scope of the present invention is not limited by the form of the center button generation command.

In FIG. 14B, the controller (not shown) may control the center button 1410 to be displayed again based on the center button generation command. Although the center button 1410 is shown as rectangular in FIG. 14B, this is merely an exemplary embodiment; the center button may also have a polygonal, circular, or elliptical shape. The controller (not shown) may control the center button 1410 to be displayed again at its most recent position.

FIG. 15 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment.

The display device may display a plurality of windows (S1501). The display device may display a plurality of boundary lines that divide the plurality of windows. In addition, the display device may display a center button for controlling window display at a point where a plurality of boundary lines cross (S1503).

The display device may receive a window size change command for dragging the center button (S1505). The display device may display the center button at the point where the drag gesture is touched.

In operation S1507, the display apparatus may display a specific window on the full screen in response to the window size change command and may not display the center button. The display apparatus determines whether a center button generation command has been input (S1509). If the center button generation command is not input (S1509-No), the display device may maintain the full screen mode. When the center button generation command is input (S1509-Yes), the display device may redisplay the center button at its most recent position before it was hidden (S1511).
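The S1509 to S1511 behavior above can be sketched by remembering the button's position while it is hidden; the class and method names are illustrative assumptions.

```python
class CenterButton:
    """Minimal sketch: the button keeps its last position while hidden and
    reappears there when the generation command (e.g. a double edge flick)
    is received."""
    def __init__(self, pos):
        self.pos = pos
        self.visible = True

    def hide(self):
        # Full screen mode (S1507): button and boundary lines not displayed.
        self.visible = False

    def regenerate(self):
        # S1511: redisplay at the most recent position before hiding.
        self.visible = True
        return self.pos

btn = CenterButton((120, 80))
btn.hide()
print(btn.regenerate())   # (120, 80)
```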

It will be appreciated that embodiments of the invention may be realized in the form of hardware, software, or a combination of hardware and software. Any such software may be stored, whether or not it is erasable or rewritable, in volatile or nonvolatile storage such as a ROM, in memory such as a RAM, a memory chip, a device, or an integrated circuit, or in a storage medium that is optically or magnetically recordable and readable by a machine (e.g., a computer), such as a CD, DVD, magnetic disk, or magnetic tape. It will be appreciated that the graphic screen updating method of the present invention may be implemented by a computer or a portable terminal including a control unit and a memory, and that the memory is an example of a machine-readable storage medium suitable for storing a program or programs including instructions for implementing embodiments of the present invention. Accordingly, the present invention includes a program comprising code for implementing the apparatus or method described in any claim herein, and a machine (e.g., computer)-readable storage medium storing such a program. In addition, such a program may be transferred electronically through any medium, such as a communication signal transmitted via a wired or wireless connection, and the present invention appropriately includes equivalents thereof.

In addition, the device may receive and store the program from a program providing device connected by wire or wirelessly. The program providing apparatus may include a memory for storing a program including instructions for causing the graphic processing apparatus to perform a preset content protection method and for storing information necessary for the content protection method, a communication unit for performing wired or wireless communication with the graphic processing apparatus, and a control unit for transmitting the program to the graphic processing apparatus automatically or upon request.

Claims (34)

  1. In the control method of a display device including a touch screen,
    Displaying a plurality of windows, each of which executes an application, on the touch screen so as not to overlap each other;
    Displaying a center button disposed at intersections of a plurality of boundary lines separating the plurality of windows;
    Receiving a window size change command for changing a size of at least one of the plurality of windows;
    Changing and displaying the size of at least one of the plurality of windows in response to the window size change command;
    And controlling some windows of the plurality of windows not to be displayed, and enlarging and displaying the sizes of the remaining windows.
  2. The method of claim 1,
    The window size change command is a drag gesture starting from the center button and ending at a first point.
  3. The method of claim 2,
    wherein changing and displaying the size of at least one of the plurality of windows comprises:
    moving the center button, on which the drag gesture is input, to the first point and displaying the center button at the first point.
  4. The method of claim 3,
    wherein changing and displaying the size of at least one of the plurality of windows comprises:
    resetting and displaying the plurality of boundary lines in response to the movement of the center button; and
    changing and displaying the size of each of the plurality of windows based on the reset plurality of boundary lines.
  5. The method of claim 4,
    wherein the first point is on a boundary of the touch screen, and
    resetting and displaying the plurality of boundary lines comprises displaying only some of the plurality of boundary lines.
  6. The method of claim 2,
    wherein enlarging and displaying the sizes of the remaining windows comprises:
    when the first point is on a left or right boundary of the touch screen, dividing the touch screen into upper and lower regions to display the remaining windows, and when the first point is on an upper or lower boundary of the touch screen, dividing the touch screen into left and right regions to display the remaining windows.
  7. The method of claim 2,
    wherein enlarging and displaying the sizes of the remaining windows comprises:
    when the first point is a corner of the touch screen, displaying the remaining window on a full screen.
  8. The method of claim 4,
    wherein resetting and displaying the plurality of boundary lines comprises:
    resetting the positions of the boundary lines based on the position of the center button on the touch screen.
  9. The method of claim 1, further comprising:
    after the enlarged remaining windows are displayed,
    controlling the center button not to be displayed.
  10. The method of claim 9, further comprising:
    receiving a center button generation command for regenerating and displaying the center button; and
    in response to the center button generation command, displaying the center button at its most recent position.
  11. The method of claim 7, further comprising:
    when the center button is disposed at the corner, displaying only a part of the center button.
  12. The method of claim 1,
    wherein the center button has any one of a rectangular, circular, elliptical, or polygonal shape.
  13. A display device comprising:
    a touch screen configured to display a plurality of windows, each executing an application, so that the windows do not overlap each other, and to display a center button disposed at an intersection of a plurality of boundary lines separating the plurality of windows; and
    a control unit configured, when a window size change command for changing a size of at least one of the plurality of windows is input, to change and display the size of at least one of the plurality of windows in response to the window size change command, and to control some of the plurality of windows not to be displayed while enlarging and displaying the sizes of the remaining windows.
  14. The display device of claim 13,
    wherein the window size change command is a drag gesture starting from the center button and ending at a first point.
  15. The display device of claim 14,
    wherein the control unit controls the center button, on which the drag gesture is input, to move to the first point and be displayed at the first point.
  16. The display device of claim 15,
    wherein the control unit controls the plurality of boundary lines to be reset and displayed in response to the movement of the center button, and controls the size of each of the plurality of windows to be changed and displayed based on the reset plurality of boundary lines.
  17. The display device of claim 16,
    wherein the first point is on a boundary of the touch screen, and
    the control unit controls only some of the plurality of boundary lines to be displayed.
  18. The display device of claim 14,
    wherein, when the first point is on a left or right boundary of the touch screen, the control unit divides the touch screen into upper and lower regions to display the remaining windows, and when the first point is on an upper or lower boundary of the touch screen, the control unit divides the touch screen into left and right regions to display the remaining windows.
  19. The display device of claim 14,
    wherein, when the first point is a corner of the touch screen, the control unit controls the remaining window to be displayed on a full screen.
  20. The display device of claim 16,
    wherein the control unit controls the positions of the boundary lines to be reset and displayed based on the position of the center button on the touch screen.
  21. The display device of claim 13,
    wherein the control unit controls the center button not to be displayed after the enlarged remaining windows are displayed.
  22. The display device of claim 21,
    wherein, when the touch screen receives a center button generation command for regenerating and displaying the center button, the control unit controls the center button to be displayed at its most recent position in response to the center button generation command.
  23. The display device of claim 19,
    wherein the control unit controls only a part of the center button to be displayed when the center button is disposed at the corner.
  24. The display device of claim 13,
    wherein the center button has any one of a rectangular, circular, elliptical, or polygonal shape.
  25. A method of controlling a display of a touch screen, the method comprising:
    setting the touch screen into a plurality of windows;
    displaying a center button at an intersection of a plurality of boundary lines separating the set plurality of windows; and
    changing the size of at least one of the plurality of windows in response to movement of the center button.
  26. The method of claim 25,
    wherein the size of the at least one window is enlarged when the center button is moved in a direction away from the point where the at least one window is located.
  27. The method of claim 25,
    wherein the size of the at least one window is reduced when the center button is moved in a direction toward the point where the at least one window is located.
  28. The method of claim 25,
    wherein, when the center button is moved in a direction away from the point where the at least one window is located, the sizes of the remaining windows other than the at least one window among the plurality of windows are reduced by the ratio by which the at least one window is enlarged.
  29. The method of claim 25,
    wherein, when the center button is moved in a direction toward the point where the at least one window is located, the sizes of the remaining windows other than the at least one window among the plurality of windows are enlarged by the ratio by which the at least one window is reduced.
  30. The method of claim 25, further comprising:
    displaying one of the plurality of windows on a full screen when the center button is moved to a corner of the touch screen.
  31. The method of claim 30,
    wherein the one window is the window farthest from the corner among the plurality of windows.
  32. The method of claim 30,
    wherein, when the center button is moved to the corner of the touch screen, the remaining windows other than the one window among the plurality of windows are not displayed.
  33. The method of claim 25,
    wherein the set windows are displayed so as not to overlap each other.
  34. The method of claim 25, further comprising:
    displaying an intersection that is changed in response to the movement of the center button.
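The window-layout behavior recited in claims 25 to 32 can be illustrated with a short sketch. This is a hypothetical model, not code from the patent: it assumes four windows in a 2x2 grid whose shared corner is the center button, and the function and window names are chosen for illustration only.

```python
# Hypothetical sketch of the center-button layout model described in the
# claims. The 2x2 grid, function name, and window names are illustrative
# assumptions, not taken from the patent itself.

def layout_windows(width, height, cx, cy):
    """Return visible window rects (x, y, w, h) for a center button at (cx, cy).

    The four windows share the center button as a common corner. A window
    whose width or height collapses to zero (the button was dragged to a
    screen boundary or corner) is omitted; the remaining windows already
    extend to the screen edges, so they fill the freed space.
    """
    rects = {
        "top_left": (0, 0, cx, cy),
        "top_right": (cx, 0, width - cx, cy),
        "bottom_left": (0, cy, cx, height - cy),
        "bottom_right": (cx, cy, width - cx, height - cy),
    }
    # Keep only windows with a non-zero visible area.
    return {name: r for name, r in rects.items() if r[2] > 0 and r[3] > 0}
```

In this model, dragging the button to the left boundary (cx = 0) leaves only the right-hand windows, so the screen is divided into upper and lower regions as in claim 6; dragging it to the corner (0, 0) leaves only the bottom-right window at full screen, which is the window farthest from that corner, as in claims 30 and 31.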
PCT/KR2013/011306 2012-12-06 2013-12-06 Display device and control method therefor WO2014088374A1 (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
US201261734097P 2012-12-06 2012-12-06
US61/734,097 2012-12-06
US201261737540P 2012-12-14 2012-12-14
US61/737,540 2012-12-14
US201261740887P 2012-12-21 2012-12-21
US61/740,887 2012-12-21
KR10-2013-0011933 2013-02-01
KR20130011933 2013-02-01
KR1020130096206A KR20140073399A (en) 2012-12-06 2013-08-13 Display apparatus and method for controlling thereof
KR10-2013-0096206 2013-08-13

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/649,451 US20150325211A1 (en) 2012-12-06 2013-12-06 Display device and control method therefor
US16/055,670 US20180349025A1 (en) 2012-12-06 2018-08-06 Display device including button configured according to displayed windows and control method therefor

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US201514649451A A-371-Of-International 2015-06-03 2015-06-03
US16/055,670 Continuation US20180349025A1 (en) 2012-12-06 2018-08-06 Display device including button configured according to displayed windows and control method therefor

Publications (1)

Publication Number Publication Date
WO2014088374A1 2014-06-12

Family

ID=50883723

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/011306 WO2014088374A1 (en) 2012-12-06 2013-12-06 Display device and control method therefor

Country Status (1)

Country Link
WO (1) WO2014088374A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070001771A (en) * 2005-06-29 2007-01-04 정순애 Control method of screen data
JP2008227679A (en) * 2007-03-09 2008-09-25 Funai Electric Co Ltd Television broadcast signal receiver
US7437678B2 (en) * 2005-10-27 2008-10-14 International Business Machines Corporation Maximizing window display area using window flowing
JP2009005258A (en) * 2007-06-25 2009-01-08 Sharp Corp Television receiver
KR20120059909A (en) * 2010-12-01 2012-06-11 엘지전자 주식회사 Mobile terminal and operation method thereof


Similar Documents

Publication Publication Date Title
CN102262503B (en) Electronic device and method of controlling the same
US8718556B2 (en) Mobile terminal and controlling method thereof
US9261995B2 (en) Apparatus, method, and computer readable recording medium for selecting object by using multi-touch with related reference point
US20140351728A1 (en) Method and apparatus for controlling screen display using environmental information
US8907977B2 (en) Mobile terminal having a display configured to display multiple zones and control method thereof
KR20130007956A (en) Method and apparatus for controlling contents using graphic object
KR20130052151A (en) Data input method and device in portable terminal having touchscreen
EP2720132B1 (en) Display apparatus and method of controlling the same
US20130265284A1 (en) Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
KR101749933B1 (en) Mobile terminal and method for controlling the same
US20130305184A1 (en) Multiple window providing apparatus and method
US9342232B2 (en) Information-processing apparatus providing multiple display modes
US20190212915A1 (en) Display device and method of controlling the same
US20140101579A1 (en) Multi display apparatus and multi display method
WO2013032234A1 (en) Method of providing of user interface in portable terminal and apparatus thereof
EP2402846A2 (en) Mobile terminal and method for controlling operation of the mobile terminal
JP2014509422A (en) Device having a plurality of touch screens and screen changing method for the device
EP2466438A2 (en) Mobile terminal and control method thereof
JP6210975B2 (en) Method and apparatus for sharing data between network electronic devices
KR20120079579A (en) Method and apparatus for changing a size of screen using multi-touch
EP2672762B1 (en) Connecting the highest priority Bluetooth device to a mobile terminal
JP2013175180A (en) Device and method for changing application
EP2595046A2 (en) Apparatus including a touch screen under a multi-application environment and controlling method thereof
EP2741192A2 (en) Display device for executing a plurality of applications and method for controlling the same
US20130300684A1 (en) Apparatus and method for executing multi applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13860132

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14649451

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 13860132

Country of ref document: EP

Kind code of ref document: A1