NL2011505C2 - A processing device and method of manipulating a window of an application. - Google Patents

A processing device and method of manipulating a window of an application.

Info

Publication number
NL2011505C2
Authority
NL
Netherlands
Prior art keywords
application window
application
window
display area
touch pattern
Prior art date
Application number
NL2011505A
Other languages
Dutch (nl)
Inventor
Seokhwan Jeong
Original Assignee
Piit Group B V
Priority date
Filing date
Publication date
Application filed by Piit Group B V filed Critical Piit Group B V
Priority to NL2011505 priority Critical
Priority to NL2011505A priority patent/NL2011505C2/en
Application granted granted Critical
Publication of NL2011505C2 publication Critical patent/NL2011505C2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text

Abstract

The invention relates to an improved user interface for a mobile device (100) comprising a processor, a touch screen having a display area, an Input/Output device to connect to a network system and a data storage comprising instructions. In response to a touch pattern (106A, 108, 106B) with predefined characteristics in the full-screen window of an application, the full-screen window is repositioned in a direction (108') associated with the touch pattern, as a result of which the application window is shifted and displayed partially in a section of the display area. The repositioning does not change the appearance and functionality of the part of the application window that is displayed in the section of the display area. The invention enables a user to fully control the application with the same hand that holds the mobile device.

Description

A processing device and method of manipulating a window of an application.

TECHNICAL FIELD

The invention relates to the field of user interfaces on processing devices provided with a touch screen, and in particular, to a processing device and method of manipulating a window of an application executing on a processing device comprising a touchscreen having a display area.

BACKGROUND

Nowadays, smartphones and tablets are part of daily life. These mobile devices have a touchscreen as input device. The tip of one or more fingers touching the touchscreen is used to interact with the device. The manual actions by means of one or more fingers include tapping, scrolling, pinching and swiping. These actions enable a person to get the desired information on the display area of the touchscreen in a more or less natural and intuitive way. A smartphone or tablet can be regarded as a processing device comprising a processor, a touchscreen having a display area, an Input/Output device to connect the processor to a network system, and a data storage comprising instructions. The instructions, when executed by the processor, cause the processing device to execute an application. The application generates an application window with first dimensions in the display area of the touchscreen. The application window normally comprises a plurality of areas which are individually selectable via a touch on the touchscreen to display content associated with a selected area. Smartphones used to have a touchscreen size that enabled the user to control the smartphone with one hand. However, smartphones with larger touchscreens have been introduced. The larger touchscreen makes it almost impossible to control the smartphone with one hand via the touchscreen. For example, when holding the smartphone in the right hand, it becomes impossible to touch the upper-left, upper or left side of the touchscreen. To avoid dropping the smartphone, fingers of the left hand are needed to select selectable areas at the left or upper side of the display area. Thus, more or more complex manual actions are needed to control a smartphone with a larger touchscreen.

On the other hand, the touchscreens of tablets and multi-touch laptops are becoming smaller; however, they are still too large to easily control the device with the hand holding it.

SUMMARY

It is an object of the invention to provide an improved touchscreen device and method of manipulating a window of an application executing on a processing device comprising a touchscreen, and to obviate at least one of the disadvantages described above. More particularly, it is an object of the invention to provide an improved user interface on a touchscreen device enabling a user to fully control an application running on the device with a finger of the hand holding the device.

According to the invention, this object is achieved by a processing device having the features of claim 1 and a method of manipulating a window of an application having the features of claim 8. Advantageous embodiments and further ways of carrying out the invention may be attained by the measures mentioned in the dependent claims.

According to a first aspect of the invention, there is provided a processing device. The processing device comprises a processor, a touch screen having a display area, an Input/Output device to connect the device to a network system and a data storage. The data storage comprises instructions, which when executed by the processor, cause the processing device to execute an application. The application generates an application window with first dimensions in the display area. The window has a plurality of areas which are individually selectable via a touch on the touchscreen to display content associated with a selected area. The data storage further comprises instructions, which when executed by the processor, cause the processing device to detect a touch pattern with predefined characteristics on the touch screen in the application window. In response to the detection of the touch pattern with the predefined characteristics, the instructions cause the processor to reposition the application window in a direction associated with the touch pattern with the predefined characteristics. As a result of the repositioning, the application window is shifted and displayed partially in a section of the display area, while the appearance and functionality of the plurality of areas displayed in that section have not changed after repositioning.

Due to the relatively small size of touchscreens of handheld mobile devices and the relatively large amount of information to be displayed on the display area of a touchscreen, applications are started with an application window having the largest possible window size on the touchscreen. This applies to both smartphones and tablets. In the context of the present description, “application window” means the display area of the touchscreen that is filled by the application. The application window does not include the display area of a possible frame around the application window. The entire display area of the application could be used for user-selectable buttons or areas, which when selected show new content related to the selected button or area, or start another application. In the past, smartphones had a size which made it possible to control the device with one hand. Tablets were designed to be controlled by the fingers of two hands. Thus, in neither field of technology was there the problem that a selectable area could not easily be selected. However, smartphones now have a display size which makes it almost impossible to control the device with only the thumb of the hand holding the device. The size and weight of tablets have been reduced such that a tablet can easily be held in one hand. In view of that, it would be advantageous if a tablet could be controlled with one hand. The features of the present application provide a user interface which enables a person to manipulate the position of the application window in the display area such that the areas of the application window that cannot be reached by the thumb are moved or shifted in the direction of the area that is reachable by the thumb. According to the present application, in response to a touch pattern, which could be regarded as a specific command, the application window is repositioned on the display area of the touchscreen.

An advantage of the present application is that there is no need to adapt the individual software applications that are obtained from an internet store over the internet. According to the present application, after detection of the touch pattern, the operating system running on the device knows that the pixels of the application window have to be shifted a first number of pixels in horizontal direction and a second number of pixels in vertical direction. This is an operation that can easily be performed by a graphics processing unit (GPU). Similarly, the process identifying the location of a touch on the screen has to transform the measured x,y-location after repositioning into a corrected x,y-location which compensates for the shift of the pixels of the application window in horizontal and vertical direction. The corrected x,y-location is subsequently forwarded to the application running on the device.
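
The coordinate correction described above can be sketched as follows; the function name and the sign convention for the shift are assumptions for illustration, not part of the patent:

```python
def correct_touch_location(measured_x, measured_y, shift_x, shift_y):
    """Map a touch measured on the repositioned window back to the
    x,y-location the application expects.

    shift_x / shift_y are the number of pixels the application window
    was shifted in horizontal and vertical direction (assumed sign
    convention: positive means shifted right/down)."""
    return measured_x - shift_x, measured_y - shift_y

# A tap at screen position (350, 600) on a window shifted by
# (200, 300) is reported to the application as (150, 300).
```

Because the correction is applied by the operating system before the coordinates are forwarded, the application itself is unaware of the shift.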

In an embodiment, the touch pattern includes a swipe in a particular direction and the repositioning action results in a shift of the window of the application in a direction comparable with the particular direction. These features provide an intuitive user interface. The start and stop locations of the touch of the finger forming the swipe define the translation of the application window. An advantage of this feature is that a user can never shift the application window out of the display area: the start location of the swipe is always in the displayed part of the application window and the end location is always in the display area. As a result of the swipe, the content in the application window at the start location will, after the repositioning action, be displayed at the stop location of the swipe, which is somewhere in the display area.
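
The translation defined by the swipe can be sketched as below; the names are illustrative and the (x, y) tuples are assumed pixel coordinates with the origin at the upper left of the display area:

```python
def reposition_window(window_origin, swipe_start, swipe_stop):
    """Shift the application window by the vector from the swipe's
    start location to its stop location (sketch of the embodiment
    where the swipe itself defines the translation)."""
    dx = swipe_stop[0] - swipe_start[0]
    dy = swipe_stop[1] - swipe_start[1]
    return (window_origin[0] + dx, window_origin[1] + dy)
```

Since the stop location of the swipe is by definition inside the display area, the content that was under the finger at the start of the swipe always ends up at a reachable, on-screen position.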

For some applications, at least a part of the plurality of areas can be scrolled in the application window. In a further embodiment, the data storage further comprises instructions, which when executed by the processor, cause the processing device to detect a display state of the application wherein a scrollable part of the window touched and swiped by a user cannot be scrolled in a direction comparable with the particular direction of the swipe. The repositioning action is performed after detection of the touch pattern and the display state. These features make it possible to add an additional function to a swipe, which is also a touch pattern on the touch screen. Next to performing a scroll function, it allows repositioning of the entire application window, as a result of which selectable areas at the left, upper or bottom side of the application window become reachable by the thumb in the display window of the touchscreen.

In an alternative embodiment, the touch pattern comprises touching an area in the application window for a predefined period of time prior to the swipe in the particular direction. This feature provides a simple and intuitive user interface instruction to reposition the application window irrespective of the content in the application window.
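
A minimal classifier for this “hold, then swipe” pattern could look like the sketch below; the hold thresholds are the 0.5–3 second range the description mentions for the predetermined period, while the minimum swipe length is an assumed value:

```python
HOLD_MIN_S, HOLD_MAX_S = 0.5, 3.0   # predefined hold period (from the description)
MIN_SWIPE_PX = 20                   # assumed minimum swipe length in pixels

def is_reposition_pattern(hold_seconds, swipe_length_px):
    """True when a touch qualifies as the reposition touch pattern:
    a hold within the predefined period followed by a swipe."""
    held_long_enough = HOLD_MIN_S <= hold_seconds <= HOLD_MAX_S
    return held_long_enough and swipe_length_px >= MIN_SWIPE_PX
```

A short touch without the hold is treated as an ordinary tap or scroll, so the pattern does not interfere with the application's normal input.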

In an embodiment, the section of the display area is a predefined section with predefined second dimensions. An advantage of this embodiment is that, with one touch pattern, a user can position selectable areas near the border of the application window that cannot be reached by the finger of the hand holding the device into a part of the display area that is reachable by that finger.

In an embodiment, the application window at detection of the touch pattern utilizes at least 90%, more particularly 95%, of the display area and after repositioning utilizes not more than 50% of the display area. According to this feature, the left boundary, the top boundary or the bottom boundary of the application window, which approximately coincides with the edge of the display area, is shifted in the right, bottom or top direction such that after repositioning of the application window the boundary of the application window is positioned in the right half, bottom half or top half of the display area, respectively.

In an embodiment, the data storage further comprises instructions, which when executed by the processor, cause the processing device to detect another touch pattern with predefined characteristics on the touch screen starting in the window of the application and, in response to the detection of this other touch pattern with the predefined characteristics, to restore the application window to its first dimensions. These features allow a user to shift the application window back to its original position.

According to a second aspect, there is provided a method of manipulating a window of an application which is running on a processing device. The processing device comprises a touch screen having a display area. The method comprises detecting a touch pattern with predefined characteristics on the touch screen in the application window and, in response to the detection of the touch pattern with the predefined characteristics, repositioning the application window in a direction associated with the touch pattern. As a result, the application window is shifted and displayed partially in a section of the display area, while the appearance and functionality of the plurality of areas of the application window displayed in that section have not changed after repositioning.

It should be noted that operating systems of computers are known to allow a window of an application to be moved. However, to do this, the user has to select the top side of the frame around the application window and drag it to another position. Applications on a smartphone or tablet do not have such a frame, and even if they had one, the top side of the frame is just the part of the frame that is the least easy to select by a finger of the hand that is holding the device.

According to a third aspect of the present application, there is provided a computer program product readable by a processing device and comprising computer program code means configuring the processing device to carry out the method according to the present patent application.

Other features and advantages will become apparent from the following detailed description, taken in conjunction with the accompanying drawings which illustrate, by way of example, various features of embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other aspects, properties and advantages will be explained hereinafter based on the following description with reference to the drawings, wherein like reference numerals denote like or comparable parts, and in which:

Fig. 1 shows schematically an embodiment wherein the application window is moved in lower right direction in the display area of the touchscreen and subsequently restored;

Fig. 2 shows schematically an embodiment wherein the application window is moved to the right in the display area of the touchscreen and subsequently restored;

Fig. 3 shows schematically an embodiment wherein the application window is moved down in the display area of the touchscreen and subsequently restored;

Fig. 4 shows schematically an embodiment wherein the bottom part of the application window is moved up in the display area of the touchscreen and subsequently restored;

Fig. 5 shows schematically another embodiment wherein the application window is moved in lower right direction in the display area of the touchscreen;

Fig. 6 is a block diagram illustrating a processing device; and,

Fig. 7 is a block diagram showing an embodiment of a process running in a processing device.

DETAILED DESCRIPTION

Fig. 1 shows a view of a handheld device such as a smartphone after a user has opened an application on said smartphone 100. Some examples of applications are: a software program that is downloaded via a web store, or a web browser. A handheld device according to the present application is a processing device 1000 as shown in Fig. 6. The processing device comprises a processor 1010, a touch screen 1040 having a display area, and an Input/Output device 1030 to connect to a network system. The processing device further comprises a data storage 1020 comprising instructions. The instructions, when executed by the processor 1010, cause the processing device 1000 to execute an application or software program. The application can be stored in the data storage or be downloaded over the internet. The application generates an application window 104 with first dimensions in the display area. Normally, with the exception of the status bar 102, the application window uses the remaining display area of the touchscreen. Application window in the context of the present description means the area of the screen that is filled with content or images by the application. The application window does not include the display area used by a possible frame around the application window. Such a frame is not generated by the application but by the operating system running on the processing device. In this way, the application window utilizes at least 90%, more particularly 95%, of the display area. The application could also utilize the entire display area. The status bar 102 indicates the status of various functions of the processing device, such as time, battery lifetime, signal strength of the telephone network, signal strength of the WiFi network, etc. Besides the status bar 102, the processing device comprises some buttons 102A, 102B to control the device and/or the application.
The application window 104 has a plurality of areas 104A-104F which are individually selectable via a touch on the touchscreen. In response to a tap on the touchscreen in an area, the application displays content associated with the selected area. Some areas 104A in the application window could have a fixed position, i.e. their position cannot be moved in the application window. Other areas 104B-104F could be part of a scroll list. By moving the finger in upward/downward direction on the screen over a specific distance, the content in the areas 104B-104F will move upward/downward in the application window corresponding to the movement of the finger, as long as content is available below or above the content of the scroll list currently shown in the application window.

As an example, an information page of an application is shown in Fig. 1. The areas indicated with 104A are fixed areas whose location in the application window is fixed as long as the information page is shown. After selection of such an area, another information page is shown with content linked to the selected area. Areas 104B-104F form the scrollable content of the information page. The areas indicated with reference numeral 104B form the top of the scroll list. Area 104F is an item somewhere between the top and bottom items of the scroll list. Thus, by scrolling the content in upward direction, at least a part of the items after area 104F will be shown. Area 104D is a selectable picture Pict-1 with a short description Desc-1. By selecting area 104D, an information page is displayed with content related to or associated with the picture and short description. A user interacts with his finger (e.g. thumb or index finger) by tapping, scrolling, pinching, dragging and swiping to get the right information on the screen in a natural and intuitive way.

Fig. 1A further illustrates a touch pattern on the touch screen. The touch pattern is indicated with reference numerals 106A, 106B and 108. Reference numeral 106A indicates the user action wherein he holds his finger for a predetermined period of time in the application window 104. A predetermined period of time could be in the range of 0.5-3 seconds, more particularly in the range of 1-2 seconds. The application will recognize that this touch pattern does not correspond to a tap, which would indicate that the user would like to see the content related to a particular item in the touched area. Subsequently, the user swipes or drags his finger in the direction of the bottom right corner of the touchscreen along dashed line 108. Circles 106B indicate the position where the finger loses contact with the touchscreen. The processing device detects the touch pattern with predefined characteristics on the touch screen in the window of the application. In the present example the touch pattern is: holding the finger for some time at a location on the touchscreen in the application window and subsequently swiping the finger in a direction. In response to the detection of the touch pattern with the predefined characteristics, the location of the application window is shifted on the screen. In the present example the predefined characteristics comprise at least two elements: first, holding the finger for a predefined period of time in the application window and second, swiping the finger in a direction. In the present example, the shift of the application window in the display area of the touchscreen is defined by the distance and direction between the start location 106A and the stop location 106B, i.e. the location where the finger loses contact with the touchscreen.

The application window and its content are translated along a predefined vector. As a result of this the content of the application window is shifted and displayed partially in a section of the display area. The appearance and functionality of the plurality of areas displayed after repositioning have not changed due to the translation or repositioning.

Assume that the region indicated with reference numeral 106A is the corner of the area that can maximally be reached by the thumb when holding the smartphone in one hand. Prior to the repositioning, the thumb cannot reach the area 104A at the upper left corner of the application window. By moving the upper left corner of the application window in the direction of the lower right corner of the touchscreen, the thumb can select the area 104A in the upper left section of the application window.

As the touch pattern starts in the application window and ends at a position on the touchscreen which is reachable by the finger, the application window can be moved or shifted in the display area in any direction according to the vector defined by the start and stop location on the touchscreen. The application window is dragged from a first location to a second location by the touch pattern. The area of the application window at the starting location will be moved to the stop position. In this way, the application window can never be moved with one hand to a location on the screen where it cannot be reached by the finger which moved the application to the second position.

Fig. 1B shows the application window after repositioning. It can be seen that the appearance, i.e. size and shape of the areas in the application window, has not changed. The arrow with reference numeral 108’ indicates the vector of displacement of the application window in the display area of the touchscreen. The length and direction of vector 108’ are equivalent to the length and direction of the swipe 108. The functionality of the areas of the application that are displayed on the touchscreen has not been changed by the repositioning.

In Fig. 1B, the display area next to the application window is no longer used by the application. This region can be defined as the background region. In Fig. 1B, the background region displays a wallpaper comprising one colour. The background region could also be used to display a part of the application window of an application which was started before the application whose application window has been shifted. In this case, a user can see the content of two applications simultaneously. It might also be possible that the application receives, from the program code performing the method according to the present description, information about the location of its application window. In that case, the application could use this information to determine whether the background region is sufficiently large to display additional content of the application in an additional window next to the application window. The additional content could be an advertisement banner. In terms of the present description, the additional window is not a part of the application window.

Fig. 1C illustrates the user interaction needed to restore the original size and location of the application window. The application window now utilizes only approximately a quarter of the entire display area of the touch screen. The user first touches with a finger for a predetermined period of time at location 116A in the application window. Subsequently, he swipes his finger along a path 118 in the direction of the upper left corner of the touchscreen. At location 116B his finger loses contact with the touchscreen. When the processing device detects such a touch pattern, which includes a swipe in a particular direction, together with the knowledge of the location of the upper left corner of the application window, the processing device will shift the application window back to its original location in the display area. Fig. 1D shows the appearance and size of the application window after restoration. It can be seen that the size of the application window in Fig. 1D is equivalent to the size of the application window in Fig. 1A. Path 118 in Fig. 1C has the same length as, but the inverted direction of, path 108 in Fig. 1A. As a result of this, the application window has regained its original dimensions and location, i.e. its dimensions at start-up of the application. If the straight path 118 were shorter, the application window would not have regained its original dimensions and location, but would still have been shifted and shown partially. However, if both the horizontal and vertical distances of the swipe were larger than the displacement of the application window in horizontal and vertical direction, the application window would also have regained its original location in the display area.
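
The restore behaviour just described — a partial restore for a short swipe, a full restore when the swipe covers at least the current displacement — can be sketched as below. The clamping convention (each shift component moves back toward zero but never past it) is an assumed reading of the text:

```python
def restore_shift(current_shift, swipe_vector):
    """Apply a restore swipe to the current window shift.

    Each component of the shift moves back toward zero (the original
    window position) but is clamped so the window never overshoots
    its original location."""
    def move_toward_zero(shift, delta):
        new = shift + delta
        return max(0, new) if shift >= 0 else min(0, new)
    return (move_toward_zero(current_shift[0], swipe_vector[0]),
            move_toward_zero(current_shift[1], swipe_vector[1]))

# A swipe at least as long as the displacement restores the window
# fully; a shorter swipe leaves it partially shifted.
```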

It should be noted that the program code of the application does not have to change to implement the present invention. The operating system of the processing device should be adapted to enable the shift or translation of the application window in the display area of the touch screen. The translation of a window along x pixels in horizontal direction and y pixels in vertical direction is a function which can easily be performed by a Graphics Processing Unit. Furthermore, the operating system has to correct the coordinates of a touch on the screen prior to supplying the coordinates to the application. This can be done by correcting the coordinates by a value corresponding to the translation of x pixels in horizontal direction and y pixels in vertical direction. By doing this, the application is not aware that only a part of the application window is shown on the touchscreen.

It should further be noted that the program code performing the present method could comprise a setting enabling the function that, after selection of an area in the “shifted” application window, the application window is restored to its original location and dimensions at start-up of the application, i.e. “full” screen. Scrolling through a scroll list in a “shifted” application window will not result in restoration of the original application window location and dimensions.

It should further be noted that if a user performs a touch pattern with the predefined characteristics starting in the application window, the application window will be moved to a subsequent location along a vector corresponding to the path of the swipe. In this way, the user can move a selectable area of the application window to a location in the display area which is easy to reach by the finger controlling the device via the touchscreen.

The implementation of the user interface or method described above is suitable for smartphones due to their relatively small size. With one touch pattern, the user can shift the application window to a location where a desired area of the application window comes within the range of his finger. However, processing devices with larger touchscreens would require more than one touch pattern to bring a desired area within the range of the finger.

Fig. 5 shows schematically a touch pattern similar to the touch pattern described in Figs. 1A-B. In Fig. 5A the processing device is a tablet, whereas in Fig. 1A the processing device is a smartphone. The areas A’, B’, C’ and D’ have a fixed position in the application. The areas indicated with Item-A,... Item-H form part of a scroll list. The area of the scroll list in the application window also has a fixed position. The touch pattern is indicated with reference numerals 506A, 506B and 508. It is further assumed that position 506A is the farthest position from the lower right corner of the touch screen that could be reached by the finger of the hand holding the device. Now, the predefined characteristics of the touch pattern also comprise at least two elements: first, holding the finger for a predefined period of time in the application window and second, swiping the finger in a direction. In order to distinguish between the touch pattern in Fig. 1 and the current touch pattern, the finger is held for another period of time at a location in the application window, wherein the two periods of time do not overlap. For example, the period of time could now be 3 seconds or more. After detecting the touch pattern, the application window is repositioned at a predefined location in the display area. The application window is now also shifted along a vector, but now the vector has no relation with the path 508 of the swipe. The vector is in this case a predefined displacement of the pixels in both horizontal and vertical direction. As a consequence, the display area used by the application is a predefined section with predefined second dimensions, i.e. a fixed number of pixels in horizontal direction and a fixed number of pixels in vertical direction.
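The distinction between the two hold-then-swipe patterns could, for instance, be made on hold duration alone. In the sketch below, the 3-second value follows the example above, while the 1-second value and the function name are assumptions for illustration:

```python
# Hold durations distinguishing the two patterns; the two ranges must not
# overlap. 1 s is an assumed value; 3 s follows the example in the text.
SHORT_HOLD = 1.0  # seconds: Fig. 1 pattern, window follows the swipe vector
LONG_HOLD = 3.0   # seconds: Fig. 5 pattern, predefined displacement

def classify_hold(hold_seconds):
    # The longer hold wins; anything below the short threshold is no pattern.
    if hold_seconds >= LONG_HOLD:
        return "predefined-displacement"
    if hold_seconds >= SHORT_HOLD:
        return "swipe-vector"
    return None

assert classify_hold(3.5) == "predefined-displacement"
assert classify_hold(1.5) == "swipe-vector"
```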

As said before, the displacement of the application window in the display area is not directly related to the length and direction of the path of the swipe. It might be clear that a swipe in a range of directions indicates that the application window has to be moved in the direction of the lower right corner of the touchscreen. For example, all swipes having an angle in the range of 292.5° to 337.5° are assumed to indicate that the application window has to be displaced in the direction of the lower right corner of the touch screen. An angle of 0°, 90°, 180° and 270° points to the right side, upper side, left side and bottom side of the touch screen respectively. As a result of the detection, the upper left corner of the application window is moved in the direction of the lower right corner of the touch screen. The location of the upper left corner of the application window after repositioning is predefined and does not depend on the length and/or precise direction of the touch pattern.
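The classification of a swipe angle into one of eight 45°-wide sectors could, for example, be sketched as follows (a hypothetical helper, not part of the described device):

```python
# Hypothetical helper mapping a swipe angle to one of eight 45°-wide
# sectors; 0° points right and angles increase counter-clockwise, matching
# the convention in the text (90° up, 180° left, 270° down).
def swipe_direction(angle_degrees):
    sectors = ["right", "upper-right", "up", "upper-left",
               "left", "lower-left", "down", "lower-right"]
    # Add half a sector (22.5°) so each sector is centred on its axis.
    return sectors[int(((angle_degrees % 360) + 22.5) // 45) % 8]

# A swipe at 315°, inside the 292.5°-337.5° range, targets the lower right:
assert swipe_direction(315) == "lower-right"
assert swipe_direction(0) == "right"
```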

From Fig. 5B it can be seen that, by repositioning the application window in the display area of the touch screen, the areas that have a fixed position at the left side of the application window come within reach of the thumb of the right hand (not shown) holding the tablet.

If the processing device is capable of detecting both the touch pattern described with reference to Fig. 1 and the touch pattern described with reference to Fig. 5, a user is capable of moving the selectable area indicated with Pict-2 and Desc-2 into a section of the touchscreen that can be reached by a finger of the hand holding the device by means of the touch pattern described with reference to Fig. 1, and of moving the selectable areas indicated with e.g. A’ - D’ into such a section by means of the touch pattern described with reference to Fig. 5.

Fig. 2A shows schematically a touch pattern corresponding to a user interface command to move the left edge of the application window to the middle of the display area of the touchscreen. The touch pattern is indicated with reference numerals 206A, 206B and 208. Reference numeral 206A indicates the user action wherein he holds his finger for a predetermined period of time in the application window. Subsequently, the user swipes his finger to the right side of the touchscreen along a path indicated with dashed line 208. Circles 206B indicate the position where the finger loses contact with the touchscreen. It might be clear that the direction is not one specific direction but includes a range of directions. For example, all swipes having an angle in the range of -22.5° to 22.5° are assumed to point to the right side of the touch screen. An angle of 0°, 90°, 180° and -90° points to the right side, upper side, left side and bottom side of the touch screen respectively. Fig. 2B shows the result of the user interaction. The left side of the application window is moved in the direction of the right side of the touch screen. The location of the left side of the application window after repositioning the application window is predefined and does not depend on the length and/or direction of the touch pattern. The application window and its content are translated along a predefined vector. The vector only has a component in horizontal direction. As a result of this, the content of the application window is shifted and displayed partially in a predefined right section of the display area. The appearance and functionality of the plurality of areas displayed after repositioning have not changed due to the translation or repositioning. The operation on the application window corresponds to a translation of the position of pixels of the application window along x-pixels in horizontal direction and 0-pixels in vertical direction.
By moving the left side of the application window to somewhere about the middle of the display area, the areas at the left of the application window come within reach of the thumb of the right hand holding the smartphone.

Fig. 2C illustrates the user interaction needed to restore the original size and location of the application window. The application window utilizes only approximately half of the entire display area of the touch screen. First, the user touches with a finger for a predetermined period of time at location 216A in the application window. Subsequently, he swipes his finger along a path 218, globally in the direction of the left side of the touchscreen. At location 216B his finger loses contact with the touchscreen. When the processing device detects such a touch pattern, which includes a swipe in a particular direction, and knows that the left side of the application window has been moved in the direction of the right side of the touchscreen, the processing device will restore the original size of the application window. Fig. 2D shows the appearance and size of the application window after restoration. It can be seen that the size of the application window in Fig. 2D is equivalent to the size of the application window in Fig. 2A.

Fig. 3A shows schematically a touch pattern corresponding to a user interface command to move the top of the application window to the middle of the display area of the touchscreen. The touch pattern is indicated with reference numerals 306A, 306B and 308. Reference numeral 306A indicates the user action wherein he holds his finger for a predetermined period of time in the application window. Subsequently, the user swipes his finger to the bottom side of the touchscreen along a path indicated with dashed line 308. Circles 306B indicate the position where the finger loses contact with the touchscreen. It might be clear that the direction is not one specific direction but includes a range of directions. For example, a swipe having an angle in the range of 247.5° to 292.5° is assumed to point to the bottom side of the touch screen. An angle of 0°, 90°, 180° and 270° points to the right side, upper side, left side and bottom side of the touch screen respectively. Fig. 3B shows the result of the user interaction. The top side of the application window is moved in the direction of the bottom side of the touch screen. The location of the top side of the application window after repositioning the application window is predefined and does not depend on the length and/or precise direction of the touch pattern. The pixels of the application window and its content are translated along a predefined vector. The vector only has a component in vertical direction. As a result of this, the content of the application window is shifted and displayed partially in a predefined bottom section of the display area. The appearance and functionality of the plurality of areas displayed after repositioning have not changed due to the translation or repositioning. The operation on the application window corresponds to a translation of the position of pixels of the application window along 0-pixels in horizontal direction and y-pixels in vertical direction.
By moving the top side of the application window to somewhere about the middle of the display area, the areas at the top of the application window come within reach of the thumb of the right hand holding the smartphone.

Fig. 3C illustrates the user interaction needed to restore the original size and location of the application window. The application window utilizes only approximately half of the entire display area of the touch screen. First, the user touches with a finger for a predetermined period of time at location 316A in the application window. Subsequently, he swipes his finger along a path 318, globally in the direction of the top side of the touchscreen. At location 316B his finger loses contact with the touchscreen. When the processing device detects such a touch pattern, which includes a swipe in a particular direction, and knows that the top side of the application window has been moved in the direction of the bottom side of the touchscreen, the processing device will restore the original size of the application window. Fig. 3D shows the appearance and size of the application window after restoration. It can be seen that the size of the application window in Fig. 3D is equivalent to the size of the application window in Fig. 3A.

Fig. 4A shows schematically a touch pattern corresponding to a user interface command to move the bottom of the application window to the middle of the display area of the touchscreen. The touch pattern is indicated with reference numerals 406A, 406B and 408. Reference numeral 406A indicates the user action wherein he holds his finger for a predetermined period of time in the application window. Subsequently, the user swipes his finger to the top side of the touchscreen along a path indicated with dashed line 408. Circles 406B indicate the position where the finger loses contact with the touchscreen. It might be clear that the direction is not one specific direction but includes a range of directions. For example, a swipe having an angle in the range of 67.5° to 112.5° is assumed to point to the top side of the touch screen. An angle of 0°, 90°, 180° and 270° points to the right side, upper side, left side and bottom side of the touch screen respectively. Fig. 4B shows the result of the user interaction. The bottom side of the application window is moved in the direction of the top side of the touch screen. The location of the bottom side of the application window after repositioning the application window is predefined and does not depend on the length and/or precise direction of the touch pattern. The pixels of the application window and its content are translated along a predefined vector. The vector only has a component in vertical direction. As a result of this, the content of the application window is shifted and displayed partially in a predefined top section of the display area. The appearance and functionality of the plurality of areas displayed after repositioning have not changed due to the translation or repositioning. The operation on the application window corresponds to a translation of the position of pixels of the application window along 0-pixels in horizontal direction and y-pixels in vertical direction.
By moving the bottom side of the application window to somewhere about the middle of the display area, the areas at the bottom of the application window come better within reach of the thumb of the right hand holding the smartphone.

Fig. 4C illustrates the user interaction needed to restore the original size and location of the application window. The application window utilizes only approximately half of the entire display area of the touch screen. First, the user touches with a finger for a predetermined period of time at location 416A in the application window. Subsequently, he swipes his finger along a path 418, globally in the direction of the top side of the touchscreen. At location 416B his finger loses contact with the touchscreen. When the processing device detects such a touch pattern, which includes a swipe in a particular direction, and knows that the bottom side of the application window has been moved in the direction of the top side of the touchscreen, the processing device will restore the original size of the application window. Fig. 4D shows the appearance and size of the application window after restoration. It can be seen that the size of the application window in Fig. 4D is equivalent to the size of the application window in Fig. 4A.

Fig. 7 shows an exemplary embodiment of the actions to perform the method according to the present application. In block 701, an application is started by the user. Due to the relatively small size and resolution of the touchscreen in comparison with the amount of information/content that has to be displayed on the screen, applications on tablets and smartphones are started full screen, that is, in a display area that is as large as possible. The size of this application window is called the application window with first dimensions. The window is composed of a plurality of areas which are individually selectable via a touch on the touchscreen. If an area is selected, the application displays in its application window content associated with the selected area. Block 702 represents the detection of a touch pattern with predefined characteristics on the touch screen in the window of the application by the processor. Some examples of touch patterns are described above. Optionally, the processor could detect the display state of an application. The display state of an application indicates whether a touched area in the application before the swipe is part of a scroll list and, if so, whether the scroll list could be scrolled, i.e. moved, in the direction of the swipe. In other words, the display state indicates whether an area could be moved in the application window in a direction comparable to that of the swipe. If the first touch pattern and optionally the display state are detected, action 704 is performed by the processor. From that moment, the application window is repositioned and utilizes only a part of the display area of the touchscreen that is available for an application. Examples of touch patterns and corresponding translations/displacements of the application window are described above.
Thus in action 704, in response to the detection of the touch pattern with the predefined characteristics and optionally the display state, the application window with first dimensions is repositioned in a predefined section of the display area. The predefined section forms a window with second dimensions. The second dimensions are associated with the first touch pattern. As a result of this, the window is shifted and displayed partially in the predefined section of the display area.
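The flow of blocks 701 to 706 can be sketched as a minimal state object. The class name, the offsets and the dimension values below are assumptions for illustration, not the actual implementation:

```python
# Minimal state sketch of the Fig. 7 flow; names and values are assumed.
class AppWindow:
    def __init__(self, width, height):
        self.first_dims = (width, height)  # first dimensions at start-up (block 701)
        self.dims = self.first_dims
        self.offset = (0, 0)               # translation in the display area

    def reposition(self, offset, second_dims):
        # Block 704: display the window partially in a predefined section.
        self.offset = offset
        self.dims = second_dims

    def restore(self):
        # Block 706: restore the first dimensions and original location.
        self.offset = (0, 0)
        self.dims = self.first_dims

win = AppWindow(1080, 1920)
win.reposition((540, 960), (540, 960))  # first touch pattern detected (block 702)
win.restore()                           # second touch pattern detected (block 705)
assert win.dims == (1080, 1920) and win.offset == (0, 0)
```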

Block 705 corresponds to the detection of a second touch pattern which starts in the application window with second dimensions. Examples of second touch patterns are described above. In block 706, in response to the detection of the second touch pattern, the application window is restored to comprise the first dimensions. From then on, the application window utilizes the same area of the touchscreen as when the application was started.

The method described above is very suitable for being performed when holding the processing device in one hand, while a finger of said one hand is used to generate the touch pattern on the touchscreen.

It should be noted that the method described in the present application could also be advantageous for users holding the processing device with touchscreen in the left hand. In that case, the method is able to move the right edge of the application window to the middle area of the display area of the touchscreen.

In case the processing device is capable of detecting the display state of an area in the application window, the touch pattern could also be a swipe performed twice in a particular direction, instead of the pattern with the predefined characteristic of holding the finger for a predetermined time at a location in the application window followed by a swipe in a direction which determines at what location the reduced-size application window will be displayed. The first swipe could then be used to detect the display state, and the second swipe, which has to occur within a predetermined time after the first swipe, is then used to trigger the action of repositioning the application window.
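The timing constraint on this double-swipe variant could be checked as follows. The 2-second window is an assumed value; the implementation would pick its own predetermined time:

```python
# Assumed timing check for the "swipe twice" variant: the second swipe must
# arrive within a predetermined time after the first (2 s chosen here).
DOUBLE_SWIPE_WINDOW = 2.0  # seconds, illustrative value

def is_double_swipe(t_first, t_second):
    # The first swipe probes the display state; a timely second swipe
    # triggers the repositioning of the application window.
    return 0 < t_second - t_first <= DOUBLE_SWIPE_WINDOW

assert is_double_swipe(10.0, 11.2) is True
assert is_double_swipe(10.0, 13.0) is False
```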

In the examples of touch patterns described above, the touch pattern starts with holding the finger for a predetermined period of time at a location in the application window, which is followed by a swipe from a start position to a stop position. This pattern could be replaced by any other touch pattern that is easy for a human to make. An example of such a touch pattern is a short tap in the application window on the touch screen, followed almost directly by a swipe in a direction indicative of the displacement of the application window.

Another embodiment of a touch pattern that could be used as a second touch pattern is tapping two or three times within a predetermined period of time in the application window with second dimensions. This touch pattern could be used for each of the situations shown in Figs. 1C, 2C, 3C and 4D to restore the application window to its original dimensions.

It might further be possible that, if the application window is shifted to the right as shown in Fig. 2B, the user could use a touch pattern as shown in Fig. 3A to instruct the processing device to reposition the application window at the location shown in Fig. 1B.

It might further be possible that the processing device is capable of responding to any of the touch patterns described in the present application. In other words, there may be one or more touch patterns which result in the same displacement of x-pixels in horizontal direction and y-pixels in vertical direction of the pixels of the application window in the display area of the touchscreen.

The present application could be provided to processing devices via the internet or via data storage devices such as a USB stick or CD-ROM, as a computer program product readable by a processing device and comprising computer program code means configuring the processing device to carry out the method described in the present application.

In summary, the present application describes a user interface wherein the user interacts with his finger (e.g. thumb or index finger) to pinch and swipe the application window or web browser window to a desired location on the touchscreen of a processing device, for example a smartphone or tablet. As illustrated above, the user needs to pinch/hold an area in the application window or web browser window before the user can swipe the window to the desired location, e.g. the lower right / bottom of the screen or sideways. As described, the application window is pinched to the desired location on the screen of the processing device. The user can also scroll within the page of the application window that is pinched. As an example, if the user double-taps an area in the pinched application window, a new window will pop up at the location already pinched into. It might also be possible that the new window gets the full-sized window, which is the same window size as when starting an application.

While the invention has been described in terms of several embodiments, it is contemplated that alternatives, modifications, permutations and equivalents thereof will become apparent to those skilled in the art upon reading the specification and upon study of the drawings. The invention is not limited to the illustrated embodiments. Changes can be made without departing from the idea of the invention.

Claims (16)

  1. A processing device (1000) comprising a processor (1010), a touch screen (1040) with a display area, an Input/Output device (1030) for connecting to a network system and a data storage (1020) comprising instructions which, when executed by the processor (1010), cause the processing device (1000) to: - execute an application, wherein the application generates an application window with first dimensions in the display area, the window having a plurality of areas that can be individually selected via a touch on the touch screen to display content linked to a selected area, characterized in that the data storage further comprises instructions which, when executed by the processor (1010), cause the processing device (1000) to: - detect a touch pattern with predefined characteristics on the touch screen in the application window; - in response to detecting the touch pattern with the predefined characteristics, reposition the application window in a direction associated with the touch pattern with the predefined characteristics, whereby the window is shifted and partially displayed in a portion of the display area and wherein the appearance and functionality of the plurality of areas of the application displayed in the portion of the display area after repositioning have not changed.
  2. The processing device of claim 1, wherein the touch pattern includes a swipe in a particular direction and the repositioning action results in a shift of the application window in a direction similar to the particular direction.
  3. The processing device of claim 2, wherein at least a portion of the plurality of areas can be scrolled in the window and the data storage further comprises instructions which, when executed by the processor (1010), cause the processing device (1000) to: - detect a display state of the application in which a scrollable part of the window that is touched by a user cannot be scrolled in a direction similar to the particular direction of the swipe, wherein the repositioning action is performed after detection of the touch pattern and the display state.
  4. The processing device of claim 2 or 3, wherein the touch pattern includes touching an area in the application window for a predefined period of time before the swipe in the determined direction.
  5. The processing device of any one of claims 1-4, wherein the portion of the display area is a predefined portion with predefined second dimensions.
  6. The processing device according to any of claims 1-5, wherein the application window uses at least 90% of the display area, more specifically 95% of the display area, and uses no more than 50% of the display area after repositioning.
  7. The processing device of any one of claims 1 to 6, wherein the data storage further comprises instructions which, when executed by the processor (1010), cause the processing device (1000) to: - detect another touch pattern with predefined characteristics on the touch screen which starts in the application window; - in response to the detection of the other touch pattern with the predefined characteristics, restore the application window so that the application window acquires the first dimensions.
  8. A method of manipulating a window of an application that is executed on a processing device (1000) comprising a touch screen (1040) with a display area, wherein the method comprises: - a processor (1010) of the processing device (1000) executing an application whereby the application generates an application window with first dimensions in the display area, the application window having a plurality of areas that can be individually selected via a touch on the touch screen in order to display content linked to a selected area, characterized in that the method further comprises: - detecting a touch pattern with predefined characteristics on the touch screen in the application window; - repositioning the application window in response to detecting the touch pattern with the predefined characteristics in a direction associated with the touch pattern with the predefined characteristics, whereby the application window is shifted and partially displayed in a portion of the display area and wherein the appearance and functionality of the plurality of areas of the application displayed in the portion of the display area after repositioning have not changed.
  9. The method of claim 8, wherein the touch pattern includes a swipe in a particular direction and the repositioning action results in a shift of the application window in a direction similar to the particular direction.
  10. The method of claim 9, wherein at least a portion of the plurality of areas can be scrolled in the window and the method further comprises: - detecting a display state of the application in which a scrollable part of the window that is touched by a user cannot be scrolled in a direction similar to the particular direction of the swipe, wherein the repositioning action is performed after detection of the touch pattern and the display state.
  11. The method according to claim 9 or 10, wherein the touch pattern comprises touching a region in the application window for a predefined period of time before the swipe in the determined direction.
  12. The method of any one of claims 8-11, wherein the portion of the display area is a predefined portion with predefined second dimensions.
  13. A method according to any one of claims 8-12, wherein the application window uses at least 90% of the display area, more specifically 95% of the display area, and uses no more than 50% of the display area after repositioning.
  14. A method according to any of claims 8-13, wherein the method further comprises: - detecting another touch pattern with predefined characteristics on the touch screen which touch pattern starts in the application window; - restoring the application window in response to the detection of the other touch pattern with the predefined characteristics so that the application window acquires the first dimensions.
  15. A method according to any of claims 8-14, wherein the method can be performed by holding the processing device in a hand and a finger of that one hand is used to generate the touch pattern on the touch screen.
  16. A computer program product readable by a processing device and comprising computer program code means for configuring the processing device to perform the method according to any of claims 8-14.
NL2011505A 2013-09-26 2013-09-26 A processing device and method of manipulating a window of an application. NL2011505C2 (en)

PCT/NL2014/050662 WO2015047093A1 (en) 2013-09-26 2014-09-26 A processing device and method of manipulating a window of an application.

Publications (1)

Publication Number Publication Date
NL2011505C2 2015-03-30


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060080621A1 (en) * 2004-10-13 2006-04-13 Samsung Electronics Co., Ltd. Method of controlling location of display window on display screen of information processing device and apparatus using the method
US20110296329A1 (en) * 2010-05-28 2011-12-01 Kabushiki Kaisha Toshiba Electronic apparatus and display control method
US20130097556A1 (en) * 2011-10-15 2013-04-18 John O. Louch Device, Method, and Graphical User Interface for Controlling Display of Application Windows

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110304584A1 (en) * 2009-02-23 2011-12-15 Sung Jae Hwang Touch screen control method and touch screen device using the same


Also Published As

Publication number Publication date
WO2015047093A1 (en) 2015-04-02

Similar Documents

Publication Publication Date Title
EP3120230B1 (en) Devices and methods for navigating between user interfaces
US10019096B1 (en) Gestures and touches on force-sensitive input devices
US10346030B2 (en) Devices and methods for navigating between user interfaces
US9639258B2 (en) Manipulation of list on a multi-touch display
US20160062467A1 (en) Touch screen control
US9250729B2 (en) Method for manipulating a plurality of non-selected graphical user elements
TWI617953B (en) Multi-task switching method, system and electronic device for touching interface
JP6009454B2 (en) Enhanced interpretation of input events that occur when interacting with a computing device that uses the motion of the computing device
US10331313B2 (en) Method and apparatus for text selection
US10353570B1 (en) Thumb touch interface
US8930852B2 (en) Touch screen folder control
US10275145B2 (en) Drawing support tool
US9851883B2 (en) Method and apparatus for adjusting and moving a user interface for single handed use on an endpoint device
US8543934B1 (en) Method and apparatus for text selection
KR101384857B1 (en) User interface methods providing continuous zoom functionality
US9430139B2 (en) Information processing apparatus, information processing method, and program
WO2013094371A1 (en) Display control device, display control method, and computer program
EP2715491B1 (en) Edge gesture
US8994674B2 (en) Information viewing apparatus, control program and controlling method
US20160202865A1 (en) Coordination of static backgrounds and rubberbanding
US10042546B2 (en) Systems and methods to present multiple frames on a touch screen
US10318146B2 (en) Control area for a touch screen
US8976140B2 (en) Touch input processor, information processor, and touch input control method
CN104285195B (en) Overscanning display device and the method using the overscanning display device
US20140359528A1 (en) Method and apparatus of controlling an interface based on touch operations

Legal Events

Date Code Title Description
PD Change of ownership

Owner name: SEOKHWAN JEONG; NL

Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), ASSIGNMENT; FORMER OWNER NAME: PIIT GROUP B.V.

Effective date: 20190905