US20170344202A1 - Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application - Google Patents
- Publication number
- US20170344202A1 (application Ser. No. 15/676,563)
- Authority
- US
- United States
- Prior art keywords
- main menu
- touch
- touch area
- application
- instruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the present invention relates to computer and embedding technologies, and in particular to methods and apparatuses for window display, and methods and apparatuses for touch-operating an application.
- a terminal having a touch screen, such as a personal computer (PC), a tablet computer (Pad) or a mobile terminal, will display a window when a user clicks an icon or triggers an application on the terminal:
- the inventor of the present application has found several problems with these conventional display methods.
- the window cannot be displayed at a position corresponding to the user. Consequently, the user may not directly face the displayed window and has to move, which is inconvenient to the user.
- Most such terminals are designed for use by a single user, and are thus inconvenient when used by more than one user.
- Some of these conventional techniques support a two-point touch operation mode, which is developed from the existing windows interactive operation mode. However, these techniques support usage by only a single user, and applications are executed in a single-task mode; no interactive mode is enabled for simultaneous use by several users. Further, the software supports display in only a single orientation, and thus the user can use the terminal along only a single orientation. Other conventional techniques support a multi-touch mode, in which the user can perform touch operations at upper and lower positions of the screen. However, these techniques allow the user to use the terminal in just two orientations, which are fixed and unchangeable.
- Embodiments of the present disclosure provide methods and apparatuses for window display, which enable displaying a window in an appropriate position for convenient use.
- a method for displaying a window on a data processing terminal, comprising:
- the at least one menu item is arranged in a loop to form the main menu.
- a position for displaying the first window is determined based on the position of the first menu item relative to the main menu.
- the first window is determined to be positioned in an extended part of a line connecting a center of the main menu and the first menu item based on the position of the first menu item relative to the main menu.
- an orientation for displaying the first menu item is a first orientation
- an orientation for displaying the first window is a second orientation
- the first orientation is the same as the second orientation
- the at least one menu item is arranged in a loop to form the main menu, which has the shape of a circle or a regular polygon.
- the first window is determined to be positioned in a radial orientation of the main menu.
- the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges.
- determining a first one of the edges which is closest to the user, and determining the first window to be at a position having the shortest distance from the first edge.
- the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges.
- determining a first one of the edges which is closest to the user, and determining the first window to be displayed in an orientation perpendicular to the first edge.
- the first window is displayed in an orientation perpendicular to a line connecting the eyes of a user in front of any of the menu items.
- An apparatus for displaying a window on a data processing terminal, comprising:
- a first display module configured to display a main menu including at least one menu item
- a first determination module configured to determine a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item
- a second determination module configured to determine a position parameter and/or an orientation parameter for the first window
- a second display module configured to display the first window according to the position parameter and/or the orientation parameter.
- the at least one menu item is arranged in a loop to form the main menu.
- the second determination module is further configured to determine a position for displaying the first window based on the position of the first menu item relative to the main menu.
- the second determination module is further configured to determine the first window to be positioned in an extended part of a line connecting a center of the main menu and the first menu item based on the position of the first menu item relative to the main menu.
- an orientation for displaying the first menu item is a first orientation
- an orientation for displaying the first window is a second orientation
- the first orientation is the same as the second orientation
- the at least one menu item is arranged in a loop to form the main menu, which has the shape of a circle or a regular polygon.
- the second determination module is further configured to determine the first window to be positioned in a radial orientation of the main menu.
- the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges.
- the second determination module is further configured to determine a first one of the edges which is closest to the user when it is determined that the terminal is being used by a user, and determine the first window to be at a position having the shortest distance from the first edge.
- the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges.
- the second determination module is further configured to determine a first one of the edges which is closest to the user when it is determined that the terminal is being used by a user, and to determine the first window to be displayed in an orientation perpendicular to the first edge.
- the first window is displayed in an orientation perpendicular to a line connecting the eyes of a user in front of any of the menu items.
- the method for displaying a window displays a main menu including at least one menu item, determines a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item, determines a position parameter and/or an orientation parameter for the first window, and displays the first window according to the position parameter and/or the orientation parameter.
- a method for touch-operating an application in a data processing terminal, wherein the data processing terminal comprises a screen having a multi-touch function and displays in operation at least a main menu on the screen, the main menu comprises M touch areas each corresponding to an application, and graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1; the method comprises:
- each of the M touch areas of the main menu is, at both of its sides, adjacent to another of the touch areas.
- the main menu is of a circle, ellipse or regular polygon shape. If the main menu is of a circle or regular polygon shape, the positive orientation is a radial orientation of the main menu.
- generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu comprises:
- the regions of the screen corresponding to the respective M touch areas are equal or different in size.
- the regions of the screen occupied by the applications are equal or different in size.
- At least one of the M touch areas has at least one corresponding sub touch area
- the method further comprises:
- An apparatus for touch-operating an application in a data processing terminal, wherein the data processing terminal comprises a screen having a multi-touch function and displays in operation at least a main menu on the screen, the main menu comprises M touch areas each corresponding to an application, and graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1; the apparatus comprises:
- a first generation module configured to generate an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu
- an adjustment module configured to adjust, for each of the M touch areas, the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position;
- a second generation module configured to generate an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area
- an operation module configured to enable the application corresponding to the first touch area based on the application enabling instruction.
- each of the M touch areas of the main menu is, at both of its sides, adjacent to another of the touch areas.
- the main menu is of a circle, ellipse or regular polygon shape, or an irregular pattern having only one central region. If the main menu is of a circle or regular polygon shape, the positive orientation is a radial orientation of the main menu.
- the first generation module is further configured to:
- the regions of the screen corresponding to the respective M touch areas are equal or different in size.
- the regions of the screen occupied by the applications are equal or different in size.
- At least one of the M touch areas has at least one corresponding sub touch area
- the second generation module is further configured to generate a sub touch area enabling instruction based on a second input operation on the touch area having the corresponding sub touch area;
- the operation module is further configured to enable the sub touch area corresponding to the touch area based on the sub touch area enabling instruction.
- the method for touch-operating an application comprises: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu; for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position; generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and enabling the application corresponding to the first touch area based on the application enabling instruction.
- the main menu can be rotated and operated simultaneously by multiple users, so that each user can always directly face the main menu, which improves convenience.
- FIG. 1 is a schematic block diagram showing the structure of an apparatus for displaying a window according to an embodiment of the present invention
- FIG. 2 shows a schematic diagram of a circular main menu according to an embodiment of the present invention
- FIG. 3 is a schematic flowchart showing a method for displaying a window according to an embodiment of the present invention
- FIG. 4 is a schematic block diagram showing the structure of an apparatus for touch-operating an application according to an embodiment of the present invention
- FIG. 5 shows a schematic diagram of a circular main menu according to an embodiment of the present invention.
- FIG. 6 is a schematic flowchart showing a method for touch-operating an application according to an embodiment of the present invention.
- the method for displaying a window displays a main menu including at least one menu item, determines a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item, determines a position parameter and/or an orientation parameter for the first window, and displays the first window according to the position parameter and/or the orientation parameter.
- an apparatus for displaying a window may include a first display module 101, a first determination module 102, a second determination module 103, and a second display module 104.
- the apparatus may be applied to a data processing terminal.
- the first display module 101 may be configured to display a main menu including at least one menu item.
- the first display module 101 may display at least a main menu on a screen of a data processing terminal during the operation of the terminal.
- the main menu may include M menu items arranged in a loop to form the main menu.
- Each of the M menu items may be, at either of its sides, adjacent to another of the M menu items.
- At least one of the M menu items may correspond to an application, and/or correspond to a sub menu item.
- M may be an integer not less than 1.
- Each of the M menu items in the main menu may correspond to a different type of application.
- the user may initiate an application, such as playing back videos or viewing photos, after performing an input operation on the corresponding menu item.
- At least one of the M menu items may correspond to a sub menu item.
- the sub menu item may preferably be fan-shaped. If the main menu can be rotated or moved, the corresponding sub menu item will be rotated or moved along with the rotation or movement of the main menu.
- Each menu item or sub menu item may have corresponding texts and/or graphics placed thereon for explaining content corresponding to the menu item or the sub menu item.
- a menu item may be “video,” and the sub menu items under the menu item may be classified into different categories, such as “entertainment video,” “sports video” and the like.
- the sub menu items under the menu item may be displayed when the user clicks on the menu item. The user may initiate the desired application by clicking on the corresponding sub menu item.
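As a minimal sketch of the menu hierarchy described above (the dictionary layout, the `menu` and `on_click` names, and the return tuples are illustrative assumptions, not identifiers from the patent):

```python
# Hypothetical model: each menu item carries either sub menu items or an
# application to initiate, mirroring the "video" example above.
menu = {
    "video": {"sub_items": ["entertainment video", "sports video"], "app": None},
    "photos": {"sub_items": [], "app": "photo_viewer"},
}

def on_click(item):
    """Clicking an item that has sub menu items displays them; clicking a
    leaf item initiates its corresponding application."""
    entry = menu[item]
    if entry["sub_items"]:
        return ("show_sub_items", entry["sub_items"])
    return ("launch_app", entry["app"])

print(on_click("video"))   # shows the sub menu items of "video"
print(on_click("photos"))  # launches the photo application
```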
- the embodiment of the present invention may allow the user to use several applications simultaneously, and also allow multiple users to operate on the screen.
- the main menu may be in different shapes, such as circle, polygon or ellipse.
- the main menu may be rotatable to allow a user to rotate it clockwise or counter-clockwise at any angle.
- the graphics and/or texts on the main menu may be rotated, along with the rotation of the main menu, to a position desirable by the user.
- the main menu may be an enclosed pattern.
- the main menu is in a circle shape and partitioned into 6 touch areas.
- Each of the touch areas may represent a different type of application.
- Graphics and/or texts for each touch area may be arranged in an orientation perpendicular to the tangent of the circle. No matter along which orientation a user operates the main menu on the screen, the graphics and/or texts on the main menu closest to the user may always directly face the user. This guarantees that the user may always perform operations, such as selection, in a positive orientation, and thus improves convenience.
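The tangent-perpendicular arrangement can be sketched as follows; `touch_area_angles` is a hypothetical helper, and equal-sized sectors are an assumption:

```python
def touch_area_angles(m, menu_rotation_deg=0.0):
    """For a circular main menu split into m equal touch areas, return the
    angular center of each area in degrees. Drawing each label rotated by
    this angle keeps it perpendicular to the circle's tangent, i.e. readable
    along the radius, whatever the menu's current rotation."""
    sector = 360.0 / m
    return [(menu_rotation_deg + sector * i + sector / 2) % 360 for i in range(m)]

# Six touch areas, no rotation: label centers at 30, 90, ..., 330 degrees.
print(touch_area_angles(6))  # [30.0, 90.0, 150.0, 210.0, 270.0, 330.0]
```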
- the first determination module 102 may be configured to determine a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item. When the user performs the first input operation on the first menu item in the main menu, the first determination module 102 may determine the first window corresponding to the first menu item.
- the first window may represent an application or sub menu items. In an embodiment, the size of the first window may be smaller than that of the display region of the terminal screen.
- the second determination module 103 may be configured to determine a position parameter and/or an orientation parameter for the first window. After the first determination module 102 determines the first window corresponding to the first menu item, the second determination module 103 may determine a corresponding position parameter and/or orientation parameter for the first window.
- the second determination module 103 may determine the first window to be positioned in an extended part of a line connecting a center of the main menu and the first menu item based on the position of the first menu item relative to the main menu, as shown in FIG. 2 in which the dotted line denotes the extended part of the line connecting the center of the main menu and the first menu item.
- the orientation for displaying the first menu item is a first orientation
- the orientation for displaying the first window is a second orientation
- the first orientation is the same as the second orientation.
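The extended-line placement can be sketched as a small geometry helper (the function name and the pixel `offset` parameter are assumptions for illustration):

```python
import math

def window_position(menu_center, item_center, offset):
    """Place the window on the extension of the line from the menu center
    through the selected menu item, 'offset' pixels beyond the item."""
    cx, cy = menu_center
    ix, iy = item_center
    dx, dy = ix - cx, iy - cy
    dist = math.hypot(dx, dy)
    if dist == 0:
        return item_center  # degenerate case: item sits at the menu center
    ux, uy = dx / dist, dy / dist  # unit vector from center through item
    return (ix + ux * offset, iy + uy * offset)

# Item directly right of the center: the window appears further right.
print(window_position((0, 0), (100, 0), 50))  # (150.0, 0.0)
```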
- the second determination module 103 may determine the first window to be positioned in a radial orientation of the main menu.
- a radial line of the main menu may have its endpoint at the center of the main menu.
- the main menu may be rotatable or not rotatable.
- the terminal may include at least three edges.
- the second determination module 103 may perform detection, such as detection at regular intervals, periodic detection or random detection, on each of the edges to determine whether the terminal is being used by a user.
- the second determination module 103 may determine a first one of the edges which is closest to the user, and then determine the first window to be at a position having the shortest distance from the first edge. That is, the first window is placed at a position having the shortest distance from the user so that it is more convenient for the user to view the first window.
- the second determination module 103 may determine the first window to be displayed in an orientation perpendicular to the first edge, that is, placing the first window at a position directly facing the user.
- the first window for any menu item may be displayed in an orientation perpendicular to a line connecting the eyes of the user operating on the menu item, and the first window may be displayed in an orientation away from the user.
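One possible sketch of the closest-edge logic above, assuming the terminal outline is given as an ordered polygon and the user's position comes from the edge detection described earlier (the function name and return convention are illustrative):

```python
import math

def nearest_edge(vertices, user):
    """Given the terminal's outline as an ordered list of vertices and the
    user's detected position, return the index of the closest edge and an
    angle (degrees) perpendicular to that edge, which can be used to orient
    the first window toward the user."""
    best = None
    n = len(vertices)
    for i in range(n):
        (x1, y1), (x2, y2) = vertices[i], vertices[(i + 1) % n]
        ex, ey = x2 - x1, y2 - y1
        # Project the user's position onto the edge segment (clamped).
        t = max(0.0, min(1.0, ((user[0] - x1) * ex + (user[1] - y1) * ey) / (ex * ex + ey * ey)))
        px, py = x1 + t * ex, y1 + t * ey
        d = math.hypot(user[0] - px, user[1] - py)
        if best is None or d < best[0]:
            normal_deg = math.degrees(math.atan2(ex, -ey))  # perpendicular to edge
            best = (d, i, normal_deg)
    return best[1], best[2]

# Square screen; user just below the bottom edge (edge 0 from (0,0) to (10,0)):
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(nearest_edge(square, (5, -1)))  # edge 0, window oriented at 90 degrees
```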
- the second display module 104 may be configured to display the first window according to the position parameter and/or the orientation parameter. After the second determination module 103 determines the display position and orientation for the first window, the second display module 104 may display the first window according to the determined display position and orientation.
- the user may rotate a menu item in the main menu to a positive orientation with respect to the user before operating on the menu item.
- the second determination module 103 may determine a positive orientation for a sub menu item or an application with respect to the user, based on a position and/or orientation parameter of the user, so that the application or sub menu item may be displayed in a positive orientation with respect to the user and is thus convenient for the user to use. If the user performs an input operation on the first menu item displayed in the positive orientation with respect to the user, the first determination module 102 may also display the sub menu item or the application to be enabled according to the position and/or orientation parameter of the first menu item.
- the first menu item to be operated may not be in the positive orientation with respect to the user.
- the first determination module 102 may determine a first window corresponding to the first menu item
- the second determination module 103 may determine a position parameter and/or an orientation parameter corresponding to the first window
- the second display module 104 may display the sub menu item or application to be enabled in an orientation as close as possible to the positive orientation with respect to the user, based on the position parameter and/or orientation parameter determined by the second determination module 103 .
- the positive orientation with respect to the user in the embodiment may refer to an orientation perpendicular to a line connecting the eyes of the user in front of an input area closest to the user.
- the first window may be displayed in an orientation perpendicular to a line connecting the eyes of the user in front of any of the menu items.
- the flow of the method for displaying a window in an embodiment is shown in FIG. 3 .
- the method may be applicable in a data processing terminal.
- Step 301 displaying a main menu including at least one menu item;
- Step 302 determining a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item;
- Step 303 determining a position parameter and/or an orientation parameter for the first window.
- Step 304 displaying the first window according to the position parameter and/or the orientation parameter.
- the method for displaying a window displays a main menu including at least one menu item, determines a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item, determines a position parameter and/or an orientation parameter for the first window, and displays the first window according to the position parameter and/or the orientation parameter.
- a sub menu item or an application may be displayed according to information about position and/or orientation of the user performing the first input operation, so that the sub menu item or application may be displayed in the positive orientation with respect to the user, even though the main menu is not rotatable or the user does not rotate the main menu in advance. This facilitates the user's operation, and improves intelligence, user-friendliness and flexibility of a device.
- a method for touch-operating an application may include: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu; for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position; generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and enabling the application corresponding to the first touch area based on the application enabling instruction.
- the main menu can be rotated and operated simultaneously by multiple users, so that each user can always directly face the main menu, which improves convenience.
- an apparatus for touch-operating an application may include a first generation module 401, an adjustment module 402, a second generation module 403 and an operation module 404.
- the apparatus may be applied to a data processing terminal equipped with a screen having a multi-touch function.
- the data processing terminal may display in operation at least a main menu on the screen.
- the main menu may include M touch areas each of which is, at both of its sides, adjacent to one of the touch areas.
- Each of the M touch areas may correspond to an application.
- Graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1.
- a positive orientation may refer to an orientation in which graphics and/or texts in the touch areas face directly the user.
- the positive orientation in which graphics and/or texts in any of the touch areas are displayed is an orientation perpendicular to a line connecting the eyes of the user in front of the touch area.
- the main menu may be an enclosed pattern. A radial line of the main menu may have its endpoint at the center of the main menu.
- the first generation module 401 may be configured to generate an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu.
- the user may perform the first input operation on a region of the screen corresponding to the main menu.
- the first generation module 401 may generate an operation instruction for the main menu based on a rotation operation with a radian on a region of the screen corresponding to the main menu; or based on a moving operation along a line on a region of the screen corresponding to the main menu; or based on a click operation on an initial position of any of the M touch areas and a click operation on a target position on a region of the screen corresponding to the main menu.
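The rotation case can be sketched as follows, assuming the touch-down and lift-off points of a drag on the menu region are available (the function name and sign convention are illustrative assumptions):

```python
import math

def rotation_instruction(center, start, end):
    """Turn a one-finger drag on the menu region into a rotation instruction:
    the signed angle (degrees) swept around the menu center from the
    touch-down point to the lift-off point."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    deg = math.degrees(a1 - a0)
    return (deg + 180) % 360 - 180  # normalize to [-180, 180)

# Drag from "3 o'clock" to "12 o'clock" around the origin: a +90 degree turn.
print(rotation_instruction((0, 0), (100, 0), (0, 100)))
```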
- the adjustment module 402 may be configured to adjust, for each of the M touch areas, the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas may be different between the first position and the second position.
- the first position of the first touch area is further away from the user than the second position of the first touch area.
- the adjustment module 402 may, based on the operation instruction, move the desired first touch area in the main menu to a position closer to the user.
- Each of the M touch areas in the main menu may correspond to a different type of application, and thus a type of application, such as playing back video or viewing photos, may be initiated when the user clicks on the corresponding touch area.
- at least one of the M touch areas may correspond to at least one sub touch area.
- the sub touch area may be in a shape of fan.
- if the main menu is rotated or moved, the sub touch area may be rotated or moved along with the main menu.
- one of the touch areas may correspond to “video,” and the sub touch areas under the touch area may correspond to different categories, such as “entertainment video” or “sports video.” If a touch area has corresponding sub touch areas, these sub touch areas may be displayed when the user clicks on the touch area. Then, the user may initiate the desired application by clicking on one or more of the sub touch areas.
- the main menu may be rotatable.
- the main menu may be in different shapes, such as circle, polygon or ellipse.
- the main menu may be rotated clockwise or counter-clockwise at any angle.
- graphics and/or texts on the main menu may be rotated along with the main menu to a position as expected by the user.
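Rotating the graphics along with the menu is plain 2-D rotation about the menu center; a sketch under the assumption of counter-clockwise-positive angles:

```python
import math

def rotate_about(center, point, angle_deg):
    """New position of a graphic on the menu after the menu is rotated by
    angle_deg about its center (counter-clockwise positive)."""
    a = math.radians(angle_deg)
    dx, dy = point[0] - center[0], point[1] - center[1]
    return (center[0] + dx * math.cos(a) - dy * math.sin(a),
            center[1] + dx * math.sin(a) + dy * math.cos(a))

# A label at (100, 0) moves to approximately (0, 100) after a 90-degree
# rotation about the origin.
print(rotate_about((0, 0), (100, 0), 90.0))
```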
- the positive orientation may refer to a radial orientation of the main menu in embodiments of the present invention.
- FIG. 5 shows an example of the main menu.
- the main menu is shaped as a circle and partitioned into 6 touch areas. Each of the touch areas may represent a type of application, and graphics and/or texts in each touch area may be displayed in an orientation perpendicular to the tangent of the circle. No matter in which orientation the main menu is operated by the user, graphics and/or texts on the main menu closest to the user may always directly face the user. This guarantees that the user may always operate the main menu in a positive orientation, which is convenient for the user's operation.
- the sub touch areas may be scaled up or down.
- the sub touch area may be scaled up or down by the user's manual operation.
- the user may separate his or her two fingers in a direction of increasing the distance between the two fingers while touching one of the sub touch areas, and accordingly the sub touch area may be increased in size.
- the user may also pinch his or her two fingers in a direction of decreasing the distance between the two fingers while touching one of the sub touch areas, and accordingly the sub touch area may be decreased in size.
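The two-finger scaling can be sketched as a distance ratio; the clamping limits `lo` and `hi` are assumptions, not values from the patent:

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end, lo=0.5, hi=3.0):
    """Scale factor for a sub touch area from a two-finger gesture: the ratio
    of the final finger distance to the initial one, clamped so the area can
    neither collapse nor grow without bound."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_end, p2_end)
    if d0 == 0:
        return 1.0  # degenerate start: leave the area unchanged
    return max(lo, min(hi, d1 / d0))

# Fingers spread from 100 px apart to 150 px apart: the area scales by 1.5x.
print(pinch_scale((0, 0), (100, 0), (0, 0), (150, 0)))  # 1.5
```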
- the respective M touch areas may correspond to display regions on the screen which are equal or different in size.
- the sizes of these display regions may be adjustable.
- the first touch area may correspond to 3 sub touch areas, and their display regions on the screen may be equal or different in size.
- the sizes of the display regions may be adjustable.
- the first touch area may correspond to one sub touch area, and the second touch area may correspond to 3 sub touch areas.
- the display regions on the screen corresponding to the four sub touch areas may be equal or different in size.
- the sizes of the display regions may be adjustable.
- the second generation module 403 may be configured to generate an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area.
- the user may operate on the first touch area, such as clicking on the touch area.
- the second generation module 403 may generate an application enabling instruction when receiving the clicking operation.
- the second generation module 403 may be further configured to generate a sub touch area enabling instruction based on a second input operation on the touch area having the corresponding sub touch area.
- the second input operation may be a clicking operation.
- the second generation module 403 receives the click operation by the user, and generates a sub touch area enabling instruction.
- the operation module 404 is configured to enable the application corresponding to the first touch area based on the application enabling instruction. When receiving the application enabling instruction, the operation module 404 may enable the application corresponding to the first touch area.
- the application enabling instruction may carry a touch area identifier so that the operation module 404 may enable an application corresponding to the touch area as identified by the touch area identifier.
- the operation module 404 may automatically adjust the size of the screen region occupied by each application, if necessary. For example, one user may be watching a video while another user is listening to music. The video application may require a larger region of the screen, and accordingly the operation module 404 may increase the size of the screen region occupied by the video application. Alternatively, the user may perform manual adjustment. For example, an application may be embedded with buttons for scaling up or down, and the user may perform adjustment by clicking on these buttons. When applications corresponding to the respective M touch areas are enabled, these applications may occupy regions of the screen having equal or different sizes. The size of each region may be adjustable.
- the operation module 404 may be further configured to enable the sub touch area corresponding to the touch area based on the sub touch area enabling instruction.
- When receiving the sub touch area enabling instruction, the operation module 404 may enable the sub touch area corresponding to the touch area.
- the sub touch area enabling instruction may carry a touch area identifier, and the operation module 404 may enable a sub touch area corresponding to the touch area as identified by the touch area identifier.
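The identifier-carrying enabling instructions described in the passages above can be sketched as a simple lookup from touch area identifier to the action that enables the corresponding application or sub touch area (the function name, instruction format and registry are illustrative assumptions):

```python
def enable_by_identifier(instruction, registry):
    """Enable the application (or sub touch area) named by the touch
    area identifier carried in an enabling instruction. 'registry'
    maps identifiers to callables that enable the target."""
    area_id = instruction["touch_area_id"]
    if area_id not in registry:
        raise KeyError(f"no application registered for area {area_id}")
    return registry[area_id]()
```

This mirrors the operation module 404 behavior: the instruction carries only an identifier, and the module resolves it to the concrete application to enable.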
- The flow of the method for touch-operating an application in an embodiment is shown in FIG. 6.
- Step 601: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu;
- Step 602: for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position;
- Step 603: generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area;
- Step 604: enabling the application corresponding to the first touch area based on the application enabling instruction.
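Steps 601 through 604 can be sketched as a small event handler; the menu class, event format and gesture names below are illustrative assumptions rather than the claimed implementation:

```python
class MainMenu:
    """Minimal stand-in for a circular main menu with m touch areas
    (illustrative only; a real terminal would draw on the screen)."""

    def __init__(self, m):
        self.m = m
        self.rotation = 0.0   # current rotation of the menu, degrees
        self.enabled = []     # applications enabled so far

    def area_at(self, pos):
        # Map an angular position (degrees) to one of the m touch
        # areas, taking the menu's current rotation into account.
        return int(((pos - self.rotation) % 360.0) // (360.0 / self.m))

    def rotate(self, angle):
        self.rotation = (self.rotation + angle) % 360.0

    def launch(self, area):
        self.enabled.append(area)


def handle_touch_event(menu, event):
    """Dispatch a touch event per steps 601-604: a drag generates an
    operation instruction and the menu is redrawn at its new position
    (601-602); a tap on a touch area generates an application enabling
    instruction and the application is enabled (603-604)."""
    if event["type"] == "drag":
        menu.rotate(event["angle"])        # steps 601-602
    elif event["type"] == "tap":
        area = menu.area_at(event["pos"])  # step 603
        menu.launch(area)                  # step 604
```

After a 60-degree drag, a tap lands on whichever touch area has rotated under the tapped position, which is exactly the adjust-then-enable ordering of the four steps.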
- the method may be applied to a data processing terminal equipped with a screen having a multi-touch function.
- the data processing terminal may display in operation at least a main menu on the screen.
- the main menu may include M touch areas each corresponding to an application. Graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1.
- the method for touch-operating an application comprises: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu; for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position; generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and enabling the application corresponding to the first touch area based on the application enabling instruction.
- the main menu can be rotated and operated simultaneously by multiple users. It is possible for the users to always directly face the main menu, thus improving convenience. Regions of the screen occupied by the touch areas or applications may be adjusted in size as required.
- the main menu may be presented in different shapes. This also improves convenience and flexibility.
- embodiments of the present invention may be provided as methods, systems or computer program products, and thus may take the form of hardware, software or a combination thereof.
- Embodiments of the present invention may also be implemented as a computer program product in the form of one or more computer readable storage media (including but not limited to disk memories and optical memories) containing computer readable program code.
- The present invention has been described with reference to flowcharts and/or block diagrams of methods, apparatuses (systems) and computer program products according to embodiments of the present invention. It will be appreciated that each step and/or block in the flowcharts and/or block diagrams, or a combination thereof, may be implemented by computer program instructions.
- the computer program instructions may be loaded onto a processor of a general computer, a dedicated computer, an embedded processing device or any other programmable data processing device to generate a machine, so that the instructions, when executed by the processor of the computer or other programmable data processing device, generate means for performing the functions specified in one or more steps and/or blocks in the flowcharts and/or block diagrams.
- These computer program instructions may also be stored in a computer readable memory that may direct computers or other programmable data processing devices to operate in a specific manner, so that the instructions stored in the computer readable memory may generate an article of manufacture containing instruction means which perform the functions specified in one or more steps and/or blocks in the flowcharts and/or block diagrams.
- These computer program instructions may also be loaded onto computers or other programmable data processing devices, so that the computers or other programmable data processing devices perform a sequence of operation steps for computer-implemented processing.
- When executed on the computers or other programmable data processing devices, the instructions provide steps for performing the functions specified in one or more steps and/or blocks in the flowcharts and/or block diagrams.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application is a divisional of U.S. application Ser. No. 13/666,998 filed Nov. 2, 2012 for “METHODS AND APPARATUSES FOR WINDOW DISPLAY, AND METHODS AND APPARATUSES FOR TOUCH-OPERATING AN APPLICATION” by Lu Lu, Yu Chen, Jun Li, Xin Li, and Shuangxi Huang, which claims the benefit of Chinese Application Nos. 201110342315.0, filed Nov. 2, 2011, and 201110421915.6, filed Dec. 15, 2011. U.S. application Ser. No. 13/666,998 is hereby incorporated by reference in its entirety.
- The present invention relates to computer and embedded technologies, and in particular to methods and apparatuses for window display, and methods and apparatuses for touch-operating an application.
- Conventionally, a terminal having a touch screen, such as a personal computer (PC), a tablet computer (Pad) or a mobile terminal, will display a window in one of the following manners when a user clicks an icon or triggers an application on the terminal:
- 1. displaying the window at a preset physical position on the screen, such as the upper right corner or center of the screen;
- 2. displaying the window at a position where the window was previously closed.
- The inventor of the application has found that there are some problems with the conventional display methods. The window cannot be displayed at a position corresponding to the user. Consequently, the user may not directly face the displayed window and may have to move, which is inconvenient. Most such terminals are designed for use by a single user, and are thus inconvenient when used by more than one user.
- There are further problems with some conventional techniques. Some of these conventional techniques support a two-point touch operation mode, which is developed from the existing windows interactive operation mode. However, these techniques support only usage by a single user, and applications are executed in a single task mode. No interactive mode is enabled for simultaneous use by several users. Further, the software supports display in only a single orientation, and thus the user can only use the terminal along a single orientation. Other conventional techniques support a multi-touch mode, and the user can perform touch operations at upper and lower positions of the screen. However, these techniques allow the user to use the terminal in just two orientations, which are fixed and unchangeable.
- Embodiments of the present disclosure provide methods and apparatuses for window display, which enable displaying a window in an appropriate position for convenient use.
- A method for displaying a window on a data processing terminal, the method comprising:
- displaying a main menu including at least one menu item;
- determining a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item;
- determining a position parameter and/or an orientation parameter for the first window; and
- displaying the first window according to the position parameter and/or the orientation parameter.
- In an embodiment, the at least one menu item is arranged in a loop to form the main menu. A position for displaying the first window is determined based on the position of the first menu item relative to the main menu.
- In an embodiment, the first window is determined to be positioned in an extended part of a line connecting a center of the main menu and the first menu item based on the position of the first menu item relative to the main menu.
- In an embodiment, an orientation for displaying the first menu item is a first orientation, an orientation for displaying the first window is a second orientation, and the first orientation is the same as the second orientation.
- In an embodiment, the at least one menu item is arranged in a loop to form the main menu, which has the shape of a circle or regular polygon. The first window is determined to be positioned in a radial orientation of the main menu.
- In an embodiment, the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges. When it is determined that the terminal is being used by a user, a first one of the edges closest to the user is determined, and the first window is determined to be at a position having the shortest distance from the first edge.
- In an embodiment, the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges. When it is determined that the terminal is being used by a user, a first one of the edges closest to the user is determined, and the first window is determined to be displayed in an orientation perpendicular to the first edge.
- In an embodiment, the first window is displayed in an orientation perpendicular to a line connecting the eyes of a user in front of any of the menu items.
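The radial placement in the embodiments above (positioning the first window on the extension of the line from the main menu's center through the first menu item, with matching orientation) can be sketched as follows; the function name and the offset parameter are assumptions:

```python
import math

def window_pose(menu_center, item_center, offset):
    """Place the first window on the extension of the line from the
    main menu's center through the first menu item, oriented along
    that radial direction. 'offset' is the assumed distance from the
    menu center to the window's anchor point."""
    cx, cy = menu_center
    ix, iy = item_center
    angle = math.atan2(iy - cy, ix - cx)      # radial direction
    # Window anchor: 'offset' pixels out along the radial line.
    wx = cx + offset * math.cos(angle)
    wy = cy + offset * math.sin(angle)
    # The window is drawn rotated by 'angle' so its orientation
    # matches the first menu item's orientation.
    return (wx, wy), math.degrees(angle)
```

Because the window inherits the menu item's radial angle, the first orientation and second orientation of the embodiments coincide, and the window opens facing the user who touched the item.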
- An apparatus for displaying a window on a data processing terminal, the apparatus comprising:
- a first display module configured to display a main menu including at least one menu item;
- a first determination module configured to determine a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item;
- a second determination module configured to determine a position parameter and/or an orientation parameter for the first window; and
- a second display module configured to display the first window according to the position parameter and/or the orientation parameter.
- In an embodiment, the at least one menu item is arranged in a loop to form the main menu. The second determination module is further configured to determine a position for displaying the first window based on the position of the first menu item relative to the main menu.
- In an embodiment, the second determination module is further configured to determine the first window to be positioned in an extended part of a line connecting a center of the main menu and the first menu item based on the position of the first menu item relative to the main menu.
- In an embodiment, an orientation for displaying the first menu item is a first orientation, an orientation for displaying the first window is a second orientation, and the first orientation is the same as the second orientation.
- In an embodiment, the at least one menu item is arranged in a loop to form the main menu, which has the shape of a circle or regular polygon. The second determination module is further configured to determine the first window to be positioned in a radial orientation of the main menu.
- In an embodiment, the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges. The second determination module is further configured to determine a first one of the edges which is closest to the user when it is determined that the terminal is being used by a user, and determine the first window to be at a position having the shortest distance from the first edge.
- In an embodiment, the terminal comprises at least three edges, and whether the terminal is being used by a user can be determined through detection on the edges. The second determination module is further configured to determine a first one of the edges which is closest to the user when it is determined that the terminal is being used by a user, and to determine the first window to be displayed in an orientation perpendicular to the first edge.
- In an embodiment, the first window is displayed in an orientation perpendicular to a line connecting the eyes of a user in front of any of the menu items.
- The method for displaying a window according to embodiments of the present invention displays a main menu including at least one menu item, determines a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item, determines a position parameter and/or an orientation parameter for the first window, and displays the first window according to the position parameter and/or the orientation parameter. By displaying a sub menu item or an application according to a position parameter and/or an orientation parameter for a first menu item or according to a position parameter and/or an orientation parameter for a user, it is possible to display a corresponding window in an appropriate position so that it is more convenient for the user to view or use. This improves the user experience and facilitates practical applications.
- A method for touch-operating an application in a data processing terminal, the data processing terminal comprises a screen having a multi-touch function, and displays in operation at least a main menu on the screen, the main menu comprises M touch areas each corresponding to an application, and graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1, the method comprises:
- generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu;
- for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position;
- generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and
- enabling the application corresponding to the first touch area based on the application enabling instruction.
- In an embodiment, each of the M touch areas of the main menu is, at both of its sides, adjacent to one of the touch areas.
- In an embodiment, the main menu is of a circle, ellipse or regular polygon shape. If the main menu is of a circle or regular polygon shape, the positive orientation is a radial orientation of the main menu.
- In an embodiment, generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu comprises:
- generating an operation instruction for the main menu based on a rotation operation with a radian on a region of the screen corresponding to the main menu; or
- generating an operation instruction for the main menu based on a moving operation along a line on a region of the screen corresponding to the main menu; or
- generating an operation instruction for the main menu based on a click operation on an initial position of any of the M touch areas and a click operation on a target position on a region of the screen corresponding to the main menu.
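Each of the three input operations above ultimately yields a rotation of the main menu about its center. A hedged sketch of deriving the operation instruction from a start and end touch position (the names and instruction format are illustrative assumptions):

```python
import math

def operation_instruction(touch_start, touch_end, menu_center):
    """Derive a main-menu operation instruction from a first input
    operation, as a rotation angle about the menu center. This covers
    the 'rotation with a radian' and 'moving along a line' cases; the
    click-on-initial-position/click-on-target-position case would pass
    the two clicked positions as touch_start and touch_end."""
    cx, cy = menu_center
    a_start = math.atan2(touch_start[1] - cy, touch_start[0] - cx)
    a_end = math.atan2(touch_end[1] - cy, touch_end[0] - cx)
    # Normalize to (-180, 180] so the menu takes the shorter way round.
    delta = math.degrees(a_end - a_start)
    delta = (delta + 180.0) % 360.0 - 180.0
    return {"op": "rotate_menu", "angle": delta}
```

A straight-line drag across the menu region still subtends an angle at the menu center, which is why the moving-operation case reduces to the same rotation instruction.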
- In an embodiment, the regions of the screen corresponding to the respective M touch areas are equal or different in size.
- In an embodiment, when applications corresponding to the respective M touch areas are enabled, the regions of the screen occupied by the applications are equal or different in size.
- In an embodiment, at least one of the M touch areas has at least one corresponding sub touch area;
- the method further comprises:
- generating a sub touch area enabling instruction based on a second input operation on the touch area having the corresponding sub touch area; and
- enabling the sub touch area corresponding to the touch area based on the sub touch area enabling instruction.
- An apparatus for touch-operating an application in a data processing terminal, the data processing terminal comprises a screen having a multi-touch function, and displays in operation at least a main menu on the screen, the main menu comprises M touch areas each corresponding to an application, and graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1, the apparatus comprises:
- a first generation module configured to generate an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu;
- an adjustment module configured to, for each of the M touch areas, adjust the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position;
- a second generation module configured to generate an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and
- an operation module configured to enable the application corresponding to the first touch area based on the application enabling instruction.
- In an embodiment, each of the M touch areas of the main menu is, at both of its sides, adjacent to one of the touch areas.
- In an embodiment, the main menu is of a circle, ellipse or regular polygon shape, or an irregular pattern having only one central region. If the main menu is of a circle or regular polygon shape, the positive orientation is a radial orientation of the main menu.
- In an embodiment, the first generation module is further configured to:
- generate an operation instruction for the main menu based on a rotation operation with a radian on a region of the screen corresponding to the main menu; or
- generate an operation instruction for the main menu based on a moving operation along a line on a region of the screen corresponding to the main menu; or
- generate an operation instruction for the main menu based on a click operation on an initial position of any of the M touch areas and a click operation on a target position on a region of the screen corresponding to the main menu.
- In an embodiment, the regions of the screen corresponding to the respective M touch areas are equal or different in size.
- In an embodiment, when applications corresponding to the respective M touch areas are enabled, the regions of the screen occupied by the applications are equal or different in size.
- In an embodiment, at least one of the M touch areas has at least one corresponding sub touch area;
- the second generation module is further configured to generate a sub touch area enabling instruction based on a second input operation on the touch area having the corresponding sub touch area; and
- the operation module is further configured to enable the sub touch area corresponding to the touch area based on the sub touch area enabling instruction.
- The method for touch-operating an application according to embodiments of the present invention comprises: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu; for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position; generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and enabling the application corresponding to the first touch area based on the application enabling instruction. According to the embodiments, the main menu can be rotated and operated simultaneously by multiple users. It is possible for the users to always directly face the main menu, and thus convenience is improved.
FIG. 1 is a schematic block diagram showing the structure of an apparatus for displaying a window according to an embodiment of the present invention; -
FIG. 2 shows a schematic diagram of a circular main menu according to an embodiment of the present invention; -
FIG. 3 is a schematic flowchart showing a method for displaying a window according to an embodiment of the present invention; -
FIG. 4 is a schematic block diagram showing the structure of an apparatus for touch-operating an application according to an embodiment of the present invention; -
FIG. 5 shows a schematic diagram of a circular main menu according to an embodiment of the present invention; and -
FIG. 6 is a schematic flowchart showing a method for touch-operating an application according to an embodiment of the present invention.
- The method for displaying a window according to embodiments of the present invention displays a main menu including at least one menu item, determines a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item, determines a position parameter and/or an orientation parameter for the first window, and displays the first window according to the position parameter and/or the orientation parameter. By displaying a sub menu item or an application according to a position parameter and/or an orientation parameter for a first menu item or according to a position parameter and/or an orientation parameter for a user, it is possible to display a corresponding window in an appropriate position so that it is more convenient for the user to view or use. This improves the user experience and facilitates practical applications.
- Referring to FIG. 1, an apparatus for displaying a window according to an embodiment of the present invention may include a first display module 101, a first determination module 102, a second determination module 103, and a second display module 104. The apparatus may be applied to a data processing terminal.
- The first display module 101 may be configured to display a main menu including at least one menu item. The first display module 101 may display at least a main menu on a screen of a data processing terminal during the operation of the terminal. The main menu may include M menu items arranged in a loop to form the main menu. Each of the M menu items may be, at either of its sides, adjacent to one of the M menu items. At least one of the M menu items may correspond to an application, and/or correspond to a sub menu item. M may be an integer not less than 1. Each of the M menu items in the main menu may correspond to a different type of application. The user may initiate an application, such as playing back videos or viewing photos, after performing an input operation on the corresponding menu item. Alternatively, at least one of the M menu items may correspond to a sub menu item. The sub menu item may preferably be in the shape of a fan. If the main menu can be rotated or moved, the corresponding sub menu item will be rotated or moved along with the rotation or movement of the main menu. Each menu item or sub menu item may have corresponding texts and/or graphics placed thereon for explaining content corresponding to the menu item or the sub menu item. For example, a menu item may be “video,” and the sub menu items under the menu item may be classified into different categories, such as “entertainment video,” “sports video” and the like. If a menu item corresponds to sub menu items, the sub menu items under the menu item may be displayed when the user clicks on the menu item. The user may initiate the desired application by clicking on the corresponding sub menu item. The embodiment of the present invention may allow the user to use several applications simultaneously, and also allow multiple users to operate on the screen.
- In an embodiment, the main menu may be in different shapes, such as circle, polygon or ellipse.
The main menu may be rotatable to allow a user to rotate it clockwise or counter-clockwise through any angle. The graphics and/or texts on the main menu may be rotated, along with the rotation of the main menu, to a position desired by the user. The main menu may be an enclosed pattern.
- In an example shown in
FIG. 2 , the main menu is in a circle shape and partitioned into 6 touch areas. Each of the touch areas may represent a different type of application. Graphics and/or texts for each touch area may be arranged in an orientation perpendicular to the tangent of the circle. No matter along which orientation a user operates the main menu on the screen, graphics and/or texts on the main menu closest to the user may always face directly the user. This guarantees that the user may always perform operations, such as selection, along a positive orientation, and thus improves convenience. - The
first determination module 102 may be configured to determine a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item. When the user performs the first input operation on the first menu item in the main menu, thefirst determination module 102 may determine the first window corresponding to the first menu item. The first window may represent an application or sub menu items. In an embodiment, the size of the first window may be smaller than that of the display region of the terminal screen. - The
second determination module 103 may be configured to determine a position parameter and/or an orientation parameter for the first window. After thefirst determination module 102 determines the first window corresponding to the first menu item, thesecond determination module 103 may determine a corresponding position parameter and/or orientation parameter for the first window. - For example, the
second determination module 103 may determine the first window to be positioned in an extended part of a line connecting a center of the main menu and the first menu item based on the position of the first menu item relative to the main menu, as shown inFIG. 2 in which the dotted line denotes the extended part of the line connecting the center of the main menu and the first menu item. The orientation for displaying the first menu item is a first orientation, the orientation for displaying the first window is a second orientation, and the first orientation are the same as the second orientation. - Alternatively, if the main menu is of a circle or regular polygon shape, the
second determination module 103 may determine the first window to be positioned in a radial orientation of the main menu. The radial of the main menu may have its endpoint as the center of the main menu. The main menu may be rotatable or not rotatable. - The terminal may include at least three edges. The
second determination module 103 may perform detection, such as detection at regular interval, periodical detection or random detection, on each of the edges to determine whether the terminal is being used by a user. When it is determined that the terminal is being used by a user, thesecond determination module 103 may determine a first one of the edges which is closest to the user, and then determine the first window to be at a position having the shortest distance from the first edge. That is, the first window is placed at a position having the shortest distance from the user so that it is more convenient for the user to view the first window. Alternatively, thesecond determination module 103 may determine the first window to be displayed in an orientation perpendicular to the first edge, that is, placing the first window at a position directly facing the user. Preferably, the first window for any menu item may be displayed in an orientation perpendicular to a line connecting the eyes of the user operating on the menu item, and the first window may be displayed in an orientation away from the user. - The
second display module 104 may be configured to display the first window according to the position parameter and/or the orientation parameter. After thesecond determination module 103 determines the display position and orientation for the first window, thesecond display module 104 may display the first window according to the determined display position and orientation. - If the main menu is rotatable, the user may rotate a menu item in the main menu to a positive orientation with respect to the user before operating on the menu item. The
second determination module 103 may determine a positive orientation for a sub menu item or an application with respect to the user, based on a position and/or orientation parameter of the user, so that the application or sub menu item may be displayed in a positive orientation with respect to the user, and thus convenient to use by the user. If the user performs an input operation on the first menu item displayed in the positive orientation with respect to the user, theoperation module 102 may also display the sub menu item or the application to be enabled according to the position and/or orientation parameter of the first menu item. - If the user does not first rotate the main menu, or if the main menu is not rotatable, the first menu item to be operated may be not in the positive orientation with respect to the user. In this case, when the user performs an input operation on the first menu item, the
first determination module 102 may determine a first window corresponding to the first menu item, the second determination module 103 may determine a position parameter and/or an orientation parameter corresponding to the first window, and the second display module 104 may display the sub menu item or application to be enabled in an orientation as close as possible to the positive orientation with respect to the user, based on the position parameter and/or orientation parameter determined by the second determination module 103. Here, the positive orientation with respect to the user in the embodiment may refer to an orientation perpendicular to a line connecting the eyes of the user in front of an input area closest to the user. In other words, the first window may be displayed in an orientation perpendicular to a line connecting the eyes of the user in front of any of the menu items. - Hereafter an example of a method for displaying a window will be described.
- The flow of the method for displaying a window in an embodiment is shown in
FIG. 3. The method may be applicable to a data processing terminal. - Step 301: displaying a main menu including at least one menu item;
- Step 302: determining a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item;
- Step 303: determining a position parameter and/or an orientation parameter for the first window; and
- Step 304: displaying the first window according to the position parameter and/or the orientation parameter.
- The method for displaying a window according to embodiments of the present invention displays a main menu including at least one menu item, determines a first window corresponding to a first menu item in the main menu based on a first input operation on the first menu item, determines a position parameter and/or an orientation parameter for the first window, and displays the first window according to the position parameter and/or the orientation parameter. By displaying a sub menu item or an application according to information about position and/or orientation for a first menu item, it is possible to display a corresponding window in an appropriate position so that it is more convenient for the user to view or use. This improves the user experience and facilitates practical application. Alternatively, a sub menu item or an application may be displayed according to information about position and/or orientation of the user performing the first input operation, so that the sub menu item or application may be displayed in the positive orientation with respect to the user, even though the main menu is not rotatable or the user does not rotate the main menu in advance. This facilitates the user's operation, and improves intelligence, user-friendliness and flexibility of a device.
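- As a concrete illustration of steps 301 to 304 above, the sketch below anchors the first window to the screen edge nearest the user and orients it to face that edge. The rectangular-screen model, the function names and the angle convention are assumptions made for illustration; they are not part of the patent's disclosure.

```python
# Illustrative sketch of steps 301-304: pick the screen edge nearest the
# user (the position parameter of step 303) and a rotation that keeps the
# window perpendicular to that edge, i.e. facing the user (the orientation
# parameter). All names and conventions here are assumptions.

def window_params(width, height, user_x, user_y):
    """Return ((x, y), rotation_degrees) for the first window."""
    distances = {"top": user_y, "bottom": height - user_y,
                 "left": user_x, "right": width - user_x}
    edge = min(distances, key=distances.get)   # edge closest to the user
    placements = {                             # midpoint of that edge, plus a
        "top":    ((width / 2, 0), 180),       # rotation so the window
        "bottom": ((width / 2, height), 0),    # directly faces the user
        "left":   ((0, height / 2), 90),
        "right":  ((width, height / 2), 270),
    }
    return placements[edge]

# Steps 301-304 end-to-end: menu shown, item operated, window placed.
windows = {"video": "video-submenu"}                 # steps 301/302 (assumed)
position, rotation = window_params(100, 60, 50, 58)  # step 303
# step 304 would render windows["video"] at `position` with `rotation`
```

A user standing near the bottom edge of a 100x60 screen thus gets the window at the bottom-edge midpoint with no rotation.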
- In another aspect of the present invention, a method for touch-operating an application according to embodiments of the present invention may include: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu; for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position; generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and enabling the application corresponding to the first touch area based on the application enabling instruction. According to the embodiments, the main menu can be rotated and operated simultaneously by multiple users. It is possible to keep the main menu always facing the users directly, thus improving convenience.
- Referring to
FIG. 4, an apparatus for touch-operating an application according to embodiments of the present invention may include a first generation module 401, an adjustment module 402, a second generation module 403 and an operation module 404. The apparatus may be applied to a data processing terminal equipped with a screen having a multi-touch function. The data processing terminal may display in operation at least a main menu on the screen. The main menu may include M touch areas, each of which is adjacent, at both of its sides, to another of the touch areas. Each of the M touch areas may correspond to an application. Graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1. In embodiments of the present invention, a positive orientation may refer to an orientation in which graphics and/or texts in the touch areas directly face the user. In other words, the positive orientation in which graphics and/or texts in any of the touch areas are displayed is an orientation perpendicular to a line connecting the eyes of the user in front of the touch area. In an embodiment, the main menu may be an enclosed spatial pattern. A radial of the main menu may have its endpoint at the center of the main menu. - The
first generation module 401 may be configured to generate an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu. The user may perform the first input operation on a region of the screen corresponding to the main menu. For example, the first generation module 401 may generate an operation instruction for the main menu based on a rotation operation with a radian on a region of the screen corresponding to the main menu, or generate an operation instruction for the main menu based on a moving operation along a line on a region of the screen corresponding to the main menu, or generate an operation instruction for the main menu based on a click operation on an initial position of any of the M touch areas and a click operation on a target position on a region of the screen corresponding to the main menu. - The
adjustment module 402 may be configured to, for each of the M touch areas, adjust the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas may be different between the first position and the second position. In an embodiment, there is a first touch area among the M touch areas. The first position of the first touch area is further away from the user than the second position of the first touch area. The adjustment module 402 may, based on the operation instruction, move the desired first touch area in the main menu to a position closer to the user. - Each of the M touch areas in the main menu may correspond to a different type of application, and thus a type of application, such as playing back video or viewing photos, may be initiated when the user clicks on the corresponding touch area. Alternatively, at least one of the M touch areas may correspond to at least one sub touch area. In an embodiment, the sub touch area may be in the shape of a fan. When the main menu is rotated or moved, the sub touch area may be rotated or moved along with the main menu. For example, one of the touch areas may correspond to “video,” and the sub touch areas under the touch area may correspond to different categories, such as “entertainment video,” or “sports video.” If a touch area has corresponding sub touch areas, these sub touch areas may be displayed when the user clicks on the touch area. Then, the user may initiate the desired application by clicking on one or more of the sub touch areas. These embodiments of the present invention allow a user to use more than one application simultaneously, and also allow several users to operate simultaneously.
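- The gesture handling just described (a rotation with a radian, a straight drag, or a pair of clicks on an initial and a target position) can be sketched as a single mapping onto one rotation instruction, which the adjustment step then applies to every touch area. The instruction format, the menu-radius parameter and the angle convention below are illustrative assumptions, not the patent's implementation.

```python
import math

# Sketch of the three input gestures mapped onto one rotation
# instruction, and of the adjustment step applying it to each touch
# area. Field names and conventions are illustrative assumptions.

def operation_instruction(gesture, menu_radius=1.0):
    kind = gesture["type"]
    if kind == "rotate":          # rotation operation with a radian
        angle = gesture["radians"]
    elif kind == "drag":          # moving along a line: arc length / radius
        angle = gesture["distance"] / menu_radius
    elif kind == "click_pair":    # click on initial position, then target
        (x0, y0), (x1, y1) = gesture["initial"], gesture["target"]
        angle = math.atan2(y1, x1) - math.atan2(y0, x0)
    else:
        raise ValueError("unsupported gesture")
    return {"op": "rotate", "radians": angle}

def adjust_positions(area_angles, instruction):
    """Move every touch area from its first to its second position by
    rotating it through the instructed angle (angles in radians)."""
    d = instruction["radians"]
    return [(a + d) % (2 * math.pi) for a in area_angles]
```

Under these assumptions, all three gestures converge on the same instruction, so the adjustment logic needs to handle only one case.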
- In an embodiment, the main menu may be rotatable. The main menu may be in different shapes, such as a circle, a polygon or an ellipse. The main menu may be rotated clockwise or counter-clockwise by any angle. At the same time, graphics and/or texts on the main menu may be rotated along with the main menu to a position as expected by the user. When the main menu is in the shape of a circle or regular polygon, the positive orientation may refer to a radial orientation of the main menu in embodiments of the present invention.
FIG. 5 shows an example of the main menu. The main menu is shaped as a circle, and partitioned into 6 touch areas. Each of the touch areas may represent a type of application, and graphics and/or texts (in FIG. 5, texts are taken as an example) on each touch area may be displayed in an orientation perpendicular to the tangent of the circle. No matter in which orientation the main menu is operated by the user, graphics and/or texts on the main menu closest to the user may always directly face the user. This guarantees that the user may always operate the main menu in a positive orientation, which is convenient for the user's operation.
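- The layout of FIG. 5 can be sketched numerically: with M equal touch areas on a circle, each label sits at the center angle of its area and is drawn rotated by that angle, which keeps it perpendicular to the tangent; a menu rotation then brings any chosen area directly in front of a user standing at a given angle. The degree convention and function names below are assumptions for illustration only.

```python
# Numeric sketch of FIG. 5: M equal touch areas on a circle, labels
# rotated to stay perpendicular to the tangent (i.e. along the radius).
# The degree convention and names are illustrative assumptions.

def label_angles(m, menu_rotation=0.0):
    """Center angle (degrees) of each of the M touch areas; drawing a
    label rotated by its center angle keeps the label radial."""
    span = 360.0 / m
    return [(menu_rotation + span * i + span / 2) % 360 for i in range(m)]

def rotation_to_face(area_index, m, user_angle):
    """Menu rotation that brings touch area `area_index` directly in
    front of a user standing at `user_angle` degrees around the screen."""
    span = 360.0 / m
    return (user_angle - (span * area_index + span / 2)) % 360
```

With M = 6 as in FIG. 5, each area spans 60 degrees, and rotating the menu by `rotation_to_face(...)` places the chosen area's label at the user's angle.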
- The
second generation module 403 may be configured to generate an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area. The user may operate on the first touch area, such as clicking on the touch area. The second generation module 403 may generate an application enabling instruction when receiving the clicking operation. - The
second generation module 403 may be further configured to generate a sub touch area enabling instruction based on a second input operation on the touch area having the corresponding sub touch area. In an embodiment, the second input operation may be a clicking operation. When the user clicks on a touch area corresponding to a sub touch area, the second generation module 403 receives the click operation by the user, and generates a sub touch area enabling instruction. - The
operation module 404 is configured to enable the application corresponding to the first touch area based on the application enabling instruction. When receiving the application enabling instruction, the operation module 404 may enable the application corresponding to the first touch area. The application enabling instruction may carry a touch area identifier so that the operation module 404 may enable an application corresponding to the touch area as identified by the touch area identifier. - The
operation module 404 may automatically adjust the size of the screen region occupied by each application, if necessary. For example, one user may be watching a video while another user is listening to music. The video application may require a larger region of the screen, and accordingly the operation module 404 may increase the size of the screen region occupied by the video application. Alternatively, the user may perform manual adjustment. For example, an application may be embedded with buttons for scale-up or scale-down, and the user may perform adjustment by clicking on these buttons. When applications corresponding to the respective M touch areas are enabled, these applications may occupy regions of the screen having equal or different sizes. The size of each region may be adjustable. - The
operation module 404 may be further configured to enable the sub touch area corresponding to the touch area based on the sub touch area enabling instruction. When receiving the sub touch area enabling instruction sent from the second generation module 403, the operation module 404 may enable the sub touch area corresponding to the touch area based on the sub touch area enabling instruction. In an embodiment, the sub touch area enabling instruction may carry a touch area identifier, and the operation module 404 may enable a sub touch area corresponding to the touch area as identified by the touch area identifier. - Hereafter an example of a method for touch-operating an application will be described.
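- The enabling path described above can be sketched as an instruction that carries a touch area identifier and is dispatched either to launch the area's application or to reveal its sub touch areas. The registry layout, field names and return values below are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the enabling path: instructions carry a touch area
# identifier; the handler looks the identifier up and either launches
# the area's application or reveals its sub touch areas. The registry
# layout and field names are illustrative assumptions.

REGISTRY = {
    "area-video": {"app": "video-player",
                   "sub_areas": ["entertainment video", "sports video"]},
    "area-music": {"app": "music-player", "sub_areas": []},
}

def handle_instruction(instruction, registry=REGISTRY):
    entry = registry[instruction["touch_area_id"]]   # identifier lookup
    if instruction["kind"] == "enable_app":          # application enabling
        return ("launched", entry["app"])
    if instruction["kind"] == "enable_sub_areas":    # sub touch area enabling
        return ("shown", entry["sub_areas"])
    raise ValueError("unknown instruction kind")
```

Carrying the identifier inside the instruction keeps generation and execution decoupled: the generating side only needs to know which area was touched, not what the area contains.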
- The flow of the method for touch-operating an application in an embodiment is shown in
FIG. 6. - Step 601: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu;
- Step 602: for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position;
- Step 603: generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and
- Step 604: enabling the application corresponding to the first touch area based on the application enabling instruction.
- The method may be applied to a data processing terminal equipped with a screen having a multi-touch function. The data processing terminal may display in operation at least a main menu on the screen. The main menu may include M touch areas each corresponding to an application. Graphics and/or texts in each of the M touch areas can be displayed in at least two different positive orientations, wherein M is an integer greater than 1.
- The method for touch-operating an application according to embodiments of the present invention comprises: generating an operation instruction for the main menu based on a first input operation on a region of the screen corresponding to the main menu; for each of the M touch areas, adjusting the display position of the touch area from a first position to a second position based on the operation instruction, wherein the positive orientation for graphics and/or texts in at least one of the M touch areas is different between the first position and the second position; generating an application enabling instruction based on an operation for enabling an application corresponding to a first touch area on the first touch area; and enabling the application corresponding to the first touch area based on the application enabling instruction. According to the embodiments, the main menu can be rotated and operated simultaneously by multiple users. It is possible to keep the main menu always facing the users directly, thus improving convenience. Regions of the screen occupied by the touch areas or applications may be adjusted in size as required. The main menu may be presented in different shapes. This also improves convenience and flexibility.
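- The two size adjustments described in the embodiments above (pinch-scaling a sub touch area, and dividing the screen among enabled applications) can be sketched as follows. The weight table, the clamping bounds and the function names are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the size adjustments described above. The weight table and
# clamping bounds are illustrative assumptions.

def pinch_scale(size, start_dist, end_dist, lo=20.0, hi=400.0):
    """Scale a sub touch area by the ratio of finger distances:
    spreading the fingers enlarges it, pinching shrinks it, within
    clamped bounds."""
    return max(lo, min(hi, size * (end_dist / start_dist)))

def screen_shares(apps, weights=None):
    """Give each enabled application a screen fraction proportional to
    a per-type weight (e.g. video needs more room than music)."""
    weights = weights or {"video": 3.0, "photo": 2.0, "music": 1.0}
    total = sum(weights.get(a, 1.0) for a in apps)
    return {a: weights.get(a, 1.0) / total for a in apps}
```

For instance, doubling the distance between the fingers doubles the sub touch area, and a video application enabled alongside a music application receives three quarters of the screen under the assumed weights.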
- Those skilled in the art will appreciate that embodiments of the present invention may be provided as methods, systems or computer program products, and thus may take the form of hardware, software or a combination thereof. Embodiments of the present invention may also be implemented as a computer program product in the form of one or more computer readable storage media (including but not limited to disk memories and optical memories) containing computer readable program codes.
- The present invention has been described with reference to flowcharts and/or block diagrams of methods, apparatuses (systems) and computer program products according to embodiments of the present invention. It will be appreciated that each step and/or block in the flowcharts and/or block diagrams or combination thereof may be implemented in computer program instructions. The computer program instructions may be loaded onto a processor of a general computer, a dedicated computer, an embedded processing device or any other programmable data processing device to generate a machine, so that the instructions, when executed by the processor of the computer or other programmable data processing device, generate means for performing the functions specified in one or more steps and/or blocks in the flowcharts and/or block diagrams.
- These computer program instructions may also be stored in a computer readable memory that may direct computers or other programmable data processing devices to operate in a specific manner, so that the instructions stored in the computer readable memory may produce an article of manufacture containing instruction means which perform the functions specified in one or more steps and/or blocks in the flowcharts and/or block diagrams.
- These computer program instructions may also be loaded onto computers or other programmable data processing devices, so that the computers or other programmable data processing devices perform a sequence of operation steps for computer-implemented processing. When executed on the computers or other programmable data processing devices, the instructions provide steps for performing the functions specified in one or more steps and/or blocks in the flowcharts and/or block diagrams.
- The foregoing description is intended to illustrate the exemplary embodiments of the present disclosure. It will be readily understood by a person skilled in the art that various modifications and variations may be made to the present invention without departing from the spirit and scope of the present invention, and these modifications and variations also fall into the scope of the present invention.
Claims (8)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/676,563 US20170344202A1 (en) | 2011-11-02 | 2017-08-14 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110342315.0 | 2011-11-02 | ||
CN201110342315.0A CN102768607B (en) | 2011-11-02 | 2011-11-02 | Method and device for realizing touch operation application program |
CN201110421915.6A CN102778997B (en) | 2011-12-15 | 2011-12-15 | A kind of window display method and device |
CN201110421915.6 | 2011-12-15 | ||
US13/666,998 US9766777B2 (en) | 2011-11-02 | 2012-11-02 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
US15/676,563 US20170344202A1 (en) | 2011-11-02 | 2017-08-14 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/666,998 Division US9766777B2 (en) | 2011-11-02 | 2012-11-02 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170344202A1 true US20170344202A1 (en) | 2017-11-30 |
Family
ID=48084497
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/666,998 Active 2034-05-03 US9766777B2 (en) | 2011-11-02 | 2012-11-02 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
US15/676,563 Abandoned US20170344202A1 (en) | 2011-11-02 | 2017-08-14 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/666,998 Active 2034-05-03 US9766777B2 (en) | 2011-11-02 | 2012-11-02 | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application |
Country Status (2)
Country | Link |
---|---|
US (2) | US9766777B2 (en) |
DE (1) | DE102012110278A1 (en) |
Families Citing this family (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
US9754085B2 (en) | 2011-08-17 | 2017-09-05 | Integrated Chemistry Design, Inc. | Systems and methods of editing a chemical structure on a touch-screen |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
CN104487928B (en) | 2012-05-09 | 2018-07-06 | 苹果公司 | For equipment, method and the graphic user interface of transition to be carried out between dispaly state in response to gesture |
CN107977084B (en) | 2012-05-09 | 2021-11-05 | 苹果公司 | Method and apparatus for providing haptic feedback for operations performed in a user interface |
WO2013169875A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
EP2847661A2 (en) | 2012-05-09 | 2015-03-18 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
CN109298789B (en) | 2012-05-09 | 2021-12-31 | 苹果公司 | Device, method and graphical user interface for providing feedback on activation status |
WO2013169845A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for scrolling nested regions |
AU2013259637B2 (en) | 2012-05-09 | 2016-07-07 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
AU2013259606B2 (en) | 2012-05-09 | 2016-06-02 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
WO2014105275A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
CN105264479B (en) | 2012-12-29 | 2018-12-25 | 苹果公司 | Equipment, method and graphic user interface for navigating to user interface hierarchical structure |
AU2013368443B2 (en) | 2012-12-29 | 2016-03-24 | Apple Inc. | Device, method, and graphical user interface for transitioning between touch input to display output relationships |
EP2939095B1 (en) | 2012-12-29 | 2018-10-03 | Apple Inc. | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
WO2014105279A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for switching between user interfaces |
EP2939096B1 (en) | 2012-12-29 | 2019-08-28 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or select contents |
US9250780B2 (en) * | 2013-10-28 | 2016-02-02 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
US9244593B2 (en) * | 2013-10-28 | 2016-01-26 | Beijing Lenovo Software Ltd. | Information processing methods and electronic devices |
CN104572058B (en) * | 2013-10-28 | 2019-03-29 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US9632664B2 (en) * | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10379671B2 (en) | 2015-03-19 | 2019-08-13 | Huawei Technologies Co., Ltd. | Touch event processing method and apparatus, and terminal device |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
US10152208B2 (en) | 2015-04-01 | 2018-12-11 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
Family Cites Families (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5524196A (en) * | 1992-12-18 | 1996-06-04 | International Business Machines Corporation | Method and system for manipulating data through a graphic user interface within a data processing system |
US5706448A (en) * | 1992-12-18 | 1998-01-06 | International Business Machines Corporation | Method and system for manipulating data through a graphic user interface within a data processing system |
US5499334A (en) * | 1993-03-01 | 1996-03-12 | Microsoft Corporation | Method and system for displaying window configuration of inactive programs |
US5973666A (en) * | 1995-07-20 | 1999-10-26 | International Business Machines Corporation | Method and means for controlling the concurrent execution of a plurality of programs on a computer system |
KR100375054B1 (en) * | 1996-03-15 | 2003-05-09 | 가부시끼가이샤 히다치 세이사꾸쇼 | Display device and its operation method |
JP3511462B2 (en) * | 1998-01-29 | 2004-03-29 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Operation image display device and method thereof |
US6549219B2 (en) * | 1999-04-09 | 2003-04-15 | International Business Machines Corporation | Pie menu graphical user interface |
KR100434265B1 (en) | 1999-12-07 | 2004-06-04 | 엘지전자 주식회사 | OSD apparatus and method for displaying menu in OSD apparatus |
US7287232B2 (en) * | 2000-05-08 | 2007-10-23 | Fujitsu Limited | Information display system having graphical user interface switchingly controlling information display on display screen |
US6791530B2 (en) * | 2000-08-29 | 2004-09-14 | Mitsubishi Electric Research Laboratories, Inc. | Circular graphical user interfaces |
US7441220B2 (en) * | 2000-12-07 | 2008-10-21 | Cadence Design Systems, Inc. | Local preferred direction architecture, tools, and apparatus |
US6925611B2 (en) * | 2001-01-31 | 2005-08-02 | Microsoft Corporation | Navigational interface for mobile and wearable computers |
JP4098637B2 (en) | 2002-01-21 | 2008-06-11 | ミツビシ・エレクトリック・リサーチ・ラボラトリーズ・インコーポレイテッド | Method and system for visualizing multiple images in a circular graphical user interface |
US6996783B2 (en) * | 2002-01-28 | 2006-02-07 | International Business Machines Corporation | Selectively adjusting transparency of windows within a user interface using a flashlight tool |
US7898529B2 (en) * | 2003-01-08 | 2011-03-01 | Autodesk, Inc. | User interface having a placement and layout suitable for pen-based computers |
US20050204306A1 (en) * | 2003-09-15 | 2005-09-15 | Hideya Kawahara | Enhancements for manipulating two-dimensional windows within a three-dimensional display model |
US20050140696A1 (en) * | 2003-12-31 | 2005-06-30 | Buxton William A.S. | Split user interface |
US20050178074A1 (en) * | 2004-02-02 | 2005-08-18 | Kerosetz Jay E. | Multifunction table |
US7523413B2 (en) * | 2004-06-14 | 2009-04-21 | At&T Intellectual Property I, L.P. | Organizing session applications |
US7535481B2 (en) * | 2004-06-28 | 2009-05-19 | Microsoft Corporation | Orienting information presented to users located at different sides of a display surface |
US7728821B2 (en) * | 2004-08-06 | 2010-06-01 | Touchtable, Inc. | Touch detecting interactive display |
US20060107229A1 (en) * | 2004-11-15 | 2006-05-18 | Microsoft Corporation | Work area transform in a graphical user interface |
US8341541B2 (en) * | 2005-01-18 | 2012-12-25 | Microsoft Corporation | System and method for visually browsing of open windows |
US7478326B2 (en) * | 2005-01-18 | 2009-01-13 | Microsoft Corporation | Window information switching system |
US20060181519A1 (en) | 2005-02-14 | 2006-08-17 | Vernier Frederic D | Method and system for manipulating graphical objects displayed on a touch-sensitive display surface using displaced pop-ups |
KR20070006477A (en) * | 2005-07-08 | 2007-01-11 | 삼성전자주식회사 | Method for arranging contents menu variably and display device using the same |
JP2009508274A (en) * | 2005-09-13 | 2009-02-26 | スペースタイムスリーディー・インコーポレーテッド | System and method for providing a three-dimensional graphical user interface |
US8060840B2 (en) * | 2005-12-29 | 2011-11-15 | Microsoft Corporation | Orientation free user interface |
US8930834B2 (en) * | 2006-03-20 | 2015-01-06 | Microsoft Corporation | Variable orientation user interface |
DE102006021400B4 (en) * | 2006-05-08 | 2008-08-21 | Combots Product Gmbh & Co. Kg | Method and device for providing a selection menu associated with a displayed symbol |
US7552402B2 (en) * | 2006-06-22 | 2009-06-23 | Microsoft Corporation | Interface orientation using shadows |
WO2008043587A1 (en) * | 2006-10-13 | 2008-04-17 | Abb Research Ltd | A device, system and computer implemented method to display and process technical data for a device in an industrial control system |
US8549429B2 (en) | 2007-01-25 | 2013-10-01 | Sharp Kabushiki Kaisha | Multi-window management apparatus and program, storage medium and information processing apparatus |
US20080192059A1 (en) * | 2007-02-09 | 2008-08-14 | Microsoft Corporation | Multi-user display |
US8352881B2 (en) * | 2007-03-08 | 2013-01-08 | International Business Machines Corporation | Method, apparatus and program storage device for providing customizable, immediate and radiating menus for accessing applications and actions |
US8161407B2 (en) * | 2007-03-15 | 2012-04-17 | International Business Machines Corporation | Multiple sorting of columns in a displayed table in a user interactive computer display interface through sequential radial menus |
US8244068B2 (en) * | 2007-03-28 | 2012-08-14 | Sony Ericsson Mobile Communications Ab | Device and method for adjusting orientation of a data representation displayed on a display |
US20090106667A1 (en) | 2007-10-19 | 2009-04-23 | International Business Machines Corporation | Dividing a surface of a surface-based computing device into private, user-specific areas |
US7976372B2 (en) * | 2007-11-09 | 2011-07-12 | Igt | Gaming system having multiple player simultaneous display/input device |
CN101499253B (en) | 2008-01-28 | 2011-06-29 | 宏达国际电子股份有限公司 | Output picture regulation method and apparatus |
US20090225040A1 (en) | 2008-03-04 | 2009-09-10 | Microsoft Corporation | Central resource for variable orientation user interface |
US8826181B2 (en) * | 2008-06-28 | 2014-09-02 | Apple Inc. | Moving radial menus |
WO2010030984A1 (en) * | 2008-09-12 | 2010-03-18 | Gesturetek, Inc. | Orienting a displayed element relative to a user |
CN101365117B (en) | 2008-09-18 | 2010-12-29 | 中兴通讯股份有限公司 | Method for customized screen splitting mode |
US20100083109A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method |
US8289288B2 (en) * | 2009-01-15 | 2012-10-16 | Microsoft Corporation | Virtual object adjustment via physical object detection |
US20100201636A1 (en) * | 2009-02-11 | 2010-08-12 | Microsoft Corporation | Multi-mode digital graphics authoring |
JP5487679B2 (en) | 2009-03-31 | 2014-05-07 | ソニー株式会社 | Information processing apparatus, information processing method, and information processing program |
KR101651859B1 (en) * | 2009-06-05 | 2016-09-12 | 삼성전자주식회사 | Method for providing UI for each user, and device applying the same |
EP2270640A1 (en) * | 2009-06-26 | 2011-01-05 | France Telecom | Method for managing display of an application window on a screen, a program and a terminal using same |
CN101630222B (en) | 2009-08-19 | 2012-09-26 | 中科方德软件有限公司 | Method, system and device for processing user menu |
US8832585B2 (en) * | 2009-09-25 | 2014-09-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8487888B2 (en) * | 2009-12-04 | 2013-07-16 | Microsoft Corporation | Multi-modal interaction on multi-touch display |
KR20110069563A (en) | 2009-12-17 | 2011-06-23 | 엘지전자 주식회사 | Apparatus for displaying image and method for operating the same |
JP2011180990A (en) | 2010-03-03 | 2011-09-15 | Panasonic Corp | Menu display device, electronic device, and menu display control method |
US8584041B2 (en) * | 2010-08-13 | 2013-11-12 | Markus Schulz | Graphical user interface with a concentric arrangement and method for accessing data objects via a graphical user interface |
US8959444B2 (en) * | 2010-12-15 | 2015-02-17 | International Business Machines Corporation | Presenting a navigation order of shapes |
US20120262489A1 (en) * | 2011-04-12 | 2012-10-18 | Caliendo Jr Neal Robert | Relative and Absolute Screen Rotation Draft Agent |
US9086794B2 (en) * | 2011-07-14 | 2015-07-21 | Microsoft Technology Licensing, Llc | Determining gestures on context based menus |
US8869068B2 (en) * | 2011-11-22 | 2014-10-21 | Backplane, Inc. | Content sharing application utilizing radially-distributed menus |
US10503373B2 (en) * | 2012-03-14 | 2019-12-10 | Sony Interactive Entertainment LLC | Visual feedback for highlight-driven gesture user interfaces |
US9223340B2 (en) * | 2013-08-14 | 2015-12-29 | Lenovo (Singapore) Pte. Ltd. | Organizing display data on a multiuser display |
- 2012-10-26 DE DE102012110278A patent/DE102012110278A1/en active Pending
- 2012-11-02 US US13/666,998 patent/US9766777B2/en active Active
- 2017-08-14 US US15/676,563 patent/US20170344202A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US9766777B2 (en) | 2017-09-19 |
DE102012110278A1 (en) | 2013-05-02 |
US20130111398A1 (en) | 2013-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20170344202A1 (en) | Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application | |
US10671282B2 (en) | Display device including button configured according to displayed windows and control method therefor | |
US8749497B2 (en) | Multi-touch shape drawing | |
US8910072B2 (en) | Browsing and interacting with open windows | |
KR102027612B1 (en) | Thumbnail-image selection of applications | |
US9658766B2 (en) | Edge gesture | |
CN104025003B (en) | Translate animation | |
JP5460679B2 (en) | Information processing apparatus, information processing method, and data structure of content file | |
US9354899B2 (en) | Simultaneous display of multiple applications using panels | |
US20130047126A1 (en) | Switching back to a previously-interacted-with application | |
US20140292668A1 (en) | Touch input device haptic feedback | |
KR20140133357A (en) | display apparatus and user interface screen providing method thereof | |
WO2012166176A1 (en) | Edge gesture | |
KR20140133361A (en) | display apparatus and user interface screen providing method thereof | |
WO2012166175A1 (en) | Edge gesture | |
ES2724423T3 (en) | Imaging apparatus and method | |
US20150339026A1 (en) | User terminal device, method for controlling user terminal device, and multimedia system thereof | |
TW201203081A (en) | Representative image | |
US20140085210A1 (en) | Electronic tabletop system | |
EP2998833B1 (en) | Electronic device and method of controlling display of screen thereof | |
KR102426088B1 (en) | User terminal device and method for displaying thereof | |
US10241659B2 (en) | Method and apparatus for adjusting the image display | |
WO2013119477A1 (en) | Presentation techniques | |
EP3128397B1 (en) | Electronic apparatus and text input method for the same | |
WO2012046295A1 (en) | Information processing device and input device display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BEIJING LENOVO SOFTWARE LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LU, LU;CHEN, YU;LI, JUN;AND OTHERS;REEL/FRAME:043544/0232
Effective date: 20121101
Owner name: LENOVO (BEIJING) LIMITED, CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LU, LU;CHEN, YU;LI, JUN;AND OTHERS;REEL/FRAME:043544/0232
Effective date: 20121101 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |