WO2014075451A1 - User Interface Management Method and Apparatus - Google Patents
- Publication number
- WO2014075451A1 (PCT/CN2013/078896)
- Authority
- WIPO (PCT)
- Prior art keywords
- layer
- user interface
- contact
- gesture
- applications
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the present disclosure relates to the user interface of an electronic device having a touch display screen, and more particularly to a user interface management method and apparatus.
Background Art
- An electronic device with a touch display provides a user interface as a medium for interaction between the user and the electronic device.
- One or more soft keys, menus, and shortcut icons for each application are available on the user interface. Due to the size of the touch screen, when there is a lot of content to be displayed, the user interface can be divided into multiple pages for display.
- the electronic device detects and responds to user contact with the user interface. For example, when a user wants to use an application, the user can touch the icon corresponding to the application in the user interface, and after the electronic device detects the contact, the corresponding application is started. Similarly, when the icon corresponding to the application that the user wants to use is not displayed on the current page, the user may slide on the touch display screen, and after detecting the sliding contact, the electronic device displays the next page of the current page of the user interface according to the sliding direction.
- the embodiments of the invention provide a user interface management method and apparatus, which enable a user to quickly find, through a simple operation, the set of applications that need to be used at a given time, thereby simplifying the user's operation.
- An embodiment of the present invention provides a user interface management method, where the method includes:
- if the contact matches a predetermined layer switching gesture, the layer displayed on the user interface is switched; wherein different layers include icons of applications that are not identical.
- switching the layer displayed on the user interface may include:
- displaying the layer selected by the user on the user interface.
- alternatively, switching the layer displayed on the user interface may include:
- displaying the determined layer on the user interface.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
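- the base-layer/extended-layer model described above can be sketched as a simple data structure. This is a hypothetical illustration only; the names `Layer`, `app_icons`, and the example applications are not from the disclosure:

```python
# Hypothetical sketch: a base layer holds icons of all applications,
# while each extended layer holds icons of one category of applications.
class Layer:
    def __init__(self, name, app_icons):
        self.name = name                    # e.g. "base layer" or a theme name
        self.app_icons = list(app_icons)    # icons shown on this layer

all_apps = ["game", "ebook", "browser", "chat", "docs"]
base_layer = Layer("base layer", all_apps)                      # all applications
no_net_layer = Layer("no network theme layer", ["game", "ebook"])  # one category
work_layer = Layer("work theme layer", ["docs"])                   # another category

layers = [base_layer, no_net_layer, work_layer]
```

The invariant implied by the text is that every extended layer's icons form a subset of the base layer's icons.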
- An embodiment of the present invention provides a user interface management method, where the method includes:
- if the contact matches a predetermined layer locking gesture, the layer currently displayed on the user interface is locked; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- An embodiment of the present invention provides a user interface management method, where the method includes:
- if the contact matches a predetermined application deletion gesture, the icon of the application corresponding to the contact is deleted in the layer currently displayed on the user interface;
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- deleting the icon of the application corresponding to the contact in the layer currently displayed on the user interface includes:
- the application corresponding to the contact is uninstalled, and the icon of the application corresponding to the contact is deleted from each layer;
- the icon of the application corresponding to the contact is deleted from the extended layer currently displayed on the user interface.
- when the extended layer is currently displayed on the user interface, a prompt of the deletion mode is further displayed on the user interface, and then, according to the user's selection, either the icon of the application corresponding to the contact is deleted from the extended layer currently displayed on the user interface, or the application corresponding to the contact is uninstalled and its icon is deleted from each layer.
- An embodiment of the present invention provides a user interface management method, where the method includes:
- adding an icon of the application to the layer currently displayed on the user interface includes:
- An icon of the application selected by the user is added to the layer currently displayed on the user interface.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- An embodiment of the present invention provides a user interface management method, where the method includes:
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- the editing operation includes at least one of: adding an extended layer, deleting an extended layer, merging two or more extended layers, changing the name of the extended layer, changing the background of the extended layer, and changing the icon style of the applications in the extended layer.
- An embodiment of the present invention provides a user interface management method, where the method includes:
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- An embodiment of the present invention provides a user interface management apparatus, where the apparatus includes:
- a detecting module configured to detect contact of the touch display screen
- a determining module configured to determine a gesture that matches the contact
- a switching module configured to switch a layer displayed on the user interface when the contact matches a predetermined layer switching gesture
- the switching module includes:
- a first processing submodule configured to: when the contact matches a predetermined layer switching gesture, display a selectable layer list on the user interface, and confirm a user selection of one of the layers;
- the first switching execution sub-module is configured to display the layer selected by the user on the user interface.
- the switching module includes:
- a second processing submodule configured to determine, according to a set order, a next layer of a layer currently displayed on the user interface when the contact matches a predetermined layer switching gesture
- the second switching execution sub-module is configured to display the determined layer on the user interface.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- An embodiment of the present invention provides a user interface management apparatus, where the apparatus includes:
- a detecting module configured to detect contact of the touch display screen
- a determining module configured to determine a gesture that matches the contact
- a locking module configured to lock the layer currently displayed on the user interface when the contact matches a predetermined layer locking gesture
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- An embodiment of the present invention provides a user interface management apparatus, where the apparatus includes:
- a detecting module configured to detect contact of the touch display screen
- a determining module configured to determine a gesture that matches the contact
- a deleting module configured to delete an icon of the application corresponding to the contact in a layer currently displayed on the user interface when the contact matches a predetermined application deletion gesture
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- the deleting module includes:
- the deletion execution sub-module is configured to: when the base layer is currently displayed on the user interface, uninstall the application corresponding to the contact, and delete the icon of the application corresponding to the contact from each layer; when the extended layer is currently displayed on the user interface, delete the icon of the application corresponding to the contact from the extended layer currently displayed on the user interface.
- when the extended layer is currently displayed on the user interface, the deletion execution sub-module further displays a prompt of the deletion mode on the user interface, and then, according to the user's selection, either deletes the icon of the application corresponding to the contact from the extended layer currently displayed on the user interface, or uninstalls the application corresponding to the contact and deletes its icon from each layer.
- An embodiment of the present invention provides a user interface management apparatus, where the apparatus includes:
- a detecting module configured to detect contact of the touch display screen
- a determining module configured to determine a gesture that matches the contact
- an adding module configured to add an icon of the application to the layer currently displayed on the user interface when the contact matches a predetermined application addition gesture
- the adding module includes:
- a display sub-module for displaying a list of selectable applications on the user interface;
- an adding execution sub-module configured to confirm the user's selection of one or more applications, and add an icon of the application selected by the user to the layer currently displayed on the user interface.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- An embodiment of the present invention provides a user interface management apparatus, where the apparatus includes:
- a detecting module configured to detect contact of the touch display screen
- a determining module configured to determine a gesture that matches the contact
- An editing module configured to: when the contact matches a predetermined layer editing gesture, control the user interface to enter an editing state of the layer; receive an editing instruction input by the user in the editing state, and perform an editing operation for the layer;
- different layers include icons of applications that are not identical.
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- the editing operation includes at least one of: adding an extended layer, deleting an extended layer, merging two or more extended layers, changing the name of the extended layer, changing the background of the extended layer, and changing the icon style of the applications in the extended layer.
- An embodiment of the present invention provides a user interface management apparatus, where the apparatus includes:
- a detecting module configured to detect contact of the touch display screen
- a determining module configured to determine a gesture that matches the contact
- a management module configured to: switch the layer displayed on the user interface when the contact matches a predetermined layer switching gesture; lock the layer currently displayed on the user interface when the contact matches a predetermined layer locking gesture; delete the icon of the application corresponding to the contact in the layer currently displayed on the user interface when the contact matches a predetermined application deletion gesture; add an icon of the application to the layer currently displayed on the user interface when the contact matches a predetermined application addition gesture; and, when the contact matches a predetermined layer editing gesture, enter the editing state of the layer, receive an editing instruction input by the user in the editing state, and perform the editing operation for the layer; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extension layer, the base layer includes icons for all applications, and different extension layers include icons for different categories of applications.
- the user interface can display different layers. Different layers include icons of applications that are not identical.
- the user can switch the layer currently displayed on the user interface by performing a simple touch gesture on the touch display screen, thereby quickly finding the set of applications currently needed.
- FIG. 1 is a flowchart of a user interface display method in Embodiment 1 of the present invention;
- FIG. 2 is a schematic diagram of a user interface showing a layer list;
- FIG. 3 is a flowchart of a user interface management method in Embodiment 2 of the present invention;
- FIG. 4 is a schematic diagram of a user interface showing a prompt of the deletion mode;
- FIG. 5 is a flowchart of a user interface management method in Embodiment 3 of the present invention;
- FIG. 6 is a flowchart of a user interface management method in Embodiment 4 of the present invention;
- FIG. 7 is a flowchart of a user interface management method in Embodiment 5 of the present invention;
- FIG. 8 is a schematic diagram of a user interface showing a layer editing menu;
- FIG. 9 is a schematic diagram of a user interface showing a setting dialog box of an extended layer;
- FIG. 10 is a schematic diagram of a user interface showing a state of deleting an extended layer;
- FIG. 11 is a schematic diagram of a user interface showing a state of merging extended layers;
- FIG. 12 is a schematic structural diagram of a user interface management apparatus in Embodiment 6 of the present invention;
- FIG. 13 is a schematic structural diagram of a user interface management apparatus in Embodiment 7 of the present invention;
- FIG. 14 is a schematic structural diagram of a user interface management apparatus in Embodiment 8 of the present invention;
- FIG. 15 is a schematic structural diagram of a user interface management apparatus in Embodiment 9 of the present invention;
- FIG. 16 is a schematic structural diagram of a user interface management apparatus in Embodiment 10 of the present invention;
- FIG. 17 is a schematic structural diagram of a user interface management apparatus in Embodiment 11 of the present invention.
Detailed Description
- Step 11: Detect contact with the touch display screen.
- Step 12: Determine a predetermined gesture that matches the contact.
- Step 13: If the contact matches a predetermined layer switching gesture, the layer displayed on the user interface is switched; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online", and the category name can be used as the theme name of the extended layer.
- the predetermined layer switching gesture is a two-point touch on the touch display screen and a vertical slide, and when the contact detected in step 11 matches the predetermined layer switching gesture, the layer displayed on the user interface is switched.
- Embodiment 1 includes the following sub-steps:
- Step 131: Display a list of selectable layers on the user interface.
- from the layer list, the user can know that all of the currently selectable layers include the base layer, extended layer 1, extended layer 2, and extended layer 3; here the generic names of the base layer and the extended layers are used.
- alternatively, the theme name of the extended layer can be directly displayed in the layer list, such as "no network theme layer" and "Internet theme layer".
- Step 132: Confirm the user's selection of one of the layers.
- the user can directly select the layer to switch to by a single touch, and after the user's single touch is detected, the user's selection of a layer is confirmed.
- Step 133: Display the layer selected by the user on the user interface.
- for example, the base layer is currently displayed on the user interface. If the user wants to switch to extended layer 2, the selection can be performed through steps 131 to 133, realizing fast switching on the basis of a simplified operation. Moreover, the user does not need to memorize the category of the application icons included in each extended layer, since the theme name of the extended layer can be known from the layer list, thereby further simplifying the user's operation.
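- the list-selection mode of steps 131 to 133 can be sketched as follows. This is a hypothetical illustration; the function name `switch_by_selection` and the `choose` callback (standing in for the user's single touch) are not from the disclosure:

```python
# Hypothetical sketch of steps 131-133: display a selectable layer list,
# confirm the user's selection, and display the chosen layer.
def switch_by_selection(layers, current, choose):
    chosen = choose(list(layers))               # steps 131/132: list + selection
    return chosen if chosen in layers else current  # step 133: display it

layers = ["base layer", "extended layer 1", "extended layer 2", "extended layer 3"]
# The user's single touch is modelled as picking an entry from the list.
displayed = switch_by_selection(layers, "base layer", lambda names: names[2])
```

With the base layer displayed, one gesture plus one touch reaches any layer directly, matching the "fast switching" described above.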
- Embodiment 2 includes the following sub-steps:
- Step 131': Determine the next layer of the layer currently displayed on the user interface in the predetermined order.
- for example, the base layer defaults to the first layer, and the other extended layers are created one by one (this will be described in detail in the following embodiments); so the predetermined order here may be the order in which the layers were created, or another custom order.
- Step 132': Display the determined layer on the user interface.
- for example, the creation order of the layers is the base layer, extended layer 1, extended layer 2, and extended layer 3. If extended layer 1 is currently displayed on the user interface, performing the above two sub-steps displays extended layer 2 on the user interface. If the user wishes to display extended layer 3 on the user interface, the user needs to perform steps 11 to 13 repeatedly.
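- the set-order mode can be sketched as follows. This is a hypothetical illustration; the disclosure does not specify what happens after the last layer, so wrapping around to the first layer is an assumption of this sketch:

```python
# Hypothetical sketch of determining the next layer in the creation order.
# Wrap-around after the last layer is an assumption, not from the disclosure.
def next_layer(layers, current):
    i = layers.index(current)
    return layers[(i + 1) % len(layers)]

layers = ["base layer", "extended layer 1", "extended layer 2", "extended layer 3"]
```

Repeating the gesture from extended layer 1 thus displays extended layer 2, then extended layer 3, consistent with the example above.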
- Embodiment 2
- FIG. 3 is a flowchart of a user interface management method according to Embodiment 2 of the embodiment of the present invention, where the process includes the following steps.
- Step 31: Detect contact with the touch display screen.
- Step 32: Determine a predetermined gesture that matches the contact.
- Step 33: If the contact matches a predetermined application deletion gesture, the icon of the application corresponding to the contact is deleted in the layer currently displayed by the user interface; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- for example, the predetermined application deletion gesture is a two-point touch on the touch display screen with a horizontal slide to the right, where at least one of the two touch points needs to correspond to the application to be deleted; when the contact detected in step 31 matches the predetermined application deletion gesture, the icon of the application corresponding to the contact is deleted in the layer currently displayed by the user interface.
- when the detected contact matches the predetermined application deletion gesture, the type of the layer currently displayed on the user interface is first determined. If the base layer is currently displayed on the user interface, the application corresponding to the contact is uninstalled, and the icon of the application corresponding to the contact is deleted from each layer. If the extended layer is currently displayed on the user interface, the icon of the application corresponding to the contact is deleted from the currently displayed extended layer.
- alternatively, when the extended layer is displayed, a prompt of the deletion mode may be displayed on the user interface, prompting the user to choose whether to uninstall the application or only delete the icon of the application in the currently displayed extended layer, as shown in Figure 4; then, according to the user's selection, the icon of the application is deleted from the currently displayed extended layer, or the application is uninstalled and the icon of the application is deleted from each layer.
- in this way, a "hard deletion" function is also provided for the extended layer, and the operation performed depends on the user's selection, making the operation mode more flexible.
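- the layer-dependent deletion branch of step 33 can be sketched as follows. This is a hypothetical illustration; the function name `delete_app` and the dictionary representation of layers are not from the disclosure:

```python
# Hypothetical sketch of step 33: on the base layer, the application is
# uninstalled and its icon removed from every layer; on an extended layer,
# only the icon in that layer is removed.
def delete_app(layers, current_name, app):
    if current_name == "base":
        for icons in layers.values():    # uninstall: remove from each layer
            if app in icons:
                icons.remove(app)
    else:
        layers[current_name].remove(app) # remove only from this extended layer

layers = {"base": ["game", "docs"], "work": ["docs"]}
delete_app(layers, "work", "docs")   # extended layer: icon removed locally only
delete_app(layers, "base", "game")   # base layer: removed everywhere
```

The prompt-driven variant of Figure 4 would simply let the user pick which of the two branches to run while an extended layer is displayed.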
- FIG. 5 is a flowchart of a user interface management method according to Embodiment 3 of the embodiment of the present invention, where the process includes the following steps.
- Step 51: Detect contact with the touch display screen.
- Step 52: Determine a predetermined gesture that matches the contact.
- Step 53: If the contact matches a predetermined application addition gesture, an icon of the application is added to the layer currently displayed on the user interface; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- for example, the predetermined application addition gesture is a two-point touch with a horizontal slide to the left.
- when the detected contact matches this gesture, an icon of the application is added to the layer currently displayed on the user interface.
- specifically, a selectable application list is displayed on the user interface, the user's selection of one or more applications is confirmed, and an icon of each application selected by the user is added to the layer currently displayed on the user interface.
- the application list includes all the applications installed on the electronic device.
- alternatively, the application list includes downloadable and installable applications. For example, a dedicated application for downloading applications can be opened, and its application list is displayed on the user interface for the user to select.
- the user can select the application to be added by a single touch, and the single touch can be detected to determine the user's selection of the application.
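- the addition flow of step 53 can be sketched as follows. This is a hypothetical illustration; `add_apps` and the `choose` callback (standing in for the user's single touches on the list) are not from the disclosure:

```python
# Hypothetical sketch of step 53: display a selectable application list
# and add the user's picks to the layer currently displayed.
def add_apps(layer_icons, installed, choose):
    # Only applications not already on the layer are offered for selection.
    selectable = [app for app in installed if app not in layer_icons]
    for app in choose(selectable):       # selections confirmed by single touches
        layer_icons.append(app)
    return layer_icons

current = ["game"]                       # icons on the currently displayed layer
add_apps(current, ["game", "ebook", "chat"], lambda options: ["chat"])
```

A list of downloadable applications, as described above, could be substituted for `installed` without changing the flow.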
- FIG. 6 is a flowchart of a user interface management method in Embodiment 4 of the present invention, where the process includes the following steps.
- Step 61: Detect contact with the touch display screen.
- Step 62: Determine a predetermined gesture that matches the contact.
- Step 63: If the contact matches a predetermined layer locking gesture, the layer currently displayed on the user interface is locked; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- for example, the predetermined layer locking gesture is a single touch that slides to form a closed figure.
- when the contact detected in step 61 matches the predetermined layer locking gesture, the layer currently displayed on the user interface is locked; after locking, the layer switching gesture is not responded to until the user performs an unlock layer gesture, thereby preventing layer switching caused by erroneous operations.
- the unlock layer gesture here can be a single touch that slides in the opposite direction to form a closed figure.
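- the locking behaviour can be sketched as a small state machine. This is a hypothetical illustration; the class name `LayerLock` and the string gesture labels are not from the disclosure:

```python
# Hypothetical sketch: while locked, layer switching gestures are ignored
# until the unlock gesture arrives.
class LayerLock:
    def __init__(self):
        self.locked = False

    def on_gesture(self, gesture, switch):
        if gesture == "lock":        # single touch tracing a closed figure
            self.locked = True
        elif gesture == "unlock":    # the figure traced in the opposite direction
            self.locked = False
        elif gesture == "switch" and not self.locked:
            switch()                 # respond to switching only when unlocked

lock = LayerLock()
events = []
lock.on_gesture("lock", lambda: events.append("switched"))
lock.on_gesture("switch", lambda: events.append("switched"))   # ignored: locked
lock.on_gesture("unlock", lambda: events.append("switched"))
lock.on_gesture("switch", lambda: events.append("switched"))   # now handled
```

Only the switch performed after unlocking takes effect, matching the erroneous-operation protection described above.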
- FIG. 7 is a flowchart of a user interface management method in Embodiment 5 of the present invention, where the method includes the following steps.
- Step 71: Detect contact with the touch display screen.
- Step 72: Determine a predetermined gesture that matches the contact.
- Step 73: If the contact matches a predetermined layer editing gesture, enter the editing state of the layer.
- for example, the predetermined layer editing gesture is a three-point touch with a vertical slide; if the contact detected in step 71 matches the predetermined layer editing gesture, the editing state of the layer is entered.
- Step 74: Receive an editing instruction input by the user in the editing state, and perform the corresponding editing operation; wherein different layers include icons of applications that are not identical.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- in the editing state, a layer editing menu is displayed on the user interface, as shown in FIG. 8, which includes a list of the currently existing extended layers, and options such as "add extended layer", "merge extended layers", and "delete extended layer".
- when adding an extended layer, a layer setting dialog box as shown in FIG. 9 is displayed; the user can input the name of the newly added extended layer in the input box 91, and can further select the background and icon display style of the corresponding extended layer through the check box 92 and the drop-down menu arrow 93. It can be seen that each extended layer can have a different background and icon display style, enriching the user experience.
- the base layer is set so that the user is not allowed to delete it.
- when merging, the user can select two or more extended layers to be merged, and then click the "OK" option; at this time, the setting dialog box of the extended layer is displayed on the user interface, which is the same as that shown in Figure 9. The user can name the merged new extended layer, and can further select the background and icon display style of the merged extended layer.
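- the merging operation can be sketched as follows. This is a hypothetical illustration; the function name `merge_layers` and the behaviour of keeping a single copy of icons shared by the merged layers are assumptions of this sketch:

```python
# Hypothetical sketch of merging extended layers: the icons of the
# selected layers are combined into one new extended layer with a
# user-chosen name, keeping one copy of any shared icon.
def merge_layers(layers, names, new_name):
    merged = []
    for name in names:
        for icon in layers.pop(name):    # the merged layers are removed
            if icon not in merged:
                merged.append(icon)
    layers[new_name] = merged
    return layers

layers = {"base": ["a", "b", "c"], "ext1": ["a"], "ext2": ["b", "a"]}
merge_layers(layers, ["ext1", "ext2"], "merged theme layer")
```

The new layer's name, background, and icon style would then be set through the dialog box of Figure 9.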
- each user interface shown is only an example; each menu and dialog box, as well as the options therein, can also be arranged in other ways.
- An application example of an embodiment of the present invention is given below.
- the "No Network Theme Layer" includes icons of local game applications and icons of local e-book applications;
- the "Internet Theme Layer" includes icons of networked game applications, icons of webpage access applications, and icons of instant messaging applications; the "Work Theme Layer" includes an icon of a work-related document application;
- the "Older Simple Theme Layer” includes an icon for the dialing application, an icon for the short message sending application, and an icon for the photo application.
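- the application example above can be written out as a layer table. This is purely illustrative; the dictionary form and icon names are shorthand for the icons described:

```python
# The theme layers of the application example, as a hypothetical table
# mapping each extended layer's theme name to its category of icons.
theme_layers = {
    "no network theme layer": ["local game", "local e-book"],
    "Internet theme layer": ["networked game", "webpage access", "instant messaging"],
    "work theme layer": ["work-related document"],
    "older simple theme layer": ["dialing", "short message", "photo"],
}
```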
- FIG. 12 is a schematic structural diagram of a user interface management apparatus according to Embodiment 6 of the present invention.
- the apparatus includes: a detection module 121, a determination module 122, and a switching module 123.
- the detecting module 121 is configured to detect a contact of the touch display screen.
- the determining module 122 is configured to determine a gesture that matches the contact.
- the switching module 123 is configured to switch the layer displayed on the user interface when the contact matches a predetermined layer switching gesture.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- the switching module 123 includes: a first processing submodule and a first switching execution submodule.
- the first processing submodule is configured to display a selectable layer list on the user interface when the contact matches a predetermined layer switching gesture, and confirm a user selection of one of the layers.
- the first switching execution submodule is configured to display a layer selected by the user on the user interface.
- the switching module 123 includes: a second processing submodule and a second switching execution submodule.
- the second processing submodule is configured to determine, according to a set order, a next layer of a layer currently displayed on the user interface when the contact matches a predetermined layer switching gesture.
- the second switching execution sub-module is configured to display the determined layer on the user interface.
- FIG. 13 is a schematic structural diagram of a user interface management apparatus according to Embodiment 7 of the present invention.
- the apparatus includes: a detection module 131, a determination module 132, and a locking module 133.
- the detecting module 131 is configured to detect a contact of the touch display screen.
- the determining module 132 is configured to determine a gesture that matches the contact.
- the locking module 133 is configured to lock the layer currently displayed on the user interface when the contact matches a predetermined layer locking gesture.
- the layer includes a base layer and an extended layer, where the base layer includes icons of all applications, and different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
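The interplay between the locking module and layer switching could be sketched as follows (an illustrative sketch, not the patent's implementation; the `LockableInterface` class and gesture names are assumptions):

```python
class LockableInterface:
    """Sketch: a lock gesture pins the currently displayed layer so that
    subsequent switch gestures leave it unchanged."""

    def __init__(self, layers):
        self.layers = layers
        self.current = 0
        self.locked = False

    def on_gesture(self, gesture):
        if gesture == "lock":
            self.locked = True                 # lock the currently displayed layer
        elif gesture == "switch" and not self.locked:
            self.current = (self.current + 1) % len(self.layers)
        return self.layers[self.current]

ui = LockableInterface(["base", "work", "online"])
assert ui.on_gesture("switch") == "work"
ui.on_gesture("lock")
assert ui.on_gesture("switch") == "work"   # locked: switching has no effect
```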
- FIG. 14 is a schematic structural diagram of a user interface management apparatus according to Embodiment 8 of the present invention.
- the apparatus includes: a detection module 141, a determination module 142, and a deletion module 143.
- the detecting module 141 is configured to detect a contact on the touch display screen.
- the determining module 142 is configured to determine a gesture that matches the contact.
- the deleting module 143 is configured to delete an icon of the application corresponding to the contact in the currently displayed layer on the user interface when the contact matches the predetermined application deletion gesture.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- the deleting module 143 includes: a determining submodule and a deleting execution submodule.
- the determining submodule is configured to determine a layer type currently displayed on the user interface.
- the deletion execution sub-module is configured to: when the base layer is currently displayed on the user interface, uninstall the application corresponding to the contact and delete the icon of that application from each layer; when an extended layer is currently displayed on the user interface, delete the icon of the application corresponding to the contact from the currently displayed extended layer.
- when an extended layer is currently displayed on the user interface, the deletion execution sub-module may also display a prompt of the deletion mode on the user interface, and then, according to the user's selection, either delete the icon of the application corresponding to the contact from the currently displayed extended layer, or uninstall the application corresponding to the contact and delete its icon from each layer.
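The two deletion paths just described can be sketched as one function (a sketch under assumed names; the dictionary layout and the `uninstall_choice` flag, which stands in for the user's selection at the prompt, are assumptions):

```python
def delete_icon(layers, current_layer, app, uninstall_choice=False):
    """Sketch of the deletion execution sub-module.

    On the base layer, deletion always uninstalls the application and
    removes its icon from every layer.  On an extended layer, the user's
    choice (modelled by `uninstall_choice`) decides between removing the
    icon from the current layer only and uninstalling everywhere.
    """
    if current_layer == "base" or uninstall_choice:
        for icons in layers.values():       # uninstall: purge from each layer
            icons.discard(app)
    else:
        layers[current_layer].discard(app)  # remove from the current layer only

layers = {"base": {"mail", "maps"}, "work": {"mail"}, "online": {"mail", "maps"}}
delete_icon(layers, "work", "mail")   # extended layer: remove the icon only
assert layers == {"base": {"mail", "maps"}, "work": set(), "online": {"mail", "maps"}}
delete_icon(layers, "base", "maps")   # base layer: uninstall everywhere
assert "maps" not in layers["online"]
```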
- FIG. 15 is a schematic structural diagram of a user interface management apparatus according to Embodiment 9 of the present invention.
- the apparatus includes: a detection module 151, a determination module 152, and an addition module 153.
- the detecting module 151 is configured to detect a contact on the touch display screen.
- the determining module 152 is configured to determine a gesture that matches the contact.
- the adding module 153 is configured to add an icon of an application to the layer currently displayed on the user interface when the contact matches a predetermined application addition gesture.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- the adding module 153 includes: a display sub-module and an adding execution sub-module.
- the display submodule is configured to display a selectable application list on the user interface.
- the adding execution sub-module is configured to confirm a user's selection of one or more applications, and add the icons of the applications selected by the user to the layer currently displayed on the user interface.
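The display sub-module and the adding execution sub-module could be sketched together (an illustrative sketch only; the assumption that the selectable list is "all applications not yet in the current layer" is mine, not the patent's):

```python
def selectable_apps(layers, current_layer):
    """Sketch of the display sub-module: applications that could still be
    added to the current layer.  Assumes the base layer lists every
    installed application."""
    return sorted(layers["base"] - layers[current_layer])

def add_selected(layers, current_layer, selected):
    """Sketch of the adding execution sub-module: add the icons of the
    user-selected applications to the currently displayed layer."""
    layers[current_layer].update(selected)

layers = {"base": {"mail", "maps", "notes"}, "work": {"mail"}}
assert selectable_apps(layers, "work") == ["maps", "notes"]
add_selected(layers, "work", ["notes"])
assert layers["work"] == {"mail", "notes"}
```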
- FIG. 16 is a schematic structural diagram of a user interface management apparatus according to Embodiment 10 of the present invention.
- the apparatus includes: a detection module 161, a determination module 162, and an editing module 163.
- the detecting module 161 is configured to detect a contact on the touch display screen.
- the determining module 162 is configured to determine a gesture that matches the contact.
- the editing module 163 is configured to: when the contact matches a predetermined layer editing gesture, control the user interface to enter an editing state of the layer; and receive an editing instruction input by the user in the editing state and perform the corresponding editing operation on the layer.
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- the editing operation includes at least one of: adding an extended layer, deleting an extended layer, merging two or more extended layers, changing the name of an extended layer, changing the background of an extended layer, and changing the icon style of an application.
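Two of the editing operations listed above, merging and renaming extended layers, might be sketched as follows (a sketch under assumed names; the dictionary representation and function names are mine, not the patent's):

```python
def merge_extended_layers(layers, name_a, name_b, merged_name):
    """Sketch of one editing operation: merge two extended layers into a
    new layer containing the union of their icons."""
    layers[merged_name] = layers.pop(name_a) | layers.pop(name_b)

def rename_extended_layer(layers, old_name, new_name):
    """Sketch of another editing operation: change an extended layer's name."""
    layers[new_name] = layers.pop(old_name)

layers = {"base": {"mail", "maps", "notes"}, "work": {"mail"}, "online": {"maps"}}
merge_extended_layers(layers, "work", "online", "work+online")
assert layers["work+online"] == {"mail", "maps"}
rename_extended_layer(layers, "work+online", "busy")
assert "busy" in layers and "work+online" not in layers
```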
- Embodiment 11
- FIG. 17 is a schematic structural diagram of a user interface management apparatus according to Embodiment 11 of the present invention.
- the apparatus includes: a detection module 171, a determination module 172, and a management module 173.
- the detecting module 171 is configured to detect a contact on the touch display screen.
- the determining module 172 is configured to determine a gesture that matches the contact.
- the management module 173 is configured to: when the contact matches a predetermined layer switching gesture, switch the layer displayed on the user interface; when the contact matches a predetermined layer locking gesture, lock the layer currently displayed on the user interface; when the contact matches a predetermined application deletion gesture, delete the icon of the application corresponding to the contact in the layer currently displayed on the user interface; when the contact matches a predetermined application addition gesture, add an icon of the application to the layer currently displayed on the user interface; and when the contact matches a predetermined layer editing gesture, enter the editing state of the layer, receive the editing instruction input by the user in the editing state, and perform the corresponding editing operation on the layer.
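The management module's fan-out from recognised gestures to operations resembles a dispatch table, which could be sketched like this (an illustrative sketch; the `ManagementModule` class, gesture names, and the assumption that the contact has already been classified into a gesture are all mine):

```python
class ManagementModule:
    """Sketch: dispatch a recognised gesture to the matching operation."""

    def __init__(self):
        self.handlers = {}

    def register(self, gesture, handler):
        self.handlers[gesture] = handler

    def on_contact(self, gesture, *args):
        # In the described flow the gesture is first determined from the
        # contact; here we assume that classification has already happened.
        handler = self.handlers.get(gesture)
        return handler(*args) if handler else None

log = []
mm = ManagementModule()
mm.register("switch", lambda: log.append("switch layer"))
mm.register("lock",   lambda: log.append("lock layer"))
mm.on_contact("switch")
mm.on_contact("lock")
assert log == ["switch layer", "lock layer"]
```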
- the layer includes a base layer and an extended layer
- the base layer includes icons of all applications
- different extended layers include icons of different categories of applications.
- the categories here are custom categories such as "no network", "work" and "online".
- An embodiment of the present invention further provides an apparatus, where the apparatus includes:
- one or more processors;
- memory; and
- one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the following functions:
- An embodiment of the present invention further provides a second device, where the device includes:
- one or more processors;
- memory; and
- one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the following functions: detecting a contact on the touch display screen;
- the embodiment of the invention further provides a third device, the device comprising:
- one or more processors;
- memory; and
- one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the following functions:
- if the contact matches a predetermined application deletion gesture, deleting the icon of the application corresponding to the contact in the layer currently displayed on the user interface;
- the embodiment of the invention further provides a fourth device, the device comprising:
- one or more processors;
- memory; and
- one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the following functions:
- the embodiment of the invention further provides a fifth device, the device comprising:
- one or more processors;
- memory; and
- one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the following functions:
- the embodiment of the invention further provides a sixth device, the device comprising:
- one or more processors;
- memory; and
- one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the following functions:
- if the contact matches a predetermined layer locking gesture, locking the layer currently displayed on the user interface; if the contact matches a predetermined application deletion gesture, deleting the icon of the application corresponding to the contact in the layer currently displayed on the user interface;
- the embodiment of the present invention further provides a non-volatile readable storage medium, where the storage medium stores one or more programs; when the one or more programs are executed by a device having a touch screen, the device performs the following steps (instructions):
- an embodiment of the present invention provides a second non-volatile readable storage medium, where one or more programs are stored; when the one or more programs are executed by a device having a touch screen, the device performs the following steps (instructions):
- an embodiment of the present invention provides a third non-volatile readable storage medium, where one or more programs are stored; when the one or more programs are executed by a device having a touch screen, the device performs the following steps (instructions):
- if the contact matches a predetermined application deletion gesture, deleting the icon of the application corresponding to the contact in the layer currently displayed on the user interface;
- an embodiment of the present invention provides a fourth non-volatile readable storage medium, where one or more programs are stored; when the one or more programs are executed by a device having a touch screen, the device performs the following steps (instructions):
- an embodiment of the present invention provides a fifth non-volatile readable storage medium, where one or more programs are stored; when the one or more programs are executed by a device having a touch screen, the device performs the following steps (instructions):
- an embodiment of the present invention provides a sixth non-volatile readable storage medium, where one or more programs are stored; when the one or more programs are executed by a device having a touch screen, the device performs the following steps (instructions):
- if the contact matches a predetermined layer locking gesture, locking the layer currently displayed on the user interface; if the contact matches a predetermined application deletion gesture, deleting the icon of the application corresponding to the contact in the layer currently displayed on the user interface;
- a person skilled in the art may understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing related hardware; the program may be stored in a computer-readable storage medium.
- the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Digital Computer Display Output (AREA)
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13855130.4A EP2871569B1 (en) | 2012-11-16 | 2013-07-05 | Method and device for user interface management |
RU2014152055A RU2606553C2 (ru) | 2012-11-16 | 2013-07-05 | Способ и устройство для управления пользовательским интерфейсом |
MX2014015547A MX350834B (es) | 2012-11-16 | 2013-07-05 | Método y dispositivo para manejar la interfaz de usuario. |
JP2015516440A JP6139673B2 (ja) | 2012-11-16 | 2013-07-05 | ユーザインタフェース管理方法、ユーザインタフェース管理装置、プログラム及び記録媒体 |
BR112014032281A BR112014032281A2 (pt) | 2012-11-16 | 2013-07-05 | método e dispositivo para gerenciar interface de usuário |
KR1020147035122A KR101676155B1 (ko) | 2012-11-16 | 2013-07-05 | 사용자 인터페이스의 관리방법, 장치, 프로그램 및 기록매체 |
US14/076,273 US9459760B2 (en) | 2012-11-16 | 2013-11-11 | Method and device for managing a user interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210465079.6A CN103019586B (zh) | 2012-11-16 | 2012-11-16 | 用户界面管理方法及装置 |
CN201210465079.6 | 2012-11-16 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/076,273 Continuation-In-Part US9459760B2 (en) | 2012-11-16 | 2013-11-11 | Method and device for managing a user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014075451A1 true WO2014075451A1 (zh) | 2014-05-22 |
Family
ID=47968240
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2013/078896 WO2014075451A1 (zh) | 2012-11-16 | 2013-07-05 | 用户界面管理方法及装置 |
Country Status (8)
Country | Link |
---|---|
EP (1) | EP2871569B1 (zh) |
JP (1) | JP6139673B2 (zh) |
KR (1) | KR101676155B1 (zh) |
CN (1) | CN103019586B (zh) |
BR (1) | BR112014032281A2 (zh) |
MX (1) | MX350834B (zh) |
RU (1) | RU2606553C2 (zh) |
WO (1) | WO2014075451A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2947538A3 (en) * | 2014-05-23 | 2016-01-13 | LG Electronics Inc. | Mobile terminal and method of controlling the same |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
EP3264252B1 (en) | 2012-05-09 | 2019-11-27 | Apple Inc. | Device, method, and graphical user interface for performing an operation in accordance with a selected mode of operation |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
EP2847657B1 (en) | 2012-05-09 | 2016-08-10 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
DE112013002387T5 (de) | 2012-05-09 | 2015-02-12 | Apple Inc. | Vorrichtung, Verfahren und grafische Benutzeroberfläche für die Bereitstellung taktiler Rückkopplung für Operationen in einer Benutzerschnittstelle |
CN104487928B (zh) | 2012-05-09 | 2018-07-06 | 苹果公司 | 用于响应于手势而在显示状态之间进行过渡的设备、方法和图形用户界面 |
CN109298789B (zh) | 2012-05-09 | 2021-12-31 | 苹果公司 | 用于针对激活状态提供反馈的设备、方法和图形用户界面 |
EP3410287B1 (en) | 2012-05-09 | 2022-08-17 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
US9459760B2 (en) | 2012-11-16 | 2016-10-04 | Xiaomi Inc. | Method and device for managing a user interface |
WO2014105277A2 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
EP3564806B1 (en) | 2012-12-29 | 2024-02-21 | Apple Inc. | Device, method and graphical user interface for determining whether to scroll or select contents |
CN104345881B (zh) * | 2013-08-09 | 2017-11-07 | 联想(北京)有限公司 | 信息处理的方法及电子设备 |
CN104423988B (zh) * | 2013-09-02 | 2019-02-05 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
CN103744653B (zh) * | 2013-12-20 | 2017-11-17 | 宇龙计算机通信科技(深圳)有限公司 | 移动终端及应用图标的管理方法 |
JP6278262B2 (ja) * | 2014-03-12 | 2018-02-14 | ヤマハ株式会社 | 表示制御装置 |
CN104932798A (zh) * | 2014-03-19 | 2015-09-23 | 宇龙计算机通信科技(深圳)有限公司 | 一种终端软件的管理方法及装置 |
CN104978103A (zh) * | 2014-04-08 | 2015-10-14 | 连科通讯股份有限公司 | 变更使用者界面为Skype 专用界面的方法及其界面变更系统 |
CN104217150B (zh) * | 2014-08-21 | 2019-01-08 | 百度在线网络技术(北京)有限公司 | 一种用于调用应用的方法与装置 |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
CN104765525A (zh) * | 2015-03-18 | 2015-07-08 | 百度在线网络技术(北京)有限公司 | 操作界面的切换方法及装置 |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
CN104915103B (zh) * | 2015-06-29 | 2018-06-05 | 努比亚技术有限公司 | 整理桌面图标的方法及移动终端 |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
CN105278839A (zh) * | 2015-09-30 | 2016-01-27 | 天脉聚源(北京)科技有限公司 | 动态切换图片的方法和装置 |
CN105138213A (zh) * | 2015-09-30 | 2015-12-09 | 天脉聚源(北京)科技有限公司 | 动态切换图片的方法和装置 |
CN105138214A (zh) * | 2015-09-30 | 2015-12-09 | 天脉聚源(北京)科技有限公司 | 动态切换图片的方法和装置 |
CN107315577A (zh) * | 2016-04-26 | 2017-11-03 | 斑马网络技术有限公司 | 业务处理方法、装置、终端设备和用户界面系统 |
CN106227410A (zh) * | 2016-07-26 | 2016-12-14 | 深圳天珑无线科技有限公司 | 一种信息删除方法及装置 |
CN109683796B (zh) * | 2018-12-25 | 2021-09-07 | 努比亚技术有限公司 | 一种交互控制方法、设备及计算机可读存储介质 |
CN111290672A (zh) * | 2020-01-17 | 2020-06-16 | 惠州Tcl移动通信有限公司 | 一种图像显示方法、装置、存储介质及终端 |
KR102301706B1 (ko) * | 2020-02-10 | 2021-09-14 | 한국과학기술원 | 맥신 섬유의 제조방법 및 이로부터 제조된 맥신 섬유 |
KR20230021618A (ko) | 2021-08-05 | 2023-02-14 | 주식회사 애니랙티브 | 라인레이저 센서기반의 홀로터치 디스플레이 시스템 및 그에 대한 구현 방법 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101763219A (zh) * | 2010-02-03 | 2010-06-30 | 北京优视动景网络科技有限公司 | 可触摸式液晶屏操作网页浏览器的用户界面方法及设备 |
CN101930282A (zh) * | 2009-06-27 | 2010-12-29 | 英华达(上海)电子有限公司 | 移动终端、基于移动终端的输入方法 |
CN103019556A (zh) * | 2012-11-21 | 2013-04-03 | 用友软件股份有限公司 | 快捷帮助信息显示系统和快捷帮助信息显示方法 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002055753A (ja) * | 2000-08-10 | 2002-02-20 | Canon Inc | 情報処理装置、機能一覧表表示方法、及び記憶媒体 |
US7663605B2 (en) * | 2003-01-08 | 2010-02-16 | Autodesk, Inc. | Biomechanical user interface elements for pen-based computers |
JP5255753B2 (ja) * | 2005-06-29 | 2013-08-07 | シャープ株式会社 | 情報端末装置および通信システム |
US7707514B2 (en) * | 2005-11-18 | 2010-04-27 | Apple Inc. | Management of user interface elements in a display environment |
KR101613838B1 (ko) * | 2009-05-19 | 2016-05-02 | 삼성전자주식회사 | 휴대 단말기의 홈 스크린 지원 방법 및 이를 지원하는 휴대 단말기 |
KR101620058B1 (ko) * | 2009-11-23 | 2016-05-24 | 삼성전자주식회사 | 가상 머신 간 화면 전환 장치 및 방법 |
US20120256959A1 (en) * | 2009-12-30 | 2012-10-11 | Cywee Group Limited | Method of controlling mobile device with touch-sensitive display and motion sensor, and mobile device |
JP2011180964A (ja) * | 2010-03-03 | 2011-09-15 | Kddi Corp | 携帯端末機、携帯端末電話機の画面表示方法および表示プログラム |
KR101728725B1 (ko) * | 2010-10-04 | 2017-04-20 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
CN102566850A (zh) * | 2010-12-14 | 2012-07-11 | 上海三旗通信科技股份有限公司 | 一种移动终端综合管理系统 |
KR101788049B1 (ko) * | 2010-12-15 | 2017-10-19 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
JP2012198626A (ja) * | 2011-03-18 | 2012-10-18 | Panasonic Corp | 情報端末、表示画面切り替えのための方法、及びそのプログラム |
CN102736906A (zh) * | 2011-04-14 | 2012-10-17 | 上海三旗通信科技股份有限公司 | 在个人终端上实现多种操作界面切换和管理的方式 |
JP2012226516A (ja) * | 2011-04-19 | 2012-11-15 | Sony Corp | 電子機器、表示制御方法及びプログラム |
CN102184255A (zh) * | 2011-05-30 | 2011-09-14 | 昆山富泰科电脑有限公司 | 在便携式多功能设备上进行多网页浏览切换的方法与用户图形界面 |
RU2455676C2 (ru) * | 2011-07-04 | 2012-07-10 | Общество с ограниченной ответственностью "ТРИДИВИ" | Способ управления устройством с помощью жестов и 3d-сенсор для его осуществления |
CN102298502A (zh) * | 2011-09-26 | 2011-12-28 | 鸿富锦精密工业(深圳)有限公司 | 触摸型电子装置及其图标换页的方法 |
CN102760043B (zh) * | 2012-06-19 | 2016-01-27 | 华为终端有限公司 | 一种用户界面的图标管理方法及触控设备 |
- 2012-11-16 CN CN201210465079.6A patent/CN103019586B/zh active Active
- 2013-07-05 MX MX2014015547A patent/MX350834B/es active IP Right Grant
- 2013-07-05 RU RU2014152055A patent/RU2606553C2/ru active
- 2013-07-05 KR KR1020147035122A patent/KR101676155B1/ko active IP Right Grant
- 2013-07-05 BR BR112014032281A patent/BR112014032281A2/pt not_active IP Right Cessation
- 2013-07-05 JP JP2015516440A patent/JP6139673B2/ja active Active
- 2013-07-05 EP EP13855130.4A patent/EP2871569B1/en active Active
- 2013-07-05 WO PCT/CN2013/078896 patent/WO2014075451A1/zh active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101930282A (zh) * | 2009-06-27 | 2010-12-29 | 英华达(上海)电子有限公司 | 移动终端、基于移动终端的输入方法 |
CN101763219A (zh) * | 2010-02-03 | 2010-06-30 | 北京优视动景网络科技有限公司 | 可触摸式液晶屏操作网页浏览器的用户界面方法及设备 |
CN103019556A (zh) * | 2012-11-21 | 2013-04-03 | 用友软件股份有限公司 | 快捷帮助信息显示系统和快捷帮助信息显示方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2871569A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2947538A3 (en) * | 2014-05-23 | 2016-01-13 | LG Electronics Inc. | Mobile terminal and method of controlling the same |
Also Published As
Publication number | Publication date |
---|---|
KR101676155B1 (ko) | 2016-11-14 |
BR112014032281A2 (pt) | 2017-06-27 |
EP2871569B1 (en) | 2019-05-01 |
CN103019586B (zh) | 2017-03-15 |
EP2871569A4 (en) | 2016-03-02 |
EP2871569A1 (en) | 2015-05-13 |
MX2014015547A (es) | 2015-04-08 |
KR20150012292A (ko) | 2015-02-03 |
CN103019586A (zh) | 2013-04-03 |
MX350834B (es) | 2017-09-22 |
JP6139673B2 (ja) | 2017-05-31 |
JP2015523648A (ja) | 2015-08-13 |
RU2606553C2 (ru) | 2017-01-10 |
RU2014152055A (ru) | 2016-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014075451A1 (zh) | 用户界面管理方法及装置 | |
EP3951578B1 (en) | Processing method, device and apparatus for split-screen display, and storage medium | |
JP5728562B2 (ja) | アプリケーションの保安性管理方法及びその電子装置 | |
US9459760B2 (en) | Method and device for managing a user interface | |
KR101278346B1 (ko) | 이벤트 인식 | |
WO2017032005A1 (zh) | 一种操作菜单显示方法及终端 | |
US20110179097A1 (en) | Creating virtual targets in directory structures | |
US20130234963A1 (en) | File management method and electronic device having file management function | |
US20120144293A1 (en) | Display apparatus and method of providing user interface thereof | |
RU2623885C2 (ru) | Запись формулы для ограниченного устройства отображения | |
KR20130093043A (ko) | 터치 및 스와이프 내비게이션을 위한 사용자 인터페이스 방법 및 모바일 디바이스 | |
TWI512601B (zh) | 電子裝置及其控制方法與電腦程式產品 | |
KR20190039521A (ko) | 호버를 사용한 디바이스 조작 | |
KR20130127146A (ko) | 다중 터치에 대응하는 기능을 처리하기 위한 방법 및 그 전자 장치 | |
CN103809871A (zh) | 应用程序图标的处理方法和移动终端 | |
WO2014029207A1 (zh) | 基于触摸屏的多选处理方法和用户设备 | |
TW201523433A (zh) | 用於在不相似螢幕上顯示應用程式資料的遠端控制 | |
JP2012203899A (ja) | ユーザーインターフェースをカスタマイズする方法とその電子装置 | |
KR20140136855A (ko) | 기능 실행 방법 및 그 전자 장치 | |
WO2013182141A1 (zh) | 一种人机交互方法、装置及其电子设备 | |
US11169652B2 (en) | GUI configuration | |
KR102192233B1 (ko) | 디스플레이 장치 및 그 제어 방법 | |
WO2022218192A1 (zh) | 文件处理方法及装置 | |
US10698566B2 (en) | Touch control based application launch | |
TWI515643B (zh) | 電子裝置的操作方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13855130 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20147035122 Country of ref document: KR Kind code of ref document: A Ref document number: 2015516440 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2014/015547 Country of ref document: MX |
ENP | Entry into the national phase |
Ref document number: 2014152055 Country of ref document: RU Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2013855130 Country of ref document: EP |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014032281 Country of ref document: BR |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 112014032281 Country of ref document: BR Kind code of ref document: A2 Effective date: 20141222 |