CN112148193A - Navigation gesture setting method and device and electronic equipment - Google Patents


Info

Publication number
CN112148193A
CN112148193A
Authority
CN
China
Prior art keywords
navigation gesture
input
navigation
desktop
setting control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011057134.9A
Other languages
Chinese (zh)
Inventor
张磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202011057134.9A priority Critical patent/CN112148193A/en
Publication of CN112148193A publication Critical patent/CN112148193A/en
Priority to PCT/CN2021/120656 priority patent/WO2022068725A1/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a navigation gesture setting method and apparatus and an electronic device, belonging to the technical field of mobile terminals. The setting method comprises: receiving a first input for a navigation gesture setting control while the navigation gesture setting control is displayed; in response to the first input, displaying a navigation gesture setting window for the region corresponding to the navigation gesture setting control, the window comprising a plurality of identifiers indicating navigation gestures; receiving a second input for the navigation gesture setting window; and in response to the second input, determining the navigation gesture of the region corresponding to the navigation gesture setting control. With this setting method, the user can invoke navigation gesture setting in any scene, which improves the convenience and intuitiveness of gesture setting, makes navigation gestures easier to set and understand, and reinforces the user's memory of the navigation gesture associated with each region.

Description

Navigation gesture setting method and device and electronic equipment
Technical Field
The present application relates to the technical field of mobile terminals, and in particular to a navigation gesture setting method and apparatus and an electronic device.
Background
Mobile terminals currently on the market support many navigation gestures, which are complex and varied; different models from different manufacturers may use different navigation gestures, and there is no unified management of them. Setting navigation gestures is therefore inconvenient: they can only be adjusted through the system navigation page of the settings application, the adjusted gestures are not displayed visually, and there is no real-time preview of the setting's effect, making navigation gestures difficult for users to set, understand, and remember.
Disclosure of Invention
Embodiments of the present application aim to provide a navigation gesture setting method and apparatus and an electronic device, which can solve the problems that navigation gestures are inconvenient to set and that the effect of a setting is not displayed.
In order to solve the technical problem, the following technical scheme is adopted in the application:
in a first aspect, an embodiment of the present application provides a method for setting a navigation gesture, including:
receiving a first input for a navigation gesture setting control while the navigation gesture setting control is displayed;
displaying a navigation gesture setting window of a region corresponding to the navigation gesture setting control in response to the first input, wherein the navigation gesture setting window comprises a plurality of marks used for indicating navigation gestures;
receiving a second input for the navigation gesture setting window;
and in response to the second input, determining the navigation gesture of the region corresponding to the navigation gesture setting control.
In a second aspect, an embodiment of the present application provides a setting apparatus for a navigation gesture, including:
the device comprises a first input module, a second input module and a display module, wherein the first input module is used for receiving a first input aiming at a navigation gesture setting control under the condition that the navigation gesture setting control is displayed;
the display module is used for responding to the first input and displaying a navigation gesture setting window of an area corresponding to the navigation gesture setting control, and the navigation gesture setting window comprises a plurality of marks used for indicating navigation gestures;
a second input module to receive a second input to set a window for the navigation gesture;
and the determining module is used for responding to the second input and determining the navigation gesture of the area corresponding to the navigation gesture setting control.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method as described in the embodiments of the first aspect.
In a fifth aspect, an embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the setting method as described in the first aspect.
The above technical solutions of the present application have the following beneficial effects:
In the navigation gesture setting method of the embodiments, a first input for a navigation gesture setting control is received while the control is displayed; in response to the first input, a navigation gesture setting window for the region corresponding to the control is displayed, the window comprising a plurality of identifiers indicating navigation gestures; a second input for the window is received; and in response to the second input, the navigation gesture of the region corresponding to the control is determined. Because the setting window displayed for a given control contains identifiers for the available navigation gestures, the user can select the desired gesture and thereby determine the navigation gesture of the control's region, so the user knows clearly which gesture applies to which region. During setting, the user can invoke navigation gesture configuration in any scene without entering the settings interface, which improves the convenience of gesture setting; the gesture of each region can be manipulated directly through the setting window, which improves intuitiveness; and the effect of the setting is shown in real time, which makes navigation gestures easier to set, understand, and remember.
Drawings
FIG. 1 is a schematic flowchart of a setting method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the display desktop during setting;
FIG. 3 is a schematic diagram of a thumbnail desktop generated on the display desktop during setting;
FIG. 4 is a schematic diagram of the navigation gesture setting window shown when the navigation gesture setting control is clicked during setting;
FIG. 5 is a schematic diagram of the toolbar during setting;
FIG. 6 is a schematic diagram of the three-segment navigation mode selected during setting;
FIG. 7 is a schematic diagram of a reminder shown when the navigation gesture does not satisfy the preset condition during setting;
FIG. 8 is a schematic diagram of another reminder shown when the navigation gesture does not satisfy the preset condition during setting;
FIG. 9 is a schematic diagram of the navigation gestures of the corresponding regions displayed on the display desktop in landscape orientation;
FIG. 10 is a schematic diagram of the thumbnail desktop on the display desktop;
FIG. 11 is a schematic diagram of a setting apparatus according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an electronic device in an embodiment of the present application;
FIG. 13 is another schematic diagram of an electronic device in an embodiment of the present application.
Reference numerals
A display desktop 10;
a first input module 11; a second input module 12;
a display module 20;
a determination module 30;
a thumbnail desktop 70; navigation gesture setting controls 71; a navigation gesture setting window 72; a toolbar 73.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish similar objects and do not necessarily describe a particular order or sequence. It should be understood that data so termed are interchangeable where appropriate, so that the embodiments of the application may be practiced in orders other than those illustrated or described herein. Objects distinguished by "first", "second", and the like are usually of one class, and their number is not limited; for example, there may be one or more first objects. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The following describes in detail a setting method provided by the embodiment of the present application through a specific embodiment and an application scenario thereof with reference to fig. 1 to fig. 10.
As shown in fig. 1, a method for setting a navigation gesture according to an embodiment of the present application includes:
step S1, receiving a first input for a navigation gesture setting control while the navigation gesture setting control is displayed;
step S2, in response to the first input, displaying a navigation gesture setting window for the region corresponding to the navigation gesture setting control, the window comprising a plurality of identifiers indicating navigation gestures;
step S3, receiving a second input for the navigation gesture setting window;
step S4, in response to the second input, determining the navigation gesture of the region corresponding to the navigation gesture setting control.
That is, in step S1, while the navigation gesture setting control is displayed, a first input for the navigation gesture setting control 71 is received, where the first input may be a gesture, voice, text, or other signal input. For example, as shown in fig. 3, in response to an input, a navigation gesture setting control 71 is displayed on the display desktop 10. The control 71 may take different forms, such as an indication bar, a bar-shaped box, a dialog box, or a virtual key, and while it is displayed, a first input for it may be received.
In step S2, as shown in fig. 4, in response to the first input, a navigation gesture setting window 72 for the region corresponding to the navigation gesture setting control 71 is displayed (for example, on the display desktop 10). The window 72 may be moved or dragged, and it may contain a plurality of identifiers indicating navigation gestures. An identifier may be text, a graphic, a number, or a symbol, and its shape, size, and color may all be chosen as needed. For example, when the identifiers are text, the window 72 may contain identifiers such as "return to main menu", "return to previous level", or "recent applications"; different identifiers indicate different navigation gestures, and the corresponding gesture can be selected by its identifier.
In step S3, a second input for the navigation gesture setting window 72 is received; the second input may be a gesture, voice, text, or other signal input.
In step S4, in response to the second input, the navigation gesture of the region corresponding to the navigation gesture setting control 71 is determined. After setting is completed, swiping up in that region triggers the corresponding operation; for example, swiping up may return to the main menu or enter a configured interface. In one embodiment, an input is received (for example, long-pressing the icon of application A as shown in fig. 2); in response, a navigation gesture setting control 71 (for example, an indication bar) is displayed on the display desktop 10 (for example, at its left edge); a first input for the indication bar (for example, a click) is received; and in response, a navigation gesture setting window 72 is displayed at the left edge of the display desktop 10, containing the identifiers "return to main menu", "return to previous level", and "recent applications". Clicking or long-pressing the "return to main menu" identifier determines the navigation gesture of the left edge of the display desktop 10: that edge's gesture is now the one indicated by "return to main menu", and the specific gesture indicated by that identifier can be configured as needed.
Thus, in this setting method, a navigation gesture setting window corresponding to the navigation gesture setting control is displayed, the window contains identifiers for the available navigation gestures, the desired gesture can be selected, and the navigation gesture of the control's region is then determined, so the user knows clearly which gesture applies to which region. During setting, the user can invoke navigation gesture configuration in any scene without entering the settings interface, which improves the convenience of gesture setting; the gesture of each region can be manipulated directly through the setting window, which improves intuitiveness; and the effect of the setting is shown in real time, which makes navigation gestures easier to set, understand, and remember.
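Steps S1 through S4 can be pictured as a small event-driven model. The sketch below is purely illustrative (the class, method, and gesture names are assumptions, not part of the patent):

```python
class NavigationGestureSetter:
    """Minimal sketch of steps S1-S4: a first input on a displayed control
    opens that region's setting window; a second input on the window binds
    the chosen identifier's gesture to the region."""

    def __init__(self, gestures_by_identifier):
        # e.g. {"return to main menu": "swipe_up_home", ...}
        self.gestures_by_identifier = gestures_by_identifier
        self.open_window_region = None   # region whose window is shown
        self.gesture_by_region = {}      # region -> bound gesture

    def on_first_input(self, control_region):
        # S1/S2: open the setting window for the control's region and
        # return the identifiers it should display.
        self.open_window_region = control_region
        return sorted(self.gestures_by_identifier)

    def on_second_input(self, identifier):
        # S3/S4: bind the selected identifier's gesture to the region.
        if self.open_window_region is None:
            raise RuntimeError("no setting window is open")
        gesture = self.gestures_by_identifier[identifier]
        self.gesture_by_region[self.open_window_region] = gesture
        self.open_window_region = None
        return gesture
```

For example, after `on_first_input("left_edge")` lists the identifiers, `on_second_input("return to main menu")` binds that identifier's gesture to the left-edge region, mirroring the flow described above.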
In some embodiments of the present application, as shown in fig. 3, the navigation gesture setting control 71 is a bar-shaped control displayed on the display desktop 10 and arranged along its edge. There may be one or more such controls; when there are several, they may be spaced (for example, evenly) around the edge of the display desktop 10, and the position, shape, color, and so on of each control 71 can be chosen as needed to make the controls easy to distinguish and remember. In practice, controls 71 may be displayed at the left, right, upper, or lower edge of the display desktop 10; each edge may be divided into several segments, for example two or three, and the region of each control 71 can be assigned a different navigation gesture as needed.
In other embodiments of the present application, displaying the navigation gesture setting control comprises:
in response to a third input, displaying a thumbnail desktop 70 on the display desktop 10, the area of the thumbnail desktop 70 being smaller than that of the display desktop 10, and displaying a navigation gesture setting control 71 in the non-thumbnail region of the display desktop 10, that is, the part of the display desktop 10 not covered by the thumbnail desktop. As shown in fig. 3, a third input is received, and in response a shrunken thumbnail desktop 70 is displayed on the display desktop 10; its content is the same as that of the actual desktop, and its area is smaller. The thumbnail desktop 70 may be placed in the middle of the display desktop 10, leaving a non-thumbnail region at the edges of the display desktop 10 in which the control 71 is displayed, so that setting a navigation gesture neither interferes with normal use and operation of the desktop nor loses intuitiveness.
Alternatively, as shown in fig. 3, the navigation gesture setting controls 71 may be displayed around the periphery of the thumbnail desktop 70. There may be one or more of them; when there are several, they may be spaced around the periphery of the thumbnail desktop 70, and the position, shape, color, and so on of each control 71 can be chosen as needed, which makes the controls easy to distinguish and remember, makes navigation gestures easy to set, and does not affect normal use and operation of the desktop.
In some embodiments, the method of setting further comprises:
receiving a fourth input, namely a touch operation on the thumbnail desktop 70;
and in response to the fourth input, obtaining the abscissa and ordinate of the touch operation, computing the corresponding actual coordinates on the display desktop 10, and executing the corresponding operation at those coordinates. While a navigation gesture is being set, the display desktop can thus be operated in real time through the thumbnail desktop 70 without affecting its use and operation.
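The coordinate conversion described here amounts to a proportional mapping from the thumbnail to the full screen. A minimal sketch (illustrative Python; the parameter names are assumptions):

```python
def to_actual_coords(touch_x, touch_y, thumb_origin, thumb_size, screen_size):
    """Map a touch on the thumbnail desktop to the actual display desktop.

    touch_x, touch_y: touch position in screen coordinates
    thumb_origin:     (left, top) of the thumbnail's upper-left corner
    thumb_size:       (width, height) of the thumbnail
    screen_size:      (width, height) of the actual display desktop
    """
    ox, oy = thumb_origin
    tw, th = thumb_size
    sw, sh = screen_size
    # Ratio of the touch offset within the thumbnail, scaled to the screen.
    actual_x = (touch_x - ox) / tw * sw
    actual_y = (touch_y - oy) / th * sh
    return actual_x, actual_y
```

For example, a touch at the center of an 864x1920 thumbnail with origin (108, 240) on a 1080x2400 screen maps to the center of the screen, so tapping an icon in the thumbnail dispatches the click to the same icon on the real desktop.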
In a concrete implementation, a desktop icon or desktop widget is displayed on the display desktop 10; clicking it shrinks the screen so that the thumbnail desktop 70 is displayed on the display desktop 10. After the screen shrinks, indication bars are generated around the thumbnail desktop 70, and as shown in fig. 5, a toolbar 73 may be displayed above the thumbnail desktop 70 or the display desktop 10, with options to cancel, rotate, and complete the setting. Clicking "complete" makes the configured navigation gesture take effect and exits the shrunken mode: the indication bars disappear, and only the display desktop 10, expanded back to full screen, remains. Clicking "cancel" discards the configured gesture and exits the shrunken mode. In response to the first input, a navigation gesture setting window 72 is displayed for the region of the indication bar on the display desktop 10, with a plurality of identifiers indicating navigation gestures, such as "return to main menu", "return to previous level", or "recent applications"; in response to the second input, the navigation gesture of the region corresponding to the control 71 is determined. For example, after the "return to main menu" identifier is selected in the window 72 and "complete" is clicked in the toolbar, the configured gesture takes effect, the shrunken mode is exited, and the indication bars disappear.
After the navigation gesture setting control 71 (e.g., an indication bar) is clicked, the navigation gesture setting window 72 for its region pops up. For example, as shown in figs. 4 and 5, after the indication bar at the bottom of the display desktop 10 is clicked, the window 72 (e.g., a pop-up) offers navigation mode options, which may include "full-screen gestures", "classic three-segment", "navigation keys", and the like, and the navigation gesture of the control's region can be selected and adjusted in the window 72. As shown in figs. 4 and 5, different navigation modes can be selected in the window 72. When the classic three-segment mode is selected, the state of the surrounding indication bars changes dynamically to the effect in fig. 6: three segments of indication bars are displayed along the lower edge of the display desktop 10, the user can reassign the navigation gesture of each segment's region, regions without an indication bar cannot be set, and configured gestures take effect in real time. As shown in fig. 6, the bar at the lower edge of the display desktop 10 is divided into three parts: a bottom-left bar, a bottom-middle bar, and a bottom-right bar. For the bottom-left bar, three navigation gestures (recent applications, control center, and return) can be offered for the user to choose from. If the user selects recent applications, swiping up in the bottom-left region of the display desktop 10 triggers the recent-applications function; if the user selects return, swiping up in that region triggers the return operation. The selection logic of the other indication bars is the same. In addition, an entry for this method can be added to the system navigation settings list so that the user can adjust navigation quickly. The display desktop on the screen can be shrunk with a grab gesture to generate the thumbnail desktop, and the thumbnail desktop can be enlarged with a spread gesture so that it is no longer displayed, which makes it convenient for the user to enter the setting interface. The setting interface can also be invoked by voice recognition: on receiving phrases such as "shrink screen" or "adjust gestures", the electronic device (e.g., a mobile phone) enters the setting interface.
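The per-segment selection logic of the three-segment mode can be pictured as a small dispatch table. The sketch below is a hypothetical illustration (the segment and gesture names, and the default assignment, are assumptions):

```python
# The three choices offered for each bottom-bar segment.
CHOICES = ("recent_apps", "control_center", "back")

class ThreeSegmentBar:
    """Each bottom-bar segment maps to whichever gesture the user selected;
    a swipe-up in a segment's region triggers its assigned function."""

    def __init__(self):
        # An assumed default assignment for illustration only.
        self.assignment = {"bottom_left": "back",
                           "bottom_middle": "recent_apps",
                           "bottom_right": "control_center"}

    def assign(self, segment, choice):
        if choice not in CHOICES:
            raise ValueError(f"unknown choice: {choice}")
        self.assignment[segment] = choice  # takes effect immediately

    def on_swipe_up(self, segment):
        return self.assignment[segment]
```

So after `assign("bottom_left", "recent_apps")`, a swipe-up in the bottom-left region resolves to the recent-applications function, matching the behavior described above.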
The shrunken mode can be implemented as follows: add a window covering the display desktop 10, and add a black mask on that window; in addition, capture the image of the display desktop 10 in real time, scale it (for example, by a ratio of 10:8), and place the scaled image at the center of the mask, which completes the transition from the full display desktop 10 to the shrunken view. Whenever the content of the actual display desktop 10 is updated, its mirror image is captured in real time and displayed, scaled, at the center of the mask; this completes the rendering of the thumbnail desktop and achieves the effect of displaying the thumbnail desktop on the display desktop 10. Click handling on the shrunken desktop works as follows: as shown in fig. 10, when the thumbnail desktop 70 is clicked, the abscissa X and ordinate Y of the click relative to the origin at the upper-left corner of the thumbnail desktop 70 are obtained, together with the thumbnail's length and width. The actual click coordinates are computed by taking the ratio of the click coordinates to the thumbnail's length and width and multiplying by the length and width of the actual display desktop 10. The event is then packaged with the resulting actual coordinates and dispatched to that position, so the screen shows the corresponding effect; in this way, click response on the thumbnail desktop is achieved.
The layout after shrinking can be as shown in fig. 3: the display desktop 10 on the screen mainly shows the peripheral navigation gesture setting controls 71 (such as indication bars) and the thumbnail desktop produced by shrinking. Clicking a peripheral indication bar pops up the corresponding navigation gesture setting window 72 (such as a selection box); clicking an application icon in the thumbnail desktop 70 obtains the abscissa x and ordinate y on the thumbnail, converts the event according to the coordinate formula, and thereby performs the actual operation on the display desktop. The display desktop can thus be operated in real time through the thumbnail desktop, and the use and operation of the display desktop are not affected while navigation gestures are being set.
In other embodiments, the setting method further comprises:
receiving a fifth input for the navigation gesture setting control 71;
and in response to the fifth input, moving the navigation gesture setting control 71 on the display desktop 10. That is, the position of the control 71 on the display desktop 10, and thus the region whose navigation gesture is being set, can be moved as needed. For example, the control 71 may initially be at the left edge of the display desktop 10; to set a navigation gesture for the right edge instead, a fifth input for the control 71 is received, and in response the control is moved from the left edge to the right edge. Then, in response to the first input, the navigation gesture setting window 72 for the control's region (now the right edge of the display desktop 10) is displayed, containing a plurality of identifiers indicating navigation gestures; a second input for the window 72 is received; and in response to the second input, the setting of the navigation gesture for the right edge of the display desktop 10 is completed.
Optionally, determining the navigation gesture of the area corresponding to the navigation gesture setting control 71 further includes:
displaying reminder information if the determined navigation gesture does not meet a preset condition. When the navigation gestures set by the user are incomplete in some way, such as lacking a "back" action or a "control center" action, a pop-up box can remind the user and offer reasonable suggestions to choose from. For example, as shown in fig. 7, when the selected navigation gestures include neither "control center" nor "back", the following may be displayed: currently there is no "control center", and there is no "back". As shown in fig. 8, it is also possible to display: currently there is no "control center"; it is suggested to adjust the [top right side] of the display desktop 10 to open "control center". The user can then set gestures according to the reminder information, or the suggestion in it, so that the navigation gesture setting meets the preset condition.
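The "preset condition" check can be sketched as verifying that every required system function is reachable by some gesture. The required set below is an assumption based on the examples ("back" and "control center") given in the text:

```python
REQUIRED_FUNCTIONS = {"back", "control center"}   # assumed preset condition

def gesture_reminders(bindings):
    """Return a reminder line for each required function that no edge
    gesture currently triggers; an empty list means the condition is met."""
    assigned = set(bindings.values())
    missing = sorted(REQUIRED_FUNCTIONS - assigned)
    return [f'Currently there is no "{fn}"' for fn in missing]

print(gesture_reminders({"left": "back"}))        # "control center" missing
print(gesture_reminders({"left": "back", "right": "control center"}))
```

A non-empty result would drive the pop-up box; the suggestion text (e.g., which edge to assign) could be derived from whichever edge regions are still unbound.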
In some embodiments, the setting method further comprises:
receiving a sixth input;
in response to the sixth input, displaying the navigation gesture of the area corresponding to the navigation gesture setting control 71 on the display desktop 10 in portrait or landscape orientation, so that the user can see the navigation gesture of that area as displayed on the display desktop 10. For example, as shown in fig. 9, when the rotation control in the toolbar is clicked, both the current (portrait) navigation gesture configuration and the corresponding landscape configuration may be displayed. The user can thus clearly see on this interface how the navigation gestures operate in landscape mode, and clicking rotate again returns to portrait mode.
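The landscape preview can be modeled as re-mapping each edge region under a 90° rotation. The rotation direction and edge names here are assumptions for illustration only:

```python
# Assumed 90-degree clockwise screen rotation: the portrait left edge
# becomes the landscape top edge, and so on around the screen.
PORTRAIT_TO_LANDSCAPE = {"left": "top", "top": "right",
                         "right": "bottom", "bottom": "left"}

def landscape_preview(portrait_bindings):
    """Show where each portrait-mode gesture region ends up in landscape."""
    return {PORTRAIT_TO_LANDSCAPE[edge]: gesture
            for edge, gesture in portrait_bindings.items()}

print(landscape_preview({"left": "back", "bottom": "home"}))
```

Rendering this mapping alongside the portrait configuration is one way to realize the side-by-side preview that fig. 9 describes.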
It should be noted that the execution subject of the navigation gesture setting method provided in the embodiments of the present application may be a navigation gesture setting device, or a control module in the setting device for executing the setting method. In the embodiments of the present application, a setting device executing the navigation gesture setting method is taken as an example to describe the navigation gesture setting device provided herein.
An embodiment of the present application further provides a navigation gesture setting device.
As shown in fig. 11, the setting device for navigation gestures according to the embodiment of the present application includes:
the first input module 11 is configured to receive a first input for the navigation gesture setting control when the navigation gesture setting control is displayed;
the display module 20 is configured to display, in response to the first input, a navigation gesture setting window of an area corresponding to the navigation gesture setting control, where the navigation gesture setting window includes a plurality of identifiers used for indicating navigation gestures;
a second input module 12, configured to receive a second input for the navigation gesture setting window;
and the determining module 30 is configured to determine, in response to the second input, a navigation gesture of the area corresponding to the navigation gesture setting control.
In the setting device of the present application, when the navigation gesture setting control is displayed, the first input module 11 receives a first input for the control, and the display module 20, in response, displays the navigation gesture setting window of the corresponding region; the window includes a plurality of identifiers indicating navigation gestures, from which the desired gesture can be selected. The second input module 12 then receives a second input for the navigation gesture setting window, and the determination module 30, in response, determines the navigation gesture of the region corresponding to the control. During this process, the user can invoke navigation gesture setting in any scenario without entering a settings interface for adjustment, which improves the convenience of gesture setting; the gesture of each region can be operated intuitively through the setting window, which improves the intuitiveness of gesture setting; and the setting effect is displayed in real time, which makes setting easier to understand and reinforces the user's memory of which gesture corresponds to which region.
In some embodiments, the navigation gesture setting control is a bar control that is displayed on the display desktop and is set along an edge of the display desktop.
In other embodiments, the display module 20 is configured to display a thumbnail desktop on the display desktop in response to the third input, the thumbnail desktop having an area smaller than an area of the display desktop, and display a navigation gesture setting control in an area of the display desktop that is not the thumbnail desktop.
Optionally, the navigation gesture setting control is displayed around the periphery of the thumbnail desktop.
Optionally, the setting device further comprises:
a fourth input module, configured to receive a fourth input of a touch operation on the thumbnail desktop;
and the processing module is used for responding to the fourth input, acquiring the abscissa and the ordinate of the touch operation, calculating the actual coordinates of the abscissa and the ordinate on the display desktop, and executing corresponding operation according to the actual coordinates.
In an embodiment of the present application, the setting device further includes:
a fifth input module to receive a fifth input to the navigation gesture setting control;
and the adjusting module is used for responding to a fifth input and moving the navigation gesture setting control on the display desktop.
In some embodiments, the setting device further comprises:
and the reminding module is used for displaying reminding information if the determined navigation gesture does not meet the preset condition.
Optionally, the setting device further comprises:
a sixth input module for receiving a sixth input;
and the display module 20 is configured to display, in response to the sixth input, the navigation gesture of the area corresponding to the navigation gesture setting control on the display desktop in a vertical screen or a horizontal screen manner.
The navigation gesture setting device in the embodiments of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile electronic device may be a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like. The embodiments of the present application are not specifically limited in this respect.
The navigation gesture setting device in the embodiments of the present application may be a device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited in this respect.
The setting device for the navigation gesture provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 10, and is not described here again to avoid repetition.
Optionally, as shown in fig. 12, an embodiment of the present application further provides an electronic device 1200, including a processor 1202, a memory 1201, and a program or instruction stored in the memory 1201 and executable on the processor 1202. When executed by the processor 1202, the program or instruction implements each process of the foregoing method embodiments and can achieve the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 13 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 110 through a power management system, so that charging, discharging, and power-consumption management are handled by the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently. Details are omitted here.
A user input unit 107, configured to receive a first input for a navigation gesture setting control in a case where the navigation gesture setting control is displayed;
the display unit 106 is configured to display, in response to the first input, a navigation gesture setting window of an area corresponding to the navigation gesture setting control, where the navigation gesture setting window includes multiple identifiers used for indicating navigation gestures;
a user input unit 107, configured to receive a second input for the navigation gesture setting window;
and the processor 110 is configured to determine, in response to the second input, a navigation gesture of a region corresponding to the navigation gesture setting control.
In the present application, a navigation gesture setting window of the region corresponding to the navigation gesture setting control is displayed; the window includes a plurality of identifiers indicating navigation gestures, from which the desired navigation gesture can be selected, and the navigation gesture of the corresponding region is then determined, so that the user clearly knows the navigation gesture of the region corresponding to the control. During setting, the user can invoke navigation gesture setting in any scenario without entering a settings interface for adjustment, improving the convenience of gesture setting; the gesture of each region can be operated intuitively through the setting window, improving the intuitiveness of gesture setting; and the setting effect is displayed in real time, making setting easier to understand and reinforcing the memory of which gesture corresponds to which region.
Optionally, the navigation gesture setting control is a strip control, and the navigation gesture setting control is displayed on the display desktop and is arranged along the edge of the display desktop.
Optionally, the display unit 106 is configured to display, in response to a third input, a thumbnail desktop on the display desktop, the area of the thumbnail desktop being smaller than that of the display desktop, and to display the navigation gesture setting control in the non-thumbnail area of the display desktop.
Optionally, the navigation gesture setting control is displayed around the periphery of the thumbnail desktop.
Optionally, the user input unit 107 is configured to receive a fourth input of a touch operation on the thumbnail desktop;
and the processor 110 is configured to, in response to the fourth input, acquire the abscissa and the ordinate of the touch operation, calculate actual coordinates of the abscissa and the ordinate on the display desktop, and execute a corresponding operation according to the actual coordinates.
Optionally, a user input unit 107 for receiving a fifth input for the navigation gesture setting control;
and the display unit 106 is used for responding to the fifth input and moving the navigation gesture setting control on the display desktop.
Optionally, the display unit 106 is configured to display a reminding message if the determined navigation gesture does not meet the preset condition.
Optionally, a user input unit 107 for receiving a sixth input;
and the display unit 106 is configured to display, in response to the sixth input, the navigation gesture of the area corresponding to the navigation gesture setting control on the display desktop in a vertical screen or a horizontal screen manner.
It should be understood that, in the embodiment of the present application, the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 107 includes a touch panel 1071, also referred to as a touch screen, and other input devices 1072. The touch panel 1071 may include two parts: a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 109 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 110 may integrate an application processor, which mainly handles the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the setting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the setting method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. A setting method of a navigation gesture is characterized by comprising the following steps:
receiving a first input for a navigation gesture setting control while the navigation gesture setting control is displayed;
displaying a navigation gesture setting window of a region corresponding to the navigation gesture setting control in response to the first input, wherein the navigation gesture setting window comprises a plurality of marks used for indicating navigation gestures;
receiving a second input for the navigation gesture setting window;
and responding to the second input, and determining the navigation gesture of the area corresponding to the navigation gesture setting control.
2. The method of claim 1, wherein the navigation gesture setting control is a bar control, and the navigation gesture setting control is displayed on a display desktop and is arranged along an edge of the display desktop.
3. The method of claim 1, wherein displaying a navigation gesture setting control comprises:
and responding to a third input, displaying a thumbnail desktop on a display desktop, wherein the area of the thumbnail desktop is smaller than that of the display desktop, and displaying the navigation gesture setting control in an area of a non-thumbnail desktop on the display desktop.
4. The method of claim 3, wherein the navigation gesture setting controls are displayed around a periphery of the thumbnail desktop.
5. The method of claim 3, further comprising:
receiving a fourth input of a touch operation on the thumbnail desktop;
and responding to the fourth input, acquiring the abscissa and the ordinate of the touch operation, calculating the actual coordinates of the abscissa and the ordinate on the display desktop, and executing corresponding operation according to the actual coordinates.
6. The method of claim 1, further comprising:
receiving a fifth input to the navigation gesture setting control;
in response to the fifth input, moving the navigation gesture setting control on a display desktop.
7. The method of claim 1, wherein determining the navigation gesture corresponding to the region of the navigation gesture setting control further comprises:
and if the determined navigation gesture does not meet the preset condition, displaying reminding information.
8. The method of claim 1, further comprising:
receiving a sixth input;
and responding to the sixth input, and displaying the navigation gesture of the area corresponding to the navigation gesture setting control on the display desktop in a vertical screen or horizontal screen mode.
9. A setting device of navigation gestures is characterized by comprising:
the device comprises a first input module, a second input module and a display module, wherein the first input module is used for receiving a first input aiming at a navigation gesture setting control under the condition that the navigation gesture setting control is displayed;
the display module is used for responding to the first input and displaying a navigation gesture setting window of an area corresponding to the navigation gesture setting control, and the navigation gesture setting window comprises a plurality of marks used for indicating navigation gestures;
a second input module, configured to receive a second input for the navigation gesture setting window;
and the determining module is used for responding to the second input and determining the navigation gesture of the area corresponding to the navigation gesture setting control.
10. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the method of any one of claims 1-8.
11. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the method according to any one of claims 1-8.
CN202011057134.9A 2020-09-30 2020-09-30 Navigation gesture setting method and device and electronic equipment Pending CN112148193A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011057134.9A CN112148193A (en) 2020-09-30 2020-09-30 Navigation gesture setting method and device and electronic equipment
PCT/CN2021/120656 WO2022068725A1 (en) 2020-09-30 2021-09-26 Navigation gesture setting method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011057134.9A CN112148193A (en) 2020-09-30 2020-09-30 Navigation gesture setting method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112148193A true CN112148193A (en) 2020-12-29

Family

ID=73896090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011057134.9A Pending CN112148193A (en) 2020-09-30 2020-09-30 Navigation gesture setting method and device and electronic equipment

Country Status (2)

Country Link
CN (1) CN112148193A (en)
WO (1) WO2022068725A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112947747A (en) * 2021-02-02 2021-06-11 深圳市江元科技(集团)有限公司 Gesture navigation operation method and device, terminal equipment and storage medium
WO2022068725A1 (en) * 2020-09-30 2022-04-07 维沃移动通信有限公司 Navigation gesture setting method and apparatus, and electronic device

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1488894A (en) * 2002-10-11 2004-04-14 Samsung Electronics Co., Ltd. Microwave oven and method for controlling same through setting functional push button
CN101286310A (en) * 2007-04-13 2008-10-15 群康科技(深圳)有限公司 Display screen display control system and its operation method
CN103616981A (en) * 2013-10-31 2014-03-05 小米科技有限责任公司 Application process method, device and mobile terminal
CN104657211A (en) * 2015-02-03 2015-05-27 百度在线网络技术(北京)有限公司 Method and equipment used for operating target application on corresponding equipment
CN104951177A (en) * 2014-03-24 2015-09-30 联想(北京)有限公司 Information processing method and electronic equipment
CN105117090A (en) * 2015-09-29 2015-12-02 上海华豚科技有限公司 Mobile communication equipment with capacitive type sliding key and operation method thereof
CN105867770A (en) * 2016-03-30 2016-08-17 深圳市宝尔爱迪科技有限公司 Method and device for simulating touch screen to start application function through physical key
CN105867926A (en) * 2016-03-30 2016-08-17 深圳市宝尔爱迪科技有限公司 Method and device for physical key function user defining
CN106569713A (en) * 2016-10-31 2017-04-19 努比亚技术有限公司 Touch area adjusting device and method, and terminal
CN107579885A (en) * 2017-08-31 2018-01-12 广东美的制冷设备有限公司 Information interacting method, device and computer-readable recording medium
CN107613110A (en) * 2017-08-31 2018-01-19 努比亚技术有限公司 Method, terminal and the computer-readable recording medium that adjustment terminal interface is shown
CN107765907A (en) * 2016-08-22 2018-03-06 东莞市健耀烨电子科技有限公司 A kind of digital device that need not distinguish direction
CN109976641A (en) * 2019-03-29 2019-07-05 努比亚技术有限公司 Operating method, terminal and computer readable storage medium based on screenshot picture
CN110874142A (en) * 2019-11-11 2020-03-10 诸葛嘉 User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium
CN111049952A (en) * 2018-10-15 2020-04-21 珠海格力电器股份有限公司 Intelligent terminal and control method thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015066399A1 (en) * 2013-10-31 2015-05-07 Evernote Corporation Multi-touch navigation of multidimensional object hierarchies
CN106681648A (en) * 2016-11-21 2017-05-17 北京技德网络技术有限公司 Gesture navigation method used for computer device
CN112148193A (en) * 2020-09-30 2020-12-29 维沃移动通信有限公司 Navigation gesture setting method and device and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Life Hacks & Tips (生活妙招小常识): "Do you know how to open the desktop editing mode on your phone? A few simple steps, easily done", HTTPS://WWW.IXIGUA.COM/6803250838992060941?APP=VIDEO_ARTICLE&TIMESTAMP=1640592886&UTM_MEDIUM=ANDROID&UTM_CAMPAIGN=CLIENT_SHARE&UTM_SOURCE=WECHAT_FRIEND&TEST_GROUP=V1&WID_TRY=1 *


Also Published As

Publication number Publication date
WO2022068725A1 (en) 2022-04-07

Similar Documents

Publication Publication Date Title
CN112099686B (en) Icon display control method and device and electronic equipment
CN112162665B (en) Operation method and device
US20100039449A1 (en) Menu controlling method
CN114546212B (en) Method, device and equipment for adjusting interface display state and storage medium
CN112286614A (en) User interface display method and device, electronic equipment and storage medium
CN112148166A (en) Desktop component management method and device
WO2022068725A1 (en) Navigation gesture setting method and apparatus, and electronic device
CN113296649A (en) Icon display method and device and electronic equipment
CN112783408A (en) Gesture navigation method and device of electronic equipment, equipment and readable storage medium
CN112433693A (en) Split screen display method and device and electronic equipment
CN112269501A (en) Icon moving method and device and electronic equipment
CN114063845A (en) Display method, display device and electronic equipment
CN111796746B (en) Volume adjusting method, volume adjusting device and electronic equipment
CN113885749A (en) Icon display method and device and electronic equipment
CN113342232A (en) Icon generation method and device, electronic equipment and readable storage medium
CN112783406A (en) Operation execution method and device and electronic equipment
CN111857474A (en) Application program control method and device and electronic equipment
CN112596643A (en) Application icon management method and device
CN111638828A (en) Interface display method and device
CN113407290B (en) Application notification display method and device and electronic equipment
CN114020389A (en) Application program display method and device and electronic equipment
CN111796736B (en) Application sharing method and device and electronic equipment
CN114879872A (en) Display method, display device, electronic equipment and storage medium
CN114115639A (en) Interface control method and device, electronic equipment and storage medium
CN113986428A (en) Picture correction method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201229