CN106126107B - Electronic apparatus and control method - Google Patents


Info

Publication number
CN106126107B
CN106126107B (application CN201610509731.8A)
Authority
CN
China
Prior art keywords
input information
display panel
specific
specific condition
operation body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610509731.8A
Other languages
Chinese (zh)
Other versions
CN106126107A (en)
Inventor
李凡智 (Li Fanzhi)
庞建军 (Pang Jianjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201610509731.8A
Publication of CN106126107A
Application granted
Publication of CN106126107B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 — … using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — … using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481 — … based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 — … using icons
    • G06F 3/0484 — … for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 — … for image manipulation, e.g. dragging, rotation, expansion or change of colour

Abstract

The invention provides a control method that responds to an operation of an operation body, and an electronic device capable of performing processing in response to the operation of an operation body, both of which can improve the user's operation experience. The control method comprises the following steps: sensing the operation of the operation body and generating input information of the operation body; judging whether the input information satisfies any one of at least one specific condition; and, when the input information is determined to satisfy one of the specific conditions, executing the specific processing corresponding to the satisfied specific condition. The specific processing corresponding to a given specific condition can also be executed in response to input information that does not satisfy that specific condition.

Description

Electronic apparatus and control method
Technical Field
The present invention relates to a control method responsive to an operation of an operation body, and an electronic apparatus capable of performing processing responsive to an operation of an operation body.
Background
A user can operate (e.g., click, double-click, drag, etc.) an electronic device such as a mobile phone, a tablet computer, or a notebook computer through an operation body such as a finger or a stylus, thereby interacting with the electronic device. The electronic apparatus can generate input information of the operation body by sensing the operation of the operation body, and perform predetermined processing based on the generated input information. An operation body such as a stylus is more convenient to carry than an input device such as a mouse or a keyboard, and some operation bodies (such as a finger) require no preparation at all; operations based on an operation body are therefore favored by users.
Disclosure of Invention
The invention provides a control method that responds to an operation of an operation body, and an electronic device capable of performing processing in response to the operation of an operation body, both of which can improve the user's operation experience.
According to an aspect of the present invention, there is provided a control method responsive to an operation of an operation body. The control method comprises the following steps: sensing the operation of the operation body and generating input information of the operation body; judging whether the input information satisfies any one of at least one specific condition; and, when the input information is determined to satisfy one of the specific conditions, executing the specific processing corresponding to the satisfied specific condition. The specific processing corresponding to a given specific condition can also be executed in response to input information that does not satisfy that specific condition.
According to another aspect of the present invention, there is provided an electronic apparatus capable of performing processing in response to an operation of an operation body. The electronic device includes: a sensing module configured to sense the operation of the operation body and generate input information of the operation body; and a control module configured to judge whether the input information satisfies any one of at least one specific condition and, when the input information satisfies one of the specific conditions, to execute the specific processing corresponding to the satisfied specific condition. The control module can also execute the specific processing corresponding to a given specific condition in response to input information that does not satisfy that specific condition.
According to the control method and the electronic apparatus of the present invention, a specific condition and its corresponding specific process are set for the operation of the operation body, and when the operation of the operation body satisfies the specific condition, the specific process corresponding to the satisfied specific condition is executed. The user's operation experience is thereby improved, because the user can trigger processing more conveniently through the operation body.
Drawings
Fig. 1 is a functional block diagram of an electronic device according to an embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating a first specific condition and a first specific process of the present invention.
Fig. 3 is a schematic diagram illustrating a second specific condition and a second specific process of the present invention.
Fig. 4 is a schematic diagram illustrating a third specific condition and a third specific process of the present invention.
Fig. 5 is a flowchart showing a control method according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings. The following description with reference to the accompanying drawings is provided to assist in understanding the exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist understanding, but they are to be construed as merely illustrative. Accordingly, those skilled in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Also, in order to make the description clearer and simpler, a detailed description of functions and configurations well known in the art will be omitted.
An electronic apparatus according to an embodiment of the present invention is described with reference to fig. 1. Fig. 1 is a functional block diagram of an electronic device according to an embodiment of the present invention.
As shown in fig. 1, the electronic device 1 includes a sensing module 101 and a control module 102. The electronic device 1 is, for example, a mobile phone, a tablet computer, or a notebook computer, and is capable of performing processing in response to an operation of an operation body such as a finger or a stylus.
The sensing module 101 is configured to sense an operation of an operation body and generate input information of the operation body. Specifically, the sensing module 101 can sense whether the operating body is in contact with the sensing module, the position of the contact, and the like in real time, thereby generating input information regarding the operation position, the operation type, and the like of the operating body.
For example, the sensing module 101 is provided on a display panel included in the electronic apparatus 1, or constitutes a touch display panel together with the display panel. When the user touches the sensing module 101 with, for example, a finger, the sensing module 101 can sense the touch position in real time and determine from it whether a double-click operation, a single-click operation, a pull operation, or the like has been performed. Here, "contact" is not limited to direct contact: hovering within a predetermined distance range can also be sensed as contact, and different distance ranges can be distinguished.
For another example, the operation body is a stylus pen whose operation on the display panel can be sensed by the sensing module 101. When the sensing module 101 is disposed on the display panel, it can sense the operation of the stylus pen on the display panel: it senses the contact position of the stylus pen in real time and determines from the sensed positions whether a double-click operation, a single-click operation, a pull operation, or the like has been performed. For example, if only one touch occurs within a predetermined time, a single-click operation is recognized; if two touches occur within the predetermined time, a double-click operation is recognized; and if contact is continuous while the contact position changes, a pull operation is recognized.
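The click/double-click/pull rules above can be sketched as a small classifier. This is an illustrative reconstruction, not the patent's implementation; the event representation, the 0.3 s window, and the 10-pixel movement tolerance are all assumptions.

```python
def classify_operation(touches, window=0.3, move_tol=10):
    """touches: time-ordered (timestamp, x, y) contact samples of one
    interaction; returns 'pull', 'double-click', or 'single-click'."""
    if not touches:
        return None
    xs = [x for _, x, _ in touches]
    ys = [y for _, _, y in touches]
    # A contact whose position changes noticeably is a pull operation.
    if max(xs) - min(xs) > move_tol or max(ys) - min(ys) > move_tol:
        return "pull"
    # Count separate contacts: a gap longer than 50 ms between samples
    # is treated as the finger/stylus lifting and touching again.
    contacts = 1
    for (t0, _, _), (t1, _, _) in zip(touches, touches[1:]):
        if t1 - t0 > 0.05:
            contacts += 1
    duration = touches[-1][0] - touches[0][0]
    if contacts >= 2 and duration <= window:
        return "double-click"
    return "single-click"
```

In practice a touch driver would deliver explicit down/up events rather than raw samples, but the decision logic would follow the same shape.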
The sensing module 101 may be configured as a separate touch panel, or in another form, as long as it can sense the operation of the operation body and generate the input information of the operation body. Further, in the embodiment of the present invention, the operation body differs from a conventional input device such as a mouse or a keyboard: an input device generates input information directly, so the electronic apparatus 1 can execute processing based on it immediately, whereas for an operation body the electronic apparatus 1 must first generate the input information by sensing the operation with the sensing module 101 and then execute processing based on that input information.
The control module 102 executes processing based on the input information generated by the sensing module 101. As described above, input information on the operation position, the operation type, and the like of the operation body can be generated by the sensing module 101. The control module 102 can execute corresponding processing in response to input information on an operation position, an operation type, and the like of the operation body.
Specifically, the control module 102 determines which operation object the operation is performed on, based on the touch position. For example, when the position of the touch by the stylus pen corresponds to the display position of a specific application icon, it can be determined that the application icon has been operated. Alternatively, when the position of the touch by the stylus pen corresponds to the display position of the specific function icon, it can be determined that the function icon has been operated. For example, when the position of the touch by the stylus pen corresponds to the display position of a specific editing interface, it can be determined that the editing interface is edited.
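Determining which operation object an operation was performed on is essentially a hit test of the touch position against each object's display rectangle. The sketch below is illustrative only; the target kinds, names, and rectangles are hypothetical.

```python
def hit_test(pos, targets):
    """pos: (x, y) touch position; targets: list of
    (kind, name, (x, y, w, h)) display rectangles.
    Returns the (kind, name) of the first target containing pos."""
    px, py = pos
    for kind, name, (x, y, w, h) in targets:
        if x <= px < x + w and y <= py < y + h:
            return kind, name
    return None

# Hypothetical layout: an application icon, a function icon, and an
# editing interface, each with its display rectangle.
targets = [
    ("app_icon", "camera", (0, 0, 64, 64)),
    ("function_icon", "save", (100, 0, 48, 48)),
    ("editing_area", "notes", (0, 100, 320, 380)),
]
```

A touch at (10, 10) would thus be resolved to the "camera" application icon, and a touch inside the lower region to the editing interface.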
The control module 102 then selects the process to be performed according to the particular operation type. For example, when the application icon is clicked, the opening process of the application is executed, and when the function icon is clicked, the function is executed. For example, when the editing interface is subjected to a drawing operation, the drawing process corresponding to the drawing trajectory is executed.
In the embodiment of the present invention, the control module 102 determines whether the input information satisfies any one of at least one specific condition and, if the input information satisfies one of the specific conditions, performs the specific process corresponding to the satisfied specific condition. The control module can also perform the specific process corresponding to a given specific condition in response to input information that does not satisfy that specific condition.
The control module 102 can execute the specific process corresponding to a certain specific condition in response to input information satisfying that condition, and can also execute it in response to input information not satisfying the condition. For convenience of explanation, input information satisfying a specific condition is hereinafter referred to as shortcut input information, and input information not satisfying the specific condition is referred to as regular input information. As described below, in the embodiment of the present invention, the control module 102 can perform, in response to simple and quick shortcut input information, a specific process that would otherwise require relatively cumbersome regular input information, so the user's operation experience can be further improved.
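The "at least one specific condition, each with a corresponding specific process" arrangement can be sketched as an ordered list of (predicate, handler) pairs: shortcut input information matches a predicate and runs its handler, while regular input information falls through to the regular processing. The registry, decorator, and field names below are assumptions for illustration, not the patent's design.

```python
SPECIFIC_CONDITIONS = []  # ordered (predicate, handler) pairs

def specific_condition(predicate):
    """Decorator pairing a specific condition with its specific process."""
    def wrap(handler):
        SPECIFIC_CONDITIONS.append((predicate, handler))
        return handler
    return wrap

def dispatch(input_info, regular_processing):
    """Run the first matching specific process; otherwise fall back."""
    for predicate, handler in SPECIFIC_CONDITIONS:
        if predicate(input_info):          # shortcut input information
            return handler(input_info)
    return regular_processing(input_info)  # regular input information

# Hypothetical first specific condition: a pull in a non-editing area
# triggers capture of the corresponding region (cf. fig. 2).
@specific_condition(lambda info: info.get("type") == "pull"
                    and info.get("area") == "non-editing")
def capture_region(info):
    return "capture region"
```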
Specific conditions and specific processing corresponding to the specific conditions in the embodiment will be described below by way of example with reference to fig. 2 to 4. However, the embodiment of the present invention is not limited to the specific conditions and the specific processes shown in fig. 2 to 4, and other specific conditions and specific processes may be adopted as necessary as long as the operation experience of the user can be improved.
Fig. 2 is a schematic diagram illustrating the first specific condition and the first specific process of the present invention. In the embodiment of the present invention, the control module 102 determines whether the input information indicates that the operation body has performed a movement operation while kept in contact with the display panel; if so, it determines that the input information satisfies the first specific condition, and then cuts out and saves the screen displayed in the area of the display panel corresponding to the movement of the operation body.
Specifically, as shown in fig. 2, when the user presses and pulls on the display panel with the stylus pen, the input information generated by the sensing module 101 indicates that the operation body has performed a movement operation while kept in contact with the display panel. The control module then, in response to the input information, cuts out and saves the screen displayed in the area of the display panel corresponding to the movement of the operation body. As shown in fig. 2, this area is, for example, a rectangular area determined from the movement trajectory of the stylus pen, although another area may be derived from the movement operation.
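Deriving the rectangular capture area from the movement trajectory can be sketched as the axis-aligned bounding box of the sampled contact points. This is an assumption consistent with fig. 2, not a statement of the patented method.

```python
def capture_area(trajectory):
    """trajectory: list of (x, y) contact points of the pull operation.
    Returns the bounding rectangle as (x, y, width, height)."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    left, top = min(xs), min(ys)
    return (left, top, max(xs) - left, max(ys) - top)
```

The returned rectangle would then be used to crop the framebuffer contents before saving.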
Here, the process of cutting out and saving the screen displayed in the area corresponding to the movement of the operation body could also be performed by first capturing the entire display screen and then cropping the desired region from it (the first normal operation). In the embodiment of the present invention, the user can cut out and save the display screen of a desired region merely by pressing and pulling on the display panel (the first shortcut operation), which is simpler and more intuitive than the first normal operation. The input information generated by the sensing module 101 through the first normal operation is the first normal input information, and the input information generated through the first shortcut operation is the first shortcut input information.
In the embodiment of the present invention, it is preferable that the process of cutting out and saving the screen of the corresponding area is executed when the operation body performs the pull operation in a non-editing area of the display panel; when the pull operation is performed in an editing area, the screen is not cut out and saved, and, for example, drawing processing corresponding to the pull trajectory may be executed instead.
Fig. 3 is a schematic diagram illustrating the second specific condition and the second specific process of the present invention. In the embodiment of the present invention, the control module 102 determines whether the input information indicates that the operation body has performed a double-click operation on a first specific area of the screen displayed on the display panel; if so, it determines that the input information satisfies the second specific condition, and then cuts out and saves the screen displayed on the display panel.
Specifically, as shown in fig. 3, when the user performs a double-click operation with the stylus pen on the first specific area of the screen displayed on the display panel, the input information generated by the sensing module 101 indicates that the operation body has performed the double-click operation on the first specific area. For example, the first specific area may be set as an area of the displayed screen where no application icon or function icon is displayed, or it may be designated as needed; in addition, it may be set so as not to include the editing area. The control module then, in response to the input information, cuts out and saves the screen displayed on the display panel.
In addition, in the embodiment of the present invention, in response to the input information satisfying the second specific condition, the control module 102 can further perform other processing besides cutting out and saving the screen displayed on the display panel (for example, as shown in fig. 3, displaying function icons such as "new" and "expand" for the saved screen on the display panel).
The process of cutting out and saving the screen displayed on the display panel can also be executed by an operation for capturing the entire display screen (the second normal operation); for example, in the second normal operation the user needs to find a specific function icon and click it. In the embodiment of the present invention, the user can cut out and save the screen displayed on the display panel merely by double-clicking on the first specific area of the displayed screen (the second shortcut operation), which is simpler than the second normal operation. The input information generated by the sensing module 101 through the second normal operation is the second normal input information, and the input information generated through the second shortcut operation is the second shortcut input information.
Fig. 4 is a schematic diagram illustrating the third specific condition and the third specific process of the present invention. In the embodiment of the present invention, the control module 102 determines whether the input information indicates that the operation body has performed a double-click operation on an editing area in the screen displayed on the display panel; if so, it determines that the input information satisfies the third specific condition, and then shares the editing screen displayed on the display panel.
Specifically, as shown in fig. 4, when the user performs a double-click operation with the stylus pen on the editing area in the screen displayed on the display panel, the input information generated by the sensing module 101 indicates that the operation body has performed the double-click operation on the editing area. The control module 102 then shares the editing screen displayed on the display panel in response to the input information: for example, it transmits the editing screen to one or more specified destinations, or sets the editing screen to a state that other users can browse.
In the embodiment of the present invention, it is preferable that the input information is determined to satisfy the third specific condition, and the process of sharing the editing screen is executed, when the input information indicates that the operation body has performed the double-click operation on a portion of the editing area where no edited content is displayed.
Here, the process of sharing the editing screen can also be performed by a dedicated sharing operation (the third normal operation): for example, saving the editing screen and setting it to a shared state, or transmitting the saved editing screen to one or more designated destinations. In the embodiment of the present invention, the user can share the editing screen merely by double-clicking on the editing area of the displayed screen (the third shortcut operation), which is simpler than the third normal operation. The input information generated by the sensing module 101 through the third normal operation is the third normal input information, and the input information generated through the third shortcut operation is the third shortcut input information.
The specific processes of capturing and saving the screen of a partial area, capturing and saving the entire screen, and sharing the editing screen through shortcut operations have been described above with reference to figs. 2 to 4. However, in the embodiment of the present invention, the specific conditions and specific processes can be set flexibly according to the user's habits. A specific process is preferably one that is relatively cumbersome through normal operation but frequently used. It is also preferable that each specific condition does not overlap with the specific conditions corresponding to other specific processes, so that a single operation does not satisfy several specific conditions at once. If a certain operation does satisfy several specific conditions simultaneously, the user may further select one or more specific processes to be executed, and the control module 102 executes the selected processes.
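The user-selection fallback for overlapping conditions might be sketched as follows; `resolve` and `ask_user` are hypothetical names, with `ask_user` standing in for a selection UI.

```python
def resolve(input_info, conditions, ask_user):
    """conditions: list of (name, predicate, handler) triples.
    Runs every matched handler if at most one condition matches;
    otherwise asks the user which specific processes to execute."""
    matches = [(name, handler) for name, predicate, handler in conditions
               if predicate(input_info)]
    if len(matches) <= 1:
        return [handler(input_info) for _, handler in matches]
    chosen = set(ask_user([name for name, _ in matches]))
    return [handler(input_info) for name, handler in matches
            if name in chosen]
```

With well-designed, non-overlapping conditions the `ask_user` branch should rarely be reached; it exists only as a tiebreaker.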
Next, a control method according to an embodiment of the present invention will be described with reference to fig. 5. Fig. 5 is a flowchart showing a control method according to an embodiment of the present invention. The control method shown in fig. 5 responds to the operation of the operation body and is applied, for example, to the electronic device shown in fig. 1. In the embodiment of the present invention, the operation body is, for example, a stylus pen whose operation on the display panel can be sensed; it may also be a finger or the like, as long as the operation can be sensed and input information can be generated. Further, in the embodiment of the present invention, the operation body differs from a conventional input device such as a mouse or a keyboard: the input device generates input information directly, so the electronic device shown in fig. 1 can directly perform processing according to the input information generated by the input device.
In step S510, the operation of the operation body is sensed, and input information of the operation body is generated.
Specifically, in the case of being applied to the electronic apparatus 1 shown in fig. 1, the sensing module 101 is capable of sensing whether the operating body is in contact with the sensing module, the position of the contact, and the like in real time, thereby generating input information on the operation position, the operation type, and the like of the operating body.
For example, when the user touches the sensing module 101 with, for example, a finger, the sensing module 101 can sense the touch position in real time and determine from it whether a double-click operation, a single-click operation, a pull operation, or the like has been performed. Here, "contact" is not limited to direct contact: hovering within a predetermined distance range can also be sensed as contact, and different distance ranges can be distinguished.
For another example, when the sensing module 101 is disposed on the display panel, it can sense the operation of the stylus pen on the display panel: it senses the contact position of the stylus pen in real time and determines from the sensed positions whether a double-click operation, a single-click operation, a pull operation, or the like has been performed. For example, if only one touch occurs within a predetermined time, a single-click operation is recognized; if two touches occur within the predetermined time, a double-click operation is recognized; and if contact is continuous while the contact position changes, a pull operation is recognized.
In step S520, it is determined whether the input information satisfies any one of at least one specific condition; in step S530, if the input information is determined to satisfy one of the specific conditions, the specific processing corresponding to the satisfied specific condition is executed. The specific processing corresponding to a given specific condition can also be executed in response to input information that does not satisfy that specific condition.
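Steps S510 to S530 can be sketched as one control step: sense the operation and generate the input information (S510), test it against the specific conditions (S520), and execute the matched specific processing, or fall back to regular processing otherwise (S530). All callables below are placeholders, not names from the patent.

```python
def control_step(sense, conditions, regular_processing):
    """One pass of the fig. 5 flow.
    sense: () -> input_info; conditions: list of (predicate, handler)."""
    input_info = sense()                            # S510: sense & generate
    for predicate, specific_processing in conditions:
        if predicate(input_info):                   # S520: test conditions
            return specific_processing(input_info)  # S530: specific process
    return regular_processing(input_info)           # fall through: regular
```

A real device would run this inside its input-event loop, once per completed operation.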
Specifically, in the case of being applied to the electronic apparatus 1 shown in fig. 1, the control module 102 executes processing in accordance with the input information generated by the sensing module 101. As described above, input information regarding the operation position, the operation type, and the like of the operation body can be generated in step S510. The control module 102 can execute corresponding processing in response to input information on an operation position, an operation type, and the like of the operation body.
The specific processing corresponding to a specific condition may be executed in response to input information satisfying the specific condition, or in response to input information not satisfying it. For convenience of explanation, input information satisfying a specific condition is hereinafter referred to as shortcut input information, and input information not satisfying the specific condition as regular input information. As described below, in the embodiment of the present invention, steps S520 and S530 allow a specific process that would otherwise require relatively cumbersome regular input information to be performed in response to simple and quick shortcut input information, so the user's operation experience can be further improved.
Specifically, in the embodiment of the present invention, step S520 includes determining whether the input information indicates that the operation body has performed a movement operation while in contact with the display panel and, if so, determining that the input information satisfies the first specific condition; step S530 includes, if it is determined that the input information satisfies the first specific condition, cutting out and saving the screen displayed in the area of the display panel corresponding to the movement of the operation body.
When applied to the electronic apparatus 1 shown in fig. 1, as shown in fig. 2, when the user presses and drags a stylus pen on the display panel, the input information generated by the sensing module 101 indicates that the operation body has performed a movement operation while remaining in contact with the display panel. The control module then captures and saves the screen displayed in the area of the display panel corresponding to the movement of the operation body in response to the input information. As shown in fig. 2, this area is, for example, a rectangular area determined from the movement trajectory of the stylus pen, although another area may be derived from the movement operation.
Here, the process of capturing and saving the screen displayed in the area corresponding to the movement of the operation body could also be performed by first capturing the entire display screen and then cropping it (the first normal operation). In the embodiment of the present invention, the user can capture and save the display screen of a desired region simply by pressing and dragging on the display panel (the first shortcut operation), which is simpler and more intuitive than the first normal operation described above. The input information generated by the sensing module 101 through the first normal operation is the first normal input information, and the input information generated through the first shortcut operation is the first shortcut input information.
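The rectangular capture area of fig. 2 can be derived from the stylus trajectory as its axis-aligned bounding box. The sketch below assumes the rectangle variant; the patent also permits other area shapes, and the coordinate convention here is an illustration only.

```python
# Illustrative only: compute the rectangular capture area of fig. 2 as the
# bounding box of the press-and-drag trajectory of the operation body.
from typing import List, Tuple

def capture_rect(trajectory: List[Tuple[int, int]]) -> Tuple[int, int, int, int]:
    """Return (left, top, width, height) bounding the press-and-drag path."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    left, top = min(xs), min(ys)
    return left, top, max(xs) - left, max(ys) - top
```

The control module would then crop the framebuffer to this rectangle and save the result, rather than capturing the whole screen first.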
Specifically, in the embodiment of the present invention, step S520 includes determining whether the input information indicates that the operation body has performed a double-click operation on a first specific area of the screen displayed on the display panel and, if so, determining that the input information satisfies a second specific condition. Step S530 includes, if the input information is determined to satisfy the second specific condition, capturing and saving the screen displayed on the display panel.
When applied to the electronic apparatus 1 shown in fig. 1, as shown in fig. 3, when the user double-clicks with the stylus pen on the first specific area of the screen displayed on the display panel, the input information generated by the sensing module 101 indicates that the operation body has performed a double-click operation on the first specific area. The first specific area may, for example, be set as an area of the displayed screen where no application icon or function icon is displayed, or may be designated as needed; it may also be set so as not to include the editing area. The control module then captures and saves the screen displayed on the display panel in response to the input information.
In addition, in the embodiment of the present invention, in response to input information satisfying the second specific condition, the control module 102 can perform other processing besides capturing and saving the screen displayed on the display panel (for example, as shown in fig. 3, displaying function icons such as "new" and "expand" for the saved screen on the display panel).
The process of capturing and saving the screen displayed on the display panel could also be performed by an operation for capturing the entire display screen (the second normal operation); for example, in the second normal operation, the user needs to find and click a specific function icon. In the embodiment of the present invention, the user can capture and save the displayed screen simply by double-clicking on the first specific area of the displayed screen (the second shortcut operation), which is simpler than the second normal operation described above. The input information generated by the sensing module 101 through the second normal operation is the second normal input information, and the input information generated through the second shortcut operation is the second shortcut input information.
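The second specific condition hinges on where the double-click lands. A minimal sketch of this hit-test, assuming the first specific area is modelled as any point outside the icon and editing rectangles (rectangle layout and all names are assumptions for illustration):

```python
# Hypothetical hit-test for the second specific condition: a double-click
# counts only when it lands in the "first specific area", modelled here as
# any point outside the icon rectangles and the editing rectangles.
from typing import Iterable, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, width, height)

def in_rect(pt: Tuple[int, int], rect: Rect) -> bool:
    x, y = pt
    l, t, w, h = rect
    return l <= x < l + w and t <= y < t + h

def satisfies_second_condition(op_type: str, pt: Tuple[int, int],
                               icon_rects: Iterable[Rect],
                               edit_rects: Iterable[Rect]) -> bool:
    if op_type != "double_click":
        return False
    blocked = list(icon_rects) + list(edit_rects)
    return not any(in_rect(pt, r) for r in blocked)
```

Excluding the editing area here mirrors the note above that the first specific area may be set so as not to include the editing area, keeping the second and third conditions from overlapping.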
Specifically, in the embodiment of the present invention, step S520 includes determining whether the input information indicates that the operation body has performed a double-click operation on the editing area of the screen displayed on the display panel and, if so, determining that the input information satisfies a third specific condition. Step S530 includes, if the input information is determined to satisfy the third specific condition, sharing the editing screen displayed on the display panel.
Specifically, as shown in fig. 4, when the user double-clicks with the stylus pen on the editing area of the screen displayed on the display panel, the input information generated by the sensing module 101 indicates that the operation body has performed a double-click operation on the editing area. The control module 102 then shares the editing screen displayed on the display panel in response to the input information; for example, the control module 102 transmits the editing screen to one or more specified destinations, or sets the editing screen to a state browsable by other users. In the embodiment of the present invention, it is preferable that the sharing process is executed in response to input information determined to satisfy the third specific condition when the input information indicates that the operation body has double-clicked a portion of the editing area of the display panel other than the editing screen.
Here, the process of sharing the editing screen could also be performed by an operation for sharing the editing screen (the third normal operation): for example, saving the editing screen and setting it to a shared state, or transmitting the saved editing screen to one or more designated destinations. In the embodiment of the present invention, the user can share the editing screen simply by double-clicking on the editing area of the displayed screen (the third shortcut operation), which is simpler than the third normal operation described above. The input information generated by the sensing module 101 through the third normal operation is the third normal input information, and the input information generated through the third shortcut operation is the third shortcut input information.
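The two sharing variants named above (send to destinations, or mark browsable) can be sketched as one entry point. `send` and `publish` are hypothetical stand-ins for whatever transport the device provides, stubbed here so the sketch is self-contained.

```python
# Hedged sketch of the third specific processing: "sharing" is either sending
# the saved editing screen to one or more destinations, or marking it
# browsable by other users. send()/publish() are stubs, not a real API.
from typing import List, Optional

shared_log: List[str] = []

def send(data: bytes, dest: str) -> None:
    shared_log.append("send:" + dest)   # stub transport call

def publish(data: bytes) -> None:
    shared_log.append("publish")        # stub: set browsable state

def share_editing_screen(screen: bytes,
                         destinations: Optional[List[str]] = None) -> str:
    if destinations:
        for dest in destinations:
            send(screen, dest)
        return "sent to %d destination(s)" % len(destinations)
    publish(screen)
    return "set to browsable state"
```

Either branch is triggered by the same third shortcut input information; which branch runs would be a device or user setting.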
The specific processes of capturing and saving the screen of a partial area, capturing and saving the entire screen, and sharing the editing screen through shortcut operations have been described above. However, in the embodiment of the present invention, the specific conditions and specific processes can be set flexibly according to the user's habits. A specific process is preferably one that is relatively cumbersome as a normal operation but frequently used. It is also preferable that each specific condition does not overlap with the specific conditions corresponding to other specific processes, so that no single operation satisfies multiple specific conditions simultaneously. Where an operation does satisfy multiple specific conditions, the user may further select one or more of the corresponding specific processes, and the selected processes are executed.
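The fallback described above, where one operation satisfies several specific conditions, can be sketched as collecting all matches and deferring to a chooser. The `chooser` callback (e.g. a user prompt) is an assumption of this sketch, not an interface from the patent.

```python
# Illustrative conflict resolution: if an operation satisfies several specific
# conditions, collect every match and let a chooser (e.g. a user prompt)
# select which specific processes to run.
from typing import Callable, List, Tuple, Any

Entry = Tuple[str, Callable[[Any], bool], Callable[[Any], None]]

def resolve_and_run(info: Any, conditions: List[Entry],
                    chooser: Callable[[list], list]) -> List[str]:
    matches = [(name, proc) for name, cond, proc in conditions if cond(info)]
    chosen = matches if len(matches) <= 1 else chooser(matches)
    for _, proc in chosen:
        proc(info)
    return [name for name, _ in chosen]
```

When conditions are defined so as not to overlap, `matches` never holds more than one entry and the chooser is never consulted.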
Those of ordinary skill in the art will appreciate that the various modules and steps described in connection with the embodiments of the invention may be implemented as electronic hardware, computer software, or a combination of both, and that the computer software may be stored in any form of computer storage medium. To clearly illustrate this interchangeability of hardware and software, various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
Various embodiments of the present invention are described in detail above. However, those skilled in the art will appreciate that various modifications, combinations, or sub-combinations of the embodiments may be made without departing from the spirit and principle of the invention, and such modifications are intended to be within the scope of the invention.

Claims (8)

1. A control method responsive to an operation of an operation body, the control method comprising:
sensing the operation of the operation body and generating input information of the operation body;
judging whether the input information meets any one of at least one specific condition;
executing a specific process corresponding to the satisfied specific condition in a case where it is determined that the input information satisfies one of the specific conditions,
wherein the specific processing corresponding to the specific condition is executable in response to the input information not satisfying the specific condition;
wherein,
the step of judging whether the input information satisfies any one of at least one specific condition includes:
determining whether the input information indicates that the operation body has performed a movement operation in a state of being held in contact with a display panel, and if so, determining that the input information satisfies a first specific condition,
in the step of executing specific processing corresponding to the satisfied specific condition when it is determined that the input information satisfies one of the specific conditions,
when the input information is judged to satisfy the first specific condition, if the operation body performs the moving operation in the editing area of the display panel, drawing processing corresponding to the moving operation is executed, and if the operation body performs the moving operation in the non-editing area of the display panel, a screen displayed in an area corresponding to the movement of the operation body in the display panel is cut out and stored.
2. The control method according to claim 1,
the step of judging whether the input information satisfies any one of at least one specific condition includes:
determining whether the input information indicates that the operation body performs a double-click operation on a first specific region of a screen displayed on a display panel, and if so, determining that the input information satisfies a second specific condition,
in the step of executing specific processing corresponding to the satisfied specific condition when it is determined that the input information satisfies one of the specific conditions,
and when the input information is judged to meet the second specific condition, intercepting and saving the picture displayed in the display panel.
3. The control method according to claim 1,
the step of judging whether the input information satisfies any one of at least one specific condition includes:
determining whether the input information indicates that the operation body has performed a double-click operation on an editing area in a screen displayed on a display panel, and if so, determining that the input information satisfies a third specific condition,
in the step of executing specific processing corresponding to the satisfied specific condition when it is determined that the input information satisfies one of the specific conditions,
and sharing the editing picture displayed on the display panel when the input information is judged to meet the third specific condition.
4. The control method according to any one of claims 2 to 3,
the operation body is a stylus pen capable of being sensed for an operation of the display panel.
5. An electronic apparatus capable of processing in response to an operation of an operation body, the electronic apparatus comprising:
the sensing module is configured to sense the operation of the operation body and generate input information of the operation body;
a control module which judges whether the input information satisfies any one of at least one specific condition, and executes a specific process corresponding to the satisfied specific condition when judging that the input information satisfies one of the specific conditions,
wherein the control module is capable of performing a specific process corresponding to a specific condition in response to input information not satisfying the specific condition;
further comprising:
a display panel configured to display a picture,
the control module determines whether the input information indicates that the operation body has performed a movement operation in a state of being kept in contact with the display panel, determines that the input information satisfies a first specific condition if so, performs a drawing process corresponding to the movement operation if the operation body performs the movement operation in an editing area of the display panel if it is determined that the input information satisfies the first specific condition, and cuts out and stores a screen displayed in an area corresponding to the movement of the operation body in the display panel if the operation body performs the movement operation in a non-editing area of the display panel.
6. The electronic device of claim 5, further comprising:
a display panel configured to display a picture,
the control module judges whether the input information indicates that the operation body performs double-click operation on a first specific area of a picture displayed on a display panel, if so, the input information is judged to meet a second specific condition, and the picture displayed on the display panel is intercepted and stored under the condition that the input information is judged to meet the second specific condition.
7. The electronic device of claim 5, further comprising:
a display panel configured to display a picture,
the control module judges whether the input information indicates that the operation body performs double-click operation on an editing area in a picture displayed on a display panel, if so, the input information is judged to meet a third specific condition, and the editing picture displayed on the display panel is shared under the condition that the input information is judged to meet the third specific condition.
8. The electronic device of any of claims 5-7,
the operation body is a stylus pen capable of sensing an operation for the display panel by the sensing module.
CN201610509731.8A 2016-06-30 2016-06-30 Electronic apparatus and control method Active CN106126107B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610509731.8A CN106126107B (en) 2016-06-30 2016-06-30 Electronic apparatus and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610509731.8A CN106126107B (en) 2016-06-30 2016-06-30 Electronic apparatus and control method

Publications (2)

Publication Number Publication Date
CN106126107A CN106126107A (en) 2016-11-16
CN106126107B true CN106126107B (en) 2020-07-24

Family

ID=57467968

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610509731.8A Active CN106126107B (en) 2016-06-30 2016-06-30 Electronic apparatus and control method

Country Status (1)

Country Link
CN (1) CN106126107B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102270139A (en) * 2011-08-16 2011-12-07 潘天华 Screenshot method
CN103135914A (en) * 2011-11-28 2013-06-05 阿里巴巴集团控股有限公司 Screen capture method and screen capture device based on touch screen
CN103577099A (en) * 2012-07-30 2014-02-12 三星电子株式会社 Method and apparatus for virtual TOUR creation in mobile device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130227474A1 (en) * 2011-01-05 2013-08-29 King Fahd University Of Petroleum And Minerals Cliclkess graphical user interface for displaying and selecting a pop-up menu
CN103455315B (en) * 2012-06-04 2018-09-07 百度在线网络技术(北京)有限公司 A kind of method and apparatus corresponding to target information for realizing screenshotss and acquisition
CN204440491U (en) * 2015-03-26 2015-07-01 天机数码创新技术有限公司 A kind of augmented reality system with a key screenshotss sharing function


Also Published As

Publication number Publication date
CN106126107A (en) 2016-11-16

Similar Documents

Publication Publication Date Title
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
US9733815B2 (en) Split-screen display method and apparatus, and electronic device thereof
EP2608006B1 (en) Category search method and mobile device adapted thereto
KR102020345B1 (en) The method for constructing a home screen in the terminal having touchscreen and device thereof
US9542020B2 (en) Remote session control using multi-touch inputs
KR101381484B1 (en) Mobile device having a graphic object floating function and execution method using the same
US10223057B2 (en) Information handling system management of virtual input device interactions
US20130132878A1 (en) Touch enabled device drop zone
CN115269094A (en) Managing workspaces in a user interface
KR102027879B1 (en) Menu contolling method of media equipment, apparatus thereof, and medium storing program source thereof
SG185530A1 (en) Method and device for adjusting size of list element
JP6458751B2 (en) Display control device
CN112148170A (en) Desktop element adjusting method and device and electronic equipment
JP2016528600A (en) How to select parts of a graphical user interface
WO2018019050A1 (en) Gesture control and interaction method and device based on touch-sensitive surface and display
US20150100901A1 (en) Information processing device, method, and program
US20130127745A1 (en) Method for Multiple Touch Control Virtual Objects and System thereof
CN107817927B (en) Application icon management method and device
TWI607369B (en) System and method for adjusting image display
JP5882973B2 (en) Information processing apparatus, method, and program
US10228892B2 (en) Information handling system management of virtual input device interactions
CN106126107B (en) Electronic apparatus and control method
US20140380188A1 (en) Information processing apparatus
CN112162689B (en) Input method and device and electronic equipment
CN111796736B (en) Application sharing method and device and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant