CN114690888A - Application interface processing method and related equipment


Info

Publication number
CN114690888A
Authority
CN
China
Prior art keywords
display screen
application interface
electronic device
display
virtual
Prior art date
Legal status
Pending
Application number
CN202011631548.8A
Other languages
Chinese (zh)
Inventor
刘逸硕
黄大源
李维
闫澈
周轩
赵韵景
梁敬非
李宏汀
黄雪妍
王卓
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202011631548.8A
Priority to PCT/CN2021/141913 (published as WO2022143607A1)
Publication of CN114690888A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The invention relates to the field of human-computer interaction. An embodiment provides a method for processing an application interface, applied to an electronic device that includes a first display screen and a second display screen. The method includes: displaying a first application interface on the first display screen; in response to a detected first operation, switching the input mode of the first application interface to handwriting input; and, in response to the handwriting input mode, triggering display of the first application interface on the second display screen so that handwritten content for the first application interface can be captured through the second display screen. If the user has placed the second display screen in an orientation convenient for writing, the electronic device automatically displays the application interface requiring handwriting input on that screen without any further user action. This improves the efficiency of the whole input process, avoids redundant steps, keeps operation simple, and helps improve user retention.

Description

Application interface processing method and related equipment
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method for processing an application interface and a related device.
Background
Many computers and tablets are now equipped with electronic pens to meet users' needs for direct content creation such as annotating, note-taking, and drawing. Currently, a user can write with an electronic pen on one display screen and then copy the input content to another display screen, thereby achieving pen input across screens.
However, the user must perform copy and paste steps after entering the content, which makes the operation cumbersome.
Disclosure of Invention
Embodiments of the present application provide an application interface processing method and related devices. If the user places the second display screen in an orientation convenient for writing, the electronic device can automatically display the application interface requiring handwriting input on that screen without any user action, which improves the efficiency of the whole input process, avoids redundant steps, keeps operation simple, and helps improve user retention.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
In a first aspect, an embodiment of the present application provides a feedback method that may be used in the field of virtual keyboards. The method is applied to an electronic device having a touch screen in which a plurality of vibration feedback elements are arranged, and includes the following steps. The electronic device detects a first contact operation acting on the touch screen and, in response, acquires first position information of the first contact point corresponding to that operation, where the first position information corresponds to a first virtual key on the virtual keyboard. If the first virtual key is an anchor key, the electronic device obtains one or more first vibration feedback elements from the plurality of vibration feedback elements; the first vibration feedback elements are matched to the first virtual key, and the vibration feedback elements matched to different virtual keys are not identical. The virtual keyboard may be of any type, for example a full keyboard, a numeric keyboard, or a function keyboard, or the term may refer collectively to all operation keys on the touch screen. An anchor key is not merely a positioning key: it is a key whose purpose is to give the user a prompt. Which virtual keys of the currently displayed keyboard are anchor keys may be preconfigured in the electronic device, that is, fixed in advance, or may be customized by the user through a "Settings" interface of the electronic device.
Further, in determining whether the first virtual key is an anchor key according to the first position information, one implementation is that the electronic device obtains the first virtual key corresponding to the first contact point from the first position information and then determines whether that key is an anchor key. In another implementation, the electronic device may store in advance which areas of the touch screen belong to anchor keys and which belong to non-anchor keys, and determine directly from the first position information whether the first contact point falls within an anchor-key area, thereby deciding whether the first virtual key corresponding to the first position information is an anchor key. The electronic device then instructs all first vibration feedback elements matched to the first virtual key to emit vibration waves so as to perform a first feedback operation, which prompts the user that the first virtual key is an anchor key.
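The second implementation above can be sketched as a simple hit test against pre-stored anchor-key regions, so the anchor check needs only the contact position, with no key lookup first. This is an illustrative sketch only: the `Rect`, `ANCHOR_KEY_REGIONS`, and `find_anchor_key` names, the choice of "F" and "J" as anchor keys, and the coordinate values are all assumptions, not from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        """True if point (px, py) falls inside this rectangle."""
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

# Hypothetical layout: "F" and "J" act as anchor keys, mirroring the raised
# ridges on a physical keyboard. Coordinates are in screen pixels.
ANCHOR_KEY_REGIONS = {
    "F": Rect(300, 200, 60, 60),
    "J": Rect(540, 200, 60, 60),
}

def find_anchor_key(px, py):
    """Return the anchor key at (px, py), or None for a non-anchor contact."""
    for key, region in ANCHOR_KEY_REGIONS.items():
        if region.contains(px, py):
            return key
    return None
```

A contact landing in neither region corresponds to a non-anchor key (or no key), for which the first feedback operation is not triggered.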
In this implementation, when the user touches an anchor key on the virtual keyboard, the first feedback operation is performed through the touch screen to prompt the user that the currently touched key is an anchor key, so the user can perceive the anchor key's position, which helps reduce the difficulty of achieving touch typing on the touch screen. In addition, a plurality of vibration feedback elements are arranged in the touch screen. When the first virtual key is determined to be an anchor key, at least one first vibration feedback element matched to that key is selected from the plurality of elements and instructed to emit vibration waves, so vibration feedback is generated only around the first virtual key rather than across the full screen. Because all fingers rest on the touch screen while typing, full-screen vibration would be felt by every finger and would easily confuse the user; localized feedback around the first virtual key avoids this confusion and more readily helps the user build finger muscle memory, assisting the user in achieving touch typing on the touch screen.
In one possible implementation of the first aspect, after acquiring the first position information of the first contact point corresponding to the first contact operation, the electronic device further obtains the first virtual key corresponding to the first contact point according to the first position information. Because the key is resolved from the position information in real time, this implementation is compatible both with virtual keyboards whose position is fixed and with virtual keyboards whose position can move, broadening the application scenarios of the scheme.
In one possible implementation of the first aspect, a first mapping relationship is configured in the electronic device, indicating the correspondence between virtual keys and vibration feedback elements. Obtaining a first vibration feedback element from the plurality of vibration feedback elements then means: the electronic device looks up, according to the first mapping relationship and the first virtual key, the first vibration feedback elements matched to that key. Optionally, multiple mapping relationships corresponding one-to-one to multiple types of virtual keyboard may be preconfigured on the electronic device, each containing the correspondences between virtual keys and first vibration feedback elements; in that case, before the lookup, the electronic device must first select, from the multiple mapping relationships, the first mapping relationship matching the type of the currently displayed virtual keyboard.
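The two-step lookup described above (pick the mapping for the displayed keyboard type, then resolve the key to its matched elements) can be sketched as nested dictionaries. The element IDs, keyboard-type names, and function name are invented for the example; the patent does not specify a concrete data structure.

```python
# keyboard type -> {virtual key -> vibration feedback element IDs}
# All values here are illustrative placeholders.
KEYBOARD_MAPPINGS = {
    "full": {"F": [4, 5], "J": [10, 11], "A": [0]},
    "numeric": {"5": [7]},
}

def matched_feedback_elements(keyboard_type, virtual_key):
    """Select the first mapping relationship for the currently displayed
    keyboard type, then look up the vibration feedback elements matched
    to the pressed virtual key. Unknown keys match no elements."""
    mapping = KEYBOARD_MAPPINGS[keyboard_type]
    return mapping.get(virtual_key, [])
```

Note that different keys may map to different numbers of elements, which is why a later implementation adjusts per-element intensity by the element count.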
In this implementation, because the first mapping relationship is preconfigured, the at least one first vibration feedback element matched to the first virtual key can be obtained directly via the mapping once the first virtual key is known, which is convenient and fast and improves the efficiency of the matching process; splitting out the step of determining the vibration feedback elements also makes it easier to locate the fault precisely when a fault occurs.
In one possible implementation of the first aspect, a first mapping relationship is configured in the electronic device, indicating the correspondence between position information and vibration feedback elements. Obtaining a first vibration feedback element from the plurality of vibration feedback elements then means: the electronic device looks up, according to the first mapping relationship and the first position information, the first vibration feedback elements matched to the first position information; since the first position information corresponds to the first virtual key on the virtual keyboard, this yields the first vibration feedback elements corresponding to the first virtual key. In this implementation, the at least one matched first vibration feedback element is obtained from the first position information and the first mapping relationship, which is convenient and fast and improves the efficiency of the matching process; and because the first mapping relationship relates position information directly to vibration feedback elements, it is compatible both with fixed-position and with movable virtual keyboards, so vibration feedback can be provided in a wide range of scenarios.
In one possible implementation of the first aspect, before emitting vibration waves through the first vibration feedback elements, the electronic device acquires the vibration intensity of the vibration wave for each of the at least one first vibration feedback element. That intensity is related to any one or more of the following factors: a first quantity (the number of first vibration feedback elements), the distance between each first vibration feedback element and the center point of the first virtual key, the type of vibration wave, and whether the virtual key is an anchor key or the position type of the first position information. The electronic device then emits vibration waves through the at least one first vibration feedback element according to the per-element intensities, so that the difference between the vibration feedback intensity of the first virtual key and that of a second virtual key (a different key) falls within a preset intensity range; the preset range may be an intensity difference within two, three, four, or five percent.
Further, when measuring intensity at the touch screen surface, the probe of a vibration meter may be attached to the surface of a virtual key on the touch screen (a detection point) to collect the vibration wave there, yielding a waveform curve that indicates the intensity of the vibration feedback at that detection point. The difference between the vibration feedback intensities of the first and second virtual keys can then be obtained by comparing the waveform curve measured at the first virtual key's detection point with the waveform curve measured at the second virtual key's detection point.
In this implementation, because different virtual keys may be matched to different numbers of vibration feedback elements, the intensity of each element is determined according to the number of matched elements, so that the difference in vibration feedback intensity between virtual keys stays within the preset range.
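One simple way to realize this is to split a target total intensity evenly across a key's matched elements, so that keys with two elements and keys with one element produce roughly the same overall feedback. The equal-split rule is an assumption for illustration; the patent only says the per-element intensity is "related to" the element count (among other factors), and the function names are invented.

```python
def per_element_intensity(target_total, element_ids):
    """Divide a target total vibration intensity evenly across the
    vibration feedback elements matched to one virtual key."""
    n = len(element_ids)
    if n == 0:
        raise ValueError("key has no matched vibration feedback elements")
    return {eid: target_total / n for eid in element_ids}

def total_intensity(intensities):
    """Approximate overall feedback intensity for a key as the sum of
    its per-element intensities (a simplification for the sketch)."""
    return sum(intensities.values())
```

Under this rule, a key driven by two elements at half strength and a key driven by one element at full strength differ by zero percent in total intensity, comfortably inside the two-to-five-percent preset range mentioned above.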
In one possible implementation of the first aspect, the first vibration feedback element is any one of: a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film. This provides several concrete forms for the vibration feedback element, improving the implementation flexibility of the scheme.
In a possible implementation of the first aspect, the first contact point is a newly added contact point on the touch screen. Because a user's attention, when typing on a physical keyboard, usually focuses on the key that was just struck, generating feedback only for a newly added contact point better simulates the experience of typing on a physical keyboard; it also makes it easier for the user to build an association with the newly added contact point, further reducing the difficulty of training touch typing on the touch screen.
In one possible implementation of the first aspect, if the first virtual key is a non-anchor key, the electronic device performs a second feedback operation, which prompts the user that the first virtual key is a non-anchor key; the first and second feedback operations are different feedback operations. Because every key gives feedback when a user types on a physical keyboard, performing a feedback operation for anchor keys and a different one for non-anchor keys increases the similarity between the virtual keyboard and a physical keyboard, helps the user memorize the different kinds of keys, and thus helps the user achieve touch typing on the virtual keyboard.
In one possible implementation of the first aspect, the first feedback operation emits a first type of vibration wave through the touch screen and the second feedback operation emits a second type of vibration wave, the two types being different. If the electronic device emits continuous vibration waves through the vibration feedback elements, different wave types differ in any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, or vibration waveform. If the electronic device emits vibration waves in pulsed form, different wave types differ in any one or more of: vibration amplitude, vibration frequency, vibration duration, vibration waveform, or the rate at which the pulsed vibration waves are emitted.
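A vibration wave "type" can be modeled as a tuple of these characteristics; two waves belong to different types if any listed characteristic differs. The field names, waveform labels, and parameter values below are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VibrationWave:
    amplitude: float        # vibration amplitude (relative units)
    frequency_hz: float     # vibration frequency
    duration_ms: float      # vibration duration
    waveform: str           # e.g. "sine", "square"
    pulse_rate_hz: float = 0.0  # pulse emission rate; 0 = continuous wave

# Hypothetical choices: anchor keys get a stronger wave than non-anchor keys.
ANCHOR_WAVE = VibrationWave(1.0, 170.0, 30.0, "sine")
NON_ANCHOR_WAVE = VibrationWave(0.6, 170.0, 30.0, "sine")

def same_type(a, b):
    """Two waves are the same type only if every characteristic matches."""
    return a == b
```

Here the two waves differ only in amplitude, which already makes them distinct types under the definition above.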
In one possible implementation of the first aspect, before performing the first feedback operation, the electronic device obtains the position type corresponding to the first contact point according to the first position information. The position type distinguishes whether the first contact point lies in a first position area of the first virtual key (also called the characteristic area of the anchor key) or in a second position area of the first virtual key (also called the edge area of the anchor key), the two areas being different. The electronic device then performs the first feedback operation through the touch screen according to that position type, with the feedback operation for the first position area differing from the feedback operation for the second position area.
In this implementation, the whole area of an anchor key and/or non-anchor key is divided into a first position area and a second position area, and the type of vibration wave emitted through the at least one first vibration feedback element differs depending on which area the first contact point falls in. This helps the user memorize the boundaries of the virtual keys and build muscle memory for their different areas, further reducing the difficulty of achieving touch typing on the touch screen.
In one possible implementation of the first aspect, one of the following holds: the feedback operation for the first position area of an anchor key is the same as that for the first position area of a non-anchor key, while the feedback operations for their second position areas differ; or the feedback operations for their first position areas differ while those for their second position areas are the same; or the feedback operations differ for both the first position areas and the second position areas.
In one possible implementation of the first aspect, the first contact operation is a pressing operation, and the method further includes: the electronic device detects a second contact operation acting on the touch screen, which is a touch operation, and acquires second position information of the corresponding second contact point; in response to the second contact operation, the electronic device changes the tactile characteristics of the second contact point on the touch screen, the tactile characteristics including any one or more of: sliding friction coefficient, stick-slip, and temperature.
In one possible implementation of the first aspect, before detecting the first contact operation, the electronic device detects a first gesture operation acting on the touch screen and, in response, selects from multiple types of virtual keyboard the first type corresponding to that gesture; the virtual keys included in different keyboard types are not identical. The electronic device then displays the first type of virtual keyboard through the touch screen, its position on the touch screen remaining fixed while it is displayed, and detects the first contact operation during that display. The meanings of the terms, the specific implementation steps, and the beneficial effects of this implementation are described in the seventh aspect below and are not introduced here.
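The gesture-to-keyboard selection step can be sketched as a simple lookup from a recognized gesture label to a keyboard type. The gesture labels and keyboard-type names are purely hypothetical; the patent defers the concrete gesture definitions to its seventh aspect.

```python
# Hypothetical gesture label -> virtual keyboard type.
GESTURE_TO_KEYBOARD = {
    "two_hand_rest": "full",     # e.g. resting both hands -> full keyboard
    "one_hand_tap": "numeric",   # e.g. a one-hand gesture -> numeric pad
}

def select_keyboard(gesture):
    """Return the keyboard type for a recognized gesture, or None if the
    gesture does not correspond to any configured keyboard type."""
    return GESTURE_TO_KEYBOARD.get(gesture)
```

Once selected, the keyboard's on-screen position stays fixed for the duration of its display, so the per-key mapping relationships described earlier remain valid.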
In a second aspect, an embodiment of the present application provides an electronic device that may be used in the field of virtual keyboards. The electronic device has a touch screen comprising a contact sensing module and a vibration feedback module, the latter containing a plurality of vibration feedback elements. The contact sensing module acquires first position information of a first contact point on the touch screen; it may take the form of a contact sensing film, which may be a capacitive, pressure-based, temperature-based, or other type of film. If the first virtual key corresponding to the first contact point is an anchor key, the first vibration feedback element emits vibration waves to prompt that the first virtual key is an anchor key. The first vibration feedback element is any one of: a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film; the first virtual key is one key of the virtual keyboard, and the first vibration feedback element is the element among the plurality that matches the first virtual key.
In a possible implementation of the second aspect, the first contact point is obtained from a pressing operation on the touch screen, and the touch screen further includes a cover plate and an ultrasonic module configured to emit ultrasonic waves that change the tactile characteristics of the cover plate. Specifically, the contact sensing module further acquires second position information of a second contact point on the touch screen, and the ultrasonic module emits ultrasonic waves to change the cover plate's tactile characteristics when the second contact point results from a touch operation on the touch screen. Alternatively, the touch screen further includes a cover plate and an electrostatic module that generates an electrical signal to change the tactile characteristics of the cover plate; the contact sensing module likewise acquires the second position information, and the electrostatic module generates the electrical signal when the second contact point results from a touch operation on the touch screen. The tactile characteristics include any one or more of: sliding friction coefficient, stick-slip, and temperature.
In this implementation, the touch screen can also change the tactile characteristics of the cover plate via the ultrasonic module or the electrostatic module, providing richer tactile feedback; that richer feedback can in turn be used to train the user in touch typing on the touch screen, further reducing its difficulty.
In a possible implementation of the second aspect, the touch screen further includes a pressure sensing module integrated with the vibration feedback module, and the vibration feedback element is a piezoelectric ceramic sheet, a piezoelectric polymer, or a piezoelectric composite material. The pressure sensing module collects the pressure value of the first contact operation to determine whether it is a pressing operation or a touch operation. Specifically, in one case, the plurality of vibration feedback elements in the vibration feedback module (which may also be called a pressure sensing module) are divided by role: a second vibration feedback element collects pressure values, while a third vibration feedback element, different from the second, emits vibration waves for feedback. In the other case, the same plurality of vibration feedback elements collect pressure values during a first time period and emit vibration waves during a second time period, the two periods being different.
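The second (time-multiplexed) option above can be sketched as a fixed frame in which the shared piezoelectric elements first sense pressure, then emit vibration, with the two periods never overlapping. The period lengths and function name are invented for illustration; the patent does not give timing values.

```python
def element_role(t_ms, sense_ms=8, vibrate_ms=2):
    """Role of the shared piezo elements at time t_ms: within each
    (sense_ms + vibrate_ms) frame they first collect pressure values,
    then emit vibration waves. Periods are hypothetical."""
    phase = t_ms % (sense_ms + vibrate_ms)
    return "sense_pressure" if phase < sense_ms else "emit_vibration"
```

Because piezoelectric materials both generate charge under pressure and deform under drive voltage, one element can serve both roles as long as sensing and actuation are separated in time, which is what makes the integrated, thinner module possible.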
In this implementation, the touch screen is further provided with a pressure sensing module for acquiring pressure values, so that not only the position information of a contact point but also its pressure value can be acquired, allowing contact operations acquired through the touch screen to be handled at a finer granularity; moreover, integrating the pressure sensing module and the vibration feedback module as one unit reduces the thickness of the touch screen and improves the convenience of the electronic device.
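The time-division case above, in which the same piezoelectric elements sense pressure in one time period and emit vibration waves in another, can be sketched as follows. This is a purely illustrative Python sketch, not part of the claimed implementation; the function name and the 20 ms slot length are assumptions.

```python
# Illustrative sketch: time-division sharing of piezoelectric elements
# between pressure sensing (first time period) and vibration feedback
# (second time period). The 20 ms slot length is an assumed value.

def schedule_role(time_ms, slot_ms=20):
    """Alternate roles every slot: even-numbered slots sense pressure,
    odd-numbered slots emit vibration waves, so the same elements serve
    both functions in different, non-overlapping time periods."""
    return "sense" if (time_ms // slot_ms) % 2 == 0 else "vibrate"
```

In the element-split case, the same idea would instead partition the element set once, assigning some elements permanently to sensing and the rest to feedback.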
For the meanings of the terms, the specific implementation steps, and the beneficial effects of the second aspect and some possible implementation manners of the second aspect of the embodiments of the present application, reference may be made to the descriptions of the various possible implementation manners of the first aspect, and details are not repeated here.
In a third aspect, an embodiment of the present application provides an electronic device, which may be used in the field of virtual keyboards. The electronic device includes a touch screen in which a plurality of vibration feedback elements are configured, a memory, one or more processors, and one or more programs that, when executed by the one or more processors, cause the electronic device to perform the following steps: detecting a first contact operation acting on the touch screen; in response to the first contact operation, acquiring first position information of a first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard; when the first virtual key is an anchor point key, acquiring a first vibration feedback element from the plurality of vibration feedback elements, where the first vibration feedback element is the vibration feedback element matched with the first virtual key; and instructing the first vibration feedback element to emit a vibration wave so as to perform a first feedback operation, where the first feedback operation is used to prompt that the first virtual key is an anchor point key.
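The anchor-key feedback steps above can be sketched as follows. This is an illustrative sketch only; the key layout, the hit-testing rule, and the key-to-element mapping are assumptions rather than details of the claimed device.

```python
# Illustrative sketch of the anchor-key feedback flow: map a contact point
# to a virtual key, and if that key is an anchor point key, pick the
# matched vibration feedback element and instruct it to vibrate.

ANCHOR_KEYS = {"F", "J"}  # typical touch-typing anchor (home-row) keys

def key_at(position, layout):
    """Map a contact position to the virtual key whose rectangle contains it.
    layout maps key name -> (x0, y0, x1, y1)."""
    x, y = position
    for key, (x0, y0, x1, y1) in layout.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None

def on_contact(position, layout, element_map):
    """Return the feedback decision for one contact operation.
    element_map maps key name -> matched vibration feedback element id."""
    key = key_at(position, layout)
    if key in ANCHOR_KEYS:
        return ("vibrate", element_map.get(key))  # first feedback operation
    return ("no_feedback", None)
```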
In the third aspect of the embodiments of the present application, the electronic device may further be configured to perform the steps executed by the electronic device in the various possible implementation manners of the first aspect. For the specific implementation of some steps of the third aspect and its possible implementation manners, and the beneficial effects of each, reference may be made to the descriptions of the various possible implementation manners of the first aspect; details are not repeated here.
In a fourth aspect, embodiments of the present application provide a computer program, which when run on a computer, causes the computer to perform the feedback method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides an electronic device, including a processor coupled with a memory; the memory is configured to store a program; and the processor is configured to execute the program in the memory, so that the electronic device performs the feedback method according to the first aspect.
In a sixth aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the feedback method according to the first aspect.
In a seventh aspect, an embodiment of the present application provides a chip system, including a processor configured to support implementing the functions referred to in the first aspect, for example, sending or processing the data and/or information referred to in the method. In one possible design, the chip system further includes a memory configured to store program instructions and data necessary for the server or the communication device. The chip system may consist of a chip, or may include a chip and other discrete devices.
In an eighth aspect, an embodiment of the present application provides a processing method for a virtual keyboard, which may be used in the field of human-computer interaction. The method is applied to an electronic device in which a display screen is configured, and includes the following steps: the electronic device detects a first gesture operation acting on the display screen, and in response to the detected first gesture operation, selects a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, where the virtual keys included in different types of virtual keyboards among the plurality of types are not completely identical; and the electronic device presents the first type of virtual keyboard via the display screen.
In this implementation, the electronic device is provided with a plurality of different types of virtual keyboards whose virtual keys are not completely identical, and the user can invoke different types of virtual keyboards through different gesture operations. That is, the virtual keyboard is no longer limited to displaying 26 letters; richer virtual keys are provided to the user through the different types of virtual keyboards. This improves the user's flexibility in invoking virtual keyboards, provides richer virtual keys, and removes the need for an additional physical keyboard.
In a possible implementation manner of the eighth aspect, the selecting, by the electronic device, a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards includes: selecting, according to a first rule, the first type of virtual keyboard corresponding to the first gesture operation from the plurality of types of virtual keyboards, where the first rule indicates the correspondence between a plurality of types of gesture operations and the plurality of types of virtual keyboards. In this implementation, the electronic device is preconfigured with the first rule; after a first gesture operation acting on the display screen is detected, the first type of virtual keyboard corresponding to that specific gesture operation can be obtained according to the first rule, which improves the efficiency of the virtual keyboard matching process.
In one possible implementation of the eighth aspect, in one case, the first rule directly includes the correspondence between a plurality of types of gesture operations and a plurality of types of virtual keyboards; that is, the first rule includes correspondences between a plurality of pieces of first identification information and a plurality of pieces of second identification information, where each piece of first identification information uniquely identifies one type of gesture operation and each piece of second identification information uniquely identifies one type of virtual keyboard. In another case, the first rule includes correspondences between multiple sets of conditions and the plurality of types of virtual keyboards, where each set of conditions is a set of constraints on the gesture parameters of a gesture operation, and each set of conditions corresponds to one type of gesture operation.
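The condition-set form of the first rule can be sketched as an ordered table of predicates, each mapped to a keyboard type. The specific conditions and keyboard names below are illustrative assumptions drawn loosely from later implementations, not a normative rule.

```python
# Illustrative sketch of the first rule as (condition set, keyboard type)
# pairs; the first matching condition set selects the keyboard. The gesture
# is an assumed dict of gesture parameters: hands, contacts, region.

FIRST_RULE = [
    (lambda g: g["hands"] == 2, "full_keyboard"),
    (lambda g: g["hands"] == 1 and g["region"] == "lower_right",
     "function_key_keyboard"),
    (lambda g: g["hands"] == 1 and g["contacts"] < 3, "arc_keyboard"),
    (lambda g: g["hands"] == 1, "mini_keyboard"),
]

def select_keyboard(gesture):
    """Return the first keyboard type whose condition set the gesture meets."""
    for condition, keyboard in FIRST_RULE:
        if condition(gesture):
            return keyboard
    return None
```

Ordering the table from the most specific condition set to the most general one keeps the lookup unambiguous when several conditions would otherwise match.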
In a possible implementation manner of the eighth aspect, before the electronic device selects the first type of virtual keyboard corresponding to the first gesture operation from the plurality of types of virtual keyboards, the method further includes: acquiring a first gesture parameter corresponding to the first gesture operation, where the first gesture parameter includes any one or more of the following: position information of the contact points corresponding to the first gesture operation, quantity information of those contact points, area information of those contact points, relative angle information of the hand corresponding to the first gesture operation, position information of that hand, quantity information of the hands corresponding to the first gesture operation, and shape information of that hand. The selecting then includes: selecting, by the electronic device, the first type of virtual keyboard from the plurality of types of virtual keyboards according to the first gesture parameter.
In this implementation, the first gesture parameter includes not only the position information of each contact point and the quantity information of the contact points, but also the area information of each contact point; the area information makes it possible to distinguish contact points triggered by the palm from the other contact points, so that the type of the first gesture operation can be estimated accurately, displaying the wrong virtual keyboard can be avoided, and the accuracy of the virtual keyboard display process is improved. Furthermore, after secondary processing of the acquired first gesture parameter, information such as the relative angle of the hand, the position of the hand, the number of hands, or the shape of the hand can be obtained; that is, richer information about the first gesture operation can be derived from the first gesture parameter, which improves the flexibility of the virtual keyboard matching process.
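One way the contact-area information could be used to separate palm-triggered contact points, as described above, is a simple area filter. This is an illustrative sketch; the 200 mm^2 threshold is an assumed value, not one stated in the application.

```python
# Illustrative sketch: reject palm-triggered contact points by contact area.
# Each contact is an assumed dict with 'position' and 'area_mm2' fields;
# the palm threshold of 200 mm^2 is an assumption for illustration.

PALM_AREA_MM2 = 200.0

def finger_contacts(contacts):
    """Keep only contact points whose area is below the palm threshold,
    so that only finger contacts feed the gesture-type estimation."""
    return [c for c in contacts if c["area_mm2"] < PALM_AREA_MM2]
```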
In one possible implementation manner of the eighth aspect, the method further includes: the electronic device, in response to the first gesture operation, obtains a first angle, where the first angle indicates the relative angle between the hand corresponding to the first gesture operation and the edge of the display screen, or the relative angle between that hand and the centerline of the display screen. Presenting the first type of virtual keyboard through the display screen then includes: acquiring a first display angle of the first type of virtual keyboard according to the first angle, and displaying the first type of virtual keyboard at the first display angle through the display screen, where the first display angle indicates the relative angle between an edge of the first type of virtual keyboard and the edge of the display screen, or the relative angle between an edge of the first type of virtual keyboard and the centerline of the display screen.
In this implementation, the relative angle (that is, the first angle) between the user's hand and the edge or centerline of the display screen is obtained, and the display angle of the virtual keyboard is determined according to the first angle, so that the display angle of the keyboard better fits the angle at which the user's hand is placed, making input with the virtual keyboard more comfortable and convenient.
In a possible implementation manner of the eighth aspect, if the first type of virtual keyboard is a full keyboard, the full keyboard is split into a first sub-keyboard and a second sub-keyboard, the first angle includes a relative angle of the left hand and a relative angle of the right hand, the first sub-keyboard and the second sub-keyboard include different virtual keys of the full keyboard, and the first display angle includes a display angle of the first sub-keyboard and a display angle of the second sub-keyboard. If the first angle indicates the relative angle between the hand corresponding to the first gesture operation and the edge of the display screen, the display angle of the first sub-keyboard indicates the relative angle between the edge of the first sub-keyboard and the edge of the display screen, and the display angle of the second sub-keyboard indicates the relative angle between the edge of the second sub-keyboard and the edge of the display screen; if the first angle indicates the relative angle between the hand corresponding to the first gesture operation and the centerline of the display screen, the display angle of the first sub-keyboard indicates the relative angle between the edge of the first sub-keyboard and the centerline of the display screen, and the display angle of the second sub-keyboard indicates the relative angle between the edge of the second sub-keyboard and the centerline of the display screen.
In a possible implementation manner of the eighth aspect, in one case, the electronic device determines whether the first angle is greater than or equal to a preset angle threshold, and if so, acquires the first display angle and displays the first type of virtual keyboard at the first display angle through the display screen; the preset angle threshold may be 25 degrees, 28 degrees, 30 degrees, 32 degrees, 35 degrees, or another value, and is not limited here. In another case, after acquiring the first angle, the electronic device determines the first display angle of the first type of virtual keyboard to be the first angle and displays the first type of virtual keyboard at that angle through the display screen.
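The two cases above (threshold-gated rotation versus always following the hand angle) can be sketched as follows. The behavior below the threshold in the first case is an assumption, since the text does not specify it; the 30-degree default is one of the example values given above.

```python
# Illustrative sketch of the two display-angle policies for the keyboard.
# The returned angle is relative to the same reference (screen edge or
# centerline) as the measured hand angle.

def display_angle(hand_angle_deg, threshold_deg=30.0, follow_hand=False):
    """Return the angle at which the virtual keyboard should be drawn."""
    if follow_hand:
        # Second case: the display angle is determined to be the first angle.
        return hand_angle_deg
    if hand_angle_deg >= threshold_deg:
        # First case: rotate once the hand angle reaches the preset threshold.
        return hand_angle_deg
    # Below the threshold the text is silent; keeping the keyboard
    # unrotated is an assumed default.
    return 0.0
```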
In one possible implementation of the eighth aspect, different types of virtual keyboards among the plurality of types have different functions, and the virtual keyboards with different functions include a combination of any two or more of the following: a numeric keyboard, a function key keyboard, a full keyboard, and a custom keyboard, where the function key keyboard consists of function keys. In this implementation, because the different types of virtual keyboards have different functions, virtual keyboards with different functions can be provided to the user, which improves the user's flexibility when using virtual keyboards and enhances user stickiness of this solution.
In one possible implementation of the eighth aspect, in a case where the first gesture operation is a one-handed operation, the first type of virtual keyboard is any one of the following: a mini keyboard, a numeric keyboard, a function key keyboard, a circular keyboard, an arc-shaped keyboard, or a custom keyboard, where the mini keyboard includes 26 letter keys, the functional keyboard is displayed within an application program, and the virtual keys of the functional keyboard correspond to the functions of that application program. Note that the mini keyboard, numeric keyboard, function key keyboard, circular keyboard, arc-shaped keyboard, and custom keyboard need not all be configured in the same electronic device at the same time; the examples here merely show that the virtual keyboard triggered by a one-handed operation in a given electronic device may be any one of these. This implementation provides various concrete forms of the virtual keyboard displayed through the display screen for both one-handed and two-handed first gesture operations, which improves the implementation flexibility of this solution and broadens its application scenarios.
In one possible implementation of the eighth aspect, in a case where the first gesture operation is a two-handed operation, the first type of virtual keyboard is a full keyboard, the full keyboard includes at least 26 letter keys, and the full keyboard is larger than the mini keyboard. Presenting the first type of virtual keyboard through the display screen includes: when the distance between the two hands is less than or equal to a first distance threshold, displaying the full keyboard in an integrated manner through the display screen; and when the distance between the two hands is greater than the first distance threshold, displaying a first sub-keyboard through a second area of the display screen and a second sub-keyboard through a third area of the display screen, where the second area and the third area are different areas of the display screen and the first and second sub-keyboards include different virtual keys of the full keyboard. The first distance threshold may be 70 mm, 75 mm, 80 mm, or another value, and is not limited here.
In this implementation, whether to display the virtual keyboard in an integrated manner or in a split manner is decided based on the distance between the user's hands, which further improves the flexibility of the virtual keyboard display process, makes the displayed virtual keyboard more convenient to use, and further enhances user stickiness of this solution.
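The integrated-versus-split decision can be sketched as follows. The 75 mm threshold is one of the example values given above; the returned layout descriptions are illustrative assumptions.

```python
# Illustrative sketch of the full-keyboard layout decision based on the
# distance between the user's two hands. Region names are assumptions.

FIRST_DISTANCE_MM = 75.0  # one of the example threshold values

def full_keyboard_layout(hand_distance_mm):
    """Show one integrated full keyboard when the hands are close together,
    otherwise split it into two sub-keyboards in separate screen regions."""
    if hand_distance_mm <= FIRST_DISTANCE_MM:
        return {"mode": "integrated", "regions": ["center"]}
    return {"mode": "split",
            "regions": ["second_area_left", "third_area_right"]}
```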
In one possible implementation of the eighth aspect, in a case where the first gesture operation is a first one-handed operation, the first type of virtual keyboard is a mini keyboard. In this implementation, making the first type of virtual keyboard a mini keyboard for a one-handed operation helps improve the user's flexibility when entering letters.
In one possible implementation of the eighth aspect, the one-handed operation includes a left-handed one-handed operation and a right-handed one-handed operation. When the first gesture operation is a right-handed one-handed operation, the first type of virtual keyboard is a numeric keyboard; when the first gesture operation is a left-handed one-handed operation, the first type of virtual keyboard is a functional keyboard whose virtual keys correspond to the functions of the application program. For example, if the first gesture operation is obtained in a drawing application, the functional keyboard may consist of keys commonly used in drawing software.
In this implementation, the numeric keyboard is shown for a right-handed one-handed operation and the functional keyboard for a left-handed one-handed operation, which better matches the user's habits on a physical keyboard, reduces the difference between the virtual keyboard and a physical keyboard, and helps enhance user stickiness.
In one possible implementation of the eighth aspect, in a case where the first gesture operation is a one-handed operation located in a first area of the display screen, the first type of virtual keyboard is a function key keyboard, and the first area is located at the lower left or lower right of the display screen. Because the function keys of a physical keyboard sit at its lower left or lower right, triggering the function key keyboard with a one-handed operation in the first area matches the user's habit of using a physical keyboard; this makes the trigger gesture easy to remember, reduces the implementation difficulty of this solution, and enhances user stickiness.
In one possible implementation manner of the eighth aspect, the method further includes: the electronic device obtains a contact operation for a first virtual key in the function key keyboard; for example, the first virtual key may be a Ctrl key, or may include both the Ctrl key and a Shift key. In response to the contact operation for the first virtual key in the function key keyboard, the electronic device highlights a second virtual key on the display screen, where the second virtual key is a key of a combination shortcut other than the first virtual key. Highlighting includes, but is not limited to, brightening, bolding, or flashing. As an example, in an application program such as a drawing program, the combination Ctrl+Shift+I can invert the currently processed image; in that case the first virtual key includes the Ctrl key and the Shift key, and the second virtual key is the virtual key I.
In this embodiment of the application, while the function key keyboard is displayed on the display screen, a contact operation for a first virtual key in the function key keyboard is obtained, and in response a second virtual key is highlighted on the display screen, where the second virtual key is a key of a combination shortcut other than the first virtual key. Because the function key keyboard occupies a small area, the area required to display the virtual keyboard is reduced; and because the second virtual key of the combination shortcut is displayed automatically when the user touches the first virtual key, the user's need for shortcuts is met while waste of the display area of the display screen is avoided.
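The combination-shortcut highlighting can be sketched as a lookup from held modifier keys to the remaining keys of known shortcuts, using the Ctrl+Shift+I example above. The shortcut table itself is an illustrative assumption.

```python
# Illustrative sketch: given the held first virtual keys, return the second
# virtual keys that should be highlighted on the display screen. The table
# of shortcuts is assumed for illustration (Ctrl+Shift+I is the invert-image
# example from the text).

SHORTCUTS = {
    frozenset({"Ctrl", "Shift"}): {"I"},   # Ctrl+Shift+I: invert image
    frozenset({"Ctrl"}): {"C", "V", "Z"},  # copy / paste / undo
}

def keys_to_highlight(held_keys):
    """Return the second virtual keys to highlight for the held first keys."""
    return SHORTCUTS.get(frozenset(held_keys), set())
```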
In a possible implementation manner of the eighth aspect, the first gesture operation is a contact operation acquired through the display screen, and the first gesture parameter includes quantity information of the contact points corresponding to the first gesture operation; in a case where the first gesture operation is a one-handed operation with fewer than three contact points, the first type of virtual keyboard is a circular keyboard or an arc-shaped keyboard. In this implementation, a circular or arc-shaped keyboard can be provided for such an operation, offering not only keyboards that exist on physical keyboards but also keyboards that do not, which enriches the keyboard types, gives the user more choices, and further improves the user's flexibility of selection.
In a possible implementation manner of the eighth aspect, the first rule includes a first sub-rule, and the first sub-rule is obtained based on a user-defined operation performed on at least one type of gesture operation and/or at least one type of virtual keyboard. In this implementation, the user can customize the trigger gesture and/or the type of virtual keyboard, so that the virtual keyboard display process better matches the user's expectations, further enhancing user stickiness of this solution.
In a possible implementation manner of the eighth aspect, a plurality of vibration feedback elements are configured in the display screen, the position of the first type of virtual keyboard on the display screen is fixed during its display, and after the first type of virtual keyboard is displayed through the display screen, the method further includes: the electronic device detects a first contact operation acting on the display screen, and in response to the first contact operation, acquires first position information of a first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard. When the first virtual key is an anchor point key, the electronic device acquires a first vibration feedback element from the plurality of vibration feedback elements, where the first vibration feedback element is the vibration feedback element matched with the first virtual key, and instructs the first vibration feedback element to emit a vibration wave so as to perform a first feedback operation, where the first feedback operation is used to prompt that the first virtual key is an anchor point key. For the meanings, specific implementation steps, and beneficial effects of terms such as the first contact operation, the first contact point, the first position information, the first virtual key, and the first vibration feedback element in this implementation, reference may be made to the descriptions in the various possible implementations of the first aspect, which are not repeated here.
In the eighth aspect of the embodiments of the present application, the electronic device may further be configured to perform the steps executed by the electronic device in the various possible implementation manners of the first aspect. For the specific implementation of some steps of the eighth aspect and its possible implementation manners, and the beneficial effects of each, reference may be made to the descriptions of the various possible implementation manners of the first aspect; details are not repeated here.
In a ninth aspect, an embodiment of the present application provides an electronic device, which may be used in the field of human-computer interaction. The electronic device includes a display screen, a memory, one or more processors, and one or more programs stored in the memory; when the one or more processors execute the one or more programs, the electronic device is caused to perform the following steps: in response to a detected first gesture operation, selecting a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, where the virtual keys included in different types of virtual keyboards among the plurality of types are not completely identical; and presenting the first type of virtual keyboard via the display screen.
In the ninth aspect of the embodiment of the present application, the electronic device may further be configured to implement steps executed by the electronic device in various possible implementation manners of the eighth aspect, and for specific implementation manners of some steps in the ninth aspect and various possible implementation manners of the ninth aspect of the embodiment of the present application and beneficial effects brought by each possible implementation manner, reference may be made to descriptions in various possible implementation manners of the eighth aspect, which are not described in detail herein again.
In a tenth aspect, embodiments of the present application provide a computer program, which, when run on a computer, causes the computer to execute the processing method of the virtual keyboard according to the above eighth aspect.
In an eleventh aspect, an embodiment of the present application provides an electronic device, including a processor coupled with a memory; the memory is configured to store a program; and the processor is configured to execute the program in the memory, so that the electronic device performs the processing method of the virtual keyboard according to the eighth aspect.
In a twelfth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the processing method of the virtual keyboard according to the above eighth aspect.
In a thirteenth aspect, embodiments of the present application provide a chip system, which includes a processor, and is configured to support implementation of the functions referred to in the foregoing aspects, for example, sending or processing data and/or information referred to in the foregoing methods. In one possible design, the system-on-chip further includes a memory for storing program instructions and data necessary for the server or the communication device. The chip system may be formed by a chip, or may include a chip and other discrete devices.
In a fourteenth aspect, an embodiment of the present application provides a method for processing an application interface, which can be used in the field of human-computer interaction. The method is applied to an electronic device that includes a first display screen and a second display screen, and includes the following steps: the electronic device displays a first application interface through the first display screen; in response to a detected first operation, the electronic device switches the mode type corresponding to the first application interface to handwriting input; and in response to the handwriting input mode, the electronic device triggers display of the first application interface on the second display screen, so as to acquire handwritten content for the first application interface through the second display screen. Specifically, an operating system runs on the electronic device, and the electronic device can display the first application interface on the second display screen by calling a MoveTo function, a SetWindowPos function, or a SetWindowPlacement function in the operating system.
In this implementation, the electronic device displays the first application interface on the first display screen, and when it detects that the mode type corresponding to the first application interface is handwriting input, it triggers display of the first application interface on the second display screen so that input can be performed directly through the first application interface displayed there. In this way, if the user places the second display screen in an orientation convenient for writing, the electronic device automatically moves the application interface that requires handwritten input to that screen without any user action, which improves the efficiency of the whole input process, avoids redundant steps, keeps the operation simple, and helps enhance user stickiness.
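The mode-driven screen switch can be sketched as follows. The move_window stub stands in for an operating-system call such as SetWindowPos or SetWindowPlacement; the screen identifiers and the routing for keyboard input are assumptions based on the surrounding implementations.

```python
# Illustrative sketch: route the first application interface to the display
# screen that suits the current input mode. move_window is a stand-in for
# an OS window-move call (e.g. SetWindowPos / SetWindowPlacement).

def move_window(window, screen):
    """Stand-in for the operating-system window-move call."""
    window["screen"] = screen
    return window

def on_mode_change(window, mode):
    """Route the interface when its mode type changes."""
    if mode == "handwriting":
        # Handwriting: show the interface on the screen placed for writing.
        return move_window(window, "second_screen")
    if mode == "keyboard":
        # Keyboard input: interface on the first screen; the virtual
        # keyboard would be shown on the second screen.
        return move_window(window, "first_screen")
    return window
```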
In one possible implementation manner of the fourteenth aspect, after the electronic device triggers display of the first application interface on the second display screen in response to the handwriting input mode, the method further includes: when detecting that the mode type corresponding to the first application interface has switched to keyboard input, the electronic device, in response to the keyboard input mode, triggers display of the first application interface on the first display screen and display of a virtual keyboard on the second display screen, so as to acquire input content for the first application interface through the virtual keyboard on the second display screen. Alternatively, under the same condition, the electronic device triggers display of the first application interface on the first display screen and display of the virtual keyboard and an application control bar on the second display screen.
In this implementation manner, in the process of displaying the application interface, the layout of the application interface on the different display screens of the electronic device can be automatically adjusted not only when the application interface changes from another mode type to handwriting input, but also when the mode type of the application interface changes to keyboard input, in which case the virtual keyboard is also automatically displayed. Therefore, when the mode type of the application interface changes to keyboard input, the user can directly perform keyboard input without manually adjusting the layout of the application interface on the different display screens; the steps are simple, which further improves the user stickiness of the solution.
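The mode-type-driven layout switching described above can be summarized in a small sketch. This is not the patent's actual implementation; the mode-type names (`"handwriting"`, `"keyboard"`, `"browse"`), the `Layout` structure, and the module names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Layout:
    first_screen: list = field(default_factory=list)   # content shown on the first display screen
    second_screen: list = field(default_factory=list)  # content shown on the second display screen

def layout_for_mode(mode: str, app: str = "first_app") -> Layout:
    """Return the dual-screen layout corresponding to a detected input-mode type."""
    if mode == "handwriting":
        # Handwriting input: move the application interface to the
        # writing-friendly second display screen.
        return Layout(first_screen=[], second_screen=[app])
    if mode == "keyboard":
        # Keyboard input: application interface on the first screen;
        # virtual keyboard (optionally with the application control bar)
        # on the second screen.
        return Layout(first_screen=[app],
                      second_screen=["virtual_keyboard", "app_control_bar"])
    if mode == "browse":
        # Browsing mode: application interface only on the first screen.
        return Layout(first_screen=[app], second_screen=[])
    raise ValueError(f"unknown mode type: {mode}")
```

A dispatch table like this makes the layout rules of the fourteenth aspect explicit: each mode type maps to exactly one arrangement of the two screens.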
In one possible implementation manner of the fourteenth aspect, the method may further include: the electronic equipment detects a second operation acting on the second display screen; and changing the first display area of the application control bar into a second display area in response to the second operation, and changing a first control key group included in the application control bar into a second control key group, wherein the first control key group and the second control key group are control key sets corresponding to the target application. The specific meanings of the terms in the foregoing steps and the specific implementation manners of the foregoing steps will be described in the twentieth aspect, and will not be described in detail here.
In one possible implementation manner of the fourteenth aspect, the first application interface includes a first control key, and the method may further include: the electronic device detects a second operation acting on the first application interface; and in response to the second operation, displays the first control key in the application control bar and hides the first control key in the first application interface. Specific meanings of the terms in the foregoing steps and specific implementations of the foregoing steps will be described in the twenty-first aspect, and will not be described in detail here.
In one possible implementation manner of the fourteenth aspect, the displaying of the virtual keyboard on the second display screen includes: displaying a second type of virtual keyboard on the second display screen. The method further includes: the electronic device detects a first gesture operation acting on the second display screen, and in response to the first gesture operation, selects a first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, where the virtual keys included in different types of virtual keyboards among the multiple types are not identical; and displays the first type of virtual keyboard through the second display screen, where the first type of virtual keyboard and the second type of virtual keyboard are different types among the multiple types of virtual keyboards. For specific meanings of terms in the foregoing steps and specific implementation manners of the foregoing steps, reference may be made to the description of the eighth aspect. In the fourteenth aspect of the embodiments of the present application, the electronic device may further perform the steps performed by the electronic device in the various possible implementation manners of the eighth aspect; for specific implementation manners of some steps in the fourteenth aspect and its various possible implementation manners, and for the beneficial effects brought by each possible implementation manner, reference may be made to the descriptions in the various possible implementation manners of the eighth aspect, and details are not repeated here.
In one possible implementation manner of the fourteenth aspect, after the electronic device triggers presentation of the first application interface on the second display screen in response to the handwriting input mode, the method further includes: when detecting that the mode type corresponding to the first application interface is converted into a browsing mode, the electronic device, in response to the browsing mode, triggers the first application interface to be displayed on the first display screen and stops displaying the first application interface on the second display screen. In this implementation manner, when the mode type of the application interface changes to the browsing mode, the layout of the application interface on the different display screens can be automatically adjusted, so that the user does not need to manually adjust that layout; that is, the operation steps can be simplified in a variety of different application scenarios, which further improves the user stickiness of the solution.
In one possible implementation manner of the fourteenth aspect, the electronic device detecting the first operation includes any one or a combination of the following five items. The electronic device determines that the first operation is detected when detecting that the holding gesture of the electronic pen meets a first preset condition, where the holding gesture includes any one or a combination of more of the following items: a holding position, a holding force, and a holding angle; and the first preset condition includes any one or a combination of more of the following items: the holding position is located in a first position range, the holding force is located in a first force range, and the holding angle is located in a first angle range. Or, the electronic device acquires a trigger instruction for handwriting input through a first icon, and the first icon is displayed on the first application interface. Or, the electronic device determines that the first operation is detected when a preset click operation or a preset track operation is detected, where the preset click operation may be a single-click operation, a double-click operation, a triple-click operation, or a long-press operation, and the preset track operation may be a "Z"-shaped track operation, a downward sliding operation, a hook-shaped track operation, or a circle-shaped track operation. Or, the electronic device determines that the first operation is detected when detecting that the electronic pen is located within a preset range of the second display screen. Or, the electronic device determines that the first operation is detected when detecting that the electronic pen changes from a first preset state to a second preset state, where the change from the first preset state to the second preset state may be a change of the electronic pen from a stationary state to a moving state, a change of the electronic pen from a not-held state to a held state, and the like.
In this implementation manner, multiple manners of determining the mode type corresponding to the first application interface are provided, which improves the implementation flexibility of the solution and expands its application scenarios. Furthermore, when the mode type corresponding to the first application interface is determined according to the holding gesture of the electronic pen, the user can trigger the conversion of the mode type of the first application interface without performing any other operation; determining the mode type according to how the user holds the electronic pen can reduce the error rate of the judgment process, reduce the probability of incorrectly relocating the first application interface, avoid wasting computer resources, and improve user stickiness.
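The "first preset condition" on the pen's holding gesture can be sketched as three range checks, one per gesture attribute. All range values below are made-up assumptions for illustration; the patent does not specify concrete numbers.

```python
def grip_indicates_handwriting(position_mm: float, force_n: float, angle_deg: float) -> bool:
    """Return True when the holding gesture satisfies the first preset condition.

    The three thresholds (position, force, angle) are purely illustrative.
    """
    in_position_range = 20.0 <= position_mm <= 60.0   # first position range (assumed)
    in_force_range = 0.5 <= force_n <= 5.0            # first force range (assumed)
    in_angle_range = 30.0 <= angle_deg <= 70.0        # first angle range (assumed)
    # The patent allows "any one or a combination"; this sketch uses the
    # strictest combination (all three must hold).
    return in_position_range and in_force_range and in_angle_range
```

A production implementation would calibrate these ranges per user and could require only a subset of the conditions, as the implementation manner above permits.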
In a possible implementation manner of the fourteenth aspect, the first operation is a sliding operation in a first direction acquired by the second display screen, the sliding operation in the first direction is a sliding operation from the upper edge of the second display screen toward its lower edge, and the distance between the upper edge of the second display screen and the first display screen is shorter than the distance between the lower edge of the second display screen and the first display screen. Specifically, the electronic device acquires the sliding operation in the first direction through the second display screen, and in response to it, moves the virtual keyboard displayed on the second display screen toward the lower edge of the second display screen along the first direction; when the upper edge of the virtual keyboard reaches the lower edge of the second display screen, the electronic device confirms that the mode type corresponding to the first application interface is converted into handwriting input. In this implementation manner, the virtual keyboard displayed on the second display screen moves along with the user's downward sliding operation, and only when the upper edge of the virtual keyboard reaches the lower edge of the second display screen does the electronic device confirm the conversion to handwriting input, which makes the process of switching from keyboard input to handwriting input more engaging and is beneficial to improving user stickiness.
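The slide-to-dismiss confirmation above reduces to tracking the keyboard's top edge as the user drags. A minimal sketch, assuming pixel coordinates measured downward from the top of the second display screen (a coordinate convention not stated in the source):

```python
def keyboard_slide(keyboard_top: float, drag_dy: float, screen_height: float):
    """Move the virtual keyboard down by drag_dy pixels.

    Returns (new_top, handwriting_confirmed): the mode switch is confirmed
    only when the keyboard's upper edge reaches the screen's lower edge.
    """
    new_top = keyboard_top + max(0.0, drag_dy)   # only downward movement counts
    confirmed = new_top >= screen_height         # upper edge reached lower edge
    return min(new_top, screen_height), confirmed
```

Calling this per drag event gives the incremental animation the implementation manner describes; partial drags move the keyboard without triggering the mode change.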
In one possible implementation manner of the fourteenth aspect, after the electronic device triggers presentation of the first application interface on the second display screen, the method further includes: the electronic equipment acquires starting operation aiming at a second application interface, and determines a mode type corresponding to the second application interface based on the starting operation, wherein the second application interface and the first application interface are different application interfaces; under the condition that the mode type corresponding to the second application interface is handwriting input, the electronic equipment responds to the input mode of the handwriting input and triggers the second application interface to be displayed on the second display screen; or, under the condition that the mode type corresponding to the second application interface is keyboard input, the electronic device triggers to display the second application interface on the first display screen and display the virtual keyboard on the second display screen in response to the input mode of the keyboard input; or, when the mode type corresponding to the second application interface is the browsing mode, the electronic device triggers to display the second application interface on the first display screen in response to the browsing mode.
In this implementation manner, the mode type corresponding to an application interface can be automatically detected while the user is using it, and the display position of the application interface is adjusted according to that mode type; moreover, when an application interface is opened, its mode type can be determined based on the start operation, thereby determining its display position. The user can therefore use the application interface directly after performing the start operation on it, without any operation to move its position, which further improves the convenience of the solution and increases its user stickiness.
In a possible implementation manner of the fourteenth aspect, the determining, by the electronic device, the type of the mode corresponding to the second application interface based on the start operation includes: under the condition that the starting operation is acquired through the first display screen, the electronic equipment determines that the mode type corresponding to the second application interface is a keyboard input or browsing mode; and under the condition that the starting operation is acquired through the second display screen, the electronic equipment determines that the mode type corresponding to the second application interface is handwriting input.
In a possible implementation manner of the fourteenth aspect, the determining, by the electronic device, a mode type corresponding to the second application interface based on the start operation includes: under the condition that the starting operation is acquired through the electronic pen, the electronic equipment determines that the mode type corresponding to the second application interface is handwriting input; and under the condition that the starting operation is acquired through a mouse or fingers, the electronic equipment determines that the mode type corresponding to the second application interface is a keyboard input or browsing mode.
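The two implementation manners above (mode type from the screen that received the start operation, and mode type from the input device) can be merged into one illustrative rule. The string names for screens, devices, and modes are assumptions made for this sketch:

```python
def mode_for_start_operation(source_screen: str, input_device: str) -> str:
    """Determine a newly opened interface's mode type from its start operation.

    source_screen: "first" or "second" -- which display acquired the operation.
    input_device: "pen", "mouse", or "finger" -- what performed it.
    """
    # Per the implementations above: a start operation performed with the
    # electronic pen, or acquired through the second display screen,
    # indicates handwriting input.
    if input_device == "pen" or source_screen == "second":
        return "handwriting"
    # Mouse/finger start operations on the first screen indicate a
    # keyboard-input or browsing mode.
    return "keyboard_or_browse"
```

The patent treats the screen-based and device-based rules as alternatives; combining them here simply illustrates that both reduce to a small classification of the start operation.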
In a fifteenth aspect, an embodiment of the present application provides an electronic device, which can be used in the field of human-computer interaction. The electronic device includes a first display screen, a second display screen, a memory, one or more processors, and one or more programs; the one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the electronic device is caused to perform the following steps: displaying a first application interface through the first display screen; in response to a detected first operation, converting the mode type corresponding to the first application interface into handwriting input; and in response to the handwriting input mode, triggering the first application interface to be displayed on the second display screen, so as to acquire handwriting content for the first application interface through the second display screen. For the concepts of the nouns, the specific implementation steps, and the beneficial effects brought by each possible implementation manner in the fifteenth aspect and some possible implementation manners of the fifteenth aspect of the embodiments of the present application, reference may be made to the descriptions in the various possible implementation manners of the fourteenth aspect, and details are not repeated here.
In the fifteenth aspect of the embodiment of the present application, the electronic device may further be configured to implement steps performed by the electronic device in various possible implementation manners of the fourteenth aspect, and for specific implementation manners of some steps in the fifteenth aspect and various possible implementation manners of the fifteenth aspect of the embodiment of the present application and beneficial effects brought by each possible implementation manner, reference may be made to descriptions in various possible implementation manners of the fourteenth aspect, which are not described herein again.
In a sixteenth aspect, an embodiment of the present application provides a computer program, which, when run on a computer, causes the computer to execute the processing method of the application interface described in the fourteenth aspect.
In a seventeenth aspect, an embodiment of the present application provides an electronic device, including a processor coupled to the memory; the memory is used for storing programs; the processor is configured to execute the program in the memory, so that the electronic device executes the processing method of the application interface according to the fourteenth aspect.
In an eighteenth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the processing method of the application interface according to the fourteenth aspect.
In a nineteenth aspect, embodiments of the present application provide a chip system, which includes a processor, and is configured to support implementation of the functions referred to in the foregoing aspects, for example, sending or processing data and/or information referred to in the foregoing methods. In one possible design, the system-on-chip further includes a memory for storing program instructions and data necessary for the server or the communication device. The chip system may be formed by a chip, or may include a chip and other discrete devices.
In a twentieth aspect of the embodiments of the present application, a screen display method is provided, applied to an electronic device including a first display screen and a second display screen, the screen display method including:
displaying an interface of a target application on a first display screen;
displaying an application control bar on a second display screen;
changing a first display area of the application control bar to a second display area in response to the received first operation;
when the display area of the application control bar is the first display area, the application control bar comprises a first control key group;
when the display area of the application control bar is a second display area, the application control bar comprises a second control key group;
The first control key group and the second control key group are both control key sets used for controlling the target application, and the control keys included in the first control key group and the second control key group are not identical.
The electronic device may be an electronic device having two display screens connected together (for example, by a shaft or hinge connection), where the two display screens may be two independent display screens, or may be two display areas obtained by dividing one flexible folding screen or one curved screen, and the two display screens may perform different functions. The electronic device may be an electronic device that works independently as a whole, such as a personal notebook computer, or an electronic device formed by connecting two independently workable electronic devices that then work together, such as a dual-screen electronic device formed by splicing two mobile phones or two tablet computers.
The first operation may be an operation directly acting on the application control bar. For example, the first operation may change the display area of the application control bar through a touch-screen gesture; alternatively, the first operation may change the display area of the application control bar by clicking (a finger click, a mouse click, or the like) an enlargement button or a reduction button of the application control bar; alternatively, the first operation may change the display area of the application control bar by dragging the boundary of the application control bar with a mouse. The first operation may also be an operation indirectly acting on the application control bar. For example, the first operation may directly act on the control area through the foregoing three manners, changing the display area of the application control bar by changing the display area of the control area; or, the first operation may directly act, through the foregoing three manners, on other application display interfaces or input modules (a virtual keyboard, a handwriting input area, or the like) on the second display screen, changing the display area of the application control bar by changing the display area of those other display modules; or, the first operation may be an operation performed by the user on the target application in the first display screen. For example, when the number of control keys to be displayed in the application control bar in response to the user's first operation differs from the number of control keys displayed in the application control bar before the first operation, the display area of the application control bar may be adaptively adjusted, so that the control keys required by the first operation can be better displayed.
According to the operation and/or the requirement of the user, the display area and the control keys of the application control bar are flexibly changed, so that the application control bar can be flexibly adjusted according to the operation or the requirement of the user, the control keys related to the current operation of the user are always displayed in the application control bar, more convenient input operation is provided for the user, and the user experience is improved.
With reference to the twentieth aspect, in a first possible implementation manner of the twentieth aspect:
before the first display area of the application control bar is changed into the second display area, a virtual keyboard is displayed on the second display screen;
after the first display area of the application control bar is changed into the second display area, the display layout of the virtual keyboard is changed.
Specifically, when the second display area is larger than the first display area, the display area of the virtual keyboard is correspondingly reduced, and the layout of the keys in the virtual keyboard is changed along with the change of the display area, for example, all or part of the keys may be reduced, or the intervals between the keys may be compressed. When the second display area is smaller than the first display area, the display area of the virtual keyboard is correspondingly increased, and the layout of the keys in the virtual keyboard is changed along with the change of the display area, for example, all or part of the keys may be increased, or the intervals between the keys may be increased, and other functional modules such as a touch pad may be added on the basis of the virtual keyboard.
Because the application control bar is usually displayed on the second display screen together with other display modules (applications, input modules, or the like), when the display area of the application control bar is changed, the display layout of the other display modules is adaptively adjusted, so that the second display screen has neither blank areas with nothing displayed nor areas where displayed content overlaps; the display layout on the second display screen is thereby optimized, and the user experience is improved.
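The adaptive re-layout of the virtual keyboard in this first implementation manner can be sketched as proportional scaling: the keyboard takes whatever height the control bar leaves free, and key heights and gaps scale with it. All dimensions below are illustrative assumptions.

```python
def relayout_keyboard(screen_height: float, control_bar_height: float,
                      base_key_height: float = 40.0, base_gap: float = 8.0,
                      base_keyboard_height: float = 400.0):
    """Recompute the virtual keyboard layout after the control bar resizes.

    When the control bar grows, the keyboard and its keys shrink and the
    key spacing is compressed; when the control bar shrinks, they expand.
    """
    keyboard_height = screen_height - control_bar_height   # remaining space
    scale = keyboard_height / base_keyboard_height
    return {
        "keyboard_height": keyboard_height,
        "key_height": base_key_height * scale,   # shrink/grow all keys
        "key_gap": base_gap * scale,             # compress/expand spacing
    }
```

A richer implementation could instead shrink only some keys, or add modules such as a touch pad when extra space appears, as the text above notes.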
With reference to the twentieth aspect or the first possible implementation manner of the twentieth aspect, in a second possible implementation manner of the twentieth aspect:
before the first display area of the application control bar is changed into the second display area, the interface of the target application comprises a third control key group;
the second display area is larger than the first display area;
the second control key group comprises a first control key group and a third control key group;
and after the first display area of the application control bar is changed into the second display area, the interface of the target application does not comprise the third control key group.
When the user needs more control keys displayed in the application control bar, or the number of control keys corresponding to the user's current operation is large, the display area of the application control bar is increased and the control keys displayed in it are increased, which provides the user with more control keys and a more convenient input manner. In addition, when the display area of the application control bar is increased, control keys in the first display screen are transferred to the application control bar in the second display screen for display, which saves display space on the first display screen and makes its display content cleaner and less cluttered. Furthermore, after the control keys now displayed in the application control bar are removed from the first display screen, the size of the original display content in the first display screen can be increased, or new display content can be added on the basis of the original display content, providing more convenient operation for the user and improving the user experience.
With reference to the second possible implementation manner of the twentieth aspect, in a third possible implementation manner of the twentieth aspect:
and determining a third control key group according to the second display area and the priority order of the control keys to be displayed in the control key set to be displayed of the target application.
Specifically, the set of control keys to be displayed of the target application may be provided by the application program, and may display the set of control keys to be displayed in the application control bar. The priority order of the control keys to be displayed in the set can be defined by an application program, or can be determined by the operating system according to the functions of the control keys to be displayed or the use frequency of a user and other factors.
The application program provides the set of control keys to be displayed, the application program or the operating system specifies the priority order of those control keys, and the operating system determines which control keys are added to the application control bar when its display area is increased. In this way, the control keys displayed in the application control bar under its various display areas can be determined, which makes the configuration of the application control bar more flexible and able to support the user's various operation manners and requirements.
When the display area of the application control bar is increased, the control keys added to it are determined according to the priority order of the control keys to be displayed. Under the condition that the display area of the application control bar is limited, the control keys with higher priorities (more important, or more frequently used by the user) can be displayed in the application control bar preferentially, which provides more convenient operation for the user and improves the user experience.
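The priority-ordered selection of the third control key group can be sketched as a greedy fill: walk the to-be-displayed candidates in priority order and take keys while their total area still fits in the newly freed area. The per-key area, the data shapes, and the key names are assumptions for illustration.

```python
def select_added_keys(candidates, free_area: float, key_area: float = 10.0):
    """Pick the control keys to add to an enlarged application control bar.

    candidates: list of (key_name, priority) pairs; a lower number means a
    higher priority. Keys are admitted in priority order until the assumed
    uniform per-key area would exceed the free area.
    """
    chosen = []
    used = 0.0
    for name, _prio in sorted(candidates, key=lambda kp: kp[1]):
        if used + key_area > free_area:
            break                    # no room for further (lower-priority) keys
        chosen.append(name)
        used += key_area
    return chosen
```

With non-uniform key sizes this becomes a small knapsack-style choice, but the priority-first principle stated above stays the same.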
With reference to the second possible implementation manner of the twentieth aspect or the third possible implementation manner of the twentieth aspect, in a fourth possible implementation manner of the twentieth aspect:
the third control key group is displayed at a position closer to the first display screen than the first control key group.
Under the above setting, the first control key group is displayed at a position closer to both hands of the user than the third control key group. Namely, when the application control bar is enlarged each time, the newly added control keys are always displayed at the position close to the first display screen, and the originally displayed control keys in the application control bar are displayed at the positions closer to the two hands of the user, so that the operation of the user is facilitated. In the process of enlarging the display area of the application control bar, the priority of the newly added control key is often lower than that of the originally displayed control key in the application control bar, so that the control key with higher priority can be always displayed at the position closer to the two hands of the user, and more convenient operation is provided for the user, and the user experience is improved.
With reference to the twentieth aspect or the first possible implementation manner of the twentieth aspect, in a fifth possible implementation manner of the twentieth aspect:
before the first display area of the application control bar is changed into the second display area, the interface of the target application does not comprise a fourth control key group, and the fourth control key group is a control key set for controlling the target application;
the second display area is smaller than the first display area;
the second control key group is formed by removing the fourth control key group from the first control key group;
and after the first display area of the application control bar is changed into the second display area, the interface of the target application comprises part or all of the fourth control key group.
When the user wants fewer control keys displayed in the application control bar, or the number of control keys corresponding to the user's current operation is small, or the user needs to enlarge the display area of other display modules on the second display screen, the display area of the application control bar is reduced and the control keys displayed in it are reduced, which saves display space on the second display screen, reduces visual interference for the user, and makes it convenient for the user to quickly locate the needed control keys. In addition, after the fourth control key group is removed from the application control bar, some or all of the control keys in the fourth control key group are displayed on the first display screen, so that when the user needs those control keys, the user can still operate them through the first display screen; this compensates for the impact on user operation caused by shrinking the application control bar, and improves the user experience.
With reference to the fifth possible implementation manner of the twentieth aspect, in a sixth possible implementation manner of the twentieth aspect:
and determining a fourth control key group according to the second display area and the priority order of the control keys in the first control key group or the position relation of the control keys in the first control key group.
The priority order of the control keys in the first control key group may be defined by the application program or by the system. Determining, according to this priority order, which control keys are removed from the application control bar when its display area is reduced allows the control keys with higher priorities to be retained in the application control bar, improving the user's operation experience. When the user reduces the display area of the application control bar by hiding the display of a specific region of the bar, for example through a drag operation that hides part of the displayed content, which control keys are hidden can be determined according to the positions of the control keys, so as to achieve the user's operation purpose.
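The shrink case mirrors the enlargement case: the lowest-priority keys of the current group are removed first (and can then be shown back on the first display screen). A sketch under the same assumptions as before (uniform per-key area, `(name, priority)` pairs with lower numbers meaning higher priority):

```python
import math

def keys_to_remove(current_keys, area_deficit: float, key_area: float = 10.0):
    """Pick the control keys to drop when the control bar's area shrinks.

    current_keys: list of (key_name, priority) pairs; removal starts from
    the lowest-priority (largest-number) keys until enough assumed per-key
    area has been reclaimed.
    """
    n_remove = math.ceil(area_deficit / key_area)
    by_lowest_priority = sorted(current_keys, key=lambda kp: kp[1], reverse=True)
    return [name for name, _ in by_lowest_priority[:n_remove]]
```

The position-based variant described above would instead sort by each key's on-screen coordinate relative to the hidden region, but the selection loop is the same shape.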
With reference to the twentieth aspect, or any one of the first two possible implementations of the twentieth aspect, or the fifth possible implementation of the twentieth aspect, in a seventh possible implementation of the twentieth aspect:
The second control key group is a control key group corresponding to the second display area of the application control bar.
Wherein the control key group corresponding to the second display area of the application control bar may be provided by the application program. For example, the application program may define a corresponding control key group for several fixed-size display areas of the application control bar, and when the display area of the application control bar corresponds to a certain fixed-size display area, display the control key group corresponding to the display area in the application control bar; alternatively, the application program may define corresponding control key groups for several size variation ranges of the display area of the application control bar, and display the control key group corresponding to the size range in the application control bar when the actual display area of the application control bar falls within a certain size range.
Determining the control keys displayed in the application control bar in this way greatly reduces the amount of computation performed by the operating system, shortens the response time of the operating system when the first operation is executed, and improves operation efficiency.
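The range-based lookup described above might look like the following sketch, where the area ranges and control key groups are invented purely for illustration:

```python
# Hypothetical registration table: an application maps ranges of the control
# bar's display area (in arbitrary units) to control-key groups.
GROUPS_BY_AREA = [
    ((0, 100),   ["save"]),
    ((100, 300), ["save", "copy", "paste"]),
    ((300, 600), ["save", "copy", "paste", "font", "color"]),
]

def group_for_area(area):
    """Return the control-key group whose registered range contains `area`."""
    for (lo, hi), group in GROUPS_BY_AREA:
        if lo <= area < hi:
            return group
    return []  # area outside every registered range

print(group_for_area(150))  # -> ['save', 'copy', 'paste']
```

Because the groups are precomputed by the application, the system performs only a table lookup at runtime, which matches the reduced-computation benefit claimed above.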
With reference to the twentieth aspect, or any one of the first seven possible implementations of the twentieth aspect, in an eighth possible implementation of the twentieth aspect:
The first operation is a gesture operation;
in response to the received first operation, changing the first display area of the application control bar into a second display area, specifically:
responding to the gesture operation, selecting a first type of virtual keyboard corresponding to the gesture operation from a plurality of types of virtual keyboards, wherein virtual keys included in different types of virtual keyboards in the plurality of types of virtual keyboards are not identical;
displaying a first type of virtual keyboard through a second display screen;
and determining a second display area according to the display area of the first type of virtual keyboard.
The application control bar and input modes such as the virtual keyboard can be displayed on the second display screen at the same time, and different virtual keyboards can be opened through different gestures. When a virtual keyboard is opened through a gesture, the display area and/or display position of the application control bar can be determined according to the display region (area, position, and the like) of that virtual keyboard, so that the application control bar flexibly matches the virtual keyboard, the display on the second display screen is more reasonable, tidy, and attractive, and the user experience is improved.
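As a hedged sketch of the gesture dispatch described above (the gesture names, keyboard types, and area values are all hypothetical, not taken from the patent):

```python
# Hypothetical table mapping gestures to virtual keyboard types and the
# display area each keyboard occupies on the second display screen.
KEYBOARDS = {
    "two_finger_tap":   {"type": "numeric", "area": 200},
    "three_finger_tap": {"type": "full",    "area": 500},
}
SCREEN_AREA = 800  # invented total usable area of the second display screen

def handle_gesture(gesture):
    """Pick the keyboard for `gesture` and derive the control bar's new area."""
    kb = KEYBOARDS.get(gesture)
    if kb is None:
        return None  # unrecognized gesture: nothing changes
    # The control bar takes whatever area the chosen keyboard leaves free.
    return {"keyboard": kb["type"], "bar_area": SCREEN_AREA - kb["area"]}

print(handle_gesture("two_finger_tap"))  # -> {'keyboard': 'numeric', 'bar_area': 600}
```

The point of the sketch is only the dependency direction: the keyboard is chosen first, and the bar's second display area is derived from it.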
With reference to the twentieth aspect, or any one of the first eight possible implementations of the twentieth aspect, in a ninth possible implementation of the twentieth aspect:
Displaying the interface of the target application on a second display screen in response to the received second operation so as to acquire the handwriting content of the interface of the target application through the second display screen, wherein the second operation indicates to start a handwriting input mode of the target application;
and after the interface of the target application is displayed on the second display screen, the second display screen does not comprise an application control bar.
When it is detected that the user starts the handwriting input mode through the second operation, the interface of the target application can be displayed on the second display screen, so that the handwritten content for the interface of the target application is acquired through the second display screen. Since the interface of the target application is then fully mirrored on the second display screen, the application control bar on the second display screen can be hidden, saving display space, so that the content on the second display screen is simpler and cleaner and the application control bar does not interfere with handwriting input.
With reference to the twentieth aspect, or any one of the first nine possible implementations of the twentieth aspect, in a tenth possible implementation of the twentieth aspect:
the first operation is used for switching the input mode into a handwriting input mode;
in response to the received first operation, displaying a handwriting input area and/or a control key group associated with a handwriting input mode in the application control bar;
When the user switches the input mode to the handwriting input mode, a handwriting input area can be displayed in the application control bar, so that the user can perform handwriting input through the application control bar more conveniently, improving operation efficiency. A control key group associated with the handwriting input mode, for example: pen, eraser, color, font, and the like, may also be displayed in the application control bar, so that the user can operate the handwriting input mode through the application control bar, providing more convenient operation. The handwriting input area and the control key group associated with the handwriting input mode may also be displayed in the application control bar at the same time, achieving both of the above beneficial effects.
With reference to the twentieth aspect, or any one of the first ten possible implementations of the twentieth aspect, in an eleventh possible implementation of the twentieth aspect, the method further includes:
acquiring a contact operation acting on an application control bar;
responding to the contact operation, and acquiring a first control key corresponding to the contact operation, wherein the first control key is positioned in an application control bar;
obtaining at least one first vibration feedback element matched with the first control key from the plurality of vibration feedback elements;
and instructing the at least one first vibration feedback element to emit a vibration wave so as to perform a first feedback operation, wherein the first feedback operation is used to indicate that the first control key is a control key of the application control bar.
The control area displayed on the second display screen may include a system control bar and an application control bar. When the user touches a control key in the application control bar, a feedback operation is provided, so that the user can locate the application control bar and its control keys within the control area without moving their gaze to the second display screen; this helps the user quickly locate the desired control key even as the display area and control keys of the application control bar change, greatly improving operation efficiency. Conversely, according to the user's habits, a feedback operation can instead be set for the control keys in the system control bar, so that the user can locate the system control bar and its control keys without looking at the second display screen. In addition, a feedback operation can be set only on the control keys in the application control bar that are functionally more important or more frequently used, again helping the user quickly locate the desired control key and greatly improving operation efficiency.
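The touch-to-vibration mapping described above could be modeled as the following sketch; the key rectangles, coordinates, and vibration element ids are invented for illustration and are not specified by the patent:

```python
# Hypothetical layout: each control key covers a rectangle in bar
# coordinates, and sits above one or more vibration feedback elements.
KEY_RECTS = {                       # key -> (x0, y0, x1, y1)
    "undo": (0, 0, 50, 30),
    "redo": (50, 0, 100, 30),
}
ELEMENTS_BY_KEY = {"undo": [3, 4], "redo": [5]}  # vibration element ids

def feedback_for_touch(x, y):
    """Return the vibration element ids to drive for a touch at (x, y)."""
    for key, (x0, y0, x1, y1) in KEY_RECTS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return ELEMENTS_BY_KEY.get(key, [])
    return []  # touch landed outside every control key: no feedback

print(feedback_for_touch(60, 10))  # -> [5]
```

This reflects the two lookups in the method: first from the contact position to the first control key, then from that key to its matching vibration feedback elements.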
With reference to the twentieth aspect, or any one of the first eleven possible implementations of the twentieth aspect, in a twelfth possible implementation of the twentieth aspect, the application control bar may be closed by any one of:
closing the application control bar based on the received instruction for closing the virtual keyboard;
closing the application control bar based on a key instruction of the virtual keyboard;
closing the application control bar based on the gesture instruction; and
and closing the application control bar based on the received instruction for closing the full screen mode of the application.
With reference to the twentieth aspect, or any one of the first twelve possible implementations of the twentieth aspect, in a thirteenth possible implementation of the twentieth aspect, the application control bar may be opened by any one of:
activating an application control bar based on the received instruction for activating the virtual keyboard;
activating an application control bar based on a key instruction of a virtual keyboard;
activating an application control bar based on the gesture instruction; and
based on the received instruction to start the full screen mode of the application, the application control bar is activated.
The above ways of opening and closing the application control bar are merely exemplary. This design allows the user to activate or close the application control bar in a flexible manner regardless of what is displayed on the second display screen, providing more convenient operation and improving the user experience.
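The open and close triggers listed above can be summarized as a simple event table; the event names below are illustrative, not defined by the patent:

```python
# Hypothetical event names for the four open triggers and four close triggers.
OPEN_EVENTS = {"virtual_keyboard_start", "keyboard_key_open",
               "gesture_open", "app_fullscreen_start"}
CLOSE_EVENTS = {"virtual_keyboard_close", "keyboard_key_close",
                "gesture_close", "app_fullscreen_close"}

def control_bar_state(current_open, event):
    """Return whether the application control bar is open after `event`."""
    if event in OPEN_EVENTS:
        return True
    if event in CLOSE_EVENTS:
        return False
    return current_open  # unrelated events leave the bar unchanged

print(control_bar_state(False, "gesture_open"))  # -> True
```

The sketch shows only the dispatch structure: any of the listed instructions can flip the bar's state, and everything else leaves it alone.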
In a twenty-first aspect of an embodiment of the present invention, there is provided a screen display method, where the screen display method is applied to an electronic device including a first display screen and a second display screen, and the screen display method includes:
displaying an interface of a target application on the first display screen, wherein the interface of the target application comprises a fifth control key group;
displaying an application control bar on a second display screen;
in response to a third operation on the interface of the target application, displaying the fifth control key group in the application control bar, and hiding the fifth control key group in the interface of the target application.
According to the user's operation on the interface of the target application, the control keys corresponding to that operation are displayed in the application control bar, so that shortcut control keys for the user's current operation are always available there, providing more convenient operation and improving the user's operation efficiency. In addition, after the control keys are displayed in the application control bar, their display in the first display screen is removed, saving display area on the first display screen and making its content more concise. Furthermore, after those control keys are removed from the first display screen, the size of the original display content on the first display screen can be increased, or new display content can be added alongside it, providing more convenient operation and improving the user experience.
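A minimal sketch of moving a control key group from the application interface to the application control bar, assuming list-based structures that the patent does not specify:

```python
def move_group_to_bar(interface_keys, bar_keys, group):
    """Show `group` in the control bar and hide the same keys in the interface.

    All three arguments are hypothetical flat lists of key names; the patent
    does not prescribe how keys or groups are represented.
    """
    new_interface = [k for k in interface_keys if k not in group]   # hide
    new_bar = bar_keys + [k for k in group if k not in bar_keys]    # show
    return new_interface, new_bar

iface, bar = move_group_to_bar(["bold", "italic", "save"], ["save"],
                               ["bold", "italic"])
print(iface, bar)  # -> ['save'] ['save', 'bold', 'italic']
```

The freed interface slots are exactly what the text says can be reused to enlarge or extend the first display screen's content.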
With reference to the twenty-first aspect, in a first possible implementation manner of the twenty-first aspect, the screen display method further includes:
and changing the display area of the application control bar in response to the third operation on the interface of the target application.
After the fifth control key group is displayed in the application control bar, the number of control keys in the application control bar may change; the display area of the application control bar can then be adaptively adjusted and the layout of its control keys optimized, so that the display of the control keys better conforms to the user's habits and the user experience is improved.
With reference to the twenty-first aspect or the first possible implementation manner of the twenty-first aspect, in a second possible implementation manner of the twenty-first aspect, before the fifth control key group is displayed in the application control bar, the screen display method further includes:
the application control bar includes a sixth control key group, and the sixth control key group is a set of initial control keys for controlling the target application.
When the user opens a target application, an initial control key group for controlling that application can be displayed in the application control bar. When the user then performs an operation on the target application, a fifth control key group corresponding to the current operation can be added to the initial control key group, or can partially or fully replace it, so that the control keys most relevant to the user's current operation are always displayed in the application control bar, providing more convenient operation and improving the user's operation efficiency. In addition, after the initial control key group is displayed in the application control bar, the corresponding control keys can be removed from the interface of the target application, saving display area on the first display screen and making its content more concise.
With reference to the twenty-first aspect or the first possible implementation manner of the twenty-first aspect, in a third possible implementation manner of the twenty-first aspect, after the fifth control key group is displayed in the application control bar, the screen display method further includes:
displaying a seventh control key group in an application control bar in response to a fourth operation on the interface of the target application, and hiding the seventh control key group in the interface of the target application.
Specifically, after the user performs the third operation on the interface of the target application, the user may continue with a fourth operation on the interface of the same target application. In that case, a seventh control key group corresponding to the fourth operation may be added to the fifth control key group already in the application control bar, or may replace part or all of it. Alternatively, the fourth operation may display a new target application on the first display screen, either by starting that application or by bringing to the foreground a target application that was already running in the background. When the interface of the new target application replaces the interface of the original one on the first display screen, the fifth control key group may be replaced with the seventh control key group. When the interfaces of the two target applications are displayed simultaneously on the first display screen (for example, in a split-screen display), the fifth control key group may be retained alongside the seventh control key group, that is, both groups are displayed together in the application control bar.
According to the change of the user operation, the control keys in the application control bar are flexibly changed, the control keys with the strongest relevance with the user operation can be always displayed in the application control bar, more convenient operation is provided for the user, and the operation efficiency is improved.
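The replace-versus-merge behavior described above can be sketched as follows; the function name, key names, and the `dual_screen` flag are hypothetical:

```python
def update_bar(fifth_group, seventh_group, dual_screen):
    """Decide the control bar's contents after a fourth operation.

    dual_screen: True if both target application interfaces remain visible
    on the first display screen (e.g. a split-screen layout).
    """
    if dual_screen:
        # Both interfaces visible: merge the two groups, newest first.
        return seventh_group + [k for k in fifth_group if k not in seventh_group]
    return seventh_group  # original interface hidden: replace outright

print(update_bar(["copy"], ["play", "pause"], dual_screen=True))
# -> ['play', 'pause', 'copy']
```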
With reference to the twenty-first aspect, or any one of the first three possible implementations of the twenty-first aspect, in a fourth possible implementation of the twenty-first aspect:
the third operation is to select a target object in the interface of the target application;
the fifth control key group is a control key for operating the target object.
The third operation may select a target object in the interface of the target application; for example, the shading of the target object may deepen to indicate that it is selected, or the target object may be selected by moving the cursor over it. Specifically, when the target object selected by the third operation is a picture or text, the fifth control key group may include control keys related to text or picture editing, making it convenient for the user to edit the text or picture through the control keys in the application control bar. When the selected target object is audio or video, the fifth control key group may include control keys related to audio/video control, making it convenient for the user to control playback through the control keys in the application control bar.
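A hedged sketch of mapping the selected target object's type to the fifth control key group follows; the type names and key lists are invented for illustration:

```python
# Hypothetical mapping from the selected object's type to the control keys
# that should appear in the application control bar as the fifth group.
GROUPS_BY_OBJECT = {
    "text":    ["font", "size", "color", "bold"],
    "picture": ["crop", "rotate", "filter"],
    "video":   ["play", "pause", "volume", "seek"],
    "audio":   ["play", "pause", "volume", "seek"],
}

def fifth_group_for(selected_type):
    """Return the control keys for the selected object, or none if unknown."""
    return GROUPS_BY_OBJECT.get(selected_type, [])

print(fifth_group_for("picture"))  # -> ['crop', 'rotate', 'filter']
```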
With reference to the twenty-first aspect, or any one of the first three possible implementations of the twenty-first aspect, in a fifth possible implementation of the twenty-first aspect:
the third operation is to move the cursor to a target position in the interface of the target application;
the fifth control key group is a control key in the menu bar displayed when the right mouse button is clicked at the target position.
When the user moves the cursor to a target position in the interface of the target application, the control keys of the menu bar that would be displayed on a right mouse click at that position are shown in the application control bar. Because the right-click menu is designed around the user's intent, its control keys are highly likely to meet the user's current operation needs, and reusing them directly avoids secondary development by the developer and shortens the development cycle.
With reference to the twenty-first aspect, or any one of the first three possible implementations of the twenty-first aspect, in a sixth possible implementation of the twenty-first aspect:
the third operation is to browse the content in the target area of the interface of the target application through a sliding gesture or rolling a mouse wheel;
the fifth control key group is a thumbnail of the target area and a positioning frame for quickly positioning the target object in the thumbnail.
Through this arrangement, the user can quickly locate the desired content through the thumbnail of the target area in the application control bar and the positioning frame for quickly locating the target object in the thumbnail, improving the user's operation efficiency.
In a twenty-second aspect of an embodiment of the present invention, there is provided an electronic device including:
a first display screen, a second display screen, a memory, one or more processors, and one or more programs; wherein the one or more programs are stored in the memory; wherein the one or more processors, when executing the one or more programs, cause the electronic device to perform the steps of:
displaying an interface of a target application on a first display screen;
displaying an application control bar on a second display screen;
changing a first display area of the application control bar to a second display area in response to the received first operation;
when the display area of the application control bar is the first display area, the application control bar comprises a first control key group;
when the display area of the application control bar is a second display area, the application control bar comprises a second control key group;
the first control key group and the second control key group are both control key sets used for controlling the target application, and the control keys included in the first control key group and the second control key group are not identical.
With reference to the twenty-second aspect, in a first possible implementation manner of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
before the first display area of the application control bar is changed into the second display area, a virtual keyboard is displayed on the second display screen;
after the first display area of the application control bar is changed into the second display area, the display layout of the virtual keyboard is changed.
With reference to the twenty-second aspect or the first possible implementation of the twenty-second aspect, in a second possible implementation of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
before the first display area of the application control bar is changed into the second display area, the interface of the target application comprises a third control key group;
the second display area is larger than the first display area;
the second control key group comprises a first control key group and a third control key group;
and after the first display area of the application control bar is changed into the second display area, the interface of the target application comprises a third control key group.
With reference to the second possible implementation of the twenty-second aspect, in a third possible implementation of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
and determining the third control key group according to the second display area and the priority order of the control keys in the to-be-displayed control key set of the target application.
With reference to the twenty-second aspect or the first possible implementation of the twenty-second aspect, in a fourth possible implementation of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
before the first display area of the application control bar is changed into the second display area, the interface of the target application does not comprise a fourth control key group, and the fourth control key group is a control key set for controlling the target application;
the second display area is smaller than the first display area;
the second control key group is formed by reducing a fourth control key group in the first control key group;
and after the first display area of the application control bar is changed into the second display area, the interface of the target application comprises part or all of the fourth control key group.
With reference to the fourth possible implementation of the twenty-second aspect, in a fifth possible implementation of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
and determining the fourth control key group according to the second display area and either the priority order or the positional relationship of the control keys in the first control key group.
With reference to the twenty-second aspect, or any one of the first five possible implementations of the twenty-second aspect, in a sixth possible implementation of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the steps of:
the first operation is a gesture operation;
responding to the received first operation, changing the first display area of the application control bar into a second display area, specifically:
responding to the gesture operation, selecting a first type of virtual keyboard corresponding to the gesture operation from a plurality of types of virtual keyboards, wherein virtual keys included in different types of virtual keyboards in the plurality of types of virtual keyboards are not identical;
displaying a first type of virtual keyboard through a second display screen;
the second display area is determined according to the display area of the first type of virtual keyboard.
With reference to the twenty-second aspect, or any one of the first six possible implementations of the twenty-second aspect, in a seventh possible implementation of the twenty-second aspect, the one or more processors, when executing the one or more programs, cause the electronic device to perform the steps of:
Displaying an interface of a target application on a second display screen in response to the received second operation so as to acquire handwritten content of the interface of the target application through the second display screen, wherein the second operation indicates that a handwriting input mode of the target application is started;
and after the interface of the target application is displayed on the second display screen, the second display screen does not comprise an application control bar.
The electronic device provided in the twenty-second aspect of the embodiment of the present invention is capable of implementing various possible implementation manners described in the twentieth aspect of the embodiment of the present invention, and achieves all the advantageous effects.
A twenty-third aspect of an embodiment of the present invention provides an electronic device, including:
a first display, a second display, a memory, one or more processors, and one or more programs; wherein the one or more programs are stored in the memory; wherein the one or more processors, when executing the one or more programs, cause the electronic device to perform the steps of:
displaying an interface of a target application on the first display screen, wherein the interface of the target application comprises a fifth control key group;
displaying an application control bar on a second display screen;
in response to a third operation on the interface of the target application, displaying the fifth control key group in the application control bar, and hiding the fifth control key group in the interface of the target application.
With reference to the twenty-third aspect, in a first possible implementation of the twenty-third aspect, the one or more processors, when executing the one or more programs, cause the electronic device, before displaying the fifth control key group in the application control bar, to perform the following steps:
the application control column comprises a sixth control key group, and the sixth control key group is a set of initial control keys for controlling the target application;
the sixth control group is not included in the interface of the target application.
With reference to the twenty-third aspect or the first possible implementation of the twenty-third aspect, in a second possible implementation of the twenty-third aspect, the one or more processors, when executing the one or more programs, cause the electronic device, after displaying the fifth control key group in the application control bar, to perform the following steps:
displaying a seventh control key group in the application control bar in response to a fourth operation on the interface of the target application;
the seventh control key group is not included in the interface of the target application.
With reference to the twenty-third aspect or any one of the first two possible implementation manners of the twenty-third aspect, in a third possible implementation manner of the twenty-third aspect:
the third operation is to select a target object in the interface of the target application;
The fifth control key group is a control key for operating the target object.
With reference to the twenty-third aspect or any one of the first two possible implementation manners of the twenty-third aspect, in a fourth possible implementation manner of the twenty-third aspect:
the third operation is to move the cursor to a target position in the interface of the target application;
the fifth control key group is a control key in the menu bar displayed when the right mouse button is clicked at the target position.
With reference to the twenty-third aspect or any one of the first two possible implementation manners of the twenty-third aspect, in a fifth possible implementation manner of the twenty-third aspect:
the third operation is to browse the content in the target area of the interface of the target application through a sliding gesture or rolling a mouse wheel;
the fifth control key group is a thumbnail of the target area and a positioning frame for quickly positioning the target object in the thumbnail.
The electronic device provided by the twenty-third aspect of the embodiment of the present invention is capable of implementing various possible implementation manners described in the twenty-first aspect of the embodiment of the present invention, and achieves all beneficial effects.
A twenty-fourth aspect of the embodiments of the present invention provides a computer storage medium storing a program which, when run on a computer, causes the computer to implement the screen display method according to the twentieth aspect or any one of its possible implementations, or the screen display method according to the twenty-first aspect or any one of its possible implementations, achieving all of the above beneficial effects.
A twenty-fifth aspect of the embodiments of the present invention provides a computer program product which, when run on a computer, causes the computer to implement the screen display method according to the twentieth aspect or any one of its possible implementations, or the screen display method according to the twenty-first aspect or any one of its possible implementations, achieving all of the above beneficial effects.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is another schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a touch screen according to an embodiment of the present application;
fig. 4 shows schematic diagrams of two arrangements of a plurality of vibration feedback units in an electronic device according to an embodiment of the present application;
fig. 5 is a schematic cross-sectional view of a touch screen according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating an arrangement layout of a plurality of vibration feedback units included in a vibration feedback module according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a touch screen according to an embodiment of the present application;
fig. 8 is another schematic structural diagram of a touch screen according to an embodiment of the present application;
fig. 9 is a schematic flowchart of a feedback method according to an embodiment of the present application;
fig. 10 shows schematic diagrams of two virtual keyboards in a feedback method provided in an embodiment of the present application;
fig. 11 shows two schematic diagrams of a first location area and a second location area in a feedback method according to an embodiment of the present application;
fig. 12 is another schematic diagram of a first location area and a second location area in a feedback method provided in an embodiment of the present application;
fig. 13 is a further schematic diagram of the first location area and the second location area in the feedback method according to the embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 16 is a schematic diagram of an electronic device provided in an embodiment of the present application;
fig. 17 is a flowchart illustrating a processing method of a virtual keyboard according to an embodiment of the present application;
fig. 18 is a schematic diagram illustrating a first gesture parameter in the processing method of a virtual keyboard according to an embodiment of the present application;
fig. 19 is a schematic diagram illustrating relative angle information in a processing method of a virtual keyboard according to an embodiment of the present application;
fig. 20 is two schematic diagrams of a first area in the processing method of a virtual keyboard according to the embodiment of the present application;
FIG. 21 is a diagram illustrating a first gesture operation in a processing method for a virtual keyboard according to an embodiment of the present disclosure;
fig. 22 is another schematic diagram illustrating a first gesture operation in the processing method of a virtual keyboard according to the embodiment of the present application;
fig. 23 is a schematic diagram of a first type of virtual keyboard in a processing method of a virtual keyboard according to an embodiment of the present application;
fig. 24 is another schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 25 is another schematic diagram of a first type of virtual keyboard in a processing method of virtual keyboards according to an embodiment of the present application;
fig. 26 is a further schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 27 is another schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 28 is a further schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 29 is another schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 30 is a schematic diagram of a first setting interface in the processing method of the virtual keyboard according to the embodiment of the present application;
fig. 31 is another schematic diagram of a first setting interface in the processing method of the virtual keyboard according to the embodiment of the present application;
fig. 32 is a schematic diagram illustrating a custom gesture operation in the processing method for a virtual keyboard according to the embodiment of the present application;
fig. 33 is a further schematic diagram of a first type of virtual keyboard in the processing method of virtual keyboards according to the embodiment of the present application;
fig. 34 is another schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 35 is a further schematic diagram of a first type of virtual keyboard in the processing method for virtual keyboards according to the embodiment of the present application;
fig. 36 is a schematic diagram of a second virtual key in the processing method of the virtual keyboard according to the embodiment of the present application;
fig. 37 is another schematic diagram of a second virtual key in the processing method of the virtual keyboard according to the embodiment of the present application;
fig. 38 is a schematic flowchart of another processing method for a virtual keyboard according to an embodiment of the present application;
fig. 39 is a schematic structural diagram of an electronic device provided in this embodiment of the application;
fig. 40 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 41 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 42 is a flowchart illustrating a processing method for an application interface according to an embodiment of the present application;
fig. 43 is an interface schematic diagram of a display interface of a second display screen in the processing method of an application interface according to the embodiment of the present application;
fig. 44 is a schematic flowchart of a processing method of an application interface according to an embodiment of the present application;
fig. 45 is a schematic flowchart of another processing method of an application interface according to an embodiment of the present application;
fig. 46 is a schematic diagram illustrating various holding gestures in a processing method for an application interface according to an embodiment of the present application;
fig. 47 is an interface schematic diagram of a first application interface in the processing method of an application interface according to the embodiment of the present application;
fig. 48 shows schematic diagrams of two interfaces of a first application interface in the processing method of an application interface according to the embodiment of the present application;
fig. 49 is a schematic diagram illustrating a first contact operation in the processing method of the application interface according to the embodiment of the present application;
fig. 50 is a schematic diagram of a display interface of a first application interface in the processing method of application interfaces according to the embodiment of the present application;
fig. 51 is a schematic flowchart of a processing method of an application interface according to an embodiment of the present application;
fig. 52 is a schematic flowchart of a processing method of an application interface according to an embodiment of the present application;
fig. 53 is a schematic flowchart of a processing method of an application interface according to an embodiment of the present application;
fig. 54 is a schematic diagram of a display interface of a first application interface in the processing method of an application interface according to the embodiment of the present application;
fig. 55 is a schematic diagram of a display interface of a first application interface in a processing method of an application interface according to the embodiment of the present application;
fig. 56 is a schematic diagram of a display interface of a first application interface in the processing method of application interfaces according to the embodiment of the present application;
fig. 57 is a schematic diagram of a display interface of a first application interface in a processing method of an application interface according to the embodiment of the present application;
fig. 58 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 59 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 60 is a diagram of a dual-screen electronic device according to an embodiment of the present invention;
FIG. 61 is an application scenario provided by an embodiment of the present invention;
FIG. 62 is a diagram of a screen display method according to an embodiment of the present invention;
FIG. 63A is a display manner of a control area according to an embodiment of the present invention;
FIG. 63B is a diagram illustrating an alternative display of a control area, according to an embodiment of the present invention;
FIG. 63C is a diagram illustrating an alternative display of a control area, according to an embodiment of the present invention;
FIG. 64A is a flowchart of a method for activating a control area according to an embodiment of the present invention;
FIG. 64B is a block diagram of another method for activating a control area according to an embodiment of the invention;
FIG. 64C is a block diagram of another method for activating a control area according to an embodiment of the present invention;
FIG. 64D is an alternative method for activating a control area according to embodiments of the present invention;
FIG. 65A illustrates a manner of associating user actions with a set of control keys according to an embodiment of the present invention;
FIG. 65B illustrates another exemplary correspondence between user actions and control key sets, according to an embodiment of the present invention;
FIG. 65C illustrates another exemplary correspondence between user actions and control key sets, in accordance with an embodiment of the present invention;
FIG. 65D is a diagram illustrating another exemplary correspondence between user actions and control key sets, in accordance with an embodiment of the present invention;
FIG. 65E illustrates another exemplary correspondence between user actions and control key sets, in accordance with an embodiment of the present invention;
FIG. 65F is a diagram illustrating another exemplary correspondence between user actions and control key sets, according to an embodiment of the present invention;
FIG. 66A is a display manner of a control area according to an embodiment of the present invention;
FIG. 66B is a diagram illustrating an alternative display of a control area, according to an embodiment of the present invention;
FIG. 67 is a layout manner of display contents of a control area according to an embodiment of the present invention;
FIG. 68 illustrates a priority setting scheme provided by an embodiment of the present invention;
FIG. 69 is another priority setting scheme provided by an embodiment of the present invention;
FIG. 70A is a block diagram illustrating a method for closing a control area according to an embodiment of the present invention;
FIG. 70B is a flowchart of an alternative method for closing a control area, according to an embodiment of the present invention;
FIG. 70C is a flowchart of an alternative method for closing a control area, according to an embodiment of the present invention;
FIG. 70D is a flowchart of an alternative method for closing a control area, according to an embodiment of the present invention;
FIG. 71 is another method for displaying a screen according to an embodiment of the present invention;
FIG. 72 is another screen display method provided by embodiments of the present invention;
FIG. 73A is a diagram illustrating a method for changing a display area of an application control bar according to an embodiment of the present invention;
FIG. 73B is a flowchart of a method for increasing the display area of an application control bar according to an embodiment of the present invention;
FIG. 73C is a flowchart of a method for increasing the display area of an application control bar according to an embodiment of the present invention;
FIG. 74A is a flowchart of an alternative method for changing the display area of an application control bar, according to an embodiment of the present invention;
FIG. 74B is a flowchart of an alternative method for increasing the display area of an application control bar, according to an embodiment of the present invention;
FIG. 74C is a flowchart of an alternative method for increasing the display area of an application control bar, according to an embodiment of the present invention;
FIG. 75A is a block diagram of another method for changing the display area of an application control bar according to an embodiment of the present invention;
FIG. 75B is a flowchart of a method for increasing the display area of an application control bar, according to an embodiment of the present invention;
FIG. 75C is a flowchart of a method for increasing the display area of an application control bar, according to an embodiment of the present invention;
FIG. 76A is a block diagram illustrating a method for changing the display area and control keys of an application control bar according to a user operation, according to an embodiment of the present invention;
FIG. 76B is a block diagram illustrating another exemplary method for changing the display area and control keys of an application control bar according to a user operation;
FIG. 77A is a diagram illustrating a gesture control method according to an embodiment of the present invention;
FIG. 77B illustrates another gesture control method provided by embodiments of the present invention;
FIG. 77C is a diagram of another gesture control method provided by embodiments of the present invention;
FIG. 77D is a block diagram of another gesture control method provided by embodiments of the present invention;
FIG. 78A is a diagram of another gesture control method according to an embodiment of the present invention;
FIG. 78B is a diagram of another gesture control method according to an embodiment of the present invention;
FIG. 79 is a block diagram of another gesture control method provided in embodiments of the present invention;
FIG. 80A is a diagram illustrating an exemplary implementation of a method for displaying a screen;
FIG. 80B is a flowchart of another exemplary implementation of a method for displaying a screen according to an embodiment of the present invention;
FIG. 80C is a flowchart of another exemplary implementation of a method for displaying a screen according to an embodiment of the present invention;
FIG. 80D is a flowchart of another exemplary implementation of a method for displaying a screen according to an embodiment of the present invention;
FIG. 80E is a diagram illustrating an exemplary implementation of another screen display method according to an embodiment of the invention;
FIG. 80F is a diagram illustrating an exemplary implementation of another screen display method according to an embodiment of the invention;
FIG. 80G is a diagram illustrating another exemplary implementation of a screen display method according to an embodiment of the invention;
FIG. 81 is an electronic apparatus according to an embodiment of the present invention;
FIG. 82 is another electronic device provided in an embodiment of the present invention.
Detailed Description
The terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and are merely descriptive of the various embodiments of the application and how objects of the same nature can be distinguished. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Embodiments of the present application are described below with reference to the accompanying drawings. As those skilled in the art will appreciate, with the development of technology and the emergence of new scenarios, the technical solutions provided in the embodiments of the present application are also applicable to similar technical problems.
The first embodiment is as follows:
The embodiment of the present application can be applied to various application scenarios in which input is performed through a virtual keyboard. For example, when a user uses a text entry application, creates a presentation (PPT), browses a web page, plays video or music, or uses a navigation application, the user can input information through a virtual keyboard; in all of these scenarios, touch typing on a touch screen is a difficult task.
To solve the above problem, an embodiment of the present application provides a feedback method applied to an electronic device configured with a touch screen. The electronic device obtains first position information of a first contact point on the touch screen, obtains a first virtual key corresponding to the first contact point according to the first position information, and executes a first feedback operation when the first virtual key is an anchor point key, so as to prompt the user that the first virtual key is an anchor point key. This helps the user build muscle memory for the anchor point keys, and touch typing can then be trained by means of that muscle memory, thereby reducing the difficulty of touch typing on a touch screen.
The feedback method provided in the embodiment of the present application may be used in the electronic device shown in fig. 1. Please refer to fig. 1 and fig. 2, which are schematic structural diagrams of the electronic device provided in the embodiment of the present application. Referring to fig. 1, an electronic device 1 includes a processor 10 and a touch screen 20, the touch screen 20 includes a touch sensing module 100 and a vibration feedback module 200, and the vibration feedback module 200 includes a plurality of vibration feedback elements.
Specifically, the processor 10 obtains first position information of a first contact point on the touch screen through the contact sensing module 100. When the processor 10 determines that a first virtual key corresponding to the first contact point is an anchor point key, it obtains the vibration feedback element matched with the first virtual key from the plurality of vibration feedback elements included in the vibration feedback module 200, and emits a vibration wave through that element, so that vibration feedback is produced through the touch screen at the first contact point (that is, within a preset range around the first contact point on the touch screen) to prompt the user that the touched first virtual key is an anchor point key. It should be noted that this is not full-screen vibration feedback but vibration feedback directed at the first contact point, and the strength of the vibration feedback is greatest at the first contact point.
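The processing flow described above (contact position → virtual-key lookup → feedback only for anchor point keys) can be sketched as follows. All key names, key bounds, and function names here are illustrative assumptions for the sketch, not taken from the patent:

```python
# Hypothetical sketch of the anchor-key feedback flow: map a contact point to
# a virtual key, and trigger localized vibration only for anchor point keys.

ANCHOR_KEYS = {"F", "J"}  # typical home-row anchor keys on a QWERTY layout

# key name -> (x_min, y_min, x_max, y_max) in touch-screen coordinates (assumed)
KEY_BOUNDS = {
    "F": (300, 400, 360, 460),
    "G": (360, 400, 420, 460),
    "J": (480, 400, 540, 460),
}

def key_at(x, y):
    """Return the virtual key whose bounding box contains the contact point."""
    for key, (x0, y0, x1, y1) in KEY_BOUNDS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None

def handle_contact(x, y, vibrate):
    """On a contact event, vibrate at the contact point only if it hits an anchor key."""
    key = key_at(x, y)
    if key in ANCHOR_KEYS:
        vibrate(x, y)  # drive only the feedback element(s) matched to this key
        return True
    return False
```

For example, a contact at (310, 410) falls on "F" and triggers feedback, while a contact at (370, 410) falls on "G" (not an anchor key) and does not.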
In some application scenarios, referring to fig. 2, the electronic device 1 includes a display 30 and a touch screen 20. A virtual keyboard is displayed on the touch screen 20, and anchor point keys exist in the virtual keyboard; that is, the touch screen 20 needs to provide both the function of displaying the virtual keyboard and the function of vibration feedback, so a display module further needs to be disposed in the touch screen 20. In other application scenarios, the electronic device 1 may also be a virtual reality (VR) device, an augmented reality (AR) device, or a mixed reality (MR) device; in that case the touch screen 20 need not display a virtual keyboard and only needs to perform vibration feedback, so no display module needs to be arranged in the touch screen 20. It should be understood that the following embodiments are described only with the example in which the touch screen 20 is provided with a display module.
Further, please refer to fig. 3, which is a schematic structural diagram of a touch screen according to an embodiment of the present application. The touch screen 20 may further include a cover plate 300 and a display module 400. Fig. 3 takes the case where the cover plate 300 and the contact sensing module 100 are integrated as an example; the cover plate 300 and the contact sensing module 100 may also be separate from each other.
The cover plate 300 may be made of a glass-type transparent rigid material, a flexible transparent organic material, or other materials. The touch sensing module 100 may be embodied as a touch sensing film, which may specifically be a capacitive touch sensing film, a pressure-type touch sensing film, a temperature-type touch sensing film, or another type of film. Further, the touch sensing film may be made of an indium tin oxide (ITO) wire mesh, a carbon nanotube network with protrusions, or other materials, which are not exhaustively listed here. The embodiment of the present application thus provides various concrete forms of the touch sensing module, improving the implementation flexibility of the solution.
The display module 400 is used to display the virtual keyboard. The display module 400 and the contact sensing module 100 may be integrated into a whole or separate from each other; fig. 3 only illustrates the case where they are separate. The display module 400 may be embodied as a display panel, which may be a liquid crystal display (LCD) panel, an active-matrix organic light-emitting diode (AMOLED) panel, or another type of display panel, which are not exhaustively listed here.
In one implementation, as shown in fig. 3, the vibration feedback module 200 may be embodied as a vibration feedback layer located below the contact sensing module 100; specifically, it may be disposed either above or below the display module 400.
The vibration feedback module 200 is configured with a plurality of vibration feedback units 201, and each dark gray diamond in fig. 3 represents one vibration feedback unit; one vibration feedback unit 201 may include one or more vibration feedback elements. In one case, the vibration feedback layer may be embodied as a vibration feedback film partitioned to define the plurality of vibration feedback elements; in another case, each vibration feedback element may be embodied as a piezoelectric ceramic sheet, a linear motor, or another type of electronic element, which are not exhaustively listed here.
Further, the plurality of vibration feedback units 201 may have various layout arrangements. In one case, referring to fig. 3, the layout of the virtual keyboard is completely consistent with that of a physical keyboard; the physical keyboard may be a keyboard with 61 keys, 87 keys, 104 keys, or 108 keys, an ergonomic keyboard, or another type of physical keyboard, and the design of the specific virtual keyboard can be set flexibly in combination with the actual application scenario. The plurality of vibration feedback units 201 may be arranged in one-to-one correspondence with the plurality of virtual keys, that is, each virtual key corresponds in position to one vibration feedback unit 201.
In another case, please refer to fig. 4, which shows two arrangements of a plurality of vibration feedback units in an electronic device according to an embodiment of the present application. Fig. 4 includes sub-diagram (a) and sub-diagram (b). Referring to sub-diagram (a) of fig. 4, the plurality of vibration feedback units 201 are arranged in a matrix; referring to sub-diagram (b) of fig. 4, the plurality of vibration feedback units 201 are arranged in a checkerboard fashion. Each gray square in the two sub-diagrams represents one vibration feedback unit 201.
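When the feedback units are laid out in a regular matrix rather than one per virtual key, one plausible way to pick which unit to drive for a given contact point is by proximity. The following is a minimal sketch under assumed grid geometry (the patent does not specify pitch or grid size; all numbers here are illustrative):

```python
# Illustrative sketch: select the matrix-arranged vibration feedback unit
# closest to a contact point, clamping to the grid bounds.

def nearest_unit(x, y, pitch=40.0, cols=8, rows=4):
    """Return (row, col) of the feedback unit nearest to contact point (x, y).

    pitch is the assumed center-to-center spacing of units, in the same
    length unit as the touch coordinates.
    """
    col = min(cols - 1, max(0, round(x / pitch)))
    row = min(rows - 1, max(0, round(y / pitch)))
    return row, col
```

For instance, `nearest_unit(85, 10)` selects the unit in row 0, column 2, and a contact far outside the grid is clamped to the nearest edge unit.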
In another implementation, as shown in fig. 5 and fig. 6, the plurality of vibration feedback units 201 (i.e., the plurality of vibration feedback elements) may be located around the display module 400. Fig. 5 is a schematic cross-sectional view of a touch screen provided in an embodiment of the present application, and fig. 6 is a schematic layout view of the arrangement of a plurality of vibration feedback units included in a vibration feedback module provided in the embodiment of the present application. Referring to fig. 5, the touch screen 20 includes the cover plate 300, the touch sensing module 100, the display module 400, the vibration feedback units, a supporting structure for the vibration feedback units, other modules in the touch screen, and a bottom plate. In fig. 5, the cover plate 300 and the touch sensing module 100 are integrated into a whole, and the plurality of vibration feedback units are parallel to the display module 400 and can directly support the cover plate 300. It should be noted that, in other embodiments, the cover plate 300 and the touch sensing module 100 may be independent of each other, and the plurality of vibration feedback units may also be parallel to the touch sensing module 100. As can be seen from fig. 5 and fig. 6, the plurality of vibration feedback units 201 are arranged in a surrounding manner, that is, they surround the periphery of the display module 400; correspondingly, in other embodiments, the plurality of vibration feedback units 201 may also surround the periphery of the contact sensing module 100.
Further, a gap layer may be disposed between the display module 400 and the cover plate 300 to leave room for the movement of the vibration feedback elements when they emit vibration waves, and the display module 400 and the cover plate 300 may be bonded with a light-transmissive adhesive material. The supporting structure of the vibration feedback elements and the bottom plate may be integrated or separate from each other. The embodiment of the present application thus provides various arrangement layouts for the plurality of vibration feedback units, improving the implementation flexibility of the solution.
It should be noted that the above list of arrangement and layout manners of the plurality of vibration feedback units is only for facilitating understanding of the present solution, and the plurality of vibration feedback units may also adopt other layout manners, and the specific implementation manner should be determined by combining with the form of the actual product, which is not exhaustive here.
Optionally, the touch screen 20 may further include a pressure sensing module, and the pressure sensing module is configured to detect a pressure change and a position on the touch screen.
In one implementation, the pressure sensing module and the vibration feedback module 200 may be two independent modules, respectively, and the pressure sensing module may be disposed above the vibration feedback module 200 or below the vibration feedback module 200. The pressure sensing module may be embodied as a pressure sensing membrane, a distributed pressure sensor, or in other forms, which are not exhaustive here.
In another implementation, the pressure sensing module may be integrated with the vibration feedback module 200; in this case the vibration feedback module 200 may also be referred to as a pressure sensing module, and a vibration feedback element may also be referred to as a pressure sensing element. In this implementation, a vibration feedback element may specifically be a piezoelectric ceramic sheet, a piezoelectric polymer (e.g., a piezoelectric film), a piezoelectric composite material, or another type of element, where a piezoelectric composite material is a composite obtained by combining a piezoelectric ceramic sheet with a piezoelectric polymer. Further, in one case, the plurality of vibration feedback elements included in the vibration feedback module 200 (which may also be referred to as a pressure sensing module) may be divided by function: a second vibration feedback element among them is used for acquiring pressure values, and a third vibration feedback element is used for emitting vibration waves to perform vibration feedback, where the second and third vibration feedback elements are different vibration feedback elements. As an example, if one vibration feedback unit 201 includes two vibration feedback elements, one element in the unit may be used for acquiring pressure values and the other for emitting vibration waves to perform vibration feedback.
In another case, the plurality of vibration feedback elements in the vibration feedback module 200 (which may also be referred to as a pressure sensing module) are used for acquiring pressure values during a first time period and for emitting vibration waves during a second time period, and the first time period and the second time period are different. As an example, for example, a plurality of vibration feedback elements in the vibration feedback module 200 (which may also be referred to as a pressure sensing module) may be used to collect pressure values in a default state, and when a first pressure value threshold is reached (i.e. when it is confirmed that a pressing operation is received), to emit vibration waves for vibration feedback.
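The time-division scheme just described (the same piezo elements sense pressure by default and emit a vibration only once the press threshold is crossed) can be sketched as a small state machine. The threshold value, class name, and method names below are illustrative assumptions, not from the patent:

```python
# Sketch of time-division use of a piezo element: pressure sensing in the
# default state, switching to vibration emission when the first pressure
# value threshold is reached (i.e., a pressing operation is confirmed).

PRESS_THRESHOLD = 1.5  # assumed threshold, arbitrary units

class PiezoElement:
    def __init__(self):
        self.mode = "sense"   # default state: collect pressure values
        self.vibrations = 0   # count of vibration waves emitted

    def on_pressure_sample(self, force):
        """Process one pressure sample; return True if feedback was emitted."""
        if self.mode == "sense" and force >= PRESS_THRESHOLD:
            self.mode = "vibrate"
            self.vibrations += 1  # emit one vibration wave as press feedback
            self.mode = "sense"   # return to sensing once the waveform ends
            return True
        return False
```

A light touch below the threshold is only sensed; a firm press triggers exactly one vibration and the element then resumes sensing.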
In the embodiment of the present application, the touch screen is further provided with a pressure sensing module for acquiring pressure values, so that not only the position information of a contact point but also its pressure value can be acquired, allowing the contact operations acquired through the touch screen to be distinguished at a finer granularity. Moreover, integrating the pressure sensing module and the vibration feedback module into a whole reduces the thickness of the touch screen and improves the portability of the electronic device.
Optionally, the tactile characteristics of the cover plate 300 of the touch screen 20 are changeable, the tactile characteristics including any one or more of the following: sliding friction coefficient, stick-slip property, temperature, or other tactile properties; here, the stick-slip property represents the rate of change of the sliding friction coefficient. Further, the tactile characteristics of the entire cover plate 300 may be changed, or only the tactile characteristics at the contact point on the cover plate 300 may be changed.
Specifically, in one implementation, as shown in fig. 7, which is a schematic structural diagram of a touch screen provided in an embodiment of the present application, the touch screen 20 may further include an ultrasonic module 500 configured to emit ultrasonic waves to change the tactile characteristics of the cover plate 300; it may be implemented by an ultrasonic vibration film, a piezoelectric film, a speaker, or other components, which are not exhaustively listed here. The ultrasonic module 500 may be disposed below the cover plate 300, specifically above or below the contact sensing module 100 or the display module 400; fig. 7 illustrates the case of being disposed above the contact sensing module 100. It should be understood that the example in fig. 7 is only for ease of understanding and does not limit the present disclosure.
In another implementation, as shown in fig. 8, which is a schematic structural diagram of a touch screen provided in an embodiment of the present application, the touch screen 20 further includes an electrostatic module 600 configured to generate an electrical signal to change the tactile characteristics of the cover plate. The electrostatic module 600 may be embodied as an electrostatic thin-film layer and may be disposed below the cover plate 300, specifically above or below the contact sensing module 100 or the display module 400; fig. 8 illustrates the case of being disposed above the contact sensing module 100. It should be understood that the example in fig. 8 is only for ease of understanding and does not limit the present disclosure.
In the embodiment of the present application, the touch screen can also change the tactile characteristics of the cover plate by providing an ultrasonic module or an electrostatic module, so that richer tactile feedback can be provided; this richer tactile feedback can in turn be used to train the user toward touch typing on the touch screen, further reducing the difficulty of touch typing on a touch screen.
Based on the above description, the embodiments of the present application provide a feedback method, which can be applied to the electronic devices shown in fig. 1 to 8. Specifically, referring to fig. 9, fig. 9 is a schematic flow chart of a feedback method provided in the embodiment of the present application, where the feedback method provided in the embodiment of the present application may include:
901. The electronic device detects a first contact operation acting on the touch screen, and, in response to the first contact operation, acquires first position information of a first contact point corresponding to the first contact operation.
In the embodiment of the application, the electronic device may detect, in real time, a first contact operation acting on the touch screen. When the electronic device detects, through the touch screen, a first contact operation input by a user, the electronic device may acquire, in response to the first contact operation, the number of first contact points and the first position information of each first contact point collected by the contact sensing module in the touch screen. The at least one first contact point may include only newly added contact points on the touch screen, or may include all contact points on the touch screen. The first position information is expressed in a coordinate system of the touch screen, whose origin may be the center point of the touch screen, the vertex of the upper left corner, the vertex of the lower left corner, the vertex of the upper right corner, the vertex of the lower right corner, an arbitrary position point in the touch screen, or another position point.
More specifically, if the at least one first contact point includes only newly added contact points on the touch screen, then when the virtual keyboard in the electronic device is in an open state, the contact sensing module in the touch screen may continuously detect the contact signals corresponding to the respective contact points on the touch screen, and when a contact signal of a new contact point on the touch screen is detected, collect the position information of the newly added first contact point in time. For example, when a user has just opened a text entry application and called up the virtual keyboard, a plurality of new first contact points on the touch screen can be acquired between the moment the hands are not yet in contact with the touch screen and the moment both hands are placed at the standard finger positions. As another example, when the user lifts a finger, puts it down again, or slides it onto another virtual key during keyboard input, one or more new first contact points appear on the touch screen and can be acquired. It should be understood that the examples are provided here for ease of understanding and are not intended to limit the present disclosure. The virtual keyboard in the embodiment of the present application may be any type of keyboard; for example, the virtual keyboard may be a full keyboard, a numeric keyboard, a functional keyboard, and the like, or the virtual keyboard may also be a generic term for all operation keys on the touch screen.
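The detection of newly added contact points described above can be sketched as a comparison between consecutive touch-screen scans. The function name, data shapes, and distance tolerance below are illustrative assumptions, not taken from the patent.

```python
# Sketch: detect newly added contact points by comparing the contact points
# reported in consecutive touch-screen scans. A point is "new" if no point
# from the previous scan lies within `tolerance` units of it.

def new_contact_points(previous, current, tolerance=5):
    """`previous` and `current` are lists of (x, y) coordinates in the
    touch-screen coordinate system; returns the points in `current` that
    have no nearby counterpart in `previous`."""
    fresh = []
    for cx, cy in current:
        if all((cx - px) ** 2 + (cy - py) ** 2 > tolerance ** 2
               for px, py in previous):
            fresh.append((cx, cy))
    return fresh
```

In the "hands land on the home row" example above, all ten fingertip points are new; during typing, only the finger that lifts and drops again yields a new point.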
In step 901, it is necessary to perform false-touch prevention processing. Specifically, not only a finger of the user may generate a contact point on the touch screen; the palm, the forearm, the back of the hand, a capacitive pen, or the like may also generate contact points. That is, the electronic device may acquire, through the contact sensing module of the touch screen, contact signals of contact points generated by something other than the user's fingers. The processor of the electronic device therefore needs to perform filtering analysis after acquiring the contact signal corresponding to each newly added contact point on the touch screen, and filter out, from the acquired contact signals of the newly added contact points, those not triggered by a finger, so that the first contact points include only the newly added contact points triggered by the user's fingers.
In the embodiment of the application, because a user typically pays attention to where a newly touched key sits when using a physical keyboard, generating feedback only for newly added contact points better simulates the experience of input on a physical keyboard. Feedback aimed only at newly added contact points also makes it easier for the user to build a memory association with those points, further reducing the difficulty of touch-typing training on the touch screen.
Optionally, a proximity sensing module may be further configured in the touch screen, and when the virtual keyboard in the electronic device is in an open state, the electronic device senses a moving track of a finger of a user above the touch screen through the proximity sensing module in the touch screen, and estimates an expected contact point between the finger and the touch screen.
Optionally, before step 901, the electronic device may further select, in response to the detected first gesture operation, a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, where virtual keys included in different types of virtual keyboards in the plurality of types of virtual keyboards are not exactly the same; displaying a first type of virtual keyboard through a touch screen, wherein the position of the first type of virtual keyboard on the touch screen is fixed in the displaying process of the first type of virtual keyboard; when the electronic device determines that the first type of virtual keyboard is a virtual keyboard with a fixed display position in the display process, the electronic device may acquire first position information of a first contact point on the touch screen in real time, that is, the electronic device is triggered to enter step 901. The concept of the first gesture operation, the virtual keyboards of multiple types, and the specific implementation of the foregoing steps will be described in the following embodiment two, and details thereof are not described herein.
902. The electronic device obtains a pressure value corresponding to the first contact point.
In some embodiments of the application, when the electronic device acquires a first contact operation input by a user through the touch screen, a pressure value corresponding to at least one first contact point on the touch screen may be further acquired through a pressure sensing module in the touch screen. The pressure value corresponding to at least one first contact point on the touch screen may include a pressure value of each first contact point in the at least one first contact point, or may share one pressure value for the at least one first contact point.
Specifically, in one case, if the pressure sensing module in the touch screen is independent, the pressure sensing module may directly collect the pressure value of each first contact point in the at least one first contact point.
In another case, if the pressure sensing module and the vibration feedback module are integrated into a whole, and each vibration feedback unit included in the vibration feedback module corresponds to a virtual key in the virtual keyboard one to one, the pressure sensing module may also directly acquire the pressure value of each first contact point in the at least one first contact point.
In another case, if the pressure sensing module and the vibration feedback module are integrated into a whole and the vibration feedback units are not in a one-to-one correspondence with the virtual keys, for example, the plurality of vibration feedback units adopt one of the arrangement layouts shown in fig. 4 to 6, that is, the plurality of vibration feedback units are arranged in a matrix, a checkerboard, or a ring, the electronic device can take readings of all of the pressure sensing elements (also referred to as vibration feedback elements) in the pressure sensing module. Further, in one implementation, the electronic device may solve for the pressure value of each first contact point (i.e., each pressure center point) of the at least one first contact point according to the coordinate position of each pressure sensing element and the pressure value acquired by each pressure sensing element, based on the principle that the moments are equal.
In another implementation manner, the electronic device may also calculate a pressure value of the entire touch screen based on pressure values collected by all the pressure sensing elements, and determine the pressure value of each first contact point in at least one contact point as the pressure value of the entire touch screen.
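For the single-contact case, the moment-balance computation in step 902 reduces to summing the sensor readings and taking the force-weighted average of the sensor positions. The following is a minimal sketch under that assumption; sensor layout, units, and function names are mine, and the multi-contact case (which the patent leaves to the moment-equality principle) is not handled.

```python
# Sketch: resolve the total force and the pressure centre for a single
# contact point from the readings of all pressure sensing elements.

def resolve_single_contact(sensors):
    """sensors: list of ((x, y), reading) pairs, one per pressure sensing
    element. Returns (total_force, (cx, cy)); the centre is None when no
    force is measured."""
    total = sum(reading for _, reading in sensors)
    if total == 0:
        return 0.0, None
    # Moment balance: centre coordinates are force-weighted averages.
    cx = sum(x * r for (x, _), r in sensors) / total
    cy = sum(y * r for (_, y), r in sensors) / total
    return total, (cx, cy)
```

The fallback described just above, attributing the whole-screen pressure to every contact point, corresponds to using only the `total` value from this computation.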
903. The electronic equipment acquires a first virtual key corresponding to the first contact point according to the first position information of the first contact point.
In some embodiments of the application, after obtaining the first position information of each first contact point in the at least one first contact point, the electronic device may obtain first virtual keys corresponding to each first contact point one by one; the first virtual key is a virtual key in the virtual keyboard.
Specifically, the process of acquiring the first virtual key corresponding to the first contact point is as follows. The electronic device may display one or more types of virtual keyboards, so the position information of each virtual key in each type of virtual keyboard may be stored in the electronic device. The electronic device determines the currently displayed virtual keyboard from the multiple types of virtual keyboards, obtains the position information of each virtual key in the currently displayed virtual keyboard, and then matches the first position information of the first contact point against the position information of each virtual key in the currently displayed virtual keyboard, thereby determining the first virtual key corresponding to the first contact point. For a more intuitive understanding of the present disclosure, please refer to fig. 10; fig. 10 shows two schematic diagrams of a virtual keyboard in the feedback method according to the embodiment of the present disclosure. Sub-diagram (a) of fig. 10 and sub-diagram (b) of fig. 10 show two types of virtual keyboards on a touch screen: sub-diagram (a) of fig. 10 shows a virtual keyboard corresponding to a physical keyboard with 74 keys, and sub-diagram (b) of fig. 10 shows an ergonomic keyboard. It should be understood that the example in fig. 10 is only for convenience of understanding the virtual keyboard in the present solution, and is not used to limit the present solution.
As an example, for example, the currently displayed virtual keyboard is an ergonomic keyboard, after the electronic device determines first position information of a first contact point on the touch screen through a contact sensing module of the touch screen, the first position information is compared with position information of each virtual key in the ergonomic keyboard, so that it is determined that the first contact point is located in a position area of the virtual key K, and then it is determined that the first virtual key corresponding to the first contact point is the key K.
More specifically, in an implementation manner, since the first contact point may be represented as a location area in an actual situation, the first location information may describe a location area, and the electronic device may obtain coordinates of a center point of the first location information, and match the coordinates of the center point of the first location information with location information of each virtual key in the currently displayed virtual keyboard, so as to determine the first virtual key corresponding to the first contact point.
In another implementation manner, the electronic device may also directly match the first location information of the first contact point against the location information of each virtual key in the currently displayed virtual keyboard, and select as the first virtual key the virtual key whose location information has the largest intersection with the first location information.
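The two implementations of step 903 can be sketched as follows: (a) test the centre point of the contact area against each key's rectangle, and (b) pick the key whose rectangle overlaps the contact area the most. Key geometry and all names are hypothetical.

```python
# Sketch of step 903: map first position information to a virtual key.

def key_for_center(keys, cx, cy):
    """keys: {name: (x, y, w, h)}. Return the key containing (cx, cy)."""
    for name, (x, y, w, h) in keys.items():
        if x <= cx < x + w and y <= cy < y + h:
            return name
    return None

def key_for_max_overlap(keys, contact):
    """contact: (x, y, w, h) of the contact area. Return the key whose
    rectangle shares the largest intersection area with the contact."""
    best, best_area = None, 0.0
    for name, (x, y, w, h) in keys.items():
        ox = max(0.0, min(x + w, contact[0] + contact[2]) - max(x, contact[0]))
        oy = max(0.0, min(y + h, contact[1] + contact[3]) - max(y, contact[1]))
        if ox * oy > best_area:
            best, best_area = name, ox * oy
    return best
```

In the example above where the contact lands in the area of key K, both variants return "K"; they differ only when a contact area straddles a key boundary.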
904. The electronic device judges, according to the pressure value corresponding to the first contact point, whether the contact operation corresponding to the first contact point is a pressing operation or a touch operation; if it is a pressing operation, step 905 is executed; if it is a touch operation, step 908 is executed.
In some embodiments of the present application, the electronic device may be preset with a first pressure value threshold and a second pressure value threshold, where the first pressure value threshold refers to a threshold of a pressing operation, and the second pressure value threshold is a threshold of a touch operation. For any one first contact point in the at least one first contact point, after acquiring a pressure value corresponding to the first contact point, the electronic device may determine whether the pressure value corresponding to the first contact point is greater than or equal to a first pressure value threshold, and if the pressure value corresponding to the first contact point is greater than or equal to the first pressure value threshold, determine that the contact operation corresponding to the first contact point is a pressing operation; if the pressure value corresponding to the first contact point is greater than or equal to the second pressure value threshold and smaller than the first pressure value threshold, determining that the contact operation corresponding to the first contact point is a touch operation; and if the pressure value corresponding to the first contact point is smaller than the second pressure value threshold value, determining that the contact operation corresponding to the first contact point is idle operation, and further not performing any feedback.
For example, the value of the first pressure value threshold may range from 50 gf to 60 gf, for example, 55 gf, 60 gf, or another value; the value of the second pressure value threshold may range from 0 gf to 30 gf, for example, 15 gf, 20 gf, and the like, which are not limited herein.
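The threshold logic of step 904 can be summarized in a few lines. The concrete threshold values below are taken from the example ranges just given, but the patent treats them as configurable, so they are assumptions.

```python
# Sketch of step 904: classify a contact by its pressure value.

PRESS_THRESHOLD_GF = 55   # first pressure value threshold (example value)
TOUCH_THRESHOLD_GF = 20   # second pressure value threshold (example value)

def classify_contact(pressure_gf):
    if pressure_gf >= PRESS_THRESHOLD_GF:
        return "press"    # proceed to step 905
    if pressure_gf >= TOUCH_THRESHOLD_GF:
        return "touch"    # proceed to step 908
    return "idle"         # below both thresholds: no feedback at all
```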
905. The electronic device determines whether the first virtual key is an anchor point key, and if the first virtual key is an anchor point key, the process proceeds to step 906, and if the first virtual key is not an anchor point key, the process proceeds to step 908.
In one case, in the display process of the first type of virtual keyboard, the position of the displayed first type of virtual keyboard is fixed; in another case, during the presentation of the virtual keyboard of the first type, the position of the presented virtual keyboard of the first type may be moved.
If the position of the displayed first type of virtual keyboard is fixed in the displaying process of the first type of virtual keyboard, in an implementation manner, the electronic device may store in advance which keys are anchor point keys and which keys are non-anchor point keys, step 903 is an optional step, and after the electronic device determines the first virtual key corresponding to the first contact point through step 903, the electronic device may determine whether the first virtual key is an anchor point key. In another implementation manner, the electronic device may store in advance which location areas on the touch screen are location areas of the anchor point keys, and which location areas on the touch screen are location areas of the non-anchor point keys, then step 903 is an optional step, and the electronic device directly determines, according to the first location information of the first contact point obtained in step 901, whether the location of the first contact point is located in the location area of the anchor point key, that is, whether the first virtual key corresponding to the first location information is an anchor point key.
If the position of the displayed first type of virtual keyboard can move in the displaying process of the first type of virtual keyboard, step 903 is an optional step, the electronic device can store the position information of each virtual key in the first type of virtual keyboard, after the first position information of the first contact point is obtained, the first virtual key corresponding to the first contact point is obtained according to the first position information, and then whether the first virtual key is an anchor point key is judged.
In the embodiment of the application, the first virtual key corresponding to the first contact point can be acquired in real time according to the first position information, so that the scheme not only can be compatible with a virtual keyboard with a fixed position, but also can be compatible with a virtual keyboard with a movable position, and the application scene of the scheme is expanded.
It should be noted that the anchor point keys are not equivalent to the positioning keys; that is, the anchor point keys refer to keys that provide a prompting effect to the user. After the currently displayed virtual keyboard is determined, which virtual keys are anchor point keys may be pre-configured in the electronic device, that is, which virtual keys serve as anchor point keys may be fixed in advance; alternatively, this may be customized by the user, that is, the user may define which virtual keys are the anchor point keys through a "setting" interface in the electronic device. Further, since the same electronic device can provide a plurality of different types of virtual keyboards, the anchor point keys may differ among the different types of virtual keyboards.
As an example, the anchor point keys may be the key "F" and the key "J", or the anchor point keys may also include the space key; as another example, the anchor point keys may further include common function keys such as the ESC key, the Backspace key, the Enter key, the Ctrl key, and the numeric keys; as another example, for a virtual keyboard that uses the "DVORAK" layout, the anchor point keys may include the eight keys at the standard finger positions, namely the keys "AOEUHTNS"; as another example, for a virtual keyboard that adopts the "AZERTY" layout, the anchor point keys may include the eight keys "QSDFJKLM"; as another example, the anchor point keys may also include six keys of "AZERTY", and so on, which are not exhaustive.
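The per-layout, user-customizable anchor-key lookup of step 905 can be sketched as a table of key sets, one per keyboard type, merged with any user-defined overrides from the "setting" interface. The key sets follow the examples in the text; everything else is an illustrative assumption.

```python
# Sketch of step 905: is the first virtual key an anchor point key?

ANCHOR_KEYS = {
    "QWERTY": {"F", "J", "Space"},
    "DVORAK": set("AOEUHTNS"),   # eight standard-finger-position keys
    "AZERTY": set("QSDFJKLM"),   # eight standard-finger-position keys
}

def is_anchor_key(layout, key, user_overrides=None):
    """Pre-configured anchor keys for `layout`, optionally extended by the
    user's own choices made in a "setting" interface."""
    anchors = set(ANCHOR_KEYS.get(layout, ()))
    if user_overrides:
        anchors |= set(user_overrides)
    return key in anchors
```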
906. The electronic device performs a first feedback operation.
In an embodiment of the application, when the contact operation corresponding to the first contact point is a pressing operation and the first virtual key is an anchor point key, the electronic device executes a first feedback operation, where the first feedback operation is used to prompt that the first virtual key is the anchor point key.
Specifically, in one implementation, the first feedback operation may be in the form of vibration feedback, and step 906 may include: the electronic equipment acquires a first vibration feedback element from the plurality of vibration feedback elements, the first vibration feedback element is configured in the touch screen, the first vibration feedback element is matched with a first virtual key, and the vibration feedback elements matched with different virtual keys are not identical; a first type of vibration wave is emitted through the first vibration feedback element to perform a first feedback operation. The vibration wave emitted by the vibration feedback element is non-ultrasonic, and the frequency is generally less than or equal to 500 Hz.
More specifically, regarding the process of obtaining a first vibration feedback element that matches the first virtual key: since the positions of the plurality of vibration feedback elements included in the touch screen are already fixed when the electronic device leaves the factory, the electronic device may be configured with the first mapping relationship at the factory. In one implementation, the whole touch screen may be divided into a plurality of position areas, and the first mapping relationship stored in the electronic device includes the correspondence between each of the plurality of position areas in the touch screen and at least one vibration feedback element. In this way, whether the position of the displayed first type of virtual keyboard is fixed during its display, or the position of the displayed first type of virtual keyboard may move during its display, the electronic device may obtain, according to the first position information obtained in step 901 and the first mapping relationship, at least one first vibration feedback element that matches the first virtual key (that is, that matches the first position information) from the plurality of vibration feedback elements.
In the embodiment of the application, at least one first vibration feedback element matched with the first virtual key can be obtained according to the first position information and the first mapping relationship, which is convenient and fast and helps improve the efficiency of the matching process of the vibration feedback elements. Moreover, because the first mapping relationship indicates the correspondence between position information and vibration feedback elements, the scheme is compatible both with a virtual keyboard whose position is fixed and with a virtual keyboard whose position can move, so vibration feedback can be provided in a variety of scenarios.
In another implementation manner, if the position of the displayed first type of virtual keyboard is fixed in the displaying process of the first type of virtual keyboard, a plurality of mapping relationships corresponding to the plurality of virtual keyboards one to one may be pre-configured on the electronic device, where each mapping relationship includes a corresponding relationship between each virtual key of the plurality of virtual keys and at least one vibration feedback element. The electronic device first obtains a first mapping relationship matching the currently displayed virtual keyboard from the multiple mapping relationships, where the first mapping relationship includes a corresponding relationship between each virtual key in the currently displayed virtual keyboard and at least one first vibration feedback element, and the electronic device obtains one or more first vibration feedback elements matching the first virtual key according to the first mapping relationship and the first virtual key determined in step 903.
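The per-keyboard mapping of this second implementation can be sketched as a nested table: one mapping per keyboard type, each associating a virtual key with the vibration feedback elements matched to it. The element IDs and keyboard-type names are invented for illustration.

```python
# Sketch: pre-configured mappings, one per virtual keyboard type, from
# virtual key to the IDs of its matched vibration feedback elements.

KEY_TO_ELEMENTS = {
    "ergonomic": {"K": [12, 13, 14], "J": [9, 10]},
    "74-key":    {"K": [22], "J": [21]},
}

def feedback_elements(keyboard_type, virtual_key):
    """Look up the first mapping relationship for the currently displayed
    keyboard, then the elements matched with the given key."""
    mapping = KEY_TO_ELEMENTS.get(keyboard_type, {})
    return mapping.get(virtual_key, [])
```

Splitting the lookup into "select the mapping for the displayed keyboard" and "select the elements for the key" mirrors the two-step structure the text says eases fault localization.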
In the embodiment of the application, the first mapping relation is configured in advance, so that after the first virtual key is obtained, at least one first vibration feedback element matched with the first virtual key can be obtained through the first mapping relation, convenience and rapidness are achieved, and the efficiency of the matching process of the vibration feedback elements is improved; the step of determining the vibration feedback element is split to facilitate accurate location of the fault location when a fault occurs.
In another implementation manner, the electronic device is preconfigured with position information of each vibration feedback element, and the electronic device determines whether a vibration feedback element for generating a vibration wave exists below the first virtual key according to the first position information of the first virtual key and the position information of each vibration feedback element in the vibration feedback module, and if a vibration feedback element for generating a vibration wave exists below the first virtual key, obtains at least one vibration feedback element located below the first position information, where the at least one vibration feedback element located below the first position information refers to a vibration feedback element whose position area intersects with a projection of the first virtual key on the vibration feedback module. If the vibration feedback element for generating the vibration wave does not exist below the first virtual key, the electronic device searches for the vibration feedback element for generating the vibration wave existing in the preset area by taking the center point coordinate of the first position information of the first virtual key as a center point. The preset area may be a circle, a square, a rectangle, etc., and the size of the preset area may be determined by combining the arrangement and layout of the vibration feedback elements, the type of the elements used by the vibration feedback elements, etc., and is not limited herein.
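The third implementation above can be sketched as a two-stage geometric search: first collect the elements whose position intersects the key's projection, and only if none exist, search a preset area around the key's centre point. A circular preset area and all concrete geometry are assumptions.

```python
import math

# Sketch: find vibration feedback elements for a key by position, with a
# fallback search in a preset circular area around the key centre.

def elements_for_key(key_rect, elements, preset_radius=30.0):
    """key_rect: (x, y, w, h); elements: {eid: (ex, ey)} element centres.
    Returns the IDs of elements under the key; if there are none, the IDs
    of elements within `preset_radius` of the key's centre point."""
    x, y, w, h = key_rect
    under = [eid for eid, (ex, ey) in elements.items()
             if x <= ex < x + w and y <= ey < y + h]
    if under:
        return under
    cx, cy = x + w / 2, y + h / 2
    return [eid for eid, (ex, ey) in elements.items()
            if math.hypot(ex - cx, ey - cy) <= preset_radius]
```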
The following describes the process of emitting a vibration wave through the first vibration feedback element to perform the first feedback operation. Specifically, after the electronic device determines the at least one first vibration feedback element matched with the virtual key, the electronic device emits a first type of vibration wave through the at least one first vibration feedback element.
Optionally, the electronic device may further obtain a location type corresponding to the first contact point according to the first location information of the first contact point. The position type comprises a first position area of the first contact point on the anchor point key and a second position area of the first contact point on the anchor point key, and the first position area and the second position area are different; that is, the whole location area of one anchor point key is further divided into a first location area (which may also be referred to as a feature area of the anchor point key) and a second location area (which may also be referred to as an edge area of the anchor point key). The division manner of the first position area and the second position area in different virtual keys can be different.
For a more intuitive understanding of the present disclosure, please refer to fig. 11 to 13; fig. 11 to 13 provide four schematic diagrams of the first location area and the second location area in the feedback method according to the embodiment of the present disclosure. Fig. 11 includes two sub-diagrams (a) and (b), in which the area inside the dashed line frame in sub-diagram (a) of fig. 11 represents the first position area of the virtual key K (which may also be called the characteristic position area of the key K), and the area outside the dashed line frame in sub-diagram (a) of fig. 11 represents the second position area of the virtual key K (which may also be called the edge position area of the key K). The area inside the dashed line frame in sub-diagram (b) of fig. 11 represents the first position area of the virtual key J, and the area outside the dashed line frame in sub-diagram (b) of fig. 11 represents the second position area of the virtual key J; sub-diagram (b) of fig. 11 illustrates the division of the first position area and the second position area for the virtual key corresponding to a key with a small protrusion on a physical keyboard.
Referring to fig. 12, the area inside the dotted line frame in fig. 12 represents the first location area of the virtual key K, and the area outside the dotted line frame in fig. 12 represents the second location area of the virtual key K. Fig. 12 and sub-diagram (a) of fig. 11 show two different area division manners; the division manner in fig. 12 simulates a keycap with a concave arc surface on a physical keyboard. Referring to fig. 13, the area inside the inner dotted line frame of the virtual key K in fig. 13 represents the first position area of the virtual key K, and the area between the two dotted line frames of the virtual key K in fig. 13 represents the second position area of the virtual key K. Fig. 13, fig. 12, and sub-diagram (a) of fig. 11 show different area division manners; in fig. 13, in the case that the virtual key is an anchor point key, the second position area of the virtual key K (which may also be referred to as the edge position area of the virtual key) extends beyond the edge of the virtual key K to cover the key gaps around the virtual key K, so that the tactile difference of the anchor point key can be further enhanced.
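A simple reading of the division in sub-diagram (a) of fig. 11 is an inner rectangle (the feature area) obtained by insetting the key's edges, with the remainder of the key being the edge area. The inset amount, and treating the regions as rectangles at all, are assumptions for illustration; figs. 12 and 13 use other shapes.

```python
# Sketch: classify a contact point into the first (feature) or second
# (edge) location area of an anchor point key, per fig. 11(a).

def location_type(key_rect, px, py, inset=2.0):
    """key_rect: (x, y, w, h). Returns "first" for the inner feature area,
    "second" for the surrounding edge area, None if outside the key."""
    x, y, w, h = key_rect
    if not (x <= px < x + w and y <= py < y + h):
        return None
    inner = (x + inset <= px < x + w - inset and
             y + inset <= py < y + h - inset)
    return "first" if inner else "second"
```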
The electronic device may determine the type of the vibration wave emitted by the first vibration feedback element according to the location type corresponding to the first contact point. The type of vibration wave emitted by the electronic device through the at least one first vibration feedback element may be different in the case where the first contact point is located in the first location area of the anchor point key and in the case where the first contact point is located in the second location area of the anchor point key. If the electronic device emits continuous vibration waves through the vibration feedback element, different types of vibration waves differ in any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, or vibration waveform. If the electronic device emits vibration waves in the form of pulses through the vibration feedback element, different types of vibration waves differ in any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, vibration waveform, or the frequency of the pulsed vibration waves emitted by the electronic device.
Further, vibration waves of different vibration amplitudes may be realized by different trigger voltages; the vibration amplitude of the vibration wave generated when a voltage of 300 v is input to the vibration feedback element is different from the vibration amplitude of the vibration wave generated when a voltage of 400 v is input to the vibration feedback element. The vibration frequency of the vibration wave emitted by the vibration feedback element corresponding to the anchor point key may be between 200 hz and 400 hz, such as 240 hz, 260 hz, 300 hz, 350 hz, 380 hz, or another value, which is not exhaustive here. The vibration duration may be 10 milliseconds, 15 milliseconds, 20 milliseconds, 25 milliseconds, 30 milliseconds, and so on. The vibration wave sent by the vibration feedback element corresponding to the anchor point key may be a single basic waveform or a superposition of multiple different basic waveforms; the aforementioned basic waveforms include, but are not limited to, square waves, sine waves, sawtooth waves, triangular waves, or other types of basic waveforms. For example, the vibration wave emitted by the first vibration feedback element may be a sine wave generated by a voltage of 350 v (which determines the vibration amplitude), with a vibration frequency of 290 hz and a duration of 20 ms.
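The closing example, a 290 hz sine wave lasting 20 ms, can be synthesized as a sampled burst, with a unitless amplitude standing in for the drive voltage. The sample rate and normalisation are my assumptions; real drive electronics would shape the signal differently.

```python
import math

# Sketch: sample the example vibration wave (290 Hz sine, 20 ms burst).

def sine_burst(freq_hz=290, duration_s=0.020, amplitude=1.0, rate=8000):
    """Return duration_s worth of samples of a sine vibration wave;
    `amplitude` stands in for the trigger voltage's effect."""
    n = int(duration_s * rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / rate)
            for i in range(n)]
```

Superposed waveforms, as mentioned above, would simply be sample-wise sums of several such bursts.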
Optionally, while at least one first vibration feedback element in the touch screen is matched with the first virtual key, a second virtual key may also be present in the virtual keyboard, and the number of vibration feedback elements corresponding to the second virtual key may be the same as or different from the number of vibration feedback elements corresponding to the first virtual key; that is, the numbers of vibration feedback elements corresponding to different virtual keys may be the same or different. As an example, the number of vibration feedback elements corresponding to the virtual key K may be 3, and the number of vibration feedback elements corresponding to the virtual key J may be 2.
In order to keep the difference between the intensity of the vibration feedback corresponding to the first virtual key and the intensity corresponding to the second virtual key within a preset intensity range (that is, so that the difference between the total vibration feedback intensities of different virtual keys, namely the intensity perceivable by a user, falls within a preset intensity range), the electronic device obtains the vibration intensity of the vibration wave corresponding to each first vibration feedback element in the at least one first vibration feedback element. The vibration intensity of each first vibration feedback element is related to a first number, where the first number is the number of vibration feedback elements matched with the first virtual key. Then, according to the vibration intensity corresponding to each first vibration feedback element, a first type of vibration wave is emitted through each of the at least one first vibration feedback element. The preset intensity range may be an intensity difference within two percent, within three percent, within four percent, within five percent, or another intensity range, which is not exhaustively listed here.
Specifically, in one implementation, after determining the at least one first vibration feedback element matched with the first virtual key, the electronic device may determine, directly according to the number of first vibration feedback elements matched with the first virtual key, the vibration intensity of the vibration wave corresponding to each of the at least one first vibration feedback element. The electronic device may determine the vibration intensity of each first vibration feedback element based on any one or a combination of the following factors: the number of first vibration feedback elements matched with the first virtual key, the distance between each first vibration feedback element and the center point of the first virtual key, the type of vibration wave, whether the virtual key is an anchor point key, the position type of the first position information, or other factors.
In another implementation, the electronic device may store a second mapping relationship in advance. In one case, the second mapping relationship indicates a relationship between position information and the vibration intensity of each first vibration feedback element, and the electronic device may acquire the vibration intensity of each first vibration feedback element according to the first position information acquired in step 901 and the second mapping relationship. In another case, the second mapping relationship indicates a relationship between virtual keys and the vibration intensity of each first vibration feedback element, and the electronic device acquires the vibration intensity of each first vibration feedback element according to the first virtual key determined in step 903 and the second mapping relationship.
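The two manners above, deriving the per-element intensity from the first number and looking it up in a prestored second mapping relationship, can be sketched as follows (all names and values are illustrative, not taken from the embodiment):

```python
# Hypothetical target: every key should produce roughly the same total
# perceived intensity, so each element's drive scales with the element count.
TARGET_TOTAL_INTENSITY = 1.0

def per_element_intensity(first_number):
    """Divide the target intensity evenly across the matched elements."""
    return TARGET_TOTAL_INTENSITY / first_number

# Prestored second mapping (illustrative): virtual key -> per-element
# intensities, e.g. key 'K' matched with 3 elements, key 'J' with 2.
SECOND_MAPPING = {
    "K": [1 / 3, 1 / 3, 1 / 3],
    "J": [1 / 2, 1 / 2],
}

def intensities_for_key(key):
    """Look up the per-element vibration intensities for a virtual key."""
    return SECOND_MAPPING.get(key)
```

With either manner, the totals for keys K and J stay within the preset intensity range even though their element counts differ.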
Further, when measuring intensity on the surface of the touch screen, the probe of a vibration measuring instrument may be attached to the surface of a virtual key (i.e., a detection point) on the touch screen to collect the vibration wave at the detection point and obtain a waveform curve of the collected vibration wave; the intensity of the vibration feedback corresponding to the detection point is indicated by this waveform curve. The difference between the intensity of the vibration feedback corresponding to the first virtual key and that corresponding to the second virtual key can then be obtained by comparing the waveform curve measured at the detection point of the first virtual key with the waveform curve measured at the detection point of the second virtual key.
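This comparison can be sketched as follows (using RMS as the intensity measure is an assumption; the five-percent figure is one of the preset ranges mentioned above):

```python
def rms(samples):
    """Root-mean-square value of a sampled waveform curve."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def within_preset_range(curve_a, curve_b, preset=0.05):
    """Check whether the intensities of two measured waveform curves
    differ by no more than the preset fraction (here five percent)."""
    ia, ib = rms(curve_a), rms(curve_b)
    return abs(ia - ib) <= preset * max(ia, ib)
```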
In the embodiment of the application, because the numbers of vibration feedback elements corresponding to different virtual keys may differ, the intensity of each vibration feedback element is determined according to the number of matched vibration feedback elements, so that the difference in vibration feedback intensity between virtual keys stays within a preset range. Because the force feedback given by different keys of a physical keyboard is basically the same, this reduces the difference between the virtual keyboard and a physical keyboard and thereby increases user stickiness.
In another implementation, the first feedback operation may take the form of sound feedback, and step 906 may include: the electronic device emits a first alert tone, which may be a "drip", a "beep", or another sound; the specific representations of the first alert tone are not exhaustively listed here.
Optionally, the electronic device may further obtain the location type corresponding to the first contact point according to the first location information of the first contact point, and emit different alert tones depending on whether the first contact point is located in the first location area or the second location area of the anchor point key. As an example, when the first contact point is located in the first location area of the anchor point key, the electronic device emits a "drip" alert tone, and when the first contact point is located in the second location area of the anchor point key, the electronic device emits a "beep" alert tone.
The electronic device may also adopt other types of feedback besides sound feedback and vibration feedback; which type of feedback is adopted may be determined in combination with the actual product form and the actual application scenario of the product, and is not exhaustively listed here.
907. The electronic device performs a second feedback operation.
In some embodiments of the application, when the contact operation corresponding to the first contact point is a pressing operation and the first virtual key is not an anchor point key, the electronic device may perform a second feedback operation, where the second feedback operation is used to prompt that the first virtual key is a non-anchor point key. In the embodiment of the application, one feedback operation is executed when the first virtual key is an anchor point key and another when it is a non-anchor point key, and the first feedback operation and the second feedback operation are different feedback operations.
In one implementation, the second feedback operation may be in the form of vibration feedback, and step 907 may include: the electronic equipment acquires a first vibration feedback element matched with the first virtual key, and the first vibration feedback element is configured in the touch screen; a second type of vibration wave is emitted through the first vibration feedback element to perform a second feedback operation. The distinction between the first type of vibration wave and the second type of vibration wave includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration and vibration waveform. In the embodiment of the application, specific distinguishing modes of different types of vibration waves are provided, the different types of vibration waves can be distinguished through vibration amplitude, vibration frequency, vibration duration and/or vibration waveform and the like, and the realization flexibility of the scheme is improved.
Specifically, the specific implementation manner of the electronic device obtaining the first vibration feedback element matched with the first virtual key may refer to the description in step 906, which is not described herein again.
The following describes the process of emitting a vibration wave through the first vibration feedback element to perform the second feedback operation. Specifically, after the electronic device determines the at least one first vibration feedback element matched with the first virtual key, the electronic device emits a second type of vibration wave through the at least one first vibration feedback element.
Optionally, under the condition that the first virtual key is not an anchor point key, the electronic device may also obtain a location type corresponding to the first contact point according to the first location information of the first contact point, where the location type includes a first location area where the first contact point is located on the non-anchor point key and a second location area where the first contact point is located on the non-anchor point key, and the first location area and the second location area are different; that is, the whole location area of one non-anchor point key is further divided into a first location area (which may also be referred to as a feature area of the non-anchor point key) and a second location area (which may also be referred to as an edge area of the non-anchor point key), and the division manner of the first location area and the second location area in different virtual keys may be different.
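The division of a key's whole location area into a feature (inner) region and an edge region can be sketched geometrically (the key bounds and the margin width are illustrative choices, not from the embodiment):

```python
def region_type(contact_xy, key_rect, edge_margin=4):
    """Classify a contact point as lying in the key's feature region
    (first location area) or its edge region (second location area).
    key_rect = (left, top, width, height) in screen coordinates."""
    x, y = contact_xy
    left, top, w, h = key_rect
    inner = (left + edge_margin <= x <= left + w - edge_margin and
             top + edge_margin <= y <= top + h - edge_margin)
    return "first_location_area" if inner else "second_location_area"

key_f = (100, 200, 40, 40)  # hypothetical bounds of one virtual key
center_hit = region_type((120, 220), key_f)  # well inside -> feature region
border_hit = region_type((101, 201), key_f)  # near the border -> edge region
```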
The electronic device may determine the type of vibration wave emitted by the first vibration feedback element based on the type of location corresponding to the first contact point. The type of vibration wave emitted by the electronic device through the at least one first vibration feedback element may be different in the case where the first contact point is located in the first location area of the non-anchor point key and in the case where the first contact point is located in the second location area of the non-anchor point key.
Further, in one case, a type of vibration wave corresponding to the first position area of the anchor point key is the same as a type of vibration wave corresponding to the first position area of the non-anchor point key, and a type of vibration wave corresponding to the second position area of the anchor point key is different from a type of vibration wave corresponding to the second position area of the non-anchor point key.
In another case, the type of vibration wave corresponding to the first location area of the anchor point key is different from the type of vibration wave corresponding to the first location area of the non-anchor point key, and the type of vibration wave corresponding to the second location area of the anchor point key is the same as the type of vibration wave corresponding to the second location area of the non-anchor point key.
In yet another case, the type of vibration wave corresponding to the first location area of the anchor point key is different from the type of vibration wave corresponding to the first location area of the non-anchor point key, and the type of vibration wave corresponding to the second location area of the anchor point key is different from the type of vibration wave corresponding to the second location area of the non-anchor point key.
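The last case, in which all four (key type, location area) combinations map to distinct vibration wave types, can be sketched as a lookup table (the wave-type names are placeholders):

```python
# Illustrative table: (is anchor point key, location area) -> wave type.
WAVE_TYPE_TABLE = {
    (True,  "first_location_area"):  "wave_A",
    (True,  "second_location_area"): "wave_B",
    (False, "first_location_area"):  "wave_C",
    (False, "second_location_area"): "wave_D",
}

def wave_type(is_anchor_key, location_area):
    """Select the vibration wave type for the contacted key and region."""
    return WAVE_TYPE_TABLE[(is_anchor_key, location_area)]
```

The other two cases are obtained simply by letting two table entries share the same wave type.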
In the embodiment of the application, the whole location area of the anchor point key and/or the non-anchor point key is divided into a first location area and a second location area, and the type of vibration wave emitted by the electronic device through the at least one first vibration feedback element differs depending on whether the first contact point is located in the first location area or the second location area. This helps the user memorize the boundary of each virtual key and build muscle memory for different areas of a virtual key, further reducing the difficulty of achieving touch typing on the touch screen.
In another implementation, the second feedback operation may be in the form of sound feedback, and then step 907 may include: the electronic equipment sends out a second prompt tone, and the second prompt tone and the first prompt tone are different prompt tones.
Optionally, the electronic device may further obtain a location type corresponding to the first contact point according to the first location information of the first contact point; in the case where the first contact point is located in the first position area of the non-anchor point key, and in the case where the first contact point is located in the second position area of the non-anchor point key, the electronic apparatus emits different alert tones.
It should be noted that step 907 is an optional step, and step 907 may not be executed, that is, in a case where the electronic device determines that the first virtual key is not an anchor point key, no feedback may be executed.
908. The electronic device determines whether the first virtual key is an anchor point key, and proceeds to step 909 if the first virtual key is an anchor point key, and proceeds to step 910 if the first virtual key is not an anchor point key.
In the embodiment of the present application, the specific implementation manner of step 908 may refer to the description in step 905, which is not described herein again.
909. The electronic device changes a tactile characteristic of a first contact point on the touch screen to appear as a first tactile state.
In some embodiments of the application, when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is an anchor point key, the electronic device changes a tactile characteristic of the first contact point in the cover plate of the touch screen to present a first tactile state. The tactile characteristics of the cover plate of the touch screen include any one or more of the following: coefficient of sliding friction, stick-slip, temperature, or other types of tactile characteristics. The electronic device may change the whole cover plate of the touch screen to the first tactile state, so that the first contact point in the cover plate presents the first tactile state; alternatively, it may change only the first contact point in the cover plate to the first tactile state without changing the tactile state of other areas of the cover plate.
Specifically, in one implementation, if an ultrasonic module is integrated in the touch screen of the electronic device, the electronic device changes the tactile characteristic of the first contact point in the cover plate of the touch screen by emitting ultrasonic waves through the ultrasonic module in the touch screen. The electronic device may emit different types of ultrasonic waves through the ultrasonic module so that the first contact point in the cover plate of the touch screen exhibits different tactile characteristics. If the electronic device emits a single continuous ultrasonic wave through the ultrasonic module, the differences between the types of ultrasonic waves include any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, or vibration waveform. Further, the frequency of the ultrasonic wave emitted by the ultrasonic module is greater than 20 kHz, and may specifically be 21 kHz, 22 kHz, 24 kHz, 25 kHz, or another value, which is not limited here. If the electronic device emits a pulsed wave through the ultrasonic module, the differences between the types of ultrasonic waves include any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, vibration waveform, or the frequency of the pulse wave emitted by the electronic device, which may also be referred to as the rhythm of the pulse wave.
As an example, the electronic device may emit an ultrasonic pulse every 3 milliseconds through the ultrasonic module, or emit an ultrasonic pulse every 10 milliseconds; in these two cases, the frequency of the pulses emitted by the electronic device is different. It should be understood that this example is provided only to facilitate understanding of the solution and is not intended to limit it.
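The two cadences in this example can be sketched as pulse timestamps (the 30-millisecond observation window is an arbitrary illustration):

```python
def pulse_times(interval_ms, total_ms):
    """Timestamps in milliseconds at which an ultrasonic pulse is emitted;
    a faster cadence (3 ms) and a slower one (10 ms) produce different
    rhythms at the contact point."""
    return list(range(0, total_ms, interval_ms))

fast = pulse_times(3, 30)   # a pulse every 3 milliseconds
slow = pulse_times(10, 30)  # a pulse every 10 milliseconds
```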
Step 909 may include: the electronic device obtains a third type of ultrasonic wave corresponding to the anchor point key, and the third type of ultrasonic wave is sent out through an ultrasonic wave module in the touch screen so as to change the tactile characteristics of the first contact point on the touch screen to the first tactile state.
Optionally, when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is an anchor point key, the electronic device may also obtain a location type corresponding to the first contact point according to the first location information of the first contact point. The electronic equipment can also determine the type of the ultrasonic wave corresponding to the first position information according to the position type corresponding to the first contact point, and then the ultrasonic wave of the type is sent out through an ultrasonic wave module in the touch screen. The type of the ultrasonic wave emitted by the electronic equipment through the ultrasonic module can be different under the condition that the first contact point is located in the first position area of the anchor point key and under the condition that the first contact point is located in the second position area of the anchor point key.
In another implementation manner, if an electrostatic module is integrated in a touch screen of the electronic device, the electronic device changes a tactile characteristic of a first contact point in a cover plate of the touch screen by emitting static electricity through the electrostatic module in the touch screen. The electronic device can emit static electricity of different magnitudes through the static electricity module, so that the first contact point in the cover plate of the touch screen presents different tactile characteristics. The voltage of the static electricity emitted by the static electricity module may range from 100 volts to 400 volts, for example, the voltage of the static electricity emitted by the static electricity module is 120 volts, 200 volts, 380 volts, or other values, and is not limited herein.
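For illustration, the 100-to-400-volt drive range above can be enforced with a trivial clamp (only the range itself comes from the embodiment; the clamping policy is an assumption):

```python
def electrostatic_voltage(requested_volts):
    """Clamp a requested drive level into the 100-400 V range that the
    electrostatic module supports."""
    return max(100, min(400, requested_volts))
```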
Step 909 may include: the electronic equipment acquires a first voltage value of static electricity corresponding to the anchor point key, and static electricity of the first voltage value is sent out through a static electricity module in the touch screen so as to change the touch characteristic of a first contact point on the touch screen to a first touch state.
Optionally, when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is an anchor point key, the electronic device may also obtain the location type corresponding to the first contact point according to the first location information of the first contact point, determine the voltage value of the static electricity corresponding to the first position information according to that location type, and then emit static electricity of that voltage value through the electrostatic module in the touch screen. The voltage value of the static electricity emitted by the electronic device through the electrostatic module may differ depending on whether the first contact point is located in the first position area or the second position area of the anchor point key.
It should be noted that the electronic device may also change the tactile characteristics of the cover plate in the touch screen in other ways, which are not listed here.
910. The electronic device changes a tactile characteristic of the first contact point on the touch screen to appear as a second tactile state.
In some embodiments of the application, when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is a non-anchor point key, the electronic device changes the tactile characteristic of the first contact point in the cover plate of the touch screen to present a second tactile state. The tactile characteristic of the touch screen in the first tactile state may differ from that in the second tactile state; that is, what the user feels when touching an anchor point key may differ from what the user feels when touching a non-anchor point key, which further assists the user in distinguishing anchor point keys from non-anchor point keys on the virtual keyboard and in locating virtual keys in the virtual keyboard.
Specifically, in an implementation manner, if an ultrasonic module is integrated in a touch screen of the electronic device, the electronic device changes a tactile characteristic of a first contact point in a cover plate of the touch screen by emitting ultrasonic waves through the ultrasonic module in the touch screen. Step 910 may include: the electronic device acquires a fourth type of ultrasonic wave corresponding to the non-anchor point key, and sends out the fourth type of ultrasonic wave through an ultrasonic wave module in the touch screen so as to change the tactile characteristics of the first contact point on the touch screen to a second tactile state.
Optionally, in a case that the contact operation corresponding to the first contact point is a touch operation and the first virtual key is a non-anchor point key, the electronic device may also obtain a location type corresponding to the first contact point. And determining the type of the ultrasonic wave corresponding to the first position information according to the position type corresponding to the first contact point. The type of the ultrasonic wave emitted by the electronic equipment through the ultrasonic module can be different under the condition that the first contact point is located in the first position area of the non-anchor point key and under the condition that the first contact point is located in the second position area of the non-anchor point key.
In another implementation, if an electrostatic module is integrated in the touch screen of the electronic device, the electronic device changes the tactile characteristic of the first contact point in the cover plate of the touch screen by emitting static electricity through the electrostatic module in the touch screen. Step 910 may include: the electronic device acquires a second voltage value of static electricity corresponding to the non-anchor point key, and emits static electricity of the second voltage value through the electrostatic module in the touch screen so as to change the tactile characteristic of the first contact point on the touch screen to the second tactile state.
Optionally, when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is a non-anchor point key, the electronic device may also obtain the location type corresponding to the first contact point, and determine the voltage value of the static electricity corresponding to the first position information according to that location type. The voltage value of the static electricity emitted by the electronic device through the electrostatic module may differ depending on whether the first contact point is located in the first position area or the second position area of the non-anchor point key.
It should be noted that step 908 is an optional step, and if step 908 is not executed, steps 909 and 910 may be combined, that is, when the contact operation corresponding to the first contact point is a touch operation, the tactile characteristic of the first contact point on the touch screen may be in the same tactile state regardless of whether the first virtual key is an anchor point key or a non-anchor point key.
In addition, steps 908 to 910 are optional steps, and after the electronic device determines that the contact operation corresponding to the first contact point is not a pressing operation, the electronic device may not directly perform any feedback, that is, the electronic device may not perform any feedback when the pressure value corresponding to the first contact point is smaller than the first pressure value threshold.
In the embodiment of the application, when the user contacts an anchor point key on the virtual keyboard, the first feedback operation is executed through the touch screen to prompt the user that the currently contacted key is an anchor point key, so that the user can perceive the position of the anchor point key, which helps reduce the difficulty of achieving touch typing. In addition, a plurality of vibration feedback elements are configured in the touch screen; when the first virtual key is determined to be an anchor point key, at least one first vibration feedback element matched with the first virtual key is obtained from the plurality of vibration feedback elements and instructed to emit vibration waves, so that vibration feedback is generated only around the first virtual key rather than across the full screen. Since all fingers rest on the touch screen during typing, full-screen vibration would be felt by every finger and could easily confuse the user; generating vibration feedback only around the first virtual key avoids this confusion and more readily helps the user form muscle memory in the fingers, assisting the user in achieving touch typing on the touch screen.
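The overall feedback decision of steps 905 to 910 can be sketched as follows (the function name, threshold handling, and return values are illustrative, not part of the embodiment):

```python
def handle_contact(pressure, pressure_threshold, is_anchor_key):
    """Dispatch the contact on a virtual key to the appropriate feedback:
    a pressing operation triggers the first or second feedback operation,
    while a touch operation changes the cover plate's tactile state."""
    if pressure >= pressure_threshold:
        # pressing operation (steps 906/907)
        return "first_feedback" if is_anchor_key else "second_feedback"
    # touch operation (steps 909/910)
    return "first_tactile_state" if is_anchor_key else "second_tactile_state"
```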
On the basis of the embodiments corresponding to fig. 1 to fig. 13, in order to better implement the above-mentioned scheme of the embodiments of the present application, the following also provides related equipment for implementing the above-mentioned scheme. Referring to fig. 14, fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 1 comprises a touch screen 20, a memory 40, one or more processors 10, and one or more programs 401, the touch screen 20 having a plurality of vibration feedback elements configured therein, the one or more programs 401 being stored in the memory 40, the one or more processors 10, when executing the one or more programs 401, causing the electronic device to perform the steps of: detecting a first contact operation acting on the touch screen 20; responding to the first contact operation, and acquiring first position information of a first contact point corresponding to the first contact operation, wherein the first position information corresponds to a first virtual key on a virtual keyboard; under the condition that the first virtual key is an anchor point key, acquiring a first vibration feedback element from a plurality of vibration feedback elements, wherein the first vibration feedback element is a vibration feedback element matched with the first virtual key; and indicating the first vibration feedback element to send out vibration waves so as to execute a first feedback operation, wherein the first feedback operation is used for prompting that the first virtual key is an anchor point key.
In one possible design, the electronic device 1 is configured with a first mapping indicating a correspondence between virtual keys and vibration feedback elements, and the one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to specifically perform the steps of: and acquiring a first vibration feedback element according to the first mapping relation and the first virtual key.
In one possible design, the electronic device 1 is configured with a first mapping, the first mapping indicating a correspondence between position information and vibration feedback elements, and the one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to specifically perform the steps of: acquiring the first vibration feedback element according to the first mapping relationship and the first position information.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to further perform the steps of: the method comprises the steps of obtaining the vibration intensity of vibration waves corresponding to each first vibration feedback element in at least one first vibration feedback element, wherein the vibration intensity of the vibration waves of each first vibration feedback element in the at least one first vibration feedback element is related to a first number, and the first number is the number of the first vibration feedback elements. The one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to specifically perform the steps of: and sending out the vibration waves through at least one first vibration feedback element according to the vibration intensity of the vibration waves corresponding to each first vibration feedback element, so that the difference between the vibration feedback intensity corresponding to the first virtual key and the vibration feedback intensity corresponding to the second virtual key is within a preset intensity range, and the second virtual key and the first virtual key are different virtual keys.
In one possible design, the first vibratory feedback element is any one of: piezoelectric ceramic plates, linear motors, or piezoelectric films.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to further perform the steps of: and acquiring a position type corresponding to the first contact point according to the first position information, wherein the position type comprises a first position area of the first contact point on the first virtual key and a second position area of the first contact point on the first virtual key, and the first position area and the second position area are different. The one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to specifically perform the steps of: according to the position type corresponding to the first contact point, a first feedback operation is performed through the touch screen 20, the feedback operation corresponding to the first position area being different from the feedback operation corresponding to the second position area.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to further perform the steps of: in response to the detected first gesture operation, selecting a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, wherein virtual keys included in different types of virtual keyboards in the plurality of types of virtual keyboards are not identical; the first type of virtual keyboard is displayed through the touch screen 20, and the position of the first type of virtual keyboard on the touch screen 20 is fixed during the displaying process of the first type of virtual keyboard. The one or more processors 10, when executing the one or more programs 401, cause the electronic device 1 to specifically perform the steps of: during the presentation of the first type of virtual keyboard, a first contact operation acting on the touch screen 20 is detected.
It should be noted that the content of information interaction, the execution process, and the like between the modules/elements in the electronic device 1 are based on the same concept as the method embodiments corresponding to fig. 9 to 13 in the present application; for specific content, refer to the descriptions in the foregoing method embodiments of the present application, and details are not described herein again.
Referring to fig. 15, fig. 15 is a schematic structural diagram of the electronic device provided in the embodiment of the present application. The electronic device 1 may be embodied as a mobile phone, a tablet, a notebook computer, or another device configured with a touch screen, which is not limited herein. The electronic device described in the embodiments corresponding to fig. 1 to 8 may be deployed on the electronic device 1, to implement the functions of the electronic device in the embodiments corresponding to fig. 9 to 13. Specifically, the electronic device 1 may vary widely due to configuration or performance differences, and may include one or more central processing units (CPUs) 1522 (for example, one or more processors), a memory 40, and one or more storage media 1530 (for example, one or more mass storage devices) storing applications 1542 or data 1544. The memory 40 and the storage medium 1530 may be transitory or persistent storage. The program stored on the storage medium 1530 may include one or more modules (not shown), and each module may include a series of instruction operations for the electronic device. Further, the central processing unit 1522 may be configured to communicate with the storage medium 1530 and execute, on the electronic device 1, the series of instruction operations in the storage medium 1530.
The electronic device 1 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input/output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, or FreeBSD™.
In this embodiment, the central processing unit 1522 is configured to implement functions of the electronic device in the embodiment corresponding to fig. 9 to 13. It should be noted that, for the specific implementation manner and the advantageous effects brought by the central processing unit 1522 executing the functions of the electronic device in the embodiments corresponding to fig. 9 to 13, reference may be made to the descriptions in each method embodiment corresponding to fig. 9 to 13, and details are not repeated here.
An embodiment of the present application further provides a computer-readable storage medium in which a program is stored. When the program runs on a computer, the computer is enabled to execute the steps executed by the electronic device in the methods described in the foregoing embodiments shown in fig. 9 to 13.
Embodiments of the present application further provide a computer program, which when run on a computer, causes the computer to perform the steps performed by the electronic device in the method described in the foregoing embodiments shown in fig. 9 to 13.
Further provided in an embodiment of the present application is a circuit system, which includes a processing circuit configured to execute the steps performed by the electronic device in the method described in the foregoing embodiments shown in fig. 9 to 13.
The electronic device provided by the embodiment of the application may specifically be a chip. The chip includes a processing unit and a communication unit; the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit. The processing unit may execute the computer-executable instructions stored in the storage unit, so that the chip performs the steps performed by the electronic device in the methods described in the foregoing embodiments shown in fig. 9 to 13. Optionally, the storage unit is a storage unit in the chip, such as a register or a cache; the storage unit may alternatively be a storage unit located outside the chip, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, or a random access memory (RAM).
Wherein any of the aforementioned processors may be a general purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits configured to control the execution of the programs of the method of the first aspect.
It should be noted that the above-described embodiments of the apparatus are merely schematic, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the embodiments of the apparatus provided in the present application, the connection relationship between the modules indicates that there is a communication connection therebetween, and may be implemented as one or more communication buses or signal lines.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus necessary general-purpose hardware, and certainly can also be implemented by dedicated hardware, including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and the like. Generally, any function performed by a computer program can easily be implemented by corresponding hardware, and the specific hardware structures used to implement the same function may be various, such as analog circuits, digital circuits, or dedicated circuits. However, for the present application, a software program implementation is generally preferable. Based on such understanding, the technical solutions of the present application may be substantially embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program.
The computer program includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
Embodiment two:
The method provided in the embodiments of the present application can be applied to various application scenarios in which input needs to be performed through a virtual keyboard. As an example, in an application program for text editing, content such as text, numbers, and characters needs to be input through a virtual keyboard. As another example, in an application program for creating presentation (PPT) files, text, numbers, characters, and the like may also need to be input through a virtual keyboard. As another example, in a game application, functions such as controlling a virtual character to move, modifying a character name, and performing instant messaging with a game friend may also need to be performed through a virtual keyboard. In all of the foregoing scenarios, there is a problem that the number of keys of the virtual keyboard is limited, so an additional physical keyboard needs to be provided to meet the input requirements of the user.
To solve the above problem, an embodiment of the present application provides a processing method for a virtual keyboard, applied to the electronic device shown in fig. 16. A plurality of types of virtual keyboards are configured in the electronic device, and the user can invoke different types of virtual keyboards by inputting different types of gestures. That is, the virtual keyboard is no longer limited to displaying only the 26 letters; instead, more virtual keys are provided to the user through the different types of virtual keyboards. This not only improves the flexibility with which the user invokes a virtual keyboard, but also provides richer virtual keys, so that no additional physical keyboard needs to be provided.
Referring to fig. 16, fig. 16 is a schematic diagram of an electronic device according to an embodiment of the present application. In some application scenarios, as shown in fig. 2, the electronic device 1 includes at least one display screen, and the display screen is a touch screen (that is, the touch screen 20 in fig. 2), so that the electronic device 1 can obtain various types of gesture operations input by the user through the display screen and display various types of virtual keyboards through the display screen.
In other application scenarios, as shown in fig. 16, the electronic device 2 may be a virtual reality (VR), augmented reality (AR), or mixed reality (MR) device. The electronic device 2 captures various types of gesture operations of the user through a camera configured on the head-mounted display device, and presents various types of virtual keyboards to the user through the head-mounted display device.
Based on the foregoing description, referring to fig. 17, fig. 17 is a schematic flowchart of a processing method of a virtual keyboard according to an embodiment of the present application. The processing method of a virtual keyboard provided in the embodiment of the present application may include the following steps:
1701. The electronic device detects a first gesture operation and acquires a first gesture parameter corresponding to the first gesture operation.
In the embodiment of the application, the electronic device can detect whether the user inputs the gesture operation in real time, and when the electronic device detects the first gesture operation input by the user, first gesture parameters corresponding to the first gesture operation are generated. Specifically, in some application scenarios, the electronic device is configured with a touch screen, and then the electronic device obtains a first gesture operation input by a user in real time through the touch screen. In other application scenarios, the electronic device may acquire a first gesture operation input by a user through a camera configured on the head display device, and further generate a first gesture parameter corresponding to the first gesture operation.
If the first gesture operation is acquired through a display screen configured on the electronic device, the first gesture parameter includes any one or more of the following: position information of the contact points corresponding to the first gesture operation, quantity information of the contact points corresponding to the first gesture operation, area information of the contact points corresponding to the first gesture operation, other types of parameter information, and the like. The foregoing describes the information included in the first gesture parameter. Because the first gesture parameter includes not only the position information of each contact point and the quantity information of the contact points but also the area information of each contact point, and the area information can be used to distinguish contact points triggered by the palm from contact points triggered by the fingers, the type of the first gesture operation can be estimated accurately, display of an incorrect virtual keyboard is avoided, and the accuracy of the virtual keyboard display process is improved.
Further, the position information of the contact points corresponding to the first gesture operation may be represented by coordinate information, a function, or other information. The origin of the coordinate system corresponding to the position information may be the center point of the display screen, the upper-left corner vertex of the display screen, the lower-left corner vertex of the display screen, the upper-right corner vertex of the display screen, the lower-right corner vertex of the display screen, or another position point, and the specific setting of the coordinate origin may be determined in combination with the actual application scenario.
Specifically, the display screen of the electronic device may be a touch screen, a contact sensing module may be configured in the touch screen, and the electronic device collects the first gesture parameter corresponding to the first gesture operation through the contact sensing module configured in the display screen. To understand the present embodiment more intuitively, please refer to fig. 18; fig. 18 is a schematic diagram of the first gesture parameter in the processing method of a virtual keyboard according to the embodiment of the present application. In fig. 18, the first gesture operation is a one-handed operation as an example. As shown in the figure, 4 contact points can be acquired on the display screen; the areas of the 3 contact points generated by fingers are relatively small, and the area of the remaining 1 contact point, generated by the palm, is relatively large.
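As a concrete illustration of the data involved, the first gesture parameter collected from a touch screen can be modeled as a list of contact points, each carrying a position and a contact area. The following is a minimal Python sketch; the `ContactPoint` and `GestureParams` names and the `(x, y, area)` tuple format are assumptions made for illustration, not part of the patent description.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ContactPoint:
    x: float      # position relative to the chosen coordinate origin
    y: float
    area: float   # contact area reported by the touch-sensing module


@dataclass
class GestureParams:
    points: List[ContactPoint]

    @property
    def count(self) -> int:
        # quantity information of the contact points
        return len(self.points)


def collect_first_gesture_params(raw_touches) -> GestureParams:
    """Build the first gesture parameter from raw (x, y, area) touch tuples."""
    return GestureParams([ContactPoint(x, y, a) for (x, y, a) in raw_touches])
```

A gesture with two finger touches would then yield a `GestureParams` whose `count` is 2, with each point's area available for the palm/finger distinction described above.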
If the electronic device is a virtual reality device, the virtual keyboard may be visually presented in a stereoscopic space. The electronic device can detect gesture operations in the space in real time, so that when the first gesture operation is detected, first gesture parameters corresponding to the first gesture operation are obtained.
Specifically, in one case, the electronic device may track the user's hand in real time via the user's handheld device or hand-worn device to monitor the user's first gesture operation. In another case, the electronic device includes a head-mounted display device, and the first gesture operation is captured by a camera configured in the head-mounted display device; the first gesture parameter may be embodied as an image corresponding to the first gesture operation, and the electronic device may input the image corresponding to the first gesture operation into a neural network for image recognition to generate the first gesture parameter corresponding to the first gesture operation.
1702. The electronic device generates first indication information according to the first gesture parameter.
In the embodiment of the application, after the electronic device acquires the first gesture parameter corresponding to the first gesture operation, secondary processing may be performed on the acquired first gesture parameter to generate first indication information corresponding to the first gesture parameter; the first indication information may also be regarded as a gesture parameter obtained through the secondary processing. The first indication information includes any one or more of the following (that is, the first gesture parameter indicates any one or more of the following): relative angle information of the hand corresponding to the first gesture operation, position information of the hand corresponding to the first gesture operation, quantity information of the hands corresponding to the first gesture operation, and shape information of the hand corresponding to the first gesture operation; the specific content of the first indication information may be flexibly set according to the actual application scenario. In the embodiment of the application, after the acquired first gesture parameter is subjected to secondary processing, information such as the relative angle information of the hand, the position information of the hand, the quantity information of the hands, or the shape information of the hand can be obtained; that is, richer information about the first gesture operation can be obtained based on the first gesture parameter, which improves the flexibility of the virtual keyboard matching process.
Specifically, in some application scenarios, if the first gesture parameter is acquired through the display screen of the electronic device, the relative angle information of the hand corresponding to the first gesture operation may include any one or more of the following: a relative angle between the hand corresponding to the first gesture operation and any one side of the display screen, a relative angle between the hand corresponding to the first gesture operation and a center line of the display screen, a relative angle between the hand corresponding to the first gesture operation and a diagonal line of the display screen, and the like, which are not limited herein.
More specifically, if the electronic device determines that the first gesture operation is a one-handed operation (that is, the number of hands corresponding to the first gesture operation is 1), the electronic device obtains at least two first contact points (that is, contact points generated by fingers) from the plurality of contact points corresponding to the first gesture operation, and connects the at least two first contact points, or connects the two first contact points that are farthest apart among the at least two first contact points, to generate a straight line corresponding to the first gesture operation. The electronic device then calculates the relative angle between the straight line and a preset line, where the preset line includes any one or more of the following: any one side of the display screen, the center line of the display screen, the diagonal line of the display screen, and the like, to obtain the relative angle information of the hand corresponding to the first gesture operation.
If the electronic device determines that the first gesture operation is a two-handed operation (that is, the number of hands corresponding to the first gesture operation is 2), the electronic device obtains at least two first contact points for each hand from the plurality of contact points corresponding to the first gesture operation, and connects the at least two first contact points corresponding to the left hand, or connects the two first contact points that are farthest apart among the at least two first contact points corresponding to the left hand, to generate a first straight line corresponding to the left hand; the electronic device likewise connects the at least two first contact points corresponding to the right hand, or connects the two first contact points that are farthest apart among the at least two first contact points corresponding to the right hand, to generate a second straight line corresponding to the right hand. The electronic device then calculates a first sub-angle between the first straight line and the preset line and a second sub-angle between the second straight line and the preset line, to obtain the relative angle information of the hands corresponding to the first gesture operation.
For a more intuitive understanding of the present solution, please refer to fig. 19; fig. 19 is a schematic diagram of the relative angle information in the processing method of a virtual keyboard according to the embodiment of the present application. In fig. 19, the first gesture operation is a two-handed operation as an example. Fig. 19 includes two sub-diagrams (a) and (b): the sub-diagram (a) of fig. 19 shows the positions of the contact points corresponding to the two-handed operation, and the sub-diagram (b) of fig. 19 takes the bottom edge of the display screen as the preset line. The two contact points farthest apart among the 4 contact points corresponding to the left hand are connected to form a first straight line, and the two contact points farthest apart among the 4 contact points corresponding to the right hand are connected to form a second straight line, so as to obtain the first sub-angle and the second sub-angle.
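The angle computation described above — connect the two farthest finger contact points and measure the angle of that line against a preset line — can be sketched as follows in Python, taking the bottom edge of the display screen (a horizontal line) as the preset line. The function name and the degree-based, 0–90° output range are illustrative assumptions.

```python
import math
from itertools import combinations


def relative_angle_to_bottom_edge(finger_points):
    """finger_points: list of (x, y) first contact points generated by fingers.
    Connects the two farthest points and returns the angle (degrees, 0..90)
    between that line and the horizontal bottom edge of the screen."""
    # pick the pair of points with the greatest mutual distance
    (x1, y1), (x2, y2) = max(combinations(finger_points, 2),
                             key=lambda pair: math.dist(pair[0], pair[1]))
    # angle of the connecting line relative to the horizontal preset line
    return math.degrees(math.atan2(abs(y2 - y1), abs(x2 - x1)))
```

For the two-handed case of fig. 19, the same function would be called once on the left hand's contact points (yielding the first sub-angle) and once on the right hand's (yielding the second sub-angle).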
Determination of the position information of the hand: the electronic device determines the number of hands corresponding to the first gesture operation according to the acquired first gesture parameter. If the first gesture operation is a two-handed operation, the position of the hands corresponding to the first gesture operation includes the distance between the two hands; if the first gesture operation is a one-handed operation, the position of the hand corresponding to the first gesture operation is either a first area or a fourth area. The first area is located at the lower left or lower right of the display screen, and the fourth area is the area of the display screen other than the first area. Further, the width of the first area may be 3 cm, 4 cm, or 5 cm, and the bottom edge of the first area coincides with the bottom edge of the display screen. To understand the present solution more intuitively, please refer to fig. 20; fig. 20 shows two schematic diagrams of the first area in the processing method of a virtual keyboard according to the embodiment of the present application. The sub-diagram (a) of fig. 20 and the sub-diagram (b) of fig. 20 respectively show two examples of the first area. It should be understood that the example in fig. 20 is merely provided for ease of understanding and is not intended to limit the present solution.
When the first gesture operation is a two-handed operation, the electronic device may determine the distance between the left index finger and the right index finger as the distance between the two hands; the closest distance between the left hand and the right hand may also be determined as the distance between the two hands; alternatively, the shapes of the left hand and the right hand may be generated from the plurality of contact points, and the distance between the left-hand boundary and the right-hand boundary may then be used, and so on; the manners of determining the distance between the two hands are not exhaustively listed here.
If the first gesture operation is a one-handed operation, the electronic device selects at least one first contact point from the plurality of contact points corresponding to the first gesture operation according to the first gesture parameter corresponding to the first gesture operation, and determines the position of the hand corresponding to the first gesture operation according to the position of the at least one first contact point. In one implementation, if all of the at least one first contact point are located in the first area, the position of the hand is determined to be the first area; if a first contact point outside the first area exists among the at least one first contact point, the position of the hand is determined to be the fourth area. In another implementation, if a first contact point located in the first area exists among the at least one first contact point, the position of the hand is determined to be the first area; if all of the at least one first contact point are located in the fourth area, the position of the hand is determined to be the fourth area.
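A minimal Python sketch of the first implementation above (the hand is in the first area only if every first contact point lies inside it), assuming a top-left coordinate origin with y growing downward, a first-area strip at the lower-left or lower-right corner, and illustrative strip dimensions in millimeters:

```python
def hand_region(finger_points, screen_w, screen_h, strip_w=40, strip_h=40):
    """Classify a one-handed gesture into the first area (a strip at the
    lower-left or lower-right corner whose bottom edge coincides with the
    screen's bottom edge) or the fourth area (everything else).
    finger_points: list of (x, y) first contact points; dimensions in mm.
    strip_w/strip_h are assumed illustrative values (~4 cm)."""
    def in_first_area(p):
        x, y = p
        in_corner_x = x <= strip_w or x >= screen_w - strip_w
        return in_corner_x and y >= screen_h - strip_h

    # first implementation: every first contact point must lie in the first area
    return "first" if all(in_first_area(p) for p in finger_points) else "fourth"
```

The second implementation described above would replace `all` with `any`, classifying the hand into the first area as soon as one contact point falls inside it.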
Determination of the number of hands: the first gesture operation acquired by the electronic device may be a one-handed operation or a two-handed operation, and the electronic device may determine the number of hands corresponding to the first gesture parameter according to the number of contact points and the position information of the contact points. In one implementation, the electronic device determines whether the number of the plurality of contact points is greater than or equal to a first value and whether two contact points whose distance is greater than a second distance threshold exist among the plurality of contact points. If the number of the plurality of contact points is greater than or equal to the first value and two contact points whose distance is greater than the second distance threshold exist among the plurality of contact points, the electronic device determines that the number of hands corresponding to the first gesture operation is 2; if the number of the contact points is smaller than the first value, or no two contact points whose distance is greater than the second distance threshold exist among the plurality of contact points, the electronic device determines that the number of hands corresponding to the first gesture operation is 1. The first value may be 2, 3, 4, 5, or another value, and may also be user-defined; the second distance threshold may be 22 mm, 25 mm, 26 mm, or another value, may also be user-defined, and may be determined in combination with the size of the display screen, the size of the user's hand, and the like, which is not limited herein.
In another implementation manner, the electronic device determines whether a first subset and a second subset exist in the plurality of contact points, determines that the number of hands corresponding to the first gesture operation is 2 if the first subset and the second subset exist, and determines that the number of hands corresponding to the first gesture operation is 1 if the first subset or the second subset does not exist. The number of the contact points in the first subset and the number of the contact points in the second subset are both greater than or equal to a first value, the distance between any two contact points in the first subset is smaller than a second distance threshold, the distance between any two contact points in the second subset is smaller than a second distance threshold, and the distance between any contact point in the first subset and any contact point in the second subset is greater than or equal to the second distance threshold.
For a more intuitive understanding, please refer to fig. 21; fig. 21 is a schematic diagram of the first gesture operation in the processing method of a virtual keyboard according to the embodiment of the present application. In fig. 21, the first value is 3 as an example, and fig. 21 includes two sub-diagrams (a) and (b). The sub-diagram (a) of fig. 21 shows a case where the number of hands corresponding to the first gesture operation is 1 (that is, the first gesture operation is a one-handed operation): the electronic device acquires 3 contact points, and the distances between the 3 contact points are all less than 25 mm. The sub-diagram (b) of fig. 21 shows a case where the number of hands corresponding to the first gesture operation is 2 (that is, the first gesture operation is a two-handed operation): the electronic device acquires 8 contact points (A1 through A8 in fig. 21, respectively). The contact points A7 and A8 are contact points generated by the palms, and A1 through A6 are contact points generated by the fingers; A1, A2, and A3 form a first subset, and A4, A5, and A6 form a second subset. The distances between the three contact points A1, A2, and A3 are all less than 25 mm, the distances between the three contact points A4, A5, and A6 are all less than 25 mm, and the distance between any contact point in the first subset and any contact point in the second subset is greater than 25 mm.
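The first implementation of the hand-count determination described above can be sketched as follows. The concrete first value (3) and second distance threshold (25 mm) follow the example of fig. 21; the function operates on finger contact points given as (x, y) tuples.

```python
import math


def number_of_hands(finger_points, first_value=3, second_dist=25.0):
    """Returns 2 iff the number of contact points is >= first_value AND some
    pair of contact points is farther apart than second_dist (mm); else 1."""
    n = len(finger_points)
    far_pair_exists = any(
        math.dist(p, q) > second_dist
        for i, p in enumerate(finger_points)
        for q in finger_points[i + 1:]
    )
    return 2 if (n >= first_value and far_pair_exists) else 1
```

For the sub-diagram (a) case (3 points all within 25 mm of each other) this yields 1, and for two clusters separated by more than 25 mm it yields 2, matching the example.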
Optionally, the electronic device may further divide the plurality of contact points corresponding to the first gesture operation into first contact points and second contact points according to the first gesture parameter corresponding to the first gesture operation, where a first contact point is generated by the user's finger and a second contact point is generated by the user's palm. The electronic device then determines whether the number of first contact points among the plurality of contact points is greater than or equal to the first value, and whether two contact points whose distance is greater than the second distance threshold exist among the first contact points, so as to determine the number of hands corresponding to the first gesture operation. Specifically, in one implementation, the electronic device may determine whether the area of each contact point is greater than or equal to a first area threshold; if the area of a contact point is greater than or equal to the first area threshold, the contact point is determined to be a second contact point (that is, a contact point generated by the palm), and if the area of a contact point is less than the first area threshold, the contact point is determined to be a first contact point (that is, a contact point generated by a finger). The first area threshold may be preset or user-defined, and may be determined in combination with factors such as the size of the user's hand, which is not limited herein. It should be noted that using the area of a contact point to determine whether the contact point is a first contact point or a second contact point is described here only for ease of understanding the feasibility of the present solution, and is not intended to limit the present solution.
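The area test in the implementation above amounts to a simple partition of the contact points. In the sketch below, the threshold value of 150 mm² is an assumed illustrative figure, since the description leaves the first area threshold to be preset or user-defined.

```python
def split_contacts_by_area(contacts, first_area_threshold=150.0):
    """contacts: list of (x, y, area) tuples.
    Returns (finger_points, palm_points): contacts whose area is below the
    first area threshold are treated as finger-generated first contact points,
    the rest as palm-generated second contact points."""
    fingers = [c for c in contacts if c[2] < first_area_threshold]
    palms = [c for c in contacts if c[2] >= first_area_threshold]
    return fingers, palms
```

The hand-count check would then be run on `fingers` only, so that large palm contacts (such as A7 and A8 in fig. 21) do not distort the result.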
Determination of the shape information of the hand: the first gesture operation may be a static gesture operation, and the shape information of the hand corresponding to the first gesture operation may specifically be a left hand, a right hand, two fingers, a fist, or other shape information. Optionally, if the first gesture operation is a dynamic sliding operation, the shape information of the hand may specifically be a "Z" shape, a hook shape, a circle shape, and the like, which are not exhaustively listed here. Specifically, if the number of the plurality of contact points acquired by the electronic device is two, it may be determined that the shape information corresponding to the first gesture operation is two fingers. For a more intuitive understanding of the present solution, please refer to fig. 22; fig. 22 is a schematic diagram of the first gesture operation in the processing method of a virtual keyboard according to the embodiment of the present application. Fig. 22 includes two sub-diagrams (a) and (b): the sub-diagram (a) of fig. 22 shows a first gesture operation consisting of a two-finger operation, and the sub-diagram (b) of fig. 22 shows the two contact points corresponding to the two-finger operation. It should be understood that the example in fig. 22 is merely provided for ease of understanding and is not intended to limit the present solution.
If the number of contact points acquired by the electronic device is greater than or equal to three, the electronic device needs to determine whether the number of hands corresponding to the first gesture operation is 1 or 2, and if it determines that the operation is one-handed, it further needs to determine, according to the acquired first gesture parameter, whether the hand corresponding to the first gesture operation is the left hand or the right hand. Specifically, in one implementation, if the plurality of contact points corresponding to the first gesture operation are all located on the left side of the center line of the display screen, the hand corresponding to the first gesture operation is the left hand; if they are all located on the right side of the center line of the display screen, the hand is the right hand. It should be noted that the manner of distinguishing the left hand from the right hand provided here is only for convenience of understanding the feasibility of the present solution and is not intended to limit the present solution.
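The centerline test just described can be sketched as below; this is a hedged illustration under the assumption that contacts are dicts carrying an `"x"` coordinate in screen units, which is not specified by the disclosure.

```python
def hand_side(finger_contacts, screen_width):
    """Left/right test from the description: all finger contacts left of the
    screen centerline imply the left hand, all right of it imply the right
    hand; anything else is left undecided."""
    centerline = screen_width / 2.0
    if all(c["x"] < centerline for c in finger_contacts):
        return "left"
    if all(c["x"] > centerline for c in finger_contacts):
        return "right"
    return "unknown"
```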
In other application scenarios, if the first gesture parameter is generated based on an image corresponding to the first gesture operation, the electronic device may input the image into a neural network for image recognition to directly generate the first indication information.
1703. The electronic device obtains a first rule.
In the embodiment of the application, the electronic device may be preconfigured with a first rule, the first rule indicates a correspondence between a plurality of types of gesture operations and a plurality of types of virtual keyboards, and the first type of virtual keyboard is one of the plurality of types of virtual keyboards. In one case, different types of virtual keyboards among the plurality of types have different functions, and the virtual keyboards with different functions may include any two or more of the following: a numeric keyboard, a function key keyboard, a full keyboard, and a custom keyboard, where the function key keyboard consists of function keys. In the embodiment of the application, since the different types of virtual keyboards have different functions, virtual keyboards with different functions can be provided for the user, which improves the flexibility of the user in using the virtual keyboard and enhances the user stickiness of the solution.
In another case, the different types of virtual keyboards may include a combination of any two or more of the following: mini-keyboard, numeric keyboard, functional key keyboard, circular keyboard, curved keyboard, custom keyboard, and full keyboard.
The first rule indicates the following information: in the case that the first gesture operation is a one-handed operation, the first type of virtual keyboard is any one of a mini keyboard, a numeric keyboard, a function key keyboard, a circular keyboard, an arc-shaped keyboard, and a custom keyboard, where the mini keyboard includes 26 letter keys, the functional keyboard is displayed within an application program, and the virtual keys of the functional keyboard correspond to functions of that application program. It should be noted that the mini keyboard, numeric keyboard, function key keyboard, circular keyboard, arc-shaped keyboard, and custom keyboard do not need to be configured in the same electronic device at the same time; the examples here only show that the virtual keyboard triggered by a one-handed operation in a given electronic device may be any one of them. In the case that the first gesture operation is a two-handed operation, the first type of virtual keyboard is a full keyboard including at least 26 letter keys, and the size of the full keyboard is larger than that of the mini keyboard. In the embodiment of the application, for both the case where the first gesture operation is a one-handed operation and the case where it is a two-handed operation, multiple concrete forms of the virtual keyboard displayed through the display screen are provided, which improves the implementation flexibility of the solution and expands its application scenarios.
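One simple way to picture the first rule is as a lookup table from gesture types to keyboard types. The sketch below is purely illustrative: the gesture labels and keyboard names are assumed identifiers for demonstration, not the identification information defined by this disclosure.

```python
# Assumed gesture-type labels mapped to assumed keyboard-type names.
FIRST_RULE = {
    "one_hand_left":         "functional_keyboard",
    "one_hand_right":        "numeric_keyboard",
    "one_hand_corner":       "function_key_keyboard",
    "one_hand_two_contacts": "circular_keyboard",
    "two_hands":             "full_keyboard",
}

def select_keyboard(gesture_type):
    """Return the first type of virtual keyboard for a recognized gesture
    type, or None if the gesture is not among the pre-stored types (in which
    case the flow would return to acquiring gesture parameters)."""
    return FIRST_RULE.get(gesture_type)
```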
Further, since the correspondence between gesture operations and different types of virtual keyboards in different electronic devices may be different, the same electronic device may include a combination of at least two of the following five items:
(I) In the case that the first gesture operation is a first one-handed operation, the first type of virtual keyboard is a mini keyboard.
The first one-handed operation may be a left-hand or a right-hand one-handed operation, and may be performed with or without a stylus pen held in the hand. For a more intuitive understanding of the present disclosure, please refer to fig. 23, which is a schematic diagram of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present application. In fig. 23, the first one-handed operation is illustrated with the user holding a stylus pen: the electronic device detects that the first gesture operation is the first one-handed operation, and the corresponding first type of virtual keyboard is a mini keyboard, which includes 26 letter keys and is smaller in size than a full keyboard.
In the embodiment of the application, in the case that the first gesture operation is a one-handed operation, the first type of virtual keyboard is a mini keyboard, which improves the flexibility of the user's letter input process.
(II) In the case that the first gesture operation is a right-hand one-handed operation, the first type of virtual keyboard is a numeric keyboard; in the case that the first gesture operation is a left-hand one-handed operation, the first type of virtual keyboard is a functional keyboard.
For example, if the first gesture operation is acquired from a game application, the functional keyboard may be a game keyboard configured with commonly used game keys. For another example, if the first gesture operation is acquired from a drawing application, the functional keyboard may be composed of commonly used keys of drawing software, which is not exhaustive here.
For a more intuitive understanding of the present disclosure, please refer to fig. 24 and fig. 25, which are two schematic diagrams of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present application. Fig. 24 and fig. 25 each include two sub-diagrams (a) and (b). Referring first to fig. 24, sub-diagram (a) shows the first gesture operation as a right-hand one-handed operation, and sub-diagram (b) shows the first type of virtual keyboard embodied as a numeric keyboard. Referring then to fig. 25, sub-diagram (a) shows the first gesture operation as a left-hand one-handed operation, and sub-diagram (b) shows the first type of virtual keyboard embodied as a functional keyboard (here, a designer keyboard). It should be understood that the examples in fig. 24 and fig. 25 are only for convenience of understanding the present solution and are not intended to limit the present solution.
In the embodiment of the application, in the case that the first gesture operation is a right-hand one-handed operation, the first type of virtual keyboard is a numeric keyboard, and in the case that it is a left-hand one-handed operation, the first type of virtual keyboard is a functional keyboard. This better matches the user's habits on a physical keyboard, reduces the difference between the virtual keyboard and the physical keyboard, and helps enhance user stickiness.
(III) In the case that the first gesture operation is a one-handed operation located in a first area of the display screen, the first type of virtual keyboard is a function key keyboard, where the first area is located at the lower left or lower right of the display screen; for the concept of the first area, refer to the description in step 202, which is not repeated here.
One or more function keys are shown in the function key keyboard, including but not limited to a Shift key, a Ctrl key, an Alt key, an Fn (abbreviation of function) key, a Delete key, and the like; the specific function keys included may be chosen according to the actual application scenario and are not limited here. The Fn key is a modifier key on a computer keyboard whose main function is to give keys a second meaning via key combinations, so that a compact keyboard layout can offer more keys. For a more intuitive understanding of the present disclosure, please refer to fig. 26 and fig. 27, which are two schematic diagrams of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present application. Fig. 26 and fig. 27 each include two sub-diagrams (a) and (b). Referring first to fig. 26, sub-diagram (a) shows the case where the first area is located at the lower left of the display screen; when the user places a single hand in the first area, sub-diagram (b) is triggered, that is, the first type of virtual keyboard is the function key keyboard. Continuing with fig. 27, sub-diagram (a) shows the case where the first area is located at the lower right of the display screen; when the user places a single hand in the first area, sub-diagram (b) is likewise triggered, that is, the first type of virtual keyboard is the function key keyboard. It should be understood that the examples in fig. 26 and fig. 27 are only for convenience of understanding the present solution and are not intended to limit the present solution.
In the embodiment of the application, since the function keys of a physical keyboard are arranged at its lower left and lower right, the first type of virtual keyboard is the function key keyboard in the case that the first gesture operation is a one-handed operation in the first area of the display screen. Because the trigger gesture matches the user's habit of using a physical keyboard, it is easy for the user to remember, which reduces the implementation difficulty of the solution and helps enhance user stickiness.
(IV) In the case that the first gesture operation is a one-handed operation with fewer than three contact points, the first type of virtual keyboard is a circular keyboard or an arc-shaped keyboard.
The circular keyboard refers to a keyboard with a circular shape, and the arc-shaped keyboard refers to a keyboard with an arc shape. Optionally, in the case that the first gesture operation is a one-handed operation with two contact points and the distance between the two contact points is greater than a third distance threshold, the first type of virtual keyboard is a circular keyboard or an arc-shaped keyboard; the value of the third distance threshold may be 58 mm, 60 mm, 62 mm, and so on, which is not exhaustive here. For a more intuitive understanding of the present disclosure, please refer to fig. 28, which is a schematic diagram of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present application. Fig. 28 includes two sub-diagrams (a) and (b): sub-diagram (a) shows a first gesture operation that is a one-handed operation with two contact points (i.e., fewer than three contact points), and sub-diagram (b) shows the first type of virtual keyboard as a circular keyboard. It should be understood that the example in fig. 28 is only for convenience of understanding the present solution and is not intended to limit the present solution.
In the embodiment of the application, when the first gesture operation is a one-handed operation with fewer than three contact points, a circular keyboard or an arc-shaped keyboard can be provided. In this way, both keyboards that exist on physical keyboards and keyboards that do not can be offered, which enriches the keyboard types, gives the user more choices, and further enhances the user's flexibility of selection.
(V) in the case where the first gesture operation is a two-handed operation, the first type of virtual keyboard is a full keyboard.
For a more intuitive understanding of the present disclosure, please refer to fig. 29, in which fig. 29 is a schematic diagram of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present disclosure. Fig. 29 represents that the virtual keyboard corresponding to the two-hand operation is a full keyboard, the full keyboard includes at least 26 letter keys, and comparing fig. 29 and fig. 23, it can be understood that the size of the full keyboard is larger than that of the mini keyboard, and it should be understood that the example in fig. 29 is only for convenience of understanding the scheme, and is not used to limit the scheme.
It should be noted that items (I) and (II) are incompatible with each other and thus cannot be configured in the same electronic device, while the other items may be combined with one another arbitrarily.
Still further, in one implementation, the first rule directly includes the correspondence between the plurality of types of gesture operations and the plurality of types of virtual keyboards, as illustrated in items (I) to (V) above. In this case, the first rule comprises a correspondence between a plurality of pieces of first identification information and a plurality of pieces of second identification information, where each piece of first identification information uniquely identifies one type of gesture operation and each piece of second identification information uniquely identifies one type of virtual keyboard.
In another implementation manner, the first rule includes correspondence between multiple sets of conditions and multiple types of virtual keyboards, where each set of conditions in the multiple sets of conditions corresponds to one type of gesture operation, that is, each set of conditions in the multiple sets of conditions is used to define one type of gesture operation.
Specifically, if the first gesture parameter is collected through the display screen, one set of conditions defining a one-handed operation may be that the number of contact points is greater than or equal to a first value and the distances between the contact points are all smaller than a second distance threshold; for the values of the first value and the second distance threshold, refer to the above description. Alternatively, one set of conditions defining a one-handed operation may be that the number of contact points whose area is smaller than a first area threshold is greater than or equal to the first value, and the distances between those contact points are all smaller than the second distance threshold.
One set of conditions defining a left-hand one-handed operation may be that the number of contact points is greater than or equal to the first value, the distances between the contact points are all smaller than the second distance threshold, and the contact points are all located to the left of the center line of the display screen; one set of conditions defining a right-hand one-handed operation may be that the number of contact points is greater than or equal to the first value, the distances between the contact points are all smaller than the second distance threshold, and the contact points are all located to the right of the center line of the display screen.
One set of conditions defining a one-handed operation in the first area may be that the number of contact points is greater than or equal to the first value, the distances between the contact points are all smaller than the second distance threshold, and the contact points are all located within the first area of the display screen; alternatively, it may be that the number of contact points is greater than or equal to the first value, the distances between the contact points are all smaller than the second distance threshold, and at least one of the contact points is located within the first area of the display screen, and so on.
One set of conditions defining a two-handed operation may be that the plurality of contact points includes a first subset and a second subset, the numbers of contact points in the first subset and the second subset are both greater than or equal to the first value, the distances between the contact points within the first subset are all smaller than the second distance threshold, the distances between the contact points within the second subset are all smaller than the second distance threshold, and the distance between any contact point in the first subset and any contact point in the second subset is greater than the second distance threshold. Alternatively, the set of conditions defining a two-handed operation may impose the same constraints only on the contact points whose area is smaller than the first area threshold, i.e., those contact points comprise such a first subset and second subset.
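The condition-set form of the first rule can be sketched as a distance-based grouping of finger contacts followed by the tests described above. This is a hedged illustration: the greedy grouping and the threshold names are assumptions for demonstration, not the disclosure's required procedure.

```python
import math

def clusters_by_distance(fingers, threshold):
    """Greedy grouping: a contact joins an existing cluster if it lies within
    `threshold` of any member, otherwise it starts a new cluster."""
    clusters = []
    for c in fingers:
        for cl in clusters:
            if any(math.hypot(c["x"] - o["x"], c["y"] - o["y"]) < threshold
                   for o in cl):
                cl.append(c)
                break
        else:
            clusters.append([c])
    return clusters

def classify_gesture(fingers, first_value, threshold):
    """Return 'one_hand', 'two_hands', or None per the condition sets: two
    well-populated, mutually distant clusters indicate two hands; a single
    cluster with enough contacts indicates one hand."""
    clusters = clusters_by_distance(fingers, threshold)
    big = [cl for cl in clusters if len(cl) >= first_value]
    if len(big) >= 2:
        return "two_hands"   # first subset and second subset, far apart
    if len(big) == 1 and len(clusters) == 1:
        return "one_hand"    # all contacts linked within the threshold
    return None
```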
It should be noted that, a plurality of sets of conditions for defining a plurality of types of gesture operations are provided above, but what types of gesture operations are configured in a specific electronic device and what defining conditions correspond to each type of gesture operations may be flexibly set in combination with an actual application scenario, and are not limited here.
Optionally, the first rule includes a first sub-rule, where the first sub-rule is obtained by performing a custom operation on at least one type of gesture operation and/or at least one type of virtual keyboard. In the embodiment of the application, the user can customize the trigger gesture and/or the type of the virtual keyboard, so that the display process of the virtual keyboard better matches the user's expectations, which further improves the user stickiness of the solution.
Specifically, the electronic device provides a "settings" function in which a first setting interface for the first rule is configured, so that the user can customize any one or more of the following through the first setting interface: gesture operations, virtual keyboards, and the correspondence between gesture operations and virtual keyboards. For a more intuitive understanding of the present disclosure, please refer to fig. 30 to fig. 32: fig. 30 and fig. 31 are schematic diagrams of the first setting interface in the processing method of the virtual keyboard according to the embodiment of the present application, and fig. 32 is a schematic diagram of a user-defined gesture operation in the processing method of the virtual keyboard according to the embodiment of the present application. Fig. 30 includes four sub-diagrams (a), (b), (c), and (d). Sub-diagram (a) shows the correspondence between multiple types of gesture operations and multiple types of virtual keyboards preconfigured in the electronic device: a one-handed operation triggers display of a numeric keyboard, a two-handed operation triggers display of a full keyboard, and a two-finger operation triggers display of a circular keyboard. When the user clicks "numeric keyboard" (one type of virtual keyboard), sub-diagram (b) is entered, that is, a custom operation is performed on the numeric keyboard. In sub-diagram (b), when the user performs a long-press operation, a double-click operation, a triple-click operation, or another type of contact operation on any key of the numeric keyboard (fig. 30 takes a long press on key 2 as an example), a delete icon (the "x" symbol shown in fig. 30) appears on some keys of the numeric keyboard, triggering entry into sub-diagram (c). In sub-diagram (c), a key bearing the "x" symbol is a deletable key, and the user can move a key's position within the numeric keyboard by long-pressing and dragging it. The foregoing key-deletion and key-moving operations may be performed multiple times; as shown in sub-diagram (d), the user deletes the keys other than the number keys 1 to 9 to complete the customization of the numeric keyboard. It should be noted that the example in fig. 30 is only for convenience of understanding the present disclosure; the deletion or movement of virtual keys may also be implemented through other operations, and although fig. 30 takes the numeric keyboard as an example, other types of virtual keyboards may be customized as well.
Continuing with fig. 31, which is described with reference to fig. 30: when the user clicks "one-handed operation" in sub-diagram (a) of fig. 30, sub-diagram (a) of fig. 31 is entered, in which an icon for inputting a custom gesture is shown. The user clicks the icon to enter sub-diagram (b) of fig. 31 and inputs the custom gesture based on its prompt, that is, the user inputs the "fist" gesture shown in sub-diagram (c) of fig. 31. In one implementation, the electronic device may preset a first duration threshold, which is a threshold on the total duration of inputting the custom gesture; when this threshold is reached, sub-diagram (d) of fig. 31 is entered. In another implementation, the electronic device may preconfigure a second duration threshold, which is a threshold on how long the user has stopped inputting the gesture; when the electronic device detects that the duration for which the user has stopped inputting reaches the second duration threshold, sub-diagram (d) of fig. 31 is entered. The ways of entering sub-diagram (d) of fig. 31 are not exhaustive here. In sub-diagram (d) of fig. 31, an icon indicating "confirm" and an icon indicating "re-input custom gesture" are displayed on the display screen. If the user clicks the "confirm" icon, the electronic device determines the gesture operation acquired through sub-diagram (c) of fig. 31 as custom gesture 1, updates the first rule by replacing the correspondence between the one-handed operation and the numeric keyboard with a correspondence between custom gesture 1 and the numeric keyboard, and displays sub-diagram (e) of fig. 31, that is, confirms custom gesture 1 as the trigger gesture of the numeric keyboard, completing the customization of the trigger gesture. In addition, sub-diagram (f) of fig. 31 shows the shape of custom gesture 1 acquired by the electronic device (that is, the "fist" shape). It should be understood that the example in fig. 31 is only for convenience of understanding the present solution and is not intended to limit it; the user may also set a custom gesture of another shape, which is not limited here.
Continuing with fig. 32, which is described with reference to fig. 31: the user starts to input a custom gesture based on the prompt in sub-diagram (b) of fig. 31, entering sub-diagram (a) and then sub-diagram (b) of fig. 32. Fig. 32 takes as an example a dynamic custom gesture in which the hand opens after making a fist. When the electronic device determines that the user has completed inputting the custom gesture, it may enter sub-diagram (d) of fig. 31; for the subsequent steps, refer to the description of fig. 31, which is not repeated here.
1704. The electronic device judges, according to the first rule, whether the first gesture operation is included in the prestored multiple types of gesture operations; if the first gesture operation is one of the prestored types, step 1705 is performed, and if it is not, step 1701 is re-entered.
In this embodiment, if the first rule includes the correspondence between multiple types of gesture operations and multiple types of virtual keyboards, the electronic device needs to generate the first indication information through step 1702, where the first indication information needs to include the number of hands corresponding to the first gesture operation, the position of the hand, and the shape of the hand; based on the first indication information, the electronic device may determine whether the first gesture operation is one of the multiple types of gesture operations preconfigured in the electronic device.
If the first rule includes multiple sets of conditions, after the first gesture parameter corresponding to the first gesture operation is acquired in step 1701, the electronic device may directly determine whether the first gesture operation satisfies any one of the multiple sets of conditions included in the first rule, and for the description of the multiple sets of conditions, refer to the description in step 1703 above.
1705. The electronic device presents the virtual keyboard of the first type through the display screen.
In this embodiment, after determining, according to the first rule, that the first gesture operation is a target type of gesture operation among the multiple types prestored in the electronic device, the electronic device may acquire the first type of virtual keyboard corresponding to the target type (that is, the first type of virtual keyboard corresponding to the first gesture operation) and then display it through the display screen. In the embodiment of the application, the electronic device is preconfigured with the first rule indicating the correspondence between the multiple types of gesture operations and the multiple types of virtual keyboards, so that after a first gesture operation acting on the display screen is detected, the first type of virtual keyboard corresponding to that specific gesture operation can be obtained according to the first rule, which improves the efficiency of the virtual keyboard matching process.
Specifically, in one implementation, during the displaying process of the first type of virtual keyboard, the position of the first type of virtual keyboard on the display screen is fixed. In another implementation, during presentation of the first type of virtual keyboard, the position of the first type of virtual keyboard on the display screen may move as the user's hand moves.
In another implementation, the multiple types of virtual keyboards are divided into a third subset and a fourth subset, each including at least one type of virtual keyboard; each type of virtual keyboard in the third subset remains in a fixed position during display, while each type of virtual keyboard in the fourth subset can move with the movement of the user's hand during display. That is, some types of virtual keyboards remain in a fixed position while displayed, and the other types of virtual keyboards move with the user's hand while displayed.
Optionally, if the first type of virtual keyboard is a mini keyboard, a numeric keyboard, or a functional keyboard, it may move with the movement of the user's hand; that is, the fourth subset includes any one or more of the following: a mini keyboard, a numeric keyboard, and a functional keyboard. If the first type of virtual keyboard is a circular keyboard, an arc-shaped keyboard, or a full keyboard, its position may remain fixed during display; that is, the third subset includes any one or more of the following: a circular keyboard, an arc-shaped keyboard, and a full keyboard.
Further, for a virtual keyboard that moves with the movement of the user's hand, when the user wants to disable this movement function, a second gesture operation may be input through the display screen; the second gesture operation may be a double-click operation, a triple-click operation, a single-click operation, or another operation, which is not limited here.
Initial display position of the virtual keyboard. The initial display position of the first type of virtual keyboard may be preset, or may be determined by the electronic device based on the position of the fingers. As one example, if the first type of virtual keyboard is a numeric keyboard, the key corresponding to the number 5 may be placed below the index finger when the numeric keyboard is initially displayed; as another example, if the first type of virtual keyboard is a mini keyboard, its initial display position may be directly under the hand. The examples here are only for convenience of understanding the present solution and are not intended to limit it.
Display size of the virtual keyboard. In one implementation, the size of each type of virtual keyboard in the electronic device is fixed. In another implementation, the same type of virtual keyboard may have different sizes to accommodate different finger or hand sizes; specifically, at least two different sizes may be prestored for the same type of virtual keyboard, together with a correspondence between contact-point sizes and keyboard sizes, so that after the first type of virtual keyboard is determined, the target size corresponding to the size of the contact points can be obtained and the first type of virtual keyboard can be displayed at the target size.
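The size-selection lookup can be sketched as below. The keyboard name, the contact-area breakpoints, and the size labels are all assumptions for illustration; the disclosure only requires that prestored sizes be matched to contact-point sizes.

```python
# Assumed table: per keyboard type, (lower bound, upper bound, size label)
# keyed by contact-point area in assumed mm^2.
SIZE_TABLE = {
    "numeric_keyboard": [(0.0, 80.0, "small"),
                         (80.0, 140.0, "medium"),
                         (140.0, float("inf"), "large")],
}

def target_size(keyboard_type, contact_area):
    """Pick the prestored target size whose contact-area range contains the
    measured contact-point size."""
    for lo, hi, size in SIZE_TABLE.get(keyboard_type, []):
        if lo <= contact_area < hi:
            return size
    return "medium"  # assumed fallback when no range matches
```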
Optionally, before the first type of virtual keyboard is displayed on the display screen, the electronic device may further obtain a first angle according to the first indication information generated in step 1702, where the first angle indicates the relative angle between the hand in the first gesture corresponding to the first gesture operation and an edge of the display screen, or the relative angle between that hand and the center line of the display screen. Step 1705 may then include: the electronic device acquires a first display angle of the first type of virtual keyboard according to the first angle, and displays the first type of virtual keyboard at the first display angle through the display screen, where the first display angle indicates the relative angle between an edge of the first type of virtual keyboard and an edge of the display screen, or between an edge of the first type of virtual keyboard and the center line of the display screen.
Specifically, in one implementation, the electronic device determines whether the first angle is greater than or equal to a preset angle threshold; if so, the electronic device acquires the first display angle and displays the first type of virtual keyboard according to the first display angle through the display screen. The value of the preset angle threshold may be 25 degrees, 28 degrees, 30 degrees, 32 degrees, 35 degrees, or another value, and is not limited here.
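The threshold rule above can be condensed into a few lines. This is a minimal sketch under assumptions: the function name is invented, the threshold of 30 degrees is one of the example values the text lists, and the convention "0 means parallel to the screen edge" is assumed.

```python
# Minimal sketch of the first implementation: the keyboard is rotated to
# match the hand only when the hand's relative angle reaches the threshold.

ANGLE_THRESHOLD_DEG = 30  # the text lists 25/28/30/32/35 degrees as examples

def display_angle(first_angle_deg):
    """Angle at which to render the keyboard's bottom edge, relative to
    the display-screen edge (0 = parallel to the screen edge)."""
    if abs(first_angle_deg) >= ANGLE_THRESHOLD_DEG:
        # At or above the threshold: match the hand's placement angle.
        return first_angle_deg
    return 0  # below the threshold, keep the keyboard unrotated
```

For a split full keyboard, the same function would be applied once per hand to obtain the display angles of the first and second sub-keyboards.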
Further, if the first type of virtual keyboard is a full keyboard, the first angle includes a relative angle of a left hand and a relative angle of a right hand, the full keyboard is split into a first sub-keyboard and a second sub-keyboard, the first sub-keyboard and the second sub-keyboard include different virtual keys in the full keyboard, and the first display angle includes a display angle of the first sub-keyboard and a display angle of the second sub-keyboard. If the first angle indicates a relative angle between the hand and the edge of the display screen in a first gesture corresponding to the first gesture operation, the first display angle indicates a relative angle between the bottom edge of the virtual keyboard and the edge of the display screen; further, the display angle of the first sub-keyboard indicates a relative angle between an edge of the first sub-keyboard and an edge of the display screen, and the display angle of the second sub-keyboard indicates a relative angle between an edge of the second sub-keyboard and an edge of the display screen. If the first angle indicates a relative angle between a hand and a center line of the display screen in a first gesture corresponding to the first gesture operation, the first display angle indicates a relative angle between a bottom edge of the virtual keyboard and the center line of the display screen; further, the display angle of the first sub-keyboard indicates a relative angle between an edge of the first sub-keyboard and a center line of the display screen, and the display angle of the second sub-keyboard indicates a relative angle between an edge of the second sub-keyboard and the center line of the display screen.
For a more intuitive understanding of the present invention, please refer to fig. 33, in which fig. 33 is a schematic diagram of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present invention. In fig. 33, taking the value of the preset angle threshold as 30 as an example, fig. 33 includes three sub-schematic diagrams (a), (b), and (c), where the sub-schematic diagram (a) in fig. 33 represents 8 first contact points corresponding to two-hand operation (one kind of the first gesture operation), and the sub-schematic diagram (b) in fig. 33 respectively shows a first sub-angle formed by a first straight line and a bottom edge of the display screen (i.e., a relative angle of the left hand) and a second sub-angle formed by a second straight line and the bottom edge of the display screen (i.e., a relative angle of the right hand), and the first sub-angle and the second sub-angle are both 32 degrees. The sub-diagram (c) of fig. 33 represents the presentation of the first type of virtual keyboard at a first presentation angle through the display screen, it being understood that the example in fig. 33 is only for ease of understanding the present solution and is not intended to limit the present solution.
If the first type of virtual keyboard is a mini keyboard, a numeric keyboard, a functional keyboard or a functional key keyboard, the first angle is an angle of a single hand, and the first display angle is a relative angle of the whole virtual keyboard.
In another implementation manner, after obtaining the first angle, the electronic device determines the first display angle of the first type of virtual keyboard to be the first angle, and displays the first type of virtual keyboard according to the first angle through the display screen, wherein if the first angle indicates a relative angle between the hand and an edge of the display screen in the first gesture corresponding to the first gesture operation, the first display angle indicates a relative angle between the bottom edge of the virtual keyboard and the edge of the display screen; if the first angle indicates a relative angle between the hand and the center line of the display screen in the first gesture corresponding to the first gesture operation, the first display angle indicates a relative angle between the bottom edge of the virtual keyboard and the center line of the display screen.
In the embodiment of the application, the relative angle (that is, the first angle) between the user's hand and the edge or the center line of the display screen is obtained, and the display angle of the virtual keyboard is determined according to the first angle, so that the display angle of the keyboard better matches the placement angle of the user's hand, making the process of the user performing input by using the virtual keyboard more comfortable and convenient.
Optionally, if the electronic device determines that the first gesture parameter corresponds to a two-hand operation, that is, the first type of virtual keyboard is a full keyboard, the electronic device may further obtain the distance between the two hands and determine whether that distance is greater than or equal to a first distance threshold. If the distance between the two hands is less than or equal to the first distance threshold, the full keyboard is displayed in an integrated manner through the display screen; if the distance between the two hands is greater than the first distance threshold, a first sub-keyboard is displayed in a second area of the display screen and a second sub-keyboard is displayed in a third area of the display screen, where the second area and the third area are different areas of the display screen, and the first sub-keyboard and the second sub-keyboard include different virtual keys of the full keyboard. The first distance threshold may be 70 mm, 75 mm, 80 mm, or another value, and is not limited here.
For a more intuitive understanding of the present disclosure, please refer to fig. 34, where fig. 34 is a schematic diagram of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present disclosure. Taking the value of the first distance threshold as 75 mm as an example in fig. 34, fig. 34 includes two sub-diagrams (a) and (b), where the sub-diagram (a) in fig. 34 represents that the distance between two hands in the two-hand operation is 80 mm, and since 80 mm is greater than 75 mm, the sub-diagram (b) in fig. 34 represents that the first sub-keyboard is respectively displayed in the second area of the display screen, and the second sub-keyboard is displayed in the third area of the display screen, it should be understood that the example in fig. 34 is only for convenience of understanding the present solution, and is not used to limit the present solution.
In the embodiment of the application, whether the integrated virtual keyboard or the split virtual keyboard is displayed can be determined based on the distance between the user's two hands, which further improves the flexibility of the virtual-keyboard display process, makes the displayed virtual keyboard more convenient for the user to use, and further improves the user stickiness of the solution.
Further optionally, if the electronic device determines that the first gesture parameter corresponds to a two-hand operation, that is, the first type of virtual keyboard is a full keyboard, the electronic device may further obtain the distance between the two hands and determine whether that distance is smaller than a fourth distance threshold. If the distance between the two hands is smaller than the fourth distance threshold, the electronic device displays prompt information to instruct the user to adjust the distance between the two hands, and/or directly displays the full keyboard in an integrated manner; optionally, the electronic device displays a full keyboard of the minimum size in an integrated manner. The prompt information may be a text prompt, a voice prompt, a vibration prompt, or another type of prompt; the display modes of the prompt information are not exhaustively listed here.
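The two distance checks (the first and fourth distance thresholds) can be combined into one layout decision, sketched below. The first threshold of 75 mm is one of the text's example values; the fourth threshold of 20 mm and all names are assumptions, since the text does not give a concrete value for it.

```python
# Sketch of the full-keyboard layout decision driven by the distance
# between the two hands. 75 mm is an example value from the text; the
# fourth threshold of 20 mm is an assumed value for illustration.

FIRST_DISTANCE_MM = 75    # above this, split into two sub-keyboards
FOURTH_DISTANCE_MM = 20   # below this (assumed), the hands are too close

def full_keyboard_layout(hand_distance_mm):
    """Return (layout, prompt) for a two-hand gesture."""
    if hand_distance_mm < FOURTH_DISTANCE_MM:
        # Hands too close: prompt the user and fall back to one keyboard.
        return ("integrated", "prompt: hands too close")
    if hand_distance_mm <= FIRST_DISTANCE_MM:
        return ("integrated", None)
    # Distance large enough: first and second sub-keyboards in the
    # second and third areas of the display screen.
    return ("split", None)
```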
For a more intuitive understanding of the present disclosure, please refer to fig. 35, in which fig. 35 is a schematic diagram of a first type of virtual keyboard in the processing method of a virtual keyboard according to the embodiment of the present disclosure. Fig. 35 includes two sub-diagrams (a) and (b), where the sub-diagram (a) of fig. 35 represents that the distance between the two hands in the two-hand operation is 0 mm. Since the distance between the two hands is too small, B1 in the sub-diagram (b) of fig. 35 represents the prompt information, prompting the user that the two hands are too close, and the full keyboard is displayed in an integrated manner. It should be understood that the example in fig. 35 is only for convenience of understanding the present solution and is not used to limit the present solution.
Optionally, the display screen is further configured with a plurality of vibration feedback elements. If the position of the first type of virtual keyboard on the display screen is fixed during its display, then after the electronic device displays the first type of virtual keyboard through the display screen, the electronic device may further detect a first contact operation acting on the display screen and, in response to the first contact operation, obtain first position information of a first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard. If the first virtual key is an anchor point key, the electronic device acquires a first vibration feedback element from the plurality of vibration feedback elements, where the first vibration feedback element is the vibration feedback element matched with the first virtual key, and instructs the first vibration feedback element to emit a vibration wave so as to execute a first feedback operation, where the first feedback operation is used to prompt that the first virtual key is an anchor point key. It should be noted that, for the meanings of various terms, specific implementation manners of steps, and the beneficial effects brought by the foregoing, reference may be made to the description in the first embodiment, and details are not repeated here. The purpose of setting anchor point keys is to help the user remember their positions and thereby achieve touch typing on various types of virtual keyboards, so which virtual keys on each type of virtual keyboard serve as anchor point keys can be flexibly configured.
To further understand the present solution, anchor point keys in each type of virtual keyboard are exemplified below in conjunction with the various types of virtual keyboards shown above. As an example, if the first type of virtual keyboard is the numeric keyboard shown in fig. 24, the anchor point key may be the virtual key corresponding to the number "5". As another example, if the first type of virtual keyboard is the function key keyboard shown in fig. 26, the anchor point keys may be the Ctrl key and the Shift key. It should be understood that these examples are only to facilitate understanding of the concept that anchor point keys exist in various types of virtual keyboards; which virtual keys are set as anchor point keys in each type of virtual keyboard may be flexibly set by a developer in combination with an actual application scenario, or may be set by the user in a user-defined manner, and is not limited here.
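The contact-to-feedback path described above can be sketched as follows. The anchor-key table, function names, and element interface are assumptions invented for the example; the real mapping from contact position to key and from key to vibration feedback element is implementation-specific.

```python
# Illustrative sketch of the anchor-key feedback path: a contact point is
# mapped to a virtual key; if that key is an anchor point key, the matching
# vibration feedback element is driven. All names are assumptions.

ANCHOR_KEYS = {
    "numeric": {"5"},               # e.g. the "5" key of the numeric keyboard
    "function_key": {"Ctrl", "Shift"},
}

def on_contact(keyboard_type, key_at, feedback_element_for, contact_xy):
    """key_at: (x, y) -> key label; feedback_element_for: key -> element.
    Returns True if a first feedback operation was executed."""
    key = key_at(contact_xy)        # first position info -> first virtual key
    if key in ANCHOR_KEYS.get(keyboard_type, set()):
        element = feedback_element_for(key)   # first vibration feedback element
        element.emit_vibration_wave()         # first feedback operation
        return True
    return False
```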
1706. The electronic equipment acquires a contact operation aiming at a first virtual key in the function key keyboard.
In some embodiments of the application, the first type of virtual keyboard displayed by the electronic device is a function key keyboard, and the electronic device may further acquire a contact operation for one or more first virtual keys in the function key keyboard, where the contact operation may be a pressing operation or a touching operation. For example, the first virtual key may be a Ctrl key, and may also include the Ctrl key and a Shift key, and the like, which is not limited herein.
1707. In response to the contact operation, the electronic device highlights a second virtual key on the display screen, where the second virtual key is a key other than the first virtual key in a combined shortcut key.
In some embodiments of the present application, the electronic device highlights at least one second virtual key on the display screen in response to the contact operation. Each second virtual key among the at least one second virtual key can form a shortcut together with the first virtual key; that is, the second virtual keys are the keys other than the first virtual key in the combined shortcut key. Highlighting includes, but is not limited to, highlighting, bolding, flashing, and the like, and is not limited here. For example, in an application program such as a drawing application, the combination of the Ctrl key + Shift key + I key can provide a function of displaying the image currently being processed with inverted colors; in this case the first virtual key includes the Ctrl key and the Shift key, and the second virtual key is the virtual key I. "Displaying the currently processed image with inverted colors" refers to changing each color of the currently processed image to its complementary color. It should be understood that this example is only for convenience of understanding the present solution and is not intended to limit the present solution.
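The highlighting step can be sketched as a lookup from the currently pressed modifier keys to the keys that would complete a registered shortcut. The shortcut table below is illustrative (drawn from the examples in this section and fig. 37), not an exhaustive mapping.

```python
# Sketch: given the set of pressed first virtual keys, return every
# second virtual key that completes a combined shortcut, together with
# its function, so the UI can highlight it and show the function label.

SHORTCUTS = {
    frozenset({"Ctrl"}): {"S": "save", "C": "copy", "V": "paste"},
    frozenset({"Ctrl", "Shift"}): {"I": "invert image colours"},
}

def keys_to_highlight(pressed_keys):
    """Map the current modifier combination to {second_key: function}."""
    return dict(SHORTCUTS.get(frozenset(pressed_keys), {}))
```

With this table, pressing Ctrl+Shift highlights the I key (the invert-colors example), while pressing Ctrl alone highlights the save/copy/paste keys, matching the behaviour shown in fig. 36 and fig. 37.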
For a more intuitive understanding of the present disclosure, please refer to fig. 36, where fig. 36 is a schematic diagram of a second virtual key in the processing method of the virtual keyboard according to the embodiment of the present disclosure. Fig. 36 is a diagram of the current application as an example of a drawing-type application, where fig. 36 includes four sub-diagrams (a), (b), (c), and (d), and the sub-diagram (a) of fig. 36 represents a keyboard with function keys displayed on a display screen. The sub-diagram (b) of fig. 36 represents that the user performs pressing operation on the Ctrl key and the Shift key, so as to trigger the electronic device to highlight the key on which the letter I is located on the display screen. The sub-diagram (c) in fig. 36 represents that the user clicks the key where the letter I is located, so as to trigger the sub-diagram (d) in fig. 36 to be entered, that is, the currently displayed image is displayed in a reversed phase, it should be understood that the example in fig. 36 is only for convenience of understanding the present solution, and is not used to limit the present solution.
Optionally, in response to the contact operation, the electronic device not only highlights the second virtual keys on the display screen but also displays the function of the shortcut key corresponding to each second virtual key.
For a more intuitive understanding, please refer to fig. 37, where fig. 37 is a schematic diagram of a second virtual key in the processing method of the virtual keyboard according to the embodiment of the present application. Fig. 37 illustrates an example in which the current application is a document presentation application and a virtual keyboard is displayed on a presentation interface of a document in a floating manner, where fig. 37 includes three sub-diagrams (a), (b), and (c), and the sub-diagram (a) in fig. 37 represents a keyboard with function keys displayed on a display screen. The sub-diagram (b) in fig. 37 represents that the user performs a pressing operation on the Ctrl key, so as to trigger entering the sub-diagram (c) in fig. 37, that is, the electronic device displays a plurality of second virtual keys on the display screen in a highlighted manner, and further displays functions of a shortcut key corresponding to each second virtual key, which are respectively used for starting 5 shortcut functions of saving (save), cutting (corresponding to the scissors icon in fig. 37), copying (copy), pasting, and inserting (insert).
In the embodiment of the application, while the function key keyboard is displayed on the display screen, a contact operation for the first virtual key in the function key keyboard is acquired, and the second virtual key is highlighted on the display screen in response to the contact operation, where the second virtual key is a key other than the first virtual key in a combined shortcut key. Because the function key keyboard occupies a small area, the area required to display the virtual keyboard is reduced; and because the second virtual keys of the combined shortcut keys are automatically displayed when the user performs a contact operation on the first virtual key, the user's need for shortcut keys is met while waste of the display area of the display screen is avoided.
To understand the present solution more intuitively, please refer to fig. 38, which is a schematic flow diagram of a processing method of a virtual keyboard provided in an embodiment of the present application, taking application to a text-editing application program as an example. Fig. 38 includes four sub-diagrams (a), (b), (c), and (d). In the sub-diagram (a) of fig. 38, the electronic device acquires a first gesture parameter corresponding to a two-hand operation, acquires the full keyboard corresponding to the two-hand operation according to the first rule and that first gesture parameter, and displays the full keyboard through the display screen; the user inputs the content "main ingredient low-gluten flour:". In the sub-diagram (b) of fig. 38, the electronic device detects that the user lifts one hand and stops displaying the full keyboard on the display screen; the electronic device then acquires the first gesture parameter corresponding to a right-hand one-handed operation, acquires the numeric keyboard corresponding to the right-hand one-handed operation according to the first rule and that first gesture parameter, and displays the numeric keyboard through the display screen. As shown in the sub-diagram (c) of fig. 38, the numeric keyboard is displayed below the user's hand, and the user inputs the content "145" through the numeric keyboard. As shown in the sub-diagram (d) of fig. 38, during the display of the numeric keypad, the electronic device detects that the user's hand moves over the display screen, obtains the movement track of the hand, and controls the numeric keypad to move along with the user's hand; when the user inputs a double-click operation through the display screen, the position of the numeric keypad is fixed. It should be noted that the example of fig. 38 is only for convenience of understanding how to switch among multiple types of virtual keyboards, and is not used to limit the present solution.
In the embodiment of the application, a plurality of different types of virtual keyboards are configured in the electronic device, and the virtual keys included in the different types of virtual keyboards are not identical. The user can invoke the different types of virtual keyboards through different gesture operations; that is, the virtual keyboard is no longer limited to displaying only 26 letters, and more virtual keys are provided to the user through the different types of virtual keyboards. This improves the flexibility of the process of invoking the virtual keyboard, provides richer virtual keys, and eliminates the need for an additional physical keyboard.
On the basis of the embodiments corresponding to fig. 1 to fig. 38, in order to better implement the above-mentioned solution of the embodiment of the present application, the following also provides related equipment for implementing the above-mentioned solution. Referring to fig. 39, fig. 39 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 1 comprises a display screen 50, a memory 40, one or more processors 10, and one or more programs 401, the display screen 50 in fig. 39 may be the same element as the touch screen 20 in fig. 1 to 29, the one or more programs 401 are stored in the memory 40, and the one or more processors 10, when executing the one or more programs 401, cause the electronic device to perform the following steps: in response to the detected first gesture operation, selecting a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, wherein virtual keys included in different types of virtual keyboards in the plurality of types of virtual keyboards are not identical; a first type of virtual keyboard is presented via the display screen 50.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to perform the following steps: according to a first rule, a first type of virtual keyboard corresponding to a first gesture operation is selected from the multiple types of virtual keyboards, and the first rule indicates the corresponding relation between the multiple types of gesture operations and the multiple types of virtual keyboards.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to further perform the steps of: acquiring a first gesture parameter corresponding to the first gesture operation, wherein the first gesture parameter comprises any one or more of the following parameters: position information of the contact points corresponding to the first gesture operation, number information of the contact points corresponding to the first gesture operation, area information of the contact points corresponding to the first gesture operation, relative angle information of the hand corresponding to the first gesture operation, position information of the hand corresponding to the first gesture operation, number information of the hand corresponding to the first gesture operation, and shape information of the hand corresponding to the first gesture operation. The one or more processors 10, when executing the one or more programs 401, cause the electronic device to perform the following steps: and selecting a first type of virtual keyboard from the plurality of types of virtual keyboards according to the first gesture parameter.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to further perform the steps of: in response to the first gesture operation, a first angle indicating a relative angle between a hand corresponding to the first gesture operation and a side of the display screen 50 or a relative angle between the hand corresponding to the first gesture operation and a center line of the display screen 50 is acquired. The one or more processors 10, when executing the one or more programs 401, cause the electronic device to perform the following steps: according to the first angle, a display angle of the first type of virtual keyboard is obtained, and the first type of virtual keyboard is displayed through the display screen 50 according to the display angle, wherein the display angle indicates a relative angle between an edge of the first type of virtual keyboard and an edge of the display screen 50, or the display angle indicates a relative angle between the edge of the first type of virtual keyboard and a center line of the display screen 50.
In one possible design, the different types of virtual keyboards in the plurality of types of virtual keyboards differ in functionality, the different-functionality virtual keyboards including a combination of any two or more of the following: the keyboard comprises a numeric keyboard, a function key keyboard, a full keyboard and a self-defined keyboard, wherein the function key keyboard consists of function keys.
In one possible design, where the first gesture operation is a one-handed operation, the first type of virtual keyboard is any one of the following: a mini keyboard, a numeric keyboard, a function key keyboard, a circular keyboard, an arc-shaped keyboard, or a custom keyboard, where the mini keyboard includes 26 letter keys, the functional keyboard is displayed in an application, and the virtual keys of the functional keyboard correspond to functions of the application.
In one possible design, where the first gesture operation is a two-handed operation, the virtual keyboard of the first type is a full keyboard comprising at least 26 letter keys; the one or more processors 10, when executing the one or more programs 401, cause the electronic device to perform the following steps: displaying the full keyboard in an integrated manner through the display screen 50 under the condition that the distance between the two hands is less than or equal to the first distance threshold; under the condition that the distance between the two hands is larger than the first distance threshold value, the first sub-keyboard is displayed through the second area of the display screen 50, and the second sub-keyboard is displayed through the third area of the display screen 50, wherein the second area and the third area are different areas in the display screen 50, and the first sub-keyboard and the second sub-keyboard comprise different virtual keys in the full keyboard.
In one possible design, the one-handed operation includes a left-handed one-handed operation and a right-handed one-handed operation; under the condition that the first gesture operation is right-hand one-hand operation, the first type of virtual keyboard is a numeric keyboard; in the case where the first gesture operation is a left-handed one-handed operation, the first type of virtual keyboard is a functional keyboard.
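The "first rule" correspondence described in these designs, including the left-hand/right-hand distinction above, can be condensed into a small mapping. The rule table below is illustrative, not the patent's exhaustive mapping; the function and type names are assumptions.

```python
# Condensed sketch of the "first rule": gesture parameters -> keyboard type.
# The concrete table is an illustrative assumption based on this section.

def select_keyboard(hand_count, handedness=None):
    """Pick the first type of virtual keyboard from the gesture parameters."""
    if hand_count == 2:
        return "full"        # two-hand operation -> full keyboard
    if handedness == "right":
        return "numeric"     # right-hand one-handed operation -> numeric keyboard
    if handedness == "left":
        return "function"    # left-hand one-handed operation -> functional keyboard
    return "mini"            # fallback for other one-handed gestures
```

In a full implementation, the other first gesture parameters (contact-point count, contact area, hand shape, and so on) listed earlier would also feed into this selection.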
In one possible design, the display screen 50 is configured with a plurality of vibration feedback elements, during the displaying of the first type of virtual keyboard, the position of the first type of virtual keyboard on the display screen 50 is fixed, and the one or more processors 10, when executing the one or more programs 401, cause the electronic device to further perform the following steps: detecting a first touch operation acting on the display screen 50; responding to the first contact operation, and acquiring first position information of a first contact point corresponding to the first contact operation, wherein the first position information corresponds to a first virtual key on a virtual keyboard; under the condition that the first virtual key is an anchor point key, acquiring a first vibration feedback element from a plurality of vibration feedback elements, wherein the first vibration feedback element is a vibration feedback element matched with the first virtual key; and indicating the first vibration feedback element to send out vibration waves so as to execute a first feedback operation, wherein the first feedback operation is used for prompting that the first virtual key is an anchor point key.
It should be noted that, the contents of information interaction, execution process, and the like between the modules/units in the electronic device 1 are based on the same concept as that of the method embodiments corresponding to fig. 17 to fig. 38 in the present application, and specific contents may refer to the description in the foregoing method embodiments of the present application, and are not described herein again.
Referring to fig. 40, fig. 40 is a schematic structural diagram of the electronic device provided in the embodiment of the present application, and the electronic device 1 may be embodied as a mobile phone, a tablet, a notebook, or other device configured with a display screen, which is not limited herein. The electronic device 1 may be disposed with the electronic device described in the embodiment corresponding to fig. 39, and is configured to implement the functions of the electronic device in the embodiments corresponding to fig. 17 to fig. 38. In particular, electronic device 1 may vary widely due to configuration or performance differences and may include one or more Central Processing Units (CPUs) 1522 (e.g., one or more processors) and memory 40, one or more storage media 1530 (e.g., one or more mass storage devices) storing applications 1542 or data 1544. Memory 40 and storage media 1530 may be, among other things, transient or persistent storage. The program stored on the storage medium 1530 may include one or more modules (not shown), each of which may include a sequence of instruction operations for the electronic device. Further, the central processor 1522 may be configured to communicate with the storage medium 1530, and execute a series of instruction operations in the storage medium 1530 on the electronic device 1.
Electronic device 1 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input-output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so forth.
In this embodiment, the central processing unit 1522 is configured to implement functions of the electronic device in the embodiment corresponding to fig. 17 to fig. 38. It should be noted that, for the specific implementation manner and the advantageous effects brought by the central processing unit 1522 executing the functions of the electronic device in the embodiments corresponding to fig. 17 to fig. 38, reference may be made to the descriptions in each method embodiment corresponding to fig. 17 to fig. 38, and details are not repeated here.
Also provided in an embodiment of the present application is a computer-readable storage medium in which a program is stored; when the program runs on a computer, the computer is caused to execute the steps executed by the electronic device in the methods described in the embodiments shown in fig. 17 to fig. 38.
In the embodiments of the present application, a computer program is further provided, which, when running on a computer, causes the computer to execute the steps performed by the electronic device in the method described in the foregoing embodiments shown in fig. 17 to fig. 38.
Further provided in an embodiment of the present application is a circuit system, which includes a processing circuit configured to execute the steps performed by the electronic device in the method described in the foregoing embodiments shown in fig. 17 to fig. 38.
The electronic device provided by the embodiment of the application can be specifically a chip, and the chip comprises: a processing unit, which may be for example a processor, and a communication unit, which may be for example an input/output interface, a pin or a circuit, etc. The processing unit can execute the computer-executable instructions stored in the storage unit to make the chip execute the steps executed by the electronic device in the method described in the foregoing embodiments shown in fig. 17 to 38. Optionally, the storage unit is a storage unit in the chip, such as a register, a cache, and the like, and the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a Random Access Memory (RAM), and the like.
Wherein any of the aforementioned processors may be a general purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits configured to control the execution of the programs of the method of the first aspect.
It should be noted that the above-described embodiments of the apparatus are merely illustrative, where the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the embodiments of the apparatus provided in the present application, the connection relationship between the modules indicates that there is a communication connection therebetween, which may be specifically implemented as one or more communication buses or signal lines.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus the necessary general-purpose hardware, and certainly can also be implemented by special-purpose hardware including application-specific integrated circuits, special-purpose CPUs, special-purpose memories, special-purpose components, and the like. Generally, any function performed by a computer program can easily be implemented by corresponding hardware, and the specific hardware structures used to implement the same function may be various, such as analog circuits, digital circuits, or dedicated circuits. For the present application, however, implementation by a software program is usually preferable. Based on such understanding, the technical solutions of the present application may be substantially embodied in the form of a software product, which is stored in a readable storage medium, such as a floppy disk, a USB disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk of a computer, and which includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program.
The computer program includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application occur, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
Example three:
Please refer to fig. 41, where fig. 41 is a schematic structural diagram of an electronic device according to an embodiment of the present application; the processing method of an application interface provided in the embodiment of the present application is applicable to the electronic device shown in fig. 41. The electronic device comprises a first display screen 501 and a second display screen 502. The first display screen 501 differs from the second display screen 502 in that the second display screen 502 is a display screen for acquiring the handwriting input of the user, while the first display screen 501 is not. The second display screen 502 is a touch screen and needs to provide both the function of receiving input and the function of displaying output. It should be understood that fig. 41 only takes as an example an electronic device including one first display screen 501 and one second display screen 502; in practice, an electronic device may also include at least two first display screens 501, or at least two second display screens 502, and so on. The number of first display screens 501 and second display screens 502 included in a specific electronic device may be determined according to the actual application scenario, and is not limited here.
In one implementation manner, the electronic device presets, among the at least two display screens it includes, a display screen for acquiring the handwriting input of the user (i.e., the second display screen) and a display screen not used for acquiring the handwriting input of the user (i.e., the first display screen), so that the user can place the preset second display screen at a position convenient for handwriting.
In another implementation, the electronic device determines the display screen for acquiring the handwriting input of the user (i.e., the second display screen) and the display screen not used for acquiring the handwriting input of the user (i.e., the first display screen) according to the placement direction of each of the at least two display screens it includes. Specifically, the electronic device may obtain the included angle between the placement plane of each of the at least two display screens and the horizontal direction, and then select the one display screen having the smallest included angle with the horizontal direction as the second display screen 502, using the remaining display screens as the first display screens 501. Alternatively, the electronic device may select, from the at least two display screens, each display screen whose included angle with the horizontal direction is smaller than a first angle threshold as a second display screen 502, and use the remaining display screens as the first display screens 501; the first angle threshold may be 25 degrees, 30 degrees, 40 degrees, or another value, which is not exhaustively listed here.
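The angle-based screen-role assignment just described can be sketched as follows. This is an illustrative reconstruction, not patent text: the function name, the list-based screen representation, and the concrete threshold value of 30 degrees are all assumptions.

```python
# Hypothetical sketch: assign handwriting-input (second) vs. display-only
# (first) roles from each screen's angle to the horizontal, in degrees.
FIRST_ANGLE_THRESHOLD = 30  # assumed; the patent also mentions 25, 40, etc.

def assign_screen_roles(screen_angles):
    """Return (second_screens, first_screens) as lists of screen indices.
    Screens lying near-horizontal (angle below the threshold) become
    second display screens; if none qualifies, fall back to the single
    most horizontal screen, matching the 'smallest included angle' rule."""
    second = [i for i, a in enumerate(screen_angles) if a < FIRST_ANGLE_THRESHOLD]
    if not second:
        second = [min(range(len(screen_angles)), key=lambda i: screen_angles[i])]
    first = [i for i in range(len(screen_angles)) if i not in second]
    return second, first
```

For example, with a near-vertical lid at 80 degrees and a near-flat base at 10 degrees, the base screen is assigned the handwriting-input role.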
Further, in one case, the first display screen 501 and the second display screen 502 may be separate screens, connected through a data interface or through a bus. In another case, the first display screen 501 and the second display screen 502 are integrated into one flexible screen, and the first display screen 501 and the second display screen 502 are two different areas of that flexible screen.
Optionally, the electronic device may further include an electronic pen, which may specifically employ an electronic pen of electromagnetic resonance (EMR) technology, an electronic pen of active electrostatic induction (AES) technology, or another type of electronic pen, and the like, which is not limited herein.
Based on the electronic device shown in fig. 41, an application scenario of the embodiment of the present application is described below. As an example, in an application scenario where a student takes notes in class, the student may need to switch to the input mode of handwriting input to copy down a schematic on the blackboard while typing notes through a virtual keyboard (i.e., while the current application interface uses the input mode of keyboard input). As another example, while browsing a novel or a picture (i.e., while the current application interface is in browsing mode), the user may need the input mode of handwriting input to add annotations to the novel or the picture. As another example, while writing a report through a virtual keyboard (i.e., while the current application interface uses the input mode of keyboard input), the user may suddenly have an idea to sketch with a pen (i.e., the current application interface needs to use the input mode of handwriting input). The application scenarios of the embodiments of the present application are not exhaustively listed here. In all the above scenarios, the handwriting input process suffers from complicated operation.
In order to solve the above problem, an embodiment of the present application provides a method for processing an application interface, applied to the electronic device shown in fig. 41. The electronic device displays a first application interface through the first display screen; when it is detected that the mode type corresponding to the first application interface is handwriting input, in response to the input mode of handwriting input, the electronic device triggers displaying of the first application interface on the second display screen, so as to obtain handwritten content for the first application interface through the second display screen. That is, when it is detected that the current mode type corresponding to the first application interface is the input mode of handwriting input, the electronic device automatically displays the first application interface on the second display screen, so that handwritten content for the first application interface is acquired directly through the second display screen. No copying, pasting, or similar steps need to be performed; the conversion from other modes to handwriting input can be completed directly, which avoids the complex steps and greatly improves the efficiency of handwriting input.
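A minimal sketch of this routing behavior, assuming a plain string-based model of mode types and screen contents (all names below are illustrative, not from the patent):

```python
# Hypothetical sketch: route the first application interface according to its
# detected mode type. 'handwriting' moves the interface to the second display
# screen so handwritten content can be acquired there directly, with no
# copy/paste; 'keyboard' keeps the interface on the first screen and shows a
# virtual keyboard on the second.

def route_interface(mode_type):
    if mode_type == "handwriting":
        return {"first_screen": None, "second_screen": "first_app_interface"}
    return {"first_screen": "first_app_interface",
            "second_screen": "virtual_keyboard"}
```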
Further, in some application scenarios an application interface can only switch between the input mode of keyboard input and the input mode of handwriting input, while in other application scenarios an application interface can only switch between the input mode of keyboard input and the browsing mode. The specific implementation flows differ between these two application scenarios, and they are described separately below.
Firstly, switching between an input mode of keyboard input and an input mode of handwriting input.
In the embodiment of the present application, please refer to fig. 42, where fig. 42 is a schematic flowchart of a processing method of an application interface provided in the embodiment of the present application, and the processing method of the application interface provided in the embodiment of the present application may include:
4201. The electronic device acquires a start operation for the first application interface.
In this embodiment of the application, the electronic device acquires a start operation for a first application interface. A target application (APP) may include one or more application interfaces, and the first application interface refers to any one of the at least one application interface included in the target APP; that is, the first application interface may be the interface that appears when the target APP is opened, or a new interface opened in the process of using the target APP.
Specifically, step 4201 may include: the electronic device acquires the start operation for the first application interface through the first display screen, or the electronic device acquires the start operation for the first application interface through the second display screen. Further, the electronic device may acquire the start operation for the first application interface through an electronic pen, a mouse, or a finger.
4202. The electronic device determines a mode type corresponding to the first application interface based on the start operation.
In some embodiments of the application, the electronic device determines a mode type corresponding to the first application interface based on the acquired start operation, where the mode type corresponding to the first application interface is handwriting input or keyboard input.
In one implementation, the electronic device determines the mode type corresponding to the first application interface according to the acquisition location corresponding to the start operation. Specifically, when the start operation is acquired through the first display screen, this indicates that the user tends to display the first application interface on the first display screen, and the electronic device determines that the mode type corresponding to the first application interface is keyboard input; that is, the initial mode type of the first application interface is keyboard input. When the start operation is acquired through the second display screen, this indicates that the user tends to use the first application interface on the second display screen, and the electronic device determines that the mode type corresponding to the first application interface is handwriting input; that is, the initial mode type of the first application interface is handwriting input. For the difference between the first display screen and the second display screen, reference may be made to the above description of fig. 41, which is not repeated here.
In the embodiment of the application, the mode type corresponding to the first application interface is determined based on the position of the starting operation acquired by the electronic equipment, and the method is simple to operate and easy to implement.
In another implementation manner, the electronic device determines the mode type corresponding to the first application interface according to the start manner corresponding to the start operation. Specifically, the electronic device determines that the mode type corresponding to the first application interface is handwriting input when the start operation is acquired through the electronic pen. As an example, if the electronic device detects that the user clicks the application icon of the target application program with the electronic pen to open the first application interface, the electronic device may determine that the mode type corresponding to the first application interface is handwriting input. When the start operation is acquired through a mouse or a finger, the electronic device determines that the mode type corresponding to the first application interface is keyboard input.
In the embodiment of the application, another implementation manner for determining the mode type corresponding to the first application interface based on the starting operation is provided, which is beneficial to improving the implementation flexibility of the scheme, and is simple to operate and easy to implement.
In another implementation manner, the electronic device may determine the mode type corresponding to the first application interface according to the acquisition position corresponding to the starting operation and the starting manner corresponding to the starting operation. Specifically, in a case where the start operation is acquired through the electronic pen, or in a case where the start operation is acquired through the second display screen, the electronic device determines the mode type corresponding to the first application interface as the handwriting input. And under the condition that the starting operation is acquired through a mouse or a finger and the starting operation is acquired through the first display screen, the electronic equipment determines the mode type corresponding to the first application interface as keyboard input.
In one case, in a case where the start operation is acquired through the electronic pen and the start operation is acquired through the second display screen, the electronic device determines a mode type corresponding to the first application interface as the handwriting input. And if the starting operation is acquired through a mouse or a finger, or if the starting operation is acquired through the first display screen, the electronic equipment determines the mode type corresponding to the first application interface as the keyboard input.
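The variants above (location-based, device-based, and the two combined rules) can be condensed into one illustrative helper. The parameter names and string values are assumptions made for this sketch; the patent itself describes the rules only in prose.

```python
# Hypothetical sketch of the initial mode-type decision.
# combine="or"  : pen OR second screen  -> handwriting (first combined variant)
# combine="and" : pen AND second screen -> handwriting (stricter variant)

def initial_mode(input_device, source_screen, combine="or"):
    pen = input_device == "pen"            # vs. "mouse" / "finger"
    on_second = source_screen == "second"  # vs. "first"
    if combine == "and":
        return "handwriting" if (pen and on_second) else "keyboard"
    return "handwriting" if (pen or on_second) else "keyboard"
```

Under the stricter variant, opening the interface with the pen on the first display screen still yields keyboard input, whereas the default variant already treats pen use alone as handwriting input.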
It should be noted that the electronic device may also determine the initial mode type of the first application interface by other means, which are not listed here.
4203. The electronic device determines whether the mode type corresponding to the first application interface is handwriting input. If the mode type corresponding to the first application interface is keyboard input, the process proceeds to step 4204; if the mode type corresponding to the first application interface is handwriting input, the process proceeds to step 4211.
4204. In response to the input mode of keyboard input, the electronic device triggers displaying of the first application interface on the first display screen and displays a virtual keyboard on the second display screen.
In the embodiment of the application, when the electronic device determines that the mode type corresponding to the first application interface is keyboard input rather than handwriting input, the electronic device, in response to the input mode of keyboard input, triggers displaying of the first application interface on the first display screen and displays the virtual keyboard on the second display screen, so that the input content for the first application interface is acquired through the virtual keyboard on the second display screen.
Further, if a receiving interface for the handwriting input of another application interface is also displayed on the second display screen while the virtual keyboard is displayed there, then in one implementation the electronic device may provide, at the top or bottom of the second display screen, an open icon corresponding one-to-one to each such application, so as to switch between the virtual keyboard and the other application interfaces displayed on the second display screen. For a more intuitive understanding of the present disclosure, please refer to fig. 43, which is an interface schematic diagram of the display interface of the second display screen in the processing method of an application interface provided in the embodiment of the present application. Fig. 43 shows an open icon of application interface 1, an open icon of application interface 2, and the display interface of the virtual keyboard (corresponding to the first application interface shown on the first display screen). The user can switch between the virtual keyboard and application interface 1 by clicking the open icon of application interface 1, and between the virtual keyboard and application interface 2 by clicking the open icon of application interface 2. It should be understood that the example in fig. 43 is only for facilitating understanding of the present solution and is not used to limit the present solution.
In another implementation manner, the electronic device may also be configured with a collapse/expand icon on the display interface of the virtual keyboard on the second display screen: when the user clicks the collapse icon with an electronic pen, a finger, a mouse, or the like, the virtual keyboard displayed on the second display screen is collapsed; when the user clicks the expand icon with an electronic pen, a finger, a mouse, or the like, the virtual keyboard displayed on the second display screen is expanded. In yet another implementation manner, the user may switch between the virtual keyboard displayed on the second display screen and other application interfaces by inputting a sliding operation on the second display screen, where the sliding operation may be a sliding operation in the left-right direction, a sliding operation in the up-down direction, and the like. The electronic device may also switch between the virtual keyboard and other application interfaces in other manners, which are not exhaustively listed here.
For a more intuitive understanding of the present embodiment, please refer to fig. 44 and fig. 45, which are two schematic diagrams of the processing method of an application interface according to an embodiment of the present application. Referring to fig. 44, fig. 44 includes two sub-schematic diagrams (a) and (b). In sub-diagram (a) of fig. 44, the electronic device obtains an opening operation for the target application program (i.e., the "note" application in the diagram) through the first display screen; since the opening operation is input through the first display screen, the electronic device determines that the mode type corresponding to the first application interface is keyboard input. The electronic device then enters sub-diagram (b) of fig. 44, displaying the first application interface (i.e., the initial application interface of the "note" application) on the first display screen and displaying the virtual keyboard and the touchpad area on the second display screen.
With continuing reference to fig. 45, fig. 45 includes two sub-schematic diagrams (a) and (b). In sub-diagram (a) of fig. 45, the electronic device obtains the opening operation for the target application (i.e., the "note" application in the illustration) through the first display screen; since the start operation is acquired through a finger, the electronic device determines that the mode type corresponding to the first application interface is keyboard input. The electronic device then enters sub-diagram (b) of fig. 45: it presents the first application interface on the first display screen and displays the virtual keyboard and the touchpad area on the second display screen. It should be noted that the second display screen in fig. 44 and fig. 45 may also display only the virtual keyboard without showing the touchpad area. It should be understood that the examples in fig. 44 and fig. 45 are only for convenience of understanding the present solution and are not intended to limit the present solution.
Optionally, step 4204 may include: and displaying the virtual keyboard and the application control bar on the second display screen. The method may further comprise: the electronic equipment detects a second operation acting on the second display screen; and changing the first display area of the application control bar into a second display area in response to the second operation, and changing a first control key group included in the application control bar into a second control key group, wherein the first control key group and the second control key group are control key sets corresponding to the target application. The specific meanings of the terms in the foregoing steps and the specific implementation manners of the foregoing steps will be described in the following fourth embodiment, which will not be described herein again.
Optionally, the first application interface includes a first control key, and step 4204 may include: and displaying the virtual keyboard and the application control bar on the second display screen. The method may further comprise: the electronic equipment detects a second operation on the first target application interface; and responding to the second operation, displaying the first control key in the application control bar, and hiding the first control key in the first application interface. The specific meanings of the terms in the foregoing steps and the specific implementation manners of the foregoing steps will be described in the following fourth embodiment, which will not be described herein again.
Optionally, step 4204 may include: the electronic device presents a second type of virtual keyboard (which may also be referred to as a default type of virtual keyboard) on a second display screen. The method further comprises the following steps: the electronic equipment detects a first gesture operation acting on the second display screen; selecting a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards in response to the first gesture operation, wherein virtual keys included in different types of virtual keyboards in the plurality of types of virtual keyboards are not identical; and displaying the first type of virtual keyboard through the second display screen, wherein the first type of virtual keyboard and the second type of virtual keyboard are different types of virtual keyboards in the multiple types of virtual keyboards. That is, after the electronic device displays the virtual keyboard of the second type on the second display screen, the user may input different gesture operations to change the type of the virtual keyboard displayed on the second display screen. The meanings of the terms of the first gesture operation, the different types of virtual keyboards, and the like, and the specific implementation of the foregoing steps can be referred to the description in the second embodiment, which is not described herein again.
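The gesture-to-keyboard-type selection in the optional step above could look like the following sketch. The gesture names and keyboard-type labels are invented for illustration; the patent defers the actual definitions of the first gesture operation and the keyboard types to the second embodiment.

```python
# Hypothetical mapping from a detected first gesture operation to one of
# several virtual keyboard types. Unknown gestures keep the default
# ('second type') keyboard that was shown initially.

DEFAULT_KEYBOARD = "default_keyboard"

GESTURE_TO_KEYBOARD = {
    "two_hand_placement": "full_size_keyboard",
    "one_hand_placement": "compact_keyboard",
    "three_finger_swipe": "numeric_keyboard",
}

def keyboard_for_gesture(gesture):
    return GESTURE_TO_KEYBOARD.get(gesture, DEFAULT_KEYBOARD)
```

The point of the lookup-table design is the constraint stated above: different keyboard types contain non-identical virtual key sets, so each recognized gesture maps to exactly one type.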
4205. The electronic equipment acquires a mode type corresponding to the first application interface.
In the embodiment of the application, after the electronic device opens the first application interface, that is, during the running of the first application interface, the electronic device may further detect and acquire the mode type corresponding to the first application interface in real time, so as to determine whether that mode type changes. Specifically, if the electronic device detects a first operation, the mode type corresponding to the first application interface is converted into handwriting input in response to the first operation; if the electronic device does not detect the first operation, the mode type corresponding to the first application interface remains keyboard input, and the electronic device continues to detect and acquire the mode type corresponding to the first application interface.
More specifically, in one implementation, the electronic device determines the mode type corresponding to the first application interface according to the user's holding posture on the electronic pen. In one case, a first preset condition is stored in the electronic device in advance. The electronic device can acquire the user's holding posture on the electronic pen in real time and judge whether that holding posture satisfies the first preset condition. When the holding posture satisfies the first preset condition, the electronic device determines that the first operation of the user has been detected, and the mode type corresponding to the first application interface is thus converted into the input mode of handwriting input; when the holding posture does not satisfy the first preset condition, the electronic device determines that the mode type corresponding to the first application interface is the input mode of keyboard input.
The holding posture includes any one or a combination of the following: the holding position, the holding force, the holding angle, or other holding-related factors, which are not limited here. The first preset condition includes any one or a combination of the following conditions: the holding position is within a first position range, the holding force is within a first force range, the holding angle is within a first angle range, or other preset conditions, and the like.
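A sketch of checking the first preset condition against the grip parameters reported by the pen. The patent only requires each parameter to fall within a preset range, so the concrete range values below are illustrative assumptions, as are the function and constant names.

```python
# Hypothetical ranges forming the first preset condition.
FIRST_POSITION_RANGE = (10.0, 40.0)  # holding position, mm from the pen tip
FIRST_FORCE_RANGE    = (0.5, 5.0)    # holding force, newtons
FIRST_ANGLE_RANGE    = (30.0, 70.0)  # pen tilt, degrees

def satisfies_first_preset_condition(position_mm, force_n, tilt_deg):
    """True when the grip is judged to be a writing grip, i.e. the first
    operation is considered detected and the mode type should be converted
    to handwriting input; all three parameters must lie in their ranges."""
    return (FIRST_POSITION_RANGE[0] <= position_mm <= FIRST_POSITION_RANGE[1]
            and FIRST_FORCE_RANGE[0] <= force_n <= FIRST_FORCE_RANGE[1]
            and FIRST_ANGLE_RANGE[0] <= tilt_deg <= FIRST_ANGLE_RANGE[1])
```

In practice the ranges would come from the enrollment or calibration step described further below rather than being hard-coded.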
In the embodiment of the present application, the electronic pen may be used for operations other than writing; for example, some operations usually performed with a mouse, such as a sliding operation or a selecting operation, may be performed with the electronic pen. Alternatively, the user may simply be holding the electronic pen unconsciously rather than wanting to perform a writing operation, and so on; the possibilities are not exhaustively listed here. The electronic device therefore does not crudely decide that the mode type corresponding to the first application interface is the writing mode whenever the user uses the electronic pen, but further determines the mode type corresponding to the first application interface according to the user's holding posture on the electronic pen. This reduces the error rate in judging the mode type corresponding to the first application interface and the probability of mistakenly moving the first application interface, avoids wasting computer resources, and is beneficial to improving user stickiness.
Further, the electronic pen may be provided together with the electronic device. After the user takes the electronic pen out of the electronic device, a communication interface may be established between the electronic pen and the electronic device; the electronic pen may then acquire, in real time, the holding parameters corresponding to the holding posture and send them to the electronic device, so that the electronic device can judge whether the user's holding posture on the electronic pen satisfies the first preset condition. The holding parameters include any one or a combination of the following: the position of the contact point corresponding to the holding operation, the holding force, the inclination angle of the electronic pen, or other parameters.
The electronic pen may be provided with a contact sensing module. The contact sensing module of the electronic pen acquires in real time the position of each contact point between the user and the electronic pen (that is, it determines the user's holding position on the electronic pen) and sends the position of each contact point to the electronic device, and the electronic device judges, according to the positions of the contact points, whether the user's holding position on the electronic pen lies within the first position range. The contact sensing module may be embodied as a contact sensing film, which may be a capacitive contact sensing film, a pressure-type contact sensing film, a temperature-type contact sensing film, or another type of film, and so on; these are not exhaustively listed.
The electronic pen may be provided with a pressure sensing module. The pressure sensing module of the electronic pen collects in real time the user's holding force on the electronic pen and sends it to the electronic device, and the electronic device judges whether the user's holding force on the electronic pen is within the first force range. The pressure sensing module may be embodied as a pressure sensing diaphragm, a distributed pressure sensor, or in other forms, which are not exhaustively listed here.
The electronic pen may be provided with an angle measurement module. The angle measurement module of the electronic pen collects in real time the inclination angle of the electronic pen (that is, it determines the user's holding angle on the electronic pen) and sends the inclination angle to the electronic device, and the electronic device judges whether the user's holding angle on the electronic pen is within the first angle range. The angle measurement module may be embodied as a gyroscope or another type of angle measurement module, which is not limited here.
Further, in one implementation, the electronic device may record in advance the holding posture used by the user during handwriting input with the electronic pen, and then determine the first preset condition according to the recorded holding posture; optionally, the electronic device may further collect the holding posture while the user is writing with the electronic pen, that is, collect the positions of the contact points between the user's fingers and the electronic pen, the holding force of the user, the inclination angle of the electronic pen, and the like, so as to adjust the first preset condition. In another implementation, the first preset condition in the electronic device may be preset.
To understand the present embodiment more intuitively, please refer to fig. 46, where fig. 46 is a schematic diagram illustrating various holding postures in the processing method of the application interface according to the embodiment of the present application. Fig. 46 shows six sub-diagrams (a) to (f): sub-diagrams (a) to (d) of fig. 46 show four holding postures of the user when writing with the electronic pen, and sub-diagrams (e) and (f) of fig. 46 show two postures in which the user holds the electronic pen but is not writing. It should be understood that the example in fig. 46 is only for convenience of understanding the concept of the user's holding posture on the electronic pen and is not intended to limit the present solution.
In another implementation manner, the electronic device may set, on the first application interface or the display interface of the virtual keyboard, trigger icons corresponding one-to-one to the keyboard-input mode and the handwriting-input mode. When the user clicks the handwriting-input icon on the first application interface, the electronic device can obtain a trigger instruction for handwriting input, that is, the electronic device detects the first operation of the user; when the user clicks the keyboard-input icon on the first application interface, the electronic device can obtain a trigger instruction for keyboard input. Alternatively, the electronic device may be provided, on the first application interface, with a switching icon for switching between the keyboard-input mode and the handwriting-input mode: when the switching icon is in the first state, this is regarded as the user inputting a trigger operation for handwriting input; when the switching icon is in the second state, this is regarded as the user inputting a trigger operation for keyboard input; the manners in which the electronic device acquires a trigger instruction for handwriting input are not exhaustively listed here. The electronic device, in response to the trigger instruction for handwriting input, determines that the mode type corresponding to the first application interface is the handwriting-input mode.
To understand the present solution more intuitively, please refer to fig. 47 and fig. 48, where fig. 47 is an interface schematic diagram of a first application interface in a processing method of an application interface provided in the embodiment of the present application, and fig. 48 is two interface schematic diagrams of the first application interface in the processing method of an application interface provided in the embodiment of the present application. Referring to fig. 47, two icons are arranged on the first application interface, C1 represents a trigger icon corresponding to an input mode of a keyboard input, and C2 represents a trigger icon corresponding to an input mode of a handwriting input, so that when a user clicks C2 through the first application interface, the electronic device can obtain a trigger instruction for the handwriting input.
Referring again to fig. 48, fig. 48 includes two sub-diagrams (a) and (b). In each sub-diagram, D1 represents a switching icon for switching between the keyboard-input mode and the handwriting-input mode. In sub-diagram (a) of fig. 48, the switching icon is in the first state and the mode type corresponding to the first application interface is the keyboard-input mode; in sub-diagram (b) of fig. 48, the switching icon is in the second state and the mode type corresponding to the first application interface is the handwriting-input mode. It should be understood that the examples in figs. 47 and 48 are only for convenience of understanding the present solution and are not intended to limit it.
In another implementation manner, the electronic device may further obtain a first contact operation input by the user through the first application interface displayed on the first display screen or through the interface of the virtual keyboard displayed on the second display screen, and determine that the first operation is detected when the first contact operation is detected, so as to convert the mode type corresponding to the first application interface into the handwriting-input mode. The first contact operation is a click operation or a preset track operation; further, the first contact operation may be a single-click operation, a double-click operation, a triple-click operation, a long-press operation, a "Z"-shaped track operation, a slide-down operation, a hook-shaped ("√") track operation, a circle-shaped track operation, or another contact operation, which are not exhaustively listed here.
Still further, step 4205 may include: the electronic device obtains a sliding operation in a first direction through the second display screen, where the sliding operation in the first direction is a sliding operation from the upper edge of the second display screen toward the lower edge of the second display screen, and the distance between the upper edge of the second display screen and the first display screen is shorter than the distance between the lower edge of the second display screen and the first display screen. In response to the sliding operation in the first direction, the virtual keyboard displayed on the second display screen moves along the first direction toward the lower edge of the second display screen, and when the upper edge of the virtual keyboard reaches the lower edge of the second display screen, it is confirmed that the mode type corresponding to the first application interface is converted into handwriting input. In the embodiment of the application, the virtual keyboard displayed on the second display screen can move along with the user's downward sliding operation, and when the upper edge of the virtual keyboard reaches the lower edge of the second display screen, the electronic device confirms that the mode type corresponding to the first application interface is changed into handwriting input; this increases the interest of the process of switching from keyboard input to handwriting input and helps improve user stickiness.
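The slide-to-dismiss behaviour just described can be sketched as a small tracker. The coordinate convention (y grows downward, so the lower edge has the largest y value) and the class name are assumptions for illustration:

```python
# Illustrative sketch of step 4205's slide-down variant: the virtual keyboard
# tracks the finger toward the lower edge of the second display screen, and
# the mode switches to handwriting input once the keyboard's upper edge
# reaches that lower edge.

class KeyboardSlideTracker:
    def __init__(self, screen_height, keyboard_top):
        self.screen_height = screen_height  # y of the lower edge (y grows downward)
        self.keyboard_top = keyboard_top    # current y of the keyboard's upper edge

    def on_drag(self, dy):
        """Move the keyboard down by dy pixels; return True when the mode
        should switch to handwriting input."""
        self.keyboard_top = min(self.keyboard_top + dy, self.screen_height)
        return self.keyboard_top >= self.screen_height
```

A drag that leaves the keyboard partly visible returns False and keyboard input continues; once the cumulative drag pushes the keyboard's upper edge to the screen's lower edge, the tracker reports the switch to handwriting input.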
For a more intuitive understanding of the present solution, please refer to fig. 49, where fig. 49 is a schematic diagram of the first contact operation in the processing method of the application interface according to the embodiment of the present application. Fig. 49 includes three sub-diagrams (a), (b), and (c), and in fig. 49 the first contact operation is taken as a slide-down operation input through the second display screen as an example. As shown in sub-diagrams (a) and (b) of fig. 49, when the user inputs the slide-down operation through the display interface of the virtual keyboard on the second display screen, the virtual keyboard on the second display screen is retracted; when the virtual keyboard on the second display screen is completely retracted, it is considered that the first contact operation has been obtained through the second display screen, and the electronic device determines that the mode type corresponding to the first application interface is the handwriting-input mode, so as to trigger entering sub-diagram (c) of fig. 49, that is, to trigger displaying the first application interface on the second display screen. It should be understood that the examples in fig. 49 are only for convenience of understanding the present solution and are not intended to limit it.
In another implementation manner, in the process of displaying the first application interface through the first display screen, the electronic device may detect the distance between the electronic pen and the second display screen in real time, determine that the first operation is detected when the electronic pen is found to be located within a preset range of the second display screen, and convert the mode type corresponding to the first application interface into the handwriting-input mode. The preset range of the second display screen may refer to a range within 3 cm, 4 cm, 5 cm, or another distance directly above the second display screen, which is not limited here.
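A minimal sketch of this hover-based trigger, assuming the pen reports its distance above the second display screen in centimetres (the function name, mode labels, and default threshold are illustrative assumptions):

```python
# Hypothetical sketch: switch to handwriting input when the pen enters the
# preset range directly above the second display screen; otherwise keep the
# current mode unchanged.

def mode_after_pen_hover(current_mode, distance_cm, threshold_cm=3.0):
    """Return the mode type after one hover-distance sample."""
    if 0.0 <= distance_cm <= threshold_cm:
        return "handwriting"
    return current_mode
```

With the default 3 cm threshold, a pen hovering at 2.5 cm flips the mode to handwriting input, while a pen 10 cm away leaves keyboard input in effect.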
In another implementation manner, the electronic device may acquire the state of the electronic pen in real time, determine that the first operation is detected when the electronic pen changes from a first preset state to a second preset state, and change the mode type corresponding to the first application interface into the handwriting-input mode; when the electronic pen is not in the second preset state, it is determined that the mode type corresponding to the first application interface is the keyboard-input mode. The electronic pen may change from a stationary state to a moving state, or from a non-held state to a held state, and the like, which are not exhaustively listed here.
As an example, after the user opens the first application interface, the user takes the electronic pen out of the electronic device, or the user picks up the electronic pen while using the first application interface; in either case, the electronic pen is converted from the non-held state to the held state, and the electronic device may determine that the mode type corresponding to the first application interface is the handwriting-input mode.
Further, after the electronic pen is taken out of the electronic device, a communication connection is established between the electronic pen and the electronic device; if the electronic device detects this change in its communication connection with the electronic pen, it may be regarded that the electronic pen is switched from the non-held state to the held state.
The electronic pen may be configured with a vibration sensor (e.g., a gyroscope, an acceleration sensor, or another type of sensor), so that the electronic pen may collect its vibration data in real time and transmit the vibration data to the electronic device through the communication module in real time, and the electronic device determines whether the electronic pen has changed from a stationary state to a moving state.
Taking the electronic pen out of the device may be sensed by the processing module of the device itself receiving a signal indicating that the stylus has been disconnected from the device's interface, or by a sensor module on the stylus sensing the disconnection between the stylus and the device and sending a signal to the device through a communication module. The user's picking up of the pen may be sensed by a sensor module of the stylus (such as a gyroscope sensor or an acceleration sensor) detecting the vibration caused when the user picks up the stylus, and then sending the vibration data to the processing module of the device through the communication module.
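The two state transitions described above (non-held to held, and stationary to moving) can be sketched as follows. The connection labels, threshold value, and function names are illustrative assumptions, not the patent's actual signals:

```python
# Hedged sketch of the two pen-state transitions: non-held -> held, inferred
# here from a change in the connection with the device, and stationary ->
# moving, inferred from vibration/acceleration samples reported by the pen.

def pen_became_held(prev_connection, curr_connection):
    """Pen taken out of its slot: the docked connection drops and a
    separate communication link comes up (one possible realisation)."""
    return prev_connection == "docked" and curr_connection == "wireless"

def pen_became_moving(accel_samples, threshold=0.2):
    """Pen picked up: any acceleration magnitude exceeds a small threshold."""
    return any(abs(a) > threshold for a in accel_samples)
```

Either detector firing would count as the electronic pen changing from the first preset state to the second preset state, triggering the switch to handwriting input.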
In the embodiment of the application, a plurality of ways of judging the mode type corresponding to the first application interface are provided, which improves the implementation flexibility of the scheme and expands its application scenarios. Furthermore, when the mode type corresponding to the first application interface is determined according to the user's holding posture on the electronic pen, the user can switch the mode type of the first application interface without performing any other operation; this can reduce the error rate of the judgment process for the mode type corresponding to the first application interface, reduce the probability of laying out the first application interface incorrectly, avoid wasting computer resources, and help improve user stickiness.
4206. The electronic device determines whether the mode type corresponding to the first application interface is converted into a handwriting input, and if the mode type corresponding to the first application interface is converted into the handwriting input, the electronic device goes to step 4207; if the mode type corresponding to the first application interface is not converted into handwriting input, step 4205 is re-entered.
In this embodiment, after performing step 4205, the electronic device performs step 4206 to determine whether the mode type corresponding to the first application interface has changed from the keyboard-input mode to the handwriting-input mode; if the mode type corresponding to the first application interface has changed to handwriting input, step 4207 is entered; if not, step 4205 is re-entered to continue detecting the mode type corresponding to the first application interface. It should be noted that, in the embodiment of the present application, steps 4205 and 4206 may be performed alternately, and the embodiment does not limit the ratio between the number of executions of steps 4205 and 4206 and that of step 4207; step 4207 may be performed once after steps 4205 and 4206 have been performed multiple times.
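The detect-and-branch loop formed by steps 4205 and 4206 can be sketched as follows; `read_mode` stands in for whichever detection method from step 4205 is in use, and all names are illustrative:

```python
# Sketch of the step 4205/4206 loop: repeatedly sample the mode type and act
# only on a transition to handwriting input. The polling cap keeps the sketch
# bounded; a real implementation would be event-driven or long-running.

def run_mode_loop(read_mode, on_handwriting, max_polls=100):
    """Poll until the mode becomes handwriting input (step 4207), or give up."""
    for _ in range(max_polls):
        if read_mode() == "handwriting":  # step 4206: has the mode changed?
            on_handwriting()              # step 4207: move UI to second screen
            return True
    return False                          # mode never changed within the cap
```

The loop may iterate through steps 4205 and 4206 many times before step 4207 runs once, matching the note above that the execution counts of the two detection steps and the display step need not be in any fixed ratio.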
4207. The electronic equipment responds to an input mode of the handwriting input and triggers the first application interface to be displayed on the second display screen.
In the embodiment of the application, when the electronic device detects that the mode type corresponding to the first application interface has changed from the keyboard-input mode to the handwriting-input mode, it triggers, in response to the handwriting-input mode, the display of the first application interface on the second display screen and closes the virtual keyboard displayed on the second display screen, so that the handwritten content input by the user for the first application interface is obtained through the second display screen. The electronic device may display the first application interface on the second display screen either by moving the first application interface to the second display screen, or by automatically copying the first application interface and displaying the copy on the second display screen.
Specifically, an operating system runs on the electronic device, and the electronic device can display the first application interface on the second display screen by calling a moveTo function in the operating system, by calling a SetWindowPos function in the operating system, or by calling a SetWindowPlacement function in the operating system.
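As a hedged illustration, on Windows such a relocation could go through the Win32 SetWindowPos call. The screen geometry below is hypothetical, and the platform-specific call is isolated behind a guard so that the coordinate computation can be shown and exercised on its own:

```python
# Illustrative sketch: compute where the window should land on the second
# display screen, then (on Windows only) relocate it via SetWindowPos.
# Screen rectangles are (origin_x, origin_y, width, height) in virtual-desktop
# coordinates; all concrete values here are assumptions.

def target_rect_on_second_screen(second_screen, win_w, win_h, full_screen=True):
    """Compute the target rectangle for the first application interface."""
    x, y, w, h = second_screen
    if full_screen:
        return (x, y, w, h)
    # Otherwise centre the window on the second screen (floating-window form).
    return (x + (w - win_w) // 2, y + (h - win_h) // 2, win_w, win_h)

def move_window(hwnd, rect):
    """Apply the rectangle with the Win32 SetWindowPos API (Windows only)."""
    import ctypes
    import sys
    if sys.platform != "win32":  # guard: the API only exists on Windows
        raise OSError("SetWindowPos is a Win32 API")
    x, y, w, h = rect
    SWP_NOZORDER = 0x0004  # keep the window's current Z order
    ctypes.windll.user32.SetWindowPos(hwnd, 0, x, y, w, h, SWP_NOZORDER)
```

For a second screen stacked below a first screen of height 800, the full-screen form maps the window onto the second screen's whole rectangle, while the floating form centres it there; SetWindowPlacement or a toolkit-level moveTo would be alternative routes to the same effect.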
More specifically, in one case, the electronic device does not have any other application interface on the second display screen, and in one implementation, the electronic device may close the virtual keyboard displayed on the second display screen, move the first application interface to the second display screen (i.e., not display the first application interface on the first display screen), and display the first application interface in full screen through the second display screen. In another implementation, the electronic device may close the virtual keyboard displayed on the second display screen, copy the first application interface to the second display screen, and display the first application interface on both the first display screen and the second display screen.
To understand the present application more intuitively, please refer to fig. 50, where fig. 50 is a schematic diagram of a display interface of the first application interface in a processing method of an application interface according to an embodiment of the present application. Fig. 50 includes two sub-diagrams (a) and (b). Sub-diagram (a) of fig. 50 illustrates the first display screen and the second display screen of the electronic device when the mode type corresponding to the first application interface is the keyboard-input mode. When the electronic device detects that the mode type corresponding to the first application interface has changed from the keyboard-input mode to the handwriting-input mode, sub-diagram (a) of fig. 50 is triggered to enter sub-diagram (b) of fig. 50; that is, the virtual keyboard displayed on the second display screen is closed, and the first application interface is moved to the second display screen. It should be noted that, besides the first application interface, other application interfaces may also be displayed on the first display screen of the electronic device.
In another case, other application interfaces are also displayed on the second display screen of the electronic device. In one implementation, the electronic device may close the virtual keyboard displayed on the second display screen and display the first application interface and the other application interfaces on the second display screen in a matrix form. In another implementation, the electronic device may close the virtual keyboard displayed on the second display screen and display the first application interface on the second display screen in the form of a floating window. In another implementation, the electronic device may close the virtual keyboard displayed on the second display screen and move the other application interfaces displayed on the second display screen to the first display screen, so as to display the first application interface in full screen through the second display screen; the forms in which the first application interface may be displayed on the second display screen are not exhaustively listed here.
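The layout forms just described, including the simpler case where nothing else occupies the second display screen, can be sketched as a small dispatch. The return values and the `preference` parameter are illustrative assumptions:

```python
# Sketch of the layout decision for the second display screen after the
# virtual keyboard is closed. The three forms for a shared screen (matrix,
# floating window, full screen after relocating the others) mirror the three
# implementations described above.

def layout_for_second_screen(others_on_second, preference="matrix"):
    """Choose how the first application interface appears on the second screen."""
    if not others_on_second:
        # Nothing else on the second screen: take it over entirely.
        return {"first_app": "full_screen", "moved_to_first": []}
    if preference == "matrix":
        return {"first_app": "matrix", "moved_to_first": []}
    if preference == "floating":
        return {"first_app": "floating_window", "moved_to_first": []}
    # preference == "full_screen": relocate the others to the first screen.
    return {"first_app": "full_screen", "moved_to_first": list(others_on_second)}
```

The `moved_to_first` list captures the third implementation, in which the other interfaces are pushed to the first display screen so the first application interface can be shown full screen.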
Further, in the foregoing various implementation manners, while the electronic device displays the first application interface on the second display screen, the electronic device may also display the first application interface on the first display screen, or no longer display the first application interface on the first display screen.
Optionally, step 4204 may include: and displaying the first application interface and the application control bar on the second display screen. The method may further comprise: the electronic equipment detects a second operation acting on the second display screen; and changing the first display area of the application control bar into a second display area in response to the second operation, and changing a first control key group included in the application control bar into a second control key group, wherein the first control key group and the second control key group are control key sets corresponding to the target application. The specific meanings of the terms in the foregoing steps and the specific implementation manners of the foregoing steps will be described in the following fourth embodiment, which will not be described herein again.
Optionally, the first application interface includes a first control key, and step 4204 may include: displaying the first application interface and the application control bar on the second display screen. The method may further comprise: the electronic device detects a second operation on the first application interface; and in response to the second operation, displaying the first control key in the application control bar and hiding the first control key in the first application interface. The specific meanings of the terms in the foregoing steps and the specific implementation manners of the foregoing steps will be described in the following fourth embodiment, and are not described herein again.
4208. The electronic equipment acquires a mode type corresponding to the first application interface.
In the embodiment of the present application, the specific implementation manner of step 4208 may refer to the description of step 4205, which is not described herein again.
4209. The electronic device determines whether the mode type corresponding to the first application interface is converted into a keyboard input, and if the mode type corresponding to the first application interface is converted into a keyboard input, the electronic device proceeds to step 4210; if the mode type corresponding to the first application interface is not converted to keyboard input, step 4208 is re-entered.
In this embodiment, after performing step 4208, the electronic device performs step 4209 to determine whether the mode type corresponding to the first application interface has changed from the handwriting-input mode to the keyboard-input mode; if the mode type corresponding to the first application interface has changed to the keyboard-input mode, step 4210 is entered; if the mode type corresponding to the first application interface has not changed to keyboard input, step 4208 is re-entered to continue detecting the mode type corresponding to the first application interface. It should be noted that, in this embodiment of the present application, steps 4208 and 4209 may be performed alternately, and the embodiment does not limit the ratio between the number of executions of steps 4208 and 4209 and that of step 4210; step 4210 may be performed once after steps 4208 and 4209 have been performed multiple times.
4210. The electronic equipment responds to an input mode of keyboard input, and triggers the display of a first application interface on a first display screen and the display of a virtual keyboard on a second display screen.
In the embodiment of the present application, the specific implementation manner of step 4210 may refer to the description in step 4204, which is not described herein again. It should be noted that, after the electronic device executes step 4210, the electronic device may re-enter step 4205 to detect whether the mode type corresponding to the first application interface is converted into the handwriting input in real time; in addition, steps 4205 to 4209 are optional steps, and if the user closes the first application interface in any of steps 4205 to 4209, the remaining steps do not need to be executed again.
In the embodiment of the application, in the process of displaying the application interface, the layout of the application interface on the different display screens of the electronic device can be adjusted automatically not only when the application interface changes from another mode type to handwriting input, but also when the mode type of the application interface changes to keyboard input, in which case a virtual keyboard can also be displayed automatically. Therefore, when the mode type of the application interface changes to keyboard input, the user does not need to manually adjust the layout of the application interface on the different display screens and can directly perform keyboard input; the steps are simple, which further improves the user stickiness of the scheme.
4211. The electronic equipment responds to an input mode of the handwriting input and triggers the first application interface to be displayed on the second display screen.
In the embodiment of the application, when the electronic device determines that the mode type corresponding to the first application interface is handwriting input, the electronic device triggers, in response to the handwriting-input mode, the display of the first application interface on the second display screen, so as to obtain the input content for the first application interface through the second display screen. For the display manner of the first application interface on the second display screen, refer to the description in step 4207, which is not repeated here.
For a more intuitive understanding of the present disclosure, please refer to fig. 51 and fig. 52, where fig. 51 and fig. 52 are each a schematic flow diagram of a processing method of an application interface according to an embodiment of the present disclosure. Referring to fig. 51, fig. 51 includes two sub-diagrams (a) and (b). In sub-diagram (a) of fig. 51, the electronic device obtains an opening operation for a target application program (i.e., the "note" application in the figure) through the second display screen. Since the opening operation is input through the second display screen, the electronic device determines that the mode type corresponding to the first application interface is handwriting input, and therefore enters sub-diagram (b) of fig. 51, in which the electronic device displays the first application interface (i.e., the initial application interface of the "note" application) on the second display screen.
Continuing with fig. 52, fig. 52 includes two sub-diagrams (a) and (b). In sub-diagram (a) of fig. 52, the electronic device obtains an opening operation for the target application (i.e., the "note" application in the figure) through the first display screen. Since the opening operation is input with the electronic pen, the electronic device determines that the mode type corresponding to the first application interface is handwriting input, and therefore enters sub-diagram (b) of fig. 52, in which the electronic device displays the first application interface on the second display screen. It should be understood that the examples in figs. 51 and 52 are only for convenience of understanding and are not intended to limit the present solution.
4212. The electronic equipment acquires a mode type corresponding to the first application interface.
4213. The electronic device determines whether the mode type corresponding to the first application interface is converted into a keyboard input, and if the mode type corresponding to the first application interface is converted into a keyboard input, the electronic device proceeds to step 4214; if the mode type corresponding to the first application interface is not converted into the keyboard input, step 4212 is re-entered.
4214. The electronic device responds to the keyboard-input mode, and triggers the display of the first application interface on the first display screen and the display of a virtual keyboard on the second display screen.
In the embodiment of the present application, specific implementation manners of steps 4212 to 4214 may refer to the descriptions of steps 4208 to 4210, which are not described herein again.
4215. The electronic equipment acquires a mode type corresponding to the first application interface.
4216. The electronic device determines whether the mode type corresponding to the first application interface is converted into a handwriting input, and if the mode type corresponding to the first application interface is converted into a handwriting input, the electronic device proceeds to step 4217; if the mode type corresponding to the first application interface is not converted into handwriting input, step 4215 is re-entered.
4217. The electronic equipment responds to an input mode of the handwriting input and triggers the first application interface to be displayed on the second display screen.
In the embodiment of the present application, specific implementation manners of steps 4215 to 4217 may refer to the descriptions of steps 4205 to 4207, which are not described herein again.
It should be noted that, after the electronic device executes step 4217, the electronic device may re-enter step 4212 to detect whether the mode type corresponding to the first application interface is converted into the keyboard input in real time; in addition, steps 4212 to 4217 are optional steps, and if the user closes the first application interface in any one of steps 4212 to 4217, the remaining steps do not need to be executed continuously.
Second, switching between the handwriting-input mode and the browsing mode.
In an embodiment of the present application, please refer to fig. 53, where fig. 53 is a flowchart illustrating a processing method of an application interface according to an embodiment of the present application, where the processing method of an application interface according to an embodiment of the present application may include:
5301. the electronic equipment acquires a starting operation aiming at the first application interface.
5302. The electronic device determines a mode type corresponding to the first application interface based on the start operation.
5303. The electronic device judges whether the mode type corresponding to the first application interface is handwriting input; if the mode type corresponding to the first application interface is the browsing mode, proceed to step 5304; if the mode type corresponding to the first application interface is handwriting input, proceed to step 5311.
In this embodiment of the application, the specific implementation manner of steps 5301 to 5303 may refer to the description in steps 4201 to 4203 in the corresponding embodiment of fig. 42, and the difference is that the mode type of the keyboard input in steps 4201 to 4203 is replaced with the browsing mode in steps 5301 to 5303, which may specifically refer to the description in the corresponding embodiment of fig. 42 and is not described herein again.
5304. The electronic equipment responds to the browsing mode and triggers the first application interface to be displayed on the first display screen.
In the embodiment of the application, under the condition that the electronic device determines that the mode type corresponding to the first application interface is not the handwriting input but the browsing mode, the electronic device triggers to display the first application interface on the first display screen only in response to the browsing mode.
5305. The electronic equipment acquires a mode type corresponding to the first application interface.
5306. The electronic device judges whether the mode type corresponding to the first application interface is converted into handwriting input, and if the mode type corresponding to the first application interface is converted into handwriting input, the electronic device enters step 5307; if the mode type corresponding to the first application interface is not converted into handwriting input, step 5305 is re-entered.
5307. In response to the input mode of handwriting input, the electronic device triggers the first application interface to be displayed on the second display screen.
In this embodiment, for a specific implementation of steps 5305 to 5307, refer to the description of steps 4205 to 4207 in the embodiment corresponding to fig. 42, except that the keyboard-input mode type in steps 4205 to 4207 is replaced with the browsing mode in steps 5305 to 5307. Because no virtual keyboard is displayed on the second display screen in the browsing mode, when the mode type corresponding to the first application interface changes from the browsing mode to handwriting input, there is correspondingly no virtual keyboard on the second display screen that needs to be closed. Details are not described herein again.
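The detection described in steps 5305 to 5307 can be sketched as a polling loop; `get_mode`, `move_to_second_display`, and the poll bound are illustrative assumptions (the bound exists only so the sketch terminates).

```python
def monitor_until_handwriting(get_mode, move_to_second_display, max_polls=100):
    """Steps 5305-5307: keep reading the mode type of the first application
    interface; once it becomes handwriting input, move the interface to the
    second display screen."""
    for _ in range(max_polls):
        if get_mode() == "handwriting":   # step 5306, "yes" branch
            move_to_second_display()      # step 5307
            return True
        # step 5306, "no" branch: re-enter step 5305
    return False
```

A real device would drive this from mode-change events rather than a bounded loop; the control flow is the same.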
For a more intuitive understanding of this solution, refer to fig. 54 to fig. 57, which are four schematic diagrams of the display interface of the first application interface in the application interface processing method provided in this embodiment of the application. Fig. 54 includes two sub-diagrams (a) and (b). In sub-diagram (a) of fig. 54, a bulb-shaped icon and three circles are displayed at the bottom of the first display screen: the bulb-shaped icon represents the desktop display interface, and the three circles represent three different application interfaces. Application interface 1 (an example of the first application interface) is currently displayed on the first display screen, the two icons in the upper right corner of the first display screen represent the browsing mode and the handwriting mode respectively, and application interface 2 is displayed on the second display screen. When the electronic device detects that the mode type corresponding to the first application interface changes from the browsing mode to handwriting input, the electronic device is triggered to go from sub-diagram (a) to sub-diagram (b) of fig. 54, that is, to move the first application interface to the second display screen. In sub-diagram (b) of fig. 54, the electronic device displays application interface 1 and application interface 2 in a matrix form, application interface 1 is no longer displayed on the first display screen, and the current display interface of the first display screen becomes application interface 3; the user can display application interface 1 in full screen by tapping it.
Referring next to fig. 55, which includes two sub-diagrams (a) and (b); sub-diagram (a) of fig. 55 is the same as sub-diagram (a) of fig. 54 and is not described again. When the electronic device detects that the mode type corresponding to the first application interface changes from the browsing mode to the input mode of handwriting input, the electronic device is triggered to enter sub-diagram (b) of fig. 55, in which the electronic device displays application interface 1 (an example of the first application interface) in the form of a floating window and no longer displays it on the first display screen, so that the current display interface of the first display screen becomes application interface 3.
Referring next to fig. 56, which includes two sub-diagrams (a) and (b); sub-diagram (a) of fig. 56 is the same as sub-diagram (a) of fig. 54 and is not described again. When the electronic device detects that the mode type corresponding to the first application interface changes from the browsing mode to the input mode of handwriting input, the electronic device is triggered to enter sub-diagram (b) of fig. 56, in which the electronic device displays application interface 1 (an example of the first application interface) in the form of a floating window while still displaying application interface 1 on the first display screen.
Referring next to fig. 57, which includes two sub-diagrams (a) and (b); sub-diagram (a) of fig. 57 is the same as sub-diagram (a) of fig. 54 and is not described again. When the electronic device detects that the mode type corresponding to the first application interface changes from the browsing mode to the input mode of handwriting input, the electronic device is triggered to enter sub-diagram (b) of fig. 57, in which the electronic device displays application interface 1 (an example of the first application interface) in full screen and moves application interface 2, previously displayed on the second display screen, to the first display screen.
It should be noted that, in addition to the first application interface, other application interfaces may be displayed on the first display screen of the electronic device, and more application interfaces may also be displayed on the second display screen; the examples in fig. 54 to fig. 57 are merely provided for ease of understanding and do not limit this solution.
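The screen rearrangements of figs. 54, 55 and 57 can be summarized in one small sketch; the strategy names and interface labels are assumptions made for illustration, not terms from the embodiment.

```python
def apply_handwriting_layout(strategy, screens):
    """Rearrange the two screens when application interface 1 switches from
    the browsing mode to handwriting input. `screens` maps "first"/"second"
    to lists of visible interfaces."""
    screens["first"].remove("app1")
    if strategy == "matrix":             # fig. 54: tiled with app2; app3 surfaces on screen 1
        screens["second"].append("app1")
        screens["first"].append("app3")
    elif strategy == "floating":         # fig. 55: floating window on screen 2
        screens["second"].append("app1:floating")
        screens["first"].append("app3")
    elif strategy == "fullscreen_swap":  # fig. 57: app1 full screen; app2 moves to screen 1
        screens["second"] = ["app1"]
        screens["first"].append("app2")
    return screens
```

The fig. 56 variant, in which application interface 1 also remains on the first display screen, would simply omit the initial `remove` call for the floating strategy.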
5308. The electronic device obtains the mode type corresponding to the first application interface.
5309. The electronic device determines whether the mode type corresponding to the first application interface has changed to the browsing mode. If so, proceed to step 5310; if not, re-enter step 5308.
5310. In response to the browsing mode, the electronic device triggers the first application interface to be displayed on the first display screen, and the first application interface is no longer displayed on the second display screen.
In this embodiment, for a specific implementation of steps 5308 to 5310, refer to the description of steps 4208 to 4210 in the embodiment corresponding to fig. 42, except that the keyboard-input mode type in steps 4208 to 4210 is replaced with the browsing mode in steps 5308 to 5310; in addition, when the mode type corresponding to the first application interface changes from handwriting input to the browsing mode, no virtual keyboard needs to be displayed on the second display screen. Details are not described herein again.
In this embodiment of the application, when the mode type of the application interface changes to the browsing mode, the layout of the application interface across the display screens can be adjusted automatically, so the user does not need to adjust it manually. In a variety of application scenarios, this simplifies the operation steps and further improves the user stickiness of this solution.
It should be noted that, after performing step 5310, the electronic device may re-enter step 5305 to detect in real time whether the mode type corresponding to the first application interface changes to handwriting input. In addition, steps 5305 to 5310 are optional; if the user closes the first application interface during any of steps 5305 to 5310, the remaining steps do not need to be performed.
5311. In response to the input mode of handwriting input, the electronic device triggers the first application interface to be displayed on the second display screen.
5312. The electronic device obtains the mode type corresponding to the first application interface.
5313. The electronic device determines whether the mode type corresponding to the first application interface has changed to the browsing mode. If so, proceed to step 5314; if not, re-enter step 5312.
5314. In response to the browsing mode, the electronic device triggers the first application interface to be displayed on the first display screen, and the first application interface is no longer displayed on the second display screen.
5315. The electronic device obtains the mode type corresponding to the first application interface.
5316. The electronic device determines whether the mode type corresponding to the first application interface has changed to handwriting input. If so, proceed to step 5317; if not, re-enter step 5315.
5317. In response to the input mode of handwriting input, the electronic device triggers the first application interface to be displayed on the second display screen.
In this embodiment, for specific implementations of steps 5311 to 5317, refer to the descriptions of steps 4211 to 4217 in the embodiment corresponding to fig. 42, except that the keyboard-input mode type in steps 4211 to 4217 is replaced with the browsing mode in steps 5311 to 5317. When the mode type corresponding to the first application interface changes from handwriting input to the browsing mode, no virtual keyboard needs to be displayed on the second display screen; when it changes from the browsing mode to handwriting input, no virtual keyboard on the second display screen needs to be closed.
It should be noted that, after performing step 5317, the electronic device may re-enter step 5312 to detect in real time whether the mode type corresponding to the first application interface changes to the browsing mode. In addition, steps 5312 to 5317 are optional; if the user closes the first application interface during any of steps 5312 to 5317, the remaining steps do not need to be performed.
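Steps 5311 to 5317 amount to a two-state machine. The class below is a minimal sketch under assumed names: the first application interface is opened in handwriting input (step 5311) and thereafter moved whenever its mode type flips.

```python
class InterfaceRouter:
    """The first application interface ping-pongs between screens as its
    mode type alternates between the browsing mode and handwriting input."""
    def __init__(self):
        self.location = "second_display"   # step 5311: opened in handwriting input

    def on_mode_change(self, mode):
        if mode == "browsing":             # steps 5313 -> 5314
            self.location = "first_display"
        elif mode == "handwriting":        # steps 5316 -> 5317
            self.location = "second_display"
        return self.location
```

Closing the first application interface at any point would simply discard the router, matching the note that the remaining steps need not be performed.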
In this embodiment of the application, the mode type corresponding to an application interface can be detected automatically while the user is using it, and the display position of the application interface is adjusted according to that mode type. Moreover, when the application interface is opened, its mode type, and hence its display position, can be determined based on the start operation, so that the user can use the application interface directly after performing the start operation, without having to move it. This further improves the convenience of the solution and increases its user stickiness.
The electronic device displays the first application interface on the first display screen and detects the mode type corresponding to the first application interface; when the detected mode type is handwriting input, the electronic device triggers display of the first application interface on the second display screen, so that input is performed directly through the first application interface displayed on the second display screen. In this way, if the user places the second display screen in an orientation convenient for writing, the electronic device automatically shows the application interface that requires handwritten input on that screen without any user operation. This improves the efficiency of the whole input process, avoids redundant steps, and, because the operation is simple, helps improve user stickiness.
On the basis of the embodiments corresponding to fig. 41 to fig. 57, in order to better implement the above-mentioned solution of the embodiment of the present application, the following also provides related equipment for implementing the above-mentioned solution. Referring to fig. 58, fig. 58 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. The electronic device 1 comprises a first display 501, a second display 502, a memory 40, one or more processors 10, and one or more programs 401; one or more programs 401 are stored in the memory 40, and the one or more processors 10, when executing the one or more programs 401, cause the electronic device to perform the steps of: a first application interface is displayed through a first display screen 501; in response to the detected first operation, converting the mode type corresponding to the first application interface into handwriting input; in response to the input mode of the handwriting input, the first application interface is triggered to be displayed on the second display screen 502, so that the handwriting content for the first application interface is acquired through the second display screen 502.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to further perform the steps of: under the condition that the mode type corresponding to the first application interface is detected to be converted into keyboard input, the first application interface is triggered to be displayed on the first display screen 501 in response to the input mode of the keyboard input, and a virtual keyboard is displayed on the second display screen 502; or, in the case that it is detected that the mode type corresponding to the first application interface is changed into the keyboard input, in response to the input mode of the keyboard input, the first application interface is triggered to be displayed on the first display screen 501, and the virtual keyboard and the application control bar are displayed on the second display screen 502.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to further perform the steps of: under the condition that the mode type corresponding to the first application interface is detected to be converted into the browsing mode, the first application interface is triggered to be displayed on the first display screen 501 in response to the browsing mode.
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to specifically perform any one or a combination of the following: in a case that the holding posture of the electronic pen is detected to meet a first preset condition, determining that the first operation is detected, where the holding posture includes any one or a combination of the following: holding position, holding strength, and holding angle; or obtaining a trigger instruction for handwriting input through a first icon, where the first icon is displayed on the first application interface; or detecting a first contact operation, where the first contact operation is a preset click operation or a preset track operation; or, in a case that the electronic pen is detected to be located within a preset range of the second display screen 502, determining that the first operation is detected; or, in a case that the electronic pen is detected to have changed from a first preset state to a second preset state, determining that the first operation is detected.
In one possible design, the first operation is a sliding operation in a first direction obtained by the second display screen 502, the sliding operation in the first direction is a sliding operation from an upper edge of the second display screen 502 to a lower edge of the second display screen 502, and a distance between the upper edge of the second display screen 502 and the first display screen 501 is shorter than a distance between the lower edge of the second display screen 502 and the first display screen 501.
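The trigger conditions above, together with the edge swipe in this design, can be gathered into one hedged predicate; every event field name and the 10 mm proximity threshold are illustrative assumptions, not values from the embodiments.

```python
def is_first_operation(event):
    """Return True if any of the designs' trigger conditions is met.
    `event` is an assumed dict describing one observed input."""
    if event.get("grip_matches_preset"):                  # holding position/strength/angle
        return True
    if event.get("tapped_icon") == "first_icon":          # first icon on the interface
        return True
    if event.get("contact_op") in {"preset_click", "preset_track"}:
        return True
    if event.get("pen_distance_mm", float("inf")) <= 10:  # pen within preset range of screen 2
        return True
    if event.get("pen_state_change") == ("first_preset", "second_preset"):
        return True
    swipe = event.get("swipe")                            # slide from the upper edge (nearer
    if swipe and swipe["screen"] == "second" \
            and swipe["start_y"] < swipe["end_y"]:        # screen 1) toward the lower edge
        return True
    return False
```

Each branch corresponds to one listed design; a device would likely implement only the subset it supports.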
In one possible design, the one or more processors 10, when executing the one or more programs 401, cause the electronic device to further perform the steps of: obtaining a start operation for a second application interface, and determining, based on the start operation, a mode type corresponding to the second application interface, where the second application interface and the first application interface are different application interfaces; in a case that the mode type corresponding to the second application interface is handwriting input, in response to the input mode of handwriting input, triggering display of the second application interface on the second display screen 502; or, in a case that the mode type corresponding to the second application interface is keyboard input, in response to the input mode of keyboard input, triggering display of the second application interface on the first display screen 501 and displaying the virtual keyboard on the second display screen 502; or, in a case that the mode type corresponding to the second application interface is the browsing mode, in response to the browsing mode, triggering display of the second application interface on the first display screen 501.
It should be noted that, for the information interaction, the execution process, and other contents between the modules/units in the electronic device 1, the method embodiments corresponding to fig. 41 to fig. 57 in the present application are based on the same concept, and specific contents may refer to the description in the foregoing method embodiments in the present application, and are not described herein again.
Referring to fig. 59, fig. 59 is a schematic structural diagram of the electronic device provided in the embodiment of the present application, and the electronic device 1 may be embodied as a mobile phone, a tablet, a notebook, or other device configured with a display screen, which is not limited herein. The electronic device 1 may be disposed with the electronic device described in the embodiment corresponding to fig. 58, and is configured to implement the functions of the electronic device in the embodiments corresponding to fig. 41 to fig. 57. In particular, electronic device 1 may vary widely due to configuration or performance differences and may include one or more Central Processing Units (CPUs) 1522 (e.g., one or more processors) and memory 40, one or more storage media 1530 (e.g., one or more mass storage devices) storing applications 1542 or data 1544. Memory 40 and storage media 1530 may be, among other things, transient or persistent storage. The program stored on the storage medium 1530 may include one or more modules (not shown), each of which may include a sequence of instruction operations for the electronic device. Further, the central processor 1522 may be configured to communicate with the storage medium 1530, and execute a series of instruction operations in the storage medium 1530 on the electronic device 1.
Electronic device 1 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input-output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so forth.
In this embodiment, the central processing unit 1522 is configured to implement functions of the electronic device in the embodiment corresponding to fig. 41 to fig. 57. It should be noted that, for the specific implementation manner and the beneficial effects brought by the central processor 1522 executing the functions of the electronic device in the embodiments corresponding to fig. 41 to fig. 57, reference may be made to the descriptions in each method embodiment corresponding to fig. 41 to fig. 57, and details are not repeated here.
Also provided in an embodiment of the present application is a computer-readable storage medium that stores a program; when the program runs on a computer, the computer is caused to execute the steps performed by the electronic device in the methods described in the embodiments shown in fig. 42 to fig. 57.
Embodiments of the present application also provide a computer program, which when run on a computer, causes the computer to execute the steps performed by the electronic device in the method described in the foregoing embodiments shown in fig. 42 to 57.
Further provided in an embodiment of the present application is a circuit system, which includes a processing circuit configured to execute the steps performed by the electronic device in the method described in the foregoing embodiment shown in fig. 42 to 57.
The electronic device provided by the embodiment of the application may specifically be a chip, and the chip includes: a processing unit, which may be, for example, a processor, and a communication unit, which may be, for example, an input/output interface, a pin or a circuit, etc. The processing unit may execute the computer executable instructions stored in the storage unit to enable the chip to perform the steps performed by the electronic device in the method described in the embodiment shown in fig. 42 to 57. Optionally, the storage unit is a storage unit in the chip, such as a register, a cache, and the like, and the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a Random Access Memory (RAM), and the like.
Wherein any of the above processors may be a general purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits for controlling the execution of the programs of the method of the first aspect.
It should be noted that the above-described embodiments of the apparatus are merely schematic, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the embodiments of the apparatus provided in the present application, the connection relationship between the modules indicates that there is a communication connection therebetween, and may be implemented as one or more communication buses or signal lines.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus the necessary general-purpose hardware, and certainly can also be implemented by dedicated hardware, including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and the like. In general, any function performed by a computer program can easily be implemented by corresponding hardware, and the specific hardware structure used to implement the same function can take many forms, such as an analog circuit, a digital circuit, or a dedicated circuit. For the present application, however, a software implementation is generally preferable. Based on such understanding, the technical solutions of the present application may be embodied substantially in the form of a software product, which is stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments of the present application.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program.
The computer program includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that a computer can access, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)), among others.
Example four:
the embodiment of the invention can be applied to various multi-screen intelligent terminals. As an example, the embodiment of the present invention may be applied to a dual-screen electronic device. As shown in fig. 60, the electronic device may have two display screens, which may be two separate display screens or two display regions of a flexible folding screen or a curved screen. The electronic device may be a device that works independently as a whole, such as a personal notebook, or may be formed by connecting two independently operating electronic devices that then work together, such as a dual-screen device formed by splicing two mobile phones or two tablet computers.
A dual-screen electronic device, such as one formed by two tablet computers docked together, typically includes a first display screen and a second display screen. The first display screen mainly provides an output function, that is, it shows the user the currently running content or the operation being executed. Of course, the first display screen may also have an input function; for example, it may support touch input, and the current application can be operated through touch gestures. Compared with the first display screen, the second display screen is generally closer to the user's hands, which makes it convenient to operate. Of course, the second display screen may also have an output function; for example, it may likewise display currently running content or executed operations to the user.
The embodiment of the invention can also be applied to single-application, dual-screen, cross-device operation, where the display screen of the controlled device mainly displays the currently running content or the executed operation to the user, that is, mainly performs the function of the first display screen, and the function menu corresponding to the target application on the first display screen is transferred to the display screen of the controlling device, which mainly performs the function of the second display screen, so as to control the application on the controlled device. For example, as shown in fig. 61, a tablet computer (61-1) or a mobile phone (61-2) can be used as a control terminal to remotely operate an application on a computer (61-3); a tablet computer (61-1) or a mobile phone (61-2) can likewise be used as a control terminal to remotely operate an application on a smart large screen (61-4); or a mobile phone and a tablet computer can be interconnected, with one serving as the control terminal to operate an application on the other. Specifically, the embodiment of the invention can be applied to a smart-home scenario: the controlled device may be a smart home appliance with a display screen, such as a television, a microwave oven, a refrigerator, or a washing machine, and the controlling device may be a mobile phone, a tablet, a computer, and so on. The embodiment of the invention can also be used in the smart-cockpit field, for example, using a mobile phone or a tablet as the control device to operate the front in-vehicle screen or a rear-seat display screen, or using the rear-seat display screen to control the front in-vehicle screen.
In the above scenarios, the interface of an application program usually adopts a relatively fixed layout and is displayed entirely on the first display screen; for example, the application's control keys usually appear in a function-area menu at the top or on the left side of the application. When the user needs to operate the application's control keys, no matter where the user's current operation object is, the cursor must be moved to the function-area menu, operated there, and then returned to the current operation object. In one case, the user may find it difficult to locate a specific control key in the application's function-area menu; in another, after moving the cursor to the control key, the user may find it difficult to relocate the current operation object. Both cases hinder the user's normal operation. Throughout this process, the user must steer the cursor back and forth between the operation object and the function-area menu, which makes the operation cumbersome and inefficient.
Referring to fig. 62, in an embodiment of the present invention, a screen display method 6200 is provided, where the screen display method 6200 is used for displaying a control area in the second display screen, so that a user can control a target application or an operating system on the first display screen through the control area on the second display screen.
When the method of the embodiment of the present invention is applied to an electronic device with a dual-screen display, such as the dual-screen electronic device described above, the B-side of the device (the side conventionally used as the display) may serve as the first display screen, and the C-side (the side conventionally used as the keyboard) may serve as the second display screen. In one implementation, as shown in fig. 63A, the B-side may display the main display interface and function menu bar of the target application operated by the user, and the C-side may include a virtual keyboard and a control area; in another implementation, as shown in fig. 63B, the B-side may display the main display interface and function menu bar of the target application, and the C-side may include other application interfaces and a control area; in yet another implementation, as shown in fig. 63C, the B-side may display the main display interface and function menu bar of the target application, and the C-side may include only a control area. When the method of the embodiment of the invention is applied to single-application, dual-screen, cross-device operation, the display screen of the controlled device may correspond to the B-side of the dual-screen electronic device, the display screen of the controlling device may correspond to the C-side, and the display contents of both screens may refer to fig. 63A to 63C.
Generally, an application program may include a main display interface and one or more function menu bars. The main display interface presents the current state of the application or the execution result of a user operation to the user; the function menu bars contain control keys corresponding to the application, which are generally used for receiving user input and executing specific operations on the target application. For example, the function menu bars of a text editing application may include an editing menu bar (containing control keys such as File, Home, Insert, Design, and Page Layout), a navigation menu bar, and other function menu bars, which receive the user's operation instructions on a document. In general, the main display interface and function menu bars of an application are both displayed in the first display screen, and the user can only operate the application in the first display screen through a mouse or touch screen gestures.
In an embodiment of the present invention, a control area is set in the second display screen, and the control area may include a plurality of display regions. For example, in one implementation manner, the control area may include a system control bar and an application control bar. The system control bar may include one or more function modules, each containing a control key group associated with the operating system. The application control bar may include one or more function modules, some of which may contain control key groups corresponding to the target application, and some of which may contain shortcut control key groups related to the user's current operation. It should be understood that other arrangements of display regions and function modules that help improve the user's operation efficiency are possible. By setting the control area in the second display screen, the user can operate the target application or the system in the first display screen through the control keys in the second display screen, which provides a more convenient operation mode for the user and improves the user's operation efficiency.
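The two-bar structure described above can be sketched as a simple data model. The following Python sketch is illustrative only: the class and field names (`ControlArea`, `FunctionModule`, `ControlKey`, `system_bar`, `app_bar`) are invented for this example and are not part of the embodiment.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlKey:
    name: str      # e.g. "copy", "paste"
    action: str    # identifier of the operation the key triggers

@dataclass
class FunctionModule:
    name: str
    keys: List[ControlKey] = field(default_factory=list)

@dataclass
class ControlArea:
    system_bar: List[FunctionModule] = field(default_factory=list)  # OS-level key groups
    app_bar: List[FunctionModule] = field(default_factory=list)     # target-app key groups

# Example: a control area for a text editor
area = ControlArea(
    system_bar=[FunctionModule("system", [ControlKey("volume", "sys.volume")])],
    app_bar=[FunctionModule("edit", [ControlKey("copy", "app.copy"),
                                     ControlKey("paste", "app.paste")])],
)
print(len(area.app_bar[0].keys))  # 2
```

A real implementation would populate these structures from information the application declares, as discussed under step 6202 below.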
Preferably, the position of the control area on the second display screen can be flexibly adjusted according to the user's needs. For example, the control area may be located at the upper end of the second display screen, above other display contents (other applications, a virtual keyboard, etc.); it may be located at the lower end of the second display screen, below other display contents; or it may be located on the left or right side of the second display screen. The initial display position of the control area may be defined by the system or customized by the user, and once the control area is displayed on the second display screen, its position can be flexibly moved by the user.
The screen display method 6200 may include the steps of:
step 6201: the control area is activated.
In one implementation, in the normal use state, the control area may be in a closed state. At this time, the main display interface and some function menu bars of the target application may be displayed on the first display screen, and the user operates the control keys in the function menu bars on the first display screen through mouse operations or touch screen gestures, thereby operating the target application in the normal manner.
When the user needs to open the control area on the second display screen to reach the control keys of the target application on the first display screen more quickly and thereby improve operation efficiency, the control area can be activated in various ways. In one implementation, the control area may be associated with the virtual keyboard and opened by default at the same time as the virtual keyboard; in this case, the virtual keyboard and the control area may be activated simultaneously by an instruction to activate the virtual keyboard, as shown in fig. 64A. In another implementation manner, as shown in fig. 64B, a control switch for the control area may be disposed on the virtual keyboard, and when the virtual keyboard is open and the control area is not, the control area may be activated through the control switch on the virtual keyboard. In another implementation, as shown in fig. 64C, the control area may be activated by gesture control; that is, a gesture corresponding to activation of the control area is stored in the storage module, and when it is detected that the user performs the gesture, the control area is activated. The control gesture may be, for example, a finger sliding inward from an edge of the display screen. In another implementation, whether the control area is opened may be associated with the display mode of the application: since some function menu bars of the application are typically hidden when the application's full-screen mode is entered, the control area may be activated when full-screen mode is entered to supplement the hidden display content, as shown in fig. 64D. It should be understood that the above manners of activating the control area are merely exemplary, and other manners of activating the control area are possible.
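The four activation triggers above can be summarized in a small predicate. This is a hypothetical sketch; the event names and the function signature are assumptions made for illustration only.

```python
def should_activate_control_area(event: str, keyboard_open: bool) -> bool:
    """Return True when any of the four activation conditions described above holds."""
    if event == "open_virtual_keyboard":             # 1) opened together with the virtual keyboard
        return True
    if event == "control_switch" and keyboard_open:  # 2) switch on the already-open keyboard
        return True
    if event == "edge_swipe_gesture":                # 3) stored activation gesture detected
        return True
    if event == "enter_fullscreen":                  # 4) application enters full-screen mode
        return True
    return False

print(should_activate_control_area("edge_swipe_gesture", False))  # True
```

In a real system such a check would sit in the input-event dispatcher, with the gesture matched against the template stored in the storage module.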
Keeping the control area closed in the normal use state and activating it with a simple operation when needed simplifies the user operation interface when the control area is not required, and prevents the control area from interfering with normal use.
In another implementation, the control area may be opened by default after the electronic device is powered on; in this case, the user does not need to activate the control area in step 6201. Therefore, step 6201 is an optional step of the screen display method 6200.
Step 6202: and acquiring the operation of the user on the target application.
In the embodiment of the invention, the display content of the control area is determined according to the operation of the user on the target application. In one embodiment, the operation of the user on the target application is to display the operation interface of the target application on the first display screen, for example, before the user operates, the target application may be in a closed state, and the user displays the operation interface of the target application on the first display screen by an operation of opening the target application; or before the user operates, the target application can be in a background running state, and the user displays an operation interface of the target application on the first display screen through switching operation. When the operation of the user on the target application is to display the operation interface of the target application on the first display screen, the control key corresponding to the target application can be displayed in the application control bar. After the operation interface of the target application is displayed on the first display screen, only the operation interface of the target application may be displayed on the first display screen, or the operation interfaces of a plurality of applications including the target application may be displayed together, for example, in a dual-screen and multi-screen operation mode.
Optionally, in an implementation manner, the developer of the application program may provide the function modules of the application program, the control keys in each function module, and the priority order among the function modules and among the control keys; the system may then determine which function modules and control keys are displayed in the application control bar corresponding to the application in the control area according to the actual situation (the display area of the control area, etc.), and determine the layout manner of the application control bar.
In this implementation manner, the information of the target application acquired by the system from the application program may include each function module of the target application, the control key included in each function module, the priority order of each function module, and the priority order of different control keys in each function module. Various information of the target application is respectively introduced as follows:
1) Function modules and control keys
An application typically includes a main display module and a plurality of function menu bars for controlling contents in the main display module, and the function modules in the control region may correspond to the function menu bars of the target application. For example, taking a slide editing application as an example, the slide editing application may include a main display interface, a function module 1, a function module 2, a function module 3, and the like, where the main display interface displays a slide interface currently being edited by a user, the function module 1 includes a commonly used control key set for editing the slide interface, the function module 2 is configured to display all slides for the user to browse, and the function module 3 includes a shortcut control key set. It should be understood that the settings of the function modules and the settings of the control keys in the function modules may be different for different applications due to the different functions implemented by the different applications.
2) Priority of function modules
The priority of the function module indicates the importance of each function module in the use process of the user, and can be determined according to the importance of the function of each function module, the use frequency of the user and other indexes. For example, the priority of the function modules of the slide editing application described above may be defined as follows: priority of function module 1 > priority of function module 2 > priority of function module 3. It should be understood that the above definition of the priority of the function modules of the slide editing application is merely exemplary, and other possible definitions are possible, as well as the manner of definition that conforms to the usage habits of the user.
Optionally, in an implementation manner, one or more required function modules may be defined for the application program, where a required function module is a function module of the application program that is always displayed when the control area is open. In addition, one or more preferred function modules may be defined for the target application, where a preferred function module is a function module that is preferentially displayed after all required function modules of the application program are displayed in the control area. The priority order of the function modules of the target application may be set as follows: the required function modules have the highest priority, the preferred function modules come next, and the other function modules come last.
3) Priority of control keys in function modules
The priority of a control key indicates the importance of that control key during the user's use, and can generally be determined according to indexes such as the importance of the key's control function and the user's frequency of use. For example, taking control keys for editing text as an example, in one implementation, the priorities of the control keys for copy, paste, cut, font, paragraph, definition, synonym, translation, etc. may be defined as follows: copy and paste have the highest priority; cut has a lower priority than copy and paste; font and paragraph have a lower priority than cut; and definition, synonym, and translation have a lower priority than font and paragraph. It should be understood that this definition of priority is just one possible implementation; other definitions according to the user's usage habits, or other common definitions of the application's function keys, are possible.
In one implementation, each function module may define one or more required control keys, where a required control key is a control key that is always displayed when the corresponding function module is displayed in the control area. In addition, each function module may define one or more preferred control keys, where a preferred control key is a control key that is preferentially displayed after all required control keys of the corresponding function module are displayed in the control area. The priority order among different control keys of the same function module may be set as follows: the required control keys have the highest priority, the preferred control keys come next, and the other control keys come last.
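The information acquired from the application side — function modules, their control keys, and the required/preferred/other priority classes — could be represented as a manifest like the following. The module and key names are hypothetical; only the three-level priority scheme comes from the text above.

```python
REQUIRED, PREFERRED, OTHER = 0, 1, 2  # lower value = higher priority

# (module_name, module_class, [(key_name, key_class), ...]) — hypothetical slide-editor data
app_manifest = [
    ("edit",      REQUIRED,  [("copy", REQUIRED), ("paste", REQUIRED), ("cut", PREFERRED)]),
    ("slides",    PREFERRED, [("browse", REQUIRED), ("reorder", OTHER)]),
    ("shortcuts", OTHER,     [("translate", OTHER)]),
]

# Sort the keys of one module by their priority class (stable sort keeps declared order within a class)
edit_keys = sorted(app_manifest[0][2], key=lambda kv: kv[1])
print([k for k, _ in edit_keys])  # ['copy', 'paste', 'cut']
```

Step 6204 below consumes exactly this kind of information when filling the application control bar.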
Optionally, in another implementation manner, the display contents of the application control bar for different display areas, including the function modules and control keys in the application control bar and the layout manner of the application control bar, may be directly defined by the developer of the application program. For example, the developer may set application control bar display manner 1 for display area 1, display manner 2 for display area 2, display manner 3 for display area 3, and so on. Display area 1, display area 2, and display area 3 may each refer to a range rather than a specific size; in this case, the system may select the display manner of the corresponding application control bar according to the actual display area of the application control bar.
In this implementation manner, the information of the target application acquired by the system from the application program may include display manners of the application control columns for different display areas, specifically which function modules are included in the display manners of each application control column, which control keys are included in each function module, and a layout manner of the application control columns. The application control bar displayed in the control area may be displayed in exactly the same manner as provided by the application program.
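A minimal sketch of selecting a developer-defined display manner by display-area range might look as follows; the area thresholds (in square pixels) and module names are invented for illustration.

```python
# Hypothetical developer-provided display manners, keyed by a minimum display area
LAYOUTS = [
    (0,       ["edit"]),                        # smallest area: required module only
    (100_000, ["edit", "slides"]),              # medium area
    (200_000, ["edit", "slides", "shortcuts"]), # largest area
]

def pick_layout(area_px2: int):
    """Pick the richest display manner whose minimum area fits the current control-area size."""
    chosen = LAYOUTS[0][1]
    for min_area, modules in LAYOUTS:
        if area_px2 >= min_area:
            chosen = modules
    return chosen

print(pick_layout(150_000))  # ['edit', 'slides']
```

Because each entry covers a range, the system needs no per-pixel layout logic: it only matches the current area against the thresholds the developer declared.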
Optionally, in another implementation, the system may identify each function module of the application program and the control keys in the function modules through a text or image recognition technology, the system formulates a priority order for each function module and control key of the application program according to the use frequency or importance of the user, and then the system determines which function modules and control keys are displayed in the application control field according to the formulated priority order, and determines a specific layout mode. In this implementation, the system may not obtain additional information from the application side.
It should be understood that the above three implementations are only exemplary, and other possible implementations, or implementations emerging as technology develops, are possible.
In another embodiment, the operation of the target application by the user is an operation of the operation interface of the target application, for example, selecting specific content on the operation interface of the target application, placing a cursor at a specific position of the operation interface of the target application, and the like. When the operation of the user on the target application is the operation on the operation interface of the target application, a shortcut control key associated with the operation can be displayed in the application control bar.
The operation of the user on the operation interface of the target application includes any possible operation when the user performs a specific function through the target application. In one implementation, the operation of the user on the target application may be to select a specific object of the operation interface, for example, to select a specific character, a symbol, a picture, a table, an audio/video, and the like, and the user may select the specific object in various manners such as a touch screen gesture or a mouse operation, for example, a cursor may be moved to the specific object by the touch screen gesture or the mouse operation, the specific object (shading of the specific object is deepened) may be selected by the touch screen gesture or the mouse operation, and the like. In another implementation manner, the operation of the user on the operation interface of the target application may be a unique gesture or a unique manner of operating the mouse, for example, by sliding a gesture or scrolling a mouse wheel to scroll the content of the target area, so as to browse the content of the target area. It should be understood that the above operations are exemplary only, and that other operations that a user may perform on a target application during use of an electronic device are possible.
Different operations of the user on the operation interface of the target application may correspond to different control key groups, and the control keys in a control key group may be a set of shortcut operation keys associated with the specific operation. As shown in fig. 65A, in one implementation, the specific operation of the user on the target application may be selecting specific text content, for example, placing the cursor on the text content, and the corresponding control key group may include a set of control keys such as copy, paste, cut, font, text size, paragraph, definition, synonym, translation, and search using a network. As shown in fig. 65B, in one implementation, the specific operation may be selecting specific picture content, and the corresponding control key group may include a set of control keys such as copy, paste, cut, set picture format, change picture, place on top, place on bottom, and save picture. As shown in fig. 65C, in one implementation, the specific operation may be selecting specific table content, and the corresponding control key group may include a set of control keys such as copy, paste, cut, set table format, insert row, insert column, and delete table. As shown in fig. 65D, in one implementation, the specific operation may be selecting specific video content, and the corresponding control key group may include a set of control keys such as play, pause, increase volume, decrease volume, increase brightness, decrease brightness, picture-in-picture, copy video address, project, loop, and progress bar. As shown in fig. 65E, in one implementation, the specific operation may be selecting specific audio content, and the corresponding control key group may include a set of control keys such as play, pause, next, volume up, volume down, copy audio address, loop, and progress bar. As shown in fig. 65F, in one implementation, the specific operation may be browsing the content in a target area through a swipe gesture or the scroll wheel of a mouse, and the corresponding control key group may include a thumbnail of the target area, a positioning frame for quickly positioning the target content in the thumbnail, and the like.
Optionally, in one implementation, different control key groups may be defined by the system for different user operations. Displaying different shortcut control key groups for different operations of the user on the target application can meet the user's needs, provide more convenient operation, and improve the user's operation efficiency. In another implementation, the control key group may be defined as the set of control keys displayed when the right mouse button is clicked at the current cursor position. This simple design avoids secondary development by the application developer, reducing the developer's burden and shortening the development cycle.
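The mapping from the type of object the user selects to a shortcut control key group, as in figs. 65A-65F, can be sketched as a lookup table. The key names below abbreviate the lists above and are illustrative only.

```python
# Hypothetical operation-type → shortcut-key-group table, following figs. 65A–65F
SHORTCUT_GROUPS = {
    "text":    ["copy", "paste", "cut", "font", "paragraph", "translate"],
    "picture": ["copy", "paste", "cut", "picture_format", "change_picture", "save_picture"],
    "table":   ["copy", "paste", "cut", "table_format", "insert_row", "insert_column", "delete_table"],
    "video":   ["play", "pause", "volume_up", "volume_down", "picture_in_picture", "loop"],
    "audio":   ["play", "pause", "next", "volume_up", "volume_down", "loop"],
}

def shortcut_group(selected_object_type: str):
    """Return the shortcut control-key group for the object the user selected."""
    return SHORTCUT_GROUPS.get(selected_object_type, [])

print(shortcut_group("table"))
```

An unknown operation type yields an empty group, so the application control bar simply keeps its current contents in that case.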
Step 6203: and acquiring the display area of the control area.
Step 6203 is an optional step, and when the display area of the control area is fixed, step 6204 may be directly performed without obtaining the display area of the control area. When the display area of the control area can be flexibly adjusted, step 6203 can be performed.
Optionally, in an implementation manner, the display area of the control area may be flexibly adjusted. Alternatively, the initial display area of the control area may be different each time the control area is opened; for example, different applications may correspond to different initial display areas. In one implementation, the initial display area of the control area may be defined by the system, which may define different initial display areas for different applications. In another implementation, the initial display area may be user-defined, and the user may define different initial display areas for different applications. In another implementation, the initial display area may default to the display area of the control area when it was last opened for the application. It should be understood that other common ways of defining the initial display area of the control area are possible.
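One way to resolve the initial display area from the three sources just mentioned (user-defined, last-used, system default) is a simple precedence chain. The precedence order and the fallback value below are assumptions, since the text presents the three sources as alternative implementations rather than a fixed hierarchy.

```python
def initial_area(app: str, user_prefs: dict, last_used: dict,
                 system_defaults: dict, fallback: int = 60_000) -> int:
    """Resolve the control area's initial display area for an application:
    user setting first, then the area from the last session, then the system default."""
    for source in (user_prefs, last_used, system_defaults):
        if app in source:
            return source[app]
    return fallback

print(initial_area("slides", {}, {"slides": 120_000}, {"slides": 90_000}))  # 120000
```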
By flexibly setting the display area of the control area, the function modules and control key groups displayed in the control area can better match the user's habits, providing more convenient operation for the user and improving the user's operation efficiency.
Optionally, as shown in fig. 63A, in one implementation, the control area may be disposed above the virtual keyboard or another application interface in the second display screen. As shown in fig. 63B, in another implementation, the control area may be displayed on the left or right side of the second display screen. As shown in fig. 63C, in another implementation, the control area may be displayed in a middle position of the second display screen. As shown in fig. 66A, in another implementation, when neither the virtual keyboard nor another application is open in the second display screen, the target application in the first display screen may occupy part of the display area of the second display screen, and correspondingly, the control area may be located at the bottom end of the second display screen. In another implementation, as shown in fig. 66B, the two display screens of the dual-screen electronic device may be placed side by side; in this case, the virtual keyboard may adopt a split design located at the lower ends of the two display screens, and correspondingly, the control area may be located in the middle of the split keyboard.
Step 6204: a set of control keys is displayed in the control area.
In step 6204, the function modules and control key groups to be contained in the control area are determined on the basis of comprehensively considering the information acquired in one or more of the above steps, and the function modules and control key groups are displayed in the control area.
Optionally, as shown in fig. 67, in an implementation, the control area may include the following areas:
1) System control bar
The system control bar is mainly used for displaying a control key set related to system control. Optionally, the system control bar may include a system control function module and a program dock function module. The system control function module may include a control key group for operating the operating system; for example, it may include control keys for adjusting volume, adjusting brightness, querying the weather, viewing the time, viewing the calendar, viewing alarms, viewing system notifications, and the like. The program dock function module may include a control key group for switching among multiple task programs in the system; for example, it may include a list of currently running programs, a list of commonly used/favorite applications, a list of recently used applications, a list of desktop applications, or the like. Optionally, the control key set related to system operation in the system control bar may be a relatively fixed set defined by the system, or the set defined by the system may be adjusted by the user according to usage habits.
2) Application control bar
The application control bar is mainly used for displaying a control key group corresponding to the target application, and the application control bar can contain one or more function modules corresponding to the target application and/or a shortcut function module associated with the operation of the target application by a user.
In one implementation manner, when the user's operation on the target application is displaying the operation interface of the target application in the first display screen, the control key group corresponding to the target application can be displayed in the control area. As described in step 6202, optionally, in one implementation manner, the developer of the application program may provide the priority order among the function modules and control keys of the application program, and the system may then determine which function modules and control keys are displayed in the application control bar corresponding to the application in the control area according to the actual situation (the display area of the control area, etc.), and determine the layout manner of the application control bar.
Preferably, when the display area of the control area is at its minimum, the control key set of the target application may include the required function modules of the target application and the required control keys in those required function modules.
Preferably, when the display area of the control area is larger than the minimum display area, in one implementation manner, control keys may be added to the control key set of the target application according to the overall priority order shown in fig. 68, comprehensively considering the priority order of the function modules and the priority order of the control keys of the target application. Specifically, the required control keys of the required function modules have the highest priority, followed in order by the preferred control keys of the required function modules, the required control keys of the preferred function modules, the preferred control keys of the preferred function modules, the required control keys of the other function modules, the preferred control keys of the other function modules, the other control keys of the required function modules, the other control keys of the preferred function modules, and the other control keys of the other function modules.
Therefore, as the initial display area of the control area gradually increases, the required control keys of the required function modules are added to the control key set of the target application first, followed by the preferred control keys of the required function modules, the required control keys of the preferred function modules, the preferred control keys of the preferred function modules, the required control keys of the other function modules, the preferred control keys of the other function modules, the other control keys of the required function modules, the other control keys of the preferred function modules, and finally the other control keys of the other function modules. Within each category of control keys of each category of function module, control keys are added for display according to the priority order of the individual control keys.
Preferably, when the display area of the control area is larger than the minimum display area, in another implementation manner, control keys may be added to the control key set of the target application according to the priority order shown in fig. 69, comprehensively considering the priority order of the function modules and the priority order of the control keys of the target application. Specifically, the required control keys of the required function modules have the highest priority, followed in order by the preferred control keys of the required function modules, the required control keys of the preferred function modules, the preferred control keys of the preferred function modules, the other control keys of the required function modules, the other control keys of the preferred function modules, the required control keys of the other function modules, the preferred control keys of the other function modules, and the other control keys of the other function modules.
Therefore, as the initial display area of the control area gradually increases, the required control keys of the required function modules are added to the control key set of the target application first, followed by the preferred control keys of the required function modules, the required control keys of the preferred function modules, the preferred control keys of the preferred function modules, the other control keys of the required function modules, the other control keys of the preferred function modules, the required control keys of the other function modules, the preferred control keys of the other function modules, and finally the other control keys of the other function modules. Within each category of control keys of each category of function module, control keys are added for display according to the priority order of the individual control keys.
It should be understood that the above two priority orders are only exemplary, and other defining manners of priorities according to the usage habits of the user are possible.
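The first of the two tier orders (fig. 68) can be implemented as a straightforward fill loop: given a capacity derived from the display area, keys are added tier by tier until the capacity is exhausted. The assumption that every key occupies one equal-sized slot, and all names below, are illustrative only.

```python
REQUIRED, PREFERRED, OTHER = 0, 1, 2  # priority classes for modules and keys

def select_keys(modules, capacity):
    """modules: list of (module_class, [(key_name, key_class), ...]).
    Fill up to `capacity` key slots following the fig. 68-style tier order,
    where each tier is a (module_class, key_class) pair."""
    tier_order = [(REQUIRED, REQUIRED), (REQUIRED, PREFERRED), (PREFERRED, REQUIRED),
                  (PREFERRED, PREFERRED), (OTHER, REQUIRED), (OTHER, PREFERRED),
                  (REQUIRED, OTHER), (PREFERRED, OTHER), (OTHER, OTHER)]
    shown = []
    for mod_class, key_class in tier_order:
        for m_class, keys in modules:
            if m_class != mod_class:
                continue
            for name, k_class in keys:
                if k_class == key_class and len(shown) < capacity:
                    shown.append(name)
    return shown

mods = [
    (REQUIRED,  [("copy", REQUIRED), ("cut", PREFERRED), ("translate", OTHER)]),
    (PREFERRED, [("browse", REQUIRED)]),
]
print(select_keys(mods, 3))  # ['copy', 'cut', 'browse']
```

The fig. 69 variant would use the same loop with a reordered `tier_order` list; within a tier, the declared order of the keys stands in for their individual priorities.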
As described in step 6202, optionally, in another implementation manner, the developer of the application program may directly define the display contents of the application control bar for different display areas, including the function modules and control keys in the application control bar and the layout manner of the application control bar; the system then selects which of the application control bar display manners corresponding to the application program to display according to the display area of the application control bar. Optionally, in another implementation, the system may identify the function modules of the application program and the control keys in the function modules through text or image recognition technology, assign a priority order to each function module and control key, and then determine which function modules and control keys are displayed in the application control bar according to the assigned priority order, and determine the specific layout manner.
Optionally, the application control bar may include a shortcut function module related to the user's current operation on the target application. The shortcut function module mainly includes a group of shortcut control keys related to the user's current operation on the target application, for example, the sets of control keys corresponding to different user operations listed in step 6203. In one implementation, the shortcut control keys related to the user operation may be defined by the application developer; that is, the application developer sets a corresponding shortcut control key set for each of the different operations the user may execute in the target application. In another implementation, the control keys related to the user operation may be defined by the system; that is, the system sets shortcut operation control key sets corresponding to different types of user operations.
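The two sources of the operation-to-shortcut mapping can be sketched as a lookup that prefers the developer's table and falls back to the system's per-type table. Operation names and key names are assumptions:

```python
# Illustrative mapping (all names assumed) from a user's current operation on
# the target application to the shortcut control-key group shown in the bar.
# Either the application developer or the system may supply such a table.

DEV_SHORTCUTS = {            # developer-defined, per in-app operation
    "select_text": ["copy", "cut", "highlight"],
    "select_image": ["save", "share", "crop"],
}

SYSTEM_SHORTCUTS = {         # system-defined, per operation *type*
    "text_input": ["paste", "undo"],
}

def shortcut_group(operation, op_type):
    """Prefer the app's own definition; fall back to the system's per-type set."""
    return DEV_SHORTCUTS.get(operation) or SYSTEM_SHORTCUTS.get(op_type, [])
```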
In another implementation, when the user's operation on the target application is an operation on an operation interface of the target application, a shortcut control key group associated with that operation can be displayed in the control area. Optionally, in one implementation, only the control key group associated with the user's operation on the operation interface of the target application may be displayed in the application control bar; that is, the initial control key group corresponding to the target application that was originally displayed in the application control bar is replaced by the control key group associated with the user's operation on the operation interface of the target application. In another implementation manner, the initial control key group corresponding to the target application and the control key group associated with the user's operation on the operation interface of the target application can also be displayed together in the application control bar; that is, the control key group associated with the user's operation on the operation interface of the target application is added on top of the initial control key group corresponding to the target application.
In determining which shortcut control keys corresponding to the user operation are displayed in the application control bar, implementation logic similar to that described above for determining the control keys corresponding to the target application may be employed. In one implementation, the priority order of the shortcut control keys related to the user operation may be defined by the system, and which shortcut control keys are displayed in the application control bar may then be determined according to the display area of the application control bar. In another implementation manner, the system may define corresponding shortcut control key groups for different display areas of the application control bar, and then determine the shortcut control key group displayed in the application control bar according to its actual display area.
Step 6205: and hiding the display of the control key group in the control area in the first display screen.
Preferably, after the control area on the second display screen is activated and the related control key group is displayed in the control area, the display of that control key group in the control area on the first display screen may be hidden, so as to save display space on the first display screen and enlarge the display area of the main display interface of the target application or of other function modules in the first display area. Hiding the control key group may mean not displaying the control keys of the control area on the first display screen at all, displaying them on the first display screen in a folded (collapsed) manner, or fading them, for example by graying the control keys out.
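The three hiding treatments just described can be sketched as a small state function. The mode names and the return shape are illustrative assumptions:

```python
# Sketch of the three hiding treatments for the first screen's copy of the
# control key group: remove entirely, collapse into a folded entry, or keep
# but gray out. Mode names and return values are assumptions.

def hide_on_first_screen(keys, mode="remove"):
    """Return (visible_keys, style) for the first display screen.

    mode: "remove"   -> keys disappear entirely
          "collapse" -> keys fold into a single expander entry
          "fade"     -> keys stay visible but are grayed out
    """
    if mode == "remove":
        return [], "normal"
    if mode == "collapse":
        return ["<expand>"], "normal"
    if mode == "fade":
        return list(keys), "grayed"
    raise ValueError(mode)
```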
After the display of the control key group of the application control bar is removed from the first display screen, the display content of the first display screen can be adaptively adjusted. In one implementation, after the display of the control key set of the target application is removed from the first display screen, the size of the display content of the main display interface or of other function modules of the target application may be increased, for example by enlarging display fonts or display pictures, and the layout on the first display screen adjusted accordingly. This implementation makes browsing easier for the user and improves user experience.
In another implementation manner, after the control key set of the target application is removed from the first display screen, display content may be added: content may be added to the main display interface of the target application, function modules that were not displayed before may be added to the first display screen, and previously hidden content of the function modules already displayed on the first display screen may be shown, with the layout on the first display screen adjusted accordingly. For example, the application program may define a plurality of layout modes containing different control keys for display on the first display screen, and the system may select the matching layout mode according to the control key group displayed in the application control bar. Adding display content to the first display screen after the control key set of the target application is removed allows more detailed content or operation modes of the target application to be presented and provides more convenient operation for the user. Of course, any one or more of these three kinds of display content may be added simultaneously, and adding display content may be combined with enlarging display content. It should be understood that, when the display of the control key set of the target application is removed from the first display screen, other ways of changing the layout of the content on the first display screen that benefit user experience are also possible.
Step 6206: and closing the control area.
Optionally, when the user temporarily does not need the control area, the control area may be closed in a variety of ways. In one implementation, the control area may be associated with a virtual keyboard and closed by default when the virtual keyboard is closed; at this time, an instruction to close the virtual keyboard closes the virtual keyboard and the control area simultaneously, as shown in fig. 70A. In another implementation manner, as shown in fig. 70B, a control switch for the control area may be disposed on the virtual keyboard, and when the virtual keyboard is in the on state, the control area may be closed through this switch. In another implementation, as shown in fig. 70C, the control area may be closed through gesture control; that is, a gesture corresponding to closing the control area is stored in the storage module, and when it is detected that the user performs this gesture, the control area is closed. The control gesture may be, for example, a finger sliding the control area toward an edge of the display screen. In another implementation, whether the control area is open may be associated with the display mode of the application: when the full-screen mode of the application is closed, the control area may be closed and part of its contents migrated back to the first display area for display, as shown in fig. 70D.
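The four closing triggers can be consolidated into one event handler. The event names and the wiring below are invented for illustration and do not come from the patent:

```python
# Hedged sketch of the four closing triggers for the control area
# (figs. 70A-70D). Event names are illustrative assumptions.

class ControlArea:
    def __init__(self):
        self.open = True

    def on_event(self, event, keyboard_open=True):
        if event == "virtual_keyboard_closed":        # fig. 70A: closes with keyboard
            self.open = False
        elif event == "keyboard_switch_toggled" and keyboard_open:
            self.open = False                         # fig. 70B: switch on the keyboard
        elif event == "close_gesture":                # fig. 70C: e.g. swipe toward edge
            self.open = False
        elif event == "fullscreen_mode_closed":       # fig. 70D: contents migrate back
            self.open = False
        return self.open
```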
When the user temporarily does not need the control area, temporarily closing its display enlarges the display area available to other applications on the second display screen, reduces the interference of the control area with those applications when it is not needed, and simplifies the user operation interface.
In another implementation, the contents of the control area may always be displayed on the second display screen, and thus step 6206 is optional.
The screen display method 6200 displays a control area on the second display screen, where the control area contains a control key group related to system control and/or a control key group associated with the user's operation interface for a target application, so that the user can operate the system or the target application on the first display screen through the control area on the second display screen. With the assistance of the control area, the user does not need to repeatedly move the cursor on the first screen or repeatedly locate the position of an operation object or control key, which greatly simplifies user operation. Because the control area is displayed on the second screen, it is closer to the user's hands than the first screen and can therefore provide more convenient operation. In addition, after the relevant control key group is displayed in the control area, its display on the first display screen is removed, saving display area on the first display screen; the display content on the first display screen can accordingly be enlarged or increased, improving user experience.
Referring to fig. 71, in an embodiment of the present invention, a screen display method 7100 is provided, which is used for changing the display content of an application control bar in a control area according to the current operation of a user on a target application. The screen display method 7100 may include the steps of:
step 7101: and acquiring the operation of the user on the target application.
After the control area is opened, the user's current operation on the target application is detected in real time, and the control key group displayed in the application control bar is changed according to that operation.
As described in step 6202, in one implementation, the user's current operation on the target application may be displaying an operation interface of the target application on the first display screen. When only the operation interface of the target application is displayed on the first display screen and no operation interfaces of other applications are displayed, all of the control key groups displayed in the application control bar can be replaced by the control key group corresponding to the target application. When the first display screen simultaneously contains the operation interface of the target application and the operation interfaces of other applications, the control key groups displayed in the application control bar can be partially replaced by the control key group corresponding to the target application, or the control key group corresponding to the target application can be added to the control key groups already displayed in the application control bar; that is, control key groups corresponding to a plurality of applications including the target application are displayed in the application control bar at the same time.
In another implementation, the current operation of the user on the target application may be an operation on an operation interface of the target application, as described in step 6202. Specifically, optionally, if the shortcut control key group associated with the user's last operation on the operation interface of the target application is displayed in the application control bar before the user performs the current operation, that part of the shortcut control key group corresponding to the last operation is replaced with the shortcut control key group corresponding to the current operation. If only the control key group corresponding to the target application is displayed in the application control bar before the user performs the operation on the operation interface of the target application, the control key group corresponding to the target application can be replaced by the shortcut control key group associated with the user's current operation, or the shortcut control key group associated with the user's current operation can be added on top of the control key group corresponding to the target application.
The specific implementation manner of step 7101 is the same as that of step 6202, and is not described here again to avoid repetition.
Step 7102: and changing the control key group of the application control bar according to the user operation.
Optionally, in one implementation manner, changing the control key group of the application control bar according to the user operation may mean adding, on the basis of the original control key group in the application control bar, part of a control key group related to the user's current operation on the target application. For example, when the user has only opened the target application without performing an operation on it, the application control bar may contain no shortcut operation control key group corresponding to a user operation; that is, the initial control key group of the application control bar contains only the set of control keys corresponding to the target application and no set of shortcut operation control keys. When the user executes a first operation on the target application, a set of shortcut operation control keys corresponding to that operation can be added to the application control bar; that is, a set of shortcut operation control keys associated with the user's first operation is added to the application control bar.
Optionally, in another implementation, changing the control key group of the application control bar according to the user operation may mean removing, from the original control key group in the application control bar, part of the control key group related to the user's current operation on the target application. For example, when the user's operation changes, that is, when the user performs a second operation different from the first operation, and the set of shortcut control keys corresponding to the second operation is contained in the set of shortcut control keys corresponding to the first operation (the second operation's shortcut control keys being fewer than the first operation's), the shortcut control keys irrelevant to the second operation may be removed from the application control bar according to the user's second operation on the target application.
Optionally, in another implementation manner, changing the control key group of the application control bar according to the user operation may mean partially or completely replacing the control key group originally displayed in the application control bar with a new control key group. For example, when the user's operation changes, that is, when the user performs a second operation different from the first operation, and the second operation is only weakly correlated with the first operation, the shortcut control key group associated with the user's first operation in the application control bar may be partially or completely replaced with the shortcut control key group associated with the user's second operation.
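The three change strategies above (add, remove, replace) can be consolidated into one update rule. Using set containment as the correlation test is an assumption made for this sketch:

```python
# Illustrative consolidation of the three change strategies: add keys when the
# new operation's shortcut set contains the current one, drop keys when it is
# contained in the current one, and replace outright when the two operations
# are only weakly related. The containment-based correlation test is assumed.

def update_bar(current, new):
    cur, nxt = set(current), set(new)
    if nxt >= cur:                       # superset: add the missing keys
        return current + [k for k in new if k not in cur]
    if nxt <= cur:                       # subset: drop keys now irrelevant
        return [k for k in current if k in nxt]
    return list(new)                     # weakly related: replace the group
```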
Optionally, if the number of shortcut operation control keys corresponding to the user's current operation on the target application is large, the application control bar may become crowded or unable to display all of the shortcut operation control keys. For example, according to the user's current operation on the target application, a set of control keys needs to be added to the application control bar, or the number of control keys in the set replacing the originally displayed control key group is greater than the number of originally displayed control keys. In this case, the display areas of the application control bar and the control area can be adaptively increased, so that the application control bar displays all shortcut operation control keys corresponding to the user's current operation on the target application. Adaptively increasing the display area of the application control bar when more control keys need to be displayed optimizes the display of the application control bar, avoids the control keys being displayed too small, and provides a better operation experience for the user.
Optionally, if the number of shortcut operation control keys corresponding to the user's current operation on the target application is small, the application control bar may have an idle display area. For example, according to the user's current operation on the target application, a set of control keys needs to be removed from the application control bar, or the number of control keys in the set replacing the originally displayed control key group is smaller than the number of originally displayed control keys. In this case, the display areas of the application control bar and the control area can be adaptively reduced, so that the display area of the application control bar matches the shortcut operation control keys corresponding to the user's current operation on the target application. Adaptively reducing the display area of the application control bar when fewer control keys need to be displayed optimizes the display of the application control bar, avoids an idle display area in the application control bar, saves display space on the second display screen, allows the display areas of other applications on the second display screen to be enlarged, and provides a better operation experience for the user.
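The adaptive sizing rule in the last two paragraphs amounts to scaling the bar's display area with the key count, within bounds. The per-key footprint and the bounds below are invented constants:

```python
# Sketch of adaptive sizing: the application control bar grows or shrinks so
# each control key keeps a comfortable footprint, avoiding both crowding and
# idle display area. All constants are illustrative assumptions.

KEY_AREA_PX = 4000          # assumed comfortable footprint per control key
MIN_AREA_PX = 8000          # assumed smallest usable bar area
MAX_AREA_PX = 80000         # assumed largest bar area the second screen allows

def bar_area_for(keys):
    """Display area that fits all keys without crowding or idle space."""
    area = len(keys) * KEY_AREA_PX
    return max(MIN_AREA_PX, min(MAX_AREA_PX, area))
```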
Step 7103: hiding display of the control group displayed by the application control bar in the first screen.
Preferably, after the control key group in the application control bar is changed according to the user's operation, the display of the control keys of the application control bar on the first display screen may be hidden; the specific implementation is as described in step 6205.
In the screen display method 7100, the control keys displayed in the application control bar are changed as the user's current operation on the target application changes, so that the control key group corresponding to the user's current operation on the target application is displayed in the application control bar at all times. This meets the user's operation requirements to the greatest extent, provides more efficient operation, and improves the user's operation efficiency.
It should be noted that the number of times steps 7101-7103 are executed is not limited in the embodiments of the present invention, that is, the change of the current operation of the user for the target application may be obtained multiple times, and the control key group displayed on the application control bar may be changed multiple times.
It should be noted that, optionally, in addition to steps 7101-7103, the screen display method 7100 may further include one or more steps of the screen display method 6200; the specific implementation is as described for the screen display method 6200 and is not repeated here to avoid repetition.
Referring to fig. 72, in an embodiment of the present invention, a screen display method 7200 is provided, where the screen display method is used for changing the display area of the application control bar and the control key group in the application control bar according to an operation of changing the display area of the application control bar by a user. Screen display method 7200 can include the steps of:
step 7201: and acquiring an operation of changing the display area of the application control bar instructed by the user.
The user's needs during use of the terminal often change in real time. In some cases, the user may want the application control bar of the control area to display more control keys to better assist operation, for example when the user is performing more complicated operations on the target application. Enlarging the display area of the application control bar then provides more control keys for the user and improves the user's operation efficiency. In other cases, the user may want the application control bar of the control area to display relatively few control keys, for example when the user is simultaneously performing other operations on the second display screen and wants a larger display interface for other applications, or when the user is performing simpler operations on the target application. Reducing the display area of the application control bar then saves display space on the second display screen, and reducing the control keys in the application control bar lets the user locate the required control key simply and quickly, improving the user's operation efficiency and user experience.
The user can change the display area of the application control bar in various ways. In one implementation, the user may indirectly change the display area of the application control bar by changing the display area of the control area. For example, as shown in fig. 73A, the display area of the control area may be enlarged through an enlarge button on the control area, indirectly enlarging the display area of the application control bar; the area of the control area may be reduced through a reduce button on the control area, indirectly reducing the display area of the application control bar. In addition, as shown in fig. 73B, the display area of the control area can be enlarged through an enlarge gesture, indirectly enlarging the display area of the application control bar; as shown in fig. 73C, the display area of the control area can be reduced through a reduce gesture, indirectly reducing the display area of the application control bar. In another implementation, the user may indirectly change the display area of the control area, and thereby the display area of the application control bar, by changing the display area of other applications on the second display screen. For example, as shown in fig. 74A, the display area of another application may be enlarged through an enlarge button on that application on the second display screen, indirectly reducing the display area of the control area and thus the display area of the application control bar; the display area of another application may be reduced through a reduce button on that application, indirectly enlarging the display area of the control area and thus the display area of the application control bar. In addition, as shown in fig. 74B, the user may reduce other application interfaces on the second display screen through a reduce gesture and expand the display area of the control area, thereby expanding the display area of the application control bar; as shown in fig. 74C, the user may enlarge other application interfaces on the second display screen through an enlarge gesture and reduce the display area of the control area, thereby reducing the display area of the application control bar. In another implementation, the user may directly operate the application control bar to change its display area. For example, as shown in fig. 75A, the display area of the application control bar can be enlarged through an enlarge button on the application control bar, or reduced through a reduce button on the application control bar. In addition, as shown in fig. 75B, the user can enlarge the display area of the application control bar through an enlarge gesture, and as shown in fig. 75C, reduce it through a reduce gesture.
In another implementation, the display area of the application control bar may also be changed according to the user's operation on the application on the first display screen. Specifically, when the number of control keys corresponding to the user's current operation is greater than the number of control keys corresponding to the user's previous operation, the display area of the application control bar may be appropriately increased in order to display all the control keys and ensure their display effect. For example, when the user's previous operation was opening an application, the initial control keys corresponding to the application may be displayed in the application control bar; when the user's current operation is an operation performed on the interface of the target application, control keys for the current operation may be added to the application control bar, and in this case the display area of the application control bar may be appropriately increased. When the number of control keys corresponding to the current operation is smaller than the number corresponding to the user's previous operation, the display area of the application control bar may be appropriately reduced in order to save display area on the second display screen.
In another implementation manner, the display area and position of the control area can flexibly adapt to display changes of other function modules on the second display screen, and the display area of the application control bar can be adjusted along with them. For example, when a user triggers different types of virtual keyboards through different gestures, the display area and position of the control area can be flexibly determined according to the display areas of the different types of virtual keyboards. The specific implementation of triggering different types of virtual keyboards through different gestures is described in the second embodiment and is not repeated here.
In another implementation, when the user indicates to start the handwriting input mode of the target application, an interface of the target application is displayed on the second display screen, so that the user can perform handwriting input through the second display screen. In this case, because the interface of the target application is already displayed on the second display screen, the application control bar may not be displayed on the second display screen. The specific implementation of switching between the handwriting input mode and the virtual keyboard input mode is described in the third embodiment and is not repeated here.
In another implementation manner, when the user switches the input mode to the handwriting input mode, a handwriting input area may be displayed in the application control bar, a control key group associated with the handwriting input mode (for example: pen, eraser, color, font, etc.) may also be displayed in the application control bar, or the handwriting input area and the control key group associated with the handwriting input mode may be displayed in the application control bar at the same time. The user can then perform handwriting input through the application control bar and/or operate the handwriting input mode through the application control bar, which improves operation efficiency. The specific implementation of switching to the handwriting input mode is described in the third embodiment and is not repeated here.
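The three handwriting-mode bar compositions (area only, keys only, or both) can be sketched as follows; the module and key names are assumptions, with the key group taken from the examples in the text:

```python
# Illustrative composition of the application control bar in handwriting input
# mode: a handwriting input area, its associated key group, or both at once.
# Names are assumptions; the key list follows the examples in the text.

HANDWRITING_KEYS = ["pen", "eraser", "color", "font"]

def handwriting_bar(show_area=True, show_keys=True):
    parts = []
    if show_area:
        parts.append("handwriting_input_area")
    if show_keys:
        parts.extend(HANDWRITING_KEYS)
    return parts
```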
Step 7202: and changing the display area and the control key set of the application control bar according to the user operation.
When the user operation instructs to enlarge the display area of the application control bar, the display area of the application control bar is enlarged according to the degree of the user operation. For example, when the user enlarges the display area by clicking the enlarge button, the extent to which the display area of the application control bar is enlarged may be determined according to the number of clicks. When the user enlarges the display area through an enlarge gesture, the extent of enlargement can be determined according to the degree of the gesture.
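Mapping the degree of the user operation to an enlargement amount can be sketched as a step count derived from clicks or gesture magnitude. The step size and the pixels-per-step conversion are invented constants:

```python
# Sketch: each click of the enlarge button, or each assumed unit of gesture
# spread, grows the application control bar by one step. Constants are
# illustrative assumptions.

STEP_PX = 5000                # assumed area added per step
SPREAD_PER_STEP_PX = 100      # assumed gesture spread equal to one step

def enlarged_area(base_area, clicks=0, gesture_spread_px=0):
    steps = clicks + gesture_spread_px // SPREAD_PER_STEP_PX
    return base_area + steps * STEP_PX
```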
As shown in fig. 76A, while the display area of the application control bar is enlarged, the control keys in the control key group corresponding to the target application in the application control bar can be increased. Optionally, in one implementation, while the display area of the application control bar is enlarged, control keys may be added to the existing function modules in the application control bar. In another implementation manner, while the display area of the application control bar is enlarged, a new function module and its corresponding set of control keys may be added to the application control bar. In another implementation manner, while the display area of the application control bar is enlarged, control keys may be added to the existing function modules and a new function module with its corresponding set of control keys may be added at the same time.
In one implementation, as described in step 6202, the system may add some control keys, in order from high priority to low priority according to the priorities of the function modules and control keys, on the basis of the control key set already displayed in the application control bar, and determine the layout of the application control bar after the addition. Preferably, when the display area of the application control bar is enlarged, as shown in fig. 76A, the control key group originally displayed in the application control bar may be moved downward, and the newly added control key group may be displayed above it; that is, in the enlarged application control bar, the newly added control key group is closer to the first display screen than the originally displayed control key group. In this implementation, the priority of the newly added control key group is lower than that of the control key group originally displayed in the application control bar. With this arrangement, when the display area of the application control bar is enlarged, the control keys with higher priority (those with more important functions or higher frequency of use) always stay at positions closer to the user's hands, providing more convenient operation and improving the user's operation efficiency.
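The row ordering in fig. 76A can be expressed as a one-line rule: new (lower-priority) rows stack on top, original rows shift down toward the user's hands. The row-of-keys data model is an assumption:

```python
# Sketch of the enlargement layout in fig. 76A: newly added lower-priority
# rows appear above (nearer the first display screen), while the originally
# displayed higher-priority rows shift down, staying nearest the user's
# hands. The rows-of-keys data model is an assumption.

def rows_after_enlarge(original_rows, new_rows):
    """Top-to-bottom row order of the enlarged application control bar."""
    return list(new_rows) + list(original_rows)
```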
In another implementation, the system may select, according to the display area of the application control bar, a display mode of the application control bar corresponding to the display area provided by the application program, and display the selected display mode in the control area.
Preferably, when the display area of the application control bar is enlarged so that the number of control keys corresponding to the target application in the application control bar increases, the newly added control keys may be hidden on the first display screen; the specific implementation manner and beneficial effects are described in step 6205.
When the user operation instructs to reduce the display area of the application control bar, the display area of the application control bar is reduced according to the degree of the user operation. For example, when the user reduces the display area by clicking a reduction button, the extent to which the display area of the application control bar is reduced can be determined according to the number of clicks. When the user reduces the display area through a reduction gesture, the extent to which the display area of the application control bar is reduced can be determined according to the degree of the reduction gesture.
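Mapping the magnitude of the user operation (click count or gesture extent) to the degree of resizing can be sketched as below. The step size, limits, and the pinch-ratio interpretation are illustrative assumptions, not values from the embodiment.

```python
MIN_AREA, MAX_AREA, STEP = 100, 500, 50  # assumed units and limits

def resize_control_bar(current_area, clicks=0, gesture_ratio=None, enlarge=True):
    """Resize by click count or by the extent of a resize gesture.

    `gesture_ratio` models a pinch/spread scale factor; `clicks` models
    repeated presses of an enlarge/reduce button. The result is clamped
    to the bar's minimum and maximum display areas.
    """
    if gesture_ratio is not None:
        target = current_area * gesture_ratio
    else:
        delta = STEP * clicks
        target = current_area + delta if enlarge else current_area - delta
    return max(MIN_AREA, min(MAX_AREA, int(target)))
```

For example, two clicks of the reduction button shrink a 200-unit bar to 100 units, while a 1.5x spread gesture enlarges it to 300 units.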
As shown in fig. 76B, while the display area of the application control bar is reduced, the number of control keys in the control key group corresponding to the target application in the application control bar can be reduced. Optionally, in one implementation manner, while the display area of the application control bar is reduced, the number of function modules in the application control bar may be kept unchanged and the number of control keys within those modules reduced. In another implementation, while the display area of the application control bar is reduced, the number of function modules and their corresponding sets of control keys may be reduced. In another implementation manner, while the display area of the application control bar is reduced, the number of function modules and their corresponding control key sets, as well as the number of control keys in the remaining function modules, can be reduced at the same time.
In one implementation, the system may decrease a portion of the control keys based on the set of control keys displayed in the application control bar according to the priorities of the function modules and the control keys, and determine the layout of the application control bar after decreasing the portion of the control keys, as described in step 6202. In another implementation, the system may select a display mode of the application control bar provided by the application program according to the display area of the application control bar, and display the selected display mode in the control area.
Preferably, when the display area of the application control bar is reduced so that the control keys corresponding to the target application in the application control bar are reduced, the removed control keys can be restored to the first display screen, so that when the user needs these keys, they can be operated on the first display screen in the traditional manner, through a touch screen gesture or a mouse.
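A minimal sketch of this reduce-and-restore behaviour: the highest-priority keys stay in the bar, and the keys that no longer fit are handed back to the first display screen. Key names and the slot count are illustrative assumptions.

```python
def shrink_control_bar(displayed_by_priority, slots):
    """Keep the `slots` highest-priority keys in the application
    control bar; keys that no longer fit are restored to the first
    display screen's conventional menus."""
    kept = displayed_by_priority[:slots]
    restored_to_first_screen = displayed_by_priority[slots:]
    return kept, restored_to_first_screen

kept, restored = shrink_control_bar(["copy", "paste", "cut", "bold"], 2)
# kept == ["copy", "paste"]; restored == ["cut", "bold"]
```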
In the screen display method 7200, the display area of the application control bar and the control key group within it are changed according to the user's operation of changing the display area of the application control bar. This makes the display of the control area more flexible, meets the user's different requirements in different usage scenarios, and improves the user experience.
In addition, when the display area of the application control bar is adjusted, the display areas of other regions in the control area (such as the system control bar) or the display layout of other display modules on the second display screen (such as the display interfaces of other applications and the virtual keyboard) can be adaptively adjusted.
In addition, to help the user quickly locate a desired control key without moving the line of sight to the second display screen, and in particular to keep keys quickly locatable even when the display area and displayed keys of the control area change, an anchor point feedback technique can be added to the control area. In one implementation, feedback may be provided when the user contacts a control key in the application control bar, indicating that the user is now contacting a key in the application control bar. In another implementation, feedback may be provided when the user contacts a control key in the system control bar, indicating that the user is now contacting a key in the system control bar. In another possible implementation manner, control keys in the application control bar or the system control bar that have more important functions or are used more frequently may be set as anchor point feedback keys, so that the user can quickly locate these important or frequently used keys. The specific implementation of anchor point feedback is shown in the first embodiment and is not described again here.
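The three anchor-feedback variants above can be summarized in one dispatch sketch. The key names, region names, and feedback cues (vibration patterns) are assumptions for illustration; the embodiment does not specify the feedback mechanism here.

```python
# Assumed set of "important or frequently used" keys designated as anchors.
ANCHOR_KEYS = {"copy", "paste", "save"}

def on_touch(key, region):
    """Return the feedback cue to emit when `key` in `region` is touched,
    letting the user locate keys without looking at the second screen."""
    if key in ANCHOR_KEYS:
        return "strong-vibration"      # anchor key: distinctive cue
    if region == "application-control-bar":
        return "light-vibration"       # generic app-bar feedback
    if region == "system-control-bar":
        return "double-tick"           # generic system-bar feedback
    return None                        # no anchor feedback elsewhere
```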
It should be noted that, in the embodiment of the present invention, the number of times steps 7201 and 7202 are executed is not limited: the user may perform the operation of changing the display area of the application control bar multiple times, and the system may obtain the user's operation in real time and change the display area of the application control bar and the control key group within it accordingly each time.
It should be noted that, optionally, in addition to step 7201 and step 7202, the screen display method 7200 may further include one or more steps in the screen display method 6200, which is specifically implemented as the screen display method 6200, and is not described herein again to avoid repetition.
The control area displayed on the second display screen has an output function; that is, as a human-computer interaction interface, it displays a subset of the target application's control keys to the user in a configured arrangement. In addition, the control area may have input functions, such as touch screen gestures, and may receive user input so as to perform operations on the target application or on the control area itself.
In one implementation, the control area may receive user input to control functions of the target application. For example, when the target application is mainly used for editing documents, the control key set corresponding to the target application in the control area may include keys for processing text content, such as copy, paste, and cut. The user can then edit the text content by clicking a control key through a touch screen gesture or selecting it with the mouse. When the target application is mainly used for playing video content, the control key set corresponding to the target application may include keys for controlling the video, such as a volume control key, a brightness control key, and a progress control bar. The user can then control the volume, brightness, playing progress, and the like by clicking a control key through a touch screen gesture or selecting the corresponding key with the mouse. It should be understood that these target applications and their control keys are exemplary; other target applications and their usual control keys, as are common in the art, are possible.
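The input role of the control area amounts to mapping a touched control key to an operation on the target application's state. A minimal dispatch sketch follows; the application types, key names, and state fields are hypothetical and chosen to mirror the editor/player examples above.

```python
def dispatch(app, key):
    """Apply the touched control key's action to the target app state."""
    actions = {
        "editor": {
            "copy":  lambda a: a.update(clipboard=a["selection"]),
            "paste": lambda a: a.update(text=a["text"] + a["clipboard"]),
        },
        "player": {
            "volume-up": lambda a: a.update(volume=min(100, a["volume"] + 10)),
        },
    }
    actions[app["type"]][key](app)  # KeyError if the key is not offered
    return app

app = {"type": "editor", "text": "ab", "selection": "ab", "clipboard": ""}
dispatch(app, "copy")    # clipboard now holds the selection
```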
In another implementation, the user may operate the target application in the first display screen through the control key set of the control region and other input methods, for example, the user may select a specific object in the edit page through a mouse or a touch screen gesture on the first display screen, and edit the selected object through the control key set of the control region. It should be understood that the cooperative control of the set of control keys of the control region and the mouse or touch screen gesture described above is merely exemplary, and other possible cooperative modes of operation of the target application in the first display screen are possible.
In one embodiment of the invention, a user can view, edit and customize the control key set of the control area. In one implementation, the control region may support the following touch screen gesture operations by the user:
1) touch screen gesture operations
In one implementation, the control region can be operated by a drag gesture; for example, the drag gesture can be used to drag a control key from one location in the control region to another, as shown in FIG. 77A. The drag gesture can also be used to drag an entire function module in the control region to another position in the control region. The drag gesture may also be used to move the position of the entire control region on the second display screen, as shown in FIG. 77B.
In another implementation, the control region can be operated by a slide gesture, which can be used, for example, to browse the displayed content of the control region. For example, when a function module cannot display all of its control keys because of the limited display area of the control region, the undisplayed control keys can be browsed through a slide gesture, as shown in fig. 77C.
In another implementation, the control region may be operated by a flick gesture, which may be used to remove certain content in the control region, for example, as shown in FIG. 77D.
2) Finger heavy-press gesture operation
The control area can be operated through a finger heavy-press gesture, and different functions can be executed depending on where in the control area the heavy-press gesture is received. As shown in fig. 78A, in one implementation, if a heavy-press gesture is received on a control key in the control area, a delete button for that key may be displayed, and the key can be deleted through the delete button. After the control key is deleted, a vacancy can be shown at the corresponding position together with an add button, through which the user can add a new control key at that position. As shown in fig. 78B, if a heavy-press gesture is received on the boundary between the regions of two function modules, a function of moving the shared edge of the two modules may be triggered, and the user may change the display areas of the two modules by dragging the dividing line. Specifically, the display area of one module is increased while that of the other is decreased; according to the priority order of the control keys in the two modules, keys may be added to the module whose display area increased and removed from the module whose display area decreased.
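The divider-drag behaviour can be sketched as follows: moving the dividing line reassigns width between the two modules, and each module then re-fits its keys in priority order. The per-key width and the key lists are illustrative assumptions.

```python
KEY_WIDTH = 60  # assumed width per control key, in pixels

def drag_divider(left_keys, right_keys, left_width, right_width, dx):
    """Move the divider by `dx` px (positive = toward the right module);
    each module then shows as many of its priority-ordered keys as fit."""
    left_width, right_width = left_width + dx, right_width - dx
    fit = lambda keys, width: keys[: max(0, width // KEY_WIDTH)]
    return fit(left_keys, left_width), fit(right_keys, right_width)

left, right = drag_divider(["a", "b", "c"], ["x", "y", "z"], 120, 180, 60)
# left gains a key slot (now fits 3), right loses one (now fits 2)
```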
3) Hover gesture operation
The control region may be operated by a hover gesture, which may be used to perform a preview operation; for example, as shown in fig. 79, it may be used to view the name of the current control key, a secondary prompt, and the like. The hover gesture may also be used to preview control keys not displayed in the current control region due to the limitation of the display area.
It should be understood that the above list of touchscreen gesture operations is merely exemplary, and other gesture operation modes common in the art are possible.
On the basis of the embodiments corresponding to fig. 60 to fig. 79, in order to better embody the aspects and the advantages of the embodiments of the present application, a specific embodiment is provided below.
Taking the use of a note-taking application on a dual-screen electronic device as an example, as shown in fig. 80A, in the conventional display state all content related to the note-taking application is displayed on the first display screen; for example, fig. 80A may include the main display area of the note content and function menu bars such as the list navigation bar and the fixed menu bar. In the normal display state, the user may operate the note-taking application in the normal manner, for example controlling it on the first display screen through a mouse or touch screen gestures.
When the user needs to start the auxiliary operation of the control area on the second display screen, the control area can be activated in four ways:
1) if the virtual keyboard is in a closed state, the virtual keyboard and the control area can be opened simultaneously when an instruction of opening the virtual keyboard by a user is received.
2) If the virtual keyboard is in the opening state, an instruction of opening the application control bar by a user is received through a control button of the control area on the virtual keyboard, and the control area is opened.
3) The control area may be opened upon receiving a user gesture to open the control area.
4) When the note-taking application is not displayed in full screen mode, the control area may be opened upon receiving an instruction from the user to display the note-taking application in full screen.
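The four activation paths above reduce to a simple condition check over the current state and the received event. The event names are illustrative assumptions.

```python
def should_open_control_area(event, keyboard_open, app_fullscreen):
    """Decide whether the control area opens, per the four paths above."""
    if event == "open-virtual-keyboard" and not keyboard_open:
        return True   # 1) keyboard and control area open together
    if event == "control-area-button" and keyboard_open:
        return True   # 2) control-area button on the virtual keyboard
    if event == "open-control-area-gesture":
        return True   # 3) dedicated opening gesture
    if event == "enter-fullscreen" and not app_fullscreen:
        return True   # 4) app switched to full-screen display
    return False
```

The closing paths listed later in this example are the mirror image of the same check.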
When the system receives the user's instruction to activate the control area, according to the implementation manner in the method embodiment corresponding to the screen display method 6200, the corresponding system control key group and the control key group corresponding to the target application are displayed in the control area on the second display screen according to the display area of the control area, and the display areas of other applications on the second display screen are correspondingly reduced. For example, as shown in fig. 80B, when the initial display area of the control area is smallest, only the system control key group related to system control is displayed in the system control bar of the control area, and part of the control key group corresponding to the target application is displayed in the application control bar.
When an operation of changing the display area of the application control bar is received from the user, for example, as shown in fig. 80C, when the user enlarges the display area of the application control bar through an enlarging gesture, the system enlarges the display areas of the control area and the application control bar according to the user's operation and adds a function module corresponding to the note-taking application, together with its control key group, to the application control bar; at the same time, the function menu bar corresponding to that function module, originally displayed on the first display screen, is removed. As shown in fig. 80D, when the user further enlarges the display area of the application control bar through the enlarging gesture, the system further enlarges the display areas of the control area and the application control bar according to the user's operation and adds another function module and its control key group corresponding to the note-taking application to the application control bar, while removing the corresponding function menu bar originally displayed on the first display screen.
When the user performs an operation on the target application, for example, as shown in fig. 80E, selecting part of the text in the operation interface of the target application, the user's current operation on the target application is obtained, and according to that operation, the corresponding control key group, for example copy, paste, and cut, is displayed in the application control bar. When the user changes the current operation on the target application, for example by selecting part of a picture in the operation interface, the user's new operation is obtained, and the control key group corresponding to the previous operation in the application control bar is changed to the group corresponding to the new operation.
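This context-following behaviour of the application control bar can be sketched as a small state machine. The context names and key groups are an assumed example mapping, not the embodiment's actual table.

```python
# Assumed mapping from the user's current operation context to key groups.
CONTEXT_GROUPS = {
    "text-selected": ["copy", "paste", "cut"],
    "image-selected": ["crop", "rotate", "annotate"],
    "none": [],
}

class AppControlBar:
    def __init__(self):
        self.keys = CONTEXT_GROUPS["none"]

    def on_operation(self, context):
        # Replace the previous operation's key group with the new one.
        self.keys = CONTEXT_GROUPS.get(context, [])
```

Selecting text shows the text-editing group; selecting a picture swaps it for the image-editing group.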
For example, as shown in fig. 80F, the user may select which note in the note-taking application to browse by clicking a control key in the corresponding function module, and may edit the currently displayed note by clicking a control key in another function module. In addition, the user can operate on the application control bar itself. For example, the user may custom-edit the display content of the application control bar, or view the names, functions, or other descriptions of the control keys in the control area through a hover gesture, as shown in fig. 80G.
When the user does not need to use the control area for auxiliary operation, the control area can be closed in four ways:
1) if the virtual keyboard is in an open state, the virtual keyboard and the control area can be closed at the same time when an instruction of closing the virtual keyboard by a user is received.
2) If the virtual keyboard is in an open state, an instruction of closing the application control bar by a user can be received through a control button of the control area on the virtual keyboard, and the control area is closed.
3) The control region may be closed upon receiving a gesture by a user to close the control region.
4) When the note application is displayed in a full screen mode, the control area may be closed upon receiving an instruction from a user to close the full screen display mode.
It will be appreciated that, in order to implement the above-described functions, the electronic device comprises corresponding hardware and/or software modules for performing each function. The present application is capable of implementing the method steps of the examples described in connection with the embodiments disclosed herein in hardware, or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module by corresponding functions, fig. 81 shows a schematic diagram of a possible composition of the electronic device involved in the foregoing method embodiment, and as shown in fig. 81, an electronic device 8100 may include: the display device comprises a first display screen 8101, a second display screen 8102, an input module 8103 and a processing module 8104.
The first display screen 8101 may be used to support the electronic device 8100 to display a target application interface, and/or for other processes for the techniques described herein.
In the electronic device with dual-screen display, the first display screen generally carries an output function, that is, displays the status of the target application and the execution result of the user operation. Optionally, the display content of the first display screen may include a main display interface and a partial function menu bar of the target application. In addition, the first display screen can also bear an input function, and the first display screen can be operated through touch screen gestures to achieve the input function.
The second display screen 8102 may be used to support the electronic device 8100 for displaying control regions, and/or for other processes for the techniques described herein.
In the electronic device with dual-screen display, in one implementation, the second display screen may bear an input function, that is, receive an input from a user, and in another implementation, the second display screen may also bear an output function, and display the state of a target application and the execution result of a user operation. The electronic device 8100 displays the control area on the second display screen, so that a user can control the target application in the first display screen through the control area on the second display screen, the operation efficiency is improved, and the user experience is improved.
Input module 8103 can be used to support electronic device 8100 performing step 6202 in screen display method 6200, can be used to support electronic device 8100 performing step 7101 in screen display method 7100, can be used to support electronic device 8100 performing step 7201 in screen display method 7200, and/or other processes for the techniques described herein.
Specifically, in step 6202 and step 7101, the input module is configured to receive the user's operation on the target application. In one implementation manner, the user may operate the target application with a mouse, in which case the input module may be the mouse. In another implementation, the user may operate the target application through touch screen gestures, in which case the input module may be the first display screen. In another implementation, the user may operate the target application through air gestures, in which case the input module may be a depth camera or the like for collecting gesture information. In step 7201, the input module is configured to receive the user's operation of changing the display area of the application control bar. In one implementation manner, the user may change the display area of the application control bar through touch screen gestures on the second display screen, in which case the input module may be the second display screen. In another implementation, the user may change the display area of the application control bar with a mouse, in which case the input module may be the mouse. In another implementation, the user may change the display area of the application control bar through air gestures, in which case the input module may be a depth camera or the like for collecting gesture information.
Processing module 8104 may be used to support electronic device 8100 performing steps 6201, 6203, 6204, 6205, 6206 in screen display method 6200, may be used to support electronic device 8100 performing steps 7102, 7103 in screen display method 7100, may be used to support electronic device 8100 performing step 7202 in screen display method 7200, and/or other processes for the techniques described herein.
The processing module may be a processor or a controller. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination implementing a computing function, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor.
It should be noted that, for information interaction, execution processes and other contents between the modules/units in the electronic device 8100, the method embodiments corresponding to fig. 60 to fig. 79 in the present application are based on the same concept, and specific contents may refer to descriptions in the foregoing method embodiments in the present application, and are not described herein again.
For example, fig. 82 shows a schematic structural diagram of an electronic device 8200. The electronic device 8200 may be embodied as a dual-screen electronic device having two display screens, or a curved screen or a flexible foldable screen, such as a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), or may be embodied as two electronic devices, such as two tablet computers or two mobile phones, which are connected together and used synchronously.
The electronic device 8200 may include a processor 8210, an external memory interface 8220, an internal memory 8221, a universal serial bus (USB) interface 8230, a charging management module 8240, a power management module 8241, a battery 8242, an antenna 1, an antenna 2, a mobile communication module 8250, a wireless communication module 8260, an audio module 8270, a speaker 8270A, a receiver 8270B, a microphone 8270C, an earphone interface 8270D, a sensor module 8280, keys 8290, a motor 8291, an indicator 8292, a camera 8293, a display screen 8294, a subscriber identity module (SIM) card interface 8295, and the like. The sensor module 8280 may include a pressure sensor 8280A, a gyroscope sensor 8280B, an air pressure sensor 8280C, a magnetic sensor 8280D, an acceleration sensor 8280E, a distance sensor 8280F, a proximity optical sensor 8280G, a fingerprint sensor 8280H, a temperature sensor 8280J, a touch sensor 8280K, an ambient light sensor 8280L, a bone conduction sensor 8280M, and the like.
It is to be understood that the exemplary structure of the embodiment of the present application does not specifically limit the electronic device 8200. In other embodiments of the present application, the electronic device 8200 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 8210 may comprise one or more processing units, such as: the processor 8210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the neural center and command center of the electronic device 8200. The controller can generate an operation control signal according to the instruction operation code and timing signal, completing the control of instruction fetching and instruction execution. A memory may also be provided within the processor 8210 for storing instructions and data. In some embodiments, the memory in the processor 8210 is a cache memory, which may hold instructions or data that the processor 8210 has just used or recycled. If the processor 8210 needs to reuse the instruction or data, it can be called directly from this memory. This avoids repeated accesses, reduces the waiting time of the processor 8210, and thus improves system efficiency.
In some embodiments, the processor 8210 may comprise one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
Power management module 8241 is used to connect battery 8242, charge management module 8240 and processor 8210. The power management module 8241 receives an input from the battery 8242 and/or the charging management module 8240, and supplies power to the processor 8210, the internal memory 8221, the external memory, the display screen 8294, the camera 8293, the wireless communication module 8260, and the like. Power management module 8241 may also be used to monitor battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, a power management module 8241 can also be disposed in the processor 8210. In other embodiments, the power management module 8241 and the charge management module 8240 may be disposed in the same device.
The electronic device 8200 realizes a display function through a GPU, a display screen 8294, an application processor, and the like. The GPU is a microprocessor for image processing, connecting the display screen 8294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processors 8210 may comprise one or more GPUs that execute program instructions to generate or change display information.
The display screen 8294 is used for displaying images, video, and the like. The display screen 8294 comprises a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In the embodiment of the present invention, the display screen 8294 is divided into a first display screen and a second display screen, and the first display screen or the second display screen may have an input function, for example, being controlled through touch screen gestures.
The external memory interface 8220 can be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 8200. The external memory card communicates with the processor 8210 through the external memory interface 8220 to implement a data storage function, for example saving files such as music and video in the external memory card.
The internal memory 8221 may be used to store computer-executable program code, including instructions. The processor 8210 executes various functional applications and data processing of the electronic device 8200 by executing the instructions stored in the internal memory 8221. The internal memory 8221 may include a program storage area and a data storage area. The program storage area may store an operating system, application programs required by one or more functions (such as a sound playing function and an image playing function), and the like. The data storage area can store data created during use of the electronic device 8200 (such as audio data and a phone book) and the like. In addition, the internal memory 8221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as one or more magnetic disk storage devices, a flash memory device, or a universal flash storage (UFS).
The present embodiment also provides a computer storage medium, where computer instructions are stored, and when the computer instructions are run on an electronic device, the electronic device is caused to execute the above related method steps to implement the method for screen display in the above embodiment.
This embodiment further provides a computer program product. When the computer program product runs on a computer, the computer is caused to perform the foregoing related steps to implement the screen display method in the foregoing embodiments.
In addition, an embodiment of this application further provides an apparatus. The apparatus may specifically be a chip, a component, or a module, and may include a processor and a memory that are connected to each other. The memory is configured to store computer-executable instructions. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the screen display method in the foregoing method embodiments.
The electronic device, the computer storage medium, the computer program product, and the chip provided in this embodiment are all configured to perform the corresponding method provided above. Therefore, for the beneficial effects that they can achieve, refer to the beneficial effects of the corresponding method provided above. Details are not described herein again.
From the description of the foregoing embodiments, a person skilled in the art will understand that, for convenience and brevity of description, division into the foregoing functional modules is merely used as an example for illustration. In actual application, the foregoing functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to complete all or some of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the described apparatus embodiments are merely illustrative. For example, the division into modules or units is merely logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the shown or discussed mutual couplings, direct couplings, or communication connections may be indirect couplings or communication connections implemented through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one or more physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or some of the steps of the methods in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific embodiments of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (14)

1. An application interface processing method, applied to an electronic device, wherein the electronic device comprises a first display screen and a second display screen, and the method comprises:
displaying a first application interface through the first display screen;
in response to the detected first operation, converting the mode type corresponding to the first application interface into handwriting input;
and in response to the input mode of the handwriting input, triggering display of the first application interface on the second display screen, so as to acquire, through the second display screen, handwritten content for the first application interface.
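The flow recited in claim 1 — detect a first operation, convert the interface's mode type to handwriting input, and move the interface from the first display screen to the second — can be illustrated as a small state machine. This is a sketch only: the names `DualScreenDevice` and `Mode`, and the set-based screen model, are assumptions for illustration, not part of the claimed method.

```python
from enum import Enum

class Mode(Enum):
    BROWSE = "browse"
    HANDWRITING = "handwriting"
    KEYBOARD = "keyboard"

class DualScreenDevice:
    """Illustrative model of an electronic device with two display screens."""

    def __init__(self, interface: str):
        self.mode = Mode.BROWSE
        self.first_screen = {interface}   # the first application interface starts here
        self.second_screen = set()

    def on_first_operation(self, interface: str) -> None:
        # A detected first operation converts the mode type corresponding
        # to the interface into handwriting input ...
        self.mode = Mode.HANDWRITING
        # ... which triggers display of the interface on the second screen,
        # so handwritten content can be acquired through that screen.
        self.first_screen.discard(interface)
        self.second_screen.add(interface)

device = DualScreenDevice("notes_app")
device.on_first_operation("notes_app")
print(device.mode.value, sorted(device.second_screen))  # handwriting ['notes_app']
```

The sketch deliberately keeps the display assignment and the mode type as separate pieces of state, mirroring the claim's two steps (mode conversion, then display routing).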
2. The method of claim 1, wherein after the triggering of the presentation of the first application interface on the second display screen in response to the input mode of the handwritten input, the method further comprises:
under the condition that it is detected that the mode type corresponding to the first application interface is converted into keyboard input, in response to the input mode of the keyboard input, triggering display of the first application interface on the first display screen and display of a virtual keyboard on the second display screen; or
under the condition that it is detected that the mode type corresponding to the first application interface is converted into keyboard input, in response to the input mode of the keyboard input, triggering display of the first application interface on the first display screen and display of a virtual keyboard and an application control bar on the second display screen.
3. The method of claim 1, wherein after the triggering of the presentation of the first application interface on the second display screen in response to the input mode of the handwritten input, the method further comprises:
under the condition that it is detected that the mode type corresponding to the first application interface is converted into a browsing mode, in response to the browsing mode, triggering display of the first application interface on the first display screen.
4. The method according to any one of claims 1 to 3, wherein the detecting of the first operation comprises any one or a combination of the following five manners:
determining that the first operation is detected under the condition that it is detected that a holding gesture of an electronic pen meets a first preset condition, wherein the holding gesture comprises any one or more of the following: a holding position, a holding force, and a holding angle; or
acquiring, through a first icon, a trigger instruction for handwriting input, wherein the first icon is displayed on the first application interface; or
determining that the first operation is detected under the condition that a preset click operation or a preset track operation is detected; or
determining that the first operation is detected under the condition that the electronic pen is detected to be located within a preset range of the second display screen; or
determining that the first operation is detected under the condition that the electronic pen is detected to change from a first preset state to a second preset state.
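The five alternative detection conditions of claim 4 can be combined into a single predicate. The boolean event fields below are an assumed data model, chosen only to illustrate the "any one or a combination" structure of the claim; they are not the patent's actual sensor interface.

```python
def first_operation_detected(event: dict) -> bool:
    """Return True if any of the five alternative conditions of claim 4 holds."""
    conditions = (
        event.get("grip_meets_first_preset_condition", False),  # holding gesture of the pen
        event.get("first_icon_triggered", False),               # icon on the first interface
        event.get("preset_click_or_track_detected", False),     # preset click/track operation
        event.get("pen_within_second_screen_range", False),     # pen proximity to 2nd screen
        event.get("pen_state_changed", False),                  # 1st -> 2nd preset pen state
    )
    return any(conditions)

print(first_operation_detected({"pen_within_second_screen_range": True}))  # True
print(first_operation_detected({}))  # False
```

Because the claim allows a combination of conditions, `any()` fires when one or more fields are set; a stricter implementation could require a particular combination instead.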
5. The method according to any one of claims 1 to 3, wherein the first operation is a sliding operation in a first direction acquired by the second display screen, the sliding operation in the first direction is a sliding operation from an upper edge of the second display screen toward a lower edge of the second display screen, and a distance between the upper edge of the second display screen and the first display screen is shorter than a distance between the lower edge of the second display screen and the first display screen.
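The sliding operation of claim 5 starts at the edge of the second display screen that lies closer to the first display screen (the "upper edge") and moves toward the far edge. A geometric check of that direction might look as follows; the normalized coordinates and the 0.1 edge threshold are illustrative assumptions, not values from the patent.

```python
def is_first_direction_slide(start_y: float, end_y: float,
                             upper_edge_y: float = 0.0) -> bool:
    """True for a slide that starts near the upper edge of the second screen
    (the edge closer to the first display screen) and moves toward the
    lower edge. Coordinates are normalized to [0, 1] along the screen.
    """
    starts_at_upper = abs(start_y - upper_edge_y) < 0.1  # began near the upper edge
    moves_downward = end_y > start_y                      # progressed toward the lower edge
    return starts_at_upper and moves_downward

print(is_first_direction_slide(0.05, 0.8))  # True
print(is_first_direction_slide(0.5, 0.8))   # False (did not start at the upper edge)
```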
6. The method according to any one of claims 1 to 3, wherein after the triggering display of the first application interface on the second display screen, the method further comprises:
acquiring a start operation for a second application interface, and determining, based on the start operation, a mode type corresponding to the second application interface, wherein the second application interface and the first application interface are different application interfaces;
under the condition that the mode type corresponding to the second application interface is handwriting input, in response to the input mode of the handwriting input, triggering display of the second application interface on the second display screen; or
under the condition that the mode type corresponding to the second application interface is keyboard input, in response to the input mode of the keyboard input, triggering display of the second application interface on the first display screen and display of a virtual keyboard on the second display screen; or
under the condition that the mode type corresponding to the second application interface is a browsing mode, in response to the browsing mode, triggering display of the second application interface on the first display screen.
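Claims 2, 3, and 6 together define a dispatch from mode type to display placement: handwriting input routes the interface to the second screen, keyboard input routes it to the first screen with a virtual keyboard below, and browsing mode routes it to the first screen alone. A hypothetical routing table for this dispatch; the string values are illustrative, not the patent's terminology.

```python
def route_interface(mode: str) -> tuple:
    """Map a mode type to (screen showing the interface, second-screen content)."""
    routing = {
        "handwriting": ("second", None),               # capture handwriting on 2nd screen
        "keyboard":    ("first", "virtual_keyboard"),  # virtual keyboard on 2nd screen
        "browse":      ("first", None),                # plain browsing, 2nd screen free
    }
    if mode not in routing:
        raise ValueError(f"unknown mode type: {mode!r}")
    return routing[mode]

print(route_interface("keyboard"))  # ('first', 'virtual_keyboard')
```

A table-driven dispatch keeps the per-interface state (claim 6 applies the same routing to a second, independently started interface) in one place rather than scattered across conditionals.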
7. An electronic device, comprising a first display screen, a second display screen, a memory, one or more processors, and one or more programs, wherein the one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the electronic device is caused to perform the following steps:
displaying a first application interface through the first display screen;
in response to the detected first operation, converting the mode type corresponding to the first application interface into handwriting input;
and in response to the input mode of the handwriting input, triggering display of the first application interface on the second display screen, so as to acquire, through the second display screen, handwritten content for the first application interface.
8. The electronic device of claim 7, wherein the one or more processors, when executing the one or more programs, cause the electronic device to further perform the steps of:
under the condition that it is detected that the mode type corresponding to the first application interface is converted into keyboard input, in response to the input mode of the keyboard input, triggering display of the first application interface on the first display screen and display of a virtual keyboard on the second display screen; or
and under the condition that the mode type corresponding to the first application interface is detected to be converted into keyboard input, responding to the input mode of the keyboard input, triggering to display the first application interface on the first display screen, and displaying a virtual keyboard and an application control bar on the second display screen.
9. The electronic device of claim 7, wherein the one or more processors, when executing the one or more programs, cause the electronic device to further perform the steps of:
and under the condition that the mode type corresponding to the first application interface is detected to be converted into a browsing mode, responding to the browsing mode, and triggering to display the first application interface on the first display screen.
10. The electronic device according to any one of claims 7 to 9, wherein when the one or more processors execute the one or more programs, the electronic device is caused to specifically perform any one or a combination of the following:
determining that the first operation is detected under the condition that a holding gesture of the electronic pen is detected to meet a first preset condition, wherein the holding gesture comprises any one or more of the following: a holding position, a holding force, and a holding angle; or
acquiring, through a first icon, a trigger instruction for handwriting input, wherein the first icon is displayed on the first application interface; or
detecting a first contact operation, wherein the first contact operation is a preset click operation or a preset track operation; or
determining that the first operation is detected under the condition that the electronic pen is detected to be located within a preset range of the second display screen; or
determining that the first operation is detected under the condition that the electronic pen is detected to change from a first preset state to a second preset state.
11. The electronic device according to any one of claims 7 to 9, wherein the first operation is a sliding operation in a first direction obtained by the second display screen, the sliding operation in the first direction is a sliding operation performed from an upper edge of the second display screen to a lower edge of the second display screen, and a distance between the upper edge of the second display screen and the first display screen is shorter than a distance between the lower edge of the second display screen and the first display screen.
12. The electronic device of any of claims 7-9, wherein the one or more processors, when executing the one or more programs, cause the electronic device to further perform the steps of:
acquiring a start operation for a second application interface, and determining, based on the start operation, a mode type corresponding to the second application interface, wherein the second application interface and the first application interface are different application interfaces;
under the condition that the mode type corresponding to the second application interface is handwriting input, in response to the input mode of the handwriting input, triggering display of the second application interface on the second display screen; or
under the condition that the mode type corresponding to the second application interface is keyboard input, in response to the input mode of the keyboard input, triggering display of the second application interface on the first display screen and display of a virtual keyboard on the second display screen; or
and under the condition that the mode type corresponding to the second application interface is a browsing mode, responding to the browsing mode, and triggering to display the second application interface on the first display screen.
13. A computer program product comprising instructions which, when loaded and executed by an electronic device, cause the electronic device to perform the method of any of claims 1-6.
14. An electronic device comprising a processor coupled to a memory, the memory storing program instructions that, when executed by the processor, implement the method of any of claims 1-6.
CN202011631548.8A 2020-12-30 2020-12-30 Application interface processing method and related equipment Pending CN114690888A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011631548.8A CN114690888A (en) 2020-12-30 2020-12-30 Application interface processing method and related equipment
PCT/CN2021/141913 WO2022143607A1 (en) 2020-12-30 2021-12-28 Application interface processing method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011631548.8A CN114690888A (en) 2020-12-30 2020-12-30 Application interface processing method and related equipment

Publications (1)

Publication Number Publication Date
CN114690888A true CN114690888A (en) 2022-07-01

Family

ID=82134523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011631548.8A Pending CN114690888A (en) 2020-12-30 2020-12-30 Application interface processing method and related equipment

Country Status (2)

Country Link
CN (1) CN114690888A (en)
WO (1) WO2022143607A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130129310A1 (en) * 2011-11-22 2013-05-23 Pleiades Publishing Limited Inc. Electronic book
CN107357513A (en) * 2017-06-12 2017-11-17 深圳市金立通信设备有限公司 A kind of method and terminal recorded the note
CN107577445A (en) * 2017-09-19 2018-01-12 中新国际电子有限公司 A kind of electronic book equipment and calculation method
CN109739431A (en) * 2018-12-29 2019-05-10 联想(北京)有限公司 A kind of control method and electronic equipment
US20200333994A1 (en) * 2019-04-16 2020-10-22 Apple Inc. Systems and Methods for Initiating and Interacting with a Companion-Display Mode for an Electronic Device with a Touch-Sensitive Display

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8446377B2 (en) * 2009-03-24 2013-05-21 Microsoft Corporation Dual screen portable touch sensitive computing system
WO2016024330A1 (en) * 2014-08-12 2016-02-18 株式会社 東芝 Electronic device and method for displaying information
CN108459817A (en) * 2018-01-19 2018-08-28 广州视源电子科技股份有限公司 Operating method, device and the intelligent interaction tablet of intelligent interaction tablet
CN109101181B (en) * 2018-08-17 2022-04-22 联想(北京)有限公司 Data processing method and electronic equipment

Also Published As

Publication number Publication date
WO2022143607A1 (en) 2022-07-07

Similar Documents

Publication Publication Date Title
US7023428B2 (en) Using touchscreen by pointing means
JP5204305B2 (en) User interface apparatus and method using pattern recognition in portable terminal
US9035883B2 (en) Systems and methods for modifying virtual keyboards on a user interface
CN108121457B (en) Method and apparatus for providing character input interface
JP5755219B2 (en) Mobile terminal with touch panel function and input method thereof
WO2022143198A1 (en) Processing method for application interface, and related device
CN102902471B (en) Input interface switching method and input interface switching device
TWI505155B (en) Touch-control method for capactive and electromagnetic dual-mode touch screen and handheld electronic device
EP2915036A1 (en) Keyboard with gesture-redundant keys removed
US20150347001A1 (en) Electronic device, method and storage medium
US20230359351A1 (en) Virtual keyboard processing method and related device
TWI659353B (en) Electronic apparatus and method for operating thereof
US20150062015A1 (en) Information processor, control method and program
US20230359279A1 (en) Feedback method and related device
US20140359541A1 (en) Terminal and method for controlling multi-touch operation in the same
TW201319867A (en) Touch pen, electronic device and interactive operation method
WO2013047023A1 (en) Display apparatus, display method, and program
KR100656779B1 (en) Alphabet Input Apparatus Using A TouchPad And Method Thereof
WO2018123320A1 (en) User interface device and electronic apparatus
JP2015213320A (en) Handheld device and input method thereof
US11188224B2 (en) Control method of user interface and electronic device
KR101678213B1 (en) An apparatus for user interface by detecting increase or decrease of touch area and method thereof
WO2022143607A1 (en) Application interface processing method and related device
US11847313B2 (en) Electronic device having touchpad with operating functions selected based on gesture command and touch method thereof
KR20100034811A (en) Touch mouse

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination