WO2022143579A1 - Feedback method and related device - Google Patents

Feedback method and related device

Info

Publication number
WO2022143579A1
WO2022143579A1 (application PCT/CN2021/141838)
Authority
WO
WIPO (PCT)
Prior art keywords
vibration
electronic device
virtual
vibration feedback
key
Prior art date
Application number
PCT/CN2021/141838
Other languages
English (en)
French (fr)
Inventor
刘逸硕
黄大源
李维
闫澈
周轩
赵韵景
梁敬非
李宏汀
黄雪妍
王卓
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP21914315.3A (published as EP4261660A1)
Publication of WO2022143579A1
Priority to US18/343,948 (published as US20230359279A1)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044: Digitisers characterised by capacitive transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser, partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01: Indexing scheme relating to G06F 3/01
    • G06F 2203/014: Force feedback applied to GUI
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present application relates to the field of computer technology, and in particular, to a feedback method and related equipment.
  • a common text input method is to provide a virtual keyboard on the touch screen, through which users can enter text.
  • however, the virtual keyboard lacks many features of a physical keyboard, which makes touch typing on a touch screen a difficult task.
  • the embodiments of the present application provide a feedback method and related equipment.
  • when a user touches an anchor point key on the virtual keyboard, a first feedback operation is performed through the touch screen to prompt the user that an anchor point key is currently being contacted. The user can therefore perceive the position of the anchor point key, which helps reduce the difficulty of touch typing on the touch screen.
  • the embodiments of the present application provide a feedback method, which can be used in the field of virtual keyboards.
  • the method is applied to an electronic device, the electronic device is configured with a touch screen, and the touch screen is configured with a plurality of vibration feedback elements.
  • the method includes: the electronic device detects a first contact operation acting on the touch screen and, in response to the first contact operation, obtains first position information of the first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard.
  • the electronic device obtains one or more first vibration feedback elements from the multiple vibration feedback elements, where a first vibration feedback element is a vibration feedback element matched with the first virtual key; the vibration feedback elements matched with different virtual keys are not exactly the same.
  • the virtual keyboard can be any type of keyboard; for example, it can be a full keyboard, a numeric keyboard, or a function keyboard, or it can be the collective name for all operation keys shown on the touch screen.
  • an anchor point key is not the same as a positioning key; an anchor point key is a key used to give the user a prompt effect.
  • which keys are anchor point keys can be preconfigured in the electronic device, that is, fixed in advance; it can also be customized by the user, that is, the user can define which virtual keys are anchor point keys through the "Settings" interface of the electronic device.
  • the electronic device obtains the first virtual key corresponding to the first contact point according to the first position information, and then determines whether the first virtual key is an anchor point key.
  • alternatively, the electronic device may pre-store which location areas on the touch screen belong to anchor point keys and which belong to non-anchor point keys, and directly determine from the first position information whether the first contact point falls within the location area of an anchor point key, so as to determine whether the first virtual key corresponding to the first position information is an anchor point key.
  • the electronic device instructs all the first vibration feedback elements matching the first virtual key to emit vibration waves to perform a first feedback operation, and the first feedback operation is used to prompt that the first virtual key is an anchor point key.
  • in this way, the first feedback operation is performed through the touch screen to remind the user that an anchor point key is currently being touched, so that the user can perceive the position of the anchor point key, which helps reduce the difficulty of touch typing on the touch screen.
  • because multiple vibration feedback elements are configured in the touch screen, acquiring at least one first vibration feedback element that matches the first virtual key and instructing it to emit vibration waves produces vibration feedback only around the first virtual key rather than across the full screen.
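  As a rough illustration of the flow just described (resolve position to key, check whether it is an anchor point key, select only the matched elements), here is a minimal Python sketch. All key names, screen rectangles, and element IDs are invented assumptions for illustration, not data from the patent:

```python
# Hypothetical layout and mappings; every value below is an assumption.
ANCHOR_KEYS = {"F", "J"}  # e.g. home-row keys used as anchor points

# Each virtual key occupies a screen rectangle (x0, y0, x1, y1).
KEY_RECTS = {"F": (100, 200, 140, 240), "J": (220, 200, 260, 240)}

# First mapping relationship: virtual key -> matched vibration feedback
# element IDs (not exactly the same across different keys).
KEY_TO_ELEMENTS = {"F": [3, 4], "J": [7, 8]}


def key_at(position):
    """Resolve the first virtual key from the first position information."""
    x, y = position
    for key, (x0, y0, x1, y1) in KEY_RECTS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return key
    return None


def handle_contact(position):
    """Return the element IDs to vibrate (empty for non-anchor contacts)."""
    key = key_at(position)
    if key is not None and key in ANCHOR_KEYS:
        # First feedback operation: drive only the matched elements,
        # so vibration is produced around the key, not full-screen.
        return KEY_TO_ELEMENTS[key]
    return []
```

  Note that only the elements matched with the touched key are returned, which is what localizes the vibration feedback.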
  • in a possible implementation, the method further includes: the electronic device obtains, according to the first position information, the first virtual key corresponding to the first contact point.
  • the first virtual key corresponding to the first contact point can be acquired in real time according to the first position information, so this solution is compatible not only with a virtual keyboard at a fixed position but also with a virtual keyboard whose position moves, extending the application scenarios of this solution.
  • a first mapping relationship is configured in the electronic device, and the first mapping relationship is used to indicate a corresponding relationship between the virtual key and the vibration feedback element.
  • the electronic device acquiring the first vibration feedback element from the plurality of vibration feedback elements includes: the electronic device acquiring the first vibration feedback element matching the first virtual key according to the first mapping relationship and the first virtual key.
  • if multiple mapping relationships are configured, each mapping relationship includes the correspondence between multiple virtual keys and multiple first vibration feedback elements; before the electronic device obtains the first vibration feedback element matching the first virtual key according to the first mapping relationship and the first virtual key, it needs to obtain, from the multiple mapping relationships, the first mapping relationship matching the type of the currently displayed virtual keyboard.
  • a first mapping relationship is pre-configured, so that after the first virtual key is acquired, the first mapping relationship can be used to obtain at least one first vibration feedback element matching it; this is convenient and quick and helps improve the efficiency of the matching process; splitting out the step of determining the vibration feedback element also makes it easier to locate a fault accurately when one occurs.
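  The two-step lookup (pick the mapping matching the displayed keyboard type, then look up the key) could be sketched as below; the keyboard-type names and element IDs are hypothetical:

```python
# Assumed store of mapping relationships, one per keyboard type.
MAPPINGS = {
    "full": {"F": [3, 4], "J": [7, 8]},
    "numeric": {"5": [10, 11]},
}


def elements_for(keyboard_type, key):
    """Select the mapping for the current keyboard type, then look up
    the vibration feedback elements matched with the given key."""
    mapping = MAPPINGS[keyboard_type]  # the "first mapping relationship"
    return mapping.get(key, [])
```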
  • a first mapping relationship is configured in the electronic device, and the first mapping relationship indicates a corresponding relationship between the position information and the vibration feedback element.
  • the electronic device obtains the first vibration feedback element from the multiple vibration feedback elements by obtaining, according to the first mapping relationship and the first position information, the first vibration feedback element that matches the first position information; because the first position information corresponds to the first virtual key on the virtual keyboard, this is equivalent to acquiring the first vibration feedback element corresponding to the first virtual key.
  • at least one first vibration feedback element matching the first virtual key can thus be obtained according to the first position information and the first mapping relationship, which is convenient and quick and helps improve the efficiency of the matching process;
  • because the mapping relationship indicates the correspondence between position information and vibration feedback elements, it is compatible not only with a virtual keyboard at a fixed position but also with a virtual keyboard whose position can move, ensuring that vibration feedback can be provided in various scenarios.
  • in a possible implementation, before the electronic device emits vibration waves through the first vibration feedback elements, the method further includes: the electronic device acquires the vibration intensity of the vibration wave corresponding to each first vibration feedback element in the at least one first vibration feedback element, where the vibration intensity of each first vibration feedback element is related to any one or more of the following factors: the first quantity, the distance between each first vibration feedback element and the center point of the first virtual key, the type of vibration wave, whether the virtual key is an anchor point key, or the position type of the first position information; the first quantity is the number of first vibration feedback elements.
  • the electronic device emits vibration waves through the at least one first vibration feedback element according to the vibration intensity corresponding to each first vibration feedback element, so that the difference between the intensity of the vibration feedback corresponding to the first virtual key and the intensity of the vibration feedback corresponding to a second virtual key is within a preset intensity range, where the second virtual key and the first virtual key are different virtual keys; the preset intensity range may be an intensity difference within 2 percent, 3 percent, 4 percent, or 5 percent.
  • to measure this, the probe of a vibration measuring instrument can be attached to the surface of a virtual key on the touch screen (a detection point), the vibration wave at that detection point is collected, and the waveform curve of the collected vibration wave indicates the intensity of the vibration feedback at that detection point.
  • the difference between the intensity of the vibration feedback corresponding to the first virtual key and that corresponding to the second virtual key can then be obtained by comparing the waveform curve measured at the detection point of the first virtual key with the waveform curve measured at the detection point of the second virtual key.
  • the strength of each vibration feedback element is determined according to the number of matched vibration feedback elements, so that the difference in vibration feedback strength between virtual keys stays within the preset range; when a user uses physical keys, the force feedback given by different keys is basically the same, so keeping this difference small reduces the difference between the virtual keyboard and a physical keyboard and increases user stickiness.
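  One way per-element intensity could be derived from the element count and the centre distances, so that the total perceived intensity of any key stays within the preset range, is sketched below. The weighting function, target value, and units are assumptions for illustration, not the patent's formula:

```python
TARGET = 100.0       # desired total perceived intensity (arbitrary units)
MAX_DIFF_PCT = 5.0   # widest preset intensity range from the description


def element_intensities(distances):
    """Given each matched element's distance to the key centre, return a
    drive intensity per element. Nearer elements contribute more, and the
    weights are normalized so the total stays at TARGET regardless of how
    many elements a key matches (the "first quantity")."""
    weights = [1.0 / (1.0 + d) for d in distances]
    total = sum(weights)
    return [TARGET * w / total for w in weights]


def within_preset_range(intensity_a, intensity_b):
    """Check two keys' feedback intensities differ by at most the preset
    percentage."""
    return abs(intensity_a - intensity_b) / TARGET * 100.0 <= MAX_DIFF_PCT
```

  Normalizing by the element count is what keeps a key matched with many elements from feeling stronger than a key matched with few.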
  • the first vibration feedback element is any one of the following: a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film.
  • providing several specific forms of the vibration feedback element (a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film) improves the implementation flexibility of this solution.
  • the first contact point is a newly added contact point on the touch screen.
  • since a user typing on a physical keyboard usually focuses on the key newly struck, this solution generates feedback only for newly added contact points, which better simulates input on a physical keyboard, makes it easier for the user to build a memory relationship with the new contact point, and further reduces the difficulty of training touch typing on the touch screen.
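  Detecting "newly added" contact points amounts to diffing the contact set between successive sensing frames; the frame/ID model below is an assumption:

```python
def new_contacts(previous_ids, current_ids):
    """Return the contact-point IDs present in the current sensing frame
    but absent from the previous one; only these trigger feedback."""
    return sorted(set(current_ids) - set(previous_ids))
```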
  • in a possible implementation, the method further includes: when the first virtual key is a non-anchor point key, the electronic device performs a second feedback operation, where the second feedback operation is used to prompt that the first virtual key is a non-anchor point key, and the first feedback operation and the second feedback operation are different feedback operations.
  • that is, a feedback operation is performed not only when the first virtual key is an anchor point key but also when it is a non-anchor point key, and the two feedback operations are different, because every key gives the user feedback on a physical keyboard.
  • this increases the similarity between the virtual keyboard and the physical keyboard; giving different feedback operations for anchor point keys and non-anchor point keys also helps the user remember the different types of keys, assisting the user in achieving touch typing on the virtual keyboard.
  • in a possible implementation, the first feedback operation emits a first type of vibration wave through the touch screen, and the second feedback operation emits a second type of vibration wave through the touch screen; the first type and the second type are different types of vibration waves.
  • if the electronic device emits continuous vibration waves through the vibration feedback element, the difference between different types of vibration waves includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, or envelope shape; if the electronic device emits vibration waves in the form of pulses, the difference additionally includes the frequency at which the pulses are emitted.
  • in a possible implementation, before the electronic device performs the first feedback operation, the method further includes: the electronic device acquires, according to the first position information, a position type corresponding to the first contact point, where the position type indicates whether the first contact point is located in a first position area of the first virtual key (also called the feature area of the anchor point key) or in a second position area of the first virtual key (also called the edge area of the anchor point key), the first position area and the second position area being different; the electronic device then performs the first feedback operation through the touch screen according to the position type corresponding to the first contact point, and the feedback operation corresponding to the first position area is different from the feedback operation corresponding to the second position area.
  • the entire location area of an anchor point key and/or a non-anchor point key is thus divided into a first position area and a second position area, and the types of vibration waves emitted through the at least one first vibration feedback element differ depending on whether the first contact point is in the first position area or the second position area; this helps the user memorize the boundaries of the virtual keys, that is, build muscle memory for the different areas of the virtual keys, further reducing the difficulty of touch typing on a touch screen.
  • the feedback operation corresponding to the first position area of an anchor point key may be the same as that corresponding to the first position area of a non-anchor point key while the feedback operations corresponding to their second position areas differ; or the feedback operations corresponding to their first position areas may differ while those corresponding to their second position areas are the same; or the feedback operations corresponding to both the first position areas and the second position areas of anchor point keys and non-anchor point keys may all differ.
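  The feature-area/edge-area split of a key could be checked geometrically as below; the rectangle model and the 60% core ratio are invented assumptions:

```python
def position_type(point, rect, core_ratio=0.6):
    """Classify a contact point against one key's rectangle as being in
    the 'feature' area (central region), the 'edge' area (the remaining
    border of the key), or 'outside' the key entirely."""
    x, y = point
    x0, y0, x1, y1 = rect
    if not (x0 <= x <= x1 and y0 <= y <= y1):
        return "outside"
    # Shrink the rectangle about its centre to obtain the feature area.
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    hw = (x1 - x0) / 2 * core_ratio
    hh = (y1 - y0) / 2 * core_ratio
    if cx - hw <= x <= cx + hw and cy - hh <= y <= cy + hh:
        return "feature"
    return "edge"
```

  A different vibration wave type would then be chosen depending on the returned classification.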
  • in a possible implementation, the first contact operation is a pressing operation, and the method further includes: the electronic device detects a second contact operation acting on the touch screen and acquires the second position information of the second contact point corresponding to the second contact operation, where the second contact operation is a touch operation; in response to the second contact operation, the electronic device changes the tactile characteristics of the second contact point on the touch screen, where the tactile characteristics include any one or more of the following properties: coefficient of sliding friction, stick-slip, and temperature.
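  The press/touch distinction above could be made from a collected pressure value; the normalized scale and threshold here are hypothetical:

```python
PRESS_THRESHOLD = 0.5  # assumed normalized pressure threshold


def classify_contact(pressure):
    """A pressing operation triggers vibration feedback, while a touch
    operation instead changes tactile characteristics (sliding friction,
    stick-slip, temperature)."""
    return "press" if pressure >= PRESS_THRESHOLD else "touch"
```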
  • in a possible implementation, before the electronic device detects the first contact operation acting on the touch screen, the method further includes: the electronic device detects a first gesture operation acting on the touch screen; in response to the first gesture operation, it selects, from multiple types of virtual keyboards, a first type of virtual keyboard corresponding to the first gesture operation, where the virtual keys included in different types of virtual keyboards are not exactly the same; the electronic device displays the first type of virtual keyboard through the touch screen, and while the first type of virtual keyboard is displayed its position on the touch screen is fixed; detecting the first contact operation then means detecting the first contact operation acting on the touch screen during the display of the first type of virtual keyboard.
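  A gesture-to-keyboard-type selection table might look like the sketch below; the gesture names and the mapping are invented for illustration:

```python
# Hypothetical correspondence between detected gestures and keyboard types.
GESTURE_TO_KEYBOARD = {
    "two_hand_spread": "full",
    "one_hand_tap": "numeric",
    "three_finger_swipe": "function",
}


def select_keyboard(gesture):
    """Pick the virtual keyboard type matching a detected gesture, or
    None if the gesture maps to no keyboard."""
    return GESTURE_TO_KEYBOARD.get(gesture)
```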
  • the embodiments of the present application provide an electronic device that can be used in the field of virtual keyboards.
  • the electronic device is configured with a touch screen, the touch screen includes a contact sensing module and a vibration feedback module, and the vibration feedback module includes a plurality of vibration feedback elements.
  • the contact sensing module is used to obtain the first position information of the first contact point on the touch screen.
  • the contact sensing module can be embodied as a contact sensing film, which can be a capacitive touch sensing film, a pressure contact sensing film, a temperature touch sensing film, or another type of film.
  • the first vibration feedback element is used to emit vibration waves when the first virtual key corresponding to the first contact point is an anchor point key, the vibration wave being used to prompt that the first virtual key is an anchor point key; the first vibration feedback element is any one of the following: a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film; the first virtual key is a virtual key in the virtual keyboard, and the first vibration feedback element is the vibration feedback element, among the multiple vibration feedback elements, that matches the first virtual key.
  • in a possible implementation, the first contact point is obtained based on a pressing operation acting on the touch screen, and the touch screen further includes a cover plate and an ultrasonic module, the ultrasonic module being used to emit ultrasonic waves to change the tactile characteristics of the cover plate; specifically, the contact sensing module is also used to obtain the second position information of the second contact point on the touch screen, and the ultrasonic module is used to emit ultrasonic waves to change the tactile characteristics of the cover plate when the second contact point is obtained based on a touch operation acting on the touch screen.
  • in another possible implementation, the touch screen further includes a cover plate and an electrostatic module, the electrostatic module being used to generate an electrical signal to change the tactile characteristics of the cover plate; specifically, the contact sensing module is also used to obtain the second position information of the second contact point on the touch screen, and the electrostatic module is used to generate an electrical signal to change the tactile characteristics of the cover plate when the second contact point is obtained based on a touch operation acting on the touch screen.
  • the haptic properties include any one or more of the following properties: coefficient of sliding friction, stick-slip, and temperature.
  • the touch screen can thus change the tactile characteristics of the cover plate by means of an ultrasonic module or an electrostatic module, providing richer tactile feedback with which to respond to the user.
  • the touch screen further includes a pressure sensing module, the pressure sensing module and the vibration feedback module are integrated into one, and the vibration feedback element is a piezoelectric ceramic sheet, a piezoelectric polymer or a piezoelectric composite material .
  • the pressure sensing module is configured to collect a pressure value corresponding to the first contact operation to determine whether the first contact operation is a pressing operation or a touch operation.
  • the multiple vibration feedback elements included in the vibration feedback module may be divided such that a second vibration feedback element among them is used to collect pressure values, while a third vibration feedback element among them is used to emit vibration waves for vibration feedback.
  • the second vibration feedback element and the third vibration feedback element are different vibration feedback elements.
  • alternatively, the multiple vibration feedback elements in the vibration feedback module (also referred to as the pressure sensing module) are used to collect pressure values in a first time period and to emit vibration waves in a second time period, where the first time period and the second time period are different.
  • the touch screen is thus also equipped with a pressure sensing module for collecting pressure values, so that not only the position information of a contact point but also its pressure value can be obtained, allowing contact operations obtained through the touch screen to be managed in finer detail; integrating the pressure sensing module with the vibration feedback module also helps reduce the thickness of the touch screen, improving the portability of the electronic device.
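  The time-division alternative, where the same piezoelectric elements sense pressure in one period and emit vibration in another, could be sketched as a fixed repeating cycle; the period lengths are assumptions:

```python
def element_role(t_ms, sense_ms=8, vibrate_ms=2):
    """Return the role ('sense' or 'vibrate') of a dual-use piezoelectric
    element at time t_ms within a repeating sense/vibrate cycle."""
    cycle = sense_ms + vibrate_ms
    return "sense" if (t_ms % cycle) < sense_ms else "vibrate"
```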
  • the embodiments of the present application provide an electronic device, which can be used in the field of virtual keyboards.
  • the electronic device includes a touch screen, a memory, one or more processors, and one or more programs.
  • the touch screen is configured with multiple vibration feedback elements, the one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the electronic device performs the following steps: detecting a first contact operation acting on the touch screen; in response to the first contact operation, acquiring the first position information of the first contact point corresponding to the first contact operation, where the first position information corresponds to the first virtual key on the virtual keyboard; when the first virtual key is an anchor point key, obtaining the first vibration feedback element from the multiple vibration feedback elements, where the first vibration feedback element is a vibration feedback element matched with the first virtual key; and instructing the first vibration feedback element to emit a vibration wave to perform a first feedback operation, the first feedback operation being used to prompt that the first virtual key is an anchor point key.
  • the electronic device may also be used to implement the steps performed by the electronic device in the various possible implementations of the first aspect.
  • for the specific implementation manners of certain steps in the third aspect and its various possible implementations, and the beneficial effects brought by each possible implementation, reference may be made to the descriptions of the various possible implementations of the first aspect; details are not repeated here.
  • an embodiment of the present application provides a computer program that, when running on a computer, causes the computer to execute the feedback method described in the first aspect.
  • an embodiment of the present application provides an electronic device, including a processor coupled to a memory; the memory is configured to store a program, and the processor is configured to execute the program in the memory, so that the execution device performs the feedback method according to the first aspect.
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; when the computer program runs on a computer, it causes the computer to execute the feedback method described in the first aspect.
  • an embodiment of the present application provides a chip system, where the chip system includes a processor configured to support implementing the functions involved in the first aspect, for example, sending or processing the data and/or information involved in the above method.
  • the chip system further includes a memory for storing necessary program instructions and data of the server or the communication device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • an embodiment of the present application provides a method for processing a virtual keyboard, which can be used in the field of human-computer interaction.
  • the method is applied to an electronic device configured with a display screen, and includes: the electronic device detects a first gesture operation acting on the display screen and, in response to the detected first gesture operation, selects a first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, where different types of virtual keyboards among the multiple types include different virtual keys; and the electronic device displays the first type of virtual keyboard through the display screen.
  • a plurality of different types of virtual keyboards are configured in the electronic device, the virtual keys included in the different types of virtual keyboards are not identical, and the user can evoke different types of virtual keyboards through different gesture operations; that is, the virtual keyboard is no longer limited to displaying 26 letters, and more virtual keys are provided to the user through the different types of virtual keyboards, which not only improves the user's flexibility in evoking the virtual keyboard but also helps provide richer virtual keys, thus eliminating the need for an additional physical keyboard.
  • the electronic device selects a first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, including: selecting, according to a first rule, the first type of virtual keyboard corresponding to the first gesture operation from the multiple types of virtual keyboards, where the first rule indicates the correspondence between multiple types of gesture operations and the multiple types of virtual keyboards.
  • the electronic device is preconfigured with a first rule, and the first rule indicates the correspondence between multiple types of gesture operations and the multiple types of virtual keyboards; after detecting the first gesture operation, the first type of virtual keyboard corresponding to the specific first gesture operation can be obtained according to the first rule, thereby improving the efficiency of the virtual keyboard matching process.
  • the first rule may directly include the correspondence between the multiple types of gesture operations and the multiple types of virtual keyboards; alternatively, the first rule may include correspondences between a plurality of pieces of first identification information and a plurality of pieces of second identification information, where each piece of first identification information uniquely points to one type of gesture operation and each piece of second identification information uniquely points to one type of virtual keyboard.
  • the first rule may also include correspondences between multiple sets of conditions and the multiple types of virtual keyboards, where each set of conditions constrains the gesture parameters of a gesture operation, so that each set of conditions corresponds to one type of gesture operation.
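As an illustrative sketch only (not part of the claimed solution), such a rule can be represented as an ordered list of predicates over gesture parameters, each paired with a keyboard type; the thresholds and keyboard names below are assumptions, not taken from the embodiments:

```python
# Hypothetical sketch of the "first rule": condition sets over gesture
# parameters mapped to virtual keyboard types. Thresholds are assumed.

def match_keyboard(num_contacts, num_hands):
    """Return the keyboard type whose condition set the gesture satisfies."""
    first_rule = [
        # (condition on gesture parameters, virtual keyboard type)
        (lambda c, h: h == 2,            "full keyboard"),
        (lambda c, h: h == 1 and c >= 4, "mini keyboard"),
        (lambda c, h: h == 1 and c == 1, "numeric keyboard"),
    ]
    for condition, keyboard in first_rule:
        if condition(num_contacts, num_hands):
            return keyboard
    return None  # no condition set matched
```

The first matching condition set wins, so more specific conditions should be ordered first.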
  • the method further includes: the electronic device obtains a first gesture parameter corresponding to the first gesture operation, where the first gesture parameter includes any one or more of the following: position information of the contact points corresponding to the first gesture operation, quantity information of the contact points, area information of the contact points, relative angle information of the hand corresponding to the first gesture operation, position information of the hand, quantity information of the hands, and shape information of the hand; and the electronic device selects the first type of virtual keyboard corresponding to the first gesture operation from the multiple types of virtual keyboards, including: selecting, according to the first gesture parameter and the first rule, the first type of virtual keyboard from the multiple types of virtual keyboards.
  • the first gesture parameter includes not only the position information of each contact point and the quantity information of the multiple contact points, but also the area information of each contact point; based on the area information, the contact points triggered by the palm can be distinguished from among the multiple contact points, which helps accurately estimate the type of the first gesture operation and avoid displaying a wrong virtual keyboard, thereby improving the accuracy of the virtual keyboard display process; by performing secondary processing on the acquired first gesture parameter, information such as the relative angle of the hand, the position of the hand, the number of hands, or the shape of the hand can be obtained, which improves the flexibility of the keyboard matching process.
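A minimal sketch of distinguishing palm contacts by area, assuming each contact is an (x, y, area) tuple and an illustrative area threshold in arbitrary sensor units:

```python
def split_finger_and_palm(contacts, max_finger_area=150.0):
    """Separate finger contacts from palm contacts by contact area.

    `contacts` is a list of (x, y, area) tuples; the threshold value
    is an illustrative assumption, not taken from the text.
    """
    fingers = [c for c in contacts if c[2] <= max_finger_area]
    palms = [c for c in contacts if c[2] > max_finger_area]
    return fingers, palms
```

Only the finger contacts would then feed the gesture-type estimation, so a resting palm does not trigger the wrong virtual keyboard.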
  • the method further includes: in response to the first gesture operation, the electronic device acquires a first angle, where the first angle indicates the relative angle between the hand corresponding to the first gesture operation and an edge of the display screen, or the relative angle between that hand and the center line of the display screen.
  • the electronic device displays the first type of virtual keyboard through the display screen, including: obtaining a first display angle of the first type of virtual keyboard according to the first angle, and displaying the first type of virtual keyboard through the display screen according to the first display angle, where the first display angle indicates the relative angle between an edge of the first type of virtual keyboard and an edge of the display screen, or between an edge of the first type of virtual keyboard and the center line of the display screen.
  • the relative angle (that is, the first angle) between the user's hand and the edge or center line of the display interface is obtained, and the display angle of the virtual keyboard is determined according to the first angle, so that the display angle of the keyboard better fits the placement angle of the user's hand, making the user's input via the virtual keyboard more comfortable and convenient.
  • in a case where the first type of virtual keyboard is a full keyboard, the full keyboard is split into a first sub-keyboard and a second sub-keyboard, which include different virtual keys of the full keyboard; the first angle includes the relative angles of the left hand and the right hand, and the first display angle includes the display angle of the first sub-keyboard and the display angle of the second sub-keyboard. If the first angle indicates the relative angle between a hand corresponding to the first gesture operation and an edge of the display screen, the display angle of the first sub-keyboard indicates the relative angle between an edge of the first sub-keyboard and the edge of the display screen, and the display angle of the second sub-keyboard indicates the relative angle between an edge of the second sub-keyboard and the edge of the display screen; if the first angle indicates the relative angle between a hand corresponding to the first gesture operation and the center line of the display screen, the display angle of the first sub-keyboard indicates the relative angle between an edge of the first sub-keyboard and the center line of the display screen, and the display angle of the second sub-keyboard indicates the relative angle between an edge of the second sub-keyboard and the center line of the display screen.
  • the electronic device determines whether the first angle is greater than or equal to a preset angle threshold, and if so, obtains the first display angle and displays the first type of virtual keyboard according to the first display angle through the display screen, where the value of the preset angle threshold may be 25 degrees, 28 degrees, 30 degrees, 32 degrees, 35 degrees, or another value, which is not limited here.
  • the electronic device determines the first display angle of the first type of virtual keyboard to be the first angle, and displays the first type of virtual keyboard according to the first angle through the display screen.
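The threshold logic above can be sketched as follows; the function name is illustrative, and 30 degrees is just one of the example threshold values mentioned in the text:

```python
def first_display_angle(first_angle_deg, threshold_deg=30.0):
    """Rotate the keyboard only when the hand's relative angle reaches
    the preset angle threshold; otherwise keep the keyboard aligned
    with the screen edge (no rotation)."""
    if abs(first_angle_deg) >= threshold_deg:
        return first_angle_deg  # display angle follows the hand angle
    return 0.0                  # below threshold: keyboard stays unrotated
```

Small incidental tilts of the hands therefore leave the keyboard stable, while a deliberate angled placement rotates it to match.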
  • different types of virtual keyboards among the multiple types of virtual keyboards have different functions, and the virtual keyboards with different functions include a combination of any two or more of the following: a numeric keyboard, a function key keyboard, a full keyboard, and a custom keyboard, where the function key keyboard consists of function keys.
  • different types of virtual keyboards have different functions, so a user can be provided with a variety of virtual keyboards with different functions, which improves the user's flexibility in using the virtual keyboard and thus the user stickiness of this solution.
  • the first type of virtual keyboard is any one of the following: a mini keyboard, a numeric keyboard, a functional keyboard, a function key keyboard, a circular keyboard, an arc keyboard, or a custom keyboard, where the mini keyboard includes 26 letter keys, the functional keyboard is displayed in an application program, and the virtual keys included in the functional keyboard correspond to the functions of that application program. It should be noted that the same electronic device does not need to be equipped with all of the mini keyboard, numeric keyboard, functional keyboard, function key keyboard, circular keyboard, arc keyboard, and custom keyboard.
  • the one-handed operation can trigger any one of the mini keyboard, numeric keyboard, functional keyboard, function key keyboard, circular keyboard, arc keyboard, or custom keyboard.
  • the first gesture operation may be a one-handed operation or a two-handed operation, which improves the implementation flexibility of this solution and also expands its application scenarios.
  • in a case where the first type of virtual keyboard is a full keyboard, the full keyboard includes at least 26 letter keys, and the size of the full keyboard is larger than that of the mini keyboard.
  • the electronic device displays the first type of virtual keyboard through the display screen, including: when the distance between the hands is less than or equal to a first distance threshold, displaying the full keyboard in an integrated manner through the display screen; when the distance between the hands is greater than the first distance threshold, displaying the first sub-keyboard through a second area of the display screen and the second sub-keyboard through a third area of the display screen, where the second area and the third area are different areas of the display screen, and the first sub-keyboard and the second sub-keyboard include different virtual keys of the full keyboard; the value of the first distance threshold may be 70 millimeters, 75 millimeters, 80 millimeters, or the like, which is not limited here.
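The integrated-versus-split decision can be sketched as a simple threshold test; the function name is illustrative, and 75 mm is one of the example threshold values given in the text:

```python
def full_keyboard_layout(hand_distance_mm, first_distance_threshold_mm=75.0):
    """Choose between integrated and split display of the full keyboard
    based on the distance between the user's hands."""
    if hand_distance_mm <= first_distance_threshold_mm:
        return "integrated"   # full keyboard shown as one block
    return "split"            # first and second sub-keyboards shown apart
```

With hands close together the keyboard stays whole; moving the hands apart splits it into the two sub-keyboards under each hand.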
  • the first type of virtual keyboard is a mini keyboard, which helps improve the flexibility of the user's letter-input process.
  • the single-handed operation includes a left-handed single-handed operation and a right-handed single-handed operation; when the first gesture operation is a right-handed single-handed operation, the first type of virtual keyboard is a numeric keyboard; when the first gesture operation is a left-handed single-handed operation, the first type of virtual keyboard is a functional keyboard, and the virtual keys included in the functional keyboard correspond to the functions of the application program.
  • when the first gesture operation occurs in a game, the functional keyboard can be a game keyboard configured with keys commonly used in games; in drawing software, the functional keyboard may include keys commonly used in the drawing software, and so on.
  • making the first type of virtual keyboard a numeric keyboard or a functional keyboard in these cases better matches the user's habits of using a physical keyboard, thereby reducing the difference between the virtual keyboard and the physical keyboard, which helps enhance user stickiness.
  • when the first gesture operation is a one-handed operation located in a first area of the display screen, the first type of virtual keyboard is a function key keyboard, and the first area is located at the lower left or lower right of the display screen; since the function keys of a physical keyboard are arranged at its lower left or lower right, the trigger gesture matches the user's habit of using a physical keyboard and is easy to remember, which reduces the difficulty of implementing this solution and helps enhance user stickiness.
  • the method further includes: the electronic device obtains a contact operation for a first virtual key in the function key keyboard, where the first virtual key may be the Ctrl key, or may simultaneously include the Ctrl key and the Shift key, and so on.
  • in response to the contact operation, the electronic device highlights a second virtual key on the display screen, where the second virtual key is a key other than the first virtual key in a combined shortcut; highlighting includes but is not limited to highlighted display, bold display, or flashing display, which is not limited here.
  • for example, the combination Ctrl + Shift + I can provide the function of inverting the currently processed image; in this case, the first virtual key includes the Ctrl key and the Shift key, and the second virtual key is the I key.
  • a contact operation for the first virtual key in the function key keyboard is obtained, and in response, the second virtual key is highlighted on the display screen, where the second virtual key is a key of the combined shortcut other than the first virtual key. Since the function key keyboard occupies a small area, the area required to display the virtual keyboard is reduced; and when the user touches the first virtual key in the function key keyboard, the second virtual key of the combined shortcut can be displayed automatically, thereby meeting the user's need for shortcut keys while avoiding waste of the display area of the display screen.
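A minimal sketch of this lookup, assuming a table mapping pressed modifier keys to the remaining keys of known combined shortcuts; the Ctrl+Shift+I (invert image) combination comes from the text, while the other entries and all names are assumptions:

```python
# Hypothetical shortcut table: pressed first virtual keys (modifiers)
# -> second virtual keys to highlight on the display screen.
COMBINED_SHORTCUTS = {
    frozenset(["Ctrl", "Shift"]): ["I"],   # Ctrl+Shift+I: invert image
    frozenset(["Ctrl"]): ["C", "V", "Z"],  # assumed common combinations
}

def second_virtual_keys(first_virtual_keys):
    """Return the keys to highlight for the currently pressed modifiers."""
    return COMBINED_SHORTCUTS.get(frozenset(first_virtual_keys), [])
```

Using `frozenset` keys makes the lookup independent of the order in which the modifier keys were pressed.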
  • the first gesture operation is a contact operation acquired through the display screen, and the first gesture parameter includes quantity information of the contact points corresponding to the first gesture operation; in this case, the first type of virtual keyboard is a circular keyboard or an arc keyboard.
  • a circular keyboard or an arc keyboard can also be provided, offering not only keyboards that exist on a physical keyboard but also keyboards that do not, which enriches the keyboard types, gives the user more choices, and further enhances the user's flexibility of choice.
  • the first rule includes a first sub-rule, and the first sub-rule is obtained after performing a custom operation on at least one type of gesture operation and/or at least one type of virtual keyboard .
  • the user can customize the trigger gesture and/or the type of the virtual keyboard, so that the display process of the virtual keyboard is more in line with the user's expectation, so as to further improve the user stickiness of this solution.
  • a plurality of vibration feedback elements are configured in the display screen, and during the display of the first type of virtual keyboard, its position on the display screen is fixed; after displaying the first type of virtual keyboard, the method further includes: the electronic device detects a first contact operation acting on the display screen and, in response, acquires first position information of a first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard.
  • in a case where the first virtual key is an anchor point key, the electronic device obtains a first vibration feedback element from the plurality of vibration feedback elements, where the first vibration feedback element is a vibration feedback element matched with the first virtual key, and instructs the first vibration feedback element to emit a vibration wave to perform a first feedback operation, where the first feedback operation is used to prompt that the first virtual key is an anchor point key.
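The anchor-key feedback path can be sketched as below; it assumes, purely for illustration, that F and J are the anchor point keys (as on a physical keyboard) and that simple dictionaries map contact positions to virtual keys and virtual keys to vibration feedback elements:

```python
# Illustrative anchor point keys; the actual set is defined by the keyboard.
ANCHOR_POINT_KEYS = {"F", "J"}

def vibration_element_for_contact(position, key_map, element_map):
    """Return the matched vibration feedback element when the contact
    lands on an anchor point key, else None (no vibration emitted).

    `key_map` maps positions to virtual keys; `element_map` maps keys
    to vibration feedback elements. Both structures are assumptions."""
    first_virtual_key = key_map.get(position)
    if first_virtual_key in ANCHOR_POINT_KEYS:
        return element_map[first_virtual_key]
    return None
```

Only the element matched to the touched anchor key is driven, so the vibration is felt locally under the finger rather than across the whole screen.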
  • for the meanings of the first contact operation, the first contact point, the first position information, the first virtual key, and the first vibration feedback element, as well as the specific implementation steps and beneficial effects of this implementation, reference may be made to the descriptions of the various possible implementations of the first aspect; they are not introduced here.
  • the electronic device may also be used to implement the steps performed by the electronic device in various possible implementations of the first aspect.
  • for the specific implementation manners of certain steps in the eighth aspect and its various possible implementations, and the beneficial effects brought by each possible implementation, reference may be made to the descriptions of the various possible implementations of the first aspect; details are not repeated here.
  • the embodiments of the present application provide an electronic device, which can be used in the field of human-computer interaction.
  • the electronic device includes a display screen, a memory, one or more processors, and one or more programs, where the one or more programs are stored in the memory; when the one or more processors execute the one or more programs, the electronic device performs the following steps: in response to a detected first gesture operation, selecting a first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, where the virtual keys included in different types of virtual keyboards among the multiple types are not identical; and displaying the first type of virtual keyboard through the display screen.
  • the electronic device may also be used to implement the steps performed by the electronic device in various possible implementations of the eighth aspect.
  • for the specific implementation manners of certain steps in the ninth aspect and its various possible implementations, and the beneficial effects brought by each possible implementation, reference may be made to the descriptions of the various possible implementations of the eighth aspect; details are not repeated here.
  • an embodiment of the present application provides a computer program that, when running on a computer, causes the computer to execute the method for processing a virtual keyboard described in the eighth aspect.
  • an embodiment of the present application provides an electronic device, including a processor coupled to a memory; the memory is configured to store a program, and the processor is configured to execute the program in the memory, so that the electronic device performs the method for processing a virtual keyboard according to the eighth aspect.
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; when the computer program runs on a computer, it causes the computer to execute the method for processing a virtual keyboard according to the eighth aspect.
  • an embodiment of the present application provides a chip system, where the chip system includes a processor, configured to support implementing the functions involved in the above aspects, for example, sending or processing the data involved in the above method and/or information.
  • the chip system further includes a memory for storing necessary program instructions and data of the server or the communication device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • the embodiments of the present application provide a method for processing an application interface, which can be used in the field of human-computer interaction.
  • the method is applied to an electronic device including a first display screen and a second display screen, and includes: displaying a first application interface through the first display screen; detecting that the mode type corresponding to the first application interface is changed to handwriting input; and, in response to the handwriting input mode, triggering display of the first application interface on the second display screen, so as to obtain handwritten content for the first application interface through the second display screen.
  • an operating system runs on the electronic device, and the electronic device can display the first application interface on the second display screen by calling the move-to function in the operating system, the SetWindowPos function, or the SetWindowPlacement function.
  • the electronic device displays the first application interface on the first display screen, and when it detects that the mode type corresponding to the first application interface is handwriting input, it triggers display of the first application interface on the second display screen, so that input can be made directly through the first application interface displayed on the second display screen; with this method, if the user places the second display screen in a direction convenient for writing, the user does not need to perform any operation, and the electronic device automatically displays the application interface requiring handwritten input on the second display screen, which not only improves the efficiency of the entire input process but also avoids redundant steps; the operation is simple, which helps improve user stickiness.
  • the method further includes: when the electronic device detects that the mode type corresponding to the first application interface is changed to keyboard input, in response to the keyboard input mode, triggering display of the first application interface on the first display screen and display of the virtual keyboard on the second display screen, so as to obtain input content for the first application interface through the virtual keyboard on the second display screen.
  • when the electronic device detects that the mode type corresponding to the first application interface is changed to keyboard input, in response to the keyboard input mode, the electronic device triggers display of the first application interface on the first display screen, and displays the virtual keyboard and an application control bar on the second display screen.
  • not only can the layout of the application interface across the different display screens of the electronic device be automatically adjusted when the mode type of the application interface changes to handwriting input, but also, when the mode type changes to keyboard input, the layout can be automatically adjusted and the virtual keyboard automatically displayed; thus, when the mode type of the application interface is changed to keyboard input, the user can perform keyboard input directly without manually adjusting the application interface, and the steps are simple, which further improves the user stickiness of this solution.
  • the method may further include: the electronic device detects a second operation acting on the second display screen; in response to the second operation, the first display area of the application control bar is changed to a second display area, and the first control key group included in the application control bar is changed to a second control key group, where the first control key group and the second control key group are both sets of control keys corresponding to the target application.
  • the first application interface includes a first control key, and the method may further include: the electronic device detects a second operation on the first target application interface; in response, the first control key is displayed in the application control bar, and the first control key in the first application interface is hidden.
  • displaying the virtual keyboard on the second display screen includes: displaying a second type of virtual keyboard on the second display screen; in response to a first gesture operation on the second display screen, selecting a first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, where the virtual keys included in different types of virtual keyboards among the multiple types are not exactly the same; and displaying the first type of virtual keyboard through the second display screen, where the first type of virtual keyboard and the second type of virtual keyboard are different types among the multiple types of virtual keyboards.
  • the method further includes: when the electronic device detects that the mode type corresponding to the first application interface changes to a browsing mode, in response to the browsing mode, triggering display of the first application interface on the first display screen and stopping display of the first application interface on the second display screen.
  • the layout of the application interface across the different display screens can also be automatically adjusted when the mode type of the application interface is changed to the browsing mode, so that the user no longer needs to adjust it manually; that is, in a variety of different application scenarios, the operation steps are simplified, which further improves the user stickiness of this solution.
  • the electronic device detecting the first operation includes any one or a combination of the following: when the electronic device detects that the holding posture of the electronic pen satisfies a first preset condition, it determines that the first operation is detected, where the holding posture includes any one or a combination of the holding position, holding strength, and holding angle, and the first preset condition includes any one or a combination of the following: the holding position is within a first position range, the holding strength is within a first strength range, and the holding angle is within a first angle range; a first icon is displayed on the first application interface; or, when the electronic device detects a preset click operation or a preset trajectory operation, it determines that the first operation is detected, where the preset click operation may be a single-click, double-click, triple-click, or long-press operation, and the preset trajectory operation may be a "Z"-shaped trajectory operation, a sliding operation, a "check"-shaped trajectory operation, or a "circle"-shaped trajectory operation.
  • providing multiple ways of determining the mode type corresponding to the first application interface improves the implementation flexibility of this solution and expands its application scenarios; further, determining the mode type according to the holding posture of the electronic pen allows the user to change the mode type of the first application interface without performing other operations, and judging the mode type from the user's holding posture reduces the error rate of that judgment, thereby reducing the probability of misplacing the first application interface, which not only avoids wasting computer resources but also helps improve user stickiness.
  • the first operation is a sliding operation in a first direction obtained through the second display screen, and the sliding operation in the first direction is from the upper edge of the second display screen toward the lower edge of the second display screen.
  • the upper edge of the second display screen is closer to the first display screen than the lower edge of the second display screen is.
  • the electronic device obtains the sliding operation in the first direction through the second display screen, and in response to the sliding operation in the first direction, the virtual keyboard displayed on the second display screen moves to the lower edge of the second display screen along the first direction, When the upper edge of the virtual keyboard reaches the lower edge of the second display screen, it is confirmed that the mode type corresponding to the first application interface is changed to handwriting input.
  • the virtual keyboard displayed on the second display screen can follow the user's downward sliding operation, and when the upper edge of the virtual keyboard reaches the lower edge of the second display screen, the electronic device confirms that the mode type corresponding to the first application interface is changed to handwriting input, which makes the process of switching from keyboard input to handwriting input more engaging and is beneficial to improving user stickiness.
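The transition above can be sketched as simple state logic: the keyboard tracks the drag, and the mode flips once the keyboard's upper edge crosses the lower edge of the screen. All names here (`VirtualKeyboard`, `handle_downward_slide`) are illustrative assumptions, not part of the disclosure:

```python
class VirtualKeyboard:
    """Minimal stand-in for the virtual keyboard on the second display screen."""
    def __init__(self, top_y: float, height: float):
        self.top_y = top_y      # y-coordinate of the keyboard's upper edge
        self.height = height


def handle_downward_slide(keyboard: VirtualKeyboard, drag_dy: float,
                          screen_height: float) -> str:
    """Move the keyboard with the user's downward drag; once its upper edge
    reaches the lower edge of the second display screen, confirm the switch
    to handwriting input."""
    keyboard.top_y = min(keyboard.top_y + drag_dy, screen_height)
    if keyboard.top_y >= screen_height:
        return "handwriting_input"   # keyboard fully slid off: mode changed
    return "keyboard_input"          # keyboard still (partially) visible
```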
  • the method further includes: the electronic device acquires a start-up operation for a second application interface and, based on the start-up operation, determines the mode type corresponding to the second application interface, where the second application interface and the first application interface are different application interfaces; when the mode type corresponding to the second application interface is handwriting input, the electronic device, in response to the input mode of the handwriting input, triggers the display of the second application interface on the second display screen; when the mode type corresponding to the second application interface is keyboard input, the electronic device, in response to the input mode of the keyboard input, triggers the display of the second application interface on the first display screen and the display of the virtual keyboard on the second display screen; or, when the mode type corresponding to the second application interface is the browsing mode, the electronic device, in response to the browsing mode, triggers the display of the second application interface on the first display screen.
  • the electronic device determines the mode type corresponding to the second application interface based on the start-up operation, including: when the start-up operation is acquired through the first display screen, the electronic device determines The mode type corresponding to the second application interface is keyboard input or browsing mode; when the startup operation is obtained through the second display screen, the electronic device determines that the mode type corresponding to the second application interface is handwriting input.
  • the electronic device determines the mode type corresponding to the second application interface based on the start-up operation, including: when the start-up operation is obtained through an electronic pen, the electronic device determines that the mode type corresponding to the second application interface is handwriting input; when the start-up operation is obtained through a mouse or a finger, the electronic device determines that the mode type corresponding to the second application interface is keyboard input or browsing mode.
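As a minimal sketch, the two alternative decision rules above (deciding by the display screen that received the start-up operation, or by the input tool used) could look like the following; the string labels are assumptions for illustration only:

```python
def mode_from_screen(screen: str) -> str:
    """Alternative 1: decide by which display screen the start-up
    operation was acquired through."""
    return "handwriting_input" if screen == "second" else "keyboard_or_browsing"


def mode_from_tool(tool: str) -> str:
    """Alternative 2: decide by the input tool that performed the
    start-up operation."""
    return "handwriting_input" if tool == "electronic_pen" else "keyboard_or_browsing"
```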
  • the embodiments of the present application provide an electronic device, which can be used in the field of human-computer interaction.
  • the electronic device includes a first display screen, a second display screen, a memory, one or more processors, and one or more programs; the one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the electronic device is caused to perform the following steps: displaying a first application interface through the first display screen; when a first operation is detected, changing the mode type corresponding to the first application interface to handwriting input; and, in response to the input mode of the handwriting input, triggering the display of the first application interface on the second display screen, so as to obtain, through the second display screen, the handwritten content for the first application interface.
  • the electronic device may also be used to implement the steps performed by the electronic device in various possible implementation manners of the fourteenth aspect.
  • an embodiment of the present application provides a computer program, which, when running on a computer, enables the computer to execute the application interface processing method described in the fourteenth aspect above.
  • an embodiment of the present application provides an electronic device, including a processor, where the processor is coupled with a memory; the memory is used for storing a program; and the processor is used for executing the program in the memory, causing the electronic device to execute the application interface processing method described in the fourteenth aspect above.
  • an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when it runs on a computer, it causes the computer to execute the application interface processing method described in the fourteenth aspect above.
  • an embodiment of the present application provides a chip system, where the chip system includes a processor for supporting the implementation of the functions involved in the above aspects, for example, sending or processing the data involved in the above methods and/or information.
  • the chip system further includes a memory for storing necessary program instructions and data of a server or a communication device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • a twentieth aspect of the embodiments of the present invention provides a screen display method, which is applied to an electronic device including a first display screen and a second display screen, and the screen display method includes:
  • when the display area of the application control bar is a first display area, the application control bar includes a first control key group;
  • when the display area of the application control bar is a second display area, the application control bar includes a second control key group;
  • the first control key group and the second control key group are both control key sets used to control the target application, and the control keys included in the first control key group and the second control key group are not identical.
  • the electronic device may be an electronic device having two display screens connected together (for example, by a shaft connection), where the two display screens may be two independent display screens, or may be a flexible folding screen or a curved screen divided into two displays that can be used to perform different functions.
  • the electronic device can be an electronic device that works independently as a whole, such as a personal notebook, or it can be formed by connecting two electronic devices that can each work independently and making them work together, such as two mobile phones connected to each other.
  • the first operation may be an operation directly acting on the application control bar.
  • the first operation may change the display area of the application control bar through touch gestures; or, the first operation may be clicking (by finger or mouse) the zoom-in and zoom-out buttons of the application control bar to change its display area; or, the first operation may be dragging the border of the application control bar with the mouse to change its display area.
  • the first operation can also be an operation that indirectly acts on the application control bar.
  • the first operation may act directly on the control area in the above three ways, changing the display area of the application control bar by changing the display area of the control area; or, the first operation may act directly, in the above three ways, on other application display interfaces or input modules (a virtual keyboard or a handwriting input area, etc.) in the second display screen, changing the display area of the application control bar by changing the display area of other display modules on the second display screen; or, the first operation may be the user's operation on the target application in the first display screen: for example, when the number of control keys corresponding to the user's first operation differs from the number of control keys displayed in the application control bar before the first operation, the display area of the application control bar can be adjusted adaptively, so that the control keys corresponding to the first operation are displayed better, providing users with more convenient input operations and improving the user experience.
  • the display layout of the virtual keyboard changes.
  • the display area of the virtual keyboard is correspondingly reduced, and the layout of the keys in the virtual keyboard also changes with the change of the display area; for example, all or part of the keys may be reduced in size, part of the keys may be removed, or the spacing between keys may be compressed.
  • the display area of the virtual keyboard increases accordingly, and the layout of the keys in the virtual keyboard also changes with the change of the display area.
  • other functional modules such as a touchpad can also be added on the basis of the virtual keyboard.
  • the display layout of the second display screen can leave the second display screen with no blank, non-display parts and no overlapping display parts, which optimizes the display layout on the second display screen and improves the user experience.
  • the interface of the target application includes a third control key group
  • the second display area is larger than the first display area
  • the second control key group includes a first control key group and a third control key group
  • the interface of the target application includes a third control key group.
  • the size of the content originally displayed in the first display screen can be increased, or new display content can be added on the basis of the original display content, providing users with more convenient operations and improving the user experience.
  • the third control key group is determined according to the second display area and the priority order of the to-be-displayed control keys in the to-be-displayed control key set of the target application.
  • the set of to-be-displayed control keys of the target application may be provided by the application program, and is the set of control keys that can be displayed in the application control bar.
  • the priority order of the control keys to be displayed in the set may be specified by the application program, or may be determined by the operating system according to the functions of the control keys to be displayed or the usage frequency of the user and other factors.
  • the set of control keys to be displayed is provided by the application program, the priority order of the control keys to be displayed is specified by the application program or the operating system, and the operating system determines which control keys are added to the application control bar when its display area increases; this makes it possible to determine the control keys displayed in the application control bar under its various display areas, makes the setting of the application control bar more flexible, and can support various operation modes and needs of users.
  • the control keys to be added in the application control area are determined according to the priority order of the control keys to be displayed, which, when the display area of the application control bar is limited, gives priority to displaying the control keys with higher priority (that is, the more important or more frequently used ones) in the application control bar, providing users with more convenient operations and improving the user experience.
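A priority-driven selection like the one described could be sketched as follows: candidate keys are taken in priority order until the enlarged area budget is used up. The tuple shape, key names, and numeric areas are assumptions for illustration:

```python
def select_added_keys(candidates, extra_area: float):
    """Return the names of to-be-displayed control keys that fit into the
    additional display area gained when the application control bar grows.

    candidates: list of (key_name, priority, key_area) tuples, where a
    lower priority value means a more important key.
    """
    chosen, used = [], 0.0
    # Walk keys from highest priority (lowest value) to lowest.
    for name, _priority, area in sorted(candidates, key=lambda k: k[1]):
        if used + area <= extra_area:
            chosen.append(name)
            used += area
    return chosen
```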
  • the third control key group is displayed at a position closer to the first display screen than the first control key group.
  • the first control key group is displayed at a position closer to the user's hands than the third control key group. That is, each time the application control bar is enlarged, the newly added control keys are displayed near the first display screen, while the control keys originally displayed in the application control bar stay closer to the user's hands, which is convenient for the user to operate. In the process of expanding the display area of the application control bar, the newly added control keys often have a lower priority than the control keys originally displayed there, so keeping the original control keys close to the user's hands provides users with more convenient operations and improves the user experience.
  • the interface of the target application does not include the fourth control key group, and the fourth control key group is a control key set for controlling the target application;
  • the second display area is smaller than the first display area
  • the second control key group is the first control key group from which the fourth control key group has been removed
  • the interface of the target application includes part or all of the fourth control key group.
  • when the user wants to display fewer control keys in the application control bar, or the number of control keys corresponding to the user's current operation is small, or the user needs to enlarge the display area of other display modules on the second display screen, reducing the display area of the application control bar and reducing the control keys displayed in it can save display space on the second display screen, reduce visual disturbance to the user, and help the user quickly locate the required control keys.
  • some or all of the control keys in the fourth control key group are displayed on the first display screen, so that when the user needs this part of the control keys, the operation can still be performed through the first display screen, which makes up for the impact on user operations when the application control bar is reduced and improves the user experience.
  • the fourth control key group is determined according to the second display area and the priority order of the control keys in the first control key group or the positional relationship of the control keys in the first control key group.
  • the priority order of the control keys in the first control key group may be specified by the application program, or may be specified by the system. Determining, according to this priority order, which control keys to remove from the application control bar when its display area is reduced keeps the control keys with higher priority in the application control bar and improves the user's operating experience.
  • when the user performs the operation of reducing the display area of the application control bar, a specific region of the application control bar is hidden; for example, when part of the displayed content is hidden by dragging, which control keys to hide can be determined according to the positions of the control keys, so as to achieve the user's operational purpose.
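One way to realize the priority-based reduction described above is to drop the least important keys until the bar fits the smaller area; the data shapes and key names below are illustrative assumptions:

```python
def keys_to_remove(current_keys, target_area: float):
    """Return the names of control keys (the fourth control key group)
    removed from the application control bar when its display area shrinks.

    current_keys: list of (key_name, priority, key_area) tuples, where a
    lower priority value means a more important key.
    """
    removed = []
    used = sum(area for _, _, area in current_keys)
    # Drop keys starting from the lowest priority (highest value).
    for name, _priority, area in sorted(current_keys, key=lambda k: -k[1]):
        if used <= target_area:
            break
        removed.append(name)
        used -= area
    return removed
```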
  • the second control key group is a control key group corresponding to the second display area of the application control bar.
  • the control key group corresponding to the second display area of the application control bar may be provided by the application program.
  • the application can define corresponding control key groups for several fixed-size display areas of the application control bar, and when the application control bar takes one of these display areas, the control key group corresponding to that display area is displayed; alternatively, the application can also define corresponding control key groups for several size ranges of the display area of the application control bar, and when the actual display area of the application control bar falls within a certain size range, the control key group corresponding to that size range is displayed in the application control bar.
  • Using the above method to determine the control keys displayed in the application control bar can greatly reduce the calculation amount of the operating system, shorten the response time of the operating system when executing the first operation, and improve the operation efficiency.
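The range-based lookup described above amounts to a simple table search, which is why it keeps the operating system's computation small; the ranges and key groups below are made-up examples, not values from the disclosure:

```python
# Hypothetical application-provided table: (area_lo, area_hi) -> key group.
RANGE_KEY_GROUPS = [
    ((0, 200),   ["copy", "paste"]),
    ((200, 400), ["copy", "paste", "bold", "italic"]),
    ((400, 800), ["copy", "paste", "bold", "italic", "insert", "layout"]),
]


def group_for_area(area: float):
    """Return the control key group whose size range contains the actual
    display area of the application control bar."""
    for (lo, hi), group in RANGE_KEY_GROUPS:
        if lo <= area < hi:
            return group
    return []  # no predefined group for this area
```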
  • the first operation is a gesture operation
  • a first type of virtual keyboard corresponding to the gesture operation is selected from multiple types of virtual keyboards, wherein the virtual keys included in different types of virtual keyboards in the multiple types of virtual keyboards are not exactly the same;
  • the second display area is determined according to the display area of the virtual keyboard of the first type.
  • the application control bar and input modules such as the virtual keyboard may be displayed on the second display screen at the same time.
  • Different gestures can open different gesture virtual keyboards.
  • different types of virtual keyboards may differ in display area, display position, and so on.
  • the application control bar can flexibly adapt to the gesture virtual keyboard, so that the display on the second display screen is more reasonable, neat and beautiful, and the user experience is improved.
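A sketch of the gesture-to-keyboard adaptation, under the assumption that each gesture maps to a keyboard type with a known footprint and the application control bar takes the remaining space; the gesture names and sizes are illustrative:

```python
# Hypothetical mapping from gesture to virtual keyboard type and footprint.
KEYBOARD_TYPES = {
    "two_hand_gesture": {"name": "full_keyboard", "area": 600.0},
    "one_hand_gesture": {"name": "mini_keyboard", "area": 300.0},
}


def control_bar_area(gesture: str, screen_area: float) -> float:
    """Return the second display area available to the application control
    bar after the gesture-selected virtual keyboard is laid out."""
    kb = KEYBOARD_TYPES.get(gesture)
    if kb is None:
        return screen_area            # no keyboard: the bar keeps its space
    return screen_area - kb["area"]   # bar adapts to the keyboard footprint
```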
  • the interface of the target application is displayed on the second display screen, so as to obtain the handwritten content for the interface of the target application through the second display screen, and the second operation instructs to enable the handwriting input mode of the target application;
  • the second display screen does not include an application control bar.
  • the interface of the target application may be displayed on the second display screen, so as to obtain the handwritten content for the interface of the target application through the second display screen.
  • the application control bar in the second display screen can be hidden, so as to save the display controls on the second display screen, so that the content displayed on the second display screen can be reduced. It is more concise and refreshing, avoiding the interference of the application control bar for handwriting input.
  • the first operation is used to switch the input mode to the handwriting input mode
  • a handwriting input area and/or a control key group associated with the handwriting input mode are displayed in the application control bar;
  • a handwriting input area can be displayed in the application control bar, which enables the user to perform handwriting input operations through the application control bar more conveniently and improves the operation efficiency.
  • the control key group associated with the handwriting input method can also be displayed in the application control bar, such as: pen, eraser, color, font, etc., so that users can operate the handwriting input method through the application control bar, providing users with more convenience operation.
  • the handwriting input area and the control key group associated with the handwriting input mode can also be displayed in the application control bar, and the above beneficial effects can be achieved.
  • the method further includes:
  • the at least one first vibration feedback element is instructed to emit vibration waves to perform a first feedback operation, and the first feedback operation is used to prompt that the first control key is a key of the application control bar.
  • the control area displayed in the second display screen may include a system control bar and an application control bar.
  • a feedback operation is provided, which enables the user to locate the position of the application control bar without moving his sight to the second display screen.
  • a feedback operation can also be set for the control keys in the system control bar in the control area, so that the user can locate the system control bar in the control area without moving their eyes to the second display screen position and the control keys in the system control bar, which greatly improves the operation efficiency.
  • the closing of the application control bar may be implemented in any of the following manners:
  • the opening of the application control bar may be implemented in any of the following manners:
  • the above methods of opening and closing the application control bar are only exemplary; the above design enables the user to activate or close the application control area in a flexible manner whatever content is displayed on the second display screen, providing users with more convenient operations and improving the user experience.
  • a twenty-first aspect of an embodiment of the present invention provides a screen display method.
  • the screen display method is applied to an electronic device including a first display screen and a second display screen.
  • the screen display method includes:
  • the interface of the target application includes a fifth control key group
  • a fifth control key group is displayed in the application control bar, and the fifth control key group in the interface of the target application is hidden.
  • the control key corresponding to the user's operation is displayed in the application control bar, so that the shortcut control keys corresponding to the user's current operation can always be displayed in the application control bar, providing users with more convenient operations and improving the user's operational efficiency.
  • the display of this part of the control keys in the first display screen can be removed, which saves display area on the first display screen and makes the display content of the first display screen more concise and refreshing.
  • the size of the content originally displayed in the first display screen can be increased, or new display content can be added on the basis of the original display content, providing users with more convenient operations and improving the user experience.
  • the screen display method further includes:
  • the display area of the application control bar is changed.
  • the number of control keys in the application control bar may change.
  • the display area of the application control bar can be adjusted adaptively to optimize the display of the control keys in the application control bar, so that the display of the control keys better matches the user's usage habits and improves the user experience.
  • In combination with the twenty-first aspect or the first possible implementation manner of the twenty-first aspect, in a second possible implementation manner of the twenty-first aspect, before the fifth control key group is displayed in the application control bar, the screen display method further includes:
  • the application control bar includes a sixth control key group, and the sixth control key group is a set of initial control keys for controlling the target application.
  • an initial control key group for controlling the target application can be displayed in the application control bar.
  • a fifth control key group corresponding to the user's current operation can be added on the basis of the initial control key group, or the fifth control key group corresponding to the user's current operation can replace part or all of the initial control key group, so that the control keys most relevant to the user's current operation are always displayed in the application control bar, providing users with more convenient operations and improving user operation efficiency.
  • this part of the control keys in the interface of the target application can be removed, which saves display area on the first display screen and makes the display content of the first display screen more concise and refreshing.
  • In combination with the twenty-first aspect or the first possible implementation manner of the twenty-first aspect, in a third possible implementation manner of the twenty-first aspect, after the fifth control key group is displayed in the application control bar, the screen display method further includes:
  • a seventh control key group is displayed in the application control bar, and the seventh control key group in the interface of the target application is hidden.
  • the user may continue to perform the fourth operation on the interface of the same target application.
  • a seventh control key group corresponding to the user's fourth operation may be added on the basis of the fifth control key group in the application control bar, or the Some or all of the control keys in the fifth control key group are replaced with a seventh control key group corresponding to the user's fourth operation.
  • the fourth operation may also be to display a new target application on the first display screen; the fourth operation may be implemented by opening a new target application, or, when the target application was originally running in the background, by displaying its interface on the first display screen.
  • when the interface of the new target application is displayed on the first display screen, the interface of the original target application in the first display screen can be hidden, and at this time the fifth control key group can be replaced with the seventh control key group.
  • the interfaces of the two target applications may be displayed on the first display screen at the same time (for example, dual-screen display), and at this time a fifth control key group may be added on the basis of the seventh control key group, that is, the seventh control key group and the fifth control key group are displayed together in the application control bar.
  • control keys in the application control bar can be flexibly changed, and the control keys most closely related to user operations can always be displayed in the application control bar, providing users with more convenient operations and improving operation efficiency.
  • the third operation is to select the target object in the interface of the target application
  • the fifth control key group is the control keys for operating the target object.
  • the third operation may be to select the target object in the interface of the target application, for example, the shading of the target object is deepened to indicate that it is selected, or the third operation may be to select the target object by moving the cursor on the target object.
  • the target object selected by the third operation may be a picture or text
  • the fifth control key group may include control keys related to text or picture editing, so that the user can edit the text or picture through the control keys in the application control bar.
  • the target object selected by the third operation may be audio and video
  • the fifth control key group may include a control key group related to audio and video control, which is convenient for the user to control the audio and video through the control keys in the application control bar.
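The object-type-to-key-group mapping described in the last few items could be sketched as a plain lookup table; the object types and key names below are illustrative assumptions:

```python
# Hypothetical mapping from the selected target object's type to the
# fifth control key group shown in the application control bar.
OBJECT_KEY_GROUPS = {
    "text":    ["font", "size", "bold", "italic", "color"],
    "picture": ["crop", "rotate", "filter", "annotate"],
    "av":      ["play_pause", "seek", "volume", "subtitle"],
}


def fifth_key_group(object_type: str):
    """Return the control keys for operating the selected target object;
    unknown object types yield an empty group."""
    return OBJECT_KEY_GROUPS.get(object_type, [])
```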
  • the third operation is to move the cursor to the target position in the interface of the target application
  • the fifth control key group is the control keys in the menu bar displayed when the right mouse button is clicked at the target position.
  • the control keys in the menu bar displayed when the cursor is right-clicked are displayed in the application control bar; since the right-click menu bar is designed according to the user's habits, its control keys have a high probability of meeting the user's current operation needs, and directly reusing the control keys in the right-click menu bar avoids secondary development by the developer and shortens the development cycle.
  • the third operation is to browse the content in the target area of the interface of the target application by sliding gestures or scrolling the mouse wheel;
  • the fifth control key group is the thumbnail image of the target area and the positioning frame for quickly locating the target object in the thumbnail image.
  • the user can quickly locate the target content required by the user by applying the thumbnail image of the target area in the control bar and the positioning frame for quickly locating the target object in the thumbnail image, thereby improving the user's operation efficiency.
  • a twenty-second aspect of the embodiments of the present invention provides an electronic device, including:
  • when the display area of the application control bar is a first display area, the application control bar includes a first control key group;
  • when the display area of the application control bar is a second display area, the application control bar includes a second control key group;
  • the first control key group and the second control key group are both control key sets used to control the target application, and the control keys included in the first control key group and the second control key group are not identical.
  • one or more processors cause the electronic device to perform the following steps when executing one or more programs:
  • the display layout of the virtual keyboard changes.
  • the interface of the target application includes a third control key group
  • the second display area is larger than the first display area
  • the second control key group includes a first control key group and a third control key group
  • the interface of the target application includes a third control key group.
  • one or more processors cause the electronic device to perform the following steps when executing one or more programs:
  • the third control key group is determined according to the second display area and the priority order of the to-be-displayed control keys in the to-be-displayed control key set of the target application.
  • the interface of the target application does not include the fourth control key group, and the fourth control key group is a control key set for controlling the target application;
  • the second display area is smaller than the first display area
  • the second control key group is the first control key group from which the fourth control key group has been removed
  • the interface of the target application includes part or all of the fourth control key group.
  • one or more processors cause the electronic device to perform the following steps when executing one or more programs:
  • the fourth control key group is determined according to the second display area and the priority order of the control keys in the first control key group or the positional relationship of the control keys in the first control key group.
  • the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
  • the first operation is a gesture operation
  • a first type of virtual keyboard corresponding to the gesture operation is selected from multiple types of virtual keyboards, wherein the virtual keys included in different types of virtual keyboards in the multiple types of virtual keyboards are not exactly the same;
  • the second display area is determined according to the display area of the virtual keyboard of the first type.
  • the one or more processors, when executing the one or more programs, cause the electronic device to perform the following steps:
  • the interface of the target application is displayed on the second display screen, so as to obtain, through the second display screen, handwritten content for the interface of the target application, where the second operation instructs to enable the handwriting input mode of the target application;
  • the second display screen does not include an application control bar.
  • the electronic device provided by the twenty-second aspect of the embodiment of the present invention can implement various possible implementation manners described in the twentieth aspect of the embodiment of the present invention, and achieve all beneficial effects.
  • a twenty-third aspect of the embodiments of the present invention provides an electronic device, including:
  • the interface of the target application includes a fifth control key group
  • a fifth control key group is displayed in the application control bar, and the fifth control key group in the interface of the target application is hidden.
  • before the electronic device displays the fifth control key group in the application control bar, the following steps are performed:
  • the application control bar includes a sixth control key group, and the sixth control key group is a collection of initial control keys used to control the target application;
  • the interface of the target application does not include the sixth control key group.
  • the seventh control key is not included in the interface of the target application.
  • the third operation is to select the target object in the interface of the target application
  • the fifth control key group is the control keys for operating the target object.
  • the third operation is to move the cursor to the target position in the interface of the target application
  • the fifth control key group is the control keys in the menu bar displayed when the right mouse button is clicked at the target position.
  • the third operation is to browse the content in the target area of the interface of the target application by sliding gestures or scrolling the mouse wheel;
  • the fifth control key group is the thumbnail image of the target area and the positioning frame for quickly locating the target object in the thumbnail image.
  • the electronic device provided in the twenty-third aspect of the embodiment of the present invention can implement various possible implementation manners described in the twenty-first aspect of the embodiment of the present invention, and achieve all beneficial effects.
  • a twenty-fourth aspect of the embodiments of the present invention provides a computer storage medium, where a program is stored in the computer-readable medium, and when the program is run on a computer, the computer is enabled to implement the screen display method described in the twentieth aspect or any one of the eleven possible implementation manners of the twentieth aspect.
  • a twenty-fifth aspect of the embodiments of the present invention provides a computer program product, which, when run on a computer, enables the computer to implement the screen display method described in the twentieth aspect or any one of the eleven possible implementation manners of the twentieth aspect, or the screen display method described in the twenty-first aspect or any one of the six possible implementation manners of the twenty-first aspect, and achieve all the above-mentioned beneficial effects.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is another schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a touch screen provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of two arrangements of multiple vibration feedback units in an electronic device provided by an embodiment of the present application.
  • FIG. 5 is a schematic cross-sectional view of a touch screen provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an arrangement and layout of a plurality of vibration feedback units included in a vibration feedback module provided by an embodiment of the present application;
  • FIG. 7 is another schematic structural diagram of a touch screen provided by an embodiment of the present application.
  • FIG. 8 is another schematic structural diagram of a touch screen provided by an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of a feedback method provided by an embodiment of the present application.
  • FIG. 10 shows two schematic diagrams of a virtual keyboard in the feedback method provided by an embodiment of the present application.
  • FIG. 11 shows two schematic diagrams of a first location area and a second location area in a feedback method provided by an embodiment of the present application.
  • FIG. 12 is another schematic diagram of the first location area and the second location area in the feedback method provided by the embodiment of the present application.
  • FIG. 13 is still another schematic diagram of the first location area and the second location area in the feedback method provided by the embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 16 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 17 is a schematic flowchart of a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 18 is a schematic diagram of a first gesture parameter in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 19 is a schematic diagram of relative angle information in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 20 shows two schematic diagrams of a first area in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 21 is a schematic diagram of a first gesture operation in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 22 is another schematic diagram of a first gesture operation in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 23 is a schematic diagram of a first type of virtual keyboard in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 24 is another schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 25 is another schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 26 is a further schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 27 is another schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 28 is still another schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 29 is another schematic diagram of the first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 30 is a schematic diagram of a first setting interface in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 31 is another schematic diagram of a first setting interface in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 32 is a schematic diagram of a custom gesture operation in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 33 is still another schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 34 is another schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 35 is a further schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 36 is a schematic diagram of a second virtual key in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 37 is another schematic diagram of a second virtual key in a method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 39 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 40 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 41 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 43 is a schematic interface diagram of a display interface of a second display screen in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 44 is a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • FIG. 45 is another schematic flowchart of the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 46 is a schematic diagram of various holding postures in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 47 is a schematic interface diagram of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 48 is a schematic diagram of two interfaces of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 49 is a schematic diagram of a first contact operation in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 50 is a schematic diagram of a display interface of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 51 is a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • FIG. 52 is a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • FIG. 53 is a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • FIG. 54 is a schematic diagram of a display interface of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 55 is a schematic diagram of a display interface of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 56 is a schematic diagram of a display interface of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 57 is a schematic diagram of a display interface of a first application interface in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 58 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 59 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 60 is a dual-screen electronic device provided by an embodiment of the present invention.
  • FIG. 61 is an application scenario provided by an embodiment of the present invention.
  • FIG. 62 is a screen display method provided by an embodiment of the present invention.
  • FIG. 63A is a display mode of a control area provided by an embodiment of the present invention.
  • FIG. 63B is another display mode of the control area provided by an embodiment of the present invention.
  • FIG. 63C is another display mode of the control area provided by an embodiment of the present invention.
  • FIG. 64A is a method for activating a control area provided by an embodiment of the present invention.
  • FIG. 64B is another method for activating a control area provided by an embodiment of the present invention.
  • FIG. 64C is another method for activating a control area provided by an embodiment of the present invention.
  • FIG. 64D is another method for activating the control area provided by an embodiment of the present invention.
  • FIG. 65A is a corresponding manner of a user operation and a control key group provided by an embodiment of the present invention.
  • FIG. 65B is another corresponding manner of user operation and control key group provided by an embodiment of the present invention.
  • FIG. 65C is another corresponding manner of user operation and control key group provided by an embodiment of the present invention.
  • FIG. 65D is another corresponding manner of user operation and control key group provided by an embodiment of the present invention.
  • FIG. 65E is another corresponding manner of user operation and control key group provided by an embodiment of the present invention.
  • FIG. 65F is another corresponding manner of user operation and control key group provided by an embodiment of the present invention.
  • FIG. 66A is a display mode of a control area provided by an embodiment of the present invention.
  • FIG. 66B is another display mode of a control area provided by an embodiment of the present invention.
  • FIG. 67 is a layout mode of display content in a control area provided by an embodiment of the present invention.
  • FIG. 69 is another priority setting method provided by an embodiment of the present invention.
  • FIG. 70A is a method for closing a control area provided by an embodiment of the present invention.
  • FIG. 70B is another method for closing the control area provided by an embodiment of the present invention.
  • FIG. 70C is another method for closing the control area provided by an embodiment of the present invention.
  • FIG. 70D is another method for closing the control area provided by an embodiment of the present invention.
  • FIG. 71 is another screen display method provided by an embodiment of the present invention.
  • FIG. 72 is another screen display method provided by an embodiment of the present invention.
  • FIG. 73A is a method for changing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 73B is a method for increasing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 73C is a method for increasing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 74A is another method for changing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 74B is another method for increasing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 74C is another method for increasing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 75A is another method for changing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 75B is another method for increasing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 75C is another method for increasing the display area of an application control bar provided by an embodiment of the present invention.
  • FIG. 76A is a method for changing the display area and control keys of an application control bar according to a user operation provided by an embodiment of the present invention.
  • FIG. 76B is another method for changing the display area and control keys of the application control bar according to user operations provided by an embodiment of the present invention.
  • FIG. 77A is a gesture control method provided by an embodiment of the present invention.
  • FIG. 77B is another gesture control method provided by an embodiment of the present invention.
  • FIG. 77C is another gesture control method provided by an embodiment of the present invention.
  • FIG. 77D is another gesture control method provided by an embodiment of the present invention.
  • FIG. 78A is another gesture control method provided by an embodiment of the present invention.
  • FIG. 78B is another gesture control method provided by an embodiment of the present invention.
  • FIG. 79 is another gesture control method provided by an embodiment of the present invention.
  • FIG. 80A is a specific implementation manner of a screen display method provided by an embodiment of the present invention.
  • FIG. 80B is a specific implementation manner of another screen display method provided by an embodiment of the present invention.
  • FIG. 80C is a specific implementation manner of another screen display method provided by an embodiment of the present invention.
  • FIG. 80D is a specific implementation manner of another screen display method provided by an embodiment of the present invention.
  • FIG. 80E is a specific implementation manner of another screen display method provided by an embodiment of the present invention.
  • FIG. 80F is a specific implementation manner of another screen display method provided by an embodiment of the present invention.
  • FIG. 80G is a specific implementation manner of another screen display method provided by an embodiment of the present invention.
  • FIG. 81 is an electronic device provided by an embodiment of the present invention.
  • FIG. 82 is another electronic device provided by an embodiment of the present invention.
  • the embodiments of the present application may be applied to various application scenarios in which input is performed through a virtual keyboard.
  • when users use text entry applications, make presentations (PowerPoint, PPT), browse web pages, play videos, play music, use navigation applications, and other application scenarios, they can perform input through a virtual keyboard.
  • touch typing on the touch screen is a difficult task.
  • an embodiment of the present application provides a feedback method.
  • the feedback method is applied to an electronic device equipped with a touch screen.
  • the electronic device acquires first position information of a first contact point on the touch screen, then acquires a first virtual key corresponding to the first contact point according to the first position information, and, in the case where the first virtual key is an anchor point key, performs a first feedback operation to prompt that the first virtual key is an anchor point key. This helps cultivate the user's muscle memory for the anchor point keys, so that touch typing can be trained by means of this muscle memory, thereby reducing the difficulty of realizing touch typing on a touch screen.
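As an illustrative sketch only (the patent does not fix an implementation; the key rectangles, the choice of "F" and "J" as anchor keys, and all function names below are assumptions for demonstration), the anchor-key lookup flow described above could look like:

```python
# Illustrative sketch of the anchor-point-key feedback flow: map the first
# position information of a contact point to a virtual key, then check
# whether that key is an anchor point key. All coordinates are assumptions.

# Each virtual key: name -> (x_min, y_min, x_max, y_max) on the touch screen.
KEY_RECTS = {
    "F": (300, 400, 360, 460),
    "J": (540, 400, 600, 460),
    "G": (360, 400, 420, 460),
}
ANCHOR_KEYS = {"F", "J"}  # assumed home-row anchor point keys

def key_at(position):
    """Map first position information of a contact point to a virtual key."""
    x, y = position
    for name, (x0, y0, x1, y1) in KEY_RECTS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None  # contact point outside the virtual keyboard

def handle_contact(position):
    """Return True when the first feedback operation should be performed."""
    key = key_at(position)
    return key is not None and key in ANCHOR_KEYS
```

A touch at (310, 410) lands inside the assumed "F" rectangle, which is an anchor key, so feedback would be triggered; a touch on "G" would not trigger it.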
  • the electronic device 1 includes a processor 10 and a touch screen 20 .
  • the touch screen 20 includes a touch sensing module 100 and a vibration feedback module 200
  • the vibration feedback module 200 includes a plurality of vibration feedback elements.
  • the processor 10 obtains the first position information of the first contact point on the touch screen through the touch sensing module 100, and when the processor 10 determines that the first virtual key corresponding to the first contact point is the anchor point key, it obtains the vibration feedback element matched with the first virtual key from the multiple vibration feedback elements included in the vibration feedback module 200, and sends out vibration waves through the vibration feedback element matched with the first virtual key, so that vibration feedback is issued at the first contact point (that is, within a preset range around the first contact point on the touch screen), prompting the user that the touched first virtual key is the anchor point key.
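A minimal sketch of selecting, from the plural vibration feedback elements, the element matched with the contact point (the element positions and the nearest-center selection rule are assumptions for illustration; the patent only requires that the matched element, not the whole screen, vibrates):

```python
# Sketch: pick the vibration feedback element closest to the contact point,
# so the vibration is localized around the first contact point rather than
# being full-screen feedback. Element center coordinates are illustrative.
import math

ELEMENT_CENTERS = [(330, 430), (450, 430), (570, 430)]  # assumed positions

def matched_element(contact):
    """Index of the vibration feedback element nearest the contact point."""
    return min(range(len(ELEMENT_CENTERS)),
               key=lambda i: math.dist(ELEMENT_CENTERS[i], contact))
```

Driving only the returned element is one way the feedback intensity can be greatest at the contact point, as the surrounding text describes.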
  • the aforementioned vibration feedback is not a full-screen vibration feedback, but a vibration feedback for the first contact point, where the vibration feedback intensity is the greatest.
  • the electronic device 1 includes a display 30 and a touch screen 20, a virtual keyboard is displayed on the touch screen 20, and there are anchor point keys in the virtual keyboard; that is, the touch screen 20 needs to provide both the virtual keyboard display function and the vibration feedback function, so a display module also needs to be provided in the touch screen 20.
  • the electronic device 1 may also be a virtual reality (VR) device, an augmented reality (AR) device, or a mixed reality (MR) device; that is, the touch screen 20 may not need to display a virtual keyboard but only needs to perform vibration feedback, and a display module does not need to be provided in the touch screen 20. It should be understood that, in the subsequent embodiments, only the case where a display module is provided in the touch screen 20 is taken as an example for description.
  • FIG. 3 is a schematic structural diagram of a touch screen provided by an embodiment of the present application.
  • the touch screen 20 may further include a cover plate 300 and a display module 400 .
  • the cover plate 300 and the touch sensing module 100 are integrated as an example.
  • the cover plate 300 and the touch sensing module 100 may also be separated from each other.
  • the cover plate 300 can be made of a transparent rigid material such as glass, a flexible transparent organic material, or other materials.
  • the touch sensing module 100 can be embodied as a contact sensing film, and the contact sensing film can be a capacitive contact sensing film, a pressure-based contact sensing film, a temperature-based contact sensing film, or another type of film.
  • the contact sensing film may specifically be an indium tin oxide (ITO) wire mesh, a carbon nanotube mesh with protrusions, or other materials, which are not exhaustively listed here.
  • various specific expression forms of the vibration feedback element are provided, which improves the implementation flexibility of the solution.
  • the display module 400 is used to display a virtual keyboard.
  • the display module 400 and the touch sensing module 100 may be integrated or separated from each other.
  • the display module 400 and the touch sensing module 100 are separated from each other as an example.
  • the display module 400 may specifically be represented as a display panel, and the display panel may specifically be a liquid crystal display (LCD), an active-matrix organic light-emitting diode (AMOLED) display, or another type of display panel.
  • the vibration feedback module 200 can be embodied as a vibration feedback layer, and the vibration feedback layer is located below the touch sensing module 100, and can be located above the display module 400 or below the display module 400.
  • the vibration feedback module 200 is configured with a plurality of vibration feedback units 201 , each dark gray diamond in FIG. 3 represents a vibration feedback unit; a vibration feedback unit 201 may include one or more vibration feedback elements.
  • in one case, the vibration feedback layer can be embodied as a vibration feedback film, and the vibration feedback film is partitioned into a plurality of vibration feedback elements; in another case, the vibration feedback element can be embodied as a piezoelectric ceramic sheet, a linear motor, or another type of electronic component, which are not exhaustively listed here.
  • the plurality of vibration feedback units 201 may have various layout arrangements.
  • the layout of the virtual keyboard is completely consistent with that of a physical keyboard, and the aforementioned physical keyboard may be a keyboard with 61 keys, a keyboard with 87 keys, a keyboard with 104 keys, a keyboard with 108 keys, an ergonomic keyboard, or another type of physical keyboard; the design of the specific virtual keyboard can be flexibly set in combination with the actual application scenario.
  • the multiple vibration feedback units 201 may be arranged in a one-to-one correspondence with multiple virtual keys, that is, each virtual key corresponds to one vibration feedback unit 201 in position.
  • FIG. 4 is a schematic diagram of two arrangements of a plurality of vibration feedback units in an electronic device according to an embodiment of the present application.
  • FIG. 4 includes sub-schematic diagram (a) and sub-schematic diagram (b), referring to sub-schematic diagram (a) of FIG. 4 first, a plurality of vibration feedback units 201 are arranged in a matrix.
  • a plurality of vibration feedback units 201 are arranged in a form similar to a chess board, and each gray square in the sub-schematic diagram (a) of FIG. 4 and the sub-schematic diagram (b) of FIG. 4 Both represent a vibration feedback unit 201 .
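With the matrix (chess-board) arrangement just described, finding the vibration feedback unit under a given contact point reduces to integer grid arithmetic. The pitch value and origin below are illustrative assumptions, not values from the patent:

```python
# Sketch: with vibration feedback units 201 arranged in a uniform matrix,
# the unit covering a contact point follows from the grid pitch alone.
# A 60 px pitch with the grid origin at (0, 0) is assumed for demonstration.

UNIT_PITCH = 60  # assumed distance between adjacent vibration feedback units

def unit_index(x, y):
    """Grid (row, col) of the vibration feedback unit covering (x, y)."""
    return (y // UNIT_PITCH, x // UNIT_PITCH)
```

For the one-to-one layout, where each virtual key corresponds to one unit, this index can then be used directly to drive the unit under the touched key.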
  • FIG. 5 is a schematic cross-sectional view of a touch screen provided by an embodiment of the present application
  • FIG. 6 is a schematic diagram of an arrangement layout of a plurality of vibration feedback units included in a vibration feedback module provided by an embodiment of the present application.
  • the touch screen 20 includes a cover plate 300 , a touch sensing module 100 , a display module 400 , a vibration feedback element, a support structure for the vibration feedback element, other modules in the touch screen, and a bottom plate.
  • taking the integration of the cover plate 300 and the touch sensing module 100 as an example, a plurality of vibration feedback elements are parallel to the display module 400 and can directly support the cover plate 300. It should be noted that, in other embodiments, the cover plate 300 and the touch sensing module 100 may also be independent, and the multiple vibration feedback units may also be parallel to the touch sensing module 100. As can be seen from FIG. 5 and FIG. 6, the plurality of vibration feedback units 201 are arranged in a surrounding arrangement, that is, the plurality of vibration feedback units 201 surround the display module 400. Correspondingly, in other embodiments, the plurality of vibration feedback units 201 can also surround the touch sensing module 100.
  • a gap layer may be formed between the display module 400 and the cover plate 300 to provide an active space margin for the vibration feedback element to emit vibration waves, and the display module 400 and the cover plate 300 may also be bonded by a light-transmitting adhesive material.
  • the support structure of the vibration feedback element and the bottom plate can be integrated into one body, or can be separated from each other. In the embodiments of the present application, various arrangement and layout manners of the plurality of vibration feedback units are provided, which improves the implementation flexibility of the solution.
  • the touch screen 20 may further include a pressure sensing module, and the pressure sensing module is used to detect pressure changes and positions on the touch screen.
  • the pressure sensing module and the vibration feedback module 200 may be two independent modules, respectively, and the pressure sensing module may be disposed above the vibration feedback module 200 or below the vibration feedback module 200 .
  • the pressure sensing module may specifically be expressed as a pressure sensing film, a distributed pressure sensor, or other forms, which are not exhaustive here.
  • the pressure sensing module may be integrated with the vibration feedback module 200, and the vibration feedback module 200 may also be called a pressure sensing module, and the vibration feedback element may also be called a pressure sensing element.
  • the vibration feedback element can specifically use piezoelectric ceramic sheets, piezoelectric polymers (such as piezoelectric films), piezoelectric composite materials, or other types of elements, etc.
  • the piezoelectric composite material is obtained by combining a piezoelectric ceramic sheet and a piezoelectric polymer.
  • the plurality of vibration feedback elements included in the vibration feedback module 200 may be divided by function: a second vibration feedback element among the plurality of vibration feedback elements is used for collecting the pressure value, and a third vibration feedback element among the plurality of vibration feedback elements is used to emit vibration waves for vibration feedback.
  • the second vibration feedback element and the third vibration feedback element are different vibration feedback elements.
  • a vibration feedback unit 201 includes two vibration feedback elements, one vibration feedback element in the same vibration feedback unit 201 is used to collect pressure values, and the other vibration feedback element is used to emit vibration waves for vibration feedback.
  • a plurality of vibration feedback elements in the vibration feedback module 200 are used to collect pressure values during the first time period, and are used to emit vibration waves during the second time period , the first time period and the second time period are different.
  • a plurality of vibration feedback elements in the vibration feedback module 200 may be used to collect pressure values in a default state and, when a first pressure value threshold is reached (that is, receipt of a press operation is confirmed), to emit vibration waves for vibration feedback.
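The threshold-triggered mode switch above can be sketched as a small decision function (the threshold value, units, and return labels are illustrative assumptions, since the patent does not specify them):

```python
# Sketch of the dual-role element behavior: collect pressure values by
# default, and once the first pressure value threshold is reached (a press
# operation is confirmed), switch to emitting vibration waves.

FIRST_PRESSURE_THRESHOLD = 2.0  # assumed threshold, arbitrary units

def element_action(pressure_value):
    """Decide the element's role for the current pressure reading."""
    if pressure_value >= FIRST_PRESSURE_THRESHOLD:
        return "emit_vibration"   # press confirmed -> vibration feedback
    return "collect_pressure"     # default state -> keep sensing
```

This illustrates why the same physical element can serve both the pressure sensing and the vibration feedback roles without a separate module.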
  • the touch screen is also configured with a pressure sensing module for collecting pressure values, so that not only the position information of the contact point but also the pressure value of the contact point can be obtained, and the contact operations detected on the touch screen can be managed in finer detail; moreover, integrating the pressure sensing module and the vibration feedback module into one is beneficial to reducing the thickness of the touch screen, thereby improving the convenience of the electronic device.
  • the tactile properties of the cover plate 300 of the touch screen 20 can be changed, and the tactile properties include any one or more of the following: sliding friction coefficient, stick-slip (the rate of change of the sliding friction coefficient), temperature, or other tactile properties. Further, the tactile properties of the entire cover plate 300 may be changed, or only the tactile properties at the contact point on the cover plate 300 may be changed.
  • FIG. 7 is a schematic structural diagram of a touch screen provided by an embodiment of the present application.
  • the touch screen 20 may further include an ultrasonic module 500, and the ultrasonic module 500 is used to emit ultrasonic waves to change the tactile characteristics of the cover plate 300.
  • an ultrasonic vibration film, a piezoelectric film, a speaker, or other components can be used to realize the ultrasonic vibration.
  • the ultrasonic module 500 can be arranged below the cover plate 300; specifically, it can be arranged above the touch sensing module 100 or the display module 400, or below the touch sensing module 100 or the display module 400. In FIG. 7, the arrangement above the contact sensing module 100 is taken as an example; it should be understood that the example in FIG. 7 is only for the convenience of understanding this solution and is not intended to limit it.
  • the touch screen 20 further includes an electrostatic module 600, and the electrostatic module 600 is used to generate electrical signals to change the haptic properties of the cover plate.
  • the electrostatic module 600 can be embodied as an electrostatic thin film layer and can be arranged under the cover plate 300; specifically, it can be arranged above the touch sensing module 100 or the display module 400, or below the touch sensing module 100 or the display module 400.
  • in FIG. 8, the configuration above the touch sensing module 100 is taken as an example. It should be understood that the example in FIG. 8 is only for the convenience of understanding this solution and is not intended to limit it.
  • the touch screen can also change the tactile characteristics of the cover by means of an ultrasonic module or an electrostatic module, so as to provide richer haptic feedback; this richer haptic feedback can then be used to train the user in touch typing on the touch screen, further reducing the difficulty of touch typing on the touch screen.
  • FIG. 9 is a schematic flowchart of a feedback method provided by an embodiment of the present application.
  • the feedback method provided by the embodiment of the present application may include:
  • the electronic device detects a first contact operation acting on the touch screen, and in response to the first contact operation, acquires first position information of a first contact point corresponding to the first contact operation.
  • the electronic device can detect, in real time, a first contact operation acting on the touch screen.
  • when the electronic device detects the first contact operation input by the user through the touch screen, it can, in response to the first contact operation, obtain the number of at least one first contact point and the first position information of each first contact point on the touch screen, as collected by the contact sensing module in the touch screen.
  • the at least one first contact point may only include newly added contact points on the touch screen, or may include all contact points on the touch screen.
  • the first position information is established based on the coordinate system of the touch screen, which may take the center point of the touch screen, the upper-left vertex, the lower-left vertex, the upper-right vertex, the lower-right vertex, or any other position point on the touch screen as the origin of the coordinate system.
  • the touch sensing module in the touch screen continuously detects the touch signal corresponding to each touch point on the touch screen, and collects the position information of the newly added at least one first touch point in time whenever a touch signal of a new touch point is detected on the touch screen.
  • the virtual keyboard in the embodiment of the present application can be expressed as any type of keyboard.
  • the virtual keyboard can be a full keyboard, a numeric keyboard, a function keyboard, etc.; alternatively, the virtual keyboard can also be a collective name for all operation keys on the touch screen.
  • an accidental touch prevention process also needs to be performed.
  • not only can the user's fingers generate contact points on the touch screen, but the user's palm, forearm, back of the hand, a capacitive pen, etc. can also generate contact points on the touch screen; that is, the touch sensing module of the electronic device may collect touch signals of touch points generated by the user's palm, forearm, back of the hand, a capacitive pen, or other non-finger sources. The processor of the electronic device must therefore, after acquiring the touch signal corresponding to each new contact point on the touch screen, perform filter analysis to filter out the acquired touch signals of new touch points other than those triggered by fingers; that is, the first touch point only includes new touch points triggered by the user's fingers.
  • since the user often focuses on the physical key that is newly touched when using a physical keyboard, this solution only generates feedback for newly added touch points, which better simulates the experience of input on a physical keyboard, makes it easier for the user to establish a memory relationship with the new contact point, and further reduces the difficulty of training touch typing on the touch screen.
  • a proximity sensing module may also be configured in the touch screen.
  • the electronic device senses the movement trajectory of the user's finger above the touch screen through the proximity sensing module in the touch screen, and estimates the expected point of contact between the finger and the touch screen.
  • the electronic device may also, in response to a detected first gesture operation, select a first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, wherein the virtual keys included in different types of virtual keyboards are not exactly the same; the first type of virtual keyboard is displayed through the touch screen, and during the display of the first type of virtual keyboard, its position on the touch screen is fixed.
  • when the electronic device determines that the virtual keyboard of the first type is a virtual keyboard whose position is fixed during display, the electronic device acquires the first position information of the first contact point on the touch screen in real time; that is, it is triggered to enter step 901.
  • the first gesture operation, the concept of multiple types of virtual keyboards, and the specific implementation of the foregoing steps will be described in the second embodiment, which will not be repeated here.
  • the electronic device acquires a pressure value corresponding to the first contact point.
  • when the electronic device obtains the first contact operation input by the user through the touch screen, it can also collect, through the pressure sensing module in the touch screen, the pressure value corresponding to the at least one first contact point on the touch screen.
  • the pressure value corresponding to the at least one first contact point on the touch screen may include a pressure value for each of the at least one first contact point, or the at least one first contact point may share one pressure value.
  • if the pressure sensing module in the touch screen is an independent module, the pressure sensing module can directly collect the pressure value of each first contact point in the at least one first contact point.
  • if the pressure sensing module is integrated with the vibration feedback module, and the multiple vibration feedback units are arranged in any of the layouts shown in FIG. 4 to FIG. 6 above (that is, in a matrix arrangement, a checkerboard-like arrangement, or a surrounding arrangement), the pressure sensing module can also directly collect the pressure value of each of the at least one first contact point.
  • the electronic device can take readings from all the pressure sensing elements (also referred to as vibration feedback elements) in the pressure sensing module.
  • the electronic device may, according to the coordinate position of each pressure sensing element and the pressure value collected by each pressure sensing element, obtain the pressure value of each first contact point in the at least one first contact point according to the principle of moment balance (equal torque).
  • the electronic device may also calculate the pressure value of the entire touch screen based on the pressure values collected by all the pressure sensing elements, and determine the pressure value of each first contact point in the at least one first contact point to be the pressure value of the entire touch screen.
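The moment-balance idea above can be sketched for the single-contact case: the total force is the sum of the per-element readings, and the contact coordinates follow from balancing moments about each axis. This is an illustrative sketch, not the embodiment's implementation; the function name and sensor layout are invented for the example.

```python
# Illustrative: recover one contact point's pressure and position from
# per-element readings via moment balance (force-weighted centroid).
def locate_and_weigh(readings):
    """readings: iterable of (x, y, force) tuples, one per pressure sensing element."""
    total = sum(f for _, _, f in readings)
    if total == 0:
        return None, 0.0
    # Balancing moments about each axis gives the contact coordinates.
    cx = sum(x * f for x, _, f in readings) / total
    cy = sum(y * f for _, y, f in readings) / total
    return (cx, cy), total
```

With multiple simultaneous contacts, a real implementation would need to attribute the total force among the known contact positions rather than compute a single centroid.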
  • the electronic device acquires a first virtual key corresponding to the first contact point according to the first position information of the first contact point.
  • the electronic device may acquire the first virtual keys corresponding to each first contact point one by one;
  • the first virtual key is a virtual key in the virtual keyboard.
  • the electronic device can display one or more types of virtual keyboards.
  • the position information of each virtual key in each virtual keyboard can be stored in the electronic device; the electronic device determines the currently displayed virtual keyboard from among the various virtual keyboards, obtains the position information of each virtual key in the currently displayed virtual keyboard, and then matches the first position information of the first contact point against the position information of each virtual key in the currently displayed virtual keyboard, thereby determining the first virtual key corresponding to the first contact point.
  • for a more intuitive understanding of this solution, please refer to FIG. 10, which provides two schematic diagrams of a virtual keyboard in a feedback method provided by an embodiment of the present application. The sub-schematic diagram (a) of FIG. 10 and the sub-schematic diagram (b) of FIG. 10 show two styles of virtual keyboards on the touch screen.
  • as an example, the currently displayed virtual keyboard is an ergonomic keyboard.
  • after the electronic device determines the first position information of the first contact point on the touch screen through the contact sensing module of the touch screen, the electronic device compares the first position information with the position information of each virtual key of the ergonomic keyboard; if it is thereby determined that the first contact point is located in the position area of the virtual key K, it is determined that the first virtual key corresponding to the first contact point is the key K.
  • the first position information can describe a position area.
  • the electronic device can take the coordinates of the center point of the first position information, and match the coordinates of the center point of the first position information with the position information of each virtual key in the currently displayed virtual keyboard, so as to determine the first virtual key corresponding to the first contact point.
  • the electronic device may also directly match the first position information of the first contact point with the position information of each virtual key in the currently displayed virtual keyboard, and select as the first virtual key the virtual key whose position information has the largest intersection with the first position information.
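The largest-intersection matching rule can be sketched as follows, modeling the contact area and each key's position area as axis-aligned rectangles `(x0, y0, x1, y1)`. The key names, coordinates, and function names are illustrative assumptions, not part of the embodiment.

```python
def overlap(a, b):
    """Intersection area of two axis-aligned rectangles (x0, y0, x1, y1)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def match_key(contact_rect, key_rects):
    """Return the key whose position area has the largest intersection with
    the contact area, or None if the contact touches no key."""
    best, best_area = None, 0
    for name, rect in key_rects.items():
        area = overlap(contact_rect, rect)
        if area > best_area:
            best, best_area = name, area
    return best
```

For example, a contact area straddling the keys "K" and "J" is attributed to whichever key it covers more.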
  • the electronic device determines, according to the pressure value corresponding to the first contact point, whether the contact operation corresponding to the first contact point is a pressing operation or a touch operation, if it is a pressing operation, go to step 905; if it is a touch operation, go to step 908 .
  • the electronic device may be preset with a first pressure value threshold and a second pressure value threshold, where the first pressure value threshold is the threshold for a pressing operation and the second pressure value threshold is the threshold for a touch operation.
  • the electronic device can determine whether the pressure value corresponding to the first contact point is greater than or equal to the first pressure value threshold: if the pressure value corresponding to the first contact point is greater than or equal to the first pressure value threshold, it is determined that the contact operation corresponding to the first contact point is a pressing operation; if the pressure value corresponding to the first contact point is greater than or equal to the second pressure value threshold but smaller than the first pressure value threshold, it is determined that the contact operation corresponding to the first contact point is a touch operation; if the pressure value corresponding to the first contact point is less than the second pressure value threshold, it is likewise determined that the contact operation corresponding to the first contact point is a touch operation.
  • the value of the first pressure value threshold is greater than the value of the second pressure value threshold.
  • the value range of the first pressure value threshold may be 50 grams-force to 60 grams-force; as an example, the first pressure value threshold can be 55 grams-force, 60 grams-force, or other values.
  • the value range of the second pressure value threshold may be 0 grams-force to 30 grams-force; as an example, the second pressure value threshold can be 15 grams-force, 20 grams-force, etc., which is not limited here.
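The two-threshold classification of step 904 can be sketched as follows. The specific threshold values are illustrative picks from the ranges stated above, and the function name is an assumption.

```python
def classify_contact(pressure_gf, press_threshold_gf=55.0, touch_threshold_gf=20.0):
    """Classify a contact per the two-threshold scheme (pressures in grams-force).
    Thresholds are illustrative picks from the stated ranges
    (press: 50-60 gf, touch: 0-30 gf)."""
    if pressure_gf >= press_threshold_gf:
        return "press"
    # Below the first threshold the operation is treated as a touch; the
    # second threshold marks the nominal lower bound of the touch range.
    return "touch"
```

A press then routes to the anchor-point check (step 905), while a touch routes to step 908.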
  • the electronic device determines whether the first virtual key is an anchor point key. If the first virtual key is an anchor point key, go to step 906, and if the first virtual key is not an anchor point key, go to step 908.
  • in one case, during the display of the first type of virtual keyboard, the position of the displayed virtual keyboard of the first type is fixed; in another case, during the display of the first type of virtual keyboard, the position of the displayed virtual keyboard of the first type can be moved.
  • in one implementation, step 903 is a mandatory step: after determining the first virtual key corresponding to the first contact point in step 903, the electronic device can determine whether the first virtual key is an anchor point key.
  • in another implementation, the electronic device may pre-store which position areas on the touch screen are the position areas of anchor point keys and which position areas are the position areas of non-anchor-point keys; step 903 is then an optional step, and the electronic device directly determines, according to the first position information of the first contact point obtained in step 901, whether the position of the first contact point is located in the position area of an anchor point key, that is, whether the first virtual key corresponding to the first position information is an anchor point key.
  • when the position of the displayed virtual keyboard of the first type can be moved, step 903 is a mandatory step: the electronic device can store the position information of each virtual key of the virtual keyboard of the first type, obtain, after acquiring the first position information of the first contact point, the first virtual key corresponding to the first contact point according to the first position information, and then determine whether the first virtual key is an anchor point key.
  • the first virtual key corresponding to the first contact point can be acquired in real time according to the first position information, so that the solution is compatible not only with a virtual keyboard with a fixed position but also with a virtual keyboard with a movable position, expanding the application scenarios of this solution.
  • the anchor point key has a meaning different from that of a positioning key; that is, the anchor point key refers to a key used to provide a prompting effect to the user.
  • the anchor point keys can be pre-configured in the electronic device, that is, which virtual keys are anchor point keys can be fixed in advance; they can also be customized by the user, that is, the user can define which virtual keys are anchor point keys through the "Settings" interface in the electronic device.
  • the anchor keys in the different types of virtual keys may also be different.
  • the anchor point keys may be the key "F" and the key "J", or the anchor point keys may further include the space bar; as another example, the anchor point keys may also include commonly used function keys such as the ESC key, the Backspace key, the Enter key, the Ctrl key, and number keys; as another example, if the virtual keyboard adopts the "DVORAK" layout, the anchor point keys may include the eight home-position keys "AOEUHTNS"; as another example, if the virtual keyboard adopts the "AZERTY" layout, the anchor point keys may include the eight keys "QSDFJKLM"; as another example, the anchor point keys may also include six of the "AZERTY" keys, etc.; the anchor point keys are not exhaustively enumerated here.
  • the electronic device performs a first feedback operation.
  • when the contact operation corresponding to the first contact point is a pressing operation and the first virtual key is an anchor point key, the electronic device performs a first feedback operation, and the first feedback operation is used to prompt that the first virtual key is an anchor point key.
  • the first feedback operation may be in the form of vibration feedback.
  • step 906 may include: the electronic device obtains a first vibration feedback element from a plurality of vibration feedback elements, where the first vibration feedback element is disposed in the touch screen, the first vibration feedback element is a vibration feedback element matched with the first virtual key, and the vibration feedback elements matched with different virtual keys are not exactly the same; a first type of vibration wave is emitted through the first vibration feedback element to perform the first feedback operation.
  • the vibration wave emitted by the vibration feedback element is non-ultrasonic, with a frequency generally less than or equal to 500 Hz.
  • the electronic device can be configured with the first mapping relationship when it leaves the factory.
  • the entire touch screen may be divided into multiple position areas, and the first mapping relationship stored in the electronic device includes a correspondence between each position area of the multiple position areas in the touch screen and at least one vibration feedback element.
  • when the position of the first type of virtual keyboard can be moved, the electronic device can obtain, according to the first position information obtained in step 901 and the first mapping relationship, at least one first vibration feedback element matching the first virtual key (that is, matching the first position information) from the plurality of vibration feedback elements.
  • at least one first vibration feedback element matching the first virtual key can be obtained according to the first position information and the first mapping relationship, which is convenient and quick and is beneficial to improving the efficiency of the matching process of the vibration feedback elements; moreover, the first mapping relationship can indicate the correspondence between the first position information and a first vibration feedback element, which is compatible not only with a virtual keyboard with a fixed position but also with a virtual keyboard with a movable position, ensuring that vibration feedback can be provided in all kinds of scenarios.
  • the electronic device may be pre-configured with multiple mapping relationships corresponding one-to-one to multiple virtual keyboards, and each mapping relationship includes a correspondence between each virtual key of the corresponding virtual keyboard and at least one vibration feedback element.
  • the electronic device first obtains, from the multiple mapping relationships, a first mapping relationship matching the currently displayed virtual keyboard, and the first mapping relationship includes the correspondence between each virtual key in the currently displayed virtual keyboard and at least one first vibration feedback element.
  • the electronic device acquires one or more first vibration feedback elements matching the first virtual key according to the first mapping relationship and the first virtual key determined in step 903 .
  • a first mapping relationship is pre-configured, so that after the first virtual key is acquired, the first mapping relationship can be used to obtain at least one first vibration feedback element matching the first virtual key, which is convenient and quick and beneficial to improving the efficiency of the matching process of the vibration feedback elements; dividing out the step of determining the vibration feedback element also makes it easier to accurately locate the fault location when a fault occurs.
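A key-to-element mapping relationship like the one described can be sketched as a simple lookup table. The key names and element ids below are invented for illustration; they are not taken from the embodiment.

```python
# Hypothetical first mapping relationship for one virtual keyboard:
# virtual key -> ids of the matched vibration feedback elements.
FIRST_MAPPING = {
    "F": [3, 4],
    "J": [7, 8],
    "Space": [10, 11, 12],
}

def first_vibration_elements(virtual_key):
    """Return the element ids matched with the key, or [] if none are mapped."""
    return FIRST_MAPPING.get(virtual_key, [])
```

With one such table per keyboard layout, selecting the mapping for the currently displayed keyboard and then indexing by the first virtual key is a constant-time operation.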
  • the position information of each vibration feedback element is pre-configured in the electronic device, and the electronic device determines, according to the first position information of the first virtual key and the position information of each vibration feedback element in the vibration feedback module, whether there is a vibration feedback element for generating vibration waves under the first virtual key; if there is such a vibration feedback element under the first virtual key, the electronic device obtains at least one vibration feedback element located below the first position information. The at least one vibration feedback element located below the first position information refers to the vibration feedback elements whose position areas intersect with the projection of the first virtual key onto the vibration feedback module.
  • otherwise, the electronic device will take the center point coordinates of the first position information of the first virtual key as the center point and search for the vibration feedback elements existing within a preset area.
  • the preset area can be a circle, a square, a rectangle, etc.
  • the size of the preset area can be determined in combination with the arrangement of the vibration feedback elements, the type of elements used by the vibration feedback elements, and other factors, which are not limited here.
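The preset-area search around a key's center point can be sketched as follows, assuming a circular area (one of the shapes listed above). Element ids, positions, and the radius value are illustrative assumptions.

```python
import math

def elements_in_preset_area(center, element_positions, radius):
    """Return ids of vibration feedback elements whose positions fall inside a
    circular preset area around the key's center point.
    element_positions: {element_id: (x, y)}."""
    cx, cy = center
    return sorted(eid for eid, (x, y) in element_positions.items()
                  if math.hypot(x - cx, y - cy) <= radius)
```

A square or rectangular preset area would just replace the distance test with per-axis bound checks; the radius (or bounds) would be tuned to the element arrangement and element type.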
  • a process of performing the first feedback operation by emitting a vibration wave through the first vibration feedback element: specifically, after the electronic device determines at least one first vibration feedback element that matches the first virtual key, the electronic device emits vibration waves of the first type through the at least one first vibration feedback element.
  • the electronic device may also acquire the location type corresponding to the first contact point according to the first location information of the first contact point.
  • the location type includes that the first contact point is located in the first location area of the anchor point button and the first contact point is located in the second location area of the anchor point button, and the first location area and the second location area are different;
  • the location area is further divided into a first location area (which may also be referred to as the feature area of the anchor point button) and a second location area (which may also be referred to as the edge area of the anchor point button).
  • the division manner of the first location area and the second location area in different virtual keys may be different.
  • FIGS. 11 to 13 are four schematic diagrams of the first location area and the second location area in the feedback method provided by the embodiments of the present application.
  • FIG. 11 includes two sub-schematic diagrams (a) and (b). The area within the dotted-line box in sub-schematic diagram (a) of FIG. 11 represents the first position area of the virtual key K (which may also be called the characteristic position area of the key K), and the area outside the dotted-line box in sub-schematic diagram (a) of FIG. 11 represents the second position area of the virtual key K (which may also be called the edge position area of the key K).
  • the area within the dashed-line box in sub-schematic diagram (b) of FIG. 11 represents the first position area of the virtual key J, and the area outside the dashed-line box in sub-schematic diagram (b) of FIG. 11 represents the second position area of the virtual key J.
  • sub-schematic diagram (b) of FIG. 11 shows the division of the first position area and the second position area in a virtual key corresponding to a key with a small protrusion on a physical keyboard.
  • in FIG. 12, the area within the dashed frame represents the first position area of the virtual key K, and the area outside the dashed frame represents the second position area of the virtual key K. FIG. 12 and sub-schematic diagram (a) of FIG. 11 show two different area division methods; the division method in FIG. 12 simulates a keycap with a concave arc surface on a physical keyboard.
  • in FIG. 13, the area within the inner dashed frame of the virtual key K represents the first position area of the virtual key K, and the area between the two dashed frames of the virtual key K represents the second position area of the virtual key K.
  • in FIG. 13, the second position area of the virtual key K (which may also be called the edge position area of the virtual key K) extends beyond the edge of the virtual key K, covering the key gap around the virtual key K, which can further enhance the tactile distinctiveness of the anchor point key.
  • the division methods of the first position area and the second position area shown in FIG. 11 to FIG. 13 are only for the convenience of understanding the concepts of the first position area and the second position area; in practice the areas may be divided in other ways, which is not limited here.
  • the electronic device may determine the type of vibration wave emitted by the first vibration feedback element according to the position type corresponding to the first contact point. The types of vibration waves emitted by the electronic device through the at least one first vibration feedback element may differ between the case where the first contact point is located in the first position area of the anchor point key and the case where the first contact point is located in the second position area of the anchor point key.
  • if the electronic device emits continuous vibration waves through the vibration feedback element, the difference between different types of vibration waves includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, or vibration waveform.
  • if the electronic device emits intermittent vibration waves through the vibration feedback element, the difference between different types of vibration waves includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, vibration waveform, or the interval between vibrations.
  • vibration waves of different vibration amplitudes can be realized by different trigger voltages; that is, different trigger voltages produce vibration waves of different amplitudes.
  • the vibration frequency of the vibration wave emitted by the vibration feedback element corresponding to the anchor point key can be between 200 Hz and 400 Hz, for example 240 Hz, 260 Hz, 300 Hz, 350 Hz, 380 Hz, or other values, which are not exhaustively listed here.
  • the vibration duration can be 10 milliseconds, 15 milliseconds, 20 milliseconds, 25 milliseconds, 30 milliseconds, and so on.
  • the vibration wave emitted by the vibration feedback element corresponding to the anchor point key can be a single basic waveform or a superposition of multiple different basic waveforms; the aforementioned basic waveforms include but are not limited to square waves, sine waves, sawtooth waves, triangle waves, or other types of basic waveforms.
  • as an example, the vibration wave emitted by a first vibration feedback element may be a sine wave with a vibration frequency of 290 Hz and a duration of 20 milliseconds, generated by a voltage of 350 V (which determines the vibration amplitude); it should be understood that the example here is only for the convenience of understanding this solution and is not intended to limit it.
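The example drive signal (a 290 Hz sine burst lasting 20 ms) can be sampled as follows; the sample rate and normalized amplitude are illustrative assumptions, with `amplitude` standing in for the drive voltage that sets the vibration amplitude.

```python
import math

def sine_burst(freq_hz=290.0, duration_ms=20.0, amplitude=1.0, sample_rate=8000):
    """Sample a single-tone vibration burst (defaults match the 290 Hz / 20 ms
    example); returns a list of normalized drive samples."""
    n = int(sample_rate * duration_ms / 1000.0)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]
```

Superposed waveforms, as mentioned above, would simply be the element-wise sum of several such bursts with different frequencies or shapes.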
  • the numbers of vibration feedback elements corresponding to different virtual keys may be the same or different.
  • as an example, the number of vibration feedback elements corresponding to the virtual key K may be three, and the number of vibration feedback elements corresponding to the virtual key J may be two.
  • the difference between the intensity of the vibration feedback corresponding to the first virtual key and the intensity of the vibration feedback corresponding to the second virtual key is within a preset intensity range; that is, in order to keep the difference in the total vibration feedback intensity corresponding to different virtual keys (that is, the intensity of the vibration feedback that the user can perceive) within a preset intensity range, the electronic device acquires the vibration intensity of the vibration wave corresponding to each first vibration feedback element in the at least one first vibration feedback element.
  • the vibration intensity of the vibration wave of each first vibration feedback element in the at least one first vibration feedback element is related to a first number, and the first number is the number of vibration feedback elements matched with the first virtual key.
  • each first vibration feedback element in the at least one first vibration feedback element emits a vibration wave of the first type.
  • the preset intensity range may be an intensity difference within 2%, within 3%, within 4%, within 5%, or another intensity range, which is not exhaustively listed here.
  • the electronic device may determine the vibration intensity of the vibration wave corresponding to each first vibration feedback element in the at least one first vibration feedback element directly according to the number of first vibration feedback elements matching the first virtual key. More generally, the electronic device can determine the vibration intensity of the vibration wave of each first vibration feedback element according to any one or a combination of the following factors: the number of first vibration feedback elements matched with the first virtual key, the distance between each first vibration feedback element and the center point of the first virtual key, the type of vibration wave, whether the virtual key is an anchor point key, the position type of the first position information, or other factors.
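The count-based factor can be sketched as an even split of a target overall intensity across the matched elements, so that a key backed by three elements and a key backed by two produce a similar perceived total. This is a simplified model: it ignores the distance and wave-type factors listed above, and the function name is an assumption.

```python
def per_element_intensity(target_total, element_ids):
    """Divide a target overall feedback intensity evenly across the matched
    vibration feedback elements (the 'first number' factor only)."""
    n = len(element_ids)
    if n == 0:
        return {}
    return {eid: target_total / n for eid in element_ids}
```

For example, a key matched with two elements drives each at half the target intensity, while a key matched with three drives each at a third, keeping the totals equal.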
  • a second mapping relationship may be pre-stored in the electronic device.
  • the second mapping relationship indicates the correspondence between the first position information and the vibration intensity of each first vibration feedback element; the electronic device can obtain the vibration intensity of each first vibration feedback element according to the first position information obtained in step 901 and the second mapping relationship.
  • alternatively, the second mapping relationship indicates the relationship between the first virtual key and the vibration intensity of each first vibration feedback element; the electronic device then acquires the vibration intensity of each first vibration feedback element according to the first virtual key obtained in step 903 and the second mapping relationship.
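A minimal sketch of such a pre-stored second mapping relationship, here keyed by virtual key; the key names, element identifiers, and intensity values are illustrative assumptions, not values from the patent.

```python
# Hypothetical second mapping relationship: virtual key -> per-element intensity.
SECOND_MAPPING = {
    "F":     {"elem_3": 0.6, "elem_4": 0.6},
    "J":     {"elem_7": 0.6, "elem_8": 0.6},
    "Space": {"elem_10": 0.3, "elem_11": 0.3, "elem_12": 0.3},
}

def intensities_for_key(first_virtual_key):
    """Return the vibration intensity of each matched first vibration
    feedback element for the given first virtual key."""
    return SECOND_MAPPING.get(first_virtual_key, {})
```

The position-information variant described just above would use the same table shape, keyed by a position region instead of a key name.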
  • the probe of a vibration measuring instrument can be attached to the surface of a virtual key (that is, a detection point) on the touch screen to collect the vibration wave at that detection point; the waveform curve of the collected vibration wave then indicates the intensity of the vibration feedback corresponding to the detection point.
  • the difference between the intensity of the vibration feedback corresponding to the first virtual key and that corresponding to the second virtual key can be obtained by comparing the waveform curve measured at the detection point of the first virtual key with the waveform curve measured at the detection point of the second virtual key.
  • the strength of each vibration feedback element is determined according to the number of matched vibration feedback elements, so that the difference in vibration feedback strength between virtual keys stays within the preset range; because the force feedback given by different keys on a physical keyboard is basically the same, this reduces the difference between the virtual keyboard and the physical keyboard and increases user stickiness.
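The preset-range check described above can be sketched as a relative comparison; the 5% tolerance is one of the example values quoted earlier, and the function name is an assumption.

```python
def within_preset_intensity_range(intensity_a, intensity_b, tolerance=0.05):
    """True when the relative difference between the perceived vibration
    feedback intensities of two virtual keys is within the preset range
    (e.g. an intensity difference within 5%)."""
    reference = max(intensity_a, intensity_b)
    if reference == 0:
        return True
    return abs(intensity_a - intensity_b) <= tolerance * reference
```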
  • the first feedback operation may be in the form of sound feedback
  • step 907 may include: the electronic device emits a first prompt sound, which may be a "di di", "beep", or other sound; the specific forms of the first prompt sound are not exhaustively listed here.
  • the electronic device can also acquire the position type corresponding to the first contact point according to the first position information of the first contact point; when the first contact point is located in the first position area of the anchor point key and when it is located in the second position area of the anchor point key, the electronic device emits different prompt sounds.
  • the electronic device emits a prompt sound of "Di Di"
  • the electronic device emits a "beep beep" sound.
  • the electronic device can also adopt other types of feedback methods other than sound feedback and vibration feedback.
  • the specific type of feedback method to be adopted can be determined in combination with the actual product form and the actual application scenario of the product, which is not exhaustive here.
  • the electronic device performs a second feedback operation.
  • the electronic device may perform a second feedback operation, where the second feedback operation is used to prompt that the first virtual key is a non-anchor point key, and the first feedback operation and the second feedback operation are different feedback operations.
  • the feedback operation is performed not only when the first virtual key is an anchor point key, but also when the first virtual key is a non-anchor point key; the first feedback operation and the second feedback operation are different feedback operations.
  • since each key gives feedback to the user on a physical keyboard, the above-mentioned method increases the similarity between the virtual keyboard and the physical keyboard.
  • giving different feedback operations for anchor point keys also helps the user remember different types of keys, so as to assist the user in touch typing on the virtual keyboard.
  • the second feedback operation may be in the form of vibration feedback
  • step 907 may include: the electronic device obtains a first vibration feedback element matching the first virtual key, where the first vibration feedback element is disposed in the touch screen; a second type of vibration wave is emitted through the first vibration feedback element to perform the second feedback operation.
  • the difference between the vibration wave of the first type and the vibration wave of the second type includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, and vibration waveform.
  • specific ways of distinguishing different types of vibration waves are provided: different types of vibration waves can be distinguished by vibration amplitude, vibration frequency, vibration duration and/or vibration waveform, which improves the flexibility of implementing this solution.
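The four distinguishing characteristics can be captured in a small value type, a sketch in which all concrete values are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VibrationWave:
    amplitude: float     # normalized vibration amplitude
    frequency_hz: float  # vibration frequency
    duration_ms: float   # vibration duration
    waveform: str        # vibration waveform, e.g. "sine" or "square"

# Two waves differing in any one characteristic count as different types;
# here the first and second types differ only in frequency.
FIRST_TYPE_WAVE = VibrationWave(0.8, 170.0, 20.0, "sine")
SECOND_TYPE_WAVE = VibrationWave(0.8, 240.0, 20.0, "sine")
```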
  • a process for performing the second feedback operation by emitting a vibration wave through the first vibration feedback element: specifically, after the electronic device determines at least one first vibration feedback element matching the virtual key, the electronic device emits vibration waves of the second type through the at least one first vibration feedback element.
  • the electronic device can also obtain the position type corresponding to the first contact point according to the first position information of the first contact point; the position type includes the first contact point being located in the first position area of the non-anchor point key and the first contact point being located in the second position area of the non-anchor point key, where the first position area and the second position area are different. That is, the entire position area of a non-anchor point key is further divided into a first position area (which can also be called the characteristic area of the non-anchor point key) and a second position area (which can also be called the edge area of the non-anchor point key); the way the first and second position areas are divided may differ between different virtual keys.
  • the electronic device may determine the type of vibration wave emitted by the first vibration feedback element according to the position type corresponding to the first contact point; when the first contact point is located in the first position area of the non-anchor point key and when it is located in the second position area of the non-anchor point key, the types of vibration waves emitted by the electronic device through the at least one first vibration feedback element can differ.
  • in one case, the type of vibration wave corresponding to the first position area of the anchor point key is the same as that corresponding to the first position area of the non-anchor point key, while the type of vibration wave corresponding to the second position area of the anchor point key is different from that corresponding to the second position area of the non-anchor point key.
  • in another case, the type of vibration wave corresponding to the first position area of the anchor point key is different from that corresponding to the first position area of the non-anchor point key, while the type of vibration wave corresponding to the second position area of the anchor point key is the same as that corresponding to the second position area of the non-anchor point key.
  • in a further case, the type of vibration wave corresponding to the first position area of the anchor point key is different from that corresponding to the first position area of the non-anchor point key, and the type of vibration wave corresponding to the second position area of the anchor point key is also different from that corresponding to the second position area of the non-anchor point key.
  • the entire location area of the anchor point button and/or the non-anchor point button is divided into a first location area and a second location area.
  • the types of vibration waves emitted by the electronic device through the at least one first vibration feedback element differ, which helps the user memorize the boundaries of the virtual keys, that is, helps the user build muscle memory for different areas of the virtual keys, further reducing the difficulty of touch typing on a touch screen.
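One of the combinations described above can be sketched as a dispatch table from (key kind, position area) to a wave-type label; the labels and the chosen combination (shared wave type in the characteristic area, different wave types in the edge area) are assumptions for illustration.

```python
# Hypothetical dispatch realizing the first combination: anchor and
# non-anchor keys share a wave type in the first (characteristic) area
# but use different wave types in the second (edge) area.
WAVE_TYPE_BY_AREA = {
    ("anchor", "first_area"):      "wave_type_1",
    ("non_anchor", "first_area"):  "wave_type_1",
    ("anchor", "second_area"):     "wave_type_2",
    ("non_anchor", "second_area"): "wave_type_3",
}

def wave_type_for(key_kind, position_area):
    return WAVE_TYPE_BY_AREA[(key_kind, position_area)]
```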
  • the second feedback operation may be in the form of sound feedback
  • step 907 may include: the electronic device emits a second prompt tone, and the second prompt tone and the first prompt tone are different prompt tones.
  • the electronic device can also obtain the position type corresponding to the first contact point according to the first position information of the first contact point; when the first contact point is located in the first position area of the non-anchor point key and when it is located in the second position area of the non-anchor point key, the electronic device emits different prompt sounds.
  • step 907 is an optional step, and step 907 may not be performed, that is, when the electronic device determines that the first virtual button is not an anchor point button, no feedback may be performed.
  • the electronic device determines whether the first virtual key is an anchor point key. If the first virtual key is an anchor point key, go to step 909, and if the first virtual key is not an anchor point key, go to step 910.
  • for a specific implementation manner of step 908, reference may be made to the above description of step 905, which is not repeated here.
  • the electronic device changes the haptic characteristics of the first contact point on the touch screen to present a first haptic state.
  • when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is an anchor point key, the electronic device changes the haptic characteristics of the first contact point in the cover of the touch screen so that it presents the first haptic state.
  • the tactile properties of the cover plate of the touch screen include any one or more of the following properties: sliding friction coefficient, stick-slip, temperature, and other types of tactile properties.
  • the electronic device can change the entire cover of the touch screen to the first tactile state, thereby changing the first contact point in the cover to the first tactile state; or it can change only the first contact point in the cover to the first tactile state without changing the tactile state of other areas of the cover.
  • the electronic device changes the haptic characteristics of the first contact point in the cover of the touch screen by emitting ultrasonic waves from the ultrasonic module in the touch screen. The electronic device can emit different types of ultrasonic waves through the ultrasonic module, so that the first contact point in the cover of the touch screen exhibits different tactile characteristics. If the electronic device emits a single ultrasonic wave through the ultrasonic module, the difference between different types of ultrasonic waves includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, or vibration waveform.
  • the frequency of the ultrasonic waves emitted by the ultrasonic module is greater than 20 kHz, specifically 21 kHz, 22 kHz, 24 kHz, 25 kHz, or other values, which are not limited here.
  • if the electronic device emits ultrasonic waves in the form of pulse waves, the difference between different types of ultrasonic waves includes any one or more of the following characteristics: vibration amplitude, vibration frequency, vibration duration, vibration waveform, or the frequency of the pulse waves emitted by the electronic device.
  • the frequency at which the electronic device emits a pulse wave can also be called the rhythm of the pulse wave emitted by the electronic device.
  • the electronic device sends out ultrasonic waves in the form of pulses every 3 milliseconds through the ultrasonic module
  • the electronic equipment sends out ultrasonic waves in the form of pulses every 10 milliseconds through the ultrasonic module.
  • the frequencies of the pulse waves sent out by the electronic device are then different; it should be understood that the examples here are only for convenience of understanding the solution and are not used to limit it.
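The pulse rhythm example (a pulse every 3 ms versus every 10 ms) can be sketched as a simple emission schedule; the function name and window parameter are illustrative assumptions.

```python
def pulse_emission_times(interval_ms, window_ms):
    """Times (in ms) at which the ultrasonic module emits a pulse within a
    window; a smaller interval gives a faster pulse rhythm."""
    return list(range(0, window_ms, interval_ms))
```

Within the same 30 ms window, a 3 ms interval produces ten pulses while a 10 ms interval produces three, which is the rhythm difference the user is meant to feel.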
  • step 909 may include: the electronic device obtains the third type of ultrasonic wave corresponding to the anchor point button, and sends out the third type of ultrasonic wave through the ultrasonic wave module in the touch screen, so as to change the tactile characteristics of the first contact point on the touch screen. Change to the first haptic state.
  • the electronic device can also obtain the position type corresponding to the first contact point according to the first position information of the first contact point; the electronic device may then determine the type of ultrasonic wave corresponding to the first position information according to that position type, and emit the aforementioned type of ultrasonic wave through the ultrasonic module in the touch screen.
  • the type of ultrasonic waves emitted by the electronic device through the ultrasonic module can be different.
  • if an electrostatic module is integrated into the touch screen of the electronic device, the electronic device changes the haptic characteristics of the first contact point in the cover of the touch screen by having the electrostatic module emit static electricity. The electronic device can emit static electricity of different magnitudes through the electrostatic module, so that the first contact point in the cover of the touch screen exhibits different tactile characteristics.
  • the voltage of the static electricity emitted by the electrostatic module can range from 100 volts to 400 volts; as an example, the voltage may be 120 volts, 200 volts, 380 volts, or another value, which is not limited here.
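The quoted 100 to 400 volt drive range can be enforced with a simple clamp; the function name and the idea of clamping (rather than rejecting) out-of-range requests are assumptions for illustration.

```python
def clamp_electrostatic_voltage(requested_volts, low=100, high=400):
    """Keep the electrostatic module's drive voltage inside the
    100 V to 400 V range quoted in the text."""
    return max(low, min(requested_volts, high))
```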
  • step 909 may include: the electronic device obtains the first voltage value of static electricity corresponding to the anchor point key, and emits static electricity of the first voltage value through the electrostatic module in the touch screen, so as to change the haptic characteristics of the first contact point on the touch screen to the first haptic state.
  • the electronic device can also obtain the position type corresponding to the first contact point according to the first position information of the first contact point; according to that position type, the voltage value corresponding to the first position information is determined, and static electricity of the aforementioned voltage value is then emitted through the electrostatic module in the touch screen. The voltage values emitted by the electronic device through the electrostatic module can differ.
  • the electronic device can also change the tactile characteristics of the cover plate in the touch screen in other ways, which are not listed here one by one.
  • the electronic device changes the haptic characteristics of the first contact point on the touch screen to present a second haptic state.
  • when the contact operation corresponding to the first contact point is a touch operation and the first virtual key is a non-anchor point key, the electronic device changes the haptic characteristics of the first contact point in the cover of the touch screen so that it presents the second haptic state.
  • the haptic characteristics when the touch screen presents the first haptic state may be different from those when it presents the second haptic state; that is, the feeling when the user touches an anchor point key can differ from the feeling when the user touches a non-anchor point key, which further assists the user in distinguishing anchor point keys from non-anchor point keys on the virtual keyboard and in locating the virtual keys.
  • step 910 may include: the electronic device acquires the fourth type of ultrasonic wave corresponding to the non-anchor point key, and emits the fourth type of ultrasonic wave through the ultrasonic module in the touch screen, so as to change the haptic characteristics of the first contact point on the touch screen to the second haptic state.
  • the electronic device may also acquire the position type corresponding to the first contact point and, according to it, determine the type of ultrasonic wave corresponding to the first position information; the types of ultrasonic waves emitted by the electronic device through the ultrasonic module can differ.
  • step 910 may include: the electronic device obtains the second voltage value of static electricity corresponding to the non-anchor point key, and emits static electricity of the second voltage value through the electrostatic module in the touch screen, so as to change the haptic characteristics of the first contact point on the touch screen to the second haptic state.
  • the electronic device may also acquire the position type corresponding to the first contact point; according to that position type, the voltage value corresponding to the first position information is determined, and the voltages emitted by the electronic device through the electrostatic module can differ.
  • step 908 is an optional step. If step 908 is not executed, steps 909 and 910 can be combined; that is, when the contact operation corresponding to the first contact point is a touch operation, regardless of whether the first virtual key is an anchor point key or a non-anchor point key, the haptic characteristics of the first contact point on the touch screen can present the same haptic state.
  • steps 908 to 910 are all optional steps; after the electronic device determines that the contact operation corresponding to the first contact point is not a pressing operation, it may directly provide no feedback, that is, when the pressure value corresponding to the first contact point is less than the first pressure value threshold, the electronic device may not provide any feedback.
  • the first feedback operation is performed through the touch screen to remind the user that an anchor point key is currently being contacted, so that the user can perceive the position of the anchor point key, which helps reduce the difficulty of touch typing on the touch screen. In addition, the touch screen is equipped with multiple vibration feedback elements; when the first virtual key is determined to be the anchor point key, at least one first vibration feedback element matching the first virtual key is obtained, and the at least one first vibration feedback element is instructed to emit vibration waves, so that vibration feedback is generated only around the first virtual key rather than across the full screen. Since all fingers rest on the touch screen when typing, full-screen vibration would be felt by every finger and would easily confuse the user; generating vibration feedback only around the first virtual key avoids this confusion and makes it easier for the user to form muscle memory in the fingers, thereby assisting the user in touch typing on the touch screen.
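The overall decision flow of steps 905 through 910 can be summarized in a short dispatch sketch; the function name, return labels, and the use of a single pressure threshold are assumptions drawn from the surrounding description.

```python
def select_feedback(pressure, first_pressure_threshold, is_anchor_key):
    """Dispatch sketch: a press triggers the first/second feedback operation,
    while a lighter touch changes the haptic state of the contact point."""
    if pressure >= first_pressure_threshold:   # pressing operation
        return "first_feedback_operation" if is_anchor_key else "second_feedback_operation"
    # touch operation: change the cover's haptic characteristics instead
    return "first_haptic_state" if is_anchor_key else "second_haptic_state"
```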
  • FIG. 14 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 1 includes a touch screen 20, a memory 40, one or more processors 10, and one or more programs 401; the touch screen 20 is configured with a plurality of vibration feedback elements, and the one or more programs 401 are stored in the memory 40.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device performs the following steps: detecting a first contact operation acting on the touch screen 20; in response to the first contact operation, acquiring the first position information of the first contact point corresponding to the first contact operation, the first position information corresponding to the first virtual key on the virtual keyboard; and, in the case where the first virtual key is an anchor point key, obtaining at least one first vibration feedback element matching the first virtual key from the plurality of vibration feedback elements, and emitting a vibration wave through the at least one first vibration feedback element to prompt that the first virtual key is the anchor point key.
  • the electronic device 1 is configured with a first mapping relationship, the first mapping relationship indicating the correspondence between virtual keys and vibration feedback elements; when the one or more processors 10 execute the one or more programs 401, the electronic device 1 specifically performs the following step: obtaining the first vibration feedback element according to the first mapping relationship and the first virtual key.
  • alternatively, the electronic device 1 is configured with a first mapping relationship indicating the correspondence between position information and vibration feedback elements; when the one or more processors 10 execute the one or more programs 401, the electronic device 1 specifically performs the following step: acquiring the first vibration feedback element according to the first mapping relationship and the first position information.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device 1 further performs the following step: acquiring the vibration intensity of the vibration wave corresponding to each first vibration feedback element in the at least one first vibration feedback element, where the vibration intensity of each first vibration feedback element is related to a first number, the first number being the number of first vibration feedback elements.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device 1 specifically performs the following step: emitting a vibration wave through the at least one first vibration feedback element according to the vibration intensity of the vibration wave corresponding to each first vibration feedback element, so that the difference between the intensity of the vibration feedback corresponding to the first virtual key and that corresponding to the second virtual key is within a preset intensity range, the second virtual key being a virtual key different from the first virtual key.
  • the first vibration feedback element is any one of the following: a piezoelectric ceramic sheet, a linear motor or a piezoelectric film.
  • the electronic device 1 when the one or more processors 10 execute the one or more programs 401, the electronic device 1 further executes the following steps: acquiring the location type corresponding to the first contact point according to the first location information , the location type includes that the first contact point is located in the first location area of the first virtual key and the first contact point is located in the second location area of the first virtual key, and the first location area and the second location area are different.
  • the electronic device 1 specifically performs the following step: performing the first feedback operation through the touch screen 20 according to the position type corresponding to the first contact point, where the feedback operation corresponding to the first location area is different from the feedback operation corresponding to the second location area.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device 1 further performs the following steps: in response to the detected first gesture operation, selecting the first type of virtual keyboard corresponding to the first gesture operation from multiple types of virtual keyboards, where the virtual keys included in different types of virtual keyboards are not exactly the same; and displaying the first type of virtual keyboard through the touch screen 20, where, during the presentation of the first type of virtual keyboard, the position of the first type of virtual keyboard on the touch screen 20 is fixed.
  • the electronic device 1 specifically performs the following step: during the presentation of the first type of virtual keyboard, detecting the first contact operation acting on the touch screen 20.
  • an embodiment of the present application also provides an electronic device; please refer to FIG. 15, which is a schematic structural diagram of the electronic device provided by an embodiment of the present application. The electronic device 1 can be embodied as a mobile phone, a tablet, a notebook computer, or another device configured with a touch screen, which is not limited here.
  • the electronic device described in the embodiments corresponding to FIG. 1 to FIG. 8 may be deployed on the electronic device 1 to implement the functions of the electronic device in the embodiments corresponding to FIG. 9 to FIG. 13 .
  • the electronic device 1 may vary greatly due to different configurations or performance, and may include one or more central processing units (CPU) 1522 (for example, one or more processors), the memory 40, and one or more storage media 1530 (e.g., one or more mass storage devices) that store applications 1542 or data 1544.
  • the memory 40 and the storage medium 1530 may be short-term storage or persistent storage.
  • the program stored in the storage medium 1530 may include one or more modules (not shown in the figure), and each module may include a series of instructions to operate on the electronic device.
  • the central processing unit 1522 may be configured to communicate with the storage medium 1530 to execute a series of instruction operations in the storage medium 1530 on the electronic device 1 .
  • the electronic device 1 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input/output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
  • the central processing unit 1522 is used to implement the functions of the electronic device in the embodiments corresponding to FIG. 9 to FIG. 13. It should be noted that for the specific implementation manner in which the central processing unit 1522 performs these functions and the beneficial effects brought about, reference may be made to the method embodiments corresponding to FIG. 9 to FIG. 13, which are not repeated here.
  • embodiments of the present application also provide a computer-readable storage medium in which a program is stored; when the program runs on a computer, the computer executes the steps performed by the electronic device in the methods described in the embodiments shown in FIG. 9 to FIG. 13.
  • Embodiments of the present application also provide a computer program, which, when run on a computer, causes the computer to perform the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIG. 9 to FIG. 13 .
  • An embodiment of the present application further provides a circuit system, the circuit system includes a processing circuit, and the processing circuit is configured to execute the steps performed by the electronic device in the method described in the embodiments shown in the foregoing FIG. 9 to FIG. 13 .
  • the electronic device provided by the embodiment of the present application may be a chip, and the chip includes: a processing unit and a communication unit.
  • the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit.
  • the processing unit can execute the computer-executable instructions stored in the storage unit, so that the chip executes the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIG. 9 to FIG. 13 .
  • the storage unit may be a storage unit in the chip, such as a register or a cache; the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM), etc.
  • the processor mentioned in any one of the above may be a general-purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits for controlling the execution of the program of the method in the first aspect.
  • the device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, which can be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the connection relationship between the modules indicates that there is a communication connection between them, which may be specifically implemented as one or more communication buses or signal lines.
  • the storage medium includes various media that can store program code, such as a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • a computer device which may be a personal computer, server, or network device, etc.
  • the computer program includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, microwave).
  • the computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media.
  • the usable media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media (e.g., DVDs), or semiconductor media (e.g., a solid-state drive (SSD)), and the like.
  • Embodiment 2:
  • the embodiments of the present application can be applied to various application scenarios that require input through a virtual keyboard.
  • As an example, in a text-editing application, text, numbers, characters, and the like need to be input through the virtual keyboard; as another example, in a game application, operations such as moving a virtual character, modifying the character name, and instant messaging with game friends also need to be performed through the virtual keyboard.
  • the embodiments of the present application may also be applied to other application scenarios in which input is performed through a virtual keyboard, which are not exhaustively listed here. In the aforementioned scenarios, the number of keys on the virtual keyboard is limited, and an additional physical keyboard needs to be provided to meet the user's input requirements.
  • an embodiment of the present application provides a method for processing a virtual keyboard.
  • the method is applied to the electronic device shown in FIG. 16 .
  • the electronic device is configured with various types of virtual keyboards.
  • the gesture operation is used to evoke different types of virtual keyboards; that is, the virtual keyboard is no longer limited to displaying 26 letters, but provides users with more virtual keys through different types of virtual keyboards, which not only improves the user's flexibility in the process of evoking the virtual keyboard, but is also beneficial to providing richer virtual keys, so that an additional physical keyboard is no longer necessary.
  • FIG. 16 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 1 includes at least one display screen, and the display screen is a touch screen (that is, the touch screen 20 in FIG. 2 ); the electronic device 1 can acquire various types of gesture operations input by the user through the display screen, and display various types of virtual keyboards through the display screen.
  • the electronic device 2 can be embodied as a virtual reality (VR), augmented reality (AR), or mixed reality (MR) device.
  • the electronic device 2 collects various types of user gesture operations through the camera configured on the head-mounted display device, and presents various types of virtual keyboards to the user through the head-mounted display device.
  • FIG. 17 is a schematic flowchart of a method for processing a virtual keyboard provided by an embodiment of the present application.
  • the method for processing a virtual keyboard provided by an embodiment of the present application may include:
  • the electronic device detects a first gesture operation, and acquires a first gesture parameter corresponding to the first gesture operation.
  • the electronic device may detect in real time whether the user inputs a gesture operation, and when the electronic device detects the first gesture operation input by the user, generates a first gesture parameter corresponding to the first gesture operation.
  • the electronic device is configured with a touch screen, and the electronic device acquires the first gesture operation input by the user in real time through the touch screen.
  • the electronic device may collect the first gesture operation input by the user through the camera configured on the head display device, and then generate the first gesture parameter corresponding to the first gesture operation.
  • the electronic device may be embodied as a VR, AR, or MR device, which is not limited here.
  • the first gesture parameters include any one or more of the following: position information of the contact points corresponding to the first gesture operation, quantity information of the contact points corresponding to the first gesture operation, area information of the contact points corresponding to the first gesture operation, or other types of parameter information. The foregoing introduces what information is included in the first gesture parameter in the embodiment of this application.
  • the first gesture parameter includes not only the position information of each contact point and the quantity information of the multiple contact points, but also the area information of each contact point; the area information of the contact points makes it possible to distinguish, among the multiple contact points, the contact point triggered by the palm, which is beneficial to accurately estimating the type of the first gesture operation, avoiding display of the wrong virtual keyboard, and improving the accuracy of the virtual keyboard display process.
  • the position information of the contact point corresponding to the first gesture operation can be represented by coordinate information, a function, or other information; the origin of the coordinate system corresponding to the aforementioned position information can be the center point of the display screen, the upper-left vertex of the display screen, the lower-left vertex of the display screen, the upper-right vertex of the display screen, the lower-right vertex of the display screen, or another position point.
  • the setting of the specific coordinate system origin can be determined according to the actual application scene.
  • the display screen of the electronic device may be a touch screen, and the touch screen may be configured with a touch sensing module, and the electronic device collects the first gesture parameters corresponding to the first gesture operation through the touch sensing module configured in the display screen.
  • FIG. 18 is a schematic diagram of a first gesture parameter in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • in FIG. 18, the first gesture operation is a one-handed operation as an example; as shown in the figure, four contact points can be obtained on the display screen: the area of the three contact points generated by the fingers is relatively small, while the area of the remaining contact point generated by the palm is relatively large. It should be understood that the example in FIG. 18 is only for the convenience of understanding the present solution, and is not intended to limit the present solution.
  • the virtual keyboard may be visually presented in a three-dimensional space.
  • the electronic device may perform real-time detection on the gesture operation in the space, so as to obtain the first gesture parameter corresponding to the first gesture operation when the first gesture operation is detected.
  • the electronic device may track the user's hand in real time through the user's handheld device or a hand-worn device, so as to monitor the user's first gesture operation.
  • the electronic device includes a head-mounted display device, the first gesture operation is acquired by a camera configured in the head-mounted display device, and the first gesture parameter may be embodied as an image corresponding to the first gesture operation; the electronic device may input the image corresponding to the first gesture operation into a neural network for image recognition to generate the first gesture parameter corresponding to the first gesture operation.
  • the electronic device generates first indication information according to the first gesture parameter.
  • the electronic device may further perform secondary processing on the obtained first gesture parameter to generate the first indication information corresponding to the first gesture parameter; the first indication information can also be regarded as a gesture parameter obtained through secondary processing.
  • the first indication information includes any one or more of the following (that is, the first gesture parameter indicates any one or more of the following): relative angle information of the hand corresponding to the first gesture operation, position information of the hand corresponding to the first gesture operation, quantity information of the hands corresponding to the first gesture operation, and shape information of the hand corresponding to the first gesture operation. The specific types of information included in the first indication information can be flexibly set in combination with the actual application scenario, and are not limited here.
  • through secondary processing of the gesture parameters, information such as the relative angle of the hand, the position of the hand, the number of hands, or the shape of the hand can be obtained; richer information about the first gesture operation is thus available, which increases the flexibility of the virtual keyboard matching process.
  • the relative angle information of the hand corresponding to the first gesture operation may include any one or more of the following: the relative angle between the hand corresponding to the first gesture operation and any side of the display screen , the relative angle between the hand corresponding to the first gesture operation and the center line of the display screen, the relative angle between the hand corresponding to the first gesture operation and the diagonal line of the display screen, etc., which are not limited here.
  • if the electronic device determines that the first gesture operation is a one-handed operation (that is, the number of hands corresponding to the first gesture operation is one), the electronic device obtains at least two first contact points (that is, contact points generated by the fingers) from the multiple contact points corresponding to the first gesture operation, connects the at least two first contact points, or connects the two most distant first contact points among the at least two first contact points, to generate a straight line corresponding to the first gesture operation, and then calculates the relative angle between the aforementioned straight line and a preset line; the preset line includes any one or more of the following: any side of the display screen, the center line of the display screen, the diagonal of the display screen, etc., so as to obtain the relative angle information of the hand corresponding to the first gesture operation.
  • if the electronic device determines that the first gesture operation is a two-handed operation (that is, the number of hands corresponding to the first gesture operation is two), the electronic device obtains at least two first contact points from the plurality of contact points corresponding to the first gesture operation, and connects the at least two first contact points corresponding to the left hand, or connects the two most distant first contact points among the at least two first contact points corresponding to the left hand, to generate a first straight line corresponding to the left hand; it connects the at least two first contact points corresponding to the right hand, or connects the two most distant first contact points among the at least two first contact points corresponding to the right hand, to generate a second straight line corresponding to the right hand; it then calculates the first sub-angle between the first straight line and the preset line and the second sub-angle between the second straight line and the preset line, so as to obtain the relative angle information of the hands corresponding to the first gesture operation.
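  • The farthest-pair rule described above can be sketched as follows (a minimal illustration, not code from this application; coordinates are assumed to lie in screen space with the x-axis along the bottom edge of the display, and the function name is illustrative):

```python
import math
from itertools import combinations

def hand_angle_deg(contact_points):
    """Relative angle (degrees) between the straight line through the two
    most distant contact points and the bottom edge of the display."""
    # Pick the pair of contact points with the greatest separation.
    (x1, y1), (x2, y2) = max(
        combinations(contact_points, 2),
        key=lambda pair: math.dist(pair[0], pair[1]),
    )
    # atan2 gives the signed angle of the connecting line vs. the x-axis;
    # fold it into [0, 180) since the connecting line is undirected.
    return math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0
```

For a two-handed operation, the same function would simply be applied twice, once to the left-hand contact points and once to the right-hand contact points, yielding the first sub-angle and the second sub-angle.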
  • FIG. 19 is a schematic diagram of relative angle information in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • the first gesture operation is a two-hand operation as an example
  • FIG. 19 includes two sub-schematic diagrams (a) and (b), and the sub-schematic diagram of FIG. 19 (a) shows the position of the contact point corresponding to the two-hand operation
  • in sub-schematic diagram (b) of FIG. 19, taking the preset line as the bottom edge of the display screen as an example, the two most distant contact points among the four contact points corresponding to the left hand are connected to generate the first straight line.
  • determination process for hand position information: the electronic device first determines the number of hands corresponding to the first gesture operation according to the acquired first gesture parameters; if the first gesture operation is a two-handed operation, the position of the hands corresponding to the first gesture operation includes the distance between the two hands; if the first gesture operation is a one-handed operation, the position of the hand corresponding to the first gesture operation is either the first area or the fourth area.
  • the first area is located at the lower left or lower right of the display screen, and the fourth area is the area of the display screen other than the first area; further, the width of the first area may be a value such as 3 cm, 4 cm, or 5 cm, and the bottom edge of the first area coincides with the bottom edge of the display screen.
  • FIG. 20 shows two schematic diagrams of the first area in the method for processing a virtual keyboard provided by an embodiment of the present application; sub-schematic diagrams (a) and (b) of FIG. 20 respectively show two examples of the first area. It should be understood that the examples in FIG. 20 are only to facilitate understanding of the solution, and are not used to limit the solution.
  • the electronic device may determine the distance between the index finger of the left hand and the index finger of the right hand as the distance between the two hands; it may also determine the shortest distance between the left hand and the right hand as the distance between the two hands; or it may generate the shapes of the left hand and the right hand according to the multiple contact points and then compute the distance between the left-hand boundary and the right-hand boundary. The methods for determining the distance between the two hands are not exhaustively listed here.
  • the electronic device selects a plurality of first contact points from the plurality of contact points corresponding to the first gesture operation according to the first gesture parameters corresponding to the first gesture operation, and determines the position of the hand corresponding to the first gesture operation according to the positions of the plurality of first contact points.
  • in one implementation, if all of the at least one first contact point are located in the first area, the position of the hand is determined to be the first area; if any first contact point among the at least one first contact point exists outside the first area, the position of the hand is determined to be the fourth area. In another implementation, if at least one first contact point is located in the first area, the position of the hand is determined to be the first area; if all the first contact points among the at least one first contact point are located in the fourth area, the position of the hand is determined to be the fourth area.
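  • Under the stricter of the two implementations above, the region check can be sketched as follows (an illustrative sketch only; the region width, pixel density, and function names are assumptions, with y measured in pixels upward from the bottom edge of the screen):

```python
def hand_region(finger_points, screen_width_px, px_per_cm,
                region_cm=4.0, side="left"):
    """Classify a one-handed gesture as lying in the first area (a square
    of side region_cm whose bottom edge coincides with the bottom edge of
    the screen, at the lower left or lower right) or in the fourth area."""
    size = region_cm * px_per_cm

    def in_first_area(point):
        x, y = point
        if y > size:          # too far above the bottom edge of the screen
            return False
        return x <= size if side == "left" else x >= screen_width_px - size

    # Every finger contact must fall inside the first area.
    return "first" if all(in_first_area(p) for p in finger_points) else "fourth"
```

For example, with a 4 cm first area at an assumed 40 pixels per centimetre, contact points near the lower-left corner map to the first area, while any contact point outside that square maps to the fourth area.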
  • determination process for the number of hands: the first gesture operation acquired by the electronic device may be a one-handed operation or a two-handed operation.
  • the electronic device may determine the number of hands corresponding to the first gesture parameter according to the number of contact points and the position information of the contact points. In one implementation, the electronic device determines whether the number of the multiple contact points is greater than or equal to the first value and whether there are two contact points among the multiple contact points whose distance is greater than the second distance threshold; if the number of the multiple contact points is greater than or equal to the first value and there are two contact points whose distance is greater than the second distance threshold, it is determined that the number of hands corresponding to the first gesture operation is two; if the number of the multiple contact points is less than the first value, or there are no two contact points whose distance is greater than the second distance threshold, it is determined that the number of hands corresponding to the first gesture operation is one.
  • the first value can be 2, 3, 4, 5, or another value, and can also be customized by the user; the second distance threshold can be 22 mm, 25 mm, 26 mm, or another value, and can also be customized by the user. The specific value of the second distance threshold can be determined in combination with factors such as the size of the display screen and the size of the user's hand, and is not limited here.
  • in another implementation, the electronic device determines whether a first subset and a second subset exist among the plurality of contact points; if the first subset and the second subset exist, it determines that the number of hands corresponding to the first gesture operation is two, and if no such first subset and second subset exist, it determines that the number of hands corresponding to the first gesture operation is one.
  • the number of contact points included in the first subset and the second subset is greater than or equal to the first value
  • the distance between any two contact points in the first subset is less than the second distance threshold
  • the distance between any two contact points in the second subset is less than the second distance threshold
  • the distance between any contact point in the first subset and any contact point in the second subset is greater than or equal to the second distance threshold.
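  • The subset test defined by the four conditions above can be sketched with a brute-force search over splits of the contact points (an illustrative sketch only, not code from this application; the function name and default thresholds are assumptions, with coordinates in millimetres; a real implementation would cluster the points more efficiently):

```python
import math
from itertools import combinations

def count_hands(finger_points, first_value=3, second_distance_mm=25.0):
    """Return 2 if the contact points can be split into a first and a second
    subset meeting the four conditions (each subset holds at least
    first_value points, intra-subset distances are below the threshold,
    inter-subset distances are at or above it); otherwise return 1."""
    n = len(finger_points)
    for mask in range(1, 2 ** n - 1):          # every non-trivial split
        a = [p for i, p in enumerate(finger_points) if mask & (1 << i)]
        b = [p for i, p in enumerate(finger_points) if not mask & (1 << i)]
        if len(a) < first_value or len(b) < first_value:
            continue
        intra_ok = all(math.dist(p, q) < second_distance_mm
                       for group in (a, b)
                       for p, q in combinations(group, 2))
        inter_ok = all(math.dist(p, q) >= second_distance_mm
                       for p in a for q in b)
        if intra_ok and inter_ok:
            return 2
    return 1
```

With fewer than 2 × first_value contact points, no valid split exists and the function falls through to a one-handed result, matching the behaviour described above.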
  • FIG. 21 is a schematic diagram of a first gesture operation in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • in FIG. 21, the first value is 3 as an example.
  • FIG. 21 includes two sub-schematic diagrams (a) and (b). Sub-schematic diagram (a) of FIG. 21 shows a case in which the number of hands corresponding to the first gesture operation is one: the electronic device obtains the three contact points shown, and the distances between the aforementioned three contact points are all less than 25 mm. Sub-schematic diagram (b) of FIG. 21 shows a case in which the number of hands is two: the electronic device obtains eight contact points (A1, A2, A3, A4, A5, A6, A7, and A8 in FIG. 21), where the contact points A7 and A8 are generated by the palms and A1 to A6 are generated by the fingers; the distances between the three contact points A1, A2, and A3 are all less than 25 mm, the distances between the three contact points A4, A5, and A6 are all less than 25 mm, and the distances between the first subset (A1, A2, and A3) and the second subset (A4, A5, and A6) are all greater than 25 mm.
  • the electronic device may first divide the multiple contact points corresponding to the first gesture operation into first contact points and second contact points according to the first gesture parameters corresponding to the first gesture operation, where a first contact point is generated by the user's finger and a second contact point is generated by the user's palm; it then judges whether the number of first contact points among the plurality of contact points is greater than or equal to the first value, and whether there are two contact points among the at least one first contact point whose distance is greater than the second distance threshold, so as to determine the number of hands corresponding to the first gesture operation.
  • the electronic device can determine whether the area of each contact point is greater than or equal to the first area threshold; if it is greater than or equal to the first area threshold, the contact point is determined to be a second contact point (that is, a contact point generated by the palm), and if it is smaller than the first area threshold, the contact point is determined to be a first contact point (that is, a contact point generated by a finger). The value of the first area threshold may be preset or customized by the user, and may be determined in combination with factors such as the size of the user's hand, which is not limited here. It should be noted that using the area of the contact point to judge whether a contact point is a first contact point or a second contact point is only an example used here to facilitate understanding of the feasibility of this solution, and is not used to limit this solution.
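  • The area-threshold split described above reduces to a single comparison per contact point (illustrative only; the dictionary layout of a contact point and the 120 mm² threshold are assumptions, not values from this application):

```python
def split_contacts(contact_points, first_area_threshold_mm2=120.0):
    """Partition contact points into first contact points (fingers) and
    second contact points (palms) by comparing each touch area with the
    first area threshold (120 mm^2 here is an assumed placeholder)."""
    fingers = [c for c in contact_points
               if c["area"] < first_area_threshold_mm2]
    palms = [c for c in contact_points
             if c["area"] >= first_area_threshold_mm2]
    return fingers, palms
```

The palm contacts can then be discarded before counting hands or fitting the straight lines, as the surrounding text describes.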
  • determination process for the shape information of the hand: the first gesture operation may be a static gesture operation, and the shape information of the hand corresponding to the first gesture operation may specifically be a left hand, a right hand, two fingers, a fist, or other shape information; if the first gesture operation is a dynamic gesture operation, the shape information may specifically be a "Z" shape, a tick shape, a circle shape, etc., which are not exhaustively listed here. Specifically, if the number of the multiple contact points acquired by the electronic device is two, it can be determined that the shape information corresponding to the first gesture operation is two fingers. For a more direct understanding of this solution, please refer to FIG. 22.
  • FIG. 22 is a schematic diagram of a first gesture operation in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 22 includes two sub-schematic diagrams (a) and (b); sub-schematic diagram (a) of FIG. 22 shows a first gesture operation of a two-finger operation, and sub-schematic diagram (b) of FIG. 22 shows the two contact points corresponding to the two-finger operation. It should be understood that the example in FIG. 22 is only for the convenience of understanding the solution, and is not used to limit the solution.
  • after the electronic device determines that the number of hands corresponding to the first gesture operation is one, it needs to determine, according to the acquired first gesture parameters, whether the shape of the hand corresponding to the first gesture operation is a left hand or a right hand. Specifically, in one implementation, if the multiple contact points corresponding to the first gesture operation are all located on the left side of the center line of the display screen, the shape of the hand corresponding to the first gesture operation is the left hand; if the multiple contact points corresponding to the first gesture operation are all located on the right side of the center line of the display screen, the shape of the hand corresponding to the first gesture operation is the right hand. It should be noted that the method for judging left hand or right hand provided here is only to facilitate understanding of the feasibility of this solution, and is not intended to limit this solution.
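  • The center-line rule just described amounts to the following check (illustrative names only; x coordinates are assumed to grow from the left edge of the screen):

```python
def hand_shape(contact_xs, screen_width_px):
    """Judge left vs. right hand from which side of the screen's vertical
    center line all contact points fall on (None if they straddle it)."""
    mid = screen_width_px / 2
    if all(x < mid for x in contact_xs):
        return "left"
    if all(x > mid for x in contact_xs):
        return "right"
    return None  # contact points straddle the center line; rule undecided
```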
  • the electronic device can input the image corresponding to the first gesture operation into the neural network used for image recognition, so as to directly generate the first indication information.
  • the electronic device acquires the first rule.
  • the electronic device may be preconfigured with a first rule; the first rule indicates the correspondence between multiple types of gesture operations and multiple types of virtual keyboards, and the first type of virtual keyboard is one of the multiple types of virtual keyboards.
  • different types of virtual keyboards among the multiple types of virtual keyboards have different functions; the virtual keyboards with different functions may include any combination of two or more of the following virtual keyboards: a numeric keyboard, a function key keyboard, a full keyboard, and a custom keyboard, where the function key keyboard consists of function keys.
  • different types of virtual keyboards have different functions, so that a user can be provided with a variety of virtual keyboards with different functions, which improves the flexibility of the user in the process of using the virtual keyboard and improves the user stickiness of the solution.
  • the different types of virtual keyboards may include a combination of any two or more of the following virtual keyboards: a mini keyboard, a numeric keyboard, a functional keyboard, a function key keyboard, a circular keyboard, a curved keyboard, a custom keyboard, and a full keyboard.
  • the first rule indicates the following information: when the first gesture operation is a one-handed operation, the virtual keyboard of the first type is any one of the following virtual keyboards: a mini keyboard, a numeric keyboard, a functional keyboard, a function key keyboard, a circular keyboard, a curved keyboard, or a custom keyboard; the mini keyboard includes 26 letter keys, and the functional keyboard is displayed in an application, with the virtual keys included in the functional keyboard corresponding to the functions of the application. It should be noted that the same electronic device does not need to be equipped with the mini keyboard, numeric keyboard, functional keyboard, function key keyboard, circular keyboard, curved keyboard, and custom keyboard at the same time; when the first gesture operation is a one-handed operation, the virtual keyboard of the first type is any one of whichever of these keyboards the device is equipped with.
  • the first type of virtual keyboard is a full keyboard
  • the full keyboard includes at least 26 letter keys
  • the size of the full keyboard is larger than that of the mini keyboard.
  • the same electronic device may include a combination of at least two of the following five items:
  • (1) when the first gesture operation is the first one-handed operation, the first type of virtual keyboard is a mini keyboard.
  • the first one-handed operation may be a left-handed one-handed operation, or a right-handed one-handed operation; the first one-handed operation may be a one-handed operation holding a stylus, or a one-handed operation without a stylus.
  • FIG. 23 is a schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • as shown in FIG. 23, the electronic device detects that the first gesture operation is the first one-handed operation, and the corresponding virtual keyboard of the first type is a mini keyboard, which includes 26 letter keys. It should be understood that the example in FIG. 23 is only to facilitate understanding of this solution, and is not intended to limit this solution.
  • the first type of virtual keyboard is a mini keyboard, which is beneficial to improve the flexibility of the user in the process of inputting letters.
  • (2) when the first gesture operation is a right-handed one-handed operation, the first type of virtual keyboard is a numeric keyboard; (3) when the first gesture operation is a left-handed one-handed operation, the first type of virtual keyboard is a functional keyboard, and the virtual keys included in the functional keyboard correspond to the functions of the application. As an example, the functional keyboard may be a game keyboard, in which commonly used game keys are configured; as another example, the functional keyboard may consist of commonly used keys in drawing software, etc., which are not exhaustively listed here.
  • FIG. 24 and FIG. 25 are two schematic diagrams of the first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 24 and FIG. 25 each include two sub-schematic diagrams (a) and (b). Referring first to FIG. 24, sub-schematic diagram (a) of FIG. 24 shows that the first gesture operation is a right-handed one-handed operation, and sub-schematic diagram (b) of FIG. 24 shows that the first type of virtual keyboard is embodied as a numeric keyboard. Referring next to FIG. 25, sub-schematic diagram (a) of FIG. 25 shows that the first gesture operation is a left-handed one-handed operation, and sub-schematic diagram (b) of FIG. 25 shows that the virtual keyboard of the first type is embodied as a designer keyboard.
  • when the first gesture operation is a right-handed one-handed operation, the first type of virtual keyboard is a numeric keyboard, and when the first gesture operation is a left-handed one-handed operation, the first type of virtual keyboard is a functional keyboard; this is more in line with the user's habit of using a physical keyboard, reducing the difference between the virtual keyboard and the physical keyboard, which is beneficial to enhancing user stickiness.
  • (4) when the first gesture operation is a one-handed operation located in the first area of the display screen, the first type of virtual keyboard is a function key keyboard; the first area is located at the lower left or lower right of the display screen.
  • FIG. 26 and FIG. 27 are two schematic diagrams of the first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 26 and FIG. 27 each include two sub-schematic diagrams (a) and (b). In FIG. 26, the first area is located at the lower left of the display screen, as shown in sub-schematic diagram (a) of FIG. 26; when the user places one hand on the first area of the display screen, the electronic device is triggered to enter sub-schematic diagram (b) of FIG. 26, that is, the first type of virtual keyboard is a function key keyboard. Likewise, sub-schematic diagram (b) of FIG. 27 shows that the first type of virtual keyboard is a function key keyboard.
  • since the function keys are arranged at the lower left or lower right of a physical keyboard, when the first gesture operation is a one-handed operation located in the first area of the display screen, the first type of virtual keyboard is a function key keyboard; because the trigger gesture matches the user's habit of using a physical keyboard, it is convenient for the user to memorize the trigger gesture, which reduces the difficulty of implementing this solution and is beneficial to enhancing user stickiness.
  • (5) when the first gesture operation is a one-handed operation with fewer than three contact points, the first type of virtual keyboard is a circular keyboard or a curved keyboard; the circular keyboard refers to a keyboard with a circular shape, and the curved keyboard refers to a keyboard with an arc shape.
  • the value of the third distance threshold may be 58 mm, 60 mm, 62 mm, etc., which are not exhaustively listed here.
  • FIG. 28 is a schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • sub-schematic diagram (a) of FIG. 28 represents a one-handed operation in which the first gesture operation has two contact points (that is, fewer than three contact points).
  • sub-schematic diagram (b) of FIG. 28 represents that the virtual keyboard of the first type is a circular keyboard. It should be understood that the example in FIG. 28 is only for the convenience of understanding this solution, and is not intended to limit this solution.
  • a circular keyboard or an arc-shaped keyboard can also be provided; this offers not only keyboards that exist on a physical keyboard but also keyboards that do not exist on a physical keyboard, which enriches the types of keyboards, provides the user with more choices, and further enhances the user's flexibility of choice.
  • the first type of virtual keyboard is a full keyboard.
  • FIG. 29 is a schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 29 represents that the virtual keyboard corresponding to the two-handed operation is a full keyboard, and the full keyboard includes at least 26 letter keys. Comparing FIG. 29 and FIG. 23, it can be seen that the size of the full keyboard is larger than that of the mini keyboard. It should be understood that the example in FIG. 29 is only for the convenience of understanding this solution, and is not intended to limit this solution.
  • in one case, the first rule directly includes the correspondence between multiple types of gesture operations and multiple types of virtual keyboards, that is, as shown in items (1) to (5) above.
  • in another case, the first rule includes the correspondence between a plurality of first identification information and a plurality of second identification information, where each first identification information uniquely points to one type of gesture operation, and each second identification information uniquely points to one type of virtual keyboard.
  • the first rule includes correspondence between multiple sets of conditions and multiple types of virtual keyboards, and each set of conditions in the multiple sets of conditions corresponds to one type of gesture operation, that is, among the multiple sets of conditions Each set of conditions is used to define a type of gesture operation.
  • a set of conditions for defining the one-handed operation may be that the number of contact points is greater than or equal to the first value, and the distances between the multiple contact points are all less than or equal to the second distance threshold.
  • for the values of the first value and the second distance threshold, refer to the description above.
  • alternatively, a set of conditions for defining the one-handed operation may be that the number of contact points whose area is smaller than the first area threshold is greater than or equal to the first value, and the distances between the multiple contact points whose area is smaller than the first area threshold are all smaller than the second distance threshold.
  • a set of conditions for defining the left-handed one-handed operation may be that the number of contact points is greater than or equal to the first value, the distances between the multiple contact points are all less than the second distance threshold, and the multiple contact points are all located on the left side of the center line of the display screen; correspondingly, a set of conditions for defining the right-handed one-handed operation may be the same, except that each contact point is located to the right of the center line of the display screen.
  • a set of conditions for defining the one-handed operation in the first area may be that the number of contact points is greater than or equal to the first value, the distances between the multiple contact points are all less than the second distance threshold, and the multiple contact points are all located in the first area of the display screen; or, a set of conditions for defining the one-handed operation in the first area may be that the number of contact points is greater than or equal to the first value, the distances between the multiple contact points are all less than the second distance threshold, and at least one of the multiple contact points is located in the first area of the display screen, and so on.
  • a set of conditions for limiting the two-hand operation may be that the multiple contact points include a first subset and a second subset, the number of contact points in the first subset and the second subset are both greater than or equal to the first value, and the first subset The distances between multiple contact points in a subset are all less than the second distance threshold, and the distances between multiple contact points in the second subset are all less than the second distance threshold, and, in the first subset The distance between any one of the contact points and any one of the contact points in the second subset is greater than the second distance threshold.
  • a set of conditions for defining the two-hand operation may be that the plurality of contact points whose area is smaller than the first area threshold includes a first subset and a second subset.
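  • The sets of conditions above can be sketched as a small classifier over touch contact points. This is an illustrative sketch only: the function names, the concrete values of the first value and the second distance threshold, and the center-line simplification used to split the two hands are assumptions, not fixed by the embodiment.

```python
from itertools import combinations
from math import hypot

# Illustrative thresholds; the embodiment leaves the exact values open.
FIRST_VALUE = 3        # minimum number of contact points for one hand
SECOND_DIST_MM = 65.0  # maximum spread among one hand's contact points

def _close_cluster(points):
    """All pairwise distances below the second distance threshold."""
    return all(hypot(ax - bx, ay - by) < SECOND_DIST_MM
               for (ax, ay), (bx, by) in combinations(points, 2))

def classify(points, screen_width):
    """Sketch of the 'sets of conditions' in the first rule."""
    if len(points) >= FIRST_VALUE and _close_cluster(points):
        # One-handed: left vs right decided by the screen's center line.
        if all(x < screen_width / 2 for x, _ in points):
            return "one-handed-left"
        if all(x >= screen_width / 2 for x, _ in points):
            return "one-handed-right"
        return "one-handed-center"
    # Two-handed: two tight clusters, here split by the center line
    # (a simplification of the inter-cluster distance condition).
    left = [p for p in points if p[0] < screen_width / 2]
    right = [p for p in points if p[0] >= screen_width / 2]
    if (len(left) >= FIRST_VALUE and len(right) >= FIRST_VALUE
            and _close_cluster(left) and _close_cluster(right)):
        return "two-handed"
    return "unrecognized"
```

A touch at three close points on the left half would classify as a left-handed one-handed operation; two well-separated tight clusters classify as a two-handed operation.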
  • the first rule includes a first sub-rule, and the first sub-rule is obtained based on performing a custom operation on at least one type of gesture operation and/or at least one type of virtual keyboard.
  • the user can customize the trigger gesture and/or the type of the virtual keyboard, so that the display process of the virtual keyboard is more in line with the user's expectation, so as to further improve the user stickiness of the solution.
  • FIG. 30 to FIG. 32 are schematic diagrams of the first setting interface in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 30 includes four sub-schematic diagrams (a), (b), (c) and (d).
  • sub-schematic diagram (a) of FIG. 30 represents the correspondence, preconfigured in the electronic device, between multiple types of gesture operations and multiple types of virtual keyboards.
  • One-handed operation triggers the display of the numeric keyboard
  • two-handed operation triggers the display of the full keyboard
  • two-finger operation triggers the display of the circular keyboard.
  • when the user clicks "numeric keyboard" (a type of virtual keyboard), the electronic device is triggered to enter sub-schematic diagram (b) of FIG. 30, that is, a custom operation on the numeric keyboard.
  • a delete icon (that is, the "×" symbol shown in FIG. 30) is displayed, and a key carrying the "×" symbol is a key that can be deleted.
  • the user can also move the position of the button in the numeric keyboard by long pressing and dragging the button.
  • the aforementioned operations of deleting a key and moving a key position can be performed multiple times.
  • the user deletes the numeric keys except 1-9 to realize the customization of the numeric keyboard.
  • FIG. 30 is only for the convenience of understanding this solution; other operations can also be used to delete or move the virtual keys in the virtual keyboard, and although FIG. 30 only takes the customization of the numeric keyboard as an example, other types of virtual keyboards can also be customized.
  • FIG. 31 needs to be described in conjunction with FIG. 30.
  • when the user clicks "one-handed operation" in sub-schematic diagram (a) of FIG. 30, the user enters sub-schematic diagram (a) of FIG. 31.
  • an icon for inputting a "custom gesture” is displayed.
  • the user clicks on the aforementioned icon to enter sub-schematic diagram (b) of FIG. 31, and the user inputs a custom gesture based on the prompt of sub-schematic diagram (b) of FIG. 31, that is, a "clenched fist" gesture is input as shown in sub-schematic diagram (c) of FIG. 31.
  • the electronic device can preset a first duration threshold, where the first duration threshold is the total duration allowed for inputting a custom gesture; when the input duration reaches the first duration threshold, sub-schematic diagram (d) of FIG. 31 is entered;
  • the electronic device may also preconfigure a second duration threshold, where the second duration threshold is a threshold on the duration for which the user stops inputting gestures; when the electronic device detects that the duration for which the user has stopped inputting gestures reaches the second duration threshold, sub-schematic diagram (d) of FIG. 31 is entered, and so on; the manners of entering sub-schematic diagram (d) of FIG. 31 are not exhaustively listed here.
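  • The two duration thresholds above can be combined into a simple end-of-recording check. The concrete values below are illustrative assumptions; the embodiment does not fix them.

```python
# Illustrative values; the embodiment leaves these thresholds open.
FIRST_DURATION_S = 5.0    # total time allowed for inputting a custom gesture
SECOND_DURATION_S = 1.5   # idle time after which input is considered finished

def recording_finished(elapsed_total_s, idle_s):
    """True once either duration threshold from the description is reached."""
    return elapsed_total_s >= FIRST_DURATION_S or idle_s >= SECOND_DURATION_S
```

Either condition alone ends the recording, matching the "and/or" structure of the two preconfigured thresholds.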
  • an icon indicating "confirm" and an icon indicating "re-input custom gesture" are displayed on the display screen. If the user clicks the "confirm" icon, the electronic device determines the gesture operation obtained through sub-schematic diagram (c) of FIG. 31 as custom gesture 1, updates the first rule so that the correspondence between the one-handed operation and the numeric keyboard becomes the correspondence between custom gesture 1 and the numeric keyboard, and shows sub-schematic diagram (e) of FIG. 31, that is, custom gesture 1 is confirmed as the trigger gesture of the numeric keyboard, completing the customization of the trigger gesture.
  • sub-schematic diagram (f) of FIG. 31 represents the shape of custom gesture 1 obtained by the electronic device (that is, the shape of "making a fist"). It should be understood that the example in FIG. 31 is only for the convenience of understanding this solution and is not intended to limit it; the user can also set custom gestures of other shapes, which are not limited here.
  • FIG. 32 needs to be described in conjunction with FIG. 31.
  • the user starts to input a custom gesture based on the prompt in sub-schematic diagram (b) of FIG. 31.
  • in FIG. 32, the custom gesture is a dynamic gesture as an example; specifically, the custom gesture is a dynamic gesture of opening the hand after making a fist.
  • when the electronic device determines that the user has completed the input of the custom gesture, it can enter sub-schematic diagram (d) of FIG. 31; for the subsequent steps, refer to the above description of FIG. 31, which is not repeated here.
  • the electronic device determines whether the first gesture operation is included in the pre-stored multiple types of gesture operations. If the first gesture operation is one of the pre-stored multiple types of gesture operations, step 1705 is entered; if the first gesture operation is not included in the pre-stored multiple types of gesture operations, step 1701 is re-entered.
  • the electronic device needs to generate first indication information through step 1702, and the first indication information needs to include the number information of the hands corresponding to the first gesture operation, the position information of those hands, and the shape information of those hands. After obtaining the first indication information, the electronic device can determine whether the first gesture operation is one of the various types of gesture operations preconfigured in the electronic device.
  • the electronic device can directly determine whether the first gesture operation satisfies any one group of conditions among the multiple groups of conditions included in the first rule; for the description of the multiple groups of conditions, refer to step 1703 above.
  • the electronic device displays the first type of virtual keyboard through the display screen.
  • the electronic device may obtain the type of virtual keyboard corresponding to the target type of gesture operation, that is, obtain the first type of virtual keyboard corresponding to the first gesture operation.
  • the virtual keyboard of the first type is then displayed through the display screen.
  • the electronic device is preconfigured with a first rule, and the first rule indicates the correspondence between multiple types of gesture operations and multiple types of virtual keyboards; after acquiring the specific first gesture operation, the first type of virtual keyboard corresponding to that operation can be obtained according to the first rule, thereby improving the efficiency of the virtual keyboard matching process.
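  • Encoded as data, the first rule is just a lookup from a recognized gesture type to a keyboard type. The gesture labels and keyboard names below are placeholders for the correspondence items (1) to (5); none of them are fixed by the embodiment.

```python
# Hypothetical encoding of the first rule as a correspondence table.
FIRST_RULE = {
    "one-handed-right": "numeric keyboard",
    "one-handed-left": "function keyboard",
    "one-handed-first-area": "function key keyboard",
    "fewer-than-three-points": "circular keyboard",
    "two-handed": "full keyboard",
}

def select_keyboard(gesture_type):
    """Return the first type of virtual keyboard, or None to keep waiting."""
    return FIRST_RULE.get(gesture_type)
```

A gesture not present in the table returns `None`, matching the flow of re-entering step 1701 when the first gesture operation is not one of the pre-stored types.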
  • the position of the first type of virtual keyboard on the display screen is fixed.
  • the position of the first type of virtual keyboard on the display screen may move with the movement of the user's hand.
  • the multiple types of virtual keyboards are divided into a third subset and a fourth subset, where the third subset and the fourth subset each include at least one type of virtual keyboard; the position of each type of virtual keyboard in the third subset is fixed during display, and the position of each type of virtual keyboard in the fourth subset can move with the movement of the user's hand during display. That is, among the multiple types of virtual keyboards, the positions of some types are fixed during display, while the positions of the other types move along with the user's hand.
  • if the first type of virtual keyboard is a mini keyboard, a numeric keyboard, or a function keyboard, it can move with the movement of the user's hand, that is, the fourth subset includes any one or a combination of the following: mini keyboard, numeric keyboard, and function keyboard.
  • if the first type of virtual keyboard is a circular keyboard, an arc-shaped keyboard, or a full keyboard, its position can be fixed during display, that is, the third subset includes any one or a combination of the following: circular keyboard, arc-shaped keyboard, and full keyboard.
  • a second gesture operation can be input through the display screen, and the second gesture operation can be a double-click operation, a triple-click operation, a single-click operation, or another operation, which is not limited here.
  • the initial display position of the first type of virtual keyboard may be preset, or may be determined by the electronic device based on the position of the finger.
  • for example, if the first type of virtual keyboard is a numeric keyboard, the key corresponding to the number 5 can be placed under the index finger; as another example, if the first type of virtual keyboard is a mini keyboard, the initial display position of the mini keyboard may be below the hand, and so on.
  • the examples here are only for the convenience of understanding the solution, and are not used to limit the solution.
  • Display size of the virtual keyboard: in one implementation, the size of each type of virtual keyboard in the electronic device is fixed. In another implementation, the same type of virtual keyboard may have different sizes to accommodate different finger/hand sizes; specifically, for the same type of virtual keyboard, at least two different sizes may be pre-stored in the electronic device, together with the correspondence between contact-point sizes and those keyboard sizes. After the first type of virtual keyboard is determined, the target size corresponding to the size of the contact point can be obtained, and the virtual keyboard of the first type is displayed at the target size.
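  • The size lookup just described (contact-point size → pre-stored keyboard size) can be sketched as a tiered table. The tier boundaries and size labels are invented for illustration; the embodiment only requires that at least two sizes and a correspondence exist.

```python
# Assumed tiers: (upper bound on contact area in mm^2, pre-stored size label).
SIZE_TIERS = [(80.0, "small"), (120.0, "medium"), (float("inf"), "large")]

def target_size(contact_area_mm2):
    """Map the detected contact-point size to a pre-stored keyboard size."""
    for max_area, label in SIZE_TIERS:
        if contact_area_mm2 <= max_area:
            return label
```

Larger fingers (larger contact areas) thus select a larger pre-stored keyboard size.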
  • the electronic device may also obtain a first angle according to the first indication information generated in step 1702, where the first angle indicates the relative angle between the hand in the first gesture corresponding to the first gesture operation and the edge of the display screen, or the first angle indicates the relative angle between the hand in the first gesture corresponding to the first gesture operation and the center line of the display screen.
  • Step 1705 may include: the electronic device obtains a first display angle of the virtual keyboard of the first type according to the first angle, and displays the virtual keyboard of the first type according to the first display angle through the display screen; the first display angle indicates the relative angle between the edge of the first type of virtual keyboard and the edge of the display screen, or the first display angle indicates the relative angle between the edge of the first type of virtual keyboard and the center line of the display screen.
  • the electronic device determines whether the first angle is greater than or equal to a preset angle threshold; if it is greater than or equal to the preset angle threshold, the electronic device obtains the first display angle and displays the virtual keyboard of the first type according to the first display angle through the display screen, where the value of the preset angle threshold may be 25 degrees, 28 degrees, 30 degrees, 32 degrees, 35 degrees, or another value, which is not limited here.
  • the virtual keyboard of the first type is a full keyboard
  • the first angle includes the relative angle of the left hand and the relative angle of the right hand
  • the full keyboard is split into a first sub-keyboard and a second sub-keyboard
  • the first sub-keyboard and the second sub-keyboard include different virtual keys of the full keyboard
  • the first display angle includes the display angle of the first sub-keyboard and the display angle of the second sub-keyboard.
  • the first display angle indicates the relative angle between the bottom edge of the virtual keyboard and the edge of the display screen; further
  • the display angle of the first sub-keyboard indicates the relative angle between the edge of the first sub-keyboard and the edge of the display screen, and the display angle of the second sub-keyboard indicates the relative angle between the edge of the second sub-keyboard and the edge of the display screen. angle.
  • the first display angle indicates the relative angle between the bottom edge of the virtual keyboard and the center line of the display screen
  • the display angle of the first sub-keyboard indicates the relative angle between the side of the first sub-keyboard and the center line of the display screen
  • the display angle of the second sub-keyboard indicates the relative angle between the side of the second sub-keyboard and the center line of the display screen.
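  • The angle logic above can be sketched as: estimate the hand's angle from its contact points, then rotate the keyboard only if that angle reaches the preset threshold. The two-point line through extreme contact points is a crude stand-in for whatever line fitting the device actually uses, and the threshold value is one of the examples from the description.

```python
from math import atan2, degrees

ANGLE_THRESHOLD_DEG = 30.0  # the description suggests values around 25-35 degrees

def hand_angle(points):
    """Angle between a line through two extreme contact points and the
    bottom edge of the screen (a simplification of a fitted line)."""
    (x0, y0) = min(points)
    (x1, y1) = max(points)
    return abs(degrees(atan2(y1 - y0, x1 - x0)))

def first_display_angle(points):
    """Rotate the keyboard only when the hand angle reaches the threshold."""
    angle = hand_angle(points)
    return angle if angle >= ANGLE_THRESHOLD_DEG else 0.0
```

A hand resting at a shallow angle leaves the keyboard level; a strongly tilted hand rotates the keyboard to match, fitting the placement of the user's hand.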
  • FIG. 33 is a schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • the value of the preset angle threshold is taken as 30 degrees as an example.
  • FIG. 33 includes three sub-schematic diagrams (a), (b) and (c); sub-schematic diagram (a) of FIG. 33 shows the eight first contact points corresponding to a two-handed operation, and sub-schematic diagram (b) of FIG. 33 shows, among others, the first sub-angle (that is, the relative angle of the left hand) formed by the first straight line and the bottom edge of the display screen.
  • sub-schematic diagram (c) of FIG. 33 represents displaying the first type of virtual keyboard according to the first display angle through the display screen. It should be understood that the example in FIG. 33 is only for the convenience of understanding this solution, and is not intended to limit this solution.
  • the first type of virtual keyboard is a mini keyboard, a numeric keyboard, a functional keyboard or a function key keyboard
  • the first angle is the angle of one hand
  • the first display angle is the relative angle of the entire virtual keyboard.
  • the electronic device determines the first display angle of the virtual keyboard of the first type as the first angle, and displays the virtual keyboard of the first type according to the first angle through the display screen; if the first angle indicates the relative angle between the hand in the first gesture corresponding to the first gesture operation and the edge of the display screen, the first display angle indicates the relative angle between the bottom edge of the virtual keyboard and the edge of the display screen; if the first angle indicates the relative angle between the hand in the first gesture corresponding to the first gesture operation and the center line of the display screen, the first display angle indicates the relative angle between the bottom edge of the virtual keyboard and the center line of the display screen.
  • the relative angle (that is, the first angle) between the user's hand and the edge or center line of the display interface is obtained, and the display angle of the virtual keyboard is determined according to the first angle, so that the display angle of the keyboard better fits the placement angle of the user's hand, making the user's input via the virtual keyboard more comfortable and convenient.
  • the electronic device determines that the first gesture operation is a two-handed operation, that is, determines that the virtual keyboard of the first type is a full keyboard; the electronic device also obtains the distance between the hands and determines whether that distance is greater than the first distance threshold. When the distance between the hands is less than or equal to the first distance threshold, the full keyboard is displayed in an integrated manner through the display screen; when the distance between the hands is greater than the first distance threshold, the first sub-keyboard is displayed through the second area of the display screen and the second sub-keyboard is displayed through the third area of the display screen, where the second area and the third area are different areas of the display screen, and the first sub-keyboard and the second sub-keyboard include different virtual keys of the full keyboard. The value of the first distance threshold may be 70 mm, 75 mm, 80 mm, etc., which is not limited here.
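  • The integrated-versus-split decision above reduces to a single comparison against the first distance threshold. The 75 mm value is one of the examples given in the description; the function name is illustrative.

```python
FIRST_DISTANCE_MM = 75.0  # example value; 70/75/80 mm are mentioned

def full_keyboard_layout(hand_distance_mm):
    """Integrated full keyboard when the hands are close, split otherwise."""
    return "split" if hand_distance_mm > FIRST_DISTANCE_MM else "integrated"
```

With the 80 mm hand distance of the FIG. 34 example, the full keyboard would be split into the two sub-keyboards.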
  • FIG. 34 is a schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • the value of the first distance threshold is taken as an example of 75 mm.
  • FIG. 34 includes two sub-schematic diagrams (a) and (b); sub-schematic diagram (a) of FIG. 34 represents that the distance between the two hands in the two-handed operation is 80 mm. Since 80 mm is greater than 75 mm, sub-schematic diagram (b) of FIG. 34 represents that the first sub-keyboard is displayed in the second area of the display screen and the second sub-keyboard is displayed in the third area of the display screen. It should be understood that the example in FIG. 34 is only for the convenience of understanding this solution, and is not intended to limit this solution.
  • in this way, the keyboard is more convenient for users to use, and the user stickiness of the solution is further improved.
  • the electronic device determines that the first gesture operation is a two-handed operation, that is, determines that the virtual keyboard of the first type is a full keyboard; the electronic device then also obtains the distance between the hands and determines whether that distance is less than the fourth distance threshold. If the distance between the hands is less than the fourth distance threshold, the electronic device displays prompt information to instruct the user to adjust the distance between the hands, and/or the electronic device directly displays the full keyboard in an integrated manner; optionally, the electronic device displays a minimum-sized full keyboard in an integrated manner.
  • the aforementioned prompt information may be text prompts, voice prompts, vibration prompts, or other types of prompts, etc., and the manner of displaying the prompt information is not exhaustive here.
  • FIG. 35 is a schematic diagram of a first type of virtual keyboard in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • Figure 35 includes two sub-schematic diagrams (a) and (b).
  • sub-schematic diagram (a) of FIG. 35 represents that the distance between the two hands is 0 mm in the two-handed operation. Since the distance between the hands is too small, B1 in sub-schematic diagram (b) of FIG. 35 represents the prompt information reminding the user that the hands are too close together, and the full keyboard is displayed in an integrated manner. It should be understood that the example in FIG. 35 is only for the convenience of understanding this solution, and is not intended to limit this solution.
  • a plurality of vibration feedback elements are also configured in the display screen. If the position of the first type of virtual keyboard is fixed during its display, then after the electronic device displays the first type of virtual keyboard through the display screen, the electronic device can also detect a first contact operation acting on the display screen and, in response to the first contact operation, obtain first position information of the first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard.
  • the electronic device obtains a first vibration feedback element from the plurality of vibration feedback elements, where the first vibration feedback element is the vibration feedback element matched with the first virtual key; the electronic device then instructs the first vibration feedback element to emit vibration waves so as to perform a first feedback operation, and the first feedback operation is used to prompt that the first virtual key is an anchor point key.
  • the anchor point keys in each type of virtual keyboard are given as examples below in combination with the various types of virtual keyboards shown above.
  • the first type of virtual keyboard is the numeric keyboard shown in FIG. 24
  • the anchor point key may be a virtual key pointed to by the number "5".
  • the first type of virtual keyboard is the function key keyboard shown in FIG. 26
  • the anchor point keys can be the Ctrl key and the Shift key.
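  • The anchor-point feedback flow above can be sketched as a position-to-key lookup followed by a feedback decision. The key bounding boxes, element ids, and the choice of "5" as the numeric keyboard's anchor key follow the examples in the description, but the concrete numbers are invented for illustration.

```python
# Hypothetical numeric-keyboard fragment: key -> (bounding box, feedback element id).
KEY_LAYOUT = {
    "5": ((30, 30, 60, 60), 7),   # anchor point key of the numeric keyboard
    "8": ((30, 0, 60, 30), 4),
}
ANCHOR_KEYS = {"5"}

def on_first_contact(x, y):
    """Return the feedback action for a first contact operation at (x, y)."""
    for key, ((x0, y0, x1, y1), element_id) in KEY_LAYOUT.items():
        if x0 <= x < x1 and y0 <= y < y1:
            if key in ANCHOR_KEYS:
                return ("vibrate", element_id)  # drive the matched element
            return ("no-feedback", None)        # non-anchor key: no vibration
    return ("miss", None)
```

Touching the "5" key drives only the vibration feedback element matched with that key, so the localized vibration marks it as the anchor point.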
  • the electronic device acquires a contact operation for the first virtual key in the function key keyboard.
  • the first type of virtual keyboard displayed by the electronic device is a function key keyboard
  • the electronic device can also obtain a contact operation for one or more first virtual keys in the function key keyboard, and the contact operation can be a pressing operation or a touch operation.
  • the first virtual key may be a Ctrl key, and may also include a Ctrl key and a Shift key, etc., which is not limited here.
  • the electronic device highlights a second virtual key on the display screen, where the second virtual key is a key other than the first virtual key in the combined shortcut keys.
  • the electronic device highlights at least one second virtual key on the display screen in response to the contact operation.
  • each second virtual key in the at least one second virtual key can form a combined shortcut key with the first virtual key, and the second virtual key is a key other than the first virtual key in the combined shortcut key; the highlighting includes but is not limited to highlight display, bold display, flashing display, etc., which is not limited here.
  • the combination of Ctrl key + Shift key + I key can provide the function of inverting the currently processed image
  • the first virtual key includes the Ctrl key and the Shift key, and the second virtual key is the virtual key where the letter I is located; here, the inverse display of the currently processed image refers to changing the color of the currently processed image to its complementary color. It should be understood that the examples here are only for the convenience of understanding this solution, and are not intended to limit this solution.
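  • The shortcut-highlighting behavior above can be sketched as a table keyed by the set of held first virtual keys. The shortcut table below is a placeholder (only Ctrl+Shift+I is taken from the description; the Ctrl entries are assumed for illustration).

```python
# Assumed shortcut table: held modifier set -> {completing key: function name}.
SHORTCUTS = {
    frozenset({"Ctrl", "Shift"}): {"I": "invert image"},
    frozenset({"Ctrl"}): {"C": "copy", "V": "paste"},
}

def keys_to_highlight(held_keys):
    """Second virtual keys to highlight for the currently held first keys."""
    return SHORTCUTS.get(frozenset(held_keys), {})
```

Holding Ctrl and Shift would highlight the I key (and, in variants like FIG. 37, show the function name alongside it).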
  • FIG. 36 is a schematic diagram of a second virtual key in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • Fig. 36 takes the current application as a drawing application as an example.
  • Fig. 36 includes four sub-schematic diagrams (a), (b), (c) and (d).
  • sub-schematic diagram (a) of FIG. 36 represents that a function key keyboard is displayed on the display screen.
  • the sub-schematic diagram (b) of FIG. 36 represents that the user performs a pressing operation on the Ctrl key and the Shift key, thereby triggering the electronic device to highlight the key where the letter I is located on the display screen.
  • sub-schematic diagram (c) of FIG. 36 represents that the user clicks the key where the letter I is located, thereby triggering entry into sub-schematic diagram (d) of FIG. 36, that is, the currently displayed image is displayed in inverse. It should be understood that the example in FIG. 36 is only for the convenience of understanding this solution, and is not intended to limit this solution.
  • the electronic device highlights the second virtual keys on the display screen, and also displays the function of the shortcut keys corresponding to each second virtual key.
  • FIG. 37 is a schematic diagram of a second virtual key in the method for processing a virtual keyboard provided by an embodiment of the present application.
  • FIG. 37 takes the current application being a document presentation application as an example, and the virtual keyboard is displayed on the presentation interface of the document in a floating manner.
  • Fig. 37 includes three sub-schematic diagrams (a), (b) and (c).
  • sub-schematic diagram (a) of FIG. 37 represents that a function key keyboard is displayed on the display screen.
  • sub-schematic diagram (b) of FIG. 37 represents that the user performs a pressing operation on the Ctrl key, thereby triggering entry into sub-schematic diagram (c) of FIG. 37.
  • a contact operation for the first virtual key in the function key keyboard is obtained, and in response to the contact operation, the second virtual key is highlighted on the display screen, where the second virtual key is a key other than the first virtual key in the combined shortcut key. Since the function key keyboard occupies a small area, the area required for displaying the virtual keyboard is reduced; after the user touches the first virtual key in the function key keyboard, the second virtual key in the combined shortcut key can be automatically displayed, thereby meeting the user's demand for shortcut keys while avoiding waste of the display area of the display screen.
  • FIG. 38 is a schematic flowchart of a method for processing a virtual keyboard provided by an embodiment of the present application.
  • the embodiment of the present application applied to a text editing application is taken as an example:
  • FIG. 38 includes four sub-schematic diagrams (a), (b), (c) and (d).
  • the electronic device acquires the first gesture parameter corresponding to the two-handed operation, obtains the full keyboard corresponding to the two-handed operation according to the first rule and that first gesture parameter, and displays the full keyboard on the display screen; the user inputs the content "Main ingredient low-gluten flour:" through the full keyboard.
  • the electronic device detects that the user has raised one hand and stops displaying the full keyboard on the display screen; the electronic device then acquires the first gesture parameter corresponding to the one-handed operation of the right hand, obtains, according to the first rule and that first gesture parameter, the numeric keyboard corresponding to the one-handed operation of the right hand, and displays the numeric keyboard on the display screen. That is, as shown in sub-schematic diagram (c) of FIG. 38, the numeric keyboard is displayed below the user's hand, and the user enters the content "145" through the numeric keyboard.
  • as shown in sub-schematic diagram (d) of FIG. 38, when the electronic device detects that the user's hand is moving above the display screen, the electronic device obtains the movement trajectory of the hand and controls the numeric keyboard to follow the user's hand.
  • the position of the numeric keyboard is fixed.
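  • The hand-following behaviour described above can be sketched as follows. This is a minimal, hypothetical illustration; the class name, coordinate convention, and the `offset_y` value are all assumptions, not part of the original disclosure.

```python
# Hypothetical sketch: a numeric keyboard that follows the detected hand,
# unless the user has fixed (locked) its position. All values illustrative.
class FollowingKeyboard:
    def __init__(self, x, y):
        self.x, self.y = x, y   # current anchor point of the keyboard
        self.locked = False     # True once the user fixes the position

    def on_hand_moved(self, hand_x, hand_y, offset_y=40):
        """Re-anchor the keyboard just below the detected hand position."""
        if not self.locked:
            self.x = hand_x
            self.y = hand_y + offset_y  # keep the keyboard below the hand

kb = FollowingKeyboard(0, 0)
kb.on_hand_moved(120, 300)   # keyboard re-anchors below the hand
```

  • When `locked` is set, the same event handler leaves the keyboard in place, matching the alternative in which the position of the numeric keyboard is fixed.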
  • a plurality of different types of virtual keyboards are configured in the electronic device, the virtual keys included in the different types of virtual keyboards are not identical, and the user can evoke different types of virtual keyboards through different gesture operations; that is, the virtual keyboard is no longer limited to displaying 26 letters, but provides the user with more virtual keys through different types of virtual keyboards, which not only improves the user's flexibility in evoking the virtual keyboard, but also provides richer virtual keys, eliminating the need for an additional physical keyboard.
  • FIG. 39 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the electronic device 1 includes a display screen 50, a memory 40, one or more processors 10, and one or more programs 401.
  • the display screen 50, the memory 40, and the one or more processors 10 may be the same elements as those in the foregoing FIG. 29; one or more programs 401 are stored in the memory 40, and when the one or more processors 10 execute the one or more programs 401, the electronic device performs the following steps: in response to the detected first gesture operation, selecting, from multiple types of virtual keyboards, the first type of virtual keyboard corresponding to the first gesture operation, wherein the virtual keys included in different types of virtual keyboards among the multiple types are not exactly the same; and displaying the first type of virtual keyboard through the display screen 50.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device specifically executes the following steps: selecting, according to a first rule, the first type of virtual keyboard corresponding to the first gesture operation, where the first rule indicates the correspondence between multiple types of gesture operations and multiple types of virtual keyboards.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device further executes the following steps: acquiring a first gesture parameter corresponding to the first gesture operation, wherein the first gesture parameter includes any one or more of the following: position information of the contact points corresponding to the first gesture operation, quantity information of the contact points, area information of the contact points, relative angle information of the hand corresponding to the first gesture operation, position information of the hand, quantity information of the hands, and shape information of the hand.
  • the electronic device specifically performs the following steps: selecting a first type of virtual keyboard from multiple types of virtual keyboards according to the first gesture parameter.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device further executes the following steps: in response to the first gesture operation, acquiring a first angle, the first angle indicating the relative angle between the hand corresponding to the first gesture operation and a side of the display screen 50, or the relative angle between the hand corresponding to the first gesture operation and the center line of the display screen 50.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device specifically performs the following steps: obtaining the presentation angle of the first type of virtual keyboard according to the first angle, and displaying the first type of virtual keyboard through the display screen 50 according to the presentation angle, where the presentation angle indicates the relative angle between a side of the first type of virtual keyboard and a side of the display screen 50, or the relative angle between a side of the first type of virtual keyboard and the center line of the display screen 50.
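  • One plausible way to derive a stable presentation angle from the measured first angle is to snap it to a fixed step, so that small hand tremors do not continually rotate the keyboard. This is purely an illustrative sketch; the snapping step and function name are assumptions not found in the original text.

```python
# Illustrative sketch: derive the keyboard's presentation angle from the
# first angle (hand relative to the screen edge). Values are assumptions.
def presentation_angle(first_angle_deg, snap_step=15):
    """Snap the measured hand angle to the nearest multiple of snap_step
    so the keyboard is rendered at a stable angle."""
    return round(first_angle_deg / snap_step) * snap_step
```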
  • different virtual keyboards among the multiple types of virtual keyboards have different functions; the virtual keyboards with different functions include a combination of any two or more of the following: a numeric keyboard, a function key keyboard, a full keyboard, and a custom keyboard, where the function key keyboard consists of function keys.
  • the first type of virtual keyboard is any one of the following virtual keyboards: a mini keyboard, a numeric keyboard, a functional keyboard, a function key keyboard, a circular keyboard, a curved keyboard, or a custom keyboard, wherein the mini keyboard includes 26 letter keys, the functional keyboard is displayed in a target application, and the virtual keys included in the functional keyboard correspond to functions of the target application.
  • the first type of virtual keyboard is a full keyboard, and the full keyboard includes at least 26 letter keys; when the one or more processors 10 execute the one or more programs 401, the electronic device specifically executes the following steps: when the distance between the hands is less than or equal to a first distance threshold, displaying the full keyboard in an integrated manner through the display screen 50; when the distance between the hands is greater than the first distance threshold, displaying a first sub-keyboard through a second area of the display screen 50 and a second sub-keyboard through a third area of the display screen 50, wherein the second area and the third area are different areas of the display screen 50, and the first sub-keyboard and the second sub-keyboard include different virtual keys of the full keyboard.
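  • The integrated-versus-split decision above can be sketched as a simple threshold test. The threshold value and the layout labels below are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch of the split/integrated full-keyboard decision: if the
# distance between the hands exceeds the first distance threshold, the
# full keyboard is split into two sub-keyboards. Values are assumed.
FIRST_DISTANCE_THRESHOLD = 180  # pixels; illustrative only

def full_keyboard_layout(hand_distance):
    if hand_distance <= FIRST_DISTANCE_THRESHOLD:
        return ["full_keyboard"]                        # integrated display
    return ["left_sub_keyboard", "right_sub_keyboard"]  # split display
```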
  • the one-handed operation includes a left-handed one-handed operation and a right-handed one-handed operation; if the first gesture operation is a right-handed one-handed operation, the first type of virtual keyboard is a numeric keyboard; if the first gesture operation is a left-handed one-handed operation, the first type of virtual keyboard is a functional keyboard.
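  • The "first rule" can be read as a lookup from gesture category to keyboard type, matching the examples given in this embodiment (two hands evoke the full keyboard, the right hand a numeric keyboard, the left hand a functional keyboard). The dictionary and fallback below are an illustrative sketch only.

```python
# Hedged sketch of the "first rule": correspondence between gesture
# operations and virtual keyboard types, per the examples above.
FIRST_RULE = {
    "two_hands": "full_keyboard",
    "right_hand": "numeric_keyboard",
    "left_hand": "functional_keyboard",
}

def select_keyboard(gesture):
    # Fall back to a default keyboard for unrecognized gestures (assumed).
    return FIRST_RULE.get(gesture, "default_keyboard")
```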
  • the display screen 50 is configured with a plurality of vibration feedback elements.
  • when the one or more processors 10 execute the one or more programs 401, the electronic device further executes the following steps: detecting a first contact operation acting on the display screen 50; in response to the first contact operation, acquiring first position information of the first contact point corresponding to the first contact operation, where the first position information corresponds to a first virtual key on the virtual keyboard; when the first virtual key is an anchor point key, obtaining a first vibration feedback element from the plurality of vibration feedback elements, the first vibration feedback element being the vibration feedback element matched with the first virtual key; and instructing the first vibration feedback element to emit a vibration wave to perform a first feedback operation, the first feedback operation being used to prompt that the first virtual key is an anchor point key.
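  • One natural reading of "the vibration feedback element matched with the first virtual key" is the element physically closest to the key's position under the display. The sketch below illustrates that interpretation; the element layout and coordinates are hypothetical.

```python
# Illustrative sketch: choose the vibration feedback element nearest to
# the touched anchor-point key. Coordinates are hypothetical examples.
def nearest_vibration_element(key_pos, elements):
    """elements: list of (element_id, (x, y)) pairs under the display.
    Returns the id of the element closest to key_pos (squared distance)."""
    return min(
        elements,
        key=lambda e: (e[1][0] - key_pos[0]) ** 2 + (e[1][1] - key_pos[1]) ** 2,
    )[0]

elements = [("elem_a", (10, 10)), ("elem_b", (200, 10))]
```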
  • FIG. 40 is a schematic structural diagram of the electronic device provided by the embodiment of the present application.
  • the electronic device 1 may be embodied as a mobile phone, a tablet, a notebook computer, or another device configured with a display screen, which is not limited here.
  • the electronic device described in the embodiment corresponding to FIG. 39 may be deployed on the electronic device 1 to implement the functions of the electronic device in the embodiment corresponding to FIG. 17 to FIG. 38 .
  • the electronic device 1 may vary greatly due to different configurations or performance, and may include one or more central processing units (CPU) 1522 (for example, one or more processors), the memory 40, and one or more storage media 1530 (for example, one or more mass storage devices) that store applications 1542 or data 1544.
  • the memory 40 and the storage medium 1530 may be short-term storage or persistent storage.
  • the program stored in the storage medium 1530 may include one or more modules (not shown in the figure), and each module may include a series of instructions to operate on the electronic device.
  • the central processing unit 1522 may be configured to communicate with the storage medium 1530 to execute a series of instruction operations in the storage medium 1530 on the electronic device 1 .
  • the electronic device 1 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input/output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
  • the central processing unit 1522 is used to implement the functions of the electronic device in the embodiments corresponding to FIG. 17 to FIG. 38. It should be noted that, for the specific implementation manner in which the central processing unit 1522 performs those functions and the beneficial effects brought about, reference may be made to the respective method embodiments corresponding to FIG. 17 to FIG. 38, and details are not repeated here.
  • Embodiments of the present application further provide a computer-readable storage medium, where a program is stored in the computer-readable storage medium, and when the program runs on a computer, the computer is caused to perform the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIG. 17 to FIG. 38.
  • the embodiments of the present application also provide a computer program, which, when run on a computer, causes the computer to perform the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIG. 17 to FIG. 38 .
  • An embodiment of the present application further provides a circuit system, the circuit system includes a processing circuit, and the processing circuit is configured to execute the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIG. 17 to FIG. 38 .
  • the electronic device provided by the embodiment of the present application may be a chip, and the chip includes: a processing unit and a communication unit.
  • the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit.
  • the processing unit can execute the computer-executed instructions stored in the storage unit, so that the chip executes the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIGS. 17 to 38 .
  • the storage unit is a storage unit in the chip, such as a register, a cache, etc.
  • the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, or a random access memory (RAM).
  • the processor mentioned in any one of the above may be a general-purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits for controlling the execution of the program of the method in the first aspect.
  • the device embodiments described above are only schematic; the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • the connection relationship between the modules indicates that there is a communication connection between them, which may be specifically implemented as one or more communication buses or signal lines.
  • the aforementioned storage medium includes various media that can store program code, such as a U disk (USB flash drive), a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • a computer device which may be a personal computer, server, or network device, etc.
  • the computer program includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVD), or semiconductor media (eg, Solid State Disk (SSD)), and the like.
  • The application interface processing method provided by the embodiments of the present application can be applied to the electronic device shown in FIG. 41.
  • FIG. 41 is a schematic structural diagram of the electronic device provided by the embodiments of the present application.
  • the electronic device includes a first display screen 501 and a second display screen 502.
  • the difference between the first display screen 501 and the second display screen 502 is that the second display screen 502 is a display screen used to obtain the user's handwriting input, while the first display screen 501 is not.
  • the second display screen 502 is a touch screen, and needs to have the functions of receiving input and displaying output at the same time.
  • the electronic device is described here as including one first display screen 501 and one second display screen 502 as an example, but in practice an electronic device may also include at least two first display screens 501, or at least two second display screens 502; the number of first display screens 501 and second display screens 502 included in a specific electronic device can be determined according to the actual application scenario, which is not limited here.
  • the electronic device presets, among the at least two display screens included in the electronic device, a display screen used for acquiring the user's handwriting input (that is, the second display screen) and a display screen not used for acquiring the user's handwriting input (that is, the first display screen), so that the user can place the preset second display screen at a position convenient for handwriting.
  • the electronic device determines, according to the placement direction of each of the at least two display screens included in the electronic device, a display screen for acquiring the user's handwriting input (that is, the second display screen) and a display screen not for acquiring the user's handwriting input (that is, the first display screen).
  • the electronic device can obtain the included angle between the placement angle of each of the at least two display screens and the horizontal direction, and then select, from the at least two display screens, the display screen with the smallest included angle with the horizontal direction as the second display screen 502, using the remaining display screens as first display screens 501.
  • the electronic device may also select, from the at least two display screens, at least one display screen whose included angle with the horizontal direction is smaller than a first angle threshold as the second display screen 502, using the remaining display screens as first display screens 501; the first angle threshold may be 25 degrees, 30 degrees, 40 degrees, or another value, which is not exhaustive here.
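  • The smallest-tilt selection described above can be sketched as follows. The screen identifiers and angle values are hypothetical; the rule (the display closest to horizontal becomes the handwriting screen) is the one stated in this embodiment.

```python
# Sketch under assumed sensor values: the display with the smallest angle
# from the horizontal becomes the second display screen 502 (handwriting);
# all remaining displays are treated as first display screens 501.
def pick_handwriting_screen(screens):
    """screens: dict mapping screen_id -> tilt angle from horizontal (deg)."""
    second = min(screens, key=screens.get)      # flattest screen
    first = [s for s in screens if s != second]  # the rest
    return second, first

second, first = pick_handwriting_screen({"A": 80, "B": 10})
```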
  • the first display screen 501 and the second display screen 502 may be independent screens, connected through a data interface or through a bus.
  • alternatively, the first display screen 501 and the second display screen 502 are integrated into one flexible screen, with the first display screen 501 and the second display screen 502 being two different areas of the flexible screen.
  • an electronic pen may also be configured in the electronic device; the electronic pen may specifically adopt an electronic pen of electromagnetic touch (electromagnetic resonance, EMR) technology, an electronic pen of active electrostatic induction (active electrostatic solution, AES) technology, or the like.
  • the application scenarios of the embodiments of the present application are not exhaustively enumerated here. In all the aforementioned scenarios, there is a problem of cumbersome operations in the handwriting input process.
  • an embodiment of the present application provides a method for processing an application interface.
  • the method for processing an application interface is applied to the electronic device shown in FIG. 41 .
  • the electronic device displays the first application interface through the first display screen, and when it is detected that the mode type corresponding to the first application interface is handwriting input, in response to the input mode of the handwriting input, the electronic device triggers the display of the first application interface on the second display screen, so as to obtain the handwritten content for the first application interface through the second display screen.
  • that is, when it is detected that the current mode type corresponding to the first application interface is the input mode of handwriting input, the electronic device automatically displays the first application interface on the second display screen, so that the handwritten content for the first application interface can be obtained directly through the second display screen; in other words, the conversion from other modes to handwriting input is completed directly, without steps such as copying and pasting, which avoids tedious operations and greatly improves the efficiency of handwriting input.
  • in some application scenarios, an application interface can only switch between the input mode of keyboard input and the input mode of handwriting input; in other application scenarios, an application interface can also switch between an input mode and the browsing mode. The two application scenarios are described separately below, because their specific implementation processes are different.
  • FIG. 42 is a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • the method for processing an application interface provided by the embodiment of the present application may include:
  • the electronic device acquires a startup operation for the first application interface.
  • the first application interface refers to any application interface in at least one application interface included in the target application (application, APP); that is, the first application interface may be the interface that appears when the target application is opened, or may be a new interface opened during use of the target application.
  • step 4201 may include: the electronic device obtains a start-up operation for the first application interface through the first display screen, or the electronic device obtains a start-up operation for the first application interface through the second display screen. Further, the electronic device obtains the start-up operation for the first application interface through an electronic pen, a mouse or a finger.
  • the electronic device determines, based on the startup operation, a mode type corresponding to the first application interface.
  • the electronic device determines a mode type corresponding to the first application interface based on the acquired startup operation, and the mode type corresponding to the first application interface is handwriting input or keyboard input.
  • the electronic device determines the mode type corresponding to the first application interface according to the acquisition position corresponding to the startup operation. Specifically, if the startup operation is obtained through the first display screen, it indicates that the user tends to display the first application interface on the first display screen, and the electronic device determines that the mode type corresponding to the first application interface is keyboard input, that is, the initial mode type of the first application interface is keyboard input. If the startup operation is obtained through the second display screen, it indicates that the user tends to use the first application interface on the second display screen, and the electronic device determines that the mode type corresponding to the first application interface is handwriting input, that is, the initial mode type of the first application interface is handwriting input.
  • the difference between the first display screen and the second display screen can be referred to the above description of FIG. 41 , which is not repeated here.
  • the mode type corresponding to the first application interface is determined based on the position where the electronic device obtains the start operation, which is simple to operate and easy to implement.
  • the electronic device determines the mode type corresponding to the first application interface according to the startup mode corresponding to the startup operation. Specifically, if the startup operation is obtained through the electronic pen, the electronic device determines that the mode type corresponding to the first application interface is handwriting input; as an example, if the electronic device detects that the user clicks the target application through the electronic pen to open the first application interface, the electronic device may determine that the mode type corresponding to the first application interface is handwriting input. If the startup operation is obtained through a mouse or a finger, the mode type corresponding to the first application interface is determined to be keyboard input.
  • the electronic device may determine the mode type corresponding to the first application interface according to both the acquisition position corresponding to the startup operation and the startup mode corresponding to the startup operation. Specifically, in one case, if the startup operation is obtained through the electronic pen, or if the startup operation is obtained through the second display screen, the electronic device determines the mode type corresponding to the first application interface as handwriting input; if the startup operation is obtained through a mouse or a finger, and is obtained through the first display screen, the electronic device determines the mode type corresponding to the first application interface as keyboard input.
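  • The combined rule above can be sketched as a small decision function. The string labels and the fallback branch are illustrative assumptions; the pen/second-screen and mouse-or-finger/first-screen branches follow the text.

```python
# Hedged sketch of the combined rule: the startup position (which screen)
# and the startup tool together decide the initial mode type.
def initial_mode_type(screen, tool):
    if tool == "pen" or screen == "second":
        return "handwriting_input"
    if tool in ("mouse", "finger") and screen == "first":
        return "keyboard_input"
    return "keyboard_input"  # assumed fallback for unlisted combinations
```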
  • the electronic device determines the mode type corresponding to the first application interface as handwriting input.
  • the electronic device determines the mode type corresponding to the first application interface as keyboard input.
  • the electronic device may also determine the initial mode type of the first application interface in other manners, which are not listed one by one here.
  • the electronic device determines whether the mode type corresponding to the first application interface is handwriting input; if the mode type is keyboard input, step 4204 is entered; if the mode type is handwriting input, step 4211 is entered.
  • In response to the input mode of the keyboard input, the electronic device triggers the display of the first application interface on the first display screen and the display of the virtual keyboard on the second display screen.
  • when the electronic device determines that the mode type corresponding to the first application interface is not handwriting input but keyboard input, in response to the input mode of the keyboard input, the electronic device triggers the display of the first application interface on the first display screen and the display of the virtual keyboard on the second display screen, so as to obtain the input content for the first application interface through the virtual keyboard on the second display screen.
  • FIG. 43 is a schematic interface diagram of a display interface of a second display screen in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 43 shows the open icon of application interface 1, the open icon of application interface 2, and the display interface of the virtual keyboard (corresponding to the first application interface shown on the first display screen), so that the user can switch between the virtual keyboard and application interface 1 by clicking the open icon of application interface 1, and switch between the virtual keyboard and application interface 2 by clicking the open icon of application interface 2. It should be understood that the example in FIG. 43 is only provided to facilitate understanding of this solution, and is not intended to limit this solution.
  • the electronic device may also provide a zoom icon on the display interface of the virtual keyboard on the second display screen; when the user clicks the zoom-out icon with a stylus, a finger, a mouse, or the like, the virtual keyboard displayed on the second display screen is folded, and when the user clicks the zoom-in icon, the virtual keyboard displayed on the second display screen is expanded.
  • the user can also switch between the virtual keyboard displayed on the second display screen and other application interfaces by inputting a sliding operation on the second display screen; the sliding operation may be a sliding operation in the left-right direction, a sliding operation in the up-down direction, or the like. The electronic device may also adopt other methods to switch between the virtual keyboard and other application interfaces, which are not exhaustive here.
  • FIG. 44 and FIG. 45 are respectively a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • FIG. 44 includes two sub-schematic diagrams (a) and (b).
  • as shown in sub-schematic diagram (a) of FIG. 44, the electronic device obtains, through the first display screen, a startup operation for the target application (that is, the "Notes" application in the figure); since the startup operation is input through the first display screen, the electronic device determines that the mode type corresponding to the first application interface is keyboard input, and then enters sub-schematic diagram (b) of FIG. 44.
  • the electronic device displays the first application interface (that is, the initial application interface of the "Notes" application) on the first display screen, and displays the virtual keyboard and the touchpad area on the second display screen.
  • FIG. 45 includes two sub-schematic diagrams (a) and (b).
  • as shown in sub-schematic diagram (a) of FIG. 45, the electronic device obtains, through the first display screen, a startup operation for the target application (that is, the "Notes" application in the figure); since the startup operation is input with a finger, the electronic device determines that the mode type corresponding to the first application interface is keyboard input, and then enters sub-schematic diagram (b) of FIG. 45.
  • the electronic device displays the first application interface on the first display screen, and displays the virtual keyboard and touchpad area on the second display screen.
  • in FIG. 44 and FIG. 45, the second display screen may also display only the virtual keyboard, without displaying the touchpad area.
  • FIG. 44 and FIG. 45 are only for the convenience of understanding the present solution, and are not intended to limit the present solution.
  • step 4204 may include: displaying a virtual keyboard and an application control bar on the second display screen.
  • the method may further include: the electronic device detects a second operation acting on the second display screen; in response to the second operation, the first display area of the application control bar is changed to a second display area, and the first control key group included in the application control bar is changed to a second control key group, where both the first control key group and the second control key group are control key sets corresponding to the target application.
  • the first application interface includes a first control key
  • step 4204 may include: displaying a virtual keyboard and an application control bar on the second display screen.
  • the method may further include: the electronic device detects a second operation on the first target application interface; in response to the second operation, displaying the first control key in the application control bar, and hiding the first control key in the first application interface.
  • step 4204 may include: the electronic device displays a second type of virtual keyboard (which may also be referred to as a default type of virtual keyboard) on the second display screen.
• the method further includes: the electronic device detects a first gesture operation acting on the second display screen; in response to the first gesture operation, selecting, from multiple types of virtual keyboards, a first type of virtual keyboard corresponding to the first gesture operation, where different types of virtual keyboards among the multiple types include different virtual keys; and displaying the first type of virtual keyboard through the second display screen, the first type of virtual keyboard and the second type of virtual keyboard being different types among the multiple types of virtual keyboards.
  • the electronic device displays the virtual keyboard of the second type on the second display screen
  • the user can input different gesture operations to change the type of the virtual keyboard displayed on the second display screen.
• For the meanings of terms such as the first gesture operation and the different types of virtual keyboards, and for the specific implementation manner of the foregoing steps, reference may be made to the description in Embodiment 2; details are not repeated here.
  • the electronic device acquires the mode type corresponding to the first application interface.
• After the electronic device opens the first application interface, that is, during the running of the first application interface, the electronic device also detects and acquires the mode type corresponding to the first application interface in real time, so as to determine whether the mode type corresponding to the first application interface has changed. Specifically, if the electronic device detects the first operation, it converts the mode type corresponding to the first application interface to handwriting input in response to the first operation; if the electronic device does not detect the first operation, the mode type corresponding to the first application interface remains keyboard input, and the electronic device continues to detect and acquire the mode type corresponding to the first application interface.
• The electronic device may determine the mode type corresponding to the first application interface according to the user's holding posture of the electronic pen. Specifically, in one case, a first preset condition is pre-stored in the electronic device; the electronic device acquires the user's holding posture of the electronic pen in real time and determines whether that holding posture satisfies the first preset condition.
• If the user's holding posture of the electronic pen satisfies the first preset condition, the electronic device determines that the user's first operation is detected, and converts the mode type corresponding to the first application interface to the input mode of handwriting input; if the holding posture does not satisfy the first preset condition, the electronic device determines that the mode type corresponding to the first application interface is the input mode of keyboard input.
• The holding posture includes any one or a combination of the following: holding position, holding strength, holding angle, or other holding-related factors, which are not limited here.
• The first preset condition includes any one or a combination of the following: the holding position is within a first position range, the holding force is within a first force range, the holding angle is within a first angle range, or other preset conditions.
• In addition to being used for writing, the electronic pen can also perform other operations, for example, some operations otherwise performed by a mouse, such as a sliding operation or a selection operation. Alternatively, the user may simply be holding the electronic pen unconsciously without intending to perform a writing operation; the possibilities are not exhaustively listed here.
• The electronic device does not simply conclude that the mode type corresponding to the first application interface is the handwriting mode whenever the user uses the electronic pen; instead, it further determines the mode type corresponding to the first application interface according to the user's holding posture of the electronic pen, thereby reducing the error rate of the judgment process of the mode type corresponding to the first application interface, so as to reduce the probability of incorrectly placing the first application interface. This not only avoids waste of computer resources, but also helps to improve user stickiness.
• The electronic pen can be configured in the electronic device, and after the user takes the electronic pen out of the electronic device, a communication interface can be established between the electronic pen and the electronic device. The electronic pen can collect, in real time, the holding parameters corresponding to the holding posture and send the holding parameters to the electronic device, so that the electronic device determines whether the user's holding posture of the electronic pen satisfies the first preset condition.
  • the holding parameter includes any one or a combination of the following: the position of the contact point corresponding to the holding operation, the holding force, the inclination angle of the electronic pen, or other parameters, and the like.
• The electronic pen can be provided with a contact sensing module, which collects in real time the position of each contact point between the user and the electronic pen (that is, determines the user's holding position on the electronic pen) and sends the position of each contact point to the electronic device; the electronic device then determines, according to the position of each contact point, whether the user's holding position on the electronic pen is within the first position range.
  • the touch sensing module may be embodied as a contact sensing film, and the contact sensing film may be a capacitive touch sensing film, a pressure contact sensing film, a temperature contact sensing film, or other types of films, which are not exhaustive here.
  • the electronic pen can be provided with a pressure sensing module.
• The pressure sensing module of the electronic pen collects the user's grip strength on the electronic pen in real time and sends it to the electronic device, and the electronic device determines whether the user's grip strength on the electronic pen is within the first strength range.
  • the pressure sensing module can specifically be expressed as a pressure sensing film, a distributed pressure sensor, or other forms, which are not exhaustive here.
  • An angle measurement module may be provided in the electronic pen, and the angle measurement module of the electronic pen collects the inclination angle of the electronic pen in real time (that is, determines the holding angle of the electronic pen by the user), and sends the inclination angle of the electronic pen to the electronic device.
• The electronic device determines whether the user's holding angle of the electronic pen is within the first angle range. The angle measurement module may specifically be represented as a gyroscope or another type of angle measurement module, which is not limited here.
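Putting the three sensor readings together, the first-preset-condition check described above can be sketched as follows. This is a minimal illustrative sketch: the function name, units, and threshold ranges are assumptions for exposition, not values taken from the embodiment.

```python
# Hypothetical check of the first preset condition: holding position,
# holding force, and holding angle must each fall within preset ranges.
def is_handwriting_posture(contact_points_mm, grip_force_n, tilt_angle_deg,
                           position_range=(20.0, 60.0),   # mm from the pen tip (assumed)
                           force_range=(0.5, 5.0),        # newtons (assumed)
                           angle_range=(30.0, 70.0)):     # degrees from horizontal (assumed)
    """Return True when the holding posture satisfies the first preset condition."""
    # Holding position: every contact point reported by the contact
    # sensing module must lie within the first position range.
    position_ok = all(position_range[0] <= p <= position_range[1]
                      for p in contact_points_mm)
    # Holding force: the grip strength from the pressure sensing module
    # must lie within the first force range.
    force_ok = force_range[0] <= grip_force_n <= force_range[1]
    # Holding angle: the inclination from the angle measurement module
    # must lie within the first angle range.
    angle_ok = angle_range[0] <= tilt_angle_deg <= angle_range[1]
    return position_ok and force_ok and angle_ok
```

When this check passes, the device would treat the first operation as detected and convert the mode type to handwriting input; otherwise the mode type remains keyboard input.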
• The electronic device may record in advance the user's holding posture when using the electronic pen for handwriting input, and then determine the first preset condition according to the holding posture entered by the user.
• The electronic device can also collect the user's holding posture during the process of writing with the electronic pen, that is, the positions of the contact points between the user's fingers and the electronic pen, the user's holding strength, the inclination angle of the electronic pen, and so on, and accordingly adjust the first preset condition.
• Alternatively, the first preset condition in the electronic device may be preset.
  • FIG. 46 is a schematic diagram of various holding postures in the method for processing an application interface provided by an embodiment of the present application.
• FIG. 46 shows six sub-schematic diagrams (a), (b), (c), (d), (e) and (f). Sub-schematic diagrams (a), (b), (c) and (d) respectively show four holding postures of the user when writing with the electronic pen.
• Sub-schematic diagrams (e) and (f) show two postures in which the electronic pen is held but is not being used for writing. It should be understood that the example in FIG. 46 is only for the convenience of understanding the concept of the user's holding posture of the electronic pen, and is not intended to limit this solution.
• The electronic device may set, on the first application interface or on the display interface of the virtual keyboard, trigger icons respectively corresponding to the input mode of keyboard input and the input mode of handwriting input. When the user clicks the icon corresponding to handwriting input on the first application interface, the electronic device obtains a trigger instruction for handwriting input, that is, the electronic device detects the user's first operation; when the user clicks the icon corresponding to keyboard input on the first application interface, the electronic device obtains a trigger instruction for keyboard input.
• The electronic device may also be provided, on the first application interface, with a switching icon for switching between the input modes of keyboard input and handwriting input. When the switching icon is in the first state, it is considered that the user has input a trigger operation for handwriting input; when the switching icon is in the second state, it is considered that the user has input a trigger operation for keyboard input. The manners in which the electronic device obtains the trigger instruction for handwriting input are not exhaustively listed here.
  • the electronic device determines that the mode type corresponding to the first application interface is an input mode of handwriting input.
  • FIG. 47 is a schematic interface diagram of the first application interface in the processing method of the application interface provided by the embodiment of the present application
• FIG. 48 is another schematic interface diagram of the first application interface provided by the embodiment of the present application.
• FIG. 48 includes two sub-schematic diagrams (a) and (b). In sub-schematic diagrams (a) and (b) of FIG. 48, D1 represents the switching icon for switching between the input modes of keyboard input and handwriting input.
• When the aforementioned switching icon is in the first state, the mode type corresponding to the first application interface is the input mode of keyboard input.
• When the aforementioned switching icon is in the second state, the mode type corresponding to the first application interface is the input mode of handwriting input.
• The examples in FIG. 47 and FIG. 48 are only for the convenience of understanding this solution, and are not intended to limit this solution.
• The electronic device may also obtain the first contact operation input by the user through the first application interface displayed on the first display screen, or through the interface of the virtual keyboard displayed on the second display screen; when the first contact operation is detected, the electronic device determines that the first operation is detected, and then converts the mode type corresponding to the first application interface to the input mode of handwriting input.
• The first contact operation is a click operation or a preset trajectory operation; further, the first contact operation may be a single-click operation, a double-click operation, a triple-click operation, a long-press operation, a "Z"-shaped trajectory operation, a slide-down operation, a "check"-shaped trajectory operation, a "circle"-shaped trajectory operation, or another contact operation, which are not exhaustively listed here.
• Step 4205 may include: the electronic device obtains a sliding operation in a first direction through the second display screen, where the sliding operation in the first direction is a slide from the upper edge of the second display screen toward the lower edge of the second display screen.
• The upper edge of the second display screen is closer to the first display screen than the lower edge of the second display screen is.
• The electronic device moves the virtual keyboard displayed on the second display screen toward the lower edge of the second display screen along the first direction, and when the upper edge of the virtual keyboard reaches the lower edge of the second display screen, confirms that the mode type corresponding to the first application interface is changed to handwriting input.
• In other words, the virtual keyboard displayed on the second display screen can follow the user's downward sliding operation, and when the upper edge of the virtual keyboard reaches the lower edge of the second display screen, the electronic device confirms that the mode type corresponding to the first application interface is changed to handwriting input. This adds interest to the process of switching from keyboard input to handwriting input, and is beneficial to improving user stickiness.
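The slide-to-dismiss behavior above can be sketched as a small tracker. The class name, the coordinate convention (y grows downward), and the pixel values are illustrative assumptions, not details from the embodiment.

```python
SECOND_SCREEN_BOTTOM = 800   # y coordinate of the second screen's lower edge (assumed)

class KeyboardSlideTracker:
    """Tracks the virtual keyboard's upper edge as it follows a downward slide."""
    def __init__(self, keyboard_top_y):
        self.keyboard_top_y = keyboard_top_y
        self.mode = "keyboard"

    def on_slide(self, dy):
        """Move the keyboard down by dy pixels; switch the mode type once the
        keyboard's upper edge reaches the screen's lower edge."""
        if dy > 0:  # only downward motion moves the keyboard
            self.keyboard_top_y = min(self.keyboard_top_y + dy, SECOND_SCREEN_BOTTOM)
        if self.keyboard_top_y >= SECOND_SCREEN_BOTTOM:
            self.mode = "handwriting"
        return self.mode
```

Until the keyboard is fully off-screen, the mode type stays keyboard input; the switch happens only at the moment the upper edge crosses the lower edge.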
  • FIG. 49 is a schematic diagram of a first contact operation in the method for processing an application interface provided by an embodiment of the present application.
  • FIG. 49 includes three sub-schematic diagrams (a), (b) and (c).
  • the first touch operation is a sliding operation input through the second display screen as an example.
• As shown in sub-schematic diagrams (a) and (b) of FIG. 49, when the user inputs a sliding operation through the display interface of the virtual keyboard on the second display screen, the virtual keyboard on the second display screen is closed.
• The electronic device can detect the distance between the electronic pen and the second display screen in real time, and when it finds that the electronic pen is located within a preset range of the second display screen, it determines that the first operation is detected, and changes the mode type corresponding to the first application interface to the input mode of handwriting input.
  • the preset range of the second display screen may refer to within 3 centimeters, within 4 centimeters, within 5 centimeters, or other ranges directly above the second display screen, which is not limited here.
• The electronic device can collect the state of the electronic pen in real time; when the electronic pen changes from a first preset state to a second preset state, it determines that the first operation is detected, and changes the mode type corresponding to the first application interface to the input mode of handwriting input; when the electronic pen is not in the second preset state, it determines that the mode type corresponding to the first application interface is the input mode of keyboard input.
• The transition of the electronic pen from the first preset state to the second preset state may be a transition from a stationary state to a moving state, a transition from an unheld state to a held state, and so on, which are not exhaustively listed here.
• For example, the user takes the electronic pen out of the electronic device (the electronic pen is converted from an unheld state to a held state), or the user picks up the electronic pen from a place outside the electronic device (the electronic pen is converted from an unheld state to a held state); in either case, the electronic device can determine that the mode type corresponding to the first application interface is the input mode of handwriting input.
• When the electronic pen is taken out of the electronic device, it establishes a communication connection with the electronic device, and the electronic pen is then considered to transition from an unheld state to a held state.
• The electronic pen can be equipped with a vibration sensor (such as a gyroscope, an acceleration sensor, or another type of sensor), so that the electronic pen can collect its vibration data in real time and send the vibration data to the electronic device in real time through the communication module, and the electronic device determines whether the electronic pen changes from a stationary state to a moving state.
• The perception that the electronic pen has been taken out of the device may rely on the device's own processing module receiving a signal that the interface between the stylus and the device has been disconnected; alternatively, the sensor module on the stylus may sense the disconnection from the device and report it to the screen device through the communication module.
• The perception that the user has picked up the pen is realized by sensing, through the stylus's own sensor module (such as a gyroscope sensor or an acceleration sensor), the vibration caused by the user picking up the stylus, and then sending the vibration data through the communication module to the main device's processing module.
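The pickup detection above can be sketched as a simple threshold test over the streamed motion samples. The threshold and sample count below are assumptions for illustration, not values from the embodiment.

```python
def pen_picked_up(accel_magnitudes, threshold=0.3, min_samples=3):
    """Return True when at least min_samples consecutive acceleration readings
    exceed the noise threshold, i.e. the pen left the stationary state."""
    run = 0
    for a in accel_magnitudes:
        # count consecutive above-threshold readings; reset on a quiet sample
        run = run + 1 if a > threshold else 0
        if run >= min_samples:
            return True
    return False
```

Requiring several consecutive readings filters out one-off sensor noise, so an accidental bump does not switch the mode type to handwriting input.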
• Various judgment methods for the mode type corresponding to the first application interface are provided, which improves the implementation flexibility of the solution and also expands its application scenarios. Further, when the mode type corresponding to the first application interface is determined according to the holding posture of the electronic pen, the user can trigger the transition of the mode type of the first application interface without performing any other operation; determining the mode type according to the user's holding posture of the electronic pen also reduces the error rate of the judgment process, so as to reduce the probability of incorrectly placing the first application interface, which not only avoids the waste of computer resources, but also helps to improve user stickiness.
• The electronic device determines whether the mode type corresponding to the first application interface is converted into handwriting input; if so, it enters step 4207; if not, it re-enters step 4205.
• The electronic device performs step 4206 after performing step 4205, to determine whether the mode type corresponding to the first application interface is changed from the input mode of keyboard input to the input mode of handwriting input. If the mode type corresponding to the first application interface is changed to handwriting input, the electronic device enters step 4207; if not, it re-enters step 4205 to continue detecting the mode type corresponding to the first application interface. It should be noted that, in the embodiment of the present application, steps 4205 and 4206 may be executed alternately, and the embodiment of the present application does not limit the relationship between the numbers of times steps 4205 and 4206 are executed and the number of times step 4207 is executed; step 4207 is performed once after each positive determination in step 4206.
  • the electronic device In response to the input mode of the handwriting input, the electronic device triggers to display the first application interface on the second display screen.
• When the electronic device acquires that the mode type corresponding to the first application interface is converted from the input mode of keyboard input to the input mode of handwriting input, in response to the input mode of handwriting input, the electronic device triggers the display of the first application interface on the second display screen and closes the virtual keyboard displayed on the second display screen, so as to obtain, through the second display screen, the handwritten content input by the user for the first application interface.
• The electronic device may display the first application interface on the second display screen by moving the first application interface to the second display screen for display, or by automatically copying the first application interface and then displaying it on the second display screen, and so on.
• An operating system runs on the electronic device, and the electronic device can call the move-to function in the operating system, or the Set Window Position function in the operating system, or the Set Window Placement function in the operating system, to display the first application interface on the second display screen.
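As a non-authoritative sketch of the re-layout those calls would perform, the hypothetical WindowManager below stands in for the operating system's window-positioning API (the real calls would be the move-to / Set Window Position / Set Window Placement functions mentioned above); all names here are assumptions.

```python
class WindowManager:
    """Hypothetical stand-in for the OS window-positioning API."""
    def __init__(self):
        self.windows = {}            # window id -> screen name (None = closed)

    def set_window_screen(self, win_id, screen):
        self.windows[win_id] = screen

def on_mode_change(wm, mode):
    """Lay out the first application interface and the virtual keyboard
    according to the mode type, as in steps 4207 and 4210."""
    if mode == "handwriting":
        wm.set_window_screen("first_app_interface", "second_screen")
        wm.set_window_screen("virtual_keyboard", None)       # keyboard closed
    else:                                                    # keyboard input
        wm.set_window_screen("first_app_interface", "first_screen")
        wm.set_window_screen("virtual_keyboard", "second_screen")
    return wm.windows
```

Moving versus copying the interface would differ only in whether the first screen keeps its own entry for the interface; the sketch shows the moving variant.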
• The electronic device can close the virtual keyboard displayed on the second display screen, move the first application interface to the second display screen (that is, the first application interface is no longer displayed on the first display screen), and display the first application interface in full screen through the second display screen.
  • the electronic device may close the virtual keyboard displayed on the second display screen, and copy the first application interface to the second display screen, so as to display the first application interface on both the first display screen and the second display screen. an application interface.
  • FIG. 50 is a schematic diagram of a display interface of a first application interface in the processing method of an application interface provided by an embodiment of the present application.
• FIG. 50 includes two sub-schematic diagrams (a) and (b). Sub-schematic diagram (a) of FIG. 50 shows the first display screen and the second display screen when the mode type corresponding to the first application interface is the input mode of keyboard input. When the electronic device acquires that the mode type corresponding to the first application interface is converted from the input mode of keyboard input to the input mode of handwriting input, the electronic device transitions from sub-schematic diagram (a) of FIG. 50 to sub-schematic diagram (b) of FIG. 50.
• In sub-schematic diagram (b), the virtual keyboard displayed on the second display screen is closed, and the first application interface is moved to the second display screen.
• In addition to the first application interface, other application interfaces may also be displayed on the first display screen of the electronic device. The examples here are only to facilitate understanding of the solution, and are not intended to limit the solution.
  • other application interfaces are also displayed on the second display screen of the electronic device.
  • the electronic device may close the virtual keyboard displayed on the second display screen, and display the first application interface and other application interfaces on the second display screen in a matrix manner.
  • the electronic device may close the virtual keyboard displayed on the second display screen, and display the first application interface on the second display screen in the form of a floating window.
• The electronic device may close the virtual keyboard displayed on the second display screen and move the other application interfaces displayed on the second display screen to the first display screen, so as to display the first application interface in full screen on the second display screen; the possibilities are not exhaustively listed here.
• In addition, the electronic device may continue to display the first application interface on the first display screen, or may no longer display the first application interface on the first display screen.
  • step 4204 may include: displaying the first application interface and the application control bar on the second display screen.
• the method may further include: the electronic device detects a second operation acting on the second display screen; in response to the second operation, changing the first display area of the application control bar to a second display area, and changing the first control key group included in the application control bar to a second control key group, where both the first control key group and the second control key group are control key sets corresponding to the target application.
  • the first application interface includes first control keys
  • step 4204 may include: displaying the first application interface and the application control bar on the second display screen.
  • the method may further include: the electronic device detects a second operation on the first target application interface; in response to the second operation, displaying the first control key in the application control bar, and hiding the first control key in the first application interface.
  • the electronic device acquires the mode type corresponding to the first application interface.
• For a specific implementation manner of step 4208, reference may be made to the above description of step 4205, which is not repeated here.
• The electronic device determines whether the mode type corresponding to the first application interface is converted to keyboard input; if so, it enters step 4210; if not, it re-enters step 4208.
• The electronic device performs step 4209 after performing step 4208, to determine whether the mode type corresponding to the first application interface is changed from the input mode of handwriting input to the input mode of keyboard input.
• If the mode type corresponding to the first application interface is changed to keyboard input, the electronic device enters step 4210; if the mode type corresponding to the first application interface is not changed to keyboard input, it re-enters step 4208 to continue detecting the mode type corresponding to the first application interface.
• Steps 4208 and 4209 may be executed alternately, and the embodiment of the present application does not limit the relationship between the numbers of times steps 4208 and 4209 are executed and the number of times step 4210 is executed; step 4210 is performed once after each positive determination in step 4209.
  • the electronic device In response to the input mode of the keyboard input, the electronic device triggers the display of the first application interface on the first display screen, and the display of the virtual keyboard on the second display screen.
• For a specific implementation manner of step 4210, reference may be made to the above description of step 4204, which is not repeated here. It should be noted that, after performing step 4210, the electronic device can re-enter step 4205 to detect in real time whether the mode type corresponding to the first application interface is changed to handwriting input. In addition, steps 4205 to 4209 are optional steps; if the user closes the first application interface in any of steps 4205 to 4209, it is no longer necessary to continue performing the remaining steps.
• In the process of displaying the application interface, the layout of the application interface on the different display screens of the electronic device can be automatically adjusted not only when the mode type of the application interface is changed from another mode type to handwriting input, but also when the mode type is changed to keyboard input, in which case the virtual keyboard is also displayed automatically. Therefore, when the mode type of the application interface is changed to keyboard input, the user does not need to manually adjust the layout of the application interface on the different display screens and can directly use keyboard input; the steps are simple, which further improves the user stickiness of this solution.
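The alternating detection described in steps 4205 through 4210 amounts to a two-state loop: in keyboard mode the device watches for the first operation, and in handwriting mode it watches for the keyboard trigger. The event names below are illustrative assumptions.

```python
def process_events(events, mode="keyboard"):
    """Fold a stream of detected operations into the resulting mode type."""
    for ev in events:
        if mode == "keyboard" and ev == "first_operation":
            mode = "handwriting"          # steps 4206 -> 4207
        elif mode == "handwriting" and ev == "keyboard_trigger":
            mode = "keyboard"             # steps 4209 -> 4210
        # any other event leaves the mode type unchanged (re-enter detection)
    return mode
```

Each transition would also trigger the corresponding re-layout of the application interface and virtual keyboard on the two display screens.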
  • the electronic device In response to the input mode of the handwriting input, the electronic device triggers to display the first application interface on the second display screen.
• When the electronic device determines that the mode type corresponding to the first application interface is handwriting input, in response to the input mode of handwriting input, the electronic device triggers the display of the first application interface on the second display screen, so as to acquire the input content for the first application interface through the second display screen.
• For the display manner of the first application interface on the second display screen, reference may be made to the description in step 4207 above, which is not repeated here.
  • FIG. 51 and FIG. 52 are schematic flowcharts of a method for processing an application interface provided by an embodiment of the present application, respectively.
  • FIG. 51 includes two sub-schematic diagrams (a) and (b).
• As shown in sub-schematic diagram (a) of FIG. 51, the electronic device obtains the startup operation for the target application (that is, the "Notes" application program). Since the startup operation is input through the second display screen, the electronic device determines that the mode type corresponding to the first application interface is handwriting input, and then enters sub-schematic diagram (b) of FIG. 51.
• The electronic device displays the first application interface (that is, the initial application interface of the "Notes" application program) on the second display screen.
  • FIG. 52 includes two sub-schematic diagrams (a) and (b).
• In sub-schematic diagram (a) of FIG. 52, the electronic device obtains, through the first display screen, the startup operation for the target application (that is, the opening operation of the "Notes" application program in the illustration). Since the startup operation is obtained through the electronic pen, the electronic device determines that the mode type corresponding to the first application interface is handwriting input, and then enters sub-schematic diagram (b) of FIG. 52.
• The electronic device displays the first application interface on the second display screen.
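The initial mode determination illustrated by FIG. 45, FIG. 51 and FIG. 52 can be sketched as a mapping from how the startup operation was made to a mode type. The input-source and screen labels below are assumptions for illustration.

```python
def initial_mode(input_source, screen="first"):
    """Determine the mode type of the first application interface at startup:
    a startup operation made with the electronic pen, or made through the
    second display screen, yields handwriting input; a finger input on the
    first display screen yields keyboard input."""
    if input_source == "pen" or screen == "second":
        return "handwriting"
    return "keyboard"
```

The chosen mode then drives the initial layout: keyboard input places the interface on the first screen with the virtual keyboard on the second, while handwriting input places the interface on the second screen.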
  • the electronic device acquires the mode type corresponding to the first application interface.
• The electronic device determines whether the mode type corresponding to the first application interface is converted into keyboard input; if so, it enters step 4214; if not, it re-enters step 4212.
  • in response to the keyboard input mode, the electronic device triggers display of the first application interface on the first display screen and display of the virtual keyboard on the second display screen.
  • the electronic device acquires the mode type corresponding to the first application interface.
  • the electronic device determines whether the mode type corresponding to the first application interface is converted into handwriting input; if it is converted into handwriting input, step 4217 is entered; if it is not converted into handwriting input, step 4215 is re-entered.
  • in response to the handwriting input mode, the electronic device triggers display of the first application interface on the second display screen.
  • for a specific implementation of steps 4215 to 4217, reference may be made to the above description of steps 4205 to 4207; details are not repeated here.
  • after step 4217, the electronic device can re-enter step 4212 to detect in real time whether the mode type corresponding to the first application interface is changed to keyboard input; in addition, steps 4212 to 4217 are optional steps, and if the user closes the first application interface in any one of steps 4212 to 4217, it is no longer necessary to continue to perform the remaining steps.
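The switching behavior of steps 4212 to 4217 can be sketched as a simple dispatch on the detected mode type. This is only an illustrative model, not the claimed implementation; the `Screen` class, the method names, and the mode strings are all assumptions made for the example.

```python
# Illustrative sketch of the mode-switching behavior in steps 4212 to 4217.
# All class and method names are hypothetical, not part of the application.

class Screen:
    """Minimal stand-in for one display screen of the dual-screen device."""
    def __init__(self):
        self.content = None
        self.keyboard_visible = False

    def show(self, interface):
        self.content = interface

    def show_virtual_keyboard(self):
        self.keyboard_visible = True


def apply_mode(first_screen, second_screen, interface, mode_type):
    if mode_type == "keyboard":
        # Step 4214: interface on the first screen, virtual keyboard on the second.
        first_screen.show(interface)
        second_screen.show_virtual_keyboard()
    elif mode_type == "handwriting":
        # Step 4217: interface moves to the second screen for pen input.
        second_screen.show(interface)


first, second = Screen(), Screen()
apply_mode(first, second, "notes", "keyboard")     # mode changed to keyboard input
apply_mode(first, second, "notes", "handwriting")  # mode changed back to handwriting
```

In an actual device the mode type would be re-acquired repeatedly (steps 4212 and 4215) and `apply_mode` invoked only on a change; the loop is omitted here for brevity.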
  • FIG. 53 is a schematic flowchart of a method for processing an application interface provided by an embodiment of the present application.
  • the method for processing an application interface provided by the embodiment of the present application may include:
  • the electronic device acquires a startup operation for the first application interface.
  • the electronic device determines, based on the startup operation, a mode type corresponding to the first application interface.
  • the electronic device determines whether the mode type corresponding to the first application is handwriting input; if the mode type is the browsing mode, step 5304 is entered; if the mode type is handwriting input, step 5311 is entered.
  • for steps 5301 to 5303, please refer to the description of steps 4201 to 4203 in the embodiment corresponding to FIG. 42; the difference is that the keyboard input in steps 4201 to 4203 is replaced with the browsing mode in steps 5301 to 5303. For details, refer to the description in the embodiment corresponding to FIG. 42, which will not be repeated here.
  • the electronic device triggers to display the first application interface on the first display screen.
  • when the electronic device determines that the mode type corresponding to the first application interface is not handwriting input but the browsing mode, the electronic device, in response to the browsing mode, triggers display of the first application interface only on the first display screen.
  • the electronic device acquires the mode type corresponding to the first application interface.
  • the electronic device determines whether the mode type corresponding to the first application interface is converted to handwriting input; if it is converted to handwriting input, step 5307 is entered; if it is not converted to handwriting input, step 5305 is re-entered.
  • in response to the handwriting input mode, the electronic device triggers display of the first application interface on the second display screen.
  • for the specific implementation of steps 5305 to 5307, please refer to the description of steps 4205 to 4207 in the embodiment corresponding to FIG. 42; the difference is that the keyboard input in steps 4205 to 4207 is replaced with the browsing mode in steps 5305 to 5307. Since the virtual keyboard does not need to be displayed on the second display screen in the browsing mode, when the mode type corresponding to the first application interface is changed from the browsing mode to handwriting input, there is also no need to close a virtual keyboard displayed on the second display screen. For details, refer to the description in the embodiment corresponding to FIG. 42, which will not be repeated here.
  • FIG. 54 to FIG. 57 are four schematic diagrams of the display interface of the first application interface in the application interface processing method provided by the embodiment of the present application.
  • Fig. 54 includes two sub-schematic diagrams (a) and (b).
  • in sub-schematic diagram (a) of Fig. 54, the bottom of the first display screen shows a bulb-shaped pattern and three circles; the bulb-shaped pattern represents the display interface of the desktop, and the three circles represent three different application interfaces.
  • the current display on the first display screen is application interface 1 (that is, an example of the first application interface); the upper right corner of the first display screen shows two icons representing the browsing mode and the handwriting mode, respectively; and the second display screen displays application interface 2.
  • when the electronic device acquires that the mode type corresponding to the first application interface is changed from the browsing mode to handwriting input, entry from sub-schematic diagram (a) of FIG. 54 into sub-schematic diagram (b) of FIG. 54 is triggered, that is, the first application interface is moved to the second display screen.
  • in sub-schematic diagram (b) of FIG. 54, the electronic device displays application interface 1 and application interface 2 on the second display screen in the form of a matrix and no longer displays application interface 1 on the first display screen; the current display interface of the first display screen becomes application interface 3, and the user can click application interface 1 to trigger display of application interface 1 in full-screen mode.
  • FIG. 55 includes two sub-schematic diagrams (a) and (b), the sub-schematic diagram (a) of FIG. 55 is consistent with the sub-schematic diagram (a) of FIG. 54 , and will not be repeated here.
  • when the mode type corresponding to the first application interface is changed from the browsing mode to handwriting input, entry into sub-schematic diagram (b) of FIG. 55 is triggered; in sub-schematic diagram (b) of FIG. 55, application interface 1 (that is, an example of the first application interface) is displayed on the second display screen, application interface 1 is no longer displayed on the first display screen, and the current display interface of the first display screen becomes application interface 3.
  • FIG. 56 includes two sub-schematic diagrams (a) and (b).
  • the sub-schematic diagram (a) of FIG. 56 is consistent with the sub-schematic diagram (a) of FIG. 54 .
  • when the mode type corresponding to the first application interface is changed from the browsing mode to handwriting input, entry into sub-schematic diagram (b) of FIG. 56 is triggered; in sub-schematic diagram (b) of FIG. 56, application interface 1 (that is, an example of the first application interface) is displayed on the second display screen, and application interface 1 is still displayed on the first display screen.
  • FIG. 57 includes two sub-schematic diagrams (a) and (b); sub-schematic diagram (a) of FIG. 57 is consistent with sub-schematic diagram (a) of FIG. 54 and will not be repeated here.
  • when the mode type corresponding to the first application interface is changed from the browsing mode to handwriting input, entry into sub-schematic diagram (b) of FIG. 57 is triggered; in sub-schematic diagram (b) of FIG. 57, application interface 1 (that is, an example of the first application interface) is displayed on the second display screen, and the electronic device moves application interface 2, which was displayed on the second display screen, to the first display screen.
  • the electronic device acquires the mode type corresponding to the first application interface.
  • the electronic device determines whether the mode type corresponding to the first application interface is changed to the browsing mode; if it is changed to the browsing mode, step 5310 is entered; if it is not changed to the browsing mode, step 5308 is re-entered.
  • in response to the browsing mode, the electronic device triggers display of the first application interface on the first display screen and does not display the first application interface on the second display screen.
  • for the specific implementation of steps 5308 to 5310, please refer to the description of steps 4208 to 4210 in the embodiment corresponding to FIG. 42; the difference is that the keyboard input in steps 4208 to 4210 is replaced with the browsing mode in steps 5308 to 5310, and when the mode type corresponding to the first application interface is changed from handwriting input to the browsing mode, there is no need to display the virtual keyboard on the second display screen.
  • in this embodiment, the layout of the application interfaces on the different display screens can also be adjusted automatically, so that when the mode type of the application interface is changed to the browsing mode, the user does not need to manually adjust the layout of the application interfaces on the display screens; that is, in a variety of application scenarios, the operation steps can be simplified, further increasing user stickiness of this solution.
  • after step 5310, the electronic device can re-enter step 5305 to detect in real time whether the mode type corresponding to the first application interface is changed to handwriting input; in addition, steps 5305 to 5310 are optional steps, and if the user closes the first application interface in any of steps 5305 to 5310, it is no longer necessary to continue to perform the remaining steps.
  • in response to the handwriting input mode, the electronic device triggers display of the first application interface on the second display screen.
  • the electronic device acquires the mode type corresponding to the first application interface.
  • the electronic device determines whether the mode type corresponding to the first application interface is converted into the browsing mode; if it is converted into the browsing mode, step 5314 is entered; if it is not converted into the browsing mode, step 5312 is re-entered.
  • in response to the browsing mode, the electronic device triggers display of the first application interface on the first display screen and does not display the first application interface on the second display screen.
  • the electronic device acquires the mode type corresponding to the first application interface.
  • the electronic device determines whether the mode type corresponding to the first application interface is converted to handwriting input; if it is converted to handwriting input, step 5317 is entered; if it is not converted to handwriting input, step 5315 is re-entered.
  • in response to the handwriting input mode, the electronic device triggers display of the first application interface on the second display screen.
  • for the specific implementation of steps 5311 to 5317, please refer to the description of steps 4211 to 4217 in the embodiment corresponding to FIG. 42; the difference is that the keyboard input in steps 4211 to 4217 is replaced with the browsing mode in steps 5311 to 5317. When the mode type corresponding to the first application interface is changed from handwriting input to the browsing mode, there is no need to display the virtual keyboard on the second display screen, and when the mode type is changed from the browsing mode to handwriting input, there is also no need to close a virtual keyboard displayed on the second display screen. Details are not repeated here.
  • after step 5317, the electronic device can re-enter step 5312 to detect in real time whether the mode type corresponding to the first application interface is changed to the browsing mode; in addition, steps 5312 to 5317 are optional steps, and if the user closes the first application interface in any of steps 5312 to 5317, it is no longer necessary to continue to perform the remaining steps.
  • in this embodiment, the mode type corresponding to the application interface can also be determined based on the startup operation, and the display position of the application interface can then be determined, so that the user can use the application interface directly after the startup operation without needing to move it, which further improves the convenience of this solution and increases user stickiness.
  • the electronic device displays the first application interface on the first display screen, and detects the mode type corresponding to the first application interface.
  • if the mode type corresponding to the first application interface is handwriting input, the electronic device triggers display of the first application interface on the second display screen, and input is then performed directly through the first application interface displayed on the second display screen. Through the foregoing method, if the user places the second display screen in a direction that is convenient for writing, the user does not need to perform any operation: the electronic device can automatically display the application interface that requires handwriting input on the second display screen that is convenient for writing, which not only improves the efficiency of the entire input process but also avoids redundant steps, thereby increasing user stickiness.
  • FIG. 58 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 1 includes a first display screen 501, a second display screen 502, a memory 40, one or more processors 10, and one or more programs 401. The one or more programs 401 are stored in the memory 40; when the one or more processors 10 execute the one or more programs 401, the electronic device performs the following steps: displaying the first application interface through the first display screen 501; detecting that the mode type corresponding to the first application interface is changed to handwriting input; and, in response to the handwriting input mode, triggering display of the first application interface on the second display screen 502 to obtain handwritten content for the first application interface through the second display screen 502.
  • in one case, when the one or more processors 10 execute the one or more programs 401, the electronic device further performs the following steps: when it is detected that the mode type corresponding to the first application interface is changed to keyboard input, in response to the keyboard input mode, triggering display of the first application interface on the first display screen 501 and display of the virtual keyboard on the second display screen 502; or, when it is detected that the mode type corresponding to the first application interface is changed to keyboard input, in response to the keyboard input mode, triggering display of the first application interface on the first display screen 501 and display of the virtual keyboard and the application control bar on the second display screen 502.
  • in one case, when the one or more processors 10 execute the one or more programs 401, the electronic device further performs the following step: when it is detected that the mode type corresponding to the first application interface is changed to the browsing mode, in response to the browsing mode, triggering display of the first application interface on the first display screen 501.
  • in one case, when the one or more processors 10 execute the one or more programs 401, the electronic device specifically performs any one or a combination of the following four items: when it is detected that the holding posture of the electronic pen satisfies a first preset condition, determining that a first operation is detected, where the holding posture includes any one or a combination of the following: holding position, holding strength, and holding angle; or, obtaining a trigger instruction for handwriting input through a first icon, where the first icon is displayed on the first application interface; or, detecting a first contact operation, where the first contact operation is a preset click operation or a preset track operation, and determining that the first operation is detected when the electronic pen is located within a preset range of the second display screen 502; or, when it is detected that the electronic pen changes from a first preset state to a second preset state, determining that the first operation is detected.
  • in one case, the first operation is a sliding operation in a first direction obtained through the second display screen 502, where the sliding operation in the first direction goes from the upper edge of the second display screen 502 toward the lower edge of the second display screen 502, and the distance between the upper edge of the second display screen 502 and the first display screen 501 is shorter than the distance between the lower edge of the second display screen 502 and the first display screen 501.
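As a rough illustration of the detection options listed above (holding posture, first icon, contact operation with the pen in range, pen state change), a detector might combine the signals as below. Every field name, threshold, and the way the signals are combined is a hypothetical choice made for the sketch, not a value or rule from the application.

```python
# Hypothetical sketch of detecting the "first operation" from the electronic
# pen's holding posture and related signals. Thresholds are illustrative only.

def grip_satisfies_preset(posture, preset):
    """posture: dict with 'position', 'strength', 'angle' readings from the pen."""
    lo, hi = preset["angle_range"]
    return (posture["position"] in preset["positions"]
            and posture["strength"] >= preset["min_strength"]
            and lo <= posture["angle"] <= hi)


def first_operation_detected(posture, preset,
                             icon_triggered=False,
                             contact_operation=False,
                             pen_in_range=False,
                             pen_state_changed=False):
    # Any one of the listed conditions (or a combination) may count as detection:
    # grip posture, icon trigger, a preset contact operation while the pen is
    # within the preset range of the second screen, or a pen state change.
    return (grip_satisfies_preset(posture, preset)
            or icon_triggered
            or (contact_operation and pen_in_range)
            or pen_state_changed)


preset = {"positions": {"tip"}, "min_strength": 0.3, "angle_range": (30, 70)}
posture = {"position": "tip", "strength": 0.5, "angle": 45}
```

A real device would derive these signals from pen sensors and the touch controller; here they are plain booleans and numbers so the combination logic stands out.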
  • in one case, when the one or more processors 10 execute the one or more programs 401, the electronic device further performs the following steps: acquiring a startup operation for a second application interface and determining, based on the startup operation, the mode type corresponding to the second application interface, where the second application interface and the first application interface are different application interfaces; in the case where the mode type corresponding to the second application interface is handwriting input, in response to the handwriting input mode, triggering display of the second application interface on the second display screen 502; or, in the case where the mode type corresponding to the second application interface is keyboard input, in response to the keyboard input mode, triggering display of the second application interface on the first display screen 501 and display of the virtual keyboard on the second display screen 502; or, in the case where the mode type corresponding to the second application interface is the browsing mode, in response to the browsing mode, triggering display of the second application interface on the first display screen 501.
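The three-way placement decided at startup can be sketched as a single dispatch on the startup-derived mode type. The class and method names below are assumptions for illustration only, not the claimed implementation.

```python
# Illustrative dispatch of a newly opened application interface by mode type.
# Class and method names are hypothetical, not from the application.

class Screen:
    def __init__(self):
        self.content = None
        self.keyboard_visible = False

    def show(self, interface):
        self.content = interface

    def show_virtual_keyboard(self):
        self.keyboard_visible = True


def place_on_startup(first_screen, second_screen, interface, mode_type):
    if mode_type == "handwriting":
        second_screen.show(interface)          # pen input directly on screen 2
    elif mode_type == "keyboard":
        first_screen.show(interface)           # interface on screen 1 ...
        second_screen.show_virtual_keyboard()  # ... virtual keyboard on screen 2
    elif mode_type == "browsing":
        first_screen.show(interface)           # no keyboard or pen input needed
```

The point of the sketch is that the display position follows from the mode type determined by the startup operation, so the user never has to move the interface afterwards.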
  • FIG. 59 is a schematic structural diagram of the electronic device provided by the embodiment of the present application.
  • the electronic device 1 may be embodied as a mobile phone, a tablet, a notebook computer, or another device configured with a display screen, which is not limited here.
  • the electronic device described in the embodiment corresponding to FIG. 58 may be deployed on the electronic device 1 to implement the functions of the electronic device in the embodiment corresponding to FIG. 41 to FIG. 57 .
  • the electronic device 1 may vary greatly due to different configurations or performance, and may include one or more central processing units (CPUs) 1522 (for example, one or more processors), the memory 40, and one or more storage media 1530 (for example, one or more mass storage devices) storing applications 1542 or data 1544.
  • the memory 40 and the storage medium 1530 may be short-term storage or persistent storage.
  • the program stored in the storage medium 1530 may include one or more modules (not shown in the figure), and each module may include a series of instructions to operate on the electronic device.
  • the central processing unit 1522 may be configured to communicate with the storage medium 1530 to execute a series of instruction operations in the storage medium 1530 on the electronic device 1 .
  • the electronic device 1 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input/output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
  • the central processing unit 1522 is used to implement the functions of the electronic device in the embodiments corresponding to FIG. 41 to FIG. 57. It should be noted that, for the specific manner in which the central processing unit 1522 performs the functions of the electronic device in the embodiments corresponding to FIG. 41 to FIG. 57 and the beneficial effects brought about, reference may be made to the method embodiments corresponding to FIG. 41 to FIG. 57, which will not be repeated here.
  • Embodiments of the present application also provide a computer-readable storage medium in which a program is stored; when the program is run on a computer, the computer is caused to execute the steps performed by the electronic device in the methods shown in the aforementioned FIG. 42 to FIG. 57.
  • Embodiments of the present application also provide a computer program, which, when run on a computer, causes the computer to perform the steps performed by the electronic device in the methods described in the embodiments shown in the foregoing FIG. 42 to FIG. 57 .
  • An embodiment of the present application further provides a circuit system, where the circuit system includes a processing circuit, and the processing circuit is configured to execute the steps performed by the electronic device in the method described in the embodiments shown in the foregoing FIG. 42 to FIG. 57 .
  • the electronic device provided by the embodiment of the present application may be a chip, and the chip includes: a processing unit and a communication unit.
  • the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit.
  • the processing unit can execute the computer-executable instructions stored in the storage unit, so that the chip executes the steps performed by the electronic device in the methods described in the foregoing embodiments shown in FIG. 42 to FIG. 57.
  • the storage unit is a storage unit in the chip, such as a register, a cache, etc.
  • the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM), and the like.
  • the processor mentioned in any of the above may be a general-purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits for controlling execution of the program of the method of the first aspect.
  • the device embodiments described above are only schematic; the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.
  • the connection relationship between the modules indicates that there is a communication connection between them, which may be specifically implemented as one or more communication buses or signal lines.
  • the aforementioned storage medium may be, for example, a U disk (USB flash drive), a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • a computer device which may be a personal computer, server, or network device, etc.
  • the computer program includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media.
  • the usable media may be magnetic media (for example, a floppy disk, a hard disk, or a magnetic tape), optical media (for example, a DVD), or semiconductor media (for example, a solid state disk (SSD)), and the like.
  • Embodiment 4:
  • the embodiments of the present invention can be applied to various multi-screen display smart terminals.
  • the embodiment of the present invention may be used in a dual-screen electronic device.
  • the above-mentioned electronic device may be an electronic device with two display screens, where the two display screens may be two separate display screens, or may be obtained by dividing a flexible folding screen or a curved screen into two display areas.
  • the electronic device can be one that works independently as a whole, such as a personal notebook, or one formed by connecting two electronic devices that can each work independently, such as a dual-screen electronic device formed by docking two mobile phones or two tablet computers; such a dual-screen electronic device generally includes a first display screen and a second display screen.
  • the first display screen is mainly used to provide an output function, that is, to display the currently running content or performed operations to the user.
  • the first display screen may also have an input function at the same time; for example, the first display screen may have a touch screen function, and the current application can be operated through touch screen gestures.
  • because the second display screen is usually closer to the user's hands and is convenient for the user to operate, the second display screen mainly performs the input function: it can receive the user's input through the touch display screen, and it can also receive the user's input through a virtual keyboard instead of a traditional mechanical keyboard.
  • the second display screen may also have an output function, for example, the second display screen may also be used to display the currently running content or performed operations to the user.
  • the embodiments of the present invention can also be used in a single-application, dual-screen, cross-device operation scenario, where the display screen of the controlled device is mainly used to display the currently running content or performed operations to the user, that is, it mainly realizes the function of the first display screen.
  • the function menu corresponding to the target application on the first display screen is transferred to the display screen of the control device, which thus mainly realizes the function of the second display screen, so as to control the application on the controlled device.
  • the tablet computer (61-1) or the mobile phone (61-2) can be used as the control terminal to remotely operate an application of the computer (61-3).
  • the embodiments of the present invention can be applied to a smart home scenario.
  • the controlled device can be a smart home device with a display screen such as a TV, a microwave oven, a refrigerator, and a washing machine
  • the control device can be a mobile phone, a tablet, a computer, and the like.
  • the embodiments of the present invention can also be used in the field of smart cockpits.
  • a mobile phone, a tablet, or the like can be used as a control device to control the front-row vehicle screen or the rear-row display screen, or the rear-row display screen can be used to control the front-row vehicle screen.
  • the interface of the application usually adopts a relatively fixed layout and is all displayed on the first display screen.
  • the control keys of the application usually appear in the functional area menu at the top or left of the application.
  • when the user needs to operate the control keys of the application, no matter where the user's current operation object is, the user needs to move the cursor to the function area menu to operate and then return to the current operation object.
  • it may be difficult for the user to locate a specific control key in the ribbon menu of the application; all of this makes normal operation somewhat difficult for the user.
  • the user needs to control the cursor to switch continuously between the operation object and the function area menu, which makes the user's operation more complicated and the operation efficiency low.
  • a screen display method 6200 is provided.
  • the screen display method 6200 is used to display a control area on a second display screen, so that the user can control the target application or the operating system on the first display screen through the control area on the second display screen.
  • in one implementation, the B side of a dual-screen electronic device (usually set as the side of the display screen) can be used as the first display screen, and the C side of the device (usually set as the side of the keyboard) can be used as the second display screen.
  • in one implementation, as shown in FIG. 63A, the B side can display the main display interface and the function menu bar of the target application operated by the user, and the C side can include the virtual keyboard and the control area; in another implementation, as shown in FIG. 63B, the B side can display the main display interface and the function menu bar of the target application operated by the user, and the C side can include other application interfaces and the control area; in another implementation, as shown in FIG. 63C, the B side can display the main display interface and the function menu bar of the target application operated by the user, and the C side can include only the control area.
  • in the cross-device scenario, the display screen of the controlled device may correspond to the B side of the dual-screen electronic device, and the display screen of the control device may correspond to the C side of the dual-screen electronic device; for the display contents of the display screens of the controlled device and the control device, reference can be made to FIG. 63A to FIG. 63C.
  • an application program can include a main display interface and one or more function menu bars containing control keys corresponding to the application.
  • the main display interface is usually used to show the user the current state of the application or the execution of user operations
  • the control keys in the function menu bar are often used to receive user input and perform specific actions on the target application.
  • taking a text editing application as an example, its main display interface is the interface that displays the currently edited document, which is usually the interface with the largest display area in the entire application.
  • the function menu bar of the text editing application can include the editing menu bar (including control keys such as file, home, insert, design, and page layout), the navigation menu bar, and other function menu bars, which are used to receive the user's operation instructions on the document.
  • the main display interface and the function menu bar of the application are displayed on the first display screen, and the user can only operate the application program in the first display screen through a mouse or a touch screen gesture.
  • a control area is set in the second display screen, and the control area may include multiple display areas.
  • the control area may include a system control bar and an application control bar, wherein the system control bar can contain one or more functional modules, each of which contains a control key group associated with the operating system.
  • the application control bar can contain one or more functional modules: some functional modules can contain control key groups corresponding to the target application, and some function modules may contain shortcut operation control key groups related to the user's current operation. It should be understood that other display area settings and function module settings that help improve user operation efficiency are also possible.
  • the position of the control area on the second display screen can be flexibly adjusted according to user requirements.
  • the control area may be located at the upper end of the second display screen, above the other display contents (other applications, the virtual keyboard, etc.) on the second display screen; the control area may also be located at the lower end of the second display screen, below the other display contents (other applications, the virtual keyboard, etc.) on the second display screen; the control area may also be located on the left or right side of the second display screen.
  • the initial display position of the control area can be defined by the system or the user. When the control area is displayed on the second display screen, the user can flexibly move the position of the control area on the second display screen.
  • the screen display method 6200 may include the following steps:
  • Step 6201 Activate the control area.
  • in a normal use state, the control area can be in a closed state.
  • at this time, the main display module and some function menu bars of the target application can be displayed on the first display screen, and the conventional operation mode is adopted.
  • that is, the operation of the target application is realized by operating the control keys of the function menu bar on the first display screen through mouse operations or touch screen gestures.
  • the control area can be activated in various ways.
  • the control area may be associated with the virtual keyboard, and the control area is opened by default when the virtual keyboard is opened. At this time, the virtual keyboard and the control area can be activated at the same time through an instruction of activating the virtual keyboard, as shown in FIG. 64A .
  • in another implementation, as shown in FIG. 64B, the virtual keyboard can be provided with a control switch for the control area. When the virtual keyboard is in an open state but the control area is not open, the control switch on the virtual keyboard can be used to activate the control area.
  • the control area can be activated through gesture control, that is, a gesture corresponding to the activation of the auxiliary display area is stored in the storage module, and when it is detected that the user performs the gesture, Activate the control area.
  • the control gesture can be, for example, a finger swipe inward from the edge of the display.
  • in another implementation, whether the control area is turned on can be associated with the display mode of the application. When the full-screen mode of the application is turned on, part of the display modules of the application are usually hidden; therefore, the control area can be activated at the same time as the full-screen mode of the application is turned on, to supplement the content of the hidden display modules, as shown in FIG. 64D. It should be understood that the above operation manners of activating the control area are only exemplary, and other operation manners of activating the control area are also possible.
  • Closing the control area in normal use, and activating the control area through simple operations when needed, can simplify the user interface when unnecessary, and avoid the interference of the control area to normal use.
  • the control area may also be enabled by default after the electronic device is turned on. In this case, the user does not need to activate the control area through step 6201. Therefore, step 6201 is an optional step of the screen display method 6200.
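The activation paths of step 6201 (opening together with the virtual keyboard, a control switch on the keyboard, an activation gesture, or entering full-screen mode) can be sketched as a simple event dispatcher. This is only an illustrative sketch under stated assumptions: the trigger names and the `ControlArea` class are hypothetical, not part of the original method.

```python
# Hypothetical sketch of the control-area activation logic in step 6201.
# Trigger names and the ControlArea class are illustrative assumptions.

class ControlArea:
    def __init__(self, enabled_by_default=False):
        # The control area may also be enabled by default after power-on,
        # in which case no explicit activation step is needed.
        self.open = enabled_by_default

    def handle_event(self, event):
        # Any of the four described activation paths opens the control area.
        if event in ("virtual_keyboard_opened",   # opened together with the keyboard
                     "control_switch_tapped",     # switch shown on the virtual keyboard
                     "activation_gesture",        # e.g. finger swipe inward from the edge
                     "fullscreen_mode_entered"):  # supplements hidden display modules
            self.open = True
        elif event == "close_control_area":
            self.open = False
        return self.open
```

Closing the control area in normal use and opening it through any one of these simple triggers keeps the interface uncluttered, as the description notes.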
  • Step 6202 Obtain the user's operation on the target application.
  • the display content of the control area is determined according to the user's operation on the target application.
  • in one case, the user's operation on the target application is an operation of displaying the operation interface of the target application on the first display screen.
  • before the user's operation, the target application may be in a closed state, and the user's operation opens the target application so that its operation interface is displayed on the first display screen; or, before the user's operation, the target application may be running in the background, and the user displays the operation interface of the target application on the first display screen through a switching operation.
  • control keys corresponding to the target application may be displayed in the application control bar.
  • when the operation interface of the target application is displayed on the first display screen, only the operation interface of the target application may be displayed, or the operation interfaces of multiple applications including the target application may be displayed together, for example, in a split-screen or multi-screen operation mode.
  • in an implementation manner, the developer of the application program can provide each functional module of the application and the control keys in each functional module, as well as the priority order between the functional modules and between the control keys; the system then determines, according to the actual situation (the display area of the control area, etc.), which function modules and control keys are displayed in the application control bar corresponding to the application in the control area, and determines the layout of the application control bar.
  • in this implementation, the information of the target application obtained by the system from the application program may include the functional modules of the target application, the control keys included in each functional module, the priority order of the functional modules, and the priority order of the control keys within each functional module.
  • the various information of the target application are introduced as follows:
  • An application usually includes a main display module and multiple function menu bars for controlling the content in the main display module, and the function modules in the control area may correspond to the function menu bars of the target application.
  • the slideshow editing application may include a main display interface, function module 1, function module 2, function module 3, etc., wherein the main display interface displays the slideshow interface currently being edited by the user, and the function module 1 contains a set of commonly used control keys for editing the slideshow interface, function module 2 is used to display all slides for the user to browse, and function module 3 contains a set of control keys for shortcut operations. It should be understood that, due to the different functions implemented by different applications, the settings of the function modules and the settings of the control keys in the function modules may be different for different applications.
  • the priority of the functional modules represents the importance of each functional module in the user's use process, and can usually be determined according to the importance of the function of each functional module and the user's usage frequency and other indicators.
  • the priority of the function modules of the above-mentioned slideshow editing application may be defined as follows: the priority of the function module 1 > the priority of the function module 2 > the priority of the function module 3 . It should be understood that the above definitions regarding the priorities of the functional modules of the slideshow editing application are only exemplary, and other possible definitions that conform to user usage habits are also possible.
  • one or more required function modules may be defined for the application, and the required function modules are function modules corresponding to the application that are fixedly displayed in the control area in the open state.
  • one or more preferred function modules may also be defined for the target application, and the preferred function modules are function modules that can be displayed preferentially after all required function modules of the application are displayed in the control area.
  • the priority order of each function module of the target application can be set as follows: the priority of the required function module is the highest, the priority of the preferred function module is second, and the priority of other function modules is lower.
  • the priority of the control keys represents the importance of each control key in the user's use process, and can usually be determined according to the importance of the control function of each control key and the user's usage frequency and other indicators.
  • in an implementation manner, the priorities of control keys such as copy, paste, cut, font, paragraph, definition, synonym, and translation can be defined as follows: copy and paste have the highest priority, cut is lower than copy and paste, font and paragraph are lower than cut, and definition, synonym, and translation are lower than font and paragraph. It should be understood that the above priority definition is only a possible implementation manner, and other priority definition manners that conform to user usage habits or the common function keys of other applications are also possible.
  • each function module may define one or more mandatory control keys, and the mandatory control keys are the control keys that are fixedly displayed when the corresponding function module is displayed in the control area.
  • each function module may define one or more preferred control keys, and the preferred control key is a control key that can be displayed preferentially after all the control keys of the corresponding function module are displayed in the control area.
  • the priority order between different control keys of the same function module can be set as follows: the priority of the mandatory control key is the highest, the priority of the preferred control key is the second, and the priority of other control keys is lower.
  • in an implementation manner, the developer of the application program can directly define the display content of the application control bar for different display areas, including the function modules and control keys in the application control bar and the typographic layout of the application control bar.
  • for example, the developer of the application program sets application control bar display mode 1 for display area 1, application control bar display mode 2 for display area 2, application control bar display mode 3 for display area 3, and so on. Display area 1, display area 2, and display area 3 may not refer to specific sizes, but may each be a range.
  • the system can select the display mode of the corresponding application control bar according to the display area of the application control bar.
  • in this implementation, the information of the target application obtained by the system from the application program may include the display modes of the application control bar for different display areas, specifically including which function modules are included in each display mode of the application control bar, which control keys are included in each function module, and how the application control bar is typographically laid out.
  • the application control bar displayed in the control area can be displayed in exactly the same way as provided by the application.
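The selection described above — the developer defines a display mode per range of control-bar area, and the system picks the mode whose range contains the actual area — can be sketched as a simple range lookup. The area ranges and mode names below are illustrative assumptions, not values from the original description.

```python
# Illustrative sketch: display modes defined per range of application
# control bar area; the system selects the mode matching the actual area.

DISPLAY_MODES = [
    # (min_area, max_area, mode) — each "display area" is a range, not one size
    (0,     200,   "display mode 1"),
    (200,   400,   "display mode 2"),
    (400,   10**9, "display mode 3"),
]

def select_display_mode(actual_area):
    for lo, hi, mode in DISPLAY_MODES:
        if lo <= actual_area < hi:
            return mode
    return None
```

Because each mode is defined by the developer, the application control bar can then be displayed exactly as provided by the application, without the system recomputing a layout.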
  • in another implementation manner, the system can identify each functional module of the application program and the control keys in each functional module through text or image recognition technology, and determine the priority order of the functional modules and control keys according to the user's frequency of use or their degree of importance; the system then determines, according to the priority order, which function modules and control keys are displayed in the application control bar, and determines the specific layout. In this implementation, the system may not need to obtain additional information from the application.
  • the user's operation on the target application is an operation on the operation interface of the target application, for example, selecting specific content on the operation interface of the target application, placing the cursor on a specific position on the operation interface of the target application, and the like.
  • a shortcut control key associated with the operation may be displayed in the application control bar.
  • the user's operation on the operation interface of the target application includes any possible operation when the user performs a specific function through the target application.
  • the user's operation on the target application may be to select a specific object of the operation interface, for example, to select a specific text, symbol, picture, table, audio and video, etc.
  • for example, the cursor can be moved to a specific object through a touch screen gesture or a mouse operation, and the specific object can be selected through a touch screen gesture or a mouse operation (for example, the shading of the specific object becomes darker), etc.
  • in another implementation, the user's operation on the operation interface of the target application may be a specific gesture or a specific manner of operating the mouse, for example, scrolling the content of the target area through a sliding gesture or the mouse wheel, so as to browse the content of the target area. It should be understood that the above operations are only exemplary, and other operations that the user may perform on the target application during the use of the electronic device are all possible.
  • Different operations performed by the user on the operation interface of the target application may correspond to different control key groups, and the control keys in the control key group may be a set of shortcut operation keys associated with a specific operation.
  • in an implementation manner, as shown in FIG. 65A, the user's specific operation on the target application may be selecting specific text content, for example, placing the cursor on the text content, and the control key group corresponding to this operation may include a collection of control keys for copy, paste, cut, font, text size, paragraph, definition, synonyms, translation, use web search, and more.
  • in an implementation manner, the user's specific operation on the target application may be selecting specific image content, and the control key group corresponding to this operation may include a collection of control keys such as copy, paste, cut, set image format, change image, bring to front, send to back, save image, etc.
  • in an implementation manner, as shown in FIG. 65C, the user's specific operation on the target application may be selecting specific table content, and the control key group corresponding to this operation may include a collection of control keys such as copy, paste, cut, format, insert row, insert column, delete table, etc.
  • in an implementation manner, the user's specific operation on the target application may be selecting specific video content, and the control key group corresponding to this operation may include a collection of control keys such as play, pause, volume up, volume down, increase brightness, decrease brightness, picture-in-picture, copy video address, cast, loop, progress bar, etc.
  • in an implementation manner, the user's specific operation on the target application may be selecting specific audio content, and the control key group corresponding to this operation may include a collection of control keys such as play, pause, next song, volume up, volume down, copy audio address, loop, progress bar, etc.
  • in an implementation manner, the user's specific operation on the target application may be browsing the content of the target area through a sliding gesture or the mouse wheel, and the control key group corresponding to this operation may include a thumbnail of the target area and a positioning box for quickly locating the target content in the thumbnail.
  • the system may define different control key sets for different user operations. According to the different operations of the user on the target application, different shortcut operation control key groups are displayed, which can meet the user's needs, provide the user with more convenient operations, and improve the user's operation efficiency.
  • in an implementation manner, the control key set can also be defined as the control key set displayed when the right mouse button is clicked at the current cursor position. This simple design of reusing the right-click control key set can avoid secondary development, reduce the burden on the developer, and shorten the development cycle.
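The mapping described above — from the type of the user's operation to the shortcut control key group shown in the application control bar — can be sketched as a lookup table, with the right-click context menu as the fallback that avoids extra development work. The operation names and the abbreviated key lists are taken from the examples in the description; the function and dictionary names are illustrative assumptions.

```python
# Illustrative mapping from the user's operation type to the shortcut
# control key group shown in the application control bar.

SHORTCUT_GROUPS = {
    "select_text":  ["copy", "paste", "cut", "font", "text size", "paragraph",
                     "definition", "synonyms", "translation", "web search"],
    "select_image": ["copy", "paste", "cut", "set image format", "change image",
                     "bring to front", "send to back", "save image"],
    "select_table": ["copy", "paste", "cut", "format", "insert row",
                     "insert column", "delete table"],
    "select_video": ["play", "pause", "volume up", "volume down", "cast",
                     "picture-in-picture", "loop", "progress bar"],
    "select_audio": ["play", "pause", "next song", "volume up", "volume down",
                     "copy audio address", "loop", "progress bar"],
}

def shortcut_group(operation, right_click_menu=None):
    # Fallback: reuse the right-click context menu for the current position,
    # which avoids secondary development by the application developer.
    return SHORTCUT_GROUPS.get(operation, right_click_menu or [])
```

Showing a different group per operation type is what lets the control bar follow the user's current task instead of a fixed ribbon.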
  • Step 6203 Obtain the display area of the control area.
  • Step 6203 is an optional step. In an implementation in which the display area of the control area is fixed, step 6204 can be performed directly without obtaining the display area of the control area; in an implementation in which the display area of the control area is adjustable, step 6203 can be performed.
  • the display area of the control area can be flexibly adjusted, and the initial display area of the control area may also differ; for example, different applications may correspond to different initial display areas.
  • the initial display area of the control area may be defined by the system, and the system may define different initial display areas of the control area for different applications.
  • the initial display area of the control area may be user-defined, and the user may define different initial display areas of the control area for different applications.
  • the initial display area of the control area may be the display area of the control area that was opened when the application was last used by default. It should be understood that other possible ways commonly used in the art to define the initial display area of the control area are also possible.
  • the function modules and control key groups displayed in the control area can be more in line with user habits, provide users with more convenient operations, and improve user operation efficiency.
  • in an implementation manner, the control area can be set above the virtual keyboard or another application interface on the second display screen.
  • the control area may also be displayed on the left or right side of the second display screen.
  • the control area may also be displayed in the middle of the second display screen.
  • in an implementation manner, the target application on the first display screen may occupy part of the display area of the second display screen, and correspondingly, the control area can be located at the bottom of the second display screen.
  • in an implementation manner, the two display screens of the dual-screen electronic device can be placed left and right; the virtual keyboard can adopt a split design and be located at the lower ends of the two display screens, and correspondingly, the application display area can be set in the middle of the split keyboard.
  • Step 6204 Display the control key group in the control area.
  • Step 6204 determines the function modules and control key groups contained in the control area on the basis of comprehensively considering the information obtained in one or more of the above steps, and displays them in the control area.
  • control area may include the following areas:
  • the system control bar is mainly used to display a set of control keys related to system control.
  • it may include a system control function module and a program dock function module.
  • the system control function module may include a control key group for performing operations on the operating system.
  • for example, the system control function module may include a collection of control keys such as adjust volume, adjust brightness, check weather, check time, check calendar, check alarm, check system notifications, etc.
  • the dock function module may include a control key group for performing switching between multiple task programs in the system.
  • for example, the dock function module may include control keys such as a list of currently running programs, a list of frequently used/favorite applications, a list of recently used programs, or a desktop application list.
  • the set of control keys related to system operations in the system control bar may be a relatively fixed set of control keys set by the system, or the set of control keys in the system control bar set by the system may be adjusted by the user according to usage habits.
  • the application control bar is mainly used to display the control key group corresponding to the target application, and the application control bar may include one or more function modules corresponding to the target application and/or shortcut operation function modules associated with the user's operation on the target application.
  • a control key group corresponding to the target application may be displayed in the control area.
  • in an implementation manner, the developer of the application program can provide the priority order between the functional modules of the application program and between the control keys, and then the system can determine, according to the actual situation (the display area of the control area, etc.), which functional modules and control keys are displayed in the application control bar corresponding to the application in the control area, and determine the layout of the application control bar.
  • the control key set of the target application may include the required function modules of the target application and the required control keys in the required function modules.
  • in an implementation manner, the priority order of the functional modules of the target application and the priority order of the control keys may be comprehensively considered, and control keys are added to the control key set of the target application according to the overall priority order shown in Figure 68.
  • in this priority order, the priority of the mandatory control keys of the mandatory function modules is higher than the priority of the preferred control keys of the mandatory function modules, which is higher than the priority of the mandatory control keys of the preferred function modules, which is higher than the priority of the preferred control keys of the preferred function modules, which is higher than the priority of the mandatory control keys of the other function modules, which is higher than the priority of the preferred control keys of the other function modules, which is higher than the priority of the other control keys of the mandatory function modules, which is higher than the priority of the other control keys of the preferred function modules, which is higher than the priority of the other control keys of the other function modules.
  • therefore, in the process of gradually increasing the display area of the control area, the control keys are added to the control key set of the target application in the following order: first the mandatory control keys of the mandatory function modules, then the preferred control keys of the mandatory function modules, then the mandatory control keys of the preferred function modules, then the preferred control keys of the preferred function modules, then the mandatory control keys of the other function modules, then the preferred control keys of the other function modules, then the other control keys of the mandatory function modules, then the other control keys of the preferred function modules, and finally the other control keys of the other function modules.
  • among control keys of the same level, the display is increased according to the priority order of each specific control key.
  • in another implementation manner, the priority order of the functional modules of the target application and the priority order of the control keys may be comprehensively considered, and control keys are added to the control key set of the target application according to the priority order shown in Figure 69.
  • in this priority order, the priority of the mandatory control keys of the mandatory function modules is higher than the priority of the preferred control keys of the mandatory function modules, which is higher than the priority of the mandatory control keys of the preferred function modules, which is higher than the priority of the preferred control keys of the preferred function modules, which is higher than the priority of the other control keys of the mandatory function modules, which is higher than the priority of the other control keys of the preferred function modules, which is higher than the priority of the mandatory control keys of the other function modules, which is higher than the priority of the preferred control keys of the other function modules, which is higher than the priority of the other control keys of the other function modules.
  • therefore, in the process of gradually increasing the display area of the control area, the control keys are added to the control key set of the target application in the following order: first the mandatory control keys of the mandatory function modules, then the preferred control keys of the mandatory function modules, then the mandatory control keys of the preferred function modules, then the preferred control keys of the preferred function modules, then the other control keys of the mandatory function modules, then the other control keys of the preferred function modules, then the mandatory control keys of the other function modules, then the preferred control keys of the other function modules, and finally the other control keys of the other function modules.
  • among control keys of the same level, the display is increased according to the priority order of each specific control key.
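The tiered fill described above can be sketched as a greedy loop: each control key is tagged with a module tier (mandatory / preferred / other) and a key tier, the keys are ordered by the Figure 68 tier order, and keys are added until the display budget of the control bar is used up. The key widths, example keys, and function names are illustrative assumptions; the Figure 69 order would simply use a different tier list.

```python
# Sketch of filling the control key set by tiered priority (Figure 68 order).
# (module_tier, key_tier) pairs: all mandatory/preferred keys across module
# tiers come first, then the remaining "other" keys of each module tier.
FIG68_ORDER = [
    ("mandatory", "mandatory"), ("mandatory", "preferred"),
    ("preferred", "mandatory"), ("preferred", "preferred"),
    ("other", "mandatory"), ("other", "preferred"),
    ("mandatory", "other"), ("preferred", "other"), ("other", "other"),
]

def fill_control_bar(keys, budget, order=FIG68_ORDER):
    """keys: list of (name, module_tier, key_tier, width), already sorted by
    each key's own priority within its tier; budget: available display area."""
    rank = {pair: i for i, pair in enumerate(order)}
    shown, used = [], 0
    for name, m_tier, k_tier, width in sorted(
            keys, key=lambda k: rank[(k[1], k[2])]):
        if used + width <= budget:
            shown.append(name)
            used += width
    return shown
```

Because `sorted` is stable, keys of the same tier stay in their own priority order, matching the rule that within one level the display grows by per-key priority.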
  • in another implementation manner, the developer of the application program can directly define the display content of the application control bar for different display areas, including the functional modules and control keys in the application control bar and how the application control bar is typographically laid out.
  • according to the display area of the application control bar, the system selects which of the application control bar display modes corresponding to the application program is displayed.
  • in another implementation manner, the system can identify each functional module of the application program and the control keys in each functional module through text or image recognition technology, and the system assigns a priority order to each functional module and control key; the system then determines, according to the assigned priority order, which functional modules and control keys are displayed in the application control bar, and determines the specific layout.
  • the application control bar may include a shortcut operation function module related to the user's current operation on the target application.
  • the shortcut operation function module mainly includes a shortcut operation control key group related to the user's current operation on the target application, for example, the sets of control keys corresponding to different user operations listed in step 6202.
  • the shortcut operation control keys related to the user operation can be defined by the application developer, that is, the application developer sets the corresponding shortcut operation control key set according to different operations performed by the user in the target application.
  • the same operation of the user may correspond to different sets of shortcut operation control keys in different applications.
  • the shortcut operation control keys related to user operations can also be defined by the system, that is, the system sets a set of shortcut operation control keys corresponding to each different type of user operation.
  • in this implementation, the same operation of the user may correspond to the same set of shortcut operation control keys in different applications.
  • a shortcut control key group associated with the user's operation may be displayed in the control area.
  • in an implementation manner, only the control key group associated with the user's operation on the operation interface of the target application may be displayed in the application control bar, that is, the initial control key group originally displayed in the application control bar corresponding to the target application is replaced with the control key group associated with the user's operation on the operation interface of the target application.
  • in another implementation manner, the initial control key group corresponding to the target application and the control key group associated with the user's operation on the operation interface of the target application may be displayed together in the application control bar, that is, on the basis of the initial control key group of the application, the control key group associated with the user's operation on the operation interface of the target application is added.
  • the system can define the priority order of shortcut control keys related to user operations, and then determine which shortcut control keys to display in the application control bar according to the display area of the application control bar.
  • the system can define corresponding shortcut control key groups for different display areas of the application control bar, and then determine the shortcut control key group displayed in the application control bar according to the actual display area of the application control bar.
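The priority-and-area rule described in the two bullets above can be sketched as follows. This is only an illustrative reading of the text; the names (`Key`, `select_keys_for_bar`) and the fixed-width model are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Key:
    name: str
    priority: int  # lower number = higher priority (assumed convention)
    width: int     # horizontal display space the key occupies

def select_keys_for_bar(keys, bar_width):
    """Show keys in priority order until the bar's display area is used up."""
    shown, used = [], 0
    for key in sorted(keys, key=lambda k: k.priority):
        if used + key.width <= bar_width:
            shown.append(key)
            used += key.width
    return shown
```

With a wider application control bar, lower-priority keys such as `format` would also fit; with a narrow bar, only the highest-priority keys are shown.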
  • Step 6205 Hide the display of the control key group in the control area on the first display screen.
  • Step 6205 is an optional step.
  • you can hide the display of the control key group in the control area on the first display screen, so as to save the display space of the first display screen and expand the display area of the main display interface of the target application or other functional modules in the first display area.
  • hiding the display of the control key group in the control area on the first display screen may mean that the control keys in the control area are not displayed on the first display screen at all, or that the control keys in the control area are still displayed on the first display screen but faded, for example, grayed out.
  • the display content of the first display screen can be adjusted adaptively.
  • the size of the display content of the main display interface of the target application or other functional modules can be increased, for example: enlarging the display font, zooming in on displayed pictures, etc., with adaptive adjustments to the layout on the first display screen. This implementation can facilitate the user's browsing and improve the user experience.
  • more display content can be added in the main display interface of the target application, or previously non-displayed content in the functional modules displayed on the first display screen can be added, with the layout on the first display screen adaptively adjusted.
  • the application can define multiple layouts that contain different control keys for display on the first display screen, and the system selects the appropriate layout for the first display screen according to the control key group displayed in the application control bar.
  • adding the display content in the first display screen can reflect more details or operation methods of the target application, and provide users with more convenient operations.
  • one or more of the above three kinds of display content can also be added at the same time, and added display content can be combined with enlarged display content. It should be understood that after removing the display of the control key set of the target application on the first display screen, other ways of changing the content layout on the first display screen are also possible to improve the user experience.
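As a rough sketch of the adaptive adjustment above: once the control key group is hidden, the freed screen height can either enlarge the existing content (a scale factor for fonts and pictures) or stay at scale 1.0 so that more content fits. The function name, parameters, and modes below are illustrative assumptions.

```python
def adapt_first_screen(screen_h, keybar_h, keybar_hidden, mode="enlarge"):
    """Return (main interface height, content scale factor) after the
    control key group is hidden or shown on the first display screen."""
    main_h = screen_h if keybar_hidden else screen_h - keybar_h
    if mode == "enlarge" and keybar_hidden:
        scale = main_h / (screen_h - keybar_h)  # grow fonts / pictures
    else:
        scale = 1.0                             # keep size, show more content
    return main_h, round(scale, 2)
```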
  • Step 6206 Close the control area.
  • when the user temporarily does not need to use the control area, the control area can be closed in various ways.
  • the control area may be associated with a virtual keyboard, and by default, the control area is closed when the virtual keyboard is closed. At this time, the virtual keyboard and the control area can be closed at the same time through the instruction of closing the virtual keyboard, as shown in FIG. 70A .
  • the virtual keyboard may be provided with a control switch for the control area. When the virtual keyboard is in an open state, the control area can be closed through the control switch on the virtual keyboard, as shown in FIG. 70B.
  • the control area can be closed through gesture control, that is, a gesture corresponding to closing the auxiliary display area is stored in the storage module, and when it is detected that the user performs the gesture, the control area is closed.
  • the control gesture can be, for example, a finger sliding the control area towards the edge of the display screen.
  • whether the control area is open can be associated with the display mode of the application: the control area can be closed when the full-screen mode of the application is turned off, and part of the content of the control area can be migrated back to the first display area for display, as shown in FIG. 70D.
  • step 6206 is optional.
  • the screen display method 6200 displays a control area on the second display screen, the control area including a control key group related to system control and/or a control key group associated with the user's operation interface for the target application, so that the user can operate the system or the target application in the first display screen through the control area in the second display screen. With the assistance of the control area, the user does not need to repeatedly move the cursor position on the first screen and repeatedly locate the position of the operation object or the control key, which greatly simplifies the user operation.
  • the control area is displayed on the second screen, and is closer to the user's hands than the first screen, which can provide the user with more convenient operations.
  • when the relevant control key group is displayed in the control area, its display in the first display screen is removed, which can save display area in the first display screen. Further, the display content in the first display screen can be enlarged, or the display content in the first display screen increased, so as to improve the user experience.
  • a screen display method 7100 is provided.
  • the screen display method is used to change the display content of the application control bar in the control area according to the current operation of the target application by the user.
  • the screen display method 7100 may include the following steps:
  • Step 7101 Acquire the user's operation on the target application.
  • after the control area is opened, the current operation of the user on the target application is detected in real time, and the control key group displayed in the application control bar is changed according to the current operation of the user on the target application.
  • the current operation of the user on the target application may be to display the operation interface of the target application on the first display screen.
  • all the control key groups displayed in the application control bar can be replaced with control key groups corresponding to the target application.
  • part of the control key group displayed in the application control bar can be replaced with the control key group corresponding to the target application, or the control key group corresponding to the target application can be added on the basis of the control key group already displayed, that is, control key groups corresponding to a plurality of applications including the target application are simultaneously displayed in the application control bar.
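The replace-or-append behavior in the last two bullets might look like the following sketch; `update_control_bar` and its mode names are invented for illustration, not taken from the patent.

```python
def update_control_bar(current_keys, app_keys, mode="replace_all"):
    """Replace the bar's keys with the target application's group, or
    append the application's group to what is already displayed."""
    if mode == "replace_all":
        return list(app_keys)
    if mode == "append":
        # keep existing groups; add only keys not already displayed
        return current_keys + [k for k in app_keys if k not in current_keys]
    raise ValueError(f"unknown mode: {mode}")
```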
  • the current operation of the user on the target application may be an operation on the operation interface of the target application.
  • if the application control bar displays a shortcut control key group associated with the user's previous operation on the operation interface of the target application, then part or all of the shortcut control key group corresponding to the previous operation is replaced with the shortcut control key group corresponding to the current operation.
  • if the application control bar displays the control key group corresponding to the target application, it can be replaced with the shortcut control key group associated with the user's current operation, or a shortcut control key group associated with the current operation of the user can be added on the basis of the control key group corresponding to the target application.
  • the specific implementation of step 7101 is the same as that of step 6202, and to avoid repetition, details are not repeated here.
  • Step 7102 Change the control key group of the application control bar according to the user operation.
  • changing the control key group of the application control bar according to the user operation may be, on the basis of the original control key group in the application control bar, adding a control key group related to the current operation of the user on the target application.
  • for example, when the user only opens the target application but does not perform operations on it, the application control bar may not include a shortcut operation control key group corresponding to a user operation, that is, the initial control key group in the application control bar only includes the set of control keys corresponding to the application, not a set of shortcut operation control keys.
  • when the user performs a first operation on the target application, a set of shortcut operation control keys corresponding to the user's operation can be added to the application control bar, that is, the set of shortcut operation control keys associated with the user's first operation is added to the application control bar.
  • changing the control key group of the application control bar according to the user operation may be, on the basis of the original control key group in the application control bar, reducing the part related to the current operation of the user on the target application. control key group.
  • for example, when the user performs a second operation on the target application, the set of shortcut operation control keys corresponding to the user's first operation is replaced by the set of shortcut operation control keys corresponding to the user's second operation, where the shortcut operation control keys corresponding to the second operation are fewer than those corresponding to the first operation; or, the shortcut operation control key group unrelated to the second operation is removed from the application control bar.
  • changing the control key group of the application control bar according to the user operation may be to replace part or all of the control key group originally displayed on the application control bar with a new control key group.
  • when the user's operation changes, that is, when the user performs a second operation different from the first operation, if the correlation between the second operation and the first operation is small, the shortcut operation control key group associated with the user's first operation is partially or completely replaced with a shortcut operation control key group associated with the user's second operation.
  • if the control key group in the application control bar increases, the application control bar may become crowded, or may be unable to completely display all the shortcut operation control keys.
  • the display area of the application control bar and the control area can be adaptively increased, so that the application control bar displays all shortcut operation control keys corresponding to the current operation of the target application by the user.
  • the display area of the application control bar can be adaptively increased, which can optimize the display of the application control bar, avoid the display of the control keys in the application control bar being too small, and provide users with Better operating experience.
  • conversely, a free display area may appear in the application control bar. For example, according to the user's current operation on the target application, it may be necessary to reduce a set of control keys in the application control bar, or the number of control keys used to replace the control key group originally displayed in the application control bar may be smaller than the number originally displayed. At this time, the display area of the application control bar and the control area may be adaptively reduced, so that the display area of the application control bar matches the shortcut operation control keys corresponding to the current operation of the target application by the user.
  • adaptively reducing the display area of the application control bar can optimize the display of the application control bar, avoid an idle display area in the application control bar, save display space on the second display screen, and also expand the display area of other applications on the second display screen, providing users with a better operating experience.
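A minimal sketch of this adaptive sizing, assuming each key takes a fixed width and the bar is capped by the space available on the second display screen (both assumptions are mine):

```python
def fit_bar_width(key_count, key_width=40, max_width=400):
    """Grow the bar to fit all shortcut keys, shrink it when keys are
    removed, and never exceed the available width."""
    return min(key_count * key_width, max_width)
```

Adding keys grows the bar until the cap is reached; removing keys shrinks it, freeing display space for other applications on the second screen.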
  • Step 7103 Hide the display of the control key group displayed in the application control bar on the first screen.
  • the display, on the first display screen, of the control keys displayed in the application control bar is hidden.
  • the specific implementation is as described in step 6205 .
  • the control keys displayed in the application control bar are changed, so that the control key group corresponding to the user's current operation on the target application is displayed on the application control bar at all times.
  • the embodiment of the present invention does not limit the execution times of steps 7101-7103, that is, the changes of the current operation of the user on the target application can be obtained multiple times, and the control key group displayed on the application control bar can be changed multiple times.
  • the screen display method 7100 may also include one or more steps in the screen display method 6200, and the specific implementation is as described in the screen display method 6200. To avoid repetition, details are not repeated here.
  • a screen display method 7200 is provided.
  • the screen display method is used to change the display area of the application control bar and the control key group in the application control bar according to the user's operation of changing the display area of the application control bar.
  • the screen display method 7200 may include the following steps:
  • Step 7201 Acquire an operation instructing the user to change the display area of the application control bar.
  • the user may wish to display more control keys in the application control bar of the control area to better assist the user's operations.
  • or the user may be performing a more complex operation on the target application.
  • expanding the display area of the application control bar can provide the user with more control keys and improve the user's operation efficiency.
  • the user may want the application control bar in the control area to display relatively few control keys, or the user may be performing a simpler operation on the target application.
  • reducing the display area of the application control bar can save display space on the second display screen, and by reducing the control keys on the application control bar, the user can locate the desired control key more easily and quickly, improving operation efficiency and the user experience.
  • the user can achieve the purpose of changing the display area of the application control bar in various ways.
  • the user can indirectly change the display area of the application control bar by changing the display area of the control area.
  • the display area of the control area can be enlarged by the zoom-in button on the control area, thereby indirectly expanding the display area of the application control bar; the display area of the control area can be reduced through the shrink button on the control area, thereby indirectly reducing the display area of the application control bar.
  • the display area of the control area can be enlarged by a zoom-in gesture, thereby indirectly expanding the display area of the application control bar, as shown in FIG. 73A.
  • the display area of the control area can be reduced by the zoom-out gesture, thereby indirectly reducing the display area of the application control bar.
  • the user can indirectly change the display area of the control area by changing the display area of other applications on the second display screen, thereby changing the display area of the application control bar.
  • the display area of other applications can be enlarged by the zoom-in button on other applications on the second display screen, and the display area of the control area can be indirectly reduced, thereby indirectly reducing the display area of the application control bar.
  • the display area of other applications can be reduced through the reduction button on other applications on the second display screen, and the display area of the control area can be enlarged indirectly, thereby indirectly expanding the display area of the application control bar.
  • the user can shrink other application interfaces on the second display screen by a zoom-out gesture, so as to expand the display area of the control area, thereby expanding the display area of the application control bar.
  • the user can expand other application interfaces on the second display screen by a zoom-in gesture, reduce the display area of the control area, and then reduce the display area of the application control bar.
  • the user can directly operate the application control bar to change the display area of the application control bar. For example, the display area of the application control bar can be enlarged by the zoom-in button on the application control bar, or reduced by the shrink button on the application control bar.
  • the user can expand the display area of the application control bar by an enlargement gesture, and, as shown in FIG. 75C, reduce the display area of the application control bar by a zoom-out gesture.
  • the display area of the application control bar may also be changed according to the user's operation on the application on the first display screen. Specifically, when the number of control keys corresponding to the user's current operation is greater than the number of control keys corresponding to the user's previous operation, in order to display all the control keys in the application control bar and ensure the display effect of the control keys in the application control bar, the display area of the application control bar can be appropriately increased. For example, if the user's previous operation was to open an application, the initial control keys corresponding to the application can be displayed in the application control bar; when the user's current operation is an operation performed on the interface of the target application, a control key group for the user's current operation is added in the application control bar, and the display area of the application control bar can be appropriately increased at this time.
  • conversely, when the number of control keys corresponding to the user's current operation is smaller than that corresponding to the previous operation, the display area of the application control bar can be appropriately reduced.
  • the display area and position of the control area can be flexibly adapted to changes in the display of other functional modules on the second display screen, and the display area and position of the application control bar will also change with the display area and position of the control area.
  • the display area and position of the control area can be flexibly determined according to the display areas of the different types of virtual keyboards.
  • the specific implementation manner of triggering different types of virtual keyboards according to different gestures is shown in Embodiment 2, which will not be repeated here.
  • the interface of the target application is displayed on the second display screen, so that the user can perform handwriting input through the second display screen.
  • an application control bar may be displayed on the second display screen at this time, and the control keys associated with the handwriting input mode may be displayed in the application control bar.
  • since the interface of the target application has been displayed on the second display screen at this time, the application control bar may alternatively not be displayed on the second display screen.
  • the handwriting input area may be displayed in the application control bar, and the control key group associated with the handwriting input mode may also be displayed in the application control bar, For example: pen, eraser, color, font, etc.
  • the handwriting input area and the control key group associated with the handwriting input method can also be displayed in the application control bar.
  • the user can perform handwriting input through the application control bar, and/or operate the handwriting input mode through the application control bar, improving operation efficiency.
  • the specific implementation related to the switching of the handwriting input mode is shown in the third embodiment, which will not be repeated here.
  • Step 7202 Change the display area of the application control bar and the set of control keys according to the user operation.
  • the display area of the application control bar is expanded according to the degree of the user operation. For example, when the user enlarges the display area by clicking the enlargement button, the extent to which the display area of the application control bar is enlarged may be determined according to the number of clicks of the user. When the user enlarges the display area through the zoom-in gesture, it may be determined to what extent the display area of the application control bar is enlarged according to the degree of the user's zoom-in gesture.
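The click-count and gesture-degree rules above could be modeled as in this sketch; the step size, cap, and function name are assumptions for illustration only.

```python
def enlarged_bar_width(base, clicks=0, gesture_scale=1.0, step=30, cap=600):
    """Each click of the zoom-in button adds a fixed step; a zoom-in
    gesture scales the width by its magnitude; the result is capped."""
    width = int((base + clicks * step) * gesture_scale)
    return min(width, cap)
```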
  • the number of control keys in the control key group corresponding to the target application in the application control bar can be increased.
  • control keys in the original function modules in the application control bar may be added.
  • a set of new function modules and their corresponding control keys may be added to the application control bar.
  • the control keys in the original function modules in the application control bar and the set of new function modules and their corresponding control keys can be added at the same time.
  • the system may, according to the priority of each functional module and each control key, in order of priority from high to low, add some control keys on the basis of the set of control keys already displayed in the application control bar, and determine the layout of the application control bar after the control keys are added.
  • the control key group originally displayed in the application control bar can be moved down, and the newly added control key group can be displayed above it, that is, the newly added control key group is closer to the first display screen than the control key group originally displayed in the application control bar.
  • the priority of the newly added control key group is lower than that of the control key group originally displayed in the application control bar, where higher-priority control keys are those with more important functions or a higher frequency of use by the user.
  • the system may select, according to the display area of the application control bar, the display mode of the application control bar corresponding to that display area from those provided by the application program, and display it in the control area.
  • when the display area of the application control bar is enlarged so that the number of control keys corresponding to the target application in the application control bar increases, the display of the added control keys on the first display screen can be hidden. The specific implementation and beneficial effects are as described in step 6205.
  • the display area of the application control bar is reduced according to the degree of the user operation. For example, when the user reduces the display area by clicking the shrink button, the extent to which the display area of the application control bar is reduced may be determined according to the number of clicks by the user. When the user reduces the display area through a zoom-out gesture, the extent to which the display area of the application control bar is reduced may be determined according to the degree of the user's zoom-out gesture.
  • the control keys in the control key group corresponding to the target application in the application control bar can be reduced.
  • the number of function modules in the application control bar can be kept unchanged, and the number of control keys in the function module can be reduced.
  • the set of function modules of the application control bar and their corresponding control keys can be reduced.
  • the set of function modules and their corresponding control keys in the application control bar and the number of control keys in other reserved function modules can be simultaneously reduced.
  • the system may, according to the priority of each functional module and each control key, in order of priority from low to high, reduce control keys on the basis of the set of control keys displayed in the application control bar, and determine the layout of the application control bar after the control keys are reduced.
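The two priority orders (add from high to low when growing, remove from low to high when shrinking) can be sketched together. Keys are modeled as `(name, priority)` pairs with lower numbers meaning higher priority, which is an assumed convention, not the patent's.

```python
def grow_bar(shown, hidden, slots):
    """Add the highest-priority hidden keys first when the bar grows."""
    extra = sorted(hidden, key=lambda k: k[1])[:slots]
    return shown + extra

def shrink_bar(shown, slots_to_free):
    """Drop the lowest-priority displayed keys first when the bar shrinks."""
    keep = sorted(shown, key=lambda k: k[1])
    return keep[:max(0, len(keep) - slots_to_free)]
```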
  • the system may select the display mode of the application control bar provided by the application program according to the display area of the application control bar, and display it in the control area.
  • when the display area of the application control bar is reduced so that the control keys corresponding to the target application in the application control bar are reduced, the display of the removed control keys on the first display screen can be restored, so that when the user needs these control keys, the operation can be performed through the first display screen in a conventional manner, such as touch screen gestures or mouse operation.
  • changing the display area of the application control bar and the control key group in the application control bar can make the display of the control area more flexible and satisfy the user's different usage requirements in different usage scenarios, improving the user experience.
  • when the display area of the application control bar changes, the display layout of other display areas in the control area, or of other display modules on the second display screen (display interfaces of other applications, the virtual keyboard, etc.), can be adaptively adjusted.
  • so that the user can still quickly locate the desired control key, anchor point feedback technology can be added in the control area.
  • feedback may be provided to the user, indicating that the user is touching a control key in the application control bar at this time.
  • feedback may be provided to the user, indicating that the user is touching a control key in the system control bar at this time.
  • some control keys in the application control bar or the system control bar that have more important functions or are frequently used by the user may be set as anchor point feedback keys, so that the user can quickly locate these important or frequently used control keys.
  • the specific implementation manner of the anchor point feedback is shown in Embodiment 1, which will not be repeated here.
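As an illustrative reading of anchor point feedback (Embodiment 1 holds the actual implementation): frequently used or important keys could be marked as anchors and trigger a distinct feedback signal, such as vibration, when touched. All names and the use-count heuristic below are assumptions.

```python
def anchor_keys(keys, top_n=2):
    """keys: list of (name, use_count); the most used become anchor keys."""
    ranked = sorted(keys, key=lambda k: k[1], reverse=True)
    return {name for name, _ in ranked[:top_n]}

def feedback_on_touch(name, anchors):
    """Return a feedback signal only when an anchor key is touched."""
    return "vibrate" if name in anchors else None
```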
  • the embodiment of the present invention does not limit the number of executions of steps 7201 and 7202.
  • the user can perform the operation of changing the display area of the application control bar multiple times, and the system can obtain, in real time, the operation by which the user instructs changing the display area of the application control bar, and change the display area of the application control bar and the control key group in the application control bar multiple times according to the user operation.
  • the screen display method 7200 may also include one or more steps in the screen display method 6200, and the specific implementation is as described in the screen display method 6200. To avoid repetition, details are not repeated here.
  • the control area displayed on the second display screen has an output function, that is, as a human-computer interaction interface, a part of the control key set of the target application is displayed to the user.
  • the control area may also have some input functions, such as a touch screen gesture function, to receive user input and then perform some operations on the target application or on the control area itself.
  • the control area can receive user input to control the function of the target application.
  • the control key set in the control area corresponding to the target application may include control keys for processing text content, such as copy, paste, cut, etc.
  • the text content can be edited by clicking a control key in the control area through a touch screen gesture, or by selecting a control key in the control area with a mouse.
  • the control key set in the control area corresponding to the target application may include control keys for controlling video content, such as volume control keys, brightness control keys, and a progress control bar. At this time, the volume, brightness, playback progress, etc. can be controlled through the control keys in the control area.
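A tiny dispatch sketch for such video control keys; the key names and value ranges are assumptions, not taken from the patent.

```python
def apply_control(state, key, value):
    """Apply a control-area key to player state; percentage controls are
    clamped to 0-100 and progress cannot go below zero."""
    if key in ("volume", "brightness"):
        state[key] = max(0, min(100, value))
    elif key == "progress":
        state[key] = max(0, value)
    else:
        raise KeyError(f"unknown control key: {key}")
    return state
```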
  • the user can jointly operate the target application in the first display screen through the set of control keys in the control area and other input methods; for example, the user can use a mouse or touch screen gestures on the first display screen to select a specific object in the page being edited, and then edit the selected object through the control key set in the control area.
  • the cooperative control between the set of control keys in the control area and the mouse or touch screen gestures is only exemplary, and other possible cooperative modes capable of operating the target application in the first display screen are also possible.
  • the user can view, edit and customize the control key set in the control area.
  • the control area can support the following touch screen gesture operations of the user:
  • the control area can be operated by a drag gesture.
  • the drag gesture can be used to drag a control key from a certain position in the control area to another position in the control area, as shown in FIG. 77A.
  • the drag gesture can also be used to drag a function module in the control area as a whole to another position in the control area.
  • the drag gesture can also be used to move the position of the entire control area in the second display screen, as shown in FIG. 77B .
  • the control area may be operated through a sliding gesture; for example, the sliding gesture may be used to browse the displayed content in the control area.
  • the control keys not displayed by the function module can be browsed through a sliding gesture, as shown in FIG. 77C.
  • control area can be operated through a flick gesture, for example, the flick gesture can be used to remove some content in the control area, as shown in FIG. 77D .
  • the control area can be operated through a finger pressure gesture, and when the user's finger pressure gesture is received at different positions in the control area, different functions can be performed correspondingly.
  • in one implementation, as shown in FIG. 78A, if a user's finger pressing gesture is received on a control key in the control area, the delete button of the current control key can be displayed, and the control key can then be deleted through the delete button.
  • after the control key is deleted, the corresponding display position can display a vacancy and an add button, and the user can add a new control key at that position through the add button.
  • Figure 78B if a finger pressing gesture is received at the area boundary of different functional modules, the function of moving the edges of the two functional modules divided by the boundary line can be triggered.
  • the user can drag the separation line to change the display areas of the two function modules. Specifically, if the display area of one function module increases and the display area of the other function module decreases, the number of control keys displayed in the function module with the increased display area can be increased, and the number of control keys displayed in the function module with the reduced display area can be decreased, according to the priority order of the control keys in the two function modules.
  • the control area can be operated by hovering gestures, and the hovering gestures can be used to perform preview operations.
  • the hovering gesture operations can be used to view the names of the current control keys, auxiliary prompts, and other contents.
  • the floating gesture operation can be used to preview the control keys that are not displayed in the current control area due to the display area.
  • touch screen gesture operations listed above are only exemplary, and other common gesture operations in the art are also possible.
  • As shown in FIG. 80A, in the conventional display state, all content related to the note-taking application is displayed on the first display screen.
  • the first display screen may include the main display area of the note content, function menu bars such as list navigation, and a fixed menu bar.
  • the user can operate the note-taking application in a normal operation manner, for example, control the note-taking application through the first display screen through a mouse or a touch screen gesture.
  • control area can be activated in the following four ways:
  • the virtual keyboard and the control area can be opened at the same time when an instruction to open the virtual keyboard is received from the user.
  • the control area can be opened when a gesture of the user to open the control area is received.
  • the control area can be opened when an instruction from the user to display the note-taking application in full screen is received.
  • when the system receives the user's instruction to activate the control area, according to the implementation in the method embodiment corresponding to the screen display method 6200, the system control key group and the control key group corresponding to the target application are displayed in the control area of the second display screen according to the display area of the control area, and the display area of other applications on the second display screen is correspondingly reduced. For example, as shown in FIG. 80B, when the initial display area of the control area is at its smallest, only the system control key group related to system control is displayed in the system control bar of the control area, and part of the control key group corresponding to the target application is displayed in the application control bar.
  • when receiving an operation from the user to change the display area of the application control bar, for example, as shown in FIG. 80C, when the user enlarges the display area of the application control bar through a zoom-in gesture, the system expands the display areas of the control area and the application control bar according to the user's operation, and a function module and its control key group corresponding to the note-taking application are added in the application control bar; at the same time, the function menu bar corresponding to that function module is removed from the first display screen.
  • as shown in Fig., when the user further expands the display area of the application control bar through the zoom-in gesture, the system further expands the display areas of the control area and the application control bar according to the user's operation, adds another function module of the note-taking application and its control key group in the application control bar, and at the same time removes the function menu bar corresponding to that function module from the first display screen.
  • the user can operate the target application in the first display screen through the control keys in the control area of the second display screen.
  • the user can click the control key in the function module to select which part of the note-taking application to browse.
  • the user can edit the currently displayed note by clicking the control key in the function module.
  • users can also operate the application control bar itself.
  • the user can customize and edit the displayed content of the application control bar, and the user can also view the names, functions or other descriptions of the control keys in the control area through a hovering gesture, as shown in FIG. 80G .
  • control area can be closed in the following four ways:
  • the virtual keyboard and the control area can be closed at the same time when an instruction to close the virtual keyboard is received from the user.

Abstract

本申请涉及人机交互领域,实施例中提供一种反馈方法,应用于配置有触控屏幕的电子设备中,触控屏幕中配置有多个振动反馈元件,方法包括:检测作用于触控屏幕上的第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息,第一位置信息与虚拟键盘上的第一虚拟按键对应;在第一虚拟按键为锚定点按键的情况下,从多个振动反馈元件中获取与第一虚拟按键匹配的第一振动反馈元件,指示第一振动反馈元件发出振动波,以提示第一虚拟按键为锚定点按键,从而用户可以感知锚定点按键的位置,有利于降低在触控屏幕上实现盲打的难度。

Description

一种反馈方法以及相关设备
本申请要求于2020年12月30日提交中国专利局、申请号为202011628845.7、发明名称为“一种反馈方法以及相关设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及计算机技术领域,特别涉及一种反馈方法以及相关设备。
背景技术
为计算机系统提供文本输入的最常用方式是通过键盘来实现,但键盘并非属于十分便携的设备,为此,常见的文本输入方式是在触控屏幕上设置虚拟键盘,用户可通过该虚拟键盘进行文本输入。
但虚拟键盘缺少实体键盘的很多特性,从而导致在触控屏幕上实现盲打成为一种较为困难的任务。
发明内容
本申请实施例提供了一种反馈方法以及相关设备,当用户接触的为虚拟按键上的锚定点按键时,会通过触控屏幕执行第一反馈操作,以提示用户当前接触的为锚定点按键,从而用户可以感知锚定点按键的位置,有利于降低在触控屏幕上实现盲打的难度。
为解决上述技术问题,本申请实施例提供以下技术方案:
第一方面，本申请实施例提供一种反馈方法，可用于虚拟键盘领域中。方法应用于电子设备，电子设备配置有触控屏幕，触控屏幕中配置有多个振动反馈元件，方法包括：电子设备检测作用于触控屏幕上的第一接触操作，响应于第一接触操作，获取与第一接触操作对应的第一接触点的第一位置信息，第一位置信息与虚拟键盘上的第一虚拟按键对应。在第一虚拟按键为锚定点按键的情况下，电子设备从多个振动反馈元件中获取一个或多个第一振动反馈元件；其中，第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件，与不同的虚拟按键匹配的振动反馈元件不完全相同；虚拟键盘可以表现为任意类型的键盘，作为示例，例如虚拟键盘可以为全键盘、数字键盘、功能键盘等，或者，虚拟键盘也可以为触控屏幕上所有操作按键的统称。锚定点按键的含义不等同于定位按键，也即锚定点按键指的是用于给用户带来提示效果的按键，在确定了当前展示的虚拟键盘之后，哪些虚拟按键为锚定点按键可以为预先配置于电子设备中的，也即哪些虚拟按键为锚定点按键可以为预先固定好的；也可以为由用户进行自定义，也即用户可以通过电子设备中的“设置”界面来自行定义哪些虚拟按键为锚定点按键。进一步地，针对根据第一位置信息判断第一虚拟按键是否为锚定点按键的过程，在一种实现方式中，电子设备根据该第一位置信息，获取与第一接触点对应的第一虚拟按键，继而判断第一虚拟按键是否为锚定点按键。在另一种实现方式中，电子设备可以预先存储触控屏幕上的哪些位置区域是锚定点按键的位置区域，触控屏幕上的哪些位置区域是非锚定点按键的位置区域，电子设备根据第一位置信息，直接判断第一接触点的位置是否位于锚定点按键的位置区域内，以确定与第一位置信息对应的第一虚拟按键是否为锚定点按键。电子设备指示与第一虚拟按键匹配的所有第一振动反馈元件发出振动波，以执行第一反馈操作，第一反馈操作用于提示第一虚拟按键为锚定点按键。
本实现方式中,当用户接触的为虚拟按键上的锚定点按键时,会通过触控屏幕执行第一反馈操作,以提示用户当前接触的为锚定点按键,从而用户可以感知锚定点按键的位置,有利于降低在触控屏幕上实现盲打的难度;此外,触控屏幕中配置有多个振动反馈元件,在确定第一虚拟按键为锚定点按键的情况下,从多个振动反馈元件中获取与第一虚拟按键匹配的至少一个第一振动反馈元件,并指示该至少一个第一振动反馈发出振动波,能够实现仅在第一虚拟按键的周围产生振动反馈的效果,也即不是对全屏进行振动反馈,由于打字的时候所有手指都放置于触控屏幕上,如果是全屏的振动的话,则所有的手指都会感受到振动,就容易让用户混淆,但只在第一虚拟按键周围产生振动反馈的效果,则用户不容易产生混淆,更容易帮助用户在手指处形成肌肉记忆,以协助用户实现在触控屏幕上进行盲打。
在第一方面的一种可能实现方式中,电子设备获取与第一接触操作对应的第一接触点的第一位置信息之后,方法还包括:电子设备根据第一位置信息,获取与第一接触点对应的第一虚拟按键。本实现方式中,能够根据第一位置信息,实时获取与第一接触点对应的第一虚拟按键,使得本方案不仅能够兼容位置固定的虚拟键盘,也可以兼容位置会移动的虚拟键盘,扩展了本方案的应用场景。
在第一方面的一种可能实现方式中,电子设备中配置有第一映射关系,第一映射关系用于指示虚拟按键与振动反馈元件之间的对应关系。电子设备从多个振动反馈元件中获取第一振动反馈元件,包括:电子设备根据第一映射关系和第一虚拟按键,获取与第一虚拟按键匹配的第一振动反馈元件。可选地,若电子设备上预先配置有与多种虚拟键盘一一对应的多个映射关系,每个映射关系中包括多个虚拟按键与多个第一振动反馈元件之间的对应关系。则在电子设备根据第一映射关系和第一虚拟按键,获取与第一虚拟按键匹配的第一振动反馈元件之前,需要先从多个映射关系中获取与当前展示的虚拟键盘的类型匹配的第一映射关系。
本实现方式中，预先配置有第一映射关系，从而在获取到第一虚拟按键之后，能够根据第一映射关系，获取与第一虚拟按键匹配的至少一个第一振动反馈元件，方便快捷，有利于提高振动反馈元件的匹配过程的效率；将确定振动反馈元件这一步骤进行拆分，从而当出现故障时，有利于对故障位置进行精确定位。
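上述第一映射关系的查找过程可以用如下示意代码概括。这只是基于本文描述的示意性草图：按键名称与振动反馈元件编号均为假设值，仅用于说明“虚拟按键到振动反馈元件集合”这一数据结构，并非真实实现。

```python
# 示意性草图：第一映射关系（虚拟按键 -> 匹配的振动反馈元件编号集合）。
# 按键名称与元件编号均为假设值。
FIRST_MAPPING = {
    "F": [3, 4],            # 锚定点按键 F 匹配的第一振动反馈元件
    "J": [11, 12],          # 锚定点按键 J 匹配的第一振动反馈元件
    "Space": [20, 21, 22],  # 其他虚拟按键同样可以配置对应元件
}

def get_feedback_elements(virtual_key):
    """根据第一映射关系，返回与第一虚拟按键匹配的振动反馈元件编号列表。"""
    return FIRST_MAPPING.get(virtual_key, [])
```

查找失败时返回空列表，对应于该虚拟按键未配置振动反馈元件的情况。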
在第一方面的一种可能实现方式中，电子设备中配置有第一映射关系，第一映射关系指示位置信息与振动反馈元件之间的对应关系。电子设备从多个振动反馈元件中获取第一振动反馈元件，包括：电子设备根据第一映射关系和第一位置信息，获取与第一位置信息匹配的第一振动反馈元件，由于第一位置信息与虚拟键盘上的第一虚拟按键对应，也即获取与第一虚拟按键对应的第一振动反馈元件。本实现方式中，可以根据第一位置信息和第一映射关系，获取与第一虚拟按键匹配的至少一个第一振动反馈元件，方便快捷，有利于提高振动反馈元件的匹配过程的效率；且第一映射关系能够指示第一位置信息和第一振动反馈元件之间的对应关系，不仅可以兼容位置固定的虚拟键盘，还可以兼容位置能够移动的虚拟键盘，保证了各种场景下均可以提供振动反馈。
在第一方面的一种可能实现方式中，电子设备通过第一振动反馈元件发出振动波之前，方法还包括：电子设备获取与至少一个第一振动反馈元件中各个第一振动反馈元件对应的振动波的振动强度，至少一个第一振动反馈元件中各个第一振动反馈元件的振动波的振动强度与以下中任一项或多项因素相关：第一数量、每个第一振动反馈单元与第一虚拟按键的中心点的距离、振动波的类型、虚拟按键是否为锚定点按键或第一位置信息的位置类型，第一数量为第一振动反馈元件的数量。电子设备通过第一振动反馈元件发出振动波，包括：电子设备根据与各个第一振动反馈元件对应的振动波的振动强度，通过至少一个第一振动反馈元件发出振动波，以使与第一虚拟按键对应的振动反馈的强度和与第二虚拟按键对应的振动反馈的强度的差异在预设强度范围内，第二虚拟按键和第一虚拟按键为不同的虚拟按键；预设强度范围可以为强度差异在百分之二以内、强度差异在百分之三以内、强度差异在百分之四以内或强度差异在百分之五以内。进一步地，针对在触控屏幕的表面进行强度测量的过程，可以将振动测量仪器的探头贴合在触控屏幕上的一个虚拟按键（也即一个检测点）的表面，以从前述检测点上采集到振动波，进而得到该采集到的振动波的波形曲线，通过前述波形曲线来指示与该检测点对应的振动反馈的强度。更进一步地，与第一虚拟按键对应的振动反馈的强度和与第二虚拟按键对应的振动反馈的强度之间的差异，可以通过对比在第一虚拟按键这个检测点上量取的波形曲线与在第二虚拟按键这个检测点上量取的波形曲线之间的差异来获得。
本实现方式中,由于与不同的虚拟按键对应的振动反馈元件的数量可能不同,所以根据匹配的振动反馈元件的数量,来确定各个振动反馈元件的强度,以实现各个虚拟按键的振动反馈强度的差别在预设范围之内,由于当用户在使用实体按键时,不同的按键给出的力反馈基本相同,从而可以降低虚拟键盘与实体键盘之间的差异,以增加用户粘度。
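上述“按匹配的振动反馈元件数量确定各元件振动强度、使不同按键的整体反馈强度差异落在预设强度范围内”的思路，可以用如下示意代码说明。均分策略与 5% 的预设强度范围均为基于文中示例的假设，并非唯一实现方式。

```python
def per_element_intensity(target_strength, element_count):
    """按匹配到的振动反馈元件数量（第一数量）均分目标反馈强度，
    使匹配元件数量不同的按键整体反馈强度接近一致（示意性做法）。"""
    if element_count <= 0:
        raise ValueError("element_count must be positive")
    return target_strength / element_count

def within_preset_range(strength_a, strength_b, ratio=0.05):
    """判断两个虚拟按键的振动反馈强度差异是否在预设强度范围内。
    此处假设预设强度范围为强度差异在百分之五以内。"""
    return abs(strength_a - strength_b) <= ratio * max(strength_a, strength_b)
```

例如匹配两个元件的按键，每个元件发出目标强度一半的振动波，整体强度即与匹配单个元件的按键基本一致。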
在第一方面的一种可能实现方式中,第一振动反馈元件为以下中的任一种:压电陶瓷片、线性马达或压电薄膜。本实现方式中,提供了振动反馈元件的多种具体表现形式,提高了本方案的实现灵活性。
在第一方面的一种可能实现方式中,第一接触点为触控屏幕上新增的接触点。本申请实施例中,由于用户在使用实体键盘时,往往关注点放在新接触的实际按键中,本方案中仅对新增接触点产生反馈,可以更好的模拟用户使用实体键盘进行输入时的体验,且仅针对新增接触点产生反馈,也更容易建立用户与新增接触点之间的记忆关系,进一步降低在触控屏幕上训练盲打的难度。
在第一方面的一种可能实现方式中，方法还包括：在第一虚拟按键为非锚定点按键的情况下，电子设备执行第二反馈操作，第二反馈操作用于提示第一虚拟按键为非锚定点按键，第一反馈操作与第二反馈操作为不同的反馈操作。本实现方式中，不仅在第一虚拟按键为锚定点按键的情况下执行反馈操作，且在第一虚拟按键为非锚定点按键的情况下也会执行反馈操作，第一反馈操作和第二反馈操作为不同的反馈操作，由于当用户使用实体键盘时，每个按键均会给用户以反馈，通过前述方式，能够增加虚拟键盘与实体键盘之间的相似度，且对锚定点按键与非锚定点按键给出不同的反馈操作，也可以帮助用户记住不同类型的按键，以协助用户实现在虚拟键盘上的盲打。
在第一方面的一种可能实现方式中,第一反馈操作为通过触控屏幕发出第一类型的振动波,第二反馈操作为通过触控屏幕发出第二类型的振动波,第一类型的振动波和第二类型的振动波为不同类型的振动波。若电子设备通过振动反馈元件发出的为连续的振动波,则不同类型的振动波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长或包络形状。若电子设备通过振动反馈元件发出的为脉冲形式的振动波,则不同类型的振动波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长、包络形状或电子设备发出脉冲形式的振动波的频率。
在第一方面的一种可能实现方式中,电子设备执行第一反馈操作之前,方法还包括:电子设备根据第一位置信息,获取与第一接触点对应的位置类型,位置类型包括第一接触点位于第一虚拟按键的第一位置区域(也可以称为锚定点按键的特征区域)和第一接触点位于第一虚拟按键的第二位置区域(也可以称为锚定点按键的边缘区),第一位置区域和第二位置区域不同;电子设备执行第一反馈操作,包括:电子设备根据与第一接触点对应的位置类型,通过触控屏幕执行第一反馈操作,与第一位置区域对应的反馈操作和与第二位置区域对应的反馈操作不同。
本实现方式中,将锚定点按键和/或非锚定点按键的全部位置区域划分为第一位置区域和第二位置区域,在第一接触点位于的第一位置区域的情况下,和,在第一接触点位于第二位置区域这两种情况下,电子设备通过至少一个第一振动反馈元件发出的振动波的类型不同,有利于帮助用户记忆虚拟按键的边界,也即有利于协助用户对虚拟按键的不同区域建立肌肉记忆,以进一步降低在触控屏幕上实现盲打的难度。
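上述“将虚拟按键划分为第一位置区域（特征区）与第二位置区域（边缘区）”的判断，可以用如下示意代码说明。矩形按键模型与 0.25 的边缘区占比均为假设，仅用于说明区域划分逻辑。

```python
def position_type(x, y, key_rect, edge_ratio=0.25):
    """判断接触点位于虚拟按键的第一位置区域（特征区）还是第二位置区域（边缘区）。
    key_rect = (left, top, width, height)；edge_ratio 为假设的边缘区占比。"""
    left, top, w, h = key_rect
    ex, ey = w * edge_ratio, h * edge_ratio
    # 接触点落在按键中心的内缩矩形内，视为第一位置区域；否则视为第二位置区域
    inner = (left + ex <= x <= left + w - ex) and (top + ey <= y <= top + h - ey)
    return "第一位置区域" if inner else "第二位置区域"
```

电子设备可据此选择与相应位置区域对应的反馈操作（例如不同类型的振动波）。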
在第一方面的一种可能实现方式中,与锚定点按键的第一位置区域对应的反馈操作和与非锚定点按键的第一位置区域对应的反馈操作相同,且与锚定点按键的第二位置区域对应的反馈操作和与非锚定点按键的第二位置区域对应的反馈操作不同;或者,与锚定点按键的第一位置区域对应的反馈操作和与非锚定点按键的第一位置区域对应的反馈操作不同,且与锚定点按键的第二位置区域对应的反馈操作和与非锚定点按键的第二位置区域对应的反馈操作相同;或者,与锚定点按键的第一位置区域对应的反馈操作和与非锚定点按键的第一位置区域对应的反馈操作不同,且与锚定点按键的第二位置区域对应的反馈操作和与非锚定点按键的第二位置区域对应的反馈操作不同。
在第一方面的一种可能实现方式中,第一接触操作为按压操作,方法还包括:电子设备检测作用于触控屏幕上的第二接触操作,并获取与第二接触操作对应的第二接触点的第二位置信息,第二接触操作为触摸操作;电子设备响应于第二接触操作,改变触控屏幕上的第二接触点的触觉特性,触觉特性包括以下中的任一种或多种特性:滑动摩擦系数、粘滑性和温度。
在第一方面的一种可能实现方式中，电子设备检测作用于触控屏幕上的第一接触操作之前，方法还包括：电子设备检测到作用于触控屏幕上的第一手势操作；电子设备响应于第一手势操作，从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘，其中，多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同；电子设备通过触控屏幕展示第一类型的虚拟键盘，在第一类型的虚拟键盘的展示过程中，第一类型的虚拟键盘在触控屏幕上的位置固定；电子设备检测作用于触控屏幕上的第一接触操作，包括：电子设备在第一类型的虚拟键盘的展示过程中，检测作用于触控屏幕上的第一接触操作。对于本实现方式中各个名词的含义、具体的实现步骤以及带来的有益效果均会在后续第八方面进行描述，此处暂不做介绍。
第二方面，本申请实施例提供了一种电子设备，可用于虚拟键盘领域中。电子设备配置有触控屏幕，触控屏幕包括接触感知模块和振动反馈模块，振动反馈模块包括多个振动反馈元件。接触感知模块，用于获取触控屏幕上的第一接触点的第一位置信息，接触感知模块具体可以表现为接触感知薄膜，接触感知薄膜具体可以为电容式接触感知薄膜、压力式接触感知薄膜或温度式接触感知薄膜或其他类型的薄膜。第一振动反馈元件，用于在与第一接触点对应的第一虚拟按键为锚定点按键的情况下，发出振动波，振动波用于提示第一虚拟按键为锚定点按键，第一振动反馈元件为以下中的任一种：压电陶瓷片、线性马达或压电薄膜；其中，第一虚拟按键为虚拟键盘中的一个虚拟按键，第一振动反馈元件为多个振动反馈元件中与第一虚拟按键匹配的振动反馈元件。
在第二方面的一种可能实现方式中,第一接触点为基于作用于触控屏幕上的按压操作得到,触控屏幕还包括盖板和超声波模块,超声波模块用于发出超声波,以改变盖板的触觉特性;具体的,接触感知模块还用于获取触控屏幕上的第二接触点的第二位置信息;超声波模块具体用于在第二接触点为基于作用于触控屏幕上的触摸操作得到的情况下,发出超声波,以改变盖板的触觉特性。或者,触控屏幕还包括盖板和静电模块,静电模块用于产生电信号,以改变盖板的触觉特性;具体的,接触感知模块还用于获取触控屏幕上的第二接触点的第二位置信息;静电模块具体用于在第二接触点为基于作用于触控屏幕上的触摸操作得到的情况下,产生电信号,以改变盖板的触觉特性。其中,触觉特性包括以下中的任一种或多种特性:滑动摩擦系数、粘滑性和温度。
本实现方式中,触控屏幕还可以通过设置超声波模块或静电模块的方式,来改变盖板的触觉特性,从而可以提供更为丰富的触觉反馈,进而可以利用更为丰富的触觉反馈来对用户在触控屏幕上实现盲打进行训练,以进一步降低在触控屏幕上实现盲打的难度。
在第二方面的一种可能实现方式中,触控屏幕还包括压力感知模块,压力感知模块和振动反馈模块集成于一体,振动反馈元件为压电陶瓷片、压电聚合物或压电复合材料。压力感知模块用于采集与第一接触操作对应的压力值,以确定第一接触操作为按压操作还是触摸操作。具体的,在一种情况下,可以对振动反馈模块(也可以称为压力感知模块)中包括的多个振动反馈元件进行划分,多个振动反馈元件中的第二振动反馈元件用于采集压力值,多个振动反馈元件中的第三振动反馈元件用于发出振动波,以进行振动反馈。其中,第二振动反馈元件和第三振动反馈元件为不同的振动反馈元件。在另一种情况下,振动反馈模块(也可以称为压力感知模块)中的多个振动反馈元件在第一时间段内用于采集压力值,在第二时间段内用于发出振动波,第一时间段和第二时间段不同。
本实现方式中,触控屏幕中还配置有用于进行采集压力值的压力感知模块,从而不仅可以获取到接触点的位置信息,还可以获取到接触点的压力值,以对通过触控屏幕获取到的接触操作做进一步细致的管理;且将压力感知模块和振动反馈模块集成于一体,有利于降低触控屏幕的厚度,进而提高电子设备的便捷性。
对于本申请实施例第二方面以及第二方面的部分可能实现方式中名词的概念、具体实现步骤所带来的有益效果,均可以参考第一方面中各种可能的实现方式中的描述,此处不再一一赘述。
第三方面,本申请实施例提供了一种电子设备,可用于虚拟键盘领域中。电子设备包括触控屏幕、存储器、一个或多个处理器以及一个或多个程序,触控屏幕中配置有多个振动反馈元件,一个或多个程序被存储在存储器中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:检测作用于触控屏幕上的第一接触操作;响应于第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息,第一位置信息与虚拟键盘上的第一虚拟按键对应;在第一虚拟按键为锚定点按键的情况下,从多个振动反馈元件中获取第一振动反馈元件,第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件;指示第一振动反馈元件发出振动波,以执行第一反馈操作,第一反馈操作用于提示第一虚拟按键为锚定点按键。
本申请实施例第三方面中,电子设备还可以用于实现第一方面各种可能实现方式中电子设备执行的步骤,对于本申请实施例第三方面以及第三方面的各种可能实现方式中某些步骤的具体实现方式,以及每种可能实现方式所带来的有益效果,均可以参考第一方面中各种可能的实现方式中的描述,此处不再一一赘述。
第四方面,本申请实施例提供了一种计算机程序,当其在计算机上运行时,使得计算机执行上述第一方面所述的反馈方法。
第五方面,本申请实施例提供了一种电子设备,包括处理器,所述处理器与所述存储器耦合;所述存储器,用于存储程序;所述处理器,用于执行所述存储器中的程序,使得所述执行设备执行如上述第一方面所述的反馈方法。
第六方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行上述第一方面所述的反馈方法。
第七方面,本申请实施例提供了一种芯片系统,该芯片系统包括处理器,用于支持实现上述第一方面中所涉及的功能,例如,发送或处理上述方法中所涉及的数据和/或信息。在一种可能的设计中,所述芯片系统还包括存储器,所述存储器,用于保存服务器或通信设备必要的程序指令和数据。该芯片系统,可以由芯片构成,也可以包括芯片和其他分立器件。
第八方面，本申请实施例提供一种虚拟键盘的处理方法，可用于人机交互领域中。方法应用于电子设备，电子设备中配置有显示屏，方法包括：电子设备检测作用于显示屏的第一手势操作，并响应于检测到的第一手势操作，从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘，其中，多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同；电子设备通过显示屏展示第一类型的虚拟键盘。
本实现方式中,电子设备中配置有多个不同类型的虚拟键盘,不同类型的虚拟键盘包括的虚拟按键不完全相同,用户能够实现通过不同的手势操作唤起不同类型的虚拟键盘,也即虚拟键盘不再是只能展示26个字母,而是通过不同类型的虚拟键盘向用户提供更多的虚拟按键,不仅提高了用户唤起虚拟键盘的过程中的灵活性,而且有利于提供更丰富的虚拟按键,从而不再需要提供额外的实体键盘。
在第八方面的一种可能实现方式中,电子设备从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,包括:电子设备根据第一规则,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,第一规则指示多个类型的手势操作与多个类型的虚拟键盘之间的对应关系。本实现方式中,电子设备中预先配置有第一规则,第一规则指示多个类型的手势操作与所述多个类型的虚拟键盘之间的对应关系,在检测到作用于显示屏的第一手势操作之后,可以根据第一规则,得到与特定的第一手势操作对应的第一类型的虚拟键盘,提高虚拟键盘匹配过程的效率。
在第八方面的一种可能实现方式中,在一种情况下,第一规则直接包括多个类型的手势操作与多个类型的虚拟键盘之间的对应关系;第一规则中包括多个第一标识信息和多个第二标识信息之间的对应关系,第一标识信息用于唯一指向一种类型的手势操作对应的第一标识信息,第二标识信息用于唯一指向一种类型的虚拟键盘。在另一种情况下,第一规则包括多组条件与多个类型的虚拟键盘之间的对应关系,多组条件中的每组条件与一个类型的手势操作对应,也即多组条件中每组条件为与手势操作对应的手势参数的限定条件,每组条件对应于一个类型的手势操作。
在第八方面的一种可能实现方式中,电子设备从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘之前,方法还包括:电子设备获取与第一手势操作对应的第一手势参数,其中,第一手势参数包括以下中任一项或多项参数:与第一手势操作对应的接触点的位置信息、与第一手势操作对应的接触点的数量信息、与第一手势操作对应的接触点的面积信息、与第一手势操作对应的手的相对角度信息、与第一手势操作对应的手的位置信息、与第一手势操作对应的手的数量信息和与第一手势操作对应的手的形状信息;电子设备从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,包括:电子设备根据第一手势参数,从多个类型的虚拟键盘中选取第一类型的虚拟键盘。
本实现方式中,第一手势参数中不仅包括每个接触点的位置信息和多个接触点的数量信息,还包括每个接触点的面积信息,接触点的面积信息能够从多个接触点中区分出基于手掌触发的接触点,有利于准确的估计第一手势操作的类型,避免显示错误的虚拟键盘,以提高虚拟键盘显示过程的正确率;对获取到的第一手势参数进行二次处理后,可以得到手的相对角度信息、手的位置信息、手的数量信息或手的形状信息等信息,也即基于第一手势参数可以得到关于第一手势操作的更为丰富的信息,增加虚拟键盘匹配过程的灵活性。
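根据第一手势参数从多个类型的虚拟键盘中选取第一类型虚拟键盘的第一规则，可以用如下示意代码概括。条件的划分方式与判断顺序均为基于本文后续各实现方式整理的假设，并非对第一规则的限定。

```python
def select_keyboard_type(hands, hand_side=None, contacts=10, in_first_region=False):
    """示意性的第一规则：根据手的数量、左右手、接触点数量、
    是否位于显示屏第一区域（左下方/右下方）选取虚拟键盘类型。"""
    if hands == 2:
        return "全键盘"
    if in_first_region:       # 位于显示屏第一区域的单手操作
        return "功能键键盘"
    if contacts < 3:          # 少于三个接触点的单手操作
        return "圆形或弧形键盘"
    if hand_side == "right":  # 右手单手操作
        return "数字键盘"
    if hand_side == "left":   # 左手单手操作
        return "功能性键盘"
    return "迷你键盘"
```

实际实现中，这些条件可由第一规则中的多组条件与多个类型的虚拟键盘之间的对应关系给出，并可由用户自定义。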
在第八方面的一种可能实现方式中,方法还包括:电子设备响应于第一手势操作,获取第一角度,第一角度指示与第一手势操作对应的手与显示屏的边之间的相对角度,或者,第一角度指示与第一手势操作对应的手与显示屏的中心线之间的相对角度。电子设备通过 显示屏展示第一类型的虚拟键盘,包括:电子设备根据第一角度,获取第一类型的虚拟键盘的第一展示角度,并通过显示屏按照第一展示角度展示第一类型的虚拟键盘;第一展示角度指示第一类型的虚拟键盘的边与显示屏的边之间的相对角度,或者,第一展示角度指示第一类型的虚拟键盘的边与显示屏的中心线之间的相对角度。
本实现方式中,获取用户的手与显示界面的边或中心线之间的相对角度(也即第一角度),并根据第一角度确定虚拟键盘的展示角度,从而使得键盘的展示角度更加贴合用户手的放置角度,使得用户利用虚拟键盘进行输入的过程更加舒适和便捷。
在第八方面的一种可能实现方式中，若第一类型的虚拟键盘是全键盘，该全键盘被拆分为第一子键盘和第二子键盘，第一角度包括左手的相对角度和右手的相对角度，第一子键盘和第二子键盘包括的为全键盘中不同的虚拟按键，第一展示角度包括第一子键盘的展示角度和第二子键盘的展示角度。若第一角度指示与第一手势操作对应的第一手势中手与显示屏的边之间的相对角度，第一子键盘的展示角度指示第一子键盘的边与显示屏的边之间的相对角度，第二子键盘的展示角度指示第二子键盘的边与显示屏的边之间的相对角度；若第一角度指示与第一手势操作对应的第一手势中手与显示屏的中心线之间的相对角度，第一子键盘的展示角度指示第一子键盘的边与显示屏的中心线之间的相对角度，第二子键盘的展示角度指示第二子键盘的边与显示屏的中心线之间的相对角度。
在第八方面的一种可能实现方式中，在一种情况下，电子设备判断第一角度是否大于或等于预设角度阈值，若大于或等于预设角度阈值，则获取第一展示角度，并通过显示屏按照第一展示角度展示第一类型的虚拟键盘，其中，预设角度阈值的取值可以为25度、28度、30度、32度、35度或其他数值等，此处不做限定。在另一种情况下，电子设备在获取到第一角度后，将第一类型的虚拟键盘的第一展示角度确定为第一角度，并通过显示屏按照第一角度展示第一类型的虚拟键盘。
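上述第一种情况（先与预设角度阈值比较，达到阈值才按第一角度展示）可以用如下示意代码说明。30 度的阈值取自文中列举的示例值之一；未达到阈值时按默认方向（此处假设为 0 度）展示。

```python
def display_angle(first_angle, threshold=30.0):
    """第一角度达到预设角度阈值时，以第一角度作为虚拟键盘的第一展示角度；
    否则按默认方向（0 度）展示。阈值 30 度为文中示例之一。"""
    return first_angle if abs(first_angle) >= threshold else 0.0
```

这样可避免手的轻微偏斜导致键盘频繁倾斜展示。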
在第八方面的一种可能实现方式中,多个类型的虚拟键盘中不同类型的虚拟键盘的功能不同,不同功能的虚拟键盘包括以下中任意两种或多种虚拟键盘的组合:数字键盘、功能键键盘、全键盘和自定义键盘,功能键键盘由功能键组成。本实现方式中,不同类型的虚拟键盘的功能不同,从而可以向用户提供多种不同功能的虚拟键盘,提高用户在虚拟键盘的使用过程的灵活性,以提高本方案的用户粘度。
在第八方面的一种可能实现方式中，在第一手势操作为单手操作的情况下，第一类型的虚拟键盘为以下中的任一种虚拟键盘：迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘、自定义键盘，其中，迷你键盘包括26个字母按键，功能性键盘展示于应用程序中，功能性键盘包括的虚拟按键与应用程序的功能对应。需要说明的是，同一电子设备中不需要同时配置有迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘和自定义键盘，此处举例仅为证明在一个电子设备中单手操作触发的可以为迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘或自定义键盘中的任一种虚拟键盘。本实现方式中，提供了在第一手势操作为单手操作和双手操作这两种情况下，通过显示屏展示的虚拟键盘的多种具体表现形式，提高了本方案的实现灵活性，也扩展了本方案的应用场景。
在第八方面的一种可能实现方式中,在第一手势操作为双手操作的情况下,第一类型的虚拟键盘为全键盘,全键盘至少包括26个字母按键,全键盘的尺寸比迷你键盘大。电子设备通过显示屏展示第一类型的虚拟键盘,包括:在双手之间的距离小于或等于第一距离阈值的情况下,电子设备通过显示屏,采用一体式的方式展示全键盘;在双手之间的距离大于第一距离阈值的情况下,电子设备通过显示屏的第二区域展示第一子键盘,通过显示屏的第三区域展示第二子键盘,其中,第二区域和第三区域为显示屏中的不同区域,第一子键盘和第二子键盘包括的为全键盘中不同的虚拟按键;第一距离阈值的取值可以为70毫米、75毫米、80毫米等,此处不做限定。
本实现方式中,可以基于用户两手之间的距离来决定是采用一体式展示虚拟键盘,还是采用分离式的方式展示虚拟键盘,进一步提高了虚拟键盘的展示过程的灵活性,使得展示的虚拟键盘更加便于用户使用,进一步提高本方案的用户粘度。
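上述“根据双手之间的距离决定一体式或分离式展示全键盘”的判断可以用如下示意代码说明。75 毫米的第一距离阈值取自文中列举的示例值之一。

```python
def full_keyboard_layout(hand_distance_mm, threshold_mm=75.0):
    """根据双手之间的距离决定全键盘的展示方式（示意）。
    距离不超过第一距离阈值时一体式展示，否则拆分为两个子键盘分别展示。"""
    if hand_distance_mm <= threshold_mm:
        return "一体式全键盘"
    return "分离式：第一子键盘与第二子键盘"
```

分离式展示时，第一子键盘与第二子键盘分别显示于显示屏的第二区域与第三区域。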
在第八方面的一种可能实现方式中,在第一手势操作为第一单手操作的情况下,第一类型的虚拟键盘为迷你键盘。本实现方式中,在第一手势操作为单手操作的情况下,第一类型的虚拟键盘为迷你键盘,有利于提高用户输入字母过程的灵活性。
在第八方面的一种可能实现方式中,单手操作包括左手单手操作和右手单手操作;在第一手势操作为右手单手操作的情况下,第一类型的虚拟键盘为数字键盘;在第一手势操作为左手单手操作的情况下,第一类型的虚拟键盘为功能性键盘,功能性键盘包括的虚拟按键与应用程序的功能对应,作为示例,例如第一手势操作是在游戏类的应用程序中获取到的,则功能性键盘可以为游戏键盘,游戏键盘中配置有游戏常用按键。再例如第一手势操作是在绘图类的应用程序中获取到的,则功能性键盘可以为绘图软件中的常用按键等。
本实现方式中,在第一手势操作为右手单手操作的情况下,第一类型的虚拟键盘为数字键盘,在第一手势操作为左手单手操作的情况下,第一类型的虚拟键盘为功能性键盘,更加符合用户对实体键盘的使用习惯,以降低虚拟键盘与实体键盘之间的差异,有利于增强用户粘度。
在第八方面的一种可能实现方式中,在第一手势操作为位于显示屏的第一区域的单手操作的情况下,第一类型的虚拟键盘为功能键键盘,第一区域位于显示屏的左下方或右下方。本实现方式中,由于功能键按键配置于实体键盘的左下方或右下方,在第一手势操作为位于显示屏的第一区域的单手操作的情况下,第一类型的虚拟键盘为功能键键盘,由于触发手势与用户的使用实体键盘的习惯相同,方便用户记忆触发手势,降低本方案的实现难度,有利于增强用户粘度。
在第八方面的一种可能实现方式中,方法还包括:电子设备获取针对功能键键盘中第一虚拟按键的接触操作,作为示例,例如第一虚拟按键可以为Ctrl按键,也可以同时包括Ctrl按键和Shift按键等。电子设备响应于针对功能键键盘中第一虚拟按键的接触操作,在显示屏上突出展示第二虚拟按键,第二虚拟按键为组合型的快捷键中除第一虚拟按键之外的按键。突出展示包括但不限于高亮展示、加粗展示或闪烁展示,此处不做限定。作为示例,例如在绘图类的应用程序中,Ctrl按键+Shift按键+I按键的组合按键能够提供对当前处理的图像进行反相显示的功能,则第一虚拟按键包括Ctrl按键和Shift按键,第二虚拟按键 为虚拟按键I。
本申请实施例中,在显示屏中展示功能键键盘的过程中,获取针对功能键键盘中第一虚拟按键的接触操作,响应于该接触操作,在显示屏上突出展示第二虚拟按键,第二虚拟按键为组合型的快捷键中除第一虚拟按键之外的按键,由于功能键键盘占用面积小,从而减少了显示虚拟键盘所需要的面积,且在用户对功能键键盘中第一虚拟按键执行接触操作时,又能自动展示组合型的快捷键中的第二虚拟按键,从而保证了用户对快捷键的需求,也避免了对显示屏的显示面积的浪费。
在第八方面的一种可能实现方式中,第一手势操作为通过显示屏获取到的接触操作,第一手势参数包括与第一手势操作对应的接触点的数量信息;在第一手势操作为少于三个接触点的单手操作的情况下,第一类型的虚拟键盘为圆形键盘或弧形键盘。本实现方式中,当第一手势操作为少于三个接触点的单手操作时,还可以提供圆形键盘或弧形键盘,不仅能提供实体键盘中存在的键盘,而且还可以提供实体键盘中不存在的键盘,丰富了键盘的类型,给用户提供了更多的选择,进一步增强用户的选择灵活度。
在第八方面的一种可能实现方式中,第一规则包括第一子规则,第一子规则为基于对至少一个类型的手势操作和/或至少一个类型的虚拟键盘执行自定义操作后得到的。本实现方式中,用户可以对触发手势和/或虚拟键盘的类型进行自定义,使得虚拟键盘的展示过程更加符合用户的预期,以进一步提高本方案的用户粘度。
在第八方面的一种可能实现方式中,显示屏中配置有多个振动反馈元件,在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在显示屏上的位置固定,通过显示屏展示第一类型的虚拟键盘之后,方法还包括:电子设备检测作用于显示屏上的第一接触操作,响应于第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息,第一位置信息与虚拟键盘上的第一虚拟按键对应。在第一虚拟按键为锚定点按键的情况下,电子设备从多个振动反馈元件中获取第一振动反馈元件,第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件;指示第一振动反馈元件发出振动波,以执行第一反馈操作,第一反馈操作用于提示第一虚拟按键为锚定点按键。对于本实现方式中第一接触操作、第一接触点、第一位置信息、第一虚拟按键、第一振动反馈元件等名词的含义、具体的实现步骤以及带来的有益效果均可以参阅第一方面中各种可能的实现方式中的描述,此处暂不做介绍。
本申请实施例第八方面中,电子设备还可以用于实现第一方面各种可能实现方式中电子设备执行的步骤,对于本申请实施例第八方面以及第八方面的各种可能实现方式中某些步骤的具体实现方式,以及每种可能实现方式所带来的有益效果,均可以参考第一方面中各种可能的实现方式中的描述,此处不再一一赘述。
第九方面,本申请实施例提供了一种电子设备,可用于人机交互领域中。电子设备包括显示屏、存储器、一个或多个处理器以及一个或多个程序,一个或多个程序被存储在存储器中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:响应于检测到的第一手势操作,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;通过显示屏展示第一类型的虚拟键盘。
本申请实施例第九方面中,电子设备还可以用于实现第八方面各种可能实现方式中电子设备执行的步骤,对于本申请实施例第九方面以及第九方面的各种可能实现方式中某些步骤的具体实现方式,以及每种可能实现方式所带来的有益效果,均可以参考第八方面中各种可能的实现方式中的描述,此处不再一一赘述。
第十方面,本申请实施例提供了一种计算机程序,当其在计算机上运行时,使得计算机执行上述第八方面所述的虚拟键盘的处理方法。
第十一方面,本申请实施例提供了一种电子设备,包括处理器,所述处理器与所述存储器耦合;所述存储器,用于存储程序;所述处理器,用于执行所述存储器中的程序,使得所述电子设备执行如上述第八方面所述的虚拟键盘的处理方法。
第十二方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行上述第八方面所述的虚拟键盘的处理方法。
第十三方面,本申请实施例提供了一种芯片系统,该芯片系统包括处理器,用于支持实现上述方面中所涉及的功能,例如,发送或处理上述方法中所涉及的数据和/或信息。在一种可能的设计中,所述芯片系统还包括存储器,所述存储器,用于保存服务器或通信设备必要的程序指令和数据。该芯片系统,可以由芯片构成,也可以包括芯片和其他分立器件。
第十四方面,本申请实施例提供一种应用界面的处理方法,可用于人机交互领域中。方法应用于电子设备,电子设备包括第一显示屏和第二显示屏,方法包括:电子设备通过第一显示屏展示第一应用界面;电子设备响应于检测到的第一操作,将与第一应用界面对应的模式类型转变为手写输入;响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面,以通过第二显示屏获取针对第一应用界面的手写内容。具体的,电子设备上运行有操作系统,电子设备可以通过调用操作系统中的move to函数的方式,或者,电子设备也可以通过调用操作系统中的Set Window Position函数的方式,或者,电子设备还可以通过调用操作系统中的Set Window Placement函数的方式,以实现在第二显示屏上展示第一应用界面。
本实现方式中,电子设备在第一显示屏上展示第一应用界面,在检测到与第一应用界面对应的模式类型为手写输入的情况下,就会触发在第二显示屏上展示第一应用界面,进而直接通过第二显示屏展示的第一应用界面进行输入;通过前述方式,若用户将第二显示屏放置于便于书写的方向上,用户不需要执行任何操作,电子设备就能够自动的将需要进行书写输入的应用界面显示于方便书写的第二显示屏上,既提高了整个输入过程的效率,也避免了冗余步骤,操作简单,有利于提高用户粘度。
在第十四方面的一种可能实现方式中，电子设备响应于手写输入的输入模式，触发在第二显示屏上展示第一应用界面之后，方法还包括：电子设备在检测到与第一应用界面对应的模式类型转变为键盘输入的情况下，响应于键盘输入的输入模式，触发在第一显示屏上展示第一应用界面，并在第二显示屏上展示虚拟键盘，以通过第二显示屏上的虚拟键盘获取针对第一应用界面的输入内容。或者，电子设备在检测到与第一应用界面对应的模式类型转变为键盘输入的情况下，响应于键盘输入的输入模式，触发在第一显示屏上展示第一应用界面，并在第二显示屏上展示虚拟键盘和应用控制栏。
本实现方式中,在展示应用界面的过程中,不仅能在应用界面从其他模式类型转变为手写输入时,自动调整应用界面在电子设备的不同显示屏上的布局,且能在应用界面的模式类型转变为键盘输入时,也能够自动调整应用界面在不同显示屏上的布局,并自动展示出虚拟键盘,从而当应用界面的模式类型转变为键盘输入时,用户也无需再手动调整应用界面在不同显示屏上的布局,而是直接可以进行键盘输入,步骤简洁,进一步提高了本方案的用户粘度。
在第十四方面的一种可能实现方式中,方法还可以包括:电子设备检测到作用于第二显示屏的第二操作;响应于第二操作将应用控制栏的第一显示面积改变为第二显示面积,并将应用控制栏包括的第一控制键组改变为第二控制键组,第一控制键组和第二控制键组均为对应于目标应用的控制键集合。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均会在后续第二十方面中进行描述,此处暂不进行详细描述。
在第十四方面的一种可能实现方式中,第一应用界面包括第一控制键,方法还可以包括:电子设备检测到对于第一目标应用界面的第二操作;响应于第二操作,在应用控制栏中显示第一控制键,并隐藏第一应用界面中的第一控制键。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均会在后续第二十一方面中进行描述,此处暂不进行详细描述。
在第十四方面的一种可能实现方式中,在第二显示屏上展示虚拟键盘,包括:在第二显示屏上展示第二类型的虚拟键盘;方法还包括:电子设备检测到作用于第二显示屏的第一手势操作,响应于第一手势操作,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;通过第二显示屏展示第一类型的虚拟键盘,第一类型的虚拟键盘和第二类型的虚拟键盘为多个类型的虚拟键盘中的不同类型的虚拟键盘。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均可参阅上述第八方面的描述,本申请实施例第十四方面中,电子设备还可以执行第八方面的各种可能的实现方式中电子设备执行的步骤,对于本申请实施例第十四方面以及第十四方面的各种可能实现方式中某些步骤的具体实现方式,以及每种可能实现方式所带来的有益效果,均可以参考第八方面中各种可能的实现方式中的描述,此处不再一一赘述。
在第十四方面的一种可能实现方式中，电子设备响应于手写输入的输入模式，触发在第二显示屏上展示第一应用界面之后，方法还包括：电子设备在检测到与第一应用界面对应的模式类型转变为浏览模式的情况下，响应于浏览模式，触发在第一显示屏上展示第一应用界面，且停止在第二显示屏上展示第一应用界面。本实现方式中，在应用界面的模式类型转变为浏览模式时，也能够自动调整应用界面在不同显示屏上的布局，从而当应用界面的模式类型转变为浏览模式时，用户也无需再手动调整应用界面在不同显示屏上的布局，也即在多种不同的应用场景下，均可以实现操作步骤的简化，进一步提高了本方案的用户粘度。
在第十四方面的一种可能实现方式中,电子设备检测到第一操作包括以下五项中任一项或多项的组合:电子设备在检测到电子笔的握持姿势满足第一预设条件的情况下,确定检测到第一操作,握持姿势包括以下中任一项或多项的组合:握持位置、握持力度、握持角度,第一预设条件包括以下中的任一项或多项的组合:握持位置位于第一位置范围内、握持力度位于第一力度范围内、握持角度位于第一角度范围内;或者,电子设备通过第一图标获取到针对手写输入的触发指令,第一图标展示于第一应用界面上;或者,电子设备在检测到预设点击操作或预设轨迹操作的情况下,确定检测到第一操作,该预设点击操作可以为单击操作、双击操作、三击操作或长按操作,预设轨迹操作可以为“Z”字型的轨迹操作、下滑操作、“对勾”形的轨迹操作或“圆圈”形的轨迹操作;或者,在检测到电子笔位于第二显示屏的预设范围内的情况下,确定检测到第一操作;或者,在检测到电子笔由第一预设状态转变为第二预设状态的情况下,确定检测到第一操作,其中,电子笔由第一预设状态转变为第二预设状态可以为电子笔由静止状态转变为移动状态、电子笔由未被握持状态转变为被握持状态等。
本实现方式中,提供了与第一应用界面对应的模式类型的多种判断方式,提高了本方案的实现灵活性,也扩展了本方案的应用场景;进一步地,根据电子笔的握持姿势来确定与第一应用界面对应的模式类型,用户无需执行其他操作就可以实现对第一应用界面的模式类型的转变,且根据用户对电子笔的握持姿势,来确定与第一应用界面对应的模式类型,能够降低与第一应用界面对应的模式类型的判断过程的错误率,以降低对第一应用界面进行错误放置的概率,既避免对计算机资源的浪费,又有利于提高用户粘度。
在第十四方面的一种可能实现方式中,第一操作为通过第二显示屏获取第一方向的滑动操作,第一方向的滑动操作为从第二显示屏的上边沿向第二显示屏的下边沿滑动的滑动操作,第二显示屏的上边沿与第一显示屏之间的距离比第二显示屏的下边沿与第一显示屏之间的距离近。具体的,电子设备通过第二显示屏获取第一方向的滑动操作,响应于第一方向的滑动操作,第二显示屏上展示的虚拟键盘沿第一方向向第二显示屏的下边沿移动,在虚拟键盘的上边沿抵达第二显示屏的下边沿时,确认与第一应用界面对应的模式类型转变为手写输入。本实现方式中,显示于第二显示屏上的虚拟键盘能够伴随用户的向下滑动操作,并在虚拟键盘的上边沿抵达第二显示屏的下边沿时,电子设备确认与第一应用界面对应的模式类型转变为手写输入,增加了键盘输入至手写输入过程的趣味性,有利于提高用户粘度。
在第十四方面的一种可能实现方式中，电子设备触发在第二显示屏上展示第一应用界面之后，方法还包括：电子设备获取针对第二应用界面的启动操作，并基于启动操作，确定与第二应用界面对应的模式类型，第二应用界面与第一应用界面为不同的应用界面；在与第二应用界面对应的模式类型为手写输入的情况下，电子设备响应于手写输入的输入模式，触发在第二显示屏上展示第二应用界面；或者，在与第二应用界面对应的模式类型为键盘输入的情况下，电子设备响应于键盘输入的输入模式，触发在第一显示屏上展示第二应用界面，并在第二显示屏上展示虚拟键盘；或者，在与第二应用界面对应的模式类型为浏览模式的情况下，电子设备响应于浏览模式，触发在第一显示屏上展示第二应用界面。
本实现方式中,不仅在用户使用应用界面的过程中,能够自动检测与应用界面对应的模式类型,进而根据与应用界面对应的模式类型对应用界面的展示位置进行调整,而且在打开应用界面时,也可以基于启动操作,确定与应用界面对应的模式类型,进而决定应用界面的展示位置,以方便用户在对应用界面执行启动操作后可以直接使用,而无需再对应用界面做位置移动操作,进一步提高了本方案的便利性,增加了本方案的用户粘度。
在第十四方面的一种可能实现方式中,电子设备基于启动操作,确定与第二应用界面对应的模式类型,包括:在启动操作为通过第一显示屏获取到的情况下,电子设备确定与第二应用界面对应的模式类型为键盘输入或浏览模式;在启动操作为通过第二显示屏获取到的情况下,电子设备确定与第二应用界面对应的模式类型为手写输入。
在第十四方面的一种可能实现方式中,电子设备基于启动操作,确定与第二应用界面对应的模式类型,包括:在启动操作为通过电子笔获取到的情况下,电子设备确定与第二应用界面对应的模式类型为手写输入;在启动操作为通过鼠标或手指获取到的情况下,电子设备确定与第二应用界面对应的模式类型为键盘输入或浏览模式。
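上述两种“基于启动操作的来源确定应用界面模式类型”的实现方式，可以用如下示意代码概括。来源的取值名称为假设，仅用于说明判断逻辑。

```python
def mode_for_launch(source):
    """根据启动操作的来源确定与应用界面对应的模式类型（示意）。
    第二显示屏或电子笔触发 -> 手写输入；
    第一显示屏、鼠标或手指触发 -> 键盘输入或浏览模式。"""
    if source in ("second_screen", "stylus"):
        return "手写输入"
    if source in ("first_screen", "mouse", "finger"):
        return "键盘输入或浏览模式"
    raise ValueError("unknown launch source: %r" % (source,))
```

据此即可在启动应用界面时直接决定其展示位置，免去用户手动移动界面。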
第十五方面，本申请实施例提供了一种电子设备，可用于人机交互领域中。所述电子设备包括第一显示屏、第二显示屏、存储器、一个或多个处理器以及一个或多个程序；所述一个或多个程序被存储在所述存储器中，所述一个或多个处理器在执行所述一个或多个程序时，使得所述电子设备执行以下步骤：通过所述第一显示屏展示第一应用界面；响应于检测到的第一操作，将与所述第一应用界面对应的模式类型转变为手写输入；响应于所述手写输入的输入模式，触发在所述第二显示屏上展示所述第一应用界面，以通过所述第二显示屏获取针对所述第一应用界面的手写内容。
本申请实施例第十五方面中,电子设备还可以用于实现第十四方面各种可能实现方式中电子设备执行的步骤,对于本申请实施例第十五方面以及第十五方面的各种可能实现方式中某些步骤的具体实现方式,以及每种可能实现方式所带来的有益效果,均可以参考第十四方面中各种可能的实现方式中的描述,此处不再一一赘述。
第十六方面,本申请实施例提供了一种计算机程序,当其在计算机上运行时,使得计算机执行上述第十四方面所述的应用界面的处理方法。
第十七方面,本申请实施例提供了一种电子设备,包括处理器,所述处理器与所述存储器耦合;所述存储器,用于存储程序;所述处理器,用于执行所述存储器中的程序,使得所述电子设备执行如上述第十四方面所述的应用界面的处理方法。
第十八方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行上述第十四方面所述的应用界面的处理方法。
第十九方面，本申请实施例提供了一种芯片系统，该芯片系统包括处理器，用于支持实现上述方面中所涉及的功能，例如，发送或处理上述方法中所涉及的数据和/或信息。在一种可能的设计中，所述芯片系统还包括存储器，所述存储器，用于保存服务器或通信设备必要的程序指令和数据。该芯片系统，可以由芯片构成，也可以包括芯片和其他分立器件。
本发明实施例的第二十方面,提供一种屏幕显示方法,应用于包括第一显示屏和第二显示屏的电子设备,屏幕显示方法包括:
在第一显示屏上显示目标应用的界面;
在第二显示屏上显示应用控制栏;
响应于接收到的第一操作,将应用控制栏的第一显示面积改变为第二显示面积;
当应用控制栏的显示面积为第一显示面积,应用控制栏包括第一控制键组;
当应用控制栏的显示面积为第二显示面积,应用控制栏包括第二控制键组;
第一控制键组和第二控制键组均为用于控制目标应用的控制键集合,第一控制键组和第二控制键组包含的控制键不完全相同。
其中,电子设备可以是具有两个连接在一起(例如,通过轴连等方式连接)的显示屏的电子设备,其中,两个显示屏可以是两个独立的显示屏,也可以是由一块柔性折叠屏或者曲面屏划分而成的,可用于执行不同功能的两个显示屏。电子设备可以是作为一个整体独立工作的电子设备,例如,个人笔记本等,也可以是由两个可以独立工作的电子设备相互连接,共同工作而形成的电子设备,例如,由两个手机或两个平板电脑拼接而成的双屏电子设备。
其中，第一操作可以是直接作用于应用控制栏的操作，例如，第一操作可以是通过触屏手势改变应用控制栏的显示面积；或者，第一操作可以是通过点击（手指点击或鼠标点击等）应用控制栏的放大、缩小按钮来改变应用控制栏的显示面积；或者，第一操作可以是通过鼠标拖动应用控制栏的边界来改变应用控制栏的显示面积。第一操作也可以是间接作用于应用控制栏的操作，例如，第一操作可以通过上述三种方式直接作用于控件区，通过改变控件区的显示面积，改变应用控制栏的显示面积；或者，第一操作可以通过上述三种方式直接作用于第二显示屏中的其他应用显示界面或者输入模块（虚拟键盘或手写输入区域等），通过改变第二显示屏上其他显示模块的显示面积，改变应用控制栏的显示面积；或者，第一操作可以是用户对于第一显示屏中的目标应用的操作，例如，当用户的第一操作对应的在应用控制栏中显示的控制键的数量不同于第一操作之前在应用控制栏中显示的控制键的数量时，可以适应性的调整应用控制栏的显示面积，使得第一操作对应的控制键能够得到更好的显示。
根据用户的操作和/或需求,灵活改变应用控制栏的显示面积和控制键,使得应用控制栏可以根据用户的操作或需要进行灵活调整,始终在应用控制栏中显示与用户当前操作相关联的控制键,为用户提供更加便捷的输入操作,提升用户体验。
结合第二十方面,在第二十方面第一种可能的实现方式中:
在应用控制栏的第一显示面积改变为第二显示面积前,第二显示屏上显示有虚拟键盘;
将应用控制栏的第一显示面积改变为第二显示面积后,虚拟键盘的显示布局发生改变。
具体的,当第二显示面积大于第一显示面积时,虚拟键盘的显示面积相应缩小,虚拟键盘中按键的布局也随显示面积的改变而改变,例如,可以缩小全部或部分按键或者减少部分按键或者压缩按键之间的间隔等。当第二显示面积小于第一显示面积时,虚拟键盘的显示面积相应增大,虚拟键盘中按键的布局也随显示面积的改变而改变,例如,可以增大全部或部分按键或者增加部分按键或者增大按键之间的间隔等,还可以在虚拟键盘的基础上增加触控板等其他功能模块。
由于应用控制栏通常与第二显示屏幕中其他显示模块(应用或输入模块等)同时显示在第二显示屏上,因此,当应用控制栏的显示面积发生改变时,适应性的调整其他显示模块的显示布局,能够使得第二显示屏中不存在无显示的部分,也不存在显示折叠的部分,优化第二显示屏上的显示布局,提升用户体验。
结合第二十方面或第二十方面第一种可能的实现方式,在第二十方面第二种可能的实现方式中:
在应用控制栏的第一显示面积改变为第二显示面积前,目标应用的界面中包括第三控制键组;
第二显示面积大于第一显示面积;
第二控制键组包括第一控制键组和第三控制键组;
应用控制栏的第一显示面积改变为第二显示面积后,目标应用的界面中包括第三控制键组。
当用户需要应用控制栏中显示更多控制键,或者用户的当前操作所对应的控制键的数量较多时,增大应用控件区的显示面积,增多应用控件区中显示的控制键,能够为用户提供更多的控制键,为用户提供更加便捷的输入方式。另外,当增大应用控制栏的显示面积,将第一显示屏中的控制键转移到第二显示屏中的应用控制栏中进行显示,能够节约第一显示屏的显示空间,使得第一显示屏的显示内容更加简洁、清爽。另外,在移除应用控制栏中显示的控制键在第一显示屏中的显示后,可以增大第一显示屏中原本显示的内容的大小,或者在原有显示内容的基础上增加新的显示内容,为用户提供更加便捷的操作,提升用户体验。
结合第二十方面第二种可能的实现方式,在第二十方面第三种可能的实现方式中:
根据第二显示面积和目标应用的待显示控制键集合中的待显示控制键的优先级顺序,确定第三控制键组。
具体的,目标应用的待显示控制键集合可以是应用程序提供的,可以显示在应用控制栏中进行显示的待显示控制键的集合。集合中待显示的控制键的优先级顺序可以是由应用程序规定的,也可以是由操作系统根据待显示的控制键的功能或用户的使用频率等因素自行确定的。
由应用程序提供待显示的控制键集合,由应用程序或操作系统规定待显示的控制键的优先级顺序,由操作系统确定当应用控制栏的显示面积增大时,在应用控制栏中增加的控制键,能够在应用控制栏的各种显示面积下,确定应用控制栏中显示的控制键,使得应用控制栏的设置更加灵活,可以支持用户多样的操作方式和需求。
当增大应用控制栏的显示面积时，按照待显示的控制键的优先级顺序，确定在应用控件区中增加的显示键，能够在应用控制栏的显示面积有限的情况下，优先将优先级较高（较为重要，或者用户的使用频率较高）的控制键显示在应用控制栏中，能够为用户提供更加便捷的操作，提升用户体验。
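上述“根据显示面积与待显示控制键的优先级顺序确定应用控制栏中显示的控制键”的过程，可以用如下示意代码说明。控制键名称、优先级数值与按单个控制键面积估算容量的方式均为假设。

```python
def visible_control_keys(candidates, display_area, key_area=1.0):
    """按优先级顺序选取能放入应用控制栏当前显示面积的待显示控制键（示意）。
    candidates: [(控制键名, 优先级), ...]，数值越小优先级越高；
    key_area: 假设的单个控制键所占显示面积。"""
    capacity = int(display_area // key_area)          # 当前面积可容纳的控制键数量
    ordered = sorted(candidates, key=lambda item: item[1])  # 按优先级升序排列
    return [name for name, _ in ordered[:capacity]]
```

显示面积增大时 capacity 增大，优先级较低的控制键随之补入；显示面积减小时则反向移除。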
结合第二十方面第二种可能的实现方式或第二十方面第三种可能的实现方式,在第二十方面第四种可能的实现方式中:
将第三控制键组显示在相对于第一控制键组而言,更靠近第一显示屏幕的位置。
在上述设置下,第一控制键组显示在相对于第三控制键组而言,更靠近用户双手的位置。即,每次扩大应用控制栏时,始终将新增加的控制键显示在靠近第一显示屏的位置,将应用控制栏中原本显示的控制键显示在更靠近用户双手的位置,方便用户操作。由于在扩大应用控制栏的显示面积的过程中,新增加的控制键的优先级往往低于在应用控制栏中原本显示的控制键,因此,可以始终把优先级较高的控制键显示在更靠近用户双手的位置,为用户提供更加便捷的操作提升用户体验。
结合第二十方面或第二十方面第一种可能的实现方式,在第二十方面第五种可能的实现方式中:
应用控制栏的第一显示面积改变为第二显示面积前,目标应用的界面中不包括第四控制键组,第四控制键组为用于控制目标应用的控制键集合;
第二显示面积小于第一显示面积;
第二控制键组为在第一控制键组中减少第四控制键组;
应用控制栏的第一显示面积改变为第二显示面积后,目标应用的界面中包括部分或全部第四控制键组。
当用户想要在应用控制栏中显示较少控制键，或者用户当前操作所对应的控制键的数量较少，或者用户需要放大第二显示屏上其他显示模块的显示面积时，减小应用控件区的显示面积，减少应用控件区中显示的控制键，能够节约第二显示屏上的显示空间，能够减少对用户视觉的干扰，方便用户快速定位到需要的控制键。另外，在应用控制栏中减少第四控制键组后，将第四控制键组中的部分或全部控制键组显示在第一显示屏上，使得用户在需要用到这部分控制键时，仍可以通过第一显示屏进行操作，弥补了缩小应用控制栏的情况下对用户操作的影响，提升用户体验。
结合第二十方面第五种可能的实现方式,在第二十方面第六种可能的实现方式中:
根据第二显示面积和第一控制键组中控制键的优先级顺序或所述第一控制键组中控制键的位置关系,确定第四控制键组。
其中,第一控制键组中控制键的优先级顺序可以由应用程序规定,也可以由系统规定。根据控制键的优先级顺序,确定在减小应用控制栏的显示面积时,移除应用控制栏中的哪些控制键,能够将优先级较高的控制键保留在应用控制栏中,提升用户的操作体验。当用户执行缩小应用控制栏的显示面积的操作,是为了隐藏应用控制栏特定区域的显示,例如,通过拖动操作隐藏部分显示内容,可以根据控制键的位置确定隐藏哪些控制键,达到用户的操作目的。
结合第二十方面,或第二十方面前两种可能的实现方式中的任一种,或第二十方面第五种可能的实现方式,在第二十方面第七种可能的实现方式中:
第二控制键组为与应用控制栏的第二显示面积对应的控制键组。
其中,与应用控制栏的第二显示面积对应的控制键组可以是由应用程序提供的。例如,应用程序可以针对应用控制栏的几种固定尺寸的显示面积,定义相对应的控制键组,当应用控制栏的显示面积与某个固定尺寸的显示面积对应时,在应用控制栏中显示该显示面积对应的控制键组;或者,应用程序也可以针对应用控制栏的显示面积的几个尺寸变化范围,分别定义相对应的控制键组,当应用控制栏的实际显示面积落在某一个尺寸范围中时,将该尺寸范围对应的控制键组显示在应用控制栏中。
采用上述方式确定应用控制栏中显示的控制键,能够极大的减少操作系统的计算量,缩短执行第一操作时操作系统的反应时间,提高操作效率。
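上述“针对应用控制栏显示面积的几个尺寸变化范围分别定义对应控制键组”的查找方式，可以用如下示意代码说明。范围的划分数值与控制键组名称均为假设。

```python
# 示意性草图：显示面积的尺寸范围 -> 对应的控制键组（数值与名称均为假设）
AREA_RANGES = [
    (0.0, 100.0, ["系统控制键组"]),
    (100.0, 200.0, ["系统控制键组", "应用控制键组(部分)"]),
    (200.0, float("inf"), ["系统控制键组", "应用控制键组(全部)"]),
]

def control_key_groups(display_area):
    """根据应用控制栏实际显示面积所落入的尺寸范围，返回应显示的控制键组。"""
    for lo, hi, groups in AREA_RANGES:
        if lo <= display_area < hi:
            return groups
    return []
```

由于只需一次范围查找，无需逐个控制键计算布局，这种方式可显著减少操作系统的计算量。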
结合第二十方面,或第二十方面前七种可能的实现方式中的任一种,在第二十方面第八种可能的实现方式中:
第一操作为手势操作;
响应于接收到的第一操作,将应用控制栏的第一显示面积改变为第二显示面积,具体为:
响应于手势操作,从多个类型的虚拟键盘中选取与手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;
通过第二显示屏显示第一类型的虚拟键盘;
根据第一类型的虚拟键盘的显示区域确定第二显示面积。
第二显示屏中可能会同时显示应用控制栏和虚拟键盘等输入方式,不同手势可以开启不同的手势虚拟键盘,当通过手势开启手势虚拟键盘时,可根据手势虚拟键盘的显示区域(显示面积和显示位置等)确定应用控制栏的显示面积和/或显示区域,使得应用控制栏能够灵活的适配手势虚拟键盘,使得第二显示屏上的显示更加合理、整洁、美观,提升用户体验。
结合第二十方面,或第二十方面前八种可能的实现方式中的任一种,在第二十方面第九种可能的实现方式中:
响应于接收到的第二操作,在第二显示屏上显示目标应用的界面,以通过第二显示屏获取针对目标应用的界面的手写内容,第二操作指示开启目标应用的手写输入模式;
第二显示屏上显示目标应用的界面后,第二显示屏不包括应用控制栏。
当检测到用户通过第二操作开启手写输入方式时，可在第二显示屏上显示目标应用的界面，以通过第二显示屏获取针对目标应用的界面的手写内容。此时，由于已经将目标应用的界面全部复制在第二显示屏中，可隐藏第二显示屏中的应用控制栏，节约第二显示屏上的显示空间，使得第二显示屏上的显示内容更加简洁、清爽，避免应用控制栏对用户手写输入的干扰。
结合第二十方面,或第二十方面前九种可能的实现方式中的任一种,在第二十方面第十种可能的实现方式中:
第一操作用于将输入方式切换为手写输入方式;
响应于接收到的第一操作,在应用控制栏中显示手写输入区域和/或与手写输入方式相关联的控制键组;
当用户将输入方式切换为手写输入方式时，可以在应用控制栏中显示手写输入区域，能够使得用户更加便捷的通过应用控制栏执行手写输入操作，提高操作效率。也可以在应用控制栏中显示与手写输入方式相关联的控制键组，例如：笔、橡皮擦、颜色、字体等，使得用户可以通过应用控制栏对手写输入方式进行操作，为用户提供更加便捷的操作。还可以在应用控制栏中同时显示手写输入区域和与手写输入方式相关联的控制键组，并达到上述有益效果。
结合第二十方面,或第二十方面前十种可能的实现方式中的任一种,在第二十方面第十一种可能的实现方式中,所述方法还包括:
获取作用于应用控制栏的接触操作;
响应于接触操作,获取与接触操作对应的第一控制键,所述第一控制键位于应用控制栏;
从多个振动反馈元件中获取与第一控制键匹配的至少一个第一振动反馈元件;
指示至少一个第一振动反馈元件发出振动波,以执行第一反馈操作,第一反馈操作用于提示第一控制键为应用控制栏的按键。
显示于第二显示屏中的控件区可以包括系统控制栏和应用控制栏,当用户接触到应用控制栏中的控制键时,提供反馈操作,能够使得用户在不将视线移动到第二显示屏上,就能够在控件区中定位应用控制栏的位置和应用控制栏中的控制键,帮助用户在应用控制栏的显示面积和控制键变化的过程中快速定位想要定位的控制键,极大的提升操作效率。反过来,根据用户的使用习惯,也可以对控件区中系统控制栏中的控制键设置反馈操作,使得用户在不将视线移动到第二显示屏上,就能够在控件区中定位系统控制栏的位置和系统控制栏中的控制键,极大的提升操作效率。另外,还可以在应用控制栏中功能较重要、或者用户的使用频率较高的控制键设置反馈操作,帮助用户在应用控制栏的显示面积和控制键变化的过程中快速定位想要定位的控制键,极大的提升操作效率。
结合第二十方面,或第二十方面前十一种可能的实现方式中的任一种,在第二十方面第十二种可能的实现方式中,应用控制栏可通过以下任一种方式关闭:
基于接收到的关闭虚拟键盘的指令,关闭应用控制栏;
基于虚拟键盘的按键指令,关闭应用控制栏;
基于手势指令,关闭应用控制栏;以及
基于接收到的关闭应用的全屏模式的指令,关闭应用控制栏。
结合第二十方面,或第二十方面前十二种可能的实现方式中的任一种,在第二十方面第十三种可能的实现方式中,应用控制栏可通过以下任一种方式开启:
基于接收到的激活虚拟键盘的指令,激活应用控制栏;
基于虚拟键盘的按键指令,激活应用控制栏;
基于手势指令,激活应用控制栏;以及
基于接收到的开启应用的全屏模式的指令,激活应用控制栏。
上述开启和关闭应用控制栏的方式仅是示例性的,上述设计使得用户在第二显示屏中显示任意内容的情况下,通过灵活的方式激活或关闭应用控件区,为用户提供更加便捷的操作,提升用户体验。
本发明实施例第二十一方面,提供一种屏幕显示方法,屏幕显示方法应用于包括第一显示屏和第二显示屏的电子设备,屏幕显示方法包括:
在第一显示屏上显示目标应用的界面,目标应用的界面包括第五控制键组;
在第二显示屏上显示应用控制栏;
响应于对于目标应用的界面的第三操作,在应用控制栏中显示第五控制键组,以及隐藏所述目标应用的界面中的所述第五控制键组。
根据用户对于目标应用界面的操作,将用户操作对应的控制键显示在应用控制栏中,可以在应用控制栏中始终显示对应于用户当前操作的快捷操作控制键,为用户提供更加便捷的操作,提高用户的操作效率。另外,当将控制键显示在应用控制栏中之后,移除这部分控制键在第一显示屏中的显示,可以节约第一显示屏的显示面积,使得第一显示屏的显示内容更加简洁、清爽。另外,在移除应用控制栏中显示的控制键在第一显示屏中的显示后,可以增大第一显示屏中原本显示的内容的大小,或者在原有显示内容的基础上增加新的显示内容,为用户提供更加便捷的操作,提升用户体验。
结合第二十一方面,在第二十一方面第一种可能的实现方式中,屏幕显示方法还包括:
相应于对于目标应用的界面的第三操作,改变应用控制栏的显示面积。
在应用控制栏中显示第五控制键组后,可能导致应用控制栏中控制键的数量发生变化,此时,可以适应性的调整应用控制栏的显示面积,优化应用控制栏中控制键的显示,使得应用控制栏中控制键的显示更符合用户的使用习惯,提升用户体验。
结合第二十一方面或第二十一方面第一种可能的实现方式,在第二十一方面第二种可能的实现方式中,应用控制栏中显示第五控制键组前,屏幕显示方法还包括:
应用控制栏包括第六控制键组,第六控制键组为用于控制目标应用的初始控制键的集合。
当用户开启目标应用时,可以在应用控制栏中显示用于控制目标应用的初始控制键组,当用户执行对于目标应用的操作时,可以在初始控制键组的基础上增加对应于用户当前操作的第五控制键组,也可以将初始控制键组部分或全部替换为对应于用户当前操作的第五控制键组,能够使得应用控制栏中始终显示与用户的当前操作最相关的控制键,为用户提供更加便捷的操作,提升用户的操作效率。另外,当在应用控制栏中显示初始控制键组后,可以移除目标应用的界面中的这部分控制键,以节约第一显示屏的显示面积,使得第一显示屏的显示内容更加简洁、清爽。
结合第二十一方面或第二十一方面第一种可能的实现方式,在第二十一方面第三种可能的实现方式中,应用控制栏中显示第五控制键组后,屏幕显示方法还包括:
响应于对于目标应用的界面的第四操作,在应用控制栏中显示第七控制键组,以及隐藏所述目标应用的界面中的所述第七控制键组。
具体的,当用户在对目标应用的界面执行了第三操作的基础上,可以继续对同一目标应用的界面执行第四操作。当用户对目标应用执行第四操作时,可以在应用控制栏中的第五控制键组的基础上,增加对应于用户的第四操作的第七控制键组,也可以将应用控制栏中的第五控制键组中的部分或者全部控制键替换为对应于用户的第四操作的第七控制键组。另外,第四操作也可以是将一个新的目标应用显示在第一显示屏上,第四操作可以通过开启一个新的目标应用实现,也可以是该目标应用原本在后台运行,通过第四操作将其界面显示在第一显示屏上。当将新的目标应用的界面显示在第一显示屏上时,可以隐藏第一显示屏中原目标应用的界面,此时,可以用第七控制键组替换第五控制键组。或者,可以将两个目标应用的界面同时显示在第一显示屏中(例如,双屏显示),此时,可以在第七控制键组的基础上增加第五控制键组,即,将第七控制键组和第五控制键组共同显示在应用控制栏中。
根据用户操作的变化,灵活改变应用控制栏中的控制键,可以始终在应用控制栏中显示与用户操作关联性最强的控制键,为用户提供更加便捷的操作,提高操作效率。
结合第二十一方面,或第二十一方面前三种可能的实现方式中的任一种,在第二十一方面第四种可能的实现方式中:
第三操作为选择目标应用的界面中的目标对象;
第五控制键组为用于操作目标对象的控制键。
其中，第三操作可以为选定目标应用的界面中的目标对象，例如，目标对象的底纹加深以表示其被选中，或者，第三操作可以为通过将光标移动到目标对象上来选择目标对象。具体的，第三操作选择的目标对象可以是图片或者文字，第五控制键组可以包括与文字或图片编辑相关的控制键，方便用户通过应用控制栏中的控制键对文字或图片进行编辑。第三操作选择的目标对象可以是音视频，第五控制键组可以包括与音视频控制相关的控制键组，方便用户通过应用控制栏中的控制键对音视频进行控制。
结合第二十一方面,或第二十一方面前三种可能的实现方式中的任一种,在第二十一方面第五种可能的实现方式中:
第三操作为将光标移动到目标应用的界面中的目标位置;
第五控制键组为在目标位置单击鼠标右键时显示的菜单栏中的控制键。
当用户将光标移动到目标应用的界面的目标位置时，将在光标所在处单击鼠标右键时显示的菜单栏中的控制键显示在应用控制栏中，右键菜单栏中的控制键是根据用户意图设计的，大概率能够满足用户当前的操作需求，且直接采用右键菜单栏中的控制键，可以避免开发者的二次开发，缩短开发周期。
结合第二十一方面,或第二十一方面前三种可能的实现方式中的任一种,在第二十一方面第六种可能的实现方式中:
第三操作为通过滑动手势或滚动鼠标滚轮浏览目标应用的界面的目标区域中的内容;
第五控制键组为目标区域的缩略图及在缩略图中快速定位目标对象的定位框。
通过上述设置,使得用户可以通过应用控制栏中目标区域的缩略图及在缩略图中快速定位目标对象的定位框,快速定位用户所需的目标内容,提高用户的操作效率。
本发明实施例的第二十二方面,提供一种电子设备,包括:
第一显示屏,第二显示屏,存储器,一个或多个处理器,以及一个或多个程序;其中一个或多个程序被存储在存储器中;其特征在于,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
在第一显示屏上显示目标应用的界面;
在第二显示屏上显示应用控制栏;
响应于接收到的第一操作,将应用控制栏的第一显示面积改变为第二显示面积;
当应用控制栏的显示面积为第一显示面积,应用控制栏包括第一控制键组;
当应用控制栏的显示面积为第二显示面积,应用控制栏包括第二控制键组;
第一控制键组和第二控制键组均为用于控制目标应用的控制键集合,第一控制键组和第二控制键组包含的控制键不完全相同。
结合第二十二方面,在第二十二方面第一种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
在应用控制栏的第一显示面积改变为第二显示面积前,第二显示屏上显示有虚拟键盘;
将应用控制栏的第一显示面积改变为第二显示面积后,虚拟键盘的显示布局发生改变。
结合第二十二方面或第二十二方面第一种可能的实现方式,在第二十二方面第二种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
应用控制栏的第一显示面积改变为第二显示面积前,目标应用的界面中包括第三控制键组;
第二显示面积大于第一显示面积;
第二控制键组包括第一控制键组和第三控制键组;
应用控制栏的第一显示面积改变为第二显示面积后,目标应用的界面中包括第三控制键组。
结合第二十二方面第二种可能的实现方式,在第二十二方面第三种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
根据第二显示面积和目标应用的待显示控制键集合中的待显示控制键的优先级顺序,确定第三控制键组。
结合第二十二方面或第二十二方面第一种可能的实现方式,在第二十二方面第四种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
应用控制栏的第一显示面积改变为第二显示面积前,目标应用的界面中不包括第四控制键组,第四控制键组为用于控制目标应用的控制键集合;
第二显示面积小于第一显示面积;
第二控制键组为在第一控制键组中减少第四控制键组;
应用控制栏的第一显示面积改变为第二显示面积后,目标应用的界面中包括部分或全部第四控制键组。
结合第二十二方面第四种可能的实现方式,在第二十二方面第五种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
根据第二显示面积和第一控制键组中控制键的优先级顺序或第一控制键组中控制键的位置关系,确定第四控制键组。
结合第二十二方面,或第二十二方面前五种可能的实现方式中的任一种,在第二十二方面第六种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
第一操作为手势操作;
响应于接收到的第一操作,将应用控制栏的第一显示面积改变为第二显示面积,具体为:
响应于手势操作,从多个类型的虚拟键盘中选取与手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;
通过第二显示屏显示第一类型的虚拟键盘;
根据第一类型的虚拟键盘的显示区域确定第二显示面积。
结合第二十二方面,或第二十二方面前六种可能的实现方式中的任一种,在第二十二方面第七种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
响应于接收到的第二操作,在第二显示屏上显示目标应用的界面,以通过第二显示屏获取针对目标应用的界面的手写内容,第二操作指示开启所述目标应用的手写输入模式;
第二显示屏上显示目标应用的界面后,第二显示屏不包括应用控制栏。
本发明实施例第二十二方面提供的电子设备能够实现本发明实施例第二十方面中描述的各种可能的实现方式,并达到所有有益效果。
本发明实施例的第二十三方面,提供一种电子设备,包括:
第一显示屏,第二显示屏,存储器,一个或多个处理器,以及一个或多个程序;其中一个或多个程序被存储在存储器中;其特征在于,一个或多个处理器在执行一个或多个程序时,使得电子设备执行以下步骤:
在第一显示屏上显示目标应用的界面,目标应用的界面包括第五控制键组;
在第二显示屏上显示应用控制栏;
响应于对于目标应用的界面的第三操作,在应用控制栏中显示第五控制键组,以及隐藏所述目标应用的界面中的所述第五控制键组。
结合第二十三方面,在第二十三方面第一种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备在应用控制栏中显示第五控制键组前,执行以下步骤:
应用控制栏包括第六控制键组,第六控制键组为用于控制目标应用的初始控制键的集合;
目标应用的界面中不包括第六控制键组。
结合第二十三方面或第二十三方面第一种可能的实现方式,在第二十三方面第二种可能的实现方式中,一个或多个处理器在执行一个或多个程序时,使得电子设备在应用控制栏中显示第五控制键组后,执行以下步骤:
响应于对于目标应用的界面的第四操作,在应用控制栏中显示第七控制键组;
目标应用的界面中不包括第七控制键组。
结合第二十三方面或第二十三方面前两种可能的实现方式中的任一种,在第二十三方面第三种可能的实现方式中:
第三操作为选择目标应用的界面中的目标对象;
第五控制键组为用于操作目标对象的控制键。
结合第二十三方面或第二十三方面前两种可能的实现方式中的任一种,在第二十三方面第四种可能的实现方式中:
第三操作为将光标移动到目标应用的界面中的目标位置;
第五控制键组为在目标位置单击鼠标右键时显示的菜单栏中的控制键。
结合第二十三方面或第二十三方面前两种可能的实现方式中的任一种,在第二十三方面第五种可能的实现方式中:
第三操作为通过滑动手势或滚动鼠标滚轮浏览目标应用的界面的目标区域中的内容;
第五控制键组为目标区域的缩略图及在缩略图中快速定位目标对象的定位框。
本发明实施例第二十三方面提供的电子设备能够实现本发明实施例第二十一方面中描述的各种可能的实现方式,并达到所有有益效果。
本发明实施例第二十四方面提供一种计算机存储介质,该计算机存储介质中存储有程序,当其在计算机上运行时,使得计算机实现第二十方面或第二十方面前十一种可能的实现方式中的任意一种所述的屏幕显示方法,或者实现第二十一方面或第二十一方面前六种可能的实现方式中的任意一种所述的屏幕显示方法,并达到上述所有有益效果。
本发明实施例第二十五方面提供一种计算机程序产品,当其在计算机上运行时,使得计算机实现第二十方面或第二十方面前十一种可能的实现方式中的任意一种所述的屏幕显示方法,或者实现第二十一方面或第二十一方面前六种可能的实现方式中的任意一种所述的屏幕显示方法,并达到上述所有有益效果。
附图说明
图1为本申请实施例提供的电子设备的一种结构示意图;
图2为本申请实施例提供的电子设备的另一种结构示意图;
图3为本申请实施例提供的触控屏幕的一种结构示意图;
图4为本申请实施例提供的电子设备中多个振动反馈单元的两种排列示意图;
图5为本申请实施例提供的触控屏幕的一种截面示意图;
图6为本申请实施例提供的振动反馈模块包括的多个振动反馈单元的一种排列布局示意图;
图7为本申请实施例提供的触控屏幕的另一种结构示意图;
图8为本申请实施例提供的触控屏幕的再一种结构示意图;
图9为本申请实施例提供的反馈方法的一种流程示意图;
图10为本申请实施例提供的反馈方法中虚拟键盘的两种示意图;
图11为本申请实施例提供的反馈方法中第一位置区域和第二位置区域的两种示意图;
图12为本申请实施例提供的反馈方法中第一位置区域和第二位置区域的另一种示意图;
图13为本申请实施例提供的反馈方法中第一位置区域和第二位置区域的再一种示意图;
图14为本申请实施例提供的电子设备的一种结构示意图;
图15为本申请实施例提供的电子设备的一种结构示意图;
图16为本申请实施例提供的电子设备的一种示意图;
图17为本申请实施例提供的虚拟键盘的处理方法的一种流程示意图;
图18为本申请实施例提供的虚拟键盘的处理方法中第一手势参数的一种示意图;
图19为本申请实施例提供的虚拟键盘的处理方法中相对角度信息的一种示意图;
图20为本申请实施例提供的虚拟键盘的处理方法中第一区域的两种示意图;
图21为本申请实施例提供的虚拟键盘的处理方法中第一手势操作的一种示意图;
图22为本申请实施例提供的虚拟键盘的处理方法中第一手势操作的另一种示意图;
图23为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图;
图24为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的另一种示意图;
图25为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的又一种示意图;
图26为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的再一种示意图;
图27为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的又一种示意图;
图28为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的再一种示意图;
图29为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的又一种示意图;
图30为本申请实施例提供的虚拟键盘的处理方法中第一设置界面的一种示意图;
图31为本申请实施例提供的虚拟键盘的处理方法中第一设置界面的另一种示意图;
图32为本申请实施例提供的虚拟键盘的处理方法中自定义的手势操作的一种示意图;
图33为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的再一种示意图;
图34为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的又一种示意图;
图35为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的再一种示意图;
图36为本申请实施例提供的虚拟键盘的处理方法中第二虚拟按键的一种示意图;
图37为本申请实施例提供的虚拟键盘的处理方法中第二虚拟按键的另一种示意图;
图38为本申请实施例提供的虚拟键盘的处理方法的另一种流程示意图;
图39为本申请实施例提供的电子设备的一种结构示意图;
图40为本申请实施例提供的电子设备的一种结构示意图;
图41为本申请实施例提供的电子设备的一种结构示意图;
图42为本申请实施例提供的应用界面的处理方法的一种流程示意图;
图43为本申请实施例提供的应用界面的处理方法中第二显示屏的展示界面的一种界面示意图;
图44为本申请实施例提供的应用界面的处理方法中的一种流程示意图;
图45为本申请实施例提供的应用界面的处理方法中的另一种流程示意图;
图46为本申请实施例提供的应用界面的处理方法中各种握持姿势的一种示意图;
图47为本申请实施例提供的应用界面的处理方法中第一应用界面的一种界面示意图;
图48为本申请实施例提供的应用界面的处理方法中第一应用界面的两种界面示意图;
图49为本申请实施例提供的应用界面的处理方法中第一接触操作的一种示意图;
图50为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的一种示意图;
图51为本申请实施例提供的应用界面的处理方法中的一种流程示意图;
图52为本申请实施例提供的应用界面的处理方法中的一种流程示意图;
图53为本申请实施例提供的应用界面的处理方法的一种流程示意图;
图54为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的一种示意图;
图55为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的一种示意图;
图56为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的一种示意图;
图57为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的一种示意图;
图58为本申请实施例提供的电子设备的一种结构示意图;
图59为本申请实施例提供的电子设备的一种结构示意图;
图60是本发明实施例提供的一种双屏电子设备;
图61是本发明实施例提供的一种应用场景;
图62是本发明实施例提供的一种屏幕显示方法;
图63A是本发明实施例提供的一种控件区的显示方式;
图63B是本发明实施例提供的另一种控件区的显示方式;
图63C是本发明实施例提供的另一种控件区的显示方式;
图64A是本发明实施例提供的一种激活控件区的方法;
图64B是本发明实施例提供的另一种激活控件区的方法;
图64C是本发明实施例提供的另一种激活控件区的方法;
图64D是本发明实施例提供的另一种激活控件区的方法;
图65A是本发明实施例提供的一种用户操作与控制键组的对应方式;
图65B是本发明实施例提供的另一种用户操作与控制键组的对应方式;
图65C是本发明实施例提供的另一种用户操作与控制键组的对应方式;
图65D是本发明实施例提供的另一种用户操作与控制键组的对应方式;
图65E是本发明实施例提供的另一种用户操作与控制键组的对应方式;
图65F是本发明实施例提供的另一种用户操作与控制键组的对应方式;
图66A是本发明实施例提供的一种控件区的显示方式;
图66B是本发明实施例提供的另一种控件区的显示方式;
图67是本发明实施例提供的一种控件区显示内容的布局方式;
图68是本发明实施例提供的一种优先权设置方式;
图69是本发明实施例提供的另一种优先权设置方式;
图70A是本发明实施例提供的一种关闭控件区的方法;
图70B是本发明实施例提供的另一种关闭控件区的方法;
图70C是本发明实施例提供的另一种关闭控件区的方法;
图70D是本发明实施例提供的另一种关闭控件区的方法;
图71是本发明实施例提供的另一种屏幕显示方法;
图72是本发明实施例提供的另一种屏幕显示方法;
图73A是本发明实施例提供的一种改变应用控制栏显示面积的方法;
图73B是本发明实施例提供的一种增大应用控制栏显示面积的方法;
图73C是本发明实施例提供的一种增大应用控制栏显示面积的方法;
图74A是本发明实施例提供的另一种改变应用控制栏显示面积的方法;
图74B是本发明实施例提供的另一种增大应用控制栏显示面积的方法;
图74C是本发明实施例提供的另一种增大应用控制栏显示面积的方法;
图75A是本发明实施例提供的另一种改变应用控制栏显示面积的方法;
图75B是本发明实施例提供的另一种增大应用控制栏显示面积的方法;
图75C是本发明实施例提供的另一种增大应用控制栏显示面积的方法;
图76A是本发明实施例提供的一种根据用户操作改变应用控制栏的显示面积及控制键的方法;
图76B是本发明实施例提供的另一种根据用户操作改变应用控制栏的显示面积及控制键的方法;
图77A是本发明实施例提供的一种手势控制方法;
图77B是本发明实施例提供的另一种手势控制方法;
图77C是本发明实施例提供的另一种手势控制方法;
图77D是本发明实施例提供的另一种手势控制方法;
图78A是本发明实施例提供的另一种手势控制方法;
图78B是本发明实施例提供的另一种手势控制方法;
图79是本发明实施例提供的另一种手势控制方法;
图80A是本发明实施例提供的一种屏幕显示方法的具体实现方式;
图80B是本发明实施例提供的另一种屏幕显示方法的具体实现方式;
图80C是本发明实施例提供的另一种屏幕显示方法的具体实现方式;
图80D是本发明实施例提供的另一种屏幕显示方法的具体实现方式;
图80E是本发明实施例提供的另一种屏幕显示方法的具体实现方式;
图80F是本发明实施例提供的另一种屏幕显示方法的具体实现方式;
图80G是本发明实施例提供的另一种屏幕显示方法的具体实现方式;
图81是本发明实施例提供的一种电子设备;
图82是本发明实施例提供的另一种电子设备。
具体实施方式
本申请的说明书和权利要求书及上述附图中的术语“第一”、第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的术语在适当情况下可以互换,这仅仅是描述本申请的实施例中对相同属性的对象在描述时所采用的区分方式。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,以便包含一系列单元的过程、方法、系统、产品或设备不必限于那些单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它单元。
下面结合附图,对本申请的实施例进行描述。本领域普通技术人员可知,随着技术的发展和新场景的出现,本申请实施例提供的技术方案对于类似的技术问题,同样适用。
实施例一:
本申请实施例可应用于各种通过虚拟键盘进行输入的应用场景中。作为示例,例如当用户使用文本录入类的应用程序时、制作演示文稿(power point,PPT)、浏览网页、进行视频播放、进行音乐播放、使用导航类应用程序等应用场景中,均可以借助虚拟键盘进行输入,在前述种种场景中,在触控屏幕上实现盲打均为一种较难的任务。
为了解决上述问题,本申请实施例提供了一种反馈方法,该反馈方法应用于配置有触控屏幕的电子设备中,电子设备获取触控屏幕上的第一接触点的第一位置信息,并根据第一位置信息,获取与第一接触点对应的第一虚拟按键,在第一虚拟按键为锚定点按键的情况下,执行第一反馈操作,以提示用于第一虚拟按键是锚定点按键,从而利于培养用户对于锚定点按键的肌肉记忆,并借助对锚定点按键的肌肉记忆进行盲打训练,以降低在触控屏幕上实现盲打的难度。
本申请实施例提供的反馈方法可以用于图1示出的电子设备中,请参阅图1和图2,图1和图2为本申请实施例提供的电子设备的两种结构示意图。先参阅图1,电子设备1包括处理器10和触控屏幕20,触控屏幕20中包括接触感知模块100和振动反馈模块200,振动反馈模块200中包括多个振动反馈元件。
具体的,处理器10通过接触感知模块100获取触控屏幕上的第一接触点的第一位置信息,在处理器10确定与第一接触点对应的第一虚拟按键为锚定点按键的情况下,从振动反馈模块200包括的多个振动反馈元件中获取与第一虚拟按键匹配的振动反馈元件,并通过与第一虚拟按键匹配的振动反馈元件发出振动波,以通过触控屏幕在第一接触点处(也即触控屏幕上第一接触点周围的预设范围内)发出振动反馈,用于提示用户触摸的第一虚拟按键为锚定点按键。需要说明的是,前述振动反馈并不是一种全屏的振动反馈,而是针对第一接触点的一种振动反馈,在第一接触点的振动反馈强度最大。
在一些应用场景中,请参阅图2,电子设备1包括显示器30和触控屏幕20,触控屏幕20上显示有虚拟键盘,虚拟键盘中存在锚定点按键,也即触控屏幕20需要同时具有展示虚拟键盘和进行振动反馈的功能,则触控屏幕20中还需要设置有显示模块。在另一些应用场景中,电子设备1也可以为虚拟现实(virtual reality,VR)设备、增强现实(augmented reality,AR)设备或混合现实设备(mixed reality,MR)等设备,也即触控屏幕20可以不需要展示虚拟键盘,仅需要进行振动反馈,则触控屏幕20中不需要设置显示模块。应理解,后续实施例中仅以触控屏幕20中设置有显示模块为例进行说明。
进一步地,请参阅图3,图3为本申请实施例提供的触控屏幕的一种结构示意图。触控屏幕20还可以包括盖板300和显示模块400,图3中以盖板300与接触感知模块100集成于一体为例,盖板300和接触感知模块100也可以为互相分离。
其中,盖板300可以采用玻璃类透明刚性材料、柔性的透明有机材料或其他的材料等,接触感知模块100具体可以表现为接触感知薄膜,接触感知薄膜具体可以为电容式接触感知薄膜、压力式接触感知薄膜、温度式接触感知薄膜或其他类型的薄膜等,进一步地,作为示例,例如接触感知薄膜具体可以采用氧化铟锡(indium tin oxide,ITO)金属丝网、带凸起的碳纳米管网或其他材质等等,此处均不做穷举。本申请实施例中,提供了振动反馈元件的多种具体表现形式,提高了本方案的实现灵活性。
显示模块400用于显示虚拟键盘,显示模块400与接触感知模块100可以集成于一体,也可以相互分离,图3中仅以显示模块400与接触感知模块100相互分离为例。其中,显示模块400具体可以表现为显示面板,显示面板具体可以为液晶显示器(liquid crystal display,LCD)、有源矩阵有机发光二极体面板(active matrix/organic light Emitting Diode,AMOLED)或其他类型的显示面板等,此处不做穷举。
在一种实现方式中,如图3所示,振动反馈模块200具体可以表现为振动反馈层,振动反馈层位于接触感知模块100的下方,具体可以位于显示模块400的上方,也可以位于显示模块400的下方。
振动反馈模块200中配置有多个振动反馈单元201,图3中每个深灰色的菱形均代表一个振动反馈单元;一个振动反馈单元201中可以包括一个或多个振动反馈元件。在一种情况下,振动反馈层具体可以表现为振动反馈薄膜,对振动反馈薄膜进行分区以划分出多个振动反馈元件,在另一种情况下,振动反馈元件具体可以表现为压电陶瓷片、线性马达 或其他类型的电子元件,此处不做穷举。
更进一步地,多个振动反馈单元201可以有多种布局排列方式。在一种情况中,参阅图3,虚拟键盘与实体键盘的布局完全一致,前述实体键盘可以为包括61个按键的键盘、87个按键的键盘、104个按键的键盘、108个按键的键盘、人体工学键盘或其他类型的实体键盘等,具体虚拟键盘的设计可结合实际应用场景灵活设定。多个振动反馈单元201可以与多个虚拟按键一一对应式布局,也即每个虚拟按键在位置上对应一个振动反馈单元201。
在另一种情况下,请参阅图4,图4为本申请实施例提供的电子设备中多个振动反馈单元的两种排列示意图。图4包括(a)子示意图和(b)子示意图,先参阅图4的(a)子示意图,多个振动反馈单元201呈矩阵式排列。参阅图4的(b)子示意图,多个振动反馈单元201以类似国际象棋棋盘的形式排列,图4的(a)子示意图和图4的(b)子示意图中的每个灰色的方格均代表一个振动反馈单元201。
在另一种实现方式中,如图5和图6所示,多个振动反馈单元201(也即多个振动反馈元件)可以位于显示模块400的四周。图5为本申请实施例提供的触控屏幕的一种截面示意图,图6为本申请实施例提供的振动反馈模块包括的多个振动反馈单元的一种排列布局示意图。先参阅图5,触控屏幕20包括盖板300、接触感知模块100、显示模块400、振动反馈元件、振动反馈元件的支撑结构、触控屏幕中的其他模块以及底板,图5中以盖板300和接触感知模块100集成于一体为例,多个振动反馈元件平行于显示模块400,可以直接支撑盖板300。需要说明的是,在其他实施例中,盖板300和接触感知模块100也可以分别独立,多个振动反馈单元也可以平行于接触感知模块100。结合图5和图6可知,多个振动反馈单元201呈环绕式排列布局,也即多个振动反馈单元201环绕在显示模块400的四周,对应的,在其他实施例中,多个振动反馈单元201也可以环绕在接触感知模块100的四周。
进一步地,显示模块400与盖板300之间可以是空隙层,从而为振动反馈元件发出振动波提供活动空间余量,显示模块400与盖板300之间也可以使用透光胶合材料粘合。振动反馈元件的支撑结构与底板可以集成于一体,也可以为互相分离。本申请实施例中,提供了多个振动反馈单元的多种排列布局方式,提高了本方案的实现灵活性。
需要说明的是,上述对于多个振动反馈单元的排列布局方式的列举仅为方便理解本方案,多个振动反馈单元还可以采用其他的布局方式,具体实现方式应当结合实际产品的形态确定,此处不做穷举。
可选地,触控屏幕20还可以包括压力感知模块,压力感知模块用于检测触控屏幕上的压力变化及位置。
在一种实现方式中,压力感知模块可以与振动反馈模块200分别为两个独立的模块,则压力感知模块可以设置于振动反馈模块200的上方,也可以设置于振动反馈模块200的下方。压力感知模块具体可以表现为压力感知薄膜、分布式压力传感器或表现为其他形式,此处不做穷举。
在另一种实现方式中,压力感知模块与振动反馈模块200也可以集成于一体,则振动反馈模块200也可以称为压力感知模块,振动反馈元件也可以称为压力感知元件。在本实现方式中,振动反馈元件具体可以采用压电陶瓷片、压电聚合物(例如压电薄膜)、压电复合材料或其他类型的元件等,压电复合材料也即采用压电陶瓷片和压电聚合物得到的复合材料。进一步地,在一种情况下,可以对振动反馈模块200(也可以称为压力感知模块)中包括的多个振动反馈元件进行划分,多个振动反馈元件中的第二振动反馈元件用于采集压力值,多个振动反馈元件中的第三振动反馈元件用于发出振动波,以进行振动反馈。其中,第二振动反馈元件和第三振动反馈元件为不同的振动反馈元件。作为示例,例如一个振动反馈单元201中包括两个振动反馈元件,同一振动反馈单元201中的一个振动反馈元件用于采集压力值,另一个振动反馈元件用于发出振动波,以进行振动反馈。
在另一种情况下,振动反馈模块200(也可以称为压力感知模块)中的多个振动反馈元件在第一时间段内用于采集压力值,在第二时间段内用于发出振动波,第一时间段和第二时间段不同。作为示例,例如振动反馈模块200(也可以称为压力感知模块)中的多个振动反馈元件在默认状态下可以用于采集压力值,当达到第一压力值阈值时(也即确认接收到按压操作的情况下),用于发出振动波,以进行振动反馈。
本申请实施例中,触控屏幕中还配置有用于进行采集压力值的压力感知模块,从而不仅可以获取到接触点的位置信息,还可以获取到接触点的压力值,以对通过触控屏幕获取到的接触操作做进一步细致的管理;且将压力感知模块和振动反馈模块集成于一体,有利于降低触控屏幕的厚度,进而提高电子设备的便捷性。
可选地,触控屏幕20的盖板300的触觉特性为可改变的,触觉特性包括以下中的任一种或多种特性:滑动摩擦系数、粘滑性、温度或其他触觉特性等;进一步地,粘滑性代表滑动摩擦系数的变化速度。进一步地,可以为改变整个盖板300的触觉特性,也可以仅改变盖板300中接触点的触觉特性。
具体的,在一种实现方式中,如图7所示,图7为本申请实施例提供的触控屏幕的一种结构示意图。触控屏幕20还可以包括超声波模块500,超声波模块500用于发出超声波,以改变盖板300的触觉特性,具体可以通过超声波振动薄膜、压电薄膜、扬声器或其他元器件等实现,此处不做穷举。其中,超声波模块500可以配置于盖板300的下方,具体可以配置于接触感知模块100或显示模块400的上方,也可以配置于接触感知模块100或显示模块400的下方,图7中以配置于接触感知模块100的上方为例,应理解,图7中的示例仅为方便理解本方案,不用于限定本方案。
在另一种实现方式中,如图8所示,图8为本申请实施例提供的触控屏幕的一种结构示意图,触控屏幕20还包括静电模块600,静电模块600用于产生电信号,以改变盖板的触觉特性。其中,静电模块600具体可以表现为静电薄膜层,可以配置于盖板300的下方,具体可以配置于接触感知模块100或显示模块400的上方,也可以配置于接触感知模块100或显示模块400的下方,图8中以配置于接触感知模块100的上方为例,应理解,图8中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,触控屏幕还可以通过设置超声波模块或静电模块的方式,来改变盖板的触觉特性,从而可以提供更为丰富的触觉反馈,进而可以利用更为丰富的触觉反馈来对用户在触控屏幕上实现盲打进行训练,以进一步降低在触控屏幕上实现盲打的难度。
基于上述描述,本申请实施例提供一种反馈方法,可以应用于上述图1至图8示出的电子设备中。具体的,请参阅图9,图9为本申请实施例提供的反馈方法的一种流程示意图,本申请实施例提供的反馈方法可以包括:
901、电子设备检测作用于触控屏幕上的第一接触操作,并响应于第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息。
本申请实施例中,电子设备可以实时检测作用于触控屏幕上的第一接触操作,当电子设备通过触控屏幕检测到用户输入的第一接触操作时,可以响应于第一接触操作,获取通过触控屏幕中的接触感知模块采集到的触控屏幕上的至少一个第一接触点的数量和每个第一接触点的第一位置信息。其中,该至少一个第一接触点可以仅包括触控屏幕上的新增接触点,也可以包括触控屏幕上的所有接触点。第一位置信息为基于触控屏幕坐标系建立的,可以以触控屏幕的中心点、左上角的顶点、左下角的顶点、右上角的顶点、右下角的顶点、触控屏幕内的任意位置点或其他位置点作为坐标系原点。
更具体的,若至少一个第一接触点可以仅包括触控屏幕上的新增接触点,则在电子设备中的虚拟键盘呈现为打开的状态下,会通过触控屏幕中的接触感知模块持续检测与触控屏幕上各个接触点对应的触控信号,在检测到触控屏幕上出现新的接触点的接触信号的情况下,及时采集新增的至少一个第一接触点的位置信息。作为示例,例如用户刚刚开启文本录入类应用程序并调出虚拟键盘时,从双手尚未接触到触控屏幕,至将双手放置至标准指位时,就可以获取到触控屏幕上多个新的第一接触点。作为另一示例,例如用户在进行键盘输入时,某一手指从一个虚拟按键的键位离开,下落或滑入另一个虚拟按键的键位时,触控屏幕上就会出现一个新的第一接触点,就可以获取到触控屏幕上一个新的第一接触点。应理解,此处举例仅为方便理解本方案,不用于限定本方案。其中,本申请实施例中的虚拟键盘可以表现为任意类型的键盘,作为示例,例如虚拟键盘可以为全键盘、数字键盘、功能键盘等,或者,虚拟键盘也可以为触控屏幕上所有操作按键的统称。
需要说明的是,在步骤901中还需要进行防误触处理。具体的,由于不仅用户的手指可以在触控屏幕上产生接触点,用户的手掌、小臂、手背或电容笔等都可以在触控屏幕上产生接触点,也即电子设备通过触控屏幕的接触感知模块可以采集到用户的手掌、小臂、手背或电容笔等非用户手指产生的接触点的触控信号,则电子设备的处理器在获取到与触控屏幕上每个新增接触点对应的触控信号后,需要进行过滤分析,将获取到的新增接触点的触控信号中除手指触发之外的新增接触点的触控信号过滤出去,也即第一接触点仅包括由用户手指触发的新增接触点。
本申请实施例中,由于用户在使用实体键盘时,往往关注点放在新接触的实际按键中,本方案中仅对新增接触点产生反馈,可以更好的模拟用户使用实体键盘进行输入时的体验,且仅针对新增接触点产生反馈,也更容易建立用户与新增接触点之间的记忆关系,进一步降低在触控屏幕上训练盲打的难度。
可选地,触控屏幕中还可以配置有接近感知模块,当电子设备中的虚拟键盘为开启状态时,电子设备通过触控屏幕中的接近感知模块感知到用户手指在触控屏幕上方的移动轨迹,并对手指与触控屏幕的预计接触点进行预估。
可选地,在步骤901之前,电子设备还可以响应于检测到的第一手势操作,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;通过触控屏幕展示第一类型的虚拟键盘,在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在触控屏幕上的位置固定;电子设备在确定第一类型的虚拟键盘为展示过程中展示位置固定的虚拟键盘,则电子设备会实时获取触控屏幕上的第一接触点的第一位置信息,也即触发进入步骤901。对于第一手势操作、多个类型的虚拟键盘的概念,以及前述步骤的具体实现方式,均会在后续实施例二中进行描述,此处不做赘述。
902、电子设备获取与第一接触点对应的压力值。
本申请的一些实施例中,当电子设备通过触控屏幕获取到用户输入的第一接触操作时,还可以通过触控屏幕中的压力感知模块采集与触控屏幕上至少一个第一接触点对应的压力值。其中,与触控屏幕上至少一个第一接触点对应的压力值可以包括至少一个第一接触点中每个第一接触点的压力值,也可以为至少一个第一接触点共享一个压力值。
具体的,在一种情况中,若触控屏幕中的压力感知模块是一个独立的,则压力感知模块可以直接采集至少一个第一接触点中每个第一接触点的压力值。
在另一种情况中,若压力感知模块和振动反馈模块集成于一体,且振动反馈模块包括的每个振动反馈单元与虚拟键盘中的虚拟按键一一对应,则压力感知模块也可以直接采集到至少一个第一接触点中每个第一接触点的压力值。
在另一种情况中,若压力感知模块和振动反馈模块集成于一体,且振动反馈单元与虚拟按键并非为一一对应的关系,作为示例,例如多个振动反馈单元的排列布局方式为上述图4至图6示出的多种排列布局方式,也即多个振动反馈单元呈矩阵式排列、国际象棋式排列或环绕式排列。电子设备可以获取压力传感模块中所有压力传感元件(也可以称为振动反馈元件)的读数。进而在一种实现方式中,电子设备可以根据每个压力传感元件的坐标位置以及每个压力传感单元采集到的压力值,依据力矩相等的原理求解出至少一个第一接触点中每个第一接触点(也即每个压力中心点)的压力值。
在另一种实现方式中,电子设备也可以基于所有压力传感元件采集的压力值,计算出整个触控屏幕的压力值,并将至少一个接触点中各个第一接触点的压力值均确定为前述整个触控屏幕的压力值。
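上述依据力矩相等的原理求解各接触点压力值,以及以整屏压力值作为各接触点共享压力值的两种实现方式,可以用如下一维双接触点情形下的示意性Python代码帮助理解(其中的函数名、传感器布置均为本说明所作的假设,并非本申请的实际实现):

```python
def solve_two_contact_forces(sensor_pos, sensor_readings, x1, x2):
    """依据合力相等与力矩相等,求解两个接触点各自的压力值(一维简化情形)。

    sensor_pos / sensor_readings:各压力传感元件的坐标与读数;
    x1, x2:由接触感知模块得到的两个接触点(压力中心点)坐标,要求 x1 != x2。
    """
    total = sum(sensor_readings)  # 合力:F1 + F2 = sum(Fi)
    moment = sum(p * r for p, r in zip(sensor_pos, sensor_readings))  # 合力矩:F1*x1 + F2*x2 = sum(Fi*xi)
    f1 = (moment - total * x2) / (x1 - x2)  # 解二元一次方程组
    f2 = total - f1
    return f1, f2


def whole_screen_pressure(sensor_readings, num_contacts):
    """另一种实现:将所有传感元件读数之和作为整屏压力值,由各接触点共享。"""
    total = sum(sensor_readings)
    return [total] * num_contacts
```

例如,各传感元件读数的合力为5、合力矩为14,两接触点位于x=1与x=4时,可解得两点压力分别为2与3。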
903、电子设备根据第一接触点的第一位置信息,获取与第一接触点对应的第一虚拟按键。
本申请的一些实施例中,电子设备在获取到至少一个第一接触点中每个第一接触点的第一位置信息之后,可以逐个获取与每个第一接触点对应的第一虚拟按键;第一虚拟按键为虚拟键盘中的一个虚拟按键。
具体的,针对获取与第一接触点对应的第一虚拟按键的过程。由于电子设备可以展示一种或多种类型的虚拟键盘,则电子设备中可以存储有每种虚拟键盘中每个虚拟按键的位置信息,电子设备从多种虚拟键盘中确定当前展示的虚拟键盘,获取当前展示的虚拟键盘中每个虚拟按键的位置信息,进而根据第一接触点的第一位置信息和当前展示的虚拟键盘中每个虚拟按键的位置信息进行匹配,从而确定与第一接触点对应的第一虚拟按键。为了更直观地理解本方案,请参阅图10,图10为本申请实施例提供的反馈方法中虚拟键盘的两种示意图。图10的(a)子示意图和图10的(b)子示意图中示出的为触控屏幕上的两种虚拟键盘的样式,图10的(a)子示意图示出的为与74个按键的实体键盘对应的虚拟键盘,图10的(b)子示意图示出的为人体工学键盘,应理解,图10中的示例仅为方便理解本方案中的虚拟键盘,不用于限定本方案。
作为示例,例如当前展示的虚拟键盘为人体工学键盘,电子设备通过触控屏幕的接触感知模块确定触控屏幕上的第一接触点的第一位置信息后,将第一位置信息与人体工学键盘中每个虚拟按键的位置信息进行对比,从而确定第一接触点位于虚拟按键K的位置区域内,则确定与第一接触点对应的第一虚拟按键为按键K,应理解,此处举例仅为方便理解本方案,不用于限定本方案。
更具体的,在一种实现方式中,由于第一接触点在实际情况中可以表现为一个位置区域,则第一位置信息描述的可以为一个位置区域,电子设备可以取第一位置信息的中心点的坐标,并将第一位置信息的中心点的坐标与当前展示的虚拟键盘中每个虚拟按键的位置信息进行匹配,从而确定与第一接触点对应的第一虚拟按键。
在另一种实现方式中,电子设备也可以直接将第一接触点的第一位置信息与当前展示的虚拟键盘中每个虚拟按键的位置信息进行匹配,并从中选取第一虚拟按键,第一虚拟按键的位置信息与第一位置信息的交集最多。
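上述将第一位置信息与各虚拟按键的位置信息进行匹配的两种实现方式(中心点匹配与交集最多匹配),可以用如下示意性Python代码表示(按键布局、矩形表示方式均为说明性假设,并非本申请的实际实现):

```python
def _overlap_area(a, b):
    """计算两个矩形 (x0, y0, x1, y1) 的交集面积。"""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)


def key_at_center(layout, contact_rect):
    """实现方式一:取第一位置信息(接触区域)的中心点,匹配包含该点的虚拟按键。"""
    cx = (contact_rect[0] + contact_rect[2]) / 2
    cy = (contact_rect[1] + contact_rect[3]) / 2
    for key, rect in layout.items():
        if rect[0] <= cx < rect[2] and rect[1] <= cy < rect[3]:
            return key
    return None


def key_by_overlap(layout, contact_rect):
    """实现方式二:选取位置信息与第一位置信息交集最多的虚拟按键。"""
    key, rect = max(layout.items(), key=lambda kv: _overlap_area(kv[1], contact_rect))
    return key if _overlap_area(rect, contact_rect) > 0 else None
```

其中 layout 为当前展示的虚拟键盘中各虚拟按键及其位置区域的字典,例如 {'J': (0, 0, 10, 10), 'K': (10, 0, 20, 10)}。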
904、电子设备根据与第一接触点对应的压力值,判断与第一接触点对应的接触操作是按压操作还是触摸操作,若为按压操作,则进入步骤905;若是触摸操作,则进入步骤908。
本申请的一些实施例中,电子设备可以预先设置有第一压力值阈值和第二压力值阈值,第一压力值阈值指的是按压操作的阈值,第二压力值阈值为触摸操作的阈值。针对至少一个第一接触点中的任意一个第一接触点,电子设备在获取与第一接触点对应的压力值之后,可以判断与第一接触点对应的压力值是否大于或等于第一压力值阈值,若与第一接触点对应的压力值大于或等于第一压力值阈值,则确定与第一接触点对应的接触操作是按压操作;若与第一接触点对应的压力值大于或等于第二压力值阈值且小于第一压力值阈值,则确定与第一接触点对应的接触操作是触摸操作;若与第一接触点对应的压力值小于第二压力值阈值,则确定与第一接触点对应的接触操作是空闲操作,进而不做任何反馈。
其中,第一压力值阈值的取值大于第二压力值阈值的取值,作为示例,例如第一压力值阈值的取值范围可以为50克力至60克力,例如第一压力值阈值的取值为55克力、60克力或其他数值等;第二压力值阈值的取值范围可以为0克力至30克力,作为示例,例如第二压力值阈值的取值为15克力、20克力等等,此处不做限定。
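上述根据第一压力值阈值与第二压力值阈值区分按压操作、触摸操作与空闲操作的判断逻辑,可以用如下示意性Python代码表示(阈值取55克力与20克力,仅为落在上文所述取值范围内的说明性假设):

```python
def classify_contact(pressure_gf, press_threshold_gf=55.0, touch_threshold_gf=20.0):
    """根据压力值(克力)判断接触操作类型:按压、触摸或空闲。"""
    if pressure_gf >= press_threshold_gf:
        return 'press'   # 按压操作:压力值大于或等于第一压力值阈值
    if pressure_gf >= touch_threshold_gf:
        return 'touch'   # 触摸操作:压力值介于第二与第一压力值阈值之间
    return 'idle'        # 空闲操作:不做任何反馈
```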
905、电子设备判断第一虚拟按键是否为锚定点按键,在第一虚拟按键为锚定点按键的情况下,进入步骤906,在第一虚拟按键不是锚定点按键的情况下,进入步骤908。
本申请实施例中,在一种情况下,在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置固定;在另一种情况下,在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置可以移动。
若在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置固定,则在一种实现方式中,电子设备可以预先存储哪些按键是锚定点按键,哪些按键是非锚定点按键,则步骤903为必选步骤,电子设备在通过步骤903确定与第一接触点对应的第一虚拟按键之后,可以判断第一虚拟按键是否为锚定点按键。在另一种实现方式中,电子设备可以预先存储触控屏幕上的哪些位置区域是锚定点按键的位置区域,触控屏幕上的哪些位置区域是非锚定点按键的位置区域,则步骤903就是可选步骤,电子设备直接根据步骤901获取到的第一接触点的第一位置信息,直接判断第一接触点的位置是否位于锚定点按键的位置区域内,也即得知与第一位置信息对应的第一虚拟按键是否为锚定点按键。
若在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置可以移动,则步骤903为必选步骤,电子设备中可以存储有第一类型的虚拟键盘中每个虚拟按键的位置信息,在获取到第一接触点的第一位置信息之后,根据该第一位置信息,获取与第一接触点对应的第一虚拟按键,继而判断第一虚拟按键是否为锚定点按键。
本申请实施例中,能够根据第一位置信息,实时获取与第一接触点对应的第一虚拟按键,使得本方案不仅能够兼容位置固定的虚拟键盘,也可以兼容位置会移动的虚拟键盘,扩展了本方案的应用场景。
需要说明的是,锚定点按键的含义不等同于定位按键,也即锚定点按键指的是用于给用户带来提示效果的按键,在确定了当前展示的虚拟键盘之后,哪些虚拟按键为锚定点按键可以为预先配置于电子设备中的,也即哪些虚拟按键为锚定点按键可以为预先固定好的;也可以为由用户进行自定义,也即用户可以通过电子设备中的“设置”界面来自行定义哪些虚拟按键为锚定点按键。更进一步地,由于同一电子设备能够提供多种不同类型的虚拟按键,则不同类型的虚拟按键中锚点按键也可以不同。
作为示例,例如锚定点按键可以为按键“F”和按键“J”,或者,锚定点按键还可以包括空格键;作为另一示例,锚定点按键还可以包括ESC按键、Backspace按键、Enter按键、Ctrl按键等常用功能按键及数字键等;作为另一示例,例如虚拟键盘采用的为“DVORAK”的布局方式,锚定点按键可以包括按键“AOEUHTNS”这八个标准指位的按键;作为另一示例,虚拟键盘采用的为“AZERTY”的布局方式,锚定点按键还可以包括“QSDFJKLM”这八个按键;作为另一示例,锚定点按键还可以包括“AZERTY”这六个按键等,此处不对锚定点按键进行穷举。
906、电子设备执行第一反馈操作。
本申请实施例中,在与第一接触点对应的接触操作是按压操作,且第一虚拟按键为锚定点按键的情况下,电子设备执行第一反馈操作,第一反馈操作用于提示第一虚拟按键为锚定点按键。
具体的,在一种实现方式中,第一反馈操作采用的可以为振动反馈的形式,则步骤906可以包括:电子设备从多个振动反馈元件中获取第一振动反馈元件,第一振动反馈元件配置于触控屏幕中,第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件,与不同的虚拟按键匹配的振动反馈元件不完全相同;通过第一振动反馈元件发出第一类型的振动波,以执行第一反馈操作。其中,振动反馈元件发出的振动波采用的为非超声波,一般是频率 小于或等于500赫兹。
更具体的,针对获取与第一虚拟按键匹配的第一振动反馈元件的过程。由于该电子设备在出厂时,触控屏幕中包括的多个振动反馈元件的位置已经固定了,则电子设备在出厂时就可以配置有第一映射关系。其中,在一种实现方式中,可以将整个触控屏幕划分为多个位置区域,电子设备中存储的第一映射关系包括触控屏幕中多个位置区域中每个位置区域与至少一个振动反馈元件之间的对应关系,则无论在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置固定;还是在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置可以移动,电子设备均可以根据步骤901获取到的第一位置信息和第一映射关系,从多个振动反馈元件中获取与第一虚拟按键匹配的(也即与第一位置信息匹配的)至少一个第一振动反馈元件。本申请实施例中,可以根据第一位置信息和第一映射关系,获取与第一虚拟按键匹配的至少一个第一振动反馈元件,方便快捷,有利于提高振动反馈元件的匹配过程的效率;且第一映射关系能够指示第一位置信息和指示一个第一振动反馈元件之间的对应关系,不仅可以兼容位置固定的虚拟键盘,还可以兼容位置能够移动的虚拟键盘,保证了各种场景下均可以提供振动反馈。
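上述基于第一映射关系(位置区域与振动反馈元件之间的对应关系)查找第一振动反馈元件的过程,可以用如下示意性Python代码帮助理解(将触控屏幕划分为等大的网格位置区域,映射表的内容均为说明性假设,并非本申请的实际实现):

```python
def elements_for_position(pos, cell_size, mapping):
    """根据接触点的位置信息查询第一映射关系,返回匹配的振动反馈元件编号列表。

    mapping 即第一映射关系:{(行号, 列号): [振动反馈元件编号, ...]},
    表示触控屏幕被划分成的各位置区域与至少一个振动反馈元件之间的对应关系。
    """
    col = int(pos[0] // cell_size[0])
    row = int(pos[1] // cell_size[1])
    return mapping.get((row, col), [])
```

由于查询只依赖接触点的位置信息而不依赖按键布局,该方式既可用于展示位置固定的虚拟键盘,也可用于展示位置可移动的虚拟键盘。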
在另一种实现方式中,若在第一类型的虚拟键盘的展示过程中,展示的第一类型的虚拟键盘的位置固定,电子设备上可以预先配置有与多种虚拟键盘一一对应的多个映射关系,每个映射关系中包括多个虚拟按键中每个虚拟按键与至少一个振动反馈元件之间的对应关系。则电子设备先从多个映射关系中获取与当前展示的虚拟键盘匹配的第一映射关系,第一映射关系包括当前展示的虚拟键盘中每个虚拟按键与至少一个第一振动反馈元件之间的对应关系,电子设备根据第一映射关系和通过步骤903确定的第一虚拟按键,获取与第一虚拟按键匹配的一个或多个第一振动反馈元件。
本申请实施例中,预先配置有第一映射关系,从而在获取到第一虚拟按键之后,能够根据第一映射关系,获取与第一虚拟按键匹配的至少一个第一振动反馈元件,方便快捷,有利于提高振动反馈元件的匹配过程的效率;将确定振动反馈元件这一步骤进行拆分,从而当出现故障时,有利于对故障位置进行精确定位。
在另一种实现方式中,电子设备中预先配置有每个振动反馈元件的位置信息,电子设备根据第一虚拟按键的第一位置信息和振动反馈模块中每个振动反馈元件的位置信息,判断第一虚拟按键的下方是否存在用于产生振动波的振动反馈元件,若第一虚拟按键的下方存在用于产生振动波的振动反馈元件,则获取位于第一位置信息下方的至少一个振动反馈元件,前述位于第一位置信息下方的至少一个振动反馈元件指的是位置区域与第一虚拟按键在振动反馈模块的投影相交的振动反馈元件。若第一虚拟按键的下方不存在用于产生振动波的振动反馈元件,则电子设备会以第一虚拟按键的第一位置信息的中心点坐标为中心点,搜索预设区域内存在的用于产生振动波的振动反馈元件。该预设区域可以为圆形、正方形、长方形等,该预设区域的大小可以结合振动反馈元件的排列布局情况、振动反馈元件采用的元件类型等因素确定,此处不做限定。
针对通过第一振动反馈元件发出振动波,以执行第一反馈操作的过程。具体的,电子设备在确定了与虚拟按键匹配的至少一个第一振动反馈元件之后,电子设备通过至少一个 第一振动反馈元件发出第一类型的振动波。
可选地,电子设备还可以根据第一接触点的第一位置信息,获取与第一接触点对应的位置类型。位置类型包括第一接触点位于锚定点按键的第一位置区域和第一接触点位于锚定点按键的第二位置区域,第一位置区域和第二位置区域不同;也即将一个锚定点按键的全部位置区域进行进一步地划分,分为第一位置区域(也可以称为锚定点按键的特征区域)和第二位置区域(也可以称为锚定点按键的边缘区)。不同的虚拟按键中第一位置区域和第二位置区域的划分方式可以不同。
为更直观地理解本方案,请参阅图11至图13,图11至图13为本申请实施例提供的反馈方法中第一位置区域和第二位置区域的四种示意图。图11包括(a)和(b)两个子示意图,图11的(a)子示意图中虚线框以内的区域代表的为虚拟按键K的第一位置区域(也可以称为按键K的特征位置区域),图11的(a)子示意图中虚线框以外的区域代表的为虚拟按键K的第二位置区域(也可以称为按键K的边缘位置区域)。图11的(b)子示意图中虚线框以内的区域代表的为虚拟按键J的第一位置区域,图11的(b)子示意图中虚线框以外的区域代表的为虚拟按键J的第二位置区域,图11的(b)子示意图示出的可以为与实体键盘中带有小突起的按键对应的虚拟按键中第一位置区域和第二位置区域的划分方式。
请继续参阅图12,图12中虚线框以内的区域代表的为虚拟按键K的第一位置区域,图12中虚线框以外的区域代表的为虚拟按键K的第二位置区域。图12和图11的(a)子示意图为两种不同的区域划分方式,图12中的划分方式是为了模拟实体键盘中呈下凹弧面的键帽。再参阅图13,图13中虚拟按键K的虚线框以内的区域代表的为虚拟按键K的第一位置区域,图13中虚拟按键K的两条虚线框之间的区域代表的为虚拟按键K的第二位置区域。图13与图12以及图11的(a)子示意图为不同的区域划分方式,在虚拟按键为锚定点按键的情况下,将虚拟按键K的第二位置区域(也可以称为虚拟按键的边缘位置区域)拓展至虚拟按键K的边缘之外,覆盖了虚拟按键K周边的按键间隙,可以进一步增强锚定点按键的触觉差异度,应理解,图11至图13中示出的第一位置区域和第二位置区域的划分方式仅为方便理解第一位置区域和第二位置区域的概念,在实际情况中,可以结合实际的应用产品形态、用户习惯等因素对第一位置区域和第二位置区域进行划分,此处不做限定。
电子设备可以根据与第一接触点对应的位置类型,确定第一振动反馈元件发出的振动波的类型。其中,在第一接触点位于锚定点按键的第一位置区域的情况下,和,在第一接触点位于锚定点按键的第二位置区域的情况下,电子设备通过至少一个第一振动反馈元件发出的振动波的类型可以不同。其中,若电子设备通过振动反馈元件发出的为连续的振动波,则不同类型的振动波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长或振动波形。若电子设备通过振动反馈元件发出的为脉冲形式的振动波,则不同类型的振动波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长、振动波形或电子设备发出脉冲形式的振动波的频率。
进一步地,不同振动幅度的振动波可以通过不同的触发电压来实现,当为振动反馈元件输入300v电压时所产生的振动波的振动幅度,与当为振动反馈元件输入400v电压时所产生的振动波的振动幅度不同。与锚定点按键对应的振动反馈元件发出的振动波的振动频率可以在200赫兹至400赫兹之间,例如240赫兹、260赫兹、300赫兹、350赫兹、380赫兹或其他取值等等,此处不做穷举。振动时长可以为10毫秒、15毫秒、20毫秒、25毫秒、30毫秒等等。与锚定点按键对应的振动反馈元件发出的振动波可以为单一的基础波形,也可以为多种不同的基础波形之间的叠加;前述基础波形包括但不限于方波、正弦波、锯齿波、三角波或其他类型的基础波形等等。作为示例,例如一个第一振动反馈元件发出的振动波可以为由350v电压(决定了振动幅度)产生的、振动频率为290赫兹、持续20毫秒的正弦波,应理解,此处的种种举例仅为方便理解本方案,不用于限定本方案。
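上文所述由振动幅度、振动频率、振动时长与基础波形确定的一种振动波,可以用如下示意性Python代码生成(采样率等参数为说明性假设,并非本申请的实际实现):

```python
import math


def make_vibration_wave(amplitude, freq_hz, duration_ms, sample_rate=8000):
    """生成指定振动幅度、振动频率与振动时长的正弦基础波形(离散采样)。"""
    num_samples = int(sample_rate * duration_ms / 1000)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(num_samples)]


def superpose(*waves):
    """将多个等长的基础波形逐点叠加,得到多种基础波形之间的叠加波形。"""
    return [sum(samples) for samples in zip(*waves)]
```

例如,文中示例的"振动频率为290赫兹、持续20毫秒的正弦波"可由 make_vibration_wave(振幅, 290, 20) 得到。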
可选地,由于触控屏幕中存在至少一个第一振动反馈元件与第一虚拟按键匹配,虚拟键盘中还会存在第二虚拟按键,与第二虚拟按键对应的振动反馈元件的数量和与第一虚拟按键对应的振动反馈元件的数量可以不同或相同,也即与不同虚拟按键对应的振动反馈元件的数量可以相同或不同。作为示例,例如与虚拟按键K对应的振动反馈元件的数量可以为3个,与虚拟按键J对应的振动反馈元件的数量可以为2个。
为了实现与第一虚拟按键对应的振动反馈的强度和与第二虚拟按键对应的振动反馈的强度的差异在预设强度范围内,也即为了使得与不同的虚拟按键对应的总的振动反馈的强度(也即用户可以感知到的振动反馈的强度)的差异在预设强度范围之内,电子设备获取与至少一个第一振动反馈元件中各个第一振动反馈元件对应的振动波的振动强度,至少一个第一振动反馈元件中各个第一振动反馈元件的振动波的振动强度与第一数量相关,第一数量为与所述第一虚拟按键匹配的振动反馈元件的数量。进而根据与各个第一振动反馈元件对应的振动波的振动强度,通过至少一个第一振动反馈元件中每个第一振动反馈单元发出第一类型的振动波。其中,预设强度范围可以为强度差异在百分之二以内、强度差异在百分之三以内、强度差异在百分之四以内、强度差异在百分之五以内或其他强度范围等,此处不做穷举。
具体的,在一种实现方式中,电子设备可以在确定与第一虚拟按键匹配的至少一个第一振动反馈元件之后,直接根据与第一虚拟按键匹配的第一振动反馈元件的数量,确定与至少一个第一振动反馈元件中各个第一振动反馈元件对应的振动波的振动强度;其中,电子设备可以根据以下多项因素中的任一种或多种因素的组合来确定每个第一振动反馈元件的振动波的振动强度:与第一虚拟按键匹配的第一振动反馈元件的数量、每个第一振动反馈单元与第一虚拟按键的中心点的距离、振动波的类型、虚拟按键是否为锚定点按键、第一位置信息的位置类型或其他因素等。
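上述根据与第一虚拟按键匹配的振动反馈元件的数量(以及各元件与按键中心点的距离)确定各元件振动强度,使不同虚拟按键的总振动反馈强度差异保持在预设强度范围内的思路,可以用如下示意性Python代码表示(按距离反比分配权重仅为一种说明性假设,并非本申请的实际实现):

```python
def per_element_intensity(target_total, distances):
    """按各振动反馈元件与按键中心点的距离分配振动强度。

    distances:与某一虚拟按键匹配的各振动反馈元件到该按键中心点的距离,
    其长度即上文所述的"第一数量";分配结果之和恒为 target_total,
    从而使与不同虚拟按键对应的总振动反馈强度保持一致。
    """
    weights = [1.0 / (d + 1e-6) for d in distances]  # 距离越近,权重越大(反比权重为说明性假设)
    total_weight = sum(weights)
    return [target_total * w / total_weight for w in weights]
```

例如,匹配3个元件的按键与匹配2个元件的按键,按此分配后总强度均为 target_total,二者的差异为零,自然落在预设强度范围内。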
在另一种实现方式中,电子设备中可以预先存储有第二映射关系,在一种情况下,第二映射关系指示与第一位置信息对应的各个第一振动反馈元件的振动强度之间的关系,则电子设备可以根据步骤901获取到的第一位置信息和第二映射关系,获取到各个第一振动反馈元件的振动强度。在另一种情况下,第二映射关系指示第一虚拟按键与各个第一振动反馈元件的振动强度之间的关系,则电子设备根据步骤903获取到的第一虚拟按键和第二映射关系,获取到各个第一振动反馈元件的振动强度。
进一步地,针对在触控屏幕的表面进行强度测量的过程,可以将振动测量仪器的探头贴合在触控屏幕上的一个虚拟按键(也即一个检测点)的表面,以从前述检测点上采集到振动波,进而得到该采集到的振动波的波形曲线,通过前述波形曲线来指示与该检测点对应的振动反馈的强度。更进一步地,与第一虚拟按键对应的振动反馈的强度和与第二虚拟按键对应的振动反馈的强度之间的差异,可以通过对比在第一虚拟按键这个检测点上量取的波形曲线与在第二虚拟按键这个检测点上量取的波形曲线之间的差异来获得。
本申请实施例中,由于与不同的虚拟按键对应的振动反馈元件的数量可能不同,所以根据匹配的振动反馈元件的数量,来确定各个振动反馈元件的强度,以实现各个虚拟按键的振动反馈强度的差别在预设范围之内,由于当用户在使用实体按键时,不同的按键给出的力反馈基本相同,从而可以降低虚拟键盘与实体键盘之间的差异,以增加用户粘度。
在另一种实现方式中,第一反馈操作采用的可以为声音反馈的形式,则步骤906可以包括:电子设备发出第一提示音,第一提示音可以为“滴滴”、“哔哔”、“嘟嘟”等声音,此处不对第一提示音的具体表现形式进行穷举。
可选地,电子设备还可以根据第一接触点的第一位置信息,获取与第一接触点对应的位置类型;在第一接触点位于锚定点按键的第一位置区域的情况下,和,在第一接触点位于锚定点按键的第二位置区域的情况下,电子设备发出不同的提示音。作为示例,例如在第一接触点位于锚定点按键的第一位置区域的情况下,电子设备发出“滴滴”的提示音,在第一接触点位于锚定点按键的第二位置区域的情况下,电子设备发出“哔哔”的提示音。
电子设备还可以采用除了声音反馈、振动反馈之外的其他类型的反馈方式,具体采用哪种类型的反馈方式可结合实际产品形态以及产品的实际应用场景来确定,此处不做穷举。
907、电子设备执行第二反馈操作。
本申请的一些实施例中,在与第一接触点对应的接触操作是按压操作,且第一虚拟按键不是锚定点按键的情况下,电子设备可以执行第二反馈操作,第二反馈操作用于提示第一虚拟按键为非锚定点按键,第一反馈操作与第二反馈操作为不同的反馈操作。本申请实施例中,不仅在第一虚拟按键为锚定点按键的情况下执行反馈操作,且在第一虚拟按键为非锚定点按键的情况下也会执行反馈操作,第一反馈操作和第二反馈操作为不同的反馈操作,由于当用户使用实体键盘时,每个按键均会给用户以反馈,通过前述方式,能够增加虚拟键盘与实体键盘之间的相似度,且对锚定点按键与非锚定点按键给出不同的反馈操作,也可以帮助用户记住不同类型的按键,以协助用户实现在虚拟键盘上的盲打。
在一种实现方式中,第二反馈操作采用的可以为振动反馈的形式,则步骤907可以包括:电子设备获取与第一虚拟按键匹配的第一振动反馈元件,第一振动反馈元件配置于触控屏幕中;通过第一振动反馈元件发出第二类型的振动波,以执行第二反馈操作。第一类型的振动波与第二类型的振动波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长、振动波形。本申请实施例中,提供了不同类型的振动波的具体区别方式,可以通过振动幅度、振动频率、振动时长和/或振动波形等方面来区分不同类型的振动波,提高了本方案的实现灵活性。
具体的,电子设备获取与第一虚拟按键匹配的第一振动反馈元件的具体实现方式可参 阅上述步骤906中的描述,此处不做赘述。
针对通过第一振动反馈元件发出振动波,以执行第二反馈操作的过程。具体的,电子设备在确定了与虚拟按键匹配的至少一个第一振动反馈元件之后,电子设备通过至少一个第一振动反馈元件发出第二类型的振动波。
可选地,在第一虚拟按键不是锚定点按键的情况下,电子设备也可以根据第一接触点的第一位置信息,获取与第一接触点对应的位置类型,位置类型包括第一接触点位于非锚定点按键的第一位置区域和第一接触点位于非锚定点按键的第二位置区域,第一位置区域和第二位置区域不同;也即将一个非锚定点按键的全部位置区域进一步地划分为第一位置区域(也可以称为非锚定点按键的特征区域)和第二位置区域(也可以称为非锚定点按键的边缘区),不同的虚拟按键中第一位置区域和第二位置区域的划分方式可以不同。
电子设备可以根据与第一接触点对应的位置类型,确定第一振动反馈元件发出的振动波的类型。在第一接触点位于非锚定点按键的第一位置区域的情况下,和,在第一接触点位于非锚定点按键的第二位置区域的情况下,电子设备通过至少一个第一振动反馈元件发出的振动波的类型可以不同。
进一步地,在一种情况中,与锚定点按键的第一位置区域对应的振动波的类型和与非锚定点按键的第一位置区域对应的振动波的类型相同,且与锚定点按键的第二位置区域对应的振动波的类型和与非锚定点按键的第二位置区域对应的振动波的类型不同。
在另一种情况中,与锚定点按键的第一位置区域对应的振动波的类型和与非锚定点按键的第一位置区域对应的振动波的类型不同,且与锚定点按键的第二位置区域对应的振动波的类型和与非锚定点按键的第二位置区域对应的振动波的类型相同。
在另一种情况中,与锚定点按键的第一位置区域对应的振动波的类型和与非锚定点按键的第一位置区域对应的振动波的类型不同,且与锚定点按键的第二位置区域对应的振动波的类型和与非锚定点按键的第二位置区域对应的振动波的类型不同。
本申请实施例中,将锚定点按键和/或非锚定点按键的全部位置区域划分为第一位置区域和第二位置区域,在第一接触点位于的第一位置区域的情况下,和,在第一接触点位于第二位置区域这两种情况下,电子设备通过至少一个第一振动反馈元件发出的振动波的类型不同,有利于帮助用户记忆虚拟按键的边界,也即有利于协助用户对虚拟按键的不同区域建立肌肉记忆,以进一步降低在触控屏幕上实现盲打的难度。
在另一种实现方式中,第二反馈操作采用的可以为声音反馈的形式,则步骤907可以包括:电子设备发出第二提示音,第二提示音和第一提示音为不同的提示音。
可选地,电子设备还可以根据第一接触点的第一位置信息,获取与第一接触点对应的位置类型;在第一接触点位于非锚定点按键的第一位置区域的情况下,和,在第一接触点位于非锚定点按键的第二位置区域的情况下,电子设备发出不同的提示音。
需要说明的是,步骤907为可选步骤,可以不执行步骤907,也即当电子设备确定第一虚拟按键不是锚定点按键的情况下,可以不执行任何反馈。
908、电子设备判断第一虚拟按键是否为锚定点按键,在第一虚拟按键为锚定点按键的情况下,进入步骤909,在第一虚拟按键不是锚定点按键的情况下,进入步骤910。
本申请实施例中,步骤908的具体实现方式可以参阅上述对步骤905中的描述,此处不做赘述。
909、电子设备改变触控屏幕上的第一接触点的触觉特性,以呈现为第一触觉状态。
本申请的一些实施例中,在与第一接触点对应的接触操作是触摸操作,且第一虚拟按键为锚定点按键的情况下,电子设备改变触控屏幕的盖板中第一接触点的触觉特性,以呈现为第一触觉状态。其中,触控屏幕的盖板的触觉特性包括以下中的任一种或多种特性:滑动摩擦系数、粘滑性、温度和其他类型的触觉特性等。电子设备可以将触控屏幕的整个盖板均改变为第一触觉状态,以实现将触控屏幕的盖板中第一接触点改变至第一触觉状态;也可以为仅将触控屏幕的盖板中第一接触点改变至第一触觉状态,而不改变触控屏幕的盖板中其他区域的触觉状态。
具体的,在一种实现方式中,若电子设备的触控屏幕中集成有超声波模块,电子设备通过触控屏幕中的超声波模块发出超声波的方式,来改变触控屏幕的盖板中第一接触点的触觉特性。则电子设备可以通过超声波模块发出不同类型的超声波,以使触控屏幕的盖板中第一接触点呈现出不同的触觉特性。其中,若电子设备通过超声波模块发出的为单一的超声波,则不同类型的超声波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长或振动波形。进一步地,超声波模块发出的超声波的频率为大于20k赫兹,具体可以为21k赫兹、22k赫兹、24k赫兹、25k赫兹或其他数值等等,此处不做限定。若电子设备是通过超声波模块发出的为脉冲波,则不同类型的超声波的区别包括以下中的任一种或多种特性:振动幅度、振动频率、振动时长、振动波形、或电子设备发出脉冲波的频率,电子设备发出脉冲波的频率也可以称为电子设备发出脉冲波的节奏。作为示例,例如电子设备通过超声波模块每隔3毫秒发出一个脉冲形式的超声波,和电子设备通过超声波模块每隔10毫秒发出一个脉冲形式的超声波,前述两种情况下,电子设备发出脉冲波的频率不同,应理解,此处举例仅为方便理解本方案,不用于限定本方案。
则步骤909可以包括:电子设备获取与锚定点按键对应的第三类型的超声波,通过触控屏幕中的超声波模块发出第三类型的超声波,以将触控屏幕上的第一接触点的触觉特性改变至第一触觉状态。
可选地,在与第一接触点对应的接触操作是触摸操作,且第一虚拟按键为锚定点按键的情况下,电子设备也可以根据第一接触点的第一位置信息,获取与第一接触点对应的位置类型。电子设备还可以根据与第一接触点对应的位置类型,确定与第一位置信息对应的超声波的类型,进而通过触控屏幕中的超声波模块发出前述类型的超声波。其中,在第一接触点位于锚定点按键的第一位置区域的情况下,和,在第一接触点位于锚定点按键的第二位置区域的情况下,电子设备通过超声波模块发出的超声波的类型可以不同。
在另一种实现方式中,若电子设备的触控屏幕中集成有静电模块,电子设备通过触控屏幕中的静电模块发出静电的方式,来改变触控屏幕的盖板中第一接触点的触觉特性。则电子设备可以通过静电模块发出不同大小的静电,以使触控屏幕的盖板中第一接触点呈现出不同的触觉特性。静电模块发出的静电的伏特数的取值范围可以为100伏特至400伏特,作为示例,例如静电模块发出的静电的伏特数为120伏特、200伏特、380伏特或其他取值 等,此处不做限定。
则步骤909可以包括:电子设备获取与锚定点按键对应的静电的第一伏特值,通过触控屏幕中的静电模块发出第一伏特值的静电,以将触控屏幕上的第一接触点的触觉特性改变至第一触觉状态。
可选地,在与第一接触点对应的接触操作是触摸操作,且第一虚拟按键为锚定点按键的情况下,电子设备也可以根据第一接触点的第一位置信息,获取与第一接触点对应的位置类型。根据与第一接触点对应的位置类型,确定与第一位置信息对应的静电的伏特值,进而通过触控屏幕中的静电模块发出前述伏特值的静电。其中,在第一接触点位于锚定点按键的第一位置区域的情况下,和,在第一接触点位于锚定点按键的第二位置区域的情况下,电子设备通过静电模块发出的静电的伏特值可以不同。
需要说明的是,电子设备还可以通过其他方式来改变触控屏幕中盖板的触觉特性,此处不一一进行列举。
910、电子设备改变触控屏幕上的第一接触点的触觉特性,以呈现为第二触觉状态。
本申请的一些实施例中,在与第一接触点对应的接触操作是触摸操作,且第一虚拟按键为非锚定点按键的情况下,电子设备改变触控屏幕的盖板中第一接触点的触觉特性,以呈现为第二触觉状态,当触控屏幕呈现为第一触觉状态时的触觉特性,与当触控屏幕呈现为第二触觉状态时的触觉特性可以不同,也即当用户触摸锚定点按键时的感受与用户触摸非锚定点按键时的感受可以不同,以进一步协助用户区分虚拟键盘上的锚定点按键和非锚定点按键,以进一步协助用户对虚拟键盘中的虚拟按键进行定位。
具体的,在一种实现方式中,若电子设备的触控屏幕中集成有超声波模块,电子设备通过触控屏幕中的超声波模块发出超声波的方式,来改变触控屏幕的盖板中第一接触点的触觉特性。则步骤910可以包括:电子设备获取与非锚定点按键对应的超声波的第四类型,通过触控屏幕中的超声波模块发出第四类型的超声波,以将触控屏幕上的第一接触点的触觉特性改变至第二触觉状态。
可选地,在与第一接触点对应的接触操作是触摸操作,且第一虚拟按键为非锚定点按键的情况下,电子设备也可以获取与第一接触点对应的位置类型。根据与第一接触点对应的位置类型,确定与第一位置信息对应的超声波的类型。其中,在第一接触点位于非锚定点按键的第一位置区域的情况下,和,在第一接触点位于非锚定点按键的第二位置区域的情况下,电子设备通过超声波模块发出的超声波的类型可以不同。
在另一种实现方式中,若电子设备的触控屏幕中集成有静电模块,电子设备通过触控屏幕中的静电模块发出静电的方式,来改变触控屏幕的盖板中第一接触点的触觉特性。则步骤910可以包括:电子设备获取与非锚定点按键对应的静电的第二伏特值,通过触控屏幕中的静电模块发出第二伏特值的静电,以将触控屏幕上的第一接触点的触觉特性改变至第二触觉状态。
可选地,在与第一接触点对应的接触操作是触摸操作,且第一虚拟按键为非锚定点按键的情况下,电子设备也可以获取与第一接触点对应的位置类型。根据与第一接触点对应的位置类型,确定与第一位置信息对应的静电的伏特值。其中,在第一接触点位于非锚定点按键的第一位置区域的情况下,和,在第一接触点位于非锚定点按键的第二位置区域的情况下,电子设备通过静电模块发出的静电的伏特值可以不同。
需要说明的是,步骤908为可选步骤,若不执行步骤908,则可以将步骤909和910进行合并,也即在与第一接触点对应的接触操作是触摸操作的情况下,无论第一虚拟按键是锚定点按键还是非锚定点按键,触控屏幕上的第一接触点的触觉特性均可以呈现为相同的触觉状态。
此外,步骤908至910均为可选步骤,在电子设备确定与第一接触点对应的接触操作不是按压操作之后,可以直接不做任何反馈,也即在与第一接触点对应的压力值小于第一压力值阈值的情况下,电子设备可以不做任何反馈。
本申请实施例中,当用户接触的为虚拟按键上的锚定点按键时,会通过触控屏幕执行第一反馈操作,以提示用户当前接触的为锚定点按键,从而用户可以感知锚定点按键的位置,有利于降低在触控屏幕上实现盲打的难度;此外,触控屏幕中配置有多个振动反馈元件,在确定第一虚拟按键为锚定点按键的情况下,从多个振动反馈元件中获取与第一虚拟按键匹配的至少一个第一振动反馈元件,并指示该至少一个第一振动反馈发出振动波,能够实现仅在第一虚拟按键的周围产生振动反馈的效果,也即不是对全屏进行振动反馈,由于打字的时候所有手指都放置于触控屏幕上,如果是全屏的振动的话,则所有的手指都会感受到振动,就容易让用户混淆,但只在第一虚拟按键周围产生振动反馈的效果,则用户不容易产生混淆,更容易帮助用户在手指处形成肌肉记忆,以协助用户实现在触控屏幕上进行盲打。
在图1至图13所对应的实施例的基础上,为了更好的实施本申请实施例的上述方案,下面还提供用于实施上述方案的相关设备。请参阅图14,图14为本申请实施例提供的电子设备的一种结构示意图。电子设备1包括触控屏幕20、存储器40、一个或多个处理器10以及一个或多个程序401,触控屏幕20中配置有多个振动反馈元件,一个或多个程序401被存储在存储器40中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备执行以下步骤:检测作用于触控屏幕20上的第一接触操作;响应于第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息,第一位置信息与虚拟键盘上的第一虚拟按键对应;在第一虚拟按键为锚定点按键的情况下,从多个振动反馈元件中获取第一振动反馈元件,第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件;指示第一振动反馈元件发出振动波,以执行第一反馈操作,第一反馈操作用于提示第一虚拟按键为锚定点按键。
在一种可能的设计中,电子设备1中配置有第一映射关系,第一映射关系指示虚拟按键与振动反馈元件之间的对应关系,一个或多个处理器10在执行一个或多个程序401时,使得电子设备1具体执行以下步骤:根据第一映射关系和第一虚拟按键,获取第一振动反馈元件。
在一种可能的设计中,电子设备1中配置有第一映射关系,第一映射关系指示位置信息与振动反馈元件之间的对应关系,一个或多个处理器10在执行一个或多个程序401时,使得电子设备1具体执行以下步骤:根据第一映射关系和第一位置信息,获取第一振动反馈元件。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备1还执行以下步骤:获取与至少一个第一振动反馈元件中各个第一振动反馈元件对应的振动波的振动强度,至少一个第一振动反馈元件中各个第一振动反馈元件的振动波的振动强度与第一数量相关,第一数量为第一振动反馈元件的数量。一个或多个处理器10在执行一个或多个程序401时,使得电子设备1具体执行以下步骤:根据与各个第一振动反馈元件对应的振动波的振动强度,通过至少一个第一振动反馈元件发出振动波,以使与第一虚拟按键对应的振动反馈的强度和与第二虚拟按键对应的振动反馈的强度的差异在预设强度范围内,第二虚拟按键和第一虚拟按键为不同的虚拟按键。
在一种可能的设计中,第一振动反馈元件为以下中的任一种:压电陶瓷片、线性马达或压电薄膜。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备1还执行以下步骤:根据第一位置信息,获取与第一接触点对应的位置类型,位置类型包括第一接触点位于第一虚拟按键的第一位置区域和第一接触点位于第一虚拟按键的第二位置区域,第一位置区域和第二位置区域不同。一个或多个处理器10在执行一个或多个程序401时,使得电子设备1具体执行以下步骤:根据与第一接触点对应的位置类型,通过触控屏幕20执行第一反馈操作,与第一位置区域对应的反馈操作和与第二位置区域对应的反馈操作不同。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备1还执行以下步骤:响应于检测到的第一手势操作,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;通过触控屏幕20展示第一类型的虚拟键盘,在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在触控屏幕20上的位置固定。一个或多个处理器10在执行一个或多个程序401时,使得电子设备1具体执行以下步骤:在第一类型的虚拟键盘的展示过程中,检测作用于触控屏幕20上的第一接触操作。
需要说明的是,电子设备1中各模块/元件之间的信息交互、执行过程等内容,与本申请中图9至图13对应的各个方法实施例基于同一构思,具体内容可参见本申请前述所示的方法实施例中的叙述,此处不再赘述。
本申请实施例还提供了一种电子设备,请参阅图15,图15为本申请实施例提供的电子设备的一种结构示意图,电子设备1具体可以表现为手机、平板、笔记本电脑或者其他配置有触控屏幕的设备等,此处不做限定。其中,电子设备1上可以部署有图1至图8对应实施例中所描述的电子设备,用于实现图9至图13对应实施例中电子设备的功能。具体的,电子设备1可因配置或性能不同而产生比较大的差异,可以包括一个或一个以上中央处理器(central processing units,CPU)1522(例如,一个或一个以上处理器)和存储器40,一个或一个以上存储应用程序1542或数据1544的存储介质1530(例如一个或一个以上海量存储设备)。其中,存储器40和存储介质1530可以是短暂存储或持久存储。存储在存储介质1530的程序可以包括一个或一个以上模块(图示没标出),每个模块可以包括对电子 设备中的一系列指令操作。更进一步地,中央处理器1522可以设置为与存储介质1530通信,在电子设备1上执行存储介质1530中的一系列指令操作。
电子设备1还可以包括一个或一个以上电源1526,一个或一个以上有线或无线网络接口1550,一个或一个以上输入输出接口1558,和/或,一个或一个以上操作系统1541,例如Windows ServerTM,Mac OS XTM,UnixTM,LinuxTM,FreeBSDTM等等。
本申请实施例中,中央处理器1522,用于实现图9至图13对应实施例中电子设备的功能。需要说明的是,对于中央处理器1522执行图9至图13对应实施例中电子设备的功能的具体实现方式以及带来的有益效果,均可以参考图9至图13对应的各个方法实施例中的叙述,此处不再一一赘述。
本申请实施例中还提供一种计算机可读存储介质,该计算机可读存储介质中存储有程序,当其在计算机上运行时,使得计算机执行如前述图9至图13所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例中还提供一种计算机程序,当其在计算机上运行时,使得计算机执行如前述图9至图13所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例中还提供一种电路系统,所述电路系统包括处理电路,所述处理电路配置为执行如前述图9至图13所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例提供的电子设备具体可以为芯片,芯片包括:处理单元和通信单元,所述处理单元例如可以是处理器,所述通信单元例如可以是输入/输出接口、管脚或电路等。该处理单元可执行存储单元存储的计算机执行指令,以使芯片执行上述前述图9至图13所示实施例描述的方法中电子设备所执行的步骤。可选地,所述存储单元为所述芯片内的存储单元,如寄存器、缓存等,所述存储单元还可以是所述无线接入设备端内的位于所述芯片外部的存储单元,如只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)等。
其中,上述任一处提到的处理器,可以是一个通用中央处理器,微处理器,ASIC,或一个或多个用于控制上述第一方面方法的程序执行的集成电路。
另外需说明的是,以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。另外,本申请提供的装置实施例附图中,模块之间的连接关系表示它们之间具有通信连接,具体可以实现为一条或多条通信总线或信号线。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本申请可借助软件加必需的通用硬件的方式来实现,当然也可以通过专用硬件包括专用集成电路、专用CPU、专用存储器、专用元器件等来实现。一般情况下,凡由计算机程序完成的功能都可以很容易地用相应的硬件来实现,而且,用来实现同一功能的具体硬件结构也可以是多种多样的,例如模拟电路、数字电路或专用电路等。但是,对本申请而言更多情况下软件程序实现是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在可读取的存储介质中,如计算机的软盘、U盘、移动硬盘、ROM、RAM、磁碟或者光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序的形式实现。
所述计算机程序包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(Solid State Disk,SSD))等。
实施例二:
本申请实施例可应用于各种需要通过虚拟键盘进行输入的应用场景中。作为示例,例如在进行文字编辑的应用程序中,需要通过虚拟键盘输入文字、数字、字符等内容;作为另一示例,例如在制作文稿文件(power point,PPT)的应用程序中,也会需要通过虚拟键盘输入文字、数字、字符等内容;作为再一示例,例如在游戏类的应用程序中,也会需要通过虚拟键盘执行操作虚拟人物移动、修改人物名称、和游戏好友进行即时通信等功能等等,应理解,本申请实施例还可以应用于其他通过虚拟键盘进行输入的应用场景中,此处不进行穷举。在前述种种场景中,均存在虚拟键盘的按键数量有限,需要提供额外的实体键盘以满足用户的输入需求的问题。
为了解决上述问题,本申请实施例提供了一种虚拟键盘的处理方法,该方法应用于图16示出的电子设备中,电子设备中配置有多种类型的虚拟键盘,用户可以通过输入不同类型的手势操作以唤起不同类型的虚拟键盘,也即虚拟键盘不再是只能展示26个字母,而是通过不同类型的虚拟键盘向用户提供更多的虚拟按键,不仅提高了用户唤起虚拟键盘的过程中的灵活性,而且有利于提供更丰富的虚拟按键,从而不再需要提供额外的实体键盘。
请参阅图16,图16为本申请实施例提供的电子设备的一种示意图。在一些应用场景中,如图2所示,电子设备1至少包括一个显示屏,该显示屏为一个触控屏幕(也即图2中的触控屏幕20),则电子设备1可以通过该显示屏获取用户输入的各种类型的手势操作,并通过该显示屏展示各种类型的虚拟键盘。
在另一些应用场景中,如图16所示,电子设备2可以表现为VR、AR或MR等虚拟现实设备,电子设备2通过头显设备上配置的摄像机采集用户各种类型的手势操作,并通过该头显设备向用户展示各种类型的虚拟键盘。
结合上述描述,请参阅图17,图17为本申请实施例提供的虚拟键盘的处理方法的一种流程示意图,本申请实施例提供的虚拟键盘的处理方法可以包括:
1701、电子设备检测到第一手势操作,并获取与第一手势操作对应的第一手势参数。
本申请实施例中,电子设备可以实时检测用户是否输入手势操作,当电子设备检测到用户输入的第一手势操作时,生成与第一手势操作对应的第一手势参数。具体的,在一些应用场景中,电子设备配置有触控屏幕,则电子设备通过触控屏幕实时获取用户输入的第一手势操作。在另一些应用场景中,电子设备可以通过头显设备上配置的摄像机采集用户输入的第一手势操作,进而生成与第一手势操作对应的第一手势参数,在本应用场景中,电子设备可以表现为VR、AR或MR等虚拟现实设备,此处不做限定。
若第一手势操作为通过电子设备配置的显示屏获取到的,第一手势参数包括以下中任一项或多项参数:与第一手势操作对应的接触点的位置信息、与第一手势操作对应的接触点的数量信息、与第一手势操作对应的接触点的面积信息或其他类型的参数信息等。本申请实施例中,介绍了第一手势参数中包括哪些信息,第一手势参数中不仅包括每个接触点的位置信息和多个接触点的数量信息,还包括每个接触点的面积信息,接触点的面积信息能够从多个接触点中区分出基于手掌触发的接触点,有利于准确的估计第一手势操作的类型,避免显示错误的虚拟键盘,以提高虚拟键盘显示过程的正确率。
进一步地,第一手势操作对应的接触点的位置信息可以采用坐标信息、函数或其他信息来表示,与前述位置信息对应的坐标系原点可以为该显示屏的中心点、显示屏的左上角顶点、显示屏的左下角顶点、显示屏的右上角顶点、显示屏的右下角顶点或其他位置点等,具体坐标系原点的设定可结合实际应用场景确定。
具体的,电子设备的显示屏可以为触控屏幕,该触控屏幕中可以配置有接触感知模块,电子设备通过显示屏中配置的接触感知模块,采集与第一手势操作对应的第一手势参数。为更直观地理解本方案,请参阅图18,图18为本申请实施例提供的虚拟键盘的处理方法中第一手势参数的一种示意图。图18中以第一手势操作为单手操作为例,如图所示,显示屏上可以获取到4个接触点,4个接触点中由手指产生的3个接触点的面积比较小,剩余1个由手掌产生的接触点的面积较大,应理解,图18中的示例仅为方便理解本方案,不用于限定本方案。
若电子设备为虚拟现实设备,则虚拟键盘可以为在立体空间中通过视觉呈现的。电子设备可以对空间中的手势操作进行实时检测,以在检测到第一手势操作时,获取与第一手势操作对应的第一手势参数。
具体的,在一种情况下,电子设备可以通过用户的手持设备或手部穿戴设备,对用户的手进行实时追踪,以监测用户的第一手势操作。在另一种情况下,电子设备包括头显设备,第一手势操作为通过头显设备中配置的摄像机获取到的,第一手势参数可以具体表现为与第一手势操作对应的图像,电子设备可以将与第一手势操作对应的图像输入用于图像识别的神经网络中,以生成与第一手势操作对应的第一手势参数。
1702、电子设备根据第一手势参数,生成第一指示信息。
本申请实施例中,电子设备在采集到与第一手势操作对应的第一手势参数之后,还可以根据获取到的第一手势参数,进行二次处理,以生成与第一手势参数对应的第一指示信息,第一指示信息也可以视为经过二次处理得到的手势参数。其中,第一指示信息包括以下中任一项或多项(也即第一指示信息指示以下中任一项或多项):与第一手势操作对应的手的相对角度信息、与第一手势操作对应的手的位置信息、与第一手势操作对应的手的数量信息和与第一手势操作对应的手的形状信息等,具体第一指示信息中可以包括哪些类型的信息可以结合实际应用场景来灵活设定,此处不做限定。本申请实施例中,对获取到的第一手势参数进行二次处理后,可以得到手的相对角度信息、手的位置信息、手的数量信息或手的形状信息等信息,也即基于第一手势参数可以得到关于第一手势操作的更为丰富的信息,增加虚拟键盘匹配过程的灵活性。
具体的,在一些应用场景中,若第一手势参数为通过电子设备的显示屏采集到的。针对与第一手势操作对应的手的相对角度信息,该相对角度信息可以包括以下中的任一项或多项:与第一手势操作对应的手与显示屏的任意一条边之间的相对角度、与第一手势操作对应的手与显示屏的中心线之间的相对角度、与第一手势操作对应的手与显示屏的对角线之间的相对角度等,此处不做限定。
更具体的,若电子设备确定第一手势操作为单手操作(也即与第一手势操作对应的手的数量为1),则电子设备从与第一手势操作对应的多个接触点中获取至少两个第一接触点(也即基于手指产生的接触点),将至少两个第一接触点进行连线,或者,将至少两个第一接触点中距离最远的两个第一接触点进行连线,以生成与第一手势操作对应的直线,进而计算前述直线与预设线之间的相对角度,该预设线包括以下中的任一项或多项:显示屏的任意一条边、显示屏的中心线、显示屏的对角线等,以得到与第一手势操作对应的手的相对角度信息。
若电子设备确定第一手势操作为双手操作(也即与第一手势操作对应的手的数量为2),则电子设备从与第一手势操作对应的多个接触点中获取至少两个第一接触点,并将与左手对应的至少两个第一接触点进行连线,或者,将与左手对应的至少两个第一接触点中距离最远的两个第一接触点进行连线,以生成与左手对应的第一直线;将与右手对应的至少两个第一接触点进行连线,或者,将与右手对应的至少两个第一接触点中距离最远的两个第一接触点进行连线,以生成与右手对应的第二直线,进而分别计算第一直线与预设线之间的第一子角度,计算第二直线与预设线之间的第二子角度,以得到与第一手势操作对应的手的相对角度信息。
为更直观地理解本方案,请参阅图19,图19为本申请实施例提供的虚拟键盘的处理方法中相对角度信息的一种示意图。图19中以第一手势操作为双手操作为例,图19包括(a)和(b)两个子示意图,图19的(a)子示意图示出的为与双手操作对应的接触点的位置,图19的(b)子示意图中以预设线为显示屏的底边为例,分别将与左手对应的4个接触点中距离最远的两个接触点进行连线以生成第一直线,将与右手对应的4个接触点中距离最远的两个接触点进行连线以生成第二直线,进而得到第一子角度和第二子角度,应理解,图19中的示例仅为方便理解本方案,不用于限定本方案。
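结合上述计算过程,以下给出一段仅用于帮助理解的示意代码(Python,函数名与坐标约定均为本示例的假设,坐标系以显示屏左下角为原点、底边为水平方向),示出如何由一只手的多个第一接触点计算其与显示屏底边之间的相对角度:

```python
import math

def hand_angle(points):
    # 取距离最远的两个第一接触点进行连线,计算该直线与显示屏底边(水平方向)的夹角,
    # 返回 0 至 90 之间的角度(单位:度)
    pairs = [(p, q) for i, p in enumerate(points) for q in points[i + 1:]]
    (x1, y1), (x2, y2) = max(
        pairs, key=lambda pq: (pq[0][0] - pq[1][0]) ** 2 + (pq[0][1] - pq[1][1]) ** 2
    )
    return math.degrees(math.atan2(abs(y2 - y1), abs(x2 - x1)))
```

对于双手操作,可分别对与左手对应的第一接触点和与右手对应的第一接触点调用该函数,得到第一子角度和第二子角度。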
针对手的位置信息的确定过程。电子设备先根据获取到的第一手势参数,确定与第一手势操作对应的手的数量,若第一手势操作为双手操作,则与第一手势操作对应的手的位置包括两手之间的距离;若第一手势操作为单手操作,则与第一手势操作对应的手的位置包括第一区域和第四区域。其中,第一区域位于显示屏的左下方或右下方,第四区域为显示屏中除第一区域之外的区域;进一步地,第一区域的宽度可以为3厘米、4厘米或5厘米等数值,第一区域的底边与显示屏的底边重合。为更直观地理解本方案,请参阅图20,图20为本申请实施例提供的虚拟键盘的处理方法中第一区域的两种示意图。图20的(a)子示意图和图20的(b)子示意图分别示出了第一区域的两种示意图,应理解,图20中的示例仅为方便理解本方案,不用于限定本方案。
在第一手势操作为双手操作的情况下,电子设备可以将左手食指和右手食指之间的距离确定为两手之间的距离;也可以将左手和右手之间的最近距离确定为两手之间的距离;还可以根据多个接触点生成左手和右手的形状,进而生成左手边界和右手边界之间的距离等,此处不对两手之间的距离的确定方式进行穷举。
若第一手势操作为单手操作,电子设备根据与第一手势操作对应的第一手势参数,从与第一手势操作对应的多个接触点中选取出多个第一接触点,并根据多个第一接触点的位置,来确定与第一手势操作对应的手的位置。在一种实现方式中,若至少一个第一接触点中所有第一接触点均位于第一区域内,则确定手的位置为第一区域;若至少一个第一接触点中存在第一区域外的第一接触点,则确定手的位置为第四区域。在另一种实现方式中,若至少一个第一接触点中存在位于第一区域内的第一接触点,则确定手的位置为第一区域;若至少一个第一接触点中所有第一接触点均位于第四区域,则确定手的位置为第四区域。
针对手的数量的确定过程。电子设备获取到的第一手势操作可以为单手操作,也可以为双手操作。电子设备可以根据接触点的数量和接触点的位置信息,确定与第一手势参数对应的手的数量。在一种实现方式中,电子设备判断多个接触点的数量是否大于或等于第一数值,且,多个接触点中存在距离大于第二距离阈值的两个接触点,若多个接触点的数量大于或等于第一数值,且,多个接触点中存在距离大于第二距离阈值的两个接触点,则确定与第一手势操作对应的手的数量为2个;若多个接触点的数量小于第一数值,或,多个接触点中不存在距离大于第二距离阈值的两个接触点,则确定与第一手势操作对应的手的数量为1个。其中,第一数值的取值可以为2个、3个、4个、5个或其他数值,第一数值的取值也可以由用户进行自定义;第二距离阈值的取值可以为22毫米、25毫米、26毫米或其他数值等,第二距离阈值的取值也可以由用户进行自定义,具体第二距离阈值的取值可结合显示屏的大小、用户手的大小等因素来确定,此处不做限定。
在另一种实现方式中,电子设备判断多个接触点中是否存在第一子集合和第二子集合,若存在第一子集合和第二子集合,则确定与第一手势操作对应的手的数量为2个,若不存在第一子集合或第二子集合,则确定与第一手势操作对应的手的数量为1个。其中,第一子集合和第二子集合中包括的接触点数量均大于或等于第一数值,第一子集合中任意两个接触点之间的距离均小于第二距离阈值,第二子集合中任意两个接触点之间的距离均小于第二距离阈值,第一子集合中任意接触点与第二子集合中任意接触点之间的距离均大于或等于第二距离阈值。
为更直观地理解本方案,请参阅图21,图21为本申请实施例提供的虚拟键盘的处理方法中第一手势操作的一种示意图。图21中以第一数值的取值为3为例,图21包括(a)和(b)两个子示意图,图21的(a)子示意图中示出的为与第一手势操作对应的手的数量为1的情况(也即第一手势操作为单手操作),电子设备可以获取到图21的(a)子示意图中的3个接触点,前述3个接触点之间的距离小于25毫米;图21的(b)子示意图示出的为与第一手势操作对应的手的数量为2的情况(也即第一手势操作为双手操作),电子设备可以获取到8个接触点(分别为图21中的A1、A2、A3、A4、A5、A6、A7和A8),A7和A8所代表的接触点为基于手掌产生的接触点,A1、A2、A3、A4、A5和A6为基于手指产生的接触点,A1、A2和A3组成第一子集合,A4、A5和A6组成第二子集合,A1、A2和A3这三个接触点之间的距离均小于25毫米,A4、A5和A6这三个接触点之间的距离均小于25毫米,第一子集合和第二子集合之间的距离均大于25毫米,应理解,图21中的示例仅为方便理解本方案,不用于限定本方案。
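上述根据第一数值与第二距离阈值确定手的数量的过程,可用如下示意代码帮助理解(仅为一种简化实现,非本申请的限定;聚类方式与默认参数均为示例假设,min_points 对应第一数值,dist_threshold 对应第二距离阈值,单位以毫米计):

```python
def count_hands(points, min_points=3, dist_threshold=25.0):
    # 将间距小于第二距离阈值的接触点聚为一簇,
    # 若存在两个均包含至少"第一数值"个接触点的簇,则判定为双手操作
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    clusters = []
    for p in points:
        for c in clusters:
            if any(dist(p, q) < dist_threshold for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    big = [c for c in clusters if len(c) >= min_points]
    return 2 if len(big) >= 2 else 1
```

该示例对应图21的情形:左右各三个间距小于25毫米的接触点,且两组之间的距离均大于25毫米时,判定手的数量为2。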
可选地,电子设备还可以先根据与第一手势操作对应的第一手势参数,将与第一手势操作对应的多个接触点划分为第一接触点和第二接触点,其中,第一接触点是基于用户的手指产生的,第二接触点是基于用户的手掌产生的。进而判断多个接触点中第一接触点的数量是否大于或等于第一数值,且,至少一个第一接触点中是否存在距离大于第二距离阈值的两个接触点,以确定与第一手势操作对应的手的数量。具体的,在一种实现方式中,电子设备可以判断每个接触点的面积是否大于或等于第一面积阈值,若大于或等于第一面积阈值,则将该接触点确定为第二接触点(也即由手掌产生的接触点),若小于第一面积阈值,则该接触点确定为第一接触点(也即由手指产生的接触点)。第一面积阈值的取值可以为预先设定好的,也可以为由用户进行自定义的;第一面积阈值的取值可以结合用户的手的大小等因素来确定,此处不做限定。需要说明的是,此处以利用接触点的面积来确定接触点是第一接触点还是第二接触点为例,仅为方便理解本方案的可行性,不用于限定本方案。
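上述按接触点面积区分第一接触点(基于手指产生)与第二接触点(基于手掌产生)的过程,可示意如下(area 字段名、单位与第一面积阈值的取值均为本示例的假设):

```python
def split_contacts(contacts, area_threshold=2.0):
    # 面积小于第一面积阈值的为第一接触点(手指),
    # 面积大于或等于第一面积阈值的为第二接触点(手掌)
    fingers = [c for c in contacts if c["area"] < area_threshold]
    palms = [c for c in contacts if c["area"] >= area_threshold]
    return fingers, palms
```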
针对手的形状信息的确定过程。第一手势操作可以为静态的手势操作,与第一手势操作对应的手的形状信息具体可以为左手、右手、两指、握拳、或其他的形状信息等。可选地,若第一手势操作也可以为动态的滑动操作,手的形状信息具体可以为“Z”字形、对勾形、圆圈形等,此处不做穷举。具体的,若电子设备获取到的多个接触点的数量为两个,则可以确定与第一手势操作对应的形状信息为两指。为更直接地理解本方案,请参阅图22,图22为本申请实施例提供的虚拟键盘的处理方法中第一手势操作的一种示意图。图22中包括(a)和(b)两个子示意图,图22的(a)子示意图中示出的为两指操作的第一手势操作,图22的(b)子示意图示出的为与两指操作对应的两个接触点,应理解,图22中的示例仅为方便理解本方案,不用于限定本方案。
若电子设备获取到的多个接触点的数量大于或等于三个,则电子设备需要判断与第一手势操作对应的手的数量为1个还是2个,若电子设备确定为单手操作,则需要根据获取到的第一手势参数,判断与第一手势操作对应的手的形状为左手还是右手。具体的,在一种实现方式中,若与第一手势操作对应的多个接触点均位于显示屏的中线的左侧,则与第一手势操作对应的手的形状为左手;若与第一手势操作对应的多个接触点均位于显示屏的中线的右侧,则与第一手势操作对应的手的形状为右手。需要说明的是,此处提供的判断左手还是右手的方式,仅为方便理解本方案的可行性,不用于限定本方案。
在另一些应用场景中,若第一手势参数是基于与第一手势操作对应的图像生成的。则电子设备可以将与第一手势操作对应的图像输入用于图像识别的神经网络,以直接生成该第一指示信息。
1703、电子设备获取第一规则。
本申请实施例中,电子设备中可以预先配置有第一规则,第一规则指示多个类型的手势操作与多个类型的虚拟键盘之间的对应关系,第一类型的虚拟键盘为多个类型的虚拟键盘中的一个类型。其中,在一种情况下,多个类型的虚拟键盘中不同类型的虚拟键盘的功能不同,不同功能的虚拟键盘可以包括以下中任意两种或多种虚拟键盘的组合:数字键盘、功能键键盘、全键盘和自定义键盘,功能键键盘由功能键组成。本申请实施例中,不同类型的虚拟键盘的功能不同,从而可以向用户提供多种不同功能的虚拟键盘,提高用户在虚拟键盘的使用过程的灵活性,以提高本方案的用户粘度。
在另一种情况下,不同类型的虚拟键盘可以包括以下中任意两种或多种虚拟键盘的组合:迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘、自定义键盘和全键盘。
第一规则指示如下信息:在第一手势操作为单手操作的情况下,第一类型的虚拟键盘为以下中的任一种虚拟键盘:迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘、自定义键盘,迷你键盘包括26个字母按键,功能性键盘展示于应用程序中,功能性键盘包括的虚拟按键与应用程序的功能对应;需要说明的是,同一电子设备中不需要同时配置有迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘和自定义键盘,此处举例仅为证明在一个电子设备中单手操作触发的可以为迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘或自定义键盘中的任一种虚拟键盘。在第一手势操作为双手操作的情况下,第一类型的虚拟键盘为全键盘,全键盘至少包括26个字母按键,全键盘的尺寸大于迷你键盘的尺寸。本申请实施例中,提供了在第一手势操作为单手操作和双手操作这两种情况下,通过显示屏展示的虚拟键盘的多种具体表现形式,提高了本方案的实现灵活性,也扩展了本方案的应用场景。
进一步地,由于不同的电子设备中手势操作与不同类型的虚拟键盘之间的对应关系可以不同,同一电子设备可以包括以下五项中至少两项的组合:
(一)在第一手势操作为第一单手操作的情况下,第一类型的虚拟键盘为迷你键盘。
其中,第一单手操作可以为左手单手操作,也可以为右手单手操作;第一单手操作可以为手持手写笔的单手操作,也可以为未拿手写笔的单手操作。为更直观地理解本方案,请参阅图23,图23为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图。图23中以第一单手操作为用户拿着手写笔为例,电子设备检测到第一手势操作为第一单手操作,对应的第一类型的虚拟键盘为迷你键盘,迷你键盘中包括26个字母按键,且尺寸比全键盘小,应理解,图23中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,在第一手势操作为单手操作的情况下,第一类型的虚拟键盘为迷你键盘,有利于提高用户输入字母过程的灵活性。
(二)在第一手势操作为右手单手操作的情况下,第一类型的虚拟键盘为数字键盘,在第一手势操作为左手单手操作的情况下,第一类型的虚拟键盘为功能性键盘。
其中,功能性键盘包括的虚拟按键与应用程序的功能对应,作为示例,例如该第一手势操作是在游戏类的应用程序中获取到的,则功能性键盘可以为游戏键盘,游戏键盘中配置有游戏常用按键。再例如该第一手势操作是在绘图类的应用程序中获取到的,则功能性键盘可以为绘图软件中的常用按键等,此处不做穷举。
为更直观地理解本方案,请参阅图24和图25,图24和图25为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的两种示意图。图24和图25均包括(a)和(b)两个子示意图,先参阅图24,图24的(a)子示意图示出的为第一手势操作为右手操作,图24的(b)子示意图代表第一类型的虚拟键盘具体表现为数字键盘。再参阅图25,图25的(a)子示意图中示出的为第一手势操作为左手操作,图25的(b)子示意图代表第一类型的虚拟键盘具体表现为设计师键盘,应理解,图24和图25中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,在第一手势操作为右手单手操作的情况下,第一类型的虚拟键盘为数字键盘,在第一手势操作为左手单手操作的情况下,第一类型的虚拟键盘为功能性键盘,更加符合用户对实体键盘的使用习惯,以降低虚拟键盘与实体键盘之间的差异,有利于增强用户粘度。
(三)在第一手势操作为位于显示屏的第一区域的单手操作的情况下,第一类型的虚拟键盘为功能键键盘,第一区域位于显示屏的左下方或右下方,第一区域的概念可以参阅上述步骤1702中的描述,此处不做赘述。
其中,功能键键盘中示出了一个或多个功能键,功能键键盘包括但不限于Shift按键、Ctrl按键、Alt按键、Fn(function的缩写)按键、Delete按键等等,具体功能键键盘中会包括哪些功能键可结合实际应用场景来限定,此处不做限定;其中,Fn按键是计算机键盘上采用的修饰按键,它的主要功能是在紧凑布局的键盘中以组合键方式定义更多一键两义的按键。为更直观地理解本方案,请参阅图26和图27,图26和图27为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的两种示意图。图26和图27包括(a)和(b)两个子示意图,先参阅图26,图26中以第一区域位于显示屏的左下方的情况为例,如图26的(a)子示意图所示,当用户将单手放置于显示屏的第一区域时,触发进入图26的(b)子示意图,也即第一类型的虚拟键盘为功能键键盘。继续参阅图27,图27中以第一区域位于显示屏的右下方的情况为例,如图27的(a)子示意图所示,当用户将单手放置于显示屏的第一区域时,触发进入图27的(b)子示意图,也即第一类型的虚拟键盘为功能键键盘,应理解,图26和图27中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,由于功能键按键配置于实体键盘的左下方或右下方,在第一手势操作为位于显示屏的第一区域的单手操作的情况下,第一类型的虚拟键盘为功能键键盘,由于触发手势与用户的使用实体键盘的习惯相同,方便用户记忆触发手势,降低本方案的实现难度,有利于增强用户粘度。
(四)在第一手势操作为少于三个接触点的单手操作的情况下,第一类型的虚拟键盘为圆形键盘或弧形键盘。
其中,圆形键盘指的是形状为圆形的键盘,弧形键盘指的是形状为弧形的键盘。可选地,在第一手势操作为两个接触点的单手操作,且两个接触点之间的距离大于第三距离阈值的情况下,第一类型的虚拟键盘为圆形键盘或弧形键盘,第三距离阈值的取值可以为58毫米、60毫米、62毫米等等,此处不做穷举。为更直观地理解本方案,请参阅图28,图28为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图。图28包括(a)和(b)两个子示意图,图28的(a)子示意图代表第一手势操作为两个接触点(也即少于三个接触点)的单手操作,图28的(b)子示意图代表第一类型的虚拟键盘为圆形键盘,应理解,图28中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,当第一手势操作为少于三个接触点的单手操作时,还可以提供圆形键盘或弧形键盘,不仅能提供实体键盘中存在的键盘,而且还可以提供实体键盘中不存在的键盘,丰富了键盘的类型,给用户提供了更多的选择,进一步增强用户的选择灵活度。
(五)在第一手势操作为双手操作的情况下,第一类型的虚拟键盘为全键盘。
为更直观地理解本方案,请参阅图29,图29为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图。图29代表与双手操作对应的虚拟键盘为全键盘,全键盘至少包括26个字母按键,对比图29和图23可知,全键盘的尺寸大于迷你键盘,应理解,图29中的示例仅为方便理解本方案,不用于限定本方案。
需要说明的是,上述项(一)至项(五)中除了项(一)和项(二)是互相不兼容的,不可以配置于同一电子设备中,其余项之间可任意搭配。
更进一步地,在一种实现方式中,第一规则直接包括多个类型的手势操作与多个类型的虚拟键盘之间的对应关系,也即如上述项(一)至项(五)示出的。第一规则中包括多个第一标识信息和多个第二标识信息之间的对应关系,第一标识信息用于唯一指向一种类型的手势操作,第二标识信息用于唯一指向一种类型的虚拟键盘。
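作为帮助理解的一种示意,前述第一标识信息与第二标识信息之间的对应关系可以用一个简单的映射表实现(其中的手势类型与虚拟键盘类型的组合仅为示例,实际配置可结合应用场景灵活设定):

```python
# 第一规则的一种最简表示:第一标识信息(手势操作类型) -> 第二标识信息(虚拟键盘类型)
FIRST_RULE = {
    "右手单手操作": "数字键盘",
    "左手单手操作": "功能性键盘",
    "第一区域的单手操作": "功能键键盘",
    "双手操作": "全键盘",
}

def select_keyboard(gesture_type, rule=None):
    # 命中预存手势类型时返回第一类型的虚拟键盘,
    # 未命中时返回 None,对应重新进入检测流程(步骤1701)
    rule = FIRST_RULE if rule is None else rule
    return rule.get(gesture_type)
```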
在另一种实现方式中,第一规则包括多组条件与多个类型的虚拟键盘之间的对应关系,多组条件中的每组条件与一个类型的手势操作对应,也即多组条件中每组条件用于限定一个类型的手势操作。
具体的,若第一手势参数为基于显示屏采集到的,用于限定单手操作的一组条件可以为接触点数量大于或等于第一数值,且,多个接触点之间的距离均小于第二距离阈值,第一数值和第二距离阈值的取值可参阅上述描述。可选地,用于限定单手操作的一组条件可以为面积小于第一面积阈值的接触点数量大于或等于第一数值,且,面积小于第一面积阈值的多个接触点之间的距离均小于第二距离阈值。
用于限定左手单手操作的一组条件可以为接触点数量大于或等于第一数值,且,多个接触点之间的距离均小于第二距离阈值,且,多个接触点均位于显示屏的中心线的左侧;用于限定右手单手操作的一组条件可以为接触点数量大于或等于第一数值,且,多个接触点之间的距离均小于第二距离阈值,且,多个接触点均位于显示屏的中心线的右侧。
用于限定第一区域的单手操作的一组条件可以为接触点数量大于或等于第一数值,且,多个接触点之间的距离均小于第二距离阈值,且,多个接触点均位于显示屏的第一区域内;或者,用于限定第一区域的单手操作的一组条件可以为接触点数量大于或等于第一数值,且,多个接触点之间的距离均小于第二距离阈值,且,多个接触点中存在至少一个接触点位于显示屏的第一区域内等。
用于限定双手操作的一组条件可以为多个接触点包括第一子集合和第二子集合,第一子集合和第二子集合的接触点数量均大于或等于第一数值,且,第一子集合中的多个接触点之间的距离均小于第二距离阈值,且,第二子集合中的多个接触点之间的距离均小于第二距离阈值,且,第一子集合中任意一个接触点与第二子集合中任意一个接触点之间的距离大于第二距离阈值。可选地,用于限定双手操作的一组条件可以为面积小于第一面积阈值的多个接触点包括第一子集合和第二子集合。
需要说明的是,以上提供了用于限定多种类型的手势操作的多组条件,但具体一个电子设备中配置有哪些类型的手势操作,以及与每种类型的手势操作对应的是什么限定条件,均可以结合实际应用场景灵活设定,此处不做限定。
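第一规则采用多组条件表示时,其匹配过程可示意如下(条件组按序逐组判断;gesture 的字段名与各组条件的内容均为本示例的假设,仅为帮助理解,不用于限定本方案):

```python
def match_rule(gesture, condition_groups):
    # condition_groups: [(条件函数, 虚拟键盘类型), ...]
    # 第一手势操作满足某一组条件时,返回该组条件对应的虚拟键盘类型
    for predicate, keyboard in condition_groups:
        if predicate(gesture):
            return keyboard
    return None

# 一种示例规则:gesture 为包含手的数量、接触点数量等字段的字典
GROUPS = [
    (lambda g: g["hands"] == 2, "全键盘"),
    (lambda g: g["hands"] == 1 and g["in_first_region"], "功能键键盘"),
    (lambda g: g["hands"] == 1 and g["contacts"] < 3, "圆形键盘"),
]
```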
可选地,第一规则中包括第一子规则,第一子规则为基于对至少一个类型的手势操作和/或至少一个类型的虚拟键盘执行自定义操作后得到的。本申请实施例中,用户可以对触发手势和/或虚拟键盘的类型进行自定义,使得虚拟键盘的展示过程更加符合用户的预期,以进一步提高本方案的用户粘度。
具体的,电子设备中存在有“设置”功能,在该“设置”功能中配置有对第一规则的第一设置界面,从而用户可以通过第一设置界面对以下中的任一项或多项进行自定义:手势操作、虚拟键盘和手势操作与虚拟键盘之间的对应关系。为更直观地理解本方案,请参阅图30至图32,图30和图31为本申请实施例提供的虚拟键盘的处理方法中第一设置界面的示意图,图32为本申请实施例提供的虚拟键盘的处理方法中自定义的手势操作的一种示意图。先参阅图30,图30包括(a)、(b)、(c)和(d)四个子示意图,图30的(a)子示意图代表电子设备中预先配置的多个类型的手势操作与多个类型的虚拟键盘之间的对应关系,如图30的(a)子示意图所示,单手操作触发展示数字键盘,双手操作触发展示全键盘,两指操作触发展示圆形键盘,当用户点击“数字键盘”(一种类型的虚拟键盘)时,触发进入图30的(b)子示意图,也即对“数字键盘”的自定义操作。在图30的(b)子示意图中,当用户对数字键盘中的任意一个按键执行长按操作、双击操作、三击操作或其他类型的接触操作时,图30中以对数字键盘中的按键2执行长按操作为例,数字键盘中的部分按键上出现删除图标(也即图30中示出的“×”的符号),也即触发进入图30的(c)子示意图。在图30的(c)子示意图中,出现“×”符号的按键为可删除的按键,此外,用户还可以通过长按并拖动按键的方式来移动数字键盘中按键的位置。前述删除按键和移动按键位置的操作可以多次执行,如图30的(d)子示意图所示,用户删除了除1-9之外的数字键,以实现对数字键盘的自定义。需要说明的是,图30中的示例仅为方便理解本方案,还可以通过其他操作来实现对虚拟键盘中虚拟按键的删除或移动操作,且图30中仅以对数字键盘进行自定义为例,对其他类型的虚拟键盘也可以进行自定义。
继续参阅图31,图31需结合图30进行描述,当用户点击图30的(a)子示意图中的“单手操作”时,进入图31的(a)子示意图,图31的(a)子示意图中展示有输入“自定义手势”的图标,用户点击前述图标,进入图31的(b)子示意图,用户基于图31的(b)子示意图的提示进行自定义手势的输入,也即如图31的(c)子示意图中示出的输入“握拳形”手势。在一种实现方式中,电子设备可以预先设置第一时长阈值,该第一时长阈值为输入自定义手势的总时长阈值,当达到该输入时长阈值时,进入图31的(d)子示意图;在另一种实现方式中,电子设备也可以预先配置第二时长阈值,第二时长阈值为用户停止输入手势的阈值,当电子设备检测到用户停止输入手势的时长达到第二时长阈值时,进入图31的(d)子示意图等,此处不对进入图31的(d)的子示意图的方式进行穷举。在图31的(d)子示意图中,显示屏中展示有用于指示“确认”的图标和用于指示“重新输入自定义手势”的图标,若用户点击“确认”的图标,则电子设备将通过图31的(c)子示意图获取到的手势操作确定为自定义手势1,电子设备更新第一规则,将单手操作与数字键盘之间的对应关系更新为自定义手势1与数字键盘之间的对应关系,并展示图31的(e)子示意图,也即将自定义手势1确认为数字键盘的触发手势,以完成对触发手势的自定义。此外,参阅图31的(f)子示意图,图31的(f)子示意图代表电子设备获取到的自定义手势1的形状(也即“握拳”形状),应理解,图31中的示例仅为方便理解本方案,不用于限定本方案,用户还可以设置其他形状的自定义手势,此处均不做限定。
请继续参阅图32,图32需要结合图31进行描述,用户基于图31的(b)子示意图中的提示,开始输入自定义手势,也即进入图32的(a)子示意图和图32的(b)子示意图,图32中以自定义手势为动态手势为例,图32中以自定义手势为握拳后张开的动态手势,当电子设备确定用户完成自定义手势的输入后,可以进入图31的(d)子示意图,后续步骤可参阅上述对图31的描述,此处不做赘述。
1704、电子设备根据第一规则,判断第一手势操作是否包括于预存的多种类型的手势操作中,若第一手势操作为预存的多种类型的手势操作中的一种手势操作,则进入步骤1705;若第一手势操作不包括于预存的多种类型的手势操作,则重新进入步骤1701。
本申请实施例中,若第一规则包括多个类型的手势操作与多个类型的虚拟键盘之间的对应关系,则电子设备需要通过步骤1702生成第一指示信息,第一指示信息中需要包括与第一手势操作对应的手的数量信息、与第一手势操作对应的手的位置信息和与第一手势操作对应的手的形状信息,电子设备在得知第一指示信息之后,可以确定第一手势操作是否为电子设备中预先配置的多种类型的手势操作中的一种。
若第一规则包括的是多组条件,则电子设备在通过步骤1701获取到与第一手势操作对应的第一手势参数之后,就可以直接判断第一手势操作是否满足第一规则包括的多组条件中的任意一组条件,对于多组条件的描述可参阅上述步骤1703中的描述。
1705、电子设备通过显示屏展示第一类型的虚拟键盘。
本申请实施例中,电子设备在根据第一规则,确定第一手势操作为电子设备预存的多种类型的手势操作中的目标类型的手势操作之后,可以获取与目标类型的手势操作对应的第一类型的虚拟键盘(也即获取与第一手势操作对应的第一类型的虚拟键盘)。进而通过显示屏展示第一类型的虚拟键盘。本申请实施例中,电子设备中预先配置有第一规则,第一规则指示多个类型的手势操作与所述多个类型的虚拟键盘之间的对应关系,在检测到作用于显示屏的第一手势操作之后,可以根据第一规则,得到与特定的第一手势操作对应的第一类型的虚拟键盘,提高虚拟键盘匹配过程的效率。
具体的,在一种实现方式中,在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在显示屏上的位置固定。在另一种实现方式中,在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在显示屏上的位置可以随着用户的手的移动而移动。
在另一种实现方式中,将多个类型的虚拟键盘分为第三子集合和第四子集合,第三子集合和第四子集合中均包括至少一个类型的虚拟键盘,第三子集合中的每个类型的虚拟键盘在展示过程中均位置固定,第四子集合中的每个类型的虚拟键盘在展示过程中位置可以随着用户的手移动而移动;也即多个类型的虚拟键盘中部分类型的虚拟键盘展示过程中的位置固定,另一部分类型的虚拟键展示过程中随着用户的手的移动而移动。
可选地,若第一类型的虚拟键盘为迷你键盘、数字键盘或功能性键盘,则第一类型的虚拟键盘可以随着用户的手的移动而移动,也即第四子集合包括以下中的任一项或多项组合:迷你键盘、数字键盘和功能性键盘。若第一类型的虚拟键盘为圆形键盘、弧形键盘或全键盘,则第一类型的虚拟键盘可以在展示过程中位置固定,也即第三子集合包括以下中的任一项或多项组合:圆形键盘、弧形键盘和全键盘。
进一步地,针对随着用户的手的移动而移动的虚拟键盘,当用户想要关闭虚拟键盘的移动功能时,可以通过显示屏输入第二手势操作,第二手势操作可以为双击操作、三击操作、单击操作或其他操作等等,此处不做限定。
针对虚拟键盘的初始展示位置。第一类型的虚拟键盘的初始展示位置可以为预先设定好的,也可以为电子设备基于手指的位置确定的。作为示例,例如第一类型的虚拟键盘为数字键盘,则数字键盘初始展示时,数字5对应的按键可以配置于食指下方;作为另一示例,例如第一类型的虚拟键盘为迷你键盘,则迷你键盘的初始展示位置可以为手的下方等,此处举例仅为方便理解本方案,不用于限定本方案。
针对虚拟键盘的显示大小。在一种实现方式中,电子设备中每种类型的虚拟键盘的尺寸大小固定。在另一种实现方式中,同一类型的虚拟键盘可以有不同的尺寸,以适应不同的手指/手的大小;具体的,电子设备中可以为同一类型的虚拟键盘预先存储有至少两个不同的尺寸,并预先存储接触点大小与不同尺寸之间的对应关系,则在确定第一类型的虚拟键盘之后,可以获取与接触点的大小对应的目标尺寸,并展示目标尺寸的第一类型的虚拟键盘。
可选地,电子设备在通过显示屏展示第一类型的虚拟键盘之前,还可以根据步骤1702中生成的第一指示信息获取第一角度,第一角度指示与第一手势操作对应的第一手势中手与显示屏的边之间的相对角度,或者,第一角度指示与第一手势操作对应的第一手势中手与显示屏的中心线之间的相对角度。步骤1705可以包括:电子设备根据第一角度,获取第一类型的虚拟键盘的第一展示角度,并通过显示屏按照第一展示角度展示第一类型的虚拟键盘;第一展示角度指示第一类型的虚拟键盘的边与显示屏的边之间的相对角度,或者,第一展示角度指示第一类型的虚拟键盘的边与显示屏的中心线之间的相对角度。
具体的,在一种实现方式中,电子设备判断第一角度是否大于或等于预设角度阈值,若大于或等于预设角度阈值,则获取第一展示角度,并通过显示屏按照第一展示角度展示第一类型的虚拟键盘,其中,预设角度阈值的取值可以为25度、28度、30度、32度、35度或其他数值等,此处不做限定。
进一步地,若第一类型的虚拟键盘是全键盘,则第一角度包括左手的相对角度和右手的相对角度,该全键盘被拆分为第一子键盘和第二子键盘,第一子键盘和第二子键盘包括的为全键盘中不同的虚拟按键,第一展示角度包括第一子键盘的展示角度和第二子键盘的展示角度。若第一角度指示与第一手势操作对应的第一手势中手与显示屏的边之间的相对角度,第一展示角度指示虚拟键盘的底边与显示屏的边之间的相对角度;进一步地,第一子键盘的展示角度指示第一子键盘的边与显示屏的边之间的相对角度,第二子键盘的展示角度指示第二子键盘的边与显示屏的边之间的相对角度。若第一角度指示与第一手势操作对应的第一手势中手与显示屏的中心线之间的相对角度,第一展示角度指示虚拟键盘的底边与显示屏的中心线之间的相对角度;进一步地,第一子键盘的展示角度指示第一子键盘的边与显示屏的中心线之间的相对角度,第二子键盘的展示角度指示第二子键盘的边与显示屏的中心线之间的相对角度。
为更直观地理解本方案,请参阅图33,图33为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图。图33中以预设角度阈值的取值为30为例,图33包括(a)、(b)和(c)三个子示意图,图33的(a)子示意图代表与双手操作(第一手势操作的一种)对应的8个第一接触点,图33的(b)子示意图中分别示出了第一直线与显示屏的底边形成的第一子角度(也即左手的相对角度)和第二直线与显示屏的底边形成的第二子角度(也即右手的相对角度),第一子角度和第二子角度的数值均为32度。图33的(c)子示意图代表通过显示屏按照第一展示角度展示第一类型的虚拟键盘,应理解,图33中的示例仅为方便理解本方案,不用于限定本方案。
若第一类型的虚拟键盘为迷你键盘、数字键盘、功能性键盘或功能键键盘,则第一角度为单手的角度,第一展示角度为整个虚拟键盘的相对角度。
在另一种实现方式中,电子设备在获取到第一角度后,将第一类型的虚拟键盘的第一展示角度确定为第一角度,并通过显示屏按照第一角度展示第一类型的虚拟键盘,其中,若第一角度指示与第一手势操作对应的第一手势中手与显示屏的边之间的相对角度,第一展示角度指示虚拟键盘的底边与显示屏的边之间的相对角度;若第一角度指示与第一手势操作对应的第一手势中手与显示屏的中心线之间的相对角度,第一展示角度指示虚拟键盘的底边与显示屏的中心线之间的相对角度。
本申请实施例中,获取用户的手与显示界面的边或中心线之间的相对角度(也即第一角度),并根据第一角度确定虚拟键盘的展示角度,从而使得键盘的展示角度更加贴合用户手的放置角度,使得用户利用虚拟键盘进行输入的过程更加舒适和便捷。
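上述根据第一角度确定第一展示角度的一种实现方式(与预设角度阈值比较)可示意如下(预设角度阈值默认取30度,函数名与默认值均为示例):

```python
def display_angle(first_angle, threshold=30.0):
    # 第一角度大于或等于预设角度阈值时,以第一角度作为第一展示角度;
    # 否则按默认方向(与显示屏底边平行,即 0 度)展示虚拟键盘
    return first_angle if first_angle >= threshold else 0.0
```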
可选地,若电子设备确定第一手势参数为双手操作,也即确定第一类型的虚拟键盘为全键盘,则电子设备还会获取双手之间的距离,并判断双手之间的距离是否大于第一距离阈值,在双手之间的距离小于或等于第一距离阈值的情况下,通过显示屏,采用一体式的方式展示第一类型的虚拟键盘;在双手之间的距离大于第一距离阈值的情况下,通过显示屏的第二区域展示第一子键盘,通过显示屏的第三区域展示第二子键盘,其中,第二区域和第三区域为显示屏中的不同区域,第一子键盘和第二子键盘包括的为全键盘中不同的虚拟按键;第一距离阈值的取值可以为70毫米、75毫米、80毫米等,此处不做限定。
为更直观地理解本方案,请参阅图34,图34为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图。图34中以第一距离阈值的取值为75毫米为例,图34包括(a)和(b)两个子示意图,图34的(a)子示意图代表双手操作中两手之间的距离为80毫米,由于80毫米大于75毫米,图34的(b)子示意图代表分别在显示屏的第二区域展示第一子键盘,在显示屏的第三区域展示第二子键盘,应理解,图34中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,可以基于用户两手之间的距离来决定是采用一体式展示虚拟键盘,还是采用分离式的方式展示虚拟键盘,进一步提高了虚拟键盘的展示过程的灵活性,使得展示的虚拟键盘更加便于用户使用,进一步提高本方案的用户粘度。
进一步可选地,若电子设备确定第一手势参数为双手操作,也即确定第一类型的虚拟键盘为全键盘,则电子设备还会获取双手之间的距离,并判断双手之间的距离是否小于第四距离阈值,若双手之间的距离小于第四距离阈值,电子设备显示提示信息以指示用户调整双手之间的距离,和/或,电子设备直接采用一体式的方式显示全键盘;可选地,电子设备采用一体式的方式显示最小尺寸的全键盘。前述提示信息可以为文本提示、语音提示、振动提示或其他类型的提示等,此处不对提示信息的展示方式进行穷举。
为更直观地理解本方案,请参阅图35,图35为本申请实施例提供的虚拟键盘的处理方法中第一类型的虚拟键盘的一种示意图。图35包括(a)和(b)两个子示意图,图35的(a)子示意图代表双手操作中两手之间的距离为0毫米,由于双手之间的距离过小,图35的(b)子示意图中的B1代表提示信息,以提示用户双手距离过近,且采用一体式的方式显示全键盘,应理解,图35中的示例仅为方便理解本方案,不用于限定本方案。
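结合上文的第一距离阈值与第四距离阈值,根据双手之间的距离决定全键盘展示方式的逻辑可示意如下(阈值取值与返回值形式均为示例假设,仅为便于理解):

```python
def keyboard_layout(hand_distance_mm, first_threshold=75.0, fourth_threshold=20.0):
    # 返回 (展示方式, 是否提示双手距离过近)
    if hand_distance_mm < fourth_threshold:
        return "一体式", True    # 双手过近:显示提示信息,并采用一体式显示全键盘
    if hand_distance_mm > first_threshold:
        return "分离式", False   # 第二区域展示第一子键盘,第三区域展示第二子键盘
    return "一体式", False
```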
可选地,显示屏中还配置有多个振动反馈元件,若在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在显示屏的位置固定,则电子设备通过显示屏展示第一类型的虚拟键盘之后,电子设备还可以检测作用于显示屏上的第一接触操作,响应于第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息,第一位置信息与虚拟键盘上的第一虚拟按键对应。在第一虚拟按键为锚定点按键的情况下,电子设备从多个振动反馈元件中获取第一振动反馈元件,第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件;指示第一振动反馈元件发出振动波,以执行第一反馈操作,第一反馈操作用于提示第一虚拟按键为锚定点按键。需要说明的是,对于前述描述中各种名词的含义、步骤的具体实现方式以及带来的有益效果,均可以参阅上述实施例一中的描述,此处不做赘述。设置锚定点按键的目的是为了协助用户记住锚定点按键的位置,进而协助用户在各种类型的虚拟键盘上实现盲打,因此,每种类型的虚拟键盘上哪些虚拟按键为锚定点按键均可以灵活设定。
为进一步地理解本方案,以下结合上述示出的各种类型的虚拟键盘,来对每种类型的虚拟键盘中的锚定点按键进行举例。作为示例,例如第一类型的虚拟键盘为图24中示出的数字键盘,则锚定点按键可以为数字“5”指向的虚拟按键。作为另一示例,例如第一类型的虚拟键盘为图26示出的功能键键盘,则锚定点按键可以为Ctrl按键和Shift按键,应理解,此处举例仅为方便理解各种类型的虚拟键盘中存在的锚定点按键的概念,具体每种类型的虚拟键盘中将哪些虚拟按键设为锚定点按键,可以由开发人员结合实际应用场景灵活设定,也可以由用户进行自定义设定,此处均不做限定。
1706、电子设备获取针对功能键键盘中第一虚拟按键的接触操作。
本申请的一些实施例中,电子设备展示的第一类型的虚拟键盘为功能键键盘,则电子设备还可以获取到针对功能键键盘中一个或多个第一虚拟按键的接触操作,该接触操作可以为按压操作,也可以为触摸操作。作为示例,例如第一虚拟按键可以为Ctrl按键,也可以同时包括Ctrl按键和Shift按键等,此处不做限定。
1707、电子设备响应于接触操作,在显示屏上突出展示第二虚拟按键,第二虚拟按键为组合型的快捷键中除第一虚拟按键之外的按键。
本申请的一些实施例中,电子设备响应于接触操作,在显示屏上突出展示至少一个第二虚拟按键。其中,至少一个第二虚拟按键中的每个第二虚拟按键能够和第一虚拟按键组成快捷键,第二虚拟按键为组合型的快捷键中除第一虚拟按键之外的按键;突出展示包括但不限于高亮展示、加粗展示、闪烁展示等,此处不做限定。作为示例,例如在绘图类的应用程序中,Ctrl按键+Shift按键+I按键的组合按键能够提供对当前处理的图像进行反相显示的功能,则第一虚拟按键包括Ctrl按键和Shift按键,第二虚拟按键为虚拟按键I;其中,对当前处理的图像进行反相显示指的是将当前处理的图像的颜色换成它的补色,应理解,此处举例仅为方便理解本方案,不用于限定本方案。
为更直观地理解本方案,请参阅图36,图36为本申请实施例提供的虚拟键盘的处理方法中第二虚拟按键的一种示意图。图36以当前应用为绘图类应用为例,图36包括(a)、(b)、(c)和(d)四个子示意图,图36的(a)子示意图代表显示屏上展示有功能键键盘。图36的(b)子示意图代表用户对Ctrl按键和Shift按键执行按压操作,从而触发电子设备在显示屏上突出展示字母I所在的按键。图36的(c)子示意图代表用户点击字母I所在的按键,从而触发进入图36的(d)的子示意图,也即对当前显示的图像进行反相显示,应理解,图36中的示例仅为方便理解本方案,不用于限定本方案。
可选地,电子设备响应于接触操作,在显示屏上突出展示第二虚拟按键,还显示与每个第二虚拟按键对应的快捷键的功能。
为更直观地理解本方案,请参阅图37,图37为本申请实施例提供的虚拟键盘的处理方法中第二虚拟按键的一种示意图。图37以当前应用为文稿展示类应用,且虚拟键盘以悬浮的方式显示于文稿的展示界面上为例,图37包括(a)、(b)和(c)三个子示意图,图37的(a)子示意图代表显示屏上展示有功能键键盘。图37的(b)子示意图代表用户对Ctrl按键执行按压操作,从而触发进入图37的(c)的子示意图,也即电子设备在显示屏上突出展示多个第二虚拟按键,还显示与每个第二虚拟按键对应的快捷键的功能,分别用于启动保存(save)、剪切(对应图37中的剪刀图标)、复制(copy)、粘贴和插入(insert) 这5种快捷功能,应理解,图37中的示例仅为方便理解本方案,不用于限定本方案。
本申请实施例中,在显示屏中展示功能键键盘的过程中,获取针对功能键键盘中第一虚拟按键的接触操作,响应于该接触操作,在显示屏上突出展示第二虚拟按键,第二虚拟按键为组合型的快捷键中除第一虚拟按键之外的按键,由于功能键键盘占用面积小,从而减少了显示虚拟键盘所需要的面积,且在用户对功能键键盘中第一虚拟按键执行接触操作时,又能自动展示组合型的快捷键中的第二虚拟按键,从而保证了用户对快捷键的需求,也避免了对显示屏的显示面积的浪费。
为更直观地理解本方案,请参阅图38,图38为本申请实施例提供的虚拟键盘的处理方法的一种流程示意图,图38中以本申请实施例应用于文字编辑的应用程序中为例,图38包括(a)、(b)、(c)和(d)四个子示意图,在图38的(a)子示意图中,电子设备获取到与双手操作对应的第一手势参数,根据第一规则和与双手操作对应的第一手势参数,获取与双手操作对应的全键盘,通过显示屏展示全键盘,用户通过全键盘输入内容“主料低筋面粉:”。在图38的(b)子示意图中,电子设备检测到用户抬起一只手,停止在显示屏上展示全键盘,电子设备获取与右手单手操作对应的第一手势参数,根据第一规则和与右手单手操作对应的第一手势参数,获取与右手单手操作对应的数字键盘,通过显示屏展示数字键盘,也即如图38的(c)子示意图所示,数字键盘显示于用户的手的下方,用户通过数字键盘输入内容“145”。如图38的(d)子示意图所示,在数字键盘的展示过程中,电子设备检测到用户的手在显示屏的上方移动,电子设备获取手的移动轨迹,并控制数字键盘随着用户的手的移动而移动,当用户通过显示屏输入双击操作时,数字键盘的位置固定,需要说明的是,图38的示例仅为方便理解如何在多种类型的虚拟键盘之间进行切换的方式,不用于限定本方案。
本申请实施例中,电子设备中配置有多个不同类型的虚拟键盘,不同类型的虚拟键盘包括的虚拟按键不完全相同,用户能够实现通过不同的手势操作唤起不同类型的虚拟键盘,也即虚拟键盘不再是只能展示26个字母,而是通过不同类型的虚拟键盘向用户提供更多的虚拟按键,不仅提高了用户唤起虚拟键盘的过程中的灵活性,而且有利于提供更丰富的虚拟按键,从而不再需要提供额外的实体键盘。
在图1至图38所对应的实施例的基础上,为了更好的实施本申请实施例的上述方案,下面还提供用于实施上述方案的相关设备。请参阅图39,图39为本申请实施例提供的电子设备的一种结构示意图。电子设备1包括显示屏50、存储器40、一个或多个处理器10以及一个或多个程序401,图39中的显示屏50与图1至图29中的触控屏幕20可以为相同的元件,一个或多个程序401被存储在存储器40中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备执行以下步骤:响应于检测到的第一手势操作,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;通过显示屏50展示第一类型的虚拟键盘。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备具体执行以下步骤:根据第一规则,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,第一规则指示多个类型的手势操作与多个类型的虚拟键盘之间的对应关系。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备还执行以下步骤:获取与第一手势操作对应的第一手势参数,其中,第一手势参数包括以下中任一项或多项参数:与第一手势操作对应的接触点的位置信息、与第一手势操作对应的接触点的数量信息、与第一手势操作对应的接触点的面积信息、与第一手势操作对应的手的相对角度信息、与第一手势操作对应的手的位置信息、与第一手势操作对应的手的数量信息和与第一手势操作对应的手的形状信息。一个或多个处理器10在执行一个或多个程序401时,使得电子设备具体执行以下步骤:根据第一手势参数,从多个类型的虚拟键盘中选取第一类型的虚拟键盘。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备还执行以下步骤:响应于第一手势操作,获取第一角度,第一角度指示与第一手势操作对应的手与显示屏50的边之间的相对角度,或者,第一角度指示与第一手势操作对应的手与显示屏50的中心线之间的相对角度。一个或多个处理器10在执行一个或多个程序401时,使得电子设备具体执行以下步骤:根据第一角度,获取第一类型的虚拟键盘的展示角度,并通过显示屏50按照展示角度展示第一类型的虚拟键盘,展示角度指示第一类型的虚拟键盘的边与显示屏50的边之间的相对角度,或者,展示角度指示第一类型的虚拟键盘的边与显示屏50的中心线之间的相对角度。
在一种可能的设计中,多个类型的虚拟键盘中不同类型的虚拟键盘的功能不同,不同功能的虚拟键盘包括以下中任意两种或多种虚拟键盘的组合:数字键盘、功能键键盘、全键盘和自定义键盘,功能键键盘由功能键组成。
在一种可能的设计中,在第一手势操作为单手操作的情况下,第一类型的虚拟键盘为以下中的任一种虚拟键盘:迷你键盘、数字键盘、功能性键盘、功能键键盘、圆形键盘、弧形键盘、自定义键盘,其中,迷你键盘包括26个字母按键,功能性键盘展示于应用程序中,功能性键盘包括的虚拟按键与应用程序的功能对应。
在一种可能的设计中,在第一手势操作为双手操作的情况下,第一类型的虚拟键盘为全键盘,全键盘至少包括26个字母按键;一个或多个处理器10在执行一个或多个程序401时,使得电子设备具体执行以下步骤:在双手之间的距离小于或等于第一距离阈值的情况下,通过显示屏50,采用一体式的方式展示全键盘;在双手之间的距离大于第一距离阈值的情况下,通过显示屏50的第二区域展示第一子键盘,通过显示屏50的第三区域展示第二子键盘,其中,第二区域和第三区域为显示屏50中的不同区域,第一子键盘和第二子键盘包括的为全键盘中不同的虚拟按键。
在一种可能的设计中,单手操作包括左手单手操作和右手单手操作;在第一手势操作为右手单手操作的情况下,第一类型的虚拟键盘为数字键盘;在第一手势操作为左手单手操作的情况下,第一类型的虚拟键盘为功能性键盘。
在一种可能的设计中,显示屏50中配置有多个振动反馈元件,在第一类型的虚拟键盘的展示过程中,第一类型的虚拟键盘在显示屏50上的位置固定,一个或多个处理器10在执行一个或多个程序401时,使得电子设备还执行以下步骤:检测作用于显示屏50上的第一接触操作;响应于第一接触操作,获取与第一接触操作对应的第一接触点的第一位置信息,第一位置信息与虚拟键盘上的第一虚拟按键对应;在第一虚拟按键为锚定点按键的情况下,从多个振动反馈元件中获取第一振动反馈元件,第一振动反馈元件为与第一虚拟按键匹配的振动反馈元件;指示第一振动反馈元件发出振动波,以执行第一反馈操作,第一反馈操作用于提示第一虚拟按键为锚定点按键。
需要说明的是,电子设备1中各模块/单元之间的信息交互、执行过程等内容,与本申请中图17至图38对应的各个方法实施例基于同一构思,具体内容可参见本申请前述所示的方法实施例中的叙述,此处不再赘述。
本申请实施例还提供了一种电子设备,请参阅图40,图40为本申请实施例提供的电子设备的一种结构示意图,电子设备1具体可以表现为手机、平板、笔记本电脑或者其他配置有显示屏的设备等,此处不做限定。其中,电子设备1上可以部署有图39对应实施例中所描述的电子设备,用于实现图17至图38对应实施例中电子设备的功能。具体的,电子设备1可因配置或性能不同而产生比较大的差异,可以包括一个或一个以上中央处理器(central processing units,CPU)1522(例如,一个或一个以上处理器)和存储器40,一个或一个以上存储应用程序1542或数据1544的存储介质1530(例如一个或一个以上海量存储设备)。其中,存储器40和存储介质1530可以是短暂存储或持久存储。存储在存储介质1530的程序可以包括一个或一个以上模块(图示没标出),每个模块可以包括对电子设备中的一系列指令操作。更进一步地,中央处理器1522可以设置为与存储介质1530通信,在电子设备1上执行存储介质1530中的一系列指令操作。
电子设备1还可以包括一个或一个以上电源1526,一个或一个以上有线或无线网络接口1550,一个或一个以上输入输出接口1558,和/或,一个或一个以上操作系统1541,例如Windows ServerTM,Mac OS XTM,UnixTM,LinuxTM,FreeBSDTM等等。
本申请实施例中,中央处理器1522,用于实现图17至图38对应实施例中电子设备的功能。需要说明的是,对于中央处理器1522执行图17至图38对应实施例中电子设备的功能的具体实现方式以及带来的有益效果,均可以参考图17至图38对应的各个方法实施例中的叙述,此处不再一一赘述。
本申请实施例中还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行如前述图17至图38所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例中还提供一种计算机程序,当其在计算机上运行时,使得计算机执行如前述图17至图38所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例中还提供一种电路系统,所述电路系统包括处理电路,所述处理电路配置为执行如前述图17至图38所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例提供的电子设备具体可以为芯片,芯片包括:处理单元和通信单元,所述处理单元例如可以是处理器,所述通信单元例如可以是输入/输出接口、管脚或电路等。该处理单元可执行存储单元存储的计算机执行指令,以使芯片执行上述图17至图38所示实施例描述的方法中电子设备所执行的步骤。可选地,所述存储单元为所述芯片内的存储单元,如寄存器、缓存等,所述存储单元还可以是所述无线接入设备端内的位于所述芯片外部的存储单元,如只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)等。
其中,上述任一处提到的处理器,可以是一个通用中央处理器,微处理器,ASIC,或一个或多个用于控制上述第一方面方法的程序执行的集成电路。
另外需说明的是,以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。另外,本申请提供的装置实施例附图中,模块之间的连接关系表示它们之间具有通信连接,具体可以实现为一条或多条通信总线或信号线。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本申请可借助软件加必需的通用硬件的方式来实现,当然也可以通过专用硬件包括专用集成电路、专用CPU、专用存储器、专用元器件等来实现。一般情况下,凡由计算机程序完成的功能都可以很容易地用相应的硬件来实现,而且,用来实现同一功能的具体硬件结构也可以是多种多样的,例如模拟电路、数字电路或专用电路等。但是,对本申请而言更多情况下软件程序实现是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在可读取的存储介质中,如计算机的软盘、U盘、移动硬盘、ROM、RAM、磁碟或者光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序的形式实现。
所述计算机程序包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(Solid State Disk,SSD))等。
实施例三:
本申请实施例提供的应用界面的处理方法可应用于图41示出的电子设备中,请参阅图41,图41为本申请实施例提供的电子设备的一种结构示意图。电子设备包括第一显示屏501和第二显示屏502。其中,第一显示屏501与第二显示屏502的区别在于:第二显示屏502为用于获取用户的手写输入的显示屏,第一显示屏501不是用于获取用户的手写输入的显示屏。则第二显示屏502为触控屏幕,第二显示屏502需要同时具有接收输入和显示输出的功能。应理解,图41中仅以电子设备包括一个第一显示屏501和一个第二显示屏502为例,但实际情况中,一个电子设备还可以包括至少两个第一显示屏501,或者包括至少两个第二显示屏502等,具体电子设备中包括的第一显示屏501和第二显示屏502的数量可以根据实际应用场景确定,此处不做限定。
在一种实现方式中,电子设备在包括的至少两个显示屏中预先设定好用于获取用户的手写输入的显示屏(也即第二显示屏)和不用于获取用户的手写输入的显示屏(也即第一显示屏),从而用户可以将预先设定好的第二显示屏放置于方便用户手写的位置上。
在另一种实现方式中,电子设备根据包括的至少两个显示屏中每个显示屏的放置方向来确定用于获取用户的手写输入的显示屏(也即第二显示屏)和不用于获取用户的手写输入的显示屏(也即第一显示屏)。具体的,电子设备可以获取至少两个显示屏中每个显示屏的放置角度与水平方向的夹角,进而电子设备可以从包括的至少两个显示屏中选取与水平方向的夹角最小的一个显示屏作为第二显示屏502,将该至少两个显示屏中的剩余显示屏作为第一显示屏501。电子设备也可以从电子设备包括的至少两个显示屏中选取与水平方向的夹角小于第一角度阈值的至少一个显示屏作为第二显示屏502,将该至少两个显示屏中的剩余显示屏作为第一显示屏501,第一角度阈值可以为25度、30度、40度或其他取值等等,此处不做穷举。
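上述按放置角度从多个显示屏中选取用于手写输入的显示屏的过程可示意如下(屏幕用"(标识, 与水平方向的夹角)"元组表示,仅为帮助理解的简化实现,非本申请的限定):

```python
def pick_screens(screens):
    # screens: [(屏幕标识, 与水平方向的夹角/度), ...]
    # 夹角最小者作为第二显示屏(用于获取手写输入),其余作为第一显示屏
    second = min(screens, key=lambda s: s[1])
    firsts = [s for s in screens if s is not second]
    return second, firsts
```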
进一步地,在一种情况下,第一显示屏501和第二显示屏502可以为互相独立的屏幕,第一显示屏501和第二显示屏502之间通过数据接口连接,或者,第一显示屏501和第二显示屏502之间通过总线连接。在另一种情况下,第一显示屏501和第二显示屏502集成于一个柔性屏幕中,第一显示屏501和第二显示屏502分别为该柔性屏幕中两个不同的区域。
可选地,电子设备中还可以配置有电子笔,该电子笔具体可以采用电磁式触控屏(electro magnetic resonance technology,EMR)技术的电子笔、主动式静电感应(active electrostatic solution,AES)技术的电子笔或其他类型的电子笔等,此处不做限定。
基于图41示出的电子设备,以下对本申请实施例的应用场景进行介绍。作为示例,例如学生上课记笔记的应用场景中,在通过虚拟键盘(也即当前的应用界面采用键盘输入的输入模式)进行打字记笔记的过程中,学生可能会需要采用手写输入的输入模式,以将黑板上的示意图画下来。作为另一示例,例如用户在浏览小说或图片(也即当前的应用界面采用浏览模式)的过程中,用户可能会需要采用手写输入的输入模式,以在小说或图片上添加批注。作为再一示例,例如用户在通过虚拟键盘(也即当前的应用界面采用键盘输入的输入模式)撰写报告的过程中,可能会突然有想法要用笔画下来(也即当前的应用界面需要采用手写输入的输入模式)等,此处不对本申请实施例的应用场景进行穷举。在前述种种场景中,均存在手写输入过程中操作繁琐的问题。
为了解决上述问题,本申请实施例提供了一种应用界面的处理方法,该应用界面的处理方法应用于图41示出的电子设备中,电子设备通过第一显示屏展示第一应用界面,在检测到与第一应用界面对应的模式类型为手写输入的情况下,响应于该手写输入的输入模式,触发在第二显示屏上展示第一应用界面,以通过第二显示屏获取针对第一应用界面的手写内容,也即当检测到当前的与第一应用界面对应的模式类型为手写输入的输入模式的情况下,电子设备就会自动将第一应用界面在第二显示屏上展示,以通过第二显示屏直接获取针对第一应用界面的手写内容,也即不需要再执行复制和粘贴等步骤,就可以直接完成其他模式到手写输入的转换,避免了繁琐的步骤,大大提高了手写输入的效率。
进一步地,在一些应用场景中,一个应用界面只能够在键盘输入的输入模式与手写输入的输入模式之间进行切换;在另一些应用场景中,一个应用界面只能够在键盘输入的输入模式与浏览模式之间进行切换,由于前述两种应用场景下,具体的实现流程有所不同,以下分别进行描述。
一、在键盘输入的输入模式与手写输入的输入模式之间切换。
本申请实施例中,请参阅图42,图42为本申请实施例提供的应用界面的处理方法的一种流程示意图,本申请实施例提供的应用界面的处理方法可以包括:
4201、电子设备获取针对第一应用界面的启动操作。
本申请实施例中,电子设备获取针对第一应用界面的启动操作,其中,目标应用程序(application,APP)可以包括一个或多个应用界面,第一应用界面指的是目标应用程序包括的至少一个应用界面中的任一个应用界面,也即第一应用界面指的可以为打开目标应用程序时出现的界面,也可以为在使用目标应用程序的过程中打开的新界面。
具体的,步骤4201可以包括:电子设备通过第一显示屏获取针对第一应用界面的启动操作,或者,电子设备通过第二显示屏获取针对第一应用界面的启动操作。进一步地,电子设备为通过电子笔、鼠标或手指获取到针对第一应用界面的启动操作。
4202、电子设备基于启动操作,确定与第一应用界面对应的模式类型。
本申请的一些实施例中,电子设备基于获取到的启动操作,确定与第一应用界面对应的模式类型,与第一应用界面对应的模式类型为手写输入或键盘输入。
在一种实现方式中,电子设备根据与该启动操作对应的获取位置,确定与第一应用界面对应的模式类型。具体的,在启动操作为通过第一显示屏获取到的情况下,则可以证明用户往往在第一显示屏上展示该第一应用界面,则电子设备确定与第一应用界面对应的模式类型为键盘输入,也即将第一应用界面的初始模式类型确定为键盘输入。在启动操作为通过第二显示屏获取到的情况下,则证明用户往往在第二显示屏上使用该第一应用界面,电子设备确定与第一应用界面对应的模式类型为手写输入,也即将第一应用界面的初始模式类型确定为手写输入。其中,第一显示屏与第二显示屏的区别可参阅上述对于图41的介绍,此处不做赘述。
本申请实施例中,基于电子设备获取启动操作的位置,来确定与第一应用界面对应的模式类型,操作简单,易于实现。
在另一种实现方式中,电子设备根据与该启动操作对应的启动方式,确定与第一应用界面对应的模式类型。具体的,在该启动操作为通过电子笔获取到的情况下,电子设备确定与第一应用界面对应的模式类型为手写输入;作为示例,例如电子设备获取到用户是通过电子笔点击目标应用程序的应用图标,以打开第一应用界面的,则电子设备可以确定与第一应用界面对应的模式类型为手写输入。在启动操作为通过鼠标或手指获取到的情况下,确定与第一应用界面对应的模式类型为键盘输入。
本申请实施例中,提供了基于启动操作,确定与第一应用界面对应的模式类型的又一种实现方式,有利于提高本方案的实现灵活性,且操作简单,易于实现。
在另一种实现方式中,电子设备可以根据该启动操作对应的获取位置和与该启动操作对应的启动方式,来确定与第一应用界面对应的模式类型。具体的,在一种情况下,在该启动操作为通过电子笔获取到的情况下,或者,在该启动操作为通过第二显示屏获取到的情况下,电子设备均将与第一应用界面对应的模式类型确定为手写输入。在该启动操作为通过鼠标或手指获取到的,且该启动操作为通过第一显示屏获取到的情况下,电子设备将与第一应用界面对应的模式类型确定为键盘输入。
在一种情况下,在该启动操作为通过电子笔获取到的,且该启动操作为通过第二显示屏获取到的情况下,电子设备均将与第一应用界面对应的模式类型确定为手写输入。在该启动操作为通过鼠标或手指获取到的情况下,或者,在该启动操作为通过第一显示屏获取到的情况下,电子设备将与第一应用界面对应的模式类型确定为键盘输入。
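上述结合获取位置与启动方式确定初始模式类型的组合判断,可示意如下(对应前述第一种组合情况;函数名与字符串取值均为示例,仅为帮助理解):

```python
def initial_mode(source_screen, input_device):
    # 通过电子笔获取到启动操作,或通过第二显示屏获取到启动操作 -> 手写输入;
    # 通过鼠标或手指、且通过第一显示屏获取到启动操作 -> 键盘输入
    if input_device == "电子笔" or source_screen == "第二显示屏":
        return "手写输入"
    return "键盘输入"
```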
需要说明的是,电子设备还可以通过其他方式来确定与第一应用界面的初始模式类型,此处不再一一进行列举。
4203、电子设备判断与第一应用对应的模式类型是否为手写输入,若与第一应用对应的模式类型为键盘输入,则进入步骤4204;若与第一应用对应的模式类型为手写输入,则进入步骤4211。
4204、电子设备响应于键盘输入的输入模式,触发在第一显示屏上展示第一应用界面,并在第二显示屏上展示虚拟键盘。
本申请实施例中,在电子设备确定与第一应用界面对应的模式类型不是手写输入而是键盘输入的情况下,电子设备响应于键盘输入的输入模式,触发在第一显示屏上展示第一应用界面,并在第二显示屏上展示虚拟键盘,以通过第二显示屏上的虚拟键盘获取针对第一应用界面的输入内容。
进一步地,若在第二显示屏上展示虚拟键盘时,第二显示屏上还展示有其他应用界面的手写输入的接收界面,则一种实现方式中,电子设备可以在第二显示屏的顶端或底端设置有与每个应用一一对应的开启图标,从而在第二显示屏展示的虚拟键盘与其他应用界面之间进行切换。为更直观地理解本方案,请参阅图43,图43为本申请实施例提供的应用界面的处理方法中第二显示屏的展示界面的一种界面示意图。其中,图43中示出了应用界面1的开启图标、应用界面2的开启图标和虚拟键盘的展示界面(与第一显示屏上示出的第一应用界面对应),从而用户能够通过点击应用界面1的开启图标的方式,来实现虚拟键盘与应用界面1之间的切换;用户能够通过点击应用界面2的开启图标的方式,来实现虚拟键盘与应用界面2之间的切换,应理解,图43中的示例仅为方便理解本方案,不用于限定本方案。
在另一种实现方式中,电子设备也可以在第二显示屏的虚拟键盘的展示界面上设置有缩放图标,当用户通过电子笔、手指或鼠标等点击该缩小图标时,第二显示屏上展示的虚拟键盘收起;当用户通过电子笔、手指或鼠标等点击该放大图标时,第二显示屏上展示的虚拟键盘展开。在另一种实现方式中,用户也可以通过对第二显示屏输入滑动操作的方式,以在第二显示屏展示的虚拟键盘与其他应用界面之间进行切换,该滑动操作可以为左右方向的滑动操作、上下方向的滑动操作等等,电子设备还可以采用其他方式以实现虚拟键盘与其他应用界面之间进行切换,此处不做穷举。
为了更直观地理解本方案,请参阅图44和图45,图44和图45分别为本申请实施例提供的应用界面的处理方法中的一种流程示意图。先参阅图44,图44包括(a)和(b)两个子示意图,在图44的(a)子示意图中,电子设备通过第一显示屏获取到对于目标应用程序(也即图示中的“笔记”这一应用程序)的开启操作,由于该启动操作为通过第一显示屏输入的,电子设备确定与第一应用界面对应的模式类型为键盘输入,则电子设备进入图44的(b)子示意图中,电子设备在第一显示屏上展示第一应用界面(也即“笔记”这一应用程序的初始应用界面),在第二显示屏上展示虚拟键盘和触控板区。
请继续参阅图45,图45包括(a)和(b)两个子示意图,在图45的(a)子示意图中,电子设备通过第一显示屏获取到对于目标应用程序(也即图示中的“笔记”这一应用程序)的开启操作,由于该启动操作为通过手指获取到的,电子设备确定与第一应用界面对应的模式类型为键盘输入,则电子设备进入图45的(b)子示意图中,电子设备在第一显示屏上展示第一应用界面,在第二显示屏上展示虚拟键盘和触控板区,需要说明的是,图44和图45的第二显示屏上也可以仅展示虚拟键盘,而不展示触控板区,应理解,图44和图45中的示例仅为方便理解本方案,不用于限定本方案。
可选地,步骤4204可以包括:在第二显示屏上展示虚拟键盘和应用控制栏。方法还可以包括:电子设备检测到作用于第二显示屏的第二操作;响应于第二操作将应用控制栏的第一显示面积改变为第二显示面积,并将应用控制栏包括的第一控制键组改变为第二控制键组,第一控制键组和第二控制键组均为对应于目标应用的控制键集合。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均会在后续实施例四中进行描述,此处不做赘述。
可选地,第一应用界面包括第一控制键,步骤4204可以包括:在第二显示屏上展示虚拟键盘和应用控制栏。方法还可以包括:电子设备检测到对于第一目标应用界面的第二操作;响应于第二操作,在应用控制栏中显示第一控制键,并隐藏第一应用界面中的第一控制键。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均会在后续实施例四中进行描述,此处不做赘述。
可选地,步骤4204可以包括:电子设备在第二显示屏上展示第二类型的虚拟键盘(也可以称为默认类型的虚拟键盘)。方法还包括:电子设备检测到作用于第二显示屏的第一手势操作;响应于第一手势操作,从多个类型的虚拟键盘中选取与第一手势操作对应的第一类型的虚拟键盘,其中,多个类型的虚拟键盘中不同类型的虚拟键盘包括的虚拟按键不完全相同;通过第二显示屏展示第一类型的虚拟键盘,第一类型的虚拟键盘和第二类型的虚拟键盘为多个类型的虚拟键盘中的不同类型的虚拟键盘。也即电子设备在第二显示屏上展示第二类型的虚拟键盘之后,用户可以通过输入不同的手势操作,以改变第二显示屏上展示的虚拟键盘的类型。对于第一手势操作、不同类型的虚拟键盘等名词的含义,以及前述步骤的具体实现方式,均可参阅实施例二中的描述,此处不做赘述。
4205、电子设备获取与第一应用界面对应的模式类型。
本申请实施例中,电子设备在打开第一应用界面之后,也即在第一应用界面的运行过程中,电子设备还会实时检测并获取与第一应用界面对应的模式类型,以确定与第一应用界面对应的模式类型是否发生变化。具体的,若电子设备能检测到第一操作,则响应于第一操作,将与第一应用界面对应的模式类型转变为手写输入;若电子设备未检测到第一操作,则与第一应用界面对应的模式类型为键盘输入,电子设备会持续检测并获取与第一应用界面对应的模式类型。
更具体的,在一种实现方式中,电子设备根据用户对电子笔的握持姿势,来确定与第一应用界面对应的模式类型。具体的,在一种情况下,电子设备中预先存储有第一预设条件,电子设备会实时获取用户对电子笔的握持姿势,并判断用户对电子笔的握持姿势是否满足第一预设条件,在用户对电子笔的握持姿势满足第一预设条件的情况下,电子设备确定检测到用户的第一操作,进而将与第一应用界面对应的模式类型转变为手写输入的输入模式;在用户对电子笔的握持姿势不满足第一预设条件的情况下,确定与第一应用界面对应的模式类型为键盘输入的输入模式。
其中,握持姿势包括以下中的任一项或多项的组合:握持位置、握持力度、握持角度或其他握持相关的因素等,此处不做限定,第一预设条件包括以下中的任一项或多项的组合:握持位置位于第一位置范围内、握持力度位于第一力度范围内、握持角度位于第一角度范围内或其他预设条件等。
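判断握持姿势是否满足第一预设条件的过程可示意如下(grip 与 first_condition 的字段名、各区间的取值均为本示例的假设,仅用于帮助理解,不用于限定本方案):

```python
def is_writing_grip(grip, first_condition):
    # grip: {"position": 握持位置, "force": 握持力度, "angle": 握持角度}
    # first_condition: 每个字段对应一个 (下限, 上限) 区间,
    # 各项均落在对应区间内时,判定握持姿势满足第一预设条件
    return all(lo <= grip[key] <= hi for key, (lo, hi) in first_condition.items())
```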
本申请实施例中,由于电子笔除了用于进行书写之外,还可以执行其他操作,例如通过电子笔执行一些通过鼠标执行的操作,作为示例,例如滑动操作、选择操作等等。或者,用户可能只是无意识地拿着电子笔,而不是想执行书写操作等,此处不做穷举。电子设备不会在用户使用电子笔的时候,就粗略地将与第一应用界面对应的模式类型确定为手写输入,而是进一步根据用户对电子笔的握持姿势,来确定与第一应用界面对应的模式类型,从而降低与第一应用界面对应的模式类型的判断过程的错误率,以降低对第一应用界面进行错误放置的概率,既避免对计算机资源的浪费,又有利于提高用户粘度。
进一步地,电子笔可以被配置于电子设备中,当电子笔被用户从电子设备中拿出后,电子笔和电子设备之间可以配置有通信接口,电子笔可以实时采集与握持姿势对应的握持参数,并将握持参数发送给电子设备,以由电子设备来确定用户对电子笔的握持姿势是否满足第一预设条件。该握持参数包括以下中的任一项或多项的组合:与握持操作对应的接触点的位置、握持力度、电子笔的倾斜角度或其他参数等。
电子笔可以设置接触感知模块,电子笔的接触感知模块实时采集到用户与电子笔之间每个接触点的位置(也即确定用户对电子笔的握持位置),并将该每个接触点的位置发送给电子设备,电子设备根据每个接触点的位置,判断用户对电子笔的握持位置是否位于第一位置范围内。该接触感知模块具体可以表现为接触感知薄膜,接触感知薄膜具体可以为电容式接触感知薄膜、压力式接触感知薄膜、温度式接触感知薄膜或其他类型的薄膜等,此处均不做穷举。
电子笔中可以设置有压力感知模块,电子笔的压力感知模块实时采集用户对于电子笔的握持力度,并将用户对电子笔的握持力度发送给电子设备,由电子设备判断用户对电子笔的握持力度是否位于第一力度范围内。该压力感知模块具体可以表现为压力感知薄膜、分布式压力传感器或表现为其他形式,此处不做穷举。
电子笔中可以设置有角度测量模块,电子笔的角度测量模块实时采集电子笔的倾斜角度(也即确定用户对电子笔的握持角度),并将电子笔的倾斜角度发送给电子设备,由电子设备判断用户对电子笔的握持角度是否位于第一角度范围内,该角度测量模块具体可以表现为陀螺仪、或其他类型的角度测量模块等,此处不做限定。
更进一步地,在一种实现方式中,电子设备可以提前录入用户利用电子笔进行手写输入时的握持姿势,进而根据用户录入的前述握持姿势确定第一预设条件;可选地,电子设备还可以在用户使用电子笔进行书写的过程采集用户的握持姿势,也即采集用户的手指与电子笔之间接触点的位置、用户的握持力度、电子笔的倾斜角度等等,以对第一预设条件进行调整。在另一种实现方式中,电子设备中的第一预设条件可以为预先设定好的。
为更直观地理解本方案,请参阅图46,图46为本申请实施例提供的应用界面的处理方法中各种握持姿势的一种示意图。图46中示出了(a)、(b)、(c)、(d)、(e)和(f)六个子示意图,其中,图46的(a)子示意图、(b)子示意图、(c)子示意图和(d)子示意图分别示出了用户在利用电子笔进行书写时的四种握持姿势,图46的(e)子示意图和(f)子示意图示出的为用户虽然握持电子笔,但不是用于进行书写时的两种姿势,应理解,图46中的示例仅为方便理解用户对电子笔的握持手势这个概念,不用于限定本方案。
在另一种实现方式中,电子设备在第一应用界面或虚拟键盘的展示界面上可以设置分别与键盘输入的输入模式与手写输入的输入模式一一对应的触发图标,当用户点击第一应用界面上与手写输入对应的图标时,电子设备能获取到针对手写输入的触发指令,也即电子设备检测到用户的第一操作;当用户点击第一应用界面上与键盘输入对应的图标时,电子设备能获取到针对键盘输入的触发指令。或者,电子设备在第一应用界面上可以设置有在键盘输入与手写输入的输入模式之间进行切换的切换图标,当该切换图标处于第一状态时,视为用户输入针对手写输入的触发操作;当该切换图标处于第二状态时,视为用户输入针对键盘输入的触发操作等,此处不对电子设备获取针对手写输入的触发指令的方式进行穷举。电子设备响应于该针对手写输入的触发指令,确定与第一应用界面对应的模式类型为手写输入的输入模式。
为更直观地理解本方案,请参阅图47和图48,图47为本申请实施例提供的应用界面的处理方法中第一应用界面的一种界面示意图,图48为本申请实施例提供的应用界面的处理方法中第一应用界面的两种界面示意图。先参阅图47,第一应用界面上设置有两个图标,C1代表与键盘输入的输入模式对应的触发图标,C2代表与手写输入的输入模式对应的触发图标,则当用户通过第一应用界面点击C2时,电子设备能获取到针对手写输入的触发指令。
再参阅图48,图48包括(a)和(b)两个子示意图,在图48的(a)子示意图和(b)子示意图中,D1均代表用于在键盘输入与手写输入的输入模式之间进行切换的切换图标,在图48的(a)子示意图中,前述切换图标处于第一状态,则与第一应用界面对应的模式类型为键盘输入的输入模式,在图48的(b)子示意图中,前述切换图标处于第二状态,则与第一应用界面对应的模式类型为手写输入的输入模式,应理解,图47和图48中的示例均仅为方便理解本方案,不用于限定本方案。
在另一种实现方式中,电子设备还可以通过第一显示屏上展示的第一应用界面,或者,通过第二显示屏上展示的虚拟键盘的界面来获取用户输入的第一接触操作,在检测到第一接触操作的情况下,确定检测到第一操作,进而将与第一应用界面对应的模式类型转变为手写输入的输入模式。其中,第一接触操作为点击操作或预设轨迹操作;进一步地,该第一接触操作可以为单击操作、双击操作、三击操作、长按操作、“Z”字型的轨迹操作、下滑操作、“对勾”形的轨迹操作、“圆圈”形的轨迹操作或其他接触操作等等,此处不进行穷举。
更进一步地,步骤4205可以包括:电子设备通过第二显示屏获取第一方向的滑动操作,第一方向的滑动操作为从第二显示屏的上边沿向第二显示屏的下边沿滑动的滑动操作,第二显示屏的上边沿与第一显示屏之间的距离比第二显示屏的下边沿与第一显示屏之间的距离近。电子设备响应于第一方向的滑动操作,第二显示屏上展示的虚拟键盘沿第一方向向第二显示屏的下边沿移动,在虚拟键盘的上边沿抵达第二显示屏的下边沿时,确认与第一应用界面对应的模式类型转变为手写输入。本申请实施例中,显示于第二显示屏上的虚拟键盘能够伴随用户的向下滑动操作,并在虚拟键盘的上边沿抵达第二显示屏的下边沿时,电子设备确认与第一应用界面对应的模式类型转变为手写输入,增加了键盘输入至手写输入过程的趣味性,有利于提高用户粘度。
为更直观地理解本方案,请参阅图49,图49为本申请实施例提供的应用界面的处理方法中第一接触操作的一种示意图。图49包括(a)、(b)和(c)三个子示意图,图49中以第一接触操作为通过第二显示屏输入的下滑操作为例。如图49的(a)子示意图和图49的(b)子示意图所示,当用户通过第二显示屏上的虚拟键盘的展示界面输入下滑操作时,第二显示屏上的虚拟键盘被收起,当第二显示屏上的虚拟键盘被完全收起,视为通过第二显示屏获取到第一接触操作,电子设备确定与第一应用界面对应的模式类型为手写输入的输入模式,从而触发进入图49的(c)子示意图,也即触发在第二显示屏上展示第一应用界面,应理解,图49中的示例均仅为方便理解本方案,不用于限定本方案。
在另一种实现方式中,在通过第一显示屏展示第一应用界面的过程中,电子设备可以实时检测电子笔与第二屏幕之间的距离,当发现电子笔位于第二显示屏的预设范围内的情况下,确定检测到第一操作,将与第一应用界面对应的模式类型转变为手写输入的输入模式。第二显示屏的预设范围可以指的是第二显示屏的正上方的3厘米内、4厘米内、5厘米内或其他范围等等,此处不做限定。
在另一种实现方式中,电子设备可以实时采集电子笔的状态,在电子笔由第一预设状态转变为第二预设状态的情况下,确定检测到第一操作,将与第一应用界面对应的模式类型转变为手写输入的输入模式;在电子笔未处于第二预设状态的情况下,确定与第一应用界面对应的模式类型为键盘输入的输入模式。其中,电子笔由第一预设状态转变为第二预设状态可以为电子笔由静止状态转变为移动状态、电子笔由未被握持状态转变为被握持状态等,此处不做穷举。
作为示例,例如用户在打开第一应用界面后,在第一应用界面的过程中,用户将电子笔从电子设备中拿出来(电子笔从未被握持状态转换成被握持状态),或者,用户将电子笔从电子设备之外的地方拿起来(电子笔从未被握持状态转换成被握持状态),电子设备可以确定与第一应用界面对应的模式类型为手写输入的输入模式。
进一步地,当电子笔从电子设备中被拿出来之后,会跟电子设备建立通信连接,则若电子设备检测到电子笔从未与电子设备建立通信连接转变为与电子设备建立通信连接,则可以视为电子笔从未被握持状态转换成被握持状态。
电子笔中可以配置有振动传感器(例如陀螺仪、加速度传感器或其他类型的传感器等等),从而电子笔可以实时采集电子笔的振动数据,并将电子笔的振动数据通过通信模块实时发送给电子设备,由电子设备确定电子笔是否由静止状态转变为移动状态。
将电子笔从设备里拿出来的感知可能会依靠设备本身的处理模块接收到手写笔与设备的接口断开的信号,或者是通过手写笔上的传感器模块感知到与设备的连接断开,再通过通讯模块发送给屏幕设备来实现。而用户拿起笔过程的感知是通过手写笔自身的传感器模块(如陀螺仪传感器或加速度传感器等)感知用户拿起手写笔时对其造成的震动,然后将震动数据通过通讯模块发给主设备处理模块来实现。
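对“通过手写笔自身的传感器模块感知震动、进而判断电子笔由静止状态转变为移动状态”这一过程,下面给出一个基于阈值的示意性Python实现(阈值取值、采样格式均为说明用的假设):

```python
def pen_state_from_samples(accel_samples, threshold=0.5, min_hits=3):
    """根据电子笔加速度采样序列判断其处于静止还是移动状态。

    accel_samples: 去除重力分量后的加速度模长序列;
    超过 threshold 的采样视为一次震动, 累计达到 min_hits 次
    即认为电子笔被拿起/移动(由第一预设状态转变为第二预设状态)。
    """
    hits = sum(1 for a in accel_samples if abs(a) > threshold)
    return "moving" if hits >= min_hits else "static"
```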
本申请实施例中,提供了与第一应用界面对应的模式类型的多种判断方式,提高了本方案的实现灵活性,也扩展了本方案的应用场景;进一步地,根据电子笔的握持姿势来确定与第一应用界面对应的模式类型,用户无需执行其他操作就可以实现对第一应用界面的模式类型的转变,且根据用户对电子笔的握持姿势,来确定与第一应用界面对应的模式类型,能够降低与第一应用界面对应的模式类型的判断过程的错误率,以降低对第一应用界面进行错误放置的概率,既避免对计算机资源的浪费,又有利于提高用户粘度。
4206、电子设备判断与第一应用界面对应的模式类型是否转变为手写输入,若与第一应用界面对应的模式类型转变为手写输入,则进入步骤4207;若与第一应用界面对应的模式类型没有转变为手写输入,则重新进入步骤4205。
本申请实施例中,电子设备在执行完步骤4205之后就会执行步骤4206,以确定与第一应用界面对应的模式类型是否由键盘输入的输入模式转变为手写输入的输入模式,若与第一应用界面对应的模式类型转变为手写输入,则进入步骤4207;若与第一应用界面对应的模式类型未转变为手写输入,则重新进入步骤4205,以继续检测与第一应用界面对应的模式类型。需要说明的是,本申请实施例中,步骤4205和步骤4206可以交叉执行,且本申请实施例不限定步骤4205和4206与步骤4207之间的执行次数关系,可以为在执行多次步骤4205和4206之后,执行一次步骤4207。
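步骤4205与步骤4206交叉执行、直至模式类型转变为手写输入才进入步骤4207的流程,可以概括为如下示意性的轮询循环(get_mode 为假设的接口,代表步骤4205的检测结果):

```python
def poll_until_handwriting(get_mode, max_polls=100):
    """反复执行步骤4205(获取模式类型)与步骤4206(判断是否为手写输入),
    模式类型转变为手写输入时返回轮询次数(对应进入步骤4207),
    在 max_polls 次内未转变则返回 None。"""
    for count in range(1, max_polls + 1):
        if get_mode() == "handwriting":
            return count
    return None
```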
4207、电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面。
本申请实施例中,在电子设备获取到与第一应用界面对应的模式类型由键盘输入的输入模式转变为由手写输入的输入模式的情况下,响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面,关闭在第二显示屏上展示的虚拟键盘,以通过第二显示屏获取用户针对第一应用界面输入的手写内容。其中,电子设备在第二显示屏上展示第一应用界面可以为将第一应用界面移动至第二显示屏进行展示,也可以为电子设备自动对第一应用界面进行复制操作之后,通过第二显示屏进行展示等。
具体的,电子设备上运行有操作系统,电子设备可以通过调用操作系统中的move to函数的方式,或者,电子设备也可以通过调用操作系统中的Set Window Position函数的方式,或者,电子设备还可以通过调用操作系统中的Set Window Placement函数的方式,以实现在第二显示屏上展示第一应用界面。
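在调用操作系统提供的窗口移动接口(如上文提到的move to、Set Window Position或Set Window Placement函数)之前,通常需要先算出窗口在第二显示屏上的目标位置。下面是一个坐标换算的示意性Python片段(矩形的表示方式与“保持相对位置不变”的策略均为说明用的假设):

```python
def target_rect_on_second_display(win_rect, first_disp, second_disp):
    """计算将窗口从第一显示屏移至第二显示屏时的目标矩形。

    各参数均为 (x, y, w, h); 保持窗口在屏内的相对位置不变,
    实际的移动再交由操作系统的窗口管理接口完成。
    """
    wx, wy, ww, wh = win_rect
    fx, fy, fw, fh = first_disp
    sx, sy, sw, sh = second_disp
    return (sx + (wx - fx) * sw / fw,
            sy + (wy - fy) * sh / fh,
            ww, wh)
```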
更具体的,在一种情况中,电子设备的第二显示屏上没有其他的应用界面,则在一种实现方式中,电子设备可以关闭在第二显示屏上展示的虚拟键盘,将第一应用界面移动至第二显示屏(也即不在第一显示屏上展示第一应用界面),并通过第二显示屏全屏展示第一应用界面。在另一种实现方式中,电子设备可以关闭在第二显示屏上展示的虚拟键盘,将第一应用界面复制至第二显示屏,以在第一显示屏和第二显示屏上均显示第一应用界面。
为更直观地理解本方案,请参阅图50,图50为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的一种示意图。图50包括(a)和(b)两个子示意图,图50的(a)子示意图示出的为在与第一应用界面对应的模式类型为键盘输入的输入模式的情况下,电子设备的第一显示屏和第二显示屏的一种示意图,在电子设备获取到与第一应用界面对应的模式类型由键盘输入的输入模式转变为手写输入的输入模式的情况下,触发由图50的(a)子示意图进入图50的(b)子示意图,也即关闭在第二显示屏上展示的虚拟键盘,将第一应用界面移动至第二显示屏,需要说明的是,电子设备的第一显示屏上除了展示有第一应用界面外,还有可能展示有其他的应用界面,此处举例仅为方便理解本方案,不用于限定本方案。
在另一种情况中,电子设备的第二显示屏上还展示有其他的应用界面。在一种实现方式中,电子设备可以关闭第二显示屏上展示的虚拟键盘,并通过矩阵的方式在第二显示屏上展示第一应用界面和其他的应用界面。在另一种实现方式中,电子设备可以关闭第二显示屏上展示的虚拟键盘,并通过悬浮窗的形式,在第二显示屏上展示第一应用界面。在另一种实现方式中,电子设备可以关闭第二显示屏上展示的虚拟键盘,并将第二显示屏上展示的其他应用界面移动至第一显示屏,以通过第二显示屏采用全屏的形式展示第一应用界面等,此处不对在第二显示屏上展示第一应用界面的形式进行穷举。
进一步地,在上述种种实现方式中,电子设备在第二显示屏上展示第一应用界面的同时,可以也在第一显示屏上展示第一应用界面,也可以不再在第一显示屏上展示第一应用界面。
可选地,步骤4204可以包括:在第二显示屏上展示第一应用界面和应用控制栏。方法还可以包括:电子设备检测到作用于第二显示屏的第二操作;响应于第二操作将应用控制栏的第一显示面积改变为第二显示面积,并将应用控制栏包括的第一控制键组改变为第二控制键组,第一控制键组和第二控制键组均为对应于目标应用的控制键集合。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均会在后续实施例四中进行描述,此处不做赘述。
可选地,第一应用界面包括第一控制键,步骤4204可以包括:在第二显示屏上展示第一应用界面和应用控制栏。方法还可以包括:电子设备检测到对于第一目标应用界面的第二操作;响应于第二操作,在应用控制栏中显示第一控制键,并隐藏第一应用界面中的第一控制键。对于前述步骤中各个名词的具体含义、前述步骤的具体实现方式,均会在后续实施例四中进行描述,此处不做赘述。
4208、电子设备获取与第一应用界面对应的模式类型。
本申请实施例中,步骤4208的具体实现方式可参阅上述对步骤4205中的描述,此处不做赘述。
4209、电子设备判断与第一应用界面对应的模式类型是否转变为键盘输入,若与第一应用界面对应的模式类型转变为键盘输入,则进入步骤4210;若与第一应用界面对应的模式类型没有转变为键盘输入,则重新进入步骤4208。
本申请实施例中,电子设备在执行完步骤4208之后就会执行步骤4209,以确定与第一应用界面对应的模式类型是否由手写输入的输入模式转变为键盘输入的输入模式,若与第一应用界面对应的模式类型转变为键盘输入,则进入步骤4210;若与第一应用界面对应的模式类型未转变为键盘输入,则重新进入步骤4208,以继续检测与第一应用界面对应的模式类型。需要说明的是,本申请实施例中,步骤4208和步骤4209可以交叉执行,且本申请实施例不限定步骤4208和4209与步骤4210之间的执行次数关系,可以为在执行多次步骤4208和4209之后,执行一次步骤4210。
4210、电子设备响应于键盘输入的输入模式,触发在第一显示屏上展示第一应用界面,并在第二显示屏上展示虚拟键盘。
本申请实施例中,步骤4210的具体实现方式可参阅上述对步骤4204中的描述,此处不做赘述。需要说明的是,电子设备在执行完步骤4210之后,可以重新进入步骤4205,以实时检测与第一应用界面对应的模式类型是否转变为手写输入;此外,步骤4205至步骤4209均为可选步骤,若用户在步骤4205至步骤4209中任一步骤中关闭了第一应用界面,则不再需要继续执行其余步骤。
本申请实施例中,在展示应用界面的过程中,不仅能在应用界面从其他模式类型转变为手写输入时,自动调整应用界面在电子设备的不同显示屏上的布局,且能在应用界面的模式类型转变为键盘输入时,也能够自动调整应用界面在不同显示屏上的布局,并自动展示出虚拟键盘,从而当应用界面的模式类型转变为键盘输入时,用户也无需再手动调整应用界面在不同显示屏上的布局,而是直接可以进行键盘输入,步骤简洁,进一步提高了本方案的用户粘度。
4211、电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面。
本申请实施例中,在电子设备确定与第一应用界面对应的模式类型是手写输入的情况下,电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面,以通过第二显示屏获取针对第一应用界面的输入内容。其中,第一应用界面在第二显示屏上的展示方式可参阅上述步骤4207中的描述,此处不做赘述。
为了更直观地理解本方案,请参阅图51和图52,图51和图52分别为本申请实施例提供的应用界面的处理方法中的一种流程示意图。先参阅图51,图51包括(a)和(b)两个子示意图,在图51的(a)子示意图中,电子设备通过第二显示屏获取到对于目标应用程序(也即图示中的“笔记”这一应用程序)的开启操作,由于该启动操作为通过第二显示屏输入的,电子设备确定与第一应用界面对应的模式类型为手写输入,则电子设备进入图51的(b)子示意图中,电子设备在第二显示屏上展示第一应用界面(也即“笔记”这一应用程序的初始应用界面)。
请继续参阅图52,图52包括(a)和(b)两个子示意图,在图52的(a)子示意图中,电子设备通过第一显示屏获取到对于目标应用程序(也即图示中的“笔记”这一应用程序)的开启操作,由于该启动操作为通过电子笔获取到的,电子设备确定与第一应用界面对应的模式类型为手写输入,则电子设备进入图52的(b)子示意图中,电子设备在第二显示屏上展示第一应用界面,应理解,图51和图52中的示例仅为方便理解本方案,不用于限定本方案。
4212、电子设备获取与第一应用界面对应的模式类型。
4213、电子设备判断与第一应用界面对应的模式类型是否转变为键盘输入,若与第一应用界面对应的模式类型转变为键盘输入,则进入步骤4214;若与第一应用界面对应的模式类型没有转变为键盘输入,则重新进入步骤4212。
4214、电子设备响应于键盘输入的输入模式,触发在第一显示屏上展示第一应用界面,并在第二显示屏上展示虚拟键盘。
本申请实施例中,步骤4212至4214的具体实现方式可参阅上述对步骤4208至4210中的描述,此处不做赘述。
4215、电子设备获取与第一应用界面对应的模式类型。
4216、电子设备判断与第一应用界面对应的模式类型是否转变为手写输入,若与第一应用界面对应的模式类型转变为手写输入,则进入步骤4217;若与第一应用界面对应的模式类型没有转变为手写输入,则重新进入步骤4215。
4217、电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面。
本申请实施例中,步骤4215至4217的具体实现方式可参阅上述对步骤4205至4207中的描述,此处不做赘述。
需要说明的是,电子设备在执行完步骤4217之后,可以重新进入步骤4212,以实时检测与第一应用界面对应的模式类型是否转变为键盘输入;此外,步骤4212至步骤4217均为可选步骤,若用户在步骤4212至步骤4217中任一步骤中关闭了第一应用界面,则不再需要继续执行其余步骤。
二、在手写输入的输入模式与浏览模式之间切换。
本申请实施例中,请参阅图53,图53为本申请实施例提供的应用界面的处理方法的一种流程示意图,本申请实施例提供的应用界面的处理方法可以包括:
5301、电子设备获取针对第一应用界面的启动操作。
5302、电子设备基于启动操作,确定与第一应用界面对应的模式类型。
5303、电子设备判断与第一应用对应的模式类型是否为手写输入,若与第一应用对应的模式类型为浏览模式,则进入步骤5304;若与第一应用对应的模式类型为手写输入,则进入步骤5311。
本申请实施例中,步骤5301至5303的具体实现方式可参阅图42对应实施例中对步骤4201至4203中的描述,区别在于将步骤4201至4203中键盘输入这一模式类型,替换为步骤5301至5303中的浏览模式,具体可参阅图42对应实施例中的描述,此处不做赘述。
5304、电子设备响应于浏览模式,触发在第一显示屏上展示第一应用界面。
本申请实施例中,在电子设备确定与第一应用界面对应的模式类型不是手写输入而是浏览模式的情况下,电子设备响应于浏览模式,触发仅在第一显示屏上展示第一应用界面。
5305、电子设备获取与第一应用界面对应的模式类型。
5306、电子设备判断与第一应用界面对应的模式类型是否转变为手写输入,若与第一应用界面对应的模式类型转变为手写输入,则进入步骤5307;若与第一应用界面对应的模式类型没有转变为手写输入,则重新进入步骤5305。
5307、电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面。
本申请实施例中,步骤5305至5307的具体实现方式可参阅图42对应实施例中对步骤4205至4207中的描述,区别在于将步骤4205至4207中键盘输入这一模式类型,替换为步骤5305至5307中的浏览模式,且由于在浏览模式下,第二显示屏上不需要展示虚拟键盘,则对应的,当与第一应用界面对应的模式类型由浏览模式转变为手写输入时,也不需要关闭第二显示屏上展示的虚拟键盘,具体可参阅图42对应实施例中的描述,此处不做赘述。
为更直观地理解本方案,请参阅图54至图57,图54至图57为本申请实施例提供的应用界面的处理方法中第一应用界面的展示界面的四种示意图。先参阅图54,图54包括(a)和(b)两个子示意图,在图54的(a)子示意图中,图54中第一显示屏的底端展示有一个灯泡形图案和三个圆圈,该灯泡形图案代表桌面的展示界面,三个圆圈分别代表三个不同的应用界面,第一显示屏中当前展示的为应用界面1(也即第一应用界面的一种示例),第一显示屏的右上角有两个图标分别代表浏览模式和手写模式,第二显示屏中展示有应用界面2。在电子设备获取到与第一应用界面对应的模式类型由浏览模式转变为由手写输入的输入模式的情况下,触发由图54的(a)子示意图进入图54的(b)子示意图,也即将第一应用界面移动至第二显示屏,在图54的(b)子示意图中,电子设备通过矩阵的形式展示应用界面1和应用界面2,且不再在第一显示屏上展示应用界面1,则第一显示屏的当前展示界面变成了应用界面3,用户可以通过点击应用界面1,以触发通过全屏的方式展示应用界面1。
再参阅图55,图55包括(a)和(b)两个子示意图,图55的(a)子示意图和图54的(a)子示意图一致,此处不再赘述,在电子设备获取到与第一应用界面对应的模式类型由浏览模式转变为由手写输入的输入模式的情况下,触发进入图55的(b)子示意图,在图55的(b)子示意图中,电子设备通过悬浮窗的形式展示应用界面1(也即第一应用界 面的一种示例),且不再在第一显示屏上展示应用界面1,则第一显示屏的当前展示界面变成了应用界面3。
再参阅图56,图56包括(a)和(b)两个子示意图,图56的(a)子示意图和图54的(a)子示意图一致,此处不再赘述,在电子设备获取到与第一应用界面对应的模式类型由浏览模式转变为由手写输入的输入模式的情况下,触发进入图56的(b)子示意图,在图56的(b)子示意图中,电子设备通过悬浮窗的形式展示应用界面1(也即第一应用界面的一种示例),且在第一显示屏上依旧展示应用界面1。
再参阅图57,图57包括(a)和(b)两个子示意图,图57的(a)子示意图和图54的(a)子示意图一致,此处不再赘述,在电子设备获取到与第一应用界面对应的模式类型由浏览模式转变为由手写输入的输入模式的情况下,触发进入图57的(b)子示意图,在图57的(b)子示意图中,电子设备通过全屏的形式展示应用界面1(也即第一应用界面的一种示例),且电子设备将第二显示屏上展示的应用界面2移动至第一显示屏。
需要说明的是,电子设备的第一显示屏上除了展示有第一应用界面外,还有可能展示有其他的应用界面,电子设备的第二显示屏上也可以展示有更多的应用界面,图54至图57的示例均仅为方便理解本方案,不用于限定本方案。
5308、电子设备获取与第一应用界面对应的模式类型。
5309、电子设备判断与第一应用界面对应的模式类型是否转变为浏览模式,若与第一应用界面对应的模式类型转变为浏览模式,则进入步骤5310;若与第一应用界面对应的模式类型没有转变为浏览模式,则重新进入步骤5308。
5310、电子设备响应于浏览模式的输入模式,触发在第一显示屏上展示第一应用界面,并不在第二显示屏上展示第一应用界面。
本申请实施例中,步骤5308至5310的具体实现方式可参阅图42对应实施例中对步骤4208至4210中的描述,区别在于将步骤4208至4210中键盘输入这一模式类型,替换为步骤5308至5310中的浏览模式,且当与第一应用界面对应的模式类型由手写输入转变为浏览模式时,不需要在第二显示屏上展示虚拟键盘,具体可参阅图42对应实施例中的描述,此处不做赘述。
本申请实施例中,在应用界面的模式类型转变为浏览模式时,也能够自动调整应用界面在不同显示屏上的布局,从而当应用界面的模式类型转变为浏览模式时,用户也无需再手动调整应用界面在不同显示屏上的布局,也即在多种不同的应用场景下,均可以实现操作步骤的简化,进一步提高了本方案的用户粘度。
需要说明的是,电子设备在执行完步骤5310之后,可以重新进入步骤5305,以实时检测与第一应用界面对应的模式类型是否转变为手写输入;此外,步骤5305至步骤5310均为可选步骤,若用户在步骤5305至步骤5310中任一步骤中关闭了第一应用界面,则不再需要继续执行其余步骤。
5311、电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面。
5312、电子设备获取与第一应用界面对应的模式类型。
5313、电子设备判断与第一应用界面对应的模式类型是否转变为浏览模式,若与第一 应用界面对应的模式类型转变为浏览模式,则进入步骤5314;若与第一应用界面对应的模式类型没有转变为浏览模式,则重新进入步骤5312。
5314、电子设备响应于浏览模式的输入模式,触发在第一显示屏上展示第一应用界面,并不在第二显示屏上展示第一应用界面。
5315、电子设备获取与第一应用界面对应的模式类型。
5316、电子设备判断与第一应用界面对应的模式类型是否转变为手写输入,若与第一应用界面对应的模式类型转变为手写输入,则进入步骤5317;若与第一应用界面对应的模式类型没有转变为手写输入,则重新进入步骤5315。
5317、电子设备响应于手写输入的输入模式,触发在第二显示屏上展示第一应用界面。
本申请实施例中,步骤5311至5317的具体实现方式可参阅图42对应实施例中对步骤4211至4217中的描述,区别在于将步骤4211至4217中键盘输入这一模式类型,替换为步骤5311至5317中的浏览模式,且当与第一应用界面对应的模式类型由手写输入转变为浏览模式时,不需要在第二显示屏上展示虚拟键盘,当与第一应用界面对应的模式类型由浏览模式转变为手写输入时,也不需要关闭第二显示屏上展示的虚拟键盘,具体可参阅图42对应实施例中的描述,此处不做赘述。
需要说明的是,电子设备在执行完步骤5317之后,可以重新进入步骤5312,以实时检测与第一应用界面对应的模式类型是否转变为浏览模式;此外,步骤5312至步骤5317均为可选步骤,若用户在步骤5312至步骤5317中任一步骤中关闭了第一应用界面,则不再需要继续执行其余步骤。
本申请实施例中,不仅在用户使用应用界面的过程中,能够自动检测与应用界面对应的模式类型,进而根据与应用界面对应的模式类型对应用界面的展示位置进行调整,而且在打开应用界面时,也可以基于启动操作,确定与应用界面对应的模式类型,进而决定应用界面的展示位置,以方便用户在对应用界面执行启动操作后可以直接使用,而无需再对应用界面做位置移动操作,进一步提高了本方案的便利性,增加了本方案的用户粘度。
电子设备在第一显示屏上展示第一应用界面,并检测与第一应用界面对应的模式类型,在检测到与第一应用界面对应的模式类型为手写输入的情况下,就会触发在第二显示屏上展示第一应用界面,进而直接通过第二显示屏展示的第一应用界面进行输入;通过前述方式,若用户将第二显示屏放置于便于书写的方向上,用户不需要执行任何操作,电子设备就能够自动的将需要进行书写输入的应用界面显示于方便书写的第二显示屏上,既提高了整个输入过程的效率,也避免了冗余步骤,操作简单,有利于提高用户粘度。
在图41至图57所对应的实施例的基础上,为了更好的实施本申请实施例的上述方案,下面还提供用于实施上述方案的相关设备。请参阅图58,图58为本申请实施例提供的电子设备的一种结构示意图。电子设备1包括第一显示屏501、第二显示屏502、存储器40、一个或多个处理器10以及一个或多个程序401;一个或多个程序401被存储在存储器40中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备执行以下步骤:通过第一显示屏501展示第一应用界面;响应于检测到的第一操作,将与第一应用界面对应的模式类型转变为手写输入;响应于手写输入的输入模式,触发在第二显示屏502上展示 第一应用界面,以通过第二显示屏502获取针对第一应用界面的手写内容。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备还执行以下步骤:在检测到与第一应用界面对应的模式类型转变为键盘输入的情况下,响应于键盘输入的输入模式,触发在第一显示屏501上展示第一应用界面,并在第二显示屏502上展示虚拟键盘;或者,在检测到与第一应用界面对应的模式类型转变为键盘输入的情况下,响应于键盘输入的输入模式,触发在第一显示屏501上展示第一应用界面,并在第二显示屏502上展示虚拟键盘和应用控制栏。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备还执行以下步骤:在检测到与第一应用界面对应的模式类型转变为浏览模式的情况下,响应于浏览模式,触发在第一显示屏501上展示第一应用界面。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备具体执行以下五项中任一项或多项的组合:在检测到电子笔的握持姿势满足第一预设条件的情况下,确定检测到第一操作,握持姿势包括以下中任一项或多项的组合:握持位置、握持力度、握持角度;或者,通过第一图标获取到针对手写输入的触发指令,第一图标展示于第一应用界面上;或者,检测到第一接触操作,第一接触操作为预设点击操作或预设轨迹操作;或者,在检测到电子笔位于第二显示屏502的预设范围内的情况下,确定检测到第一操作;或者,在检测到电子笔由第一预设状态转变为第二预设状态的情况下,确定检测到第一操作。
在一种可能的设计中,第一操作为通过第二显示屏502获取第一方向的滑动操作,第一方向的滑动操作为从第二显示屏502的上边沿向第二显示屏502的下边沿滑动的滑动操作,第二显示屏502的上边沿与第一显示屏501之间的距离比第二显示屏502的下边沿与第一显示屏501之间的距离近。
在一种可能的设计中,一个或多个处理器10在执行一个或多个程序401时,使得电子设备还执行以下步骤:获取针对第二应用界面的启动操作,并基于启动操作,确定与第二应用界面对应的模式类型,第二应用界面与第一应用界面为不同的应用界面;在与第二应用界面对应的模式类型为手写输入的情况下,响应于手写输入的输入模式,触发在第二显示屏502上展示第二应用界面;或者,在与第二应用界面对应的模式类型为键盘输入的情况下,响应于键盘输入的输入模式,触发在第一显示屏501上展示第二应用界面,并在第二显示屏502上展示虚拟键盘;或者,在与第二应用界面对应的模式类型为浏览模式的情况下,响应于浏览模式,触发在第一显示屏501上展示第二应用界面。
需要说明的是,电子设备1中各模块/单元之间的信息交互、执行过程等内容,与本申请中图41至图57对应的各个方法实施例基于同一构思,具体内容可参见本申请前述所示的方法实施例中的叙述,此处不再赘述。
本申请实施例还提供了一种电子设备,请参阅图59,图59为本申请实施例提供的电子设备的一种结构示意图,电子设备1具体可以表现为手机、平板、笔记本电脑或者其他配置有显示屏的设备等,此处不做限定。其中,电子设备1上可以部署有图58对应实施例中所描述的电子设备,用于实现图41至图57对应实施例中电子设备的功能。具体的,电 子设备1可因配置或性能不同而产生比较大的差异,可以包括一个或一个以上中央处理器(central processing units,CPU)1522(例如,一个或一个以上处理器)和存储器40,一个或一个以上存储应用程序1542或数据1544的存储介质1530(例如一个或一个以上海量存储设备)。其中,存储器40和存储介质1530可以是短暂存储或持久存储。存储在存储介质1530的程序可以包括一个或一个以上模块(图示没标出),每个模块可以包括对电子设备中的一系列指令操作。更进一步地,中央处理器1522可以设置为与存储介质1530通信,在电子设备1上执行存储介质1530中的一系列指令操作。
电子设备1还可以包括一个或一个以上电源1526,一个或一个以上有线或无线网络接口1550,一个或一个以上输入输出接口1558,和/或,一个或一个以上操作系统1541,例如Windows ServerTM,Mac OS XTM,UnixTM,LinuxTM,FreeBSDTM等等。
本申请实施例中,中央处理器1522,用于实现图41至图57对应实施例中电子设备的功能。需要说明的是,对于中央处理器1522执行图41至图57对应实施例中电子设备的功能的具体实现方式以及带来的有益效果,均可以参考图41至图57对应的各个方法实施例中的叙述,此处不再一一赘述。
本申请实施例中还提供一种计算机可读存储介质,该计算机可读存储介质中存储有程序,当其在计算机上运行时,使得计算机执行如前述图42至图57所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例中还提供一种计算机程序,当其在计算机上运行时,使得计算机执行如前述图42至图57所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例中还提供一种电路系统,所述电路系统包括处理电路,所述处理电路配置为执行如前述图42至图57所示实施例描述的方法中电子设备所执行的步骤。
本申请实施例提供的电子设备具体可以为芯片,芯片包括:处理单元和通信单元,所述处理单元例如可以是处理器,所述通信单元例如可以是输入/输出接口、管脚或电路等。该处理单元可执行存储单元存储的计算机执行指令,以使芯片执行上述前述图42至图57所示实施例描述的方法中电子设备所执行的步骤。可选地,所述存储单元为所述芯片内的存储单元,如寄存器、缓存等,所述存储单元还可以是所述无线接入设备端内的位于所述芯片外部的存储单元,如只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)等。
其中,上述任一处提到的处理器,可以是一个通用中央处理器,微处理器,ASIC,或一个或多个用于控制上述第一方面方法的程序执行的集成电路。
另外需说明的是,以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。另外,本申请提供的装置实施例附图中,模块之间的连接关系表示它们之间具有通信连接,具体可以实现为一条或多条通信总线或信号线。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本申请可借助软件加必需的通用硬件的方式来实现,当然也可以通过专用硬件包括专用集成电路、专用CPU、专用存储器、专用元器件等来实现。一般情况下,凡由计算机程序完成的功能都可以很容易地用相应的硬件来实现,而且,用来实现同一功能的具体硬件结构也可以是多种多样的,例如模拟电路、数字电路或专用电路等。但是,对本申请而言更多情况下软件程序实现是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在可读取的存储介质中,如计算机的软盘、U盘、移动硬盘、ROM、RAM、磁碟或者光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序的形式实现。
所述计算机程序包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(Solid State Disk,SSD))等。
实施例四:
本发明实施例可应用于各种多屏显示的智能终端中。作为示例,本发明实施例可以用于双屏电子设备,如图60所示,上述电子设备可以是具有两个显示屏的电子设备,其中,两个显示屏可以是两个分离的显示屏,也可以是由一块柔性折叠屏或者曲面屏划分为的两个显示屏。电子设备可以是作为一个整体独立工作的电子设备,例如,个人笔记本等,也可以是由两个可以独立工作的电子设备相互连接,共同工作而形成的电子设备,例如,由两个手机或两个平板电脑拼接而成的双屏电子设备。
双屏电子设备或者由两个平板电脑对接形成的双屏电子设备通常包括第一显示屏和第二显示屏。其中,第一显示屏主要用于提供输出功能,即,将当前运行的内容或执行的操作等展示给用户。当然,第一显示屏也可以同时具备输入功能,例如,第一显示区域可具备触屏功能,通过触屏手势对当前应用进行操作。相比于第一显示屏,第二显示屏通常更靠近用户的双手,便于用户操作,因此,第二显示屏主要执行输入功能,可通过触摸显示屏接收用户的输入,在该触摸显示屏上,也可通过虚拟键盘取代传统的机械键盘接收用户的输入。当然,第二显示屏也可具备输出功能,例如,第二显示屏同样可用于将当前运行的内容或执行的操作等展示给用户。
本发明实施例还可以用于单应用双屏跨设备的操作中,其中被控设备的显示屏主要用 于将当前运行的内容或执行的操作等展示给用户,即,主要实现第一显示屏的功能。将第一显示屏中的目标应用对应的功能菜单转移至控制设备的显示屏上,即,主要实现第二显示屏的功能,以达到对被控设备上的应用的控制。例如,如图61所示,可通过平板电脑(61-1)或手机(61-2)作为控制终端,远程操作电脑(61-3)的应用;可通过平板电脑(61-1)或手机(61-2)作为控制终端,远程操作智慧大屏(61-4)的应用;或者将手机与平板电脑进行互联操作,以其中一方作为控制终端,对另一方的应用进行操作。具体的,本发明实施例可以应用于智能家居场景中,例如,被控设备可以为电视、微波炉、冰箱、洗衣机等具有显示显示屏的智能家居设备,控制设备可以为手机、平板、电脑等。本发明实施例还可用于智能座舱领域中,例如,通过手机、平板等作为控制设备,对前排车机屏或者后排显示屏进行控制,或者通过后排显示屏对前排车机屏进行控制。
在上述场景中,应用程序的界面通常采用较为固定的布局方式,且全部显示于第一显示屏中,例如,应用的控制键通常出现在应用顶部或者左侧的功能区菜单中。当用户需要对应用的控制键进行操作时,无论用户的当前操作对象在哪里,均需要将光标移至功能区菜单处进行操作,然后再回到当前操作对象处。一种情况下,用户可能难以定位应用的功能区菜单中的具体控制键,另一种情况下,用户在将光标移动至具体控制键之后,难以重新定位到当前的操作对象,两种情况下均使用户的正常操作产生了一定的困难。且上述操作中,用户需要控制光标不断地在操作对象和功能区菜单之间切换,使得用户的操作较为繁琐,操作效率较低。
请参阅图62,在本发明一个实施例中,提供一种屏幕显示方法6200,屏幕显示方法6200用于在第二显示屏中显示控件区,使得用户可以通过第二显示屏上的控件区对第一显示屏上的目标应用或操作系统进行控制。
当本发明实施例的方法应用于双屏显示的电子设备时,例如,双屏电子设备,双屏电子设备的B面(通常设置为显示屏的面)可作为第一显示屏,双屏电子设备的C面(通常设置为键盘的面)可作为第二显示屏。在一种实现方式中,如图63A所示,B面可显示用户操作的目标应用的主显示界面和功能菜单栏,C面可包括虚拟键盘和控件区;在另一种实现方式中,如图63B所示,B面可显示用户操作的目标应用的主显示界面和功能菜单栏,C面可包括其他应用界面和控件区;在另一种实现方式中,如图63C所示,B面可显示用户操作的目标应用的主显示界面和功能菜单栏,C面可仅包括控件区。当本发明实施例的方法应用于单应用双屏跨设备的操作中时,被控设备的显示屏可对应于双屏电子设备的B面,控制设备的显示屏可对应于双屏电子设备的C面,被控设备和控制设备的显示屏的显示内容可借鉴图63A-63C。
通常情况下,一个应用程序可包括一个主要的显示界面和一个或多个包含对应于该应用的控制键的功能菜单栏,主要显示界面通常用于向用户展示应用的当前状态或用户操作的执行结果,功能菜单栏中的控制键通常用于接收用户的输入,并对目标应用执行特定操作。以文本编辑应用为例,其主要显示界面为展示当前编辑的文档的界面,通常也是整个应用程序中显示面积最大的界面,除此之外,文本编辑应用的功能菜单栏可以包括编辑菜单栏(包括文件、开始、插入、设计、页面布局等控制键)、导航菜单栏等功能菜单栏,用 于接收用户对文档的操作指示。通常情况下,应用程序的主要显示界面和功能菜单栏均显示在第一显示屏中,用户仅可通过鼠标或者触屏手势对第一显示屏中的应用程序进行操作。
在本发明实施例中,在第二显示屏中设置控件区,控件区可以包括多个显示区域,例如,在一种实现方式中,控件区可以包含系统控制栏和应用控制栏,其中,系统控制栏可以包含一个或多个功能模块,每个功能模块包含与操作系统相关联的控制键组,应用控制栏可以包含一个或多个功能模块,部分功能模块可包含对应于目标应用的控制键组,部分功能模块可包含与用户当前操作相关的快捷操作控制键组。应当理解,其他有助于提高用户操作效率的显示区域设置,及功能模块的设置也是可能的。通过在第二显示屏中设置控件区,使得用户能够通过第二显示屏中的控制键对第一显示屏中的目标应用或者系统进行操作,为用户提供了更加便捷的操作方式,提高用户操作的效率。
优选的,控件区在第二显示屏上的位置可以根据用户需求进行灵活调整。例如,控件区可以位于第二显示屏的上端,位于第二显示屏中其他显示内容(其他应用、虚拟键盘等)的上侧;控件区还可以位于第二显示屏的下端,位于第二显示屏中其他显示内容(其他应用、虚拟键盘等)的下侧;控件区还可以位于第二显示屏的左侧或者右侧。可由系统或者用户自定义控件区的初始显示位置,当控件区显示在第二显示屏上时,也可由用户灵活移动控件区在第二显示屏上的位置。
屏幕显示方法6200可包含如下步骤:
步骤6201:激活控件区。
在一种实现方式中,在常规的使用状态下,控件区可处于关闭状态,此时,可通过第一显示屏显示目标应用的主要显示模块和部分功能菜单栏,并采用常规的操作方式,通过鼠标操作或触屏手势操作第一显示屏中的功能菜单栏中的控制键实现对目标应用的操作。
当用户需要开启第二显示屏上的控件区以更快达到第一显示屏上的目标应用的控制键,以提高操作效率时,可通过多种方式激活控件区。在一种实现方式中,控件区可与虚拟键盘相关联,默认在开启虚拟键盘的同时,开启控件区。此时,可通过激活虚拟键盘的指令,同时激活虚拟键盘和控件区,如图64A所示。在另一种实现方式中,如图64B所示,虚拟键盘上可设置有控件区的控制开关,当虚拟键盘处于开启状态,而控件区未开启时,可通过虚拟键盘上的控制开关,激活控件区。在另一种实现方式中,如图64C所示,可通过手势控制,激活控件区,即,在存储模块中存储有与激活辅助显示区域相对应的手势,当检测到用户执行该手势时,激活控件区。该控制手势可以为,例如,手指从显示屏边缘向内滑动。在另一种实现方式中,控件区是否开启可与应用的显示模式相关联,由于开启应用的全屏模式时,通常会隐藏掉应用的一部分显示模块,因此,可在开启应用的全屏模式的同时,激活控件区,以补充显示部分显示模块的内容,如图64D所示。应当理解,以上激活控件区的操作方式仅仅是示例性的,其他激活控件区的操作方式也是可能的。
在常规使用状态下关闭控件区,当需要的时候通过简易的操作激活控件区,能够在不必要的情况下简化用户操作界面,避免控件区对正常使用情况下的干扰。
在另一种实现方式中,可以在开启电子设备后默认开启控件区,此时,不需要用户通过步骤6201激活控件区。因此,步骤6201为屏幕显示方法6200的可选步骤。
步骤6202:获取用户对于目标应用的操作。
在本发明实施例中,根据用户对于目标应用的操作,确定控件区的显示内容。在一种实施方式中,用户对于目标应用的操作为将目标应用的操作界面显示在第一显示屏上,例如,在用户操作前,该目标应用可处于关闭状态,用户通过开启目标应用的操作将目标应用的操作界面显示在第一显示屏上;或者,在用户操作前,该目标应用可处于后台运行状态,用户通过切换操作将目标应用的操作界面显示在第一显示屏上。当用户对于目标应用的操作为将目标应用的操作界面显示在第一显示屏上时,可在应用控制栏中显示对应于目标应用的控制键。在将目标应用的操作界面显示在第一显示屏上后,第一显示屏上可仅显示该目标应用的操作界面,也可以共同显示包含该目标应用在内的多个应用的操作界面,例如,双屏多屏操作模式。
可选的,在一种实现方式中,可由应用程序的开发者提供该应用程序的各个功能模块和功能模块中的控制键,和各个功能模块以及各个控制键之间的优先级顺序,然后可由系统根据实际情况(控件区的显示面积等)确定在控件区中对应于该应用的应用控制栏中显示哪些功能模块及控制键,并确定应用控制栏的布局方式。
该实现方式中,系统从应用程序中获取的目标应用的信息可以包括目标应用的各个功能模块,每个功能模块中包含的控制键,以及各个功能模块的优先级顺序和每个功能模块中不同控制键的优先级顺序。下面对目标应用的各种信息分别进行介绍:
1)功能模块及控制键
一个应用通常包含一个主要显示模块和多个功能菜单栏用于对主要显示模块中的内容进行控制,控件区中的功能模块可对应于目标应用的功能菜单栏。例如,以幻灯片编辑应用为例,幻灯片编辑应用可以包括主要显示界面、功能模块1、功能模块2、功能模块3等,其中,主要显示界面显示用户当前正在编辑的幻灯片界面,功能模块1包含常用的对幻灯片界面进行编辑的控制键集合,功能模块2用于显示全部幻灯片,供用户浏览,功能模块3包含快捷操作的控制键集合。应当理解,由于不同应用实现的功能不同,不同应用的功能模块设置和功能模块中控制键的设置可能是不同的。
2)功能模块的优先级
功能模块的优先级表示各个功能模块在用户的使用过程中的重要程度,通常可根据各个功能模块的功能的重要程度及用户的使用频率等指标进行确定。例如,上述幻灯片编辑应用的功能模块的优先级可以定义如下:功能模块1的优先级>功能模块2的优先级>功能模块3的优先级。应当理解,上述关于幻灯片编辑应用的功能模块的优先级的定义仅仅是示例性的,其他可能的,符合用户使用习惯的定义方式也是可能的。
可选的,在一种实现方式中,可以针对应用程序定义一个或多个必选功能模块,必选功能模块为控件区在开启状态下固定显示的对应于该应用程序的功能模块。另外,还可以针对目标应用定义一个或多个优选功能模块,优选功能模块为在控件区中显示了该应用程序的全部必选功能模块后,可优先进行显示的功能模块。目标应用的各个功能模块的优先级顺序可以设置如下:必选功能模块的优先级最高,优选功能模块的优先级次之,其他功能模块的优先级更次。
3)功能模块中控制键的优先级
控制键的优先级表示各个控制键在用户的使用过程中的重要程度,通常可根据各个控制键的控制功能的重要程度及用户的使用频率等指标进行确定。例如,以对文字进行编辑的控制键为例,在一种实现方式中,复制、粘贴、剪切、字体、段落、定义、同义词、翻译等控制键的优先级可以定义如下:复制和粘贴的优先级最高,剪切的优先级低于复制和粘贴的优先级,字体和段落的优先级低于剪切的优先级,定义、同义词和翻译的优先级低于字体和段落的优先级。应当理解,上述优先级的定义仅是一种可能的实现方式,其他符合用户使用习惯的或是其他常见的关于应用的功能键的优先级定义方式也是可能的。
在一种实现方式中,每个功能模块可以定义一个或多个必选控制键,必选控制键为当控件区显示对应功能模块时,固定显示的控制键。另外,每个功能模块可以定义一个或多个优选控制键,优选控制键为在控件区显示了对应功能模块的全部控制键之后,可优先进行显示的控制键。同一个功能模块不同控制键之间的优先级顺序可以设置如下:必选控制键的优先级最高,优选控制键的优先级次之,其他控制键的优先级更次。
可选的,在另一种实现方式中,可由应用程序的开发者直接定义针对于不同显示面积下的应用控制栏的显示内容,包括应用控制栏中的功能模块及控制键,以及应用控制栏的排版布局的方式,例如,由应用程序的开发者设置针对于显示面积1的应用控制栏显示方式1,针对于显示面积2的应用控制栏显示方式2,针对于显示面积3的应用控制栏显示方式3等。其中,显示面积1、显示面积2和显示面积3可以不特指某个尺寸,可以是一个范围,此时,系统可根据应用控制栏的显示面积,选择对应的应用控制栏的显示方式。
在该实现方式中,系统从应用程序中获取的目标应用的信息可以包括针对不同显示面积的应用控制栏显示方式,具体包括每一种应用控制栏的显示方式中,包含哪些功能模块,每个功能模块中包含哪些控制键,以及应用控制栏的排版布局方式。控件区中显示的应用控制栏的显示方式可与应用程序提供的显示方式完全相同。
可选的,在另一种实现方式中,可由系统通过文字或图像识别技术,识别应用程序的各个功能模块以及功能模块中的控制键,由系统根据用户的使用频率或重要程度,为应用程序的各个功能模块以及控制键制定优先级顺序,然后由系统根据制定的优先级顺序,确定在应用控制栏显示哪些功能模块及控制键,并确定具体的排版布局方式。在该实现方式中,系统可不从应用程序方面获取额外的信息。
应当理解,上述三种实现方式中,应用程序与系统之间的交互方式仅仅是示例性的,其他可行的交互方式,或是随着技术的发展新出现的实现方式也是可能的。
在另一种实施方式中,用户对于目标应用的操作为对于目标应用的操作界面的操作,例如,选择目标应用操作界面上的特定内容、将光标放置在目标应用操作界面的特定位置等。当用户对于目标应用的操作为对于目标应用的操作界面的操作时,可在应用控制栏中显示与该操作相关联的快捷控制键。
用户对于目标应用的操作界面的操作包括用户通过目标应用执行特定功能时的任何可能的操作。在一种实现方式中,用户对于目标应用的操作可能为选择操作界面的特定对象,例如,选择特定文字、符号、图片、表格、音视频等,用户可以通过触屏手势或鼠标操作 等多种方式选择特定对象,例如,可以通过触屏手势或操作鼠标将光标移动到特定对象上,可以通过触屏手势或操作鼠标选中特定对象(特定对象的底纹变深)等。在另一种实现方式中,用户对于目标应用的操作界面的操作可以为一种独特的手势或以一种独特的方式操作鼠标,例如,通过滑动手势或滚动鼠标滚轮操作滚动目标区域的内容,以实现对目标区域内容的浏览。应当理解,上述操作仅是示例性的,其他的用户在使用电子设备的过程中可能对目标应用进行的操作都是可能的。
用户对于目标应用操作界面的不同操作可以对应于不同的控制键组,该控制键组中的控制键可以是与特定操作相关联的快捷操作键的集合。如图65A所示,在一种实现方式中,用户对于目标应用的特定操作可以为选择特定的文字内容,例如,将光标放置在文字内容上,与该特定操作对应的控制键组可以包含复制、粘贴、剪切、字体、文字大小、段落、定义、同义词、翻译、使用网络搜索等控制键的集合。如图65B所示,在一种实现方式中,用户对于目标应用的特定操作可以为选择特定的图片内容,与该特定操作对应的控制键组可以包含复制、粘贴、剪切、设置图片格式、更改图片、置于顶层、置于底层、保存图片等控制键的集合。如图65C所示,在一种实现方式中,用户对于目标应用的特定操作可以为选择特定的表格内容,与该特定操作对应的控制键组可以包含复制、粘贴、剪切、格式、插入行、插入列、删除表格等控制键的集合。如图65D所示,在一种实现方式中,用户对于目标应用的特定操作可以为选择特定的视频内容,与该特定操作对应的控制键组可以包含播放、暂停、增大音量、减小音量、提高亮度、降低亮度、画中画、复制视频地址、投射、循环、进度条等控制键的集合。如图65E所示,在一种实现方式中,用户对于目标应用的特定操作可以为选择特定的音频内容,与该特定操作对应的控制键组可以包含播放、暂停、下一首、增大音量、减小音量、复制音频地址、循环、进度条等控制键的集合。如图65F所示,在一种实现方式中,用户对于目标应用的特定操作可以为通过滑动手势或滚动鼠标的滚轮浏览目标区域内的内容,与该特定操作对应的控制键组可以包含目标区域的缩略图,以及在缩略图中快速定位目标内容的定位框等。
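上述“不同操作对象对应不同快捷控制键组”的对应关系,本质上是一张由对象类型到控制键集合的映射表,可用如下示意性的Python片段表达(键名与控制键列表均为示例,非穷举,也非本申请的实际实现):

```python
# 选中对象类型 → 快捷操作控制键组(示例, 非穷举)
SHORTCUT_KEYS = {
    "text":  ["复制", "粘贴", "剪切", "字体", "段落", "翻译"],
    "image": ["复制", "粘贴", "剪切", "设置图片格式", "置于顶层", "保存图片"],
    "table": ["复制", "粘贴", "剪切", "插入行", "插入列", "删除表格"],
    "video": ["播放", "暂停", "增大音量", "减小音量", "画中画", "进度条"],
}

def shortcut_keys_for(selection_type):
    """根据用户选中的对象类型返回对应的快捷操作控制键组,
    未定义的类型返回空列表。"""
    return SHORTCUT_KEYS.get(selection_type, [])
```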
可选的,在一种实现方式中,可由系统针对不同的用户操作,定义不同的控制键集合。针对用户对于目标应用的不同操作,显示不同的快捷操作控制键组,能够贴合用户需求,为用户提供更加便捷的操作,提升用户的操作效率。在另一种实现方式中,也可以将控制键集合定义为当前鼠标位置单击鼠标右键显示的控制键集合。采用将控制键集合定义为单击鼠标右键显示的控制键集合的简易设计,能够避免开发者的二次开发,减轻开发者的负担,缩短开发周期。
步骤6203:获取控件区的显示面积。
步骤6203为可选步骤,当控件区的显示面积固定时,无需获取控件区的显示面积,可直接进行步骤6204。当控件区的显示面积可进行灵活调整时,可进行步骤6203。
可选的,在一种实现方式中,控件区的显示面积可以灵活进行调整。可选的,在每次开启控件区的时刻,控件区的初始显示面积也可能是不同的。例如,不同的应用可以对应于不同的初始显示面积。在一种实现方式中,控件区的初始显示面积可以是由系统定义的,系统可以针对不同的应用,定义不同的控件区的初始显示面积。在另一种实现方式中,控 件区的初始显示面积可以是由用户自定义的,用户可以针对不同的应用,定义不同的控件区的初始显示面积。在另一种实现方式中,控件区的初始显示面积可以默认为上一次使用该应用时开启的控件区的显示面积。应当理解,其他本领域常用的可能的对控件区的初始显示面积进行定义的方式也是可能的。
通过灵活的设置控件区的显示面积,能够使得控件区中显示的功能模块及控制键组更加符合用户习惯,为用户提供更加便捷的操作,提升用户操作效率。
可根据需求灵活设置控件区的位置及布局方式,可选的,如图63A所示,在一种实现方式中,控件区可以设置在第二显示屏中虚拟键盘或者其他应用界面的上部。如图63B所示,在另一种实现方式中,控件区可显示在第二显示屏的左侧或右侧。如图63C所示,在另一种实现方式中,控件区可显示在第二显示屏的中间位置。如图66A所示,在另一种实现方式中,当第二显示屏中没有开启虚拟键盘或其他应用时,第一显示屏中的目标应用可以占用第二显示屏的部分显示面积,对应的,控件区可以位于第二显示屏的底端。如图66B所示,在另一种实现方式中,双屏电子设备的两个显示屏可以左右放置,此时,虚拟键盘可采用分离式设计,位于两个显示屏的下端,与此对应的,应用显示区可以设置在分离键盘的中间。
步骤6204:在控件区中显示控制键组。
步骤6204在综合考虑上述一个或多个步骤获取的信息的基础上,确定控件区中包含的功能模块及控制键组,并将其显示在控件区中。
可选的,如图67所示,在一种实现方式中,控件区可包含如下区域:
1)系统控制栏
系统控制栏主要用于显示与系统控制相关的控制键集合,可选的,可包含系统控件功能模块和程序坞功能模块。其中,系统控件功能模块可包括用于执行操作系统相关的控制键组,例如,系统控件功能模块可包括:调节音量,调节亮度,查询天气,查看时间,查看日历,查看闹铃,查阅系统通知等控制键的集合。程序坞功能模块可包括用于执行系统中多个任务程序之间的切换的控制键组,例如,程序坞功能模块可包括:当前运行的程序列表,或常用/喜好的应用列表,或最近使用的应用列表,或桌面应用列表等控制键。可选的,系统控制栏中与系统操作相关的控制键集合可以是由系统设置的,较为固定的控制键集合,也可由用户根据使用习惯对系统设置的系统控制栏中的控制键集合进行调整。
2)应用控制栏
应用控制栏主要用于显示对应于目标应用的控制键组,应用控制栏可包含对应于目标应用的一个或多个功能模块和/或与用户对于目标应用的操作相关联的快捷操作功能模块。
在一种实现方式中,当用户对于目标应用的操作为将目标应用的操作界面显示在第一显示屏中时,可在控件区显示对应于目标应用的控制键组。如步骤6202所述,可选的,在一种实现方式中,可由应用程序的开发者提供该应用程序各个功能模块以及各个控制键之间的优先级顺序,然后可由系统根据实际情况(控件区的显示面积等)确定在控件区中对应于该应用的应用控制栏中显示哪些功能模块及控制键,并确定应用控制栏的布局方式。
优选的,当控件区的显示面积为最小时,目标应用的控制键集合中可以包含目标应用的必选功能模块和必选功能模块中的必选控制键。
优选的,当控件区的显示面积比最小显示面积大时,在一种实现方式中,可以在综合考虑目标应用的各个功能模块的优先级顺序及各个控制键的优先级顺序的基础上,按照如图68所示的整体优先级顺序增加目标应用的控制键集合中的控制键。具体的,必选功能模块中的必选控制键的优先级高于必选功能模块的优选控制键的优先级,高于优选功能模块的必选控制键的优先级,高于优选功能模块的优选控制键的优先级,高于其他功能模块的必选控制键的优先级,高于其他功能模块的优选控制键的优先级,高于必选功能模块的其他控制键的优先级,高于优选功能模块的其他控制键的优先级,高于其他功能模块的其他控制键的优先级。因此,在控件区的初始显示面积逐渐增大的过程中,首先将必选功能模块的必选控制键增加到目标应用的控制键集合中,然后将必选功能模块的优选控制键增加到目标应用的控制键集合中,然后将优选功能模块的必选控制键增加到目标应用的控制键集合中,然后将优选功能模块的优选控制键增加到目标应用的控制键集合中,然后将其他功能模块的必选控制键增加到目标应用的控制键集合中,然后将其他功能模块的优选控制键增加到目标应用的控制键集合中,然后将必选功能模块的其他控制键增加到目标应用的控制键集合中,然后将优选功能模块的其他控制键增加到目标应用的控制键集合中,然后将其他功能模块的其他控制键增加到目标应用的控制键集合中。具体在某一类功能模块的某一类控制键的判断过程中,按照每一个具体的控制键的优先级顺序增加显示。
优选的,当控件区的显示面积比最小显示面积大时,在另一种实现方式中,可以在综合考虑目标应用的各个功能模块的优先级顺序及各个控制键的优先级顺序的基础上,按照如图69所示的优先级顺序增加目标应用的控制键集合中的控制键。具体的,必选功能模块中的必选控制键的优先级高于必选功能模块的优选控制键的优先级,高于优选功能模块的必选控制键的优先级,高于优选功能模块的优选控制键的优先级,高于必选功能模块的其他控制键的优先级,高于优选功能模块的其他控制键的优先级,高于其他功能模块的必选控制键的优先级,高于其他功能模块的优选控制键的优先级,高于其他功能模块的其他控制键的优先级。因此,在控件区的初始显示面积逐渐增大的过程中,首先将必选功能模块的必选控制键增加到目标应用的控制键集合中,然后将必选功能模块的优选控制键增加到目标应用的控制键集合中,然后将优选功能模块的必选控制键增加到目标应用的控制键集合中,然后将优选功能模块的优选控制键增加到目标应用的控制键集合中,然后将必选功能模块的其他控制键增加到目标应用的控制键集合中,然后将优选功能模块的其他控制键增加到目标应用的控制键集合中,然后将其他功能模块的必选控制键增加到目标应用的控制键集合中,然后将其他功能模块的优选控制键增加到目标应用的控制键集合中,然后将其他功能模块的其他控制键增加到目标应用的控制键集合中。具体在某一类功能模块的某一类控制键的判断过程中,按照每一个具体的控制键的优先级顺序增加显示。
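上述按整体优先级逐级向控制键集合中增加控制键的过程,可以抽象为“先给每个(功能模块类别,控制键类别)组合指定一个层级,再按层级与模块内优先级排序、按显示面积截取”的算法。下面以图68所示的顺序为例给出示意性的Python实现(数据结构均为说明用的假设):

```python
# 图68所示的整体优先级层级(数值越小优先级越高)
TIER = {
    ("必选", "必选"): 0, ("必选", "优选"): 1,
    ("优选", "必选"): 2, ("优选", "优选"): 3,
    ("其他", "必选"): 4, ("其他", "优选"): 5,
    ("必选", "其他"): 6, ("优选", "其他"): 7,
    ("其他", "其他"): 8,
}

def select_keys(keys, capacity):
    """keys: [(控制键名, 功能模块类别, 控制键类别, 模块内优先级)];
    capacity: 应用控制栏当前显示面积可容纳的控制键数量。
    按整体优先级层级及模块内优先级排序后, 取前 capacity 个控制键。"""
    ordered = sorted(keys, key=lambda k: (TIER[(k[1], k[2])], k[3]))
    return [k[0] for k in ordered[:capacity]]
```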
应当理解,上述两种优先级顺序仅是示例性的,其他符合用户使用习惯的优先级的定义方式也是可能的。
如步骤6202所述,可选的,在另一种实现方式中,可由应用程序的开发者直接定义针对于不同显示面积下的应用控制栏的显示内容,包括应用控制栏中的功能模块及控制键,以及应用控制栏的排版布局的方式。由系统根据应用控制栏的显示面积,选择显示应用程序对应的哪一种应用控制栏显示方式。可选的,在另一种实现方式中,可由系统通过文字或图像识别技术,识别应用程序的各个功能模块以及功能模块中的控制键,由系统为各个功能模块以及控制键指定优先级顺序,然后由系统根据指定的优先级顺序,确定在应用控制栏显示哪些功能模块及控制键,并确定具体的排版布局方式。
可选的,应用控制栏中可以包括与用户对于目标应用的当前操作相关的快捷操作功能模块。快捷操作功能模块主要包括与用户对于目标应用的当前操作相关的快捷操作控制键组,例如,步骤6202中列举的不同用户操作对应的控制键的集合。在一种实现方式中,用户操作相关的快捷操作控制键可由应用开发者定义,即,由应用开发者针对用户在目标应用中执行的不同操作,设置相应的快捷操作控制键集合,该实现方式下,用户的同一操作在不同应用中可能对应不同的快捷操作控制键集合。在另一种实现方式中,用户操作相关的控制键可由系统定义,即由系统设置用户不同类型的操作对应的快捷操作控制键集合,该实现方式下,用户的同一操作在不同应用中可能对应相同的快捷操作控制键集合。
在另一种实现方式中,当用户对于目标应用的操作为对于目标应用的操作界面的操作时,可在控件区显示与用户操作相关联的快捷控制键组。可选的,在一种实现方式中,可仅在应用控制栏中显示与用户对于目标应用的操作界面的操作相关联的控制键组,即,将应用控制栏中原本显示的对应于目标应用的初始控制键组替换为与用户对于目标应用的操作界面的操作相关联的控制键组。在另一种实现方式中,也可以在应用控制栏中共同显示对应于目标应用的初始控制键组和与用户对于目标应用的操作界面的操作相关联的控制键组,即,在对应于目标应用的初始控制键组的基础上增加与用户对于目标应用的操作界面的操作相关联的控制键组。
在确定在应用控制栏中显示用户操作对应的哪些快捷控制键时,可采用上述与确定对应于目标应用的控制键相似的实现逻辑。在一种实现方式中,可由系统定义与用户操作相关的快捷控制键的优先级顺序,然后根据应用控制栏的显示面积,确定将哪些快捷控制键显示在应用控制栏中。在另一种实现方式中,可由系统针对不同的应用控制栏的显示面积定义对应的快捷控制键组,然后根据应用控制栏的实际显示面积,确定应用控制栏中显示的快捷控制键组。
步骤6205:隐藏控件区中的控制键组在第一显示屏中的显示。
步骤6205为可选步骤,优选的,当激活第二显示屏上的控件区,并将相关控制键组显示在控件区中之后,可以隐藏控件区中的控制键组在第一显示屏上的显示,以节约第一显示屏的显示空间,扩大目标应用的主要显示界面或其他功能模块在第一显示区域中的显示面积。隐藏控件区中的控制键组在第一显示屏上的显示可以是在第一显示屏上不显示控件区中的控制键,也可以是将控件区中的控制键在第一显示屏中的显示折叠起来,也可以是淡化第一显示屏中显示的控件区中的控制键,例如,控制键变灰等。
在移除了应用控制栏中的控制键组在第一显示屏中的显示后,可以适应性的调整第一显示屏的显示内容。在一种实现方式中,当移除目标应用的控制键集合在第一显示屏上的显示后,可以增大目标应用的主显示界面或其他功能模块的显示内容的大小,例如:放大显示字体、放大显示图片等,并对第一显示屏上的布局进行适应性调整。该实现方式能够方便用户的浏览,提升用户体验。
在另一种实现方式中,当移除目标应用的控制键集合在第一显示屏上的显示后,可以增加目标应用的主显示界面中的显示内容,也可以在第一显示屏中增加部分之前未显示的功能模块,还可以增加第一显示屏上显示的功能模块中的未显示的显示内容,并对第一显示屏上的布局进行适应性调整。例如,可由应用程序定义多个包含不同控制键的,用于在第一显示屏中进行显示的布局方式,由系统根据应用控制栏中显示的控制键组,选择与之适配的应用程序在第一显示屏中的布局方式。当移除目标应用的控制键集合在第一显示屏上的显示后,增加第一显示屏中的显示内容,能够体现目标应用的更多细节内容或操作方式,为用户提供更加便捷的操作。当然,也可以同时增加上述三种显示内容中的一种或几种,还可以同时增加显示内容和放大显示内容。应当理解,当移除目标应用的控制键集合在第一显示屏上的显示后,其他有利于提升用户体验的,对第一显示屏上的内容布局进行改变的方式也是可能的。
步骤6206:关闭控件区。
可选的,在一种实现方式中,当用户暂时不需要使用控件区时,可通过多种方式关闭控件区。在一种实现方式中,控件区可与虚拟键盘相关联,默认在关闭虚拟键盘的同时,关闭控件区。此时,可通过关闭虚拟键盘的指令,同时关闭虚拟键盘和控件区,如图70A所示。在另一种实现方式中,如图70B所示,虚拟键盘上可设置有控件区的控制开关,当虚拟键盘处于开启状态时,可通过虚拟键盘上的控制开关,关闭控件区。在另一种实现方式中,如图70C所示,可通过手势控制,关闭控件区,即,在存储模块中存储有与关闭辅助显示区域相对应的手势,当检测到用户执行该手势时,关闭控件区。该控制手势可以为,例如,手指将控件区滑向显示屏边缘。在另一种实现方式中,控件区是否开启可与应用的显示模式相关联,可在关闭应用的全屏模式的同时,关闭控件区,将控件区的部分内容迁移回第一显示区域进行显示,如图70D所示。
当用户暂时不需要使用控件区时,暂时关闭控件区的显示,能够扩大第二显示屏上其他应用的显示面积,在不必要的情况下,减小控件区对第二显示屏上其他应用的干扰,简化用户操作界面。
在另一种实现方式中,可在第二显示屏中始终显示控件区的内容,因此,步骤6206是可选的。
屏幕显示方法6200通过在第二显示屏上显示控件区,控件区中包含与系统控制相关的控制键组和/或与用户对于目标应用的操作界面相关联的控制键组,使得用户能够通过第二显示屏中的控件区对系统或第一显示屏中的目标应用进行操作。在控件区协助下,用户不需要反复移动第一屏幕上的光标位置和反复定位操作对象或控制键的位置,极大的简化了用户操作。控件区显示在第二屏幕上,相比于第一屏幕,更靠近用户的双手,能够为用户提供更加便捷的操作。另外,在将相关控制键组显示在控件区后,移除其在第一显示屏中 的显示,能够节约第一显示屏中的显示面积。进而扩大第一显示屏中的显示内容,或增加第一显示屏中的显示内容,提升用户体验。
请参阅图71,在本发明一个实施例中,提供一种屏幕显示方法7100,屏幕显示方法用于根据用户对于目标应用的当前操作,改变控件区中的应用控制栏的显示内容。屏幕显示方法7100可包含如下步骤:
步骤7101:获取用户对于目标应用的操作。
在开启控件区之后,实时检测用户对于目标应用的当前操作,并根据用户对于目标应用的当前操作,改变在应用控制栏中显示的控制键组。
如步骤6202所述,在一种实现方式中,用户对于目标应用的当前操作可以是将目标应用的操作界面显示在第一显示屏中。当第一显示屏中仅显示目标应用的操作界面,而没有其他应用的操作界面时,可将应用控制栏中显示的控制键组全部替换为对应于目标应用的控制键组。当第一显示屏中同时包含目标应用的操作界面和其他应用的操作界面时,可将应用控制栏中显示的控制键组部分替换为对应于目标应用的控制键组,或者在应用控制栏中显示的控制键组的基础上增加对应于目标应用的控制键组。即,在应用控制栏中同时显示对应于包含目标应用在内的多个应用的控制键组。
如步骤6202所述,在另一种实现方式中,用户对于目标应用的当前操作可以是对于目标应用的操作界面的操作。具体的,可选的,若在用户执行对于目标应用的操作界面的操作前,应用控制栏中显示与用户对于目标应用的操作界面的上一操作相关联的快捷控制键组,则将该部分与上一操作对应的快捷控制键组替换为与当前操作对应的快捷控制键组。若在用户执行对于目标应用的操作界面的操作前,应用控制栏中仅显示与目标应用对应的控制键组,则可将与目标应用对应的控制键组替换为与用户当前操作相关联的快捷控制键组,或者在与目标应用对应的控制键组的基础上,增加与用户当前操作相关联的快捷控制键组。
步骤7101的具体实现方式与步骤6202相同,为避免重复,此处不再进行赘述。
步骤7102:根据用户操作改变应用控制栏的控制键组。
可选的,在一种实现方式中,根据用户操作改变应用控制栏的控制键组可以是,在应用控制栏中原有控制键组的基础上,增加部分与用户针对目标应用的当前操作相关的控制键组。例如,当用户仅开启目标应用,而没有针对目标应用执行操作时,应用控制栏中可不包含用户操作对应的快捷操作控制键组,即,应用控制栏的初始控制键组中仅包括对应于目标应用的控制键的集合,而不包含快捷操作控制键的集合。当用户针对目标应用执行第一操作时,可在应用控制栏中增加与用户操作对应的快捷操作控制键的集合,即,在应用控制栏中增加一组与用户的第一操作相关联的快捷操作控制键的集合。
可选的,在另一种实现方式中,根据用户操作改变应用控制栏的控制键组可以是,在应用控制栏中原有控制键组的基础上,减少部分与用户针对目标应用的当前操作无关的控制键组。例如,当用户的操作发生改变时,即当用户执行不同于第一操作的第二操作时,用户的第二操作所对应的快捷操作控制键的集合被用户的第一操作所对应的快捷操作控制键的集合所包含,且用户的第二操作所对应的快捷操作控制键少于用户的第一操作所对应的快捷操作控制键,此时,可根据用户针对目标应用的第二操作,减少应用控制栏中与第二操作无关的快捷操作控制键组。
可选的,在另一种实现方式中,根据用户操作改变应用控制栏的控制键组可以是,用新的控制键组部分或者全部替换应用控制栏上原本显示的控制键组。例如,当用户的操作发生改变时,即当用户执行不同于第一操作的第二操作时,若第二操作与第一操作的相关性较小,可将应用控制栏中与用户的第一操作相关联的快捷操作控制键组部分或者完全替换为一组与用户的第二操作相关联的快捷操作控制键组。
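上述几种改变方式(增加、减少、替换)可以统一为一个根据操作相关性更新控制键组的函数,以下为示意性的Python片段(实际实现中操作相关性的判断会更复杂,此处以布尔参数代替):

```python
def update_control_bar(current, new_keys, related):
    """根据用户的新操作更新应用控制栏中显示的快捷控制键组。

    related 为真: 新操作与原操作相关, 在原有控制键组的基础上
    增加新操作对应的控制键(去重, 保持原有顺序);
    related 为假: 新操作与原操作相关性较小, 用新操作对应的
    控制键组整体替换原有控制键组。
    """
    if related:
        return current + [k for k in new_keys if k not in current]
    return list(new_keys)
```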
可选的,若用户对目标应用的当前操作对应的快捷操作控制键较多,导致应用控制栏较为拥挤,或者无法完全显示所有快捷操作控制键时,例如,根据用户对目标应用的当前操作,需要在应用控制栏中增加一组控制键的集合,或者用于替换应用控制栏中原本显示的控制键组的一组控制键的集合的数量大于原本显示的控制键的数量,此时,可以适应性的增大应用控制栏和控件区的显示面积,以使得应用控制栏显示所有用户对目标应用的当前操作对应的快捷操作控制键。当需要在应用控制栏中显示更多的控制键时,适应性的增大应用控制栏的显示面积,能够优化应用控制栏的显示,避免应用控制栏中的控制键显示过小,为用户提供更好的操作体验。
可选的,若用户对目标应用的当前操作对应的快捷操作控制键较少,导致应用控制栏中存在空闲的显示区域时,例如,根据用户对目标应用的当前操作,需要在应用控制栏中减少一组控制键的集合,或者用于替换应用控制栏中原本显示的控制键组的一组控制键的集合的数量小于原本显示的控制键的数量,此时,可以适应性的减小应用控制栏和控件区的显示面积,以使得应用控制栏的显示面积与用户对目标应用的当前操作对应的快捷操作控制键相匹配。当需要在应用控制栏中显示较少的控制键时,适应性的减小应用控制栏的显示面积,能够优化应用控制栏的显示,避免应用控制栏中出现空闲的显示面积,节约第二显示屏的显示空间,也可以扩大第二显示屏上其他应用的显示面积,为用户提供更好的操作体验。
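应用控制栏的显示面积随控制键数量自适应增减、并限制在上下限之间的做法,可以概括为如下示意性的Python片段(单键面积与上下限均为说明用的假设值):

```python
def control_bar_area(n_keys, key_area=40, min_area=120, max_area=480):
    """按控制键数量自适应计算应用控制栏的显示面积(单位任意)。

    面积随控制键增多而增大, 并被限制在 [min_area, max_area] 内,
    以避免控制键显示过小或应用控制栏中出现空闲的显示区域。
    """
    return max(min_area, min(n_keys * key_area, max_area))
```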
步骤7103:隐藏应用控制栏显示的控制键组在第一屏幕中的显示。
优选的,当根据用户的操作改变应用控制栏中的控制键组后,可隐藏在应用控件区中进行显示的控制键在第一显示屏中的显示,具体实现方式如步骤6205所述。
屏幕显示方法7100中,根据用户对于目标应用的当前操作的变化,改变应用控制栏显示的控制键,使得应用控制栏上时时显示与用户针对目标应用的当前操作对应的控制键组,最大程度的满足用户的操作需求,为用户提供更加高效的操作,提高用户的操作效率。
需要注意的是,本发明实施例不限制步骤7101—7103的执行次数,即可以多次获取用户针对目标应用的当前操作的变化,多次改变应用控制栏上显示的控制键组。
需要注意的是,可选的,除步骤7101—7103之外,屏幕显示方法7100还可以包含屏幕显示方法6200中的一个或多个步骤,其具体实现方式如屏幕显示方法6200所述,为避免重复此处不再进行赘述。
请参阅图72,在本发明一个实施例中,提供一种屏幕显示方法7200,屏幕显示方法用于根据用户改变应用控制栏的显示面积的操作,改变应用控制栏的显示面积及应用控制栏 中的控制键组。屏幕显示方法7200可包含如下步骤:
步骤7201:获取用户指示改变应用控制栏的显示面积的操作。
用户在终端的使用过程中的需求往往是实时变化的,有些情况下,用户可能希望控件区的应用控制栏中显示更多的控制键,以更好的辅助用户操作,例如,当用户在对目标应用进行较为复杂的操作时。此时,扩大应用控制栏的显示面积能够为用户提供更多的控制键,提高用户的操作效率。另一些情况下,用户可能希望控件区的应用控制栏显示相对较少的控制键,例如,当用户在第二显示屏上同时执行其他操作,而用户希望给其他应用更大的显示界面时,或者当用户在对目标应用进行较为简单的操作时。此时,缩小应用控制栏的显示面积能够节约第二显示屏上的显示空间,且通过减少应用控制栏上的控制键,能够使用户更加简单、快速的定位到所需的控制键,提高用户操作效率,提升用户体验。
用户可以通过多种方式实现改变应用控制栏的显示面积的目的。在一种实现方式中,用户可通过改变控件区的显示面积,间接改变应用控制栏的显示面积,例如,如图73A所示,可通过控件区上的放大按钮扩大控件区的显示面积,进而间接扩大应用控制栏的显示面积。可通过控件区上的缩小按钮缩小控件区的面积,进而间接缩小应用控制栏的显示面积。另外,如图73B所示,可通过放大手势扩大控件区的显示面积,进而间接扩大应用控制栏的显示面积。如图73C所示,可通过缩小手势缩小控件区的显示面积,进而间接缩小应用控制栏的显示面积。在另一种实现方式中,用户可通过改变第二显示屏上其他应用的显示面积,间接改变控件区的显示面积,进而改变应用控制栏的显示面积。例如,如图74A所示,可通过第二显示屏上其他应用上的放大按钮扩大其他应用的显示面积,间接缩小控件区的显示面积,进而间接缩小应用控制栏的显示面积。可通过第二显示屏上其他应用上的缩小按钮缩小其他应用的显示面积,间接扩大控件区的显示面积,进而间接扩大应用控制栏的显示面积。另外,如图74B所示,用户可通过缩小手势缩小第二显示屏上其他应用界面,扩大控件区的显示面积,进而扩大应用控制栏的显示面积。如图74C所示,用户可通过放大手势扩大第二显示屏上其他应用界面,缩小控件区的显示面积,进而缩小应用控制栏的显示面积。在另一种实现方式中,用户可直接对应用控制栏进行操作,改变应用控制栏的显示面积。例如,如图75A所示,可通过应用控制栏上的放大按钮扩大应用控制栏的显示面积。可通过应用控制栏上的缩小按钮缩小应用控制栏的显示面积。另外,如图75B所示,用户可通过扩大手势扩大应用控制栏的显示面积,如图75C所示,用户可通过缩小手势缩小应用控制栏的显示面积。
在另一种实现方式中,也可以根据用户对于第一显示屏上的应用的操作来改变应用控制栏的显示面积。具体的,当用户当前操作对应的控制键的数量多于用户的前一操作对应的控制键的数量时,为了在应用控制栏中显示全部控制键,并且保证应用控制栏中控制键的显示效果,可以适当增大应用控制栏的显示面积。例如,用户的前一操作为开启某个应用,此时应用控制栏中可显示对应于该应用的初始控制键,用户的当前操作为对于该目标应用的界面执行的操作,此时,可在应用控制栏中增加对于用户当前操作的控制键,此时可适当增大应用控制栏的显示面积。当用户当前操作对应的控制键的数量少于用户的前一操作对应的控制键的数量时,为了节约第二显示屏的显示面积,可以适当减小应用控制栏的显示面积。
在另一种实现方式中,控件区的显示面积和位置可灵活适配与第二显示屏上其他功能模块显示的变化,与此同时,应用控制栏的显示面积也会随着控件区显示面积和位置的变化进行调整。例如,当用户通过不同手势触发不同类型的虚拟键盘时,可根据不同类型的虚拟键盘的显示区域,灵活确定控件区的显示面积和位置。根据不同手势触发不同类型的虚拟键盘的具体实现方式见实施例二,此处不再进行赘述。
在另一种实现方式中,当用户指示开启目标应用的手写输入模式时,在第二显示屏上显示该目标应用的界面,使得用户可以通过第二显示屏执行手写输入,可选的,在一种实现方式中,此时可在第二显示屏中显示应用控制栏,应用控制栏中显示与手写输入方式相关联的控制键。在另一种实现方式中,由于此时已经在第二显示屏幕中显示了目标应用的界面,可以不在第二显示屏幕上显示应用控制栏。手写输入模式与虚拟键盘输入模式之间进行切换的具体实现方式见实施例三,此处不再进行赘述。
在另一种实现方式中,当用户将输入方式切换为手写输入方式时,可以在应用控制栏中显示手写输入区域,也可以在应用控制栏中显示与手写输入方式相关联的控制键组,例如:笔、橡皮擦、颜色、字体等,还可以在应用控制栏中同时显示手写输入区域和与手写输入方式相关联的控制键组,使得用户可以通过应用控制栏执行手写输入,和/或通过应用控制栏对手写输入方式进行操作,提高操作效率。与手写输入模式的切换相关的具体实现方式见实施例三,此处不再进行赘述。
步骤7202:根据用户操作改变应用控制栏的显示面积和控制键集合。
当用户操作指示扩大应用控制栏的显示面积时,按照用户操作的程度,扩大应用控制栏的显示面积。例如,当用户通过点击放大按钮扩大显示面积时,可以根据用户的点击次数,确定将应用控制栏的显示面积放大到何种程度。当用户通过放大手势扩大显示面积时,可以根据用户的放大手势的程度,确定将应用控制栏的显示面积放大到何种程度。
如图76A所示,在扩大应用控制栏的显示面积的同时,可增加应用控制栏中对应于目标应用的控制键组中的控制键。可选的,在一种实现方式中,在扩大应用控制栏的显示面积的同时,可增加应用控制栏中原有功能模块中的控制键。在另一种实现方式中,在扩大应用控制栏的显示面积的同时,可在应用控制栏中增加新的功能模块及其对应的控制键的集合。在另一种实现方式中,在扩大应用控制栏的显示面积的同时,可同时增加应用控制栏中原有功能模块中的控制键和增加新的功能模块及其对应的控制键的集合。
如步骤6202所述,在一种实现方式中,系统可根据各个功能模块和各个控制键的优先级,按照优先级从高到低的顺序,在应用控制栏显示的控制键集合的基础上,增加部分的控制键,并确定增加部分控制键后应用控制栏的布局。优选的,当扩大应用控制栏的显示面积时,如图76A所示,可以将原本显示在应用控制栏中的控制键组向下移动,将新增加显示的控制键组显示在原本显示在应用控制栏中的控制键组的上方,即,在扩大显示面积后的应用控制栏中,新增加显示的控制键组相对于原本显示在应用控制栏中的控制键组,更靠近第一显示屏幕。在该实现方式中,新增加显示的控制键组的优先级低于原本显示在应用控制栏中的控制键组,通过上述设置,能够在扩大应用控制栏的显示面积时,始终将 优先级较高的控制键(功能更重要或者用户的使用频率较高的控制键)设置在距离用户双手更近的位置,为用户提供更加便捷的操作,提高用户的操作效率。
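“原有控制键组整体下移、新增控制键组显示在其上方(更靠近第一显示屏)”的布局更新,可用如下示意性的Python片段表达(以行列表表示布局,rows[0] 为最靠近第一显示屏的一行,均为说明用的假设):

```python
def expand_layout(rows, new_group):
    """扩大应用控制栏显示面积时更新布局:
    原本显示的各行控制键整体下移(更靠近用户双手),
    新增的控制键组插入到最上方, 即更靠近第一显示屏的一侧。"""
    return [list(new_group)] + [list(row) for row in rows]
```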
在另一种实现方式中,系统可根据应用控制栏的显示面积选择应用程序提供的对应于该显示面积的应用控制栏的显示方式,并在控件区中进行显示。
优选的,当扩大应用控制栏的显示面积使得应用控制栏中对应于目标应用的控制键增多时,可以隐藏增加的控制键在第一显示屏中的显示,具体实现方式和有益效果如步骤6205所述。
当用户操作指示缩小应用控制栏的显示面积时,按照用户操作的程度,缩小应用控制栏的显示面积。例如,当用户通过点击缩小按钮缩小显示面积时,可以根据用户的点击次数,确定将应用控制栏的显示面积缩小到何种程度。当用户通过缩小手势缩小显示面积时,可以根据用户的缩小手势的程度,确定将应用控制栏的显示面积缩小到何种程度。
如图76B所示,在缩小应用控制栏的显示面积的同时,可减少应用控制栏中对应于目标应用的控制键组中的控制键。可选的,在一种实现方式中,在缩小应用控制栏的显示面积的同时,可保持应用控制栏中功能模块的数量不变,减少功能模块中的控制键的数量。在另一种实现方式中,在缩小应用控制栏的显示面积的同时,可减少应用控制栏的功能模块及其对应的控制键的集合。在另一种实现方式中,在缩小应用控制栏的显示面积的同时,可同时减少应用控制栏中功能模块及其对应的控制键的集合和其他保留功能模块中的控制键的数量。
如步骤6202所述,在一种实现方式中,系统可根据各个功能模块和各个控制键的优先级,按照优先级从低到高的顺序,在应用控制栏显示的控制键集合的基础上,减少部分的控制键,并确定减少部分控制键后应用控制栏的布局。在另一种实现方式中,系统可根据应用控制栏的显示面积选择应用程序提供的应用控制栏的显示方式,并在控件区中进行显示。
优选的,当缩小应用控制栏的显示面积使得应用控制栏中对应于目标应用的控制键减少时,可以还原减少的控制键在第一显示屏中的显示,以使得用户在需要使用这些控制键时,能够通过传统的方式,通过触屏手势或鼠标操作通过第一显示屏进行操作。
屏幕显示方法7200中,根据用户指示改变目标应用控制栏的显示面积的操作,改变应用控制栏的显示面积及应用控制栏中的控制键组,能够使得控件区的显示更加灵活,且满足用户在不同使用场景下的不同的使用需求,提升用户体验。
另外,在对应用控制栏的显示面积进行调整时,可适应性的调整控件区其他显示区域(系统控制栏等)的显示面积或第二显示屏上其他显示模块(其他应用的显示界面、虚拟键盘等)的显示布局。
另外,为了方便用户在不将视线移动到第二显示屏上,就能快速定位想要定位的控制键,尤其是在控件区的显示面积和显示键发生变化的情况下,仍然能快速定位到想要定位的控制键,可以在控件区中加入锚定点反馈技术。在一种实现方式中,可以当用户接触应用控制栏中的控制键时,为用户提供反馈,表明用户此时接触的是应用控制栏中的控制键。在另一种实现方式中,可以当用户接触系统控制栏中的控制键时,为用户提供反馈,表明 用户此时接触的是系统控制栏中的控制键。在另一种可能的实现方式中,可以将应用控制栏或者系统控制栏中一些功能较为重要、或者用户的使用频率较高的控制键设置为锚定点反馈按键,使得用户快速定位这些重要的、或者使用频率较高的控制键。锚定点反馈的具体实现方式见实施例一,此处不再进行赘述。
需要注意的是,本发明实施例不限制步骤7201和步骤7202的执行次数,用户可以多次执行改变应用控制栏的显示面积的操作,系统可以实时获取用户指示改变应用控制栏的显示面积的操作,根据用户操作,多次改变应用控制栏的显示面积和应用控制栏中的控制键组。
需要注意的是,可选的,除步骤7201和步骤7202之外,屏幕显示方法7200还可以包含屏幕显示方法6200中的一个或多个步骤,其具体实现方式如屏幕显示方法6200所述,为避免重复此处不再进行赘述。
第二显示屏上显示的控件区具备输出的功能,即作为人机交互界面,将目标应用的部分控制键集合显示给用户。除此之外,控件区还可以具有一些输入功能,例如触屏手势功能,接收用户的输入,进而对目标应用进行一些操作或者对控件区本身进行一些操作。
在一种实现方式中,控件区可接收用户的输入,以实现对目标应用功能的控制。例如,当目标应用主要用于对文档进行编辑时,目标应用对应的控件区中的控制键集合中可能包含对文字内容进行处理的控制键,如复制、粘贴、剪切等。此时,可通过触屏手势点击控件区中的控制键或者鼠标选择控件区中的控制键实现对文字内容的编辑。当目标应用主要用于播放视频内容时,目标应用对应的控件区中的控制键集合中可能包含对视频内容进行控制的控制键,如音量控制键、亮度控制键、进度控制条等。此时,可通过触屏手势点击控件区中的控制键或者鼠标选择相应的控制键对视频的音量、亮度、播放进度等进行控制。应当理解,上述目标应用及对目标应用进行操作的控制键均是示例性的,本领域常见的其他目标应用及其常用的控制键也是可能的。
In another implementation, the user may operate the target application on the first display screen through the control key set of the control region together with other input means. For example, the user may select a specific object in the editing page with the mouse or with a touch gesture on the first display screen, and then edit the selected object with the control key set of the control region. It should be understood that this cooperative control between the control key set of the control region and the mouse or touch gestures is merely exemplary; other possible cooperation modes that can operate the target application on the first display screen are also possible.
In one embodiment of the present invention, the user may view, edit, and customize the control key set of the control region. In one implementation, the control region may support the following touch gesture operations by the user:
1) Touch gesture operations
In one implementation, the control region may be operated with a drag gesture. For example, a drag gesture may be used to drag a control key from one position in the control region to another, as shown in FIG. 77A. A drag gesture may also be used to drag an entire function module to another position in the control region, or to move the whole control region within the second display screen, as shown in FIG. 77B.
In another implementation, the control region may be operated with a slide gesture, for example to browse the displayed contents of the control region. For instance, when a function module cannot display all of its control keys because of the limited display area of the control region, the undisplayed control keys of that module can be browsed with a slide gesture, as shown in FIG. 77C.
In another implementation, the control region may be operated with a flick gesture, for example to remove certain contents from the control region, as shown in FIG. 77D.
2) Firm-press gesture operations
The control region may be operated with a firm-press gesture, and different functions may be performed depending on where in the control region the gesture is received. As shown in FIG. 78A, in one implementation, if a firm press is received on a control key, a delete button for that key may be displayed, through which the key can be deleted. After a key is deleted, its position may be shown as vacant together with an add button, through which the user can add a new control key at that position. As shown in FIG. 78B, if a firm press is received on the boundary between the regions of two function modules, the edge-moving function for the two modules divided by that boundary may be triggered: by dragging the boundary line, the user changes the display areas of the two function modules. Specifically, one module's display area grows while the other's shrinks; according to the priority order of the control keys in the two modules, control keys displayed in the growing module are added and control keys displayed in the shrinking module are removed.
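The boundary-drag behavior of FIG. 78B can be sketched as follows, assuming for simplicity that every key is one unit wide; the key names and priorities are illustrative assumptions:

```python
def fit(keys, width):
    """Keep the `width` highest-priority keys, preserving layout order.
    Keys are (name, priority) pairs."""
    kept = set(sorted(keys, key=lambda k: k[1], reverse=True)[:width])
    return [name for name, prio in keys if (name, prio) in kept]

def redistribute(total, boundary, left_keys, right_keys):
    """Drag the module boundary to `boundary` units from the left edge:
    the left module gets `boundary` slots, the right gets the remainder."""
    return fit(left_keys, boundary), fit(right_keys, total - boundary)

left = [("copy", 3), ("paste", 3), ("cut", 2)]
right = [("bold", 2), ("italic", 1)]
# Dragging the boundary right grows the left module and shrinks the right:
print(redistribute(4, 3, left, right))
```

As the boundary moves, keys appear in the growing module and disappear from the shrinking one in priority order, matching the description above.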
3) Hover gesture operations
The control region may be operated with a hover gesture, which may be used for preview operations. For example, as shown in FIG. 79, a hover gesture may be used to view the name of the current control key, auxiliary hints, and similar content. A hover gesture may also be used to preview control keys that are currently not displayed in the control region because of the limited display area.
It should be understood that the touch gesture operations listed above are merely exemplary; other gesture operations common in the art are also possible.
On the basis of the embodiments corresponding to FIG. 60 to FIG. 79, a specific embodiment is provided below to better illustrate the solutions and beneficial effects of the embodiments of this application.
Taking the use of a note-taking application on a dual-screen electronic device as an example, FIG. 80A shows the normal display state: all content related to the note application is displayed on the first display screen, including, for example, the main display area of the note content and function menu bars such as list navigation and a fixed menu bar. In the normal display state, the user can operate the note application in the conventional way, for example controlling it through the first display screen with the mouse or touch gestures.
When the user needs to enable the control region on the second display screen to assist operation, the control region can be activated in the following four ways:
1) If the virtual keyboard is closed, the virtual keyboard and the control region may be opened together upon receiving the user's instruction to open the virtual keyboard.
2) If the virtual keyboard is open, the user's instruction to open the application control bar may be received through the control region's control button on the virtual keyboard, opening the control region.
3) The control region may be opened upon receiving the user's gesture for opening the control region.
4) When the note application is not displayed in full-screen mode, the control region may be opened upon receiving the user's instruction to display the note application full-screen.
When the system receives the user's instruction to activate the control region, following the implementation of the method embodiment corresponding to screen display method 6200, it displays the corresponding system control key group and the control key group of the target application in the control region on the second display screen according to the control region's display area, and correspondingly shrinks the display areas of other applications on the second display screen. For example, as shown in FIG. 80B, when the initial display area of the control region is at its minimum, the control region displays only the system control key group related to system control in the system control bar, and part of the control key group of the target application in the application control bar.
When an operation changing the display area of the application control bar is received, for example, as shown in FIG. 80C, when the user enlarges the display area of the application control bar with a zoom-in gesture, the system enlarges the display areas of the control region and the application control bar according to the user's operation and adds to the application control bar a function module of the note application together with its control key group, while removing the corresponding function menu bar originally displayed on the first display screen. As shown in FIG. 80D, when the user further enlarges the application control bar with a zoom-in gesture, the system further enlarges the display areas of the control region and the application control bar, adds another function module of the note application and its control key group to the application control bar, and likewise removes the corresponding function menu bar originally displayed on the first display screen.
When the user performs an operation on the target application, for example selecting some text in its operation interface as shown in FIG. 80E, the system obtains the user's current operation on the target application and, according to it, displays in the application control bar the control key group corresponding to that operation, for example copy, paste, and cut. When the user's current operation on the target application changes, for example to selecting a picture in the operation interface, the system obtains the new operation and replaces the control key group in the application control bar corresponding to the previous operation with the one corresponding to the new operation.
The user can operate the target application on the first display screen through the control keys of the control region on the second display screen. For example, as shown in FIG. 80F, the user can tap control keys in a function module to choose which note of the note application to browse, or tap control keys in a function module to edit the currently displayed note. The user can also operate the application control bar itself: for example, the user can customize and edit the displayed content of the application control bar, or view the names, functions, or other descriptions of the control keys in the control region with a hover gesture, as shown in FIG. 80G.
When the user no longer needs the control region to assist operation, the control region can be closed in the following four ways:
1) If the virtual keyboard is open, the virtual keyboard and the control region may be closed together upon receiving the user's instruction to close the virtual keyboard.
2) If the virtual keyboard is open, the user's instruction to close the application control bar may be received through the control region's control button on the virtual keyboard, closing the control region.
3) The control region may be closed upon receiving the user's gesture for closing the control region.
4) When the note application is displayed in full-screen mode, the control region may be closed upon receiving the user's instruction to exit full-screen display.
It can be understood that, to implement the above functions, the electronic device includes corresponding hardware and/or software modules for performing each function. In combination with the method steps of the examples described in the embodiments disclosed herein, this application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of this application.
In this embodiment, the electronic device may be divided into function modules according to the above method examples. For example, each function may be assigned its own function module, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware. It should be noted that the division of modules in this embodiment is illustrative and is merely a logical function division; other division manners are possible in actual implementation.
In the case where each function is assigned its own function module, FIG. 81 shows a possible composition of the electronic device involved in the above method embodiments. As shown in FIG. 81, the electronic device 8100 may include a first display screen 8101, a second display screen 8102, an input module 8103, and a processing module 8104.
The first display screen 8101 may be used to support the electronic device 8100 in displaying the target application interface, and/or in other processes of the techniques described herein.
In a dual-screen electronic device, the first display screen usually carries the output function, that is, displaying content such as the state of the target application and the results of user operations. Optionally, the content displayed on the first display screen may include the main display interface of the target application and some function menu bars. In addition, the first display screen may also carry an input function: it can be operated with touch gestures to implement input.
The second display screen 8102 may be used to support the electronic device 8100 in displaying the control region, and/or in other processes of the techniques described herein.
In a dual-screen electronic device, in one implementation the second display screen may carry the input function, that is, receiving the user's input; in another implementation it may also carry the output function, displaying content such as the state of the target application and the results of user operations. The electronic device 8100 displays the control region on the second display screen so that the user can control the target application on the first display screen through the control region on the second display screen, improving operating efficiency and user experience.
The input module 8103 may be used to support the electronic device 8100 in performing step 6202 of screen display method 6200, step 7101 of screen display method 7100, and step 7201 of screen display method 7200, and/or in other processes of the techniques described herein.
Specifically, in steps 6202 and 7101, the input module receives the user's operation on the target application. In one implementation, the user operates the target application with a mouse, in which case the input module may be the mouse. In another implementation, the user operates the target application with touch gestures, in which case the input module may be the first display screen. In another implementation, the user operates the target application with air gestures, in which case the input module may be, for example, a depth camera for capturing gesture information. In step 7201, the input module receives the user's operation of changing the display area of the application control bar. In one implementation, the user performs this operation with touch gestures on the second display screen, in which case the input module may be the second display screen. In another implementation, the user changes the display area with the mouse, in which case the input module may be the mouse. In another implementation, the user changes the display area with air gestures, in which case the input module may be, for example, a depth camera for capturing gesture information.
The processing module 8104 may be used to support the electronic device 8100 in performing steps 6201, 6203, 6204, 6205, and 6206 of screen display method 6200, steps 7102 and 7103 of screen display method 7100, and step 7202 of screen display method 7200, and/or in other processes of the techniques described herein.
The processing module may be a processor or a controller. It may implement or execute the various exemplary logical blocks, modules, and circuits described in connection with the disclosure of this application. The processor may also be a combination implementing computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor.
It should be noted that the information exchange and execution processes between the modules/units of the electronic device 8100 are based on the same concept as the method embodiments of this application corresponding to FIG. 60 to FIG. 79; for details, refer to the descriptions in the foregoing method embodiments of this application, which are not repeated here.
By way of example, FIG. 82 shows a schematic structural diagram of an electronic device 8200. The electronic device 8200 may specifically be a dual-screen electronic device, for example a laptop computer with two display screens, a curved screen, or a flexible folding screen, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); it may also be two electronic devices connected together and used synchronously, such as two tablets or two mobile phones. The embodiments of this application do not limit the specific type of electronic device having two display screens.
The electronic device 8200 may include a processor 8210, an external memory interface 8220, an internal memory 8221, a universal serial bus (USB) interface 8230, a charging management module 8240, a power management module 8241, a battery 8242, antenna 1, antenna 2, a mobile communication module 8250, a wireless communication module 8260, an audio module 8270, a speaker 8270A, a receiver 8270B, a microphone 8270C, a headset jack 8270D, a sensor module 8280, keys 8290, a motor 8291, an indicator 8292, a camera 8293, a display screen 8294, a subscriber identification module (SIM) card interface 8295, and the like. The sensor module 8280 may include a pressure sensor 8280A, a gyroscope sensor 8280B, a barometric pressure sensor 8280C, a magnetic sensor 8280D, an acceleration sensor 8280E, a distance sensor 8280F, a proximity light sensor 8280G, a fingerprint sensor 8280H, a temperature sensor 8280J, a touch sensor 8280K, an ambient light sensor 8280L, a bone conduction sensor 8280M, and the like.
It can be understood that the structure illustrated in the embodiments of this application does not constitute a specific limitation on the electronic device 8200. In other embodiments of this application, the electronic device 8200 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 8210 may include one or more processing units. For example, the processor 8210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. Different processing units may be independent devices or may be integrated into one or more processors.
The controller may be the nerve center and command center of the electronic device 8200. The controller may generate operation control signals according to instruction operation codes and timing signals, and control instruction fetching and execution. A memory may also be provided in the processor 8210 for storing instructions and data. In some embodiments, the memory in the processor 8210 is a cache. This memory may hold instructions or data that the processor 8210 has just used or uses cyclically. If the processor 8210 needs to use the instructions or data again, they can be called directly from this memory, avoiding repeated access, reducing the waiting time of the processor 8210, and thus improving system efficiency.
In some embodiments, the processor 8210 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The power management module 8241 is configured to connect the battery 8242 and the charging management module 8240 to the processor 8210. The power management module 8241 receives input from the battery 8242 and/or the charging management module 8240 and supplies power to the processor 8210, the internal memory 8221, the external memory, the display screen 8294, the camera 8293, the wireless communication module 8260, and the like. The power management module 8241 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 8241 may be provided in the processor 8210. In other embodiments, the power management module 8241 and the charging management module 8240 may be provided in the same device.
The electronic device 8200 implements the display function through the GPU, the display screen 8294, the application processor, and so on. The GPU is a microprocessor for image processing that connects the display screen 8294 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 8210 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 8294 is used to display images, videos, and the like. The display screen 8294 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and so on. In the embodiments of the present invention, the display screen 8294 is divided into a first display screen and a second display screen, and the first or second display screen may have an input function, for example being controlled with touch gestures.
The external memory interface 8220 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capacity of the electronic device 8200. The external memory card communicates with the processor 8210 through the external memory interface 8220 to implement the data storage function, for example saving files such as music and videos on the external memory card.
The internal memory 8221 may be used to store computer-executable program code, where the executable program code includes instructions. By running the instructions stored in the internal memory 8221, the processor 8210 executes the various function applications and data processing of the electronic device 8200. The internal memory 8221 may include a program storage area and a data storage area. The program storage area may store the operating system and applications required for one or more functions (such as a sound playback function or an image playback function). The data storage area may store data created during use of the electronic device 8200 (such as audio data and a phone book). In addition, the internal memory 8221 may include a high-speed random access memory and may also include a nonvolatile memory, such as one or more magnetic disk storage devices, flash memory devices, or universal flash storage (UFS).
This embodiment also provides a computer storage medium storing computer instructions which, when run on an electronic device, cause the electronic device to execute the above related method steps to implement the screen display method in the above embodiments.
This embodiment also provides a computer program product which, when run on a computer, causes the computer to execute the above related steps to implement the screen display method in the above embodiments.
In addition, an embodiment of this application further provides an apparatus, which may specifically be a chip, component, or module. The apparatus may include a processor and a memory connected to each other, where the memory is used to store computer-executable instructions; when the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the screen display method in each of the above method embodiments.
The electronic device, computer storage medium, computer program product, and chip provided in this embodiment are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
Through the description of the above implementations, a person skilled in the art can understand that, for convenience and brevity of description, only the division of the above function modules is used as an example. In practical applications, the above functions may be allocated to different function modules as needed; that is, the internal structure of the apparatus may be divided into different function modules to complete all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division into modules or units is only a logical function division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another apparatus, and some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may be one physical unit or multiple physical units, that is, located in one place or distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of this embodiment.
In addition, the function units in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in hardware or as a software function unit.
If the integrated unit is implemented as a software function unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods in the embodiments of this application. The aforementioned storage medium includes media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above is merely the specific implementation of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (19)

  1. A feedback method, wherein the method is applied to an electronic device, the electronic device is provided with a touch screen, a plurality of vibration feedback elements are disposed in the touch screen, and the method comprises:
    detecting a first contact operation acting on the touch screen;
    obtaining, in response to the first contact operation, first position information of a first contact point corresponding to the first contact operation, the first position information corresponding to a first virtual key on a virtual keyboard;
    obtaining, when the first virtual key is an anchor-point key, a first vibration feedback element from the plurality of vibration feedback elements, the first vibration feedback element being a vibration feedback element matching the first virtual key; and
    instructing the first vibration feedback element to emit a vibration wave to perform a first feedback operation, the first feedback operation being used to indicate that the first virtual key is an anchor-point key.
  2. The method according to claim 1, wherein the electronic device is configured with a first mapping relationship, the first mapping relationship indicating a correspondence between virtual keys and vibration feedback elements, and the obtaining a first vibration feedback element from the plurality of vibration feedback elements comprises:
    obtaining the first vibration feedback element according to the first mapping relationship and the first virtual key.
  3. The method according to claim 2, wherein the electronic device is configured with a first mapping relationship, the first mapping relationship indicating a correspondence between position information and vibration feedback elements, and the obtaining a first vibration feedback element from the plurality of vibration feedback elements comprises:
    obtaining the first vibration feedback element according to the first mapping relationship and the first position information.
  4. The method according to any one of claims 1 to 3, wherein before the emitting a vibration wave through the first vibration feedback element, the method further comprises:
    obtaining a vibration intensity of a vibration wave corresponding to each first vibration feedback element among at least one first vibration feedback element, the vibration intensity of the vibration wave of each first vibration feedback element among the at least one first vibration feedback element being related to a first quantity, the first quantity being the quantity of the first vibration feedback elements; and
    the emitting a vibration wave through the first vibration feedback element comprises:
    emitting vibration waves through the at least one first vibration feedback element according to the vibration intensity of the vibration wave corresponding to each first vibration feedback element, so that a difference between an intensity of vibration feedback corresponding to the first virtual key and an intensity of vibration feedback corresponding to a second virtual key is within a preset intensity range, the second virtual key and the first virtual key being different virtual keys.
  5. The method according to any one of claims 1 to 3, wherein the first vibration feedback element is any one of: a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film.
  6. The method according to any one of claims 1 to 3, wherein before the performing a first feedback operation, the method further comprises:
    obtaining, according to the first position information, a position type corresponding to the first contact point, the position type including the first contact point being located in a first position region of the first virtual key and the first contact point being located in a second position region of the first virtual key, the first position region and the second position region being different; and
    the performing a first feedback operation comprises:
    performing the first feedback operation through the touch screen according to the position type corresponding to the first contact point, a feedback operation corresponding to the first position region being different from a feedback operation corresponding to the second position region.
  7. The method according to any one of claims 1 to 3, wherein before the detecting a first contact operation acting on the touch screen, the method further comprises:
    selecting, in response to a detected first gesture operation, a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, wherein virtual keys included in different types of virtual keyboards among the plurality of types of virtual keyboards are not entirely identical; and
    displaying the first type of virtual keyboard through the touch screen, a position of the first type of virtual keyboard on the touch screen being fixed during the display of the first type of virtual keyboard; and
    the detecting a first contact operation acting on the touch screen comprises:
    detecting, during the display of the first type of virtual keyboard, the first contact operation acting on the touch screen.
  8. An electronic device, wherein the electronic device is provided with a touch screen, the touch screen comprises a contact sensing module and a vibration feedback module, and the vibration feedback module comprises a plurality of vibration feedback elements;
    the contact sensing module is configured to obtain first position information of a first contact point on the touch screen; and
    a first vibration feedback element is configured to emit a vibration wave when a first virtual key corresponding to the first contact point is an anchor-point key, the vibration wave being used to indicate that the first virtual key is an anchor-point key;
    wherein the first virtual key is a virtual key in a virtual keyboard, and the first vibration feedback element is a vibration feedback element among the plurality of vibration feedback elements that matches the first virtual key.
  9. The device according to claim 8, wherein:
    the touch screen further comprises a cover plate and an ultrasonic module, the ultrasonic module being configured to emit ultrasonic waves to change a tactile characteristic of the cover plate; or
    the touch screen further comprises a cover plate and an electrostatic module, the electrostatic module being configured to generate an electrical signal to change a tactile characteristic of the cover plate;
    wherein the tactile characteristic includes any one or more of the following characteristics: a sliding friction coefficient, stick-slip, and temperature.
  10. The device according to claim 8 or 9, wherein the touch screen further comprises a pressure sensing module, the pressure sensing module and the vibration feedback module are integrated as one, and the vibration feedback element is a piezoelectric ceramic sheet, a piezoelectric polymer, or a piezoelectric composite material.
  11. An electronic device, comprising a touch screen, a memory, one or more processors, and one or more programs, wherein a plurality of vibration feedback elements are disposed in the touch screen, the one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the electronic device is caused to perform the following steps:
    detecting a first contact operation acting on the touch screen;
    obtaining, in response to the first contact operation, first position information of a first contact point corresponding to the first contact operation, the first position information corresponding to a first virtual key on a virtual keyboard;
    obtaining, when the first virtual key is an anchor-point key, a first vibration feedback element from the plurality of vibration feedback elements, the first vibration feedback element being a vibration feedback element matching the first virtual key; and
    instructing the first vibration feedback element to emit a vibration wave to perform a first feedback operation, the first feedback operation being used to indicate that the first virtual key is an anchor-point key.
  12. The electronic device according to claim 11, wherein the electronic device is configured with a first mapping relationship, the first mapping relationship indicating a correspondence between virtual keys and vibration feedback elements, and when the one or more processors execute the one or more programs, the electronic device is caused to specifically perform the following step:
    obtaining the first vibration feedback element according to the first mapping relationship and the first virtual key.
  13. The electronic device according to claim 11, wherein the electronic device is configured with a first mapping relationship, the first mapping relationship indicating a correspondence between position information and vibration feedback elements, and when the one or more processors execute the one or more programs, the electronic device is caused to specifically perform the following step:
    obtaining the first vibration feedback element according to the first mapping relationship and the first position information.
  14. The electronic device according to any one of claims 11 to 13, wherein when the one or more processors execute the one or more programs, the electronic device is caused to further perform the following step:
    obtaining a vibration intensity of a vibration wave corresponding to each first vibration feedback element among at least one first vibration feedback element, the vibration intensity of the vibration wave of each first vibration feedback element among the at least one first vibration feedback element being related to a first quantity, the first quantity being the quantity of the first vibration feedback elements; and
    when the one or more processors execute the one or more programs, the electronic device is caused to specifically perform the following step:
    emitting vibration waves through the at least one first vibration feedback element according to the vibration intensity of the vibration wave corresponding to each first vibration feedback element, so that a difference between an intensity of vibration feedback corresponding to the first virtual key and an intensity of vibration feedback corresponding to a second virtual key is within a preset intensity range, the second virtual key and the first virtual key being different virtual keys.
  15. The electronic device according to any one of claims 11 to 13, wherein the first vibration feedback element is any one of: a piezoelectric ceramic sheet, a linear motor, or a piezoelectric film.
  16. The electronic device according to any one of claims 11 to 13, wherein when the one or more processors execute the one or more programs, the electronic device is caused to further perform the following step:
    obtaining, according to the first position information, a position type corresponding to the first contact point, the position type including the first contact point being located in a first position region of the first virtual key and the first contact point being located in a second position region of the first virtual key, the first position region and the second position region being different; and
    when the one or more processors execute the one or more programs, the electronic device is caused to specifically perform the following step:
    performing the first feedback operation through the touch screen according to the position type corresponding to the first contact point, a feedback operation corresponding to the first position region being different from a feedback operation corresponding to the second position region.
  17. The electronic device according to any one of claims 11 to 13, wherein when the one or more processors execute the one or more programs, the electronic device is caused to further perform the following steps:
    selecting, in response to a detected first gesture operation, a first type of virtual keyboard corresponding to the first gesture operation from a plurality of types of virtual keyboards, wherein virtual keys included in different types of virtual keyboards among the plurality of types of virtual keyboards are not entirely identical; and
    displaying the first type of virtual keyboard through the touch screen, a position of the first type of virtual keyboard on the touch screen being fixed during the display of the first type of virtual keyboard; and
    when the one or more processors execute the one or more programs, the electronic device is caused to specifically perform the following step:
    detecting, during the display of the first type of virtual keyboard, the first contact operation acting on the touch screen.
  18. A computer program product, wherein the computer program product comprises instructions which, when loaded and executed by an electronic device, cause the electronic device to perform the method according to any one of claims 1 to 7.
  19. An electronic device, comprising a processor, wherein the processor is coupled to a memory, the memory stores program instructions, and when the program instructions stored in the memory are executed by the processor, the method according to any one of claims 1 to 7 is implemented.
PCT/CN2021/141838 2020-12-30 2021-12-28 Feedback method and related device WO2022143579A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21914315.3A EP4261660A1 (en) 2020-12-30 2021-12-28 Feedback method and related device
US18/343,948 US20230359279A1 (en) 2020-12-30 2023-06-29 Feedback method and related device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011628845.7A CN114690887B (zh) 2020-12-30 2020-12-30 Feedback method and related device
CN202011628845.7 2020-12-30

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/343,948 Continuation US20230359279A1 (en) 2020-12-30 2023-06-29 Feedback method and related device

Publications (1)

Publication Number Publication Date
WO2022143579A1 true WO2022143579A1 (zh) 2022-07-07

Family

ID=82133497

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/141838 WO2022143579A1 (zh) 2020-12-30 2021-12-28 一种反馈方法以及相关设备

Country Status (4)

Country Link
US (1) US20230359279A1 (zh)
EP (1) EP4261660A1 (zh)
CN (1) CN114690887B (zh)
WO (1) WO2022143579A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI818864B (zh) * 2023-02-07 2023-10-11 華碩電腦股份有限公司 靜電薄膜致動器之驅動電路

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859223A (zh) * 2009-04-03 2010-10-13 索尼公司 信息处理设备、通知方法和程序
US20110285653A1 (en) * 2010-05-21 2011-11-24 Satoshi Kojima Information Processing Apparatus and Input Method
CN103713838A (zh) * 2012-10-04 2014-04-09 纬创资通股份有限公司 电子装置和虚拟键盘定位方法
CN105446646A (zh) * 2015-12-11 2016-03-30 小米科技有限责任公司 基于虚拟键盘的内容输入方法、装置及触控设备

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050162402A1 (en) * 2004-01-27 2005-07-28 Watanachote Susornpol J. Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
US20060232558A1 (en) * 2005-04-15 2006-10-19 Huan-Wen Chien Virtual keyboard
CN102099768A (zh) * 2008-07-23 2011-06-15 进益研究公司 触摸屏幕中用于键模拟的触觉反馈
KR101613551B1 (ko) * 2009-07-02 2016-04-19 엘지전자 주식회사 이동 단말기
KR101141198B1 (ko) * 2009-11-05 2012-05-04 주식회사 팬택 진동 피드백 제공 단말 및 그 방법
CN102236505A (zh) * 2010-04-21 2011-11-09 英业达股份有限公司 虚拟键盘的操作方法及应用其的可携式电子装置
US20120113008A1 (en) * 2010-11-08 2012-05-10 Ville Makinen On-screen keyboard with haptic effects
TWI524218B (zh) * 2011-10-05 2016-03-01 廣達電腦股份有限公司 觸覺回饋式虛擬鍵盤之提供方法及其電子裝置
CN106445369B (zh) * 2015-08-10 2022-06-07 北京搜狗科技发展有限公司 一种输入的方法和装置
CN107678666A (zh) * 2017-09-14 2018-02-09 维沃移动通信有限公司 一种虚拟按键显示方法、移动终端及计算机可读存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859223A (zh) * 2009-04-03 2010-10-13 索尼公司 信息处理设备、通知方法和程序
US20110285653A1 (en) * 2010-05-21 2011-11-24 Satoshi Kojima Information Processing Apparatus and Input Method
CN103713838A (zh) * 2012-10-04 2014-04-09 纬创资通股份有限公司 电子装置和虚拟键盘定位方法
CN105446646A (zh) * 2015-12-11 2016-03-30 小米科技有限责任公司 基于虚拟键盘的内容输入方法、装置及触控设备

Also Published As

Publication number Publication date
CN114690887B (zh) 2024-04-12
EP4261660A1 (en) 2023-10-18
US20230359279A1 (en) 2023-11-09
CN114690887A (zh) 2022-07-01

Similar Documents

Publication Publication Date Title
WO2022143198A1 (zh) 一种应用界面的处理方法以及相关设备
EP1979804B1 (en) Gesturing with a multipoint sensing device
US8941600B2 (en) Apparatus for providing touch feedback for user input to a touch sensitive surface
US9292111B2 (en) Gesturing with a multipoint sensing device
US9239673B2 (en) Gesturing with a multipoint sensing device
US20110216015A1 (en) Apparatus and method for directing operation of a software application via a touch-sensitive surface divided into regions associated with respective functions
US20110060986A1 (en) Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same
US20120162093A1 (en) Touch Screen Control
US20050162402A1 (en) Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
US20160062634A1 (en) Electronic device and method for processing handwriting
TWI505155B (zh) 電容和電磁雙模觸摸屏的觸控方法及手持式電子設備
WO2022143620A1 (zh) 一种虚拟键盘的处理方法以及相关设备
US9747002B2 (en) Display apparatus and image representation method using the same
US20150062015A1 (en) Information processor, control method and program
US20230359279A1 (en) Feedback method and related device
WO2022143607A1 (zh) 一种应用界面的处理方法以及相关设备
US11188224B2 (en) Control method of user interface and electronic device
KR20100034811A (ko) 다기능 터치마우스
RU96671U1 (ru) Универсальное устройство ввода информации в персональный компьютер (варианты)
JP2010218122A (ja) 情報入力装置、オブジェクト表示方法、およびコンピュータが実行可能なプログラム
AU2016238971B2 (en) Gesturing with a multipoint sensing device
AU2014201419B2 (en) Gesturing with a multipoint sensing device
KR20130140361A (ko) 터치스크린을 구비하는 단말에서 데이터 입력 방법 및 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21914315

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021914315

Country of ref document: EP

Effective date: 20230714

NENP Non-entry into the national phase

Ref country code: DE