CN111669459B - Keyboard display method, electronic device and computer readable storage medium


Info

Publication number
CN111669459B
Authority
CN
China
Prior art keywords
screen
display
area
application
electronic device
Prior art date
Legal status
Active
Application number
CN202010327962.3A
Other languages
Chinese (zh)
Other versions
CN111669459A (en)
Inventor
杨嘉辰
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010327962.3A
Publication of CN111669459A
Application granted
Publication of CN111669459B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F 1/1616 Constructional details or arrangements for portable computers with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1641 Details related to the display arrangement, the display being formed by a plurality of foldable display components
    • G06F 1/1652 Details related to the display arrangement, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F 1/1675 Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F 1/1677 Miscellaneous details related to the relative movement between the different enclosures or enclosure parts, for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1694 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/0206 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M 1/0208 Portable telephones comprising a plurality of mechanically joined movable body parts, characterized by the relative motions of the body parts
    • H04M 1/0214 Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • H04M 1/026 Details of the structure or mounting of specific components
    • H04M 1/0266 Details of the structure or mounting of specific components for a display module assembly
    • H04M 1/0268 Details of the structure or mounting of specific components for a display module assembly including a flexible display panel
    • H04M 1/23 Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof
    • G06F 2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F 2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F 2200/161 Indexing scheme relating to constructional details of the monitor
    • G06F 2200/1614 Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F 2203/04809 Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H04M 2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a keyboard display method, an electronic device, and a computer-readable storage medium. The method includes: when a touch operation on an input box is detected, obtaining the current first display state of the screen; if the screen is currently in a split-screen display state, determining the screen area in which the application to which the input box belongs is displayed; and displaying a word selection bar in that screen area while displaying the keyboard main interface in any other screen area. This greatly reduces the area of the application currently operated by the user that the virtual keyboard blocks, so the user can browse more of the content displayed in that application's interface.

Description

Keyboard display method, electronic device and computer readable storage medium
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a keyboard display method, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of terminal and internet technologies, terminal products, typified by smartphones, have greatly changed how people live and work and have become essential in daily life. As the application functions of smartphones grow broader, users' requirements for screen size change with the usage scenario, and large-screen phones, especially phones with foldable display screens, are appearing more and more in everyday life.
When a user taps an input box displayed on the screen to enter characters, the virtual keyboard that pops up occupies most of the screen area of the application to which the input box belongs, blocking part of that application's content and degrading the user's browsing experience.
Disclosure of Invention
This application discloses a keyboard display method, an electronic device, and a computer-readable storage medium, which can reduce the extent to which the virtual keyboard blocks the displayed content of the application to which the input box belongs.
In a first aspect, an embodiment of this application provides a keyboard display method applied to an electronic device, including: when a touch operation on an input box is detected, obtaining the current first display state of the screen; if the screen is currently in a split-screen display state, determining the screen area in which the application to which the input box belongs is displayed; and displaying a word selection bar in that screen area while displaying the keyboard main interface in any screen area other than it. Because the word selection bar, which occupies a small area, stays in the screen area of the application to which the input box belongs, while the keyboard main interface, which occupies a large area, is displayed in any other screen area, the area of the application currently operated by the user that the virtual keyboard blocks is greatly reduced, and the user can browse more of the content displayed in that application's interface. Keeping the word selection bar in the screen area of the application to which the input box belongs also lets the user see intuitively and clearly which application the currently entered characters belong to, avoiding the confusion that would arise if both the word selection bar and the keyboard main interface were displayed in another screen area.
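The first-aspect flow can be pictured with a minimal Kotlin sketch. This is not code from the patent: DisplayState, ScreenArea, and the show* helpers are hypothetical stand-ins for whatever the device's input method framework actually provides.

```kotlin
enum class DisplayState { SPLIT_SCREEN, FULL_SCREEN }
data class ScreenArea(val id: Int)

// On a touch of the input box in split-screen mode, keep the word selection
// bar in the area hosting the edited application and send the keyboard main
// interface to any other screen area.
fun onInputBoxTouched(areas: List<ScreenArea>, hostArea: ScreenArea, state: DisplayState) {
    if (state == DisplayState.SPLIT_SCREEN) {
        showWordSelectionBar(hostArea)
        val keyboardArea = areas.first { it != hostArea }
        showKeyboardMainInterface(keyboardArea)
    }
}

fun showWordSelectionBar(area: ScreenArea) = println("word selection bar in area ${area.id}")
fun showKeyboardMainInterface(area: ScreenArea) = println("keyboard main interface in area ${area.id}")

fun main() {
    val areas = listOf(ScreenArea(0), ScreenArea(1))  // e.g. left and right split screens
    onInputBoxTouched(areas, hostArea = areas[0], state = DisplayState.SPLIT_SCREEN)
}
```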
In the keyboard display method, the word selection bar can display the entered characters and the text candidates corresponding to those characters, and the keyboard main interface can be used to enter the characters. Characters here include, but are not limited to, letters, numbers, and symbols, and the corresponding text includes, but is not limited to, English and Chinese. The word selection bar and the keyboard main interface in the embodiments of this application are controls that can be displayed separately, on application interfaces or pages shown in different screen areas. For example, the word selection bar can be displayed along the bottom of one screen area in a horizontal layout, or at the leftmost or rightmost side of that screen area in a vertical layout, while the keyboard main interface is displayed in another screen area, reducing how much the keyboard main interface blocks the application interface or page the user is currently operating. To facilitate input, when the screen of the electronic device currently has only one screen area, the word selection bar can be displayed at the leftmost or rightmost side of that screen area in a vertical layout and the keyboard main interface at another position, matching the habit of two-handed input and improving the user experience.
With reference to the first aspect, in some embodiments, when determining the screen area in which the application to which the input box belongs is displayed, the electronic device may obtain the screen coordinates of the input box and determine the screen area from those coordinates. It may instead obtain the screen coordinates of the touch operation and determine the screen area from the range those coordinates fall in; for example, when the coordinates fall within the left half of the screen, that is, the left split-screen range, the screen area of the application to which the touched input box belongs is determined to be the left split screen. It may also obtain the screen coordinates of the cursor locator triggered when the user touches the input box and determine the screen area from those coordinates; for example, when the cursor locator's coordinates fall within the left half of the screen, that is, the left split-screen range, the screen area of the application to which the touched input box belongs is determined to be the left split screen.
In this way, using the screen coordinates of the input box, the cursor locator, or the touch operation, the electronic device can quickly determine, while in the split-screen display state, the screen area of the application to which the touched input box belongs, improving the processing efficiency of keyboard display.
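The coordinate test itself is the same whichever of the three sources is used (the input box, the touch operation, or the cursor locator). A sketch under the assumption of a two-area left/right split; all names are hypothetical:

```kotlin
data class Point(val x: Float, val y: Float)

// Map a screen coordinate to the split-screen area whose coordinate range contains it.
fun splitScreenFor(p: Point, screenWidth: Float): String =
    if (p.x < screenWidth / 2) "left split screen" else "right split screen"

fun main() {
    val touch = Point(x = 320f, y = 900f)  // e.g. the coordinates of the touch operation
    println(splitScreenFor(touch, screenWidth = 2200f))  // prints: left split screen
}
```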
In a second aspect, an embodiment of this application provides another keyboard display method applied to an electronic device, including: when a touch operation on an input box is detected, obtaining the current first display state of the screen; if the screen is currently in a full-screen display state, triggering the split-screen display function to obtain at least two screen areas; and selecting one of the at least two screen areas as the display area of the application to which the input box belongs, displaying the word selection bar in the selected screen area, and displaying the keyboard main interface in at least one screen area other than the selected one.
By triggering the split-screen display function when a touch operation on the input box is detected, the display area of the screen is divided into two or more screen areas; one of them is selected to display the application interface that was shown full screen together with the word selection bar, and the keyboard main interface is displayed in another screen area. Displaying the word selection bar and the keyboard main interface separately reduces how much the keyboard main interface blocks the full-screen application interface, so the user can still browse most of its content while typing, improving the user experience.
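A sketch of the second-aspect flow under the same caveats (triggerSplitScreen and the printed placements are illustrative, not the patent's implementation): the split is created first, and the previously full-screen application, the word selection bar, and the keyboard main interface are then distributed over the resulting areas.

```kotlin
data class Area(val id: Int)

// Stub standing in for the device's split-screen display function.
fun triggerSplitScreen(count: Int): List<Area> = List(count) { Area(it) }

fun onInputBoxTouchedFullScreen() {
    val areas = triggerSplitScreen(count = 2)  // full screen -> at least two screen areas
    val appArea = areas.first()                // selected area keeps the app and the word bar
    println("application and word selection bar in area ${appArea.id}")
    for (other in areas.drop(1)) {             // keyboard goes to the remaining area(s)
        println("keyboard main interface in area ${other.id}")
    }
}

fun main() = onInputBoxTouchedFullScreen()
```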
In a third aspect, an embodiment of this application provides another keyboard display method applied to an electronic device, including: when a touch operation on an input box is detected, obtaining the current first display state of the screen; if the screen is in a full-screen display state, obtaining the current second display state of the screen; and if the screen is in a landscape display state, obtaining the screen coordinates of the touch operation and determining the display positions of the word selection bar and the keyboard main interface from those coordinates.
For example, determining the display positions of the word selection bar and the keyboard main interface from the screen coordinates specifically includes: determining the display positions according to where the coordinates lie on the screen.
Specifically, what matters is the side of the screen center line on which the coordinates lie: the coordinates may fall in the region to the left of the center line, that is, within the coordinate range of the left half of the screen, or in the region to the right, that is, within the coordinate range of the right half. If the coordinates fall within the left half of the screen, the word selection bar is displayed in the left half and the keyboard main interface in the right half. If the coordinates fall within the right half, the word selection bar is displayed in the right half and the keyboard main interface in the left half.
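The landscape rule reduces to a comparison of the touch coordinate against the screen center line; a sketch with illustrative names:

```kotlin
data class Layout(val wordSelectionBar: String, val keyboardMainInterface: String)

// The half of the screen containing the touch coordinate hosts the word
// selection bar; the keyboard main interface is mirrored to the opposite half.
fun landscapeLayout(touchX: Float, screenWidth: Float): Layout =
    if (touchX < screenWidth / 2) Layout("left half", "right half")
    else Layout("right half", "left half")

fun main() {
    println(landscapeLayout(touchX = 1900f, screenWidth = 2340f))
    // Layout(wordSelectionBar=right half, keyboardMainInterface=left half)
}
```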
By determining the display areas of the word selection bar and the keyboard main interface from the screen coordinates of a touch operation on a full-screen application interface, and displaying the two separately, blocking of the displayed application interface is reduced. It is also convenient for two-handed input; for example, the user can type characters on the keyboard main interface with the left hand while picking the matching text on the word selection bar with the right hand, improving the user experience.
In a fourth aspect, an embodiment of this application provides another keyboard display method applied to an electronic device, including: when a touch operation on an input box is detected, obtaining the current first display state of the screen; if the screen is in a full-screen display state, obtaining the current second display state of the screen; and if the screen is in a portrait display state, displaying the word selection bar at a first preset position of the current screen area and the keyboard main interface at a second preset position of the current screen area.
The first preset position and the second preset position are different positions within the same screen area. The first preset position is a preset region block for displaying the word selection bar, such as the leftmost side of the screen; the second preset position is a preset region block for displaying the keyboard main interface, such as along the lower bottom of the screen.
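In the portrait case no coordinate test is needed; the placement is a fixed lookup. A tiny sketch, with the concrete preset strings chosen purely for illustration (the text fixes only that the two presets are distinct positions in the same screen area):

```kotlin
data class PortraitLayout(val wordSelectionBar: String, val keyboardMainInterface: String)

// First and second preset positions within the single current screen area.
fun portraitLayout() = PortraitLayout(
    wordSelectionBar = "vertical strip at the leftmost edge",  // first preset position
    keyboardMainInterface = "block along the lower bottom"     // second preset position
)

fun main() = println(portraitLayout())
```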
By displaying the word selection bar and the keyboard main interface at two different positions of the full-screen application interface, the two are separated, reducing how much the word selection bar blocks the content currently shown on the screen. Placing the word selection bar at a side edge of the screen, such as near the left or right, and the keyboard main interface along the lower bottom of the screen also makes both convenient to operate with one hand.
In a fifth aspect, the present application provides an electronic device comprising one or more processors, a memory, and a display screen. The memory and the display screen are coupled to the one or more processors; the memory stores computer program code comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the method provided by the first, second, third, or fourth aspect or any of their possible implementations.
In a sixth aspect, the present application provides a computer storage medium including computer instructions which, when run on an electronic device, cause the electronic device to perform the method provided by the first, second, third, or fourth aspect or any of their possible implementations.
In a seventh aspect, an embodiment of the present application provides a computer program product which, when run on a computer, causes the computer to perform the method provided by the first, second, third, or fourth aspect or any of their possible implementations.
In an eighth aspect, an embodiment of the present application provides a chip system including a memory and a processor; when the chip system runs, the electronic device performs the method provided by the first, second, third, or fourth aspect or any of their possible implementations. The chip system may be a single chip or a chip module composed of multiple chips.
It can be appreciated that the electronic device of the fifth aspect, the computer storage medium of the sixth aspect, and the computer program product of the seventh aspect provided above are all configured to perform the method provided in any of the first through fourth aspects. Therefore, for the beneficial effects they achieve, refer to the beneficial effects of the corresponding method, which are not repeated here.
Drawings
The drawings used in the embodiments of the present application are described below.
Fig. 1 is a schematic structural diagram of an electronic device 100 provided in an embodiment of the present application;
fig. 2 is a set of schematic views of the screen of the electronic device 100 provided in an embodiment of this application when the screen is a foldable display screen, where: (A) shows a foldable display screen with only one bending portion in the unfolded configuration; (B) shows a foldable display screen with two bending portions in the unfolded configuration; (C) and (D) show the foldable display screen of (A) switching between the folded and unfolded configurations; and (E) and (F) show the foldable display screen of (B) switching between the folded and unfolded configurations;
fig. 3 is a block diagram of a software structure of the electronic device 100 according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a screen interface provided by an embodiment of this application, where (A) shows the screen interface of the electronic device 100 in the split-screen display state, and (B) shows the keyboard display interface presented after the user touches the input box on the screen interface of (A);
fig. 5 is a flowchart illustrating a keyboard display method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another screen interface provided by an embodiment of the present application; fig. (a) to (C) are schematic diagrams of a group of keyboard display interfaces provided when the electronic device 100 having a foldable display screen is in an unfolded state; FIGS. D-F are schematic diagrams of a set of keyboard display interfaces provided when the electronic device 100 includes three screen areas;
FIG. 7 is a flowchart illustrating another keyboard display method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another screen interface provided by an embodiment of this application, where (A) shows the screen interface when the screen of the electronic device 100 is in the full-screen, landscape display state, and (B) and (C) show two keyboard display interfaces presented after the user touches the input box on the screen interface of (A);
FIG. 9 is a flowchart illustrating another keyboard display method according to an embodiment of the present application;
FIG. 10 is a schematic diagram of another screen interface provided by an embodiment of this application, where (A) to (C) show keyboard display interfaces presented after the user touches an input box while the electronic device 100 is in the full-screen, portrait display state: (A) and (B) show the keyboard display interfaces after the user touches different positions of the input box, and (C) shows another keyboard display interface after the user touches the input box.
Detailed Description
The embodiments of the present application are described below with reference to the drawings. The terminology used in describing the embodiments is for describing particular embodiments only and is not intended to limit the application.
First, an electronic device according to an embodiment of the present application will be described. Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs the instruction or data again, it can fetch it directly from this memory, avoiding repeated accesses, reducing the waiting time of the processor 110, and thereby improving system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus; it converts the data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through a UART interface to implement the Bluetooth function. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the UART interface, to implement the function of playing music through a Bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. Processor 110 and display screen 194 communicate via a DSI interface to implement display functions of electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripheral devices, or to connect headphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationships between the modules illustrated in this embodiment are merely examples and do not constitute a limitation on the structure of the electronic device 100. In other embodiments of this application, the electronic device 100 may also adopt an interface connection manner different from those in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In some embodiments, when the display panel is made of OLED, AMOLED, FLED, or similar materials, the display screen 194 in fig. 1 can be bent. Here, the display screen 194 being bendable means that it can be bent at any position to any angle and held at that angle; for example, it can be folded in half left-to-right from the middle, or top-to-bottom from the middle. In the embodiments of this application, a display screen that can be folded is called a foldable display screen. The foldable display screen may be a single screen, or a display screen formed by splicing multiple screens, which is not limited here.
For example, referring to fig. 2, the foldable display screen may have at least two physical forms: an unfolded configuration and a folded configuration. As shown in fig. 2 (A), a foldable display screen that folds in half left-to-right from the middle is in the unfolded configuration when the angle formed by the two ends of its middle bending portion (the upper and lower ends, if the screen folds top-to-bottom) is between a first angle and 180 degrees, where the first angle is greater than 0 degrees and less than 180 degrees, for example 90 degrees. As shown in fig. 2 (B), a foldable display screen with bending portions at its 1/3 and 2/3 positions is in the unfolded configuration when the angles formed by the two ends of those bending portions are likewise between the first angle and 180 degrees. As shown in fig. 2 (C) and (D), the foldable display screen is in the folded configuration when the angle formed by the two ends of its middle bending portion is between 0 degrees and the first angle; as shown in fig. 2 (E) and (F), a foldable display screen with bending portions at its 1/3 and 2/3 positions is in the folded configuration when the angles formed at those bending portions are between 0 degrees and the first angle.
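The angle ranges amount to a threshold test; a sketch using the 90-degree first angle from the example (the actual first angle is a device design parameter, not fixed by the text):

```kotlin
enum class FoldConfiguration { UNFOLDED, FOLDED }

// Unfolded when the bending angle lies in [firstAngle, 180]; folded in [0, firstAngle).
fun classifyFold(bendAngleDeg: Float, firstAngleDeg: Float = 90f): FoldConfiguration =
    if (bendAngleDeg >= firstAngleDeg) FoldConfiguration.UNFOLDED else FoldConfiguration.FOLDED

fun main() {
    println(classifyFold(bendAngleDeg = 170f))  // UNFOLDED
    println(classifyFold(bendAngleDeg = 25f))   // FOLDED
}
```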
As shown in fig. 2 (C) and (D), when the foldable display screen is a bi-fold display screen, its display area after entering the folded configuration may be divided into a first screen area and a second screen area. From the unfolded configuration, the foldable display screen can be folded in the direction in which the first and second screen areas face each other, or in the direction in which they face away from each other. In some embodiments, the angle formed by the two ends of the middle bending portion (the upper and lower ends, if the screen folds top-to-bottom) may be between 0 and 180 degrees in either direction; for example, the foldable display screen may be bent to a 30-degree folded configuration with the first and second screen areas facing each other, or to a 30-degree folded configuration with them facing away from each other.
As shown in fig. 2 (E) and (F), when the foldable display screen is a tri-fold display screen, its display area after entering the folded configuration may be divided into a first screen area, a second screen area, and a third screen area. From the unfolded configuration, the screen can be folded in the direction in which the first screen area faces the second and the second faces the third, or in the opposite direction, in which they face away from each other.
In some embodiments, the electronic device 100 may determine whether the foldable display screen is in the folded or unfolded configuration, and whether it is in the portrait or landscape display state, using one or more of a gravity sensor, an acceleration sensor, and a gyroscope. The electronic device 100 may also detect the bending angle of the foldable display screen with these sensors and then determine from that angle whether the screen is folded or unfolded. The electronic device 100 may further determine the orientation of the foldable display screen in the folded configuration with one or more of these sensors, so as to determine the display area for the interface content output by the display system. For example, when the first screen area faces upward relative to the ground, the electronic device 100 may display the interface content output by the display system on the first screen area; when the second screen area faces upward relative to the ground, the electronic device 100 may display it on the second screen area.
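A sketch of this routing, with the sensor fusion abstracted into two inputs (the bending angle and which screen area currently faces up); the names are hypothetical:

```kotlin
// Route the display system's output according to fold state and orientation.
fun outputArea(bendAngleDeg: Float, firstAreaFacesUp: Boolean, firstAngleDeg: Float = 90f): String =
    when {
        bendAngleDeg >= firstAngleDeg -> "entire unfolded display"
        firstAreaFacesUp -> "first screen area"
        else -> "second screen area"
    }

fun main() {
    println(outputArea(bendAngleDeg = 20f, firstAreaFacesUp = false))  // second screen area
}
```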
In some embodiments, the electronic device 100 may further include an angle sensor (not shown in fig. 1) arranged at a bending portion of the foldable display screen. The electronic device 100 may use this angle sensor to measure the angle formed between the two ends of the middle bending portion of the foldable display screen; when the angle is greater than or equal to the first angle, the electronic device 100 recognizes through the angle sensor that the foldable display screen has entered the unfolded configuration, and when the angle is less than or equal to the first angle, that it has entered the folded configuration.
In some other embodiments, the electronic device 100 may instead recognize whether the foldable display screen is in the folded configuration through a physical switch arranged at a bending portion of the screen. For example, when the electronic device receives a user's folding operation on the foldable display screen and the physical switch is triggered open, the electronic device 100 may determine that the foldable display screen is in the folded configuration; when it receives an unfolding operation and the physical switch is triggered closed, it may determine that the screen is in the unfolded configuration. The above examples are merely illustrative and should not be construed as limiting.
Hereinafter, a bi-fold display screen is taken as an example. In the unfolded configuration, the foldable display screen may display content full screen, in one partial area (for example, the first screen area or the second screen area), or in two or more partial areas. In one possible implementation, when the foldable display screen displays interface content full screen, the content may occupy only part of the display area; for example, when the display screen 194 is a notch screen, the middle portion of the screen displays the interface content, and with one or both edge portions left blank, the foldable display screen can still be regarded as displaying the content full screen.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; besides digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in a variety of encoding formats, for example Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also learn continuously by itself. Applications such as intelligent recognition of the electronic device 100 can be implemented by the NPU, for example image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the instructions stored in the internal memory 121 to perform the various functional applications and data processing of the electronic device 100. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system, the application programs required by at least one function (for example, a sound playing function and an image playing function), and the like. The data storage area may store data created during use of the electronic device 100 (for example, audio data and a phone book), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device 100 may implement audio functions, for example music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset interface 170D, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can listen to music or a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 receives a call or voice information, the receiver 170B can be placed close to the ear to receive the voice.
The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headset interface 170D is used to connect a wired headset. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense pressure signals and can convert a pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many kinds of pressure sensors 180A, for example resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon, an instruction for creating a new short message is executed.
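As a rough illustration of the pressure-threshold dispatch described above, the sketch below uses the Android MotionEvent.getPressure() value, which reports a normalized touch pressure; the threshold value and the handler names are assumptions and device dependent.

```java
import android.view.MotionEvent;

// Sketch of the pressure-threshold dispatch on the short message application icon.
// The first pressure threshold and the handler names are assumptions.
public class MessageIconTouchHandler {

    private static final float FIRST_PRESSURE_THRESHOLD = 0.6f; // assumed, device dependent

    public boolean onTouch(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_UP) {
            if (event.getPressure() < FIRST_PRESSURE_THRESHOLD) {
                viewShortMessage();   // lighter press: "view short message" instruction
            } else {
                createShortMessage(); // firmer press: "new short message" instruction
            }
            return true;
        }
        return false;
    }

    private void viewShortMessage()   { /* open the short message list */ }

    private void createShortMessage() { /* open the short message composer */ }
}
```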
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (that is, the x, y, and z axes) may be determined through the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 100 through reverse movement, thereby achieving image stabilization. The gyroscope sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected opening or closing state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (generally along three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and similar applications.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared light or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to lower power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
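The layered temperature policy above can be pictured as a simple decision ladder. The sketch below is illustrative only; all three threshold values and the method names are assumptions, not values given in the present application.

```java
// Sketch of the layered temperature policy; all thresholds are assumed values.
public class ThermalPolicy {

    private static final float THROTTLE_ABOVE_C = 45f;  // assumed throttling threshold
    private static final float HEAT_BELOW_C     = 0f;   // assumed battery-heating threshold
    private static final float BOOST_BELOW_C    = -10f; // assumed voltage-boost threshold

    public void onTemperatureReported(float celsius) {
        if (celsius > THROTTLE_ABOVE_C) {
            reduceProcessorPerformance(); // lower power consumption for thermal protection
        } else if (celsius < BOOST_BELOW_C) {
            boostBatteryOutputVoltage();  // avoid abnormal shutdown at very low temperature
        } else if (celsius < HEAT_BELOW_C) {
            heatBattery();                // warm the battery at moderately low temperature
        }
    }

    private void reduceProcessorPerformance() { /* e.g. cap processor frequency */ }

    private void heatBattery() { /* enable the battery heating strategy */ }

    private void boostBatteryOutputVoltage() { /* raise the battery output voltage */ }
}
```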
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch-control screen". The touch sensor 180K is used to detect a touch operation on or near it and can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass vibrated by the human voice. The bone conduction sensor 180M may also contact the human pulse to receive blood-pressure beating signals. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the voice-vibrated bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart-rate information based on the blood-pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart-rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to the user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (for example, photographing and audio playing) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example, time reminders, received messages, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effects may also be customized.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be connected to or separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and so on.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the electronic device 100, for example management of call statuses (including connected, hung up, and the like).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications from applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light flashes.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the function interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media libraries support playback and recording in a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap and the corresponding control being the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
In some application scenarios, to meet users' needs to work with or browse multiple applications at the same time, many manufacturers provide a split-screen display function on electronic devices such as smartphones, tablet computers, and foldable phones, so that a user can open two or more user interfaces at the same time. These user interfaces may be interfaces of different applications, different pages of the same application, and so on. For example, when the screen of the electronic device 100 currently includes two screen areas, that is, when the electronic device 100 is in a split-screen state (for example, split left and right or split up and down), an application interface of a short message application may be displayed in one screen area (for example, the left split screen) and an application interface of an email application in the other screen area (for example, the right split screen); or an inbox page of the email application may be displayed in one screen area (for example, the left split screen) and a new-email page of the email application in the other screen area (for example, the right split screen). When the user touches an input box currently displayed on the screen to edit information, the electronic device 100 calls a virtual keyboard to be displayed in the screen area to which the application of the input box belongs, which blocks most of the display content of that application. When the user needs to see more of that content, the user has to dismiss the displayed virtual keyboard first, which degrades the user experience.
In other application scenarios, when an electronic device 100 with a large screen (for example, a foldable phone in the unfolded state or a large-screen phone) is in the landscape display state and a user interface is currently displayed on the screen, touching an input box currently displayed on the screen to edit information causes the electronic device 100 to invoke a virtual keyboard on the screen. Because the area of the displayed virtual keyboard is very large, the user has to move a long distance to reach the characters to be input, which degrades the user experience. In the existing systems of some electronic devices, the virtual keyboard is symmetrically split into two halves that are displayed on the left and right sides of the screen to optimize the input experience, but the virtual keyboard still occupies most of the screen area, both hands are required to operate at the same time, the operation movement distance is still large, and the user faces the habit problem of coordinating the left and right hands when selecting the required characters, so the user experience remains poor.
In order to improve the use experience of a user and bring better operation experience to the user, the embodiment of the application provides a keyboard display method.
First, the keyboard display according to the embodiments of the present application is described. Based on a touch operation performed by the user on an input box, a word choice bar and a keyboard main interface can be displayed: the word choice bar can be used to display the input characters and the text information corresponding to the characters, and the keyboard main interface can be used to input characters. The characters referred to herein include, but are not limited to, letters, numbers, and symbols, and the text information corresponding to the characters includes, but is not limited to, English, Chinese, and the like.
It should be noted that the virtual keyboard related to the embodiments of the present application includes a word choice bar and a keyboard main interface. In some embodiments of the present application, the word choice bar and the keyboard main interface are separate, independent controls: they may be displayed separately, at different positions of different screen areas or at different positions of the same screen area, for example with the word choice bar on the left side of a screen area and the keyboard main interface at the bottom of that screen area. In other embodiments, the word choice bar and the keyboard main interface are inseparable and are displayed at the bottom of a screen area as a single control. In the embodiments of the present application, a virtual keyboard in which the word choice bar and the keyboard main interface are inseparable and form a whole may be referred to as a normal keyboard; the normal keyboard appearing hereinafter refers to such an integral virtual keyboard.
It should be further noted that the input box referred to in the embodiments of the present application is a control in which characters can be input and edited, including but not limited to an application search box, a text input box, and the like. After the user touches the input box, a cursor locator is displayed in it, and the position of the input characters can be determined by the cursor locator.
It should be further noted that the word choice bar can be displayed in a horizontal layout mode or a vertical layout mode according to the layout of its characters. For example, when the characters in the word choice bar are arranged horizontally, the word choice bar is displayed in the horizontal layout mode; when the characters in the word choice bar are arranged vertically, the word choice bar is displayed in the vertical layout mode.
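For illustration, if the word choice bar were backed by an Android LinearLayout, switching between the horizontal and vertical layout modes described above could look like the following sketch; the class and method names are assumptions.

```java
import android.widget.LinearLayout;

// Sketch: switch the word choice bar between the horizontal and vertical layout modes,
// assuming the bar is backed by a LinearLayout. Class and method names are assumptions.
public final class WordChoiceBarLayout {

    public static void applyOrientation(LinearLayout wordChoiceBar, boolean horizontal) {
        // Horizontal character layout -> horizontal bar; vertical layout -> vertical bar.
        wordChoiceBar.setOrientation(horizontal ? LinearLayout.HORIZONTAL
                                                : LinearLayout.VERTICAL);
    }

    private WordChoiceBarLayout() { }
}
```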
An exemplary user interface for a keyboard display on the electronic device 100 is described below.
Referring to fig. 4, fig. 4 is a schematic diagram of a set of screen interfaces provided in an embodiment of the present application. As shown in (A) of fig. 4, a screen interface 10 of the electronic device 100 is displayed. In the screen interface 10, the screen is currently in the split-screen display state and includes a first screen area 101, a second screen area 102, a status bar 103, and a navigation bar 104. Wherein:
the first screen area 101 may be used to display a user interface, such as a home screen interface 1021 or an application interface 1011. The home screen interface 1021 may include at least one application icon; through a user operation such as tapping an application icon, the application interface of the application to which the icon belongs can be entered. Some application interfaces may include an input box (for example, the input box 1011a displayed in the application interface 1011) for receiving and displaying text information input by the user.
The second screen area 102 may be used to display another user interface that is a user interface other than the user interface displayed by the first screen area 101.
It should be noted that the user interface displayed in the first screen area 101 or the second screen area 102 may include an input box 1011a for displaying the text information input by the user through the virtual keyboard; after the user touches the "send" key, the text information input by the user is displayed in the dialog box so that it can be received and displayed by the user of another electronic device.
The status bar 103 may include the name of the operator (e.g., china mobile), signal strength, WI-FI icon, current remaining power, and time.
The navigation bar 104 may include a return key 1041, a home screen key 1042, a recent tasks key 1043, and other system navigation keys. The home screen interface is the interface displayed by the electronic device 100 when, on any user interface, a user operation on the home screen key 1042 is detected. When it is detected that the user taps the return key 1041, the electronic device 100 may display the user interface previous to the current one. When it is detected that the user taps the home screen key 1042, the electronic device 100 may display the home screen interface 10. When it is detected that the user taps the recent tasks key 1043, the electronic device 100 may display the tasks the user has recently opened. The navigation keys may also have other names; for example, 1041 may be called the Back Button, 1042 the Home Button, and 1043 the Menu Button, which is not limited in the present application. The navigation keys in the navigation bar 104 are not limited to virtual keys and may also be implemented as physical keys.
It should be noted that, after the electronic device 100 enters the split-screen display state, the navigation bar 104 may be displayed on the upper, lower, left, or right side of the screen, and the navigation bar 104 may also be hidden.
The user can enter the keyboard display interface in the first screen area 101 or the second screen area 102 through a user operation. Specifically, the user operation may be a touch operation detected on an input box in the first screen area 101 or the second screen area 102. As shown in (A) and (B) of fig. 4, in response to a touch operation by the user on the input box displayed in the first screen area 101, for example a tap, the electronic device 100 displays the keyboard display interface 20.
The user may enter the keyboard display interface in the first screen area 101 by touching the input box 1011a displayed in the application interface 1011. The keyboard display interface 20 may include the input box 1011a, a word choice bar 201, and a keyboard main interface 202. Wherein:
the word choice bar 201 can be used to display the input characters and the text information corresponding to the characters. The word choice bar 201 can be displayed at any position of the screen area to which the application of the input box currently touched by the user belongs. As shown in (B) of fig. 4, the word choice bar is displayed at the bottom of the first screen area 101, so that the user can select the characters to be input from the text information displayed in the word choice bar.
The keyboard main interface 202 can be used to input characters; the characters input through the keyboard main interface 202 and the text information corresponding to them are displayed in the word choice bar 201. The keyboard main interface 202 can be displayed at any position of any screen area other than the screen area to which the application of the input box currently touched by the user belongs. As shown in (B) of fig. 4, the keyboard main interface is displayed at the bottom of the second screen area 102, so that the user can input characters through it to obtain the corresponding text information. It should be noted that, when the screen of the electronic device 100 is in the split-screen display state, the electronic device 100 may be considered to include at least two screen areas; when the application interface displayed in at least one screen area includes an input box and the user touches one of the input boxes, a word choice bar and a keyboard main interface are displayed on the current screen of the electronic device 100.
In the embodiments of the present application, after detecting a touch operation on an input box, the electronic device 100 obtains the screen area to which the application of the input box belongs, displays the word choice bar in that screen area, and displays the keyboard main interface in any screen area other than it. Because only the word choice bar is displayed in the screen area of the application, for the user to select the characters to be input, while the keyboard main interface is located in a different screen area, the area of the application currently touched by the user that is blocked by the virtual keyboard is greatly reduced, so the user can browse more of the content displayed in the application interface. Moreover, because the word choice bar is located in the screen area to which the application of the input box belongs, the user can intuitively and clearly know to which screen area the currently input characters belong, avoiding the confusion that would arise if both the word choice bar and the keyboard main interface were displayed in a screen area other than the one to which the input box belongs.
Referring to fig. 5, fig. 5 is a flowchart illustrating a keyboard display method according to an embodiment of the present application. As shown in fig. 5, the method includes steps S101 to S105.
S101, the electronic device 100 displays the screen interface 10.
S102, the electronic apparatus 100 detects a touch operation of the user on the input box 1011 a.
S103, in response to the detected touch operation on the input box 1011a, the electronic device 100 acquires the first display state of the screen.
In the embodiments of the present application, the first display state includes a split-screen display state and a full-screen display state. In the split-screen display state, the screen currently includes at least two screen areas, and different user interfaces are displayed in the at least two screen areas. In the full-screen display state, the screen currently has only one screen area, which displays one user interface.
It should be noted that, when the electronic device 100 is an electronic device having a foldable display screen and the foldable display screen is in the folded state, if the user touches an input box displayed in the application interface of the current screen area, a normal keyboard is displayed on the current screen. If the foldable display screen is in the unfolded state, the display positions of the word choice bar and the keyboard main interface are determined according to the current first display state of the foldable display screen.
S104, if the screen is currently in the split-screen display state, the electronic device 100 determines the screen area to which the application of the input box 1011a belongs.
In this embodiment, after detecting a touch operation performed by the user on the input box 1011a, the electronic device 100 determines whether the screen area to which the application of the input box 1011a belongs is the left or right split screen, or the upper or lower split screen.
In some embodiments, when determining the screen area to which the application of the input box 1011a belongs, the electronic device 100 may obtain the screen coordinates of the input box and determine, from those coordinates, the screen area in which the input box is located. When the screen of the electronic device 100 is currently in a split-screen display state, for example a left-right split-screen display state in which the screen areas are a left split screen and a right split screen, it can be determined within the coordinate range of which screen area (left or right split screen) the screen coordinates of the input box fall, for example whether the screen coordinates of the input box 1011a are located in the left split screen or the right split screen, or in the upper split screen or the lower split screen. After the screen area in which the input box is located is determined, the screen area to which the application of the input box 1011a belongs is determined.
In other embodiments, when the screen of the electronic device 100 currently displays two or more input boxes, the screen area to which the application of the touched input box belongs may be determined from the cursor locator information currently displayed on the screen. When a user touches an input box to perform an input operation, a cursor locator is displayed in the input box to indicate the position where characters are currently input, and the input characters are displayed at the current position of the cursor locator. When the electronic device 100 is in the split-screen display state, the screen area to which the application of the currently touched input box belongs is determined from the position information of the cursor locator; that is, the screen area in which the cursor locator is located is the screen area to which that application belongs. Determining the screen area from the cursor locator makes it possible to quickly determine in which screen area the touched input box is.
In still other embodiments, when determining the screen area to which the application of the currently touched input box belongs, the screen coordinates corresponding to the user's touch operation may be obtained, and the screen area may be determined from the screen area range to which those coordinates belong: for example, whether they belong to the left or right split screen, the upper or lower split screen, or a first, second, or third screen area, and so on.
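The coordinate-based determination in the preceding paragraphs can be sketched as follows for a left-right split screen. This is a minimal illustration, not the actual implementation of step S104; the class, the enum, and the half-width split are assumptions.

```java
import android.view.View;

// Minimal sketch of step S104 for a left-right split screen: locate the input box
// on the screen and decide which split screen it belongs to. Names are assumptions.
public class InputBoxAreaResolver {

    public enum Area { LEFT_SPLIT, RIGHT_SPLIT }

    // screenWidth: total width of the current screen, in pixels.
    public Area resolve(View inputBox, int screenWidth) {
        int[] location = new int[2];
        inputBox.getLocationOnScreen(location); // x, y of the input box's top-left corner
        int centerX = location[0] + inputBox.getWidth() / 2;
        // Coordinate-range check described above: left half of the screen -> left split screen.
        return (centerX < screenWidth / 2) ? Area.LEFT_SPLIT : Area.RIGHT_SPLIT;
    }
}
```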
S105, the electronic device 100 displays the word choice bar 201 in the determined screen area to which the application of the input box 1011a belongs, and displays the keyboard main interface 202 in any screen area other than that screen area.
Referring to fig. 6, fig. 6 is a schematic diagram of another screen interface provided in the embodiment of the present application.
As shown in (A) of fig. 6, the electronic device 100 is an electronic device having a foldable display screen. When the included angle formed by the left and right ends of the middle bending portion 301 of the foldable display screen changes between 0 degrees and +180 degrees, that is, when the foldable display screen switches from the folded state to the unfolded state or from the unfolded state to the folded state, it can be determined from the middle bending portion 301 that the screen of the electronic device 100 currently includes two screen areas. Specifically, according to their orientation relative to the middle bending portion, the two screen areas may be referred to as a left screen area (the screen area on the left of the middle bending portion) and a right screen area (the screen area on its right). When the user touches an input box displayed in the application interface of an application in the right screen area, the electronic device 100 determines that the screen area to which the application of the input box belongs is the right screen area (which may also be referred to as the right split screen), displays a word choice bar 302 at the bottom of the right screen area, and displays a keyboard main interface 303 at the bottom of the left screen area (which may also be referred to as the left split screen). Alternatively, as shown in (B) of fig. 6, the word choice bar 302 is displayed on the leftmost side of the right screen area.
As shown in (C) of fig. 6, the electronic device 100 is an electronic device having a foldable display screen. When the included angle formed by the upper and lower ends of the middle bending portion 401 of the foldable display screen changes between 0 degrees and +180 degrees, that is, when the foldable display screen switches from the folded state to the unfolded state or vice versa, it can be determined from the middle bending portion 401 that the screen of the electronic device 100 currently includes two screen areas. Specifically, according to their orientation relative to the middle bending portion, the two screen areas may be referred to as an upper screen area (the screen area above the middle bending portion) and a lower screen area (the screen area below it). When the user touches an input box displayed in the application interface of an application in the upper screen area, the electronic device 100 determines that the screen area to which the application of the input box belongs is the upper screen area (which may also be referred to as the upper split screen), displays a word choice bar at the bottom of the upper screen area, and displays a keyboard main interface at the bottom of the lower screen area (which may also be referred to as the lower split screen).
As shown in (D) to (F) of fig. 6, the electronic device 100 is in the split-screen display state, and the screen currently includes three screen areas, such as the first screen area 503, the second screen area 504, and the third screen area 505 shown in fig. 6. As shown in (D) and (E) of fig. 6, when the user touches the input box displayed in the second screen area 504, in response to the touch operation, the electronic device 100 determines that the screen area to which the application of the input box belongs is the second screen area 504, then displays the word choice bar 501 at the bottom of the second screen area and displays the keyboard main interface 502 in the first screen area 503 or the third screen area 505. As shown in (F) of fig. 6, when the user touches an input box displayed in the first screen area 503, in response to the touch operation, the electronic device 100 determines that the screen area to which the application of the input box belongs is the first screen area 503, and then displays the keyboard main interface 502 in at least one screen area other than the first screen area 503, for example within the range of the third screen area 505.
It can be understood that, when the screen of the electronic device 100 currently includes three or more screen areas, if the user touches an input box displayed in any one of the screen areas, then after the screen area to which the application of the input box belongs is determined, the word choice bar is displayed in that screen area so that the user can clearly locate the user interface on which the input operation is currently performed, and the keyboard main interface is displayed in at least one screen area other than the one to which the input box belongs.
In some embodiments, if there is one screen area on each side of the screen area to which the application of the input box belongs, the electronic device obtains the user's gesture operation information and determines the user's input habit, specifically whether the user is accustomed to inputting with the left hand or the right hand, and determines the display position of the keyboard main interface according to that habit, that is, on which side of the screen area the keyboard main interface is displayed. If the user is accustomed to inputting with the left hand, the keyboard main interface is displayed in the screen area on the left of the screen area to which the application of the input box belongs; if the user is accustomed to inputting with the right hand, it is displayed in the screen area on the right.
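A minimal sketch of the habit-based choice just described follows; how the input habit is detected from gesture operation information is outside the sketch, and the names and the boolean flag are assumptions.

```java
// Sketch of the habit-based placement: pick the side of the keyboard main interface
// from the user's input habit. The boolean flag is an assumption; detecting the habit
// from gesture operation information is not shown here.
public final class KeyboardSideChooser {

    public enum Side { LEFT, RIGHT }

    public static Side choose(boolean accustomedToLeftHand) {
        // Left-hand users: keyboard main interface in the screen area to the left of
        // the area the input box's application belongs to; right-hand users: the right.
        return accustomedToLeftHand ? Side.LEFT : Side.RIGHT;
    }

    private KeyboardSideChooser() { }
}
```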
In other embodiments, if one side of the screen area to which the application of the input box belongs contains at least two screen areas, the keyboard main interface is displayed within the range of those at least two screen areas.
Referring to fig. 7, fig. 7 is a schematic flowchart of a keyboard display method according to an embodiment of the present application. As shown in fig. 7, the method includes steps S201 to S204.
S201, the electronic device 100 displays the screen interface 60.
S202, the electronic apparatus 100 detects a touch operation of the user on the input box 601.
S203, in response to the detected touch operation on the input frame 601, the electronic device 100 acquires a first display state of the screen.
S204, if the screen is currently in the full-screen display state, the split-screen display function is triggered to obtain at least two screen areas; one screen area is selected from the at least two screen areas as the display area of the application to which the input box belongs, the word choice bar is displayed in the selected screen area, and the keyboard main interface is displayed in at least one screen area other than the selected one.
In the embodiments of the present application, the split-screen display function divides the display area of the screen into two or more screen areas according to the current display state of the screen. For example, when the screen is currently in the landscape display state, the display area is divided into a left split screen and a right split screen; when the screen is currently in the portrait display state, the display area is divided into an upper split screen and a lower split screen. However, after the screen is split into upper and lower areas, if the keyboard main interface is displayed in one area and the word choice bar in the other, the application interface shown during full-screen display is shrunk into a limited range, and the user inputs on the upper or lower split screen, which does not match the user's usage habits. To improve the user experience, the split-screen display function is therefore generally triggered only when the display state of the screen is the landscape display state. That is, step S204 specifically includes:
if the screen is currently in the full-screen display state and the landscape display state, the split-screen display function is triggered to obtain at least two screen areas; one screen area is selected from the at least two screen areas as the display area of the application to which the input box belongs, the word choice bar is displayed in the selected screen area, and the keyboard main interface is displayed in at least one screen area other than the selected one.
It should be noted that, when the split-screen display function is triggered, to provide a better visual experience for the user, the display area of the screen is generally divided into only two screen areas: one displays the application interface shown during full-screen display together with the word choice bar, and the other displays the keyboard main interface, so that the user can perform input operations with the left and right hands.
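The flow of step S204 under the landscape condition can be summarized in the following structural sketch; every type and method name here is an assumption standing in for platform-specific window-management calls, which the present application does not specify.

```java
// Structural sketch of step S204 (landscape, full screen): trigger split screen,
// keep the application and word choice bar in one area, keyboard in the other.
// All types and method names are assumptions.
public class FullScreenKeyboardPlacer {

    static class ScreenArea { /* placeholder for a split-screen region */ }

    public void onInputBoxTouched(boolean fullScreen, boolean landscape) {
        if (!(fullScreen && landscape)) {
            return; // split-screen display is only triggered in landscape full screen
        }
        ScreenArea[] areas = triggerSplitScreen(); // divide into left and right split screens
        ScreenArea appArea = areas[0];             // area selected for the input box's application
        showWordChoiceBar(appArea);                // word choice bar stays with the application
        showKeyboardMainInterface(areas[1]);       // keyboard main interface in the other area
    }

    private ScreenArea[] triggerSplitScreen() {
        return new ScreenArea[] { new ScreenArea(), new ScreenArea() };
    }

    private void showWordChoiceBar(ScreenArea area) { /* render the word choice bar */ }

    private void showKeyboardMainInterface(ScreenArea area) { /* render the keyboard */ }
}
```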
Referring to fig. 8, fig. 8 is a schematic view of another screen interface provided in the embodiment of the present application.
As shown in (A) of fig. 8, when the screen of the electronic device 100 is in the full-screen display state and the landscape display state, the user touches an input box currently displayed on the screen. When the electronic device 100 detects the touch operation on the input box 601, in response to it, the electronic device 100 triggers split-screen display and divides the display area of the screen into two screen areas, a left split screen and a right split screen.
In some embodiments, the electronic device 100 obtains the screen coordinates of the touch operation and determines which screen area to select as the display area of the application to which the input box 601 belongs. As shown in (B) of fig. 8, the screen coordinates of the user's touch operation fall within the left half of the screen, that is, within the coordinate range of the left split screen, so the electronic device 100 selects the left split screen as the display area of the application to which the input box 601 belongs, displays the application interface and the word choice bar 701 in the left split screen, and displays the keyboard main interface 702 in the right split screen.
Referring to fig. 9, fig. 9 is a schematic flowchart of a keyboard display method according to an embodiment of the present application. As shown in fig. 9, the method includes steps S301 to S306.
S301, the electronic device 100 displays the screen interface 60.
S302, the electronic apparatus 100 detects a touch operation of the user on the input box 601.
S303, in response to the detected touch operation on the input frame 601, the electronic device 100 acquires a first display state of the screen.
S304, if the screen is currently in a full-screen display state, the electronic device 100 acquires a second display state of the screen.
S305, if the screen is currently in the landscape display state, the electronic device 100 acquires the screen coordinates corresponding to the touch operation.
And S306, determining the display positions of the word selection bar and the keyboard main interface according to the screen coordinates.
In the embodiments of the present application, after the screen coordinates corresponding to the touch operation are obtained, it can be determined whether the screen area acted on by the touch operation is on the left or the right side of the current screen. When the screen coordinates are on the left side of the current screen, the word choice bar is displayed within the left half of the current screen and the keyboard main interface within the right half.
Specifically, step S306 includes: determining the display positions of the word choice bar and the keyboard main interface according to the position of the screen coordinates on the screen.
For example, if the screen coordinates fall within the coordinate range of the left half of the screen, the word choice bar is displayed in the left half of the screen and the keyboard main interface in the right half. If the screen coordinates fall within the coordinate range of the right half of the screen, the word choice bar is displayed in the right half of the screen and the keyboard main interface in the left half.
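For illustration, the coordinate-to-position mapping of step S306 can be sketched as below; the Placement type and the half-screen comparison are assumptions.

```java
// Sketch of step S306: the word choice bar follows the touched half of the screen
// and the keyboard main interface takes the other half. Names are assumptions.
public class LandscapePlacementResolver {

    public static final class Placement {
        public final boolean wordChoiceBarOnLeft; // keyboard goes to the opposite half

        Placement(boolean wordChoiceBarOnLeft) {
            this.wordChoiceBarOnLeft = wordChoiceBarOnLeft;
        }
    }

    // touchX: x coordinate of the touch operation; screenWidth: current screen width.
    public Placement resolve(float touchX, int screenWidth) {
        boolean touchedLeftHalf = touchX < screenWidth / 2f;
        return new Placement(touchedLeftHalf);
    }
}
```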
As shown in (A) and (C) of fig. 8, when the screen of the electronic device 100 is in the full-screen display state, the user touches an input box currently displayed on the screen. When the electronic device 100 detects the touch operation on the input box 601, in response to it, the electronic device 100 obtains that the current second display state of the screen is the landscape display state, acquires the screen coordinates corresponding to the touch operation, and determines from them that the touch operation acts on the left screen area of the current screen; the electronic device 100 then displays a word choice bar 801 within the left half of the current screen and a keyboard main interface 802 within the right half of the current screen.
In some embodiments, if the screen of the electronic device is currently in the full-screen display state and the portrait display state, the word choice bar is displayed at a first preset position of the current screen area and the keyboard main interface at a second preset position of the current screen area; alternatively, a normal keyboard is displayed at the bottom of the current screen.
It should be noted that the first preset position and the second preset position are different positions in the same screen area. The first preset position is a preset area block for displaying the word choice bar, for example the leftmost position of the screen; the second preset position is a preset area block for displaying the keyboard main interface, for example a position at the bottom of the screen.
Referring to fig. 10, fig. 10 is a schematic diagram of some screen interfaces provided in an embodiment of the present application. As shown in (A) of fig. 10, when the electronic device 100 detects a touch operation by the user on the input box 901 and obtains that the current second display state of the screen is the portrait display state, the electronic device 100 acquires the screen coordinates of the touch operation; if the acquired screen coordinates fall within the right half of the screen, the electronic device 100 displays a word choice bar 902 on the rightmost side of the current screen area and displays the keyboard main interface 903 at the bottom of the current screen.
As shown in (B) of fig. 10, when the electronic device 100 detects a touch operation by the user on the input box 1101 and obtains that the current second display state of the screen is the portrait display state, the electronic device 100 acquires the screen coordinates of the touch operation; if the acquired screen coordinates fall within the left half of the screen, the electronic device 100 displays a word choice bar 1102 on the leftmost side of the current screen area and displays the keyboard main interface 1103 at the bottom of the current screen.
As can be seen from (A) and (B) in fig. 10, the word selection bar is displayed vertically on the leftmost or rightmost side of the screen, and the keyboard main interface is displayed horizontally at the bottom of the screen. This arrangement reduces the occlusion of the currently displayed screen content by the word selection bar, and placing the bar on the leftmost or rightmost side of the screen matches the habit of one-handed operation, providing a better experience for users.
As shown in (C) in fig. 10, when the electronic device 100 detects a touch operation of the user on the input box 1201, if the electronic device 100 acquires that the second display state in which the screen is currently located is the portrait display state, a normal keyboard 1202 is displayed at the bottom of the current screen.
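A companion sketch for the portrait-mode rule of fig. 10, under the same caveats as the landscape sketch above: the names Rect, PortraitLayout, and placeForPortraitTouch, as well as the bar width and keyboard height values, are assumptions for illustration only.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// In portrait mode the word selection bar is a vertical strip on one screen edge,
// and the keyboard main interface is a horizontal block along the bottom.
data class PortraitLayout(val wordSelectionBar: Rect, val keyboardMain: Rect)

fun placeForPortraitTouch(
    touchX: Float,
    screenWidth: Float,
    screenHeight: Float,
    barWidth: Float = 120f,      // assumed width of the vertical word selection bar
    keyboardHeight: Float = 800f // assumed height of the bottom keyboard area
): PortraitLayout {
    val keyboardTop = screenHeight - keyboardHeight
    // The bar hugs the edge nearest the touch: leftmost edge as in fig. 10 (B),
    // rightmost edge as in fig. 10 (A); it stops above the keyboard area.
    val bar = if (touchX < screenWidth / 2)
        Rect(0f, 0f, barWidth, keyboardTop)
    else
        Rect(screenWidth - barWidth, 0f, screenWidth, keyboardTop)
    val keyboard = Rect(0f, keyboardTop, screenWidth, screenHeight)
    return PortraitLayout(wordSelectionBar = bar, keyboardMain = keyboard)
}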
Embodiments of the present application also provide a computer-readable storage medium having instructions stored therein which, when executed on a computer or processor, cause the computer or processor to perform one or more steps of any one of the methods described above.
The embodiment of the application also provides a computer program product containing instructions. The computer program product, when run on a computer or processor, causes the computer or processor to perform one or more steps of any of the methods described above.
An embodiment of the present application further provides a chip system including a memory and a processor. When the chip system runs, the chip system or the processor is caused to perform one or more steps of any one of the methods described above. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted through a computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
A person of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
The above description is only a specific implementation of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto; any change or substitution within the technical scope disclosed in the embodiments of the present application shall be covered by that protection scope. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A keyboard display method, applied to an electronic device, characterized in that the method comprises:
when a touch operation on an input box is detected, acquiring a first display state in which a screen is currently located;
if the screen is currently in a split-screen display state, determining a screen area in which an application to which the input box belongs is located;
displaying a word selection bar in the screen area in which the application to which the input box belongs is located, and displaying a keyboard main interface in any screen area other than the screen area in which the application to which the input box belongs is located;
wherein after the acquiring of the first display state in which the screen is currently located when the touch operation on the input box is detected, the method further comprises:
if the screen is currently in a full-screen display state, triggering a split-screen display function to obtain at least two screen areas;
selecting one screen area from the at least two screen areas as a display area of the application to which the input box belongs, displaying the word selection bar in the selected screen area, and displaying the keyboard main interface in at least one screen area other than the selected screen area, wherein the interface displayed in the selected screen area is the interface that was displayed when the screen was in the full-screen display state;
or, if the screen is currently in the full-screen display state, acquiring a second display state in which the screen is currently located;
if the screen is currently in a horizontal screen display state, acquiring screen coordinates corresponding to the touch operation;
and determining display positions of the word selection bar and the keyboard main interface according to the screen coordinates, wherein the display positions comprise a left half screen area and a right half screen area.
2. The keyboard display method of claim 1, wherein the determining of the screen area in which the application to which the input box belongs is located comprises:
acquiring screen coordinates of the input box;
and determining, according to the screen coordinates of the input box, the screen area in which the application to which the input box belongs is located.
3. The keyboard display method of claim 1, wherein the determining of the screen area in which the application to which the input box belongs is located comprises:
acquiring the screen coordinates corresponding to the touch operation;
and determining, according to the screen area range in which the screen coordinates fall, the screen area in which the application to which the input box belongs is located.
4. The keyboard display method of claim 1, wherein the determining of the display positions of the word selection bar and the keyboard main interface according to the screen coordinates comprises:
determining the display positions of the word selection bar and the keyboard main interface according to the position of the screen coordinates on the screen.
5. The keyboard display method of claim 1, wherein the determining of the display positions of the word selection bar and the keyboard main interface according to the position of the screen coordinates on the screen comprises:
if the screen coordinates fall within the coordinate range of the left half screen area of the screen, displaying the word selection bar in the left half screen area of the screen, and displaying the keyboard main interface in the right half screen area of the screen.
6. The keyboard display method of claim 1, wherein the determining of the display positions of the word selection bar and the keyboard main interface according to the position of the screen coordinates on the screen further comprises:
if the screen coordinates fall within the coordinate range of the right half screen area of the screen, displaying the word selection bar in the right half screen area of the screen, and displaying the keyboard main interface in the left half screen area of the screen.
7. The keyboard display method of claim 1, wherein after the acquiring of the second display state in which the screen is currently located, the method further comprises:
if the screen is currently in a vertical screen display state, displaying the word selection bar at a first preset position of the current screen area of the screen, and displaying the keyboard main interface at a second preset position of the current screen area of the screen.
8. An electronic device, comprising: one or more processors, a memory, and a display screen;
the memory and the display screen are coupled to the one or more processors, the memory being configured to store computer program code, the computer program code comprising computer instructions;
wherein the computer instructions, when executed by the one or more processors, cause the electronic device to perform the keyboard display method of any one of claims 1-7.
9. A computer-readable storage medium, in which a computer program is stored, wherein the computer program, when executed by a processor, implements the keyboard display method of any one of claims 1 to 7.
CN202010327962.3A 2020-04-23 2020-04-23 Keyboard display method, electronic device and computer readable storage medium Active CN111669459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010327962.3A CN111669459B (en) 2020-04-23 2020-04-23 Keyboard display method, electronic device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010327962.3A CN111669459B (en) 2020-04-23 2020-04-23 Keyboard display method, electronic device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111669459A (en) 2020-09-15
CN111669459B (en) 2022-08-26

Family

ID=72382805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010327962.3A Active CN111669459B (en) 2020-04-23 2020-04-23 Keyboard display method, electronic device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111669459B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111967727B (en) * 2020-07-27 2023-08-22 华南理工大学 Distributed electric-thermal coupling system state estimation method considering communication packet loss
CN112181166A (en) * 2020-10-27 2021-01-05 维沃移动通信有限公司 Input method soft keyboard display method and device
CN114527926A (en) * 2020-11-06 2022-05-24 华为终端有限公司 Key operation method and electronic equipment
CN115037976A (en) * 2021-03-03 2022-09-09 Oppo广东移动通信有限公司 Display method, terminal and storage medium
CN113703592A (en) * 2021-08-31 2021-11-26 维沃移动通信有限公司 Secure input method and device
CN113986072B (en) * 2021-09-18 2022-09-30 荣耀终端有限公司 Keyboard display method, folding screen device and computer readable storage medium
CN114185478A (en) * 2021-11-05 2022-03-15 北京搜狗科技发展有限公司 Application program display method and device and storage medium
CN117389437A (en) * 2022-07-05 2024-01-12 华为技术有限公司 Multi-window display method and equipment
CN117648050A (en) * 2022-09-05 2024-03-05 Oppo广东移动通信有限公司 Keyboard display method, keyboard display device, electronic equipment and computer readable storage medium
CN115665298A (en) * 2022-10-21 2023-01-31 西安欧珀通信科技有限公司 Electronic equipment and audio output method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5634135B2 (en) * 2010-06-03 2014-12-03 株式会社Pfu Image reading apparatus, image processing method, and program
CN101853103A (en) * 2010-06-03 2010-10-06 中兴通讯股份有限公司 Method and device for realizing virtual keyboard
CN104737117A (en) * 2012-10-17 2015-06-24 阿沃森特亨茨维尔公司 System and method for controlling display of virtual keyboard to avoid obscuring data entry fields
KR101746937B1 (en) * 2015-08-31 2017-06-14 나영빈 Method and apparatus for displaying keypad of mobile terminal
CN105872702A (en) * 2015-12-09 2016-08-17 乐视网信息技术(北京)股份有限公司 Method and device for displaying virtual keyboard

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102439544A (en) * 2009-03-20 2012-05-02 谷歌股份有限公司 Interaction with ime computing device
CN104808935A (en) * 2014-01-27 2015-07-29 上海斐讯数据通信技术有限公司 Virtual keyboard layout method and electronic equipment
CN107526494A (en) * 2017-09-06 2017-12-29 北京小米移动软件有限公司 Keyboard display method, device, terminal and storage medium
CN108519850A (en) * 2018-04-09 2018-09-11 维沃移动通信有限公司 A kind of keyboard interface display methods and mobile terminal

Also Published As

Publication number Publication date
CN111669459A (en) 2020-09-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant