CN113741708A - Input method and electronic equipment - Google Patents

Input method and electronic equipment

Info

Publication number
CN113741708A
CN113741708A
Authority
CN
China
Prior art keywords
input
operating system
electronic device
input method
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010481577.4A
Other languages
Chinese (zh)
Inventor
任建宝
王乃玄
罗朴良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010481577.4A
Priority to PCT/CN2021/097049 (published as WO2021244459A1)
Publication of CN113741708A
Legal status: Pending (current)

Classifications

    • G06F 3/0233: Character input methods
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • A63F 13/30: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application disclose an input method and an electronic device. In the method, a daemon process is created in the guest operating system. The daemon can receive input information from the host-operating-system-side input method over a high-performance communication channel such as a socket or binder, obtain the input channel of the input window that currently holds the focus, and send the input information from the host-side input method into that channel, so that the input window can receive and display input produced by the host-side input method. With this method and device, an input window in the guest operating system can receive input information from the host-side input method and from the guest-side input method at the same time, without switching between the two input methods. This improves input efficiency and the stability of the input method's service state, provides a friendly input environment for the user, and improves the user experience.

Description

Input method and electronic equipment
Technical Field
The invention relates to the field of computer software, in particular to an input method and electronic equipment.
Background
Desktop operating systems and mobile operating systems dominate different segments of the market today. Because application programs run under different mechanisms on different operating systems, there are communication and interaction barriers between operating systems, and in many respects simple, effective, coordinated and unified interfaces and interface fusion cannot be achieved. As a result, an application built for operating system A usually cannot be installed and run on operating system B; for example, an application from a mobile operating system cannot be installed and run directly on a desktop operating system. In some cases a user needs to run a mobile operating system on the computer side; this can be achieved through virtual machine or virtualization technology, such as an Android simulator or an Android container. For example, the user may first start the Android simulator in the desktop operating system, run the Android subsystem, and then launch an application program from within that subsystem.
In this case, the computer can be referred to as a host, and the underlying operating system it runs is the host operating system (Host OS). An operating system running in a simulator, container or the like on the host may be referred to as a guest operating system (Guest OS). The guest operating system then runs on the host, and the host virtualizes a set of virtual hardware (including a processor, memory, I/O devices, and the like) that is independent of the actual hardware, such as a simulator or a container, so that, from the guest operating system's point of view, there is no difference between running in the virtual hardware environment on the host and running on actual hardware.
Because of the communication and interaction barriers between different operating systems, a guest operating system running in a simulator, container or the like is isolated from the host operating system of the host, and each of them uses the input method inside its own system to input text.
To input text in an input box of the guest operating system on the host, one approach is to install a separate input method program in the guest operating system. When the user clicks the text box, the guest operating system automatically invokes the input method program and displays its input interface, and the user enters characters into the input box through the input method in the guest operating system.
This approach has several disadvantages. 1. An input method has to be installed in the guest operating system, such as an Android container, and when the guest-side input method is invoked, the user may have to click a virtual keyboard with the mouse to enter text, which is far less efficient than typing on a physical keyboard. 2. The input method in the host operating system differs from the one in the guest operating system, so the user may have to deal with two different sets of usage habits, and the word libraries of the two input methods cannot be shared. 3. The virtual keyboard of the guest-side input method pops up over the display interface and can block it, which degrades the user experience.
Disclosure of Invention
This application provides an input method and an electronic device, to solve the problems that, when text is input in the guest operating system, the host-operating-system-side input method and the guest-operating-system-side input method cannot run at the same time and the service state of the input method may become disordered.
The above and other objects are achieved by the features of the independent claims. Further implementations are presented in the dependent claims, the description and the drawings.
In a first aspect, an embodiment of the present invention provides an input method, which is applied to an electronic device, where a host operating system and a guest operating system run on the electronic device, where the method includes:
The electronic device displays a first user interface of the host operating system; a second user interface of a first application program is displayed within the first user interface, the first application program is loaded in the guest operating system, and the second user interface may include a first input window. When the electronic device detects a first user operation in the first input window, the electronic device may display an input focus in the first input window and may then acquire an input channel corresponding to the first input window. The electronic device may acquire a first input object through a first input method on the host operating system side. An input daemon in the guest operating system may acquire the first input object from the host operating system side and then transmit the first input object to the first input window through the input channel, so that the electronic device can display the first input object in the first input window. The host operating system may be a mainstream PC desktop system, and the guest operating system may be a mainstream mobile system such as Android. In some embodiments, the first user interface of the host operating system may be, for example, the desktop of the host operating system; the first application program of the guest operating system may be, for example, the Android version of a chat application; the second user interface of the first application may be, for example, the user interface of that Android application; and the first input window included in the second user interface may be, for example, a chat input box in its conversation interface. The input focus represents the position where input can currently be made and may be displayed as a flashing cursor. The input channel is the input interface between the input method used by the user on the guest operating system side and the first application program, and can be used to transmit an input object (such as text) acquired by the input method to the first application program. The first input object may specifically be text, a picture, an emoticon, and the like.
By implementing the method of the first aspect, when a user inputs text in the guest operating system, the text can be entered with the input method on the host operating system side, without installing an input method on the guest operating system side; in addition, the user can also use the host-side input method and the guest-side input method at the same time, which improves input efficiency and the stability of the input method's service state.
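The following minimal Java sketch models the flow of the first aspect. All class and method names (InputWindow, InputChannel, InputDaemon, and so on) are illustrative assumptions rather than actual host- or guest-operating-system interfaces; the sketch only shows the order of the steps: the user operation places the focus, the window's input channel is acquired, the host-side input method produces an input object, and the daemon delivers it through the channel so the window can display it.

```java
// Illustrative model of the first-aspect flow; none of these classes are real
// host- or guest-operating-system interfaces.
interface InputChannel {
    void deliver(String inputObject);          // carries an input object to one window
}

class InputWindow {
    private final StringBuilder content = new StringBuilder();
    private boolean hasFocus;

    void onFirstUserOperation() { hasFocus = true; }   // step 1: user selects the window, focus is shown
    boolean hasFocus()          { return hasFocus; }

    InputChannel inputChannel() {                      // step 2: acquire the window's input channel
        return obj -> {
            content.append(obj);
            System.out.println("first input window shows: " + content);
        };
    }
}

class HostSideInputMethod {                            // first input method on the host OS side
    String acquireFirstInputObject() { return "hello"; }   // step 3: e.g. text typed on the physical keyboard
}

class InputDaemon {                                    // daemon in the guest operating system
    void forward(HostSideInputMethod ime, InputWindow focusedWindow) {
        if (focusedWindow.hasFocus()) {                // steps 4-5: pass the object through the channel
            focusedWindow.inputChannel().deliver(ime.acquireFirstInputObject());
        }
    }
}

public class FirstAspectFlow {
    public static void main(String[] args) {
        InputWindow chatBox = new InputWindow();       // first input window in the second user interface
        chatBox.onFirstUserOperation();
        new InputDaemon().forward(new HostSideInputMethod(), chatBox);
    }
}
```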
With reference to the first aspect, in some embodiments, a first process runs on the host operating system of the electronic device, and the first process is configured to load an image of the guest operating system and run the guest operating system. In some embodiments, the first process may be an Android simulator or an Android container.
In combination with the first aspect, in some embodiments, the method may further include: the electronic device can pass the first input object to the input daemon through a window instance of the first application program on the host operating system side.
In combination with the first aspect, in some embodiments, the method may further include: the input daemon process may transmit the first input object to the input channel, the input channel may transmit the first input object to the first application program, and the first application program may transmit the first input object to the view corresponding to the first input window.
In combination with the first aspect, in some embodiments, the method may further include: if an input channel is detected as available for the first input method, the input daemon may pass the first input object to the input channel.
In combination with the first aspect, in some embodiments, the method may further include: the electronic device may determine that the input channel is available for the first input method if the input channel is not occupied by a second input method on the guest operating system side.
In combination with the first aspect, in some embodiments, the method may further include: if the input channel is occupied by the second input method on the client operating system side, but the first input method has higher priority for acquiring the first input object than the second input method for acquiring the second input object, the electronic equipment can determine that the input channel is available for the first input method.
In combination with the first aspect, in some embodiments, the method may further include: when the first input method occupies the input channel, if it is not detected that the first input method obtains the first input object within a first time, for example, within five minutes, the electronic device may cancel the first input method from occupying the input channel.
In combination with the first aspect, in some embodiments, the method may further include: when the first input method occupies the input channel but does not transmit the first input object, once the second input method is detected to acquire the second input object, the electronic device can cancel the first input method from occupying the input channel.
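As a rough illustration of the arbitration rules in the preceding paragraphs, the Java sketch below shows one possible way to decide whether the input channel is available to the host-side (first) input method and when its occupation is cancelled. The priority values, the five-minute idle limit and all identifiers are assumptions made for the example, not the actual implementation.

```java
// Hypothetical arbitration of the input channel between the host-side (first)
// input method and the guest-side (second) input method.
import java.time.Duration;
import java.time.Instant;

public class ChannelArbiter {
    enum Ime { NONE, HOST_FIRST_IME, GUEST_SECOND_IME }

    private Ime owner = Ime.NONE;
    private Instant lastHostInput = Instant.now();
    private static final Duration IDLE_LIMIT = Duration.ofMinutes(5);  // the "first time" in the text

    /** The channel is available to the host-side IME if nobody holds it,
     *  or if the host IME outranks the guest-side holder. */
    boolean availableForHost(int hostPriority, int guestPriority) {
        return owner != Ime.GUEST_SECOND_IME || hostPriority > guestPriority;
    }

    void occupyByHost() { owner = Ime.HOST_FIRST_IME; lastHostInput = Instant.now(); }

    /** Cancel the host IME's occupation when it has been idle too long. */
    void onTick(Instant now) {
        if (owner == Ime.HOST_FIRST_IME
                && Duration.between(lastHostInput, now).compareTo(IDLE_LIMIT) > 0) {
            owner = Ime.NONE;
        }
    }

    /** Cancel the host IME's occupation as soon as the guest-side IME produces input. */
    void onGuestInput() {
        if (owner == Ime.HOST_FIRST_IME) owner = Ime.GUEST_SECOND_IME;
    }

    public static void main(String[] args) {
        ChannelArbiter arbiter = new ChannelArbiter();
        if (arbiter.availableForHost(2, 1)) {   // example priorities: host IME outranks guest IME
            arbiter.occupyByHost();
        }
        arbiter.onGuestInput();                  // guest IME typed something: host loses the channel
        System.out.println("owner after guest input: " + arbiter.owner);
    }
}
```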
In combination with the first aspect, in some embodiments, the method may further include: when it is detected that the first input method occupies the input channel, the electronic device may display first indication information in the second user interface, where the first indication information is used to indicate that the electronic device is capable of implementing input in the second user interface using the first input method.
In combination with the first aspect, in some embodiments, the method may further include: When it is detected that the input focus is displayed in the second user interface, the electronic device may display second indication information in the second user interface, the second indication information being used to indicate that the electronic device is capable of implementing input in the second user interface using the second input method.
In combination with the first aspect, in some embodiments, the method may further include: when a second user operation to exit the input state is detected in the second user interface, the electronic device may cancel displaying the second indication information in the second user interface.
In combination with the first aspect, in some embodiments, the second user action may include one or more of: clicking a location in the second user interface other than the input window, clicking a location in the first user interface other than the second user interface, exiting the second user interface, and so on.
With reference to the first aspect, in some embodiments, the first user operation is an operation of selecting the first input window, and the first user operation may include one or more of: a mouse click operation that selects the first input window, a touch operation acting on a touch pad, a voice command operation, an air gesture operation, and the like.
In combination with the first aspect, in some embodiments, the method may further include: when the first user operation is detected in the first input window, the electronic device can display a virtual input keyboard of the second input method.
In combination with the first aspect, in some embodiments, the method may further include: the electronic device detects that a first key in the virtual input keyboard is clicked, the electronic device can acquire a second input object generated by clicking the first key through a second input method, the electronic device can transmit the second input object to the first input window through the input channel, and the electronic device can display the second input object in the first input window.
With reference to the first aspect, in some embodiments, the obtaining of the first input object by the first input method includes one or more of: an input object received through a physical keyboard of the electronic device, an input object received through a soft keyboard of the first input method, an input object received through a voice command, an input object received through a touch pad, an input object received through a touch screen, and the like.
In a second aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor coupled to the memory, where the memory stores computer-executable instructions and the processor is configured to invoke the instructions so that the electronic device implements any function of the electronic device in the first aspect; details are not described here again.
In a third aspect, an embodiment of the present invention provides a computer storage medium, where a computer program is stored in the storage medium, where the computer program includes executable instructions, and when the executable instructions are executed by a processor, the processor is caused to perform operations corresponding to the method provided in the first aspect.
According to the technical scheme of the application, when a user inputs text on the client operating system, the input method on the side of the host operating system can be used for inputting the text, and the input method on the side of the client operating system does not need to be installed; in addition, the user can also use the input method of the host operating system side and the input method of the client operating system side for input at the same time, so that the input efficiency and the stability of the service state of the input method are improved, a friendly input operating environment is provided for the user, and the use experience of the user is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention;
FIG. 2 is a block diagram of a software architecture provided by an embodiment of the present invention;
FIG. 3 is a block diagram of an input method in the prior art;
FIG. 4 is a schematic illustration of a user interface provided by an embodiment of the present invention;
FIG. 5 is a schematic illustration of a user interface provided by an embodiment of the present invention;
FIG. 6A is a schematic illustration of a user interface provided by an embodiment of the present invention;
FIG. 6B is a schematic illustration of a user interface provided by an embodiment of the present invention;
FIG. 7 is a schematic illustration of a user interface provided by an embodiment of the present invention;
FIG. 8 is a schematic diagram of an input implementation flow provided by an embodiment of the present invention;
FIG. 9A is a flow chart of an input method provided by an embodiment of the invention;
FIG. 9B is a flowchart of an input method according to an embodiment of the present invention.
Detailed Description
The embodiments of the present application will be described in detail below with reference to the accompanying drawings. The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
This application provides an input method and an electronic device, to solve the problems that the host-operating-system-side input method cannot be used to input text into the guest operating system, that the host-side input method and the guest-side input method cannot run at the same time when input is performed in the guest operating system, and that the service state of the input method may become disordered. An input method, i.e. an input method application program, also called an input method editor (IME), is software used to enter information onto a display screen; the input information includes, but is not limited to, Chinese characters, numbers, pictures, and the like. The idea of this application is to create an input daemon in the guest operating system. The input daemon can receive text input from the host-operating-system-side input method through a high-performance communication channel such as a socket or binder; when the input daemon obtains the input channel of the input window that currently holds the focus, it can send the input information from the host-side input method to that input window through the window's input channel, and the input window can then receive and display the input information from the host-side input method. The focus refers to the position where input can be made and may be represented as a flashing cursor. The input window refers to a window that can acquire the input focus and can accept, hold, and edit input content; it may specifically be an input box, a text box, a picture box, an address bar, a search box, an editable page (e.g. a notepad or Word page), a table capable of holding input content (e.g. an Excel sheet), and the like. In addition, in some other embodiments, the process used to obtain the input text from the host operating system side may also be another type of process or program; for example, it may be a background process or another application program, which is not limited in this application.
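A minimal Java sketch of the daemon idea described above is given below. It uses an ordinary local TCP socket as a stand-in for the socket or binder channel mentioned in the text, and a simple callback as a stand-in for the focused window's input channel; every name in it is an illustrative assumption rather than the real guest-operating-system interface.

```java
// Minimal stand-in for the input daemon described above. It listens on a local
// socket (plain TCP here, instead of a Unix socket or binder), reads text sent
// by the host-OS-side input method, and hands it to the input channel of the
// window that currently owns the focus. All names are illustrative.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

public class InputDaemonSketch {
    private volatile Consumer<String> focusedChannel = s -> { };  // input channel of the focused window

    /** Called by the window-management side when the input focus moves. */
    public void onFocusChanged(Consumer<String> channelOfFocusedWindow) {
        this.focusedChannel = channelOfFocusedWindow;
    }

    /** Blocks, accepting text from the host-side input method and forwarding it. */
    public void run(int port) throws Exception {
        try (ServerSocket server = new ServerSocket(port)) {
            while (true) {
                try (Socket host = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(host.getInputStream(), StandardCharsets.UTF_8))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        focusedChannel.accept(line);   // deliver through the focused window's channel
                    }
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        InputDaemonSketch daemon = new InputDaemonSketch();
        daemon.onFocusChanged(text -> System.out.println("input window shows: " + text));
        daemon.run(5005);   // the port number is an arbitrary example
    }
}
```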
According to the technical scheme of the application, when a user inputs text on the client operating system, the input method on the side of the host operating system can be used for inputting the text, and the input method on the side of the client operating system does not need to be installed; in addition, the user can also use the input method of the host operating system side and the input method of the client operating system side for input at the same time, so that the input efficiency and the stability of the service state of the input method are improved, a friendly input operating environment is provided for the user, and the use experience of the user is improved.
In this embodiment, when a Guest operating system (Guest OS) is running in a simulator or a container on a Host, the Host operating system (Host OS) may virtualize a set of virtual hardware environments (including a processor, a memory, an I/O device, and the like) independent of actual hardware for the Guest operating system (Guest OS), so that there is no difference between running in the virtual hardware environment on the Host operating system (Host OS) and running in the actual hardware for the Guest operating system.
The term "User Interface (UI)" in the embodiments of the present application is a media interface for performing interaction and information exchange between an application program or an operating system and a user, and implements conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is a source code written by a specific computer language such as java, extensible markup language (XML), and the like, and the interface source code is analyzed and rendered on the terminal device, and finally presented as content that can be identified by the user, such as controls such as pictures, characters, buttons, and the like. Controls, also called widgets, are basic elements of user interfaces, and typically have a toolbar (toolbar), menu bar (menu bar), text box (text box), button (button), scroll bar (scrollbar), picture, and text. The properties and contents of the controls in the interface are defined by tags or nodes, such as XML defining the controls contained by the interface by nodes < TextView >, < ImgView >, < VideoView >, etc. A node corresponds to a control or attribute in the interface, and the node is rendered as user-viewable content after parsing and rendering. In addition, many applications, such as hybrid applications (hybrid applications), typically include web pages in their interfaces.
A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In the embodiments of the present application, the types of the host operating system and the guest operating system are not limited in any way. The input method provided by the embodiments of the application is applied to the electronic device 100; the electronic device 100 may be referred to as a host, on which a host operating system is installed and on which a guest operating system may be loaded. The embodiments of the present application are described by taking a mainstream desktop system as an example of the host operating system and Android as an example of the guest operating system. Those skilled in the art will appreciate that the embodiments of the present application may also be implemented with other operating systems, and this application is not limited in this respect.

An exemplary electronic device 100 provided in this embodiment is described below. Some operating system may be installed on the electronic device 100, and the installed operating system is not limited to any particular one of the common desktop operating systems. The electronic device 100 may be a desktop computer, a notebook computer, a tablet computer, or the like.
Fig. 1 shows a hardware configuration diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an internal memory 120, an external memory interface 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, a display screen 151, a key 152, a camera 153, an indicator 154, an audio module 160, a speaker 161, a receiver 162, a microphone 163, an earphone interface 164, a wired communication module 171, a wireless communication module 172, an antenna 1, a sensor module 180, and the like. The sensor module 180 may include a pressure sensor 180A, a fingerprint sensor 180B, a temperature sensor 180C, a touch sensor 180D, an ambient light sensor 180E, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. For example, in some embodiments, the electronic device 100 may have a motor, a mobile communication module (2G/3G/4G/5G), a SIM card interface, an eSIM chip, and so forth, in addition to the hardware described above. Thus, with respect to the specific hardware configuration of electronic device 100, more or fewer components than shown may be included, or certain components may be combined, or certain components may be split, or a different arrangement of components may be used, as the case may be.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
Internal memory 120 may be used to store computer-executable program code, including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 120. The internal memory 120 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data created during use of the electronic device 100, and the like. In addition, the internal memory 120 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The external memory interface 121 may be used to connect an external memory card, such as a removable hard disk, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 121 to implement a data storage function. For example, files such as music, video, etc. are saved in an external storage hard disk.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 120, the external memory 121, the display 151, the camera 153, the wireless communication module 172, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The electronic device 100 implements a display function through the GPU, the display screen 151, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 151 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 151 is used to display images, videos, and the like. The display screen 151 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 151, N being a positive integer greater than 1.
The keyboard 152 may include a physical keyboard, a touch keyboard, and the like. The electronic device 100 can receive keyboard input and generate key signal input related to user settings and function control of the electronic device 100. It should be noted that, in the embodiments of the present application, the keys on the keyboard can be divided into character keys and non-character keys. When the user taps or presses a character key, an operation command corresponding to that character key is generated, and the device can generate text content according to the operation command. The character keys may specifically include numeric keys (0-9), alphabetic keys (a-z), and punctuation keys (e.g. the comma, period, and vertical-bar keys). Non-character keys are the keys on the keyboard other than the character keys, and specifically include the Control (Ctrl), Shift, Alt, Caps Lock, Insert, Home, End, Delete (Del), Page Up (PgUp), Page Down (PgDn), Enter, BackSpace, and arrow keys. By tapping or pressing a non-character key, the user can generate a corresponding operation command; such a command may perform actions such as moving the cursor, switching case, inserting, deleting, line feed, and sending within the input box that holds the input focus.
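The split between character keys and non-character keys described above can be illustrated with the following Java sketch; the key names, the command set and the punctuation list are assumptions made for the example and do not correspond to a real key-event API.

```java
// Illustrative classification of keys into character keys (which produce text)
// and non-character keys (which produce operation commands in the focused box).
public class KeyClassifier {
    enum Command { MOVE_CURSOR, TOGGLE_CASE, DELETE, NEW_LINE, NONE }

    /** Character keys produce text to be inserted at the input focus. */
    static boolean isCharacterKey(String key) {
        return key.length() == 1
                && (Character.isLetterOrDigit(key.charAt(0))
                    || "!?,.;:+-*/@#|".indexOf(key.charAt(0)) >= 0);
    }

    /** Non-character keys map to operation commands acting inside the focused input box. */
    static Command toCommand(String key) {
        switch (key) {
            case "Enter":     return Command.NEW_LINE;
            case "BackSpace":
            case "Del":       return Command.DELETE;
            case "CapsLock":  return Command.TOGGLE_CASE;
            case "Home": case "End": case "PgUp": case "PgDn":
                              return Command.MOVE_CURSOR;
            default:          return Command.NONE;
        }
    }

    public static void main(String[] args) {
        for (String key : new String[] { "a", "7", ",", "Enter", "Del" }) {
            System.out.println(key + " -> "
                    + (isCharacterKey(key) ? "insert text \"" + key + "\"" : toCommand(key)));
        }
    }
}
```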
The electronic device 100 may implement a photographing function through the ISP, the camera 153, the video codec, the GPU, the display screen 151, the application processor, and the like.
The camera 153 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 153, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The network communication function of the electronic device 100 may be implemented by the wired communication module 171, the wireless communication module 172, the antenna 1, the modem processor, the baseband processor, and the like.
The antenna 1 is used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 161, the receiver 162, etc.) or displays an image or video through the display screen 151. In some embodiments, the modem processor may be a stand-alone device.
The wired communication module 171 may provide wired communication solutions including ethernet, lan, internet, etc. applied to the electronic device 100. The wired communication module 171 may be one or more devices integrating at least one communication processing module.
The wireless communication module 172 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 172 may be one or more devices integrating at least one communication processing module. The wireless communication module 172 receives electromagnetic waves via the antenna 1, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 172 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 1 to radiate the electromagnetic waves.
The electronic device 100 may implement audio functions via the audio module 160, speaker 161, microphone 163, headphone interface 164, and application processor, among other things. Such as music playing, recording, etc.
The audio module 160 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 160 may also be used to encode and decode audio signals. In some embodiments, the audio module 160 may be disposed in the processor 110, or some functional modules of the audio module 160 may be disposed in the processor 110.
The speaker 161, also called a "horn", is used to convert an audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 161 or listen to a handsfree call.
The receiver 162, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 100 receives a call or voice information, it can receive voice by placing the receiver 162 close to the ear.
The microphone 163, also called "microphone", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 163 by speaking the user's mouth near the microphone 163. The electronic device 100 may be provided with at least one microphone 163. In other embodiments, the electronic device 100 may be provided with two microphones 163 to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 163 to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 164 is used to connect a wired earphone. The headset interface 164 may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The sensor module 180 may include a pressure sensor 180A, a fingerprint sensor 180B, a temperature sensor 180C, a touch sensor 180D, an ambient light sensor 180E, and the like.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 151. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 151, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions.
The fingerprint sensor 180B is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180C is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy based on the temperature detected by the temperature sensor 180C. For example, when the temperature reported by the temperature sensor 180C exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180C, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to prevent a low temperature from causing the electronic device 100 to shut down abnormally. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
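The temperature processing strategy described above can be summarized in the following hedged Java sketch; the threshold values and the resulting actions are placeholders chosen for illustration and are not real product parameters.

```java
// Hypothetical thermal policy mirroring the three cases in the text:
// throttle when too hot, heat the battery when cold, boost voltage when very cold.
public class ThermalPolicy {
    private static final float HIGH_TEMP_C     = 45.0f;   // assumed upper threshold
    private static final float LOW_TEMP_C      = 0.0f;    // assumed "another threshold"
    private static final float VERY_LOW_TEMP_C = -10.0f;  // assumed "further threshold"

    static String onTemperature(float celsius) {
        if (celsius > HIGH_TEMP_C)      return "reduce performance of the nearby processor";
        if (celsius < VERY_LOW_TEMP_C)  return "boost battery output voltage";
        if (celsius < LOW_TEMP_C)       return "heat the battery";
        return "no action";
    }

    public static void main(String[] args) {
        for (float t : new float[] { 50f, 20f, -5f, -20f }) {
            System.out.println(t + " C -> " + onTemperature(t));
        }
    }
}
```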
The touch sensor 180D is also referred to as a "touch panel". The touch sensor 180D is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 151.
The ambient light sensor 180E is used to sense the ambient light level. The electronic device 100 may adaptively adjust the brightness of the display screen 151 according to the perceived ambient light brightness. The ambient light sensor 180E may also be used to automatically adjust the white balance when the camera 153 is taking a picture.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present invention take the Android operating system as an example of the guest operating system to illustrate its layered architecture. It should be noted that the Android operating system in the examples of the present application is only a software environment for illustrating the technical solution of this embodiment, and those skilled in the art will understand that the embodiments of the present application can also be implemented with other operating systems.
FIG. 2 shows a software architecture block diagram of the Android operating system according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android operating system is divided into four layers, namely an application layer, an application framework layer, a function library layer, and a kernel layer from top to bottom.
1. Application (applications) layer
The application layer is the top layer of the operating system and may include a series of application packages. As shown in FIG. 2, the application packages may include applications such as an input method, WLAN, calendar, Bluetooth, gallery, browser, and music. Of course, a developer may also write an application program and install it into this layer.
In the embodiments of the present application, one or more input method applications, for example one or more third-party input method applications, are installed in the application layer; this is not limited in this application.
Generally, an application is developed in the Java language and is implemented by calling the application programming interfaces (APIs) provided by the application framework layer.
2. Application framework (application framework) layer
The application framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. Developers can interact with the lower layers of the operating system (e.g., the function libraries, the Android kernel, etc.) through the application framework to develop their own applications. The application framework mainly comprises a series of services and management systems of the Android operating system.
As shown in FIG. 2, the application framework layers may include a window manager, a content provider, an input method manager, a content aware service, a view system, a notification manager, a resource manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, browsing history and bookmarks, phone books, etc.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. Specifically, it may include controls such as lists (list), grids (grid), texts (text), buttons (button), and pictures (image). A display interface may be composed of one or more views.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is presented in the status bar, or a prompt sound is played.
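For reference, the following is a minimal sketch (not part of the patent) of how an application asks the notification manager described above to show a download-complete notification on Android; the channel id and strings are illustrative only.

```java
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.content.Context;

public final class DownloadNotifier {
    // Posts a status-bar notification once a download has finished.
    public static void notifyDownloadComplete(Context context) {
        NotificationManager nm =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
        nm.createNotificationChannel(
                new NotificationChannel("downloads", "Downloads",
                        NotificationManager.IMPORTANCE_DEFAULT));
        Notification notification = new Notification.Builder(context, "downloads")
                .setSmallIcon(android.R.drawable.stat_sys_download_done)
                .setContentTitle("Download complete")            // example text only
                .setContentText("The file has finished downloading")
                .build();
        nm.notify(1, notification);  // shown in the status bar by the notification manager
    }
}
```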
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
As shown in fig. 2, an input method manager (IMM) is further included in the application framework layer. Implementing an input method requires the input method manager and the participation of the various layers of the Android operating system. A user's input can be recognized only with the support of the various drivers of the Linux kernel layer; the application program that needs to invoke the input method runs in the application layer; the application framework layer is used for displaying the content, user interface (UI), notifications, and the like of the application; and the function library layer provides function support for the operation of the input method, such as the parsing of character codes, support for fonts and graphics, and the display of input animation effects.
Specifically, the input method framework (IMF) is a system component of the Android operating system, and a developer can develop an input method application based on the IMF. Referring to fig. 3, the IMF mainly comprises three parts: the input method manager, the input method editor, and the client application.
(1) The Input Method Manager (IMM) is the central point that manages the interaction between the other parts of the IMF. It can be seen as a client API, existing in the context of each application, that communicates with a global system service managing all inter-process interactions.
(2) Input Method Editor (IME): implements a specific interaction model that allows the user to generate text, and processes the user's input. The system binds the currently used input method, causing it to be created and run, and the system determines when the input method hides or displays its UI. Only one IME can run at a time.
(3) A Client Application (CA) refers to the application that currently invokes the input method, for example a chat application; it controls the input focus and the state of the IME through the IMM. Only one client application can use an IME at a time.
The three parts need to cooperate to complete the work of the input method. For example, when an application that needs to invoke an input method is opened and its input box obtains the input focus, the CA notifies the IMM to open the input method, and the IMM then looks up the currently selected IME and invokes it. When the user submits input, the IME passes the input information to the CA.
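As an illustration of the IME role described above, the following is a minimal sketch (not the patent's own code) of an input method editor built on the IMF: it extends InputMethodService and commits text to the client application through the InputConnection that the framework binds for the currently focused view. The single-button keyboard is purely illustrative.

```java
import android.inputmethodservice.InputMethodService;
import android.view.View;
import android.view.inputmethod.InputConnection;
import android.widget.Button;

public class SketchIme extends InputMethodService {
    @Override
    public View onCreateInputView() {
        Button key = new Button(this);
        key.setText("A");
        // When the user taps the key, commit the character to the client application (CA).
        key.setOnClickListener(v -> {
            InputConnection ic = getCurrentInputConnection();
            if (ic != null) {
                ic.commitText("A", 1);
            }
        });
        return key;  // the system decides when to show or hide this IME UI
    }
}
```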
The interactive process of the three parts of IMF is explained below with reference to fig. 3.
IMM: binds the IME and the CA, and performs parameter setting and security control on the input method module, playing a coordinating role in the middle. The InputMethodManagerService in the IMM is the master control center for input methods in the whole system, and the CA requests to invoke an input method through this service; the WindowManager is responsible for displaying the input method, receiving user events, and obtaining focus authentication. The InputMethodManagerService in the IMM obtains information about the currently bound CA, such as the input box type, through the InputMethodClient of the CA; the InputMethodManagerService in the IMM starts, modifies, and hides the IME through the InputView of the IME.
IME: receives the user's key input information and calls the client interface to pass the input information to the CA. The InputView can acquire the user's input information from the keyboard, and can also process the CA's text through the CA's InputContext (InputConnection), for example by deleting characters, replacing characters, controlling the cursor position, and the like.
CA: displays the user's input information and provides the content of the current text box to the IME. The InputMethodManager of the CA indirectly controls the display or hiding of the IME through the InputMethodManagerService of the IMM; the InputMethodManager of the CA provides the state of the current text box, such as cursor changes, to the IME through the IME's InputMethodSession (InputMethodSessionWrapper), so that the IME can adjust its input state. The UI control can handle some key events that the IME does not process, such as the Home key, so that such key events reach the CA directly. The InputContext (InputConnection) displays the input information on the user interface after the interaction between the CA and the IME is completed. The InputMethodClient is implemented as the client of the input method management service; it identifies the current client application and receives state changes from the management service.
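On the CA side, the counterpart of the IME is the InputConnection that an editable view hands to the IME. The sketch below (an assumption, not the patent's code) shows a custom view returning a BaseInputConnection so that text committed by the IME reaches the application.

```java
import android.content.Context;
import android.text.InputType;
import android.view.View;
import android.view.inputmethod.BaseInputConnection;
import android.view.inputmethod.EditorInfo;
import android.view.inputmethod.InputConnection;

public class SimpleEditorView extends View {
    private final StringBuilder text = new StringBuilder();

    public SimpleEditorView(Context context) {
        super(context);
        setFocusableInTouchMode(true);  // allows the view to obtain the input focus
    }

    @Override
    public InputConnection onCreateInputConnection(EditorInfo outAttrs) {
        outAttrs.inputType = InputType.TYPE_CLASS_TEXT;  // tells the IME what kind of box this is
        return new BaseInputConnection(this, true) {
            @Override
            public boolean commitText(CharSequence newText, int newCursorPosition) {
                text.append(newText);   // text delivered by the IME through the input connection
                invalidate();
                return super.commitText(newText, newCursorPosition);
            }
        };
    }
}
```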
Generally, the input method applications in the application layer may support multiple types of virtual keyboards (also known as soft keyboards). For example, an input method application may provide a nine-grid pinyin keyboard, a full-keyboard pinyin keyboard, an English keyboard, a numeric keyboard, an emoticon keyboard, a voice keyboard, and the like, which is not limited in the embodiments of this application.
For example, the emoticon keyboard may include elements such as emoji, symbols, pictures, or animations. The emoticon keyboard displayed in the display interface may be an emoji keyboard, which contains a plurality of emoji elements; it may be a kaomoji keyboard, which contains expressions composed of symbol elements; or it may be a sticker keyboard, which may contain pictures or animated images.
In some embodiments of the present application, the electronic device may detect a type of input content required in the input window and provide an appropriate virtual keyboard according to the type, for example, when the electronic device detects that the input content required in the displayed input window is a text type, the input method application may display a full keyboard type pinyin keyboard, and when the electronic device detects that the input content required in the displayed input window is a numeric type, the input method application may display a numeric keyboard.
The virtual keyboard provided by the input method application may also include a switching button for switching the keyboard type. The switching button can be used to switch among different types of keyboards; by clicking the switching button multiple times, a user can cycle through the currently displayed virtual keyboards in a certain order, which is not limited in this application.
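A minimal sketch (an assumption, not the patent's code) of how an IME can choose a matching keyboard from the input type reported by the input window, as described above; KeyboardKind and showKeyboard are hypothetical helpers.

```java
import android.inputmethodservice.InputMethodService;
import android.text.InputType;
import android.view.inputmethod.EditorInfo;

public class TypeAwareIme extends InputMethodService {
    enum KeyboardKind { FULL_PINYIN, NUMERIC }   // hypothetical keyboard identifiers

    @Override
    public void onStartInputView(EditorInfo info, boolean restarting) {
        super.onStartInputView(info, restarting);
        // EditorInfo.inputType describes what kind of content the focused input window expects.
        switch (info.inputType & InputType.TYPE_MASK_CLASS) {
            case InputType.TYPE_CLASS_NUMBER:
            case InputType.TYPE_CLASS_PHONE:
                showKeyboard(KeyboardKind.NUMERIC);
                break;
            default:
                showKeyboard(KeyboardKind.FULL_PINYIN);
                break;
        }
    }

    private void showKeyboard(KeyboardKind kind) {
        // A real IME would swap its keyboard view here; omitted in this sketch.
    }
}
```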
In addition, as also shown in fig. 2, in the embodiment of the present application, a content-aware service (content sensor service) that is open to input method applications is also provided in the application framework layer. The content-aware service is used to obtain the type of a chat message, or the chat message itself, from the chat interface of a chat-type application. For example, a text-type chat message, a voice-type chat message, or a picture-type chat message.
The input method application may register with the content-aware service in advance. For example, an input method application may register its own package name with the content-aware service and request the content-aware service to monitor newly generated chat messages in the chat application's chat interface. Thus, when a new chat message is generated in the chat interface, the content-aware service can extract the chat message and determine its specific type. Further, the content-aware service may send the specific type of the chat message to the input method application, so that the input method application can display a virtual keyboard of the corresponding type in the chat interface according to that type. For example, when the latest chat message is of the voice type, the input method application may correspondingly display a voice keyboard; when the latest chat message is of the emoticon type, the input method application may correspondingly display an emoticon keyboard, and so on.
That is, the electronic device may automatically display a virtual keyboard matching the type of the latest chat message, thereby reducing the operations in which the user frequently switches the virtual keyboard while inputting information with the input method application, and improving the input efficiency of the terminal and the user's input experience.
In some embodiments of the present application, the input method application may also invoke the content-aware service to actively obtain the latest chat message or its type. For example, the input method application may invoke the content-aware service to obtain the latest chat message when the user clicks the input box of the chat application. Alternatively, the content-aware service may actively send the latest chat message to the input method application while the virtual keyboard of the input method has not been collapsed, so that the input method application can switch the type of the virtual keyboard in time, according to the type of the latest chat message, while the virtual keyboard is displayed. The specific display method of the input method's virtual keyboard is described in the following embodiments and is not detailed here.
3. Library of functions (libraries) layer
The function library layer supports the application framework and is an important link connecting the application framework layer with the Android kernel layer. The function library layer comprises a number of function libraries compiled from C or C++ programs; these libraries can be used by different components of the operating system and provide services to developers through the application framework layer.
In particular, the function library layer may include a plurality of functional modules. For example: interface manager (surface manager), multimedia Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), two-dimensional graphics engines (e.g., SGL), and the like.
The interface management library is used to manage the display subsystem. It is mainly responsible for managing access to the display system, in particular for managing the interaction between display and access operations when multiple application programs are running, and it is also used to display and composite 2D and 3D drawing, providing the fusion of 2D and 3D layers for multiple applications.
The multimedia library supports various commonly used audio and video format playback and recording, still image files, and the like. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The function library layer may also include other function libraries for implementing various functions of the mobile phone, such as: SGL (scalable graphics library), a 2D graphics and image processing engine based on XML (extensible markup language) files; SSL (secure sockets layer), which sits between the TCP/IP protocol and various application-layer protocols and provides support for data communication; and so on.
The Android Runtime is the runtime environment on the Android operating system and is a new virtual machine used by the Android operating system. The Android Runtime may include a core function library and a virtual machine. The core function library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android. The Android Runtime is responsible for scheduling and managing the Android system. The Android Runtime adopts AOT (ahead-of-time) technology: when an application is installed for the first time, its bytecode is compiled into machine code in advance, so that the application becomes a truly native application; when it is run later, the compilation step is eliminated, and both startup and execution become faster.
In some embodiments of the present application, the Android Runtime may be replaced by a core function library (core libraries) and a dalvik virtual machine (dalvik virtual machine). The core function library provides most of the functions of the Java language API and provides interfaces for the application framework layer to call the underlying program libraries, mainly through the Java Native Interface (JNI). It also contains some core APIs of the operating system, such as the android.* packages. The dalvik virtual machine uses a JIT (just-in-time) runtime compilation mechanism: each time a process is started, the virtual machine needs to recompile the bytecode in the background, which has a certain impact on startup speed. Each application runs in an instance of a dalvik virtual machine, and each dalvik virtual machine instance is an independent process space. The dalvik virtual machine is designed so that multiple virtual machines can run efficiently on one device. The executable file format of the dalvik virtual machine is .dex, a compressed format designed specifically for dalvik that is suitable for systems with limited memory and processor speed. The dalvik virtual machine relies on the Linux kernel to provide basic functions (threads, low-level memory management). It can be understood that the Android Runtime and dalvik are different types of virtual machines, and those skilled in the art can select different types of virtual machines under different conditions.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
4. Kernel layer
The kernel layer provides the core system services of the operating system, such as security, memory management, process management, the network protocol stack, and the driver model; these core services are based on the Linux kernel. The Linux kernel also acts as an abstraction layer between the hardware and the software stack. This layer contains many drivers related to the mobile device, the main ones being: the display driver; the keyboard driver for the input device; the Flash driver based on memory technology devices; the Wi-Fi driver; the audio driver; the Bluetooth driver; the camera driver; the Linux-based frame buffer driver; and so on.
The following describes a human-computer interaction scenario related to an embodiment of the present application with reference to the drawings.
In this embodiment, as shown in fig. 4, the electronic device 100 is a desktop computer, and the electronic device 100 may include input and output devices such as a display 401, a physical keyboard 402, and a mouse 403; a user can input text through an operation 404 of tapping the keyboard.
The operating system installed on the electronic device 100 is the host operating system, and an android simulator is also installed on the electronic device 100. The android simulator is loaded with the Android operating system, i.e., the guest operating system. The user can click the application icon 405 of the android simulator displayed on the display screen 401 to open the application program; at this time, the user interface 406 of the android simulator can be displayed on the display screen 401, and what runs in the android simulator is the Android operating system.
In one example, as shown in FIG. 4, a chat application is installed and running in the android simulator. The session interface 407 of the chat application is displayed in the user interface 406 of the android simulator; meanwhile, a small icon 408 of the chat application can be displayed in the top status bar of the android simulator (which can display information such as network status, battery level, and time), indicating that the chat application is running in the foreground at this time. The user can interact with the user interface of the android simulator by manipulating the mouse pointer 410. An input field 411 is further arranged in the session interface 407 of the chat application; the input field 411 can include an input box 412 for text input, and can further include controls for inputting content such as voice, emoticons, and pictures.
In this embodiment, the Chinese input method installed in the host operating system may be a pinyin input method, which may include an input window 413 and a status bar 414. The input window 413 is used for displaying the characters input by the user and the candidate text; it is displayed when the input method is in the character-input state, and may include the input pinyin characters in the upper row and the candidate words with their corresponding numbers in the next row. Entering the number corresponding to the required candidate word, or selecting the word and pressing the space key, inputs that word into the input box 412. The status bar 414 is used to display the current input method status; for example, the small icons in the status bar 414 represent, from left to right: the identification of the pinyin input method, the Chinese/English input state, Chinese/English punctuation, emoticons, voice input, the soft keyboard, skins, and the settings menu. The status bar 414 may also be hidden. In addition, when the pinyin input method is enabled as the input method currently used by the host operating system, an identification 415 of the pinyin input method can also be displayed in the taskbar at the lower right corner of the host operating system desktop. The input window 413 and the status bar 414 of the pinyin input method shown in FIG. 4 are only an example; in other embodiments, different input method interfaces may be displayed, which is not limited in this application.
In addition, the present application is not limited to the pinyin input method, and the embodiments of the present application do not limit the types of input methods installed in the operating system, and may also be, for example, an english input method, a wubi input method, a zhuyin input method, a phonetic input method, a handwriting input method, or an input method in which a plurality of input methods are mixed.
In this embodiment, the user can use the external physical keyboard 402 to input text in the input box 412 of the android simulator, and the input method used is the pinyin input method installed in the host operating system. In some embodiments, no Android-side input method may be installed in the android simulator; in this case the android simulator can take the input method of the host operating system as its default input method, so that the user can conveniently input with the physical keyboard. In addition, the user can also install an input method and set the default input method to the input method on the host operating system side; the input method options installed on the android operating system side can be displayed among the input method setting options in the settings interface.
When the default input method of the android simulator is the pinyin input method of the host operating system, the user can use the external physical keyboard 402 to input text in the android simulator. As shown in fig. 4, an identifier 409 of a keyboard, also called first indication information, can be displayed in the status bar at the top of the android simulator, indicating that the input method on the host operating system side can be used for text input in the android simulator. When the user operates the mouse to click the input box 412, a cursor 416 appears in the input box 412, indicating that the input box 412 is in an inputtable state. The user may then input characters through the operation 404 of tapping the keyboard; in response to the user operation, the pinyin input method on the host operating system side is invoked, and the input window 413 of the pinyin input method is displayed. After the user selects a candidate word, the candidate word is input into the input box 412. The operation 404 of tapping the keyboard is not a limitation; the manner of using the input method on the host operating system side may also include clicking a soft keyboard with the mouse, voice input, and the like, which is not limited in any way in this application.
In this embodiment, the android simulator may also have its own input method installed. In the example of fig. 5, a mobile phone input method is installed in the android simulator, and the input method installed in the host operating system is a pinyin input method. The host operating system desktop displayed on the display screen 501 can display the status bar 512 of the pinyin input method, and an identification 513 of the pinyin input method can be displayed in the taskbar at the lower right corner of the desktop.
The virtual keyboard provided by the input method on the guest operating system side can include various types such as a text keyboard, a numeric keyboard, a symbol keyboard, an emoticon keyboard, a voice keyboard, and a handwriting keyboard. The text keyboard can further include a Chinese keyboard, an English keyboard, and the like, and the emoticon keyboard can further include an emoji keyboard, a kaomoji keyboard, a sticker keyboard, and the like.
When the default input method of the android simulator is the mobile phone input method in the Android operating system, the user can input text in the android simulator by using the external physical keyboard 502, and can also input text through the operation 504 of controlling the mouse 503. In one example, as shown in fig. 5, the chat application running in the android simulator is used; at the bottom of its conversation interface 505 is an input box 508 that can be used for text input. The user operates the mouse pointer to click the input box 508; a cursor 509 appears in the input box 508, and the virtual keyboard 510 of the mobile phone input method is displayed, indicating that the input box 508 is in an inputtable state at this time. At this time, the identifier 506 of the keyboard (also called first indication information) and the identifier 507 of the mobile phone input method (also called second indication information) can be displayed in the top status bar of the android simulator, indicating that in the current input state the android simulator can use the input method on the host operating system side for text input, and can also use its own installed mobile phone input method for text input.
In some embodiments, the electronic device may cancel displaying the second indication information in the second user interface when a second user operation to exit the input state is detected in the second user interface. The second user action may include one or more of: clicking a position out of the input window in the second user interface, clicking a position out of the second user interface in the first user interface, and exiting the second user interface.
In the example of fig. 5, the user manipulates the mouse pointer to click the input box 508; a cursor 509 appears in the input box 508, and the virtual keyboard 510 of the mobile phone input method is displayed, indicating that the mobile phone input method in the guest operating system has been invoked. The user may then input characters by clicking the virtual keys indicated in the virtual keyboard 510; at this time, the input window 511 of the mobile phone input method is displayed, which is used for displaying the characters input by the user and the candidate text, for example, the input characters in the upper row and the candidate text in the next row. Clicking the position of the required candidate text with the mouse, or selecting the required candidate text with the keyboard and pressing the space bar, inputs that word into the input box 508.
In the embodiment of the present application, when the user uses the input method on the guest operating system side, for example the mobile phone input method, to perform input, a background process on the guest operating system side remains resident and can accept input information from the host operating system side at any time, for example, obtaining the text of the pinyin input method on the host operating system side. That is, the input box 508 can simultaneously obtain the input information from the input method on the host operating system side and the input information from the input method on the guest operating system side, and the input methods on the two sides do not need to be switched. This improves the input efficiency and the stability of the input method service state, provides a friendly input operating environment for the user, and improves the user experience.
In this embodiment, as shown in fig. 6A, when the mobile phone input method is installed and set as the default input method in the android simulator, the user can still input text in the input box 608 using the external physical keyboard 604, without changing the default input method to the input method on the host operating system side. In one example, as shown in fig. 6A, the host operating system desktop displayed on the display screen 601 can display the status bar 614 of the pinyin input method, and an identification 615 of the pinyin input method can be displayed in the taskbar at the lower right corner of the desktop. The chat application runs in the android simulator; at the bottom of its session interface 605 there is an input box 608, which may be used for text input. The user operates the mouse 603 to click the input box 608; a cursor 609 appears in the input box 608, and the virtual keyboard 610 of the mobile phone input method is displayed, indicating that the input box 608 is in an inputtable state at this time. At this time, the keyboard identifier 606 (also called first indication information) and the identifier 607 of the mobile phone input method (also called second indication information) can be displayed in the top status bar of the android simulator, indicating that in the current input state the android simulator can use the input method on the host operating system side for text input, and can also use its own installed mobile phone input method for text input.
The status bar 611 of the mobile phone input method may be displayed at the upper portion of the virtual keyboard 610. The small icons in the status bar 611 in fig. 6A represent, from left to right: the identification of the mobile phone input method, emoticons, voice input, keyboard type selection, the toolkit, search, and collapsing the virtual keyboard.
As shown in fig. 6A, upon detecting the operation 604 of the user tapping the keyboard, the android simulator invokes the pinyin input method on the host operating system side; at this time, the input window 612 of the pinyin input method is displayed. The input window 612 is used to display the characters input by the user through the physical keyboard and the candidate text, and may include, for example, the input pinyin characters in the upper row and the candidate words in the next row. Entering the number corresponding to the required candidate word, or selecting the word and pressing the space key, inputs that word into the input box 608. It should be noted that, when the user uses the input method on the host operating system side to perform input, the background process of the input method on the guest operating system side may remain resident and may accept the user's input operation at any time.
When the android simulator detects that the user clicks the collapse-virtual-keyboard icon 613, the virtual keyboard 610 is collapsed; for the user interface, refer to fig. 6B. In another embodiment, the android simulator can automatically collapse the virtual keyboard 610 when it detects that the user is using the physical keyboard for input, so as to enlarge the visible user interface and avoid the unneeded keyboard blocking the session interface. Alternatively, the developer may set the virtual keyboard 610 not to be displayed automatically, so that the user needs to actively choose to display it. The trigger event for collapsing the virtual keyboard 610 is not a limitation of this application.
In the embodiment shown in fig. 6B, the virtual keyboard 610 of the mobile phone input method is hidden. The user can perform input operations by tapping the keyboard 604, and the android simulator can obtain, through a daemon process, the text of the pinyin input method on the host operating system side; at this time the pinyin input method on the host operating system side is the input source for the input window 612. When the user taps the keyboard, the input window 612 of the pinyin input method is displayed on the display screen 601. The input window 612 is used to display the characters input by the user through the keyboard and the candidate text, and may include, for example, the input pinyin characters in the upper row and the candidate words in the next row. Entering the number corresponding to the required candidate word, or selecting the word and pressing the space key, inputs that word into the input box 608.
As shown in fig. 6B, the keyboard identifier 606 (also called first indication information) and the identifier 607 of the mobile phone input method (also called second indication information) can be displayed in the top status bar of the android simulator, indicating that in the current input state the android simulator can use the input method on the host operating system side for text input and can also use its own installed mobile phone input method for text input, and that the mobile phone input method and the daemon process for obtaining the input information from the host operating system side are currently both in an active state.
In addition, considering the situation in which the input method on the host side and the input method on the guest side input simultaneously, and whether the text input by the host-side input method is displayed, the text input by the guest-side input method is displayed, neither is displayed, or both are displayed in sequence, a developer can set the priority with which the input methods, input processes, and the like occupy the input channel, so as to prevent input confusion; this is not limited in this application.
In another embodiment, as shown in fig. 7, the electronic device 100 is a desktop computer, and the electronic device 100 may include input and output devices such as a display screen 701, a physical keyboard 702, and a mouse 703; a user can input text through an operation 704 of tapping the keyboard.
In this embodiment, the host operating system installed on the electronic device 100 is an operating system with Linux as its kernel, and a menu control 705 of the host operating system can be displayed at the lower left corner of the desktop displayed on the display screen 701. The electronic device 100 can also be provided with an android container, and the Android operating system, i.e., the guest operating system, can run in the android container; Android application programs can be installed in the Android operating system. In this embodiment, the Android version of the chat application is taken as an example to explain the content of this embodiment. The user can click the icon 706 of the Android version of the chat application displayed on the display screen 701 to open the application program; at this time, the user interface 708 of the Android version of the chat application is displayed on the display screen 701.
The application is a chat application. As shown in fig. 7, the user interface 708 of the Android version displays the title bar 707 of the Android version of the chat application and the session interface 709 of the chat application, and the user can interact with its user interface by manipulating the mouse pointer 710. An input field 711 is also arranged in the session interface 709 of the chat application; the input field 711 may include an input box 712 for text input, and may also include controls for inputting content such as voice, emoticons, and pictures.
In this embodiment, the Chinese input method installed in the host operating system may be a pinyin input method, and the input method interface may include an input window 714 and a status bar 715. The input window 714 is used for displaying the characters input by the user and the candidate text; when the input method is in the character-input state, it may include the input pinyin characters in the upper row and the candidate words in the next row. Entering the number corresponding to the required candidate word, or selecting the word and pressing the space key, inputs that word into the input box 712. The status bar 715 is used to display the current input method status; for example, the small icons in the status bar 715 represent, from left to right: the identification of the pinyin input method, the Chinese/English input state, Chinese/English punctuation, emoticons, voice input, the soft keyboard, skins, and the settings menu. The status bar 715 may also be hidden. In addition, when the pinyin input method is enabled as the input method currently used by the host operating system, an identification 716 of the pinyin input method can also be displayed in the taskbar at the lower right corner of the host operating system desktop. The input window 714 and the status bar 715 of the pinyin input method shown in fig. 7 are only an example; in other embodiments, different input method interfaces may be displayed, which is not limited in this application.
In addition, the present application is not limited to the pinyin input method, and the embodiments of the present application do not limit the types of input methods installed in the operating system, and may also be, for example, an english input method, a wubi input method, a zhuyin input method, a phonetic input method, a handwriting input method, or an input method in which a plurality of input methods are mixed.
In this embodiment, the user can use the external physical keyboard 702 to input text in the input box 712, and the input method used is the pinyin input method installed in the host operating system. In some embodiments, no input method on the guest operating system side may be installed; in this case the Android version of the chat application can take the input method of the host operating system as the default input method, so that the user can conveniently input with the physical keyboard. In addition, the user can also install an input method and set the default input method to the input method on the host operating system side; the input method options installed on the android operating system side can be displayed among the input method setting options in the settings interface.
When the default input method of the Android version of the chat application is the pinyin input method of the host operating system, the user can use the external physical keyboard 702 to input text in the Android application. The user operates the mouse to click the input box 712, and a blinking cursor 713 appears in the input box 712, indicating that the input box 712 is in an inputtable state. The user may then input characters through the operation 704 of tapping the keyboard; in response to the user operation, the pinyin input method on the host operating system side is invoked, and the input window 714 of the pinyin input method is displayed. After the user selects a candidate word, the candidate word is input into the input box 712. The operation 704 of tapping the keyboard is not a limitation; the manner of using the input method on the host operating system side may also include clicking a soft keyboard with the mouse, voice input, and the like, which is not limited in any way in this application.
For the case of using the Android-side input method in the Android version of the chat application in the Android operating system, reference may be made to the embodiment illustrated in fig. 5 above; for the scenario in which the input methods on the host operating system side and the guest operating system side perform input simultaneously, reference may be made to the embodiments illustrated in fig. 6A and fig. 6B, which are not described again here.
In this embodiment, one or more android simulators or android containers may be deployed on the electronic device 100 for running the Android operating system, which is not limited in this embodiment.
Based on the foregoing embodiments, an input implementation process according to an embodiment of the present application is described with reference to fig. 8.
Referring to fig. 8, when inputting content into an input window of the guest operating system using the input method of the host operating system, the following 9 steps may be involved. In this embodiment, the description is given by taking the Android operating system as the guest operating system and the WeChat application as the first application program as an example. It is understood that this embodiment is only an example, and other embodiments may include more or fewer steps according to the specific situation, which is not limited in any way in this application.
Step 1: the user clicks a chat input box in the WeChat application on the Android side.
Step two: host operating system to
Figure BDA0002517600290000183
Side-sending trigger clicking the chat input boxThe message of concern is a message of concern,
Figure BDA0002517600290000184
the side Window Manager service will update the View (View) of the current focus point according to the message
Figure BDA0002517600290000185
The chat input box.
Step three: the WindowManager service sends a message to the Input Method Manager, which triggers the recording of the Input Channel (Input Channel) of the View currently in focus in the Input Method Manager.
Step IV: user input method through host system
Figure BDA0002517600290000186
Text is entered in an input box of the side WeChat application.
Step 5: the window instance on the host system side that corresponds to the WeChat application interface on the Android side first receives the input text from the host system input method. An instance may include one or more processes.
Step (c): the window instance in the previous step (v) sends the input text of the host side to the host side through a high-performance communication mechanism (such as Socket, Binder, etc.)
Figure BDA0002517600290000188
Input Daemon (Input Daemon). The daemon is served by a non-Android input method, so that other input method programs can be installed on the Android side, and the other input method programs on the Android side can be set as default input methods. In other embodiments, the input daemon for obtaining the input text from the host operating system may also be other types of processes or programs, such as a background process, or other application programs, which is not limited in this application.
Step (c): when the Input Daemon receives the Input text transmitted by the host side, the Input Daemon requests the Input MethodManager service to acquire the Input channel of the View of the current focus.
Step (v): the Input Daemon acquires the InputChannel of View of the current focus.
Step ninthly: the Input Daemon passes Input text from the host system Input method to the currently focused View through the Input channel.
It may be noted that, in the whole process of using the host system input method to input into the View of the application on the Android side, there is no need to modify the default input method on the Android side, and there is no need to register any related input method service with the Android system.
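A hedged sketch of the Input Daemon in steps 6 to 9 follows; the class, socket name, and delivery helper are assumptions, and the real delivery path is platform-internal. It listens on a local socket for text produced by the host-side input method and hands the text on toward the currently focused View.

```java
import android.net.LocalServerSocket;
import android.net.LocalSocket;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public final class InputDaemon {
    public void run() throws IOException {
        // Assumed socket name; the host-side window instance connects to it (step 6).
        LocalServerSocket server = new LocalServerSocket("host_input_channel");
        while (true) {
            try (LocalSocket host = server.accept()) {
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(host.getInputStream(), StandardCharsets.UTF_8));
                String committedText;
                while ((committedText = in.readLine()) != null) {
                    // Steps 7-9: request the input channel of the currently focused View
                    // from the input method manager service and commit the text through it.
                    deliverToFocusedView(committedText);
                }
            }
        }
    }

    private void deliverToFocusedView(String text) {
        // Platform-internal in the patent's design; omitted in this sketch.
    }
}
```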
An input method provided by the present application is described below with reference to the above embodiments and the accompanying drawings.
Referring to fig. 9A and 9B, fig. 9A and 9B are schematic flow charts of an input method provided in an embodiment of the present application. In this embodiment, the electronic device 100 runs a host operating system and a guest operating system. In some embodiments, the guest operating system may be the Android operating system and the host operating system may be the desktop operating system described in the embodiments of fig. 4, fig. 5, fig. 6A, and fig. 6B above. In other embodiments, the host operating system may also be an operating system with Linux as its kernel and the guest operating system may be the Android operating system, with reference to the embodiment described in fig. 7 above. A first process runs on the host operating system of the electronic device, and the first process is used for loading an image of the guest operating system and running the guest operating system. In other embodiments, the host operating system and the guest operating system may be other types of operating systems, which is not limited in this application.
A connection may be established between the host operating system and the guest operating system, enabling communication between them. The essence of the connection is that the communicating parties use a commonly agreed mechanism for data transfer. The method for establishing the connection between the host operating system and the guest operating system is not limited in any way in this application. Optionally, in an implementation of this embodiment, the host operating system and the guest operating system may communicate through a socket mechanism. A socket is a common means of communication between application processes; it is an abstraction layer between the application layer and the transport layer that abstracts the complex operations of the TCP/IP layer into a few simple interfaces for the application layer to call, so as to facilitate communication between application processes. The host operating system and the guest operating system may also communicate with each other through other high-performance communication mechanisms, such as a pipe mechanism or a binder mechanism, which is not limited in this application.
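A minimal sketch (an assumption, not the patent's implementation) of such a socket connection between the two operating systems: one side writes the text produced by its input method and the other side reads it. The port number and line-based framing are illustrative only.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public final class CrossSystemSocketDemo {
    // Guest side: accept a connection and read one line of committed text.
    static String receiveText(int port) throws IOException {
        try (ServerSocket server = new ServerSocket(port);
             Socket conn = server.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            return in.readLine();   // handed on to the focused input window
        }
    }

    // Host side: connect and send the text committed by the host-side input method.
    static void sendText(int port, String text) throws IOException {
        try (Socket socket = new Socket("127.0.0.1", port);
             Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
            out.write(text + "\n");
            out.flush();
        }
    }
}
```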
In this embodiment, there may be two cases of inputting in the first input window of the guest operating system, case 1: a user inputs by using a first input method on the host operating system side; case 2: and the user uses the second input method on the client operating system side to input.
The following describes the implementation of case 1 and case 2, respectively.
Case 1(S101 to S113): and the user inputs by using the first input method on the host operating system side.
As shown in fig. 9A, case 1 of the method may specifically include:
s101, the host operating system displays a first user interface of the host operating system, a second user interface of a first application program of the client operating system is displayed in the first user interface, and the second user interface comprises a first input window.
In the guest operating system, there may be a case where the user interfaces of a plurality of application programs are displayed in parallel, and this embodiment is not limited.
The first input window refers to a window which can acquire an input focus and accept, contain and edit input content, and specifically may be an input box, a text box, a picture box, an address bar, a search box, an editable page (e.g., a notepad and a word page), a table which can contain input content (e.g., an excel table), and the like. It should be noted that the above description only illustrates the first input window, and is not exhaustive.
In connection with the embodiment shown in fig. 7, a host operating system runs in the electronic device 100, and the first user interface of the host operating system may be, for example, the desktop of the host operating system. The Android version of the chat application, i.e., the first application program of the guest operating system, is installed; the second user interface of the first application may be, for example, the user interface 708 of the Android version of the chat application. The second user interface includes a first input window, which may be, for example, the chat input box 712 in the conversation interface.
S102, the host operating system detects that a mouse click event occurs in the second user interface.
In one example, the user may use a mouse to position the mouse pointer at the location of the first input window in the second user interface of the guest operating system and click, at which time the host operating system can detect that a mouse click event has occurred in the second user interface of the guest operating system. Not limited to mouse control, the user can also click or select a position through user operations such as touch pad control, physical keyboard control, or voice control.
S103, the host operating system sends a mouse click event and a first position of a mouse pointer to the client operating system.
In one example, when the electronic device 100 detects that the first location in the second user interface is clicked, the host operating system may send a mouse click event and a message carrying the first location of the mouse pointer to the guest operating system. The message may be used to notify the window management service (WindowManager) of the Android operating system to update the input window where its current input focus is located.
And S104, the client operating system determines that the first input window in the second user interface is clicked according to the first position.
In connection with the embodiment shown in fig. 7, the Android version of the chat application can determine, according to the coordinates of the first position, that the chat input box 712 in the conversation interface has been clicked. The click operation is also referred to as the first user operation. The first user operation may be the user clicking the position of the first input window with a mouse, or clicking the position of the first input window with a finger on a touch screen or touch pad, or a voice instruction operation or an air gesture operation; the operation mode of the first user operation is not limited in any way in this application.
S105, the client operating system displays the input focus in the first input window.
After the first input window is clicked, it may acquire the input focus and display it, for example as a blinking cursor. When the input focus is displayed in the first input window, this may indicate that the first input window is currently in an inputtable state.
In conjunction with the embodiment shown in fig. 7, for example, when the user operates the mouse to click the input box 712, a blinking cursor 713 appears in the input box 712, indicating that the input box 712 is in an inputtable state.
S106, the client operating system acquires an input channel corresponding to the first input window.
The input channel is an input interface between an input method used by a user on the client operating system side and the first application program, and can be used for transmitting an input object (such as text) acquired by the input method to the first application program.
In one example, upon detecting that the first input window is clicked, the window management service of the android operating system may send a message to an Input Method Management Service (IMMS) for registering an input channel of the first input window currently obtaining an input focus in the input method management service, that is, the first input window is registered as a target window of the input channel. The input channel is a communication channel between the input method on the client operating system side and the first application program, and an input object (such as text) acquired by the input method can be transferred to the first application program.
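In the Android framework, the role the embodiment gives to the input channel is close to what an InputConnection provides for a focused editor: when a view gains focus, it hands the input method a connection through onCreateInputConnection. The sketch below works under that assumption; the patent does not state that the channel is literally an InputConnection.

import android.content.Context;
import android.text.Editable;
import android.text.InputType;
import android.text.SpannableStringBuilder;
import android.view.View;
import android.view.inputmethod.BaseInputConnection;
import android.view.inputmethod.EditorInfo;
import android.view.inputmethod.InputConnection;

// Sketch of an editable view that hands the input method a connection when focused;
// the connection plays the role the embodiment assigns to the registered input channel.
public class SimpleInputWindowView extends View {
    private final Editable content = new SpannableStringBuilder();

    public SimpleInputWindowView(Context context) {
        super(context);
        setFocusable(true);
        setFocusableInTouchMode(true);
    }

    @Override
    public InputConnection onCreateInputConnection(EditorInfo outAttrs) {
        outAttrs.inputType = InputType.TYPE_CLASS_TEXT;
        return new BaseInputConnection(this, /* fullEditor= */ true) {
            @Override
            public Editable getEditable() {
                return content;  // text committed by the input method lands here
            }
        };
    }
}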
S107, the host operating system detects that a key (such as the key Q) in the physical keyboard on the host operating system side is struck.
In the present embodiment, the user may perform an input operation by striking a key in a physical keyboard of the electronic device 100.
In connection with the embodiment shown in FIG. 7, a user may perform an input operation by, for example, operation 704 of tapping physical keyboard 702.
Specifically, the user may manipulate an input device of the electronic device 100 to input text and other content into the first input window on the guest operating system side. It will be readily appreciated that the electronic device 100 has an input device: for a personal computer, the conventional input device is a keyboard; for a mobile phone, the input device is its touch screen. In addition, a microphone, a scanner, a camera, or the like may also be used as an input device of the electronic device 100.
For the electronic device 100, the input device may be self-contained or external. For example, when the electronic device 100 is a notebook computer, the input device may be a keyboard, a camera, a microphone, etc. of the electronic device, or may be an external keyboard, a camera, a microphone, etc.
In some embodiments, the input device may be a physical keyboard. The user may generate a corresponding operation command by tapping or pressing a character key on the keyboard, so that the electronic device 100 may generate text content according to the operation command to obtain the content to be displayed.
In some embodiments, the input device may be a touch screen or a touch pad. The user can click the character keys represented on the virtual keyboard in the touch screen to generate corresponding operation commands, so that the text content to be displayed is obtained.
In some embodiments, the input device may be a microphone. A user may input speech to the electronic device 100 through a microphone. The electronic device 100 converts the user input speech into text, resulting in input content.
In some embodiments, the input device may be a camera. The user can input content to the electronic device 100 by taking pictures with the camera. The electronic device 100 may extract text from the picture taken by the camera to obtain the input content; specifically, Optical Character Recognition (OCR) technology may be used to extract the text. The electronic device 100 may also extract an image from the picture taken by the camera and use it as the input content.
In some embodiments, the input device may be a scanner. A user may input content to the electronic device 100 through a scanner. The electronic device 100 may extract text from a scanned picture input through the scanner, resulting in content to be displayed. Specifically, OCR technology may be used for text extraction. The electronic device 100 may also extract an image from the scanned picture to obtain the content to be displayed.
In some embodiments, the input device may be a tablet or stylus, and the text content or images sent to the electronic device 100 are text or images entered via the tablet or stylus.
S108, the host operating system displays an interface of the first input method on the host operating system side (optional).
When the host operating system detects that the physical keyboard is struck by the user, the input method application manager of the host operating system provides an input method management service, and this service can invoke the first input method selected by the user, or selected by the system by default, on the host operating system side.
Optionally, when the host operating system detects that a key in the physical keyboard is struck, the input method application runs in the foreground, and at this time the interface of the first input method may be displayed in the first user interface. For example, in connection with the embodiment shown in FIG. 7, the first input method installed in the host operating system may be a Pinyin input method, and the interface of the Pinyin input method may include an input window 714 and a status bar 715. The input window 714 is used to display the user's input characters and candidate texts: when the input method is in the character-input state, the entered Pinyin characters are shown in the upper row and the candidate words in the row below; typing the number corresponding to the desired candidate, or selecting the candidate and then pressing the space key, inputs the word into the input box 712. The status bar 715 is used to display the current input method status; for example, the small icons shown in the status bar 715 represent, from left to right: the identification of the Pinyin input method, the Chinese/English input state, Chinese/English punctuation, emoticons, voice input, the soft keyboard, skins, and the settings menu. The status bar 715 may also be hidden. In addition, when the Pinyin input method is started as the input method currently used by the host operating system, the identification 716 of the input method may also be displayed in the taskbar at the lower right corner of the desktop.
It should be noted that, when the user uses the first input method on the host operating system side for input, the background process of the second input method on the guest operating system side remains resident and can accept the user's input operation at any time.
Optionally, when the host operating system detects that a key in the physical keyboard is struck, the interface of the first input method may not be displayed in the first user interface. For example, when the physical keyboard is used to input an English character, the interface of the first input method may not be displayed: the first input method may directly convert the "Q" represented by the key in the physical keyboard into the English character "Q", without additionally displaying an interface of the first input method.
S109, the host operating system acquires, through the first input method on the host operating system side, a first input object (such as the English character "Q") generated by the struck key (such as the key Q).
The first input method may convert a struck key in the physical keyboard into a corresponding first input object; for example, when the key showing "Q" in the physical keyboard is struck, the first input method may convert it into the English character "Q".
The first input object may specifically be a text, a picture, an expression, and the like, which is not limited in this embodiment.
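A hedged sketch of the conversion in S109, assuming the host side receives a key identifier and produces a text object. The key table and identifiers are illustrative and not tied to any particular host operating system API.

import java.util.HashMap;
import java.util.Map;

// Illustrative conversion of a struck physical key into a first input object (text).
final class FirstInputMethodSketch {
    // Hypothetical mapping from key identifiers to characters; a real input method
    // would also consult modifier state, keyboard layout, and composition rules.
    private static final Map<String, String> KEY_TABLE = new HashMap<>();
    static {
        KEY_TABLE.put("KEY_Q", "Q");
        KEY_TABLE.put("KEY_W", "W");
        KEY_TABLE.put("KEY_SPACE", " ");
    }

    /** Returns the text produced by the given key, or an empty string if unmapped. */
    static String toInputObject(String keyId) {
        return KEY_TABLE.getOrDefault(keyId, "");
    }
}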
S110, the client operating system obtains a first input object (such as English character 'Q') from the host operating system through the input daemon process.
The input daemon on the guest operating system side can remain resident at all times and can acquire the first input object from the host operating system at any time.
In other embodiments, the input daemon for obtaining the first input object from the host operating system may also be other types of processes or programs, such as a background process, or other application programs, which is not limited in this application.
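The steps describe the input daemon only abstractly. As one possible sketch, the resident loop below blocks on a local socket for input objects forwarded from the host side; the port, framing, and delivery callback are assumptions for illustration.

import java.io.DataInputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.function.Consumer;

// Illustrative resident input daemon on the guest operating system side.
// It blocks waiting for input objects forwarded by the host operating system.
final class InputDaemonSketch implements Runnable {
    private final int port;                  // hypothetical local port
    private final Consumer<String> deliver;  // hands the object on toward the input channel

    InputDaemonSketch(int port, Consumer<String> deliver) {
        this.port = port;
        this.deliver = deliver;
    }

    @Override
    public void run() {
        try (ServerSocket server = new ServerSocket(port)) {
            while (true) {
                try (Socket host = server.accept();
                     DataInputStream in = new DataInputStream(host.getInputStream())) {
                    String inputObject = in.readUTF();  // e.g. the English character "Q"
                    deliver.accept(inputObject);        // S112: pass it on via the input channel
                }
            }
        } catch (Exception e) {
            // A real daemon would log and restart; the sketch simply stops.
            e.printStackTrace();
        }
    }
}

It could be started once at guest boot, for example with new Thread(new InputDaemonSketch(6002, obj -> { /* hand to the input channel */ })).start().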
S111, the client operating system confirms whether the input channel is in an available state (optional).
Optionally, after acquiring the first input object from the host operating system, the input daemon of the guest operating system may query whether the input channel is currently in an available state. If the result of the query is that the input channel is currently available, step S112 is performed.
In some embodiments, an input channel may be determined to be available for a first input method if the input channel is not occupied by a second input method on the guest operating system side.
In some embodiments, an input channel may be determined to be available for a first input method if the input channel is occupied by a second input method on the guest operating system side, but the first input method takes a first input object with a higher priority than the second input method takes a second input object.
In some embodiments, the input channel may also be in an unavailable state, and there are various situations in which this can happen. For example, the input channel may currently be set not to accept text from the input method on the host operating system side (e.g., keyboard input from the host side is currently disabled), or the input channel may currently be occupied (e.g., an input method on the guest operating system side is inputting and occupying the input channel), and so on; this is not limited.
If the result of the query is that the input channel is currently in the unavailable state, there may be multiple ways to proceed: for example, the current input operation may be cancelled, or the first input object may be temporarily stored and the input retried when the input channel is detected to be available again, and so on.
In some embodiments, when the first input method occupies the input channel, if it is not detected that the first input method acquires the first input object within the first time, the electronic device may cancel the first input method from occupying the input channel. The first time may be, for example, 5 minutes.
In some embodiments, when the first input method occupies the input channel but does not pass the first input object, the electronic device may cancel the first input method from occupying the input channel upon detecting that the second input method obtains the second input object.
The above cases are merely examples of the "occupancy policy" for the input channel, and do not limit the embodiments of the present application.
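To make the occupancy policy concrete, the sketch below combines the non-occupied case, the priority case, and the idle-timeout case described above into one availability check. The class, the priority values, and the wiring of the 5-minute first time are illustrative assumptions.

// Illustrative occupancy policy for the input channel described in S111.
final class InputChannelArbiter {
    private String owner;            // "host" (first input method), "guest" (second), or null
    private int ownerPriority;       // higher value wins
    private long lastActivityMillis; // when the current owner last delivered an object

    private static final long IDLE_TIMEOUT_MILLIS = 5 * 60 * 1000; // the "first time", e.g. 5 minutes

    /** Returns true if the channel can be used by the requester with the given priority. */
    synchronized boolean tryAcquire(String requester, int priority, long nowMillis) {
        boolean idleTooLong = owner != null
                && nowMillis - lastActivityMillis > IDLE_TIMEOUT_MILLIS;
        if (owner == null || owner.equals(requester) || idleTooLong || priority > ownerPriority) {
            owner = requester;
            ownerPriority = priority;
            lastActivityMillis = nowMillis;
            return true;   // channel is available for this input method
        }
        return false;       // channel is occupied by the other input method
    }

    /** Called whenever the current owner actually passes an input object. */
    synchronized void noteActivity(long nowMillis) {
        lastActivityMillis = nowMillis;
    }
}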
S112, the client operating system transmits the first input object (such as English character "Q") acquired by the input daemon to the first input window by using the input channel.
The input daemon of the guest operating system obtains the right to use the input channel and sends the first input object to the input channel; the first input object is transmitted to the first application program through the input channel, and the first application program passes the first input object to the first input window that currently holds the input focus.
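If the input channel were backed by Android's InputConnection (an assumption, as above), the delivery in S112 could reduce to a single commitText call:

import android.view.inputmethod.InputConnection;

// Illustrative delivery of the first input object into the focused input window,
// assuming the registered input channel is backed by an Android InputConnection.
final class InputChannelDelivery {
    static boolean deliver(InputConnection channel, CharSequence inputObject) {
        if (channel == null) {
            return false; // no input window currently holds the focus
        }
        // commitText inserts the text at the cursor; the second argument places
        // the cursor after the committed text.
        return channel.commitText(inputObject, 1);
    }
}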
S113, the guest operating system displays a first input object (e.g., english character "Q") in a first input window of the first application.
Case 2(S201 to S213): and the user uses the second input method on the client operating system side to input.
As shown in fig. 9B, case 2 of the method may specifically include:
s201, the host operating system displays a first user interface of the host operating system, a second user interface of a first application program of the client operating system is displayed in the first user interface, and the second user interface comprises a first input window.
Refer to the aforementioned step S101.
S202, the host operating system detects that a mouse click event occurs in the second user interface.
Refer to step S102 described above.
S203, the host operating system sends a mouse click event and a first position of a mouse pointer to the client operating system.
Refer to the aforementioned step S103.
S204, the client operating system determines that the first input window in the second user interface is clicked according to the first position.
Refer to the aforementioned step S104.
S205, the client operating system displays the input focus in the first input window.
Refer to the aforementioned step S105.
S206, the client operating system acquires an input channel corresponding to the first input window.
Refer to the aforementioned step S106.
S207, the guest operating system displays a virtual input keyboard (optional) of the second input method on the guest operating system side.
Optionally, after the user clicks the first input window, the input method application manager of the Android operating system provides an Input Method Management Service (IMMS), which may invoke the second input method selected by the user, or selected by the system by default, on the Android operating system side. After the input method management service of the Android operating system calls up the second input method, it pulls up the virtual input keyboard of the second input method for display in the second user interface.
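As a hedged illustration of pulling up the second input method's keyboard on the guest side, the helper below uses the standard Android InputMethodManager; whether the embodiment's input method management service maps onto this framework service exactly is an assumption.

import android.content.Context;
import android.view.View;
import android.view.inputmethod.InputMethodManager;

// Illustrative helper: request the soft keyboard for the view backing the first input window.
final class SoftKeyboardHelper {
    static void showFor(View inputWindowView) {
        // The view must be focused for the request to take effect.
        inputWindowView.requestFocus();
        InputMethodManager imm = (InputMethodManager)
                inputWindowView.getContext().getSystemService(Context.INPUT_METHOD_SERVICE);
        if (imm != null) {
            imm.showSoftInput(inputWindowView, InputMethodManager.SHOW_IMPLICIT);
        }
    }
}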
In connection with the embodiment shown in FIG. 5, for example, when the user operates the mouse pointer to click the input box 508, a cursor 509 appears in the input box 508 and the virtual keyboard 510 of the mobile-phone input method on the guest operating system side is displayed, indicating that the mobile-phone input method in the guest operating system has been invoked. The user may then enter a character by clicking a virtual key shown in the virtual keyboard 510; at this point, the character is displayed in the input window 511 of the mobile-phone input method. The input window 511 is used to display the user's input characters and candidate texts, for example with the input characters in the upper row and the candidate texts in the row below; clicking the position of the desired candidate text with the mouse, or selecting it with the keyboard and pressing the space bar, inputs the word into the input box 508.
Optionally, the virtual input keyboard of the second input method may not be displayed, for example, the user may select to hide the virtual input keyboard of the second input method, or the virtual input keyboard may automatically collapse when the user is detected to use the physical keyboard on the host side for input.
The virtual keyboards provided on the guest operating system side may include a text keyboard, a numeric keyboard, a symbol keyboard, an emoticon keyboard, a voice keyboard, a handwriting keyboard, and the like. The text keyboard may include a Chinese keyboard, an English keyboard, and various other keyboards; the Chinese keyboard may include a Pinyin keyboard, a Wubi (five-stroke) keyboard, a Zhuyin keyboard, and the like; the Chinese Pinyin keyboard may include a nine-grid Pinyin keyboard, a full-keyboard Pinyin keyboard, and the like; the emoticon keyboard may include an emoji keyboard, a kaomoji keyboard, a sticker keyboard, and various other keyboards. This application does not limit the keyboard types, and keys may be arranged in a virtual keyboard to trigger switching among the different keyboard types.
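As a hedged sketch of the keyboard-switching keys mentioned above, the small class below shows one way a virtual keyboard could switch among keyboard types when such a key is tapped; the type names and switching rule are illustrative assumptions.

// Illustrative keyboard-type switching for the guest-side virtual keyboard.
enum KeyboardType { PINYIN_FULL, PINYIN_NINE_GRID, ENGLISH, NUMERIC, SYMBOL, EMOJI, HANDWRITING }

final class KeyboardSwitcher {
    private KeyboardType current = KeyboardType.PINYIN_FULL;

    /** Called when the user taps a "switch keyboard" key carrying a target type. */
    KeyboardType switchTo(KeyboardType target) {
        current = target;
        return current;  // the input method would now show the layout for this type
    }

    KeyboardType current() {
        return current;
    }
}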
S208, the host operating system detects that a mouse click event occurs in the second user interface.
In one example, the user may use a mouse to position the mouse pointer at the location of the virtual input keyboard in the second user interface of the guest operating system and click; at this time, the host operating system may detect that a mouse click event has occurred in the second user interface of the guest operating system. Without being limited to mouse control, the user may also click a position by means of touch pad control, physical keyboard control, voice control, or other user operations.
S209, the host operating system sends the mouse click event and the second position of the mouse pointer to the client operating system.
In one example, when the electronic device 100 detects that a second location in the second user interface is clicked, the host operating system may send a mouse click event and the second location of the mouse pointer to the guest operating system.
S210, the guest operating system determines that a key (e.g. key "S") in the virtual input keyboard is clicked according to the second position.
In one example, a second input method in the guest operating system may determine from the coordinates of the second location that a first key (e.g., key "S") in a virtual input keyboard of the second input method was clicked.
S211, the client operating system acquires, through the second input method on the guest operating system side, a second input object (such as the English character "S") generated by the clicked key (such as the key "S").
The second input method may convert the clicked key in the virtual input keyboard into a corresponding second input object; for example, when the key showing "S" in the virtual input keyboard is clicked, the second input method may convert it into the English character "S".
The second input object may specifically be a text, a picture, an expression, and the like, which is not limited in this embodiment.
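On Android, the second input method would typically be an InputMethodService that commits the character of the tapped virtual key through the current input connection. A minimal sketch under that assumption, with the key-to-character wiring simplified:

import android.inputmethodservice.InputMethodService;
import android.view.inputmethod.InputConnection;

// Sketch of the guest-side (second) input method committing the character
// for a tapped virtual key through the current input connection.
public class GuestImeSketch extends InputMethodService {

    // Hypothetical callback invoked by the IME's keyboard view when a soft key is tapped.
    void onSoftKeyTapped(char keyChar) {
        InputConnection ic = getCurrentInputConnection();
        if (ic != null) {
            // S211/S212: the second input object (e.g. "S") is passed into the
            // first input window via the same input channel.
            ic.commitText(String.valueOf(keyChar), 1);
        }
    }
}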
S212, the client operating system transfers a second input object (such as English character 'S') obtained by the second input method to the first input window by using the input channel.
The second input method of the guest operating system sends the acquired second input object to the input channel; the second input object is transmitted to the first application program through the input channel, and the first application program passes the second input object to the first input window that currently holds the input focus.
S213, the guest operating system displays a second input object (e.g., English character "S") in the first input window of the first application.
By implementing this method embodiment, the user can use the input method on the host operating system side to input text into the guest operating system. The input window that holds the input focus in the guest operating system can obtain input both from the input method on the host operating system side and from the input method on the guest operating system side, without the input methods on the two sides having to be configured or switched. This improves input efficiency and the stability of the input method service state, provides a friendly input operating environment for the user, and improves the user experience.
The above-mentioned embodiments, objects, technical solutions and advantages of the present invention are further described in detail, it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made on the basis of the technical solutions of the present invention should be included in the scope of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (35)

1. An input method applied to an electronic device, on which a host operating system and a guest operating system run, the method comprising:
the electronic equipment displays a first user interface of the host operating system, wherein a second user interface of a first application program is displayed in the first user interface, the first application program is loaded in the client operating system, and the second user interface comprises a first input window;
the electronic equipment detects a first user operation in the first input window;
the electronic device displays an input focus in the first input window;
the electronic equipment acquires an input channel corresponding to the first input window;
the electronic equipment acquires a first input object through a first input method on the host operating system side;
an input daemon process in the guest operating system acquires the first input object on the host operating system side;
the input daemon transmits the first input object to the first input window through the input channel;
the electronic device displays the first input object in the first input window.
2. The method of claim 1, wherein a first process is running on a host operating system of the electronic device, the first process for loading an image of the guest operating system and running the guest operating system.
3. The method of claim 1 or 2, further comprising:
and the electronic equipment transmits the first input object to the input daemon through a window instance of the first application program on the host operating system side.
4. The method of any one of claims 1-3, wherein the input daemon passing the first input object to the first input window via the input channel, specifically comprising:
the input daemon transmits the first input object to the input channel;
the input channel transmits the first input object to the first application program;
and the first application program transfers the first input object to the view corresponding to the first input window.
5. The method of any one of claims 1-4, further comprising:
the input daemon passes the first input object to the input channel if it is detected that the input channel is available for the first input method.
6. The method of claim 5, further comprising:
if the input channel is not occupied by a second input method on the guest operating system side, the electronic device determines that the input channel is available for the first input method.
7. The method of claim 6, further comprising:
and if the input channel is occupied by the second input method on the client operating system side, but the priority of the first input method for acquiring the first input object is higher than that of the second input method for acquiring the second input object, the electronic equipment determines that the input channel is available for the first input method.
8. The method of any one of claims 1-7, further comprising:
when the first input method occupies the input channel, if the first input method is not detected to acquire the first input object within the first time, the electronic equipment cancels the first input method to occupy the input channel.
9. The method of any one of claims 6-8, further comprising:
when the first input method occupies the input channel but does not transfer the first input object, the electronic equipment cancels the first input method from occupying the input channel as soon as the second input method is detected to acquire the second input object.
10. The method of any one of claims 1-9, further comprising:
when the first input method is detected to occupy the input channel, the electronic device displays first indication information in the second user interface, wherein the first indication information is used for indicating that the electronic device can realize input in the second user interface by using the first input method.
11. The method of any one of claims 6-10, further comprising:
when the display of the input focus is detected in the second user interface, the electronic device displays second indication information in the second user interface, wherein the second indication information is used for indicating that the electronic device can realize input in the second user interface by using the second input method.
12. The method of claim 11, further comprising:
and when a second user operation of exiting the input state is detected in the second user interface, the electronic equipment cancels the display of the second indication information in the second user interface.
13. The method of claim 12, wherein the second user action comprises one or more of: clicking a position out of the input window in the second user interface, clicking a position out of the second user interface in the first user interface, and exiting the second user interface.
14. The method of any of claims 1-13, wherein the first user operation is an operation that selects the first input window, the first user operation including one or more of: a mouse click operation that selects the first input window, a touch operation acting on the touch panel, a voice instruction operation, and an air gesture operation.
15. The method of any one of claims 6-14, further comprising: and when the first user operation is detected in the first input window, displaying a virtual input keyboard of the second input method by the electronic equipment.
16. The method of claim 15, further comprising:
the electronic equipment detects that a first key in the virtual input keyboard is clicked;
the electronic equipment acquires the second input object generated by clicking the first key through the second input method;
the electronic device passes the second input object to the first input window using the input channel;
the electronic device displays the second input object in the first input window.
17. The method of any one of claims 1-16, wherein the obtaining of the first input object by the first input method comprises one or more of: the input object received through a physical keyboard of the electronic device, the input object received through a soft keyboard of the first input method, the input object received through a voice instruction, the input object received through a touch pad, and the input object received through a touch screen.
18. An electronic device having a host operating system and a guest operating system running thereon, the electronic device comprising: a memory having computer-executable instructions stored therein and a processor coupled to the memory, the processor configured to invoke the instructions to cause the electronic device to perform the steps of:
displaying a first user interface of the host operating system, wherein a second user interface of a first application program is displayed in the first user interface, the first application program is loaded in the client operating system, and the second user interface comprises a first input window;
detecting a first user operation in the first input window;
displaying an input focus in the first input window;
acquiring an input channel corresponding to the first input window;
acquiring a first input object through a first input method on the host operating system side;
acquiring the first input object at the host operating system side through an input daemon in the guest operating system;
passing the first input object to the first input window through the input channel;
displaying the first input object in the first input window.
19. The electronic device of claim 18, wherein the host operating system has a first process running thereon, the first process for loading an image of the guest operating system and running the guest operating system.
20. The electronic device of claim 18 or 19, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
passing the first input object to the input daemon through a window instance of the first application on the host operating system side.
21. The electronic device of any of claims 18-20, wherein the processor invokes the instructions to cause the electronic device to perform, in particular:
the input daemon transmits the first input object to the input channel;
the input channel transmits the first input object to the first application program;
and the first application program transfers the first input object to the view corresponding to the first input window.
22. The electronic device of any of claims 18-21, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
the input daemon passes the first input object to the input channel if it is detected that the input channel is available for the first input method.
23. The electronic device of claim 22, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
determining that the input channel is available for the first input method if the input channel is not occupied by a second input method on the guest operating system side.
24. The electronic device of claim 23, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
and if the input channel is occupied by the second input method on the client operating system side, but the priority of the first input method for acquiring the first input object is higher than that of the second input method for acquiring the second input object, determining that the input channel is available for the first input method.
25. The electronic device of any of claims 18-24, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
and when the first input method occupies the input channel, if the first input method is not detected to acquire the first input object within the first time, canceling the first input method from occupying the input channel.
26. The electronic device of any of claims 23-25, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
when the first input method occupies the input channel but does not transfer the first input object, once the second input method is detected to acquire the second input object, the first input method is cancelled to occupy the input channel.
27. The electronic device of any of claims 18-26, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
when the first input method is detected to occupy the input channel, displaying first indication information in the second user interface, wherein the first indication information is used for indicating that the electronic equipment can realize input in the second user interface by using the first input method.
28. The electronic device of any of claims 23-27, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
when the display of the input focus is detected in the second user interface, displaying second indication information in the second user interface, wherein the second indication information is used for indicating that the electronic equipment can realize input in the second user interface by using the second input method.
29. The electronic device of claim 28, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
and canceling the display of the second indication information in the second user interface when a second user operation for exiting the input state is detected in the second user interface.
30. The electronic device of claim 29, wherein the second user operation comprises one or more of: clicking a position out of the input window in the second user interface, clicking a position out of the second user interface in the first user interface, and exiting the second user interface.
31. The electronic device of any of claims 18-30, wherein the first user operation is an operation to select the first input window, the first user operation including one or more of: a mouse click operation that selects the first input window, a touch operation acting on the touch panel, a voice instruction operation, and an air gesture operation.
32. The electronic device of any of claims 23-31, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
and when the first user operation is detected in the first input window, displaying a virtual input keyboard of the second input method.
33. The electronic device of claim 32, wherein the processor invokes the instructions to cause the electronic device to further perform the following:
detecting that a first key in the virtual input keyboard is clicked;
acquiring a second input object generated by clicking the first key through the second input method;
communicating the second input object to the first input window using the input channel;
displaying the second input object in the first input window.
34. The electronic device according to any one of claims 18-33, wherein the obtaining of the first input object by the first input method includes one or more of: the input object received through a physical keyboard of the electronic device, the input object received through a soft keyboard of the first input method, the input object received through a voice instruction, the input object received through a touch pad, and the input object received through a touch screen.
35. A computer storage medium, wherein a computer program is stored in the storage medium, the computer program comprising executable instructions that, when executed by a processor, cause the processor to perform operations corresponding to the method of any of claims 1-17.
CN202010481577.4A 2020-05-31 2020-05-31 Input method and electronic equipment Pending CN113741708A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010481577.4A CN113741708A (en) 2020-05-31 2020-05-31 Input method and electronic equipment
PCT/CN2021/097049 WO2021244459A1 (en) 2020-05-31 2021-05-29 Input method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010481577.4A CN113741708A (en) 2020-05-31 2020-05-31 Input method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113741708A true CN113741708A (en) 2021-12-03

Family

ID=78727907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010481577.4A Pending CN113741708A (en) 2020-05-31 2020-05-31 Input method and electronic equipment

Country Status (2)

Country Link
CN (1) CN113741708A (en)
WO (1) WO2021244459A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104740872A (en) * 2015-04-13 2015-07-01 北京奇虎科技有限公司 Method and device for operating and controlling game program in simulated Android environment
US9300720B1 (en) * 2013-05-21 2016-03-29 Trend Micro Incorporated Systems and methods for providing user inputs to remote mobile operating systems

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6546431B1 (en) * 1999-03-12 2003-04-08 International Business Machines Corporation Data processing system and method for sharing user interface devices of a provider assistive technology application with disparate user assistive technology applications
CN102427448B (en) * 2011-11-03 2017-07-14 南京中兴软件有限责任公司 Method, terminal and the service end of client input are used in virtual desktop
CN104142851A (en) * 2014-08-04 2014-11-12 福州靠谱网络有限公司 Method for using computer input method in android simulator
CN107179952B (en) * 2016-03-11 2021-03-23 思杰系统有限公司 Collaborative Input Method Editor (IME) activity between virtual application clients and servers
CN112968991B (en) * 2019-06-20 2022-07-29 华为技术有限公司 Input method, electronic equipment and screen projection system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9300720B1 (en) * 2013-05-21 2016-03-29 Trend Micro Incorporated Systems and methods for providing user inputs to remote mobile operating systems
CN104740872A (en) * 2015-04-13 2015-07-01 北京奇虎科技有限公司 Method and device for operating and controlling game program in simulated Android environment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023035758A1 (en) * 2021-09-10 2023-03-16 北京字节跳动网络技术有限公司 Input method setting method and apparatus, input method, and electronic device
CN114489350A (en) * 2021-12-27 2022-05-13 荣耀终端有限公司 Input method calling method and related equipment

Also Published As

Publication number Publication date
WO2021244459A1 (en) 2021-12-09

Similar Documents

Publication Publication Date Title
CN112269527B (en) Application interface generation method and related device
WO2021129326A1 (en) Screen display method and electronic device
JP7385008B2 (en) Screen capture method and related devices
CN110597512B (en) Method for displaying user interface and electronic equipment
WO2021104030A1 (en) Split-screen display method and electronic device
WO2021103981A1 (en) Split-screen display processing method and apparatus, and electronic device
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
CN110362244B (en) Screen splitting method and electronic equipment
CN113542503B (en) Method, electronic device and system for creating application shortcut
WO2021110133A1 (en) Control operation method and electronic device
WO2022068483A9 (en) Application startup method and apparatus, and electronic device
CN114816167B (en) Application icon display method, electronic device and readable storage medium
US20220357818A1 (en) Operation method and electronic device
EP4206884A1 (en) Method for interaction between multiple applications
CN114115619A (en) Application program interface display method and electronic equipment
CN110806831A (en) Touch screen response method and electronic equipment
WO2023124141A1 (en) Input method calling method and related device
WO2021244459A1 (en) Input method and electronic device
CN113986070A (en) Quick viewing method for application card and electronic equipment
EP4195095A1 (en) Method for translating interface of application, and related device
CN110865765A (en) Terminal and map control method
EP4310648A1 (en) Service card processing method, and electronic device
WO2022002213A1 (en) Translation result display method and apparatus, and electronic device
CN115185440B (en) Control display method and related equipment
WO2022213831A1 (en) Control display method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination