WO2023202444A1 - Input method and device

Input method and device

Info

Publication number
WO2023202444A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
input device
status information
electronic device
information set
Application number
PCT/CN2023/087824
Other languages
English (en)
French (fr)
Inventor
韩若斐
侯朋飞
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Publication of WO2023202444A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • the present application relates to the field of electronic equipment, and more specifically, to an input method and device.
  • An input device is a device used by a person or the outside world to interact with a computer, for inputting data or information into the computer.
  • the computer can obtain the corresponding input event through the input device and respond to the input event.
  • the computer may not respond correctly.
  • Embodiments of the present application provide an input method and device that can effectively respond to a user's collaborative operation of multiple input devices.
  • an input method is provided, which method is applied to an electronic device.
  • the method includes: detecting a user's operation on a first input device and determining change information of the first input device; obtaining an input device status information set, where the input device status information set includes status information of a second input device; generating a first input event, where the first input event includes the change information of the first input device and the input device status information set; and performing an operation related to the first input event.
  • the first input event generated when the user operates the first input device includes change information of the first input device and an input device status information set
  • the input device status information set includes status information of the second input device.
  • the input device status information set further includes status information of the first input device.
  • In this way, the user can operate the first input device and the second input device cooperatively to interact with the electronic device, which enriches the ways in which the user can operate the electronic device and improves the user experience.
  • the input device status information set includes status information of all input devices of the electronic device.
  • the method further includes: updating the status information of the first input device in the input device status information set according to the change information of the first input device.
  • the status information of the input device in the input device status information set can be updated according to the change information of the input device, thereby ensuring the validity of the status information of the input device and preventing the electronic device from responding incorrectly to the user's operation.
  • the method further includes: detecting the user's operation on the second input device and determining change information of the second input device; generating a second input event, where the second input event includes the change information of the second input device and the updated input device status information set; and performing an operation related to the second input event.
  • After the user's operation on the second input device is detected, because the input device status information set has already been updated according to the change information of the first input device, the second input event may include the updated input device status information set.
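  • For illustration, a minimal sketch of an input event that bundles change information with a snapshot of the status of all input devices might look as follows; the class and field names (InputEventSketch, DeviceStatusSet, InputEvent) and the status strings are hypothetical assumptions rather than identifiers from this disclosure.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: an input event bundles the change information of the operated
// device with a snapshot of the status information of all input devices.
class InputEventSketch {

    // Input device status information set: one status entry per input device.
    static class DeviceStatusSet {
        private final Map<String, String> status = new HashMap<>();

        void update(String deviceId, String newStatus) { status.put(deviceId, newStatus); }

        DeviceStatusSet snapshot() {                  // copy carried inside each generated event
            DeviceStatusSet copy = new DeviceStatusSet();
            copy.status.putAll(status);
            return copy;
        }

        @Override public String toString() { return status.toString(); }
    }

    // Input event: change information of the operated device + the status information set.
    record InputEvent(String deviceId, String changeInfo, DeviceStatusSet statusSet) { }

    public static void main(String[] args) {
        DeviceStatusSet set = new DeviceStatusSet();
        set.update("mouse", "lower-left corner of first interface, left button pressed");
        set.update("touchscreen", "no touch point");

        // User operates the touch screen: update its status and generate the first input event.
        String change = "touch point added in upper-right corner, moving to upper right";
        set.update("touchscreen", "one touch point, upper-right corner");
        InputEvent first = new InputEvent("touchscreen", change, set.snapshot());
        System.out.println(first);
    }
}
```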
  • the method further includes: determining a first input intention according to the change information of the first input device and the input device status information set; the first input event further includes the first input intention.
  • Performing an operation related to the first input event includes: performing an operation related to the first input intention.
  • the user's input intention can be identified based on the change information of the input device and the input device status information set.
  • the electronic device can query the input intention included in the input event and respond to the input intention, that is, perform an operation related to the input intention. Based on this solution, the electronic device can identify the input intention corresponding to the user's operation, which is conducive to the electronic device making a correct response to the user's operation.
  • Since there are many possible combinations of input device change information and input device status information sets, the electronic device might otherwise need to be adapted to each combination. Based on this solution, the electronic device does not need to adapt to each combination; it only needs to adapt to the input intentions. Likewise, if a new operation method needs to be added, only the operation method corresponding to the input intention needs to be added, which reduces development complexity.
  • determining the first input intention according to the change information of the first input device and the input device status information set includes: determining, according to a first mapping relationship, the first input intention corresponding to the change information of the first input device and the input device status information set.
  • the user's input intention can be determined based on the mapping relationship. In this way, the user's input intention can be accurately and quickly recognized, so that the electronic device can respond correctly.
  • the first mapping relationship is system predefined or user-defined.
  • the mapping relationship is predefined by the system or customized by the user.
  • If the user thinks that the operation method corresponding to an input intention does not match his or her habits, the user can modify the operation method corresponding to the input intention or add a corresponding operation. In this way, the ways in which the user can operate the electronic device are enriched and the user experience is improved.
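  • One way to realize such a mapping relationship is a lookup table keyed on a combination of change information and device status, which the system pre-populates and the user may extend. The sketch below is illustrative only; the class name and the string-based keys are assumptions, not the patented implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical mapping from (change information + relevant device status) to an input intention.
class IntentionMappingSketch {
    private final Map<String, String> mapping = new HashMap<>();

    IntentionMappingSketch() {
        // System-predefined entries.
        mapping.put("key:S pressed|keys:WINDOWS+CTRL down", "screenshot");
        mapping.put("key:A pressed|keys:ALT down", "screenshot");
        mapping.put("gesture:double knuckle tap|keys:none", "screenshot");
    }

    // User-defined entry, e.g. screenshot on CTRL + left mouse click.
    void addUserMapping(String key, String intention) { mapping.put(key, intention); }

    String resolve(String changeInfo, String statusSummary) {
        return mapping.getOrDefault(changeInfo + "|" + statusSummary, "no intention");
    }

    public static void main(String[] args) {
        IntentionMappingSketch m = new IntentionMappingSketch();
        m.addUserMapping("mouse:left click|keys:CTRL down", "screenshot");
        System.out.println(m.resolve("key:A pressed", "keys:ALT down"));       // screenshot
        System.out.println(m.resolve("mouse:left click", "keys:CTRL down"));   // screenshot
    }
}
```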
  • In a second aspect, an electronic device is provided, which includes:
  • one or more processors; and
  • one or more memories, where the one or more memories store one or more computer programs.
  • the one or more computer programs include instructions that, when executed by the one or more processors, cause the electronic device to perform the following steps:
  • the input device status information set includes status information of the second input device
  • the first input event includes change information of the first input device and a set of status information of the input device;
  • the input device status information set further includes status information of the first input device.
  • when the instructions are executed by the one or more processors, the electronic device is caused to perform the following steps:
  • the first input event also includes a first input intention
  • when the instructions are executed by the one or more processors, the electronic device is caused to perform the following steps:
  • the first input intention corresponding to the change information of the first input device and the input device status information set is determined.
  • the first mapping relationship is system predefined or user-defined.
  • In a third aspect, an electronic device is provided, which includes modules/units that perform the method of the above-mentioned first aspect or of any possible design of the first aspect; these modules/units can be implemented by hardware, or by hardware executing corresponding software.
  • In a fourth aspect, an electronic device is provided, including:
  • a detection unit used to detect the user's operation on the first input device and determine the change information of the first input device
  • a processing unit configured to obtain an input device status information set, where the input device status information set includes status information of the second input device
  • the processing unit is also configured to generate a first input event, where the first input event includes change information of the first input device and a set of status information of the input device;
  • An execution unit is used to execute operations related to the first input event.
  • the input device status information set further includes status information of the first input device.
  • the processing unit is further configured to update the status information of the first input device in the input device status information set according to the change information of the first input device.
  • the detection unit is further configured to detect a user's operation on the second input device and determine the change information of the second input device;
  • the processing unit is also configured to generate a second input event, where the second input event includes the change information of the second input device and the updated input device status information set;
  • the execution unit is also configured to perform operations related to the second input event.
  • the processing unit is further configured to determine the first input intention based on the change information of the first input device and the input device status information set;
  • the first input event also includes the first input intention
  • the execution unit is also used to execute operations related to the first input intention.
  • the processing unit is specifically configured to determine, according to the first mapping relationship, the first input intention corresponding to the change information of the first input device and the input device status information set.
  • the first mapping relationship is system predefined or user-defined.
  • In a fifth aspect, a chip is provided, characterized in that the chip includes a processor and a communication interface.
  • the communication interface is used to receive a signal and transmit the signal to the processor.
  • the processor processes the signal so that the method of the first aspect or of any implementation of the first aspect is executed by the electronic device.
  • a chip is provided, which is coupled to a memory in an electronic device and used to call a computer program stored in the memory and execute the first aspect of the embodiment of the present application and any possible technical solution of the first aspect.
  • "Coupling" in the embodiment of this application means that two components are directly or indirectly combined with each other.
  • A computer-readable storage medium is provided, which includes computer instructions.
  • When the computer instructions are run on the electronic device, the electronic device is caused to execute the method of the first aspect or of any implementation of the first aspect.
  • A computer program product is provided, which includes computer program code.
  • When the computer program code is run on the electronic device, the electronic device is caused to execute the method of the first aspect or of any implementation of the first aspect.
  • Figure 1 is a schematic structural diagram of an electronic device provided in this embodiment.
  • Figure 2 is a software structure block diagram of the electronic device according to the embodiment of the present application.
  • FIG. 3 is a schematic diagram of a traditional input method provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of an input method provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of another input method provided by an embodiment of the present application.
  • Figure 6 is a schematic flow chart of an input method provided by an embodiment of the present application.
  • Figure 7 is a schematic flow chart of another input method provided by an embodiment of the present application.
  • Figure 8 is a schematic flow chart of yet another input method provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of the composition of a device provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of the hardware structure of a device provided by an embodiment of the present application.
  • the electronic device may be a portable electronic device that also includes other functions such as a personal digital assistant function and/or a music player function, such as a mobile phone, a tablet computer, or a wearable electronic device with wireless communication functions (such as a smart watch), and the like.
  • Exemplary portable electronic devices include, but are not limited to, portable electronic devices carrying … or other operating systems.
  • the above-mentioned portable electronic device may also be other portable electronic devices, such as a laptop computer (Laptop). It should also be understood that in some other embodiments, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and the like.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and the like.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 can be coupled with the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the PCM interface can also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal. In some embodiments, the GPIO interface can be used to connect the processor 110 with the camera 193, display screen 194, wireless communication module 160, audio module 170, sensor module 180, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices.
  • the interface connection relationships between the modules illustrated in the embodiments of the present application are only schematic illustrations and do not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt interface connection methods different from those in the above-described embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other technologies.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or a display panel made of other materials.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 can also integrate a touch function, which can also be called a touch screen.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • Speaker 170A also called “speaker”
  • Receiver 170B also called “earpiece”
  • Microphone 170C is also called a "mic".
  • the headphone interface 170D is used to connect wired headphones.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the gyro sensor 180B may be used to determine the motion posture of the electronic device 100 .
  • Air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes).
  • Distance sensor 180F for measuring distance.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • Touch sensor 180K also called "touch panel”. The touch sensor 180K can be disposed on the display screen 194.
  • Bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human body's vocal part.
  • the bone conduction sensor 180M can also contact the human body's pulse and receive blood pressure beating signals.
  • the buttons 190 include a power button, a volume button, etc.
  • the motor 191 can generate vibration prompts.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • FIG. 2 is a software structure block diagram of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system libraries, and kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, APP1, APP2, etc.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • This data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 .
  • call status management including connected, hung up, etc.
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, or the indicator light flashes.
  • Android runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one part consists of the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules, for example: a surface manager, media libraries, 3D graphics processing libraries (for example, OpenGL ES), and 2D graphics engines (for example, SGL).
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can include display drivers, camera drivers, audio drivers, and sensor drivers.
  • the electronic device in the embodiment of the present application may also be an electronic device installed with operating systems such as Windows, Linux, Android, Hongmeng, or Apple.
  • the electronic device 100 may also include a variety of input devices, such as a mouse, a keyboard, a touch screen, a camera, a microphone, etc.
  • input devices such as a mouse, a keyboard, a touch screen, a camera, a microphone, etc.
  • the input device may be a part of the electronic device 100, such as the camera 193 or the display screen 194; in another example, the input device may be independent of the electronic device 100, such as a mouse or a microphone 170, etc.
  • the input device can be connected to the electronic device 100 in various ways.
  • the input device can be wirelessly connected to the electronic device 100 through Bluetooth or WiFi; or the input device can be wired to the electronic device 100 through the USB interface 130 .
  • the electronic device may also include input devices such as a gyroscope sensor, an infrared light sensor, and a structured light sensor. These sensors can, for example, identify changes in the posture of the electronic device, changes in the surrounding ambient light, etc.
  • the electronic device 100 may include a driver corresponding to the input device, such as a display driver, a camera driver, an audio driver, etc. as shown in FIG. 2 .
  • a driver is an interface for communication between a computer and a hardware device (for example, an input device). The driver can convert the operation of the hardware device into machine language, or it can also convey the system instructions to the hardware device.
  • Input framework: part of the operating system that connects hardware devices with applications and other system services; it is mainly used to generate input events and distribute them to applications.
  • UI framework: part of the operating system that is used to identify and display the UI of an application, and that can trigger the application to respond based on input events.
  • Input events: operations initiated by the user on the computing system through input devices are represented inside the computer as input events.
  • FIG. 3 is a schematic diagram of a traditional input method provided by an embodiment of the present application. As shown in FIG. 3 , when the user operates the input device, the information flow inside the electronic device 100 is transmitted in the direction indicated by the arrow in FIG. 3 . The specific process is introduced as follows.
  • The user operates an input device of the electronic device; the corresponding driver obtains the user's operation information, converts the operation information into input device change information, and reports it to the input framework. After receiving the input device change information, the input framework packages the input device change information and the status information together to form an input event and distributes it to the UI framework. After receiving the input event, the UI framework finds the corresponding response code in the targeted application and hands the event over to it for processing. After the response code in the application receives the input event, it responds to the user's operation.
  • For example, the user uses a finger to click the login control of the browser on the touch screen. The display driver obtains the operation information of the clicked position on the touch screen and converts it into the change information "a touch point is added on the touch screen (including the position of the touch point on the screen)".
  • The input framework updates the touch screen status information to "the touch screen has a touch point (including the position of the touch point on the screen)", and then packages the change information "a touch point is added on the touch screen" and the status information "the touch screen has a touch point" into a touch input event and distributes it to the UI framework. After the UI framework receives the touch input event, based on the position where the touch screen was clicked, it triggers the response code corresponding to the login control in the browser, which executes the login logic.
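  • In the traditional flow just described, the input event only carries the change and status information of the device that reported it. A hedged sketch of that limitation follows (the names and status strings are hypothetical); it is this limitation that the embodiments below address.

```java
// Hypothetical sketch of a traditional input event: it carries only the change information
// and status information of the single input device that reported it, so an application
// handling it cannot see the status of any other input device.
class TraditionalInputEventSketch {

    record TraditionalEvent(String deviceId, String changeInfo, String deviceStatus) { }

    public static void main(String[] args) {
        TraditionalEvent touch = new TraditionalEvent(
                "touchscreen",
                "a touch point is added on the touch screen (login control position)",
                "the touch screen has a touch point");
        // No mouse or keyboard status is available here, so concurrent operations on other
        // input devices cannot be taken into account when responding.
        System.out.println(touch);
    }
}
```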
  • Users may use multiple input devices when using electronic devices for work, and the multiple input devices may share one electronic device.
  • users use a mouse, keyboard, and touch screen for work at the same time, and the mouse, keyboard, and touch screen can be connected to the same electronic device.
  • each input device will independently report input events, such as the mouse input events and key input events shown in Figure 3, and the application will respond independently to the mouse input events and the key input events.
  • the application cannot obtain the status information of other input devices (such as keyboard, touch screen).
  • the application may not respond correctly due to a conflict between two input events. For example, the user uses the touch screen to drag the application to the left, and the mouse to drag the application to the right. At this time, the electronic device may only respond to one of the operations, or may not respond to the user's operation.
  • FIG. 4 is a schematic diagram of an input method provided by an embodiment of the present application.
  • the internal information flow of the electronic device 100 is transmitted in the direction indicated by the arrow in FIG. 4 , and is detailed as follows.
  • Take the touch input event as an example for illustration.
  • The user presses the upper right corner of the first interface on the touch screen with a finger and drags it to the upper right; the corresponding driver obtains the user's operation information, converts the operation information into touch screen change information, and reports it to the input framework; the input framework packages the touch screen change information together with the input device status information set to form an input event, and then distributes it to the UI framework; after receiving the input event, the UI framework finds the corresponding response code in the targeted application and hands the event over to it for processing; after the response code in the application receives the input event, it responds to the user's operation.
  • The input device status information set refers to the current status information of all input devices of the electronic device.
  • the input device of the electronic device includes a mouse and a touch screen, and the input device status information set includes mouse status information and touch screen status information.
  • the status information of the mouse is: the mouse is located in the lower left corner of the first interface, and the left button is pressed.
  • the touch screen change information is: a touch point is added to the touch screen, located in the upper right corner of the first interface, and the touch point moves to the upper right.
  • Touch input events include touch screen change information and mouse status information. After receiving the touch input event, the application program enlarges the first interface in response to the touch input event.
  • the status information of the mouse is: the mouse is located in the lower left corner of the first interface and is in an inactive state.
  • the touch screen change information is: a touch point is added to the touch screen, located in the upper right corner of the first interface, and the touch point moves to the upper right.
  • Touch input events include touch screen change information and mouse status information. After receiving the touch input event, the application program drags the first interface in a direction away from the mouse position in response to the touch input event.
  • The mouse input event and the touch input event in Figure 4 are two independent input events, but the input device status information set included in each of them contains the status information of the other input device.
  • the mouse status information in the input device status information set is: the mouse is located in the lower left corner of the first interface, and the left button is pressed. It can be understood that before the touch input event is reported, the mouse input event has been reported.
  • the mouse change information of the mouse input event is: move to the lower left corner of the first interface and press the left button, and the mouse status information is updated according to the mouse change information.
  • the updated mouse status information is: the mouse is located in the lower left corner of the first interface, and the left button is pressed. This updated mouse state information can be used as part of the input device state information set in subsequent touch input events.
  • the status information of the input device can be updated according to the change information of the input device.
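  • To illustrate the sequencing described above: the mouse event is reported first and the mouse entry of the status information set is updated from the mouse change information, so the later touch event carries that updated mouse status. A minimal sketch under these assumptions (the names and status strings are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the mouse event is reported first, the status information set is
// updated from the mouse change information, and the later touch event therefore carries
// the updated mouse status.
class SequentialEventsSketch {
    public static void main(String[] args) {
        Map<String, String> statusSet = new HashMap<>();
        statusSet.put("mouse", "no operation");
        statusSet.put("touchscreen", "no touch point");

        // 1. Mouse change information is reported and the set is updated.
        String mouseChange = "moved to lower-left corner of first interface, left button pressed";
        statusSet.put("mouse", "lower-left corner of first interface, left button pressed");
        System.out.println("mouse input event: " + mouseChange + " + " + statusSet);

        // 2. Touch change information is reported; the touch event now includes the updated
        //    mouse status, so the application can choose to enlarge the first interface.
        String touchChange = "touch point added in upper-right corner, moving to upper right";
        statusSet.put("touchscreen", "one touch point, upper-right corner");
        System.out.println("touch input event: " + touchChange + " + " + statusSet);
    }
}
```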
  • Figure 5 is a schematic diagram of another input method provided by an embodiment of the present application.
  • the internal information flow of the electronic device 100 is transmitted in the direction indicated by the arrow in Figure 5.
  • the process is similar to the embodiment shown in Figure 4.
  • the input event may include an input intention
  • the application program may respond to the input intention
  • the key input event may include input intention, key change information and input device status information set.
  • the input device status information set may include key status information and mouse status information.
  • the key change information is: the S key changes from the up state to the pressed state
  • the key state information is: the WINDOWS key and the CTRL key are in the pressed state and the S key is in the pressed state
  • the mouse state information is: No operation state
  • input intention is: screenshot. After the screenshot application receives the key input event, it responds to the input intention in the key input event and performs the screenshot operation.
  • the key input event includes input intention, key change information and input device status information set.
  • the input device status information set includes key status information and mouse status information.
  • the key change information is: the A key changes from the pop-up state to the pressed state
  • the key state information is: the ALT key is in the pressed state and the A key is in the pressed state
  • the mouse status information is: no operation state
  • the input intention is: screenshot. After the screenshot application receives the key input event, it responds to the input intention and performs the screenshot operation.
  • For example, the user double-taps the screen with their knuckles.
  • the gesture input event includes input intention, gesture change information and input device status information set.
  • the input device status information set includes key status information.
  • the gesture change information is: the screen is tapped twice with knuckles
  • the key status information is: no operation status
  • the input intention is: screenshot. After the screenshot application receives the gesture input event, it responds to the input intention and performs the screenshot operation.
  • the input device of the electronic device may also be a microphone, a gyroscope sensor, a camera or an infrared light sensor, etc.
  • the input device of the electronic device includes an infrared light sensor and a microphone
  • the input event includes "user's voice received: scroll down" (microphone change information), and the user is looking at the screen (infrared light sensor status information)
  • the input intention corresponding to the input event is: scroll down.
  • the electronic device responds to the input intention of "scroll down” and controls the page to scroll down.
  • the status information that "the user is looking at the screen” can be identified by an infrared light sensor and input into the electronic device.
  • the infrared light sensor recognizes the change information of "the user is looking at the screen", and then updates the infrared light sensor status information in the input device status information set to "the user is looking at the screen” based on the change information.
  • the electronic device includes buttons and a gyro sensor.
  • the input event includes: the power button is double-clicked (key change information), and the mobile phone is in a raised posture (gyro sensor status information). The input intention corresponding to the input event is then: open the payment interface. After receiving the input event, the electronic device responds to the input intention of "open the payment interface" and opens the corresponding payment interface.
  • the status information that "the mobile phone is in a raised posture” can be recognized by a gyroscope sensor and input into the electronic device.
  • the gyro sensor recognizes the change information that "the mobile phone changes from a horizontal posture to a raised posture", and then, based on this change information, updates the gyro sensor status information in the input device status information set to "the mobile phone is in a raised posture".
  • the input intention can be identified based on the change information of the input device and the input device status information set, and the input intention can be packaged into the input event.
  • the electronic device can respond to the input intention instead of responding to a specific operation (for example, responding to screenshot intent instead of responding to the ALT+A key).
  • the way to identify the input intention may be to establish a mapping relationship between the change information of the input device and the input device status information set and the input intention.
  • the input intention of "screenshot” can correspond to "left mouse button click (change information) + right mouse button pressed + ALT key pressed", or it can correspond to "right mouse button click + ALT key pressed Pressed state” can also correspond to "the screen was tapped once with the knuckles (change information) + the power button is pressed”.
  • mapping relationship between the change information of the input device and the input device status information set and the input intention can be predefined by the system, and can be specifically defined according to user habits, which is not limited in this application.
  • the user can define or add corresponding operations for input intentions. For example, the user can establish a mapping relationship between the input intention of "screenshot” and "the left mouse button click (change information) + the CTRL key is pressed".
  • the above example only takes the combination between two input devices as an example.
  • the number of input devices may be more than two, and the input device status information set may include status information of more than two input devices; for example, the above input device status information set may include mouse status information, key status information, touch screen status information, sensor status information, and so on, which is not limited by this application.
  • the set of input device status information includes status information for all input devices of the electronic device.
  • the electronic device may respond to a specific input event (i.e., respond according to the change information of the input device and the input device status information set in the input event).
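  • As an illustration of responding according to both the change information and the status information set (as in the Figure 4 example), a hedged sketch is shown below; the method name and status strings are hypothetical assumptions, not the disclosed implementation.

```java
// Hypothetical sketch of an execution module (application side) that responds to a touch
// input event differently depending on the mouse status carried in the event's
// input device status information set.
class ExecutionModuleSketch {

    static String respondToTouchEvent(String touchChangeInfo, String mouseStatus) {
        boolean touchDrag = touchChangeInfo.contains("moving to upper right");
        if (touchDrag && mouseStatus.contains("left button pressed")) {
            // Mouse anchored with the left button pressed + touch dragging to the upper right:
            // treat the collaborative operation as enlarging the first interface.
            return "enlarge the first interface";
        } else if (touchDrag) {
            // Mouse inactive: treat the touch drag as moving the first interface.
            return "drag the first interface away from the mouse position";
        }
        return "no response";
    }

    public static void main(String[] args) {
        String change = "touch point added in upper-right corner, moving to upper right";
        System.out.println(respondToTouchEvent(change,
                "lower-left corner of first interface, left button pressed"));  // enlarge
        System.out.println(respondToTouchEvent(change,
                "lower-left corner of first interface, no operation"));         // drag
    }
}
```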
  • Figure 6 is a schematic flow chart of an input method provided by an embodiment of the present application.
  • the electronic device includes a first input device and a second input device.
  • the method includes:
  • S610 The first input device detects the user's operation.
  • the first input device may be a mouse.
  • the operations detected by the first input device may include mouse movement, left click, right click, left double click, left button release, wheel scrolling (up or down), and so on.
  • the first input device may be a keyboard.
  • the operation detected by the first input device may include a certain key being pressed, a certain key popping up, etc.
  • the first input device may be a microphone, and when the user uses the microphone to operate, the first input device may detect the user's voice input.
  • the first input device may be a touch screen.
  • the operations detected by the first input device may include single click, double click, and gesture (the touch point moves on the touch screen).
  • the first input device may be a gyroscope sensor, and when the user moves the electronic device, the first input device may detect changes in the posture of the electronic device.
  • S620 The first device driver corresponding to the first input device obtains the first operation information.
  • the first operation information may be a digital representation of the user's specific operation or an electrical signal, etc.
  • For example, the machine-language representation in the computer of the user's action of pressing the space bar may be obtained.
  • S630 The first device driver sends the change information of the first input device to the processing module; accordingly, the processing module receives the change information of the first input device.
  • the change information of the input device refers to the information about the change of the state of the input device due to the user's operation. For example, if the user presses the touch screen with his finger, the electronic device can obtain the pressed position of the touch screen and the time of pressing.
  • the touch screen change information is "add a touch point to the touch screen (including the position of the touch point on the screen and the time of pressing).” .
  • the change information of the first input device is "add a touch point on the screen (which may include the position of the touch point on the screen). and pressing time)".
  • the first operation information may be converted into a description of the change in the state of the first input device, that is, the change information of the first input device.
  • S640 The processing module obtains the input device status information set.
  • the input device status information set may include status information of all input devices of the electronic device.
  • the electronic device includes a first input device and a second input device
  • the input device status information set may include status information of the first input device and status information of the second input device.
  • the electronic device may include a mouse, a keyboard and a touch screen
  • the input device status information set may include mouse status information, key status information, touch status information, etc.
  • the key status information may be the state in which the space bar is pressed or the state in which the space bar is released; for another example, the touch status information may be that there is no touch point on the screen or that there is one touch point on the screen.
  • the first input device and the second input device are different.
  • the first input device and the second input device are different may mean that the first input device and the second input device are of different types.
  • the first input device is a mouse and the second input device is a keyboard; the first input device and the second input device are different.
  • That the input devices are different may also mean that the first input device and the second input device are distinguished in the computer by identification or logically.
  • the first input device is a first mouse and the second input device is a second mouse.
  • the set of input device status information includes status information of the second input device.
  • the processing module can obtain the current status information of multiple input devices to obtain the input device status information set. The processing module can maintain the input device status information set locally and update it according to the change information of an input device whenever that input device's status information changes.
  • the processing module may pre-store an input device status information set, where the status information of the first input device and the status information of the second input device in the input device status information set are both default status information.
  • the key state information defaults to the pop-up state
  • the mouse status information defaults to: position information on the screen and no operation state (for example, the left button and the right button are both in the pop-up state).
  • the processing module updates the status information of the first input device in the input device status information set according to the change information of the first input device.
  • the processing module can obtain the input device status information set in multiple ways.
  • the processing module can also query status information of multiple input devices to obtain a set of input device status information.
  • the processing module can also receive status information reported by multiple input devices to obtain a set of input device status information.
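  • As a sketch of the first of these options, a processing module might maintain the set locally, pre-store default status values, and update the relevant entry whenever change information is reported; the class name and status strings below are hypothetical assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a processing module that maintains the input device status
// information set locally and keeps it up to date from reported change information.
class ProcessingModuleSketch {
    private final Map<String, String> statusSet = new HashMap<>();

    ProcessingModuleSketch() {
        // Pre-stored default status information for the input devices of the electronic device.
        statusSet.put("keyboard", "all keys released");
        statusSet.put("mouse", "no operation");
        statusSet.put("touchscreen", "no touch point");
    }

    // Called whenever a device driver reports change information for a device.
    void onChangeInfo(String deviceId, String changeInfo, String newStatus) {
        statusSet.put(deviceId, newStatus);    // update the set from the change information
        // The generated input event carries the change information plus the whole set.
        System.out.println("input event: " + deviceId + " change=" + changeInfo
                + " statusSet=" + statusSet);
    }

    public static void main(String[] args) {
        ProcessingModuleSketch pm = new ProcessingModuleSketch();
        pm.onChangeInfo("keyboard", "space bar pressed", "space bar pressed");
        pm.onChangeInfo("touchscreen", "touch point added", "one touch point on screen");
    }
}
```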
  • S650 The processing module generates a first input event based on the change information of the first input device and the input device status information set.
  • the first input event may include change information of the first input device and an input device status information set, and the input device status information set includes status information of the second input device.
  • the first input event may include change information of the first input device and an input device status information set
  • the input device status information set includes status information of the first input device and status information of the second input device.
  • S660 The processing module sends the first input event to the execution module, and accordingly, the execution module receives the first input event from the processing module.
  • the processing module may send the first input event to the specific execution module according to the input event distribution rules.
  • the processing module may send the first input event to the system-level service first.
  • the execution module may be an application program, and the processing module may send the first input event to the currently active application program.
  • when there are two or more currently active application programs, the processing module sends the first input event to each of the currently active application programs.
  • the processing module sends the first input event to the application running in the background.
  • the first input event is a pointing input event, for example, a mouse click on a specific icon, link, control, etc.
  • the processing module sends the first input event to the application pointed to by the first input event. That is, the processing module distributes the first input event according to the content of the first input event itself.
  • the execution module is the execution module pointed to by the first input event.
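  • The distribution rules above could be pictured roughly as in the following sketch; the priority order (system-level services first, then the pointed-to module, then the currently active applications) follows the surrounding text, while the Dispatcher type, its method names and the sample modules are assumptions:

```kotlin
// Illustrative dispatcher sketch for distributing input events to execution modules.
interface ExecutionModule {
    val name: String
    fun handle(event: String): Boolean   // true means the event was consumed
}

class Dispatcher(
    private val systemServices: List<ExecutionModule>,
    private val activeApps: List<ExecutionModule>
) {
    // targetOf resolves a pointing event (click on an icon, link or control) to its module.
    fun dispatch(event: String, targetOf: (String) -> ExecutionModule? = { null }) {
        if (systemServices.any { it.handle(event) }) return    // system-level services first
        targetOf(event)?.let { it.handle(event); return }      // then the pointed-to module
        activeApps.forEach { it.handle(event) }                // otherwise every active app
    }
}

fun main() {
    val screenshotService = object : ExecutionModule {
        override val name = "screenshot-service"
        override fun handle(event: String) = false             // not interested, pass it on
    }
    val browser = object : ExecutionModule {
        override val name = "browser"
        override fun handle(event: String): Boolean { println("$name handles: $event"); return true }
    }
    Dispatcher(listOf(screenshotService), listOf(browser)).dispatch("left click on login control")
}
```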
  • the execution module responds to the first input event.
  • the first input event may include change information of the first input device and an input device status information set, where the input device status information set may include status information of the second input device.
  • in this way, the input event reported for the first input device also includes status information of other input devices, and the execution module can respond effectively to the user's collaborative operation of multiple input devices.
  • the execution module responding to the first input event may be the execution module performing an operation related to the first input event.
  • the first input event is a mouse input event.
  • when the mouse pointer is on an application's icon and the left mouse button is pressed, the operation related to the mouse input event is to open that application.
  • the first input event is a key input event.
  • when the A key is pressed, the operation related to the key input event is to input the letter A.
  • the first input device may be a microphone
  • the second input device may be an infrared light sensor
  • the first input event includes change information of the first input device: the user's voice "scroll down" is received
  • and status information of the second input device: the user is looking at the screen; the operation related to the first input event is to scroll the page down.
  • the first input device may be a button
  • the second input device may be a gyroscope sensor
  • the first input event includes change information of the first input device: the power button is double-clicked, and status information of the second input device: the mobile phone is in a raised posture.
  • the operation related to the first input event is: the electronic device displays the payment interface.
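  • The two examples above can be read as a lookup on the combination of the first device's change information and the second device's status. In this hypothetical sketch the two bindings mirror the examples in the text; the Combo type and the respond function are assumptions:

```kotlin
// Illustrative sketch: react to the combination of the first device's change information
// and the second device's status information.
data class Combo(val changeInfo: String, val secondDeviceStatus: String)

// Hypothetical bindings mirroring the two examples in the text.
val reactions: Map<Combo, String> = mapOf(
    Combo("voice command \"scroll down\" received", "user is looking at the screen")
        to "scroll the current page down",
    Combo("power button double-clicked", "phone is in a raised posture")
        to "display the payment interface"
)

fun respond(changeInfo: String, secondDeviceStatus: String): String =
    reactions[Combo(changeInfo, secondDeviceStatus)] ?: "no cooperative operation defined"

fun main() {
    println(respond("power button double-clicked", "phone is in a raised posture"))
}
```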
  • the electronic device may include a first input device, a second input device, a first device driver, a processing module and an execution module.
  • Figure 7 is a schematic flow chart of another input method provided by an embodiment of the present application.
  • the electronic device includes a first input device and a second input device.
  • the method includes:
  • the first input device detects the user's operation.
  • S720 The first device driver obtains the first operation information from the first input device.
  • the first device driver sends the change information of the first input device to the processing module, and accordingly, the processing module receives the change information of the first input device.
  • the processing module obtains the input device status information set.
  • S710-S740 are similar to S610-S640. For details, please refer to the relevant description of S610-S640.
  • the processing module determines the first input intention based on the change information of the first input device and the input device status information set, and generates a first input event.
  • the first input intention may be determined based on the change information of the first input device and the input device status information set. There is a first mapping relationship between the combination of the change information of the first input device and the input device status information set, on the one hand, and the first input intention, on the other.
  • the first mapping relationship can be explained with a few examples. For instance, both "left mouse button click + keyboard ALT key in the pressed state" and "touch point slides to the right from the left edge of the screen + keyboard ALT key in the pressed state" may correspond to the first input intention.
  • the first mapping relationship may be predefined by the developer or the system.
  • user habits may be determined through big data and then defined based on the user habits, which is not limited by this application.
  • the user can also define or add a corresponding operation for the first input intention. For example, the user can establish a mapping relationship between the input intention of "screenshot" and "left mouse button click (change information) + CTRL key pressed (status information)". That is to say, the user can customize the first mapping relationship between the change information of the first input device together with the input device status information set, and the first input intention.
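  • A hedged sketch of how the first mapping relationship might be held as a table from (change information, status condition) to an input intention, with a user-defined entry added at run time. The CTRL plus left-click screenshot binding follows the example above, while the SWITCH_WINDOW intention and all type names are assumptions, since the text does not name the first input intention:

```kotlin
// Illustrative sketch of the first mapping relationship: combinations of change
// information and device status map to an input intention; users may add entries.
enum class Intention { SCREENSHOT, SWITCH_WINDOW }

data class Trigger(val changeInfo: String, val statusCondition: String)

class IntentionMapper {
    // System-predefined part of the first mapping relationship.
    private val mapping = mutableMapOf(
        Trigger("left mouse button click", "keyboard ALT key pressed") to Intention.SWITCH_WINDOW,
        Trigger("touch point slides right from left screen edge", "keyboard ALT key pressed")
            to Intention.SWITCH_WINDOW
    )

    // User-defined additions, e.g. binding "screenshot" to CTRL + left click.
    fun define(trigger: Trigger, intention: Intention) { mapping[trigger] = intention }

    fun resolve(trigger: Trigger): Intention? = mapping[trigger]
}

fun main() {
    val mapper = IntentionMapper()
    mapper.define(Trigger("left mouse button click", "CTRL key pressed"), Intention.SCREENSHOT)
    println(mapper.resolve(Trigger("left mouse button click", "CTRL key pressed")))  // SCREENSHOT
}
```

  • Keeping the mapping in a mutable table is one way to make the user-defined additions described above straightforward; a real system could persist them or validate them against existing entries.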
  • the processing module packages the change information of the first input device, the input device status information set and the first input intention to form a first input event. That is to say, the first input event includes change information of the first input device, the input device status information set and the first input intention.
  • the processing module sends the first input event to the execution module, and accordingly, the execution module receives the first input event from the processing module.
  • S760 is similar to S660. For details, please refer to the relevant description of S660.
  • the execution module responds to the first input intention.
  • after the execution module receives the first input event, it can query the first input intention in the first input event and respond to that first input intention. That is, the execution module executes an operation related to the first input intention.
  • for example, if the first input intention is to take a screenshot, the execution module performs a screenshot operation; if the first input intention is to record audio, the execution module performs a recording operation; and if the first input intention is to play music, the execution module performs a music playback operation.
  • S770 is similar to S670. For details, please refer to the relevant description of S670.
  • the execution module cannot respond to the first input intention.
  • the execution module is text reading software and does not have the function of playing music.
  • the first input intention in the first input event is to play music, and the execution module may instead respond to the change information of the first input device and the input device status information set in the first input event.
  • the execution module is text reading software.
  • the first input event includes pressing the power button (change information) and the touch screen having a touch point (status information).
  • the first input intention is to play music; because the text reading software cannot respond to the music-playing intention, it can respond according to the power key being pressed (change information) and the touch screen having a touch point (status information) in the first input event.
  • pressing the power button (change information) and the touch screen having a touch point (status information) correspond to opening the settings of the text reading software, and then the electronic device displays the setting interface of the text reading software.
  • the above scenario is related to the distribution of input events. If the input event is distributed only to an execution module that cannot respond to the first input intention, that execution module can respond to the change information of the first input device and the input device status information set in the first input event.
  • if the first execution module cannot respond to the first input intention, the electronic device may distribute the first input event to a second execution module that can respond to the first input intention.
  • the text reading software cannot respond to the input intention of playing music.
  • the electronic device can also distribute the input event to the music player, and the music player responds to the input intention of playing music.
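  • One way to picture this respond-or-redistribute behaviour is sketched below; the module names and supported intentions echo the text reading software and music player example, while the deliver helper and its return values are assumptions:

```kotlin
// Illustrative sketch: respond to the intention if the target module supports it;
// otherwise redistribute to a capable module or fall back to the raw combination.
data class Event(val intention: String?, val changeInfo: String, val status: String)
data class Module(val name: String, val supportedIntentions: Set<String>)

fun deliver(event: Event, target: Module, others: List<Module>): String {
    val intention = event.intention
    return when {
        intention != null && intention in target.supportedIntentions ->
            "${target.name} responds to intention '$intention'"
        intention != null && others.any { intention in it.supportedIntentions } ->
            // Redistribute the event to a module that can handle the intention.
            "${others.first { intention in it.supportedIntentions }.name} responds to intention '$intention'"
        else ->
            // Fall back to the change information + status information combination.
            "${target.name} responds to '${event.changeInfo}' + '${event.status}'"
    }
}

fun main() {
    val textReader = Module("text reader", setOf("open settings"))
    val musicPlayer = Module("music player", setOf("play music"))
    val event = Event("play music", "power key pressed", "one touch point on the touch screen")
    println(deliver(event, textReader, listOf(musicPlayer)))   // the music player responds
}
```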
  • Figure 8 is a schematic flow chart of yet another input method provided by an embodiment of the present application.
  • the method is executed by an electronic device.
  • the electronic device includes a first input device and a second input device.
  • the method includes:
  • S810 Detect the user's operation on the first input device, and determine the change information of the first input device.
  • S810 is similar to S610 and S620. For details, please refer to the relevant descriptions of S610 and S620.
  • the electronic device may perform actions or steps performed by the first input device or the first device driver in FIG. 6 or FIG. 7 .
  • S820 Obtain an input device status information set, which includes status information of the second input device.
  • the electronic device may perform actions or steps performed by the processing module in FIG. 6 or FIG. 7, or the electronic device may perform actions or steps performed by the input framework in FIG. 4 or FIG. 5.
  • the input device status information set at least includes status information of the second input device.
  • the input device status information set may include status information of all input devices of the electronic device.
  • the electronic device includes a first input device and a second input device
  • the input device status information set may include status information of the first input device and status information of the second input device.
  • the first input device and the second input device are different.
  • the first input device and the second input device are different may mean that the first input device and the second input device are of different types.
  • the first input device is a mouse and the second input device is a keyboard; the first input device and the second input device are different.
  • Different input devices may also mean that the first input device and the second input device are distinguished by identifier, or logically, within the computer.
  • the first input device is the first mouse
  • the second input device is the second mouse.
  • the status information of the first input device in the input device status information set is updated according to the change information of the first input device.
  • whenever the status of an input device changes, the input device status information set is updated according to that device's change information. In this way, the validity of the status information in the input device status information set can be ensured, which helps the electronic device respond correctly to the user's operation.
  • S830 Generate a first input event, where the first input event includes change information of the first input device and a set of status information of the input device.
  • the electronic device may perform actions or steps performed by the processing module in FIG. 6 or FIG. 7, or the electronic device may perform actions or steps performed by the input framework or UI framework in FIG. 4 or FIG. 5.
  • the first input intention is determined according to the change information of the first input device and the input device status information set; the first input event also includes the first input intention.
  • according to a first mapping relationship, the first input intention corresponding to the change information of the first input device and the input device status information set is determined.
  • the user's input intention can be determined based on the mapping relationship. In this way, the user's input intention can be accurately and quickly recognized, so that the electronic device can respond correctly.
  • the first mapping relationship is system predefined or user-defined.
  • the mapping relationship is predefined by the system or customized by the user.
  • if the user feels that the operation method corresponding to an input intention does not match his or her habits, the user can modify the operation method corresponding to the input intention or add a corresponding operation. In this way, the ways in which the user can operate the electronic device are enriched and the user experience is improved.
  • S840 Perform operations related to the first input event.
  • the electronic device may perform actions or steps performed by the execution module in FIG. 6 or FIG. 7, or the electronic device may perform actions or steps performed by the application program in FIG. 4 or FIG. 5.
  • an operation related to the first input intention is performed.
  • the user's input intention can be identified based on the change information of the input device and the input device status information set.
  • the electronic device can query the input intention included in the input event and respond to the input intention, that is, perform an operation related to the input intention. Based on this solution, the electronic device can identify the input intention corresponding to the user's operation, which is conducive to the electronic device making a correct response to the user's operation.
  • since there are many possible combinations of input device change information and input device status information sets, the electronic device might otherwise need to be adapted to each combination. Based on this solution, the electronic device does not need to be adapted to each combination; it only needs to be adapted to the input intentions. Likewise, if a new operation method needs to be added, only the operation method corresponding to an input intention has to be added, which reduces development complexity.
  • the electronic device can detect the user's operation on the second input device, determine the change information of the second input device, and generate a second input event, where the second input event includes the change information of the second input device and the updated input device status information set; the electronic device then performs an operation related to the second input event.
  • the user's operation on the second input device may be detected after the user's operation on the first input device has been detected; the input device status information set has already been updated according to the change information of the first input device.
  • therefore, the second input event may include the updated input device status information set.
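  • A small hedged illustration of this sequencing, assuming a simple event shape (the Event type and device names are illustrative): the first event is built from the old snapshot, the set is then updated, and the second event is built from the updated snapshot:

```kotlin
// Illustrative sketch: the first event updates the status information set, so the
// second event (from the other device) carries the already-updated snapshot.
data class Event(val device: String, val changeInfo: String, val statusSet: Map<String, String>)

fun main() {
    val statusSet = mutableMapOf("mouse" to "idle", "keyboard" to "all keys released")

    // First input device (the mouse): report its change, then update the set.
    val firstEvent = Event("mouse", "left button pressed", statusSet.toMap())
    statusSet["mouse"] = "left button held at (40, 60)"

    // Second input device (the keyboard): its event carries the updated mouse status.
    val secondEvent = Event("keyboard", "A key pressed", statusSet.toMap())

    println(firstEvent)
    println(secondEvent)   // the status set now shows the mouse button being held
}
```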
  • the electronic device includes corresponding hardware structures and/or software modules that perform each function.
  • for the algorithm steps of each example described in conjunction with the embodiments disclosed herein, the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application can divide the processor in the electronic device into functional modules according to the above method examples.
  • each function can be assigned its own functional module, or two or more functions can be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic and is only a logical function division. In actual implementation, there may be other division methods.
  • Figure 9 shows a schematic diagram of the composition of a device 900 provided by an embodiment of the present application.
  • the device 900 includes: a detection unit 910, a processing unit 920 and an execution unit 930.
  • the detection unit 910 is used to detect the user's operation on the first input device and determine the change information of the first input device.
  • the detection unit 910 may be used to perform S810 in Figure 8, or to perform the actions or steps performed by the first input device and the first device driver in Figures 6 and 7, or to perform the actions or steps performed in Figure 4 or Figure 5 by input devices such as the mouse and keyboard or by their device drivers.
  • the processing unit 920 is configured to obtain an input device status information set, which includes status information of the second input device, and to generate a first input event, which includes the change information of the first input device and the input device status information set.
  • the processing unit may be used to perform S820 and S830 in Figure 8, or to perform the actions or steps performed by the processing module in Figures 6 and 7, or to perform the actions or steps performed by the input framework or UI framework in Figure 4 or Figure 5.
  • the execution unit 930 is used to perform operations related to the first input event, or to perform operations related to the first input intention. Illustratively, it is used to perform S840 in Figure 8, or perform actions or steps performed by the execution module in Figure 6 or Figure 7, or perform actions or steps performed by the application program in Figure 4 or Figure 5.
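  • A rough sketch of this three-unit decomposition; the interfaces and the Device900 wrapper are illustrative assumptions rather than the patent's implementation:

```kotlin
// Illustrative decomposition mirroring detection unit 910, processing unit 920 and
// execution unit 930; all interface and method names are assumptions.
interface DetectionUnit { fun detectChange(): String }                 // ~ detection unit 910
interface ProcessingUnit { fun buildEvent(change: String): String }    // ~ processing unit 920
interface ExecutionUnit { fun execute(event: String) }                 // ~ execution unit 930

class Device900(
    private val detector: DetectionUnit,
    private val processor: ProcessingUnit,
    private val executor: ExecutionUnit
) {
    fun onUserOperation() {
        val change = detector.detectChange()
        val event = processor.buildEvent(change)
        executor.execute(event)
    }
}

fun main() {
    val device = Device900(
        object : DetectionUnit { override fun detectChange() = "space bar pressed" },
        object : ProcessingUnit {
            override fun buildEvent(change: String) = "event[$change + status snapshot]"
        },
        object : ExecutionUnit { override fun execute(event: String) = println("handle $event") }
    )
    device.onUserOperation()
}
```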
  • the device 900 provided by the embodiment of the present application is used to execute the above input method, and therefore can achieve the same effect as the above input method.
  • An embodiment of the present application also provides an electronic device, including: a display screen (touch screen), a processor, a power button, a memory, an application program, and a computer program.
  • Each of the above devices can be connected through one or more communication buses.
  • the one or more computer programs are stored in the above-mentioned memory and configured to be executed by the one or more processors.
  • the one or more computer programs include instructions, and the instructions can be used to cause the electronic device to execute the steps of the input methods in the above embodiments.
  • the above-mentioned processor may be the processor 110 shown in FIG. 1
  • the above-mentioned memory may specifically be the internal memory 120 shown in FIG. 1 and/or an external memory connected to the electronic device.
  • Figure 10 is a schematic diagram of the hardware structure of the device 1000 provided by the embodiment of the present application.
  • the device 1000 shown in FIG. 10 includes a memory 1010, a processor 1020, a communication interface 1030 and a bus 1040.
  • the memory 1010, the processor 1020, and the communication interface 1030 implement communication connections between each other through the bus 1040.
  • Memory 1010 may be ROM, static storage device, dynamic storage device or RAM.
  • the memory 1010 can store programs. When the program stored in the memory 1010 is executed by the processor 1020, the processor 1020 is used to execute various steps of the input method of the embodiment of the present application.
  • the processor 1020 may use a general-purpose CPU, microprocessor, ASIC, GPU or one or more integrated circuits to execute relevant programs, so as to implement the functions required to be performed by the units in the device 1000 in the embodiments of the present application, or to perform the input method of the method embodiments of the present application.
  • the processor 1020 may also be an integrated circuit chip with signal processing capabilities. During the implementation process, each step of the input method of the present application can be completed by instructions in the form of hardware integrated logic circuits or software in the processor 1020 .
  • the above-mentioned processor 1020 can also be a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component. It can implement or execute each method, step and logical block diagram disclosed in the embodiments of this application.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory 1010.
  • the processor 1020 reads the information in the memory 1010, and in combination with its hardware, completes the functions required to be performed by the units included in the device 1000 of the embodiment of the present application, or executes the input method of the method embodiment of the present application.
  • the communication interface 1030 uses a transceiver device such as but not limited to a transceiver to implement communication between the device 1000 and other devices or communication networks.
  • Bus 1040 may include a path that carries information between various components of device 1000 (eg, memory 1010, processor 1020, communication interface 1030).
  • although the device 1000 shown in Figure 10 only shows a memory, a processor, and a communication interface, those skilled in the art will understand that in a specific implementation the device 1000 also includes other components necessary for normal operation. Likewise, based on specific needs, the device 1000 may also include hardware components that implement other additional functions. In addition, the device 1000 may include only the components necessary to implement the embodiments of the present application, and does not necessarily include all the components shown in FIG. 10.
  • An embodiment of the present application also provides a chip.
  • the chip includes a processor and a communication interface.
  • the communication interface is used to receive a signal and transmit the signal to the processor.
  • the processor processes the signal so that the input method in any one of the foregoing possible implementations is performed.
  • This embodiment also provides a computer-readable storage medium.
  • Computer instructions are stored in the computer-readable storage medium.
  • when the computer instructions are run on the electronic device, the electronic device is caused to execute the above related method steps, thereby implementing the input method in the above embodiments.
  • This embodiment also provides a computer program product.
  • when the computer program product is run on a computer, it causes the computer to perform the above related steps to implement the input method in the above embodiments.
  • Embodiments of the present application also provide a device.
  • This device may be a chip, a component or a module.
  • the device may include a connected processor and a memory.
  • the memory is used to store computer execution instructions.
  • the processor can execute computer execution instructions stored in the memory, so that the chip executes the input method in each of the above method embodiments.
  • the terms “when” or “after” may be interpreted to mean “if" or “after” or “in response to determining" or “in response to detecting ...”.
  • the phrase “when determining" or “if (stated condition or event) is detected” may be interpreted to mean “if it is determined" or “in response to determining" or “on detecting (stated condition or event)” or “in response to detecting (stated condition or event)”.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or can be integrated into another system, or some features can be ignored, or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.


Abstract

The present application provides an input method and apparatus. The method includes: detecting a user's operation on a first input device and determining change information of the first input device; obtaining an input device status information set, the input device status information set including status information of a second input device; generating a first input event, the first input event including the change information of the first input device and the input device status information set; and performing an operation related to the first input event. Based on this solution, an electronic device can respond effectively to the user's collaborative operation of multiple input devices, the ways in which the user can operate the electronic device can be enriched, and the user experience can be improved.

Description

一种输入方法及装置
本申请要求于2022年4月21日提交中国专利局、申请号为202210420430.3、申请名称为“一种输入方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子设备领域,并且更加具体地,涉及一种输入方法及装置。
背景技术
输入设备是用于人或外部与计算机进行交互的一种装置,用于将数据或信息输入到计算机中。计算机可以通过输入设备获取对应的输入事件,并且针对输入事件进行响应。然而,当用户想要协同操作多个输入设备与计算机进行交互时,计算机可能无法做出正确的响应。
发明内容
本申请实施例提供一种输入方法及装置,能够对用户协同操作多个输入设备进行有效响应。
第一方面,提供了一种输入方法,该方法应用于电子设备,该方法包括:检测用户对第一输入设备的操作,确定该第一输入设备的变化信息;获取输入设备状态信息集,该输入设备状态信息集包括第二输入设备的状态信息;生成第一输入事件,该第一输入事件包括该第一输入设备的变化信息和该输入设备状态信息集;执行与该第一输入事件相关的操作。
在本申请实施例中,在用户操作第一输入设备时生成的第一输入事件包括第一输入设备的变化信息和输入设备状态信息集,输入设备状态信息集包括第二输入设备的状态信息。这样,用户可以协同操作第一输入设备和第二输入设备与电子设备进行交互,电子设备也能够对用户协同操作第一输入设备和第二输入设备进行有效响应,从而避免电子设备在用户协同操作两个输入设备时无法进行正确响应。在另一方面,可以丰富用户对电子设备的操作方式,提升用户体验。
结合第一方面,在一种可能的实现方式中,该输入设备状态信息集还包括该第一输入设备的状态信息。
在本申请实施例中,用户可以协同操作第一输入设备和第二输入设备与电子设备进行交互,并且可以更加丰富用户对电子设备的操作方式,提升用户体验。
可选地,该输入设备状态信息集包括电子设备全部的输入设备的状态信息。
结合第一方面,在一种可能的实现方式中,该方法还包括:根据该第一输入设备的变化信息,更新该输入设备状态信息集中的该第一输入设备的状态信息。
在本申请实施例中,每当检测到用户对输入设备的操作,可以根据输入设备的变化信息对输入设备状态信息集中的输入设备的状态信息进行更新,从而可以保证输入设备的状态信息的有效性,防止电子设备错误地响应用户的操作。
结合第一方面,在一种可能的实现方式中,该方法还包括:检测用户对该第二输入设备的操作,确定该第二输入设备的变化信息;生成第二输入事件,该第二输入事件包括该第二输入设备的变化信息和更新后的该输入设备状态信息集;执行与该第二输入事件相关的操作。
在本申请实施例中,可以是在检测到用户对第一输入设备的操作后,检测到用户对第二输入设备的操作,由于根据第一输入设备的变化信息对输入设备状态信息集进行了更新,第二输入事件可以包括更新后的输入设备状态信息集。基于该方案,用户可以协同操作第一输入设备和第二输入设备与电子设备进行交互,电子设备也能够对用户协同操作第一输入设备和第二输入设备进行有效响应。在另一方面,可以丰富用户对电子设备的操作方式,提升用户体验。
结合第一方面,在一种可能的实现方式中,该方法还包括:根据该第一输入设备的变化信息和该输入设备状态信息集,确定第一输入意图;该第一输入事件还包括该第一输入意图。该执行与该第一输入事件相关的操作,包括:执行与该第一输入意图相关的操作。
在本申请实施例中,可以根据输入设备的变化信息和输入设备状态信息集,识别用户的输入意图。电子设备在响应输入事件时,可以查询输入事件中包括的输入意图,针对输入意图进行响应,即执行与输入意图相关的操作。基于该方案,电子设备可以识别用户的操作对应的输入意图,有利于电子设备对用户的操作做出正确的响应。在另一方面,由于输入设备的变化信息和输入设备状态信息集之间的组合较多,电子设备可能需要对每一种组合进行适配,而基于该方案,电子设备可以不必针对每一种组合进行适配,只需适配输入意图即可。同时,如果需要增加新的操作方式,可以仅增加输入意图对应的操作方式即可,降低了开发的复杂度。
结合第一方面,在一种可能的实现方式中,该根据该第一输入设备的变化信息和该输入设备状态信息集,确定第一输入意图,包括:根据第一映射关系,确定该第一输入设备的变化信息和该输入设备状态信息集对应的第一输入意图。
在本申请实施例中,可以根据映射关系确定用户的输入意图。这样,可以准确快速的识别出用户的输入意图,以便于电子设备进行正确的响应。
结合第一方面,在一种可能的实现方式中,该第一映射关系为系统预定义或用户自定义的。
在本申请实施例中,映射关系为系统预定义或用户自定义的。在用户认为输入意图对应的操作方式不符合自己的习惯时,可以对输入意图对应的操作方式进行修改或增加对应的操作,这样,可以丰富用户对电子设备的操作方式,提升用户体验。
第二方面,提供了一种电子设备,该电子设备包括:
一个或多个处理器;
一个或多个存储器;
该一个或多个存储器存储有一个或多个计算机程序,该一个或多个计算机程序包括指令,当该指令被该一个或多个处理器执行时,使得该电子设备执行以下步骤:
检测用户对第一输入设备的操作,确定该第一输入设备的变化信息;
获取输入设备状态信息集,该输入设备状态信息集包括第二输入设备的状态信息;
生成第一输入事件,该第一输入事件包括该第一输入设备的变化信息和该输入设备状态信息集;
执行与该第一输入事件相关的操作。
结合第二方面,在一种可能的实现方式中,该输入设备状态信息集还包括该第一输入设备的状态信息。
结合第二方面,在一种可能的实现方式中,当该指令被该一个或多个处理器执行时,使得该电子设备执行以下步骤:
根据该第一输入设备的变化信息,更新该输入设备状态信息集中的该第一输入设备的状态信息。
结合第二方面,在一种可能的实现方式中,当该指令被该一个或多个处理器执行时,使得该电子设备执行以下步骤:
检测用户对该第二输入设备的操作,确定该第二输入设备的变化信息;
生成第二输入事件,该第二输入事件包括该第二输入设备的变化信息和更新后的该输入设备状态信息集;
执行与该第二输入事件相关的操作。
结合第二方面,在一种可能的实现方式中,当该指令被该一个或多个处理器执行时,使得该电子设备执行以下步骤:
根据该第一输入设备的变化信息和该输入设备状态信息集,确定第一输入意图;
该第一输入事件还包括第一输入意图;
执行与该第一输入意图相关的操作。
结合第二方面,在一种可能的实现方式中,当该指令被该一个或多个处理器执行时,使得该电子设备执行以下步骤:
根据预定义的第一映射关系,确定该第一输入设备的变化信息和该输入设备状态信息集对应的第一输入意图。
结合第二方面,在一种可能的实现方式中,该第一映射关系为系统预定义或用户自定义的。
第三方面,提供了一种电子设备,该电子设备包括执行上述第一方面或者第一方面的任意一种可能的设计的方法的模块/单元;这些模块/单元可以通过硬件实现,也可以通过硬件执行相应的软件实现。
第四方面,提供了一种电子设备,该电子设备包括:
检测单元,用于检测用户对第一输入设备的操作,确定该第一输入设备的变化信息;
处理单元,用于获取输入设备状态信息集,该输入设备状态信息集包括第二输入设备的状态信息;
该处理单元还用于生成第一输入事件,该第一输入事件包括该第一输入设备的变化信息和该输入设备状态信息集;
执行单元,用于执行与该第一输入事件相关的操作。
结合第三方面,在一种可能的实现方式中,所述输入设备状态信息集还包括所述第一 输入设备的状态信息。
结合第三方面,在一种可能的实现方式中,所述处理单元还用于根据所述第一输入设备的变化信息,更新所述输入设备状态信息集中的所述第一输入设备的状态信息。
结合第三方面,在一种可能的实现方式中,所述检测单元还用于检测用户对所述第二输入设备的操作,确定所述第二输入设备的变化信息;
所述处理单元还用于生成第二输入事件,所述第二输入事件包括所述第二输入设备的变化信息和更新后的所述输入设备状态信息集;
所述执行单元还用于执行与所述第二输入事件相关的操作。
结合第三方面,在一种可能的实现方式中,所述处理单元还用于根据所述第一输入设备的变化信息和所述输入设备状态信息集,确定第一输入意图;
所述第一输入事件还包括所述第一输入意图;
所述执行单元还用于执行与所述第一输入意图相关的操作。
结合第三方面,在一种可能的实现方式中,所述处理单元具体用于根据第一映射关系,确定所述第一输入设备的变化信息和所述输入设备状态信息集对应的第一输入意图。
结合第三方面,在一种可能的实现方式中,所述第一映射关系为系统预定义或用户自定义的。
应理解,上述几种装置或电子设备对应的具体实现方式以及有益效果在上述方法实施例中已经详细说明,具体可参考上述方法实施例,为了简洁,在此不再赘述。
第五方面,提供了一种芯片,其特征在于,该芯片包括处理器和通信接口,该通信接口用于接收信号,并将该信号传输至该处理器,该处理器处理该信号,使得第一方面或第一方面任意一种实现方式的方法该电子设备执行。
第六方面,提供了一种芯片,该芯片与电子设备中的存储器耦合,用于调用存储器中存储的计算机程序并执行本申请实施例第一方面及其第一方面任一可能设计的技术方案;本申请实施例中“耦合”是指两个部件彼此直接或间接地结合。
第七方面,提供了一种计算机可读存储介质,包括计算机指令,当该计算机指令在电子设备上运行时,使得该电子设备执行第一方面或第一方面任意一种实现方式的方法。
第八方面,提供了一种计算机程序产品,该计算机程序产品包括:计算机程序代码,当该计算机程序代码被运行时,使得该电子设备执行第一方面或第一方面任意一种实现方式的方法。
附图说明
图1是本实施例提供的一种电子设备的结构示意图。
图2是本申请实施例的电子设备的软件结构框图。
图3是本申请实施例提供的一种传统的输入方法的示意图。
图4是本申请实施例提供的一种输入方法的示意图。
图5是本申请实施例提供的另一种输入方法的示意图。
图6是本申请实施例提供的一种输入方法的示意性流程图。
图7是本申请实施例提供的另一种输入方法的示意性流程图。
图8是本申请实施例提供的又一种输入方法的示意性流程图。
图9是本申请实施例提供的一种装置的组成示意图。
图10是本申请实施例提供的一种装置的硬件结构示意图。
具体实施方式
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个、两个或两个以上。术语“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
以下介绍电子设备和用于使用这样的电子设备的实施例。在一些实施例中,电子设备可以是还包含其它功能诸如个人数字助理和/或音乐播放器功能的便携式电子设备,诸如手机、平板电脑、具备无线通讯功能的可穿戴电子设备(如智能手表)等。便携式电子设备的示例性实施例包括但不限于搭载或者其它操作系统的便携式电子设备。上述便携式电子设备也可以是其它便携式电子设备,诸如膝上型计算机(Laptop)等。还应当理解的是,在其他一些实施例中,上述电子设备也可以不是便携式电子设备,而是台式计算机。
示例性的,图1示出了本申请实施例提供的一例电子设备100的结构示意图。例如,如图1所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从该存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
其中,I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。GPIO接口可以通过软件配置。
GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用 上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),也可以采用有机发光二极管(organic light-emitting diode,OLED)、有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED)、柔性发光二极管(flex light-emitting diode,FLED)、Miniled、MicroLed、Micro-oLed或量子点发光二极管(quantum dot light emitting diodes,QLED)等材料中的一种所制作的显示面板。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。在一些实施例中,显示屏194还可以集成触控功能,也可以称为触摸屏。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。
内部存储器121可以用于存储计算机可执行程序代码,该可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。耳机接口170D用于连接有线耳机。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。陀螺仪传感器180B可以用于确定电子设备100的运动姿态。气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。距离传感器180F,用于测量距离。指纹传感器180H用于采集指纹。触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。
按键190包括开机键,音量键等。马达191可以产生振动提示。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡接口195用于连接SIM卡。
图2是本申请实施例的电子设备100的软件结构框图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。应用程序层可以包括一系列应用程序包。
如图2所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息,APP1,APP2等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。该数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于 构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图、图像渲染、合成和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层可以包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
应理解,本申请实施例中的电子设备也可以是安装有Windows、Linux、安卓、鸿蒙或苹果等操作系统的电子设备。
在一些实施例中,电子设备100还可以包括多种输入设备,例如鼠标、键盘、触摸屏、相机、麦克风等。例如,图1所示的按键190、摄像头193、麦克风170、显示屏194等。
在一个示例中,输入设备可以是电子设备100的一部分,例如,摄影头193或显示屏194等;在另一个示例中,输入设备可以独立于电子设备100,例如鼠标或麦克风170等。输入设备与电子设备100可以有多种连接方式,输入设备可以通过蓝牙或WiFi与电子设备100无线连接;或者,输入设备可以通过USB接口130与电子设备100有线连接。
在本申请实施例中,电子设备还可能包括陀螺仪传感器、红外光传感器、结构光传感器等输入设备,这些传感器例如可以识别电子设备的姿态变化,周围的环境光变化等。
电子设备100可以包括输入设备对应的驱动程序,如图2所示的显示驱动、摄像头驱动、音频驱动等。驱动程序是计算机和硬件设备(例如,输入设备)进行通信的接口,驱 动程序可以将硬件设备的操作转换成机器语言,或者也可以将系统的指令传达给硬件设备。
在介绍本申请实施例之前,先介绍几个和本申请实施例相关的概念。
输入框架:属于操作系统的一部分,连接硬件设备和应用程序以及其他系统服务,主要用于生成输入事件、分发输入事件到应用程序。
用户界面(user interface,UI)框架:属于操作系统的一部分,用于识别和显示应用程序的UI,并且可以根据输入事件触发应用程序进行响应。
输入事件:用户通过输入设备对计算系统发起的操作在计算机中表示为输入事件。
图3是本申请实施例提供的一种传统的输入方法的示意图。如图3所示,在用户对输入设备进行操作时,电子设备100的内部的信息流沿图3中的箭头所示方向传递。具体流程介绍如下。
用户操作电子设备的输入设备;对应的驱动程序获取到用户的操作信息,将操作信息转换为输入设备变化信息并上报输入框架;输入框架收到输入设备变化信息后,并将输入设备变化信息和状态信息打包在一起形成输入事件,然后分发到UI框架;UI框架接收到输入事件后,找到指向的应用程序中对应的响应代码,交由其处理;应用程序中的响应代码接收到输入事件后,响应用户的操作。
示例性的,用户使用手指点击触摸屏中的浏览器的登录控件,显示驱动获取到触摸屏被点击的位置的操作信息,将其转换为“触摸屏上增加一个触摸点(包括触摸点在屏幕上的位置)”的触摸变化信息并且上报输入框架;输入框架收到驱动程序上报的“触摸屏上增加一个触摸点”的变化信息后,将触摸屏状态信息更新为“触摸屏具有一个接触点(包括触摸点在屏幕上的位置)”,然后将“触摸屏上增加一个触摸点”的变化信息和“触摸屏具有一个触摸点”的状态信息打包成触摸输入事件分发到UI框架;UI框架接收到触摸输入事件后,根据触摸屏被点击的位置,触发浏览器中的登录控件对应的响应代码,执行登录逻辑。
用户在使用电子设备办公时可能会用到多个输入设备,该多个输入设备可以共用一个电子设备。例如,在一些场景下,用户同时使用鼠标、键盘和触摸屏进行办公,鼠标、键盘和触摸屏可以与同一个电子设备相连接。然而,当用户同时操作两个以上的输入设备与电子设备进行交互时,每个输入设备均会独立的上报输入事件,如图3所示的鼠标输入事件、按键输入事件等,应用程序会分别针对鼠标输入事件和按键输入事件独立进行响应。上报鼠标输入事件时,应用程序无法获取其他输入设备(例如键盘、触摸屏)的状态信息。在一些情况下,由于两种输入事件之间存在矛盾,可能造成应用程序无法做出正确的响应。例如,用户使用触摸屏将应用程序向左拖动,并且使用鼠标将应用程序向右拖动,此时,电子设备可能只响应其中一个操作,或者无法响应用户的操作。
图4是本申请实施例提供的一种输入方法的示意图。电子设备100的内部的信息流沿图4中的箭头所示方向传递,具体介绍如下。
以触摸输入事件为例进行说明。用户使用手指按压触摸屏中的第一界面的右上角并向右上方拖动;对应的驱动程序获取到用户的操作信息,将操作信息转换为触摸屏变化信息并上报输入框架;输入框架将触摸屏变化信息和输入设备状态信息集打包形成输入事件,然后分发到UI框架;UI框架接收到输入事件后,找到指向的应用程序中对应的响应代码,交由其处理;应用程序中的响应代码接收到输入事件后,响应用户的操作。其中,输入设 备状态信息集是指电子设备的全部输入设备当前的状态信息。例如,电子设备的输入设备包括鼠标和触摸屏,输入设备状态信息集包括鼠标状态信息和触摸屏状态信息。
示例性的,在上述示例中,鼠标的状态信息为:鼠标位于第一界面的左下角,处于左键按下的状态。触摸屏变化信息为:触摸屏上增加一个触摸点,位于第一界面的右上角,且该触摸点向右上方移动。触摸输入事件包括触摸屏变化信息和鼠标的状态信息。应用程序接收到该触摸输入事件后,响应于该触摸输入事件,放大第一界面。
在另一个示例中,鼠标的状态信息为:鼠标位于第一界面的左下角,处于无操作的状态。触摸屏变化信息为:触摸屏上增加一个触摸点,位于第一界面的右上角,且该触摸点向右上方移动。触摸输入事件包括触摸屏变化信息和鼠标的状态信息。应用程序接收到该触摸输入事件后,响应于该触摸输入事件,将第一界面向远离鼠标位置的方向拖动。
需要说明的是,图4中的鼠标输入事件和触摸输入事件为两个独立的输入事件,但是鼠标输入事件和触摸输入事件中包括的输入设备状态信息集均包括另一个输入设备的状态信息。
在上述触摸输入事件的示例中,输入设备状态信息集中的鼠标状态信息为:鼠标位于第一界面的左下角,处于左键按下的状态。可以理解,在上报触摸输入事件之前,已经上报鼠标输入事件。该鼠标输入事件的鼠标变化信息为:移动到第一界面左下角并且按下左键,并且根据该鼠标变化信息更新鼠标状态信息。更新后的鼠标状态信息为:鼠标位于第一界面的左下角,处于左键按下的状态。该更新后的鼠标状态信息可以作为后续触摸输入事件中的输入设备状态信息集的一部分。
应理解,在本申请实施例中,可以根据输入设备的变化信息更新输入设备的状态信息。
图5是本申请实施例提供的另一种输入方法的示意图。电子设备100的内部的信息流沿图5中的箭头所示方向传递,流程与图4所示实施例类似,具体可参考图4所示实施例的描述,在此仅对不同之处进行介绍。
在本实施例中,输入事件可以包括输入意图,应用程序可以针对输入意图进行响应。
在一个示例中,用户按下键盘中的S键。则按键输入事件可以包括输入意图、按键变化信息和输入设备状态信息集。输入设备状态信息集可以包括按键状态信息和鼠标状态信息。其中,按键变化信息为:S键从弹起状态变化为按下状态,按键状态信息为:WINDOWS键和CTRL键处于被按下的状态且S键处于被按下的状态,鼠标状态信息为:无操作状态,输入意图为:截图。截图应用程序接收到按键输入事件后,响应按键输入事件中的输入意图,执行截图操作。
在一个示例中,用户按下键盘中的A键。则按键输入事件包括输入意图、按键变化信息和输入设备状态信息集。输入设备状态信息集包括按键状态信息和鼠标状态信息。其中,按键变化信息为:A键从弹起状态变化为按下状态,按键状态信息为:ALT键处于被按下的状态且A键处于被按下的状态,鼠标状态信息为:无操作状态,输入意图为:截图。截图应用程序接收到按键输入事件后,响应输入意图,执行截图操作。
在一个示例中,用户使用指关节双击屏幕。则手势输入事件包括输入意图、手势变化信息和输入设备状态信息集。输入设备状态信息集包括按键状态信息。其中,手势变化信息为:屏幕被指关节敲击两次,按键状态信息为:无操作状态,输入意图为:截图。截图应用程序接收到该按键输入事件后,响应输入意图,执行截图操作。
在其他示例中,电子设备的输入设备还可能是麦克风、陀螺仪传感器、摄像头或红外光传感器等。
在一个示例中,电子设备的输入设备包括红外光传感器和麦克风,输入事件包括“接收到用户的语音:向下滚动”(麦克风变化信息),以及用户正在注视屏幕(红外光传感器状态信息),则该输入事件对应的输入意图为:向下滚动,电子设备接收到该输入事件后,响应“向下滚动”的输入意图,控制页面向下滚动。
应理解,“用户正在注视屏幕”的状态信息可以是由红外光传感器进行识别并输入到电子设备中的。例如,可以是在接收到用户的语音前,红外光传感器识别到“用户注视屏幕”的变化信息,然后根据该变化信息更新输入设备状态信息集中的红外光传感器状态信息为“用户正在注视屏幕”。在一个示例中,电子设备包括按键和陀螺仪传感器,输入事件包括:电源键被双击(按键变化信息),以及手机正处于抬起姿态(陀螺仪传感器状态信息),则该输入事件对应的输入意图为:打开支付界面,电子设备接收到该输入事件后,响应“打开支付界面”的输入意图,打开相应的支付界面。
应理解,“手机正处于抬起姿态”的状态信息可以是由陀螺仪传感器进行识别并输入到电子设备中的。例如,可以是用户双击电源键前,陀螺仪传感器识别到“手机由水平姿态变化为抬起姿态”的变化信息,然后根据该变化信息更新输入设备状态信息集中的陀螺仪传感器状态信息为“手机处于抬起姿态”。在本实施例中,可以根据输入设备的变化信息以及输入设备状态信息集识别输入意图,并将输入意图打包进输入事件中,电子设备可以响应输入意图,而不是响应具体的操作(例如,响应截图意图,而不是响应ALT+A键)。
应理解,识别输入意图的方式可以是建立输入设备的变化信息和输入设备状态信息集与输入意图之间的映射关系。例如,“截图”的输入意图可以对应“鼠标左键单击(变化信息)+鼠标右键被按下状态+ALT键处于被按下状态”,也可以对应“鼠标右键单击+ALT键处于被按下状态”,还可以对应“屏幕被指关节敲击一次(变化信息)+电源键处于被按下状态”。
还应理解,上述输入设备的变化信息和输入设备状态信息集与输入意图之间的映射关系可以是系统预定义的,具体可以根据用户习惯进行定义,本申请不予限定。
在一些实施例中,用户可以为输入意图定义或增加其对应的操作。例如,用户可以建立“截图”的输入意图与“鼠标左键单击(变化信息)+CTRL键处于被按下的状态”之间的映射关系。
还应理解,上述示例仅以两个输入设备之间的组合为例进行了说明,输入设备的数量可以是两个以上,输入设备状态信息集可以包括两个以上的输入设备的状态信息,例如,上述输入设备状态信息集可以包括鼠标状态信息、按键状态信息、触摸屏状态信息、传感器状态信息等,本申请不予限定。
在一些实施例中,输入设备状态信息集包括电子设备的全部输入设备的状态信息。
可选地,当没有识别出输入意图时,电子设备可以响应具体的输入事件(即根据输入事件中的输入设备的变化信息和输入设备状态信息集进行响应)。
图6是本申请实施例提供的一种输入方法的示意性流程图。在本实施例中,以电子设备包括第一输入设备和第二输入设备进行说明。如图6所示,该方法包括:
S610,第一输入设备检测到用户的操作。
在一个示例中,第一输入设备可以是鼠标,当用户使用鼠标进行操作时,第一输入设备检测到的操作可以有鼠标移动、左键单击、右键单击、左键双击、左键释放和滚轮滑动(向上或向下)等。
在一个示例中,第一输入设备可以是键盘,当用户使用键盘进行操作时,第一输入设备检测到的操作可以有某个键被按下、某个键弹起等。
在一个示例中,第一输入设备可以是麦克风,当用户使用麦克风进行操作时,第一输入设备可以检测到用户的语音输入。
在一个示例中,第一输入设备可以是触摸屏,当用户使用触摸屏进行操作时,第一输入设备检测到的操作可以有单击、双击、手势(触摸点在触摸屏上移动)。
在一个示例中,第一输入设备可以是陀螺仪传感器,当用户移动电子设备时,第一输入设备可以检测到电子设备的姿态的变化。S620,第一设备驱动获取来自第一输入设备的第一操作信息。
在第一输入设备检测到用户的操作后,第一输入设备对应的第一设备驱动可以获取到第一操作信息。第一操作信息可以是用户的具体操作的数字表示或者电信号等。例如,用户按下空格键的动作在计算机中的机器语言的表示。
S630,第一设备驱动向处理模块发送第一输入设备的变化信息,相应的,处理模块接收第一输入设备的变化信息。
输入设备的变化信息是指由于用户的操作导致输入设备状态的变化的信息。例如,用户使用手指按压触摸屏,电子设备可以获取到触摸屏被按压的位置和点击按压的时间,触摸屏变化信息为“触摸屏上增加一个触摸点(包括触摸点在屏幕上的位置和按压的时间)”。例如,键盘从弹起状态变化为按压状态的动作的描述。又例如,用户在手机屏幕上点击一次,设备驱动可以获取到屏幕被点击的位置,此时,第一输入设备的变化信息为“屏幕上增加一个触摸点(可以包括触摸点在屏幕上的位置和按压的时间)”。
在本实施例中,第一输入设备可以将第一操作信息转换为对第一输入设备的状态的变化的描述,也就是第一输入设备的变化信息。
S640,处理模块获取输入设备状态信息集。
在本申请实施例中,输入设备状态信息集可以包括电子设备的全部输入设备的状态信息。例如,电子设备包括第一输入设备和第二输入设备,输入设备状态信息集可以包括第一输入设备的状态信息和第二输入设备的状态信息。示例性的,电子设备可以包括鼠标、键盘和触摸屏,输入设备状态信息集可以包括鼠标状态信息、按键状态信息、触摸状态信息等。例如,按键状态信息可以是空格键被按下的状态或者空格键弹起的状态;又例如,触摸状态信息可以是屏幕上无触摸点或屏幕上具有一个触摸点。
可选地,第一输入设备和第二输入设备不同。其中,第一输入设备和第二输入设备不同可以是指第一输入设备和第二输入设备的类型不同,例如第一输入设备为鼠标,第二输入设备为键盘;第一输入设备和第二输入设备不同还可以是指第一输入设备和第二输入设备在计算机中的标识或逻辑上有区分,例如第一输入设备为第一鼠标,第二输入设备为第二鼠标。
可选地,输入设备状态信息集包括第二输入设备的状态信息。
处理模块可以获取多个输入设备当前的状态信息,以获取输入设备状态信息集。处理 模块可以在本地维护输入设备状态信息集,每当输入设备的状态信息发生改变时,根据输入设备的变化信息更新输入设备状态信息集。
例如,处理模块可以预存输入设备状态信息集,其中输入设备状态信息集中的第一输入设备的状态信息和第二输入设备的状态信息均为默认的状态信息。例如,按键状态信息默认为弹起状态,鼠标状态信息默认为:在屏幕中的位置信息以及无操作状态(例如,左键、右键均为弹起状态)。处理模块获取到第一输入设备的变化信息后,根据第一输入设备的变化信息更新输入设备状态信息集中的第一输入设备的状态信息。
在一些可能的情况下,处理模块可以有多种方式获取输入设备状态信息集。在一个示例中,处理模块还可以查询多个输入设备的状态信息,以获取输入设备状态信息集。在另一个示例中,处理模块还可以接收多个输入设备上报的状态信息,以获取输入设备状态信息集。
S650,处理模块根据第一输入设备的变化信息和输入设备状态信息集,生成第一输入事件。
在本申请实施例中,第一输入事件可以包括第一输入设备的变化信息和输入设备状态信息集,输入设备状态信息集包括第二输入设备的状态信息。
可选地,第一输入事件可以包括第一输入设备的变化信息和输入设备状态信息集,输入设备状态信息集包括第一输入设备的状态信息和第二输入设备的状态信息。
S660,处理模块向执行模块发送第一输入事件,相应的,执行模块接收来自处理模块的第一输入事件。
处理模块可以根据输入事件分发的规则向具体的执行模块发送第一输入事件。
例如,处理模块可以优先向系统级服务发送第一输入事件。又例如,执行模块可以是应用程序,处理模块可以向当前处于活动状态的应用程序发送第一输入事件。
可选地,当前有两个以上的处于活动状态的应用程序,处理模块分别向当前处于活动状态的应用程序发送第一输入事件。
可选地,处理模块向后台运行的应用程序发送第一输入事件。
在一些实施例中,第一输入事件是指向型的输入事件,例如,鼠标点击具体的图标、链接、控件等,处理模块根据第一输入事件,向第一输入事件指向的应用程序发送第一输入事件。即,处理模块根据第一输入事件,分发第一输入事件。
可选地,该执行模块是第一输入事件指向的执行模块。
S670,执行模块响应第一输入事件。
在本申请实施例中,第一输入事件可以包括第一输入设备的变化信息和输入设备状态信息集,其中,输入设备状态信息集可以包括第二输入设备的状态信息。这样,第一输入设备上报的输入事件还可以包括其他输入设备的状态信息,执行模块可以对用户使用多种输入设备协同操作进行有效响应。
执行模块响应第一输入事件可以是执行模块执行与第一输入事件相关的操作。
示例性的,第一输入事件为鼠标输入事件,当鼠标的位置位于某一应用程序的图标上,并且鼠标左键被按下,则该鼠标输入事件相关的操作为打开该应用程序。
示例性的,第一输入事件为按键输入事件,当A键被按下,则该按键输入事件相关的操作为输入A字母。
示例性的,第一输入设备可以是麦克风,第二输入设备可以是红外光传感器,第一输入事件包括第一输入设备的变化信息:接收到用户的语音“向下滚动”,以及第二输入设备的状态信息:用户正在注视屏幕,则与第一输入事件相关的操作为:控制页面向下滚动。
示例性的,第一输入设备可以是按键,第二输入设备可以是陀螺仪传感器,第一输入事件包括第一输入设备的变化信息:电源键被双击,以及第二输入设备的状态信息:手机正处于抬起姿态,则与第一输入事件相关的操作为:电子设备显示支付界面。
应理解,在本实施例中,电子设备可以包括第一输入设备、第二输入设备、第一设备驱动、处理模块和执行模块。
图7是本申请实施例提供的另一种输入方法的示意性流程图。在本实施例中,以电子设备包括第一输入设备和第二输入设备进行说明。如图7所示,该方法包括:
S710,第一输入设备检测到用户的操作。
S720,第一设备驱动获取来自第一输入设备的第一操作信息。
S730,第一设备驱动向处理模块发送第一输入设备的变化信息,相应的,处理模块接收第一输入设备的变化信息。
S740,处理模块获取输入设备状态信息集。
S710-S740与S610-S640类似,具体可参考S610-S640的相关描述。
S750,处理模块根据第一输入设备的变化信息和输入设备状态信息集,确定第一输入意图,并生成第一输入事件。
在本申请实施例中,可以根据第一输入设备的变化信息和输入设备状态信息集,确定第一输入意图。其中,第一输入设备的变化信息以及设备状态信息集与第一输入意图之间具有第一映射关系。
以几个示例对第一映射关系进行说明。例如,“鼠标左键单击+键盘ALT键处于被按下状态”、“触摸点从屏幕左边缘向右滑动+键盘ALT键处于被按下状态”、“鼠标左键单击+键盘ALT键处于被按下状态”、“触摸点从屏幕左边缘向右滑动+键盘ALT键处于被按下状态”可以均对应第一输入意图。
其中,第一映射关系可以是开发人员或者系统预先定义的,例如,可以通过大数据确定用户习惯,然后根据用户习惯进行定义,本申请不予限定。
在一些实施例中,用户还可以为第一输入意图定义或增加其对应的操作。例如,用户可以建立“截图”的输入意图与“鼠标左键单击(变化信息)+CTRL键处于被按下的状态(状态信息)”之间的映射关系。也就是说,用户可以自定义第一输入设备的变化信息以及设备状态信息集与第一输入意图之间的第一映射关系。
在本实施例中,处理模块将第一输入设备的变化信息、输入设备状态信息集和第一输入意图打包形成第一输入事件。也就是说,第一输入事件包括第一输入设备的变化信息、输入设备状态信息集和第一输入意图。
S760,处理模块向执行模块发送第一输入事件,相应的,执行模块接收来自处理模块的第一输入事件。
S760与S660类似,具体可参考S660的相关描述。
S770,执行模块响应第一输入意图。
当执行模块接收到第一输入事件后,可以查询第一输入事件中的第一输入意图,针对 该第一输入意图进行响应。即,执行模块执行与第一输入意图相关的操作。
例如,第一输入意图为截图,则执行模块执行截图操作;例如,第一输入意图为录音,则执行模块执行录音操作;又例如,第一输入意图为播放音乐,则执行模块执行播放音乐操作。
在一些场景下,第一输入事件中没有携带第一输入意图,则针对第一输入事件中的第一输入设备的变化信息和输入设备状态信息集进行响应。在这种情况下,S770与S670类似,具体可以参考S670的相关描述。
在另一些场景下,执行模块不能够针对第一输入意图进行响应,例如,执行模块为文本阅读软件,不具备播放音乐的功能,但是第一输入事件中的第一输入意图为播放音乐,执行模块可以针对第一输入事件中的第一输入设备的变化信息和输入设备状态信息集进行响应。例如,执行模块为文本阅读软件,第一输入事件包括按下电源键(变化信息)和触摸屏具有一个触摸点(状态信息),第一输入意图为播放音乐,由于文本阅读软件不能响应该播放音乐的意图,则可以根据第一输入事件中的按下电源键(变化信息)和触摸屏具有一个触摸点(状态信息)进行响应。例如按下电源键(变化信息)和触摸屏具有一个触摸点(状态信息)对应打开文本阅读软件的设置,则电子设备显示文本阅读软件的设置界面。
应理解,上述场景关乎到输入事件的分发,如果输入事件仅分发到该不能响应第一输入意图的执行模块,执行模块可以针对第一输入事件中的第一输入设备的变化信息和输入设备状态信息集进行响应。
在又一些场景下,如果第一执行模块不能够针对第一输入事件中的第一输入意图进行响应,则电子设备可以将第一输入事件分发到可以响应第一输入意图的第二执行模块。
例如,在上述示例中,文本阅读软件不能响应播放音乐的输入意图,电子设备还可以将该输入事件分发到音乐播放器,由音乐播放器响应播放音乐的输入意图。
图8是本申请实施例提供的又一种输入方法的示意性流程图。该方法由电子设备执行,以电子设备包括第一输入设备和第二输入设备进行说明,如图8所示,该方法包括:
S810,检测用户对第一输入设备的操作,确定该第一输入设备的变化信息。
S810与S610以及S620类似,具体可参考S610和S620的相关描述。在S810中,电子设备可以执行图6或图7中的第一输入设备或第一设备驱动执行的动作或步骤。
S820,获取输入设备状态信息集,该输入设备状态信息集包括第二输入设备的状态信息。
在S820中,电子设备可以执行图6或图7中的处理模块执行的动作或步骤,或者电子设备可以执行图4或图5中输入框架执行的动作或步骤。
在本申请实施例中,输入设备状态信息集至少包括第二输入设备的状态信息。
可选地,输入设备状态信息集可以包括电子设备的全部输入设备的状态信息。例如,电子设备包括第一输入设备和第二输入设备,输入设备状态信息集可以包括第一输入设备的状态信息和第二输入设备的状态信息。
可选地,第一输入设备和第二输入设备不同。其中,第一输入设备和第二输入设备不同可以是指第一输入设备和第二输入设备的类型不同,例如第一输入设备为鼠标,第二输入设备为键盘;第一输入设备和第二输入设备不同还可以是指第一输入设备和第二输入设 备在计算机中的标识或逻辑上有区分,例如第一输入设备为第一鼠标,第二输入设备为第二鼠标。
可选地,根据第一输入设备的变化信息,更新输入设备状态信息集中的第一输入设备的状态信息。
应理解,每当输入设备的状态发生变化时,根据输入设备的变化信息,更新输入设备状态信息集。这样,可以保证输入设备状态信息集中的输入设备的状态信息的有效性,有利于电子设备正确地响应用户的操作。
S830,生成第一输入事件,该第一输入事件包括该第一输入设备的变化信息和该输入设备状态信息集。
在S830中,电子设备可以执行图6或图7中处理模块执行的动作或步骤,或者电子设备可以执行图4或图5中输入框架或UI框架执行的动作或步骤。
可选地,根据第一输入设备的变化信息和输入设备状态信息集,确定第一输入意图;该第一输入事件还包括该第一输入意图。
可选地,根据第一映射关系,确定第一输入设备的变化信息和输入设备状态信息集对应的第一输入意图。
在本申请实施例中,可以根据映射关系确定用户的输入意图。这样,可以准确快速的识别出用户的输入意图,以便于电子设备进行正确的响应。
可选地,该第一映射关系为系统预定义或用户自定义的。
在本申请实施例中,映射关系为系统预定义或用户自定义的。在用户认为输入意图对应的操作方式不符合自己的习惯时,可以对输入意图对应的操作方式进行修改或增加对应的操作,这样,可以丰富用户对电子设备的操作方式,提升用户体验。
S840,执行与该第一输入事件相关的操作。
在S840中,电子设备可以执行图6或图7中执行模块执行的动作或步骤,或者电子设备可以执行图4或图5中应用程序执行的动作或步骤。
可选地,执行与第一输入意图相关的操作。
在本申请实施例中,可以根据输入设备的变化信息和输入设备状态信息集,识别用户的输入意图。电子设备在响应输入事件时,可以查询输入事件中包括的输入意图,针对输入意图进行响应,即执行与输入意图相关的操作。基于该方案,电子设备可以识别用户的操作对应的输入意图,有利于电子设备对用户的操作做出正确的响应。在另一方面,由于输入设备的变化信息和输入设备状态信息集之间的组合较多,电子设备可能需要对每一种组合进行适配,而基于该方案,电子设备可以不必针对每一种组合进行适配,只需适配输入意图即可。同时,如果需要增加新的操作方式,可以仅增加输入意图对应的操作方式即可,降低了开发的复杂度。
在一些实施例中,电子设备可以检测用户对该第二输入设备的操作,确定该第二输入设备的变化信息;生成第二输入事件,该第二输入事件包括该第二输入设备的变化信息和更新后的该输入设备状态信息集;执行与该第二输入事件相关的操作。
在本申请实施例中,可以是在检测到用户对第一输入设备的操作后,检测到用户对第二输入设备的操作,由于根据第一输入设备的变化信息对输入设备状态信息集进行了更新,第二输入事件可以包括更新后的输入设备状态信息集。基于该方案,用户可以协同操作第 一输入设备和第二输入设备与电子设备进行交互,电子设备也能够对用户协同操作第一输入设备和第二输入设备进行有效响应。在另一方面,可以丰富用户对电子设备的操作方式,提升用户体验。
上述主要从电子设备的角度对本申请实施例提供的输入方法进行了介绍。可以理解的是,电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对电子设备中的处理器进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用了对应各个功能的划分各个功能模块的情况下,图9示出了本申请实施例提供的一种装置900的组成示意图,如图9所示,该装置900包括:检测单元910,处理单元920和执行单元930。
其中,检测单元910,用于检测用户对第一输入设备的操作,确定第一输入设备的变化信息。示例性地,检测单元910可以用于执行图8中的S810,或者执行图6和图7中的第一输入设备和第一设备驱动执行的动作或步骤,或者执行图4或图5中的鼠标、键盘等输入设备或设备驱动执行的动作或步骤。
处理单元920,用于获取输入设备状态信息集,输入设备状态信息集包括第二输入设备的状态信息;生成第一输入事件,该第一输入事件包括第一输入设备的变化信息和输入设备状态信息集。示例性地,处理单元可以用于执行图8中的S820和S830,或者执行图6和图7中的处理模块执行的动作或步骤,或者执行图4或图5中的输入框架或UI框架执行的动作或步骤。
执行单元930,用于执行与第一输入事件相关的操作,或者用于执行与第一输入意图相关的操作。示例性的,用于执行图8中的S840,或者执行图6或图7中的执行模块执行的动作或步骤,或者执行图4或图5中的应用程序执行的动作或步骤。
需要说明的是,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。本申请实施例提供的装置900,用于执行上述输入方法,因此可以达到与上述输入方法相同的效果。
本申请实施例还提供了一种电子设备,包括:显示屏(触摸屏)、处理器、电源键、存储器、应用程序以及计算机程序。上述各器件可以通过一个或多个通信总线连接。其中,该一个或多个计算机程序被存储在上述存储器中并被配置为被该一个或多个处理器执行,该一个或多个计算机程序包括指令,上述指令可以用于使电子设备执行上述各实施例中的输入方法的各个步骤。
示例性地,上述处理器具体可以为图1所示的处理器110,上述存储器具体可以为图 1所示的内部存储器120和/或与电子设备连接的外部存储器。
图10是本申请实施例提供的装置1000的硬件结构示意图。图10所示的装置1000(该装置1000具体可以是一种电子设备)包括存储器1010、处理器1020、通信接口1030以及总线1040。其中,存储器1010、处理器1020、通信接口1030通过总线1040实现彼此之间的通信连接。
存储器1010可以是ROM,静态存储设备,动态存储设备或者RAM。存储器1010可以存储程序,当存储器1010中存储的程序被处理器1020执行时,处理器1020用于执行本申请实施例的输入方法的各个步骤。
处理器1020可以采用通用的CPU,微处理器,ASIC,GPU或者一个或多个集成电路,用于执行相关程序,以实现本申请实施例的装置1000中的单元所需执行的功能,或者执行本申请方法实施例的输入方法。
处理器1020还可以是一种集成电路芯片,具有信号的处理能力。在实现过程中,本申请的输入方法的各个步骤可以通过处理器1020中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器1020还可以是通用处理器、DSP、ASIC、FPGA或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器1010,处理器1020读取存储器1010中的信息,结合其硬件完成本申请实施例的装置1000中包括的单元所需执行的功能,或者执行本申请方法实施例的输入方法。
通信接口1030使用例如但不限于收发器一类的收发装置,来实现装置1000与其他设备或通信网络之间的通信。
总线1040可包括在装置1000各个部件(例如,存储器1010、处理器1020、通信接口1030)之间传送信息的通路。
应注意,尽管图10所示的装置1000仅仅示出了存储器、处理器、通信接口,但是在具体实现过程中,本领域的技术人员应当理解,装置1000还包括实现正常运行所必须的其他器件。同时,根据具体需要,本领域的技术人员应当理解,装置1000还可包括实现其他附加功能的硬件器件。此外,本领域的技术人员应当理解,装置1000也可仅仅包括实现本申请实施例所必须的器件,而不必包括图10中所示的全部器件。
本申请实施例还提供一种芯片,该芯片包括处理器和通信接口,该通信接口用于接收信号,并将该信号传输至该处理器,该处理器处理该信号,使得如前文中任一种可能的实现方式中的输入方法被执行。
本实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机指令,当该计算机指令在电子设备上运行时,使得电子设备执行上述相关方法步骤实现上述实施例中的输入方法。
本实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述相关步骤,以实现上述实施例中的输入方法。
另外,本申请的实施例还提供一种装置,这个装置具体可以是芯片,组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使芯片执行上述各方法实施例中的输入方法。
以上实施例中所用,根据上下文,术语“当…时”或“当…后”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (17)

  1. 一种输入方法,所述方法应用于电子设备,其特征在于,所述方法包括:
    检测用户对第一输入设备的操作,确定所述第一输入设备的变化信息;
    获取输入设备状态信息集,所述输入设备状态信息集包括第二输入设备的状态信息;
    生成第一输入事件,所述第一输入事件包括所述第一输入设备的变化信息和所述输入设备状态信息集;
    执行与所述第一输入事件相关的操作。
  2. 根据权利要求1所述的方法,其特征在于,所述输入设备状态信息集还包括所述第一输入设备的状态信息。
  3. 根据权利要求2所述的方法,其特征在于,所述方法还包括:
    根据所述第一输入设备的变化信息,更新所述输入设备状态信息集中的所述第一输入设备的状态信息。
  4. 根据权利要求3所述的方法,其特征在于,所述方法还包括:
    检测用户对所述第二输入设备的操作,确定所述第二输入设备的变化信息;
    生成第二输入事件,所述第二输入事件包括所述第二输入设备的变化信息和更新后的所述输入设备状态信息集;
    执行与所述第二输入事件相关的操作。
  5. 根据权利要求1至3中任一项所述的方法,其特征在于,所述方法还包括:
    根据所述第一输入设备的变化信息和所述输入设备状态信息集,确定第一输入意图;
    所述第一输入事件还包括所述第一输入意图;
    所述执行与所述第一输入事件相关的操作,包括:
    执行与所述第一输入意图相关的操作。
  6. 根据权利要求5所述的方法,其特征在于,所述根据所述第一输入设备的变化信息和所述输入设备状态信息集,确定第一输入意图,包括:
    根据第一映射关系,确定所述第一输入设备的变化信息和所述输入设备状态信息集对应的第一输入意图。
  7. 根据权利要求6所述的方法,其特征在于,所述第一映射关系为系统预定义或用户自定义的。
  8. 一种电子设备,其特征在于,所述电子设备包括:
    一个或多个处理器;
    一个或多个存储器;
    所述一个或多个存储器存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行以下步骤:
    检测用户对第一输入设备的操作,确定所述第一输入设备的变化信息;
    获取输入设备状态信息集,所述输入设备状态信息集包括第二输入设备的状态信息;
    生成第一输入事件,所述第一输入事件包括所述第一输入设备的变化信息和所述输入设备状态信息集;
    执行与所述第一输入事件相关的操作。
  9. 根据权利要求8所述的电子设备,其特征在于,所述输入设备状态信息集还包括所述第一输入设备的状态信息。
  10. 根据权利要求9所述的电子设备,其特征在于,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行以下步骤:
    根据所述第一输入设备的变化信息,更新所述输入设备状态信息集中的所述第一输入设备的状态信息。
  11. 根据权利要求10所述的电子设备,其特征在于,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行以下步骤:
    检测用户对所述第二输入设备的操作,确定所述第二输入设备的变化信息;
    生成第二输入事件,所述第二输入事件包括所述第二输入设备的变化信息和更新后的所述输入设备状态信息集;
    执行与所述第二输入事件相关的操作。
  12. 根据权利要求8至10中任一项所述的电子设备,其特征在于,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行以下步骤:
    根据所述第一输入设备的变化信息和所述输入设备状态信息集,确定第一输入意图;
    所述第一输入事件还包括第一输入意图;
    执行与所述第一输入意图相关的操作。
  13. 根据权利要求12所述的电子设备,其特征在于,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行以下步骤:
    根据预定义的第一映射关系,确定所述第一输入设备的变化信息和所述输入设备状态信息集对应的第一输入意图。
  14. 根据权利要求13所述的电子设备,其特征在于,所述第一映射关系为系统预定义或用户自定义的。
  15. 一种芯片,其特征在于,所述芯片包括处理器和通信接口,所述通信接口用于接收信号,并将所述信号传输至所述处理器,所述处理器处理所述信号,使得如权利要求1至7中任一项所述的方法被执行。
  16. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1至7中任一项所述的方法。
  17. 一种计算机程序产品,其特征在于,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码被运行时,实现如权利要求1至7中任一项所述的方法。
PCT/CN2023/087824 2022-04-21 2023-04-12 一种输入方法及装置 WO2023202444A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210420430.3A CN116974361A (zh) 2022-04-21 2022-04-21 一种输入方法及装置
CN202210420430.3 2022-04-21

Publications (1)

Publication Number Publication Date
WO2023202444A1 true WO2023202444A1 (zh) 2023-10-26

Family

ID=88419129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/087824 WO2023202444A1 (zh) 2022-04-21 2023-04-12 一种输入方法及装置

Country Status (2)

Country Link
CN (1) CN116974361A (zh)
WO (1) WO2023202444A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050160105A1 (en) * 2003-12-26 2005-07-21 Samsung Electronics Co., Ltd. Apparatus and method for input management
KR20110046786A (ko) * 2009-10-29 2011-05-06 한국전자통신연구원 멀티포인트 사용자 인터페이스 장치 및 멀티포인트 사용자 인터페이싱 방법
CN103543944A (zh) * 2012-07-17 2014-01-29 三星电子株式会社 执行包括笔识别面板的终端的功能的方法及其终端
CN105690385A (zh) * 2016-03-18 2016-06-22 北京光年无限科技有限公司 基于智能机器人的应用调用方法与装置
CN106020850A (zh) * 2016-06-23 2016-10-12 北京光年无限科技有限公司 在机器人操作系统中关闭应用的方法及装置

Also Published As

Publication number Publication date
CN116974361A (zh) 2023-10-31


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23791099

Country of ref document: EP

Kind code of ref document: A1