WO2024046179A1 - Method and device for processing interactive events - Google Patents

Method and device for processing interactive events

Info

Publication number: WO2024046179A1
Authority: WO (WIPO, PCT)
Application number: PCT/CN2023/114369
Prior art keywords: event, key combination, zoom, electronic device, touch
Other languages: English (en), French (fr)
Inventors: 童辰, 孙科
Original assignee: 荣耀终端有限公司
Application filed by 荣耀终端有限公司
Publication of WO2024046179A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485: Scrolling or panning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present application relates to the field of electronic information technology, and in particular, to a method and device for processing interactive events.
  • touch-screen electronic devices such as mobile phones and tablets
  • external input devices can be used for control
  • there are also situations where these touch-screen electronic devices project their screens to other electronic devices and are controlled from there.
  • This application provides a method and device for processing interactive events, which can realize zooming on a touch-screen electronic device through keyboard and mouse control.
  • the first aspect provides a method for processing interactive events.
  • the method includes: obtaining a key combination event, which is an interaction event used to trigger the zoom function; converting the key combination event into a zoom touch event; and responding to the zoom touch event to zoom the interface.
  • the effect of controlling the zoom function of a touch-screen electronic device through a keyboard and mouse is achieved mainly by converting a key combination event that cannot be responded to into a zoom touch event that can be responded to, and then responding to it. The solution is easy to implement and does not require many changes.
  • converting the key combination event into a zoom touch event may include: converting parameters of the key combination event into corresponding parameters in the zoom touch event.
  • the key combination event is a combination event of a control key event and a mouse wheel event;
  • the parameters of the key combination event include the rolling direction and amount of the wheel;
  • the parameters of the zoom touch event include the pinch direction and pinch distance; converting the parameters of the key combination event into the corresponding parameters of the zoom touch event includes: converting the scroll amount into the pinch distance, and converting the scroll direction into the pinch direction.
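  • To make this conversion concrete, the following is a minimal, self-contained Java sketch of mapping the wheel parameters of a key combination event onto pinch parameters. All class, method, and constant names here (including the pixels-per-wheel-step factor) are invented for illustration; the patent does not disclose its implementation.

        /**
         * Illustrative only: maps a ctrl + wheel key combination event onto the
         * parameters of a zoom touch event (pinch direction and pinch distance).
         */
        public final class KeyComboToPinchConverter {

            /** Input: ctrl state plus signed wheel steps (positive = wheel up). */
            public record KeyComboEvent(boolean ctrlDown, int wheelSteps) {
                /** A pure ctrl event or a pure wheel event is not a zoom combo. */
                boolean isZoomCombo() { return ctrlDown && wheelSteps != 0; }
            }

            /** Output: pinch direction (out = zoom in) and pinch distance. */
            public record PinchParams(boolean pinchOut, float pinchDistancePx) { }

            /** Assumed tuning constant: pinch distance added per wheel step. */
            private static final float PX_PER_WHEEL_STEP = 40f;

            /** Scroll direction -> pinch direction; scroll amount -> distance. */
            public static PinchParams convert(KeyComboEvent combo) {
                if (!combo.isZoomCombo()) {
                    throw new IllegalArgumentException("not a zoom key combination");
                }
                boolean pinchOut = combo.wheelSteps() > 0;  // wheel up => zoom in
                float distance = Math.abs(combo.wheelSteps()) * PX_PER_WHEEL_STEP;
                return new PinchParams(pinchOut, distance);
            }

            public static void main(String[] args) {
                // ctrl + two wheel steps up => pinch out by 80 px (zoom in).
                System.out.println(convert(new KeyComboEvent(true, 2)));
            }
        }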
  • the above method further includes:
  • the key combination event is a zoom key combination event
  • This implementation can avoid responding to pure control key events, and avoid mistakenly responding to pure wheel events as zoom key combination events.
  • when converting the key combination event into a zoom touch event, the method may also include:
  • This implementation can improve the accuracy of processing continuously triggered key combination events.
  • when obtaining the key combination event, the following may be included:
  • the above method further includes:
  • scaling the interface in response to a zoom touch event includes: scaling the running interface of the foreground application in response to a zoom touch event; or,
  • when determining whether the foreground application supports scaling, the following may be included:
  • if the foreground application is an application in the application whitelist, the foreground application is considered to support scaling; or,
  • if the foreground application is not an application in the whitelist, the foreground application is considered not to support scaling.
  • the above method further includes:
  • the key combination event is discarded.
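  • As a minimal sketch of the whitelist decision and discard behavior described above: assuming the whitelist is a simple in-memory set of package names (the patent does not specify how the whitelist is stored or populated, so all names below are hypothetical), the convert-or-discard choice reduces to a membership test.

        import java.util.Set;

        /** Illustrative only: decide whether to convert or discard a zoom combo. */
        final class ZoomWhitelistPolicy {

            /** Hypothetical whitelist of foreground apps that support scaling. */
            private static final Set<String> WHITELIST = Set.of(
                    "com.example.gallery",   // invented package names
                    "com.example.browser");

            /**
             * If the foreground app is whitelisted, the zoom key combination event
             * is converted into a zoom touch event and responded to; otherwise the
             * key combination event is discarded.
             */
            static boolean shouldConvert(String foregroundPackage) {
                return WHITELIST.contains(foregroundPackage);
            }

            public static void main(String[] args) {
                System.out.println(shouldConvert("com.example.gallery")); // true
                System.out.println(shouldConvert("com.example.mail"));    // false: discard
            }
        }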
  • a second aspect provides an interactive event processing device, which includes units composed of software and/or hardware for executing any one of the methods of the first aspect.
  • the device includes:
  • the acquisition unit is used to obtain the key combination event, which is an interactive event used to trigger the zoom function;
  • a processing unit, used to:
  • convert the key combination event into a zoom touch event, and zoom the interface in response to the zoom touch event
  • the processing unit is specifically configured to: convert parameters of the key combination event into corresponding parameters in the zoom touch event.
  • the key combination event is a combination event of a control key event and a mouse wheel event;
  • the parameters of the key combination event include the rolling direction and amount of the wheel;
  • the parameters of the zoom touch event include the pinch direction and pinch distance;
  • the processing unit is specifically used for:
  • the processing unit is also used for:
  • the key combination event is a zoom key combination event
  • the processing unit is also used for:
  • the acquisition unit is specifically used for:
  • the processing unit is also used for:
  • scaling the interface in response to a zoom touch event includes: scaling the running interface of the foreground application in response to a zoom touch event; or,
  • the processing unit is specifically used for:
  • if the foreground application is an application in the application whitelist, the foreground application is considered to support scaling; or,
  • if the foreground application is not an application in the whitelist, the foreground application is considered not to support scaling.
  • the processing unit is also used to:
  • the key combination event is discarded.
  • an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • when the processor executes the computer program, any method of the first aspect can be implemented.
  • a chip including a processor.
  • the processor is configured to read and execute a computer program stored in a memory.
  • when the computer program is executed, any method of the first aspect can be implemented.
  • the chip also includes a memory, and the memory is electrically connected to the processor.
  • the chip may also include a communication interface.
  • a computer-readable storage medium stores a computer program.
  • when the computer program is executed, any method of the first aspect can be implemented.
  • a computer program product includes a computer program.
  • when the computer program is executed, any method of the first aspect can be implemented.
  • Figure 1 is a schematic diagram of an interaction scenario according to an embodiment of the present application.
  • Figure 2 is a schematic diagram of another interaction scenario according to an embodiment of the present application.
  • Figure 3 is a schematic diagram of the response process of a zoom function.
  • Figure 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • Figure 5 is a software structure block diagram of an electronic device according to an embodiment of the present application.
  • Figure 6 is a schematic diagram of the interaction event processing process of the first type of interaction scenario in this embodiment of the present application.
  • Figure 7 is a schematic diagram of the interaction event processing process of the second type of interaction scenario in this embodiment of the present application.
  • Figure 8 is a schematic flow chart of an interactive event processing method according to an embodiment of the present application.
  • Figure 9 is a schematic flow chart of another interactive event processing method according to an embodiment of the present application.
  • Figure 10 is a schematic flow chart of yet another interactive event processing method according to an embodiment of the present application.
  • Figure 11 is a schematic diagram of an interactive event processing device according to an embodiment of the present application.
  • the interactive event processing method provided by this application can be applied to keyboard and mouse control scenarios of various touch screen electronic devices.
  • the interactive event processing method provided by this application can be used in touch screen electronic devices that can support control by external keyboard and mouse devices.
  • Figure 1 is a schematic diagram of an interaction scenario according to an embodiment of the present application. It should be understood that the embodiments of the present application mainly include two types of interaction scenarios.
  • the first type of interaction scenario is that the touch screen electronic device is directly connected to a keyboard and mouse, and the external keyboard and mouse are used to control the touch screen electronic device;
  • in the second type of interaction scenario, the touch-screen electronic device projects its screen to a screen projection device, and the external keyboard and mouse of the screen projection device are used to control the touch-screen electronic device.
  • Figure 1 is an example of the first type of interaction scenario
  • Figure 2 is an example of the second type of interaction scenario.
  • the electronic device D is a touch-screen electronic device, that is, an electronic device that can be controlled by performing touch operations such as clicking, double-clicking, or sliding on the screen.
  • Touch screen electronic devices can also be called touch screen devices or touch screen terminals.
  • the electronic device D may be, for example, a mobile phone or a tablet computer, or other electronic devices capable of touch screen control.
  • in Figure 1, the electronic device D is a tablet computer as an example. It should be understood that the main input method of the electronic device D is touch-screen operation, but there may also be input through mechanical keys, such as key operations on the power key and volume keys.
  • the touch screen electronic device in the embodiment of the present application mainly refers to the original main input method being touch screen operation, and the key operation is not its original main input method.
  • for the touch-screen electronic device, the keyboard and mouse are not its original input devices, but only its extended input devices.
  • touch operation on the screen is the original main input method.
  • the keyboard and mouse are essentially designed for computers, so keyboard and mouse control does not completely match a mobile phone.
  • although the keyboard and mouse can control a computer, when the controlled object becomes a mobile phone, some of the original control functions on the mobile phone cannot be realized.
  • wireless connection method can be, for example, Bluetooth connection, but it should be understood that it can also be other wireless connection methods.
  • wireless connection method can also be replaced by a wired connection method, without limitation.
  • the embodiment does not limit the shape and model of keyboard B and mouse C, nor the key layout on keyboard B.
  • for example, the Fn key on keyboard B shown in Figure 1 may not exist, or its actual position may differ from the position shown in the figure; for another example, the icons on the arrow keys may have other representations; there may also be a numeric keypad (small keyboard) area, volume keys, and so on. Other possible variations will not be described one by one.
  • user A can control electronic device D by operating keyboard B and mouse C.
  • interactive events include key events and touch events.
  • Key events are used to represent interaction events generated by operating keys or buttons on the keyboard and/or mouse.
  • Touch events are used to represent interactive events generated by touching the screen.
  • the touch event here is an interaction event generated by a touch operation on the screen of the touch-screen electronic device, which is different from an event generated on the touchpad of a laptop computer.
  • ctrl + scroll wheel can trigger the zoom function: ctrl + scroll wheel up triggers zooming in, and ctrl + scroll wheel down triggers zooming out. A plain scroll wheel operation, by contrast, triggers the page-turning function.
  • the upward scroll wheel corresponds to the upward page turning
  • the downward scroll wheel corresponds to the downward page turning.
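  • On Android, which electronic device D is later said to run, this distinction between ctrl + wheel and a plain wheel scroll can be observed through standard input APIs. The sketch below uses real framework calls (MotionEvent.ACTION_SCROLL, MotionEvent.AXIS_VSCROLL, KeyEvent.META_CTRL_ON); where such a check would sit in the patent's event pipeline is an assumption.

        import android.view.KeyEvent;
        import android.view.MotionEvent;

        /** Illustrative detection of the zoom key combination on Android. */
        final class ZoomComboDetector {

            /** True for ctrl + wheel (zoom); false for a plain scroll (paging). */
            static boolean isZoomCombo(MotionEvent event) {
                return event.getAction() == MotionEvent.ACTION_SCROLL
                        && (event.getMetaState() & KeyEvent.META_CTRL_ON) != 0;
            }

            /** Wheel up (positive vertical scroll) maps to zoom in; wheel down to zoom out. */
            static boolean isZoomIn(MotionEvent event) {
                return event.getAxisValue(MotionEvent.AXIS_VSCROLL) > 0f;
            }
        }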
  • the solution of the embodiment of the present application mainly adds some processing for the key combination events, so as to control the zooming of the touch screen electronic device through the key combination events.
  • electronic device D processes the ctrl + wheel key combination event and can thereby realize interface scaling on the electronic device D. The specific process will be introduced below and will not be expanded upon yet.
  • Figure 2 is a schematic diagram of another interaction scenario according to an embodiment of the present application. As mentioned above, Figure 2 is an example of the second type of interaction scenario. As shown in FIG. 2 , in this scenario, the electronic device D is a touch-screen electronic device. Refer to the related description of the electronic device D in FIG. 1 .
  • electronic device D projects the screen to electronic device E through near field communication, that is, short-range wireless communication technology.
  • the electronic device E can be called a screen projection device.
  • the electronic device E may be an electronic device such as a laptop or desktop computer that can be controlled through a keyboard and mouse. A laptop usually has a built-in keyboard, so its own keyboard plus an external mouse can be used for control; a desktop computer usually requires an external keyboard and mouse.
  • Electronic device E and electronic device D are two types of electronic devices with different control methods.
  • Electronic device D is a touch-screen electronic device, that is, an electronic device that can be controlled by performing touch operations such as clicking, double-clicking, or sliding on the screen.
  • the main input method of electronic device D is touch screen operation.
  • the electronic device E is an auxiliary device for the electronic device D in a joint office or collaborative office usage scenario.
  • the main function of the electronic device E is to provide a screen for the electronic device D and synchronously display the screen of the electronic device D, and the interface and input and output devices of the electronic device E can be called by the electronic device D.
  • data processing, such as the processing of input interactive events, is still performed by the electronic device D.
  • the touch screen electronic device in the embodiment of the present application mainly refers to the original main input mode being touch screen operation, and the key operation is not its original main input mode.
  • for the touch-screen electronic device, the keyboard and mouse are not its original input devices, but only its extended input devices.
  • for the screen projection device, the original main input method is key operation; that is, the keyboard and mouse are the original input devices of the screen projection device, not its extended input devices.
  • user A can control electronic device D by operating keyboard B and mouse C.
  • some processing of the key combination events by the electronic device D is added, so as to control the zoom function of the touch screen electronic device through the key combination events.
  • electronic device D uses the solution of the embodiment of the present application to perform some processing on the ctrl+wheel key combination event.
  • the interface scaling of the electronic device D can be realized. It should be understood that, since the electronic device E is the screen projection device of the electronic device D, the interface of the electronic device D is synchronized with its projection interface on the screen of the electronic device E. Therefore, the interface on the electronic device D and the projection interface of the electronic device D on the electronic device E are scaled synchronously.
  • for example, the electronic device D runs an Android operating system and the electronic device E runs a Windows operating system.
  • zooming on the touch-screen electronic device through keyboard and mouse control includes: using an external keyboard and mouse of the touch-screen electronic device itself, or using the keyboard and mouse of the projection device of the touch-screen electronic device, to realize scaling of the interface of the touch-screen electronic device and of the projection interface of the touch-screen electronic device on the projection device.
  • a notebook computer that has the functions of both an ordinary notebook computer and a tablet computer, that is, one that can process touch-screen interaction events and keyboard and mouse interaction events at the same time, can be called a touch-screen notebook computer.
  • such laptops generally run a Windows operating system, and the two types of interaction events correspond to two independent processing pipelines. If such a touch-screen laptop is operated directly, its zoom function responds directly to the user's touch operation or directly to the user's keyboard and mouse operation; that is, the user can zoom by touching the screen or by keyboard and mouse operations.
  • This dual processing flow method cannot be used in devices such as electronic device D.
  • a touch-screen laptop computer can be used as a screen projection device in the embodiment of the present application; that is to say, the electronic device E can be a touch-screen laptop computer.
  • in that case, however, the touch-screen laptop only plays the role of a display carrier; the mobile phone is still the device that actually performs the computing operations. Therefore, even if the keyboard and mouse connected to the touch-screen laptop are operated, zooming still cannot be performed. It is still necessary to use the solution of the embodiment of the present application: first convert the key combination event into a touch event and then respond to it, thereby realizing scaling of the mobile phone interface and of the phone's projection interface on the touch-screen laptop.
  • FIG 3 is a schematic diagram of the response process of a zoom function.
  • a zoom-in touch operation is performed on the interface 310, that is, two fingers slide in the direction where the distance between the fingers increases.
  • the zoom-in touch event is represented by two-finger pinch #1.
  • the electronic device processes the touch event of two-finger pinch #1 to obtain interface 320. It can be seen that the picture A in the interface 320 is larger than the picture A in the interface 310, that is, the proportion of the picture A on the screen of the electronic device increases.
  • a key combination event of ctrl+wheel up can be generated.
  • the key combination event to enlarge is represented by ctrl+wheel up.
  • by processing the ctrl + scroll wheel up key combination event, the electronic device can likewise obtain interface 320. That is to say, both the zoom-in key combination event and the zoom-in touch event can enlarge the interface of the electronic device.
  • a zoom-out touch event can be generated.
  • the zoom-out touch event is represented by two-finger pinch #2.
  • by processing the touch event of two-finger pinch #2, the electronic device can obtain interface 340. It can be seen that picture A in interface 340 is smaller than picture A in interface 330; that is, the proportion of picture A on the screen of the electronic device is reduced.
  • a key combination event of ctrl+scroll wheel down can be generated.
  • the key combination event of zooming out is represented by ctrl+scroll wheel down.
  • by processing the ctrl + scroll wheel down key combination event, the electronic device can likewise obtain interface 340. That is to say, both the zoom-out key combination event and the zoom-out touch event can shrink the interface of the electronic device.
  • Figure 3 can be seen as an example in which the foreground application of the electronic device is a picture viewing application.
  • Realizing the zoom function through a two-finger pinch touch event is an inherent function of electronic devices, and the solution of the embodiment of the present application can realize the zoom function through the key operation of ctrl+wheel.
  • the touch screen electronic device itself already has touch operations corresponding to the zoom function, so as long as the zoom key combination event is converted into a zoom touch event, the electronic device can respond to the zoom touch event .
  • Ctrl + scroll wheel is the most commonly used key operation for the zoom function, so using this key combination event to trigger the zoom function is more in line with user habits.
  • other key events can also be designed to replace this key combination; this only requires some adaptive changes, plus explaining to the user that the corresponding function of those key events is zooming. For example, a key combination that is not currently occupied can be chosen as the key combination operation for the zoom function.
  • Figure 3 mainly shows that zooming can be achieved through touch operations or key combination operations, that is, comparing the zoom touch event and the zoom key combination event to illustrate that the two can achieve the same effect.
  • Figure 3 is mainly to illustrate that the solution of the embodiment of the present application can be used to enable the zoom key combination event to achieve the same zoom effect as the two-finger pinch touch event.
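  • To make "converting a key combination event into a two-finger pinch touch event" concrete: Android's public MotionEvent.obtain() overload can construct multi-pointer events programmatically. The sketch below builds a single ACTION_MOVE frame of a synthetic pinch and is a plausibility sketch only; the patent does not disclose how its zoom touch events are built or injected, and a complete gesture would also need ACTION_DOWN, ACTION_POINTER_DOWN, and ACTION_UP frames.

        import android.os.SystemClock;
        import android.view.InputDevice;
        import android.view.MotionEvent;
        import android.view.MotionEvent.PointerCoords;
        import android.view.MotionEvent.PointerProperties;

        /** Illustrative construction of one frame of a synthetic two-finger pinch. */
        final class PinchFrameFactory {

            /**
             * Builds an ACTION_MOVE frame with two fingers placed symmetrically
             * around (cx, cy). Increasing halfSpanPx across successive frames
             * produces a pinch-out (zoom in); decreasing it, a pinch-in (zoom out).
             */
            static MotionEvent moveFrame(long downTime, float cx, float cy,
                                         float halfSpanPx) {
                PointerProperties[] props = new PointerProperties[2];
                PointerCoords[] coords = new PointerCoords[2];
                for (int i = 0; i < 2; i++) {
                    props[i] = new PointerProperties();
                    props[i].id = i;
                    props[i].toolType = MotionEvent.TOOL_TYPE_FINGER;
                    coords[i] = new PointerCoords();
                    coords[i].pressure = 1f;
                    coords[i].size = 1f;
                    coords[i].y = cy;            // both fingers on one horizontal line
                }
                coords[0].x = cx - halfSpanPx;   // left finger
                coords[1].x = cx + halfSpanPx;   // right finger
                return MotionEvent.obtain(
                        downTime, SystemClock.uptimeMillis(), MotionEvent.ACTION_MOVE,
                        /* pointerCount */ 2, props, coords,
                        /* metaState */ 0, /* buttonState */ 0,
                        /* xPrecision */ 1f, /* yPrecision */ 1f,
                        /* deviceId */ 0, /* edgeFlags */ 0,
                        InputDevice.SOURCE_TOUCHSCREEN, /* flags */ 0);
            }
        }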
  • FIG. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the electronic device 400 may include a processor 410, an external memory interface 420, an internal memory 421, a universal serial bus (USB) interface 430, a charging management module 440, a power management module 441, a battery 442, antenna 1, antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, a headphone interface 470D, a sensor module 480, buttons 490, a motor 491, an indicator 492, a camera 493, a display screen 494, a subscriber identification module (SIM) card interface 495, etc.
  • the sensor module 480 may include a pressure sensor 480A, a gyro sensor 480B, an air pressure sensor 480C, a magnetic sensor 480D, an acceleration sensor 480E, a distance sensor 480F, a proximity light sensor 480G, a fingerprint sensor 480H, a temperature sensor 480J, and a touch sensor 480K.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 400 .
  • the electronic device 400 may include more or less components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 410 shown in Figure 4 may include one or more processing units.
  • the processor 410 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the controller may be the nerve center and command center of the electronic device 400 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 410 may also be provided with a memory for storing instructions and data.
  • the memory in processor 410 is a cache memory. This memory may hold instructions or data that the processor 410 has just used or uses cyclically. If the processor 410 needs to use those instructions or data again, it can call them directly from this memory. This avoids repeated access and reduces the waiting time of the processor 410, thereby improving system efficiency.
  • processor 410 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • Processor 410 may include multiple sets of I2C bus.
  • the processor 410 can couple the touch sensor 480K, charger, flash, camera 493, etc. respectively through different I2C bus interfaces.
  • the processor 410 can be coupled to the touch sensor 480K through an I2C interface, so that the processor 410 and the touch sensor 480K communicate through the I2C bus interface to implement the touch function of the electronic device 400 .
  • the I2S interface may be used for audio communications.
  • Processor 410 may include multiple sets of I2S buses. The processor 410 can be coupled with the audio module 470 through the I2S bus to implement communication between the processor 410 and the audio module 470.
  • the audio module 470 can transmit audio signals to the wireless communication module 460 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
  • the PCM interface may also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 470 and the wireless communication module 460 may be coupled through a PCM bus interface.
  • the audio module 470 can also transmit audio signals to the wireless communication module 460 through the PCM interface to implement the function of answering calls through a Bluetooth headset. It should be understood that both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communications.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 410 and the wireless communication module 460.
  • the processor 410 communicates with the Bluetooth module in the wireless communication module 460 through the UART interface to implement the Bluetooth function.
  • the audio module 470 can transmit audio signals to the wireless communication module 460 through the UART interface to implement the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 410 with peripheral devices such as the display screen 494 and the camera 493.
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 410 and the camera 493 communicate through the CSI interface to implement the shooting function of the electronic device 400.
  • the processor 410 and the display screen 494 communicate through the DSI interface to implement the display function of the electronic device 400.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 410 with the camera 493, display screen 494, wireless communication module 460, audio module 470, sensor module 480, etc.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 430 is an interface that complies with the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 430 can be used to connect a charger to charge the electronic device 400, and can also be used to transmit data between the electronic device 400 and peripheral devices. It can also be used to connect headphones to play audio through them.
  • This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is only a schematic explanation and does not constitute a structural limitation of the electronic device 400 .
  • the electronic device 400 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charge management module 440 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 440 may receive charging input from the wired charger through the USB interface 430 .
  • the charging management module 440 may receive wireless charging input through the wireless charging coil of the electronic device 400 . While charging the battery 442, the charging management module 440 can also provide power to the electronic device through the power management module 441.
  • the power management module 441 is used to connect the battery 442, the charging management module 440 and the processor 410.
  • the power management module 441 receives input from the battery 442 and/or the charging management module 440, and supplies power to the processor 410, internal memory 421, external memory, display screen 494, camera 493, wireless communication module 460, etc.
  • the power management module 441 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 441 may also be provided in the processor 410.
  • the power management module 441 and the charging management module 440 can also be provided in the same device.
  • the wireless communication function of the electronic device 400 can be implemented through the antenna 1, the antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 400 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 450 can provide wireless communication solutions applied to the electronic device 400, such as at least one of the following: a second generation (2G) mobile communication solution, a third generation (3G) mobile communication solution, a fourth generation (4G) mobile communication solution, and a fifth generation (5G) mobile communication solution.
  • the mobile communication module 450 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 450 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and then transmit them to the modem processor for demodulation.
  • the mobile communication module 450 can also amplify the signal modulated by the modem processor, and the amplified signal is converted into electromagnetic waves by the antenna 1 and radiated out.
  • at least part of the functional modules of the mobile communication module 450 may be disposed in the processor 410 .
  • at least part of the functional modules of the mobile communication module 450 and at least part of the modules of the processor 410 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 470A, receiver 470B, etc.), or displays images or videos through display screen 494.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 410 and may be provided in the same device as the mobile communication module 450 or other functional modules.
  • the wireless communication module 460 can provide wireless communication solutions applied to the electronic device 400, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and other solutions.
  • the wireless communication module 460 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 460 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 410 .
  • the wireless communication module 460 can also receive the signal to be sent from the processor 410, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 400 is coupled to the mobile communication module 450, and the antenna 2 of the electronic device 400 is coupled to the wireless communication module 460, so that the electronic device 400 can communicate with the network and other electronic devices through wireless communication technology.
  • the wireless communication technology may include at least one of the following communication technologies: Global System for Mobile Communications (global system for mobile communications, GSM), general packet radio service (GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, IR technology.
  • the GNSS may include at least one of the following positioning technologies: global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), Quasi-zenith satellite system (QZSS), satellite-based augmentation systems (SBAS).
  • the electronic device 400 implements display functions through a GPU, a display screen 494, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 494 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 410 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display 494 is used to display images, videos, etc.
  • Display 494 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 400 may include 1 or N display screens 494, where N is a positive integer greater than 1.
  • the electronic device 400 can implement the shooting function through an ISP, a camera 493, a video codec, a GPU, a display screen 494, and an application processor.
  • the ISP is used to process the data fed back by the camera 493. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be located in the camera 493.
  • Camera 493 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 400 may include 1 or N cameras 493, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 400 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 400 may support one or more video codecs.
  • the electronic device 400 can play or record videos in multiple encoding formats, such as moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 400 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 420 can be used to connect an external memory card, such as a secure digital (SD) card, to expand the storage capacity of the electronic device 400 .
  • the external memory card communicates with the processor 410 through the external memory interface 420 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 421 may be used to store computer executable program code, which includes instructions.
  • the processor 410 executes instructions stored in the internal memory 421 to execute various functional applications and data processing of the electronic device 400 .
  • the internal memory 421 may include a program storage area and a data storage area. The program storage area can store the operating system and at least one application program required for a function (such as a sound playback function or an image playback function).
  • the storage data area may store data created during use of the electronic device 400 (such as audio data, phone book, etc.).
  • the internal memory 421 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the electronic device 400 can implement audio functions through the audio module 470, the speaker 470A, the receiver 470B, the microphone 470C, the headphone interface 470D, and the application processor. For example, music playback, recording, etc.
  • the audio module 470 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 470 may also be used to encode and decode audio signals. In some embodiments, the audio module 470 may be disposed in the processor 410, or some functional modules of the audio module 470 may be disposed in the processor 410.
  • Speaker 470A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • Electronic device 400 can listen to music through speaker 470A, or listen to hands-free calls.
  • Receiver 470B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be heard by bringing the receiver 470B close to the human ear.
  • Microphone 470C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 470C to input the sound signal into the microphone 470C.
  • the electronic device 400 may be provided with at least one microphone 470C. In other embodiments, the electronic device 400 may be provided with two microphones 470C, which in addition to collecting sound signals, may also implement a noise reduction function. In other embodiments, the electronic device 400 can also be equipped with three, four or more microphones 470C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • Headphone interface 470D is used to connect wired headphones.
  • the headphone interface 470D can be a USB interface 430, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 480A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 480A may be disposed on display screen 494.
  • pressure sensors 480A there are many types of pressure sensors 480A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material. When a force is applied to the pressure sensor 480A, the capacitance between the electrodes changes, and the electronic device 400 determines the intensity of the pressure based on the change in capacitance. When a touch operation acts on the display screen 494, the electronic device 400 detects the intensity of the touch operation through the pressure sensor 480A.
  • the electronic device 400 may also calculate the touched position based on the detection signal of the pressure sensor 480A.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity smaller than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold is applied to the short message application icon, an instruction to create a new short message is executed.
  • the gyroscope sensor 480B can be used to determine the motion posture of the electronic device 400 .
  • the angular velocity of electronic device 400 about three axes may be determined by gyro sensor 480B.
  • the gyro sensor 480B can be used for image stabilization. For example, when the shutter is pressed, the gyro sensor 480B detects the angle at which the electronic device 400 shakes, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shake of the electronic device 400 through reverse movement to achieve anti-shake.
  • the gyro sensor 480B can also be used for navigation and somatosensory gaming scenarios.
  • Air pressure sensor 480C is used to measure air pressure. In some embodiments, the electronic device 400 calculates the altitude through the air pressure value measured by the air pressure sensor 480C to assist positioning and navigation.
  • Magnetic sensor 480D includes a Hall sensor.
  • the electronic device 400 may utilize the magnetic sensor 480D to detect the opening and closing of the flip holster.
  • the electronic device 400 can detect the opening and closing of the flip cover based on the magnetic sensor 480D, and can set features such as automatic unlocking upon flipping open according to the detected open or closed state of the holster or flip cover.
  • the acceleration sensor 480E can detect the acceleration of the electronic device 400 in various directions (generally three axes). When the electronic device 400 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices and be used in horizontal and vertical screen switching, pedometer and other applications.
  • Distance sensor 480F is used to measure distance.
  • Electronic device 400 can measure distance via infrared or laser. In some embodiments, when shooting a scene, the electronic device 400 can utilize the distance sensor 480F to measure distance to achieve fast focusing.
  • Proximity light sensor 480G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 400 emits infrared light outwardly through the light emitting diode.
  • Electronic device 400 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 400 . When insufficient reflected light is detected, the electronic device 400 may determine that there is no object near the electronic device 400 .
  • the electronic device 400 can use the proximity light sensor 480G to detect when the user holds the electronic device 400 close to the ear for talking, so as to automatically turn off the screen to save power.
  • the proximity light sensor 480G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 480L is used to sense ambient light brightness.
  • the electronic device 400 can adaptively adjust the brightness of the display screen 494 according to the perceived ambient light brightness.
  • the ambient light sensor 480L can also be used to automatically adjust white balance when taking photos.
  • the ambient light sensor 480L can also cooperate with the proximity light sensor 480G to detect whether the electronic device 400 is in the pocket to prevent accidental touching.
  • Fingerprint sensor 480H is used to collect fingerprints.
  • the electronic device 400 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc.
  • Temperature sensor 480J is used to detect temperature.
  • the electronic device 400 uses the temperature detected by the temperature sensor 480J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 480J exceeds a threshold, the electronic device 400 reduces the performance of a processor located near the temperature sensor 480J in order to reduce power consumption and implement thermal protection. In some other embodiments, when the temperature is below another threshold, the electronic device 400 heats the battery 442 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below yet another threshold, the electronic device 400 boosts the output voltage of the battery 442 to avoid an abnormal shutdown caused by low temperature.
  • Touch sensor 480K is also known as a "touch panel".
  • the touch sensor 480K can be disposed on the display screen 494.
  • the touch sensor 480K and the display screen 494 form a touch screen, which is also called a "touch screen”.
  • Touch sensor 480K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 494.
  • the touch sensor 480K may also be disposed on the surface of the electronic device 400 at a location different from that of the display screen 494 .
  • Bone conduction sensor 480M can acquire vibration signals.
  • the bone conduction sensor 480M can acquire the vibration signal of the vibrating bone mass of the human body's vocal part.
  • the bone conduction sensor 480M can also contact the human body's pulse and receive blood pressure beating signals.
  • the bone conduction sensor 480M can also be provided in the earphone and combined into a bone conduction earphone.
  • the audio module 470 can analyze the voice signal based on the vibration signal of the vocal vibrating bone obtained by the bone conduction sensor 480M to implement the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 480M to implement the heart rate detection function.
  • the buttons 490 include a power button, a volume button, etc.
  • The buttons 490 may be mechanical buttons or touch buttons.
  • the electronic device 400 may receive key input and generate key signal input related to user settings and function control of the electronic device 400 .
  • Motor 491 can produce vibration prompts.
  • Motor 491 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • touch operations for different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 494 can also correspond to different vibration feedback effects of the motor 491.
  • Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also be customized.
  • the indicator 492 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 495 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the electronic device 400 by inserting it into the SIM card interface 495 or pulling it out from the SIM card interface 495 .
  • the electronic device 400 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 495 can support Nano SIM card, Micro SIM card, SIM card, etc.
  • the same SIM card interface 495 can insert multiple cards at the same time.
  • the types of the plurality of cards may be the same or different.
  • the SIM card interface 495 is also compatible with different types of SIM cards.
  • the SIM card interface 495 is also compatible with external memory cards.
  • the electronic device 400 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 400 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 400 and cannot be separated from the electronic device 400 .
  • the software system of the electronic device 400 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device.
  • the software structure of the electronic device will be exemplarily described below with reference to FIG. 5 , taking an electronic device with an Android system with a layered architecture as an example.
  • Figure 5 is a software structure block diagram of an electronic device according to an embodiment of the present application.
  • the electronic device in Figure 5 may be the electronic device 400 shown in Figure 4.
  • the layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which from top to bottom are the application layer, the application framework layer, the system library layer, and the kernel layer.
  • the application layer can include a series of application packages.
  • Figure 5 is just an example of layering the Android system, and other layering methods may also exist in actual scenarios.
  • there may also be other naming methods for the modules in each layer, without limitation.
  • the application framework layer can be called the framework layer, or the system library layer can be merged into the application framework layer, etc.
  • the application package can include applications (applications, APPs) such as short messages, calendars, cameras, videos, navigation, calls, and galleries.
  • the application will be referred to as application in the following.
  • the running application can also be called a foreground application or front-end application.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, resource manager, notification manager, input manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also present notifications that appear in the status bar at the top of the system in the form of charts or scroll bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or the indicator light flashes.
  • the input manager (InputManagerService) is mainly used to manage the input part of the entire system, including keyboard, mouse, touch screen, etc.
  • the system library layer can include Android runtime and system libraries.
  • Android runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the Java files of the application layer and application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules, for example, a surface manager, media libraries, a 3D graphics processing library (for example, OpenGL ES), and an input service module.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of multiple audio formats, playback and recording of multiple video formats, and still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, moving picture experts group audio layer III (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition and layer processing.
  • the input service module can include InputReader and InputDispatcher.
  • InputReader is used to read input events and report the read events to InputDispatcher, and InputDispatcher distributes input events.
  • the kernel layer refers to the layer between hardware and software; the kernel layer at least includes sensor drivers, camera drivers, display drivers, etc.
  • FIG. 4 above is a structural diagram of one possible electronic device.
  • FIG. 5 is a software architecture diagram of one possible electronic device. Any touch screen electronic device (electronic device D) shown in FIGS. 1 to 3 may have the structure shown in FIG. 4 and the software structure shown in FIG. 5.
  • Figure 6 is a schematic diagram of the interaction event processing process of the first type of interaction scenario in this embodiment of the present application.
  • the processing process shown in FIG. 6 can be applied to the interaction scenario shown in FIG. 1 , and the processing process shown in FIG. 6 can be executed by the above-mentioned electronic device D or electronic device 400.
  • the application layer includes multiple applications such as app1, app2, app3, app4, etc.
  • An application that is running in the foreground can be called a foreground application.
  • when the user inputs the ctrl+scroll wheel key combination through the keyboard and mouse externally connected to the touch screen electronic device, the corresponding driver in the kernel layer captures the key combination event and reports it to the InputReader in the native framework layer; the InputReader reads the key combination event and reports it to the InputDispatcher in the native framework layer, and the InputDispatcher can then distribute the key combination event.
  • the InputDispatcher will first distribute the key combination event to the InputFilter in the InputManagerService of the java framework layer.
  • the InputFilter filters the key combination event before event distribution and injects the key combination event into the InputDispatcher to form a corresponding touch event.
  • the InputDispatcher then distributes the touch event to the view framework, which sends it to the foreground application, and the touch screen electronic device (electronic device D or electronic device 400) can respond to the touch event.
  • specifically, the InputDispatcher distributes the event to the application process, and the view framework running in the application process further distributes and processes the event according to the view tree of the current interface layout, passing the event to the corresponding application control, so that the interface of the foreground application can be zoomed.
  • the native framework layer can be seen as an example of the system library layer shown in Figure 5, and the java framework layer can be seen as an example of the application framework layer shown in Figure 5.
  • the layering method of the Android system is not unique, that is to say, the division method of the inter-layer structure shown in Figure 5 may not be completely adopted.
  • the native framework layer and the java framework layer can also be collectively referred to as the framework layer.
  • without the solution of this embodiment, the InputDispatcher would distribute the key combination event directly to the view framework, which would then send it to the foreground application.
  • because the touch screen electronic device (electronic device D or electronic device 400) cannot respond to the key combination event, that is, cannot process it, the interface of the foreground application could not be zoomed.
  • in the solution of this embodiment, the key combination event is therefore first converted into a touch event, and the touch event is then distributed to the view framework, thereby realizing the zoom function.
  • Figure 7 is a schematic diagram of the interaction event processing process of the second type of interaction scenario in this embodiment of the present application.
  • the processing process shown in Figure 7 can be applied to the interaction scenario shown in Figure 2, and the processing process shown in Figure 7 can be executed by the above-mentioned electronic device D or electronic device 400.
  • the application layer includes multiple applications such as app1 and app2.
  • Device E can be regarded as an example of the electronic device E shown in Figure 2.
  • device E transmits the key combination event through the communication channel between device E and the touch screen electronic device (electronic device D or electronic device 400), and the collaborative office software of the touch screen electronic device receives it.
  • the collaborative office software sends the key combination event to the InputManagerService, and the InputManagerService injects the event into the InputDispatcher to form a corresponding touch event; the InputDispatcher then distributes the touch event to the view framework, which sends it to the foreground application.
  • because the touch screen electronic device (electronic device D or electronic device 400) can respond to the touch event, the interface of the foreground application can be zoomed.
  • the above-mentioned response process to the touch event can also refer to the relevant description in Figure 6 and will not be described again.
  • without the solution of this embodiment, the InputManagerService would not inject the key combination event into the InputDispatcher to form a corresponding touch event; instead, it would send the key combination event directly to the InputDispatcher, which would distribute it to the view framework and then to the foreground application.
  • because the touch screen electronic device (electronic device D or electronic device 400) cannot respond to the key combination event, that is, cannot process it, the interface of the foreground application could not be zoomed.
  • in the solution of this embodiment, the key combination event is therefore first converted into a touch event, and the touch event is then distributed to the view framework, thereby realizing the zoom function.
  • FIG. 8 is a schematic flow chart of an interaction event processing method according to an embodiment of the present application. The method can include: step S801, obtaining a key combination event, where the key combination event is an interaction event used to trigger the zoom function; step S802, converting the key combination event into a zoom touch event; and step S803, zooming the interface in response to the zoom touch event. This method can be applied to the above-mentioned electronic device D or electronic device 400. Each step in Figure 8 is introduced below.
  • the key combination event can be obtained through a hardware driver or collaborative office software.
  • step S801 may be to capture the operation of the control key and the scroll wheel through the hardware driver, generate a key combination event, and report the key combination event to the InputReader.
  • step S801 may be to obtain the key combination event through collaborative office software.
  • the hardware driver of the screen projection device can capture the control key and scroll wheel operations, generate a key combination event, and transmit the key combination event to the touch screen electronic device through the communication path between the screen projection device and the touch screen electronic device; the touch screen electronic device can then obtain the key combination event through the collaborative office software.
  • the key combination event includes an event composed of a control key (ctrl key) and a scroll wheel; that is, the key combination event is a combination event of a control key event and a mouse wheel event.
  • the key combination event can be ctrl+scroll wheel, and ctrl+scroll wheel can include "ctrl+scroll wheel up" and "ctrl+scroll wheel down".
  • ctrl can be regarded as a status value of the wheel event; that is to say, the ctrl+scroll wheel key combination event can be regarded as a wheel event carrying the ctrl status value.
  • the current event state can be obtained through the interface getMetaState() of the current wheel event object MotionEvent.
  • the event state is reflected as an integer (int) value.
  • the bit 0x1000 represents the status of the ctrl key: if this bit is 1, the ctrl key is pressed; if this bit is 0, the ctrl key is not pressed and the event is a simple scroll wheel event.
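  • As an illustrative sketch (not a module defined by this application; the class and method names are hypothetical), the ctrl status bit described above can be read with the standard Android APIs as follows:

      // Hypothetical helper: detects the ctrl+scroll wheel key combination from a
      // wheel MotionEvent. KeyEvent.META_CTRL_ON is the 0x1000 bit of the int
      // returned by getMetaState(), as described above.
      import android.view.KeyEvent;
      import android.view.MotionEvent;

      final class ZoomComboCheck {
          static boolean isCtrlPlusWheel(MotionEvent event) {
              boolean isWheel = event.getAction() == MotionEvent.ACTION_SCROLL;
              boolean ctrlPressed = (event.getMetaState() & KeyEvent.META_CTRL_ON) != 0;
              return isWheel && ctrlPressed;
          }

          // Positive vertical scroll corresponds to wheel up (zoom in);
          // negative corresponds to wheel down (zoom out).
          static float wheelAmount(MotionEvent event) {
              return event.getAxisValue(MotionEvent.AXIS_VSCROLL);
          }
      }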
  • the parameters of the key combination event can be converted into corresponding parameters in the zoom touch event.
  • the key combination event can be converted into a zoom touch event by injecting a segment of touch events; that is to say, injection of touch events can start after the key combination event is received and end once a touch event equivalent to a zoom gesture on the touch screen has been injected, thereby generating the zoom touch event.
  • step S802 may be that the InputDispatcher distributes the key combination event to the InputFilter.
  • the InputFilter filters the key combination event before event distribution, and injects the key combination event into the InputDispatcher to form a corresponding touch event.
  • step S802 may be that the InputManagerService injects the key combination event into the InputDispatcher to form a corresponding touch event.
  • generating the zoom touch event requires a certain duration: within this duration the injection starts with a down event and ends with an up event, which can be understood as the period from pressing to releasing. During this process, a move event can be injected every preset period of time, thereby generating a set of touch events.
  • a set of touch events often includes the location of the press and the distance and direction of the movement (mostly sliding).
  • for example, it can include the coordinates at which the two fingers press on the screen, as well as the sliding distance and sliding direction of the two fingers during the pinching process, which can also be described as the change in the distance between the two fingers.
  • positive and negative changes in this distance correspond to zooming in and zooming out respectively, that is, to the changes in the coordinates at which the two fingers press.
  • for a two-finger pinch event, if the change in the pressed coordinates detected within the preset time interval is +Δ, this corresponds to zooming in, and the zoom-in ratio is positively related to Δ; if the change is -Δ, this corresponds to zooming out, and the zoom-out ratio is positively related to Δ.
  • the parameters of a zoom touch event include a pinch distance and a pinch direction, where the pinch direction corresponds to zooming in or out, and the pinching distance corresponds to the zoom ratio.
  • as for the mouse wheel scroll event, it is a motion event that contains the coordinate information of the current mouse cursor; the coordinate information of the cursor can therefore be mapped to the sliding positions of the two fingers.
  • the parameters of the scroll wheel event include the scroll direction and the scroll amount.
  • Scroll directions include, for example, upward and downward.
  • the amount of scrolling can be understood as the amount of rolling when the user pushes the wheel. For example, it can be expressed in notches, that is, how many notches the wheel scrolls up or down.
  • the scrolling direction corresponds to whether to zoom in or out. In usual operations, scrolling up corresponds to zooming in, and scrolling down corresponds to zooming out.
  • the amount of scrolling corresponds to the zoom ratio.
  • for a simple scroll wheel event, the scrolling direction corresponds to the page turning direction.
  • scrolling up corresponds to turning the page up.
  • scrolling down corresponds to turning the page down.
  • the amount of scrolling corresponds to the amount of page turning up or down.
  • the difference between a simple wheel event and a key combination event of a control key and a wheel is that the control key status bit in the wheel event is marked differently; this does not affect the two parameters of scrolling direction and scrolling amount described above, which are not repeated here.
  • the parameters of the zoom touch event include the pinch direction and pinch distance.
  • the parameters of the key combination event include the scroll direction and scroll amount.
  • the pinch distance corresponds to the zoom ratio.
  • the scroll amount also corresponds to the zoom ratio, so the pinch distance can be made to correspond to the scroll amount.
  • in summary, the key combination event is a combination event of a control key event and a mouse wheel event; the parameters of the key combination event include the scrolling direction and scrolling amount of the wheel; and the parameters of the zoom touch event include the pinch direction and pinch distance. Converting the parameters of the key combination event into corresponding parameters in the zoom touch event can include: converting the scroll amount into the pinch distance, and converting the scroll direction into the pinch direction.
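  • A minimal sketch of this parameter conversion is given below; the names and the scale constant are assumptions for illustration, since this application does not specify an exact scroll-amount-to-pinch-distance ratio:

      // Hypothetical conversion: scroll direction -> pinch direction,
      // scroll amount -> pinch distance. DP_PER_NOTCH is an assumed tuning constant.
      final class ComboToPinch {
          static final float DP_PER_NOTCH = 100f; // assumed pinch distance per wheel notch

          // Signed pinch distance in dp: positive means the fingers move apart
          // (wheel up, zoom in); negative means they move together (wheel down, zoom out).
          static float toPinchDistanceDp(float scrollNotches) {
              return scrollNotches * DP_PER_NOTCH;
          }
      }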
  • the above injection process, that is, the step of converting the key combination event into a zoom touch event, can include:
  • injecting touch events based on the scroll wheel event to form a zoom touch event, where the zoom identifier of the zoom touch event (corresponding to the pinch direction) corresponds to the wheel direction, and the zoom ratio of the zoom touch event (corresponding to the pinch distance) corresponds to the scroll amount.
  • the zoom identifier is used to indicate whether this interaction event zooms in or out; for example, positive and negative values can represent zooming in and zooming out respectively.
  • the coordinates of the down events injected for the two pinching fingers correspond to two points symmetric about the cursor position as the center.
  • the distances between the two points along the x-axis and the y-axis are preset values.
  • the preset value can be, for example, 400dp.
  • the coordinates of the injected up events are the starting coordinates offset by 100dp in the x-axis and y-axis directions.
  • the x and y coordinate offsets of each move event are coordinate values calculated based on the time elapsed in the event injection process.
  • whether the gesture zooms in or out is determined by the direction of the scroll wheel event, which also determines whether positive or negative values are used when calculating the x and y coordinates of the points; the injected touch event coordinates need to be constrained to the screen boundaries.
  • dp means density-independent pixel; 1dp represents the length of 1 pixel at a screen pixel density of 160dpi.
  • the injected event coordinates are restricted based on screen boundaries.
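  • The geometry described above can be sketched as follows; this only computes the coordinate stream of the two synthetic fingers (down points symmetric about the cursor with a 400dp separation per axis, up to a 100dp per-axis offset at the up event, clamped to the screen), and all names are hypothetical:

      // Hypothetical coordinate generator for the injected two-finger gesture.
      // progress runs from 0 (down event) to 1 (up event); sign is +1 to zoom in
      // (fingers move apart) and -1 to zoom out (fingers move together).
      final class InjectedPinchGeometry {
          static final float START_OFFSET_DP = 200f; // half of the 400dp starting separation
          static final float MOVE_RANGE_DP = 100f;   // per-axis offset reached at the up event

          // Returns {x1, y1, x2, y2} for the two synthetic fingers at the given progress,
          // centered on the cursor and constrained to the screen boundaries.
          static float[] fingerPositions(float cursorX, float cursorY, float progress,
                                         int sign, float density,
                                         float screenW, float screenH) {
              float start = START_OFFSET_DP * density;
              float move = MOVE_RANGE_DP * density * progress * sign;
              float d = start + move; // current half-distance between the two fingers
              return new float[] {
                  clamp(cursorX - d, 0, screenW), clamp(cursorY - d, 0, screenH),
                  clamp(cursorX + d, 0, screenW), clamp(cursorY + d, 0, screenH),
              };
          }

          private static float clamp(float v, float lo, float hi) {
              return Math.max(lo, Math.min(hi, v));
          }
      }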
  • step S802 may also include: determining whether the conversion of the previous round of key combination events is still in progress; when the conversion of the previous round is still in progress, discarding the current key combination event; or, when it is not in progress, converting the key combination event into a zoom touch event.
  • This implementation can improve the accuracy of processing continuously triggered key combination events.
  • the interface can be an interface of a touch-screen electronic device or a projection interface of a touch-screen electronic device, corresponding to the above two types of interaction scenarios respectively.
  • the above method may further include: determining whether the key combination event is a zoom key combination event; when the key combination event is a zoom key combination event, performing the step of converting the key combination event into a zoom touch event; or, when it is not, ending the processing flow of the interaction event.
  • in other words, before step S802 it is determined whether the input key combination event is the key combination event corresponding to the zoom function, and step S802 is executed only if it is. That is, a discrimination operation is added before step S802: when the input interaction event is not the expected key combination event, subsequent operations are no longer performed; the conversion and response steps continue to be performed only when the input interaction event is the correct key combination event.
  • This implementation can block the input of pure control key events, or avoid mistakenly responding to pure wheel events as zoom key combinations.
  • the above method may further include: determining whether the foreground application supports zooming, and performing step S803 when the foreground application supports zooming. That is to say, a discrimination operation is added: since some applications support zooming and some do not, the solution of the embodiment of the present application can be executed only for applications that support zooming. It should be understood that this judgment may be performed at any step node before step S803; for example, it may be performed before or after step S801, or before or after step S802, without limitation.
  • for example, if whether the foreground application supports zooming is judged before step S801, the processing flow of steps S801-S803 may be omitted when the judgment result is no, and steps S801-S803 may be executed when the judgment result is yes, to realize zooming under keyboard and mouse control. If the judgment is performed after step S801 and before step S802, steps S802 and S803 may no longer be performed when the judgment result is no, and may continue to be performed when the judgment result is yes, to realize zooming under keyboard and mouse control. Other cases are not listed one by one.
  • when the foreground application supports zooming, step S803 includes: in response to the zoom touch event, zooming the running interface of the foreground application; or, when the foreground application does not support zooming, the processing flow of the interaction event ends.
  • an application whitelist can be set, and all applications in the whitelist can support the zoom function.
  • the above steps of determining whether the foreground application supports scaling can be performed by InputManagerService.
  • determining whether the foreground application supports zooming can include: comparing the information of the foreground application with the preconfigured information of the whitelist applications to determine whether the foreground application is an application in the whitelist, where the whitelist includes at least one application that supports the zoom function.
  • when the foreground application is an application in the whitelist, the foreground application is considered to support zooming; or, when the foreground application is not an application in the whitelist, the foreground application is considered not to support zooming.
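  • As a minimal sketch of the whitelist comparison described above (the class name and the source of the preconfigured set are assumptions):

      // Hypothetical whitelist check: the set of package names would come from the
      // preconfigured information of the whitelist applications.
      import java.util.Set;

      final class ZoomWhitelist {
          private final Set<String> whitelist;

          ZoomWhitelist(Set<String> preconfigured) {
              this.whitelist = preconfigured;
          }

          // The foreground application supports zooming iff it is in the whitelist.
          boolean supportsZoom(String foregroundPackage) {
              return whitelist.contains(foregroundPackage);
          }
      }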
  • the above method further includes: after the zoom touch event is generated, discarding the key combination event. That is to say, the key combination event can be discarded after step S802 is executed, and the discarded event no longer needs to be processed subsequently.
  • the method shown in Figure 8 mainly achieves the effect of controlling the zoom function of a touch screen electronic device through a keyboard and mouse by converting a key combination event that cannot be responded to into a zoom touch event that can be responded to, and then responding to it. The solution is easy to implement and does not require many changes.
  • the touch screen electronic device itself already has touch operations corresponding to the zoom function, so as long as the zoom key combination event is converted into a zoom touch event, the electronic device can respond to the zoom touch event.
  • this can be understood as establishing a mapping relationship between the key event and the touch event of the zoom function, so that when the zoom key combination event is input to the electronic device, the electronic device only needs to add a conversion step, first converting the zoom key combination event into a zoom touch event.
  • the event can then be processed according to the original processing flow for touch events, thereby, from the user's perspective, realizing control of the electronic device's zoom function through the keyboard and mouse.
  • this solution only adds a conversion step to the existing touch event processing flow, so the modification is relatively small and easy to implement; there is no need to change the hardware structure or redesign a complex processing flow.
  • Figure 9 is a schematic flow chart of another interactive event processing method according to an embodiment of the present application.
  • Figure 9 can be seen as an example of the method shown in Figure 8.
  • Step S901 can be regarded as an example of step S801.
  • step S902: determine whether the key combination event is a ctrl+scroll wheel event. When the judgment result is yes, step S903 is executed; when the judgment result is no, step S904 is executed.
  • step S903: determine whether the foreground application supports zooming. When the judgment result is yes, step S905 is executed; when the judgment result is no, step S904 is executed.
  • step S904: the InputDispatcher reports the key combination event to the view, and the view then sends it to the foreground application.
  • if step S901 actually obtains only a control key event, that is, a ctrl event, the ctrl event is directly reported to the foreground application, and the subsequent steps for implementing the zoom function are not performed.
  • if step S901 actually obtains only a scroll wheel event, the scroll wheel event is directly reported to the foreground application, and the subsequent steps for implementing the zoom function are not performed; the foreground application may respond to the scroll wheel event, for example by turning pages.
  • Steps S902 and S904 are mainly to make a pre-judgment of the key combination event. Only when the key combination event is indeed a key combination event of the zoom function, the subsequent steps will be continued. Otherwise, the key combination event will be directly reported to the foreground application.
  • Steps S903 and S904 are mainly to make a pre-judgment for the foreground application. Only when the current foreground application supports the zoom function, the subsequent steps will be continued. Otherwise, the key combination event will be directly reported to the foreground application.
  • Step S903 may also be executed before or after step S901 or during execution, without limitation.
  • step S905: determine whether the previous round of injection is still being executed. When the judgment result is yes, step S906 is executed, that is, the key combination event is discarded; when the judgment result is no, step S907 is executed.
  • step S907: inject touch events for a period of time to generate a zoom touch event.
  • Step S907 can be regarded as an example of step S802.
  • steps S905-S907 mainly ensure that, for continuous key combination operations, the zoom touch event generated in each round of injection is equivalent to an actual two-finger pinch event.
  • when step S907 is completed, in addition to step S908, a step of discarding the key combination event may also be performed.
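  • The S902-S908 decision flow, including the gate of steps S905/S906 for rounds of injection, can be sketched as below; the hook methods are hypothetical placeholders, not the actual module names of this application:

      // Hypothetical sketch of the Figure 9 flow: pass through non-zoom events (S904),
      // discard a combo event while a previous round of injection is running (S905/S906),
      // otherwise inject touch events for a period of time (S907).
      import java.util.concurrent.atomic.AtomicBoolean;

      final class ZoomComboFlow {
          private final AtomicBoolean injecting = new AtomicBoolean(false);

          void onKeyComboEvent(boolean isCtrlPlusWheel, boolean foregroundSupportsZoom) {
              if (!isCtrlPlusWheel || !foregroundSupportsZoom) {
                  passThroughToForegroundApp();            // S904
                  return;
              }
              if (!injecting.compareAndSet(false, true)) { // S905: previous round running?
                  discardComboEvent();                     // S906
                  return;
              }
              try {
                  injectZoomTouchEventForAWhile();         // S907: generate the zoom touch event
              } finally {
                  injecting.set(false);
              }
          }

          void passThroughToForegroundApp() { /* assumed hook */ }
          void discardComboEvent() { /* assumed hook */ }
          void injectZoomTouchEventForAWhile() { /* assumed hook */ }
      }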
  • Figure 10 is a schematic flow chart of yet another interactive event processing method according to an embodiment of the present application.
  • Figure 10 can be regarded as an example of the method shown in Figure 8 or Figure 9.
  • the hardware driver reports the key combination event to the InputReader.
  • Step S1001 can be regarded as an example of step S801 or S901.
  • InputReader reports the key combination event to InputDispatcher.
  • InputDispatcher reports the key combination event to InputFilter in InputManagerService, and InputFilter performs filtering before event distribution.
  • InputFilter determines whether the key combination event is a zoom key combination event.
  • Step S1004 can be regarded as an example of S902.
  • InputFilter determines whether the foreground application supports scaling.
  • Step S1005 can be regarded as an example of S903.
  • InputFilter injects zoom touch events into InputDispatcher.
  • Step S1006 can be regarded as an example of step S802 or S907.
  • InputDispatcher sends a zoom touch event to the foreground application.
  • Step S1007 can be regarded as an example of step S908.
  • in step S1007, the InputDispatcher can report the zoom touch event to the view, and the view then sends it to the foreground application.
  • Figure 10 is mainly introduced using the first type of interaction scenario as an example. The method shown in Figure 10 is a modification based on the existing interaction event processing framework and processing flow: steps S1003-S1006 are added so that step S1007 sends a zoom touch event, rather than a key combination event, to the foreground application, allowing the view to respond to the zoom touch event and thus fully implementing the zoom function triggered by the key combination event. It should be understood that those skilled in the art can also implement the conversion of key combination events into zoom touch events in other suitable processing modules, or design separate processing modules, but the required changes will be greater than in Figure 10.
  • Figure 10 is a modification based on the existing processing framework and processing flow. The changes are smaller, the development cost is lower, and it is more convenient.
  • FIG 11 is a schematic diagram of an interactive event processing device according to an embodiment of the present application.
  • the device 2000 includes an acquisition unit 2001 and a processing unit 2002.
  • the device 2000 may be any of the above-mentioned touch screen electronic devices, such as the electronic device D or the electronic device 400 .
  • the device 2000 can be used to perform any of the above interactive event processing methods.
  • the acquisition unit 2001 can be used to perform step S801, and the processing unit 2002 can be used to perform steps S802 and S803.
  • the processing unit 2002 may be further configured to determine whether the key combination event is a key combination event corresponding to the zoom function, and/or to perform the step of determining whether the foreground application supports the zoom function.
  • the acquisition unit 2001 can also be used to perform step S901, and the processing unit 2002 can also be used to perform steps S902 to S908.
  • the acquisition unit 2001 can also be used to perform step S1001, and the processing unit 2002 can also be used to perform steps S1002 to S1007.
  • the device 2000 may further include a storage unit for storing data such as interaction events and configuration information of applications that support the zoom function.
  • the storage unit may be integrated in the processing unit 2002, or may be a unit independent of the acquisition unit 2001 and the processing unit 2002.
  • It should be noted that the internal structure of the device can be divided into different functional units or modules to complete all or part of the functions described above.
  • Each functional unit and module in the embodiment can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the specific names of each functional unit and module are only for the convenience of distinguishing them from each other and are not used to limit the scope of protection of the present application.
  • For the specific working processes of the units and modules in the above system, refer to the corresponding processes in the foregoing method embodiments, which will not be described again here.
  • An embodiment of the present application also provides an electronic device, which includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor.
  • when the processor executes the computer program, the steps in any of the above methods are implemented.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • when the computer program is executed by a processor, the steps in each of the above method embodiments can be implemented.
  • Embodiments of the present application also provide a computer program product.
  • when the computer program product runs on a mobile terminal, the steps in each of the above method embodiments can be implemented.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • this application can implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium.
  • when the computer program is executed by a processor, the steps of each of the above method embodiments may be implemented.
  • the computer program includes computer program code, which may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable medium may at least include: any entity or device capable of carrying the computer program code to the camera/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a mobile hard disk, a magnetic disk, or an optical disc.
  • In some jurisdictions, according to legislation and patent practice, the computer-readable medium may not include electrical carrier signals or telecommunications signals.
  • the disclosed apparatus/network equipment and methods can be implemented in other ways.
  • the device/network device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division; in actual implementation, there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the term "if" may, depending on the context, be interpreted as "when" or "once" or "in response to determining" or "in response to detecting".
  • the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "once it is determined" or "in response to determining" or "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".


Abstract

This application provides an interaction event processing method and apparatus. The method includes: obtaining a key combination event, where the key combination event is an interaction event used to trigger a zoom function; converting the key combination event into a zoom touch event; and zooming an interface in response to the zoom touch event. The solution mainly converts a key combination event that cannot be responded to into a zoom touch event that can be responded to, and then responds to it, thereby achieving the effect of controlling the zoom function of a touch screen electronic device through a keyboard and mouse; moreover, the solution is easy to implement.

Description

Method and apparatus for processing interaction events
This application claims priority to Chinese patent application No. 202211069768.5, entitled "Method and apparatus for processing interaction events", filed with the China National Intellectual Property Administration on September 2, 2022, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of electronic information technology, and in particular to an interaction event processing method and apparatus.
Background
With the development of technology, the use of touch screen electronic devices such as mobile phones and tablet computers is no longer limited to direct manipulation of these devices; situations have emerged where external input devices are used for control, and where these touch screen electronic devices are projected onto other electronic devices for control. For example, a keyboard and mouse can be externally connected to a touch screen electronic device such as a mobile phone, so that the phone can be used as an office computer and controlled with the keyboard and mouse. Alternatively, the phone can be projected onto a laptop or desktop computer for collaborative office work and then controlled with the laptop's built-in keyboard, or with the external keyboard and mouse of the laptop or desktop computer.
In the above scenarios, taking a mobile phone as an example, since the keyboard and mouse are essentially designed for computers, keyboard and mouse control does not fully match the phone. That is to say, although the keyboard and mouse can control a computer, when the controlled object becomes a phone, some of the phone's original control functions cannot be realized. For example, when a keyboard and mouse are used to control a phone, there is currently no way to realize the zoom function on the phone.
Therefore, how to realize zooming on a touch screen electronic device through keyboard and mouse control is a technical problem to be solved urgently.
Summary
This application provides an interaction event processing method and apparatus, which can realize zooming on a touch screen electronic device through keyboard and mouse control.
In a first aspect, an interaction event processing method is provided, the method including: obtaining a key combination event, where the key combination event is an interaction event used to trigger a zoom function; converting the key combination event into a zoom touch event; and zooming an interface in response to the zoom touch event.
In the technical solution of this application, a key combination event that cannot be responded to is converted into a zoom touch event that can be responded to, and the zoom touch event is then responded to, thereby achieving the effect of controlling the zoom function of a touch screen electronic device through a keyboard and mouse. The solution is easy to implement and does not require many changes.
With reference to the first aspect, in some implementations of the first aspect, converting the key combination event into a zoom touch event may include: converting the parameters of the key combination event into corresponding parameters in the zoom touch event.
With reference to the first aspect, in some implementations of the first aspect, the key combination event is a combination event of a control key event and a mouse wheel event; the parameters of the key combination event include the scrolling direction and scrolling amount of the wheel; the parameters of the zoom touch event include the pinch direction and pinch distance; and converting the parameters of the key combination event into corresponding parameters in the zoom touch event includes: converting the scroll amount into the pinch distance, and converting the scroll direction into the pinch direction.
With reference to the first aspect, in some implementations of the first aspect, the above method further includes:
determining whether the key combination event is a zoom key combination event;
when the key combination event is a zoom key combination event, performing the step of converting the key combination event into a zoom touch event; or,
when the key combination event is not a zoom key combination event, ending the processing flow of the interaction event.
This implementation can block the input of pure control key events, or avoid mistakenly responding to pure wheel events as zoom key combinations.
With reference to the first aspect, in some implementations of the first aspect, converting the key combination event into a zoom touch event may further include:
determining whether the conversion of the previous round of key combination events is still in progress;
when the conversion of the previous round of key combination events is still in progress, discarding the key combination event; or,
when the conversion of the previous round of key combination events is not in progress, converting the key combination event into a zoom touch event.
This implementation can improve the accuracy of processing continuously triggered key combination events.
With reference to the first aspect, in some implementations of the first aspect, obtaining the key combination event may include:
obtaining the key combination event through a hardware driver or collaborative office software.
With reference to the first aspect, in some implementations of the first aspect, the above method further includes:
determining whether the foreground application supports zooming;
when the foreground application supports zooming, zooming the interface in response to the zoom touch event includes: zooming the running interface of the foreground application in response to the zoom touch event; or,
when the foreground application does not support zooming, ending the processing flow of the interaction event.
With reference to the first aspect, in some implementations of the first aspect, determining whether the foreground application supports zooming may include:
comparing the information of the foreground application with the preconfigured information of the whitelist applications to determine whether the foreground application is an application in the whitelist, where the whitelist includes at least one application that supports the zoom function;
when the foreground application is an application in the whitelist, considering that the foreground application supports zooming; or,
when the foreground application is not an application in the whitelist, considering that the foreground application does not support zooming.
With reference to the first aspect, in some implementations of the first aspect, the above method further includes:
discarding the key combination event after the zoom touch event is generated.
In a second aspect, an interaction event processing apparatus is provided, the apparatus including units composed of software and/or hardware for executing any one of the methods in the first aspect.
In one embodiment, the apparatus includes:
an obtaining unit, configured to obtain a key combination event, where the key combination event is an interaction event used to trigger a zoom function; and
a processing unit, configured to:
convert the key combination event into a zoom touch event; and
zoom an interface in response to the zoom touch event.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is specifically configured to: convert the parameters of the key combination event into corresponding parameters in the zoom touch event.
With reference to the second aspect, in some implementations of the second aspect, the key combination event is a combination event of a control key event and a mouse wheel event; the parameters of the key combination event include the scrolling direction and scrolling amount of the wheel; the parameters of the zoom touch event include the pinch direction and pinch distance; and the processing unit is specifically configured to:
convert the scroll amount into the pinch distance, and convert the scroll direction into the pinch direction.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is further configured to:
determine whether the key combination event is a zoom key combination event;
when the key combination event is a zoom key combination event, perform the step of converting the key combination event into a zoom touch event; or,
when the key combination event is not a zoom key combination event, end the processing flow of the interaction event.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is further configured to:
determine whether the conversion of the previous round of key combination events is still in progress;
when the conversion of the previous round of key combination events is still in progress, discard the key combination event; or,
when the conversion of the previous round of key combination events is not in progress, convert the key combination event into a zoom touch event.
With reference to the second aspect, in some implementations of the second aspect, the obtaining unit is specifically configured to:
obtain the key combination event through a hardware driver or collaborative office software.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is further configured to:
determine whether the foreground application supports zooming;
when the foreground application supports zooming, zooming the interface in response to the zoom touch event includes: zooming the running interface of the foreground application in response to the zoom touch event; or,
when the foreground application does not support zooming, end the processing flow of the interaction event.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is specifically configured to:
compare the information of the foreground application with the preconfigured information of the whitelist applications to determine whether the foreground application is an application in the whitelist, where the whitelist includes at least one application that supports the zoom function;
when the foreground application is an application in the whitelist, consider that the foreground application supports zooming; or,
when the foreground application is not an application in the whitelist, consider that the foreground application does not support zooming.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is further configured to:
discard the key combination event after the zoom touch event is generated.
In a third aspect, an electronic device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where any one of the methods of the first aspect can be implemented when the processor executes the computer program.
In a fourth aspect, a chip is provided, including a processor configured to read and execute a computer program stored in a memory, where any one of the methods of the first aspect can be implemented when the computer program is executed by the processor.
Optionally, the chip further includes a memory, and the memory is electrically connected to the processor.
Optionally, the chip may further include a communication interface.
In a fifth aspect, a computer-readable storage medium is provided, storing a computer program, where any one of the methods of the first aspect can be implemented when the computer program is executed by a processor.
In a sixth aspect, a computer program product is provided, including a computer program, where any one of the methods of the first aspect can be implemented when the computer program is executed by a processor.
Brief Description of the Drawings
Figure 1 is a schematic diagram of an interaction scenario according to an embodiment of this application.
Figure 2 is a schematic diagram of another interaction scenario according to an embodiment of this application.
Figure 3 is a schematic diagram of the response process of a zoom function.
Figure 4 is a schematic structural diagram of an electronic device according to an embodiment of this application.
Figure 5 is a software structure block diagram of an electronic device according to an embodiment of this application.
Figure 6 is a schematic diagram of the interaction event processing process of the first type of interaction scenario according to an embodiment of this application.
Figure 7 is a schematic diagram of the interaction event processing process of the second type of interaction scenario according to an embodiment of this application.
Figure 8 is a schematic flow chart of an interaction event processing method according to an embodiment of this application.
Figure 9 is a schematic flow chart of another interaction event processing method according to an embodiment of this application.
Figure 10 is a schematic flow chart of yet another interaction event processing method according to an embodiment of this application.
Figure 11 is a schematic diagram of an interaction event processing apparatus according to an embodiment of this application.
Detailed Description
The solutions of the embodiments of this application are introduced below with reference to the drawings. The interaction event processing method provided by this application can be applied to keyboard-and-mouse control scenarios of various touch screen electronic devices, and can be used in touch screen electronic devices that support control via an externally connected keyboard and mouse.
Figure 1 is a schematic diagram of an interaction scenario according to an embodiment of this application. It should be understood that the embodiments of this application mainly involve two types of interaction scenarios. In the first type, a keyboard and mouse are directly connected to the touch screen electronic device, and the externally connected keyboard and mouse are used to control it; in the second type, the touch screen electronic device is projected onto a screen projection device, and the external keyboard and mouse of the projection device are used to control the touch screen electronic device. Figure 1 is an example of the first type of interaction scenario, and Figure 2 is an example of the second type.
As shown in Figure 1, in this scenario electronic device D is a touch screen electronic device, that is, an electronic device that can be controlled through touch operations on the screen such as tapping, double-tapping, or sliding. A touch screen electronic device may also be called a touch screen device or a touch screen terminal. Electronic device D may be, for example, a mobile phone or a tablet computer, or another electronic device that can be controlled by touch; Figure 1 mainly takes electronic device D being a tablet computer as an example. It should be understood that the main input mode of electronic device D is touch screen operation, but there may also be mechanical keys, such as a power key and volume keys, for key-press input.
That is to say, the touch screen electronic device in the embodiments of this application mainly refers to a device whose original main input mode is touch screen operation; key-press operation is not its original main input mode. In other words, the keyboard and mouse are not the original input devices of the touch screen electronic device but only extended input devices. For example, for a mobile phone, touch operations on the screen are its original main input mode; the usage scenario of externally connecting a keyboard and mouse to control the phone has only emerged with the development of technology. Since the keyboard and mouse are essentially designed for computers, keyboard and mouse control does not fully match the phone: although the keyboard and mouse can control a computer, when the controlled object becomes a phone, some of the phone's original control functions cannot be realized.
In the scenario shown in Figure 1, electronic device D is externally connected to keyboard B and mouse C through a wireless connection. The wireless connection may be, for example, a Bluetooth connection, but it should be understood that other wireless connection modes are possible, and the wireless connection may also be replaced by a wired connection, without limitation. There is also no limitation on the shape or model of keyboard B and mouse C or on the key layout of keyboard B. For example, the Fn key on keyboard B shown in Figure 1 may not exist, or its actual position may differ from that shown; the icons of the arrow keys may take other forms; there may be a numeric keypad; there may be volume keys; and so on. Other possible cases are not described one by one.
In the scenario shown in Figure 1, user A can control electronic device D by operating keyboard B and mouse C.
It should be noted that in the embodiments of this application, interaction events include key events and touch events. A key event represents an interaction event generated by operating keys or buttons on the keyboard and/or mouse. A touch event represents an interaction event generated by touching the screen. It should be understood that the touch event here is an interaction event generated by a touch operation on the screen of the touch screen electronic device, which is different from the touchpad on a laptop.
If user A pushes the scroll wheel of mouse C while pressing the ctrl key of keyboard B, the ctrl+scroll wheel key combination event is formed. It should be understood that pushing the wheel includes scrolling up and scrolling down; that is, the wheel has two scroll directions, described as up and down respectively. In the operation of a traditional computer's external keyboard and mouse, ctrl+scroll wheel realizes the zoom function: ctrl+scroll up realizes zooming in, and ctrl+scroll down realizes zooming out. A simple wheel operation alone realizes the page turning function: scrolling up corresponds to turning the page up, and scrolling down corresponds to turning the page down. It should be understood that, because the ctrl key is a control key and simply pressing it does not cause the controlled device to perform any operation, ctrl in the ctrl+scroll wheel key combination event can be regarded as a status value of the wheel event; in other words, the ctrl+scroll wheel key combination event can be regarded as a wheel event carrying the ctrl status value.
For current touch screen electronic devices connected to a keyboard and mouse, even if the ctrl+scroll wheel key combination event is generated by operating the keyboard and mouse, the touch screen electronic device cannot respond to this key combination event, that is, zooming cannot be realized.
In view of the above problem, the solutions of the embodiments of this application mainly add some processing of the key combination event, thereby realizing control of the zoom of the touch screen electronic device through the key combination event. In the scenario shown in Figure 1, if user A scrolls the wheel of mouse C while pressing the ctrl key of keyboard B, with the solution of the embodiments of this application, electronic device D processes the ctrl+scroll wheel key combination event and can thereby zoom the interface on electronic device D. The specific process is introduced below.
Figure 2 is a schematic diagram of another interaction scenario according to an embodiment of this application. As mentioned above, Figure 2 is an example of the second type of interaction scenario. As shown in Figure 2, in this scenario electronic device D is a touch screen electronic device; refer to the relevant description of electronic device D in Figure 1.
In the scenario shown in Figure 2, electronic device D is projected onto electronic device E through near field communication, that is, short-range wireless communication technology. It should be understood, however, that the embodiments of this application do not limit the communication mode between electronic device D and electronic device E. Electronic device E may be called a screen projection device. Electronic device E may be a laptop, a desktop computer, or another electronic device that can be controlled through a keyboard and mouse. A laptop usually has a built-in keyboard, so it can be controlled with its own keyboard and an external mouse; a desktop computer usually requires an external keyboard and mouse.
It should be understood that electronic device E and electronic device D are two types of electronic devices with different control modes. Electronic device D is a touch screen electronic device, that is, a device controlled through touch operations such as tapping, double-tapping, or sliding on the screen; its main input mode is touch screen operation. Electronic device E is an auxiliary device for electronic device D in the collaborative office usage scenario. The main role of electronic device E is to provide a screen for electronic device D and synchronously display electronic device D's screen content, and the interfaces and input/output devices of electronic device E can be called by electronic device D. It should be understood, however, that in the collaborative office scenario, data processing, such as the processing of input interaction events, is still performed by electronic device D.
As mentioned above, the touch screen electronic device in the embodiments of this application mainly refers to a device whose original main input mode is touch screen operation, and key-press operation is not its original main input mode; that is, the keyboard and mouse are not its original input devices but extended ones. For the screen projection device, in contrast, the original main input mode is key-press operation; that is, the keyboard and mouse are the projection device's original input devices, not extended ones.
In the scenario shown in Figure 2, electronic device E is externally connected to keyboard B and mouse C through a wired connection. It should be understood that a wireless connection is also possible, without limitation. For the description of keyboard B and mouse C, refer to the relevant content of Figure 1.
In the scenario shown in Figure 2, user A can control electronic device D by operating keyboard B and mouse C.
If user A pushes the wheel of mouse C while pressing the ctrl key of keyboard B, the ctrl+scroll wheel key combination event is formed. The key combination event is reported to electronic device E and then transmitted to electronic device D through the communication channel between electronic device D and electronic device E; however, since electronic device D cannot respond to this key combination event, electronic device D cannot realize zooming.
To address this problem, the solutions of the embodiments of this application add some processing by electronic device D of the key combination event, thereby realizing control of the zoom function of the touch screen electronic device through the key combination event. In the scenario shown in Figure 2, if user A scrolls the wheel of mouse C while pressing the ctrl key of keyboard B, electronic device D uses the solution of the embodiments of this application to process the ctrl+scroll wheel key combination event, and the interface of electronic device D can thereby be zoomed. It should be understood that, because electronic device E is the screen projection device of electronic device D, the interface of electronic device D and the projected interface of electronic device D on the screen of electronic device E are synchronized; therefore, the interface on electronic device D and its projected interface on electronic device E zoom synchronously.
In one example, electronic device D runs the Android operating system, and electronic device E runs the Windows operating system.
It can be seen from Figures 1 and 2 that, in the embodiments of this application, realizing zooming on a touch screen electronic device through keyboard and mouse control includes: zooming the interface on the touch screen electronic device through the device's external keyboard and mouse; or zooming the interface of the touch screen electronic device and its projected interface on the projection device through the keyboard and mouse of the projection device.
It should be noted that a category of laptops has emerged that combines the functions of an ordinary laptop and a tablet, that is, that can process both touch screen interaction events and keyboard/mouse interaction events; these may be called touch screen laptops. However, such laptops run the Windows operating system, and the two types of interaction events correspond to two independent processing flows. When such a touch screen laptop is operated directly, its zoom function responds directly either to the user's touch operations or to the user's keyboard and mouse operations: the user can zoom through touch operations on the screen, or through keyboard and mouse operations. This dual-flow approach cannot be used in devices like electronic device D, because forcibly applying it would require electronic device D to adopt the Windows system and would require the entire software architecture and supporting hardware architecture to be changed and redesigned. Such a transformation would effectively turn the phone into a laptop; it would be costly, complex, and unnecessary, since there is no need to forcibly transform one existing product category into another. In short, the above touch screen laptop originally has input devices for both touch operations and key-press operations, with processing apparatus and flows for both types of interaction events already designed; therefore such a touch screen laptop is not the touch screen electronic device described in the embodiments of this application, that is, electronic device D is not such a touch screen laptop.
A touch screen laptop can, however, serve as the screen projection device in the embodiments of this application; that is, electronic device E may be a touch screen laptop. If a touch screen electronic device such as a phone or tablet is projected onto a touch screen laptop, taking the phone as an example, even if the touch screen laptop's keyboard and connected mouse are used for zooming, the touch screen laptop only acts as a data carrier, and it is still the phone that actually performs the computation. Therefore, even when operating the keyboard and mouse connected to the touch screen laptop, zooming still cannot be performed directly; the solution of the embodiments of this application is still needed to first convert the keyboard and mouse key events into touch events and then respond, thereby zooming the phone interface and the phone's projected interface on the touch screen laptop.
Figure 3 is a schematic diagram of the response process of a zoom function. As shown in Figure 3, if a zoom-in touch operation is performed on interface 310, that is, two fingers slide in the direction that increases the distance between them, a zoom-in touch event can be generated; in Figure 3 the zoom-in touch event is represented by two-finger pinch #1. The electronic device processes the two-finger pinch #1 touch event and obtains interface 320. It can be seen that picture A in interface 320 is larger than picture A in interface 310, that is, picture A occupies a larger proportion of the electronic device's screen.
If a zoom-in key operation is performed on interface 310, that is, the wheel is pushed up while the ctrl key is pressed, the ctrl+scroll-up key combination event can be generated; in Figure 3 the zoom-in key combination event is represented by ctrl+scroll up. The electronic device processes the ctrl+scroll-up key combination event and likewise obtains interface 320. That is to say, both the zoom-in key combination event and the zoom-in touch event can zoom in the interface of the electronic device.
If a zoom-out touch operation is performed on interface 330, that is, two fingers slide in the direction that decreases the distance between them, a zoom-out touch event can be generated; in Figure 3 the zoom-out touch event is represented by two-finger pinch #2. The electronic device processes the two-finger pinch #2 touch event and obtains interface 340. It can be seen that picture A in interface 340 is smaller than picture A in interface 330, that is, picture A occupies a smaller proportion of the electronic device's screen.
If a zoom-out key operation is performed on interface 330, that is, the wheel is pushed down while the ctrl key is pressed, the ctrl+scroll-down key combination event can be generated; in Figure 3 the zoom-out key combination event is represented by ctrl+scroll down. The electronic device processes the ctrl+scroll-down key combination event and likewise obtains interface 340. That is to say, both the zoom-out key combination event and the zoom-out touch event can zoom out the interface of the electronic device.
Figure 3 can be regarded as an example in which the foreground application of the electronic device is a picture viewing application. Realizing the zoom function through a two-finger pinch touch event is a function the electronic device originally has, while the solution of the embodiments of this application makes it possible to realize the zoom function through the ctrl+scroll wheel key operation. The embodiments of this application mainly consider that the touch screen electronic device already has touch operations corresponding to the zoom function, so as long as the zoom key combination event is converted into a zoom touch event, the electronic device can respond to the zoom touch event. This can also be understood as establishing a mapping relationship between the key event and the touch event of the zoom function, so that when the zoom key combination event is input to the electronic device, the electronic device only needs to add a conversion step, first converting the zoom key combination event into a zoom touch event, and can then continue processing according to the original processing flow for touch events; from the user's perspective, this realizes control of the electronic device's zoom function through the keyboard and mouse. This solution only adds one conversion step to the existing touch event processing flow, so the modification is relatively small and easy to implement; there is no need to change the hardware structure or redesign a complex processing flow.
Ctrl+scroll wheel is the most commonly used key operation for the zoom function, so using this key combination event to trigger the zoom function better matches users' habits. It should be understood, however, that other key events could be designed to replace this key combination event; this would only require some adaptive changes, and users would need to be informed that the corresponding function of the other key events is zooming. For example, a key combination event that is not yet occupied could be set as the key combination operation for the zoom function.
It can be seen that Figure 3 mainly shows that zooming can be realized through a touch operation or a key combination operation, that is, it compares the zoom touch event and the zoom key combination event to show that the two can achieve the same effect. Figure 3 is mainly intended to show that, with the solution of the embodiments of this application, the zoom key combination event can achieve the same zoom effect as the two-finger pinch touch event.
Figure 4 is a schematic structural diagram of an electronic device according to an embodiment of this application. As shown in Figure 4, the electronic device 400 may include a processor 410, an external memory interface 420, an internal memory 421, a universal serial bus (USB) interface 430, a charging management module 440, a power management module 441, a battery 442, antenna 1, antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, a headset jack 470D, a sensor module 480, buttons 490, a motor 491, an indicator 492, a camera 493, a display screen 494, a subscriber identification module (SIM) card interface 495, and so on. The sensor module 480 may include a pressure sensor 480A, a gyroscope sensor 480B, an air pressure sensor 480C, a magnetic sensor 480D, an acceleration sensor 480E, a distance sensor 480F, a proximity light sensor 480G, a fingerprint sensor 480H, a temperature sensor 480J, a touch sensor 480K, an ambient light sensor 480L, a bone conduction sensor 480M, and so on.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 400. In other embodiments of this application, the electronic device 400 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Exemplarily, the processor 410 shown in Figure 4 may include one or more processing units. For example, the processor 410 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on. Different processing units may be independent devices or may be integrated into one or more processors.
The controller may be the nerve center and command center of the electronic device 400. The controller can generate operation control signals according to instruction opcodes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 410 for storing instructions and data. In some embodiments, the memory in the processor 410 is a cache, which can store instructions or data that the processor 410 has just used or uses cyclically. If the processor 410 needs to use the instructions or data again, it can call them directly from the memory, avoiding repeated access, reducing the waiting time of the processor 410, and thus improving the efficiency of the system.
In some embodiments, the processor 410 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and so on.
In some embodiments, the I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). The processor 410 may include multiple groups of I2C buses. The processor 410 may be coupled to the touch sensor 480K, a charger, a flash, the camera 493, etc. through different I2C bus interfaces. For example, the processor 410 may be coupled to the touch sensor 480K through an I2C interface, so that the processor 410 and the touch sensor 480K communicate through the I2C bus interface to realize the touch function of the electronic device 400.
In some embodiments, the I2S interface can be used for audio communication. The processor 410 may include multiple groups of I2S buses. The processor 410 may be coupled to the audio module 470 through an I2S bus to realize communication between the processor 410 and the audio module 470.
In some embodiments, the audio module 470 can transmit audio signals to the wireless communication module 460 through the I2S interface to realize the function of answering calls through a Bluetooth headset.
In some embodiments, the PCM interface can also be used for audio communication, sampling, quantizing, and encoding analog signals. The audio module 470 and the wireless communication module 460 may be coupled through a PCM bus interface.
In some embodiments, the audio module 470 can also transmit audio signals to the wireless communication module 460 through the PCM interface to realize the function of answering calls through a Bluetooth headset. It should be understood that both the I2S interface and the PCM interface can be used for audio communication.
In some embodiments, the UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel communication. The UART interface is usually used to connect the processor 410 and the wireless communication module 460. For example, the processor 410 communicates with the Bluetooth module in the wireless communication module 460 through the UART interface to realize the Bluetooth function. In some embodiments, the audio module 470 can transmit audio signals to the wireless communication module 460 through the UART interface to realize the function of playing music through a Bluetooth headset.
In some embodiments, the MIPI interface can be used to connect the processor 410 with peripheral devices such as the display screen 494 and the camera 493. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and so on. The processor 410 and the camera 493 communicate through the CSI interface to realize the photographing function of the electronic device 400. The processor 410 and the display screen 494 communicate through the DSI interface to realize the display function of the electronic device 400.
In some embodiments, the GPIO interface can be configured through software. The GPIO interface can be configured as a control signal or as a data signal. The GPIO interface can be used to connect the processor 410 with the camera 493, the display screen 494, the wireless communication module 460, the audio module 470, the sensor module 480, and so on. The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and so on.
Exemplarily, the USB interface 430 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, and so on. The USB interface 430 can be used to connect a charger to charge the electronic device 400, and can also be used to transmit data between the electronic device 400 and peripheral devices. It can also be used to connect a headset and play audio through the headset. The interface can also be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment are only schematic and do not constitute a structural limitation on the electronic device 400. In other embodiments of this application, the electronic device 400 may also adopt interface connection modes different from those in the above embodiments, or a combination of multiple interface connection modes.
The charging management module 440 is used to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 440 can receive the charging input of a wired charger through the USB interface 430. In some wireless charging embodiments, the charging management module 440 can receive wireless charging input through the wireless charging coil of the electronic device 400. While charging the battery 442, the charging management module 440 can also supply power to the electronic device through the power management module 441.
The power management module 441 is used to connect the battery 442, the charging management module 440, and the processor 410. The power management module 441 receives input from the battery 442 and/or the charging management module 440 and supplies power to the processor 410, the internal memory 421, the external memory, the display screen 494, the camera 493, the wireless communication module 460, and so on. The power management module 441 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health status (leakage, impedance). In some other embodiments, the power management module 441 may also be provided in the processor 410. In other embodiments, the power management module 441 and the charging management module 440 may also be provided in the same device.
The wireless communication function of the electronic device 400 can be realized through antenna 1, antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 400 can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, antenna 1 can be multiplexed as a diversity antenna for a wireless local area network. In some other embodiments, the antennas can be used in combination with tuning switches.
The mobile communication module 450 can provide wireless communication solutions applied to the electronic device 400, for example at least one of the following: a second generation (2G) mobile communication solution, a third generation (3G) mobile communication solution, a fourth generation (4G) mobile communication solution, or a fifth generation (5G) mobile communication solution. The mobile communication module 450 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and so on. The mobile communication module 450 can receive electromagnetic waves through antenna 1, filter and amplify the received electromagnetic waves, and then transmit them to the modem processor for demodulation. The mobile communication module 450 can also amplify the signal modulated by the modem processor, which is then converted into electromagnetic waves and radiated through antenna 1. In some embodiments, at least some functional modules of the mobile communication module 450 may be provided in the processor 410. In some embodiments, at least some functional modules of the mobile communication module 450 may be provided in the same device as at least some modules of the processor 410.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium/high-frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal, and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 470A, the receiver 470B, etc.) or displays images or videos through the display screen 494. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 410 and provided in the same device as the mobile communication module 450 or other functional modules.
The wireless communication module 460 can provide wireless communication solutions applied to the electronic device 400, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on. The wireless communication module 460 may be one or more devices integrating at least one communication processing module. The wireless communication module 460 receives electromagnetic waves via antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 410. The wireless communication module 460 can also receive signals to be sent from the processor 410, perform frequency modulation and amplification on them, and convert them into electromagnetic waves for radiation through antenna 2.
In some embodiments, antenna 1 of the electronic device 400 is coupled to the mobile communication module 450, and antenna 2 is coupled to the wireless communication module 460, so that the electronic device 400 can communicate with networks and other electronic devices through wireless communication technologies. The wireless communication technologies may include at least one of the following: global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and IR. The GNSS may include at least one of the following positioning technologies: global positioning system (GPS), global navigation satellite system (GLONASS), BeiDou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and satellite based augmentation systems (SBAS).
The electronic device 400 realizes the display function through the GPU, the display screen 494, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display screen 494 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 410 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 494 is used to display images, videos, and the like. The display screen 494 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flex light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and so on. In some embodiments, the electronic device 400 may include 1 or N display screens 494, where N is a positive integer greater than 1.
The electronic device 400 can realize the photographing function through the ISP, the camera 493, the video codec, the GPU, the display screen 494, the application processor, and so on.
The ISP is used to process the data fed back by the camera 493. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithmic optimization on the image's noise, brightness, and skin tone, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 493.
The camera 493 is used to capture still images or videos. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then transmits it to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 400 may include 1 or N cameras 493, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 400 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.
The video codec is used to compress or decompress digital video. The electronic device 400 may support one or more video codecs, so that the electronic device 400 can play or record videos in multiple encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transmission mode between neurons in the human brain, it processes input information quickly and can also learn continuously. Applications such as intelligent cognition of the electronic device 400 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and so on.
The external memory interface 420 can be used to connect an external memory card, such as a secure digital (SD) card, to expand the storage capacity of the electronic device 400. The external memory card communicates with the processor 410 through the external memory interface 420 to realize the data storage function, for example saving files such as music and videos in the external memory card.
The internal memory 421 can be used to store computer-executable program code, which includes instructions. By running the instructions stored in the internal memory 421, the processor 410 executes various functional applications and data processing of the electronic device 400. The internal memory 421 may include a program storage area and a data storage area. The program storage area can store the operating system and the applications required by at least one function (such as a sound playback function, an image playback function, and the like). The data storage area can store data created during the use of the electronic device 400 (such as audio data, a phone book, and the like). In addition, the internal memory 421 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and so on.
The electronic device 400 can realize audio functions, such as music playback and recording, through the audio module 470, the speaker 470A, the receiver 470B, the microphone 470C, the headset jack 470D, the application processor, and so on.
The audio module 470 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 470 can also be used to encode and decode audio signals. In some embodiments, the audio module 470 may be provided in the processor 410, or some functional modules of the audio module 470 may be provided in the processor 410.
The speaker 470A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals. The electronic device 400 can be used to listen to music or hands-free calls through the speaker 470A.
The receiver 470B, also called an "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device 400 answers a call or a voice message, the receiver 470B can be placed close to the ear to listen.
The microphone 470C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone 470C to input the sound signal. The electronic device 400 may be provided with at least one microphone 470C. In other embodiments, the electronic device 400 may be provided with two microphones 470C, which, in addition to collecting sound signals, can also realize a noise reduction function. In other embodiments, the electronic device 400 may be provided with three, four, or more microphones 470C to collect sound signals, reduce noise, identify sound sources, realize directional recording, and so on.
The headset jack 470D is used to connect a wired headset. The headset jack 470D may be the USB interface 430, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 480A is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 480A may be provided on the display screen 494. There are many types of pressure sensors 480A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material; when a force acts on the pressure sensor 480A, the capacitance between the electrodes changes, and the electronic device 400 determines the strength of the pressure according to the change in capacitance. When a touch operation acts on the display screen 494, the electronic device 400 detects the strength of the touch operation according to the pressure sensor 480A. The electronic device 400 can also calculate the touch position according to the detection signal of the pressure sensor 480A. In some embodiments, touch operations acting on the same touch position but with different touch operation strengths may correspond to different operation instructions. For example, when a touch operation whose strength is less than a first pressure threshold acts on a short message application icon, an instruction to view the short message is executed; when a touch operation whose strength is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyroscope sensor 480B can be used to determine the motion posture of the electronic device 400. In some embodiments, the angular velocities of the electronic device 400 around three axes (namely the x, y, and z axes) can be determined through the gyroscope sensor 480B. The gyroscope sensor 480B can be used for image stabilization during photographing. Exemplarily, when the shutter is pressed, the gyroscope sensor 480B detects the shaking angle of the electronic device 400, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shaking of the electronic device 400 through reverse motion to realize image stabilization. The gyroscope sensor 480B can also be used in navigation and motion-sensing game scenarios.
The air pressure sensor 480C is used to measure air pressure. In some embodiments, the electronic device 400 calculates altitude from the air pressure value measured by the air pressure sensor 480C to assist positioning and navigation.
The magnetic sensor 480D includes a Hall sensor. The electronic device 400 can use the magnetic sensor 480D to detect the opening and closing of a flip leather case. In some embodiments, when the electronic device 400 is a flip phone, the electronic device 400 can detect the opening and closing of the flip cover according to the magnetic sensor 480D, and set features such as automatic unlocking on flip-open according to the detected opening/closing state of the leather case or the flip cover.
The acceleration sensor 480E can detect the magnitude of the acceleration of the electronic device 400 in various directions (generally three axes). When the electronic device 400 is stationary, it can detect the magnitude and direction of gravity. It can also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 480F is used to measure distance. The electronic device 400 can measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 400 can use the distance sensor 480F to measure distance to achieve fast focusing.
The proximity light sensor 480G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 400 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 400; when insufficient reflected light is detected, the electronic device 400 can determine that there is no object nearby. The electronic device 400 can use the proximity light sensor 480G to detect that the user is holding the electronic device 400 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 480G can also be used in leather case mode and pocket mode for automatic unlocking and screen locking.
The ambient light sensor 480L is used to sense ambient light brightness. The electronic device 400 can adaptively adjust the brightness of the display screen 494 according to the sensed ambient light brightness. The ambient light sensor 480L can also be used to automatically adjust the white balance when taking photos, and can cooperate with the proximity light sensor 480G to detect whether the electronic device 400 is in a pocket to prevent accidental touches.
The fingerprint sensor 480H is used to collect fingerprints. The electronic device 400 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to application locks, fingerprint photographing, fingerprint answering of incoming calls, and so on.
The temperature sensor 480J is used to detect temperature. In some embodiments, the electronic device 400 uses the temperature detected by the temperature sensor 480J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 480J exceeds a threshold, the electronic device 400 reduces the performance of a processor located near the temperature sensor 480J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 400 heats the battery 442 to prevent the low temperature from causing the electronic device 400 to shut down abnormally. In some other embodiments, when the temperature is below yet another threshold, the electronic device 400 boosts the output voltage of the battery 442 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 480K is also called a "touch panel". The touch sensor 480K may be provided on the display screen 494; the touch sensor 480K and the display screen 494 form a touch screen, also called a "touch control screen". The touch sensor 480K is used to detect touch operations acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 494. In other embodiments, the touch sensor 480K may also be provided on the surface of the electronic device 400 at a position different from that of the display screen 494.
The bone conduction sensor 480M can acquire vibration signals. In some embodiments, the bone conduction sensor 480M can acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 480M can also contact the human pulse and receive blood pressure beat signals. In some embodiments, the bone conduction sensor 480M may also be provided in a headset, combined into a bone conduction headset. The audio module 470 can parse out a voice signal based on the vibration signal of the vocal-part vibrating bone mass acquired by the bone conduction sensor 480M to realize the voice function. The application processor can parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 480M to realize the heart rate detection function.
按键490包括开机键,音量键等。按键490可以是机械按键。也可以是触摸式按键。电子设备400可以接收按键输入,产生与电子设备400的用户设置以及功能控制有关的键信号输入。
马达491可以产生振动提示。马达491可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏494不同区域的触摸操作,马达491也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器492可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口495用于连接SIM卡。SIM卡可以通过插入SIM卡接口495,或从SIM卡接口495拔出,实现和电子设备400的接触和分离。电子设备400可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口495可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口495可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口495也可以兼容不同类型的SIM卡。SIM卡接口495也可以兼容外部存储卡。电子设备400通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备400采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备400中,不能和电子设备400分离。
The software system of the electronic device 400 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device. The software structure of the electronic device is described below by way of example with reference to FIG. 5, taking an electronic device running a layered-architecture Android system as an example.
FIG. 5 is a block diagram of the software structure of an electronic device according to an embodiment of this application. The electronic device in FIG. 5 may be the electronic device 400 shown in FIG. 4. As shown in FIG. 5, the layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which from top to bottom are the application layer, the application framework layer, the system library layer, and the kernel layer. The application layer may include a series of application packages. It should be understood, however, that FIG. 5 is only one example of layering the Android system; other layering approaches may exist in practice. In addition, other naming conventions may exist for the modules in each layer, without limitation. For example, the application framework layer may be called the framework layer, or the system library layer may be merged into the application framework layer.
As shown in FIG. 5, the application packages may include applications (apps) such as Messages, Calendar, Camera, Video, Navigation, Phone, and Gallery. For brevity, an application program is hereinafter referred to simply as an application. A running application may also be called a foreground application or front-end application.
Some applications support the zoom function, for example Messages, Gallery, or Navigation, while others do not, for example Settings or Remote Control; this may differ across electronic devices from different vendors. In short, applications can be divided into those that support the zoom function and those that do not. In the embodiments of this application, key-combination control can be implemented for applications that support the zoom function.
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions.
As shown in FIG. 5, the application framework layer may include a window manager, a resource manager, a notification manager, an input manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
The resource manager provides applications with various resources, such as localized strings, icons, images, layout files, and video files.
The notification manager enables applications to display notification information in the status bar and can be used to convey informational messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, and the like. The notification manager may also present notifications in the system top status bar in the form of charts or scrolling text, such as notifications from applications running in the background, or notifications on the screen in the form of a dialog window. Examples include text prompts in the status bar, a prompt sound, vibration of the electronic device, or blinking of the indicator light.
The input manager (InputManagerService) is mainly used to manage the input part of the whole system, including the keyboard, mouse, touchscreen, and the like.
The system library layer may include the Android runtime and system libraries. The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functional functions that the java language needs to call, and the other is the Android core library.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example a surface manager, media libraries, a 3D graphics processing library (e.g., OpenGL ES), and an input service module.
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of multiple audio formats, playback and recording of multiple video formats, and static image files. The media libraries can support multiple audio and video encoding formats, such as MPEG4, H.264, moving picture experts group audio layer III (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).
The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, layer processing, and the like.
The input service module may include InputReader and InputDispatcher. InputReader reads input events and reports them to InputDispatcher, and InputDispatcher distributes the input events.
The kernel layer is the layer between hardware and software; it contains at least the sensor driver, the camera driver, and the display driver.
It should be noted that FIG. 4 above is one possible structural diagram of an electronic device, and FIG. 5 is one possible software architecture diagram. Any of the touchscreen electronic devices (electronic device D) shown in FIGS. 1 to 3 may have the structure described in FIG. 4 and the software structure shown in FIG. 5.
FIG. 6 is a schematic diagram of the interaction event processing procedure in the first type of interaction scenario according to an embodiment of this application. The processing procedure shown in FIG. 6 can be applied to the interaction scenario shown in FIG. 1 and can be executed by the above-mentioned electronic device D or electronic device 400.
As shown in FIG. 6, the application layer includes multiple applications such as app1, app2, app3, and app4. The application running in the foreground may be called the foreground application. When the user inputs the ctrl+scroll-wheel key combination through the keyboard and mouse externally connected to the touchscreen electronic device, the corresponding driver in the kernel layer captures the key combination event and reports it to InputReader in the native framework layer; InputReader reads the key combination event and reports it to InputDispatcher in the native framework layer, which can then distribute the key combination event. In the solution of this embodiment, InputDispatcher first distributes the key combination event to InputFilter in InputManagerService in the java framework layer; InputFilter filters the key combination event before event distribution and injects it into InputDispatcher to form a corresponding touch event. InputDispatcher then distributes this touch event to the view framework, which sends it to the foreground application. Because the touchscreen electronic device (electronic device D or electronic device 400) can respond to this touch event, the interface of the foreground application can be zoomed. Specifically, in this process InputDispatcher distributes the event to the application process, and the view framework running in the application process further distributes the event along the view tree of the current interface layout, delivering it to the corresponding application control.
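By way of a non-limiting illustration, the Java sketch below shows the shape of such a filter hook. The class and its method signatures are simplified stand-ins for the platform's internal InputFilter mechanism rather than the actual AOSP API, and the injection and pass-through primitives are assumed placeholders.

```java
import android.view.InputEvent;
import android.view.MotionEvent;

/**
 * Simplified stand-in for the platform's internal input-filter hook;
 * names and signatures are illustrative, not the AOSP API.
 */
abstract class ZoomInputFilter {
    /** True if the event is the ctrl+scroll zoom combination (see S801/S802 below). */
    abstract boolean isZoomCombo(MotionEvent event);

    /** Synthesizes the equivalent pinch gesture (conversion detailed below). */
    abstract MotionEvent buildZoomTouchEvent(MotionEvent ctrlScroll);

    /** Re-injects a synthesized event into the dispatch pipeline (placeholder). */
    abstract void inject(MotionEvent touchEvent);

    /** Forwards an event unchanged to normal dispatch (placeholder). */
    abstract void passThrough(InputEvent event);

    /** Called before InputDispatcher distributes an event. */
    final void onInputEvent(InputEvent event) {
        if (event instanceof MotionEvent && isZoomCombo((MotionEvent) event)) {
            // Consume the key combination and inject the equivalent pinch instead.
            inject(buildZoomTouchEvent((MotionEvent) event));
            return;
        }
        passThrough(event);
    }
}
```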
In some cases, the native framework layer can be regarded as an example of the system library layer shown in FIG. 5, and the java framework layer as an example of the application framework layer shown in FIG. 5. It should be understood, however, that since the way of layering the Android system is not unique, the layer division shown in FIG. 5 need not be followed exactly. For example, in other cases, the native framework layer and the java framework layer may be collectively referred to as the framework layer.
It should be noted that in a conventional solution, InputDispatcher would distribute the key combination event directly to the view framework, which would then send it to the foreground application. However, since the touchscreen electronic device (electronic device D or electronic device 400) cannot respond to the key combination event, i.e., cannot process it, the interface of the foreground application cannot be zoomed. In contrast, in the solution of this embodiment the key combination event is first converted into a touch event, and the touch event is then distributed to the view framework, thereby implementing the zoom function.
FIG. 7 is a schematic diagram of the interaction event processing procedure in the second type of interaction scenario according to an embodiment of this application. The processing procedure shown in FIG. 7 can be applied to the interaction scenario shown in FIG. 2 and can be executed by the above-mentioned electronic device D or electronic device 400.
As shown in FIG. 7, the application layer includes multiple applications such as app1 and app2. Device E can be regarded as an example of the electronic device E shown in FIG. 2. When the user inputs the ctrl+scroll-wheel key combination through device E's own keyboard or an externally connected keyboard and mouse, device E sends the key combination event over the communication channel between device E and the touchscreen electronic device (electronic device D or electronic device 400); the event is received by the collaborative office software on the touchscreen electronic device, which sends it to InputManagerService. InputManagerService injects this key combination event into InputDispatcher to form a corresponding touch event, InputDispatcher then distributes this touch event to the view framework, and the view framework sends it to the foreground application. Since the touchscreen electronic device (electronic device D or electronic device 400) can respond to this touch event, the interface of the foreground application can be zoomed. For the response process to the touch event, reference may be made to the related description of FIG. 6, which is not repeated here.
It should be noted that in a conventional solution, InputManagerService would not inject this key combination event into InputDispatcher to form a corresponding touch event; it would send the key combination event directly to InputDispatcher, which would distribute it to the view framework and on to the foreground application. However, since the touchscreen electronic device (electronic device D or electronic device 400) cannot respond to the key combination event, i.e., cannot process it, the interface of the foreground application cannot be zoomed. In this embodiment, by contrast, the key combination event is first converted into a touch event, and the touch event is then distributed to the view framework, thereby implementing the zoom function.
FIG. 8 is a schematic flowchart of a method for processing interaction events according to an embodiment of this application. The method can be applied to the above-mentioned electronic device D or electronic device 400. The steps of FIG. 8 are described below.
S801. Obtain a key combination event, the key combination event being an interaction event used to trigger the zoom function.
In one implementation, the key combination event can be obtained through a hardware driver or through collaborative office software.
In the first type of interaction scenario, step S801 may consist of the hardware driver capturing the control key and scroll wheel operations, generating the key combination event, and reporting it to InputReader.
In the second type of interaction scenario, step S801 may consist of obtaining the key combination event through the collaborative office software. For example, the hardware driver of the screen-casting device may capture the control key and scroll wheel operations and generate the key combination event, which is transmitted to the touchscreen electronic device over the communication path between the screen-casting device and the touchscreen electronic device; the touchscreen electronic device can then obtain the key combination event through the collaborative office software.
In one implementation, the key combination event is an event composed of the control key (ctrl key) and the scroll wheel; that is, the key combination event is a combined event of a control key event and a mouse wheel event. Optionally, the key combination event may be ctrl+scroll, where ctrl+scroll may include "ctrl+scroll up" and "ctrl+scroll down".
As described above, since the ctrl key is a control key, simply pressing it does not cause the controlled device to perform any operation. In the ctrl+scroll key combination event, ctrl can therefore be regarded as a state value of the scroll wheel event; in other words, the ctrl+scroll key combination event can be regarded as a scroll wheel event carrying the ctrl state value.
In an actual scenario, when a scroll wheel event occurs and it is determined that the ctrl key is currently pressed, the current event state can be obtained from the current scroll event object (motionevent) through the interface getMetaState(). The event state is expressed as an integer (int) value in which the 0x1000 bit represents the ctrl key state: when this bit is 1, the ctrl key is pressed; when it is 0, the ctrl key is not pressed and the event is a pure scroll.
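As a minimal sketch of this check, assuming a standard Android MotionEvent (KeyEvent.META_CTRL_ON is the platform constant whose value is the 0x1000 bit referred to above):

```java
import android.view.KeyEvent;
import android.view.MotionEvent;

final class CtrlScrollDetector {
    /**
     * Returns true if the given motion event is a scroll-wheel event
     * carrying the ctrl state bit (0x1000) in its meta state.
     */
    static boolean isCtrlScroll(MotionEvent event) {
        if (event.getActionMasked() != MotionEvent.ACTION_SCROLL) {
            return false; // not a wheel event at all
        }
        // KeyEvent.META_CTRL_ON == 0x1000: the ctrl bit of getMetaState()
        return (event.getMetaState() & KeyEvent.META_CTRL_ON) != 0;
    }
}
```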
S802. Convert the key combination event into a zoom touch event.
In one implementation, the parameters of the key combination event can be converted into the corresponding parameters in the zoom touch event.
Optionally, the conversion can be achieved by injecting touch events for a period of time. That is, injection of touch events can begin when the key combination event is received and end once touch events equivalent to a zoom gesture on the touchscreen have been generated, thereby producing the zoom touch event.
In the first type of interaction scenario, step S802 may consist of InputDispatcher distributing the key combination event to InputFilter, which filters the key combination event before event distribution and injects it into InputDispatcher to form the corresponding touch event.
In the second type of interaction scenario, step S802 may consist of InputManagerService injecting the key combination event into InputDispatcher to form the corresponding touch event.
A set of touch events requires a certain duration: within that duration, it starts with a down event and ends with an up event, which can be understood as the time period from press to release. During this process, a move event can be injected at every preset interval, thereby generating a set of touch events.
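A minimal sketch of this down → periodic move → up sequence is given below; the interval and duration constants and the inject() primitive are assumptions standing in for whatever injection path the platform integration provides:

```java
import android.os.Handler;
import android.os.Looper;
import android.view.MotionEvent;

abstract class GestureInjector {
    private static final long MOVE_INTERVAL_MS = 10;     // assumed preset interval
    private static final long GESTURE_DURATION_MS = 150; // assumed total duration

    private final Handler handler = new Handler(Looper.getMainLooper());

    /** Platform-specific injection primitive (placeholder). */
    abstract void inject(MotionEvent event);

    /** Builds the event at elapsed time t (coordinates computed elsewhere). */
    abstract MotionEvent buildEvent(int action, long elapsedMs);

    void run() {
        inject(buildEvent(MotionEvent.ACTION_DOWN, 0)); // gesture starts
        for (long t = MOVE_INTERVAL_MS; t < GESTURE_DURATION_MS; t += MOVE_INTERVAL_MS) {
            final long elapsed = t;
            handler.postDelayed(
                    () -> inject(buildEvent(MotionEvent.ACTION_MOVE, elapsed)), t);
        }
        handler.postDelayed(
                () -> inject(buildEvent(MotionEvent.ACTION_UP, GESTURE_DURATION_MS)),
                GESTURE_DURATION_MS);                    // gesture ends
    }
}
```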
A set of touch events usually includes the press position and the distance and direction of the movement (mostly a slide). For the two-finger pinch in FIG. 3, for example, this includes the coordinates at which the two fingers press on the screen, as well as the sliding distance and direction of the two fingers during the pinch. It can also be described as the change in distance between the two fingers, i.e., the change in the pressed coordinates of the two fingers, whose sign corresponds to zooming in or out. For example, in a two-finger pinch event, if the detected change in the pressed coordinates within the preset time interval is +Δ, this corresponds to zooming in, with the magnification positively correlated with Δ; if the change is -Δ, this corresponds to zooming out, with the reduction positively correlated with Δ.
For ease of understanding, the parameters of the zoom touch event (two-finger pinch event) can be regarded as including a pinch distance and a pinch direction, where the pinch direction corresponds to zooming in or out and the pinch distance corresponds to the zoom ratio.
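As an illustrative sketch of this interpretation (the sensitivity constant is an assumption, not taken from the original):

```java
final class PinchInterpreter {
    private static final float SCALE_PER_PX = 0.002f; // assumed sensitivity

    /**
     * Maps the change in distance between the two pointers (deltaPx,
     * positive = fingers moving apart, i.e. +Δ above) to a zoom factor:
     * > 1 zooms in, < 1 zooms out, in proportion to |deltaPx|.
     */
    static float zoomFactor(float deltaPx) {
        return Math.max(0.1f, 1f + deltaPx * SCALE_PER_PX);
    }
}
```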
A mouse wheel scroll event, by contrast, is a motion event that contains the coordinate information of the current mouse cursor. The cursor's coordinate information can therefore be put in correspondence with the sliding of the two fingers described above.
For ease of understanding, the parameters of a scroll wheel event can be regarded as including a scroll direction and a scroll amount. The scroll direction includes, for example, up and down. The scroll amount can be understood as the amount by which the user turns the wheel, which can be expressed, for example, in notches: how many notches the wheel has scrolled up or down. The scroll direction corresponds to zooming in or out; in customary operation, scrolling up usually corresponds to zooming in and scrolling down to zooming out. The scroll amount corresponds to the zoom ratio.
For a pure scroll wheel event, the scroll direction corresponds to the page-turning direction; in customary operation, scrolling up usually corresponds to paging up and scrolling down to paging down, and the scroll amount corresponds to how far the page is turned up or down. The difference between a pure scroll event and the control-key-plus-wheel key combination event lies only in the value of the control-key state bit carried in the scroll event, which does not affect the scroll direction and scroll amount parameters, as described above and not repeated here.
It can therefore be seen that a mapping can be established between the parameters of the key combination event and the parameters of the zoom touch event, thereby achieving the goal of converting the key combination event into a touch event.
The parameters of the zoom touch event include the pinch direction and pinch distance, and the parameters of the key combination event include the scroll direction and scroll amount. There are two pinch directions, corresponding respectively to zooming in and zooming out; there are likewise two scroll directions, also corresponding respectively to zooming in and zooming out, so the pinch directions and scroll directions can be put in one-to-one correspondence. The pinch distance corresponds to the zoom ratio, and the scroll amount also corresponds to the zoom ratio, so the pinch distance can be put in correspondence with the scroll amount.
In one implementation, the key combination event is a combined event of a control key event and a mouse wheel event; the parameters of the key combination event include the scroll direction and scroll amount of the wheel; the parameters of the zoom touch event include the pinch direction and pinch distance. Converting the parameters of the key combination event into the corresponding parameters in the zoom touch event may include: converting the scroll amount into the pinch distance and converting the scroll direction into the pinch direction, as sketched below.
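A non-limiting sketch of this parameter mapping follows; the pixels-per-notch conversion factor is an assumption:

```java
final class ScrollToPinchMapper {
    private static final float PX_PER_NOTCH = 40f; // assumed conversion factor

    /** Pinch parameters derived from a ctrl+scroll event. */
    static final class PinchParams {
        final boolean zoomIn;     // pinch direction: true = fingers move apart
        final float distancePx;   // pinch distance driving the zoom ratio
        PinchParams(boolean zoomIn, float distancePx) {
            this.zoomIn = zoomIn;
            this.distancePx = distancePx;
        }
    }

    /**
     * scrollNotches: signed wheel movement, positive = scroll up.
     * Scroll direction -> pinch direction; scroll amount -> pinch distance.
     */
    static PinchParams map(float scrollNotches) {
        boolean zoomIn = scrollNotches > 0; // up = zoom in (customary mapping)
        float distance = Math.abs(scrollNotches) * PX_PER_NOTCH;
        return new PinchParams(zoomIn, distance);
    }
}
```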
In one implementation, the above injection process, i.e., the step of converting the key combination event into a zoom touch event, may include:
obtaining the scroll direction and scroll amount of the wheel; and
injecting the scroll event, by way of injection, to form the zoom touch event, where the zoom flag of the zoom touch event (corresponding to the pinch direction) corresponds to the wheel direction, and the zoom ratio of the zoom touch event (corresponding to the pinch distance) corresponds to the scroll amount. The zoom flag indicates whether this interaction event is a zoom-in or a zoom-out; for example, positive and negative values may represent zoom in and zoom out respectively.
In one example, the coordinates of the down events at which the two fingers start the pinch are set to two points symmetric about the cursor position as center, the distance from the center along the x axis and the y axis each being a preset value, for example 400 dp. The coordinates of the injected up event are the starting coordinates offset by 100 dp in each of the x and y directions, and the x and y coordinate offsets of each move event are computed as concrete coordinate values from the elapsed time of the injection process. Whether the gesture zooms in or out is decided by the direction of the wheel event, which also decides whether the x and y coordinate computations use positive or negative values. The coordinates of the injected touch events need to be limited with respect to the screen boundary.
dp (density-independent pixel) is a length unit used in Android development; 1 dp represents the length of 1 pixel on a screen with a pixel density of 160 dpi.
In one implementation, the injected event coordinates are limited according to the screen boundary.
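Under the stated assumptions (400 dp start offset, 100 dp per-axis travel, sign chosen by the wheel direction, coordinates clamped to the screen), a minimal sketch of the coordinate computation might look as follows:

```java
import android.util.DisplayMetrics;

final class PinchCoords {
    private static final float START_OFFSET_DP = 400f; // preset start offset
    private static final float TRAVEL_DP = 100f;       // total per-axis travel

    /** dp -> px: 1 dp is 1 px at 160 dpi, so px = dp * (densityDpi / 160). */
    static float dpToPx(float dp, DisplayMetrics metrics) {
        return dp * metrics.densityDpi / 160f;
    }

    /**
     * Position of one pointer at `progress` in [0, 1] of the injected gesture.
     * sign = +1 for the pointer above/right of the cursor, -1 for the other;
     * zoomIn = true moves the pointers apart, false moves them together.
     * Coordinates are clamped to the screen boundary.
     */
    static float[] pointerAt(float cursorX, float cursorY, int sign,
                             boolean zoomIn, float progress,
                             DisplayMetrics metrics) {
        float start = dpToPx(START_OFFSET_DP, metrics);
        float travel = dpToPx(TRAVEL_DP, metrics) * progress;
        float offset = zoomIn ? start + travel : start - travel;
        float x = clamp(cursorX + sign * offset, 0, metrics.widthPixels - 1);
        float y = clamp(cursorY + sign * offset, 0, metrics.heightPixels - 1);
        return new float[] {x, y};
    }

    private static float clamp(float v, float lo, float hi) {
        return Math.max(lo, Math.min(hi, v));
    }
}
```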
In one implementation, step S802 may further include:
determining whether the conversion of the previous round's key combination event is still in progress;
when the conversion of the previous round's key combination event is still in progress, discarding the key combination event; or,
when the conversion of the previous round's key combination event is not in progress, converting the key combination event into the zoom touch event.
This implementation can improve the accuracy of handling key combination events triggered in rapid succession; a sketch of such a guard follows.
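A minimal sketch of such a single-flight guard, assuming the conversion runs asynchronously and signals completion after the final up event is injected:

```java
import java.util.concurrent.atomic.AtomicBoolean;

final class InjectionGuard {
    private final AtomicBoolean injecting = new AtomicBoolean(false);

    /** Returns true if this round may start; false means: discard the event. */
    boolean tryBegin() {
        return injecting.compareAndSet(false, true);
    }

    /** Call once the injected up event of this round has been delivered. */
    void end() {
        injecting.set(false);
    }
}
```

In use, the converter would call tryBegin() for each arriving key combination event, drop the event when it returns false, and call end() after injecting the final up event of the synthesized gesture.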
S803. In response to the zoom touch event, zoom the interface.
Optionally, the zoom touch event can be responded to through the view, thereby zooming the interface.
The interface may be the interface of the touchscreen electronic device, or the screen-cast interface of the touchscreen electronic device, corresponding respectively to the two types of interaction scenarios above.
In one implementation, the above method may further include: determining whether the key combination event is a zoom key combination event;
when the key combination event is a zoom key combination event, executing the step of converting the key combination event into the zoom touch event; or,
when the key combination event is not a zoom key combination event, ending the processing flow of the interaction event.
In other words, it is determined whether the input key combination event is the key combination event corresponding to the zoom function, and step S802 is executed when it is. This adds a discrimination operation before step S802: when the input interaction event is not the expected key combination event, no subsequent operation is executed; put differently, the conversion and response steps continue only when the input interaction event is the right key combination event. This implementation can screen out inputs consisting of pure control key events, and avoid mistakenly responding to a pure scroll event as a zoom key combination.
In one implementation, the above method may further include: determining whether the foreground application supports zooming, and executing step S803 when it does. That is, a discrimination operation is added: since some applications support zooming and some do not, the solution of this embodiment can be applied only to applications that support zooming. It should be understood, however, that this determination may be performed at any step node before step S803; for example, it may be performed before or after step S801, or before or after step S802, without limitation. For example, suppose the determination of whether the foreground application supports zooming is performed before step S801: if the result is no, steps S801-S803 are not executed and this round of processing is skipped; if the result is yes, steps S801-S803 are executed, implementing zooming under keyboard and mouse control. Suppose the determination is performed after step S801 and before step S802: if the result is no, steps S802 and S803 are not executed; if the result is yes, steps S802 and S803 continue to be executed, implementing zooming under keyboard and mouse control. Other cases are not enumerated one by one.
In one example, the above process of determining whether the foreground application supports zooming and executing step S803 when it does may include:
determining whether the foreground application supports zooming;
when the foreground application supports zooming, step S803 includes: in response to the zoom touch event, zooming the running interface of the foreground application; or,
when the foreground application does not support zooming, ending the processing flow of the interaction event.
In one example, an application whitelist can be set, where every application in the whitelist supports the zoom function.
Optionally, the above step of determining whether the foreground application supports zooming can be performed by InputManagerService.
Optionally, determining whether the foreground application supports zooming may include (see the sketch after this list):
comparing information of the foreground application with preconfigured information of whitelisted applications to determine whether the foreground application is an application in the whitelist, where the whitelist includes at least one application that supports the zoom function;
when the foreground application is an application in the whitelist, considering that the foreground application supports zooming; or,
when the foreground application is not an application in the whitelist, considering that the foreground application does not support zooming.
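By way of illustration, such a whitelist lookup might be sketched as follows; the package names are hypothetical examples:

```java
import java.util.Set;

final class ZoomWhitelist {
    // Preconfigured whitelist of zoom-capable apps (hypothetical entries).
    private static final Set<String> WHITELIST = Set.of(
            "com.example.gallery",
            "com.example.messages",
            "com.example.maps");

    /** foregroundPackage: package name of the current foreground app. */
    static boolean supportsZoom(String foregroundPackage) {
        return WHITELIST.contains(foregroundPackage);
    }
}
```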
In another implementation, the above method further includes: discarding the key combination event after the zoom touch event has been generated. That is, the key combination event can be discarded after step S802 has been executed; the discarded event needs no further processing.
The method shown in FIG. 8 works mainly by converting a key combination event that cannot be responded to into a zoom touch event that can be responded to, and then responding to it, thereby achieving the effect of controlling the zoom function of a touchscreen electronic device with a keyboard and mouse. The solution is easy to implement and does not require many changes.
The main consideration in this embodiment is that the touchscreen electronic device already has touch operations corresponding to the zoom function, so as long as the zoom key combination event is converted into a zoom touch event, the electronic device can respond to the zoom touch event. This can also be understood as establishing a mapping between the key event and the touch event of the zoom function: when a zoom key combination event is input to the electronic device, the electronic device only needs to add a conversion step, first converting the zoom key combination event into a zoom touch event, and can then continue processing according to the existing processing flow for touch events. From the user's perspective, this realizes control of the electronic device's zoom function through the keyboard and mouse. This solution adds just one conversion step on top of the existing touch event processing flow, so the modification effort is relatively small and it is easy to implement; there is no need to change the hardware structure or to redesign a complex processing flow.
FIG. 9 is a schematic flowchart of another method for processing interaction events according to an embodiment of this application. FIG. 9 can be regarded as an example of the method shown in FIG. 8.
S901. Obtain a key combination event.
Step S901 can be regarded as an example of step S801.
S902. Determine whether the key combination event is a ctrl+scroll event; when the result is yes, execute step S903; when the result is no, execute step S904.
S903. Determine whether the foreground application supports zooming; when the result is yes, execute step S905; when the result is no, execute step S904.
S904. Report the key combination event to the foreground application.
Step S904 may consist of InputDispatcher reporting the key combination event to the view, which then sends it to the foreground application. Suppose what is actually obtained in step S901 is only a control key event, i.e., a ctrl event; then the ctrl event is reported directly to the foreground application, and the subsequent steps implementing the zoom function are not executed. Suppose what is actually obtained in step S901 is only a scroll event; then the scroll event is reported directly to the foreground application, the subsequent steps implementing the zoom function are not executed, and the foreground application may respond to the scroll event by turning the page.
Steps S902 and S904 mainly perform a pre-check on the key combination event: the subsequent steps continue only when the key combination event really is the key combination event of the zoom function; otherwise the key combination event is reported directly to the foreground application.
Steps S903 and S904 mainly perform a pre-check on the foreground application: the subsequent steps continue only when the foreground application supports the zoom function; otherwise the key combination event is reported directly to the foreground application.
It should be understood that there is no limitation on the order of steps S902 and S903. Step S903 may also be executed before, after, or during step S901, without limitation.
S905. Determine whether the previous round of injection is still in progress; when the result is yes, execute step S906; when the result is no, execute step S907.
S906. Discard the key combination event.
S907. Inject for a period of time, thereby generating the zoom touch event.
Step S907 can be regarded as an example of step S802.
Steps S905-S907 mainly ensure that, for consecutive key combination operations, each round of injection makes the generated zoom touch event equivalent to an actual two-finger pinch event.
S908. Report the zoom touch event to the foreground application.
Optionally, when step S907 has been executed, in addition to executing step S908, the step of discarding the key combination event may also be executed.
FIG. 10 is a schematic flowchart of yet another method for processing interaction events according to an embodiment of this application. FIG. 10 can be regarded as an example of the method shown in FIG. 8 or FIG. 9.
S1001. The hardware driver reports the key combination event to InputReader.
Step S1001 can be regarded as an example of step S801 or S901.
S1002. InputReader reports the key combination event to InputDispatcher.
S1003. InputDispatcher reports the key combination event to InputFilter in InputManagerService, and InputFilter performs filtering before event distribution.
S1004. InputFilter determines whether the key combination event is a zoom key combination event.
Step S1004 can be regarded as an example of S902.
S1005. InputFilter determines whether the foreground application supports zooming.
Step S1005 can be regarded as an example of S903.
S1006. InputFilter injects the zoom touch event into InputDispatcher.
Step S1006 can be regarded as an example of step S802 or S907.
S1007. InputDispatcher sends the zoom touch event to the foreground application.
Step S1007 can be regarded as an example of step S908.
Step S1007 may consist of InputDispatcher reporting the zoom touch event to the view, which then sends it to the foreground application.
It can be seen that the method shown in FIG. 10 is introduced mainly using the first type of interaction scenario as an example, and that it is a modification made on the basis of the existing interaction event processing framework and flow, adding steps S1003-S1006, so that what step S1007 sends to the foreground is a zoom touch event rather than a key combination event. The view can thus respond to the zoom touch event, implementing, as a whole, the zoom function triggered by the key combination event. It should be understood that a person skilled in the art could also implement the conversion from key combination event to zoom touch event in another suitable processing module or by designing a separate processing module, only with more changes required than in FIG. 10. FIG. 10 modifies the existing processing framework and flow; the changes are small, the development cost is low, and it is more convenient.
The method of the embodiments of this application has been introduced above mainly with reference to the drawings. It should be understood that although the steps in the flowcharts involved in the above embodiments are displayed in sequence, these steps are not necessarily executed in the order shown in the figures. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in the flowcharts involved in the above embodiments may include multiple steps or multiple stages; these steps or stages are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential: they may be executed in turn or alternately with other steps or with at least part of the steps or stages in other steps. The apparatus of the embodiments of this application is introduced below with reference to the drawings.
FIG. 11 is a schematic diagram of an apparatus for processing interaction events according to an embodiment of this application. As shown in FIG. 11, the apparatus 2000 includes an obtaining unit 2001 and a processing unit 2002. The apparatus 2000 may be any of the above touchscreen electronic devices, for example electronic device D or electronic device 400.
The apparatus 2000 can be used to execute any of the interaction event processing methods above. For example, the obtaining unit 2001 may be used to execute step S801, and the processing unit 2002 may be used to execute steps S802 and S803. As another example, the processing unit 2002 may also be used to determine whether the key combination event is the key combination event corresponding to the zoom function, and/or to determine whether the foreground application supports the zoom function. As another example, the obtaining unit 2001 may also be used to execute step S901, and the processing unit 2002 may also be used to execute steps S902 to S908. As another example, the obtaining unit 2001 may also be used to execute step S1001, and the processing unit 2002 may also be used to execute steps S1002 to S1007.
In one implementation, the apparatus 2000 may further include a storage unit for storing data such as interaction events and configuration information of applications that support the zoom function. The storage unit may be integrated in the processing unit 2002, or may be a unit independent of the obtaining unit 2001 and the processing unit 2002.
It should be noted that, since the information interaction and execution processes between the above apparatuses/units are based on the same conception as the method embodiments of this application, their specific functions and technical effects can be found in the method embodiments section and are not repeated here.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional units and modules is used as an example for illustration. In practical applications, the above functions can be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus can be divided into different functional units or modules to complete all or some of the functions described above. The functional units and modules in the embodiments may be integrated in one processing unit, or each unit may exist physically alone, or two or more units may be integrated in one unit. The above integrated units may be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not used to limit the protection scope of this application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
An embodiment of this application further provides an electronic device, including: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, where the processor, when executing the computer program, implements the steps in any of the above methods.
An embodiment of this application further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps in each of the above method embodiments.
An embodiment of this application provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in each of the above method embodiments.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of this application may be completed by instructing the relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate forms, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to a photographing apparatus/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc. In some jurisdictions, according to legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunications signals.
In the above embodiments, the description of each embodiment has its own emphasis. For parts not detailed or described in a certain embodiment, reference may be made to the relevant descriptions of other embodiments.
Those of ordinary skill in the art may realize that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled practitioners may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of this application.
In the embodiments provided in this application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative; for example, the division of the modules or units is only a logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Additionally, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
It should be understood that, when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, wholes, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components, and/or collections thereof.
It should also be understood that the term "and/or" used in this specification and the appended claims refers to any combination and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when" or "once" or "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "once it is determined" or "in response to determining" or "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In addition, in the description of this specification and the appended claims, the terms "first", "second", "third", etc. are used only to distinguish descriptions and cannot be understood as indicating or implying relative importance.
References in this specification to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Thus, the statements "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "comprising", "including", "having", and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
The above embodiments are used only to illustrate the technical solution of this application and not to limit it. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements of some of the technical features therein; such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of this application, and shall all be included within the protection scope of this application.

Claims (11)

  1. A method for processing interaction events, characterized by comprising:
    obtaining a key combination event, the key combination event being an interaction event used to trigger a zoom function;
    converting the key combination event into a zoom touch event;
    in response to the zoom touch event, zooming an interface.
  2. The method according to claim 1, characterized in that converting the key combination event into a zoom touch event comprises:
    converting parameters of the key combination event into corresponding parameters in the zoom touch event.
  3. The method according to claim 2, characterized in that the key combination event is a combined event of a control key event and a mouse wheel event; the parameters of the key combination event comprise a scroll direction and a scroll amount of the wheel; the parameters of the zoom touch event comprise a pinch direction and a pinch distance;
    converting the parameters of the key combination event into corresponding parameters in the zoom touch event comprises:
    converting the scroll amount into the pinch distance, and converting the scroll direction into the pinch direction.
  4. The method according to any one of claims 1 to 3, characterized in that the method further comprises:
    determining whether the key combination event is a zoom key combination event;
    when the key combination event is the zoom key combination event, executing the step of converting the key combination event into the zoom touch event; or,
    when the key combination event is not the zoom key combination event, ending the processing flow of the interaction event.
  5. The method according to any one of claims 1 to 4, characterized in that converting the key combination event into a zoom touch event further comprises:
    determining whether conversion of the key combination event of a previous round is still in progress;
    when the conversion of the key combination event of the previous round is still in progress, discarding the key combination event; or,
    when the conversion of the key combination event of the previous round is not in progress, converting the key combination event into the zoom touch event.
  6. The method according to any one of claims 1 to 5, characterized in that obtaining the key combination event, the key combination event being an interaction event used to trigger a zoom function, comprises:
    obtaining the key combination event through a hardware driver or collaborative office software.
  7. The method according to any one of claims 1 to 6, characterized in that the method further comprises:
    determining whether a foreground application supports zooming;
    when the foreground application supports zooming, in response to the zoom touch event, zooming the interface comprises: in response to the zoom touch event, zooming a running interface of the foreground application; or,
    when the foreground application does not support zooming, ending the processing flow of the interaction event.
  8. The method according to claim 7, characterized in that determining whether the foreground application supports zooming comprises:
    comparing information of the foreground application with preconfigured information of whitelisted applications to determine whether the foreground application is an application in the whitelist, the whitelist comprising at least one application that supports the zoom function;
    when the foreground application is an application in the whitelist, considering that the foreground application supports zooming; or,
    when the foreground application is not an application in the whitelist, considering that the foreground application does not support zooming.
  9. The method according to any one of claims 1 to 8, characterized in that the method further comprises:
    after the zoom touch event is generated, discarding the key combination event.
  10. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the method according to any one of claims 1 to 9.
  11. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 9.