CN117348785A - Control method and electronic equipment - Google Patents


Info

Publication number
CN117348785A
CN117348785A
Authority
CN
China
Prior art keywords
electronic device
event
interface
touch event
external input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210746887.3A
Other languages
Chinese (zh)
Inventor
何书杰
姚仕贤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210746887.3A priority Critical patent/CN117348785A/en
Priority to PCT/CN2023/101372 priority patent/WO2024001871A1/en
Publication of CN117348785A publication Critical patent/CN117348785A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

Embodiments of this application provide a control method and an electronic device. The method includes: the electronic device receives a report event from an external input device; the electronic device converts the report event into a touch event; and the electronic device performs a target operation according to the touch event. With the control method and electronic device provided by the embodiments of this application, an external input device of the electronic device can operate on pages that are not developed for the native operating system, thereby controlling the electronic device.

Description

Control method and electronic equipment
Technical Field
Embodiments of this application relate to the field of electronic devices, and in particular to a control method and an electronic device.
Background
Some pages that are not developed for the native operating system (for example, web-developed pages) may not respond to input from an external input device such as a keyboard or mouse. As a result, it is difficult for the user to control the electronic device through such a device, which degrades the user experience, especially in screen-projection scenarios.
Disclosure of Invention
Embodiments of this application provide a control method and an electronic device, aiming to enable an external input device of the electronic device to control the electronic device even on pages that are not developed for the native operating system.
In a first aspect, a control method is provided. The method includes: the electronic device receives a report event from an external input device; the electronic device converts the report event into a touch event; and the electronic device performs a target operation according to the touch event.
By converting the report event of the external input device into a touch event of the electronic device, the external input device can control the electronic device even on pages not developed for the native operating system.
With reference to the first aspect, in some implementations of the first aspect, converting the report event into a touch event includes: obtaining a mapping relationship between report events and touch events; and converting the report event into a touch event according to the mapping relationship.
Optionally, the mapping relationship may be a mapping table.
By establishing a mapping relationship between report events and touch events, the electronic device can perform event conversion more efficiently.
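The mapping table mentioned above can be sketched as follows. This is an illustrative reconstruction, not the patent's actual implementation; all event names (`BTN_LEFT_DOWN`, `ACTION_DOWN`, and so on) are hypothetical placeholders loosely styled after Linux input events and Android MotionEvent actions.

```python
# Hypothetical mapping table from external-input report events to touch
# events. The names on both sides are invented for illustration.
REPORT_TO_TOUCH = {
    "BTN_LEFT_DOWN": "ACTION_DOWN",       # left button pressed -> finger down
    "BTN_LEFT_UP": "ACTION_UP",           # left button released -> finger up
    "REL_MOVE": "ACTION_MOVE",            # pointer movement -> finger move
    "REL_WHEEL_UP": "SCROLL_UP_GESTURE",  # wheel tick -> synthesized scroll
    "REL_WHEEL_DOWN": "SCROLL_DOWN_GESTURE",
}


def convert(report_event: str) -> str:
    """Convert a report event from the external input device into a touch event."""
    try:
        return REPORT_TO_TOUCH[report_event]
    except KeyError:
        raise ValueError(f"no mapping for report event {report_event!r}")
```

A lookup table keeps the conversion O(1) per event, which matches the efficiency argument made in the text.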
With reference to the first aspect, in some implementations of the first aspect, before the electronic device performs the target operation according to the touch event, the method further includes: the electronic device determines a target interface, where the target interface is the interface displayed when the report event occurs. Performing the target operation according to the touch event includes: performing the target operation according to the touch event and the target interface.
By determining the target interface, the electronic device can identify the interface targeted by the operation of the external input device, dispatch the event to that interface, and thereby carry out the target operation.
With reference to the first aspect, in some implementations of the first aspect, converting the report event into a touch event includes: converting the report event into a touch event when the target interface is determined to be a first page, where the first page is a page that does not support direct control by the external input device.
By performing the event conversion only when the page does not support direct control by the external input device, the conversion step is skipped on native operating-system interfaces, avoiding added latency there; this improves overall efficiency and the user experience.
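The conditional path described above can be summarized in a few lines. This sketch is an assumption about how the decision might be structured, not the patent's implementation; `convert` stands for any report-to-touch conversion function, such as a mapping-table lookup.

```python
def handle_report_event(report_event, target_is_web_page, convert):
    """Convert only when the target interface cannot handle the raw event.

    Returns a (kind, event) pair: the raw report event is passed straight
    through on native pages, and a converted touch event is produced for
    web-developed pages that cannot respond to the raw input.
    """
    if target_is_web_page:
        return ("touch", convert(report_event))
    return ("raw", report_event)
```

On a native page the function is a pass-through, so no conversion latency is added; only web-developed pages pay the (small) cost of the lookup.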
With reference to the first aspect, in certain implementations of the first aspect, the first page is a web-developed page.
With reference to the first aspect, in some implementations of the first aspect, when the electronic device detects that it is connected to both the external input device and a display device, the electronic device projects its screen onto the display device.
Thus, the user experience can be improved.
With reference to the first aspect, in some implementations of the first aspect, the external input device is a mouse.
With reference to the first aspect, in some implementations of the first aspect, the electronic device is a device running the Android system.
In a second aspect, an electronic device is provided. The electronic device is connected to an external input device and includes: a receiving unit, configured to receive a report event from the external input device; and a processing unit, configured to convert the report event into a touch event and to perform a target operation according to the touch event.
With reference to the second aspect, in some implementations of the second aspect, the processing unit is specifically configured to obtain a mapping relationship between report events and touch events, and to convert the report event into a touch event according to the mapping relationship.
With reference to the second aspect, in certain implementations of the second aspect, the processing unit is further configured to determine a target interface, where the target interface is the interface displayed when the report event occurs; the processing unit is specifically configured to perform the target operation according to the touch event and the target interface.
With reference to the second aspect, in certain implementations of the second aspect, the processing unit is specifically configured to convert the report event into a touch event when the target interface is determined to be a first page, where the first page is a page that does not support direct control of the electronic device by the external input device.
With reference to the second aspect, in some implementations of the second aspect, the first page is a web-developed page.
With reference to the second aspect, in some implementations of the second aspect, when the processing unit detects that the electronic device is connected to both the external input device and a display device, the electronic device projects its screen onto the display device.
With reference to the second aspect, in some implementations of the second aspect, the external input device is a mouse.
With reference to the second aspect, in some implementations of the second aspect, the electronic device is a device running the Android system.
In a third aspect, an electronic device is provided, including: one or more processors; a memory; and one or more computer programs, where the one or more computer programs are stored in the memory and include instructions which, when executed by the electronic device, cause the electronic device to perform the control method in any one of the possible implementations of the first aspect.
In a fourth aspect, a computer storage medium is provided, including computer instructions which, when run on an electronic device, cause the electronic device to perform the control method in any one of the possible implementations of the first aspect.
In a fifth aspect, a computer program product is provided which, when run on an electronic device, causes the electronic device to perform the control method in any one of the possible implementations of the first aspect.
In a sixth aspect, a chip system is provided, including at least one processor; when program instructions are executed in the at least one processor, the functions of any one of the possible methods of the first aspect are carried out on the electronic device.
In a seventh aspect, a chip is provided, including a processor and a communication interface, where the communication interface is configured to receive signals and transmit them to the processor, and the processor processes the signals so that the functions of the method of any one of the possible aspects above are performed on the electronic device.
Drawings
Fig. 1 is a schematic structural view of an electronic device.
Fig. 2 is a block diagram of a software architecture of an electronic device.
FIG. 3 is an interface diagram of an electronic device coupled to an external input device.
Fig. 4 is yet another interface diagram of an electronic device coupled to an external input device.
Fig. 5 is a schematic block diagram of a control method according to an embodiment of the present application.
Fig. 6 is an interface diagram of an electronic device connected to an external input device according to an embodiment of the present application.
Fig. 7 is a further interface diagram of an electronic device connected to an external input device according to an embodiment of the present application.
Fig. 8 is a further interface diagram of an electronic device connected to an external input device according to an embodiment of the present application.
Fig. 9 is a schematic block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of this application will be described below with reference to the accompanying drawings. In the description of the embodiments of this application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of this application, "plurality" means two or more.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
By way of example, fig. 1 shows a schematic diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of this application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application layer may include camera, settings, skin modules, user interfaces (UIs), third-party applications, and the like. The third-party applications may include gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, messaging, and so on.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer may include some predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction; for example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the system top status bar in the form of a chart or scroll-bar text, such as notifications from applications running in the background, or notifications that appear on the screen as a dialog window. For example, text may be prompted in the status bar, a prompt tone may sound, the electronic device may vibrate, or an indicator light may blink.
The Android runtime includes core libraries and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: functions that the Java language needs to call, and the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
In addition, the system library may also include a status monitoring service module and the like, such as a physical-state recognition module for analyzing and recognizing user gestures, and a sensor service module for monitoring sensor data uploaded by the various sensors in the hardware layer and determining the physical state of the electronic device 100.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and an input device driver.
The hardware layer may include various sensors, such as the various sensors described in fig. 1, acceleration sensors, gyroscopic sensors, touch sensors, etc. referred to in embodiments of the present application.
In connection with the electronic device described in fig. 1 and fig. 2, in the embodiments of this application, the physical components involved in the electronic device 100 mainly include: hardware components such as the input device, sensors, the display subsystem (DSS) display chip, the touch display screen, and the fingerprint recognition module; kernel software layers such as the input device driver, screen management module, display driver, fingerprint driver, and accidental-touch prevention; application framework layer functions such as accidental-touch input prevention, screen control, the always-on display (AOD) service, and power management; and application layer services such as specially adapted applications (camera), third-party applications, system sleep, and AOD.
Currently, in electronic devices based on the Android system, in order to improve the user's interactive experience, web-page-class development pages, for example pages written in the HTML5 language and JavaScript pages, are increasingly used.
However, such web-page-class pages may not respond to input from an external input device such as a mouse; for example, a mouse click event may not be recognized or responded to. As a result, the user cannot control the electronic device through the external input device, which degrades the user experience.
For example, as shown in fig. 3, taking the electronic device 100 as a tablet computer, a user can generally control the tablet computer through its touch screen. Illustratively, the electronic device 100 displays a red-envelope interface 301, where the red-envelope interface 301 includes a control 10 that may be used to close the red-envelope interface 301. When the tablet computer detects a touch at the position corresponding to the control 10, the tablet computer exits the red-envelope interface 301.
In general, when the tablet computer is connected to an external input device, the tablet computer can be controlled through that device, but this may be limited to applications developed natively for the operating system ecosystem; for a web page, the tablet computer may not be controllable through the external input device.
As shown in fig. 3, because the web-page application cannot respond to mouse events, the pointer of the mouse 200 cannot be moved to the position of the exit control 10, nor can the red-envelope interface 301 be closed by clicking the left mouse button 201 at the position of the control 10. The mouse 200 therefore cannot be used for human-machine interaction, resulting in a poor user experience.
As another example, as shown in fig. 4, the tablet computer displays a teletext page 401, which can turn the page in response to detecting a finger sliding on the screen. For applications developed natively for the operating system, page scrolling can generally be achieved by rolling the scroll wheel 202 of the mouse 200. However, due to this incompatibility, an application developed as a web page cannot respond to a mouse-scroll event; even if the electronic device 100 receives the reporting event of the scroll wheel 202, the display remains on the page 401 because the event cannot be responded to.
Therefore, it is desirable to provide a method for controlling an electronic device through an external input device on pages that are not developed natively for the operating system.
The following describes, by way of example, the workflow of the software and hardware of the electronic device 100 in connection with the control method according to the embodiments of the present application. The control method provided in the embodiments of the present application is mainly implemented through interaction between the external input device, the layers of the software architecture of the electronic device 100, and one or more physical components.
For ease of understanding, the following embodiments of the present application take an electronic device having the structure shown in fig. 1 and fig. 2 as an example, and specifically describe the control method 500 provided in the embodiments of the present application with reference to the accompanying drawings and application scenarios.
Fig. 5 illustrates a control method 500 provided in an embodiment of the present application, which aims to enable an external input device of the electronic device, such as a mouse, to control the electronic device by converting the event stream of the external input device into a touch-type event stream.
Fig. 6 and fig. 7 schematically illustrate scenarios implemented based on the control method 500.
Taking the electronic device 100 as a tablet computer and the external input device as a mouse as an example, as shown in (a) of fig. 6, the electronic device 100 is connected to the input device 200, and the electronic device 100 displays a red-envelope interface 301, where the red-envelope interface 301 includes a control 10 used to close the red-envelope interface 301. When the electronic device 100 detects that the control 10 is clicked by the mouse 200, the electronic device 100 may convert the mouse click event into a touch-type click event, so that the electronic device 100 may close the red-envelope interface 301 in response to the touch-type click event, thereby displaying the interface 302 shown in (b) of fig. 6. In other words, the electronic device can translate the operation of clicking the control 10 with the mouse into an operation of touching the control 10.
Taking the electronic device 100 as a tablet computer and the external input device as a mouse as an example, as shown in (a) of fig. 7, the electronic device 100 is connected to the external input device 200, and the electronic device 100 displays a teletext page 401; in response to detecting the user's finger sliding on the screen, the electronic device can switch to display the page 402. When the electronic device 100 detects a scroll event stream of the mouse wheel 202, the electronic device 100 may convert the scroll event stream of the mouse wheel 202 into a touch-type slide event stream, so that the electronic device 100 may turn the teletext page 401 in response to the touch-type slide event stream, thereby displaying the page 402 shown in (b) of fig. 7.
In addition, a click event stream of the right mouse button may also be set to correspond to a touch event stream of the tablet computer, for example a touch event stream that displays the home page, so as to enrich the control capability of the mouse and improve the user experience.
In some embodiments, the electronic device may cast a screen onto the display device. The display device may include, but is not limited to: projection screens, smart screens, televisions, tablet computers, PC displays, and the like.
As shown in fig. 8 (a), the electronic device 100 is connected to the external input device 200 and casts its screen to the display device 800. The electronic device 100 may be, for example, a mobile phone, and the external input device 200 may be, for example, a mouse used to control the electronic device 100. The electronic device 100 displays the interface 301, and the display device 800 displays the interface 801, which is a projection of the interface 301; both interfaces therefore display the control 10 as well as the mouse pointer, and the movement of the mouse pointer on the interface 801 is synchronized with the pointer movement on the interface 301. The interface 801 can thus show the user the position of the mouse pointer, and operations performed on it are synchronized to the electronic device 100. Even when the user does not look at the interface 301 on the electronic device 100, the interface can still be operated through the external input device 200, improving the user experience.
When the electronic device 100 detects that the control 10 is clicked by the mouse 200, the electronic device 100 may convert the mouse click event into a touch-type click event, so that the electronic device 100 may close the red-envelope interface 301 in response to the touch-type click event and display the interface 302 shown in (b) of fig. 8. Because its display content comes from the electronic device 100, the display device 800 synchronously closes the interface 801, thereby displaying the interface 802 shown in (b) of fig. 8.
In some embodiments, the electronic device is a device running the Android system, for example, an Android tablet or an Android phone.
In the above examples, the external input device is a mouse, but the embodiments of the present application are not limited thereto, and the external input device may be a keyboard, for example.
The method 500 includes the steps of:
S510, the electronic device receives the reporting event of the input device.
In some embodiments, the input device may be a mouse, or a keyboard.
Taking the input device as a mouse as an example, the reporting event of the input device may include, but is not limited to: a mouse click (including a single click and a double click), a long press, mouse movement, and scroll-wheel rolling, where a mouse click may be a click of the left or right mouse button or a click of the scroll wheel.
In some embodiments, the electronic device may receive the reporting event through a wired interface or Bluetooth.
For example, as shown in fig. 6 (a), the electronic device is connected to the mouse through Bluetooth, so that the mouse can report mouse events to the electronic device.
S520, the electronic device converts the reported event into a touch event.
In some embodiments, prior to this step, the electronic device may determine a mapping relationship of the reporting event and the touch event.
Thus, the electronic device performs event conversion according to the mapping relationship.
The mapping relationship may include: a mouse click corresponding to a touch click, and scroll-wheel rolling corresponding to a touch slide.
Some preset shortcut touch operations may also have corresponding mappings. For example, a click of the keyboard's wake-up key may correspond to a rapid tap at multiple screen positions, where such rapid tapping is preset as one of the device's wake-up modes.
For example, as shown in fig. 6, the reporting event of the mouse is a mouse-click event stream, and the conversion flow may be, for example: the detected mouse event stream ACTION_DOWN → ACTION_BUTTON_PRESS → ACTION_MOVE → ACTION_BUTTON_RELEASE → ACTION_UP is converted into the touch-type event stream TOUCH_ACTION_DOWN → TOUCH_ACTION_MOVE → TOUCH_ACTION_UP. It should be understood that the above conversion is merely illustrative and does not constitute a limitation of the embodiments of the present application.
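By way of a hypothetical illustration (a sketch of the idea, not the disclosed implementation), the click-stream conversion described above can be modeled as a simple per-event mapping in which mouse-only events are dropped and the remaining events are renamed:

```python
# Sketch of the mouse-click -> touch-click stream conversion described above.
# Event names follow the example in the description; the mapping itself is
# an assumed illustration.

MOUSE_CLICK_STREAM = [
    "ACTION_DOWN",
    "ACTION_BUTTON_PRESS",
    "ACTION_MOVE",
    "ACTION_BUTTON_RELEASE",
    "ACTION_UP",
]

# Mouse events with no touch counterpart are dropped; the rest map 1:1.
MOUSE_TO_TOUCH = {
    "ACTION_DOWN": "TOUCH_ACTION_DOWN",
    "ACTION_MOVE": "TOUCH_ACTION_MOVE",
    "ACTION_UP": "TOUCH_ACTION_UP",
}

def convert_stream(mouse_events):
    """Convert a mouse event stream into a touch-type event stream."""
    return [MOUSE_TO_TOUCH[e] for e in mouse_events if e in MOUSE_TO_TOUCH]

print(convert_stream(MOUSE_CLICK_STREAM))
# -> ['TOUCH_ACTION_DOWN', 'TOUCH_ACTION_MOVE', 'TOUCH_ACTION_UP']
```

The dropped button-press/release events illustrate that the two streams need not be the same length; only the semantics of down, move, and up are preserved.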
As another example, as shown in fig. 7, the reporting event of the mouse is a wheel-scroll event stream, and the conversion flow may be, for example: wheel press → wheel scroll → wheel release is converted into the touch-type event stream touch press → touch slide → touch release. In some possible implementations, the reported wheel-scroll event carries parameter information, such as the rolling acceleration of the wheel, which the conversion process can convert into a corresponding sliding speed.
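A minimal sketch of this wheel-to-swipe conversion, under the assumption that the rolling acceleration is carried over as a sliding speed via a scale factor (the factor and all names are invented for illustration):

```python
# Hypothetical sketch: converting a wheel-scroll event stream into a
# touch-type swipe, turning the wheel's rolling acceleration into a
# sliding speed. The scale factor is an assumed value, not from the patent.

WHEEL_TO_TOUCH = {
    "WHEEL_PRESS": "TOUCH_DOWN",
    "WHEEL_SCROLL": "TOUCH_MOVE",
    "WHEEL_RELEASE": "TOUCH_UP",
}

SPEED_SCALE = 40.0  # assumed speed units per unit of wheel acceleration

def convert_scroll(events):
    """events: list of (event_name, acceleration); returns touch events."""
    touch = []
    for name, accel in events:
        t = WHEEL_TO_TOUCH[name]
        # Only the scroll step carries motion parameters.
        speed = accel * SPEED_SCALE if t == "TOUCH_MOVE" else 0.0
        touch.append((t, speed))
    return touch

stream = [("WHEEL_PRESS", 0.0), ("WHEEL_SCROLL", 2.5), ("WHEEL_RELEASE", 0.0)]
print(convert_scroll(stream))
# -> [('TOUCH_DOWN', 0.0), ('TOUCH_MOVE', 100.0), ('TOUCH_UP', 0.0)]
```

Mapping acceleration to speed preserves the feel of a fast or slow wheel roll in the synthesized swipe.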
For another example, a right-mouse-click event stream may be mapped to a touch-type long-press event stream. Illustratively, a control of the electronic device may, in response to a touch long-press event stream, present a shortcut function menu for that control; right-click behavior with a mouse is generally set up similarly, so mapping these two event streams to each other better matches the user's operating habits.
For another example, when the external input device is a keyboard, a click event stream of the keyboard's Ctrl key may be set to correspond to a touch event stream of the mobile phone, for example a touch event stream that displays the home page. When the mobile phone is connected to the keyboard and detects that the user clicks the Ctrl key, the event stream may be converted into the touch event stream that displays the home page, so that the mobile phone switches from the application page to the home page. For another example, a click event of the keyboard's volume key may be converted into a click event stream of the mobile phone's virtual volume key; it should be understood that in this case the electronic device does not need to determine the interface corresponding to the input operation.
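These keyboard shortcut mappings can be sketched as a lookup from key events to preset touch event streams; every name below is hypothetical, chosen only to mirror the Ctrl-key and volume-key examples above:

```python
# Hypothetical sketch of mapping preset keyboard keys to shortcut touch
# event streams, as in the Ctrl-key -> "display home page" example.

KEY_TO_TOUCH_STREAM = {
    # key event -> preset touch event stream (both sides are illustrative)
    "KEY_CTRL": ["TOUCH_SHOW_HOME"],
    "KEY_VOLUME_UP": ["TOUCH_CLICK_VIRTUAL_VOLUME_UP"],
}

def convert_key(event):
    """Return the preset touch stream, or pass the event through unmapped."""
    return KEY_TO_TOUCH_STREAM.get(event, [event])

print(convert_key("KEY_CTRL"))  # -> ['TOUCH_SHOW_HOME']
print(convert_key("KEY_A"))     # -> ['KEY_A'] (no mapping, passed through)
```

Note that, as the description says, such shortcut conversions do not depend on which interface is in the foreground, so no target-interface lookup is needed here.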
In some embodiments, the conversion may be performed only when needed. For example, the reporting event is converted into the touch event only when the target interface is determined to be a first page, where the first page is a page that does not support direct control by the external input device.
In one possible implementation, a set of web-page-class development pages may be preset; it is then only necessary to determine whether the target interface is within this set. It should be appreciated that the set may also be updated subsequently.
By setting such a conversion condition, conversion is performed only when necessary, so that event conversion is avoided for pages that can be natively controlled, saving time and further improving the user experience.
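The conversion condition above can be sketched as a membership test against the preset page set; the page names and converter here are invented purely for illustration:

```python
# Hypothetical sketch of the conversion condition: convert only when the
# target interface belongs to a preset (and updatable) set of
# web-page-class pages that cannot respond natively to external input.

WEB_CLASS_PAGES = {"red_envelope_page", "teletext_page"}

def maybe_convert(target_interface, event, convert):
    """Apply `convert` only for pages that need it; pass through otherwise."""
    if target_interface in WEB_CLASS_PAGES:
        return convert(event)
    return event  # native page: dispatch the raw event directly

to_touch = lambda e: "TOUCH_" + e
print(maybe_convert("red_envelope_page", "CLICK", to_touch))  # -> TOUCH_CLICK
print(maybe_convert("settings_page", "CLICK", to_touch))      # -> CLICK
```

Skipping the conversion for natively controllable pages is what saves time in the common case.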
S530, executing the target operation according to the touch event. In some embodiments, before step S530, a target interface of the electronic device may also be determined, where the target interface is the interface corresponding to the moment the reporting event occurs.
The target interface may be a page oriented to human-machine interaction. It should be appreciated that once the electronic device determines the target interface, the reporting event may be dispatched to that interface.
Illustratively, as shown in fig. 6 (b), the converted touch event stream is a click event stream; the electronic device 100 sends the event stream to the window management module to determine the target interface, so that the electronic device 100 responds to the click on the control 10, closes the red-envelope interface 301, and displays the interface 302.
As another example, as shown in fig. 7 (b), the converted touch event stream is a slide event stream; the electronic device 100 sends the event stream to the window management module to determine the target interface, so that the electronic device 100 can switch the displayed page and display the page 402.
It will be appreciated that this step corresponds to a general touch screen manipulation procedure.
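Putting steps S510 to S530 together, a minimal end-to-end sketch of method 500 might look as follows; all event, interface, and operation names are hypothetical placeholders, not the patent's actual identifiers:

```python
# End-to-end sketch of method 500: receive a reporting event (S510),
# convert it to a touch event (S520), and execute the target operation
# on the determined target interface (S530). All names are illustrative.

MAPPING = {"MOUSE_CLICK": "TOUCH_CLICK", "WHEEL_SCROLL": "TOUCH_SWIPE"}

OPERATIONS = {
    # (target interface, touch event) -> resulting operation
    ("red_envelope_page", "TOUCH_CLICK"): "close_red_envelope",
    ("teletext_page", "TOUCH_SWIPE"): "turn_page",
}

def handle(report_event, target_interface):
    touch_event = MAPPING[report_event]                 # S520: convert
    return OPERATIONS[(target_interface, touch_event)]  # S530: execute

print(handle("MOUSE_CLICK", "red_envelope_page"))  # -> close_red_envelope
print(handle("WHEEL_SCROLL", "teletext_page"))     # -> turn_page
```

From the target interface's point of view, the synthesized touch event is indistinguishable from a real touch, which is why the final step matches the ordinary touch-screen flow.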
In some embodiments, when the electronic device detects that it is connected to an external input device and connected to a display device, it may cast its screen to the display device.
The embodiments of the present application also provide an electronic device connected to an external input device, where the external input device is used to control the electronic device.
As shown in fig. 9, the electronic device includes: a receiving module and a processing module.
The receiving module is used for receiving the report event of the input equipment.
The processing module is used for converting the reported event into a touch event.
In some embodiments, the electronic device may cast a screen to the display device.
Embodiments of the present application also provide a computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the above-described method 500.
Embodiments of the present application also provide a computer program product that, when run on an electronic device, causes the electronic device to perform the above-described method 500.
Embodiments of the present application also provide a chip system including at least one processor, wherein, when program instructions are executed in the at least one processor, the above-described method 500 is implemented on an electronic device.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and the division of the units or modules is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units, modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A method of manipulation, the method comprising:
the electronic equipment receives a reporting event of external input equipment;
the electronic equipment converts the reported event into a touch event;
and the electronic equipment executes target operation according to the touch event.
2. The method of claim 1, wherein the electronic device converting the reporting event into a touch event comprises:
acquiring a mapping relation between the reporting event and the touch event;
and converting the reporting event into the touch event according to the mapping relation.
3. The manipulation method according to claim 1 or 2, wherein before the electronic device performs a target operation according to the touch event, the method further comprises:
the electronic equipment determines a target interface, wherein the target interface is an interface corresponding to the reporting event;
the electronic device executes a target operation according to the touch event, including:
and the electronic equipment executes target operation according to the touch event and the target interface.
4. The method of claim 3, wherein the converting the reporting event into the touch event of the electronic device includes:
when the target interface is determined to be a first page, converting the reporting event into the touch event, wherein the first page is a page which does not support direct control by the external input device.
5. The method of claim 4, wherein the first page is a web class development page.
6. The manipulation method according to any one of claims 1 to 5, further comprising:
when the electronic equipment detects that the electronic equipment is connected with the external input equipment and the electronic equipment is connected with the display equipment, the electronic equipment is projected to the display equipment.
7. The manipulation method according to any one of claims 1 to 6, wherein the external input device is a mouse.
8. The manipulation method according to any one of claims 1 to 7, wherein the electronic device is a device on which an android system is mounted.
9. An electronic device connected to an external input device, the electronic device comprising:
the receiving unit is used for receiving the report event of the external input equipment;
the processing unit is used for converting the reported event into a touch event;
The processing unit is further configured to execute a target operation according to the touch event.
10. The electronic device of claim 9, wherein the processing unit is specifically configured to obtain a mapping relationship between the reporting event and the touch event, and convert the reporting event into the touch event according to the mapping relationship.
11. The electronic device according to claim 9 or 10, wherein the processing unit is further configured to determine a target interface, where the target interface is an interface corresponding to when the reporting event occurs;
the processing unit is specifically configured to execute a target operation according to the touch event and the target interface.
12. The electronic device according to claim 11, wherein the processing unit is specifically configured to: when the target interface is determined to be a first page, convert the reporting event into the touch event, wherein the first page is a page which does not support direct control by the external input device.
13. The electronic device of claim 12, wherein the first page is a web class development page.
14. The electronic device of any one of claims 9-13, wherein the electronic device is projected onto a display device when the processing unit detects that the electronic device is connected to the external input device and the electronic device is connected to the display device.
15. The electronic device of any one of claims 9-14, wherein the external input device is a mouse.
16. The electronic device of any one of claims 9 to 15, wherein the electronic device is an android system-mounted device.
17. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the manipulation method according to any one of claims 1 to 8.
18. An electronic device comprising one or more processors; one or more memories; the one or more memories store one or more computer programs comprising instructions that, when executed by the one or more processors, cause the method of any of claims 1-8 to be performed.
19. A chip comprising a processor and a communication interface for receiving signals and transmitting the signals to the processor, the processor processing the signals such that the method of any of claims 1 to 8 is performed.
20. A computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of claims 1 to 8.
CN202210746887.3A 2022-06-29 2022-06-29 Control method and electronic equipment Pending CN117348785A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210746887.3A CN117348785A (en) 2022-06-29 2022-06-29 Control method and electronic equipment
PCT/CN2023/101372 WO2024001871A1 (en) 2022-06-29 2023-06-20 Control and operation method and electronic device

Also Published As

Publication number Publication date
WO2024001871A1 (en) 2024-01-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination