CN114764270B - Input conversion method, electronic device and readable medium - Google Patents

Input conversion method, electronic device and readable medium

Info

Publication number
CN114764270B
CN114764270B
Authority
CN
China
Prior art keywords
input
key
parameters
parameter
application
Prior art date
Legal status
Active
Application number
CN202210409578.7A
Other languages
Chinese (zh)
Other versions
CN114764270A (en)
Inventor
饶凯浩
卞超
杨云帆
欧阳张健
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210409578.7A
Publication of CN114764270A
Application granted
Publication of CN114764270B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Abstract

The application relates to the technical field of terminals and discloses an input conversion method, an electronic device and a readable medium. The electronic device supports a first type of input device; an input conversion module and a first application are installed on the electronic device, and the first application supports second input parameters of a second type of input device. The method comprises: the input conversion module acquires a first input parameter that is input through the first type of input device and is directed at the first application; the input conversion module converts the first input parameter into a second input parameter based on a preset conversion rule, in which corresponding first input parameters are set for the different second input parameters, and the different second input parameters can be used by the first application to implement different functions. In this way, when developing an application for the electronic device, a developer only needs to adapt functions to the second type of input device in order to be compatible with the first type of input device, which reduces the developer's workload and the development cost.

Description

Input conversion method, electronic device and readable medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an input conversion method, an electronic device, and a readable medium.
Background
With the development of electronic device technologies, more and more types of electronic devices enter people's work and life, and the input modes of electronic devices are becoming increasingly varied, for example mouse, keyboard, touch screen, crown, keys, air gestures and voice. Most application programs need to be deployable on a variety of devices, and some electronic devices have several different types of input devices. Therefore, to ensure that an application can respond to the input events of different types of input devices, each type of input device must be individually adapted when the application is developed. As a result, every time a developer develops an application, the input events of the different types of input devices must be adapted according to the application's usage scenario, which increases the developer's workload and the development cost.
Disclosure of Invention
In view of this, embodiments of the present application provide an input conversion method, an electronic device and a readable medium. Input events of different types of input devices are mapped in the operating system to input events of a single type of input device, so that when developing an application a developer only needs to adapt functions to events of that single type in order to be compatible with the different types of input devices, which reduces the developer's workload and the development cost.
In a first aspect, an embodiment of the present application provides an input conversion method, which is applied to an electronic device, where the electronic device supports a first type of input device, an input conversion module and a first application are installed on the electronic device, and the first application supports a second input parameter of a second type of input device;
the method comprises the following steps:
the input conversion module acquires a first input parameter that is input through the first type of input device and is directed at the first application;
the input conversion module converts the first input parameter into a second input parameter based on a preset conversion rule, wherein in the preset conversion rule corresponding first input parameters are set for the different second input parameters, and the different second input parameters can be used by the first application to implement different functions;
the first application implements the associated function in response to the second input parameter.
In this embodiment of the application, the first input parameter may include the device type of the first type of input device and specific parameters corresponding to that device type. The input conversion module may be disposed in the operating system of the electronic device, for example in the application framework of the operating system. When the input conversion module receives the first input parameter, it maps the first input parameter onto the second type of input device according to the preset conversion rule and provides the converted second input parameter to the first application, for example through a listening interface for the second input parameters of the second type of input device, so that the first application can respond to input events of the first type of input device. A developer therefore only needs to adapt the relevant functions to the second input parameters of the second type of input device when developing the first application in order to be compatible with the first type of input device, which reduces the developer's workload and the development cost.
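For illustration only, the conversion flow described above can be sketched as follows. All class, function and parameter names here are assumptions chosen for the example and are not part of the patent or of any real operating-system API.

```kotlin
// Minimal sketch of the conversion flow, under assumed names.
sealed interface FirstInputParameter            // from the first type of input device
sealed interface SecondInputParameter           // what the first application supports

data class MouseRightClick(val cursorX: Float, val cursorY: Float) : FirstInputParameter
data class TouchLongPress(val x: Float, val y: Float) : SecondInputParameter

// Preset conversion rule: a corresponding first input parameter is set for
// each second input parameter; only the right-click -> long-press rule is shown.
fun presetConversionRule(p: FirstInputParameter): SecondInputParameter = when (p) {
    is MouseRightClick -> TouchLongPress(p.cursorX, p.cursorY)
}

class InputConversionModule(private val listener: (SecondInputParameter) -> Unit) {
    // Acquire the first input parameter, convert it, and deliver the second
    // input parameter to the first application's listening interface.
    fun onInput(p: FirstInputParameter) = listener(presetConversionRule(p))
}
```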
In a first possible implementation of the first aspect, the second type of input device is a touch screen, and the second input parameters corresponding to the touch screen include at least one of the following:
a long-press parameter, which comprises a long-press position;
a zoom parameter, which comprises two-finger coordinates, a pinch direction and a zoom ratio;
a slide parameter, which comprises start-point coordinates, a sliding speed and a sliding distance.
In the embodiment of the application, the input events of the touch screen are abstracted into a long-press gesture, a zoom gesture and a slide gesture, which are defined by the long-press parameter, the zoom parameter and the slide parameter respectively. When the input conversion module detects that an input event of the first type of input device meets the condition in the preset conversion rule for being mapped to the long-press parameter, the zoom parameter or the slide parameter, it converts that input event into a touch-screen event.
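As an illustration of this abstraction, the three gesture parameters listed above could be modelled as plain data classes; the field names below are assumptions, not part of the patent.

```kotlin
// The three abstracted touch-screen gesture parameters (assumed field names).
data class LongPressParam(val pressX: Float, val pressY: Float)   // long-press position

data class ZoomParam(
    val finger1X: Float, val finger1Y: Float,   // two-finger coordinates
    val finger2X: Float, val finger2Y: Float,
    val pinchOut: Boolean,                      // pinch direction: true = fingers apart
    val scale: Float                            // zoom ratio, e.g. 1.05 or 0.95
)

data class SlideParam(
    val startX: Float, val startY: Float,       // start-point coordinates
    val speed: Float,                           // sliding speed
    val distance: Float                         // sliding distance
)
```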
In a second possible implementation of the first aspect, the first type of input device is a mouse, and the first input parameters corresponding to the mouse include at least one of the following:
a right-click parameter, which comprises a cursor position;
a wheel-scroll parameter, which comprises a cursor position, a wheel coefficient and a scrolling direction;
a left-button long-press parameter, which comprises a cursor position;
a mouse-movement parameter, which comprises a cursor position, a movement speed and a movement distance.
In the embodiment of the application, the input events of the mouse are abstracted into right click, wheel scrolling, left-button long press and mouse movement, which are defined by the right-click parameter, the wheel-scroll parameter, the left-button long-press parameter and the mouse-movement parameter respectively. The input conversion module may convert an input event of the mouse into an input event of the touch screen by mapping the parameters defining the mouse input event to the parameters defining the touch-screen input event.
In a third possible implementation combined with the second possible implementation of the first aspect, the preset conversion rule includes at least one of the following rules:
when the first input parameter is the right-click parameter, the cursor position is mapped to the long-press position;
when the first input parameter is the wheel-scroll parameter, the cursor position is mapped to the start-point coordinates, and the wheel coefficient and/or the scrolling direction is mapped to the sliding speed;
when the first input parameters are the left-button long-press parameter and the mouse-movement parameter, the cursor position is mapped to the slide start point, the movement speed is mapped to the sliding speed, and the movement distance is mapped to the sliding distance.
In the embodiment of the application, after conversion by the input conversion module according to the preset rules, when the input event on the electronic device is a right mouse click, the input event received by the first application is a touch-screen long-press event; when the input event on the electronic device is mouse-wheel scrolling, or moving the mouse while the left button is held down, the input event received by the first application is a touch-screen slide event.
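A minimal sketch of these three mouse rules, reusing the LongPressParam and SlideParam classes from the previous sketch; the mouse event types and the per-notch distance constant are assumptions.

```kotlin
// Hypothetical mouse events (first input parameters).
sealed interface MouseEvent
data class RightClick(val x: Float, val y: Float) : MouseEvent
data class WheelScroll(val x: Float, val y: Float,
                       val wheelCoefficient: Float, val forward: Boolean) : MouseEvent
data class LeftPressAndMove(val x: Float, val y: Float,
                            val moveSpeed: Float, val moveDistance: Float) : MouseEvent

const val DISTANCE_PER_NOTCH = 48f  // assumption: slide distance per wheel notch, in px

fun convertMouse(e: MouseEvent): Any = when (e) {
    // Rule 1: right click -> long press at the cursor position.
    is RightClick -> LongPressParam(e.x, e.y)
    // Rule 2: wheel scroll -> slide; coefficient and direction give the speed.
    is WheelScroll -> SlideParam(e.x, e.y,
        speed = (if (e.forward) 1f else -1f) * e.wheelCoefficient,
        distance = e.wheelCoefficient * DISTANCE_PER_NOTCH)
    // Rule 3: left button held while moving -> slide with the mouse's own
    // movement speed and distance, starting at the cursor position.
    is LeftPressAndMove -> SlideParam(e.x, e.y, e.moveSpeed, e.moveDistance)
}
```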
In a fourth possible implementation combined with the first possible implementation of the first aspect, the first type of input device is a keyboard, the first input parameter corresponding to the keyboard includes a key parameter, and the key parameter comprises at least one of the following:
the "CTRL" key and the "+" key are pressed simultaneously, or the "CTRL" key and the "-" key are pressed simultaneously;
the "menu" key is pressed;
the "SHIFT" key and the "F10" key are pressed simultaneously.
In a fifth possible implementation combined with the fourth possible implementation of the first aspect, the preset conversion rule includes at least one of the following rules:
when the key parameter of the keyboard is that the "CTRL" key and the "+" key are pressed simultaneously, or that the "CTRL" key and the "-" key are pressed simultaneously, the "+" key or the "-" key is mapped to the pinch direction, first preset coordinates are mapped to the two-finger coordinates, and a first preset scale is mapped to the zoom ratio;
and when the key parameter of the keyboard is that the "menu" key is pressed, or that the "SHIFT" key and the "F10" key are pressed simultaneously, second preset coordinates are mapped to the long-press position.
In the embodiment of the application, after conversion by the input conversion module according to the preset rules, when the input event on the electronic device is the "CTRL" key and the "+" key of the keyboard being pressed simultaneously, or the "CTRL" key and the "-" key being pressed simultaneously, the input event received by the first application is a touch-screen zoom event; when the input event on the electronic device is the "menu" key of the keyboard being pressed, or the "SHIFT" key and the "F10" key being pressed simultaneously, the input event received by the first application is a touch-screen long-press event.
In a sixth possible implementation combined with the fifth possible implementation of the first aspect, the first application includes a first control; and
the first preset coordinates are the coordinates of the points located at 1/4 and 3/4 of the diagonal of the first control.
In a seventh possible implementation combined with the sixth possible implementation of the first aspect, the first preset scale is 1.05 when the key parameter is that the "CTRL" key and the "+" key are pressed simultaneously, and 0.95 when the key parameter is that the "CTRL" key and the "-" key are pressed simultaneously.
In an eighth possible implementation combined with the fifth possible implementation of the first aspect, the first application includes a second control; and
the second preset coordinates are the coordinates of the center of the second control.
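Combining the fourth to eighth implementations above, one possible sketch of the keyboard rules with the first and second preset values is the following; the key-combination types and the control-bounds type are assumptions, and the parameter classes are reused from the earlier sketch.

```kotlin
// Hypothetical key combinations (first input parameters of the keyboard).
sealed interface KeyCombo
object CtrlPlus : KeyCombo    // "CTRL" + "+"
object CtrlMinus : KeyCombo   // "CTRL" + "-"
object MenuKey : KeyCombo     // "menu"
object ShiftF10 : KeyCombo    // "SHIFT" + "F10"

data class ControlBounds(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun convertKeyboard(k: KeyCombo, c: ControlBounds): Any {
    // First preset coordinates: points at 1/4 and 3/4 of the control's diagonal.
    val px = c.left + 0.25f * (c.right - c.left); val py = c.top + 0.25f * (c.bottom - c.top)
    val qx = c.left + 0.75f * (c.right - c.left); val qy = c.top + 0.75f * (c.bottom - c.top)
    // Second preset coordinates: the center of the control.
    val cx = (c.left + c.right) / 2f; val cy = (c.top + c.bottom) / 2f
    return when (k) {
        // First preset scale: 1.05 for "CTRL"+"+", 0.95 for "CTRL"+"-".
        CtrlPlus  -> ZoomParam(px, py, qx, qy, pinchOut = true,  scale = 1.05f)
        CtrlMinus -> ZoomParam(px, py, qx, qy, pinchOut = false, scale = 0.95f)
        MenuKey, ShiftF10 -> LongPressParam(cx, cy)
    }
}
```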
In a ninth possible implementation combined with the first possible implementation of the first aspect, the first type of input device includes a mouse and a keyboard, and the first input parameters corresponding to the mouse and the keyboard include:
a keyboard parameter and a wheel-scroll parameter; wherein
the keyboard parameter is that the "CTRL" key is pressed, and the wheel-scroll parameter comprises a cursor position, a wheel coefficient and a scrolling direction.
In a tenth possible implementation manner that is combined with the ninth possible implementation manner of the first aspect, the preset conversion rule includes:
when the first input parameter includes the "CTRL" key being pressed together with the wheel-scroll parameter, the wheel coefficient is mapped to the zoom ratio, the scrolling direction is mapped to the pinch direction, and third preset coordinates are mapped to the two-finger coordinates.
In this embodiment of the application, after conversion by the input conversion module according to the preset rule, when the input event on the electronic device is the "CTRL" key of the keyboard being held down while the mouse wheel is scrolled, the input event received by the first application is a touch-screen zoom event.
In an eleventh possible implementation combined with the tenth possible implementation of the first aspect, the first application includes a third control; and
the third preset coordinates are the coordinates of two points that lie on a straight line passing through the cursor position and parallel to the diagonal of the third control, the two points being equidistant from the cursor position.
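A sketch of the tenth and eleventh implementations, reusing ZoomParam and ControlBounds from the earlier sketches; the 50-pixel half-span between the two simulated fingers and the 0.05 constant (which matches the example given later in the description) are assumptions.

```kotlin
import kotlin.math.hypot

fun ctrlPlusWheelToZoom(cursorX: Float, cursorY: Float,
                        wheelCoefficient: Float, forward: Boolean,
                        c: ControlBounds, halfSpan: Float = 50f): ZoomParam {
    // Direction of the control's diagonal; the two fingers lie on the line
    // through the cursor parallel to it, equidistant from the cursor.
    val dx = c.right - c.left; val dy = c.bottom - c.top
    val len = hypot(dx, dy)
    val ux = dx / len; val uy = dy / len
    // Zoom ratio: 1 + 0.05 x wheel coefficient (forward), 1 - 0.05 x (backward).
    val scale = if (forward) 1f + 0.05f * wheelCoefficient
                else 1f - 0.05f * wheelCoefficient
    return ZoomParam(cursorX - halfSpan * ux, cursorY - halfSpan * uy,
                     cursorX + halfSpan * ux, cursorY + halfSpan * uy,
                     pinchOut = forward, scale = scale)
}
```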
In a twelfth possible implementation manner that is combined with the first possible implementation manner of the first aspect, the first type of input device is a crown, and the first input parameter corresponding to the crown includes a crown rotation parameter; wherein the crown rotation parameter comprises a rotation angle.
In a thirteenth possible implementation combined with the twelfth possible implementation of the first aspect, the preset conversion rule includes:
and under the condition that the first input parameter is a crown rotation parameter, mapping the rotation angle to a sliding distance, mapping the fourth preset coordinate to a start point coordinate, and mapping the preset speed to a sliding speed.
In this embodiment of the application, after conversion by the input conversion module according to the preset rule, when the input event on the electronic device is a crown rotation, the input event received by the first application is a touch-screen slide event.
In a fourteenth possible implementation combined with the thirteenth possible implementation of the first aspect, the first application includes a fourth control; and
the fourth preset coordinates are the center coordinates of the fourth control.
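A sketch of the crown rules of the twelfth to fourteenth implementations, reusing SlideParam and ControlBounds; the preset speed and the distance-per-degree factor are assumptions.

```kotlin
fun crownToSlide(rotationDegrees: Float, c: ControlBounds,
                 presetSpeed: Float = 300f,       // assumption: preset sliding speed
                 distancePerDegree: Float = 2f    // assumption: px of slide per degree
): SlideParam {
    // Fourth preset coordinates: the center of the fourth control.
    val cx = (c.left + c.right) / 2f; val cy = (c.top + c.bottom) / 2f
    // Rotation angle -> sliding distance; preset speed -> sliding speed.
    return SlideParam(cx, cy, presetSpeed, rotationDegrees * distancePerDegree)
}
```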
In a second aspect, an embodiment of the present application provides a readable medium that stores instructions which, when executed by an electronic device, cause the electronic device to implement the input conversion method provided in the first aspect and any one of its possible implementations.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory to store instructions for execution by at least one processor of an electronic device; and
at least one processor configured to execute the instructions to enable the electronic device to implement the input conversion method provided in the first aspect and any one of its possible implementations.
Drawings
FIG. 1 illustrates a schematic diagram of an input event handling process, according to some embodiments of the present application;
FIG. 2 illustrates a schematic diagram of an input event handling process, according to some embodiments of the present application;
FIG. 3 illustrates a hardware architecture diagram of an electronic device 20, according to some embodiments of the present application;
FIG. 4 illustrates a software architecture diagram of an electronic device 20, according to some embodiments of the present application;
FIG. 5 illustrates a schematic diagram of an interaction flow for input event conversion, according to some embodiments of the present application;
FIG. 6A illustrates a schematic diagram of a display menu for operating a notebook computer, according to some embodiments of the present application;
FIG. 6B illustrates an operational handset display menu diagram according to some embodiments of the present application;
FIG. 7A illustrates a process diagram of a long press event, according to some embodiments of the present application;
FIG. 7B illustrates another schematic diagram of long-press event processing, according to some embodiments of the present application;
FIG. 8A illustrates a schematic diagram of a zoom operation in a notebook computer, according to some embodiments of the present application;
FIG. 8B illustrates a schematic diagram of a zoom operation at a cell phone, according to some embodiments of the present application;
FIG. 9 illustrates a process diagram of a zoom event, according to some embodiments of the present application;
FIG. 10 illustrates a schematic diagram of a coordinate point simulation method in scaling event processing, according to some embodiments of the present application;
FIG. 11 illustrates a schematic diagram of another method of coordinate point simulation in scaling event processing, according to some embodiments of the present application;
FIG. 12 illustrates another process diagram for a zoom event, according to some embodiments of the present application;
FIG. 13A illustrates a schematic diagram of adjusting a progress bar on a smart watch, according to some embodiments of the present application;
FIG. 13B illustrates a schematic diagram of adjusting a progress bar on a cell phone, according to some embodiments of the present application;
FIG. 13C illustrates a schematic diagram of adjusting a progress bar on a notebook computer, according to some embodiments of the present application;
FIG. 14A illustrates a schematic diagram of slide event processing, according to some embodiments of the present application;
FIG. 14B illustrates another schematic diagram of slide event processing, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, input conversion methods, electronic devices, and readable media.
The technical solutions of the embodiments of the present application are described in detail below with reference to the accompanying drawings.
FIG. 1 illustrates a schematic diagram of an input event handling process, according to some embodiments of the present application. Referring to fig. 1, the application framework layer of the electronic device may acquire the input events of different types of input devices through the hardware framework layer and classify them: for example, input events of the crown 11 of the smart watch 10 and of the mouse 21 and keyboard 22 of the notebook computer 20 are classified as key events, while input events of the touch pad 23 of the notebook computer 20 and the touch screen 31 of the mobile phone 30 are classified as move events. A listening interface is provided for the applications installed on the electronic device, so that an application can obtain the user's operation of an input device through the listening interface and interact with the user. When a developer develops an application, in order to make the application compatible with different types of input devices, that is, so that the application can respond to input events of different types of input devices that trigger the same function, each type of input device must be adapted separately. For example, referring to fig. 1, for a double press of the crown 11, a double click of the left button of the mouse 21, a press of the enter key of the keyboard 22, a double tap of the touch pad 23 and a tap of the touch screen 31 all to correspond to the function of opening control A, the developer needs to connect these operations to the function of opening control A through adaptations 1 to 5. When developing different applications, the developer must repeat this for the input events generated by the different types of input devices that share the same function, so the development cost is high.
In order to further reduce development cost, the embodiments of the application provide another input event conversion method. Input events of different types of input devices can be mapped to input events of a single type of input device in the operating system of the electronic device, for example in the application framework of the operating system. A developer then only needs to adapt the corresponding functions to the input events of that single type of input device when developing different applications, and can be compatible with the different types of input devices without separately adapting the input events of the other types, which reduces the development cost. Specifically, referring to fig. 2, the operating systems of the smart watch 10, the notebook computer 20 and the mobile phone 30 may include a hardware driver framework and an application framework, where the hardware driver framework enables the operating system to access hardware, for example input devices such as the crown 11, the mouse 21, the keyboard 22, the touch pad 23 and the touch screen 31, and to obtain their input events. When the application framework of the electronic device detects a user operation on an input device, the input events of input devices other than the touch screen, such as touch pad events, keyboard events, mouse events and crown events, are mapped to touch-screen events, and a listening interface is provided for the application. The developer thus only needs to adapt the corresponding functions to touch-screen events when developing different applications, for example adapting touch-screen taps, touch-screen zooming, touch-screen sliding and touch-screen long presses to functions such as opening a control, zooming a control, moving a control and popping up a menu, and does not need to adapt those functions to other types of input events. The application can then respond correctly to the input events of different types of input devices on electronic devices equipped with them, which ensures the compatibility of the application, reduces the developers' workload and lowers the development cost.
It is understood that mapping the input events of different types of input devices to touch screen events is only an example, and in other embodiments, the mapping may also be to other types of events, for example, mapping the input events of different types of devices to input events of a keyboard or input events of a mouse, and the like, which is not limited in this application.
It can be understood that adapting touch screen events such as touch screen clicking, touch screen zooming, touch screen sliding, touch screen long-time pressing, and the like to operations such as opening a control, zooming a control, moving a control, popping up a menu, and the like, respectively, is only an example, in other embodiments, a developer may adapt according to a specific function of a developed application, and the embodiment of the present application is not limited.
In order to facilitate understanding of the technical solution of the embodiments of the present application, a hardware and software structure of an electronic device suitable for input event conversion provided by the embodiments of the present application is described below by taking a notebook computer 20 as an example.
Fig. 3 illustrates a schematic diagram of the hardware structure of a notebook computer 20, according to some embodiments of the present application. As shown in fig. 3, the notebook computer 20 may include one or more processors 201, a memory 202, an interface module 203, input/output (I/O) devices 204, and system control logic 205 coupling the processors 201, the memory 202, the interface module 203 and the input/output (I/O) devices 204. Wherein:
the Processor 201 may include one or more Processing units, such as Processing modules or Processing circuits that may include a Central Processing Unit (CPU), an image Processing Unit (GPU), a Digital Signal Processor (DSP), a Micro-programmed Control Unit (MCU), an Artificial Intelligence (AI) Processor, or a Programmable logic device (FPGA), etc. In some embodiments, the processor 201 may perform operations for obtaining user operations on different types of input devices and mapping the user operations to touch screen events.
The memory 202 may include a non-volatile memory 2021 and a volatile memory 2022. The non-volatile memory 2021 may include one or more tangible, non-transitory computer-readable media for persistently storing data and/or instructions, and may include any suitable non-volatile storage, such as flash memory, a Hard Disk Drive (HDD), a Compact Disc (CD), a Digital Versatile Disc (DVD) or a Solid-State Drive (SSD). The non-volatile memory 2021 may also be a removable storage medium, such as a Secure Digital (SD) memory card. The volatile memory 2022 may include Random-Access Memory (RAM), Double Data Rate Synchronous Dynamic Random-Access Memory (DDR SDRAM), and the like. The memory 202 may be used to store data and programs, permanently or temporarily; for example, in some embodiments the memory 202 may store the instructions that map input events of different types of input devices to touch-screen events.
The interface module 203 may include a Universal Serial Bus (USB) for providing connection support between the notebook computer and other hardware. For example, in some embodiments, the input device may be connected to a notebook computer via the interface module 203.
The input/output (I/O) devices 204 may include an input device 2041, such as a keyboard, mouse, touch screen, touch pad or joystick, which converts a user operation into an analog or digital signal and transmits it to the processor 201; the processor 201 may then execute the corresponding instruction according to the signal from the input device 2041, for example opening the selected control when the enter key of the keyboard is pressed. The input/output (I/O) devices 204 may also include an output device 2042, such as a speaker, printer or display, which presents information from the notebook computer 20 to the user in the form of sound, text, images and so on; for example, in some embodiments the display may show the graphical user interface of an application to facilitate the user's interaction with the notebook computer 20.
The system control logic 205 may include any suitable interface controllers to provide any suitable interface to the other modules of the notebook computer 20 so that the various modules of the notebook computer 20 may communicate with each other. For example, in some embodiments, system control logic 205 may include an input/output controller to enable communication between processor 201 and input/output (I/O) devices 204.
In some embodiments, at least one of the processors 201 may be packaged together with logic for one or more controllers of the System control logic 205 to form a System In Package (SiP). In other embodiments, at least one of the processors 201 may also be integrated on the same Chip with logic for one or more controllers of the System control logic 205 to form a System-on-Chip (SoC).
It is understood that the hardware structure of the notebook computer 20 shown in fig. 3 is only an example, in other embodiments, the notebook computer 20 may also include more or less modules, and may also combine or split some modules, and the embodiments of the present application are not limited.
It is understood that describing the input event conversion provided in the embodiments of the present application with the notebook computer 20 as the electronic device is only an example. In other embodiments, the method may also be used in other electronic devices, including but not limited to smart televisions, smart speakers, tablet computers, servers, wearable devices, head-mounted displays, mobile email devices, portable game consoles, portable music players, reader devices and so on; the embodiments of the present application are not limited in this respect.
It is understood that the operating system used by the notebook computer 20 may be Android™, iOS™, Microsoft™ Windows, HarmonyOS™, etc., which is not limited here. The software architecture of the notebook computer 20 will be described below with the HarmonyOS operating system as an example.
Fig. 4 illustrates a schematic diagram of a software architecture of a notebook computer 20, according to some embodiments of the present application. As shown in fig. 4, the software architecture of the notebook computer 20 includes an application layer 401, a framework layer 402, a system service layer 403, and a kernel layer 404 from top to bottom. Wherein:
the application layer 401: may include system applications 4011 and extension applications 4012 (or third party applications). The system applications 4011 may include a desktop, settings, a camera, navigation, and the like; the extended applications 4012 may include software applications such as smart home control applications and music players. It is understood that the applications developed by the developer are all in the application layer 401.
The framework layer 402 provides a multi-language framework for the application layer and includes a User Interface (UI) framework 4021, a user program framework 4022, a capability framework 4023, and the like. The UI framework 4021 includes a window manager, content providers, a view system, a phone manager, a resource manager, a notification manager and the like, which are not described here. The user program framework 4022 may provide applications of the application layer 401 with interfaces for accessing the devices and data of the notebook computer 20, for example listening interfaces for input events, such as a touch-screen event listening interface and a touch-screen gesture listening interface; an application may trigger the corresponding function according to the input event acquired from the listening interface. The capability framework 4023 may provide applications with the capability components they require, such as computing capabilities (which may include CPU computing power, Graphics Processing Unit (GPU) computing power, Image Signal Processor (ISP) computing power, etc.), sound pickup capabilities (which may include microphone pickup capability, voice recognition capability, etc.), device security capabilities (which may include the security level of the trusted execution environment, etc.), display capabilities (which may include screen resolution, screen size, etc.), playback capabilities (which may include sound amplification, stereo effects, etc.) and storage capabilities (which may include storage capacity, Random-Access Memory (RAM) capacity, etc.), without limitation.
The system service layer 403 is the core of the software system of the notebook computer 20, and can provide services to the application programs of the application layer 401 through the framework layer 402. The system services layer 403 includes a distributed soft bus 4031, a distributed data management module 4032, a multimode input subsystem 4033, and the like. Wherein:
the distributed soft bus 4031 is used to couple the notebook computer 20 with other electronic devices to form a distributed system. For example, in some embodiments, the distributed soft bus 4031 may couple the notebook computer 20 with input devices of other electronic devices so that the notebook computer 20 may use the input devices of the other electronic devices.
The distributed data management module 4032 realizes distributed management of application program data and user data based on a distributed soft bus. For example, in some embodiments, the notebook computer 20 may obtain input events generated by input devices of other electronic devices through the distributed data management module.
The multimodal input subsystem 4033 provides input services to the laptop 20. For example, in some embodiments, the multimodal input subsystem 4033 may obtain user actions on input devices and communicate them to the application framework 4022.
The kernel layer 404 includes a kernel subsystem 4041 and a driver subsystem 4042. The kernel subsystem 4041 provides basic kernel capabilities to the upper layers, including process/thread management, memory management, the file system, network management and peripheral management, while shielding differences between kernels. The driver subsystem 4042 includes a hardware driver framework that can provide a unified peripheral access capability and management framework for the notebook computer 20. For example, in some embodiments, the hardware driver framework may provide the capability to access an input device.
The technical solution of the embodiment of the present application is described below by taking an example of mapping input events of different types of input devices to touch screen events in combination with a hardware structure and a software architecture of the notebook computer 20.
FIG. 5 illustrates a schematic diagram of an interaction flow for input event conversion, according to some embodiments of the present application. As shown in FIG. 5, the method includes the following steps.
Step 501: the input device 2041 transmits an input event to the application framework 4022.
For example, in some embodiments, the input device 2041 may be coupled to the application framework 4022 through the driver subsystem 4042 and the multi-mode input subsystem 4033.
Step 502: the application framework 4022 determines whether the input event sent by the input device 2041 is a touch screen event. If the input event is a touch screen event, turning to step 504, and directly sending the touch screen event to the application program; otherwise go to step 503.
Step 503: the application framework 4022 maps the input events to touch screen events. The application framework 4022 maps the input event to a touch screen event that implements the same function according to the specific function of the input event and according to a preset conversion rule.
For example, in some embodiments, when the application framework 4022 detects that the input event is a right mouse click event, the corresponding function is a pop-up menu, and in the touch screen event, the long press corresponding function is also a pop-up menu, and at this time, the application framework 4022 maps the right mouse click event to a long press event in the touch screen event.
In some embodiments, the touch event may be abstracted, for example, abstracted as a touch screen gesture such as a long press gesture, a zoom gesture, a slide gesture, and the like, and the touch screen gesture is defined by corresponding parameters, when the application framework 4022 detects an input event of an input device of another type having the same function as the touch screen gesture, the input event of the input device of the other type is mapped as the defined parameter, and is provided to the application program through the touch screen gesture monitoring interface, and a specific mapping method (a conversion rule) will be described in detail below, which is not described herein again.
It is understood that in other embodiments, the input events of other types of input devices may also be mapped to the specific operations of the touch screen; that is, when the application framework 4022 detects an input event of another type of input device that has the same function as a touch-screen gesture, it simulates the correspondence between touch time and touch position that the touch screen would produce when that gesture is triggered. For example, in some embodiments, the menu of an application can be opened both by a long-press gesture on the touch screen and by a right click of the mouse; in this case, when the user clicks the right mouse button, the application framework 4022 may simulate the user's operation on the touch screen, for example generating a touch operation that touches the position of the mouse cursor for a preset duration.
Step 504: the application framework 4022 sends the touch screen event to the application.
For example, in some embodiments, the application framework 4022 may provide a touch screen gesture listening interface to the application of the application layer 401, so that the application of the application layer 401 may obtain the touch screen gesture through the touch screen gesture listening interface and execute the related function according to the touch screen gesture.
It can be understood that when the application program obtains a specific touch screen operation and/or touch screen gesture, the function corresponding to the touch screen operation and/or touch screen gesture can be realized.
It is to be appreciated that in other embodiments, the application framework 4022 may also provide a touch screen operation listening interface to the application to provide the application with the locations where the touch screen is touched at different times, so that the developer can flexibly adapt the functions according to the specific touch operations when developing the application.
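On the application side, the adaptation described above might look like the following sketch; the listener interface and its method names are assumptions and not a real framework API, and the parameter classes are reused from the earlier sketch.

```kotlin
// Hypothetical app-side adaptation: the application registers only against
// the touch-screen gesture listening interface, so it never needs to know
// which physical input device actually produced the event.
interface TouchGestureListener {
    fun onLongPress(p: LongPressParam)
    fun onZoom(p: ZoomParam)
    fun onSlide(p: SlideParam)
}

class MusicPlayerGestures : TouchGestureListener {
    override fun onLongPress(p: LongPressParam) {
        // e.g. pop up the shortcut menu at the long-press position
    }
    override fun onZoom(p: ZoomParam) {
        // e.g. scale the current control by p.scale
    }
    override fun onSlide(p: SlideParam) {
        // e.g. move a control or drag a progress bar
    }
}
```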
After different types of input events are mapped to touch-screen events through the above input event conversion, a developer only needs to adapt the corresponding functions to touch-screen events, such as touch-screen gestures, when developing an application in order to be compatible with other types of input devices, which reduces the workload of adapting input events of input devices and lowers the development cost.
It should be noted that, the implementation of the input event conversion provided in each embodiment of the present application through the application framework 4022 is only an example, and in other embodiments, the implementation may also be implemented through other modules in a software architecture, for example, the multimode input subsystem 4033, and the input event conversion provided in each embodiment of the present application may also be implemented through an independent application program or module, which is not limited in the embodiment of the present application.
It can be understood that mapping methods adopted for mapping the input events of different types of input devices to different touch screen events are also different, and the technical solutions of the present application will be described in detail below by taking a touch screen long-press gesture, a touch screen zoom gesture, and a touch screen slide gesture as examples.
Firstly, a technical scheme of mapping input events of other types of input devices to specific operations corresponding to long-press gestures and/or long-press gestures of a touch screen is introduced.
In some application scenarios, the user needs to pop up the menu of an application while interacting with the electronic device in order to perform the next operation. For example, in some embodiments, the user may reach shortcut functions of a music player, such as "listen to a song to identify it", "search for a song", "recently played" and "local playback", by popping up the menu of the music player. Fig. 6A and 6B are schematic diagrams of the interfaces in which the music player applications of the notebook computer 20 and the mobile phone 30 pop up a menu 60. On the notebook computer 20, the pop-up menu 60 can be triggered by pressing the menu key of the keyboard 22, by pressing the "SHIFT" key and the "F10" key of the keyboard 22 simultaneously, or by right-clicking the mouse 21 on the music player icon; on the mobile phone 30, the pop-up menu 60 can be triggered by long-pressing the music player icon on the touch screen 31. To avoid a developer separately adapting the different types of input devices that trigger the pop-up menu function when developing the music player, the SHIFT+F10 press event of the keyboard 22, the menu-key press event of the keyboard 22 and the right-click event of the mouse 21 may all be mapped in the application framework 4022 to the long-press gesture of the touch screen 31 and/or the specific operations corresponding to the long-press gesture.
In particular, FIG. 7A illustrates a schematic diagram of mapping input events of other types of input devices to a touch-screen long-press gesture, according to some embodiments of the present application. In some embodiments, the long-press gesture of the touch screen is the touch event generated when a user continuously presses an area of the touch screen for longer than a preset long-press time, for example longer than 1 second. Referring to fig. 7A, in some embodiments, when the application framework 4022 detects the touch positions of the user on the touch screen 31 at different times, it may perform gesture recognition and provide the recognized long-press gesture, with the long-press position as its parameter, to the touch-screen gesture listening interface, so that a developer may adapt functions to the long-press gesture. When the input device is not a touch screen, for example the mouse 21 or the keyboard 22, the application framework 4022, upon detecting that the right button of the mouse 21 is clicked, that the "SHIFT" and "F10" keys of the keyboard 22 are pressed simultaneously, or that the menu key of the keyboard 22 is pressed, maps these operations to a long-press position; for example, the current cursor position of the notebook computer 20 or the center position of the currently selected control may be mapped to the long-press position, and the resulting long-press gesture event is provided to the touch-screen gesture listening interface. A developer therefore does not need to pay attention to the input events of the other types of input devices and only needs to adapt to the long-press gesture, so that the notebook computer 20 pops up the menu 60 when SHIFT+F10 is pressed on the keyboard 22, when the menu key of the keyboard 22 is pressed, or when the right button of the mouse 21 is clicked.
In other embodiments, the application framework 4022 may map input events of other types of input devices to the specific operations corresponding to the long-press gesture. As before, in some embodiments, the long-press gesture is the event generated when the user presses the touch screen for longer than the preset long-press time, so the long-press gesture of the touch screen 31 can be simulated after an input event of the mouse 21 or the keyboard 22 corresponding to the long-press gesture is detected. Referring to fig. 7B, after detecting the SHIFT+F10 press event of the keyboard 22, the menu-key press event of the keyboard 22 or the right-click event of the mouse 21, the application framework 4022 maps the current cursor position of the notebook computer 20, or the center position of the currently selected control, to the touch position of the touch screen throughout the preset long-press time; that is, it provides the correspondence between touch time and touch position within the preset long-press time, so that the gesture-recognition unit can recognize that correspondence as a long-press gesture and provide it to the touch-screen gesture listening interface.
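One way to picture this simulation is as a stream of (touch time, touch position) samples; the sample type, step size and hold duration below are assumptions.

```kotlin
// Sketch of simulating the specific operation behind a long-press gesture:
// samples held at one point for longer than the preset long-press time
// (1 second in the example above).
data class TouchSample(val timeMs: Long, val x: Float, val y: Float)

fun simulateLongPress(x: Float, y: Float,
                      holdMs: Long = 1_100, stepMs: Long = 100): List<TouchSample> =
    // The position never moves, so the gesture-recognition unit will
    // recognize the sample stream as a long-press gesture.
    (0..holdMs step stepMs).map { t -> TouchSample(t, x, y) }
```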
In other embodiments, instead of performing gesture recognition on the specific operation mapped from the input event of the other type of input device, the application framework 4022 may provide the simulated specific operation corresponding to the long-press gesture directly to the application, so that a developer can adapt functions flexibly according to the specific operation, for example popping up different menus for different press durations.
It may be understood that, in other embodiments, the long-press gesture and/or the specific touch operation of the long-press gesture may also correspond to other functions, for example, deleting a control, and the like, which is not limited in this embodiment of the present application.
It is understood that the long-press gesture defined by the long-press position is only an example, in other embodiments, other parameters may be used to define the long-press gesture of the touch screen, and input events that trigger the same function by other types of input devices and the long-press gesture may be mapped to the defined parameters, which is not limited in the embodiment of the present application.
The following describes a technical solution for mapping input events of other types of input devices to zoom gestures of a touch screen and/or specific operations corresponding to the zoom gestures.
In some application scenarios, a user needs to zoom a control of an application; for example, when browsing a picture, the user needs to zoom in to view its details or zoom out to view its overall effect. Fig. 8A and 8B illustrate the interfaces of the notebook computer 20 and the mobile phone 30 viewing pictures with a picture-browsing application, according to some embodiments of the present application. Referring to fig. 8A, when a user wants to zoom a picture in or out, the user may press the "CTRL" key and the "+" key of the keyboard 22 simultaneously, press the "CTRL" key and the "-" key simultaneously, or scroll the wheel of the mouse 21 forward or backward while holding the "CTRL" key of the keyboard 22. Referring to fig. 8B, when the user zooms a picture on the mobile phone 30, the zooming can be triggered by a pinch operation on the touch screen 31. To avoid a developer separately adapting the different types of input devices that trigger the zoom function when developing the picture-browsing application, the event of the keyboard 22 pressing the "CTRL" and "+" keys simultaneously, the event of pressing the "CTRL" and "-" keys simultaneously, and the event of the mouse 21 wheel scrolling while the "CTRL" key of the keyboard 22 is held down may all be mapped in the application framework 4022 to the zoom gesture of the touch screen 31 and/or the specific operations corresponding to the zoom gesture.
In particular, fig. 9 illustrates a schematic diagram of an input event conversion, according to some embodiments of the present application. In some embodiments, when the user holds two fingers on the touch screen 31 and the two fingers move in opposite directions, the operation is a pinch operation, and the touch input event it generates on the touch screen 31 is a zoom gesture. In some embodiments, referring to fig. 9, when the zoom gesture is recognized from the relationship between the user's touch times and touch positions on the touch screen 31, the application framework 4022 may provide the zoom gesture, with the two-finger coordinates, the pinch direction and the zoom ratio as parameters, to the touch-screen gesture listening interface, so that a developer may adapt functions to the zoom gesture when developing an application. When the input device is not a touch screen, the application framework 4022, upon detecting an input event of another device corresponding to the zoom gesture, for example the keyboard 22 pressing the "CTRL" and "+" keys simultaneously, pressing the "CTRL" and "-" keys simultaneously, or the wheel of the mouse 21 scrolling while the "CTRL" key of the keyboard 22 is held down, maps preset coordinates to the two-finger coordinates, maps the "+" key or the "-" key of the keyboard 22 or the scrolling direction of the mouse 21 to the pinch direction, and maps a preset scale or the wheel coefficient of the mouse 21 to the zoom ratio, then provides the zoom gesture to the touch-screen listening interface. A developer therefore only needs to adapt the corresponding function, for example zooming a picture, to the zoom gesture, and the user can zoom the picture by pressing the "CTRL" and "+" keys of the keyboard 22 simultaneously, pressing the "CTRL" and "-" keys simultaneously, or scrolling the wheel of the mouse 21 while holding the "CTRL" key.
It is understood that in some embodiments the wheel coefficient refers to the distance a control slides for one scroll of the mouse wheel, for example three lines of text per scroll. The application may obtain the wheel coefficient and map it to an operation on a control of the application, for example a sliding operation on the control or a zoom operation on a picture.
In some embodiments, referring to fig. 10, when the application framework 4022 detects that the user presses the "CTRL" key and the "+" key of the keyboard 22 simultaneously, or presses the "CTRL" key and the "-" key simultaneously, the points at the 1/4 and 3/4 positions on the diagonal AB of the current control, such as points P and Q, may be mapped to the two-finger coordinates. Upon detecting that the user pressed the "CTRL" key and the "+" key, the mapping is zoom-in with a preset scale, for example 1.05; upon detecting that the user pressed the "CTRL" key and the "-" key, the mapping is zoom-out with a scale of 0.95. Thus, each time the user presses the "CTRL" key and the "+" key simultaneously, the picture is enlarged to 1.05 times its size before the press, with the picture center as the zoom center; each time the user presses the "CTRL" key and the "-" key simultaneously, the picture is reduced to 0.95 times its size before the press, with the picture center as the zoom center.
In some embodiments, referring to FIG. 11, when the application framework 4022 detects that the user scrolls the wheel of the mouse 21 while pressing the "CTRL" key of the keyboard 22, the points M and N, which lie on a straight line UV passing through the cursor position O1 on the current control and parallel to the control's diagonal AB and which are equidistant from O1, may be mapped to the two-finger coordinates, where the length of O1U is less than the length of O1V and N is the midpoint of O1U. Upon detecting that the wheel of the mouse 21 scrolls forward, the mapping is zoom-in with a zoom ratio of 1 + constant × wheel coefficient, for example 1 + 0.05 × wheel coefficient; upon detecting that the wheel scrolls backward, the mapping is zoom-out with a zoom ratio of 1 - constant × wheel coefficient, for example 1 - 0.05 × wheel coefficient. Thus, when the user holds the "CTRL" key and scrolls the wheel forward once, the picture is enlarged to (1 + 0.05 × wheel coefficient) times its size before scrolling, with the cursor position as the zoom center; for example, with a wheel coefficient of 1, it is enlarged to 1.05 times. When the user holds the "CTRL" key and scrolls backward once, the picture is reduced to (1 - 0.05 × wheel coefficient) times its size before scrolling, with the cursor position as the zoom center; for example, with a wheel coefficient of 1, it is reduced to 0.95 times.
It is understood that, in other embodiments, the application framework 4022 may instead map the input events of other types of input devices to the specific operations of the zoom gesture. For example, when the application framework 4022 detects that the "CTRL" key and the "+" key of the keyboard 22 are pressed simultaneously, that the "CTRL" key and the "-" key are pressed simultaneously, or that the wheel of the mouse 21 is scrolled while the "CTRL" key of the keyboard 22 is held down, it simulates the user's pinch operation on the touch screen 31, that is, it generates the correspondence between touch position and touch time within a preset time, so that the zoom gesture is then recognized by the gesture recognition module and transmitted to the touch screen gesture monitoring interface.
Specifically, referring to fig. 10 and 12, when the application framework 4022 detects that the user presses the "CTRL" key and the "+" key of the keyboard 22 simultaneously, or presses the "CTRL" key and the "-" key simultaneously, point P and point Q are taken as the two-finger start coordinates, and two points on the diagonal AB of the current control, centered on the control center O and separated by the zoom scale times the length of PQ, are taken as the two-finger end coordinates, so as to generate the trajectories of the two coordinates within a preset time, that is, the correspondence between touch time and touch position within the preset time. For example, when it is detected that the "CTRL" key and the "+" key are pressed, point P and point Q are the start coordinates, point P1 and point Q1 are the end coordinates, and P1Q1/PQ = 1.05; the simulated touch operation is then that, within a preset time such as 0.5 second, point P moves along PP1 and point Q moves along QQ1. For another example, when it is detected that the "CTRL" key and the "-" key are pressed, point P and point Q are the start coordinates, point P2 and point Q2 are the end coordinates, and P2Q2/PQ = 0.95; the simulated touch operation is then that, within a preset time such as 0.5 second, point P moves along PP2 and point Q moves along QQ2.
Referring to fig. 11 and 12, when the application framework 4022 detects that the user scrolls the wheel of the mouse 21 while holding down the "CTRL" key of the keyboard 22, the aforementioned points M and N are taken as the two-finger start coordinates, and two points on UV, centered on the cursor position O1 and separated by the zoom scale times the length of MN, are taken as the two-finger end coordinates, so as to generate the trajectories of the two coordinates within a preset time, that is, the correspondence between touch time and touch position within the preset time. For example, when it is detected that the wheel of the mouse 21 scrolls forward, points M and N are the start coordinates, points M1 and N1 are the end coordinates, and M1N1/MN = 1 + 0.05 × wheel coefficient; the simulated touch operation is then that, within a preset time such as 0.5 second, point M moves along MM1 and point N moves along NN1. For another example, when it is detected that the wheel of the mouse 21 scrolls backward, points M and N are the start coordinates, points M2 and N2 are the end coordinates, and M2N2/MN = 1 - 0.05 × wheel coefficient; the simulated touch operation is then that, within a preset time such as 0.5 second, point M moves along MM2 and point N moves along NN2.
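A minimal sketch of this trajectory generation, assuming the Point type from the sketches above: each simulated finger moves linearly from its start coordinate toward the coordinate scaled about the chosen center, sampled over the preset time. The sample count, the linear interpolation, and all names are assumptions.

```kotlin
// One sampled instant of a simulated two-finger touch; names are hypothetical.
data class TouchSample(val timeMs: Long, val finger1: Point, val finger2: Point)

// Generate the correspondence between touch time and touch position for a
// simulated pinch: both fingers move linearly from their start coordinates
// toward coordinates scaled about 'center' by 'scale' within 'durationMs'.
fun simulatePinch(
    start: Pair<Point, Point>, // e.g. points P and Q
    center: Point,             // e.g. control center O or cursor position O1
    scale: Float,              // e.g. 1.05 for zoom-in, 0.95 for zoom-out
    durationMs: Long = 500,    // preset time, e.g. 0.5 second
    samples: Int = 10          // assumed sampling density
): List<TouchSample> {
    fun scaled(p: Point) =
        Point(center.x + scale * (p.x - center.x), center.y + scale * (p.y - center.y))
    val end = scaled(start.first) to scaled(start.second)
    return (0..samples).map { i ->
        val t = i.toFloat() / samples
        fun lerp(a: Point, b: Point) = Point(a.x + t * (b.x - a.x), a.y + t * (b.y - a.y))
        TouchSample((durationMs * t).toLong(),
                    lerp(start.first, end.first),
                    lerp(start.second, end.second))
    }
}
```

The same routine covers both the keyboard case (center O, scale 1.05/0.95) and the "CTRL" + wheel case (center O1, scale 1 ± 0.05 × wheel coefficient).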
It will be appreciated that, in some embodiments, the application framework 4022 triggers a zoom gesture each time the user presses the "CTRL" key and the "+" key of the keyboard 22, or the "CTRL" key and the "-" key, simultaneously. When the user holds down the "CTRL" key and the "+" key, or the "CTRL" key and the "-" key, simultaneously without releasing them, the application framework 4022 periodically triggers the zoom gesture, for example once every 1 second.
It will be appreciated that, in some embodiments, the application framework 4022 triggers a zoom gesture for each notch the wheel is scrolled while the user holds down the "CTRL" key of the keyboard 22. When the user continuously scrolls the wheel of the mouse 21 while holding down the "CTRL" key, the application framework 4022 periodically triggers the zoom gesture, and stops doing so when the interval between two scroll events exceeds a certain time, for example 500 ms (milliseconds).
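The stop condition described above can be sketched as a small session object; the class name, method signature and bookkeeping are illustrative assumptions, with only the 500 ms figure taken from the example above.

```kotlin
// Session bookkeeping for continuous "CTRL" + wheel zooming: each wheel event
// triggers a zoom gesture, and the session ends once the interval between two
// scroll events exceeds the timeout. Only the 500 ms figure comes from the
// example above; the class and its API are illustrative assumptions.
class WheelZoomSession(private val timeoutMs: Long = 500) {
    private var lastScrollMs = Long.MIN_VALUE

    // Returns true if this wheel event continues the current zoom session,
    // false if it starts a new one.
    fun onWheelEvent(nowMs: Long): Boolean {
        val continues = lastScrollMs != Long.MIN_VALUE && nowMs - lastScrollMs <= timeoutMs
        lastScrollMs = nowMs
        return continues
    }
}
```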
In other embodiments, the application framework 4022 may also skip gesture recognition for the specific operation mapped from the input event of another type of input device, and instead provide the specific operation of the simulated touch screen event, that is, the correspondence between touch time and touch position at different times, directly to the application, so that a developer may adapt functions more flexibly to the specific operation, for example adapting the zoom speed to the speed of the pinch operation.
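As a sketch of this flexible adaptation, and reusing the hypothetical TouchSample type from the trajectory sketch above, an application could, for example, derive the pinch speed from the raw samples and scale its zoom speed accordingly; the formula is an illustrative assumption.

```kotlin
import kotlin.math.sqrt

// Derive the pinch speed (change of finger separation per second) from raw
// simulated touch samples, so an application could scale its zoom speed by it.
// TouchSample is the hypothetical type from the sketch above.
fun pinchSpeed(samples: List<TouchSample>): Float {
    if (samples.size < 2) return 0f
    fun gap(s: TouchSample): Float {
        val dx = s.finger2.x - s.finger1.x
        val dy = s.finger2.y - s.finger1.y
        return sqrt(dx * dx + dy * dy)
    }
    val dtMs = samples.last().timeMs - samples.first().timeMs
    if (dtMs == 0L) return 0f
    return (gap(samples.last()) - gap(samples.first())) / (dtMs / 1000f)
}
```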
It is understood that, in other embodiments, the zoom gesture and/or the specific touch operation of the zoom gesture may also correspond to other functions, which is not limited by the embodiments of the present application.
It is understood that defining the zoom gesture by the two-finger coordinates, the zoom direction and the zoom scale is only an example. In other embodiments, other parameters may be used to define the touch screen zoom gesture, and the input events by which other types of input devices trigger the same function as the zoom gesture are mapped to the parameters so defined, which is not limited by the embodiments of the present application.
The following describes a technical solution for mapping input events of other types of input devices to a sliding gesture of a touch screen and/or a specific operation corresponding to the sliding gesture.
In some application scenarios, a user needs to perform a sliding operation on a control in an application program; for example, while playing music, the user may slide a progress bar to adjust the playing progress. Fig. 13A-13C illustrate interface diagrams of a smart watch 10, a notebook computer 20, and a cell phone 30, respectively, when using a music player, according to some embodiments of the present application. Referring to fig. 13A, when the user uses the music player of the smart watch 10, the playing progress may be adjusted by rotating the crown 11; referring to fig. 13B, when the user uses the music player of the notebook computer 20, the playing progress may be adjusted by pressing the left button of the mouse 21 and dragging the progress bar 130, or by scrolling the wheel; referring to fig. 13C, when the user uses the music player of the cell phone 30, the playing progress may be adjusted by a sliding operation on the progress bar 130 on the touch screen 31. To avoid a developer separately adapting, when developing a music player application, to each type of input device that can trigger the progress adjustment function, the rotation event of the crown 11, the left-button press-and-move event of the mouse 21, and the wheel scroll event of the mouse 21 may be mapped in the application framework 4022 to the slide gesture of the touch screen 31 and/or the specific operation corresponding to the slide gesture.
In particular, FIG. 14A illustrates a schematic diagram of mapping input events of other types of input devices to touch screen swipe gestures, according to some embodiments of the present application. In some embodiments, the touch screen slide gesture is an operation in which the user presses a certain area of the touch screen 31 and moves the touch position. Referring to fig. 14A, in some embodiments, when the application framework 4022 detects the touch positions of the user on the touch screen 31 at different times, it may perform gesture recognition and provide the recognized slide gesture, with the start point coordinate, the sliding distance, and the sliding speed as parameters, to the touch screen gesture monitoring interface, so that a developer may adapt functions to the slide gesture. When the input device is an input device other than the touch screen, for example when the crown 11 is rotated, or when the left button of the mouse 21 is pressed and the mouse is moved, the application framework 4022 maps the operations of the crown 11 or the mouse 21 to the start point coordinate, the sliding distance, and the sliding speed. For example, the center of the current control of the smart watch 10 may be taken as a preset coordinate and mapped to the start point coordinate, the rotation angle of the crown 11 mapped to the sliding distance, and a preset speed mapped to the sliding speed; likewise, the current cursor position of the notebook computer 20 may be mapped to the start point coordinate, the wheel coefficient of the mouse 21 or the movement distance of the mouse 21 mapped to the sliding distance, and a preset speed or the movement speed of the mouse 21 mapped to the sliding speed. In this way, after the developer adapts the progress adjustment function to the slide gesture, the user can adjust the playing progress by rotating the crown 11, by pressing the left button of the mouse 21 and moving the mouse, or by scrolling the wheel of the mouse 21.
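For illustration, the fig. 14A mapping may be sketched as follows; SwipeGesture, the constants and the function names are hypothetical, and the Point type is reused from the sketches above.

```kotlin
// A swipe reduced to the three parameters named above; all names hypothetical.
data class SwipeGesture(val start: Point, val distance: Float, val speed: Float)

const val PRESET_SPEED = 300f            // assumed preset sliding speed (px/s)
const val CROWN_DISTANCE_PER_DEGREE = 2f // assumed rotation-to-distance factor

// Crown rotation: control center -> start point, rotation angle -> distance.
fun mapCrownRotation(controlCenter: Point, rotationDegrees: Float): SwipeGesture =
    SwipeGesture(controlCenter, rotationDegrees * CROWN_DISTANCE_PER_DEGREE, PRESET_SPEED)

// Left-button drag: cursor -> start point, move distance/speed -> slide distance/speed.
fun mapMouseDrag(cursor: Point, moveDistance: Float, moveSpeed: Float): SwipeGesture =
    SwipeGesture(cursor, moveDistance, moveSpeed)

// Wheel scroll: cursor -> start point, wheel coefficient -> distance.
fun mapMouseWheel(cursor: Point, wheelCoefficient: Float): SwipeGesture =
    SwipeGesture(cursor, wheelCoefficient, PRESET_SPEED)
```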
In other embodiments, referring to fig. 14B, the application framework 4022 may map input events of other types of input devices to the specific operations corresponding to the slide gesture. For example, when the application framework 4022 detects a rotation event of the crown 11, a left-button press-and-move event of the mouse 21, or a wheel scroll event of the mouse 21, it generates, within a preset time, a simulated sliding operation of the user on the touch screen 31, that is, the correspondence between touch time and touch position as the user slides on the touch screen 31. For example, upon detecting a rotation operation of the crown 11, the application framework 4022 generates the correspondence between touch time and touch position within a preset time, for example within 1 second, taking the preset coordinate as the start point coordinate and taking as the sliding distance a movement trajectory along a straight line through the start point coordinate parallel to the diagonal of the current control, whose length is a multiple of the rotation angle of the crown 11. For another example, upon detecting that the wheel of the mouse 21 is scrolled or that the left button of the mouse 21 is pressed and the mouse moved, it generates, within a preset time, for example within 1 second, the correspondence between touch time and touch position on the touch screen in which the movement trajectory of the mouse 21 is the sliding trajectory.
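A minimal sketch of this simulated slide, again with hypothetical names and the Point type from the sketches above: a single touch point is moved from the start coordinate along a fixed direction for the given distance within the preset time; the direction vector, duration and sample count are assumptions.

```kotlin
// One sampled instant of a simulated single-finger slide; names hypothetical.
data class SlideSample(val timeMs: Long, val touch: Point)

// Move one touch point from 'start' along a unit direction for 'distance'
// within 'durationMs', sampled linearly.
fun simulateSlide(
    start: Point,
    directionX: Float, directionY: Float, // unit direction of the slide
    distance: Float,                      // e.g. a multiple of the crown angle
    durationMs: Long = 1000,              // preset time, e.g. 1 second
    samples: Int = 20                     // assumed sampling density
): List<SlideSample> = (0..samples).map { i ->
    val t = i.toFloat() / samples
    SlideSample(
        (durationMs * t).toLong(),
        Point(start.x + t * distance * directionX, start.y + t * distance * directionY)
    )
}
```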
In other embodiments, the application framework 4022 may also skip gesture recognition for the specific operation mapped from the input event of another type of input device, and instead provide the specific operation of the simulated slide event, that is, the correspondence between touch time and touch position at different times, directly to the application, so that a developer may adapt functions more flexibly to the specific operation, for example adapting the sliding speed to the length of the sliding distance.
It is understood that, in other embodiments, the slide gesture and/or the specific touch operation of the slide gesture may also correspond to other functions, for example deleting a control, which is not limited by the embodiments of the present application.
It is to be understood that defining the slide gesture by the start point coordinate, the sliding distance, and the sliding speed is only an example. In other embodiments, other parameters may be used to define the touch screen slide gesture, and the input events by which other types of input devices trigger the same function as the slide gesture are mapped to the parameters so defined, which is not limited by the embodiments of the present application.
It should be noted that, in some embodiments, the input events of a touch panel may likewise include a long-press gesture, a swipe gesture, and a zoom gesture, with parameters defined as in the foregoing embodiments. That is, when the application framework 4022 detects a long-press gesture, a swipe gesture, or a zoom gesture on the touch panel, the parameters of that gesture may be used directly as the parameters of the corresponding touch screen event, without special calculation.
It should be noted that, in some embodiments, when the input devices of the electronic device include other keys or knobs having the same functions as the mouse, the keyboard, or the crown, the application framework 4022 may map those keys and knobs to touch screen operations according to the solutions of the foregoing embodiments, which is not limited by the embodiments of the present application.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or a tangible machine-readable memory used to transmit information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. Rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure is not meant to imply that such feature is required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the apparatus embodiments of the present application, each unit/module is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logical unit/module itself is not what matters most, and the combination of functions implemented by the logical units/modules is the key to solving the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above apparatus embodiments do not introduce units/modules that are not closely related to solving the technical problem addressed by the present application; this does not mean that no other units/modules exist in the above apparatus embodiments.
It is noted that, in the examples and description of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (15)

1. An input conversion method, applied to an electronic device, characterized in that the electronic device supports a first type of input device, an application program framework and a first application are installed on the electronic device, and the first application supports second input parameters of a second type of input device;
the method comprises the following steps:
the application program framework acquires a first input parameter which is input through the first type of input device and is directed to the first application;
the application program framework converts the first input parameter into a second input parameter based on a preset conversion rule, wherein corresponding first input parameters are set for different second input parameters in the preset conversion rule, and the different second input parameters can trigger the first application to realize different functions;
the first application responds to the second input parameter to realize a corresponding function;
the preset conversion rule comprises at least one of the following rules:
when the first input parameter is a right-click parameter, mapping the cursor position to a long-press position;
when the first input parameter is a scroll wheel scrolling parameter, mapping the cursor position to a starting point coordinate, and mapping the scroll wheel coefficient and/or the scrolling direction to a sliding speed;
and when the first input parameters are a left-button long-press parameter and a mouse movement parameter, mapping the cursor position to a sliding start point, mapping the movement speed to a sliding speed, and mapping the movement distance to a sliding distance.
2. The input conversion method according to claim 1, wherein the second type of input device is a touch screen, and the touch screen correspondingly has at least one of the following second input parameters:
a long-press parameter, wherein the long-press parameter comprises a long-press position;
zoom parameters, wherein the zoom parameters comprise two-finger coordinates, a pinch direction and a zoom scale;
sliding parameters, wherein the sliding parameters comprise a starting point coordinate, a sliding speed and a sliding distance; and/or,
the first type of input device is a mouse, and the mouse correspondingly has at least one of the following first input parameters:
a right-click parameter, wherein the right-click parameter comprises a cursor position;
scroll wheel scrolling parameters, wherein the scroll wheel scrolling parameters comprise a cursor position, a scroll wheel coefficient and a scrolling direction;
a left-button long-press parameter, wherein the left-button long-press parameter comprises a cursor position;
and mouse movement parameters, wherein the mouse movement parameters comprise a cursor position, a movement speed and a movement distance.
3. The input conversion method according to claim 1, wherein the first type of input device is a keyboard, the first input parameter corresponding to the keyboard comprises a key parameter, and the key parameter comprises at least one of the following parameters:
the "CTRL" key and the "+" key pressed simultaneously, or the "CTRL" key and the "-" key pressed simultaneously;
the "menu" key pressed;
the "SHIFT" key and the "F10" key pressed simultaneously; and/or,
the second type of input device is a touch screen, and the touch screen correspondingly has at least one of the following second input parameters:
a long-press parameter, wherein the long-press parameter comprises a long-press position;
zoom parameters, wherein the zoom parameters comprise two-finger coordinates, a pinch direction and a zoom scale;
sliding parameters, wherein the sliding parameters comprise a starting point coordinate, a sliding speed and a sliding distance; and/or,
the first type of input device is a mouse, and the mouse correspondingly has at least one of the following first input parameters:
a right-click parameter, wherein the right-click parameter comprises a cursor position;
scroll wheel scrolling parameters, wherein the scroll wheel scrolling parameters comprise a cursor position, a scroll wheel coefficient and a scrolling direction;
a left-button long-press parameter, wherein the left-button long-press parameter comprises a cursor position;
and mouse movement parameters, wherein the mouse movement parameters comprise a cursor position, a movement speed and a movement distance.
4. The input conversion method according to claim 3, wherein the preset conversion rule further comprises at least one of the following rules:
when the key parameter of the keyboard is that the "CTRL" key and the "+" key are pressed simultaneously or that the "CTRL" key and the "-" key are pressed simultaneously, mapping the "+" key or the "-" key to the pinch direction, mapping a first preset coordinate to the two-finger coordinates, and mapping a first preset scale to the zoom scale;
and when the key parameter of the keyboard is that the "menu" key is pressed or that the "SHIFT" key and the "F10" key are pressed simultaneously, mapping a second preset coordinate to the long-press position.
5. The input conversion method according to claim 4, wherein the first application includes a first control; and
the first preset coordinate is the coordinate at the 1/4 position and the coordinate at the 3/4 position of the diagonal of the first control.
6. The input conversion method according to claim 5, wherein the first preset scale is 1.05 when the key parameter is that the "CTRL" key and the "+" key are pressed simultaneously; and the first preset scale is 0.95 when the key parameter is that the "CTRL" key and the "-" key are pressed simultaneously.
7. The input conversion method according to claim 4, wherein the first application includes a second control; and
the second preset coordinate is the coordinate of the center of the second control.
8. The input conversion method according to claim 1, wherein
the first type of input device comprises a mouse and a keyboard, and the first input parameters corresponding to the mouse and the keyboard comprise:
a keyboard parameter and scroll wheel scrolling parameters; wherein,
the keyboard parameter is that the "CTRL" key is pressed down, and the scroll wheel scrolling parameters comprise a cursor position, a scroll wheel coefficient and a scrolling direction; and/or,
the second type of input device is a touch screen, and the touch screen correspondingly has at least one of the following second input parameters:
a long-press parameter, wherein the long-press parameter comprises a long-press position;
zoom parameters, wherein the zoom parameters comprise two-finger coordinates, a pinch direction and a zoom scale;
sliding parameters, wherein the sliding parameters comprise a starting point coordinate, a sliding speed and a sliding distance; and/or,
the first type of input device is a mouse, and the mouse correspondingly has at least one of the following first input parameters:
a right-click parameter, wherein the right-click parameter comprises a cursor position;
scroll wheel scrolling parameters, wherein the scroll wheel scrolling parameters comprise a cursor position, a scroll wheel coefficient and a scrolling direction;
a left-button long-press parameter, wherein the left-button long-press parameter comprises a cursor position;
and mouse movement parameters, wherein the mouse movement parameters comprise a cursor position, a movement speed and a movement distance.
9. The input conversion method according to claim 8, wherein the preset conversion rule further comprises:
when the first input parameters include the "CTRL" key press and the scroll wheel scrolling parameters, mapping the scroll wheel coefficient to the zoom scale, mapping the scrolling direction to the pinch direction, and mapping a third preset coordinate to the two-finger coordinates.
10. The input conversion method according to claim 9, wherein the first application includes a third control; and
the third preset coordinate is the coordinates of two points on a straight line that passes through the cursor position and is parallel to the diagonal of the third control, the two points being equidistant from the cursor position.
11. The input conversion method according to claim 1, wherein
the first type of input device is a crown, and the first input parameter corresponding to the crown comprises a crown rotation parameter, wherein the crown rotation parameter comprises a rotation angle; and/or,
the second type of input device is a touch screen, and the touch screen correspondingly has at least one of the following second input parameters:
a long-press parameter, wherein the long-press parameter comprises a long-press position;
zoom parameters, wherein the zoom parameters comprise two-finger coordinates, a pinch direction and a zoom scale;
sliding parameters, wherein the sliding parameters comprise a starting point coordinate, a sliding speed and a sliding distance; and/or,
the first type of input device is a mouse, and the mouse correspondingly has at least one of the following first input parameters:
a right-click parameter, wherein the right-click parameter comprises a cursor position;
scroll wheel scrolling parameters, wherein the scroll wheel scrolling parameters comprise a cursor position, a scroll wheel coefficient and a scrolling direction;
a left-button long-press parameter, wherein the left-button long-press parameter comprises a cursor position;
and mouse movement parameters, wherein the mouse movement parameters comprise a cursor position, a movement speed and a movement distance.
12. The input conversion method according to claim 11, wherein the preset conversion rule further comprises:
when the first input parameter is the crown rotation parameter, mapping the rotation angle to the sliding distance, mapping a fourth preset coordinate to the starting point coordinate, and mapping a preset speed to the sliding speed.
13. The input conversion method according to claim 12, wherein the first application includes a fourth control; and
the fourth preset coordinate is the coordinate of the center of the fourth control.
14. A readable medium having stored therein instructions which, when executed by an electronic device, cause the electronic device to implement the input conversion method of any one of claims 1 to 13.
15. An electronic device, characterized in that the electronic device comprises:
a memory to store instructions for execution by at least one processor of the electronic device; and
at least one processor configured to execute the instructions to cause the electronic device to implement the input conversion method of any one of claims 1 to 13.
CN202210409578.7A 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium Active CN114764270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210409578.7A CN114764270B (en) 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110864093.2A CN115686302A (en) 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium
CN202210409578.7A CN114764270B (en) 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110864093.2A Division CN115686302A (en) 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium

Publications (2)

Publication Number Publication Date
CN114764270A CN114764270A (en) 2022-07-19
CN114764270B (en) 2023-03-24

Family

ID=82384835

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110864093.2A Pending CN115686302A (en) 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium
CN202210409578.7A Active CN114764270B (en) 2021-07-29 2021-07-29 Input conversion method, electronic device and readable medium


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109885245A (en) * 2019-02-21 2019-06-14 Oppo广东移动通信有限公司 Application control method, apparatus, terminal device and computer-readable storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2105823A4 (en) * 2006-12-19 2012-12-26 Bo Qiu Human computer interaction device, electronic device and human computer interaction method
CN107357560A (en) * 2017-04-28 2017-11-17 阿里巴巴集团控股有限公司 Interaction processing method and device
CN107297073B (en) * 2017-06-15 2022-11-01 广州华多网络科技有限公司 Method and device for simulating peripheral input signal and electronic equipment
CN110825242B (en) * 2019-10-18 2024-02-13 亮风台(上海)信息科技有限公司 Method and device for inputting
CN111840990B (en) * 2020-07-21 2022-08-19 联想(北京)有限公司 Input control method and device and electronic equipment

Also Published As

Publication number Publication date
CN115686302A (en) 2023-02-03
CN114764270A (en) 2022-07-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant