WO2023071590A1 - Input control method and electronic device


Info

Publication number
WO2023071590A1
WO2023071590A1 (PCT/CN2022/119062; CN2022119062W)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
display area
input
display
content
Prior art date
Application number
PCT/CN2022/119062
Other languages
English (en)
Chinese (zh)
Inventor
卢跃东
周学而
魏凡翔
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023071590A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros

Definitions

  • the embodiments of the present application relate to the technical field of terminals, and in particular, to an input control method and an electronic device.
  • Multi-screen collaboration can be realized between electronic devices. Controlling the display screens of two electronic devices through one electronic device not only expands the display space, but also allows the content on one electronic device to be quickly transferred to and displayed on another electronic device.
  • Multiple electronic devices share one set of keyboard and mouse, and the display position of the keyboard input switches as the mouse pointer moves among the display screens.
  • A multi-screen collaborative connection is established between the computer 101 and the tablet 102, and the computer 101 and the tablet 102 share one set of keyboard and mouse. If the mouse pointer 11 is displayed on the display screen of the computer 101, the content entered by the user through the keyboard is displayed on the display screen of the computer 101. If the mouse pointer 11 then moves toward the tablet 102 in response to a user operation and moves past the edge of the display screen of the computer 101, the computer 101 hides the mouse pointer 11; as shown in (b) in FIG. 1, the mouse pointer 12 is displayed on the display screen of the tablet 102, and the content entered by the user through the keyboard is then displayed on the display screen of the tablet 102.
  • the embodiments of the present application provide an input control method and an electronic device.
  • the first electronic device will intercept and forward the input to the second electronic device only after determining the operation of switching the input focus position, so as to improve user experience.
  • an input control method is provided, which is applied to a first electronic device.
  • The method includes: determining a first operation acting on the first display area, where the first operation is used to instruct switching the input focus position to the first display area of the second electronic device; and, after determining the first operation, intercepting a first input and sending the first input to the second electronic device, so as to display the first input on the first display area of the second electronic device.
  • the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
  • For example, the first display area may be all or part of a display screen.
  • the first display area may refer to a display area for displaying a certain application, function, component, etc., or any component area that can support input.
  • The first electronic device implements the interception of the first input through a keyboard HOOK. After determining the first operation, the first electronic device mounts the keyboard HOOK, and intercepts and forwards the first input through the keyboard HOOK. Before the first operation is determined, however, the first electronic device does not mount the keyboard HOOK (or unloads the keyboard HOOK), and instead sends the obtained input to the local application to display the input in the local display area.
  • After the first electronic device detects the operation of moving the mouse pointer to another display area, it does not switch the input focus position. Only after determining that the user intends to switch the input focus does the first electronic device mount the keyboard HOOK, intercept subsequent keyboard input events, and send them to the second electronic device, thereby realizing the switching of the input focus, ensuring that the input display meets the user's needs, and improving the user experience.
  • Intercepting the first input and sending the first input to the second electronic device includes: after detecting the mouse click on the first display area, or after determining the touch operation on the first display area, intercepting the first input of the keyboard and sending the first input to the second electronic device.
  • the first electronic device determines the user's intention according to the click position of the mouse, or determines the user's intention according to the position of the touch operation. Only after determining the user's intention, the first electronic device will switch the input focus position to the position where the first operation acts.
  • In this way, the display position of subsequent input meets the needs of the user, and an abnormal input position caused by accidentally moving the mouse does not occur, thereby improving the user experience.
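The interception behavior described above can be sketched as a small state machine. The following Python snippet is an illustrative simulation only (the class and callback names are hypothetical; an actual implementation would rely on a platform-level keyboard HOOK rather than in-process dispatch):

```python
class InputController:
    """Simulates keyboard routing on the first electronic device.

    While the keyboard HOOK is mounted, keyboard input is intercepted
    and forwarded to the second electronic device; otherwise it is
    delivered to the local application for display locally.
    """

    def __init__(self, send_to_second_device, deliver_locally):
        self._send = send_to_second_device
        self._local = deliver_locally
        self._hook_mounted = False

    def on_click(self, display_area):
        # The first operation: clicking the first display area of the
        # second electronic device mounts the HOOK; clicking the second
        # display area (the first device's own area) unloads it.
        self._hook_mounted = (display_area == "first")

    def on_key(self, key):
        if self._hook_mounted:
            self._send(key)    # intercept and forward the input
        else:
            self._local(key)   # display the input in the local area


sent, local = [], []
controller = InputController(sent.append, local.append)
controller.on_key("a")        # before the first operation: stays local
controller.on_click("first")  # first operation: focus moves to device 2
controller.on_key("b")        # intercepted and forwarded
```

Note that merely moving the pointer never triggers `on_click` in this sketch, so the focus does not switch on pointer movement alone, matching the behavior claimed above.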
  • determining the first operation acting on the first display area includes: detecting the first operation acting on the first display area. Or, receiving first information sent by the second electronic device, where the first information is information sent after the second electronic device detects the first operation acting on the first display area.
  • the first electronic device acts as the master device
  • the multi-screen cooperation modes applied by the first electronic device and the second electronic device include: a shared cooperation mode, a windowed cooperation mode, or a full-screen extended-screen cooperation mode.
  • the display screen of the first electronic device and the display screen of the second electronic device are displayed separately, the first display area is the display area displayed on the display screen of the second electronic device, and the second display area is the display area of the first electronic device.
  • the first electronic device can detect a mouse click operation on the display screen of the second electronic device
  • the second display area is the projection area of the first electronic device on the display screen of the second electronic device
  • the first display area is the display area outside the projection area on the display screen of the second electronic device.
  • The first electronic device needs to determine the first operation before switching the input focus position; even if the first operation does not act on the first electronic device, the first operation can be determined through the connection with the second electronic device. Only after determining the first operation does the first electronic device switch the input focus position and intercept and forward the input to the second electronic device, so as to meet the user's input requirements.
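Because the first operation can either be detected locally or reported by the second electronic device as "first information", the master device needs one entry point for both paths. A minimal sketch (the event field names are assumptions for illustration, not part of the patent):

```python
def is_first_operation(event):
    """Return True if the event instructs switching the input focus to
    the first display area of the second electronic device."""
    if event.get("source") == "local":
        # A mouse click or touch detected by the first device itself.
        return event.get("area") == "first_display_area"
    if event.get("source") == "second_device":
        # First information received over the collaboration connection
        # after the second device detected the click/touch.
        return event.get("type") == "first_information"
    return False
```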
  • the method further includes: after detecting the second operation acting on the second display area, determining to scroll and display the display content displayed in the second display area;
  • the second display area is a display area corresponding to the first electronic device.
  • the second operation is a mouse wheel scrolling operation performed while the mouse pointer is displayed in the second display area.
  • Determining to scroll the display content displayed in the second display area includes: when the second display area is a display area of the first electronic device, scrolling the display content of the second display area; or, when the second display area is a display area displayed by the second electronic device for displaying the screen projection content of the first electronic device, sending the scrolled display content of the second display area to the second electronic device.
  • the multi-screen coordination mode applied by the first electronic device and the second electronic device is a shared coordination mode.
  • the first electronic device detects the operation of scrolling the mouse wheel on the second display area of the first electronic device, and scrolls and displays the display content in the second display area.
  • the multi-screen cooperation mode applied by the first electronic device and the second electronic device is a windowed cooperation mode or a full-screen extended screen cooperation mode.
  • After the first electronic device detects the second operation acting on the second display area, it sends the first screen projection content to the second electronic device, where the first screen projection content is the scrolled screen projection content.
  • the display position of the mouse pointer is decoupled from the position of the input focus, and the input focus is uniformly managed by the first electronic device, which satisfies the user's need to input content in another display area while scrolling through one display area.
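Decoupling the pointer position from the input focus means wheel events are routed by the display area under the pointer, independently of where keyboard input currently goes. A hedged sketch of that routing (function and message names are illustrative assumptions):

```python
def route_scroll(pointer_area, is_projection, scroll_locally, send_projection):
    """Route a mouse-wheel event aimed at the second display area.

    If the second display area is on the first device's own screen, its
    content is scrolled locally; if it is a projection area shown by the
    second device, the scrolled frame is sent as new screen projection
    content (the 'first screen projection content').
    """
    if pointer_area != "second":
        return "ignored"   # the wheel event targets some other area
    if is_projection:
        send_projection("scrolled projection frame")
        return "sent"
    scroll_locally()
    return "scrolled"
```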
  • The method further includes: determining a third operation acting on the second display area, where the third operation is used to instruct switching the input focus position to the second display area corresponding to the first electronic device; and, according to the second input, determining to display the second input in the second display area.
  • After the third operation is determined, the first electronic device unloads the keyboard HOOK and no longer intercepts input; instead, after receiving the second input, it determines to display the second input in the second display area.
  • the third operation is a mouse click operation acting on the second display area, or the third operation is a touch operation acting on the second display area.
  • Only after determining the mouse click operation or touch operation acting on a display area does the first electronic device decide, according to whether that display area is its corresponding display area, whether to mount or unload the keyboard HOOK. In this way, it is ensured that the input display meets the needs of the user, and the user experience is improved.
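The mount/unload decision after each click or touch reduces to a pure function of the clicked area. The snippet below only illustrates the rule stated above, with assumed state labels:

```python
def update_keyboard_hook(click_area, hook_state):
    """Return the keyboard HOOK state after a mouse click or touch.

    Clicking the first display area (the second device's area) mounts
    the HOOK; clicking the second display area (the first device's own
    area) unloads it; any other click leaves the state unchanged.
    """
    if click_area == "first":
        return "mounted"
    if click_area == "second":
        return "unmounted"
    return hook_state
```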
  • an input control method is provided, which is applied to a second electronic device.
  • The method includes: after detecting a first operation acting on the first display area, sending first information to the first electronic device, where the first operation is used to instruct switching the input focus position to the first display area of the second electronic device; receiving a first input sent by the first electronic device, where the first input is an input intercepted by the first electronic device after receiving the first information; and displaying the first input in the first display area.
  • the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
  • the second electronic device further displays a second display area
  • the second display area is used to display the screen projection content sent by the first electronic device
  • The method further includes: displaying, in the second display area, the first screen projection content sent by the first electronic device.
  • The first screen projection content is scrolled screen projection content, sent by the first electronic device after detecting the second operation acting on the second display area.
  • the second operation is a mouse wheel scrolling operation performed while the mouse pointer is displayed in the second display area.
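On the second electronic device, the method reduces to a two-message protocol: report the first operation, then render whatever input the first electronic device forwards. The Python sketch below is a simulation with hypothetical names, not the patent's implementation:

```python
class SecondDevice:
    """Simulates the second electronic device's side of the protocol."""

    def __init__(self, send_to_first_device):
        self._send = send_to_first_device
        self.first_display_area = []  # content rendered in the area

    def on_first_operation(self):
        # Report the click/touch so the first electronic device starts
        # intercepting keyboard input (the 'first information').
        self._send({"type": "first_information"})

    def on_first_input(self, key):
        # Input intercepted and forwarded by the first electronic
        # device; display it in the first display area.
        self.first_display_area.append(key)
```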
  • an electronic device is provided.
  • The electronic device is the first electronic device, and includes a processor and a memory coupled to the processor. The memory is used to store computer program code, the computer program code includes computer instructions, and when the processor reads the computer instructions from the memory, the electronic device performs the following operations: determining a first operation acting on the first display area, where the first operation is used to instruct switching the input focus position to the first display area of the second electronic device; and, after determining the first operation, intercepting the first input and sending the first input to the second electronic device, so as to display the first input on the first display area of the second electronic device.
  • the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
  • Intercepting the first input and sending the first input to the second electronic device includes: after detecting the mouse click on the first display area, or after determining the touch operation on the first display area, intercepting the first input of the keyboard and sending the first input to the second electronic device.
  • determining the first operation acting on the first display area includes: detecting the first operation acting on the first display area.
  • Receiving first information sent by the second electronic device, and determining the first operation according to the first information; the first information is information sent after the second electronic device detects the first operation acting on the first display area.
  • When the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operations: after detecting the second operation acting on the second display area, determining to scroll the display content displayed in the second display area; the second display area is a display area corresponding to the first electronic device.
  • the second operation is a mouse wheel scrolling operation performed while the mouse pointer is displayed in the second display area.
  • Determining to scroll the display content displayed in the second display area includes: when the second display area is a display area of the first electronic device, scrolling the display content of the second display area; or, when the second display area is a display area displayed by the second electronic device for displaying the screen projection content of the first electronic device, sending the scrolled display content of the second display area to the second electronic device.
  • When the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operations: determining a third operation acting on the second display area, where the third operation is used to instruct switching the input focus position to the second display area corresponding to the first electronic device; and, according to the second input, determining to display the second input in the second display area.
  • an electronic device is provided.
  • The electronic device is the second electronic device, and includes a processor, a memory, and a display screen, where the memory and the display screen are coupled to the processor. The memory is used to store computer program code, the computer program code includes computer instructions, and when the processor reads the computer instructions from the memory, the electronic device performs the following operations: after detecting the first operation acting on the first display area, sending first information to the first electronic device, where the first operation is used to instruct switching the input focus position to the first display area of the second electronic device.
  • a first input sent by the first electronic device is received; the first input is an input intercepted by the first electronic device after receiving the first information.
  • a first input is displayed in the first display area.
  • the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
  • The second electronic device further displays a second display area, and the second display area is used to display the screen projection content sent by the first electronic device. When the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operations: displaying, in the second display area, the first screen projection content sent by the first electronic device, where the first screen projection content is scrolled screen projection content sent by the first electronic device after detecting the second operation acting on the second display area.
  • the second operation is a mouse wheel scrolling operation performed while the mouse pointer is displayed in the second display area.
  • An embodiment of the present application provides an electronic device having the function of implementing the input control method described in the above first aspect and any possible implementation manner thereof; or, the electronic device has the function of implementing the input control method described in the above second aspect and any possible implementation manner thereof.
  • This function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the embodiments of the present application provide a computer-readable storage medium.
  • The computer-readable storage medium stores a computer program (also referred to as instructions or code), and when the computer program is executed by the electronic device, the electronic device executes the method of the first aspect or any one of the implementation manners of the first aspect; or, the electronic device executes the method of the second aspect or any one of the implementation manners of the second aspect.
  • the embodiments of the present application provide a computer program product.
  • When the computer program product runs on the electronic device, the electronic device executes the method of the first aspect or any one of the implementation manners of the first aspect; or, the electronic device executes the method of the second aspect or any one of the implementation manners of the second aspect.
  • the embodiment of the present application provides a circuit system, the circuit system includes a processing circuit, and the processing circuit is configured to execute the method in the first aspect or any one of the implementation manners in the first aspect; or, the processing circuit is configured to execute The second aspect or the method in any one of the implementation manners in the second aspect.
  • An embodiment of the present application provides a chip system, including at least one processor and at least one interface circuit. The at least one interface circuit is used to perform sending and receiving functions and to send instructions to the at least one processor. When the at least one processor executes the instructions, it executes the method of the first aspect or any one of the implementation manners of the first aspect; or, it executes the method of the second aspect or any one of the implementation manners of the second aspect.
  • FIG. 1 is a first schematic interface diagram provided by an embodiment of the present application;
  • FIG. 2A is a schematic diagram of a communication system to which the input control method provided by an embodiment of the present application is applied;
  • FIG. 2B is a schematic diagram of a scene to which the input control method provided by an embodiment of the present application is applied;
  • FIG. 3 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application;
  • FIG. 4A is a software structure block diagram of a first electronic device provided by an embodiment of the present application;
  • FIG. 4B is a software structure block diagram of a second electronic device provided by an embodiment of the present application;
  • FIG. 5 is a second schematic interface diagram provided by an embodiment of the present application;
  • FIG. 6 is a third schematic interface diagram provided by an embodiment of the present application;
  • FIG. 7 is a fourth schematic interface diagram provided by an embodiment of the present application;
  • FIG. 8 is a fifth schematic interface diagram provided by an embodiment of the present application;
  • FIG. 9 is a sixth schematic interface diagram provided by an embodiment of the present application;
  • FIG. 10 is a seventh schematic interface diagram provided by an embodiment of the present application;
  • FIG. 11 is an eighth schematic interface diagram provided by an embodiment of the present application;
  • FIG. 12 is a flow chart of the input control method provided by an embodiment of the present application;
  • FIG. 13 is a schematic structural diagram of a first electronic device provided by an embodiment of the present application;
  • FIG. 14 is a schematic structural diagram of a second electronic device provided by an embodiment of the present application.
  • References to "one embodiment" or "some embodiments" and the like in this specification mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • Appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments," unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • The term "connected" includes both direct and indirect connections, unless otherwise stated. "First" and "second" are used for descriptive purposes only, and should not be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features.
  • words such as “exemplarily” or “for example” are used as examples, illustrations or descriptions. Any embodiment or design scheme described as “exemplary” or “for example” in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as “exemplarily” or “for example” is intended to present related concepts in a concrete manner.
  • the "user” is not limited to the owner of the electronic device or a specific user, but may also be an animal, a hard object or other objects.
  • the operation of the electronic device by the user may also be the operation of the electronic device by animals, hard objects or other objects.
  • the use of words such as “user” or “user's operation” is intended as an exemplary description, and the specific person who performs the operation is not limited in this embodiment of the present application.
  • FIG. 2A is a schematic diagram of a communication system to which an input control method provided in an embodiment of the present application is applied. As shown in FIG. 2A , the communication system includes a first electronic device 100 and a second electronic device 200 .
  • the first electronic device 100 may establish a wireless communication connection with the second electronic device 200 through a wireless communication technology.
  • The wireless communication technology includes but is not limited to at least one of the following: near field communication (NFC), Bluetooth (BT) (for example, traditional Bluetooth or Bluetooth low energy (BLE)), wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Zigbee, frequency modulation (FM), infrared (IR), and so on.
  • both the first electronic device 100 and the second electronic device 200 support the proximity discovery function.
  • both the first electronic device 100 and the second electronic device 200 can realize the proximity discovery function through NFC sensing.
  • After the first electronic device 100 and the second electronic device 200 discover each other, they can establish a wireless communication connection such as a Wi-Fi peer-to-peer (P2P) connection or a Bluetooth connection.
  • the first electronic device 100 and the second electronic device 200 realize multi-screen collaboration.
  • the first electronic device 100 and the second electronic device 200 establish a wireless communication connection through a local area network.
  • both the first electronic device 100 and the second electronic device 200 are connected to the same router.
  • the first electronic device 100 and the second electronic device 200 establish a wireless communication connection through a cellular network, the Internet, or the like.
  • the second electronic device 200 accesses the Internet through a router, and the first electronic device 100 accesses the Internet through a cellular network; furthermore, the first electronic device 100 establishes a wireless communication connection with the second electronic device 200 .
  • The first electronic device 100 or the second electronic device 200 may be, for example, a personal computer (PC), a tablet computer (Pad), a mobile phone, a notebook computer, a desktop computer, a wearable device, a vehicle-mounted device, an artificial intelligence (AI) device, or another terminal device.
  • The operating system installed on the first electronic device 100 or the second electronic device 200 is not limited to any particular operating system.
  • the device types of the first electronic device 100 and the second electronic device 200 are the same or different, and the operating systems installed on the first electronic device 100 and the second electronic device 200 are the same or different.
  • the first electronic device 100 or the second electronic device 200 may be a fixed device or a portable device.
  • the present application does not limit the specific type of the first electronic device 100 or the second electronic device 200 and the installed operating system.
  • the first electronic device 100 or the second electronic device 200 is configured with its own keyboard and mouse.
  • the first electronic device 100 is a PC
  • the PC is configured with a keyboard, a mouse and a touchpad.
  • the PC can also be configured with an external keyboard and/or mouse.
  • When the first electronic device 100 or the second electronic device 200 is equipped with a touch screen, it may be configured with a mouse or a keyboard, or with neither, for example by directly displaying a virtual keyboard and receiving touch operations through the touch screen.
  • the second electronic device 200 is a PAD
  • the PAD is equipped with a touch screen.
  • the PAD can be configured with a stylus 21 .
  • the PAD can also be connected with an external keyboard and mouse.
  • the connection between the first electronic device 100 and the second electronic device 200 can be applied to a multi-screen collaboration scenario as shown in FIG. 2B .
  • The display screen of the first electronic device 100 and the display screen of the second electronic device 200 can be displayed separately.
• the keyboard and mouse of the first electronic device 100 can control the second electronic device 200, which is equivalent to connecting an external display screen to the first electronic device 100 and expands the display space of the first electronic device.
  • the first electronic device 100 sends the received input content of the keyboard to the display screen of the second electronic device 200 for display.
  • the current multi-screen collaboration scenario may be a shared collaboration mode of the first electronic device 100 and the second electronic device 200 .
• the second electronic device 200 displays a display area 22, and the display area 22 is used to display the content sent by the first electronic device 100.
  • the display content of the display area 22 on the first electronic device 100 and the second electronic device 200 may be the same or different.
  • the first electronic device 100 changes the displayed content in response to the user operation, without affecting the operation of the second electronic device 200 on the displayed content in the display area 22 .
• the first electronic device 100 projects application A to the display area 22 of the second electronic device 200 for display, and the user's operation on application B in the first electronic device 100 will not affect application A displayed in the display area 22 of the second electronic device 200.
  • the current multi-screen collaboration scenario may be a windowed collaboration mode of the first electronic device 100 and the second electronic device 200 .
• the second electronic device 200 displays a display area 23, which is used to display the content of the second electronic device 200 itself.
  • the current multi-screen collaboration scenario may be a full-screen extended-screen collaboration mode of the first electronic device 100 and the second electronic device 200 .
• besides the three collaboration modes shown above, the first electronic device 100 and the second electronic device 200 may also establish multi-screen collaborative connections in more collaboration modes, and different multi-screen collaboration scenarios may be divided by other collaboration-mode division methods, which are not specifically limited in this embodiment of the present application.
  • the first electronic device 100 and the second electronic device 200 may determine the master device according to user operations during the process of establishing a connection.
• the first electronic device 100 negotiates with the second electronic device 200 to determine the master device.
• in the following, the input control method is described by taking as an example the case where the first electronic device 100 serves as the master device and is equipped with a keyboard and a mouse capable of controlling the second electronic device 200, applied to the multi-screen collaboration scenario shown in FIG. 2B. It can be understood that the second electronic device 200 can also serve as the master device.
  • the number of the second electronic device 200 is one or more.
• to avoid conflicts with the keyboard input of the first electronic device 100, the second electronic device 200 no longer displays a virtual keyboard, and/or the keyboard of the second electronic device 200 no longer works. This is not repeated below.
  • FIG. 3 shows a schematic structural diagram of the first electronic device 100 or the second electronic device 200 .
  • the first electronic device 100 or the second electronic device 200 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, Battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, sensor module 180, button 190, motor 191, keyboard and mouse module 192, camera 193, display screen 194, and user identification module ( subscriber identification module, SIM) card interface 195 etc.
• the first electronic device 100 or the second electronic device 200 may include more or fewer components than shown in the illustration, combine certain components, split certain components, or use a different component layout.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the first electronic device 100 may not include the mobile communication module 150 and the SIM card interface 195 .
• the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
• the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves system efficiency.
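The caching behaviour described above can be illustrated with a small, hypothetical sketch: Python's `functools.lru_cache` plays the role of the processor cache, and `fetch` (a name invented here) stands in for a slow access to main memory. Repeated requests for the same data are served from the cache without touching "memory" again.

```python
from functools import lru_cache

calls = 0  # counts how often "main memory" is actually accessed


@lru_cache(maxsize=32)
def fetch(addr):
    """Stand-in for a slow fetch from main memory; cached results skip it."""
    global calls
    calls += 1
    return addr * 2  # placeholder for the fetched data


fetch(7)
fetch(7)
fetch(7)  # the second and third calls are served from the cache
```

After these three calls, `calls` is still 1: only the first request reached "memory", mirroring how the cache reduces the waiting time of the processor 110.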
  • processor 110 may include one or more interfaces.
• the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
• the I2C interface is a bidirectional synchronous serial bus, including a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL).
  • processor 110 may include multiple sets of I2C buses.
• the processor 110 can be respectively coupled to a touch sensor, a charger, a flash, a camera 193, and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor through the I2C interface, so that the processor 110 communicates with the touch sensor through the I2C bus interface to realize the touch function of the first electronic device 100 or the second electronic device 200 .
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 and the camera 193 communicate through the CSI interface to realize the shooting function of the first electronic device 100 or the second electronic device 200.
  • the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the first electronic device 100 or the second electronic device 200 .
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the first electronic device 100 or the second electronic device 200 , and can also be used to transmit data between the first electronic device 100 or the second electronic device 200 and peripheral devices. It can also be used to connect headphones and play audio through them.
  • This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation on the first electronic device 100 or the second electronic device 200 .
• in other embodiments, the first electronic device 100 or the second electronic device 200 may also adopt an interface connection method different from that in the above embodiment, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the first electronic device 100 or the second electronic device 200 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the first electronic device 100 or the second electronic device 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the first electronic device 100 or the second electronic device 200 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the first electronic device 100 or the second electronic device 200 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment, or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device. In some other embodiments, the modem processor may be independent of the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
• the wireless communication module 160 can provide wireless communication solutions applied to the first electronic device 100 or the second electronic device 200, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the first electronic device 100 or the second electronic device 200 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the first electronic device 100 or the second electronic device 200 can pass through Wireless communication technologies communicate with networks and other devices.
• the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
• the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the first electronic device 100 or the second electronic device 200 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
• the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the first electronic device 100 or the second electronic device 200 may include 1 or N display screens 194 , where N is a positive integer
  • the keyboard and mouse module 192 may include a mouse, a keyboard, a touch pad for realizing keyboard and mouse functions, and the like.
• after the first electronic device 100 receives the content input by the user through the keyboard and mouse module 192, it can display the content on the display screen of the first electronic device 100, or on the display screen of the second electronic device 200 that has established a communication connection with the first electronic device 100.
  • the keyboard and mouse module 192 is an optional module.
• for example, if the second electronic device 200 is a PAD, the keyboard and mouse module 192 may not be configured, and the user's input operations may be received directly through the display screen 194.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
• the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the first electronic device 100 or the second electronic device 200 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the first electronic device 100 or the second electronic device 200.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during use of the first electronic device 100 or the second electronic device 200 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the first electronic device 100 or the second electronic device 200 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • the first electronic device 100 or the second electronic device 200 can use the audio module 170, for example, to play music, record and so on.
  • the audio module 170 may include a speaker, a receiver, a microphone, an earphone interface, and an application processor to implement audio functions.
  • the sensor module 180 may include a pressure sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the pressure sensor is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • a pressure sensor may be located on the display screen 194 .
• pressure sensors come in many types, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates with conductive material.
  • the first electronic device 100 or the second electronic device 200 may also calculate the touched position according to the detection signal of the pressure sensor.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
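The threshold behaviour described above can be sketched as follows; the threshold value and function name are hypothetical, chosen only to illustrate how one touch position can map to two different operation instructions depending on touch intensity.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed, normalized touch intensity


def handle_message_icon_touch(intensity):
    """Dispatch a touch on the short-message icon by its pressure intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view short messages"
    # intensity greater than or equal to the first pressure threshold
    return "create new short message"
```

A light press (intensity 0.2) maps to viewing short messages, while a firm press (intensity 0.9) on the same icon maps to creating a new one.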
• Touch sensor, also known as a "touch device".
  • the touch sensor can be arranged on the display screen 194, and the touch sensor and the display screen 194 form a touch screen, also called “touch screen”.
  • the touch sensor is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor may also be disposed on the surface of the first electronic device 100 or the second electronic device 200 , which is different from the position of the display screen 194 .
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the first electronic device 100 or the second electronic device 200 may receive a key input and generate a key signal input related to user settings and function control of the first electronic device 100 or the second electronic device 200 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
• different application scenarios (for example: time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the first electronic device 100 or the second electronic device 200 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the first electronic device 100 or the second electronic device 200 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the same or different operating systems may be installed in the first electronic device 100 and the second electronic device 200 .
• taking the Windows system as an example, the software structure of the electronic device is described.
  • FIG. 4A is a software structural block diagram of the first electronic device 100 according to the embodiment of the present application.
  • a first application is installed in the first electronic device 100, and the first application can be used to manage connections with other electronic devices.
• the first application is a computer manager application, and the first electronic device 100 can establish a multi-screen collaborative connection with the second electronic device 200 through the computer manager application.
  • the first application includes a transmission management module, a keyboard and mouse management module, and an event interception module.
  • the transmission management module is used to manage the transmission based on multi-screen cooperative connection, for example, the first electronic device 100 sends the content input by the user through the keyboard of the first electronic device 100 to the second electronic device 200 through the transmission management module.
  • the keyboard and mouse management module is used to manage the keyboard and mouse of the first electronic device 100 and manage the input focus.
• if it is determined that the input focus is on the display area of the first electronic device 100, the user's input content can be received and sent to the display area of the first electronic device 100 for display; if it is determined that the input focus is on the display area of the second electronic device 200, the event interception module can be started, and after it is determined that the input focus has switched back to the display area of the first electronic device 100, the event interception module can be closed.
• the event interception module is used to intercept input events in the operating system (such as the Windows system).
• for a keyboard input event that the event interception module intercepts in the Windows system, the input content corresponding to the input event is sent to the second electronic device 200 for display through the transmission management module, so that the keyboard of the first electronic device 100 can receive user input and the display area of the second electronic device 200 can display the user input.
• the function of the event interception module can be realized through a hook (HOOK) function. If the keyboard and mouse management module determines to start the event interception module, it can instruct mounting of the keyboard HOOK, and the event interception module can then intercept keyboard input events; if it determines to close the event interception module, it can instruct unmounting of the keyboard HOOK, and the event interception module then no longer intercepts keyboard input events.
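As a rough, platform-neutral sketch (the patent itself relies on a Windows keyboard HOOK; the class and method names below are invented for illustration), the interplay of input focus, hook mounting, and event routing might look like this:

```python
class KeyboardMouseManager:
    """Toy model of the keyboard and mouse management module: while the input
    focus is on the second device's display area, the 'hook' is mounted and
    key events are forwarded; otherwise they go to the first device's display."""

    def __init__(self):
        self.hook_mounted = False
        self.local_display = []   # content shown on the first electronic device
        self.remote_display = []  # content forwarded to the second electronic device

    def set_focus(self, area):
        # Mount the keyboard HOOK when focus moves to the remote display area,
        # unmount it when focus switches back to the local display area.
        self.hook_mounted = (area == "remote")

    def on_key(self, ch):
        target = self.remote_display if self.hook_mounted else self.local_display
        target.append(ch)


mgr = KeyboardMouseManager()
mgr.on_key("a")           # focus starts on the local display area
mgr.set_focus("remote")   # hook mounted: events are intercepted
mgr.on_key("b")
mgr.set_focus("local")    # hook unmounted
mgr.on_key("c")
```

After this sequence, "a" and "c" remain on the first device's display while "b" has been forwarded to the second device, matching the interception behaviour described above.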
• the module division method in the first application shown in FIG. 4A is only an exemplary description; other module division methods may also implement the functions of the above-mentioned transmission management module, keyboard and mouse management module, and event interception module, and the first application may contain more or fewer modules. The module division method and the number of divided modules are not limited in this embodiment of the present application.
  • FIG. 4B is a block diagram of the software structure of the second electronic device 200 according to the embodiment of the present application.
  • the software system of the second electronic device 200 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the software structure of the second electronic device 200 is exemplarily described by taking an Android system with a layered architecture as an example.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
• the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application program package may include application programs such as the first application, short message, camera, calendar, music, gallery, map, call, and video.
  • the first application can be used to manage connections with other electronic devices, and the second electronic device 200 can establish a multi-screen collaborative connection with the first electronic device 100 through the first application.
  • the second electronic device 200 is a mobile phone
  • the first application is a mobile phone manager.
  • the module division method and the functions of each module in the first application of the second electronic device 200 can refer to the module division method and the functions of each module in the first application shown in FIG. 4A above, and will not be repeated here.
  • first application in the second electronic device 200 and the first application in the first electronic device 100 may be the same application or different applications.
  • the first electronic device 100 and the second electronic device 200 are electronic devices of the same type, and the first electronic device 100 and the second electronic device 200 may install the same first application for establishing a multi-screen collaborative connection.
• if the first electronic device 100 and the second electronic device 200 are electronic devices of different types with different operating systems installed, the first application in the first electronic device 100 and the first application in the second electronic device 200 may be different applications.
• after the first electronic device 100 and the second electronic device 200 establish a multi-screen coordinated connection, if it is determined that the keyboard or mouse of one of the electronic devices is in use, or that the keyboards or mice of the other electronic devices are unavailable, or that the other electronic devices are not equipped with a keyboard, mouse, or the like, the electronic device whose keyboard and mouse are determined to be in use can be determined as the master device. The keyboard and mouse management module in the master device is then activated, and the event interception module in the master device can be enabled or disabled according to user needs, while the event interception modules in the other electronic devices remain closed or dormant.
• for example, when the master device is the PC, the user can operate both the PC and the PAD through the keyboard and mouse of the PC.
  • the keyboard and mouse management module in the PC activates or deactivates the event interception module in the PC according to user requirements, and correspondingly mounts or uninstalls the keyboard HOOK in the PC.
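The master-device selection and hook management just described can be sketched in Python as follows. This is a minimal illustrative sketch; the function names, device records, and `KeyMouseManager` class are assumptions for illustration, not part of the disclosed implementation:

```python
# Illustrative sketch: choose the master device and activate its
# keyboard-and-mouse management module, as described above.

def choose_master(devices):
    """devices: list of dicts like {'name': str, 'has_kbd_mouse': bool, 'in_use': bool}."""
    # Prefer the device whose keyboard/mouse the user is actively using.
    for d in devices:
        if d["has_kbd_mouse"] and d["in_use"]:
            return d["name"]
    # Otherwise fall back to any device equipped with a keyboard/mouse.
    for d in devices:
        if d["has_kbd_mouse"]:
            return d["name"]
    return None

class KeyMouseManager:
    """Keyboard-and-mouse management module of the master device."""
    def __init__(self):
        self.interception_enabled = False  # event interception module state

    def set_interception(self, enabled):
        # Enabling corresponds to mounting the keyboard HOOK,
        # disabling corresponds to unmounting it.
        self.interception_enabled = enabled

devices = [
    {"name": "PAD", "has_kbd_mouse": False, "in_use": False},
    {"name": "PC", "has_kbd_mouse": True, "in_use": True},
]
master = choose_master(devices)  # the PC becomes the master device
```

In the PC-and-PAD example above, the PC would be selected as the master; only its event interception module is ever enabled, while the PAD's remains dormant.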
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide a communication function of the second electronic device 200 .
• for example, the phone manager manages the call status (including connected, hung up, and the like).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
• for example, the notification manager may prompt text information in the status bar, issue a prompt sound, vibrate the electronic device, flash the indicator light, and so on.
  • the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
• the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
• the following describes, by way of example, the input control method provided by the embodiments of the present application for a scenario in which the first electronic device 100 and the second electronic device 200 establish a multi-screen cooperative connection and the first electronic device 100 serves as the master device.
  • the input control method using the second electronic device 200 as the main device may refer to the input control method using the first electronic device 100 as the main device, which will not be repeated in this embodiment of the present application.
  • the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection in a shared collaborative mode as shown in (a) of FIG. 2B . That is to say, after the connection between the first electronic device 100 and the second electronic device 200 is established, the display screen of the first electronic device 100 and the display screen of the second electronic device 200 can be displayed separately. And, the keyboard and mouse of the first electronic device 100 can control the second electronic device 200 .
  • the click operation may be a click operation such as a mouse click or a double click, or may be a user's touch operation on a display screen or a click operation through a stylus.
  • the first electronic device 100 is configured with a keyboard and a mouse, and the user can use the functions of the mouse of the first electronic device 100 through an external mouse 51 and/or a touchpad 52 .
  • the user inputs display content on the display screen of the first electronic device 100 through the keyboard, and the mouse pointer 53 is displayed on the display screen of the first electronic device 100 .
  • the first electronic device 100 detects that the user moves the mouse pointer 53 out of the edge of the display screen to the direction where the second electronic device 200 is located, and can hide the mouse pointer displayed on the display screen of the first electronic device 100 , and instruct the second electronic device 200 to display the mouse pointer 53 .
  • the first electronic device 100 may detect the moving position of the mouse pointer in real time, and determine whether the mouse pointer moves out of the display area of the display screen. Alternatively, the first electronic device 100 determines whether the mouse pointer moves out of the display area according to the preconfigured range of the display area of the display screen. Moreover, the first electronic device 100 can determine that the mouse needs to be moved to the display screen of the second electronic device 200 for display according to the connection relationship and position relationship with the second electronic device 200 . It should be noted that the implementation of moving the position of the mouse pointer and switching the display on the display screen may refer to the prior art, which is not specifically limited in this embodiment of the present application.
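The pointer-edge detection described above can be sketched as follows. This is a hedged Python sketch; the display dimensions, edge names, and layout table are illustrative assumptions rather than the embodiment's actual implementation:

```python
# Illustrative sketch: when the pointer leaves the configured display
# rectangle, determine which neighbouring device (if any) it should
# move to, based on the pre-configured position relationship.

def pointer_exit_side(x, y, width, height):
    """Return which edge the pointer crossed, or None if still inside."""
    if x < 0:
        return "left"
    if x >= width:
        return "right"
    if y < 0:
        return "top"
    if y >= height:
        return "bottom"
    return None

# Pre-configured position relationship: which device sits beyond each edge.
LAYOUT = {"right": "second_device"}

def target_display(x, y, width=1920, height=1080):
    """Return the device the pointer should move to, or None to stay."""
    side = pointer_exit_side(x, y, width, height)
    return LAYOUT.get(side)
```

When `target_display` returns a device, the master would hide its own pointer and instruct that device to display the pointer, as described above.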
• the first electronic device 100 detects the user's mouse click operation on the display screen of the second electronic device 200 and determines that the user needs to input on the second electronic device 200. Therefore, the event interception module shown in FIG. 4A is started to switch the input focus to the second electronic device 200.
• after the keyboard and mouse module in the first electronic device 100 receives the mouse press event, it determines that the operating position is on the second electronic device 200, mounts the keyboard HOOK through the event interception module to intercept subsequent keyboard input events, and sends the keyboard input events to the second electronic device 200.
• alternatively, the second electronic device 200 detects a user's touch operation on the display screen (such as a touch operation by a finger or a stylus) and sends network signaling to the first electronic device 100, instructing the first electronic device 100 to start the event interception module and mount the keyboard HOOK; the first electronic device 100 then uses the keyboard HOOK to intercept keyboard input events and sends them to the second electronic device 200.
• the first electronic device 100 detects the user's input and forwards the input data to the second electronic device 200. As shown by reference numeral 56, the user's input content is displayed on the display screen of the second electronic device 200.
• after determining that the user needs to input on the second electronic device 200, the first electronic device 100 can mount the keyboard HOOK, intercept subsequent keyboard input events and send them to the second electronic device 200, so as to realize the switching of the input focus, ensure that the input display meets the user's needs, and improve the user experience.
  • the keyboard HOOK intercepts and forwards the keyboard input event to the second electronic device 200 .
  • the content entered by the user through the keyboard of the first electronic device 100 will be displayed on the second electronic device 200 .
• the mouse pointer 53 moves from being displayed on the display screen of the second electronic device 200, as shown in (a) of FIG. 6, to being displayed on the display screen of the first electronic device 100, as shown in (b) of FIG. 6.
  • the second electronic device 200 hides or unloads the mouse pointer, and the first electronic device 100 reloads the mouse pointer, but the input focus does not switch to the first electronic device 100 .
  • the content input by the user through the keyboard of the first electronic device 100 will still be intercepted by the keyboard HOOK, and forwarded to the second electronic device 200 for display by the transmission management module as shown in FIG. 4A.
• the first electronic device 100 detects the user's operation on its display screen (such as a mouse click or a touch operation) and determines that the user needs to input display content on the display screen of the first electronic device 100. Therefore, the event interception module shown in FIG. 4A is turned off, and the input focus is switched to the first electronic device 100.
• after the keyboard and mouse module in the first electronic device 100 receives the mouse press event, it determines that the operating position is on the first electronic device 100, and the keyboard HOOK is unloaded through the event interception module, so the keyboard HOOK will not intercept subsequent keyboard input events.
• the first electronic device 100 detects the user's input and, as shown by reference numeral 64, directly displays the content entered by the user on its display screen. Moreover, even if it is detected that the user moves the mouse pointer 53 to the second electronic device 200 during the input process, the user's input content will still be displayed on the display screen of the first electronic device 100.
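The behaviour of this shared coordination mode — clicks switch the input focus, while pointer movement alone never does — can be summarized as a small state machine. The class and device labels below are illustrative assumptions, not names from the embodiment:

```python
class FocusController:
    """Sketch of the master device's input-focus logic in shared mode."""

    def __init__(self):
        # True: the keyboard HOOK is mounted and keys are forwarded
        # to the second device; False: keys are displayed locally.
        self.hook_mounted = False

    def on_click(self, device):
        # A click (mouse press or touch) expresses the user's input
        # intention: clicking on the second device mounts the HOOK,
        # clicking on the first device unmounts it.
        self.hook_mounted = (device == "second")

    def on_pointer_moved(self, device):
        # Moving the mouse pointer only changes where the pointer is
        # drawn; the input focus is deliberately left unchanged.
        pass

    def route_key(self, key):
        # Return the device on which the typed content is displayed.
        return "second" if self.hook_mounted else "first"
```

Typing after a click on the second device is displayed there even if the pointer has since wandered back to the first device's screen, matching the behaviour described above.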
  • the electronic device can switch the input focus position according to the user's intention. And in the process of input display, the input focus position will not be switched according to the movement of the mouse pointer, which improves the user experience.
• however, an operation such as the user's operation on the mouse wheel can still switch the display screen on which the wheel acts as the mouse pointer moves. Therefore, while one display screen displays the user's input content, the display content of another display screen can be scrolled in response to user operations.
• the first electronic device 100 starts the event interception module shown in FIG. 4A and mounts the keyboard HOOK, and the intercepted input is forwarded to the display screen of the second electronic device 200 for display.
  • the first electronic device 100 detects the user's operation of scrolling the mouse 51 , and determines that the current mouse pointer 53 is displayed on the display screen of the first electronic device 100 . Therefore, the first electronic device 100 may determine that the user needs to scroll the display content on the display screen of the first electronic device 100 , and may control the display content of the display screen to scroll in the direction shown by the arrow 71 .
• during the scrolling display process shown in (b) of the figure, if the user performs keyboard input, the keyboard HOOK will still intercept the input event and forward it to the second electronic device 200; then, as indicated by reference numeral 72, the second electronic device 200 displays the content input by the user.
  • the first electronic device 100 uninstalls the keyboard HOOK and no longer intercepts input events.
  • the display content of the display screen of the second electronic device 200 can be scrolled and displayed during the process of displaying the user input on the first electronic device 100 .
  • the input focus is uniformly managed by the first electronic device 100 , which satisfies the user's need to input content on another display screen while scrolling through one display screen.
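The split between scroll routing and key routing described above can be sketched as follows. This is an illustrative Python sketch; the event and device labels are assumptions:

```python
def route_event(event_type, pointer_on, hook_mounted):
    """Decide which display an event acts on in shared mode.

    Scroll events act on the display currently under the mouse pointer;
    key events follow the input focus (the keyboard HOOK state).
    """
    if event_type == "scroll":
        return pointer_on
    if event_type == "key":
        return "second" if hook_mounted else "first"
    raise ValueError(f"unknown event type: {event_type}")
```

With the HOOK mounted and the pointer on the first device's screen, a wheel operation scrolls the first screen while typed keys still appear on the second screen, which is exactly the combination described above.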
  • the user does not feel the input split of multiple devices, and truly realizes that one system is connected to multiple display screens, and other display screens are used as extended screens.
  • the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection in a windowed collaborative mode as shown in (b) of FIG. 2B . That is to say, after the connection between the first electronic device 100 and the second electronic device 200 is established, a display area for displaying the content sent by the first electronic device 100 is set on the display screen of the second electronic device 200 . And, the keyboard and mouse of the first electronic device 100 can control the second electronic device 200 .
• after the first electronic device 100 detects that the mouse pointer moves out of a display screen or display area, it does not switch the position of the input focus, so as to ensure that the user can still perform input on the original display screen or in the original display area. Moreover, after an operation such as a mouse click or touch operation by the user is detected within the projection area, the position of the input focus is not switched; the input focus is moved to the second electronic device 200 only after an operation such as a mouse click or touch operation by the user is detected outside the projection area of the second electronic device 200.
• similarly, the first electronic device 100 is configured with a keyboard and a mouse, and the user can use the mouse functions of the first electronic device 100 through an external mouse and/or touchpad.
  • the second electronic device 200 includes a screen projection area 81 for displaying the screen projection content sent by the first electronic device 100 , and the display area other than the screen projection area 81 is for displaying the display content of the second electronic device 200 .
  • the content displayed on the display screen of the first electronic device 100 and the content displayed on the screen projection area 81 may be the same or different, which is not specifically limited in this embodiment of the present application.
  • the screen projection method and specific screen projection related content reference may be made to the prior art, and the embodiments of the present application will not repeat them here.
  • the movement of the mouse pointer 82 will not affect the display of the content being input.
• when the first electronic device 100 has unloaded the keyboard HOOK, the content input by the user is displayed on the display screen of the first electronic device 100 and/or in the projection area 81; moving the mouse pointer 82 to the display screen of the second electronic device 200, or to a display area other than the projection area 81, will not affect the input being displayed.
• when the first electronic device 100 has the keyboard HOOK mounted, the content input by the user is sent to the display area other than the projection area 81 of the second electronic device 200 for display; moving the mouse pointer 82 to the projection area 81 or to the first electronic device 100 will not affect the user input being displayed in the display area other than the projection area 81 of the second electronic device 200.
• after receiving the network signaling, the first electronic device 100 starts the event interception module shown in FIG. 4A and mounts the keyboard HOOK through the event interception module. After that, as shown in (c) of the figure, the first electronic device 100 detects the user's input, and the keyboard HOOK intercepts the user's input and forwards it to the second electronic device 200; as indicated by reference numeral 84, the second electronic device 200 displays the user's input in the display area outside the projection area 81 that is used for displaying the second electronic device 200's own content.
  • the user's click operation in the display area other than the screen projection area 81 may include, for example, one or more of the user's touch operation, stylus click operation, mouse click operation, and the like.
• if the click operation is an operation of the user on the mouse, the first electronic device 100 can directly detect the position of the user's click operation and determine that the user needs to input in the display area of the second electronic device 200; in this case, the second electronic device 200 may not need to send network signaling to the first electronic device 100, which saves signaling consumption and improves efficiency.
  • the second electronic device 200 still sends network signaling to the first electronic device 100 to ensure the accuracy of user intention determination.
  • the electronic device can switch the input focus after determining that the user has the intention to switch the input focus, so as to ensure that the input display meets the needs of the user and improve user experience.
• the second electronic device 200 detects the user's click operation (such as a touch operation) within the projection area 81 and determines that the user needs to input display content in the projection area; therefore, it sends network signaling to the first electronic device 100. After receiving the network signaling, the first electronic device 100 closes the event interception module shown in FIG. 4A and uninstalls the keyboard HOOK, so that the keyboard HOOK will not intercept subsequent keyboard input events. Then, as shown in (c) of the figure, the first electronic device 100 detects the user's input, processes the input to determine the projection content, and sends the projection content to the second electronic device 200, which displays it in the projection area 81.
• if, while the first electronic device 100 has the keyboard HOOK mounted (that is, while the second electronic device 200 displays the user input in the display area other than the projection area 81), the user's click operation on the display screen of the first electronic device 100 is detected, the first electronic device 100 will uninstall the keyboard HOOK. In this way, the input focus is switched to the first electronic device 100, and the user's input content is displayed on the display screen of the first electronic device 100 and in the projection area 81 of the second electronic device 200.
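The windowed-mode focus rule described above — only a click in the second device's own display area moves the input focus to the second device — can be sketched as follows. The click-target labels are illustrative assumptions:

```python
def hook_should_be_mounted(click_target):
    """click_target: 'first_screen', 'projection_area', or 'outside_projection'.

    In windowed collaboration mode, the keyboard HOOK is mounted only
    when the user clicks in the second device's own display area
    (outside the projection area). Clicks on the first device's screen
    or inside the projection area keep, or return, the input focus to
    the first device.
    """
    return click_target == "outside_projection"
```

A pointer merely moving between areas never calls this function; only click operations do, which is why pointer movement cannot change the focus in this mode.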
  • the user's operation on the mouse wheel, etc. can still switch the active display area along with the movement of the mouse pointer display position. In this way, while displaying content input by the user in one display area, display content in other display areas is scrolled in response to user operations.
  • the user can also scroll through displayed content in one display area and input content in another display area.
  • the first electronic device 100 is equipped with a keyboard HOOK.
• the first electronic device 100 detects the user's operation of moving the mouse pointer 82 to the display area outside the projection area 81 of the second electronic device 200, and detects the user's operation on the scroll wheel of the mouse 1002.
• the first electronic device 100 sends a scrolling instruction to the second electronic device 200, and the second electronic device 200 can, according to the scrolling instruction, control the display content in the display area other than the projection area 81 to scroll in the direction indicated by arrow 1003. Then, the second electronic device 200 can scroll the display content in the display area outside the projection area 81 while displaying the content input by the user in the projection area 81.
  • the first electronic device 100 uninstalls the keyboard HOOK.
• the first electronic device 100 detects the user's operation of moving the mouse pointer 82 into the projection area 81 of the second electronic device 200, and detects the user's operation on the scroll wheel of the mouse 1002.
• the first electronic device 100 sends the scrolled projection content to the second electronic device 200 for display, and the second electronic device 200 can control the display content of the projection area 81 to scroll in the direction indicated by arrow 1005.
  • the second electronic device 200 may scroll and display the displayed content in the screen projection area 81 during the process of displaying the content input by the user in the display area other than the screen projection area 81 .
  • the input focus is uniformly managed by the first electronic device 100 , which satisfies the user's need to input content in another display area while scrolling through one display area.
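The two scrolling paths described above differ in what the master device sends over the connection. A hedged sketch, where the payload names are illustrative assumptions:

```python
def scroll_payload(pointer_in_projection_area):
    """What the first (master) device sends when the wheel is scrolled
    in windowed mode.

    Pointer inside the projection area: the master scrolls its own
    content and sends the updated projection content (arrow 1005).
    Pointer outside the projection area: the master sends a scroll
    instruction, and the second device scrolls its own display content
    (arrow 1003).
    """
    if pointer_in_projection_area:
        return "scrolled_projection_content"
    return "scroll_instruction"
```

Either payload leaves the keyboard HOOK state untouched, so typed input continues to appear wherever the input focus already was.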
  • the first electronic device 100 and the second electronic device 200 establish a multi-screen cooperative connection in a full-screen extended screen cooperative mode as shown in (c) of FIG. 2B . That is to say, after the first electronic device 100 establishes a connection with the second electronic device 200 , a display area for displaying its own content is set on the display screen of the second electronic device 200 . And, the keyboard and mouse of the first electronic device 100 can control the second electronic device 200 .
• after the first electronic device 100 detects that the mouse pointer moves out of a display screen or display area, it does not switch the position of the input focus, so as to ensure that the user can still perform input in the original display area. Moreover, after an operation such as a mouse click or touch operation by the user is detected in a display area other than the display area used to display the second electronic device 200's own content, the position of the input focus is not switched; the input focus is moved to the second electronic device 200 only after an operation such as a mouse click or touch operation by the user is detected in the content display area of the second electronic device 200.
• similarly, the first electronic device 100 is configured with a keyboard and a mouse, and the user can use the mouse functions of the first electronic device 100 through an external mouse and/or touchpad.
  • the second electronic device 200 includes a display area 1101 for displaying the display content of the second electronic device 200 itself, and the display area other than the display area 1101 is for displaying the screen projection content sent by the first electronic device 100 .
  • the content displayed on the display screen of the first electronic device 100 may be the same or different from the content displayed in the display areas other than the display area 1101 , which is not specifically limited in this embodiment of the present application.
  • the screen projection method and specific screen projection related content reference may be made to the prior art, and the embodiments of the present application will not repeat them here.
  • the movement of the mouse pointer 1102 will not affect the display of the content being input.
• the first electronic device 100 unloads the keyboard HOOK and displays the content input by the user on the display screen of the first electronic device 100 and/or in a display area other than the display area 1101 of the second electronic device 200. The movement of the mouse pointer 1102 into the display area 1101 will not affect the input being displayed in the display areas outside the display area 1101.
• the first electronic device 100 mounts a keyboard HOOK and sends the intercepted user input content to the display area 1101 of the second electronic device 200 for display; moving the mouse pointer 1102 to a display area other than the display area 1101, or to the display screen of the first electronic device 100, will not affect the user input being displayed in the display area 1101 of the second electronic device 200.
• the user's input content is displayed on the first electronic device 100 and/or, as shown by reference numeral 1103, in the display area other than the display area 1101 of the second electronic device 200.
  • the second electronic device 200 detects the user's click operation (such as a touch operation) in the display area 1101, it determines that the user needs to input in the display area 1101, so Send network signaling to the first electronic device 100 .
• after receiving the network signaling, the first electronic device 100 starts the event interception module shown in FIG. 4A and mounts the keyboard HOOK through the event interception module. After that, as shown in (b) of the figure, the first electronic device 100 detects the user's input, intercepts the input data through the keyboard HOOK, and forwards the input data to the second electronic device 200; as shown by reference numeral 1105, the second electronic device 200 displays the user's input content in the display area 1101 according to the received input data.
• the second electronic device 200 detects a click operation (such as a touch operation) of the user on a display area other than the display area 1101, and determines that the user needs to input display content in the display areas other than the display area 1101; therefore, network signaling is sent to the first electronic device 100. After receiving the network signaling, the first electronic device 100 closes the event interception module shown in FIG. 4A and uninstalls the keyboard HOOK, so that the keyboard HOOK will not intercept subsequent keyboard input events. Then, as shown in (c) of the figure, the first electronic device 100 detects the user's input, processes the input data to determine the projection content, and then projects the content to the second electronic device 200 for display.
  • the user input content received by the first electronic device 100 is displayed in a display area other than the display area 1101 of the second electronic device 200 by means of screen projection.
• in this way, the input focus can be unified between electronic devices configured with the same or different systems, with the master device managing the input.
  • the main device can switch the input focus after determining that the user has the intention to switch the input focus, so as to ensure that the input display meets the needs of the user and improve the user experience.
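Across the three cooperation modes described above, the click target that moves the input focus to the second device can be summarized in one table. This is an illustrative Python sketch; the mode and target labels are assumptions:

```python
# Which click target causes the master device to mount the keyboard HOOK
# (i.e. moves the input focus to the second device) in each mode.
MOUNT_HOOK_ON = {
    "shared":   {"second_screen"},       # any click on the second device's screen
    "windowed": {"outside_projection"},  # click outside the projection area
    "extended": {"own_display_area"},    # click in the second device's own area (1101)
}

def should_mount(mode, click_target):
    """Return True when the click expresses an intention to input on
    the second device, so the master should mount the keyboard HOOK."""
    return click_target in MOUNT_HOOK_ON[mode]
```

Every other click target, in every mode, keeps or returns the focus to the master device, and pointer movement alone never appears in this table at all.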
  • FIG. 12 is a schematic flowchart of an input control method provided by an embodiment of the present application. As shown in Fig. 12, the method includes the following steps.
  • the first electronic device determines a first operation that acts on the first display area.
  • the first electronic device and the second electronic device establish a multi-screen cooperative connection, wherein the first electronic device acts as the master device.
  • the multi-screen coordination modes applied by the first electronic device and the second electronic device include: a shared coordination mode, a windowed coordination mode, or a full-screen extended screen coordination mode.
  • the first operation is used to instruct to switch the input focus position to the first display area of the second electronic device.
  • the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
  • the display screen of the first electronic device and the display screen of the second electronic device are displayed separately, the first display area is the display area displayed on the display screen of the second electronic device, and the second display area is the display area of the first electronic device.
  • the first electronic device is configured with a mouse and a keyboard, and the first electronic device can detect a mouse click operation acting on the display screen of the second electronic device.
• the second display area is the projection area of the first electronic device on the display screen of the second electronic device, and the first display area is the display area outside the projection area on the display screen of the second electronic device.
  • the first electronic device intercepts the first input.
  • the first electronic device determines to mount the keyboard HOOK after detecting the click operation of the mouse on the first display area, or after determining the touch operation acting on the first display area.
  • the first electronic device intercepts the first input of the keyboard through the keyboard HOOK. That is to say, after the mouse pointer moves to other display areas, the keyboard HOOK will not be mounted, and the first electronic device will mount the keyboard HOOK only after confirming the first operation to intercept keyboard input.
  • the input module of the first electronic device is described using a keyboard as an example, and the keyboard may be a built-in keyboard or an external keyboard.
  • the method for the first electronic device to intercept keyboard input is described by taking the keyboard HOOK method as an example.
• this embodiment of the present application does not limit the names of the input module and the module for intercepting input, nor whether they are two modules or one module; any modules that have similar functions of receiving input and intercepting input and that conform to the technical ideas of the methods provided in the embodiments of the present application shall fall within the protection scope of the present application.
  • the first electronic device sends the first input to the second electronic device.
  • after intercepting the first input of the keyboard, the first electronic device sends the first input to the second electronic device.
  • the second electronic device receives the first input sent by the first electronic device.
  • the second electronic device displays the first input in the first display area.
  • after receiving the first input sent by the first electronic device, the second electronic device displays the first input in the first display area.
  • the first electronic device can then mount the keyboard HOOK, intercept subsequent keyboard input events, and send them to the second electronic device, thereby switching the input focus, ensuring that the input display meets the user's needs, and improving the user experience.
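The mount-on-demand interception described above can be illustrated with a small, platform-agnostic sketch. All class and method names here are hypothetical; a real Windows implementation would mount a low-level keyboard hook via `SetWindowsHookEx(WH_KEYBOARD_LL, ...)` and forward events over the devices' collaboration link:

```python
# Hypothetical sketch of the intercept-and-forward flow; names are
# illustrative, not taken from any real API.

class FirstDevice:
    def __init__(self, link_to_second_device):
        self.hook_mounted = False
        # Stand-in for the transport that carries events to the second device.
        self.link = link_to_second_device

    def on_first_operation(self):
        # First operation (click/touch on the first display area of the
        # second electronic device): mount the keyboard HOOK.
        self.hook_mounted = True

    def on_key_event(self, key):
        if self.hook_mounted:
            # Intercepted by the keyboard HOOK: forward the event to the
            # second device instead of handling it locally.
            self.link.append(key)
            return None          # swallow the event on the first device
        return key               # normal local input handling


link = []
dev = FirstDevice(link)
assert dev.on_key_event("a") == "a"   # before the first operation: local input
dev.on_first_operation()              # mount the keyboard HOOK
assert dev.on_key_event("b") is None  # intercepted, not handled locally
```

After the sketch's `on_first_operation`, every key event lands in `link` (the forwarded stream) rather than in the local input queue, which is exactly the focus switch the bullets describe.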
  • after the first electronic device detects the second operation acting on the second display area, it determines to scroll the content displayed in the second display area.
  • the second operation is scrolling the mouse wheel while the mouse pointer is displayed in the second display area.
  • the second display area is the display area of the first electronic device
  • display content in the second display area is scrolled.
  • the multi-screen coordination mode applied by the first electronic device and the second electronic device is a shared coordination mode.
  • the first electronic device detects the mouse-wheel scroll operation acting on the second display area of the first electronic device and scrolls the content displayed in the second display area.
  • the scrolled display content of the second display area is then sent to the second electronic device.
  • the multi-screen coordination mode applied by the first electronic device and the second electronic device is a windowed coordination mode or a full-screen extended screen coordination mode.
  • after the first electronic device detects the second operation acting on the second display area, it sends first screen-projection content to the second electronic device, where the first screen-projection content is the scrolled screen-projection content.
  • the display position of the mouse pointer is decoupled from the position of the input focus, and the input focus is uniformly managed by the first electronic device, which satisfies the user's need to input content in another display area while scrolling through one display area.
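The mode-dependent handling of the scroll (second) operation can be sketched as a small dispatcher. The mode names and return values below are illustrative assumptions, not terms defined by the application:

```python
# Hypothetical dispatch of a mouse-wheel scroll on the second display area,
# branching on the multi-screen coordination mode in use.

def handle_second_operation(mode, delta):
    if mode == "shared":
        # Shared coordination: scroll locally, then send the scrolled
        # display content of the second display area to the second device.
        return {"scroll_by": delta, "send": "scrolled_display_content"}
    if mode in ("windowed", "full_screen_extended"):
        # Windowed / full-screen extended coordination: send the first
        # screen-projection content, i.e. the scrolled projection frames.
        return {"scroll_by": delta, "send": "scrolled_projection_content"}
    raise ValueError(f"unknown coordination mode: {mode}")
```

In both branches the pointer's position alone decides what scrolls, while the keyboard focus can remain pinned elsewhere; this is the decoupling of pointer position and input focus noted above.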
  • the first electronic device determines a third operation acting on the second display area, where the third operation is used to instruct switching of the input focus position to the second display area corresponding to the first electronic device. After determining the third operation, the first electronic device unloads the keyboard HOOK and no longer intercepts input; instead, after receiving a second input, it determines according to the second input to display the second input in the second display area.
  • the third operation is a mouse click operation acting on the second display area, or the third operation is a touch operation acting on the second display area.
  • only after determining a mouse click operation or touch operation acting on a display area does the first electronic device decide, according to whether that display area is its own corresponding display area, whether to mount or unload the keyboard HOOK. In this way, the input display is ensured to meet the user's needs, and the user experience is improved.
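Taken together, the first and third operations drive a simple mount/unload decision keyed on the clicked display area. The sketch below is a hypothetical summary of that decision (area names are placeholders; "first_display_area" belongs to the second device, "second_display_area" to the first device, matching the bullets above):

```python
# Hypothetical focus manager: the keyboard HOOK is mounted only while the
# input focus sits in the second device's (first) display area.

class InputFocusManager:
    def __init__(self):
        self.hook_mounted = False  # input focus starts on the first device

    def on_click(self, area):
        if area == "first_display_area":
            # First operation: focus moves to the second device's area,
            # so mount the keyboard HOOK and intercept keyboard input.
            self.hook_mounted = True
        elif area == "second_display_area":
            # Third operation: focus returns to the first device's own
            # area, so unload the keyboard HOOK and stop intercepting.
            self.hook_mounted = False
        return self.hook_mounted
```

Note that only clicks (or touches) change the state; pointer movement alone never mounts or unloads the hook, consistent with the mount-on-confirmation behavior described earlier.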
  • the input control method provided by the embodiment of the present application has been described in detail above with reference to FIGS. 5-12 .
  • the electronic device provided by the embodiment of the present application will be described in detail below with reference to FIG. 13 and FIG. 14 .
  • FIG. 13 is a schematic structural diagram of a first electronic device provided in an embodiment of the present application.
  • the first electronic device 1300 may include: a processing unit 1301 and a transceiver unit 1302 .
  • the first electronic device 1300 may be configured to implement the functions of the first electronic device 100 involved in the foregoing method embodiments.
  • the processing unit 1301 is configured to support the first electronic device 1300 to execute S1201 and S1202 in FIG. 12 .
  • the transceiving unit 1302 is configured to support the first electronic device 1300 to execute S1203 in FIG. 12 .
  • the transceiving unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver-related circuit component, and may be a transceiver or a transceiver module.
  • the operations and/or functions of each unit in the first electronic device 1300 are intended to implement the corresponding flow of the input control method described in the above method embodiment. For all relevant content of each step involved in the above method embodiment, reference may be made to the corresponding description of the functions of the functional units, which will not be repeated here.
  • the first electronic device 1300 shown in FIG. 13 may further include a storage unit (not shown in FIG. 13), where programs or instructions are stored.
  • when the processing unit 1301 and the transceiver unit 1302 execute the programs or instructions, the first electronic device 1300 shown in FIG. 13 can execute the input control method described in the above method embodiment.
  • the technical solution provided in this application may also be a functional unit or a chip in the first electronic device, or a device used in conjunction with the first electronic device.
  • FIG. 14 is a schematic structural diagram of a second electronic device provided in an embodiment of the present application.
  • the second electronic device 1400 may include: a transceiver unit 1401 and a display unit 1402 .
  • the second electronic device 1400 may be used to implement the functions of the second electronic device 200 involved in the foregoing method embodiments.
  • the transceiving unit 1401 is configured to support the second electronic device 1400 to execute S1203 in FIG. 12 .
  • the display unit 1402 is configured to support the second electronic device 1400 to execute S1204 in FIG. 12 .
  • the transceiving unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver-related circuit component, and may be a transceiver or a transceiver module.
  • the operations and/or functions of each unit in the second electronic device 1400 are intended to implement the corresponding flow of the input control method described in the above method embodiment. For all relevant content of each step involved in the above method embodiment, reference may be made to the corresponding description of the functions of the functional units, which will not be repeated here.
  • the second electronic device 1400 shown in FIG. 14 may further include a processing unit (not shown in FIG. 14), which may be implemented as a processing module or a processing circuit for processing the first input sent by the first electronic device, so as to realize the display by the display unit 1402.
  • the second electronic device 1400 shown in FIG. 14 may further include a storage unit (not shown in FIG. 14), where programs or instructions are stored.
  • when the transceiver unit 1401 and the display unit 1402 execute the programs or instructions, the second electronic device 1400 shown in FIG. 14 can execute the input control method described in the above method embodiment.
  • the technical solution provided in this application may also be a functional unit or a chip in the second electronic device, or a device matched with the second electronic device.
  • the embodiment of the present application also provides a chip system, including a processor coupled with a memory, where the memory is used to store programs or instructions; when the programs or instructions are executed by the processor, the chip system implements the method in any one of the foregoing method embodiments.
  • there may be one or more processors in the chip system.
  • the processor can be implemented by hardware or by software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software codes stored in a memory.
  • the memory may be integrated with the processor, or may be configured separately from the processor, which is not limited in this embodiment of the present application.
  • the memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated with the processor on the same chip or arranged separately on different chips.
  • the embodiment of the present application does not specifically limit the type of the memory or the arrangement manner of the memory and the processor.
  • the chip system may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or other integrated chips.
  • each step in the foregoing method embodiments may be implemented by an integrated logic circuit of hardware in a processor or instructions in the form of software.
  • the method steps disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
  • the embodiment of the present application also provides a computer-readable storage medium storing a computer program; when the computer program is run on a computer, the computer is made to perform the above-mentioned related steps, so as to realize the input control method in the above-mentioned embodiment.
  • An embodiment of the present application further provides a computer program product, which, when running on a computer, causes the computer to execute the above-mentioned related steps, so as to realize the input control method in the above-mentioned embodiment.
  • the embodiment of the present application further provides a device.
  • the apparatus may specifically be a component or a module, and the apparatus may include one or more processors and a memory associated therewith, where the memory is used to store a computer program; when the computer program is executed by the one or more processors, the apparatus executes the input control method in the above method embodiments.
  • the apparatus, computer-readable storage medium, computer program product, or chip provided in the embodiments of the present application are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
  • the steps of the methods or algorithms described in connection with the disclosure of the embodiments of the present application may be implemented in the form of hardware, or may be implemented in the form of a processor executing software instructions.
  • the software instructions can be composed of corresponding software modules, and the software modules can be stored in random access memory (random access memory, RAM), flash memory, read only memory (read only memory, ROM), erasable programmable read-only memory (erasable programmable ROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), registers, hard disk, removable hard disk, CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and the storage medium may be located in an application specific integrated circuit (ASIC).
  • the disclosed method may be implemented in other ways.
  • the device embodiments described above are illustrative only.
  • the division of the modules or units is only a logical function division, and there may be other division methods in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of modules or units may be in electrical, mechanical or other forms.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • Computer-readable storage media and other media that can store program code include but are not limited to: a USB flash drive (U disk), a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of terminals, and provides an input control method and an electronic device. In the present application, after determining an operation for switching an input focus position, a first electronic device intercepts an input and transmits it to a second electronic device, so as to improve the user experience. According to the method, after determining a first operation acting on a first display area of a second electronic device, a first electronic device intercepts a first input and transmits the first input to the second electronic device, so that the second electronic device displays the first input in the first display area. The first operation is used to instruct switching of the input focus position to the first display area of the second electronic device.
PCT/CN2022/119062 2021-10-29 2022-09-15 Procédé de commande d'entrée et dispositif électronique WO2023071590A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111276091.8 2021-10-29
CN202111276091.8A CN116069224A (zh) 2021-10-29 2021-10-29 输入控制方法及电子设备

Publications (1)

Publication Number Publication Date
WO2023071590A1 true WO2023071590A1 (fr) 2023-05-04

Family

ID=86159106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/119062 WO2023071590A1 (fr) 2021-10-29 2022-09-15 Procédé de commande d'entrée et dispositif électronique

Country Status (2)

Country Link
CN (1) CN116069224A (fr)
WO (1) WO2023071590A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140035851A1 (en) * 2012-07-31 2014-02-06 Samsung Electronics Co., Ltd. Method for controlling user input and electronic device thereof
US20150067590A1 (en) * 2013-08-30 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus for sharing objects in electronic device
CN105511787A (zh) * 2015-12-04 2016-04-20 联想(北京)有限公司 输入方法、电子设备和输入系统
CN105955513A (zh) * 2016-04-25 2016-09-21 北京润科通用技术有限公司 信息处理方法、电子设备及无线鼠标
CN110147256A (zh) * 2019-04-03 2019-08-20 珠海全志科技股份有限公司 一种多屏交互方法及装置
CN113050841A (zh) * 2019-12-26 2021-06-29 华为技术有限公司 显示多窗口的方法、电子设备和系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140035851A1 (en) * 2012-07-31 2014-02-06 Samsung Electronics Co., Ltd. Method for controlling user input and electronic device thereof
US20150067590A1 (en) * 2013-08-30 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus for sharing objects in electronic device
CN105511787A (zh) * 2015-12-04 2016-04-20 联想(北京)有限公司 输入方法、电子设备和输入系统
CN105955513A (zh) * 2016-04-25 2016-09-21 北京润科通用技术有限公司 信息处理方法、电子设备及无线鼠标
CN110147256A (zh) * 2019-04-03 2019-08-20 珠海全志科技股份有限公司 一种多屏交互方法及装置
CN113050841A (zh) * 2019-12-26 2021-06-29 华为技术有限公司 显示多窗口的方法、电子设备和系统

Also Published As

Publication number Publication date
CN116069224A (zh) 2023-05-05

Similar Documents

Publication Publication Date Title
JP7473101B2 (ja) アプリケーション表示方法及び電子デバイス
WO2021052147A1 (fr) Procédé de transmission de données et dispositifs associés
WO2020000448A1 (fr) Procédé et terminal d'affichage d'écran flexible
WO2021036770A1 (fr) Procédé de traitement d'écran partagé et dispositif terminal
WO2021121052A1 (fr) Procédé et système de coopération à écrans multiples et dispositif électronique
WO2021063237A1 (fr) Procédé de commande de dispositif électronique et dispositif électronique
WO2022068483A1 (fr) Procédé et appareil de démarrage d'application, et dispositif électronique
WO2024016559A1 (fr) Procédé de coopération de multiples dispositifs, dispositif électronique et produit associé
WO2022057512A1 (fr) Procédé et appareil à écran divisé et dispositif électronique
WO2023016012A9 (fr) Procédé d'affichage d'informations et dispositif électronique
US20240073978A1 (en) Method for monitoring link and terminal device
CN112130788A (zh) 一种内容分享方法及其装置
WO2022262439A1 (fr) Procédé de traitement de ressources réseau, dispositif électronique et support de stockage lisible par ordinateur
WO2023207761A1 (fr) Procédé de commande périphérique, ainsi que dispositif électronique et système
WO2022127661A1 (fr) Procédé de partage d'applications et dispositif électronique et support de stockage
WO2022156535A1 (fr) Procédé et appareil de traitement d'application distribuée
WO2021057699A1 (fr) Procédé de commande d'un dispositif électronique à écran flexible et dispositif électronique
WO2021190524A1 (fr) Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal
US20230236714A1 (en) Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device
CN115016697A (zh) 投屏方法、计算机设备、可读存储介质和程序产品
CN110609650B (zh) 一种应用状态切换方法及终端设备
WO2023088459A1 (fr) Procédé de collaboration de dispositif et appareil associé
WO2022206848A1 (fr) Procédé et dispositif d'affichage d'un gadget logiciel d'application
WO2023071590A1 (fr) Procédé de commande d'entrée et dispositif électronique
WO2023045774A1 (fr) Procédé d'affichage et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22885477

Country of ref document: EP

Kind code of ref document: A1