CN117555433A - Input method, input device and system of device - Google Patents

Input method, input device and system of device

Info

Publication number
CN117555433A
Authority
CN
China
Prior art keywords
input
terminal device
terminal
input device
input information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210931591.9A
Other languages
Chinese (zh)
Inventor
彭冠奇
段潇潇
原琳杰
王金波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210931591.9A priority Critical patent/CN117555433A/en
Priority to PCT/CN2023/110455 priority patent/WO2024027671A1/en
Publication of CN117555433A publication Critical patent/CN117555433A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383 Signal control means within the pointing device
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using dedicated keyboard keys or combinations thereof
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/10 Program control for peripheral devices
    • G06F13/105 Program control for peripheral devices where the programme performs an input/output emulation function
    • G06F13/107 Terminal emulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The present application discloses an input method for a device, an input device, and a system, relating to the field of terminal technologies. The method can reduce the networking requirements on terminal devices and the cost of the input devices while improving input efficiency. In this application, a first input device can switch its focus device among a plurality of terminal devices at will, and a second input device can achieve cross-device input by establishing a connection with only one of the terminal devices. Based on the disclosed solution, low-cost, seamless cross-device use of input devices can be achieved without relying on a wireless local area network (WLAN).

Description

Input method, input device and system of device
Technical Field
Embodiments of the present application relate to the field of terminal technologies, and in particular to an input method for a device, an input device, and a system.
Background
With the rapid development of terminal technologies, users often use a plurality of terminal devices at the same time in daily work or life. If the plurality of terminal devices are controlled by separate sets of input devices (such as a keyboard and a mouse), switching between terminal devices becomes cumbersome for the user. Therefore, to use a plurality of terminal devices more efficiently, it is desirable to share one set of input devices among them.
In existing solutions, for one keyboard-and-mouse set to control a plurality of terminal devices, the terminal devices must be in the same local area network, and relatively high requirements are imposed on the keyboard and mouse. Therefore, how to conveniently control a plurality of terminal devices with a single set of input devices is a problem to be solved.
Disclosure of Invention
The present application provides an input method for a device, an input device, and a system. The input devices can be switched among terminal devices without requiring a connection between the terminal devices, and the cost of the input devices is low.
In a first aspect, an input method for a device is provided, applied to a communication system. The communication system includes a first input device, a second input device, a first terminal device, and a second terminal device, where the first input device is communicatively connected to the first terminal device and the second terminal device, and the second input device is communicatively connected to the first terminal device. The method includes the following steps: the first input device switches focus from the first terminal device to the second terminal device in response to a first operation acting on the first input device; the second input device sends first input information to the first terminal device in response to a second operation acting on the second input device; the first terminal device sends second input information to the first input device according to the received first input information; the first input device forwards the received second input information to the second terminal device; and the second terminal device responds to the second operation according to the received second input information.
In this method, the first terminal device and the second terminal device do not need to build a complex networking environment; the input information of the second input device is forwarded through the first input device, which reduces the complexity of implementing the solution. In addition, the second input device is not required to support multiple simultaneous connections, which lowers the requirements on the input device and reduces the cost for the user.
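For clarity, the following is a minimal Python sketch of the forwarding chain described above. All class, field, and method names are illustrative assumptions, not part of the claimed method, and the transport between devices is reduced to direct method calls.

```python
from dataclasses import dataclass

@dataclass
class InputInfo:
    source_id: str   # identifier of the physical input device (e.g. the keyboard)
    payload: bytes   # raw input report, e.g. a key code

class SecondTerminal:
    """Second terminal device: receives forwarded input and responds to it."""
    def on_second_input_info(self, info: InputInfo) -> None:
        print(f"second terminal responds to {info.source_id}: {info.payload!r}")

class FirstInputDevice:
    """First input device (e.g. a mouse) connected to both terminal devices."""
    def __init__(self):
        self.focus = None                    # current focus device

    def switch_focus(self, target) -> None:  # triggered by the first operation
        self.focus = target

    def forward(self, info: InputInfo) -> None:
        # Forward information received from the first terminal to the focus device.
        self.focus.on_second_input_info(info)

class FirstTerminal:
    """First terminal device: connected to both the first and second input devices."""
    def __init__(self, first_input_device: FirstInputDevice):
        self.first_input_device = first_input_device

    def on_first_input_info(self, info: InputInfo) -> None:
        # Generate the second input information (here simply re-wrapped) and
        # hand it to the first input device for forwarding.
        self.first_input_device.forward(InputInfo(info.source_id, info.payload))

second_terminal = SecondTerminal()
first_input = FirstInputDevice()
first_terminal = FirstTerminal(first_input)
first_input.switch_focus(second_terminal)   # first operation: focus moves to the second terminal
# Second operation: a key press on the second input device reaches the first terminal first.
first_terminal.on_first_input_info(InputInfo("keyboard-01", b"\x04"))
```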
In one possible implementation, after the first input device responds to the first operation, the method further includes: the first input device sends a first broadcast message to the first terminal device and the second terminal device, where the first broadcast message is used to indicate that the current focus device of the first input device is the second terminal device.
When the focus of the first input device is switched, a broadcast message is sent to all currently connected terminal devices, so that every connected terminal can perceive the state of the input device and respond accordingly.
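A possible shape of such a broadcast message is sketched below; the JSON encoding and the field names are assumptions made only for illustration, not a defined wire format.

```python
import json

def build_focus_broadcast(input_device_id: str, focus_device_id: str) -> bytes:
    """Announce the current focus device of the first input device to every
    connected terminal (illustrative encoding)."""
    return json.dumps({
        "type": "focus_changed",
        "input_device": input_device_id,
        "focus_device": focus_device_id,
    }).encode()

# Sent to both terminal devices when focus moves to the second terminal device:
first_broadcast = build_focus_broadcast("mouse-01", "terminal-2")
```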
In another possible implementation, the first input information is the same as or different from the second input information.
In another possible implementation, the first input device is a mouse and the second input device is a keyboard.
In another possible implementation manner, the first terminal device runs a first system, and the second input information includes a key code and a system identifier, where the key code is a key code under the first system, and the system identifier is used to indicate the first system.
Because the second input information carries the identifier of the first system, when the second terminal device subsequently receives the second input information through the first input device, it can decode the meaning of the key code according to the system identifier and respond accordingly.
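As an illustration, the second input information might be modeled as follows; the structure is an assumption, and the example codes (Windows virtual-key code VK_A is 0x41, Android KEYCODE_A is 29) are used purely as sample values.

```python
from dataclasses import dataclass

@dataclass
class SecondInputInfo:
    key_code: int    # key code as defined under the first system
    system_id: str   # identifies the first system, e.g. "windows" or "android"

def decode_key(info: SecondInputInfo) -> str:
    # The receiver picks a key map according to the system identifier.
    key_maps = {
        "windows": {0x41: "A"},   # Windows virtual-key code VK_A
        "android": {29: "A"},     # Android KEYCODE_A
    }
    return key_maps[info.system_id][info.key_code]

print(decode_key(SecondInputInfo(key_code=0x41, system_id="windows")))  # -> "A"
```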
In another possible implementation, when the focus is switched from the first terminal device to the second terminal device, the second terminal device creates a virtual device, where the virtual device corresponds to the second input device.
Because the second terminal device has not established a connection with the second input device, a virtual input device corresponding to the second input device needs to be created to process the received input information.
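A minimal sketch of managing such virtual devices on the second terminal is shown below, assuming an in-memory registry keyed by the remote input device's identifier; on a Linux-based system the entry could additionally be backed by a uinput node, but that is omitted here.

```python
class VirtualDeviceManager:
    """Keeps one virtual input device per remote physical input device that has
    no direct connection to this terminal (illustrative implementation)."""
    def __init__(self):
        self._devices = {}

    def ensure(self, input_device_id: str) -> dict:
        # Created when this terminal becomes the focus device.
        return self._devices.setdefault(
            input_device_id, {"id": input_device_id, "type": "keyboard"})

    def remove(self, input_device_id: str) -> None:
        # Deleted when this terminal is no longer the focus device.
        self._devices.pop(input_device_id, None)

manager = VirtualDeviceManager()
virtual_keyboard = manager.ensure("keyboard-01")   # on focus switch to this terminal
manager.remove("keyboard-01")                      # on focus switch away
```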
In another possible implementation, the second terminal device runs a second system, and the step in which the second terminal device responds to the second operation according to the second input information includes: the second terminal device converts the second input information into input information under the second system; and the second terminal device delivers the converted input information to the second system through the virtual device, so that the second terminal device responds to the second operation.
Upon receiving the second input information, the second terminal device converts it into a form that the second system can interpret.
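The conversion and injection step could look like the following sketch. The mapping entry is illustrative (Windows VK_A is 0x41 and the Linux input-event code KEY_A is 30), and emit_key is a hypothetical stand-in for whatever injection interface the virtual device exposes.

```python
# Illustrative mapping from a key code of the first system to the second system.
FIRST_TO_SECOND_KEYMAP = {
    ("windows", 0x41): 30,   # Windows VK_A -> Linux KEY_A
}

class VirtualKeyboard:
    """Stand-in for the virtual device created on the second terminal."""
    def emit_key(self, code: int) -> None:
        print(f"deliver key event {code} to the second system")

def inject(vdev: VirtualKeyboard, system_id: str, key_code: int) -> None:
    # Convert the key code of the first system into the second system's encoding,
    # then feed it to the second system through the virtual device.
    vdev.emit_key(FIRST_TO_SECOND_KEYMAP[(system_id, key_code)])

inject(VirtualKeyboard(), "windows", 0x41)   # prints: deliver key event 30 ...
```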
In another possible implementation, the first input device switches focus from the second terminal device back to the first terminal device in response to a third operation acting on the first input device; and the first input device sends a second broadcast message to the first terminal device and the second terminal device, where the second broadcast message is used to indicate that the current focus device of the first input device is the first terminal device.
When the focus of the first input device is switched from the second terminal device to the first terminal device, a broadcast message needs to be sent to all terminal devices to which the first input device is currently connected, so as to synchronize which device is currently the focus host of the first input device.
In another possible implementation, the second terminal device deletes the virtual device when the focus device of the first input device is not the second terminal device.
In a second aspect, a device input method is provided, applied to a first input device, where the first input device is communicatively connected to a first terminal device and a second terminal device. The method includes: the first input device receives a focus migration instruction sent by the first terminal device, and switches the focus device of the first input device from the first terminal device to the second terminal device; the first input device receives second input information sent by the first terminal device, where the second input information is generated by the first terminal device according to first input information sent by a second input device, and the second input device is connected to the first terminal device; and the first input device sends the second input information to the second terminal device, so that the second terminal device responds according to the second input information.
In this method, after the first input device switches focus from the first terminal device to the second terminal device, the first input device forwards the input information of the second input device to the second terminal device so that the second terminal device can respond. Although the second input device is not connected to the second terminal device, its input operations can still be reflected on the second terminal device, which simulates a scenario in which the focus of the second input device itself has been switched.
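From the first input device's point of view (the second aspect), the handling can be condensed into two event handlers, sketched below with assumed names and a trivial in-process link abstraction.

```python
class Link:
    """Stand-in for a connection from the first input device to one terminal."""
    def __init__(self, name: str):
        self.name = name
    def send(self, message: dict) -> None:
        print(f"to {self.name}: {message}")

class FirstInputDeviceAgent:
    def __init__(self, links: dict):
        self.links = links                 # terminal id -> Link
        self.focus_id = "terminal-1"

    def on_focus_migration_instruction(self, target_id: str) -> None:
        # Sent by the first terminal device; switch focus and broadcast the new state.
        self.focus_id = target_id
        for link in self.links.values():
            link.send({"type": "focus_changed", "focus_device": target_id})

    def on_second_input_info(self, payload: dict) -> None:
        # Sent by the first terminal device; forward to the current focus device.
        self.links[self.focus_id].send({"type": "input", **payload})

agent = FirstInputDeviceAgent({"terminal-1": Link("terminal-1"),
                               "terminal-2": Link("terminal-2")})
agent.on_focus_migration_instruction("terminal-2")
agent.on_second_input_info({"key_code": 0x41, "system_id": "windows"})
```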
In another implementation manner, after receiving the focus migration instruction, the first input device sends a first broadcast message to the first terminal device and the second terminal device, where the first broadcast message is used to indicate that the current focus device of the first input device is the second terminal device.
In another implementation, the first input information is the same as or different from the second input information.
In another implementation, the first input device is a mouse.
In a third aspect, there is provided an input device having the capability of connecting a plurality of terminal devices simultaneously, the input device comprising: a communication interface for performing radio signal transmission and reception; a memory for storing computer program instructions; a processor for executing the computer program instructions to support the input device to implement the method as described in any of the implementations of the first aspect.
In a fourth aspect, a communication system is provided, including a first terminal device, a second terminal device, and a first input device; the communication system is configured to implement the method according to any implementation of the first aspect.
In a fifth aspect, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement a method as in any one of the possible implementations of the second aspect.
In a sixth aspect, a chip system is provided, the chip system comprising a processor, a memory, the memory having stored therein computer program instructions; the computer program instructions, when executed by the processor, implement a method as in any one of the possible implementations of the second aspect. The chip system may be formed of a chip or may include a chip and other discrete devices.
In a seventh aspect, a computer program product is provided which, when run on a computer, causes the method as in any one of the possible implementations of the second aspect to be implemented.
Drawings
Fig. 1 is a diagram of a conventional system architecture according to an embodiment of the present application.
Fig. 2 is a system architecture diagram provided in an embodiment of the present application.
Fig. 3 is a hardware architecture diagram of a terminal device according to an embodiment of the present application.
Fig. 4 is a hardware architecture diagram of an input device according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a user interaction interface according to an embodiment of the present application.
Fig. 6 is a signaling interaction diagram provided in an embodiment of the present application.
Fig. 7 is a schematic diagram of a device manager according to an embodiment of the present application.
Fig. 8 is a signaling interaction diagram provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: only A exists, both A and B exist, or only B exists. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
It will be appreciated that a terminal device may typically be connected to one or more input devices, such as input peripherals, to perform corresponding functions. For example, a personal computer (PC) may be connected to a mouse, through which the cursor is moved on the PC display screen and further functions such as opening or deleting files are performed. The PC may also be connected to a keyboard, through which functions such as moving the cursor on the PC display screen, entering characters, and opening or deleting files are performed. For another example, a tablet computer may be connected to a keyboard, through which characters are entered into the tablet computer and applications, video, or audio on the tablet computer are opened.
With conventional methods, when a user uses a plurality of terminal devices at the same time, the user either needs to carry and maintain an input device for each terminal device, or needs to manually disconnect and reconfigure the connection between an input device and the terminal devices, which makes controlling multiple devices cumbersome and inefficient.
In order to provide a more convenient multi-device operation experience, as an implementation manner, the input device can be controlled to switch among a plurality of devices, so that flexible operation or control of a plurality of terminal devices through a set of input devices can be realized.
For example, referring to fig. 1, fig. 1 shows a schematic diagram of a process for controlling a plurality of terminal devices using a single set of input devices. In the scenario shown in fig. 1, the input devices are a keyboard and a mouse. Terminal device 1 and terminal device 2 establish a connection, which may be a peer-to-peer (P2P) connection, or both devices may be located in the same local area network. In the scenario illustrated in fig. 1, the mouse and keyboard only need to remain connected to terminal device 1. Terminal device 1 monitors for the event that the mouse cursor slides to the edge of its screen. When it determines, according to the mouse displacement, that the cursor has slid off the screen edge, it sends the mouse displacement and the position at which the cursor left the screen to terminal device 2. Terminal device 2 then calculates and displays the cursor position on its own screen according to the received mouse displacement and the exit position on the screen of terminal device 1.
In the scheme shown in fig. 1, neither the keyboard nor the mouse is connected to terminal device 2; instead, terminal device 1 sends information to terminal device 2, which simulates a scenario in which two terminal devices are controlled with one set of input devices. In this scheme, terminal device 1 and terminal device 2 need to establish a stable and reliable connection, and as the number of terminal devices grows, networking becomes complex and scalability is poor.
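The edge-handoff logic of this conventional scheme can be summarized by the following sketch; the function and field names are illustrative, and only the right screen edge is checked for brevity.

```python
class LinkToTerminal2:
    """Stand-in for the connection from terminal device 1 to terminal device 2."""
    def send(self, message: dict) -> None:
        print(f"to terminal device 2: {message}")

def on_mouse_move(cursor: list, dx: float, dy: float,
                  screen_width: float, link: LinkToTerminal2) -> None:
    # Terminal device 1 side: detect the cursor sliding off its screen edge and
    # hand the displacement plus the exit position over to terminal device 2.
    x, y = cursor
    if x + dx >= screen_width:
        link.send({"dx": dx, "dy": dy, "exit_y": y + dy})
    else:
        cursor[0], cursor[1] = x + dx, y + dy   # cursor stays on terminal device 1

on_mouse_move([1910.0, 400.0], 15.0, 0.0, 1920.0, LinkToTerminal2())
```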
To solve the above problems and provide a more intelligent multi-device operation experience, the embodiments of the present application provide a method for connecting an input device. The method may be applied to scenarios in which multiple terminal devices are interconnected, so that the same input device can migrate freely among the multiple terminal devices.
As an example, in embodiments of the present application, the input device may be a wireless peripheral or a wired peripheral. A wireless peripheral is an input device that provides input to the terminal device based on a wireless transmission protocol (such as the Bluetooth protocol or a 2.4G communication protocol), for example a wireless mouse or a wireless keyboard. A wired peripheral is, for example, a wired mouse or a wired keyboard connected to the host via a USB interface and used to provide input to the terminal device.
Taking a Bluetooth peripheral as an example, the Bluetooth peripheral may communicate with the terminal devices based on a Bluetooth protocol whose protocol stack supports multiple simultaneous connections between devices; alternatively, the input device may communicate with the terminal device based on a 2.4G communication protocol; alternatively, the input device may be a wired peripheral and communicate with the terminal device over a wired USB channel.
In the embodiments of the present application, the input modes supported by the input device may include, but are not limited to, a key mode (for example, a mouse or a keyboard), a scroll-wheel mode (for example, a mouse), a gesture input mode, a body-language input mode, a voice input mode, an expression input mode, an eye-movement input mode, and the like.
The following describes a scenario to which the embodiments of the present application apply. As shown in fig. 2, the scenario includes a first terminal device 210, a second terminal device 220, a first input device 240, and a second input device 230.
The first terminal device 210 or the second terminal device 220 according to the embodiments of the present application may include, but is not limited to, a personal computer (PC), a smart phone, a netbook, a tablet computer, a smart camera, a palmtop computer, a smart television, a personal digital assistant (PDA), a portable multimedia player (PMP), a projection device, a smart-screen device, an augmented reality (AR)/virtual reality (VR) device, a mixed reality (MR) device, a television, a motion-sensing game console in a human-computer interaction scenario, or the like. The specific functions and structures of the terminal devices are not limited in this application.
Referring to fig. 3, fig. 3 shows a schematic hardware structure of an electronic device 300, where the electronic device 300 may be the terminal device 210 or the terminal device 220 provided in the embodiments of the present application. The electronic device 300 may include at least one of a mobile phone, a foldable electronic device, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer, a netbook, a cellular phone, a PDA, an AR device, a VR device, an artificial intelligence device, a wearable device, a vehicle-mounted device, a smart home device, or a smart city device. The embodiments of the present application do not particularly limit the type of the electronic device 300.
The electronic device 300 may include a processor 310, an internal memory 321, a USB connector 330, a charge management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone connector 370D, a sensor module 380, keys 390, a motor 391, an indicator 392, a camera 393, a display screen 394, a memory card connector 320, a SIM card interface 395, and the like. The sensor module 380 may include a pressure sensor 380A, a gyroscope sensor 380B, an air pressure sensor 380C, a magnetic sensor 380D, an acceleration sensor 380E, a distance sensor 380F, a proximity sensor 380G, a fingerprint sensor 380H, a temperature sensor 380J, a touch sensor 380K, an ambient light sensor 380L, a bone conduction sensor 380M, and the like.
The structure illustrated in the embodiment of the present application does not constitute a limitation of the electronic apparatus 300. In other embodiments of the present application, electronic device 300 may include more or fewer components than shown. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 310 may include one or more processing units, such as: the processor 310 may include an application processor, a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The processor 310 may generate operation control signals according to the instruction operation code and the timing signals to complete instruction fetching and instruction execution control.
A memory may also be provided in the processor 310 for storing instructions and data. In some embodiments, the memory in the processor 310 may be a cache memory. The memory may hold instructions or data that are used or used more frequently by the processor 310.
In some embodiments, processor 310 may include one or more interfaces. The interface may include an integrated circuit I2C interface, an I2S interface, a PCM interface, a UART interface, MIPI, GPIO interface, SIM interface, and/or USB interface, among others. The processor 310 may be connected to a touch sensor, an audio module, a wireless communication module, a display screen, or a camera module through at least one of the above interfaces.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not constitute a limitation on the electronic device 300. In other embodiments of the present application, the electronic device 300 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
USB connector 330 is a connector that conforms to the USB standard specification and can be used to connect the electronic device 300 to a peripheral device. The USB connector 330 may be a Mini-USB connector, a Micro-USB connector, a USB Type-C connector, or the like. The USB connector 330 may be used to connect a charger, through which the electronic device 300 is charged. It can also be used to connect other electronic devices to implement data transmission between the electronic device 300 and the other electronic devices, and it may also be used to connect headphones through which audio stored in the electronic device is output. The connector may also be used to connect other electronic devices, such as VR devices.
The charge management module 340 is configured to receive a charge input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 340 may receive a charging input of a wired charger through the USB connector 330. In some wireless charging embodiments, the charge management module 340 may receive a wireless charging input through a wireless charging coil. While the battery 342 is charged by the charge management module 340, the electronic device 300 can be powered by the power management module 341. The battery 342 may include at least one set of electrode terminals, each set of electrode terminals including at least one positive terminal. In one embodiment, when the battery includes two sets of electrode terminals, the electronic device may set two wired charging paths or two wireless charging paths, where each wired or wireless charging path is connected to at least one set of electrode terminals, and charge the battery 342 through multiple charging paths at the same time, so as to increase the charging power and reduce the temperature rise. In another embodiment, when the battery includes two sets of electrode terminals, one set of electrode terminals is used for wired charging and the other set is used for wireless charging, making the charging circuit layout more flexible. Based on the same design concept, a person skilled in the art may provide two or more sets of electrode terminals and two or more charging paths as design needs dictate.
The power management module 341 is configured to connect the battery 342, the charge management module 340 and the processor 310. The power management module 341 receives input from the battery 342 and/or the charge management module 340 to power the processor 310, the internal memory 321, the display screen 394, the camera 393, and the like. The power management module 341 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 341 may also be disposed in the processor 310. In other embodiments, the power management module 341 and the charging management module 340 may also be disposed in the same device.
The wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 300 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 350 may provide a solution for wireless communication including at least one of 2G, 3G, 4G, 5G, or 6G for application on the electronic device 300. The mobile communication module 350 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 350 may perform processes such as filtering, amplifying, etc. on the electromagnetic wave received by the antenna 1, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 350 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be disposed in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be provided in the same device as at least some of the modules of the processor 310.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 370A, receiver 370B, etc.), or displays images or video through display screen 394. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 350 or other functional module, independent of the processor 310.
The wireless communication module 360 may provide a wireless local area network module, a Bluetooth module, a BLE module, an ultra-wideband (UWB) module, a global navigation satellite system (GNSS) module, an FM module, a near field communication (NFC) module, an infrared module, or the like, which is applied to the electronic device 300. The wireless communication module 360 may be one or more devices that integrate at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 310. The wireless communication module 360 may also receive a signal to be transmitted from the processor 310, frequency-modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 350 of the electronic device 300 are coupled, and the antenna 2 and the wireless communication module 360 are coupled, so that the electronic device 300 may communicate with other electronic devices via wireless communication technologies. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include GPS, the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The electronic device 300 may implement display functions through the GPU, the display screen 394, the application processor, and the like. The GPU is a microprocessor for image processing and is connected to the display screen 394 and the application processor. The GPU is used to perform mathematical and geometric calculations and graphics rendering. The processor 310 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 394 is used for displaying images, videos, and the like. In some embodiments, the electronic device 300 may include one or more display screens 394. The display screen 394 may be at least one of an LCD, an OLED display screen, an AMOLED display screen, a FLED display screen, a Mini-LED display screen, a Micro-OLED display screen, a quantum dot light emitting diode (QLED) display screen, and the like.
The electronic device 300 may implement camera functions through the camera module 393, ISP, video codec, GPU, display screen 394, and application processor AP, neural network processor NPU, etc.
The camera module 393 may be used to collect color image data and depth data of a photographed object. The ISP may be used to process color image data collected by the camera module 393. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, an ISP may be provided in the camera module 393.
In some embodiments, the camera module 393 may be composed of a color camera module and a 3D sensing module.
In some embodiments, the photosensitive element of the camera of the color camera module may include a CCD, or CMOS phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing.
In some embodiments, the 3D sensing module may be a structured light 3D sensing module. The structured light 3D sensing module may include an infrared emitter, an infrared camera module, and the like. The structured light 3D sensing module firstly emits light spots with specific patterns to a shot object, then receives the light spot pattern codes on the surface of the object, and further compares the differences with the original projected light spots to determine the three-dimensional coordinates of the object. The three-dimensional coordinates may include a distance from the electronic device 300 to the object to be photographed. The 3D sensing module can obtain the distance (namely the depth) between the 3D sensing module and the shot object through the time of infrared ray turn-back so as to obtain a 3D depth map.
The structured light 3D sensing module can also be applied to the fields of face recognition, somatosensory game machines, industrial machine vision detection and the like. The 3D sensing module can also be applied to the fields of game machines, AR, VR and the like.
In other embodiments, camera module 393 may also be comprised of two or more cameras. The two or more cameras may include a color camera that may be used to capture color image data of the object being photographed. The two or more cameras may employ stereoscopic techniques to acquire depth data of the photographed object.
In some embodiments, the electronic device 300 may include one or more camera modules 393. For example, the electronic device 300 may include one front camera module 393 and one rear camera module 393. The front camera module 393 may be used to collect color image data and depth data of the photographer, and the rear camera module may be used to collect color image data and depth data of a photographed object (such as a person or a landscape) facing the photographer.
In some embodiments, a CPU or GPU or NPU in processor 310 may process color image data and depth data acquired by camera module 393. In some embodiments, the NPU may identify color image data acquired by the camera module 393 by a neural network algorithm, such as a convolutional neural network algorithm (CNN), based on which the skeletal point identification technique is based, to determine skeletal points of the captured person. The CPU or GPU may also be used to run a neural network algorithm to effect determination of skeletal points of the captured person from the color image data. In some embodiments, the CPU, GPU, or NPU may further be configured to confirm the stature (such as body proportion, weight of the body between the skeletal points) of the photographed person based on the depth data collected by the camera module 393 (which may be a 3D sensing module) and the identified skeletal points, and further determine the beautification parameters for the photographed person, and finally process the photographed image of the photographed person according to the body beautification parameters, so that the body form of the photographed person in the photographed image is beautified.
Video codecs are used to compress or decompress digital video. The electronic device 300 may support one or more video codecs. Thus, the electronic device 300 may play or record video in a variety of encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the electronic device 300 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The memory card connector 320 may be used to connect memory cards, such as a Micro SD card or a Nano SD card, to expand the storage capability of the electronic device 300. The memory card communicates with the processor 310 through the memory card connector 320 to implement data storage functions. In some embodiments, the memory card and the SIM card may share the same connector in a time-sharing manner, and the electronic device may identify whether the card placed in the connector is a memory card or a SIM card, so as to implement the corresponding function. Alternatively, the memory card and the SIM card may be disposed in the same connector at the same time and electrically connected to different spring contacts of the electronic device 300, so as to implement the storage function and the SIM function respectively.
The internal memory 321 may be used to store computer executable program code comprising instructions. The internal memory 321 may include a storage program area and a storage data area. The storage program area may store application programs and the like required for at least one function (such as a sound playing function, an image playing function, etc.) of the operating system. The storage data area may store data created during use of the electronic device 300 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 321 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 310 performs various functional methods or data processing of the electronic device 300 by executing instructions stored in the internal memory 321, and/or instructions stored in a memory provided in the processor.
The electronic device 300 may implement audio functions through an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone connector 370D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 370 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be disposed in the processor 310, or some of the functional modules of the audio module 370 may be disposed in the processor 310.
Speaker 370A, also known as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 300 may listen to music through the speaker 370A or output an audio signal for hands-free calling.
A receiver 370B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 300 is answering a telephone call or a voice message, voice may be received by placing the receiver 370B close to the ear.
Microphone 370C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user may speak near the microphone 370C to input a sound signal into the microphone 370C. The electronic device 300 may be provided with at least one microphone 370C. In other embodiments, the electronic device 300 may be provided with two or more microphones 370C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 300 may also use the microphones to identify the source of a sound, implement a directional recording function, and so on.
The earphone connector 370D is used to connect a wired earphone. The earphone connector 370D may be the USB connector 330, or may be a 3.5 mm connector conforming to the open mobile terminal platform (OMTP) standard or a connector conforming to the cellular telecommunications industry association of the USA (CTIA) standard.
The pressure sensor 380A is configured to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 380A may be disposed on the display screen 394. There are many types of pressure sensor 380A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force is applied to the pressure sensor 380A, the capacitance between the electrodes changes, and the electronic device 300 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 394, the electronic device 300 detects the intensity of the touch operation through the pressure sensor 380A. The electronic device 300 may also calculate the location of the touch based on the detection signal of the pressure sensor 380A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 380B may be used to determine a motion gesture of the electronic device 300. In some embodiments, the angular velocity of electronic device 300 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 380B. The gyro sensor 380B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 380B detects the shake angle of the electronic device 300, calculates the distance to be compensated by the lens module according to the angle, and controls the lens to move in the opposite direction to counteract the shake of the electronic device 300, thereby realizing anti-shake. The gyro sensor 380B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 380C is used to measure air pressure. In some embodiments, the electronic device 300 calculates the altitude from barometric pressure values measured by the air pressure sensor 380C, to assist in positioning and navigation.
The magnetic sensor 380D includes a Hall sensor. The electronic device 300 may detect the opening and closing of a flip holster using the magnetic sensor 380D. When the electronic device is a foldable electronic device, the magnetic sensor 380D may be used to detect the folding or unfolding of the device, or its folding angle. In some embodiments, when the electronic device 300 is a flip phone, the electronic device 300 may detect the opening and closing of the flip cover according to the magnetic sensor 380D, and then set features such as automatic unlocking upon flip opening according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 380E may detect the magnitude of acceleration of the electronic device 300 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 300 is stationary. It can also be used to recognize the posture of the electronic device and is applied in landscape/portrait switching, pedometers, and other applications.
A distance sensor 380F is used for measuring distance. The electronic device 300 may measure the distance by infrared or laser. In some embodiments, the electronic device 300 may use the distance sensor 380F to measure distance so as to achieve fast focusing.
The proximity light sensor 380G may include, for example, a light emitting diode and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 300 emits infrared light outward through the light emitting diode. The electronic device 300 uses a photodiode to detect infrared reflected light from nearby objects. When the intensity of the detected reflected light is greater than the threshold, it may be determined that an object is near the electronic device 300. When the intensity of the detected reflected light is less than the threshold, the electronic device 300 may determine that no object is near the electronic device 300. The electronic device 300 can detect that the user holds the electronic device 300 close to the ear by using the proximity light sensor 380G, so as to automatically extinguish the screen to achieve the purpose of saving power. The proximity light sensor 380G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 380L may be used to sense ambient light level. The electronic device 300 may adaptively adjust the brightness of the display screen 394 based on the perceived ambient light level. The ambient light sensor 380L may also be used to automatically adjust white balance during photographing. The ambient light sensor 380L may also cooperate with the proximity light sensor 380G to detect whether the electronic device 300 is occluded, e.g., the electronic device is in a pocket. When the electronic equipment is detected to be blocked or in the pocket, part of functions (such as touch control functions) can be in a disabled state so as to prevent misoperation.
The fingerprint sensor 380H is used to collect a fingerprint. The electronic device 300 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 380J is used to detect temperature. In some embodiments, the electronic device 300 executes a temperature handling strategy using the temperature detected by the temperature sensor 380J. For example, when the temperature detected by the temperature sensor 380J exceeds a threshold, the electronic device 300 reduces the performance of the processor in order to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 300 heats the battery 342 when the temperature detected by the temperature sensor 380J is below another threshold. In other embodiments, the electronic device 300 may boost the output voltage of the battery 342 when the temperature is below a further threshold.
Touch sensor 380K is also known as a "touch device". The touch sensor 380K may be disposed on the display screen 394, and the touch sensor 380K and the display screen 394 form a touch screen, also referred to as a "touchscreen". The touch sensor 380K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 394. In other embodiments, the touch sensor 380K may also be located on a surface of the electronic device 300 other than where the display screen 394 is located.
The bone conduction sensor 380M may acquire a vibration signal. In some embodiments, the bone conduction sensor 380M may acquire a vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 380M may also contact the pulse of the human body to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 380M may also be provided in a headset to form a bone conduction headset. The audio module 370 may parse out a voice signal based on the vibration signal of the vocal-part vibrating bone obtained by the bone conduction sensor 380M, to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 380M, to implement a heart rate detection function.
The keys 390 may include a power key, a volume key, and the like. The keys 390 may be mechanical keys or touch keys. The electronic device 300 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 300.
The motor 391 may generate a vibration alert. The motor 391 may be used for incoming-call vibration alerts as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 394 may also correspond to different vibration feedback effects of the motor 391. Different application scenarios (such as time reminders, message receiving, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may further support customization.
The indicator 392 may be an indicator light, and may be used to indicate a charging status and a change in battery level, or to indicate a message, a missed call, a notification, and the like.
The SIM card interface 395 may be a hardware module for interfacing with a SIM card. A SIM card may be inserted into the SIM card interface 395 or removed from the SIM card interface 395 to make contact with or separate from the electronic device 300. The electronic device 300 may support one or more SIM card interfaces. The SIM card interface 395 may support a Nano SIM card, a Micro SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 395 simultaneously, and the types of the cards may be the same or different. The SIM card interface 395 may also be compatible with different types of SIM cards and with memory cards. The electronic device 300 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 300 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 300 and cannot be separated from the electronic device 300.
It should be noted that the hardware modules included in the terminal device shown in fig. 3 are described only by way of example and do not limit the specific structure of the terminal device.
Referring now to fig. 4, fig. 4 illustrates a block diagram of an input device 400 provided in an embodiment of this application. The input device 400 includes a processing unit 410, a storage unit 420, a communication unit 430, and an input unit 440. The processing unit 410 may be a central processing unit (CPU) or a microcontroller (MCU), among others. The storage unit 420 may include DDR SDRAM (Double Data Rate Synchronous Dynamic Random Access Memory) or non-volatile memory, such as flash memory. The storage unit of the input device may be used to store information about connected hosts, illustratively the device identifiers of the connected hosts, and may also be used to record and maintain the current focus host of the input device. The communication unit may include a Bluetooth module for performing Bluetooth communication with a terminal device; alternatively, the communication unit may include a 2.4G Wi-Fi chip for communicating with a terminal device based on a 2.4G network protocol; alternatively, the communication unit may be a USB port for communicating with a terminal device over a wired USB channel. It should be noted that the modules included in the input device are not specifically limited in this application and may be determined as the case requires.
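As an illustration of the state the storage unit may maintain, the following is a minimal sketch and not taken from the application: it assumes the connected hosts are tracked by hypothetical string device identifiers and that the first host to connect becomes the initial focus host.

```python
# Minimal sketch (illustrative only) of the host bookkeeping an input device
# with multi-connection capability might keep in its storage unit.
from dataclasses import dataclass, field
from typing import Optional, Set


@dataclass
class InputDeviceState:
    connected_hosts: Set[str] = field(default_factory=set)  # device identifiers of connected terminals
    focus_host: Optional[str] = None                         # identifier of the current focus host

    def add_host(self, device_id: str) -> None:
        self.connected_hosts.add(device_id)
        if self.focus_host is None:  # assumption: first connected host becomes the initial focus
            self.focus_host = device_id

    def switch_focus(self, device_id: str) -> None:
        if device_id in self.connected_hosts:
            self.focus_host = device_id


state = InputDeviceState()
state.add_host("terminal-210")   # hypothetical identifiers for the two terminal devices
state.add_host("terminal-220")
state.switch_focus("terminal-220")
```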
Further, in the embodiments provided in this application, when the first input device is a mouse, the mouse may have a multi-connection function. A mouse with the multi-connection function can be connected to a plurality of terminal devices simultaneously in a wired or wireless manner. As introduced above, the wireless connection may include a Bluetooth connection, a 2.4G Wi-Fi connection, and the like, and the wired connection may include a USB connection and the like. When the mouse has the connection capability of any two or more of these wireless and wired connections, the mouse has the multi-connection function. For example, when the Bluetooth module of the mouse supports the Bluetooth 5.0 communication protocol, that is, supports a multi-connection protocol stack, the mouse may establish Bluetooth channels with a plurality of terminal devices simultaneously to realize multiple connections. For another example, when the mouse is provided with both a Bluetooth module and a 2.4G Wi-Fi chip, communication channels can be established simultaneously with a plurality of terminal devices through Bluetooth and Wi-Fi respectively, so as to realize multiple connections. For another example, when the mouse has a Bluetooth module and also has a USB port, communication channels can be established simultaneously with a plurality of terminal devices through Bluetooth and the wired connection respectively. The configuration for realizing multiple connections is not particularly limited in this application and may be determined as the case requires.
For example, when the input device has the multi-connection function, a hardware switch may be provided on the input device to turn the multi-connection function on or off. Alternatively, the user may turn the multi-connection function of the input device on or off through a UI interface, as shown in fig. 5.
It will be appreciated that this application takes only two terminal devices as an example; the number of terminal devices may be increased if the connection capability of the input device allows, which is not particularly limited in this application. In the application scenario shown in fig. 2, the first input device 240 establishes connections with the terminal devices 210 and 220 at the same time, and each connection may be wireless or wired. The second input device 230 establishes a connection with the terminal device 210 only, and this connection may be wireless or wired.
A device input method provided in an embodiment of this application is described below with reference to fig. 2 and 6, taking the first input device 240 as a mouse and the second input device 230 as a keyboard as an example.
Step S601: the first input device establishes communication connections with a plurality of terminal devices.
The first input device 240 turns on the multi-connection function in response to an operation on the multi-connection function switch, or in response to the user turning on the multi-connection function through the UI interface. For example, the first input device 240 may originally have established only a wired connection with the second terminal device 220; in response to the multi-connection function being turned on, the first input device 240 searches for nearby terminal devices and also establishes a communication connection with the first terminal device 210. For another example, the first input device 240 may originally have established only a Bluetooth connection with the first terminal device 210; in response to the multi-connection function being turned on, the first input device 240 may switch Bluetooth channels and send a Bluetooth discovery and pairing broadcast to discover connectable terminal devices nearby, so that the first input device 240 maintains Bluetooth communication connections with both the first terminal device 210 and the second terminal device 220.
Note that the timing at which the first input device establishes connections with the other terminal devices is not particularly limited. The connections with the other terminal devices may be established immediately when the multi-connection function is turned on, or only when there is a need to switch terminal devices, for example, when the mouse cursor slides out of the screen of the first terminal device.
Step S602: the second input device establishes a communication connection with the first terminal device.
For example, the second input device 230 establishes a connection with the first terminal device 210, which may be a wired connection or a wireless connection. In the embodiments provided in this application, no specific requirement is imposed on whether the second input device has the multi-connection function.
Step S603: when the focus of the first input device is switched from the first terminal device to the second terminal device, a virtual input device is established in the second terminal device, the virtual input device corresponding to the second input device.
After establishing a connection with the first input device, the first terminal device and the second terminal device each monitor and record the state of the first input device, and may provide an interface through which other terminal devices can query that state. For example, after establishing a connection with the first input device, the first terminal device may save the device ID of the first input device and record the current connection state of the first input device as connected. Further, in the embodiments provided in this application, when the first input device is a mouse, the first terminal device may record the device identifier of the current focus device of the mouse, or may directly record whether the current focus of the mouse is the first terminal device.
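As an illustration, the record a terminal device keeps for a connected input device might look like the following sketch; the field names and identifier strings are hypothetical and not taken from the application.

```python
# Illustrative record kept by the first terminal device for the connected mouse.
input_device_record = {
    "device_id": "mouse-240",           # hypothetical identifier of the first input device
    "connection_state": "connected",    # current connection state
    "focus_device_id": "terminal-210",  # identifier of the mouse's current focus device,
                                        # or a boolean "focus is local" flag could be stored instead
}
```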
Since the first input device has established connections with the first terminal device and the second terminal device respectively in step S601, the first input device can be switched freely between the first terminal device and the second terminal device. When the mouse cursor slides out of the screen from an edge of the first terminal device toward another surrounding terminal device, the focus of the mouse is switched from the first terminal device to that terminal device. Illustratively, given the position layout of the first terminal device 210 and the second terminal device 220 shown in fig. 2, when the mouse cursor continues to slide rightward past the right edge of the screen of the first terminal device 210, the cursor appears at the left edge of the screen of the second terminal device 220, thereby switching the mouse focus.
The first terminal device may monitor the displacement of the mouse cursor; when the cursor slides to the edge of the screen, the first terminal device notifies the mouse to switch focus to the second terminal device, for example by sending a focus migration instruction to the mouse. After the mouse switches focus to the second terminal device, the mouse sends a broadcast message to all currently connected terminal devices, announcing that the current focus device of the mouse is the second terminal device.
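The edge check and the subsequent broadcast might be structured as in the sketch below; it is illustrative only, and the message format and the send_to_mouse and send_to_host helpers are assumptions rather than part of the application.

```python
# Illustrative sketch of the focus migration flow: the first terminal device
# detects that the cursor reached the screen edge, tells the mouse to migrate
# focus, and the mouse broadcasts its new focus host to every connected host.
from typing import Dict, List


def send_to_mouse(message: Dict[str, str]) -> None:            # hypothetical transport helper
    print("to mouse:", message)


def send_to_host(host: str, message: Dict[str, str]) -> None:  # hypothetical transport helper
    print(f"to {host}:", message)


def on_cursor_moved(x: int, screen_width: int) -> None:
    # Runs on the first terminal device.
    if x >= screen_width - 1:
        send_to_mouse({"type": "focus_migration", "target": "terminal-220"})


def on_focus_migration(target: str, connected_hosts: List[str]) -> None:
    # Runs on the mouse: after switching focus, broadcast the new focus host.
    for host in connected_hosts:
        send_to_host(host, {"type": "focus_host", "device_id": target})


on_cursor_moved(x=1919, screen_width=1920)
on_focus_migration("terminal-220", ["terminal-210", "terminal-220"])
```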
When the focus of the first input device has been switched to the second terminal device, the user's needs have shifted from the first terminal device to the second terminal device. Therefore, in addition to switching the focus of the first input device, it must be ensured that the input operations of the second input device can also be responded to on the second terminal device. When the focus of the first input device (for example, a mouse) is switched from the first terminal device to the second terminal device, after the second terminal device receives the broadcast sent by the mouse indicating the host that is currently the focus of the mouse, a virtual input device corresponding to the second input device is created in the second terminal device so that the second terminal device can respond. The implementation mechanism of the virtual keyboard device differs across terminal systems. For example, when the second terminal device is an Android or Linux device, the virtual device can be implemented by creating an input device in the system; when the second terminal device is a Windows device, it can be implemented by creating an HID device through the HID framework.
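For the Android/Linux case only, a virtual keyboard can be created through the kernel's uinput facility; the sketch below assumes the python-evdev package is available and that the process may open /dev/uinput, and the device name is made up. The Windows HID-framework path mentioned above is not shown.

```python
# Illustrative sketch: create a virtual keyboard on a Linux/Android system via
# uinput, so that injected key events look like input from a real keyboard.
from evdev import UInput, ecodes


def create_virtual_keyboard() -> UInput:
    # Declare the key codes the virtual keyboard is allowed to emit
    # (here: the main keyboard block, KEY_ESC .. KEY_KPDOT).
    capabilities = {ecodes.EV_KEY: list(range(ecodes.KEY_ESC, ecodes.KEY_KPDOT + 1))}
    return UInput(capabilities, name="virtual-keyboard-for-second-input-device")
```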
Step S604: The first terminal device responds to an input operation of the second input device by acquiring first input information of the second input device and sending second input information to the first input device according to the first input information.
For example, when the second input device is a keyboard and the user presses a key on the keyboard, the first terminal device receives the key code (first input information) transmitted by the keyboard, converts it into a key code (second input information) matched with the first terminal device, and then transmits the converted key code to the first input device.
Alternatively, when the second input device is a keyboard, after the user presses a key on the keyboard, the keyboard transmits the key code to the underlying driver of the first terminal device, and the first terminal device then transmits the key code directly to the first input device without conversion. In this case, the first input information and the second input information are the same information.
In one embodiment, the input information includes a keyboard key code and a system type of the first terminal device, the system type of the first terminal device being the first system.
In one embodiment, the input information includes only the keyboard key code. In this embodiment, the system types of both sides can be confirmed in advance when the mouse establishes connections with the two terminal devices, so the system type need not be included in the input information.
Optionally, the input information may further include a device identifier of the first terminal device.
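Putting the preceding embodiments together, the second input information could be serialized as in the sketch below; JSON and the field names are assumptions made for illustration only.

```python
# Illustrative serialization of the second input information sent from the
# first terminal device to the first input device.
import json
from typing import Optional


def build_second_input_info(key_code: int,
                            system_type: Optional[str] = None,
                            device_id: Optional[str] = None) -> bytes:
    payload: dict = {"key_code": key_code}
    if system_type is not None:   # e.g. "windows"; may be omitted if the system
        payload["system_type"] = system_type  # types were exchanged at connection time
    if device_id is not None:     # optional device identifier of the first terminal device
        payload["device_id"] = device_id
    return json.dumps(payload).encode()


build_second_input_info(0x57, system_type="windows", device_id="terminal-210")
```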
Step S605: after the first input device receives the second input information, the first input device forwards the second input information to the second terminal device.
Although the focus host has been switched, the first input device maintains its connections with both the first terminal device and the second terminal device, and can therefore receive the input information sent by the first terminal device and forward it to the second terminal device.
Step S606: the second terminal device responds to the input operation in step S604 according to the received input information.
The second terminal device converts the second input information into input information matched with the second system of the second terminal device, and the converted input information is injected into the second system through the virtual input device.
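Continuing the Linux/Android sketch above (python-evdev assumed), the conversion and injection could look as follows; the single-entry mapping table is illustrative, and in this sketch the conversion target is the evdev key code understood by a virtual uinput device such as the one sketched earlier.

```python
# Illustrative sketch: map the received key code to the code understood by the
# second system's input stack and inject a press/release through the virtual
# input device.
from evdev import UInput, ecodes

WINDOWS_VK_TO_EVDEV = {0x57: ecodes.KEY_W}  # illustrative one-entry mapping table


def inject_key(ui: UInput, windows_key_code: int) -> None:
    code = WINDOWS_VK_TO_EVDEV[windows_key_code]
    ui.write(ecodes.EV_KEY, code, 1)  # key down
    ui.write(ecodes.EV_KEY, code, 0)  # key up
    ui.syn()                          # flush the events to the input subsystem
```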
In general, when an input device establishes a communication connection with a terminal device, the terminal device maintains the information of that input device and manages it, and the connected input device, together with its type or operating state, can be seen in the terminal device's system. For example, as shown in fig. 7, when devices such as a mouse and a keyboard are connected to a Windows device, the information of these input devices can be viewed in the system's Device Manager, and both the mouse and the keyboard are displayed as HID devices. In the embodiments provided in this application, the second input device is not connected to the second terminal device, so in order to inject the input operations of the second input device into the second system of the second terminal device, a virtual input device needs to be created in the second system so that the second terminal device can respond to the input operations.
Step S607: When the focus of the first input device switches back from the second terminal device to the first terminal device, the second terminal device deletes the virtual input device established in step S603.
For example, when the mouse cursor slides out from the left edge of the screen of the second terminal device, the second terminal device notifies the mouse to switch focus to the first terminal device. After the mouse switches focus to the first terminal device, it broadcasts to all currently connected terminal devices that the current focus device of the mouse is the first terminal device. When the second terminal device perceives that the focus of the mouse has been switched away, it may delete the virtual input device previously established in step S603.
In the following, a specific example is used to describe the method provided in the embodiments of this application, where the first terminal device 210 runs a Windows system (the first system), the second terminal device 220 runs an Android system (the second system), the first input device 240 is a mouse, and the second input device 230 is a keyboard. The first input device is communicatively connected to both the first terminal device and the second terminal device, so the mouse can switch focus freely between the two terminal devices. The second input device is communicatively connected only to the first terminal device. For the communication connection manners, refer to the foregoing description; details are not repeated here.
The mouse cursor slides from the screen of the first terminal device 210 to the screen of the second terminal device 220, and a click operation is performed in a text input box of the second terminal device, thereby realizing the focus switch. When the mouse focus is switched to the second terminal device 220, a virtual keyboard device corresponding to the keyboard is established on the second terminal device 220. When the user presses the key W on the keyboard 230, the first terminal device receives the key code transmitted by the keyboard and generates the key code 0x57 used in the Windows system. The first terminal device 210 then sends the key code 0x57 and the identifier of the first system to the mouse, where the identifier of the first system indicates that the first system is a Windows system. The mouse sends the received key code and the identifier of the first system to the second terminal device; the second terminal device determines, according to the identifier of the first system and the key code 0x57, that the corresponding key code under the Android system is 51, and then injects that key code into the second system through the virtual keyboard device, so the letter W appears in the input box of the second terminal device.
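The conversion in this concrete example can be read as a lookup in a per-system mapping table, as the minimal sketch below shows; the table contains only the single entry from the example (Windows virtual-key code 0x57 for W mapping to Android key code 51), and the dictionary layout is an assumption for illustration.

```python
# Illustrative cross-system key code lookup for the W-key example above.
WINDOWS_TO_ANDROID_KEYCODE = {0x57: 51}  # 'W': Windows virtual-key 0x57 -> Android 51


def to_android_keycode(windows_vk: int) -> int:
    return WINDOWS_TO_ANDROID_KEYCODE[windows_vk]


assert to_android_keycode(0x57) == 51
```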
When the mouse is slid again from the screen of the second terminal device 220 to the screen of the first terminal device, the focus of the mouse is switched from the second terminal device to the first terminal device. Optionally, the second terminal device deletes the virtual keyboard device established before.
In the embodiments provided in this application, when the focus of the mouse is switched from the first terminal device to the second terminal device, the first terminal device, to which the keyboard is connected, intercepts the keyboard input and forwards the input information to the second terminal device through the mouse. After the virtual keyboard device is established on the second terminal device, the input information can be injected into the system of the second terminal device through the virtual keyboard device so that the second terminal device can respond. In the method provided in the embodiments of this application, the first terminal device and the second terminal device do not need to be connected to each other, so no complex networking environment is required. In addition, the method places no special requirements on the first input device: only an input device capable of connecting to the terminal devices is needed, which reduces the cost of use for the user.
Another device input method provided in an embodiment of this application is described below with reference to fig. 8. The first input device has the multi-connection capability, that is, the first input device can establish connections with the first terminal device and the second terminal device at the same time. The first terminal device runs a first system and the second terminal device runs a second system.
Step S801: the first input device 240 establishes a connection with the first terminal device 210, the second terminal device 220.
As described above, the first input device 240 may establish a connection with the first terminal device 210 and the second terminal device 220, respectively, in a wireless or wired manner. After the connection is established, the initial focus of the first input device 240 is on the first terminal device.
Step S802: the second input device 230 establishes a connection with the first terminal device 210.
Step S803: The first terminal device sends indication information to the first input device, where the indication information is used to indicate that the first terminal device is currently connected to the second input device.
Step S804: The first input device forwards the indication information to the second terminal device, so that the second terminal device performs a corresponding operation according to the indication information, for example, the operation in step S806.
It should be noted that steps S803 and S804 may be performed before the first input device switches focus (that is, before step S805), after the focus is switched, or when the first terminal device receives the input information sent by the second input device (for example, at step S807). This is not limited in this application.
Step S805: The first input device switches the focus device from the first terminal device to the second terminal device and sends a broadcast message to all currently connected terminal devices.
For the focus switching procedure, refer to step S603 above. After the focus is switched, the mouse sends a broadcast message to all currently connected hosts to synchronize the information about the current focus host of the first input device.
Step S806: the second terminal device establishes a virtual device corresponding to the second input device.
The second terminal device has previously received the indication information, that is, the second terminal device has determined that the second input device has established a communication connection with the first terminal device. Therefore, when the first input device performs the focus switch, the second terminal device establishes a virtual device corresponding to the second input device.
Step S807: the second input device receives user operation and sends first input information to the first terminal device.
When the second input device is a keyboard, the user presses a key on the keyboard, and the keyboard transmits a key code to the driver of the first terminal device.
Step S808: The first terminal device receives the first input information sent by the second input device and determines whether the current focus device of the first input device is the first terminal device. If the current focus of the first input device is the first terminal device, step S809 is performed; if the current focus of the first input device is not the first terminal device but the second terminal device, step S809 is skipped and step S810 is performed directly. When switching focus, the mouse sends a broadcast to all the terminal devices informing them which terminal device is currently the focus device of the mouse, so the first terminal device can perceive whether the current focus device of the mouse is itself.
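The branch in step S808 amounts to a small dispatch on the recorded focus device, as in the sketch below; the handler names and the way focus is tracked are assumptions for illustration, not taken from the application.

```python
# Illustrative dispatch on the first terminal device: respond locally if the
# mouse's focus is here, otherwise forward the input toward the mouse (S810).
def handle_local_input(key_code: int) -> None:              # hypothetical S809 handler
    print("respond locally to key", key_code)


def forward_to_first_input_device(key_code: int) -> None:   # hypothetical S810 handler
    print("forward key", key_code, "to the mouse")


def on_first_input_info(key_code: int, mouse_focus_device: str, local_device: str) -> None:
    if mouse_focus_device == local_device:        # focus is on the first terminal device
        handle_local_input(key_code)              # step S809
    else:                                         # focus is on the second terminal device
        forward_to_first_input_device(key_code)   # step S810


on_first_input_info(0x57, mouse_focus_device="terminal-220", local_device="terminal-210")
```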
Step S809: The first terminal device responds to the input information sent by the second input device and executes a corresponding operation.
Step S810: the first terminal device sends second input information to the first input device according to the first input information.
In one implementation, the second input information is the first input information. Illustratively, when the second input device is a keyboard, the first terminal device directly forwards the key code transmitted by the keyboard to the first input device.
In another implementation, the second input information is generated from the first input information. When the second input device is a keyboard, the first terminal device converts the key code transmitted by the keyboard into the key code under the first system, and then transmits the converted key code to the first input device.
In addition, the second input information may also carry a system identifier or the device identifier of the first terminal device, which is used to identify the system run by the first terminal device. In one case, after the mouse connects to the plurality of terminal devices, each of the terminal devices acquires the identifiers of the surrounding terminal devices and the systems they run. Thus, the system run by the first terminal device may be identified through the device identifier, or directly through the system identifier.
It should be noted that if the second input information is directly the first input information, the second input information may omit the device identifier or the system identifier, and the second terminal device may respond to the operation directly according to the second input information.
Step S811: the first input device transmits the second input information to the second terminal device.
Step S812: the second terminal device responds according to the second input information.
In one implementation, when the second input device is a keyboard and the second input information is the first input information, the second terminal device may determine the key code under the second system directly from the key code that the keyboard sent to the first terminal device, and the second system then executes the operation corresponding to that key code.
In another implementation, the second input information is a key code under the first system, and the second terminal device converts it into the corresponding key code under the second system so that the second system executes the corresponding operation.
In yet another implementation, when the first system and the second system are the same system, the second system does not need to perform a conversion operation.
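These cases can be summarized as in the sketch below; the mapping table with its single Windows-to-Android entry and the parameter names are illustrative assumptions.

```python
# Illustrative conversion on the second terminal device in step S812.
from typing import Optional

CROSS_SYSTEM_MAP = {("windows", "android"): {0x57: 51}}  # single illustrative entry


def to_second_system_keycode(key_code: int, source_system: Optional[str],
                             second_system: str) -> int:
    if source_system is None or source_system == second_system:
        return key_code  # raw forwarding, or identical systems: no conversion needed
    return CROSS_SYSTEM_MAP[(source_system, second_system)][key_code]


assert to_second_system_keycode(0x57, "windows", "android") == 51
assert to_second_system_keycode(51, "android", "android") == 51
```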
Step S813: When the focus of the first input device is switched from the second terminal device to another terminal device, the second terminal device deletes the virtual device established in step S806.
When the second input device receives an input operation from the user again, step S807 is performed again, and steps S808 to S812 continue to be performed accordingly.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware or in software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in an electronic device. Alternatively, the processor and the storage medium may reside as discrete components in an input device.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the division into the above functional modules is illustrated by example. In practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above.

Claims (18)

1. An input method of a device, applied to a communication system, the communication system including a first input device, a second input device, a first terminal device, and a second terminal device, the first input device being communicatively connected to the first terminal device and the second terminal device, the second input device being communicatively connected to the first terminal device, the method comprising:
the first input device responds to a first operation, namely, an operation acting on the first input device, and switches the focus from the first terminal device to the second terminal device;
the second input device responds to a second operation, namely, an operation acting on the second input device, and sends first input information to the first terminal device;
the first terminal device sends second input information to the first input device, the second input information being generated according to the received first input information;
the first input device sends the second input information to a second terminal device;
and the second terminal device responds to the second operation according to the second input information.
2. The method of claim 1, wherein after the first input device responds to the first operation, the method further comprises:
the first input device sends a first broadcast message to the first terminal device and the second terminal device, wherein the first broadcast message is used to indicate that the current focus device of the first input device is the second terminal device.
3. The method according to claim 1 or 2, wherein the first input information is the same as or different from the second input information.
4. A method according to any one of claims 1-3, wherein the first input device is a mouse and the second input device is a keyboard.
5. The method of claim 4, wherein the first terminal device operates a first system, and the second input information includes a key code and a system identifier, wherein the key code is a key code under the first system, and the system identifier is used to indicate the first system.
6. The method according to any of claims 1-5, wherein when the focus of the first input device is switched to the second terminal device, the second terminal device creates a virtual device, the virtual device corresponding to the second input device.
7. The method of claim 6, wherein the second terminal device operates a second system, and wherein the second terminal device responding to the second operation according to the second input information comprises:
the second terminal device transmits the second input information to the second system through the virtual device, so that the second terminal device responds to the second operation.
8. The method of claim 6, wherein the second terminal device operates a second system, and wherein the second terminal device responding to the second operation according to the second input information comprises:
the second terminal device converts the second input information into input information under the second system;
and the second terminal device sends the converted input information to the second system through the virtual device, so that the second terminal device responds to the second operation.
9. The method according to any one of claims 1-8, further comprising:
the first input device switches the focus from the second terminal device to the first terminal device in response to a third operation, the third operation being an operation acting on the first input device;
the first input device sends a second broadcast message to the first terminal device and the second terminal device, wherein the second broadcast message is used to indicate that the current focus device of the first input device is the first terminal device.
10. The method according to claim 9, wherein the method further comprises:
and deleting the virtual device by the second terminal device when the focus device of the first input device is not the second terminal device.
11. A device input method applied to a first input device, the first input device being communicatively connected to a first terminal device and a second terminal device, the method comprising:
the first input device receives a focus migration instruction sent by the first terminal device, and switches the focus device of the first input device from the first terminal device to the second terminal device;
the first input device receives second input information sent by the first terminal device, wherein the second input information is generated by the first terminal device according to the first input information sent by the second input device, and the second input device is connected with the first terminal device;
the first input device sends the second input information to the second terminal device so that the second terminal device responds to the second input information.
12. The method of claim 11, wherein after receiving the focus migration instruction, the first input device sends a first broadcast message to the first terminal device and the second terminal device, wherein the first broadcast message is used to indicate that the current focus device of the first input device is the second terminal device.
13. The method according to claim 11 or 12, wherein the first input information is the same as or different from the second input information.
14. The method of any of claims 11-13, wherein the first input device is a mouse.
15. An input device having the capability to connect to a plurality of terminal devices simultaneously, the input device comprising:
a communication interface for performing radio signal transmission and reception;
a memory for storing computer program instructions;
a processor for executing the computer program instructions to support the input device to implement the method of any of claims 1-10.
16. A communication system, comprising: a first terminal device, a second terminal device, and a first input device; the communication system being adapted to implement the method of any of claims 1-10.
17. A computer readable storage medium, having stored thereon computer program instructions which, when executed by a processing circuit, implement the method of any of claims 11-14.
18. A chip system having the capability of providing a plurality of MAC addresses, the chip system comprising processing circuitry, a storage medium having stored therein computer program instructions; the computer program instructions, when executed by the processing circuitry, implement the method of any of claims 11-14.

