CN115665313A - Device control method and electronic device


Info

Publication number: CN115665313A
Application number: CN202110779144.1A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 范亚军, 阚彬, 吉伟, 杨洋
Original Assignee: Huawei Technologies Co Ltd
Current Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Legal status: Pending
Prior art keywords: electronic device, touch point, user, instruction, target operation


Abstract

The application provides a device control method and an electronic device, which address the difficulty of operating an electronic device in scenarios where the user cannot touch it, thereby bringing convenience to the user. The method is applied to a system consisting of a first electronic device and at least one second electronic device, where the first electronic device is provided with a touch panel and the second electronic device is provided with a display screen. The second electronic device receives, from the first electronic device, a first instruction generated according to a first user operation, the first user operation being an operation performed by the user on the touch panel. The first instruction instructs the second electronic device to control a touch point to perform a target operation, where the target operation includes any one or more of: displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long-pressing. The touch point is displayed on the display screen of the second electronic device, and in response to the first instruction the second electronic device controls the touch point to perform the target operation.

Description

Device control method and electronic device
Technical Field
The present application relates to the field of electronic devices, and in particular, to a device control method and an electronic device.
Background
With the popularization of electronic devices, users operate them in increasingly diverse scenarios, anytime and anywhere, for example to answer phone calls, play music, or watch videos. However, in some scenarios the user cannot touch the electronic device, for example when the device is some distance away or is inconvenient to hold. If an operation then requires touching the device, the operation becomes difficult, which is inconvenient for the user.
Disclosure of Invention
The embodiments of the present application provide a device control method and an electronic device, which address the difficulty of operating an electronic device in scenarios where the user cannot touch it, thereby bringing convenience to the user.
To achieve this, the present application adopts the following technical solutions:
In a first aspect, an embodiment of the present application provides a device control method applied to a system formed by a first electronic device and at least one second electronic device, where the first electronic device is provided with a touch panel and the second electronic device is provided with a display screen. The method includes the following steps. The second electronic device receives a first instruction from the first electronic device, where the first instruction is generated according to a first user operation, and the first user operation is an operation performed by the user on the touch panel. The first instruction instructs the second electronic device to control a touch point to perform a target operation, where the target operation includes any one or more of: displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long-pressing. The touch point is displayed on the display screen of the second electronic device; the position of the touch point cannot move when the touch point is in the fixed state and can move when the touch point is in the released state. In response to the first instruction, the second electronic device controls the touch point to perform the target operation.
Based on this technical solution, the second electronic device is a screened electronic device. When the user cannot touch the screened electronic device, the touch point on it can be controlled through another electronic device, so that the screened electronic device itself is controlled, various complex operations can be performed on it, and convenience is brought to the user.
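For illustration only, the Kotlin sketch below models one conceivable shape for the first instruction and for the way the second electronic device could act on it. The enum values, field and class names, and the input-injection placeholders are assumptions introduced here for clarity; the patent does not prescribe any concrete format or API.

```kotlin
// Hypothetical sketch only: the patent does not define a concrete message format or API.
// Target operations that the second (screened) electronic device can perform on its touch point.
enum class TargetOperation {
    SHOW_FIXED,      // display the touch point in a fixed (non-movable) state
    SHOW_RELEASED,   // display the touch point in a released (movable) state
    MOVE, CLICK, DOUBLE_CLICK, SLIDE, LONG_PRESS
}

// One possible shape for the "first instruction" sent by the first electronic device.
data class FirstInstruction(
    val targetOperation: TargetOperation,
    val distancePx: Int? = null      // only meaningful for MOVE / SLIDE
)

// On the second electronic device: react to a received first instruction.
class TouchPointController(var x: Int = 0, var y: Int = 0, var fixed: Boolean = false) {

    fun handle(instr: FirstInstruction) {
        when (instr.targetOperation) {
            TargetOperation.SHOW_FIXED    -> fixed = true
            TargetOperation.SHOW_RELEASED -> fixed = false
            TargetOperation.MOVE          -> if (!fixed) x += (instr.distancePx ?: 0)
            TargetOperation.CLICK         -> injectTap(x, y, taps = 1)
            TargetOperation.DOUBLE_CLICK  -> injectTap(x, y, taps = 2)
            TargetOperation.SLIDE         -> injectSwipe(x, y, instr.distancePx ?: 0)
            TargetOperation.LONG_PRESS    -> injectLongPress(x, y)
        }
    }

    // Placeholders for the platform-specific input injection the patent leaves unspecified.
    private fun injectTap(x: Int, y: Int, taps: Int) { /* inject taps at (x, y) */ }
    private fun injectSwipe(x: Int, y: Int, distancePx: Int) { /* inject a swipe from (x, y) */ }
    private fun injectLongPress(x: Int, y: Int) { /* inject a long press at (x, y) */ }
}
```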
In one possible design, the target operation is determined according to a mapping relationship, where the mapping relationship includes a correspondence between the first user operation and the target operation.
In one possible design, the first user operation is a click and the target operation is a click. Or, the first user operation is a double-click and the target operation is a double-click. Or, the first user operation is a double-click and the target operation is to display the touch point in the fixed state. Or, the first user operation is a double-click and the target operation is to display the touch point in the released state. Or, the first user operation is a multi-tap and the target operation is to display the touch point in the fixed state. Or, the first user operation is a multi-tap and the target operation is to display the touch point in the released state. Or, the first user operation is a long press and the target operation is a long press. Or, the first user operation is a long press and the target operation is to display the touch point in the fixed state. Or, the first user operation is a long press and the target operation is to display the touch point in the released state. Or, the first user operation is a slide and the target operation is a slide. Or, the first user operation is a slide and the target operation is a move.
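Purely as an example, such a mapping relationship could be realized on the first electronic device as a lookup table. The pairings below are just one of the combinations enumerated above, and the code reuses the TargetOperation and FirstInstruction types from the earlier sketch; none of the names come from the patent.

```kotlin
// Hypothetical gestures recognized on the touch panel of the first electronic device.
enum class UserOperation { CLICK, DOUBLE_CLICK, MULTI_CLICK, LONG_PRESS, SLIDE }

// One possible mapping relationship; any of the correspondences listed above could be chosen.
val mappingRelationship: Map<UserOperation, TargetOperation> = mapOf(
    UserOperation.CLICK        to TargetOperation.CLICK,
    UserOperation.DOUBLE_CLICK to TargetOperation.SHOW_FIXED,
    UserOperation.MULTI_CLICK  to TargetOperation.SHOW_RELEASED,
    UserOperation.LONG_PRESS   to TargetOperation.LONG_PRESS,
    UserOperation.SLIDE        to TargetOperation.MOVE
)

// The first electronic device turns a recognized gesture into a first instruction via the table.
fun toInstruction(op: UserOperation, distancePx: Int? = null): FirstInstruction? =
    mappingRelationship[op]?.let { FirstInstruction(it, distancePx) }
```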
In one possible design, the first instruction includes a first distance, the target operation includes a move, and the target operation includes a second distance.
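The text does not state how the second distance is obtained from the first. One natural reading, shown here purely as an assumption, is that the second electronic device derives the on-screen distance from the distance swiped on the touch panel, for example by a configurable scale factor:

```kotlin
// Assumption only: the second distance is derived from the first by a scale factor, so a
// short swipe on the small touch panel maps to a larger move of the touch point on the
// screen. Neither the factor nor this formula appears in the patent text.
fun secondDistance(firstDistancePx: Int, scale: Float = 4.0f): Int =
    (firstDistancePx * scale).toInt()
```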
In one possible design, before the second electronic device receives the first instruction from the first electronic device, the method further includes: the second electronic device receives a second instruction from the first electronic device, where the second instruction instructs the second electronic device to display the touch point; in response to the second instruction, the second electronic device displays the touch point on the display screen. Based on this design, the second electronic device can first be made to display the touch point so that the touch point can be operated subsequently; in this way the second electronic device can be controlled, which brings convenience to the user.
In one possible design, before the second electronic device receives the first instruction from the first electronic device, the method further includes: the second electronic device starts a touch point control mode.
In a second aspect, an embodiment of the present application provides a device control method applied to a system formed by a first electronic device and at least one second electronic device, where the first electronic device is provided with a touch panel and the second electronic device is provided with a display screen. The method includes the following steps. The first electronic device receives a first user operation, which includes an operation performed by the user on the touch panel. The first electronic device generates a first instruction according to the first user operation and sends the first instruction to the second electronic device, where the first instruction instructs the second electronic device to control a touch point to perform a target operation, and the target operation includes any one or more of: displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long-pressing. The touch point is displayed on the display screen of the second electronic device; the position of the touch point cannot move when the touch point is in the fixed state and can move when the touch point is in the released state.
In one possible design, the target operation is determined according to a mapping relationship, where the mapping relationship includes a correspondence between the first user operation and the target operation.
In one possible design, the first user operation is a click and the target operation is a click. Or, the first user operation is a double-click and the target operation is a double-click. Or, the first user operation is a double-click and the target operation is to display the touch point in the fixed state. Or, the first user operation is a double-click and the target operation is to display the touch point in the released state. Or, the first user operation is a multi-tap and the target operation is to display the touch point in the fixed state. Or, the first user operation is a multi-tap and the target operation is to display the touch point in the released state. Or, the first user operation is a long press and the target operation is a long press. Or, the first user operation is a long press and the target operation is to display the touch point in the fixed state. Or, the first user operation is a long press and the target operation is to display the touch point in the released state. Or, the first user operation is a slide and the target operation is a slide. Or, the first user operation is a slide and the target operation is a move.
In one possible design, the first instruction includes a first distance, the target operation includes a move, and the target operation includes a second distance.
In one possible design, before the first electronic device sends the first instruction to the second electronic device, the method further includes: the first electronic device receives a second user operation, generates a second instruction according to the second user operation, and sends the second instruction to the second electronic device, where the second instruction instructs the second electronic device to display the touch point.
In one possible design, after the first electronic device sends the first instruction to the second electronic device, the method further includes the following. The first electronic device receives a third user operation, where the third user operation instructs the first electronic device to switch the second electronic device being controlled, and the third user operation includes an operation performed by the user on the touch panel. In response to the third user operation, the first electronic device generates a third instruction according to the third user operation and sends the third instruction to a third electronic device, where the third instruction instructs the third electronic device to control a touch point to perform a target operation, and the target operation includes any one or more of: displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long-pressing. The touch point is displayed on the display screen of the third electronic device, and the third electronic device belongs to the at least one second electronic device. Based on this design, when a user uses multiple screened electronic devices, each of them can be controlled by switching which device's touch point is being controlled, achieving seamless switching of control among the multiple devices and bringing convenience to the user.
In one possible design, the third electronic device is determined by the first electronic device according to a sequence table, where the sequence table records the order in which the at least one second electronic device accessed the distributed network. Based on this design, the first electronic device can control the multiple electronic devices respectively according to the sequence table.
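As an illustration only, the first electronic device could keep the screened devices in the order in which they joined the distributed network and advance to the next one when the switch gesture is received. The class and method names and the round-robin policy below are assumptions, not part of the patent.

```kotlin
// Hypothetical sequence table: screened devices in the order in which they joined the
// distributed network.
class DeviceSwitcher(private val sequenceTable: MutableList<String> = mutableListOf()) {
    private var current = 0

    // Called when a second electronic device accesses the distributed network.
    fun onDeviceJoined(deviceId: String) {
        sequenceTable.add(deviceId)
    }

    // Called when the third user operation (the "switch device" gesture) is recognized.
    // Returns the device that should receive subsequent instructions; advancing to the
    // next entry in join order is only one possible policy.
    fun switchTarget(): String? {
        if (sequenceTable.isEmpty()) return null
        current = (current + 1) % sequenceTable.size
        return sequenceTable[current]
    }
}
```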
In one possible design, the first electronic device is a bluetooth headset.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device is a second electronic device, and the second electronic device includes a display screen, one or more processors, a memory, and a communication interface. The memory, the display screen, and the communication interface are coupled with the processor. The memory stores one or more computer programs, the one or more computer programs comprising instructions which, when executed by the second electronic device, cause the second electronic device to perform the device control method of any of the above first aspect and its designs.
In a fourth aspect, an embodiment of the present application provides an electronic device, which is a first electronic device that includes a touch panel, one or more processors, a memory, and a communication module. The memory, the touch panel, and the communication module are coupled with the processor. The memory stores one or more computer programs, the one or more computer programs comprising instructions which, when executed by the first electronic device, cause the first electronic device to perform the device control method of any of the above second aspect and its designs.
In a fifth aspect, an embodiment of the present application provides an apparatus control system, where the system includes the above first electronic apparatus and second electronic apparatus, and the first electronic apparatus and the second electronic apparatus may execute the apparatus control method according to any one of the first aspect to the second aspect through interaction.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are executed on an electronic device, the instructions cause the electronic device to perform the device control method according to any one of the first aspect and the second aspect.
In a seventh aspect, an embodiment of the present application provides a computer program product including instructions, which, when run on an electronic device, causes the electronic device to execute the device control method according to any one of the first aspect and the second aspect.
In an eighth aspect, an embodiment of the present application provides a chip system, which includes at least one processor and at least one interface circuit, where the at least one interface circuit is configured to perform a transceiving function and send an instruction to the at least one processor, and when the at least one processor executes the instruction, the at least one processor performs the device control method according to any one of the first aspect and the second aspect.
It should be noted that, for technical effects brought by any design in the second aspect to the eighth aspect, reference may be made to technical effects brought by corresponding designs in the first aspect, and details are not described here.
Drawings
FIG. 1 is a schematic structural diagram of a conventional external device for implementing human-computer interaction with a screened electronic device;
fig. 2 is a schematic view of a scenario that a first electronic device communicates with another electronic device according to an embodiment of the present application;
fig. 3 is a schematic view of a scenario that a first electronic device communicates with multiple other electronic devices according to an embodiment of the present application;
fig. 4a is a schematic structural diagram of a distributed network according to an embodiment of the present application;
FIG. 4b is a schematic diagram of a communication architecture including a distributed soft bus according to an embodiment of the present application;
FIG. 4c is a schematic diagram illustrating communication between devices via a distributed soft bus according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a bluetooth headset according to an embodiment of the present application;
fig. 6 is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present disclosure;
fig. 7 is a block diagram of a software structure of a mobile phone according to an embodiment of the present application;
fig. 8 is a schematic view of an interface in which a touch point on a mobile phone screen moves upward according to an embodiment of the present disclosure;
fig. 9 is a schematic view of an interface in which a touch point on a mobile phone screen moves to the left according to an embodiment of the present disclosure;
fig. 10a is a schematic view of a relevant interface of a bluetooth headset controlling a mobile phone to start a touch point control mode according to an embodiment of the present application;
fig. 10b is a schematic view of an interface related to a mode for controlling a touch point of a mobile phone to be turned on by a bluetooth headset according to an embodiment of the present application;
fig. 10c is a schematic view of a relevant interface of a mobile phone in a touch point control mode according to an embodiment of the present disclosure;
fig. 11 is a schematic view of a related interface of a bluetooth headset controlling a mobile phone to start a touch point control mode according to an embodiment of the present application;
FIG. 12a is a schematic view of an interface associated with an initial position of a selected touch point according to an embodiment of the present application;
FIG. 12b is a schematic diagram of an interface associated with an initial state of a selected touch point according to an embodiment of the present disclosure;
fig. 13a is a schematic view of a relevant interface for controlling a touch point on a mobile phone to implement interface switching of the mobile phone by using a bluetooth headset according to an embodiment of the present application;
fig. 13b is a schematic view of a relevant interface for controlling a mobile phone to switch a touch point display state by using a bluetooth headset according to an embodiment of the present application;
fig. 14a is a schematic view of another interface related to controlling a mobile phone to switch a touch point display state by a bluetooth headset according to an embodiment of the present application;
fig. 14b is a schematic view of an interface related to a bluetooth headset controlling the touch point on a mobile phone to move according to an embodiment of the present application;
fig. 15a is a schematic view of a relevant interface for controlling a touch point on a mobile phone to play a video through a bluetooth headset according to an embodiment of the present application;
fig. 15b is a schematic view of another related interface for controlling a touch point on a mobile phone to play a video through a bluetooth headset according to an embodiment of the present application;
fig. 16a is a schematic diagram of a bluetooth headset, a mobile phone, and a tablet computer accessing to the same distributed network, where the bluetooth headset controls the mobile phone to start a touch point control mode according to an embodiment of the present application;
fig. 16b is a schematic diagram illustrating that the bluetooth headset, the mobile phone, and the tablet computer access the same distributed network, and the bluetooth headset controls the tablet computer to start the touch point control mode according to the embodiment of the present application;
fig. 16c is a schematic diagram of a bluetooth headset, a mobile phone, and a tablet computer accessing to the same distributed network, where the bluetooth headset is switched from controlling a touch point on the mobile phone to controlling a touch point on the tablet computer according to an embodiment of the present disclosure;
fig. 16d is a schematic diagram of the bluetooth headset, the mobile phone, and the tablet computer accessing to the same distributed network, where the bluetooth headset controls the mobile phone and the tablet computer to start the touch point control mode simultaneously according to the embodiment of the present application;
FIG. 17 is a schematic diagram of a related interface for setting priority of touch point events according to an embodiment of the present disclosure;
fig. 18 is a schematic flowchart of an apparatus control method according to an embodiment of the present application;
fig. 19 is a schematic flowchart of another apparatus control method provided in the embodiment of the present application;
fig. 20 is a schematic flowchart of another apparatus control method provided in the embodiment of the present application;
fig. 21 is a schematic view of a related interface for prompting a user to enter a touch point control mode according to an embodiment of the present application;
fig. 22 is a schematic flowchart of another apparatus control method provided in the embodiment of the present application;
fig. 23 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 24 is a schematic structural diagram of another electronic device provided in the embodiment of the present application;
fig. 25 is a schematic structural diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
The following describes an apparatus control method and an electronic apparatus provided in an embodiment of the present application in detail with reference to the accompanying drawings.
The terms "comprising" and "having," and any variations thereof, as referred to in the description of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the present application, unless otherwise indicated, the meaning of "plurality" means two or more, and the meaning of "multi-tap" means two or more taps, for example: three clicks, four clicks, etc. "and/or" herein is merely an association relationship describing an associated object, and means that there may be three relationships, for example, a and/or B, and may mean: a exists alone, A and B exist simultaneously, and B exists alone. The terms "first", "second", and the like do not necessarily limit the number and execution order, and the terms "first", "second", and the like do not necessarily limit the difference.
At present, with the popularization of electronic devices, especially screened electronic devices, users operate screened electronic devices in increasingly diverse scenarios. A user can interact with a screened electronic device by touching it, and can also interact with it through an external device, for example the headset shown in fig. 1.
Fig. 1 is a schematic structural diagram of an existing external device for implementing man-machine interaction with a screen-equipped electronic device.
The user can interact with the screened electronic device through the interactive earphone cable shown in fig. 1 by touching a special position 101 on the cable, which differs from controlling the screened electronic device through keys on the earphone cable. The interactive earphone cable is woven from a special fabric with touch metal wires embedded in it; the metal wires act as conductors that sense the voltage of a human finger and convert it into a corresponding operation. For example, when a finger touches a certain part of the metal wire at the special position 101, the screened electronic device starts playing music, and the volume can be controlled by touching other parts of the special position 101.
This scheme relies on the earphone cord to control the screened electronic device, so it is difficult to extend to external devices without a physical cord, such as Bluetooth headsets. In addition, the scheme uses point-to-point instruction transmission, that is, each action on the earphone cord corresponds directly to one operation of the screened electronic device; the number of possible combinations, and therefore of supported operations, is limited. The interaction also has a high learning cost for users: when many touch positions on the interactive earphone cable are each assigned an effect, users find them hard to memorize and misoperation easily occurs. Finally, the scheme cannot make the screened electronic device perform continuous or complex operations; it can only trigger basic operations such as pause, play, and volume adjustment.
In addition, human-computer interaction with a screened electronic device can currently also be achieved through a Bluetooth mouse, but a Bluetooth mouse is inconvenient to carry and is generally suitable only for home and office use, so its application scenarios are limited.
Therefore, to solve the above technical problem, an embodiment of the present application provides a device control method applied to a system composed of a first electronic device and at least one second electronic device, where the first electronic device is provided with a touch panel. The first electronic device receives a first user operation, which includes an operation performed by the user on the touch panel, generates a first instruction according to the first user operation, and then sends the first instruction to the second electronic device. The first instruction instructs the second electronic device to control a touch point to perform a target operation, where the target operation includes any one or more of: displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long-pressing. The touch point is displayed on the display screen of the second electronic device; the position of the touch point cannot move when the touch point is in the fixed state and can move when the touch point is in the released state. Since the second electronic device is a screened electronic device, when the user cannot touch the screened electronic device, the touch point on it can be controlled through another electronic device, so that the screened electronic device itself is controlled, various complex operations can be performed on it, and convenience is brought to the user.
The method provided by the embodiments of the present application can be applied to a scenario in which the first electronic device communicates with at least one other electronic device (for example, the second electronic device, a third electronic device, and so on). Exemplarily, fig. 2 shows a scenario in which the first electronic device communicates with one other electronic device, and fig. 3 shows a scenario in which the first electronic device communicates with multiple other electronic devices. For example, the first electronic device may establish a communication connection with another electronic device through Bluetooth, near field communication (NFC), ZigBee, a Wi-Fi network, a distributed soft bus, or the like; the present application does not limit the manner in which the first electronic device establishes a communication connection with another electronic device.
Alternatively, the first electronic device may be a wearable device (e.g., a bluetooth headset), a smart home device, and the like, and the other electronic devices (e.g., the second electronic device, the third electronic device) may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a Personal Digital Assistant (PDA), an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Artificial Intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, and the specific type of the first electronic device and the other electronic devices is not particularly limited in the embodiments of the present application.
Exemplarily, as shown in fig. 4a, a schematic architecture diagram of a distributed network provided in the embodiment of the present application is shown.
The distributed network architecture shown in fig. 4a includes a security authentication module 401, a distributed soft bus 402, a distributed data management module 403, a distributed task scheduling module 404, and the like.
The security authentication module 401 is used for identity authentication of various electronic devices, including but not limited to fingerprint authentication, password authentication, and the like, to ensure communication security of the distributed network.
The distributed soft bus module 402 is a unified base for various electronic devices (mobile phones, tablets, intelligent wearable devices, smart screens, Bluetooth headsets, in-vehicle devices, etc.). It provides unified distributed communication capability for interconnection and intercommunication among the devices, can quickly discover and connect devices, and efficiently distributes tasks and transmits data. Based on the distributed soft bus module 402, an electronic device can complete distributed services such as device virtualization, cross-device service invocation, multi-screen collaboration, and file sharing.
The distributed data management module 403 implements distributed management of application data and user data based on the capabilities of the distributed soft bus. User data is no longer bound to a single physical device, and service logic is separated from data storage.
The distributed task scheduling module 404 constructs a unified distributed service management (discovery, synchronization, registration, scheduling) mechanism based on the technical characteristics of a distributed soft bus, distributed data management, and the like, supports operations such as remote start, remote call, remote connection, migration, and the like on applications across devices, and can select a proper device to run a distributed task according to the capabilities, positions, service running states, resource use conditions, and habits and intentions of users of different devices.
Illustratively, as shown in fig. 4b, a schematic diagram of a communication architecture including a distributed soft bus is provided for an embodiment of the present application.
The distributed soft bus shown in fig. 4b comprises a bus hub 405, a bus 406, a security module 407, and the like. The bus hub 405 is responsible for parsing commands and performing inter-device discovery and connection. The bus 406 includes a task bus and a data bus for transmitting various forms of data such as service messages, bytes, files, and streams. The security module 407 is used for encryption and decryption of communication data.
The basic communication module includes a protocol stack layer 409 and a software/hardware protocol layer 410, which are used to mask the protocol differences of various devices, for example: Bluetooth, Bluetooth Low Energy (BLE), hybrid fiber coaxial (HFC), etc.
With the distributed soft bus technology, devices are discovered automatically, giving the user a zero-wait self-discovery experience: nearby devices under the same account are discovered automatically without waiting, and are connected automatically and securely. A device accessing the distributed soft bus acts either as a discovery end or as a discovered end. The discovery end is generally the device requesting to use a service, also called the master control device, and the discovered end is the device publishing a service. At present, devices interconnected through a distributed soft bus require the discovery end and the discovered end to be in the same local area network.
Illustratively, as shown in fig. 4c, a schematic diagram of communication between devices through a distributed soft bus is provided for the embodiment of the present application.
The discovered end first accesses the distributed network, for example by means of an account number and a password; it can be understood that the distributed network may be a local area network. The discovered end then starts the distributed soft bus and publishes a service to it, informing other devices that they may request to use the service, that is, that they may establish a communication connection with it to transfer data. Then, after the discovery end accesses the distributed network, it becomes known to the network devices and transmits data to the distributed soft bus, for example in the form of a constrained application protocol (CoAP) broadcast. Finally, after the discovered end receives the data transmitted by the discovery end, it can accept and respond to the data.
Illustratively, the discovery end is a Bluetooth headset and the discovered end is a mobile phone. The mobile phone accesses the distributed network by entering the account number and password of the distributed network. The Bluetooth headset then also accesses the distributed network. Optionally, the Bluetooth headset may access the distributed network through the mobile phone. For example, the Bluetooth headset first establishes a Bluetooth connection with the mobile phone; after the mobile phone has accessed the distributed network, it can forward the identity information of the Bluetooth headset to a server of the distributed network to complete identity authentication of the Bluetooth headset, and the Bluetooth headset can access the distributed network once authenticated. The Bluetooth connection between the Bluetooth headset and the mobile phone can then be disconnected, and subsequent data between them is transmitted through the distributed soft bus in the distributed network. The Bluetooth headset can then send instruction information to the mobile phone through the distributed soft bus to control the touch point on the mobile phone and thereby control the mobile phone. The instruction information may include an identity of the mobile phone so that the mobile phone can receive and respond to it. Optionally, the identity of the mobile phone may take the form of a token, which may be assigned to the mobile phone by the server after the mobile phone accesses the distributed network; this is not limited in the present application.
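Sketched below, purely for illustration, is how the instruction information carrying the phone's token might look on the headset and phone sides. The field names, the SoftBus interface, and the transport are assumptions and do not correspond to any real distributed-soft-bus SDK; FirstInstruction, UserOperation, toInstruction, and TouchPointController are the types from the earlier sketches.

```kotlin
// Hypothetical sketch of the headset-to-phone message described above.
data class SoftBusMessage(
    val targetToken: String,        // identity assigned to the phone by the distributed network
    val instruction: FirstInstruction
)

interface SoftBus {                 // stand-in for whatever distributed soft bus API is used
    fun broadcast(message: SoftBusMessage)
}

// Headset side: wrap a recognized gesture in a message addressed to the phone's token.
class HeadsetSender(private val bus: SoftBus, private val phoneToken: String) {
    fun sendGesture(op: UserOperation, distancePx: Int? = null) {
        toInstruction(op, distancePx)?.let { bus.broadcast(SoftBusMessage(phoneToken, it)) }
    }
}

// Phone side: only react to messages that carry its own token.
fun onSoftBusMessage(msg: SoftBusMessage, myToken: String, controller: TouchPointController) {
    if (msg.targetToken == myToken) controller.handle(msg.instruction)
}
```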
For example, taking the first electronic device as a bluetooth headset as an example, fig. 5 shows a schematic structural diagram of the bluetooth headset.
As shown in fig. 5, the Bluetooth headset includes a touch panel 501, keys 502, an earpiece 503, a speaker 504, and the like. Optionally, the Bluetooth headset may further include an indicator light 505.
The touch panel 501, which may also be referred to as a touch screen, is used for receiving user operation information and consists of a sensor, a controller, and software. The sensor receives information input by the user on the touch panel; the controller analyzes the information sensed by the sensor, determines the position of the touch, generates an analog signal, and converts it into a digital signal so that other modules (such as a Bluetooth module) can receive it. The software part acts as a protocol layer between the controller and the other modules so that those modules can receive and recognize the digital signal for subsequent processing. In some embodiments of the present application, the touch panel is configured to receive user operation information such as a click, double-click, multi-tap, slide, or long press.
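As a rough sketch only, the controller and software parts could classify raw touch-panel events into the user operations listed above. The TouchEvent shape and the thresholds are invented for illustration (the patent specifies neither), and UserOperation is the enum from the earlier sketch.

```kotlin
// Illustrative classification of raw touch-panel events into user operations.
data class TouchEvent(val downMs: Long, val upMs: Long, val movedPx: Int)

fun classify(event: TouchEvent, tapsWithinWindow: Int): UserOperation = when {
    event.movedPx > 20               -> UserOperation.SLIDE        // finger travelled across the panel
    event.upMs - event.downMs > 500  -> UserOperation.LONG_PRESS   // held for more than 500 ms
    tapsWithinWindow >= 3            -> UserOperation.MULTI_CLICK  // three or more quick taps
    tapsWithinWindow == 2            -> UserOperation.DOUBLE_CLICK
    else                             -> UserOperation.CLICK
}
```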
The keys 502 may be mechanical keys or touch keys. The Bluetooth headset may receive key input and generate key signal input related to user settings and function control of the Bluetooth headset. In some embodiments of the present application, the keys 502 may be used for powering on and off, adjusting volume, turning the touch point control mode on and/or off, and the like.
The speaker 504 is disposed inside the Bluetooth headset and is used for receiving an audio signal and converting it into a sound signal; the earpiece 503 is close to the user's ear and transmits the sound signal to the ear so that the user can hear it.
The indicator light 505 may be a light-emitting diode (LED). In some embodiments, the LED indicator can be used to indicate the charging status, on-off status, Bluetooth pairing status, and the like of the Bluetooth headset. In some embodiments of the present application, the LED indicator can be used to indicate whether the touch point control mode is on and/or off.
Exemplarily, taking other electronic devices as a mobile phone as an example, fig. 6 shows a schematic structural diagram of the mobile phone.
The mobile phone may include a processor 610, an external memory interface 620, an internal memory 621, a Universal Serial Bus (USB) interface 630, a charging management module 640, a power management module 641, a battery 642, an antenna 1, an antenna 2, a mobile communication module 650, a wireless communication module 660, an audio module 670, a sensor module 680, a button 690, a motor 691, an indicator 692, a camera 693, a display 694, and a Subscriber Identity Module (SIM) card interface 695, among others.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone. In other embodiments of the present application, the mobile phone may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 610 may include one or more processing units, such as: the processor 610 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 610 for storing instructions and data. In some embodiments, the memory in the processor 610 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 610. If the processor 610 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 610, thereby increasing the efficiency of the system.
The charging management module 640 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 640 may receive charging input from a wired charger via the USB interface 630. In some wireless charging embodiments, the charging management module 640 may receive the wireless charging input through a wireless charging coil of the cell phone. The charging management module 640 may also supply power to the electronic device through the power management module 641 while charging the battery 642.
The power management module 641 is configured to connect the battery 642, the charging management module 640 and the processor 610. The power management module 641 receives the input from the battery 642 and/or the charging management module 640, and supplies power to the processor 610, the internal memory 621, the display 694, the camera 693, the wireless communication module 660, and the like. The power management module 641 may also be configured to monitor battery capacity, battery cycle count, battery state of health (leakage, impedance), and other parameters. In some other embodiments, the power management module 641 may be disposed in the processor 610. In other embodiments, the power management module 641 and the charging management module 640 may be disposed in the same device.
The wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 650, the wireless communication module 660, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 650 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to a cellular phone. The mobile communication module 650 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 650 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 650 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 650 may be disposed in the processor 610. In some embodiments, at least some of the functional blocks of the mobile communication module 650 may be disposed in the same device as at least some of the blocks of the processor 610.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device or displays images or video through the display screen 694. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 610, and may be located in the same device as the mobile communication module 650 or other functional modules.
The wireless communication module 660 may provide solutions for wireless communication applied to the mobile phone, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 660 may be one or more devices integrating at least one communication processing module. The wireless communication module 660 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 610. The wireless communication module 660 may also receive a signal to be sent from the processor 610, perform frequency modulation and amplification on it, and convert it into electromagnetic waves through the antenna 2 for radiation. In some embodiments of the present application, the wireless communication module 660 is configured to communicate with the first electronic device, for example to receive instruction information sent by the first electronic device, such as a first instruction, a second instruction, or a third instruction. In still other embodiments of the present application, the wireless communication module 660 comprises a Bluetooth module for communicating with the first electronic device.
In some embodiments, the antenna 1 of the mobile phone is coupled to the mobile communication module 650 and the antenna 2 is coupled to the wireless communication module 660, so that the mobile phone can communicate with the network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile phone realizes the display function through the GPU, the display screen 694, the application processor and the like. The GPU is a microprocessor for image processing, connected to the display screen 694 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 610 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 694 is used to display images, video, and the like. The display screen 694 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone may include 1 or N display screens 694, where N is a positive integer greater than 1.
The mobile phone can realize the shooting function through the ISP, the camera 693, the video codec, the GPU, the display screen 694, the application processor and the like.
The ISP is used to process the data fed back by the camera 693. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 693.
The camera 693 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset may include 1 or N cameras 693, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the mobile phone selects the frequency point, the digital signal processor is used for performing fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The handset may support one or more video codecs. Thus, the mobile phone can play or record videos in various encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 620 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone. The external memory card communicates with the processor 610 through the external memory interface 620 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 621 may be used to store computer-executable program code, which includes instructions. The internal memory 621 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the mobile phone. In addition, the internal memory 621 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 610 executes the various functional applications and data processing of the mobile phone by running instructions stored in the internal memory 621 and/or instructions stored in a memory provided in the processor.
The handset can implement audio functions through the audio module 670 and the application processor, etc. Such as music playing, recording, etc.
The audio module 670 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 670 may also be used to encode and decode audio signals. In some embodiments, the audio module 670 may be disposed in the processor 610, or some functional modules of the audio module 670 may be disposed in the processor 610.
Sensor module 680 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
The keys 690 include a power-on key, a volume key, and the like. The keys 690 may be mechanical keys. Or may be touch keys. The mobile phone may receive a key input, and generate a key signal input related to user setting and function control of the mobile phone.
The motor 691 may produce a vibration indication. Motor 691 can be used for incoming call vibration prompting, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 691 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 694. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 692 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, a missed call, a notification, etc.
The SIM card interface 695 is used for connecting a SIM card. The SIM card can be attached to and detached from the mobile phone by being inserted into the SIM card interface 695 or being pulled out of the SIM card interface 695. The mobile phone can support 1 or N SIM card interfaces, and N is a positive integer greater than 1. The SIM card interface 695 can support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 695 at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 695 may also be compatible with different types of SIM cards. The SIM interface 695 may also be compatible with an external memory card. The mobile phone realizes functions of communication, data communication and the like through the interaction of the SIM card and the network. In some embodiments, the handset employs eSIM, namely: an embedded SIM card. The eSIM card can be embedded in the mobile phone and cannot be separated from the mobile phone.
The software system of the mobile phone can adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture or a cloud architecture. The embodiment of the invention takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of a mobile phone.
Fig. 7 is a block diagram of a software configuration of a mobile phone according to an embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 7, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 7, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephone manager is used for providing the communication function of the mobile phone, for example, management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify download completion, message alerts, and the like. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, an indicator light flashes, and the like.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functional interfaces that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, an audio driver, a sensor driver, and a Bluetooth driver.
In the following, with reference to a scenario, taking the first electronic device as a bluetooth headset and the second electronic device as a mobile phone as an example, the workflow of the bluetooth headset and the mobile phone is exemplarily described.
Firstly, the Bluetooth headset establishes a communication connection with the mobile phone. For the manner of establishing the connection between the Bluetooth headset and the mobile phone, refer to the above description; details are not repeated herein. After the communication connection is established, when the touch panel of the Bluetooth headset detects a user operation, for example: clicking, sliding, double clicking, long pressing, etc., the operation is reported to the processor, and the processor can generate a corresponding input event according to the operation. The processor then packages the input event into a data packet and transmits the data packet to the Bluetooth module of the Bluetooth headset, and the Bluetooth module sends the data packet to the mobile phone. Optionally, identification information may be included in the input event. The identification information is used to indicate the type of the input event, i.e., whether the input event is a click event, a slide event, a long-press event, etc. Optionally, when the input event is a slide event, the input event may further include a displacement distance, where the displacement distance is used to indicate the distance by which the user's finger slides on the touch panel of the Bluetooth headset, that is, the relative position by which the user's finger moves on the touch panel. Optionally, the input event may further include a displacement direction, where the displacement direction is used to indicate the direction in which the user's finger slides on the touch panel of the Bluetooth headset.
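For illustration only, the following Kotlin sketch shows one possible way of packaging such an input event on the headset side. The event codes, the packet layout and all names used here are assumptions introduced for this sketch and are not defined by this application.

```kotlin
import java.nio.ByteBuffer

// Hypothetical event types reported by the touch panel of the Bluetooth headset.
enum class EventType(val code: Byte) { CLICK(1), DOUBLE_CLICK(2), SLIDE(3), LONG_PRESS(4) }

// Hypothetical direction values for a slide event.
enum class SlideDirection(val code: Byte) { NONE(0), UP(1), DOWN(2), LEFT(3), RIGHT(4) }

// An input event carrying the identification information and, for a slide event,
// the displacement distance (in millimeters) and the displacement direction.
data class InputEvent(
    val type: EventType,
    val slideDistanceMm: Float = 0f,
    val direction: SlideDirection = SlideDirection.NONE
)

// Packages the input event into a byte array that the Bluetooth module can transmit.
fun packInputEvent(event: InputEvent): ByteArray =
    ByteBuffer.allocate(6)
        .put(event.type.code)            // identification information: type of the input event
        .put(event.direction.code)       // displacement direction (meaningful for slide events only)
        .putFloat(event.slideDistanceMm) // displacement distance on the touch panel
        .array()

fun main() {
    // A slide of 3 mm to the left detected on the touch panel is packaged for sending.
    val packet = packInputEvent(InputEvent(EventType.SLIDE, 3.0f, SlideDirection.LEFT))
    println(packet.joinToString(" ") { "%02x".format(it) })
}
```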
After the Bluetooth module of the mobile phone receives the data packet sent by the Bluetooth headset, it calls the Bluetooth driver of the kernel layer to parse the data packet and generate an input event; the processor 610 then calculates the coordinates of the touch point according to the information contained in the input event and the mapping table, sends the coordinates to the display screen 694, and the touch point is displayed through the display screen 694.
The mapping table includes a mapping relationship between an operation of a user on the touch panel of the bluetooth headset and an operation performed on a touch point by the mobile phone, and may be, for example, as shown in table 1.
TABLE 1
State of touch point | Operation of the user on the touch panel of the Bluetooth headset | Operation performed on the touch point by the mobile phone
Fixed state | Single-click | Single-click
Fixed state | Double-click | Display the touch point in a released state
Fixed state | Slide | Slide
Released state | Double-click | Display the touch point in a fixed state
Released state | Slide x millimeters | Move by (2*x)% of pixels in the sliding direction
When the touch point is in the fixed state, the position of the touch point cannot be moved. When the touch point is in the released state, the position of the touch point can be moved, and the moving direction may be arbitrary, for example: moving up and down, moving left and right, etc., which is not limited in this application. The mobile phone can determine the type of operation that currently needs to be performed on the touch point according to the identification information in the input event, the state of the touch point, and the mapping table.
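As an illustration of this lookup, the following Kotlin sketch encodes the mappings that are described in the examples of this application as a simple table keyed by the state of the touch point and the type of the input event. The enum and function names are hypothetical, and entries not described in the text (for example, the long-press mapping) are omitted.

```kotlin
// Possible display states of the touch point on the mobile phone.
enum class TouchPointState { FIXED, RELEASED }

// Operations the mobile phone may perform on the touch point.
enum class TargetOperation { CLICK, SLIDE, MOVE, SHOW_FIXED, SHOW_RELEASED }

// Identification information carried in the input event.
enum class HeadsetOperation { CLICK, DOUBLE_CLICK, SLIDE, LONG_PRESS }

// A mapping table in the spirit of Table 1: the target operation depends on both
// the operation on the touch panel and the current state of the touch point.
val mappingTable: Map<Pair<TouchPointState, HeadsetOperation>, TargetOperation> = mapOf(
    (TouchPointState.FIXED to HeadsetOperation.CLICK) to TargetOperation.CLICK,
    (TouchPointState.FIXED to HeadsetOperation.DOUBLE_CLICK) to TargetOperation.SHOW_RELEASED,
    (TouchPointState.FIXED to HeadsetOperation.SLIDE) to TargetOperation.SLIDE,
    (TouchPointState.RELEASED to HeadsetOperation.DOUBLE_CLICK) to TargetOperation.SHOW_FIXED,
    (TouchPointState.RELEASED to HeadsetOperation.SLIDE) to TargetOperation.MOVE
)

// Determines the operation to perform on the touch point from the identification
// information in the input event, the state of the touch point and the mapping table.
fun resolveTargetOperation(state: TouchPointState, op: HeadsetOperation): TargetOperation? =
    mappingTable[state to op]

fun main() {
    // A double-click while the touch point is fixed releases the touch point.
    println(resolveTargetOperation(TouchPointState.FIXED, HeadsetOperation.DOUBLE_CLICK))
}
```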
For example: assuming that the identification information in the input event indicates that the input event is a single-click operation, and the mobile phone determines that the touch point on the current display screen is in a fixed state, the mobile phone queries the mapping table, determines that the single-click operation in the bluetooth headset corresponds to the single-click operation in the mobile phone, and then the mobile phone performs the single-click operation on the touch point on the current display screen.
Another example is: assuming that the identification information in the input event indicates that the input event is a double-click event, and the mobile phone determines that the touch point on the current display screen is in a fixed state, the mobile phone queries the mapping table, determines that the double-click operation in the bluetooth headset corresponds to the release operation in the mobile phone, and then the mobile phone performs the release operation on the touch point on the current display screen, that is, the touch point is converted from the fixed state to the release state.
Another example is: assume that the identification information in the input event indicates that the input event is a swipe operation, and the swipe distance included in the input event is 3mm and the swipe direction is upward, and the mobile phone determines that the touch point on the current display screen is in the release state. The mobile phone inquires the mapping table, determines that the sliding operation in the bluetooth headset corresponds to the moving operation, and determines that the direction of the moving operation performed on the touch point is upward according to the sliding direction, and then determines the position of the touch point which needs to be moved upward currently. The distance to be moved is then calculated: according to the mapping table, it can be determined that the sliding distance in the bluetooth headset is x millimeters, the corresponding moving distance is (2*x)% of pixels (pixel), and then the distance of the touch point which needs to be moved upwards at present is 6% of pixels. Assuming that the screen size of the mobile phone is 2340 × 1080 pixels, that is, the height of the screen is 2340 pixels and the width is 1080 pixels, the distance required to move the touch point is 2340 × 6% =140.4 pixels. As shown in fig. 8, assuming that the upper left corner of the mobile phone screen is the origin of coordinates, the downward direction is the positive y-axis direction, and the right direction is the positive x-axis direction, and the current coordinates of the touch point are (x 1, y 1) in units of pixels, the coordinates of the touch point after movement are (x 1, y 1-140.4).
Another example is: assume that the identification information in the input event indicates that the input event is a sliding operation, the sliding distance included in the input event is 1mm and the sliding direction is to the left, and the mobile phone determines that the touch point on the current display screen is in the release state. The mobile phone inquires the mapping table, determines that the sliding operation in the bluetooth headset corresponds to the moving operation, and determines that the direction of the moving operation performed on the touch point is also leftward according to the sliding direction. The distance to be moved is then calculated: assuming that the size of the screen of the mobile phone is 1920 × 1080 pixels, the distance for moving the touch point at this time is 1080 × 2% =21.6 pixels. The handset moves the touch point on the current display screen 21.6 pixels to the left. As shown in fig. 9, taking the upper left corner of the mobile phone screen as the origin of coordinates, downward as the positive y-axis direction, and rightward as the positive x-axis direction, and taking the current coordinates of the touch point as (x 1, y 1) as the unit of pixel as an example, the coordinates of the touch point after movement are (x 1-21.6, y 1).
Illustratively, the mapping table may also be as shown in table 2.
TABLE 2
Operation of the user on the touch panel of the Bluetooth headset | Operation performed on the touch point by the mobile phone
Single-click | Single-click
Slide x millimeters | Move by (2*x)% of pixels in the sliding direction
As shown in table 2, the states of the touch points are no longer distinguished, and the mobile phone directly determines the operation type that needs to be currently executed on the touch point according to the identification information in the input event and the mapping table, without considering the states of the touch points.
For example: if the identification information in the input event indicates that the input event is a single-click operation, the mobile phone queries the mapping table, determines that the single-click operation in the Bluetooth headset corresponds to the single-click operation in the mobile phone, and then executes the single-click operation on the touch point on the current display screen. This is not exemplified here.
It should be noted that the mapping relationship between the sliding displacement on the Bluetooth headset and the distance by which the mobile phone moves the touch point is only an exemplary description; when the sliding displacement on the Bluetooth headset is x millimeters, the distance by which the mobile phone moves the touch point may also be (3*x)% of pixels, (5*x)% of pixels, and the like, which is not limited in this application. In addition, the distance may be expressed in units other than pixels, for example: inches, etc., which is not limited in this application.
In addition, the distance by which the mobile phone moves the touch point is expressed as a relative distance, for example: a distance of (2*x)% of pixels means (2*x)% relative to the size of the phone's own screen. Because different mobile phones may have different screen sizes, using relative distances can meet the requirements of mobile phones with different screen sizes.
It should be understood that the above mapping relationship is only an exemplary illustration, and in actual design, a developer can design the mapping relationship according to actual requirements. For example: the single-click operation in the bluetooth headset may correspond to a double-click operation of the mobile phone on the touch point, the long-press operation in the bluetooth headset may correspond to a single-click operation of the mobile phone on the touch point, and the like, which is not limited in this application.
Optionally, the mapping relationship may be stored in a buffer of the bluetooth headset, and then the bluetooth headset sends the mapping relationship to the mobile phone. For example, the mapping relationship may be sent at the time when the bluetooth headset and the mobile phone initially establish a connection, or may be sent at the time when the bluetooth headset and the mobile phone subsequently communicate, for example: and sending the input data packet. The mapping relation can be sent only once or many times, and the application does not limit the sending time and the sending times of the mapping relation at all.
Taking the first electronic device as a bluetooth headset and the second electronic device as a mobile phone as an example, the device control method provided by the embodiment of the present application is elaborated in detail with reference to the accompanying drawings.
First, a communication connection is established between the mobile phone and the bluetooth headset, optionally, a communication connection may be established between the mobile phone and the bluetooth headset through a bluetooth technology, and may also be established through a distributed soft bus, which is not limited in this application.
Then, the mobile phone can turn on the touch point control mode. For example, the mobile phone may turn on the touch point control mode in the manner shown in fig. 10a, 10b, and 10 c.
As shown in fig. 10a, it is assumed that the Bluetooth headset detects that the user performs an operation of turning on the touch point control mode on the touch panel of the Bluetooth headset, for example, the user performs a long-press operation on the touch panel of the Bluetooth headset; the Bluetooth headset then instructs the mobile phone to turn on the touch point control mode. Optionally, the duration of the long-press operation may be greater than or equal to a preset time, the preset time may be 2 seconds, 3 seconds or another value, and the preset time may be set by a developer according to actual requirements, which is not limited in this application. The mobile phone turns on the touch point control mode according to the indication of the Bluetooth headset. Illustratively, after the mobile phone turns on the touch point control mode, the mobile phone jumps from the interface 1000 to the interface 1010. In the interface 1010, a touch point 1001 is displayed.
As shown in fig. 10b, it is assumed that the Bluetooth headset detects that the user performs an operation of turning on the touch point control mode on the key of the Bluetooth headset, for example, the user performs a click operation on the key of the Bluetooth headset; the Bluetooth headset then instructs the mobile phone to turn on the touch point control mode. The mobile phone turns on the touch point control mode according to the instruction of the Bluetooth headset, and the mobile phone jumps from the interface 1000 to the interface 1010. In the interface 1010, a touch point 1001 is displayed.
As shown in fig. 10c, a settings application (APP) is installed in the mobile phone. For example, an icon of the settings APP is displayed in the interface 1000. When the mobile phone detects an operation of the user clicking the icon 1002 of the settings APP, the mobile phone may jump to the interface 1020. When it then detects an operation of the user triggering the touch point interaction module 1003 in the interface 1020, the mobile phone may jump to the interface 1030. When it then detects an operation of the user triggering the touch point opening and closing module 1004 in the interface 1030, the mobile phone may jump to the interface 1040. When it then detects an operation of the user clicking the touch point button 1005 in the interface 1040, the mobile phone turns on the touch point control mode. Optionally, the mobile phone may display a touch point (not shown) in the interface 1040. Optionally, the mobile phone may display the interface 1010 when it detects that the user returns to the mobile phone main interface.
Optionally, the mobile phone may further default to start the touch point control mode, that is, after the bluetooth headset establishes a communication connection with the mobile phone, the mobile phone automatically starts the touch point control mode without inputting an additional instruction.
Optionally, before the interface 1010 is displayed on the mobile phone, assuming that the mobile phone is in the locked-screen state, as in the interface 1100 shown in fig. 11, the Bluetooth headset detects an operation of the user for turning on the touch point control mode and instructs the mobile phone to turn on the touch point control mode. The mobile phone turns on the touch point control mode according to the indication. For example, the mobile phone may jump from the interface 1100 to the interface 1010, that is, the mobile phone may unlock the screen and enter the touch point control mode.
Alternatively, the touch point may be displayed in the center of the screen of the interface 1010 or the interface 1040. Alternatively, the touch point may also be displayed in a fixed state.
At this time, the display position and the display state of the touch point may be referred to as an initial position and an initial state, respectively. Optionally, the initial state of the touch point may also be a release state, which is not limited in this application. Alternatively, the initial position of the touch point may be other positions, such as: the lower right corner, the upper left corner and the like of the mobile phone display screen can also be located on an icon of an APP, optionally, the APP can be an APP commonly used by a user, and the application does not limit the APP.
Optionally, the initial position and the initial state may be sent to the mobile phone by the Bluetooth headset. For example, the initial state and the initial position may be contained in the mapping table and sent to the mobile phone when the Bluetooth headset sends the mapping table to the mobile phone; or they may be sent to the mobile phone by the Bluetooth headset separately, for example, when the mobile phone and the Bluetooth headset initially establish a connection, the Bluetooth headset sends them to the mobile phone. Optionally, the initial position and the initial state of the touch point may be stored in the mobile phone by default.
Optionally, the initial position and the initial state are set on the mobile phone by the user, which is not limited in the present application.
Illustratively, as shown in fig. 12a, an initial position module 1201 is included in the interface 1030, and when detecting that the user triggers the operation of the initial position module 1201, the mobile phone may jump to the interface 1200, and in the interface 1200, the user may select the initial position of the touch point.
Illustratively, as shown in fig. 12b, an initial state module 1202 is included in the interface 1030, and when detecting that the user triggers the operation of the initial state module 1202, the mobile phone may jump to the interface 1210, and in the interface 1210, the user may select an initial state of a touch point, for example: a fixed state and a released state.
Optionally, on the basis of the touch point 1001 in the state shown in the interface 1010, the mobile phone may perform one or more operations of sliding, releasing, clicking, double-clicking, and long-pressing on the touch point on the interface 1010.
For example, as shown in fig. 13a, assuming that the bluetooth headset detects that the user performs a sliding operation on the touch panel of the bluetooth headset, the bluetooth headset instructs the mobile phone to also perform the sliding operation on the touch point on the display screen, and in response to the instruction of the bluetooth headset, the mobile phone performs the sliding operation on the touch point on the display screen. Illustratively, the handset jumps from interface 1010 to interface 1300. By the mode, the switching of the mobile phone interface can be realized.
For example, as shown in fig. 13b, assuming that the Bluetooth headset detects that the user performs a double-click operation on the touch panel of the Bluetooth headset, the Bluetooth headset instructs the mobile phone to perform a release operation on the touch point on the display screen, and in response to the instruction of the Bluetooth headset, the mobile phone performs a release operation on the touch point on the display screen. Illustratively, the touch point 1001 in the mobile phone interface 1010 is converted into the touch point 1301 in the interface 1310. The touch point 1301 on the interface 1310 is displayed in a released state.
Optionally, based on the touch point 1301 in the state shown in the interface 1310, the mobile phone may perform one or more of fixed and moving operations on the touch point on the interface 1310.
For example, as shown in fig. 14a, assuming that the Bluetooth headset detects that the user performs a double-click operation on the touch panel of the Bluetooth headset, the Bluetooth headset instructs the mobile phone to perform a fixing operation on the touch point on the display screen, and in response to the instruction of the Bluetooth headset, the mobile phone performs a fixing operation on the touch point on the display screen. Illustratively, the touch point 1301 in the mobile phone interface 1310 is converted into the touch point 1001 in the interface 1010. The touch point 1001 on the interface 1010 is displayed in a fixed state.
For example, as shown in fig. 14b, assuming that the bluetooth headset detects that the user performs a sliding operation on the touch panel of the bluetooth headset, the bluetooth headset instructs the mobile phone to perform a moving operation on the touch point on the display screen, and in response to the instruction of the bluetooth headset, the mobile phone performs a moving operation on the touch point on the display screen. For example, taking the mapping table shown in table 1 as an example, assuming that the user slides 1mm to the left on the touch panel of the bluetooth headset, the mobile phone moves the position of the touch point by 2% pixels to the left. Illustratively, the cell phone moves touch point 1301 of interface 1310 to the location of touch point 1401 in interface 1400.
For example, as shown in fig. 15a, if the user currently wants to play a video, on the basis of the touch point 1301 in the state of the interface 1310, assuming that the Bluetooth headset detects that the user performs a sliding operation on the touch panel of the Bluetooth headset, the Bluetooth headset instructs the mobile phone to perform a moving operation on the touch point on the display screen, and in response to the instruction of the Bluetooth headset, the mobile phone performs a moving operation on the touch point on the display screen. Also taking the mapping table shown in table 1 as an example, assuming that the user slides 3mm to the left on the touch panel of the Bluetooth headset and then slides 2mm upwards, the mobile phone moves the position of the touch point by 6% of pixels to the left and then by 4% of pixels upwards. Optionally, the user may repeatedly slide on the same area of the touch panel of the Bluetooth headset, for example: repeatedly sliding to the left, sliding upwards, etc., until the touch point is located above the icon 1501 of the video APP. Illustratively, the touch point 1301 in the mobile phone interface 1310 moves to the location of the touch point 1501 in the interface 1500.
Then, the bluetooth headset detects double-click operation of a user on a touch panel of the bluetooth headset, the bluetooth headset instructs the mobile phone to perform fixed operation on a touch point on the display screen, and the mobile phone performs fixed operation on the touch point on the display screen in response to the instruction of the bluetooth headset. Illustratively, the touch point 1501 in the handset interface 1500 transitions to the touch point 1502 in the interface 1510, i.e., the touch point transitions from the released state to the fixed state.
Then, the Bluetooth headset detects a click operation of the user on the touch panel of the Bluetooth headset, the Bluetooth headset instructs the mobile phone to perform the click operation on the touch point on the display screen, and the mobile phone performs the click operation on the touch point on the display screen in response to the instruction of the Bluetooth headset. For example, the mobile phone jumps from the interface 1510 to the interface 1520, that is, the video APP is opened at this time, and the user can select the video he or she wants to watch.
For example, as shown in fig. 15b, in the case where the state of the touch point is not distinguished, optionally, the touch point at this time may be the touch point 1301 shown in the interface 1310, or the touch point 1001 shown in the interface 1010, but neither the touch point 1301 nor the touch point 1001 is distinguished as being in the released state or the fixed state. Taking the touch point in this case as the touch point 1301 shown in the interface 1310 as an example, assuming that the Bluetooth headset detects that the user performs a sliding operation on the touch panel of the Bluetooth headset, the Bluetooth headset instructs the mobile phone to perform a moving operation on the touch point on the display screen, and in response to the instruction of the Bluetooth headset, the mobile phone performs a moving operation on the touch point on the display screen. Taking the mapping table shown in table 2 as an example, assuming that the user slides 3mm to the left on the touch panel of the Bluetooth headset and then slides 2mm upwards, the mobile phone moves the position of the touch point by 6% of pixels to the left and then by 4% of pixels upwards. Optionally, the user may repeatedly slide in the same area of the touch panel of the Bluetooth headset, for example: repeatedly sliding to the left, sliding upwards, etc., until the touch point is located above the icon 1501 of the video APP. Illustratively, the touch point 1301 in the mobile phone interface 1310 moves to the location of the touch point 1501 in the interface 1500.
Then, the Bluetooth headset detects a click operation of the user on the touch panel of the Bluetooth headset, the Bluetooth headset instructs the mobile phone to perform the click operation on the touch point on the display screen, and the mobile phone performs the click operation on the touch point on the display screen in response to the instruction of the Bluetooth headset. Illustratively, the mobile phone jumps from the interface 1500 to the interface 1520, that is, the video APP is opened at this time, and the user can select the video he or she wants to watch.
The above example is described as the first electronic device establishing a communication connection with the second electronic device, and the following example is described as the first electronic device simultaneously establishing a communication connection with the second electronic device and the third electronic device.
For example, taking the first electronic device as a bluetooth headset, the second electronic device as a mobile phone, and the third electronic device as a television as an example, assuming that the mobile phone and the television have already established a communication connection, the mobile phone is projected to the television, and the bluetooth headset, the mobile phone, and the television have all been connected to the same distributed network, at this time, the bluetooth headset may control a touch point on the television, so as to operate the television, for example: controlling the television to play videos, and the like.
Take the first electronic device as a bluetooth headset, the second electronic device as a mobile phone, and the third electronic device as a tablet computer as an example. Optionally, the bluetooth headset, the mobile phone, and the tablet computer are respectively accessed to the distributed network after being authorized and authenticated, and the bluetooth headset, the mobile phone, and the tablet computer may establish communication connection through a distributed soft bus in the distributed network.
Optionally, the Bluetooth headset can only control a touch point on one electronic device (e.g., the mobile phone or the tablet computer) at a time. In this case, the Bluetooth headset may control the touch points on the electronic devices in turn according to a preset sequence table. Optionally, the preset sequence table may order the electronic devices according to the time at which each electronic device accessed the distributed network. For example, assuming that the mobile phone accessed the distributed network first, the Bluetooth headset may first control the touch point on the mobile phone. For example, after the Bluetooth headset, the mobile phone, and the tablet computer access the distributed network, as shown in fig. 16a, it is assumed that the Bluetooth headset detects that the user performs an operation of turning on the touch point control mode on the touch panel of the Bluetooth headset, for example, the user performs a long-press operation on the touch panel of the Bluetooth headset. The Bluetooth headset may instruct the mobile phone to turn on the touch point control mode, for example, the Bluetooth headset issues corresponding instruction information to the distributed soft bus, where the instruction information carries an identity of the mobile phone, for example, the token of the mobile phone. After receiving the instruction information, the mobile phone responds to the instruction, for example, the mobile phone may jump from the interface 1600 to the interface 1610. After the tablet computer receives the instruction information, if the tablet computer detects that the identity included in the instruction information does not belong to the tablet computer, the tablet computer does not respond to the instruction information. Optionally, the Bluetooth headset may learn the identity of the mobile phone from a network device, for example, the sequence table sent by the network device to the Bluetooth headset includes the identity of each device.
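As a sketch of this identity check, the following Kotlin fragment models an instruction that carries the token of the target device, and devices that ignore instructions not addressed to them. The class and field names are hypothetical, and this is not the actual distributed soft bus interface.

```kotlin
// A control instruction published on the distributed soft bus. The instruction carries
// the identity (for example, a token) of the device that should respond to it.
data class ControlInstruction(val targetDeviceToken: String, val action: String)

// Each electronic device compares the identity in the instruction with its own identity
// and only responds when they match; otherwise the instruction is ignored.
class ScreenDevice(private val ownToken: String, private val name: String) {
    fun onInstructionReceived(instruction: ControlInstruction) {
        if (instruction.targetDeviceToken != ownToken) {
            println("$name: identity does not match, instruction ignored")
            return
        }
        println("$name: responding to '${instruction.action}'")
    }
}

fun main() {
    val phone = ScreenDevice(ownToken = "token-phone", name = "mobile phone")
    val tablet = ScreenDevice(ownToken = "token-tablet", name = "tablet computer")

    // The headset instructs the mobile phone to turn on the touch point control mode.
    val instruction = ControlInstruction("token-phone", "open touch point control mode")
    phone.onInstructionReceived(instruction)   // responds
    tablet.onInstructionReceived(instruction)  // ignored
}
```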
Assuming that the tablet computer accessed the distributed network first, the Bluetooth headset may first control the touch point on the tablet computer. For example, after the Bluetooth headset, the mobile phone, and the tablet computer access the distributed network, as shown in fig. 16b, it is assumed that the Bluetooth headset detects that the user performs an operation of turning on the touch point control mode on the touch panel of the Bluetooth headset, for example, the user performs a long-press operation on the touch panel of the Bluetooth headset. The Bluetooth headset may then instruct the tablet computer to turn on the touch point control mode, and the tablet computer may jump from the interface 1620 to the interface 1630.
Illustratively, on the basis of the example shown in fig. 16a, as shown in fig. 16c, it is assumed that the Bluetooth headset detects that the user performs an operation of switching the electronic device on the touch panel of the Bluetooth headset, for example, the user performs a multi-click operation on the touch panel of the Bluetooth headset, such as a triple-click operation or a quadruple-click operation, so that the Bluetooth headset can switch from controlling the touch point on the mobile phone to controlling the touch point on the tablet computer. Illustratively, the mobile phone jumps from the interface 1610 to the interface 1600, and the tablet computer jumps from the interface 1620 to the interface 1630. Optionally, at this time, the Bluetooth headset may switch to control the touch point on the next electronic device according to the preset sequence table.
Based on this design, when a user uses a plurality of electronic devices with screens, the user can control each electronic device by switching control of the touch points among them, so seamless switching control among the plurality of electronic devices can be realized, which brings convenience to the user. In addition, because the terminals are connected through the distributed network, the process of switching the touch point among multiple devices is greatly simplified.
Optionally, the double-click and the multi-click mentioned in the embodiments of this application may have a completion time, for example: the double-click and the multi-click may need to be completed within 1 second. The completion time can be set by a developer according to actual requirements, which is not limited in this application.
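A minimal sketch of such a completion window, assuming a 1-second window and tap counting on the headset side, might look as follows; the timer-free reporting scheme used here (the previous sequence is reported only when a new tap starts a new sequence) is a simplification for illustration, and the names are hypothetical.

```kotlin
// Counts consecutive taps on the touch panel and reports a double-click or multi-click
// only if all taps fall within a completion window (1 second in this sketch).
class TapCounter(private val completionWindowMs: Long = 1_000) {
    private var firstTapTime = 0L
    private var count = 0

    // Call for every tap; when a tap arrives after the window has elapsed, the length
    // of the previous tap sequence is returned and a new sequence is started.
    fun onTap(nowMs: Long): Int? {
        if (count == 0 || nowMs - firstTapTime > completionWindowMs) {
            val finished = if (count > 0) count else null
            firstTapTime = nowMs
            count = 1
            return finished
        }
        count++
        return null
    }
}

fun main() {
    val counter = TapCounter()
    counter.onTap(0)               // first tap
    counter.onTap(200)             // second tap within the window
    counter.onTap(400)             // third tap within the window
    println(counter.onTap(2_000))  // window elapsed: the previous sequence of 3 taps is reported
}
```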
Optionally, the preset sequence table may also be related to the priority of each electronic device, and the priority of each electronic device may be set by a developer.
Optionally, the preset sequence list may be sent to the bluetooth headset by another network device (e.g., a base station), or may be stored in the bluetooth headset by default, which is not limited in this application.
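For illustration, the following Kotlin sketch models a preset sequence table that orders devices by the time they accessed the distributed network (a priority field could be used instead) and cycles to the next device on each switch operation; the data layout and names are assumptions made for this sketch.

```kotlin
// An entry of the preset sequence table: one device that has accessed the distributed network.
data class DeviceEntry(val token: String, val accessTimeMs: Long, val priority: Int = 0)

// Orders devices by the time at which they accessed the distributed network and
// cycles through them whenever a switch operation is detected on the headset.
class SequenceTable(devices: List<DeviceEntry>) {
    private val ordered = devices.sortedBy { it.accessTimeMs }
    private var index = 0

    fun current(): DeviceEntry = ordered[index]

    // Called when a switch operation (e.g. a triple-click) is detected on the headset.
    fun switchToNext(): DeviceEntry {
        index = (index + 1) % ordered.size
        return ordered[index]
    }
}

fun main() {
    val table = SequenceTable(listOf(
        DeviceEntry("token-tablet", accessTimeMs = 2_000),
        DeviceEntry("token-phone", accessTimeMs = 1_000)
    ))
    println(table.current().token)       // token-phone: it accessed the network first
    println(table.switchToNext().token)  // token-tablet
}
```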
Optionally, the bluetooth headset may also control touch points on multiple electronic devices (e.g., a mobile phone and a tablet computer) at a time. For example, as shown in fig. 16d, it is assumed that the bluetooth headset detects that the user performs an operation of turning on the touch point control mode on the touch panel of the bluetooth headset, for example: if the user performs a long press operation on the touch panel of the bluetooth headset, the bluetooth headset may instruct both the mobile phone and the tablet pc to start the touch point control mode, the mobile phone may jump from the interface 1600 to 1610, and the tablet pc may jump from the interface 1620 to 1630.
Optionally, in the case where the Bluetooth headset simultaneously controls the touch points on a plurality of electronic devices, the Bluetooth headset may also cancel the control of the touch point on one of the electronic devices. For example, assume that the Bluetooth headset detects that the user performs an operation of canceling the touch point control mode on the touch panel of the Bluetooth headset, for example, the user performs a long-press operation on the touch panel of the Bluetooth headset; the Bluetooth headset can then turn off the touch point control mode of the mobile phone and/or the tablet computer. For example, if the Bluetooth headset detects that the user performs a long-press operation on the touch panel, the Bluetooth headset can turn off the touch point control mode according to the preset sequence table: assuming that the mobile phone precedes the tablet computer in the preset sequence table, if the device earlier in the sequence is to be turned off, the Bluetooth headset turns off the touch point control mode of the mobile phone; if the device later in the sequence is to be turned off, the Bluetooth headset turns off the touch point control mode of the tablet computer. The developer can design the turn-off rule according to actual requirements, which is not limited in this application.
Optionally, for controlling the touch point on each electronic device by the bluetooth headset, reference may be made to an example where the bluetooth headset establishes a communication connection with the mobile phone, which is not described herein again.
Optionally, an electronic device (e.g., a mobile phone, a tablet computer, etc.) may receive both the touch point control instruction sent by the Bluetooth headset and an instruction input by the user on the screen of the electronic device. Optionally, the electronic device may by default respond to the instruction input by the user on the screen of the electronic device, that is, the priority of the instruction input by the user on the screen of the electronic device is higher than that of the touch point control instruction sent by the Bluetooth headset. Optionally, the user may also set the priority on the electronic device. In the following, "input event" is used to refer to the touch point control instruction sent by the Bluetooth headset, and "touch event" is used to refer to the instruction input by the user on the screen of the electronic device. Illustratively, taking the electronic device as a mobile phone as an example, as shown in fig. 17, when an operation of the user triggering the touch point event priority module 1701 in the interface 1030 is detected, the mobile phone jumps to the interface 1700, and in the interface 1700, the user can set the priorities of the input event and the touch event.
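A minimal sketch of such a priority choice, assuming only these two event sources and a single boolean setting, might look as follows; all names are hypothetical.

```kotlin
// Two sources of commands for the same touch point: an "input event" forwarded by the
// Bluetooth headset and a "touch event" entered by the user on the screen itself.
enum class EventSource { INPUT_EVENT, TOUCH_EVENT }

data class PendingEvent(val source: EventSource, val description: String)

// By default the touch event has the higher priority; the user may invert this in settings.
fun pickEventToHandle(events: List<PendingEvent>, touchEventFirst: Boolean = true): PendingEvent? {
    val preferred = if (touchEventFirst) EventSource.TOUCH_EVENT else EventSource.INPUT_EVENT
    return events.firstOrNull { it.source == preferred } ?: events.firstOrNull()
}

fun main() {
    val events = listOf(
        PendingEvent(EventSource.INPUT_EVENT, "move touch point 2% to the left"),
        PendingEvent(EventSource.TOUCH_EVENT, "tap on video APP icon")
    )
    println(pickEventToHandle(events))                           // touch event wins by default
    println(pickEventToHandle(events, touchEventFirst = false))  // input event wins after the user changes the priority
}
```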
It should be noted that, the operations performed by the user on the bluetooth headset in the above examples, and the operations performed by the corresponding handset on the touch point are merely exemplary, and are not limited to the present application.
Exemplarily, as shown in fig. 18, a device control method provided in an embodiment of the present application is applied to a system including a first electronic device and at least one second electronic device, where the first electronic device is configured with a touch panel, and the second electronic device is configured with a display screen, and the method includes the following steps:
s1801, the first electronic device receives a first user operation.
The first user operation comprises any one operation or a plurality of operations of clicking, double clicking, multi-clicking, sliding and long pressing on a touch panel on the first electronic equipment by a user.
Optionally, the first user operation further includes a user operating a key on the first electronic device.
And S1802, the first electronic device generates a first instruction according to the first user operation.
S1803, the first electronic device sends the first instruction to the second electronic device, and correspondingly, the second electronic device receives the first instruction from the first electronic device.
The first instruction is used to instruct the second electronic device to control the touch point to perform a target operation, and the target operation includes any one or more of displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long-pressing. The touch point is displayed on the display screen of the second electronic device. When the touch point is in the released state, the position of the touch point can be moved. Optionally, the touch point may also be referred to as an anchor point in this application, which is not limited in this application.
When the touch point is in the fixed state, the second electronic device can switch the interface by sliding the touch point. When the touch point is in the released state, the second electronic device may move the position of the touch point in response to a moving operation on the touch point. The effects of sliding and moving are therefore different.
Optionally, the first electronic device may send the first instruction to the second electronic device through technologies such as bluetooth and a distributed soft bus, and correspondingly, the second electronic device may receive the first instruction sent by the first electronic device through technologies such as bluetooth and a distributed soft bus.
Optionally, the first instruction may include an input event.
Optionally, the first instruction may include the mapping table.
And S1804, responding to the first instruction, and controlling the touch point to execute the target operation by the second electronic equipment.
In one possible implementation, if the touch point is in a fixed state, the target operation includes displaying any one or more of the touch point in a released state, clicking, double-clicking, sliding, and long-pressing. The target operation can be determined according to a mapping relationship, where the mapping relationship includes a correspondence between an operation of a user on the touch panel and the target operation.
For example: if the first user operation is a click, the target operation is a click. Or, the first user operation is a double click, and the target operation is a double click. Or, the first user operation is a double click, and the target operation is to display the touch point in a fixed state. Or, the first user operation is a double click, and the target operation is to display the touch point in a released state. Or, the first user operation is a multi-click, and the target operation is to display the touch point in a fixed state. Or, the first user operation is a multi-click, and the target operation is to display the touch point in a released state. Or, the first user operation is a long press, and the target operation is a long press. Or, the first user operation is a long press, and the target operation is to display the touch point in a fixed state. Or, the first user operation is a long press, and the target operation is to display the touch point in a released state. Or, the first user operation is a slide, and the target operation is a slide. Or, the first user operation is a slide, and the target operation is a move. This application does not limit this.
For example, the mapping relationship may include the mapping table shown in table 1. For example, as shown in fig. 13a, in a state where the touch point is fixed, a sliding operation may be performed on the touch point. For example, as shown in fig. 13b, in a state where the touch point is in a fixed state, the display state of the touch point may be switched to display the touch point in a released state.
Optionally, in this implementation, the touch point in the fixed state is displayed as a first shape on the display screen of the second electronic device. Illustratively, the first shape may be as shown by touch points 1001 shown in fig. 10a, 10b, 11, etc.
In yet another possible implementation, if the touch point is in the released state, the target operation includes displaying any one or more of the touch point and the movement in a fixed state. The target operation can be determined according to a mapping relationship, where the mapping relationship includes a correspondence between an operation of a user on the touch panel and the target operation. For example, the mapping relationship may include the mapping table shown in table 1.
Optionally, in this implementation, the touch point in the released state is displayed as a second shape on the display screen of the second electronic device. Illustratively, the second shape is shown as touch point 1301 shown in fig. 13b, 14a, 14b, etc.
It should be noted that the first shape and the second shape are only exemplary and are not intended to limit the present application.
Optionally, in this implementation, the first instruction includes a first distance, and the target operation includes moving the touch point by a second distance.
For example, as shown in fig. 14a, in a state where the touch point is in the released state, the display state of the touch point may be switched to display the touch point in a fixed state. For example, as shown in fig. 14b, in the released state of the touch point, the touch point may be controlled to move, for example: the user slides 1mm to the left on the touch panel, the touch point can move 2% pixels to the left.
Based on the technical scheme, when the user can not touch the electronic equipment with the screen, the touch points on the electronic equipment with the screen are controlled through other electronic equipment, so that the electronic equipment with the screen is controlled, various complex operations of the electronic equipment with the screen are realized, and convenience is brought to the user.
Optionally, before step S1801 shown in fig. 18, the following steps may be further included, as shown in fig. 19:
s1805, the first electronic device receives a second user operation.
Optionally, the second user operation includes a long press operation of the user on the touch panel. Optionally, the time of the long pressing operation may be greater than or equal to a preset time, and the preset time may be 2 seconds, 3 seconds, and the like, which is not limited in this application.
Optionally, the second user operation may further include a key operation of the user on the first electronic device.
Optionally, the second user operation may be the same as or different from the first user operation.
And S1806, the first electronic device generates a second instruction according to the second user operation.
S1807, the first electronic device sends a second instruction to the second electronic device. Accordingly, the second electronic device receives the second instruction from the first electronic device.
The second instruction is used for indicating the second electronic equipment to display the touch point.
Optionally, the first electronic device may send the second instruction to the second electronic device through technologies such as bluetooth and a distributed soft bus, and correspondingly, the second electronic device may receive the second instruction sent by the first electronic device through technologies such as bluetooth and a distributed soft bus.
Optionally, the second instruction may include an input event.
Optionally, the second instruction may include the mapping table.
And S1808, responding to the second instruction, and displaying the touch point on a display screen of the second electronic device.
Optionally, the touch point is displayed in a first shape on a display screen of the second electronic device.
Optionally, the touch point may also be displayed in the center of the display screen of the second electronic device.
Based on the scheme, the second electronic device can be controlled to display the touch point first, so that the touch point can be operated subsequently, the second electronic device can be controlled, and convenience is brought to a user.
Optionally, before step S1801 shown in fig. 18, the following steps may be further included, as shown in fig. 20.
And S1809, the second electronic device starts a touch point control mode.
In one possible implementation, the touch point control mode of the second electronic device may be turned on by the first electronic device.
For example, taking the first electronic device as a bluetooth headset, as shown in fig. 10a, a touch point control mode of the second electronic device is turned on by a user operating on a touch panel of the bluetooth headset.
Illustratively, as shown in fig. 10b, the touch point control mode of the second electronic device may also be turned on through a key on the bluetooth headset.
In another possible implementation, the touch point control mode may be turned on by the second electronic device itself, as shown in fig. 10 c.
Optionally, after the touch point control mode of the second electronic device is turned on, the user may be prompted by voice that the touch point control mode is currently entered. A prompt window may also pop up at the second electronic device, such as interface 2100 shown in fig. 21, prompting the user that the touch point control mode has been currently entered.
Similarly, the second electronic device may also turn off the touch point control mode through the above implementation manner. For example: long-pressing the touch panel of the first electronic device, clicking a key of the first electronic device, clicking a button 1005 in an interface 1040 shown in fig. 10c, and the like.
When the touch point control of the second electronic device is realized through the first electronic device, the power consumption of the first electronic device is additionally increased. Therefore, the touch point control mode is turned on and/or off in the above manner, so that when the user does not use the touch point control mode, the touch point control mode can be turned off, power consumption of the first electronic device is saved, and meanwhile, mistaken touch can be prevented.
Optionally, the method shown in fig. 18 may further include the following steps, as shown in fig. 22:
s1810, the first electronic device receives a third user operation.
The third user operation is used to instruct the first electronic device to switch the second electronic device, and the third user operation includes a multi-click operation of the user on the touch panel, for example: three clicks, four clicks, etc., which are not limited in this application.
Optionally, the third user operation may further include a key operation of the user on the first electronic device. Optionally, the third user operation may be the same as or different from the first user operation and the second user operation.
S1811, the first electronic device generates a third instruction according to a third user operation.
S1812, the first electronic device sends a third instruction to a third electronic device.
The third instruction is used for instructing the third electronic device to control the touch point to execute target operation, and the target operation includes any one or more of displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double clicking, sliding and long pressing. And the touch point is displayed on a display screen of the third electronic device. Wherein the third electronic device belongs to the at least one second electronic device.
Optionally, the third electronic device is determined by the first electronic device according to a sequence table, where the sequence table includes a sequence of the at least one second electronic device accessing the distributed network. For a detailed description of the sequence table, please refer to the related description of the preset sequence table, which is not repeated herein.
Optionally, the first electronic device may send the third instruction to the third electronic device through technologies such as bluetooth and a distributed soft bus, and correspondingly, the third electronic device may receive the third instruction sent by the first electronic device through technologies such as bluetooth and a distributed soft bus.
Optionally, the third instruction may include an input event.
Optionally, the third instruction may include the mapping table.
Illustratively, as shown in fig. 16c, the first electronic device may switch control over a plurality of electronic devices. Based on the design, when a user uses a plurality of electronic devices with screens, the touch points on the electronic devices are switched and controlled, and then the electronic devices can be respectively controlled, seamless switching control among the electronic devices can be realized, and convenience is brought to the user.
It should be noted that, each interface is only a schematic diagram, and in practical applications, each interface may include more or less contents, or may include more or less interfaces, which is not limited in the present application.
The scheme provided by the embodiments of this application has been described above mainly from the perspective of the method. It can be understood that, in order to realize the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. The units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered as going beyond the scope of this application.
In the present application, the electronic device may be divided into functional modules according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a hardware form or a software functional module form. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
As shown in fig. 23, which is a schematic structural diagram of an electronic device provided in this embodiment of the application, the electronic device 2300 may be used to implement the method performed by the first electronic device in the above method embodiments. Illustratively, the electronic device 2300 specifically includes a processing unit 2301, a communication unit 2302, and a touch control unit 2303.
Among other things, the processing unit 2301 is configured to support the first electronic device in performing step S1802 in fig. 18, 19, 20, and 22, step S1806 in fig. 19, step S1811 in fig. 22, and/or other processing operations performed by the first electronic device in the embodiments of this application. The communication unit 2302 is configured to support the first electronic device in performing step S1803 in fig. 18, 19, 20, and 22, step S1807 in fig. 19, step S1812 in fig. 22, and/or other communication operations performed by the first electronic device in the embodiments of this application. The touch unit 2303 is configured to support the first electronic device in performing step S1801 in fig. 18, 19, 20, and 22, step S1805 in fig. 19, step S1810 in fig. 22, and/or other touch operations performed by the first electronic device in the embodiments of this application.
Optionally, the electronic device 2300 shown in fig. 23 may further include a storage unit (not shown in fig. 23) that stores a program or instructions. The program or the instructions, when executed by the processing unit, enable the electronic device 2300 shown in fig. 23 to execute the device control method executed by the first electronic device shown in fig. 18, fig. 19, fig. 20, fig. 22.
Fig. 24 is a schematic structural diagram of an electronic device provided in an embodiment of this application. The electronic device 2400 may be used to implement the method performed by the second electronic device in the above method embodiments. Illustratively, the electronic device 2400 includes a processing unit 2401, a communication unit 2402, and a display unit 2403.
The processing unit 2401 is configured to support the second electronic device in performing step S1804 in fig. 18, fig. 19, fig. 20, and fig. 22, step S1808 in fig. 19, step S1809 in fig. 20, and/or other processing operations performed by the second electronic device in the embodiments of this application. The communication unit 2402 is configured to support the second electronic device in performing step S1803 in fig. 18, fig. 19, fig. 20, and fig. 22, step S1807 in fig. 19, and/or other communication operations performed by the second electronic device in the embodiments of this application. The display unit 2403 is configured to support the second electronic device in displaying the interfaces shown in fig. 8, fig. 9, fig. 10a, fig. 10b, fig. 10c, fig. 11, fig. 12a, fig. 12b, fig. 13a, fig. 13b, fig. 14a, fig. 14b, fig. 15a, fig. 15b, and the like, and/or in performing other display operations performed by the second electronic device in the embodiments of this application.
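Correspondingly, a minimal sketch of the receiving side is given below: a processing unit of the second electronic device interprets a received instruction and updates the on-screen touch point, including the distinction between a fixed touch point (whose position cannot be moved) and a released touch point (whose position can be moved). All identifiers are hypothetical and the dispatch logic is only an assumption about one possible implementation, not the implementation of this application.

class TouchPoint:
    """On-screen touch point that is either fixed (immovable) or released (movable)."""
    def __init__(self):
        self.x, self.y = 0, 0
        self.state = "released"

class SecondDeviceProcessingUnit:
    """Applies received instructions to the touch point shown on the display screen."""
    def __init__(self, touch_point):
        self.touch_point = touch_point

    def handle_instruction(self, instruction):
        operation = instruction["target_operation"]
        if operation == "fix":
            self.touch_point.state = "fixed"            # the position can no longer be moved
        elif operation == "release":
            self.touch_point.state = "released"         # the position can be moved again
        elif operation == "move":
            if self.touch_point.state == "released":    # movement is ignored while fixed
                self.touch_point.x += instruction.get("dx", 0)
                self.touch_point.y += instruction.get("dy", 0)
        else:
            # click, double_click, long_press and similar operations would be
            # dispatched to the user interface at the touch point's position here.
            print(f"perform {operation} at ({self.touch_point.x}, {self.touch_point.y})")

receiver = SecondDeviceProcessingUnit(TouchPoint())
receiver.handle_instruction({"target_operation": "move", "dx": 10, "dy": 5})
receiver.handle_instruction({"target_operation": "fix"})
receiver.handle_instruction({"target_operation": "move", "dx": 99})   # no effect while fixed
receiver.handle_instruction({"target_operation": "click"})

The fixed/released branch mirrors the definition used throughout this application; everything else about the dispatch is an assumption of this sketch.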
Optionally, the electronic device 2400 shown in fig. 24 may further include a storage unit (not shown in fig. 24) that stores programs or instructions. When the programs or instructions are executed by the processing unit 2401, the electronic device 2400 shown in fig. 24 can execute the device control method performed by the second electronic device in fig. 18, fig. 19, fig. 20, and fig. 22.
For technical effects of the electronic device 2300 shown in fig. 23 and the electronic device 2400 shown in fig. 24, reference may be made to the technical effects of the device control methods shown in fig. 18, fig. 19, fig. 20, and fig. 22, and details are not repeated here. The processing unit 2301 of the electronic device 2300 shown in fig. 23 (or the processing unit 2401 of the electronic device 2400 shown in fig. 24) may be implemented by a processor or a processor-related circuit component, and may be a processor or a processing module. The communication unit 2302 (or the communication unit 2402) may be implemented by a transceiver or a transceiver-related circuit component, and may be a transceiver or a transceiver module. The touch unit 2303 may be implemented by touch-panel-related components and may include a touch panel; the display unit 2403 may be implemented by display-screen-related components and may include a display screen.
An embodiment of the present application further provides a chip system. As shown in fig. 25, the chip system includes at least one processor 2501 and at least one interface circuit 2502. The processor 2501 and the interface circuit 2502 may be interconnected by wires. For example, the interface circuit 2502 may be used to receive signals from other apparatuses. For another example, the interface circuit 2502 may be used to send signals to other apparatuses (for example, to the processor 2501). Illustratively, the interface circuit 2502 may read instructions stored in a memory and send the instructions to the processor 2501. The instructions, when executed by the processor 2501, may cause the electronic device to perform the steps performed by the first electronic device or the second electronic device in the above embodiments. Certainly, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
Optionally, the chip system may have one or more processors. The processor may be implemented by hardware or by software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented by software, the processor may be a general-purpose processor that is implemented by reading software code stored in a memory.
Optionally, there may also be one or more memories in the chip system. The memory may be integrated with the processor or may be disposed separately from the processor, which is not limited in this application. For example, the memory may be a non-transitory memory, such as a read-only memory (ROM); it may be integrated with the processor on the same chip or disposed on different chips respectively. The type of the memory and the manner in which the memory and the processor are arranged are not specifically limited in this application.
The chip system may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It can be understood that the steps of the above method embodiments may be completed by a hardware integrated logic circuit in a processor or by instructions in the form of software. The steps of the methods disclosed in connection with the embodiments of this application may be directly performed by a hardware processor, or performed by a combination of hardware in the processor and software modules.
The present application provides a computer-readable storage medium, on which a computer program or instructions are stored, which, when run on a computer, cause the computer to perform the method described in the above method embodiments.
An embodiment of the present application provides a computer program product, including a computer program or instructions which, when run on a computer, cause the computer to perform the methods described in the above method embodiments.
In addition, an embodiment of the present application further provides an apparatus, which may specifically be a component or a module, and the apparatus may include a processor and a memory that are connected to each other. The memory is configured to store computer-executable instructions, and when the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the apparatus performs the methods in the above method embodiments.
In addition, the electronic device, the computer-readable storage medium, the computer program product, and the chip provided in the embodiments of the present application are all configured to execute the corresponding methods provided above. Therefore, for the beneficial effects that they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not repeated here.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the several embodiments provided in this application, it should be understood that the disclosed methods may be implemented in other manners, and the embodiments may be combined with or refer to each other where no conflict arises. The above-described embodiments of the electronic device are merely illustrative. For example, the division of the modules or units is only a logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the shown or discussed mutual couplings or direct couplings or communication connections may be indirect couplings or communication connections between modules or units through some interfaces, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, or the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. A device control method, applied to a system consisting of a first electronic device and at least one second electronic device, wherein the first electronic device is provided with a touch panel and the second electronic device is provided with a display screen; the method comprises the following steps:
the second electronic device receives a first instruction from the first electronic device, wherein the first instruction is generated according to a first user operation, and the first user operation is an operation of a user on the touch panel; the first instruction is used for instructing the second electronic device to control a touch point to execute a target operation, and the target operation comprises any one or more of displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long pressing;
the touch point is displayed on a display screen of the second electronic device, the position of the touch point cannot be moved when the touch point is in the fixed state, and the position of the touch point can be moved when the touch point is in the released state; and
in response to the first instruction, the second electronic device controls the touch point to execute the target operation.
2. The method of claim 1, wherein the target operation is determined according to a mapping relationship, the mapping relationship comprising a correspondence between the first user operation and the target operation.
3. The method of claim 2, wherein:
if the first user operation is clicking, the target operation is clicking;
or, if the first user operation is double-click, the target operation is double-click;
or, if the first user operation is double-click, the target operation is to display the touch point in a fixed state;
or, if the first user operation is double-click, the target operation is to display the touch point in a release state;
or, if the first user operation is multi-click, the target operation is to display the touch point in a fixed state;
or, if the first user operation is multi-click, the target operation is to display the touch point in a release state;
or, if the first user operation is long press, the target operation is long press;
or, if the first user operation is long press, the target operation is to display the touch point in a fixed state;
or, if the first user operation is long press, the target operation is to display the touch point in a release state;
or, if the first user operation is sliding, the target operation is sliding;
or, if the first user operation is sliding, the target operation is moving.
4. The method of any of claims 1-3, wherein the first instruction comprises a first distance; the target operation comprises moving, and the moving comprises a second distance.
5. The method of any of claims 1-4, wherein prior to the second electronic device receiving the first instruction from the first electronic device, the method further comprises:
the second electronic device receives a second instruction from the first electronic device, wherein the second instruction is used for instructing the second electronic device to display the touch point;
and in response to the second instruction, the second electronic device displays the touch point on the display screen.
6. The method of any of claims 1-5, wherein prior to the second electronic device receiving the first instruction from the first electronic device, the method further comprises:
the second electronic device starts a touch point control mode.
7. A device control method, applied to a system consisting of a first electronic device and at least one second electronic device, wherein the first electronic device is provided with a touch panel and the second electronic device is provided with a display screen; the method comprises the following steps:
the first electronic device receives a first user operation, wherein the first user operation comprises an operation of a user on the touch panel;
the first electronic device generates a first instruction according to the first user operation;
the first electronic device sends the first instruction to the second electronic device, wherein the first instruction is used for instructing the second electronic device to control a touch point to execute a target operation, and the target operation comprises any one or more of displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long pressing;
the touch point is displayed on a display screen of the second electronic device, the position of the touch point cannot be moved when the touch point is in the fixed state, and the position of the touch point can be moved when the touch point is in the released state.
8. The method of claim 7, wherein the target operation is determined according to a mapping relationship, the mapping relationship comprising a correspondence between the first user operation and the target operation.
9. The method of claim 8, wherein:
if the first user operation is clicking, the target operation is clicking;
or, if the first user operation is double-click, the target operation is double-click;
or, if the first user operation is double-click, the target operation is to display the touch point in a fixed state;
or, if the first user operation is double-click, the target operation is to display the touch point in a release state;
or, if the first user operation is multi-click, the target operation is to display the touch point in a fixed state;
or, if the first user operation is multi-click, the target operation is to display the touch point in a release state;
or, if the first user operation is long press, the target operation is long press;
or, if the first user operation is long press, the target operation is to display the touch point in a fixed state;
or, if the first user operation is long press, the target operation is to display the touch point in a release state;
or, if the first user operation is sliding, the target operation is sliding;
or, if the first user operation is sliding, the target operation is moving.
10. The method of any of claims 7-9, wherein the first instruction comprises a first distance; the target operation comprises moving, and the moving comprises a second distance.
11. The method of any of claims 7-10, wherein prior to the first electronic device sending the first instruction to the second electronic device, the method further comprises:
the first electronic device receives a second user operation;
the first electronic device generates a second instruction according to the second user operation; and
the first electronic device sends the second instruction to the second electronic device, wherein the second instruction is used for instructing the second electronic device to display the touch point.
12. The method of any of claims 7-11, wherein after the first electronic device sends the first instruction to the second electronic device, the method further comprises:
the first electronic device receives a third user operation, wherein the third user operation is used for instructing the first electronic device to switch the second electronic device being controlled, and the third user operation comprises an operation of the user on the touch panel;
the first electronic device generates a third instruction according to the third user operation;
the first electronic device sends the third instruction to a third electronic device, wherein the third instruction is used for instructing the third electronic device to control a touch point to execute a target operation, and the target operation comprises any one or more of displaying the touch point in a fixed state, displaying the touch point in a released state, moving, clicking, double-clicking, sliding, and long pressing;
the touch point is displayed on a display screen of the third electronic device, and the third electronic device belongs to the at least one second electronic device.
13. The method of claim 12, wherein the third electronic device is determined by the first electronic device based on a sequence table that includes a sequence in which the at least one second electronic device accesses the distributed network.
14. The method of any of claims 7-13, wherein the first electronic device is a Bluetooth headset.
15. An electronic device, wherein the electronic device is a second electronic device, and wherein the second electronic device comprises:
a display screen;
one or more processors;
a memory;
a communication interface;
wherein the memory, the display screen, and the communication interface are coupled with the processor, the memory having one or more computer programs stored therein, the one or more computer programs comprising instructions which, when executed by the second electronic device, cause the second electronic device to perform the device control method of any of claims 1-6.
16. An electronic device, wherein the electronic device is a first electronic device, the first electronic device comprising:
a touch panel;
one or more processors;
a memory;
a communication interface;
wherein the memory, the touch panel, and the communication interface are coupled with the processor, the memory having one or more computer programs stored therein, the one or more computer programs comprising instructions which, when executed by the first electronic device, cause the first electronic device to perform the device control method of any of claims 7-14.
17. A device control system, characterized in that the system comprises an electronic device according to claim 15 and an electronic device according to claim 16.
18. A computer-readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the device control method of any one of claims 1-6 or 7-14.
19. A computer program product comprising instructions for causing an electronic device to perform the device control method of any one of claims 1-6 or 7-14 when the computer program product is run on the electronic device.
CN202110779144.1A 2021-07-09 2021-07-09 Device control method and electronic device Pending CN115665313A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110779144.1A CN115665313A (en) 2021-07-09 2021-07-09 Device control method and electronic device

Publications (1)

Publication Number Publication Date
CN115665313A true CN115665313A (en) 2023-01-31

Family

ID=85015175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110779144.1A Pending CN115665313A (en) 2021-07-09 2021-07-09 Device control method and electronic device

Country Status (1)

Country Link
CN (1) CN115665313A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2828919Y (en) * 2005-04-19 2006-10-18 方燕梅 Computer mainframe intelligent contact
CN202310046U (en) * 2011-10-31 2012-07-04 山东科技大学 Novel earphone
CN103389792A (en) * 2012-05-07 2013-11-13 高超 Head and neck control type mouse
WO2016082409A1 (en) * 2014-11-27 2016-06-02 捷开通讯(深圳)有限公司 Terminal control method and intelligent earphone
CN106502556A (en) * 2015-09-08 2017-03-15 苹果公司 For moving the apparatus and method of current focus using touch-sensitive remote control
US20170308182A1 (en) * 2016-04-26 2017-10-26 Bragi GmbH Mechanical Detection of a Touch Movement Using a Sensor and a Special Surface Pattern System and Method
US20200252715A1 (en) * 2017-09-26 2020-08-06 Mobvoi Information Technology Co., Ltd. Wireless Earpiece and Control Method Therefor
CN108600887A (en) * 2018-04-23 2018-09-28 Oppo广东移动通信有限公司 Method of toch control based on wireless headset and Related product
WO2020019355A1 (en) * 2018-07-27 2020-01-30 华为技术有限公司 Touch control method for wearable device, and wearable device and system
CN109327756A (en) * 2018-09-13 2019-02-12 歌尔科技有限公司 The charging box and its control method and device of a kind of wireless headset
CN111131952A (en) * 2019-12-27 2020-05-08 深圳春沐源控股有限公司 Control method of earphone assembly, earphone assembly and computer readable storage medium
CN111142775A (en) * 2019-12-27 2020-05-12 王友位 Gesture interaction method and device

Similar Documents

Publication Publication Date Title
EP3907597A1 (en) Method for displaying ui assembly and electronic device
WO2021000881A1 (en) Screen splitting method and electronic device
CN110839096B (en) Touch method of equipment with folding screen and folding screen equipment
CN112558825A (en) Information processing method and electronic equipment
CN112083867A (en) Cross-device object dragging method and device
KR20220110314A (en) Card display method, electronic device and computer readable storage medium
CN113778574B (en) Card sharing method, electronic equipment and communication system
KR20120067636A (en) Mobile terminal and control method therof
CN112527174B (en) Information processing method and electronic equipment
US20230422154A1 (en) Method for using cellular communication function, and related apparatus and system
CN113050841A (en) Method, electronic equipment and system for displaying multiple windows
CN112527222A (en) Information processing method and electronic equipment
CN115002937B (en) Multi-device cooperation method, electronic device and related product
WO2022017393A1 (en) Display interaction system, display method, and device
CN112130788A (en) Content sharing method and device
JP7234379B2 (en) Methods and associated devices for accessing networks by smart home devices
CN114885442A (en) Input device connection method, device and system
KR20130001826A (en) Mobile terminal and control method therof
CN115657918A (en) Cross-device object dragging method and device
EP4163782A1 (en) Cross-device desktop management method, first electronic device, and second electronic device
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
CN114690986A (en) Method for creating shortcut and related equipment
CN113805825B (en) Method for data communication between devices, device and readable storage medium
WO2023005711A1 (en) Service recommendation method and electronic device
EP4258099A1 (en) Double-channel screen projection method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination