CN111190488A - Device control method, communication apparatus, and storage medium - Google Patents


Info

Publication number
CN111190488A
Authority
CN
China
Prior art keywords
head
information
user
mounted device
glasses
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911393813.0A
Other languages
Chinese (zh)
Inventor
熊刘冬
张衡
刘勋
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN201911393813.0A
Publication of CN111190488A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01: Indexing scheme relating to G06F3/01
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

An embodiment of the application provides a device control method, a communication apparatus, and a storage medium. A head-mounted device displays information of one or more devices together with one or more pieces of operable information. When a user operates target operable information among the one or more pieces of operable information, the head-mounted device can generate a control instruction for controlling a first device among the one or more devices and send the control instruction to the first device. Even when the first device or the user is in a state in which movement is inconvenient, the user can thus control the first device through the head-mounted device, which improves the convenience of controlling the device.

Description

Device control method, communication apparatus, and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a device control method, a communication apparatus, and a storage medium.
Background
With the development of communication technology, the variety of user terminals keeps growing; user terminals now include, for example, smart phones, tablet computers, notebook computers, smart bands, speakers, and the like.
However, when a user needs to control a terminal, the user must first obtain the terminal and then perform the control operation on it. The terminal may currently be in a state in which it is inconvenient to move, for example, it is charging or being used by another user. Alternatively, the user may be unable to move conveniently and cannot reach a terminal located far away. In either case, it is inconvenient for the user to control the terminal.
Disclosure of Invention
The present application provides a device control method, a communication apparatus, and a storage medium to improve the convenience with which a user controls a device.
In a first aspect, the present application provides a device control method, including: a head-mounted device displays information of one or more devices and one or more pieces of operable information; the head-mounted device generates a control instruction for controlling a first device according to an operation of a user on target operable information, where the target operable information is one of the one or more pieces of operable information and the first device is one of the one or more devices; and the head-mounted device sends the control instruction to the first device. With this solution, even when the first device or the user is in a state in which movement is inconvenient, the user can still control the first device through the head-mounted device, which improves the convenience of controlling the device.
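The three steps of the first aspect (display, generate, send) can be sketched as follows. The patent contains no code, so this Python sketch is purely illustrative: the class names, fields, and the transport abstraction are assumptions, not anything the patent specifies.

```python
from dataclasses import dataclass, field

@dataclass
class ControlInstruction:
    target_device: str   # identifies the first device
    action: str          # e.g. "send_file" or "set_function"
    payload: dict = field(default_factory=dict)

class HeadMountedDevice:
    """Minimal sketch of the first-aspect flow (names are illustrative)."""

    def __init__(self, transport):
        self.transport = transport  # abstraction over the wireless link

    def on_user_operation(self, first_device, target_operable_info):
        # Generate a control instruction from the user's operation on the
        # target operable information ...
        instruction = ControlInstruction(
            target_device=first_device,
            action=target_operable_info["action"],
            payload=target_operable_info.get("payload", {}),
        )
        # ... and send it to the first device.
        self.transport.send(first_device, instruction)
        return instruction

class LoggingTransport:
    """Stand-in for the wireless link; records what was sent."""
    def __init__(self):
        self.sent = []
    def send(self, device, instruction):
        self.sent.append((device, instruction))

hmd = HeadMountedDevice(LoggingTransport())
ins = hmd.on_user_operation(
    "terminal-12", {"action": "send_file", "payload": {"file": "photo.jpg"}})
print(ins.action, "->", ins.target_device)  # send_file -> terminal-12
```

In a real implementation the transport would be the wireless link between the glasses and the terminal; here it is a logging stub so the flow can be observed.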
In one possible design, the one or more pieces of operable information include one or more pieces of file information. That the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target operable information includes: the head-mounted device generates a control instruction for controlling the first device according to an operation of the user on target file information, where the target file information is one of the one or more pieces of file information, the first device is the device on which the target file corresponding to the target file information resides, the control instruction is used to control the first device to send the target file to a second device, and the second device is one of the one or more devices. With this solution, the user can send a file on the first device to the second device through the head-mounted device and, likewise, send a file on the second device to the first device, realizing information exchange between the first device and the second device.
In one possible design, the operation of the user on the target file information includes: an operation indicating that the user sends the target file corresponding to the target file information to the second device.
In one possible design, that the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target file information includes: the head-mounted device generates the control instruction according to an operation in which the user moves the target file information to a first preset position on the display device of the head-mounted device, where the distance between the position of the second device on the display device and the first preset position is less than or equal to a first preset distance. With this solution, the user can send a file on the first device to the second device through the head-mounted device and, likewise, send a file on the second device to the first device, realizing information exchange between the two devices.
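The "first preset position" condition above amounts to a screen-space distance test: the drop position of the dragged file information must be close enough to where the second device is shown. A minimal Python sketch, assuming 2D screen coordinates and an arbitrary preset distance (neither is specified by the patent):

```python
import math

def within_preset_distance(drop_pos, device_pos, preset_distance):
    """True when the position to which the user moved the target file
    information is at most `preset_distance` away from the second device's
    position on the display device (both in screen coordinates)."""
    return math.hypot(drop_pos[0] - device_pos[0],
                      drop_pos[1] - device_pos[1]) <= preset_distance

# User releases the file icon at (310, 205); the second device's icon is at (300, 200).
print(within_preset_distance((310, 205), (300, 200), preset_distance=20))  # True
# A drop far from the device's icon does not trigger the transfer.
print(within_preset_distance((400, 300), (300, 200), preset_distance=20))  # False
```

When the check passes, the head-mounted device would generate the control instruction directing the first device to send the target file to the second device.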
In one possible design, the one or more pieces of file information include at least one of the following: one or more file identifiers, or one or more file icons.
In one possible design, the one or more pieces of operable information include one or more function setting icons. That the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target operable information includes: the head-mounted device generates a control instruction for controlling the first device according to an operation of the user on a target function setting icon, where the target function setting icon is one of the one or more function setting icons and the control instruction is used to set a function of the first device. With this solution, even when the first device or the user is in a state in which movement is inconvenient, the user can still control functions of the first device through the head-mounted device, which improves the convenience of controlling the device.
In one possible design, that the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target function setting icon includes: the head-mounted device generates the control instruction according to a selection operation of the user on the target function setting icon and a selection operation of the user on the information of the first device.
In one possible design, the information of the one or more devices includes icons of the one or more devices, and the one or more pieces of operable information are displayed within the icons of the one or more devices.
In one possible design, that the head-mounted device displays information of one or more devices includes: the head-mounted device displays icons of one or more devices within the range of viewing angles of the head-mounted device.
In one possible design, before the head-mounted device displays icons of one or more devices within the range of viewing angles of the head-mounted device, the method further includes: the head-mounted device detects wireless signals transmitted by the one or more devices; the head-mounted device determines the angles and distances of the one or more devices relative to the head-mounted device according to those wireless signals; and the head-mounted device determines, based on those angles and distances, which of the one or more devices are within the range of viewing angles of the head-mounted device.
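The viewing-angle determination described above can be sketched as a filter over per-device angle and distance estimates. The field-of-view width and the distance cutoff below are illustrative assumptions; the patent does not specify how the angles and distances are derived from the detected wireless signals.

```python
def devices_in_view(measurements, fov_deg=60.0, max_distance=10.0):
    """Keep only devices whose estimated angle (degrees, 0 = straight ahead)
    falls inside the glasses' field of view and whose estimated distance
    (metres) is within range. `measurements` maps a device name to an
    (angle, distance) pair derived from that device's wireless signal."""
    half_fov = fov_deg / 2.0
    return [name for name, (angle, distance) in measurements.items()
            if abs(angle) <= half_fov and distance <= max_distance]

measurements = {
    "terminal-12": (15.0, 3.2),   # slightly right of centre, nearby
    "terminal-14": (-20.0, 5.0),  # left of centre, nearby
    "speaker": (75.0, 2.0),       # close, but outside the viewing angle
}
print(devices_in_view(measurements))  # ['terminal-12', 'terminal-14']
```

Only the devices that pass this filter would then have their icons rendered on the display device.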
In one possible design, before the head-mounted device displays the one or more pieces of operable information, the method further includes: the head-mounted device sends an acquisition request to the first device according to a movement operation of the user on a movable marker displayed on the display device of the head-mounted device, where the acquisition request is used to acquire the one or more pieces of operable information; and the head-mounted device receives the one or more pieces of operable information from the first device. This solution improves the flexibility with which the user selects the target file and the efficiency with which the target file is transmitted.
In one possible design, that the head-mounted device sends an acquisition request to the first device according to a movement operation of the user on a movable marker displayed on the display device of the head-mounted device includes: the head-mounted device sends the acquisition request to the first device according to an operation in which the user moves the movable marker to a second preset position, where the distance between the position of the first device on the display device and the second preset position is less than or equal to a second preset distance. In other words, the head-mounted device sends the acquisition request when the user moves the movable marker near or onto the first device as displayed on the display device, requesting the first device to send the one or more pieces of operable information to the head-mounted device, which improves the flexibility with which the head-mounted device acquires the operable information.
In one possible design, before the head-mounted device sends the acquisition request to the first device according to the movement operation of the user on the movable marker displayed on the display device of the head-mounted device, the method further includes: the head-mounted device verifies whether the account with which the head-mounted device is logged in to a server is the same as the account with which the first device is logged in to the server. When the two accounts are the same, the control authority of the head-mounted device over the first device is a first authority; when the two accounts are different, the control authority is a second authority, where the first authority is greater than or equal to the second authority. With this solution, after the head-mounted device and the first device mutually confirm that both are logged in with the same account, the user may send the target file on the first device to the second device through the head-mounted device. This prevents a situation in which, when the head-mounted device and the first device belong to different users, a file on the first device is sent to another device through the head-mounted device without the knowledge of the user of the first device, thereby improving the security of the first device.
In addition, if the head-mounted device and the first device belong to the same user, the user can configure privacy-related settings of the first device through the head-mounted device; when they do not belong to the same user, the user can only configure simple functions of the first device, which improves the security of the first device and of the user's private information. Alternatively, in another design, even if the head-mounted device and the first device do not belong to the same user, the user may still be allowed to configure privacy-related settings of the first device through the head-mounted device, which improves the flexibility with which the user controls the first device.
In one possible design, that the head-mounted device verifies whether the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server includes: the head-mounted device encrypts first preset information using a key distributed to the head-mounted device by the server, obtaining first encrypted information; the head-mounted device sends the first encrypted information to the first device; the head-mounted device receives first decryption information obtained by the first device decrypting the first encrypted information; if the first decryption information is the same as the first preset information, the head-mounted device determines that the two accounts are the same; if the first decryption information is different from the first preset information, the head-mounted device determines that the two accounts are different.
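The verification above is a challenge-response exchange: decryption recovers the preset information only when both devices hold the key the server distributed for the same account. The sketch below illustrates this with a toy XOR keystream in place of whatever real cipher the devices would use; the cipher choice and the preset string are assumptions for illustration only.

```python
import hmac

def xor_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stream cipher: the keystream is derived from the account key
    # the server distributed. Not a real cipher; illustration only.
    stream = hmac.new(key, b"keystream", "sha256").digest()
    return bytes(p ^ s for p, s in zip(plaintext, stream))

xor_decrypt = xor_encrypt  # XOR with the same keystream is its own inverse

def accounts_match(hmd_key: bytes, first_device_key: bytes) -> bool:
    """The head-mounted device encrypts first preset information with its
    server-distributed key and sends the result to the first device; the
    first device decrypts with its own key and returns the decryption.
    The decryption equals the preset information only when both devices
    hold the same account's key."""
    preset = b"first preset info"
    encrypted = xor_encrypt(hmd_key, preset)               # HMD -> first device
    decrypted = xor_decrypt(first_device_key, encrypted)   # returned to the HMD
    return decrypted == preset

same_key = b"key-for-account-A"
print(accounts_match(same_key, same_key))              # True: same account
print(accounts_match(same_key, b"key-for-account-B"))  # False: different accounts
```

The second-aspect variant (the first device challenging the head-mounted device with second preset information) is the same exchange with the roles reversed.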
In a second aspect, the present application provides a device control method, including: a first device receives a control instruction from a head-mounted device, where the control instruction is an instruction generated by the head-mounted device according to an operation of a user on target operable information and used to control the first device, the head-mounted device displays information of one or more devices and one or more pieces of operable information, the target operable information is one of the one or more pieces of operable information, and the first device is one of the one or more devices; and the first device executes the control instruction.
In one possible design, the one or more pieces of operable information include one or more pieces of file information. The control instruction is an instruction generated by the head-mounted device according to an operation of the user on target file information and used to control the first device, where the target file information is one of the one or more pieces of file information, the first device is the device on which the target file corresponding to the target file information resides, the control instruction is used to control the first device to send the target file to a second device, and the second device is one of the one or more devices.
In one possible design, the operation of the user on the target file information includes: an operation indicating that the user sends the target file corresponding to the target file information to the second device.
In one possible design, the one or more pieces of file information include at least one of the following: one or more file identifiers, or one or more file icons.
In one possible design, the one or more pieces of operable information include one or more function setting icons. The control instruction is an instruction generated by the head-mounted device according to an operation of the user on a target function setting icon and used to control the first device, where the target function setting icon is one of the one or more function setting icons and the control instruction is used to set a function of the first device.
In one possible design, the information of the one or more devices includes icons of the one or more devices, and the one or more pieces of operable information are displayed within the icons of the one or more devices.
In one possible design, before the first device receives the control instruction from the head-mounted device, the method further includes: the first device receives an acquisition request from the head-mounted device, where the acquisition request is used to acquire the one or more pieces of operable information; and the first device sends the one or more pieces of operable information to the head-mounted device.
In one possible design, before the first device receives the acquisition request from the head-mounted device, the method further includes: the first device verifies whether the account with which the head-mounted device is logged in to a server is the same as the account with which the first device is logged in to the server. When the two accounts are the same, the control authority of the head-mounted device over the first device is a first authority; when the two accounts are different, the control authority is a second authority, where the first authority is greater than or equal to the second authority.
In one possible design, that the first device verifies whether the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server includes: the first device encrypts second preset information using a key distributed to the first device by the server, obtaining second encrypted information; the first device sends the second encrypted information to the head-mounted device; the first device receives second decryption information obtained by the head-mounted device decrypting the second encrypted information; if the second decryption information is the same as the second preset information, the first device determines that the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server.
In a third aspect, the present application provides a communication device comprising means, components or circuits for implementing the method of the first or second aspect.
In a fourth aspect, the present application provides a communication apparatus comprising:
an interface and a processor, the interface and the processor coupled;
the processor is configured to perform the method of the first aspect or the second aspect.
In a possible design, the communication apparatus in the fourth aspect may be a head-mounted device or a terminal, or may be a chip; the interface can be integrated with the processor on the same chip or can be respectively arranged on different chips.
In one possible design, the communication device of the fourth aspect may further include a memory for storing the computer program or instructions. The memory and the processor may be integrated on the same chip, or may be disposed on different chips.
In a fifth aspect, the present application provides a communication device, comprising:
a processor and a transceiver, the processor and the transceiver communicating with each other through an internal connection;
the processor is adapted to execute computer programs or instructions in the memory such that the method according to the first aspect or the second aspect is performed, and the transceiver is adapted to perform the transceiving steps in the method according to the first aspect or the second aspect.
In a possible design, the communication apparatus in the fifth aspect may be a head-mounted device or a terminal, or may be a component (e.g., a chip or a circuit) of the head-mounted device or the terminal.
In a sixth aspect, the present application provides a communication apparatus comprising: a processor and a memory, the processor and the memory coupled;
the memory for storing a computer program;
the processor is configured to execute the computer program stored in the memory to cause the communication apparatus to perform the method according to the first aspect or the second aspect.
In a seventh aspect, the present application provides a communications apparatus, comprising: a processor, a memory, and a transceiver;
the memory for storing a computer program;
the processor is configured to execute the computer program or instructions stored in the memory to cause the communication apparatus to perform the method according to the first aspect or the second aspect.
In an eighth aspect, the present application provides a communication apparatus, comprising: the device comprises an input interface circuit, a logic circuit and an output interface circuit, wherein the input interface circuit is used for acquiring data to be processed; the logic circuit is configured to execute the method according to the first aspect or the second aspect to process the data to be processed, so as to obtain processed data; the output interface circuit is used for outputting the processed data.
In a ninth aspect, the present application provides a computer-readable storage medium comprising a computer program or instructions which, when run on a computer, cause the method according to the first aspect or the second aspect to be performed.
In a tenth aspect, the present application provides a computer program comprising a program or instructions for performing a method according to the first or second aspect when the program or instructions are run on a computer.
In a possible design, the computer program in the tenth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
In an eleventh aspect, the present application provides a computer program product comprising a computer program or instructions for performing the method according to the first or second aspect when the computer program or instructions is run on a computer.
In a twelfth aspect, embodiments of the present application further provide a system, including the head-mounted device, the first device, and the second device described in the first aspect or the second aspect.
In a thirteenth aspect, an embodiment of the present application further provides a processor, where the processor includes: at least one circuit for performing a method according to the first or second aspect.
It can be seen that, in the above aspects, information of one or more devices and one or more pieces of operable information are displayed through a head-mounted device. When a user operates target operable information among the one or more pieces of operable information, the head-mounted device may generate a control instruction for controlling a first device among the one or more devices and send the control instruction to the first device. Even when the first device or the user is in a state in which movement is inconvenient, the user can control the first device through the head-mounted device, which improves the convenience of controlling the device.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 3 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 4 is a flowchart of an apparatus control method according to an embodiment of the present application;
fig. 5 is a schematic view of AR glasses according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of another application scenario provided in the embodiment of the present application;
fig. 7 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 8 is a signaling diagram for verifying whether the AR glasses and the terminal belong to the same user according to the embodiment of the present application;
fig. 9 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 10 is a schematic view of another application scenario provided in an embodiment of the present application;
fig. 11 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 12 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 13 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 14 is a signaling diagram of a device control method according to an embodiment of the present application;
fig. 15 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 16 is a signaling diagram of still another device control method according to an embodiment of the present application;
fig. 17 is a signaling diagram of another device control method according to an embodiment of the present application;
fig. 18 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 19 is a signaling diagram of a function setting method according to an embodiment of the present application;
fig. 20 is a schematic diagram of another application scenario provided in the embodiment of the present application;
fig. 21 is a schematic diagram of another application scenario provided in the embodiment of the present application;
fig. 22 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 23 is a schematic structural diagram of a communication device according to an embodiment of the present application;
fig. 24 is a schematic structural diagram of a communication device according to an embodiment of the present application;
fig. 25 is a schematic structural diagram of another communication device according to an embodiment of the present application;
fig. 26 is a schematic structural diagram of another communication device according to an embodiment of the present application;
fig. 27 is a schematic structural diagram of another communication device according to an embodiment of the present application.
Detailed Description
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The embodiment of the application can be applied to various types of communication systems. Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. The communication system shown in fig. 1 mainly includes a network device 11 and a terminal 12.
Among them, 1) the network device 11 may be a network-side device, such as an Access Point (AP) of a Wireless Local Area Network (WLAN), an evolved Node B (eNB or eNodeB) in 4G, a New Radio (NR) base station (next generation Node B, gNB) in 5G, a base station of a next-generation communication system, a satellite, a small cell, a micro cell, a relay station, a Transmission and Reception Point (TRP), a Road Side Unit (RSU), and so on. For the sake of distinction, the base station of the 4G communication system is referred to as a Long Term Evolution (LTE) eNB, and the base station of the 5G communication system is referred to as an NR gNB. It is understood that some base stations can support both 4G and 5G communication systems. These designations are for convenience of distinction only and are not intended to be limiting.
2) The terminal 12, also referred to as User Equipment (UE), is a device that provides voice and/or data connectivity to a user, such as a handheld device with wireless connection capability, a vehicle-mounted device, or a vehicle with vehicle-to-vehicle (V2V) communication capability. Common terminals include, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Mobile Internet Device (MID), and wearable devices such as a smart watch, a smart bracelet, or a pedometer.
3) "Plurality" means two or more, and other quantifiers are analogous. "And/or" describes an association between the associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. The character "/" generally indicates an "or" relationship between the former and latter associated objects.
It should be noted that the number and types of the terminals 12 included in the communication system shown in fig. 1 are merely examples, and the embodiment of the present application is not limited thereto. For example, more terminals 12 communicating with the network device 11 may be included, and are not depicted in the figures one by one for simplicity of description. Furthermore, in the communication system shown in fig. 1, although the network device 11 and the terminal 12 are shown, the communication system may not be limited to include the network device 11 and the terminal 12, and may also include a core network node or a device for carrying a virtualized network function, which is obvious to those skilled in the art and is not described herein again.
In addition, the embodiments of the present application may be applied not only to 4G wireless communication systems, vehicle-to-everything (V2X) communication systems, device-to-device (D2D) communication systems, communication systems evolved from LTE, 5G communication systems, and satellite communication systems, but also to next-generation wireless communication systems. The embodiments may further be applied to other systems that may appear in the future, such as next-generation Wi-Fi networks and 5G vehicle networks.
It should be noted that, as the communication system continuously evolves, names of the network elements may change in other systems that may appear in the future, and in this case, the scheme provided in the embodiment of the present application is also applicable.
In this embodiment, the network device 11 shown in fig. 1 may also be a server, for example, the server 21 shown in fig. 2. In other application scenarios, the server 21 may communicate with a plurality of terminals; for example, the server 21 may also communicate with the terminal 14. The terminal 14 and the terminal 12 may belong to the same user or to different users. For example, when the terminal 14 and the terminal 12 belong to the same user and the user needs to control either terminal, the user ordinarily has to pick up that terminal and operate it directly. However, the terminal 14 or the terminal 12 may currently be in a state in which it cannot conveniently be moved, for example, while charging or while in use by another user. Alternatively, the user may currently be unable to move and thus unable to reach a terminal located elsewhere, so that the user cannot conveniently control the terminal. To solve this problem, an embodiment of the present application provides a device control method in which other terminals are controlled through a head-mounted device, for example, information interaction among multiple terminals is controlled through the head-mounted device, or functions of other terminals are controlled through the head-mounted device. The head-mounted device may specifically be Augmented Reality (AR) glasses, such as the AR glasses 13 shown in fig. 3.
The device control method is described below with reference to specific embodiments. Fig. 4 is a flowchart of a device control method according to an embodiment of the present application. As shown in fig. 4, the device control method of this embodiment includes the following steps:
S401, the AR glasses display information of one or more devices and one or more pieces of operable information.
Fig. 5 is a plan view of the user's head and the AR glasses 13 after the user wears the AR glasses 13. Here, 51 denotes a display device of the AR glasses 13; the display device 51 may be integrated in the AR glasses 13. The display device 51 may be a lens of the AR glasses 13 or a screen of the AR glasses 13. As shown in fig. 6, the display device 51 of the AR glasses 13 may display information of a plurality of devices, for example, an icon of the terminal 12 and an icon of the terminal 14. It may be understood that in other embodiments the display device 51 of the AR glasses 13 may display information of a single device. In addition, the display device 51 of the AR glasses 13 may also display operable information, which may specifically be an operable identifier or an operable icon, for example, the operable icon 52; the operable icon 52 may specifically be a movable marker, for example, a cursor. Further, the operable information displayed by the display device 51 is not limited to one piece and may be plural.
S402, the AR glasses generate a control instruction for controlling the first device according to the operation of the user on the target operable information, wherein the target operable information is one of the one or more operable information, and the first device is one of the one or more devices.
Here, the operable icon 52 may be regarded as the target operable information, and further, the user may operate the operable icon 52 so that the operable icon 52 moves on the display device 51. For example, the operational icon 52 may be moved over an icon of the terminal 12. Further, the user may move the operable icon 52 from the icon of the terminal 12 to the icon of the terminal 14. The AR glasses 13 may generate a control instruction for controlling the terminal 12 according to the operation of the user on the operable icon 52, for example, the control instruction is used for controlling the terminal 12 to transmit data local to the terminal 12 to the terminal 14, where the terminal 12 may be referred to as a first device and the terminal 14 may be referred to as a second device.
S403, the AR glasses send the control instruction to the first device.
For example, the AR glasses 13 transmit the control instruction to the terminal 12.
S404, the first device executes the control instruction.
After receiving the control instruction, the terminal 12 executes it; for example, the terminal 12 sends local data to the terminal 14.
In other scenarios, the display device 51 of the AR glasses 13 may also display information of one device together with a plurality of pieces of operable information. As shown in fig. 7, the display device 51 displays an icon of the speaker 15 and a plurality of operable icons, and the speaker 15 may be referred to here as the first device. Among them, 71 denotes any one of the plurality of operable icons. The plurality of operable icons may be used to set different functions of the speaker 15. For example, the operable icon 71 may be used to set an alarm function of the speaker 15. Specifically, the user may operate the operable icon 71, and the AR glasses 13 generate, according to the user's operation on the operable icon 71, a control instruction for setting the alarm function of the speaker 15. Further, the AR glasses 13 may send the control instruction to the speaker 15, so that the speaker 15 configures the alarm function according to the control instruction.
According to the embodiment of the application, information of one or more devices and one or more pieces of operable information are displayed by the head-mounted device. When a user operates the target operable information among the one or more pieces of operable information, the head-mounted device can generate a control instruction for controlling a first device among the one or more devices and send the control instruction to the first device. Even when the first device or the user is in a state in which movement is inconvenient, the user can still control the first device through the head-mounted device, which improves the convenience of controlling devices.
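The S401 to S404 flow described above can be sketched in code as follows; this is a minimal illustrative sketch, and all class, method, and field names are assumptions rather than part of the disclosed method.

```python
# Minimal illustrative sketch of S401-S404. All class, method, and field
# names are assumptions, not part of the disclosed method.

class FirstDevice:
    """Stands in for terminal 12; executes control instructions (S404)."""
    def __init__(self):
        self.executed = []

    def execute(self, instruction):
        self.executed.append(instruction)
        return "ok"

class HeadMountedDevice:
    """Stands in for the AR glasses 13."""
    def __init__(self, devices, operable_info):
        # S401: display information of one or more devices and one or
        # more pieces of operable information.
        self.devices = devices
        self.operable_info = operable_info

    def on_user_operation(self, target_info, first_device_id, action):
        # S402: generate a control instruction from the user's operation
        # on the target operable information.
        assert target_info in self.operable_info
        assert first_device_id in self.devices
        return {"target_device": first_device_id, "action": action}

    def send(self, instruction):
        # S403: send the control instruction to the first device.
        device = self.devices[instruction["target_device"]]
        return device.execute(instruction)

phone = FirstDevice()
glasses = HeadMountedDevice(devices={"terminal12": phone},
                            operable_info={"cursor"})
instr = glasses.on_user_operation("cursor", "terminal12",
                                  "send_file_to_terminal14")
result = glasses.send(instr)
print(result)                       # ok
print(phone.executed[0]["action"])  # send_file_to_terminal14
```

The same skeleton covers the speaker scenario: only the `action` carried by the instruction changes (e.g. setting an alarm function instead of sending a file).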
On the basis of the above-described embodiment, the terminal 12 and the AR glasses 13 may be registered in the server 21, and for example, the server 21 may store the user identification of the terminal 12 and the device identification of the terminal 12, and the user identification of the AR glasses 13 and the device identification of the AR glasses 13. The terminal 12 and the AR glasses 13 may communicate with the server 21, respectively. The terminal 12 and the AR glasses 13 may belong to the same user or may belong to different users. For example, the user may send a data acquisition request to the terminal 12 through the AR glasses 13 and the server 21, after the terminal 12 receives the data acquisition request, the data stored in the terminal 12 is sent to the AR glasses 13 through the server 21, and the AR glasses 13 may display the data to the user for viewing. Alternatively, the user may transmit data displayed on the AR glasses 13 to the terminal 12 via the server 21. Alternatively, the user may also transmit data in the terminal 12 to other terminals, such as the terminal 14, through the AR glasses 13. The terminal 14 may also be registered in the server 21 and communicate with the server 21.
Before transmitting the data in the terminal 12 to the terminal 14 through the AR glasses 13, the AR glasses 13 may determine whether the AR glasses 13 and the terminal 12 belong to the same user, and the terminal 12 may also determine whether the AR glasses 13 and the terminal 12 belong to the same user. In the embodiment of the present application, the terminal 12 may be referred to as a first device, and the terminal 14 may be referred to as a second device. Specifically, the process of determining whether the AR glasses 13 and the terminal 12 belong to the same user respectively by the AR glasses 13 and the terminal 12 specifically includes the following steps as shown in fig. 8:
S801, the AR glasses send a login request to the server.
S802, the server sends the key distributed by the server to the AR glasses.
S803, the first device sends a login request to the server.
S804, the server sends the key distributed by the server to the first device.
S805, the AR glasses encrypt first preset information using the key distributed to the AR glasses by the server, to obtain first encrypted information.
S806, the AR glasses send the first encrypted information to the first device.
S807, the first device decrypts the first encrypted information to obtain first decrypted information.
S808, the first device sends the first decrypted information to the AR glasses.
S809, if the AR glasses determine that the first preset information is the same as the first decrypted information, the AR glasses determine that the AR glasses and the first device belong to the same user.
S810, the AR glasses notify the first device that the AR glasses and the first device belong to the same user.
S811, the first device encrypts second preset information using the key distributed to the first device by the server, to obtain second encrypted information.
S812, the first device sends the second encrypted information to the AR glasses.
S813, the AR glasses decrypt the second encrypted information to obtain second decrypted information.
S814, the AR glasses send the second decrypted information to the first device.
S815, if the first device determines that the second preset information is the same as the second decrypted information, the first device determines that the AR glasses and the first device belong to the same user.
S816, the first device notifies the AR glasses that the AR glasses and the first device belong to the same user.
If the AR glasses determine that the first preset information differs from the first decrypted information, the AR glasses determine that the AR glasses and the first device do not belong to the same user. Similarly, if the first device determines that the second preset information differs from the second decrypted information, the first device determines that the AR glasses and the first device do not belong to the same user. The first preset information and the second preset information may be the same information or different information; for example, they may be the same random number or different random numbers. In addition, this embodiment does not limit the execution order of some of the steps in S801 to S816; for example, S803 and S804 may be executed before S801 and S802, and S811 to S816 may be executed before S805 to S810.
In one possible implementation manner, the key distributed by the server to the AR glasses or the first device is a symmetric key, that is, the AR glasses or the first device performs encryption and decryption using the symmetric key. When the AR glasses and the first device belong to the same user, the AR glasses and the first device log in to the server using the same user account in S801 and S803. The server assigns the same symmetric key to the AR glasses and the first device belonging to the same user. Therefore, the key used by the AR glasses for encryption in S805 and the key used by the first device for decryption in S807 are both the symmetric key, and therefore, the first preset information is the same as the first decryption information in S809, so that the AR glasses determine that the AR glasses and the first device belong to the same user. Similarly, the process of the first device determining that the AR glasses and the first device belong to the same user is similar to this, and is not described here again.
In another possible implementation, the key distributed by the server to the AR glasses or the first device is an asymmetric key; that is, the server distributes to the AR glasses a public key used for encryption and a private key used for decryption. The server likewise distributes to the first device a public key for encryption and a private key for decryption. When the AR glasses and the first device belong to the same user, the public key distributed to the AR glasses is the same as the public key distributed to the first device, and the private key distributed to the AR glasses is the same as the private key distributed to the first device. Therefore, the key used by the AR glasses for encryption in S805 and the key used by the first device for decryption in S807 form a public-private key pair, so the first preset information is the same as the first decrypted information in S809, and the AR glasses determine that the AR glasses and the first device belong to the same user. The process by which the first device determines that the AR glasses and the first device belong to the same user is similar and is not described here again.
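The symmetric-key check of S805 to S809 can be illustrated with a toy sketch. The embodiment does not name a particular cipher, so a keyed XOR keystream derived with SHA-256 stands in for a real symmetric cipher; all function names here are illustrative assumptions.

```python
# Toy sketch of the S805-S809 same-user check with a symmetric key. A keyed
# XOR keystream derived with SHA-256 stands in for a real symmetric cipher,
# which the embodiment does not name; all function names are illustrative.
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Derive n keystream bytes from the key (illustrative stand-in)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: encryption and decryption are the same XOR."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The server assigns the same symmetric key to the AR glasses and the first
# device when both log in with the same user account.
shared_key = secrets.token_bytes(16)

# S805: the glasses encrypt first preset information (e.g. a random number).
first_preset = secrets.token_bytes(8)
first_encrypted = xor_crypt(shared_key, first_preset)   # sent in S806

# S807: the first device decrypts with its own server-assigned key.
first_decrypted = xor_crypt(shared_key, first_encrypted)

# S809: same user if and only if decryption reproduces the preset info.
same_user = first_decrypted == first_preset
print(same_user)  # True, since both sides hold the same key

# With a key from a different account, the check fails.
mismatch = xor_crypt(secrets.token_bytes(16), first_encrypted) == first_preset
print(mismatch)  # False (with overwhelming probability)
```

A production implementation would of course use an authenticated cipher from a vetted library rather than this XOR stand-in; the sketch only shows why identical server-assigned keys make the comparison in S809 succeed.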
When the AR glasses 13 determine that the AR glasses 13 and the terminal 12 belong to the same user, that is, that the account with which the AR glasses 13 are logged in to the server 21 is the same as the account with which the terminal 12 is logged in to the server 21, the terminal 12 determines that the user has the authority to operate the terminal 12 through the AR glasses 13, or the AR glasses 13 determine that the user has the authority to operate the AR glasses 13; for example, the user may send data in the terminal 12 to the terminal 14 through the AR glasses 13.
On the basis of the above embodiment, the user may control the movable marker 52 to move on the display device 51 through the following possible implementations.
One possible implementation is as follows: the AR glasses 13 are equipped with an Inertial Measurement Unit (IMU), which can be used to sense posture information of the user; specifically, the IMU senses posture information of the user's head, where the posture information includes at least one of a yaw angle and a pitch angle. The AR glasses control the movement of the movable marker 52 on the display device 51 according to the yaw angle and/or pitch angle of the user's head as sensed by the IMU. Specifically, the AR glasses control the movable marker 52 to move in the horizontal direction of the display device 51 according to the yaw angle of the user's head, and control the movable marker 52 to move in the vertical direction of the display device 51 according to the pitch angle of the user's head.
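The yaw-to-horizontal and pitch-to-vertical mapping can be sketched as follows; the screen size, gain, and clamping behavior are illustrative assumptions, not values from the embodiment.

```python
# Illustrative mapping from IMU-sensed head yaw/pitch to movement of the
# movable marker 52; the screen size, gain, and clamping are assumptions.

def clamp(v, lo, hi):
    """Keep the marker inside the display area."""
    return max(lo, min(hi, v))

def move_marker(pos, yaw_deg, pitch_deg, width=1280, height=720, gain=8.0):
    """Yaw moves the marker horizontally, pitch vertically, per the text."""
    x, y = pos
    x = clamp(x + yaw_deg * gain, 0, width - 1)
    y = clamp(y - pitch_deg * gain, 0, height - 1)  # looking up moves it up
    return (x, y)

pos = (640, 360)                                     # marker at screen centre
pos = move_marker(pos, yaw_deg=5.0, pitch_deg=0.0)   # turn head to the right
print(pos)  # (680.0, 360.0)
pos = move_marker(pos, yaw_deg=0.0, pitch_deg=10.0)  # tilt head upward
print(pos)  # (680.0, 280.0)
```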
Another possible implementation is as follows: the AR glasses are equipped with a sensor, such as an eye tracker, that senses the movement of the user's eyes and changes of the eyes (for example, of the eyeballs). The AR glasses may sense the user's eye movement and eye changes through the eye tracker, and control the movable marker 52 to move on the display device 51 accordingly.
Yet another possible implementation is as follows: the AR glasses may further be connected to an input device, the input device including at least one of: a five-dimensional key, a joystick, a touch pad, a touch screen, or a virtual key. The AR glasses may be connected to the input device in a wired manner (such as a cable) or a wireless manner (such as Wi-Fi or Bluetooth).
Taking the five-dimensional key as an example, the five-dimensional key includes up, down, left, and right keys and a confirm key. The input device may be a remote controller matched with the AR glasses, and the remote controller is provided with the five-dimensional key. The AR glasses control the movable marker 52 to move on the display device 51 according to direction-indicating input information generated when the user clicks the up, down, left, or right key of the five-dimensional key.
Taking a joystick as an example, the AR glasses may control the movable marker 52 to move on the display device 51 according to input information indicating a direction generated when a user rotates the joystick.
Taking a touch screen as an example, the AR glasses may control the movable marker 52 to move on the display device 51 according to input information generated by a sliding or touch operation of the user on the touch screen.
Taking a touch pad as an example, the touch pad may be disposed on a frame of AR glasses, and the AR glasses may control the movable marker 52 to move on the display device 51 according to input information generated by a sliding or touch operation of a user on the touch pad.
Taking a virtual key as an example, the virtual key may be displayed on the display device 51, and the virtual key may be similar to a five-dimensional key. The AR glasses may control the movable marker 52 to move on the display device 51 according to input information generated by the user's operation of the virtual key.
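The direction-key handling described in the examples above can be sketched as follows; the step size, key names, and selection behavior are assumptions for illustration.

```python
# Illustrative handling of five-dimensional-key input for the movable
# marker 52; the step size, key names, and return values are assumptions.

STEP = 20  # pixels of marker movement per key press (assumed)
DIRECTIONS = {
    "up": (0, -STEP),
    "down": (0, STEP),
    "left": (-STEP, 0),
    "right": (STEP, 0),
}

def handle_key(pos, key):
    """Direction keys move the marker; the confirm key selects in place."""
    if key == "confirm":
        return pos, True            # selection event at current position
    dx, dy = DIRECTIONS[key]
    return (pos[0] + dx, pos[1] + dy), False

pos, selected = (100, 100), False
for key in ["right", "right", "down", "confirm"]:
    pos, selected = handle_key(pos, key)
print(pos, selected)  # (140, 120) True
```

Input from a joystick, touch pad, touch screen, or virtual key would feed the same marker-update logic, only with the direction deltas derived from the respective input events.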
The following describes a process for controlling information interaction between a plurality of terminals through a head-mounted device in detail with reference to a specific embodiment.
As shown in fig. 9, the terminal 12 is a mobile phone of the user, and the terminal 14 is a notebook computer of the user. The terminal 12 and the terminal 14 are within the viewing-angle range of the AR glasses 13, or the user may bring the terminal 12 and the terminal 14 into the viewing-angle range of the AR glasses 13 by turning the head. At this time, the icon of the terminal 12 and the icon of the terminal 14 may be displayed on the display device 51; taking as an example the case where the display device 51 is a lens of the AR glasses 13, the terminal 12 and the terminal 14 may be presented on the display device 51 by reflection of light.
Specifically, the user performs a moving operation on the movable marker 52; the moving operation is described in the above embodiments and is not repeated here. According to the user's moving operation on the movable marker 52, the AR glasses 13 may send to the terminal 12 an acquisition request for acquiring one or more pieces of operable information, which may specifically be one or more pieces of file information. The file information may specifically be a file identifier or a file icon, and the file identifier or file icon may specifically be a thumbnail of the file.
In one possible implementation, the user may move the movable marker 52 to a second preset position on the display device 51, where the distance between the position at which the terminal 12 is displayed on the display device 51 and the second preset position is less than or equal to a second preset distance. As shown in fig. 9, the user may move the movable marker 52 to a position point around the icon of the terminal 12. Alternatively, the user may move the movable marker 52 onto the icon of the terminal 12; at this time, the AR glasses 13 may send the acquisition request to the terminal 12 according to the user's moving operation on the movable marker 52. Upon receiving the acquisition request, the terminal 12 sends thumbnails of one or more files local to the terminal 12 to the AR glasses 13, for example, a thumbnail 94 of file A and a thumbnail 95 of file B. The AR glasses 13 then display the thumbnail 94 of file A and the thumbnail 95 of file B on the display device 51.
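The second-preset-distance check can be sketched as follows; the threshold value, message fields, and function names are illustrative assumptions.

```python
# Illustrative second-preset-distance check: when the movable marker 52 is
# close enough to the icon of the terminal 12, the glasses send the
# acquisition request. The threshold and message fields are assumptions.
import math

requests_sent = []

def maybe_send_acquisition_request(marker_pos, icon_pos, device_id,
                                   preset_distance=30.0):
    """Send the request once the marker is within the preset distance."""
    dist = math.hypot(marker_pos[0] - icon_pos[0],
                      marker_pos[1] - icon_pos[1])
    if dist <= preset_distance:
        requests_sent.append({"type": "acquire_file_info", "to": device_id})
        return True
    return False

print(maybe_send_acquisition_request((110, 205), (100, 200), "terminal12"))
# True: the marker is about 11 px from the icon
print(maybe_send_acquisition_request((400, 50), (100, 200), "terminal12"))
# False: the marker is far from the icon
print(len(requests_sent))  # 1
```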
In another possible implementation, when the movable marker 52 has been moved to a position point around the icon of the terminal 12, or onto the icon of the terminal 12 displayed on the display device 51, the user further inputs input information to the AR glasses 13, and the AR glasses 13 send the acquisition request described above to the terminal 12 according to the input information. The input information may be a gesture of the user and/or confirmation information input by the user via the input device described above.
Taking a gesture as an example, a shooting device, such as a camera, may further be installed on the AR glasses 13 and used to capture a gesture image of the user. For example, when the movable marker 52 moves onto the icon of the terminal 12 displayed on the display device 51, the AR glasses 13 capture a gesture image of the user through the camera. If the AR glasses 13 perform image processing on the gesture image and determine that the user's gesture is a preset gesture, for example, the gesture shown in fig. 10, the AR glasses 13 send the acquisition request to the terminal 12. The user's gesture may or may not be displayed on the display device 51. The preset gesture is not specifically limited here.
Taking the input device as an example, a five-dimensional key may be connected to the AR glasses 13 in a wireless or wired manner. For example, when the movable marker 52 has been moved onto the icon of the terminal 12 displayed on the display device 51, if the AR glasses 13 receive the confirmation information input by the user by clicking the confirm key of the five-dimensional key, the AR glasses 13 send the acquisition request described above to the terminal 12.
In some embodiments, after the terminal 12 receives the acquisition request, the thumbnail 94 of the file sent by the terminal 12 to the AR glasses 13 may also be displayed on an icon of the terminal 12 displayed on the display device 51, for example, as shown in fig. 11.
As shown in fig. 12, on the basis of fig. 9, the user may also move the movable marker 52 onto the thumbnail 94 so that the thumbnail 94 enters a selected state; at this time, the thumbnail 94 may be regarded as a target file icon. Further, the user may control the movable marker 52 to move so that the selected thumbnail 94 gradually approaches the icon of the terminal 14 displayed on the display device 51. The AR glasses 13 generate a control instruction according to the user's moving operation on the thumbnail 94, where the control instruction is used to control the source device where the target file corresponding to the thumbnail 94 is located, i.e. the terminal 12, to send the target file to the terminal 14.
In one possible implementation, after the user moves the movable marker 52 onto the thumbnail 94 so that the thumbnail 94 enters the selected state, the AR glasses 13 capture the user's gesture image through the camera. At this time, the user's gesture may be a moving gesture; as shown in fig. 13, as the user's gesture moves away from the user's body, the thumbnail 94 and the movable marker 52 gradually approach the icon of the terminal 14 displayed on the display device 51. When the thumbnail 94 and the movable marker 52 move to a first preset position on the display device 51, the AR glasses 13 generate a control instruction, where the distance between the display position of the terminal 14 on the display device 51 and the first preset position is less than or equal to a first preset distance. That is, if the AR glasses 13 determine that the user's moving gesture moves the thumbnail 94 to the first preset position by means of the movable marker 52, the AR glasses 13 generate a control instruction for controlling the terminal 12 to send the target file to the terminal 14.
It may be understood that the moving gesture shown in fig. 13 is an operation indicating that the user sends the target file to the terminal 14. In other embodiments, there may be other ways to control the thumbnail 94 and the movable marker 52 to gradually approach the icon of the terminal 14 displayed on the display device 51; in general, when the AR glasses 13 detect an operation indicating that the user sends the target file to the terminal 14, the AR glasses 13 may generate the control instruction for controlling the terminal 12 to send the target file to the terminal 14. For example, the manner of controlling the thumbnail 94 to gradually approach the icon of the terminal 14 displayed on the display device 51 is similar to the manner of controlling the movable marker 52 to move on the display device 51, and is not described here again.
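The first-preset-distance check that turns a drag into a control instruction can be sketched as follows; the threshold and the instruction's field names are illustrative assumptions.

```python
# Illustrative first-preset-distance check: once the selected thumbnail has
# been dragged near the icon of the terminal 14, a control instruction is
# generated. The threshold and the instruction fields are assumptions.
import math

FIRST_PRESET_DISTANCE = 40.0  # pixels (assumed)

def drag_thumbnail(thumb_pos, target_icon_pos, file_id, source_id, target_id):
    """Return a control instruction once the thumbnail is close enough."""
    dist = math.hypot(thumb_pos[0] - target_icon_pos[0],
                      thumb_pos[1] - target_icon_pos[1])
    if dist <= FIRST_PRESET_DISTANCE:
        return {"to": source_id,           # instruction goes to terminal 12
                "action": "send_file",
                "file": file_id,           # identifier of the target file
                "destination": target_id}  # identification of terminal 14
    return None

# Thumbnail still far from the laptop icon: no instruction yet.
none_result = drag_thumbnail((200, 200), (600, 300),
                             "fileA", "terminal12", "terminal14")
print(none_result)  # None

# Thumbnail dropped next to the laptop icon: instruction generated.
instr = drag_thumbnail((590, 310), (600, 300),
                       "fileA", "terminal12", "terminal14")
print(instr["destination"])  # terminal14
```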
It is understood that the present embodiment is schematically illustrated by taking the example of transmitting the file in the terminal 12 to the terminal 14 through the AR glasses 13, and similarly, the file in the terminal 14 may also be transmitted to the terminal 12 through the AR glasses 13, so that the terminal 14 and the terminal 12 can perform mutual transmission of the file through the AR glasses 13. In addition, the terminal performing information interaction through the AR glasses 13 may not be limited to the terminal 12 and the terminal 14, and may further include more terminals, where information interaction between two terminals is similar to this, and is not described herein again.
In this embodiment, the head-mounted device sends an acquisition request to the first device according to the user's moving operation on a movable marker displayed on the display device of the head-mounted device, to request one or more pieces of file information, and the head-mounted device displays the one or more pieces of file information sent by the first device on the display device. Further, the head-mounted device may generate, according to the user's operation on target file information among the one or more pieces of file information, a control instruction for controlling the first device to send the target file corresponding to the target file information to the second device, so that the first device sends the local file to the second device; that is, the user can send a file in the first device to the second device through the head-mounted device.
In one possible case, the thumbnails of the one or more files that the terminal 12, as described in the above embodiment, sends to the AR glasses 13 in response to the acquisition request sent by the AR glasses 13 may specifically be thumbnails of one or more target files selected in advance by the user on the terminal 12. The following describes in detail, with reference to a specific embodiment, the process in which the AR glasses 13 receive the thumbnails of one or more target files sent by the terminal 12 and the AR glasses 13 control the terminal 12 to send the target files to the terminal 14. As shown in fig. 14, the process includes the following steps:
S1401, the AR glasses and the terminal 12 mutually confirm that both parties are logged in to the same account.
The process of the AR glasses and the terminal 12 mutually confirming that the two parties log in the same account is specifically described above, and is not described herein again.
S1402, the terminal 12 determines one or more target files selected by the user on the terminal 12.
For example, as shown in fig. 15, a folder 91 is displayed on the home screen of the terminal 12, and the folder 91 includes two files. After the user clicks the folder 91, the terminal 12 displays the file 92 and the file 93 included in the folder 91. Further, the user selects the file 92, and the file 92 is the target file selected by the user; that is, in this embodiment, the operation of selecting the target file is performed by the user directly on the terminal 12.
S1403, the AR glasses send an acquisition request to the terminal 12, where the acquisition request is used to request the terminal 12 to send the thumbnail of the target file to the AR glasses.
As shown in fig. 15, the AR glasses may transmit an acquisition request for requesting the terminal 12 to transmit a thumbnail of a target file to the AR glasses to the terminal 12 according to an operation of the user moving the movable marker 52 onto the icon of the terminal 12.
S1404, the terminal 12 transmits the thumbnail of the target file to the AR glasses.
As shown in fig. 15, the terminal 12 transmits a thumbnail of the file 92 to the AR glasses, and the thumbnail is denoted as a thumbnail 94.
S1405, the AR glasses display the thumbnail of the target file on the display device.
The AR glasses display thumbnails 94 of files 92 on display device 51.
S1406, the AR glasses generate a control instruction according to an operation of the user moving the thumbnail of the target file to the periphery of the icon of the terminal 14.
Specifically, the user may control the movable marker 52 to move onto the thumbnail 94 of the file 92 so that the thumbnail 94 enters a selected state; further, the user may control the movable marker 52 to move so that the selected thumbnail 94 gradually approaches the icon of the terminal 14 displayed on the display device 51. When the AR glasses determine that the thumbnail 94 has moved to a first preset position around the icon of the terminal 14, the AR glasses generate a control instruction, where the control instruction includes identification information of the terminal 14 and an identifier of the target file, and the control instruction is used to control the source device where the target file corresponding to the thumbnail 94 is located, i.e. the terminal 12, to send the target file to the terminal 14. The process in which the user moves the movable marker 52 so that the selected thumbnail 94 gradually approaches the icon of the terminal 14 is described in the above embodiments and is not repeated here.
S1407, the AR glasses transmit a control instruction to the terminal 12.
S1408, the terminal 12 sends the target file to the terminal 14.
In this embodiment, the terminal 14 and the terminal 12 may or may not belong to the same user. The process in which the terminal 14 and the terminal 12 mutually confirm whether both parties are logged in to the same account is similar to the process in which the AR glasses and the terminal 12 do so, and is not repeated here.
When the terminal 14 and the terminal 12 belong to the same user, the terminal 14 directly receives the target file after the terminal 12 sends it.
When the terminal 14 and the terminal 12 do not belong to the same user, after the terminal 12 sends the target file to the terminal 14, the terminal 14 needs to determine whether to receive the target file according to a setting or selection of the user of the terminal 14. For example, the terminal 14 presents prompt information to its user, the prompt information being used to ask the user whether to receive the target file, and the terminal 14 receives the target file when its user confirms reception.
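The same-account versus different-account receiving rule can be sketched as follows; the function and callback names are assumptions for illustration.

```python
# Illustrative sketch of the receiving rule: a same-account transfer is
# accepted directly, while a cross-account transfer asks the receiving
# user first. Function and callback names are assumptions.

def receive_file(sender_account, receiver_account, file_id, confirm_cb):
    """Return True when terminal 14 accepts the target file."""
    if sender_account == receiver_account:
        return True                 # same user: receive directly
    return confirm_cb(file_id)      # different user: prompt the user

# Same user: accepted without prompting.
print(receive_file("alice", "alice", "fileA", confirm_cb=lambda f: False))  # True
# Different users: the receiving user's choice decides.
print(receive_file("alice", "bob", "fileA", confirm_cb=lambda f: True))     # True
print(receive_file("alice", "bob", "fileA", confirm_cb=lambda f: False))    # False
```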
S1409, the AR glasses stop displaying the thumbnail of the target file.
The present embodiment does not limit the execution order of S1408 and S1409.
In one possible scenario, after the terminal 12 transmits the target file to the terminal 14, the AR glasses stop displaying the thumbnail of the target file.
In another possible case, when the thumbnail 94 is moved to the vicinity of the icon of the terminal 14 displayed on the display device 51, the AR glasses may control the thumbnail 94 moved along with the movable marker 52 to fade away.
In yet another possible scenario, when the thumbnail 94 moves to the vicinity of the icon of the terminal 14 displayed on the display device 51, the AR glasses may control the thumbnail 94 moving with the movable marker 52 to fade away, but the thumbnail 94 may still be displayed in the icon of the terminal 12 displayed on the display device 51.
It may be understood that in this embodiment the terminal 12 and the terminal 14 need not be displayed on the display device 51 at the same time. That is, the terminal 12 may be displayed on the display device 51 first; after the terminal 12 sends the thumbnail of the target file to the AR glasses, the user may turn the head so that the thumbnail of the target file moves on the display device, so that the terminal 14 appears on the display device 51 and the thumbnail of the target file gradually approaches the icon of the terminal 14 displayed on the display device 51.
In this embodiment, the AR glasses and the first device mutually confirm that both are logged in to the same account before the user sends the target file in the first device to the second device through the AR glasses. This prevents a situation in which, when the AR glasses and the first device belong to different users, a file in the first device could be sent to another device through the AR glasses without the knowledge of the user of the first device, thereby improving the security of the first device.
In another possible case, in response to the acquisition request sent by the AR glasses 13 to the terminal 12 as described in the above embodiment, the thumbnails of the one or more files sent to the AR glasses 13 may specifically be thumbnails of the user interface displayed by the terminal 12 in real time; alternatively, the information of the one or more files sent by the terminal 12 to the AR glasses 13 is the user interface displayed by the terminal 12 in real time. The following describes in detail, with reference to a specific embodiment, the process in which the AR glasses 13 receive the user interface transmitted by the terminal 12 and control the terminal 12 to transmit that user interface to the terminal 14. As shown in fig. 16, the process includes the following steps:
S1601, the AR glasses and the terminal 12 mutually confirm that both are logged in to the same account.
S1602, the AR glasses send an acquisition request to the terminal 12, where the acquisition request is used to request the terminal 12 to send the user interface displayed by the terminal 12 in real time to the AR glasses.
S1603, the AR glasses receive the user interface displayed by the terminal 12 in real time.
S1604, the AR glasses display, on the display device, the user interface displayed by the terminal 12 in real time.
S1605, the AR glasses generate a control instruction according to an operation of the user moving the user interface to the periphery of the icon of the terminal 14.
S1606, the AR glasses send a control instruction to the terminal 12, where the control instruction includes the identification information of the terminal 14 and is used to control the terminal 12 to send the user interface to the terminal 14.
S1607, the terminal 12 sends the user interface displayed by the terminal 12 in real time to the terminal 14.
It will be appreciated that the AR glasses may control the user interface of the terminal 12 to fade away on the display device 51 when the user moves that user interface around the icon of the terminal 14. Alternatively, after the user interface of the terminal 12 gradually disappears from the display device 51, it may still be displayed in the icon of the terminal 12 on the display device 51, with only the copy moving along with the movable mark 52 disappearing.
The implementation process and specific principle of S1601 to S1607 may refer to S1401 to S1409 described above, and are not described herein again.
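The S1601 to S1607 exchange above can be sketched as a small simulation. This is purely illustrative: the class names, message fields, and method names are invented for the sketch and do not appear in the patent.

```python
# Illustrative sketch (hypothetical names): the AR glasses request the
# real-time user interface from terminal 12 (S1602-S1604), then send a
# control instruction carrying the identification of terminal 14 so that
# terminal 12 forwards its user interface there (S1605-S1607).

class Terminal:
    def __init__(self, ident: str):
        self.ident = ident
        self.current_ui = f"ui-of-{ident}"   # stand-in for the real-time UI
        self.received = []                   # UIs forwarded to this terminal

    def handle_acquisition_request(self) -> str:
        # S1603: reply with the user interface displayed in real time
        return self.current_ui

    def handle_control_instruction(self, instruction: dict, peers: dict) -> None:
        # S1607: forward the current UI to the terminal named in the instruction
        target = peers[instruction["target_id"]]
        target.received.append(self.current_ui)

class ArGlasses:
    def __init__(self):
        self.display = None

    def acquire_and_display(self, source: Terminal) -> None:
        # S1602-S1604: request, receive, and display the UI
        self.display = source.handle_acquisition_request()

    def send_to(self, source: Terminal, target_id: str, peers: dict) -> None:
        # S1605-S1606: generated when the user moves the UI near the target icon
        source.handle_control_instruction({"target_id": target_id}, peers)
```

The point of the sketch is the indirection: the glasses never relay the interface themselves; they only instruct terminal 12, which transmits directly to terminal 14.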
In this embodiment, the AR glasses and the first device mutually confirm that both are logged in to the same account before the user sends the user interface displayed by the first device in real time to the second device through the AR glasses. This prevents a situation in which, when the AR glasses and the first device belong to different users, the user interface of the first device could be sent to another device through the AR glasses without the knowledge of the user of the first device, thereby improving the security of the user interface displayed by the first device in real time.
In yet another possible scenario, the user may also select a target file from the terminal 12 through the AR glasses 13. The following describes in detail, with reference to a specific embodiment, the process in which the user selects a target file from the terminal 12 through the AR glasses 13, the AR glasses 13 receive the target file transmitted by the terminal 12, and the AR glasses 13 control the terminal 12 to transmit the target file to the terminal 14. As shown in fig. 17, the process includes the following steps:
S1701, the AR glasses and the terminal 12 mutually confirm that both are logged in to the same account.
S1702, the AR glasses send an acquisition request to the terminal 12, where the acquisition request is used to request the terminal 12 to send the user interface displayed by the terminal 12 to the AR glasses.
S1703, the terminal 12 transmits the user interface of the terminal 12 to the AR glasses, the user interface of the terminal 12 including thumbnails of the plurality of files.
S1704, the AR glasses display the user interface of the terminal 12 on the display device.
S1705, the AR glasses generate an acquisition request according to the user's selection operation on the thumbnail of the target file among the thumbnails of the plurality of files.
It is understood that, if thumbnails of a plurality of files are not included in the user interface of the terminal 12, the AR glasses send a user interface update request to the terminal 12 to request an updated user interface. If the updated user interface includes thumbnails of a plurality of files, the AR glasses receive the user's selection operation on the thumbnail of the target file among them.
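The update-request behavior just described can be sketched as a small retry loop. The function name, the dictionary shape of the interface, and the bounded retry budget are assumptions made for this sketch, not details from the patent.

```python
# Illustrative sketch (hypothetical names): if the received user interface
# contains no file thumbnails, keep requesting an updated interface until
# thumbnails appear, with a bound so the loop cannot run forever.

def request_ui_with_thumbnails(fetch_ui, max_requests: int = 5):
    """fetch_ui() stands in for one request/response round trip and returns a
    dict such as {"thumbnails": [...]}. Returns the first interface that
    contains thumbnails, or None if the request budget is exhausted."""
    for _ in range(max_requests):
        ui = fetch_ui()
        if ui.get("thumbnails"):
            return ui
    return None
```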
S1706, the AR glasses send an acquisition request to the terminal 12, where the acquisition request is used to request the terminal 12 to send the target file to the AR glasses.
S1707, the terminal 12 sends the target file to the AR glasses.
S1708, the AR glasses generate a control instruction according to an operation of the user moving the target file to the periphery of the icon of the terminal 14.
S1709, the AR glasses send a control instruction to the terminal 12, where the control instruction includes identification information of the terminal 14, and the control instruction is used to request the terminal 12 to send the target file to the terminal 14.
S1710, the terminal 12 sends the target file to the terminal 14.
It is understood that, in the present embodiment, the terminal 12 may transmit file information, such as a thumbnail of a file, to the AR glasses according to the acquisition request of the AR glasses. In addition, in the embodiment of the present application, the terminal 12 may also send the file itself, for example, the target file, to the AR glasses according to the acquisition request of the AR glasses.
In the embodiment, the user selects the target file from the first device through the AR glasses, and further, the target file in the first device is sent to the second device through the AR glasses, so that the flexibility of selecting the target file by the user and the transmission efficiency of the target file are improved.
In addition, on the basis of the above embodiments, the AR glasses may further be provided with three or more antennas supporting Bluetooth 5.1, the three or more antennas not lying on a straight line, while the terminals around the AR glasses may be configured with a single antenna supporting Bluetooth 5.1. Terminals around the AR glasses may continuously broadcast wireless signals, for example Bluetooth signals, and the AR glasses may detect these Bluetooth signals and determine the angle of each terminal relative to the AR glasses using Angle-of-Arrival (AOA) or Angle-of-Departure (AOD) techniques. Further, the AR glasses may determine whether a terminal is within the viewing angle range of the AR glasses according to the angle and distance of the terminal relative to the AR glasses.
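As a rough illustration of the direction-finding idea: Bluetooth 5.1 AOA estimation is based on phase differences measured across an antenna array, and for a simple two-element pair the arrival angle can be recovered from the standard relation θ = arcsin(Δφ·λ/(2π·d)). The antenna spacing, wavelength, and field-of-view values below are assumed for the sketch; a real implementation would use the full multi-antenna array and the controller's IQ samples.

```python
# Illustrative sketch (assumed constants): estimate the angle of arrival of a
# Bluetooth signal from the phase difference between two antennas spaced
# d metres apart, then check whether the transmitting terminal lies within
# the glasses' horizontal viewing angle.

import math

WAVELENGTH_M = 0.125          # roughly the 2.4 GHz carrier wavelength
ANTENNA_SPACING_M = 0.0625    # assumed half-wavelength antenna spacing

def angle_of_arrival(phase_diff_rad: float) -> float:
    """Return the estimated arrival angle in degrees from the phase difference."""
    s = phase_diff_rad * WAVELENGTH_M / (2 * math.pi * ANTENNA_SPACING_M)
    s = max(-1.0, min(1.0, s))      # clamp against measurement noise
    return math.degrees(math.asin(s))

def within_viewing_angle(angle_deg: float, fov_deg: float = 52.0) -> bool:
    """True when the terminal lies inside the assumed horizontal field of view."""
    return abs(angle_deg) <= fov_deg / 2
```

A terminal straight ahead produces zero phase difference and an angle of 0 degrees; larger phase differences map to terminals further off-axis, which the glasses can then exclude from the displayed icons.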
In some embodiments, the terminals around the AR glasses may be configured with three or more antennas supporting bluetooth 5.1, the AR glasses being configured with a single antenna supporting bluetooth 5.1.
It is to be understood that the number of terminals around the AR glasses is not limited in this embodiment; there may be one or more. As shown in fig. 18, around the AR glasses 13 are a terminal 12, a terminal 14, and a sound box 15. The AR glasses 13 determine whether the terminal 12, the terminal 14, and the sound box 15 are within the viewing angle range of the AR glasses 13 according to the Bluetooth signals they respectively broadcast. Further, the user may control, through the AR glasses 13, the functions of a terminal within the viewing angle range of the AR glasses 13. For example, when the sound box 15 is within the viewing angle range of the AR glasses 13, controlling the functions of the sound box 15 through the AR glasses 13 may specifically include the steps shown in fig. 19:
S1901, the AR glasses and the sound box 15 mutually confirm that both are logged in to the same account.
In the embodiment of the present application, the sound box 15 may be regarded as a first device. The AR glasses 13 and the speaker 15 can mutually confirm that the two parties log in the same account, and then the AR glasses 13 perform function setting on the speaker 15. The process of the AR glasses and the sound box 15 mutually confirming that the login of the two parties is the same account is similar to the process of the AR glasses and the terminal 12 mutually confirming that the login of the two parties is the same account, and details are not repeated here.
S1902, the AR glasses send an acquisition request to the speaker 15, where the acquisition request is used to request to acquire a function setting icon.
As shown in fig. 20, when the sound box 15 is within the viewing angle range of the AR glasses, the display device 51 of the AR glasses displays an icon of the sound box 15. The AR glasses may then send an acquisition request to the sound box 15 according to the user's movement operation on the movable marker 52, where the acquisition request is used to request one or more pieces of operable information, the one or more pieces of operable information being specifically one or more function setting icons.
In one possible implementation, the user may move the movable marker 52 to a second preset position on the display device 51, where the distance between the position displayed by the sound box 15 on the display device 51 and the second preset position is smaller than or equal to the second preset distance. That is, the user may move the movable marker 52 to a certain location point around the icon of the loudspeaker 15. Alternatively, the user may move the movable marker 52 to an icon of the sound box 15, and at this time, the AR glasses 13 may send the acquisition request to the sound box 15 according to the movement operation of the user on the movable marker 52.
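The proximity trigger described above (the marker reaching a point whose distance from the icon is at most the second preset distance) amounts to a simple Euclidean-distance test. The coordinates and the threshold value below are assumptions for illustration.

```python
# Illustrative sketch (assumed threshold): the acquisition request is sent
# once the movable marker comes within a preset distance of the device icon
# on the display device, or lands on the icon itself (distance zero).

import math

def should_send_acquisition_request(marker_pos, icon_pos,
                                    preset_distance: float = 30.0) -> bool:
    """True when the marker is at most preset_distance pixels from the icon."""
    dx = marker_pos[0] - icon_pos[0]
    dy = marker_pos[1] - icon_pos[1]
    return math.hypot(dx, dy) <= preset_distance
```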
S1903, the speaker 15 sends a plurality of function setting icons to the AR glasses.
When the speaker 15 receives the acquisition request, it sends a plurality of function setting icons to the AR glasses.
S1904, the AR glasses display a plurality of function setting icons on the display device.
As shown in fig. 21, upon receiving the plurality of function setting icons, the AR glasses 13 display them on the display device 51 in the vicinity of the icon of the sound box 15, where the function setting icon 141 represents any one of the plurality of function setting icons. Different ones of the plurality of function setting icons may be used to set different functions of the sound box 15. For example, the function setting icon 141 may be used to set an alarm function for the sound box 15, and the function setting icon 161 may be used to set the volume of the sound box 15.
It will be appreciated that in some embodiments, the function setting icons may not be obtained from the device being set, such as the speaker 15, for example, the AR glasses 13 may be pre-configured with function setting icons for different types of devices. For example, the AR glasses 13 may be preset with a function setting icon of a speaker, a function setting icon of a mobile phone, a function setting icon of a tablet computer, and the like. When the AR glasses 13 determine that the speaker 15 is within the range of viewing angles of the AR glasses 13, the AR glasses 13 may display a function setting icon corresponding to the speaker 15 on the display device 51.
S1905, the AR glasses generate a control instruction according to the user's operation on the target function setting icon among the plurality of function setting icons.
Further, the user may operate a target function setting icon among the plurality of function setting icons. For example, the user may control the movable marker 52 to move onto the function setting icon 141 by rotating the head, and when the AR glasses detect that the movable marker 52 has moved onto the function setting icon 141, the AR glasses generate a control instruction for setting an alarm function of the sound box 15. It will be appreciated that the user's operation on the target function setting icon, such as the function setting icon 141, is not limited to moving the movable marker 52 onto it; there may be other operations, for example, the user controls the movable marker 52 to move onto the function setting icon 141 and then inputs information to the AR glasses 13, where that input may be a gesture of the user and/or a confirmation message entered through the input device described above.
In some scenarios, there may be multiple sound boxes within the viewing angle range of the AR glasses 13, in which case the display device 51 of the AR glasses 13 may display icons of the multiple sound boxes and the function setting icons corresponding to each. When the user needs to set a function on a target sound box among them, the user may perform a selection operation on the icon of the target sound box and on the target function setting icon, and the AR glasses 13 may generate a control instruction according to these two selection operations, the control instruction being used for the target sound box to perform the function setting corresponding to the target function setting icon.
S1906, the AR glasses send a control instruction to the sound box, and the control instruction is used for setting functions of the sound box.
For example, after the sound box 15 receives the control instruction, the sound box 15 sets an alarm function of the sound box 15 according to the control instruction, where the alarm function may be a privacy function of the sound box 15.
S1907, the sound box sends a function setting response to the AR glasses.
For example, after the speaker 15 successfully sets the alarm function of the speaker 15, the speaker 15 may also send a function setting response to the AR glasses.
In addition, if the AR glasses and the sound box 15 mutually confirm that they are not logged in to the same account, then after the AR glasses send the acquisition request to the sound box 15, the sound box 15 may send only one function setting icon to the AR glasses, and the AR glasses may display it on the display device 51, as with the function setting icon 161 shown in fig. 22, which may only be used to adjust the volume of the sound box 15.
For example, when the AR glasses and the sound box 15 belong to the same user, the permission with which the user sets functions of the sound box 15 through the AR glasses is denoted the first permission; when the AR glasses and the sound box 15 do not belong to the same user, the permission with which the user wearing the AR glasses sets functions of the sound box 15 through the AR glasses is denoted the second permission.
One possible scenario is that the first permission is greater than the second permission. In this way, when the AR glasses and the sound box belong to the same user, the user can set privacy information on the sound box through the AR glasses, while when the AR glasses and the sound box do not belong to the same user, the user wearing the AR glasses can only perform simple function settings on the sound box through the AR glasses, thereby improving the security of the sound box and of the user's privacy information.
In another possible case, the first permission is equal to the second permission, that is, the function setting permissions provided by the sound box for different users are the same, so that when the AR glasses and the sound box do not belong to the same user, the user wearing the AR glasses can also set the privacy information of the sound box through the AR glasses, where the user wearing the AR glasses may not be the affiliation user of the sound box.
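The first-permission/second-permission distinction above can be expressed as a simple lookup keyed on whether the accounts match. The concrete permission sets below (volume, alarm, privacy information) are invented for illustration; the patent only requires that the first permission be greater than or equal to the second.

```python
# Illustrative sketch (assumed permission sets): the glasses receive the
# larger first permission when logged in with the same account as the sound
# box, and the smaller second permission otherwise.

FIRST_PERMISSION = {"set_volume", "set_alarm", "set_privacy_info"}
SECOND_PERMISSION = {"set_volume"}          # simple function settings only

def permissions(glasses_account: str, speaker_account: str) -> set:
    """Return the function-setting permissions the glasses hold on the sound box."""
    if glasses_account == speaker_account:
        return FIRST_PERMISSION
    return SECOND_PERMISSION
```

The equal-permission case described above corresponds to setting both constants to the same set, in which case any wearer may change privacy settings.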
According to this embodiment, the head-mounted device generates, according to the user's operation on a function setting icon, a function control instruction corresponding to that icon and sends it to the first device, so that the first device performs the corresponding function setting. That is, the user can control functions of the first device through the head-mounted device; even when the first device or the user is in a state in which movement is inconvenient, the user can still control the first device through the head-mounted device, thereby improving the convenience with which the user controls devices.
It is to be understood that some or all of the steps or operations in the above-described embodiments are merely examples, and embodiments of the present application may perform other operations or variations of the various operations. Further, the various steps may be performed in an order different from that presented in the above-described embodiments, and it is possible that not all of the operations in the above-described embodiments need be performed.
It is to be understood that, in the above embodiments, the operations or steps implemented by the head-mounted device may also be implemented by a component (e.g., a chip or a circuit) available for the head-mounted device, and the operations or steps implemented by the terminal, e.g., the first device, may also be implemented by a component (e.g., a chip or a circuit) available for the first device.
Fig. 23 shows a schematic configuration of a communication apparatus. The communication apparatus may be configured to implement the method of the corresponding part of the head-mounted device, the method of the corresponding part of the first device, or the method of the corresponding part of the second device described in the above method embodiments, for specific reference to the description in the above method embodiments.
The communication device 230 may include one or more processors 231, and the processors 231 may also be referred to as processing units and may implement certain control functions. The processor 231 may be a general-purpose processor or a special-purpose processor, etc.
In an alternative design, the processor 231 may also have instructions 233 stored therein, which may be executed by the processor to enable the communication apparatus 230 to perform the method corresponding to the head-mounted device or the first device described in the above method embodiment.
In yet another possible design, communication device 230 may include circuitry that may implement the functionality of transmitting or receiving or communicating in the foregoing method embodiments.
Optionally, one or more memories 232 may be included in the communication device 230, on which instructions 234 or intermediate data are stored, and the instructions 234 may be executed on the processor, so that the communication device 230 performs the method described in the above method embodiments. Optionally, other related data may also be stored in the memory. Optionally, instructions and/or data may also be stored in the processor. The processor and the memory may be provided separately or may be integrated together.
Optionally, the communication device 230 may further include a transceiver 235.
The processor 231 may be referred to as a processing unit. The transceiver 235 may be referred to as a transceiver unit, a transceiver, a transceiving circuit, or the like, and is used for performing the transceiving functions of the communication device.
If the communication device is used to implement the operation of the AR glasses corresponding to the embodiment shown in fig. 4, for example, the transceiver may send the first encryption information to the first device. The transceiver may further perform other corresponding communication functions. And the processor is used for completing corresponding determination or control operations, and optionally, corresponding instructions can also be stored in the memory. The specific processing manner of each component can be referred to the related description of the previous embodiment.
If the communication apparatus is used to implement the operation corresponding to the first device in fig. 4, for example, the first encrypted information transmitted by the AR glasses may be received by the transceiver, and the first decrypted information may be transmitted to the AR glasses by the transceiver. The transceiver may further perform other corresponding communication functions. And the processor is used for completing corresponding determination or control operations, and optionally, corresponding instructions can also be stored in the memory. The specific processing manner of each component can be referred to the related description of the previous embodiment.
The processors and transceivers described herein may be implemented on Integrated Circuits (ICs), analog ICs, Radio Frequency Integrated Circuits (RFICs), mixed-signal ICs, Application Specific Integrated Circuits (ASICs), Printed Circuit Boards (PCBs), electronic devices, and the like. The processor and transceiver may also be fabricated using various IC process technologies, such as Complementary Metal Oxide Semiconductor (CMOS), N-type Metal Oxide Semiconductor (NMOS), P-type Metal Oxide Semiconductor (PMOS), Bipolar Junction Transistor (BJT), Bipolar CMOS (BiCMOS), silicon germanium (SiGe), gallium arsenide (GaAs), and the like.
Alternatively, the communication means may be a stand-alone device or may be part of a larger device. For example, the device may be:
(1) a stand-alone integrated circuit IC, or chip, or system-on-chip or subsystem;
(2) a set of one or more ICs, which optionally may also include storage components for storing data and/or instructions;
(3) an ASIC, such as a modem (MSM);
(4) a module that may be embedded within other devices;
(5) receivers, terminals, cellular telephones, wireless devices, handsets, mobile units, network devices, and the like;
(6) others, and so forth.
Fig. 24 is a schematic structural diagram of a communication device according to an embodiment of the present application. As shown in fig. 24, the communication device 240 includes: a display module 2401, a generation module 2402 and a sending module 2403; the display module 2401 is used for displaying information of one or more devices and one or more pieces of operational information; the generating module 2402 is configured to generate a control instruction for controlling a first device according to an operation of a user on target operational information, where the target operational information is one of the one or more operational information, and the first device is one of the one or more devices; the sending module 2403 is configured to send the control instruction to the first device.
Optionally, the one or more operational information comprises one or more file information; the generating module 2402, when generating a control instruction for controlling the first device according to the operation of the user on the target operable information, is specifically configured to: and generating a control instruction for controlling the first device according to the operation of the user on the target file information, wherein the target file information is one of the one or more pieces of file information, the first device is a device where a target file corresponding to the target file information is located, and the control instruction is used for controlling the first device to send the target file to a second device, and the second device is one of the one or more devices.
Optionally, the operation of the user on the target file information includes: and the operation is used for indicating that the user sends the target file corresponding to the target file information to the second device.
Optionally, when the generating module 2402 generates a control instruction for controlling the first device according to the operation of the user on the target file information, the generating module is specifically configured to: and generating a control instruction for controlling the first device according to the operation that the user moves the target file information to a first preset position in display equipment of the head-mounted device, wherein the distance between the position of the second device in the display equipment and the first preset position is smaller than or equal to a first preset distance.
Optionally, the one or more file information includes at least one of: one or more file identifications, one or more file icons.
Optionally, the one or more operational information comprises one or more function setting icons; the generating module 2402, when generating a control instruction for controlling the first device according to the operation of the user on the target operable information, is specifically configured to: and generating a control instruction for controlling the first device according to the operation of the user on a target function setting icon, wherein the target function setting icon is one of the one or more function setting icons, and the control instruction is used for setting the function of the first device.
Optionally, when the generating module 2402 generates a control instruction for controlling the first device according to the operation of the user on the target function setting icon, the generating module is specifically configured to: and generating a control instruction for controlling the first equipment according to the selection operation of the target function setting icon by the user and the selection operation of the information of the first equipment by the user.
Optionally, the information of the one or more devices includes: icons of the one or more devices; the one or more operational information is displayed in an icon of the one or more devices.
Optionally, when the display module 2401 displays information of one or more devices, it is specifically configured to: displaying icons of one or more devices within a range of viewing angles of the head mounted device.
Optionally, the communication device 240 further includes: a detection module 2404, a determination module 2405; the detecting module 2404 is configured to detect a wireless signal sent by one or more devices before the displaying module 2401 displays icons of the one or more devices within a viewing angle range of the head-mounted device; the determining module 2405 is configured to determine an angle and a distance of one or more devices relative to the head-mounted device according to a wireless signal transmitted by the one or more devices; determining one or more devices within a range of viewing angles of the head mounted device according to the angle and distance of the one or more devices relative to the head mounted device.
Optionally, the communication device 240 further includes: a receiving module 2406; the sending module 2403 is further configured to: before the display module 2401 displays one or more pieces of operable information, sending an acquisition request to the first device according to a movement operation of the user on a movable marker displayed on a display device of the head-mounted device, wherein the acquisition request is used for acquiring the one or more pieces of operable information; the receiving module 2406 is configured to receive the one or more operational information from the first device.
Optionally, the sending module 2403, when sending the obtaining request to the first device according to the movement operation of the user on the movable mark displayed on the display device of the head-mounted device, is specifically configured to: and sending the acquisition request to the first device according to the operation that the user moves the movable mark displayed on the display device of the head-mounted device to a second preset position, wherein the distance between the position of the first device in the display device and the second preset position is smaller than or equal to the second preset distance.
Optionally, the communication device 240 further includes: a verification module 2407, where the verification module 2407 is configured to verify whether an account that the head-mounted device logs in a server is the same as an account that the first device logs in the server before the sending module 2403 sends an acquisition request to the first device according to a movement operation of the user on a movable mark displayed on a display device of the head-mounted device; when the account number of the head-mounted device logged in the server is the same as the account number of the first device logged in the server, the control authority of the head-mounted device on the first device is a first authority; when the account number of the head-mounted device logged in the server is different from the account number of the first device logged in the server, the control authority of the head-mounted device on the first device is a second authority; wherein the first right is greater than or equal to the second right.
Optionally, when verifying whether the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server, the verification module 2407 is specifically configured to: encrypt first preset information using a key distributed to the head-mounted device by the server, to obtain first encrypted information; the sending module 2403 is configured to send the first encrypted information to the first device; the receiving module 2406 is configured to receive first decryption information obtained by the first device decrypting the first encrypted information; if the first decryption information is the same as the first preset information, the determining module 2405 determines that the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server; if the first decryption information is different from the first preset information, the determining module 2405 determines that the two accounts are different.
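The verification flow just described can be sketched end to end: the head-mounted device encrypts preset information with its server-distributed key, the first device decrypts with its own server-distributed key, and a match proves both keys (and hence both accounts) are the same. The XOR "cipher" below is a deliberately toy stand-in for a real symmetric cipher such as AES and must not be used in practice; all names are hypothetical.

```python
# Illustrative sketch (toy cipher, NOT for real use): same-account
# verification by encrypt-then-decrypt with per-account keys. A SHA-256
# counter keystream XORed with the data plays the role of the symmetric
# cipher; decryption with the wrong key yields garbage, so the comparison
# with the preset information fails.

import hashlib

def _keystream(key: str, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return out[:n]

def encrypt(key: str, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))

decrypt = encrypt   # XOR with the same keystream is its own inverse

def same_account(glasses_key: str, device_key: str,
                 preset_info: bytes = b"first-preset-info") -> bool:
    # Head-mounted device encrypts; first device decrypts and returns the result;
    # the head-mounted device compares it against the preset information.
    first_encrypted = encrypt(glasses_key, preset_info)
    first_decrypted = decrypt(device_key, first_encrypted)
    return first_decrypted == preset_info
```

Because the server distributes the same key only to devices logged in with the same account, a successful round trip establishes the account match without transmitting the key itself.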
The communication apparatus of the embodiment shown in fig. 24 may be used to implement the technical solutions of the above method embodiments, and the implementation principles and technical effects of the technical solutions may further refer to the relevant descriptions in the method embodiments, and optionally, the communication apparatus may be a head-mounted device, and may also be a component (e.g., a chip or a circuit) of the head-mounted device.
Fig. 25 is a schematic structural diagram of another communication device according to an embodiment of the present application. As shown in fig. 25, the communication device 250 includes a receiving module 2501 and a processing module 2502. The receiving module 2501 is configured to receive a control instruction from a head-mounted device, where the control instruction is an instruction generated by the head-mounted device according to an operation performed by the user on target operable information, the head-mounted device displays information of one or more devices and one or more operable information, the target operable information is one of the one or more operable information, and the first device is one of the one or more devices; the processing module 2502 is configured to execute the control instruction.
Optionally, the one or more operable information comprises one or more file information; the control instruction is an instruction that is generated by the head-mounted device according to an operation performed by the user on target file information and that is used for controlling the first device, where the target file information is one of the one or more file information, the first device is the device where the target file corresponding to the target file information is located, the control instruction is used for controlling the first device to send the target file to a second device, and the second device is one of the one or more devices.
Optionally, the operation of the user on the target file information includes: an operation indicating that the target file corresponding to the target file information is to be sent to the second device.
Optionally, the one or more file information includes at least one of the following: one or more file identifiers, one or more file icons.
Optionally, the one or more operable information comprises one or more function setting icons; the control instruction is an instruction that is generated by the head-mounted device according to an operation performed by the user on a target function setting icon and that is used for controlling the first device, where the target function setting icon is one of the one or more function setting icons, and the control instruction is used for setting a function of the first device.
Optionally, the information of the one or more devices includes icons of the one or more devices, and the one or more operable information is displayed in the icons of the one or more devices.
Optionally, the communication device 250 further includes a sending module 2503. Before receiving the control instruction from the head-mounted device, the receiving module 2501 is further configured to receive an acquisition request from the head-mounted device, where the acquisition request is used for acquiring the one or more operable information; the sending module 2503 is configured to send the one or more operable information to the head-mounted device.
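Putting the receiving, sending, and processing modules together, the first device's side of the exchange can be sketched as a small message handler: it answers an acquisition request with the operable information, and executes the two kinds of control instruction described above (send a file, set a function). The `FirstDevice` class, message shapes, and field names are assumptions for illustration; the patent does not specify a wire format:

```python
from dataclasses import dataclass, field

@dataclass
class FirstDevice:
    # Hypothetical device state: files the device holds and one settable function.
    files: dict = field(default_factory=lambda: {"photo.jpg": b"<bytes>"})
    volume: int = 30
    sent: list = field(default_factory=list)

    def handle(self, message: dict):
        kind = message["type"]
        if kind == "acquire":
            # Acquisition request: reply with the operable information
            # (file information and function-setting entries).
            return {"files": list(self.files), "functions": ["volume"]}
        if kind == "send_file":
            # Control instruction: send the target file to the second device.
            self.sent.append((message["file"], message["target"]))
            return "ok"
        if kind == "set_function":
            # Control instruction: set a function of the first device.
            setattr(self, message["name"], message["value"])
            return "ok"
        raise ValueError(f"unknown instruction: {kind}")

dev = FirstDevice()
print(dev.handle({"type": "acquire"}))
dev.handle({"type": "send_file", "file": "photo.jpg", "target": "tablet"})
dev.handle({"type": "set_function", "name": "volume", "value": 50})
print(dev.sent, dev.volume)
```

In this sketch the "acquire" reply corresponds to the operable information the head-mounted device displays, and the two instruction types correspond to the file-transfer and function-setting embodiments.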
Optionally, the communication device 250 further includes a verification module 2504. Before the receiving module 2501 receives the acquisition request from the head-mounted device, the verification module 2504 is configured to verify whether the account with which the head-mounted device is logged in to a server is the same as the account with which the first device is logged in to the server. When the two accounts are the same, the control authority of the head-mounted device over the first device is a first authority; when the two accounts are different, the control authority of the head-mounted device over the first device is a second authority; the first authority is greater than or equal to the second authority.
Optionally, when verifying whether the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server, the verification module 2504 is specifically configured to: encrypt second preset information using a key distributed to the first device by the server, to obtain second encrypted information. The sending module 2503 is configured to send the second encrypted information to the head-mounted device, and the receiving module 2501 is configured to receive second decryption information obtained by the head-mounted device by decrypting the second encrypted information. If the second decryption information is the same as the second preset information, the verification module 2504 determines that the account with which the head-mounted device is logged in to the server is the same as the account with which the first device is logged in to the server.
The communication apparatus of the embodiment shown in fig. 25 may be used to execute the technical solutions of the above method embodiments; for the implementation principles and technical effects, refer to the relevant descriptions in the method embodiments. Optionally, the communication apparatus may be the first device, or a component (e.g., a chip or a circuit) of the first device.
It should be understood that the division of the modules of the communication device shown in fig. 24 or fig. 25 is merely a logical division; in an actual implementation, the modules may be wholly or partially integrated into one physical entity, or may be physically separated. These modules may all be implemented in the form of software invoked by a processing element, or all in the form of hardware, or some in the form of software invoked by a processing element and some in the form of hardware. For example, the determining module may be a separately disposed processing element, or may be integrated into a chip of the communication apparatus (for example, the head-mounted device), or may be stored in a memory of the communication apparatus in the form of a program that a processing element of the communication apparatus invokes to execute the functions of the module. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together, or may be implemented independently. The processing element described herein may be an integrated circuit having a signal processing capability. In implementation, the steps of the above method, or the above modules, may be implemented by an integrated logic circuit of hardware in the processing element, or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. For another example, when one of the above modules is implemented in the form of a program invoked by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of invoking programs. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 26 is a schematic structural diagram of another communication device according to an embodiment of the present application. As shown in fig. 26, the communication device 260 includes a processor 262 and a transceiver 263. The transceiver 263 is configured to receive a control instruction from the head-mounted device, and the processor 262 is configured to execute the control instruction. Optionally, the communication device 260 further includes a memory 261 configured to store a computer program or instructions, and the processor 262 is configured to invoke the computer program or instructions.
The communication apparatus of the embodiment shown in fig. 26 may be used to execute the technical solutions of the above method embodiments; for details, refer to the relevant descriptions in the method embodiments, which are not repeated here. The communication apparatus may be the first device, or a component (e.g., a chip or a circuit) of the first device.
In fig. 26, the transceiver 263 may be connected to an antenna. In the downlink direction, the transceiver 263 receives information transmitted from the base station via the antenna and transmits the information to the processor 262 for processing. In the uplink direction, the processor 262 processes the data of the terminal and transmits the processed data to the base station through the transceiver 263.
Alternatively, the processor 262 may be used to implement the corresponding functions of the processing module 2502 of the communication apparatus shown in fig. 25, and the transceiver 263 may be used to implement the corresponding functions of the receiving module 2501 and the sending module 2503 of the communication apparatus shown in fig. 25. Alternatively, some or all of the above modules may be embedded, in the form of integrated circuits, in a chip of the terminal; they may be implemented separately or integrated together. That is, the above modules may be configured as one or more integrated circuits implementing the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), etc.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the device control method according to the above embodiment.
In addition, the present application also provides a computer program product, which includes a computer program that, when running on a computer, causes the computer to execute the apparatus control method described in the above embodiments.
In addition, an embodiment of the present application further provides a processor, where the processor includes: at least one circuit for performing the device control method as described in the above embodiments.
In addition, the embodiment of the present application also provides a system, which includes the terminal and the head-mounted device as described above.
In the above embodiments, the implementation may be realized, wholly or partially, by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the present application are generated, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state disk (SSD)), among others.
Based on the same inventive concept as the methods provided in the foregoing embodiments of the present application, the present application further provides a communication apparatus, which is configured to implement the methods in the foregoing embodiments. The communication apparatus may be the first device or the head-mounted device, or a component (e.g., a chip or a circuit) of the first device or the head-mounted device. Some or all of the methods of the above embodiments may be implemented by hardware, or may be implemented by software. When implemented by hardware, as shown in fig. 27, the communication apparatus 1000 includes an input interface circuit 1002, a logic circuit 1004, and an output interface circuit 1006. Optionally, the communication apparatus 1000 further includes a transceiver 1008 and an antenna 1010, and the transceiver 1008 transmits and receives data via the antenna 1010.
The input interface circuit 1002 is used for acquiring data to be processed; the logic circuit 1004 is configured to execute the apparatus control method shown in fig. 4 to process the data to be processed, so as to obtain processed data; the output interface circuit 1006 is used for outputting the processed data. In a specific implementation, the communication device 1000 may be a chip or an integrated circuit.
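The input-interface / logic-circuit / output-interface split described above is a plain three-stage pipeline: acquire the data to be processed, process it, output the processed data. In software terms it could be sketched as follows (the function names and toy stages are hypothetical, for illustration only):

```python
from typing import Callable, Iterable, List

def run_pipeline(acquire: Callable[[], Iterable],
                 logic: Callable,
                 emit: Callable) -> List:
    # Input interface circuit: acquire the data to be processed.
    # Logic circuit: process each item of data.
    # Output interface circuit: output the processed data.
    return [emit(logic(item)) for item in acquire()]

out = run_pipeline(lambda: [1, 2, 3],      # data to be processed
                   lambda x: x * 10,       # processing step
                   lambda y: f"out:{y}")   # output formatting
print(out)  # ['out:10', 'out:20', 'out:30']
```

Each stage maps to one of the three circuits in fig. 27; in a chip implementation the stages run concurrently rather than as sequential function calls.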

Claims (30)

1. An apparatus control method characterized by comprising:
the head-mounted device displays information of one or more devices and one or more operable information;
the head-mounted device generates a control instruction for controlling a first device according to the operation of a user on target operable information, wherein the target operable information is one of the one or more operable information, and the first device is one of the one or more devices;
the head-mounted device sends the control instruction to the first device.
2. The method of claim 1, wherein the one or more operable information comprises one or more file information;
the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target operable information, and the control instruction comprises the following steps:
the head-mounted device generates a control instruction for controlling the first device according to an operation of the user on target file information, wherein the target file information is one of the one or more pieces of file information, the first device is a device where a target file corresponding to the target file information is located, the control instruction is used for controlling the first device to send the target file to a second device, and the second device is one of the one or more devices.
3. The method of claim 2, wherein the user operation on the target file information comprises: an operation indicating that the target file corresponding to the target file information is to be sent to the second device.
4. The method according to claim 2 or 3, wherein the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target file information, and the method comprises the following steps:
the head-mounted device generates a control instruction for controlling the first device according to the operation that the user moves the target file information to a first preset position in display equipment of the head-mounted device, wherein the distance between the position of the second device in the display equipment and the first preset position is smaller than or equal to a first preset distance.
5. The method of any of claims 2-4, wherein the one or more file information includes at least one of:
one or more file identifiers, one or more file icons.
6. The method of claim 1, wherein the one or more operable information comprises one or more function setting icons;
the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on the target operable information, and the control instruction comprises the following steps:
the head-mounted device generates a control instruction for controlling the first device according to the operation of the user on a target function setting icon, wherein the target function setting icon is one of the one or more function setting icons, and the control instruction is used for setting the function of the first device.
7. The method of claim 6, wherein the head-mounted device generates a control instruction for controlling the first device according to the user's operation of the target function setting icon, comprising:
and the head-mounted equipment generates a control instruction for controlling the first equipment according to the selection operation of the target function setting icon by the user and the selection operation of the information of the first equipment by the user.
8. The method of any one of claims 1-7, wherein the information for the one or more devices comprises: icons of the one or more devices;
the one or more operable information is displayed in the icons of the one or more devices.
9. The method of claim 8, wherein the head-mounted device displays information for one or more devices, comprising:
the head-mounted device displays icons for one or more devices within a range of viewing angles of the head-mounted device.
10. The method of claim 8 or 9, wherein before the head-mounted device displays icons for one or more devices within a range of viewing angles of the head-mounted device, the method further comprises:
the head-mounted device detects wireless signals transmitted by one or more devices;
the head-mounted device determines the angle and distance of one or more devices relative to the head-mounted device according to wireless signals transmitted by the one or more devices;
the head-mounted device determines one or more devices within a range of viewing angles of the head-mounted device based on angles and distances of the one or more devices relative to the head-mounted device.
11. The method of any one of claims 1-10, wherein prior to the head-mounted device displaying the one or more operable information, the method further comprises:
the head-mounted device sends an acquisition request to the first device according to a movement operation performed by the user on a movable marker displayed on a display device of the head-mounted device, wherein the acquisition request is used for acquiring the one or more operable information;
the head-mounted device receives the one or more operational information from the first device.
12. The method according to claim 11, wherein the head-mounted device sending the acquisition request to the first device according to the user's movement operation on a movable marker displayed on a display device of the head-mounted device comprises:
the head-mounted device sends the acquisition request to the first device according to an operation in which the user moves the movable marker displayed on the display device of the head-mounted device to a second preset position, wherein the distance between the position of the first device in the display device and the second preset position is smaller than or equal to a second preset distance.
13. The method according to claim 11 or 12, wherein before the head-mounted device sends the acquisition request to the first device according to the user's movement operation on a movable marker displayed on a display device of the head-mounted device, the method further comprises:
the head-mounted device verifies whether an account number of the head-mounted device logged in a server is the same as an account number of the first device logged in the server;
when the account number of the head-mounted device logged in the server is the same as the account number of the first device logged in the server, the control authority of the head-mounted device on the first device is a first authority;
when the account number of the head-mounted device logged in the server is different from the account number of the first device logged in the server, the control authority of the head-mounted device on the first device is a second authority;
wherein the first right is greater than or equal to the second right.
14. The method of claim 13, wherein the headset device verifying whether the account number of the headset device logged in the server is the same as the account number of the first device logged in the server comprises:
the head-mounted equipment encrypts first preset information by adopting a secret key distributed to the head-mounted equipment by the server to obtain first encrypted information;
the head-mounted device sends the first encrypted information to the first device;
the head-mounted device receives first decryption information obtained by decrypting the first encryption information by the first device;
if the first decryption information is the same as the first preset information, the head-mounted device determines that an account registered in a server by the head-mounted device is the same as an account registered in the server by the first device;
if the first decryption information is different from the first preset information, the head-mounted device determines that an account registered in the server by the head-mounted device is different from an account registered in the server by the first device.
15. An apparatus control method characterized by comprising:
a first device receives a control instruction from a head-mounted device, wherein the control instruction is an instruction generated by the head-mounted device according to a user operation on target operable information and used for controlling the first device, the head-mounted device displays information of one or more devices and one or more operable information, the target operable information is one of the one or more operable information, and the first device is one of the one or more devices;
the first device executes the control instruction.
16. The method of claim 15, wherein the one or more operable information comprises one or more file information;
the control instruction is an instruction which is generated by the head-mounted device according to an operation of the user on target file information and is used for controlling the first device, the target file information is one of the one or more pieces of file information, the first device is a device where a target file corresponding to the target file information is located, the control instruction is used for controlling the first device to send the target file to a second device, and the second device is one of the one or more devices.
17. The method of claim 16, wherein the user operation on the target file information comprises: an operation indicating that the target file corresponding to the target file information is to be sent to the second device.
18. The method of claim 16 or 17, wherein the one or more file information comprises at least one of:
one or more file identifiers, one or more file icons.
19. The method of claim 15, wherein the one or more operable information comprises one or more function setting icons;
the control instruction is an instruction which is generated by the head-mounted device according to the operation of the user on a target function setting icon and is used for controlling the first device, the target function setting icon is one of the one or more function setting icons, and the control instruction is used for setting the function of the first device.
20. The method of any one of claims 15-19, wherein the information of the one or more devices comprises: icons of the one or more devices;
the one or more operable information is displayed in the icons of the one or more devices.
21. The method of any of claims 15-20, wherein prior to the first device receiving the control instruction from the head-mounted device, the method further comprises:
the first device receives an acquisition request from the head-mounted device, wherein the acquisition request is used for acquiring the one or more operable information;
the first device transmits the one or more operational information to the head mounted device.
22. The method of claim 21, wherein prior to the first device receiving the acquisition request from the headset, the method further comprises:
the first device verifies whether an account number of the head-mounted device logged in a server is the same as an account number of the first device logged in the server;
when the account number of the head-mounted device logged in the server is the same as the account number of the first device logged in the server, the control authority of the head-mounted device on the first device is a first authority;
when the account number of the head-mounted device logged in the server is different from the account number of the first device logged in the server, the control authority of the head-mounted device on the first device is a second authority;
wherein the first right is greater than or equal to the second right.
23. The method of claim 22, wherein the first device verifying whether the account number of the headset logged in the server is the same as the account number of the first device logged in the server comprises:
the first equipment encrypts second preset information by adopting a key distributed to the first equipment by the server to obtain second encrypted information;
the first device sends the second encrypted information to the head-mounted device;
the first device receives second decryption information obtained by decrypting the second encryption information by the head-mounted device;
if the second decryption information is the same as the second preset information, the first device determines that the account number of the head-mounted device logged in the server is the same as the account number of the first device logged in the server.
24. A communications device comprising means for performing the method of any of claims 1-14 or 15-23.
25. A communication device comprising a processor and a transceiver, the processor and the transceiver communicating with each other through an internal connection; the processor is configured to perform the method of any one of claims 1-14 or 15-23.
26. A communications apparatus, comprising:
an interface and a processor, the interface and the processor coupled;
the processor is for executing a computer program or instructions to cause the communication device to perform the method of any of claims 1-14 or 15-23.
27. The communications device of claim 26, further comprising: a memory;
the memory is for storing the computer program or instructions.
28. A communications apparatus, comprising: the device comprises an input interface circuit, a logic circuit and an output interface circuit, wherein the input interface circuit is used for acquiring data to be processed;
the logic circuit is configured to perform the method of any one of claims 1-14 or 15-23 to process the data to be processed to obtain processed data;
the output interface circuit is used for outputting the processed data.
29. A computer-readable storage medium comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-14 or 15-23.
30. A computer program product comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-14 or 15-23.
CN201911393813.0A 2019-12-30 2019-12-30 Device control method, communication apparatus, and storage medium Pending CN111190488A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911393813.0A CN111190488A (en) 2019-12-30 2019-12-30 Device control method, communication apparatus, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911393813.0A CN111190488A (en) 2019-12-30 2019-12-30 Device control method, communication apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN111190488A true CN111190488A (en) 2020-05-22

Family

ID=70705962

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911393813.0A Pending CN111190488A (en) 2019-12-30 2019-12-30 Device control method, communication apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN111190488A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112164146A (en) * 2020-09-04 2021-01-01 维沃移动通信有限公司 Content control method and device and electronic equipment
CN114510171A (en) * 2022-02-14 2022-05-17 广州塔普鱼网络科技有限公司 Portable three-dimensional touch interactive system based on image processing technology
CN114527864A (en) * 2020-11-19 2022-05-24 京东方科技集团股份有限公司 Augmented reality text display system, method, device, and medium
CN114679612A (en) * 2022-03-15 2022-06-28 辽宁科技大学 Intelligent household system and control method thereof
WO2022161432A1 (en) * 2021-01-28 2022-08-04 维沃移动通信有限公司 Display control method and apparatus, and electronic device and medium
WO2023000519A1 (en) * 2021-07-21 2023-01-26 歌尔股份有限公司 Smart wearable device, and method and system for controlling target device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103986738A (en) * 2013-02-07 2014-08-13 百度在线网络技术(北京)有限公司 Inter-multi-terminal synchronization method and system
CN105188516A (en) * 2013-03-11 2015-12-23 奇跃公司 System and method for augmented and virtual reality
CN105229588A (en) * 2013-03-15 2016-01-06 埃尔瓦有限公司 For the intersection realistic choice in augmented reality system, pull and place
CN106803988A (en) * 2017-01-03 2017-06-06 苏州佳世达电通有限公司 Message transfer system and information transferring method
CN107014378A (en) * 2017-05-22 2017-08-04 中国科学技术大学 A kind of eye tracking aims at control system and method
CN108646997A (en) * 2018-05-14 2018-10-12 刘智勇 A method of virtual and augmented reality equipment is interacted with other wireless devices
CN110209263A (en) * 2018-02-28 2019-09-06 联想(新加坡)私人有限公司 Information processing method, information processing equipment and device-readable storage medium
CN110493729A (en) * 2019-08-19 2019-11-22 芋头科技(杭州)有限公司 Exchange method, equipment, storage medium and the program product of augmented reality equipment


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112164146A (en) * 2020-09-04 2021-01-01 Vivo Mobile Communication Co., Ltd. Content control method and device, and electronic equipment
CN114527864A (en) * 2020-11-19 2022-05-24 BOE Technology Group Co., Ltd. Augmented reality text display system, method, device, and medium
CN114527864B (en) * 2020-11-19 2024-03-15 BOE Technology Group Co., Ltd. Augmented reality text display system, method, device, and medium
WO2022161432A1 (en) * 2021-01-28 2022-08-04 Vivo Mobile Communication Co., Ltd. Display control method and apparatus, and electronic device and medium
WO2023000519A1 (en) * 2021-07-21 2023-01-26 Goertek Inc. Smart wearable device, and method and system for controlling target device
CN114510171A (en) * 2022-02-14 2022-05-17 Guangzhou Tapuyu Network Technology Co., Ltd. Portable three-dimensional touch interactive system based on image processing technology
CN114510171B (en) * 2022-02-14 2023-10-24 Guangzhou Tapuyu Network Technology Co., Ltd. Portable three-dimensional interaction system based on image processing technology
CN114679612A (en) * 2022-03-15 2022-06-28 University of Science and Technology Liaoning Smart home system and control method thereof

Similar Documents

Publication Publication Date Title
CN111190488A (en) Device control method, communication apparatus, and storage medium
KR102429535B1 (en) Method for registration of internet of things device and the apparatus thereof
US20140068261A1 (en) Methods And Apparatus For Use In Sharing Credentials Amongst A Plurality Of Mobile Communication Devices
WO2018195845A1 (en) Control terminal and control method for unmanned aerial vehicle, unmanned aerial vehicle, and system
WO2019184016A1 (en) Sim card authentication method and terminal
KR20160035535A (en) Communication pairing method, communication mediation apparatus, method for controlling network using the same and network system
KR20160147441A (en) Mobile terminal and operating method thereof
EP4099620A2 (en) Pairing groups of accessories
KR20190035414A (en) Wireless device and operating method thereof
US20230189360A1 (en) Method for managing wireless connection of electronic device, and apparatus therefor
KR20150123645A (en) Mobile terminal and method for controlling the same
US20220407666A1 (en) Method and apparatus for finding lost device using uwb and ar apparatus
KR20170112527A (en) Wearable device and method for controlling the same
KR20200133587A (en) Apparatus and method for providing different service according to area based on beam book information
KR101728758B1 (en) Mobile terminal and method for controlling the same
KR20160076263A (en) Mobile terminal and method for controlling the same
KR20160067393A (en) Apparatus for controlling push service
KR20160072641A (en) Mobile terminal and method of controlling the same
KR20160004164A (en) Mobile terminal and method for controlling the same
US20220343747A1 (en) Method and apparatus for providing location alarm service of electronic device
US20230189357A1 (en) Packet transmission method for electronic device positioning service and apparatus thereof
US20240129690A1 (en) Distributed Device Location Finding
WO2023080262A1 (en) Method for v2x-related operation of node in wireless communication system
KR20170066856A (en) Mobile terminal and operating method thereof
KR20170059816A (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2020-05-22