CN112817665A - Equipment interaction method and device and storage medium

Equipment interaction method and device and storage medium

Info

Publication number
CN112817665A
Authority
CN
China
Prior art keywords
wearable electronic
electronic device
task
identified
identified device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110090556.4A
Other languages
Chinese (zh)
Inventor
彭文佳
张兴泷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110090556.4A
Publication of CN112817665A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/448 Execution paradigms, e.g. implementations of programming paradigms
    • G06F 9/4488 Object-oriented
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00 Economic sectors
    • G16Y 10/75 Information technology; Communication
    • G16Y 40/00 IoT characterised by the purpose of the information processing
    • G16Y 40/30 Control


Abstract

The disclosure relates to a device interaction method, a device interaction apparatus, and a storage medium. The method includes the following steps: acquiring a current task of a first identified device; determining a target task and an execution object of the target task based on the current task; and executing a preset action so that the execution object executes the target task. According to the embodiments of the disclosure, different execution objects can execute the target task as the current task changes, so that the wearable electronic device can interact with different execution objects, making the interaction mode of the wearable electronic device more flexible.

Description

Equipment interaction method and device and storage medium
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a device interaction method and apparatus, and a storage medium.
Background
With the development of technology, the Internet of Things (IoT) enables interconnection and intercommunication among devices.
A wearable electronic device may interact with an Artificial Intelligence Internet of Things (AIoT) device. However, in current implementations of device interconnection, function control and the assignment of execution tasks among devices often require manual operation by the user and are not intelligent enough.
Therefore, a scheme that enables intelligent interconnection and intercommunication between wearable electronic devices and AIoT devices is a focus of current research.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a device interaction method and apparatus, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a device interaction method, including:
acquiring a current task of a first identified device;
determining a target task and an execution object of the target task based on the current task;
and executing a preset action to enable the execution object to execute the target task.
In some embodiments, the performing the preset action such that the execution object performs the target task includes:
when the execution object is the first identified device, sending a first instruction to enable the first identified device to execute the target task;
when the execution object is the wearable electronic device, controlling the wearable electronic device to execute the target task;
when the execution object is a second identified device, sending a second instruction to enable the second identified device to execute the target task;
wherein the first identified device and the execution object each have at least one identical function for executing the target task.
In some embodiments, the obtaining the current task of the first identified device comprises:
determining the first identified device based on a first predetermined rule;
based on a second predetermined rule, the current task of the first identified device is obtained.
In some embodiments, said determining said first identified device based on a first predetermined rule comprises one or more of:
determining the first identified device based on a relationship of communication connections of the wearable electronic device;
determining the first identified device based on visual recognition of the wearable electronic device;
determining the first identified device based on the sound wave signal identification of the wearable electronic device.
In some embodiments, the determining the first identified device based on visual recognition of the wearable electronic device comprises:
identifying based on a computer vision identification model of the wearable electronic device, and determining the first identified device;
or,
identifying a visual two-dimensional code by the wearable electronic device, and determining the first identified device.
In some embodiments, said obtaining said current task of said first identified device based on a second predetermined rule comprises one or more of:
receiving the current task sent by the first identified device based on a communication connection established between the first identified device and the wearable electronic device through a communication protocol;
acquiring the current task of the first identified device based on a computer vision recognition model of the wearable electronic device;
receiving the current task sent by the first identified device to the public network based on the public network connecting the wearable electronic device and the first identified device.
In some embodiments, the determining a target task and an execution object of the target task based on the current task includes:
when the target task determined according to the current task is the task with the highest priority among the tasks executed by the wearable electronic device, controlling the wearable electronic device to execute the target task.
According to a second aspect of the embodiments of the present disclosure, there is provided a device interaction apparatus applied to a wearable electronic device, the apparatus including:
the first acquisition module is configured to acquire a current task of the first identified device;
a first determination module configured to determine a target task and an execution object of the target task based on the current task;
and the execution module is configured to execute a preset action so that the execution object executes the target task.
In some embodiments, the execution module is further configured to: send a first instruction to cause the first identified device to execute the target task when the execution object is the first identified device; control the wearable electronic device to execute the target task when the execution object is the wearable electronic device; and send a second instruction to cause the second identified device to execute the target task when the execution object is a second identified device; wherein the first identified device and the execution object each have at least one identical function for executing the target task.
In some embodiments, the first obtaining module includes:
a second determination module configured to determine the first identified device based on a first predetermined rule;
a second obtaining module configured to obtain the current task of the first identified device based on a second predetermined rule.
In some embodiments, the second determination module is further configured to one or more of:
a third determination module configured to determine the first identified device based on a relationship of a communication connection of the wearable electronic device;
a fourth determination module configured to determine the first identified device based on visual recognition of the wearable electronic device;
a fifth determination module configured to determine the first identified device based on a sound wave signal identification of the wearable electronic device.
In some embodiments, the fourth determination module is configured to determine the first identified device based on recognition by a computer vision recognition model of the wearable electronic device, or based on the wearable electronic device recognizing a visual two-dimensional code.
In some embodiments, the second obtaining module is configured to one or more of: receiving the current task sent by the first identified device based on a communication connection established between the first identified device and the wearable electronic device through a communication protocol; acquiring the current task of the first identified device based on a computer vision recognition model of the wearable electronic device; receiving the current task sent by the first identified device to the public network based on the public network connecting the wearable electronic device and the first identified device.
In some embodiments, the first determining module is further configured to control the wearable electronic device to execute the target task when the target task determined according to the current task is a task with a highest priority in tasks executed by the wearable electronic device.
According to a third aspect of the embodiments of the present disclosure, there is provided a device interaction apparatus, including at least: a processor and a memory for storing executable instructions operable on the processor, wherein:
the processor is configured to execute the executable instructions, and the executable instructions perform the steps in the device interaction method provided in the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the device interaction method as provided in the first aspect above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the embodiment of the disclosure, the wearable electronic device determines the target task and the execution object of the target task based on the current task by acquiring the current task of the first identified device, and executes the preset action, so that the execution object executes the target task. Therefore, according to the embodiment of the disclosure, different execution objects can execute the target task according to the change of the current task, so that the wearable electronic device can interact with different execution objects, and the interaction mode of the wearable electronic device can be more flexible and intelligent.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of a device interaction method shown in the embodiment of the present disclosure.
Fig. 2 is a flowchart of a device interaction method shown in the embodiment of the present disclosure.
Fig. 3 is a schematic diagram illustrating a method for determining a first identified device according to an embodiment of the disclosure.
Fig. 4 is a schematic diagram illustrating a determination of a current task according to an embodiment of the disclosure.
FIG. 5 is a diagram illustrating a device interaction according to an example embodiment.
FIG. 6 is a diagram illustrating a device interaction apparatus according to an example embodiment.
Fig. 7 is a diagram illustrating a device interaction apparatus according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a device interaction method according to an embodiment of the present disclosure. As shown in Fig. 1, the method, applied to a wearable electronic device, includes the following steps:
s101, acquiring a current task of a first identified device;
s102, determining a target task and an execution object of the target task based on the current task;
s103, executing a preset action to enable the execution object to execute the target task.
In the embodiments of the present disclosure, the device interaction method applies to interaction scenarios between a wearable electronic device and an Artificial Intelligence Internet of Things (AIoT) device. The wearable electronic device may include a head-mounted display device, such as an AR, VR, or MR device, and may also include a wearable sports device, such as a smart watch or smart bracelet. AIoT devices include, but are not limited to, IoT terminal devices and smart home appliances, such as mobile phones, routers, televisions, speakers, air conditioners, and rice cookers.
Interaction scenarios between the wearable electronic device and an AIoT device may include: interaction between AR glasses and a television, interaction between AR glasses and a speaker, or interaction between AR glasses and a mobile phone. For example, the AR glasses display the video pictures of the television by interacting with the television; the AR glasses play the music output by the speaker by interacting with the speaker; the AR glasses take photos by interacting with the mobile phone.
In step S101, the first identified device is an electronic device identified by the wearable electronic device within its identification range. The first identified device may be any AIoT device; for example, it may be a television, a speaker, an air conditioner, or a mobile phone, and the embodiments of the disclosure are not limited thereto. The identification range may vary with the wearable electronic device. When the wearable electronic device is a head-mounted display device, the identification range may be the field-of-view or viewing range of the head-mounted display device; when the wearable electronic device is a smart bracelet or smart watch, the identification range may be an area defined by a predetermined radius, or the transmission range of an identification sensor in the smart watch or smart bracelet, such as a radar sensor or a laser sensor, and the embodiments of the present disclosure are not limited.
It should be noted that the current task may be understood as an operation being performed by the first identified device or an operating state in which the first identified device is currently located. For example, when the first identified device is a television, the current task of the television may include playing an animation, playing a video, or playing a picture; when the first identified device is an air conditioner, the current task of the air conditioner may include being in a heating state, performing a wind speed adjustment, or performing a temperature adjustment, and the embodiments of the present disclosure are not limited.
The above-mentioned acquiring of the current task of the first identified device may be understood as first identifying the first identified device and then acquiring its current task. For example, the AR glasses identify that the first identified device is a speaker, and then recognize that its current task is playing a song. For another example, the AR glasses identify that the first identified device is an air conditioner, and then recognize that its current task is being in a heating state.
In step S102, the target task is determined according to the current task. In some embodiments, the target task may be the current task itself, i.e., the execution object executes the same task as the first identified device. For example, a speaker and the AR glasses both play the same song; or a television and the AR glasses both play the same video picture.
In other embodiments, the target task may be determined according to the current task and the corresponding relationship between the current task and the preset target task. The corresponding relation between the current task and the preset target task is stored in the wearable electronic device in advance. And on the premise of acquiring the current task, the target task can be determined by searching the corresponding relation between the current task and the preset target task.
The correspondence may include: when the current task is a display-related operation performed on a small-screen electronic device, the corresponding target task is for a large-screen electronic device to perform that display-related operation; when the current task is a control operation performed on an operation-limited electronic device (for example, one whose operation interface is smaller than a preset threshold), the target task is for a conveniently operated electronic device (for example, one whose operation interface is larger than a preset threshold) to perform the control operation.
For example, the current task of a small-screen AIoT device, such as a mobile phone, is playing a video; according to the current task and the correspondence, it can be determined that a large-screen electronic device should play the video. In this case the execution object may be a television or a computer, and the target task may be playing the video on the television or computer. Transferring the target task to a large-screen device improves the user's viewing experience. For another example, the current task of an operation-limited device, such as a watch, is taking a photo; according to the current task and the correspondence, it can be determined that a conveniently operated device should control the photographing. In this case the execution object may be a mobile phone or a tablet computer, and the target task may be controlling the photographing, which makes photographing control more convenient.
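The correspondence between a current task and a preset target task, stored in advance on the wearable electronic device, can be pictured as a simple lookup table. The following sketch is illustrative only; the key and value strings are assumptions, not terms from the patent:

```python
# Illustrative correspondence table: current task -> (target task, preferred
# class of execution object). The two entries mirror the examples in the text:
# small-screen video relayed to a large screen, and photo control handed to a
# conveniently operated device.
CORRESPONDENCE = {
    "small_screen_play_video": ("large_screen_play_video", "large_screen_device"),
    "limited_device_take_photo": ("photo_control", "convenient_device"),
}

def lookup_target(current_task: str):
    """Return (target_task, executor_class); fall back to keeping the task
    on the first identified device when no mapping is preset."""
    return CORRESPONDENCE.get(current_task, (current_task, "first_identified_device"))

print(lookup_target("small_screen_play_video"))
# ('large_screen_play_video', 'large_screen_device')
```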
In other embodiments, the target task may be an associated task of the current task, the associated task may be a task with higher execution priority than the current task, and may also be a task running in the background or a task to be executed in a task list, which is not limited by the embodiments of the present disclosure. For example, the current task is a play video task, and the associated task may be a telephony task having a higher priority than the play video task.
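The associated-task case, where a higher-priority task preempts the current one, can be sketched as follows. The priority values and task names are illustrative assumptions; the patent does not define a numeric priority scheme:

```python
# Illustrative priorities; a larger number means a higher priority. The
# telephony task preempting video playback mirrors the example in the text.
PRIORITY = {"play_video": 1, "telephony": 5}

def associated_task(current: str, pending: list) -> str:
    """Pick the highest-priority pending task that outranks the current task;
    otherwise keep the current task as the target task."""
    higher = [t for t in pending if PRIORITY.get(t, 0) > PRIORITY.get(current, 0)]
    return max(higher, key=lambda t: PRIORITY[t]) if higher else current

print(associated_task("play_video", ["telephony"]))
# telephony
```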
In the embodiments of the present disclosure, the execution object may be the first identified device or the wearable electronic device, or a second identified device other than the first identified device and the wearable electronic device. The second identified device is capable of executing the target task. For example, the second identified device and the first identified device both have screens and can play videos; or both have loudspeakers and can play music.
It should be noted that the execution object and the target task are determined according to the current task; that is, the target task and the execution object may change as the current task changes. For example, when the current task is playing music, the target task may be playing that music and the execution object may be the AR glasses; when the current task is playing a video on a small screen, the target task may be playing the video on the large screen of a television, and the embodiments of the disclosure are not limited.
In the embodiments of the present disclosure, when the execution objects differ, the wearable electronic device interacts with different execution objects. For example, when the execution object is the first identified device, the wearable electronic device interacts with the first identified device so that it executes the target task. When the execution object is a second identified device, the wearable electronic device interacts with the first identified device and the second identified device respectively, so that the second identified device executes the target task. Being able to interact with different identified devices improves the interaction flexibility of the wearable electronic device.
In step S103, after the wearable electronic device determines the target task and the execution object, the wearable electronic device may execute a preset action, so that the execution object executes the target task.
It should be noted that the preset action is associated with the execution object: different execution objects correspond to different preset actions. When the execution object is the first identified device or a second identified device, the preset action may be that the wearable electronic device sends the target task to that device; when the execution object is the wearable electronic device itself, the preset action may be that the wearable electronic device directly controls itself to execute the target task.
That is, the wearable electronic device may control itself to perform the target task, and may also interact with the first identified device or the second identified device, so that the first identified device or the second identified device performs the target task.
In the embodiment of the disclosure, the wearable electronic device determines a target task and an execution object of the target task based on a current task by acquiring the current task of a first identified device; and executing the preset action so that the execution object executes the target task. Therefore, according to the embodiment of the disclosure, different execution objects can execute the target task according to the change of the current task, so that the wearable electronic device can interact with different execution objects, and the interaction mode of the wearable electronic device can be more flexible and intelligent.
In some embodiments, as shown in fig. 2, the performing a preset action to make the execution object execute the target task includes:
s103, 103a, when the execution object is the first identified device, sending a first instruction to enable the first identified device to execute the target task;
s103b, when the execution object is the wearable electronic device, controlling the wearable electronic device to execute the target task;
s103, 103c, when the execution object is a second identified device, sending a second instruction to enable the second identified device to execute the target task;
wherein the first identified device and the execution object each have at least one identical function for executing the target task.
In the embodiments of the present disclosure, the execution object has the function required to execute the target task, so that it can execute the target task. For example, the functions of the execution object may include: playing video, pictures, and audio; a call function; a cross-screen assistance function; and a photographing control function, which the embodiments of the present disclosure do not limit.
In step S103a, when the execution object is the first recognized device, a first instruction is sent so that the first recognized device executes the target task. The wearable electronic device serves as a master device, the first identified device serves as a slave device, and the master device transmits the target task to the slave device, so that the slave device executes the target task.
For example, the first identified device is a television, the wearable electronic device is AR glasses, the AR glasses are recording a video, the current task of the television is in a display state, and it may be determined that the television needs to display the recorded video according to the current task. At this time, the television as the execution target displays the video that the AR glasses are recording. For another example, the first identified device may also be a mobile phone or a monitoring device with a camera, the wearable electronic device is AR glasses, and the AR glasses send the target task to the mobile phone or the monitoring device with the camera through the first instruction, so that the mobile phone or the monitoring device can play a video.
In step S103b, when the execution object is the wearable electronic device, the wearable electronic device is controlled to execute the target task. That is, when the wearable electronic device determines that the execution object is itself, it directly controls itself to execute the target task. In this case the wearable electronic device acts as the slave device that executes the target task.
For example, the wearable electronic device is a pair of AR glasses. When the first identified device is a mobile phone and the target task determined according to the current task is screen projection or video-call relay, the AR glasses, as the execution object, perform the screen projection or video call operation; when the first identified device is a speaker and the target task is music-playback relay, the AR glasses, as the execution object, play the music; when the first identified device is a television and the target task is picture or video playback relay, the AR glasses, as the execution object, play the picture or video.
In step S103c, when the execution object is the second recognized device, a second instruction is transmitted so that the second recognized device executes the target task. The second instruction carries information related to the target task, the target task needs to be sent to the second identified device, at this time, the first identified device is the master device, the second identified device is the slave device, and the wearable electronic device is the intermediate transfer device.
For example, the wearable electronic device is a pair of AR glasses. When the first identified device is a small-screen notebook or mobile phone, the second identified device is a large-screen television, and the target task determined according to the current task is video or picture relay, the large-screen television plays the video or picture for the best viewing experience; when the first identified device is a mobile phone, the second identified device is a computer, and the target task is cross-screen assistance, the computer executes the cross-screen assistance operation; when the first identified device is a mobile phone, the second identified device is a speaker, and the target task is audio, voice, or video-call relay, the speaker plays the audio, voice, or video call; when the first identified device is a watch or bracelet, the second identified device is a mobile phone, and the target task is photographing control, the mobile phone executes the photographing control operation; and when the first identified device is a speaker, the second identified device is a television, and the target task is audio or video relay, the television plays the audio or video.
In the embodiments of the present disclosure, depending on the execution object, the wearable electronic device may act as the slave device and directly execute the target task after acquiring it; it may act as the master device, interacting with the first identified device and transmitting the target task to it so that the first identified device executes the target task; or it may act as an intermediate relay, transferring the target task acquired from the first identified device to the second identified device so that the second identified device executes it. The interaction mode of the wearable electronic device is thus more flexible.
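The dispatch on the execution object in steps S103a to S103c can be sketched as follows. This is illustrative only; the `Role` names and the instruction strings are assumptions, since the patent does not define an instruction format:

```python
from enum import Enum, auto

class Role(Enum):
    FIRST_IDENTIFIED = auto()   # S103a: wearable acts as master, sends a first instruction
    WEARABLE = auto()           # S103b: wearable acts as slave, executes locally
    SECOND_IDENTIFIED = auto()  # S103c: wearable acts as relay, sends a second instruction

def preset_action(executor: Role, target_task: str) -> str:
    # The preset action differs with the execution object, as in S103a-S103c.
    if executor is Role.FIRST_IDENTIFIED:
        return f"send first instruction: {target_task}"
    if executor is Role.WEARABLE:
        return f"execute locally: {target_task}"
    return f"relay second instruction: {target_task}"

print(preset_action(Role.SECOND_IDENTIFIED, "play_video"))
# relay second instruction: play_video
```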
In some embodiments, the obtaining the current task of the first identified device comprises:
determining the first identified device based on a first predetermined rule;
based on a second predetermined rule, the current task of the first identified device is obtained.
In the embodiments of the present disclosure, the first predetermined rule is the rule by which the wearable electronic device identifies the first identified device.
In some embodiments, said determining said first identified device based on a first predetermined rule comprises one or more of:
determining the first identified device based on a relationship of communication connections of the wearable electronic device;
determining the first identified device based on visual recognition of the wearable electronic device;
determining the first identified device based on auditory recognition of the wearable electronic device.
In the embodiments of the present disclosure, the communication connection relationship of the wearable electronic device includes: the wearable electronic device establishes Bluetooth communication with the first identified device; the wearable electronic device establishes ultra-wideband communication with the first identified device; or the wearable electronic device establishes near field communication with the first identified device. The embodiments of the present disclosure are not limited thereto.
The wearable electronic device may be provided with an Ultra Wide Band (UWB) module, and may also communicate via a Wi-Fi module, a Bluetooth module, a laser module, an infrared module, or a ZigBee module. Correspondingly, the first identified device is provided with matching modules, so that different communication connections can be established with the wearable electronic device to determine the first identified device.
It should be noted that, after the communication connection is established, the wearable electronic device may directly determine the first identified device through the identity information sent by the first identified device. The identity information may be a device serial number (SN), a device identifier, or information related to device authentication.
In the embodiments of the present disclosure, determining the first identified device based on the auditory recognition of the wearable electronic device includes: determining the first identified device based on a sound wave signal output by the first identified device and received by the wearable electronic device; or determining the first identified device based on recognizing an acoustic two-dimensional code of the first identified device.
The sound wave signal may be a sound wave signal audible to the human ear, or one inaudible to the human ear, such as an ultrasonic or infrasonic wave.
It should be noted that the wearable electronic device may determine the first identified device by means of a millimeter wave module, an ultrasonic module, or a radar module.
In the embodiments of the present disclosure, in the process of determining the first identified device by visual recognition, the first identified device may be determined through image acquisition or by recognizing a two-dimensional code. In some embodiments, determining the first identified device based on visual recognition of the wearable electronic device includes: performing recognition based on a computer vision recognition model of the wearable electronic device to determine the first identified device; or determining the first identified device based on the wearable electronic device recognizing a visual two-dimensional code.
The wearable electronic device may acquire device information of the first identified device through a camera, a laser sensor, or a radar sensor, and input the device information into a computer vision recognition model for recognition to determine the first identified device; it may also determine the first identified device by recognizing a visual two-dimensional code on the first identified device. A visual two-dimensional code is formed by fusing the area of a background picture with the area of the two-dimensional code body. For example, visual two-dimensional codes include, but are not limited to, light-and-dark two-dimensional codes.
In the embodiments of the present disclosure, a plurality of determination manners may be adopted in determining the first identified device. As shown in fig. 3, the determination manners of the first identified device include: artificial intelligence vision, artificial intelligence hearing, ultra-wideband, Bluetooth, ultrasonic recognition, or light-and-dark two-dimensional codes. The embodiments of the present disclosure may select one or more of these manners to determine the first identified device.
For example, the first identified device may be determined jointly based on the Bluetooth connection relationship of the wearable electronic device and artificial intelligence vision. For another example, the first identified device may be determined jointly based on the ultra-wideband connection relationship of the wearable electronic device and a light-and-dark two-dimensional code. Jointly determining the first identified device through multiple recognition modes can therefore improve recognition accuracy.
It should be noted that, in the process of determining the first identified device, an identification priority may also be established. For example, when the wearable electronic device has established a communication connection with the first identified device, the first identified device may be determined based on the communication connection relationship of the wearable electronic device; when the wearable electronic device and the first identified device cannot establish a communication connection, the first identified device may be determined based on visual or auditory recognition of the wearable electronic device. The first identified device is thus preferentially determined through a communication connection, which improves both the efficiency and the accuracy of identifying the first identified device.
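The identification priority described above amounts to trying recognition modes in order until one succeeds. A minimal sketch, assuming each mode is represented as a hypothetical callable that returns a device identifier or `None`:

```python
def identify_first_device(via_connection, via_vision, via_hearing):
    """Try recognition modes in priority order: an established
    communication connection first, then visual recognition, then
    auditory recognition. Each argument is a callable returning a
    device identifier or None (all names here are hypothetical)."""
    for recognize in (via_connection, via_vision, via_hearing):
        device = recognize()
        if device is not None:
            return device
    return None  # no recognition mode succeeded
```
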
In some embodiments, said obtaining said current task of said first identified device based on a second predetermined rule comprises one or more of:
receiving the current task sent by the first identified device based on a communication connection established between the first identified device and the wearable electronic device through a communication protocol;
acquiring the current task of the first identified device based on a computer vision recognition model of the wearable electronic device;
receiving the current task sent by the first identified device to the public network based on the public network connecting the wearable electronic device and the first identified device.
In the embodiments of the present disclosure, the communication protocol includes a Bluetooth communication protocol, an ultra-wideband communication protocol, or a near field communication protocol, which is not limited by the embodiments of the present disclosure. After the communication connection is established, the wearable electronic device can directly receive the current task sent by the first identified device. For example, when the wearable electronic device is a smart bracelet or a smart watch, the first identified device may be determined through a communication connection established over ultra-wideband (UWB), and the current task passed by the first identified device over that connection may be acquired. The wearable electronic device may also include a UWB positioning module, determine the first identified device through a pointing operation of the wearable electronic device, and acquire the current task transmitted by the first identified device over the UWB communication protocol or another communication protocol.
In the process of acquiring the current task with the computer vision recognition model, a camera on the wearable electronic device may capture image information of the first identified device, and the image information may be input into the computer vision recognition model to obtain the current task of the first identified device. The image information includes, but is not limited to, the current task that the first identified device visually displays on its surface.
In the process of acquiring the current task through a public network, the public network establishes communication connections with the wearable electronic device and the first identified device respectively; the wearable electronic device in the embodiments of the present disclosure can then acquire the current task indirectly through the public network.
In the embodiments of the present disclosure, a plurality of determination manners may be adopted in determining the current task. As shown in fig. 4, the determination manners of the current task include: a public network, a communication connection, or a visual recognition model. The embodiments of the present disclosure may select one or more of these manners to determine the current task.
For example, the current task may be determined jointly based on the communication connection relationship of the wearable electronic device and the computer vision recognition model. Acquiring the current task through multiple determination modes can therefore improve accuracy.
It should be noted that, in the process of determining the current task, a priority may also be established. For example, when the wearable electronic device has established a communication connection with the first identified device, the current task sent by the first identified device may be received over the communication connection established between the first identified device and the wearable electronic device through the communication protocol; when the wearable electronic device and the first identified device cannot establish a communication connection, the current task may be determined based on a public network connecting the wearable electronic device and the first identified device. The current task is thus preferentially acquired through a communication connection, which improves both the efficiency and the accuracy of determining the current task.
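The task-acquisition priority follows the same first-success pattern as device identification. A minimal sketch, assuming each source is a hypothetical callable returning a task name or `None`:

```python
def acquire_current_task(via_connection, via_public_network, via_vision):
    """Acquire the current task in priority order: a direct
    communication connection first, falling back to the public
    network, then to the computer vision recognition model.
    Each argument is a callable returning a task name or None
    (all names here are hypothetical)."""
    for source in (via_connection, via_public_network, via_vision):
        task = source()
        if task is not None:
            return task
    return None  # current task could not be determined
```
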
In some embodiments, the determining a target task and an execution object of the target task based on the current task includes:
and when the target task determined according to the current task is a task with the highest priority in the tasks executed by the wearable electronic device, controlling the wearable electronic device to execute the target task.
In the embodiments of the present disclosure, the task with the highest priority may be set as needed. The highest-priority tasks executed by the wearable electronic device include, but are not limited to, a phone task, a voice task, a video task, or an alarm task, and the embodiments of the present disclosure are not limited thereto. For example, when the target task is determined to be an alarm task, the wearable electronic device can be controlled to give an alarm prompt directly, so that the alarm is timely and the user experience is improved. For another example, when the target task is a phone task, the wearable electronic device can be controlled to prompt for the incoming call directly, so that the call can be answered in time and the user experience is improved.
It should be noted that different devices correspond to different tasks that the wearable electronic device must execute with the highest priority. For example, for an air conditioner or a washing machine, the corresponding highest-priority task is an alarm prompt; for a mobile phone, the corresponding highest-priority task is a telephone or voice prompt.
In some embodiments, when the execution object is the wearable electronic device, and the wearable electronic device is currently executing a task whose priority is higher than that of the target task, the target task is passed to the first identified device or the second identified device, so that the first identified device or the second identified device executes the target task.
In the embodiment of the disclosure, the wearable electronic device may execute tasks with different priorities in sequence according to the order of priorities. That is, the wearable electronic device executes the task with the highest priority first.
The execution task may be a task that the wearable electronic device is processing, for example, the wearable electronic device is communicating over a telephone or is alerting.
In the embodiments of the present disclosure, when the priority of the task being executed is higher than that of the target task, the target task is passed to the first identified device or the second identified device. The target task can thus be processed and executed in time, improving the user experience.
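The preemption rule above can be sketched as a small dispatch decision. The priority table and task names are invented for illustration and are not specified by the disclosure:

```python
# Hypothetical priority table; larger numbers preempt smaller ones.
TASK_PRIORITY = {"alarm": 3, "phone_call": 2, "video_relay": 1}

def dispatch_target_task(target_task, executing_task):
    """Decide where the target task runs: if the wearable device is
    already executing a higher-priority task, pass the target task on
    to the first or second identified device; otherwise run it on the
    wearable device itself."""
    if executing_task is not None and \
            TASK_PRIORITY[executing_task] > TASK_PRIORITY[target_task]:
        return "pass_to_identified_device"
    return "execute_on_wearable"
```
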
In order to better understand the above embodiment, taking the wearable electronic device as AR glasses and the first identified device as a mobile phone as an example, an example of the embodiment of the disclosure is as follows:
as shown in fig. 5, the AR glasses and the mobile phone establish a bluetooth communication connection. After the AR glasses are in communication connection with the mobile phone, the mobile phone can send identity information and a current task to the AR glasses through Bluetooth communication connection; the AR glasses can determine that the identified equipment is a mobile phone according to the received identity information; after the recognized device is determined to be the mobile phone and the current task of the mobile phone, the AR glasses can determine the target task and the execution object of the target task based on the current task and execute the preset action, so that the execution object executes the target task. Therefore, the AR glasses can enable different execution objects to execute the target task according to the change of the current task, interaction between the AR glasses and the different execution objects is achieved, and the interaction mode of the AR glasses is flexible and intelligent.
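The fig. 5 flow can be condensed into a short sketch: receive identity information and the current task over the Bluetooth link, resolve the identified device, then determine the target task and its execution object. The identity strings, task names, and planning rule below are invented for illustration:

```python
class ARGlasses:
    """Condensed, hypothetical sketch of the fig. 5 flow."""

    KNOWN_DEVICES = {"SN-PHONE-01": "phone"}  # identity info -> device type

    def on_bluetooth_message(self, identity, current_task):
        # Step 1: determine the identified device from its identity info.
        device = self.KNOWN_DEVICES.get(identity, "unknown")
        # Step 2: determine the target task and its execution object.
        target_task, executor = self.plan(device, current_task)
        # Step 3: a preset action would dispatch the task to `executor`.
        return executor, target_task

    def plan(self, device, current_task):
        # Example rule: relay a video playing on the phone to a TV.
        if device == "phone" and current_task == "playing_video":
            return "video_relay", "television"
        # Default: the identified device keeps its current task.
        return current_task, device
```
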
FIG. 6 is a diagram illustrating a device interaction apparatus according to an example embodiment. Referring to fig. 6, the device interaction apparatus includes a first obtaining module 1001, a first determining module 1002, and an executing module 1003, wherein,
a first obtaining module 1001 configured to obtain a current task of a first identified device;
a first determining module 1002 configured to determine a target task and an execution object of the target task based on the current task;
an executing module 1003 configured to execute a preset action, so that the executing object executes the target task.
In some embodiments, the execution module is further configured to send a first instruction to cause the first identified device to execute the target task when the execution object is the first identified device; when the execution object is the wearable electronic device, controlling the wearable electronic device to execute the target task; when the execution object is a second identified device, sending a second instruction to enable the second identified device to execute the target task; wherein the first identified device and the execution object both have at least one same function of executing the target task.
In some embodiments, the first obtaining module includes:
a second determination module configured to determine the first identified device based on a first predetermined rule;
a second obtaining module configured to obtain the current task of the first identified device based on a second predetermined rule.
In some embodiments, the second determination module is further configured to one or more of:
a third determination module configured to determine the first identified device based on a relationship of a communication connection of the wearable electronic device;
a fourth determination module configured to determine the first identified device based on visual recognition of the wearable electronic device;
a fifth determination module configured to determine the first identified device based on a sound wave signal identification of the wearable electronic device.
In some embodiments, the fourth determination module is configured to determine the first identified device based on recognition by a computer vision recognition model of the wearable electronic device; or, the first identified device is determined based on the wearable electronic device recognizing the visual two-dimensional code.
In some embodiments, the second obtaining module is configured to one or more of: receiving the current task sent by the first identified device based on a communication connection established between the first identified device and the wearable electronic device through a communication protocol; acquiring the current task of the first identified device based on a computer vision recognition model of the wearable electronic device; receiving the current task sent by the first identified device to the public network based on the public network connecting the wearable electronic device and the first identified device.
In some embodiments, the first determining module is further configured to control the wearable electronic device to execute the target task when the target task determined according to the current task is a task with a highest priority in tasks executed by the wearable electronic device.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 7 is a diagram illustrating a device interaction apparatus according to an exemplary embodiment. For example, the apparatus may be a wearable electronic device, including a smart phone, a smart bracelet, or AR glasses.
Referring to fig. 7, an apparatus may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the device. Examples of such data include instructions for any application or method operating on the device, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 806 provides power to various components of the device. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for a device.
The multimedia component 808 includes a screen that provides an output interface between the device and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing status assessments of various aspects of the device. For example, the sensor component 814 may detect the on/off status of the device and the relative positioning of components, such as the display and keypad of the apparatus. The sensor component 814 may also detect a change in the position of the apparatus or of a component of the apparatus, the presence or absence of user contact with the apparatus, the orientation or acceleration/deceleration of the apparatus, and a change in the temperature of the apparatus. The sensor component 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus and other devices. The device may access a wireless network based on a communication standard, such as Wi-Fi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the apparatus to perform the method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium having instructions therein which, when executed by a processor of a wearable electronic device, enable the wearable electronic device to perform a device interaction method, the method comprising:
acquiring a current task of a first identified device;
determining a target task and an execution object of the target task based on the current task;
and executing a preset action to enable the execution object to execute the target task.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (16)

1. A device interaction method is applied to a wearable electronic device, and comprises the following steps:
acquiring a current task of a first identified device;
determining a target task and an execution object of the target task based on the current task;
and executing a preset action to enable the execution object to execute the target task.
2. The method of claim 1, wherein the performing the preset action such that the execution object performs the target task comprises:
when the execution object is the first identified device, sending a first instruction to enable the first identified device to execute the target task;
when the execution object is the wearable electronic device, controlling the wearable electronic device to execute the target task;
when the execution object is a second identified device, sending a second instruction to enable the second identified device to execute the target task;
wherein the first identified device and the execution object both have at least one same function of executing the target task.
3. The method of claim 1 or 2, wherein the obtaining a current task of the first identified device comprises:
determining the first identified device based on a first predetermined rule;
based on a second predetermined rule, the current task of the first identified device is obtained.
4. The method of claim 3, wherein determining the first identified device based on a first predetermined rule comprises one or more of:
determining the first identified device based on a relationship of communication connections of the wearable electronic device;
determining the first identified device based on visual recognition of the wearable electronic device;
determining the first identified device based on auditory recognition of the wearable electronic device.
5. The method according to claim 4, wherein the determining the first identified device based on the visual identification of the wearable electronic device comprises:
identifying based on a computer vision identification model of the wearable electronic device, and determining the first identified device;
or,
and identifying the visual two-dimensional code based on the wearable electronic equipment, and determining the first identified equipment.
6. The method of claim 4, wherein the obtaining the current task of the first identified device based on a second predetermined rule comprises one or more of:
receiving the current task sent by the first identified device based on a communication connection established between the first identified device and the wearable electronic device through a communication protocol;
acquiring the current task of the first identified device based on a computer vision recognition model of the wearable electronic device;
receiving the current task sent by the first identified device to the public network based on the public network connecting the wearable electronic device and the first identified device.
7. The method of claim 1 or 2, wherein determining a target task and an execution object of the target task based on the current task comprises:
and when the target task determined according to the current task is a task with the highest priority in the tasks executed by the wearable electronic device, controlling the wearable electronic device to execute the target task.
8. A device interaction apparatus applied to a wearable electronic device, the apparatus comprising:
the first acquisition module is configured to acquire a current task of the first identified device;
a first determination module configured to determine a target task and an execution object of the target task based on the current task;
and the execution module is configured to execute a preset action so that the execution object executes the target task.
9. The apparatus of claim 8, wherein the execution module is further configured to send a first instruction to cause the first identified device to perform the target task when the execution object is the first identified device; when the execution object is the wearable electronic device, controlling the wearable electronic device to execute the target task; when the execution object is a second identified device, sending a second instruction to enable the second identified device to execute the target task; wherein the first identified device and the execution object both have at least one same function of executing the target task.
10. The apparatus of claim 8 or 9, wherein the first obtaining module comprises:
a second determination module configured to determine the first identified device based on a first predetermined rule;
a second obtaining module configured to obtain the current task of the first identified device based on a second predetermined rule.
11. The apparatus of claim 10, wherein the second determining module is further configured to one or more of:
a third determination module configured to determine the first identified device based on a relationship of a communication connection of the wearable electronic device;
a fourth determination module configured to determine the first identified device based on visual recognition of the wearable electronic device;
a fifth determination module configured to determine the first identified device based on a sound wave signal identification of the wearable electronic device.
12. The apparatus according to claim 11, wherein the fourth determining module is configured to determine the first recognized device based on recognition by a computer vision recognition model of the wearable electronic device; or, the first identified device is determined based on the wearable electronic device recognizing the visual two-dimensional code.
13. The apparatus of claim 10, wherein the second obtaining module is configured to perform one or more of: receiving the current task sent by the first identified device over a communication connection established between the first identified device and the wearable electronic device through a communication protocol; acquiring the current task of the first identified device based on a computer vision recognition model of the wearable electronic device; and receiving the current task that the first identified device sent to a public network, based on the public network connecting the wearable electronic device and the first identified device.
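Claim 13's "one or more of" acquisition channels can be pictured as an ordered fallback chain: try one channel, fall through to the next on failure. The following is a hedged sketch under that assumption; the claim specifies no concrete APIs, so every callable here is a hypothetical stand-in for a channel (communication connection, vision model, or public-network relay).

```python
# Illustrative fallback chain over the claim-13 acquisition channels.
# Each channel is a zero-argument callable returning the identified
# device's current task, or None when that channel fails.
def acquire_current_task(channels):
    """Return the first task any channel yields, or None if all fail."""
    for channel in channels:
        task = channel()
        if task is not None:
            return task
    return None
```

A failing communication channel followed by a successful vision-model channel would thus still yield the current task.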
14. The apparatus according to claim 8 or 9, wherein the first determining module is further configured to control the wearable electronic device to execute the target task when the target task determined according to the current task is the highest-priority task among the tasks executed by the wearable electronic device.
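The priority rule in claim 14 amounts to a simple check: the wearable runs the target task itself only when that task outranks everything already pending on the wearable. The sketch below assumes numeric priorities (higher value wins), which is an illustration choice, not something the claim specifies.

```python
# Hypothetical sketch of the claim-14 rule: the wearable executes the
# target task only when it outranks all of the wearable's queued tasks.
def should_run_on_wearable(target_priority: int, queued_priorities: list) -> bool:
    """True when the target task has the highest priority among the
    wearable's pending tasks."""
    return all(target_priority > p for p in queued_priorities)
```

With an empty queue the target task trivially has the highest priority and runs on the wearable.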
15. An apparatus for device interaction, the apparatus at least comprising: a processor, and a memory for storing executable instructions runnable on the processor; wherein:
the processor is configured to execute the executable instructions to perform the steps of the device interaction method provided in any one of claims 1 to 7.
16. A non-transitory computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the device interaction method provided in any one of claims 1 to 7.
CN202110090556.4A 2021-01-22 2021-01-22 Equipment interaction method and device and storage medium Pending CN112817665A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110090556.4A CN112817665A (en) 2021-01-22 2021-01-22 Equipment interaction method and device and storage medium


Publications (1)

Publication Number Publication Date
CN112817665A true CN112817665A (en) 2021-05-18

Family

ID=75858974

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110090556.4A Pending CN112817665A (en) 2021-01-22 2021-01-22 Equipment interaction method and device and storage medium

Country Status (1)

Country Link
CN (1) CN112817665A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103686326A (en) * 2013-12-20 2014-03-26 乐视致新电子科技(天津)有限公司 Video file synchronous playing control method, smart television and wearable device
CN103873959A (en) * 2012-12-13 2014-06-18 联想(北京)有限公司 Control method and electronic device
CN105138112A (en) * 2015-07-13 2015-12-09 腾讯科技(深圳)有限公司 Display control method and apparatus
CN105511307A (en) * 2015-11-26 2016-04-20 小米科技有限责任公司 Control method and apparatus of electronic device
US20170193302A1 (en) * 2016-01-05 2017-07-06 Daqri, Llc Task management system and method using augmented reality devices
CN112218145A (en) * 2020-10-15 2021-01-12 聚好看科技股份有限公司 Smart television, VR display device and related methods


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI XUDONG; LU CHAOQUAN; GUI YAN; ZHANG JIANMING: "Design and Implementation of a Wearable Smart Glasses System Based on Raspberry Pi", Software (软件), no. 08 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113329375A (en) * 2021-05-27 2021-08-31 Oppo广东移动通信有限公司 Content processing method, device, system, storage medium and electronic equipment
CN113329375B (en) * 2021-05-27 2023-06-27 Oppo广东移动通信有限公司 Content processing method, device, system, storage medium and electronic equipment
WO2023284355A1 (en) * 2021-07-15 2023-01-19 Oppo广东移动通信有限公司 Information processing method, apparatus, and system, storage medium, and electronic device

Similar Documents

Publication Publication Date Title
EP3136793B1 (en) Method and apparatus for awakening electronic device
CN105094732B (en) Screen display method and device
US10237901B2 (en) Method and apparatus for connecting with controlled smart device, and storage medium
CN105430625B (en) Communication information transfer method, apparatus and system
CN112114765A (en) Screen projection method and device and storage medium
CN105407433A (en) Method and device for controlling sound output equipment
JP2016522636A (en) Imaging parameter setting method, imaging parameter setting device, program, and recording medium
CN103986821A (en) Method, equipment and system for carrying out parameter adjustment
EP3565374B1 (en) Region configuration methods and devices
CN112217990A (en) Task scheduling method, task scheduling device, and storage medium
CN112817665A (en) Equipment interaction method and device and storage medium
CN111540350B (en) Control method, device and storage medium of intelligent voice control equipment
CN111123716B (en) Remote control method, remote control device, and computer-readable storage medium
CN111092795B (en) Function control method, function control apparatus, and computer-readable storage medium
CN111010721A (en) Wireless network distribution method, wireless network distribution device and computer readable storage medium
JP6310159B2 (en) Application installation method, apparatus, smart device, program, and recording medium
CN107846646B (en) Control method and device of intelligent sound box and readable storage medium
CN112217987B (en) Shooting control method and device and storage medium
CN109474744B (en) Alarm clock processing method, device and storage medium
CN112882622A (en) Data processing method and device, terminal and storage medium
CN112817547A (en) Display method and device, and storage medium
CN106899369B (en) Method and device for reserved playing of intelligent radio
CN112099364B (en) Intelligent interaction method for Internet of things household equipment
CN111246012B (en) Application interface display method and device and storage medium
CN106531163A (en) Method and device for controlling terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination