CN114666444B - Equipment control method and device and electronic equipment - Google Patents


Info

Publication number
CN114666444B
CN114666444B (application CN202011536963.5A)
Authority
CN
China
Prior art keywords: equipment, user, volume, account, reduce
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011536963.5A
Other languages
Chinese (zh)
Other versions
CN114666444A (en)
Inventor
张子曰
鲍修远
张若兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011536963.5A priority Critical patent/CN114666444B/en
Priority to PCT/CN2021/137174 priority patent/WO2022135183A1/en
Publication of CN114666444A publication Critical patent/CN114666444A/en
Application granted granted Critical
Publication of CN114666444B publication Critical patent/CN114666444B/en

Classifications

    • G06F 3/16 Sound input; sound output
    • G06F 3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • H04M 1/72415 User interfaces specially adapted for cordless or mobile telephones, with means for interfacing with external accessories for remote control of appliances
    • H04M 1/72454 User interfaces with means for adapting the functionality of the device according to context-related or environment-related conditions
    • H04M 1/72469 User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M 1/725 Cordless telephones
    • H04M 11/00 Telephonic communication systems specially adapted for combination with other electrical systems
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • Y02D 30/70 Reducing energy consumption in wireless communication networks


Abstract

The present application provides a device control method, an apparatus, and an electronic device, relating to the technical field of intelligent control. The method includes: acquiring a device set of the audio devices that are online in a target area where a user is located, the audio devices comprising first-type devices and second-type devices, the device sharing rate of the second-type devices being higher than that of the first-type devices; and in response to detecting a target behavior of the user, controlling the first-type devices in the device set whose associated accounts meet a preset requirement to reduce their volume. This technical scheme reduces the influence of the audio devices in the user's environment on the user and improves the user's device-use experience.

Description

Equipment control method and device and electronic equipment
Technical Field
The present application relates to the technical field of intelligent control, and in particular to a device control method, an apparatus, and an electronic device.
Background
With the continuous development of terminal technology and the internet of things, users own more and more kinds of electronic devices with ever richer functions. Many electronic devices have audio playback functions, for example: mobile phones, tablets, computers, smart televisions, smart speakers, and the like. When several such devices are used in the same environment, they may interfere with one another, resulting in a poor device-use experience for the user.
Disclosure of Invention
In view of the foregoing, the present application provides a device control method, an apparatus, and an electronic device, which are used to reduce the influence of electronic devices on a user and thereby improve the user's device-use experience.
In order to achieve the above object, in a first aspect, an embodiment of the present application provides a device control method, applied to an electronic device, including:
acquiring a device set of the audio devices that are online in a target area where a user is located; the audio devices comprise first-type devices and second-type devices, and the device sharing rate of the second-type devices is higher than that of the first-type devices;
and in response to detecting a target behavior of the user, controlling the first-type devices in the device set whose associated accounts meet a preset requirement to reduce their volume.
The device set formed by the audio devices may be presented as a list (i.e., the device set is specifically a device list), or in any other form that represents a set of devices.
The first-type devices may include private devices such as mobile phones and smart watches, as well as public devices with a relatively low sharing rate, such as tablets and computers, that can nevertheless be used by multiple people; the second-type devices may include public devices with a relatively high sharing rate that are used by multiple people, such as smart televisions and smart speakers.
The target behavior of the user may be a behavior that is easily disturbed by ambient sound, such as the user entering a call state or a sleep state.
In some application scenarios, such as a home scenario, a user may control and/or manage other electronic devices through certain electronic devices (referred to as control devices, which may be a mobile phone, a tablet, etc.). For example, the user may log in, with a user account, to an application on mobile phone A for controlling and/or managing electronic devices, and then control and/or manage the associated electronic devices in the scenario through that application. The associated account of an audio device may be the account the audio device is logged into, or the account logged into on the control device associated with the audio device.
According to the device control method provided in this embodiment, when the user is detected to enter a target behavior such as a call state or a sleep state, different volume-control strategies can be applied to the online audio devices in the same environment according to their classification and associated accounts. This reduces the influence of these devices on the user, improving the device-use experience of the individual user while also preserving the experience of the collective users in the shared space.
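The two steps above can be sketched in Python. This is an illustrative model only, not the patent's implementation: the Device record, the type constants, and select_devices_to_quiet are invented names, and "same account as the user" stands in for the unspecified preset requirement.

```python
from dataclasses import dataclass

TYPE_1 = 1  # private / low sharing rate (mobile phone, smart watch, ...)
TYPE_2 = 2  # public / high sharing rate (smart TV, smart speaker, ...)

@dataclass
class Device:
    name: str
    dev_type: int   # TYPE_1 or TYPE_2
    account: str    # associated account (device login, or its control device's login)
    online: bool = True

def select_devices_to_quiet(device_set, user_account):
    """First-type devices in the set whose associated account meets the
    preset requirement (modelled here as: same account as the user)."""
    return [d for d in device_set
            if d.online and d.dev_type == TYPE_1 and d.account == user_account]

device_set = [
    Device("phone_A", TYPE_1, "acct_A"),
    Device("tablet_A", TYPE_1, "acct_A"),
    Device("phone_C", TYPE_1, "acct_C"),
    Device("smart_tv", TYPE_2, "acct_A"),
]
quiet = select_devices_to_quiet(device_set, "acct_A")
print([d.name for d in quiet])  # only user A's first-type devices are quieted
```

Note that the second-type smart TV keeps its volume even though it shares user A's account: under this scheme only the first-type (low-sharing-rate) devices are quieted by default.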
In a possible implementation manner of the first aspect, in response to detecting the target behavior of the user, controlling the first-type devices in the device set whose associated accounts meet the preset requirement to reduce their volume includes:
in response to detecting that the user enters a call state, controlling the first-type devices in the device set that are associated with the same account as the answering device to reduce their volume.
The answering device may be the electronic device that executes the method, or another audio device. For example, if the electronic device executing the method is a mobile phone, the user may answer the call on the mobile phone, in which case the answering device is the mobile phone; the user may instead transfer the call to a tablet, in which case the answering device is the tablet.
In the above embodiment, when the user enters a call state, the first-type devices in the device set that share the answering device's account are controlled to reduce their volume, which reduces the influence of those devices on the call; the other devices in the set are not volume-controlled, i.e., their volume is kept unchanged, which preserves the experience of the other users in the shared space.
In a possible implementation manner of the first aspect, the method further includes: if the answering device belongs to the second-type devices, controlling the second-type devices in the device set other than the answering device to reduce their volume.
In the above embodiment, when the user chooses to answer a call on a second-type device (i.e., a device with a higher sharing rate), the call can be regarded as having a communal attribute, that is, other users in the shared space may listen in. In this case, in addition to the first-type devices on the same account, the second-type devices other than the answering device are also controlled to reduce their volume, further reducing the influence of other electronic devices on the call and improving the experience of every user taking part in it. Meanwhile, to respect the personal needs of other users, the first-type devices on different accounts are not volume-controlled, i.e., their volume is kept unchanged, which preserves the device-use experience of the users in the shared space.
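The call-state policy described above might look as follows. This is a sketch under assumptions: the device records and field names ("type", "account") are illustrative, not taken from the patent.

```python
TYPE_1, TYPE_2 = 1, 2  # low vs. high device sharing rate

def devices_to_quiet_on_call(device_set, answering):
    """Quiet first-type devices sharing the answering device's account;
    if the answering device is second-type, also quiet every other
    second-type device. All remaining devices keep their volume."""
    quiet = [d for d in device_set
             if d is not answering
             and d["type"] == TYPE_1
             and d["account"] == answering["account"]]
    if answering["type"] == TYPE_2:
        quiet += [d for d in device_set
                  if d is not answering and d["type"] == TYPE_2]
    return quiet

device_set = [
    {"name": "tablet_A", "type": TYPE_1, "account": "A"},
    {"name": "phone_C", "type": TYPE_1, "account": "C"},
    {"name": "smart_tv", "type": TYPE_2, "account": "A"},
    {"name": "speaker_B", "type": TYPE_2, "account": "B"},
]
smart_tv = device_set[2]  # user A answers the call on the living-room TV
print([d["name"] for d in devices_to_quiet_on_call(device_set, smart_tv)])
```

Answering on the second-type TV quiets both user A's tablet and the other second-type speaker, while user C's first-type phone is untouched, matching the communal-call reasoning above.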
In a possible implementation manner of the first aspect, if there are other users in the target area, the first-type devices in the device set whose associated accounts meet the preset requirement are controlled to reduce their volume;
if there are no other users in the target area, all devices in the device set except the answering device are controlled to reduce their volume.
Through this embodiment, when no one else is present, the influence of the surrounding audio devices on the user can be further reduced, improving the user's experience.
In a possible implementation manner of the first aspect, in response to detecting the target behavior of the user, controlling the first-type devices in the device set whose associated accounts meet the preset requirement to reduce their volume includes:
in response to detecting that the user enters a sleep state, controlling the first-type devices in the device set that are associated with the same account as the electronic device, together with the electronic device itself, to reduce their volume.
In the above embodiment, the first-type devices on the same account as the electronic device, together with the electronic device itself, are controlled to reduce their volume, which reduces the disturbance to the sleeping user; the other audio devices need not be volume-controlled, i.e., their volume can be kept unchanged, which preserves the experience of the other users in the shared space.
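Under the same illustrative device records as before (field names are assumptions, not from the patent), the sleep-state branch can be sketched as:

```python
TYPE_1, TYPE_2 = 1, 2  # low vs. high device sharing rate

def devices_to_quiet_on_sleep(device_set, this_device):
    """Sleep-state policy: quiet the executing device itself plus the
    first-type devices on its account; other devices keep their volume."""
    return [d for d in device_set
            if d is this_device
            or (d["type"] == TYPE_1 and d["account"] == this_device["account"])]

device_set = [
    {"name": "phone_A", "type": TYPE_1, "account": "A"},
    {"name": "watch_A", "type": TYPE_1, "account": "A"},
    {"name": "speaker_B", "type": TYPE_2, "account": "B"},
]
phone_a = device_set[0]  # the electronic device executing the method
print([d["name"] for d in devices_to_quiet_on_sleep(device_set, phone_a)])
```

User B's second-type speaker is left alone; the reminder-message step described next would apply to exactly those untouched devices.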
In a possible implementation manner of the first aspect, the method further includes: controlling the other devices in the device set, whose volume is not reduced, to issue a reminder message. In this way, the users of those devices can be reminded to keep the volume low, further improving the device-use experience.
In a possible implementation manner of the first aspect, if there are other users in the target area, the first-type devices in the device set whose associated accounts meet the preset requirement are controlled to reduce their volume;
if there are no other users in the target area, all devices in the device set are controlled to reduce their volume.
Through this embodiment, when no one else is present, the influence of the surrounding audio devices on the user can be further reduced, improving the user's experience.
In a possible implementation manner of the first aspect, the associated account of an audio device that is online in the target area is a first account or a second account, where the first account is the associated account of the electronic device, and the second account belongs to the same target group as the first account. There may be one or more second accounts, and likewise one or more target groups.
In the above embodiment, the device set includes the audio devices associated with each account in the target group to which the electronic device belongs, so a more comprehensive set of audio devices can be obtained; that is, more of the audio devices around the user can be controlled, further reducing their influence on the user.
In a possible implementation manner of the first aspect, an audio device that is online in the target area meets at least one of the following requirements:
its distance from the electronic device is smaller than a preset distance;
it is in the same wireless local area network as the electronic device, such as the same Wi-Fi network;
it can establish a short-range communication connection, such as a Bluetooth connection, with the electronic device;
it is in the same target building space as the electronic device, such as the same room or the same house.
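The "at least one of" qualification above is a simple disjunction. In the sketch below, the field names (distance_m, wlan, bt_reachable, room) and the default distance threshold are illustrative assumptions:

```python
def is_in_target_area(dev, ref, max_distance_m=10.0):
    """A candidate audio device qualifies if it satisfies at least one
    of the four listed requirements relative to the electronic device."""
    return (dev.get("distance_m", float("inf")) < max_distance_m
            or (dev.get("wlan") is not None and dev.get("wlan") == ref.get("wlan"))
            or dev.get("bt_reachable", False)
            or (dev.get("room") is not None and dev.get("room") == ref.get("room")))

ref = {"wlan": "home_wifi", "room": "living_room"}
print(is_in_target_area({"wlan": "home_wifi"}, ref))  # same Wi-Fi network suffices
print(is_in_target_area({"distance_m": 25.0}, ref))   # too far, no other criterion met
```

The None guards keep a device with no reported Wi-Fi or room from spuriously matching a reference that also lacks that field.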
In a second aspect, an embodiment of the present application provides a device control apparatus, applied to an electronic device, including:
a communication module, used for acquiring a device set of the audio devices that are online in a target area where the user is located; the audio devices comprise first-type devices and second-type devices, and the device sharing rate of the second-type devices is higher than that of the first-type devices;
and a processing module, used for, in response to detecting a target behavior of the user, controlling the first-type devices in the device set whose associated accounts meet a preset requirement to reduce their volume.
In a possible implementation manner of the second aspect, the processing module is specifically configured to:
in response to detecting that the user enters a call state, controlling the first-type devices in the device set that are associated with the same account as the answering device to reduce their volume.
In a possible implementation manner of the second aspect, the processing module is further configured to:
if the answering device belongs to the second-type devices, controlling the second-type devices in the device set other than the answering device to reduce their volume.
In a possible implementation manner of the second aspect, the processing module is further configured to:
judging whether other users exist in the target area;
and if no other users exist in the target area, controlling the devices in the device set other than the answering device to reduce their volume.
In a possible implementation manner of the second aspect, the processing module is specifically configured to:
in response to detecting that the user enters a sleep state, controlling the first-type devices in the device set that are associated with the same account as the electronic device to reduce their volume.
In a possible implementation manner of the second aspect, the processing module is further configured to:
controlling the other devices in the device set, whose volume is not reduced, to issue a reminder message.
In a possible implementation manner of the second aspect, the processing module is further configured to:
judging whether other users exist in the target area;
and if no other users exist in the target area, controlling the other devices in the device set to reduce their volume.
In a possible implementation manner of the second aspect, the associated account of the audio device in the online state in the target area is a first account or a second account, where the first account is an associated account of the electronic device, and the second account and the first account belong to the same target group.
In a possible implementation manner of the second aspect, the audio device in the online state in the target area meets at least one of the following requirements:
its distance from the electronic device is smaller than a preset distance;
it is in the same wireless local area network as the electronic device;
it can establish a short-range communication connection with the electronic device;
it is in the same target building space as the electronic device.
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory and a processor, the memory for storing a computer program; the processor is configured to perform the method of the first aspect or any implementation of the first aspect when the computer program is invoked.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect or any implementation of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on an electronic device, causes the electronic device to perform the method of the first aspect or any implementation of the first aspect.
In a sixth aspect, an embodiment of the present application provides a chip system, including a processor, where the processor is coupled to a memory and executes a computer program stored in the memory to implement the method according to the first aspect or any implementation manner of the first aspect. The chip system may be a single chip or a chip module composed of multiple chips.
It will be appreciated that the advantages of the second to sixth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is a schematic view of a home scenario provided in an embodiment of the present application;
fig. 2 is a schematic view of a user scenario provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a call scenario provided in an embodiment of the present application;
fig. 4 is a schematic flow chart of a device control method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a user interface provided by an embodiment of the present application;
fig. 6 is a schematic diagram of a device classification method according to an embodiment of the present application;
fig. 7 is a schematic flow chart of device control according to device classification according to an embodiment of the present application;
fig. 8 is a schematic view of a sleeping scenario provided in an embodiment of the present application;
Fig. 9 is a schematic flow chart of another device control method according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an apparatus control device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings in the embodiments of the present application. The terminology used in the description of the embodiments of the application is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
Currently, users own more and more electronic devices, many of which have audio playback functions, for example: mobile phones, tablets, computers, smart televisions, smart speakers, and the like. When a user uses several audio devices (i.e., electronic devices with audio playback functions) in the same environment, those devices may disturb the user. For example, in a home scene, a user answers an incoming call while watching television; if the television volume is too high, it interferes with the call. In addition, family members keep different schedules, so using audio devices may disturb one another's rest. All of this degrades the user's device-use experience.
An alternative scheme is to detect the current state of the user, and when the user is detected to enter a talking or sleeping state, adjust the upper limit of the volume of other audio devices to a preset volume value so as to reduce the influence among the audio devices.
Considering that this control approach regulates the volume of all other audio devices and may therefore degrade the device-use experience of other users in the shared space, the embodiment of the present application provides another device control method: devices are classified, and after the user enters a call or sleep state, different controls are applied to the devices according to their classification. This improves the intelligence of device control, satisfying the individual user's device-use experience as far as possible while also meeting the needs of the collective users in the shared space. A home scene is described below as an exemplary scenario.
Fig. 1 is a schematic view of a home scenario provided in an embodiment of the present application. As shown in fig. 1, a home scenario 1000 may be divided into one or more areas, each of which may contain one or more audio devices; the audio devices may be portable electronic devices (e.g., a mobile phone, a tablet, a wearable device, etc.) or non-portable electronic devices (e.g., smart-home devices such as a smart speaker or a smart television). Other types of electronic devices may also be included in the home scene 1000, such as routers and smart lights.
For example, as shown in fig. 1, the home scene 1000 may be divided into seven areas: living room, primary bedroom, secondary bedroom, study, bathroom, kitchen, and corridor. The living room contains a mobile phone 1001, a tablet 1002, a smart watch 1003, a smart television 1004, a router 1005, and a smart speaker 1006; the primary bedroom contains a smart light 1007, the secondary bedroom contains a smart speaker 1008, and the study contains a computer 1009.
Each electronic device can perform data interaction with other electronic devices through the communication function of the electronic device. A user may control and/or manage other electronic devices through some of them (e.g., cell phones, tablets, etc.).
An electronic device can exchange data with other electronic devices through a cloud device, or establish a short-range communication connection with them and exchange data over that connection. Technologies for establishing a short-range communication connection include, but are not limited to: wireless local area network (wireless local area network, WLAN) technology (e.g., a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (BT) technology, ultra-wideband (UWB) technology, near field communication (near field communication, NFC) technology, infrared (IR) technology, general 2.4G/5G-band wireless communication technology, and the like. Data exchanged between electronic devices over a short-range communication connection may be transmitted directly between the devices or relayed indirectly.
Taking a family of three as an example, as shown in fig. 2, the home scenario 1000 includes a user A, a user B, and a user C. Taking user A as an example, when user A is talking on a call or sleeping, user A may be disturbed if user B and/or user C is playing audio content on an audio device.
According to the device control method provided in this embodiment, when the target user behavior of a user A is detected, the electronic devices in the same environment can be controlled to reduce their influence on user A, meeting the needs of other users while preserving user A's device-use experience. The target user behavior may be a behavior easily disturbed by ambient sound, such as the user entering a call state or a sleep state. The device control process is described below in connection with a call scenario and a sleep scenario.
Fig. 3 is a schematic diagram of a call scenario provided in the embodiment of the present application. As shown in fig. 3, user A, user B, and user C are in the living room: user A is using a tablet, user B is watching television, and user C is playing with a mobile phone. When user A's mobile phone A receives an incoming call (i.e., a call request) and user A answers it, the sound of the other electronic devices in the scene may interfere with the call. In this embodiment, the audio devices other than the answering device can be controlled to reduce their influence on the user's call while taking the needs of the other users into account. The specific flow of the device control is described below.
Fig. 4 is a schematic flow chart of a device control method provided in the embodiment of the present application. The execution subject of the method may be any electronic device in the home scene 1000, for example a portable electronic device such as the user's mobile phone 1001 or smart watch 1003. The execution subject may also be a designated electronic device in the home scene 1000; for example, the router 1005 may collect the operation data of each audio device and carry out the device control process according to the collected data. In connection with the scenario shown in fig. 3, the embodiment of the present application takes mobile phone A, owned by user A, as the example execution subject.
Referring to fig. 4, the device control method provided in this embodiment may include the following steps:
s110, when the user is detected to enter a call state, acquiring a device list of the audio devices in the online state in the same environment.
In this embodiment, mobile phone A may receive or initiate a call request, and when a call request is detected, the user may be regarded as entering the call state. Alternatively, to save processing resources (since the call may never be connected), the user may be regarded as entering the call state only when the call request is detected to have been connected. The call request may be a voice call request or a video call request made through a telephony application or an instant-messaging application.
As an optional implementation, after detecting that the user has entered the call state, mobile phone A may obtain from the cloud device, through the user account, a device list of the audio devices that are online in the home scene (an online device list for short), and then filter that list down to the devices in the same environment, where "the same environment" may mean the target area in which mobile phone A is located.
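Step S110 can thus be sketched as a cloud query followed by an area filter. The fetch_online_devices stub and its record shape below are invented for illustration; a real implementation would call the vendor's cloud API with the user account instead of returning canned data.

```python
def fetch_online_devices(user_account):
    # Stand-in for the cloud query keyed by the user account; returns the
    # audio devices registered in the home scene with their online state.
    return [
        {"name": "smart_tv", "area": "living_room", "online": True},
        {"name": "speaker_2", "area": "second_bedroom", "online": True},
        {"name": "computer", "area": "study", "online": False},
    ]

def online_device_list(user_account, target_area):
    """The online device list, filtered to the target area where the
    executing device (here, mobile phone A) is located."""
    return [d for d in fetch_online_devices(user_account)
            if d["online"] and d["area"] == target_area]

print([d["name"] for d in online_device_list("acct_A", "living_room")])
```

Offline devices and devices in other areas are dropped before any volume control is applied, matching the "online state in the same environment" condition of S110.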
As previously described, a user may control and/or manage some electronic devices (e.g., mobile phones, tablets, etc.) in a scene. For example, a user may log in, through a user account, to an application program on mobile phone A for controlling and/or managing electronic devices, and control and/or manage the associated electronic devices in the associated scene through that application. The "Smart Life" application offered by Huawei Technologies Co., Ltd. is taken as an example for illustration. Fig. 5 illustrates an exemplary user interface of the "Smart Life" application.
As shown in fig. 5 (a), after the user opens the "smart life" application, the corresponding user interface 10 may include a scene name 101 (here, an exemplary scene name 101 is "home of a"). When the mobile phone a detects an operation that the user clicks the icon 102 located on the right side of the scene name 101, the application program may provide a "home name" option for the user to modify the scene name 101, and may also provide a "room management" option for the user to set rooms in the scene, such as adding rooms, deleting rooms, and changing room names.
Here, a room may be generated from a real room, for example, a room calculated after the user creates a grid map/point cloud map of the home scene 1000 in the "Smart Life" application. In other embodiments, a room may be virtual, established by the user according to his or her needs. Some electronic devices with relatively fixed locations in the home scene 1000 have location tags that can be configured by the user, and electronic devices with the same location tag are in the same room.
When the user clicks the "more" icon 103, mobile phone A may pop up more options, such as "add device", "create scene", "share device", and "connect to three-party platform". The "add device" option allows the user to add and associate a new electronic device; the "create scene" option allows the user to create a new intelligent scene; the "share device" option allows the user to share control of one or more associated electronic devices with other users, so that those users can manage and/or control the shared devices, thereby meeting multi-user needs; the "connect to three-party platform" option allows the user to associate accounts in other device-management applications with the "Smart Life" application, enabling it to manage more electronic devices.
The user may view the associated electronic devices in the interface corresponding to the "home" icon 108. Fig. 5 (a) shows this interface, in which the "home" icon may be displayed in a selected state (here, the selected state is exemplarily indicated by the "home" icon being filled). In this interface, each electronic device that is in the home scene 1000 and associated with the "Smart Life" application generates a card, in which the user can view the device name, the device status, and the room where the device is located. For example, as shown in fig. 5 (a), the user has associated smart TV 1004 (corresponding to card 104), router 1005 (corresponding to card 105), speaker 1006 (corresponding to card 106), and smart light 1007 (corresponding to card 107) in the scene.
The device state may include an on-line state and an off-line state, where the on-line state may refer to that the device is in an on-state and is in a WLAN or establishes a communication connection (such as a bluetooth connection) with the mobile phone a; the offline state may refer to the device being powered off, or not being in a WLAN or establishing a communication connection with handset a.
As shown in fig. 5 (a) and (b), the user may click on the my icon 109, open the my interface 20, log in to the account through the account center icon 201, exit the account, manage other services related to the account (such as cloud space services), etc.
The user can also establish a sharing group when sharing devices, and share the associated electronic devices with other user accounts in the group. As shown in fig. 5 (b) and (c), the user may click the "my share" icon 202 in the "my" interface 20 to open the "my share" interface 30, view the established sharing groups, and establish a new device sharing group, such as the established family group 301 shown in the figure, in which user A (corresponding to "A" among the group members) shares the associated electronic devices with user B (corresponding to "B" among the group members).
It will be appreciated that the above description only takes user A sharing control of an electronic device with user B as an example; user A may also act as a sharee and accept sharing from other users, and the cards corresponding to electronic devices shared with user A may be displayed in the interface corresponding to the "home" icon 108. In addition, the sharing group may take other forms; for example, the word "group" may not be displayed in the interface, and only information about the sharing members is shown, with user A and each sharing member forming a sharing group.
When the online device list is acquired, the online device list in the family scene can be obtained from the cloud device according to the account logged in on the electronic device and the accounts of the other users in the sharing group corresponding to the device sharing service, and the online device list in the same environment can then be selected from it.
In addition, the user can share other services, such as cloud space, with other user accounts, so that users can enjoy a more intelligent multi-device perception service. That is, mobile phone A can obtain, from the cloud device, the audio devices in the online state (simply referred to as online devices) associated with all accounts in each sharing group that mobile phone A belongs to, and then select the online devices in the environment of mobile phone A (referred to as target online devices). The following example is illustrative:
Assume that user A logs in to the cloud space service and the "Smart Life" application on mobile phone A through account A, associates smart TV 1004 and speaker 1006 in the "Smart Life" application, and adds account B as a sharing member of the family group; user A also logs in to tablet 1002 and smart watch 1003 through account A (as the system account). User B logs in to the cloud space service and the "Smart Life" application on mobile phone B through account B, associates speaker 1008 in the "Smart Life" application, and adds account A and account C as sharing members in the cloud space service. User C logs in to the cloud space service and the "Smart Life" application on mobile phone C through account C, associates no device in the "Smart Life" application, and also logs in to computer 1009 through account C. Assume that all of these electronic devices are online except computer 1009.
Then, account A is in two sharing groups: the family group in the "Smart Life" application (including account A and account B), and the sharing group in the cloud space service (including account A, account B, and account C). According to these two sharing groups, all accounts in the sharing groups that mobile phone A belongs to total three: account A, account B, and account C. Mobile phone A can acquire, from the cloud device, all online devices associated with these three accounts. The online devices associated with account A include: mobile phone A, tablet 1002, speaker 1006, smart watch 1003, and smart TV 1004; the online devices associated with account B include: mobile phone B and speaker 1008; the online devices associated with account C include: mobile phone C.
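The account merging in this example can be reproduced with a short sketch; the group contents and per-account device lists are taken directly from the scenario above, while the variable names are illustrative:

```python
# The two sharing groups containing account A are merged into one account set,
# and the online devices associated with each account are then collected.
smart_life_family_group = {"A", "B"}          # family group in "Smart Life"
cloud_space_sharing_group = {"A", "B", "C"}   # sharing group in cloud space
all_accounts = smart_life_family_group | cloud_space_sharing_group

online_by_account = {
    "A": ["mobile phone A", "tablet 1002", "speaker 1006",
          "smart watch 1003", "smart TV 1004"],
    "B": ["mobile phone B", "speaker 1008"],
    "C": ["mobile phone C"],  # computer 1009 is offline and thus excluded
}
all_online = [d for acc in sorted(all_accounts) for d in online_by_account[acc]]
```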
It can be appreciated that the target group adopted when the online device list is obtained from the cloud device is not limited to the shared group of the cloud space service and the "smart life" application, and may be a shared group/family group corresponding to other services.
In addition, in the case where the user establishes a plurality of groups, if the groups are marked with geographical information, the target group may be determined according to the location of the mobile phone a and the geographical information of each group. For example, the user a also associates an electronic device located in a company in the "smart life" application program, establishes a company group, and takes an account number of a certain colleague of the company as a sharing member of the group to share the electronic device of the company. The user a may mark geographical information for the established home group and company group, respectively, or the mobile phone a may automatically generate geographical information for each group according to the location of the electronic device shared in the group. When determining the target group, the mobile phone a may use the group (i.e., the home group) matching the location of the mobile phone a as the target group.
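A minimal sketch of matching the phone's location against the geographic information of each group follows; the planar coordinates and the 1 km matching radius are assumptions for illustration, since the embodiment only says the group matching the phone's location is used as the target group:

```python
# Pick the group whose marked location is nearest the phone, within a radius.
def pick_target_group(groups, phone_xy, radius_km=1.0):
    """groups: mapping from group name to an (x, y) location in km."""
    best_name, best_dist = None, None
    for name, (gx, gy) in groups.items():
        dist = ((phone_xy[0] - gx) ** 2 + (phone_xy[1] - gy) ** 2) ** 0.5
        if dist <= radius_km and (best_dist is None or dist < best_dist):
            best_name, best_dist = name, dist
    return best_name

groups = {"family group": (0.0, 0.0), "company group": (12.0, 9.0)}
target = pick_target_group(groups, (0.1, 0.1))  # phone is near home
```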
As another alternative, mobile phone A may also obtain a list of devices connected to Wi-Fi from the router and then select the online devices in the same environment (i.e., the target online devices). This method is simple and can quickly obtain the target online devices; when the connection modes are diversified, obtaining the online devices from the cloud device can yield a more comprehensive set of target online devices.
In this embodiment, a target online device other than mobile phone A may be an electronic device whose distance from mobile phone A is smaller than a preset distance, an electronic device in the same Wi-Fi network as mobile phone A whose Wi-Fi signal strength is greater than a preset threshold, an electronic device in the same room as mobile phone A, or an electronic device capable of establishing a close-range communication connection with mobile phone A. The target online devices may also be determined in other manners, and the specific determination manner is not particularly limited in this embodiment.
Mobile phone A may determine the distances to other electronic devices by using technologies such as global positioning system (GPS) positioning, Bluetooth ranging, or UWB ranging, and the value of the preset distance may be set as needed, for example, 5 meters. In addition, mobile phone A may obtain, from the router, the Wi-Fi signal strengths of the electronic devices connected to the Wi-Fi network. For some electronic devices with relatively fixed locations (such as smart TVs), mobile phone A can obtain the room where the device is located from the "Smart Life" application; for some electronic devices without fixed locations, mobile phone A may determine the room where the device is located using GPS positioning, Bluetooth positioning, UWB positioning, or other techniques.
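The alternative criteria in the two paragraphs above can be combined into a single predicate, sketched below; the 5-meter distance is the example value from the text, while the -60 dBm RSSI threshold and the field names are assumptions:

```python
# A device qualifies as a target online device if ANY criterion holds: close
# enough, strong enough Wi-Fi signal, or located in the phone's room.
PRESET_DISTANCE_M = 5.0    # example value from the text
PRESET_RSSI_DBM = -60      # assumed threshold for illustration

def is_target_online_device(device, phone_room):
    return (
        device.get("distance_m", float("inf")) < PRESET_DISTANCE_M
        or device.get("wifi_rssi_dbm", -999) > PRESET_RSSI_DBM
        or device.get("room") == phone_room
    )
```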
In this embodiment, for different target user behaviors, the same environment may be determined in the same manner, or the same environment may be determined in different manners. The present embodiment is exemplified by the case of determining the same environment in different manners.
Taking the case where, in the call scene, the same environment refers to the same room (i.e., the room where mobile phone A is located) as an example, in this embodiment, after mobile phone A detects that the user enters the call state, it may, as described above, obtain from the cloud device the online devices associated with each account in the sharing groups that mobile phone A belongs to: mobile phone A, tablet 1002, smart watch 1003, smart TV 1004, speaker 1006, speaker 1008, mobile phone B, and mobile phone C. Then, the target online devices in the same room (namely, the room where mobile phone A is located: the living room) can be selected from these: mobile phone A, tablet 1002, smart watch 1003, smart TV 1004, speaker 1006, mobile phone B, and mobile phone C.
S120, judging whether a person exists in the same environment, and if not, executing a step S130; if so, step S140 is performed.
In addition to determining the target online devices, mobile phone A may determine whether anyone is present in the same environment. If not, mobile phone A may control all target online devices other than the answering device to reduce their volume; if someone is present, the electronic devices may be controlled differently depending on their classification.
In the specific judgment, the mobile phone A can determine whether people exist in the surrounding environment through one or more devices of an intelligent door lock, a router, an internal or external camera, a human body sensor and the like. The mobile phone A can acquire personnel information from an intelligent door lock, acquire equipment information of electronic equipment connected to the same Wi-Fi from a router, and acquire personnel information in the surrounding environment from a camera and a human body sensor; the mobile phone a can determine whether a person exists in the environment according to one or more kinds of information.
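The multi-source presence judgment of step S120 can be sketched as an any-of check; the signal names are assumptions, since the embodiment lists only the source devices:

```python
# Each information source (door lock, router, camera, body sensor) contributes
# a boolean observation; any positive observation means someone is present.
def someone_present(observations):
    return any(observations.values())

observations = {
    "smart_door_lock_person_home": False,
    "router_sees_other_phones": True,
    "camera_detects_person": False,
    "human_body_sensor_triggered": False,
}
```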
S130, controlling all devices except the answering device in the device list to reduce the volume.
Specifically, for each target online device other than the answering device, when the volume of the device exceeds the preset volume value, the volume of the device may be adjusted to the preset volume value, or other volume control manners may be adopted to perform device control, which is not limited in this embodiment.
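The volume rule just described (only turn a device down when it exceeds the preset value) amounts to a clamp, sketched here as a minimal illustration:

```python
# Step S130 volume rule: devices louder than the preset value are lowered to
# it; devices already at or below the preset value keep their current volume.
def reduce_volume(current_volume, preset_volume):
    return min(current_volume, preset_volume)
```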
And S140, performing equipment control according to the classification of each equipment in the equipment list.
For the situation that people exist in the same environment, the mobile phone A can classify the target online equipment, and different control modes can be adopted for different classified target online equipment.
In this embodiment, the electronic devices may be classified into the following three types according to the device sharing rate of the electronic devices: private devices, low-sharing-rate public devices (simply referred to as low-sharing devices), high-sharing-rate public devices (simply referred to as high-sharing devices).
Wherein the sharing rates of the private device, the low sharing device, and the high sharing device are sequentially increased.
It will be appreciated that the electronic devices may be categorized into other numbers of classes, such as just two classes (private devices and public devices), or more. This embodiment takes a total of three classes as an example.
In this embodiment, the classification method of the electronic device includes, but is not limited to, the following three ways:
First, classification based on device type. Devices that are typically used exclusively by one person, such as mobile phones and smart watches, may be determined to be private devices, as shown for example in fig. 6; devices that can be used by multiple persons but have a relatively low sharing rate, such as tablets and computers, are determined to be low-sharing devices; devices that can be used by multiple persons and have a relatively high sharing rate, such as smart TVs and smart speakers, are determined to be high-sharing devices. The subsequent description of this embodiment takes this classification method as an example.
Second, classification according to the account logged in on the electronic device; that is, the accounts are marked, and the classification of the electronic device is the classification to which the logged-in account belongs. For example, when account 1 is logged in, the tablet belongs to the low-sharing devices; when account 2 is logged in, it belongs to the private devices.
Third, classification according to the account switching frequency of the electronic device. Specifically, the number of account switches of the electronic device within a preset time period can be determined. When the number of account switches is smaller than a first threshold, the electronic device may be considered a private device; when the number of account switches is greater than or equal to the first threshold and smaller than a second threshold, the electronic device may be considered a low-sharing device; when the number of account switches is greater than or equal to the second threshold, the electronic device may be considered a high-sharing device. The first threshold and the second threshold may be set according to actual needs; for example, the first threshold may be two and the second threshold may be four.
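The third classification method can be sketched directly from the example thresholds in the text (first threshold two, second threshold four):

```python
# Classify a device by the number of account switches in the preset period.
FIRST_THRESHOLD = 2   # example value from the text
SECOND_THRESHOLD = 4  # example value from the text

def classify_by_account_switches(switch_count):
    if switch_count < FIRST_THRESHOLD:
        return "private"
    if switch_count < SECOND_THRESHOLD:
        return "low-sharing"
    return "high-sharing"
```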
A user may answer a call not only on a private device but also on a low-sharing device (such as a tablet) or a high-sharing device (such as a smart TV); for example, user A may answer a call on mobile phone A, on the tablet, or on the smart TV. In view of this, different control modes may be adopted for different answering devices in this embodiment.
As shown in fig. 7, in the case where the target user behavior is the user entering the call state, when device control is specifically performed, the device class of the answering device may first be determined (step S141). If the answering device is a private device or a low-sharing device, the same-account private devices and low-sharing devices in the device list (i.e., the private devices and low-sharing devices logged in with the same account as the answering device) may be controlled to reduce their volume (step S142). If the answering device is a high-sharing device, the same-account private devices, the same-account low-sharing devices, and the high-sharing devices other than the answering device in the device list may be controlled to reduce their volume (step S143).
For example, according to the first classification method, among the online devices in the room (i.e., the same environment) where user A is located, the private devices include: mobile phone A, smart watch 1003 (same account as mobile phone A), mobile phone B, and mobile phone C; the low-sharing devices include: tablet 1002 (same account as mobile phone A); the high-sharing devices include: smart TV 1004 and speaker 1006. If user A answers the call on mobile phone A, tablet 1002 and smart watch 1003 can be controlled to reduce their volume, while the volumes of mobile phone B, mobile phone C, smart TV 1004, and speaker 1006 remain unchanged. If user A answers the call on tablet 1002, mobile phone A and smart watch 1003 can be controlled to reduce their volume, while the volumes of mobile phone B, mobile phone C, smart TV 1004, and speaker 1006 remain unchanged. If user A answers the call on smart TV 1004, mobile phone A, tablet 1002, smart watch 1003, and speaker 1006 can be controlled to reduce their volume, while the volumes of mobile phone B and mobile phone C remain unchanged.
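The branch logic of steps S141 to S143 applied to this worked example can be sketched together as follows; the tuple representation and the use of the answering user's account are modeling assumptions:

```python
# Return the devices (other than the answering device) whose volume should be
# reduced, following fig. 7: answering on a private/low-sharing device quiets
# same-account private and low-sharing devices; answering on a high-sharing
# device additionally quiets the other high-sharing devices.
def devices_to_quiet(devices, answering_name, answering_class, user_account):
    quiet = []
    for name, cls, account in devices:
        if name == answering_name:
            continue
        same_account_personal = (
            account == user_account and cls in ("private", "low-sharing")
        )
        if answering_class in ("private", "low-sharing"):
            if same_account_personal:
                quiet.append(name)
        elif same_account_personal or cls == "high-sharing":
            quiet.append(name)
    return quiet

room_devices = [
    ("mobile phone A", "private", "A"),
    ("smart watch 1003", "private", "A"),
    ("tablet 1002", "low-sharing", "A"),
    ("mobile phone B", "private", "B"),
    ("mobile phone C", "private", "C"),
    ("smart TV 1004", "high-sharing", None),
    ("speaker 1006", "high-sharing", None),
]
```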
In this embodiment, when a user chooses to answer a call on a private device or a low-sharing device, the same-account private devices and low-sharing devices are controlled to reduce their volume, which reduces the influence of those electronic devices on the answering device. For private devices of other accounts, low-sharing devices of other accounts, and all high-sharing devices (i.e., high-sharing devices of both the same and other accounts), no volume control is performed, i.e., the volume remains unchanged, which can enhance the experience of other users in the public space.
When a user chooses to answer a call on a high-sharing device, the call has a public attribute and may be listened to by other users in the public space. In this case, in addition to the same-account private devices and low-sharing devices, the high-sharing devices other than the answering device are also controlled to reduce their volume, which further reduces the influence of other electronic devices on the answering device and improves the experience of every user participating in the call. In addition, considering the personal needs of other users, no volume control is performed on private devices and low-sharing devices of other accounts, i.e., their volume remains unchanged, which can enhance those users' device experience in the public space.
The above describes the device control process in the call scenario, and the following describes the device control process in the sleep scenario.
Fig. 8 is a schematic diagram of a sleep scenario provided in the embodiment of the present application. As shown in fig. 8, user A sleeps in the primary bedroom, user B listens to music in the secondary bedroom, and user C watches television in the living room. For this situation, the device control method provided in the embodiment of the present application may control the electronic devices in the same environment so as to reduce their influence on user A's sleep while still considering the needs of the other users. The specific flow of the device control is described below.
Fig. 9 is a schematic flow chart of another method for controlling a device, similar to the method shown in fig. 4, where the execution subject of the method may be any electronic device or a designated electronic device in a home scenario, and in this embodiment, the execution subject is taken as an example of a mobile phone a to perform an exemplary description.
Referring to fig. 9, the device control method provided in the present embodiment may include the following steps:
s210, after detecting that a user enters a sleep state, acquiring a device list in an online state in the same environment.
The mobile phone a may detect the sleep state of the user throughout the day, or may use a period from a first time of the evening to a second time of the next day as a detection period, and perform sleep detection in the detection period, so as to save energy consumption.
In specific detection, sleep detection can be performed according to the state information of mobile phone A; information can also be acquired from other electronic devices and combined with mobile phone A's own information to improve the accuracy of the detection result. The conditions under which mobile phone A determines that user A has fallen asleep include, but are not limited to, at least one of the following: the current time is within a sleep period preset by the system or set by the user; the user closes the curtains, doors and/or lights in the primary bedroom; a smart wearable device (e.g., smart watch 1003) detects that user A has fallen asleep based on user A's heart rate, movement information, and/or brainwave information; mobile phone A turns off its screen at night and starts charging; mobile phone A enables the do-not-disturb mode or flight mode; a smart speaker (not shown) in the primary bedroom starts to play sleep-aiding audio; the user switches the air purifier or the temperature and humidity control apparatus (not shown) to sleep mode; the user issues an instruction indicating falling asleep, such as "I'm going to sleep", "good night", "turn off the lights", or "play sleep-aiding music".
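The sleep-onset conditions listed above can be sketched as an any-of check; the signal names are assumptions standing in for the listed conditions:

```python
# Step S210 sleep detection: each listed condition contributes a boolean
# signal, and at least one true signal marks the user as asleep.
def user_asleep(signals):
    return any(signals.values())

signals = {
    "in_preset_sleep_period": False,
    "bedroom_curtains_doors_lights_closed": True,
    "wearable_detects_sleep": False,   # heart rate / movement / brainwave
    "screen_off_and_charging_at_night": False,
    "dnd_or_flight_mode_enabled": False,
    "sleep_aid_audio_playing": False,
    "goodnight_voice_command": False,
}
```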
The manner of acquiring the online device list in the same environment by the mobile phone a is similar to that in step S110, and will not be described here again.
Taking the case where, in the sleep scene, the same environment refers to the same house (i.e., the house where mobile phone A is located) as an example, in this embodiment, after mobile phone A of user A detects that the user enters the sleep state, it may, as described above, obtain from the cloud device the online devices associated with each account in the sharing groups that mobile phone A belongs to: mobile phone A, tablet 1002, smart watch 1003, smart TV 1004, speaker 1006, speaker 1008, mobile phone B, and mobile phone C. Then, the target online devices in the same house (namely, the whole home environment where mobile phone A is located) can be selected; in this case, the target online devices are all of the online devices obtained from the cloud device.
S220, judging whether a person exists in the same environment, and if not, executing a step S230; if so, step S240 is performed.
The mobile phone A can judge whether people exist in the same environment or not except for determining target online equipment (namely online equipment in the same environment), and if not, can control all the target online equipment to reduce the volume; if someone is present, the electronic device may be controlled differently depending on the classification of the electronic device.
The method for determining whether a person exists is similar to that in step S120, and will not be described here.
S230, controlling all devices in the device list to reduce the volume.
Specifically, for each target online device, the volume control may be performed, and the specific control process may be referred to the description related to step S130, which is not repeated herein.
S240, controlling mobile phone A, the private devices with the same account as mobile phone A, and the low-sharing devices with the same account in the device list to reduce their volume, and controlling the private devices, low-sharing devices, and high-sharing devices of other accounts to send out reminder messages.
For the situation that people exist in the same environment, the mobile phone A can classify the target online equipment, and different control modes can be adopted for different classified target online equipment. The device classification method may be referred to the related description in step S140, and will not be described herein.
Taking the same environment being the same house as an example, based on the first classification method, among the obtained online devices in the same environment, the private devices include: mobile phone A, smart watch 1003 (same account as mobile phone A), mobile phone B, and mobile phone C; the low-sharing devices include: tablet 1002 (same account as mobile phone A); the high-sharing devices include: smart TV 1004, speaker 1006, and speaker 1008. After detecting that user A has fallen asleep, mobile phone A, tablet 1002, and smart watch 1003 can be controlled to reduce their volume, while mobile phone B, mobile phone C, smart TV 1004, speaker 1006, and speaker 1008 are controlled to send out reminder messages.
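Step S240 applied to this example can be sketched as follows; as in the call-scene sketch, the tuple representation is a modeling assumption:

```python
# Same-account private and low-sharing devices are quieted; every other
# target online device is asked to show a reminder message instead.
def sleep_control(devices, sleeper_account):
    quiet, remind = [], []
    for name, cls, account in devices:
        if account == sleeper_account and cls in ("private", "low-sharing"):
            quiet.append(name)
        else:
            remind.append(name)
    return quiet, remind

house_devices = [
    ("mobile phone A", "private", "A"),
    ("smart watch 1003", "private", "A"),
    ("tablet 1002", "low-sharing", "A"),
    ("mobile phone B", "private", "B"),
    ("mobile phone C", "private", "C"),
    ("smart TV 1004", "high-sharing", None),
    ("speaker 1006", "high-sharing", None),
    ("speaker 1008", "high-sharing", None),
]
quiet, remind = sleep_control(house_devices, "A")
```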
The reminder message sent by a controlled electronic device may include a text message and/or a voice message. For example, for mobile phones and tablets, it may be a text message; for smart TVs, it may include both a text message and a voice message; for smart speakers and smart watches, it may be a voice message. The content of the reminder message can be set as needed, for example: "Someone has fallen asleep; you may turn down the volume."
Similar to the call scene, controlling mobile phone A, the same-account private devices, and the same-account low-sharing devices to reduce their volume can reduce the influence of these electronic devices on user A's sleep. For private devices of other accounts (i.e., accounts different from the one logged in on mobile phone A), low-sharing devices of other accounts, and all high-sharing devices, volume control can be omitted, i.e., the volume remains unchanged, which can enhance the experience of other users in the public space. Furthermore, to improve the intelligence of the control and thus further improve the users' device experience, the private devices, low-sharing devices, and high-sharing devices of other accounts can be controlled to send out reminder messages, reminding the users of those devices to keep the volume low so as not to disturb the rest of others.
In this embodiment, when the user is detected to exit from the call state or the sleep state, the target online device with the reduced volume may be restored to the volume before the reduction.
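The restore behavior can be sketched with a small controller that remembers each device's volume before reduction; the class and method names are illustrative:

```python
# Remembers pre-reduction volumes so they can be restored when the user exits
# the call or sleep state.
class VolumeController:
    def __init__(self):
        self._saved = {}

    def reduce(self, device, current_volume, preset_volume):
        """Lower the device to the preset if needed, remembering the old value."""
        if current_volume > preset_volume:
            self._saved[device] = current_volume
            return preset_volume
        return current_volume

    def restore(self, device):
        """Return the remembered volume, or None if it was never reduced."""
        return self._saved.pop(device, None)
```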
Those skilled in the art will appreciate that the above embodiments are exemplary and not intended to limit the present application. The order of execution of one or more of the above steps may be adjusted, or optionally combined, as the case may be, to obtain one or more other embodiments, for example, the step of determining whether a person exists in the same environment may be performed before acquiring a list of devices in the same environment that are online, or both steps may be performed simultaneously. Those skilled in the art can select any combination from the above steps according to the need, and all the steps do not depart from the spirit of the scheme of the present application.
According to the device control method described above, when the user is detected to enter the call state or the sleep state, the online devices in the same environment can be obtained, and different volume control modes can be applied to these devices according to their classification. This improves the intelligence of device control, and improves the device experience of the individual user while also improving the device experience of the collective users in the public space.
Based on the same inventive concept, as an implementation of the above method, the embodiment of the present application provides a device control apparatus, where the embodiment of the device corresponds to the embodiment of the foregoing method, and for convenience of reading, the embodiment of the present application does not describe details of the embodiment of the foregoing method one by one, but it should be clear that the device in the embodiment can correspondingly implement all the details of the embodiment of the foregoing method.
Fig. 10 is a schematic structural diagram of an apparatus control device provided in an embodiment of the present application, and as shown in fig. 10, the apparatus provided in this embodiment may include: a display module 210, an input module 220, a processing module 230, and a communication module 240.
Wherein the display module 210 is used to support the electronic device to perform the interface display operations in the above-described embodiments and/or other processes for the techniques described herein. The display module may be a touch screen or other hardware or a combination of hardware and software.
The input module 220 is configured to receive user input on a display interface of the electronic device, such as touch input, voice input, gesture input, etc., and is configured to support the electronic device to perform the steps of receiving a user's answer call operation in the above-described embodiments and/or other processes for the techniques described herein. The input module may be a touch screen or other hardware or a combination of hardware and software.
The processing module 230 is used to support the electronic device to perform the processing operations in S110-S140, S141-S143, S210-S240 in the embodiments described above and/or other processes for the techniques described herein.
The communication module 240 is used to support the electronic device to perform operations related to the communication process between the cloud device and other electronic devices in the above embodiments and/or other processes for the techniques described herein.
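As a rough sketch, the cooperation of the four modules described above could be composed as follows; all class and method names are illustrative assumptions rather than the actual implementation.

```python
class DisplayModule:            # display module 210
    def show(self, interface):
        return f"display:{interface}"

class InputModule:              # input module 220
    def on_event(self, event):  # touch, voice, or gesture input
        return event

class ProcessingModule:         # processing module 230
    def handle(self, event):
        return f"processed:{event}"

class CommunicationModule:      # communication module 240
    def send(self, message, target):
        return (target, message)

class DeviceControlApparatus:
    """Aggregates modules 210-240; in practice they may be integrated in one
    processing unit or exist as separate physical units."""
    def __init__(self):
        self.display = DisplayModule()
        self.input = InputModule()
        self.processing = ProcessingModule()
        self.communication = CommunicationModule()

    def on_user_input(self, event):
        # Receive input, process it, then communicate with the cloud device.
        e = self.input.on_event(event)
        result = self.processing.handle(e)
        return self.communication.send(result, "cloud")
```

For example, an "answer call" input would flow from the input module through the processing module and out via the communication module, mirroring the division of labor described for Fig. 10.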
The apparatus provided in this embodiment may perform the above method embodiment; its implementation principle and technical effects are similar, and are not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Based on the same inventive concept, the embodiment of the present application further provides an electronic device, please refer to fig. 11, and fig. 11 is a schematic structural diagram of the electronic device provided in the embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of this embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division synchronous code division multiple access (time-division synchronous code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or receiving a voice message, voice may be heard by placing the receiver 170B close to the human ear.
Microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The electronic device provided in this embodiment may perform the above method embodiment; its implementation principle and technical effects are similar, and are not described herein again.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the method described in the above method embodiment.
Embodiments of the present application also provide a computer program product which, when run on an electronic device, causes the electronic device to execute the method described in the above method embodiments.
The embodiment of the application also provides a chip system, which comprises a processor coupled to a memory; the processor executes a computer program stored in the memory to implement the method described in the above method embodiment. The chip system may be a single chip or a chip module composed of multiple chips.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the processes of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium may include: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
In the foregoing embodiments, each embodiment is described with its own emphasis. For parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In the description of the present application, unless otherwise indicated, "/" indicates an "or" relationship between the associated objects; for example, A/B may represent A or B. The term "and/or" in this application merely describes an association relation between associated objects, indicating that three kinds of relations may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural.
Also, in the description of the present application, unless otherwise indicated, "a plurality" means two or more than two. "at least one of the following" or similar expressions thereof, means any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present application, not for limiting them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (21)

1. A device control method applied to an electronic device, comprising:
acquiring a device set of audio devices in an online state in a target area where a user is located;
dividing the audio devices into a first type of device and a second type of device according to device sharing rates of the audio devices; wherein the device sharing rate of the second type of device is higher than the device sharing rate of the first type of device;
and in response to detecting the target behavior of the user, controlling the first type of device in the device set whose associated account meets a preset requirement to reduce the volume, and controlling the second type of device in the device set to reduce the volume or keep the volume unchanged.
2. The method of claim 1, wherein the controlling, in response to detecting the target behavior of the user, the first type of device in the device set whose associated account meets the preset requirement to reduce the volume comprises:
and in response to detecting that the user enters a call state, controlling the first type of device in the device set that is associated with the same account as the answering device to reduce the volume.
3. The method according to claim 2, wherein the method further comprises:
and if the answering device belongs to the second type of device, controlling the second type of device in the device set other than the answering device to reduce the volume.
4. The method according to claim 2 or 3, wherein the method further comprises:
judging whether other users exist in the target area;
and if no other users exist in the target area, controlling other devices except the answering device in the device set to reduce the volume.
5. The method of claim 1, wherein the controlling, in response to detecting the target behavior of the user, the first type of device in the device set whose associated account meets the preset requirement to reduce the volume comprises:
and in response to detecting that the user enters a sleep state, controlling the first type of device in the device set that is associated with the same account as the electronic device to reduce the volume.
6. The method of claim 5, wherein the method further comprises:
and controlling other devices which do not reduce the volume in the device set to send out a reminding message.
7. The method according to claim 5 or 6, wherein the method further comprises:
judging whether other users exist in the target area;
and if no other users exist in the target area, controlling other devices in the device set to reduce the volume.
8. The method of any of claims 1-3, 5-6, wherein the associated account of the audio device in the online state in the target area is a first account or a second account, the first account is an associated account of the electronic device, and the second account and the first account belong to the same target group.
9. The method of claim 8, wherein the audio device in the target area that is online satisfies at least one of the following requirements:
the distance between the audio device and the electronic device is smaller than a preset distance;
the audio device is located in the same wireless local area network as the electronic device;
a near field communication connection can be established between the audio device and the electronic device;
the audio device is in the same target building space as the electronic device.
10. A device control apparatus applied to an electronic device, comprising:
the communication module is used for acquiring a device set of audio devices in an online state in a target area where a user is located, and dividing the audio devices into a first type of device and a second type of device according to device sharing rates of the audio devices; wherein the device sharing rate of the second type of device is higher than the device sharing rate of the first type of device;
and the processing module is used for, in response to detecting the target behavior of the user, controlling the first type of device in the device set whose associated account meets a preset requirement to reduce the volume, and controlling the second type of device in the device set to reduce the volume or keep the volume unchanged.
11. The apparatus of claim 10, wherein the processing module is specifically configured to:
and in response to detecting that the user enters a call state, controlling the first type of device in the device set that is associated with the same account as the answering device to reduce the volume.
12. The apparatus of claim 11, wherein the processing module is further configured to:
and if the answering device belongs to the second type of device, controlling the second type of device in the device set other than the answering device to reduce the volume.
13. The apparatus of claim 11 or 12, wherein the processing module is further configured to:
judging whether other users exist in the target area;
and if no other users exist in the target area, controlling other devices except the answering device in the device set to reduce the volume.
14. The apparatus of claim 10, wherein the processing module is specifically configured to:
and in response to detecting that the user enters a sleep state, controlling the first type of device in the device set that is associated with the same account as the electronic device to reduce the volume.
15. The apparatus of claim 14, wherein the processing module is further configured to:
and controlling other devices which do not reduce the volume in the device set to send out a reminding message.
16. The apparatus of claim 14 or 15, wherein the processing module is further configured to:
judging whether other users exist in the target area;
and if no other users exist in the target area, controlling other devices in the device set to reduce the volume.
17. The apparatus of any of claims 10-12, 14-15, wherein the associated account of the audio device in the online state in the target area is a first account or a second account, the first account is an associated account of the electronic device, and the second account and the first account belong to a same target group.
18. The apparatus of claim 17, wherein the audio device in the target area that is online satisfies at least one of the following requirements:
the distance between the audio device and the electronic device is smaller than a preset distance;
the audio device is located in the same wireless local area network as the electronic device;
a near field communication connection can be established between the audio device and the electronic device;
the audio device is in the same target building space as the electronic device.
19. An electronic device, comprising: a memory and a processor, the memory for storing a computer program; the processor is configured to perform the method of any of claims 1-9 when the computer program is invoked.
20. A computer readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the method according to any of claims 1-9.
21. A chip system comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any of claims 1-9.
CN202011536963.5A 2020-12-23 2020-12-23 Equipment control method and device and electronic equipment Active CN114666444B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011536963.5A CN114666444B (en) 2020-12-23 2020-12-23 Equipment control method and device and electronic equipment
PCT/CN2021/137174 WO2022135183A1 (en) 2020-12-23 2021-12-10 Device control method and apparatus and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011536963.5A CN114666444B (en) 2020-12-23 2020-12-23 Equipment control method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN114666444A CN114666444A (en) 2022-06-24
CN114666444B true CN114666444B (en) 2023-06-06

Family

ID=82025322

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011536963.5A Active CN114666444B (en) 2020-12-23 2020-12-23 Equipment control method and device and electronic equipment

Country Status (2)

Country Link
CN (1) CN114666444B (en)
WO (1) WO2022135183A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017035967A1 (en) * 2015-09-01 2017-03-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Smart speaker volume control method, device, smart speaker and mobile terminal
CN107085511A (en) * 2016-02-12 2017-08-22 Panasonic Intellectual Property Corporation of America Control method, control device and equipment

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN104898446A (en) * 2015-05-29 2015-09-09 Sichuan Changhong Electric Co., Ltd. Control method and intelligent household control device
CN106899869A (en) * 2015-12-25 2017-06-27 Xiaomi Technology Co., Ltd. Method, apparatus and system for adjusting electronic device volume
US10048929B2 (en) * 2016-03-24 2018-08-14 Lenovo (Singapore) Pte. Ltd. Adjusting volume settings based on proximity and activity data
CN106528036B (en) * 2016-10-09 2020-02-14 Tencent Technology (Shenzhen) Co., Ltd. Volume adjusting method and device
CN106371802A (en) * 2016-10-31 2017-02-01 Beijing Xiaomi Mobile Software Co., Ltd. Terminal volume control method and device
CN109923848B (en) * 2017-07-12 2021-03-30 Huawei Technologies Co., Ltd. Method and device for controlling volume of device and server
CN107483744B (en) * 2017-09-12 2018-06-01 Jiangsu Saibo Yuhua Technology Co., Ltd. Mobile phone communication anti-interference method
US20190349683A1 (en) * 2018-05-14 2019-11-14 International Business Machines Corporation Adjusting audio volume for a plurality of zone speakers, separately, within a plurality of zones in real-time


Also Published As

Publication number Publication date
WO2022135183A1 (en) 2022-06-30
CN114666444A (en) 2022-06-24

Similar Documents

Publication Publication Date Title
CN110364151B (en) Voice awakening method and electronic equipment
WO2020249062A1 (en) Voice communication method and related device
CN114173204B (en) Message prompting method, electronic equipment and system
CN111369988A (en) Voice awakening method and electronic equipment
CN113272745A (en) Smart home equipment sharing system and method and electronic equipment
CN110401767B (en) Information processing method and apparatus
WO2021000817A1 (en) Ambient sound processing method and related device
CN110198362B (en) Method and system for adding intelligent household equipment into contact
WO2020216098A1 (en) Method for providing forwarding service across electronic apparatuses, apparatus, and system
WO2020164256A1 (en) Wireless signal quality evaluating method, electronic device, and system
EP3883299A1 (en) Method for smart home appliance to access network and related device
CN114449110A (en) Control method and device of electronic equipment
CN109285563B (en) Voice data processing method and device in online translation process
WO2023071502A1 (en) Volume control method and apparatus, and electronic device
CN114666444B (en) Equipment control method and device and electronic equipment
CN113467747B (en) Volume adjusting method, electronic device and storage medium
CN113572798B (en) Device control method, system, device, and storage medium
CN115242994A (en) Video call system, method and device
CN116056050A (en) Audio playing method, electronic equipment and system
CN114895991B (en) Content sharing method and electronic equipment
CN115035894B (en) Equipment response method and device
CN115734323B (en) Power consumption optimization method and device
CN114258044B (en) Standby method, standby system and terminal equipment
WO2023142900A1 (en) Volume adjustment method and electronic device
CN115883714A (en) Message reply method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant