CN115718433A - Control method and device of intelligent equipment, intelligent system and storage medium - Google Patents


Info

Publication number
CN115718433A
Authority
CN
China
Prior art keywords
target user
information
intelligent
behavior
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211437928.7A
Other languages
Chinese (zh)
Inventor
张海钰 (Zhang Haiyu)
许丽 (Xu Li)
万根顺 (Wan Genshun)
潘嘉 (Pan Jia)
刘聪 (Liu Cong)
胡国平 (Hu Guoping)
刘庆峰 (Liu Qingfeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
iFlytek Co Ltd
Original Assignee
iFlytek Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by iFlytek Co Ltd
Priority to CN202211437928.7A
Publication of CN115718433A
Legal status: Pending

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Selective Calling Equipment (AREA)

Abstract

The application provides a control method and apparatus for an intelligent device, an intelligent system, and a storage medium. The method analyzes a target user's potential control behavior toward an intelligent device according to the user's state information, which comprises the user's behavior/action information and/or physiological parameter information. A target intelligent device corresponding to the potential control behavior is then determined from among the active intelligent devices, i.e., the intelligent devices that the target user can control, and the target device is controlled according to the potential control behavior. The intelligent device is thus controlled proactively based on the user's state, without the user having to state a requirement explicitly, which makes the device convenient to use and improves the user experience.

Description

Control method and device of intelligent equipment, intelligent system and storage medium
Technical Field
The present application relates to the field of intelligent control technologies, and in particular, to a method and an apparatus for controlling an intelligent device, an intelligent system, and a storage medium.
Background
With the increasing maturity of artificial intelligence technologies, more and more intelligent devices are entering users' lives, and human-machine interaction is becoming common. At present, intelligent devices generally provide a voice control function: a user interacts with a device through speech to control it. However, this voice control mode only lets the device react passively to requirements the user states explicitly, which is inconvenient.
Disclosure of Invention
In view of the above, the present application provides a control method and apparatus for an intelligent device, an intelligent system, and a storage medium, to solve the prior-art problem that an intelligent device can only react passively to requirements explicitly stated by the user, which is inconvenient to use.
The technical scheme provided by the application is as follows:
in one aspect, the present application provides a method for controlling an intelligent device, including:
determining potential control behaviors of a target user on the intelligent device at least according to state information of the target user; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
determining target intelligent equipment corresponding to the potential control behaviors from active intelligent equipment; the active smart devices comprise smart devices controllable by the target user;
and controlling the target intelligent equipment according to the potential control behaviors.
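The three claimed steps can be sketched as a minimal control loop. This is an illustrative Python sketch only: the function names, state labels, and rule tables are assumptions, not part of the patent.

```python
# Minimal sketch of the claimed three-step flow. All names and the
# rule tables below are illustrative assumptions, not from the patent.

def infer_potential_behavior(state):
    """Step 1: map target-user state information to a potential control behavior."""
    rules = {
        "lying_in_bed_slow_heart_rate": "close_curtains_and_dim_lights",
        "body_temperature_rising": "lower_ac_set_temperature",
    }
    return rules.get(state)

def select_target_device(behavior, active_devices):
    """Step 2: pick the target smart device from the active (controllable) set."""
    behavior_to_device = {
        "close_curtains_and_dim_lights": "smart_curtain",
        "lower_ac_set_temperature": "smart_air_conditioner",
    }
    device = behavior_to_device.get(behavior)
    return device if device in active_devices else None

def control(state, active_devices):
    """Step 3: control the target device according to the potential behavior."""
    behavior = infer_potential_behavior(state)
    if behavior is None:
        return None
    device = select_target_device(behavior, active_devices)
    if device is None:
        return None
    return (device, behavior)
```

If either the behavior cannot be inferred or the mapped device is not currently active, no action is taken, mirroring the restriction to devices controllable by the target user.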
Further, in the method described above, before controlling the target smart device according to the potential control behavior, the method further includes:
outputting inquiry information asking whether to control the target intelligent device according to the potential control behavior;
acquiring reply information of the target user;
and if the reply information is that the target intelligent equipment is controlled according to the potential control behavior, controlling the target intelligent equipment according to the potential control behavior.
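This optional confirmation step (ask, read the reply, act only on consent) can be sketched as follows; the `ask` and `act` callables are hypothetical stand-ins for the real query output and device control I/O.

```python
# Sketch of the confirmation step before controlling the target device.
# `ask` and `act` are hypothetical stand-ins for real I/O, not patent APIs.

def confirm_and_control(device, behavior, ask, act):
    """Output an inquiry, read the target user's reply, and act only on consent."""
    reply = ask(f"Apply '{behavior}' to {device}?")
    if reply == "yes":
        act(device, behavior)
        return True
    return False
```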
Further, in the method described above, determining a potential control behavior of the target user on the smart device according to at least the state information of the target user includes:
determining the potential control behavior of the target user on the intelligent equipment according to the behavior habit preference model of the target user and the state information of the target user;
the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
Further, in the above method, determining the potential control behavior of the target user on the intelligent device according to the behavior habit preference model of the target user and the state information of the target user includes:
and determining a habit preference control behavior corresponding to the state information according to the behavior habit preference model of the target user, and determining the habit preference control behavior as the potential control behavior.
Further, in the above method, the determining, from the active smart devices, a target smart device corresponding to the potential control behavior includes:
according to the behavior habit preference model of the target user, determining target intelligent equipment corresponding to the potential control behavior from the active intelligent equipment; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
Further, in the method described above, the method further includes:
acquiring voice information of the target user;
performing semantic analysis on the voice information of the target user, and determining whether the voice information of the target user comprises control information of intelligent equipment;
and if the voice information of the target user comprises the control information of the intelligent equipment, controlling the intelligent equipment corresponding to the control information according to the control information.
Further, in the method described above, before controlling the smart device corresponding to the control information according to the control information, the method further includes:
performing semantic analysis on the voice information of the target user, and extracting intelligent equipment corresponding to the control information from the voice information of the target user;
or analyzing the behavior and action information of the target user, and determining that the intelligent device the target user is facing is the intelligent device corresponding to the control information, where the orientation refers to the direction in which the target user's face is turned;
or determining the intelligent equipment corresponding to the control information according to the intelligent equipment oriented to the target user and the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
Further, in the method described above, acquiring the voice information of the target user includes:
extracting first voice information of the target user from the detected voice information based on the voiceprint recognition information of the target user; and/or, based on the face identification information of the target user, determining the target user from the detected video information, extracting lip information of the target user, and generating second voice information of the target user according to the lip information; the video information comprises video information obtained by shooting the space where the target user is located;
and determining the voice information of the target user according to the first voice information and/or the second voice information.
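The two acquisition channels above (voiceprint-filtered audio, and lip-reading-derived text) can be combined in many ways; the sketch below is an illustrative assumption that simply filters segments by voiceprint and falls back to the lip-reading channel when no audio segment matches. It is not the patent's fusion method.

```python
# Illustrative sketch: keep audio segments matching the target voiceprint,
# and fall back to lip-reading output when no audio survives the filter.
# A real system might fuse the two channels at the feature level instead.

def extract_first_voice(detected_segments, target_voiceprint):
    """Keep only detected segments whose voiceprint matches the target user."""
    return [text for print_id, text in detected_segments if print_id == target_voiceprint]

def fuse_voice_info(first_voice, second_voice):
    """Prefer audio-derived text when available, else lip-reading text."""
    return first_voice if first_voice else second_voice
```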
Further, in the method described above, the smart device for determining the orientation of the target user includes:
according to at least one of sound field information, video information and ground pressure information, determining intelligent equipment towards which the target user faces; the video information comprises video information obtained by shooting the space where the target user is located; the ground pressure information comprises the ground pressure information of the space where the target user is located; the sound field information is determined from the sound information.
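One simple way to fuse the three cue sources named above is a majority vote over whichever cues are available. The vote scheme is an illustrative assumption; the patent only names sound field, video, and ground pressure as inputs.

```python
# Sketch of fusing sound-field, video, and floor-pressure cues into one
# facing-device decision by majority vote (illustrative assumption).
from collections import Counter

def facing_device(sound_field=None, video=None, pressure=None):
    """Return the device most available cues say the user faces, or None."""
    votes = [v for v in (sound_field, video, pressure) if v is not None]
    if not votes:
        return None
    device, _ = Counter(votes).most_common(1)[0]
    return device
```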
In another aspect, the present application further provides a control apparatus for an intelligent device, including:
the analysis module is used for determining the potential control behavior of the target user on the intelligent equipment at least according to the state information of the target user; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
the determining module is used for determining target intelligent equipment corresponding to the potential control behaviors from active intelligent equipment; the active smart devices comprise smart devices controllable by the target user;
and the control module is used for controlling the target intelligent equipment according to the potential control behaviors.
In another aspect, the present application further provides an intelligent system, including: a controller, and at least one sensor connected to the controller;
the sensor is used for acquiring the state information of a target user and sending the state information of the target user to the controller; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
the controller is used for determining the potential control behavior of the target user on the intelligent equipment at least according to the state information of the target user; determining target intelligent equipment corresponding to the potential control behaviors from active intelligent equipment; the active smart devices comprise smart devices controllable by the target user; and controlling the target intelligent equipment according to the potential control behaviors.
Further, in the above intelligent system, the sensors at least include a pressure sensor, a visual sensor, a sound sensor and a physiological parameter sensor;
the pressure sensor is used for acquiring ground pressure information of a space where the target user is located;
the visual sensor is used for acquiring video information of a space where the target user is located;
the sound sensor is used for acquiring sound information of the target user;
the physiological parameter sensor is used for acquiring physiological parameter information of the target user.
Further, the intelligent system further comprises an intelligent device;
the intelligent device is electrically connected with the controller.
In another aspect, the present application further provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any one of the control methods of the intelligent device described above.
The control method of the intelligent device provided above analyzes a target user's potential control behavior toward an intelligent device according to the user's state information, which comprises the user's behavior/action information and/or physiological parameter information. The target intelligent device corresponding to the potential control behavior is determined from among the active intelligent devices, i.e., the intelligent devices controllable by the target user, and is then controlled according to the potential control behavior. The intelligent device is thus controlled proactively based on the user's state, without the user having to state a requirement explicitly, which makes it convenient to use and improves the user experience.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an application environment of an intelligent system provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a control method of an intelligent device according to an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart illustrating a process of determining control information of a target user for a smart device according to an embodiment of the present application;
fig. 4 is a schematic flowchart of acquiring voice information of a target user according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control apparatus of an intelligent device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an intelligent system provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of a controller according to an embodiment of the present application.
Detailed Description
The technical scheme of the embodiments of the present application is suitable for application scenarios in which intelligent devices are controlled. With this scheme, an intelligent device can be controlled proactively according to the user's state information, without the user having to state a requirement explicitly, which makes the device convenient to use and improves the user experience.
For example, the technical solution of the embodiments of the present application may be applied to hardware devices such as a hardware processor, or may be packaged into a software program to be executed, and when the hardware processor executes the processing procedure of the technical solution of the embodiments of the present application, or the software program is executed, the control of the intelligent device may be implemented. The embodiment of the present application only introduces the specific processing procedure of the technical scheme of the present application by way of example, and does not limit the specific execution form of the technical scheme of the present application, and any technical implementation form that can execute the processing procedure of the technical scheme of the present application may be adopted by the embodiment of the present application.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
Exemplary implementation Environment
Referring to fig. 1, fig. 1 is a schematic diagram of an implementation environment that may be involved in the present application. The implementation environment is an internet-of-things platform, which mainly comprises an internal network based on ZigBee and an external internet based on protocols such as WiFi and Bluetooth. The two networks are connected through a gateway, which carries information between the internet and the intranet. With this structure, devices on the internet can conveniently monitor the states of the various intelligent devices on the intranet, and a controller connected to the gateway can acquire information from all devices on the platform, which facilitates the controller's decision-making.
Devices on the external internet include mobile terminals, such as smartphones and personal computers (PCs), as well as other devices such as smart watches. The mobile terminals and other devices communicate over the WiFi or Bluetooth protocol.
The devices in the internal network comprise mobile devices and fixed devices. Exemplarily, the mobile device includes a floor sweeping robot and other movable smart home devices, and the fixed device includes smart home devices fixedly arranged such as a smart lamp, a smart air conditioner, a smart television, and a smart refrigerator, or smart sensors such as a human body sensor, a visual sensor, a door and window sensor, and a temperature and humidity sensor configured with a communication module (e.g., a ZIGBEE module, a WIFI module, and a bluetooth communication module), or smart switch devices configured with the communication module, such as a wall switch, a wall socket, and a wireless switch.
Compared with wired communication, wireless communication is more portable, more flexible, and easier to expand. The advantages of wired communication lie mainly in information privacy and resistance to signal interference, but with the development of radio-frequency technology, the anti-interference capability and stability of radio transmission have steadily improved, so in most scenarios a wireless scheme is preferable. The mainstream wireless transmission protocols include Bluetooth, WiFi, ZigBee, and Z-Wave. Compared with the other protocols, ZigBee offers long transmission distance, a large maximum number of network nodes, and low power consumption, which makes it particularly suitable for indoor scenarios. Moreover, because ZigBee devices connect through a local-area gateway, basic functions keep working even when the external network is down; for example, basic intelligent functions remain unaffected without internet access. Therefore, ZigBee is selected as the intranet communication protocol in this embodiment.
A ZigBee network topology comprises terminals, a coordinator, and a gateway, and any nodes in the network can exchange data. When a device is added, or removed and later added again, the self-networking capability of the ZigBee terminal module allows the device to join the network automatically and lets the network repair itself. Illustratively, when a new device is added in a house, as long as two terminals are within the communication range of the network module and discover each other automatically, they can quickly form an interconnected network. When a device that has left the house is added again, the network terminal module automatically restores the original network by searching for communication peers again and re-establishing the connections between them.
The coordinator is connected to the gateway, and the intranet communicates with the external internet through the gateway. The coordinator can therefore be controlled through an app on a mobile phone or through a control panel, which in turn authenticates and controls all the intelligent devices. The controller is also connected to the gateway, enabling push services for internal information.
Exemplary method
The present embodiment proposes a control method for an intelligent device, and the method is described by taking a controller applied in an exemplary implementation environment as an example. Referring to fig. 2, the method includes:
s201, determining the potential control behavior of the target user to the intelligent device at least according to the state information of the target user.
The target user refers to a current service object of the intelligent device. In this embodiment, the target user may actively control the intelligent device, or the controller may automatically control the intelligent device based on the state information of the target user.
Specifically, any user may be set as the target user; this embodiment imposes no limitation. When a target user is set, the user's identity characteristic information can be collected and stored so that the user's identity can later be confirmed against acquired identity verification information. In particular, when a plurality of target users are set at the same time, they can be distinguished by their different identity characteristic information. The identity characteristic information comprises any information capable of confirming the user's identity, including password information, biometric information, radio-frequency card information, and the like. The biometric information includes fingerprint information, iris information, voiceprint information, face information, and the like.
Illustratively, in a home scenario, the smart devices are smart home devices such as a smart television, smart audio, a smart air conditioner, a smart refrigerator, a smart water heater, smart lamps, and smart curtains. Any family member can be set as a target user, and the target users can be distinguished by their identity information. Furthermore, the identity characteristic information of each family member can be obtained and used to authenticate their identities; when a suspicious person who fails identity authentication enters the home, prompt information can be sent to a family member's mobile device to notify the family member of the intrusion.
Illustratively, in an on-vehicle scene, the intelligent device is a device such as an on-vehicle smart television, an on-vehicle smart audio, an on-vehicle smart air conditioner, an on-vehicle smart refrigerator, and the like, any user can be set as a target user, and by identifying information of the target user, each target user can be distinguished.
In the case where a plurality of target users are set at the same time, the priority among the target users may be set so that when a plurality of target users are in the same space, the target user with the highest priority is determined as the target user in the space. In addition, in the case that a plurality of target users are set at the same time, a default target user may be set so that when a plurality of target users are in the same space, the default target user is determined as the target user in the space.
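Resolving which user a shared space serves (highest priority wins, with an optional default as fallback) can be sketched as follows; the data shapes are illustrative assumptions.

```python
# Sketch of target-user resolution when several set users share a space.
# The priority map and default-user fallback mirror the two options above.

def resolve_target_user(present_users, priorities, default=None):
    """Pick the highest-priority present user, else the default target user."""
    ranked = [u for u in present_users if u in priorities]
    if ranked:
        return max(ranked, key=lambda u: priorities[u])
    return default
```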
The state information of the target user comprises behavioral and action information and/or physiological parameter information of the target user. The behavior and action information includes target user limb states, expression states and the like. The limb state of the target user comprises gesture actions, leg actions, standing postures and the like of the target user; the expression state includes a facial expression of the target user, and the like. The physiological parameter information includes heart rate, body temperature, blood oxygen, and respiratory rate of the target user.
Behavioral and/or physiological parameter information of the target user may be detected by sensors disposed within the space. For example, the ground pressure information of the space where the target user is located is acquired through a pressure sensor, and the position where the target user is located can be determined based on the ground pressure information; video information of a space where a target user is located is acquired through a visual sensor, and gesture actions, leg actions, standing postures, body orientation, facial expressions and the like of the target user can be determined based on the video information; collecting sound information, sound field information and the like of a target user through a sound sensor; the physiological parameter information of the target user is acquired through the physiological parameter sensor, and the heart rate, the body temperature, the blood oxygen, the respiratory rate and the like of the target user can be determined based on the physiological parameter information. In conjunction with the behavioral and/or physiological parameter information of the target user, the controller is able to determine the state in which the target user is located.
For example, if the behavior/action information and physiological parameter information sent to the controller by the sensors indicate that the target user is currently lying in bed and the heart rate is slow, the controller judges that the target user may be about to fall asleep.
If the information indicates that the target user's body surface temperature is rising, or that the user is fanning himself or removing clothing, the controller judges that the target user may feel that the current ambient temperature is too high.
If the information indicates that the target user's body surface temperature is falling and the user is putting on clothing, the controller judges that the target user may feel that the current ambient temperature is too low.
If the information indicates that the target user is putting on a coat and shoes or moving toward the front door, the controller judges that the target user is probably about to go out.
If the information indicates that the target user's facial expression is sad, the controller judges that the target user may be in a sad psychological state.
If the information indicates that the target user is making a phone call, the controller judges that the target user may need a quieter environment; and so on.
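The state judgments above can be sketched as a small rule function; the thresholds and field names are illustrative assumptions, since the patent only gives qualitative examples.

```python
# Sketch of the controller's state judgment from combined behavioral and
# physiological readings. Thresholds and field names are assumptions.

def judge_state(readings):
    """Map raw sensor readings to a coarse user state."""
    if readings.get("posture") == "lying" and readings.get("heart_rate", 100) < 60:
        return "about_to_sleep"
    if readings.get("skin_temp_trend") == "rising" or readings.get("action") == "fanning":
        return "feels_hot"
    if readings.get("skin_temp_trend") == "falling" and readings.get("action") == "putting_on_clothes":
        return "feels_cold"
    if readings.get("action") == "phone_call":
        return "needs_quiet"
    return "unknown"
```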
Based on the state of the target user, the potential control behavior of the target user can be further determined. The potential control behavior refers to the control behavior which may exist in the current state of the target user and is not executed by the target user.
For example, if the target user is about to fall asleep, the potential control behaviors are closing the curtains and dimming the lights; if the target user feels the current ambient temperature is too high, the potential control behavior is lowering the air conditioner's set temperature; if the target user feels it is too low, the potential control behavior is raising the set temperature; if the target user is going out, the potential control behavior is turning off some indoor electric devices (such as smart lamps and smart air conditioners); if the target user is in a sad psychological state, the potential control behavior is playing soothing music; if the target user is on a phone call while an audio playback device is playing music, the potential control behavior is lowering the device's volume or turning it off; and so on.
For example, the mapping relationship between different state information of the target user and the potential control behavior may be preset, so as to determine the potential control behavior corresponding to the mapping relationship according to the state information of the target user.
For example, a mapping relation can be set between the state "the target user is lying in bed with a calm heart rate" and the behavior "close the curtains and dim the lights", so that when that state information is detected, the potential control behavior of the target user on the smart device is determined to be closing the curtains and dimming the lights;
a mapping relation can be set between a rise in the target user's body surface temperature, a fanning action or a clothes-removing action and lowering the air conditioner's set temperature, so that when any of these is detected, the potential control behavior of the target user on the smart device is determined to be lowering the air conditioner's set temperature;
a mapping relation can be set between a fall in the target user's body surface temperature or a clothes-wearing action and raising the air conditioner's set temperature, so that when either is detected, the potential control behavior of the target user on the smart device is determined to be raising the air conditioner's set temperature;
a mapping relation can be set between the target user's leaving the room and turning off certain indoor electrical devices (including smart lamps, smart air conditioners and the like), so that when the target user is detected leaving the room, the potential control behavior of the target user on the smart device is determined to be turning off those indoor electrical devices;
a mapping relation can be set between a sad psychological state of the target user and playing soothing music, so that when a sad psychological state of the target user is detected, the potential control behavior of the target user on the smart device is determined to be playing soothing music;
a mapping relation can be set between the target user's being on a phone call and lowering the volume of the audio playback device, so that when the target user is detected making a call, the potential control behavior of the target user on the smart device is determined to be lowering the volume of the audio playback device.
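The preset mapping relationships above can be sketched as a simple lookup table. The following is a minimal illustration, not part of the disclosure; all state labels and behavior names are hypothetical identifiers chosen for the example.

```python
# Illustrative sketch of the preset state-to-behavior mapping relationships.
# All keys and values are hypothetical labels, not defined by the source.
STATE_TO_BEHAVIOR = {
    ("lying_in_bed", "calm_heart_rate"): ["close_curtains", "dim_lights"],
    ("body_temp_rising",): ["lower_ac_set_temp"],
    ("fanning_self",): ["lower_ac_set_temp"],
    ("removing_clothing",): ["lower_ac_set_temp"],
    ("body_temp_falling",): ["raise_ac_set_temp"],
    ("putting_on_clothing",): ["raise_ac_set_temp"],
    ("left_room",): ["turn_off_indoor_devices"],
    ("sad",): ["play_soothing_music"],
    ("on_phone_call",): ["lower_audio_volume"],
}

def potential_behaviors(detected_states):
    """Return the potential control behaviors whose mapped states were all detected."""
    behaviors = []
    for state_key, actions in STATE_TO_BEHAVIOR.items():
        # A mapping relation fires only when every state in its key was detected.
        if all(s in detected_states for s in state_key):
            behaviors.extend(a for a in actions if a not in behaviors)
    return behaviors
```

For instance, detecting both "lying in bed" and "calm heart rate" yields the curtain and light behaviors, while a single detected state such as a phone call yields only the volume behavior.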
As another example, the potential control behavior of the target user on the smart device may also be determined according to the behavior habit of the target user.
Specifically, a certain amount of the target user's behavior habit preference information can be collected, including the target user's state information and the habitual control behaviors over the smart devices that correspond to the different pieces of state information. The behavior habit preference information is then learned to generate a behavior habit preference model of the target user, so that when the target user's state information is detected, the corresponding habit-preference control behavior can be determined based on the model and used as the target user's potential control behavior.
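The patent proposes learning the behavior habit preference information with a neural network; purely to illustrate the lookup interface, a minimal stand-in can be a per-state frequency counter that remembers which control behavior the user chose most often. The class and method names below are assumptions for this sketch.

```python
from collections import Counter, defaultdict

class HabitPreferenceModel:
    """Minimal stand-in for the learned behavior habit preference model.
    It remembers, per detected state, which control behavior the user chose
    most often. (The patent proposes a neural network; this frequency counter
    only illustrates the interface of state -> habit-preference behavior.)"""

    def __init__(self):
        self._counts = defaultdict(Counter)

    def observe(self, state, control_behavior):
        # One record of behavior habit preference information.
        self._counts[state][control_behavior] += 1

    def habit_preference(self, state):
        # Habit-preference control behavior for the detected state,
        # used as the potential control behavior; None if the state is unseen.
        counter = self._counts.get(state)
        return counter.most_common(1)[0][0] if counter else None
```

Re-running `observe` as the user's habits change also gives a crude form of the model correction mentioned later in this description.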
Further, after the potential control behavior of the target user is determined, the intelligent device corresponding to the potential control behavior can be determined. And the intelligent device corresponding to the potential control behavior refers to a potential control object of the potential control behavior of the target user.
For example, if the potential control behaviors of the target user on the smart devices are determined to be closing the curtains and dimming the lights, the smart devices corresponding to those potential control behaviors are the smart curtains and the smart lamps; if the potential control behavior is determined to be lowering or raising the air conditioner's set temperature, the corresponding smart device is the smart air conditioner; if the potential control behavior is determined to be turning off certain indoor electrical devices (including smart lamps, smart air conditioners and the like), the corresponding smart devices are those smart lamps, smart air conditioners and the like; and if the potential control behavior is determined to be playing soothing music, the corresponding smart device is the smart speaker.
S202, determining target intelligent equipment corresponding to the potential control behaviors from the active intelligent equipment.
The active smart device refers to a smart device that can be controlled by a target user. In this embodiment, the target smart device corresponding to the potential control behavior is determined from the active smart devices. The target intelligent device corresponding to the potential control behavior refers to a potential control object of the potential control behavior of the target user in the active intelligent device.
For example, control authority over the smart devices may be set for the target user; any smart device over which the target user has control authority is an active smart device for that user. The target smart device is then determined from the smart devices over which the target user has control authority.
For example, in a home scenario the smart devices are smart home devices such as a smart television, a smart speaker, a living room smart air conditioner, a master bedroom smart air conditioner, a secondary bedroom smart air conditioner, a smart refrigerator, a smart water heater, smart lamps and smart curtains. If target user A has control authority over all of these smart home devices, then all of them are active smart devices for target user A; if target user B only has control authority over the smart television, the secondary bedroom smart air conditioner, the smart lamps and the smart curtains, then only those devices are active smart devices for target user B.
If the potential control behavior of target user A on the smart devices is determined to be lowering the air conditioner's set temperature, the smart device corresponding to that potential control behavior is the smart air conditioner; on the premise that target user A has control authority over the living room, master bedroom and secondary bedroom smart air conditioners, all three may simultaneously serve as target smart devices. If the potential control behavior of target user B is determined to be raising the air conditioner's set temperature, the corresponding smart device is likewise the smart air conditioner; since target user B only has control authority over the secondary bedroom smart air conditioner, only that air conditioner can serve as the target smart device.
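The permission-based selection just described can be sketched as two filters: one producing the active smart devices for a user, and one keeping those whose device type matches the potential control behavior. The device records and permission table below are illustrative assumptions.

```python
def active_devices(all_devices, permissions, user):
    """Active smart devices for a user: those the user has control authority over."""
    allowed = permissions.get(user, set())
    return [d for d in all_devices if d["name"] in allowed]

def target_devices(all_devices, permissions, user, behavior_device_type):
    """Target smart devices: active devices whose type matches the device type
    implied by the potential control behavior (e.g. 'air_conditioner')."""
    return [d for d in active_devices(all_devices, permissions, user)
            if d["type"] == behavior_device_type]
```

With a whole-home permission set, all three air conditioners come back as target smart devices; with target user B's narrower permissions, only the secondary bedroom air conditioner does.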
If the target user has control authority over several smart devices with the same function, that is, several active smart devices with the same function exist for the target user, then controlling all of them as target smart devices according to the potential control behavior may waste electrical energy. For example, in a home scenario the target user can only be in one room at a time, so turning on the smart air conditioners of the other rooms wastes electricity; likewise, in a vehicle scenario the target user can only occupy one seat, so turning on the smart air conditioners of the other seats wastes electricity.
To solve the above problem, in this embodiment the space in which the target user is located may be determined, and the target smart device corresponding to the potential control behavior determined from the active smart devices in that space. In a home scenario, the space in which the target user is located is the room the user occupies. If target user A from the above example is in the master bedroom and the potential control behavior is determined to be lowering the air conditioner's set temperature, the target smart device corresponding to that behavior can be determined to be the master bedroom smart air conditioner.
Alternatively, the target smart device corresponding to the potential control behavior may be determined from the active smart devices according to the target user's behavior habit preference model described in the steps of the above embodiment. For example, if the potential control behavior of target user A is determined to be lowering the air conditioner's set temperature, and the behavior habit preference model indicates that in the current season the target user usually adjusts the master bedroom smart air conditioner, the target smart device can be determined to be the master bedroom smart air conditioner.
Alternatively, the target user's control records may be queried, and the target smart device corresponding to the potential control behavior determined from the active smart devices according to those records. For example, if the potential control behavior of target user A is determined to be lowering the air conditioner's set temperature, and the control records show that the target user adjusts the master bedroom smart air conditioner most frequently, the target smart device can be determined to be the master bedroom smart air conditioner.
The above waste of electrical energy can also be alleviated by further narrowing the range of active smart devices: for example, only smart devices over which the target user has control authority and whose distance from the target user is less than a preset distance are treated as active smart devices. The preset distance can be set according to actual conditions. For example, if the potential control behavior of target user A is determined to be lowering the air conditioner's set temperature, and the only smart air conditioner within the preset distance of target user A is the master bedroom smart air conditioner, the target smart device can be determined to be the master bedroom smart air conditioner.
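The room-based and distance-based narrowing strategies just described can each be sketched as a one-line filter over the candidate target devices. Room labels, 2-D positions and the distance threshold below are illustrative assumptions.

```python
import math

def narrow_by_room(candidates, user_room):
    """Keep only candidate target devices in the room the target user occupies."""
    return [d for d in candidates if d["room"] == user_room]

def narrow_by_distance(candidates, user_pos, preset_distance):
    """Keep only candidates closer to the target user than the preset distance."""
    def dist(d):
        return math.hypot(d["pos"][0] - user_pos[0], d["pos"][1] - user_pos[1])
    return [d for d in candidates if dist(d) < preset_distance]
```

Either filter reduces the three same-function air conditioners of the earlier example to the single one near the user, avoiding the wasted electricity of controlling all three.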
And S203, controlling the target intelligent equipment according to the potential control behaviors.
And after the target intelligent equipment is determined, controlling the target intelligent equipment according to the potential control behaviors. Specifically, the controller may generate a control instruction according to the potential control behavior of the target user, and send the control instruction to the target smart device, so as to control the target smart device through the control instruction.
For example, if the potential control behavior of the target user is determined to be raising the air conditioner's set temperature and the target smart device is the master bedroom smart air conditioner, the controller in this embodiment may generate a control instruction for the master bedroom smart air conditioner and send it to that air conditioner, where the control instruction triggers the master bedroom smart air conditioner to raise its set temperature.
For another example, if it is determined that the potential control behavior of the target user is curtain closing and dimming of light, and the target smart device is a smart curtain and a smart lamp in a room where the user is currently located, in this embodiment, the controller may generate a control instruction for the smart curtain and the smart lamp in the room where the user is currently located, and send the control instruction to the smart curtain and the smart lamp in the room where the user is currently located, where the control instruction is used to trigger the smart curtain to close and trigger the smart lamp to reduce brightness.
It should be noted that, when controlling the target smart device, the control amount of the target smart device may be determined based on the behavior habit preference model of the target user.
For example, if learning the target user's behavior habit preference model determines that the user's habitual preference in the current season is an ambient temperature of 25 °C, then when the potential control behavior is raising the air conditioner's set temperature and the target smart device is the master bedroom smart air conditioner, the control instruction triggers the master bedroom smart air conditioner to raise its set temperature to 25 °C or above.
For another example, if the behavior habit preference model determines that the target user's habitual adjustment step when adjusting the master bedroom smart air conditioner is 1 °C, then when the potential control behavior is raising the air conditioner's set temperature and the target smart device is the master bedroom smart air conditioner, the control instruction triggers the master bedroom smart air conditioner to raise its set temperature by 1 °C.
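A minimal sketch of how the control amount might combine the two preference signals above: the habitual adjustment step determines the increment, and the learned preferred ambient temperature (when known) keeps the step from overshooting it. The function name and defaults are assumptions for this illustration.

```python
def next_set_temperature(current, direction, preferred=None, step=1.0):
    """Compute a new air conditioner set temperature as a control amount.

    direction: +1 to raise, -1 to lower.
    step: the user's habitual adjustment step (assumed 1 degree C by default).
    preferred: the user's learned preferred ambient temperature; when given,
    a step never moves the set temperature past it."""
    candidate = current + direction * step
    if preferred is not None:
        if direction > 0:
            candidate = min(candidate, max(preferred, current))
        else:
            candidate = max(candidate, min(preferred, current))
    return candidate
```

So raising from 24.5 °C with a 1 °C step and a 25 °C preference stops exactly at 25 °C, while raising from 22 °C takes a plain 1 °C step.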
In the above embodiment, the potential control behavior of the target user on the smart devices is analyzed according to the target user's state information, which includes the user's behavior action information and/or physiological parameter information; the target smart device corresponding to the potential control behavior is then determined from the active smart devices, that is, the smart devices the target user can control, and the target smart device is controlled according to the potential control behavior. Active control of the smart devices according to the user's state is thus realized without the user having to actively state a request, which is both convenient and improves the user experience.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that, before the steps of the foregoing embodiment control the target smart device according to the potential control behavior, the following steps may be specifically included:
outputting inquiry information whether to control the target intelligent equipment according to the potential control behavior; acquiring reply information of a target user; and if the reply information is that the target intelligent equipment is controlled according to the potential control behavior, controlling the target intelligent equipment according to the potential control behavior.
In this embodiment, after the target intelligent device corresponding to the potential control behavior is determined from the active intelligent devices, the target intelligent device is not controlled directly according to the potential control behavior, but query information on whether to control the target intelligent device according to the potential control behavior is generated and output.
It should be noted that the query information may be output as voice, or through the display screen of any smart device with a display screen in the space where the target user is located (for example, a smart device with a display screen selected from the active smart devices), or through a combination of the two.
And acquiring reply information fed back by the target user according to the query information. The target user can feed back the reply information through voice and can also feed back the reply information through an input device. The input device comprises an input device of any intelligent device in the space where the target user is located.
The target user's reply information is either agreement or disagreement to controlling the target smart device according to the potential control behavior. If the reply agrees, the target smart device is controlled according to the potential control behavior; if the reply disagrees, the target smart device is not controlled. It should be noted that if the target user gives no reply within a preset time, this may be treated either as agreement or as disagreement to the control, which this embodiment does not limit.
The preset time may be set according to actual conditions, and this embodiment is not limited.
For example, if it is detected that the potential control action of the target user is to increase the set temperature of the air conditioner, and the target smart device is a smart air conditioner of the space where the target user is currently located, in this embodiment, query information on whether to increase the set temperature of the smart air conditioner of the space where the target user is currently located may be generated and output.
The target user's reply information is then acquired. If the reply agrees to raising the set temperature of the smart air conditioner in the space where the target user currently is, that set temperature is adjusted; if the reply disagrees, or the target user does not answer within the preset time, the step of adjusting the set temperature is not executed.
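The ask-then-control flow above can be sketched as one function built on three callbacks supplied by the surrounding system; the callback names, the timeout value and the timeout default are assumptions, since the embodiment leaves the no-reply case open to either interpretation.

```python
def confirm_and_control(ask, get_reply, control, timeout_s=10.0,
                        default_on_timeout=False):
    """Output the query information, then control the target smart device
    only on consent.

    ask():             outputs the query information (voice or screen).
    get_reply(t):      returns True (agree), False (disagree), or None if the
                       target user gives no reply within t seconds.
    control():         performs the potential control behavior.
    default_on_timeout: how a missing reply is treated (the patent allows
                       either choice)."""
    ask()
    reply = get_reply(timeout_s)
    agreed = default_on_timeout if reply is None else reply
    if agreed:
        control()
    return agreed
```

With `default_on_timeout=False`, silence leaves the air conditioner untouched, which matches the cautious behavior described in this example.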
In this embodiment of the application, before the target smart device is actively controlled according to the target user's potential control behavior, the target user is first asked whether to perform the control. This avoids erroneous control caused by misjudging the potential control behavior and improves the target user's experience.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that the steps of the foregoing embodiment determine the potential control behavior of the target user on the smart device according to at least the state information of the target user, and specifically may include the following steps:
and determining the potential control behaviors of the target user on the intelligent equipment according to the behavior habit preference model of the target user and the state information of the target user.
The behavior habit preference model is a model constructed according to the behavior habit preference information of the target user. The behavior habit preference information of the target user comprises state information of the target user and habit preference control behaviors of the intelligent device corresponding to different state information of the target user. Specifically, a neural network model may be preset, and then the neural network model is controlled to learn the behavior habit preference information of the target user, so as to obtain the behavior habit preference model of the target user.
When the state information of the target user is detected, the habit preference control behavior corresponding to the target user can be determined based on the behavior habit preference model of the target user, so that the habit preference control behavior is used as a potential control behavior of the target user.
The living habits of the target user may change, so that the behavior habit preference information of the target user changes. In this embodiment, the behavior habit preference model of the target user may also be modified according to the behavior habit preference information after the user changes, so that the potential control behavior determined according to the behavior habit preference model meets the real requirement of the target user.
In the above embodiment, the potential control behavior of the target user is determined by setting the behavior habit preference model of the target user, so that not only is the accuracy of the result high, but also the response time is short.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that, in the steps of the foregoing embodiment, the potential control behavior of the target user on the smart device is determined according to the behavior habit preference model of the target user and the state information of the target user, and specifically includes the following steps:
and determining habit preference control behaviors corresponding to the state information according to the behavior habit preference model of the target user, and determining the habit preference control behaviors as potential control behaviors.
When the state information of the target user is detected, the habit preference control behavior corresponding to the state information can be determined based on the behavior habit preference model of the target user, and the habit preference control behavior is determined as a potential control behavior.
For example, if it is detected that the state information of the target user is that the target user is about to enter a sleep state, based on the behavior habit preference model of the target user, it is determined that the habit preference control behavior of the target user during sleep is to close a curtain and reduce the brightness of an illumination lamp, and then it may be determined that the potential control behavior of the current target user is to close the curtain and reduce the brightness of the illumination lamp.
For example, if the detected state information is that the target user is making a phone call, and the behavior habit preference model determines that the user's habitual control behavior when making a call is to lower the smart speaker's volume, the current potential control behavior can be determined to be lowering the smart speaker's volume.
For another example, if the detected state information is that the target user has just returned home, and the behavior habit preference model determines that when the user has just returned home and the outdoor temperature reaches a preset temperature, the habitual control behavior is to turn on the bedroom smart air conditioner and rest in the bedroom, the current potential control behavior can be determined to be turning on the bedroom smart air conditioner.
In the above embodiment, the potential control behavior of the target user is determined by setting the behavior habit preference model of the target user, so that not only is the accuracy of the result high, but also the response time is short.
As an optional implementation manner, another embodiment of the present application discloses that the determining, by the steps of the foregoing embodiment, a target smart device corresponding to a potential control behavior from active smart devices specifically includes the following steps:
and determining target intelligent equipment corresponding to the potential control behaviors from the active intelligent equipment according to the behavior habit preference model of the target user.
The behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user. Specifically, the target intelligent device corresponding to the potential control behavior can be determined from the active intelligent devices according to the behavior habit preference model of the target user.
For example, if the detected state information is that the target user has just returned home, and the behavior habit preference model determines that the user's habitual control behavior on returning home is to turn on the bedroom air conditioner, then the potential control behavior of the current target user is determined to be turning on the air conditioner, and the target smart device corresponding to that behavior is determined from the active smart devices to be the bedroom smart air conditioner.
In the above embodiment, the target smart device corresponding to the potential control behavior is determined from the active smart devices according to the target user's behavior habit preference model. This alleviates the problem that, when several active smart devices with the same function exist for the target user, controlling all of them as target smart devices according to the potential control behavior would waste electrical energy.
As an alternative implementation manner, as shown in fig. 3, in another embodiment of the present application, it is disclosed that, in the steps of the foregoing embodiment, the following steps may also be included:
s301, acquiring voice information of the target user.
In this embodiment, the voice information of the target user may be acquired. Illustratively, the sound sensor arranged in the space where the target user is located can collect the voice information of the target user and send the voice information of the target user to the controller, so that the controller can analyze the voice information of the target user.
The controller analyzes the voice information of the target user, and if the voice information of the target user is determined to include the control information of the intelligent device through analysis, the controller can generate a control instruction according to the control information, send the control instruction to the intelligent device corresponding to the control information, and control the intelligent device corresponding to the control information.
S302, performing semantic analysis on the voice information of the target user, and determining whether the voice information of the target user comprises control information of the intelligent equipment; if the voice information of the target user comprises the control information of the intelligent device, executing the step S303; if the voice information of the target user does not include the control information of the intelligent device, step S301 is executed.
In the embodiment of the application, the semantic analysis can be performed on the voice information of the target user in real time based on a semantic analysis technology, so that whether the voice information of the target user includes the control information of the intelligent device or not can be determined.
In the prior art, voice control equipment generally needs to be awakened by an awakening word, and other intelligent equipment is controlled by the voice control equipment in a mode of 'awakening word + control content'. For example, if the name of a certain voice control device is "ABC", and the control content is to turn on the smart television, in the prior art, after the user speaks "ABC" to wake up the voice control device, the user speaks the control content of "turning on the smart television", so as to achieve the purpose of turning on the smart television through the voice control device.
In the embodiment of the present application, semantic analysis is performed on the voice information of the target user in real time, and when it is determined that the voice information of the target user includes the control information of the intelligent device, step S303 may be directly performed to control the intelligent device corresponding to the control information, without using a wakeup word.
It should be noted that the semantic analysis technology can follow the mature semantic analysis technology in the prior art, and the present embodiment is not limited thereto, and those skilled in the art can refer to the prior art.
If the voice information of the target user is determined to include the control information of the intelligent device after semantic analysis is performed on the voice information of the target user, executing step S303; if it is determined that the voice information of the target user does not include the control information of the smart device, step S301 may be repeatedly performed, so as to continuously collect and detect the voice information of the target user.
And S303, controlling the intelligent equipment corresponding to the control information according to the control information.
If after semantic analysis is performed on the voice information of the target user, it is determined that the voice information of the target user includes control information of the intelligent device, the controller may generate a control instruction according to the control information, send the control instruction to the intelligent device corresponding to the control information, and control the intelligent device corresponding to the control information based on the control instruction.
For example, if after performing semantic analysis on the voice information of the target user, it is determined that the voice information of the target user includes control information for the intelligent refrigerator, specifically, the temperature of the refrigerating chamber is increased by 1 ℃, the controller may send a control instruction to the intelligent refrigerator to control the temperature of the refrigerating chamber of the intelligent refrigerator to be increased by 1 ℃.
In the embodiment of the application, the voice information of the target user is acquired in real time and analyzed in real time, when the voice information of the target user is detected to comprise the control information of the intelligent device, the voice information can be directly responded, the intelligent device corresponding to the control information is controlled according to the control information, the user does not need to speak out a wake-up word, and the use is convenient.
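The wake-word-free flow of S301 to S303 reduces to: continuously parse each utterance, and act only when control information is found. As a minimal sketch, the semantic analysis can be mimicked with a couple of intent patterns; a real system would use a mature semantic-analysis model, and the patterns and intent names here are purely hypothetical.

```python
import re

# Hypothetical intent patterns standing in for real semantic analysis.
INTENT_PATTERNS = [
    (re.compile(r"turn on the (?P<device>[\w ]+)"), "turn_on"),
    (re.compile(r"raise the fridge temperature by (?P<amount>\d+)"), "fridge_temp_up"),
]

def parse_control_info(utterance):
    """Return (intent, slots) if the utterance contains smart device control
    information, else None -- no wake word is required (steps S301-S302)."""
    for pattern, intent in INTENT_PATTERNS:
        m = pattern.search(utterance.lower())
        if m:
            return intent, m.groupdict()
    return None
```

An utterance like "please turn on the smart television" is acted on directly (step S303), while ordinary conversation returns `None` and the loop simply keeps listening (back to S301).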
As an optional implementation manner, another embodiment of the present application discloses that before the step of the above embodiment controls the intelligent device corresponding to the control information according to the control information, the method may specifically include the following step:
performing semantic analysis on the voice information of the target user, and extracting intelligent equipment corresponding to the control information from the voice information of the target user;
or analyzing the behavior and action information of the target user, and determining the intelligent equipment towards which the target user faces as the intelligent equipment corresponding to the control information;
or determining the intelligent equipment corresponding to the control information according to the intelligent equipment oriented by the target user and the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
Specifically, before the steps in the above embodiments control the intelligent device corresponding to the control information according to the control information, the intelligent device corresponding to the control information needs to be determined. In this embodiment, the intelligent device corresponding to the control information may be determined in the following ways:
the method I comprises the following steps: semantic analysis can be performed on the voice information of the target user, and the intelligent device corresponding to the control information is extracted from the voice information of the target user.
Specifically, the target user may speak the name of the intelligent device corresponding to the control information, and based on this, in the embodiment of the present application, semantic analysis may be performed on the voice information of the target user, and the intelligent device corresponding to the control information may be extracted from the voice information of the target user.
Mode two: the behavior and action information of the target user can be analyzed, and the intelligent device towards which the target user faces can be determined to be the intelligent device corresponding to the control information.
The orientation refers to the orientation of the target user's face. In the embodiment of the application, when it is determined that the voice information of the target user includes control information for an intelligent device, the intelligent device that the target user's face was oriented towards when speaking the control information can be detected, and that intelligent device is determined to be the intelligent device corresponding to the control information.
Specifically, when the target user wants to control a certain intelligent device and interacts with it, the target user's face generally faces that device. When the target user faces a plurality of smart devices, the device closest to the target user may be determined as the smart device to which the control information corresponds.
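The nearest-device rule above can be sketched as follows. This is an illustrative fragment only: the device names, coordinates, and the helper function are hypothetical, not part of this disclosure.

```python
import math

def nearest_faced_device(user_pos, faced_devices):
    """Among the intelligent devices the target user is facing,
    pick the one closest to the user.

    user_pos:      (x, y) position of the target user.
    faced_devices: list of (name, (x, y)) entries for devices lying
                   in the user's facing direction.
    """
    return min(faced_devices, key=lambda d: math.dist(user_pos, d[1]))[0]

# The user faces both the TV and the air conditioner; the TV is closer.
device = nearest_faced_device(
    (1.0, 1.0),
    [("smart_tv", (2.0, 1.0)), ("smart_ac", (4.0, 3.0))],
)
```

Here `math.dist` gives the Euclidean distance; any distance measure available to the controller would serve equally well.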
Mode three: the intelligent device corresponding to the control information can be determined according to the intelligent device towards which the target user faces and the behavior habit preference model of the target user.
The orientation refers to the orientation of the target user's face. In the embodiment of the application, the intelligent device corresponding to the control information can be determined together by combining the intelligent device oriented by the target user and the behavior habit preference model of the target user.
Specifically, when the target user faces a plurality of intelligent devices, in addition to determining the device closest to the target user as the intelligent device corresponding to the control information according to mode two, the behavior habit preference model of the target user may be consulted to determine which of the faced devices the target user prefers to control in the current situation, and that preferred device is determined as the intelligent device corresponding to the control information.
In an optional embodiment, when it is not known in advance which mode can definitely identify the intelligent device corresponding to the control information, the modes may be tried in order. First, the intelligent device corresponding to the control information may be extracted from the voice information of the target user according to mode one. If it cannot be extracted from the voice information, the intelligent device towards which the target user faces may be determined as the intelligent device corresponding to the control information according to mode two. If the target user faces a plurality of intelligent devices, the intelligent device corresponding to the control information may be determined according to mode three, that is, according to the intelligent device towards which the target user faces and the behavior habit preference model of the target user. If the intelligent device still cannot be accurately determined according to mode three, query information may be output to ask the target user which intelligent device the control information refers to, so that control is performed according to the information fed back by the target user.
Illustratively, if the information fed back by the target user is the name of the intelligent device corresponding to the control information, that intelligent device is controlled according to the control information. If the target user's feedback indicates that no control should be performed, or the target user gives no feedback within a preset time period, no control is performed, and step S301 of the above embodiment is repeatedly executed to continue collecting and detecting the voice information of the target user.
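The cascade described above (mode one, then mode two, then mode three, then a query to the user) can be sketched as a simple fallback chain. All names and the `"query_user"` sentinel are hypothetical stand-ins, not terminology from this disclosure.

```python
def resolve_target_device(speech_device, faced_devices, habit_model_pick):
    """Cascade of the three modes plus the query fallback.

    speech_device:    device name extracted from the voice information
                      (mode one), or None if extraction failed.
    faced_devices:    names of the devices the user is facing (mode two).
    habit_model_pick: device preferred by the behavior habit preference
                      model among faced_devices (mode three), or None
                      when the model cannot disambiguate.
    Returns a device name, or "query_user" when the controller must
    output query information and ask the target user directly.
    """
    if speech_device:               # Mode one: name spoken explicitly
        return speech_device
    if len(faced_devices) == 1:     # Mode two: a single faced device
        return faced_devices[0]
    if habit_model_pick:            # Mode three: habit model disambiguates
        return habit_model_pick
    return "query_user"             # Fallback: ask the target user
```

For example, `resolve_target_device(None, ["smart_tv", "smart_ac"], "smart_ac")` falls through modes one and two and returns `"smart_ac"` from mode three.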
In the above embodiments, the intelligent device corresponding to the control information may be determined based on the face orientation of the target user and the behavior habit preference model of the target user. Therefore, the target user can control the intelligent device under the conditions that the awakening words are not used and the name of the controlled device is not spoken, and the convenience of intelligent device control is greatly improved.
In addition, when controlling the intelligent device corresponding to the control information, if the control quantity cannot be determined from the control information provided by the target user, the control quantity can be determined automatically based on the behavior habit preference model of the target user.
For example, suppose it is learned from the behavior habit preference model of the target user that the adjustment step each time the target user adjusts the master-bedroom intelligent air conditioner is 1 ℃. If the control information is to raise the set temperature of the air conditioner, the intelligent device corresponding to the control information is the master-bedroom intelligent air conditioner, and the control quantity cannot be determined from the control information provided by the target user, then the control quantity may be determined based on the behavior habit preference model of the target user, namely raising the set temperature of the master-bedroom intelligent air conditioner by 1 ℃.
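The fallback from a spoken quantity to a learned habitual step can be sketched as below. The dictionary and the user/device identifiers are hypothetical stand-ins for the behavior habit preference model.

```python
# Hypothetical habit records learned by the behavior habit preference
# model: (user, device, action) -> default adjustment step.
HABIT_STEPS = {
    ("user_a", "bedroom_ac", "raise_temp"): 1.0,  # 1 degree per adjustment
}

def control_quantity(user, device, action, spoken_quantity=None):
    """Prefer the quantity the target user actually spoke; otherwise
    fall back to the step learned from the user's habits (None if no
    habit has been learned for this user/device/action)."""
    if spoken_quantity is not None:
        return spoken_quantity
    return HABIT_STEPS.get((user, device, action))

# "Raise the air-conditioner temperature" with no amount given:
step = control_quantity("user_a", "bedroom_ac", "raise_temp")  # 1.0
```

A spoken quantity always wins: `control_quantity("user_a", "bedroom_ac", "raise_temp", 2.5)` returns `2.5`.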
As an optional implementation manner, another embodiment of the present application discloses that, as shown in fig. 4, the step of the foregoing embodiment of acquiring the voice information of the target user may specifically include the following steps:
S401, extracting first voice information of a target user from the detected voice information based on voiceprint recognition information of the target user; and/or determining the target user from the detected video information based on the face identification information of the target user, extracting lip information of the target user, and generating second voice information of the target user according to the lip information.
The sensor arranged in the space where the target user is located can acquire the voice information and/or the facial recognition information of the target user and send the voice information and/or the facial recognition information of the target user to the controller, so that the controller analyzes the voice information and/or the facial recognition information of the target user.
Specifically, a sound sensor disposed in the space where the target user is located can acquire voice information of the target user, and a visual sensor disposed in the space where the target user is located can acquire facial recognition information of the target user.
The identity of the target user can be determined based on the voiceprint recognition information of the target user, so that the first voice information of the target user is extracted from the detected voice information. The identity of the target user can also be determined based on the face recognition information of the target user, so that the target user is determined from the detected video information, the lip information of the target user is extracted, and the second voice information of the target user is synthesized according to the lip information. The video information includes video information obtained by shooting the space where the target user is located.
In more complex scenes, for example when several people speak simultaneously, or the target user speaks while audio is playing, complete sound information of the target user may not be acquired; and when the visual sensor is occluded, complete lip information of the target user cannot be acquired. In such scenes, the first voice information and the second voice information may be collected at the same time.
It should be noted that both extracting the first voice information of the target user according to voiceprint recognition information and synthesizing the second voice information of the target user according to lip information are mature prior art; this embodiment does not describe them in detail, and those skilled in the art may refer to the prior-art descriptions.
S402, determining the voice information of the target user according to the first voice information and/or the second voice information.
In the embodiment of the application, the first voice information can be used as the voice information of the target user, or the second voice information can be used as the voice information of the target user. When neither complete sound information nor complete lip information can be acquired, the second voice information can be used as compensation for the first voice information: content corresponding to the second voice information is supplemented at the positions that are missing or unclear in the first voice information, so as to obtain the voice information of the target user.
In the embodiment of the application, the sound information of the target user or the lip information of the target user can be collected independently to determine the voice information of the target user, effectively reducing the amount of calculation. In a complex scene, the voice information of the target user can be generated by combining the sound information and the lip information of the target user, improving the clarity of the voice information of the target user and thus the accuracy of semantic analysis.
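The compensation step, supplementing missing positions in the acoustically extracted transcript with the lip-reading transcript, can be sketched as follows. Token-level alignment is assumed purely for illustration; real systems align the two channels at the acoustic-feature level.

```python
def fuse_voice(first_tokens, second_tokens):
    """Fill missing or unclear positions (marked None) in the first
    voice information (voiceprint channel) with the corresponding
    tokens from the second voice information (lip-reading channel)."""
    return [a if a is not None else b
            for a, b in zip(first_tokens, second_tokens)]

# The acoustic channel lost one word; lip reading supplies it.
fused = fuse_voice(["raise", None, "temperature"],
                   ["raise", "the", "temperature"])
```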
As an optional implementation manner, in another embodiment of the present application, it is disclosed that the step of determining the intelligent device toward which the target user is oriented in the above embodiment may specifically include the following steps:
and determining the intelligent equipment towards which the target user faces according to at least one of the sound field information, the video information and the ground pressure information.
The video information comprises video information obtained by shooting a space where a target user is located; the ground pressure information comprises the ground pressure information of the space where the target user is located; the sound field information is determined based on the sound information.
Illustratively, the video information can be acquired by a visual sensor arranged in a space where a target user is located, and the visual sensor sends the acquired video information to the controller; the sound field information can be acquired by a sound sensor arranged in the space where the target user is located, and the sound sensor sends the acquired sound field information to the controller; the ground pressure information can be acquired through a pressure sensor arranged in the space where the target user is located, and the pressure sensor sends the acquired ground pressure information to the controller.
In this embodiment, the controller may combine at least one of the sound field information, the video information and the ground pressure information to determine the current position of the target user and the specific direction the target user faces, and further determine the intelligent device towards which the target user faces.
In the above embodiment, the orientation of the target user can be analyzed according to at least one of the sound field information, the video information and the ground pressure information, and the intelligent device corresponding to the control information can be determined based on that orientation. The target user can thus control the intelligent device without using a wake-up word or speaking the name of the controlled device, which greatly improves the convenience of intelligent device control.
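Determining which devices lie in the user's facing direction, given a position (e.g. recovered from ground pressure information) and a facing angle (e.g. recovered from video information), could be sketched as below. The angular tolerance and the room layout are illustrative assumptions.

```python
import math

def faced_devices(user_pos, facing_deg, devices, tolerance_deg=30.0):
    """Return the devices lying within tolerance_deg of the target
    user's facing direction.

    user_pos:   (x, y) user position, e.g. from the pressure sensors.
    facing_deg: facing angle in degrees, e.g. from the visual sensor.
    devices:    dict of name -> (x, y) device position.
    """
    hits = []
    for name, (dx, dy) in devices.items():
        angle = math.degrees(math.atan2(dy - user_pos[1], dx - user_pos[0]))
        # Smallest signed difference between the two angles, in [0, 180].
        diff = abs((angle - facing_deg + 180.0) % 360.0 - 180.0)
        if diff <= tolerance_deg:
            hits.append(name)
    return hits

# User at the origin facing east (0 deg): the TV is east, the lamp north.
hits = faced_devices((0.0, 0.0), 0.0,
                     {"tv": (3.0, 0.2), "lamp": (0.0, 3.0)})
```

The result feeds directly into the device-resolution step: a single hit identifies the device, while multiple hits are disambiguated by distance or by the behavior habit preference model.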
As an alternative implementation manner, it is disclosed in another embodiment of the present application that an avatar may be customized by the target user as the target user's interactive object; the avatar may be displayed visually on any smart device having a display screen, or presented acoustically on any smart device having a sound device. The visual image of the avatar may be customized by the target user according to personal preference, and the target user may also customize the avatar's audio, including language, accent, dialect, and the like.
Moreover, the capabilities of the avatar may be continually iterated, upgraded and customized. For example, a family-manager avatar can have capabilities such as intelligent medical assistance, speech emotion recognition and family tutoring, and can thus play more family roles.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that a target user is supported to actively select subscribed content and perform personalized push. The subscribed content can be news information, friend dynamic information and the like.
For example, suppose it is determined based on the behavior habit preference model of the target user that the target user's native language is English and that the target user habitually listens to English news while eating breakfast in the morning. The system may then ask whether to play the news when it detects the target user sitting at the table in the morning.
Exemplary devices
Corresponding to the control method of the intelligent device, an embodiment of the present application further discloses a control apparatus of an intelligent device, as shown in fig. 5, the apparatus includes:
the analysis module 100 is configured to determine a potential control behavior of the target user on the smart device according to at least the state information of the target user; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
a determining module 110, configured to determine, from the active smart devices, a target smart device corresponding to the potential control behavior; the active smart devices include smart devices capable of being controlled by the target user;
and a control module 120, configured to control the target smart device according to the potential control behavior.
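The three modules of fig. 5 can be sketched as one class. All names, the state-to-behavior mapping, and the instruction string format are illustrative assumptions, not part of the disclosed apparatus.

```python
class SmartDeviceController:
    """Sketch of the analysis / determining / control modules of fig. 5."""

    def __init__(self, habit_model, active_devices):
        self.habit_model = habit_model        # behavior habit preference model
        self.active_devices = active_devices  # devices the user can control

    def analyze(self, state_info):
        """Analysis module 100: map state information to the potential
        control behavior, here via a plain habit lookup."""
        return self.habit_model.get(state_info)

    def determine(self, behavior):
        """Determining module 110: pick the target smart device that
        supports the potential control behavior."""
        for device, actions in self.active_devices.items():
            if behavior in actions:
                return device
        return None

    def control(self, device, behavior):
        """Control module 120: issue the control instruction."""
        return f"{device}:{behavior}"

ctrl = SmartDeviceController(
    {"sitting_on_sofa": "turn_on"},
    {"smart_tv": ["turn_on", "turn_off"]},
)
action = ctrl.analyze("sitting_on_sofa")   # potential control behavior
target = ctrl.determine(action)            # target smart device
```

A real controller would of course back `analyze` with a learned model and `control` with a device protocol rather than a string, but the module boundaries follow the apparatus description above.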
As an optional implementation manner, in another embodiment of the present application, the control apparatus of an intelligent device in the above embodiment further includes:
the output module is used for outputting inquiry information whether to control the target intelligent equipment according to the potential control behavior;
the reply information acquisition module is used for acquiring reply information of a target user;
and the control module is also used for controlling the target intelligent equipment according to the potential control behavior if the reply information indicates that the target intelligent equipment is controlled according to the potential control behavior.
As an optional implementation manner, in another embodiment of the present application, when determining, at least according to the state information of the target user, a potential control behavior of the target user on the smart device, the analysis module 100 of the above embodiment is specifically configured to:
determining the potential control behavior of the target user on the intelligent equipment according to the behavior habit preference model of the target user and the state information of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
As an optional implementation manner, in another embodiment of the present application, when the analysis module 100 of the above embodiment determines, according to the behavior habit preference model of the target user and the state information of the target user, the potential control behavior of the target user on the smart device is specifically configured to:
and determining habit preference control behaviors corresponding to the state information according to the behavior habit preference model of the target user, and determining the habit preference control behaviors as potential control behaviors.
As an alternative implementation manner, in another embodiment of the present application, the determining module 110 of the above embodiment includes:
the first determining unit is used for determining target intelligent equipment corresponding to the potential control behaviors from the active intelligent equipment according to the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
As an optional implementation manner, in another embodiment of the present application, the control apparatus of the smart device in the above embodiment further includes:
the voice information acquisition module is used for acquiring the voice information of the target user;
the voice information determining module is used for performing semantic analysis on the voice information of the target user and determining whether the voice information of the target user comprises control information of the intelligent equipment;
and the control module is further used for controlling the intelligent equipment corresponding to the control information according to the control information if the voice information of the target user comprises the control information of the intelligent equipment.
As an optional implementation manner, in another embodiment of the present application, the control apparatus of the smart device in the above embodiment further includes:
the first equipment determining module is used for performing semantic analysis on the voice information of the target user and extracting intelligent equipment corresponding to the control information from the voice information of the target user;
or the second device determining module is used for analyzing the behavior and action information of the target user and determining the intelligent device towards which the target user faces as the intelligent device corresponding to the control information; the orientation is the orientation of the face of the target user;
or the third equipment determining module is used for determining the intelligent equipment corresponding to the control information according to the intelligent equipment oriented by the target user and the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
As an optional implementation manner, in another embodiment of the present application, the voice information obtaining module in the above embodiment includes:
a first extraction unit configured to extract first voice information of a target user from the detected voice information based on voiceprint recognition information of the target user; and/or a second extraction unit for determining a target user from the detected video information and extracting lip information of the target user based on the face recognition information of the target user, and generating second voice information of the target user according to the lip information; the video information comprises video information obtained by shooting the space where the target user is located;
and the second determining unit is used for determining the voice information of the target user according to the first voice information and/or the second voice information.
As an optional implementation manner, in another embodiment of the present application, when the second device determining module determines the smart device toward which the target user is oriented, the second device determining module is specifically configured to:
according to at least one of sound field information, video information and ground pressure information, determining intelligent equipment towards which a target user faces; the video information comprises video information obtained by shooting a space where a target user is located; the ground pressure information comprises the ground pressure information of the space where the target user is located; the sound field information is determined based on the sound information.
For details, please refer to the contents of the method embodiment for the specific working contents of each unit of the control apparatus of the intelligent device, which are not described herein again.
Exemplary Intelligent systems, controllers, computer program products, and storage media
Corresponding to the control method of the intelligent device, the embodiment of the present application further discloses an intelligent system, as shown in fig. 6, the system includes:
a controller 200, and at least one sensor 210 connected to the controller 200;
the controller 200 is further electrically connected to the smart devices, and is configured to perform unified management of the sensors 210 and the smart devices in the space, so as to implement linkage of multiple devices.
The sensor 210 is configured to collect state information of a target user and send the state information of the target user to the controller 200; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
the controller 200 is used for determining the potential control behavior of the target user on the intelligent device at least according to the state information of the target user; determining target intelligent equipment corresponding to the potential control behaviors from the active intelligent equipment; the active smart devices include smart devices capable of being controlled by the target user; and controlling the target intelligent equipment according to the potential control behaviors.
As an optional implementation manner, the controller 200 in the above embodiment is further configured to output inquiry information about whether to control the target smart device according to the potential control behavior before controlling the target smart device according to the potential control behavior; acquiring reply information of a target user; and if the reply information is that the target intelligent equipment is controlled according to the potential control behavior, controlling the target intelligent equipment according to the potential control behavior.
As an optional implementation manner, when determining, according to at least the state information of the target user, a potential control behavior of the target user on the smart device, the controller 200 of the above embodiment is specifically configured to:
determining the potential control behavior of the target user on the intelligent equipment according to the behavior habit preference model of the target user and the state information of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
As an optional implementation manner, when determining the potential control behavior of the target user on the smart device according to the behavior habit preference model of the target user and the state information of the target user, the controller 200 of the above embodiment is specifically configured to:
and determining habit preference control behaviors corresponding to the state information according to the behavior habit preference model of the target user, and determining the habit preference control behaviors as potential control behaviors.
As an optional implementation manner, when the controller 200 of the above embodiment determines the target smart device corresponding to the potential control behavior from the active smart devices, the method is specifically configured to:
according to the behavior habit preference model of the target user, determining target intelligent equipment corresponding to the potential control behavior from the active intelligent equipment; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
As an optional implementation manner, the sensor 210 of the above embodiment is further configured to detect voice information of the target user, and send the voice information of the target user to the controller 200;
the controller 200 is further configured to obtain voice information of the target user; performing semantic analysis on the voice information of the target user, and determining whether the voice information of the target user comprises control information of the intelligent equipment; and if the voice information of the target user comprises the control information of the intelligent equipment, controlling the intelligent equipment corresponding to the control information according to the control information.
As an optional implementation manner, before controlling the smart device corresponding to the control information according to the control information, the controller 200 is further configured to:
performing semantic analysis on the voice information of the target user, and extracting intelligent equipment corresponding to the control information from the voice information of the target user; or analyzing the behavior and action information of the target user, and determining the intelligent equipment towards which the target user faces as the intelligent equipment corresponding to the control information; the orientation is the orientation of the target user face; or determining the intelligent equipment corresponding to the control information according to the intelligent equipment oriented by the target user and the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
As an optional implementation manner, when the controller 200 acquires the voice information of the target user, the method is specifically configured to:
extracting first voice information of a target user from the detected voice information based on voiceprint recognition information of the target user; and/or determining the target user from the detected video information based on the face identification information of the target user, extracting lip information of the target user, and generating second voice information of the target user according to the lip information; the video information comprises video information obtained by shooting the space where the target user is located; and determining the voice information of the target user according to the first voice information and/or the second voice information.
As an alternative implementation, when the controller 200 determines the smart device that the target user is facing, it is specifically configured to:
according to at least one of sound field information, video information and ground pressure information, determining intelligent equipment towards which a target user faces; the video information comprises video information obtained by shooting a space where a target user is located; the ground pressure information comprises the ground pressure information of the space where the target user is located; the sound field information is determined based on the sound information.
As an alternative implementation, the sensor 210 of the above embodiment includes at least a pressure sensor, a visual sensor, a sound sensor, and a physiological parameter sensor.
The pressure sensor is used for acquiring ground pressure information of the space where the target user is located;
the visual sensor is used for acquiring video information of a space where a target user is located;
the sound sensor is used for acquiring sound information of a target user;
and the physiological parameter sensor is used for acquiring the physiological parameter information of the target user.
In a home scene, the pressure sensor can be arranged on the floor; the visual sensor can be arranged on the ceiling, and a 180-degree fisheye lens can be adopted to capture a plan view of the whole house; the sound sensor may be disposed on a wall; and the physiological parameter sensor needs to be worn by the user, for example made into a bracelet so that the user can wear it on the wrist.
The various sensors and the controller 200 may be installed after the main construction of the house is completed, with finishing, delivery and the like carried out after installation.
As an optional implementation manner, when the pressure sensor detects that there is no user in the current space, all the smart devices and sensors in the current space may be controlled to enter a sleep mode, so as to save power. When the pressure sensor senses that pressure is triggered, indicating that a user is present in the current space, all the smart devices and sensors in the space can be controlled to exit the sleep mode.
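The pressure-driven sleep policy above reduces to a simple rule; a minimal sketch, with hypothetical device names:

```python
def update_power_mode(pressure_triggered, devices):
    """No pressure sensed in the space -> put every smart device and
    sensor to sleep; pressure sensed -> wake them all up."""
    mode = "awake" if pressure_triggered else "sleep"
    return {name: mode for name in devices}

# Nobody in the room: everything sleeps to save power.
modes = update_power_mode(False, ["smart_tv", "smart_lamp", "sound_sensor"])
```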
As an optional implementation manner, the intelligent system of the above embodiment may further include an intelligent device 220, and the intelligent device 220 is electrically connected to the controller 200.
The intelligent device 220 may include devices such as an intelligent air conditioner, an intelligent television, an intelligent refrigerator, an intelligent curtain, and an intelligent lamp, which is not limited in this embodiment.
Corresponding to the control method of the above intelligent device, an embodiment of the present application further discloses a controller, as shown in fig. 7, the controller includes:
a memory 300 and a processor 310;
wherein, the memory 300 is connected with the processor 310 for storing the program;
the processor 310 is configured to implement the control method of the smart device disclosed in any of the above embodiments by executing the program stored in the memory 300.
Specifically, the controller may further include: a bus, a communication interface 320, an input device 330, and an output device 340.
The processor 310, the memory 300, the communication interface 320, the input device 330, and the output device 340 are connected to each other through a bus. Wherein:
a bus may include a path that transfers information between components of a computer system.
The processor 310 may be a general-purpose processor, such as a general-purpose central processing unit (CPU) or microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs in accordance with the present disclosure. It may also be a digital signal processor (DSP), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components.
The processor 310 may include a main processor and may also include a baseband chip, a modem, and the like.
The memory 300 stores the program for executing the technical solution of the present application, and may also store an operating system and other key services. In particular, the program may include program code, and the program code includes computer operating instructions. More specifically, the memory 300 may include read-only memory (ROM), other types of static storage devices that can store static information and instructions, random-access memory (RAM), other types of dynamic storage devices that can store information and instructions, disk storage, flash memory, and so on.
The input device 330 may include means for receiving data and information input by a user, such as a keyboard, mouse, camera, scanner, light pen, voice input device, touch screen, pedometer, or gravity sensor, among others.
Output device 340 may include equipment that allows output of information to a user, such as a display screen, a printer, speakers, etc.
Communication interface 320 may include any device that uses a transceiver or the like to communicate with other devices or communication networks, such as Ethernet, a radio access network (RAN), or a wireless local area network (WLAN).
The processor 310 executes the program stored in the memory 300 and calls other devices, which can be used to implement the steps of the control method of the smart device provided in the above-described embodiments of the present application.
In addition to the above-described methods and devices, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by the processor 310, cause the processor 310 to perform the steps of the control method of the smart device provided by the above-described embodiments.
The computer program product may include program code for carrying out operations for embodiments of the present application in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium on which computer program instructions are stored, which, when executed by a processor, cause the processor 310 to perform the steps of the control method of the smart device provided by the above-described embodiments.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combination of acts, it will be appreciated by those skilled in the art that the present application is not limited by the order of acts or acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
It should be noted that, in this specification, the embodiments are described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and identical or similar portions among the embodiments may be referred to one another. For the device-type embodiments, since they are basically similar to the method embodiments, the description is brief, and for relevant points reference may be made to the corresponding parts of the method embodiments.
The steps in the methods of the embodiments of the present application may be sequentially adjusted, combined, and deleted according to actual needs, and technical features described in the embodiments may be replaced or combined.
The modules and sub-modules in the device and the terminal in the embodiments of the application can be combined, divided and deleted according to actual needs.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal, apparatus and method may be implemented in other manners. For example, the above-described terminal embodiments are merely illustrative, and for example, the division of a module or a sub-module is only one logical function division, and other division manners may be available in actual implementation, for example, a plurality of sub-modules or modules may be combined or integrated into another module, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules or sub-modules described as separate parts may or may not be physically separate, and parts that are modules or sub-modules may or may not be physical modules or sub-modules, may be located in one place, or may be distributed over a plurality of network modules or sub-modules. Some or all of the modules or sub-modules can be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules or sub-modules in the embodiments of the present application may be integrated into one processing module, or each module or sub-module may exist alone physically, or two or more modules or sub-modules are integrated into one module. The integrated modules or sub-modules can be implemented in the form of hardware, and can also be implemented in the form of software functional modules or sub-modules.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software unit executed by a processor, or in a combination of the two. The software unit may reside in random-access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in a process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (14)

1. A control method of an intelligent device is characterized by comprising the following steps:
determining potential control behaviors of a target user on the intelligent device at least according to state information of the target user; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
determining target intelligent equipment corresponding to the potential control behaviors from active intelligent equipment; the active smart devices comprise smart devices controllable by the target user;
and controlling the target intelligent equipment according to the potential control behaviors.
2. The method of claim 1, wherein prior to controlling the target smart device in accordance with the potential control behavior, the method further comprises:
outputting inquiry information whether to control the target intelligent equipment according to the potential control behavior;
acquiring reply information of the target user;
and if the reply information is that the target intelligent equipment is controlled according to the potential control behavior, controlling the target intelligent equipment according to the potential control behavior.
3. The method of claim 1, wherein determining the potential control behavior of the target user on the smart device based on at least the status information of the target user comprises:
determining the potential control behavior of the target user on the intelligent equipment according to the behavior habit preference model of the target user and the state information of the target user;
the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
4. The method of claim 3, wherein determining the potential control behavior of the target user on the smart device according to the behavior habit preference model of the target user and the state information of the target user comprises:
and determining a habit preference control behavior corresponding to the state information according to the behavior habit preference model of the target user, and determining the habit preference control behavior as the potential control behavior.
5. The method of claim 1, wherein determining the target smart device corresponding to the potential control behavior from among the active smart devices comprises:
determining target intelligent equipment corresponding to the potential control behaviors from the active intelligent equipment according to the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
6. The method of claim 1, further comprising:
acquiring voice information of the target user;
performing semantic analysis on the voice information of the target user, and determining whether the voice information of the target user comprises control information of intelligent equipment;
and if the voice information of the target user comprises the control information of the intelligent equipment, controlling the intelligent equipment corresponding to the control information according to the control information.
7. The method according to claim 6, before controlling the smart device corresponding to the control information according to the control information, further comprising:
performing semantic analysis on the voice information of the target user, and extracting intelligent equipment corresponding to the control information from the voice information of the target user;
or analyzing the behavior and action information of the target user, and determining that the intelligent device towards which the target user faces is the intelligent device corresponding to the control information; the orientation is the target user face orientation;
or determining the intelligent equipment corresponding to the control information according to the intelligent equipment oriented to the target user and the behavior habit preference model of the target user; the behavior habit preference model of the target user is a model constructed according to the behavior habit preference information of the target user.
8. The method of claim 6, wherein obtaining the voice information of the target user comprises:
extracting first voice information of the target user from the detected voice information based on the voiceprint recognition information of the target user; and/or determining the target user from the detected video information based on the facial recognition information of the target user, extracting lip information of the target user, and generating second voice information of the target user according to the lip information; the video information comprises video information obtained by shooting the space where the target user is located;
and determining the voice information of the target user according to the first voice information and/or the second voice information.
9. The method of claim 7, wherein determining the smart device towards which the target user is oriented comprises:
according to at least one of sound field information, video information and ground pressure information, determining intelligent equipment towards which the target user faces; the video information comprises video information obtained by shooting the space where the target user is located; the ground pressure information comprises ground pressure information of a space where the target user is located; the sound field information is determined from the sound information.
10. A control device of an intelligent device, comprising:
the analysis module is used for determining the potential control behavior of the target user on the intelligent equipment at least according to the state information of the target user; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
the determining module is used for determining target intelligent equipment corresponding to the potential control behaviors from active intelligent equipment; the active smart devices comprise smart devices controllable by the target user;
and the control module is used for controlling the target intelligent equipment according to the potential control behaviors.
11. An intelligent system, comprising: a controller, and at least one sensor connected to the controller;
the sensor is used for acquiring the state information of a target user and sending the state information of the target user to the controller; the state information of the target user comprises behavior and action information and/or physiological parameter information of the target user;
the controller is used for determining the potential control behavior of the target user on the intelligent equipment at least according to the state information of the target user; determining target intelligent equipment corresponding to the potential control behaviors from active intelligent equipment; the active smart devices comprise smart devices capable of being controlled by the target user; and controlling the target intelligent equipment according to the potential control behaviors.
12. The intelligent system of claim 11, the sensors comprising at least a pressure sensor, a visual sensor, a sound sensor, and a physiological parameter sensor;
the pressure sensor is used for acquiring ground pressure information of a space where the target user is located;
the visual sensor is used for acquiring video information of a space where the target user is located;
the sound sensor is used for acquiring sound information of the target user;
the physiological parameter sensor is used for acquiring physiological parameter information of the target user.
13. The intelligent system of claim 11, further comprising an intelligent device;
the intelligent device is electrically connected with the controller.
14. A storage medium, comprising: the storage medium has stored thereon a computer program which, when executed by a processor, carries out the steps of the method of controlling a smart device according to any one of claims 1 to 9.
CN202211437928.7A 2022-11-15 2022-11-15 Control method and device of intelligent equipment, intelligent system and storage medium Pending CN115718433A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211437928.7A CN115718433A (en) 2022-11-15 2022-11-15 Control method and device of intelligent equipment, intelligent system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211437928.7A CN115718433A (en) 2022-11-15 2022-11-15 Control method and device of intelligent equipment, intelligent system and storage medium

Publications (1)

Publication Number Publication Date
CN115718433A true CN115718433A (en) 2023-02-28

Family

ID=85255346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211437928.7A Pending CN115718433A (en) 2022-11-15 2022-11-15 Control method and device of intelligent equipment, intelligent system and storage medium

Country Status (1)

Country Link
CN (1) CN115718433A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808438A (en) * 2024-02-29 2024-04-02 广州市森锐科技股份有限公司 Deep learning-based user habit learning method, device, equipment and medium
CN117808438B (en) * 2024-02-29 2024-07-09 广州市森锐科技股份有限公司 Deep learning-based user habit learning method, device, equipment and medium

Similar Documents

Publication Publication Date Title
CN104896656B (en) Method and device for starting air conditioner
JP7193180B2 (en) METHOD AND SYSTEM FOR PROVIDING CONTROL USER INTERFACE FOR HOME ELECTRONICS
CN105118257B (en) Intelligent control system and method
WO2016155021A1 (en) Environmental control system
CN112051743A (en) Device control method, conflict processing method, corresponding devices and electronic device
CN110806708A (en) Intelligent household control method and system and computer readable storage medium
CN111665730B (en) Electrical appliance configuration method and intelligent home system
CN108111948A (en) The visual output that server at speech interface equipment provides
WO2014190886A1 (en) Intelligent interaction system and software system thereof
CN105257140B (en) Method and device for controlling intelligent door and window
TW201636887A (en) Smart control apparatus and system
CN107229262A (en) A kind of intelligent domestic system
CN107360066A (en) A kind of household service robot and intelligent domestic system
US20170013111A1 (en) Intelligent notification device and intelligent notification method
CN113746708B (en) Electrical appliance configuration method and device, intelligent home system and computer equipment
WO2022262283A1 (en) Control method and apparatus for air conditioner, and air conditioner
KR20210097563A (en) Electronic device for supporting a task management service
CN115718433A (en) Control method and device of intelligent equipment, intelligent system and storage medium
CN110286600B (en) Scene setting method and device of intelligent household operating system
US9392320B1 (en) Adaptive battery life enhancer
CN113359503B (en) Equipment control method and related device
WO2018000261A1 (en) Method and system for generating robot interaction content, and robot
CN114636231A (en) Control method and device of air conditioner, terminal and medium
WO2022193633A1 (en) Projection-based wake-up method and apparatus, and projection device and computer storage medium
WO2016117514A1 (en) Robot control device and robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination