CN111817929A - Equipment interaction method and device, household equipment and storage medium

Equipment interaction method and device, household equipment and storage medium

Info

Publication number
CN111817929A
Authority
CN
China
Prior art keywords: user, identity information, interaction, target, current user
Prior art date
Legal status: Pending
Application number
CN202010482975.8A
Other languages
Chinese (zh)
Inventor
袁珊娜
赵雪
岳长琴
李彭安
隋俊华
Current Assignee
Qingdao Haier Smart Technology R&D Co Ltd
Haier Smart Home Co Ltd
Original Assignee
Qingdao Haier Smart Technology R&D Co Ltd
Haier Smart Home Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Haier Smart Technology R&D Co Ltd, Haier Smart Home Co Ltd filed Critical Qingdao Haier Smart Technology R&D Co Ltd
Priority to CN202010482975.8A
Publication of CN111817929A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2807 Exchanging configuration information on appliance services in a home automation network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home

Abstract

The application relates to a device interaction method and apparatus, a household device, and a storage medium. The method comprises the following steps: receiving an interaction triggering instruction input by a current user, the instruction comprising the identity information and interaction data of the current user; determining, based on the identity information of the current user, a target interaction mode corresponding to that identity information in a preset mapping relationship, the mapping relationship comprising identity information of a plurality of users and an interaction mode corresponding to each user's identity information; and, after determining corresponding target response data according to the interaction data, outputting the target response data to the current user based on the target interaction mode. By adopting the method, the intelligence of human-computer interaction can be improved.

Description

Equipment interaction method and device, household equipment and storage medium
Technical Field
The application relates to the technical field of artificial intelligence, and in particular to a device interaction method and apparatus, a home device, and a storage medium.
Background
With the continuous development of artificial intelligence and Internet of Things technology, more and more smart home devices have appeared. A smart home connects the various devices in a household through the Internet of Things and provides functions such as appliance control, environmental monitoring, and burglar alarms, enabling all-round information interaction and bringing great convenience to people's lives.
In the related art, the interaction mode between a smart home device and its users is usually set inside the device by the manufacturer before it leaves the factory. When users later use the device at home, it interacts with any user in the same preset, unified interaction mode, no matter which user it is.
However, this approach suffers from poor intelligence of human-computer interaction.
Disclosure of Invention
In view of the foregoing, it is necessary to provide a device interaction method and apparatus, a home device, and a storage medium that can improve the intelligence of human-computer interaction.
A device interaction method, the method comprising:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of a current user and interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and after the corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
In one embodiment, the outputting the target response data to the current user based on the target interaction mode includes:
acquiring sign data of a current user, and determining the emotional state of the current user according to the sign data;
and outputting the target response data to the current user according to the emotional state and the target interaction mode of the current user.
In one embodiment, the outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode includes:
fusing the emotional state of the current user and a target interaction mode to obtain a target response mode;
and outputting the target response data to the current user in a target response mode.
In one embodiment, the target interaction mode includes at least one of the following: interacting using a custom name corresponding to the current user, interacting using a custom personality corresponding to the current user, interacting using a custom character corresponding to the current user, and interacting using a custom timbre corresponding to the current user.
In one embodiment, the method for establishing the mapping relationship includes:
receiving a configuration instruction input by a user; the configuration instruction comprises identity information of the user and an interaction mode required by the user;
and changing the built-in factory interaction mode into an interaction mode required by the user, and establishing a corresponding relation between the identity information of the user and the interaction mode required by the user to obtain a mapping relation.
In one embodiment, the method for establishing the mapping relationship includes:
acquiring identity information of different users and preference data of each user; the preference data of each user comprises at least one of sports data, life data, entertainment data and shopping data of the user;
generating an interaction mode corresponding to each user according to the preference data of each user;
and obtaining a mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In one embodiment, if the current user is a plurality of first users, the determining, based on the identity information of the current user, a target interaction mode corresponding to the identity information of the current user in a preset mapping relationship includes:
determining identity information of a target user according to the identity information of a plurality of first users;
and determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
In one embodiment, the determining the identity information of the target user according to the identity information of the plurality of first users includes:
determining overlapped identity information in the identity information of the plurality of first users according to the identity information of the plurality of first users;
and determining the overlapped identity information as the identity information of the target user.
In one embodiment, the determining the identity information of the target user according to the identity information of the plurality of first users includes:
acquiring the priority of the identity information of each first user; the priority represents the sequence of each first user when using the equipment;
and determining the identity information of the first user with the highest priority as the identity information of the target user.
An apparatus for device interaction, the apparatus comprising:
the receiving module is used for receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of a current user and interaction data of the current user;
the determining module is used for determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and the output module is used for outputting the target response data to the current user based on the target interaction mode after determining the corresponding target response data according to the interaction data.
A home device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of a current user and interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and after the corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of a current user and interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and after the corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
According to the device interaction method and apparatus, the household device, and the storage medium, an interaction triggering instruction input by a current user is received, the instruction comprising the identity information and interaction data of the current user; a target interaction mode corresponding to the identity information of the current user is determined in a preset mapping relationship based on that identity information, the mapping relationship comprising identity information of a plurality of users and an interaction mode corresponding to each user's identity information; and after corresponding target response data are determined according to the interaction data, the target response data are output to the current user based on the target interaction mode. In this method, a personalized interaction mode dedicated to each user can be obtained from the pre-established mapping relationship, so that once the response data corresponding to the interaction data are obtained, they can be output in the user's own interaction mode rather than interacting with all users in a single unified mode. The method can therefore meet the personalized requirements of different users and improve the intelligence of interaction between users and devices.
Drawings
FIG. 1 is a diagram of an application environment for a method of device interaction in one embodiment;
FIG. 2 is a flow diagram that illustrates a method for device interaction, according to one embodiment;
FIG. 3 is a flowchart illustrating a method of device interaction according to another embodiment;
FIG. 4 is a flowchart illustrating the process of establishing a mapping relationship according to one embodiment;
FIG. 5 is a flowchart illustrating the process of establishing a mapping relationship in another embodiment;
FIG. 6 is a flowchart illustrating a method of device interaction in another embodiment;
FIG. 7 is a block diagram showing the structure of an apparatus interaction device according to an embodiment;
FIG. 8 is an internal structure diagram of a home device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
At present, the interaction mode between a smart home device and its users is usually set inside the device by the manufacturer before it leaves the factory, so that when users use the device at home, it interacts with any user in the same preset, unified interaction mode, no matter which user it is. As can be seen, this approach suffers from poor intelligence of human-computer interaction. The present application provides a device interaction method and apparatus, a home device, and a storage medium that can solve this technical problem.
The device interaction method provided by the application can be applied to the application environment shown in FIG. 1. The user device 102 may communicate directly with the home device 104 (FIG. 1 takes an intelligent refrigerator as an example). The user device 102 may be, but is not limited to, a personal computer, a notebook computer, a smartphone, a tablet computer, or a portable wearable device, and the home device 104 may be, but is not limited to, an intelligent refrigerator, intelligent air conditioner, intelligent electric cooker, intelligent microwave oven, intelligent water heater, intelligent television, intelligent sound box, intelligent curtain, intelligent lamp curtain wall, intelligent wallpaper, or the like in the user's home. In addition, the user device 102 and the home device 104 may also communicate with each other through the server 106, with their interaction data forwarded by the server 106; the server 106 may be implemented as an independent server or as a server cluster formed by multiple servers.
It should be noted that the execution subject of the embodiment of the present application may be a home device, or may also be a device interaction apparatus, and the method of the embodiment of the present application is described below with reference to the home device as the execution subject.
In one embodiment, an equipment interaction method is provided, and the embodiment relates to a specific process of how to realize personalized interaction between home equipment and a user. As shown in fig. 2, the method may include the steps of:
s202, receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of the current user and interaction data of the current user.
The interaction triggering instruction may be a voice instruction, a touch instruction, or the like. The identity information of the user may be the model of the user's terminal (the terminal may be a mobile phone, tablet computer, notebook computer, smart band, etc.), the user's facial information, the user's voice information, the user's certificate number (such as an identification number or driver's license number), the user's mobile phone number, and so on. The interaction data may be a question the user puts to the home device, an opening remark the user inputs to start a conversation with the home device, or data in other forms.
In addition, the interaction data of the current user may be input directly on the home device by voice or touch, or input on the user's terminal by voice or touch and forwarded to the home device through the server, or provided in other forms. The identity information of the current user may be obtained by recognizing the current user's face through a camera on the home device, by the home device recognizing the current user's voice, by the home device identifying the model of the current user's terminal, or even by the current user entering it on the terminal and forwarding it to the home device through the server; other ways are also possible, and this embodiment does not specifically limit them.
S204, determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user.
The mapping relationship may be obtained by performing custom configuration according to a configuration instruction input by a user, or may be generated according to preference data of the user.
Optionally, the target interaction mode includes at least one of the following: interacting using a custom name corresponding to the current user, interacting using a custom personality corresponding to the current user, interacting using a custom character corresponding to the current user, and interacting using a custom timbre corresponding to the current user.
That is, each user may have a dedicated personalized interaction mode, which is stored in advance in the mapping relationship. Each user's interaction mode may use at least one of a custom name, personality, character, timbre, and so on; if a user's interaction mode includes several custom elements, these are generally packaged together into one overall interaction mode.
Specifically, after obtaining the identity information of the current user, the home device may match it against the identity information entries in the mapping relationship to obtain the matching entry, take the interaction mode corresponding to that entry as the current user's interaction mode, and record it as the target interaction mode.
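The following is a minimal sketch of this lookup, assuming the mapping relationship is kept as a dictionary from identity information (for example a face ID or phone number) to an interaction mode; the class and field names are illustrative and not part of the patent.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class InteractionMode:
    name: str = "factory-default"   # custom name the device answers with
    personality: str = "neutral"    # custom personality
    character: str = "neutral"      # custom character
    timbre: str = "standard"        # custom timbre (tone of voice)

# Mapping relationship: identity information -> dedicated interaction mode.
mapping_relationship: Dict[str, InteractionMode] = {
    "face-id-001": InteractionMode(name="Sun Wukong", personality="playful"),
    "face-id-002": InteractionMode(timbre="soft"),
}

def find_target_interaction_mode(identity: str) -> Optional[InteractionMode]:
    """Match the current user's identity information against the mapping
    relationship and return the corresponding target interaction mode."""
    return mapping_relationship.get(identity)
```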
And S206, after the corresponding target response data is determined according to the interactive data, outputting the target response data to the current user based on the target interactive mode.
In this embodiment, after obtaining the interaction data of the current user, the home device may search its built-in dialogue library for the target response data corresponding to the interaction data. The built-in dialogue library can be established in advance from historical interaction data and the corresponding historical response data.
After the target response data and the target interaction mode are obtained, part of the target response data, or all of it, can be converted into the custom form defined by the target interaction mode, and the converted response data are then output to the current user.
For example, suppose the current user has given the home device a persona, such as a gentle man or a gentle woman; when outputting the target response data, the home device may then speak in that persona, for instance prompting the user about how the day looks. Or a child may name the air conditioner at home Sun Wukong, or after a classmate; when the child turns on the air conditioner, it may reply: Sun Wukong at your service, the target temperature is 25 °C, the current room temperature is 32 °C, it is too hot today to go up the mountain to fight the tiger (which can be generated through related techniques); the next time the child turns on the air conditioner, it may reply: Sun Wukong reminds you that a high-temperature alert has been issued today, the asphalt pavement temperature has reached 43 °C, and so on. Or the user may set the name, timbre, character, and personality of the intelligent electric cooker to those of one favorite celebrity and those of the intelligent microwave oven to another, so that when the electric cooker or microwave oven is used, the response data are output in the corresponding celebrity's timbre, character, and personality, for example making suggestions the user is likely to enjoy.
When the response data are output to the user, the output mode may be voice, a combination of display and voice, or forwarding the response data to the current user's terminal through the server and then outputting them there by voice and/or on the display screen. For voice output, the device can converse itself or work in linkage with an intelligent sound box or other intelligent device with audio processing; for display output, an intelligent lamp curtain wall, intelligent wallpaper, a display screen, and the like can be used. The response data may also be output through a simulation robot, whose appearance can be generated by 3D printing according to the user's preferences, or updated periodically, for example every year or every month, with purchase links sent at regular times.
Of course, when the user leaves home or comes home, the user may not actively input any interaction data. As long as the home device detects through linkage that the user is going out or coming home, the interaction data may default to going out or coming home, and the device may then output greetings such as "welcome home", "have a smooth trip", or "take care on the way" through pre-recorded audio and video; these greetings may of course also be output in the target interaction mode corresponding to the user's identity information. Weather information may also be associated, so that after the greeting is output, the weather for the day or the next few days is reported to the user, and corresponding prompt information may even be output according to the weather conditions. In this way the user feels like communicating with a real person, which relieves loneliness, improves the intelligence of human-computer interaction, and improves the user experience.
In the device interaction method, an interaction triggering instruction input by a current user is received, the instruction comprising the identity information and interaction data of the current user; a target interaction mode corresponding to the identity information of the current user is determined in a preset mapping relationship based on that identity information, the mapping relationship comprising identity information of a plurality of users and an interaction mode corresponding to each user's identity information; and after corresponding target response data are determined according to the interaction data, the target response data are output to the current user based on the target interaction mode. In this method, a personalized interaction mode dedicated to each user can be obtained from the pre-established mapping relationship, so that once the response data corresponding to the interaction data are obtained, they can be output in the user's own interaction mode rather than interacting with all users in a single unified mode. The method can therefore meet the personalized requirements of different users and improve the intelligence of interaction between users and devices.
In another embodiment, another device interaction method is provided; this embodiment relates to the specific process by which the home device interacts with the user based on the user's emotional state. On the basis of the foregoing embodiment, as shown in FIG. 3, outputting the target response data to the current user based on the target interaction mode in S206 may include the following steps:
s302, collecting sign data of the current user, and determining the emotional state of the current user according to the sign data.
The sign data may be the user's body temperature, pulse, blood pressure, respiration, and other data, and may be collected from the human body by corresponding sensors arranged in the home device.
Correspondingly, threshold ranges corresponding to the sign data can be set in the home device. After the home device obtains the current user's sign data through the sensors, it can match the data against the corresponding threshold ranges and obtain the user's emotional state from the matching result. For example, in one possible implementation, if the sign data exceed a threshold range, the user's current emotional state may be considered not good, e.g., low or excited; further, when the sign data exceed the threshold range on the lower side, the emotional state may be considered low, and when they exceed it on the higher side, the emotional state may be considered excited. In another possible implementation, if the sign data do not exceed the threshold range, the user's current emotional state may be considered normal.
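A hedged sketch of this threshold-based emotion estimation is given below; the specific vital signs, ranges, and three-way classification are illustrative assumptions rather than values from the patent.

```python
from typing import NamedTuple

class SignData(NamedTuple):
    heart_rate: float        # beats per minute
    body_temperature: float  # degrees Celsius

# Assumed "normal" threshold ranges; a real device might calibrate them per user.
HEART_RATE_RANGE = (55.0, 100.0)
TEMPERATURE_RANGE = (36.0, 37.3)

def estimate_emotional_state(signs: SignData) -> str:
    """Return 'low', 'excited', or 'normal' by matching the sign data
    against the configured threshold ranges."""
    lo, hi = HEART_RATE_RANGE
    if signs.heart_rate < lo:
        return "low"      # below the lower threshold -> low emotional state
    if signs.heart_rate > hi or signs.body_temperature > TEMPERATURE_RANGE[1]:
        return "excited"  # above the upper threshold -> excited state
    return "normal"       # within range -> normal state
```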
And S304, outputting the target response data to the current user according to the emotional state and the target interaction mode of the current user.
In this step, after the emotional state of the current user and the target interaction mode are obtained, the two may optionally be fused to obtain a target response mode, and the target response data are then output to the current user in the target response mode.
That is, after the emotional state of the current user is obtained, a tone and rhythm may be given to the voice to avoid a cold, mechanical word-by-word pronunciation: the home device obtains a tone and rhythm corresponding to the emotional state, combines the found tone and rhythm with the name, personality, character, timbre, and so on in the target interaction mode to obtain a combined response mode, recorded as the target response mode, converts the target response data into that mode, and outputs the converted data to the current user.
For example, if the current user's emotional state is excited, a soothing tone and rhythm may be used to calm the user's emotion; if the current emotional state is low, an encouraging tone and rhythm may be used to cheer the user up; or corresponding audio and video may be found according to the user's emotional state and actively played after the user is prompted with the target interaction mode and the target response data. The user's physical condition may also be judged from the emotional state, and when the user is unwell the device can show more concern, for example: "Dear host, you seem to have a slight cold; given the current public situation it may be the flu, you are advised to take ……"; or "Judging from what you wore this morning and today's weather, you may have caught a chill, you are advised to take ……", and so on.
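The sketch below illustrates the fusion step (S304) under the same assumptions as the earlier sketches, reusing their InteractionMode class and estimate_emotional_state function; the tone/rhythm table is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ResponseMode:
    name: str
    personality: str
    timbre: str
    tone: str    # tone selected from the emotional state
    rhythm: str  # speech rhythm selected from the emotional state

# Assumed prosody choices per emotional state.
EMOTION_TO_PROSODY = {
    "excited": ("soothing", "slow"),       # calm the user down
    "low":     ("encouraging", "lively"),  # cheer the user up
    "normal":  ("neutral", "normal"),
}

def fuse(emotional_state: str, mode: "InteractionMode") -> ResponseMode:
    """Fuse the emotional state with the target interaction mode to obtain
    the target response mode."""
    tone, rhythm = EMOTION_TO_PROSODY.get(emotional_state, ("neutral", "normal"))
    return ResponseMode(name=mode.name, personality=mode.personality,
                        timbre=mode.timbre, tone=tone, rhythm=rhythm)

def output_response(response_data: str, response_mode: ResponseMode) -> str:
    # Render the response in the fused mode; a real device would drive
    # text-to-speech and/or a display here.
    return f"[{response_mode.name} | {response_mode.tone}/{response_mode.rhythm}] {response_data}"
```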
According to the device interaction method provided by this embodiment, the emotional state of the current user can be obtained by collecting the current user's sign data, and the target response data are output to the current user based on that emotional state and the target interaction mode. In this embodiment, the response data output to the user are combined not only with the user's dedicated target interaction mode but also with the user's current emotional state; human-computer interaction performed with this method can therefore better meet the actual needs and actual physical condition of the current user, i.e., it is more anthropomorphic, which further improves the intelligence of human-computer interaction.
In the actual human-computer interaction described above, the target interaction mode corresponding to the identity information of the current user is obtained through the mapping relationship between identity information and interaction modes; before the mapping relationship can be used, it therefore needs to be established or determined. Two ways of establishing it are described below.
First, the mapping relationship is obtained through user-defined configuration. This embodiment relates to the specific process by which the home device obtains the mapping relationship between identity information and interaction modes through custom configuration. On the basis of the foregoing embodiment, as shown in FIG. 4, the mapping relationship may be established through the following steps:
s402, receiving a configuration instruction input by a user; the configuration instruction comprises identity information of the user and an interaction mode required by the user.
S404, changing the built-in factory interaction mode into an interaction mode required by the user, and establishing a corresponding relation between the identity information of the user and the interaction mode required by the user to obtain a mapping relation.
In this embodiment, the user may input the configuration instruction directly on the home device by voice or touch, for example by telling the device to enter the configuration state or by touching a configuration button on its display screen, and then entering the configuration instruction by voice or touch once the device is in the configuration state. The user may also input the configuration instruction by voice or touch on a simulated interface of the home device on the user's terminal, with the instruction forwarded to the home device through the server. Of course, the home device may also be put into the configuration state by pressing a button on a remote controller, and the configuration instruction entered by clicking or touching buttons there.
After the home device receives the configuration instruction input by the user, it can obtain from the instruction the identity information of the user and the interaction mode the user requires, which may be at least one of a required name, personality, character, and timbre. The home device then calls up the built-in factory interaction mode (i.e., the built-in initial interaction mode, which may include the name, personality, character, timbre, and so on set when the device left the factory), changes the parts of the factory mode that correspond to the interaction mode required by the user into that required mode to obtain a changed interaction mode, and binds the identity information of the user to the corresponding changed interaction mode. This can be done for any configuration instruction entered by a user, so that the mapping relationship between identity information and interaction modes is obtained. When the user subsequently interacts with the home device, the device can interact with the user in the interaction mode set in the mapping relationship.
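A minimal sketch of this configuration step follows, reusing the InteractionMode class from the earlier sketch; the ConfigInstruction fields and the idea of overriding only the requested parts of the factory mode are assumptions made for illustration.

```python
from dataclasses import dataclass, replace
from typing import Dict, Optional

@dataclass
class ConfigInstruction:
    identity: str                      # identity information of the user
    name: Optional[str] = None         # required custom name, if any
    personality: Optional[str] = None  # required custom personality, if any
    timbre: Optional[str] = None       # required custom timbre, if any

FACTORY_MODE = InteractionMode()       # built-in factory interaction mode

def apply_configuration(instr: ConfigInstruction,
                        mapping: Dict[str, "InteractionMode"]) -> None:
    """Change only the factory fields the user asked to change, then bind the
    result to the user's identity information in the mapping relationship."""
    requested = dict(name=instr.name, personality=instr.personality,
                     timbre=instr.timbre)
    overrides = {k: v for k, v in requested.items() if v is not None}
    mapping[instr.identity] = replace(FACTORY_MODE, **overrides)
```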
Because the mapping relationship between the user's identity information and the interaction mode is obtained through user-defined configuration, with each user configuring it personally, the personalized requirements of different users can be met, the configured interaction mode is more accurate, and the user experience is improved.
Next, obtaining the mapping relationship through user preference data is described. This embodiment relates to the specific process by which the home device generates the mapping relationship between identity information and interaction modes from user preference data. On the basis of the foregoing embodiment, as shown in FIG. 5, the mapping relationship may be established through the following steps:
s502, acquiring identity information of different users and preference data of each user; the preference data of each user includes at least one of sports data, life data, entertainment data, shopping data of the user.
S504, generating an interaction mode corresponding to each user according to the preference data of each user.
S506, obtaining a mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In this embodiment, on the premise of big data, as much auxiliary information as possible may be collected through social media, articles by different users, or communication with different users, so as to obtain each user's identity information and the corresponding preference data. Each user's preference data include sports data, life data, entertainment data, shopping data, and so on. Life data include getting-up time, meal times, rest time, clothing choices, and the like; sports data include running, swimming, ball games, and the like; entertainment data include television programs, music (especially paid music), games (especially video games), and the like; shopping data include dining out, beverages, supermarket purchases, take-away orders, and the like.
After each user's preference data are obtained, a big data analysis method can be used to infer the names, characters, personalities, timbres, and so on that the user prefers, and a dedicated interaction mode is obtained for each user, comprising at least one dedicated name, character, personality, timbre, etc. For example, if the user is an anime fan, the personality of the home device is made close to the anime style the user prefers; if the user likes watching TV dramas, the personality and character of the home device are made close to a role in a drama the user has recently watched, and the device's name and timbre can be set to that role's name and timbre. In addition, when the interaction mode for each user is generated, only some elements such as the name, character, personality, or timbre may be generated, with the remaining elements keeping the home device's factory settings.
Further, after obtaining the identity information of each user and the dedicated interaction mode, the identity information of each user and the corresponding interaction mode can be bound together to obtain the mapping relationship.
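A rough sketch of this preference-driven generation (S502 to S506) is shown below; the keyword rules stand in for the big data analysis the text mentions and, like the preference labels, are purely illustrative. It reuses the InteractionMode class from the earlier sketch.

```python
from typing import Dict, List

def derive_mode_from_preferences(preferences: List[str]) -> "InteractionMode":
    """Generate a dedicated interaction mode from a user's preference data."""
    mode = InteractionMode()                      # start from factory defaults
    if "anime" in preferences:
        mode.personality = "anime-style"          # close to the user's taste
    if "tv-drama" in preferences:
        mode.name = "favourite-character"         # name of a liked role
        mode.timbre = "favourite-character-voice"
    return mode

def build_mapping(users: Dict[str, List[str]]) -> Dict[str, "InteractionMode"]:
    """users maps identity information -> preference data; the result binds
    each identity to its generated interaction mode (the mapping relationship)."""
    return {identity: derive_mode_from_preferences(prefs)
            for identity, prefs in users.items()}
```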
Because the user's dedicated interaction mode is generated from the user's preference data, and the mapping relationship is obtained from that dedicated mode and the corresponding identity information, the whole process requires no manual operation by the user, which saves manpower and time; and because the mapping relationship is derived from the user's preference data, it is also more accurate, which further improves the user experience.
The device interaction methods provided by these embodiments obtain the mapping relationship between users' identity information and interaction modes either through user-defined configuration or from users' preference data. Because both the custom configuration and the preference data reflect the users' own requirements, the mapping relationship obtained in these embodiments can meet the personalized requirements of different users, and the interaction modes in the obtained mapping relationship are accurate, so the user experience can be improved.
It should be noted that the above embodiments all address human-computer interaction when the current user is a single user. In an actual scenario, however, there may well be multiple users in the home at the same time. How is human-computer interaction implemented in that case? This is described below through a specific embodiment.
In another embodiment, another device interaction method is provided; this embodiment relates to the specific process of obtaining the target interaction mode based on the identity information of a plurality of first users, when the current user is a plurality of first users, so as to implement human-computer interaction. On the basis of the above embodiment, as shown in FIG. 6, S204 may include the following steps:
s602, determining the identity information of the target user according to the identity information of the plurality of first users.
The plurality of first users may be several different terminals of the same user, or several actually different people, such as the elderly, children, and parents in a home. Each first user has its own identity information, which, in addition to the information content mentioned in S202, may include each first user's occupation, sex, age, height, educational background, and so on.
As can be seen from the above, the identity information of the first users may be partially the same but is generally not entirely the same, which is equivalent to there being several different pieces of identity information. However, one home device can only interact with users in one target interaction mode at a time, so the identity information of a single target user needs to be determined. This can be done according to the following two scenarios:
In the first scenario, overlapping identity information among the identity information of the plurality of first users is determined according to their identity information, and this overlapping identity information is determined as the identity information of the target user.
For example, assuming the identity information of first user A includes the profession white-collar, age 28, and height 170, and the identity information of first user B includes the profession white-collar and age 32, the identity information overlapping between the two users is the white-collar profession, which can then be determined as the identity information of the target user.
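A minimal sketch of this first scenario follows; representing each first user's identity information as a set of attribute strings is an assumption made purely for illustration.

```python
from functools import reduce
from typing import List, Set

def overlapping_identity(identities: List[Set[str]]) -> Set[str]:
    """Return the identity information shared by every first user."""
    return reduce(lambda a, b: a & b, identities) if identities else set()

# Example from the text: user A is {"white-collar", "age-28", "height-170"},
# user B is {"white-collar", "age-32"}; the overlap is {"white-collar"}.
```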
In the second scenario, the priority of each first user's identity information is acquired, where the priority represents the order of the first users when using the device, and the identity information of the first user with the highest priority is determined as the identity information of the target user.
When the home device obtains each first user's identity information, it can also obtain the priority configured by the administrator for each first user, sort the identity information of the first users by priority, and determine the identity information with the highest priority in the sorted result as the identity information of the target user.
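The second scenario can be sketched as below, assuming the administrator-configured priorities are available as a dictionary and that a smaller number means a higher priority.

```python
from typing import Dict

def select_by_priority(priorities: Dict[str, int]) -> str:
    """priorities maps each first user's identity information to its priority;
    return the identity information with the highest priority."""
    return min(priorities, key=priorities.get)

# Example: {"parent-face-id": 2, "child-face-id": 1} -> "child-face-id"
```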
S604, determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
After the identity information of the target user is obtained, it is matched against the identity information entries in the mapping relationship to obtain the matching entry, and the interaction mode corresponding to that entry is taken as the target user's interaction mode, recorded as the target interaction mode. After the response data corresponding to the interaction data are obtained, the response data can be output to the plurality of first users based on the target interaction mode.
Of course, in this embodiment different management rights over the home devices may also be allocated to different users, for example the television belongs to the elderly and the computer belongs to the children; this is then the same as the case where one user corresponds to one home device and is not described again here.
According to the device interaction method provided by this embodiment, if several users use the home device at the same time, the identity information of the target user can be determined and the target interaction mode corresponding to that identity information obtained, so that the device interacts with the several users in the target interaction mode. In this embodiment, the identity information of the target user can be determined from the identity information of multiple users, and human-computer interaction is performed in the interaction mode corresponding to it, which meets the scenario in which multiple users share the same home device.
In another embodiment, to describe the technical solution of the present application in more detail, a more detailed embodiment is given below in conjunction with the previous ones; the method may include the following steps S1 to S10 (a code sketch follows the list):
and S1, establishing a mapping relation between the identity information of the user and the interaction mode through the user custom configuration or the preference data of the user.
S2, receiving an interactive trigger instruction input by a current user; the interaction triggering instruction comprises identity information of the current user and interaction data of the current user.
S3, judging whether the current user is a single user or a plurality of users according to the identity information of the current user in the interaction triggering instruction; if a single user, executing S6; if a plurality of users, executing S4.
S4, determining overlapped identity information in the identity information of the plurality of users according to the identity information of the plurality of users, and determining the overlapped identity information as the identity information of the target user; or, acquiring the priority of the identity information of each first user, and determining the identity information of the first user with the highest priority as the identity information of the target user.
S5, determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
And S6, determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user.
And S7, determining corresponding target response data according to the interactive data of the current user.
And S8, collecting sign data of the current user, and determining the emotional state of the current user according to the sign data.
And S9, fusing the emotion state of the current user and the target interaction mode to obtain a target response mode.
And S10, outputting the target response data to the current user in a target response mode.
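An end-to-end sketch of steps S1 to S10, stitched together from the sketches above, is given below; the single/multi-user check, the priority fallback, and the placeholder response lookup are assumptions made for illustration rather than part of the patent.

```python
def handle_interaction(identities, priorities, interaction_data, mapping, read_signs):
    # S2-S6: resolve the target identity. A single current user is used
    # directly; for several first users, the one with the highest priority
    # is taken as the target user (the second scenario above).
    if len(identities) == 1:
        target = identities[0]
    else:
        target = select_by_priority({i: priorities[i] for i in identities})
    mode = mapping.get(target, FACTORY_MODE)

    # S7: determine the target response data for the interaction data; a real
    # device would query its built-in dialogue library here.
    response_data = f"response to: {interaction_data}"

    # S8-S10: collect sign data, estimate the emotional state, fuse it with
    # the target interaction mode, and output in the target response mode.
    emotion = estimate_emotional_state(read_signs())
    return output_response(response_data, fuse(emotion, mode))
```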
It should be understood that, although the steps in the flowcharts of FIGS. 2-6 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in FIGS. 2-6 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 7, there is provided a device interaction apparatus including: a receiving module 10, a determining module 11 and an output module 12, wherein:
the receiving module 10 is configured to receive an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of a current user and interaction data of the current user;
the determining module 11 is configured to determine, based on the identity information of the current user, a target interaction mode corresponding to the identity information of the current user in a preset mapping relationship; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and the output module 12 is configured to output the target response data to the current user based on the target interaction mode after determining the corresponding target response data according to the interaction data.
Optionally, the target interaction mode includes at least one of the following: interacting using a custom name corresponding to the current user, interacting using a custom personality corresponding to the current user, interacting using a custom character corresponding to the current user, and interacting using a custom timbre corresponding to the current user.
In another embodiment, another device interaction apparatus is provided, and on the basis of the above embodiment, the output module 12 may include a first determining unit and an output unit, where:
the first determining unit is used for acquiring sign data of the current user and determining the emotional state of the current user according to the sign data;
and the output unit is used for outputting the target response data to the current user according to the emotional state and the target interaction mode of the current user.
Optionally, the output unit is further configured to fuse an emotional state of the current user with the target interaction mode to obtain a target response mode; and outputting the target response data to the current user in a target response mode.
In another embodiment, another device interaction apparatus is provided, and on the basis of the above embodiment, the apparatus may further include an establishing module, configured to receive a configuration instruction input by a user; the configuration instruction comprises identity information of the user and an interaction mode required by the user; and changing the built-in factory interaction mode into an interaction mode required by the user, and establishing a corresponding relation between the identity information of the user and the interaction mode required by the user to obtain a mapping relation.
Optionally, the establishing module is further configured to obtain identity information of different users and preference data of each user; the preference data of each user comprises at least one of sports data, life data, entertainment data and shopping data of the user; generating an interaction mode corresponding to each user according to the preference data of each user; and obtaining a mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In another embodiment, another device interaction apparatus is provided, on the basis of the foregoing embodiment, if the current user is a plurality of first users, the determining module 11 may include a second determining unit and a third determining unit, where:
the second determining unit is used for determining the identity information of the target user according to the identity information of the plurality of first users;
and the third determining unit is used for determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
Optionally, the second determining unit is further configured to determine, according to the identity information of the plurality of first users, overlapped identity information in the identity information of the plurality of first users; and determining the overlapped identity information as the identity information of the target user.
Optionally, the second determining unit is further configured to obtain a priority of the identity information of each first user; the priority represents the sequence of each first user when using the equipment; and determining the identity information of the first user with the highest priority as the identity information of the target user.
For the specific definition of the device interaction apparatus, reference may be made to the above definition of the device interaction method, which is not described herein again.
The modules in the device interaction apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded, in hardware form, in or independent of the processor of the computer device, or stored, in software form, in the memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a home device is provided, whose internal structure may be as shown in FIG. 8. The home device comprises a processor, a memory, a communication interface, a display screen, and an input device connected through a system bus. The processor of the home device provides computing and control capability. The memory of the home device comprises a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The communication interface of the home device is used for wired or wireless communication with an external terminal; wireless communication can be realized through WIFI, an operator network, NFC, or other technologies. The computer program, when executed by the processor, implements a device interaction method. The display screen of the home device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, a key, trackball, or touchpad arranged on the housing of the home device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the structure shown in FIG. 8 is a block diagram of only part of the structure relevant to the present application and does not limit the home devices to which the present application is applied; a particular home device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, a household device is provided, including a memory and a processor; the memory stores a computer program, and the processor implements the following steps when executing the computer program:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of the current user and interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and after the corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
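To make the receive, look-up and respond flow above concrete, the following Python sketch shows one way it might be organized; it is not the application's implementation, and the mapping contents, the function name handle_trigger and the response format are all illustrative assumptions.

# Preset mapping relation: user identity -> interaction mode.
# Field names loosely mirror the name/personality/tone options described later
# in this document; all values are illustrative.
mode_map = {
    "user_alice": {"custom_name": "Sunny", "personality": "cheerful", "tone": "soft"},
    "user_bob": {"custom_name": "Butler", "personality": "calm", "tone": "formal"},
}
FACTORY_MODE = {"custom_name": "Assistant", "personality": "neutral", "tone": "neutral"}

def handle_trigger(identity: str, interaction_data: str) -> str:
    """Look up the target interaction mode for this identity and wrap the response in it."""
    mode = mode_map.get(identity, FACTORY_MODE)      # fall back to the factory mode
    response = f"result for '{interaction_data}'"    # stand-in for the target response data
    return f"[{mode['custom_name']}, {mode['tone']} tone] {response}"

print(handle_trigger("user_alice", "play some music"))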
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring physiological sign data of the current user, and determining the emotional state of the current user according to the sign data; and outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode.
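As a simplified illustration of mapping physiological sign data to an emotional state, the sketch below assumes heart rate and body temperature as the sign data and uses crude, made-up thresholds and labels; none of these values come from the application.

def estimate_emotional_state(heart_rate_bpm: float, body_temp_c: float) -> str:
    """Map raw physiological sign data to a coarse emotional-state label (illustrative thresholds)."""
    if heart_rate_bpm > 100 or body_temp_c > 37.5:
        return "agitated"
    if heart_rate_bpm < 65:
        return "calm"
    return "neutral"

print(estimate_emotional_state(heart_rate_bpm=110, body_temp_c=36.8))  # -> agitated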
In one embodiment, the processor, when executing the computer program, further performs the steps of:
fusing the emotional state of the current user and the target interaction mode to obtain a target response mode; and outputting the target response data to the current user in the target response mode.
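One way to read the fusing step is that emotion-driven adjustments are overlaid on the user's configured interaction mode; the sketch below, with invented adjustment rules and dictionary keys, is only an assumption about how such a fusion could look.

def fuse_response_mode(emotional_state: str, target_mode: dict) -> dict:
    """Overlay emotion-driven tweaks on the target interaction mode to obtain a target response mode."""
    fused = dict(target_mode)                 # start from the user's configured interaction mode
    if emotional_state == "agitated":
        fused["tone"] = "soothing"            # soften the tone for an agitated user
        fused["speech_rate"] = "slow"
    elif emotional_state == "calm":
        fused["speech_rate"] = "normal"
    return fused

print(fuse_response_mode("agitated", {"custom_name": "Sunny", "tone": "soft"}))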
In one embodiment, the target interaction mode includes at least one of: interaction using a user-defined name corresponding to the current user, interaction using a user-defined personality corresponding to the current user, interaction using a user-defined character corresponding to the current user, and interaction using a user-defined tone corresponding to the current user.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
receiving a configuration instruction input by a user; the configuration instruction comprises identity information of the user and the interaction mode required by the user; and changing the built-in factory interaction mode into the interaction mode required by the user, and establishing a correspondence between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
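A minimal sketch of how a configuration instruction could replace the factory interaction mode and build up the mapping relation is shown below; the dictionary-based mapping and the function name apply_config_instruction are assumptions, not the application's design.

FACTORY_MODE = {"custom_name": "Assistant", "personality": "neutral", "tone": "neutral"}
mapping = {}  # mapping relation: user identity -> interaction mode required by that user

def apply_config_instruction(identity: str, required_mode: dict) -> None:
    """Replace the built-in factory mode for this user and record the correspondence."""
    mapping[identity] = {**FACTORY_MODE, **required_mode}  # user's settings override factory defaults

apply_config_instruction("user_alice", {"custom_name": "Sunny", "tone": "soft"})
print(mapping)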
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring identity information of different users and preference data of each user, the preference data of each user comprising at least one of sports data, daily-life data, entertainment data and shopping data of the user; generating an interaction mode corresponding to each user according to the preference data of each user; and obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
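The preference-driven alternative could be sketched as follows; the rules that turn preference data into an interaction mode are invented purely for illustration.

def mode_from_preferences(prefs: dict) -> dict:
    """Derive an interaction mode from a user's preference data (rules are illustrative)."""
    mode = {"custom_name": "Assistant", "personality": "neutral", "tone": "neutral"}
    if prefs.get("sports"):
        mode["personality"] = "energetic"
    if prefs.get("entertainment"):
        mode["tone"] = "lively"
    return mode

def build_mapping(preference_data: dict) -> dict:
    """preference_data: user identity -> preference data; returns the mapping relation."""
    return {identity: mode_from_preferences(p) for identity, p in preference_data.items()}

print(build_mapping({"user_alice": {"sports": ["yoga"], "shopping": []}}))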
In one embodiment, the processor, when executing the computer program, further performs the steps of:
determining identity information of a target user according to the identity information of a plurality of first users; and determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
determining overlapping identity information among the identity information of the plurality of first users according to the identity information of the plurality of first users; and determining the overlapping identity information as the identity information of the target user.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring the priority of the identity information of each first user, the priority indicating the order in which each first user uses the device; and determining the identity information of the first user with the highest priority as the identity information of the target user.
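The two multi-user strategies above (shared identity information versus highest priority) might look like the sketch below; treating identity information as sets and priority as a number where a smaller value means earlier use of the device are both assumptions made for the example.

def target_by_overlap(identity_sets: list) -> set:
    """Target identity information = the identity information shared by all detected first users."""
    shared = set(identity_sets[0])
    for ids in identity_sets[1:]:
        shared &= set(ids)
    return shared

def target_by_priority(priorities: dict) -> str:
    """Pick the first user with the highest priority (assumed: smaller number = used the device earlier)."""
    return min(priorities, key=priorities.get)

print(target_by_overlap([{"family", "adult"}, {"family", "child"}]))  # -> {'family'}
print(target_by_priority({"user_alice": 1, "user_bob": 2}))           # -> user_alice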
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon which, when executed by a processor, implements the following steps:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of the current user and interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and after the corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring physiological sign data of the current user, and determining the emotional state of the current user according to the sign data; and outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode.
In one embodiment, the computer program when executed by the processor further performs the steps of:
fusing the emotional state of the current user and the target interaction mode to obtain a target response mode; and outputting the target response data to the current user in the target response mode.
In one embodiment, the target interaction mode includes at least one of: interaction using a user-defined name corresponding to the current user, interaction using a user-defined personality corresponding to the current user, interaction using a user-defined character corresponding to the current user, and interaction using a user-defined tone corresponding to the current user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
receiving a configuration instruction input by a user; the configuration instruction comprises identity information of the user and the interaction mode required by the user; and changing the built-in factory interaction mode into the interaction mode required by the user, and establishing a correspondence between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring identity information of different users and preference data of each user, the preference data of each user comprising at least one of sports data, daily-life data, entertainment data and shopping data of the user; generating an interaction mode corresponding to each user according to the preference data of each user; and obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining identity information of a target user according to the identity information of a plurality of first users; and determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining overlapping identity information among the identity information of the plurality of first users according to the identity information of the plurality of first users; and determining the overlapping identity information as the identity information of the target user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring the priority of the identity information of each first user, the priority indicating the order in which each first user uses the device; and determining the identity information of the first user with the highest priority as the identity information of the target user.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory may include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, any combination that contains no contradiction should be regarded as falling within the scope of this specification.
The above embodiments express only several implementations of the present application, and although they are described in relative detail, they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. A device interaction method, the method comprising:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of the current user and interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and after corresponding target response data are determined according to the interaction data, the target response data are output to the current user based on the target interaction mode.
2. The method of claim 1, wherein outputting the target response data to the current user based on the target interaction mode comprises:
acquiring physiological sign data of the current user, and determining the emotional state of the current user according to the sign data;
and outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode.
3. The method of claim 2, wherein outputting the target response data to the current user based on the emotional state of the current user and the target interaction mode comprises:
fusing the emotional state of the current user and the target interaction mode to obtain a target response mode;
and outputting the target response data to the current user in the target response mode.
4. The method of any one of claims 1-3, wherein the target interaction mode comprises at least one of: interaction using a custom name corresponding to the current user, interaction using a custom personality corresponding to the current user, interaction using a custom character corresponding to the current user, and interaction using a custom tone corresponding to the current user.
5. The method according to any one of claims 1-3, wherein the mapping relationship is established in a manner that includes:
receiving a configuration instruction input by a user; the configuration instruction comprises identity information of the user and an interaction mode required by the user;
and changing a built-in factory interaction mode into the interaction mode required by the user, and establishing a correspondence between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
6. The method according to any one of claims 1-3, wherein the mapping relationship is established in a manner that includes:
acquiring identity information of different users and preference data of each user; the preference data of each user comprises at least one of sports data, daily-life data, entertainment data and shopping data of the user;
generating an interaction mode corresponding to each user according to the preference data of each user;
and obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
7. The method according to any one of claims 1 to 3, wherein, if the current user comprises a plurality of first users, the determining, in a preset mapping relation, a target interaction mode corresponding to the identity information of the current user based on the identity information of the current user includes:
determining identity information of a target user according to the identity information of the plurality of first users;
and determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
8. The method of claim 7, wherein determining the identity information of the target user according to the identity information of the plurality of first users comprises:
determining overlapping identity information among the identity information of the plurality of first users according to the identity information of the plurality of first users;
and determining the overlapping identity information as the identity information of the target user.
9. The method of claim 7, wherein determining the identity information of the target user according to the identity information of the plurality of first users comprises:
acquiring the priority of the identity information of each first user; the priority indicates the order in which each of the first users uses the device;
and determining the identity information of the first user with the highest priority as the identity information of the target user.
10. An apparatus for device interaction, the apparatus comprising:
the receiving module is used for receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises identity information of the current user and interaction data of the current user;
the determining module is used for determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user;
and the output module is used for outputting the target response data to the current user based on the target interaction mode after determining the corresponding target response data according to the interaction data.
11. A household device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 9.
CN202010482975.8A 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium Pending CN111817929A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010482975.8A CN111817929A (en) 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111817929A true CN111817929A (en) 2020-10-23

Family

ID=72848039

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010482975.8A Pending CN111817929A (en) 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111817929A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106113038A (en) * 2016-07-08 2016-11-16 纳恩博(北京)科技有限公司 Mode switching method based on robot and device
CN108073605A (en) * 2016-11-10 2018-05-25 阿里巴巴集团控股有限公司 A kind of loading of business datum, push, the generation method of interactive information and device
CN106682090A (en) * 2016-11-29 2017-05-17 上海智臻智能网络科技股份有限公司 Active interaction implementing device, active interaction implementing method and intelligent voice interaction equipment
CN107483493A (en) * 2017-09-18 2017-12-15 广东美的制冷设备有限公司 Interactive calendar prompting method, device, storage medium and intelligent domestic system
CN107819651A (en) * 2017-09-30 2018-03-20 深圳市艾特智能科技有限公司 Intelligent home equipment control method, device, storage medium and computer equipment
CN108351707A (en) * 2017-12-22 2018-07-31 深圳前海达闼云端智能科技有限公司 Man-machine interaction method and device, terminal equipment and computer readable storage medium
CN108874895A (en) * 2018-05-22 2018-11-23 北京小鱼在家科技有限公司 Interactive information method for pushing, device, computer equipment and storage medium
CN109002022A (en) * 2018-08-16 2018-12-14 陕西卓居未来智能科技有限公司 A kind of cloud intelligent steward system and operating method based on interactive voice ability
CN109065035A (en) * 2018-09-06 2018-12-21 珠海格力电器股份有限公司 Information interacting method and device
CN109409063A (en) * 2018-10-10 2019-03-01 北京小鱼在家科技有限公司 A kind of information interacting method, device, computer equipment and storage medium
CN111177329A (en) * 2018-11-13 2020-05-19 奇酷互联网络科技(深圳)有限公司 User interaction method of intelligent terminal, intelligent terminal and storage medium
CN111061953A (en) * 2019-12-18 2020-04-24 深圳市优必选科技股份有限公司 Intelligent terminal interaction method and device, terminal equipment and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113137697A (en) * 2021-04-19 2021-07-20 青岛海尔空调电子有限公司 Control method and device of air conditioner and computer readable storage medium
CN113160832A (en) * 2021-04-30 2021-07-23 合肥美菱物联科技有限公司 Voice washing machine intelligent control system and method supporting voiceprint recognition
CN113934299A (en) * 2021-10-18 2022-01-14 珠海格力电器股份有限公司 Equipment interaction method and device, smart home equipment and processor
CN113934299B (en) * 2021-10-18 2024-01-30 珠海格力电器股份有限公司 Equipment interaction method and device, intelligent household equipment and processor
CN114137841A (en) * 2021-10-28 2022-03-04 青岛海尔科技有限公司 Control method, equipment and system of Internet of things equipment
CN114137841B (en) * 2021-10-28 2024-03-22 青岛海尔科技有限公司 Control method, equipment and system of Internet of things equipment

Similar Documents

Publication Publication Date Title
CN111817929A (en) Equipment interaction method and device, household equipment and storage medium
JP6853858B2 (en) Display device
US11353259B2 (en) Augmented-reality refrigerator and method of controlling thereof
CN113412457B (en) Scene pushing method, device and system, electronic equipment and storage medium
CN107483493A (en) Interactive calendar prompting method, device, storage medium and intelligent domestic system
US20160093081A1 (en) Image display method performed by device including switchable mirror and the device
CN110168298A (en) Refrigerator and its information display method
CN108111948A (en) The visual output that server at speech interface equipment provides
US11732961B2 (en) Augmented-reality refrigerator and method of controlling thereof
CN108696631A (en) Method and its electronic equipment for providing content corresponding with accessory device
CN105794170A (en) Hazard detection unit facilitating user-friendly setup experience
CN111665729A (en) Household equipment control method and device and computer equipment
US20150222450A1 (en) System and method of controlling external apparatus connected with device
KR20180118914A (en) An audio device and method for controlling the same
CN107490971A (en) Intelligent automation assistant in home environment
CN101548531A (en) Configurable personal audiovisual device for use in networked application-sharing system
JP2019120935A (en) Method for providing service using plural wake word in artificial intelligence device and system thereof
CN108647056A (en) Application program preloads method, apparatus, storage medium and terminal
KR20200085143A (en) Conversational control system and method for registering external apparatus
US10936140B2 (en) Method and device for displaying response
CN108037699B (en) Robot, control method of robot, and computer-readable storage medium
JP6669942B2 (en) Linked system, linked server, and device control server
CN106502652B (en) A kind of method for switching theme and server
KR20200068506A (en) Network based portable dining table
KR102625772B1 (en) Method and apparatus for group purchase using neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination