CN105634881B - Application scene recommendation method and device - Google Patents

Application scene recommendation method and device

Info

Publication number: CN105634881B
Authority: CN (China)
Prior art keywords: scene, user, description data, data, template
Legal status: Active
Application number: CN201410606347.0A
Other languages: Chinese (zh)
Other versions: CN105634881A
Inventor: 蔡耿平
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority application: CN201410606347.0A
Publication of application: CN105634881A
Application granted; publication of grant: CN105634881B


Abstract

The invention discloses an application scene recommendation method and device, and belongs to the technical field of the Internet of Things. The method comprises the following steps: when a scene recommendation instruction is received, generating scene description data according to equipment information and user information, wherein the equipment information is the information of the intelligent equipment bound to the user account logged in on the terminal, and the user information is the user information corresponding to the user account; acquiring, from a server and according to the scene description data, a scene template matched with the scene description data; generating application scene data according to the scene description data and the matched scene template; and recommending the application scene corresponding to the application scene data to the user. The device comprises a first generation module, an acquisition module, a second generation module and a recommendation module. With the method and the device, the user does not need to perform any manual setting: submitting a scene recommendation instruction to the terminal is enough to make the terminal recommend an application scene to the user, so the whole process is completed with a single operation by the user, which greatly facilitates the user's operation.

Description

Application scene recommendation method and device
Technical Field
The invention relates to the technical field of Internet of things, in particular to an application scene recommendation method and device.
Background
With the rapid development of Internet of Things technology, more and more intelligent devices are joining the Internet of Things, and these intelligent devices can work cooperatively by means of the Internet of Things to provide richer intelligent services for users. To use such intelligent services, a user can install an intelligent service client on a mobile phone, and the intelligent service client can remotely control and manage the intelligent devices. For example, on a particularly cold day, a user who is on the way home may want the house to be warm on arrival; the user can then turn on the air conditioner at home through the intelligent service client installed on the mobile phone.
When there are many intelligent devices, they may be divided into a plurality of groups, with each group of intelligent devices providing one type of intelligent service; that is, the intelligent devices in the same group belong to the same application scene. For example, a camera may belong to the application scene of a home security system, while a temperature sensor, an air conditioner and a window may belong to the application scene of home temperature control. At present, all the intelligent devices are grouped manually by users, and different IFTTT (If This Then That) rules are set for different application scenes based on the IFTTT technology; the intelligent service client then makes the intelligent devices in an application scene work cooperatively through the IFTTT rules corresponding to that application scene.
Because the application scene in which the intelligent devices are located must be configured manually by the user, and the IFTTT rule corresponding to the application scene must also be set manually by the user, the operation is complex. A method for recommending an application scene to the user according to a plurality of intelligent devices is therefore urgently needed.
Disclosure of Invention
In order to solve the problems in the prior art, embodiments of the present invention provide an application scenario recommendation method and apparatus. The technical scheme is as follows:
in one aspect, an application scenario recommendation method is provided, where the method includes:
when a scene recommendation instruction is received, generating scene description data according to equipment information and user information, wherein the equipment information is intelligent equipment information bound with a user account which is logged in by a terminal, and the user information is user information corresponding to the user account;
acquiring a scene template matched with the scene description data from a server according to the scene description data;
generating application scene data according to the scene description data and the scene template;
recommending the application scene corresponding to the application scene data to the user.
In another aspect, an application scenario recommendation apparatus is provided, the apparatus including:
the scene recommendation method comprises a first generation module and a second generation module, wherein the first generation module is used for generating scene description data according to equipment information and user information when a scene recommendation instruction is received, the equipment information is intelligent equipment information bound with a user account which is logged in by a terminal, and the user information is user information corresponding to the user account;
the acquisition module is used for acquiring a scene template matched with the scene description data from a server according to the scene description data;
the second generation module is used for generating application scene data according to the scene description data and the scene template;
and the recommending module is used for recommending the application scene corresponding to the application scene data to the user.
In the embodiment of the invention, when a user submits a scene recommendation instruction to the terminal, the terminal acquires the equipment information and the user information, generates scene description data according to the equipment information and the user information, and then acquires, from the server and according to the scene description data, a scene template matched with the scene description data, so that application scene data is generated and the application scene corresponding to the application scene data is recommended to the user. In the whole application scene recommendation process, the user does not need to perform any manual setting and only needs to submit a scene recommendation instruction to the terminal to make the terminal recommend the application scene; the whole process is completed with a single operation by the user, which greatly facilitates the user's operation.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of an application scenario recommendation method according to an embodiment of the present invention;
fig. 2 is a flowchart of an application scenario recommendation method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an intelligent service listing interface provided by an embodiment of the invention;
FIG. 4 is a schematic diagram of an intelligent service client interface provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of an application scenario interface according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an application scene recommendation apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an application scene recommendation terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
To facilitate an understanding of the invention, some terms referred to in the examples of the invention are explained herein:
application scenarios: a picture formed by a plurality of intelligent devices executing a series of actions in a certain time and space. For example, a home temperature control application scenario may be constructed by performing a series of actions for a temperature sensor, an air conditioner, and a window.
Scene description data: data generated from the user information and the device information according to a specified data format, used to describe the performance, type and the like of the intelligent devices as well as the behavior, preferences, characteristics and the like of the user.
Scene template: a template generated in advance for an application scene, used to define information such as the scene matching conditions, a scene configuration template and a scene script template.
Application scene data: the configuration information and scene script of an application scene, generated by instantiating the scene description data with the scene template matched with it; it provides the technical support for running the application scene.
Service data: simple descriptive information of an application scene, used to present the user with an easy-to-understand, simple-to-configure interactive interface.
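To make the relationship among these data kinds concrete, the following minimal sketch (in Python) models them as simple types. All class and field names here are illustrative assumptions; the patent expresses these structures in XML and does not fix any programming interface.

from dataclasses import dataclass
from typing import List

@dataclass
class SceneDescriptionData:
    """Describes the performance and type of the bound intelligent devices and
    the behavior, preferences and characteristics of the user."""
    devices_xml: str
    user_xml: str

@dataclass
class SceneTemplate:
    """Generated in advance for an application scene; defines the scene matching
    conditions, a scene configuration template and a scene script template."""
    matching_conditions: List[str]
    config_template: str
    script_template: str

@dataclass
class ApplicationSceneData:
    """Produced by instantiating a matched scene template with the scene
    description data; holds the configuration information and scene script
    that allow the application scene to run."""
    configuration: str
    scene_script: str

@dataclass
class ServiceData:
    """Simple descriptive information of the application scene, used to render
    an easy-to-understand, simple-to-configure interactive interface."""
    scene_name: str
    summary: str

In the recommendation flow described below, scene description data is matched against scene templates, a matched template is instantiated into application scene data, and a user-facing subset of that data is turned into service data.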
Before explaining the embodiments of the present invention in detail, an application scenario of the embodiments of the present invention is described. When many intelligent devices bound to a user account have joined the Internet of Things, the intelligent devices need to be grouped according to the functions of the intelligent services so that they can cooperate to provide richer intelligent services for the user; the intelligent devices in the same group cooperate to provide one type of intelligent service. Before an intelligent service is provided, an application scene needs to be set for it. At present the application scene is set manually by the user, and the IFTTT rule corresponding to the application scene is also set manually by the user, which makes the operation complex; moreover, IFTTT can only realize simple logic and cannot provide richer intelligent services for the user. Therefore, the embodiment of the present invention provides an application scene recommendation method in which neither the application scene nor the actions it executes need to be set manually by the user, the operation is simple, and the recommended application scene can realize complex logic so as to provide richer intelligent services for the user.
Fig. 1 is a flowchart of an application scenario recommendation method according to an embodiment of the present invention. Referring to fig. 1, the method includes:
step 101: when a scene recommendation instruction is received, scene description data is generated according to equipment information and user information, the equipment information is intelligent equipment information bound with a user account which is logged in by a terminal, and the user information is user information corresponding to the user account.
Step 102: and acquiring a scene template matched with the scene description data from the server according to the scene description data.
Step 103: and generating application scene data according to the scene description data and the matched scene template.
Step 104: recommending the application scene corresponding to the application scene data to the user.
In the embodiment of the invention, when a user submits a scene recommendation instruction to the terminal, the terminal acquires the equipment information and the user information, generates scene description data according to the equipment information and the user information, and then acquires, from the server and according to the scene description data, a scene template matched with the scene description data, so that application scene data is generated and the application scene corresponding to the application scene data is recommended to the user. In the whole application scene recommendation process, the user does not need to perform any manual setting and only needs to submit a scene recommendation instruction to the terminal to make the terminal recommend the application scene; the whole process is completed with a single operation by the user, which greatly facilitates the user's operation.
Optionally, obtaining, from the server, a scene template matching the scene description data according to the scene description data includes:
sending a template matching request message to a server, wherein the template matching request message carries the scene description data, so that the server acquires a scene template matched with the scene description data according to the scene description data;
and receiving the scene template sent by the server.
Optionally, obtaining, from the server, a scene template matching the scene description data according to the scene description data includes:
sending a template acquisition request message to a server;
receiving a scene template list sent by a server, wherein the scene template list comprises a plurality of scene templates;
from the plurality of scene templates, a scene template matching the scene description data is acquired.
Optionally, obtaining a scene template matching the scene description data from the plurality of scene templates includes:
for each scene template of the plurality of scene templates, comparing each matching condition in the scene template with scene description data;
and if the data meeting each matching condition exists in the scene description data, determining the scene template as the scene template matched with the scene description data.
Optionally, generating application scene data according to the scene description data and the matched scene template, including:
reading each object in the scene template;
acquiring a sub-object with a specified character string from the sub-objects included in each object;
acquiring corresponding attribute values from scene description data according to the acquired identifiers of the sub-objects;
and replacing the designated character string in the sub-object with the acquired attribute value to obtain application scene data.
Optionally, recommending the application scenario corresponding to the application scenario data to the user, including:
selecting specified application scene data from the application scene data;
generating service data according to the specified application scene data;
generating an application scene interface according to the generated service data;
and displaying the application scene interface to the user.
All of the above optional technical solutions can be combined arbitrarily to form optional embodiments of the present invention, which are not described in detail herein.
Fig. 2 is a flowchart of an application scenario recommendation method according to an embodiment of the present invention. Referring to fig. 2, the method includes:
step 201: when receiving a scene recommendation instruction, the terminal sends an information acquisition request to the server, wherein the information acquisition request carries the user account number which is logged in by the terminal.
When the scene template on the server side is updated, the intelligent devices bound to the user account may be able to provide a new intelligent service; likewise, when the user binds a new intelligent device, the new intelligent device may provide a new intelligent service. The scene recommendation instruction may therefore be triggered when the intelligent service is refreshed or when the user account is successfully bound to a new intelligent device; that is, the scene recommendation instruction is triggered when the terminal receives a refresh instruction for refreshing the intelligent service, or when the terminal, having successfully bound the user account to a new intelligent device, receives the binding success response sent by the server. Of course, the scene recommendation instruction may also be triggered in other manners, for example by a designated gesture operation on the terminal screen, which is not specifically limited in this embodiment of the present invention.
The refresh instruction may be triggered by operations such as a pull-down operation, a pull-up operation, or a click on a refresh button, which is not specifically limited in this embodiment of the present invention. For example, fig. 3 is a schematic diagram of an intelligent service list interface provided by an embodiment of the present invention. Fig. 3 shows a list of the intelligent services that can be provided by the intelligent devices bound to the user account, including application scenarios such as a digital home theater, a home security system, a lighting control system, and an automatic cleaning service. The user may trigger the refresh instruction through a pull-down or pull-up operation on the intelligent service list; when fig. 3 further includes a refresh button, the user may also trigger the refresh instruction by clicking the refresh button.
Step 202: when the server receives the information acquisition request, the server acquires equipment information and user information according to the user account, and sends the equipment information and the user information to the terminal, wherein the equipment information is intelligent equipment information bound with the user account, and the user information is user information corresponding to the user account.
Specifically, when the server receives the information acquisition request, the server acquires the corresponding device information from the stored corresponding relationship between the user account and the device information according to the user account, acquires the corresponding user information from the stored corresponding relationship between the user account and the user information according to the user account, and sends the acquired device information and the acquired user information to the terminal.
The device information is the information of the intelligent devices bound to the user account; that is, the correspondence between the user account and the device information is stored when the user account is bound to a new intelligent device. Specifically, when the terminal discovers a new intelligent device, the terminal can acquire the device information of the new intelligent device and send a device binding request to the server, where the device binding request carries the device information of the new intelligent device and the user account. When the server receives the device binding request, the server binds the new intelligent device to the user account and sends a binding success response message to the terminal. The server may then store the user account and the device information of the new intelligent device in the correspondence between user accounts and device information.
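As an illustration of the binding exchange just described, the following sketch models the terminal and the server as plain Python objects. The message fields, method names and the in-memory storage are assumptions; the patent only specifies that the request carries the device information and the user account, that the server binds them and stores the correspondence, and that a binding success response is returned (which, as noted in step 201, may in turn trigger the scene recommendation instruction).

from typing import Dict, List

class Server:
    def __init__(self) -> None:
        # user account -> list of bound device information
        self.account_devices: Dict[str, List[Dict]] = {}

    def handle_device_bind_request(self, account: str, device_info: Dict) -> Dict:
        # Bind the new device to the account and store the correspondence.
        self.account_devices.setdefault(account, []).append(device_info)
        return {"type": "binding_success", "account": account,
                "device_id": device_info["id"]}

class Terminal:
    def __init__(self, account: str, server: Server) -> None:
        self.account = account
        self.server = server

    def bind_new_device(self, device_info: Dict) -> None:
        response = self.server.handle_device_bind_request(self.account, device_info)
        if response["type"] == "binding_success":
            # Receiving the binding success response may trigger a scene
            # recommendation instruction (see step 201).
            self.on_scene_recommendation_instruction()

    def on_scene_recommendation_instruction(self) -> None:
        print("scene recommendation triggered for", self.account)

server = Server()
Terminal("user-123456", server).bind_new_device({"id": "16335-6463-456", "name": "camera01"})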
The device information may include device description information, a device location, and device service information, where the device description information may include a device name, a type, a manufacturer, and the like, the device location may include a geographic location of the device, an IP (internet protocol) address, and the like, the device service information is used to describe a function that the device has, and may include a service identifier, a service type, and the like, where the service identifier is an identifier of a certain function and is set in advance, and the service type is a type to which the function belongs, such as a play class and a display class.
Optionally, after the terminal acquires the device information of the new intelligent device, the terminal may directly send a device binding request to the server, and of course, the terminal may also send a device binding request to the server when receiving a binding instruction for binding the new intelligent device with the user account.
For example, fig. 4 is a schematic diagram of an intelligent service client interface according to an embodiment of the present invention. The interface comprises a new equipment searching button, a bound equipment button and an intelligent service button, and a user can click the new equipment searching button to enable the terminal to search for new intelligent equipment and obtain equipment information of the new intelligent equipment, so that the new intelligent equipment is bound with the user account. Thereafter, the user may click the bound device button to view the binding result.
Optionally, in a general case, the number of the smart devices bound to the user account is multiple, and before the server sends the device information to the terminal, the server may combine the multiple pieces of smart device information into a device information list and send the device information list to the terminal.
The user information is the user information corresponding to the user account. It may include not only attribute information of the user but also preference information of the user, so that the generated application scene better meets the user's needs. The user attribute information may be a user identifier, location information of the place where the user is located, and the like, and the user preference information may be obtained from statistics on the user's network behaviors, for example that the user makes a cup of coffee every day and watches the news every day before sleeping.
Step 203: when the terminal receives the equipment information and the user information sent by the server, scene description data is generated according to the equipment information and the user information.
Specifically, when the terminal receives the device information and the user information sent by the server, the terminal generates scene description data from the device information and the user information according to a specified description data format.
For example, the user information includes a user identifier 123456, and the device information includes device description information, a device location and device service information: the device description information includes a device identifier 16335-6463-456, a device name camera01, a device type camera and a manufacturer Logitech; the device location includes a geographic location 5.2,1.3,3.4 of the device and an IP (Internet Protocol) address 192.168.1.3; and the device service information, which describes the functions of the device, includes a service identifier fh4h-jgtd-f25ge and a service type d4se-g53k-htdh. The terminal may then generate the scene description data from the device information and the user information according to the specified data format, as follows:
(The scene description data generated in this example is rendered in the original publication as images, figures GDA0001294947560000081 and GDA0001294947560000091, and is not reproduced as text here.)
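Since the original listing is available only as images, the following is a purely illustrative reconstruction consistent with the values of the example above. The element names and nesting are assumptions and are not taken from the patent's actual schema.

# A minimal sketch of scene description data matching the example values above.
# Element names are assumptions; the patent shows the actual XML only as images.
import xml.etree.ElementTree as ET

scene_description_xml = """
<scene_description>
  <user>
    <id>123456</id>
  </user>
  <device>
    <description>
      <id>16335-6463-456</id>
      <name>camera01</name>
      <type>camera</type>
      <manufacturer>Logitech</manufacturer>
    </description>
    <location>
      <geo>5.2,1.3,3.4</geo>
      <ip>192.168.1.3</ip>
    </location>
    <service>
      <id>fh4h-jgtd-f25ge</id>
      <type>d4se-g53k-htdh</type>
    </service>
  </device>
</scene_description>
"""

root = ET.fromstring(scene_description_xml)
print(root.find("./device/description/name").text)  # camera01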
Step 204: And the terminal acquires a scene template matched with the scene description data from the server according to the scene description data.
Since the scene template list is stored in the server and includes a plurality of scene templates, the terminal may obtain, according to the scene description data, a scene template matching the scene description data from the server in two ways:
in the first mode, a terminal sends a template matching request message to a server, wherein the template matching request message carries scene description data, so that the server acquires a scene template matched with the scene description data according to the scene description data; and the terminal receives the scene template sent by the server.
Specifically, a terminal sends a template matching request message to a server, the template matching request message carries scene description data, when the server receives the template matching request message, the server obtains a scene template matched with the scene description data from a plurality of scene templates included in the scene template list according to the scene description data, and sends the matched scene template to the terminal. And then, the terminal receives the scene template sent by the server and determines the received scene template as the scene template matched with the scene description data.
The specific operation of the server obtaining the scene template matched with the scene description data from the plurality of scene templates included in the scene template list according to the scene description data may be: for each scene template in the plurality of scene templates, the server compares each matching condition in the scene template with the scene description data; and if the scene description data contains data meeting each matching condition, determining the scene template as the scene template matched with the scene description data.
In the second mode, the terminal sends a template acquisition request message to the server, so that the server acquires a scene template list; a terminal receives a scene template list sent by a server, wherein the scene template list comprises a plurality of scene templates; and the terminal acquires the scene template matched with the scene description data from the plurality of scene templates.
Specifically, the terminal sends a template acquisition request message to the server; and when the server receives the template acquisition request message, the server acquires a stored scene template list and sends the scene template list to the terminal. When the terminal receives the scene template list sent by the server, the terminal acquires the scene template matched with the scene description data from the scene templates.
The specific operation of the terminal obtaining the scene template matched with the scene description data from the plurality of scene templates may be: for each scene template in the plurality of scene templates, the terminal compares each matching condition in the scene template with the scene description data; and if the scene description data contains data meeting each matching condition, determining the scene template as the scene template matched with the scene description data.
It should be noted that, when a scene template matching the scene description data is acquired, the scene description data needs to be compared with each matching condition included in the scene template. Because some matching conditions impose natural-language constraints on the functions of the device or on the preferences of the user, the comparison between the scene description data and each matching condition included in the scene template can be performed semantically before determining whether data meeting the matching conditions exists in the scene description data, which improves the matching success rate.
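A minimal sketch of this matching step is shown below, under the assumption that the scene description data and each template's matching conditions can be reduced to simple key/value pairs; the pluggable satisfies predicate stands in for the semantic comparison mentioned above.

from typing import Callable, Dict, List

SceneDescription = Dict[str, str]   # e.g. {"device.type": "camera", "user.preference": "news"}
MatchCondition = Dict[str, str]     # e.g. {"device.type": "camera"}

def default_satisfies(description: SceneDescription, condition: MatchCondition) -> bool:
    # Data satisfying the condition exists if every required key is present
    # with the required value.
    return all(description.get(key) == value for key, value in condition.items())

def matching_templates(
    description: SceneDescription,
    templates: List[Dict],
    satisfies: Callable[[SceneDescription, MatchCondition], bool] = default_satisfies,
) -> List[Dict]:
    matched = []
    for template in templates:
        # The template matches only if the description satisfies every one of
        # its matching conditions (step 204).
        if all(satisfies(description, cond) for cond in template["matching_conditions"]):
            matched.append(template)
    return matched

templates = [
    {"name": "home security system", "matching_conditions": [{"device.type": "camera"}]},
    {"name": "digital home theater", "matching_conditions": [{"device.type": "television"}]},
]
print([t["name"] for t in matching_templates({"device.type": "camera"}, templates)])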
Further, in the embodiment of the present invention, there may be multiple intelligent devices bound to the user account. Since the functions of these intelligent devices may differ, the terminal may obtain, from the server and according to the scene description data, multiple scene templates matching the scene description data, so that the intelligent devices are divided into multiple groups. For example, the intelligent devices bound to the user account include: balcony camera A, balcony camera B, a living room camera, a motion sensor, a storage device, a digital television, a set-top box, a speaker, a living room lamp, a bedroom lamp, a kitchen lamp and a bathroom lamp. After scene description data is generated from the device information of these intelligent devices and the user information, it is determined according to the scene description data that the matching scene template corresponding to balcony camera A, balcony camera B, the living room camera, the motion sensor and the storage device is the home security system scene template, the matching scene template corresponding to the digital television, the set-top box and the speaker is the digital home theater scene template, and the matching scene template corresponding to the living room lamp, the bedroom lamp, the kitchen lamp and the bathroom lamp is the lighting control system scene template.
Step 205: and the terminal generates application scene data according to the scene description data and the matched scene template.
Specifically, the terminal reads each object in the matched scene template; acquiring a sub-object with a specified character string from the sub-objects included in each object; acquiring corresponding attribute values from scene description data according to the acquired identifiers of the sub-objects; and replacing the designated character string in the sub-object with the acquired attribute value to obtain application scene data.
For example, the specified character string takes the form $$...$$. The terminal reads the service object <service_tmpl> from the scene template of the home security system; the sub-objects included in the service object are the service identifier <id>$$service_id$$</id> and the service type <type>d4fd25be-ab9e-98ed-0371-c3cb1d485ee6</type>. The terminal determines that the specified character string exists in the service identifier sub-object but does not exist in the service type sub-object. Then, according to the identifier service_id of the service identifier sub-object, the terminal obtains the corresponding attribute value fe1aee70-fhigw_353dg from the scene description data and replaces the specified character string in the sub-object with this value, so that the service identifier becomes <id>fe1aee70-fhigw_353dg</id>. Each sub-object in the scene template is scanned in the same way to obtain the application scene data of the home security system.
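The instantiation of step 205 can be sketched as follows, assuming the scene template is XML and placeholders use the $$name$$ form of the example above; the element names and the helper itself are illustrative rather than the patent's implementation.

# A minimal sketch of instantiating a scene template into application scene data.
import re
import xml.etree.ElementTree as ET
from typing import Dict

PLACEHOLDER = re.compile(r"\$\$(\w+)\$\$")

def instantiate_template(template_xml: str, description_values: Dict[str, str]) -> str:
    root = ET.fromstring(template_xml)
    for obj in root:                        # each object in the scene template
        for sub in obj:                     # each sub-object of that object
            text = sub.text or ""
            match = PLACEHOLDER.search(text)
            if match:                       # the sub-object carries the specified string
                key = match.group(1)        # e.g. "service_id"
                value = description_values[key]   # attribute value from scene description data
                sub.text = PLACEHOLDER.sub(value, text)
    return ET.tostring(root, encoding="unicode")

template = """
<scene_tmpl>
  <service_tmpl>
    <id>$$service_id$$</id>
    <type>d4fd25be-ab9e-98ed-0371-c3cb1d485ee6</type>
  </service_tmpl>
</scene_tmpl>
"""

print(instantiate_template(template, {"service_id": "fe1aee70-fhigw_353dg"}))

Running the example prints the instantiated service object with <id>fe1aee70-fhigw_353dg</id>, the application scene data fragment described above.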
Step 206: and the terminal recommends the application scene corresponding to the application scene data to the user.
When the application scene is recommended to the user, data that is easy for the user to understand needs to be selected from the application scene data, namely the specified application scene data. The terminal therefore selects the specified application scene data from the application scene data, generates service data according to the specified application scene data, generates an application scene interface according to the service data, and displays the application scene interface to the user. For example, the terminal generates the application scene interface shown in fig. 5 according to the service data, where the name of the application scene is the home security system and the intelligent devices included in the application scene are balcony camera A, balcony camera B, the living room camera, the motion sensor and the storage device.
The specific operation of the terminal generating the service data according to the specified application scenario data may be: and the terminal generates service data according to the specified application scene data and the service data format.
It should be noted that, after the terminal displays the application scene interface to the user, the application scene interface includes not only a switch for the intelligent service but also a switch corresponding to each intelligent device, and the user can choose whether to turn on an intelligent device through its switch. For the intelligent service corresponding to the application scene, however, some devices are necessary and some are optional: when the intelligent service corresponding to the application scene is turned on, the necessary devices must also be turned on, while the optional devices may be turned on or off.
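A minimal sketch of selecting the user-facing part of the application scene data and generating service data for the interface is shown below. The dictionary fields are assumptions; the patent only states that simple descriptive information is selected, formatted as service data and rendered as the application scene interface, with necessary devices switched on together with the service.

from typing import Dict

def build_service_data(application_scene: Dict) -> Dict:
    # Keep only the simple descriptive information a user needs to understand
    # and configure the scene; drop the scene script and low-level configuration.
    return {
        "scene_name": application_scene["name"],
        "devices": [d["name"] for d in application_scene["devices"]],
        # necessary devices must be switched on together with the service
        "required_devices": [d["name"] for d in application_scene["devices"] if d.get("required")],
    }

scene = {
    "name": "Home security system",
    "devices": [
        {"name": "balcony camera A", "required": True},
        {"name": "balcony camera B", "required": True},
        {"name": "living room camera", "required": True},
        {"name": "motion sensor", "required": False},
        {"name": "storage device", "required": True},
    ],
    "scene_script": "<script>...</script>",
}
print(build_service_data(scene))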
In the embodiment of the invention, the scene description data, the scene template, the application scene data and the service data are all expressed in an XML language, which facilitates data exchange and data extension. Of course, other languages may be used for the representation as long as the same effect can be achieved.
In the embodiment of the invention, when a user submits a scene recommendation instruction to the terminal, the terminal acquires the equipment information and the user information, generates scene description data according to the equipment information and the user information, and then acquires, from the server and according to the scene description data, a scene template matched with the scene description data, so that application scene data is generated and the application scene corresponding to the application scene data is recommended to the user. In the whole application scene recommendation process, the user does not need to perform any manual setting and only needs to submit a scene recommendation instruction to the terminal to make the terminal recommend the application scene; the whole process is completed with a single operation by the user, which greatly facilitates the user's operation.
Fig. 6 is a schematic structural diagram of an application scenario recommendation device according to an embodiment of the present invention. Referring to fig. 6, the apparatus includes: a first generation module 601, an acquisition module 602, a second generation module 603 and a recommendation module 604.
The first generating module 601 is configured to generate scene description data according to device information and user information when a scene recommendation instruction is received, where the device information is intelligent device information bound to a user account that a terminal has logged in, and the user information is user information corresponding to the user account;
an obtaining module 602, configured to obtain, according to scene description data, a scene template matched with the scene description data from a server;
a second generating module 603, configured to generate application scene data according to the scene description data and the matched scene template;
and a recommending module 604, configured to recommend the application scenario corresponding to the application scenario data to the user.
Optionally, the obtaining module 602 includes:
a first sending unit, configured to send a template matching request message to a server, where the template matching request message carries the scene description data, so that the server obtains a scene template matched with the scene description data according to the scene description data;
and the first receiving unit is used for receiving the scene template sent by the server.
Optionally, the obtaining module 602 includes:
a second sending unit, configured to send a template acquisition request message to the server;
the second receiving unit is used for receiving a scene template list sent by the server, and the scene template list comprises a plurality of scene templates;
a first obtaining unit, configured to obtain a scene template matching the scene description data from the plurality of scene templates.
Optionally, the obtaining unit includes:
a comparison subunit, configured to, for each scene template in the plurality of scene templates, compare each matching condition in the scene template with the scene description data;
and the determining subunit is used for determining the scene template as the scene template matched with the scene description data if the data meeting each matching condition exists in the scene description data.
Optionally, the second generating module 603 includes:
a reading unit, configured to read each object in the scene template;
a second acquisition unit configured to acquire a child object in which a specified character string exists, from child objects included in each object;
a third obtaining unit, configured to obtain a corresponding attribute value from the scene description data according to the obtained identifier of the sub-object;
and the replacing unit is used for replacing the designated character string in the sub-object with the acquired attribute value to obtain the application scene data.
Optionally, the recommending module 604 comprises:
a selection unit configured to select specified application scene data from the application scene data;
a first generating unit, configured to generate service data according to the selected specified application scenario data;
the second generation unit is used for generating an application scene interface according to the generated service data;
and the display unit is used for displaying the application scene interface to a user.
In the embodiment of the invention, when a user submits a scene recommendation instruction to the terminal, the terminal acquires the equipment information and the user information, generates scene description data according to the equipment information and the user information, and then acquires, from the server and according to the scene description data, a scene template matched with the scene description data, so that application scene data is generated and the application scene corresponding to the application scene data is recommended to the user. In the whole application scene recommendation process, the user does not need to perform any manual setting and only needs to submit a scene recommendation instruction to the terminal to make the terminal recommend the application scene; the whole process is completed with a single operation by the user, which greatly facilitates the user's operation.
It should be noted that: in the application scene recommendation apparatus provided in the foregoing embodiment, only the division of the functional modules is illustrated in the application scene recommendation, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the functions described above. In addition, the application scene recommendation device and the application scene recommendation method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
Referring to fig. 7, a schematic structural diagram of a terminal with a touch-sensitive surface according to an embodiment of the present invention is shown; the terminal may be used to implement the application scene recommendation method provided in the foregoing embodiments. Specifically:
the terminal 700 may include RF (Radio Frequency) circuitry 110, memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, audio circuitry 160, a WiFi (wireless fidelity) module 170, a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the terminal structure shown in fig. 7 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information from a base station and then sends the received downlink information to the one or more processors 180 for processing; in addition, data relating to uplink is transmitted to the base station. In general, the RF circuitry 110 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (short messaging Service), etc.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing by operating the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal 700, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 131 (e.g., operations by a user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 131 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. Additionally, the touch-sensitive surface 131 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. In particular, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to a user and various graphical user interfaces of the terminal 700, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141, and when a touch operation is detected on or near the touch-sensitive surface 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 7, touch-sensitive surface 131 and display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 131 may be integrated with display panel 141 to implement input and output functions.
The terminal 700 can also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 141 and/or a backlight when the terminal 700 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the terminal 700, detailed descriptions thereof are omitted.
Audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between a user and terminal 700. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161; on the other hand, the microphone 162 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 160, and then outputs the audio data to the processor 180 for processing, and then to the RF circuit 110 to be transmitted to, for example, another terminal, or outputs the audio data to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to provide communication of a peripheral headset with the terminal 700.
WiFi belongs to a short-distance wireless transmission technology, and the terminal 700 can help a user send and receive e-mails, browse web pages, access streaming media, and the like through the WiFi module 170, and provides wireless broadband internet access for the user. Although fig. 7 shows the WiFi module 170, it is understood that it does not belong to the essential constitution of the terminal 700 and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 180 is a control center of the terminal 700, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal 700 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the mobile phone. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal 700 also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 180 via a power management system to manage charging, discharging, and power consumption management functions via the power management system. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal 700 may further include a camera, a bluetooth module, etc., which will not be described herein. Specifically, in this embodiment, the display unit of the terminal is a touch screen display, the terminal further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
when a scene recommendation instruction is received, scene description data is generated according to equipment information and user information, the equipment information is intelligent equipment information bound with a user account which is logged in by a terminal, and the user information is user information corresponding to the user account.
And acquiring a scene template matched with the scene description data from the server according to the scene description data.
And generating application scene data according to the scene description data and the matched scene template.
Recommending the application scene corresponding to the application scene data to the user.
Optionally, obtaining, from the server, a scene template matching the scene description data according to the scene description data includes:
sending a template matching request message to a server, wherein the template matching request message carries the scene description data, so that the server acquires a scene template matched with the scene description data according to the scene description data;
and receiving the scene template sent by the server.
Optionally, obtaining, from the server, a scene template matching the scene description data according to the scene description data includes:
sending a template acquisition request message to a server;
receiving a scene template list sent by a server, wherein the scene template list comprises a plurality of scene templates;
from the plurality of scene templates, a scene template matching the scene description data is acquired.
Optionally, obtaining a scene template matching the scene description data from the plurality of scene templates includes:
for each scene template of the plurality of scene templates, comparing each matching condition in the scene template with scene description data;
and if the data meeting each matching condition exists in the scene description data, determining the scene template as the scene template matched with the scene description data.
Optionally, generating application scene data according to the scene description data and the matched scene template, including:
reading each object in the scene template;
acquiring a sub-object with a specified character string from the sub-objects included in each object;
acquiring corresponding attribute values from scene description data according to the acquired identifiers of the sub-objects;
and replacing the designated character string in the sub-object with the acquired attribute value to obtain application scene data.
Optionally, recommending the application scenario corresponding to the application scenario data to the user, including:
selecting specified application scene data from the application scene data;
generating service data according to the specified application scene data;
generating an application scene interface according to the generated service data;
and displaying the application scene interface to the user.
In the embodiment of the invention, when a user submits a scene recommendation instruction to the terminal, the terminal acquires the equipment information and the user information, generates scene description data according to the equipment information and the user information, and then acquires, from the server and according to the scene description data, a scene template matched with the scene description data, so that application scene data is generated and the application scene corresponding to the application scene data is recommended to the user. In the whole application scene recommendation process, the user does not need to perform any manual setting and only needs to submit a scene recommendation instruction to the terminal to make the terminal recommend the application scene; the whole process is completed with a single operation by the user, which greatly facilitates the user's operation.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. An application scenario recommendation method is applied to a terminal for controlling a plurality of intelligent devices in an internet of things, wherein the internet of things is used for providing at least one intelligent service, and the method comprises the following steps:
when a scene recommendation instruction is received, generating scene description data according to a specified description data format according to device information and user information, wherein the scene description data is used for describing the performance and the type of intelligent devices and describing the behaviors, the characteristics and the preferences of users, the device information is intelligent device information bound with a user account which is logged in by the terminal, a plurality of intelligent devices bound with the user account are provided, the user information is user information corresponding to the user account, the user information comprises attribute information of the user and preference information of the user, the attribute information of the user comprises an identifier of the user and position information of the user, the preference information of the user is counted based on the network behaviors of the user, and the scene recommendation instruction is triggered when the terminal receives a refreshing intelligent service instruction, or the binding between the terminal and the intelligent equipment added to the internet of things is successful, and the scene description data is used for indicating the equipment description information, the equipment position information and the equipment service information of each intelligent equipment in the internet of things;
acquiring all scene templates matched with the scene description data from a server according to the scene description data;
reading each object in the scene template; acquiring, from the sub-objects included in each object, the sub-objects in which a specified character string exists; acquiring corresponding attribute values from the scene description data according to the identifiers of the acquired sub-objects; and replacing the specified character string in each sub-object with the corresponding attribute value to obtain application scene data;
recommending the application scene corresponding to the application scene data to the user.
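As an illustration only, the sketch below shows one way the substitution steps of claim 1 could be realized, assuming JSON-like templates, the literal marker "${auto}" as the specified character string, and sub-object identifiers that are dotted paths naming attributes in the scene description data; none of these format details are fixed by the claim.

```python
# Assumptions (not fixed by the claim): templates are JSON-like dicts, the "specified
# character string" is the literal marker "${auto}", and a sub-object's identifier is
# its key, interpreted as a dotted path into the scene description data.
SPECIFIED_STRING = "${auto}"

def lookup(description, identifier):
    """Resolve a dotted identifier such as 'devices.ac.id' inside the scene description data."""
    value = description
    for key in identifier.split("."):
        value = value[key]
    return value

def fill_template(template, description):
    """Claim-1 style substitution: scan every object's sub-objects and replace the
    specified character string with the attribute value named by the sub-object's identifier."""
    scene = {}
    for obj_name, obj in template.items():        # each object in the scene template
        if not isinstance(obj, dict):             # keep non-object entries unchanged
            scene[obj_name] = obj
            continue
        filled = {}
        for sub_id, sub_value in obj.items():     # sub-objects included in the object
            if sub_value == SPECIFIED_STRING:     # sub-object in which the specified string exists
                filled[sub_id] = lookup(description, sub_id)
            else:
                filled[sub_id] = sub_value
        scene[obj_name] = filled
    return scene

# Example with hypothetical data:
template = {"air_conditioner": {"devices.ac.id": SPECIFIED_STRING, "action": "power_on"}}
description = {"devices": {"ac": {"id": "AC-01"}}, "user": {"location": "home"}}
print(fill_template(template, description))
# -> {'air_conditioner': {'devices.ac.id': 'AC-01', 'action': 'power_on'}}
```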
2. The method of claim 1, wherein the obtaining the scene template matching the scene description data from the server according to the scene description data comprises:
sending a template matching request message to a server, wherein the template matching request message carries the scene description data, so that the server acquires a scene template matched with the scene description data according to the scene description data;
and receiving the scene template sent by the server.
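A minimal client-side sketch of the exchange in claim 2 follows, assuming an HTTP/JSON transport and a hypothetical /scene-templates/match endpoint; the claim itself only requires that the request carries the scene description data and that the server returns the matched templates.

```python
import json
import urllib.request

# Hypothetical endpoint and message format; neither is specified by the claim.
MATCH_URL = "http://example-server.local/scene-templates/match"

def request_matching_templates(description):
    """Send a template matching request carrying the scene description data and
    receive the scene templates that the server judged to match it."""
    body = json.dumps({"scene_description": description}).encode("utf-8")
    request = urllib.request.Request(
        MATCH_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))["templates"]
```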
3. The method of claim 1, wherein the obtaining the scene template matching the scene description data from the server according to the scene description data comprises:
sending a template acquisition request message to a server;
receiving a scene template list sent by the server, wherein the scene template list comprises a plurality of scene templates;
and acquiring a scene template matched with the scene description data from the scene templates.
4. The method of claim 3, wherein said obtaining a scene template from said plurality of scene templates that matches said scene description data comprises:
for each scene template of the plurality of scene templates, comparing each matching condition in the scene template to the scene description data;
and if the scene description data contains data meeting each matching condition, determining the scene template as the scene template matched with the scene description data.
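The sketch below combines claims 3 and 4: the terminal receives a scene template list from the server and keeps a template only if the scene description data satisfies every one of its matching conditions. The (dotted field, expected value) condition format is an assumption; the claims leave the concrete format open.

```python
# Assumed condition format: each scene template carries a "matching_conditions" list of
# (dotted field, expected value) pairs; the claims do not prescribe this representation.

def field_value(description, dotted_field):
    """Fetch a dotted field such as 'user.location' from the scene description data."""
    value = description
    for key in dotted_field.split("."):
        if not isinstance(value, dict) or key not in value:
            return None
        value = value[key]
    return value

def template_matches(template, description):
    """A template matches only if the description contains data satisfying every condition."""
    return all(
        field_value(description, field) == expected
        for field, expected in template.get("matching_conditions", [])
    )

def select_templates(template_list, description):
    """Claim-3 style local selection from the scene template list sent by the server."""
    return [t for t in template_list if template_matches(t, description)]

# Example with hypothetical data:
templates = [
    {"name": "come_home_warm", "matching_conditions": [("devices.ac.type", "air_conditioner")]},
    {"name": "laundry_done",   "matching_conditions": [("devices.washer.type", "washer")]},
]
description = {"devices": {"ac": {"type": "air_conditioner"}}, "user": {"location": "home"}}
print([t["name"] for t in select_templates(templates, description)])   # ['come_home_warm']
```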
5. The method of claim 1, wherein recommending the application scenario corresponding to the application scenario data to the user comprises:
selecting specified application scene data from the application scene data;
generating service data according to the specified application scene data;
generating an application scene interface according to the service data;
and displaying the application scene interface to a user.
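As a rough illustration of claim 5, the sketch below turns selected application scene data into service data and generates a trivially simple textual interface from it; a real client would presumably render a native interface, and all field names here are assumptions.

```python
# Hypothetical rendering of application scenes as plain-text cards; the field names
# ("title", "actions") are assumptions, not part of the disclosure.

def build_service_data(selected_scenes):
    """Derive service data (the actions to offer) from the selected application scene data."""
    return [
        {"title": scene.get("title", "Recommended scene"),
         "actions": scene.get("actions", [])}
        for scene in selected_scenes
    ]

def render_interface(service_data):
    """Generate a minimal 'application scene interface' from the service data."""
    lines = []
    for item in service_data:
        lines.append(f"[{item['title']}]")
        lines.extend(f"  - {action}" for action in item["actions"])
    return "\n".join(lines)

# Example with hypothetical data:
scenes = [{"title": "Warm welcome home", "actions": ["turn on air conditioner", "set 26°C"]}]
print(render_interface(build_service_data(scenes)))
```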
6. An application scene recommendation device, applied to a terminal for controlling a plurality of intelligent devices in an internet of things, the internet of things being used for providing at least one intelligent service, the device comprising:
a first generation module, configured to generate scene description data in a specified description data format according to device information and user information when a scene recommendation instruction is received, wherein the scene description data is used for describing the performance and type of the intelligent devices and for describing the behaviors, characteristics and preferences of a user, the device information is information about the intelligent devices bound to a user account to which the terminal has logged in, a plurality of intelligent devices are bound to the user account, the user information is user information corresponding to the user account and comprises attribute information of the user and preference information of the user, the attribute information of the user comprises an identifier of the user and location information of the user, the preference information of the user is obtained through statistics on the network behaviors of the user, the scene recommendation instruction is triggered when the terminal receives an instruction to refresh the intelligent service or when the terminal is successfully bound to an intelligent device added to the internet of things, and the scene description data is used for indicating device description information, device location information and device service information of each intelligent device in the internet of things;
the acquisition module is used for acquiring all scene templates matched with the scene description data from a server according to the scene description data;
the second generation module is used for generating application scene data according to the scene description data and the scene template;
the recommending module is used for recommending the application scene corresponding to the application scene data to the user; the second generation module comprises: a reading unit, configured to read each object in the scene template; a second obtaining unit, configured to obtain, from the sub-objects included in each object, the sub-objects in which a specified character string exists; a third obtaining unit, configured to obtain corresponding attribute values from the scene description data according to the identifiers of the obtained sub-objects; and a replacing unit, configured to replace the specified character string in each sub-object with the corresponding attribute value to obtain application scene data.
7. The apparatus of claim 6, wherein the acquisition module comprises:
a first sending unit, configured to send a template matching request message to a server, where the template matching request message carries the scene description data, so that the server obtains a scene template matched with the scene description data according to the scene description data;
and the first receiving unit is used for receiving the scene template sent by the server.
8. The apparatus of claim 6, wherein the acquisition module comprises:
a second sending unit, configured to send a template acquisition request message to the server;
a second receiving unit, configured to receive a scene template list sent by the server, where the scene template list includes a plurality of scene templates;
a first obtaining unit, configured to obtain, from the plurality of scene templates, a scene template that matches the scene description data.
9. The apparatus of claim 8, wherein the first obtaining unit comprises:
a comparison subunit, configured to, for each scene template of the plurality of scene templates, compare each matching condition in the scene template with the scene description data;
a determining subunit, configured to determine, if the scene description data contains data satisfying each matching condition, that the scene template is a scene template matched with the scene description data.
10. The apparatus of claim 6, wherein the recommendation module comprises:
a selection unit configured to select specified application scene data from the application scene data;
a first generating unit, configured to generate service data according to the specified application scenario data;
the second generation unit is used for generating an application scene interface according to the service data;
and the display unit is used for displaying the application scene interface to a user.
CN201410606347.0A 2014-10-30 2014-10-30 Application scene recommendation method and device Active CN105634881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410606347.0A CN105634881B (en) 2014-10-30 2014-10-30 Application scene recommendation method and device

Publications (2)

Publication Number Publication Date
CN105634881A (en) 2016-06-01
CN105634881B (en) 2020-07-07

Family

ID=56049411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410606347.0A Active CN105634881B (en) 2014-10-30 2014-10-30 Application scene recommendation method and device

Country Status (1)

Country Link
CN (1) CN105634881B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110018861B (en) * 2018-01-08 2023-05-05 视联动力信息技术股份有限公司 Message prompting method and device for application program
JP7136632B2 (en) * 2018-08-31 2022-09-13 東芝ライフスタイル株式会社 Remote control device and program
CN109407527B (en) * 2018-09-14 2022-05-17 深圳绿米联创科技有限公司 Method and device for realizing intelligent equipment recommendation
CN111127127B (en) * 2018-10-31 2023-06-30 阿里巴巴集团控股有限公司 Intelligent equipment information processing method, device and system
CN109299384B (en) * 2018-11-02 2021-05-04 北京小米智能科技有限公司 Scene recommendation method, device and system and storage medium
CN109670106B (en) * 2018-12-06 2022-03-11 百度在线网络技术(北京)有限公司 Scene-based object recommendation method and device
CN111866037A (en) * 2019-04-24 2020-10-30 青岛海尔洗衣机有限公司 Control method of clothes treatment system based on Internet of things
WO2020228032A1 (en) * 2019-05-16 2020-11-19 深圳市欢太科技有限公司 Scene pushing method, apparatus and system, and electronic device and storage medium
CN110781361B (en) * 2019-10-23 2023-05-02 芜湖盟博科技有限公司 Method for intelligent scene with infinite nested sub-scene
CN111880791A (en) * 2020-06-30 2020-11-03 海尔优家智能科技(北京)有限公司 Scene instance development method and device
CN112269473B (en) * 2020-12-23 2021-05-11 深圳市蓝凌软件股份有限公司 Man-machine interaction method and system based on flexible scene definition
CN112596410A (en) * 2020-12-24 2021-04-02 深圳市欧瑞博科技股份有限公司 Function updating method and device of intelligent switch, electronic equipment and storage medium
CN113094589A (en) * 2021-04-30 2021-07-09 中国银行股份有限公司 Intelligent service recommendation method and device
CN113139132A (en) * 2021-05-19 2021-07-20 云米互联科技(广东)有限公司 HomeMap-based distribution network and scene automatic recommendation method and device
CN113433832A (en) * 2021-06-29 2021-09-24 青岛海尔科技有限公司 Scene application method and device, storage medium and electronic equipment
CN115695171A (en) * 2021-07-30 2023-02-03 青岛海尔科技有限公司 Scene configuration method and device of panel equipment, storage medium and electronic device
CN113885344A (en) * 2021-10-28 2022-01-04 四川虹美智能科技有限公司 Deployment method, device and system of smart home scene
CN114153152A (en) * 2021-11-23 2022-03-08 深圳海智创科技有限公司 Automatic scene configuration method and system applied to intelligent household equipment
CN116804854A (en) * 2022-03-18 2023-09-26 华为技术有限公司 Intelligent device control method and electronic device
CN115361247B (en) * 2022-07-05 2023-12-08 芜湖美的厨卫电器制造有限公司 Scene recommendation method and device, storage medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102957742A (en) * 2012-10-18 2013-03-06 北京天宇朗通通信设备股份有限公司 Data pushing method and device
CN103248658A (en) * 2012-02-10 2013-08-14 富士通株式会社 Service recommendation device, service recommendation method and mobile device
CN103942021A (en) * 2014-03-24 2014-07-23 华为技术有限公司 Method for presenting content, method for pushing content presenting modes and intelligent terminal
CN104063462A (en) * 2014-06-25 2014-09-24 北京智谷睿拓技术服务有限公司 Information communication method, device and system
CN104063457A (en) * 2014-06-25 2014-09-24 北京智谷睿拓技术服务有限公司 Information communication method, system and terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013145265A (en) * 2012-01-13 2013-07-25 Sony Corp Server, terminal device for learning, and learning content management method


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant