CN111176503A - Interactive system setting method, device, and storage medium - Google Patents

Interactive system setting method, device, and storage medium

Info

Publication number
CN111176503A
CN111176503A
Authority
CN
China
Prior art keywords
keywords
user
interactive
interaction
body temperature
Prior art date
Legal status
Pending
Application number
CN201911295362.7A
Other languages
Chinese (zh)
Inventor
吴妙瑜
杨慧敏
龚韵遥
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai
Priority to CN201911295362.7A
Publication of CN111176503A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an interactive system setting method, an interactive system setting device and a storage medium, wherein the method comprises the following steps: acquiring an interactive medium selected by a user; receiving keywords of a preset type input by the user; acquiring first facial expression data and/or first body temperature distribution data of the user, and/or acquiring current first weather information; and configuring a corresponding interactive system based on the interactive medium and the preset type of keywords, in combination with the first facial expression data, the first body temperature distribution data and/or the first weather information. The scheme provided by the invention enables the user to quickly configure the system according to his or her own requirements and obtain the required functions.

Description

Interactive system setting method, device, and storage medium
Technical Field
The present invention relates to the field of interactive systems, and in particular, to a method and an apparatus for setting an interactive system, and a storage medium.
Background
In the current G-IEMS local area energy Internet system, a developer configures function permissions for a user account in the background according to user requirements. This configuration requires professional operation, and the user can only see the system functions and interfaces after the account permissions have been enabled. Some model selection systems currently exist in which a user can select functions, but the interface and interaction mode of the system functions are fixed and cannot be changed, so the personalized experience is poor.
Disclosure of Invention
The present invention is directed to overcoming the drawbacks of the prior art, and provides a method, an apparatus and a storage medium for setting an interactive system, so as to solve the problem that the functional interface and interaction mode of an interactive system in the prior art are fixed and cannot be changed.
In one aspect, the invention provides an interactive system setting method, which comprises the following steps: acquiring an interactive medium selected by a user; receiving keywords of a preset type input by the user; acquiring first facial expression data and/or first body temperature distribution data of the user, and/or acquiring current first weather information; and configuring a corresponding interactive system based on the interactive medium and the preset type of keywords, in combination with the first facial expression data, the first body temperature distribution data and/or the first weather information.
Optionally, the interactive medium comprises: APP media, computer media and/or touch screen terminal media; and/or, the preset type of keywords comprise: application scenario keywords, function keywords, and/or emotion keywords.
Optionally, configuring a corresponding interaction system based on the interaction medium and the preset type of keywords, in combination with the first facial expression data, the first body temperature distribution data, and/or the first weather information, includes: configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords; and setting a color change rule and/or an interaction mode of the interactive interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information; the interaction mode comprises: a page switching manner and/or a page switching speed.
Optionally, the preset type of keywords include: application scenario keywords, function keywords and/or emotion keywords; configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords comprises: configuring the interactive interface of the corresponding interactive system according to the interactive media, the application scenario keywords and the function keywords; and/or setting a color change rule and/or an interaction mode of the interactive interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information comprises: setting the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first facial expression data and/or the first body temperature distribution data; and/or setting the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first weather information; and/or setting the color change rule and/or the interaction mode of the interactive interface according to the first facial expression data and/or the first body temperature distribution data in combination with the first weather information.
Optionally, the method further comprises: receiving a change command for changing the color change rule and/or the interaction mode of the interaction interface in the use process of the current interaction system; when the change command is received, receiving emotion keywords input by a user based on the change command, acquiring second facial expression data and/or second body temperature distribution data of the user, and/or acquiring current second weather information; and resetting the color change rule and/or the interaction mode of the interaction interface according to the emotion keywords and by combining the second facial expression data, the second body temperature distribution data and/or the second weather information.
Optionally, receiving a preset type of keyword input by the user includes: displaying, for each preset type, at least one preset candidate keyword; and receiving the candidate keyword selected by the user for each preset type.
Optionally, the method further comprises: collecting historical configuration data of the user's interactive system; and setting a common menu bar corresponding to the user according to the historical configuration data, wherein the common menu bar is used for displaying the interactive page and/or system function commonly used by the user.
In another aspect, the present invention provides an interactive system setting apparatus, including: a first acquisition unit, used for acquiring the interactive media selected and used by the user; a first receiving unit, used for receiving keywords of preset types input by the user; a collecting unit, used for collecting first facial expression data and/or first body temperature distribution data of the user; a second acquisition unit, used for acquiring current first weather information; and a configuration unit, used for configuring a corresponding interactive system based on the interactive media and the preset type of keywords, in combination with the first facial expression data, the first body temperature distribution data and/or the first weather information.
Optionally, the interactive medium comprises: APP media, computer media and/or touch screen terminal media; and/or, the preset type of keywords comprise: application scenario keywords, function keywords, and/or emotion keywords.
Optionally, the configuration unit includes: a configuration subunit, used for configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords; and a setting subunit, used for setting a color change rule and/or an interaction mode of the interactive interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information; the interaction mode comprises: a page switching manner and/or a page switching speed.
Optionally, the preset type of keywords include: application scenario keywords, function keywords and/or emotion keywords; the configuration subunit configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords comprises: configuring the interactive interface of the corresponding interactive system according to the interactive media, the application scenario keywords and the function keywords; and/or the setting subunit setting a color change rule and/or an interaction mode of the interactive interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information comprises: setting the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first facial expression data and/or the first body temperature distribution data; and/or setting the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first weather information; and/or setting the color change rule and/or the interaction mode of the interactive interface according to the first facial expression data and/or the first body temperature distribution data in combination with the first weather information.
Optionally, the method further comprises: the second receiving unit is used for receiving a change command for changing the color change rule and/or the interaction mode of the interaction interface in the use process of the current interaction system; the first receiving unit is further configured to: when the second receiving unit receives the change command, receiving emotion keywords input by a user based on the change command; the acquisition unit is further configured to: when the second receiving unit receives the change command, acquiring second facial expression data and/or second body temperature distribution data of the user, and/or the second acquiring unit, further configured to: acquiring current second weather information; the configuration unit is further configured to: and resetting the color change rule and/or the interaction mode of the interaction interface according to the emotion keywords and by combining the second facial expression data, the second body temperature distribution data and/or the second weather information.
Optionally, the first receiving unit receiving a preset type of keyword input by the user includes: displaying, for each preset type, at least one preset candidate keyword; and receiving the candidate keyword selected by the user for each preset type.
Optionally, the method further comprises: the collection unit is used for collecting historical configuration data of the interactive system of the user; and the setting unit is used for setting a common menu bar corresponding to the user according to the historical configuration data, and the common menu bar is used for displaying the interactive page and/or system function commonly used by the user.
A further aspect of the invention provides a storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of any of the methods described above.
According to the technical scheme of the invention, a corresponding interactive system is configured based on the interactive media selected by the user and the input keywords, in combination with the facial expression data, the body temperature distribution data and/or the weather information of the user, so that the user can rapidly configure the system according to his or her own requirements and obtain the required functions. According to the technical scheme of the invention, the corresponding color change rule and/or interaction mode are set according to the emotional tendency of the user, so that the user experience is improved. Furthermore, the color change rule and/or the interaction mode of the interactive interface are set by comprehensively considering the emotion keywords input by the user, the facial expression data and/or body temperature distribution data of the user, and/or the weather information, and the experience of automatic configuration for users of the G-IEMS system is improved in a multi-dimensional, multi-sensor, multi-information-source manner.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a method diagram of an embodiment of an interactive system setup method provided by the present invention;
FIG. 2 is a flowchart illustrating one embodiment of the steps of configuring a corresponding interactive system, in accordance with an embodiment of the present invention;
FIG. 3 is a method diagram of another embodiment of the interactive system setup method provided by the present invention;
FIG. 4 is a schematic structural diagram of an embodiment of an interactive system setup apparatus provided in the present invention;
FIG. 5 is a schematic diagram illustrating an embodiment of a configuration unit;
fig. 6 is a schematic structural diagram of another embodiment of the interactive system setting device provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the specific embodiments of the present invention and the accompanying drawings. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The display modes of the G-IEMS local area energy Internet system mainly comprise a web page, an APP and a touch screen, with different resolutions and interaction modes. The current model selection method of the G-IEMS system (including selecting the application, the medium used and/or the functions used), which determines the system configuration by means of keywords, may cause the system to make an erroneous judgment when the user makes a deceptive selection (deliberately selects a wrong option).
The invention provides an interactive system setting method. The interactive system is preferably an interactive system of a G-IEMS local energy Internet.
Fig. 1 is a schematic method diagram of an embodiment of an interactive system setting method provided by the present invention.
As shown in fig. 1, according to an embodiment of the present invention, the interactive system setting method includes at least step S110, step S120, step S130, and step S140.
Step S110, acquiring the interactive media selected by the user.
The interactive media include, for example: APP media, computer media and/or touch screen terminal media. In a specific embodiment, the device currently used by the user may be acquired; for example, when the user opens the type selection page through a mobile phone, the default interactive medium selected for the user is the APP medium. For example, the request header information provided by the browser may be used to determine the device with which the user accesses the type selection system, such as a PC web page terminal, an APP mobile terminal, or a touch screen terminal (e.g., an industrial touch screen), and thereby determine whether the interaction medium is a computer medium, an APP medium, or a touch screen terminal medium. Determining the default option in this manner is convenient for the user. In another embodiment, at least two interactive media options may be displayed for selection by the user, for example on a preset setup page.
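As a non-limiting illustration (not part of the patent disclosure), the Python sketch below shows one way the default medium could be inferred from request header information; the User-Agent tokens and the "industrial-touch"/"kiosk" markers are assumptions.

    # Hedged sketch: inferring the default interaction medium from a User-Agent string.
    def detect_interaction_medium(user_agent: str) -> str:
        ua = user_agent.lower()
        if "industrial-touch" in ua or "kiosk" in ua:  # hypothetical touch-terminal markers
            return "touch_screen_terminal_medium"
        if any(token in ua for token in ("android", "iphone", "ipad", "mobile")):
            return "app_medium"       # APP mobile terminal
        return "computer_medium"      # default: PC web page terminal

    if __name__ == "__main__":
        print(detect_interaction_medium(
            "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) Mobile"))  # app_medium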
Step S120, receiving a preset type of keyword input by a user.
The preset types of keywords may specifically include: application scenario keywords, function keywords, and/or emotion keywords. In a specific embodiment, a corresponding input area is displayed for each preset type, and the keywords entered by the user in the input area corresponding to each preset type are received. In a preferred embodiment, at least one preset candidate keyword is displayed for each preset type, and the candidate keyword selected by the user for each preset type is received. That is, for each preset type, the user may select a desired keyword from the displayed preset candidate keywords.
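A minimal Python sketch of this candidate-keyword step is given below; the preset types and keyword values are illustrative assumptions, not taken from the patent.

    # Preset types mapped to displayed candidate keywords (values are assumptions).
    PRESET_KEYWORDS = {
        "application_scenario": ["home", "office", "factory"],
        "function": ["energy_monitoring", "air_conditioner_control", "lamp_control"],
        "emotion": ["happy", "calm", "tired", "sad"],
    }

    def receive_selected_keywords(selection: dict) -> dict:
        """Keep only selections that match a displayed candidate of the same preset type."""
        validated = {}
        for preset_type, chosen in selection.items():
            candidates = PRESET_KEYWORDS.get(preset_type, [])
            validated[preset_type] = [kw for kw in chosen if kw in candidates]
        return validated

    print(receive_selected_keywords({"emotion": ["happy"], "function": ["lamp_control", "unknown"]}))
    # {'emotion': ['happy'], 'function': ['lamp_control']}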
Step S130, collecting first facial expression data and/or first body temperature distribution data of the user, and/or acquiring current first weather information.
For example, a camera is used to collect a facial image of the user to obtain the first facial expression data, and an infrared camera is used to collect thermal imaging data of the user as the first body temperature distribution data. The first weather information may be, for example, outdoor weather information acquired by a weather station owned by the G-IEMS system.
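For illustration only, the sketch below uses OpenCV to grab a single visible-light frame; the infrared camera and the G-IEMS weather station are represented by placeholder stubs because their real interfaces are not specified here.

    import cv2  # OpenCV, used only for the visible-light face image

    def capture_face_frame(camera_index: int = 0):
        cap = cv2.VideoCapture(camera_index)
        ok, frame = cap.read()      # one frame for facial expression analysis
        cap.release()
        return frame if ok else None

    def read_thermal_distribution():
        # Placeholder stub: would return a 2-D temperature map from the infrared camera.
        return None

    def read_weather_station():
        # Placeholder stub: would return values such as temperature, humidity,
        # irradiance and weather condition from the G-IEMS weather station.
        return {"temperature_c": None, "humidity": None, "condition": None}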
Step S140, configuring a corresponding interactive system based on the interactive media and the preset type of keywords, in combination with the first facial expression data, the first body temperature distribution data, and/or the first weather information.
FIG. 2 is a flowchart illustrating an embodiment of steps for configuring a corresponding interactive system according to an embodiment of the present invention. As shown in fig. 2, step S140 may specifically include step S141 and step S142.
Step S141, configuring an interactive interface of the corresponding interactive system based on the interactive media and the keyword.
In one embodiment, the interactive interface of the interactive system is configured according to the interactive medium, the application scenario keywords and the function keywords. For example, if the interactive medium selected by the user is an APP medium, the application scenario keyword selected by the user is "home", and the function keywords selected are electric energy monitoring (by hierarchy or by region), air conditioner control, lamp control, and the like, an APP home system that includes the electric energy monitoring, air conditioner control and lamp control functions is automatically configured and generated.
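A hedged sketch of this configuration step follows; the module names and layout values are assumptions used only to make the flow concrete.

    def configure_interface(medium: str, scenario: str, functions: list) -> dict:
        # Pick a base layout per medium (values are illustrative assumptions).
        layout = {"app_medium": "single_column",
                  "computer_medium": "dashboard",
                  "touch_screen_terminal_medium": "large_tiles"}.get(medium, "dashboard")
        return {"medium": medium, "scenario": scenario,
                "layout": layout, "modules": list(functions)}

    config = configure_interface(
        "app_medium", "home",
        ["energy_monitoring", "air_conditioner_control", "lamp_control"])
    print(config)  # an APP home system with the three selected function modules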
Step S142, setting a color change rule and/or an interaction mode of the interactive interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information.
The interaction means may comprise, for example, a page switching means and/or a page switching speed. The page switching mode includes, for example, horizontal left-right switching and/or up-down switching.
When the preset type of keywords includes emotion keywords, in a specific embodiment, the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords. That is, the emotion of the user is determined according to the emotion keywords, and the color change rule and/or the interaction mode of the interactive interface are set according to the determined emotion of the user. For example, different user emotions correspond to different color change rules and/or interaction modes.
When the preset type of keywords includes emotion keywords, in another specific embodiment, the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords in combination with the first facial expression data, the first body temperature distribution data and/or the first weather information. That is to say, the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords in combination with one or more of the first facial expression data, the first body temperature distribution data and the first weather information.
For example, the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords in combination with the first facial expression data and/or the first body temperature distribution data. Specifically, the emotion of the user is judged from the first facial expression data and/or the first body temperature distribution data, and the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords and the judged emotion of the user. For example, the user may intentionally select a wrong emotion keyword; the real emotion of the user is then determined from the first facial expression data and/or the first body temperature distribution data, and the color change rule is set according to the real emotion of the user. It has been shown that color influences human mood, so when the user is in a poor emotional state and deliberately selects a wrong emotion keyword (a deceptive selection), the mood of the user can be adjusted visually through an appropriate substitute color.
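The sketch below illustrates this deceptive-selection handling under stated assumptions: the emotion labels, the color table, and the rule that the inferred emotion overrides the declared keyword are illustrative choices, not the patent's fixed logic.

    from typing import Optional

    # Compensating theme colors per inferred emotion (an illustrative table).
    COMPENSATING_COLORS = {"sad": "warm_orange", "angry": "soft_green",
                           "tired": "light_yellow", "happy": "bright_blue"}

    def resolve_emotion(declared: str, inferred: Optional[str]) -> str:
        # Trust the emotion inferred from expression/body-temperature data when
        # it disagrees with the keyword the user declared.
        return inferred if inferred and inferred != declared else declared

    def pick_theme_color(emotion: str) -> str:
        return COMPENSATING_COLORS.get(emotion, "neutral_gray")

    print(pick_theme_color(resolve_emotion(declared="happy", inferred="sad")))  # warm_orange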
For another example, the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords in combination with the first weather information. Specifically, the emotion of the user is determined according to the emotion keywords, and the color change rule and/or the interaction mode of the interactive interface are set according to the determined emotion of the user and the first weather information. The first weather information may include, for example, temperature, humidity, irradiance, and/or meteorological information (e.g., wind, cloud, rain, snow, frost, dew, rainbow, halo, lightning, thunder, etc.).
In another specific embodiment, the color change rule and/or the interaction mode of the interactive interface are set according to the first facial expression data and/or the first body temperature distribution data in combination with the first weather information. Specifically, the emotion of the user is judged according to the first facial expression data and/or the first body temperature distribution data, and the color change rule and/or the interaction mode of the interactive interface are set according to the first weather information and the judged user emotion. The first weather information may include, for example, temperature, humidity, irradiance, and/or meteorological information (e.g., wind, cloud, rain, snow, frost, dew, rainbow, halo, lightning, thunder, etc.).
In the foregoing implementation in which the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords in combination with the first weather information, or in the implementation in which they are set according to the first facial expression data and/or the first body temperature distribution data in combination with the first weather information, suppose, for example, that the emotion of the user is determined to be happy, either from the emotion keyword selected by the user or from the first facial expression data and/or the first body temperature distribution data, and that the default theme color is blue. When the weather is clear in the daytime (first weather information), the interactive interface is displayed in bright blue, and a very light operation is enough to switch the page; the ease of page switching is embodied by the page switching speed and/or page switching mode, for example among three page switching speeds (slow, medium and fast) the current speed is set to fast, which visually conveys that switching is easy. When the weather is light rain at night (first weather information), the interactive interface changes to dark blue, and the page switching speed and/or page switching mode are set so that page switching is slightly slower and gentler. The mood of the user is thus adjusted through the change of color and the change of the interaction mode.
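The worked example above can be summarized in the following sketch; the rule table (bright/dark blue, fast/slow switching) mirrors the example but is otherwise an assumption.

    def style_rules(emotion: str, weather: str, is_daytime: bool) -> dict:
        theme = "blue" if emotion == "happy" else "neutral"
        if is_daytime and weather == "clear":
            return {"theme_color": f"bright_{theme}",
                    "switch_speed": "fast", "switch_mode": "horizontal"}
        if not is_daytime and weather == "light_rain":
            return {"theme_color": f"dark_{theme}",
                    "switch_speed": "slow", "switch_mode": "vertical"}
        return {"theme_color": theme, "switch_speed": "medium", "switch_mode": "horizontal"}

    print(style_rules("happy", "clear", is_daytime=True))        # bright blue, fast switching
    print(style_rules("happy", "light_rain", is_daytime=False))  # dark blue, slow switching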
In another specific embodiment, a color change rule and/or an interaction mode of the interactive interface is set according to the first facial expression data and/or the first body temperature distribution data. Specifically, the emotion of the user is judged according to the first facial expression data and/or the first body temperature distribution data, and the color change rule and/or the interaction mode of the interaction interface are/is set according to the judged emotion of the user. For example, different user emotions correspond to different color change rules and/or interaction patterns.
Fig. 3 is a schematic method diagram of another embodiment of the interactive system setting method provided by the present invention.
As shown in fig. 3, according to another embodiment of the present invention, the interactive system setting method further includes step S150, step S160, and step S170.
Step S150, in the using process of the current interactive system, receiving a change command for changing the color change rule and/or the interactive mode of the interactive interface.
Step S160, when the change command is received, receiving the emotion keywords input by the user based on the change command, acquiring second facial expression data and/or second body temperature distribution data of the user, and/or acquiring current second weather information.
Step S170, according to the emotion keywords, and by combining the second facial expression data, the second body temperature distribution data and/or the second weather information, resetting the color change rule and/or the interaction mode of the interaction interface.
Specifically, during the use of the current interactive system, if the user wants to change the system color and/or the interaction mode, the user may issue a change command and reselect an emotion keyword. The change command issued by the user for changing the color change rule and/or the interaction mode of the interactive interface is received, and the emotion keyword input by the user based on the change command is received; meanwhile, second facial expression data and/or second body temperature distribution data of the user may be collected, and/or current second weather information may be acquired. The color change rule and/or the interaction mode of the interactive interface are then reset in combination with one, two or more of the second facial expression data, the second body temperature distribution data and the second weather information.
For example, the color change rule and/or the interaction mode of the interactive interface are reset according to the emotion keywords in combination with the second facial expression data and/or the second body temperature distribution data. Specifically, the emotion of the user is judged from the second facial expression data and/or the second body temperature distribution data, and the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords and the judged emotion of the user. For example, the user may intentionally select a wrong emotion keyword; the real emotion of the user is then determined from the second facial expression data and/or the second body temperature distribution data, and the color change rule is set according to the real emotion of the user. It has been shown that color influences human mood, so when the user is in a poor emotional state and deliberately selects a wrong emotion keyword (a deceptive selection), the mood of the user can be adjusted visually through an appropriate substitute color.
For another example, the color change rule and/or the interaction mode of the interactive interface are reset according to the emotion keywords in combination with the second weather information. Specifically, the emotion of the user is determined according to the emotion keywords, and the color change rule and/or the interaction mode of the interactive interface are set according to the determined emotion of the user and the second weather information. The second weather information may include, for example, temperature, humidity, irradiance, and/or meteorological information (e.g., wind, cloud, rain, snow, frost, dew, rainbow, halo, lightning, thunder, etc.).
Optionally, the method may further include: collecting historical configuration data of the user's interactive system; and setting a common menu bar corresponding to the user according to the historical configuration data, wherein the common menu bar is used for displaying the interactive page and/or system function commonly used by the user.
For example, the system may automatically collect the user's historical configuration data, which includes, for example, the interactive pages and functions commonly used by the user. The commonly used functions are sorted so that the user can conveniently and quickly select the functions of interest, and the common pages and functions can be automatically placed in the common menu bar, which serves as a quick menu bar, according to the user's historical configuration data.
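A short sketch of how such a quick menu could be derived from usage history is shown below; the history record format is an assumption.

    from collections import Counter

    def build_quick_menu(history: list, size: int = 5) -> list:
        # Count how often each function appears in the historical configuration
        # data and expose the most frequent ones as the common menu bar.
        usage = Counter(entry["function"] for entry in history)
        return [name for name, _ in usage.most_common(size)]

    history = [{"function": "energy_monitoring"}, {"function": "lamp_control"},
               {"function": "energy_monitoring"}, {"function": "air_conditioner_control"}]
    print(build_quick_menu(history))  # most frequently used functions first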
The invention also provides an interactive system setting device. The interactive system is preferably an interactive system of a G-IEMS local energy Internet.
Fig. 4 is a schematic structural diagram of an embodiment of an interactive system setting apparatus provided in the present invention. As shown in fig. 4, the interactive system setting apparatus 100 includes: a first obtaining unit 110, a first receiving unit 120, a collecting unit 130, a second obtaining unit 140, and a configuration unit 150.
The first obtaining unit 110 is used for obtaining the interactive media selected by the user for use.
The interactive media include, for example: APP media, computer media and/or touch screen terminal media. In a specific embodiment, the first obtaining unit 110 may acquire the device currently used by the user; for example, if the user opens the type selection page through a mobile phone, the default interactive medium selected for the user is the APP medium. For example, the request header information provided by the browser may be used to determine the device with which the user accesses the type selection system, such as a PC web page terminal, an APP mobile terminal, or a touch screen terminal (e.g., an industrial touch screen), and thereby determine whether the interaction medium is a computer medium, an APP medium, or a touch screen terminal medium. Determining the default option in this manner is convenient for the user. In another embodiment, at least two interactive media options may be displayed for selection by the user, for example on a preset setup page.
The first receiving unit 120 is configured to receive a preset type of keyword input by a user.
The preset types of keywords may specifically include: application scenario keywords, function keywords, and/or emotion keywords. In a specific embodiment, a corresponding input area is displayed for each preset type, and the keywords entered by the user in the input area corresponding to each preset type are received. In a preferred embodiment, at least one preset candidate keyword is displayed for each preset type, and the candidate keyword selected by the user for each preset type is received. That is, for each preset type, the user may select a desired keyword from the displayed preset candidate keywords.
The collecting unit 130 is used for collecting first facial expression data and/or first body temperature distribution data of the user.
For example, a camera is used to collect a facial image of the user to obtain the first facial expression data, and an infrared camera is used to collect thermal imaging data of the user as the first body temperature distribution data.
The second obtaining unit 140 is configured to obtain current first weather information. For example, the second obtaining unit 140 may obtain the external weather information as the first weather information through a weather station owned by the G-IEMS system.
The configuration unit 150 is configured to configure a corresponding interactive system based on the interactive media and the preset type of keywords, in combination with the first facial expression data, the first body temperature distribution data and/or the first weather information.
Fig. 5 is a schematic structural diagram of an embodiment of a configuration unit according to the present invention. As shown in fig. 5, the configuration unit 150 specifically includes a configuration subunit 151 and a setting subunit 152.
The configuration subunit 151 is configured to configure an interactive interface of the corresponding interactive system based on the interactive medium and the keyword.
In one embodiment, the configuration subunit 151 configures the interactive interface of the corresponding interactive system according to the interactive medium, the application scenario keywords and the function keywords. For example, if the interactive medium selected by the user is an APP medium, the application scenario keyword selected by the user is "home", and the function keywords selected are electric energy monitoring (by hierarchy or by region), air conditioner control, lamp control, and the like, an APP home system that includes the electric energy monitoring, air conditioner control and lamp control functions is automatically configured and generated.
The setting subunit 152 is configured to set a color change rule and/or an interaction manner of the interactive interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information.
The interaction means may comprise, for example, a page switching means and/or a page switching speed. The page switching mode includes, for example, horizontal left-right switching and/or up-down switching.
When the preset type of keywords includes emotion keywords, in a specific embodiment, the setting subunit 152 sets the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords. That is, the emotion of the user is determined according to the emotion keywords, and the color change rule and/or the interaction mode of the interactive interface are set according to the determined emotion of the user. For example, different user emotions correspond to different color change rules and/or interaction modes.
When the preset type of keywords includes emotion keywords, in another specific embodiment, the setting subunit 152 sets the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first facial expression data, the first body temperature distribution data and/or the first weather information. That is to say, the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords in combination with one or more of the first facial expression data, the first body temperature distribution data and the first weather information.
For example, the setting subunit 152 sets the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first facial expression data and/or the first body temperature distribution data. Specifically, the emotion of the user is judged from the first facial expression data and/or the first body temperature distribution data, and the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords and the judged emotion of the user. For example, the user may intentionally select a wrong emotion keyword; the real emotion of the user is then determined from the first facial expression data and/or the first body temperature distribution data, and the color change rule is set according to the real emotion of the user. It has been shown that color influences human mood, so when the user is in a poor emotional state and deliberately selects a wrong emotion keyword (a deceptive selection), the mood of the user can be adjusted visually through an appropriate substitute color.
For another example, the setting subunit 152 sets the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the first weather information. Specifically, the emotion of the user is determined according to the emotion keywords, and the color change rule and/or the interaction mode of the interactive interface are set according to the determined emotion of the user and the first weather information. The first weather information may include, for example, temperature, humidity, irradiance, and/or meteorological information (e.g., wind, cloud, rain, snow, frost, dew, rainbow, halo, lightning, thunder, etc.).
In another specific embodiment, the setting subunit 152 sets the color change rule and/or the interaction mode of the interactive interface according to the first facial expression data and/or the first body temperature distribution data in combination with the first weather information. Specifically, the emotion of the user is judged according to the first facial expression data and/or the first body temperature distribution data, and the color change rule and/or the interaction mode of the interactive interface are set according to the first weather information and the judged user emotion. The first weather information may include, for example, temperature, humidity, irradiance, and/or meteorological information (e.g., wind, cloud, rain, snow, frost, dew, rainbow, halo, lightning, thunder, etc.).
In the foregoing implementation in which the setting subunit 152 sets the color change rule and/or the interaction manner of the interactive interface according to the emotion keywords in combination with the first weather information, or in the implementation in which the setting subunit 152 sets them according to the first facial expression data and/or the first body temperature distribution data in combination with the first weather information, suppose, for example, that the emotion of the user is determined to be happy, either from the emotion keyword selected by the user or from the first facial expression data and/or the first body temperature distribution data, and that the default theme color is blue. When the weather is clear in the daytime (first weather information), the interactive interface is displayed in bright blue, and a very light operation is enough to switch the page; the ease of page switching is embodied by the page switching speed and/or page switching mode, for example among three page switching speeds (slow, medium and fast) the current speed is set to fast, which visually conveys that switching is easy. When the weather is light rain at night (first weather information), the interactive interface changes to dark blue, and the page switching speed and/or page switching mode are set so that page switching is slightly slower and gentler. The mood of the user is thus adjusted through the change of color and the change of the interaction mode.
In yet another embodiment, the setting subunit 152 sets the color change rule and/or the interaction mode of the interaction interface according to the first facial expression data and/or the first body temperature distribution data. Specifically, the emotion of the user is judged according to the first facial expression data and/or the first body temperature distribution data, and the color change rule and/or the interaction mode of the interaction interface are/is set according to the judged emotion of the user. For example, different user emotions correspond to different color change rules and/or interaction patterns.
Fig. 6 is a schematic structural diagram of another embodiment of the interactive system setting device provided by the present invention. As shown in fig. 6, the interactive system setting apparatus 100 further includes a second receiving unit 160.
The second receiving unit 160 is configured to receive a change command for changing a color change rule and/or an interaction manner of the interactive interface during the use of the current interactive system; the first receiving unit 120 is further configured to: receive, when the second receiving unit receives the change command, the emotion keyword input by the user based on the change command; the collecting unit 130 is further configured to: collect, when the second receiving unit receives the change command, second facial expression data and/or second body temperature distribution data of the user; and/or the second acquiring unit 140 is further configured to: acquire current second weather information; the configuration unit 150 is further configured to: reset the color change rule and/or the interaction mode of the interactive interface according to the emotion keywords in combination with the second facial expression data, the second body temperature distribution data and/or the second weather information.
Specifically, in the using process of the current interactive system, if the user wants to change the system color and/or the interactive mode, a change command may be issued, and the second receiving unit 160 receives the change command for changing the color change rule and/or the interactive mode of the interactive interface; the user may reselect an emotion keyword, the first receiving unit 110 receives the emotion keyword input by the user based on the change command, the acquiring unit 130 acquires second facial expression data and/or second body temperature distribution data of the user, and/or the second acquiring unit 140 acquires current second weather information, and resets the color change rule and/or the interaction mode of the interactive interface by combining one, two or more of the second facial expression data, the second body temperature distribution data, and the second weather information.
For example, the color change rule and/or the interaction mode of the interactive interface are reset according to the emotion keywords in combination with the second facial expression data and/or the second body temperature distribution data. Specifically, the emotion of the user is judged from the second facial expression data and/or the second body temperature distribution data, and the color change rule and/or the interaction mode of the interactive interface are set according to the emotion keywords and the judged emotion of the user. For example, the user may intentionally select a wrong emotion keyword; the real emotion of the user is then determined from the second facial expression data and/or the second body temperature distribution data, and the color change rule is set according to the real emotion of the user. It has been shown that color influences human mood, so when the user is in a poor emotional state and deliberately selects a wrong emotion keyword (a deceptive selection), the mood of the user can be adjusted visually through an appropriate substitute color.
For another example, the color change rule and/or the interaction mode of the interactive interface are reset according to the emotion keywords in combination with the second weather information. Specifically, the emotion of the user is determined according to the emotion keywords, and the color change rule and/or the interaction mode of the interactive interface are set according to the determined emotion of the user and the second weather information. The second weather information may include, for example, temperature, humidity, irradiance, and/or meteorological information (e.g., wind, cloud, rain, snow, frost, dew, rainbow, halo, lightning, thunder, etc.).
Optionally, the method further comprises: the collection unit is used for collecting historical configuration data of the interactive system of the user; and the setting unit is used for setting a common menu bar corresponding to the user according to the historical configuration data, and the common menu bar is used for displaying the interactive page and/or system function commonly used by the user.
For example, the system may automatically collect the user's historical configuration data, which includes, for example, the interactive pages and functions commonly used by the user. The commonly used functions are sorted so that the user can conveniently and quickly select the functions of interest, and the common pages and functions can be automatically placed in the common menu bar, which serves as a quick menu bar, according to the user's historical configuration data.
The invention also provides a storage medium corresponding to the interactive system setting method, on which a computer program is stored, which when executed by a processor implements the steps of any of the methods described above. The interactive system is preferably an interactive system of a G-IEMS local energy Internet.
Accordingly, in the scheme provided by the invention, a corresponding interactive system is configured based on the interactive media selected by the user and the input keywords, in combination with the facial expression data, the body temperature distribution data and/or the weather information of the user, so that the user can rapidly configure the system according to his or her own requirements and obtain the required functions. According to the technical scheme of the invention, the corresponding color change rule and/or interaction mode are set according to the emotional tendency of the user, so that the user experience is improved. Furthermore, the color change rule and/or the interaction mode of the interactive interface are set by comprehensively considering the emotion keywords input by the user, the facial expression data and/or body temperature distribution data of the user, and/or the weather information, and the experience of automatic configuration for users of the G-IEMS system is improved in a multi-dimensional, multi-sensor, multi-information-source manner.
The functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. If implemented in software executed by a processor, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Other examples and implementations are within the scope and spirit of the invention and the following claims. For example, due to the nature of software, the functions described above may be implemented using software executed by a processor, hardware, firmware, hardwired, or a combination of any of these. In addition, each functional unit may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and the parts serving as the control device may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The above description is only an example of the present invention, and is not intended to limit the present invention, and it is obvious to those skilled in the art that various modifications and variations can be made in the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (13)

1. An interactive system setting method, comprising:
acquiring an interactive medium selected by a user;
receiving keywords of a preset type input by a user;
acquiring first facial expression data and/or first body temperature distribution data of a user, and/or acquiring current first weather information;
and configuring a corresponding interactive system based on the interactive media and the preset type of keywords and by combining the first facial expression data, the first body temperature distribution data and/or the first weather information.
2. The method of claim 1,
the interactive medium comprises: APP media, computer media and/or touch screen terminal media; and/or,
the preset type of keywords comprise: application occasion keywords, function keywords and/or emotion keywords.
3. The method according to claim 1 or 2, wherein the step of configuring a corresponding interactive system based on the interactive media and the preset type of keywords and combining the first facial expression data, the first body temperature distribution data and/or the first weather information comprises:
configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords;
setting a color change rule and/or an interaction mode of the interaction interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information;
the interaction mode comprises: a page switching manner and/or a page switching speed.
4. The method of claim 3,
the preset type of keywords comprise: application occasion keywords, function keywords and/or emotion keywords;
configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords comprises:
configuring an interactive interface of a corresponding interactive system according to the interactive media, the application occasion keywords and the function keywords;
and/or,
setting a color change rule and/or an interaction mode of the interaction interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information comprises:
setting a color change rule and/or an interaction mode of the interaction interface according to the emotion keywords and by combining the first facial expression data and/or the first body temperature distribution data;
and/or,
setting a color change rule and/or an interaction mode of the interaction interface according to the emotion keywords and by combining the first weather information;
and/or,
and setting a color change rule and/or an interaction mode of the interaction interface according to the first facial expression data and/or the first body temperature distribution data and by combining the first weather information.
5. The method according to any one of claims 1-4, further comprising:
receiving a change command for changing the color change rule and/or the interaction mode of the interaction interface during use of the current interactive system;
when the change command is received, receiving emotion keywords input by a user based on the change command, acquiring second facial expression data and/or second body temperature distribution data of the user, and/or acquiring current second weather information;
and resetting the color change rule and/or the interaction mode of the interaction interface according to the emotion keywords and by combining the second facial expression data, the second body temperature distribution data and/or the second weather information.
6. The method according to any one of claims 1-5, wherein receiving a preset type of keyword input by a user comprises:
displaying at least one preset alternative keyword for each preset type;
and receiving the alternative keywords selected by the user for each preset type.
7. An interactive system setting apparatus, comprising:
the first acquisition unit is used for acquiring the interactive medium selected for use by the user;
the first receiving unit is used for receiving keywords of preset types input by a user;
the acquisition unit is used for acquiring first facial expression data and/or first body temperature distribution data of a user;
the second acquisition unit is used for acquiring current first weather information;
and the configuration unit is used for configuring a corresponding interactive system based on the interactive media and the preset type of keywords and combining the first facial expression data, the first body temperature distribution data and/or the first weather information.
8. The apparatus of claim 7,
the interactive medium comprises: APP media, computer media and/or touch screen terminal media; and/or,
the preset type of keywords comprise: application occasion keywords, function keywords and/or emotion keywords.
9. The apparatus according to claim 7 or 8, wherein the configuration unit comprises:
the configuration subunit is used for configuring an interactive interface of a corresponding interactive system based on the interactive media and the keywords;
the setting subunit is used for setting a color change rule and/or an interaction mode of the interaction interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information;
the interaction mode comprises: a page switching manner and/or a page switching speed.
10. The apparatus of claim 9,
the preset type of keywords comprise: application occasion keywords, function keywords and/or emotion keywords;
the configuration subunit, based on the interactive media and the preset type of keywords, configures an interactive interface of the corresponding interactive system, including:
configuring an interactive interface of a corresponding interactive system according to the interactive media, the application occasion keywords and the function keywords;
and/or,
the setting subunit sets a color change rule and/or an interaction mode of the interaction interface according to the keywords, the first facial expression data, the first body temperature distribution data and/or the first weather information, including:
setting a color change rule and/or an interaction mode of the interaction interface according to the emotion keywords and by combining the first facial expression data and/or the first body temperature distribution data;
and/or,
setting a color change rule and/or an interaction mode of the interaction interface according to the emotion keywords and by combining the first weather information;
and/or,
and setting a color change rule and/or an interaction mode of the interaction interface according to the first facial expression data and/or the first body temperature distribution data and by combining the first weather information.
11. The apparatus of any one of claims 7-10, further comprising:
the second receiving unit is used for receiving a change command for changing the color change rule and/or the interaction mode of the interaction interface during use of the current interactive system;
the first receiving unit is further configured to: receive, when the second receiving unit receives the change command, emotion keywords input by the user based on the change command;
the acquisition unit is further configured to: acquire, when the second receiving unit receives the change command, second facial expression data and/or second body temperature distribution data of the user; and/or the second acquisition unit is further configured to: acquire current second weather information;
the configuration unit is further configured to: reset the color change rule and/or the interaction mode of the interaction interface according to the emotion keywords and by combining the second facial expression data, the second body temperature distribution data and/or the second weather information.
12. The apparatus according to any one of claims 7-11, wherein the receiving, by the first receiving unit, of the keywords of the preset type input by the user comprises:
displaying at least one preset alternative keyword for each preset type;
and receiving the alternative keywords selected by the user for each preset type.
13. A storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 6.
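For illustration only, the following Python sketch (not part of the claims) outlines the flow recited in claims 1 and 6: the interactive medium selected by the user, one preset alternative keyword per preset type, and the optional facial expression, body temperature distribution and weather inputs are combined into an interactive system configuration. All identifiers, preset keyword lists and the returned structure are assumptions of this sketch.

PRESET_KEYWORDS = {
    "application_occasion": ["home", "office", "exhibition hall"],
    "function": ["energy monitoring", "scheduling", "reporting"],
    "emotion": ["calm", "excited", "anxious"],
}

def receive_keywords(user_choices: dict) -> dict:
    # Claim 6: display at least one preset alternative keyword per type and
    # accept the alternative the user selects for each preset type.
    selected = {}
    for keyword_type, alternatives in PRESET_KEYWORDS.items():
        choice = user_choices.get(keyword_type)
        if choice not in alternatives:
            raise ValueError(f"{choice!r} is not a preset alternative for {keyword_type}")
        selected[keyword_type] = choice
    return selected

def set_up_interactive_system(medium: str, user_choices: dict,
                              facial_expression=None, body_temperature_map=None,
                              weather=None) -> dict:
    # Claim 1: configure a corresponding interactive system from the selected
    # medium, the preset-type keywords and the optional context data.
    keywords = receive_keywords(user_choices)
    return {
        "medium": medium,  # e.g. "APP", "computer" or "touch screen terminal"
        "interface": (keywords["application_occasion"], keywords["function"]),
        "emotion_keyword": keywords["emotion"],
        "context": {"facial_expression": facial_expression,
                    "body_temperature_map": body_temperature_map,
                    "weather": weather},
    }

system = set_up_interactive_system(
    "APP",
    {"application_occasion": "home", "function": "energy monitoring", "emotion": "calm"},
    weather="sunny")
print(system)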
CN201911295362.7A 2019-12-16 2019-12-16 Interactive system setting method and device and storage medium Pending CN111176503A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911295362.7A CN111176503A (en) 2019-12-16 2019-12-16 Interactive system setting method and device and storage medium

Publications (1)

Publication Number Publication Date
CN111176503A true CN111176503A (en) 2020-05-19

Family

ID=70657367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911295362.7A Pending CN111176503A (en) 2019-12-16 2019-12-16 Interactive system setting method and device and storage medium

Country Status (1)

Country Link
CN (1) CN111176503A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170083116A1 (en) * 2014-08-25 2017-03-23 Chiun Mai Communication Systems, Inc. Electronic device and method of adjusting user interface thereof
CN105938390A (en) * 2015-03-03 2016-09-14 卡西欧计算机株式会社 Content output apparatus and content output method
CN105279494A (en) * 2015-10-23 2016-01-27 上海斐讯数据通信技术有限公司 Human-computer interaction system, method and equipment capable of regulating user emotion
CN105955486A (en) * 2016-05-16 2016-09-21 西北工业大学 Method for assisting teleoperation based on visual stimulation of brainwaves
CN108614987A (en) * 2016-12-13 2018-10-02 深圳光启合众科技有限公司 The method, apparatus and robot of data processing
CN108416002A (en) * 2018-02-27 2018-08-17 维沃移动通信有限公司 A kind of man-machine interaction method and mobile terminal
CN108363492A (en) * 2018-03-09 2018-08-03 南京阿凡达机器人科技有限公司 A kind of man-machine interaction method and interactive robot
CN109324687A (en) * 2018-08-14 2019-02-12 华为技术有限公司 A kind of display methods and virtual reality device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116360666A (en) * 2023-05-31 2023-06-30 Tcl通讯科技(成都)有限公司 Page sliding method and device, electronic equipment and computer storage medium
CN116360666B (en) * 2023-05-31 2023-09-19 Tcl通讯科技(成都)有限公司 Page sliding method and device, electronic equipment and computer storage medium

Similar Documents

Publication Publication Date Title
US10635713B2 (en) Method and device for replacing the application visual control
CN106469038A (en) Display screen changing method based on multi-screen terminal and device
CN107991897B (en) Control method and device
US11943498B2 (en) Display method, display terminal and non-transitory computer readable storage medium
CN108833222B (en) Household appliance control method, household appliance control device, remote controller, terminal, server and medium
CN108345907A (en) Recognition methods, augmented reality equipment and storage medium
JP2017538312A (en) Streetlight management method and apparatus
CN110780598B (en) Intelligent device control method and device, electronic device and readable storage medium
CN110308845B (en) Interaction method and device for application program control interface
US11470240B2 (en) Method and terminal device for matching photgraphed objects and preset text imformation
CN105785784B (en) Intelligent household scene visualization method and device
JP2017123587A (en) Control device, control method, and program
EP2829961A1 (en) Display interface converting system and method thereof
CN111324275A (en) Broadcasting method and device for elements in display picture
CN101626633A (en) Mobile terminal and method for automatically updating subject mode
CN113495487A (en) Terminal and method for adjusting operation parameters of target equipment
CN103501410A (en) Reminding method and device of shooting as well as generation method and device of detection mode
CN111176503A (en) Interactive system setting method and device and storage medium
CN114253145A (en) Control method and control device for household appliance and household appliance control system
CN110912806A (en) Message processing method, device, storage medium and electronic device
CN110673737B (en) Display content adjusting method and device based on intelligent home operating system
CN110543276B (en) Picture screening method and terminal equipment thereof
CN111094859A (en) Air cleaner and network system
CN104679383A (en) Method and device for switching photographing buttons automatically
CN106094058A (en) A kind of weather forecast method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20200519)