WO2014185056A1 - Information Providing Method - Google Patents
Information Providing Method
- Publication number
- WO2014185056A1 (PCT/JP2014/002519)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- information
- home appliance
- home
- category
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/02—Capturing of monitoring data
- H04L43/028—Capturing of monitoring data by filtering
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2807—Exchanging configuration information on appliance services in a home automation network
- H04L12/281—Exchanging configuration information on appliance services in a home automation network indicating a format for calling an appliance service function in a home automation network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2807—Exchanging configuration information on appliance services in a home automation network
- H04L12/2812—Exchanging configuration information on appliance services in a home automation network describing content present in a home automation network, e.g. audio video content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2823—Reporting information sensed by appliance or service execution status of appliance services in a home automation network
- H04L12/2825—Reporting to a device located outside the home and the home network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/70—Services for machine-to-machine communication [M2M] or machine type communication [MTC]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
Definitions
- The present disclosure relates to an information providing method, and more particularly to an information providing method for a system in which a plurality of home appliances, a display device, and a server are connected via a network.
- Patent Document 1 discloses an information display method for a screen of a central supervisory control device or the like in a supervisory control system for a plurality of plants. According to Patent Document 1, a monitoring person can view various types of information at a glance without having to open a separate screen for each type of information.
- However, while the information display method disclosed in Patent Document 1 can superimpose and display a certain amount of varied information, it cannot display the enormous and varied information known as big data in an easily visible manner.
- The present disclosure solves this problem and provides an information providing method that can present an enormous amount of varied information in a display mode that is easy to view.
- An information providing method according to the present disclosure is a method for a system in which a plurality of home appliances, a display device, and a server are connected via a network, wherein the server receives information about each of the plurality of home appliances from each of those home appliances.
- FIG. 1 is a diagram illustrating an example of a configuration of an information providing system according to the first embodiment.
- FIG. 2 is a diagram illustrating an example of a house ID and attribute information held by the home appliance DB group in the first embodiment.
- FIG. 3 is a diagram illustrating an example of an event log including home appliance state changes and user operations held by the home appliance DB group in the first embodiment.
- FIG. 4A is a diagram illustrating an example of screen information of the information sharing service according to the first embodiment.
- FIG. 4B illustrates an example of a structure of the display device in Embodiment 1.
- FIG. 4C is a diagram illustrating an example of a configuration of a server in the first embodiment.
- FIG. 5 is a flowchart illustrating an example of UI acquisition processing of the display device according to the first embodiment.
- FIG. 6 is a flowchart illustrating an example of processing until the home appliance in Embodiment 1 performs event notification to the server.
- FIG. 7 is a flowchart illustrating an example of processing in which the server according to the first embodiment notifies the display device of UI update.
- FIG. 8 is a flowchart showing detailed processing of S2333, S2334, and S2335 of FIG.
- FIG. 9 is a diagram illustrating an example of information stored in the house DB according to the first embodiment.
- FIG. 10 is a diagram showing an example of a home appliance event summary table in Osaka Prefecture in the first embodiment.
- FIG. 11 is a diagram illustrating an example of a home appliance event occurrence frequency list according to the first embodiment.
- FIG. 12 is a diagram illustrating an example of an operation rate for each home appliance type according to the first embodiment.
- FIG. 13 is a diagram illustrating an example of the content of a display change notification to the display unit in the first embodiment.
- FIG. 14A is a diagram illustrating an example of a UI displayed by the display unit before the processing illustrated in FIG. 8 is performed.
- FIG. 14B is a diagram illustrating an example of a UI displayed by the display unit after the processing illustrated in FIG. 8 is performed.
- FIG. 15 is a diagram illustrating an example of a configuration of an information providing system according to the second embodiment.
- FIG. 16 is a diagram illustrating an example of information stored in the home appliance category DB in the second embodiment.
- FIG. 17 is a flowchart illustrating processing in which the server according to the second embodiment determines a category type and a display size.
- FIG. 18 is a diagram illustrating an example of a UI displayed by the display unit according to the second embodiment.
- FIG. 19 is a diagram illustrating an example of information stored in the home appliance category DB in the second embodiment.
- FIG. 20 is a diagram illustrating an example of the configuration of the display format determination unit in the third embodiment.
- FIG. 21 is a flowchart illustrating an example of processing in which the home appliance operating time predicting unit according to Embodiment 3 predicts the operating time of home appliances.
- FIG. 22 is a diagram illustrating an example of a method for calculating an operation time from an event log held by the home appliance DB group in the third embodiment.
- FIG. 23 is a diagram illustrating an example of a method for calculating an operation time from an event log held by the home appliance DB group in the third embodiment.
- FIG. 24 is a diagram illustrating an example of a home appliance operation ratio held by the home appliance DB group in the third embodiment.
- FIG. 25 is a flowchart illustrating an example of processing in which the display index value calculation unit according to Embodiment 3 calculates a display index.
- FIG. 26 is a diagram illustrating an example of a display screen displayed in a specific display format according to the third embodiment.
- FIG. 27 is a diagram showing another example of a display screen displayed in a specific display format in the third embodiment.
- FIG. 28 is a diagram showing an example of a display screen when an application is added to the display screen of FIG.
- FIG. 29 is a diagram showing an example of a display screen when an application is added to the display screen of FIG.
- FIG. 30 is a diagram illustrating an example of a display screen displayed in a specific display format according to the fourth embodiment.
- FIG. 31 is a flowchart illustrating an example of processing for generating the specific display format illustrated in FIG. 30.
- FIG. 32 is a diagram illustrating an example of a display format in which user icons are overlapped with each other in the fourth embodiment.
- FIG. 33 is a diagram showing an example of a display format obtained by modifying the background map image in the fourth embodiment.
- FIG. 34 is a diagram illustrating an example of display format transition in the fourth embodiment.
- FIG. 35A is a diagram illustrating an example of an external appearance of a wireless coaster in Embodiment 5.
- FIG. 35B is a diagram illustrating an example of a configuration of a wireless coaster in Embodiment 5.
- FIG. 36 is a flowchart illustrating an example of processing in which the wireless coaster in Embodiment 5 detects a state.
- FIG. 37 is a diagram illustrating an example of a system configuration when the wireless coaster according to the fifth embodiment is connected to a PC through a wired connection.
- FIG. 38 is a flowchart illustrating an example of processing in which the coaster according to the fifth embodiment detects a state and cooperates with the PC.
- FIG. 39 is a diagram illustrating an example of information obtained from the coaster according to the fifth embodiment.
- FIG. 40 is a diagram showing an example of screen information displayed in conjunction with the coaster in the fifth embodiment.
- FIG. 41 is a diagram illustrating an example of a configuration when the wireless coaster according to Embodiment 5 performs state estimation.
- FIG. 42 is a flowchart illustrating processing in which the coaster according to the fifth embodiment detects a state.
- FIG. 43 is a diagram showing an example of a weight change pattern in the fifth embodiment.
- FIG. 44 is a diagram illustrating an example of information obtained from the coaster according to the fifth embodiment.
- FIG. 45 is a diagram illustrating an example of a configuration of a server having a function of estimating a state according to the fifth embodiment.
- FIG. 46 is a flowchart illustrating an example of processing of the server in the fifth embodiment.
- FIG. 47 is a flowchart illustrating an example of server processing according to the fifth embodiment.
- FIG. 48 is a diagram illustrating an example of information stored in the cup DB according to the fifth embodiment.
- FIG. 49 is a diagram showing an example of usage frequency information for calculating a time zone mainly used for cups in the fifth embodiment.
- FIG. 50 is a diagram illustrating an example of a configuration of a system that simultaneously uses coffee maker usage information and coaster information according to the fifth embodiment.
- FIG. 51 is a diagram showing an example of the configuration of a server in the case of using drink manufacturer information in the fifth embodiment.
- FIG. 52 is a diagram showing an example of a shared screen displaying information of a manufacturer such as a coffee maker in the fifth embodiment.
- FIG. 53 is a flowchart illustrating an example of processing performed by the server when using drink maker information according to the fifth embodiment.
- FIG. 54 is a diagram showing an example of a system configuration in the sixth embodiment.
- FIG. 55 is a diagram illustrating an example of sound generated by the sound generation apparatus that collects sound from the network-connected microphone device according to the sixth embodiment.
- FIG. 56A is a diagram illustrating an example of a specific configuration of the network-connected microphone device according to the sixth embodiment.
- FIG. 56B is a diagram illustrating an example of a specific configuration of the sound generator A according to the sixth embodiment.
- FIG. 56C is a diagram illustrating an example of a specific configuration of the sound generation device B according to the sixth embodiment.
- FIG. 57 is a flowchart for explaining an example of the operation of the network-connected microphone device in the sixth embodiment.
- FIG. 58 is a flowchart for explaining an example of the operation of the network-connected microphone device in the sixth embodiment.
- FIG. 59 is a flowchart for explaining an example of the operation of the sound generator A according to the sixth embodiment.
- FIG. 60 is an example of a table used to determine the sound output from the sound generator A according to the sixth embodiment.
- FIG. 61 is a diagram illustrating an example of a child state associated with mother's voice recognition in the sixth embodiment.
- FIG. 62 is an example of a UI displayed on the display device in the sixth embodiment.
- FIG. 63 is an example of a UI displayed on the display device in the sixth embodiment.
- FIG. 64 is a diagram illustrating an example of a UI display effect displayed on the display device in the sixth embodiment.
- FIG. 65 is a diagram showing an example of the display effect speed according to the positional relationship between the network-connected microphone device and the display device in the sixth embodiment.
- FIG. 66 is a diagram showing an example of a display effect according to the event notification elapsed time to the display device in the sixth embodiment.
- FIG. 67 is a flowchart illustrating an example of processing in which the server according to the sixth embodiment notifies the display device 1 of UI update.
- FIG. 68 is a diagram showing an example of an event notified by the network-connected microphone device stored in the household appliance DB group in the sixth embodiment.
- FIG. 69A is a diagram illustrating an example of a result of counting events accumulated in the home appliance DB group of the server in the sixth embodiment.
- FIG. 69B is a diagram illustrating an example of a result of counting events accumulated in the home appliance DB group of the server in the sixth embodiment.
- FIG. 70 is a diagram illustrating an example of a UI displayed after the UI change notification from the server is displayed by the display device according to the sixth embodiment.
- FIG. 71A is a diagram for describing an example of a form in which a service is provided using a server.
- FIG. 71B is a diagram for describing an example of a form in which a service is provided using a server.
- FIG. 71C is a diagram for describing an example of a form in which a service is provided using a server.
- FIG. 72 is a diagram for explaining an example of service types.
- FIG. 73 is a diagram for explaining an example of service types.
- FIG. 74 is a diagram for explaining an example of service types.
- FIG. 75 is a diagram for explaining an example of service types.
- Conventionally, home appliance networks have been limited to connecting and using a plurality of home appliances within a single home, and have not yet grown into services that include home appliances connected to other home networks.
- However, by linking the information on home appliances connected to the network in individual homes over a wide-area network, it is expected that a wide variety of home appliance information, such as home appliance logs (so-called big data), can be collected.
- The inventors recognized that the collected information could be used, for example, to let a user grasp the usage status of third parties' home appliances as a reference for using his or her own, and that new services, such as transitioning to other service screens like an SNS, could be developed. This is expected to create forms of home appliance use for users that have never existed before.
- To this end, it is desirable to prepare a portal screen on which information on home appliances in a plurality of households connected to a wide-area network can be displayed in a consolidated manner on a display device in each household. This allows the user to easily transition to another service screen such as an SNS, and to easily grasp the usage status of home appliances in each home, including his or her own, and to use various services starting from that screen.
- Patent Document 1 discloses an information display method for a screen of a central supervisory control device or the like in a supervisory control system of a plurality of plants.
- Specifically, Patent Document 1 discloses a technique for a central supervisory control device used in a plant or the like, in which a plurality of pieces of variable display information, such as alarm information, are displayed in a time-sharing manner over fixed display information, such as a map of each plant.
- This allows the monitor to view various types of information at a glance without having to open a separate screen for each type of information.
- However, while the information display method disclosed in Patent Document 1 can superimpose and display a certain amount of varied information, it cannot display the enormous and varied information known as big data in an easily visible manner.
- That is, with Patent Document 1, the varied information known as big data cannot be displayed in an easy-to-view manner merely by a display switching method that relies on time-division display of a screen.
- The present invention was therefore created to realize an information providing method capable of first effectively narrowing down the information to be displayed according to the purpose of use, and then displaying the narrowed-down information so that it can be easily viewed and so that the user can transition to the next service screen.
- The present disclosure has been made in view of the above circumstances, and provides an information providing method that can present an enormous variety of information in a display mode that is easy to view.
- An information providing method according to the present disclosure is a method for a system in which a plurality of home appliances, a display device, and a server are connected via a network, wherein the server receives information about each of the plurality of home appliances from each of those home appliances.
- Here, "home appliances" connected to the network can include not only AV appliances such as TVs and recorders and so-called white goods such as air conditioners and refrigerators, but also beauty appliances, health devices, digital cameras, and any other hardware or software that is connected to a network and can communicate its own data. Devices that communicate data by M2M, such as NFC sensors, may also be included.
- "Network" includes not only short-range communication in the home, such as wired or wireless LAN, Bluetooth (registered trademark), infrared communication, ZigBee, and NFC, but also outdoor mobile networks; any form capable of one-way or two-way communication can be used.
- Display devices include not only display screens installed on individual home appliances and network-connected TVs, computers, and projectors, but also mobile phones, smartphones, tablets, and any other device that has a display screen.
- The method may include a display step of displaying, on the display screen of the display device in the specific display format, an object corresponding to a home appliance that includes the filtered information on the one or more home appliances, based on the display information transmitted by the server.
- The processing step may include a tabulation step of tabulating the received information about the plurality of home appliances by category, and a calculation step of calculating, using the tabulation result, display format information indicating a specific display format including a display size and a display position of an object corresponding to a home appliance that includes the filtered information on the one or more home appliances; and in the transmission step, the display information including the display format information may be transmitted to the display device.
- In the tabulation step, the received information about each of the plurality of home appliances may be tabulated for each category determined based on information about the home of the home appliance user, including geographic information such as region, family structure and number of people, and residence type.
- In the receiving step, information about each of the plurality of home appliances, transmitted by each home appliance when its state changes, is received; in the tabulation step, an operation rate for each type of home appliance is estimated based on the number of times such information was received; and in the calculation step, the display format information may be calculated based on the operation rate.
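The operation-rate estimation described above can be sketched as follows. This is a minimal illustration, not the patented method itself: it assumes each appliance reports an event on a state change, and treats a type's operation rate as the fraction of registered units that reported at least one event. All names are illustrative.

```python
def estimate_operation_rates(event_log, registered_counts):
    """Estimate an operation rate per appliance type from received events.

    event_log: list of (house_id, appliance_type) tuples, one per event.
    registered_counts: dict appliance_type -> number of registered units.
    Returns appliance_type -> fraction of units that reported an event.
    """
    active = {}  # appliance_type -> set of houses that reported
    for house_id, app_type in event_log:
        active.setdefault(app_type, set()).add(house_id)
    return {
        app_type: len(active.get(app_type, set())) / total
        for app_type, total in registered_counts.items()
        if total > 0
    }

log = [(1, "air_conditioner"), (2, "air_conditioner"), (1, "tv")]
rates = estimate_operation_rates(log, {"air_conditioner": 4, "tv": 2})
# two of four registered air conditioners reported, so the rate is 0.5
```

The resulting rates could then feed into the display-format calculation (e.g. scaling icon sizes by operation rate).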
- In the calculation step, the display format information may be calculated so as to indicate a specific display format that does not depend on the screen size of the display device, including a relative display position and display size on the screen.
- The processing step may further include assigning a category ID indicating a category type to the information about each of the plurality of home appliances received in the receiving step, determining a category display priority indicating the display priority of each category type based on that information and on stored information about each home appliance, and generating the display information including the assigned category IDs, the determined category display priorities, and the home appliance display priorities. In the display step, based on the display information, one or more home appliances having the same category ID may be displayed together on the display screen, and information about home appliances belonging to a category ID with a high category display priority may be displayed larger on the display screen.
- The category display priority may be determined according to the total usage frequency of the one or more home appliances having the same category ID.
- When a home appliance is used by the user, the category display priority of the category ID to which that home appliance belongs may be set temporarily higher than its predetermined value.
- A category ID indicating the same category type may be given to a plurality of home appliances that are each used more than a preset usage frequency in a preset time zone.
- In the display step, icons indicating one or more home appliances having the same category ID may be displayed together on the display screen, and among those icons, the icons corresponding to home appliances with a higher home appliance display priority may be displayed larger.
- The processing step may further include a prediction step of predicting a home appliance operating time, which is the time during which each of the plurality of home appliances operates, based on the information received in the receiving step, and a step of calculating a display index value for determining the display size and display position of each home appliance icon, which is an object corresponding to a home appliance including the filtered information, by weighting the predicted operating time by the user operation time. In the display step, the display size and display position of each home appliance icon on the display screen may be determined based on the display index value transmitted by the server and then displayed on the display screen.
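One simple reading of the display-index calculation above is a weighted sum of operating time and user-operation time, mapped linearly onto an icon-size range. The weight and the pixel range are assumptions for illustration only.

```python
def display_index(operation_time, user_operation_time, weight=2.0):
    """Display index: predicted operating time weighted by the time the
    user actually operated the appliance (weight is an assumed factor)."""
    return operation_time + weight * user_operation_time

def icon_sizes(appliances, min_px=32, max_px=96):
    """Map each appliance's display index linearly onto an icon-size range.

    appliances: dict name -> (operation_time, user_operation_time) in minutes.
    """
    indices = {a: display_index(t, u) for a, (t, u) in appliances.items()}
    lo, hi = min(indices.values()), max(indices.values())
    span = (hi - lo) or 1.0  # avoid division by zero when all indices match
    return {
        a: round(min_px + (v - lo) / span * (max_px - min_px))
        for a, v in indices.items()
    }

sizes = icon_sizes({"aircon": (120, 10), "tv": (60, 30), "fridge": (30, 0)})
# the appliance with the largest weighted index gets the largest icon
```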
- In the display step, an icon indicating that a user operation was performed may be displayed near the home appliance icon corresponding to the home appliance on which the user operation was performed.
- In the display step, the home appliance icons and person icons, selected according to a predetermined display priority from among a plurality of person icons each representing one of a plurality of users, may be superimposed on a background image and displayed on the display screen. The predetermined display priority may be determined using the information about each of the plurality of home appliances received in the receiving step, associated with each of the plurality of users.
- The display priority may be determined using the operation times of the plurality of home appliances as the information about the plurality of home appliances.
- The display priority may be determined using the accumulated operation time of each of the plurality of home appliances as the information about each of them.
- the background image may be further deformed and displayed so that the plurality of person icons or the plurality of home appliance icons do not overlap.
- After the deformation, the background image may be returned to its original shape and then deformed and displayed again.
- the background image may be a map.
- The method may further include a measurement step in which a first home appliance having a weight-measuring function among the plurality of home appliances measures a change in the weight of an object, and a device transmitting step of transmitting, to the server via the network, weight information indicating the measured change together with an identifier that uniquely identifies the first home appliance. In the receiving step, the transmitted identifier and weight information are received, and the processing step further includes an estimation step of estimating the state of the object measured by the first home appliance from the change pattern of the received weight information. The processing step then generates display information for the display device to display, in the specific display format, display content corresponding to the estimated state of the object; and in the display step, based on the display information, the display content of the avatar associated with the object, among the avatars displayed on the display screen, may be changed accordingly.
- In the measurement step, the weight of a cup placed on the first home appliance is measured, and in the estimation step, it may be estimated from the change pattern of the received weight information whether the user of the first home appliance has placed the cup on it or picked the cup up from it.
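A minimal sketch of this pattern classification might compare consecutive weight readings against a noise floor; a rise means the cup was placed and a drop means it was picked up. The threshold and readings are illustrative assumptions, not values from the patent.

```python
def classify_weight_event(prev_grams, curr_grams, noise_grams=5):
    """Classify a coaster weight change as the cup being placed or
    picked up; small fluctuations below the noise floor are ignored."""
    delta = curr_grams - prev_grams
    if delta > noise_grams:
        return "cup_placed"
    if delta < -noise_grams:
        return "cup_picked_up"
    return "no_change"

# A stream of readings from the coaster's weight sensor (grams).
readings = [0, 0, 310, 312, 2, 0]
events = [classify_weight_event(a, b) for a, b in zip(readings, readings[1:])]
# the 0 -> 310 jump is classified as "cup_placed", 312 -> 2 as "cup_picked_up"
```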
- In the estimation step, the weight of an object used by the user of the first home appliance may be estimated from the change pattern of the received weight information. The method may further include a comparison step of comparing the estimated weight of the object with a plurality of pre-registered images each corresponding to an object weight, and in the display step, the image of the object shown with the displayed avatar may be changed, based on the display information, to the image corresponding to the weight estimated in the comparison step.
- Among the plurality of home appliances, a second home appliance having a current-measuring function may further measure, in the measurement step, the amount of current of a third home appliance; the measured current amount of the third home appliance is transmitted to the server; and in the display step, based on the display information, the third home appliance whose use was identified from the current measured by the second home appliance may be displayed.
- The information providing method may further include a voice recognition step in which a fourth home appliance having a voice recognition function among the plurality of home appliances collects sound and performs voice recognition, and a determination step of determining the state of an object based on the sound recognized in the voice recognition step. In the receiving step, the state of the object is received as information about the plurality of home appliances, and the processing step may further generate display information for displaying display content corresponding to the received state of the object in the specific display format.
- in the voice recognition step, the sound generated by a sound generator mounted on a fifth home appliance used by the object is collected, and the state of the object using the fifth home appliance may be determined based on the sound recognized in the voice recognition step.
- the information providing method further includes a determination step of determining the state of the object based on a word included in the sound recognized in the voice recognition step. The reception step further receives the state of the object, and display information for displaying display contents according to the received state of the object in the specific display format may be generated.
- FIG. 1 is a diagram illustrating an example of a configuration of an information providing system according to the first embodiment.
- the information providing system shown in FIG. 1 includes a plurality of home appliances, a display device, and a server (1100), which are connected via a public network (1200).
- in house A (1210), home appliance 1 (1401), home appliance 2 (1402), and home appliance 3 (1403), which are a plurality of home appliances, are arranged.
- event information, including state changes of the home appliances and user operations on them, generated in home appliance 1 (1401) and home appliance 2 (1402) is transmitted to the server (1100) via the GW1 (1301) and the public network (1200). Event information generated in home appliance 3 (1403) is transmitted directly to the server (1100) via the public network (1200) without passing through the GW1 (1301).
- in house B (1211), home appliance 4 (1404), home appliance 5 (1405), ..., home appliance n (1406), which are home appliances of house B registered in advance in GW2 (1302), are arranged, and event information including state changes of these home appliances and user operations on them is transmitted to the server (1100) via the GW2 (1302) and the public network (1200).
- the server (1100) includes a communication unit (1101), a display format determination unit (1102), and a home appliance DB group (1103).
- the communication means (1101) receives information on each of the plurality of home appliances from each of the plurality of home appliances.
- the communication means (1101) transmits the display information formed (generated) by the display format determination means (1102) to the display device.
- the communication means (1101) receives information on each of the plurality of home appliances transmitted by each of the home appliances when the state of each of the plurality of home appliances changes.
- the server (1100) (in particular, the communication unit (1101) and the display format determination unit (1102)) includes, for example, a first memory and a CPU as a hardware configuration.
- the first memory stores, for example, a program that functions as communication means (1101) and a program that functions as display format determination means (1102).
- the first memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the server (1100) is configured such that the communication means (1101) and the display format determination means (1102) function by, for example, reading the above-described programs from the first memory and causing the CPU to execute them.
- that is, the CPU executes the program that functions as the communication means (1101) and the program that functions as the display format determination means (1102).
- the present invention is not limited to this.
- a dedicated signal processing circuit that functions as the communication unit (1101) and a dedicated signal processing circuit that functions as the display format determination unit (1102) may be used.
- the server (1100) has a memory (not shown) for storing the home appliance DB group (1103).
- the memory that stores the home appliance DB group (1103) is, for example, a readable and writable recording medium. Examples of the recording medium that can be read and written include a semiconductor memory, an optical disk, and a hard disk.
- the communication means (1101) receives the event information of home appliances transmitted from the house A (1210) and the house B (1211) and stores it in the home appliance DB group (1103).
- the communication unit (1101) provides screen information formed (generated) by the display format determination unit (1102) in response to a request from the display device 1 (1510).
- the display format determination unit (1102) performs processing for filtering information on each of the plurality of home appliances received by the communication unit (1101) and generating display information for the display device to display in a specific display format.
- at a predetermined time, or when a certain number of pieces of information regarding each of the plurality of home appliances has been received by the communication means (1101), the display format determination means (1102) aggregates the received information regarding the plurality of home appliances by category.
- using the result of the aggregation, the display format determination means (1102) calculates display format information indicating a specific display format, including the display size and display position of the object corresponding to each home appliance, from the filtered information regarding one or more home appliances.
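As an illustration only (not the patent's implementation), the aggregation by category described above can be sketched as follows; the event records and field names are hypothetical:

```python
from collections import Counter

# Hypothetical event records, standing in for the information on each
# home appliance received by the communication means (1101).
events = [
    {"house_id": "A", "appliance": "washing machine", "category": "cleaning"},
    {"house_id": "A", "appliance": "vacuum cleaner", "category": "cleaning"},
    {"house_id": "B", "appliance": "rice cooker", "category": "cooking"},
]

def aggregate_by_category(events):
    """Count the received events per category, as the display format
    determination means (1102) might do before computing display sizes."""
    return Counter(e["category"] for e in events)

counts = aggregate_by_category(events)
```

The per-category counts could then feed the display size and display position calculation for each object.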
- the display format determination means (1102) may aggregate the received information regarding each of the plurality of home appliances for each category determined based on, for example, geographical information including the area, and information on the household of the home appliance user including the family structure, number of people, and residence form.
- the display format determination means (1102) may further estimate, for example, the operation rate for each type of the plurality of home appliances based on the number of receptions of information regarding each of the plurality of home appliances received by the communication means (1101), and may calculate the display format information based on the operation rate.
- the display format determination means (1102) calculates display format information indicating a specific display format that is independent of the screen size of the display device and includes the relative display position and display size on the screen.
- the display format determination means (1102) extracts event information desired by the user from the home appliance DB group (1103) based on, for example, the setting information of the user who owns the display device 1 (1510), and forms the information of each house into screen information based on the event information type and frequency information.
- the display format determination means (1102) provides the formed screen information via the communication means (1101) in response to a request from the display device 1 (1510).
- the display format determination unit (1102) periodically provides screen update information to the display device 1 (1510) based on the event information type and frequency information.
- Display device 1 includes display means (1511) and communication means (1512).
- the display device 1 (1510) may be any device capable of displaying screen information, such as a television, a smartphone, or a PC.
- Display means (1511) displays information on a screen (display screen). More specifically, based on the display information transmitted by the server (1100), the display means (1511) displays the object corresponding to the home appliance, including the filtered information regarding one or more home appliances, in the specific display format on the display device 1 (1510).
- the communication means (1512) performs acquisition of the user interface displayed on the screen and reception of display content update notifications.
- Display device 1 includes, for example, a second memory and a CPU as a hardware configuration.
- the second memory stores, for example, a program that functions as display means (1511) and a program that functions as communication means (1512).
- the second memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the display device 1 is configured such that the display means (1511) and the communication means (1512) function by reading the above-described programs from the second memory and causing the CPU to execute them.
- that is, the CPU executes the program that functions as the display means (1511) and the program that functions as the communication means (1512).
- the present invention is not limited thereto.
- a dedicated signal processing circuit that functions as the display means (1511) and a dedicated signal processing circuit that functions as the communication means (1512) may be used.
- a program that allows one of the display means (1511) and the communication means (1512) to function may be stored in the second memory, and the other may be configured using a dedicated signal processing circuit.
- the GW1 (1301) and the GW2 (1302) may be, for example, wireless LAN access points connected to each device by power-saving radio or the like, or may be dedicated GWs connected to the server (1100) via a public network (1200) such as the Internet.
- FIG. 2 is a diagram illustrating an example of house IDs and attribute information held by the home appliance DB group (1103) according to the first embodiment. FIG. 2 shows a house ID (2001), a nickname (2002), and owned home appliances (2003) as an example of the information held in the home appliance DB group (1103) of FIG. 1.
- the house ID (2001) holds an identifier that can uniquely identify the house A (1210) and the house B (1211), and the nickname (2002) holds a name set for each house.
- the owned home appliance (2003) holds home appliances owned by each home such as home appliance 1 (1401) and home appliance 4 (1404).
- FIG. 3 is a diagram illustrating an example of an event log including home appliance state changes and user operations held by the home appliance DB group according to the first embodiment.
- FIG. 3 shows an example of the event log stored in the home appliance DB group (1103); the event information of home appliances transmitted from the house A (1210) and the house B (1211) in FIG. 1 is stored here.
- the house ID (2001) stores an identifier (identification ID) for uniquely specifying the house A (1210) and the house B (1211) as in FIG.
- the home appliance type (2202) stores the type of the home appliance that generated the event stored as event information.
- the event key (2203) stores the reason why event information such as a user operation on the home appliance and a state change of the home appliance has been notified.
- the event value (2204) stores the contents of user operations and home appliance state changes.
- the date (2205) and time (2206) store the date and the time (hour, minute, second, and millisecond) at which the event in the corresponding event information row occurred.
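For illustration, one row of the event log described above could be modeled as the following record; the class and field names are hypothetical and only mirror the columns of FIG. 3:

```python
from dataclasses import dataclass

@dataclass
class EventLogRow:
    house_id: str        # house ID (2001)
    appliance_type: str  # home appliance type (2202)
    event_key: str       # event key (2203): reason for the notification
    event_value: str     # event value (2204): operation / state change content
    date: str            # date (2205)
    time: str            # time (2206), down to the millisecond

# hypothetical example row
row = EventLogRow("H001", "washing machine", "power", "ON",
                  "2014-05-12", "08:30:15.123")
```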
- FIG. 4A is a diagram illustrating an example of screen information of the information sharing service according to the first embodiment.
- FIG. 4B is a diagram illustrating an example of the configuration of the display device according to Embodiment 1.
- FIG. 4C is a diagram illustrating an example of the configuration of the server according to Embodiment 1.
- the screen information shown in FIG. 4A is an example of information shared on a display screen displayed in a specific display format.
- the information on the operating state (state information) of the home appliances of each house that the user wants to see is represented by icons, small images of the appearance of the home appliances, superimposed on a background image that is a map.
- for example, the house icon (1254) indicates that a heating device is used the most (MAX) among the plurality of home appliances in that house. When users of a community sharing the screen information of FIG. 4A wish to see heating usage, they can choose to display only the houses that use heating equipment and hide the houses that do not.
- the message (1253) shown in FIG. 4A is linked to postings from other SNS sites and the like via the time axis and a communication API, so that the messages of each house can be displayed together.
- when the user side device (display device or server) has an information input device as shown in the lower part of FIG. 4B or FIG. 4C, the user can input a message in this service, and the information can be displayed in association with the event.
- a voice input device may be mounted on the user side device, and user voice data may be input.
- a moving image input device may be mounted on the user side device to share moving images. If a voice recognition function is available, voice data can also be displayed as text.
- which houses are selected for display on the display screen of the display device is determined from the information of the transmitting-side user who discloses information and the setting information of the viewer-side user. For example, if the transmitting-side user wants to showcase cooking and more than a certain number of cooking home appliance events have occurred, the priority of that house increases in the selection of houses displayed on the receiving side, and its cooking home appliances are displayed larger or in a more easily visible position than other home appliances. The selection may also be made based on the preferences and setting information of the viewer-side user. Of course, the transmitting-side user and the viewer-side user may be the same person. Moreover, when the information set on the transmitting side gives priority to cooking-related housework, the screen information at the time of browsing can also be displayed according to that priority.
- FIG. 5 is a flowchart illustrating an example of UI acquisition processing of the display device according to the first embodiment.
- FIG. 5 shows processing until the display apparatus 1 (1510) acquires a UI from the server.
- S2301 to S2304 and S2308 to S2310 are processes in the display device 1 (1510), and S2305 to S2307 are processes in the server (1100).
- in S2302, it is determined whether or not an event has been notified from the server (1100) to the communication means (1512) of the display device 1 (1510). If there is an event notification (YES in S2302), S2304 is performed; if there is no event notification (NO in S2302), S2303 is performed.
- the display device 1 determines whether or not there is a user operation for UI acquisition. If there is a user operation (YES in S2303), S2304 is performed. If there is no user operation (NO in S2303), the process returns to the determination in S2302.
- the communication means (1512) of the display device 1 requests a UI corresponding to the event or user operation from the server (1100) via the public network (1200).
- the communication unit 1101 of the server (1100) receives a UI request from the display device 1 (1510).
- the display format determination unit (1102) of the server (1100) acquires necessary information from the home appliance DB group (1103) according to the content of the received UI request, and forms a UI.
- the display format determination unit (1102) transmits the formed UI to the display device 1 (1510) via the communication unit 1101 of the server (1100).
- the communication unit (1512) of the display device 1 determines whether or not a UI has been received from the server (1100). If a UI is received (YES in S2308), S2309 is executed. If no UI is received (NO in S2308), the determination in S2308 is executed again.
- the display means (1511) of the display device 1 displays the received UI on the screen.
- the processing flow for obtaining the UI from the server (1100) by the display device 1 (1510) is terminated.
- the server (1100) may hold a UI displayed on the display device 1 (1510).
- FIG. 6 is a flowchart illustrating an example of processing until the home appliance in Embodiment 1 notifies the server of an event.
- FIG. 6 illustrates processing until the home appliance 1 (1401) notifies the server (1100) of an event.
- S2321 to S2324 described later are processes in the home appliance 1
- S2325 to S2327 are processes in the server (1100).
- the home appliance 1 (1401) starts a processing flow for notifying the server (1100) of an event.
- the home appliance 1 (1401) determines whether or not the value of the sensor mounted on itself has changed. Here, if there is a change (YES in S2322), S2324 is executed, and if there is no change (NO in S2322), S2323 is executed.
- the home appliance 1 (1401) determines whether or not its own operating state has changed. If there is a change (YES in S2323), S2324 is performed; if there is no change (NO in S2323), S2322 is performed.
- the home appliance 1 (1401) transmits an event related to a sensor value or an operation state change to the server (1100) via the GW1 (1301) and the public network (1200).
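The appliance-side flow of FIG. 6 (S2322 to S2324) can be sketched as follows. This is an illustrative simplification only, and the function and field names are hypothetical:

```python
def poll_once(sensor_value, state_value, last, send_event):
    """One iteration of the appliance-side flow in FIG. 6: notify the
    server only when the sensor value (S2322) or the operating state
    (S2323) has changed, then remember the latest values."""
    if sensor_value != last["sensor"]:                         # S2322
        send_event({"kind": "sensor", "value": sensor_value})  # S2324
    elif state_value != last["state"]:                         # S2323
        send_event({"kind": "state", "value": state_value})    # S2324
    last["sensor"], last["state"] = sensor_value, state_value

sent = []
last = {"sensor": 20, "state": "OFF"}
poll_once(20, "OFF", last, sent.append)  # no change: nothing sent
poll_once(21, "OFF", last, sent.append)  # sensor changed: one event
poll_once(21, "ON", last, sent.append)   # state changed: one event
```

In the actual system, `send_event` would correspond to transmission to the server (1100) via the GW1 (1301) and the public network (1200).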
- the communication unit 1101 of the server (1100) receives an event from the home appliance 1 (1401).
- the server (1100) stores the received event in the home appliance DB group (1103).
- FIG. 7 is a flowchart illustrating an example of processing in which the server according to the first embodiment notifies the display device of UI update.
- FIG. 7 illustrates a process in which the server (1100) notifies the display apparatus 1 (1510) of UI update.
- S2331 to S2335 described later are processes in the server (1100), and S2336 to S2338 are processes in the display device 1 (1510).
- the server (1100) starts a processing flow for notifying the display apparatus 1 (1510) of UI update.
- the communication means (1101) of the server (1100) determines whether an event has been received from the home appliance. Here, if received (YES in S2332), S2333 is performed, and if not received (NO in S2332), S2332 is performed again.
- the server (1100) stores the received event in the home appliance DB group (1103).
- for example, the received event is stored in the event log shown in FIG. 3.
- the display format determination means (1102) determines whether or not the display content on the display device 1 (1510) needs to be updated from the event log accumulated in the home appliance DB group (1103). If updating is necessary (YES in S2334), S2335 is performed, and if updating is not necessary (NO in S2334), S2332 is performed.
- the server (1100) transmits a UI update notification to the display device 1 (1510) via the communication means (1101).
- the communication unit (1512) of the display device 1 determines whether or not a UI update notification has been received from the server (1100). Here, if it is received (YES in S2336), S2337 is performed, and if it is not received (NO in S2336), the determination in S2336 is performed again.
- the display means (1511) of the display device 1 displays the received UI on the screen.
- FIG. 8 is a flowchart showing detailed processing of S2333, S2334, and S2335 of FIG. 7, which are performed by the server (1100) as described in FIG. 7. In S2333 of FIG. 7, the processing of S2333a and S2333b of FIG. 8 is performed; in S2334 of FIG. 7, the processing of S2334a and S2334b of FIG. 8 is performed; and in S2335 of FIG. 7, the processing of S2335a, S2335b, S2335c, and S2335d of FIG. 8 is performed.
- the communication means (1101) of the server (1100) receives an event from the home appliance of each house.
- the server (1100) stores the received event in the home appliance DB group (1103).
- the server (1100) counts up the number of events in the home appliance event count table of the corresponding region for the same home appliance type as that of the received event. For example, based on the house ID (2001) of the event received from the home appliance as shown in FIG. 3, the server (1100) identifies the region by referring to the address (2004) in the house DB shown in FIG. 9.
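As a minimal sketch of this counting step, assuming a hypothetical in-memory house DB and count table:

```python
# Hypothetical house DB: house ID -> attribute information (address (2004))
house_db = {"H001": {"address": "Osaka"}, "H002": {"address": "Tokyo"}}

# Home appliance event count table: (region, appliance type) -> event count
event_counts = {}

def count_event(house_id, appliance_type):
    """Resolve the region via the house DB, then count up the event for
    that region and home appliance type."""
    region = house_db[house_id]["address"]
    key = (region, appliance_type)
    event_counts[key] = event_counts.get(key, 0) + 1

count_event("H001", "washing machine")
count_event("H001", "washing machine")
count_event("H002", "washing machine")
```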
- FIG. 9 is a diagram illustrating an example of information stored in the house DB according to the first embodiment.
- the house DB shown in FIG. 9 is a DB held in the home appliance DB group (1103), and stores, for each house ID (2001), attribute information of an address (2004), the number of household members (2005), and a residence form (2006).
- the server (1100) determines whether or not a predetermined time set as the processing timing has elapsed. If the predetermined time has elapsed (YES in S2334b), the process proceeds to S2335a. If the predetermined time has not elapsed (NO in S2334b), the process ends.
- the server (1100) calculates the home appliance type operation rate using the number of events in the home appliance event count table and the occurrence frequency in the home appliance event occurrence frequency list.
- FIG. 10 is a diagram showing an example of a home appliance event summary table in Osaka Prefecture in the first embodiment.
- the home appliance event tabulation table shown in FIG. 10 is tabulated for each category, such as geographical information (e.g., area), the number of family members, and the residence form.
- FIG. 11 is a diagram showing an example of a home appliance event occurrence frequency list in the first embodiment.
- the home appliance event occurrence frequency list shown in FIG. 11 lists, for each home appliance type, the home appliance model number and its event occurrence frequency.
- the event occurrence frequency differs for each home appliance or home appliance model number, so the operating status cannot be calculated simply by adding up the numbers of events. Therefore, the event occurrence frequency characteristic per unit time of each home appliance is defined using the home appliance event occurrence frequency list shown in FIG. 11. This makes it possible to calculate the home appliance operating state more accurately using only the number of events transmitted from each home appliance.
- FIG. 12 is a diagram illustrating an example of an operation rate for each home appliance type according to the first embodiment.
- FIG. 12 shows the model number operating rate as a calculation process when calculating the operating rate for each home appliance type.
- the server (1100) calculates the model number operation rate (2603) for each model number (2602) shown in FIG. 12, using the number of events (2504) in the home appliance event count table (Osaka) shown in FIG. 10 and the occurrence frequency (2553) in the home appliance event occurrence frequency list shown in FIG. 11. Thereafter, the server (1100) calculates the home appliance type operation rate (2604) by summing the model number operation rates for the same home appliance type (2601).
- that is, the model number operation rate (2603) can be calculated as (number of events in the home appliance event count table) / (occurrence frequency in the home appliance event occurrence frequency list), and the home appliance type operation rate (2604) can be calculated as the sum of the model number operation rates for the same home appliance type.
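These two formulas can be sketched as follows. The event counts and occurrence frequencies are hypothetical figures, chosen only so that the washing machine's type operation rate matches the 198.75 used as the maximum in this embodiment's example:

```python
def model_operation_rate(event_count, occurrence_frequency):
    # model number operation rate (2603) =
    #   (number of events in the count table) / (occurrence frequency)
    return event_count / occurrence_frequency

def type_operation_rate(models):
    # home appliance type operation rate (2604) = sum of the model
    # number operation rates for the same home appliance type (2601)
    return sum(model_operation_rate(c, f) for c, f in models)

# hypothetical (event count, occurrence frequency) pairs for two
# washing machine model numbers
washing_machine = [(300, 4), (495, 4)]
rate = type_operation_rate(washing_machine)  # 75.0 + 123.75 = 198.75
```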
- the server (1100) calculates the display change notification content to the display means (1511) from the calculated household appliance type operation rate (2604).
- the server (1100) calculates the display size ratio (2652) shown in FIG. 13 as the display change notification content.
- FIG. 13 is a diagram showing an example of the content of the display change notification to the display means (1511) in the first embodiment.
- the display size ratio (2652) shown in FIG. 13 is a value calculated by (home appliance type operation rate) / (maximum value of the home appliance type operation rate), that is, the home appliance type operation rate (2604) of FIG. 12 normalized to 1 or less.
- This display size ratio (2652) can be calculated by dividing each home appliance type operation rate by the maximum value of the home appliance type operation rate (2604).
- in this example, the home appliance type operation rate of 198.75 for the washing machine is the maximum value.
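The normalization that yields the display size ratio (2652) can be sketched as follows; the vacuum cleaner figure is a hypothetical value for illustration:

```python
def display_size_ratios(type_rates):
    """Display size ratio (2652): each home appliance type operation
    rate divided by the maximum type operation rate, so the most used
    type gets ratio 1.0."""
    maximum = max(type_rates.values())
    return {name: rate / maximum for name, rate in type_rates.items()}

ratios = display_size_ratios({
    "washing machine": 198.75,  # the maximum in this example
    "vacuum cleaner": 99.375,   # hypothetical figure
})
```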
- the communication unit (1101) of the server (1100) notifies the display unit (1511) of the display change notification content.
- the communication means (1101) notifies the display size ratio (2652) shown in FIG. 13 as the display change notification content.
- the server (1100) sets all the number of events and the total number of events in the home appliance event summary table to 0, and ends the process.
- for example, the server (1100) sets the number of events (2503) and the total number of events (2504) in the home appliance event summary table shown in FIG. 10 to 0.
- in the determination process of S2334b, it may be determined whether a certain number of events has been received, in addition to determining whether a certain time has elapsed.
- the display means (1511) of the display device 1 updates the display, for example, from the UI before the display update notification shown in FIG. 14A to the UI after the display update notification shown in FIG. 14B. FIG. 14A is a diagram showing an example of the UI displayed by the display means (1511) before the processing in FIG. 8 is performed, and FIG. 14B is a diagram showing an example of the UI displayed by the display means after the processing shown in FIG. 8 is performed.
- the home appliance icon 2701, home appliance icon 2702, home appliance icon 2703, home appliance icon 2704, and home appliance icon 2705 in the UI before the display update notification are updated with the display sizes determined based on the display size ratio (2652) shown in FIG. 13.
- An icon 2706 indicates the nameplate of the house, and shows by which category the display sizes of the home appliance icons surrounding it are determined.
- in Embodiment 2, a mode of displaying a UI on a display device such as the display device 1 (1510) or the display device 2 (1520) will be described.
- in the present embodiment, various information that can be acquired from home appliances, such as operation information and state change information resulting from user operations on the home appliances, is collected as events, and related home appliance information is grouped into categories.
- this realizes a UI display method that is easy for the user to view while displaying as much information as possible in the limited screen area of the display device.
- description of components common to Embodiment 1 is not repeated.
- FIG. 15 is a diagram illustrating an example of the configuration of the information providing system in the present embodiment. The differences from FIG. 1 are that the server (3001) shown in FIG. 15 includes a home appliance category DB (3004) as an example of the home appliance DB group (1103), and the configuration of the display format determination means (1102A).
- the server (3001) (in particular, the communication unit (1101) and the display format determination unit (1102A)) includes, for example, a first memory and a CPU as a hardware configuration.
- the first memory stores, for example, a program that functions as the communication means (1101) and a program that functions as the display format determination means (1102A).
- the first memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the server (3001) is configured such that the communication unit (1101) and the display format determination unit (1102A) function by, for example, reading the above-described programs from the first memory and causing the CPU to execute them.
- that is, the CPU executes the program that functions as the communication means (1101) and the program that functions as the display format determination means (1102A).
- the present invention is not limited to this.
- a dedicated signal processing circuit that functions as a communication unit (1101) and a dedicated signal processing circuit that functions as a display format determination unit (1102A) may be used.
- a program that allows one of the communication unit (1101) and the display format determination unit (1102A) to function may be stored in the first memory, and the other may be configured using a dedicated signal processing circuit.
- the home appliance category DB (3004) is stored in a memory (not shown).
- the memory that stores the home appliance category DB (3004) is, for example, a readable and writable recording medium. Examples of such a recording medium include a semiconductor memory, an optical disk, and a hard disk.
- FIG. 16 is a diagram illustrating an example of information stored in the home appliance category DB according to the second embodiment.
- the home appliance category DB (3004) is an example of the home appliance DB group (1103), and stores information for collectively displaying home appliance icons for each category.
- the home appliance category DB (3004) is defined for each house ID shown in FIG. 2. Therefore, the information (data) stored in the home appliance category DB (3004) can be acquired using the house ID as a search key.
- the home appliance category DB (3004) stores information related to each of a plurality of home appliances received by the communication means (1101). As shown in FIG. 16, the information stored in the home appliance category DB (3004) includes a category ID (3011), a category display priority (3012), a home appliance type (3013), a home appliance operation rate (3014), and an operation status ( 3015) and home appliance icon display priority (3016).
- Category ID (3011) indicates the category type to which the home appliance owned by the user belongs.
- in the example shown in FIG. 16, the category IDs include cleaning (3017), cooking (3018), and beauty (3019).
- the home appliance type (3013) defines the type of home appliance, and is equivalent to what is defined by the home appliance type (2202) of FIG. 3.
- home appliances related to cleaning belong to the category ID of cleaning (3017); in the example shown in FIG. 16, a vacuum cleaner and a washing machine, among the user-owned home appliances classified by the home appliance type (3013), belong to this category.
- home appliances related to cooking belong to the category ID of cooking (3018); in the example shown in FIG. 16, the IH cooking heater and the rice cooker belong to this category.
- home appliances related to beauty belong to the category ID of beauty (3019); in the example shown in FIG. 16, beauty steamers, hair dryers, and the like belong to this category.
- the category ID described above is an example, and may be determined for each home appliance in advance, or may be arbitrarily set by the user.
- the category display priority (3012) indicates the display priority for each category type.
- the category display priority 1 is defined as the highest priority, with larger numbers indicating lower priority.
- the household appliance operation rate (3014) indicates the average number of times the household appliance is used per day.
- the operating status (3015) indicates the current operating status of the home appliance. In FIG. 16, ON indicates that the home appliance is in operation, and OFF indicates that it is not in operation.
- the home appliance icon display priority (3016) indicates the display priority of the home appliance icon within the category indicated by the category ID (3011). Here, the home appliance icon display priority 1 is defined as the highest priority, with larger numbers indicating lower priority.
- the display format determination means (1102A) generates display information for the display device to display in a specific display format in which one or more home appliances having the same category ID are displayed as a group on the display screen of the display device, and the information regarding home appliances belonging to a category ID with a higher category display priority is displayed larger on the display screen.
- the display format determination means (1102A) displays one or more home appliances having the same category ID on the display screen as a group, and the information related to the plurality of home appliances belonging to the category ID having a higher category display priority. Display information to be displayed by the display device in a specific display format that is largely displayed on the display screen is generated.
- the display format determination means (1102A) also generates display information that causes the display device to display, on the display screen, icons indicating one or more home appliances having the same category ID based on the home appliance display priority, in a specific display format in which the icons corresponding to home appliances with a higher home appliance display priority are displayed larger.
- the display format determination means (1102A) includes a category display size determination unit (3002), a home appliance icon display size determination unit (3003), and a home appliance category DB update unit (3005).
- a display device will be described using a display device 1 (1510) shown in FIG.
- the category display size determination unit (3002) acquires the category display priority (3012) shown in FIG. 16, and determines the display size and display position of the category type indicated by each category ID.
- the home appliance icon display size determination unit (3003) determines the display size of the home appliance icons for each category type and their display positions within the category, based on the display size for each category type determined by the category display size determination unit (3002) and the home appliance icon display priority (3016) in the home appliance category DB (3004).
- the home appliance category DB update unit (3005) assigns a category ID indicating a category type to information regarding each of the plurality of home appliances received by the communication unit (1101).
- the home appliance category DB update unit (3005) determines a category display priority indicating the display priority of each category type and a home appliance display priority indicating the display priority of each of the plurality of home appliances, based on the information on each of the plurality of home appliances received by the communication unit (1101) and the information on each home appliance stored in the home appliance category DB (3004).
- the home appliance category DB update unit (3005) stores the determined category display priority and home appliance display priority in the home appliance category DB 3004.
- the home appliance category DB update unit (3005) performs a process of generating display information including the assigned category ID, the determined category display priority, and the home appliance display priority.
- the home appliance category DB update unit (3005) may determine the category display priority according to the sum of the usage frequencies of one or more home appliances having the same category ID. Further, when the user uses a home appliance whose usage frequency is at or below a predetermined value, the home appliance category DB update unit (3005) may temporarily set the category display priority of the category ID to which that home appliance belongs higher than its predefined value. Further, the home appliance category DB update unit (3005) may assign a category ID indicating the same category type to a plurality of home appliances that are used more than a preset usage frequency in a preset time zone.
- the home appliance category DB update unit (3005) updates the information of the home appliance category DB (3004).
- for example, when the operating status of a home appliance changes or when a new home appliance is connected to the server (3001), the home appliance category DB update unit (3005) adds or updates, keyed by the category ID (3011) of the home appliance category DB (3004), the information on the home appliance whose operating status has changed or on the new home appliance.
- the home appliance category DB update unit (3005) issues a UI update notification when the information in the home appliance category DB (3004) is updated. For example, the home appliance category DB update unit (3005) notifies the UI update to the display device 1 (1510) associated with the home appliance whose operating status has changed, or to the display device 1 (1510) newly connected to the server (3001).
- when notified of a UI acquisition request from the display device 1 (1510), the home appliance category DB update unit (3005) acquires the display size and display position of each category type from the category display size determination unit (3002) and the display size and display position of each home appliance icon from the home appliance icon display size determination unit (3003), and returns them to the display device 1 (1510).
- the display size is given as the product of a width and a height (in pixels) in the coordinate system (for example, a two-dimensional XY coordinate system) in which the UI is displayed on the screen (display screen) of the display device 1 (1510).
- the display position is given as a coordinate position on the screen.
- FIG. 17 is a flowchart illustrating processing in which the server according to Embodiment 2 determines the category type and the display size.
- FIG. 18 is a diagram illustrating an example of a UI displayed by the display unit according to the second embodiment.
- FIG. 18 shows an example of a UI displayed on the display device 1 (1510) as a result of performing the processing of FIG.
- a category display area (3031), which is the broken-line rectangular area surrounding a house icon (3030) indicating the house to which a specific house ID (2001) is assigned, is displayed. The rounded rectangular areas included in the category display area (3031) are the individual category display areas (3032) to (3039), each corresponding to a category ID.
- home appliance icons of home appliances corresponding to the category ID are displayed together.
- FIGS. 18A and 18B show an example in which the display format is changed by the process shown in FIG. 17; the details are described later.
- a broken-line rectangle indicating the category display area (3031) and a rounded rectangular line indicating the individual category display area (3032) to the individual category display area (3039) are displayed for convenience of description of the present embodiment.
- the important point is that the size and display position of the individual category display area in the category display area (3031) are determined.
- as many home appliance icons as possible are displayed in the limited category display area (3031), and by grouping the home appliance icons by category type, the information can be presented in a manner that is easy for the user to understand.
- in step S3020, the server (3001) is notified of a UI acquisition request from the display device 1 (1510).
- the server (3001) calculates the display size and display position of each category type.
- the category display size determination unit (3002) acquires the category display priority (3012) for each category ID (3011) from the home appliance category DB (3004), and determines the display sizes so that categories with a higher category display priority (3012) are given larger display sizes, in descending order of priority.
- the display size can be determined using a calculation formula with priority coefficients a, b, c, … (0.5 ≤ a, b, c, … < 1.0). For example, the display size Sa of cooking (3018), the highest-priority category ID among the category IDs (3011) shown in FIG. 16, can be determined as Sa = SIZE × a, and the display size Sb of the second-priority category ID as Sb = Sa × b, and so on.
- when multiple category IDs share the same priority, the priority coefficient for that priority may be divided equally among them. An example of this case is shown in FIG. 18A: the individual category area (3032) and the individual category area (3033) have the same size because their category priorities are equal.
- the display width and display height of the category display area (3031) may be used as an alternative to the area (SIZE) of the category display area (3031). Here, SIZE denotes the area of the category display area.
- the category display size determination unit (3002) determines the display positions so that the individual category area with the highest category display priority is placed at the center of the category display area (3031). Then, the category display size determination unit (3002) determines the coordinate positions so that the remaining areas are arranged alternately to the right and left in order of priority.
- the individual category display area (3034) corresponding to category priority 1 is displayed at the center of the category display area (3031) in the largest size.
- the individual category display area (3035) corresponding to category priority 2 is displayed to the right of the individual category display area (3034) in the next largest size.
- the individual category display area (3036) corresponding to category priority 3 is displayed to the left of the individual category display area (3034) in the smallest size. Note that the coordinate positions may instead be determined so that the areas are arranged in order of category priority from the center, then to the left and right.
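The size determination of S3021 (Sa = SIZE × a, Sb = Sa × b, …) and the center-then-alternating placement described above can be sketched as follows; the function name, the slot encoding, and the concrete coefficient values are illustrative assumptions, not part of the embodiment.

```python
def layout_categories(total_size, coeffs, priorities):
    """Sketch of S3021: assign display sizes and placement slots to categories.

    total_size -- area SIZE of the category display area (3031)
    coeffs     -- priority coefficients a, b, c, ... (each 0.5 <= x < 1.0)
    priorities -- {category_id: category display priority}, 1 = highest
    Returns {category_id: (size, slot)} where slot 0 is the center and the
    remaining slots alternate right (+1), left (-1), right (+2), ... by priority.
    """
    ordered = sorted(priorities, key=priorities.get)  # priority 1 first
    sizes = {}
    prev = total_size
    for cid, c in zip(ordered, coeffs):
        prev = prev * c          # Sa = SIZE*a, Sb = Sa*b, Sc = Sb*c, ...
        sizes[cid] = prev
    result = {}
    for i, cid in enumerate(ordered):
        # 0, +1, -1, +2, -2, ...: center, then alternately right and left
        slot = 0 if i == 0 else ((i + 1) // 2) * (1 if i % 2 else -1)
        result[cid] = (sizes[cid], slot)
    return result

# Three categories as in FIG. 16; cooking (priority 1) gets the largest
# size and the center slot, cleaning goes to the right, beauty to the left.
layout = layout_categories(100.0, [0.6, 0.8, 0.7],
                           {"cooking": 1, "cleaning": 2, "beauty": 3})
```

A slot value here stands in for a concrete coordinate position; mapping slots to pixel coordinates would depend on the display device's screen geometry.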
- as described above, the server (3001) calculates the display size and display position of the home appliance icons of the home appliances belonging to the same category, based on the display size and display position for each category determined in S3021.
- the home appliance icon display size determination unit (3003) determines the display size of the home appliance icons for each category type and their display positions within the category, based on the display size for each category type determined by the category display size determination unit (3002) and the home appliance icon display priority (3016) in the home appliance category DB (3004).
- the home appliance icon display size determination unit (3003) can determine the display size and display position of the home appliance icon using a calculation formula similar to the calculation formula described in S3021.
- the display size and display position of the home appliance icons can be determined by replacing the display sizes (Sa, Sb, Sc, …) of the individual category areas in the calculation formula used in S3021 with the display sizes of the home appliance icons, and replacing the category priority coefficients (a, b, c, …) with home appliance priority coefficients a′, b′, c′, … (0.5 ≤ a′, b′, c′, … < 1.0).
- by using this calculation formula, the home appliance icon display size determination unit (3003) can calculate the display sizes of the home appliance icons in each category in descending order of the home appliance icon display priority (3016). Note that the calculation formula need not be used for the display size of the home appliance icons; the icons in a category may all be given the same size, obtained by dividing the category's size by the number of home appliances belonging to it.
- the home appliance icon display size determining unit (3003) determines the display positions of the home appliance icons so that the icon with the highest home appliance display priority (3016) is placed at the center of the category display area (3031). Thereafter, the home appliance icon display size determination unit (3003) determines the coordinate positions so that the remaining icons are arranged alternately to the right and left in order of home appliance display priority (3016). Alternatively, the coordinate positions may be determined so that the icons are arranged from the center outward in order of priority.
- the server (3001) generates a UI reflecting each determined category and the display size and display position of the home appliance icon, passes it to the display device 1 (1510), and completes the process.
- home appliance icons can be displayed grouped by category, and high-priority individual category display areas and home appliance icons can be displayed at larger sizes.
- the home appliance category DB update unit (3005) can also change the category display priority (3012) and the home appliance icon display priority (3016) according to the operating status of the home appliances.
- the home appliance category DB (3004) may be updated using the home appliance operation rate (3014) so that categories with a higher average home appliance operation rate are given higher priority.
- home appliance icons for frequently used home appliances can thereby be displayed prominently and at a larger size.
- when there is a home appliance whose operating status (3015) is ON among home appliances whose home appliance operation rate (3014) is at or below a predetermined value, the home appliance category DB update unit (3005) may raise the home appliance icon display priority of that home appliance and the category display priority of the category to which it belongs, and set them in the home appliance category DB (3004).
- FIG. 19 is a diagram illustrating an example of information stored in the home appliance category DB according to the second embodiment.
- the home appliance category DB update unit (3005) updates the information in the home appliance category DB (3004) so as to temporarily raise the home appliance icon display priority of the beauty steamer and the category display priority of the category ID to which the beauty steamer belongs, as shown in FIG. 19.
- an example in this case is shown in FIG. 18. That is, when the screen shown in (b) of FIG. 18 is displayed and the beauty steamer, a home appliance with a low operation rate, is operated and given high priority, the screen is updated to the one shown in (c) of FIG. 18.
- the individual category display area (3037) corresponding to the updated category priority 1, to which the operated beauty steamer belongs, is displayed at the center of the category display area (3031) in the largest size.
- the individual category display area (3038) corresponding to the updated category priority 2 is displayed to the right of the individual category display area (3037) in the next largest size.
- the individual category display area (3039) corresponding to the updated category priority 3 is displayed to the left of the individual category display area (3037) in the smallest size.
- the home appliance icon corresponding to the operated beauty steamer is displayed largest in the individual category display area (3037).
- the home appliance category DB update unit (3005) may update the home appliance category DB (3004) by assigning the same category ID to home appliances that are used more than a preset usage frequency during a preset time zone, based on the home appliance operation rate (3014). In this way, home appliances operating in the same time zone can be displayed together, making it easy for the user to visually recognize their information.
- the category display size determining unit (3002) and the home appliance icon display size determining unit (3003) may be provided as display means of a display device such as the display device 1 (1510) or the display device 2 (1520). With such a configuration, the server's processing load for UI display can be distributed to the display devices.
- in this case, the display unit of the display device may display one or more home appliances having the same category ID as a group on the display screen, displaying the information on the plurality of home appliances belonging to a category ID with a higher category display priority larger on the display screen.
- the display means of the display device may also display, on the display screen, icons indicating one or more home appliances having the same category ID based on the home appliance display priority, displaying larger the icons corresponding to home appliances with a higher home appliance display priority.
- FIG. 20 is a diagram illustrating an example of the configuration of the display format determination unit in the third embodiment.
- the display format determination unit (1102) shown in FIG. 20 includes a home appliance operation time prediction unit (4001) and a display index value calculation unit (4002) based on user operation time.
- the home appliance operating time predicting means (4001) predicts the home appliance operating time, which is the time during which each of the plurality of home appliances operates, based on, for example, the information on each of the plurality of home appliances received by the communication means (1101).
- the home appliance operating time predicting means (4001) uses, for example, the event log accumulated in the home appliance DB group (1103) to predict the operating time of the home appliance of the home appliance type received in the latest event.
- the display index value calculation unit (4002) based on user operation time calculates a display index value for determining the display size and display position of the home appliance icon, the object corresponding to a home appliance, for the one or more home appliances filtered by weighting the home appliance operating time by the user operation time.
- the display index value calculation unit (4002) based on user operation time stores and manages a home appliance operation ratio DB, which will be described later, for example, in a home appliance DB group (1103).
- the display index value calculation unit (4002) based on user operation time calculates a display index value for determining the display size and display position for the latest event, using the home appliance operating time prediction unit (4001) and the home appliance operation ratio DB.
- FIG. 21 is a flowchart illustrating an example of processing in which the home appliance operating time predicting unit according to Embodiment 3 predicts the operating time of home appliances.
- the home appliance operating time predicting means (4001) starts a process of predicting the operating time.
- the home appliance operating time predicting means (4001) acquires the latest event. Specifically, it acquires the latest event from the information indicating home appliance state changes and user-operation event values held by the home appliance DB group (1103).
- in step S4103, the home appliance operation time prediction unit (4001) refers to the event key and event value of the acquired event, and determines whether the event key is “power” and the event value is “ON” (S4103). If the acquired event does not satisfy this condition (NO in S4103), the unit again waits for the arrival of a new event. On the other hand, if the condition is satisfied (YES in S4103), the process proceeds to S4104.
- the home appliance operating time predicting means (4001) searches the event log (FIG. 3) held by the home appliance DB group, going back in time, for past events having the same house ID, home appliance type, event key, and event value as the acquired event.
- the home appliance operating time predicting means (4001) confirms whether the same event as the acquired event has existed in the past.
- the process proceeds to S4108, and an operation time default value for each home appliance type is set.
- the process proceeds to S4106.
- the home appliance operating time prediction means (4001) searches for a past event that has the same house ID, home appliance type, and event key as the acquired event and whose event value is “OFF”, at a time after the searched past event.
- the home appliance operating time predicting means (4001) checks whether a past event matching these conditions exists at a time after the searched past event. If there is no past event satisfying this condition (NO in S4107), the process proceeds to S4108 to set the default operating time value for each home appliance type. On the other hand, if there is a past event that satisfies this condition (YES in S4107), the process advances to S4109.
- the home appliance operating time predicting means (4001) calculates the operating time from the difference between the times when the event value is “ON” and “OFF” in the searched past event.
- the home appliance operating time predicting means (4001) searches for a past event having the same house ID, home appliance type, and event key within X hours before the time at which the event value of the searched past event became “ON”. If no such event exists (NO in S4110), the operating time calculated in S4109 is used as the operating time as is, and the process proceeds to S4112. On the other hand, if such an event exists (YES in S4110), the process proceeds to S4111.
- the home appliance operating time predicting means (4001) calculates and adds the operating time of the newly discovered past event.
- the home appliance operating time predicting means (4001) outputs the operating time calculated in S4111 as the predicted operating time.
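The prediction flow of S4104 to S4111 can be sketched as a scan over a simplified event log; the tuple layout of a log entry, the default value, and the function name are assumptions modeled loosely on FIG. 3, not the embodiment's actual storage format.

```python
from datetime import datetime, timedelta

def predict_operating_time(log, house_id, appliance,
                           lookback_hours=1, default_minutes=30):
    """Sketch of S4104-S4111: sum past ON->OFF intervals of "power" events
    for the given house ID and home appliance type.  Sessions whose ON time
    falls within `lookback_hours` (the "X hours" of S4110) of the latest
    session are accumulated; if no past session exists, a per-type default
    value is used (S4108).
    Each log entry: (timestamp, house_id, appliance, event_key, event_value).
    Returns the predicted operating time as a timedelta.
    """
    events = [e for e in log
              if e[1] == house_id and e[2] == appliance and e[3] == "power"]
    events.sort(key=lambda e: e[0])
    sessions = []                       # completed (on_time, off_time) pairs
    on_time = None
    for ts, _, _, _, value in events:
        if value == "ON":
            on_time = ts
        elif value == "OFF" and on_time is not None:
            sessions.append((on_time, ts))   # S4106/S4109: ON->OFF interval
            on_time = None
    if not sessions:                    # S4108: fall back to a default value
        return timedelta(minutes=default_minutes)
    # S4110/S4111: add up sessions that started within X hours of the latest
    latest_on = sessions[-1][0]
    window = timedelta(hours=lookback_hours)
    return sum((off - on for on, off in sessions
                if latest_on - on <= window), timedelta())
```

With two vacuum-cleaner sessions of 6 and 4 minutes starting 20 minutes apart, as in the FIG. 23 example, this sketch returns a predicted operating time of 10 minutes.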
- FIG. 22 is a diagram illustrating an example of a method for calculating an operation time from an event log held by the home appliance DB group in the third embodiment.
- FIG. 22 shows an example of a method for calculating the operation time of the home appliance from the processing of S4104 to S4109 using the event log shown in FIG. 3 including the state change of the home appliance and the user operation.
- in the event log, entries with the same house ID, the home appliance type “washing machine”, and the event key “power” are searched for the times at which the event value goes from “ON” to “OFF”.
- the event logs shown in rows 2010 and 2130 have the same house ID “H000-0001”, the same home appliance type “washing machine”, and the same event key “power”. Therefore, the time 4203 between time 4201, when the event value is “ON”, and time 4202, when it is “OFF”, can be calculated as the operating time of the washing machine.
- FIG. 23 shows an example of a method for calculating the operation time of the home appliance from the processing of S4110 to S4111 using the event log shown in FIG. 3 including the state change of the home appliance and the user operation.
- the event logs shown in rows 2030 to 2040 and rows 2060 to 2090 have the same house ID “H000-0001”, the same home appliance type “vacuum cleaner”, and the same This is an event log having an event key “power”.
- the sum of the time 4303, between time 4301 when the event value is “ON” and time 4302 when it is “OFF”, and the time 4306, between time 4304 when the event value is “ON” and time 4305 when it is “OFF”, can be calculated as the operating time of the vacuum cleaner.
- FIG. 24 is a diagram illustrating an example of a home appliance operation ratio held by the home appliance DB group in the third embodiment.
- FIG. 24 (b) is a diagram showing an example of the user contact table of the vacuum cleaner in Embodiment 3, and FIG. 24 (c) is a diagram showing an example of the user contact table of the washing machine in Embodiment 3.
- FIG. 24A shows the home appliance operation ratio, which indicates the proportion of time during which the user must operate a home appliance while it is running; this ratio is stored in the home appliance DB group (1103).
- when an event including a home appliance state change or a user operation held in the home appliance DB group (1103) is stored, the display index value calculation unit (4002) calculates the display index that determines the display order on the portal screen, using the home appliance operation ratio shown in (a) of FIG. 24.
- column 4401 indicates the home appliance type
- column 4402 indicates the default user contact ratio.
- the default user contact ratio shown in column 4402 is calculated by dividing the average operation time when operating the home appliance in column 4401 by its operating time (average operation time / operating time). For example, for a device that always requires user operation while it is running, such as the vacuum cleaner shown in row 4410, the default user contact ratio is set to “1”. Further, for a device that requires no user operation while running other than initial setup and pressing the start button, such as the washing machine shown in row 4420, the default user contact ratio is a low value.
- column 4403 indicates whether the function-specific user contact table is to be referred to.
- as the function-specific user contact table, for example, the user contact table of the vacuum cleaner is shown in FIG. 24B, and the user contact table of the washing machine is shown in FIG. 24C.
- column 4405 indicates the operation content for the vacuum cleaner
- column 4406 indicates the essentiality of the user operation for the operation of the vacuum cleaner.
- Column 4407 shows the average operation time for the operation in column 4405.
- row 4440 indicates that user operation is always required while the vacuum cleaner is operating. Since the operating time of the vacuum cleaner can be calculated by the method described above, no average operation time is set.
- row 4450 defines that when a “mode selection” event of the vacuum cleaner is stored in the home appliance DB group (1103), user operation is essential and the average operation time is 10 seconds.
- row 4460 defines that when an “intensity selection” event of the vacuum cleaner is stored in the home appliance DB group (1103), user operation is essential and the average operation time is 10 seconds.
- columns 4408 to 4410 have the same definitions as the corresponding columns described above, and thus the description thereof is omitted.
- row 4470 indicates that user operation is not always essential while the washing machine is operating.
- Line 4480 indicates that user operation is essential for “course selection” of the washing machine, and the average operation time is 30 seconds.
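The default user contact ratio of column 4402 (average operation time ÷ operating time) can be sketched as follows; the helper name is illustrative, and the cap at 1 mirrors the vacuum-cleaner case in row 4410, where the user operates the appliance for its entire run.

```python
def default_contact_ratio(avg_operation_minutes, operating_minutes):
    """Default user contact ratio (column 4402): the fraction of a home
    appliance's operating time that requires user operation, capped at 1.
    A vacuum cleaner occupied by the user for its whole run yields 1;
    a washing machine needing only brief setup yields a low value."""
    if operating_minutes <= 0:
        return 0.0
    return min(avg_operation_minutes / operating_minutes, 1.0)
```

For example, 3 minutes of user operation over a 60-minute wash gives a ratio of 0.05, matching the washing machine's default value used in the worked example below FIG. 22.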
- FIG. 25 is a flowchart illustrating an example of processing in which the display index value calculation unit according to Embodiment 3 calculates a display index.
- the display index value calculation unit (4002) starts calculating the display index.
- the display index value calculation unit (4002) acquires the predicted operation time from the home appliance operation time prediction unit (4001).
- the display index value calculation unit (4002) acquires the user contact ratio from the table shown in FIG.
- the display index value calculation unit (4002) calculates the display index value of the received device.
- the display index value calculation unit (4002) calculates the display index value using, for example, the following calculation formula.
- Display index value = (predicted operating time × user operation ratio during operating time for the home appliance type) + (total of the average operation times of user operation events)
- the home appliance operating time predicting means (4001) can calculate, with reference to FIG. 22, that the operating time 4203 of the washing machine is 61 minutes. Next, the home appliance operating time predicting means (4001) refers to (a) of FIG. 24 to obtain the default user contact ratio 0.05 of the washing machine.
- the home appliance operating time predicting means (4001) can thus calculate the display index value of the washing machine shown in FIG. 22 as “3.05”. In the example shown in FIG. 22 there is no user operation event; when there is a user operation event, the total of the average operation times corresponding to the operations in (c) of FIG. 24 is added.
- the home appliance operating time predicting means (4001) can calculate, with reference to FIG. 23, that the operating time of the vacuum cleaner (the total of time 4303 and time 4306) is 10 minutes. Next, the home appliance operating time predicting means (4001) refers to (a) of FIG. 24 and acquires the default user contact ratio “1” of the vacuum cleaner.
- the home appliance operating time predicting means (4001) can calculate the display index value of the vacuum cleaner shown in FIG. 23 as “10”.
- accordingly, the priority of the home appliances shown in FIGS. 22 and 23 is in the order “vacuum cleaner”, then “washing machine”.
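The display index formula and the two worked examples above (washing machine: 61 × 0.05 = 3.05; vacuum cleaner: 10 × 1 = 10) can be reproduced as follows; the function signature is illustrative, and treating the per-event average operation times as directly summable values is an assumption about units.

```python
def display_index(predicted_minutes, contact_ratio, user_event_avg_times=()):
    """Display index value = (predicted operating time x user operation
    ratio for the home appliance type) + (total of the average operation
    times of user operation events), per the formula above.  Units follow
    the embodiment's worked examples (operating time in minutes)."""
    return predicted_minutes * contact_ratio + sum(user_event_avg_times)

# Washing machine of FIG. 22: 61 minutes at ratio 0.05, no user events -> ~3.05
washer = display_index(61, 0.05)
# Vacuum cleaner of FIG. 23: 10 minutes at ratio 1 -> 10
vacuum = display_index(10, 1.0)
```

As in the embodiment, the vacuum cleaner's larger index value places it ahead of the washing machine in display priority.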
- the weighting by the user operation ratio for each home appliance is not limited to this; it suffices that a display index value can be used that is weighted according to the ratio of user operation required while the home appliance operates.
- the display index value calculation unit (4002) acquires the home appliances already displayed on the portal for the house ID received in the latest event, together with the display index value of each of those home appliances.
- the display index value calculation unit (4002) sorts, all together in descending order, the display index values of the displayed home appliances with the same house ID acquired in S4505 and the display index value of the home appliance type included in the latest event calculated in S4504.
- the display format determination means (1102) determines the home appliances to display, up to the top N, and generates an instruction (display information) indicating that the home appliance with the largest display index value is displayed with the largest icon.
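The sort and top-N selection of S4506 and S4507 can be sketched as follows; the function name and the default value of N are illustrative assumptions.

```python
def portal_layout(index_values, top_n=5):
    """Sketch of S4506-S4507: sort the home appliances of one house ID in
    descending order of display index value and keep the top N.  The first
    entry corresponds to the largest icon, placed in the area (4602) inside
    the house icon; the rest fill the lower areas (4603 onward) in order.
    index_values -- {appliance_type: display index value}
    Returns the appliance types in display order, largest icon first.
    """
    ranked = sorted(index_values, key=index_values.get, reverse=True)
    return ranked[:top_n]

# Using the worked index values of FIGS. 22 and 23:
order = portal_layout({"washing machine": 3.05, "vacuum cleaner": 10.0})
# order[0] ("vacuum cleaner") gets the largest icon in area 4602
```

A display device would then map the returned order onto the areas 4602 to 4606 of FIG. 26.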
- FIG. 26 is a diagram illustrating an example of a display screen displayed in a specific display format according to the third embodiment.
- FIG. 26 shows a house icon (4601) identified by the house ID.
- Household appliances operated in the house corresponding to the house icon (4601) are displayed in an area 4602 in the house icon (4601) and lower areas 4603 to 4606.
- the home appliance with the highest display index value is displayed in an area 4602 in the house icon (4601).
- the remaining home appliances are displayed in descending order of display index value, in order from the area 4603 below the house icon (4601).
- in step S4507, the display format determination unit (1102) determines whether the home appliances belonging to the house ID, up to the top Nth by display index value, are included on the portal screen of the display device. Then, the display format determination means (1102) generates instructions (display information) indicating that the home appliance icon corresponding to the home appliance with the largest display index value is displayed in the area 4602 in the house icon (4601), and that the home appliances with the subsequent display index values are displayed in order in the areas 4603 to 4606.
- note that the icon display method shown in FIG. 26 is an example, and the present invention is not limited to this; any method that displays home appliances with a higher user operation ratio in a more conspicuous form is acceptable.
- the display format determination unit (1102) transmits the generated instruction (display information) to the display device via the communication unit (1101).
- FIG. 27 is a diagram showing another example of a display screen displayed in a specific display format in the third embodiment.
- FIG. 27 shows an example of a display screen displayed on the display device based on the display information generated by performing the processing shown in FIG.
- on the portal screen (display screen) shown in FIG. 27, house icons are mapped on a map, making it possible to see at a glance which home appliances are operating in houses in which area. Note that, as shown in FIG. 27, the house icon (4705) of a house where no home appliance is operating may be displayed in black.
- FIG. 28 is a diagram showing an example of a display screen when an application is added to the display screen of FIG. 26, and FIG. 29 is a diagram showing an example of a display screen when an application is added to the display screen of FIG.
- FIG. 28 shows an example in which a person icon indicating the user is displayed when the home appliance is operated by a user operation on the display screen of FIG.
- FIG. 29 shows an example in which a person icon indicating the user is displayed when the home appliance is operated by a user operation on the display screen of FIG.
- the display unit of the display device displays an icon indicating that the user operation is performed in the vicinity of the home appliance icon corresponding to the home appliance on which the user operation is performed among the home appliance icons.
- As a specific display format, the display unit of the display device may display, superimposed on a background image, the home appliance icons and the person icons selected according to a predetermined display priority from among a plurality of person icons respectively indicating a plurality of users.
- The predetermined display priority is determined using information that relates to each of the plurality of home appliances received by the communication means (1101) in the server (1100) or the server (3001) and that belongs to each of the plurality of users.
- If the latest event stored in the household appliance DB group (1103) is included in the operations shown in FIG. 24B and FIG. and is an event that requires a user operation, then, for example, a person icon (4801) and a person icon (4802) are displayed next to the home appliance icon of the home appliance related to that event during the average operation time.
- Likewise, if the latest events stored in the home appliance DB group (1103) are included in the operations shown in FIG. 24B and FIG. , then, for example, a person icon (4901), a person icon (4902), and a person icon (4903) are displayed next to the home appliance icon of the home appliance related to the event during the average operation time.
- In this way, the number of people in the house can be inferred from the portal screens (display screens) shown in FIGS. 28 and 29, and the real-time state of the house can be shared.
- As described above, in the information providing method of the present embodiment, the home appliance operation time, that is, the time during which the plurality of home appliances are operated, is predicted, and the display index value for determining the display size and display position of the home appliance icon, which is an object corresponding to a home appliance included in the information on the one or more filtered home appliances, is calculated by weighting that time with the user's operation time.
- Based on this display index value, the display size and display position of the home appliance icon on the display screen can be determined and the icon displayed on the display screen.
- the icon can be superimposed on the background image and displayed on the display screen of the display device.
- the display priority is determined using information regarding each of the received plurality of home appliances and belonging to each of the plurality of users.
- the display priority may be determined using the operation time of each of the plurality of home appliances as information on each of the plurality of home appliances.
- the display priority may be determined by using the accumulated operation time of each of the plurality of home appliances as information on each of the plurality of home appliances.
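A minimal sketch of one such criterion follows: deriving a per-user display priority from the accumulated operation time of the home appliances. The event tuple layout is an assumption for illustration:

```python
# Hedged sketch: rank users by accumulated home appliance operation time,
# one of the display-priority criteria named in the embodiment.
from collections import defaultdict

def display_priority_order(events):
    """events: list of (user_id, appliance_type, operation_seconds)."""
    accumulated = defaultdict(float)
    for user_id, _appliance, seconds in events:
        accumulated[user_id] += seconds  # accumulated operation time per user
    # Higher accumulated operation time -> higher display priority.
    return sorted(accumulated, key=accumulated.get, reverse=True)
```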
- According to the information providing method of the present embodiment, the display priority of home appliances that require user operations can be raised, in addition to reflecting the operating status of the home appliances.
- A portal screen display mode can thus be provided in which the display size of the information (icons) corresponding to home appliances that the user needs to operate is increased. A user viewing this portal screen can therefore not only see the operating status of the home appliances but also see the actual movement of people in the home in real time.
- In other words, according to the information providing method of the present embodiment, it is possible to provide a wide variety of information in a display mode that is easy to visually recognize.
- the display format determination means (1102) performs a process of generating a specific display format for displaying a deformed background image so that a plurality of person icons or a plurality of home appliance icons do not overlap.
- the display format determining means (1102) further switches home appliance icons to be displayed after displaying the deformed background image.
- the background image is returned to the original shape before deformation, and when a home appliance icon to be displayed superimposed on the background image of the original shape overlaps, a specific display format for deforming and displaying the background image is generated again.
- the background image is, for example, a map.
- FIG. 30 is an example of a display screen displayed in a specific display format according to the fourth embodiment.
- FIG. 30 shows an example of a UI generated by the display format determination means (1102), and shows an example of a UI for effectively displaying a plurality of users on one screen.
- The background map image (5001) in FIG. 30 is an image showing the display target area, and is determined by the service to be provided or by the user.
- The background map image (5001) may be, for example, a world map or a map of the Kinki region instead of a map of Japan.
- the background map image (5001) may be an actual map or a virtual map used inside content such as a game.
- the user icon (5002) is an icon indicating a display target user.
- the user icon (5002) is displayed superimposed on the background map image (5001) according to the current position of the user.
- The user icon (5002) may be, for example, an icon having a shape such as a person, a face, a house, or a home appliance, or a combination of a plurality of these icons. When two or more are combined, their sizes
- FIG. 31 is a flowchart illustrating an example of processing for generating the specific display format illustrated in FIG. 30.
- In S5001, the display format determination means (1102) determines the type of home appliance to be displayed. This may be determined by the service provider or by the user. The display target may also be limited by some category, such as home appliance attributes or user attributes, rather than only by home appliance type.
- In step S5002, the display format determination unit (1102) acquires, from the home appliance DB group (1103), information such as the house ID, home appliance type, event key, event value, date, and time for the user corresponding to the display target.
- Next, the display format determination unit (1102) determines the display priority of each user based on the information acquired from the home appliance DB group (1103).
- For example, the display format determination means (1102) determines the display priority using the operation time of the home appliances calculated from the information acquired from the home appliance DB group (1103), or its accumulated value. In determining the display priority, in addition to the home appliance operation time, values such as the degree of intimacy between users, game rankings, or, for a service with a chat function, the number of utterances may also be used.
- the display format determination means (1102) selects a display target user based on the determined display priority.
- the display format determination means (1102) compares the determined display priority with a specific value and selects a user corresponding to the display priority exceeding the specific value.
- the display format determination means (1102) displays the user icon (5002) corresponding to the selected user in a superimposed manner at a location corresponding to the user position information on the background map image (5001).
- Next, the display format determination means (1102) determines whether or not a plurality of user icons (5002) displayed on the display screen of the display device overlap. If they overlap (YES in S5006), the process of S5007 is performed. If they do not overlap (NO in S5006), the process of S5008 is performed.
- In S5007, the display format determination means (1102) transforms the background map image (5001) so that the user icons (5002) do not overlap each other. Specifically, this is realized by locally expanding the map image so that the distances between the user icons (5002) increase while the image retains its shape as a map. If the overlap between the user icons (5002) cannot be eliminated even by this deformation, user icons (5002) may be hidden in order from the lowest display priority.
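The local expansion of S5007 can be approximated by iteratively pushing overlapping icons apart. The sketch below treats icons as circles of a fixed radius and omits the hide-by-priority fallback; the radius, step, and iteration cap are assumptions:

```python
# Hedged sketch of S5007-style overlap resolution: repeatedly separate any
# pair of icons whose circles overlap, roughly mimicking local map expansion.
import math

def resolve_overlaps(icons, radius=10.0, step=0.5, max_iter=100):
    """icons: {user_id: (x, y)}. Returns positions with overlaps pushed apart."""
    pts = dict(icons)
    for _ in range(max_iter):
        moved = False
        ids = list(pts)
        for i, a in enumerate(ids):
            for b in ids[i + 1:]:
                ax, ay = pts[a]
                bx, by = pts[b]
                dx, dy = bx - ax, by - ay
                d = math.hypot(dx, dy)
                if d < 1e-9:  # coincident icons: pick an arbitrary direction
                    dx, dy, d = 1.0, 0.0, 1.0
                if d < 2 * radius:  # the two icon circles overlap
                    push = (2 * radius - d) / 2 + step
                    ux, uy = dx / d, dy / d
                    pts[a] = (ax - ux * push, ay - uy * push)
                    pts[b] = (bx + ux * push, by + uy * push)
                    moved = True
        if not moved:
            break
    return pts
```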
- FIG. 32 is a diagram showing an example of a display format in which user icons are overlapped in the fourth embodiment.
- FIG. 33 is a diagram showing an example of a display format obtained by modifying the background map image in the fourth embodiment.
- FIG. 32 shows an example of the case where it is determined that the user icons (5002) overlap each other in the process of S5006, and the user icon (5002) is displayed in an overlapping manner (5003).
- In this case, the icons are hard to see.
- FIG. 33 shows an example when the background map image (5001) is deformed so that the user icons (5002) do not overlap each other by the processing of S5007.
- the deformed background map image (5004) is expanded only in a portion where the user icons (5002) are dense while maintaining the original map shape. In this way, the overlap between the user icons (5002) is eliminated, and the display is easy to see.
- In S5008, the display format determination means (1102) determines the home appliance type to be displayed next. As in S5001, this may be determined by the service provider or by the user. The display target may also be limited by some category, such as home appliance attributes or user attributes, rather than only by home appliance type.
- In S5009, the display format determination means (1102) determines whether or not the next home appliance type to be displayed is valid. If it is valid (YES in S5009), the process of S5010 is performed. If it is not valid (NO in S5009), the process is terminated.
- In S5010, the display format determination means (1102) determines whether or not the next display target home appliance type matches the current display target home appliance type. If they match (YES in S5010), the process of S5002 is performed. If they do not match (NO in S5010), the process of S5011 is performed.
- In S5011, the display format determination means (1102) hides the user icons (5002), cancels the deformation of the background map image (5001), and returns it to its original state.
- When switching the display in this way, a background map image of the original shape may be inserted between the two displays to make the transition.
- FIG. 34 is a diagram showing an example of display format transition in the fourth embodiment.
- FIG. 34A shows the background map image (5004) transformed in S5007, and FIG. 34B shows an example of the display format when the user icons (5002) are hidden in S5011 and the display transitions to the background map image (5005) of the original shape.
- By inserting the background map image of the original shape into the transition in this way, it becomes easier for the user to visually recognize the map shape.
- In Embodiment 5, the user's life information is acquired and a specific display format for sharing it is generated.
- In the present embodiment, the home appliance is a coaster having a weight sensor and a communication device. That is, in a scene where the user drinks coffee, the user's life information is acquired when the user places the coffee cup on the coaster or lifts the coffee cup from the coaster.
- FIG. 35A is a diagram illustrating an example of an external appearance of a wireless coaster in Embodiment 5.
- FIG. 35B is a diagram illustrating an example of a configuration of a wireless coaster in Embodiment 5.
- the wireless coaster (6000) shown in FIG. 35A is formed as a table on which the cup (6001) is placed.
- The wireless coaster (6000) is powered by a battery or by energy harvesting such as sunlight, senses the weight, temperature, sound, etc. of an object placed on top of it, and transmits the sensing results using its wireless function.
- The wireless coaster (6000) illustrated in FIG. 35A is, for example, a first home appliance having a function of measuring weight and, as illustrated in FIG. 35B, includes a weight sensor (6002), a wireless communication unit (6003), and an ID management unit (6004).
- the wireless coaster (6000) (particularly the wireless communication unit (6003)) includes, for example, a third memory and a CPU as a hardware configuration.
- The third memory stores, for example, a program that functions as the wireless communication unit (6003).
- the third memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the wireless coaster (6000) is configured such that, for example, the wireless communication unit (6003) functions by reading the above-described program from the third memory and causing the CPU to execute the program.
- the CPU is configured to execute a program that functions as the wireless communication unit (6003), but is not limited thereto.
- a dedicated signal processing circuit that functions as a wireless communication unit (6003) may be used.
- the ID management unit (6004) manages an identifier that can uniquely identify the wireless coaster (6000). That is, the ID management unit (6004) manages its own ID.
- the identifier managed by the ID management unit (6004) is stored in a memory (not shown), for example.
- the memory that stores the identifier is, for example, a recording medium that can be read and written. Examples of the recording medium that can be read and written include a semiconductor memory, an optical disk, and a hard disk.
- the weight sensor (6002) has a function of measuring weight and measures a change in weight.
- The weight sensor (6002) measures the weight of a cup when the cup is placed on the wireless coaster (6000).
- the weight sensor (6002) senses the weight of an object placed on the wireless coaster (6000) from information such as pressure and strain using a pressure sensor, a strain sensor, and the like.
- The wireless communication unit (6003) transmits, to the server (1100), the weight information measured by the weight sensor (6002) indicating the change in the weight of the object, together with the identifier (ID) that can uniquely identify the wireless coaster (6000).
- The wireless communication unit (6003) communicates with GW1 (1301) by wireless means such as ZigBee, Bluetooth (registered trademark), specified low-power radio, or wireless LAN.
- the wireless communication unit (6003) may send the data directly to the server (1100) through wireless means such as 3G wireless.
- FIG. 36 is a flowchart illustrating an example of processing in which the wireless coaster in Embodiment 5 detects a state.
- The weight sensor (6002) detects an increase or decrease in weight (a change in weight) using a pressure sensor, a strain sensor, or the like, and measures the weight.
- the weight sensor (6002) may have a mechanical system such as a spring.
- The weight sensor (6002) increases the sensor detection frequency during an arbitrary time period.
- the time for increasing the detection accuracy may be set in advance from the server (1100) side.
- For example, the server (1100) sets this period based on the average of the times during which the counted weight continuously increases and decreases. This saves power during the periods when a cup is not normally placed on the coaster.
- the weight sensor (6002) notifies the wireless communication unit (6003) of weight increase / decrease information (weight information indicating a change in weight).
- the wireless communication unit (6003) adds the ID information of the wireless coaster (6000) and transmits weight increase / decrease information to the GW1 (1301).
- GW1 (1301) adds the house ID, the home appliance type, and, as necessary, time information, and transmits the information to the server (1100).
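The two-stage assembly of S6003 and S6004 (the coaster attaches only its own ID; GW1 (1301) then adds the house ID, home appliance type, and optional time information) might look like the sketch below. The patent does not specify a wire format, so all field names are hypothetical:

```python
# Hedged sketch of the coaster-to-GW-to-server payload flow of S6003/S6004.
import json
import time

def coaster_message(coaster_id, weight_delta_g):
    # S6003: the coaster attaches only its own ID to the weight change.
    return {"coaster_id": coaster_id, "weight_delta_g": weight_delta_g}

def gw_forward(msg, house_id, with_time=True):
    # S6004: GW1 (1301) adds house ID, home appliance type, and optional time.
    out = dict(msg, house_id=house_id, appliance_type="coaster")
    if with_time:
        out["timestamp"] = int(time.time())
    return json.dumps(out)
```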
- In this way, the wireless coaster (6000), which has a function of measuring weight, measures the change in the weight of an object and can transmit to the server (1100) the weight information indicating the measured change in weight together with the identifier that can uniquely identify the wireless coaster (6000). As a result, when a cup is placed on it or a liquid is poured into the cup, the wireless coaster (6000) can detect the weight change and notify the server (1100).
- FIG. 37 is a diagram illustrating an example of a system configuration when the wired coaster according to the fifth embodiment is connected to a PC through a wired connection such as USB. Elements similar to those in FIG. 35B are denoted by the same reference numerals, and detailed description thereof is omitted.
- The wired coaster (6000A) illustrated in FIG. 37 is, for example, a first home appliance having a function of measuring weight and, as illustrated in FIG. 37, includes a weight sensor (6002), an ID management unit (6004), and a wired communication unit (6011).
- the wired coaster (6000A) shown in FIG. 37 includes, for example, a third memory and a CPU.
- The third memory stores, for example, a program that functions as the wired communication unit (6011).
- the third memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the wired coaster (6000A) shown in FIG. 37 is configured such that, for example, the wired communication unit (6011) functions by reading the above-mentioned program from the third memory and causing the CPU to execute it.
- the CPU is configured to execute a program that functions as the wired communication unit (6011), but is not limited thereto.
- a dedicated signal processing circuit that functions as a wired communication unit (6011) may be used.
- The wired communication unit (6011) transmits, to the server (1100), the weight information measured by the weight sensor (6002) indicating the change in the weight of the object, together with the identifier (ID) that can uniquely identify the wired coaster (6000A).
- the wired communication unit (6011) is connected to a personal computer (hereinafter referred to as a PC) with a USB cable, for example, instead of wireless.
- power may be supplied to the wired coaster (6000A) via USB. In this case, there is no need to mount a battery or the like on the wired coaster (6000A).
- The PC shown in FIG. 37 includes an information processing unit (6020), a USB port (6021), and an Internet connection unit (6022), and is connected to a display device (6023) such as a display.
- The PC illustrated in FIG. 37 (particularly the information processing unit (6020) and the Internet connection unit (6022)) includes, for example, a fourth memory and a CPU as its hardware configuration.
- the fourth memory stores a program that functions as the information processing unit (6020) and a program that functions as the Internet connection unit (6022).
- the fourth memory is, for example, a readable recording medium or a readable / writable recording medium.
- Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the PC shown in FIG. 37 is configured such that the information processing unit (6020) and the Internet connection unit (6022) function by, for example, reading the above-described programs from the fourth memory and causing the CPU to execute them.
- Here, the CPU executes the program that functions as the information processing unit (6020) and the program that functions as the Internet connection unit (6022), but the present invention is not limited thereto.
- a dedicated signal processing circuit that functions as the information processing unit (6020) and a dedicated signal processing circuit that functions as the Internet connection unit (6022) may be used.
- a program that causes one of the information processing unit (6020) and the Internet connection unit (6022) to function may be stored in the fourth memory, and the other may be configured using a dedicated signal processing circuit.
- The information processing unit (6020) performs processing based on the information notified from the wired coaster (6000A) via the USB port (6021).
- the information processing unit (6020) displays an image on the display device (6023).
- Here, a PC is described as an example, but the present invention is not limited to this; the device may be any home appliance having a function corresponding to a PC, such as a tablet or a smartphone.
- The difference between the wireless coaster (6000) and the wired coaster (6000A) is mainly the difference in communication means.
- Where the difference between wireless and wired communication means is not relevant to the configuration or method, they are referred to simply as coasters without distinction.
- FIG. 38 is a flowchart illustrating an example of processing in which the coaster according to the fifth embodiment detects a state and cooperates with the PC.
- the weight sensor (6002) detects an increase or decrease in weight.
- the weight sensor (6002) measures the weight using a pressure sensor, a strain sensor, or the like as described above.
- The weight sensor (6002) increases the sensor detection frequency during an arbitrary time period.
- the weight sensor (6002) notifies the wired communication unit (6011) of weight increase / decrease information.
- the wired communication unit (6011) adds the ID information of the wired coaster (6000A) and transmits weight increase / decrease information to the PC.
- the PC acquires information via the USB port (6021), and the information processing unit (6020) confirms the information.
- The information processing unit (6020) adds the house ID, the home appliance type (coaster), and, as necessary, time information, and transmits the coaster information to the server (1100) via the Internet connection unit (6022).
- the information processing unit (6020) transmits the screen information associated with the user obtained from the server (1100) to the display device (6023) and displays it.
- the PC can be used as a GW that transmits information to the server (1100) in the same manner as the function of the GW1 (1301).
- the PC itself operates as an image display device.
- In this way, a dedicated GW device can be omitted, and mechanisms such as the power supply of the wired coaster (6000A) can be simplified.
- FIG. 39 is a diagram illustrating an example of information obtained from the coaster according to the fifth embodiment.
- From the coaster, information is obtained that includes a house ID, "coaster" as the home appliance type, "detection result" as the event key, weight information such as 20.5 g as the event value, and the date and time.
- If the coaster has a clock function, the date and time may be added to the information at the time of communication.
- Otherwise, the date and time information is added on the GW or server (1100) side.
- the coaster can notify the server (1100) of the weight change information.
- FIG. 39 (b) shows an example of additional information.
- the additional information includes, for example, a coaster ID that is an identifier of the coaster.
- A group ID may be written in the additional information in advance when the coaster is shipped. If there is a group ID, the community associated with the group ID can be entered when the SNS site is accessed using the coaster. Depending on the community settings, participation can also be refused to users who are not using a coaster with the target group ID. In addition, by adding information such as disclosure or non-disclosure in advance, the user's ID can be kept from being disclosed within the SNS site. By writing information into the coaster in advance in this way, it is possible to explicitly decide the community to be connected to, or to associate the coaster with a group as a server-side setting derived from the coaster ID information. Further, forgery can be prevented by transmitting the coaster ID information together with the house ID and the like in a concealed manner using a hash or the like.
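The hash-based anti-forgery idea above could, for example, be realized with an HMAC over the coaster ID and house ID. The shared key and message layout below are assumptions for illustration, not the embodiment's specification:

```python
# Hypothetical HMAC sketch of the anti-forgery transmission: the coaster ID is
# sent together with the house ID, concealed behind a keyed hash.
import hashlib
import hmac

SHARED_KEY = b"provisioned-at-shipping"  # assumed per-device secret

def sign(coaster_id, house_id):
    msg = f"{coaster_id}|{house_id}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def verify(coaster_id, house_id, tag):
    # Constant-time comparison guards against timing attacks.
    return hmac.compare_digest(sign(coaster_id, house_id), tag)
```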
- FIG. 40 is a diagram showing an example of screen information displayed in conjunction with the coaster in the fifth embodiment.
- In FIG. 40, each community in a social networking service is represented as a table (6035).
- four communities are displayed.
- In each table shown in FIG. 40, different topics and groups of friends are classified as categories.
- On the screen, an avatar (6030), including a character and objects related to the character, is displayed as the user's substitute.
- The avatar (6030) represents a user at a table, displayed in a form associated with the user's number of utterances and most recent coffee consumption.
- The arrangement or size of the avatar (6030) can be changed according to the user's number of utterances and most recent coffee consumption.
- Similarly, the arrangement and size of the table itself can be changed based on the number of utterances in the entire community and the most recent coffee consumption. As a result, a screen can be formed in which the speech activity of the community and of each user is intuitively easy to understand.
- As the cup picture (6031), a picture of a cup registered in advance, such as the user's favorite cup, is displayed.
- the cup is estimated from the weight information notified by the coaster, and the estimated cup or the picture associated with the estimated cup is displayed.
- Reference numeral 6032 denotes an example of an avatar (screen) indicating an operation of lifting a cup.
- Reference numeral 6033 is an example of an avatar (screen) that increases or decreases the number of cups using information on the amount and number of times coffee is poured.
- Reference numeral 6034 denotes an example of an avatar (screen) displayed in such a manner that coffee is superimposed on a cup when the amount of coffee currently poured can be measured from the weight of the cup, the estimated capacity, and the like.
- Reference numeral 6036 denotes an example of an image indicating the number of vacancies in a community. At 6036, only an empty chair is shown, expressing that a person can still join the community and participate in the conversation.
- FIG. 41 is a diagram illustrating an example of a configuration when the wireless coaster according to Embodiment 5 performs state estimation. Elements similar to those in FIG. 35B are denoted by the same reference numerals, and detailed description thereof is omitted.
- the state estimation is to estimate the state when the user is drinking coffee, and estimates the remaining amount of coffee, whether the user is lifting the cup to drink coffee, or the like.
- The wireless coaster (6000B) shown in FIG. 41 includes a weight sensor (6002), a wireless communication unit (6003), an ID management unit (6004), a state estimation unit (6041), a state DB (6043), and a time management unit (6042).
- the wireless coaster (6000B) (in particular, the wireless communication unit (6003), the state estimation unit (6041), and the time management unit (6042)) includes, for example, a third memory and a CPU as a hardware configuration.
- the third memory stores, for example, a program that functions as the wireless communication unit (6003), a program that functions as the state estimation unit (6041), and a program that functions as the time management unit (6042).
- the third memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the wireless coaster (6000B) reads the above-mentioned program from the third memory and causes the CPU to execute it, so that the wireless communication unit (6003), the state estimation unit (6041), and the time management unit (6042) function. It is configured.
- Here, the CPU executes the programs that function as the wireless communication unit (6003), the state estimation unit (6041), and the time management unit (6042), but the present invention is not limited thereto.
- Dedicated signal processing circuits that function as the wireless communication unit (6003), the state estimation unit (6041), and the time management unit (6042) may be used instead.
- Alternatively, a program that causes at least one of the wireless communication unit (6003), the state estimation unit (6041), and the time management unit (6042) to function may be stored in the third memory, and the rest may be configured using dedicated signal processing circuits.
- The state estimation unit (6041) estimates the remaining amount of coffee and whether or not the user is lifting the cup to drink coffee. More specifically, the state estimation unit (6041) estimates the state of the object measured by the weight sensor (6002) from the change pattern of the weight information received from the weight sensor (6002). Here, based on the change pattern of the received weight information, the state estimation unit (6041) may estimate, as the state of the object, whether the user of the wireless coaster (6000B) has placed a cup on the wireless coaster (6000B) or lifted a cup from it. The state estimation unit (6041) may also estimate the weight of the object used by the user of the wireless coaster (6000B) from the change pattern of the received weight information.
- the state DB (6043) has information for determining the state.
- the state DB (6043) is stored in the memory.
- the memory that stores the state DB (6043) is, for example, a readable / writable recording medium. Examples of the recording medium that can be read and written include a semiconductor memory, an optical disk, and a hard disk.
- the time management unit (6042) manages the time.
- FIG. 42 is a flowchart illustrating processing in which the coaster according to the fifth embodiment detects a state.
- the weight sensor (6002) detects an increase or decrease in weight.
- The state estimation unit (6041) uses the time information from the time management unit (6042) and the weight sensor information to generate weight change information, which is time-series information on the increase and decrease in weight.
- This weight change information indicates a weight transition as shown in FIG. 43, for example.
- the state estimation unit (6041) acquires a weight change pattern that is a time-series weight change pattern from the state DB (6043).
- In the state DB (6043), for example, feature amount data of the change patterns indicating the respective states in FIG. 43 is stored.
- the state estimation unit (6041) transmits the state change information and the current weight information to the wireless communication unit (6003).
- the wireless communication unit (6003) adds the ID information of the wireless coaster (6000B) and transmits the information to GW1 (1301).
- the wireless coaster (6000B) can estimate the state when the user drinks coffee and notify the server (1100).
- FIG. 43 is a diagram showing an example of a weight change pattern in the fifth embodiment.
- The weight detected by the weight sensor (6002) transitions as shown in FIG. 43 according to the user's operations.
- In FIG. 43, the horizontal axis is time and the vertical axis is weight.
- When a cup is placed on the coaster, its weight is detected (6051).
- As a drink is poured into the cup, the weight gradually increases.
- When the cup is lifted, the weight becomes 0 g, resulting in the transition shown at 6053.
- When the cup is put back, a weight equal to the weight immediately before 6053 minus the weight of the drink consumed is detected.
- In this way, the change in weight is plotted over time.
- the user's state can be estimated from the change in the weight value detected by the coaster.
- FIG. 44 is a diagram illustrating an example of information obtained from the coaster according to the fifth embodiment.
- the coaster detects state information such as "cup set" (a cup placed on the coaster), "pouring", and "pouring end", and notifies the server (1100). This state information may instead be estimated on the server (1100) side.
- FIG. 45 is a diagram illustrating an example of a configuration of a server having a function of estimating a state according to the fifth embodiment.
- the server (1100A) shown in FIG. 45 includes a communication unit (6061), a coaster information acquisition unit (6062), a display information communication unit (6063), a message management unit (6064), a server-side state estimation unit (6065), a display information search unit (6067), and a display format determination unit (6069).
- these units are realized using, for example, a first memory and a CPU as a hardware configuration.
- the first memory stores a program that functions as the communication unit (6061), a program that functions as the coaster information acquisition unit (6062), a program that functions as the display information communication unit (6063), a program that functions as the message management unit (6064), a program that functions as the server-side state estimation unit (6065), a program that functions as the display information search unit (6067), and a program that functions as the display format determination unit (6069).
- the first memory is, for example, a readable recording medium or a readable / writable recording medium.
- Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the server (1100A) illustrated in FIG. 45 reads the above-described programs from the first memory and causes the CPU to execute them, thereby functioning as the communication unit (6061), the coaster information acquisition unit (6062), the display information communication unit (6063), the message management unit (6064), the server-side state estimation unit (6065), the display information search unit (6067), and the display format determination unit (6069).
- in the above, the CPU is configured to execute the programs that function as the units up to the display format determination unit (6069), but the configuration is not limited thereto.
- the server may instead be configured using dedicated signal processing circuits: a dedicated signal processing circuit that functions as the communication unit (6061), one that functions as the coaster information acquisition unit (6062), one that functions as the display information communication unit (6063), one that functions as the message management unit (6064), one that functions as the server-side state estimation unit (6065), one that functions as the display information search unit (6067), and one that functions as the display format determination unit (6069).
- alternatively, programs that function as at least some of these units may be stored in the first memory, and the remaining units may be configured using dedicated signal processing circuits.
- the communication unit (6061) receives information on each of the plurality of home appliances from each of the plurality of home appliances.
- the communication unit (6061) transmits the display information formed (generated) by the display format determination unit (6069) to the display device.
- the communication unit (6061) receives the state information transmitted from the coaster.
- the communication unit (6061) also receives the transmitted identifier and weight information of the home appliance.
- the coaster information acquisition unit (6062) selects and acquires the state information transmitted from the coaster.
- the server-side state estimation unit (6065) estimates the state of the object measured by the first home appliance from the received change pattern of the weight information.
- the server-side state estimation unit (6065) estimates whether the coaster user puts the cup on the coaster or picks up the cup from the coaster as the state of the object from the received weight information change pattern.
- the server-side state estimation unit (6065) estimates the weight of the object used by the user of the coaster device from the received weight information change pattern.
- Information included in the state DB (6066) and information included in the cup DB (6068) are stored in the memory.
- the memory is, for example, a recording medium that can be read and written. Examples of the recording medium that can be read and written include a semiconductor memory, an optical disk, and a hard disk.
- Information included in the state DB (6066) and information included in the cup DB (6068) may be stored in physically separate memories, for example, or may be stored in the same memory.
- the server-side state estimation unit (6065) acquires a weight change pattern, which is a time-series weight change pattern, from the state DB (6066).
- the server-side state estimation unit (6065) compares the weight change information with the weight change pattern and, when they approximate each other, determines the target state. Further, the server-side state estimation unit (6065) notifies the display information search unit (6067) when frequently occurring words appear among the registered words.
- the message management unit (6064) determines whether there is a registered user message before and after the time when the target state occurs.
- the display format determination unit (6069) performs processing for filtering information regarding each of the plurality of home appliances received by the communication unit (6061) and generating display information for the display device to display in a specific display format.
- the display format determination unit (6069) generates display information for the display device to display content according to the estimated state of the object in a specific display format. For example, the display format determination unit (6069) generates display information that causes the display device to change, according to the display content, the avatar of the user related to the object among the avatars displayed on the display screen.
- the communication unit (6061) receives state information transmitted from the coaster.
- the coaster information acquisition unit (6062) selects and acquires the state information transmitted from the coaster.
- the server-side state estimation unit (6065) acquires a weight change pattern that is a time-series weight change pattern from the state DB (6066). At this time, when the state is determined on the coaster side, this processing may be omitted.
- the server-side state estimation unit (6065) determines a target state when there is an approximation by comparing the weight change information and the weight change pattern. At this time, when the state is determined on the coaster side, this processing may be omitted.
- in step S6125, the message management unit (6064) determines whether a user message was registered before or after the time when the target state occurred, and registers it if so.
- the server-side state estimation unit (6065) notifies the display information search unit (6067) when there are frequent words among the registered words.
- the display information search unit (6067) estimates the weight of the cup from the weight change.
- the display information search unit (6067) searches the cup DB (6068), which is a list of cup weights associated with the coaster ID, and estimates the cup used by the user. The user can register the weight and pattern of a favorite cup in advance. At the time of registration, if the user has used this service for a certain period, an estimated cup list may be presented based on the previously measured weight deviations of the cups.
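The cup estimation against the cup DB (6068) might look like the following sketch. The DB layout, coaster ID, cup names, and tolerance are hypothetical, introduced only to illustrate the nearest-weight lookup.

```python
# Hypothetical cup DB: coaster ID -> list of registered cups and their weights.
CUP_DB = {
    "coaster-01": [
        {"cup_id": "mug-blue",  "weight_g": 310},
        {"cup_id": "demitasse", "weight_g": 95},
        {"cup_id": "mug-red",   "weight_g": 325},
    ],
}

def estimate_cup(coaster_id, estimated_weight_g, tolerance_g=25):
    """Return the cup_id whose registered weight is nearest, within tolerance."""
    candidates = CUP_DB.get(coaster_id, [])
    best = min(candidates,
               key=lambda c: abs(c["weight_g"] - estimated_weight_g),
               default=None)
    if best and abs(best["weight_g"] - estimated_weight_g) <= tolerance_g:
        return best["cup_id"]
    return None  # unknown cup; the user may be asked to register it

print(estimate_cup("coaster-01", 318))  # -> mug-red
```

When no registered cup falls within the tolerance, the service could present an estimated cup list for registration, as the text describes.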
- the display format determination unit (6069) detects "the state where a drink is poured into the cup" as the status information while the user is logged in to the community, and lines up a number of coffee cup images corresponding to the number of times this state was detected.
- the display format determination unit (6069) changes the screen so that the user's avatar is moved toward the center of the screen or displayed larger, based on the number of utterances and the amount of coffee consumed in the most recent period.
- when there is a vacancy relative to the maximum number of community participants, the display format determination unit (6069) changes the screen of the community-starting user to one showing information that indicates the vacancy. This indication may take the form of an empty chair or the like.
- the display format determination unit (6069) changes to a screen showing a picture showing the internal capacity.
- the display format determination unit (6069) may also change the size of an image representing the community, such as the size of the table, or its arrangement in perspective, according to per-community measures such as the number of utterances in the community and the coffee consumption of all participants in the most recent period.
- FIG. 48 is a diagram illustrating an example of information stored in the cup DB according to the fifth embodiment.
- the mainly used time zone is obtained by estimating the most-used time zone from the respective usage time zones and their frequencies, as shown in the mainly used time zone column (6071).
- the number of uses is the number of times that the cup has been used so far.
- FIG. 49 is a diagram showing an example of usage frequency information for calculating a time zone mainly used for cups in the fifth embodiment.
- the mainly used time zone (6071) reflects the bias in the user's usage times; the most-used time zone is estimated from the respective usage time zones and their frequencies.
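The frequency-based estimation of the mainly used time zone can be sketched as follows: usage events are bucketed by time band and the most frequent band is taken. The band boundaries are assumptions for illustration, not values from the patent.

```python
from collections import Counter

def time_zone(hour):
    """Map an hour of day (0-23) to a coarse band; boundaries are illustrative."""
    if 5 <= hour < 11:
        return "morning"
    if 11 <= hour < 17:
        return "daytime"
    if 17 <= hour < 23:
        return "evening"
    return "night"

def mainly_used_time_zone(usage_hours):
    """Most frequent band among the recorded usage hours of a cup."""
    counts = Counter(time_zone(h) for h in usage_hours)
    return counts.most_common(1)[0][0]

# Hours at which a cup was used, as recorded by the coaster
print(mainly_used_time_zone([7, 8, 8, 9, 13, 20, 7]))  # -> morning
```

This mirrors the idea of FIG. 49: the deviation in usage frequency per time zone determines the cup's mainly used time zone.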
- FIG. 50 is a diagram illustrating an example of a configuration of a system that simultaneously uses coffee maker usage information and coaster information according to the fifth embodiment.
- the system shown in FIG. 50 includes a coffee maker (6081), a current sensor (6082), and a wireless communication unit (6083) in addition to the wireless coaster (6000) and the GW1 (1301).
- the coffee maker (6081) is an example of a third home appliance.
- the current sensor (6082) is an example of a second home appliance having a function of measuring the amount of current, and measures the amount of current of the third home appliance.
- the current sensor (6082) measures the power consumption of the coffee maker (6081).
- the current sensor (6082) is arranged between the connector of the power source and the outlet in the form of an extension cable, and detects the amount of current in the extended portion.
- the current sensor (6082) can measure the current without modifying the coffee maker (6081).
- the wireless communication unit (6083) transmits the current amount of the coffee maker (6081) measured by the current sensor (6082) to the server (1100) via the GW1 (1301).
- FIG. 51 is a diagram showing an example of the configuration of a server in the case of using drink manufacturer information in the fifth embodiment. Elements similar to those in FIG. 45 are denoted by the same reference numerals, and detailed description thereof is omitted.
- a server (1100B) shown in FIG. 51 includes, in addition to the configuration of the server (1100A) shown in FIG. 45, a current sensor information receiving unit (6091), a manufacturer state estimating unit (6092), and a manufacturer state DB (6093).
- the units from the communication unit (6061) to the display format determination unit (6069), together with the current sensor information receiving unit (6091) and the manufacturer state estimation unit (6092), are realized using, for example, a first memory and a CPU as a hardware configuration.
- in the first memory, programs that function as these units, including a program that functions as the current sensor information receiving unit (6091) and a program that functions as the manufacturer state estimation unit (6092), are stored.
- the first memory is, for example, a readable recording medium or a readable / writable recording medium.
- Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the server (1100B) illustrated in FIG. 51 reads the above-described programs from the first memory and causes the CPU to execute them, thereby functioning as the communication unit (6061), the coaster information acquisition unit (6062), the display information communication unit (6063), the message management unit (6064), the server-side state estimation unit (6065), the display information search unit (6067), the display format determination unit (6069), the current sensor information receiving unit (6091), and the manufacturer state estimation unit (6092).
- in the above, the CPU is configured to execute the programs that function as these units, but the configuration is not limited thereto.
- the server may instead be configured using dedicated signal processing circuits: a dedicated signal processing circuit that functions as the communication unit (6061), one that functions as the coaster information acquisition unit (6062), one that functions as the display information communication unit (6063), one that functions as the message management unit (6064), one that functions as the server-side state estimation unit (6065), one that functions as the display information search unit (6067), one that functions as the display format determination unit (6069), one that functions as the current sensor information receiving unit (6091), and one that functions as the manufacturer state estimation unit (6092).
- alternatively, programs that function as at least some of these units may be stored in the first memory, and the remaining units may be configured using dedicated signal processing circuits.
- the manufacturer status DB (6093) is stored in a memory, for example.
- the memory that stores the manufacturer status DB (6093) is, for example, a readable and writable recording medium. Examples of the recording medium that can be read and written include a semiconductor memory, an optical disk, and a hard disk.
- the display format determination unit (6069) generates display information for the display device to display content according to the estimated state of the object in a specific display format. For example, the display format determination unit (6069) generates display information that causes the display device to place, beside the avatar displayed on the display screen, information indicating the third home appliance whose amount of current was measured by the current sensor (6082) and which was used within an arbitrary time difference from the first home appliance.
- specifically, the display format determination unit (6069) generates a specific display format for forming a display screen, such as displaying a coffee maker icon near the user's avatar.
- when a plurality of drinks have been poured into cups, the display format determination unit (6069) may generate screen information including a display format that specifies, for example, that the coffee maker and the juicer mixer are arranged on the screen in chronological order of operation.
- although the coffee maker (6081) was given as an example above, the appliance is not limited thereto. It may be an apparatus that produces a liquid, such as a juicer mixer, or one that produces powdered food, such as a mill. In either case, screen information can be generated with the same configuration.
- FIG. 52 is a diagram showing an example of a shared screen displaying information of a manufacturer such as a coffee maker in the fifth embodiment.
- the coffee maker (6111) is displayed near the user's avatar, and the juicer mixer (6112) is displayed at a slightly distant position. At the same time, information such as the time during and after the operation may be displayed.
- FIG. 53 is a flowchart illustrating an example of processing performed by the server when using drink maker information according to the fifth embodiment.
- the current sensor (6082) determines whether or not the coffee maker (6081) is operating, and transmits the result to the GW1 (1301) via the wireless communication unit (6083). When the target home appliance is a juicer mixer, that information may be added.
- the wireless coaster (6000) transmits the detected information to the GW1 (1301).
- the GW1 (1301) forwards the information detected by the wireless coaster (6000) to the server (1100).
- the current sensor information reception unit (6091) notifies the maker state estimation unit (6092) of the current information.
- the manufacturer state estimation unit (6092) determines from the current change pattern whether the coffee maker is keeping a drink warm or making one, using the patterns stored in the manufacturer state DB (6093). The determination is made based on feature amount information, which is pattern information registered in advance in the manufacturer state DB (6093). If the product number of the maker's product is registered in advance, that information may be used, or the patterns may be learned from the peaks in the amount of electric energy.
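The manufacturer state estimation could be sketched as below: the recent current draw is compared with feature values registered in the manufacturer state DB to decide whether the coffee maker is making a drink or keeping it warm. The current levels and state names are invented for illustration; real values would be registered per product number or learned.

```python
# Hypothetical manufacturer state DB: state -> minimum average current (amps).
MAKER_STATE_DB = {
    "making":       {"min_amps": 5.0},  # heater fully on while brewing
    "keeping_warm": {"min_amps": 0.3},  # low draw on the warming plate
}

def estimate_maker_state(current_samples_amps):
    """Classify the appliance state from recent current-sensor samples."""
    avg = sum(current_samples_amps) / len(current_samples_amps)
    if avg >= MAKER_STATE_DB["making"]["min_amps"]:
        return "making"
    if avg >= MAKER_STATE_DB["keeping_warm"]["min_amps"]:
        return "keeping_warm"
    return "off"

print(estimate_maker_state([6.2, 6.0, 5.8]))  # -> making
print(estimate_maker_state([0.4, 0.5, 0.4]))  # -> keeping_warm
```

A threshold model like this is the simplest stand-in for the pattern (feature amount) matching the text describes.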
- the display format determination unit (6069) generates a screen in which the drink makers (home appliances) that operated before the time the information from the coaster arrived are displayed closer to the avatar the more recently they operated.
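The ordering rule for placing drink makers around the avatar can be sketched as follows; the data layout and times are hypothetical, and rank 0 stands for the position nearest the avatar.

```python
def appliance_distances(coaster_time, appliance_runs):
    """appliance_runs: list of (name, last_operation_time), times in epoch
    seconds. Appliances that ran at or before the coaster event are ranked by
    recency; rank 0 is drawn nearest the avatar. Later runs are excluded."""
    earlier = [(name, t) for name, t in appliance_runs if t <= coaster_time]
    earlier.sort(key=lambda nt: coaster_time - nt[1])  # most recent first
    return [(name, rank) for rank, (name, _) in enumerate(earlier)]

runs = [("coffee_maker", 1000), ("juicer_mixer", 700), ("mill", 1200)]
print(appliance_distances(1100, runs))
# -> [('coffee_maker', 0), ('juicer_mixer', 1)]  (mill ran after the event)
```

The rank would then drive the on-screen distance, like the coffee maker (6111) near the avatar and the juicer mixer (6112) slightly farther away in FIG. 52.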
- the display information communication unit (6063) transmits screen information in response to the request.
- the user's avatar related to the object, among the avatars displayed on the display screen, can be changed according to the display content based on the display information. Further, according to the information providing method of the present embodiment, based on the display information, a plurality of pre-registered images corresponding to object weights can be compared with the estimated weight of the object, and the image related to the object of the avatar displayed on the display screen can be changed to the image corresponding to the weight estimated by the comparison.
- information indicating the third home appliance, whose use was identified by the second home appliance measuring the amount of current and which was used within an arbitrary time difference from the first home appliance, can be arranged beside the avatar based on the display information.
- the actual user state is finely shared by sensing, so that information sharing with reality is realized.
- when the home appliance is a coaster, obtaining the weight of the coffee cup makes it possible to estimate states such as the user starting to drink coffee or the coffee cup being lifted, and to share them by generating a specific display format.
- according to the estimated cup state, the display form of the screen information related to the user is changed.
- life information can be shared with other users.
- since actual user states such as drinking coffee are shared, sharing of real life information is realized, and a screen with a realistic presentation can be provided.
- drinks such as juice and beer may be used.
- the same applies to a seasoning container for salt, pepper, soy sauce, or the like; the usage state can be obtained by measuring its weight.
- the amount of use can also be acquired by capturing changes in the time axis such as one week or one month. In that case, the face color and body shape of the avatar corresponding to the user can be changed on the shared screen from the change in the usage amount.
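The mapping from a long-term change in usage amount to a change in the shared avatar's appearance could be sketched as follows. The thresholds and appearance labels are assumptions for illustration.

```python
def avatar_effect(prev_week_g, this_week_g):
    """Compare weekly consumption (grams) and pick an avatar change.
    Ratios and labels are illustrative, not from the patent."""
    if prev_week_g == 0:
        return "unchanged"  # no baseline to compare against
    ratio = this_week_g / prev_week_g
    if ratio >= 1.5:
        return "tired_face"    # sharply increased intake
    if ratio <= 0.5:
        return "healthy_face"  # sharply decreased intake
    return "unchanged"

print(avatar_effect(prev_week_g=1400, this_week_g=2800))  # -> tired_face
```

A rule like this would let the shared screen reflect week-over-week or month-over-month changes in, say, coffee or seasoning consumption as changes in the avatar's face color or body shape.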
- when a pen stand is placed on the coaster, whether a pen has been removed or returned can be determined from the weight change.
- in that case, the avatar corresponding to the user on the screen may be changed with an effect such as an icon or headband signifying working or studying.
- FIG. 54 is a diagram showing an example of a system configuration in the sixth embodiment. 54, the same reference numerals as those in FIG. 1 denote the same or corresponding parts, and thus detailed description thereof is omitted here.
- in house A (1210), the GW1 (1301), a network-connected microphone device (7001) and a network-connected microphone device (7001a) as examples of a sound collector, and a sound generator (7002) and a sound generator (7002a) are arranged.
- the sound generator (7002) is arranged in a range where the network-connected microphone device (7001) can collect sound in the house A (1210).
- the sound generator (7002a) is arranged in a range where the network-connected microphone device (7001a) can collect sound in the house A (1210).
- the network connection microphone device (7001) and the network connection microphone device (7001a) have a function as a microphone for collecting sound and a communication function.
- the communication function may be a wired communication function or a wireless communication function.
- the range in which the network-connected microphone device (7001) can collect sound is, for example, the room where the network-connected microphone device (7001) is installed, the floor where it is installed, or the house where it is installed.
- the range in which the network-connected microphone device (7001) can collect sound may also be, for example, an area in a predetermined specific direction, or an area within a certain distance of the position where the network-connected microphone device (7001) is installed.
- likewise, the range in which the network-connected microphone device (7001a) can collect sound is, for example, the room where the network-connected microphone device (7001a) is installed, the floor where it is installed, or the house where it is installed.
- the range in which the network-connected microphone device (7001a) can collect sound may also be, for example, an area in a predetermined specific direction, or an area within a certain distance of the position where the network-connected microphone device (7001a) is installed.
- the network-connected microphone device (7001) is connected to the GW1 (1301) and transmits event information related to the collected data to the server (1100) via the GW1 (1301) and the public network (1200).
- the network-connected microphone device (7001a) transmits event information related to the collected data via the public network (1200) without passing through the GW1 (1301).
- the network-connected microphone device (7001) collects sounds of human voices (or voices) and sound generators (7002) located in a range where sound can be collected.
- the network-connected microphone device (7001) converts the collected sound into electronic data, includes the converted electronic data in event information, and transmits it to the server (1100) via the GW1 (1301) and the public network (1200).
- the network-connected microphone device (7001) may convert the collected sound into electronic data, analyze the converted electronic data, include the analysis result in the event information, and transmit it to the server (1100) via the GW1 (1301) and the public network (1200).
- the network-connected microphone device (7001a) collects a voice (or voice) of a person located in a range where sound can be collected and a sound generated by the sound generator (7002a).
- similarly, the network-connected microphone device (7001a) may convert the collected sound into electronic data, include the electronic data in the event information, and transmit it to the server (1100) via the public network (1200).
- the network-connected microphone device (7001a) may analyze the collected sound, include the analysis result in the event information, and transmit it to the server (1100) via the public network (1200).
- FIG. 55 is a diagram illustrating an example of the sound that the network-connected microphone device collects from the sound generator in the sixth embodiment.
- FIG. 55 shows a specific example in which sound of a person located in a range where the network-connected microphone device (7001) can collect sound and sound generated by the sound generator (7002) are collected in the house A (1210).
- the case in which the voice of a person located in a range where the network-connected microphone device (7001a) can collect sound and the sound generated by the sound generator (7002a) are collected is similar, so the description thereof is omitted here.
- the sound generator (7002) is attached to, for example, a person or an instrument that does not have a communication function.
- the sound generator 7002 may be attached to a home appliance.
- FIG. 55 shows an example in which the sound generator (7002) is attached to a child (7102), a toy car (7103), a baby bottle (7104), and a vacuum cleaner (7105) as an example of a home appliance. Note that the vacuum cleaner (7105) may or may not have a communication function.
- the network-connected microphone device (7001) collects sounds of a mother (7101) and a child (7102) in the house A (1210) and collects sounds generated by the sound generator (7002).
- the sound generator (7002) includes, for example, a sensor that detects that it has moved.
- examples of the sensor include an acceleration sensor, an angular velocity sensor, a gyro sensor, a temperature sensor, a humidity sensor, an illuminance sensor, and a human presence sensor.
- when the sound generator (7002) is attached to the child (7102), the sensor provided in the sound generator (7002) detects the child's movement when the child (7102) moves, and the sound generator emits a sound.
- when the sound generator (7002) is attached to the toy car (7103), for example when the child (7102) moves the toy car (7103), the sensor included in the sound generator (7002) detects the movement of the toy car (7103) and the sound generator emits a sound.
- when the sound generator (7002) is attached to the baby bottle (7104), for example when the child (7102) tilts the baby bottle (7104) to drink its contents, the sensor included in the sound generator (7002) detects the tilting action of the baby bottle (7104) and the sound generator emits a sound.
- the sound generator (7002) attached to the baby bottle (7104) may also emit a sound when, for example, the child (7102) grasps the baby bottle (7104) and the sensor provided in the sound generator (7002) detects a change in the temperature of the baby bottle (7104).
- when the sound generator (7002) is attached to the vacuum cleaner (7105), for example when the vacuum cleaner (7105) is operated, the sensor provided in the sound generator (7002) detects the movement of the vacuum cleaner (7105) and the sound generator emits a sound.
- the sensor included in the sound generation device (7002) detects the movement of the person or object to which the sound generation device (7002) is attached.
- the sound generator (7002) has a sound output unit, such as a speaker or a tuning fork, that outputs sound.
- when the sensor included in the sound generator (7002) detects a movement of the person or object to which the sound generator (7002) is attached, it controls the sound output unit so as to output a sound corresponding to the detected movement.
- the sound corresponding to the movement includes, for example, a sound of a predetermined frequency, a sound of a predetermined frequency output for a predetermined time, and a sound of a predetermined frequency output repeatedly for a predetermined time.
- if the sensor can detect a plurality of movements (moving quickly, moving slowly, moving greatly, moving slightly, and so on), the sound generator may be configured to output a different sound for each movement.
- the sound generator (7002) can output a sound according to the movement of the person or object to which the sound generator (7002) is attached.
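The movement-to-sound encoding described above can be sketched as a lookup table from movement class to tone parameters. All frequencies, durations, and movement names are illustrative assumptions; the near-ultrasonic frequencies simply reflect that the sound may be inaudible to humans.

```python
# Hypothetical encoding table: movement class -> tone the output unit emits.
SOUND_TABLE = {
    "move_fast": {"freq_hz": 18000, "ms": 50,  "repeats": 3},
    "move_slow": {"freq_hz": 18500, "ms": 100, "repeats": 1},
    "tilt":      {"freq_hz": 19000, "ms": 80,  "repeats": 2},
}

def sound_for(movement):
    """Tone parameters the sound output unit should emit for a movement,
    or None if the movement has no registered tone."""
    return SOUND_TABLE.get(movement)

print(sound_for("tilt"))  # -> {'freq_hz': 19000, 'ms': 80, 'repeats': 2}
```

Distinct frequency/duration/repeat combinations are what let the microphone device later tell the movements apart.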
- the output sound may have a frequency audible to the human ear, or may be a sound that the human ear cannot hear.
- the network connection microphone device (7001) includes, for example, a memory.
- the memory is a recording medium for storing information such as a semiconductor memory and a hard disk.
- the memory stores information associating sounds generated by the sound generator (7002) with corresponding movements. Further, an ID for specifying the network-connected microphone device (7001) is stored in the memory.
- the network-connected microphone device (7001) analyzes, for example, the frequency, length, and interval of the sound emitted by the sound generator (7002) included in the collected sound, and identifies the corresponding movement and the person or object to which the sound generator (7002) that emitted the sound is attached.
- this makes it possible to identify the state of a person from the person's movement, and the state of the person moving an object from the object's movement. For example, if the toy car (7103) is moved vigorously, it can be estimated indirectly from the movement of the toy car (7103) that the child (7102) is playing energetically.
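The microphone-side analysis can be sketched as matching a collected tone's frequency and length against the association table stored in the device's memory to recover which sound generator emitted it and what movement it encodes. The table contents and matching tolerances are assumptions for illustration.

```python
# Hypothetical table stored in the microphone device's memory:
# (freq_hz, duration_ms, attached_to, movement)
TONE_TABLE = [
    (18000, 50, "toy_car_7103",     "moved_fast"),
    (19000, 80, "baby_bottle_7104", "tilted"),
]

def decode_tone(freq_hz, ms, freq_tol=200, ms_tol=15):
    """Match a measured tone against the table, within tolerances."""
    for f, d, attached_to, movement in TONE_TABLE:
        if abs(freq_hz - f) <= freq_tol and abs(ms - d) <= ms_tol:
            return {"attached_to": attached_to, "movement": movement}
    return None  # tone not registered; ignore it

print(decode_tone(19050, 78))
# -> {'attached_to': 'baby_bottle_7104', 'movement': 'tilted'}
```

The decoded result, together with the device's ID, is what would be packed into the event information sent to the server (1100).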
- the network-connected microphone device (7001) analyzes, for example, a human voice included in the collected sound and identifies the surrounding situation from the content of the analyzed voice. For example, when a child (7102) such as an infant is in the house A (1210), an adult who takes care of the child (7102) (for example, a mother, father, or babysitter) is often near the child (7102). Taking the mother (7101) shown in FIG. 55 as an example of such an adult, when the mother (7101) takes care of the child (7102), she often talks to the child (7102). The same is true for other adults such as fathers and babysitters.
- when the network-connected microphone device (7001) collects the voice of the mother (7101) located in a range where sound can be collected, it analyzes the content of the voice emitted by the mother (7101) using existing voice recognition technology.
- the memory of the network-connected microphone device (7001) stores table information for associating a predetermined voice content with a child's state with respect to the voice content.
- when the voice content is analyzed, this table information is referenced, the registered content that matches or resembles the analyzed content is identified, and the child's state associated with the identified content is determined.
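The table lookup from recognized caregiver speech to a child state might look like the following sketch. The phrases and states are illustrative assumptions, and the "matches or resembles" criterion is simplified here to a substring check.

```python
# Hypothetical table stored in the microphone device's memory:
# registered phrase -> associated child state.
VOICE_STATE_TABLE = [
    ("good morning", "child_waking_up"),
    ("time to eat",  "child_eating"),
    ("good night",   "child_going_to_sleep"),
]

def child_state_from_voice(recognized_text):
    """Return the child state associated with the first matching phrase."""
    text = recognized_text.lower()
    for phrase, state in VOICE_STATE_TABLE:
        if phrase in text:  # "matches or resembles" simplified to substring
            return state
    return None

print(child_state_from_voice("OK, time to eat your lunch"))  # -> child_eating
```

A real implementation would use fuzzier matching on the voice-recognition output, but the table-driven association is the same.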
- the network-connected microphone device (7001) can also identify the state of the child (7102) by collecting the voice of the child (7102) and analyzing its content. Specifically, when the network-connected microphone device (7001) collects the voice of the child (7102) located in a range where sound can be collected, it analyzes the content of the voice emitted by the child (7102) using existing voice recognition technology.
- the network-connected microphone device (7001) transmits to the server (1100) event information in which the ID for identifying itself is associated with the analysis result.
- the server (1100) can accumulate the state of the person related to the sound collected by the network-connected microphone device (7001).
- the sound generated by the sound generator (7002) may be an inaudible sound that a person cannot perceive. In this case, discomfort caused by the sound generated by the sound generator (7002) can be reduced.
- FIG. 56A is a diagram illustrating an example of a specific configuration of the network-connected microphone device according to the sixth embodiment.
- FIG. 56B is a diagram showing an example of a specific configuration of the sound generator A in the sixth embodiment.
- FIG. 56C is a diagram showing an example of a specific configuration of the sound generator B in the sixth embodiment.
- the specific configurations of the network-connected microphone device (7001a) and the sound generation device (7002a) are the same as those shown in FIGS. 56A to 56C, and thus description thereof is omitted here.
- the network-connected microphone device (7001) is, for example, a fourth home appliance having a function of recognizing voice among a plurality of home appliances, and collects sound and performs voice recognition.
- the network-connected microphone device (7001) determines the state of the object based on the recognized sound.
- the network-connected microphone device (7001) may perform sound recognition by collecting sound generated by the sound generation device mounted on the fifth home appliance used by the object. In this case, the network-connected microphone device (7001) determines the state of the object that uses the fifth home appliance based on the recognized sound.
- the network-connected microphone device (7001) may determine the state of the object based on words included in the recognized sound.
- the network-connected microphone device (7001) includes, for example, a microphone sensor (7201), an environment sensor (7202), a sound control processing unit (7203), a voice recognition processing unit (7204), a network communication unit (7205), and a memory (7206).
- the network-connected microphone device (7001) (in particular, the sound control processing unit (7203), the voice recognition processing unit (7204), and the network communication unit (7205)) includes, for example, a fifth memory and a CPU as a hardware configuration.
- the fifth memory stores, for example, a program that functions as a sound control processing unit (7203), a program that functions as a speech recognition processing unit (7204), and a program that functions as a network communication unit (7205).
- the fifth memory is, for example, a readable recording medium or a readable / writable recording medium.
- Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the network-connected microphone device (7001) shown in FIG. 56A reads, for example, the above-mentioned programs from the fifth memory and executes them on the CPU, so that it functions as the sound control processing unit (7203), the voice recognition processing unit (7204), and the network communication unit (7205).
- the CPU is configured to execute a program that functions as the sound control processing unit (7203), the voice recognition processing unit (7204), and the network communication unit (7205).
- the present invention is not limited to this.
- alternatively, a dedicated signal processing circuit functioning as the sound control processing unit (7203), a dedicated signal processing circuit functioning as the voice recognition processing unit (7204), and a dedicated signal processing circuit functioning as the network communication unit (7205) may be used.
- alternatively, a program that functions as at least one of the sound control processing unit (7203), the voice recognition processing unit (7204), and the network communication unit (7205) may be stored in the fifth memory, and the rest may be configured using dedicated signal processing circuits.
- the memory (7206) is a readable / writable recording medium, for example.
- Examples of the recording medium that can be read and written include a semiconductor memory, an optical disk, and a hard disk.
- the microphone sensor (7201) collects voice or sound within a range where sound can be collected and converts it into electronic data.
- the environment sensor (7202) detects the state around the network-connected microphone device (7001). For example, when the environment sensor (7202) detects a predetermined state, the network-connected microphone device (7001) analyzes the human voice included in the collected sound and the sound emitted from the sound generator (7002).
- the environment sensor (7202) is, for example, an illuminance sensor. Using as a threshold a value that distinguishes the illuminance when the lighting of the room containing the network-connected microphone device (7001) is on from the illuminance when it is off, the sensor compares the detected illuminance with the threshold and determines from the comparison result whether the room lighting is on.
- the network-connected microphone device (7001) thereby detects, for example, that the lighting of the room in which it is installed has been turned on.
- the network-connected microphone device (7001) then analyzes the human voice included in the collected sound and the sound emitted from the sound generator (7002).
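The illuminance comparison described above amounts to a simple threshold test. In this sketch the 150 lux threshold is an assumed value; the patent only requires some value that separates the lights-on and lights-off conditions.

```python
# Minimal sketch of the illuminance check described above. The 150 lux
# threshold is an assumed value, not one stated in the patent.
LIGHTS_ON_THRESHOLD_LUX = 150.0

def room_is_lit(detected_lux):
    """Compare the detected illuminance against the threshold and report
    whether the room lighting appears to be turned on."""
    return detected_lux >= LIGHTS_ON_THRESHOLD_LUX
```

When this returns true, the device would begin analyzing voices and generator sounds as described above.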
- the sound control processing unit (7203) analyzes electronic data corresponding to the sound collected by the microphone sensor (7201), for example.
- the sound control processing unit (7203) extracts a human voice from electronic data corresponding to the collected sound, for example, and outputs it to a voice recognition processing unit (7204) described later.
- the sound control processing unit (7203) also extracts sounds other than the human voice from the electronic data corresponding to the sounds collected by the microphone sensor (7201), and analyzes the sound emitted from the sound generator (7002) included in the extracted data. The voice recognition processing unit (7204) analyzes the data corresponding to the human voice sent from the sound control processing unit (7203) and outputs the result to the sound control processing unit (7203).
- the voice recognition processing unit (7204) analyzes the content of a human voice using existing voice recognition technology.
- the sound control processing unit (7203) transmits to the server (1100) the analysis result of the human voice obtained by the voice recognition processing unit (7204) and the analysis result of the sound from the sound generator (7002) obtained by the sound control processing unit (7203) itself.
- while the environment sensor (7202) detects a predetermined environmental state (for example, that the lighting is on), the sound control processing unit (7203) extracts sounds other than the human voice from the electronic data corresponding to the collected sound and analyzes the sound emitted from the sound generator (7002).
- the sound generator (7002) is mounted on the fifth home appliance used by the object. As shown in FIG. 56B and FIG. 56C, the sound generator (7002) is roughly divided into a sound generator A (7206) and a sound generator B (7209).
- in the sound generator A (7206), the sound source control processing unit (7208) determines the sound to be output based on the value detected by the sensor (7209), and the sound is then output from the speaker (7207).
- the sound generator A (7206) (particularly the sound source control processing unit (7208)) includes, for example, a sixth memory and a CPU as a hardware configuration.
- the sixth memory stores a program that functions as the sound source control processing unit (7208).
- the sixth memory is, for example, a readable recording medium or a readable / writable recording medium. Examples of the recording medium include a semiconductor memory, an optical disk, and a hard disk.
- the sound generator A (7206) shown in FIG. 56B is configured to function as the sound source control processing unit (7208), for example, by reading the above-mentioned program from the sixth memory and executing it on the CPU.
- the CPU is configured to execute a program that functions as the sound source control processing unit (7208), but is not limited thereto.
- a dedicated signal processing circuit that functions as the sound source control processing unit (7208) may be used.
- the sensor (7209) is assumed to be, for example, an acceleration sensor, a gyro sensor, a temperature/humidity sensor, an illuminance sensor, or a human presence sensor. The speaker (7207) may also be a driven tuning fork or the like, as long as it can generate the sound determined by the sound source control processing unit (7208).
- the sound generator B (7209) is composed of a tuning fork (7210).
- the tuning fork (7210) emits a sound according to the movement of the attached object. For example, when it is desired to detect the opening / closing of the door in the house A (1210), the opening / closing operation of the door can be detected by attaching the sound generator B (7209) to the door.
- the microphone sensor (7201) included in the network-connected microphone device (7001) collects the voice uttered by a person located within the sound collection range, or the sound emitted by the sound generator A (7206) or the sound generator B (7209), and generates electronic data corresponding to the collected sound.
- the sound control processing unit (7203) calculates the volume or frequency characteristic of the collected sound.
- in S7304, the sound control processing unit (7203) determines whether the volume is equal to or higher than a threshold value, or whether the level at a specific frequency is equal to or higher than a threshold value. If so, the sound control processing unit (7203) determines that the collected sound includes either a voice uttered by a person or a sound emitted by the sound generator A (7206) or the sound generator B (7209), and the process proceeds to S7305.
- conversely, if it is determined in S7304 that the volume is smaller than the threshold value and the level at the specific frequency is smaller than the threshold value, the sound control processing unit (7203) determines that the collected sound contains neither a voice uttered by a person nor a sound emitted by the sound generator A (7206) or the sound generator B (7209), and the process returns to S7302.
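The S7304 branch can be sketched as a two-condition threshold test. Both threshold values below are assumptions; the patent names the conditions but not the numbers.

```python
# Hedged sketch of the S7304 decision: the captured audio is treated as
# containing a target sound only when its volume, or its level at a watched
# frequency, reaches a threshold. Both threshold values are assumptions.
VOLUME_THRESHOLD = 60.0       # assumed volume threshold (e.g. dB SPL)
FREQ_LEVEL_THRESHOLD = 0.5    # assumed normalized level at the watched frequency

def may_contain_target_sound(volume, freq_level):
    """True when the sound may include a human voice or a sound from
    sound generator A (7206) / B (7209) (proceed to S7305); False means
    neither is present and collection continues (return to S7302)."""
    return volume >= VOLUME_THRESHOLD or freq_level >= FREQ_LEVEL_THRESHOLD
```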
- the sound control processing unit (7203) acquires the value of the environment sensor (7202).
- the sound control processing unit (7203) determines whether the value of the environment sensor (7202) is equal to or greater than a threshold value. If it is, the process proceeds to S7307; otherwise, it returns to S7302.
- the sound control processing unit (7203) determines whether or not the collected sound includes a sound emitted from the sound generator A (7206) or the sound generator B (7209). When it determines that such a sound is included, it analyzes the sound from the sound generator A (7206) or the sound generator B (7209) contained in the collected sound and retains the analysis result. When it determines that the collected sound does not include a sound from the sound generator A (7206) or the sound generator B (7209), no analysis result is retained.
- the voice recognition processing unit (7204) analyzes the sound in the frequency band corresponding to the human voice. Specifically, the voice recognition processing unit (7204) performs voice recognition and returns to the sound control processing unit (7203), as the analysis result, the recognized speech converted into a character string or a word representing a state derived from the voice.
- the sound control processing unit (7203) holds the analysis result output from the speech recognition processing unit (7204).
- the sound control processing unit (7203) requests the network communication unit (7205) to transmit the analysis result in S7401 and the analysis result in S7402 to the server (1100).
- the network communication unit (7205) associates the ID of the network-connected microphone device (7001) with the analysis result, and transmits it to the server (1100).
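The transmission step above associates the device ID with the analysis results. A minimal sketch follows; the field names and the JSON encoding are assumptions, since the patent only specifies that the ID and the results travel together.

```python
# Sketch of the event sent in this step: the ID of the network-connected
# microphone device (7001) is associated with both analysis results before
# transmission to the server (1100). Field names and JSON encoding are
# assumptions for illustration.
import json

def build_event(device_id, voice_result, generator_sound_result):
    """Bundle the device ID with both analysis results into one payload."""
    return json.dumps({
        "device_id": device_id,
        "voice_analysis": voice_result,
        "generator_analysis": generator_sound_result,
    })
```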
- the server (1100) may include a configuration for analyzing the collected sound.
- the network-connected microphone device (7001) may transmit electronic data corresponding to the sound collected by the microphone to the server (1100) in association with the ID of the network-connected microphone device (7001).
- FIG. 59 is a flowchart for explaining an example of the operation of the sound generator A according to the sixth embodiment.
- the sound source control processing unit (7208) of the sound generator A (7206) acquires a sensor value within a specific time window from the sensor (7209).
- the sensor (7209) is an acceleration sensor, a gyro sensor, a temperature / humidity sensor, an illuminance sensor, a human sensor, or the like.
- the sensor value is, for example, the average, maximum, or minimum value within the specific time window.
- the sound source control processing unit (7208) requests the speaker (7207) to output a sound having a specific frequency and rhythm according to the sensor value.
- An example of sound output according to the acceleration sensor values of the three axes XYZ will be described later with reference to FIG. 60.
- the speaker (7207) outputs the requested sound.
- FIG. 60 is an example of a table used to determine the sound output from the sound generator A according to the sixth embodiment.
- the sound source control processing unit (7208) determines the sound to be output according to the acceleration sensor values in the three-axis XYZ directions based on the table shown in FIG.
- in the sound source control processing unit (7208), acceleration threshold values (7601) are defined for the X, Y, and Z directions, and the frequency (7602) and rhythm (7603) of the sound to be output are determined accordingly.
- the row 7604 indicates the frequency (7602) and rhythm (7603) of the sound output when the acceleration value in the X-axis direction is less than -10.0 mV/g.
- the rows 7605 and 7606 indicate the sound frequency (7602) and rhythm (7603) according to the acceleration value in the X-axis direction.
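The FIG. 60 lookup for one axis can be sketched as a banded table. Only the -10.0 mV/g boundary appears in the text; the other boundaries, frequencies, and rhythms below are placeholder values for illustration.

```python
# Rough sketch of the FIG. 60 lookup for the X axis. Only the -10.0 mV/g
# boundary comes from the text; everything else is a placeholder.
X_AXIS_SOUND_TABLE = [
    # (lower bound, upper bound, output frequency in Hz, rhythm)
    (float("-inf"), -10.0, 440.0, "slow"),
    (-10.0, 10.0, 880.0, "medium"),
    (10.0, float("inf"), 1760.0, "fast"),
]

def sound_for_x_acceleration(accel_mv_per_g):
    """Return (frequency, rhythm) for an X-axis acceleration reading,
    as the sound source control processing unit (7208) would."""
    for low, high, freq, rhythm in X_AXIS_SOUND_TABLE:
        if low <= accel_mv_per_g < high:
            return freq, rhythm
```

The Y and Z axes would use analogous tables, per the thresholds (7601) defined for each direction.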
- FIG. 61 is a diagram illustrating an example of a state of a child associated with mother's voice recognition in the sixth embodiment.
- after recognizing the mother's voice, the voice recognition processing unit (7204) identifies the child's state using the table shown in FIG. 61. Specifically, when a word corresponding to any entry in column (7701) is included in the recognition result, the voice recognition processing unit (7204) estimates that the child's state corresponds to the accompanying entry in column (7702). For example, when the voice recognition processing unit (7204) detects the word "laughing" in the result of the mother's voice recognition, the recognition result corresponds to row (7703), and the child's state is specified (determined) accordingly. Note that the conversion shown in FIG. 61 may be performed by the server (1100).
- FIGS. 62 and 63 show examples of a UI displayed on the display device in the sixth embodiment.
- the value converted by the voice recognition processing unit (7204) through the process shown in FIG. 58 is transmitted to the display device (1510) via the server (1100) or directly, and an example of the UI displayed on its screen is shown.
- the displayed icon (7802) and message (7803) are determined based on the values converted by the voice recognition processing unit (7204). Furthermore, the elapsed time message (7804) indicates the elapsed time since the notification to the display device (1510).
- the values converted in the past by the voice recognition processing unit (7204) can be listed.
- FIG. 64 is a diagram showing an example of the display effect of the UI displayed on the display device in the sixth embodiment, and FIG. 65 is a diagram showing an example of the speed of the display effect according to the positional relationship between the network-connected microphone device and the display device in the sixth embodiment.
- the display device (1510) is a device capable of measuring its current position by itself, for example by indoor positioning using GPS or Wi-Fi, and the display means (1511) of the display device (1510) determines the display effect according to the relationship between the pre-registered installation position of the network-connected microphone device (7001) and the current position of the display device (1510).
- the speed of the display effect from display start to end is defined in a table such as that shown in FIG. 65, depending on the physical positional relationship between the network-connected microphone device (7001) and the display device (1510). For example, when the distance between the network-connected microphone device (7001) and the display device (1510) is 38 kilometers, the display means (1511) determines that it corresponds to row (8103) in column (8101) of FIG. 65, and displays the icon (7802) so that the display effect from start to end takes the 1.5 seconds shown in row (8103) of column (8102).
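A distance-banded table like FIG. 65 can be sketched as below. Only the pairing "38 km falls in a band whose effect lasts 1.5 s" comes from the text; the band edges and the other durations are assumptions.

```python
# Illustrative reading of the FIG. 65 table: the display-effect duration is
# selected from the distance between the network-connected microphone device
# (7001) and the display device (1510). Band edges and the 0.5 s / 3.0 s
# values are assumed; only 1.5 s for the 38 km case is from the text.
DISTANCE_BANDS_KM = [
    (10.0, 0.5),            # up to 10 km  -> 0.5 s effect (assumed)
    (50.0, 1.5),            # up to 50 km  -> 1.5 s effect (covers 38 km)
    (float("inf"), 3.0),    # farther      -> 3.0 s effect (assumed)
]

def display_effect_seconds(distance_km):
    """Pick the start-to-end duration of the display effect for a distance."""
    for upper_km, seconds in DISTANCE_BANDS_KM:
        if distance_km <= upper_km:
            return seconds
```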
- the display effect may be, for example, an effect in which the icon (7802) becomes gradually clearer by gradually decreasing its transparency.
- FIG. 66 is a diagram showing an example of a display effect according to the elapsed time of event notification to the display device (1510) in the sixth embodiment.
- when a notification including a message (7803), as shown in the UI (7801) of FIG. 62, is displayed on the display device (1510), the display means (1511) changes the elapsed time message (7804) according to the time elapsed since the notification, and also changes the transparency of the icon (7802).
- the user can thus intuitively grasp the time elapsed since the notification at a glance at the UI (7801) on the display screen of the display device (1510).
- for example, when 45 minutes have elapsed since the notification, this corresponds to the elapsed-time row (8203) in column (8201) of FIG. 66, so the transparency is 10% and the icon is displayed faintly translucent on the display device (1510). As more time elapses since the notification, the transparency gradually increases, so the icon (7802) fades toward a fully transparent state.
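The fade-out above can be sketched as a mapping from elapsed time to transparency. Only "45 minutes -> 10% transparency" is stated in the text; the linear fade and the point of full transparency (450 minutes) are assumptions chosen to reproduce that single pair.

```python
# Sketch of the FIG. 66 fade-out: transparency grows with the time elapsed
# since the notification. Linear fade and the 450-minute full-fade point
# are assumptions; only 45 min -> 10% comes from the text.
def icon_transparency_percent(elapsed_minutes):
    """Map elapsed minutes since notification to icon transparency (0-100%)."""
    full_fade_minutes = 450.0  # assumed: fully transparent after 7.5 hours
    return min(100.0, 100.0 * elapsed_minutes / full_fade_minutes)
```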
- FIG. 67 is a flowchart illustrating an example of processing in which the server according to the sixth embodiment notifies the display device 1 of a UI update.
- the communication means (1101) of the server (1100) receives an event from the network connection microphone device (7001) of each house.
- the server (1100) stores the received event in the home appliance DB group (1103).
- An example of an event stored in the home appliance DB group (1103) will be described with reference to FIG.
- the server (1100) determines whether a certain time has elapsed.
- the determination condition may include a case where a certain number of events are received in addition to time.
- the server (1100) aggregates the events received from each house and stored in the home appliance DB group (1103), for each area according to the house ID of each house and the state of the child. An example of the aggregation result will be described later.
- the server (1100) determines the ranking of each house (house ID) for each area tabulated in S8304.
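The aggregation and ranking in S8304 and S8305 can be sketched as below. Field names loosely mirror FIG. 68 but are otherwise assumptions, as is the choice of crying duration as the ranking key (the server also ranks by maximum volume).

```python
# Minimal sketch of S8304/S8305: events accumulated per house are grouped by
# residence area, crying durations are summed per house ID, and houses are
# ranked within each area by total time. Field names are assumptions.
from collections import defaultdict

def rank_crying_time(events):
    """events: iterable of dicts with keys 'area', 'house_id', 'state',
    'minutes'. Returns {area: [(house_id, total_minutes), ...]} with houses
    sorted from longest to shortest total crying time."""
    totals = defaultdict(lambda: defaultdict(float))
    for ev in events:
        if ev["state"] == "crying":
            totals[ev["area"]][ev["house_id"]] += ev["minutes"]
    return {
        area: sorted(houses.items(), key=lambda kv: kv[1], reverse=True)
        for area, houses in totals.items()
    }
```

An analogous pass using the maximum volume field would produce the maximum volume ranking (8304).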
- the communication unit (1101) of the server (1100) sends a display change notification to the display unit (1511).
- an example of a UI displayed on the display means (1511) after receiving the display change notification will be described with reference to FIG. 70.
- FIG. 68 is a diagram showing an example of an event notified by the network-connected microphone device stored in the household appliance DB group in the sixth embodiment.
- house ID (8201) is a unique value for identifying each house.
- House ID (8201) is a value stored in advance in an ID management unit (not shown) of the network-connected microphone device (7001) shown in FIG. 56A.
- Date / time (8202) is the time when the event of the network-connected microphone device (7001) is generated.
- the child state (8203) is a child state determined by the network-connected microphone device (7001).
- the maximum volume (8204) is the maximum volume detected by the network-connected microphone device (7001) when collecting sound, and is attached by the sound control processing unit (7203) in FIG. 56A when it generates a notification event to the server (1100).
- the time (8205) indicates the duration for which the child state (8203) continuously occurred, and is attached by the sound control processing unit (7203) when it generates a notification event to the server (1100).
- the residence area (8206) is a value stored in advance in the ID management unit (not shown) of the network-connected microphone device (7001) shown in FIG. 56A and indicates the user's area of residence. For example, rows 8207 and 8208 share the same house ID (8201), H000-0001, and therefore represent events transmitted to the server (1100) from the network-connected microphone device (7001) of the same house.
- FIG. 69A and FIG. 69B are diagrams illustrating an example of a result of counting events accumulated in the home appliance DB group of the server in the sixth embodiment.
- FIG. 69A is a table aggregating values for houses whose residence area (8206) shown in FIG. 68 is Osaka and whose child state (8203) is crying.
- the row 8303 aggregates the rows in FIG. 68 that have the same house ID (8201) and a child state (8203) of crying; the source events of this aggregation are rows 8207 and 8208 in FIG. 68.
- in the maximum volume ranking (8304) shown in FIG. 69B, houses are ranked in descending order of maximum volume (8301).
- in the time ranking (8305), houses are ranked in descending order of time (8302).
- FIG. 70 is a diagram illustrating an example of a UI displayed after the UI change notification from the server is displayed by the display device according to the sixth embodiment.
- the UI (8401) covering the entire display screen of the display device (1510) is the UI for house ID (8201) H000-0001, and is displayed when the user performs an operation such as opening the ranking screen.
- a loud-crying volume ranking message (8403) is displayed based on the maximum volume ranking (8304) shown in FIG. 69B, and a loud-crying time ranking message (8402) is displayed based on the time ranking (8305) shown in FIG. 69B.
- in this way, the fourth home appliance having a voice recognition function among the plurality of home appliances collects sound and performs voice recognition in a voice recognition step, and the state of the object is determined based on the sound recognized in that step.
- 71A to 71C are diagrams for explaining an example of a form in which a service is provided using a server.
- the server (1100) is shown with a configuration comprising a cloud server (110011) operated by a data center operating company (11001) and a server (110021) operated by a service provider.
- the cloud server (110011) is a virtualized server that cooperates with various devices via the Internet, mainly managing enormous data (big data) that is difficult to handle with ordinary database management tools.
- the data center operating company (11001) performs data management, management of the cloud server (110011), operation of the data center that carries these out, and the like. Details of the services performed by the data center operating company (11001) will be described later.
- the data center operating company (11001) is not limited to a company that performs only data management or operation of the cloud server (110011).
- for example, when a device manufacturer that develops and manufactures one of the home appliances described in Embodiments 1 to 6 also performs data management, management of the cloud server (110011), and the like, the device manufacturer corresponds to the data center operating company (11001) (FIG. 71B).
- the data center operating company (11001) is not limited to one company.
- when the device manufacturer and another management company jointly perform, or share, the data management and operation of the cloud server (110011), both or one of them corresponds to the data center operating company (11001) (FIG. 71C).
- the service provider (11002) has a server (110021).
- the server (110021) mentioned here includes, for example, a personal computer or a server that operates on a gateway device regardless of the scale. In some cases, the service provider does not have a server (110021).
- event information of home appliances generated in house A (1210) shown in the figure (or house B (1211) shown in FIG. 1, although not shown in FIG. 71A) is transmitted to the cloud server (110011) (arrow (a) in FIG. 71A).
- the cloud server (110011) receives and stores event information of home appliances transmitted from, for example, the house A (1210) and the house B (1211).
- the cloud server (110011) of the data center operating company (11001) provides the stored event information of home appliances to the service provider (11002) in a certain unit.
- this unit may be one in which the data center operating company can organize the accumulated event information and provide it to the service provider (11002), or a unit requested by the service provider (11002). Although described as a fixed unit, it need not be fixed, and the amount of information provided may change depending on the situation.
- event information of the home appliances provided to the service provider (11002) by the data center operating company (11001) is stored, as necessary, in the server (110021) held by the service provider (11002) (arrow (b) in FIG. 71A).
- the service provider (11002) organizes the event information into information suitable for the service provided to the user (for example, shared screen information), based on the type and frequency of the home appliance event information, and provides it to the user.
- the user to be provided may be a user (10) who uses one or a plurality of home appliances, or a user (20) outside the house.
- the service may be provided to the user directly from the service provider (11002) without passing through the cloud server (110011) again (arrow (e) or (f) in FIG. 71A), for example.
- alternatively, the service may be provided to the user via the cloud server (110011) of the data center operating company (11001) again (arrows (c) and (d) in FIG. 71A). Further, the cloud server (110011) of the data center operating company (11001) may organize the event information into information suitable for the service to be provided to the user, based on the type and frequency of the home appliance event information, and provide it to the service provider (11002).
- FIG. 72 is a diagram for explaining an example of service types.
- FIG. 72 is a diagram specifically showing service type 1 (in-house data center type).
- the service provider (11002) obtains information from house A (1210) shown in the figure (or house B (1211) shown in FIG. 1, although not shown here) and provides a service to the user.
- in this type, the service provider (11002) has the functions of a data center operating company; that is, the service provider owns the cloud server (110011) that manages big data, and therefore there is no separate data center operating company.
- the service provider (11002) operates and manages the data center (cloud server (110011)) (1100203).
- the service provider (11002) manages the OS (1100202) and the application (1100201).
- the service provider (11002) provides a service using the OS (1100202) and the application (1100201) that it manages (1100204).
- FIG. 73 is a diagram for explaining an example of service types.
- FIG. 73 is a diagram specifically showing service type 2 (IaaS usage type).
- IaaS is an abbreviation for infrastructure as a service, and is a cloud service provision model that provides a base for constructing and operating a computer system as a service via the Internet.
- the data center operating company (11001) operates and manages the data center (cloud server (110011)) (1100103).
- the service provider (11002) manages the OS (1100202) and the application (1100201).
- the service provider (11002) provides a service using the OS (1100202) and the application (1100201) managed by the service provider (11002) (1100204).
- FIG. 74 is a diagram for explaining an example of service types.
- FIG. 74 is a diagram specifically showing service type 3 (PaaS usage type).
- PaaS is an abbreviation for platform as a service, and is a cloud service provision model in which a platform serving as the foundation for constructing and operating software is provided as a service via the Internet.
- the data center operating company (11001) manages the OS (1100102) and operates and manages the data center (cloud server (110011)) (1100103). The service provider (11002) manages the application (1100201). The service provider (11002) provides a service using the OS (1100102) managed by the data center operating company and the application (1100201) managed by the service provider (11002) (1100204).
- FIG. 75 is a diagram for explaining an example of service types.
- FIG. 75 is a diagram specifically showing service type 4 (SaaS usage type).
- SaaS is an abbreviation for software as a service.
- SaaS is a cloud service provision model in which, for example, applications provided by a platform provider that owns a data center (cloud server) can be used via a network such as the Internet by companies or individuals (users) that do not own a data center (cloud server).
- the data center operating company (11001) manages the application (1100101), manages the OS (1100102), and operates and manages the data center (cloud server (110011)) (1100103).
- the service provider (11002) provides a service using the OS (1100102) and the application (1100101) managed by the data center operating company (11001) (1100204).
- in each of these types, the service provider (11002) performs the act of providing the service.
- for example, the service provider or the data center operating company may itself develop the OS, applications, big data database, and the like, or may outsource their development to a third party.
- according to the information providing method and the like described above, an enormous variety of information can be provided in a display mode that is easy to view.
- according to this information providing method, a huge variety of information related to home appliances connected to the network in a plurality of homes can be provided on a portal screen that users in each home can easily view and use according to their purpose.
- here, the "home appliances and the like" connected to the network can include not only AV home appliances such as TVs and recorders and so-called white goods such as air conditioners and refrigerators, but also beauty appliances, health devices, digital cameras, and any other hardware or software that is connected to a network and can communicate its own data. Therefore, devices that can communicate data by M2M, such as NFC sensors, may also be included.
- Each component may be configured by dedicated hardware, or may be realized by executing a software program suited to that component.
- Each component may also be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Economics (AREA)
- Operations Research (AREA)
- Theoretical Computer Science (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- General Engineering & Computer Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- Quality & Reliability (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Telephonic Communication Services (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
In recent years, as network environments have become established within homes, not only personal computers but also TVs, recorders, and even home appliances such as air conditioners and refrigerators are now sold with functions for connecting to a network.
In this embodiment, a method is described for collecting, as events, the operation information and state-change information of home appliances in the house generated by user operations and the like, and for forming an information display screen for sharing them on a social network.
FIG. 1 is a diagram showing an example of the configuration of the information providing system in Embodiment 1. The information providing system shown in FIG. 1 includes a plurality of home appliances, a display device, and a server (1100), which are connected via a public network (1200).
FIG. 2 is a diagram showing an example of the house IDs and attribute information held by the home appliance DB group (1103) in Embodiment 1. As an example of the information held in the home appliance DB group (1103) of FIG. 1, FIG. 2 shows a house ID (2001), a nickname (2002), and owned appliances (2003).
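The shape of these attribute records can be sketched as a simple data structure. This is a hypothetical illustration: the field names mirror the figure (house ID, nickname, owned appliances), but the record shape, IDs, and values are invented, not taken from the actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class HouseRecord:
    # One row of the home appliance DB group, mirroring FIG. 2:
    # house ID (2001), nickname (2002), owned appliances (2003).
    house_id: str
    nickname: str
    owned_appliances: list = field(default_factory=list)

db = [
    HouseRecord("H001", "House A", ["TV", "air conditioner", "refrigerator"]),
    HouseRecord("H002", "House B", ["recorder", "vacuum cleaner"]),
]

# Index by house ID to answer "which appliances does this house own?"
by_id = {rec.house_id: rec for rec in db}
print(by_id["H001"].owned_appliances)
```

A lookup of this kind is all the display side needs to decide which appliance icons a given house can contribute to the shared screen.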
FIG. 4A is a diagram showing an example of screen information of the information sharing service in Embodiment 1. FIG. 4B is a diagram showing an example of the configuration of the display device in Embodiment 1, and FIG. 4C is a diagram showing an example of the configuration of the server in Embodiment 1.
FIG. 5 is a flowchart showing an example of the UI acquisition processing of the display device in Embodiment 1. FIG. 5 shows the processing up to the point where display device 1 (1510) acquires the UI from the server.
As described above, according to the information providing method of this embodiment, even an enormous variety of information can be provided in a display mode that is easy to view.
In this embodiment, one form of a method for displaying the UI shown on display devices such as display device 1 (1510) and display device 2 (1520) is described. In this embodiment, the various information (operation information and state-change information) obtainable from home appliances through user operations on them is collected as events, and the information on related home appliances is aggregated into categories. This realizes a UI display method that is easy for the user to view while displaying as much information as possible on the limited screen area of the display device. The figures already described in Embodiment 1 are not described again.
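The aggregation just described — folding per-appliance events into categories so a limited screen can present more at once — can be sketched as follows. This is a minimal sketch: the event tuples, category names, and the proportional-area rule are illustrative assumptions, not the embodiment's actual algorithm.

```python
from collections import Counter

# Hypothetical event stream: (appliance, category) pairs derived from
# operation / state-change events; the names are invented for illustration.
events = [
    ("TV", "AV"), ("recorder", "AV"), ("TV", "AV"),
    ("air conditioner", "white goods"), ("refrigerator", "white goods"),
    ("hair dryer", "beauty"),
]

def category_areas(events, screen_area=600):
    # Split the limited screen area among categories in proportion to
    # how many events each category produced.
    counts = Counter(cat for _, cat in events)
    total = sum(counts.values())
    return {cat: screen_area * n / total for cat, n in counts.items()}

areas = category_areas(events)
print(areas["AV"])  # AV produced 3 of 6 events, so it gets half the area: 300.0
```

The same counts can then rank categories, so the busiest group of appliances occupies the most visible region of the screen.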
FIG. 15 is a diagram showing an example of the configuration of the information providing system in this embodiment. It differs from FIG. 1 in that the server (3001) shown in FIG. 15 includes a home appliance category DB (3004) as an example of the home appliance DB group (1103), and in the configuration of the display format determination means (1102A).
Below, the processing for determining the category type of the home appliances owned by the user and the display size of the home appliance icons is described with reference to FIGS. 17 and 18.
As described above, according to the information providing method of this embodiment, even an enormous variety of information can be provided in a display mode that is easy to view.
This embodiment describes an information providing method that focuses on the operations that the user must in particular perform in order to operate home appliances.
FIG. 20 is a diagram showing an example of the configuration of the display format determination means in Embodiment 3.
FIG. 21 is a flowchart showing an example of the processing in which the home appliance operating time prediction means of Embodiment 3 predicts the operating time of home appliances.
FIG. 22 is a diagram showing an example of a method of calculating operating time from the event logs held by the home appliance DB group in Embodiment 3.
FIG. 24 is a diagram showing an example of the home appliance operation ratios held by the home appliance DB group in Embodiment 3.
FIG. 25 is a flowchart showing an example of the processing in which the display index value calculation unit of Embodiment 3 calculates a display index.
FIG. 27 is a diagram showing another example of a display screen displayed in a specific display format in Embodiment 3.
As described above, according to the information providing method of this embodiment, information can be provided in a display mode that focuses on the operations that the user must in particular perform in order to operate home appliances.
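The idea of weighting predicted operating time by the user's own operation time to obtain a display index can be sketched as below. The formula, weight, and example minutes are all illustrative assumptions, not the values used in the embodiment.

```python
def display_index(predicted_runtime_min, user_operation_min, weight=100.0):
    # Toy index: predicted appliance operating time plus a heavy weight on
    # the time the user must actively spend operating the appliance, so
    # attention-demanding appliances are emphasized on screen.
    return predicted_runtime_min + weight * user_operation_min

indices = {
    "washing machine": display_index(45, 5),   # loading/unloading needs the user
    "air conditioner": display_index(300, 1),  # long runtime, little operation
}
largest = max(indices, key=indices.get)
print(largest)  # washing machine (index 545.0 beats the air conditioner's 400.0)
```

The icon's display size and position could then be made proportional to this index, which is the role the display index value plays in Embodiment 3.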
The display information generated by the display format determination means (1102) of this embodiment, with which the display device displays in a specific display format on the display screen, is described below.
The display format determination means (1102) performs processing to generate a specific display format in which the background image is deformed and displayed so that the plurality of person icons or the plurality of home appliance icons do not overlap. In addition to the processing of the display format determination means (1102) of Embodiments 1 to 3, when the home appliance icons to be displayed are switched after the background image has been deformed and displayed, the display format determination means (1102) returns the background image to its original, undeformed shape; and when the home appliance icons superimposed on the original background image overlap, it generates a specific display format in which the background image is deformed and displayed again. Here, the background image is, for example, a map.
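A toy one-dimensional analogue of this overlap-avoiding deformation is sketched below. It shifts icon coordinates apart rather than warping the map itself, and the gap value is an invented assumption; the point is only the recompute-from-original behavior.

```python
def spread_icons(positions, min_gap=10):
    # Toy 1-D analogue of the deformation: shift icon coordinates apart
    # until no two are closer than min_gap, preserving their order.
    # The embodiment deforms the background map instead of the icons.
    out = []
    for x in sorted(positions):
        if out and x - out[-1] < min_gap:
            x = out[-1] + min_gap
        out.append(x)
    return out

# Recompute from the ORIGINAL positions each time the icon set changes,
# mirroring the revert-then-redeform behavior described in the text.
print(spread_icons([0, 4, 5, 30]))  # [0, 10, 20, 30]
```

Starting from the undeformed positions on every icon switch prevents deformations from accumulating across successive updates.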
FIG. 31 is a flowchart showing an example of the processing for generating the specific display format shown in FIG. 30.
As described above, according to the information providing method of this embodiment, even an enormous variety of information can be provided in a display mode that is easy to view.
In this embodiment, a case is described in which the operation information and state-change information of home appliances in the house generated by user operations and the like are collected as events, and display information is formed with which the display device displays, in a specific display format, content shared on a social network.
FIG. 35A is a diagram showing an example of the appearance of the wireless coaster in Embodiment 5. FIG. 35B is a diagram showing an example of the configuration of the wireless coaster in Embodiment 5.
FIG. 36 is a flowchart showing an example of the processing in which the wireless coaster of Embodiment 5 detects a state.
FIG. 37 is a diagram showing an example of the configuration of a system in which the wired coaster of Embodiment 5 is connected to a PC by a wired connection such as USB. Elements similar to those in FIG. 35B are given the same reference numerals, and detailed description is omitted.
FIG. 38 is a flowchart showing an example of the processing in which the coaster of Embodiment 5 detects a state and cooperates with the PC.
FIG. 39 is a diagram showing an example of the information obtained from the coaster in Embodiment 5.
FIG. 40 is a diagram showing an example of the screen information displayed in conjunction with the coaster in Embodiment 5.
FIG. 41 is a diagram showing an example of the configuration used when the wireless coaster of Embodiment 5 performs state estimation. Elements similar to those in FIG. 35B are given the same reference numerals, and detailed description is omitted.
FIG. 42 is a flowchart showing the processing in which the coaster of Embodiment 5 detects a state.
FIG. 43 is a diagram showing an example of weight-change patterns in Embodiment 5.
FIG. 44 is a diagram showing an example of the information obtained from the coaster in Embodiment 5. As shown in the event values of FIG. 44, state information such as "cup set" (the state when a cup is placed), "pouring", and "pour finished" is detected and notified to the server (1100). The state estimation based on this information may instead be performed on the server (1100) side.
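The weight-change-pattern classification can be sketched as follows. The thresholds, sample values, and event labels are illustrative assumptions; the real patterns are those of FIG. 43.

```python
def classify_weight_events(samples, settle=2.0):
    # Toy classifier over successive weight readings (grams) from the
    # coaster's weight sensor: a sudden large jump = "cup set",
    # a steady rise = "pouring", stabilizing after a rise = "pour finished".
    events = []
    pouring = False
    for prev, cur in zip(samples, samples[1:]):
        delta = cur - prev
        if not events and delta > 50:            # empty cup placed at once
            events.append("cup set")
        elif delta > settle:                     # weight creeping upward
            if not pouring:
                events.append("pouring")
                pouring = True
        elif pouring and abs(delta) <= settle:   # weight has stabilized
            events.append("pour finished")
            pouring = False
    return events

readings = [0, 120, 150, 180, 210, 211, 211]  # grams over time
print(classify_weight_events(readings))
```

Whether this classification runs on the coaster or on the server is exactly the design choice noted above; the pattern logic is the same either way.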
FIG. 45 is a diagram showing an example of the configuration of a server having a state estimation function in Embodiment 5. The server (1100A) shown in FIG. 45 includes a communication unit (6061), a coaster information acquisition unit (6062), a display information communication unit (6063), a message management unit (6064), a server-side state estimation unit (6065), a state DB (6066), a display information search unit (6067), a cup DB (6068), and a display format determination unit (6069).
FIGS. 46 and 47 are flowcharts showing an example of server processing in Embodiment 5.
FIG. 48 is a diagram showing an example of the information stored in the cup DB in Embodiment 5.
FIG. 50 is a diagram showing an example of the configuration of a system in Embodiment 5 that uses coffee maker usage information and coaster information at the same time.
FIG. 52 is a diagram showing an example of a shared screen in Embodiment 5 that displays information about makers such as a coffee maker.
FIG. 53 is a flowchart showing an example of server processing in Embodiment 5 when drink maker information is used.
According to the information providing method of this embodiment, by placing sensors on the home appliance itself or around it, the user's lifestyle information can be acquired when the user uses the home appliance, and a specific display format for sharing it can be generated.
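One way to combine a current-sensed appliance (such as the coffee maker) with the coaster's weight events is to place the appliance's icon nearer the user's avatar the closer in time the two events occurred. The rule below is a sketch under invented constants; the actual placement logic of the embodiment is not specified here.

```python
def icon_distance(weight_event_time, current_event_time, scale=2.0, max_px=200):
    # Toy rule: distance (pixels) from the avatar grows with the time gap
    # (minutes) between the coaster's weight event and the moment the
    # current sensor detected the other appliance running.
    gap_min = abs(weight_event_time - current_event_time) / 60.0
    return min(max_px, scale * gap_min)

# The coffee maker ran 3 minutes (180 s) before the cup was set down.
print(icon_distance(weight_event_time=600, current_event_time=420))  # 6.0
```

A small distance then reads naturally on the shared screen as "this appliance was used together with the drink".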
In this embodiment, an example is described in which a network-connected microphone device (7001) is used as GW1 (1301), as a method for connecting home appliances and small sensing devices to GW1 (1301).
FIG. 56A is a diagram showing an example of the specific configuration of the network-connected microphone device in Embodiment 6. FIG. 56B is a diagram showing an example of the specific configuration of sound generation device A in Embodiment 6, and FIG. 56C is a diagram showing an example of the specific configuration of sound generation device B in Embodiment 6. Since the specific configurations of the network-connected microphone device (7001a) and the sound generation device (7002a) are the same as those shown in FIGS. 56A to 56C, their description is omitted here.
FIGS. 57 and 58 are flowcharts for explaining an example of the operation of the network-connected microphone device in Embodiment 6.
FIG. 59 is a flowchart for explaining an example of the operation of sound generation device A in Embodiment 6.
FIG. 60 is an example of the table used to determine the sound output by sound generation device A in Embodiment 6. In this embodiment, the sound source control processing unit (7208) determines the sound to output according to the 3-axis XYZ accelerometer values, based on the table shown in FIG. 60.
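A table lookup of this kind can be sketched as below. The axis names come from the text (3-axis XYZ accelerometer); the table contents and sound names are invented stand-ins for FIG. 60.

```python
# Hypothetical stand-in for the table of FIG. 60: map the dominant
# acceleration axis to an output sound.
SOUND_TABLE = {"x": "vroom", "y": "beep", "z": "whistle"}

def pick_sound(ax, ay, az):
    # Choose the sound for the axis with the largest absolute
    # acceleration, mimicking a lookup on the sensor values.
    axis = max({"x": ax, "y": ay, "z": az}.items(), key=lambda kv: abs(kv[1]))[0]
    return SOUND_TABLE[axis]

print(pick_sound(0.1, -2.3, 0.4))  # strongest motion on Y -> "beep"
```

A richer table could key on value ranges per axis rather than the dominant axis alone, but the lookup structure stays the same.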
FIGS. 62 and 63 are examples of the UI displayed on the display device in Embodiment 6. FIGS. 62 and 63 show examples of the UI displayed when the values converted by the speech recognition processing unit (7204) in the processing of FIG. 58 are transmitted to the display device (1510) via the server (1100) or directly.
FIG. 64 is a diagram showing an example of a display effect of the UI displayed on the display device in Embodiment 6, and FIG. 65 is a diagram showing an example of the speed of the display effect according to the positional relationship between the network-connected microphone device and the display device in Embodiment 6.
FIG. 67 is a flowchart showing an example of the processing in which the server of Embodiment 6 notifies display device 1 of a UI update.
FIG. 68 is a diagram showing an example of the events notified by the network-connected microphone device and stored in the home appliance DB group in Embodiment 6.
FIGS. 69A and 69B are diagrams showing an example of the aggregation results of the events accumulated in the server's home appliance DB group in Embodiment 6. FIG. 69A is a table that aggregates the values for households whose residential area (8206) in FIG. 68 is Osaka and whose child state (8203) is "crying". Row 8303 aggregates the rows of FIG. 68 that have the same house ID (8201) and whose child state (8203) is "crying"; the source events are rows 8207 and 8208 of FIG. 68.
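That per-house, per-region tabulation can be sketched as follows. The event records imitate the columns named in the text (house ID, child state, residential area), but the IDs, durations, and areas are invented example data.

```python
from collections import defaultdict

# Hypothetical events in the shape of FIG. 68: house ID (8201),
# child state (8203), residential area (8206), crying duration (s).
events = [
    {"house_id": "A", "state": "crying", "area": "Osaka", "sec": 40},
    {"house_id": "A", "state": "crying", "area": "Osaka", "sec": 20},
    {"house_id": "B", "state": "crying", "area": "Osaka", "sec": 90},
    {"house_id": "C", "state": "crying", "area": "Tokyo", "sec": 15},
]

def crying_totals(events, area):
    # Aggregate, per house in one residential area, the rows whose
    # child state is "crying" -- the tabulation illustrated in FIG. 69A.
    totals = defaultdict(int)
    for e in events:
        if e["area"] == area and e["state"] == "crying":
            totals[e["house_id"]] += e["sec"]
    return dict(totals)

print(crying_totals(events, "Osaka"))  # {'A': 60, 'B': 90}
```

Totals of this kind are what feed ranking-style messages such as the loud-crying duration ranking on the shared UI.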
FIG. 70 is a diagram showing an example of the UI displayed by the display device after a UI change notification from the server in Embodiment 6.
As described above, according to the information providing method of this embodiment, a fourth home appliance having a speech recognition function among the plurality of home appliances collects sound, performs speech recognition, and determines the state of an object based on the sound recognized in the speech recognition step. This makes it possible to generate display information with which the display device displays, in a specific display format, content corresponding to the state of the object, so that the content can be shown on the display screen of the display device in the specific display format.
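Judging an object's state from words contained in the recognized sound can be sketched as a keyword lookup. The vocabulary and state labels below are invented; they only illustrate the word-to-state mapping the method describes.

```python
def judge_state(recognized_text):
    # Toy mapping from words found in recognized speech to a child's
    # state; the keyword vocabulary is a hypothetical stand-in.
    keywords = {"waa": "crying", "haha": "laughing", "zzz": "sleeping"}
    for word, state in keywords.items():
        if word in recognized_text.lower():
            return state
    return "unknown"

print(judge_state("WAA waa"))  # "crying"
```

The resulting state is what the server receives and turns into display content in the specific display format.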
In this embodiment, the services provided by the server described in the above embodiments are explained.
FIG. 72 is a diagram for explaining an example of service types.
FIG. 73 is a diagram for explaining an example of service types.
FIG. 74 is a diagram for explaining an example of service types.
FIG. 75 is a diagram for explaining an example of service types.
1101, 1512 Communication means
1102 Display format determination means
1103 Home appliance DB group
1200 Public network
1201 House A
1211 House B
1251 Home appliance icon
1252, 1254, 3030, 4601 House icon
1253 Message
1301 GW1
1302 GW2
1401 Home appliance 1
1402 Home appliance 2
1403 Home appliance 3
1404 Home appliance 4
1405 Home appliance 5
1406 Home appliance n
1510 Display device 1
1511 Display means
3002 Category display size determination unit
3003 Home appliance icon display size determination unit
3004 Home appliance category DB
3005 Home appliance category DB update unit
3011 Category ID
3031 Category display area
3032-3039 Individual category display areas
3040 Speech balloon
4001 Home appliance operating time prediction means
4002 Display index value calculation unit
4602, 4603, 4604, 4605, 4606 Home appliance icon display areas
4701 Map image
4702, 4703, 4704 Home appliance icons
4705 House icon
4801, 4802, 4901, 4902, 4903 Person icons
5001 Background map image
5002, 5003 User icons
5004, 5005 Background map images
6000, 6000B Wireless coaster
6000A Wired coaster
6001 Cup
6002 Weight sensor
6003 Wireless communication unit
6004 ID management unit
6011 Wired communication unit
6020 Information processing unit
6021 USB port
6022 Internet connection unit
6023 Display device
6041 State estimation unit
6042 Time management unit
6043 State DB
6081, 6111 Coffee maker
6082 Current sensor
6083 Wireless communication unit
6091 Current sensor information reception unit
6092 Maker state estimation unit
6093 Maker state DB
6112 Juicer mixer
7001 Network-connected microphone device
7002 Sound generation device
7101 Mother
7102 Child
7103 Toy car
7104 Baby bottle
7105 Vacuum cleaner
7201 Microphone sensor
7202 Environment sensor
7203 Sound control processing unit
7204 Speech recognition processing unit
7205 Network communication unit
7206 ID management unit
7801, 7901, 8401 UI
7802 Icon
7803 Message
7804 Elapsed time message
8402 Loud-crying duration ranking message
8403 Loud-crying volume ranking message
110011 Cloud server
110021 Server
1100101, 1100201 Application
1100102, 1100202 OS
1100103, 1100203 Data center
Claims (26)
- An information providing method in a plurality of home appliances, a display device, and a server connected via a network, the method comprising, in the server:
a receiving step of receiving, from each of the plurality of home appliances, information about that home appliance;
a processing step of filtering the information about each of the plurality of home appliances received in the receiving step, and generating display information for the display device to display in a specific display format; and
a transmitting step of transmitting the display information from the processing step to the display device.
- The information providing method according to claim 1, further comprising, in the display device:
a display step of displaying, in the specific display format on a display screen of the display device, objects corresponding to home appliances that include filtered information about one or more home appliances, based on the display information transmitted by the server.
- The information providing method according to claim 1 or 2, wherein the processing step includes:
an aggregating step of aggregating, by category, the received information about the plurality of home appliances, either at fixed time intervals or whenever a fixed number of pieces of information about the plurality of home appliances have been received in the receiving step; and
a calculating step of calculating, using the result of the aggregation, display format information indicating the specific display format, including the display size and display position of the objects corresponding to home appliances that include the filtered information about one or more home appliances,
and wherein, in the transmitting step, the display information including the display format information is transmitted to the display device.
- The information providing method according to claim 3, wherein, in the aggregating step, the received information about each of the plurality of home appliances is aggregated by categories determined based on information about the house of the user of the home appliances, including geographic information such as region, family composition and number of family members, and housing type.
- The information providing method according to claim 3 or 4, wherein:
in the receiving step, information about each of the plurality of home appliances transmitted by that home appliance when its state changed is received;
in the aggregating step, the operating rate of each type of home appliance is further estimated based on the number of times the information about each of the plurality of home appliances was received in the receiving step; and
in the calculating step, the display format information is calculated based on the operating rate.
- The information providing method according to any one of claims 3 to 5, wherein, in the calculating step, the display format information calculated indicates a specific display format that does not depend on the screen size of the display device and that includes the display position and the display size relative to that screen.
- The information providing method according to claim 2, wherein:
in the receiving step, the received information about each of the plurality of home appliances is accumulated in a database;
the processing step further includes a determining step of assigning, to the information about each of the plurality of home appliances received in the receiving step, a category ID indicating a category type, and of determining, based on the received information and the accumulated information, a category display priority indicating the display priority of the category type and an appliance display priority indicating the display priority of each of the plurality of home appliances;
in the processing step, the display information is generated so as to include the assigned category ID and the determined category display priority and appliance display priority; and
in the display step, based on the display information, one or more home appliances sharing the same category ID are displayed as a group on the display screen, and information about home appliances belonging to a category ID with a higher category display priority is displayed larger on the display screen.
- The information providing method according to claim 7, wherein, in the determining step, the category display priority is determined according to the sum of the usage frequencies of the one or more home appliances sharing the same category ID.
- The information providing method according to claim 7 or 8, wherein, in the processing step, when a home appliance whose usage frequency is at or below a predetermined frequency is used by the user, the category display priority of the category ID to which the used home appliance belongs is temporarily set higher than its predetermined value.
- The information providing method according to any one of claims 7 to 9, wherein, in the determining step, a category ID indicating the same category type is assigned to a plurality of home appliances that are used at or above a preset usage frequency in a preset time period.
- The information providing method according to any one of claims 7 to 10, wherein, in the display step, based on the appliance display priority, icons representing the one or more home appliances sharing the same category ID are displayed as a group on the display screen, and among those icons, the icon corresponding to a home appliance with a higher appliance display priority is displayed larger.
- The information providing method according to claim 2, wherein the processing step further includes:
a home appliance operating time prediction step of predicting, based on the information about each of the plurality of home appliances received in the receiving step, the home appliance operating time during which the plurality of home appliances operated; and
a calculating step of calculating, by weighting the home appliance operating time by the user's operation time, a display index value for determining the display size and display position of home appliance icons, which are the objects corresponding to home appliances that include the filtered information about the one or more home appliances,
and wherein, in the display step, the display size and display position of the home appliance icons on the display screen are determined based on the display index value transmitted by the server, and the icons are displayed on the display screen.
- The information providing method according to claim 12, wherein, in the display step, an icon indicating that a user operation is being performed is displayed near the home appliance icon corresponding to the home appliance on which the user operation is performed.
- The information providing method according to claim 12 or 13, wherein:
in the display step, as the specific display format, the home appliance icons and, among a plurality of person icons representing the plurality of users to be displayed, person icons selected according to a predetermined display priority are superimposed on a background image and displayed on the display screen; and
the predetermined display priority is determined using the information about each of the plurality of home appliances received in the receiving step that belongs to each of the plurality of users.
- The information providing method according to claim 14, wherein the display priority is determined using the operating time of each of the plurality of home appliances as the information about each of the plurality of home appliances.
- The information providing method according to claim 15, wherein the priority is determined using the accumulated operating time of each of the plurality of home appliances as the information about each of the plurality of home appliances.
- The information providing method according to any one of claims 14 to 16, wherein, in the display step, the background image is further deformed and displayed so that the plurality of person icons or the plurality of home appliance icons do not overlap.
- The information providing method according to claim 17, wherein, in the display step, when the home appliance icons to be displayed are switched after the background image has been deformed and displayed, the background image is returned to its original, undeformed shape; and when home appliance icons superimposed on the original background image overlap, the background image is deformed and displayed again.
- The information providing method according to any one of claims 12 to 18, wherein the background image is a map.
- The information providing method according to claim 2, further comprising:
a measuring step in which a first home appliance having a weight-measuring function among the plurality of home appliances measures a change in the weight of an object; and
a device transmitting step of transmitting, to the server via the network, weight information indicating the change in weight measured in the measuring step and an identifier uniquely identifying the first home appliance,
wherein, in the receiving step, the transmitted identifier of the home appliance and the weight information are received;
the processing step further includes an estimating step of estimating, from the change pattern of the received weight information, the state of the object measured by the first home appliance;
in the processing step, display information is generated for the display device to display, in the specific display format, display content matched to the state of the object estimated in the estimating step; and
in the display step, based on the display information, among the avatars displayed on the display screen, the avatar of the user related to the object is changed according to the display content.
- The information providing method according to claim 20, wherein:
in the measuring step, the weight of a cup is measured when the cup is placed on the first home appliance; and
in the estimating step, whether the user of the first home appliance placed the cup on the first home appliance or lifted the cup from the first home appliance is estimated from the change pattern of the received weight information as the state of the object.
- The information providing method according to claim 20 or 21, wherein:
in the estimating step, the weight of the object used by the user of the first home appliance is inferred from the change pattern of the received weight information;
the display step further includes a comparing step of comparing, based on the display information, the inferred weight of the object with a plurality of pre-registered images corresponding to object weights; and
in the display step, the image of the object for the avatar displayed on the display screen is changed to an image corresponding to the weight of the object inferred in the comparing step.
- The information providing method according to any one of claims 20 to 22, wherein:
in the measuring step, a second home appliance having a current-measuring function further measures the current of a third home appliance;
in the device transmitting step, the current of the third home appliance measured in the measuring step is further transmitted to the server;
the display step further includes an arranging step of arranging, based on the display information, beside the avatar displayed on the display screen, information indicating the third home appliance, whose use is identified by the second home appliance measuring its current and which is used within an arbitrary time difference from the first home appliance; and
in the arranging step, the distance at which the information indicating the third home appliance is arranged is changed according to the difference between the operating time of the third home appliance inferred from the current and the time at which the first home appliance was used, inferred from the change pattern of the weight information.
- The information providing method according to claim 1, further comprising:
a speech recognition step in which a fourth home appliance having a speech recognition function among the plurality of home appliances collects sound and performs speech recognition; and
a determining step of determining the state of an object based on the sound recognized in the speech recognition step,
wherein, in the receiving step, the state of the object is received as the information about each of the plurality of home appliances; and
in the processing step, display information is further generated for the display device to display, in the specific display format, display content corresponding to the received state of the object.
- The information providing method according to claim 24, wherein, in the speech recognition step, sound emitted by a sound generation device mounted on a fifth home appliance used by the object is collected, and the state of the object using the fifth home appliance is determined based on the sound recognized in the speech recognition step.
- The information providing method according to claim 24 or 25, wherein:
the determining step further includes determining the state of the object based on words contained in the sound recognized in the speech recognition step;
in the receiving step, the state of the object is further received; and
in the processing step, display information is further generated for the display device to display, in the specific display format, display content corresponding to the received state of the object.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/770,117 US10355947B2 (en) | 2013-05-16 | 2014-05-13 | Information providing method |
CN201480009242.5A CN105247428B (zh) | 2013-05-16 | 2014-05-13 | 信息提供方法 |
JP2015516914A JP6478242B2 (ja) | 2013-05-16 | 2014-05-13 | 情報提供方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-104271 | 2013-05-16 | ||
JP2013104271 | 2013-05-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014185056A1 true WO2014185056A1 (ja) | 2014-11-20 |
Family
ID=51898046
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/002519 WO2014185056A1 (ja) | 2013-05-16 | 2014-05-13 | 情報提供方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10355947B2 (ja) |
JP (1) | JP6478242B2 (ja) |
CN (1) | CN105247428B (ja) |
TW (1) | TW201507486A (ja) |
WO (1) | WO2014185056A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017041008A (ja) * | 2015-08-18 | 2017-02-23 | ヤマハ株式会社 | Control device |
CN106940548A (zh) * | 2017-05-05 | 2017-07-11 | 苏州普绿法环保科技有限公司 | Intelligent IoT dust-removal equipment management and control system |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10085211B2 (en) * | 2014-09-02 | 2018-09-25 | Apple Inc. | Communication of processor state information |
KR20160088102A (ko) * | 2015-01-15 | 2016-07-25 | 삼성전자주식회사 | 네트워크에서 연결 상태를 표시하기 위한 장치 및 방법 |
CN106130835B (zh) * | 2016-06-22 | 2020-09-08 | 广东美的厨房电器制造有限公司 | Kitchen communication system and method |
CN106603350B (zh) * | 2016-12-15 | 2020-06-02 | 北京小米移动软件有限公司 | Information display method and device |
JP6571144B2 (ja) * | 2017-09-08 | 2019-09-04 | シャープ株式会社 | Monitoring system, monitoring device, server, and monitoring method |
JP2019066378A (ja) * | 2017-10-03 | 2019-04-25 | 東芝ライフスタイル株式会社 | Operating sound comparison device |
FR3074988A1 (fr) * | 2017-12-11 | 2019-06-14 | Orange | Management of household equipment using a household identifier |
CA190752S (en) | 2018-01-15 | 2021-02-16 | Lutron Electronics Co | Control device for smart building systems |
USD872122S1 (en) * | 2018-01-15 | 2020-01-07 | Lutron Technology Company Llc | Display screen or portion thereof with graphical user interface |
USD846507S1 (en) | 2018-01-15 | 2019-04-23 | Lutron Technology Company Llc | Control device |
JP2019153883A (ja) * | 2018-03-01 | 2019-09-12 | パナソニックIpマネジメント株式会社 | Display method and program |
CN110945877B (zh) * | 2018-03-02 | 2022-10-25 | 松下知识产权经营株式会社 | Device management system and device management method |
US20190278527A1 (en) * | 2018-03-06 | 2019-09-12 | Kabushiki Kaisha Toshiba | System and method for machine learning optimization of human resource scheduling for device repair visits |
KR102499734B1 (ko) * | 2018-10-08 | 2023-02-15 | Google LLC | Summary delivery of smart appliance states |
US11092954B2 (en) * | 2019-01-10 | 2021-08-17 | Johnson Controls Technology Company | Time varying performance indication system for connected equipment |
JP7389630B2 (ja) * | 2019-11-29 | 2023-11-30 | 東芝ライフスタイル株式会社 | Home appliance system, home appliance, video device, external server, and method for displaying home appliance information |
US11693823B2 (en) * | 2020-06-09 | 2023-07-04 | Adp, Inc. | File decay property |
TWI739481B (zh) * | 2020-06-18 | 2021-09-11 | National Yang Ming Chiao Tung University | Virtual user device signal synthesis system and method |
WO2022087269A1 (en) * | 2020-10-21 | 2022-04-28 | Verint Americas Inc. | System and method of automated determination of use of sensitive information and corrective action for improper use |
US11886853B2 (en) * | 2021-02-09 | 2024-01-30 | Capital One Services, Llc | Software widget installation on a client device |
KR20220154394A (ko) * | 2021-05-13 | 2022-11-22 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying graphic objects |
TWI843050B (zh) * | 2022-01-26 | 2024-05-21 | 台灣松下電器股份有限公司 | Display method for device status |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0224796A (ja) * | 1988-07-14 | 1990-01-26 | Matsushita Electric Ind Co Ltd | Crime prevention device |
JPH02280296A (ja) * | 1989-04-20 | 1990-11-16 | Matsushita Electric Ind Co Ltd | Alarm device |
JPH11120473A (ja) * | 1997-10-09 | 1999-04-30 | Matsushita Electric Ind Co Ltd | Device usage condition diagnosis system |
JP2008046934A (ja) * | 2006-08-17 | 2008-02-28 | Toshiba Corp | Home appliance network system |
JP2012221205A (ja) * | 2011-04-08 | 2012-11-12 | Enegate:Kk | Energy-saving information distribution system for electric appliances |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2734944B2 (ja) * | 1993-08-27 | 1998-04-02 | 三菱化学株式会社 | Process for producing cyclohexanol |
US7210099B2 (en) * | 2000-06-12 | 2007-04-24 | Softview Llc | Resolution independent vector display of internet content |
JP2007316813A (ja) | 2006-05-24 | 2007-12-06 | Information display method on a monitoring and control system screen |
JP4898581B2 (ja) * | 2007-07-12 | 2012-03-14 | 株式会社日立製作所 | User interface method, display device, and user interface system |
US8170886B2 (en) * | 2008-03-28 | 2012-05-01 | The Nielsen Company (U.S.), Llc | Systems, methods, and apparatus to generate an energy consumption index |
US20090313582A1 (en) * | 2008-06-11 | 2009-12-17 | Raul Rupsingh | System, Method and Computer Program for User-Friendly Social Interaction |
US20100124939A1 (en) * | 2008-11-19 | 2010-05-20 | John Osborne | Method and system for graphical scaling and contextual delivery to mobile devices |
JP5492047B2 (ja) | 2010-10-21 | 2014-05-14 | 日本電信電話株式会社 | Purchasing behavior analysis device, purchasing behavior analysis method, purchasing behavior analysis program, purchasing behavior analysis system, and control method |
AU2012206258A1 (en) * | 2011-01-13 | 2013-08-01 | Tata Consultancy Services Limited | A method and system for effective management of energy consumption by household appliances |
JP2013054528A (ja) | 2011-09-02 | 2013-03-21 | Device management apparatus, device management method, and program |
US20130080898A1 (en) * | 2011-09-26 | 2013-03-28 | Tal Lavian | Systems and methods for electronic communications |
US20130196297A1 (en) * | 2012-01-31 | 2013-08-01 | Yasir Anwar | Interactive shopping - health & wellness |
WO2013154692A1 (en) * | 2012-04-09 | 2013-10-17 | Master Lock Company | Luggage lock |
US20140046675A1 (en) * | 2012-08-08 | 2014-02-13 | Jeffrey Harwood | System and method for processing and displaying medical provider information |
US20140108978A1 (en) * | 2012-10-15 | 2014-04-17 | At&T Mobility Ii Llc | System and Method For Arranging Application Icons Of A User Interface On An Event-Triggered Basis |
US20140201655A1 (en) * | 2013-01-16 | 2014-07-17 | Lookout, Inc. | Method and system for managing and displaying activity icons on a mobile device |
US20140214596A1 (en) * | 2013-01-29 | 2014-07-31 | Wal-Mart Stores, Inc. | Shopping process including monitored shopping cart basket weight |
-
2014
- 2014-05-13 JP JP2015516914A patent/JP6478242B2/ja active Active
- 2014-05-13 US US14/770,117 patent/US10355947B2/en active Active
- 2014-05-13 CN CN201480009242.5A patent/CN105247428B/zh active Active
- 2014-05-13 WO PCT/JP2014/002519 patent/WO2014185056A1/ja active Application Filing
- 2014-05-14 TW TW103116975A patent/TW201507486A/zh unknown
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017041008A (ja) * | 2015-08-18 | 2017-02-23 | ヤマハ株式会社 | Control device |
WO2017030013A1 (ja) * | 2015-08-18 | 2017-02-23 | ヤマハ株式会社 | Control device, control method, and control system |
US10642235B2 (en) | 2015-08-18 | 2020-05-05 | Yamaha Corporation | Control apparatus, control method, and control system that provide display control to display images associated with master and slave devices |
CN106940548A (zh) * | 2017-05-05 | 2017-07-11 | 苏州普绿法环保科技有限公司 | Intelligent IoT dust-removal equipment management and control system |
Also Published As
Publication number | Publication date |
---|---|
TW201507486A (zh) | 2015-02-16 |
CN105247428A (zh) | 2016-01-13 |
CN105247428B (zh) | 2019-02-19 |
US10355947B2 (en) | 2019-07-16 |
JP6478242B2 (ja) | 2019-03-06 |
US20160057029A1 (en) | 2016-02-25 |
JPWO2014185056A1 (ja) | 2017-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6478242B2 (ja) | Information providing method | |
US11281971B2 (en) | Devices, systems, and methods that observe and classify real-world activity relating to an observed object, and track and disseminate state relating the observed object | |
KR102014623B1 (ko) | Image display device, topic selection method, topic selection program, image display method, and image display program | |
US10015060B2 (en) | Information sharing method that provides a graphical user interface image for sharing information relating to an application within a home among a plurality of users | |
CN109791762A (zh) | Noise reduction for voice interface devices | |
CN107548474A (zh) | Methods, systems, and media for ambient background noise modification based on mood and/or behavior information | |
US10969924B2 (en) | Information processing apparatus, method, and non-transitory computer readable medium that controls a representation of a user object in a virtual space | |
CN104125511A (zh) | Multimedia data push method and device | |
CN106162369A (zh) | Method, device, and system for realizing interaction in a virtual scene | |
CN105074740A (zh) | User status confirmation system, user status confirmation method, communication terminal device, user status notification method, and computer program | |
CN109211259A (zh) | Trajectory route display method, device, terminal, and storage medium | |
KR20130106029A (ko) | Smart table with a display device | |
CN105893771A (zh) | Information service method and device, and device for information services | |
CN107979687A (zh) | Wallpaper switching method and mobile terminal | |
JP2017027473A (ja) | Cooperation system, cooperation server, device control server, device, and terminal device | |
CN109658198A (zh) | Product recommendation method and mobile terminal | |
JP7452524B2 (ja) | Information processing device and information processing method | |
WO2020196100A1 (ja) | Information processing device, information processing method, and program | |
KR101263231B1 (ко) | Control system for user control of network-connected media devices using a runtime engine | |
CN112823346B (zh) | Information providing method | |
CN108696635A (zh) | User behavior detection method, device, system, and electronic device | |
US10860617B2 (en) | Information processing apparatus, information processing method, and program | |
CN109063003A (zh) | Content recommendation method and mobile terminal | |
JP7407681B2 (ja) | Electronic device and system | |
US20240005948A1 (en) | Information providing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14798606 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015516914 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14770117 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14798606 Country of ref document: EP Kind code of ref document: A1 |