WO2022176946A1 - Control system, and control method - Google Patents

Control system, and control method

Info

Publication number
WO2022176946A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
scene
information
terminal
information terminal
Prior art date
Application number
PCT/JP2022/006382
Other languages
French (fr)
Japanese (ja)
Inventor
彩衣 吉川
恵里 眞田
洋子 藤原
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2022176946A1 publication Critical patent/WO2022176946A1/en

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/56 Remote control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M11/00 Telephonic communication systems specially adapted for combination with other electrical systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q SELECTING
    • H04Q9/00 Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • The present invention relates to a control system and a control method.
  • Patent Literature 1 discloses a home appliance control system in which a sound collector connectable to a network collects voice information, and the home appliance is controlled via the network based on the collected voice information.
  • The present invention provides a control system and a control method that can adaptively change the content of device control.
  • A control system according to one aspect of the present invention includes a setting unit that sets the content of scene control for devices installed in a facility, and a control unit that executes only part, or all, of the content of the scene control set by the setting unit according to the type of the information terminal that receives an instruction to execute the scene control or the current position of the information terminal.
  • A control method according to one aspect of the present invention includes a setting step of setting the content of scene control for devices installed in a facility, and a control step of executing only part, or all, of the content of the scene control set in the setting step according to the type of the information terminal that receives an instruction to execute the scene control or the current position of the information terminal.
  • A program according to one aspect of the present invention is a program for causing a computer to execute the control method.
  • A control system and a control method according to one aspect of the present invention can adaptively change the content of device control.
  • FIG. 1 is a block diagram showing the functional configuration of the control system according to the embodiment.
  • FIG. 2 is a diagram showing an example of setting information for executing scene control.
  • FIG. 3 is a flowchart of an operation for setting the content of scene control according to Operation Example 1.
  • FIG. 4 is a diagram showing an example of a display screen of devices to be controlled.
  • FIG. 5 is a flowchart of local scene control according to Operation Example 1.
  • FIG. 6 is a sequence diagram of voice scene control according to Operation Example 1.
  • FIG. 7 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that has received the execution instruction is a voice UI device.
  • FIG. 8 is a sequence diagram of remote scene control according to Operation Example 1.
  • FIG. 9 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that has received the execution instruction is a mobile terminal.
  • FIG. 10 is a diagram showing an example of a notification screen.
  • FIG. 11 is a sequence diagram of an operation for setting the content of scene control according to Operation Example 2.
  • FIG. 12 is a diagram showing an example of setting information for voice scene control.
  • FIG. 13 is a diagram showing an example of setting information for remote scene control.
  • FIG. 14 is a sequence diagram of voice scene control according to Operation Example 2.
  • FIG. 15 is a sequence diagram of remote scene control according to Operation Example 2.
  • FIG. 16 is a first sequence diagram of scene control according to Operation Example 3.
  • FIG. 17 is a second sequence diagram of scene control according to Operation Example 3.
  • FIG. 18 is a first sequence diagram of scene control according to Operation Example 4.
  • FIG. 19 is a second sequence diagram of scene control according to Operation Example 4.
  • FIG. 20 is a diagram showing an example of a display screen for obtaining consent to restriction of functions in voice scene control.
  • Each figure is a schematic diagram and is not necessarily strictly illustrated. Moreover, in each figure, the same reference signs are assigned to substantially the same components, and redundant description may be omitted or simplified.
  • FIG. 1 is a block diagram showing the functional configuration of the control system according to the embodiment.
  • the control system 10 shown in FIG. 1 is a device control system capable of controlling a plurality of devices 30 installed in a facility 90.
  • the facility 90 is, for example, a residence such as an apartment complex or a detached house, but may be a facility other than a residence such as an office.
  • The control system 10 includes a control terminal 20, a plurality of devices 30, a wireless communication device 35, a voice UI device 40, a voice recognition server 50, a voice control server 60, a mobile terminal 70, and a remote control server 80.
  • The voice UI device 40, the control terminal 20, and the plurality of devices 30 are installed inside the facility 90.
  • the mobile terminal 70 may be positioned inside the facility 90 or may be positioned outside the facility 90 .
  • the control terminal 20 is, for example, a HEMS (Home Energy Management System) controller having an energy management function.
  • the control terminal 20 is installed in the facility 90 and manages the amount of electricity used (in other words, power consumption) of the plurality of devices 30 installed in the facility 90 .
  • the control terminal 20 also controls a plurality of devices 30 installed within the facility 90 (or within the site of the facility 90).
  • the control terminal 20 is not limited to the HEMS controller, and may be another home controller that does not have an energy management function, or a gateway device.
  • the control terminal 20 is an example of an information terminal.
  • control terminal 20 includes a display unit 21, a communication unit 22, an information processing unit 23, and a storage unit 24.
  • the display unit 21 is a display device having an image display function and a function of accepting manual input from the user.
  • the display unit 21 is implemented by a touch panel and a display panel such as a liquid crystal panel or an organic EL (Electro Luminescence) panel.
  • the touch panel is, for example, a capacitive touch panel, but may be a resistive touch panel.
  • the communication unit 22 is a communication circuit for the control terminal 20 to communicate with other devices via the wide area communication network 100 .
  • the communication unit 22 is, for example, a wireless communication circuit that performs wireless communication, but may be a wired communication circuit that performs wired communication.
  • a communication standard for communication performed by the communication unit 22 is not particularly limited.
  • the information processing section 23 performs information processing related to control of the device 30 .
  • the information processing section 23 is realized by, for example, a microcomputer, but may be realized by a processor or a dedicated circuit.
  • the information processing section 23 has a setting section 25, a control section 26, and a display control section 27 as functional components. Functions of the setting unit 25 , the control unit 26 , and the display control unit 27 are realized by, for example, executing a computer program stored in the storage unit 24 by a microcomputer or the like that constitutes the information processing unit 23 . Details of the functions of the setting unit 25, the control unit 26, and the display control unit 27 will be described later.
  • the storage unit 24 is a storage device that stores computer programs executed by the information processing unit 23, setting information for scene control (described later), and the like.
  • the storage unit 24 is implemented by, for example, a semiconductor memory.
  • the plurality of devices 30 are devices to be controlled in the control system 10 installed in the facility 90 .
  • the multiple devices 30 are controlled by control signals transmitted by the control unit 26 of the control terminal 20 .
  • the plurality of devices 30 include an air conditioner 31, a lighting device 32, an electric lock 33, an electric blind 34, and the like.
  • the air conditioner 31 is a general household air conditioner.
  • the air conditioner 31 is an air conditioner capable of adjusting the temperature of the air sent from the air conditioner 31 by having a heat exchanger (not shown) or the like. That is, the air conditioner 31 has a temperature adjustment function (blowing function and cooling/heating function).
  • the air conditioner 31 is not limited to a general household air conditioner, and may be an industrial air conditioner.
  • the lighting device 32 illuminates the interior (room) of the facility 90 .
  • the lighting device 32 is, for example, a ceiling light, but the specific aspect of the lighting device 32 is not particularly limited.
  • the lighting device 32 may be a downlight, pendant light, spotlight, bracket light, or the like.
  • the lighting device 32 may be a device that illuminates the outside (outdoor) of the facility 90 .
  • the electric lock 33 is a security device that controls the unlocking and locking of the doors (or windows, etc.) of the facility 90 .
  • the electric lock 33 has an RFID reader that acquires key information from, for example, a card key.
  • the electric lock 33 may include a biosensor that acquires biometric information such as a fingerprint as key information.
  • the electric lock 33 is described as an electric lock provided on the front door.
  • the electric blinds 34 are placed inside the windows of the facility 90 and adjust the amount of outside light that enters the facility 90 . Also, the electric blind 34 may be used as a partition inside the facility 90 .
  • the electric blind 34 has a plurality of slats, and the angles of the slats (hereinafter also simply referred to as the angle of the electric blind 34, etc.) can be adjusted.
  • the wireless communication device 35 is a device that performs short-range wireless communication, and is a wireless LAN (Local Area Network) router installed in the facility 90 .
  • The wireless communication device 35 is communicatively connected to the control terminal 20 and the voice UI device 40. Communication between the control terminal 20 and the voice UI device 40 through the wide area communication network 100 is performed via the wireless communication device 35.
  • the voice UI device 40 is an information terminal into which user's uttered voice is input, and transmits a voice signal of the input uttered voice to the voice recognition server 50 via the wide area communication network 100 .
  • the voice UI device 40 is, for example, a smart speaker installed at the facility 90 .
  • the voice UI device 40 specifically has a microphone 41 and a speaker 42 .
  • the microphone 41 acquires the voice uttered by the user (speech voice).
  • the microphone 41 is specifically a condenser microphone, a dynamic microphone, a MEMS (Micro Electro Mechanical Systems) microphone, or the like.
  • the speaker 42 outputs sound (mechanical sound) as a response to the speech sound acquired by the microphone 41 . This allows the user to interactively control the device 30 .
  • The voice recognition server 50 is a computer located outside the facility 90.
  • The voice recognition server 50 is a cloud server that receives voice signals transmitted by the voice UI device 40 and performs voice recognition processing on the received voice signals.
  • A business operator that provides a voice recognition service uses the voice recognition server 50 to provide the service.
  • The voice recognition server 50, for example, converts a voice signal transmitted by the voice UI device 40 into text information and transmits the text information to the voice control server 60.
  • the voice control server 60 is a computer located outside the facility 90 .
  • the voice control server 60 is specifically a cloud server that generates a control command based on text information transmitted from the voice recognition server 50 and transmits the generated control command to the control terminal 20 .
  • the voice control server 60 includes a first communication section 61 , a first information processing section 62 and a first storage section 63 .
  • the first communication unit 61 is a communication module (communication circuit) for the voice control server 60 to communicate with other devices via the wide area communication network 100 .
  • the communication performed by the first communication unit 61 is, for example, wired communication, but may be wireless communication.
  • the communication standard used for communication is also not particularly limited.
  • the first information processing unit 62 performs information processing for controlling the device 30 based on voice input received by the voice UI device 40 .
  • the first information processing section 62 is implemented by, for example, a microcomputer, but may be implemented by a processor.
  • the first information processing section 62 has a first setting section 64 and a first control section 65 as functional components.
  • The functions of the first setting unit 64 and the first control unit 65 are realized by, for example, a microcomputer or the like constituting the first information processing unit 62 executing a computer program stored in the first storage unit 63. Details of the functions of the first setting unit 64 and the first control unit 65 will be described later.
  • the first storage unit 63 is a storage device that stores computer programs and the like executed by the first information processing unit 62 .
  • the first storage unit 63 is implemented by, for example, an HDD (Hard Disc Drive).
  • the mobile terminal 70 is a mobile information terminal for manual input (that is, operation) for the user to remotely control the device 30, and is specifically a smart phone, a tablet terminal, or the like.
  • the mobile terminal 70 has a display section 71 , a position acquisition section 72 and a wireless communication section 73 .
  • the display unit 71 is a display device having an image display function and a function of accepting manual input from the user.
  • the display unit 71 is realized by a touch panel and a display panel such as a liquid crystal panel or an organic EL panel.
  • the touch panel is, for example, a capacitive touch panel, but may be a resistive touch panel.
  • the location acquisition unit 72 acquires (in other words, calculates) the current location of the mobile terminal 70 and outputs current location information indicating the acquired current location.
  • the position acquisition unit 72 is implemented by, for example, a positioning module compatible with a satellite positioning system such as GPS (Global Positioning System) or Quasi-Zenith Satellite System.
  • the wireless communication unit 73 is a communication circuit for communication connection between the mobile terminal 70 and the wireless communication device 35 .
  • the wireless communication unit 73 is specifically a short-range wireless communication circuit compatible with wireless LAN.
  • Remote control server 80 is a computer located outside facility 90 .
  • the remote control server 80 is a cloud server that generates a control command based on information transmitted from the mobile terminal 70 and transmits the generated control command to the control terminal 20 .
  • the remote control server 80 includes a second communication section 81 , a second information processing section 82 and a second storage section 83 .
  • the second communication unit 81 is a communication module (communication circuit) for the remote control server 80 to communicate with other devices via the wide area communication network 100 .
  • the communication performed by the second communication unit 81 is, for example, wired communication, but may be wireless communication.
  • the communication standard used for communication is also not particularly limited.
  • the second information processing unit 82 performs information processing for controlling the device 30 based on the user's manual input received by the mobile terminal 70 .
  • the second information processing section 82 is implemented by, for example, a microcomputer, but may be implemented by a processor.
  • the second information processing section 82 has a second setting section 84 and a second control section 85 as functional components.
  • The functions of the second setting unit 84 and the second control unit 85 are realized by, for example, a microcomputer constituting the second information processing unit 82 executing a computer program stored in the second storage unit 83. Details of the functions of the second setting unit 84 and the second control unit 85 will be described later.
  • the second storage unit 83 is a storage device in which computer programs and the like executed by the second information processing unit 82 are stored.
  • the second storage unit 83 is implemented by, for example, an HDD.
  • a user of the control system 10 can control the equipment 30 by making manual inputs to the control terminal 20 .
  • the control unit 26 of the control terminal 20 transmits a control signal corresponding to the manual input to the device 30 via the local communication network.
  • the local communication network is, for example, a communication network conforming to ECHONET Lite (registered trademark). In the following, such control of the device 30 based on manual input to the control terminal 20 is also described as local control.
  • the user can control the device 30 by inputting voice to the voice UI device 40 .
  • When voice input is received by the microphone 41, the voice UI device 40 transmits a voice signal corresponding to the voice input to the voice recognition server 50 via the wide area communication network 100.
  • the voice recognition server 50 converts the voice signal into text information and transmits the text information to the voice control server 60 via the wide area communication network 100 .
  • the voice control server 60 generates a control command according to the text information and transmits the generated control command to the control terminal 20 via the wide area communication network 100 .
  • the control unit 26 of the control terminal 20 transmits a control signal corresponding to the control command to the device 30 via the local communication network.
  • The device 30 is thus controlled.
  • In the following, such control of the device 30 based on voice input to the voice UI device 40 is also described as voice control.
  • the user can control the device 30 by performing manual input to the mobile terminal 70 .
  • the mobile terminal 70 transmits control request information according to the manual input to the remote control server 80 via the wide area communication network 100 .
  • the remote control server 80 generates a control command according to the control request information, and transmits the generated control command to the control terminal 20 via the wide area communication network 100 .
  • the control unit 26 of the control terminal 20 transmits a control signal corresponding to the control command to the device 30 via the local communication network.
  • The device 30 is thus controlled.
  • In the following, such control of the device 30 based on manual input to the mobile terminal 70 is also described as remote control.
  • Scene control is control that collectively operates one or a plurality of devices 30 arranged in facility 90 in order to bring the interior of facility 90 closer to a predetermined indoor environment.
  • FIG. 2 is a diagram showing an example of setting information for executing scene control.
  • In the setting information, the control details of the devices 30 are associated with each life scene (scene name), such as coming home, getting up, going out, and going to bed.
  • The control unit 26 of the control terminal 20 controls the devices 30 by referring to the setting information stored in the storage unit 24.
  • In the "coming home" scene, for example, the air conditioner 31 and the lighting device 32 are turned on with pre-registered settings (brightness, set temperature, etc.), the electric lock 33 is locked, and the slats of the electric blind 34 are opened to the set angle.
  • In other words, the user can set which devices 30 perform which operations when execution of the "coming home" scene control is instructed.
  • For example, when the user instructs execution of the "coming home" scene control by manual input to the control terminal 20, the information transmission path is the same as that of local control described above; such scene control of the devices 30 based on manual input to the control terminal 20 is also described as local scene control.
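  • As a concrete illustration (not part of the patent text), the setting information of FIG. 2 can be pictured as a mapping from scene names to per-device control details. The following Python sketch uses hypothetical device identifiers and field names; the disclosure does not specify a data format.

      # Hypothetical sketch of the scene-control setting information (FIG. 2).
      # Device identifiers and fields are illustrative assumptions.
      SETTING_INFO = {
          "coming_home": {
              "air_conditioner_31": {"power": "on", "set_temperature_c": 26},
              "lighting_32": {"power": "on", "brightness_pct": 80},
              "electric_lock_33": {"action": "lock"},
              "electric_blind_34": {"action": "open", "slat_angle_deg": 45},
          },
          "going_to_bed": {
              "lighting_32": {"power": "off"},
              "electric_blind_34": {"action": "close"},
          },
      }

      def send_control_signal(device: str, control: dict) -> None:
          print(f"control {device}: {control}")  # stand-in for the local network transmission

      def execute_scene_locally(scene_name: str) -> None:
          """Local scene control: execute every control detail of the scene as-is."""
          for device, control in SETTING_INFO[scene_name].items():
              send_control_signal(device, control)

  • In local scene control, a call such as execute_scene_locally("coming_home") would correspond to executing all of the set content without narrowing.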
  • the user can also execute the "coming home” scene control by speaking to the voice UI device 40, "make it a coming home scene".
  • the information transmission path at this time is the same as the voice control described above, and scene control based on voice input to the voice UI device 40 is also described as voice scene control.
  • For example, the angle of the electric blind 34 can be designated in local scene control, but it cannot be designated in voice scene control, where only opening and closing are possible.
  • Therefore, in voice scene control, the setting information for local scene control cannot be used as it is.
  • For this reason, the setting information for executing voice scene control is stored in advance in the first storage unit 63 of the voice control server 60, separately from the setting information (FIG. 2) for executing local scene control.
  • When setting (storing in the first storage unit 63) the setting information for executing voice scene control, the user performs manual input to the mobile terminal 70.
  • the user can also execute scene control of "coming home" by performing manual input to the mobile terminal 70.
  • the information transmission path at this time is the same as that of the remote control, and scene control based on manual input to the mobile terminal 70 is also described as remote scene control.
  • Since remote scene control assumes control of the devices 30 from outside the facility 90, there are restrictions on the content of the control.
  • For example, the electric lock 33 is excluded from scene control from the viewpoint of security.
  • Detailed operations of the devices 30 may also be excluded. That is, in remote scene control, neither the setting information for local scene control nor the setting information for voice scene control can be used as it is.
  • For this reason, the setting information for executing remote scene control is stored in advance in the second storage unit 83 of the remote control server 80, separately from the setting information for executing local scene control and the setting information for executing voice scene control.
  • When setting (storing in the second storage unit 83) the setting information for remote scene control, the user performs manual input to the mobile terminal 70.
  • the user needs to individually set the control terminal 20, the voice control server 60, and the remote control server 80 for the same "coming home" scene control.
  • the user's burden can be reduced by sharing setting information for scene control.
  • FIG. 3 is a flowchart of an operation for setting the content of scene control according to Operation Example 1.
  • the display control unit 27 of the control terminal 20 displays a scene control setting screen on the display unit 21 (S11).
  • the setting screen may include a display screen showing the devices 30 to be controlled by each of local scene control, voice scene control, and remote scene control.
  • FIG. 4 is a diagram showing an example of a display screen of a device to be controlled. In the example of FIG. 4, the device 30 to be controlled is displayed as an icon.
  • Such a display screen is an example of an image showing the device 30 to be controlled and the device 30 excluded from the control target when only part of the content of the scene control is executed.
  • While the setting screen is displayed, the user performs manual input intended to set the content of scene control, and the display unit 21 accepts this manual input (S12). The setting unit 25 then sets the content of scene control according to the manual input accepted in step S12 (S13). Specifically, the setting unit 25 generates setting information as shown in FIG. 2 according to the manual input accepted in step S12, and stores the generated setting information in the storage unit 24.
  • FIG. 5 is a flowchart of local scene control according to Operation Example 1.
  • After the setting information is stored in the storage unit 24 of the control terminal 20 as described above, the user performs manual input intended to execute scene control on the display unit 21.
  • the display unit 21 accepts such a manual input (an example of a scene control execution instruction) (S14).
  • The control unit 26 specifies the content of the scene control indicated by the manual input accepted in step S14 (S15), and executes all of the specified content of the scene control (S16). Specifically, the control unit 26 executes all of the scene control by transmitting control signals to the devices 30 via the local communication network.
  • FIG. 6 is a sequence diagram of voice scene control according to Operation Example 1.
  • When the user utters, for example, "make it a coming home scene", the microphone 41 of the voice UI device 40 acquires this uttered voice.
  • That is, the microphone 41 receives voice input for scene control (an example of an instruction to execute scene control) (S21).
  • the voice UI device 40 transmits the acquired voice signal of the uttered voice to the voice recognition server 50 (S22).
  • Upon receiving the voice signal, the voice recognition server 50 performs voice recognition processing on the received voice signal (that is, the uttered voice) (S23). Specifically, the voice recognition server 50 converts the received voice signal into text information and transmits the text information to the voice control server 60 (S24).
  • the first communication unit 61 of the voice control server 60 receives text information from the voice recognition server 50.
  • the voice recognition server 50 may convert the text information into command information (information similar to a control command described later), and the voice control server 60 may acquire the command information from the voice recognition server 50 .
  • The first control unit 65 identifies the scene intended by the user based on the received text information (S25), and transmits to the control terminal 20 a control command including scene information indicating the identified scene and terminal information indicating that the information terminal that received the execution instruction is the voice UI device 40 (S26).
  • the first communication unit 61 is used for transmitting the control command.
  • the terminal information is information indicating the type of information terminal that has received the execution instruction.
  • the communication unit 22 of the control terminal 20 receives the control command (scene information and terminal information) from the voice control server 60 .
  • the control unit 26 identifies the content of the scene control indicated by the scene information received in step S26 (S27). Further, the control unit 26 narrows down the content of the specified scene control according to the terminal information received in step S26 (S28).
  • FIG. 7 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that has received the execution instruction is the voice UI device 40. In the example of FIG. 7, the control details of the air conditioner 31, the lighting device 32, and the electric lock 33 are maintained, but the control details of the electric blind 34 are narrowed down to basic operations.
  • The control unit 26 executes the narrowed-down content of the scene control (that is, part of the content of the scene control) (S29). Specifically, the control unit 26 executes part of the scene control by transmitting control signals to the devices 30 via the local communication network.
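  • The narrowing of step S28 can be sketched as a filter over the scene's control details, keyed on the terminal type. This is an illustrative assumption in Python: the rules mirror FIGS. 7 and 9 (slat angle dropped for voice input; electric lock excluded and operations reduced to basic ones for remote input), but the function and rule names are hypothetical.

      BASIC_FIELDS = {"power", "action"}  # assumed to represent "basic operations"

      def narrow_scene(details: dict, terminal_type: str) -> dict:
          """Narrow scene-control content according to the instructing terminal."""
          narrowed = {}
          for device, control in details.items():
              if terminal_type == "voice_ui":
                  if device == "electric_blind_34":
                      # Voice scene control: keep open/close only, drop the slat angle.
                      control = {k: v for k, v in control.items() if k in BASIC_FIELDS}
                  narrowed[device] = control
              elif terminal_type == "mobile":
                  if device == "electric_lock_33":
                      continue  # excluded from remote scene control for security
                  narrowed[device] = {k: v for k, v in control.items() if k in BASIC_FIELDS}
              else:  # control terminal: local scene control keeps everything
                  narrowed[device] = control
          return narrowed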
  • FIG. 8 is a sequence diagram of remote scene control according to Operation Example 1.
  • After the setting information is stored in the storage unit 24 of the control terminal 20 as described above, the user performs manual input intended to execute scene control on the display unit 71 of the mobile terminal 70.
  • the display unit 71 accepts such a manual input (an example of a scene control execution instruction) (S31).
  • the portable terminal 70 transmits to the remote control server 80 control request information requesting execution of scene control indicated by the manual input accepted in step S31 (S32).
  • the second communication unit 81 of the remote control server 80 receives the control request information.
  • The second control unit 85 identifies the scene intended by the user based on the received control request information (S33), and transmits to the control terminal 20 a control command including scene information indicating the identified scene and terminal information indicating that the information terminal that received the execution instruction is the mobile terminal 70 (S34).
  • the second communication unit 81 is used for transmitting the control command.
  • the communication unit 22 of the control terminal 20 receives control commands (scene information and terminal information) from the remote control server 80 .
  • the control unit 26 specifies the content of the scene control indicated by the scene information received in step S34 (S35). Further, the control unit 26 narrows down the contents of the specified scene control according to the terminal information received in step S34 (S36).
  • FIG. 9 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that has received the execution instruction is the mobile terminal 70 .
  • the details of the control of the air conditioner 31, the lighting device 32, and the electric blinds 34 are all narrowed down to basic operations. Also, the control of the electric lock 33 is excluded.
  • The control unit 26 notifies the user of the content of the narrowed-down scene control (that is, that only part of the content of the scene control will be executed) (S37). Specifically, the control unit 26 transmits notification information to the remote control server 80, and the remote control server 80 transmits the notification information to the mobile terminal 70. As a result, a notification screen as shown in FIG. 10 is displayed on the display unit 71 of the mobile terminal 70 (S38).
  • FIG. 10 is a diagram showing an example of a notification screen.
  • the display unit 71 receives manual input intended to approve execution of scene control (S39).
  • the mobile terminal 70 transmits the approval information to the remote control server 80, and the remote control server 80 transmits the approval information to the control terminal 20 (S40).
  • the communication unit 22 of the control terminal 20 receives the approval information. Triggered by the reception of the approval information by the communication unit 22, the control unit 26 executes the content of the narrowed-down scene control (that is, part of the content of the scene control) (S41). Specifically, the control unit 26 executes a part of scene control by transmitting a control signal to the device 30 via the local communication network. The control unit 26 does not execute scene control when the approval information is not received within a certain period of time after the notification information is transmitted.
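  • A minimal sketch of the notify-and-approve handshake of steps S37 to S41, reusing the hypothetical helpers above. The timeout value and function names are assumptions; the disclosure only says "a certain period of time".

      import time

      APPROVAL_TIMEOUT_S = 60  # assumed timeout

      def notify_user(message: str) -> None:
          print(message)  # stand-in for notification via the remote control server

      def remote_scene_with_approval(scene: str, approval_received) -> bool:
          """approval_received: zero-argument callable returning True once approved."""
          details = narrow_scene(SETTING_INFO[scene], "mobile")                # S36
          notify_user(f"Only part of '{scene}' will run: {sorted(details)}")  # S37/S38
          deadline = time.time() + APPROVAL_TIMEOUT_S
          while time.time() < deadline:                                       # S39/S40
              if approval_received():
                  for device, control in details.items():                     # S41
                      send_control_signal(device, control)
                  return True
              time.sleep(1)
          return False  # no approval in time: scene control is not executed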
  • steps S36 to S39 may be omitted. Further, the control unit 26 may notify that only part of the scene control has been executed after executing part of the scene control.
  • As described above, in Operation Example 1, the setting information is stored only in the control terminal 20, and the control unit 26 of the control terminal 20 determines whether to execute all, or only part, of the content of the scene control indicated by the setting information, according to the type of information terminal that received the execution instruction.
  • the user of the control system 10 only needs to set the contents of scene control in the control terminal 20 , and has the advantage of not having to set the scene control in the voice control server 60 and the remote control server 80 .
  • In Operation Example 1, the manual input intended to set the content of the scene control was received by the control terminal 20, but it may instead be received by the mobile terminal 70.
  • In this case, the setting information is transmitted to the control terminal 20 via the voice control server 60 or the remote control server 80 and stored in the storage unit 24.
  • FIG. 11 is a sequence diagram of an operation for setting the content of scene control according to Operation Example 2.
  • the display control unit 27 of the control terminal 20 displays a scene control setting screen on the display unit 21 (S51). While the setting screen is displayed, the user performs manual input (an example of the first manual input) intended to set the scene control on the display unit 21 of the control terminal 20 .
  • the display unit 21 accepts such manual input (S52).
  • The setting unit 25 sets the content of scene control according to the manual input accepted in step S52 (S53). Specifically, the setting unit 25 generates setting information as shown in FIG. 2 according to the manual input accepted in step S52, and stores the generated setting information in the storage unit 24.
  • the processing of steps S51 to S53 is the same as the processing of steps S11 to S13.
  • the setting unit 25 transmits the setting information generated in step S53 to each of the voice control server 60 and the remote control server 80 (S54).
  • The voice control server 60 is a server device used when the voice UI device 40 accepts voice input, and the remote control server 80 is a server device used when the mobile terminal 70 accepts manual input.
  • the first communication unit 61 of the voice control server 60 receives the setting information.
  • The first setting unit 64 narrows down the received setting information (S55). Specifically, the first setting unit 64 extracts, from the received setting information, only the part to be executed by voice scene control.
  • The first setting unit 64 stores the extracted setting information in the first storage unit 63 as setting information for voice scene control (S56).
  • FIG. 12 is a diagram showing an example of the setting information for voice scene control, the content of which is narrowed down from the original setting information (FIG. 2).
  • the second communication unit 81 of the remote control server 80 receives the setting information.
  • the second setting unit 84 narrows down the received setting information (S57). Specifically, the second setting unit 84 extracts only the portion to be executed by remote scene control from the received setting information.
  • the second setting unit 84 stores the extracted setting information in the second storage unit 83 as setting information for remote scene control (S58).
  • FIG. 13 is a diagram showing an example of setting information for remote scene control, the contents of which are narrowed down from the original setting information (FIG. 2).
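  • The extraction of steps S55 and S57 can be sketched as deriving channel-specific setting information once at distribution time, rather than narrowing at every execution. The sketch below reuses the hypothetical narrow_scene() helper introduced earlier.

      def derive_channel_settings(setting_info: dict, terminal_type: str) -> dict:
          """Extract, per scene, only the part executable for the given channel."""
          return {
              scene: narrow_scene(details, terminal_type)
              for scene, details in setting_info.items()
          }

      # Performed when the control terminal distributes its setting information (S54).
      voice_settings = derive_channel_settings(SETTING_INFO, "voice_ui")  # FIG. 12 (S55/S56)
      remote_settings = derive_channel_settings(SETTING_INFO, "mobile")   # FIG. 13 (S57/S58)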
  • setting information with different contents is stored in each of the control terminal 20, the voice control server 60, and the remote control server 80.
  • the distribution of the setting information (the process of steps S54 to S56) is automatically performed when the setting information is stored in the storage unit 24 of the control terminal 20.
  • the distribution of the setting information may be performed by the user's manual input.
  • the manual input in this case is the second manual input with the intention of consenting to transmitting the setting information to the voice control server 60 and the remote control server 80 (sharing the setting information).
  • the second manual input is a simple one-touch operation on a predetermined icon.
  • FIG. 14 is a sequence diagram of voice scene control according to Operation Example 2.
  • the microphone 41 receives voice input intended to execute scene control (S61).
  • the voice UI device 40 transmits the acquired voice signal of the uttered voice to the voice recognition server 50 (S62).
  • the voice recognition server 50 performs voice recognition processing on the received voice signal (S63), and transmits text information to the voice control server 60 (S64).
  • the processing of steps S61 to S64 is the same as that of steps S21 to S24.
  • the first communication unit 61 of the voice control server 60 receives text information from the voice recognition server 50.
  • the first control unit 65 identifies the content of scene control determined by the received text information by referring to the setting information stored in the first storage unit 63 (S65). Then, the first control unit 65 transmits to the control terminal 20 a control command indicating the content of the specified scene control (specific control content of the target device 30) (S66).
  • the first communication unit 61 is used for transmitting the control command.
  • the communication unit 22 of the control terminal 20 receives the control command from the voice control server 60 .
  • the control unit 26 executes the content of the scene control indicated by the received control command (that is, part of the original content of the scene control) (S67). Specifically, the control unit 26 executes scene control by transmitting a control signal to the device 30 via the local communication network.
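  • At execution time in Operation Example 2, the server therefore no longer needs terminal information: it looks up its pre-narrowed settings and sends a concrete control command. A hedged sketch, with an assumed command payload:

      def handle_voice_scene_request(scene: str) -> dict:
          """Hypothetical S65/S66: resolve the scene against the pre-narrowed
          voice settings (first storage unit 63) and build the control command."""
          return {"scene": scene, "controls": voice_settings[scene]}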
  • FIG. 15 is a sequence diagram of remote scene control according to Operation Example 2.
  • the display unit 71 of the mobile terminal 70 accepts manual input intended to execute scene control (S71).
  • the portable terminal 70 transmits to the remote control server 80 control request information requesting execution of scene control indicated by the manual input accepted in step S71 (S72).
  • the processing of steps S71 to S72 is the same as that of steps S31 to S32.
  • the second communication unit 81 of the remote control server 80 receives the control request information.
  • the second control unit 85 identifies the content of scene control determined by the received control request information by referring to the setting information stored in the second storage unit 83 (S73).
  • the second control unit 85 notifies the content of the specified scene control (that is, that only part of the content of the original scene control is executed) (S74). Specifically, the second control unit 85 transmits notification information to the mobile terminal 70 . As a result, a notification screen as shown in FIG. 10 is displayed on the display unit 71 of the mobile terminal 70 (S75).
  • the display unit 71 receives manual input intended to approve execution of scene control (S76).
  • the mobile terminal 70 transmits the approval information to the remote control server 80 (S77), and the second communication section 81 of the remote control server 80 receives the approval information.
  • the second control unit 85 transmits to the control terminal 20 a control command indicating the content of the scene control specified in step S73 (specific control content of the target device 30). (S78).
  • the second communication unit 81 is used for transmitting the control command.
  • the second control unit 85 does not transmit the control command when the approval information is not received within a certain period of time after transmitting the notification information.
  • the communication unit 22 of the control terminal 20 receives the control command from the remote control server 80 .
  • the control unit 26 executes the content of the scene control indicated by the received control command (that is, part of the original content of the scene control) (S79). Specifically, the control unit 26 executes scene control by transmitting a control signal to the device 30 via the local communication network.
  • steps S74 to S77 may be omitted.
  • the second control unit 85 may notify that only part of the content of the scene control has been executed after transmitting the control command.
  • As described above, in Operation Example 2, the setting information stored in the control terminal 20 is distributed to the voice control server 60 and the remote control server 80, and the setting information is shared among the control terminal 20, the voice control server 60, and the remote control server 80. If the user of the control system 10 sets the content of scene control only on the control terminal 20, the setting information is shared automatically or by a simple manual input. In other words, there is an advantage that the user does not need to configure scene control in the voice control server 60 and the remote control server 80 individually.
  • scene control settings were made in the control terminal 20, and the setting information was distributed from the control terminal 20 to the voice control server 60 and the remote control server 80.
  • Scene control may be set in the voice control server 60 , and the setting information may be distributed from the voice control server 60 to the control terminal 20 and the remote control server 80 .
  • scene control settings may be made to the remote control server 80 via the mobile terminal 70 , and the setting information may be distributed from the remote control server 80 to the control terminal 20 and the voice control server 60 .
  • Next, scene control according to Operation Example 3 will be described. FIG. 16 is a sequence diagram when the mobile terminal 70 is located inside the facility 90, and FIG. 17 is a sequence diagram when the mobile terminal 70 is located outside the facility 90.
  • First, the operation of the control system 10 when the mobile terminal 70 is located inside the facility 90 will be described with reference to FIG. 16.
  • the display unit 71 of the mobile terminal 70 receives manual input intended to execute scene control (S81).
  • the mobile terminal 70 transmits to the remote control server 80 control request information requesting execution of scene control indicated by the manual input accepted in step S81 (S82).
  • the control request information includes location information indicating the current location of the mobile terminal 70 .
  • the location information is output by the location acquisition unit 72 included in the mobile terminal 70 .
  • the second communication unit 81 of the remote control server 80 receives the control request information.
  • the second control unit 85 identifies the scene intended by the user based on the received control request information (S83). Also, the second control unit 85 determines that the mobile terminal 70 is located inside the facility 90 based on the location information included in the control request information (S84a). Note that the determination criteria (coordinates of the facility 90) at this time are stored in the second storage unit 83 of the remote control server 80 in advance.
  • The second control unit 85 transmits to the control terminal 20 a control command including scene information indicating the identified scene and determination result information indicating that the mobile terminal 70 is located inside the facility 90 (S85a).
  • the second communication unit 81 is used for transmitting the control command.
  • the communication unit 22 of the control terminal 20 receives control commands (scene information and determination result information) from the remote control server 80 . Since the received determination result information indicates that the mobile terminal 70 is located within the facility 90, the control unit 26 executes all the contents of the scene control indicated by the scene information received in step S85a ( S86a).
  • Next, the operation of the control system 10 when the mobile terminal 70 is located outside the facility 90 will be described with reference to FIG. 17. The processing of steps S81 to S83 is the same as in FIG. 16.
  • the second control unit 85 determines that the mobile terminal 70 is located outside the facility 90 based on the location information included in the control request information (S84b).
  • The second control unit 85 transmits to the control terminal 20 a control command including scene information indicating the identified scene and determination result information indicating that the mobile terminal 70 is located outside the facility 90 (S85b).
  • the second communication unit 81 is used for transmitting the control command.
  • the communication unit 22 of the control terminal 20 receives control commands (scene information and determination result information) from the remote control server 80 . Since the received determination result information indicates that the portable terminal 70 is located outside the facility 90, the control unit 26 narrows down the content of the scene control (for example, narrows down as shown in FIG. 9), and in step S85b Part of the content of the scene control indicated by the received scene information is executed (S86b).
  • As described above, the control system 10 can execute only part, or all, of the content of the scene control according to the current position of the mobile terminal 70 that receives the instruction to execute scene control.
  • When the mobile terminal 70 is located outside the facility 90, the execution instruction to the mobile terminal 70 is regarded as a control instruction for the devices 30 from outside the facility 90, and only part of the content of the scene control is executed.
  • In this way, scene control is adaptively executed according to the current position of the mobile terminal 70.
  • the remote control server 80 determines whether or not the mobile terminal 70 is located within the facility 90 , but the control terminal 20 may determine whether or not the mobile terminal 70 is located within the facility 90 . Also, between steps S84b and S85b in FIG. 17, processing (notification and approval) similar to steps S74 to S76 may be performed. Further, in the operation example 3 as well, the control unit 26 may notify that only part of the content of the scene control has been executed after executing a part of the content of the scene control.
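  • The inside/outside determination of steps S84a and S84b is, in essence, a geofence test of the terminal's reported position against pre-stored facility coordinates. The haversine-based Python sketch below is an illustrative assumption, including the coordinates and radius; the disclosure does not state how the comparison is made.

      from math import asin, cos, radians, sin, sqrt

      FACILITY_LATLON = (34.6937, 135.5023)  # assumed facility coordinates
      FACILITY_RADIUS_M = 50.0               # assumed geofence radius

      def is_inside_facility(lat: float, lon: float) -> bool:
          """Great-circle distance from the facility center, compared to the radius."""
          lat1, lon1 = map(radians, FACILITY_LATLON)
          lat2, lon2 = radians(lat), radians(lon)
          a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
          return 2 * 6_371_000 * asin(sqrt(a)) <= FACILITY_RADIUS_M

  • The determination result would then be carried in the control command, and the control terminal executes all of the scene content (S86a) or a narrowed part of it (S86b).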
  • Next, scene control according to Operation Example 4 will be described. FIG. 18 is a sequence diagram when the mobile terminal 70 is located inside the facility 90, and FIG. 19 is a sequence diagram when the mobile terminal 70 is located outside the facility 90.
  • First, the operation of the control system 10 when the mobile terminal 70 is located inside the facility 90 will be described with reference to FIG. 18.
  • the display unit 71 of the mobile terminal 70 receives manual input intended to execute scene control (S91).
  • the portable terminal 70 transmits to the remote control server 80 control request information requesting execution of scene control indicated by the manual input accepted in step S91 (S92).
  • the control request information does not include location information indicating the current location of the mobile terminal 70 .
  • the second communication unit 81 of the remote control server 80 receives the control request information.
  • the second control unit 85 identifies the scene intended by the user based on the received control request information (S93).
  • the second control unit 85 transmits a control command including scene information indicating the specified scene to the control terminal 20 (S94).
  • the second communication unit 81 is used for transmitting the control command.
  • the communication unit 22 of the control terminal 20 receives the control command (scene information) from the remote control server 80 .
  • The control unit 26 determines whether or not the mobile terminal 70 is communicatively connected to the wireless communication device 35. When the mobile terminal 70 is connected to the wireless communication device 35, the mobile terminal 70 can be considered to be located near (that is, inside) the facility 90. Therefore, when determining that the mobile terminal 70 is connected to the wireless communication device 35 (S95a), the control unit 26 executes all of the content of the scene control indicated by the scene information received in step S94 (S96a).
  • Next, the operation of the control system 10 when the mobile terminal 70 is located outside the facility 90 will be described with reference to FIG. 19. The processing of steps S91 to S94 is the same as in FIG. 18.
  • The control unit 26 determines whether or not the mobile terminal 70 is communicatively connected to the wireless communication device 35. When determining that the mobile terminal 70 is not connected to the wireless communication device 35 (S95b), the control unit 26 narrows down the content of the scene control (for example, as shown in FIG. 9) and executes part of the content of the scene control indicated by the scene information received in step S94 (S96b).
  • As described above, the control system 10 determines whether or not the mobile terminal 70 that receives the execution instruction for scene control is connected to the wireless communication device 35 installed in the facility 90, and can thereby estimate whether or not the mobile terminal 70 is located inside the facility 90. In this way, the control system 10 can adaptively execute scene control according to the current position of the mobile terminal 70.
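  • Operation Example 4 replaces positioning with a connectivity test: if the mobile terminal 70 is associated with the wireless communication device 35 (the facility's wireless LAN router), it is deemed to be inside the facility. The sketch below assumes the control terminal can obtain the router's client list and the terminal's MAC address; neither mechanism is specified in the disclosure.

      def is_on_facility_lan(terminal_mac: str, router_clients: set) -> bool:
          """Deemed inside the facility iff associated with the facility router."""
          return terminal_mac.lower() in {mac.lower() for mac in router_clients}

      def handle_remote_scene(scene: str, terminal_mac: str, router_clients: set) -> None:
          if is_on_facility_lan(terminal_mac, router_clients):
              details = SETTING_INFO[scene]                          # S96a: everything
          else:
              details = narrow_scene(SETTING_INFO[scene], "mobile")  # S96b: part only
          for device, control in details.items():
              send_control_signal(device, control)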
  • In Operation Example 4 as well, the control unit 26 may notify, after executing part of the content of the scene control, that only part of the content of the scene control has been executed.
  • FIG. 20 is a diagram showing an example of a display screen for obtaining consent to restriction of functions in voice scene control.
  • When starting to use the remote control server 80, it may be necessary to perform user registration by manual input to the mobile terminal 70 and to consent to the terms of use of the remote control server 80. Here, when agreeing to the terms of use, the user may also agree that remote scene control is limited in functionality.
  • As described above, the control system 10 includes the setting unit 25 that sets the content of scene control for the devices 30 installed in the facility 90, and the control unit 26 that executes only part, or all, of the content of the scene control set by the setting unit 25 according to the type of the information terminal that receives the instruction to execute the scene control or the current position of the information terminal.
  • Such a control system 10 can adaptively change the content of scene control according to the type of information terminal that accepts a scene control execution instruction or the current position of the information terminal. That is, the control system 10 can adaptively change the content of control of the device 30 .
  • For example, the control unit 26 executes only part, or all, of the content of the scene control set by the setting unit 25 according to the type of the information terminal: when the information terminal is the first information terminal, all of the content of the scene control is executed, and when the information terminal is the second information terminal, only part of the content of the scene control is executed.
  • Such a control system 10 can adaptively change the content of scene control according to the type of information terminal that receives the instruction to execute scene control.
  • For example, the first information terminal is an information terminal installed in the facility 90 (e.g., the control terminal 20), and the second information terminal is a portable information terminal (e.g., the mobile terminal 70).
  • Such a control system 10 can adaptively change the content of scene control according to whether the information terminal that receives the instruction to execute scene control is an information terminal installed in the facility 90 or a mobile terminal.
  • Alternatively, for example, the first information terminal is an information terminal that accepts the user's manual input as the execution instruction (for example, the mobile terminal 70), and the second information terminal is an information terminal that accepts the user's voice input as the execution instruction (for example, the voice UI device 40).
  • Such a control system 10 can adaptively change the content of scene control according to whether the information terminal that receives the instruction to execute scene control is an information terminal that accepts manual input or an information terminal that accepts voice input.
  • Further, for example, the control unit 26 executes only part, or all, of the content of the scene control set by the setting unit 25 according to the current position of the information terminal (for example, the mobile terminal 70): when the information terminal is located inside the facility 90, all of the content of the scene control is executed, and when the information terminal is located outside the facility 90, only part of the content of the scene control is executed.
  • Such a control system 10 can adaptively change the content of scene control according to the current position of the information terminal that receives the instruction to execute scene control.
  • Further, for example, when executing only part of the content of scene control, the control unit 26 notifies, before execution, that only part of the content of the scene control will be executed.
  • Such a control system 10 can notify that only part of the content of scene control is executed.
  • For example, the control system 10 further includes the display control unit 27 that displays an image showing the devices 30 to be controlled when only part of the content of scene control is executed.
  • Such a control system 10 can display an image showing the equipment 30 to be controlled.
  • Further, for example, the setting unit 25 generates setting information indicating the content of scene control based on the user's first manual input to the first information terminal (for example, the control terminal 20), and transmits the generated setting information to the server device (for example, the voice control server 60 or the remote control server 80) used when the second information terminal (for example, the voice UI device 40 or the mobile terminal 70) receives the execution instruction.
  • Such a control system 10 can share setting information between the first information terminal and the server device.
  • Further, for example, the setting unit 25 generates setting information indicating the content of scene control based on the user's first manual input to the first information terminal, and transmits the generated setting information to the server device in response to the user's second manual input.
  • Such a control system 10 can share setting information between the first information terminal and the server device based on manual input by the user.
  • the setting unit 25 and the control unit 26 are provided by the control terminal 20 installed in the facility 90 .
  • Such a control system 10 can be realized as a control terminal 20 installed in the facility 90.
  • A control method executed by a computer such as the control system 10 includes a setting step of setting the content of scene control for the devices 30 installed in the facility 90, and a control step of executing only part, or all, of the content of the scene control set in the setting step according to the type of the information terminal that receives the instruction to execute the scene control or the current position of the information terminal.
  • Such a control method can adaptively change the content of scene control according to the type of information terminal that receives the execution instruction of scene control or the current position of the information terminal. That is, the control method can adaptively change the content of control of the device 30 .
  • In the above embodiment, the control system was realized by a plurality of devices, but it may be realized by a single device.
  • For example, the control system may be realized as a single device corresponding to the control terminal, the voice control server, or the remote control server of the above embodiment.
  • When the control system is realized by a plurality of devices, the components included in the control system may be distributed among the plurality of devices in any way.
  • The communication method between the devices in the above embodiment is not particularly limited.
  • A relay device (not shown) may be interposed in the communication between the devices.
  • The information transmission paths described in the above embodiment are not limited to the transmission paths shown in the sequence diagrams.
  • Processing executed by a specific processing unit may be executed by another processing unit.
  • For example, the processing executed by the setting unit of the control terminal may be executed by the first setting unit of the voice control server or the second setting unit of the remote control server.
  • Likewise, the processing executed by the control unit of the control terminal may be executed by the first control unit of the voice control server or the second control unit of the remote control server.
  • The order of multiple processes may be changed, and multiple processes may be executed in parallel.
  • Each component may be realized by executing a software program suitable for that component.
  • Each component may be realized by a program execution unit such as a CPU or processor reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory.
  • Each component may be realized by hardware.
  • Each component may be a circuit (or an integrated circuit). These circuits may form a single circuit as a whole, or may be separate circuits. These circuits may be general-purpose circuits or dedicated circuits.
  • The present invention may be realized as the control terminal or the server device (the voice control server or the remote control server) according to the above embodiment, or as a control system corresponding to the control terminal or the server device.
  • The present invention may be realized as a control method executed by a computer such as the control system, or as a program for causing a computer to execute such a control method.
  • The present invention may be realized as a computer-readable non-transitory recording medium on which such a program is recorded.
  • Reference Signs List: 10 control system, 20 control terminal, 25 setting unit, 26 control unit, 27 display control unit, 30 device, 60 voice control server (server device), 80 remote control server (server device), 90 facility


Abstract

A control system (10) comprising: a setting unit (25) that sets the content of scene control targeting devices (30) installed in a facility (90); and a control unit (26) that executes only part or all of the content of the scene control set by the setting unit (25) in accordance with the type of an information terminal that receives an execution instruction for the scene control or the current position of the information terminal.

Description

Control system and control method

The present invention relates to a control system and a control method.

Various technologies for controlling devices have been proposed. Patent Literature 1 discloses a home appliance control system in which a sound collecting device connectable to a network collects voice information and controls home appliances via the network based on the collected voice information.

Patent Literature 1: WO 2014/171144

The present invention provides a control system and a control method capable of adaptively changing the content of control of devices.

A control system according to one aspect of the present invention includes: a setting unit that sets the content of scene control targeting devices installed in a facility; and a control unit that executes only part or all of the content of the scene control set by the setting unit according to the type of an information terminal that receives an instruction to execute the scene control or the current position of the information terminal.

A control method according to one aspect of the present invention includes: a setting step of setting the content of scene control targeting devices installed in a facility; and a control step of executing only part or all of the content of the scene control set in the setting step according to the type of an information terminal that receives an instruction to execute the scene control or the current position of the information terminal.

A program according to one aspect of the present invention is a program for causing a computer to execute the control method.

A control system and a control method according to one aspect of the present invention can adaptively change the content of control of devices.
FIG. 1 is a block diagram showing the functional configuration of the control system according to the embodiment.
FIG. 2 is a diagram showing an example of setting information for executing scene control.
FIG. 3 is a flowchart of the operation of setting the content of scene control according to Operation Example 1.
FIG. 4 is a diagram showing an example of a display screen of devices to be controlled.
FIG. 5 is a flowchart of local scene control according to Operation Example 1.
FIG. 6 is a sequence diagram of voice scene control according to Operation Example 1.
FIG. 7 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that has accepted the execution instruction is the voice UI device.
FIG. 8 is a sequence diagram of remote scene control according to Operation Example 1.
FIG. 9 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that has accepted the execution instruction is the mobile terminal.
FIG. 10 is a diagram showing an example of a notification screen.
FIG. 11 is a sequence diagram of the operation of setting the content of scene control according to Operation Example 2.
FIG. 12 is a diagram showing an example of setting information for voice scene control.
FIG. 13 is a diagram showing an example of setting information for remote scene control.
FIG. 14 is a sequence diagram of voice scene control according to Operation Example 2.
FIG. 15 is a sequence diagram of remote scene control according to Operation Example 2.
FIG. 16 is a first sequence diagram of scene control according to Operation Example 3.
FIG. 17 is a second sequence diagram of scene control according to Operation Example 3.
FIG. 18 is a first sequence diagram of scene control according to Operation Example 4.
FIG. 19 is a second sequence diagram of scene control according to Operation Example 4.
FIG. 20 is a diagram showing an example of a display screen for obtaining consent to functions being restricted in voice scene control.
Hereinafter, embodiments will be specifically described with reference to the drawings. Each of the embodiments described below shows a comprehensive or specific example. The numerical values, shapes, materials, components, arrangement positions and connection forms of the components, steps, order of the steps, and the like shown in the following embodiments are examples and are not intended to limit the present invention. Among the components in the following embodiments, components not recited in the independent claims are described as optional components.

Each figure is a schematic diagram and is not necessarily illustrated precisely. In each figure, substantially the same configurations are denoted by the same reference signs, and overlapping descriptions may be omitted or simplified.
(Embodiment)
[Configuration]
First, the configuration of the control system according to the embodiment will be described. FIG. 1 is a block diagram showing the functional configuration of the control system according to the embodiment.
The control system 10 shown in FIG. 1 is a device control system capable of controlling a plurality of devices 30 installed in a facility 90. The facility 90 is, for example, a residence such as an apartment building or a detached house, but may be a facility other than a residence, such as an office.

As shown in FIG. 1, the control system 10 includes a control terminal 20, a plurality of devices 30, a wireless communication device 35, a voice UI device 40, a voice recognition server 50, a voice control server 60, a mobile terminal 70, and a remote control server 80. The voice UI device 40, the control terminal 20, and the plurality of devices 30 are installed inside the facility 90. The mobile terminal 70 may be located either inside or outside the facility 90.

First, the control terminal 20 will be described. The control terminal 20 is, for example, a HEMS (Home Energy Management System) controller having an energy management function. The control terminal 20 is installed in the facility 90 and manages the amount of electricity used (in other words, the power consumption) of the plurality of devices 30 installed in the facility 90. The control terminal 20 also controls the plurality of devices 30 installed in the facility 90 (or on the premises of the facility 90). The control terminal 20 is not limited to a HEMS controller, and may be another home controller without an energy management function, or a gateway device. The control terminal 20 is an example of an information terminal.

Specifically, the control terminal 20 includes a display unit 21, a communication unit 22, an information processing unit 23, and a storage unit 24.

The display unit 21 is a display device having a function of displaying images and a function of accepting manual input from the user. The display unit 21 is realized by a touch panel and a display panel such as a liquid crystal panel or an organic EL (Electro Luminescence) panel. The touch panel is, for example, a capacitive touch panel, but may be a resistive touch panel.

The communication unit 22 is a communication circuit for the control terminal 20 to communicate with other devices via the wide area communication network 100. The communication unit 22 is, for example, a wireless communication circuit that performs wireless communication, but may be a wired communication circuit that performs wired communication. The communication standard used for the communication performed by the communication unit 22 is not particularly limited.

The information processing unit 23 performs information processing related to the control of the devices 30. The information processing unit 23 is realized by, for example, a microcomputer, but may be realized by a processor or a dedicated circuit. The information processing unit 23 has a setting unit 25, a control unit 26, and a display control unit 27 as functional components. The functions of the setting unit 25, the control unit 26, and the display control unit 27 are realized by, for example, a microcomputer or the like constituting the information processing unit 23 executing a computer program stored in the storage unit 24. The details of the functions of the setting unit 25, the control unit 26, and the display control unit 27 will be described later.

The storage unit 24 is a storage device that stores the computer program executed by the information processing unit 23, setting information for scene control (described later), and the like. The storage unit 24 is realized by, for example, a semiconductor memory.
Next, the plurality of devices 30 will be described. The plurality of devices 30 are the devices to be controlled by the control system 10 and are installed in the facility 90. The plurality of devices 30 are controlled by control signals transmitted by the control unit 26 of the control terminal 20. The plurality of devices 30 include an air conditioner 31, a lighting device 32, an electric lock 33, an electric blind 34, and the like.

The air conditioner 31 is a household air conditioner. The air conditioner 31 has a heat exchanger (not shown) and the like, and can thereby adjust the temperature of the air blown out from the air conditioner 31. That is, the air conditioner 31 has a temperature adjustment function (a blowing function and a cooling/heating function). The air conditioner 31 is not limited to a household air conditioner, and may be an industrial air conditioner.

The lighting device 32 illuminates the inside (rooms) of the facility 90. The lighting device 32 is, for example, a ceiling light, but its specific form is not particularly limited. The lighting device 32 may be a downlight, a pendant light, a spotlight, a bracket light, or the like. The lighting device 32 may also be a device that illuminates the outside of the facility 90.

The electric lock 33 is a security device that controls the unlocking and locking of a door (or a window, etc.) of the facility 90. The electric lock 33 includes, for example, an RFID reader that acquires key information from a card key or the like. The electric lock 33 may also include a biometric sensor that acquires biometric information, such as a fingerprint, as key information. In the following embodiment, the electric lock 33 is described as an electric lock provided on the front door.

The electric blind 34 is arranged inside a window of the facility 90 and adjusts the amount of outside light taken into the facility 90. The electric blind 34 may also be used as a partition inside the facility 90. The electric blind 34 has a plurality of slats, and the angle of the slats (hereinafter also referred to simply as the angle of the electric blind 34) is adjustable.
Next, the wireless communication device 35 will be described. The wireless communication device 35 is a device that performs short-range wireless communication, and is a wireless LAN (Local Area Network) router installed in the facility 90. The control terminal 20 and the voice UI device 40 are communicatively connected to the wireless communication device 35. Communication of the control terminal 20 and the voice UI device 40 via the wide area communication network 100 is performed through the wireless communication device 35.

Next, the voice UI device 40 will be described. The voice UI device 40 is an information terminal into which the user's uttered voice is input, and it transmits a voice signal of the input uttered voice to the voice recognition server 50 via the wide area communication network 100. The voice UI device 40 is, for example, a smart speaker installed in the facility 90. Specifically, the voice UI device 40 has a microphone 41 and a speaker 42.

The microphone 41 acquires the voice uttered by the user (the uttered voice). Specifically, the microphone 41 is a condenser microphone, a dynamic microphone, a MEMS (Micro Electro Mechanical Systems) microphone, or the like.

The speaker 42 outputs voice (machine voice) as a response to the uttered voice acquired by the microphone 41. This allows the user to control the devices 30 interactively.
Next, the voice recognition server 50 will be described. The voice recognition server 50 is a computer located outside the facility 90. Specifically, the voice recognition server 50 is a cloud server that receives the voice signal transmitted by the voice UI device 40 and performs voice recognition processing on the received voice signal. A business operator that provides a voice recognition service provides that service using the voice recognition server 50. The voice recognition server 50, for example, converts the voice signal transmitted by the voice UI device 40 into text information and transmits the text information to the voice control server 60.

Next, the voice control server 60 will be described. The voice control server 60 is a computer located outside the facility 90. Specifically, the voice control server 60 is a cloud server that generates a control command based on the text information transmitted from the voice recognition server 50 and transmits the generated control command to the control terminal 20. The voice control server 60 includes a first communication unit 61, a first information processing unit 62, and a first storage unit 63.

The first communication unit 61 is a communication module (communication circuit) for the voice control server 60 to communicate with other devices via the wide area communication network 100. The communication performed by the first communication unit 61 is, for example, wired communication, but may be wireless communication. The communication standard used for the communication is not particularly limited.

The first information processing unit 62 performs information processing for controlling the devices 30 based on the voice input accepted by the voice UI device 40. The first information processing unit 62 is realized by, for example, a microcomputer, but may be realized by a processor. The first information processing unit 62 has a first setting unit 64 and a first control unit 65 as functional components. The functions of the first setting unit 64 and the first control unit 65 are realized by, for example, a microcomputer or the like constituting the first information processing unit 62 executing a computer program stored in the first storage unit 63. The details of the functions of the first setting unit 64 and the first control unit 65 will be described later.

The first storage unit 63 is a storage device that stores the computer program executed by the first information processing unit 62 and the like. The first storage unit 63 is realized by, for example, an HDD (Hard Disk Drive).
Next, the mobile terminal 70 will be described. The mobile terminal 70 is a portable information terminal on which the user performs manual input (that is, operations) to remotely control the devices 30; specifically, it is a smartphone, a tablet terminal, or the like. The mobile terminal 70 has a display unit 71, a position acquisition unit 72, and a wireless communication unit 73.

The display unit 71 is a display device having a function of displaying images and a function of accepting manual input from the user. The display unit 71 is realized by a touch panel and a display panel such as a liquid crystal panel or an organic EL panel. The touch panel is, for example, a capacitive touch panel, but may be a resistive touch panel.

The position acquisition unit 72 acquires (in other words, calculates) the current position of the mobile terminal 70 and outputs current position information indicating the acquired current position. The position acquisition unit 72 is realized by, for example, a positioning module compatible with a satellite positioning system such as GPS (Global Positioning System) or the Quasi-Zenith Satellite System.

The wireless communication unit 73 is a communication circuit for the mobile terminal 70 to communicatively connect to the wireless communication device 35. Specifically, the wireless communication unit 73 is a short-range wireless communication circuit compatible with wireless LAN.
Next, the remote control server 80 will be described. The remote control server 80 is a computer located outside the facility 90. Specifically, the remote control server 80 is a cloud server that generates a control command based on information transmitted from the mobile terminal 70 and transmits the generated control command to the control terminal 20. The remote control server 80 includes a second communication unit 81, a second information processing unit 82, and a second storage unit 83.

The second communication unit 81 is a communication module (communication circuit) for the remote control server 80 to communicate with other devices via the wide area communication network 100. The communication performed by the second communication unit 81 is, for example, wired communication, but may be wireless communication. The communication standard used for the communication is not particularly limited.

The second information processing unit 82 performs information processing for controlling the devices 30 based on the user's manual input accepted by the mobile terminal 70. The second information processing unit 82 is realized by, for example, a microcomputer, but may be realized by a processor. The second information processing unit 82 has a second setting unit 84 and a second control unit 85 as functional components. The functions of the second setting unit 84 and the second control unit 85 are realized by, for example, a microcomputer or the like constituting the second information processing unit 82 executing a computer program stored in the second storage unit 83. The details of the functions of the second setting unit 84 and the second control unit 85 will be described later.

The second storage unit 83 is a storage device that stores the computer program executed by the second information processing unit 82 and the like. The second storage unit 83 is realized by, for example, an HDD.
[Outline of Device Control]
The user of the control system 10 can control the devices 30 by performing manual input on the control terminal 20. When the display unit 21 accepts a manual input, the control unit 26 of the control terminal 20 transmits a control signal corresponding to the manual input to the device 30 via a local communication network. The device 30 is thereby controlled. The local communication network is, for example, a communication network conforming to ECHONET Lite (registered trademark). In the following, such control of the devices 30 based on manual input to the control terminal 20 is also referred to as local control.

The user can also control the devices 30 by inputting voice to the voice UI device 40. When the microphone 41 accepts a voice input, the voice UI device 40 transmits a voice signal corresponding to the voice input to the voice recognition server 50 via the wide area communication network 100. The voice recognition server 50 converts the voice signal into text information and transmits the text information to the voice control server 60 via the wide area communication network 100. The voice control server 60 generates a control command corresponding to the text information and transmits the generated control command to the control terminal 20 via the wide area communication network 100. When the communication unit 22 receives the control command, the control unit 26 of the control terminal 20 transmits a control signal corresponding to the control command to the device 30 via the local communication network. The device 30 is thereby controlled. In the following, such control of the devices 30 based on voice input to the voice UI device 40 is also referred to as voice control.

The user can also control the devices 30 by performing manual input on the mobile terminal 70. When the display unit 71 accepts a manual input, the mobile terminal 70 transmits control request information corresponding to the manual input to the remote control server 80 via the wide area communication network 100. The remote control server 80 generates a control command corresponding to the control request information and transmits the generated control command to the control terminal 20 via the wide area communication network 100. When the communication unit 22 receives the control command, the control unit 26 of the control terminal 20 transmits a control signal corresponding to the control command to the device 30 via the local communication network. The device 30 is thereby controlled. In the following, such control of the devices 30 based on manual input to the mobile terminal 70 is also referred to as remote control.
[Scene Control]
The user of the control system 10 can also execute scene control (sometimes called a routine action or the like). Scene control is control that collectively operates one or more devices 30 arranged in the facility 90 in order to bring the inside of the facility 90 closer to a predetermined indoor environment. FIG. 2 is a diagram showing an example of setting information for executing scene control.
In the setting information for scene control, the control content of the devices 30 is associated with each life scene (scene name), such as coming home, waking up, going out, and going to bed. For example, when the user performs a manual input on the display unit 21 of the control terminal 20 instructing execution of the "coming home" scene control, the control unit 26 of the control terminal 20 controls the devices 30 with reference to the setting information stored in the storage unit 24. As a result, the air conditioner 31 and the lighting device 32 are turned on with pre-registered settings (brightness, set temperature, etc.), the electric lock 33 is locked, and the slats of the electric blind 34 are opened to the set angle. The user can set which devices 30 perform which operations when execution of the "coming home" scene control is instructed. The information transmission path in this case is the same as in the local control described above, and scene control of the devices 30 based on manual input to the control terminal 20 is also referred to as local scene control.
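The patent does not prescribe a concrete data format for this setting information, but the mapping of FIG. 2 can be pictured as a simple scene-to-control-content table. The following is a minimal sketch in Python, assuming a plain in-memory dictionary; all identifiers (SCENE_SETTINGS, control_content_for, the device keys, and the attribute names) are hypothetical and chosen only to mirror the "coming home" example above.

```python
# A minimal sketch of the scene-control setting information of FIG. 2,
# assuming a simple in-memory representation; all names are illustrative.
SCENE_SETTINGS = {
    "coming_home": {
        "air_conditioner_31": {"power": "on", "mode": "cooling", "set_temp_c": 26},
        "lighting_32":        {"power": "on", "brightness_pct": 80},
        "electric_lock_33":   {"locked": True},
        "electric_blind_34":  {"open": True, "slat_angle_deg": 45},
    },
    "going_to_bed": {
        "air_conditioner_31": {"power": "on", "mode": "cooling", "set_temp_c": 28},
        "lighting_32":        {"power": "off"},
        "electric_lock_33":   {"locked": True},
        "electric_blind_34":  {"open": False},
    },
}

def control_content_for(scene_name: str) -> dict:
    """Look up the control content associated with a scene name."""
    return SCENE_SETTINGS[scene_name]
```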
The user can also execute the "coming home" scene control by saying, for example, "switch to the coming home scene" to the voice UI device 40. The information transmission path in this case is the same as in the voice control described above, and scene control based on voice input to the voice UI device 40 is also referred to as voice scene control.

There are restrictions on the content of voice scene control. For example, while the angle of the electric blind 34 can be specified in local scene control, the angle of the electric blind 34 cannot be specified in voice scene control; only opening and closing are possible. In other words, the setting information for local scene control cannot be used as-is for voice scene control.

Therefore, in general, setting information for executing voice scene control is stored in advance in the first storage unit 63 of the voice control server 60, separately from the setting information for executing local scene control (FIG. 2). When setting this information (storing it in the first storage unit 63), the user performs manual input on the mobile terminal 70.

The user can also execute the "coming home" scene control by performing manual input on the mobile terminal 70. The information transmission path in this case is the same as in the remote control described above, and scene control based on manual input to the mobile terminal 70 is also referred to as remote scene control.

Since remote scene control assumes control of the devices 30 from outside the facility 90, there are restrictions on the content of the control. For example, in remote scene control, the electric lock 33 is excluded from the targets of scene control from a security standpoint. Detailed operations of the devices 30 may also be excluded from remote scene control. In other words, neither the setting information for local scene control nor the setting information for voice scene control can be used as-is for remote scene control.

Therefore, in general, setting information for executing remote scene control is stored in advance in the second storage unit 83 of the remote control server 80, separately from the setting information for executing local scene control and the setting information for executing voice scene control. When setting this information (storing it in the second storage unit 83), the user performs manual input on the mobile terminal 70.

Thus, in general, for the same "coming home" scene control, the user needs to configure the control terminal 20, the voice control server 60, and the remote control server 80 individually. In contrast, the control system 10 can reduce the user's burden by sharing the setting information for scene control.
[Operation Example 1]
Operation Example 1, in which the setting information for scene control is shared, will be described below. First, the operation of setting the content of scene control according to Operation Example 1 will be described. FIG. 3 is a flowchart of the operation of setting the content of scene control according to Operation Example 1.

The display control unit 27 of the control terminal 20 displays a scene control setting screen on the display unit 21 (S11). The setting screen may include a display screen showing the devices 30 to be controlled in each of local scene control, voice scene control, and remote scene control. FIG. 4 is a diagram showing an example of a display screen of the devices to be controlled. In the example of FIG. 4, the devices 30 to be controlled are displayed as icons. Such a display screen is an example of an image showing the devices 30 to be controlled, and the devices 30 excluded from control, when only part of the content of scene control is executed.

While the setting screen is displayed, the user performs manual input on the display unit 21 of the control terminal 20 with the intention of setting the scene control. The display unit 21 accepts this manual input (S12). The setting unit 25 sets the content of scene control according to the manual input accepted in step S12 (S13). Specifically, the setting unit 25 generates setting information as shown in FIG. 2 according to the manual input accepted in step S12, and stores the generated setting information in the storage unit 24.

Next, local scene control according to Operation Example 1 will be described. FIG. 5 is a flowchart of local scene control according to Operation Example 1.
After the setting information is stored in the storage unit 24 of the control terminal 20 as described above, the user performs manual input on the display unit 21 with the intention of executing scene control. The display unit 21 accepts this manual input (an example of an instruction to execute scene control) (S14). By referring to the setting information stored in the storage unit 24, the control unit 26 specifies the content of the scene control indicated by the manual input accepted in step S14 (S15), and executes all of the specified content of the scene control (S16). Specifically, the control unit 26 executes all of the content of the scene control by transmitting control signals to the devices 30 via the local communication network.
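Continuing the sketch above, local scene control then amounts to looking up the scene and executing every entry in its control content. In the sketch below, send_control_signal is a hypothetical stand-in for the transmission of control signals over the local communication network (for example, ECHONET Lite), which the patent leaves unspecified.

```python
# A minimal sketch of local scene control (S14-S16), reusing the
# SCENE_SETTINGS sketch above; the transmission itself is mocked.
def send_control_signal(device: str, actions: dict) -> None:
    # Hypothetical stand-in for sending a control signal over the
    # local communication network.
    print(f"control signal -> {device}: {actions}")

def execute_local_scene(scene_name: str) -> None:
    """Execute all of the content of the scene control (S16)."""
    for device, actions in control_content_for(scene_name).items():
        send_control_signal(device, actions)

execute_local_scene("coming_home")
```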
Next, voice scene control according to Operation Example 1 will be described. FIG. 6 is a sequence diagram of voice scene control according to Operation Example 1.

After the setting information is stored in the storage unit 24 of the control terminal 20 as described above, when the user utters a voice intended to execute scene control (for example, "switch to the coming home scene"), the microphone 41 of the voice UI device 40 acquires the uttered voice. In other words, the microphone 41 accepts a voice input for scene control (an example of an instruction to execute scene control) (S21). The voice UI device 40 transmits a voice signal of the acquired uttered voice to the voice recognition server 50 (S22).

Upon receiving the voice signal, the voice recognition server 50 performs voice recognition processing on the received voice signal (that is, on the uttered voice) (S23). Specifically, the voice recognition server 50 converts the received voice signal into text information and transmits the text information to the voice control server 60 (S24).

The first communication unit 61 of the voice control server 60 receives the text information from the voice recognition server 50. Alternatively, the voice recognition server 50 may convert the text information into command information (information similar to the control command described later), and the voice control server 60 may acquire the command information from the voice recognition server 50.

The first control unit 65 specifies the scene intended by the user based on the received text information (S25), and transmits to the control terminal 20 a control command including scene information indicating the specified scene and terminal information indicating that the information terminal that accepted the execution instruction is the voice UI device 40 (S26). The first communication unit 61 is used to transmit the control command. The terminal information is information indicating the type of the information terminal that accepted the execution instruction.

The communication unit 22 of the control terminal 20 receives the control command (the scene information and the terminal information) from the voice control server 60. By referring to the setting information stored in the storage unit 24, the control unit 26 specifies the content of the scene control indicated by the scene information received in step S26 (S27). The control unit 26 then narrows down the specified content of the scene control according to the terminal information received in step S26 (S28). FIG. 7 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that accepted the execution instruction is the voice UI device 40. In the example of FIG. 7, the content of control of the air conditioner 31, the lighting device 32, and the electric lock 33 is maintained, but the content of control of the electric blind 34 is narrowed down to basic operations only.
The control unit 26 then executes the narrowed-down content of the scene control (that is, part of the content of the scene control) (S29). Specifically, the control unit 26 executes part of the content of the scene control by transmitting control signals to the devices 30 via the local communication network.
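One way to picture the narrowing of steps S27 to S29 (and the corresponding steps S35, S36, and S41 of remote scene control described next) is as a filter applied to the scene's control content according to the terminal information. The sketch below follows the examples of FIG. 7 and FIG. 9; the notion of a "basic operation" attribute set and all identifiers are illustrative assumptions, not part of the patent.

```python
# A sketch of the narrowing step (S28 for voice, S36 for remote), following
# the examples of FIG. 7 and FIG. 9.
BASIC_KEYS = {"power", "open", "locked"}  # assumed basic-operation attributes

def narrow_content(content: dict, terminal: str) -> dict:
    """Narrow the scene-control content according to the terminal type."""
    if terminal == "control_terminal":
        return content  # local scene control: the entire content is kept
    narrowed = {}
    for device, actions in content.items():
        if terminal == "mobile_terminal" and device == "electric_lock_33":
            continue  # excluded from remote scene control for security (FIG. 9)
        if terminal == "voice_ui_device" and device != "electric_blind_34":
            narrowed[device] = actions  # maintained as-is (FIG. 7)
        else:
            # narrowed down to basic operations only
            narrowed[device] = {k: v for k, v in actions.items() if k in BASIC_KEYS}
    return narrowed
```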
Next, remote scene control according to Operation Example 1 will be described. FIG. 8 is a sequence diagram of remote scene control according to Operation Example 1.

After the setting information is stored in the storage unit 24 of the control terminal 20 as described above, the user performs manual input on the display unit 71 of the mobile terminal 70 with the intention of executing scene control. The display unit 71 accepts this manual input (an example of an instruction to execute scene control) (S31). The mobile terminal 70 transmits to the remote control server 80 control request information requesting execution of the scene control indicated by the manual input accepted in step S31 (S32).

The second communication unit 81 of the remote control server 80 receives the control request information. The second control unit 85 specifies the scene intended by the user based on the received control request information (S33), and transmits to the control terminal 20 a control command including scene information indicating the specified scene and terminal information indicating that the information terminal that accepted the execution instruction is the mobile terminal 70 (S34). The second communication unit 81 is used to transmit the control command.

The communication unit 22 of the control terminal 20 receives the control command (the scene information and the terminal information) from the remote control server 80. By referring to the setting information stored in the storage unit 24, the control unit 26 specifies the content of the scene control indicated by the scene information received in step S34 (S35). The control unit 26 then narrows down the specified content of the scene control according to the terminal information received in step S34 (S36). FIG. 9 is a diagram showing an example of how the content of scene control is narrowed down when the information terminal that accepted the execution instruction is the mobile terminal 70. In the example of FIG. 9, the content of control of the air conditioner 31, the lighting device 32, and the electric blind 34 is narrowed down to basic operations only, and control of the electric lock 33 is excluded.

The control unit 26 then gives notification of the narrowed-down content of the scene control (that is, that only part of the content of the scene control will be executed) (S37). Specifically, the control unit 26 transmits notification information to the remote control server 80, and the remote control server 80 transmits the notification information to the mobile terminal 70. As a result, a notification screen such as that shown in FIG. 10 is displayed on the display unit 71 of the mobile terminal 70 (S38). FIG. 10 is a diagram showing an example of the notification screen.

When the user performs an approval operation to approve execution of the scene control while the notification screen of FIG. 10 is displayed, the display unit 71 accepts a manual input intended to approve execution of the scene control (S39). The mobile terminal 70 transmits approval information to the remote control server 80, and the remote control server 80 transmits the approval information to the control terminal 20 (S40).
The communication unit 22 of the control terminal 20 receives the approval information. Triggered by the reception of the approval information by the communication unit 22, the control unit 26 executes the narrowed-down content of the scene control (that is, part of the content of the scene control) (S41). Specifically, the control unit 26 executes part of the content of the scene control by transmitting control signals to the devices 30 via the local communication network. If the approval information is not received within a certain period after the notification information is transmitted, the control unit 26 does not execute the scene control.
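The notification-and-approval exchange of steps S37 to S41 can be sketched as an execution gate that waits for approval information for a bounded time. The helper below is schematic: the callbacks for notification, approval polling, and execution are injected, and the 60-second default for the "certain period" is an arbitrary assumption, as the patent gives no concrete value.

```python
# A schematic of the approval-gated execution (S37 to S41); the callbacks
# and the timeout default are assumptions made for illustration.
import time
from typing import Callable

def execute_after_approval(
    narrowed: dict,
    notify: Callable[[dict], None],
    poll_approval: Callable[[], bool],
    execute: Callable[[dict], None],
    timeout_s: float = 60.0,
) -> bool:
    """Notify that only part of the scene control will be executed (S37),
    then execute the narrowed content (S41) only if approval information
    (S40) arrives before the timeout elapses."""
    notify(narrowed)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if poll_approval():
            execute(narrowed)
            return True
        time.sleep(0.5)  # simple polling interval for the sketch
    return False  # no approval within the period: scene control not executed
```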
The processing of steps S37 to S40 (notification and approval) may be omitted. The control unit 26 may also, after executing part of the content of the scene control, give notification that only part of the content of the scene control has been executed.
As described above, in Operation Example 1, the setting information is stored only in the control terminal 20, and the control unit 26 of the control terminal 20 determines, according to the type of the information terminal that accepted the execution instruction, whether to execute all of the content of the scene control indicated by the setting information or only part of it. This has the advantage that the user of the control system 10 only needs to set the content of scene control on the control terminal 20 and does not need to configure scene control on the voice control server 60 or the remote control server 80.

In Operation Example 1, the manual input intended to set the content of scene control was accepted by the control terminal 20, but it may instead be accepted by the mobile terminal 70. In this case, the setting information is transmitted to the control terminal 20 via the voice control server 60 or the remote control server 80 and stored in the storage unit 24.
[Operation Example 2]
Operation Example 2, in which the setting information for scene control is shared, will be described below. First, the operation of setting the content of scene control according to Operation Example 2 will be described. FIG. 11 is a sequence diagram of the operation of setting the content of scene control according to Operation Example 2.

The display control unit 27 of the control terminal 20 displays a scene control setting screen on the display unit 21 (S51). While the setting screen is displayed, the user performs manual input (an example of a first manual input) on the display unit 21 of the control terminal 20 with the intention of setting the scene control. The display unit 21 accepts this manual input (S52). The setting unit 25 sets the content of scene control according to the manual input accepted in step S52 (S53). Specifically, the setting unit 25 generates setting information as shown in FIG. 2 according to the manual input accepted in step S52, and stores the generated setting information in the storage unit 24. The processing of steps S51 to S53 is the same as the processing of steps S11 to S13.

After step S53, the setting unit 25 transmits the setting information generated in step S53 to each of the voice control server 60 and the remote control server 80 (S54). As described above, the voice control server 60 is the server device used when the voice UI device 40 accepts a voice input, and the remote control server 80 is the server device used when the mobile terminal 70 accepts a manual input.

The first communication unit 61 of the voice control server 60 receives the setting information. The first setting unit 64 narrows down the received setting information (S55). Specifically, the first setting unit 64 extracts, from the received setting information, only the portion to be executed in voice scene control. The first setting unit 64 stores the extracted setting information in the first storage unit 63 as setting information for voice scene control (S56). FIG. 12 is a diagram showing an example of the setting information for voice scene control; its content is narrowed down from the original setting information (FIG. 2).
Similarly, the second communication unit 81 of the remote control server 80 receives the setting information. The second setting unit 84 narrows down the received setting information (S57). Specifically, the second setting unit 84 extracts, from the received setting information, only the portion to be executed in remote scene control. The second setting unit 84 stores the extracted setting information in the second storage unit 83 as setting information for remote scene control (S58). FIG. 13 is a diagram showing an example of the setting information for remote scene control; its content is narrowed down from the original setting information (FIG. 2).
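Steps S55 and S57 can be pictured as each server projecting the full setting information (FIG. 2) onto the subset its channel can execute, yielding narrowed tables in the spirit of FIG. 12 and FIG. 13. A sketch, assuming the dictionary representation used earlier; the capability tables are purely illustrative assumptions.

```python
# A sketch of the narrowing performed by the first and second setting
# units (S55/S57); the capability tables are illustrative assumptions.
VOICE_CAPABLE = {
    # Voice scene control: the blind angle cannot be specified (FIG. 12).
    "air_conditioner_31": {"power", "mode", "set_temp_c"},
    "lighting_32": {"power", "brightness_pct"},
    "electric_lock_33": {"locked"},
    "electric_blind_34": {"open"},
}
REMOTE_CAPABLE = {
    # Remote scene control: basic operations only, lock excluded (FIG. 13).
    "air_conditioner_31": {"power"},
    "lighting_32": {"power"},
    "electric_blind_34": {"open"},
}

def extract_subset(full_settings: dict, capability: dict) -> dict:
    """Keep only the devices and attributes the channel supports."""
    return {
        scene: {
            device: {k: v for k, v in actions.items() if k in capability[device]}
            for device, actions in content.items()
            if device in capability
        }
        for scene, content in full_settings.items()
    }

# e.g. stored in the first storage unit 63 as voice-scene setting information
voice_settings = extract_subset(SCENE_SETTINGS, VOICE_CAPABLE)
```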
As described above, in Operation Example 2, setting information with different content is stored in each of the control terminal 20, the voice control server 60, and the remote control server 80. The distribution of the setting information (the processing of steps S54 to S58) is performed automatically, triggered by the setting information being stored in the storage unit 24 of the control terminal 20.
 Note that the distribution of the setting information may instead be triggered by a manual input from the user. The manual input in this case is a second manual input indicating consent to transmitting the setting information to (that is, sharing it with) the voice control server 60 and the remote control server 80. The second manual input can be as simple as a one-touch operation on a predetermined icon.
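 The narrowing performed in steps S55 and S57 can be illustrated with a short Python sketch. The dictionary layout of the setting information and the per-action "channels" attribute below are assumptions for illustration only; the specification states merely that each server extracts the portion of the scene it executes (FIG. 2 versus FIGS. 12 and 13).

    # Sketch of the narrowing in S55/S57; data layout is assumed, not from the patent.
    FULL_SETTING_INFO = {
        "scene": "good-night",
        "actions": [
            # Each action lists the channels on which it may be executed.
            {"device": "living room light", "command": "off",
             "channels": {"local", "voice", "remote"}},
            {"device": "electric lock", "command": "lock",
             "channels": {"local"}},  # e.g. excluded from voice/remote control
            {"device": "air conditioner", "command": "off",
             "channels": {"local", "voice", "remote"}},
        ],
    }

    def narrow_setting_info(setting_info: dict, channel: str) -> dict:
        """Extract only the portion of the scene executed on one channel."""
        return {
            "scene": setting_info["scene"],
            "actions": [a for a in setting_info["actions"]
                        if channel in a["channels"]],
        }

    # The voice control server 60 keeps the "voice" subset (cf. FIG. 12),
    # and the remote control server 80 keeps the "remote" subset (cf. FIG. 13).
    voice_setting_info = narrow_setting_info(FULL_SETTING_INFO, "voice")
    remote_setting_info = narrow_setting_info(FULL_SETTING_INFO, "remote")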
 Next, the local scene control, voice scene control, and remote scene control according to Operation Example 2 will be described. The local scene control according to Operation Example 2 is the same as the local scene control according to Operation Example 1, so its description is omitted.
 Next, the voice scene control according to Operation Example 2 will be described. FIG. 14 is a sequence diagram of the voice scene control according to Operation Example 2.
 After the setting information has been stored in the first storage unit 63 of the voice control server 60 as described above, the microphone 41 accepts a voice input intended to execute the scene control (S61). The voice UI device 40 transmits the audio signal of the acquired utterance to the voice recognition server 50 (S62). The voice recognition server 50 performs voice recognition processing on the received audio signal (S63) and transmits the resulting text information to the voice control server 60 (S64). The processing of steps S61 to S64 is the same as that of steps S21 to S24.
 The first communication unit 61 of the voice control server 60 receives the text information from the voice recognition server 50. The first control unit 65 identifies the content of the scene control determined by the received text information by referring to the setting information stored in the first storage unit 63 (S65). The first control unit 65 then transmits to the control terminal 20 a control command indicating the identified content of the scene control (the specific control content for the target devices 30) (S66). The first communication unit 61 is used to transmit the control command.
 The communication unit 22 of the control terminal 20 receives the control command from the voice control server 60. The control unit 26 executes the content of the scene control indicated by the received control command (that is, part of the content of the original scene control) (S67). Specifically, the control unit 26 executes the scene control by transmitting control signals to the devices 30 via the local communication network.
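 Steps S65 and S66 on the voice control server 60 can be sketched as follows, under the same assumed data layout as above; matching the scene by its name appearing in the recognized text is likewise an assumption for illustration, not the patent's method.

    # Sketch of S65 (identify scene from text) and S66 (send control command).
    VOICE_SETTING_INFO = {  # the narrowed "voice" subset (cf. FIG. 12)
        "scene": "good-night",
        "actions": [{"device": "living room light", "command": "off"},
                    {"device": "air conditioner", "command": "off"}],
    }

    def handle_text_information(text: str):
        """Return the control command for the control terminal 20, or None."""
        if VOICE_SETTING_INFO["scene"] in text:  # S65: scene identified
            return {"type": "control_command",
                    "actions": VOICE_SETTING_INFO["actions"]}
        return None  # utterance did not match any configured scene

    command = handle_text_information("run the good-night scene")
    if command is not None:
        print("S66: send to control terminal 20:", command)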
 Next, the remote scene control according to Operation Example 2 will be described. FIG. 15 is a sequence diagram of the remote scene control according to Operation Example 2.
 After the setting information has been stored in the second storage unit 83 of the remote control server 80 as described above, the display unit 71 of the mobile terminal 70 accepts a manual input intended to execute the scene control (S71). The mobile terminal 70 transmits to the remote control server 80 control request information requesting execution of the scene control indicated by the manual input accepted in step S71 (S72). The processing of steps S71 and S72 is the same as that of steps S31 and S32.
 The second communication unit 81 of the remote control server 80 receives the control request information. The second control unit 85 identifies the content of the scene control determined by the received control request information by referring to the setting information stored in the second storage unit 83 (S73).
 The second control unit 85 also gives notice of the identified content of the scene control (that is, that only part of the content of the original scene control will be executed) (S74). Specifically, the second control unit 85 transmits notification information to the mobile terminal 70. As a result, a notification screen such as that shown in FIG. 10 is displayed on the display unit 71 of the mobile terminal 70 (S75).
 When the user performs an approval operation to approve execution of the scene control while the notification screen of FIG. 10 is displayed, the display unit 71 accepts a manual input intended to approve execution of the scene control (S76). The mobile terminal 70 transmits approval information to the remote control server 80 (S77), and the second communication unit 81 of the remote control server 80 receives the approval information.
 Triggered by the reception of the approval information, the second control unit 85 transmits to the control terminal 20 a control command indicating the content of the scene control identified in step S73 (the specific control content for the target devices 30) (S78). The second communication unit 81 is used to transmit the control command. If no approval information is received within a certain period after the notification information is transmitted, the second control unit 85 does not transmit the control command.
 The communication unit 22 of the control terminal 20 receives the control command from the remote control server 80. The control unit 26 executes the content of the scene control indicated by the received control command (that is, part of the content of the original scene control) (S79). Specifically, the control unit 26 executes the scene control by transmitting control signals to the devices 30 via the local communication network.
 Note that the processing (notification and approval) of steps S74 to S77 may be omitted. Alternatively, the second control unit 85 may, after transmitting the control command, give notice that only part of the content of the scene control has been executed.
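 The notify/approve/time-out behavior of steps S74 to S78 can be sketched as follows; the console prompt stands in for the notification screen of FIG. 10, and the concrete time-out value is an assumption (the specification only speaks of a certain period).

    # Sketch of S74-S78 on the remote control server 80.
    import time

    APPROVAL_TIMEOUT_S = 60.0  # the "certain period"; concrete value assumed

    def await_approval(deadline: float) -> bool:
        """Stand-in for receiving approval information (S76/S77)."""
        while time.monotonic() < deadline:
            answer = input("Execute only part of the scene? [y/n] ")
            return answer.strip().lower() == "y"
        return False

    def remote_scene_control(narrowed_actions: list) -> None:
        # S74/S75: notify that only part of the scene will be executed.
        print("notice: only part of the scene control will be executed:",
              [a["device"] for a in narrowed_actions])
        deadline = time.monotonic() + APPROVAL_TIMEOUT_S
        if await_approval(deadline):
            # S78: send the control command to the control terminal 20.
            print("send control command:", narrowed_actions)
        else:
            print("no approval within the period; command not sent")

    remote_scene_control([{"device": "living room light", "command": "off"}])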
 As described above, in Operation Example 2 the setting information stored in the control terminal 20 is distributed to the voice control server 60 and the remote control server 80, so that the setting information is shared among the control terminal 20, the voice control server 60, and the remote control server 80. If the user of the control system 10 sets the content of the scene control only on the control terminal 20, the setting information is shared automatically or through a simple manual input. In other words, there is the advantage that the user does not need to configure the scene control on the voice control server 60 and the remote control server 80.
 In Operation Example 2 above, the scene control was configured on the control terminal 20 and the setting information was distributed from the control terminal 20 to the voice control server 60 and the remote control server 80. However, the scene control may instead be configured on the voice control server 60 via the mobile terminal 70, with the setting information distributed from the voice control server 60 to the control terminal 20 and the remote control server 80. Likewise, the scene control may be configured on the remote control server 80 via the mobile terminal 70, with the setting information distributed from the remote control server 80 to the control terminal 20 and the voice control server 60.
 [Operation Example 3]
 In Operation Examples 1 and 2, examples were described in which only part or all of the content of the scene control is executed according to the type of information terminal that accepts the instruction to execute the scene control. The control system 10 may instead execute only part or all of the content of the scene control according to the current position of the information terminal (specifically, the mobile terminal 70) that accepts the instruction to execute the scene control. FIGS. 16 and 17 are sequence diagrams of scene control according to Operation Example 3. FIG. 16 is the sequence diagram for the case where the mobile terminal 70 is located inside the facility 90, and FIG. 17 is the sequence diagram for the case where the mobile terminal 70 is located outside the facility 90.
 First, the operation of the control system 10 when the mobile terminal 70 is located inside the facility 90 will be described with reference to FIG. 16.
 After the setting information has been stored in the storage unit 24 of the control terminal 20 as in Operation Example 1, the display unit 71 of the mobile terminal 70 accepts a manual input intended to execute the scene control (S81). The mobile terminal 70 transmits to the remote control server 80 control request information requesting execution of the scene control indicated by the manual input accepted in step S81 (S82). Here, the control request information includes position information indicating the current position of the mobile terminal 70. The position information is output by the position acquisition unit 72 included in the mobile terminal 70.
 The second communication unit 81 of the remote control server 80 receives the control request information. The second control unit 85 identifies the scene intended by the user based on the received control request information (S83). The second control unit 85 also determines, based on the position information included in the control request information, that the mobile terminal 70 is located inside the facility 90 (S84a). The criterion for this determination (the coordinates of the facility 90) is stored in advance in the second storage unit 83 of the remote control server 80.
 Next, the second control unit 85 transmits to the control terminal 20 a control command including scene information indicating the identified scene and determination result information indicating that the mobile terminal 70 is located inside the facility 90 (S85a). The second communication unit 81 is used to transmit the control command.
 The communication unit 22 of the control terminal 20 receives the control command (the scene information and the determination result information) from the remote control server 80. Because the received determination result information indicates that the mobile terminal 70 is located inside the facility 90, the control unit 26 executes all of the content of the scene control indicated by the scene information received in step S85a (S86a).
 Next, the operation of the control system 10 when the mobile terminal 70 is located outside the facility 90 will be described with reference to FIG. 17. The processing of steps S81 to S83 is the same as in FIG. 16.
 After step S83, the second control unit 85 determines, based on the position information included in the control request information, that the mobile terminal 70 is located outside the facility 90 (S84b).
 Next, the second control unit 85 transmits to the control terminal 20 a control command including scene information indicating the identified scene and determination result information indicating that the mobile terminal 70 is located outside the facility 90 (S85b). The second communication unit 81 is used to transmit the control command.
 The communication unit 22 of the control terminal 20 receives the control command (the scene information and the determination result information) from the remote control server 80. Because the received determination result information indicates that the mobile terminal 70 is located outside the facility 90, the control unit 26 narrows down the content of the scene control (for example, as shown in FIG. 9) and executes only part of the content of the scene control indicated by the scene information received in step S85b (S86b).
 In this way, the control system 10 can execute only part or all of the content of the scene control according to the current position of the mobile terminal 70 that accepts the instruction to execute the scene control. In Operation Examples 1 and 2, an execution instruction given to the mobile terminal 70 was treated as an instruction to control the devices 30 from outside the facility 90, and only part of the content of the scene control was executed. By contrast, according to Operation Example 3, the scene control is executed adaptively according to the current position of the mobile terminal 70.
 In Operation Example 3, the determination of whether the mobile terminal 70 is located inside the facility 90 was made by the remote control server 80, but it may instead be made by the control terminal 20. Between steps S84b and S85b in FIG. 17, processing (notification and approval) similar to steps S74 to S77 may be performed. Also in Operation Example 3, the control unit 26 may, after executing part of the content of the scene control, give notice that only part of the content of the scene control has been executed.
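 The determination in steps S84a and S84b can be sketched as a simple geofence test. Representing the stored coordinates of the facility 90 as a center point plus a radius, and the concrete numbers, are assumptions for illustration; the specification only says the coordinates are stored in advance in the second storage unit 83.

    # Sketch of S84a/S84b: inside/outside determination from position information.
    import math

    FACILITY_LAT, FACILITY_LON = 34.6937, 135.5023  # stored coordinates (example)
    FACILITY_RADIUS_M = 50.0                        # assumed threshold

    def is_inside_facility(lat: float, lon: float) -> bool:
        """True if the reported position lies within the facility 90."""
        # Equirectangular approximation; adequate at facility scale.
        mean_lat = math.radians((lat + FACILITY_LAT) / 2)
        dx = math.radians(lon - FACILITY_LON) * math.cos(mean_lat) * 6_371_000
        dy = math.radians(lat - FACILITY_LAT) * 6_371_000
        return math.hypot(dx, dy) <= FACILITY_RADIUS_M

    print(is_inside_facility(34.6938, 135.5024))  # True  -> execute all (S86a)
    print(is_inside_facility(34.7000, 135.5200))  # False -> execute part (S86b)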
 [Operation Example 4]
 In Operation Example 3, the position information output by the position acquisition unit 72 was used to determine whether the mobile terminal 70 is located inside the facility 90, but this determination can also be made in other ways. FIGS. 18 and 19 are sequence diagrams of scene control according to such an Operation Example 4. FIG. 18 is the sequence diagram for the case where the mobile terminal 70 is located inside the facility 90, and FIG. 19 is the sequence diagram for the case where the mobile terminal 70 is located outside the facility 90.
 First, the operation of the control system 10 when the mobile terminal 70 is located inside the facility 90 will be described with reference to FIG. 18.
 After the setting information has been stored in the storage unit 24 of the control terminal 20 as in Operation Example 1, the display unit 71 of the mobile terminal 70 accepts a manual input intended to execute the scene control (S91). The mobile terminal 70 transmits to the remote control server 80 control request information requesting execution of the scene control indicated by the manual input accepted in step S91 (S92). Here, the control request information does not include position information indicating the current position of the mobile terminal 70.
 The second communication unit 81 of the remote control server 80 receives the control request information. The second control unit 85 identifies the scene intended by the user based on the received control request information (S93).
 Next, the second control unit 85 transmits a control command including scene information indicating the identified scene to the control terminal 20 (S94). The second communication unit 81 is used to transmit the control command.
 The communication unit 22 of the control terminal 20 receives the control command (the scene information) from the remote control server 80. The control unit 26 determines whether the mobile terminal 70 is communicatively connected to the wireless communication device 35. When the mobile terminal 70 is communicatively connected to the wireless communication device 35, the mobile terminal 70 can be regarded as being in the vicinity of the facility 90 (inside the facility 90). Accordingly, when the control unit 26 determines that the mobile terminal 70 is communicatively connected to the wireless communication device 35 (S95a), it executes all of the content of the scene control indicated by the scene information received in step S94 (S96a).
 Next, the operation of the control system 10 when the mobile terminal 70 is located outside the facility 90 will be described with reference to FIG. 19. The processing of steps S91 to S94 is the same as in FIG. 18.
 After step S94, the control unit 26 determines whether the mobile terminal 70 is communicatively connected to the wireless communication device 35. When the mobile terminal 70 is not communicatively connected to the wireless communication device 35, the mobile terminal 70 can be regarded as being far from the facility 90 (outside the facility 90). Accordingly, when the control unit 26 determines that the mobile terminal 70 is not communicatively connected to the wireless communication device 35 (S95b), it narrows down the content of the scene control (for example, as shown in FIG. 9) and executes only part of the content of the scene control indicated by the scene information received in step S94 (S96b).
 In this way, the control system 10 can treat whether the mobile terminal 70 that accepts the instruction to execute the scene control is communicatively connected to the wireless communication device 35 installed in the facility 90 as a proxy for whether the mobile terminal 70 is located inside the facility 90. This allows the control system 10 to execute the scene control adaptively according to the current position of the mobile terminal 70.
 In Operation Example 4, processing (notification and approval) similar to steps S37 to S40 may be performed between steps S95b and S96b in FIG. 19. Also in Operation Example 4, the control unit 26 may, after executing part of the content of the scene control, give notice that only part of the content of the scene control has been executed.
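 The connection-based determination of steps S95a and S95b can be sketched as follows. Querying the wireless communication device 35 for its set of associated clients, and the per-action "remote_ok" flag used for the narrowing, are assumptions for illustration.

    # Sketch of S95a/S95b (presence check) and S96a/S96b (full vs. partial scene).
    def terminal_is_on_local_network(terminal_id: str,
                                     associated_clients: set) -> bool:
        """Treat association with the in-facility wireless communication
        device 35 as "the mobile terminal 70 is inside the facility 90"."""
        return terminal_id in associated_clients

    def execute_scene(scene_actions: list, terminal_id: str,
                      associated_clients: set) -> list:
        if terminal_is_on_local_network(terminal_id, associated_clients):
            return scene_actions                      # S96a: execute all
        # S96b: keep only the part allowed from outside the facility.
        return [a for a in scene_actions if a.get("remote_ok")]

    actions = [{"device": "light", "command": "off", "remote_ok": True},
               {"device": "electric lock", "command": "unlock", "remote_ok": False}]
    print(execute_scene(actions, "phone-01", {"phone-01"}))  # full scene
    print(execute_scene(actions, "phone-01", set()))         # narrowed scene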
 [Modification]
 When the user starts using the voice control server 60, the user needs to register by performing a manual input on the mobile terminal 70 and to agree to the terms of use of the voice control server 60. When agreement to the terms of use is obtained, the user's agreement that functionality is limited in voice scene control may also be obtained. FIG. 20 shows an example of a display screen for obtaining agreement that functionality is limited in voice scene control.
 Similarly, when starting to use the remote control server 80, it may be necessary for the user to register by performing a manual input on the mobile terminal 70 and to agree to the terms of use of the remote control server 80. In this case as well, when agreement to the terms of use is obtained, the user's agreement that functionality is limited in remote scene control may be obtained.
 [Effects, etc.]
 As described above, the control system 10 includes the setting unit 25, which sets the content of scene control for the devices 30 installed in the facility 90, and the control unit 26, which executes only part or all of the content of the scene control set by the setting unit 25 according to the type of the information terminal that accepts the instruction to execute the scene control or the current position of the information terminal.
 Such a control system 10 can adaptively change the content of the scene control according to the type of the information terminal that accepts the instruction to execute the scene control or the current position of the information terminal. In other words, the control system 10 can adaptively change the content of control of the devices 30.
 In Operation Examples 1 and 2, the control unit 26 executes only part or all of the content of the scene control set by the setting unit 25 according to the type of the information terminal: when the information terminal is a first information terminal, it executes all of the content of the scene control, and when the information terminal is a second information terminal, it executes only part of the content of the scene control.
 Such a control system 10 can adaptively change the content of the scene control according to the type of the information terminal that accepts the instruction to execute the scene control.
 For example, the first information terminal is an information terminal installed in the facility 90 (for example, the control terminal 20), and the second information terminal is a portable information terminal (for example, the mobile terminal 70).
 Such a control system 10 can adaptively change the content of the scene control depending on whether the information terminal that accepts the instruction to execute the scene control is an information terminal installed in the facility 90 or a mobile terminal.
 Alternatively, for example, the first information terminal is an information terminal that accepts a user's manual input as the execution instruction (for example, the mobile terminal 70), and the second information terminal is an information terminal that accepts a user's voice input as the execution instruction (for example, the voice UI device 40).
 Such a control system 10 can adaptively change the content of the scene control depending on whether the information terminal that accepts the instruction to execute the scene control accepts manual input or voice input.
 In Operation Examples 3 and 4, the control unit 26 executes only part or all of the content of the scene control set by the setting unit 25 according to the current position of the information terminal (for example, the mobile terminal 70): when the information terminal is located inside the facility 90, it executes all of the content of the scene control, and when the information terminal is located outside the facility 90, it executes only part of the content of the scene control.
 Such a control system 10 can adaptively change the content of the scene control according to the current position of the information terminal that accepts the instruction to execute the scene control.
 For example, when only part of the content of the scene control is to be executed, the control unit 26 (or the second control unit 85) gives notice before execution that only part of the content of the scene control will be executed.
 Such a control system 10 can give notice that only part of the content of the scene control will be executed.
 For example, the control system 10 further includes the display control unit 27, which displays an image showing the devices 30 to be controlled when only part of the content of the scene control is executed.
 Such a control system 10 can display an image showing the devices 30 to be controlled.
 For example, the setting unit 25 generates setting information indicating the content of the scene control based on the user's first manual input to the first information terminal (for example, the control terminal 20), and transmits the generated setting information to the server device (for example, the voice control server 60 or the remote control server 80) used when the second information terminal (for example, the voice UI device 40 or the mobile terminal 70) accepts the execution instruction.
 Such a control system 10 can share the setting information between the first information terminal and the server device.
 For example, the setting unit 25 generates setting information indicating the content of the scene control based on the user's first manual input to the first information terminal, and transmits the generated setting information to the server device based on the user's second manual input.
 Such a control system 10 can share the setting information between the first information terminal and the server device based on the user's manual input.
 For example, the setting unit 25 and the control unit 26 are provided in the control terminal 20 installed in the facility 90.
 Such a control system 10 can be realized as the control terminal 20 installed in the facility 90.
 A control method executed by a computer such as the control system 10 includes a setting step of setting the content of scene control for the devices 30 installed in the facility 90, and a control step of executing only part or all of the content of the scene control set in the setting step according to the type of the information terminal that accepts the instruction to execute the scene control or the current position of the information terminal.
 Such a control method can adaptively change the content of the scene control according to the type of the information terminal that accepts the instruction to execute the scene control or the current position of the information terminal. In other words, the control method can adaptively change the content of control of the devices 30.
 (Other Embodiments)
 Although an embodiment has been described above, the present invention is not limited to the above embodiment.
 For example, in the above embodiment the control system is realized by a plurality of devices, but it may instead be realized by a single device. For example, the control system may be realized as a single device corresponding to the control terminal, the voice control server, or the remote control server of the above embodiment. When the control system is realized by a plurality of devices, the components of the control system may be distributed among the plurality of devices in any way.
 The method of communication between the devices in the above embodiment is not particularly limited. A relay device not shown in the figures may be interposed in the communication between devices. The information transmission paths described in the above embodiment are not limited to the paths shown in the sequence diagrams.
 In the above embodiment, processing executed by a specific processing unit may instead be executed by another processing unit. For example, the processing executed by the setting unit of the control terminal may be executed by the first setting unit of the voice control server or the second setting unit of the remote control server. Likewise, the processing executed by the control unit of the control terminal may be executed by the first control unit of the voice control server or the second control unit of the remote control server. The order of multiple processes may be changed, and multiple processes may be executed in parallel.
 In the above embodiment, each component may be realized by executing a software program suited to that component. Each component may be realized by a program execution unit, such as a CPU or a processor, reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
 Each component may also be realized by hardware. For example, each component may be a circuit (or an integrated circuit). These circuits may constitute a single circuit as a whole, or may be separate circuits. Each of these circuits may be a general-purpose circuit or a dedicated circuit.
 General or specific aspects of the present invention may be realized as a system, a device, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or as any combination of a system, a device, a method, an integrated circuit, a computer program, and a recording medium.
 For example, the present invention may be realized as the control terminal or the server device (the voice control server or the remote control server) according to the above embodiment, or as a control system corresponding to the control terminal or the server device. The present invention may also be realized as a control method executed by a computer such as the control system, as a program for causing a computer to execute such a control method, or as a computer-readable non-transitory recording medium on which such a program is recorded.
 In addition, the present invention also encompasses forms obtained by applying various modifications conceivable by a person skilled in the art to each embodiment, and forms realized by arbitrarily combining the components and functions of the embodiments without departing from the spirit of the present invention.
REFERENCE SIGNS LIST
10 control system
20 control terminal
25 setting unit
26 control unit
27 display control unit
30 device
60 voice control server (server device)
80 remote control server (server device)
90 facility

Claims (12)

  1.  A control system comprising:
     a setting unit that sets content of scene control for equipment installed in a facility; and
     a control unit that executes only part, or all, of the content of the scene control set by the setting unit according to a type of an information terminal that accepts an instruction to execute the scene control or a current position of the information terminal.
  2.  The control system according to claim 1, wherein the control unit:
     executes only part or all of the content of the scene control set by the setting unit according to the type of the information terminal;
     executes all of the content of the scene control when the information terminal is a first information terminal; and
     executes only part of the content of the scene control when the information terminal is a second information terminal.
  3.  The control system according to claim 2, wherein:
     the first information terminal is an information terminal installed in the facility; and
     the second information terminal is a portable information terminal.
  4.  The control system according to claim 2, wherein:
     the first information terminal is an information terminal that accepts a user's manual input as the execution instruction; and
     the second information terminal is an information terminal that accepts a user's voice input as the execution instruction.
  5.  The control system according to claim 1, wherein the control unit:
     executes only part or all of the content of the scene control set by the setting unit according to the current position of the information terminal;
     executes all of the content of the scene control when the information terminal is located inside the facility; and
     executes only part of the content of the scene control when the information terminal is located outside the facility.
  6.  The control system according to any one of claims 1 to 5, wherein, when executing only part of the content of the scene control, the control unit gives notice, before the execution, that only part of the content of the scene control will be executed.
  7.  The control system according to any one of claims 1 to 6, further comprising a display control unit that displays an image showing equipment to be controlled when only part of the content of the scene control is executed.
  8.  The control system according to any one of claims 2 to 4, wherein the setting unit generates setting information indicating the content of the scene control based on a first manual input of a user to the first information terminal, and transmits the generated setting information to a server device used when the second information terminal accepts the execution instruction.
  9.  The control system according to claim 8, wherein the setting unit generates the setting information indicating the content of the scene control based on the first manual input of the user to the first information terminal, and transmits the generated setting information to the server device based on a second manual input of the user.
  10.  The control system according to any one of claims 1 to 9, wherein the setting unit and the control unit are provided in a control terminal installed in the facility.
  11.  A control method comprising:
     a setting step of setting content of scene control for equipment installed in a facility; and
     a control step of executing only part, or all, of the content of the scene control set in the setting step according to a type of an information terminal that accepts an instruction to execute the scene control or a current position of the information terminal.
  12.  A program for causing a computer to execute the control method according to claim 11.
PCT/JP2022/006382 2021-02-22 2022-02-17 Control system, and control method WO2022176946A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-026688 2021-02-22
JP2021026688A JP2022128255A (en) 2021-02-22 2021-02-22 Control system and control method

Publications (1)

Publication Number Publication Date
WO2022176946A1 true WO2022176946A1 (en) 2022-08-25

Family

ID=82932282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/006382 WO2022176946A1 (en) 2021-02-22 2022-02-17 Control system, and control method

Country Status (2)

Country Link
JP (1) JP2022128255A (en)
WO (1) WO2022176946A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016181180A (en) * 2015-03-24 2016-10-13 東芝ライテック株式会社 Control system, controller unit and control method

Also Published As

Publication number Publication date
JP2022128255A (en) 2022-09-01

Similar Documents

Publication Publication Date Title
EP3637243B1 (en) Customized interface based on vocal input
US11929844B2 (en) Customized interface based on vocal input
US9602172B2 (en) User identification and location determination in control applications
US9204291B2 (en) User identification and location determination in control applications
KR102062580B1 (en) Method and apparatus for controlling of devices in home network system
US11557186B2 (en) Connection to legacy panel and self-configuration
US11303955B2 (en) Video integration with home assistant
CN110543159B (en) Intelligent household control method, control equipment and storage medium
WO2022166339A1 (en) Smart home control method, terminal device, and smart home control system
WO2022176946A1 (en) Control system, and control method
JP7503759B2 (en) Voice control system and voice control method
JP7300670B2 (en) Control device, setting device, program
TW201806434A (en) Intelligent home control system enabling to utilize an mobile device to control and configure home appliances through an application program
Okemiri et al. Development of a Smart Home Control System
JP2024086889A (en) Gateway and information processing system
WO2023048720A1 (en) Hierarchical mobile application launch
MOUNIKA et al. Smart Home Automaton Based GPS And GSM using Android App

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22756264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22756264

Country of ref document: EP

Kind code of ref document: A1