WO2018011890A1 - Control system and apparatus control method - Google Patents

Control system and apparatus control method

Info

Publication number
WO2018011890A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
operation information
information
user
terminal device
Prior art date
Application number
PCT/JP2016/070566
Other languages
French (fr)
Japanese (ja)
Inventor
諭司 花井
聡司 峯澤
遠藤 聡
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation filed Critical Mitsubishi Electric Corporation
Priority to JP2018527290A priority Critical patent/JP6641482B2/en
Priority to PCT/JP2016/070566 priority patent/WO2018011890A1/en
Publication of WO2018011890A1 publication Critical patent/WO2018011890A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • the present invention relates to a control system and a device control method.
  • A technique for operating a device connected to a network using information indicated by a barcode is known (for example, Patent Document 1).
  • In the digital home appliance remote control system disclosed in Patent Document 1, a barcode is read by a barcode reader connected to a mobile phone, and the digital home appliance is operated according to the information code described in the read barcode.
  • In recent years, wearable computers (also referred to as wearable devices or wearable terminals) have become increasingly popular.
  • Such wearable computers are mainly used for entertainment and health management, and research is also being conducted into operating devices using information measured by sensors provided in these wearable computers.
  • However, since the sensors in a wearable computer are assumed to be constantly active, that is, constantly measuring, there is a concern that, when they are used for device operation as described above, an operation unintended by the user may be performed automatically.
  • the present invention has been made to solve the above-described problems, and an object thereof is to provide a control system or the like that can avoid execution of an operation not intended by the user.
  • a control system comprising a control device that controls a plurality of devices, and a terminal device, wherein
  • the terminal device includes: an image sensor; specific image detection means for detecting, from an image captured by the image sensor, a specific image that matches a predetermined condition; detection information transmitting means for transmitting image detection information based on the specific image to the control device;
  • and notification means for notifying the user of the content of operation information when the operation information transmitted from the control device is received;
  • and the control device includes: operation information transmitting means for, when the image detection information transmitted from the terminal device is received, selecting a control candidate device from the plurality of devices based on the image detection information, generating operation information related to the control candidate device, and transmitting the operation information to the terminal device;
  • and device control means for determining whether or not control of the control candidate device is permitted by the user and, if permitted, controlling the control candidate device based on the operation information.
  • the terminal device transmits image detection information based on the detected specific image to the control device.
  • the control device selects a control candidate device based on the image detection information, generates operation information related to the control candidate device, and transmits the operation information to the terminal device.
  • the terminal device notifies the user of the content of the operation information, and the control device controls the control candidate device when the control is permitted by the user. Therefore, it is possible to avoid an operation unintended by the user on the device.
  • FIG. 1 is a diagram showing the configuration of the control system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing the hardware configuration of the control device.
  • FIG. 3 is a diagram for explaining the tables and other data stored in the secondary storage device of the control device.
  • FIG. 4 is a block diagram showing the hardware configuration of the terminal device according to Embodiment 1.
  • FIG. 5 is a diagram showing the functional configuration of the terminal device according to Embodiment 1.
  • FIG. 6 is a diagram showing the functional configuration of the control device.
  • FIG. 7 is a flowchart showing the procedure of the device control processing of Embodiment 1.
  • FIG. 8 is a diagram showing a state in which a specific image is within the user's field of view.
  • FIG. 9 is a diagram showing an example of projection of the operation information video in Embodiment 1.
  • FIG. 10 is a diagram for explaining Modification 1 of Embodiment 1.
  • FIG. 11 is a diagram (part 1) for explaining Modification 2 of Embodiment 1.
  • FIG. 12 is a diagram (part 2) for explaining Modification 2 of Embodiment 1.
  • FIG. 13 is a diagram (part 1) for explaining Modification 3 of Embodiment 1.
  • FIG. 14 is a diagram (part 2) for explaining Modification 3 of Embodiment 1.
  • FIG. 15 is a diagram for explaining Modification 4 of Embodiment 1.
  • FIG. 16 is a block diagram showing the hardware configuration of the terminal device according to Embodiment 2 of the present invention.
  • FIG. 17 is a diagram for explaining a control permission instruction in Embodiment 2.
  • FIG. 18 is a diagram for explaining Modification 1 of Embodiment 2.
  • FIG. 19 is a diagram for explaining Modification 2 of Embodiment 2.
  • FIG. 20 is a diagram for explaining another embodiment.
  • FIG. 1 is a diagram illustrating a configuration of a control system 1 according to the first embodiment of the present invention.
  • the control system 1 is a so-called HEMS (Home Energy Management System) system that manages electric power used in a general home.
  • the control system 1 includes a control device 2, a terminal device 3, and a plurality of devices 4 (devices 4-1, 4-2, ...).
  • the control device 2 is installed at an appropriate location in the house H, monitors the power consumed in this home (demand area), and displays the power consumption status. In addition, the control device 2 performs control of each device 4 and monitoring of an operation state.
  • The device 4 (devices 4-1, 4-2, ...) is, for example, an electric appliance such as an air conditioner, an illuminator, a television, a water heater, or an IH (Induction Heating) cooker.
  • Each device 4 is installed in a house H (including a site) and connected to a power line that supplies power from a commercial power source, a power generation facility, a power storage facility, or the like (all not shown).
  • Each device 4 is communicably connected to the control device 2 via a wireless network (not shown).
  • This wireless network is, for example, a network conforming to ECHONET Lite. Depending on the specifications of the device 4, it may be connected to this wireless network via an external communication adapter (not shown).
  • In response to a request from the control device 2, each device 4 transmits data (operation state data) storing information indicating its current operation state to the control device 2.
  • the control device 2 includes a CPU (Central Processing Unit) 20, a communication interface 21, a ROM (Read Only Memory) 22, a RAM (Random Access Memory) 23, and a secondary storage device 24. These components are connected to each other via a bus 25.
  • the CPU 20 controls the control device 2 in an integrated manner. Details of functions realized by the CPU 20 will be described later.
  • the communication interface 21 includes a network card for wirelessly communicating with each device 4 via the above-described wireless network and an IC chip for wirelessly communicating with the terminal device 3.
  • ROM 22 stores a plurality of firmware and data used when executing these firmware.
  • the RAM 23 is used as a work area for the CPU 20.
  • the secondary storage device 24 includes a readable/writable non-volatile semiconductor memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory) or a flash memory, an HDD (Hard Disk Drive), or the like. As shown in FIG. 3, the secondary storage device 24 stores a remote operation program 240 and device information 241. In addition, the secondary storage device 24 stores various other programs, including a program for monitoring the operation state of the devices 4 and a program for monitoring the power consumed in the home, together with data used when these programs are executed.
  • the remote operation program 240 is a computer program executed by the CPU 20.
  • the remote operation program 240 describes processing for operating the device 4 via the terminal device 3.
  • the device information 241 is information related to the device 4, and includes a connection information table 2410, a location information table 2411, an operation information table 2412, and an operation code table 2413.
  • the connection information table 2410 is a data table that stores connection information of the device 4.
  • the connection information is information for communicating with the device 4, for example, the address of the device 4.
  • the location information table 2411 is a data table that stores location information of the device 4.
  • the location information is information indicating a location where the device 4 is installed. More specifically, the location information is information for identifying a room in the house H.
  • the information for identifying a room is, for example, an ID (identification) previously assigned to each room or a name of each room (“living room”, “kitchen”, etc.).
  • the operation information table 2412 is a data table that stores the current operation state of the device 4.
  • the operation state includes, for example, a power supply state (power on / off), operation presence / absence (during operation / stop), abnormality presence / absence (abnormal / normal), operation mode, setting information, and the like.
  • the operation mode indicates an operation method. For example, in the case of an air conditioner, cooling, heating, ventilation, dehumidification, and the like correspond to the operation mode, and in the case of an illuminator, normal illumination, power saving illumination, and the like correspond to the operation mode.
  • In the case of an air conditioner, the setting information corresponds to a set temperature (also referred to as a target temperature), an air volume, and the like, and in the case of an illuminator, to a brightness level or the like.
  • the CPU 20 updates the operation information table 2412 based on the operation state data acquired from each device 4.
  • the operation code table 2413 is a data table in which an operation code is associated with information for identifying the device 4 (for example, device ID or address) and the operation content of the device 4.
  • the operation code is, for example, a four-byte character string combining letters and numbers. For example, “L001” indicates the power on/off operation of the illuminator installed in the living room, “L002” indicates the power on/off operation of the air conditioner installed in the living room, and “K013” indicates the power-saving function on/off operation of the refrigerator installed in the kitchen.
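For illustration only, the operation code table 2413 can be pictured as a simple mapping from a code string to a device and an operation. The sketch below is not part of the patent; the entry layout, field names, and helper function are hypothetical and merely mirror the examples "L001", "L002", and "K013" given above.

```python
# Minimal sketch of an operation code table (hypothetical entries and field
# names, mirroring the examples "L001", "L002", "K013" given in the text).
OPERATION_CODE_TABLE = {
    "L001": {"device_id": "living_illuminator",      "operation": "power_on_off"},
    "L002": {"device_id": "living_air_conditioner",  "operation": "power_on_off"},
    "K013": {"device_id": "kitchen_refrigerator",    "operation": "power_saving_on_off"},
}

def look_up_operation_code(code: str):
    """Return the control candidate device and operation content for a code."""
    entry = OPERATION_CODE_TABLE.get(code)
    if entry is None:
        raise KeyError(f"unknown operation code: {code}")
    return entry["device_id"], entry["operation"]
```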
  • the terminal device 3 is a so-called wearable computer (also referred to as a wearable device or a wearable terminal), and in this embodiment is a glasses-type head-mounted display (also referred to as smart glasses) worn on the user's head.
  • the terminal device 3 includes a CPU 30, a communication interface 31, an image sensor 32, a video projection unit 33, a ROM 34, a RAM 35, and a secondary storage device 36. These components are connected to each other via a bus 37.
  • the CPU 30 controls the terminal device 3 in an integrated manner. Details of the functions realized by the CPU 30 will be described later.
  • the communication interface 31 includes an IC chip for wireless communication with the control device 2.
  • the image sensor 32 is an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and is provided on the outside of the frame of the terminal device 3 so that it can image subjects in the field of view of the user wearing the terminal device 3. That is, the imaging range (field of view) of the image sensor 32 is equivalent to the field of view of the user wearing the terminal device 3.
  • the image projection unit 33 projects an image on at least one of the right-eye lens and the left-eye lens in the terminal device 3 under the control of the CPU 30.
  • These lenses in the terminal device 3 employ a half mirror made of a dielectric multilayer film, and have a structure that transmits light from the front and reflects an image by the image projection unit 33.
  • Thus, while wearing the terminal device 3, the user can see the projected image while also viewing the external environment. That is, the user wearing the terminal device 3 can experience so-called augmented reality (AR).
  • the ROM 34 stores a plurality of firmware, data used when executing these firmware, and the like.
  • the RAM 35 is used as a work area for the CPU 30.
  • the secondary storage device 36 is configured by a readable / writable nonvolatile semiconductor memory such as an EEPROM or a flash memory.
  • the secondary storage device 36 stores various programs including a remote operation program describing processing for remotely operating the device 4 via the control device 2 and data used when executing these programs.
  • the terminal device 3 includes a specific image detection unit 300, a detection information transmission unit 301, and a notification unit 302 as shown in FIG. These functional units are realized by the CPU 30 executing a remote operation program stored in the secondary storage device 36.
  • the specific image detection unit 300 detects a specific image that matches a predetermined condition from images captured by the image sensor 32 at regular time intervals (for example, at intervals of 0.5 seconds).
  • the specific image is a two-dimensional code, and more specifically, a matrix-type two-dimensional code.
  • the specific image may be a stack type two-dimensional code or a one-dimensional code.
  • the specific image is not limited to that based on a known standard, and may be defined for use in the control system 1.
  • the detection information transmission unit 301 decodes the detected specific image and transmits image detection information storing the decoded result (in the present embodiment, the operation code described above) to the control device 2.
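A minimal sketch of how the specific image detection unit 300 and the detection information transmission unit 301 could work together is shown below, assuming the 0.5-second detection interval given above as an example. The decoder, the capture API, and the transport object are placeholders and assumptions, not an implementation taken from the patent.

```python
import time

DETECTION_INTERVAL_S = 0.5  # the example interval given in the text

def decode_specific_image(frame):
    """Placeholder for a matrix-type 2D-code decoder; a real implementation
    would use a QR/2D-code library. Returns the decoded operation code
    string, or None when no specific image is found in the frame."""
    return None  # stand-in: no code detected

def detection_loop(image_sensor, control_device_link):
    """Periodically capture a frame, detect and decode the specific image,
    and send image detection information to the control device."""
    while True:
        frame = image_sensor.capture()            # hypothetical capture API
        operation_code = decode_specific_image(frame)
        if operation_code is not None:
            # Image detection information: here simply the decoded result.
            control_device_link.send({"operation_code": operation_code})
        time.sleep(DETECTION_INTERVAL_S)
```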
  • When the notification unit 302 receives later-described operation information transmitted from the control device 2, it notifies the user of the content of the operation information. More specifically, the notification unit 302 projects an image indicating the content of the operation information onto a predetermined position on the lens of the terminal device 3.
  • control device 2 includes an operation information transmission unit 200 and a device control unit 201 as shown in FIG. These functional units are realized by the CPU 20 executing the remote operation program 240 stored in the secondary storage device 24.
  • the operation information transmission unit 200 includes a candidate determination unit 2000 and an operation information generation unit 2001.
  • When the candidate determination unit 2000 receives the above-described image detection information from the terminal device 3, it extracts an operation code from the received image detection information. The candidate determination unit 2000 then refers to the operation code table 2413 using the extracted operation code and determines the device 4 to be the control target candidate (control candidate device) and the operation content for that device 4.
  • For example, when the extracted operation code is “L001”, the candidate determination unit 2000 determines the illuminator installed in the living room (hereinafter referred to as the living room illuminator) as the control candidate device, and determines the power on/off operation as the operation content for the living room illuminator.
  • the operation information generation unit 2001 refers to the operation information table 2412 and acquires the current operation state corresponding to the determined operation content in the determined control candidate device. Then, the operation information generation unit 2001 generates operation information for notifying the terminal device 3 based on the determined control candidate device and its operation content and the acquired current operation state of the control candidate device.
  • For example, the operation information stores information indicating that the living room illuminator, which is currently turned off, is to be controlled to turn on.
  • the operation information generation unit 2001 transmits the operation information generated in this way to the terminal device 3.
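As a rough illustration of the steps performed by the candidate determination unit 2000 and the operation information generation unit 2001, the following sketch looks up an operation code and packages the current operation state into operation information. The table layouts and field names are assumptions made for illustration, not definitions from the patent.

```python
def generate_operation_information(operation_code,
                                   operation_code_table,
                                   operation_info_table):
    """Sketch of candidate determination followed by operation information
    generation (table layouts are hypothetical)."""
    # Candidate determination: operation code -> control candidate device
    # and operation content.
    entry = operation_code_table[operation_code]
    device_id, operation = entry["device_id"], entry["operation"]

    # Look up the current operation state of the control candidate device.
    current_state = operation_info_table[device_id]   # e.g. {"power": "off"}

    # Operation information sent to the terminal device, e.g. "the living
    # room illuminator, currently off, will be turned on".
    return {
        "device_id": device_id,
        "operation": operation,
        "current_state": current_state,
    }
```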
  • the device control unit 201 includes a permission determination unit 2010 and a control command transmission unit 2011.
  • the permission determination unit 2010 determines whether or not the user is permitted to control the control candidate device based on the image detection information transmitted from the terminal device 3 after the operation information is transmitted.
  • Specifically, the permission determination unit 2010 determines whether or not image detection information having the same content (that is, the same operation code) as the image detection information corresponding to the operation information previously transmitted to the terminal device 3 has been continuously received from the terminal device 3 for a predetermined period (first period). When it has been continuously received, the permission determination unit 2010 determines that the permission condition is satisfied and that control of the control candidate device has been permitted by the user.
  • the first period is, for example, 10 seconds.
  • On the other hand, when the permission determination unit 2010 does not receive, even once during a predetermined period (second period), image detection information having the same content as the image detection information corresponding to the operation information transmitted to the terminal device 3, it determines that the non-permission condition is satisfied and does not permit control of the control candidate device.
  • the second period is, for example, 10 seconds.
  • the permission determination unit 2010 performs the above permission determination during a predetermined period (permission waiting period) after the operation information is transmitted to the terminal device 3 by the operation information generation unit 2001, that is, after the content of the operation information is notified to the user by the terminal device 3.
  • the permission waiting period is, for example, 30 seconds. If neither the permission condition nor the non-permission condition is satisfied during the permission waiting period, the permission determination unit 2010 determines that control of the control candidate device is not permitted.
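The timing-based permission logic described above (first period, second period, and permission waiting period) can be sketched as follows. This is only an illustrative reading of the description under stated assumptions; `receive_detection`, its non-blocking behaviour, and the 0.5-second polling interval are placeholders.

```python
import time

FIRST_PERIOD_S = 10     # continuous reception -> permission
SECOND_PERIOD_S = 10    # continuous absence   -> non-permission
PERMISSION_WAIT_S = 30  # overall permission waiting period

def wait_for_permission(receive_detection, expected_code):
    """Sketch of the permission determination: returns True when the same
    operation code keeps arriving for FIRST_PERIOD_S, False when it stays
    absent for SECOND_PERIOD_S or the waiting period expires.
    `receive_detection` is a hypothetical non-blocking receive that returns
    the latest operation code sent by the terminal device, or None."""
    start = time.monotonic()
    last_seen = None           # last time the expected code was received
    first_seen = None          # start of the current continuous-reception run
    while time.monotonic() - start < PERMISSION_WAIT_S:
        code = receive_detection()
        now = time.monotonic()
        if code == expected_code:
            last_seen = now
            if first_seen is None:
                first_seen = now
            if now - first_seen >= FIRST_PERIOD_S:
                return True          # permission condition satisfied
        else:
            first_seen = None        # continuity broken
            reference = last_seen if last_seen is not None else start
            if now - reference >= SECOND_PERIOD_S:
                return False         # non-permission condition satisfied
        time.sleep(0.5)
    return False                     # waiting period expired -> not permitted
```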
  • When the above permission condition is satisfied, the control command transmission unit 2011 generates a control command for controlling the control candidate device and transmits the control command to the control candidate device. More specifically, the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content indicated by the operation information.
  • FIG. 7 is a flowchart showing a procedure of device control processing in remote operation executed by the control system 1.
  • When a specific image, that is, a two-dimensional code, enters the field of view of the user wearing the terminal device 3, the specific image detection unit 300 detects the specific image (step S101).
  • the detection information transmission unit 301 of the terminal device 3 decodes the specific image detected by the specific image detection unit 300, and transmits the image detection information storing the decoded result (that is, the operation code) to the control device 2 (step S102).
  • When the operation information transmission unit 200 (candidate determination unit 2000 and operation information generation unit 2001) of the control device 2 receives the image detection information from the terminal device 3, it determines the control candidate device and generates operation information based on the operation code stored in the received image detection information (step S103). The operation information transmission unit 200 then transmits the generated operation information to the terminal device 3 (step S104).
  • When the notification unit 302 of the terminal device 3 receives the operation information from the control device 2, it notifies the user of the content of the operation information (step S105). More specifically, the notification unit 302 projects an image (operation information video) indicating the content of the operation information onto the lens of the terminal device 3.
  • FIG. 9 shows an example in which the operation information video is projected.
  • When permitting control of the control candidate device (here, when permitting control to turn the living room illuminator from off to on), the user keeps the two-dimensional code within his or her field of view for at least the first period (for example, 10 seconds) within the permission waiting period (for example, 30 seconds) described above. Specifically, the user does not move the head or the object on which the two-dimensional code is printed in such a way that the two-dimensional code leaves the user's field of view.
  • On the other hand, when not permitting control, the user simply moves the head or the object on which the two-dimensional code is printed so that the two-dimensional code stays out of the user's field of view for the second period (for example, 10 seconds).
  • After the operation information is transmitted to the terminal device 3 (that is, after the content of the operation information is notified by the terminal device 3), the permission determination unit 2010 of the control device 2 monitors the image detection information sent from the terminal device 3 to determine whether or not control of the control candidate device is permitted (step S106).
  • When it is determined that the control is permitted (step S106; YES), the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content of the operation information generated by the operation information generation unit 2001, and transmits the control command to the control candidate device (step S107).
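Tying the steps together, the overall flow of FIG. 7 (steps S101 to S107) might be sketched as below. All objects and method names are hypothetical stand-ins for the functional units described above, reusing the helper sketches given earlier; this is an illustrative reading, not the patent's implementation.

```python
def device_control_processing(image_sensor, control_device, terminal, device_4):
    """High-level sketch of the flow of FIG. 7 (steps S101-S107)."""
    # S101: the specific image detection unit detects the 2D code.
    frame = image_sensor.capture()
    operation_code = decode_specific_image(frame)
    if operation_code is None:
        return

    # S102: the detection information transmission unit sends the decoded
    # result (image detection information) to the control device.
    control_device.receive_image_detection({"operation_code": operation_code})

    # S103-S104: the operation information transmission unit determines the
    # control candidate device, generates operation information and sends it
    # back to the terminal device.
    info = control_device.make_operation_information(operation_code)
    terminal.notify(info)                       # S105: project the video

    # S106: the permission determination unit monitors further image
    # detection information during the permission waiting period.
    if control_device.wait_for_permission(operation_code):
        # S107: the control command transmission unit sends a control
        # command to the control candidate device.
        device_4.execute(info["operation"])
```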
  • As described above, the control system 1 according to Embodiment 1 can control the devices 4 using a specific image, that is, a two-dimensional code, and therefore offers excellent usability. Even when the specific image is detected by the terminal device 3, the control device 2 does not immediately control the corresponding device 4; it generates operation information corresponding to the specific image and transmits it to the terminal device 3. The terminal device 3 then notifies the user of the content of the operation information from the control device 2.
  • the control device 2 executes the control of the device 4 (control candidate device) only when the user permits the execution of the control. Therefore, it is possible to avoid a situation in which an operation unintended by the user is performed on the device 4, and an effect of preventing an erroneous operation on the device 4 can be obtained.
  • the remote operation service of the device 4 using such a terminal device 3 can be provided to a wide range of users.
  • (Modification 1) The notification unit 302 of the terminal device 3 may also notify the user of the methods for instructing permission and non-permission, as shown in FIG. 10.
  • (Modification 2) In the first embodiment, when the operation information video is projected, the user instructs permission of the control by keeping the specific image (two-dimensional code) within the user's field of view for the first period (for example, 10 seconds).
  • the instruction method is not limited to this.
  • the user may be able to instruct permission of control by moving the position of the current specific image in the field of view to a predetermined area in the field of view.
  • In this modification, when receiving the operation information, the notification unit 302 of the terminal device 3 projects the operation information video onto the lens of the terminal device 3 and also projects a movement destination frame 1101, as shown in FIG. 11, onto the lens.
  • To permit the control, the user moves the head or the object on which the specific image is printed, so that the position of the specific image within the user's field of view (that is, within the imaging range of the image sensor 32) moves into the area surrounded by the movement destination frame 1101 (see FIG. 12).
  • In this modification, for a predetermined period (the same as the permission waiting period) after the operation information video is projected, the detection information transmission unit 301 of the terminal device 3 stores position information of the specific image in the image detection information transmitted to the control device 2.
  • the specific image position information is information indicating the position of the specific image in the image (captured image) captured by the image sensor 32.
  • the permission determination unit 2010 of the control device 2 in this modification determines whether or not the control is permitted by the user based on the position information of the specific image included in the received image detection information. That is, when the specific image is located in the area of the captured image corresponding to the projection position of the movement destination frame 1101, the permission determination unit 2010 determines that the permission condition is satisfied and that control of the control candidate device has been permitted. It is assumed that data indicating the correspondence between areas of the captured image and the projection position of the movement destination frame 1101 is stored in advance in the secondary storage device 24 of the control device 2.
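The check used in this modification reduces to asking whether the detected position of the specific image falls inside the captured-image region that corresponds to the projected movement destination frame 1101. A minimal sketch follows, with assumed coordinate conventions (pixel positions and a rectangle), which are not defined in the patent.

```python
def specific_image_in_destination_frame(image_position, frame_region):
    """Is the specific image's position inside the captured-image area that
    corresponds to the projected movement destination frame 1101?
    `image_position` is a hypothetical (x, y) pixel position and
    `frame_region` a hypothetical (x_min, y_min, x_max, y_max) rectangle."""
    x, y = image_position
    x_min, y_min, x_max, y_max = frame_region
    return x_min <= x <= x_max and y_min <= y <= y_max
```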
  • (Modification 3) In the first embodiment, the control in the case where one specific image enters the user's field of view (the imaging range of the image sensor 32), that is, the control when there is one control candidate device, has been described. However, as shown in FIG. 13, a plurality of specific images may fall within the user's field of view. In such a case, when receiving the operation information transmitted from the control device 2, the notification unit 302 of the terminal device 3 in this modification projects each operation information video in the vicinity of the corresponding specific image (see FIG. 14).
  • By moving the position of any one of the specific images within the field of view into the area surrounded by the movement destination frame 1101 projected onto the lens, the user can instruct the control device 2 to permit control of the control candidate device corresponding to that specific image.
  • the permission waiting period is not fixed and may be changed according to the number of control candidate devices. For example, when there is one control candidate device, the permission waiting period may be 30 seconds, and when there are three control candidate devices, the permission waiting period may be 60 seconds.
  • (Modification 4) In this modification, the operation information transmission unit 200 of the control device 2 further includes a control availability determination unit (not shown) that determines whether control of the control candidate device is possible.
  • The control availability determination unit refers to the operation information table 2412 to determine whether an abnormality has been detected in the control candidate device; when an abnormality has been detected, it determines that the control candidate device cannot be controlled. In addition, for example, when a predetermined operation prohibition period has not elapsed since the most recent operation performed on the control candidate device, the control availability determination unit determines that the control candidate device cannot be controlled.
  • the operation prohibition period may be common to all the devices 4, or may be set for each type of device 4 or individually.
  • If the control availability determination unit determines that control of the control candidate device is not possible, the operation information generation unit 2001 generates operation information indicating the current operation state of the control candidate device and the fact that control is not possible, and transmits the operation information to the terminal device 3.
  • An example of the content of the operation information notified by the terminal device 3 in this case is shown in FIG. 15.
  • Even when image detection information corresponding to a control candidate device determined to be uncontrollable by the control availability determination unit is received from the terminal device 3, the permission determination unit 2010 discards the image detection information without performing the processing for determining whether control is permitted.
  • In this way, a device 4 that should preferably not be operated via the terminal device 3 from the viewpoint of safety, energy saving, comfort, and the like can be excluded from the operation targets.
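A possible reading of the control availability determination of this modification is sketched below; the concrete prohibition period and the shape of the device state record are assumptions, since the text leaves them open.

```python
import time

OPERATION_PROHIBITION_PERIOD_S = 60  # hypothetical value; the text leaves it open

def control_is_available(device_state, last_operated_at, now=None):
    """Sketch of the control availability determination of Modification 4:
    control is refused when an abnormality is detected or when the operation
    prohibition period since the most recent operation has not yet elapsed.
    `device_state` is a hypothetical record from the operation information
    table, e.g. {"abnormal": False}."""
    now = time.monotonic() if now is None else now
    if device_state.get("abnormal", False):
        return False
    if last_operated_at is not None and (
            now - last_operated_at < OPERATION_PROHIBITION_PERIOD_S):
        return False
    return True
```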
  • (Embodiment 2) Next, Embodiment 2 of the present invention will be described.
  • components and the like that are common to the first embodiment are denoted by the same reference numerals, and description thereof is omitted.
  • FIG. 16 is a block diagram illustrating a hardware configuration of the terminal device 3A according to the second embodiment.
  • a line-of-sight detection unit 38 is added in addition to the configuration of the terminal device 3 of the first embodiment.
  • the line-of-sight detection unit 38 includes an electrooculogram sensor and detects the line of sight of the user wearing the terminal device 3A.
  • the line-of-sight detection unit 38 may include an image sensor that captures an image of the user's eye, and may detect the user's line of sight by analyzing the captured image of the eye.
  • When the notification unit 302 of the terminal device 3A receives the operation information from the control device 2, it projects the operation information video onto the lens of the terminal device 3A and also projects a mark 1701, as shown in FIG. 17, onto the lens.
  • To permit the control, the user looks at the mark 1701 for a predetermined period (third period).
  • the third period is shorter than the first and second periods described above, for example, 3 seconds.
  • For a predetermined period (the same as the permission waiting period) after the operation information video is projected, the detection information transmission unit 301 of the terminal device 3A stores line-of-sight information in the image detection information transmitted to the control device 2.
  • the line-of-sight information is information indicating the detection value of the line-of-sight detection unit 38.
  • the permission determination unit 2010 of the control device 2 of this embodiment determines whether or not the control is permitted by the user by monitoring the line-of-sight information included in the received image detection information. More specifically, the permission determination unit 2010 determines from the line-of-sight information whether or not the user is looking at the mark 1701, and when the state in which the user is looking at the mark 1701 continues for a predetermined period (the same as the third period), it determines that the permission condition is satisfied and that control of the control candidate device is permitted. It is assumed that data indicating the correspondence between the detection value of the line-of-sight detection unit 38 and the projection position of the mark 1701 is stored in advance in the secondary storage device 24 of the control device 2.
  • Otherwise, the permission determination unit 2010 determines that control of the control candidate device is not permitted.
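The gaze-based permission check of this embodiment can be sketched as a simple dwell-time test over gaze samples. The sampling representation, sample interval, and the mark region format below are assumptions for illustration, not details taken from the patent.

```python
THIRD_PERIOD_S = 3.0  # the example dwell time given in the text

def gaze_permission_granted(gaze_samples, mark_region, sample_interval_s=0.1):
    """Sketch of the gaze-based permission check of Embodiment 2: permission
    is granted when the user's line of sight stays on the projected mark 1701
    for the third period. `gaze_samples` is a hypothetical time-ordered list
    of (x, y) gaze points; `mark_region` is the screen rectangle of the mark
    as (x_min, y_min, x_max, y_max)."""
    needed = int(THIRD_PERIOD_S / sample_interval_s)
    consecutive = 0
    x_min, y_min, x_max, y_max = mark_region
    for x, y in gaze_samples:
        if x_min <= x <= x_max and y_min <= y <= y_max:
            consecutive += 1
            if consecutive >= needed:
                return True   # permission condition satisfied
        else:
            consecutive = 0   # dwell interrupted
    return False
```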
  • When the above permission condition is satisfied, the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content of the operation information generated by the operation information generation unit 2001, and transmits the control command to the control candidate device.
  • As described above, also in Embodiment 2, the devices 4 can be controlled using a specific image, that is, a two-dimensional code, and the system therefore offers excellent usability. Even when the specific image is detected by the terminal device 3A, the control device 2 does not immediately control the corresponding device 4; it generates operation information corresponding to the specific image and transmits it to the terminal device 3A. The terminal device 3A then notifies the user of the content of the operation information from the control device 2. The control device 2 executes the control of the device 4 (control candidate device) only when the user permits the execution of the control. A situation in which an operation unintended by the user is performed on the device 4 can therefore be avoided, and an effect of preventing an erroneous operation on the device 4 is obtained.
  • In addition, the user can instruct the control device 2 whether or not to permit the control simply by moving the line of sight. Therefore, the remote operation service of the devices 4 using such a terminal device can be provided to an even wider range of users.
  • (Modification 1) In the second embodiment, when the control content is notified, the user instructs permission of the control by gazing at the mark 1701 projected onto the lens of the terminal device 3A for the third period.
  • the instruction method is not limited to this.
  • For example, instead of the mark 1701, the notification unit 302 of the terminal device 3A may project onto the lens of the terminal device 3A marks 1801A and 1801B, as shown in FIG. 18, which indicate where to move the line of sight when permitting and when not permitting the control, respectively.
  • To permit the control, the user looks at the mark 1801A for the third period; to refuse the control, the user looks at the mark 1801B for the third period.
  • In this case, the permission determination unit 2010 of the control device 2 determines, from the line-of-sight information included in the received image detection information, whether the user is looking at the mark 1801A. When the state in which the user is looking at the mark 1801A continues for the third period, the permission determination unit 2010 determines that the permission condition is satisfied and that control of the control candidate device is permitted.
  • Similarly, the permission determination unit 2010 determines from the line-of-sight information whether the user is looking at the mark 1801B. When the state in which the user is looking at the mark 1801B continues for the third period, the permission determination unit 2010 determines that the non-permission condition is satisfied and that control of the control candidate device is not permitted.
  • If neither the permission condition nor the non-permission condition is satisfied within the permission waiting period, the permission determination unit 2010 determines that control of the control candidate device is not permitted.
  • (Modification 2) In the second embodiment, the control in the case where one specific image enters the user's field of view (the imaging range of the image sensor 32), that is, in the case where there is one control candidate device, has been described. However, as shown in FIG. 13, a plurality of specific images may fall within the user's field of view. In such a case, when receiving the operation information from the control device 2, the notification unit 302 of the terminal device 3A in this modification projects each operation information video in the vicinity of the corresponding specific image, as in Modification 3 of the first embodiment (see FIG. 19).
  • the notification unit 302 changes the projection mode of the operation information video to which the user's line of sight is directed from the others. For example, in the example of FIG. 19, it is shown that the user's line of sight is directed to the operation information video corresponding to the kitchen illuminator that is the control candidate device. In this way, the user can easily and accurately recognize where his / her line of sight is directed.
  • In this modification as well, for a predetermined period (the same as the permission waiting period) after the operation information videos are projected, the detection information transmission unit 301 of the terminal device 3A stores line-of-sight information in the image detection information transmitted to the control device 2.
  • the line-of-sight information in this modification is information indicating whether or not the user's line of sight is directed to the operation information video corresponding to the specific image.
  • the permission determination unit 2010 of the control device 2 in this modification monitors the line-of-sight information included in the received image detection information to determine whether the user is looking at any one of the operation information videos. If the state in which the user is looking at one of the operation information videos continues for a predetermined period (fourth period), the permission determination unit 2010 determines that the permission condition is satisfied and that control of the control candidate device corresponding to that operation information video is permitted.
  • the permission waiting period is not fixed and may be changed according to the number of control candidate devices as in the case of the third modification of the first embodiment. For example, when there is one control candidate device, the permission waiting period may be 30 seconds, and when there are three control candidate devices, the permission waiting period may be 60 seconds.
  • As in Modification 4 of the first embodiment, the operation information transmission unit 200 may further include a control availability determination unit (not shown) that determines whether control of the control candidate device is possible.
  • If the control availability determination unit determines that control is not possible, the operation information generation unit 2001 generates operation information indicating the current operation state of the control candidate device and the fact that control is not possible, and transmits the operation information to the terminal device 3A.
  • Even when image detection information corresponding to a control candidate device determined to be uncontrollable by the control availability determination unit is received from the terminal device 3A, the permission determination unit 2010 discards the image detection information without performing the processing for determining whether control is permitted.
  • In the above embodiments, the terminal devices 3 and 3A are eyeglass-type head-mounted displays (also referred to as smart glasses), but a non-transmissive head-mounted display or a wearable computer other than a head-mounted display may also be employed as the terminal device according to the present invention.
  • In the case of a non-transmissive head-mounted display, a display device such as a liquid crystal display that displays an image of the outside world (outside world image) captured by the image sensor 32 is provided. In this case, the operation information video is displayed on the display device so as to be superimposed on the outside world image.
  • the terminal devices 3 and 3A may emit sound or light, or may vibrate.
  • the terminal devices 3 and 3A may notify the user that the control device 2 has received an instruction for permission or non-permission of control.
  • the control device 2 notifies the terminal devices 3 and 3A that an instruction to allow or disallow control by the user has been received.
  • the terminal devices 3 and 3A change the projection mode (or display mode) of the corresponding operation information video according to permission or non-permission.
  • FIG. 20 shows an example in which such notification is performed by the terminal device 3 of Modification 3 of the first embodiment. The example of FIG. 20 shows that control of the living room illuminator is permitted, control of the living room air conditioner is not permitted, and control of the kitchen illuminator is waiting for a permission or non-permission instruction.
  • the manner of notifying the content of the operation information is not limited to video only, and may be notified by a voice message, for example.
  • In each of the above embodiments, the control device 2 determines whether or not control of the control candidate device is permitted by the user, but this determination may instead be performed by the terminal devices 3 and 3A. Specifically, the terminal devices 3 and 3A may include a permission determination unit (not shown) that performs the same function as the permission determination unit 2010 of the control device 2, and this permission determination unit transmits information indicating the determination result (permission determination information) to the control device 2. The control device 2 then determines whether or not control is permitted based on the permission determination information received from the terminal devices 3 and 3A.
  • the terminal devices 3 and 3A may be separately provided with a push button, a switch, or the like for receiving permission or non-permission of control for the control candidate device from the user.
  • a microphone may be provided in the terminal devices 3 and 3A so that the user can instruct permission or non-permission of control of the control candidate device by voice, or an acceleration sensor is provided in the terminal devices 3 and 3A. The user may be able to instruct permission or non-permission of control of the control candidate device by tilting or shaking the terminal devices 3 and 3A.
  • each function unit (see FIG. 6) of the control device 2 is realized by the CPU 20 executing the remote operation program 240 stored in the secondary storage device 24.
  • Similarly, each functional unit (see FIG. 5) of the terminal devices 3 and 3A is realized by the CPU 30 executing the remote operation program stored in the secondary storage device 36.
  • all or part of the functional units of the control device 2 and the terminal devices 3 and 3A may be realized by dedicated hardware.
  • the dedicated hardware is, for example, a single circuit, a composite circuit, a programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The program executed by the control device 2 can also be stored and distributed on a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), a magneto-optical disc, a USB (Universal Serial Bus) memory, a memory card, or an HDD. By installing such a program on a dedicated or general-purpose computer, the computer can be made to function as the control device 2 of each of the above embodiments.
  • the above program may be stored in a disk device or the like of a server on a network such as the Internet, and the program may be downloaded from the server to a computer.
  • the present invention can be suitably employed in a system that manages the operation of electrical equipment installed in the home.
  • 1 control system, 2 control device, 3 terminal device, 4-1, 4-2 device, 20, 30 CPU, 21, 31 communication interface, 22, 34 ROM, 23, 35 RAM, 24, 36 secondary storage device, 25, 37 bus, 32 image sensor, 33 video projection unit, 38 line-of-sight detection unit, 200 operation information transmission unit, 201 device control unit, 240 remote operation program, 241 device information, 300 specific image detection unit, 301 detection information transmission unit, 302 notification unit, 2000 candidate determination unit, 2001 operation information generation unit, 2010 permission determination unit, 2011 control command transmission unit, 2410 connection information table, 2411 location information table, 2412 operation information table, 2413 operation code table

Abstract

A terminal device detects a specific image from images captured by an image sensor (step S101) and transmits image detection information based on the specific image to a control device (step S102). When the image detection information transmitted from the terminal device is received, the control device selects a control candidate apparatus from a plurality of apparatuses on the basis of the image detection information, generates operation information relating to the control candidate apparatus (step S103), and transmits it to the terminal device (step S104). When the operation information transmitted from the control device is received, the terminal device notifies the content of the operation information to a user (step S105). The control device determines whether or not control of the control candidate apparatus is permitted by the user (step S106) and, when control is permitted, transmits a control command to the control candidate apparatus (step S107).

Description

Control system and device control method
The present invention relates to a control system and a device control method.
A technique for operating a device connected to a network using information indicated by a barcode is known (for example, Patent Document 1).
In the digital home appliance remote control system disclosed in Patent Document 1, a barcode is read by a barcode reader connected to a mobile phone, and the digital home appliance is operated according to the information code described in the read barcode.
JP 2004-110148 A
Incidentally, in recent years, wearable computers (also referred to as wearable devices or wearable terminals) have become increasingly popular. Such wearable computers are mainly used for entertainment and health management, and research is also being conducted into operating devices using information measured by sensors provided in these wearable computers.
However, since the sensors in a wearable computer are assumed to be constantly active, that is, constantly measuring, there is a concern that, when they are used for device operation as described above, an operation unintended by the user may be performed automatically.
The present invention has been made to solve the above-described problems, and an object thereof is to provide a control system or the like that can avoid execution of an operation not intended by the user.
In order to achieve the above object, a control system according to the present invention is:
a control system comprising a control device that controls a plurality of devices, and a terminal device, wherein
the terminal device includes:
an image sensor;
specific image detection means for detecting, from an image captured by the image sensor, a specific image that matches a predetermined condition;
detection information transmitting means for transmitting image detection information based on the specific image to the control device; and
notification means for notifying a user of the content of operation information when the operation information transmitted from the control device is received, and
the control device includes:
operation information transmitting means for, when the image detection information transmitted from the terminal device is received, selecting a control candidate device from the plurality of devices based on the image detection information, generating operation information related to the control candidate device, and transmitting the operation information to the terminal device; and
device control means for determining whether or not control of the control candidate device is permitted by the user and, if permitted, controlling the control candidate device based on the operation information.
In the present invention, the terminal device transmits image detection information based on the detected specific image to the control device. The control device selects a control candidate device based on the image detection information, generates operation information related to the control candidate device, and transmits the operation information to the terminal device. The terminal device notifies the user of the content of the operation information, and the control device controls the control candidate device when the control is permitted by the user. Therefore, execution of an operation unintended by the user on the device can be avoided.
FIG. 1 is a diagram showing the configuration of the control system according to Embodiment 1 of the present invention.
FIG. 2 is a block diagram showing the hardware configuration of the control device.
FIG. 3 is a diagram for explaining the tables and other data stored in the secondary storage device of the control device.
FIG. 4 is a block diagram showing the hardware configuration of the terminal device according to Embodiment 1.
FIG. 5 is a diagram showing the functional configuration of the terminal device according to Embodiment 1.
FIG. 6 is a diagram showing the functional configuration of the control device.
FIG. 7 is a flowchart showing the procedure of the device control processing of Embodiment 1.
FIG. 8 is a diagram showing a state in which a specific image is within the user's field of view.
FIG. 9 is a diagram showing an example of projection of the operation information video in Embodiment 1.
FIG. 10 is a diagram for explaining Modification 1 of Embodiment 1.
FIG. 11 is a diagram (part 1) for explaining Modification 2 of Embodiment 1.
FIG. 12 is a diagram (part 2) for explaining Modification 2 of Embodiment 1.
FIG. 13 is a diagram (part 1) for explaining Modification 3 of Embodiment 1.
FIG. 14 is a diagram (part 2) for explaining Modification 3 of Embodiment 1.
FIG. 15 is a diagram for explaining Modification 4 of Embodiment 1.
FIG. 16 is a block diagram showing the hardware configuration of the terminal device according to Embodiment 2 of the present invention.
FIG. 17 is a diagram for explaining a control permission instruction in Embodiment 2.
FIG. 18 is a diagram for explaining Modification 1 of Embodiment 2.
FIG. 19 is a diagram for explaining Modification 2 of Embodiment 2.
FIG. 20 is a diagram for explaining another embodiment.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
(Embodiment 1)
FIG. 1 is a diagram illustrating the configuration of a control system 1 according to Embodiment 1 of the present invention. The control system 1 is a so-called HEMS (Home Energy Management System) that manages electric power used in an ordinary home.
 図1に示すように、制御システム1は、制御装置2と、端末装置3と、複数の機器4(機器4-1,4-2,…)と、を備える。制御装置2は、家屋H内の適切な場所に設置され、この家庭(需要地)において消費される電力の監視を行い、電力の消費状況を表示する。また、制御装置2は、各機器4の制御や動作状態の監視などを行う。 As shown in FIG. 1, the control system 1 includes a control device 2, a terminal device 3, and a plurality of devices 4 (devices 4-1, 4-2,...). The control device 2 is installed at an appropriate location in the house H, monitors the power consumed in this home (demand area), and displays the power consumption status. In addition, the control device 2 performs control of each device 4 and monitoring of an operation state.
 機器4(機器4-1,4-2,…)は、例えば、エアコン、照明器、テレビ、給湯機、IH(Induction Heating)調理器等の電気機器である。各機器4は、家屋H(敷地も含む)内に設置され、商用電源、発電設備や蓄電設備等(何れも図示せず)からの電力を供給する電力線に接続される。また、各機器4は、図示しない無線ネットワークを介して、制御装置2と通信可能に接続する。この無線ネットワークは、例えば、エコーネットライト(ECHONET Lite)に準じたネットワークである。なお、機器4の仕様によっては、外付けの通信アダプタ(図示せず)を介して、この無線ネットワークに接続されるようにしてもよい。 The device 4 (devices 4-1, 4-2,...) Is, for example, an electric device such as an air conditioner, an illuminator, a television, a water heater, or an IH (Induction Heating) cooker. Each device 4 is installed in a house H (including a site) and connected to a power line that supplies power from a commercial power source, a power generation facility, a power storage facility, or the like (all not shown). Each device 4 is communicably connected to the control device 2 via a wireless network (not shown). This wireless network is, for example, a network conforming to ECHONET Lite. Depending on the specifications of the device 4, it may be connected to this wireless network via an external communication adapter (not shown).
 各機器4は、制御装置2からの要求に応答して、現在の運転状態を示す情報を格納したデータ(運転状態データ)を制御装置2に送信する。 In response to a request from the control device 2, each device 4 transmits data (operation state data) storing information indicating the current operation state to the control device 2.
As shown in FIG. 2, the control device 2 includes a CPU (Central Processing Unit) 20, a communication interface 21, a ROM (Read Only Memory) 22, a RAM (Random Access Memory) 23, and a secondary storage device 24. These components are connected to one another via a bus 25. The CPU 20 performs overall control of the control device 2. The functions realized by the CPU 20 are described in detail later.
The communication interface 21 includes a network card for wireless communication with each device 4 via the above-described wireless network and an IC chip for wireless communication with the terminal device 3.
The ROM 22 stores a plurality of pieces of firmware and data used when the firmware is executed. The RAM 23 is used as a work area for the CPU 20.
The secondary storage device 24 is composed of a readable and writable nonvolatile semiconductor memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory) or a flash memory, an HDD (Hard Disk Drive), or the like. As shown in FIG. 3, the secondary storage device 24 stores a remote operation program 240 and device information 241. In addition, the secondary storage device 24 stores various other programs, including a program for monitoring the operating state of the devices 4 and a program for monitoring the power consumed in the home, as well as data used when these programs are executed.
The remote operation program 240 is a computer program executed by the CPU 20. The remote operation program 240 describes the processing for operating the devices 4 via the terminal device 3.
The device information 241 is information about the devices 4 and includes a connection information table 2410, a location information table 2411, an operation information table 2412, and an operation code table 2413. The connection information table 2410 is a data table that stores the connection information of the devices 4. The connection information is information for communicating with a device 4, for example, the address of the device 4.
The location information table 2411 is a data table that stores the location information of the devices 4. The location information is information indicating the place where a device 4 is installed; more specifically, it is information identifying a room in the house H. The information identifying a room is, for example, an ID (identification) assigned in advance to each room or the name of each room ("living room", "kitchen", and so on).
The operation information table 2412 is a data table that stores the current operating state of each device 4. The operating state includes, for example, the power state (power on/off), whether the device is running (running/stopped), whether an abnormality is present (abnormal/normal), the operation mode, and setting information. The operation mode indicates the operating method; for an air conditioner, for example, cooling, heating, fan, and dehumidification correspond to operation modes, and for a lighting fixture, normal lighting, power-saving lighting, and the like correspond to operation modes. For an air conditioner, the setting information includes the set temperature (also called the target temperature), the air volume, and the like, and for a lighting fixture, the brightness level and the like. The CPU 20 updates the operation information table 2412 based on the operating state data acquired from each device 4.
The operation code table 2413 is a data table that associates operation codes with information identifying the devices 4 (for example, device IDs or addresses) and with the operation contents of the devices 4. An operation code is, for example, a four-character string combining letters and digits; for example, "L001" indicates the power on/off operation of the lighting fixture installed in the living room, "L002" indicates the power on/off operation of the air conditioner installed in the living room, and "K013" indicates the on/off operation of the power-saving function of the refrigerator installed in the kitchen.
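By way of illustration only, the following Python sketch shows one possible in-memory layout for the operation code table 2413 and the operation information table 2412 described above. The device identifiers, state fields, and all values other than the codes "L001", "L002", and "K013" quoted above are assumptions made for this example and are not defined by this disclosure.

# Minimal sketch of two of the tables held in the device information 241 (assumed layout).

# Operation code table 2413: operation code -> (device identifier, operation content).
OPERATION_CODE_TABLE = {
    "L001": {"device_id": "living_light",   "operation": "power_on_off"},
    "L002": {"device_id": "living_aircon",  "operation": "power_on_off"},
    "K013": {"device_id": "kitchen_fridge", "operation": "power_saving_on_off"},
}

# Operation information table 2412: device identifier -> current operating state.
OPERATION_INFO_TABLE = {
    "living_light":   {"power": "off", "abnormal": False},
    "living_aircon":  {"power": "on",  "abnormal": False, "mode": "cooling", "set_temp": 26},
    "kitchen_fridge": {"power": "on",  "abnormal": False, "power_saving": "off"},
}

def lookup_operation_code(code):
    """Return (device identifier, operation content) for an operation code, or None."""
    entry = OPERATION_CODE_TABLE.get(code)
    if entry is None:
        return None  # unknown code: no control candidate device
    return entry["device_id"], entry["operation"]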
Returning to FIG. 1, the terminal device 3 is a so-called wearable computer (also called a wearable device or wearable terminal); in this embodiment, it is a glasses-type head-mounted display (also called smart glasses) worn on the user's head.
As shown in FIG. 4, the terminal device 3 includes a CPU 30, a communication interface 31, an image sensor 32, a video projection unit 33, a ROM 34, a RAM 35, and a secondary storage device 36. These components are connected to one another via a bus 37.
The CPU 30 performs overall control of the terminal device 3. The functions realized by the CPU 30 are described in detail later. The communication interface 31 includes an IC chip for wireless communication with the control device 2.
The image sensor 32 is an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and is provided on the outside of the frame of the terminal device 3 so that it can capture subjects within the field of view of the user wearing the terminal device 3. In other words, the imaging range (field of view) of the image sensor 32 is substantially the same as the field of view of the user wearing the terminal device 3.
Under the control of the CPU 30, the video projection unit 33 projects video onto at least one of the right-eye and left-eye lenses of the terminal device 3. These lenses of the terminal device 3 employ half mirrors formed by dielectric multilayer films and are structured so that light from the front passes through while video from the video projection unit 33 is reflected. This allows the user to see the projected video while still seeing the outside world with the terminal device 3 on. In other words, the user wearing the terminal device 3 can experience so-called augmented reality (AR).
The ROM 34 stores a plurality of pieces of firmware and data used when the firmware is executed. The RAM 35 is used as a work area for the CPU 30.
The secondary storage device 36 is composed of a readable and writable nonvolatile semiconductor memory such as an EEPROM or a flash memory. The secondary storage device 36 stores various programs, including a remote operation program describing the processing for remotely operating the devices 4 via the control device 2, and data used when these programs are executed.
Functionally, as shown in FIG. 5, the terminal device 3 includes a specific image detection unit 300, a detection information transmission unit 301, and a notification unit 302. These functional units are realized by the CPU 30 executing the remote operation program stored in the secondary storage device 36.
The specific image detection unit 300 detects, at fixed time intervals (for example, every 0.5 seconds), a specific image that matches a predetermined condition from the images captured by the image sensor 32. In this embodiment, the specific image is a two-dimensional code, more specifically a matrix-type two-dimensional code. The specific image may instead be a stacked two-dimensional code or a one-dimensional code. Alternatively, the specific image is not limited to codes based on well-known standards and may be one defined for use in the control system 1.
Each time the specific image detection unit 300 detects a specific image, the detection information transmission unit 301 decodes the detected specific image and transmits information storing the decoded result (in this embodiment, the operation code described above) to the control device 2 as image detection information.
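A minimal sketch of this terminal-side cycle is given below. The capture, decoding, and transmission helpers (image_sensor.capture, decode_matrix_code, send_to_control_device) are hypothetical placeholders introduced for illustration, not interfaces defined by this disclosure.

import time

DETECTION_INTERVAL_S = 0.5  # fixed sampling interval given as an example above

def detection_loop(image_sensor, decode_matrix_code, send_to_control_device):
    """Periodically capture a frame, detect and decode a specific image (2D code),
    and send the decoded operation code to the control device as image detection info."""
    while True:
        frame = image_sensor.capture()              # assumed capture API
        operation_code = decode_matrix_code(frame)  # assumed decoder; returns None if no code found
        if operation_code is not None:
            image_detection_info = {"operation_code": operation_code}
            send_to_control_device(image_detection_info)
        time.sleep(DETECTION_INTERVAL_S)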
When the notification unit 302 receives operation information (described later) transmitted from the control device 2, it notifies the user of the content of the operation information. More specifically, the notification unit 302 projects a video indicating the content of the operation information at a predetermined position on a lens of the terminal device 3.
Next, the functional configuration of the control device 2 is described. Functionally, as shown in FIG. 6, the control device 2 includes an operation information transmission unit 200 and a device control unit 201. These functional units are realized by the CPU 20 executing the remote operation program 240 stored in the secondary storage device 24.
The operation information transmission unit 200 includes a candidate determination unit 2000 and an operation information generation unit 2001. When the candidate determination unit 2000 receives the above-described image detection information from the terminal device 3, it extracts the operation code from the received image detection information. The candidate determination unit 2000 then refers to the operation code table 2413 using the extracted operation code to determine the device 4 that is the candidate to be controlled (the control candidate device) and the operation content for that device 4.
For example, when the operation code is "L001", the candidate determination unit 2000 determines the lighting fixture installed in the living room (hereinafter referred to as the living-room lighting fixture) as the control candidate device and determines turning the power on/off as the operation content for the living-room lighting fixture.
The operation information generation unit 2001 refers to the operation information table 2412 and acquires the current operating state corresponding to the determined operation content of the determined control candidate device. The operation information generation unit 2001 then generates the operation information to be communicated to the terminal device 3, based on the determined control candidate device and its operation content and on the acquired current operating state of that control candidate device.
For example, when the living-room lighting fixture is the control candidate device, the operation content is turning the power on/off, and the living-room lighting fixture is currently off, information indicating that the control turns the living-room lighting fixture from off to on is stored in the operation information. The operation information generation unit 2001 transmits the operation information generated in this way to the terminal device 3.
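Putting the candidate determination and the state lookup together, the following sketch shows one possible way the control device could assemble operation information from received image detection information. It reuses the illustrative tables from the earlier sketch; the off-to-on toggle wording and the returned fields are examples only, not a definitive format.

def build_operation_information(image_detection_info):
    """Select the control candidate device and generate operation information
    describing the control that would be applied if the user permits it."""
    code = image_detection_info.get("operation_code")
    candidate = lookup_operation_code(code)
    if candidate is None:
        return None  # no control candidate device for this code
    device_id, operation = candidate
    state = OPERATION_INFO_TABLE.get(device_id, {})
    if operation == "power_on_off":
        # Example: if the device is currently off, the proposed control turns it on.
        proposed = "on" if state.get("power") == "off" else "off"
        description = f"Turn {device_id} {proposed}"
    else:
        description = f"Toggle {operation} of {device_id}"
    return {"operation_code": code, "device_id": device_id,
            "operation": operation, "description": description}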
The device control unit 201 includes a permission determination unit 2010 and a control command transmission unit 2011. After the operation information is transmitted, the permission determination unit 2010 determines, based on the image detection information sent from the terminal device 3, whether the user has permitted the control of the control candidate device.
More specifically, the permission determination unit 2010 determines whether image detection information with the same content (that is, the same operation code) as the image detection information corresponding to the operation information previously transmitted to the terminal device 3 has been received continuously from the terminal device 3 for a predetermined period (a first period). If it has been received continuously, the permission determination unit 2010 determines that the permission condition is satisfied and that the control of the control candidate device has been permitted. The first period is, for example, 10 seconds.
If image detection information with the same content as the image detection information corresponding to the operation information previously transmitted to the terminal device 3 is not received even once during a predetermined period (a second period), the permission determination unit 2010 determines that the non-permission condition is satisfied and that the control of the control candidate device has not been permitted. The second period is, for example, 10 seconds.
The permission determination unit 2010 performs this determination during a predetermined period (a permission waiting period) starting when the operation information generation unit 2001 transmits the operation information to the terminal device 3, that is, starting when the content of the operation information is notified to the user at the terminal device 3. The permission waiting period is, for example, 30 seconds. If neither the permission condition nor the non-permission condition is satisfied during the permission waiting period, the permission determination unit 2010 determines that the control of the control candidate device has not been permitted.
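The timing rules in the last three paragraphs can be read as a small decision procedure. The sketch below is one such reading, assumed to be called each time image detection information arrives (or periodically) with the timestamps of matching receipts; the gap tolerance used to approximate "continuously" is an assumption, not a value given by this disclosure.

def judge_permission(receipt_times, start_time, now,
                     first_period=10.0, second_period=10.0, wait_period=30.0):
    """Return 'permitted', 'denied', or 'pending'.
    receipt_times: sorted timestamps at which image detection information with the
    same operation code as the transmitted operation information was received."""
    # Permission condition: matching detections kept arriving continuously for first_period.
    # "Continuously" is approximated as: no gap larger than MAX_GAP between detections.
    MAX_GAP = 1.0  # assumed tolerance between consecutive detections (seconds)
    run_start = None
    prev = None
    for t in receipt_times:
        if prev is None or t - prev > MAX_GAP:
            run_start = t  # a new continuous run begins here
        if t - run_start >= first_period:
            return "permitted"
        prev = t
    # Non-permission condition: nothing matching received for second_period.
    last = receipt_times[-1] if receipt_times else start_time
    if now - last >= second_period:
        return "denied"
    # Timeout: permission waiting period elapsed with neither condition satisfied.
    if now - start_time >= wait_period:
        return "denied"
    return "pending"

In the flow of FIG. 7, a return value of "permitted" would correspond to the YES branch of step S106.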
When the above permission condition is satisfied, the control command transmission unit 2011 generates a control command for controlling the control candidate device and transmits it to the control candidate device. More specifically, the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content indicated by the above operation information.
FIG. 7 is a flowchart showing the procedure of the device control processing for remote operation executed by the control system 1. When a specific image (that is, a two-dimensional code) printed on an object such as paper enters the field of view of the user wearing the terminal device 3 (the user's field of view through the lenses) (see FIG. 8), the specific image detection unit 300 of the terminal device 3 detects the specific image (step S101).
The detection information transmission unit 301 of the terminal device 3 decodes the specific image detected by the specific image detection unit 300 and transmits image detection information storing the decoded result (that is, the operation code) to the control device 2 (step S102).
When the operation information transmission unit 200 (the candidate determination unit 2000 and the operation information generation unit 2001) of the control device 2 receives the image detection information from the terminal device 3, it determines the control candidate device and generates the operation information based on the operation code stored in the received image detection information (step S103). The operation information transmission unit 200 then transmits the generated operation information to the terminal device 3 (step S104).
When the notification unit 302 of the terminal device 3 receives the operation information from the control device 2, it notifies the user of the content of the operation information (step S105). More specifically, the notification unit 302 projects a video indicating the content of the operation information (an operation information video) onto a lens of the terminal device 3. FIG. 9 shows an example in which the operation information video is projected.
When the user permits the control of the control candidate device (here, when permitting the control that turns the living-room lighting fixture from off to on), the user takes care that the two-dimensional code remains within the user's field of view for at least the first period (for example, 10 seconds) within the above-described permission waiting period (for example, within 30 seconds). Specifically, the user avoids moving the head, or the object on which the two-dimensional code is printed, so much that the two-dimensional code leaves the user's field of view.
On the other hand, when not permitting the control of the control candidate device, the user only has to move the head, or the object on which the two-dimensional code is printed, so that the two-dimensional code stays out of the user's field of view for the second period (for example, 10 seconds).
After transmitting the operation information to the terminal device 3 (that is, after the content of the operation information is notified at the terminal device 3), the permission determination unit 2010 of the control device 2 monitors the image detection information sent from the terminal device 3 and determines whether the control of the control candidate device has been permitted (step S106).
When it is determined that the control has been permitted (step S106; YES), the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content of the operation information generated by the operation information generation unit 2001 and transmits it to the control candidate device (step S107).
As described above, in the control system 1 according to Embodiment 1 of the present invention, a device 4 can be controlled using a specific image, that is, a two-dimensional code, so the user's operability is improved and convenience is excellent. In addition, even when the terminal device 3 detects a specific image, the control device 2 does not immediately execute the control of the corresponding device 4; instead, it generates operation information corresponding to the specific image and transmits it to the terminal device 3. The terminal device 3 then notifies the user of the content of the operation information from the control device 2. The control device 2 executes the control of the device 4 (the control candidate device) only when the user permits the execution of the control. Therefore, a situation in which an operation unintended by the user is executed on the device 4 can be avoided, and an effect of preventing erroneous operation of the device 4 is obtained.
Furthermore, since the user can instruct the control device 2 to permit or not permit the execution of the control with a simple action and without performing any special operation, a remote operation service for the devices 4 using such a terminal device 3 can be provided to a wide range of users.
Embodiment 1 may be modified as follows.
(Modification 1)
In addition to the content of the operation information, the notification unit 302 of the terminal device 3 may notify the user of how to instruct permission and non-permission, as shown in FIG. 10.
(Modification 2)
In Embodiment 1, when the operation information video is projected, the user instructs permission of the control by keeping the specific image (the two-dimensional code) within the user's field of view for the first period (for example, 10 seconds). However, the instruction method is not limited to this.
For example, the user may be able to instruct permission of the control by moving the current position of the specific image within the field of view to a predetermined area within the field of view. In this case, when the notification unit 302 of the terminal device 3 receives the operation information from the control device 2, it projects the operation information video onto a lens of the terminal device 3 and also projects a destination frame 1101 as shown in FIG. 11 onto the lens.
When permitting the execution of the control, the user moves the position of the specific image within the user's field of view (that is, within the imaging range of the image sensor 32) to the area enclosed by the destination frame 1101, for example by moving the head or by moving the object on which the specific image is printed (see FIG. 12).
In this modification, during a predetermined period after the operation information video is projected (the same period as the permission waiting period), the detection information transmission unit 301 of the terminal device 3 stores, in the image detection information transmitted to the control device 2, position information of the specific image in addition to the result of decoding the specific image detected by the specific image detection unit 300. The position information of the specific image is information indicating the position of the specific image in the image captured by the image sensor 32 (the captured image).
The permission determination unit 2010 of the control device 2 in this modification determines whether the user has permitted the control based on the position information of the specific image included in the received image detection information. That is, when the specific image is located within the area of the captured image corresponding to the projection position of the destination frame 1101, the permission determination unit 2010 determines that the permission condition is satisfied and that the control of the control candidate device has been permitted. Data indicating the correspondence between areas of the captured image and the projection position of the destination frame 1101 is assumed to be stored in advance in the secondary storage device 24 of the control device 2.
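As a non-limiting sketch of this position check, the destination frame 1101 is assumed below to map to an axis-aligned rectangle in captured-image coordinates; the pixel values are made up for illustration.

# Assumed mapping of the projected destination frame 1101 to captured-image coordinates,
# stored in advance on the control device side (pixel values are illustrative only).
DESTINATION_REGION = {"x_min": 400, "y_min": 120, "x_max": 560, "y_max": 260}

def code_in_destination_region(position, region=DESTINATION_REGION):
    """position: (x, y) of the specific image in the captured image, taken from the
    position information stored in the image detection information."""
    x, y = position
    return (region["x_min"] <= x <= region["x_max"]
            and region["y_min"] <= y <= region["y_max"])

The permission determination unit would treat the permission condition as satisfied once this predicate holds for received position information; whether a dwell time is also required is left open by the description above.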
(Modification 3)
Embodiment 1 described the control when one specific image enters the user's field of view (the imaging range of the image sensor 32), that is, when there is one control candidate device. However, as shown in FIG. 13, there may also be cases where a plurality of specific images enter the user's field of view. In such a case, when the notification unit 302 of the terminal device 3 in this modification receives the operation information transmitted from the control device 2, it projects each operation information video close to the corresponding specific image (see FIG. 14).
In this way, even when a plurality of specific images are detected, the user can grasp the correspondence between each specific image and the content of the operation information.
In this modification, as in Modification 2 described above, the user can instruct the control device 2 to permit the control of the control candidate device corresponding to a specific image by moving the position of that specific image within the user's field of view to the area enclosed by the destination frame 1101 projected onto the lens. In this modification, the permission waiting period need not be fixed and may be changed according to the number of control candidate devices. For example, the permission waiting period may be 30 seconds when there is one control candidate device and 60 seconds when there are three control candidate devices.
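The figures above (30 seconds for one candidate, 60 seconds for three) are only examples and admit several scalings; one possible linear reading, stated purely as an assumption, is sketched below.

def permission_wait_period(num_candidates, base_s=30.0, per_extra_candidate_s=15.0):
    """Permission waiting period that grows with the number of control candidate devices.
    With the assumed values: 1 candidate -> 30 s, 3 candidates -> 60 s."""
    return base_s + per_extra_candidate_s * max(0, num_candidates - 1)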
(Modification 4)
In the control device 2, after the candidate determination unit 2000 determines the control candidate device and its operation content and before the operation information generation unit 2001 transmits the operation information to the terminal device 3, whether the control candidate device may be controlled may be judged. Specifically, the operation information transmission unit 200 of the control device 2 further includes a control availability determination unit (not shown) that determines whether control of the control candidate device is possible.
For example, the control availability determination unit refers to the operation information table 2412 and determines whether an abnormality has been detected in the control candidate device. When an abnormality has been detected, the control availability determination unit determines that the control candidate device cannot be controlled. Also, for example, when a predetermined operation prohibition period has not yet elapsed since the most recent operation performed on the control candidate device, the control availability determination unit determines that the control candidate device cannot be controlled. The operation prohibition period may be common to all devices 4, or may be set for each type of device 4 or for each individual device.
Also, for example, when the control candidate device is in the middle of a scheduled operation or immediately before the start of a scheduled operation (for example, within 10 minutes before the start time), the control availability determination unit determines that the control candidate device cannot be controlled.
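The three refusal conditions just described (a detected abnormality, an unexpired operation prohibition period, and a scheduled run that is in progress or about to start) could be combined as in the following sketch. The field names and default time values are assumptions for illustration; only the 10-minute pre-start margin follows the example above.

import time

def control_allowed(device_state, now=None,
                    prohibition_period_s=60.0, pre_schedule_margin_s=600.0):
    """Return False if the control candidate device should not be operated
    via the terminal device, True otherwise."""
    now = time.time() if now is None else now
    # 1. An abnormality has been detected in the device.
    if device_state.get("abnormal", False):
        return False
    # 2. The operation prohibition period since the most recent operation has not elapsed.
    last_op = device_state.get("last_operated_at")
    if last_op is not None and now - last_op < prohibition_period_s:
        return False
    # 3. A scheduled run is in progress, or starts within the margin (e.g. 10 minutes).
    sched_start = device_state.get("scheduled_start")
    sched_end = device_state.get("scheduled_end")
    if sched_start is not None and sched_end is not None:
        if sched_start <= now <= sched_end or 0 <= sched_start - now <= pre_schedule_margin_s:
            return False
    return True

In this modification, such a check would run after the candidate determination and before the operation information is transmitted to the terminal device 3.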
When the control availability determination unit determines that control is not possible, the operation information generation unit 2001 generates operation information indicating the current operating state of the control candidate device and that control is not possible, and transmits it to the terminal device 3. FIG. 15 shows an example of the content of the operation information notified by the terminal device 3 in this case.
Even when the permission determination unit 2010 receives from the terminal device 3 image detection information corresponding to a control candidate device determined by the control availability determination unit to be uncontrollable, it discards the image detection information without performing the processing for determining permission or non-permission of the control.
Determining whether a control candidate device may be controlled in this way makes it possible to exclude from the operation targets those devices 4 that, from the viewpoint of safety, energy saving, comfort, and the like, should preferably not be operated via the terminal device 3.
(Embodiment 2)
Next, Embodiment 2 of the present invention is described. In the following description, components and the like common to Embodiment 1 are given the same reference numerals and their description is omitted.
FIG. 16 is a block diagram showing the hardware configuration of the terminal device 3A of Embodiment 2. In addition to the configuration of the terminal device 3 of Embodiment 1, the terminal device 3A includes a line-of-sight detection unit 38. The line-of-sight detection unit 38 includes an electrooculography sensor and detects the line of sight of the user wearing the terminal device 3A. Alternatively, the line-of-sight detection unit 38 may include an image sensor that captures images of the user's eyes and may detect the user's line of sight by analyzing the captured eye images.
When the notification unit 302 of the terminal device 3A receives the operation information from the control device 2, it projects the operation information video onto a lens of the terminal device 3A and also projects a mark 1701 as shown in FIG. 17 onto the lens. When permitting the execution of the control, the user gazes at the mark 1701 for a predetermined period (a third period). The third period is shorter than the first and second periods described above, for example, 3 seconds.
During a predetermined period after the operation information video is projected (the same period as the permission waiting period), the detection information transmission unit 301 of the terminal device 3A stores, in the image detection information transmitted to the control device 2, line-of-sight information in addition to the result of decoding the specific image detected by the specific image detection unit 300. The line-of-sight information is information indicating the detection values of the line-of-sight detection unit 38.
The permission determination unit 2010 of the control device 2 of this embodiment determines whether the user has permitted the control by monitoring the line-of-sight information included in the received image detection information. More specifically, the permission determination unit 2010 determines from the line-of-sight information whether the user is looking at the mark 1701, and when the state in which the user is looking at the mark 1701 continues for a predetermined period (the same period as the third period), it determines that the permission condition is satisfied and that the control of the control candidate device has been permitted. Data indicating the correspondence between the detection values of the line-of-sight detection unit 38 and the projection position of the mark 1701 is assumed to be stored in advance in the secondary storage device 24 of the control device 2.
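A non-limiting sketch of this gaze check follows. The mapping from line-of-sight detection values to a point in the same coordinate system as the projected mark 1701 is assumed to come from the stored correspondence data, and the dwell timer is one possible way to require the third period of continuous gazing.

def gaze_on_mark(gaze_point, mark_region):
    """gaze_point: (x, y) derived from the line-of-sight information;
    mark_region: rectangle of mark 1701 in the same coordinate system (assumed mapping)."""
    x, y = gaze_point
    return (mark_region["x_min"] <= x <= mark_region["x_max"]
            and mark_region["y_min"] <= y <= mark_region["y_max"])

class GazeDwellChecker:
    """Tracks how long the user has continuously gazed at the mark (third period)."""
    def __init__(self, third_period_s=3.0):
        self.third_period_s = third_period_s
        self.dwell_start = None

    def update(self, on_mark, now):
        """Call on each received sample; returns True once the dwell requirement is met."""
        if not on_mark:
            self.dwell_start = None  # gaze left the mark: reset the timer
            return False
        if self.dwell_start is None:
            self.dwell_start = now
        return now - self.dwell_start >= self.third_period_s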
On the other hand, if the above permission condition is not satisfied during the permission waiting period, the permission determination unit 2010 determines that the control of the control candidate device has not been permitted.
When the above permission condition is satisfied, the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content of the operation information generated by the operation information generation unit 2001 and transmits it to the control candidate device.
As described above, in the control system 1 according to Embodiment 2 of the present invention, a device 4 can be controlled using a specific image, that is, a two-dimensional code, so the user's operability is improved and convenience is excellent. In addition, even when the terminal device 3A detects a specific image, the control device 2 does not immediately execute the control of the corresponding device 4; instead, it generates operation information corresponding to the specific image and transmits it to the terminal device 3A. The terminal device 3A then notifies the user of the content of the operation information from the control device 2. The control device 2 executes the control of the device 4 (the control candidate device) only when the user permits the execution of the control. A situation in which an operation unintended by the user is executed on the device 4 can thus be avoided, and an effect of preventing erroneous operation of the device 4 is obtained.
Furthermore, the user can instruct the control device 2 to permit or not permit the execution of the control simply by changing the line of sight. Therefore, a remote operation service for the devices 4 using such a terminal device can be provided to an even wider range of users.
Embodiment 2 may be modified as follows.
(Modification 1)
In Embodiment 2, when the control content is notified, the user instructs permission of the control by gazing for the third period at the mark 1701 projected onto the lens of the terminal device 3A. However, the instruction method is not limited to this.
For example, instead of the mark 1701, the notification unit 302 of the terminal device 3A may project onto the lens of the terminal device 3A marks 1801A and 1801B, as shown in FIG. 18, which indicate where to direct the line of sight to permit the control and to not permit the control, respectively.
When permitting the control, the user gazes at the mark 1801A for the third period; when not permitting the control, the user gazes at the mark 1801B for the third period.
In this modification, the permission determination unit 2010 of the control device 2 determines from the line-of-sight information included in the received image detection information whether the user is looking at the mark 1801A. When the state in which the user is looking at the mark 1801A continues for the third period, the permission determination unit 2010 determines that the permission condition is satisfied and that the control of the control candidate device has been permitted.
The permission determination unit 2010 also determines from the line-of-sight information whether the user is looking at the mark 1801B. When the state in which the user is looking at the mark 1801B continues for the third period, the permission determination unit 2010 determines that the non-permission condition is satisfied and that the control of the control candidate device has not been permitted.
Data indicating the correspondence between the detection values of the line-of-sight detection unit 38 and the projection positions of the marks 1801A and 1801B is assumed to be stored in advance in the secondary storage device 24 of the control device 2.
If neither the permission condition nor the non-permission condition is satisfied during the permission waiting period, the permission determination unit 2010 determines that the control of the control candidate device has not been permitted.
(Modification 2)
Embodiment 2 described the control when one specific image enters the user's field of view (the imaging range of the image sensor 32), that is, when there is one control candidate device. However, as shown in FIG. 13, there may also be cases where a plurality of specific images enter the user's field of view. In such a case, as in Modification 3 of Embodiment 1, when the notification unit 302 of the terminal device 3A in this modification receives the operation information from the control device 2, it projects each operation information video close to the corresponding specific image (see FIG. 19).
In this modification, when permitting the control, the user gazes for a fourth period at the operation information video corresponding to that control candidate device. The fourth period is shorter than the first and second periods described above and longer than the third period, for example, 5 seconds. In addition, the notification unit 302 makes the projection mode of the operation information video at which the user's line of sight is directed different from the others. For example, the example in FIG. 19 shows that the user's line of sight is directed at the operation information video corresponding to the kitchen lighting fixture, which is a control candidate device. In this way, the user can easily and accurately recognize where his or her line of sight is directed.
As described above, during a predetermined period after the operation information videos are projected (the same period as the permission waiting period), the detection information transmission unit 301 of the terminal device 3A stores, in the image detection information transmitted to the control device 2, line-of-sight information in addition to the result of decoding the specific image detected by the specific image detection unit 300. In this modification, however, the line-of-sight information is information indicating whether the user's line of sight is directed at the operation information video corresponding to that specific image.
The permission determination unit 2010 of the control device 2 in this modification monitors the line-of-sight information included in the received image detection information and determines whether the user is looking at any one of the operation information videos. When the state of looking at any one operation information video continues for a predetermined period (the fourth period), the permission determination unit 2010 determines that the permission condition is satisfied and that the control of the control candidate device corresponding to that image detection information has been permitted.
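For this multi-candidate case, the same dwell idea can be applied per operation information video. The sketch below assumes, purely for illustration, that each received piece of image detection information carries one boolean flag per candidate indicating whether the gaze was on its video at that sample; it returns the candidate whose video has been watched continuously for the fourth period, if any.

def select_permitted_candidate(gaze_flags_history, fourth_period_s=5.0, sample_interval_s=0.5):
    """gaze_flags_history: list of dicts, one per received image detection information,
    mapping candidate device id -> True if the user's gaze was on its operation
    information video at that sample. Returns the permitted device id, or None."""
    needed_samples = int(fourth_period_s / sample_interval_s)
    streaks = {}
    for flags in gaze_flags_history:
        for device_id, on_video in flags.items():
            streaks[device_id] = streaks.get(device_id, 0) + 1 if on_video else 0
            if streaks[device_id] >= needed_samples:
                return device_id
    return None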
As in Modification 3 of Embodiment 1, the permission waiting period need not be fixed and may be changed according to the number of control candidate devices. For example, the permission waiting period may be 30 seconds when there is one control candidate device and 60 seconds when there are three control candidate devices.
(Modification 3)
As in Modification 4 of Embodiment 1, in the control device 2 of Embodiment 2, the operation information transmission unit 200 may further include a control availability determination unit (not shown) that determines whether control of the control candidate device is possible. In this case, when the control availability determination unit determines that control is not possible, the operation information generation unit 2001 generates operation information indicating the current operating state of the control candidate device and that control is not possible, and transmits it to the terminal device 3A.
Even when the permission determination unit 2010 receives from the terminal device 3A image detection information corresponding to a control candidate device determined by the control availability determination unit to be uncontrollable, it discards the image detection information without performing the processing for determining permission or non-permission of the control.
The present invention is not limited to the above embodiments, and various modifications are of course possible without departing from the gist of the present invention.
For example, in each of the above embodiments the terminal devices 3 and 3A are glasses-type head-mounted displays (also called smart glasses), but a non-transmissive head-mounted display or a wearable computer other than a head-mounted display may also be adopted as the terminal device according to the present invention. Such a terminal device includes, instead of the video projection unit 33 of the terminal devices 3 and 3A, a display device such as a liquid crystal display that displays an image of the outside world (an outside-world image) captured by the image sensor 32. In this case, the operation information video is displayed on the display device superimposed on the outside-world image.
In addition, to let the user know that the content of operation information is being notified, the terminal devices 3 and 3A may emit sound or light, vibrate, or the like.
The terminal devices 3 and 3A may also notify the user that the control device 2 has accepted an instruction to permit or not permit the control. In this case, the control device 2 notifies the terminal device 3 or 3A that it has accepted the user's instruction to permit or not permit the control. Upon receiving such a notification, the terminal device 3 or 3A changes the projection mode (or display mode) of the corresponding operation information video according to whether the control was permitted or not permitted. FIG. 20 shows an example in which such notification is performed by the terminal device 3 of Modification 3 of Embodiment 1. The example in FIG. 20 shows that the control of the living-room lighting fixture has been permitted, the control of the living-room air conditioner has been denied, and the control of the kitchen lighting fixture is awaiting a permission or non-permission instruction.
The manner of notifying the content of the operation information is also not limited to video; it may be notified by, for example, a voice message.
In each of the above embodiments the control device 2 determines whether the user has permitted the control of the control candidate device, but this determination may instead be performed by the terminal device 3 or 3A. Specifically, the terminal devices 3 and 3A include a permission determination unit (not shown) that fulfills the same function as the permission determination unit 2010 of the control device 2, and this permission determination unit transmits information indicating the determination result (permission determination information) to the control device 2. The control device 2 then determines, based on this permission determination information, whether the user has permitted the control of the control candidate device.
The terminal devices 3 and 3A may also be separately provided with a push button, a switch, or the like for accepting from the user permission or non-permission of the control of the control candidate device. Alternatively, the terminal devices 3 and 3A may be provided with a microphone so that the user can instruct permission or non-permission of the control of the control candidate device by voice, or may be provided with an acceleration sensor so that the user can instruct permission or non-permission of the control of the control candidate device by tilting or shaking the terminal device 3 or 3A.
In the above embodiments, each functional unit of the control device 2 (see FIG. 6) is realized by the CPU 20 executing the remote operation program 240 stored in the secondary storage device 24, and each functional unit of the terminal devices 3 and 3A (see FIG. 5) is realized by the CPU 30 executing the remote operation program stored in the secondary storage device 36. However, all or part of the functional units of the control device 2 and the terminal devices 3 and 3A may be realized by dedicated hardware. Dedicated hardware is, for example, a single circuit, a composite circuit, a programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
In the above embodiments, the program executed by the control device 2 can also be stored and distributed on a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), a magneto-optical disc, a USB (Universal Serial Bus) memory, a memory card, or an HDD. By installing such a program on a specific or general-purpose computer, the computer can be made to function as the control device 2 of each of the above embodiments.
The above program may also be stored in a disk device or the like of a server on a network such as the Internet and downloaded from the server to a computer.
The present invention is capable of various embodiments and modifications without departing from its spirit and scope in the broad sense. The above embodiments are for explaining the present invention and do not limit its scope. That is, the scope of the present invention is indicated not by the embodiments but by the claims, and various modifications made within the scope of the claims and within the scope of the meaning of the invention equivalent thereto are regarded as being within the scope of the present invention.
The present invention can be suitably employed in a system or the like that manages the operation of electrical appliances installed in a home.
1 control system, 2 control device, 3 terminal device, 4-1, 4-2 device, 20, 30 CPU, 21, 31 communication interface, 22, 34 ROM, 23, 35 RAM, 24, 36 secondary storage device, 25, 37 bus, 32 image sensor, 33 video projection unit, 38 line-of-sight detection unit, 200 operation information transmission unit, 201 device control unit, 240 remote operation program, 241 device information, 300 specific image detection unit, 301 detection information transmission unit, 302 notification unit, 2000 candidate determination unit, 2001 operation information generation unit, 2010 permission determination unit, 2011 control command transmission unit, 2410 connection information table, 2411 location information table, 2412 operation information table, 2413 operation code table

Claims (8)

  1.  A control system comprising a control device that controls a plurality of devices, and a terminal device, wherein
     the terminal device comprises:
     an image sensor;
     specific image detection means for detecting, from an image captured by the image sensor, a specific image that matches a predetermined condition;
     detection information transmission means for transmitting image detection information based on the specific image to the control device; and
     notification means for, upon receiving operation information transmitted from the control device, notifying a user of the content of the operation information, and
     the control device comprises:
     operation information transmission means for, upon receiving the image detection information transmitted from the terminal device, selecting a control candidate device from the plurality of devices based on the image detection information, generating operation information relating to the control candidate device, and transmitting the operation information to the terminal device; and
     device control means for determining whether the user has permitted control of the control candidate device and, if permitted, controlling the control candidate device based on the operation information.
  2.  The control system according to claim 1, wherein the device control means determines that the user has permitted control of the control candidate device when, after the operation information is transmitted by the operation information transmission means, the image detection information corresponding to the control candidate device continues to be received from the terminal device for a predetermined period.
  3.  The control system according to claim 1, wherein, after the content of the operation information is notified by the notification means, the detection information transmission means stores, in the image detection information, position information relating to the position of the specific image, and
     the device control means determines, based on the position information, whether the user has permitted control of the control candidate device.
  4.  The control system according to claim 1, wherein the terminal device further comprises line-of-sight detection means for detecting a line of sight of the user,
     after the content of the operation information is notified by the notification means, the detection information transmission means stores, in the image detection information, line-of-sight information indicating a result detected by the line-of-sight detection means, and
     the device control means determines, based on the line-of-sight information, whether the user has permitted control of the control candidate device.
  5.  The control system according to any one of claims 1 to 4, wherein the operation information transmission means determines whether the control candidate device can be controlled and, upon determining that control is not possible, stores, in the operation information transmitted to the terminal device, information indicating that control of the control candidate device is not possible.
  6.  The control system according to any one of claims 1 to 5, wherein the notification means notifies the user that an instruction to permit or not permit control of the control candidate device has already been given.
  7.  The control system according to claim 1, wherein the terminal device further comprises permission determination means for determining, after the content of the operation information is notified by the notification means, whether the user has permitted control of the control candidate device, and transmitting permission determination information indicating the result of the determination to the control device, and
     the device control means determines, based on the permission determination information, whether the user has permitted control of the control candidate device.
  8.  A device control method executed by a control system comprising a control device that controls a plurality of devices, and a terminal device, the method comprising:
     the terminal device detecting, from an image captured by an image sensor, a specific image that matches a predetermined condition, and transmitting image detection information based on the specific image to the control device;
     the control device, upon receiving the image detection information transmitted from the terminal device, selecting a control candidate device from the plurality of devices based on the image detection information, generating operation information relating to the control candidate device, and transmitting the operation information to the terminal device;
     the terminal device, upon receiving the operation information transmitted from the control device, notifying a user of the content of the operation information; and
     the control device determining whether the user has permitted control of the control candidate device and, if permitted, controlling the control candidate device based on the operation information.
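 For orientation only, the following is a minimal sketch of the interaction claimed above (the flow of claims 1 and 8, with the continued-detection permission check of claim 2). The mapping table, the predetermined period, the command string, and all class and function names are assumptions introduced for illustration and are not taken from the specification.

```python
# Hypothetical sketch of the claimed flow: the terminal device reports a detected
# specific image, the control device proposes an operation, the user is notified,
# and control is executed only if permission is inferred (claim-2 style: the same
# image stays detected for a predetermined period). All names are illustrative.

import time

DEVICE_TABLE = {"img-aircon": "air_conditioner_1"}   # illustrative mapping
PERMISSION_PERIOD_S = 1.0                             # predetermined period (assumed)


class Terminal:
    def detect_specific_image(self, frame_id):
        # Stand-in for matching the captured image against a predetermined condition.
        return {"image_id": frame_id} if frame_id in DEVICE_TABLE else None

    def notify(self, operation_info):
        print(f"Notify user: {operation_info['command']} -> {operation_info['device_id']}")


class Controller:
    def make_operation_info(self, detection):
        # Select a control candidate device and build operation information for it.
        device_id = DEVICE_TABLE[detection["image_id"]]
        return {"device_id": device_id, "command": "power_on"}

    def control_if_permitted(self, operation_info, still_detected, started_at):
        # Permission is inferred when detection has continued for the whole period.
        permitted = still_detected and (time.time() - started_at) >= PERMISSION_PERIOD_S
        if permitted:
            print(f"Control: {operation_info['command']} sent to {operation_info['device_id']}")
        return permitted


if __name__ == "__main__":
    terminal, controller = Terminal(), Controller()
    detection = terminal.detect_specific_image("img-aircon")
    info = controller.make_operation_info(detection)
    terminal.notify(info)
    started = time.time()
    time.sleep(PERMISSION_PERIOD_S)    # the user keeps the device in view
    controller.control_if_permitted(info, still_detected=True, started_at=started)
```

 In the claimed system this exchange runs over a network between the terminal device and the control device; the sketch collapses it into a single process purely to show the ordering of the steps.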
PCT/JP2016/070566 2016-07-12 2016-07-12 Control system and apparatus control method WO2018011890A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018527290A JP6641482B2 (en) 2016-07-12 2016-07-12 Control system and device control method
PCT/JP2016/070566 WO2018011890A1 (en) 2016-07-12 2016-07-12 Control system and apparatus control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/070566 WO2018011890A1 (en) 2016-07-12 2016-07-12 Control system and apparatus control method

Publications (1)

Publication Number Publication Date
WO2018011890A1 true WO2018011890A1 (en) 2018-01-18

Family

ID=60952424

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/070566 WO2018011890A1 (en) 2016-07-12 2016-07-12 Control system and apparatus control method

Country Status (2)

Country Link
JP (1) JP6641482B2 (en)
WO (1) WO2018011890A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020707A1 (en) * 2001-06-27 2003-01-30 Kangas Kari J. User interface
JP2012090077A (en) * 2010-10-20 2012-05-10 Konica Minolta Business Technologies Inc Portable terminal, and method for operating processing apparatus
JP2013025638A (en) * 2011-07-22 2013-02-04 Kyocera Document Solutions Inc Image formation system, portable terminal device, and program
WO2015140106A1 (en) * 2014-03-17 2015-09-24 IT-Universitetet i København Computer-implemented gaze interaction method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011209805A (en) * 2010-03-29 2011-10-20 Konica Minolta Opto Inc Video display device

Also Published As

Publication number Publication date
JPWO2018011890A1 (en) 2018-09-20
JP6641482B2 (en) 2020-02-05

Similar Documents

Publication Publication Date Title
JP5620287B2 (en) Portable terminal, method and program for changing user interface
US10987804B2 (en) Robot device and non-transitory computer readable medium
US10754161B2 (en) Apparatus control system
KR20170051136A (en) Method and apparatus for alerting to head mounted display user
US10769509B2 (en) Determining an action associated with an apparatus using a combined bar code image
CN112526892B (en) Method and device for controlling intelligent household equipment and electronic equipment
JP2006208997A (en) Video display device and video display system
JP6195628B2 (en) Terminal apparatus, control apparatus, installation position confirmation support system, installation position setting support system, installation position confirmation support method, installation position setting support method, and program
US20150039104A1 (en) Remote controlling method, communication device, and computer-readable storage medium recorded with computer program for performing remote control
JP6355850B2 (en) Determination support apparatus, determination support method, and program
JP2009206774A (en) System and device for transmitting image, and control method
WO2018088349A1 (en) Information output system, device control system, information output method, and program
US20150351201A1 (en) System and method for selecting members of a lighting system
CN104904191A (en) Mobile device and method for establishing a wireless link
KR20150086807A (en) Method and apparatus of universal remocon based on object recognition
WO2018011890A1 (en) Control system and apparatus control method
JP2018195895A (en) Control device, air conditioner, terminal device, control method, and control program
CN109479359B (en) System and method for providing device access to sensor data
US10043374B2 (en) Method, system, and electronic device for monitoring
US10810867B2 (en) Remote control system, remote control method, and program
CN116699841A (en) Virtual reality equipment, and lens barrel position state detection method, device and medium thereof
JP6157588B2 (en) Remote control system, terminal device, remote control method and program
JP2011175360A (en) Arbitration server, arbitration method and arbitration program
JP6570616B2 (en) Remote operation system, controller, and program
KR20120102445A (en) Apparatus and method for controlling public device

Legal Events

Date Code Title Description
ENP Entry into the national phase (Ref document number: 2018527290; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16908794; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16908794; Country of ref document: EP; Kind code of ref document: A1)