WO2018011890A1 - Système de commande et procédé de commande d'appareil - Google Patents

Système de commande et procédé de commande d'appareil Download PDF

Info

Publication number
WO2018011890A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
operation information
information
user
terminal device
Prior art date
Application number
PCT/JP2016/070566
Other languages
English (en)
Japanese (ja)
Inventor
諭司 花井
聡司 峯澤
遠藤 聡
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2018527290A priority Critical patent/JP6641482B2/ja
Priority to PCT/JP2016/070566 priority patent/WO2018011890A1/fr
Publication of WO2018011890A1 publication Critical patent/WO2018011890A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • the present invention relates to a control system and a device control method.
  • A technique for operating a device connected to a network using information indicated by a barcode is known (for example, Patent Document 1).
  • In Patent Document 1, a barcode is read by a barcode reader connected to a mobile phone, and a digital home appliance is operated according to the information code described in the read barcode.
  • In recent years, wearable computers (also referred to as wearable devices or wearable terminals) are becoming popular.
  • Such wearable computers are mainly used for entertainment and health management, and research is also being conducted on operating devices using information measured by sensors provided in the wearable computers.
  • However, since the sensor included in the wearable computer is assumed to be constantly activated (that is, constantly measuring) when used for device operation as described above, there is a concern that an operation unintended by the user may be performed automatically.
  • the present invention has been made to solve the above-described problems, and an object thereof is to provide a control system or the like that can avoid execution of an operation not intended by the user.
  • A control system according to the present invention is a control system that controls a plurality of devices, and includes a control device and a terminal device.
  • The terminal device includes: an image sensor; specific image detection means for detecting a specific image that matches a predetermined condition from an image captured by the image sensor; detection information transmitting means for transmitting image detection information based on the specific image to the control device;
  • and notification means for notifying the user of the content of operation information transmitted from the control device.
  • The control device includes: operation information transmitting means for, when the image detection information transmitted from the terminal device is received, selecting a control candidate device from the plurality of devices based on the image detection information, generating operation information related to the control candidate device, and transmitting the operation information to the terminal device;
  • and device control means for determining whether or not control of the control candidate device is permitted by the user and, if permitted, controlling the control candidate device based on the operation information.
  • the terminal device transmits image detection information based on the detected specific image to the control device.
  • the control device selects a control candidate device based on the image detection information, generates operation information related to the control candidate device, and transmits the operation information to the terminal device.
  • The terminal device notifies the user of the content of the operation information, and the control device controls the control candidate device only when the control is permitted by the user. Therefore, an operation unintended by the user on the device can be avoided.
  • FIG. 1 is a diagram showing the configuration of the control system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing the hardware configuration of the control device.
  • FIG. 3 is a diagram for explaining the tables and the like stored in the secondary storage device of the control device.
  • FIG. 4 is a block diagram showing the hardware configuration of the terminal device according to Embodiment 1.
  • FIG. 5 is a diagram showing the functional configuration of the terminal device according to Embodiment 1.
  • FIG. 6 is a diagram showing the functional configuration of the control device.
  • FIG. 7 is a flowchart showing the procedure of the device control processing according to Embodiment 1.
  • FIG. 8 is a diagram showing a state in which the specific image is contained in the user's field of view.
  • FIG. 9 is a diagram showing an example in which the operation information video is projected.
  • FIG. 10 is a diagram for explaining a first modification of Embodiment 1.
  • FIG. 11 is a diagram (No. 1) for explaining a second modification of Embodiment 1.
  • FIG. 12 is a diagram (No. 2) for explaining the second modification of Embodiment 1.
  • FIG. 13 is a diagram (No. 1) for explaining a third modification of Embodiment 1.
  • FIG. 14 is a diagram (No. 2) for explaining the third modification of Embodiment 1.
  • FIG. 15 is a diagram for explaining a fourth modification of Embodiment 1.
  • FIG. 16 is a block diagram showing the hardware configuration of the terminal device according to Embodiment 2 of the present invention.
  • FIG. 17 is a diagram for explaining a control permission instruction in Embodiment 2.
  • FIG. 18 is a diagram for explaining a first modification of Embodiment 2.
  • FIG. 19 is a diagram for explaining a second modification of Embodiment 2.
  • FIG. 20 is a diagram for explaining another embodiment.
  • FIG. 1 is a diagram illustrating a configuration of a control system 1 according to the first embodiment of the present invention.
  • The control system 1 is a so-called HEMS (Home Energy Management System) that manages electric power used in a general home.
  • The control system 1 includes a control device 2, a terminal device 3, and a plurality of devices 4 (devices 4-1, 4-2, …).
  • the control device 2 is installed at an appropriate location in the house H, monitors the power consumed in this home (demand area), and displays the power consumption status. In addition, the control device 2 performs control of each device 4 and monitoring of an operation state.
  • The devices 4 (devices 4-1, 4-2, …) are, for example, electric devices such as an air conditioner, an illuminator, a television, a water heater, or an IH (Induction Heating) cooker.
  • Each device 4 is installed in a house H (including a site) and connected to a power line that supplies power from a commercial power source, a power generation facility, a power storage facility, or the like (all not shown).
  • Each device 4 is communicably connected to the control device 2 via a wireless network (not shown).
  • This wireless network is, for example, a network conforming to ECHONET Lite. Depending on the specifications of the device 4, it may be connected to this wireless network via an external communication adapter (not shown).
  • each device 4 transmits data (operation state data) storing information indicating the current operation state to the control device 2.
  • The control device 2 includes a CPU (Central Processing Unit) 20, a communication interface 21, a ROM (Read Only Memory) 22, a RAM (Random Access Memory) 23, and a secondary storage device 24. These components are connected to each other via a bus 25.
  • the CPU 20 controls the control device 2 in an integrated manner. Details of functions realized by the CPU 20 will be described later.
  • the communication interface 21 includes a network card for wirelessly communicating with each device 4 via the above-described wireless network and an IC chip for wirelessly communicating with the terminal device 3.
  • The ROM 22 stores a plurality of firmware programs and data used when executing them.
  • the RAM 23 is used as a work area for the CPU 20.
  • The secondary storage device 24 is configured by a readable/writable nonvolatile semiconductor memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory) or a flash memory, an HDD (Hard Disk Drive), or the like. As shown in FIG. 3, the secondary storage device 24 stores a remote operation program 240 and device information 241. In addition, the secondary storage device 24 stores various programs, including a program for monitoring the operation state of the devices 4 and a program for monitoring the power consumed in the home, as well as data used when executing these programs.
  • the remote operation program 240 is a computer program executed by the CPU 20.
  • the remote operation program 240 describes processing for operating the device 4 via the terminal device 3.
  • the device information 241 is information related to the device 4, and includes a connection information table 2410, a location information table 2411, an operation information table 2412, and an operation code table 2413.
  • the connection information table 2410 is a data table that stores connection information of the device 4.
  • the connection information is information for communicating with the device 4, for example, the address of the device 4.
  • the location information table 2411 is a data table that stores location information of the device 4.
  • the location information is information indicating a location where the device 4 is installed. More specifically, the location information is information for identifying a room in the house H.
  • the information for identifying a room is, for example, an ID (identification) previously assigned to each room or a name of each room (“living room”, “kitchen”, etc.).
  • the operation information table 2412 is a data table that stores the current operation state of the device 4.
  • the operation state includes, for example, a power supply state (power on / off), operation presence / absence (during operation / stop), abnormality presence / absence (abnormal / normal), operation mode, setting information, and the like.
  • the operation mode indicates an operation method. For example, in the case of an air conditioner, cooling, heating, ventilation, dehumidification, and the like correspond to the operation mode, and in the case of an illuminator, normal illumination, power saving illumination, and the like correspond to the operation mode.
  • the setting information corresponds to a set temperature (also referred to as a target temperature), an air volume, and the like, and in the case of an illuminator, a brightness level or the like.
  • the CPU 20 updates the operation information table 2412 based on the operation state data acquired from each device 4.
  • the operation code table 2413 is a data table in which an operation code is associated with information for identifying the device 4 (for example, device ID or address) and the operation content of the device 4.
  • The operation code is, for example, a four-byte character string combining letters and numbers. For example, “L001” indicates the power on/off operation of the illuminator installed in the living room, “L002” indicates the power on/off operation of the air conditioner installed in the living room, and “K013” indicates the power-saving function on/off operation of the refrigerator installed in the kitchen.
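  • As an illustration only (the publication does not specify a data format), the operation code table 2413 can be modeled as a mapping from an operation code to a device identifier and an operation content. The codes below are the examples given in the text; the structure, device identifiers, and function names are assumptions.

```python
# Hypothetical sketch of the operation code table 2413: each four-byte
# operation code maps to a device identifier and the operation it selects.
OPERATION_CODE_TABLE = {
    "L001": {"device": "living_illuminator", "operation": "power_on_off"},
    "L002": {"device": "living_air_conditioner", "operation": "power_on_off"},
    "K013": {"device": "kitchen_refrigerator", "operation": "power_saving_on_off"},
}

def lookup_operation(code: str):
    """Return (device, operation) for an operation code, or None if unknown."""
    entry = OPERATION_CODE_TABLE.get(code)
    if entry is None:
        return None
    return entry["device"], entry["operation"]
```

A lookup with an unregistered code returns None, which corresponds to image detection information that matches no control candidate device.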
  • The terminal device 3 is a so-called wearable computer (also referred to as a wearable device or a wearable terminal), and in this embodiment is a glasses-type head-mounted display (also referred to as smart glasses) that is worn on the user's head.
  • the terminal device 3 includes a CPU 30, a communication interface 31, an image sensor 32, a video projection unit 33, a ROM 34, a RAM 35, and a secondary storage device 36. These components are connected to each other via a bus 37.
  • the CPU 30 controls the terminal device 3 in an integrated manner. Details of the functions realized by the CPU 30 will be described later.
  • the communication interface 31 includes an IC chip for wireless communication with the control device 2.
  • The image sensor 32 is an imaging element such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor, and is provided on the outside of the frame of the terminal device 3 so as to be able to image a subject in the field of view of the user wearing the terminal device 3. That is, the imaging range of the image sensor 32 is equivalent to the field of view of the user wearing the terminal device 3.
  • Under the control of the CPU 30, the video projection unit 33 projects video onto at least one of the right-eye lens and the left-eye lens of the terminal device 3.
  • These lenses of the terminal device 3 employ a half mirror made of a dielectric multilayer film, and have a structure that transmits light from the front while reflecting the video from the video projection unit 33.
  • Thus, while wearing the terminal device 3, the user can see the projected video while viewing the external environment. That is, the user wearing the terminal device 3 can experience so-called augmented reality (AR).
  • The ROM 34 stores a plurality of firmware programs, data used when executing them, and the like.
  • the RAM 35 is used as a work area for the CPU 30.
  • the secondary storage device 36 is configured by a readable / writable nonvolatile semiconductor memory such as an EEPROM or a flash memory.
  • the secondary storage device 36 stores various programs including a remote operation program describing processing for remotely operating the device 4 via the control device 2 and data used when executing these programs.
  • the terminal device 3 includes a specific image detection unit 300, a detection information transmission unit 301, and a notification unit 302 as shown in FIG. These functional units are realized by the CPU 30 executing a remote operation program stored in the secondary storage device 36.
  • the specific image detection unit 300 detects a specific image that matches a predetermined condition from images captured by the image sensor 32 at regular time intervals (for example, at intervals of 0.5 seconds).
  • the specific image is a two-dimensional code, and more specifically, a matrix-type two-dimensional code.
  • the specific image may be a stack type two-dimensional code or a one-dimensional code.
  • the specific image is not limited to that based on a known standard, and may be defined for use in the control system 1.
  • The detection information transmission unit 301 decodes the detected specific image and transmits image detection information storing the decoded result (in the present embodiment, the operation code described above) to the control device 2.
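  • The terminal-side cycle described above (capture a frame at a fixed interval, detect and decode the specific image, transmit the image detection information) can be sketched as follows. The sensor, decoder, and transmitter are stubbed out as callables; all function names are illustrative, not taken from the publication.

```python
from typing import Callable, Optional

def detection_cycle(capture: Callable[[], object],
                    decode: Callable[[object], Optional[str]],
                    send: Callable[[dict], None]) -> bool:
    """One sampling cycle (e.g. every 0.5 s) of the specific image detection
    unit 300 and detection information transmission unit 301.
    Returns True if image detection information was transmitted."""
    frame = capture()                          # image captured by image sensor 32
    operation_code = decode(frame)             # decode the two-dimensional code, if any
    if operation_code is None:
        return False                           # no specific image in the field of view
    send({"operation_code": operation_code})   # image detection information to control device 2
    return True
```

In a real terminal this cycle would run continuously while the device is worn; here the caller supplies the loop and the concrete capture/decode/send implementations.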
  • When the notification unit 302 receives the later-described operation information transmitted from the control device 2, it notifies the user of the content of the operation information. More specifically, the notification unit 302 projects a video indicating the content of the operation information onto a predetermined position of the lens of the terminal device 3.
  • control device 2 includes an operation information transmission unit 200 and a device control unit 201 as shown in FIG. These functional units are realized by the CPU 20 executing the remote operation program 240 stored in the secondary storage device 24.
  • the operation information transmission unit 200 includes a candidate determination unit 2000 and an operation information generation unit 2001.
  • When the candidate determination unit 2000 receives the above-described image detection information from the terminal device 3, it extracts the operation code from the received image detection information. Then, the candidate determination unit 2000 refers to the operation code table 2413 using the extracted operation code, and determines the device 4 to be a control target candidate (control candidate device) and the operation content for that device 4.
  • For example, the candidate determination unit 2000 determines the illuminator installed in the living room (hereinafter referred to as the living room illuminator) as the control candidate device, and determines power on/off as the operation content of the living room illuminator.
  • the operation information generation unit 2001 refers to the operation information table 2412 and acquires the current operation state corresponding to the determined operation content in the determined control candidate device. Then, the operation information generation unit 2001 generates operation information for notifying the terminal device 3 based on the determined control candidate device and its operation content and the acquired current operation state of the control candidate device.
  • For example, the operation information stores information indicating that control to switch the living room illuminator from off to on is to be performed.
  • the operation information generation unit 2001 transmits the operation information generated in this way to the terminal device 3.
  • the device control unit 201 includes a permission determination unit 2010 and a control command transmission unit 2011.
  • After the operation information is transmitted, the permission determination unit 2010 determines, based on the image detection information transmitted from the terminal device 3, whether or not control of the control candidate device is permitted by the user.
  • Specifically, the permission determination unit 2010 determines whether image detection information having the same content (that is, the same operation code) as the image detection information corresponding to the operation information previously transmitted to the terminal device 3 has been continuously received from the terminal device 3 for a predetermined period (first period). When it has been continuously received, the permission determination unit 2010 determines that the permission condition is satisfied and that control of the control candidate device is permitted by the user.
  • the first period is, for example, 10 seconds.
  • On the other hand, when the permission determination unit 2010 does not receive image detection information having the same content as the image detection information corresponding to the transmitted operation information even once during a predetermined period (second period), it determines that the non-permission condition is satisfied and does not permit control of the control candidate device.
  • the second period is, for example, 10 seconds.
  • The permission determination unit 2010 performs the above permission determination during a predetermined period (permission waiting period) after the operation information is transmitted to the terminal device 3 by the operation information generation unit 2001, that is, after the content of the operation information is notified to the user by the terminal device 3.
  • the permission waiting period is, for example, 30 seconds. If neither the permission condition nor the non-permission condition is satisfied during the permission waiting period, the permission determination unit 2010 determines that control of the control candidate device is not permitted.
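  • The timing rules above (first period, second period, permission waiting period) can be sketched as a function over the arrival times of matching image detection information. Representing the detections as a sorted list of timestamps sampled at the terminal's roughly 0.5-second interval is an assumption for illustration, not part of the publication.

```python
def judge_permission(detections, first=10.0, second=10.0, wait=30.0):
    """Sketch of the permission determination unit 2010.
    `detections`: sorted times (seconds after the operation information was
    notified) at which image detection information with the matching
    operation code arrived, sampled roughly every 0.5 s."""
    GAP = 0.6  # anything larger than one sampling interval breaks a continuous run
    run_start, last = None, None
    for t in sorted(detections):
        if t > wait:
            break                      # outside the permission waiting period
        if last is not None and t - last >= second:
            return "denied"            # absent for a whole second period
        if last is None or t - last > GAP:
            run_start = t              # a new continuous run begins
        last = t
        if t - run_start >= first:
            return "permitted"         # in view continuously for the first period
    return "denied"                    # neither condition met within the waiting period
```

With the example values from the text, keeping the code in view for 10 seconds yields permission, while a 10-second absence (or silence until the 30-second wait expires) yields denial.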
  • When the above permission condition is satisfied, the control command transmission unit 2011 generates a control command for controlling the control candidate device and transmits it to the control candidate device. More specifically, the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content indicated by the operation information.
  • FIG. 7 is a flowchart showing a procedure of device control processing in remote operation executed by the control system 1.
  • When a specific image (that is, a two-dimensional code) enters the imaging range of the image sensor 32, the specific image detection unit 300 detects the specific image (step S101).
  • The detection information transmission unit 301 of the terminal device 3 decodes the specific image detected by the specific image detection unit 300 and transmits image detection information storing the decoded result (that is, the operation code) to the control device 2 (step S102).
  • When the operation information transmission unit 200 (the candidate determination unit 2000 and the operation information generation unit 2001) of the control device 2 receives the image detection information from the terminal device 3, it determines the control candidate device and generates the operation information based on the operation code stored in the received image detection information (step S103). Then, the operation information transmission unit 200 transmits the generated operation information to the terminal device 3 (step S104).
  • When the notification unit 302 of the terminal device 3 receives the operation information from the control device 2, it notifies the user of the content of the operation information (step S105). More specifically, the notification unit 302 projects a video (operation information video) indicating the content of the operation information onto the lens of the terminal device 3.
  • FIG. 9 shows an example in which the operation information video is projected.
  • When permitting control of the control candidate device (here, control to switch the living room illuminator from off to on), the user keeps the two-dimensional code in his or her field of view for at least the first period (for example, 10 seconds) within the permission waiting period (for example, within 30 seconds) described above. Specifically, the user refrains from moving the head or moving the object on which the two-dimensional code is printed in such a way that the two-dimensional code leaves the user's field of view.
  • On the other hand, when not permitting the control, the user moves the head or moves the object on which the two-dimensional code is printed so that the two-dimensional code stays out of the user's field of view for the second period (for example, 10 seconds).
  • After transmitting the operation information to the terminal device 3 (that is, after the content of the operation information is notified by the terminal device 3), the permission determination unit 2010 of the control device 2 monitors the image detection information sent from the terminal device 3 to determine whether or not control of the control candidate device is permitted (step S106).
  • When it is determined that the control is permitted (step S106; YES), the control command transmission unit 2011 generates a control command for controlling the control candidate device with the content of the operation information generated by the operation information generation unit 2001, and transmits it to the control candidate device (step S107).
  • As described above, the control system 1 can control the devices 4 using a specific image, that is, a two-dimensional code, and is therefore excellent in convenience. Even when a specific image is detected by the terminal device 3, the control device 2 does not immediately control the corresponding device 4; instead, it generates operation information corresponding to the specific image and transmits it to the terminal device 3. The terminal device 3 then notifies the user of the content of the operation information.
  • the control device 2 executes the control of the device 4 (control candidate device) only when the user permits the execution of the control. Therefore, it is possible to avoid a situation in which an operation unintended by the user is performed on the device 4, and an effect of preventing an erroneous operation on the device 4 can be obtained.
  • the remote operation service of the device 4 using such a terminal device 3 can be provided to a wide range of users.
  • (Modification 1) The notification unit 302 of the terminal device 3 may notify the user of the permission and non-permission instruction methods, as shown in FIG.
  • (Modification 2) In Embodiment 1, the user instructed permission of control by keeping the specific image (two-dimensional code) in the user's field of view for the first period (for example, 10 seconds) while the operation information video was projected.
  • the instruction method is not limited to this.
  • the user may be able to instruct permission of control by moving the position of the current specific image in the field of view to a predetermined area in the field of view.
  • In this modification, when the operation information is received, the notification unit 302 of the terminal device 3 projects the operation information video onto the lens of the terminal device 3 and also projects a destination frame 1101, as shown in FIG. 11, onto the lens.
  • To permit control, the user moves the head or moves the object on which the specific image is printed so that the position of the specific image in the user's field of view (that is, within the imaging range of the image sensor 32) moves into the area surrounded by the destination frame 1101 (see FIG. 12).
  • In this modification, during a predetermined period (the same length as the permission waiting period) after the operation information video is projected, the detection information transmission unit 301 of the terminal device 3 stores the position information of the specific image in the image detection information transmitted to the control device 2.
  • the specific image position information is information indicating the position of the specific image in the image (captured image) captured by the image sensor 32.
  • The permission determination unit 2010 of the control device 2 in this modification determines whether or not the control is permitted by the user based on the position information of the specific image included in the received image detection information. That is, when the specific image is located in the area of the captured image corresponding to the projection position of the destination frame 1101, the permission determination unit 2010 determines that the permission condition is satisfied and that control of the control candidate device is permitted. It is assumed that data indicating the correspondence between the captured image area and the projection position of the destination frame 1101 is stored in advance in the secondary storage device 24 of the control device 2.
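  • The position-based permission condition in this modification reduces to a point-in-rectangle test between the reported position of the specific image and the captured-image region corresponding to the destination frame 1101. Representing the region as an axis-aligned rectangle is an assumption for illustration.

```python
def in_destination_frame(pos, frame):
    """Sketch of the position check performed by the permission determination
    unit 2010 in this modification (rectangle representation assumed).
    pos: (x, y) position of the specific image in the captured image.
    frame: (x_min, y_min, x_max, y_max) region of the captured image that
    corresponds to the projection position of the destination frame 1101,
    stored in advance in the secondary storage device 24."""
    x, y = pos
    x0, y0, x1, y1 = frame
    return x0 <= x <= x1 and y0 <= y <= y1
```

The permission condition is then satisfied once the reported positions stay inside this region.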
  • (Modification 3) In Embodiment 1, the control in the case where one specific image enters the user's field of view (the imaging range of the image sensor 32), that is, the case where there is one control candidate device, has been described. However, as shown in FIG. 13, a plurality of specific images may fall within the user's field of view. In such a case, when receiving the operation information transmitted from the control device 2, the notification unit 302 of the terminal device 3 in this modification projects each operation information video in the vicinity of the corresponding specific image (see FIG. 14).
  • By moving the position of any one of the specific images in the user's field of view into the area surrounded by the destination frame 1101 projected onto the lens, the user can instruct the control device 2 to permit control of the control candidate device corresponding to that specific image.
  • the permission waiting period is not fixed and may be changed according to the number of control candidate devices. For example, when there is one control candidate device, the permission waiting period may be 30 seconds, and when there are three control candidate devices, the permission waiting period may be 60 seconds.
  • (Modification 4) In this modification, the operation information transmission unit 200 of the control device 2 further includes a control availability determination unit (not shown) that determines whether or not the control candidate device can be controlled.
  • The control availability determination unit refers to the operation information table 2412 to determine whether an abnormality has been detected in the control candidate device. When an abnormality has been detected, the control availability determination unit determines that the control candidate device cannot be controlled. In addition, for example, when a predetermined operation prohibition period has not elapsed since the most recent operation performed on the control candidate device, the control availability determination unit also determines that the control candidate device cannot be controlled.
  • the operation prohibition period may be common to all the devices 4, or may be set for each type of device 4 or individually.
  • When the control availability determination unit determines that control is not possible, the operation information generation unit 2001 generates operation information indicating the current operation state of the control candidate device and the fact that control is not possible, and transmits it to the terminal device 3.
  • An example of the contents of the operation information notified by the terminal device 3 in this case is shown in FIG.
  • Even when image detection information corresponding to a control candidate device determined to be uncontrollable by the control availability determination unit is received from the terminal device 3, the permission determination unit 2010 discards the image detection information without performing the permission determination processing described above.
  • In this way, a device 4 that should not be operated via the terminal device 3 from the viewpoint of safety, energy saving, comfort, and the like can be excluded from the operation targets.
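  • The availability check in this modification can be sketched as follows. The dictionary layout of the operation state and the 60-second prohibition period are assumptions for illustration; the publication only says the period may be common, per type, or per device.

```python
import time

def can_control(device_state, now=None, prohibition_s=60.0):
    """Sketch of the control availability determination unit (Modification 4).
    device_state: dict with 'abnormal' (bool) and 'last_operated'
    (epoch seconds of the most recent operation, or None); this layout
    and the default prohibition period are assumed."""
    now = time.time() if now is None else now
    if device_state.get("abnormal"):
        return False                               # abnormality detected: not controllable
    last = device_state.get("last_operated")
    if last is not None and now - last < prohibition_s:
        return False                               # operation prohibition period not elapsed
    return True
```

When this returns False, the operation information would indicate that control is not possible, and subsequent matching image detection information would be discarded.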
  • (Embodiment 2) Next, Embodiment 2 of the present invention will be described.
  • components and the like that are common to the first embodiment are denoted by the same reference numerals, and description thereof is omitted.
  • FIG. 16 is a block diagram illustrating a hardware configuration of the terminal device 3A according to the second embodiment.
  • The terminal device 3A has a line-of-sight detection unit 38 added to the configuration of the terminal device 3 of Embodiment 1.
  • the line-of-sight detection unit 38 includes an electrooculogram sensor and detects the line of sight of the user wearing the terminal device 3A.
  • the line-of-sight detection unit 38 may include an image sensor that captures an image of the user's eye, and may detect the user's line of sight by analyzing the captured image of the eye.
  • When the notification unit 302 of the terminal device 3A receives the operation information from the control device 2, it projects the operation information video onto the lens of the terminal device 3A, together with a mark 1701 as shown in FIG. 17.
  • To permit the notified control, the user gazes at the mark 1701 for a predetermined period (third period).
  • the third period is shorter than the first and second periods described above, for example, 3 seconds.
  • During a predetermined period after the operation information video is projected (the same period as the permission waiting period), the detection information transmission unit 301 of the terminal device 3A includes line-of-sight information in the image detection information transmitted to the control device 2.
  • the line-of-sight information is information indicating the detection value of the line-of-sight detection unit 38.
  • The permission determination unit 2010 of the control device 2 in this embodiment monitors the line-of-sight information included in the received image detection information to determine whether the user has permitted the control. More specifically, the permission determination unit 2010 determines from the line-of-sight information whether the user is looking at the mark 1701, and when the state in which the user is looking at the mark 1701 continues for a predetermined period (the same period as the third period), it determines that the permission condition is satisfied and that the control of the control candidate device is permitted. Data indicating the correspondence between the detection values of the line-of-sight detection unit 38 and the projection position of the mark 1701 is assumed to be stored in advance in the secondary storage device 24 of the control device 2.
  • If the permission condition is not satisfied within the permission waiting period, the permission determination unit 2010 determines that control of the control candidate device is not permitted.
  • When the above permission condition is satisfied, the control command transmission unit 2011 generates a control command for controlling the control candidate device according to the contents of the operation information generated by the operation information generation unit 2001, and transmits the control command to the control candidate device.
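The permission decision described above — the permission condition holds once the user's gaze has rested on the mark 1701 continuously for the third period, evaluated within the permission waiting period — might be sketched as follows. This is a simplified illustration; the sample-based gaze stream and the function name `judge_permission` are assumptions, not part of the specification.

```python
def judge_permission(samples, third_period, waiting_period):
    """samples: time-ordered iterable of (timestamp_s, gaze_on_mark: bool).
    Returns True when the gaze rests on the mark 1701 continuously for
    third_period seconds; False if the waiting period expires first."""
    gaze_start = None
    for t, on_mark in samples:
        if t > waiting_period:
            break                      # permission waiting period expired
        if on_mark:
            if gaze_start is None:
                gaze_start = t         # gaze entered the mark
            elif t - gaze_start >= third_period:
                return True            # permission condition satisfied
        else:
            gaze_start = None          # gaze left the mark: restart the count
    return False                       # control not permitted
```

Whether the gaze is "on the mark" would in practice be derived from the stored correspondence between the detection values of the line-of-sight detection unit 38 and the projection position of the mark 1701.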
  • As described above, in this embodiment as well, the device 4 can be controlled using a specific image, that is, a two-dimensional code, which is excellent in convenience. Even when the specific image is detected by the terminal device 3A, the control device 2 does not immediately control the corresponding device 4; instead, it generates operation information corresponding to the specific image and transmits the operation information to the terminal device 3A. The terminal device 3A then notifies the user of the contents of the operation information from the control device 2. The control device 2 executes the control of the device 4 (control candidate device) only when the user permits the execution of the control. A situation in which an operation unintended by the user is performed on the device 4 can thus be avoided, and erroneous operation of the device 4 can be prevented.
  • Furthermore, the user can instruct the control device 2 whether or not to permit the control simply by moving his or her line of sight. Therefore, the remote operation service of the device 4 using such a terminal device 3A can be provided to a wider range of users.
  • (Modification 1) In Embodiment 2, when the contents of the control are notified, the user instructs permission of the control by gazing at the mark 1701 projected on the lens of the terminal device 3A for the third period.
  • the instruction method is not limited to this.
  • For example, instead of the mark 1701, the notification unit 302 of the terminal device 3A may project onto the lens of the terminal device 3A marks 1801A and 1801B, as shown in FIG. 18, indicating the gaze destinations for permitting and for not permitting the control, respectively.
  • To permit the control, the user gazes at the mark 1801A for the third period.
  • To refuse the control, the user gazes at the mark 1801B instead.
  • In this case, the permission determination unit 2010 of the control device 2 determines from the line-of-sight information included in the received image detection information whether the user is looking at the mark 1801A. When the state in which the user is looking at the mark 1801A continues for the third period, the permission determination unit 2010 determines that the permission condition is satisfied and that the control of the control candidate device is permitted.
  • The permission determination unit 2010 also determines from the line-of-sight information whether the user is looking at the mark 1801B. When the state in which the user is looking at the mark 1801B continues for the third period, the permission determination unit 2010 determines that the non-permission condition is satisfied and that the control of the control candidate device is not permitted.
  • the permission determination unit 2010 determines that control of the control candidate device is not permitted.
  • (Modification 2) In Embodiment 2, the case where one specific image enters the user's field of view (the imaging range of the image sensor 32), that is, where there is one control candidate device, has been described. However, as shown in FIG. 13, a plurality of specific images may fall within the user's field of view. In such a case, as in Modification 3 of Embodiment 1, the notification unit 302 of the terminal device 3A in this modification receives the operation information from the control device 2 and projects each operation information video close to the corresponding specific image (see FIG. 19).
  • Furthermore, the notification unit 302 makes the projection mode of the operation information video to which the user's line of sight is directed different from that of the others. For example, FIG. 19 shows that the user's line of sight is directed to the operation information video corresponding to the kitchen illuminator, which is a control candidate device. In this way, the user can easily and accurately recognize where his or her line of sight is directed.
  • During a predetermined period after the operation information videos are projected (the same period as the permission waiting period), the detection information transmission unit 301 of the terminal device 3A includes line-of-sight information in the image detection information transmitted to the control device 2. The line-of-sight information in this modification is information indicating whether or not the user's line of sight is directed to the operation information video corresponding to the specific image.
  • The permission determination unit 2010 of the control device 2 in this modification monitors the line-of-sight information included in the received image detection information to determine whether the user is viewing any one of the operation information videos. When the state in which the user views any one operation information video continues for a predetermined period (fourth period), the permission determination unit 2010 determines that the permission condition is satisfied and that the control of the control candidate device corresponding to that image detection information is permitted.
  • In this modification, the permission waiting period need not be fixed and may be changed according to the number of control candidate devices, as in Modification 3 of Embodiment 1. For example, the permission waiting period may be 30 seconds when there is one control candidate device and 60 seconds when there are three.
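The scaling of the permission waiting period with the number of control candidate devices could be captured by a simple rule. The linear interpolation below is only one possible reading of the example figures (30 s for one candidate, 60 s for three); the specification does not fix the rule, so treat the defaults as assumptions.

```python
def permission_waiting_period(num_candidates, base_s=30.0, per_extra_s=15.0):
    """Waiting period grows with the number of control candidate devices.
    With these defaults the rule yields 30 s for one candidate and 60 s
    for three, matching the example figures; the linear form itself is
    an assumption."""
    if num_candidates < 1:
        raise ValueError("at least one control candidate device is required")
    return base_s + per_extra_s * (num_candidates - 1)
```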
  • Also in this modification, the operation information transmission unit 200 may further include a control availability determination unit (not shown) that determines whether control of a control candidate device is possible.
  • When the control availability determination unit determines that control is impossible, the operation information generation unit 2001 generates operation information indicating the current operation state of the control candidate device and the fact that control is not possible, and transmits the operation information to the terminal device 3A.
  • Even when the permission determination unit 2010 receives, from the terminal device 3A, image detection information corresponding to a control candidate device that the control availability determination unit has determined to be uncontrollable, it discards the image detection information without performing the processing for determining whether the control is permitted.
  • In each of the above embodiments, the terminal devices 3 and 3A are eyeglass-type head-mounted displays (also referred to as smart glasses), but a non-transmissive head-mounted display, or a wearable computer other than a head-mounted display, can also be employed as the terminal device according to the present invention.
  • In the case of a non-transmissive head-mounted display, a display device such as a liquid crystal display displays an image showing the state of the outside world (outside-world image) captured by the image sensor 32. In this case, the operation information video is displayed on the display device so as to be superimposed on the outside-world image.
  • the terminal devices 3 and 3A may emit sound or light, or may vibrate.
  • the terminal devices 3 and 3A may notify the user that the control device 2 has received an instruction for permission or non-permission of control.
  • the control device 2 notifies the terminal devices 3 and 3A that an instruction to allow or disallow control by the user has been received.
  • the terminal devices 3 and 3A change the projection mode (or display mode) of the corresponding operation information video according to permission or non-permission.
  • FIG. 20 shows an example in which such notification is performed by the terminal device 3 in Modification 3 of Embodiment 1. In the example of FIG. 20, the control of the living room illuminator is permitted, the control of the living room air conditioner is not permitted, and the control of the kitchen illuminator is waiting for a permission or non-permission instruction.
  • the manner of notifying the content of the operation information is not limited to video only, and may be notified by a voice message, for example.
  • In each of the above embodiments, the control device 2 determines whether or not the user has permitted the control of the control candidate device, but this determination may instead be performed by the terminal devices 3 and 3A. Specifically, the terminal devices 3 and 3A include a permission determination unit (not shown) that performs the same function as the permission determination unit 2010 of the control device 2, and this permission determination unit transmits information indicating the determination result (permission determination information) to the control device 2. The control device 2 then determines, on the basis of the received permission determination information, whether to transmit a control command to the control candidate device.
  • the terminal devices 3 and 3A may be separately provided with a push button, a switch, or the like for receiving permission or non-permission of control for the control candidate device from the user.
  • a microphone may be provided in the terminal devices 3 and 3A so that the user can instruct permission or non-permission of control of the control candidate device by voice, or an acceleration sensor is provided in the terminal devices 3 and 3A. The user may be able to instruct permission or non-permission of control of the control candidate device by tilting or shaking the terminal devices 3 and 3A.
  • each function unit (see FIG. 6) of the control device 2 is realized by the CPU 20 executing the remote operation program 240 stored in the secondary storage device 24.
  • Similarly, each functional unit (see FIG. 5) of the terminal devices 3 and 3A is realized by the CPU 30 executing a program.
  • all or part of the functional units of the control device 2 and the terminal devices 3 and 3A may be realized by dedicated hardware.
  • the dedicated hardware is, for example, a single circuit, a composite circuit, a programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The program executed by the control device 2 can also be stored and distributed on a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), a magneto-optical disc, a USB (Universal Serial Bus) memory, a memory card, or an HDD. Then, by installing such a program on a specific or general-purpose computer, the computer can be caused to function as the control device 2 in each of the above embodiments.
  • the above program may be stored in a disk device or the like of a server on a network such as the Internet, and the program may be downloaded from the server to a computer.
  • the present invention can be suitably employed in a system that manages the operation of electrical equipment installed in the home.
  • 1 control system, 2 control device, 3 terminal device, 4-1, 4-2 device, 20, 30 CPU, 21, 31 communication interface, 22, 34 ROM, 23, 35 RAM, 24, 36 secondary storage device, 25, 37 bus, 32 image sensor, 33 video projection unit, 38 line-of-sight detection unit, 200 operation information transmission unit, 201 device control unit, 240 remote operation program, 241 device information, 300 specific image detection unit, 301 detection information transmission unit, 302 notification unit, 2000 candidate determination unit, 2001 operation information generation unit, 2010 permission determination unit, 2011 control command transmission unit, 2410 connection information table, 2411 location information table, 2412 operation information table, 2413 operation code table


Abstract

A terminal device detects a specific image among images captured by an image sensor (step S101) and transmits image detection information corresponding to the specific image to a control device (step S102). Upon receiving the image detection information transmitted from the terminal device, the control device selects a control candidate apparatus from among a plurality of apparatuses on the basis of the image detection information, generates operation information concerning the control candidate apparatus (step S103), and transmits it to the terminal device (step S104). Upon receiving the operation information transmitted from the control device, the terminal device notifies a user of the contents of the operation information (step S105). The control device determines whether or not control of the control candidate apparatus is permitted by the user (step S106) and, when the control is permitted, transmits a control command to the control candidate apparatus (step S107).
PCT/JP2016/070566 2016-07-12 2016-07-12 Système de commande et procédé de commande d'appareil WO2018011890A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018527290A JP6641482B2 (ja) 2016-07-12 2016-07-12 制御システム及び機器制御方法
PCT/JP2016/070566 WO2018011890A1 (fr) 2016-07-12 2016-07-12 Système de commande et procédé de commande d'appareil

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/070566 WO2018011890A1 (fr) 2016-07-12 2016-07-12 Système de commande et procédé de commande d'appareil

Publications (1)

Publication Number Publication Date
WO2018011890A1 true WO2018011890A1 (fr) 2018-01-18

Family

ID=60952424

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/070566 WO2018011890A1 (fr) 2016-07-12 2016-07-12 Système de commande et procédé de commande d'appareil

Country Status (2)

Country Link
JP (1) JP6641482B2 (fr)
WO (1) WO2018011890A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020707A1 (en) * 2001-06-27 2003-01-30 Kangas Kari J. User interface
JP2012090077A (ja) * 2010-10-20 2012-05-10 Konica Minolta Business Technologies Inc 携帯端末及び処理装置の操作方法
JP2013025638A (ja) * 2011-07-22 2013-02-04 Kyocera Document Solutions Inc 画像形成システム、携帯端末装置、およびプログラム
WO2015140106A1 (fr) * 2014-03-17 2015-09-24 IT-Universitetet i København Procédé d'interaction du regard mis en œuvre par ordinateur et son appareil

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011209805A (ja) * 2010-03-29 2011-10-20 Konica Minolta Opto Inc 映像表示装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020707A1 (en) * 2001-06-27 2003-01-30 Kangas Kari J. User interface
JP2012090077A (ja) * 2010-10-20 2012-05-10 Konica Minolta Business Technologies Inc 携帯端末及び処理装置の操作方法
JP2013025638A (ja) * 2011-07-22 2013-02-04 Kyocera Document Solutions Inc 画像形成システム、携帯端末装置、およびプログラム
WO2015140106A1 (fr) * 2014-03-17 2015-09-24 IT-Universitetet i København Procédé d'interaction du regard mis en œuvre par ordinateur et son appareil

Also Published As

Publication number Publication date
JP6641482B2 (ja) 2020-02-05
JPWO2018011890A1 (ja) 2018-09-20

Similar Documents

Publication Publication Date Title
JP5620287B2 (ja) ユーザインターフェースを変更する携帯端末、方法及びプログラム
US10754161B2 (en) Apparatus control system
JP6195628B2 (ja) 端末装置、制御装置、設置位置確認支援システム、設置位置設定支援システム、設置位置確認支援方法、設置位置設定支援方法、及び、プログラム
KR20170051136A (ko) 헤드 마운트 디스플레이의 사용자에게 알림을 제공하기 위한 방법 및 장치
US10769509B2 (en) Determining an action associated with an apparatus using a combined bar code image
CN112526892B (zh) 用于控制智能家居设备的方法及装置、电子设备
JP2006208997A (ja) 映像表示装置及び映像表示システム
JP6711635B2 (ja) ネットワークシステム、電気機器、通信端末、および通信端末のためのプログラム
US20160004231A1 (en) Method of managing electrical device, managing system, electrical device, operation terminal, and program
JP6355850B2 (ja) 判定支援装置、判定支援方法及びプログラム
US20150039104A1 (en) Remote controlling method, communication device, and computer-readable storage medium recorded with computer program for performing remote control
JP2009206774A (ja) 画像伝送システム、画像伝送装置及び制御方法
CN112005297A (zh) 信息输出方法、信息输出装置以及程序
CN104904191A (zh) 移动装置与用于建立无线链路的方法
KR20150086807A (ko) 사물인식기반 통합 리모컨 장치 및 그 방법
WO2018011890A1 (fr) Système de commande et procédé de commande d'appareil
JP2018195895A (ja) 制御装置、空気調和機、端末装置、制御方法、および制御プログラム
CN109479359B (zh) 用于为设备提供对传感器数据的访问的系统和方法
US10810867B2 (en) Remote control system, remote control method, and program
US20170193804A1 (en) Method, system, and electronic device for monitoring
JP2011175360A (ja) 調停サーバ、調停方法および調停プログラム
JP7518731B2 (ja) 電子機器および制御方法
JP6570616B2 (ja) 遠隔操作システム、コントローラ、および、プログラム
KR20120102445A (ko) 공공 기기 제어 장치 및 방법.
JP6898521B2 (ja) センサ連携設備システム

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018527290

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16908794

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16908794

Country of ref document: EP

Kind code of ref document: A1