WO2018180635A1 - Operation device and apparatus control system - Google Patents

Operation device and apparatus control system

Info

Publication number
WO2018180635A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
recognition target
processing unit
sensor
specific movement
Prior art date
Application number
PCT/JP2018/010591
Other languages
French (fr)
Japanese (ja)
Inventor
賢志 喜多村
橋本 勝
英亮 山口
隼也 酒見
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2018180635A1

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/52 Indication arrangements, e.g. displays
    • F24F11/523 Indication arrangements, e.g. displays for displaying temperature data
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00 Control or safety arrangements
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/56 Remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q SELECTING
    • H04Q9/00 Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • The present invention relates to an operating device and a device control system. More specifically, the present invention relates to an operating device including a display, and a device control system that controls devices using the operating device.
  • Patent Document 1 describes a technique for canceling a sleep state when a gesture or pattern produced by a touch event on a touch sensor matches a preset sleep-cancel gesture or pattern.
  • An object of the present invention is to provide an operation device and a device control system in which the number of operations by a user is reduced.
  • An operating device according to one aspect of the present invention includes a display, a sensor, and a processing unit.
  • The display can be switched between a display state and a non-display state.
  • The sensor detects the position of a recognition target, in a contact or non-contact manner, within a detection range close to the display.
  • The processing unit receives information on the position of the recognition target from the sensor. Furthermore, the processing unit instructs the display to enter the non-display state when the sensor does not detect the recognition target.
  • When the display is in the non-display state and the processing unit recognizes a specific movement of the recognition target, the processing unit changes an output state including control content for a device in accordance with the specific movement of the recognition target.
  • A device control system according to one aspect of the present invention includes the operating device described above and an output device that controls the device in accordance with the output state of the processing unit.
  • FIG. 1 is a block diagram illustrating a device control system according to the first embodiment.
  • FIG. 2 is a block diagram illustrating the operating device according to the first embodiment.
  • FIG. 3 is a perspective view illustrating the operating device according to the first embodiment.
  • FIG. 4 is an exploded perspective view showing the operating device according to the first embodiment.
  • FIG. 5 is a diagram illustrating an example when the display device is in the display state in the operation device according to the first embodiment.
  • FIG. 6 is a diagram illustrating a non-display state of the display device in the operation device according to the first embodiment.
  • FIG. 7 is a diagram illustrating another example when the display device is in the display state in the operation device according to the first embodiment.
  • FIG. 8 is a diagram illustrating another example when the display device is in the display state in the operating device according to the first embodiment.
  • FIG. 9 is a block diagram illustrating an operating device according to the second embodiment.
  • As illustrated in FIG. 1, the device control system 100 includes an operation device 10 and an output device 20.
  • The output device 20 is a device that instructs the device 30 on the content of control.
  • In the present embodiment, the output device 20 is provided integrally with the housing 40 (see FIGS. 3 and 4) of the operation device 10.
  • As shown in FIG. 2, the operating device 10 includes a display 11, a sensor 12, and a processing unit 13. The operating device 10 further includes power supply circuits 14 and 15.
  • The power supply circuit 14 receives AC power (for example, 100 V AC) from a source such as a commercial power supply and outputs DC power.
  • The power supply circuit 15 steps down the voltage of the DC power output by the power supply circuit 14 and outputs DC power at a constant voltage.
  • The power supply circuit 15 is an integrated circuit; for example, a three-terminal regulator is used.
  • The display 11 is a transmissive liquid crystal display.
  • The sensor 12 is a touch sensor superimposed on the screen of the display 11.
  • The sensor 12 is transparent and plate-shaped, and functions as a position input device that detects the position of the recognition target, in a contact or non-contact manner, within a detection range close to the display 11.
  • In the present embodiment, a capacitive touch sensor capable of multipoint detection is assumed as the sensor 12.
  • However, the sensor 12 may be another type of touch sensor, such as a resistive film type.
  • The recognition target is assumed to be a human finger.
  • Since the sensor 12 is a capacitive touch sensor and the recognition target is a human finger, the position of the recognition target may be detectable when the recognition target approaches the sensor 12, even if the recognition target is not in contact with the sensor 12.
  • Note that the distance between the sensor 12 and the recognition target at which the sensor 12 can detect the position of the recognition target in a non-contact manner varies depending on environmental conditions. That is, the detection range in which the sensor 12 detects the position of the recognition target varies depending on the environmental conditions.
  • As described above, the sensor 12 can detect the position of the recognition target even when the recognition target is not in contact with the sensor 12.
  • To simplify the description, however, it is assumed below that the sensor 12 detects the position of the recognition target when the recognition target is in contact with the sensor 12.
  • In FIG. 2, the rectangle surrounding the power supply circuit 14 represents the power supply board 41, and the rectangle surrounding the power supply circuit 15 represents the main board 42.
  • The power supply board 41 has a power supply terminal 411 connected to the AC power supply, and the input terminal of the power supply circuit 14 is connected to the power supply terminal 411.
  • The power supply board 41 and the main board 42 are electrically connected through a connector and a cable.
  • FIG. 2 schematically shows a configuration in which the power supply board 41 and the main board 42 are electrically connected to each other by a connection part 412 provided on the power supply board 41 and a connection part 421 provided on the main board 42.
  • The main board 42 includes the processing unit 13, the current source 16, and the wireless module 21 in addition to the power supply circuit 15.
  • The processing unit 13 is configured by, for example, a microcontroller including a processor and a memory.
  • The processor executes a program stored in the memory.
  • The processing unit 13 outputs a video signal carrying display content information to the display 11 and receives information on the position of the recognition target from the sensor 12.
  • The processing unit 13 controls the operations of the display 11, the sensor 12, the current source 16, and the wireless module 21 based on the position information received from the sensor 12.
  • The processing unit 13 instructs the display 11 to select between the display state and the non-display state, and instructs the display contents in the display state. In addition, the processing unit 13 can instruct the sensor 12 to select whether to accept or prohibit input.
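  • The role of the processing unit 13 described above can be pictured as a small event-driven routine that routes touch input differently depending on whether the display is in the display state or the non-display state. The following Python sketch is illustrative only and is not taken from the patent; the class, method, and state names are assumptions made for the example.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class DisplayState(Enum):
    SHOWN = auto()    # display state
    HIDDEN = auto()   # non-display state

@dataclass
class ProcessingUnit:
    display_state: DisplayState = DisplayState.HIDDEN
    # The "output state": the most recent instructions handed to the display,
    # the current source (backlight), and the wireless module.
    output_state: dict = field(default_factory=dict)

    def on_sensor_report(self, positions):
        """Called with the positions of the recognition target, if any."""
        if not positions:
            # No recognition target detected: instruct the non-display state.
            self.output_state["display"] = "hide"
            self.output_state["backlight"] = 0
            self.display_state = DisplayState.HIDDEN
            return
        if self.display_state is DisplayState.SHOWN:
            self.handle_gui_input(positions)            # taps on tabs/buttons
        else:
            self.watch_for_specific_movement(positions)

    def handle_gui_input(self, positions):
        pass  # hit-test against the displayed tabs and buttons (FIG. 5)

    def watch_for_specific_movement(self, positions):
        pass  # gesture recognition while the screen is blank (FIG. 6)
```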
  • The current source 16 outputs a drive current to the backlight 17 disposed on the back surface of the display 11.
  • The backlight 17 includes, for example, an LED (Light Emitting Diode) as a light source.
  • The processing unit 13 outputs a luminance signal that determines the luminance of the backlight 17, and the current source 16 outputs a current of the magnitude indicated by the luminance signal to the backlight 17.
  • The luminance signal switches the luminance of the backlight 17 in at least two stages.
  • One of the two levels of luminance is the luminance at which the backlight 17 is extinguished, and the other is the luminance at which the display content of the display 11 can be visually recognized when the backlight 17 is lit. That is, one luminance is, for example, 0%, and the other luminance is, for example, 70% or more and 100% or less.
  • By adding a brightness sensor that detects ambient illuminance, the luminance of the backlight 17 may be changed according to the ambient illuminance so as to improve the visibility of the display 11.
  • The processing unit 13 may also be configured so that the user can adjust the luminance of the backlight 17 using the display 11 and the sensor 12. That is, the processing unit 13 may adjust the luminance of the backlight 17 by interactively receiving input from the user through the display 11 and the sensor 12.
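  • As a small illustration of the two-stage luminance signal (0% when extinguished, 70-100% when the content must be readable), a Python sketch follows; the mapping to a drive current and the full-scale value are assumptions, not figures from the patent.

```python
BACKLIGHT_OFF_PCT = 0    # backlight extinguished (non-display state)
BACKLIGHT_ON_PCT = 85    # any value in the 70-100% range given in the text

def luminance_to_drive_current(luminance_pct, full_scale_ma=20.0):
    """Map the luminance signal (percent) to the drive current the current
    source 16 would feed to the LED backlight 17 (full_scale_ma is assumed)."""
    return full_scale_ma * luminance_pct / 100.0
```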
  • The wireless module 21 is configured to perform wireless communication.
  • The wireless module 21 is configured as a radio station of a communication standard selected from Wi-Fi (registered trademark), Bluetooth (registered trademark), BLE (Bluetooth Low Energy), and the like, or as a specified low-power radio station.
  • As the communication protocol, for example, ECHONET Lite is adopted.
  • The communication standard and communication protocol mentioned here are merely examples and are not intended to limit the configuration of the wireless module 21. Further, instead of the wireless module 21, a communication interface that performs wired communication, such as power line carrier communication, may be provided.
  • The operating device 10 is configured to have the same appearance as a wall switch attached to a wall of a building, for example, as shown in FIGS. 3 and 4.
  • The operation device 10 shown in FIGS. 3 and 4 includes a housing 40 that is attached to a wall with a part embedded in an embedding hole formed in the wall.
  • The housing 40 includes a first housing 401 and a second housing 402.
  • The first housing 401 is formed in a box shape having a rectangular opening 403 on one surface, and the power supply board 41 and the main board 42 are incorporated in it.
  • The first housing 401 includes two mounting plates 404 that extend, along the surface having the opening 403, in directions away from the opening 403.
  • The two mounting plates 404 are provided on two opposite sides of the four sides surrounding the opening 403.
  • The first housing 401 is fixed to the wall with the mounting plates 404 applied around the embedding hole formed in the wall. Since the technique for fixing the first housing 401 to the wall is the same as for a wall switch or the like, its description is omitted.
  • The second housing 402 is coupled to the first housing 401.
  • A peripheral portion of the base 405 is sandwiched between the first housing 401 and the second housing 402.
  • The display 11 and the sensor 12 are fixed to the base 405.
  • The second housing 402 has a window hole 406 at its center, and the display 11 and the sensor 12 fixed to the base 405 are exposed through the window hole 406 when the second housing 402 is coupled to the first housing 401.
  • The second housing 402 is a member corresponding to the flush plate (decorative plate) of a wall switch and is coupled to the mounting plates 404 of the first housing 401.
  • The first housing 401 and the second housing 402 are coupled by, for example, holes provided in the mounting plates 404 and claws protruding from the back surface of the second housing 402.
  • The devices 30 to be controlled by the operation device 10 are assumed here to be of three types: lighting equipment, air conditioning equipment, and electric curtains.
  • When the display 11 is in the display state, the operation device 10 displays the types of devices 30 on the screen of the display 11. While the types of devices 30 are displayed on the screen of the display 11, the type of device 30 to be controlled by the operating device 10 can be selected on the operating device 10.
  • If the display 11 is in the display state, the processing unit 13 receives input from the sensor 12 according to the display content of the display 11 and changes the output state according to the input from the sensor 12.
  • When the processing unit 13 changes the output state, the operation state of at least one of the display 11, the sensor 12, the current source 16, and the wireless module 21 changes.
  • In other words, the output state of the processing unit 13 is represented by the content of the signals that give instructions to the display 11, the sensor 12, the current source 16, and the wireless module 21.
  • The processing unit 13 communicates, directly or indirectly, with the device 30 to be controlled by the operating device 10 through the wireless module 21.
  • Communicating with the device 30 indirectly means communicating with a communication device that can monitor and control the device 30, such as a HEMS (Home Energy Management System) controller, and communicating with the device 30 through that communication device.
  • The processing unit 13 determines the control content for the device 30 based on the information input through the sensor 12 and outputs, to the wireless module 21, a signal corresponding to the control content for the device 30.
  • The control content for the device 30 includes the device 30 that is the control target and the operation state to be instructed to that device 30.
  • The wireless module 21 functions as the output device 20 of the operation device 10. In this case, the processing unit 13 outputting, to the wireless module 21, the signal corresponding to the control content for the control-target device 30 corresponds to a change in the output state of the processing unit 13.
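  • The "control content" described here pairs a control-target device with the operation state to be instructed to it, and handing the corresponding signal to the wireless module is what counts as a change of the output state. A minimal sketch follows, assuming a hypothetical wireless_module.send() interface; none of the names are defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class ControlContent:
    """Control content: the control-target device plus the operation state."""
    target_device: str   # e.g. "air_conditioner", "lighting", "curtain"
    operation: dict      # e.g. {"set_temperature": 25} or {"power": "on"}

def emit_control_content(wireless_module, content: ControlContent):
    # Outputting the signal corresponding to the control content to the
    # wireless module corresponds to the change in the output state.
    wireless_module.send(content.target_device, content.operation)
```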
  • The operating device 10 may display, on the screen of the display 11, the operation status of the device 30 selected through the screen of the display 11. For a lighting device, the operation status is, for example, whether it is lit or extinguished; it may include a dimming level if the lighting device is dimmable, and a light color if the color is adjustable. For an air conditioner, the operation status is whether it is operating or stopped and the set temperature, and may further include the air-blowing direction, the air-blowing strength, and the like. For an electric curtain, the operation status is whether the curtain is open or closed.
  • When the display 11 is in the display state, the screen of the display 11 displays, for example, three tabs T1, T2, and T3 representing the lighting equipment (lighting), the air conditioning equipment (air conditioner), and the electric curtain (curtain).
  • In this state, when the recognition target (a finger) touches the sensor 12 at a position corresponding to one of the three tabs T1, T2, and T3, the device 30 corresponding to the tab at the touched position becomes the control target of the operation device 10.
  • FIG. 5 shows an example of the state in which the air conditioner is selected.
  • In this state, the screen of the display 11 displays a field F1 indicating the current set temperature of the air conditioner and two buttons B11 and B12 for changing the set temperature.
  • While this screen is displayed, the set temperature indicated in the field F1 is changed when the recognition target touches the sensor 12 in an area corresponding to one of the two buttons B11 and B12.
  • In the following, an operation in which the recognition target touches the sensor 12 in a specific area such as a tab or a button is referred to as a "push", following the operation of a push button.
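  • A "push" is simply the recognition target touching the sensor inside the region of a displayed tab or button. The hit test below is a hedged sketch: the region coordinates are invented for the example, since the patent does not give any pixel layout for FIG. 5.

```python
# (x, y, width, height) regions for the tabs and buttons of FIG. 5.
# The coordinates are assumptions made only for this example.
REGIONS = {
    "tab_lighting": (0,   0,   80, 40),   # T1
    "tab_aircon":   (80,  0,   80, 40),   # T2
    "tab_curtain":  (160, 0,   80, 40),   # T3
    "button_up":    (180, 60,  60, 60),   # B11: raise set temperature
    "button_down":  (180, 130, 60, 60),   # B12: lower set temperature
}

def pushed_region(touch_x, touch_y):
    """Return the name of the tab/button containing the touched position,
    or None if the touch falls outside every region."""
    for name, (x, y, w, h) in REGIONS.items():
        if x <= touch_x < x + w and y <= touch_y < y + h:
            return name
    return None
```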
  • When the sensor 12 does not detect the recognition target, the processing unit 13 instructs the display 11 to enter the non-display state and turns off the backlight 17. At this time, nothing is displayed on the screen of the display 11, as shown in FIG. 6, but the processing unit 13 can still receive input from the sensor 12.
  • When the display 11 is in the non-display state, the processing unit 13 recognizes the movement of the recognition target based on the positions of the recognition target detected by the sensor 12. That is, the processing unit 13 waits for input from the sensor 12 even when the display 11 is in the non-display state. Since the sensor 12 is capable of multipoint detection, the processing unit 13 can distinguish whether the recognition target is one finger or two fingers. The processing unit 13 may also be configured to distinguish three or more fingers.
  • The movements of the recognition target recognized by the processing unit 13 include a tap, a swipe, a pinch-in, a pinch-out, and the like. The processing unit 13 may additionally have a function of recognizing a double tap, a flick, a long press, a drag, or the like as a movement of the recognition target.
  • A tap is an operation in which the recognition target touches the sensor 12 for a short time.
  • The processing unit 13 recognizes, as a tap, the recognition target leaving the sensor 12 within a predetermined time after touching the sensor 12.
  • A swipe is an operation in which the recognition target slides on the surface of the sensor 12 in one direction.
  • The direction in which the recognition target moves in a swipe is usually one of the vertical and horizontal directions of the screen of the display 11.
  • The processing unit 13 recognizes a swipe when the position of the recognition target starts to change with the passage of time within a predetermined time after the recognition target contacts the sensor 12 and the position change occurs approximately along a straight line.
  • The processing unit 13 also distinguishes and recognizes the direction of the position change of the recognition target.
  • A pinch-in is an operation of touching the sensor 12 with two fingers as the recognition target and then moving the fingers so as to reduce the distance between them.
  • A pinch-out is an operation of touching the sensor 12 with two fingers and then moving the fingers so as to increase the distance between them.
  • When the processing unit 13 recognizes that two fingers are in contact with the sensor 12 and that the interval between them shortens or widens with the passage of time, it recognizes a pinch-in or a pinch-out, respectively.
  • The processing unit 13 recognizes other movements of the recognition target in a similar manner.
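  • A minimal classifier for the movements listed above (tap, directional swipe, pinch-in/pinch-out) might look like the following; the time and distance thresholds stand in for the "predetermined time" and the roughly linear travel of the text and are assumed values.

```python
import math

TAP_MAX_DURATION = 0.3     # seconds; stands in for the "predetermined time"
SWIPE_MIN_DISTANCE = 30.0  # assumed minimum travel, in sensor coordinates

def classify_single_finger(samples):
    """Classify a one-finger trace given as a list of (t, x, y) samples from
    first contact to release; returns "tap", "up", "down", "left", "right",
    or None."""
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)
    if t1 - t0 <= TAP_MAX_DURATION and distance < SWIPE_MIN_DISTANCE:
        return "tap"                            # touched and released quickly
    if distance >= SWIPE_MIN_DISTANCE:          # roughly linear travel: swipe
        if abs(dy) >= abs(dx):                  # mostly vertical
            return "up" if dy < 0 else "down"   # screen y grows downward
        return "right" if dx > 0 else "left"
    return None

def classify_two_finger(start_gap, end_gap):
    """Pinch-in if the two fingers ended closer together, pinch-out if they
    ended farther apart."""
    if end_gap < start_gap:
        return "pinch_in"
    if end_gap > start_gap:
        return "pinch_out"
    return None
```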
  • In the present embodiment, the processing unit 13 is configured to recognize, as the specific movement, a swipe in which the recognition target moves upward along the vertical direction of the screen of the display 11, of the vertical and horizontal directions. That is, in FIG. 6, the operation of moving the recognition target upward, indicated by the arrow A, is the specific movement of the recognition target.
  • The processing unit 13 changes the output state in accordance with the specific movement of the recognition target when the sensor 12 detects the recognition target and the processing unit 13 recognizes the specific movement of the recognition target.
  • In this case, the processing unit 13 changes the output state so as to display, for example, the screen of FIG. 5. That is, when the display 11 is in the non-display state as shown in FIG. 6 and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 changes the output state so as to output, to the display 11, a video signal representing the display content of the screen of FIG. 5. At the same time, the processing unit 13 changes the output state of the luminance signal to the current source 16 so that the backlight 17 is lit.
  • The set temperature displayed in the field F1 on the screen of the display 11 may be stored in the processing unit 13, or may be acquired directly or indirectly from the device 30 through the wireless module 21.
  • In this way, when the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 changes the output state so that a screen for instructing control content to a specific device 30 (the air conditioner in FIG. 5) is displayed on the display 11.
  • Alternatively, when the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 may instruct the device 30 on the control content without shifting the display 11 to the display state. For example, if the device 30 associated with the specific movement of the recognition target is a lighting device, the processing unit 13 changes the output state so as to instruct the lighting device to turn on in response to the specific movement. In this case, the processing unit 13 does not change the output state with respect to the display 11 and the current source 16, and the display 11 is kept in the non-display state.
  • Alternatively, when the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 may instruct the device 30 associated with the specific movement on the control content and then display the instructed control content on the display 11. For example, in response to the specific movement of the recognition target, the processing unit 13 changes the output state with respect to the wireless module 21 so as to give an instruction to change the set temperature of the air conditioner. In this example, raising the set temperature of the air conditioner by 1 °C may be associated with an upward swipe, and lowering the set temperature of the air conditioner by 1 °C with a downward swipe.
  • In this case, the processing unit 13 displays, on the display 11, display content for confirming the operation of the device 30 (the characters "+1 °C (25 °C) set" in FIG. 7).
  • A button B13 for canceling the instructed control content may be displayed together. On the screen shown in FIG. 7, when the control content is not appropriate, the user can cancel the instructed control content by pressing the button B13.
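  • The ±1 °C example can be wired to the classifier output roughly as below. This is a sketch under the same assumptions as the earlier snippets (a hypothetical wireless_module.send() and a simple dict holding the current set temperature); it is not code from the patent.

```python
def handle_specific_movement_while_hidden(gesture, state, wireless_module):
    """While the display is in the non-display state, map an up/down swipe
    directly to a set-temperature change for the air conditioner, without
    first turning the screen on."""
    if gesture == "up":
        state["set_temperature"] += 1   # upward swipe: +1 degC
    elif gesture == "down":
        state["set_temperature"] -= 1   # downward swipe: -1 degC
    else:
        return
    wireless_module.send("air_conditioner",
                         {"set_temperature": state["set_temperature"]})
    # A confirmation such as "+1 degC (25 degC) set" (FIG. 7) could then be
    # shown on the display, optionally with a cancel button (B13).
```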
  • As shown in FIG. 8, the processing unit 13 may display, on the display 11, buttons B14 and B15 for selecting whether or not the control content is to be instructed to the device 30.
  • That is, the processing unit 13 may be configured to wait for input from the sensor 12 as to whether or not to instruct the device 30 on the control content corresponding to the specific movement of the recognition target.
  • In this case, the processing unit 13 associates the specific movement of the recognition target with specific control content for a specific device 30.
  • When the processing unit 13 recognizes the specific movement of the recognition target, instead of immediately giving an instruction to the device 30, it displays on the display 11 the buttons B14 and B15 with which the user selects whether to adopt the control content for the device 30.
  • The control content to be instructed to the device 30 is indicated by the characters "Do you want to set +1 °C (25 °C)?" in FIG. 8.
  • The processing unit 13 adopts the control content when the button B14 is pressed and discards the control content when the button B15 is pressed.
  • The processing unit 13 may perform the same operation as in the non-display state, when the specific movement of the recognition target is recognized, not only while the display 11 is in the non-display state but also while it is in the display state. Specifically, even when the display 11 is in the display state, when the processing unit 13 recognizes the specific movement of the recognition target, it may instruct the specific device 30 on the specific control content associated with the specific movement without switching the screen of the display 11. Alternatively, when the processing unit 13 recognizes the specific movement of the recognition target, it may switch the screen of the display 11 and cause the display 11 to display a screen that enables control of the specific device 30, as illustrated in FIG. 5.
  • Alternatively, the screen of the display 11 may be switched so that the control content instructed to the device 30 is displayed on the display 11, as shown in FIG. 7.
  • The screen shown in FIG. 7 includes the button B13 for canceling the instruction to the device 30, but the button B13 can be omitted.
  • A plurality of devices 30, or a plurality of types of devices 30, may be instructed with control content. For example, a specific movement can be associated with control content for the devices 30 such that the plurality of devices 30 are turned off collectively.
  • As described above, the processing unit 13 can perform an operation of raising the set temperature of the air conditioner by 1 °C with an upward swipe and lowering the set temperature of the air conditioner by 1 °C with a downward swipe. Similarly, the processing unit 13 can perform an operation of turning on the lighting device with an upward swipe and turning off the lighting device with a downward swipe.
  • Furthermore, since the type of movement of the recognition target can be associated with the control content for the device 30, the correspondence with the control content for the device 30 may be varied between the case where the recognition target is one finger and the case where it is two fingers.
  • For example, an upward swipe with two fingers may be associated with opening the electric curtain, and a downward swipe with two fingers with closing the electric curtain.
  • The type of operation that the processing unit 13 performs on the display 11 may also be varied according to the control content for the device 30. For example, when the result of the operation of the device 30 can be recognized without relying on the operation device 10, as with a lighting device, the processing unit 13 may keep the display 11 in the non-display state in response to the specific movement of the recognition target. On the other hand, when the content of the instruction to the device 30, such as the set temperature of an air conditioner, cannot be confirmed without the display 11, the processing unit 13 may operate so as to display the set temperature or the like on the display 11.
  • It is desirable that the processing unit 13 be configured such that the correspondence between the movement of the recognition target and the control content for the device 30 can be registered through the display 11 and the sensor 12. That is, it is desirable that the user can register, in the processing unit 13, a desired correspondence between the movement of the recognition target and the control content for the device 30 through the display 11 and the sensor 12. Further, it is desirable that the processing unit 13 be configured such that the user can also register the type of operation that the processing unit 13 performs on the display 11 in response to the movement of the recognition target, in addition to the control content for the device 30.
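  • One way to represent such user-registered correspondences is a table keyed by the number of fingers and the gesture, as sketched below. The entries mirror the examples given in the text; the table format itself is an assumption for illustration.

```python
# (finger count, gesture) -> (target device, operation)
gesture_bindings = {
    (1, "up"):   ("air_conditioner", {"set_temperature_delta": +1}),
    (1, "down"): ("air_conditioner", {"set_temperature_delta": -1}),
    (2, "up"):   ("curtain", {"action": "open"}),
    (2, "down"): ("curtain", {"action": "close"}),
}

def register_binding(finger_count, gesture, device, operation):
    """Store a correspondence entered by the user through the display 11
    and the sensor 12."""
    gesture_bindings[(finger_count, gesture)] = (device, operation)
```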
  • The housing 40 described in the present embodiment has the same dimensions as a wiring device so that it can be attached to the wall using the same attachment members as a wall switch. Therefore, an existing wall switch can be replaced with the operation device 10.
  • However, the size and shape of the housing 40 are not limited to this, and the size and shape of the housing 40 can be changed as appropriate depending on the design.
  • The operation device 10 does not necessarily have to be attached to a wall; it may be configured to be attached to a place other than a wall in the building, or may be configured to be portable. Further, the operation described above can also be realized by executing an appropriate application program (a so-called app) on a portable computer having a touch panel and a wireless communication function, such as a smartphone or a tablet terminal.
  • The operation device 10 of the present embodiment has a relay 22 added as the output device 20, in addition to the configuration of the first embodiment.
  • With the addition of the relay 22, a load terminal 413 is added to the power supply board 41.
  • The relay 22 is provided on the power supply board 41 and selects conduction (on) or non-conduction (off) of the electrical path between the power supply terminal 411 and the load terminal 413 on the power supply board 41.
  • The relay 22 is an electromagnetic relay, and the contact of the relay 22 is connected between the power supply terminal 411 and the load terminal 413.
  • The operation of the relay 22 is controlled by the processing unit 13.
  • The relay 22 may be a semiconductor relay.
  • A device 30 such as a lighting device is connected to the load terminal 413. That is, the relay 22 is provided in the power feeding path to the device 30. Therefore, by the processing unit 13 controlling the relay 22, a state in which power is supplied to the device 30 and a state in which power is not supplied are selected.
  • Other configurations and operations of the present embodiment are the same as those of the first embodiment.
  • The wireless module 21 is not an essential component of the output device 20.
  • The operating device 10 according to the present embodiment may include only the relay 22 as the output device 20.
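  • For the relay-only variant, controlling the device amounts to making the power-feeding path conductive or non-conductive. A minimal sketch follows; the GPIO-style interface used to drive the relay is an assumption and is not described in the patent.

```python
class RelayOutput:
    """Output device 20 of Embodiment 2: a relay 22 inserted in the
    power-feeding path between the power supply terminal 411 and the
    load terminal 413."""

    def __init__(self, gpio_pin):
        self.gpio_pin = gpio_pin  # assumed driver object with a write() method

    def set_power(self, on: bool):
        # Closing the relay contact feeds power to the device 30 (e.g. a
        # lighting fixture); opening it cuts the power.
        self.gpio_pin.write(1 if on else 0)
```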
  • In the above description, the display 11 is exemplified by a liquid crystal display, but it may be replaced with an organic EL display, electronic paper, or the like.
  • In such a case, the backlight 17 is unnecessary.
  • A front light may be disposed on the front surface of the display 11 instead of the backlight 17.
  • The front light includes a light source disposed around the display 11 and a light guide plate that guides the light from the light source so that it enters the display 11.
  • Depending on the usage environment of the operating device 10, the operating device 10 may include neither the backlight 17 nor the front light.
  • The processing unit 13 may accept not only input from the sensor 12 but also input from a mechanical switch. That is, the operating device 10 may be provided with a mechanical switch.
  • In the above description, the recognition target is assumed to be a human finger, but it may be a touch pen.
  • The operating device (10) of the first aspect includes a display (11), a sensor (12), and a processing unit (13).
  • The display (11) can be switched between a display state and a non-display state.
  • The sensor (12) detects the position of a recognition target, in a contact or non-contact manner, within a detection range close to the display (11).
  • The processing unit (13) receives information on the position of the recognition target from the sensor (12). Furthermore, the processing unit (13) instructs the display (11) to enter the non-display state when the sensor (12) does not detect the recognition target.
  • When the display (11) is in the non-display state and the processing unit (13) recognizes a specific movement of the recognition target, the processing unit (13) changes an output state including control content for the device (30) in accordance with the specific movement of the recognition target.
  • With this configuration, the processing unit (13) changes the output state when the recognition target, such as a finger, performs the specific movement. Therefore, the number of operations by the user is reduced compared with the case where the control content is instructed to the device (30) only after the display (11) has been shifted to the display state.
  • In the operating device (10), when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, the processing unit (13) may change the output state so as to instruct the device (30) on the control content associated with the specific movement of the recognition target while keeping the display (11) in the non-display state.
  • Alternatively, when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, the processing unit (13) may change the output state so as to instruct the device (30) on the control content associated with the specific movement of the recognition target and then display the control content on the display (11).
  • Since the control content instructed to the device (30) is displayed on the display (11), the control content can be confirmed.
  • Alternatively, when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, it is desirable that the processing unit (13) display the control content associated with the specific movement of the recognition target on the display (11) before instructing the device (30) on that control content. In this case, it is desirable that the processing unit (13) wait for an input as to whether or not to instruct the device (30) on the control content.
  • In the operating device (10), the sensor (12) may be capable of detecting the position of each of a plurality of recognition targets, and the specific movement of the recognition target may be represented by a combination of the number of recognition targets selected from the plurality of recognition targets and the movement of the selected recognition targets.
  • It is desirable that the processing unit (13) be configured such that the correspondence between the specific movement of the recognition target and the control content can be registered.
  • It is desirable that the processing unit (13) be configured to distinguish, as movements of the recognition target, between a movement in which the recognition target touches the sensor (12) and a movement in which the recognition target moves along the screen of the display (11).
  • The operating device (10) of any one of the first to seventh aspects desirably includes a housing (40) in which the display (11), the sensor (12), and the processing unit (13) are provided.
  • The housing (40) is preferably configured to be attached to a building.
  • The device control system (100) of the ninth aspect includes the operating device (10) of any one of the first to eighth aspects and an output device (20) that controls the device (30) in accordance with the output state of the processing unit (13).
  • With this configuration, the output state of the processing unit (13) can be reflected in the device (30) by the output device (20).
  • In the device control system (100), the output device (20) is preferably a wireless module (21) that communicates with the device (30).
  • Alternatively, the output device (20) may be a relay (22) provided in the power feeding path of the device (30).
  • With the wireless module (21), not only an individual device (30) but also a plurality of devices (30) can be controlled with a single operating device (10). Further, if the device (30) can be controlled simply by switching its power supply path on and off, the operating device can be used in place of a wall switch by using the relay (22).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Selective Calling Equipment (AREA)
  • Air Conditioning Control Device (AREA)

Abstract

Provided are an operation device that reduces the number of user operations, and an apparatus control system. The operation device (10) is provided with a display (11), a sensor (12), and a processing unit (13). The display (11) can be switched between a display state and a non-display state. The sensor (12) detects, in a contacting or contactless manner, the position of an object to be recognized within a detection range in proximity to the display (11). The processing unit (13) receives information about the position of the object to be recognized from the sensor (12). The processing unit (13) also instructs the display (11) to enter the non-display state in the absence of detection of the object to be recognized by the sensor (12). Further, when the display (11) is in the non-display state and a specific motion of the object to be recognized is recognized, the processing unit (13) causes a change in an output state including control content with respect to the apparatus (30) in accordance with the specific motion of the object to be recognized.

Description

Operation device, equipment control system
The present invention relates to an operating device and a device control system. More specifically, the present invention relates to an operating device including a display, and a device control system that controls devices using the operating device.
Patent Document 1 describes a technique for canceling a sleep state when a gesture or pattern produced by a touch event on a touch sensor matches a preset sleep-cancel gesture or pattern.
In the technique described in Patent Document 1, the sleep state is first canceled by a touch event, and the next operation is performed after that. Therefore, the operation of the touch event has to be performed before the intended task can be carried out from the sleep state, which increases the number of operations by the user.
Japanese Translation of PCT International Application Publication No. 2014-501990
An object of the present invention is to provide an operating device and a device control system in which the number of operations by a user is reduced.
An operating device according to one aspect of the present invention includes a display, a sensor, and a processing unit. The display can be switched between a display state and a non-display state. The sensor detects the position of a recognition target, in a contact or non-contact manner, within a detection range close to the display. The processing unit receives information on the position of the recognition target from the sensor. Furthermore, the processing unit instructs the display to enter the non-display state when the sensor does not detect the recognition target. In addition, when the display is in the non-display state and the processing unit recognizes a specific movement of the recognition target, the processing unit changes an output state including control content for a device in accordance with the specific movement of the recognition target.
A device control system according to one aspect of the present invention includes the operating device described above and an output device that controls the device in accordance with the output state of the processing unit.
FIG. 1 is a block diagram illustrating a device control system according to the first embodiment. FIG. 2 is a block diagram illustrating the operating device according to the first embodiment. FIG. 3 is a perspective view illustrating the operating device according to the first embodiment. FIG. 4 is an exploded perspective view showing the operating device according to the first embodiment. FIG. 5 is a diagram illustrating an example when the display is in the display state in the operating device according to the first embodiment. FIG. 6 is a diagram illustrating the non-display state of the display in the operating device according to the first embodiment. FIG. 7 is a diagram illustrating another example when the display is in the display state in the operating device according to the first embodiment. FIG. 8 is a diagram illustrating another example when the display is in the display state in the operating device according to the first embodiment. FIG. 9 is a block diagram illustrating an operating device according to the second embodiment.
(Embodiment 1)
In the present embodiment, a device control system in which devices are controlled with an operating device will be described as an example. As illustrated in FIG. 1, the device control system 100 includes an operation device 10 and an output device 20. The output device 20 is a device that instructs the device 30 on the content of control. In the present embodiment, the output device 20 is provided integrally with the housing 40 (see FIGS. 3 and 4) of the operation device 10.
As shown in FIG. 2, the operating device 10 includes a display 11, a sensor 12, and a processing unit 13. The operating device 10 further includes power supply circuits 14 and 15. The power supply circuit 14 receives AC power (for example, 100 V AC) from a source such as a commercial power supply and outputs DC power. The power supply circuit 15 steps down the voltage of the DC power output by the power supply circuit 14 and outputs DC power at a constant voltage. The power supply circuit 15 is an integrated circuit; for example, a three-terminal regulator is used.
The display 11 is a transmissive liquid crystal display. The sensor 12 is a touch sensor superimposed on the screen of the display 11. The sensor 12 is transparent and plate-shaped, and functions as a position input device that detects the position of the recognition target, in a contact or non-contact manner, within a detection range close to the display 11. In the present embodiment, a capacitive touch sensor capable of multipoint detection is assumed as the sensor 12. However, the sensor 12 may be another type of touch sensor, such as a resistive film type. The recognition target is assumed to be a human finger.
Since the sensor 12 is a capacitive touch sensor and the recognition target is a human finger, the position of the recognition target may be detectable when the recognition target approaches the sensor 12, even if the recognition target is not in contact with the sensor 12. Note that the distance between the sensor 12 and the recognition target at which the sensor 12 can detect the position of the recognition target in a non-contact manner varies depending on environmental conditions. That is, the detection range in which the sensor 12 detects the position of the recognition target varies depending on the environmental conditions. As described above, the sensor 12 can detect the position of the recognition target even when the recognition target is not in contact with the sensor 12; however, to simplify the description, it is assumed below that the sensor 12 detects the position of the recognition target when the recognition target is in contact with the sensor 12.
In FIG. 2, the rectangle surrounding the power supply circuit 14 represents the power supply board 41, and the rectangle surrounding the power supply circuit 15 represents the main board 42. The power supply board 41 has a power supply terminal 411 connected to the AC power supply, and the input terminal of the power supply circuit 14 is connected to the power supply terminal 411. The power supply board 41 and the main board 42 are electrically connected through a connector and a cable. FIG. 2 schematically shows a configuration in which the power supply board 41 and the main board 42 are electrically connected to each other by a connection part 412 provided on the power supply board 41 and a connection part 421 provided on the main board 42.
The main board 42 includes the processing unit 13, the current source 16, and the wireless module 21 in addition to the power supply circuit 15. The processing unit 13 is configured by, for example, a microcontroller including a processor and a memory. The processor executes a program stored in the memory. The processing unit 13 outputs a video signal carrying display content information to the display 11 and receives information on the position of the recognition target from the sensor 12. The processing unit 13 controls the operations of the display 11, the sensor 12, the current source 16, and the wireless module 21 based on the position information received from the sensor 12. The processing unit 13 instructs the display 11 to select between the display state and the non-display state, and instructs the display contents in the display state. In addition, the processing unit 13 can instruct the sensor 12 to select whether to accept or prohibit input.
The current source 16 outputs a drive current to the backlight 17 disposed on the back surface of the display 11. The backlight 17 includes, for example, an LED (Light Emitting Diode) as a light source. The processing unit 13 outputs a luminance signal that determines the luminance of the backlight 17, and the current source 16 outputs a current of the magnitude indicated by the luminance signal to the backlight 17.
The luminance signal switches the luminance of the backlight 17 in at least two stages. One of the two levels of luminance is the luminance at which the backlight 17 is extinguished, and the other is the luminance at which the display content of the display 11 can be visually recognized when the backlight 17 is lit. That is, one luminance is, for example, 0%, and the other luminance is, for example, 70% or more and 100% or less.
By adding a brightness sensor that detects ambient illuminance, the luminance of the backlight 17 may be changed according to the ambient illuminance so as to improve the visibility of the display 11. The processing unit 13 may also be configured so that the user can adjust the luminance of the backlight 17 using the display 11 and the sensor 12. That is, the processing unit 13 may adjust the luminance of the backlight 17 by interactively receiving input from the user through the display 11 and the sensor 12.
The wireless module 21 is configured to perform wireless communication, and is configured as a radio station of a communication standard selected from Wi-Fi (registered trademark), Bluetooth (registered trademark), BLE (Bluetooth Low Energy), and the like, or as a specified low-power radio station. As the communication protocol, for example, ECHONET Lite is adopted. The communication standard and communication protocol mentioned here are merely examples and are not intended to limit the configuration of the wireless module 21. Further, instead of the wireless module 21, a communication interface that performs wired communication, such as power line carrier communication, may be provided.
The operating device 10 is configured to have the same appearance as a wall switch attached to a wall of a building, for example, as shown in FIGS. 3 and 4. The operation device 10 shown in FIGS. 3 and 4 includes a housing 40 that is attached to a wall with a part embedded in an embedding hole formed in the wall. The housing 40 includes a first housing 401 and a second housing 402.
The first housing 401 is formed in a box shape having a rectangular opening 403 on one surface, and the power supply board 41 and the main board 42 are incorporated in it. The first housing 401 includes two mounting plates 404 that extend, along the surface having the opening 403, in directions away from the opening 403. The two mounting plates 404 are provided on two opposite sides of the four sides surrounding the opening 403. The first housing 401 is fixed to the wall with the mounting plates 404 applied around the embedding hole formed in the wall. Since the technique for fixing the first housing 401 to the wall is the same as for a wall switch or the like, its description is omitted.
The second housing 402 is coupled to the first housing 401. A peripheral portion of the base 405 is sandwiched between the first housing 401 and the second housing 402. The display 11 and the sensor 12 are fixed to the base 405. The second housing 402 has a window hole 406 at its center, and the display 11 and the sensor 12 fixed to the base 405 are exposed through the window hole 406 when the second housing 402 is coupled to the first housing 401.
The second housing 402 is a member corresponding to the flush plate (decorative plate) of a wall switch and is coupled to the mounting plates 404 of the first housing 401. The first housing 401 and the second housing 402 are coupled by, for example, holes provided in the mounting plates 404 and claws protruding from the back surface of the second housing 402. When the second housing 402 is coupled to the first housing 401 with the first housing 401 attached to the wall, the periphery of the second housing 402 contacts the wall and covers the periphery of the embedding hole formed in the wall.
 次に、操作装置10の動作を説明する。ここでは、操作装置10による制御対象の機器30が、照明機器と空調機器と電動カーテンとの3種類であると仮定する。操作装置10は、表示器11が表示状態であるとき、表示器11の画面に機器30の種類を表示する。表示器11の画面に機器30の種類が表示された状態では、操作装置10が制御対象とする機器30の種類を操作装置10で選択することができる。 Next, the operation of the controller device 10 will be described. Here, it is assumed that the devices 30 to be controlled by the operation device 10 are three types of lighting devices, air conditioning devices, and electric curtains. When the display device 11 is in the display state, the operation device 10 displays the type of the device 30 on the screen of the display device 11. In the state where the type of the device 30 is displayed on the screen of the display device 11, the type of the device 30 to be controlled by the operating device 10 can be selected by the operating device 10.
 表示器11が表示状態であれば、処理部13は、表示器11の表示内容に応じて、センサ12からの入力を受け付け、センサ12からの入力に応じて出力状態を変化させる。処理部13が出力状態を変化させると、表示器11、センサ12、電流源16、無線モジュール21のうちの少なくとも1つの動作状態が変化する。言い換えると、処理部13の出力状態は、表示器11、センサ12、電流源16、無線モジュール21に指示を与える信号の内容で表される。 If the display unit 11 is in the display state, the processing unit 13 receives an input from the sensor 12 according to the display content of the display unit 11 and changes the output state according to the input from the sensor 12. When the processing unit 13 changes the output state, the operation state of at least one of the display device 11, the sensor 12, the current source 16, and the wireless module 21 changes. In other words, the output state of the processing unit 13 is represented by the contents of signals that give instructions to the display 11, the sensor 12, the current source 16, and the wireless module 21.
 処理部13は、操作装置10が制御対象とする機器30と、無線モジュール21を通して直接又は間接に通信を行う。処理部13が機器30と間接に通信を行うとは、HEMSコントローラ(HEMS:Home Energy Management System)のように機器30の監視及び制御が可能な通信機器との通信を行い、この通信機器を通して機器30と通信することを意味する。 The processing unit 13 communicates directly or indirectly with the device 30 to be controlled by the operating device 10 through the wireless module 21. When the processing unit 13 communicates with the device 30 indirectly, it communicates with a communication device that can monitor and control the device 30 such as a HEMS controller (HEMS: Home Energy Management System), and the device passes through the communication device. 30 means communication.
 処理部13は、センサ12を通して入力された情報に基づいて、機器30への制御内容を定め、無線モジュール21に対して機器30への制御内容に応じた信号を出力する。機器30への制御内容は、制御対象である機器30と、当該制御対象の機器30に指示する動作状態とを含む。無線モジュール21は、操作装置10に対して出力装置20として機能する。この場合、処理部13が制御対象の機器30への制御内容に応じた信号を無線モジュール21に出力することが、処理部13の出力状態が変化することに相当する。 The processing unit 13 determines the control content to the device 30 based on the information input through the sensor 12 and outputs a signal corresponding to the control content to the device 30 to the wireless module 21. The content of control to the device 30 includes the device 30 that is a control target and an operation state that instructs the control target device 30. The wireless module 21 functions as the output device 20 with respect to the operation device 10. In this case, outputting the signal corresponding to the control content to the control target device 30 to the wireless module 21 corresponds to the change in the output state of the processing unit 13.
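As a rough illustration of this flow (not part of the specification; all class and function names below are hypothetical), the processing unit can be pictured as turning a touch input into a control instruction that is handed to the wireless module acting as the output device:

```python
# Minimal sketch under assumed names; the actual device logic is not disclosed here.
from dataclasses import dataclass

@dataclass
class ControlCommand:
    device: str           # controlled device 30, e.g. "air_conditioner"
    action: str           # operating state to instruct, e.g. "set_temperature_delta"
    value: object = None  # optional parameter, e.g. +1

class WirelessModule:
    """Stands in for wireless module 21 acting as output device 20."""
    def send(self, command: ControlCommand) -> None:
        # A real module would transmit to device 30 directly,
        # or indirectly via a HEMS controller.
        print(f"TX: {command}")

class ProcessingUnit:
    """Stands in for processing unit 13."""
    def __init__(self, radio: WirelessModule):
        self.radio = radio

    def on_region_pressed(self, region: str) -> None:
        # Changing the output state = emitting a signal matching the control content.
        if region == "button_temp_up":      # B11
            self.radio.send(ControlCommand("air_conditioner", "set_temperature_delta", +1))
        elif region == "button_temp_down":  # B12
            self.radio.send(ControlCommand("air_conditioner", "set_temperature_delta", -1))

ProcessingUnit(WirelessModule()).on_region_pressed("button_temp_up")
```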
The operation device 10 may display, on the screen of the display 11, the operating status of the device 30 selected through the screen of the display 11. For lighting equipment, the operating status is, for example, whether the light is on or off; for dimmable lighting equipment it may include the dimming level, and for color-tunable lighting equipment it may include the emission color. For air-conditioning equipment, the operating status is whether it is running or stopped and the set temperature, and may further include the airflow direction, the airflow strength, and the like. For an electric curtain, the operating status is whether the curtain is open or closed.
FIG. 5 shows a specific example of a screen displayed on the display 11. While the display 11 is in the display state, the screen of the display 11 shows, for example, three tabs T1, T2, and T3 representing lighting equipment (lighting), air-conditioning equipment (air conditioner), and an electric curtain (curtain). In this state, when the recognition target (a finger) touches the sensor 12 at a position corresponding to one of the three tabs T1, T2, and T3, the device 30 corresponding to the touched tab T1, T2, or T3 becomes the control target of the operation device 10.
FIG. 5 shows an example of the state in which the air-conditioning equipment is selected. The screen of the display 11 shows a field F1 indicating the current set temperature of the air-conditioning equipment and two buttons B11 and B12 for changing the set temperature. With this screen displayed, the set temperature shown in the field F1 is changed when the recognition target touches the sensor 12 in the area corresponding to one of the two buttons B11 and B12. Hereinafter, an operation in which the recognition target touches the sensor 12 in a specific area such as a tab or a button is referred to as "pressing", by analogy with operating a push button.
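Pressing a tab or button reduces to a hit test from the reported touch coordinate to a named screen region. The sketch below is only illustrative; the region names and coordinates are assumptions chosen to match the command-dispatch sketch above:

```python
# Hypothetical screen layout; coordinates are arbitrary and not from the specification.
REGIONS = {
    "tab_lighting":     (0,   0,   80, 40),   # T1: (x, y, width, height)
    "tab_air_con":      (80,  0,   80, 40),   # T2
    "tab_curtain":      (160, 0,   80, 40),   # T3
    "button_temp_up":   (180, 60,  50, 50),   # B11
    "button_temp_down": (180, 120, 50, 50),   # B12
}

def hit_test(x: int, y: int):
    """Return the name of the region that was pressed, or None if no region was hit."""
    for name, (rx, ry, rw, rh) in REGIONS.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return name
    return None
```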
When the sensor 12 is not detecting the recognition target, the processing unit 13 instructs the display 11 to enter the non-display state and turns off the backlight 17. At this time, nothing is displayed on the screen of the display 11, as shown in FIG. 6, but the processing unit 13 can still receive input from the sensor 12.
When the sensor 12 detects the recognition target while the display 11 is in the non-display state, the processing unit 13 recognizes the movement of the recognition target based on its position. That is, the processing unit 13 waits for input from the sensor 12 even while the display 11 is in the non-display state. Since the sensor 12 is capable of multi-point detection, the processing unit 13 can distinguish whether the recognition target is one finger or two fingers. Alternatively, the processing unit 13 may be configured to distinguish three or more fingers. The movements of the recognition target recognized by the processing unit 13 include tap, swipe, pinch-in, pinch-out, and the like. The processing unit 13 may additionally have a function of recognizing a double tap, flick, long press, drag, or the like as a movement of the recognition target.
These movements are well known from smartphones, tablet terminals, and the like. For example, a tap is an operation in which the recognition target touches the sensor 12 only briefly; the processing unit 13 recognizes a tap when the recognition target leaves the sensor 12 within a predetermined time after touching it. A swipe is an operation in which the recognition target slides across the surface of the sensor 12 in one direction. The direction in which the recognition target moves in a swipe is usually one of the vertical or horizontal directions of the screen of the display 11. The processing unit 13 recognizes a swipe when, within a predetermined time after the recognition target contacts the sensor 12, the position of the recognition target starts to change over time and the change in position is essentially linear. The processing unit 13 also distinguishes the direction of the position change of the recognition target.
A pinch-in is an operation of touching the sensor 12 with two fingers (the recognition targets) and then moving them so as to shorten the distance between them; a pinch-out is an operation of touching the sensor 12 with two fingers and then moving them so as to widen the distance between them. The processing unit 13 recognizes a pinch-in or a pinch-out when, within a predetermined time after recognizing that two fingers are in contact with the sensor 12, the two fingers move so that the distance between them shrinks or widens over time. The processing unit 13 recognizes other movements of the recognition target in the same manner.
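A minimal gesture classifier along these lines might look as follows; the thresholds, sample format, and gesture names are assumptions for illustration, not values taken from the embodiment:

```python
import math

TAP_MAX_MS = 200   # assumed "predetermined time" for a tap
MOVE_MIN_PX = 30   # assumed minimum travel to count as movement

def classify(samples):
    """samples: list of (time_ms, [(x, y), ...]) frames reported by the touch sensor."""
    start_t, start_pts = samples[0]
    end_t, end_pts = samples[-1]
    duration = end_t - start_t

    if len(start_pts) == 1 and len(end_pts) == 1:
        (x0, y0), (x1, y1) = start_pts[0], end_pts[0]
        dx, dy = x1 - x0, y1 - y0
        if duration <= TAP_MAX_MS and abs(dx) < MOVE_MIN_PX and abs(dy) < MOVE_MIN_PX:
            return "tap"                       # brief touch with little movement
        if abs(dy) >= MOVE_MIN_PX and abs(dy) >= abs(dx):
            return "swipe_up" if dy < 0 else "swipe_down"
        if abs(dx) >= MOVE_MIN_PX:
            return "swipe_right" if dx > 0 else "swipe_left"
    elif len(start_pts) == 2 and len(end_pts) == 2:
        d_start = math.dist(start_pts[0], start_pts[1])
        d_end = math.dist(end_pts[0], end_pts[1])
        if d_end < d_start - MOVE_MIN_PX:
            return "pinch_in"                  # two fingers moving together
        if d_end > d_start + MOVE_MIN_PX:
            return "pinch_out"                 # two fingers moving apart
    return "unknown"
```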
Now, assume that the processing unit 13 is configured to recognize, as the specific movement, a swipe in which the recognition target moves upward in the vertical direction of the screen of the display 11 (of the vertical and horizontal directions). That is, in FIG. 6, the operation of moving the recognition target upward as indicated by arrow A is the specific movement of the recognition target. When the sensor 12 detects the recognition target and the processing unit 13 recognizes the specific movement of the recognition target while the display 11 is in the non-display state, the processing unit 13 changes its output state in accordance with the specific movement of the recognition target.
Here, assume that the specific movement of the recognition target is an upward swipe of the recognition target. In response to this specific movement, the processing unit 13 changes its output state so as to display, for example, the screen of FIG. 5. That is, when the display 11 is in the non-display state as in FIG. 6 and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 changes its output state so as to output to the display 11 a video signal representing the display content, so that the screen of FIG. 5 appears on the display 11. At the same time, the processing unit 13 changes the output state of the luminance signal to the current source 16 so that the backlight 17 is lit. The set temperature displayed in the field F1 on the screen of the display 11 may be stored in the processing unit 13, or may be acquired directly or indirectly from the device 30 through the wireless module 21.
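Sketched as code (hypothetical names; assuming "swipe_up" stands for the specific movement), the wake-up behaviour amounts to changing the signals sent to the display and to the current source when the gesture arrives while the screen is dark:

```python
class Display:
    """Stands in for display 11."""
    def __init__(self):
        self.visible = False
    def show(self, screen: str) -> None:
        self.visible = True
        print(f"display: {screen}")

class Backlight:
    """Stands in for backlight 17 driven by current source 16."""
    def set_on(self, on: bool) -> None:
        print(f"backlight {'on' if on else 'off'}")

def on_gesture_while_hidden(gesture: str, display: Display, backlight: Backlight) -> None:
    # Specific movement -> change the output state: a video signal to the display
    # and a luminance signal to the current source.
    if not display.visible and gesture == "swipe_up":
        backlight.set_on(True)
        display.show("air_conditioner_setpoint_screen")  # e.g. the screen of FIG. 5

on_gesture_while_hidden("swipe_up", Display(), Backlight())
```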
In the operation described above, when the processing unit 13 recognizes the specific movement of the recognition target while the display 11 is in the non-display state, the processing unit 13 changes its output state so that a screen for instructing control content to a specific device 30 (the air-conditioning equipment in FIG. 5) is displayed on the display 11.
When the processing unit 13 recognizes the specific movement of the recognition target while the display 11 is in the non-display state, the processing unit 13 may instead instruct the control content to the device 30 without shifting the display 11 to the display state. For example, if the device 30 associated with the specific movement of the recognition target is lighting equipment, the processing unit 13 changes its output state toward the wireless module 21 so as to instruct the lighting equipment to turn on in response to the specific movement. In this case, the processing unit 13 does not change its output state toward the display 11 and the current source 16, and the display 11 is kept in the non-display state.
Alternatively, when the processing unit 13 recognizes the specific movement of the recognition target while the display 11 is in the non-display state, the processing unit 13 may be configured to instruct the control content to the device 30 associated with the specific movement and then display on the display 11 the control content instructed to the device 30. For example, in response to the specific movement of the recognition target, the processing unit 13 changes its output state toward the wireless module 21 so as to instruct the air-conditioning equipment to change its set temperature. In this example, the movements of the recognition target may be associated with raising and lowering the set temperature, for instance raising the set temperature of the air-conditioning equipment by 1°C for an upward swipe and lowering it by 1°C for a downward swipe.
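Under that assumption, the association from specific movements to control contents can be pictured as a small lookup table. The entries and names below are illustrative only and reuse the `ControlCommand` and `WirelessModule` types from the earlier sketch:

```python
# Hypothetical gesture-to-control table; the display may stay dark while sending.
GESTURE_TO_CONTROL = {
    "swipe_up":   ("air_conditioner", "set_temperature_delta", +1),
    "swipe_down": ("air_conditioner", "set_temperature_delta", -1),
}

def dispatch(gesture: str, radio: "WirelessModule") -> None:
    entry = GESTURE_TO_CONTROL.get(gesture)
    if entry is not None:
        device, action, value = entry
        radio.send(ControlCommand(device, action, value))
```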
In this operation, the control content is instructed to the device 30 by the specific movement of the recognition target even though the display 11 is in the non-display state. The control content that the processing unit 13 instructs to the device 30 may therefore turn out not to be what the user intended. For this reason, as shown in FIG. 7, when the processing unit 13 displays on the display 11 the content confirming the operation of the device 30 (in FIG. 7, the text "Set to +1°C (25°C)"), it may also display a button B13 for canceling the instructed control content. On the screen shown in FIG. 7, when the control content does not match the user's intention, the user can cancel the instructed control content by pressing the button B13.
When the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 may cause the display 11 to show buttons B14 and B15 for selecting whether the control content may be instructed to the device 30, as shown in FIG. 8. That is, the processing unit 13 may be configured to wait for input from the sensor 12 as to whether the control content corresponding to the specific movement of the recognition target should be instructed to the device 30. As in the operation example shown in FIG. 7, the processing unit 13 associates the specific movement of the recognition target with instructing specific control content to a specific device 30. However, when the processing unit 13 recognizes the specific movement of the recognition target, it does not immediately give an instruction to the device 30; instead, it displays on the display 11 the control content for the device 30 together with the buttons B14 and B15 that let the user select whether to adopt that control content. In FIG. 8, the control content to be instructed to the device 30 is the text "Set to +1°C (25°C)?". The processing unit 13 adopts the control content when the button B14 is pressed and discards it when the button B15 is pressed.
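A confirm-before-sending variant along the lines of FIG. 8 could be sketched as below, continuing the earlier sketches; all names are assumptions, and the instruction is only transmitted once the "adopt" button is pressed:

```python
def propose_control(gesture: str, display: "Display", backlight: "Backlight"):
    """Show the pending control content instead of sending it immediately."""
    entry = GESTURE_TO_CONTROL.get(gesture)
    if entry is None:
        return None
    device, action, value = entry
    backlight.set_on(True)
    display.show(f"Apply {action} {value:+} to {device}?  [Yes=B14] [No=B15]")
    return entry  # kept pending until the user answers

def confirm_control(pending, button: str, radio: "WirelessModule") -> None:
    if pending is not None and button == "B14":   # adopt the control content
        device, action, value = pending
        radio.send(ControlCommand(device, action, value))
    # pressing B15 discards the pending control content; nothing is transmitted
```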
The processing unit 13 may perform the same operations as in the non-display state not only when the display 11 is in the non-display state but also when it is in the display state, upon recognizing the specific movement of the recognition target. Specifically, even while the display 11 is in the display state, when the processing unit 13 recognizes the specific movement of the recognition target, it may instruct the specific control content associated with that movement to the specific device 30 without switching the screen of the display 11. Alternatively, when the processing unit 13 recognizes the specific movement of the recognition target, it may switch the screen of the display 11 and cause the display 11 to show a screen, such as that of FIG. 5, that allows the specific device 30 to be controlled. As a further alternative, when the processing unit 13 recognizes the specific movement of the recognition target, it may switch the screen of the display 11 and, after instructing the control content to the device 30, cause the display 11 to show a screen confirming the control content as in FIG. 7. Although the screen shown in FIG. 7 includes the button B13 for canceling the instruction to the device 30, the button B13 may be omitted.
When the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the devices 30 to which the control content is instructed may be plural in number or of plural types. For example, a specific movement can be associated with control content for the devices 30 such that a plurality of devices 30 are turned off all at once.
The various operations described above, performed when the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target based on information from the sensor 12, may be selected according to the type of movement of the recognition target. As described in the example, the processing unit 13 can raise the set temperature of the air-conditioning equipment by 1°C with an upward swipe and lower it by 1°C with a downward swipe. Similarly, the processing unit 13 can turn on the lighting equipment with an upward swipe and turn it off with a downward swipe. Alternatively, the type of operation of the recognition target can be associated with the control content for the device 30, and the correspondence with the control content for the device 30 may differ between the case where the recognition target is one finger and the case where it is two fingers. For example, the control content for the electric curtain may be associated with the movement of the recognition target such that an upward swipe with two fingers opens the electric curtain and a downward swipe with two fingers closes it.
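One way to picture such a mapping is to key the lookup table on the combination of finger count and movement; the pairings below merely restate the examples in this paragraph with assumed names:

```python
# (finger_count, movement) -> (device, action, value); illustrative entries only.
MULTI_GESTURE_TO_CONTROL = {
    (1, "swipe_up"):   ("air_conditioner", "set_temperature_delta", +1),
    (1, "swipe_down"): ("air_conditioner", "set_temperature_delta", -1),
    (2, "swipe_up"):   ("electric_curtain", "open", None),
    (2, "swipe_down"): ("electric_curtain", "close", None),
}
```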
Furthermore, the type of operation that the processing unit 13 performs on the display 11 may be varied according to the control content for the device 30. For example, when the result of the operation of the device 30 can be recognized without relying on the operation device 10, as with lighting equipment, the processing unit 13 may keep the display 11 in the non-display state in response to the specific movement of the recognition target. On the other hand, when the content of the instruction to the device 30 cannot be confirmed without the display 11, as with the set temperature of the air-conditioning equipment, the processing unit 13 may operate so as to show the set temperature or the like on the display 11.
It is desirable that the correspondence between the movements of the recognition target and the control contents for the device 30 can be registered in the processing unit 13 through the display 11 and the sensor 12. That is, it is desirable that the user can register, through the display 11 and the sensor 12, a desired correspondence between the movements of the recognition target and the control contents for the device 30 in the processing unit 13. It is also desirable that the processing unit 13 be configured so that the user can register not only the control content for the device 30 but also the type of operation that the processing unit 13 performs on the display 11 in response to the movement of the recognition target.
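Registration could then amount to letting the user edit that table through the touch screen. The sketch below, which builds on the table from the previous example, is one possible shape for such a registry; every name in it is an assumption:

```python
class GestureRegistry:
    """Holds user-editable correspondences between movements and control contents."""
    def __init__(self):
        self.table = dict(MULTI_GESTURE_TO_CONTROL)   # start from the defaults
        self.display_action = {}                      # what to do with the display

    def register(self, finger_count: int, movement: str,
                 device: str, action: str, value=None,
                 display_behaviour: str = "stay_hidden") -> None:
        key = (finger_count, movement)
        self.table[key] = (device, action, value)
        self.display_action[key] = display_behaviour  # e.g. "stay_hidden", "confirm"

registry = GestureRegistry()
registry.register(1, "swipe_left", "lighting", "turn_off", display_behaviour="stay_hidden")
```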
The housing 40 described in this embodiment has the same dimensions as a wiring device so that it can be attached to a wall using the same attachment member as a wall switch. It is therefore possible to replace an existing wall switch with the operation device 10. However, this is not intended to limit the dimensions and shape of the housing 40; they can be changed as appropriate depending on the design and the like. Moreover, the operation device 10 does not have to be attached to a wall; it may be configured to be attached to a place other than a wall in a building, or it may be portable. Furthermore, the operations described above can also be realized by executing an appropriate application program (a so-called app) on a portable computer equipped with a touch panel and a wireless communication function, such as a smartphone or a tablet terminal.
(Embodiment 2)
As shown in FIG. 9, the operation device 10 of this embodiment has a relay 22 added as an output device 20 in addition to the configuration of Embodiment 1. With the addition of the relay 22, a load terminal 413 is added to the power supply board 41. The relay 22 is provided on the power supply board 41 and selects between conduction (on) and non-conduction (off) of the electrical path between the power supply terminal 411 and the load terminal 413 provided on the power supply board 41. When the relay 22 is an electromagnetic relay, the contact of the relay 22 is connected between the power supply terminal 411 and the load terminal 413. The operation of the relay 22 is controlled by the processing unit 13. The relay 22 may be a semiconductor relay.
A device 30 such as lighting equipment is connected to the load terminal 413. That is, the relay 22 is provided in the power supply path to the device 30. Therefore, by controlling the relay 22, the processing unit 13 selects between a state in which power is supplied to the device 30 and a state in which it is not. The relay 22 may also be inserted in the power supply path of a device 30 that does not operate at all times, such as an electric curtain; that is, the processing unit 13 may instruct the relay 22 to turn on only when power supply to the device 30 is necessary. The other configurations and operations of this embodiment are the same as those of Embodiment 1.
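For this embodiment, the control content for a load on the relayed feed path reduces to opening or closing the relay. A minimal sketch follows, with assumed names and no real hardware access:

```python
class Relay:
    """Stands in for relay 22 between power supply terminal 411 and load terminal 413."""
    def __init__(self):
        self.closed = False
    def set(self, on: bool) -> None:
        self.closed = on
        print(f"relay {'closed' if on else 'open'}: load terminal "
              f"{'energised' if on else 'disconnected'}")

def switch_load(relay: Relay, turn_on: bool) -> None:
    # For a device controlled purely through its power feed (e.g. a luminaire),
    # closing or opening the relay is the whole control content.
    relay.set(turn_on)

switch_load(Relay(), True)
```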
In this embodiment, the wireless module 21 is not an essential component as the output device 20. The operation device 10 of this embodiment may include only the relay 22 as the output device 20.
(Modification)
Although a liquid crystal display has been given as an example of the display 11, it may be replaced by an organic EL display (EL: Electroluminescence), electronic paper, or the like. When the display 11 is self-luminous, as with an organic EL display, the backlight 17 is unnecessary. When a reflective liquid crystal display, electronic paper, or the like is adopted as the display 11, a front light is placed over the front surface of the display 11 instead of the backlight 17. The front light includes a light source arranged around the display 11 and a light guide plate that guides the light from the light source so that it enters the display 11. When a reflective liquid crystal display, electronic paper, or the like is adopted, the operation device 10 may include neither the backlight 17 nor a front light, depending on the environment in which the operation device 10 is used.
In the operation device 10, the processing unit 13 may accept not only input from the sensor 12 but also input from a mechanical switch; that is, a mechanical switch may be added to the operation device 10. The recognition target is assumed to be a human finger, but it may be a stylus.
(Summary)
As described above, the operation device (10) of the first aspect includes a display (11), a sensor (12), and a processing unit (13). A display state and a non-display state are selected for the display (11). The sensor (12) detects the position of a recognition target by a contact or non-contact method within a detection range close to the display (11). The processing unit (13) receives information on the position of the recognition target from the sensor (12). The processing unit (13) instructs the display (11) to enter the non-display state when the sensor (12) is not detecting the recognition target. When the display (11) is in the non-display state and a specific movement of the recognition target is recognized, the processing unit (13) changes an output state that includes control content for a device (30) in accordance with the specific movement of the recognition target.
That is, even while the display (11) is in the non-display state, the processing unit (13) changes its output state when a recognition target such as a finger performs the specific movement. The number of operations performed by the user is therefore reduced compared with the case where the control content is instructed to the device (30) only after the display (11) has been shifted to the display state.
In the operation device (10) of the second aspect, according to the first aspect, when the display (11) is in the non-display state and the specific movement of the recognition target is recognized, the processing unit (13) may change the output state so as to instruct the device (30) with the control content associated with the specific movement of the recognition target while keeping the display (11) in the non-display state.
That is, for the specific movement of the recognition target, the control content can be instructed to the device (30) while the display (11) is kept in the non-display state, so the power consumption required to start up the display (11) is reduced.
In the operation device (10) of the third aspect, according to the first aspect, when the display (11) is in the non-display state and the specific movement of the recognition target is recognized, the processing unit (13) may change the output state so as to instruct the device (30) with the control content associated with the specific movement of the recognition target, and then cause the display to show the control content.
That is, since the control content instructed to the device (30) is displayed on the display (11), the control content can be confirmed.
In the operation device (10) of the fourth aspect, according to the first aspect, it is desirable that, when the display (11) is in the non-display state and the specific movement of the recognition target is recognized, the processing unit (13) causes the display (11) to show the control content associated with the specific movement of the recognition target before instructing the device (30) with that control content. In this case, it is desirable that the processing unit (13) waits for an input as to whether or not to instruct the device (30) with the control content.
That is, instead of immediately instructing the control content to the device (30) in response to the specific movement of the recognition target, the user can be asked to confirm whether the control content should be instructed to the device (30).
In the operation device (10) of the fifth aspect, according to any one of the first to fourth aspects, the sensor (12) may be capable of detecting the respective positions of a plurality of recognition targets, and the specific movement of the recognition target may be expressed by a combination of the number of recognition targets selected from the plurality of recognition targets and the movement of the recognition targets selected from the plurality of recognition targets.
With this configuration, the number of types of movement of the recognition target that the processing unit (13) can recognize increases.
In the operation device (10) of the sixth aspect, according to any one of the first to fifth aspects, it is desirable that the processing unit (13) is configured so that the correspondence between the specific movement of the recognition target and the control content can be registered.
That is, the control content for the device (30) associated with the specific movement of the recognition target can be changed or added.
In the operation device (10) of the seventh aspect, according to any one of the first to sixth aspects, it is desirable that the processing unit (13) is configured to distinguish between a movement in which the recognition target touches the sensor (12) and a movement in which the recognition target moves along the screen of the display (11).
With this configuration, the number of types of movement of the recognition target that the processing unit (13) can recognize increases.
In the operation device (10) of the eighth aspect, according to any one of the first to seventh aspects, it is desirable that the operation device (10) includes a housing (40) in which the display (11), the sensor (12), and the processing unit (13) are provided, and that the housing (40) is configured to be attached to a building.
That is, by attaching the housing (40) to a building, the operation device (10) can be used like a wall switch to control the device (30).
The apparatus control system (100) of the ninth aspect includes the operation device (10) of any one of the first to eighth aspects and an output device (20) that controls the device (30) in accordance with the output state of the processing unit (13).
That is, the output device (20) allows the output state of the processing unit (13) to be adapted to the device (30).
In the apparatus control system (100) of the tenth aspect, according to the ninth aspect, it is desirable that the output device (20) is a wireless module (21) that communicates with the device (30). In the apparatus control system (100) of the eleventh aspect, according to the ninth aspect, the output device (20) may be a relay (22) provided in the power supply path of the device (30).
That is, in the configuration including the wireless module (21), a single operation device (10) can control not only an individual device (30) but a plurality of devices (30). Furthermore, for a device (30) that can be controlled simply by turning its power supply path on and off, use of the relay (22) allows the operation device (10) to be used in place of a wall switch.
The embodiments described above are only some of the various embodiments of the present invention. The embodiments described above can be modified in various ways according to the design and the like, as long as the object of the present invention can be achieved.
DESCRIPTION OF SYMBOLS
10 operation device
11 display
12 sensor
13 processing unit
20 output device
21 wireless module
22 relay
30 device
40 housing
100 apparatus control system

Claims (11)

  1.  An operation device comprising:
     a display for which a display state and a non-display state are selected;
     a sensor that detects a position of a recognition target by a contact or non-contact method within a detection range close to the display; and
     a processing unit that receives information on the position of the recognition target from the sensor,
     wherein the processing unit
      instructs the display to enter the non-display state when the sensor is not detecting the recognition target, and
      when the display is in the non-display state and a specific movement of the recognition target is recognized, changes an output state including control content for a device in accordance with the specific movement of the recognition target.
  2.  The operation device according to claim 1, wherein, when the display is in the non-display state and the specific movement of the recognition target is recognized, the processing unit changes the output state so as to instruct the device with the control content associated with the specific movement of the recognition target while keeping the display in the non-display state.
  3.  The operation device according to claim 1, wherein, when the display is in the non-display state and the specific movement of the recognition target is recognized, the processing unit changes the output state so as to instruct the device with the control content associated with the specific movement of the recognition target, and then causes the display to show the control content.
  4.  The operation device according to claim 1, wherein, when the display is in the non-display state and the specific movement of the recognition target is recognized, the processing unit causes the display to show the control content associated with the specific movement of the recognition target before instructing the device with the control content, and waits for an input as to whether or not to instruct the device with the control content.
  5.  The operation device according to any one of claims 1 to 4, wherein the sensor is capable of detecting respective positions of a plurality of recognition targets, and the specific movement of the recognition target is expressed by a combination of the number of recognition targets selected from the plurality of recognition targets and the movement of the recognition targets selected from the plurality of recognition targets.
  6.  The operation device according to any one of claims 1 to 5, wherein the processing unit is configured so that a correspondence between the specific movement of the recognition target and the control content is registered.
  7.  The operation device according to any one of claims 1 to 6, wherein the processing unit is configured to distinguish between a movement in which the recognition target touches the sensor and a movement in which the recognition target moves along a screen of the display.
  8.  The operation device according to any one of claims 1 to 7, further comprising a housing in which the display, the sensor, and the processing unit are provided, wherein the housing is configured to be attached to a building.
  9.  An apparatus control system comprising:
     the operation device according to any one of claims 1 to 8; and
     an output device that controls the device in accordance with the output state of the processing unit.
  10.  The apparatus control system according to claim 9, wherein the output device is a wireless module that communicates with the device.
  11.  The apparatus control system according to claim 9, wherein the output device is a relay provided in a power supply path of the device.
PCT/JP2018/010591 2017-03-30 2018-03-16 Operation device and apparatus control system WO2018180635A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-069217 2017-03-30
JP2017069217A JP2018170747A (en) 2017-03-30 2017-03-30 Operation device, equipment control system

Publications (1)

Publication Number Publication Date
WO2018180635A1 true WO2018180635A1 (en) 2018-10-04

Family

ID=63675877

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010591 WO2018180635A1 (en) 2017-03-30 2018-03-16 Operation device and apparatus control system

Country Status (3)

Country Link
JP (1) JP2018170747A (en)
TW (1) TW201837658A (en)
WO (1) WO2018180635A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7355504B2 (en) * 2019-02-19 2023-10-03 株式会社ジャパンディスプレイ detection device
JP7392556B2 (en) 2020-04-06 2023-12-06 株式会社デンソーウェーブ air conditioning controller

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012198419A (en) * 2011-03-22 2012-10-18 Panasonic Corp Liquid crystal display controller
US20130065648A1 (en) * 2011-09-08 2013-03-14 Hyungjung KIM Mobile terminal and control method for the same
JP2014017735A (en) * 2012-07-10 2014-01-30 Toshiba Corp Information processing terminal and information processing method
JP2016107796A (en) * 2014-12-05 2016-06-20 株式会社日立製作所 Terminal device for train operation management system
JP2016527625A (en) * 2013-06-26 2016-09-08 グーグル インコーポレイテッド Method, system, and medium for controlling a remote device using a touch screen of a mobile device in a display inhibited state

Also Published As

Publication number Publication date
TW201837658A (en) 2018-10-16
JP2018170747A (en) 2018-11-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18775949; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18775949; Country of ref document: EP; Kind code of ref document: A1)