WO2018180635A1 - Operation device and device control system - Google Patents

Operation device and device control system

Info

Publication number
WO2018180635A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
recognition target
processing unit
sensor
specific movement
Prior art date
Application number
PCT/JP2018/010591
Other languages
English (en)
Japanese (ja)
Inventor
賢志 喜多村
橋本 勝
英亮 山口
隼也 酒見
Original Assignee
パナソニックIpマネジメント株式会社
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2018180635A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24F: AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00: Control or safety arrangements
    • F24F11/50: Control or safety arrangements characterised by user interfaces or communication
    • F24F11/52: Indication arrangements, e.g. displays
    • F24F11/523: Indication arrangements, e.g. displays, for displaying temperature data
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24F: AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00: Control or safety arrangements
    • F24F11/50: Control or safety arrangements characterised by user interfaces or communication
    • F24F11/56: Remote control
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • The present invention relates to an operation device and a device control system. More specifically, the present invention relates to an operation device including a display, and to a device control system that controls a device using the operation device.
  • Patent Document 1 describes a technology for canceling sleep when a gesture or pattern due to a touch event on a touch sensor matches a sleep cancel setting gesture or pattern.
  • An object of the present invention is to provide an operation device and a device control system in which the number of operations by a user is reduced.
  • the operating device includes a display, a sensor, and a processing unit.
  • The display is switchable between a display state and a non-display state.
  • The sensor detects the position of a recognition target, in a contact or non-contact manner, within a detection range close to the display.
  • The processing unit receives information on the position of the recognition target from the sensor. Furthermore, the processing unit instructs the display to enter the non-display state when the sensor does not detect the recognition target.
  • When the display is in the non-display state and the processing unit recognizes a specific movement of the recognition target, the processing unit changes its output state, which includes the control content for the device, according to the specific movement.
  • a device control system includes the operation device described above and an output device that controls the device in accordance with the output state of the processing unit.
  • FIG. 1 is a block diagram illustrating a device control system according to the first embodiment.
  • FIG. 2 is a block diagram illustrating the operating device according to the first embodiment.
  • FIG. 3 is a perspective view illustrating the operating device according to the first embodiment.
  • FIG. 4 is an exploded perspective view showing the operating device according to the first embodiment.
  • FIG. 5 is a diagram illustrating an example when the display device is in the display state in the operation device according to the first embodiment.
  • FIG. 6 is a diagram illustrating a non-display state of the display device in the operation device according to the first embodiment.
  • FIG. 7 is a diagram illustrating another example when the display device is in the display state in the operation device according to the first embodiment.
  • FIG. 8 is a diagram illustrating another example when the display device is in the display state in the operating device according to the first embodiment.
  • FIG. 9 is a block diagram illustrating an operating device according to the second embodiment.
  • the device control system 100 includes an operation device 10 and an output device 20.
  • the output device 20 is a device that instructs the device 30 about the contents of control.
  • the output device 20 is provided integrally with the housing 40 (see FIGS. 3 and 4) of the operation device 10.
  • the operating device 10 includes a display 11, a sensor 12, and a processing unit 13. Furthermore, the operating device 10 includes power supply circuits 14 and 15.
  • the power supply circuit 14 receives an AC power supply (for example, AC 100V) such as a commercial power supply and outputs DC power.
  • the power supply circuit 15 steps down the voltage of the DC power output from the power supply circuit 14 and outputs DC power with a constant voltage.
  • the power supply circuit 15 is an integrated circuit, and for example, a three-terminal regulator is used.
  • the display 11 is a transmissive liquid crystal display.
  • the sensor 12 is a touch sensor that is superimposed on the screen of the display 11.
  • the sensor 12 is transparent and plate-shaped, and functions as a position input device that detects the position of the recognition target in a contact or non-contact manner within a detection range close to the display 11.
  • a capacitive touch sensor capable of multipoint detection is assumed as the sensor 12.
  • the sensor 12 may be another type of touch sensor such as a resistive film type.
  • the recognition target is assumed to be a human finger.
  • The sensor 12 may be able to detect the position of the recognition target when the recognition target approaches the sensor 12, even if the recognition target is not in contact with the sensor 12.
  • the distance between the sensor 12 and the recognition target when the sensor 12 detects the position of the recognition target in a non-contact manner varies depending on environmental conditions. That is, the detection range in which the sensor 12 detects the position of the recognition target varies depending on the environmental conditions.
  • In this way, the sensor 12 can detect the position of the recognition target even when the recognition target is not in contact with it.
  • In the following description, however, it is assumed that the sensor 12 detects the position of the recognition target while the recognition target is in contact with the sensor 12.
  • a square surrounding the power supply circuit 14 represents the power supply board 41
  • a square surrounding the power supply circuit 15 represents the main board 42.
  • the power supply board 41 has a power supply terminal 411 connected to an AC power supply, and the input terminal of the power supply circuit 14 is connected to the power supply terminal 411.
  • the power supply board 41 and the main board 42 are electrically connected through a connector and a cable.
  • FIG. 2 schematically shows a configuration in which the power supply board 41 and the main board 42 are electrically connected to each other by the connection part 412 provided on the power supply board 41 and the connection part 421 provided on the main board 42.
  • the main board 42 includes the processing unit 13, the current source 16, and the wireless module 21 in addition to the power supply circuit 15.
  • the processing unit 13 is configured by a microcontroller including a processor and a memory, for example.
  • the processor executes a program stored in the memory.
  • the processing unit 13 outputs a video signal having display content information to the display 11 and receives information on the position of the recognition target from the sensor 12.
  • the processing unit 13 controls operations of the display 11, the sensor 12, the current source 16, and the wireless module 21 based on the position information received from the sensor 12.
  • the processing unit 13 instructs the display 11 to select a display state and a non-display state, and instructs display contents in the display state. In addition, the processing unit 13 can instruct the sensor 12 to select whether to accept or prohibit input.
  • the current source 16 outputs a drive current to the backlight 17 disposed on the back surface of the display 11.
  • the backlight 17 includes, for example, an LED (Light Emitting Diode) as a light source.
  • the processing unit 13 outputs a luminance signal that determines the luminance of the backlight 17, and the current source 16 outputs a current having a magnitude indicated by the luminance signal to the backlight 17.
  • the luminance signal switches the luminance of the backlight 17 in at least two stages.
  • One of the two levels of luminance is the luminance at which the backlight 17 is extinguished, and the other of the two levels of luminance is the luminance at which the display content of the display 11 can be visually recognized when the backlight 17 is turned on. That is, one luminance is, for example, 0%, and the other luminance is, for example, 70% or more and 100% or less.
  • A brightness sensor that detects the ambient illuminance may be added, and the luminance of the backlight 17 may be changed according to the ambient illuminance so as to improve the visibility of the display 11.
  • the processing unit 13 may be configured so that the user can adjust the luminance of the backlight 17 using the display 11 and the sensor 12. That is, the processing unit 13 may adjust the luminance of the backlight 17 by interactively receiving input from the user using the display 11 and the sensor 12.
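The two-stage luminance signal and the optional ambient-illuminance refinement described above can be sketched as follows. This is an illustrative sketch only: the class and method names, the 85 % "on" level, and the lux-to-luminance scaling are assumptions, not part of the patent; only the 0 % off level and the 70-100 % legible band come from the text.

```python
# Hedged sketch of the two-stage backlight luminance control described above.
# BacklightController, set_display_state, adjust_for_ambient are illustrative names.

OFF_LUMINANCE = 0   # percent: backlight extinguished (non-display state)
ON_LUMINANCE = 85   # percent: within the 70-100 % band where content is legible

class BacklightController:
    def __init__(self):
        self.luminance = OFF_LUMINANCE

    def set_display_state(self, displaying: bool) -> int:
        """Return the luminance signal handed to the current source 16."""
        self.luminance = ON_LUMINANCE if displaying else OFF_LUMINANCE
        return self.luminance

    def adjust_for_ambient(self, ambient_lux: float) -> int:
        """Optional refinement: scale the 'on' luminance with ambient illuminance."""
        if self.luminance == OFF_LUMINANCE:
            return OFF_LUMINANCE
        # Brighter surroundings call for a brighter backlight; clamp to 70-100 %.
        self.luminance = max(70, min(100, int(70 + ambient_lux / 50)))
        return self.luminance
```

The current source 16 would then output a drive current proportional to the returned percentage.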
  • the wireless module 21 is configured to perform wireless communication.
  • The wireless module 21 is configured as a radio station of a communication standard selected from Wi-Fi (registered trademark), Bluetooth (registered trademark), BLE (Bluetooth Low Energy), and the like, for example as a low-power radio station.
  • As the communication protocol, for example, ECHONET Lite is adopted.
  • the communication standard and communication protocol used by the wireless module 21 shown here are merely examples, and the configuration of the wireless module 21 is not intended to be limited. Further, instead of the wireless module 21, a communication interface for performing wired communication such as power line carrier communication may be provided.
  • the operating device 10 is configured to have the same appearance as a wall switch attached to a wall of a building, for example, as shown in FIGS.
  • the operation device 10 shown in FIGS. 3 and 4 includes a housing 40 that is attached to a wall in a state where a part is embedded in an embedding hole formed in the wall.
  • the housing 40 includes a first housing 401 and a second housing 402.
  • the first housing 401 is formed in a box shape having a rectangular opening 403 on one surface, and the power supply board 41 and the main board 42 are incorporated therein.
  • the first housing 401 includes two mounting plates 404 that extend in a direction away from the opening 403 along one surface having the opening 403.
  • the two mounting plates 404 are provided on two opposite sides of the four sides surrounding the opening 403.
  • the first housing 401 is fixed to the wall in a state where the mounting plate 404 is applied around the embedding hole formed in the wall. Since the technique for fixing the first housing 401 to the wall is the same as that of the wall switch or the like, description thereof is omitted.
  • the second casing 402 is coupled to the first casing 401.
  • a peripheral portion of the base 405 is sandwiched between the first housing 401 and the second housing 402.
  • the display 11 and the sensor 12 are fixed to the base 405.
  • The second housing 402 has a window hole 406 at the center, and the display 11 and the sensor 12 fixed to the base 405 are exposed through the window hole 406 when the second housing 402 is coupled to the first housing 401.
  • the second casing 402 is a member corresponding to a flash plate (decorative plate) in the wall switch, and is coupled to the mounting plate 404 of the first casing 401.
  • the first housing 401 and the second housing 402 are coupled by, for example, a hole provided in the mounting plate 404 and a claw protruding from the back surface of the second housing 402.
  • The devices 30 to be controlled by the operation device 10 are of three types: lighting devices, air conditioning devices, and electric curtains.
  • The operation device 10 displays the types of the devices 30 on the screen of the display 11. While the types of the devices 30 are displayed on the screen of the display 11, the type of device 30 to be controlled can be selected through the operation device 10.
  • the processing unit 13 receives an input from the sensor 12 according to the display content of the display unit 11 and changes the output state according to the input from the sensor 12.
  • When the processing unit 13 changes the output state, the operation state of at least one of the display 11, the sensor 12, the current source 16, and the wireless module 21 changes.
  • the output state of the processing unit 13 is represented by the contents of signals that give instructions to the display 11, the sensor 12, the current source 16, and the wireless module 21.
  • the processing unit 13 communicates directly or indirectly with the device 30 to be controlled by the operating device 10 through the wireless module 21.
  • When the processing unit 13 communicates with the device 30 indirectly, it communicates with a communication device that can monitor and control the device 30, such as a HEMS (Home Energy Management System) controller, and communicates with the device 30 through that communication device.
  • the processing unit 13 determines the control content to the device 30 based on the information input through the sensor 12 and outputs a signal corresponding to the control content to the device 30 to the wireless module 21.
  • the content of control to the device 30 includes the device 30 that is a control target and an operation state that instructs the control target device 30.
  • The wireless module 21 functions as the output device 20 of the operation device 10. In this case, outputting to the wireless module 21 the signal corresponding to the control content for the control target device 30 corresponds to the change in the output state of the processing unit 13.
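The flow just described, in which the processing unit determines the control content (target device plus operation state) and hands it to the output device, can be sketched as below. All names here (ControlContent, WirelessModule, ProcessingUnit, send, instruct) are illustrative stand-ins, not APIs from the patent.

```python
# Illustrative sketch of the processing unit's output-state change: it builds
# control content (target device + operation state) and passes it to the
# output device (standing in for wireless module 21).

from dataclasses import dataclass

@dataclass
class ControlContent:
    target_device: str   # e.g. "air_conditioner"
    operation: dict      # e.g. {"set_temperature": 25}

class WirelessModule:
    """Stands in for wireless module 21 (a Wi-Fi / Bluetooth / BLE station)."""
    def __init__(self):
        self.sent = []

    def send(self, content: ControlContent):
        self.sent.append(content)

class ProcessingUnit:
    def __init__(self, output_device: WirelessModule):
        self.output_device = output_device

    def instruct(self, device: str, operation: dict):
        # Changing the output state == emitting a signal for the target device.
        self.output_device.send(ControlContent(device, operation))

pu = ProcessingUnit(WirelessModule())
pu.instruct("air_conditioner", {"set_temperature": 25})
```

With an indirect path (e.g. via a HEMS controller), only the `send` implementation would change; the processing unit's side stays the same.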
  • The operation device 10 may display, on the screen of the display 11, the operation status of the device 30 selected through the screen. For a lighting device, for example, the operation status differs between lighting and extinguishing; if the lighting device is dimmable, the status may include a dimming level, and if its color is adjustable, the status may include a color. For an air conditioner, the operation status includes the set temperature and whether the air conditioner is operating or stopped, and may further include the airflow direction, the airflow intensity, and the like. For an electric curtain, the operation status differs depending on whether the curtain is open or closed.
  • On the screen of the display 11, for example, three tabs T1, T2, and T3 representing the lighting device (lighting), the air conditioning device (air conditioner), and the electric curtain (curtain) are displayed.
  • When the finger serving as the recognition target touches the sensor 12 at a position corresponding to one of the three tabs T1, T2, and T3, the device 30 corresponding to the touched tab becomes the control target of the operation device 10.
  • FIG. 5 shows an example of a state in which the air conditioner is selected.
  • On the screen of the display 11, a field F1 indicating the current set temperature of the air conditioner and two buttons B11 and B12 for changing the set temperature are displayed.
  • the set temperature indicated in the field F1 is changed when the recognition target touches the sensor 12 in an area corresponding to one of the two buttons B11 and B12.
  • Hereinafter, an operation in which the recognition target touches the sensor 12 in a specific area such as a tab or a button is referred to as a "push", by analogy with the operation of a push button.
  • When the sensor 12 does not detect the recognition target, the processing unit 13 instructs the display 11 to enter the non-display state and turns off the backlight 17. At this time, nothing is displayed on the screen of the display 11 as shown in FIG. 6, but the processing unit 13 can still receive an input from the sensor 12.
  • When the display 11 is in the non-display state, the processing unit 13 recognizes the movement of the recognition target based on the positions reported by the sensor 12 when it detects the recognition target. That is, the processing unit 13 waits for an input from the sensor 12 even while the display 11 is in the non-display state. Since the sensor 12 is capable of multipoint detection, the processing unit 13 can distinguish whether the recognition target is one finger or two fingers. Alternatively, the processing unit 13 may be configured to distinguish three or more fingers.
  • the movement of the recognition target recognized by the processing unit 13 includes tap, swipe, pinch-in, pinch-out, and the like. Further, the processing unit 13 may additionally have a function of recognizing a double tap, a flick, a long press, a drag, or the like as a movement to be recognized.
  • A tap is an operation in which the recognition target touches the sensor 12 for a short time.
  • The processing unit 13 recognizes, as a tap, a movement in which the recognition target leaves the sensor 12 within a predetermined time after touching it.
  • Swipe is an operation in which the recognition target slides on the surface of the sensor 12 in one direction.
  • the direction in which the recognition target moves in swipe is often one of the vertical and horizontal directions of the screen of the display 11.
  • The processing unit 13 recognizes a swipe when, within a predetermined time after the recognition target contacts the sensor 12, the position of the recognition target starts to change with the passage of time and the position change is generated linearly.
  • In addition, the processing unit 13 distinguishes and recognizes the direction of the position change of the recognition target.
  • Pinch-in is an operation of touching the sensor 12 with two fingers of the recognition target and then moving the fingers so as to reduce the distance between them.
  • Pinch-out is an operation of touching the sensor 12 with two fingers and then moving the fingers so as to increase the distance between them.
  • When the processing unit 13 recognizes that two fingers are in contact with the sensor 12 and that the interval between the two fingers shortens or widens with the passage of time, it recognizes the movement as a pinch-in or a pinch-out, respectively.
  • the processing unit 13 similarly recognizes other movements to be recognized.
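The tap/swipe/pinch recognition described above can be sketched as a small classifier over touch samples. This is an illustrative sketch, not the patent's implementation: the function name, the sample format (a list of (timestamp, touch_points) pairs), and all thresholds are assumptions.

```python
# Minimal gesture classifier over touch-sensor samples.
# Each sample is (timestamp_seconds, [(x, y), ...]); thresholds are illustrative.

import math

TAP_MAX_DURATION = 0.3   # s: short contact with no movement counts as a tap
SWIPE_MIN_DISTANCE = 30  # px: minimum travel to count as a swipe

def classify_gesture(samples):
    duration = samples[-1][0] - samples[0][0]
    fingers = max(len(points) for _, points in samples)
    if fingers >= 2:
        # Compare finger spacing at start and end for pinch-in / pinch-out.
        def spread(points):
            (x1, y1), (x2, y2) = points[0], points[1]
            return math.hypot(x2 - x1, y2 - y1)
        first = next(p for _, p in samples if len(p) >= 2)
        last = next(p for _, p in reversed(samples) if len(p) >= 2)
        return "pinch-out" if spread(last) > spread(first) else "pinch-in"
    (x0, y0) = samples[0][1][0]
    (x1, y1) = samples[-1][1][0]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < SWIPE_MIN_DISTANCE:
        return "tap" if duration <= TAP_MAX_DURATION else "long-press"
    if abs(dy) >= abs(dx):
        return "swipe-up" if dy < 0 else "swipe-down"  # screen y grows downward
    return "swipe-right" if dx > 0 else "swipe-left"
```

A real controller would also require roughly linear motion before calling a single-finger movement a swipe, as the text describes.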
  • The processing unit 13 is configured to recognize, as the specific movement, a swipe in which the recognition target moves upward along the vertical direction of the screen of the display 11, out of the vertical and horizontal directions. That is, in FIG. 6, the operation of moving the recognition target upward, indicated by the arrow A, is the specific movement of the recognition target.
  • the processing unit 13 changes the output state according to the specific movement of the recognition target when the sensor 12 detects the recognition target and recognizes the specific movement of the recognition target.
  • In that case, the processing unit 13 changes the output state so as to display, for example, the screen of FIG. 5. That is, when the display 11 is in the non-display state as shown in FIG. 6 and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 changes the output state so that a video signal representing the display content of FIG. 5 is output to the display 11. At the same time, the processing unit 13 changes the luminance signal output to the current source 16 so that the backlight 17 is lit.
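The wake-up behavior above amounts to a small state machine: while the display is blank the sensor stays active, and only the specific movement (an upward swipe in this embodiment) switches the display and backlight back on. The class and method names below are illustrative, not from the patent.

```python
# Sketch of the wake-on-specific-movement behavior. While the display is in
# the non-display state, gestures other than the specific movement are ignored.

class OperatingDevice:
    SPECIFIC_MOVEMENT = "swipe-up"  # upward swipe, per this embodiment

    def __init__(self):
        self.display_on = False
        self.backlight_on = False

    def on_gesture(self, gesture: str):
        if not self.display_on and gesture == self.SPECIFIC_MOVEMENT:
            # Change the output state: video signal to the display,
            # luminance signal to the current source for the backlight.
            self.display_on = True
            self.backlight_on = True

dev = OperatingDevice()
dev.on_gesture("tap")        # ignored: not the specific movement
dev.on_gesture("swipe-up")   # wakes the display and backlight
```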
  • The set temperature displayed in the field F1 on the screen of the display 11 may be stored in the processing unit 13, or may be acquired directly or indirectly from the device 30 through the wireless module 21.
  • As described above, when the processing unit 13 recognizes the specific movement of the recognition target while the display 11 is in the non-display state, the processing unit 13 changes the output state so that a screen for instructing the control content to the specific device 30 (the air conditioner in FIG. 5) is displayed on the display 11.
  • When the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 may instruct the device 30 with the control content without shifting the display 11 to the display state. For example, if the device 30 associated with the specific movement is a lighting device, the processing unit 13 changes the output state so as to instruct the lighting device to turn on in response to the specific movement. In this case, the processing unit 13 does not change the output state with respect to the display 11 and the current source 16, and the display 11 is kept in the non-display state.
  • When the display 11 is in the non-display state and the processing unit 13 recognizes the specific movement of the recognition target, the processing unit 13 may instruct the device 30 associated with the specific movement with the control content, and then display the instructed control content on the display 11. For example, in response to the specific movement, the processing unit 13 changes the output state with respect to the wireless module 21 so as to give an instruction to change the set temperature of the air conditioner. In this example, an upward swipe may be associated with increasing the set temperature of the air conditioner by 1°C, and a downward swipe with decreasing it by 1°C.
  • In that case, the processing unit 13 displays on the display 11 the display content for confirming the operation of the device 30 (the characters "+1°C (25°C) set" in FIG. 7).
  • A button B13 for canceling the instructed control content may be displayed together. On the screen shown in FIG. 7, when the control content is not suitable, the user can cancel the instructed control content by pushing the button B13.
  • Alternatively, the processing unit 13 may display on the display 11 buttons B14 and B15 for selecting whether or not to instruct the device 30 with the control content.
  • the processing unit 13 may be configured to wait for an input from the sensor 12 as to whether or not to instruct the device 30 to control content corresponding to a specific operation to be recognized.
  • the processing unit 13 associates a specific movement to be recognized with a specific control content to the specific device 30.
  • When the processing unit 13 recognizes the specific movement of the recognition target, instead of immediately giving an instruction to the device 30, the processing unit 13 displays on the display 11 the buttons B14 and B15 with which the user selects whether or not to adopt the control content for the device 30.
  • The control content to be instructed to the device 30 is presented by the characters "Set +1°C (25°C)?" on the screen.
  • The processing unit 13 adopts the control content when the button B14 is pushed, and discards the control content when the button B15 is pushed.
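The confirm-before-send variant above can be sketched as follows: the specific movement stages a pending control content, and B14 adopts it (sending it to the device) while B15 discards it. All names and the staged payload are illustrative assumptions.

```python
# Sketch of the B14/B15 confirmation flow: a recognized specific movement
# proposes a control content, which is only sent once the user adopts it.

class ConfirmingController:
    def __init__(self):
        self.pending = None
        self.sent = []

    def on_specific_movement(self):
        # e.g. upward swipe -> propose raising the set temperature by 1 degree
        self.pending = ("air_conditioner", {"set_temperature_delta": +1})

    def press_b14(self):
        """Adopt: send the pending control content to the device."""
        if self.pending is not None:
            self.sent.append(self.pending)
            self.pending = None

    def press_b15(self):
        """Discard: drop the pending control content without sending it."""
        self.pending = None
```

The alternative B13 flow (send first, cancel afterwards) would invert this: append on the gesture and remove on cancel.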
  • The processing unit 13 may perform the same operation as in the non-display state when it recognizes the specific movement of the recognition target, not only while the display 11 is in the non-display state but also while it is in the display state. Specifically, even when the display 11 is in the display state, when the processing unit 13 recognizes the specific movement, it may instruct the specific device 30 with the specific control content associated with that movement without switching the screen of the display 11. Alternatively, when the processing unit 13 recognizes the specific movement, it may switch the screen of the display 11 and cause the display 11 to display a screen that enables control of the specific device 30.
  • Alternatively, the processing unit 13 may switch the screen of the display 11 and display on the display 11 the control content instructed to the device 30, as shown in FIG. 7.
  • the screen shown in FIG. 7 includes a button B13 for canceling an instruction to the device 30, but the button B13 can be omitted.
  • The control content may be instructed to a plurality of devices 30, or to a plurality of types of devices 30. For example, a specific movement can be associated with control content for the devices 30 such that the plurality of devices 30 are turned off collectively.
  • the processing unit 13 can perform an operation of increasing the set temperature of the air conditioner by 1 ° C. with an upward swipe and decreasing the set temperature of the air conditioner by 1 ° C. with a downward swipe. Similarly, the processing unit 13 can perform an operation of turning on the lighting device with an upward swipe and turning off the lighting device with a downward swipe.
  • The type of movement of the recognition target can be associated with the control content for the device 30, and the correspondence between the movement and the control content may differ between the case where the recognition target is one finger and the case where it is two fingers. For example, an upward swipe with two fingers may be associated with opening the electric curtain, and a downward swipe with two fingers with closing it.
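The examples above (one-finger swipes adjust the air conditioner, two-finger swipes drive the electric curtain) amount to a mapping keyed on both the movement and the finger count. The table contents below come from the text; the data structure and function name are assumptions.

```python
# Gesture-to-control mapping keyed on (movement, finger count).
# Entries mirror the examples in the text; the structure is illustrative.

GESTURE_MAP = {
    ("swipe-up", 1):   ("air_conditioner", {"set_temperature_delta": +1}),
    ("swipe-down", 1): ("air_conditioner", {"set_temperature_delta": -1}),
    ("swipe-up", 2):   ("electric_curtain", {"state": "open"}),
    ("swipe-down", 2): ("electric_curtain", {"state": "close"}),
}

def control_for(gesture: str, fingers: int):
    """Return (device, operation) for a recognized movement, or None."""
    return GESTURE_MAP.get((gesture, fingers))
```

User-registered correspondences, as suggested later in the text, would simply edit this table at run time.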
  • The type of operation that the processing unit 13 performs on the display 11 may be varied according to the control content for the device 30. For example, when the result of operating the device 30 can be recognized without relying on the operation device 10, as with a lighting device, the processing unit 13 may keep the display 11 in the non-display state in response to the specific movement. On the other hand, when the content of the instruction to the device 30 cannot be confirmed without the display 11, as with the set temperature of an air conditioner, the processing unit 13 may display the set temperature or the like on the display 11.
  • It is desirable that the correspondence between the movement of the recognition target and the control content for the device 30 can be registered in the processing unit 13 through the display 11 and the sensor 12. That is, it is desirable that the user can register, in the processing unit 13, a desired correspondence between the movement of the recognition target and the control content for the device 30 through the display 11 and the sensor 12. Furthermore, it is desirable that the processing unit 13 be configured so that the user can register not only the control content for the device 30 but also the type of operation that the processing unit 13 performs on the display 11 in response to the movement of the recognition target.
  • The housing 40 described in the present embodiment has the same dimensions as a wiring device so that it can be attached to the wall using the same attachment members as a wall switch. Therefore, an existing wall switch can be replaced with the operation device 10.
  • the size and shape of the housing 40 are not intended to be limited, and the size and shape of the housing 40 can be appropriately changed depending on the design.
  • The operation device 10 is not necessarily attached to a wall; it may be configured to be attached to a place other than a wall in the building, or may be configured to be portable. Further, the above-described operation can be realized by executing an appropriate application program (a so-called app) on a portable computer having a touch panel and a wireless communication function, such as a smartphone or a tablet terminal.
  • the operation device 10 of the present embodiment has a relay 22 added as an output device 20 in addition to the configuration of the first embodiment.
  • a load terminal 413 is added to the power supply board 41 with the addition of the relay 22.
  • the relay 22 is provided on the power supply board 41 and selects conduction (on) or non-conduction (off) of the electric path between the power supply terminal 411 and the load terminal 413 included in the power supply board 41.
  • The relay 22 is an electromagnetic relay, and the contact of the relay 22 is connected between the power supply terminal 411 and the load terminal 413.
  • the operation of the relay 22 is controlled by the processing unit 13.
  • the relay 22 may be a semiconductor relay.
  • A device 30 such as a lighting device is connected to the load terminal 413. That is, the relay 22 is provided in the power supply path to the device 30. Therefore, when the processing unit 13 controls the relay 22, either a state in which power is supplied to the device 30 or a state in which power is not supplied is selected. The relay 22 need only be inserted in the power supply path to the device 30.
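The second embodiment's output device can be sketched as a relay toggled by the processing unit: a conducting contact means power reaches the device on the load terminal. The class and method names are illustrative assumptions.

```python
# Sketch of the relay output of the second embodiment: the processing unit
# selects conduction (on) or non-conduction (off) of the relay inserted in
# the power supply path between power terminal 411 and load terminal 413.

class Relay:
    def __init__(self):
        self.closed = False  # non-conducting (off) by default

    def set(self, conduct: bool):
        self.closed = conduct

class ProcessingUnitWithRelay:
    def __init__(self, relay: Relay):
        self.relay = relay

    def power_device(self, on: bool):
        # Closed contact = power supplied to the device on the load terminal.
        self.relay.set(on)

r = Relay()
ProcessingUnitWithRelay(r).power_device(True)
```

A semiconductor relay, as the text allows, would replace the `Relay` internals without changing the processing unit's side.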
  • Other configurations and operations of the present embodiment are the same as those of the first embodiment.
  • the wireless module 21 is not an essential component as the output device 20.
  • the operating device 10 according to the present embodiment may include only the relay 22 as the output device 20.
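The relay-based output path described above can be sketched as follows. This is a minimal model, not the patent's implementation: the class names and the boolean `conducting` flag are assumptions used only to illustrate how the processing unit (13) selects conduction of the path between the power supply terminal (411) and the load terminal (413).

```python
class Relay:
    """Models the relay (22) inserted in the power feeding path to the device (30)."""

    def __init__(self):
        # Non-conducting (off) by default: no power reaches the load terminal (413).
        self.conducting = False

    def set(self, on):
        self.conducting = on


class ProcessingUnit:
    """Models the processing unit (13) driving the relay as its output device (20)."""

    def __init__(self, relay):
        self.relay = relay

    def set_device_power(self, powered):
        # Closing the contact connects the power supply terminal (411)
        # to the load terminal (413), feeding the device (30).
        self.relay.set(powered)


relay = Relay()
unit = ProcessingUnit(relay)
unit.set_device_power(True)   # power supplied to the device 30
unit.set_device_power(False)  # power cut off
```

Whether the contact is an electromagnetic or a semiconductor relay does not change this control logic; only the physical switching element differs.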
  • the display 11 is exemplified by a liquid crystal display device, but may be replaced with an organic EL (electroluminescence) display, electronic paper, or the like.
  • in that case, the backlight 17 is unnecessary.
  • instead of the backlight 17, a front light may be placed on the front surface of the display 11.
  • the front light includes a light source disposed around the display 11 and a light guide plate that guides light from the light source to enter the display 11.
  • depending on the usage environment of the operating device 10, the operating device 10 may include neither the backlight 17 nor the front light.
  • the processing unit 13 may accept not only an input from the sensor 12 but also an input from a mechanical switch. That is, the operating device 10 may be provided with a mechanical switch.
  • the recognition target is assumed to be a human finger, but may be a touch pen.
  • the operating device (10) of the first aspect includes the display (11), the sensor (12), and the processing unit (13).
  • the display (11) can be switched between a display state and a non-display state.
  • the sensor (12) detects, in a contact or non-contact manner, the position of a recognition target within a detection range near the display (11).
  • the processing unit (13) receives information on the position of the recognition target from the sensor (12). Furthermore, the processing unit (13) instructs the display (11) to be in a non-display state when the sensor (12) does not detect the recognition target.
  • when the display (11) is in the non-display state and the processing unit (13) recognizes a specific movement of the recognition target, the processing unit (13) changes an output state, including the control content for the device (30), according to the specific movement of the recognition target.
  • the processing unit (13) changes the output state when the recognition target, such as a finger, performs a specific movement. Therefore, the number of operations by the user is reduced compared with the case where the control content for the device (30) is instructed only after the display (11) has first been shifted to the display state.
  • when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, the processing unit (13) may change the output state so as to instruct the device (30) with the control content associated with the specific movement while keeping the display (11) in the non-display state.
  • when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, the processing unit (13) may, after changing the output state so as to instruct the device (30) with the control content associated with the specific movement, display that control content on the display (11).
  • since the control content instructed to the device (30) is displayed on the display (11), the user can confirm the control content.
  • when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, it is desirable that the processing unit (13) display the control content on the display (11) before instructing the device (30) with the control content associated with the specific movement. In this case, it is desirable that the processing unit (13) wait for an input as to whether or not to instruct the device (30) with the control content.
  • the sensor (12) may be capable of detecting the position of each of a plurality of recognition targets, and the specific movement of the recognition target may be represented by a combination of the number of recognition targets selected from the plurality of recognition targets and the movement of the selected recognition targets.
  • it is desirable that the processing unit (13) be configured so that the correspondence between the specific movement of the recognition target and the control content can be registered.
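As a minimal sketch of how such a correspondence might be registered, keyed by the combination of the number of recognition targets (e.g. fingers) and their movement: the gesture names and control contents below are illustrative assumptions, not taken from the specification.

```python
# Registered correspondences between a specific movement of the recognition
# target and a control content for the device (30).
registry = {}

def register(num_targets, movement, control_content):
    """Register a (number of targets, movement) pair as a specific movement."""
    registry[(num_targets, movement)] = control_content

def control_for(num_targets, movement):
    """Return the registered control content, or None if this combination
    is not a registered specific movement."""
    return registry.get((num_targets, movement))

# Illustrative registrations (assumed, not from the document):
register(1, "swipe_up", "light:on")
register(1, "swipe_down", "light:off")
register(2, "swipe_up", "aircon:raise_setpoint")
```

A lookup that returns `None` simply means the observed movement is not a registered specific movement, so no output-state change occurs.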
  • it is desirable that the processing unit (13) be configured to distinguish between a movement in which the recognition target touches the sensor (12) and a movement in which the recognition target moves along the screen of the display (11).
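One way to make that distinction is to compare the first and last positions reported while the target is detected. This is a sketch under stated assumptions: the 10-unit slide threshold and the function name are illustrative, not from the specification.

```python
def classify_movement(positions, slide_threshold=10.0):
    """Distinguish a touch in place ('tap') from a movement along the
    screen of the display ('slide'), given the sequence of (x, y)
    positions reported by the sensor (12)."""
    if len(positions) < 2:
        return "tap"
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return "slide" if distance >= slide_threshold else "tap"
```

In practice the threshold would be tuned to the sensor's resolution and jitter, so that small tremors of a resting finger are still classified as a touch.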
  • it is desirable that the operating device (10) of any one of the first to seventh aspects include a housing (40) in which the display (11), the sensor (12), and the processing unit (13) are provided.
  • the housing (40) is preferably configured to be attached to a building.
  • the device control system (100) of the ninth aspect includes the operating device (10) of any one of the first to eighth aspects and an output device (20) that controls the device (30) according to the output state of the processing unit (13).
  • the output device (20) allows the output state of the processing unit (13) to be applied to the device (30).
  • the output device (20) is preferably a wireless module (21) that communicates with the device (30).
  • the output device (20) may be a relay (22) provided in the power feeding path of the device (30).
  • with the wireless module (21), it is possible to control not only an individual device (30) but also a plurality of devices (30) with a single operating device (10). Further, if the device (30) can be controlled merely by switching its power feeding path on and off, the operating device can be used in place of a wall switch by using the relay (22).
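The behaviour in the aspects above can be summarized in a short end-to-end sketch. The gesture name and control content are illustrative assumptions, and the output device (wireless module (21) or relay (22)) is abstracted to a callback.

```python
class OperatingDevice:
    """Minimal model of the operating device (10): while the display stays
    in the non-display state, a recognized specific movement changes the
    output state (the control content for the device (30)) without first
    turning the display on."""

    def __init__(self, gesture_to_control, output_device):
        self.display_on = False               # non-display state while idle
        self.gesture_to_control = gesture_to_control
        self.output_device = output_device    # e.g. wireless module (21) or relay (22)

    def on_no_target(self):
        # Sensor (12) reports no recognition target: instruct non-display state.
        self.display_on = False

    def on_specific_movement(self, movement):
        control = self.gesture_to_control.get(movement)
        if control is not None and not self.display_on:
            # Second aspect: instruct the device while keeping the display off.
            self.output_device(control)


commands = []
device = OperatingDevice({"two_finger_swipe_up": "aircon:raise_setpoint"},
                         output_device=commands.append)
device.on_no_target()
device.on_specific_movement("two_finger_swipe_up")
```

Note that the display never leaves the non-display state in this path, which is exactly how the claimed design saves the user an extra wake-up operation.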

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Selective Calling Equipment (AREA)
  • Air Conditioning Control Device (AREA)

Abstract

The present invention provides an operating device that reduces the number of user operations, and a device control system. The operating device (10) includes a display (11), a sensor (12), and a processing unit (13). The display (11) can be switched between a display state and a non-display state. The sensor (12) detects, in a contact or non-contact manner, the position of a recognition target within a detection range near the display (11). The processing unit (13) receives information on the position of the recognition target from the sensor (12). The processing unit (13) also instructs the display (11) to enter the non-display state when the sensor (12) does not detect the recognition target. Furthermore, when the display (11) is in the non-display state and a specific movement of the recognition target is recognized, the processing unit (13) changes an output state, including the control content for the device (30), according to the specific movement of the recognition target.
PCT/JP2018/010591 2017-03-30 2018-03-16 Operating device and device control system WO2018180635A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-069217 2017-03-30
JP2017069217A JP2018170747A (ja) Operating device and device control system

Publications (1)

Publication Number Publication Date
WO2018180635A1 true WO2018180635A1 (fr) 2018-10-04

Family

ID=63675877

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010591 WO2018180635A1 (fr) Operating device and device control system

Country Status (3)

Country Link
JP (1) JP2018170747A (fr)
TW (1) TW201837658A (fr)
WO (1) WO2018180635A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7355504B2 (ja) * 2019-02-19 2023-10-03 Japan Display Inc. Detection device
JP7392556B2 (ja) 2020-04-06 2023-12-06 Denso Wave Inc. Air conditioning controller

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012198419A * 2011-03-22 2012-10-18 Panasonic Corp Liquid crystal display operation device
US20130065648A1 (en) * 2011-09-08 2013-03-14 Hyungjung KIM Mobile terminal and control method for the same
JP2014017735A * 2012-07-10 2014-01-30 Toshiba Corp Information processing terminal and information processing method
JP2016107796A * 2014-12-05 2016-06-20 Hitachi, Ltd. Terminal device for train operation management system
JP2016527625A * 2013-06-26 2016-09-08 Google Inc. Methods, systems, and media for controlling a remote device using a touchscreen of a mobile device in a display-suppressed state

Also Published As

Publication number Publication date
TW201837658A (zh) 2018-10-16
JP2018170747A (ja) 2018-11-01

Similar Documents

Publication Publication Date Title
CN110268371B (zh) Household device controller with touch control slot
JP5225576B2 (ja) Portable terminal and operating method thereof
US20200249785A1 (en) Modular touch panel smart switches and systems
US9326407B1 (en) Automated dimmer wall switch with a color multi-touch LCD/LED display
US9084329B2 (en) Lighting control device having a touch sensitive user interface
US20160126950A1 (en) Power outlet socket sensor switch
EP1809076A2 (fr) Display provided with an illumination light
KR20120015349A (ko) User interface having a circular light-guided ring with an adaptive shape according to function
CN105135632A (zh) Sleep display control method and device for an air conditioner
JP6060034B2 (ja) Touch switch and operation panel
US20140239844A1 (en) Intelligent lighting apparatus
WO2018180635A1 (fr) Operating device and device control system
EP3478029A1 (fr) Dimming processing system for light-emitting diode lamps
JP2010123470A (ja) Multiple switch for vehicle accessories
CN203928294U (zh) Air conditioner
JP5777454B2 (ja) Lighting control system, lighting control device, and lighting control method
US20200133431A1 PCB with integrated touch sensors
CN107678606B (zh) Control device and method for remotely operating electrical appliances
CN107388082A (zh) Lighting device
CN204464132U (zh) Non-contact sensing backlit keyboard module
CN101964142A (zh) Remote control device
CN219809867U (zh) Lighting control device
US20170364201A1 (en) Touch-sensitive remote control
KR20130107212A (ko) Load controller
JP6982864B2 (ja) Control method for LED lighting device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18775949

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18775949

Country of ref document: EP

Kind code of ref document: A1