WO2017063893A1 - Touch control - Google Patents

Touch control

Info

Publication number
WO2017063893A1
Authority
WO
WIPO (PCT)
Prior art keywords
vibration
interaction
controller
vibration signal
control
Prior art date
Application number
PCT/EP2016/073280
Other languages
French (fr)
Inventor
Ruben Rajagopalan
Harry Broers
Original Assignee
Philips Lighting Holding B.V.
Priority date
Filing date
Publication date
Application filed by Philips Lighting Holding B.V.
Publication of WO2017063893A1

Classifications

    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03K PULSE TECHNIQUE
    • H03K17/00 Electronic switching or gating, i.e. not by contact-making and -breaking
    • H03K17/94 Electronic switching or gating, i.e. not by contact-making and -breaking characterised by the way in which the control signals are generated
    • H03K17/965 Switches controlled by moving an element forming part of the switch
    • H03K17/967 Switches controlled by moving an element forming part of the switch having a plurality of control members, e.g. keyboard
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03K PULSE TECHNIQUE
    • H03K17/00 Electronic switching or gating, i.e. not by contact-making and -breaking
    • H03K17/94 Electronic switching or gating, i.e. not by contact-making and -breaking characterised by the way in which the control signals are generated
    • H03K17/96 Touch switches
    • H03K17/964 Piezoelectric touch switches
    • H03K17/9643 Piezoelectric touch switches using a plurality of detectors, e.g. keyboard
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1636 Sensing arrangement for detection of a tap gesture on the housing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0339 Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04809 Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01H ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H2239/00 Miscellaneous
    • H01H2239/054 Acoustic pick-up, e.g. ultrasonic
    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03K PULSE TECHNIQUE
    • H03K2217/00 Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94 Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/96 Touch switches
    • H03K2217/96003 Touch switches using acoustic waves, e.g. ultrasound
    • H03K2217/96011 Touch switches using acoustic waves, e.g. ultrasound with propagation, SAW or BAW
    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03K PULSE TECHNIQUE
    • H03K2217/00 Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94 Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/96 Touch switches
    • H03K2217/96066 Thumbwheel, potentiometer, scrollbar or slider simulation by touch switch
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • Figure 6 illustrates signals (x-axis signal 602, y-axis signal 604, and z-axis signal 606) output by the accelerometer of the demonstration device 200 in response to the second mechanical push button 210b of the demonstration device being pressed by a user (in the y direction).
  • This vibration signal output from the vibration sensor 106 has a different vibration signature compared to the tap and press of the first mechanical push button 210a (in the -z direction) referred to above. It also comprises multiple spikes but the component in the z-direction is almost completely dampened.
  • Figures 5 and 6 referred to above illustrate how two different mechanical push buttons 110 of the luminaire 100 may generate different vibration signals when pressed by a user.
  • the controller 102 is able to determine that a particular mechanical push button 110 has been pressed based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern associated with the particular mechanical push button 110 that is stored in memory 108, and control the light emitted by the light source(s) 104 based on this determination.
  • a first mechanical push button 110 of the luminaire 100 may be associated with increasing the intensity of light emitted by the light source(s) 104, and a second mechanical push button 110 of the luminaire 100 may be associated with decreasing the intensity of light emitted by the light source(s) 104.
  • the memory 108 would store a vibration pattern associated with the first mechanical push button 110 and a different vibration pattern associated with the second mechanical push button 110.
  • the memory 108 would further store a lighting command which is used to increase the intensity of light emitted by the light source(s) 104, in association with the first mechanical push button 110; and a lighting command which is used to decrease the intensity of light emitted by the light source(s) 104, in association with the second mechanical push button 110.
  • the controller 102 is configured to detect a user pressing one of the mechanical push buttons 110 based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern associated with the push button that has been pressed that is stored in memory 108, and transmit the lighting command associated with the detected push button to increase/decrease the intensity of light emitted by the light source(s) 104.
  • the controller 102 may be configured to detect a force by which a user presses one of the mechanical push buttons 110 based on the vibration signal output by the vibration sensor 106, and control the light emitted by the light source(s) 104 based on the detected force. For example, the controller 102 may control the intensity of the light emitted by the light source(s) 104 based on the force of the button press.
  • Figure 7a illustrates signals (x-axis signal 702, y-axis signal 704, and z-axis signal 706) output by the accelerometer of the demonstration device 200 in response to a user quickly dragging their finger (or other object) over the textured surface 214 of the demonstration device 200 (in the y direction), and Figure 7b illustrates signals (x-axis signal 712, y-axis signal 714, and z-axis signal 716) output by the accelerometer of the demonstration device 200 in response to a user slowly dragging their finger (or other object) over the textured surface 214 of the demonstration device 200 (in the y direction).
  • the vibration signal output from the vibration sensor 106 in both of these scenarios has a different vibration signature than the tap and press of mechanical push buttons 210a and 210b in that it comprises many more spikes, due to the series of grooves.
  • Figures 7a and 7b referred to above illustrate how having a predetermined vibration pattern associated with a three-dimensional surface texture of the housing 112 of the luminaire 100 can be used by the controller 102 to determine that a user has interacted with the three-dimensional surface texture of the housing 112 rather than any other interaction surface of the luminaire 100.
  • the luminaire 100 may have a plurality of three-dimensional surface textures on the housing 112 of the luminaire 100 (each with unique 3D mechanical characteristics).
  • the controller 102 is able to determine that a user has dragged an object over a three-dimensional surface texture based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern associated with the particular three-dimensional surface texture that is stored in memory 108, and control the light emitted by the light source(s) 104 based on this determination. That is, if the housing 112 has different surface textures, the vibration signal output by the vibration sensor 106 can be used to localize where the luminaire is touched.
  • the distance between the spikes in the observed vibration signal is dependent on the speed at which a user runs their finger (or other object) over the textured surface 214 of the demonstration device 200.
  • the distance between the spikes in the observed vibration signal is inversely proportional to the speed at which a user runs their finger over the textured surface 214 of the demonstration device 200 (the faster the speed at which a user runs their finger over the textured surface 214, the smaller the distance between the spikes in the vibration signal); a sketch of this speed estimate follows after this list.
  • Figures 7a and 7b referred to above illustrate how spatio-temporal analysis performed by the controller 102 can detect a speed at which a user drags their finger over a three-dimensional surface texture on the housing 112 of the luminaire 100.
  • memory 108 may store predefined lighting control commands in association with a three-dimensional surface texture on the housing 112 of the luminaire 100 which control a lighting parameter of the light emitted by the light source(s) 104 in dependence on the speed at which a user drags their finger (or other object) over the three-dimensional surface texture.
  • the controller 102 is configured to detect a speed at which a user drags their finger (or other object) over the three-dimensional surface texture on the housing 112 of the luminaire 100 based on the vibration signal output from the vibration sensor 106, and control the light emitted by the light source(s) 104 by transmission of an appropriate lighting command selected based on the detected speed.
  • the detected speed could be used to control the speed at which a user is able to scroll through predetermined light settings (in a manner similar to the way a user may scroll through a list of contacts in a phonebook on a mobile telephone).
  • the detected speed could also be used to control a dynamic light effect whereby the detected speed controls how fast the light effect changes.
  • the vibration signal output from the vibration sensor 106 varies in dependence on the direction in which a user runs their finger over the textured surface 214 of the demonstration device 200 (i.e. whether the user runs their finger over the textured surface 214 in the +y direction or -y direction). For example, if the textured surface 214 were formed by grooves having a saw-tooth pattern it is possible to distinguish the direction of the movement. When swiping from left to right, a finger will encounter a steep edge followed by a relatively slowly declining slope. Due to the asymmetric shape of the grooves of the textured surface 214, a swipe in the opposite direction will result in a different vibration signal (see the direction sketch after this list).
  • the memory 108 may store predefined lighting control commands in association with a three-dimensional surface texture on the housing 112 of the luminaire 100 which control a lighting parameter of the light emitted by the light source(s) 104 in dependence on the direction in which a user drags their finger (or other object) across the three-dimensional surface texture.
  • the controller 102 is configured to detect the direction in which a user drags their finger (or other object) across the three-dimensional surface texture on the housing 112 of the luminaire 100 based on the vibration signal output from the vibration sensor 106, and control the light emitted by the light source(s) 104 by transmission of an appropriate lighting command selected based on the detected direction.
  • the controller 102 is configured to detect a trajectory (combination of location and time) of the object being dragged across a three dimensional textured portion of the housing 112 based on the vibration signal, and control the light emitted by the light source(s) 104 based on the detected trajectory.
  • Three dimensional textured portions of the housing 112 can be designed such that the vibration signal output from the vibration sensor 106, in response to a user dragging their finger (or other object) over the three-dimensional textured portion, varies in dependence on the force applied by the object.
  • the controller 102 may be configured to detect the force with which a user presses and drags an object over a three-dimensional textured portion of the housing 112 based on the vibration signal, and control the light emitted by the light source(s) 104 based on the detected force.
  • the present invention extends the versatility of gestures that can be enabled by exploiting the 3D surface texture of the luminaire 100.
  • Surfaces with specific roughness will generate specific vibration patterns.
  • the gesture-set can be made richer (beyond simple binary touch) by implicitly encoding gesture parameters depending on the surface properties.
  • the vibration pattern sensed by the vibration sensor can be used to identify the velocity/direction of the object (finger) touching the surface as described above.
  • embodiments of the present disclosure remove the need for a PCB or flex foil with electronic read-out circuitry at the location of the mechanical push buttons. This reduces the physical size of the luminaire 100 compared with those known in the art which require a bulky PCB to connect the mechanical part of the buttons with their associated read-out electronic circuits.
  • the electro-mechanical push buttons present in known devices have an impact on the cost and freedom of design of the devices. By adding only the mechanical part of the push button and using the vibration signal output by the vibration sensor 106 to decide when and which button is pressed, the device according to embodiments of the present disclosure can be made cheaper and with fewer restrictions in its design. In many products a vibration sensor 106 is already present, so this component can be reused. Whilst incorporating one or more three-dimensional surface textures on the housing 112 of the luminaire 100 results in a slightly more complex design of the mould used to produce the housing 112, there will be no increase in the financial cost of the mechanical push buttons.
  • the vibration sensor 106 has been described above as being an accelerometer, this is merely an example and the vibration sensor 106 may take other forms.
  • the vibration sensor 106 may be a microphone, a gyroscope, a magnetometer (eCompass), a piezoelectric sensor, or a tilt sensor, which may be fluid based with the capability to detect high frequency motion, or may comprise a metal ball which is arranged to make contact with conductive elements in certain orientations.
  • an attachment (e.g. a sticker) carrying a three-dimensional surface texture may be attached to a surface of the housing 112 of the luminaire 100. This enables additional interaction surfaces to be provided beyond those already provided on the device.
  • a remote device (e.g. a pen or other handheld instrument) may comprise the one or more mechanical push buttons 110 and/or one or more three-dimensional textured surfaces that result in a specific vibration when a user touches a surface of the housing of the luminaire 100 with the remote device. One or more vibration patterns associated with a user interacting with a surface of the housing 112 of the luminaire 100 with the remote device may be stored in memory 108 with predefined lighting control commands associated with the user interactions. This enables the controller 102 to detect that a user has interacted with a surface of the luminaire 100 using the remote device based on the vibration signal output by the vibration sensor 106, and control the light emitted by the light source(s) 104 accordingly.
  • the controller 102 is configured to receive the vibration signal output by the vibration sensor 106; detect a user's interaction with an interaction surface on the device based on the vibration signal corresponding to the vibration pattern associated with the interaction surface; and control the device based on this detection. Whilst this "control" has been described above with reference to lighting control in the embodiments described herein, when embodiments of the present disclosure are applied to devices other than a luminaire, this "control" is dependent on the functionality of the device.
  • the term "controller" as used herein generally represents software, firmware, hardware, or a combination thereof.
  • the controller represents program code that performs specified tasks when executed on a processor (e.g. CPU or CPUs).
  • the program code can be stored in one or more computer readable memory devices.
  • a single processor or other unit may fulfil the functions of several items recited in the claims.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.
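
As a rough illustration of the speed and direction recovery described in the bullets above, the sketch below assumes a known groove pitch, already-detected spike times, and a saw-tooth texture; the function names, the skewness heuristic and the constants are invented for illustration, and the disclosure does not prescribe these formulas.

```python
# Hedged sketch: speed from inter-spike spacing, direction from the
# asymmetry that saw-tooth grooves impose on the signal.
import numpy as np

def drag_speed(spike_times_s, groove_pitch_m):
    """Speed as groove pitch over the typical inter-spike interval
    (spike spacing is inversely proportional to drag speed); assumes at
    least two detected spikes."""
    intervals = np.diff(np.asarray(spike_times_s))
    return groove_pitch_m / np.median(intervals)

def drag_direction(signal):
    """A steep rise followed by a slow decay (one direction) versus the
    mirror image (the other) shows up as the sign of the skewness of the
    signal's first difference."""
    slope = np.diff(np.asarray(signal, dtype=float))
    skew = np.mean(slope ** 3) / (np.std(slope) ** 3 + 1e-9)
    return "left-to-right" if skew > 0 else "right-to-left"
```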

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A device comprising: at least one interaction surface (110, 112); a vibration sensor (106) configured to output a vibration signal; a memory (108) storing a predetermined vibration pattern associated with each of the at least one interaction surface; and a controller (102) configured to: receive the vibration signal; detect a user's interaction with an interaction surface of the at least one interaction surface based on the vibration signal corresponding to the vibration pattern associated with said interaction surface; and control the device based on said detection.

Description

Touch control
TECHNICAL FIELD
The present disclosure relates to touch control of a device. In particular, the present disclosure relates to touch control of a luminaire.
BACKGROUND
Many devices and remote controllers provide electro-mechanical push buttons for the user to interact with. These buttons require both a mechanical and electronic component at each button location.
Furthermore, many devices like smartphones or tablets use an accelerometer to detect user inputs, like single or double tapping, which can also be combined with information coming from a touchscreen. In the latter case, the location on the touchscreen is used in the interaction (e.g. zooming by double tapping).
Touch control of luminaires and other electronic devices has been used as a predominant gesture for many years. With the advancement in smart sensor systems, a variety of touch sensing features have been enabled in the recent decade, from simple binary touch detection through capacitive sensing on surfaces, to using a complete device as a touch interface, where the mechanical force exerted on the device is measured by an integrated accelerometer to enable gesture control.
SUMMARY
Known devices incorporating mechanical press buttons require a bulky printed circuit board (PCB) to connect the mechanical part of the buttons with their associated readout electronic circuits. This increases the physical size of the device.
Whilst a range of touch-control applications exist that detect a mechanical force exerted by touching a luminaire (or directly the bulb itself), these known systems do not use specific mechanical characteristics of the surface being touched.
In embodiments of the present disclosure, mechanical push buttons and surfaces of the housing of a device are designed in such a way that they have a specific vibration signature over time. Vibrations detected by a vibration sensor caused by a user pressing a button or running their finger along a specific portion of the housing are correlated with predetermined vibration signatures of surfaces/buttons to determine which button or surface of the device a user has interacted with. This detected user interaction is then translated to control the device. For example, if the device is a luminaire the user interaction is translated to light control.
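
As a rough illustration of this correlation step, the sketch below matches a measured vibration signal against stored signatures by taking the peak of a normalised cross-correlation; the function name, the template dictionary and the 0.6 threshold are invented for illustration and are not taken from the disclosure.

```python
# Minimal sketch, assuming numpy arrays; names and threshold are illustrative.
import numpy as np

def best_matching_surface(signal, templates):
    """Return the name of the stored vibration signature that best
    correlates with the measured signal, or None if none comes close."""
    sig = (signal - signal.mean()) / (signal.std() + 1e-9)  # z-normalise
    best_name, best_score = None, 0.0
    for name, template in templates.items():
        tpl = (template - template.mean()) / (template.std() + 1e-9)
        # Peak of the normalised cross-correlation over all alignments.
        score = np.max(np.correlate(sig, tpl, mode="valid")) / len(tpl)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score > 0.6 else None
```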
Embodiments of the present disclosure improve the versatility of touch-control by exploiting the 3D texture of the surface being touched. Specific parameters of touch can be implicitly encoded in the vibration signature depending on the roughness of the surface. For example, surfaces of the device are designed with specific grooves/dents that create a specific vibration signature when touched, thus making it feasible to detect
direction/velocity/force of the touch, in turn creating richer gestures.
According to one aspect of the present disclosure there is provided a device comprising: at least one interaction surface; a vibration sensor configured to output a vibration signal; a memory storing a predetermined vibration pattern associated with each of the at least one interaction surface; and a controller configured to: receive the vibration signal; detect a user's interaction with an interaction surface of the at least one interaction surface based on the vibration signal corresponding to the vibration pattern associated with said interaction surface; and control the device based on said detection.
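
The components recited in this aspect could be wired together as in the following structural sketch; the `Device` class, the `on_vibration` method and the `matcher` callable are hypothetical names, with `matcher` standing in for any signature-matching routine such as the one sketched above.

```python
# Structural sketch of the claimed device; this decomposition is assumed,
# not prescribed by the claim.
from dataclasses import dataclass
from typing import Callable, Dict, Optional
import numpy as np

@dataclass
class Device:
    patterns: Dict[str, np.ndarray]         # memory: surface name -> vibration pattern
    actions: Dict[str, Callable[[], None]]  # control action per interaction surface
    matcher: Callable[[np.ndarray, Dict[str, np.ndarray]], Optional[str]]

    def on_vibration(self, signal: np.ndarray) -> None:
        """Receive the vibration signal, detect which interaction surface
        (if any) it corresponds to, and control the device accordingly."""
        surface = self.matcher(signal, self.patterns)
        if surface is not None:
            self.actions[surface]()  # e.g. dim up/down for a luminaire
```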
The at least one interaction surface may comprise at least one mechanical switch, and the memory may store a predetermined vibration pattern associated with each of the at least one mechanical switch.
The controller may be configured to detect a mechanical switch of the at least one mechanical switch being interacted with based on the vibration signal corresponding to the vibration pattern associated with said mechanical switch.
The at least one interaction surface may comprise at least one three-dimensional textured surface, and the memory may store a predetermined vibration pattern associated with each of the at least one three dimensional textured surfaces.
The controller may be configured to detect an object being dragged across a three dimensional textured surface of the at least one three dimensional textured surface based on the vibration signal corresponding to the vibration pattern associated with said three dimensional textured surface.
The controller may be configured to detect a direction of the object being dragged across said three dimensional textured surface based on the vibration signal, and control the device based on said direction. The controller may be configured to detect a speed of the object being dragged across said three dimensional textured surface based on the vibration signal, and control the device based on said speed.
The controller may be configured to detect a trajectory of the object being dragged across said three dimensional textured surface based on the vibration signal, and control the device based on said trajectory.
The controller may be configured to detect a force of the user's interaction with the interaction surface based on the vibration signal, and control the device based on said force.
The vibration sensor may be one or any combination of: an accelerometer; a microphone; a gyroscope; a tilt sensor; a magnetometer; and a piezoelectric sensor.
The device may further comprise at least one light source for emitting light to illuminate an environment of the device, and the controller may be configured to control the light emitted from the at least one light source based on said detection.
The controller may be configured to transmit a lighting command to the at least one light source to control a lighting parameter of the light emitted from the at least one light source based on said detection. The lighting parameter may be one or any combination of: intensity, color, saturation, color temperature, size, shape, pattern, and dynamics of the light emitted from the at least one light source.
The device may comprise a housing which houses the vibration sensor, the memory and the controller; and the at least one interaction surface comprises a portion of said housing; and the memory stores a predetermined vibration pattern associated with a remote device, that comprises one or any combination of at least one mechanical switch and at least one three dimensional textured surface, interacting with said portion of said housing.
According to another aspect of the present disclosure there is provided a method of controlling a device, the method comprising: receiving a vibration signal output by a vibration sensor of the device; detecting a user's interaction with an interaction surface of at least one interaction surface of the device based on the vibration signal corresponding to a predetermined vibration pattern associated with said interaction surface that is stored in a memory of the device; and controlling the device based on said detection.
According to another aspect of the present disclosure there is provided a computer program product comprising code embodied on a computer-readable medium and being configured so as when executed on a processor of a device, to: receive a vibration signal output by a vibration sensor of the device; detect a user's interaction with an interaction surface of at least one interaction surface of the device based on the vibration signal corresponding to a predetermined vibration pattern associated with said interaction surface that is stored in a memory of the device; and control the device based on said detection.
These and other aspects will be apparent from the embodiments described in the following. The scope of the present disclosure is not intended to be limited by this summary nor to implementations that necessarily solve any or all of the disadvantages noted.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the present disclosure and to show how embodiments may be put into effect, reference is made to the accompanying drawings in which:
Figure 1 illustrates a schematic block diagram of a luminaire;
Figure 2 illustrates a demonstration device;
Figure 3 illustrates steady state signals output by a vibration sensor of the demonstration device;
Figure 4 illustrates signals output by a vibration sensor of the demonstration device in response to a tap on the housing of the demonstration device;
Figure 5 illustrates signals output by a vibration sensor of the demonstration device in response to a first button of the demonstration device being pressed;
Figure 6 illustrates signals output by a vibration sensor of the demonstration device in response to a second button of the demonstration device being pressed;
Figure 7a illustrates signals output by a vibration sensor of the demonstration device in response to an object being quickly dragged over a textured surface of the demonstration device; and
Figure 7b illustrates signals output by a vibration sensor of the demonstration device in response to an object being slowly dragged over a textured surface of the demonstration device.
DETAILED DESCRIPTION
Embodiments will now be described by way of example only.
Figure 1 illustrates a schematic block diagram of a luminaire 100 which is a device for emitting illumination for illuminating its environment, comprising at least one light source plus any associated socket, housing and/or support. The luminaire 100 shown in Figure 1 may be installed at a fixed location within the environment of the luminaire 100, e.g. in a ceiling, walls, or on a light pole fixed to the floor or ground. However it will be appreciated that the luminaire 100 may
alternatively be portable.
The luminaire 100 comprises a controller 102, light source(s) 104, a vibration sensor 106, and a memory 108. It will be appreciated that the luminaire 100 may comprise other components and connections not shown in Figure 1.
The luminaire 100 has one or more interaction surfaces. For example, the luminaire 100 may comprise one or more mechanical push buttons 110, and one or more three-dimensional surface textures of the housing 112 of the luminaire 100 in which components of the luminaire 100 are housed. A three-dimensional surface texture of the housing 112 is formed by surface irregularities, for example by a series of grooves in the surface of the housing 112 or a series of raised bumps on the surface of the housing 112.
In embodiments of the present invention, the mechanical push button(s) 110 and portions of the luminaire housing 112 are designed in such a way that they have a specific vibration signature over time when a user interacts with them (i.e. presses a button or drags their finger (or other object) over a three-dimensional surface texture of the housing 112).
Predetermined vibration patterns associated with each of the interaction surfaces are stored in the memory 108 that is coupled to the controller 102. A vibration pattern associated with an interaction surface represents the vibrations sensed by the vibration sensor 106 when a user interacts with the interaction surface. In particular, a vibration pattern associated with a particular mechanical push button 110 is indicative of the particular mechanical push button 110 being pressed. Furthermore, a vibration pattern associated with a particular three-dimensional surface texture of the housing 112 is indicative of an object being dragged over the particular three-dimensional surface texture.
The vibration sensor 106 is positioned such that it can detect vibrations generated by a user pressing any of the mechanical push button(s) 110 or a user rubbing any of the three-dimensional surface texture(s) of the housing 112 of the luminaire 100. The vibration sensor 106 may for example be a single- or multiple-axis accelerometer which is configured to detect movement in various axes and provide corresponding signal(s) to the controller 102.
The controller 102 is coupled to the vibration sensor 106 such that it is configured to receive a vibration signal output from the vibration sensor 106. The controller 102 is configured to analyse the vibration signal output from the vibration sensor 106 to determine which, if any, of the predetermined vibration patterns (stored in memory 108) has been measured by the vibration sensor 106.
The controller 102 is configured to detect a user's interaction with an interaction surface of the luminaire 100 based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern associated with the interaction surface that is stored in memory 108. A vibration signal output from the vibration sensor 106 may correspond to a predetermined vibration pattern stored in memory 108 when it matches or comes close enough to the predetermined vibration pattern (e.g. within a predefined or user-controllable margin of error). Thus it can be seen that the controller 102 performs a spatio-temporal analysis to determine the interaction surface that the user has interacted with.
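
One possible reading of "matches or comes close enough" is a distance test against a configurable margin, sketched below; the relative-RMS metric and the default margin are assumptions, since the disclosure leaves both open.

```python
# Hedged sketch of the margin-of-error test; metric and margin are assumed.
import numpy as np

def corresponds(signal, pattern, margin=0.3):
    """True if the measured signal is within `margin` (relative RMS
    error) of the stored predetermined vibration pattern."""
    n = min(len(signal), len(pattern))
    s, p = np.asarray(signal[:n]), np.asarray(pattern[:n])
    rms_error = np.sqrt(np.mean((s - p) ** 2))
    rms_pattern = np.sqrt(np.mean(p ** 2)) + 1e-9
    return rms_error / rms_pattern <= margin
```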
Memory 108 further stores predefined lighting control commands associated with user interactions with each of the interaction surfaces. The lighting commands are used to control the light emitted by the light source(s) 104. For example, a lighting command may be used to control a lighting parameter (for example the intensity, color, saturation, color temperature, size, shape, pattern, dynamics etc.) of the light emitted by the light source(s) 104.
In response to the controller 102 detecting that a user has interacted with an interaction surface of the luminaire 100 based on the vibration signal output by the vibration sensor 106, the controller 102 is configured to control the light emitted by the light source(s) 104. In particular, the controller 102 is configured to transmit a lighting command to the light source(s) 104 to control the lighting parameter that is associated with the interaction surface that the user has interacted with.
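
The stored association between interaction surfaces and lighting commands might look like the following sketch; the surface names, the parameter encoding and the `send` callable are invented for illustration, since the disclosure does not fix the light source interface.

```python
# Illustrative surface-to-command table; real commands depend on the
# light source interface.
LIGHTING_COMMANDS = {
    "button_up":    {"parameter": "intensity", "delta": +10},
    "button_down":  {"parameter": "intensity", "delta": -10},
    "groove_strip": {"parameter": "color_temperature", "delta": +100},
}

def on_detection(surface, send):
    """Transmit the lighting command associated with the detected
    interaction surface to the light source(s)."""
    command = LIGHTING_COMMANDS.get(surface)
    if command is not None:
        send(command)  # e.g. a driver call or a network message
```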
The functionality of the controller 102 referred to herein may be
implemented in code (software) stored on a memory (e.g. memory 108) comprising one or more storage media, and arranged for execution on a processor (not shown in Figure 1) comprising one or more processing units. The code is configured so as when fetched from the memory and executed on the processor to perform operations in line with embodiments discussed below. Alternatively it is not excluded that some or all of the functionality of the controller 102 is implemented in dedicated hardware circuitry, or configurable hardware circuitry like a field-programmable gate array (FPGA).
The light source(s) 104 are operable to emit light to illuminate an environment of the luminaire 100 which may comprise an indoor space such as a room, office space or building, and/or an outdoor space such as a garden or park, and/or a partially-covered environment such as a gazebo or stadium, and/or any other type of environment such as the interior of a vehicle.
The light source(s) 104 may comprise any suitable source of light such as, for example, a high/low-pressure gas discharge source, a laser diode, an inorganic/organic light emitting diode (LED), an incandescent source, or a halogen source. A light source may be a single light source, or could comprise multiple light sources, e.g. multiple LEDs which may, for example, form an array of light sources collectively operating as a single light source. The light source(s) 104 are controllable in that the light emitted by the light source(s) 104 may be controlled by the controller 102.
Embodiments of the present disclosure exploit the fact that the interaction surfaces of the luminaire 100 have different vibration signatures, which can be used by the controller 102 to determine when, and which, interaction surface (i.e. mechanical push button or three-dimensional surface texture of the housing 112) of the luminaire 100 has been touched by a user.
To illustrate this in greater detail, reference is made to a demonstration device 200 which is shown in Figure 2 relative to x, y and z axes. The demonstration device 200 comprises a housing 212 which houses a vibration sensor (a 3-axis accelerometer) that is positioned such that it can detect vibrations generated by a user pressing a first mechanical push button 210a and a second mechanical push button 210b, and a user dragging their finger (or other object) over a three-dimensional surface texture 214 of the housing 212 of the device 200 (which is formed by a series of grooves in the surface of the housing 212).
Figure 3 illustrates steady state signals (x-axis signal 302, y-axis signal 304, and z-axis signal 306) output by the accelerometer of the demonstration device 200 when a user is not interacting with any interaction surface of the demonstration device 200.
Figure 4 illustrates signals (x-axis signal 402, y-axis signal 404, and z-axis signal 406) output by the accelerometer of the demonstration device 200 in response to a tap (in the -z direction) on the top of the housing 212 of the demonstration device 200 (the side of the demonstration device on which the first mechanical push button 210a is located). As can be seen from Figure 4, this is a spiky signal that is dampened quickly, with the dominant component in the z-direction; due to mechanical cross-over, the other axes of the accelerometer also observe the tap. Figure 4 has been included for completeness, as the known solutions described above measure the mechanical force exerted on the device by observing the tap to enable gesture control. With an additional mechanical component (e.g. a button), the inventors have identified that it is possible to distinguish a specific vibration and relate it to a button. A device in accordance with embodiments described herein will not react to unwanted taps on the device, resulting in higher robustness to vibrations not originating from touch of the device (e.g. if the device is a mobile phone positioned on a table that is exposed to vibrations, the device will not react to the vibrations of the table).
Figure 5 illustrates signals (x-axis signal 502, y-axis signal 504, and z-axis signal 506) output by the accelerometer of the demonstration device 200 in response to the first mechanical push button 210a of the demonstration device 200 being pressed by a user (in the -z direction). This vibration signal output from the vibration sensor 106 has a different vibration signature compared to the tap referred to above, with multiple spikes being present in the z-direction; due to mechanical cross-over, the other axes of the accelerometer also observe the button press.
Figure 6 illustrates signals (x-axis signal 602, y-axis signal 604, and z-axis signal 606) output by the accelerometer of the demonstration device 200 in response to the second mechanical push button 210b of the demonstration device 200 being pressed by a user (in the y direction). This vibration signal output from the vibration sensor 106 has a different vibration signature compared to the tap and the press of the first mechanical push button 210a (in the -z direction) referred to above. It also comprises multiple spikes, but the component in the z-direction is almost completely dampened.
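By way of example only, and not as the disclosed method, the differing signatures of Figures 4 to 6 suggest a crude classifier: count the separate excursions above a threshold on each axis, then use the spike count and the z-axis behaviour to separate a tap from the two button presses. The threshold value and labels are assumptions:

    import numpy as np

    def spike_count(axis: np.ndarray, threshold: float) -> int:
        """Number of separate excursions of |signal| above the threshold."""
        above = np.abs(axis) > threshold
        rising = above[1:] & ~above[:-1]   # starts of new excursions
        return int(rising.sum()) + int(above[0])

    def classify_event(window: np.ndarray, threshold: float) -> str:
        """Coarse label for an (N, 3) window, per the signatures of Figs. 4-6."""
        counts = [spike_count(window[:, i], threshold) for i in range(3)]
        if max(counts) <= 1:
            return "tap"                 # single quickly-dampened spike (Fig. 4)
        if counts[2] >= 2:
            return "button_210a_press"   # multiple spikes, z dominant (Fig. 5)
        return "button_210b_press"       # z almost fully dampened (Fig. 6)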
Figures 5 and 6 referred to above illustrate how two different mechanical push buttons 110 of the luminaire 100 may generate different vibration signals when pressed by a user. By having predetermined vibration patterns associated with a plurality of different mechanical push buttons stored in memory 108, the controller 102 is able to determine that a particular mechanical push button 110 has been pressed based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern associated with that particular mechanical push button 110 stored in memory 108, and to control the light emitted by the light source(s) 104 based on this determination.
In one example implementation, a first mechanical push button 110 of the luminaire 100 may be associated with increasing the intensity of light emitted by the light source(s) 104, and a second mechanical push button 110 of the luminaire 100 may be associated with decreasing the intensity of light emitted by the light source(s) 104.
In this example implementation, the memory 108 would store a vibration pattern associated with the first mechanical push button 110 and a different vibration pattern associated with the second mechanical push button 110. The memory 108 would further store a lighting command which is used to increase the intensity of light emitted by the light source(s) 104, in association with the first mechanical push button 110; and a lighting command which is used to decrease the intensity of light emitted by the light source(s) 104, in association with the second mechanical push button 110.
The controller 102 is configured to detect a user pressing one of the mechanical push buttons 110 based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern, stored in memory 108, associated with the push button that has been pressed, and to transmit the lighting command associated with the detected push button to increase/decrease the intensity of light emitted by the light source(s) 104.
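A minimal sketch of this example implementation, assuming detection (e.g. the detect_interaction sketch above) has already yielded a surface name; the button names and step size are invented for illustration:

    def apply_push_button(surface: str, intensity: int, step: int = 10) -> int:
        """Map a detected push button to a new intensity value in 0..100."""
        if surface == "button_increase":
            return min(intensity + step, 100)
        if surface == "button_decrease":
            return max(intensity - step, 0)
        return intensity  # no recognised button: leave the light unchanged

    # Example: two presses of the 'increase' button starting from 50% intensity.
    level = apply_push_button("button_increase", 50)     # -> 60
    level = apply_push_button("button_increase", level)  # -> 70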
The controller 102 may be configured to detect a force by which a user presses one of the mechanical push buttons 110 based on the vibration signal output by the vibration sensor 106, and control the light emitted by the light source(s) 104 based on the detected force. For example, the controller 102 may control the intensity of the light emitted by the light source(s) 104 based on the force of the button press.
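One possible proxy for the press force (an assumption, not something the disclosure specifies) is the peak magnitude of the vibration window, which could then be mapped linearly onto an intensity range:

    import numpy as np

    def press_force_proxy(window: np.ndarray) -> float:
        """Peak acceleration magnitude of an (N, 3) window as a crude force cue."""
        return float(np.linalg.norm(window, axis=1).max())

    def force_to_intensity(force: float, f_min: float, f_max: float) -> int:
        """Linearly map the force cue onto a 0..100 intensity scale
        (assumes f_max > f_min; values outside the range are clamped)."""
        frac = (force - f_min) / (f_max - f_min)
        return int(round(100 * min(max(frac, 0.0), 1.0)))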
Figure 7a illustrates signals (x-axis signal 702, y-axis signal 704, and z-axis signal 706) output by the accelerometer of the demonstration device 200 in response to a user quickly dragging their finger (or other object) over the textured surface 214 of the demonstration device 200 (in the y direction), and Figure 7b illustrates signals (x-axis signal 712, y-axis signal 714, and z-axis signal 716) output by the accelerometer of the demonstration device 200 in response to a user slowly dragging their finger (or other object) over the textured surface 214 of the demonstration device 200 (in the y direction).
The vibration signal output from the vibration sensor 106 in both of these scenarios has a different vibration signature than the tap and the presses of mechanical push buttons 210a and 210b, in that it comprises many more spikes, due to the series of grooves. Thus Figures 7a and 7b referred to above illustrate how having a predetermined vibration pattern associated with a three-dimensional surface texture of the housing 112 of the luminaire 100 can be used by the controller 102 to determine that a user has interacted with the three-dimensional surface texture of the housing 112 rather than any other interaction surface of the luminaire 100.
In embodiments of the present disclosure the luminaire 100 may have a plurality of three-dimensional surface textures on the housing 112 of the luminaire 100 (each with unique 3D mechanical characteristics). By having predetermined vibration patterns associated with the plurality of different three-dimensional surface textures stored in memory 108, the controller 102 is able to determine that a user has dragged an object over a three-dimensional surface texture based on the vibration signal output by the vibration sensor 106 corresponding to the predetermined vibration pattern associated with the particular three-dimensional surface texture that is stored in memory 108, and control the light emitted by the light source(s) 104 based on this determination. That is, if the housing 112 has different surface textures, the vibration signal output by the vibration sensor 106 can be used to localize where the luminaire is touched.
As shown in Figures 7a and 7b, the distance between the spikes in the observed vibration signal is dependent on the speed at which a user runs their finger (or other object) over the textured surface 214 of the demonstration device 200. In particular, the distance between the spikes in the observed vibration signal is inversely proportional to the speed at which a user runs their finger over the textured surface 214 of the demonstration device 200 (the faster the speed at which a user runs their finger over the textured surface 214 of the demonstration device 200, the smaller the distance between the spikes in the vibration signal).
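Because the grooves of the textured surface 214 are at a fixed pitch, roughly one spike is generated per groove, so the speed follows from the mean inter-spike interval. A minimal sketch, assuming the spike timestamps have already been extracted and the groove pitch is known:

    import numpy as np

    def drag_speed(spike_times_s: np.ndarray, groove_pitch_m: float) -> float:
        """Speed estimate: pitch divided by the mean inter-spike interval,
        reflecting that the spike spacing is inversely proportional to speed."""
        intervals = np.diff(spike_times_s)
        if intervals.size == 0 or intervals.mean() <= 0:
            return 0.0
        return groove_pitch_m / float(intervals.mean())

    # Example: spikes 20 ms apart over 1 mm grooves -> 0.05 m/s.
    print(drag_speed(np.array([0.00, 0.02, 0.04, 0.06]), 0.001))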
Figures 7a and 7b referred to above illustrate how the spatio-temporal analysis performed by the controller 102 can detect the speed at which a user drags their finger over a three-dimensional surface texture on the housing 112 of the luminaire 100. In embodiments of the present disclosure, memory 108 may store predefined lighting control commands, in association with a three-dimensional surface texture on the housing 112 of the luminaire 100, which control a lighting parameter of the light emitted by the light source(s) 104 in dependence on the speed at which a user drags their finger (or other object) over the three-dimensional surface texture. The controller 102 is configured to detect the speed at which a user drags their finger (or other object) over the three-dimensional surface texture on the housing 112 of the luminaire 100 based on the vibration signal output from the vibration sensor 106, and to control the light emitted by the light source(s) 104 by transmission of an appropriate lighting command selected based on the detected speed. As a mere example, the detected speed could be used to control the speed at which a user is able to scroll through predetermined light settings (in a manner similar to the way a user may scroll through a list of contacts in a phonebook on a mobile telephone). The detected speed could also be used to control a dynamic light effect whereby the detected speed controls how fast the light effect changes. This could be extended to a scratchpad to create dynamic light effects similar to the scratching of vinyl records.
The inventors have also observed that the vibration signal output from the vibration sensor 106 varies in dependence on the direction in which a user runs their finger over the textured surface 214 of the demonstration device 200 (i.e. whether the user runs their finger over the textured surface 214 in the +y direction or the -y direction). For example, if the textured surface 214 were formed by grooves having a saw-tooth pattern, it would be possible to distinguish the direction of the movement: when swiping from left to right, a finger encounters a steep edge followed by a relatively slowly declining slope, and due to the asymmetric shape of the grooves of the textured surface 214 a swipe in the opposite direction results in a different vibration signal. In embodiments of the present disclosure, the memory 108 may store predefined lighting control commands, in association with a three-dimensional surface texture on the housing 112 of the luminaire 100, which control a lighting parameter of the light emitted by the light source(s) 104 in dependence on the direction in which a user drags their finger (or other object) across the three-dimensional surface texture. The controller 102 is configured to detect the direction in which a user drags their finger (or other object) across the three-dimensional surface texture on the housing 112 of the luminaire 100 based on the vibration signal output from the vibration sensor 106, and to control the light emitted by the light source(s) 104 by transmission of an appropriate lighting command selected based on the detected direction. As a mere example, the detected direction could be used to increase/decrease the intensity/colour and adapt the beam of the light. It could also be used to change the pace of dynamic light effects.
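As a rough sketch of the direction cue (an assumed heuristic, not the disclosed method): with an asymmetric saw-tooth profile, a swipe that meets the steep edge first produces sharper rises than falls in the rectified y-axis signal, and the mirrored swipe reverses this:

    import numpy as np

    def drag_direction(window: np.ndarray) -> str:
        """Crude direction guess for a swipe along y over a saw-tooth texture:
        compare the sharpest rise and sharpest fall of the rectified y signal."""
        y = np.abs(window[:, 1])
        d = np.diff(y)
        return "+y" if d.max() > -d.min() else "-y"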
In other embodiments, the controller 102 is configured to detect a trajectory (combination of location and time) of the object being dragged across a three-dimensional textured portion of the housing 112 based on the vibration signal, and control the light emitted by the light source(s) 104 based on the detected trajectory.
Three-dimensional textured portions of the housing 112 can be designed such that the vibration signal output from the vibration sensor 106, in response to a user dragging their finger (or other object) over the three-dimensional textured portion, varies in dependence on the force applied by the object. The controller 102 may be configured to detect the force with which a user presses and drags an object over a three-dimensional textured portion of the housing 112 based on the vibration signal, and control the light emitted by the light source(s) 104 based on the detected force.
Thus it can be seen from the embodiments described above that the present invention extends the versatility of gestures that can be enabled by exploiting the 3D surface texture of the luminaire 100. Surfaces with specific roughness will generate specific vibration patterns. The gesture-set can be made richer (beyond simple binary touch) by implicitly encoding gesture parameters depending on the surface properties. For example, the vibration pattern sensed by the vibration sensor can be used to identify the velocity/direction of the object (finger) touching the surface as described above.
It will be appreciated by persons skilled in the art that embodiments of the present disclosure remove the need for a PCB or flex foil with electronic read-out circuitry at the location of the mechanical push buttons. This reduces the physical size of the luminaire 100 compared with those known in the art, which require a bulky PCB to connect the mechanical part of the buttons with their associated read-out electronic circuits. The electro-mechanical push buttons present in known devices have an impact on the cost and freedom of design of the devices. By adding only the mechanical part of the push button and using the vibration signal output by the vibration sensor 106 to decide when and which button is pressed, the device according to embodiments of the present disclosure can be made cheaper and with fewer restrictions on its design. In many products a vibration sensor 106 is already present, so this component can be reused. Whilst incorporating one or more three-dimensional surface textures on the housing 112 of the luminaire 100 results in a slightly more complex design of the mould used to produce the housing 112, there will be no increase in the financial cost of the mechanical push buttons.
It will be appreciated that the above embodiments have been described only by way of example.
Whilst the vibration sensor 106 has been described above as being an accelerometer, this is merely an example and the vibration sensor 106 may take other forms. For example, the vibration sensor 106 may be a microphone, a gyroscope, a magnetometer (eCompass), a piezoelectric sensor, or a tilt sensor, which may be fluid-based with the capability to detect high-frequency motion, or may comprise a metal ball arranged to make contact with conductive elements in certain orientations.
Whilst embodiments have been described above with reference to mechanical push buttons, it will be appreciated that this is just one example of a mechanical switch. Other mechanical switches which have a specific vibration signature over time when a user interacts with them may be used, for example pull buttons, rotary switches, toggle switches, rocker switches, and slide switches.
Whilst embodiments have been described above with reference to three-dimensional surface textures being a feature of the housing 112 itself, in other embodiments an attachment (e.g. a sticker) having a three-dimensional surface texture may be attached to a surface of the housing 112 of the luminaire 100. This enables additional interaction surfaces to be provided beyond those already present on the device.
Whilst embodiments have been described above with reference to the luminaire 100 comprising one or more mechanical push buttons 110 and/or one or more three-dimensional textured surfaces, in other embodiments a remote device (e.g. a pen or other handheld instrument) may comprise the one or more mechanical push buttons 110 and/or one or more three-dimensional textured surfaces, such that a specific vibration results when a user touches a surface of the housing of the luminaire 100 with the remote device. One or more vibration patterns associated with a user interacting with a surface of the housing 112 of the luminaire 100 using the remote device may be stored in memory 108, together with predefined lighting control commands associated with those user interactions. This enables the controller 102 to detect that a user has interacted with a surface of the luminaire 100 using the remote device based on the vibration signal output by the vibration sensor 106, and to control the light emitted by the light source(s) 104 accordingly.
Whilst embodiments have been described above with reference to a luminaire, embodiments of the present disclosure can be applied to devices other than a luminaire, for example a remote control, or appliances such as a kitchen appliance, shaver, toothbrush, etc. In these other embodiments, the controller 102 is configured to receive the vibration signal output by the vibration sensor 106; detect a user's interaction with an interaction surface on the device based on the vibration signal corresponding to the vibration pattern associated with the interaction surface; and control the device based on this detection. Whilst this "control" has been described above with reference to lighting control in the embodiments described herein, when embodiments of the present disclosure are applied to devices other than a luminaire this "control" is dependent on the functionality of the device.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The term "controller" as used herein generally represents software, firmware, hardware, or a combination thereof. In the case of a software implementation, the controller represents program code that performs specified tasks when executed on a processor (e.g. CPU or CPUs). The program code can be stored in one or more computer readable memory devices. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

1. A device comprising:
at least one interaction surface (110, 112);
a vibration sensor (106) configured to output a vibration signal;
a memory (108) storing a predetermined vibration pattern associated with each of the at least one interaction surface; and
a controller (102) configured to:
receive the vibration signal;
detect a user's interaction with an interaction surface of the at least one interaction surface based on the vibration signal corresponding to the vibration pattern associated with said interaction surface; and
control the device based on said detection.
2. The device according to claim 1, wherein the at least one interaction surface comprises at least one mechanical switch, and the memory stores a predetermined vibration pattern associated with each of the at least one mechanical switch.
3. The device according to claim 2, wherein the controller is configured to detect a mechanical switch of the at least one mechanical switch being interacted with based on the vibration signal corresponding to the vibration pattern associated with said mechanical switch.
4. The device according to any preceding claim, wherein the at least one interaction surface comprises at least one three dimensional textured surface, and the memory stores a predetermined vibration pattern associated with each of the at least one three dimensional textured surfaces.
5. The device according to claim 4, wherein the controller is configured to detect an object being dragged across a three dimensional textured surface of the at least one three dimensional textured surface based on the vibration signal corresponding to the vibration pattern associated with said three dimensional textured surface.
6. The device according to claim 5, wherein the controller is configured to detect a direction of the object being dragged across said three dimensional textured surface based on the vibration signal, and control the device based on said direction.
7. The device according to claim 5 or 6, wherein the controller is configured to detect a speed of the object being dragged across said three dimensional textured surface based on the vibration signal, and control the device based on said speed.
8. The device according to any of claims 5 to 7, wherein the controller is configured to detect a trajectory of the object being dragged across said three dimensional textured surface based on the vibration signal, and control the device based on said trajectory.
9. The device according to any preceding claim, wherein the controller is configured to detect a force of the user's interaction with the interaction surface based on the vibration signal, and control the device based on said force.
10. The device according to any preceding claim wherein the vibration sensor is one or any combination of: an accelerometer; a microphone; a gyroscope; a tilt sensor; a magnetometer; and a piezoelectric sensor.
11. The device according to any preceding claim, wherein the device further comprises at least one light source (104) for emitting light to illuminate an environment of the device, and the controller is configured to control the light emitted from the at least one light source based on said detection.
12. The device according to claim 11, wherein the controller is configured to transmit a lighting command to the at least one light source to control a lighting parameter of the light emitted from the at least one light source based on said detection.
13. The device according to claim 1, wherein the device comprises a housing which houses the vibration sensor, the memory and the controller; and the at least one interaction surface comprises a portion of said housing; and the memory stores a predetermined vibration pattern associated with a remote device, that comprises one or any combination of at least one mechanical switch and at least one three dimensional textured surface, interacting with said portion of said housing.
14. A method of controlling a device, the method comprising:
receiving a vibration signal output by a vibration sensor (106) of the device;
detecting a user's interaction with an interaction surface (110, 112) of at least one interaction surface of the device based on the vibration signal corresponding to a predetermined vibration pattern associated with said interaction surface that is stored in a memory (108) of the device; and
controlling the device based on said detection.
15. A computer program product comprising code embodied on a computer-readable medium and being configured so as when executed on a processor of a device, to:
receive a vibration signal output by a vibration sensor (106) of the device;
detect a user's interaction with an interaction surface (110, 112) of at least one interaction surface of the device based on the vibration signal corresponding to a predetermined vibration pattern associated with said interaction surface that is stored in a memory (108) of the device; and
control the device based on said detection.
PCT/EP2016/073280 2015-10-15 2016-09-29 Touch control WO2017063893A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562242024P 2015-10-15 2015-10-15
US62/242,024 2015-10-15

Publications (1)

Publication Number Publication Date
WO2017063893A1 true WO2017063893A1 (en) 2017-04-20

Family

ID=57209422

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/073280 WO2017063893A1 (en) 2015-10-15 2016-09-29 Touch control

Country Status (1)

Country Link
WO (1) WO2017063893A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070132738A1 (en) * 2005-12-14 2007-06-14 Research In Motion Limited Handheld electronic device having virtual navigational input device, and associated method
FR2927575A1 (en) * 2008-02-18 2009-08-21 Faurecia Interieur Ind Snc Man-machine interface device for controlling e.g. audio system, in fascia of motor vehicle, has generation device generating impacts on rigid support and activated by user, where generation device includes brusque action activation element
WO2011001229A1 (en) * 2009-07-03 2011-01-06 Sony Ericsson Mobile Communications Ab Tactile input for accessories
US20140191963A1 (en) * 2013-01-08 2014-07-10 Sony Corporation Apparatus and method for controlling a user interface of a device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2560322A (en) * 2017-03-06 2018-09-12 Jaguar Land Rover Ltd Control apparatus and method for controlling operation of a component
GB2560322B (en) * 2017-03-06 2022-02-16 Jaguar Land Rover Ltd Control apparatus and method for controlling operation of a component
CN107329577A (en) * 2017-07-19 2017-11-07 深圳市连山易科技有限公司 A kind of Intelligent lamp control method and device based on gesture identification
CN107329577B (en) * 2017-07-19 2020-12-22 深圳市连山易科技有限公司 Intelligent lamp control method and device based on gesture recognition
CN114077325A (en) * 2020-08-12 2022-02-22 北京钛方科技有限责任公司 Sensing device of equipment, touch detection method and system
CN114077325B (en) * 2020-08-12 2023-09-29 北京钛方科技有限责任公司 Sensing device of equipment, touch detection method and system
CN112099631A (en) * 2020-09-16 2020-12-18 歌尔科技有限公司 Electronic equipment and control method, device and medium thereof
EP4216576A1 (en) 2022-01-21 2023-07-26 Oticon A/s A hearing aid comprising an activation element


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16788030

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16788030

Country of ref document: EP

Kind code of ref document: A1