WO2014187900A1 - Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile translationsgeste - Google Patents

Schalterbetätigungseinrichtung, mobiles Gerät und Verfahren zum Betätigen eines Schalters durch eine nicht-taktile Translationsgeste (switch actuating device, mobile device and method for actuating a switch by a non-tactile translational gesture)

Info

Publication number
WO2014187900A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
pixels
pixel
switch
translational
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2014/060546
Other languages
German (de)
English (en)
French (fr)
Inventor
Carsten Giebeler
Spyros BROWN
Tim Chamberlain
Jonathan Ephraim David Hurwitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pyreos Ltd
Original Assignee
Pyreos Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pyreos Ltd filed Critical Pyreos Ltd
Priority to EP14725696.0A priority Critical patent/EP3005564B1/de
Priority to JP2016514411A priority patent/JP2016526213A/ja
Priority to CN201480039985.7A priority patent/CN105493408A/zh
Priority to KR1020157036314A priority patent/KR20160013140A/ko
Publication of WO2014187900A1 publication Critical patent/WO2014187900A1/de
Priority to US14/949,993 priority patent/US10007353B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0308Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K17/00Electronic switching or gating, i.e. not by contact-making and –breaking
    • H03K17/94Electronic switching or gating, i.e. not by contact-making and –breaking characterised by the way in which the control signals are generated
    • H03K17/941Electronic switching or gating, i.e. not by contact-making and –breaking characterised by the way in which the control signals are generated using an optical detector
    • H03K17/943Electronic switching or gating, i.e. not by contact-making and –breaking characterised by the way in which the control signals are generated using an optical detector using a plurality of optical emitters or detectors, e.g. keyboard
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K17/00Electronic switching or gating, i.e. not by contact-making and –breaking
    • H03K17/94Electronic switching or gating, i.e. not by contact-making and –breaking characterised by the way in which the control signals are generated
    • H03K17/945Proximity switches
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K2217/00Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/941Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated using an optical detector
    • H03K2217/94102Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated using an optical detector characterised by the type of activation
    • H03K2217/94106Passive activation of light sensor, e.g. by ambient light
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K2217/00Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/941Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated using an optical detector
    • H03K2217/94112Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated using an optical detector having more than one receiver

Definitions

  • The invention relates to a switch actuating device, a mobile device with the switch actuating device, and a method for actuating a switch with the switch actuating device by a non-tactile translational gesture of a heat-emitting part, in particular a human hand.
  • Conventionally, a gesture recognition device is equipped with a device for optical detection, in particular of the gesturing hand, and the image information generated thereby is evaluated with corresponding algorithms in order to derive a gesture from the image information.
  • The device for optically detecting a non-tactile gesture is conventionally a camera, which disadvantageously takes up a large amount of space and entails high investment costs.
  • Camera-based devices for gesture recognition therefore cannot be realized in a miniaturized design at low cost, as would be advantageous for use in mobile phones.
  • Camera-based devices also have a disadvantageously high energy consumption, which is unfavorable for mobile devices.
  • Touch screens are only suitable for detecting tactile gestures and not for detecting non-tactile gestures.
  • The object of the invention is to provide a switch actuating device and a method for actuating a switch with the switch actuating device by a non-tactile translational gesture, wherein the switch actuating device has a miniaturized design at low cost and low energy consumption and the actuation of the switch with the switch actuating device is reliable and largely error-free.
  • The switch actuating device according to the invention has a gesture sensor with pixels, which detect the heat emitted by the part and output corresponding signal excursions, a signal evaluation unit, with which from the temporal succession of the signal excursions the exercise of one of the types of translational gestures can be determined, and an actuator, which is controlled by the signal evaluation unit and which actuates the switch as soon as the exercise of one of the types of translational gestures is determined.
  • A first type of translational gesture is defined by a movement of the part in a longitudinal direction, a second type by a movement of the part against the longitudinal direction, a third type by a movement of the part in a transverse direction perpendicular to the longitudinal direction, and a fourth type by a movement of the part against the transverse direction.
  • Four of the pixels are arranged in each case in one of the corners of a convex quadrilateral, one of the diagonals being substantially parallel to the longitudinal direction and the other diagonal being substantially parallel to the transverse direction.
  • The mobile device according to the invention has the switch actuating device, the switch being connected in the mobile device for activating/deactivating a functionality of the mobile device.
  • The method according to the invention for actuating the switch with the switch actuating device comprises the steps of: exercising one of the possible translational gestures with the heat-emitting part, so that the sequence of signal excursions is output from the pixels to the signal evaluation unit; on the basis of the sequence of signal excursions, identifying the translational gesture as being of the first, the second, the third or the fourth type; and, depending on the identified type of translational gesture, correspondingly activating the actuator for actuating the switch by the signal evaluation unit.
  • Lead zirconate titanate is preferred as the pyroelectric material.
  • The signals generated by the heat-emitting part when the non-tactile translational gestures are exercised are such that the type of translational gesture exercised can be detected reliably and with few errors by the method according to the invention.
  • The gesture sensor with the pixels can be produced in such a miniaturized design and at such low cost that the switch actuating device can advantageously be used for the mobile device.
  • The signal is generated with the thin films by the heat emitted by the part, so that the gesture sensor does not need to be powered by an external power source.
  • In addition, the switch actuating device has the signal evaluation unit and the actuator.
  • Preferably, the quadrilateral is a rhombus.
  • The distance between two immediately adjacent pixels is preferably between 50 µm and 300 µm.
  • Preferably, the part is a human hand and the heat emitted by the part is the body heat radiated by the human hand.
  • Preferably, the method for operating the switch actuating device comprises the step of: checking whether the temporally second and the temporally third signal excursion occur within a first predetermined offset period of each other.
  • Preferably, the first predetermined offset period is 0.5 ms.
  • The first predetermined offset period thereby defines the simultaneity of the occurrence of the middle signal excursions required for reliable and error-free gesture recognition.
  • Preferably, the method for operating the switch actuating device further comprises the step of: checking whether the temporally first signal excursion occurs at least a second predetermined offset period before the temporally second or third signal excursion and whether the temporally fourth signal excursion occurs at least the second predetermined offset period after the temporally second or third signal excursion.
  • Preferably, the second predetermined offset period is between 7 ms and 40 ms.
  • The second predetermined offset period thereby specifies the lead time of the first signal excursion and the lag of the last signal excursion relative to the middle signal excursions required for reliable and error-free gesture recognition. It is thus checked whether the middle signal excursions lie within the first offset period of each other and whether the lead time of the first signal excursion and the lag of the last signal excursion each amount to at least the second offset period.
  • The directions provided for the translational gestures are substantially parallel to the longitudinal direction or substantially parallel to the transverse direction.
  • The signal excursions used are preferably either the amplitude profiles of the signals output by the pixels or the first time derivative of the amplitude profiles of the signals output by the pixels.
  • The pixels each have the thin layer of the pyroelectric material, preferably lead zirconate titanate.
  • Preferably, the respective S-shaped signal excursion has a sinusoidal shape, as shown in the figures.
  • Preferably, the method for operating the switch actuating device comprises the step of: identifying the shapes of the signal excursions and checking whether the shapes of the signal excursions each have the S-shape.
  • The characteristic S-shape of the signal excursions results from the approach of the part to the gesture sensor and the subsequent removal of the part from the gesture sensor.
  • Preferably, the temporal occurrence of the maxima and/or minima of the S-shaped signal excursions of the pixel signals is used for the checks.
  • The maxima and/or minima of the S-shaped signal excursions of the signals can be determined easily and precisely by the signal evaluation unit.
  • FIG. 1 shows a schematic representation of a switch actuating device according to the invention for a mobile device according to the invention,
  • FIG. 2 shows a schematic representation of a gesture sensor,
  • FIG. 3 shows a diagram with amplitude profiles of signals of the gesture sensor from FIG. 2,
  • FIG. 4 shows a diagram with the first time derivative of the amplitude profiles from FIG. 3,
  • FIG. 5 shows a detail view of FIG. 3, and
  • FIG. 6 shows a diagram with a rule for forming the first time derivative of the amplitude profiles.
  • In FIG. 1, a switch actuating device 100 incorporated in a mobile device is shown.
  • The switch actuating device 100 has a gesture sensor 1 and a signal evaluation unit 101, which is coupled to the gesture sensor 1 via a signal line 102 for transmitting signals from the gesture sensor 1 to the signal evaluation unit 101.
  • Depending on the evaluation of the signals that are transmitted from the gesture sensor 1 to the signal evaluation unit 101, the signal evaluation unit 101 activates or deactivates an actuator 104 with which a switch 103 of the mobile device can be actuated.
  • The switch 103 serves for activating/deactivating a functionality of the mobile device.
  • The gesture sensor 1 serves for detecting non-tactile translational gestures; upon a translational gesture, the gesture sensor 1 outputs one or more signals via the signal line 102 to the signal evaluation unit 101 for evaluation.
  • The actuation of the switch 103 is only triggered when one of four types of translational gestures 111 to 114 is identified by the gesture sensor 1 and the signal evaluation unit 101.
  • The translational gestures must be exercised non-tactilely with a hand 115 near the gesture sensor 1, heat emitted by the hand 115 being detectable by the gesture sensor 1.
  • The translational gesture of the first type 111 is a movement of the hand 115 from left to right, the translational gesture of the second type 112 a movement of the hand 115 from right to left, the translational gesture of the third type 113 a movement of the hand 115 from bottom to top, and the translational gesture of the fourth type 114 a movement of the hand 115 from top to bottom.
  • FIG. 2 shows a schematic representation of the gesture sensor 1 with its four pixels 21 to 24, which are arranged in the corners of a rhombus 11.
  • The first pixel 21 is arranged in the upper corner of the rhombus 11, the second pixel 22 in the right corner, the third pixel 23 in the lower corner, and the fourth pixel 24 in the left corner of the rhombus 11.
  • The distance 25 between two immediately adjacent pixels is between 50 µm and 300 µm.
  • A longitudinal direction 31, which runs horizontally in FIG. 2, is parallel to the longitudinal diagonal 12, which is formed by the fourth pixel 24 and the second pixel 22.
  • A transverse direction 32 is, as seen in FIG. 2, parallel to the transverse diagonal 13, which is formed by the first pixel 21 and the third pixel 23.
  • The translational gestures of the first type 111 and the second type 112 run parallel to the longitudinal direction 31, whereas the translational gestures of the third type 113 and the fourth type 114 run parallel to the transverse direction 32.
  • The pixels 21 to 24 each have a thin film of the pyroelectric material.
  • Upon exercising one of the translational gestures with the hand 115, each pixel 21 to 24 outputs to the signal evaluation unit 101 a signal with a signal excursion 58 corresponding to the temporal intensity profile of the heat detected by the thin film of the respective pixel 21 to 24.
  • the signal of the first pixel 21 is denoted by the reference numeral 51
  • the signal of the second pixel 22 is denoted by the reference numeral 52
  • the signal of the third pixel 23 is denoted by the reference numeral 53
  • the signal of the fourth pixel 24 is designated by the reference numeral 54.
  • FIG. 3 shows a diagram with the temporal amplitude profiles of the signals 51 to 54 output by the pixels 21 to 24, the time being plotted on the abscissa 61 and the signal amplitude on the ordinate.
  • The signals 51 to 54 each have an S-shape, a first part 56 of the S-shaped signal excursion being generated upon an approach of the hand 115 to the gesture sensor 1 and a second part 57, following the first part 56, being generated upon a removal of the hand 115 from the gesture sensor 1.
  • A signal level 55 of pixel passivity occurs when no heat emitted by the hand 115 is detected by the gesture sensor 1.
  • FIG. 5 shows a detail from FIG. 3, in which the first part 56 of the S-shaped signal excursion upon the approach of the hand 115 is shown.
  • The signals 51 to 54 are generated by the translational gesture of the first type 111.
  • The hand 115 passes first the fourth pixel 24, then simultaneously the first pixel 21 and the third pixel 23, and then the second pixel 22. This results in a corresponding time offset of the signals 51 to 54, so that the signal 54 of the fourth pixel 24 is the temporally first signal and the signal 52 of the second pixel 22 is the temporally last signal.
  • The signals 51 and 53 of the first pixel 21 and the third pixel 23 lie in time between the signals 54 and 52.
  • This temporal order of the signals 51 to 54 is also reflected in the order of the minima 81 to 84, so that the fourth minimum 84 occurs first and the second minimum 82 occurs last, with the first minimum 81 and the third minimum 83 lying between the minima 84 and 82.
  • The translational gesture of the first type 111 is carried out such that the hand 115 is moved parallel to the longitudinal direction 31 and perpendicular to the transverse direction 32. As a result, the hand 115 is first detected by the fourth pixel 24 and last by the second pixel 22, with the detection of the hand 115 by the third pixel 23 and the first pixel 21 lying in between. Because the translational gesture of the first type 111 runs perpendicular to the transverse direction 32, the first pixel 21 and the third pixel 23 detect the hand 115 essentially simultaneously.
  • The occurrence of the fourth minimum 84 is designated in FIG. 5 by a first time 91, and the occurrences of the first minimum 81 and the third minimum 83 by a second time 92 and a third time 93; between the first time 91 and the second time 92 or the third time 93 there is in each case a time offset 94.
  • In principle, any gesture may be exercised with the hand 115. For the switch 103 to be actuated, it is necessary to identify the presence of a translational gesture of one of the four types 111 to 114.
  • In the signal evaluation unit 101 it is checked whether the signals 51 to 54 have an S-shape and, in the course of time, first an amplitude excursion downwards and then upwards, that is, whether first the minima 81 to 84 of the signals 51 to 54 and then their maxima occur. If this check is positive, the signals 51 to 54 are used to identify the translational gesture. It would also be conceivable for the pixels 21 to 24 to be connected in such a way that, when the same translational gesture is exercised, first the maxima and then the minima occur. In addition, it is checked in the signal evaluation unit 101 whether the absolute values of all four minima 81 to 84 of the first part 56 of the S-shaped signal excursion exceed a predetermined amplitude level.
  • The predetermined amplitude level is sized such that expected spurious signals from the environment of the gesture sensor 1 lie below the predetermined amplitude level.
  • The distance between two of the pixels 21 to 24 that are arranged immediately adjacent is between 50 µm and 300 µm. At the usual movement speeds of the hand 115, this makes it possible, on the basis of the temporal sequence of the minima 81 to 84 or of their associated maxima, to identify the type 111 to 114 of the translational gesture or to reject the translational gesture as not belonging to one of the types 111 to 114.
  • The functioning of the signal evaluation unit 101 is explained below with reference to the identification of the translational gesture of the first type 111.
  • The identification of the translational gestures of the other types 112 to 114 proceeds analogously.
  • Signals detected by the gesture sensor 1 that do not satisfy these checks are rejected as not belonging to a translational gesture of one of the four types 111 to 114.
  • If, for example, the signal evaluation unit 101 determines that the first minimum 81 of the first pixel 21 and the third minimum 83 of the third pixel 23 occur within 0.5 ms of each other, it is derived that either a translational gesture of the first type 111 or a translational gesture of the second type 112 is present. It is then checked in the signal evaluation unit 101 whether the fourth minimum 84 of the fourth pixel 24 occurs before, and the second minimum 82 of the second pixel 22 occurs after, the first minimum 81 and the third minimum 83.
  • If the fourth minimum 84 occurs 7 ms to 40 ms before the first minimum 81 or the third minimum 83, whichever of the minima 81, 83 occurs earlier, and the second minimum 82 occurs 7 ms to 40 ms after the first minimum 81 or the third minimum 83, whichever of the minima 81, 83 occurs later, the translational gesture detected by the gesture sensor 1 is identified as a translational gesture of the first type 111.
  • As a result, the switch 103 is actuated via the actuator 104. Gestures that are not identified as one of the four types 111 to 114 are discarded in the signal evaluation unit 101 and do not lead to an actuation of the switch 103.
  • The identification of the translational gestures of the second to fourth types 112 to 114 takes place in an analogous manner. In principle, any combination of the checks in any order is conceivable.
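The timing logic described in the preceding bullets can be illustrated with a short sketch. The following Python snippet is not part of the patent; it is a minimal illustration, under the assumption that the times of the four minima 81 to 84 (in milliseconds) have already been extracted from the pixel signals, of how the 0.5 ms simultaneity window and the 7 ms to 40 ms lead/lag window could be combined to classify the four gesture types 111 to 114. All names (classify_gesture, SIMULTANEITY_MS, and so on) are illustrative and do not appear in the patent.

```python
# Illustrative sketch (not from the patent text): classify one of the four
# translational gesture types 111-114 from the times of the signal minima
# of the four pixels 21 (top), 22 (right), 23 (bottom), 24 (left).

SIMULTANEITY_MS = 0.5    # first predetermined offset period
MIN_LEAD_LAG_MS = 7.0    # second predetermined offset period, lower bound
MAX_LEAD_LAG_MS = 40.0   # second predetermined offset period, upper bound

def _matches(first, middle_a, middle_b, last):
    """Check the timing pattern: 'first' leads, the two middle minima are
    essentially simultaneous, and 'last' trails, each by 7 ms to 40 ms."""
    if abs(middle_a - middle_b) > SIMULTANEITY_MS:
        return False
    earlier, later = min(middle_a, middle_b), max(middle_a, middle_b)
    lead = earlier - first   # lead time of the temporally first minimum
    lag = last - later       # lag of the temporally last minimum
    return (MIN_LEAD_LAG_MS <= lead <= MAX_LEAD_LAG_MS and
            MIN_LEAD_LAG_MS <= lag <= MAX_LEAD_LAG_MS)

def classify_gesture(t21, t22, t23, t24):
    """Return the gesture type (111-114) from the minima times in ms of
    pixels 21 to 24, or None if the signals must be discarded."""
    if _matches(t24, t21, t23, t22):
        return 111   # first type: in the longitudinal direction (left to right)
    if _matches(t22, t21, t23, t24):
        return 112   # second type: against the longitudinal direction (right to left)
    if _matches(t23, t22, t24, t21):
        return 113   # third type: in the transverse direction (bottom to top)
    if _matches(t21, t22, t24, t23):
        return 114   # fourth type: against the transverse direction (top to bottom)
    return None      # discard: not one of the four types

# Example: minima of pixels 24, 21, 23, 22 at 0 ms, 20 ms, 20.2 ms and 40 ms
# correspond to a translational gesture of the first type 111.
print(classify_gesture(t21=20.0, t22=40.0, t23=20.2, t24=0.0))  # -> 111
```

In the same spirit, the S-shape check and the check that the absolute values of all four minima exceed the predetermined amplitude level would precede these timing comparisons.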

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Electronic Switches (AREA)
  • Telephone Set Structure (AREA)
  • Thermotherapy And Cooling Therapy Devices (AREA)
  • Position Input By Displaying (AREA)
PCT/EP2014/060546 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile translationsgeste Ceased WO2014187900A1 (de)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP14725696.0A EP3005564B1 (de) 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile translationsgeste
JP2016514411A JP2016526213A (ja) 2013-05-24 2014-05-22 スイッチ作動装置、移動機器、および、非触覚並進ジェスチャによるスイッチの作動方法
CN201480039985.7A CN105493408A (zh) 2013-05-24 2014-05-22 开关启动系统、移动装置、及用于使用非触碰推动手势启动开关的方法
KR1020157036314A KR20160013140A (ko) 2013-05-24 2014-05-22 비-촉각 병진 제스처를 이용하여 스위치를 작동시키기 위한 스위치 작동 시스템, 모바일 디바이스 및 방법
US14/949,993 US10007353B2 (en) 2013-05-24 2015-11-24 Switch operating device, mobile device and method for operating a switch by a non-tactile translational gesture

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361827108P 2013-05-24 2013-05-24
US61/827,108 2013-05-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/949,993 Continuation US10007353B2 (en) 2013-05-24 2015-11-24 Switch operating device, mobile device and method for operating a switch by a non-tactile translational gesture

Publications (1)

Publication Number Publication Date
WO2014187900A1 true WO2014187900A1 (de) 2014-11-27

Family

ID=50771495

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/EP2014/060546 Ceased WO2014187900A1 (de) 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile translationsgeste
PCT/EP2014/060551 Ceased WO2014187904A1 (de) 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine präsenz eines wärme emittierenden teils
PCT/EP2014/060549 Ceased WO2014187902A1 (de) 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile "push"-geste

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/EP2014/060551 Ceased WO2014187904A1 (de) 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine präsenz eines wärme emittierenden teils
PCT/EP2014/060549 Ceased WO2014187902A1 (de) 2013-05-24 2014-05-22 Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile "push"-geste

Country Status (7)

Country Link
US (3) US10001840B2 (en)
EP (3) EP3005564B1 (en)
JP (3) JP2016530481A (en)
KR (3) KR20160012168A (en)
CN (3) CN105531930A (en)
DE (3) DE102014106661B4 (en)
WO (3) WO2014187900A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014017585B4 (de) * 2014-11-27 2017-08-24 Pyreos Ltd. Schalterbetätigungseinrichtung, mobiles Gerät und Verfahren zum Betätigen eines Schalters durch eine nicht-taktile Geste
JP6543185B2 (ja) * 2015-12-22 2019-07-10 クラリオン株式会社 車載装置
JP2023067082A (ja) 2021-10-29 2023-05-16 上海天馬微電子有限公司 検出装置及び検出方法
CN115051699B (zh) * 2022-07-22 2025-09-16 青岛海信智慧生活科技股份有限公司 无接触开关控制方法及无接触开关

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993009414A1 (en) * 1991-11-04 1993-05-13 Honeywell Inc. Thin film pyroelectric imaging array
JP2008232715A (ja) * 2007-03-19 2008-10-02 Matsushita Electric Works Ltd 物体検知システム
US20100204953A1 (en) * 2009-02-12 2010-08-12 Sony Corporation Gesture recognition apparatus, gesture recognition method and program
DE102009017845A1 (de) * 2009-04-17 2010-10-21 Pyreos Ltd. Infrarotlichtsensor mit hoher Signalspannung und hohem Signal-Rausch-Verhältnis, sowie Infrarotlichtdetektor mit dem Infrarotlichtsensor
US20100295773A1 (en) * 2009-05-22 2010-11-25 Rachid Alameh Electronic device with sensing assembly and method for interpreting offset gestures
WO2011018253A1 (de) * 2009-08-11 2011-02-17 Pyreos Ltd. Kompakter infrarotlichtdetektor und verfahren zur herstellung desselben sowie ein infrarotlichtdetektorsystem mit dem infrarotlichtdetektor
US20110050643A1 (en) * 2009-08-28 2011-03-03 INVENTEC APPLIANCES (Shanghai) CO., LTD./ INVENTEC APPLIANCES CORP. Passive infrared sensing user interface and device using the same

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3176558D1 (en) * 1980-12-29 1988-01-14 Rothenhaus Robert Control device responsive to infrared radiation
JPS6113816A (ja) * 1984-06-29 1986-01-22 Toshiba Corp 焦電スイツチ装置
JPH0786537B2 (ja) 1987-09-26 1995-09-20 松下電工株式会社 人体検出装置
JPH0615347Y2 (ja) 1987-11-20 1994-04-20 三洋電機株式会社 自動販売機の商品投入規制装置
JPH07120316A (ja) * 1993-10-25 1995-05-12 Matsushita Electric Works Ltd 赤外線式人体検知装置
JPH07120318A (ja) * 1993-10-27 1995-05-12 Hokuriku Electric Ind Co Ltd 焦電型赤外線検出装置
JPH08122143A (ja) * 1994-10-27 1996-05-17 Murata Mfg Co Ltd 赤外線検出器
JPH10142351A (ja) * 1996-11-08 1998-05-29 Matsushita Seiko Co Ltd 人感センサ
JP2001235552A (ja) * 2000-02-22 2001-08-31 Noboru Yoshizaki パターン化手合図の赤外線を識別検知する装置
JP3835244B2 (ja) * 2001-10-19 2006-10-18 松下電工株式会社 焦電型赤外線検知素子
JP2005223629A (ja) * 2004-02-05 2005-08-18 Asahi Kasei Corp 携帯電子機器
US7196316B2 (en) * 2004-09-22 2007-03-27 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Portable electronic device with activation sensor
JP4576958B2 (ja) * 2004-09-27 2010-11-10 セイコーエプソン株式会社 印刷装置、および、印刷材の状態の検出方法
JP4884078B2 (ja) * 2006-05-26 2012-02-22 三洋電機株式会社 人体検出装置及び映像表示装置
JP2008245086A (ja) * 2007-03-28 2008-10-09 Brother Ind Ltd コードレス電話装置
WO2008132546A1 (en) * 2007-04-30 2008-11-06 Sony Ericsson Mobile Communications Ab Method and algorithm for detecting movement of an object
JP2009223490A (ja) * 2008-03-14 2009-10-01 Shimizu Corp 仮想スイッチならびにそれを用いた家電制御システムおよび家電制御方法
JP5056623B2 (ja) * 2008-06-30 2012-10-24 ぺんてる株式会社 非接触入力装置
JP2009134761A (ja) * 2009-03-16 2009-06-18 Hitachi Ltd 非接触入力インターフェース装置及び情報端末装置
CN102043486A (zh) * 2010-08-31 2011-05-04 苏州佳世达电通有限公司 手持电子装置的操作方法
WO2012066562A2 (en) * 2010-11-16 2012-05-24 Muthukumar Prasad Smart radiation protection system for mobile device to reduce sar by forming actively tunable electromagnetic shadow on user facing direction works by sensing device proximity environment with property, position, orientation, signal quality and operating modes
JP5554689B2 (ja) * 2010-11-22 2014-07-23 旭化成エレクトロニクス株式会社 位置および動作判定方法および入力装置
JP5617581B2 (ja) 2010-12-08 2014-11-05 オムロン株式会社 ジェスチャ認識装置、ジェスチャ認識方法、制御プログラム、および、記録媒体

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993009414A1 (en) * 1991-11-04 1993-05-13 Honeywell Inc. Thin film pyroelectric imaging array
JP2008232715A (ja) * 2007-03-19 2008-10-02 Matsushita Electric Works Ltd 物体検知システム
US20100204953A1 (en) * 2009-02-12 2010-08-12 Sony Corporation Gesture recognition apparatus, gesture recognition method and program
DE102009017845A1 (de) * 2009-04-17 2010-10-21 Pyreos Ltd. Infrarotlichtsensor mit hoher Signalspannung und hohem Signal-Rausch-Verhältnis, sowie Infrarotlichtdetektor mit dem Infrarotlichtsensor
US20100295773A1 (en) * 2009-05-22 2010-11-25 Rachid Alameh Electronic device with sensing assembly and method for interpreting offset gestures
WO2011018253A1 (de) * 2009-08-11 2011-02-17 Pyreos Ltd. Kompakter infrarotlichtdetektor und verfahren zur herstellung desselben sowie ein infrarotlichtdetektorsystem mit dem infrarotlichtdetektor
US20110050643A1 (en) * 2009-08-28 2011-03-03 INVENTEC APPLIANCES (Shanghai) CO., LTD./ INVENTEC APPLIANCES CORP. Passive infrared sensing user interface and device using the same

Also Published As

Publication number Publication date
EP3005564A1 (de) 2016-04-13
JP2016526213A (ja) 2016-09-01
JP6400083B2 (ja) 2018-10-03
EP3005565B1 (de) 2018-10-31
US9857880B2 (en) 2018-01-02
EP3005565A1 (de) 2016-04-13
KR20160012201A (ko) 2016-02-02
DE102014106681B4 (de) 2021-08-26
CN105493408A (zh) 2016-04-13
DE102014106661B4 (de) 2023-11-16
CN105531930A (zh) 2016-04-27
US10001840B2 (en) 2018-06-19
KR20160013140A (ko) 2016-02-03
WO2014187904A1 (de) 2014-11-27
US20160154467A1 (en) 2016-06-02
EP3005564B1 (de) 2018-10-31
EP3005566A1 (de) 2016-04-13
JP2016527590A (ja) 2016-09-08
DE102014106680A1 (de) 2014-11-27
US10007353B2 (en) 2018-06-26
EP3005566B1 (de) 2021-02-17
DE102014106681A1 (de) 2014-11-27
DE102014106680B4 (de) 2023-11-16
JP2016530481A (ja) 2016-09-29
DE102014106661A1 (de) 2014-11-27
CN105531931A (zh) 2016-04-27
WO2014187902A1 (de) 2014-11-27
US20160077600A1 (en) 2016-03-17
US20160077583A1 (en) 2016-03-17
KR20160012168A (ko) 2016-02-02

Similar Documents

Publication Publication Date Title
EP2972698B1 (de) Verfahren zum betreiben eines berührungsempfindlichen bediensystems und vorrichtung mit einem solchen bediensystem
EP2016480A2 (de) Optoelektronische vorrichtung zur erfassung der position und/oder bewegung eines objekts sowie zugehöriges verfahren
DE102014019040B4 (de) Verfahren zum Betreiben einer Bedienvorrichtung eines Kraftfahrzeugs bei einer Mehrfingerbedienung
WO2014040930A1 (de) Verfahren und vorrichtung zur bedienung einer kraftfahrzeugkomponente mittels gesten
EP3005564B1 (de) Schalterbetätigungseinrichtung, mobiles gerät und verfahren zum betätigen eines schalters durch eine nicht-taktile translationsgeste
EP3233323A1 (de) Biegewerkzeug mit einer längsversatz-messvorrichtung
EP2888569B1 (de) Sensorsystem zum erkennen einer bewegung einer infrarotlichtquelle
DE102013200457B4 (de) Bedienvorrichtung für ein Kraftfahrzeug mit einer Gestenüberwachungseinheit
EP3072688A1 (de) Verfahren und vorrichtung zum laminieren eines mehrschichten- sicherheits-dokumentkörpers mit deformationsüberwachung
WO2014001227A1 (de) Bediensystem für ein kraftfahrzeug
EP2492786A1 (de) Bedienelement
EP4038483B1 (de) Anordnung zum erkennen durch eine berührungsempfindliche sensormatrix
EP3400459B1 (de) Verfahren zur elektronischen analyse eines zeitlich veränderlichen signals
DE102020102160B3 (de) Verfahren zum Erzeugen eines Eingabebefehls für einen Roboterarm und Roboterarm
DE102014017585B4 (de) Schalterbetätigungseinrichtung, mobiles Gerät und Verfahren zum Betätigen eines Schalters durch eine nicht-taktile Geste
DE102022111104A1 (de) Eingabevorrichtung für ein Kraftfahrzeug sowie Verfahren zum Betreiben einer Eingabevorrichtung für ein Kraftfahrzeug
EP3513274B1 (de) Verfahren zum erkennen einer berührung eines kapazitiven sensorelements
DE102014201313A1 (de) Verfahren zur Erkennung einer Bewegungsbahn mindestens eines bewegten Objektes innerhalb eines Erfassungsbereiches, Verfahren zur Gestikerkennung unter Einsatz eines derartigen Erkennungsverfahrens sowie Vorrichtung zur Durchführung eines derartigen Erkennungsverfahrens
EP3790678A1 (de) Verfahren mit einer fertigungseinrichtung zum umformen von blech
WO2019207028A1 (de) Bedienvorrichtung
WO2023083724A1 (de) Bedieneingabevorrichtung und verfahren zum betrieb einer bedieneingabevorrichtung
DE102023129876A1 (de) Steuervorrichtung und Steuerverfahren
DE102016202526A1 (de) Verfahren und Vorrichtung zur Erkennung einer Bediengeste eines Nutzers, insbesondere in einem Kraftfahrzeug
DE102023102210A1 (de) Eingabevorrichtung für ein Kraftfahrzeug, bei welcher eine Auswerteeinrichtung ein gewichtetes Betätigungssignal mit einem weiteren Betätigungssignal vergleicht, und Verfahren zum Betreiben einer Eingabevorrichtung
DE102013005662A1 (de) Touchscreen mit Sicherheitsfunktion

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480039985.7

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14725696

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016514411

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2014725696

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20157036314

Country of ref document: KR

Kind code of ref document: A