WO2014168416A1 - Non-contact operating device and electronic device linked thereto - Google Patents

Non-contact operating device and electronic device linked thereto

Info

Publication number
WO2014168416A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
light
light receiving
array
contact
Application number
PCT/KR2014/003086
Other languages
English (en)
Korean (ko)
Inventor
유태경
조현용
허용구
장병탁
신은성
문명지
Original Assignee
주식회사 루멘스
Priority claimed from KR1020130038417A external-priority patent/KR101524050B1/ko
Priority claimed from KR1020130038416A external-priority patent/KR101415931B1/ko
Priority claimed from KR1020130074877A external-priority patent/KR101469186B1/ko
Priority claimed from KR1020130074876A external-priority patent/KR101469129B1/ko
Priority claimed from KR1020130074878A external-priority patent/KR101471816B1/ko
Priority claimed from KR1020130082215A external-priority patent/KR101504148B1/ko
Priority claimed from KR1020130091493A external-priority patent/KR101460028B1/ko
Application filed by 주식회사 루멘스
Publication of WO2014168416A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers characterised by opto-electronic transducing means
    • G06F 3/0421: Digitisers characterised by opto-electronic transducing means, by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers

Definitions

  • The present invention relates to a non-contact manipulation device and an electronic device linked thereto, and more particularly, to a non-contact manipulation device, and an electronic device linked thereto, that can provide new and various interfaces to a user through a simple position and/or motion recognition process.
  • In conventional control methods, motion recognition is implemented using an image sensor (camera).
  • Because the image sensor must remain powered at all times, a great deal of power is wasted unnecessarily.
  • In addition, because motion recognition with an image sensor requires a series of processes that extract a plurality of images and perform image processing on them, the burden on the processor is high, and both the device implementation and the processing algorithms are very complicated. Implementing motion recognition with an image sensor therefore raises the manufacturing cost.
  • The present invention is intended to solve a number of problems, including those described above, by providing a non-contact manipulation device with low power consumption, a fast motion recognition algorithm, low manufacturing cost, and a simple motion recognition process. A further object of the present invention is to provide an electronic device interlocked with such a non-contact manipulation device.
  • these problems are exemplary, and the scope of the present invention is not limited thereby.
  • A non-contact manipulation apparatus according to an aspect of the present invention may include: at least one light emitting unit capable of emitting light; at least one light receiving unit comprising a plurality of optical sensors arranged in horizontal and vertical arrays, which receives the light reflected by an object's motion using the light emitting unit as a light source and converts an optical signal for the object's motion, carrying information on the intensity and angle of the received light, into an electrical signal; and a microcontrol unit configured to receive the electrical signal, calculate spatial coordinates for the object, determine a gesture motion for the object's motion using the spatial coordinates, and generate a control signal corresponding to the gesture motion.
  • The light receiving unit may comprise a first array unit including a plurality of photodiodes arranged in a horizontal array and a second array unit including a plurality of photodiodes arranged in a vertical array, and the optical signal for the object's motion may comprise: an optical signal for the horizontal component of the motion, detected by the first array unit; an optical signal for the vertical component of the motion, detected by the second array unit; and an optical signal for the height component of the motion, detected by the first array unit and the second array unit together (see the sketch below).
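  • As an illustration of this decomposition, the following minimal Python sketch (hypothetical; the patent defines no code, array sizes, or data types) models how per-axis intensity readings from the two photodiode arrays could be packaged into the three component signals described above.

```python
# Hypothetical sketch of the claimed signal decomposition; names and the
# height heuristic are assumptions, not the patent's implementation.
from dataclasses import dataclass
from typing import List

@dataclass
class OpticalSample:
    horizontal: List[float]  # intensities from the first (horizontal) array unit
    vertical: List[float]    # intensities from the second (vertical) array unit

def decompose(sample: OpticalSample):
    """Split one sample into the three claimed signal components."""
    h_component = sample.horizontal  # horizontal component of the motion
    v_component = sample.vertical    # vertical component of the motion
    # The height component is inferred from both arrays together, e.g. from
    # the total reflected intensity (a closer object reflects more light).
    z_component = sum(sample.horizontal) + sum(sample.vertical)
    return h_component, v_component, z_component
```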
  • The first array unit and the second array unit may be arranged separately from each other, spaced apart rather than intermixed.
  • The non-contact manipulation device may further include a communication unit configured to transmit the control signal to an electronic device including at least one of a computer, a notebook, a tablet PC, a tablet mobile communication device, a smart phone, a mobile phone, a smart pad, a game device, a virtual experience device, a portable multimedia player, an electronic book, a USB hub, a mouse, a keyboard, a monitor, and a headset.
  • the control signal may include a signal for controlling hardware constituting the electronic device or software installed on the electronic device.
  • the communication unit may include at least one of a wireless communication unit using Bluetooth communication or Zigbee communication and a wired communication unit using a universal serial bus (USB) cable.
  • At least some of the light emitting unit, the light receiving unit, the microcontrol unit, and the communication unit may be embedded in the electronic device.
  • At least some of the light emitting unit, the light receiving unit, the microcontrol unit, and the communication unit may be disposed outside the electronic device.
  • the light emitting portion and the light receiving portion may be disposed apart from each other without being modularized.
  • the light emitting unit and the light receiving unit may constitute at least one non-contact sensor module.
  • The at least one non-contact sensor module may include a first non-contact sensor module and a second non-contact sensor module spaced apart from each other, and the microcontrol unit may calculate the height component of the spatial coordinates of the object using the separation distance between the first and second non-contact sensor modules and the light receiving angles of the reflected light received by the light receiving units provided in the first and second non-contact sensor modules, respectively.
  • Alternatively, the at least one light receiving unit may include a first light receiving unit and a second light receiving unit spaced apart from each other, and the microcontrol unit may calculate the height component of the spatial coordinates of the object using the separation distance between the first and second light receiving units and the light receiving angles of the reflected light received by the first and second light receiving units, respectively.
  • the light emitting unit may include an infrared light emitting diode (IRED), and the light receiving unit may include a photodiode.
  • An electronic device according to another aspect of the present invention may also be provided, configured to operate in conjunction with the non-contact manipulation apparatus described above.
  • The electronic device may include at least one of a computer, a laptop, a tablet PC, a tablet mobile communication device, a smartphone, a mobile phone, a smart pad, a game device, a virtual experience device, a portable multimedia player, an electronic book, a USB hub, a mouse, a keyboard, a monitor, and a headset.
  • According to embodiments of the present invention as described above, a non-contact manipulation device, and an electronic device interlocked with it, can be implemented with low power consumption, a fast motion recognition algorithm, low manufacturing cost, and simple motion recognition processing.
  • the scope of the present invention is not limited by these effects.
  • FIG. 1 is a configuration diagram illustrating a configuration including a non-contact manipulation apparatus according to an embodiment of the present invention.
  • FIG. 2 is a view illustrating a non-contact sensor module including a light emitting unit and a light receiving unit in a non-contact manipulation apparatus according to an embodiment of the present invention.
  • FIGS. 3 and 4 are views illustrating the concept by which the first array unit detects an optical signal for the horizontal component of an object's movement in the non-contact manipulation apparatus according to an embodiment of the present invention.
  • FIGS. 5 and 6 are views illustrating the concept by which the second array unit detects an optical signal for the vertical component of an object's movement in the non-contact manipulation apparatus according to an embodiment of the present invention.
  • FIG. 7 is a view illustrating the concept by which the non-contact sensor module detects an optical signal for the height component of an object's movement in the non-contact manipulation apparatus according to an embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating a configuration including a non-contact manipulation apparatus according to a modified embodiment of the present invention.
  • FIG. 9 illustrates the concept of calculating the spatial coordinates of an object based on the optical signals detected by the first array unit and the second array unit in a non-contact manipulation apparatus according to an embodiment of the present invention.
  • FIG. 10 is a configuration diagram illustrating a configuration including a non-contact manipulation apparatus according to another embodiment of the present invention.
  • FIG. 11 illustrates the concept of calculating the height component of an object's spatial coordinates using the distance between two non-contact sensor modules and the light receiving angles of the reflected light in a non-contact manipulation device according to another embodiment of the present invention.
  • FIG. 12 is a configuration diagram illustrating a configuration including a non-contact manipulation apparatus according to another modified embodiment of the present invention.
  • FIGS. 13A and 13B illustrate the concept by which a first array unit detects an optical signal for the horizontal component of an object's movement in a non-contact manipulation device according to still another embodiment of the present invention.
  • FIG. 14 is a diagram illustrating the configuration of a mouse to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIG. 15 is a diagram illustrating the configuration of a keyboard to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIG. 16 is a diagram illustrating the configuration of a USB hub to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIG. 17 is a diagram illustrating the configuration of a headset to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIG. 18 is a diagram illustrating the configuration of a monitor to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIG. 19 is a diagram illustrating the configuration of a smart phone to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIG. 20 is a diagram illustrating a configuration in which a non-contact manipulation device according to some embodiments of the present invention is disposed outside an electronic device.
  • FIGS. 21A to 21D illustrate exemplary movements of an object above a mouse to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • FIGS. 22A to 22D illustrate exemplary movements of an object above a keyboard to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • Terms such as first, second, etc. are used herein to describe various members, parts, regions, layers, and/or portions, but it is obvious that these members, parts, regions, layers, and/or portions are not limited by such terms. These terms are used only to distinguish one member, part, region, layer, or portion from another. Thus, a first member, part, region, layer, or portion discussed below could also be termed a second member, part, region, layer, or portion without departing from the teachings of the present invention.
  • FIG. 1 is a block diagram illustrating a configuration including a non-contact manipulation apparatus according to an embodiment of the present invention, and FIG. 2 is a diagram illustrating a non-contact sensor module in the non-contact manipulation apparatus according to an embodiment of the present invention.
  • Referring to FIGS. 1 and 2, the non-contact manipulation apparatus 1000 includes at least one non-contact sensor module 30, each comprising a light emitting unit 10 and a light receiving unit 20, capable of converting an optical signal for an object (for example, S of FIG. 3, 5, 7, or 11) into an electrical signal.
  • the non-contact sensor module 30 includes a light emitting unit 10 and a light receiving unit 20.
  • the light emitter 10 may irradiate the object S with irradiation light.
  • The light emitting unit 10 may be a light emitting device, such as a light emitting diode (LED), capable of emitting irradiation light; in particular, in order to detect the motion and/or position of a human body, the light emitting unit 10 may be an infrared light emitting diode (IRED).
  • The light emitter 10 may be configured to emit irradiation light only when necessary, according to the user's selection. This prevents power from being consumed unnecessarily by continuously radiating light during idle periods when the device is not in use.
  • the light receiving unit 20 may receive the reflected light reflected from the object S by using the light emitting unit 10 as a light source.
  • The light receiver 20 may include a plurality of optical sensors arranged, for example, in horizontal and vertical arrays.
  • the optical sensor may be a light receiving device capable of converting an optical signal for the object S into an electrical signal, and may include, for example, a photodiode.
  • The light receiving unit 20 may include a first array unit 20a composed of a plurality of photodiodes 25 arranged horizontally and a second array unit 20b composed of a plurality of photodiodes 25 arranged vertically.
  • the light receiver 20 may convert an optical signal for the object S into an electrical signal.
  • Here, the optical signal refers, in a broad sense, to any signal related to light and, in a narrow sense, to an optical signal that can be converted into various electrical signals from which voltage, current, power, and the like can be measured.
  • The optical signal for the object S is not, for example, a signal implemented by extracting a plurality of images and performing image processing with respect to the position and/or movement of the object S. Rather, it may mean a signal carrying information on the intensity and/or angle of the reflected light that is received as a result of the static position and/or dynamic movement of the object S.
  • That is, the optical signal for the object S may carry information on the intensity of the received reflected light, for example as shown in FIG. 4 or FIG. 6, and on the light receiving angles of the received reflected light (θ1 and θ2), for example as shown in FIG. 11.
  • The non-contact manipulation apparatus 1000 further includes a micro control unit (MCU) 40 that receives the electrical signal and calculates spatial coordinates for the object S. The microcontrol unit 40 may determine the static position of the object S using the spatial coordinates, determine a gesture motion corresponding to the dynamic movement of the object S, and generate a control signal corresponding to the static position and/or the gesture motion.
  • the aforementioned object S may include a body part of a user who uses the non-contact manipulation apparatus 1000 and / or the electronic device 1120.
  • The object S may also include tools, instruments, apparatuses, and/or devices that the user holds or wears.
  • The object S may be fixed in space for a predetermined time, and its position may also vary in space. When the object S is fixed in space, the non-contact manipulation apparatus 1000 may function as a position sensor that detects the position of the object S; when the position of the object S changes in space, the non-contact manipulation apparatus 1000 may function as a motion sensor or gesture sensor that detects the movement of the object S and determines a gesture motion corresponding to it.
  • The motion of the object S may include at least one of a user's finger movement, palm movement, movement of the back of the hand, hand movement, arm movement, head movement, facial movement, eye movement, eyeball movement, eyelid movement, mouth movement, lip movement, tongue movement, shoulder movement, torso movement, leg movement, toe movement, and foot movement.
  • The movement of the object S may include not only the movements of the user's body mentioned above, but also the movement of a tool, instrument, apparatus, and/or device whose position can be changed by the user holding or wearing it.
  • For example, the movement of the object S may include the movement of a ring worn by the user.
  • the non-contact operation apparatus 1000 may further include a communication unit 50 for transmitting the control signal generated by the microcontrol unit 40 to the electronic device 1120.
  • the communication unit 50 may transmit and receive information data with the electronic device 1120.
  • The electronic device 1120 referred to herein may include not only a device that is connected to a central computing device through a communication network to input data or output processing results, but also a device that can input data or output processing results independently of any central computing device.
  • For example, the electronic device 1120 may include at least one of a computer, a laptop, a tablet PC, a tablet mobile communication device, a smartphone, a mobile phone, a smart pad, a game device, a virtual experience device, a portable multimedia player, an electronic book, a USB hub, a mouse, a keyboard, a monitor, and a headset.
  • The control signal transmitted to the electronic device 1120 by the communication unit 50 may include, for example, a signal capable of controlling at least one piece of hardware constituting the electronic device 1120 or software installed on the electronic device 1120.
  • For example, when the electronic device 1120 is a keyboard, the control signal may include a signal for operating the keyboard or a computer connected to the keyboard, or a signal for controlling software installed on the keyboard or on a computer connected to the keyboard.
  • the communication unit 50 may include at least one of the wireless communication unit 52 and the wired communication unit 54.
  • the wireless communication unit 52 may use, for example, Bluetooth communication or Zigbee communication.
  • However, the communication method applied to the wireless communication unit 52 is not limited to the Bluetooth or Zigbee method; various other communication methods, such as RFID (Radio Frequency Identification), NFC (Near Field Communication), resonant induction, and magnetic induction, may also be applied.
  • the electronic device 1120 may include a module supporting Bluetooth communication or Zigbee communication. If the electronic device 1120 does not have such a module, a separate communication repeater, for example, a dongle 220 may be additionally provided to the electronic device 1120.
  • the dongle 220 may include, for example, a USB dongle that may be connected to a USB port of the electronic device 1120, but is not particularly limited to a form connected to the electronic device 1120.
  • the wired communication unit 54 may be configured to support, for example, a wired cable connecting the non-contact manipulation device 1000 and the electronic device 1120.
  • the wired cable may include, for example, a USB cable or an extension cable, and may connect a connection port portion of the non-contact manipulation apparatus 1000 and a connection port portion of the electronic device 1120.
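  • As a rough illustration only, the sketch below shows how a wired communication unit might forward a gesture-derived control signal over a USB serial link using the pyserial package; the patent specifies no wire protocol, so the port name and the one-line ASCII framing are assumptions.

```python
# Hypothetical wired-communication sketch; the patent defines no protocol,
# port, or message format, so everything below is illustrative.
import serial  # pyserial

def send_control_signal(gesture: str, port: str = "/dev/ttyUSB0") -> None:
    """Forward a gesture-derived control signal to the linked electronic device."""
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        link.write(f"CTRL:{gesture}\n".encode("ascii"))  # assumed framing

send_control_signal("SWIPE_RIGHT")
```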
  • the non-contact manipulation apparatus 1000 may further include an input / output port unit 60 that may exchange various signals, data, and the like with an external device.
  • At least a part of the non-contact manipulation apparatus 1000 may be embedded in the electronic device 1120.
  • Some of the components constituting the non-contact manipulation apparatus 1000 (e.g., the light emitting unit 10, the light receiving unit 20, the microcontrol unit 40, and the communication unit 50) may be embedded in the electronic device 1120, while the remaining components may be disposed so as to be exposed on the surface of the electronic device 1120.
  • For example, the light emitting unit 10 and/or the light receiving unit 20 constituting the non-contact sensor module 30 may be disposed so as to be exposed on the surface of the electronic device 1120, while the microcontrol unit 40 may be embedded inside the electronic device 1120.
  • The components constituting the non-contact manipulation device 1000 may be provided in the electronic device 1120 spaced apart from one another, without forming a single integrated unit, or may be provided in the electronic device 1120 while forming a single integrated unit. The non-contact manipulation apparatus 1000 may also be provided as a separate unit outside the electronic device 1120, spaced apart from it.
  • Hereinafter, the manner in which the non-contact manipulation apparatus 1000 detects an optical signal and calculates spatial coordinates in response to the position and/or movement of the object S will be described.
  • FIGS. 3 and 4 are views illustrating the concept by which the first array unit detects an optical signal for the horizontal component of the object's movement in the non-contact manipulation device according to an embodiment of the present invention; FIGS. 5 and 6 illustrate the concept by which the second array unit detects an optical signal for the vertical component of the object's movement; and FIG. 7 illustrates the concept by which the non-contact sensor module detects an optical signal for the height component of the object's movement.
  • Referring to FIGS. 3 to 7, the light receiving unit 20 receives the reflected light produced by the position and/or movement of the object S, using the light emitting unit 10 as a light source, through a plurality of photosensors (e.g., photodiodes) arranged in horizontal and vertical arrays. For example, the light receiving unit 20 may be composed of a first array unit 20a in which photodiodes 25 are arranged along the horizontal axis (x-axis) and a second array unit 20b in which photodiodes 25 are arranged along the vertical axis (y-axis).
  • In the first array unit 20a, in which the photodiodes 25 are arranged along the horizontal axis (x-axis), the photodiodes 25 may be arranged in the form of an M1 x N1 matrix, where the number of columns N1 may be greater than the number of rows M1. In the second array unit 20b, the photodiodes 25 may be arranged in the form of an M2 x N2 matrix (not shown in FIG. 5), where the number of rows M2 may be greater than the number of columns N2.
  • The first array unit 20a and the second array unit 20b may be arranged separately, spaced apart from each other rather than intermixed.
  • Sensing accuracy is improved by decomposing the motion of an object moving in an arbitrary direction into horizontal and vertical components and recognizing each separately. Because this gain in accuracy is achieved by the array arrangement of existing sensors rather than by improving the resolution of the sensor itself, the manufacturing cost burden of improving sensing accuracy is significantly lowered.
  • In modified embodiments, the first array unit 20a and the second array unit 20b may be disposed adjacent to each other instead of spaced apart, and may even be intermixed.
  • Referring to FIG. 3, the plurality of photodiodes 25 arranged along the horizontal axis can detect an optical signal for the horizontal movement of the object S (movement in a direction parallel to the x-axis). More generally, they can detect an optical signal for the horizontal component (the component parallel to the x-axis) of an arbitrary movement of the object S.
  • The intensity of the optical signal of the reflected light received by the first array unit 20a changes with time, as shown in FIG. 4 for arbitrary photodiodes 25a, 25b, and 25c in the first array unit 20a. That is, as the object S approaches the photodiode 25a, the intensity A of the reflected light received by the photodiode 25a gradually increases; at the position closest to the photodiode 25a the intensity A peaks; and as the object S moves away from the photodiode 25a, the intensity A gradually decreases. Analogous time distributions of the intensity B and the intensity C appear at the photodiodes 25b and 25c, respectively. Since the separation distances between the object S and the photodiodes 25a, 25b, and 25c each change sequentially with time, the distributions of the intensities A, B, and C also change sequentially with time.
  • When the intensities and angles of the optical signals detected by each of the plurality of photodiodes 25 in the array are collected and analyzed, vector information such as the moving distance, speed, acceleration, and start and end points of the horizontal movement of the object S, or of the horizontal component of an arbitrary movement, can be obtained (see the sketch below).
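  • A minimal sketch of this idea (hypothetical; the patent gives no algorithm): if each photodiode's x position is known, the time at which its intensity peaks marks when the object passed it, and a least-squares line through the (peak time, position) pairs yields the direction and speed of the horizontal component.

```python
# Hypothetical sketch: horizontal speed/direction from per-photodiode peak times.
import numpy as np

def horizontal_velocity(x_positions, traces, sample_rate_hz):
    """x_positions: photodiode x coordinates (mm); traces: one intensity
    array per photodiode; returns signed speed in mm/s along the x-axis."""
    peak_times = np.array([np.argmax(t) for t in traces]) / sample_rate_hz
    # Fit x = v * t + x0; the slope v is the horizontal velocity component.
    v, _x0 = np.polyfit(peak_times, np.array(x_positions, dtype=float), 1)
    return v

# Object sweeping left to right past diodes at x = 0, 5, 10 mm:
traces = [np.bartlett(50), np.roll(np.bartlett(50), 10), np.roll(np.bartlett(50), 20)]
print(horizontal_velocity([0.0, 5.0, 10.0], traces, sample_rate_hz=100))  # > 0: rightward
```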
  • Referring to FIG. 5, the plurality of photodiodes 25 arranged along the vertical axis can likewise detect an optical signal for the vertical movement of the object S (movement in a direction parallel to the y-axis), or for the vertical component (the component parallel to the y-axis) of an arbitrary movement of the object S. The intensity of the reflected light received by the second array unit 20b changes with time, as shown in FIG. 6 for arbitrary photodiodes 25d, 25e, 25f, and 25g: as the object S approaches the photodiode 25d, the intensity D gradually increases, peaks at the closest position, and gradually decreases as the object moves away, and analogous distributions of the intensities E, F, and G appear at the photodiodes 25e, 25f, and 25g, respectively. Since the separation distances between the object S and the photodiodes 25d to 25g change sequentially with time, the distributions of the intensities D, E, F, and G also change sequentially with time. When the intensities and angles of these optical signals are collected and analyzed, vector information such as the moving distance, speed, acceleration, and start and end points of the vertical movement of the object S, or of the vertical component of an arbitrary movement, can be obtained.
  • Referring to FIG. 7, the plurality of photodiodes 25 arranged along the horizontal and/or vertical axes can also detect an optical signal for the height movement of the object S (movement in a direction parallel to the z-axis), or for the height component (the component parallel to the z-axis) of an arbitrary movement: as the object S approaches the sensor in the height direction, the strength of the received signal increases. When the intensities and angles of these optical signals are collected and analyzed, vector information such as the moving distance, speed, acceleration, and start and end points of the height movement of the object S, or of the height component of an arbitrary movement, can be obtained.
  • In sum, the optical signal (or the electrical signal converted from it) for an arbitrary movement of the object S in space can be implemented by separating and then integrating: the optical signal for the horizontal component of the movement, detected by the first array unit 20a as described with reference to FIGS. 3 and 4; the optical signal for the vertical component, detected by the second array unit 20b as described with reference to FIGS. 5 and 6; and the optical signal for the height component, detected by the first array unit 20a and the second array unit 20b together as described with reference to FIG. 7. The integrated optical signal can reflect time information and vector information such as the moving distance, speed, acceleration, and start and end points of the movement. Accordingly, the non-contact sensor module 30, comprising the light emitting unit 10 and the light receiving unit 20, can detect the exact position and movement trajectory of the object S in space by measuring the angle and intensity of the light reflected by the object S, without using an image sensor (camera).
  • The microcontrol unit 40 receives the electrical signal converted from the optical signal detected by the non-contact sensor module 30 and runs an algorithm that computes the spatial coordinates of the position and/or movement of the object S. Furthermore, it may determine a gesture motion corresponding to the position and/or movement of the object S using the spatial coordinates, and generate a control signal corresponding to that gesture motion.
  • the microcontrol unit 40 may include a database unit which stores a gesture operation and a control signal corresponding thereto in order to generate the control signal.
  • the microcontrol unit 40 may include a comparison determination unit to determine a gesture operation corresponding to the position and / or movement of the object S using the spatial coordinates.
  • For example, the microcontrol unit 40 may include a database unit in which, when the gesture motion forms a clockwise circle, the corresponding control signal is set to a signal that enables a zoom-in function of hardware constituting the electronic device 1120 or of software installed on it. Such a database unit may also be configured so that the user can set these mappings arbitrarily.
  • Even if the shape that the actual movement of the object S traces in space does not exactly close, for example when its start and end points do not precisely coincide, the microcontrol unit 40 may include a comparison determination unit that can still determine the movement to be the gesture motion forming a clockwise circle. The comparison determination unit may include an algorithm that calculates the error between the movement of the object S and a stored gesture motion and, within a predetermined error range, classifies the movement as the corresponding gesture motion stored in the database unit (see the sketch following this item).
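  • An illustrative sketch of such a comparison determination unit (assumed logic; the patent specifies no matching algorithm): resample the observed trajectory, compute its mean distance to each stored template, and accept the nearest template only within a fixed error tolerance.

```python
# Hypothetical comparison-determination sketch; the templates, normalization,
# and tolerance value are assumptions, not the patent's method.
import numpy as np

def resample(points, n=32):
    """Resample a 2-D trajectory to n points by arc-length interpolation."""
    pts = np.asarray(points, dtype=float)
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    t = np.linspace(0.0, d[-1], n)
    return np.c_[np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])]

def normalize(traj):
    """Center the trajectory and scale it to unit size."""
    traj = traj - traj.mean(axis=0)
    return traj / (np.ptp(traj, axis=0).max() + 1e-9)

def classify(trajectory, templates, tolerance=0.25):
    """Return the closest template gesture within tolerance, else None."""
    obs = normalize(resample(trajectory))
    best_name, best_err = None, np.inf
    for name, tmpl in templates.items():
        err = np.mean(np.linalg.norm(obs - normalize(resample(tmpl)), axis=1))
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err <= tolerance else None

circle = [(np.cos(a), np.sin(a)) for a in np.linspace(0, -2 * np.pi, 40)]
print(classify(circle, {"clockwise_circle": circle}))  # -> clockwise_circle
```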
  • FIG. 8 is a block diagram illustrating a configuration including a non-contact manipulation apparatus according to a modified embodiment of the present invention.
  • Referring to FIG. 8, the light emitting unit 10 and the light receiving unit 20 need not constitute a single non-contact sensor module 30; they may be spaced apart from each other without forming the same module. Because the light emitting unit 10 and the light receiving unit 20 do not constitute a single module, they can be placed freely and independently within the non-contact manipulation apparatus 1000, which increases the freedom of arrangement.
  • In contrast, in FIG. 1, where the light emitting unit 10 and the light receiving unit 20 are configured as one module, the advantage is that the light emitting unit 10 and the light receiving unit 20 can be easily placed and handled within the non-contact manipulation apparatus 1000.
  • The concept of detecting the optical signal and calculating the spatial coordinates in response to the position and/or movement of the object S using the light emitting unit 10 and the light receiving unit 20 is the same as in FIG. 1, and the descriptions of the microcontrol unit 40 and the communication unit 50 are also the same as in FIG. 1; they are therefore omitted here.
  • For example, at time T1, the optical signal intensity of the reflected light received by the photodiodes 25a, 25b, and 25c in the first array unit 20a decreases in the order of optical signal C, optical signal B, and optical signal A. That is, the position of the object S at time T1 corresponds to a point closest to the photodiode 25c and farthest from the photodiode 25a. Therefore, as shown in FIG. 9, the first array unit 20a constrains the position of the object S at time T1 to some point on the plane Px, which is perpendicular to the x-axis and parallel to the y-axis and the z-axis.
  • Similarly, at time T1, the optical signal intensity of the reflected light received by the photodiodes 25d, 25e, 25f, and 25g in the second array unit 20b decreases in the order of optical signal F, optical signal E, optical signal G, and optical signal D. That is, at time T1 the position of the object S is closest to the photodiode 25f, next closest to the photodiode 25e, and farthest from the photodiode 25d. Therefore, as shown in FIG. 9, the second array unit 20b constrains the position of the object S at time T1 to some point on the plane Py, which is perpendicular to the y-axis and parallel to the x-axis and the z-axis.
  • Since the first array unit 20a constrains the position of the object S to a point on the plane Px (perpendicular to the x-axis and parallel to the y-axis and z-axis) and the second array unit 20b constrains it to a point on the plane Py (perpendicular to the y-axis and parallel to the x-axis and z-axis), at time T1 the position of the object S corresponds to a point on the intersection line Lz where the planes Px and Py intersect. In other words, even a single non-contact sensor module 30 equipped with the first array unit 20a and the second array unit 20b can localize the position of the object S to a point on the intersection line Lz (a sketch of this localization follows below).
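  • A sketch of how the (x, y) constraint could be computed in practice (assumed method; the patent only describes ordering by intensity): take the intensity-weighted centroid of each array's readings to estimate the x and y coordinates, which together fix the vertical line Lz.

```python
# Hypothetical sketch: locating the line L_z from the two array intensity profiles.
import numpy as np

def centroid(positions, intensities):
    """Intensity-weighted centroid of one photodiode array (one axis)."""
    w = np.asarray(intensities, dtype=float)
    return float(np.dot(positions, w) / w.sum())

# Example: horizontal array strongest near x = 10 mm, vertical near y = 4 mm.
x0 = centroid([0, 5, 10], [0.2, 0.5, 0.9])          # plane P_x: x = x0
y0 = centroid([0, 2, 4, 6], [0.1, 0.4, 0.8, 0.3])   # plane P_y: y = y0
print(f"L_z is the vertical line x = {x0:.1f} mm, y = {y0:.1f} mm")
```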
  • As the object S moves away from the non-contact sensor module 30 in the height direction (z-axis direction), the absolute value of the optical signal intensity shown in FIGS. 4 and 6 becomes relatively small, and as it approaches, the absolute value becomes relatively large; however, it is not easy to determine the absolute value of the height component (z component) of the spatial coordinates from intensity alone. In another embodiment of the present invention, the non-contact manipulation apparatus 1000 therefore includes a plurality of non-contact sensor modules to solve this problem.
  • FIG. 10 is a configuration diagram illustrating a configuration including a non-contact manipulation apparatus according to another embodiment of the present invention.
  • FIG. 11 is a diagram illustrating a concept of calculating a height component among spatial coordinates of an object by using a distance between two non-contact sensor modules and a light receiving angle of reflected light in a non-contact manipulation device according to another exemplary embodiment of the present invention.
  • At least two non-contact sensor modules 30 include a first non-contact sensor module 30-1 and a second non-contact sensor module 30-2 spaced apart from each other.
  • Each of the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 is the same as the non-contact sensor module 30 described above. Referring to FIG. 11, the height component H of the spatial coordinates of the object S can be calculated by trigonometry, as in Equation (1), from the separation distance D between the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 and the light receiving angles of the reflected light received by the two modules (see the sketch below).
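  • Equation (1) itself is not reproduced in this extract. A standard two-angle triangulation consistent with FIG. 11 would be H = D / (1/tan θ1 + 1/tan θ2), with θ1 and θ2 measured from the baseline joining the two modules; the sketch below implements that assumed form.

```python
# Hypothetical triangulation sketch; the exact form of Equation (1) is assumed.
import math

def height_from_angles(d_mm: float, theta1_deg: float, theta2_deg: float) -> float:
    """Height H of the object above the baseline of the two sensor modules.

    d_mm: separation distance D between module 30-1 and module 30-2.
    theta1_deg, theta2_deg: light receiving angles measured from the baseline.
    With x1 + x2 = D and H = x1*tan(t1) = x2*tan(t2), solving gives
    H = D / (1/tan(t1) + 1/tan(t2)).
    """
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    return d_mm / (1.0 / math.tan(t1) + 1.0 / math.tan(t2))

print(height_from_angles(100.0, 45.0, 45.0))  # symmetric case: H = D/2 = 50.0
```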
  • Because a plurality of non-contact sensor modules 30 are arranged, exact position coordinates can be calculated regardless of the size of the object S. FIGS. 10 and 11 illustrate, by way of example, the case of two non-contact sensor modules; by arranging a larger number of non-contact sensor modules 30 in horizontal and/or vertical arrays, even more accurate position coordinates of the object S can be calculated.
  • The light receiving angles of the received reflected light can be calculated by comprehensively analyzing the intensity distribution of the reflected light received by the plurality of photodiodes 25 arranged along the horizontal axis in the first array unit 20a and by the plurality of photodiodes 25 arranged along the vertical axis in the second array unit 20b.
  • slit portions (not shown) for selectively blocking and passing the reflected light may be disposed on the first array portion 20a and the second array portion 20b.
  • the slit portion may be composed of a plurality of bars arranged side by side spaced apart from each other.
  • the slit portion direction on the first array portion 20a and the slit portion direction on the second array portion 20b may be directions crossing each other.
  • In this way, the spatial coordinates of the object S may be calculated.
  • The microcontrol unit 40 receives the optical information of the reflected light detected by the plurality of non-contact sensor modules 30 and the information on the distance D between the modules, and is equipped with an algorithm that outputs accurate coordinate data for the spatial coordinates of the object S.
  • The microcontrol unit 40 may determine a gesture motion for the movement of the object S using the calculated spatial coordinates indicating the position of the object S, and generate a control signal corresponding to the gesture motion. To this end, it may optionally further include a database unit that stores gesture motions and their corresponding control signals, and a comparison determination unit for determining the gesture motion for the movement of the object S.
  • Even if the shape traced in space by the horizontal, vertical, and height components of the actual movement of the object S is inexact, for example when its start and end points do not precisely coincide, the comparison determination unit may still determine the actual movement of the object S to be the gesture motion forming a clockwise circle. The comparison determination unit may include an algorithm that calculates the error between the actual motion of the object S and a stored gesture motion and, within a predetermined error range, classifies the motion as the corresponding gesture motion stored in the database unit.
  • FIG. 12 is a configuration diagram illustrating a configuration including a non-contact manipulation apparatus according to another modified embodiment of the present invention.
  • Referring to FIG. 12, the light emitting unit 10 and the light receiving units 20 need not constitute a single non-contact sensor module; they may be spaced apart from each other without forming the same module. Because they do not constitute a single module, they can be placed freely and independently within the non-contact manipulation apparatus 1000, which increases the freedom of arrangement.
  • at least two light receiving units 21 include a first light receiving unit 20-1 and a second light receiving unit 20-2.
  • each of the first light receiving unit 20-1 and the second light receiving unit 20-2 is the same as the light receiving unit 20 described above.
  • In this configuration as well, the height component H of the spatial coordinates of the object S can be calculated from the separation distance D between the first light receiving unit 20-1 and the second light receiving unit 20-2 and the light receiving angles (θ1, θ2, in degrees) of the reflected light R1 and R2 received by the first light receiving unit 20-1 and the second light receiving unit 20-2, respectively.
  • The concept of detecting an optical signal and calculating spatial coordinates according to the position and/or movement of the object S using the light emitting unit 10 and the plurality of light receiving units 21 is the same as in FIG. 10, and the descriptions of the microcontrol unit 40 and the communication unit 50 are also the same as in FIG. 10; they are therefore omitted here.
  • FIGS. 13A and 13B illustrate the concept by which a first array unit detects an optical signal for the horizontal component of an object's movement in a non-contact manipulation device according to still another embodiment of the present invention. FIG. 13A corresponds to a modified embodiment of the first array unit 20a disclosed in FIG. 3, and FIG. 13B is a cross-sectional view taken along the line Q-Q of FIG. 13A.
  • Referring to FIGS. 13A and 13B, the first photodiode PD1 lies beneath first partition walls W1 of height H, which form a plurality of first slits S1 arranged side by side along a first direction, so that the amount of light passing through varies with at least the angle of incidence. The first photodiode PD1 can separately detect the amount of light in a first region, biased to one side of the light passing between the first slits S1, and the amount of light in a second region, biased to the other side.
  • To this end, the first photodiode PD1 may include a first eccentric array PD1a and a second eccentric array PD1b. The first eccentric array PD1a shown in FIGS. 13A and 13B may correspond to the photodiode 25a shown in FIG. 3, and the second eccentric array PD1b shown in FIGS. 13A and 13B may correspond to the photodiode 25b shown in FIG. 3.
  • The first eccentric array PD1a is installed below the first partition walls W1, biased to one side of the center line of each of the first slits S1, and can output signals of different intensity according to the amount of light it receives. The second eccentric array PD1b is likewise disposed below the first partition walls W1, biased to the other side of the center line of each of the first slits S1, and can output signals of different intensity according to the amount of light it receives.
  • For example, the first array unit 20a, in which the photodiodes 25 are arrayed along the horizontal axis (x-axis), may include first partition walls W1 extending along the vertical axis (y-axis). Accordingly, the microcontrol unit 40 may determine the X-axis angle of the object S from the difference between the relative electrical signal outputs of the first eccentric array PD1a and the second eccentric array PD1b.
  • Likewise, in the second array unit 20b, in which the photodiodes 25 are arranged along the vertical axis (y-axis), partition walls extending along the horizontal axis (x-axis) are provided with the first eccentric array PD1a and the second eccentric array PD1b described above arranged beneath them, so that the microcontrol unit 40 may determine the Y-axis angle of the object S from the difference between their relative electrical signal outputs.
  • The angle of the object S can finally be calculated by combining the X-axis angle and the Y-axis angle determined in this way (see the sketch below).
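  • A sketch of the differential readout (assumed relation; the patent does not give the mapping from output difference to angle): with partition walls of height H and an eccentric-pair pitch p, the normalized difference of the two eccentric outputs scales roughly with the tangent of the incidence angle.

```python
# Hypothetical eccentric-array angle sketch; the difference-to-angle mapping
# below is an illustrative assumption, not the patent's formula.
import math

def incidence_angle_deg(i_a: float, i_b: float, wall_height: float, pitch: float) -> float:
    """Estimate the incidence angle from one eccentric pair.

    i_a, i_b: photocurrents of the two eccentric arrays (PD1a, PD1b).
    Assumed model: the shadow cast by the partition wall shifts the light
    spot by wall_height * tan(angle), so the normalized difference
    (i_a - i_b) / (i_a + i_b) scales with tan(angle) up to the pitch.
    """
    diff = (i_a - i_b) / (i_a + i_b + 1e-12)
    return math.degrees(math.atan(diff * pitch / wall_height))

print(incidence_angle_deg(0.8, 0.8, wall_height=1.0, pitch=1.0))  # 0 deg: overhead
print(incidence_angle_deg(1.0, 0.4, wall_height=1.0, pitch=1.0))  # tilted toward PD1a
```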
  • As described above, the non-contact manipulation apparatus 1000 can improve on the problems of the conventional image sensor (camera) method and provide new and various interfaces to the user through a motion recognition algorithm that operates at low power and low cost.
  • The conventional non-contact gesture recognition method is implemented with a camera module.
  • the camera module method is a method of capturing a user's movement with a CMOS camera and recognizing an image captured by a sensor integrated circuit (IC).
  • In that method, the time taken to transfer the image captured by the CMOS camera to the sensor integrated circuit and then recognize and detect the motion is relatively long, and the recognition sensitivity is considerably lowered.
  • Moreover, since the camera module must be kept running to recognize motion, a great deal of power is consumed, and because a camera module is used, the manufacturing cost is also very high. In contrast, in the embodiments of the present invention, motion recognition is performed using an infrared light emitting diode (IRED), photodiodes, and a microcontrol unit, so these problems can be solved through low power consumption, a fast motion recognition algorithm, low cost, and simple motion recognition processing.
  • FIG. 14 is a diagram illustrating a configuration of a mouse to which a non-contact manipulation device is applied according to some embodiments of the present disclosure.
  • the non-contact manipulation apparatus 1000 described above may be embedded in the mouse 1120.
  • The plurality of non-contact sensor modules 30, including the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 spaced apart from each other, may be disposed on the surface of the mouse 1120.
  • the control signal transmitted to the computing device connected to the mouse 1120 may include a signal corresponding to the right direction key of the keyboard.
  • 15 is a diagram illustrating a configuration of a keyboard to which a non-contact manipulation device is applied according to some embodiments of the present disclosure.
  • the non-contact manipulation apparatus 1000 described above may be provided in a form of being attached to the surface of the keyboard 1120 or inserted into the inside thereof.
  • the plurality of non-contact sensor modules 30 including the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 spaced apart from each other may be disposed on the surface of the keyboard 1120.
  • The non-contact sensor modules 30 may be arranged in the frame 124a surrounding the basic input keys 124b of the keyboard.
  • Because the keyboard 1120 interlocked with the non-contact manipulation apparatus 1000 detects the user's motion without any touch of a body part such as a finger, it can easily control a computer device such as a PC and related software, and can therefore be applied in various fields.
  • 16 is a diagram illustrating a configuration of a USB hub to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • Referring to FIG. 16, the non-contact manipulation apparatus 1000 described above may be provided attached to, or inserted into, the surface of the USB hub 1120 having the multi USB port 122a.
  • the plurality of non-contact sensor modules 30 including the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 spaced apart from each other may be disposed on the surface 122b of the USB hub 1120.
  • Because the USB hub 1120 interlocked with the non-contact manipulation device 1000 according to an embodiment of the present invention detects the user's motion without any touch of a body part such as a finger, it can easily control a computer device such as a PC and related software, and can therefore be applied in various fields.
  • FIG. 17 is a diagram illustrating a configuration of a headset to which a non-contact manipulation device is applied according to some embodiments of the present invention.
  • Referring to FIG. 17, the headset 1120 includes two speaker units 126a, a microphone unit 126b, a headband unit 126c connecting the two speaker units 126a, and a housing unit 126d formed on the outside of each speaker unit 126a.
  • The plurality of non-contact sensor modules 30, including the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 spaced apart from each other, and/or the microcontrol unit 40 may be attached to, or embedded inside, any of the speaker units 126a, the microphone unit 126b, the headband unit 126c, and the housing unit 126d.
  • The gesture recognition headset 1120 described above is a non-contact gesture headset using a light emitting unit and a light receiving unit, and can easily control a mobile device by recognizing the user's motion without any touch of a finger.
  • FIG. 18 is a diagram illustrating a configuration of a monitor to which a non-contact manipulation device is applied according to some embodiments of the present disclosure.
  • the non-contact manipulation apparatus 1000 described above may be provided in a form of being attached to the surface of the monitor 1120 or inserted into the inside thereof.
  • the plurality of non-contact sensor modules 30 including the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 spaced apart from each other may be disposed on the surface of the monitor 1120.
  • The non-contact sensor modules 30 of the monitor 1120 interlocked with the non-contact manipulation device 1000 may be disposed in at least one of the monitor pedestal 125a, the monitor pillar 125b, the monitor frame 125c, and the monitor screen 125d. Because the monitor 1120 interlocked with the non-contact manipulation apparatus 1000 detects the user's motion without any touch of a body part such as a finger, it can easily control a computer device such as a PC and related software, and can therefore be applied in various fields.
  • FIG. 19 is a diagram illustrating a configuration of a smart phone to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • At least a part of the non-contact manipulation apparatus 1000 described above may be provided to be attached to the surface of the smart phone 1120 or inserted into the inside.
  • In this case, the first non-contact sensor module 30-1 and the second non-contact sensor module 30-2 constituting a part of the non-contact manipulation device 1000 may be spaced apart from each other by a predetermined distance.
  • FIG. 20 is a diagram illustrating a configuration in which a non-contact manipulation device according to some embodiments of the present disclosure is disposed and provided outside of an electronic device.
  • the non-contact manipulation apparatus 1000 described above may be provided as a separate device from the electronic device 1120.
  • The control signal generated by the microcontrol unit 40 of the non-contact manipulation apparatus 1000 may be transmitted to the electronic device 1120 by wire through the wired cables 127a and 127b, or wirelessly through the wireless communication unit 52 disposed in the non-contact manipulation apparatus 1000.
  • In this case, the electronic device 1120 may include at least one of a computer, a laptop, a tablet PC, a tablet mobile communication device, a smartphone, a mobile phone, a smart pad, a game device, a virtual experience device, a portable multimedia player, and an electronic book.
  • FIGS. 21A to 21D illustrate exemplary movements of an object above a mouse to which a non-contact manipulation device according to some embodiments of the present invention is applied.
  • Referring to FIG. 21A, when the user's hand S moves from left to right above the mouse 1120 interlocked with the non-contact manipulation apparatus 1000 without touching it, the gesture motion determined by the microcontrol unit 40 of the non-contact manipulation apparatus 1000 may be classified as a linear movement toward the right. Accordingly, the control signal transmitted to a computing device such as a computer connected to the mouse 1120 may be a signal corresponding to the right direction key of a keyboard.
  • likewise, when the user's hand S moves upward above the mouse 1120 without contacting it, the gesture motion determined by the microcontrol unit 40 of the non-contact manipulation device 1000 may be classified as a linear motion directed upward; accordingly, the control signal transmitted to a computer device such as a computer connected to the mouse 1120 may be a signal corresponding to the up direction key.
  • when the user's hand S moves downward, the gesture motion determined by the microcontrol unit 40 of the non-contact manipulation device 1000 may be classified as a linear motion in the direction of gravity; accordingly, the control signal transmitted to a computer device such as a computer connected to the mouse 1120 may be a signal corresponding to a press of the Enter key on the keyboard or a click of a mouse button.
  • similarly, when the user's hand S moves toward the user, the control signal transmitted to the computing device such as a computer connected to the mouse 1120 may be a signal corresponding to pulling a joystick toward the user.
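  • as a loose illustration of the mapping just described, the sketch below shows how an MCU firmware routine might translate a classified gesture motion into a control signal; the enum names and the signal_for_gesture() helper are hypothetical, not taken from the patent.

```c
/* A minimal sketch, not from the patent text, of mapping the gesture
 * motion classified by the microcontrol unit 40 to a control signal.
 * All identifiers here are illustrative assumptions. */
typedef enum {
    GESTURE_NONE,
    GESTURE_LINEAR_RIGHT,    /* hand S moves from left to right   */
    GESTURE_LINEAR_UP,       /* hand S moves upward               */
    GESTURE_LINEAR_DOWN,     /* hand S moves toward gravity       */
    GESTURE_PULL_TOWARD_USER /* hand S moves toward the user      */
} gesture_motion_t;

/* Hypothetical codes understood by the connected computing device. */
typedef enum {
    SIG_NONE,
    SIG_RIGHT_ARROW_KEY,
    SIG_UP_ARROW_KEY,
    SIG_ENTER_OR_CLICK,   /* Enter key press or mouse-button click */
    SIG_JOYSTICK_PULL     /* joystick pulled toward the user       */
} control_signal_t;

control_signal_t signal_for_gesture(gesture_motion_t g)
{
    switch (g) {
    case GESTURE_LINEAR_RIGHT:     return SIG_RIGHT_ARROW_KEY;
    case GESTURE_LINEAR_UP:        return SIG_UP_ARROW_KEY;
    case GESTURE_LINEAR_DOWN:      return SIG_ENTER_OR_CLICK;
    case GESTURE_PULL_TOWARD_USER: return SIG_JOYSTICK_PULL;
    default:                       return SIG_NONE;
    }
}
```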
  • FIGS. 22A-22D illustrate exemplary movements of an object above a keyboard to which a non-contact manipulation device is applied according to some embodiments of the present disclosure.
  • when the user's hand S moves from left to right above the keyboard 1120 interlocked with the non-contact manipulation apparatus 1000, the control signal transmitted to a computing device such as a computer connected to the keyboard 1120 may be a signal corresponding to the right direction key of the keyboard 1120.
  • when the user's hand S, spaced apart from the keyboard 1120 interlocked with the non-contact manipulation apparatus 1000 described above and not touching it, moves upward, the gesture motion determined by the microcontrol unit 40 of the non-contact manipulation apparatus 1000 may be classified as a linear motion toward the upper side; accordingly, the control signal transmitted to a computing device such as a computer connected to the keyboard 1120 may be a signal corresponding to the up direction key of the keyboard 1120.
  • when the user's hand S, spaced apart from the keyboard 1120 interlocked with the non-contact manipulation apparatus 1000 and not touching it, moves downward from a high point in the height direction, the gesture motion determined by the microcontrol unit 40 of the non-contact manipulation apparatus 1000 may be classified as a linear motion in the direction of gravity; accordingly, the control signal transmitted to a computing device such as a computer connected to the keyboard 1120 may be a signal corresponding to a press of the Enter key of the keyboard 1120 or a click of a mouse button.
  • when the user's hand S, spaced apart from the keyboard 1120 interlocked with the non-contact manipulation apparatus 1000 described above and not contacting it, moves toward the user, the control signal transmitted to a computing device such as a computer connected to the keyboard 1120 may be a signal corresponding to pulling a joystick toward the user.
  • the keyboard 1120 interlocked with the non-contact manipulation apparatus 1000 may include a frame 124a, basic input keys 124b, special function keys 124c, a switch key 124d, and a display window 124e.
  • it may be necessary to clearly distinguish a first state, in which the basic input keys 124b are available, from a second state, in which the position and/or movement of the object S can be determined by the non-contact manipulation apparatus 1000.
  • the first state refers to a general keyboard-use state employing ordinary basic input keys 124b such as character keys, numeric keys, the Enter key, and the space key.
  • the second state refers to a state in which the position and/or movement of the object S can be grasped by the non-contact manipulation apparatus 1000, as shown in FIGS. 22A to 22D.
  • in the second state, the non-contact manipulation apparatus 1000 is activated so that the movement of the user's palm is grasped, the gesture motion corresponding to that movement is determined, and the corresponding control signal is generated.
  • a switch key 124d for selecting either the first state, in which the basic input keys 124b of the keyboard 1120 can be used, or the second state, in which the position or movement of the object S can be determined by the non-contact sensor module 30 and the microcontrol unit 40, may be provided separately from the existing basic input keys 124b.
  • the switch key 124d may be understood as a toggle key capable of toggling between the first state and the second state. That is, as the toggle key 124d is pressed repeatedly, a first step, in which the first state is activated and the second state is deactivated, and a second step, in which the first state is deactivated and the second state is activated, may be implemented in sequence.
  • although the toggle key 124d described above is configured to prevent the first state, in which the basic input keys 124b of the keyboard 1120 can be used, and the second state, in which the position or movement of the object S can be determined by the non-contact sensor module 30 and the microcontrol unit 40, from being activated at the same time, a modified embodiment of the present invention may include the case where the first state and the second state are activated simultaneously. For example, in the course of playing a simulation game, a step of pressing the Enter key, a basic input key of the keyboard, to shoot, and a step of grasping the position and movement of the object S by the non-contact manipulation apparatus 1000 in place of a joystick may need to be performed simultaneously, as in the sketch that follows.
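  • a minimal sketch, under assumed geometry, of how such simultaneous operation might be arranged: the sensing area is split so that only coordinates from one side feed the gesture classifier, leaving the other hand free to press keys. X_SPLIT, coord_t, and in_gesture_zone() are hypothetical names, not from the patent.

```c
/* Hypothetical zone split for simultaneous key input and gesture
 * sensing: coordinates left of the boundary (the typing hand) are
 * ignored by the gesture classifier. */
#include <stdbool.h>

#define X_SPLIT 0.5f  /* assumed normalized boundary between zones */

typedef struct { float x, y; } coord_t;  /* spatial coordinates of object S */

bool in_gesture_zone(coord_t c)
{
    /* only the right-hand zone feeds gesture classification */
    return c.x > X_SPLIT;
}
```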
  • to this end, so that the non-contact sensor module 30 can distinguish the user's left hand, which presses the Enter key, from the user's right hand, which expresses the movement of the object S (for example, moving toward the right), the range of the sensing area and the algorithm of the microcontrol unit 40 can be finely adjusted. Accordingly, in a modified embodiment of the present invention, as the toggle key 124d is pressed repeatedly, a first step in which the first state is activated and the second state is deactivated, a second step in which the first state is deactivated and the second state is activated, and a third step in which the first state and the second state are activated simultaneously may be implemented in sequence. Of course, the order of these steps can be changed arbitrarily.
  • in the configuration above, each step is implemented in sequence by repeatedly pressing a single toggle key 124d, but in another modified embodiment of the present invention, two switch keys 124d may be provided: a first switch key for sequentially activating and deactivating the first state, and a second switch key for sequentially activating and deactivating the second state. Accordingly, the first to third steps may be implemented by combinations of the first switch key and the second switch key; a sketch of the single-key variant follows.
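  • the following is a minimal sketch, not from the patent, of the three-step cycle driven by a single toggle key; the step names are illustrative assumptions.

```c
/* Hypothetical firmware state machine for the three-step toggle:
 * each press of the toggle key 124d advances the keyboard cyclically
 * through keys-only, gesture-only, and both-active steps. */
typedef enum {
    STEP_KEYS_ONLY,     /* first state active, second state deactivated  */
    STEP_GESTURE_ONLY,  /* first state deactivated, second state active  */
    STEP_BOTH_ACTIVE    /* first and second states active simultaneously */
} keyboard_step_t;

static keyboard_step_t current_step = STEP_KEYS_ONLY;

void on_toggle_key_pressed(void)
{
    /* advance cyclically; the patent notes the order may be changed */
    current_step = (keyboard_step_t)((current_step + 1) % 3);
}
```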
  • other configurations are also possible for imparting a switching function between the first state, in which the basic input keys 124b of the keyboard 1120 are used, and the second state, in which the position or movement of the object S can be detected by the non-contact sensor module 30 and the microcontrol unit 40.
  • the switching function may be given to any one of the special function keys 124c.
  • a separate switching device may be provided on the frame 124a of the keyboard 1120.
  • the switching function may be configured by a specific gesture operation corresponding to a specific movement of the object S.
  • information about the activated state may be displayed on the display window 124e so that the user of the keyboard 1120 can easily determine whether the first state and/or the second state is activated. That is, the display window 124e may be provided for checking at least one state selected from the first state, in which the basic input keys 124b of the keyboard 1120 can be used, and the second state, in which the position or movement of the object S can be detected by the non-contact manipulation apparatus 1000. For example, when the display window 124e indicates that the first state is activated, the user can press the above-described toggle key 124d to activate the non-contact manipulation apparatus 1000 so that the position and/or movement of the object S can be grasped. Conversely, when the display window 124e indicates that the second state is activated, the user can press the toggle key 124d to deactivate the non-contact manipulation apparatus 1000 so that the ordinary basic input keys 124b become available.
  • Existing general-purpose keyboards are equipped with a trackball, a touch pad, or a pointing stick to implement various user interfaces.
  • an interface using a trackball is inexpensive, but its sensitivity is often poor and it tends to be bulky; a relatively high cost is required to obtain sensitivity above a certain level of performance.
  • an interface using a touch pad is vulnerable to heat and prone to malfunction, has trouble placing the pointer accurately, and also makes it difficult to position the pointer when drawing a picture.
  • an interface using a pointing stick relies on a pressure sensor and moves the mouse cursor on the screen at a speed proportional to the pressure the user applies; relatively fine adjustment is possible, but it cannot easily cope with the variety of the user's movements. This proportional behavior is illustrated in the sketch below.
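  • a minimal sketch of the pressure-proportional cursor movement described above; the gain constant k is an assumed value for illustration, not from the patent.

```c
/* Pointing-stick model: cursor speed grows linearly with applied
 * pressure. The gain k is hypothetical. */
float cursor_speed(float pressure)
{
    const float k = 25.0f;  /* assumed pixels per second per unit of force */
    return k * pressure;
}
```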
  • existing input methods such as the trackball, the touch pad, and the pointing stick therefore have many shortcomings as replacements for the mouse in terms of fine adjustment and quick response.
  • the embodiments of the present invention can overcome these disadvantages by providing a keyboard interlocked with the non-contact manipulation apparatus 1000 described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

In order to provide a non-contact actuation device having low power consumption, a fast motion-recognition algorithm, a low manufacturing cost, and a simple motion-recognition processing step, the present invention relates to a non-contact actuation device comprising: one or more light emitting units for emitting light; one or more light receiving units comprising a plurality of light sensors arranged in horizontal and vertical arrays, for receiving light that originates from the light emitting unit as a light source and is reflected by the movement of an object, and for converting into an electrical signal an optical signal relating to the movement of the object that contains information on the intensity and angle of the received light; and a microcontroller (MCU) for receiving the electrical signal as input, computing the spatial coordinates of the object, determining a gesture motion relating to the movement of the object by using the spatial coordinates, and generating a control signal corresponding to the gesture motion.
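As a loose illustration of the coordinate computation the abstract attributes to the MCU, the sketch below estimates the object's position as an intensity-weighted centroid over an assumed N x N photosensor array. The patent does not specify this particular method; the array size and function name are hypothetical.

```c
/* A minimal sketch, assuming an N x N photosensor array, of estimating
 * the object's (x, y) position as the intensity-weighted centroid of
 * the received light. Illustrative only. */
#include <stddef.h>

#define N 8  /* assumed array dimension */

/* intensity[r][c] holds the signal of the sensor at row r, column c */
void estimate_centroid(const float intensity[N][N], float *x, float *y)
{
    float sum = 0.0f, sx = 0.0f, sy = 0.0f;
    for (size_t r = 0; r < N; r++) {
        for (size_t c = 0; c < N; c++) {
            sum += intensity[r][c];
            sx  += intensity[r][c] * (float)c;
            sy  += intensity[r][c] * (float)r;
        }
    }
    if (sum > 0.0f) { *x = sx / sum; *y = sy / sum; }
}
```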
PCT/KR2014/003086 2013-04-09 2014-04-09 Non-contact actuation device and electronic device linked thereto WO2014168416A1 (fr)

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
KR1020130038417A KR101524050B1 (ko) 2013-04-09 2013-04-09 Non-contact gesture manipulation device
KR10-2013-0038416 2013-04-09
KR1020130038416A KR101415931B1 (ko) 2013-04-09 2013-04-09 Motion recognition headset
KR10-2013-0038417 2013-04-09
KR10-2013-0074876 2013-06-27
KR1020130074877A KR101469186B1 (ko) 2013-06-27 2013-06-27 Mouse having a non-contact gesture manipulation device
KR10-2013-0074877 2013-06-27
KR1020130074876A KR101469129B1 (ko) 2013-06-27 2013-06-27 USB hub having a non-contact gesture manipulation device
KR10-2013-0074878 2013-06-27
KR1020130074878A KR101471816B1 (ko) 2013-06-27 2013-06-27 Non-contact gesture manipulation device
KR10-2013-0082215 2013-07-12
KR1020130082215A KR101504148B1 (ko) 2013-07-12 2013-07-12 Non-contact manipulation device
KR1020130091493A KR101460028B1 (ko) 2013-08-01 2013-08-01 Keyboard equipped with a non-contact manipulation device
KR10-2013-0091493 2013-08-01

Publications (1)

Publication Number Publication Date
WO2014168416A1 true WO2014168416A1 (fr) 2014-10-16

Family

ID=51689763

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/003086 WO2014168416A1 (fr) 2014-10-16 Non-contact actuation device and electronic device linked thereto

Country Status (1)

Country Link
WO (1) WO2014168416A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113946219A (zh) * 2021-10-25 2022-01-18 陈奕名 Control method and apparatus for a smart device, interactive device, and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100703199B1 (ko) * 2005-02-17 2007-04-05 (주)모비솔 Keyboard and joystick with a built-in integrated pointing device
KR20100012422A (ko) * 2008-07-29 2010-02-08 비에스엔텍(주) Optical pointing device and operating method
KR20110028922A (ko) * 2009-09-14 2011-03-22 마루엘에스아이 주식회사 Proximity sensor
KR20110045330A (ko) * 2009-10-26 2011-05-04 엘지전자 주식회사 Mobile terminal

Similar Documents

Publication Publication Date Title
WO2013183938A1 (fr) Procédé et appareil d'interface utilisateur basés sur une reconnaissance d'emplacement spatial
WO2016028097A1 (fr) Dispositif pouvant être porté
WO2015084111A1 (fr) Dispositif de traitement d'entrée d'utilisateur utilisant un nombre limité de capteurs de champ magnétique
US20140132512A1 (en) Controlling a graphical user interface
WO2015199304A1 (fr) Terminal mobile et son procédé de commande
WO2017119745A1 (fr) Dispositif électronique et procédé de commande associé
WO2014073926A1 (fr) Dispositif de télécommande, dispositif d'affichage, et son procédé de commande
WO2016182181A1 (fr) Dispositif portable et procédé permettant de fournir une rétroaction d'un dispositif portable
WO2015126197A1 (fr) Appareil et procédé de commande à distance par toucher virtuel mis en œuvre sur un appareil photo
WO2020050636A1 (fr) Procédé et appareil de reconnaissance de gestes basée sur l'intention de l'utilisateur
WO2012111862A1 (fr) Dispositif d'entrée d'informations et procédé pour effectuer une commutation automatique entre un mode d'entrée d'informations en utilisant un panneau tactile et un mode d'entrée d'informations en utilisant un signal ultrasonore
WO2017126741A1 (fr) Visiocasque et procédé de commande de celui-ci
WO2013027983A2 (fr) Appareil de commande de dispositif électronique et procédé de commande de ce dispositif
WO2020242087A1 (fr) Dispositif électronique et procédé de correction de données biométriques sur la base de la distance entre le dispositif électronique et l'utilisateur, mesurée à l'aide d'au moins un capteur
WO2013154268A1 (fr) Procédé et appareil de reconnaissance d'une entrée de touche d'un clavier d'instructeur
KR20150145729A (ko) Method for moving a screen and selecting a service through fingerprint input, wearable electronic device having a fingerprint sensor, and computer program
WO2014168416A1 (fr) Non-contact actuation device and electronic device linked thereto
WO2018194227A1 (fr) Dispositif de reconnaissance tactile tridimensionnel utilisant un apprentissage profond et procédé de reconnaissance tactile tridimensionnel utilisant ledit dispositif
KR101258969B1 (ko) Input device using a non-contact user interface
KR101460028B1 (ko) Keyboard equipped with a non-contact manipulation device
WO2016122153A1 (fr) Appareil d'affichage et son procédé de commande
WO2013172560A1 (fr) Dispositif d'entrée de direction et procédé de fonctionnement d'une interface utilisateur utilisant celui-ci
WO2022080549A1 (fr) Dispositif de suivi de déplacement de structure de capteur lidar double
WO2020171607A1 (fr) Circuit tactile pour empêcher un toucher erroné dû à un changement de température, dispositif électronique comprenant le circuit tactile et son procédé de fonctionnement
KR101504148B1 (ko) Non-contact manipulation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14782537

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14782537

Country of ref document: EP

Kind code of ref document: A1