WO2014048251A1 - Touch recognition device and recognition method - Google Patents


Info

Publication number
WO2014048251A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
data
dimensional
gesture
processing unit
Prior art date
Application number
PCT/CN2013/083363
Other languages
English (en)
French (fr)
Inventor
刘新斌
张海兵
Original Assignee
北京汇冠新技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京汇冠新技术股份有限公司 filed Critical 北京汇冠新技术股份有限公司
Publication of WO2014048251A1 publication Critical patent/WO2014048251A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Definitions

  • the present invention relates to the field of touch technology, and more particularly to an apparatus and method for performing touch recognition using a multi-layer touch detection device. Background Art
  • Touch screens are used ever more widely and are an indispensable device for human-computer interaction. Current touch technology only parses and applies flat touch actions, such as click, double click and slide. However, in the prior art the display screen can provide three-dimensional images in addition to flat images and menus, and operating on a three-dimensional image currently requires a series of selections with the mouse before the operation can be performed, which is complicated and inefficient. With the further development of display technology, televisions, computers and the like that use the parallax principle for stereoscopic display have appeared, many of them used for product display, so a touch technology capable of operating on stereoscopic images is needed.
  • in the prior art (see Chinese patent document CN102156575A), detection and application of touch depth has been proposed for stereoscopic display technology, but the application is simple, merely adding touch depth as a single variable.
  • as shown in FIG. 1, the touch recognition device includes four layers: touch layer 1, touch layer 2, touch layer 3 and touch layer 4. Each touch layer has a longitudinal coordinate reflecting the touch depth.
  • another Chinese patent, CN101403952B, discloses a technique for sensing touch force using multiple touch layers, but the technologies disclosed above apply three-dimensional touch in a very simple way and cannot satisfy the needs of a true three-dimensional touch operation. Summary of the Invention
  • the technical problem to be solved by the present invention is to provide a touch recognition device and a recognition method capable of recognizing a stereoscopic touch gesture.
  • a touch recognition device includes a signal processing unit and at least two touch signal detecting devices whose recognition layers are not in the same plane, wherein the touch signal detecting devices are each connected to the signal processing unit and send their detection data to it, and the signal processing unit processes the received detection data to obtain gesture data representing a stereoscopic touch gesture.
  • the signal processing unit includes a touch trajectory integration unit and touch recognition units correspondingly connected to the touch signal detecting devices; the touch recognition units are connected to the touch trajectory integration unit and send the recognized touch point data to it, and the touch trajectory integration unit synthesizes the gesture data from the touch point data sent by each touch recognition unit.
  • the gesture data includes trajectory data in a direction that is not parallel to the display plane.
  • at least one of the touch signal detecting devices of the touch recognition device comprises an infrared emitter-receiver pair array.
  • at least one of the touch signal detecting devices of the touch recognition device comprises an optical sensing device.
  • the optical sensing device is a camera.
  • the motion data includes plane trajectory data and direction data obtained by mapping the touch points of each layer onto the same plane.
  • the signal processing unit generates three-dimensional bone line data of the touch object from the detection data of each layer's touch signal detecting device, and the gesture data includes motion trajectory data of the three-dimensional bone line.
  • the signal processing unit generates projection data on a specific plane from the detection data of each layer's touch signal detecting device, and the gesture data includes the projection data and its motion trajectory data.
  • the present invention also provides a method for recognizing a stereoscopic touch gesture, including:
  • Step 1: use at least two layers of touch detection devices to collect touch recognition data in different planes;
  • Step 2: integrate the touch recognition data of the different planes into stereoscopic touch gesture data.
  • step 2 may be: generating three-dimensional bone line data of the touch object from the detection data of each layer's touch signal detecting device, the gesture data including motion trajectory data of the three-dimensional bone line.
  • step 2 may alternatively be: projecting all the points detected by each layer's touch detection device onto the plane of one of the layers to obtain a series of two-dimensional point sets, and obtaining the gesture data from the motion changes of the two-dimensional point sets.
  • the touch detection device comprises an infrared emitter-receiver pair array or a camera.
  • the present invention also provides a human-computer interaction system comprising the above touch recognition device.
  • the touch recognition device of the present invention recognizes touch actions in three-dimensional space and recognizes stereoscopic touch gestures, and can therefore be better applied to stereoscopic displays or stereoscopic operating systems, such as touch operation of 3D images.
  • FIG. 1 is a schematic structural view of a multi-layer touch recognition device in the prior art
  • FIG. 2 is a schematic structural view of a multi-layer touch recognition device according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of the 2D cross sections formed by a touch object in each layer of the touch recognition device;
  • FIG. 4 is a schematic diagram of the bone line formed from the cross sections of a touch object in each layer of the touch recognition device;
  • FIG. 5 is a schematic diagram of recognizing a grasp action;
  • FIG. 6 is a schematic diagram of recognizing a lift action;
  • FIG. 7 is a schematic diagram of recognizing a rotate action;
  • FIG. 8 is a block diagram of a system in accordance with another embodiment of the present invention. Detailed Description
  • as shown in FIG. 2, a multi-layer touch recognition apparatus includes a plurality of touch signal detecting devices 1, ..., N disposed in a stack, and a signal processing unit 20 connected to the touch signal detecting devices 1, ..., N. The touch signal detecting devices send their detection data to the signal processing unit, and the signal processing unit processes the received detection data to obtain gesture data capable of representing a stereoscopic touch gesture.
  • the gesture data includes trajectory data in a direction that is not parallel to the display plane.
  • the signal processing unit may be implemented by a microcontroller with multiple I/O interfaces, or by a microcontroller with a single I/O interface to which the plurality of detecting devices transmit data over a bus; both implementations are well known in the art and are not described in detail here.
  • as a preferred embodiment, the signal processing unit 20 includes a touch track integration unit 5 and touch recognition units A, B, ..., N correspondingly connected to the touch signal detecting devices. The touch recognition units are connected to the touch track integration unit and send the recognized touch point data to it, and the touch track integration unit synthesizes the gesture data from the touch point data sent by each touch recognition unit. The touch recognition units A, B, ..., N receive the detection data of the touch signal detecting device connected to them and identify the touch point coordinates.
  • the touch point coordinates of each layer have a Z-axis value, for example (X, Y, Z).
  • the touch point coordinates are sent to the touch track integration unit 5. After receiving the touch point coordinates sent by the respective touch recognition units, the touch track integration unit 5 associates the coordinate points belonging to the same finger, tracks the finger's motion curve, and sends the motion curve feature (which may also be called a gesture) to the connected application system 6. The application system 6 compares it against pre-stored finger motion curve features and executes the operation command corresponding to the recognized feature.
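As an illustration of this comparison step, here is a minimal Python sketch, assuming gestures are stored as equally sampled 3D point templates and matched by mean point distance; the template names and the matching metric are hypothetical, not taken from the patent:

```python
import math

def match_gesture(track, templates):
    """Return the name of the stored template whose sampled 3D points lie
    closest (mean Euclidean distance) to the observed motion track.
    Assumes track and templates are sampled with the same number of points."""
    def mean_dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return min(templates, key=lambda name: mean_dist(track, templates[name]))

# Illustrative pre-stored motion-curve features (names are assumptions):
templates = {
    "lift":  [(0, 0, 0), (0, 0, 1), (0, 0, 2)],   # motion along Z only
    "slide": [(0, 0, 0), (1, 0, 0), (2, 0, 0)],   # motion in the plane
}
print(match_gesture([(0, 0, 0), (0.1, 0, 1.0), (0, 0, 2.1)], templates))  # → lift
```

A real application system would hold many such templates and dispatch the operation command bound to the winning one.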
  • the motion curve feature does not necessarily have to be a three-dimensional curve; the motion of the gesture can also be interpreted by mapping it onto a plane, as described in detail below.
  • as shown in FIG. 3, when a three-dimensional object enters the scanning space formed by the multi-layer touch screens (TP1, TP2, ..., TPn), the 2D cross sections S1, S2, ..., Sn formed by the touch object in each layer can be obtained in the plane of that layer. The centers of the sections S1, S2, ..., Sn are computed and expressed in three-dimensional coordinates as P1(x1, y1, z1), P2(x2, y2, z2), ..., Pn(xn, yn, zn). The point set P1 to Pn is then fitted with a spatial curve (common fitting methods include B-spline fitting or the projection method), yielding a three-dimensional bone line formed by the touch object (as shown in FIG. 4). Foreseeably, the more touch layers there are, the denser the resulting point set, and the closer the fitted curve is to the bone line of the actual touch object.
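The per-layer centroid and curve-fitting step can be sketched as follows. The patent names B-spline fitting; this dependency-free sketch substitutes a straight-line least-squares fit x(z), y(z) through the per-layer centers, which is the simplest stand-in for a bone line:

```python
def layer_centroid(points):
    """Center of one layer's 2D cross-section (the Si of FIG. 3)."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def fit_bone_line(centroids):
    """Least-squares straight line through per-layer centers P1..Pn given as
    (x, y, z) tuples; returns ((x_slope, x_intercept), (y_slope, y_intercept))
    so that x = x_slope * z + x_intercept, and likewise for y."""
    n = len(centroids)
    zs = [z for _, _, z in centroids]
    zbar = sum(zs) / n
    denom = sum((z - zbar) ** 2 for z in zs)
    def line(vals):
        vbar = sum(vals) / n
        slope = sum((z - zbar) * (v - vbar) for z, v in zip(zs, vals)) / denom
        return slope, vbar - slope * zbar
    return line([x for x, _, _ in centroids]), line([y for _, y, _ in centroids])

# Four layers, finger tilted so x grows with depth z (illustrative data):
centers = [(0.0, 1.0, 0.0), (0.5, 1.0, 1.0), (1.0, 1.0, 2.0), (1.5, 1.0, 3.0)]
(xk, xb), (yk, yb) = fit_bone_line(centers)
print(xk, xb, yk, yb)   # → 0.5 0.0 0.0 1.0
```

With more layers the point set densifies and a proper spline fit would hug the finger's actual bone line more closely, as the text notes.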
  • the shape change of the three-dimensional bone line can be tracked in two ways. One way is to track the trajectory of the center point (center of gravity) of the touch object in the touch layer that activates the touch operation. The other way is to track the route change of the point set in the three-dimensional direction: if the relative positions within the point set do not change, but starting from the bottom touch layer upward the activated touches disappear one by one, the touch operation can be recognized as a lift; if the relative positions within the point set change and the latest bone line rotates about the original bone line, the action is a twist. The motion track of the bone line reflects the touch gesture, and various touch gestures can be defined according to the recognized motion.
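A hedged sketch of the lift heuristic just described, assuming each frame is encoded as the set of touch-layer indices currently activated (0 = bottom layer); this encoding is an assumption of the sketch, not specified by the patent:

```python
def is_lift(frames):
    """Lift heuristic: the active touch layers vanish from the bottom layer
    upward across successive frames. `frames` is a list of per-frame sets of
    active layer indices, 0 being the bottom layer."""
    lowest = [min(f) for f in frames if f]           # lowest active layer per frame
    rising = all(a <= b for a, b in zip(lowest, lowest[1:]))
    return rising and len(frames[-1]) < len(frames[0])

print(is_lift([{0, 1, 2, 3}, {1, 2, 3}, {2, 3}, {3}]))   # → True
print(is_lift([{0, 1, 2, 3}, {0, 1, 2, 3}]))             # → False
```

A twist would instead be detected from rotation of the fitted bone line, which this simple per-layer activation encoding cannot express.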
  • if there are multiple touch objects (multiple fingers), multiple three-dimensional bone lines are obtained. By judging the relative motion between the bone lines, more complex three-dimensional gestures can be defined, such as grasp (the center points of the bottom touch layers of the bone lines converge; as shown in FIG. 5, points P01 and P11 in the same touch layer move toward each other), lift (the touch operations forming the bone lines disappear from the bottom layer upward; as shown in FIG. 6, the dashed points P01 and P11 are present first and then gone) and rotate (the bone lines rotate about a center point; as shown in FIG. 7, in each layer points P01 and P11, P02 and P12, P0n and P1n rotate about some point between them), which can then control a virtual three-dimensional object.
  • as for recognizing these gestures from concrete data, besides tracking the motion trajectory of the three-dimensional bone line, another feasible method is to project all the points detected by the touch detection devices of each layer onto one of the layers, for example the middle touch layer. Each frame of data then yields a projected point set, giving a series of two-dimensional point sets, and the gesture can be obtained from the motion changes of these two-dimensional point sets. Taking grasp as an example, the cross-section center points formed by multiple fingers in each touch layer are projected onto the bottom touch layer to obtain a two-dimensional grasp point set. When a grasp occurs the fingers close together, so the cross-section center points in each touch layer also converge, and the point set projected onto the bottom touch layer converges as well. Since in a grasp the uppermost layer converges relatively little and the bottom layer converges most, whether the action is a grasp can be judged directly from the convergence of the outer and inner points of the projected point set.
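The projection test for a grasp might be sketched as below, assuming per-layer centers are projected by simply dropping the z coordinate and that a grasp is flagged when the projected set pulls toward its centroid between frames; the shrink threshold is an arbitrary illustrative value:

```python
import math

def spread(points2d):
    """Mean distance of the projected points from their common centroid."""
    n = len(points2d)
    cx = sum(x for x, _ in points2d) / n
    cy = sum(y for _, y in points2d) / n
    return sum(math.hypot(x - cx, y - cy) for x, y in points2d) / n

def is_grasp(prev_pts, cur_pts, shrink=0.8):
    """Grasp heuristic: the projected finger centers converge between two
    frames by more than the (assumed) shrink ratio."""
    return spread(cur_pts) < shrink * spread(prev_pts)

open_hand   = [(0, 0), (4, 0), (0, 4), (4, 4)]   # projected finger centers
closed_hand = [(1, 1), (3, 1), (1, 3), (3, 3)]
print(is_grasp(open_hand, closed_hand))   # → True
```

A fuller implementation following the text would additionally compare how much the outer (bottom-layer) points converge relative to the inner (top-layer) points.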
  • a hook action can be recognized by combining the shape of the finger projection with the direction and speed of movement along the Z axis.
  • a simple method for extracting bone lines (more complex methods may use dynamic prediction tracking and the like) is to set a distance threshold: when the distance between two points is less than the set threshold, the points are considered associated points of the same bone line, otherwise they are non-associated. During this judgment, however, when the center distance of multiple touch objects in the cross section of a touch layer is less than the spacing between two touch layers, it may not be possible to tell which bone line each section center belongs to. Conversely, if the spacing between the fingers is larger than the spacing between two touch layers, the bone lines of the touch objects can be identified accurately.
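The distance-threshold association rule can be sketched as follows; each new per-layer center joins the bone line whose latest point is nearest and within the threshold, otherwise it starts a new bone line (a greedy simplification of the rule, assumed for illustration):

```python
import math

def associate(bone_lines, new_centers, threshold):
    """Greedy threshold association: append each new (x, y, z) center to the
    nearest bone line whose last point is within `threshold`; otherwise
    start a new bone line. Mutates and returns `bone_lines`."""
    for p in new_centers:
        best, best_d = None, threshold
        for line in bone_lines:
            d = math.dist(line[-1], p)
            if d < best_d:
                best, best_d = line, d
        if best is not None:
            best.append(p)
        else:
            bone_lines.append([p])
    return bone_lines

# Two fingers well separated, plus a stray point that starts a new line:
lines = associate([[(0, 0, 0)], [(5, 0, 0)]],
                  [(0.1, 0, 1), (5.1, 0, 1), (10, 0, 1)], threshold=2.0)
print([len(l) for l in lines])   # → [2, 2, 1]
```

This fails exactly in the case the text warns about: when finger centers sit closer together than the layer spacing, the nearest-line choice becomes ambiguous.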
  • the operated point in the display image can be determined as the coordinates of the identified touch point closest to the display screen.
  • the above embodiment was described taking a multi-layer infrared touch screen as an example; multiple cameras can also be used for recognition. The multiple cameras include touch positioning cameras and gesture recognition cameras, combining touch positioning with gesture recognition.
  • as shown in FIG. 8, two touch positioning cameras 8 are provided in the upper left and upper right corners of the touch screen 7 and connected to touch positioning recognition units A and B, respectively. Two gesture recognition cameras 9 are provided in the lower left and lower right corners of the touch screen 7 and connected to gesture recognition units A and B, respectively. The touch positioning recognition units A, B and the gesture recognition units A, B are all connected to the touch track integration unit 5, which integrates the positioning and gesture data and forms control commands sent to the application system 6.
  • the touch track integration unit of the present embodiment can add touch position data on top of the gesture data so that a specific position in the displayed image can be operated and controlled. The application system 6 pre-stores data of various touch modes (gestures) so that the data received from the touch track integration unit 5 can be compared with the stored touch mode data to determine which touch mode is used.
  • the above embodiments of the present invention combine touch and gesture recognition to enable stereoscopic touch operation on images, and also make human-computer interaction more natural and intelligent.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a touch recognition device comprising a signal processing unit and at least two touch signal detecting devices whose recognition layers are not in the same plane. The touch signal detecting devices are each connected to the signal processing unit and send detection data to it. The signal processing unit processes the received detection data to obtain gesture data representing a stereoscopic touch gesture. The present invention also provides a corresponding recognition method and a human-computer interaction system including the touch recognition device. The present invention uses a multi-layer touch detection device to recognize touch actions in three-dimensional space and to recognize stereoscopic touch gestures, and can be better applied to stereoscopic displays or stereoscopic operating systems, for example touch operation of 3D images.

Description

Touch recognition device and recognition method. Technical Field
The present invention relates to the field of touch technology, and in particular to a device and method for performing touch recognition using a multi-layer touch detection device. Background Art
Touch screens are used ever more widely and are an indispensable device for human-computer interaction. Current touch technology only parses and applies flat touch actions, such as click, double click and slide. However, in the prior art the display screen can provide three-dimensional images in addition to flat images and menus, and operating on a three-dimensional image currently requires a series of selections with the mouse before the operation can be performed, which is complicated and inefficient. With the further development of display technology, televisions, computers and the like that use the parallax principle for stereoscopic display have appeared, many of them used for product display, so a touch technology capable of operating on stereoscopic images is needed.
In the prior art (see Chinese patent document CN102156575A), detection and application of touch depth has been proposed for stereoscopic display technology, but this application is simple, merely adding touch depth as a single variable. As shown in FIG. 1, the touch recognition device includes four layers, touch layer 1, touch layer 2, touch layer 3 and touch layer 4, each touch layer having a longitudinal coordinate reflecting the touch depth. Another Chinese patent, CN101403952B, discloses a technique for sensing touch force using multiple touch layers, but these disclosed technologies apply three-dimensional touch in a very simple way and cannot satisfy the needs of a true three-dimensional touch operation. Summary of the Invention
The technical problem to be solved by the present invention is to provide a touch recognition device and recognition method capable of recognizing stereoscopic touch gestures.
To solve the above technical problem, the present invention adopts the following technical solution:
A touch recognition device includes a signal processing unit and at least two touch signal detecting devices whose recognition layers are not in the same plane, characterized in that the touch signal detecting devices are each connected to the signal processing unit and send detection data to it, and the signal processing unit processes the received detection data to obtain gesture data representing a stereoscopic touch gesture.
Preferably, the signal processing unit includes a touch trajectory integration unit and touch recognition units correspondingly connected to the touch signal detecting devices; the touch recognition units are all connected to the touch trajectory integration unit and send the recognized touch point data to it, and the touch trajectory integration unit synthesizes the gesture data from the touch point data sent by each touch recognition unit.
Preferably, the gesture data includes trajectory data in a direction not parallel to the display plane.
Preferably, at least one of the touch signal detecting devices of the touch recognition device comprises an infrared emitter-receiver pair array.
Preferably, at least one of the touch signal detecting devices of the touch recognition device comprises an optical sensing device.
Preferably, the optical sensing device is a camera.
Preferably, the motion data includes plane trajectory data and direction data obtained by mapping the touch points of each layer onto the same plane.
Preferably, the signal processing unit generates three-dimensional bone line data of the touch object from the detection data of each layer's touch signal detecting device, and the gesture data includes motion trajectory data of the three-dimensional bone line.
Preferably, the signal processing unit generates projection data on a specific plane from the detection data of each layer's touch signal detecting device, and the gesture data includes the projection data and its motion trajectory data.
The present invention also provides a method for recognizing a stereoscopic touch gesture, including:
Step 1: first using at least two layers of touch detection devices to collect touch recognition data in different planes;
Step 2: integrating the touch recognition data of the different planes into stereoscopic touch gesture data. Preferably, step 2 is: generating three-dimensional bone line data of the touch object from the detection data of each layer's touch signal detecting device, the gesture data including motion trajectory data of the three-dimensional bone line.
Preferably, step 2 is: projecting all the points detected by each layer's touch detection device onto the plane of one of the layers to obtain a series of two-dimensional point sets, and obtaining the gesture data from the motion changes of the two-dimensional point sets.
Preferably, the touch detection device comprises an infrared emitter-receiver pair array or a camera.
The present invention also provides a human-computer interaction system including the above touch recognition device.
The above touch recognition device of the present invention recognizes touch actions in three-dimensional space and recognizes stereoscopic touch gestures, and can be better applied to stereoscopic displays or stereoscopic operating systems, such as touch operation of 3D images. Brief Description of the Drawings
FIG. 1 is a schematic structural view of a multi-layer touch recognition device in the prior art;
FIG. 2 is a schematic structural view of a multi-layer touch recognition device according to an embodiment of the present invention; FIG. 3 is a schematic diagram of the 2D cross sections formed by a touch object in each layer of the touch recognition device; FIG. 4 is a schematic diagram of the bone line of the cross sections formed by a touch object in each layer of the touch recognition device; FIG. 5 is a schematic diagram of recognizing a grasp action;
FIG. 6 is a schematic diagram of recognizing a lift action;
FIG. 7 is a schematic diagram of recognizing a rotate action;
FIG. 8 is a block diagram of a system according to another embodiment of the present invention. Detailed Description
The technical solution of the present invention will be described clearly and completely below with reference to specific embodiments and the accompanying drawings.
As shown in FIG. 2, a multi-layer touch recognition apparatus according to an embodiment of the present invention includes a plurality of touch signal detecting devices 1, ..., N disposed in a stack and a signal processing unit 20 connected to the touch signal detecting devices 1, ..., N. The touch signal detecting devices send their detection data to the signal processing unit, and the signal processing unit processes the received detection data to obtain gesture data capable of representing a stereoscopic touch gesture. The gesture data includes trajectory data in a direction not parallel to the display plane.
The signal processing unit may be implemented by a microcontroller with multiple I/O interfaces, or by a microcontroller with a single I/O interface to which the plurality of detecting devices transmit data over a bus; both implementations are well known in the art and are not described in detail here.
As a preferred embodiment, the signal processing unit 20 includes a touch track integration unit 5 and touch recognition units A, B, ..., N correspondingly connected to the touch signal detecting devices. The touch recognition units are all connected to the touch track integration unit and send the recognized touch point data to it, and the touch track integration unit integrates the touch point data sent by each touch recognition unit into the gesture data.
The touch recognition units A, B, ..., N receive the detection data of the touch signal detecting device connected to them and identify the touch point coordinates. The touch point coordinates of each layer have a Z-axis value, for example (X, Y, Z). The touch point coordinates are sent to the touch track integration unit 5. After receiving the touch point coordinates sent by the respective touch recognition units, the touch track integration unit 5 associates the coordinate points belonging to the same finger, tracks the finger's motion curve, and sends the motion curve feature (which may also be called a gesture) to the connected application system 6. The application system 6 compares it against pre-stored finger motion curve features and executes the operation command corresponding to the recognized feature. The motion curve feature does not necessarily have to be a three-dimensional curve; the motion of the gesture can also be interpreted by mapping it onto a plane, as described in detail below.
The principle of gesture recognition is described in detail below.
As shown in FIG. 3, when a three-dimensional object enters the stereoscopic scanning space formed by the multi-layer touch screens (TP1, TP2, ..., TPn), the 2D cross sections S1, S2, ..., Sn formed by the touch object in each layer can be obtained in the plane of that layer. The centers of the sections S1, S2, ..., Sn are computed and expressed in three-dimensional coordinates as P1(x1, y1, z1), P2(x2, y2, z2), ..., Pn(xn, yn, zn). The point set P1 to Pn is then fitted with a spatial curve (common fitting methods include B-spline fitting or the projection method), yielding a three-dimensional bone line formed by the touch object (as shown in FIG. 4). Foreseeably, the more touch layers there are, the denser the resulting point set and the closer the fitted curve is to the bone line of the actual touch object. In the three-dimensional coordinate system, the shape change of this bone line can be tracked in two ways: one way is to track the trajectory of the center point (center of gravity) of the touch object in the touch layer that activates the touch operation; the other is to track the route change of the point set in the three-dimensional direction. If the relative positions within the point set do not change, but from the bottom touch layer upward the activated touches disappear one by one, the touch operation can be recognized as a lift; if the relative positions within the point set change and the latest bone line rotates about the original bone line, the action is a twist. The motion track of the bone line reflects the touch gesture, and various touch gestures can be defined according to the recognized motion.
If there are multiple touch objects (multiple fingers), multiple three-dimensional bone lines are obtained. By judging the relative motion between the bone lines, more complex three-dimensional gestures can be defined, such as grasp (the center points of the bottom touch layers of the bone lines converge; as shown in FIG. 5, points P01 and P11 in the same touch layer move toward each other), lift (the touch operations forming the bone lines disappear from the bottom layer upward; as shown in FIG. 6, the dashed points P01 and P11 are present first and then gone) and rotate (the bone lines rotate about a center point; as shown in FIG. 7, in each layer points P01 and P11, P02 and P12, P0n and P1n rotate about some point between them), which can then control a virtual three-dimensional object.
As for recognizing the above gestures from concrete data, besides tracking the motion trajectory of the three-dimensional bone line, another feasible method is to project all the points detected by the touch detection devices onto one of the layers, for example the middle touch layer. Each frame of data then yields a projected point set, giving a series of two-dimensional point sets, and the gesture can be obtained from the motion changes of these point sets. Taking grasp as an example, the cross-section center points formed by multiple fingers in each touch layer are projected onto the bottom touch layer to obtain a two-dimensional grasp point set. When a grasp occurs the fingers close together, so the cross-section center points in each touch layer converge, and the point set projected onto the bottom touch layer converges as well. Since in a grasp the uppermost layer converges relatively little and the bottom layer converges most, whether the action is a grasp can be judged directly from the convergence of the outer and inner points of the projected point set. A hook action can be recognized by combining the shape of the finger projection with the direction and speed of movement along the Z axis.
A simple method for extracting bone lines (more complex methods may use dynamic prediction tracking and the like) is to set a distance threshold: when the distance between two points is less than the set threshold, the points are considered associated points of the same bone line, otherwise they are non-associated. During this judgment, however, when the center distance of multiple touch objects in the cross section of a touch layer is less than the spacing between two touch layers, it may not be possible to tell which bone line each section center belongs to. Conversely, if the spacing between the fingers is larger than the spacing between two touch layers, the bone lines of the touch objects can be identified accurately.
As for determining the operated point in the display image, the coordinates of the identified touch point closest to the display screen can be determined as the operated point.
The above embodiment was described taking a multi-layer infrared touch screen as an example; multiple cameras can also be used for recognition. The multiple cameras include touch positioning cameras and gesture recognition cameras, combining touch positioning with gesture recognition. As shown in FIG. 8, two touch positioning cameras 8 are provided in the upper left and upper right corners of the touch screen 7 and connected to touch positioning recognition units A and B, respectively. Two gesture recognition cameras 9 are provided in the lower left and lower right corners of the touch screen 7 and connected to gesture recognition units A and B, respectively. The touch positioning recognition units A, B and the gesture recognition units A, B are all connected to the touch track integration unit 5, which integrates the positioning and gesture data and forms control commands sent to the application system 6. The touch track integration unit of this embodiment can add touch position data on top of the gesture data so that a specific position in the displayed image can be operated and controlled. The application system 6 pre-stores data of various touch modes (gestures) so that the data received from the touch track integration unit 5 can be compared with the stored touch mode data to determine which touch mode is used.
The above embodiments of the present invention combine touch and gesture recognition to enable stereoscopic touch operation on images, and also make human-computer interaction more natural and intelligent.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.

Claims

Claims
1. A touch recognition device, comprising a signal processing unit and at least two touch signal detecting devices whose recognition layers are not in the same plane, characterized in that the touch signal detecting devices are each connected to the signal processing unit and send detection data to it, and the signal processing unit processes the received detection data to obtain gesture data representing a stereoscopic touch gesture.
2. The touch recognition device according to claim 1, characterized in that the signal processing unit includes a touch trajectory integration unit and touch recognition units correspondingly connected to the touch signal detecting devices, the touch recognition units are all connected to the touch trajectory integration unit and send the recognized touch point data to it, and the touch trajectory integration unit synthesizes the gesture data from the touch point data sent by each touch recognition unit.
3. The touch recognition device according to claim 2, characterized in that the gesture data includes trajectory data in a direction not parallel to the display plane.
4. The touch recognition device according to claim 2, characterized in that at least one of the touch signal detecting devices of the touch recognition device comprises an infrared emitter-receiver pair array.
5. The touch recognition device according to claim 2, characterized in that at least one of the touch signal detecting devices of the touch recognition device comprises an optical sensing device.
6. The touch recognition device according to claim 5, characterized in that the optical sensing device is a camera.
7. The touch recognition device according to claim 3, characterized in that the motion data includes plane trajectory data and direction data obtained by mapping the touch points of each layer onto the same plane.
8. The touch recognition device according to claim 1, characterized in that the signal processing unit generates three-dimensional bone line data of the touch object from the detection data of each layer's touch signal detecting device, and the gesture data includes motion trajectory data of the three-dimensional bone line.
9. The touch recognition device according to claim 1, characterized in that the signal processing unit generates projection data on a specific plane from the detection data of each layer's touch signal detecting device, and the gesture data includes the projection data and its motion trajectory data.
10. A method for recognizing a stereoscopic touch gesture, characterized by comprising:
Step 1: first using at least two layers of touch detection devices to collect touch recognition data in different planes;
Step 2: integrating the touch recognition data of the different planes into stereoscopic touch gesture data.
11. The method for recognizing a stereoscopic touch gesture according to claim 10, characterized in that step 2 is: generating three-dimensional bone line data of the touch object from the detection data of each layer's touch signal detecting device, the gesture data including motion trajectory data of the three-dimensional bone line.
12. The method for recognizing a stereoscopic touch gesture according to claim 10, characterized in that step 2 is: projecting all the points detected by each layer's touch detection device onto the plane of one of the layers to obtain a series of two-dimensional point sets, and obtaining the gesture data from the motion changes of the two-dimensional point sets.
13. The method for recognizing a stereoscopic touch gesture according to claim 10, characterized in that the touch detection device comprises an infrared emitter-receiver pair array or a camera.
14. A human-computer interaction system, comprising the touch recognition device according to any one of claims 1 to 9.
PCT/CN2013/083363 2012-09-29 2013-09-12 Touch recognition device and recognition method WO2014048251A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201210375746.1 2012-09-29
CN201210375746.1A CN103713755B (zh) 2012-09-29 2012-09-29 Touch recognition device and recognition method

Publications (1)

Publication Number Publication Date
WO2014048251A1 true WO2014048251A1 (zh) 2014-04-03

Family

ID=50386977

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/083363 WO2014048251A1 (zh) 2012-09-29 2013-09-12 Touch recognition device and recognition method

Country Status (2)

Country Link
CN (1) CN103713755B (zh)
WO (1) WO2014048251A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017124452A1 (zh) * 2016-01-23 2017-07-27 曹晟 手势匹配系统指令技术的数据采集方法以及操作装置

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014207920A1 (de) * 2014-04-28 2015-10-29 Robert Bosch Gmbh Elektrisches Gerät und Verfahren zum Betrieb eines elektrischen Geräts
CN106598355A (zh) * 2016-11-23 2017-04-26 杭州碳诺电子科技有限公司 可识别物体多种特征的手写板装置及方法
CN107357431A (zh) * 2017-07-14 2017-11-17 信利光电股份有限公司 一种实现三维触控功能的触控显示装置及方法
CN112346601A (zh) * 2020-11-09 2021-02-09 吴建国 红外三维扫描结构及其在空中触摸屏和按键中的应用
WO2022222980A1 (zh) * 2021-04-22 2022-10-27 广州创知科技有限公司 一种触控校验方法、装置、交互平板和存储介质
WO2022222982A1 (zh) * 2021-04-22 2022-10-27 广州创知科技有限公司 触控信号的校验、人机交互、笔迹的显示方法及相关装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101403952A (zh) * 2008-11-10 2009-04-08 成都市华为赛门铁克科技有限公司 一种触摸屏感应信息的获取方法和装置
KR20110086309A (ko) * 2010-01-22 2011-07-28 삼성전자주식회사 커맨드 생성방법 및 이를 이용한 디스플레이 장치
CN102156575A (zh) * 2011-03-28 2011-08-17 华映视讯(吴江)有限公司 立体触控显示装置及其触控输入方法
CN102299990A (zh) * 2010-06-22 2011-12-28 希姆通信息技术(上海)有限公司 手势控制的手机

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101403952A (zh) * 2008-11-10 2009-04-08 成都市华为赛门铁克科技有限公司 一种触摸屏感应信息的获取方法和装置
KR20110086309A (ko) * 2010-01-22 2011-07-28 삼성전자주식회사 커맨드 생성방법 및 이를 이용한 디스플레이 장치
CN102299990A (zh) * 2010-06-22 2011-12-28 希姆通信息技术(上海)有限公司 手势控制的手机
CN102156575A (zh) * 2011-03-28 2011-08-17 华映视讯(吴江)有限公司 立体触控显示装置及其触控输入方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017124452A1 (zh) * 2016-01-23 2017-07-27 曹晟 手势匹配系统指令技术的数据采集方法以及操作装置

Also Published As

Publication number Publication date
CN103713755B (zh) 2017-02-08
CN103713755A (zh) 2014-04-09

Similar Documents

Publication Publication Date Title
WO2014048251A1 (zh) Touch recognition device and recognition method
TWI690842B (zh) 基於手勢辨認的互動顯示方法和裝置
US9569005B2 (en) Method and system implementing user-centric gesture control
CN102063618B (zh) 互动系统中的动态手势识别方法
US20130257736A1 (en) Gesture sensing apparatus, electronic system having gesture input function, and gesture determining method
US8648808B2 (en) Three-dimensional human-computer interaction system that supports mouse operations through the motion of a finger and an operation method thereof
US20140139429A1 (en) System and method for computer vision based hand gesture identification
KR101890459B1 (ko) 3차원으로 디스플레이된 오브젝트의 사용자 선택 제스쳐에 응답하기 위한 방법 및 시스템
WO2014106219A1 (en) User centric interface for interaction with visual display that recognizes user intentions
TW201120681A (en) Method and system for operating electric apparatus
TWI471815B (zh) 手勢辨識裝置及方法
TWI403922B (zh) 徒手人機介面操作系統及其方法
US10366281B2 (en) Gesture identification with natural images
TWI431538B (zh) 基於影像之動作手勢辨識方法及系統
WO2016026365A1 (zh) 实现非接触式鼠标控制的人机交互方法和系统
WO2013149475A1 (zh) 一种用户界面的控制方法及装置
JP2015118442A (ja) 情報処理装置、情報処理方法およびプログラム
US9525906B2 (en) Display device and method of controlling the display device
TWI499938B (zh) 觸控系統
CN103914668A (zh) 一种防止误触摸的触摸识别装置及识别方法
WO2018076609A1 (zh) 一种操作终端的方法和终端
WO2014033722A1 (en) Computer vision stereoscopic tracking of a hand
TWI444875B (zh) 多點觸碰輸入裝置及其使用單點觸控感應板與影像感測器之資料融合之介面方法
KR20130096073A (ko) 손 동작 인식을 이용한 가상 마우스 구동 방법
JP2018010539A (ja) 画像認識装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13840590

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 27/07/2015)

122 Ep: pct application non-entry in european phase

Ref document number: 13840590

Country of ref document: EP

Kind code of ref document: A1