WO2014146516A1 - Interactive device and method for left and right hands - Google Patents


Info

Publication number
WO2014146516A1
WO2014146516A1 (PCT/CN2014/071408)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
unit
parameters
hand
matching
Prior art date
Application number
PCT/CN2014/071408
Other languages
French (fr)
Chinese (zh)
Inventor
贺真 (He Zhen)
刘吉林 (Liu Jilin)
李腾跃 (Li Tengyue)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2014146516A1 publication Critical patent/WO2014146516A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to the field of human-computer interaction technology, and in particular to a left-right hand interaction device and method.
  • the soft keys occupy a certain amount of UI space and obstruct the view. If a soft key is too small it is hard to hit; if it is too large it not only wastes space but also invites accidental operation. Moreover, soft keys are fixed in one position and are not available on demand wherever the user needs them.
  • the computer driving the projection screen can be controlled remotely, and the projected content operated from a distance.
  • this is still mouse-style positioning, the actual experience is worse than a mouse, and related operations such as annotation remain inconvenient.
  • the embodiments of the present invention provide a left-right hand interaction device and method to solve the technical problems of inconvenient operation, poor fluency, and low efficiency of the existing interaction device and method.
  • a left-right hand interaction device comprising a detecting unit, an analyzing unit, a matching unit, and a feedback unit, each connected to a bus;
  • the detecting unit is configured to detect the range of the touched area on the touch screen;
  • the analyzing unit is configured to receive the input of the detecting unit, further analyze the sensor array within the touch range, and derive the hand-model parameters at the moment of touch;
  • the matching unit is configured to match the analysis result of the analyzing unit against a feature library and determine the hand model of the touch source;
  • the feedback unit is configured to feed the corresponding triggering behavior back into the user interface according to the matching and judgment results of the matching unit.
  • the apparatus further includes a capture unit coupled to the bus for capturing a two-handed model of the user when interacting with the projection screen.
  • the capturing unit comprises an imaging device and a distance sensor.
  • the feature library contains specific parameters for the touch behavior of all ten fingers; the hand model of the touch source is determined by matching the analysis result of the analyzing unit against these parameters.
  • the second aspect provides a touch-screen-based left-right hand interaction method, which specifically includes: detecting the contour, range, and touch intensity of the touched area on the touch screen;
  • further analyzing, based on the detection result, the sensor array within the touch range and deriving the area, center-point coordinates, and direction angle of the touch range;
  • matching the parameters obtained by the analysis against the parameters in an existing feature library to determine the hand model of the touch source;
  • calling up the corresponding triggering behavior and feeding it back into the user interface.
  • the contour, range, and touch intensity of the touched area on the touch screen are detected through the real-time capture capability of the array sensor.
  • further analyzing the sensor array within the touch range and deriving the area, center-point coordinates, and direction angle of the touch range specifically comprises:
  • using the touch screen's own touch-sensor array: when a sensor senses finger contact it produces a corresponding signal; statistically analyzing the signal of every sensor yields the distribution of triggered sensors, from which the area, center-point coordinates, and direction angle of the touch range can be computed.
  • the feature library contains specific parameters for the touch behavior of all ten fingers; the hand model of the touch source is determined by matching the analyzed parameters against the parameters in the existing feature library.
  • a projection-screen-based left-right hand interaction method specifically includes: detecting, by an imaging device and a distance sensor, the position and touch manner of the touched area on the projection screen;
  • capturing, by the imaging device and the distance sensor, the user's hand model at the moment of touch;
  • further analyzing the captured model to obtain specific hand-model parameters, and matching these parameters against the parameters in an existing feature library to determine the hand model of the touch source;
  • calling up the corresponding triggering behavior and feeding it back into the user interface.
  • the touch manner includes: clicking, sliding, long pressing, dragging, and double clicking.
  • the further analysis based on the captured hand model specifically comprises:
  • capturing a depth model of both hands based on depth and color-difference information, obtaining orientation and depth from the depth model, and then computing specific hand-model parameters to determine whether the current touch comes from the left hand or the right hand.
  • the specific hand-model parameters include: the orientation, posture, angle, and depth of the hands, and their hierarchical relationship.
  • functional operations are accomplished through the cooperation of the left and right hands, eliminating the function keys of the prior art; this greatly saves screen space and minimizes the distance fingers travel during interaction, making the interaction process more convenient and fluid.
  • people can annotate and operate freely in front of the projection screen without returning to the computer, establishing a new interaction paradigm for the computer age.
  • FIG. 1 is a structural diagram of a right and left hand interaction device according to an embodiment of the present invention.
  • FIG. 2 is a flow chart of a touch screen-based right and left hand interaction method provided by the present invention
  • FIG. 3 is a schematic diagram of finger touch recognition
  • Figure 4 is a schematic diagram of the left and right hand touch interaction behavior
  • FIG. 5 is a flowchart of a left-right hand interaction method based on a projection screen provided by the present invention
  • FIG. 6 is a schematic diagram of interaction behavior of left and right hand projection screens.
  • the invention provides a left-right hand interaction device that supports left- and right-hand touch interaction. The device comprises a detecting unit, an analyzing unit, a matching unit, and a feedback unit, each connected to a bus. The detecting unit is configured to detect the range of the touched area on the touch screen; the analyzing unit is configured to receive the input of the detecting unit, further analyze the sensor array within the touch range, and derive the hand-model parameters at the moment of touch;
  • the matching unit is configured to match the analysis result of the analyzing unit against a feature library and determine the hand model of the touch source;
  • the feedback unit is configured to feed the corresponding triggering behavior back into the user interface according to the matching and judgment results of the matching unit.
  • the feature library contains specific parameters for the touch behavior of all ten fingers; each finger's parameters differ, and the hand model of the touch source is determined by matching the analysis result of the analyzing unit against the parameters in the feature library.
  • the invention accomplishes functional operations through the cooperation of the left and right hands and eliminates the interaction function keys of the prior art, which greatly saves screen space and makes the interaction process more convenient and fluid.
  • FIG. 1 is a structural diagram of a right and left hand interaction device according to an embodiment of the present invention.
  • the device can be applied to the right and left hand interaction behavior of the touch screen, and can also be applied to the left and right hand interaction behavior of the projection screen.
  • the apparatus comprises: a detecting unit 11, an analyzing unit 12, a matching unit 13, a feedback unit 14, and a capturing unit 15, which are connected to each other by a bus.
  • the detecting unit 11 is configured to detect a range of the touched area on the touch screen, including a specific area and an area outline;
  • the analyzing unit 12 is configured to receive an input of the detecting unit 11, and further analyze a sensor array within a touch range, and analyze a two-hand model parameter when touched;
  • the matching unit 13 is configured to match the analysis result of the analysis unit 12 with the feature database, and determine the two-hand model of the touch source;
  • the feedback unit 14 is configured to feed the corresponding triggering behavior back into the user interface according to the matching and judgment results of the matching unit 13.
  • the capture unit 15 is for capturing a two-hand model of the user when interacting with the projection screen.
  • the capturing unit 15 includes an imaging device and a distance sensor.
  • the feature library contains specific parameters for the touch behavior of all ten fingers; each finger's parameters differ, and the hand model of the touch source is determined by matching the analysis result of the analyzing unit against the parameters in the feature library.
  • the present embodiment adds a capture unit 15 for capturing the user's two-hand model when interacting with the projection screen.
  • people can freely annotate and perform normal operations at the projection screen without going back to the computer, so that projection-screen-based interaction is also convenient and fluid.
  • FIG. 2 is a flow chart of a touch screen-based left and right hand interaction method provided by the present invention, and the method includes:
  • S10 detects the contour, range, and touch intensity of the touched area on the touch screen;
  • the corresponding triggering behavior is called out and fed back to the user interface.
  • the feature library contains specific parameters for the touch behavior of all ten fingers, and each finger's parameters differ; the parameters obtained by the analysis are matched against the parameters in the existing feature library to determine the hand model of the touch source.
  • according to the determined hand model, the system calls up the corresponding triggering behavior and feeds it back to the user interface to complete the relevant operations.
  • This embodiment provides a flow chart of a left-right hand interaction method based on a touch screen, including the following steps:
  • S10: the specific touched area, including its contour and range, and the touch intensity are detected through real-time capture by the array sensor.
  • Figure 3 is a schematic diagram of finger touch recognition.
  • microscopically, the capacitive touch screen is an array of sensors integrated on the screen in a square matrix.
  • the sensors covered by the touched area sense a change in capacitance, which yields the arrangement of triggered sensors; analyzing the shape of that arrangement identifies whether the touch comes from the left or the right hand, and the size and detail of the area determine which finger it is.
  • when a sensor senses finger contact it produces a corresponding signal; statistically analyzing the signal of every sensor yields the distribution of triggered sensors, from which the area, center-point coordinates, and direction angle of the touch range are computed.
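  • the area, centroid, and direction angle described above can be sketched with image moments over the sensor grid. This is an illustrative reconstruction, not the patent's implementation; the function name, grid representation, and signal weighting are assumptions:

```python
import math

def touch_metrics(grid):
    """Given a 2-D grid of sensor activation strengths (0 = untouched),
    return the touched area (active cell count), the weighted centroid,
    and the orientation angle of the contact patch via image moments."""
    pts = [(x, y, v) for y, row in enumerate(grid)
                     for x, v in enumerate(row) if v > 0]
    area = len(pts)
    total = sum(v for _, _, v in pts)
    cx = sum(x * v for x, _, v in pts) / total
    cy = sum(y * v for _, y, v in pts) / total
    # Central second moments; the standard moment formula recovers the
    # principal axis of the (roughly elliptical) contact region.
    mu20 = sum(v * (x - cx) ** 2 for x, _, v in pts)
    mu02 = sum(v * (y - cy) ** 2 for _, y, v in pts)
    mu11 = sum(v * (x - cx) * (y - cy) for x, y, v in pts)
    angle = 0.5 * math.atan2(2 * mu11, mu20 - mu02)  # radians
    return area, (cx, cy), angle
```

  On a 45° diagonal blob of active cells, for example, the returned angle is π/4.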
  • the specific parameters obtained by the analysis are matched against the parameter intervals in the existing feature library; the library contains a specific parameter interval for the touch behavior of each of the ten fingers, and each finger's intervals differ. If every obtained parameter falls within a given finger's intervals, the current touch is judged to come from that finger, and the hand model of the touch source is thereby determined.
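  • a minimal sketch of this interval-matching step follows; the parameter names and interval values are invented placeholders — a real feature library would hold calibrated per-finger intervals for a given device and user:

```python
# Hypothetical per-finger parameter intervals: (low, high) ranges for a
# contact area (in sensor cells) and a direction angle (in degrees).
FEATURE_LIBRARY = {
    "right_thumb": {"area": (90, 160), "angle": (-80, -30)},
    "right_index": {"area": (40, 80),  "angle": (-20, 20)},
    "left_thumb":  {"area": (90, 160), "angle": (30, 80)},
}

def classify_touch(params):
    """Return the finger whose every parameter interval contains the
    corresponding measured value, or None if no finger matches."""
    for finger, intervals in FEATURE_LIBRARY.items():
        if all(lo <= params[name] <= hi
               for name, (lo, hi) in intervals.items()):
            return finger
    return None
```

  A large contact patch angled one way is then classified as a right thumb, the mirror-image angle as a left thumb, and parameters outside every interval yield no match.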
  • the corresponding triggering behavior is called out and fed back to the user interface.
  • the system can determine whether the touching finger belongs to the left or the right hand, and whether it is the index finger or the thumb.
  • the interaction here is mainly divided into two categories: left thumb / other fingers.
  • Figure 4 is a schematic diagram of the left and right hand touch interaction behavior.
  • pop-up menus such as the Menu list automatically appear at the position of the right thumb; if the right thumb is not detected, they pop up in the right half of the screen.
  • functions are controlled by the left hand. Eliminating the function keys in this way greatly saves screen space and minimizes the distance fingers travel during interaction, making the interaction process more convenient and fluid.
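  • the placement rule above can be sketched as follows; the coordinate convention and fallback point are illustrative assumptions, not specified by the patent:

```python
def menu_position(right_thumb, screen_size):
    """Place the pop-up menu at the detected right-thumb touch point;
    if no right thumb is detected, fall back to the centre of the
    right half of the screen. Positions are (x, y) pixel tuples."""
    width, height = screen_size
    if right_thumb is not None:
        return right_thumb
    return (3 * width // 4, height // 2)  # centre of the right half
```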
  • FIG. 5 is a flowchart of a left-right hand interaction method based on a projection screen provided by the present invention, where the method specifically includes:
  • S201 detecting, by the imaging device and the distance sensor, a position and a touch manner of the touched area on the projection screen, where the touch manner includes: clicking, sliding, long pressing, dragging, and double-clicking;
  • the corresponding triggering behavior is called out and fed back to the user interface.
  • the feature library contains specific parameters for two-hand touch behavior; by matching the analyzed parameters against the parameters in the existing feature library, the hand model of the touch source is determined, and according to the determined model the system calls up the corresponding triggering behavior and feeds it back into the user interface to complete the related operations.
  • this embodiment provides the flow of a projection-screen-based left-right hand interaction method, including the following steps:
  • S20: the position and touch manner of the touched area on the projection screen are detected by the imaging device and the distance sensor; the touch manner includes clicking, sliding, long pressing, dragging, and double-clicking.
  • a depth model of both hands is captured based on depth and color-difference information; orientation and depth are obtained from the depth model, and specific hand-model parameters are then computed to determine whether the current touch comes from the left hand or the right hand.
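  • as one toy illustration of the final left/right decision — not the patent's actual algorithm; the landmark names, viewpoint, and palm-toward-screen assumption are mine — a hand's thumb and pinky positions alone can separate the two cases:

```python
def classify_hand(thumb, pinky):
    """Toy handedness heuristic on depth-model landmarks: assuming the
    palm faces the projection screen and the scene is viewed from behind
    the hand, a right hand's thumb appears to the left of its pinky.
    Landmarks are (x, y, depth) tuples; x grows to the right."""
    return "right" if thumb[0] < pinky[0] else "left"
```

  A production system would combine many more of the captured parameters (orientation, posture, angle, depth, and layering) rather than a single coordinate comparison.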
  • the specific parameters obtained by the analysis are matched against the parameter intervals in the existing feature library; the library contains specific parameter intervals for two-hand touch behavior, and if every obtained parameter falls within a given finger's intervals, the current touch is judged to come from that finger, thereby determining the hand model of the touch source.
  • the corresponding triggering behavior is called out and fed back to the user interface.
  • Figure 6 is a schematic diagram of left- and right-hand interaction behavior on a projection screen. All menus and related operations are displayed next to the touch points, making interaction and operation easy.
  • specific left- and right-hand touch interactions based on the projection screen can include:
  • double-click: multi-window switching
  • aspects of the present invention, or possible implementations of those aspects, may be embodied as a system, a method, or a computer program product.
  • they may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, and the like), or an embodiment combining software and hardware aspects, collectively referred to herein as a "circuit," "module," or "system."
  • they may also take the form of a computer program product embodied as computer-readable program code stored in a computer-readable medium.
  • the computer readable medium can be a computer readable signal medium or a computer readable storage medium.
  • the computer-readable storage medium includes, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or apparatus, or any suitable combination of the foregoing, such as random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, or a portable compact-disc read-only memory (CD-ROM).
  • a processor in the computer reads the computer-readable program code stored in the computer-readable medium, so that the processor can perform the functional actions specified in each step, or combination of steps, of the flowchart, and produces an apparatus that implements the functions specified in each block, or combination of blocks, of the block diagrams.
  • the computer-readable program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. It should also be noted that, in some alternative implementations, the functions noted in the steps of the flowcharts or the blocks of the block diagrams may occur out of the order noted in the drawings. For example, two steps or blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved. It is intended that the present invention cover all such modifications and variations provided they fall within its spirit and scope.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are an interaction device and method for the left and right hands. The device comprises a detection unit, an analysis unit, a matching unit, and a feedback unit, each connected to a bus. The detection unit detects the range of a touched region on a touch screen; the analysis unit receives the input of the detection unit, further analyzes the sensor array within the touch range, and derives the hand-model parameters at the moment of touch; the matching unit matches the analysis result of the analysis unit against a feature library and determines the hand model of the touch source; and the feedback unit feeds the corresponding trigger behavior back to the user interface according to the matching and judgment results of the matching unit. The invention accomplishes functional operations through the cooperation of the left and right hands, eliminates the function buttons of the prior art, greatly saves screen space, and minimizes the distance fingers move during interaction, making the interaction process more convenient and fluid.

Description

Description: An interaction device and method for left and right hands

Technical field

The present invention relates to the field of human-computer interaction technology, and in particular to a left-right hand interaction device and method.

Background

When people operate a mobile phone or tablet, they are often forced to tap function keys outside the screen during touch operations, so the interaction is neither smooth nor efficient. In addition, when presenting content on a projection screen, the presenter often needs to annotate it for the other participants, and sometimes even has to walk back to the computer to perform related operations; this is cumbersome for both annotation and content manipulation and disrupts the fluency of the interaction.

In the prior art, soft keys on the screen replace the function keys outside it: tapping soft keys around the screen UI (user interface) expands pop-up options, and tapping a related option performs what the function keys on the phone or tablet originally did. However, the soft keys occupy UI space and obstruct the view. If a soft key is too small it is hard to hit; if it is too large it not only wastes space but also invites accidental operation. Moreover, soft keys are fixed in one position and are not available on demand wherever the user needs them.

Alternatively, by installing a remote-control application on a phone or tablet and connecting it to the computer over Wi-Fi or a similar link, the user can control the computer driving the projection screen and thus operate the projected content remotely. This is still mouse-style positioning, the actual experience is worse than a mouse, and related operations such as annotation remain inconvenient.

Summary of the invention

Embodiments of the present invention provide a left-right hand interaction device and method to address the inconvenient operation, poor fluency, and low efficiency of existing interaction devices and methods.

To solve the above technical problems, the embodiments of the present invention disclose the following technical solutions:
In a first aspect, a left-right hand interaction device is provided, comprising a detecting unit, an analyzing unit, a matching unit, and a feedback unit, each connected to a bus;

the detecting unit is configured to detect the range of the touched area on the touch screen;

the analyzing unit is configured to receive the input of the detecting unit, further analyze the sensor array within the touch range, and derive the hand-model parameters at the moment of touch;

the matching unit is configured to match the analysis result of the analyzing unit against a feature library and determine the hand model of the touch source;

the feedback unit is configured to feed the corresponding triggering behavior back into the user interface according to the matching and judgment results of the matching unit.

With reference to the first aspect, in a first possible implementation of the first aspect, the device further comprises a capture unit connected to the bus for capturing the user's hand model when interacting with a projection screen.

With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, the capture unit comprises an imaging device and a distance sensor.

With reference to the first aspect, or to any of the foregoing possible implementations of the first aspect, in a third possible implementation of the first aspect, the feature library contains specific parameters for the touch behavior of all ten fingers, and the hand model of the touch source is determined by matching the analysis result of the analyzing unit against the parameters in the feature library.
In a second aspect, a touch-screen-based left-right hand interaction method is provided, which specifically comprises: detecting the contour, range, and touch intensity of the touched area on the touch screen;

further analyzing, based on the detection result, the sensor array within the touch range and deriving the area, center-point coordinates, and direction angle of the touch range;

matching the parameters obtained by the analysis against the parameters in an existing feature library to determine the hand model of the touch source;

calling up the corresponding triggering behavior according to the matching and judgment results and feeding it back into the user interface.

With reference to the second aspect, in a first possible implementation of the second aspect, the contour, range, and touch intensity of the touched area on the touch screen are detected through the real-time capture capability of the array sensor. With reference to the second aspect, in a second possible implementation of the second aspect, further analyzing the sensor array within the touch range and deriving the area, center-point coordinates, and direction angle of the touch range specifically comprises:

using the touch screen's own touch-sensor array: when a sensor senses finger contact it produces a corresponding signal; statistically analyzing the signal of every sensor yields the distribution of triggered sensors, from which the area, center-point coordinates, and direction angle of the touch range can be computed.

With reference to the second aspect, in a third possible implementation of the second aspect, the feature library contains specific parameters for the touch behavior of all ten fingers, and the hand model of the touch source is determined by matching the analyzed parameters against the parameters in the existing feature library.
In a third aspect, a projection-screen-based left-right hand interaction method is provided, which specifically comprises: detecting, by an imaging device and a distance sensor, the position and touch manner of the touched area on the projection screen;

capturing, by the imaging device and the distance sensor, the user's hand model at the moment of touch;

further analyzing the captured hand model to obtain specific hand-model parameters; matching the parameters obtained by the analysis against the parameters in an existing feature library to determine the hand model of the touch source;

calling up the corresponding triggering behavior according to the matching and judgment results and feeding it back into the user interface.

With reference to the third aspect, in a first possible implementation of the third aspect, the touch manner includes: clicking, sliding, long pressing, dragging, and double-clicking.

With reference to the third aspect, in a second possible implementation of the third aspect, the further analysis based on the captured hand model specifically comprises:

capturing, through the imaging device and the distance sensor, a depth model of both hands based on depth and color-difference information, obtaining orientation and depth from the depth model, and then computing specific hand-model parameters to determine whether the current touch comes from the left hand or the right hand.

With reference to the second possible implementation of the third aspect, in a third possible implementation of the third aspect, the specific hand-model parameters include: the orientation, posture, angle, and depth of the hands, and their hierarchical relationship.

In the embodiments of the present invention, functional operations are accomplished through the cooperation of the left and right hands, eliminating the function keys of the prior art; this greatly saves screen space and minimizes the distance fingers travel during interaction, making the interaction process more convenient and fluid. In a meeting scenario, people can annotate and operate freely in front of the projection screen without returning to the computer, establishing a new interaction paradigm for the computer age.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a structural diagram of a left-right hand interaction apparatus according to an embodiment of the present invention;

FIG. 2 is a flowchart of the touch-screen-based left-right hand interaction method provided by the present invention;

FIG. 3 is a schematic diagram of finger touch recognition;

FIG. 4 is a schematic diagram of left-right hand touch interaction behavior;

FIG. 5 is a flowchart of the projection-screen-based left-right hand interaction method provided by the present invention;

FIG. 6 is a schematic diagram of left-right hand interaction behavior on a projection screen.
DETAILED DESCRIPTION
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly below with reference to the accompanying drawings. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
Embodiment 1:

The present invention provides a left-right hand interaction apparatus that can be used for left-right hand touch interaction. The apparatus includes a detection unit, an analysis unit, a matching unit, and a feedback unit, each connected to a bus.

The detection unit is configured to detect the extent of a touched area on a touch screen.

The analysis unit is configured to receive the input of the detection unit, further analyze the sensor array within the touch range, and analyze the two-hand model parameters at the time of the touch.

The matching unit is configured to match the analysis result of the analysis unit against a feature library and determine the two-hand model from which the touch originated.

The feedback unit is configured to feed the corresponding trigger behavior back to the user interface according to the matching and determination result of the matching unit.

The feature library contains specific parameters for the touch behavior of all ten fingers, and the specific parameters of each finger differ; the two-hand model from which a touch originated is determined by matching the analysis result obtained by the analysis unit against the parameters in the feature library.

The present invention performs functional operations through the cooperation of the left and right hands, eliminating the function keys used for interaction in the prior art. This greatly saves screen space and also makes the interaction process more convenient and fluid.
Embodiment 2:

FIG. 1 is a structural diagram of a left-right hand interaction apparatus according to an embodiment of the present invention. The apparatus is applicable to left-right hand interaction on a touch screen as well as on a projection screen.

The apparatus includes: a detection unit 11, an analysis unit 12, a matching unit 13, a feedback unit 14, and a capture unit 15, which are connected to one another through a bus.

The detection unit 11 is configured to detect the extent of a touched area on the touch screen, including the specific area and its contour.

The analysis unit 12 is configured to receive the input of the detection unit 11, further analyze the sensor array within the touch range, and analyze the two-hand model parameters at the time of the touch.

The matching unit 13 is configured to match the analysis result of the analysis unit 12 against a feature library and determine the two-hand model from which the touch originated.

The feedback unit 14 is configured to feed the corresponding trigger behavior back to the user interface according to the matching and determination result of the matching unit 13.

The capture unit 15 is configured to capture the user's two-hand model during interaction with a projection screen.

The capture unit 15 includes a camera device and a distance sensor.

The feature library contains specific parameters for the touch behavior of all ten fingers, and the specific parameters of each finger differ; the two-hand model from which a touch originated is determined by matching the analysis result obtained by the analysis unit against the parameters in the feature library.

On the basis of Embodiment 1, this embodiment adds the capture unit 15, which captures the user's two-hand model during interaction with a projection screen. In a meeting scenario, people can freely annotate and perform routine operations in front of the projection screen without returning to the computer, so that projection-screen-based interaction is likewise convenient and fluid.
Embodiment 3:

FIG. 2 is a flowchart of the touch-screen-based left-right hand interaction method provided by the present invention. The method specifically includes:

S101: Detect the contour, extent, and touch intensity of a touched area on the touch screen.

S102: Based on the above detection result, further analyze the sensor array within the touch range, and derive the area, center-point coordinates, and orientation angle of the touch range.

S103: Match the parameters obtained by the above analysis against the parameters in an existing feature library, and determine the two-hand model from which the touch originated.

S104: According to the matching and determination result, invoke the corresponding trigger behavior and feed it back to the user interface.

The feature library contains specific parameters for the touch behavior of all ten fingers, and the specific parameters of each finger differ. By matching the parameters obtained by the analysis against the parameters in the existing feature library, the two-hand model from which the touch originated is determined; based on the determined two-hand model, the system invokes the corresponding trigger behavior and feeds it back to the user interface to complete the relevant operation.
Embodiment 4:

This embodiment provides a flow of a touch-screen-based left-right hand interaction method, including the following steps:

S101: Through real-time capture by the array sensor, detect the specific touched area, including its contour and extent, as well as the touch intensity.

S102: Based on the above detection result, further analyze the sensor array within the touch range, and derive the area, center-point coordinates, and orientation angle of the touch range.

FIG. 3 is a schematic diagram of finger touch recognition. At the microscopic level, a capacitive touch screen is made up of individual sensors integrated on the screen in a square matrix. When a finger touches a region, the sensors covered by that region sense a change in capacitance value, from which the arrangement of triggered sensors is obtained. By analyzing the shape of the region formed by the triggered sensors, it can be recognized whether the touch comes from the left hand or the right hand, and the size and details of the area reveal which finger produced it.

Using the touch screen's own touch sensor array, each sensor generates a corresponding signal whenever it senses finger contact. By statistically analyzing the sensing signal of every sensor, the distribution of triggered sensors can be obtained, from which the area, center-point coordinates, and orientation angle of the touch range are further calculated.
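The statistical analysis described above can be sketched as a small routine that takes the grid of triggered sensors and derives the three parameters named in the text: area, center-point coordinates, and orientation angle. This is an illustrative sketch rather than the patented implementation; the binary grid input and the use of second-moment (principal-axis) analysis for the angle are assumptions.

```python
import math

def analyze_touch(grid):
    """Derive touch area, centroid, and orientation angle (degrees)
    from a 2-D grid of sensor readings (1 = triggered, 0 = idle)."""
    pts = [(x, y) for y, row in enumerate(grid)
                  for x, v in enumerate(row) if v]
    area = len(pts)
    cx = sum(x for x, _ in pts) / area
    cy = sum(y for _, y in pts) / area
    # Central second moments; the principal axis of the triggered
    # region gives the orientation angle of the touch.
    mxx = sum((x - cx) ** 2 for x, _ in pts) / area
    myy = sum((y - cy) ** 2 for _, y in pts) / area
    mxy = sum((x - cx) * (y - cy) for x, y in pts) / area
    angle = 0.5 * math.degrees(math.atan2(2 * mxy, mxx - myy))
    return area, (cx, cy), angle
```

An elongated, tilted blob of triggered sensors then yields a non-zero orientation angle, which is one of the cues the matching step below can use to distinguish fingers.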
S103: Match the parameters obtained by the above analysis against the parameters in an existing feature library, and determine the two-hand model from which the touch originated.

Specifically: the specific parameters obtained by the analysis are matched against the parameter intervals in the existing feature library. The parameter intervals in the feature library include a specific parameter interval for the touch behavior of each of the ten fingers, and the intervals differ from finger to finger. If every obtained parameter falls within the intervals of a particular finger, it can be determined that the current touch originated from that finger, and the two-hand model of the touch source is thereby determined.
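The interval matching can be illustrated with a toy feature library. All numeric intervals and finger names below are invented placeholders; a real library would hold calibrated per-finger intervals for all ten fingers.

```python
# Hypothetical feature library: per-finger intervals for the
# (area, angle) of a touch; values here are illustrative only.
FEATURE_LIBRARY = {
    "left_thumb":  {"area": (110, 160), "angle": (20, 70)},
    "left_index":  {"area": (60, 100),  "angle": (-10, 30)},
    "right_thumb": {"area": (110, 160), "angle": (-70, -20)},
    "right_index": {"area": (60, 100),  "angle": (-30, 10)},
}

def classify_touch(params, library=FEATURE_LIBRARY):
    """Return the finger whose every parameter interval contains the
    measured value, or None when no library entry matches."""
    for finger, intervals in library.items():
        if all(lo <= params[name] <= hi
               for name, (lo, hi) in intervals.items()):
            return finger
    return None
```

A touch is attributed to a finger only when all of its measured parameters fall inside that finger's intervals, exactly as the paragraph above describes.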
S104: According to the matching and determination result, invoke the corresponding trigger behavior and feed it back to the user interface.

The system can determine whether the touching finger belongs to the left hand or the right hand, and whether it is the index finger or the thumb. The interaction behaviors here fall mainly into two categories: left thumb / other fingers.

When it is detected that the current touch comes from the left thumb, different motions trigger different behaviors, as follows:
1) When the left thumb is detected sliding to the left, the system recognizes this and returns to the previous page (Back);

2) When the left thumb is detected sliding upward, the system recognizes this and returns to the home page (Home);

3) When the left thumb is detected sliding downward, the system recognizes this and opens the option list (Menu);

4) When the left thumb is detected sliding to the right, the system recognizes this and advances to the next page (Forward);

5) When the left thumb is detected double-tapping the screen, the system recognizes this and opens the multi-tasking menu (Multi-Tasking).

FIG. 4 is a schematic diagram of left-right hand touch interaction behavior. When the left thumb performs the corresponding operation, menus such as Menu automatically pop up at the position of the right thumb; if no right thumb is detected, they pop up in the right half of the screen.
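The left-thumb branch above amounts to a small dispatch table. A minimal sketch follows; the gesture and action identifiers are illustrative names chosen to mirror the behaviors in the text, not symbols from the patent itself.

```python
# Left-thumb gestures mapped to the system actions described above.
LEFT_THUMB_ACTIONS = {
    "swipe_left":  "Back",
    "swipe_up":    "Home",
    "swipe_down":  "Menu",
    "swipe_right": "Forward",
    "double_tap":  "Multi-Tasking",
}

def dispatch(hand, finger, gesture):
    """Left-thumb gestures trigger system actions; any other finger
    falls through to normal touch handling."""
    if hand == "left" and finger == "thumb":
        return LEFT_THUMB_ACTIONS.get(gesture, "ignore")
    return "normal_touch"
```

Routing every non-thumb touch to `"normal_touch"` corresponds to the paragraph below: other fingers keep their ordinary sliding, tapping, and multi-finger semantics.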
When it is detected that the current touch comes from another finger, normal touch operations are performed, for example: sliding, tapping, multi-finger touch, and the like.

In this embodiment, functions are controlled with the left hand. This approach of eliminating function keys greatly saves screen space while also minimizing the distance a finger must travel during interaction, making the interaction process more convenient and fluid.
Embodiment 5:

FIG. 5 is a flowchart of the projection-screen-based left-right hand interaction method provided by the present invention. The method specifically includes:

S201: Detect, by a camera device and a distance sensor, the position and touch manner of a touched area on the projection screen, where the touch manner includes: tapping, sliding, long-pressing, dragging, double-tapping, and the like.

S202: Capture, by the camera device and the distance sensor, the user's two-hand model at the instant of the touch.

S203: Perform further analysis on the captured two-hand model to obtain specific two-hand model parameters, where the specific two-hand model parameters include: the orientation, posture, angle, and depth of the hands, as well as the hierarchical relationship between them.

S204: Match the parameters obtained by the above analysis against the parameters in an existing feature library, and determine the two-hand model from which the touch originated.

S205: According to the matching and determination result, invoke the corresponding trigger behavior and feed it back to the user interface.

The feature library contains specific parameters of two-hand touch behaviors. By matching the parameters obtained by the analysis against the parameters in the existing feature library, the two-hand model from which the touch originated is determined; based on the determined two-hand model, the system invokes the corresponding trigger behavior and feeds it back to the user interface to complete the relevant operation.
Embodiment 6:

This embodiment provides a flow of a projection-screen-based left-right hand interaction method, including the following steps:

S201: Detect, by a camera device and a distance sensor, the position and touch manner of a touched area on the projection screen, where the touch manner includes: tapping, sliding, long-pressing, dragging, double-tapping, and the like.

S202: Capture, by the camera device and the distance sensor, the user's two-hand model at the instant of the touch.

S203: Perform further analysis on the captured two-hand model to obtain specific two-hand model parameters, including: the orientation, posture, angle, and depth of the hands, as well as the hierarchical relationship between them.

The further analysis on the captured two-hand model is specifically as follows:

Through the camera device and the distance sensor, a depth model of both hands is captured based on depth and color-difference information. Orientation and depth are obtained from the depth model, the specific two-hand model parameters are calculated from them, and it is determined whether the current touch comes from the left hand or the right hand.
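A minimal sketch of the left/right decision from depth data follows. It assumes the hand is the nearest object in a normalized depth map and that the position of the hand's mass relative to the touch column indicates handedness; a real system would fit the full hand model (posture, angle, finger layout) the text describes.

```python
def classify_hand(depth_map, touch_x, near=0.0, far=0.8):
    """Treat pixels whose depth falls in [near, far) as hand pixels,
    then decide left vs right from the hand's mean column relative
    to the touched column. Thresholds are illustrative."""
    cols = [x
            for row in depth_map
            for x, d in enumerate(row)
            if near <= d < far]
    if not cols:
        return None  # no hand found in the depth window
    mean_col = sum(cols) / len(cols)
    return "left" if mean_col < touch_x else "right"
```

The depth window plays the role of the depth model here: color-difference segmentation and the richer parameters (posture, hierarchy) are omitted for brevity.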
S204: Match the parameters obtained by the above analysis against the parameters in the existing feature library, and determine the two-hand model from which the touch originated.

Specifically: the specific parameters obtained by the analysis are matched against the parameter intervals in the existing feature library, which include specific parameter intervals for two-hand touch behaviors. If every obtained parameter falls within the intervals of a particular finger, it can be determined that the current touch originated from that finger, and the two-hand model of the touch source is thereby determined.

S205: According to the matching and determination result, invoke the corresponding trigger behavior and feed it back to the user interface.

FIG. 6 is a schematic diagram of left-right hand interaction behavior on a projection screen. All menus and related operations are displayed next to the touch point, making them easy to interact with and operate.

For example, a specific scheme of projection-screen-based left-right hand touch interaction may be as follows:
Left-hand interaction behaviors are as follows:

1) Single-finger tap: operations such as select/open;

2) Single-finger long press: pop up a customized Menu containing the related operations;

3) Single-finger slide: select multiple items at once, including multiple documents, multiple pieces of text, and so on;

4) Long press and drag: move the selected content to a suitable position;

5) Slide left with two or more fingers: undo, go back;

6) Double tap: switch between windows;

7) Pinch two fingers together, spread them apart, or rotate them: zoom out, zoom in, rotate.

Right-hand interaction behaviors are as follows:

8) Single-finger slide: annotate with the default/current tool;

9) Two-finger slide: eraser tool;

10) Single-finger long press: Menu, allowing the user to change pens, change tools, save, delete, and so on.
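The role split above — left hand for selection and navigation, right hand for annotation — can be summarized as a lookup table keyed by hand and gesture. The gesture and action identifiers are illustrative names for the behaviors listed in the text.

```python
# Sketch of the left/right role split on the projection screen.
GESTURES = {
    ("left",  "one_finger_tap"):   "select_or_open",
    ("left",  "one_finger_hold"):  "context_menu",
    ("left",  "one_finger_swipe"): "multi_select",
    ("left",  "hold_drag"):        "move_selection",
    ("left",  "multi_swipe_left"): "undo_back",
    ("left",  "double_tap"):       "switch_window",
    ("left",  "pinch_or_rotate"):  "zoom_rotate",
    ("right", "one_finger_swipe"): "annotate",
    ("right", "two_finger_swipe"): "erase",
    ("right", "one_finger_hold"):  "tool_menu",
}

def project_action(hand, gesture):
    """Resolve a (hand, gesture) pair to an action; unknown pairs
    are ignored rather than raising an error."""
    return GESTURES.get((hand, gesture), "ignore")
```

Because the table is keyed on the hand as well as the gesture, the same physical gesture (for example, a single-finger slide) triggers selection with the left hand but annotation with the right.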
In this embodiment, in a meeting scenario, people can freely annotate and perform routine operations in front of the projection screen without returning to the computer, making projection-screen-based interaction very convenient and fluid.
A person of ordinary skill in the art will understand that the various aspects of the present invention, or possible implementations of the various aspects, may be embodied as a system, a method, or a computer program product. Therefore, the aspects of the present invention, or possible implementations of the various aspects, may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, and the like), or an embodiment combining software and hardware aspects, which are collectively referred to herein as a "circuit", "module", or "system". In addition, the aspects of the present invention, or possible implementations of the various aspects, may take the form of a computer program product, where the computer program product refers to computer-readable program code stored in a computer-readable medium.

The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The computer-readable storage medium includes, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or apparatus, or any suitable combination of the foregoing, such as a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, or a portable read-only memory (CD-ROM).

A processor in a computer reads the computer-readable program code stored in the computer-readable medium, so that the processor can perform the functional actions specified in each step, or combination of steps, in the flowcharts, and an apparatus implementing the functional actions specified in each block, or combination of blocks, of the block diagrams is generated.
The computer-readable program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. It should also be noted that, in some alternative implementations, the functions noted in the steps of the flowcharts or in the blocks of the block diagrams may occur out of the order noted in the figures. For example, depending on the functions involved, two steps, or two blocks, shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order.

Obviously, a person skilled in the art can make various modifications and variations to the present invention without departing from the spirit and scope of the invention. The present invention is intended to cover these modifications and variations provided that they fall within the scope of protection defined by the claims of the present invention and their equivalent technologies.

Claims

1. A left-right hand interaction apparatus, characterized in that the apparatus comprises a detection unit, an analysis unit, a matching unit, and a feedback unit, each connected to a bus; wherein

the detection unit is configured to detect the extent of a touched area on a touch screen;

the analysis unit is configured to receive the input of the detection unit, further analyze the sensor array within the touch range, and analyze the two-hand model parameters at the time of the touch;

the matching unit is configured to match the analysis result of the analysis unit against a feature library and determine the two-hand model from which the touch originated; and

the feedback unit is configured to feed the corresponding trigger behavior back to the user interface according to the matching and determination result of the matching unit.
2. The interaction apparatus according to claim 1, characterized in that the apparatus further comprises a capture unit connected to the bus and configured to capture the user's two-hand model during interaction with a projection screen.

3. The interaction apparatus according to claim 2, characterized in that the capture unit comprises a camera device and a distance sensor.

4. The interaction apparatus according to any one of claims 1 to 3, characterized in that the feature library contains specific parameters for the touch behavior of all ten fingers, and the two-hand model from which the touch originated is determined by matching the analysis result obtained by the analysis unit against the parameters in the feature library.
5. A touch-screen-based left-right hand interaction method, characterized in that the method specifically comprises:

detecting the contour, extent, and touch intensity of a touched area on the touch screen;

based on the above detection result, further analyzing the sensor array within the touch range, and deriving the area, center-point coordinates, and orientation angle of the touch range;

matching the parameters obtained by the above analysis against the parameters in an existing feature library, and determining the two-hand model from which the touch originated; and

according to the matching and determination result, invoking the corresponding trigger behavior and feeding it back to the user interface.

6. The interaction method according to claim 5, characterized in that the contour, extent, and touch intensity of the touched area on the touch screen are detected through the real-time capture function of the array sensor.

7. The interaction method according to claim 5, characterized in that further analyzing the sensor array within the touch range and deriving the area, center-point coordinates, and orientation angle of the touch range specifically comprises:

using the touch screen's own touch sensor array, wherein a sensor generates a corresponding signal when it senses finger contact, and statistically analyzing the sensing signal of each sensor to obtain the distribution of the sensors, from which the area, center-point coordinates, and orientation angle of the touch range are further calculated.

8. The interaction method according to claim 5, characterized in that the feature library contains specific parameters for the touch behavior of all ten fingers, and the two-hand model from which the touch originated is determined by matching the parameters obtained by the analysis against the parameters in the existing feature library.
9. A projection-screen-based left-right hand interaction method, characterized in that the method specifically comprises:

detecting, by a camera device and a distance sensor, the position and touch manner of a touched area on the projection screen;

capturing, by the camera device and the distance sensor, the user's two-hand model at the instant of the touch;

performing further analysis on the captured two-hand model to obtain specific two-hand model parameters;

matching the parameters obtained by the above analysis against the parameters in an existing feature library, and determining the two-hand model from which the touch originated; and

according to the matching and determination result, invoking the corresponding trigger behavior and feeding it back to the user interface.

10. The interaction method according to claim 9, characterized in that the touch manner comprises: tapping, sliding, long-pressing, dragging, and double-tapping.

11. The interaction method according to claim 9, characterized in that the further analysis according to the captured two-hand model is specifically:

capturing, through the camera device and the distance sensor, a depth model of both hands based on depth and color-difference information, obtaining orientation and depth from the depth model, and calculating the specific two-hand model parameters therefrom to determine whether the current touch comes from the left hand or the right hand.

12. The interaction method according to claim 11, characterized in that the specific two-hand model parameters comprise: the orientation, posture, angle, and depth of the hands, and the hierarchical relationship between them.
PCT/CN2014/071408 2013-03-20 2014-01-24 Interactive device and method for left and right hands WO2014146516A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2013100913922A CN103164160A (en) 2013-03-20 2013-03-20 Left hand and right hand interaction device and method
CN201310091392.2 2013-03-20

Publications (1)

Publication Number Publication Date
WO2014146516A1 true WO2014146516A1 (en) 2014-09-25

Family

ID=48587289

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/071408 WO2014146516A1 (en) 2013-03-20 2014-01-24 Interactive device and method for left and right hands

Country Status (2)

Country Link
CN (1) CN103164160A (en)
WO (1) WO2014146516A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107291361A (en) * 2016-03-30 2017-10-24 阿里巴巴集团控股有限公司 The keyboard layout method to set up and device of a kind of input method

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN103164160A (en) * 2013-03-20 2013-06-19 华为技术有限公司 Left hand and right hand interaction device and method
CN104657011B (en) * 2013-11-15 2018-06-15 联发科技股份有限公司 Touch-control transmitting device, electronic device and data transmission method
CN104777994A (en) * 2014-01-15 2015-07-15 东莞市步步高通信软件有限公司 Method and system for intelligent identification of mobile terminal held in left hand or right hand of user
CN105573514A (en) * 2015-12-22 2016-05-11 武雄英 Input equipment with double navigation keys
CN105718196A (en) * 2016-01-18 2016-06-29 惠州Tcl移动通信有限公司 Mobile terminal and identification method for left and right hands on touch screen thereof
CN109960406B (en) * 2019-03-01 2020-12-08 清华大学 Intelligent electronic equipment gesture capturing and recognizing technology based on action between fingers of two hands

Citations (5)

Publication number Priority date Publication date Assignee Title
CN101916161A (en) * 2010-08-04 2010-12-15 宇龙计算机通信科技(深圳)有限公司 Interface model selection method based on image of region pressed by finger and mobile terminal
WO2012019350A1 (en) * 2010-08-12 2012-02-16 Google Inc. Finger identification on a touchscreen
CN102780864A (en) * 2012-07-03 2012-11-14 深圳创维-Rgb电子有限公司 Projection menu-based television remote control method and device, and television
CN102789312A (en) * 2011-12-23 2012-11-21 乾行讯科(北京)科技有限公司 User interaction system and method
CN103164160A (en) * 2013-03-20 2013-06-19 华为技术有限公司 Left hand and right hand interaction device and method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP5203797B2 (en) * 2008-05-13 2013-06-05 株式会社エヌ・ティ・ティ・ドコモ Information processing apparatus and display information editing method for information processing apparatus
US8896555B2 (en) * 2011-05-20 2014-11-25 Robert H Duffield Touch alphabet and communication system

Also Published As

Publication number Publication date
CN103164160A (en) 2013-06-19

Similar Documents

Publication Publication Date Title
WO2014146516A1 (en) Interactive device and method for left and right hands
CN105824559B (en) False touch recognition and processing method and electronic equipment
US10871894B2 (en) Apparatus and method of copying and pasting content in a computing device
CN103064627B (en) Application management method and device
KR20160073359A (en) Apparatus and method for providing user interface, and computer-readable recording medium recording the same
KR20140114913A (en) Apparatus and Method for operating sensors in user device
TWI463355B (en) Signal processing apparatus, signal processing method and selecting method of user-interface icon for multi-touch interface
WO2014029043A1 (en) Method and device for simulating mouse input
CN102662462A (en) Electronic device, gesture recognition method and gesture application method
CN103218044B (en) Touch device based on physical feedback and touch processing method thereof
TWI505155B (en) Touch-control method for capacitive and electromagnetic dual-mode touch screen and handheld electronic device
TWI389014B (en) Touchpad detection method
KR102323892B1 (en) Multi-touch virtual mouse
TWI421731B (en) Method for executing mouse function of electronic device and electronic device thereof
CN105549856A (en) Mobile-terminal-based instruction input method and apparatus
CN103870156A (en) Method and device for processing object
TWI528271B (en) Method, apparatus and computer program product for polygon gesture detection and interaction
WO2021197487A1 (en) Method and apparatus for controlling terminal screen by means of mouse, mouse and storage medium
CN105589636A (en) Method and mobile terminal used for realizing virtual pointer control on touch screen
CN105824553A (en) Touch method and mobile terminal
CN108255300B (en) Control method and device of electronic equipment
CN107870705B (en) Method and device for changing icon position of application menu
CN106909256A (en) Screen control method and device
CN105808129B (en) Method and device for quickly starting software function by using gesture
US9525906B2 (en) Display device and method of controlling the display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14767739

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14767739

Country of ref document: EP

Kind code of ref document: A1