WO2019113868A1 - Ear-based human-computer interaction technology for interaction with touch surface device - Google Patents


Info

Publication number
WO2019113868A1
WO2019113868A1 · PCT/CN2017/116042
Authority
WO
WIPO (PCT)
Prior art keywords
ear
touch surface
human
computer interaction
interaction method
Prior art date
Application number
PCT/CN2017/116042
Other languages
French (fr)
Chinese (zh)
Inventor
喻纯
王若琳
史元春
何纬捷
Original Assignee
Tsinghua University (清华大学)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University (清华大学)
Priority to PCT/CN2017/116042
Publication of WO2019113868A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Definitions

  • the invention relates to a human-computer interaction method based on interaction between an ear and a touch surface device, and a human-computer interaction system implementing the method.
  • when one hand is occupied and the user interacts with a touch surface device, especially a mobile phone with a touch screen, in a one-handed grip, only the thumb can reach the screen, and its range of motion is limited to the lower-right corner of the screen (when the device is held in the right hand).
  • to access an arbitrary position on the screen, the user has to shift the hand up and down or tilt the phone's screen, which cannot easily be done while maintaining a stable grip, especially on a large-screen phone of 5 inches or more.
  • in view of the above problems, the present invention provides a human-computer interaction method, and a corresponding device, based on interaction between an ear and a touch surface device; the ear, as the human organ interacting with the device, provides its input, enabling access to the entire touch surface in a stable one-handed grip.
  • the method can support both eyes-free use by sighted users and use by blind users.
  • thus, finger operation can be completely replaced, or merely supplemented.
  • a human-computer interaction method based on interaction of an ear with a touch surface device, wherein the ear, as the human organ interacting with the device, provides its input, the method comprising: identifying an input pointing and/or posture of the ear; and, in response to the recognition result, the touch surface device performing a corresponding control operation.
  • identifying the input pointing and/or posture of the ear comprises: obtaining a proximity image associated with the ear to identify one or more of, or a combination of, the input site, position, force, and/or action of the ear on the device.
  • the input site comprises one or more of the upper auricle, the lower auricle, the earlobe, or a combination thereof, or the entire ear.
  • the location comprises an edge location and a non-edge location of the touch surface.
  • the actions include: touching, pressing, rotating, sliding, and detaching.
  • the method further comprises: detecting the density values and pattern of a region of interest serving as an ear candidate, wherein the region of interest is a region composed of pixels whose density values exceed a certain threshold.
  • the movement of the ear relative to the touch surface while in contact with it is identified and tracked, and a touch navigation operation command is implemented based on the recognition result.
  • the touch surface device senses the ear's input using a capacitive sensor, and the contact state of the ear with the screen is measured by the summed intensity of the capacitance values over the touch surface.
  • the ear coming fully into contact with the touch surface is defined as an ear contact event; during the contact process the summed value grows with fluctuations, and its first fall-back is taken as the ear contact frame.
  • lifting the ear off the touch surface is defined as an ear disengagement event; the summed value drops rapidly as the ear moves away from the touch surface, and the last frame above a set threshold is taken as the ear disengagement frame.
  • the movement of the ear on the touch surface is defined as an ear movement event, and the proximity image is tracked to obtain a relative amount of displacement between successive two frames of images.
  • the relative displacement between successive two frames of images is calculated using a kernel correlation filter KCF algorithm.
  • the method uses two KCF trackers working alternately: the first tracker is in the active working state and outputs the tracking result while the second runs in the background; when the first tracker has worked for a certain period of time, or when its tracking fails, the states of the two trackers are exchanged and the replaced tracker is re-initialized, and so on.
  • said time is 500 ms.
  • the contact coordinates of the ear are calculated.
  • a certain point between the edge of the upper auricle and the ear hole is defined as the intent click point of the ear image.
  • the vertex of the minimum rectangle enclosing the ear image that is nearest the upper-rear auricle is used as a reference, and a point at a fixed distance from the boundary is taken as the intent click point.
  • the inventors have noted that when interacting, the user tends to move the touch surface device toward the ear rather than moving the ear toward the device.
  • the contact coordinates of the ear are mapped to the touch surface to minimize movement of the user's arms and thumb.
  • the operation comprises the touch surface device providing feedback to the user.
  • the feedback comprises audible feedback, in particular in the form of sound effects, speech or a combination of both.
  • the feedback comprises haptic feedback, in particular providing the haptic feedback in a vibrating manner.
  • the hand inputs an operational command to the touch surface device as an auxiliary human organ that interacts with the touch surface device.
  • the touch surface device also recognizes input from the hand.
  • the touch surface defines an ear interaction area and a hand interaction area.
  • the touch surface device prompts an error by feedback.
  • the method further comprises: identifying the orientation of the ear on the screen to distinguish the left and right ears, and performing different controls based on the result of the differentiation.
  • a touch surface device for facilitating human interaction via the ear, the ear being the human organ interacting with the touch surface device, the touch surface device comprising:
  • a touch surface element comprising a sensor component for sensing the operation of the ear, receiving an operation command input by the ear,
  • a memory storing computer executable instructions that, when executed by the processor, perform the method as previously described.
  • the touch surface device is a mobile phone and the touch surface is a touch screen.
  • a computer storage medium having stored thereon computer executable instructions that, when executed by a computer, perform the method as defined above.
  • FIG. 1 shows a schematic flow chart of a human-computer interaction method 1 based on an ear interacting with a touch surface device according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram showing a structural configuration of a touch surface device according to an embodiment of the present invention
  • Figure 3 is a schematic illustration of the life cycle of an ear contact event when the ear interacts with the screen
  • Figure 4 illustrates the alternating operation of the two kernelized correlation filter (KCF) trackers employed in tracking the image of the moving ear
  • Figure 5 shows a schematic diagram of calculating the contact coordinates of the ear.
  • Figure 6 shows a schematic diagram of the mapping of the ear contacts to the touch surface.
  • the "input pointing" of the ear here draws on the concept of "pointing" in the input field.
  • pointing generally refers to the coordinates, in the screen coordinate system, of the click point of a mouse, stylus pointer, or the like; here it means an input position point determined from the detection result by a predetermined algorithm.
  • by "proximity image" herein is meant the following: the touch surface device is capable of detecting objects as they approach and/or contact the touch surface.
  • each sensing element (i.e., "pixel") in the two-dimensional array of sensing elements (i.e., the touch surface) produces an output signal indicating the electric-field disturbance (for a capacitive sensor), force (for a pressure sensor), or optical coupling (for a photosensor) at that element.
  • the ensemble of pixel values constitutes a "proximity image".
  • various embodiments of the present invention provide the ability to detect and discriminate touch surface signals (represented as proximity images) produced by the identified input sites, positions, force paths, and/or motion types.
  • touch surface elements herein may be touch panels or touch screens, and the touch surface elements include sensors capable of sensing input, such as capacitive sensors, pressure sensors, photosensors, image sensors, and the like.
  • compared to the fingers, the ear has its own characteristics as a human organ providing input for interaction with a touch surface device:
  • the shape of a finger on the touch surface is relatively simple and can easily be reduced to a single spot;
  • the ear itself has a more complex shape, a larger area, and a more uneven surface;
  • the shape of the contact area varies from person to person, and even for the same user differs greatly between inputs. Recognition and tracking of ear input is therefore more difficult.
  • actions that can be performed when the ear interacts with the touch surface device, which make the input richer, include but are not limited to:
  • the area of the ear is large, resulting in more options for the parts that can provide input.
  • one or more of the upper auricle, the lower auricle, and the earlobe, or a combination thereof, may be used as the input site, or the whole ear may serve as the input site. Performing the same action with part of the ear versus the whole ear therefore constitutes two different interactions, which can trigger different operation instructions.
  • for example, the upper part of the ear first contacts the screen and serves as a pivot point; the lower end of the phone can then be pressed toward or lifted away from the head. The lower half of the ear image accordingly appears and disappears, the action can be identified from this, and operation instructions can be defined on that basis.
  • since the ear is very soft and has a complex shape, it deforms during contact with the touch surface. Different forces, light or heavy, therefore also bring new information to the input, and these differences can be used to distinguish interactions.
  • the location in which the ear interacts with the touch surface device can also provide more options for input.
  • the ear may be in contact at the edge of the touch surface or in a non-edge position; of course, the ear may not be in contact with the touch surface, but at a distance from it, and/or at an angle.
  • the orientation of the left and right ears presented on the screen is different: the operation of the left and right ears can be given different meanings.
  • the input site, position, force, and/or action of the ear, combined with the timing of the action, can form extremely rich inputs mapped to different operation instructions, including but not limited to:
  • a part of the ear contacts the touch surface, and then the whole ear is brought into contact through a fitting action;
  • one of the most basic operations is to move the ear while it stays attached to the touch surface, with the touch surface device reading out the information content of the corresponding position.
  • the user can touch the ear twice on the touch surface to make a selected operational command.
  • the controlled quantity may be, for example, the brightness of a light, the opening of a curtain, the volume or playing track of a video, or the set temperature and air volume of an air conditioner.
  • the touch surface device 200 includes a touch surface element 230 that receives operation commands input by the ear; the touch surface element 230 includes a sensing device 231 that senses the input of the ear; the device further includes a processor 210 and a memory 220 storing computer-executable instructions that can carry out the human-computer interaction method according to the present invention.
  • the touch surface device can be, for example, a mobile phone, a tablet computer, a smart watch, or the like.
  • touching the surface with the ear is usually taken as the input, but this is only a preferred example; with suitable sensor support, other actions of the ear relative to the touch surface, such as mere proximity, may also form an input.
  • for example, when a photoelectric sensor is provided on the touch surface, the ear can be detected by it once it is close enough.
  • step S110: identifying the input pointing and/or posture of the ear;
  • step S120: in response to the recognition result, the touch surface device performing the corresponding control operation.
  • for example, a point at the upper center of the ear image is defined as the ear position.
  • the positional movement and orientation change of the ear are discerned by comparing the matches between the ear shapes.
  • the touch surface device is a mobile phone, for example an Android-based mobile phone, whose touch screen serves as the touch surface.
  • the handset has, for example, a capacitive sensor that senses input from the ear.
  • the touch surface device of the present invention can also utilize other sensors such as pressure sensors, photo sensors, image sensors, sonic sensors, and the like.
  • when the ear interacts with the phone, the input pointing and/or posture of the ear is recognized. In particular, a proximity image associated with the ear is obtained to identify one or more of, or a combination of, the input site, position, force, and/or action of the ear on the device.
  • the input site can include one or more of the upper auricle, the lower auricle, the earlobe, or a combination thereof, or the entire ear.
  • the position includes: the edge position and the non-edge position of the touch surface.
  • the edge position of the touch surface is a position that is easy to be perceived by the ear, so it is preferable to set some events triggered by the ear motion at the edge position.
  • the auricle portion of the ear can be slid up and down along the edge of the screen and slide left and right to achieve an up and down slide and a left and right slide operation similar to the finger on the screen.
  • the capacitance matrix of the screen can be obtained from the Android HAL (hardware abstraction layer) and then processed (interpolated, denoised, etc.) to obtain a capacitive screen image suitable for analysis.
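The interpolation and denoising step just described can be sketched in Python/NumPy. This is an illustrative sketch only, not the patent's implementation: the function name, the scale factor, and the choice of bilinear upsampling followed by a 3x3 box blur are all assumptions.

```python
import numpy as np

def preprocess_capacitance(raw, scale=4):
    """Upsample a coarse capacitance matrix and suppress noise.

    `raw` is the low-resolution matrix read from the touch controller
    (e.g. via the Android HAL); values are arbitrary sensor units.
    """
    h, w = raw.shape
    # Bilinear interpolation onto a finer grid.
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    img = ((1 - wy) * (1 - wx) * raw[np.ix_(y0, x0)]
           + (1 - wy) * wx * raw[np.ix_(y0, x1)]
           + wy * (1 - wx) * raw[np.ix_(y1, x0)]
           + wy * wx * raw[np.ix_(y1, x1)])
    # Simple 3x3 box blur for denoising.
    pad = np.pad(img, 1, mode="edge")
    img = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
              for dy in range(3) for dx in range(3)) / 9.0
    return img
```

On a typical touch controller the raw grid is coarse (tens of rows and columns), so even this simple scheme yields an image smooth enough for blob analysis.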
  • the capacitive image produced by ear-screen contact is then identified and tracked. Specifically, the density values and pattern of regions of interest serving as ear candidates are detected, where a region of interest is a region composed of pixels whose density values exceed a certain threshold.
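The region-of-interest detection described above can be sketched as thresholding the capacitive image and collecting connected components of above-threshold pixels as ear candidates. A minimal illustrative sketch (the function name and 4-connectivity choice are assumptions, not the patent's implementation):

```python
import numpy as np

def find_ear_candidates(img, threshold):
    """Return connected regions of pixels whose density value exceeds
    `threshold`, as candidate ear regions (4-connected flood fill)."""
    mask = img > threshold
    labels = np.zeros(img.shape, dtype=int)
    regions = []
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue  # pixel already assigned to a region
        label = len(regions) + 1
        labels[seed] = label
        stack, pixels = [seed], []
        while stack:
            y, x = stack.pop()
            pixels.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = label
                    stack.append((ny, nx))
        regions.append(pixels)
    return regions
```

Each returned region can then be scored by its density pattern to decide whether it is an ear rather than, say, a palm or cheek.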
  • based on the unique attributes of the ear's interaction with the touch surface, and in correspondence with the existing finger touch events (down, up, move), unique ear touch events are proposed for ear interaction: Ear-on, Ear-off, and Ear-move.
  • the "full-screen capacitance intensity summation value" can be used to identify the region of interest and to measure the contact state of the ear with the screen, extracting two stable frames as the ear contact (Ear-on) and ear disengagement (Ear-off) frames.
  • Figure 3 shows the life cycle of an ear contact event when the ear interacts with the screen once, with the Y-axis being the "summation value".
  • Ear-on: the ear comes "fully into contact" with the screen.
  • the "summation value" fluctuates during contact between the ear and the screen; its first fall-back, i.e., the first trough appearing in the curve, is taken as the Ear-on frame.
  • Ear-off: the ear is lifted off the screen. The "summation value" drops rapidly as the ear moves away; the last frame above the set threshold is taken as the Ear-off frame, as shown in Figure 3.
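Under these definitions, frame selection over the per-frame summation values might be sketched as below. This is a simplified illustration (a real system would smooth and debounce the curve); the function name and threshold handling are assumptions:

```python
def detect_ear_events(sums, off_threshold):
    """Pick Ear-on and Ear-off frame indices from the per-frame
    'summation values' of the capacitive image.

    Ear-on: the first trough of the fluctuating, growing curve
    (its first fall-back while the ear settles onto the screen).
    Ear-off: the last frame whose summation value is still above
    `off_threshold` as the ear lifts away.
    """
    ear_on = None
    for i in range(1, len(sums) - 1):
        if sums[i - 1] > sums[i] <= sums[i + 1]:  # first local trough
            ear_on = i
            break
    ear_off = None
    for i, v in enumerate(sums):
        if v > off_threshold:
            ear_off = i  # keeps updating: ends at the last frame above threshold
    return ear_on, ear_off
```

For a curve that rises with a dip and then collapses, e.g. `[0, 5, 9, 8, 10, 12, 11, 12, 6, 2, 0]` with threshold 3, the Ear-on frame is index 3 (the first trough) and the Ear-off frame is index 8 (the last value above 3).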
  • Ear-move: the KCF (Kernelized Correlation Filter) algorithm is used to track the image of the ear moving on the screen, obtaining the relative displacement between two consecutive frames.
  • two KCF trackers can be used to alternately work, as shown in FIG. 4:
  • the first tracker is in an active working state, and when the tracking result is output, the second tracker is in a background running state;
  • the status of the two trackers is swapped and the replaced first tracker is reinitialized.
  • the 500 ms period is a preferred example; other time thresholds can be set as needed.
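The alternating two-tracker scheme could be organized as below. The tracker objects are abstracted behind a factory so the sketch stays self-contained; in practice each would wrap a KCF tracker (e.g. OpenCV's). The class name, the `update(frame) -> (ok, box)` interface, and the exact swap behavior are illustrative assumptions:

```python
import time

class AlternatingKCF:
    """Two trackers alternate (cf. Figure 4): one is active and outputs
    results, the other runs in the background; they swap when the active
    one has run for `swap_after` seconds or when its tracking fails, and
    the tracker that was swapped out is re-initialized."""

    def __init__(self, make_tracker, swap_after=0.5):  # 500 ms default
        self.make_tracker = make_tracker
        self.swap_after = swap_after
        self.active = make_tracker()
        self.background = make_tracker()
        self.active_since = time.monotonic()

    def _swap(self):
        self.active, self.background = self.background, self.active
        self.background = self.make_tracker()  # re-initialize the replaced one
        self.active_since = time.monotonic()

    def update(self, frame):
        ok, box = self.active.update(frame)
        self.background.update(frame)  # keep the spare tracker warm
        if not ok:                     # tracking failure: fail over at once
            self._swap()
            ok, box = self.active.update(frame)
        elif time.monotonic() - self.active_since >= self.swap_after:
            self._swap()               # periodic swap after the time budget
        return ok, box
```

The periodic swap bounds drift in the active tracker's model, while the warm spare hides the re-initialization cost from the caller.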
  • the contact coordinates of the ear are calculated and mapped to the full screen.
  • a certain point located between the edge of the upper auricle and the ear hole is defined as an intent click point of the ear image.
  • the vertex of the smallest rectangle enclosing the ear image that is nearest the upper-rear auricle is used as a reference, and a point at a fixed distance from the boundary is taken as the intent click point.
  • the first point (ear contact, Ear-on) uses this absolute-position definition; during Ear-move the ear position is computed from the absolute coordinates plus the relative displacements between adjacent frames.
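A sketch of the intent-click-point computation from a binary ear mask. The assumption that the upper-rear auricle corresponds to the top-right vertex of the bounding rectangle (true for one ear orientation only) and the pixel offset value are illustrative, not taken from the patent:

```python
import numpy as np

def intent_click_point(mask, offset=(12, 12)):
    """Estimate the intended click point of an ear image.

    `mask` is a boolean image of the ear contact area. The vertex of
    the minimum bounding rectangle nearest the upper-rear auricle is
    used as the reference (here assumed to be the top-right corner),
    and the click point is taken a fixed offset inside the boundary,
    roughly between the upper auricle edge and the ear hole.
    """
    ys, xs = np.nonzero(mask)
    top, right = ys.min(), xs.max()
    return top + offset[0], right - offset[1]
```

A left-ear image would mirror the reference corner, so a real implementation would first classify the ear's orientation.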
  • Other locations may be employed in the present invention as intent click points.
  • the ear interacts only within a certain area of the screen, thereby defining an ear interaction area; as shown in Figure 5, the illustrated area is defined as a comfortable boundary region for ear interaction.
  • the movement of the ear in this area is then linearly mapped to the full screen, as shown in Figure 6, so that movement of the user's arm and thumb is reduced as much as possible.
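The linear mapping from the comfortable ear-interaction area to the full screen can be sketched as follows (the function name and the rectangle convention are illustrative):

```python
def map_to_screen(pt, comfort_rect, screen_size):
    """Linearly map a contact point from the comfortable ear-interaction
    area (left, top, right, bottom) to full-screen coordinates, so the
    user's arm and thumb movements stay small."""
    x, y = pt
    left, top, right, bottom = comfort_rect
    w, h = screen_size
    sx = (x - left) / (right - left) * (w - 1)
    sy = (y - top) / (bottom - top) * (h - 1)
    return sx, sy
```

The corners of the comfort rectangle map to the screen corners, so a small ear movement sweeps the whole screen.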
  • the touch event is injected back into the Android system via adb using the sendevent command.
  • during operation of the method according to an embodiment of the invention, the touch surface device may provide feedback to the user.
  • the feedback includes audible feedback, for example, during sliding of the ear on the touch surface, the information content on the ear sliding path can be output to prompt the user.
  • the audible feedback can be provided, for example, in the form of sound effects, speech, or a combination of both.
  • the feedback includes haptic feedback, such as providing tactile feedback to alert the user when the ear is mishandling.
  • the haptic feedback can be provided in a vibrating manner.
  • the feedback can include a combination of audible and tactile feedback.
  • the hand can also input operational commands to the touch surface device as an auxiliary human organ that interacts with the touch surface device.
  • the touch surface device can recognize input from the hand.
  • the touch surface defines an ear interaction area and a hand interaction area such that when an ear or hand inputs an operation instruction in a non-self interaction area, the touch surface device prompts an error by feedback.
  • as an example of ear-finger cooperation: while a finger touches the screen and holds it down (like a shift key), ear operations become different interactions.
  • a capacitive sensor is used as an example of a sensor sensing the ear input, but this is illustrative and not limiting; other types of sensors may be used instead or in combination, e.g., a pressure sensor, a light sensor, or a camera. In the camera case, a conventional flat image or a depth map may be used.
  • contact of the ear with the screen surface is sensed, but this is illustrative and not limiting; instead, or in combination, when the ear is not touching the screen surface, the distance of the ear from the screen and the orientation of the ear relative to the screen may be recognized, and the different input signals obtained from different distances and orientations can be mapped to different operation commands.
  • a mobile phone is taken as the example of a touch surface device, but this is illustrative and not limiting; the touch surface device may also be, for example, a tablet computer system, a handheld computer system, a portable music player system, a portable video player system, and so on.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is an ear-based human-computer interaction method for interaction with a touch surface device, wherein an ear, serving as the human organ that interacts with the device, provides input for the device. The method comprises: identifying an input pointing and/or posture of the ear; and, in response to the identification result, the touch surface device executing a corresponding control operation. With the ear providing the input, access to the entire touch surface of the device can be achieved in a stable single-hand grip, supporting both eyes-free use by sighted people and use by blind people. Finger operations can thus be completely replaced, or merely supplemented.

Description

Human-Computer Interaction Technology Based on Interaction Between an Ear and a Touch Surface Device

Technical Field

The invention relates to a human-computer interaction method based on interaction between an ear and a touch surface device, and to a human-computer interaction system implementing the method.

Background Art

When one hand is occupied and the user interacts with a touch surface device, especially a mobile phone with a touch screen, in a one-handed grip, only the thumb can reach the screen, and its range of motion is limited to the lower-right corner of the screen (when the device is held in the right hand). To access an arbitrary position on the screen, the user has to shift the hand up and down or tilt the phone's screen, which cannot easily be done while maintaining a stable grip, especially on a large-screen phone of 5 inches or more.

For the blind, one-handed interaction with a touch surface device is even more difficult, because they hold the phone more tightly. Their reliance on speech feedback means they often need to lift the device to the ear to hear it clearly, which restricts the thumb's range of motion even further. As a special case, a blind user can slide the thumb left and right in the lower-right corner to access the full screen by sequential browsing, but this is harder and much less efficient.

These limitations reduce a person's ability to access the full screen with the fingers when one hand is occupied.

Summary of the Invention

In view of the above problems, the present invention proposes a human-computer interaction method, and a corresponding device, based on interaction between an ear and a touch surface device. The ear, as the human organ interacting with the device, provides its input, enabling access to the entire touch surface of the device in a stable one-handed grip; both eyes-free use by sighted users and use by blind users can be supported. Finger operation can thus be completely replaced, or merely supplemented.
According to a first aspect of the present invention, a human-computer interaction method based on interaction of an ear with a touch surface device is proposed, wherein the ear, as the human organ interacting with the device, provides its input, the method comprising:

- identifying an input pointing and/or posture of the ear, and

- in response to the result of the recognition, the touch surface device performing a corresponding control operation.

In the method, identifying the input pointing and/or posture of the ear comprises: obtaining a proximity image associated with the ear to identify one or more of, or a combination of, the input site, position, force, and/or action of the ear on the device.
According to one embodiment, the input site comprises one or more of, or a combination of, the upper auricle, the lower auricle, and the earlobe, or the whole ear.

According to one embodiment, the position comprises edge positions and non-edge positions of the touch surface.

According to one embodiment, the actions include touching, pressing, rotating, sliding, and pulling away.

Preferably, the method further comprises: detecting the density values and pattern of a region of interest serving as an ear candidate, wherein the region of interest is a region composed of pixels whose density values exceed a certain threshold.

Advantageously, the movement of the ear relative to the touch surface while in contact with it is identified and tracked, and a touch navigation operation command is implemented based on the recognition result.

According to one embodiment, the touch surface device senses the ear's input with a capacitive sensor, and the contact state of the ear with the screen is measured by the summed intensity of the capacitance values over the touch surface.

Advantageously, the ear coming fully into contact with the touch surface is defined as an ear contact event; during the contact process the summed value grows with fluctuations, and its first fall-back marks the ear contact frame. Lifting the ear off the touch surface is defined as an ear disengagement event; the summed value drops rapidly as the ear moves away from the touch surface, and the last frame above a set threshold is taken as the ear disengagement frame.

Advantageously, movement of the ear on the touch surface is defined as an ear movement event, and the proximity image is tracked to obtain the relative displacement between two consecutive frames.

Advantageously, the relative displacement between two consecutive frames is computed with the kernelized correlation filter (KCF) algorithm.

Advantageously, two KCF trackers work alternately: the first tracker is in the active working state and outputs the tracking result while the second runs in the background; when the first tracker has worked for a certain period of time, or when its tracking fails, the states of the two trackers are exchanged and the replaced tracker is re-initialized, and so on.

Advantageously, said period of time is 500 ms.

Advantageously, the contact coordinates of the ear are calculated.

Preferably, a certain position between the edge of the upper auricle and the ear hole is defined as the intent click point of the ear image.

Preferably, the vertex of the minimum rectangle enclosing the ear image that is nearest the upper-rear auricle is used as a reference, and a point at a fixed distance from the boundary is taken as the intent click point.

The inventors have noted that, when interacting, users tend to move the touch surface device toward the ear rather than the ear toward the device. Advantageously, therefore, the contact coordinates of the ear are mapped to the touch surface so as to minimize movement of the user's arm and thumb.
Advantageously, the operation includes the touch surface device providing feedback to the user.

Preferably, the feedback comprises auditory feedback, in particular provided as sound effects, speech, or a combination of both.

Preferably, the feedback comprises haptic feedback, in particular provided by vibration.

Additionally or alternatively, the hand, as an auxiliary human organ interacting with the touch surface device, inputs operation commands to it.

Advantageously, the touch surface device also recognizes input from the hand.

Alternatively, the touch surface defines an ear interaction area and a hand interaction area.

Preferably, when the ear or hand inputs an operation command outside its own interaction area, the touch surface device signals an error through feedback.

Advantageously, the method further comprises: identifying the orientation in which the ear appears on the screen so as to distinguish the left ear from the right ear, and performing different controls based on the result.
根据本发明的另一方面，提供了一种便于人类利用耳朵与之进行交互的触摸表面设备，耳朵作为与该触摸表面设备进行交互的人体器官，所述触摸表面设备包括：According to another aspect of the present invention, there is provided a touch surface device that facilitates human interaction with it via the ear, the ear serving as the human organ that interacts with the touch surface device, the touch surface device comprising:
-触摸表面元件,包括用于感测耳朵的操作的传感器部件,接收由耳朵输入的操作指令,a touch surface element comprising a sensor component for sensing the operation of the ear, receiving an operation command input by the ear,
-处理器;和- processor; and
-存储器,存储有计算机可执行指令,当所述指令被所述处理器执行时,执行如前所述的方法。a memory storing computer executable instructions that, when executed by the processor, perform the method as previously described.
优选地,所述触摸表面设备是移动电话,所述触摸表面是触摸屏。Preferably, the touch surface device is a mobile phone and the touch surface is a touch screen.
根据本发明的又一方面,提供了一种计算机存储介质,其上存储有计算机可执行指令,当所述指令被计算机执行时,执行如前所限定的方法。 According to still another aspect of the present invention, there is provided a computer storage medium having stored thereon computer executable instructions that, when executed by a computer, perform the method as defined above.
附图说明 BRIEF DESCRIPTION OF THE DRAWINGS
图1示出了根据本发明实施例的基于耳朵与触摸表面设备进行交互的人机交互方法1的示意流程图;1 shows a schematic flow chart of a human-computer interaction method 1 based on an ear interacting with a touch surface device according to an embodiment of the present invention;
图2示出了根据本发明实施例的触摸表面设备的结构配置的示意图;2 is a schematic diagram showing a structural configuration of a touch surface device according to an embodiment of the present invention;
图3示意性地示出了耳朵与屏幕进行一次交互时的耳朵接触事件的生命周期;Figure 3 is a schematic illustration of the life cycle of an ear contact event when the ear interacts with the screen;
图4示出了在跟踪移动的耳朵的图像时所采用的两个核相关滤波器KCF交替式工作的方法；Figure 4 illustrates the method of two kernelized correlation filter (KCF) trackers working in alternation, employed when tracking the image of a moving ear;
图5示出了计算耳朵的触点坐标的示意图;和Figure 5 shows a schematic diagram of calculating the contact coordinates of the ear; and
图6示出了耳朵触点映射到触摸表面的示意图。Figure 6 shows a schematic diagram of the mapping of the ear contacts to the touch surface.
具体实施方式 DETAILED DESCRIPTION
为了使本领域技术人员更好地理解本发明，下面结合附图和具体实施方式对本发明作进一步详细说明。In order to enable those skilled in the art to better understand the present invention, the present invention is further described in detail below in conjunction with the accompanying drawings and specific embodiments.
本文中的耳朵的输入指点，这里借鉴了输入领域的“指点”概念，指点一般指鼠标、指示器pointer等点击的点在屏幕坐标系下的坐标，本文指在耳朵作为输入器官的情况下，基于检测结果和预定算法确定的输入位置点。The "input pointing" of the ear herein borrows the "pointing" concept from the input field: pointing generally refers to the coordinates, in the screen coordinate system, of the point clicked by a mouse, a pointer, or the like; herein, with the ear as the input organ, it refers to the input position point determined based on the detection result and a predetermined algorithm.
本文中的“接近图像”是指，触摸表面设备能够在物体接近和/或接触触摸表面时检测到它们。在感测元件的二维阵列（即触摸表面）中的每个感测元件（即“像素”）产生输出信号，用以指示在传感器元件处的电场扰动（对于电容传感器）、力度（对于压力传感器）或光亲合（对于光电传感器）。全体像素值表示一个“接近图像（proximity image）”。如本文所述，本发明的各种实施例提供检测和辨别由识别的输入部位、位置、力道和/或动作类型产生的触摸表面信号（表示为接近图像）的能力。A "proximity image" herein means that the touch surface device can detect objects as they approach and/or contact the touch surface. Each sensing element (i.e., "pixel") in a two-dimensional array of sensing elements (i.e., the touch surface) produces an output signal indicating the electric field disturbance (for capacitive sensors), the force (for pressure sensors), or the light coupling (for photosensors) at that sensing element. The set of all pixel values represents a "proximity image". As described herein, various embodiments of the present invention provide the ability to detect and distinguish touch surface signals (represented as proximity images) produced by the identified input part, position, force, and/or motion type.
本文中的触摸表面元件可以为触摸面板或触摸屏,触摸表面元件包括能够感测输入的传感器,所述传感器例如电容传感器、压力传感器、光电传感器、图像传感器等等。The touch surface elements herein may be touch panels or touch screens, and the touch surface elements include sensors capable of sensing input, such as capacitive sensors, pressure sensors, photosensors, image sensors, and the like.
相比于手指，耳朵作为与触摸表面设备的交互提供输入的人体器官有其自身的特点：Compared with a finger, the ear, as a human organ providing input for interaction with a touch surface device, has its own characteristics:
1、形状复杂 1. Complex shape
手指在触摸表面上的形状相对简单，可以容易地简化为一个斑点。但是，耳朵本身具有更复杂的形状、更大的面积以及更凹凸不平的表面，在与触摸表面接触时，接触区域的形状因人而异，且即便对于同一用户，每次输入时也会存在巨大差异。因此，对耳朵输入的识别与跟踪难度加大。The shape of a finger on the touch surface is relatively simple and can easily be reduced to a single blob. The ear itself, however, has a more complex shape, a larger area, and a more uneven surface; when it contacts the touch surface, the shape of the contact region varies from person to person, and even for the same user there are large differences between inputs. Therefore, recognizing and tracking ear input is more difficult.
2、可提供丰富的输入 2. Can provide rich input
耳朵与触摸表面设备进行交互时可以进行的不同动作，使得输入更加丰富。例如，耳朵可以做出的动作包括但不限于：The different actions the ear can perform when interacting with the touch surface device make the input richer. For example, the actions the ear can make include, but are not limited to:
-触碰:耳朵的部分或全部接触触摸表面;- touch: part or all of the ear contacts the touch surface;
-按压:耳朵的部分或全部接触触摸表面并缓慢地增加接触的力度;- Pressing: Part or all of the ear touches the touch surface and slowly increases the strength of the contact;
-旋转:耳朵的部分或全部保持着接触触摸表面的状态旋转;- Rotation: a part or all of the ear maintains a state of contact with the touch surface;
-滑动:耳朵的部分或全部接触触摸表面并产生位移;- sliding: part or all of the ear contacts the touch surface and produces displacement;
-抽离：耳朵的全部紧密接触触摸表面，缓慢地减小接触的力度。- Pull-away: the whole ear is in close contact with the touch surface, and the contact force is slowly reduced.
并且，耳朵的面积较大，导致可以提供输入的部位有更多的选择。例如，上耳廓、下耳廓、耳垂中的一个或多个或其组合，都可以作为输入部位，或者整耳也可以作为输入部位。因此，耳朵的部分或全部执行同一种操作是两种交互，可以触发不同的操作指令。作为应用耳朵不同部位组合起来限定交互方式的一个操作示例，将耳朵上半部分先接触在屏幕上，以此为接触旋转点，可以将手机下端向头部贴合或抬离。耳朵的下半部分图像会有出现和消失的趋势，可以据此来进行识别这个操作，并基于此来定义操作指令。Moreover, the large area of the ear offers more choices for the input part. For example, one or more of the upper auricle, the lower auricle, and the earlobe, or a combination thereof, may serve as the input part, or the whole ear may serve as the input part. Therefore, the same action performed by part of the ear versus the whole ear constitutes two different interactions, which can trigger different operation instructions. As an example of combining different ear parts to define an interaction mode, the upper half of the ear is first placed on the screen as a contact pivot, and the lower end of the phone is then pressed toward or lifted away from the head; the image of the lower half of the ear correspondingly tends to appear or disappear, which can be used to recognize this action and to define operation instructions based on it.
此外，由于耳朵很软且具有复杂的形状，其在与触摸表面进行接触的过程中会发生变形。因此，不同的力道（轻压或者重压）也会为输入带来新的信息，这些不同均可以用来区分不同的交互。In addition, since the ear is soft and has a complex shape, it deforms while in contact with the touch surface. Therefore, different amounts of force (a light press or a heavy press) also bring new information to the input, and all of these differences can be used to distinguish different interactions.
另外,耳朵与触摸表面设备进行交互时所处的位置也可以为输入提供更多的选择。例如,耳朵可以在触摸表面的边缘位置接触,也可以在非边缘位置接触;当然,耳朵也可以并不与触摸表面接触,而是与之相距一定距离,和/或成一角度。In addition, the location in which the ear interacts with the touch surface device can also provide more options for input. For example, the ear may be in contact at the edge of the touch surface or in a non-edge position; of course, the ear may not be in contact with the touch surface, but at a distance from it, and/or at an angle.
而且,左耳和右耳呈现在屏幕上的朝向不一样:左耳和右耳的操作可以被赋予不同含义。Moreover, the orientation of the left and right ears presented on the screen is different: the operation of the left and right ears can be given different meanings.
耳朵的输入部位、力道、位置和/或动作结合动作的时间可以形成极为丰富的输入，做出不同的操作指令，包括但不限于：The ear's input part, force, position, and/or motion, combined with the timing of the action, can form extremely rich input and define different operation instructions, including but not limited to:
-单击:耳朵的部分或全部单次、短时接触触摸表面后离开;- Click: part or all of the ear, short-time contact with the touch surface and leave;
-双击/多次连续点击：耳朵的部分或全部在一定时间内连续两次、多次接触并离开触摸表面；- Double tap/multiple consecutive taps: part or all of the ear contacts and leaves the touch surface two or more times in succession within a certain period of time;
-长按:耳朵的部分或全部单次、以一定时间持续接触触摸表面后离开;- Long press: Part or all of the ear is left in a single time and continuously touches the touch surface for a certain period of time;
-滑动:耳朵的部分或全部接触触摸表面并产生位移;- sliding: part or all of the ear contacts the touch surface and produces displacement;
-旋转:耳朵的部分或全部保持着接触触摸表面的状态旋转;- Rotation: a part or all of the ear maintains a state of contact with the touch surface;
-贴合:耳朵的部分接触触摸表面,通过贴合的动作使得耳朵的全部都接触触摸表面;- fitting: a part of the ear contacts the touch surface, and all the ears are in contact with the touch surface by the action of the fitting;
-部分远离:耳朵的全部接触触摸表面,保持耳朵的部分接触触摸表面,其他部分远离触摸表面;- Partially away: all of the ear touches the touch surface, keeping part of the ear in contact with the touch surface, and other parts away from the touch surface;
-按压:耳朵的全部轻轻接触触摸表面,缓慢地加大接触的力度;- Pressing: All of the ear gently touches the touch surface, slowly increasing the strength of the contact;
-抽离：耳朵的全部紧密接触触摸表面，缓慢地减小接触的力度。- Pull-away: the whole ear is in close contact with the touch surface, and the contact force is slowly reduced.
作为上述耳朵动作的应用示例，一个最基础的操作是，将耳朵保持着贴在触摸表面上的状态移动，触摸表面设备会读出相应位置的信息内容。再例如，用户可以将耳朵在触摸表面上触碰两次做出选择的操作指令。又例如，当用户将耳朵贴在触摸表面上旋转做出对某一控制量的调整操作指令，比如，所述控制量可以是灯光控制的亮度、窗帘控制的开度、影音控制的音量、播放曲目、空调控制的设定温度、风量，等等。As an application example of the ear motions described above, one of the most basic operations is to move the ear while keeping it pressed against the touch surface, and the touch surface device reads out the information content at the corresponding position. As another example, the user can touch the ear to the touch surface twice to issue a selection instruction. As yet another example, the user can rotate the ear while it is pressed against the touch surface to issue an adjustment instruction for some control quantity; the control quantity may be, for example, the brightness of a light, the opening of a curtain, the volume or track of a media player, the set temperature or fan speed of an air conditioner, and so on.
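As a purely illustrative sketch of how such recognized ear gestures could be dispatched to control quantities: the gesture names, the 0.5-units-per-degree scaling, and the state fields below are hypothetical assumptions, not part of the disclosure.

```python
# Hypothetical dispatch of recognized ear gestures to control actions.
# Gesture names, scaling factors, and state fields are illustrative only.

def adjust_control(state, gesture, amount=0):
    """Apply a recognized ear gesture to a mutable control state."""
    if gesture == "double_tap":        # 双击: confirm/select
        state["selected"] = True
    elif gesture == "rotate":          # 旋转: adjust a continuous quantity,
        # e.g. volume, clamped to 0..100, at 0.5 units per degree of rotation
        state["volume"] = max(0.0, min(100.0, state["volume"] + 0.5 * amount))
    elif gesture == "slide":           # 滑动: move the read-out cursor
        state["cursor"] += amount
    return state

state = {"volume": 50.0, "cursor": 0, "selected": False}
adjust_control(state, "rotate", 30)    # rotate 30 degrees -> volume 65.0
adjust_control(state, "double_tap")    # select the current item
```

A real implementation would receive these gestures from the ear-event recognizer described below and forward the resulting commands to, for example, the media or lighting controller.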
接下来，描述根据本发明实施例的触摸表面设备，该触摸表面设备便于人类利用耳朵与之进行交互，其中，耳朵作为与该触摸表面设备进行交互的人体器官。如图2所示，该触摸表面设备200包括接收由耳朵输入的操作指令的触摸表面元件230，触摸表面元件230包括感测耳朵的输入的传感装置231；其还包括处理器210和存储有可执行根据本发明的人机交互方法的计算机可执行指令的存储器220。该触摸表面设备例如可以是手机、平板电脑、智能手表等。Next, a touch surface device according to an embodiment of the present invention is described, which facilitates human interaction with it via the ear, the ear serving as the human organ that interacts with the touch surface device. As shown in FIG. 2, the touch surface device 200 includes a touch surface element 230 that receives operation instructions input by the ear; the touch surface element 230 includes a sensing device 231 that senses the input of the ear. The device further includes a processor 210 and a memory 220 storing computer-executable instructions that can execute the human-computer interaction method according to the present invention. The touch surface device may be, for example, a mobile phone, a tablet computer, or a smart watch.
需要说明的是，下文中，常常以耳朵对于触摸表面的触摸作为输入，不过此为优选示例，在传感器支持的情况下，耳朵对于触摸表面的其它动作例如靠近也可以形成输入，例如在触摸表面上设置有光电传感器，当耳朵足够靠近时，能够被光电传感器检测到。It should be noted that, hereinafter, the ear's touch on the touch surface is usually taken as the input; however, this is a preferred example, and where the sensor supports it, other ear actions relative to the touch surface, such as approaching it, can also form input. For example, a photoelectric sensor provided on the touch surface can detect the ear when it is close enough.
如图1所示，在根据本发明实施例的基于耳朵与触摸表面设备进行交互的人机交互方法100中，执行以下步骤：步骤S110，识别耳朵的输入指点和/或姿势；和步骤S120，响应于所述识别的结果，所述触摸表面设备执行相应的控制操作。As shown in FIG. 1, in the human-computer interaction method 100 based on interaction between the ear and a touch surface device according to an embodiment of the present invention, the following steps are performed: step S110, identifying the input pointing and/or gesture of the ear; and step S120, in response to the result of the identification, the touch surface device performing a corresponding control operation.
在一个示例中，定义耳朵中心向上的点作为耳朵位置。在一个示例中，通过比较耳朵形状之间的匹配，来判别耳朵的位置移动和方位变化。In one example, a point upward of the ear center is defined as the ear position. In one example, the positional movement and orientation change of the ear are determined by comparing matches between ear shapes.
现通过一实施例进行具体解释,在该实施例中,所述触摸表面设备是手机,其例如可以是基于安卓(Android)系统的手机,因而该手机的触摸屏幕即触摸表面。该手机例如具有电容传感器,对从耳朵接收的输入进行感测。当然,本发明的触摸表面设备也可以利用其它传感器,例如压力传感器、光电传感器、图像传感器、声波传感器等等。A specific explanation is now made by an embodiment. In this embodiment, the touch surface device is a mobile phone, which may be, for example, an Android (Android) system based mobile phone, and thus the touch screen of the mobile phone is a touch surface. The handset, for example, has a capacitive sensor that senses the input received from the ear. Of course, the touch surface device of the present invention can also utilize other sensors such as pressure sensors, photo sensors, image sensors, sonic sensors, and the like.
当耳朵与手机进行交互时,识别耳朵的输入指点和/或姿势。特别地,获得与耳朵相关联的接近图像,以识别耳朵在所述设备上的输入部位、位置、力道和/或动作中的一个或多个或其组合。When the ear interacts with the phone, it recognizes the input finger and/or posture of the ear. In particular, a proximity image associated with the ear is obtained to identify one or more of the input locations, locations, forces, and/or motions of the ear on the device, or a combination thereof.
输入部位可以包括:上耳廓、下耳廓、耳垂中的一个或多个或其组合,或整耳。The input site can include one or more of the upper auricle, the lower auricle, the earlobe, or a combination thereof, or the entire ear.
位置包括:触摸表面的边缘位置和非边缘位置。其中,触摸表面的边缘位置是易于为耳朵感知的位置,因此在边缘位置设置一些为耳朵动作触发的事件是优选的。例如,可以用耳朵的耳廓部分沿着屏幕边缘上下滑动和左右滑动来实现类似于手指在屏幕上的上下滑动和左右滑动操作。The position includes: the edge position and the non-edge position of the touch surface. Among them, the edge position of the touch surface is a position that is easy to be perceived by the ear, so it is preferable to set some events triggered by the ear motion at the edge position. For example, the auricle portion of the ear can be slid up and down along the edge of the screen and slide left and right to achieve an up and down slide and a left and right slide operation similar to the finger on the screen.
在一个例子中，可以从安卓HAL层（硬件抽象层）获得屏幕的电容值矩阵，经过插值、去噪等处理得到适合分析的电容屏幕图像。In one example, the capacitance matrix of the screen can be obtained from the Android HAL layer (hardware abstraction layer) and processed by interpolation, denoising, and the like to obtain a capacitive screen image suitable for analysis.
然后对耳朵与屏幕接触产生的电容图像进行识别和跟踪。具体地，检测作为耳朵候选的感兴趣区域的密度值和图案，其中，所述感兴趣区域可以是指密度值超过一定阈值的点的像素构成的区域。The capacitive image produced by the contact between the ear and the screen is then recognized and tracked. Specifically, the density values and pattern of regions of interest serving as ear candidates are detected, where a region of interest may refer to a region composed of pixels whose density values exceed a certain threshold.
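A minimal sketch of this region-of-interest extraction, assuming the capacitance image is a plain 2D grid of values; the threshold and minimum-area numbers are illustrative assumptions, not values from the disclosure.

```python
from collections import deque

def find_ear_candidates(cap_image, threshold=30, min_area=8):
    """Extract regions of interest (ear candidates) from a capacitance image.

    A region of interest is a set of 4-connected pixels whose capacitance
    (density) value exceeds `threshold`; regions smaller than `min_area`
    pixels are discarded as noise. `cap_image` is a list of rows of values.
    """
    h, w = len(cap_image), len(cap_image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for r in range(h):
        for c in range(w):
            if cap_image[r][c] > threshold and not seen[r][c]:
                # Flood-fill one connected component of above-threshold pixels.
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and cap_image[ny][nx] > threshold:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_area:
                    regions.append(comp)
    return regions
```

In a real system this would run on the interpolated, denoised capacitance image, and each surviving region would then be classified as ear or non-ear by its density pattern.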
如前所述，基于耳朵与触摸表面交互时特有的属性，对应于现存的手指交互接触事件（touch event）的定义（向下（down）、向上（up）、移动（move）），在本实施例中可以对耳朵交互提出独有的耳朵接触事件（ear-touch event），即，耳朵接触（Ear-on）、耳朵脱开（Ear-off）、耳朵移动（Ear-move）的概念定义。As described above, based on the attributes unique to ear interaction with the touch surface, and corresponding to the existing definitions of finger touch events (down, up, move), this embodiment proposes ear-touch events unique to ear interaction, namely the concept definitions of Ear-on (ear contact), Ear-off (ear disengagement), and Ear-move (ear movement).
例如，在本实施例中可以利用“全屏电容值强度加和值”来识别感兴趣区域，衡量耳朵与屏幕的接触状态，来提取两个稳定的帧作为耳朵接触Ear-on和耳朵脱开Ear-off。图3示出了耳朵与屏幕进行一次交互时的耳朵接触事件生命周期，Y轴为“加和值”。For example, in this embodiment, the "full-screen summed capacitance intensity" can be used to identify the region of interest and to measure the contact state between the ear and the screen, so as to extract two stable frames as Ear-on and Ear-off. Figure 3 shows the life cycle of an ear contact event during one ear-screen interaction, with the Y axis being the "summed value".
耳朵接触Ear-on：耳朵“完全接触”屏幕。“加和值”在耳朵与屏幕接触的过程中会波动式增长，取其第一次回落，即，曲线中出现的第一次波谷，为Ear-on帧。Ear-on: the ear is in "full contact" with the screen. The "summed value" grows with fluctuations while the ear comes into contact with the screen; its first fall-back, i.e., the first trough in the curve, is taken as the Ear-on frame.
耳朵脱开Ear-off：耳朵从屏幕抬离时。“加和值”在耳朵与屏幕远离的过程中会迅速下降，取其高于设定阈值时的最后一帧，为Ear-off帧，如图3所示。Ear-off: the ear is lifted off the screen. The "summed value" drops rapidly as the ear moves away from the screen; the last frame in which it is still above the set threshold is taken as the Ear-off frame, as shown in Figure 3.
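The Ear-on and Ear-off frame-selection rules just described can be sketched over a sequence of per-frame summed capacitance values; the sequence in the test is synthetic, and the threshold is an illustrative assumption.

```python
def detect_ear_on(sums):
    """Index of the Ear-on frame: the first fall-back (first trough)
    in the fluctuating growth of the full-screen capacitance sum."""
    for i in range(1, len(sums)):
        if sums[i] < sums[i - 1]:
            return i
    return None  # no trough observed yet

def detect_ear_off(sums, threshold):
    """Index of the Ear-off frame: the last frame whose capacitance
    sum is still above `threshold` as the ear lifts away."""
    last = None
    for i, s in enumerate(sums):
        if s > threshold:
            last = i
    return last
```

In practice these would run incrementally on the live frame stream rather than on a completed list, but the selection criteria are the same.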
耳朵移动Ear-move：利用KCF核相关滤波器（Kernelized Correlation Filter）算法，对在屏幕上移动的耳朵图像进行跟踪，从而得到连续两帧图像之间的相对位移量。Ear-move: the kernelized correlation filter (KCF) algorithm is used to track the ear image moving on the screen, obtaining the relative displacement between two successive frames of images.
此外,针对耳朵易于变形的特性,为了获得更加稳定的跟踪效果,可以采用两个KCF跟踪器(tracker)交替式工作的方法,如图4所示:In addition, for the characteristics of easy deformation of the ear, in order to obtain a more stable tracking effect, two KCF trackers can be used to alternately work, as shown in FIG. 4:
-第一跟踪器处于活跃工作状态,输出跟踪结果时,第二跟踪器处于后台运行状态;- the first tracker is in an active working state, and when the tracking result is output, the second tracker is in a background running state;
-当第一跟踪器工作达到预定时间阈值例如500ms时或者第一跟踪器跟踪失败时,两个跟踪器的状态会交换,被替换掉的第一跟踪器则重新初始化。- When the first tracker reaches a predetermined time threshold, for example 500 ms, or the first tracker fails to track, the status of the two trackers is swapped and the replaced first tracker is reinitialized.
这里,时间500ms为优选示例,可以根据需要设置其它的时间阈值。Here, the time 500ms is a preferred example, and other time thresholds can be set as needed.
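The alternating two-tracker scheme above can be sketched as a small state machine; the concrete tracker (a KCF tracker in the embodiment, e.g. OpenCV's `TrackerKCF`) is abstracted behind a factory, and the re-initialization here simply constructs a fresh tracker, whereas a real system would re-initialize it on the current ear region.

```python
class DualTracker:
    """Two trackers alternate: one active, one running in the background.

    After `max_ms` of active duty (500 ms in the embodiment), or on a
    tracking failure, the roles swap and the replaced tracker is
    re-initialized. Trackers must expose update(frame) -> (ok, result).
    """

    def __init__(self, factory, max_ms=500):
        self.factory = factory
        self.max_ms = max_ms
        self.active, self.backup = factory(), factory()
        self.active_since = 0

    def update(self, frame, now_ms):
        ok, result = self.active.update(frame)
        self.backup.update(frame)                 # keep the backup warm
        if not ok or now_ms - self.active_since >= self.max_ms:
            # Swap roles; the retired tracker is replaced by a fresh one.
            self.active, self.backup = self.backup, self.factory()
            self.active_since = now_ms
            ok, result = self.active.update(frame)
        return ok, result
```

This mitigates drift from the ear's deformation: the freshly initialized tracker takes over before the active one's model degrades too far.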
然后,基于识别和跟踪的结果,计算得出耳朵的触点坐标,并将其映射到全屏。Then, based on the results of the recognition and tracking, the contact coordinates of the ear are calculated and mapped to the full screen.
在根据本发明的该实施例中，如图5所示，例如定义位于上耳廓边缘和耳洞之间的某个位置为耳朵图像的意图点击点。利用包围住耳朵图像的最小矩形靠近后上方耳廓的顶点为基准，距离边界一定值处为该点。第一点（耳朵接触ear-on）采用这种绝对位置坐标的定义，耳朵移动ear-move则由绝对坐标及相邻帧的相对位移累加得出。本发明也可以采用其他的位置作为意图点击点。In this embodiment of the present invention, as shown in FIG. 5, for example, a position between the edge of the upper auricle and the ear canal is defined as the intended click point of the ear image. The vertex of the minimal rectangle enclosing the ear image that is nearest the upper-rear auricle is used as a reference, and the point lies at a fixed offset from that boundary. The first point (Ear-on) uses this definition in absolute position coordinates, while during Ear-move the point is obtained by accumulating the relative displacements between adjacent frames onto the absolute coordinates. Other positions may also be used as the intended click point in the present invention.
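A hedged sketch of this click-point computation from the bounding rectangle; the pixel offsets and the assumption that, for a right ear on screen, the rear edge is the right edge of the rectangle are illustrative, not values from the disclosure.

```python
def intent_click_point(ear_pixels, dx=12, dy=8, right_ear=True):
    """Estimate the intended click point from an ear region.

    Computes the axis-aligned bounding rectangle of the ear pixels and
    offsets inward from the corner nearest the upper-rear auricle.
    The offsets dx/dy (pixels) and the right-edge choice for a right
    ear are hypothetical assumptions. `ear_pixels` is a list of (y, x).
    """
    ys = [y for y, _ in ear_pixels]
    xs = [x for _, x in ear_pixels]
    top, left, right = min(ys), min(xs), max(xs)
    x = right - dx if right_ear else left + dx
    return (x, top + dy)
```

The Ear-on frame would use this absolute point directly, with subsequent Ear-move positions obtained by adding the tracked inter-frame displacements to it.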
在该实施例中，耳朵仅在屏幕的一定区域内进行交互，从而限定了耳朵的交互区域，如图5所示，其中限定了如图所示的交互区域作为耳朵交互的舒适边界区域。然后将耳朵在这个区域内的移动，线性映射到全屏，如图6所示。由此，可以尽可能减小用户胳膊和拇指的移动范围。In this embodiment, the ear interacts only within a certain area of the screen, thereby defining the ear's interaction area, as shown in FIG. 5, in which the illustrated interaction area is defined as the comfortable boundary region for ear interaction. The movement of the ear within this area is then linearly mapped to the full screen, as shown in FIG. 6. In this way, the range of movement of the user's arm and thumb can be reduced as much as possible.
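The linear mapping from the comfortable interaction region to the full screen is a straightforward rescaling; the region and screen dimensions in the test are synthetic examples.

```python
def map_to_full_screen(pt, region, screen):
    """Linearly map an ear contact point from the comfortable interaction
    region (x0, y0, x1, y1) to full-screen coordinates (w, h)."""
    x0, y0, x1, y1 = region
    w, h = screen
    # Clamp into the region first so points at or beyond the border
    # map to the screen edge rather than off-screen.
    x = min(max(pt[0], x0), x1)
    y = min(max(pt[1], y0), y1)
    return ((x - x0) / (x1 - x0) * w, (y - y0) / (y1 - y0) * h)
```

With this mapping, a small ear movement inside the comfort region produces a proportionally larger cursor movement across the whole screen, minimizing arm and thumb travel.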
此后，例如将映射后的坐标轨迹进行平滑处理之后，模拟触摸事件的结构，通过adb利用sendevent命令，将touchevent注入回安卓系统。Thereafter, for example after smoothing the mapped coordinate trajectory, the structure of a touch event is simulated, and the touch event is injected back into the Android system through adb using the sendevent command.
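A heavily hedged sketch of this injection step: the device node `/dev/input/event1` and the exact event sequence a given phone expects are assumptions that vary per device, and real multi-touch injection usually also needs tracking-ID, pressure, and BTN_TOUCH events. Only the command construction is shown; the event codes are the standard Linux input codes.

```python
import subprocess

# Linux multi-touch input event codes (from <linux/input-event-codes.h>).
EV_SYN, EV_ABS = 0x00, 0x03
ABS_MT_POSITION_X, ABS_MT_POSITION_Y = 0x35, 0x36
SYN_REPORT = 0x00

def sendevent_cmds(x, y, device="/dev/input/event1"):
    """Build the adb sendevent commands injecting one touch position.
    The device node and minimal three-event sequence are assumptions."""
    return [
        ["adb", "shell", "sendevent", device,
         str(EV_ABS), str(ABS_MT_POSITION_X), str(x)],
        ["adb", "shell", "sendevent", device,
         str(EV_ABS), str(ABS_MT_POSITION_Y), str(y)],
        ["adb", "shell", "sendevent", device,
         str(EV_SYN), str(SYN_REPORT), "0"],
    ]

def inject_touch(x, y):
    """Send a mapped (smoothed) ear coordinate to the device via adb."""
    for cmd in sendevent_cmds(x, y):
        subprocess.run(cmd, check=True)
```

Each mapped ear coordinate along the smoothed trajectory would be injected this way, so the rest of the Android input stack treats it like an ordinary touch event.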
在根据本发明实施例的方法的操作期间，触摸表面设备可以向用户提供反馈。所述反馈包括听觉反馈，例如，在耳朵在触摸表面上滑动期间，可以输出耳朵滑动路径上的信息内容，以对用户进行提示。例如可以以音效、语音或二者组合的方式提供所述听觉反馈。替换地，所述反馈包括触觉反馈，例如当耳朵进行误操作时，提供触觉反馈警告用户。例如可以以振动的方式提供所述触觉反馈。替换地，所述反馈可以包括听觉和触觉反馈的组合。During operation of the method according to an embodiment of the present invention, the touch surface device may provide feedback to the user. The feedback includes auditory feedback; for example, while the ear slides on the touch surface, the information content along the ear's sliding path can be output to prompt the user. The auditory feedback can be provided, for example, as sound effects, speech, or a combination of both. Alternatively, the feedback includes haptic feedback; for example, when the ear performs an erroneous operation, haptic feedback is provided to warn the user. The haptic feedback can be provided, for example, by vibration. Alternatively, the feedback may include a combination of auditory and haptic feedback.
此外，手也可以作为与触摸表面设备进行交互的辅助人体器官向触摸表面设备输入操作指令。触摸表面设备可以识别来自于手的输入。替换地，所述触摸表面限定有耳朵交互区域和手交互区域，这样，当耳朵或手在非自身交互区域中输入操作指令时，触摸表面设备通过反馈提示错误。作为耳朵与手指合作的一个示例，手指点按住屏幕和未点按住屏幕时（就像shift键一样），耳朵进行操作，可以作为不同的交互方式。In addition, the hand can also input operation instructions to the touch surface device as an auxiliary human organ that interacts with it. The touch surface device can recognize input from the hand. Alternatively, the touch surface defines an ear interaction area and a hand interaction area, so that when the ear or the hand inputs an operation instruction in an area that is not its own, the touch surface device indicates the error through feedback. As an example of ear-finger cooperation, an ear operation performed while a finger is holding down the screen versus while it is not (much like a shift key) can serve as two different interactions.
前面的示例中，以电容作为感测耳朵输入的传感器示例，不过此作为示例而非限制，可以替代地或结合地使用其它类型的传感器，例如：压力传感器、光传感器、摄像头。作为摄像头示例，可以获得传统的平面图像，也可以利用深度图。In the preceding examples, a capacitive sensor was used as the example sensor for sensing ear input; however, this is an example rather than a limitation, and other types of sensors may be used instead or in combination, for example pressure sensors, light sensors, or cameras. In the camera case, a conventional planar image may be obtained, or a depth map may be used.
在前面的示例中，感测的是耳朵对屏幕表面的接触，不过此为示例而非限制，替代地或结合地，还可以在耳朵没有接触到屏幕表面的时候，识别耳朵距屏幕的距离、耳朵相对屏幕的方位，并基于距离的不同、方位的不同，而获得不同的输入信号，可以将这样不同的输入信号对应于不同的操作指令。In the preceding examples, the ear's contact with the screen surface is sensed; however, this is an example rather than a limitation. Alternatively or in combination, when the ear is not in contact with the screen surface, the distance of the ear from the screen and the orientation of the ear relative to the screen can also be recognized, and different input signals can be obtained based on different distances and orientations; such different input signals can be mapped to different operation instructions.
前文中以移动电话（手机）作为触摸表面设备的例子，不过此为示例而非限制，触摸表面设备还可以为例如平板计算机系统、手持计算机系统、便携音乐播放器系统、便携视频播放器系统等等。In the foregoing, a mobile phone (handset) was taken as the example touch surface device; however, this is an example rather than a limitation, and the touch surface device may also be, for example, a tablet computer system, a handheld computer system, a portable music player system, a portable video player system, and so on.
以上已经描述了本发明的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。因此,本发明的保护范围应该以权利要求的保护范围为准。 The embodiments of the present invention have been described above, and the foregoing description is illustrative, not limiting, and not limited to the disclosed embodiments. Numerous modifications and changes will be apparent to those skilled in the art without departing from the scope of the invention. Therefore, the scope of protection of the present invention should be determined by the scope of the claims.

Claims (30)

  1. 一种基于耳朵与触摸表面设备进行交互的人机交互方法，其中，耳朵作为与该设备进行交互的人体器官为其提供输入，所述方法包括：A human-computer interaction method based on interaction between the ear and a touch surface device, wherein the ear, as a human organ interacting with the device, provides input to it, the method comprising:
    -识别耳朵的输入指点和/或姿势,和- identifying the input finger and/or posture of the ear, and
    -响应于所述识别的结果,所述触摸表面设备执行相应的控制操作。- in response to the result of the recognition, the touch surface device performs a corresponding control operation.
  2. 如权利要求1所述的人机交互方法，其中，所述识别耳朵的输入指点和/或姿势包括：获得与耳朵相关联的接近图像，以识别耳朵在所述设备上的输入部位、位置、力道和/或动作中的一个或多个或其组合。The human-computer interaction method according to claim 1, wherein identifying the input pointing and/or gesture of the ear comprises: obtaining a proximity image associated with the ear to identify one or more, or a combination, of the input part, position, force, and/or motion of the ear on the device.
  3. 如权利要求2所述的人机交互方法,其中,所述输入部位包括:上耳廓、下耳廓、耳垂中的一个或多个或其组合,或整耳。The human-computer interaction method according to claim 2, wherein the input portion comprises one or more of an upper auricle, a lower auricle, and an earlobe or a combination thereof, or an entire ear.
  4. 如权利要求2所述的人机交互方法,其中,所述位置包括:触摸表面的边缘位置和非边缘位置。The human-computer interaction method according to claim 2, wherein the position comprises: an edge position and a non-edge position of the touch surface.
  5. 如权利要求2所述的人机交互方法,其中,所述动作包括:触碰、按压、旋转、滑动、抽离。The human-computer interaction method according to claim 2, wherein the action comprises: touching, pressing, rotating, sliding, and detaching.
  6. 如权利要求2所述的人机交互方法,还包括:检测作为耳朵候选的感兴趣区域的密度值和图案,其中,所述感兴趣区域是指密度值超过一定阈值的点的像素构成的区域。The human-computer interaction method according to claim 2, further comprising: detecting a density value and a pattern of the region of interest as an ear candidate, wherein the region of interest refers to a region composed of pixels of a point whose density value exceeds a certain threshold .
  7. 如权利要求1至6中的任一项所述的人机交互方法,其中,识别并跟踪耳朵与触摸表面接触时相对于其的移动,并根据识别结果实现触摸浏览操作指令。The human-computer interaction method according to any one of claims 1 to 6, wherein the movement of the ear with respect to the touch surface when it is in contact with the touch surface is recognized and tracked, and a touch browsing operation instruction is implemented according to the recognition result.
  8. 如权利要求7所述的人机交互方法，其中，所述触摸表面设备利用电容传感器对耳朵的输入进行感测，借助触摸表面电容值强度加和值来衡量耳朵与屏幕的接触状态。The human-computer interaction method according to claim 7, wherein the touch surface device senses the input of the ear using a capacitive sensor, and measures the contact state between the ear and the screen by means of the summed intensity of the capacitance values over the touch surface.
  9. 如权利要求8所述的人机交互方法，其中，将耳朵完全接触触摸表面定义为耳朵接触事件，所述加和值在耳朵与该触摸表面接触过程中的波动式增长的第一次回落为耳朵接触帧；并且其中，将耳朵从该触摸表面抬离时定义为耳朵脱开事件，所述加和值在耳朵从触摸表面远离的过程中迅速下降，取其高于设定阈值时的最后一帧，为耳朵脱开帧。The human-computer interaction method according to claim 8, wherein full contact of the ear with the touch surface is defined as an ear contact event, and the first fall-back in the fluctuating growth of the summed value while the ear contacts the touch surface is taken as the ear contact frame; and wherein lifting the ear off the touch surface is defined as an ear disengagement event, the summed value dropping rapidly as the ear moves away from the touch surface, and the last frame in which it is still above the set threshold being taken as the ear disengagement frame.
  10. 如权利要求9所述的人机交互方法，其中，将耳朵在触摸表面上的移动定义为耳朵移动事件，对接近图像进行跟踪，从而得到连续两帧图像之间的相对位移量。The human-computer interaction method according to claim 9, wherein the movement of the ear on the touch surface is defined as an ear movement event, and the proximity image is tracked to obtain the relative displacement between two successive frames of images.
  12. 如权利要求11所述的人机交互方法,其中,采用两个KCF跟踪器交替式工作的方法:第一跟踪器处于活跃工作状态,输出跟踪结果时,第二跟踪器处于后台运行状态;当第一跟踪器工作满一定时间时或者第一跟踪器跟踪失败时,两个跟踪器的状态交换,被替换掉的第一跟踪器则重新初始化,如此往复。The human-computer interaction method according to claim 11, wherein two KCF trackers are alternately operated: the first tracker is in an active working state, and when the tracking result is output, the second tracker is in a background running state; When the first tracker works for a certain period of time or when the first tracker fails to track, the state of the two trackers is exchanged, and the replaced first tracker is re-initialized, and so on.
  13. 如权利要求12所述的人机交互方法,其中,所述时间为500ms。The human-computer interaction method according to claim 12, wherein said time is 500 ms.
  14. 如权利要求7所述的人机交互方法,其中,计算得出耳朵的触点坐标。The human-computer interaction method according to claim 7, wherein the contact coordinates of the ear are calculated.
  15. 如权利要求14所述的人机交互方法，其中，定义位于上耳廓边缘和耳洞之间的某个位置为耳朵图像的意图点击点。The human-computer interaction method according to claim 14, wherein a position between the edge of the upper auricle and the ear canal is defined as the intended click point of the ear image.
  16. 如权利要求15所述的人机交互方法，其中，利用包围住耳朵图像的最小矩形靠近后上方耳廓的顶点为基准，距离边界一定值处为所述意图点击点。The human-computer interaction method according to claim 15, wherein the vertex of the minimal rectangle enclosing the ear image that is nearest the upper-rear auricle is used as a reference, and the intended click point lies at a fixed offset from that boundary.
  17. 如权利要求16所述的人机交互方法,其中,将耳朵的触点坐标映射到触摸表面。The human-computer interaction method according to claim 16, wherein the contact coordinates of the ear are mapped to the touch surface.
  18. 如权利要求1至6中的任一项所述的人机交互方法,其中,所述操作包括触摸表面设备向用户提供反馈。The human-computer interaction method according to any one of claims 1 to 6, wherein the operation comprises the touch surface device providing feedback to the user.
  19. 如权利要求18所述的人机交互方法,其中,所述反馈包括听觉反馈。The human-computer interaction method of claim 18, wherein the feedback comprises audible feedback.
  20. 如权利要求19所述的人机交互方法,其中,以音效、语音或二者组合的方式提供所述听觉反馈。The human-computer interaction method according to claim 19, wherein the auditory feedback is provided in a sound effect, a voice, or a combination of both.
  21. 如权利要求18所述的人机交互方法,其中,所述反馈包括触觉反馈。The human-computer interaction method of claim 18, wherein the feedback comprises haptic feedback.
  22. 如权利要求21所述的人机交互方法,其中,以振动的方式提供所述触觉反馈。The human-computer interaction method according to claim 21, wherein said tactile feedback is provided in a vibration manner.
  23. 如权利要求1至6中的任一项所述的人机交互方法，其中，手作为与该触摸表面设备进行交互的辅助人体器官向触摸表面设备输入操作指令。The human-computer interaction method according to any one of claims 1 to 6, wherein the hand, as an auxiliary human organ interacting with the touch surface device, inputs operation instructions to the touch surface device.
  24. 如权利要求23所述的人机交互方法，其中，触摸表面设备还识别来自于手的输入。The human-computer interaction method according to claim 23, wherein the touch surface device further recognizes input from the hand.
  25. 如权利要求23所述的人机交互方法,其中,所述触摸表面限定有耳朵交互区域和手交互区域。The human-computer interaction method according to claim 23, wherein the touch surface defines an ear interaction area and a hand interaction area.
  26. 如权利要求24所述的人机交互方法,其中,当耳朵或手在非自身交互区域中输入操作指令时,触摸表面设备通过反馈提示错误。The human-computer interaction method according to claim 24, wherein the touch surface device prompts an error by feedback when an ear or a hand inputs an operation instruction in the non-self-interaction area.
  27. 如权利要求1到6任一项的人机交互方法,还包括:识别耳朵呈现在屏幕上的朝向,从而区分左耳和右耳,并基于区分的结果,进行不同的控制。The human-computer interaction method according to any one of claims 1 to 6, further comprising: recognizing an orientation of the ear on the screen to distinguish the left ear and the right ear, and performing different control based on the result of the discrimination.
  28. 一种便于人类利用耳朵与之进行交互的触摸表面设备,耳朵作为与该触摸表面设备进行交互的人体器官,所述触摸表面设备包括:A touch surface device that facilitates human interaction with an ear, the ear being a human body that interacts with the touch surface device, the touch surface device comprising:
    -触摸表面元件,包括用于感测耳朵的操作的传感器部件,接收由耳朵输入的操作指令,a touch surface element comprising a sensor component for sensing the operation of the ear, receiving an operation command input by the ear,
    -处理器;以及- processor; and
    -存储器,存储有计算机可执行指令,当所述指令被所述处理器执行时,执行如权利要求1至27中的任一项所述的方法。a memory storing computer executable instructions which, when executed by the processor, perform the method of any one of claims 1 to 27.
  29. 如权利要求28所述的触摸表面设备,所述触摸表面设备是移动电话,所述触摸表面是触摸屏。The touch surface device of claim 28, the touch surface device is a mobile phone, and the touch surface is a touch screen.
  30. 一种计算机存储介质,其上存储有计算机可执行指令,当所述指令被计算机执行时,执行如权利要求1至27中的任一项所述的方法。 A computer storage medium having stored thereon computer executable instructions for performing the method of any one of claims 1 to 27 when the instructions are executed by a computer.
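Claims 15 to 17 describe deriving an "intended click point" from the bounding rectangle of the ear's contact region and mapping it to the touch surface. The following is a minimal sketch of that idea in Python; the function names, the `OFFSET` value, and the assumption that the relevant vertex is the top-right corner for a right ear are all illustrative and do not appear in the patent.

```python
OFFSET = 20  # assumed fixed distance (in pixels) from the bounding-box vertex (claim 16)

def intended_click_point(contact_points, right_ear=True):
    """contact_points: (x, y) touch samples of the ear image on the screen."""
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    # Minimum rectangle enclosing the ear image (claim 16).
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    # Vertex of the rectangle nearest the upper rear auricle; for a right ear
    # pressed against the screen this is assumed to be the top-right corner.
    vx = max_x if right_ear else min_x
    vy = min_y  # screen y grows downward, so "upper" means min_y
    # The intended click point lies a fixed distance inside that vertex,
    # between the upper auricle edge and the ear canal opening (claim 15).
    cx = vx - OFFSET if right_ear else vx + OFFSET
    cy = vy + OFFSET
    return (cx, cy)

def map_to_surface(point, width, height):
    # Claim 17: map the ear's contact coordinate onto the touch surface,
    # here simply by clamping it into the screen bounds.
    x, y = point
    return (min(max(x, 0), width - 1), min(max(y, 0), height - 1))
```

The `right_ear` flag also hints at how claim 27 could be used: once the ear's on-screen orientation distinguishes left from right, the reference vertex flips side accordingly.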
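Claims 25 and 26 divide the touch surface into an ear interaction area and a hand interaction area, and raise error feedback when input lands in the wrong area. A minimal sketch, with the region coordinates and return values chosen purely for illustration:

```python
# Assumed screen regions (left, top, right, bottom) for a 1080x2160 display.
EAR_AREA = (0, 0, 1080, 1200)
HAND_AREA = (0, 1200, 1080, 2160)

def inside(area, x, y):
    left, top, right, bottom = area
    return left <= x < right and top <= y < bottom

def route_input(source, x, y):
    """source: 'ear' or 'hand'. Returns 'accept' or 'error' (claim 26)."""
    own = EAR_AREA if source == "ear" else HAND_AREA
    if inside(own, x, y):
        return "accept"
    # Input landed outside the organ's own area: the device would signal the
    # error through feedback, e.g. a sound effect or vibration (claims 19-22).
    return "error"
```

For example, an ear press in the lower (hand) region yields `"error"`, while a hand press there is accepted.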
PCT/CN2017/116042 2017-12-14 2017-12-14 Ear-based human-computer interaction technology for interaction with touch surface device WO2019113868A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/116042 WO2019113868A1 (en) 2017-12-14 2017-12-14 Ear-based human-computer interaction technology for interaction with touch surface device

Publications (1)

Publication Number Publication Date
WO2019113868A1 true WO2019113868A1 (en) 2019-06-20

Family

ID=66818743


Country Status (1)

Country Link
WO (1) WO2019113868A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104683563A (en) * 2013-11-28 2015-06-03 京瓷办公信息系统株式会社 Electronic device and operation accepting method
CN104704801A (en) * 2012-10-09 2015-06-10 高通Mems科技公司 Ear position and gesture detection with mobile device
US20150161459A1 (en) * 2013-12-11 2015-06-11 Descartes Biometrics, Inc. Ear-scan-based biometric authentication
CN105120089A (en) * 2015-08-17 2015-12-02 惠州Tcl移动通信有限公司 A method for mobile terminal automatic telephone answering and a mobile terminal


Similar Documents

Publication Publication Date Title
CN104956292B (en) The interaction of multiple perception sensing inputs
KR101844366B1 (en) Apparatus and method for recognizing touch gesture
JP5103380B2 (en) Large touch system and method of interacting with the system
WO2018076523A1 (en) Gesture recognition method and apparatus, and in-vehicle system
EP2972727B1 (en) Non-occluded display for hover interactions
CN105549783B (en) Multi-touch input discrimination
US9298261B2 (en) Method for actuating a tactile interface layer
TWI514248B (en) Method for preventing from accidentally triggering edge swipe gesture and gesture triggering
US20190384450A1 (en) Touch gesture detection on a surface with movable artifacts
US9619042B2 (en) Systems and methods for remapping three-dimensional gestures onto a finite-size two-dimensional surface
US10366281B2 (en) Gesture identification with natural images
WO2019091124A1 (en) Terminal user interface display method and terminal
TW201741814A (en) Interface control method and mobile terminal
CN103809787B (en) Be adapted for contact with controlling and suspend the touch-control system and its operating method of control
US20220019288A1 (en) Information processing apparatus, information processing method, and program
US10222866B2 (en) Information processing method and electronic device
WO2019113868A1 (en) Ear-based human-computer interaction technology for interaction with touch surface device
CN104951211B (en) A kind of information processing method and electronic equipment
WO2016206438A1 (en) Touch screen control method and device and mobile terminal
TW202343205A (en) Method for the non-contact triggering of buttons
TW201504925A (en) Method for operating user interface and electronic device
Onodera et al. Vision-Based User Interface for Mouse and Multi-mouse System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17934827

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17934827

Country of ref document: EP

Kind code of ref document: A1