WO2016201760A1 - Method and system for recognizing gestures in a touch display device - Google Patents
Method and system for recognizing gestures in a touch display device (Download PDF / Info)
- Publication number
- WO2016201760A1 (PCT/CN2015/084458, receiving-office number CN2015084458W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- trajectory
- gesture
- touch input
- touch
- input
- Prior art date
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0237—Character input methods using prediction or retrieval techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
Definitions
- The present invention relates to the field of human-computer interaction in computing, and in particular to a method and system for recognizing gestures in a touch display device.
- Touch display devices are now used in many settings and fields as input devices that replace or supplement ordinary keyboards and mice. In particular, the emergence of a new generation of touch display devices supporting multi-touch interaction has brought new changes to their applications, and touch display devices can offer users richer and more natural ways of interacting.
- With an existing touch display device of this kind, particularly a large-size one (for example, 55 to 110 inches), the system can recognize the matching gesture input from the acquired touch input trajectory, and perform the corresponding function, only after the user has completed the touch action.
- Because such devices are large, the user must move across a wide area of the screen while performing the touch action, that is, move a finger or a touch input device (for example, a stylus) over a substantial distance to achieve the desired function. This increases the user's fatigue during use and gives the user a poor experience.
- One of the technical problems to be solved by the present invention is to provide a method for recognizing gestures in a touch display device that reduces the fatigue a user feels when using a large-size touch display device, thereby providing a good user experience.
- To that end, an embodiment of the present application first provides a method for recognizing a gesture in a touch display device, the method comprising: receiving, in real time, the trajectory of a touch input on the touch display device; before the touch input is completed, recognizing and displaying all gesture inputs that match the trajectory of the currently received touch input; and receiving the gesture input determined from all the gesture inputs and performing the function corresponding to that gesture input.
- The step of recognizing all gesture inputs that match the trajectory of the currently received touch input comprises: determining the features of the trajectory of the touch input; and selecting, from a plurality of pre-stored gesture inputs, all gesture inputs having those features.
- The features of the trajectory of the touch input include the number of touch points, the type of the trajectory, and the quadrant in which the trajectory is located.
- A system for recognizing a gesture in a touch display device comprises: a receiving unit that receives, in real time, the trajectory of a touch input on the touch display device; a recognition unit that, before the touch input is completed, recognizes all gesture inputs that match the trajectory of the currently received touch input; and an execution unit that receives the gesture input determined from all the gesture inputs displayed on the touch display device and performs the function corresponding to that gesture input.
- The recognition unit is further configured to determine the features of the trajectory of the touch input, and to select, from a plurality of pre-stored gesture inputs, all gesture inputs having those features.
- The features of the trajectory of the touch input include the number of touch points, the type of the trajectory, and the quadrant in which the trajectory is located.
- The system may further comprise a trajectory calculation unit that calculates the real-time distance between two contact points of the touch input; when the real-time distance between the two contact points calculated by the trajectory calculation unit reaches a first set distance, the recognition unit recognizes all gesture inputs that match the trajectory of the currently received touch input.
- The trajectory calculation unit may further calculate the trajectory change distance of at least one contact point of the touch input; when the trajectory change distance of the at least one contact point calculated by the trajectory calculation unit reaches a second set distance, the recognition unit recognizes all gesture inputs that match the trajectory of the currently received touch input.
- The system may further comprise a timing unit that starts timing when the touch input begins; when the time counted by the timing unit reaches a set time, the recognition unit recognizes all gesture inputs that match the trajectory of the currently received touch input.
- Compared with the prior art, one or more of the above aspects may have the following advantages or benefits.
- The system receives the trajectory of the touch input on the touch display device in real time and, before the touch input is completed, recognizes and displays all gesture inputs that match the currently received touch input trajectory. Finally, the system receives the gesture input determined from all the gesture inputs and performs the function corresponding to that gesture input.
- The system can thus predict, before the user completes the touch input, which gesture input the user may intend, and control the display unit to display all possible similar gesture inputs as indication (or navigation) messages. When using a large-size touch display device, the user therefore does not need to perform sweeping touch operations over a large area: the system can recognize similar gesture inputs in advance, reducing the burden on the user and providing a better user experience.
- FIG. 1 is a schematic structural diagram of a touch display device according to an embodiment of the present application.
- FIG. 2 is a schematic structural diagram of a gesture recognition system in a touch display device according to an embodiment of the present application.
- FIG. 3 is a schematic flowchart of a method for recognizing a gesture in a touch display device according to an embodiment of the present application.
- FIG. 4(a) and FIG. 4(b) are example diagrams showing different stages of gesture input on a touch display device.
- FIG. 1 is a schematic structural diagram of a touch display device according to an embodiment of the present application. It should be noted that the touch display device of the present application can be applied to a television, a personal computer, a mobile phone, or the like.
- The touch display device includes a gesture recognition system 10, a touch detection section 20, an I/O interface 30, and a display unit 40.
- The touch detection section 20 is disposed on top of the display unit 40; it detects the user's touch input and transmits the detected touch input trajectory to the gesture recognition system 10 through the I/O interface 30.
- The system 10 receives, in real time, the trajectory of the touch input currently detected by the touch detection section 20 and recognizes all gesture inputs matching that trajectory before the touch input is completed. The system 10 then displays all the obtained gesture inputs on the display unit 40 through the I/O interface 30, and finally receives the gesture input the user selects from all the gesture inputs displayed on the display unit 40 and performs the function corresponding to that gesture input.
- The system 10 mainly includes a receiving unit 101, a recognition unit 102, and an execution unit 103.
- The receiving unit 101 receives, in real time through the I/O interface 30, the trajectory of the touch input on the touch display device.
- The recognition unit 102 recognizes, before the touch input is completed, all gesture inputs that match the currently received touch input trajectory.
- The execution unit 103 receives the gesture input determined from all the gesture inputs displayed on the touch display device (that is, on the display unit 40) and performs the function corresponding to that gesture input.
- The system 10 further includes a storage unit 104 that stores information on a plurality of gesture inputs, mainly multi-point gesture inputs.
- This information is stored as a list pairing each specific touch gesture with the corresponding input signal that gesture generates (such as function signals for clicking, dragging, zooming in or out, and rotating).
- The storage unit 104 may be the memory of a television or mobile phone, or the hard disk of a computer.
- Specific touch gestures may include a single-finger tap, a double tap, pressing and dragging with a finger, or, as shown in FIG. 4, stretching two fingers in the vertical or 45° direction.
- The system 10 also provides a trigger module 105, which triggers the recognition unit 102 to start operating at a certain moment (a moment before the user completes the touch input).
- The trigger module 105 is coupled to the recognition unit 102 and includes a trajectory calculation unit 1051 and/or a timing unit 1052.
- The trajectory calculation unit 1051 can calculate the real-time distance between two contact points of the touch input, and can also calculate the trajectory change distance of each contact point; the trajectory change distance is the distance a contact point has moved from its initial position.
- The timing unit 1052 starts timing when the touch input begins.
- When any of these conditions is met, the recognition unit 102 is triggered to recognize all gesture inputs that match the trajectory of the currently received touch input; the recognition unit 102 can thus begin recognizing gesture inputs before the touch input is completed.
- When performing gesture recognition, the recognition unit 102 first determines the features of the trajectory of the touch input, and then selects, from the gesture inputs pre-stored in the storage unit 104, all gesture inputs that have those features.
- The features of the trajectory of the touch input are the number of touch points, the type of the trajectory, and the quadrant in which the trajectory is located.
- The present invention is not limited to these features, however, and those skilled in the art may select other features as needed.
- FIG. 3 is a schematic flowchart of a method for recognizing a gesture in a touch display device according to an embodiment of the present application.
- The workflow of the system 10 for gesture recognition is described below with reference to FIGS. 2 and 3.
- The touch detection section 20 detects the user's touch input in real time and transmits the detected touch input trajectory to the receiving unit 101 through the I/O interface 30; the receiving unit 101 receives, in real time, the trajectory of the touch input on the touch display device (step S310).
- The user performs a two-finger stretch input on the large-size display device, moving the two fingers in opposite directions, to enlarge the current operation window.
- The touch detection section 20 samples the finger positions at a certain frequency (generally 60 detections per second); by connecting the positions of the fingers at successive time points, it obtains the trajectory of the user's touch input. It then sends the trajectory shown in FIG. 4(a) to the receiving unit 101 through the I/O interface 30.
- Before the user completes the touch input, the recognition unit 102 recognizes all gesture inputs that match the trajectory of the currently received touch input, and displays all the recognized gesture inputs on the display unit 40 through the receiving unit 101 and the I/O interface 30 (step S320).
- The trajectory calculation unit 1051 detects the real-time distance between the two contact points.
- When that distance reaches the first set distance, the recognition unit 102 begins recognizing gesture inputs based on the currently received touch input trajectory. This avoids, as far as possible, the large movements the user would otherwise need to perform a zoom operation on the large-size display device, reducing the user's fatigue.
- The recognition unit 102 compares the currently received trajectory of the touch input against the various multi-point gesture input information pre-stored in the storage unit 104.
- FIG. 4 is taken as an example to illustrate how the recognition unit 102 filters similar gesture inputs out of the storage unit 104.
- First, the recognition unit 102 determines the number of points in the touch input trajectory; the touch action in FIG. 4(a) is a two-point input, so the recognition unit 102 selects all two-point gesture inputs from the storage unit 104.
- Next, the recognition unit 102 determines the type of the trajectory: in FIG. 4(a), both fingers stretch away from their initial positions, that is, both move toward the edges of the display device, so the recognition unit 102 can further select, from the two-point gesture inputs, all gesture inputs in which two points stretch simultaneously.
- Finally, the recognition unit 102 determines the quadrants of the input trajectory. Taking the midpoint between the fingers' initial positions as the origin, one contact point moves in the first quadrant and the other in the third quadrant, and both move away from the origin. The storage unit 104 stores two gesture inputs similar to this action (see FIG. 4(b)): one is a stretch in the vertical direction (its input signal maximizes the operation window vertically), and the other is a stretch in the 45° direction (its input signal maximizes the operation window in the 45° direction). The recognition unit 102 therefore finally screens out two gesture inputs similar to the touch input trajectory of FIG. 4(a).
- The display unit 40 displays all the recognized gesture inputs as guidance information indicating possible similar gestures corresponding to the current touch input; as shown in FIG. 4(b), the display unit 40 displays the vertically stretched gesture I1 and the 45°-stretched gesture I2.
- The user selects the desired gesture input from the gesture inputs displayed on the display unit 40 of the touch display device and taps it.
- When the touch detection section 20 detects the tap event, it transmits this information to the receiving unit 101 through the I/O interface 30; the receiving unit 101 receives the gesture input thus determined from all the gesture inputs, and the execution unit 103 connected to the receiving unit 101 performs the function corresponding to that gesture input (step S330).
- After the user selects the vertically stretched gesture I1, the execution unit 103 maximizes the operation window in the vertical direction; after the user selects the 45°-stretched gesture I2, the execution unit 103 maximizes the operation window in the 45° direction.
- In addition, when the trajectory change distance of at least one contact point calculated by the trajectory calculation unit 1051 reaches the second set distance, the recognition unit 102 is likewise triggered to recognize all gesture inputs that match the trajectory of the currently received touch input.
- For example, when the user performs a touch input that compresses the distance between two points, so as to shrink the current operation window, the recognition unit 102 is triggered before the user completes the touch input once the trajectory change distance of at least one contact point reaches the second set distance.
- Alternatively, when the time counted by the timing unit 1052 from the start of the touch input reaches the set time, the recognition unit 102 is also triggered to recognize all gesture inputs that match the trajectory of the currently received touch input.
- The gesture recognition system of this embodiment receives the trajectory of the touch input on the touch display device in real time and, before the touch input is completed, recognizes and displays all gesture inputs that match the currently received touch input trajectory. Finally, the system receives the gesture input determined from all the gesture inputs and performs the function corresponding to that gesture input.
- The system can thus predict, before the user completes the touch input, which gesture input the user may intend, and control the display unit to display all possible similar gesture inputs as indication (or navigation) messages. When using a large-size touch display device, the user therefore does not need to perform sweeping touch operations over a large area: the system can recognize similar gesture inputs in advance, reducing the burden on the user and providing a better user experience.
Abstract
A method and system for recognizing gestures in a touch display device. With this method, the gesture input that the user may intend can be predicted before the user completes the touch input, and the display unit is controlled to display all possible similar gesture inputs as indication (or navigation) messages. Thus, when using a large-size touch display device, similar gesture inputs can be recognized in advance without the user performing sweeping touch operations over a large area, which reduces the user's burden and provides a better user experience.
Description
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese patent application CN201510337013.2, entitled "Method and system for recognizing gestures in a touch display device" and filed on June 17, 2015, the entire contents of which are incorporated herein by reference.
The present invention relates to the field of human-computer interaction in computing, and in particular to a method and system for recognizing gestures in a touch display device.
At present, touch display devices are used in many settings and fields as input devices that replace or supplement ordinary keyboards and mice. In particular, the emergence of a new generation of touch display devices supporting multi-touch interaction has brought new changes to their applications, and touch display devices can now offer users richer and more natural ways of interacting.
With an existing touch display device of this kind, particularly a large-size one (for example, 55 to 110 inches), the system can recognize the matching gesture input from the acquired touch input trajectory, and perform the corresponding function, only after the user has completed the touch action. Because such devices are large, the user must move across a wide area of the screen while performing the touch action, that is, move a finger or a touch input device (such as a stylus) over a substantial distance to achieve the desired function. This increases the user's fatigue during use and gives the user a poor experience.
SUMMARY OF THE INVENTION
One of the technical problems to be solved by the present invention is to provide a method for recognizing gestures in a touch display device that reduces the fatigue a user feels when using a large-size touch display device, thereby giving the user a good experience.
To solve the above technical problem, an embodiment of the present application first provides a method for recognizing a gesture in a touch display device, the method comprising: receiving, in real time, the trajectory of a touch input on the touch display device; before the touch input is completed, recognizing and displaying all gesture inputs that match the trajectory of the currently received touch input; and receiving the gesture input determined from all the gesture inputs and performing the function corresponding to that gesture input.
Preferably, the step of recognizing all gesture inputs that match the trajectory of the currently received touch input comprises: determining the features of the trajectory of the touch input; and selecting, from a plurality of pre-stored gesture inputs, all gesture inputs having those features.
Preferably, the features of the trajectory of the touch input include the number of touch points, the type of the trajectory, and the quadrant in which the trajectory is located.
Preferably, when the real-time distance between two contact points of the touch input reaches a first set distance, all gesture inputs that match the trajectory of the currently received touch input are recognized and displayed.
Preferably, when the trajectory change distance of at least one contact point of the touch input reaches a second set distance, all gesture inputs that match the trajectory of the currently received touch input are recognized and displayed.
Preferably, when the time counted from the start of the touch input reaches a set time, all gesture inputs that match the trajectory of the currently received touch input are recognized and displayed.
According to another aspect of the present invention, a system for recognizing a gesture in a touch display device is also provided, the system comprising: a receiving unit that receives, in real time, the trajectory of a touch input on the touch display device; a recognition unit that, before the touch input is completed, recognizes all gesture inputs that match the trajectory of the currently received touch input; and an execution unit that receives the gesture input determined from all the gesture inputs displayed on the touch display device and performs the function corresponding to that gesture input.
Preferably, the recognition unit is further configured to determine the features of the trajectory of the touch input, and to select, from a plurality of pre-stored gesture inputs, all gesture inputs having those features.
Preferably, the features of the trajectory of the touch input include the number of touch points, the type of the trajectory, and the quadrant in which the trajectory is located.
Preferably, the system further comprises a trajectory calculation unit that calculates the real-time distance between two contact points of the touch input; when the real-time distance between the two contact points calculated by the trajectory calculation unit reaches the first set distance, the recognition unit recognizes all gesture inputs that match the trajectory of the currently received touch input.
Preferably, the trajectory calculation unit further calculates the trajectory change distance of at least one contact point of the touch input; when the trajectory change distance of the at least one contact point calculated by the trajectory calculation unit reaches the second set distance, the recognition unit recognizes all gesture inputs that match the trajectory of the currently received touch input.
Preferably, the system further comprises a timing unit that starts timing when the touch input begins; when the time counted by the timing unit reaches the set time, the recognition unit recognizes all gesture inputs that match the trajectory of the currently received touch input.
Compared with the prior art, one or more embodiments of the above solution may have the following advantages or beneficial effects.
In the method of the embodiments of the present invention, the system receives the trajectory of the touch input on the touch display device in real time and, before the touch input is completed, recognizes and displays all gesture inputs that match the currently received touch input trajectory; finally, the system receives the gesture input determined from all the gesture inputs and performs the function corresponding to that gesture input. With this method, the system can predict, before the user completes the touch input, which gesture input the user may intend, and control the display unit to display all possible similar gesture inputs as indication (or navigation) messages. Thus, when using a large-size touch display device, the user does not need to perform sweeping touch operations over a large area: the system can recognize similar gesture inputs in advance, reducing the burden on the user and providing a better user experience.
Other features and advantages of the present invention will be set forth in the following description, and will in part become apparent from the description or be understood by practicing the technical solution of the present invention. The objects and other advantages of the present invention can be realized and obtained through the structures and/or processes particularly pointed out in the description, the claims, and the accompanying drawings.
The accompanying drawings are provided to give a further understanding of the technical solution of the present application or of the prior art, and constitute part of the description. The drawings expressing the embodiments of the present application serve, together with the embodiments, to explain the technical solution of the present application, but do not limit it.
FIG. 1 is a schematic structural diagram of a touch display device according to an embodiment of the present application.
FIG. 2 is a schematic structural diagram of a gesture recognition system in a touch display device according to an embodiment of the present application.
FIG. 3 is a schematic flowchart of a method for recognizing a gesture in a touch display device according to an embodiment of the present application.
FIG. 4(a) and FIG. 4(b) are example diagrams showing different stages of gesture input on a touch display device.
Embodiments of the present invention are described in detail below with reference to the accompanying drawings and examples, so that the process by which the present invention applies technical means to solve technical problems and achieve the corresponding technical effects can be fully understood and implemented accordingly. The embodiments of the present application and the individual features of those embodiments can be combined with one another provided they do not conflict, and the resulting technical solutions all fall within the scope of protection of the present invention.
In addition, the steps shown in the flowcharts of the drawings may be performed in a computer system executing, for example, a set of computer-executable instructions. Moreover, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that given here.
FIG. 1 is a schematic structural diagram of a touch display device according to an embodiment of the present application. It should be noted that the touch display device of the present application can be applied to a television, a personal computer, a mobile phone, or the like.
As shown in FIG. 1, the touch display device includes a gesture recognition system 10, a touch detection section 20, an I/O interface 30, and a display unit 40. The touch detection section 20 is disposed on top of the display unit 40; it detects the user's touch input and transmits the detected touch input trajectory to the gesture recognition system 10 through the I/O interface 30. The system 10 receives, in real time, the trajectory of the touch input currently detected by the touch detection section 20 and recognizes all gesture inputs matching that trajectory before the touch input is completed; the system 10 then displays all the obtained gesture inputs on the display unit 40 through the I/O interface 30, and finally receives the gesture input the user selects from all the gesture inputs displayed on the display unit 40 and performs the function corresponding to that gesture input.
As shown in FIG. 2, the system 10 mainly includes a receiving unit 101, a recognition unit 102, and an execution unit 103. The receiving unit 101 receives, in real time through the I/O interface 30, the trajectory of the touch input on the touch display device. The recognition unit 102 recognizes, before the touch input is completed, all gesture inputs that match the currently received touch input trajectory. The execution unit 103 receives the gesture input determined from all the gesture inputs displayed on the touch display device (that is, on the display unit 40) and performs the function corresponding to that gesture input.
In addition, the system 10 further includes a storage unit 104 that stores information on a plurality of gesture inputs, mainly multi-point gesture inputs. This information is stored as a list pairing each specific touch gesture with the corresponding input signal that gesture generates (such as function signals for clicking, dragging, zooming in or out, and rotating). The storage unit 104 may be the memory of a television or mobile phone, or the hard disk of a computer. Specific touch gestures may include a single-finger tap, a double tap, pressing and dragging with a finger, or, as shown in FIG. 4, stretching two fingers in the vertical or 45° direction.
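The gesture list described above can be modeled roughly as follows. This is a minimal sketch: the patent does not specify a storage format, and every field name and signal name below is an illustrative assumption.

```python
# Hypothetical model of the storage unit's gesture list: each record pairs a
# touch gesture, described by its trajectory features, with the input signal
# it generates. All names are illustrative assumptions, not from the patent.
GESTURE_STORE = [
    {"name": "vertical stretch", "points": 2, "type": "stretch",
     "quadrants": {1, 3}, "signal": "maximize_window_vertical"},
    {"name": "45-degree stretch", "points": 2, "type": "stretch",
     "quadrants": {1, 3}, "signal": "maximize_window_45deg"},
    {"name": "single tap", "points": 1, "type": "tap",
     "quadrants": None, "signal": "click"},
    {"name": "press and drag", "points": 1, "type": "drag",
     "quadrants": None, "signal": "drag"},
]
```

With such a structure, selecting candidate gestures by trajectory features reduces to filtering the list on the stored feature fields.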
So that the recognition unit 102 can recognize all gesture inputs matching the currently received touch input trajectory before the user completes the touch input, the system 10 is also provided with a trigger module 105, which triggers the recognition unit 102 to start operating at a certain moment (a moment before the user completes the touch input). As shown in FIG. 2, the trigger module 105 is connected to the recognition unit 102 and includes a trajectory calculation unit 1051 and/or a timing unit 1052. The trajectory calculation unit 1051 can calculate the real-time distance between two contact points of the touch input, and can also calculate the trajectory change distance of each contact point, where the trajectory change distance is the distance a contact point has moved from its initial position. The timing unit 1052 starts timing when the touch input begins.
In this way, when the real-time distance between the two contact points calculated by the trajectory calculation unit 1051 reaches a first set distance, or the calculated trajectory change distance of at least one contact point reaches a second set distance, or the time counted by the timing unit 1052 reaches a set time, the recognition unit 102 is triggered to recognize all gesture inputs that match the trajectory of the currently received touch input. The recognition unit 102 can thus begin recognizing gesture inputs before the touch input is completed.
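The three trigger conditions can be sketched as a single predicate. This is a hedged illustration, not the patent's implementation: the function name and centimeter coordinates are assumptions, and the default thresholds simply reuse the example values (5 cm, 4 cm, 0.5 s) given later in this description.

```python
import math

def should_trigger(p1, p2, start1, start2, elapsed_s,
                   first_dist=5.0, second_dist=4.0, set_time=0.5):
    """Return True once any of the three trigger conditions holds.

    p1, p2      -- current (x, y) positions of the two contact points, in cm
    start1/2    -- their initial positions
    elapsed_s   -- seconds since the touch input began
    """
    dist = math.dist(p1, p2)            # real-time distance between contacts
    change1 = math.dist(p1, start1)     # trajectory change distance, point 1
    change2 = math.dist(p2, start2)     # trajectory change distance, point 2
    return (dist >= first_dist                      # first set distance
            or max(change1, change2) >= second_dist  # second set distance
            or elapsed_s >= set_time)                # set time reached
```

For example, two fingers that are already 6 cm apart trigger early recognition even though each has moved only 1 cm and little time has elapsed.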
When performing gesture recognition, the recognition unit 102 first determines the features of the trajectory of the touch input, and then selects, from the gesture inputs pre-stored in the storage unit 104, all gesture inputs that have those features. In this embodiment, the features of the trajectory of the touch input are the number of touch points, the type of the trajectory, and the quadrant in which the trajectory is located; however, the present invention is not limited to these features, and those skilled in the art may select other features as needed.
FIG. 3 is a schematic flowchart of the method of recognizing gestures in a touch display device according to an embodiment of the present application. The workflow of gesture recognition by the system 10 is described below with reference to FIG. 2 and FIG. 3.
When the user performs a multi-point touch input on the touch display device, the touch detection unit 20 detects the user's touch input in real time and sends the detected touch input trajectory to the receiving unit 101 through the I/O interface 30; the receiving unit 101 receives, in real time, the trajectory of the touch input on the touch display device (step S310).
As shown in FIG. 4(a), the user performs a two-finger stretch input on a large-size display device, i.e., the two fingers move in opposite directions, to enlarge the current operation window. While the user is moving, the touch detection unit 20 detects the positions of the fingers at a certain frequency (typically 60 times per second); by connecting the finger positions at different points in time, the touch detection unit 20 obtains the trajectory of the user's touch input. The touch detection unit 20 then sends the trajectory shown in FIG. 4(a) to the receiving unit 101 through the I/O interface 30.
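The sampling-and-connecting step above can be sketched as follows: positions sampled at a fixed rate are appended per finger, so each trajectory is simply the ordered list of samples. The function name and the simulated two-finger movement are assumptions of this sketch.

```python
# Toy sketch of how the touch detection unit (20) could assemble a
# trajectory from positions sampled at a fixed rate (60 Hz here).
SAMPLE_RATE_HZ = 60

def build_trajectories(frames):
    """frames: per-sample list of (x, y) tuples, one tuple per finger."""
    n_fingers = len(frames[0])
    trajectories = [[] for _ in range(n_fingers)]
    for frame in frames:
        for finger, pos in enumerate(frame):
            trajectories[finger].append(pos)
    return trajectories

# Simulate 0.25 s of a two-finger stretch: finger 0 moves up-right and
# finger 1 moves down-left, away from their common midpoint.
frames = [((t, t), (-t, -t)) for t in range(int(SAMPLE_RATE_HZ * 0.25))]
traj = build_trajectories(frames)
```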
Next, before the user completes the touch input, the recognition unit 102 identifies all gesture inputs that match the currently received touch input trajectory, and displays all the identified gesture inputs on the display unit 40 through the receiving unit 101 and the I/O interface 30 (step S320).
Specifically, taking the touch input in FIG. 4(a) as an example, when the user's two fingers perform a stretch action on the large-size display unit 40 and the trajectory calculation unit 1051 detects that the real-time distance between the two contact points has reached the first set distance (for example, 5 cm), the recognition unit 102 starts recognizing gesture inputs based on the currently received touch input trajectory. This largely spares the user the large-scale movements otherwise needed to complete the zoom-in operation on a large-size display device, reducing user fatigue.
The recognition unit 102 compares and matches the currently received touch input trajectory against the various multi-point gesture input information pre-stored in the storage unit 104.
Taking FIG. 4 as an example, the following describes how the recognition unit 102 filters out similar gesture inputs from the storage unit 104. First, the recognition unit 102 determines the number of points in the touch input trajectory: the touch action in FIG. 4(a) is a two-point input, so the recognition unit 102 can select all two-point gesture inputs from the storage unit 104. The recognition unit 102 then determines the trajectory type: in FIG. 4(a), both fingers perform a stretch action relative to their initial positions; in other words, relative to the initial positions, both fingers move toward the edges of the display device, so the recognition unit 102 can further select, from the two-point gesture inputs, all gesture inputs in which two points stretch simultaneously. Finally, the recognition unit 102 determines the quadrants of the input trajectory. Taking the midpoint between the two fingers' initial positions as the origin, one contact point moves within the first quadrant and the other within the third quadrant, and both move away from the origin. The storage unit 104 stores two gesture inputs similar to the above action (see FIG. 4(b)): one is a stretch in the vertical direction (its corresponding input signal maximizes the operation window in the vertical direction), and the other is a stretch in the 45° direction (its corresponding input signal maximizes the operation window in the 45° direction). The recognition unit 102 therefore finally filters out these two gesture inputs as similar to the touch input trajectory of FIG. 4(a).
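The three-feature classification just described can be sketched as a small feature extractor: count the contacts, compare the current gap with the initial gap to label stretch versus pinch, and assign each contact a quadrant relative to the midpoint of the initial positions. The function names and the "stretch"/"pinch" labels are assumptions of this sketch.

```python
import math

# Sketch of the feature determination performed by the recognition unit
# (102): touch point count, trajectory type, and quadrants of the contacts
# relative to the midpoint-of-initial-positions origin.

def quadrant(point, origin):
    """Quadrants 1-4 counterclockwise, as in the FIG. 4 example."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    if dx >= 0 and dy >= 0:
        return 1
    if dx < 0 and dy >= 0:
        return 2
    if dx < 0 and dy < 0:
        return 3
    return 4

def extract_features(initial, current):
    """initial/current: one (x, y) position per contact point."""
    origin = (sum(p[0] for p in initial) / len(initial),
              sum(p[1] for p in initial) / len(initial))
    start_gap = math.dist(initial[0], initial[1])
    now_gap = math.dist(current[0], current[1])
    return {
        "touch_points": len(current),
        # points moving apart -> stretch; points moving together -> pinch
        "trajectory_type": "stretch" if now_gap > start_gap else "pinch",
        "quadrants": {quadrant(p, origin) for p in current},
    }

# Two fingers stretching apart, as in FIG. 4(a)
features = extract_features(initial=[(1, 1), (-1, -1)],
                            current=[(4, 4), (-4, -4)])
```

For this input the extractor reports a two-point stretch in quadrants 1 and 3, which is exactly the feature triple used to select gestures I1 and I2 from the store.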
The display unit 40 then displays all the identified gesture inputs as guidance information to indicate the possible similar gestures corresponding to the current touch input. As shown in FIG. 4(b), the display unit 40 displays the vertical-stretch gesture I1 and the 45°-stretch gesture I2.
Finally, the user selects the desired gesture input from the gesture inputs displayed on the display unit 40 of the touch display device and taps it. After detecting the tap event, the touch detection unit 20 sends this information to the receiving unit 101 through the I/O interface 30; the receiving unit 101 receives the gesture input thus selected from all the gesture inputs, and the execution unit 103 connected to the receiving unit 101 executes the function corresponding to that gesture input (step S330).
After the user selects the vertical-stretch gesture I1 shown in FIG. 4(b), the execution unit 103 maximizes the operation window in the vertical direction; after the user selects the 45°-stretch gesture I2, the execution unit 103 maximizes the operation window in the 45° direction.
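The final dispatch step can be sketched as a mapping from the selected gesture's input signal to a handler that changes the window state. The window model, signal names, and handlers are illustrative assumptions of this sketch, not the patent's actual implementation.

```python
# Toy sketch of the execution unit (103): route the confirmed gesture's
# input signal to the function that updates the operation window.

def maximize_vertical(window):
    # Full screen height, keep the current width (vertical maximization).
    return {**window, "height": window["screen_h"], "y": 0}

def maximize_45(window):
    # Illustrative reading of "maximize in the 45° direction": grow the
    # window along the diagonal until it fills the screen.
    return {**window, "width": window["screen_w"], "height": window["screen_h"],
            "x": 0, "y": 0}

HANDLERS = {
    "maximize_window_vertical": maximize_vertical,  # gesture I1
    "maximize_window_45": maximize_45,              # gesture I2
}

def execute(input_signal, window):
    return HANDLERS[input_signal](window)

window = {"x": 10, "y": 20, "width": 300, "height": 200,
          "screen_w": 1920, "screen_h": 1080}
result = execute("maximize_window_vertical", window)
```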
In addition, when the trajectory change distance of at least one contact point calculated by the trajectory calculation unit 1051 reaches the second set distance, the recognition unit 102 is likewise triggered to identify all gesture inputs that match the currently received touch input trajectory.
For example, when the user performs a two-point pinch touch input to shrink the current operation window, if the trajectory change distance of at least one contact point calculated by the trajectory calculation unit 1051 reaches the second set distance (for example, 4 cm), the recognition unit 102 is triggered, before the user completes the touch input, to identify all gesture inputs that match the currently received touch input trajectory.
Alternatively, when the time counted by the timing unit 1052 from the start of the touch input reaches the set time (for example, 0.5 s), the recognition unit 102 is likewise triggered to identify all gesture inputs that match the currently received touch input trajectory.
In summary, the gesture recognition system of this embodiment receives, in real time, the trajectory of a touch input on the touch display device; before the touch input is completed, it identifies and displays all gesture inputs that match the currently received touch input trajectory; finally, the system receives the gesture input selected from all the displayed gesture inputs and executes the function corresponding to that gesture input. With this method, the system can predict, before the user completes the touch input, which gesture inputs the user may intend to make, and control the display unit to display all possible similar gesture inputs as guidance (or navigation) information. Thus, when using a large-size touch display device, the user does not need to perform large-scale touch operations over a wide area; the system recognizes similar gesture inputs in advance, reducing the user's burden and providing a better user experience.
Those skilled in the art will understand that the above units or steps of the present invention may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed over a network formed by multiple computing devices. Optionally, they may be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; alternatively, they may each be made into individual integrated circuit modules, or multiple of their modules or steps may be made into a single integrated circuit module. Thus, the present invention is not limited to any particular combination of hardware and software.
Although the embodiments of the present invention are disclosed as above, they are described only to facilitate understanding of the present invention and are not intended to limit it. Any person skilled in the art to which the present invention pertains may make modifications and changes in form and detail without departing from the spirit and scope disclosed by the present invention; however, the scope of patent protection of the present invention shall still be defined by the appended claims.
A person of ordinary skill in the art will understand that all or part of the steps of the methods in the above embodiments may be accomplished by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, includes the steps mentioned above. The storage medium may be, for example, ROM/RAM, a magnetic disk, or an optical disc.
Claims (12)
- A method of recognizing gestures in a touch display device, the method comprising: receiving, in real time, a trajectory of a touch input on the touch display device; before the touch input is completed, identifying and displaying all gesture inputs that match the currently received trajectory of the touch input; and receiving a gesture input selected from all the gesture inputs, and executing a function corresponding to that gesture input.
- The method according to claim 1, wherein the step of identifying all gesture inputs that match the currently received trajectory of the touch input comprises: determining features of the trajectory of the touch input; and selecting, from a plurality of pre-stored gesture inputs, all gesture inputs having said features.
- The method according to claim 2, wherein the features of the trajectory of the touch input include the number of touch points, the trajectory type, and the quadrant in which the trajectory lies.
- The method according to claim 1, wherein, when a real-time distance between two contact points in the touch input reaches a first set distance, all gesture inputs that match the currently received trajectory of the touch input are identified and displayed.
- The method according to claim 1, wherein, when a trajectory change distance of at least one contact point in the touch input reaches a second set distance, all gesture inputs that match the currently received trajectory of the touch input are identified and displayed.
- The method according to claim 1, wherein, when a time counted from the start of the touch input reaches a set time, all gesture inputs that match the currently received trajectory of the touch input are identified and displayed.
- A system of recognizing gestures in a touch display device, the system comprising: a receiving unit that receives, in real time, a trajectory of a touch input on the touch display device; a recognition unit that, before the touch input is completed, identifies all gesture inputs that match the currently received trajectory of the touch input; and an execution unit that receives a gesture input selected from all the gesture inputs displayed on the touch display device, and executes a function corresponding to that gesture input.
- The system according to claim 7, wherein the recognition unit is further configured to determine features of the trajectory of the touch input, and to select, from a plurality of pre-stored gesture inputs, all gesture inputs having said features.
- The system according to claim 8, wherein the features of the trajectory of the touch input include the number of touch points, the trajectory type, and the quadrant in which the trajectory lies.
- The system according to claim 7, further comprising a trajectory calculation unit that calculates a real-time distance between two contact points in the touch input; when the real-time distance between the two contact points calculated by the trajectory calculation unit reaches a first set distance, the recognition unit identifies all gesture inputs that match the currently received trajectory of the touch input.
- The system according to claim 10, wherein the trajectory calculation unit further calculates a trajectory change distance of at least one contact point in the touch input; when the trajectory change distance of the at least one contact point calculated by the trajectory calculation unit reaches a second set distance, the recognition unit identifies all gesture inputs that match the currently received trajectory of the touch input.
- The system according to claim 7, further comprising a timing unit that starts timing from the start of the touch input; when the time counted by the timing unit reaches a set time, the recognition unit identifies all gesture inputs that match the currently received trajectory of the touch input.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/785,937 US10318147B2 (en) | 2015-06-17 | 2015-07-20 | Method and system of gesture recognition in touch display device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510337013.2 | 2015-06-17 | ||
CN201510337013.2A CN104898980A (zh) | 2015-06-17 | Method and system of gesture recognition in touch display device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016201760A1 true WO2016201760A1 (zh) | 2016-12-22 |
Family
ID=54031665
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/084458 WO2016201760A1 (zh) | 2015-06-17 | 2015-07-20 | 一种触摸显示装置中识别手势的方法和系统 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10318147B2 (zh) |
CN (1) | CN104898980A (zh) |
WO (1) | WO2016201760A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105786377B (zh) * | 2016-02-17 | 2019-08-06 | 京东方科技集团股份有限公司 | Touch monitoring method and device, and terminal |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2107448A2 (en) * | 2008-04-02 | 2009-10-07 | ASUSTeK Computer Inc. | Electronic apparatus and control method thereof |
CN101859226A (zh) * | 2009-04-08 | 2010-10-13 | Lg电子株式会社 | Method for inputting a command in a mobile terminal and mobile terminal using the same |
US20100259493A1 (en) * | 2009-03-27 | 2010-10-14 | Samsung Electronics Co., Ltd. | Apparatus and method recognizing touch gesture |
CN102298485A (zh) * | 2010-06-22 | 2011-12-28 | 广东国笔科技股份有限公司 | Touch-screen-based real-time invocation system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010015238A (ja) * | 2008-07-01 | 2010-01-21 | Sony Corp | Information processing apparatus and method for displaying auxiliary information |
KR101557358B1 (ko) * | 2010-02-25 | 2015-10-06 | 엘지전자 주식회사 | Character string input method and apparatus |
CN102289341B (zh) * | 2010-06-17 | 2013-04-03 | 汉王科技股份有限公司 | Game control method and device applied to a touch device, and touch device |
CN102609165B (zh) * | 2011-01-24 | 2014-09-03 | 广州三星通信技术研究有限公司 | Mobile terminal having a touch screen and mode control method for the mobile terminal |
JP5931627B2 (ja) * | 2012-07-23 | 2016-06-08 | 京セラ株式会社 | Portable terminal device, program, and input correction method |
CN104020989B (zh) * | 2014-05-06 | 2017-10-03 | 深信服科技股份有限公司 | Remote-application-based control method and system |
2015
- 2015-06-17: CN application CN201510337013.2A, patent CN104898980A (zh), active, Pending
- 2015-07-20: US application US14/785,937, patent US10318147B2 (en), active, Active
- 2015-07-20: WO application PCT/CN2015/084458, patent WO2016201760A1 (zh), active, Application Filing
Also Published As
Publication number | Publication date |
---|---|
US10318147B2 (en) | 2019-06-11 |
US20170153805A1 (en) | 2017-06-01 |
CN104898980A (zh) | 2015-09-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8390577B2 (en) | Continuous recognition of multi-touch gestures | |
US20180024643A1 (en) | Gesture Based Interface System and Method | |
US9104308B2 (en) | Multi-touch finger registration and its applications | |
US20150220149A1 (en) | Systems and methods for a virtual grasping user interface | |
WO2018196699A1 (zh) | Display method for a fingerprint recognition area, and mobile terminal | |
WO2016090888A1 (zh) | Icon moving method, apparatus and device, and non-volatile computer storage medium | |
WO2016138661A1 (zh) | Method for processing a user interface of a terminal, user interface, and terminal | |
BR112017007752B1 (pt) | Touch interaction processing method, device, and system | |
CN103257811A (zh) | Touch-screen-based picture display system and method | |
US20140298223A1 (en) | Systems and methods for drawing shapes and issuing gesture-based control commands on the same draw grid | |
CN102915202A (zh) | Touch control method and system for a touch device | |
US10282087B2 (en) | Multi-touch based drawing input method and apparatus | |
US9778780B2 (en) | Method for providing user interface using multi-point touch and apparatus for same | |
US20150363037A1 (en) | Control method of touch panel | |
CN104898880B (zh) | Control method and electronic device | |
CN103019426A (zh) | Interaction method and apparatus in a touch terminal | |
WO2018218392A1 (zh) | Touch operation processing method and touch keyboard | |
WO2017101340A1 (zh) | Method and device for adjusting a video window through multi-point touch | |
WO2016201760A1 (zh) | Method and system of gesture recognition in touch display device | |
US9524051B2 (en) | Method and terminal for inputting multiple events | |
CN103809912A (zh) | Tablet computer based on a multi-point touch screen | |
US9396390B2 (en) | Systems and methods for sketch processing | |
US20150042586A1 (en) | Input Device | |
CN104484117B (zh) | Human-computer interaction method and apparatus | |
US20150268734A1 (en) | Gesture recognition method for motion sensing detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 14785937; Country of ref document: US |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15895333; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15895333; Country of ref document: EP; Kind code of ref document: A1 |