CN102012773A - Method and equipment for operating interfaces by utilizing airflows


Info

Publication number
CN102012773A
Authority
CN
China
Prior art keywords
screen
airflow
screen area
intersection point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2009100922836A
Other languages
Chinese (zh)
Other versions
CN102012773B (en)
Inventor
刘琨
章锋
王森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN2009100922836A priority Critical patent/CN102012773B/en
Publication of CN102012773A publication Critical patent/CN102012773A/en
Application granted granted Critical
Publication of CN102012773B publication Critical patent/CN102012773B/en
Expired - Fee Related

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and a device for operating an interface by using airflow, which solve the prior-art problems of high memory consumption and complicated implementation when an interface is controlled by blowing. In the method, the device screen is divided into a plurality of regions in advance, and a plurality of sensors capable of sensing airflow direction are arranged around the screen. The method comprises: receiving the direction, sensed by each sensor, of the airflow reflected by the screen, and determining, from the airflow direction sensed by each sensor, the straight-line path from each sensor to the position on the screen where the airflow is reflected, as well as the intersection points of these straight-line paths; calculating the spatial distance between the intersection points and the feature points of each screen region, and taking the position on the screen of the screen region with the smallest spatial distance as the reflection position of the airflow on the screen; and controlling the user interface displayed on the screen according to the reflection position of the airflow on the screen.

Description

Method and device for operating an interface by using airflow
Technical field
The present invention relates to the field of communications, and in particular to a method and a device for operating an interface by using airflow.
Background technology
In an operating system, the user interface is one of the most direct embodiments of human-computer interaction, and its operating efficiency directly affects the quality of that interaction. If operating the user interface is complicated and the operating system responds slowly, the system's appeal to users and its usability are reduced; conversely, an operating process that is simple, effective and highly responsive improves the user's efficiency and satisfaction. The existing keyboard-based mode of operation, because its interaction pattern is fixed, offers a monotonous user experience and in some environments cannot meet the requirements of human-computer interaction at all.
On the other hand, as human-computer interaction develops, new technologies assisted by limb movements and gestures are gradually replacing traditional key operation, such as touch screens, multi-touch and gravity sensing, and have greatly improved the efficiency of interaction. Most of these new technologies, however, only support gestures. In the field of human-computer interaction, sound and airflow, as typical outputs of human body information, can also convey information to a machine effectively, and after processing the machine can perform the action the person intends. Especially when a person's limbs cannot operate the machine, such alternative controls become particularly important.
At present, blowing sound can be used to control the user interface of a PC effectively. The method described by Shwetak N. Patel and Gregory D. Abowd of the Georgia Tech College of Computing in their UIST '07 paper "BLUI: Low-cost Localized Blowable User Interfaces" captures the loud sound caused by blowing with a microphone and, after machine learning, uses it as an event to control selection, icon movement, screen scrolling and so on. In this method a microphone is placed within 10 cm of the periphery of the user-interface screen as the receiving sensor. When the user blows at the screen, the airflow is reflected on contact with the screen; blowing sounds reflected from different regions of the screen reach the microphone with different phase and intensity, so a distinct sound signal can be obtained for each screen region, which provides the basis for training a set of categories.
The flow of this user-interface control method is shown in Fig. 1; the specific steps are as follows:
Step 101: divide the user-interface screen into n regions;
Step 102: the user blows at the screen, and the microphone receives the blowing sound, obtaining a sound signal;
Step 103: transform the sound signal from the time-domain signal S(t) into the frequency-domain signal X(f, t) by an SD-FFT (Sparse Data Fast Fourier Transform, a short-time discrete Fourier transform);
Step 104: extract the feature values of X(f, t) with the PCA (Principal Components Analysis) algorithm, which is described as follows:
X(f, t) can be expressed as:
X = \begin{pmatrix} x_{11} & x_{12} & \cdots & x_{1p} \\ x_{21} & x_{22} & \cdots & x_{2p} \\ \vdots & \vdots & & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{np} \end{pmatrix}    (1)
Calculate the covariance matrix:
R = \begin{pmatrix} r_{11} & r_{12} & \cdots & r_{1p} \\ r_{21} & r_{22} & \cdots & r_{2p} \\ \vdots & \vdots & & \vdots \\ r_{p1} & r_{p2} & \cdots & r_{pp} \end{pmatrix}    (2)
where r_{ij} (i, j = 1, 2, \ldots, p) is the correlation coefficient between the original variables x_i and x_j, with r_{ij} = r_{ji}; it is computed as:
r_{ij} = \frac{\sum_{k=1}^{n} (x_{ki} - \bar{x}_i)(x_{kj} - \bar{x}_j)}{\sqrt{\sum_{k=1}^{n} (x_{ki} - \bar{x}_i)^2 \sum_{k=1}^{n} (x_{kj} - \bar{x}_j)^2}}    (3)
Solve the characteristic equation |\lambda I - R| = 0 with the Jacobi method to obtain the eigenvalues, and sort them in descending order, i.e. \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0;
Obtain the eigenvector e_i (i = 1, 2, \ldots, p) corresponding to each eigenvalue \lambda_i, subject to \lVert e_i \rVert = 1, i.e. \sum_{j=1}^{p} e_{ij}^2 = 1, where e_{ij} denotes the j-th component of the vector e_i.
Calculate the contribution rate of each principal component:
\frac{\lambda_i}{\sum_{k=1}^{p} \lambda_k} \quad (i = 1, 2, \ldots, p)    (4)
and the cumulative contribution rate:
\frac{\sum_{k=1}^{i} \lambda_k}{\sum_{k=1}^{p} \lambda_k} \quad (i = 1, 2, \ldots, p)    (5)
Take as principal components the first m eigenvalues whose cumulative contribution rate exceeds a threshold a.
Calculate the principal component loadings:
l_{ij} = p(z_i, x_j) = \sqrt{\lambda_i}\, e_{ij} \quad (i, j = 1, 2, \ldots, p)    (6)
The m principal components of X(f, t) are obtained as:
\begin{cases} z_1 = l_{11} x_1 + l_{12} x_2 + \cdots + l_{1p} x_p \\ z_2 = l_{21} x_1 + l_{22} x_2 + \cdots + l_{2p} x_p \\ \quad \vdots \\ z_m = l_{m1} x_1 + l_{m2} x_2 + \cdots + l_{mp} x_p \end{cases}    (7)
The coefficient matrix L of equation (7), written as Z = LX, is taken as the feature value.
Step 105: run the KNN (k-nearest-neighbours) algorithm on the computed feature value L to determine the position of the source of the blowing sound. The classification process is as follows:
After the feature value L to be classified is fed into the classification process, the k vectors nearest to L by Euclidean distance are chosen from the training set (the set of feature values L from N experiments). By counting how many of these k vectors come from each region's feature-value set in the training set, the region to which the feature value L belongs is determined, and that region is output as the source region of the blowing sound (a code sketch of this pipeline follows step 106 below).
Step 106: after the region position is obtained, the operating system can control the user interface according to this position information, performing operations such as selection, icon movement and screen scrolling.
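To make the flow of steps 103-105 concrete, the following is a minimal sketch in Python rather than the paper's implementation: scipy's STFT stands in for the SD-FFT, scikit-learn's PCA and KNeighborsClassifier stand in for the PCA and KNN steps, and the training data is synthetic; the sampling rate, window length, number of components and k are illustrative assumptions.

```python
# Hedged sketch of the prior-art pipeline (steps 103-105); libraries and parameters are assumptions.
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def blow_features(sound, fs=8000, n_components=8):
    """Stand-in for steps 103-104: S(t) -> X(f, t) by a short-time transform, then PCA features."""
    _, _, X = stft(sound, fs=fs, nperseg=256)          # step 103: time domain -> X(f, t)
    frames = np.abs(X).T                                # time frames x frequency bins
    pca = PCA(n_components=n_components).fit(frames)   # step 104: principal components of this sample
    return pca.transform(frames).mean(axis=0)          # one summary feature vector per blow

# Synthetic stand-in training set: m blows per screen region (real data would be recorded blow sounds).
rng = np.random.default_rng(0)
n_regions, m_per_region = 9, 20
sounds = [rng.normal(scale=1.0 + r * 0.1, size=4000)
          for r in range(n_regions) for _ in range(m_per_region)]
labels = [r for r in range(n_regions) for _ in range(m_per_region)]
features = [blow_features(s) for s in sounds]

# Step 105: a KNN classifier over the training feature set assigns a new blow to a screen region.
knn = KNeighborsClassifier(n_neighbors=5).fit(features, labels)
predicted_region = knn.predict([blow_features(rng.normal(scale=1.3, size=4000))])[0]
```

Even in this simplified form, the per-region training samples and the distance search of the KNN step make visible the O(n·m) cost discussed below.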
The present inventors have found that the prior-art scheme has the following defects:
First, the computational load and complexity of the algorithm are high. In this method the complexity of the Euclidean-distance computation is O(n·m), where n is the number of divided regions and m is the number of training samples per region, so the amount of computation is large and the memory demand on the device applying the method is correspondingly high. On a device with little memory, the system response time in actual operation becomes longer, interface operation becomes sluggish and user satisfaction drops, so the method is difficult to apply to mobile-phone platforms with low clock frequencies.
Second, the resolution of the interface is proportional to the number of regions into which the interface is divided, but because the computation of this method is heavy and its memory footprint is high, the screen should not be divided into many regions; the method therefore cannot further improve the accuracy of interface operation while maintaining a given system response speed. Moreover, because more regions cannot be divided, each single region is large, so during pre-training the user must input more airflow information and more samples accumulate in the training set; storing these sample data also takes more memory, further increasing the memory requirement of the device.
Third, the pre-training of this method is completed after the device is put into use, by forming a training set from the user's input as samples. In the early stage of using the interface control system the number of input samples is very small, so the accuracy of the feature vectors obtained by pre-training is very low, and the sensitivity and accuracy of the system in this early stage are correspondingly poor; the user must go through a certain adaptation period and amount of operation before the system can be used smoothly.
In summary, the above technique of controlling a PC user interface by blowing sound involves complex computation and a large memory footprint, and is unsuitable for mobile terminals with limited data-processing capability and memory.
Summary of the invention
Embodiments of the invention provide a method and a terminal device for operating an interface by using airflow, in order to solve the prior-art problems of high memory consumption and complicated implementation when an interface is controlled by blowing sound.
The technical solutions provided by the embodiments of the invention comprise:
A method of controlling and operating a user interface by using airflow, wherein the device screen is divided into a plurality of regions in advance and a plurality of sensors capable of sensing airflow direction are arranged around the screen, the method comprising the steps of:
receiving the direction, sensed by each sensor, of the airflow reflected by the screen, and determining, from the airflow direction sensed by each sensor, the straight-line path from each sensor to the position on the screen where the airflow is reflected, and the intersection points of these straight-line paths;
calculating the spatial distance between the intersection points and the feature points of each screen region, and taking the position on the screen of the screen region corresponding to the smallest spatial distance as the reflection position of the airflow on the screen;
controlling the user interface displayed on the screen according to the determined reflection position of the airflow on the screen.
A device for controlling and operating a user interface by using airflow, comprising:
a plurality of sensors capable of sensing airflow direction, arranged around the screen;
an intersection determining unit, configured to receive the direction, sensed by each sensor, of the airflow reflected by the screen, and to determine, from the airflow direction sensed by each sensor, the straight-line path from each sensor to the position on the screen where the airflow is reflected, and the intersection points of these straight-line paths;
a position determining unit, configured to calculate the spatial distance between the intersection points and the feature points of each screen region, and to take the position on the screen of the screen region corresponding to the smallest spatial distance as the reflection position of the airflow on the screen;
an interface control unit, configured to control the user interface displayed on the screen according to the determined reflection position of the airflow on the screen.
In the embodiments of the invention, a plurality of airflow-direction sensors are arranged around the device screen. The airflow directions they sense are used to determine the straight-line path from each sensor to the reflection position of the airflow on the screen and the intersection point of every two such paths; the spatial distance between each intersection coordinate and the feature point coordinates of each screen region is then determined, and, on the principle that a smaller spatial distance means higher similarity, the reflection position of the airflow on the screen is determined from these distances. Compared with the prior-art technique of locating a screen region from sound signals sensed by a microphone, the signal sensed by an airflow-direction sensor is already position-related, so the localization algorithm can be simplified and memory consumption reduced. Furthermore, because the feature point coordinates of the screen regions are preset in the terminal device, the storage space required for training samples of the screen-region feature points can be reduced. The scheme is therefore suitable for devices with limited processing power and storage, such as mobile terminals; for other devices, the number of screen regions can be increased to further improve screen-positioning accuracy.
Description of drawings
Fig. 1 is a schematic flow chart of a prior-art user-interface control method;
Fig. 2A is a schematic diagram of the external structure of a mobile terminal according to an embodiment of the invention;
Fig. 2B is a schematic diagram of the internal structure of a mobile terminal according to an embodiment of the invention;
Fig. 3 is a schematic flow chart of controlling a user interface by using airflow according to an embodiment of the invention;
Fig. 4 is a schematic diagram of the airflow directions received by the sensors in an embodiment of the invention.
Embodiment
In view of the shortcomings of existing techniques that control a user interface according to the user's blowing behaviour, embodiments of the invention arrange a plurality of airflow-direction sensors around the screen of a terminal device to collect airflow-direction signals, and improve the method of determining the position on the terminal screen on which the airflow acts (hereinafter the airflow source position), thereby simplifying the algorithm and reducing memory usage.
In the embodiments of the invention, an array of airflow-direction sensors is arranged around the screen of the terminal device. The number of airflow-direction sensors is no fewer than two; the more sensors there are, the more accurately the position at which the airflow acts on the screen can be judged, but the larger the corresponding amount of computation. Each airflow-direction sensor in the array can be fixed around the screen, preferably distributed on all sides of the screen.
In the embodiments of the invention, the screen of the terminal device also needs to be divided into a plurality of regions. The number of regions can be chosen according to the required operating accuracy: the more regions, the higher the accuracy of locating the airflow source, but the larger the corresponding amount of computation. For the regions thus divided, the feature vector of each region also needs to be stored in the terminal device in advance.
The principle on which the embodiments of the invention sense airflow direction with airflow-direction sensors and thereby perform control operations on the user interface is as follows:
Each airflow-direction sensor arranged on the terminal device can sense the direction of the airflow. In a given coordinate system, the position of each sensor can be represented by coordinate parameters and the airflow direction it senses by an angle parameter, so a straight-line path can be obtained from the coordinates of each sensor and the airflow angle it senses. Every two straight-line paths intersect at a point whose coordinates can be calculated; with N sensors there are at most N(N-1)/2 such intersection points, and the range enclosed by these N(N-1)/2 intersection points is the airflow source range sensed by the sensors. Because the airflow acts on the screen over an area rather than at a single point, and because of the characteristics of the airflow and sensor errors, the range enclosed by these intersection points does not necessarily fall within a single pre-divided screen region and may cover several regions; the airflow source position therefore needs to be located more precisely from these intersection coordinates and the feature points of each region.
Each region has N(N-1)/2 feature points, and the feature point set of each region can be expressed as [X_i, Y_i] (i = 0, 1, ...), where X and Y are the feature point coordinates. When determining the airflow source position, the spatial distances between the coordinates of each intersection point and the feature point coordinates of each region can be computed. Since spatial distance expresses the similarity between the two (the smaller the distance, the greater the similarity), the most similar region can be determined from the computed spatial distances, and the position of that region is taken as the position of the airflow source.
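As a concrete illustration of this principle, the sketch below computes the pairwise intersections of the straight-line paths defined by each sensor's coordinates and perceived airflow angle, and checks the N(N-1)/2 bound; the sensor positions and angles are made-up example values, not taken from the patent.

```python
# Minimal sketch of the intersection principle: each sensor at (x, y) with a perceived angle defines
# a line through its position; every pair of lines yields one intersection, at most N(N-1)/2 in total.
# All numeric values are illustrative assumptions.
import math
from itertools import combinations

def line_intersection(p1, angle1, p2, angle2):
    """Intersection of two lines, each through point p with direction given by angle (radians)."""
    (x1, y1), (x2, y2) = p1, p2
    d1 = (math.cos(angle1), math.sin(angle1))
    d2 = (math.cos(angle2), math.sin(angle2))
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-9:                                   # parallel paths: no usable intersection
        return None
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / det
    return (x1 + t * d1[0], y1 + t * d1[1])

# Assumed sensor layout around a screen and assumed perceived airflow angles.
sensors = {"A": ((0.0, 160.0), math.radians(15)),
           "B": ((120.0, 320.0), math.radians(210)),
           "C": ((240.0, 160.0), math.radians(115)),
           "D": ((120.0, 0.0), math.radians(320))}

points = [line_intersection(p1, a1, p2, a2)
          for (_, (p1, a1)), (_, (p2, a2)) in combinations(sensors.items(), 2)]
points = [p for p in points if p is not None]
n = len(sensors)
assert len(points) <= n * (n - 1) // 2                    # at most N(N-1)/2 intersection points
```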
The implementation of an embodiment of the invention is described in detail below, taking the mobile terminal shown in Fig. 2A and Fig. 2B as an example.
Around the screen of the mobile terminal shown in Fig. 2A, one airflow-direction sensor is arranged on each side, namely sensors A, B, C and D, as shown in the figure. The screen is evenly divided into 24 regions, and each region has 6 feature points.
As shown in Fig. 2B, the structure of this mobile terminal may comprise a storage unit 201, in which the feature point coordinates of each region are stored; the feature point set of each region can be expressed as [X_i, Y_i] (i = 1, 2, ..., 6). The functional units that determine the airflow source position from the airflow directions sensed by the airflow-direction sensors may comprise an intersection determining unit 202, a distance calculating unit 203 and a position determining unit 204; these functional units can be implemented in software and deployed in the CPU or another processor of the mobile terminal. The terminal further comprises an interface control unit 205, which controls the user interface on the screen of the mobile terminal.
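The storage unit 201 of Fig. 2B essentially holds a table of 24 regions x 6 feature points x (X, Y) coordinates. A minimal sketch of such a table is given below; the screen size, grid layout and point offsets are illustrative assumptions, not values from the patent.

```python
# Assumed layout of the feature-point table held by storage unit 201 (values are illustrative only).
import numpy as np

N_REGIONS, N_FEATURE_POINTS = 24, 6
feature_points = np.zeros((N_REGIONS, N_FEATURE_POINTS, 2))
for k in range(N_REGIONS):
    # Assumed 4 x 6 grid of regions on a 240 x 320 screen; six points spread around each region centre.
    cx, cy = 30 + 60 * (k % 4), 26 + 53 * (k // 4)
    offsets = [(-15, -10), (15, -10), (-15, 10), (15, 10), (0, -13), (0, 13)]
    feature_points[k] = [(cx + dx, cy + dy) for dx, dy in offsets]

def region_feature_set(k):
    """Return the feature point set [X_i, Y_i] (i = 1..6) of region k, as read by units 203 and 204."""
    return feature_points[k]
```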
Using the mobile terminal shown in Fig. 2A and Fig. 2B, the flow of operating the user interface on the terminal screen by means of airflow can be as shown in Fig. 3; the specific steps are as follows:
Step 301: when airflow acts on the mobile terminal screen, the sensors A, B, C and D sense the airflow-direction information and send it to the intersection determining unit 202; the airflow-direction information sensed by sensors A, B, C and D is expressed as the signals X_1(t), X_2(t), X_3(t) and X_4(t) respectively.
Step 302: from the received signals, the intersection determining unit 202 determines the angle of the airflow direction sensed by each sensor relative to the given coordinate system; then, from these angles and the coordinates of each sensor in the given coordinate system, it determines the straight-line path from each sensor position to the airflow source position it senses, and then determines the intersection coordinates of every two straight-line paths.
As shown in Fig. 4, the straight-line paths corresponding to sensors A, B, C and D are a, b, c and d respectively; their intersection points comprise the intersection O_1 of a and d, O_2 of b and d, O_3 of b and c, O_4 of c and d, O_5 of a and c, and O_6 of a and b. The following line equations are obtained:
\begin{cases} \alpha_1 x + y = \alpha_1 x_A + y_A \\ \alpha_2 x + y = \alpha_2 x_B + y_B \\ \alpha_3 x + y = \alpha_3 x_C + y_C \\ \alpha_4 x + y = \alpha_4 x_D + y_D \end{cases}    (8)
where \alpha_1, \alpha_2, \alpha_3, \alpha_4 are the airflow angles sensed by sensors A, B, C and D, i.e. the angles of the straight-line paths a, b, c and d respectively, and (x_A, y_A), (x_B, y_B), (x_C, y_C), (x_D, y_D) are the coordinates of the positions of sensors A, B, C and D respectively.
The six intersection coordinates of a, b, c and d can be calculated from equation (8), namely:
O_1: \left( \frac{\alpha_4 x_D - \alpha_1 x_A}{\alpha_4 - \alpha_1},\ \frac{\alpha_4 y_A - \alpha_1 y_D}{\alpha_4 - \alpha_1} \right) \quad O_2: \left( \frac{\alpha_4 x_D - \alpha_2 x_B}{\alpha_4 - \alpha_2},\ \frac{\alpha_4 y_B - \alpha_2 y_D}{\alpha_4 - \alpha_2} \right) \quad O_3: \left( \frac{\alpha_3 x_C - \alpha_2 x_B}{\alpha_3 - \alpha_2},\ \frac{\alpha_3 y_B - \alpha_2 y_C}{\alpha_3 - \alpha_2} \right)
O_4: \left( \frac{\alpha_4 x_D - \alpha_3 x_C}{\alpha_4 - \alpha_3},\ \frac{\alpha_4 y_C - \alpha_3 y_D}{\alpha_4 - \alpha_3} \right) \quad O_5: \left( \frac{\alpha_3 x_C - \alpha_1 x_A}{\alpha_3 - \alpha_1},\ \frac{\alpha_3 y_A - \alpha_1 y_C}{\alpha_3 - \alpha_1} \right) \quad O_6: \left( \frac{\alpha_2 x_B - \alpha_1 x_A}{\alpha_2 - \alpha_1},\ \frac{\alpha_2 y_A - \alpha_1 y_B}{\alpha_2 - \alpha_1} \right)    (9)
Step 305: according to these intersection coordinates and the feature point coordinates of each region, the distance calculating unit 203 determines which of the screen regions pre-divided on the mobile terminal the airflow source position belongs to.
In this step, for convenience of calculation, the intersection coordinates can be expressed as the following position-estimation matrix of the reflection points of the incoming airflow on the screen:
[X, Y] = \begin{pmatrix} (x_{AB}, y_{AB}) & (x_{AC}, y_{AC}) & (x_{AD}, y_{AD}) \\ 0 & (x_{BC}, y_{BC}) & (x_{BD}, y_{BD}) \\ 0 & 0 & (x_{CD}, y_{CD}) \end{pmatrix}    (10)
where (x_{AB}, y_{AB}), (x_{AC}, y_{AC}), (x_{AD}, y_{AD}), (x_{BC}, y_{BC}), (x_{BD}, y_{BD}), (x_{CD}, y_{CD}) are the coordinates of the respective intersection points.
Then a matrix operation is performed between the position-estimation matrix of equation (10) and the feature vector [X_i, Y_i] (i = 1, ..., 6) formed by the feature point coordinates of each region, calculating the Euclidean distance (ED) between the intersection points and the corresponding feature points of each region; each region yields one ED value (a numeric sketch of this computation follows the flow below). Let the intersection set be [O_1, O_2, O_3, O_4, O_5, O_6] and the feature point set of region k (k = 1, ..., 24) be [P_1, P_2, P_3, P_4, P_5, P_6]; then:
ED_k = \sqrt{ \sum_{i=1}^{6} (O_i - P_i)^2 }    (11)
where O_i - P_i denotes the spatial distance between the two points O_i and P_i, which can be calculated from the coordinates of O_i and P_i.
Step 306: the position determining unit 204 selects the region corresponding to the minimum of all the ED distance values as the reflection region of the airflow on the screen (i.e. the airflow source position).
Step 307: the interface control unit 205 performs the interface control operation according to the determined airflow source position.
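Putting steps 302-306 together, the following is a minimal numeric sketch for the four-sensor, 24-region terminal of this embodiment: it solves the line equations of (8) pairwise to obtain the six intersections of (9), collects them as in (10), evaluates (11) against each region's feature points and takes the minimum as in step 306. The sensor coordinates, angle parameters and feature-point values are illustrative assumptions.

```python
# End-to-end sketch of steps 302-306 (all numeric values are illustrative assumptions).
import numpy as np
from itertools import combinations

# Assumed sensor positions (x, y) and perceived airflow angle parameters alpha of equation (8).
positions = {"A": (0.0, 160.0), "B": (120.0, 320.0), "C": (240.0, 160.0), "D": (120.0, 0.0)}
alphas    = {"A": -0.5, "B": 2.0, "C": 0.7, "D": -1.5}

def intersection(s1, s2):
    """Solve the pair of line equations alpha*x + y = alpha*x_s + y_s from (8); cf. the result (9)."""
    (x1, y1), a1 = positions[s1], alphas[s1]
    (x2, y2), a2 = positions[s2], alphas[s2]
    A = np.array([[a1, 1.0], [a2, 1.0]])
    b = np.array([a1 * x1 + y1, a2 * x2 + y2])
    return np.linalg.solve(A, b)                           # (x, y) of the intersection point

# Position-estimation set of equation (10): the six intersections O_1..O_6.
O = np.array([intersection(s1, s2) for s1, s2 in combinations("ABCD", 2)])

# Feature points [P_1..P_6] of each of the 24 regions (placeholder values standing in for unit 201).
P = np.random.default_rng(0).uniform([0, 0], [240, 320], size=(24, 6, 2))

# Equation (11): ED_k = sqrt(sum_i (O_i - P_i)^2); step 306 picks the region with the minimum value.
ED = np.sqrt(((P - O[None, :, :]) ** 2).sum(axis=(1, 2)))
reflection_region = int(np.argmin(ED))
```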
In step 305 of the above flow, other distance measures can also be used when calculating the distance between the position-estimation matrix and each region's feature vector; the embodiment of the invention selects the Euclidean distance as the preferred option.
In step 307 of the above flow, the interface control operations performed may include the following (a dispatch sketch is given after this list):
Icon selection: when a region of the interface is determined to be the airflow reflection region and the airflow acts continuously at this position for longer than a preset time threshold t_1, the icon in that region is selected;
Icon movement: after an icon is selected, if the airflow reflection position is judged to move smoothly from region i to region j, the selected icon is moved smoothly from region i to region j, where i, j = 1, 2, ..., N;
Screen page turning: when the determined airflow reflection region contains no selectable icon, and the airflow reflection region is then judged to move smoothly from the current region i to a region at the leftmost side of the screen, the user interface switches smoothly from the current page to the previous page; likewise, if the airflow reflection region is judged to move smoothly from the current region i to a region at the rightmost side of the screen, the user interface switches smoothly from the current page to the next page, where i = 1, 2, ..., N. A screen page-turning mode can equally be defined in which the user interface switches to the previous page when the airflow reflection region moves smoothly from the current region i to a region at the top of the screen, and to the next page when it moves smoothly to a region at the bottom of the screen.
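A minimal sketch of how the control rules of step 307 could be dispatched is given below; the time threshold, the edge-region sets and the user-interface callback names are assumptions introduced for illustration, not part of the patent.

```python
# Illustrative dispatcher for step 307; the threshold, edge sets and the ui object are assumptions.
T1 = 0.8                                                   # assumed dwell-time threshold t1, seconds
LEFT_EDGE  = {0, 4, 8, 12, 16, 20}                         # assumed leftmost regions of a 4 x 6 grid
RIGHT_EDGE = {3, 7, 11, 15, 19, 23}                        # assumed rightmost regions of the same grid

class InterfaceController:
    def __init__(self, ui):
        self.ui = ui                       # hypothetical object exposing icon_at/select/move/page calls
        self.current_region = None
        self.dwell = 0.0
        self.selected_icon = None

    def on_reflection(self, region, dt):
        """Called each sensing cycle (duration dt) with the located airflow reflection region."""
        if region == self.current_region:
            self.dwell += dt
            if self.dwell > T1 and self.selected_icon is None and self.ui.icon_at(region):
                self.selected_icon = self.ui.select_icon(region)      # icon selection
            return
        prev, self.current_region, self.dwell = self.current_region, region, 0.0
        if self.selected_icon is not None:
            self.ui.move_icon(self.selected_icon, region)             # icon movement from i to j
        elif prev is not None and not self.ui.icon_at(region):
            if region in LEFT_EDGE:                                   # smooth slide to the left edge
                self.ui.page_previous()
            elif region in RIGHT_EDGE:                                # smooth slide to the right edge
                self.ui.page_next()
```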
Since, in the actual environment in which the device is used, interference from unrelated or noisy airflow may be received, the technical solution provided by the embodiments of the invention can be further improved: on the basis of the above embodiment, an interference-signal filtering operation is added, so that after the intersection determining unit 202 receives the signals sent by the sensors it performs filtering and similar processing to remove the interference signals produced by disturbing airflow.
In the embodiments of the invention, the feature point coordinates of each region can be stored in the terminal device in advance, or the feature points of each region can be trained gradually during use of the terminal device. Preferably, before the terminal device leaves the factory, C-means clustering is performed on N >= 500 samples in the training set of each region, thereby obtaining the feature point set of each region.
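As an illustration of how such per-region feature points could be obtained before the device leaves the factory, the sketch below clusters 500 training samples of one region into six cluster centres; scikit-learn's KMeans stands in for the C-means clustering named above, and the training samples are synthetic assumptions.

```python
# Hedged sketch: derive one region's six feature points by clustering >= 500 training samples.
# KMeans stands in for the C-means clustering mentioned in the text; the data is synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
training_samples = rng.normal(loc=[60.0, 80.0], scale=12.0, size=(500, 2))   # assumed intersections

kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(training_samples)
region_feature_points = kmeans.cluster_centers_      # the [X_i, Y_i] (i = 1..6) stored in unit 201
```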
In summary, in the embodiments of the invention, airflow-direction sensors are used to sense the direction of the airflow reflected by the terminal screen; from the airflow direction sensed by each sensor, the airflow reflection area enclosed by a plurality of intersection points is estimated, the ED distance between these intersection points and the feature points of each screen region is then calculated, the region containing the airflow reflection area is located, and the position of the airflow reflection point is thereby determined. It can be seen that the embodiments of the invention place low demands on device memory and require little computation, so they can be used not only on devices with large memory, such as computers, but also on devices with little memory, such as mobile terminals, to realize interface control. Because the technical solution provided by the embodiments of the invention controls the interface through the action of the airflow itself, it is not constrained by other environmental factors such as sound, so it maintains accurate response and high sensitivity even in noisy environments, alleviating the prior-art problem that interface control by blowing is susceptible to noise interference.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to encompass these changes and modifications.

Claims (12)

1. A method of controlling and operating a user interface by using airflow, characterized in that a device screen is divided into a plurality of regions in advance and a plurality of sensors capable of sensing airflow direction are arranged around the screen, the method comprising the steps of:
receiving the direction, sensed by each sensor, of the airflow reflected by the screen, and determining, from the airflow direction sensed by each sensor, the straight-line path from each sensor to the position on the screen where the airflow is reflected, and the intersection points of these straight-line paths;
calculating the spatial distance between the intersection points and the feature points of each screen region, and taking the position on the screen of the screen region corresponding to the smallest spatial distance as the reflection position of the airflow on the screen;
controlling the user interface displayed on the screen according to the determined reflection position of the airflow on the screen.
2. The method according to claim 1, characterized in that, for each of said screen regions, calculating the spatial distance between the intersection points and the feature points of that screen region comprises:
calculating, from the intersection coordinates and the feature point coordinates of that screen region, the square of the spatial distance between each intersection point and its corresponding feature point;
accumulating the squares of the spatial distances between the intersection points and their corresponding feature points; and
taking the square root of the accumulated result to obtain the spatial distance between the intersection points and the feature points of that screen region.
3. The method according to claim 1, characterized in that operating the interface on the screen according to the determined reflection position of the airflow on the screen comprises:
if the reflection positions determined within a specified time are all the same position and there is a selectable icon at that position, selecting the icon at that position;
if the determined reflection position makes a smooth transition within a specified time from one screen region to a region at the edge of the screen and there is no selectable icon at the reflection position, performing an interface page-turning operation.
4. The method according to claim 3, characterized in that, after an icon is selected, if the subsequently determined reflection position makes a smooth transition from the screen region corresponding to the position of the icon to another screen region, the selected icon follows the movement of the reflection position.
5. The method according to any one of claims 1 to 4, characterized in that the number of feature points of each screen region is N(N-1)/2, where N is the number of said sensors.
6. The method according to any one of claims 1 to 4, characterized in that said sensors are arranged dispersedly around said screen.
7. A device for controlling and operating a user interface by using airflow, characterized by comprising:
a plurality of sensors capable of sensing airflow direction, arranged around the screen;
an intersection determining unit, configured to receive the direction, sensed by each sensor, of the airflow reflected by the screen, and to determine, from the airflow direction sensed by each sensor, the straight-line path from each sensor to the position on the screen where the airflow is reflected, and the intersection points of these straight-line paths;
a position determining unit, configured to calculate the spatial distance between the intersection points and the feature points of each screen region, and to take the position on the screen of the screen region corresponding to the smallest spatial distance as the reflection position of the airflow on the screen;
an interface control unit, configured to control the user interface displayed on the screen according to the determined reflection position of the airflow on the screen.
8. The device according to claim 7, characterized in that, for each of said screen regions, when calculating the spatial distance between the intersection points and the feature points of that screen region, said position determining unit:
calculates, from the intersection coordinates and the feature point coordinates of that screen region, the square of the spatial distance between each intersection point and its corresponding feature point;
accumulates the squares of the spatial distances between the intersection points and their corresponding feature points; and
takes the square root of the accumulated result to obtain the spatial distance between the intersection points and the feature points of that screen region.
9. The device according to claim 7, characterized in that the operations performed on the interface on the screen by said interface control unit comprise:
if the reflection positions determined within a specified time are all the same position and there is a selectable icon at that position, selecting the icon at that position;
if the determined reflection position makes a smooth transition within a specified time from one screen region to a region at the edge of the screen and there is no selectable icon at the reflection position, performing an interface page-turning operation.
10. The device according to claim 9, characterized in that, after said interface control unit selects an icon, if the subsequently determined reflection position makes a smooth transition from the screen region corresponding to the position of the icon to another screen region, the selected icon is made to follow the movement of the reflection position.
11. The device according to any one of claims 7 to 10, characterized by further comprising a storage unit configured to store the feature point coordinates of each screen region, wherein the number of feature points of each screen region is N(N-1)/2, where N is the number of said sensors.
12. The device according to any one of claims 7 to 10, characterized in that said sensors are arranged dispersedly around said screen.
CN2009100922836A 2009-09-08 2009-09-08 Method and equipment for operating interfaces by utilizing airflows Expired - Fee Related CN102012773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100922836A CN102012773B (en) 2009-09-08 2009-09-08 Method and equipment for operating interfaces by utilizing airflows

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009100922836A CN102012773B (en) 2009-09-08 2009-09-08 Method and equipment for operating interfaces by utilizing airflows

Publications (2)

Publication Number Publication Date
CN102012773A true CN102012773A (en) 2011-04-13
CN102012773B CN102012773B (en) 2012-06-06

Family

ID=43842957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100922836A Expired - Fee Related CN102012773B (en) 2009-09-08 2009-09-08 Method and equipment for operating interfaces by utilizing airflows

Country Status (1)

Country Link
CN (1) CN102012773B (en)

Cited By (8)

Publication number Priority date Publication date Assignee Title
CN103383594A (en) * 2012-05-04 2013-11-06 富泰华工业(深圳)有限公司 Electronic equipment and control method thereof
CN103780738A (en) * 2012-10-17 2014-05-07 腾讯科技(深圳)有限公司 Mobile terminal image processing method and mobile terminal
CN104049869A (en) * 2013-03-14 2014-09-17 联想(北京)有限公司 Data processing method and data processing device
CN104461527A (en) * 2014-11-27 2015-03-25 深圳市中兴移动通信有限公司 Processing method and device for object in terminal
CN104571523A (en) * 2015-01-22 2015-04-29 合肥联宝信息技术有限公司 Control method and device of mobile terminal
CN105373314A (en) * 2015-06-22 2016-03-02 王宇 Target object control method and apparatus
CN105892670A (en) * 2016-04-20 2016-08-24 上海斐讯数据通信技术有限公司 Page turning method for smart terminal and smart terminal
WO2022161382A1 (en) * 2021-01-28 2022-08-04 维沃移动通信有限公司 Control method and apparatus, and electronic device

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN1374575A (en) * 2002-02-28 2002-10-16 刘本林 Port mark system for man-computer interface
CN101382571A (en) * 2007-09-07 2009-03-11 宸鸿光电科技股份有限公司 Adjustable non-contact type touch pad detecting method

Cited By (13)

Publication number Priority date Publication date Assignee Title
CN103383594A (en) * 2012-05-04 2013-11-06 富泰华工业(深圳)有限公司 Electronic equipment and control method thereof
CN103383594B (en) * 2012-05-04 2018-01-05 富泰华工业(深圳)有限公司 Electronic equipment and its control method
CN103780738A (en) * 2012-10-17 2014-05-07 腾讯科技(深圳)有限公司 Mobile terminal image processing method and mobile terminal
US9977651B2 (en) 2012-10-17 2018-05-22 Tencent Technology (Shenzhen) Company Limited Mobile terminal and image processing method thereof
CN104049869B (en) * 2013-03-14 2017-09-29 联想(北京)有限公司 A kind of data processing method and device
CN104049869A (en) * 2013-03-14 2014-09-17 联想(北京)有限公司 Data processing method and data processing device
CN104461527A (en) * 2014-11-27 2015-03-25 深圳市中兴移动通信有限公司 Processing method and device for object in terminal
CN104571523B (en) * 2015-01-22 2018-03-20 合肥联宝信息技术有限公司 A kind of control method and device of mobile terminal
CN104571523A (en) * 2015-01-22 2015-04-29 合肥联宝信息技术有限公司 Control method and device of mobile terminal
CN105373314A (en) * 2015-06-22 2016-03-02 王宇 Target object control method and apparatus
CN105892670A (en) * 2016-04-20 2016-08-24 上海斐讯数据通信技术有限公司 Page turning method for smart terminal and smart terminal
CN105892670B (en) * 2016-04-20 2019-11-29 上海斐讯数据通信技术有限公司 A kind of page turning method and intelligent terminal of intelligent terminal
WO2022161382A1 (en) * 2021-01-28 2022-08-04 维沃移动通信有限公司 Control method and apparatus, and electronic device

Also Published As

Publication number Publication date
CN102012773B (en) 2012-06-06

Similar Documents

Publication Publication Date Title
CN102012773B (en) Method and equipment for operating interfaces by utilizing airflows
US10481789B2 (en) Method for providing a graphical user interface for an electronic transaction with a handheld touch screen device
EP2359212B1 (en) Interface adaptation system
US9552068B2 (en) Input device with hand posture control
EP2708996A1 (en) Display device, user interface method, and program
EP2309370A2 (en) Information processing apparatus, information processing method, and information processing program
US20110018825A1 (en) Sensing a type of action used to operate a touch panel
JP4376198B2 (en) Cursor moving device
CN101349956A (en) Method and apparatus for executing pattern touch order
KR20120028944A (en) Optical capacitive thumb control with pressure sensor
US10345912B2 (en) Control method, control device, display device and electronic device
CN106605202A (en) Handedness detection from touch input
CN103713809A (en) Dynamic generating method and dynamic generating device for annular menu of touch screen
CN103019518B (en) A kind of method of automatic adjustment human-computer interaction interface
CN105074626B (en) Detection and response to extras touch event
KR101770309B1 (en) Determining input received via tactile input device
CN102736838A (en) Method and device for identifying multi-point rotation motion
CN105487689A (en) Ring mouse and method for operating mobile terminal through same
US10599282B2 (en) Cursor control for a visual user interface
KR20170067669A (en) Method and apparatus for predicting touch location of electronic device
CN102360263A (en) Method implemented by taking three-dimensional moving track as input and mobile terminal
KR102219908B1 (en) DISPLAY SYSTEM WITH concurrent mult-mode control MECHANISM AND METHOD OF OPERATION THEREOF
CN103257724B (en) A kind of non-contact type mouse and method of operating thereof
CN104216560B (en) Mobile device and realize the system of the aerial touch-control of mobile device, control device
CN102955601B (en) 3D sensing method and system of touch panel

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120606

Termination date: 20210908

CF01 Termination of patent right due to non-payment of annual fee