WO2010029491A2 - Display apparatus for processing touch events - Google Patents

Display apparatus for processing touch events

Info

Publication number
WO2010029491A2
Authority
WO
WIPO (PCT)
Prior art keywords
positions
movement
objects
multiple groups
screen device
Application number
PCT/IB2009/053901
Other languages
French (fr)
Other versions
WO2010029491A3 (en)
Inventor
Gerrit Hollemans
Original Assignee
Koninklijke Philips Electronics N.V.
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2010029491A2 publication Critical patent/WO2010029491A2/en
Publication of WO2010029491A3 publication Critical patent/WO2010029491A3/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the invention relates to a display apparatus comprising a touch screen device and a controller, to an electronic device comprising the display apparatus and to a method of processing touch events.
  • a first aspect of the invention provides a display apparatus as claimed in claim 1.
  • a second aspect of the invention provides an electronic device as claimed in claim 10.
  • a third aspect of the invention provides a method as claimed in claim 11.
  • Advantageous embodiments are defined in the dependent claims.
  • a display apparatus in accordance with the first aspect of the invention comprises a touch screen device and a controller.
  • the touch screen device receives multiple touch events as a result of objects that touch the touch screen device.
  • the objects touch the touch screen device at respective positions.
  • the objects are grouped into multiple groups of objects which belong together.
  • the controller detects movement of the positions of the touch events.
  • the controller determines whether the movement of a subset of the positions belongs to a pre-defined gesture.
  • the pre-defined gesture indicates that the objects associated with the subset of positions belong to one of the multiple groups. If the movement of the subset of positions belongs to the pre-defined gesture, the controller assigns the touch events whose movement is equal to or resembles the pre-defined gesture to the associated one of the multiple groups.
  • the display apparatus determines whether the movements detected belong to the pre-defined gesture. As a result of this determination, the display apparatus associates the touch events for which the pre-defined gesture was detected with the associated one of the multiple groups. This solves the problem of the state of the art: by moving the objects of one of the multiple groups on the touch screen device in accordance with the pre-defined gesture, the complete set of touch events is subdivided into a subset that is associated with the one of the multiple groups, detected from the movement of the objects of this group, and a subset that is associated with all other groups.
  • the display apparatus is able to distinguish the touch events in the associated one of the multiple groups from the other touch events.
  • the display apparatus is able to distinguish the touch events of the objects of one of the multiple groups from the touch events of the other groups.
  • the pre-defined gesture may be different for different groups or may be the same for each group of objects (or for a subset of the groups) and may be performed sequentially or concurrently in time.
  • the electronic device according to the second aspect of the invention and the method according to the third aspect of the invention provide the user with the same benefits as the display apparatus in relation to distinguishing between touch events belonging to different groups.
  • the pre-defined gesture is one of the gestures of: rotating the objects of one of the multiple groups around a single point, translating the objects of one of the multiple groups, moving the objects of one of the multiple groups towards a single point or moving the objects of one of the multiple groups away from a single point.
  • the single point may be a single position on the touch screen device. Alternatively, the single point may be approximated by a set of closely spaced positions. If the objects start in a single common point, it suffices that the positions of the objects move away from each other during the gesture. If the objects end in a single point, it suffices that the positions move towards each other during the gesture.
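The determination of moving towards or away from a single point can be sketched as a simple geometric test. The following is a minimal illustration, not the patented implementation; the function names, the representation of each movement as a start/end coordinate pair, and the progress threshold are assumptions made for the example.

```python
import math

def moves_toward_point(paths, point, threshold=5.0):
    """Return True if every object's distance to `point` decreases
    by more than `threshold` between its start and end position.

    `paths` is a list of ((x0, y0), (x1, y1)) start/end pairs;
    `point` is the candidate single point (x, y).
    """
    px, py = point
    for (x0, y0), (x1, y1) in paths:
        d_start = math.hypot(x0 - px, y0 - py)
        d_end = math.hypot(x1 - px, y1 - py)
        if d_end >= d_start - threshold:
            return False  # this object did not move toward the point
    return True

def moves_away_from_point(paths, point, threshold=5.0):
    """Mirror test for the 'move away from a single point' gesture."""
    reversed_paths = [(end, start) for start, end in paths]
    return moves_toward_point(reversed_paths, point, threshold)
```

With this sketch, two fingers starting far from the origin and ending near it would satisfy `moves_toward_point`, while the reverse movement would satisfy `moves_away_from_point`.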
  • the gesture of rotating the objects of one of the multiple groups and the gesture of translating the objects of one of the multiple groups are intuitive because the objects of the one of the multiple groups follow the same type of movement.
  • in the gesture of rotation, the objects of the one of the multiple groups follow a part of an imaginary circle, and all circles followed by all objects share a single point.
  • in the gesture of translation, all the objects of the one of the multiple groups follow a part of an imaginary line in the same direction.
  • the lines run parallel and all objects follow the lines over the same distance.
  • the gestures of moving the objects towards the single point or moving them away from the single point are intuitive because the end position and the start position of the respective gestures are at, or close to, the single point. This closeness indicates that the objects belong together.
  • the determination whether the movement of the subset of positions belongs to the pre-defined gesture may involve a tolerance mechanism.
  • the tolerance may be that the objects do not exactly follow a part of the circle or that the centres of the circles do not exactly overlap in the single point.
  • the tolerance may be that the lines followed by the objects are not exactly parallel and that the lines are followed over slightly different distances.
  • the tolerance may be that the objects move to or away from multiple points that are located within a tolerance distance from each other.
  • the objects which are grouped into multiple groups are fingers of multiple hands and the pre-defined gesture is a movement of the hand.
  • fingers of multiple hands touch the touch screen device.
  • the fingers of the hands move over the touch screen device.
  • One of the multiple hands may apply the pre-defined gesture to the touch screen device.
  • the display apparatus detects whether the movement of the subset of fingers is in accordance with the pre-defined gesture and assigns the touch events that are associated with the subset of fingers to the associated one of the multiple groups, wherein a group is formed by the one of the multiple hands. Therefore, the display apparatus according to this embodiment is able to receive input from multiple hands and the display apparatus is able to distinguish between the multiple hands.
  • a user can use two, three, four, or five fingers per hand to touch the touch screen device.
  • the objects which are grouped into multiple groups are fingers of multiple users, and the pre-defined gesture is a movement of one hand or two hands of one of the users. Multiple users touch the touch screen device with the fingers of one of their hands. The fingers of one hand may apply the pre-defined gesture to the touch screen device.
  • the display apparatus is able to distinguish between different users because the touch events for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture are associated with one of the multiple users.
  • the multiple users can use two, three, four, or five fingers of the hand to touch the touch screen device.
  • the user applies the pre-defined gesture to the touch screen device with one hand.
  • the pre-defined gesture is applied by the user with two hands.
  • the controller is further constructed for determining whether the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to a further pre-defined gesture for initiating a subsequent action.
  • Objects of one of the multiple groups have to identify themselves with the pre-defined gesture when they start touching the touch screen device. The identification is used to determine which subset of touch events form a group. Subsequently, the objects of the group apply the further pre-defined gesture that is a command for the display apparatus to initiate the subsequent action. Because the objects of one of the multiple groups have to identify themselves before the display apparatus is able to determine the further pre-defined gesture, the display apparatus will not mix the touch events that belong to different groups.
  • a further pre-defined gesture applied by one group will not result in an unexpected determination based on touch events belonging to different groups. Therefore, the display apparatus is able to perform the respective commands correctly even when objects of different groups apply further pre-defined gestures.
  • the subsequent action may be executed by the controller or, in the case that the display apparatus is part of an electronic device, by another part of the electronic device. Initiating the subsequent action may be done by sending a trigger signal to the other part of the electronic device. Examples of subsequent actions are changing an image that is displayed on the touch screen device, or another action that is related to other functions of the electronic device, like, for example, starting to play music.
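The identify-then-command flow described above can be sketched as a small dispatch routine: a set of touch events must first register itself as a group with the identification gesture before any further gesture from that set triggers an action. The structure, names, and status strings below are illustrative assumptions, not the claimed implementation.

```python
def process_gesture(groups, touch_ids, gesture, actions):
    """Two-phase handling: an 'identify' gesture registers the ids as
    a group; any other recognized gesture triggers its action only if
    the ids already form a known group.

    `groups` is a list of previously identified groups (sets of ids);
    `actions` maps gesture names to callables.
    Returns a short status string for illustration.
    """
    ids = set(touch_ids)
    if gesture == "identify":
        groups.append(ids)
        return "group registered"
    if ids in groups and gesture in actions:
        actions[gesture](ids)
        return "action executed"
    # Commands from touch events that never identified themselves are
    # ignored, so gestures of different groups are never mixed.
    return "ignored"
```

A rotate command issued before identification falls through to "ignored"; after the group registers itself, the same command executes its action.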
  • the touch screen device displays an image.
  • the controller generates the image such that a feedback indication is displayed at the positions of the touch events.
  • Feedback improves user experience because it makes the user aware of the behaviour of the display apparatus.
  • the display apparatus assigns a position to the location where the touch of the object on the touch screen device is detected.
  • the position assigned by the display apparatus is not by definition exactly the same position as assumed by the user. This difference between assumed and assigned position may be, for example, the result of parallax. Another discrepancy between the user assumption and the behaviour of the display apparatus may be the number of detected touch events.
  • the feedback shows the user whether the objects touch the touch screen device with sufficient pressure.
  • the touch screen device displays an image.
  • the controller generates the image displaying a feedback indication in the case that it is determined that the movement of the subset of positions belongs to a pre-defined gesture.
  • the feedback improves the user experience as well.
  • the display apparatus signals to the user that the objects of one of the multiple groups moved on the touch screen device in accordance with the pre-defined gesture. The user knows, based on this feedback, that moving the objects on the touch screen device was successful.
  • the user knows that the display apparatus was not yet capable of determining whether the movement of the subset of positions belongs to the pre-defined gesture. Thus, the user has to continue or restart the movement of the objects on the touch screen device.
  • the feedback indication comprises an indication at the positions of the touch events that are assigned to the associated one of the multiple groups.
  • This feedback helps the user to understand which of the objects are associated with one of the multiple groups.
  • the feedback signals to the user which objects moved on the touch screen device in accordance with the pre-defined gesture. It helps the user to decide whether the association of the touch events with one of the multiple groups matches his intentions. It is possible that one touch event was not associated with the group because the movement of the object associated with this touch event was not exactly according to the pre-defined gesture. If the intention of the user was to identify this object also as a member of the group, the user knows that applying the pre-defined gesture with the objects was not completely successful and that he has to apply the pre-defined gesture all over again.
  • the controller comprises a gesture storage unit for storing the pre-defined gesture, a movement detection unit for detecting movement of positions of the touch events on the touch screen device, a determination unit for determining whether the movement of the subset of positions belongs to a pre-defined gesture, and an assignment unit for assigning the touch events, for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture, to the associated one of the multiple groups.
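The division of the controller into units might be sketched as follows. The movement detection unit is omitted because it is hardware-facing; representing each stored gesture as a predicate over movement paths is an assumption of this sketch, and all class and method names are illustrative.

```python
class GestureStorageUnit:
    """Stores pre-defined gestures as (name, predicate) pairs.
    A predicate takes a list of ((x0, y0), (x1, y1)) movement paths
    and returns True if they match the gesture."""
    def __init__(self):
        self._gestures = {}

    def store(self, name, predicate):
        self._gestures[name] = predicate

    def gestures(self):
        return self._gestures.items()


class DeterminationUnit:
    """Checks a subset of movements against every stored gesture."""
    def __init__(self, storage):
        self._storage = storage

    def matching_gesture(self, paths):
        for name, predicate in self._storage.gestures():
            if predicate(paths):
                return name
        return None


class AssignmentUnit:
    """Assigns touch-event ids to the group identified by a gesture."""
    def __init__(self):
        self.groups = {}

    def assign(self, group_name, touch_ids):
        self.groups.setdefault(group_name, set()).update(touch_ids)
```

Storing gestures as predicates also matches the rewritable-memory embodiment mentioned below: a customer-specific gesture is simply another entry stored in the gesture storage unit.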
  • the units of this embodiment perform a specific task. Specialized hardware may be used to perform the specific tasks of the units mentioned hereinabove. Alternatively, a processor may be used which performs all or a subset of the mentioned tasks.
  • the gesture storage unit comprises a rewritable memory.
  • an opportunity is created to customize the display apparatus by storing a customer-specific pre-defined gesture in the memory of the gesture storage unit.
  • Fig. 1 schematically shows a display apparatus, which receives input from two hands,
  • Fig. 2 schematically shows a display apparatus, which receives input from different groups of objects
  • Fig. 3 schematically shows a display apparatus which receives a pre-defined gesture from a hand, receives input from two groups of styli, and displays different feedback indications,
  • Fig. 4A schematically shows a table that contains an electronic device to edit and view pictures
  • Fig. 4B schematically shows an enlargement of a pre-defined gesture and a further pre-defined gesture applied by a user to the electronic device of fig. 4A
  • Fig. 5 shows a block diagram of the display apparatus
  • Fig. 6 shows a flowchart of the method according to the invention.
  • a first embodiment is shown in Fig. 1.
  • the figure shows a display apparatus 100 comprising a controller 101 and a touch screen device 106.
  • the controller 101 comprises a gesture storage unit 102, a movement detection unit 103, a determination unit 104 and an assignment unit 105.
  • Fingers of a first hand 110 and a second hand 120 are touching the touch screen device 106. Only two fingers 111 and 112 of the first hand 110 touch the touch screen device at the positions indicated by A and B, respectively. All five fingers of the second hand 120 touch the touch screen device at the positions indicated by K, L, M, N and O, respectively.
  • a pre-defined gesture is stored in the gesture storage unit 102.
  • the pre-defined gesture indicates that the fingers that apply the gesture belong to one of the multiple hands.
  • the pre-defined gesture is defined by a movement of the fingers of the hand over the touch screen device 106 towards a single point.
  • the definition of the gesture includes that the number of fingers moving towards a single point may be two, three, four, or five.
  • the touch screen device 106 receives multiple touch events. In the example shown, seven touch events are received, at positions where the fingers touch the touch screen device 106.
  • the two hands 110, 120 in Fig. 1 may be of different users or of one user.
  • the movement detection unit 103 detects movement of the positions A, B; K, L, M, N and O.
  • the information of the movement of positions is transferred to the determination unit 104.
  • the determination unit 104 receives the information of the pre-defined gesture from the gesture storage unit 102. Based on this information, the determination unit 104 determines whether the movement of a subset of the positions matches with the definition of the pre-defined gesture.
  • fingers 111 and 112 touch the touch screen device 106 initially at locations A and B, respectively. Both fingers 111 and 112 move in the direction of point C on the touch screen device 106.
  • the movement detection unit 103 detects this movement, and the determination unit 104 determines whether the movement belongs to the pre-defined gesture. This movement of the two fingers 111 and 112 towards a common point resembles the pre-defined gesture, and thus the assignment unit 105 determines that the fingers 111 and 112 belong to a first group, associated with the first hand 110. Also the five fingers of the second hand 120 move on the touch screen device 106.
  • the thumb touches the touch screen device 106 initially at position K and moves in the direction of point P.
  • the other four fingers touch the touch screen device 106 initially at positions L, M, N and O, respectively, and also move in the direction of point P.
  • the determination unit 104 determines that the movement of the positions of the five fingers matches with the pre-defined gesture because the movement of the five fingers is towards a single point, which is point P in the example, and thus, the assignment unit 105 concludes that the five fingers belong to a second group, associated with the second hand 120.
  • the assignment unit 105 assigns the touch events for which this was determined to the associated one of the multiple hands.
  • the touch events that were the result of the fingers 111 and 112 touching the touch screen device 106 are associated with the hand 110.
  • the other five touch events, which were the result of the fingers of the second hand 120 touching the touch screen device 106 and moving on the touch screen device 106, are associated with the hand 120.
  • both hands 110 and 120 identify their fingers as being a member of one of the multiple groups. After the identification of which fingers belong to which hand, the display apparatus is able to distinguish between the touch events that are the result of the fingers of the first hand 110 and of the fingers of the second hand 120.
  • the determination whether the movement of the subset of positions belongs to the pre-defined gesture may include a tolerance mechanism.
  • the tolerance may be that the multiple fingers do not move exactly towards one single point P, but move towards an area of limited size, for example an area formed by a circle with P as its centre and with a radius suitably selected such that when the fingers of the second hand 120 touch each other near the point P they are within the area.
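The tolerance area around point P can be expressed as a radius test on the final touch positions. A minimal sketch, in which the function name and the radius value are illustrative assumptions:

```python
import math

def converge_within_area(end_positions, point, radius):
    """True if all final touch positions lie inside the circle of the
    given radius around `point` -- the tolerance area of the gesture."""
    px, py = point
    return all(math.hypot(x - px, y - py) <= radius
               for x, y in end_positions)
```

The radius would be chosen, as described above, so that fingers of one hand touching each other near P all fall inside the area.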
  • a further embodiment is shown in Fig. 2.
  • the display apparatus 100 comprises the touch screen device 106 and the controller 101.
  • the controller 101 comprises the gesture storage unit 102, the movement detection unit 103, the determination unit 104 and the assignment unit 105.
  • Four styli 210, 211, 220 and 221 touch the touch screen device 106 at positions F, G, E and D, respectively.
  • the styli 210 and 211 form a first group and the styli 220 and 221 form a second group.
  • the display apparatus of this embodiment has two pre-defined gestures stored in the gesture storage unit 102.
  • the first pre-defined gesture defines that if the movement of the positions of two or more objects is a translation in one direction along parallel lines, the touch events that are the result of the two or more objects touching the touch screen device belong to one group.
  • the second pre-defined gesture defines that if the movement of positions of two or more objects is a rotation around a single point, the touch events that are associated with the two or more objects belong to one group.
  • the styli 210 and 211 initially touch the touch screen device 106 at the positions F and G, respectively. Because of these two touches, the touch screen device 106 receives two touch events with an associated position.
  • the movement detection unit 103 detects the movement of the positions F and G.
  • the stylus 210 moves over the touch screen device 106 following an imaginary circle from position F to F' and a point H is the centre of the imaginary circle.
  • the stylus 211 moves over the touch screen device 106 following a further imaginary circle from position G to G' and the point H is the center of the further imaginary circle.
  • the determination unit 104 determines whether the movement of a first subset of positions corresponds to one of the pre-defined gestures.
  • the movement of the styli 210 and 211 follows a pattern according to one of the two pre-defined gestures introduced earlier.
  • the determination unit 104 concludes that the movement of the first subset of positions belongs to one of the pre-defined gestures.
  • the assignment unit 105 assigns the two touch events that are the result of styli 210 and 211 touching the touch screen device 106 to one group.
  • Styli 220 and 221 are initially touching the touch screen device 106 at positions E and D, respectively. These two touches result in the additional reception of two touch events by the touch screen device 106 with respective associated positions.
  • Stylus 220 moves over the touch screen device 106 from position E towards E' following a straight line.
  • Stylus 221 moves over the touch screen device 106 from D towards D' following another straight line.
  • the line from D to D' and the line from E to E' run in parallel. Therefore, the determination unit 104 concludes that the movement of a second subset of positions of the touch events, which movement is the result of styli 220 and 221 touching the touch screen device 106, belongs to one of the two pre-defined gestures. Subsequently, the assignment unit 105 assigns the touch events that are associated with the styli 220 and 221 to one group.
  • the display apparatus 100 is able to subdivide the set of touch events into subsets because the styli 210 and 211 of a first group applied one of the pre-defined gestures to the touch screen device 106 and because the styli 220 and 221 of a second group applied one of the pre-defined gestures to the touch screen device 106.
  • determining whether the movement of the subset of positions belongs to the pre-defined gesture may involve a tolerance mechanism.
  • the tolerance may be that the centers of the imaginary circles are not exactly one single point, but are within a small delta distance from one single point. Another tolerance may be that the objects do not follow a perfect imaginary circle.
  • the tolerance may be that the lines followed by the objects are not perfectly parallel or not perfectly straight.
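The rotation gesture of Fig. 2, including the tolerances just described, can be sketched by estimating the common centre from the perpendicular bisectors of two chords (for a rotation, both start and end of each movement lie on a circle around the centre) and then checking that every object keeps a constant radius around that centre. This is an illustrative reconstruction, not the patented method; names and the tolerance value are assumptions.

```python
import math

def _bisector(path):
    """Perpendicular bisector of the chord from start to end:
    returns (midpoint, direction)."""
    (x0, y0), (x1, y1) = path
    mid = ((x0 + x1) / 2, (y0 + y1) / 2)
    # Direction perpendicular to the chord.
    return mid, (-(y1 - y0), x1 - x0)

def estimate_center(path_a, path_b):
    """Intersect the perpendicular bisectors of two chords; for a
    common rotation this intersection is the rotation centre."""
    (mx1, my1), (dx1, dy1) = _bisector(path_a)
    (mx2, my2), (dx2, dy2) = _bisector(path_b)
    det = dx1 * (-dy2) - (-dx2) * dy1
    if abs(det) < 1e-9:
        return None  # bisectors are parallel: no single centre
    # Solve m1 + t*d1 == m2 + s*d2 for t.
    t = ((mx2 - mx1) * (-dy2) + (my2 - my1) * dx2) / det
    return (mx1 + t * dx1, my1 + t * dy1)

def is_rotation(paths, tolerance=5.0):
    """True if all chords are consistent with a rotation around the
    centre estimated from the first two paths."""
    center = estimate_center(paths[0], paths[1])
    if center is None:
        return False
    cx, cy = center
    for (x0, y0), (x1, y1) in paths:
        r0 = math.hypot(x0 - cx, y0 - cy)
        r1 = math.hypot(x1 - cx, y1 - cy)
        if abs(r0 - r1) > tolerance:  # radius not preserved
            return False
    return True
```

Purely translational movements make the bisectors parallel, so `estimate_center` returns no centre and the rotation test fails, which keeps the two gestures of this embodiment distinguishable.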
  • more than the two pre-defined gestures may be stored in the gesture storage unit 102. For example, the gestures discussed with respect to Fig. 1 may also be stored, such that the display apparatus is able to detect both groups of fingers belonging to a same hand and groups of styli belonging together.
  • alternative movements may be stored for one or more pre-defined gestures. For example, several pre-defined gestures may be stored to detect a sloppy rotational movement of objects which belong to a same group.
  • FIG. 3 shows another embodiment.
  • a display apparatus 300 comprises a touch screen device 306 and a controller 301.
  • a hand 310 touches the touch screen device 306 with two fingers 311 and 312 at positions close to point Q.
  • the figure shows also two styli 320 and 321 that touch the touch screen device 306 at locations U and V, respectively.
  • Two other styli 331 and 332 touch the touch screen device 306.
  • Feedback indications 322, 323 and 333 are displayed at the touch screen device 306.
  • the touch screen device 306 receives touch events at respective positions as the result of fingers 311 and 312 and as the result of styli 320, 321, 331 and 332.
  • the controller 301 receives the touch events from the touch screen device 306 and detects movement of positions of the touch events on the touch screen device 306.
  • the controller 301 determines whether the movement of a subset of positions belongs to the pre-defined gesture.
  • the pre-defined gesture is defined by a movement of positions of two or more objects of one group away from a single point.
  • the fingers 311 and 312 initially touch the touch screen device 306 at positions close to point Q. Finger 311 moves in the direction of position R. Finger 312 moves in the direction of position S. This movement of positions matches the pre-defined gesture because the objects move away from the single point.
  • the controller 301 assigns the touch events that correspond to the touches of the fingers 311 and 312 to a first group.
  • the styli 320 and 321 are shown in Fig. 3 at locations on the touch screen device 306 after they have moved from positions close to point T towards positions U and V, respectively.
  • the movement of the styli 320 and 321 did correspond to the pre-defined gesture of moving away from a single point.
  • the touch events of the styli 320 and 321 were assigned by the controller 301 to a second group.
  • the touch screen device 306 of display apparatus 300 displays an image.
  • the controller 301 generates the image.
  • the image displays feedback indications 322, 323 and 333.
  • Displayed ellipse 323 is the feedback indication showing that the controller 301 has determined that the movement of the styli 320 and 321 belongs to the pre-defined gesture.
  • the ellipse 323 is drawn around the locations where styli 320 and 321 touch the touch screen device 306.
  • the feedback indication 323 is displayed immediately after the determination that the movement of the subset of positions belongs to the pre-defined gesture. It should be noted that other forms of feedback indications may be displayed as well. Other examples are: other geometrical shapes around the positions of the touch events that are assigned to the second group; changing the foreground or background color in an area related to the positions of the touch events in the second group; changing the foreground or background color of the whole image for a short time just after the determination that the movement of positions belongs to the pre-defined gesture; or displaying at a location of the touch screen device 306 a text indicating that the pre-defined gesture has been determined.
  • Feedback indications 322 are displayed on the touch screen device 306 at the locations of the touch events that are assigned to the second group.
  • the indications 322 of the example of Fig. 3 are blinking circles around the positions where styli 320 and 321 touch the touch screen device 306.
  • Other feedback indications which show the user of the display apparatus 300 which touch events are assigned to the group are specific geometrical shapes displayed in a specific color at the positions of the touch events of a group. The geometrical shapes may blink. For the convenience of the user, different geometrical shapes of different colors may be used for different groups.
  • Another example of a feedback indication which shows which touch events belong to the group is connecting the positions of the touch events with lines.
  • the feedback indications 333 are an indication of the positions where the touch event is received by the touch screen device 306. At the positions where the styli 331 and 332 touch the touch screen device 306 the feedback indication 333 is an "X". Other examples of feedback indications at the location where the touch event is received are a cursor, a pointer, an arrow, a small circle, a local change in background color or e.g. a local change in foreground color.
  • the feedback indications 333 are different from the feedback indications 322.
  • the feedback indications 333 are displayed at locations of touch events that are not associated with one of the multiple groups, and the feedback indications 322 are displayed at locations of touch events that are associated with one of the multiple groups.
  • Fig. 4A shows a table 400 which comprises an electronic device to view and edit pictures.
  • the electronic device comprises the display apparatus in accordance with the invention.
  • the figure shows the touch screen device 406 of the display apparatus. Other parts of the electronic device are assumed to be built in the table, including the controller of the display apparatus.
  • the touch screen device 406 shows several small pictures 410 to the users (users are not drawn in the figure) of the table. A first large picture 411 is shown to a first user and a second large picture 412 is shown to a second user.
  • the users of the table 400 have to identify the fingers of their hands by applying a pre-defined gesture to the touch screen device 406.
  • the pre-defined gesture of the embodiment of figure 4 is the gesture of moving with the fingers of the hand away from a single point. Subsequently, the pre-defined gesture may be followed by a further pre-defined gesture for initiating a subsequent action.
  • the further pre-defined gesture is rotating with the fingers of the hand around a single point. The subsequent action is rotating the picture.
  • in Fig. 4A the sequence of the pre-defined gesture and the further pre-defined gesture is shown in the area of the second large picture 412. This part of Fig. 4A has been enlarged in Fig. 4B.
  • the second user initially touches the touch screen device 406 with three fingers of one hand at the positions W, X and Y, respectively.
  • the touches of the fingers result in three touch events.
  • the fingers move over the touch screen device 406 in the direction of W', X' and Y', respectively. All three fingers move away from the single point Z.
  • the controller detects the movement of positions and the controller determines that the movement of positions of the three fingers corresponds to the pre-defined gesture and assigns the three touch events to one group that is associated with the hand of the user.
  • the touch screen device 406 displays an image with a feedback indication that the pre-defined gesture is recognized in the movement of the fingers.
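The check that the fingers move away from the single point Z can be sketched as follows. This is an illustrative sketch only, not the implementation claimed in the patent; the function name, the representation of each finger as a list of sampled positions, and the choice of Z as the centroid of the start positions are all assumptions.

```python
import math

def moves_away_from_point(tracks, tolerance=1e-6):
    """Check whether every track (a list of (x, y) positions for one
    finger) moves away from a common single point."""
    # Hypothetical choice: take the single point Z as the centroid of
    # the start positions of all tracks.
    zx = sum(t[0][0] for t in tracks) / len(tracks)
    zy = sum(t[0][1] for t in tracks) / len(tracks)

    def dist(point):
        return math.hypot(point[0] - zx, point[1] - zy)

    # Every finger must end farther away from Z than it started.
    return all(dist(t[-1]) > dist(t[0]) + tolerance for t in tracks)
```

A real controller would evaluate such a test repeatedly on the positions detected so far, so the feedback indication can be shown as soon as the gesture is recognized.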
  • the controller of the display apparatus is further constructed to determine whether the movement of the subset of positions of the touch events in the group belongs to a further pre-defined gesture.
  • the second user continues the movement of the fingers on the touch screen device 406 by moving the respective fingers from W' to W", from X' to X" and from Y' to Y".
  • This movement of positions of the three fingers matches with the further pre-defined gesture because the movement of positions of the three fingers is the rotation of the fingers of the hand around the single point Z.
  • the controller initiates the subsequent action. Because the further pre-defined gesture was applied to the touch screen device 406 inside the area of the second large picture 412, the initiated subsequent action is rotating the picture.
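The further pre-defined gesture of rotating the fingers around the single point Z could be recognized along these lines. The sketch is hypothetical: the angle-based test, the minimum rotation angle and all names are assumptions, not the patented method.

```python
import math

def rotates_around_point(tracks, centre, min_angle=0.1):
    """Check whether all tracks (lists of (x, y) positions) rotate
    around `centre` in the same direction by at least min_angle rad."""
    def angle(point):
        return math.atan2(point[1] - centre[1], point[0] - centre[0])

    deltas = []
    for track in tracks:
        d = angle(track[-1]) - angle(track[0])
        # Wrap the angular change into the interval [-pi, pi).
        d = (d + math.pi) % (2 * math.pi) - math.pi
        deltas.append(d)

    # All fingers must rotate in the same direction.
    return all(d > min_angle for d in deltas) or all(d < -min_angle for d in deltas)
```

In the example of Fig. 4B, the centre would be the single point Z detected during the identification gesture.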
  • the controller rotates the picture on the touch screen device 406, and in an alternative embodiment the controller sends an interrupt signal to another part of the electronic device, which executes the action of rotating the picture.
  • the first picture 411 is shown to the first user. The first user can initiate actions related to the first picture 411 by applying the pre-defined gesture to identify the fingers of his hand followed by the further pre-defined gesture. If the first user applies these gestures inside the area of the first picture 411, the subsequent action will be related to the first picture 411.
  • the electronic device to view and edit pictures is able to distinguish between the touch events of the first and the second user. Therefore, two or more people can sit around table 400 and concurrently use the electronic device to view and edit pictures.
  • the further pre-defined gesture is not limited to the rotation of the fingers of the hand around the single point and that the subsequent action is not limited to rotating the first picture 411 or the second picture 412.
  • Examples of other further pre-defined gestures are a translation of the fingers of the hand, moving the fingers further away from the single point or moving the fingers back towards the single point.
  • the initiated subsequent actions may be moving the first picture 411 or second picture 412 over the screen, or alternatively in the case that only a part of the first picture 411 or the second picture 412 is shown, scrolling through the first picture 411 or second picture 412.
  • in the case of moving the fingers further away from the single point, the subsequent action may be, for example, zooming into a part of the first picture 411 or the second picture 412.
  • in the case of moving the fingers back towards the single point, the subsequent action may be zooming out of the picture.
  • the display apparatus may be used in other electronic devices as well, such as organizers, laptops, e-book readers, or, for example, screens of computers.
  • Fig. 5 shows a block diagram of a display apparatus 500 comprising a touch screen device 506 and a controller 501.
  • the controller 501 comprises a gesture storage unit 502, a movement detection unit 503, a determination unit 504 and an assignment unit 505.
  • the touch screen device 506 receives the touch events at respective positions as the result of objects that touch the touch screen device 506.
  • the touch events are sent in the form of a signal to the movement detection unit 503.
  • the position where the touch event is received on the touch screen device 506 is detected at regular moments.
  • the movement detection unit 503 detects the movement of positions of the touch events. The movement of positions is detected by keeping track of the changes in the positions on consecutive detection moments.
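Keeping track of the changes in positions on consecutive detection moments can be illustrated with a minimal sketch. The class name and the per-touch-id dictionary are assumptions for illustration, not the internals of the movement detection unit 503.

```python
class MovementDetector:
    """Track the position of each touch event on consecutive detection
    moments and report the change in position per moment."""

    def __init__(self):
        self._last = {}  # touch id -> last detected (x, y) position

    def update(self, touch_id, position):
        """Record a newly detected position; return the movement since
        the previous detection moment, or None for a new touch event."""
        previous = self._last.get(touch_id)
        self._last[touch_id] = position
        if previous is None:
            return None
        return (position[0] - previous[0], position[1] - previous[1])
```

The sequence of movements produced per touch event corresponds to the "sequence of changes of the positions" that the determination unit may alternatively receive.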
  • the movements of positions of the touch events are transferred in the form of another signal to the determination unit 504.
  • the determination unit 504 receives for every touch event a sequence of positions that describes the movement of positions or, alternatively, it receives a sequence of changes of the positions.
  • the determination unit 504 also receives, in the form of a further signal, the pre-defined gesture from the gesture storage unit 502.
  • the pre-defined gesture is stored in the gesture storage unit 502.
  • the gesture storage unit 502 contains memory to store the pre-defined gesture. In the case of volatile memory, like Random Access Memory (RAM), the pre-defined gesture is loaded into the memory when the display apparatus starts to operate and the pre-defined gesture is possibly re-loaded with an updated version of the pre-defined gesture during a session of operation.
  • the information of the pre-defined gesture is programmed in the memory during the fabrication of the gesture storage unit 502 (e.g. in the case of a Read-Only Memory (ROM)), is programmed into the memory during installation (e.g. in an EPROM) or is loaded into the memory during previous use of the display apparatus (which is possible, for example, in the case that flash memory is used).
  • the pre-defined gesture may be stored in the form of a description of a topology of points, together with, if required, changes in the topology and geometrical information related to the topology.
  • a topology describes which point is left, right, below or above another point and the geometrical information describes distances between some of the points.
  • in the gesture of two objects that move away from a single point, for example, the topology is a point a that is left of a point b.
  • This description does not include a topology that changes over time, but it includes information about the distance between point a and point b that is initially smaller than a threshold value and that increases over time.
  • Another form of describing and storing the pre-defined gesture may be in the format of vectors, which start at a specified location. E.g. in the gesture of three objects moving away from a single point, there will be three vectors which start at positions that are located close to a single central point and the three vectors are all three oriented away from the single central point.
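The vector-based storage format could be matched as sketched below, assuming a gesture is stored as a list of start/end position pairs; the function name and the distance threshold are illustrative assumptions, not taken from the patent.

```python
import math

def matches_outward_vectors(vectors, centre, max_start_distance=30.0):
    """vectors: list of ((sx, sy), (ex, ey)) start/end pairs.
    Check that every vector starts close to the single central point
    and is oriented away from it (its end lies farther from the centre
    than its start)."""
    for (sx, sy), (ex, ey) in vectors:
        start_dist = math.hypot(sx - centre[0], sy - centre[1])
        end_dist = math.hypot(ex - centre[0], ey - centre[1])
        if start_dist > max_start_distance:
            return False  # vector does not start near the central point
        if end_dist <= start_dist:
            return False  # vector is not oriented away from the centre
    return True
```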
  • the determination unit 504 is dedicated hardware to execute a form of pattern recognition.
  • the pre-defined gesture describes the pattern of movement of positions and this pattern must be matched with the movement of a subset of positions of all the received touch events.
  • the pattern recognition may be translation, scaling and rotation invariant and may optionally contain other fuzzy pattern recognition mechanisms.
  • if the determination unit 504 determines that the movement of a subset of the positions belongs to the pre-defined gesture, it sends, in the form of signals, information about the touch events for which the movement of the subset of positions was determined to belong to the pre-defined gesture to the assignment unit 505.
  • the assignment unit 505 assigns the touch events for which it received a signal from the determination unit 504 to one of the multiple groups.
  • Fig. 6 shows a flowchart of the method of the invention.
  • the touch screen device 106 receives touch events.
  • the touch events are the result of objects of multiple groups touching the touch screen device 106 at respective positions.
  • in step 602, the movement of positions of the touch events on the touch screen device 106 is detected.
  • in step 603, it is determined whether the movement of a subset of positions belongs to a pre-defined gesture.
  • the pre-defined gesture indicates that the objects associated with the subset of positions belong to one of the multiple groups. If the movement of a subset of the positions corresponds to the pre-defined gesture, the touch events for which the correspondence is detected are assigned to the associated one of the multiple groups in step 604.
  • in step 605, it is determined whether the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to a further pre-defined gesture. If the movement of the subset of positions of the touch events of the associated one of the multiple groups corresponds to a further pre-defined gesture, a subsequent action is initiated in step 606. If the movement of the subset of positions of the touch events of the associated one of the multiple groups does not belong to a further pre-defined gesture, no further steps are taken.
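The flow of steps 601 to 606 can be summarized in a small sketch. The two gesture predicates and the action callback stand in for the gesture matching and the subsequent action; they are placeholders for illustration, not part of the patent.

```python
def process_touch_events(tracks, is_group_gesture, is_action_gesture, action):
    """Sketch of steps 601-606 of the flowchart, under the assumption
    that `tracks` holds one list of positions per touch event."""
    # Steps 601-602: touch events are received and their movement of
    # positions is captured in `tracks`.
    group = None

    # Steps 603-604: if the movement matches the pre-defined gesture,
    # assign the touch events to the associated group.
    if is_group_gesture(tracks):
        group = list(tracks)

    # Steps 605-606: only an identified group may trigger the
    # subsequent action with the further pre-defined gesture.
    if group is not None and is_action_gesture(group):
        return action(group)
    return None
```

In a real controller the two predicates would be the pattern-matching tests against the stored pre-defined gestures.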
  • the display apparatus may concurrently receive touch events from styli and fingers, only from styli, or only from fingers.
  • objects other than styli may be used as well, such as a pencil, a ballpoint pen, playing pieces like pawns and chess pieces, or any other object that is detected by the touch screen device when the object touches the touch screen device.
  • the above-mentioned embodiments are not limited to the disclosed pre-defined gestures, or limited to the disclosed combinations of pre-defined gestures.
  • the display apparatus may determine whether the movement of the subset of positions belongs to any set of pre-defined gestures, of which the set of the discussed gestures of moving away from a single point, moving towards a single point, translating the objects of the group or rotating with the objects of the group around a single point is only an example.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Abstract

The invention relates to a method, a display apparatus (100) and an electronic device (400) for processing touch events. The display apparatus (100) comprises a touch screen device (106) and a controller (101). The touch screen device (106) receives touch events as a result of objects touching the touch screen device (106) at respective positions. The objects belong to multiple groups of objects that belong together. The controller (101) is constructed for detecting movement of positions of the touch events on the touch screen device (106). The controller (101) is further constructed for determining whether the movement of a subset of positions belongs to a pre-defined gesture. The pre-defined gesture indicates that the objects associated with the subset of positions belong to one of the multiple groups. The touch events for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture are assigned by the controller (101) to one of the multiple groups.

Description

Display apparatus for processing touch events
FIELD OF THE INVENTION
The invention relates to a display apparatus comprising a touch screen device and a controller, to an electronic device comprising the display apparatus and to a method of processing touch events.
BACKGROUND OF THE INVENTION
Published patent application US2007/0262964A1 discloses a multi-input touch screen which receives multiple touch inputs. The multiple touch inputs are processed to find a match between the characteristics of the touch inputs and the characteristics of pre-defined multi-input gestures. Subsequently, a computer system is triggered to start a specific action in response to the recognized multi-input gesture. A limitation of this multi-input touch screen is that it is not able to make a distinction between the touch inputs of two separate hands of one user or of gestures of different users. Thus, the multi-input touch screen is not suitable for use by more than one hand or by more than one person.
SUMMARY OF THE INVENTION
It is an object of the invention to provide a display apparatus that is able to distinguish between touch events belonging to different groups.
A first aspect of the invention provides a display apparatus as claimed in claim 1. A second aspect of the invention provides an electronic device as claimed in claim 10. A third aspect of the invention provides a method as claimed in claim 11. Advantageous embodiments are defined in the dependent claims.
A display apparatus in accordance with the first aspect of the invention comprises a touch screen device and a controller. The touch screen device receives multiple touch events as a result of objects that touch the touch screen device. The objects touch the touch screen device at respective positions. The objects are grouped into multiple groups of objects which belong together. The controller detects movement of the positions of the touch events. The controller determines whether the movement of a subset of the positions belongs to a pre-defined gesture. The pre-defined gesture indicates that the objects associated with the subset of positions belong to one of the multiple groups. If the movement of the subset of positions belongs to the pre-defined gesture, the controller assigns the touch events for which it is determined that the movement of the subset of positions is equal to or resembles the pre-defined gesture to the associated one of the multiple groups. If different objects of multiple groups touch the touch screen device, and if the objects of one of the groups are moved on the touch screen device according to the pre-defined gesture, the display apparatus determines whether the movements detected belong to the pre-defined gesture. As a result of this determination the display apparatus associates the touch events for which the pre-defined gesture was detected with the associated one of the multiple groups. This solves the problem of the state of the art because by moving the objects of one of the multiple groups on the touch screen device in accordance with the pre-defined gesture, the complete set of touch events is subdivided into a subset that is associated with the one of the multiple groups that is detected due to the movement of the objects of this group and a subset that is associated with all other groups. Thus, the display apparatus is able to distinguish the touch events in the associated one of the multiple groups from the other touch events.
By performing a pre-defined gesture with the objects of different groups, the display apparatus is able to distinguish the touch events of the objects of one of the multiple groups from the touch events of the other groups. The pre-defined gesture may be different for different groups or may be the same for each group of objects (or for a subset of the groups) and may be performed sequentially or concurrently in time.
For example, if a person puts all ten fingers of his two hands on the display device and makes a rotating movement with both his left and right hand, it is possible to deduce from the change of position of the fingers on the display device which fingers belong to which one of the two hands. The electronic device according to the second aspect of the invention and the method according to the third aspect of the invention provide the user with the same benefits as the display apparatus in relation to distinguishing between touch events belonging to different groups.
In an embodiment, the pre-defined gesture is one of the gestures of: rotating the objects of one of the multiple groups around a single point, translating the objects of one of the multiple groups, moving the objects of one of the multiple groups towards a single point or moving the objects of one of the multiple groups away from a single point. These gestures are intuitive gestures. The single point may be a single position on the touch screen device. Alternatively, the single point may be close positions. If the objects start in a single common point, it suffices that the positions of the objects move away from each other during the gesture. If the object ends in a single point, it suffices that the positions move towards each other during the gesture. The gesture of rotating the objects of one of the multiple groups and the gesture of translating the objects of one of the multiple groups are intuitive because the objects of the one of the multiple groups follow the same type of movement. In the gesture of the rotation, the objects of the one of the multiple groups follow a part of an imaginary circle and all circles followed by all objects share a single point. In the gesture of the translation, all the objects of the one of the multiple groups follow a part of an imaginary line in the same direction. The lines run parallel and all objects follow the lines over the same distance. The gestures of moving with the objects to the single point or moving away from the single point are intuitive because the end position and the start position of respective gestures are or are close to the single point. This closeness indicates that the objects belong together.
It should be noted that in another embodiment the determination whether the movement of the subset of positions belongs to the pre-defined gesture may involve a tolerance mechanism. In the case of the gestures of the rotation, the tolerance may be that the objects do not exactly follow a part of the circle or that the centres of the circles do not exactly overlap in the single point. In the case of the gesture of the translation, the tolerance may be that the lines followed by the objects are not exactly parallel and that the lines are followed over slightly different distances. In the case of gestures wherein the objects of the group move to or away from a single point, the tolerance may be that the objects move to or away from multiple points that are located within a tolerance distance from each other.
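The tolerance mechanism for gestures wherein the objects move towards a single point could, for instance, be implemented as an endpoint-clustering test. The centroid-based check and the tolerance parameter below are assumptions for illustration, not the claimed mechanism.

```python
import math

def converges_within_tolerance(tracks, tolerance):
    """Check that the objects move towards (nearly) a single point:
    all end positions lie within `tolerance` of their common centroid,
    and every object ends closer to that centroid than it started."""
    ends = [track[-1] for track in tracks]
    cx = sum(p[0] for p in ends) / len(ends)
    cy = sum(p[1] for p in ends) / len(ends)

    # The end positions must cluster within the tolerance distance.
    if any(math.hypot(p[0] - cx, p[1] - cy) > tolerance for p in ends):
        return False

    # Every object must end closer to the common point than it started.
    return all(
        math.hypot(t[-1][0] - cx, t[-1][1] - cy) < math.hypot(t[0][0] - cx, t[0][1] - cy)
        for t in tracks
    )
```

Analogous tolerances for rotation and translation could compare angular changes or line directions against similar thresholds.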
In a further embodiment, the objects which are grouped into multiple groups are fingers of multiple hands and the pre-defined gesture is a movement of the hand. In this embodiment, fingers of multiple hands touch the touch screen device. The fingers of the hands move over the touch screen device. One of the multiple hands may apply the pre-defined gesture to the touch screen device. The display apparatus detects whether the movement of the subset of fingers is in accordance with the pre-defined gesture and assigns the touch events that are associated with the subset of fingers to the associated one of the multiple groups, wherein a group is formed by the one of the multiple hands. Therefore, the display apparatus according to this embodiment is able to receive input from multiple hands and the display apparatus is able to distinguish between the multiple hands. It should be noted that a user can use two, three, four, or five fingers per hand to touch the touch screen device. In another embodiment, the objects which are grouped into multiple groups are fingers of multiple users, and the pre-defined gesture is a movement of one hand or two hands of one of the users. Multiple users touch the touch screen device with the fingers of one of their hands. The fingers of one hand may apply the pre-defined gesture to the touch screen device. The display apparatus is able to distinguish between different users because the touch events for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture are associated with one of the multiple users. It should be noted that the multiple users can use two, three, four, or five fingers of the hand to touch the touch screen device. In this embodiment the user applies the pre-defined gesture to the touch screen device with one hand. In an alternative embodiment the pre-defined gesture is applied by the user with two hands.
In a further embodiment, the controller is further constructed for determining whether the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to a further pre-defined gesture for initiating a subsequent action. Objects of one of the multiple groups have to identify themselves with the pre-defined gesture when they start touching the touch screen device. The identification is used to determine which subset of touch events forms a group. Subsequently, the objects of the group apply the further pre-defined gesture that is a command for the display apparatus to initiate the subsequent action. Because the objects of one of the multiple groups have to identify themselves before the display apparatus is able to determine the further pre-defined gesture, the display apparatus will not mix the touch events that belong to different groups. The further pre-defined gesture will thus not be determined unexpectedly based on touch events belonging to different groups. Therefore, this enables the display apparatus to perform the respective commands when objects of different groups apply further pre-defined gestures. The subsequent action may be executed by the controller or, in the case that the display apparatus is part of an electronic device, by another part of the electronic device. Initiating the subsequent action may be done by sending a trigger signal to the other part of the electronic device. Examples of subsequent actions are changing an image that is displayed on the touch screen device, or another action that is related to other functions of the electronic device, like, for example, starting to play music.
In an embodiment, the touch screen device displays an image, and the controller generates the image such that a feedback indication is displayed at the position of the touch events. Feedback improves user experience because it makes the user aware of the behaviour of the display apparatus. After the user has touched the touch screen device at a specific position, the display apparatus assigns a position to the location where the touch of the object on the touch screen device is detected. The position assigned by the display apparatus is not by definition exactly the same position as assumed by the user. This difference between assumed and assigned position may be, for example, the result of parallax. Another discrepancy between the user assumption and the behaviour of the display apparatus may be the number of detected touch events. For example, in the case that one has to touch the touch screen device with an object with a minimal pressure before the touch is detected, the feedback shows the user whether the objects touch the touch screen device with sufficient pressure. In another embodiment, the touch screen device displays an image, and the controller generates the image displaying a feedback indication in the case that it is determined that the movement of the subset of positions belongs to a pre-defined gesture. In this embodiment, the feedback improves the user experience as well. As soon as feedback is shown that the movement of the subset of positions belongs to the pre-defined gesture, the display apparatus signals to the user that the objects of one of the multiple groups moved on the touch screen device in accordance with the pre-defined gesture. The user knows, based on this feedback, that moving the objects on the touch screen device was successful.
On the other hand, as long as the feedback is not shown, the user knows that the display apparatus was not yet capable of determining whether the movement of the subset of positions belongs to the pre-defined gesture. Thus, the user has to continue or restart the movement of the objects on the touch screen device.
In a further embodiment, the feedback indication comprises an indication at the positions of the touch events that are assigned to the associated one of the multiple groups. This feedback helps the user to understand which of the objects are associated with one of the multiple groups. The feedback signals to the user which objects moved on the touch screen device in accordance with the pre-defined gesture. It helps the user to decide whether the association of the touch events with one of the multiple groups matches his intentions. It is possible that one touch event was not associated with the group because the movement of the object associated with this touch event was not exactly according to the pre-defined gesture. If the intention of the user was to identify this object also as a member of the group, the user knows that applying the pre-defined gesture with the objects was not completely successful and that he has to apply the pre-defined gesture all over again.
In an embodiment, the controller comprises a gesture storage unit for storing the pre-defined gesture, a movement detection unit for detecting movement of positions of the touch events on the touch screen device, a determination unit for determining whether the movement of the subset of positions belongs to a pre-defined gesture, and an assignment unit for assigning the touch events, for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture, to the associated one of the multiple groups. The units of this embodiment perform a specific task. Specialized hardware may be used to perform the specific tasks of the units mentioned hereinabove. Alternatively, a processor may be used which performs all or a subset of the mentioned tasks. If the gesture storage unit comprises a rewritable memory, an opportunity is created to customize the display apparatus by storing a customer specific pre-defined gesture in the memory of the gesture storage unit. These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS In the drawings:
Fig. 1 schematically shows a display apparatus, which receives input from two hands,
Fig. 2 schematically shows a display apparatus, which receives input from different groups of objects,
Fig. 3 schematically shows a display apparatus, which receives a pre-defined gesture from a hand, which receives input of two groups of styli and which displays different feedback indications,
Fig. 4A schematically shows a table that contains an electronic device to edit and view pictures,
Fig. 4B schematically shows an enlargement of a pre-defined gesture and a further pre-defined gesture applied by a user to the electronic device of Fig. 4A,
Fig. 5 shows a block diagram of the display apparatus, and
Fig. 6 shows a flowchart of the method according to the invention.
It should be noted that items which have the same reference numbers in different figures have the same structural features and the same functions, or are the same signals. Where the function and/or structure of such an item has been explained, there is no necessity for repeated explanation thereof in the detailed description.
DETAILED DESCRIPTION OF THE EMBODIMENTS
A first embodiment is shown in Fig. 1. The figure shows a display apparatus 100 comprising a controller 101 and a touch screen device 106. The controller 101 comprises a gesture storage unit 102, a movement detection unit 103, a determination unit 104 and an assignment unit 105. Fingers of a first hand 110 and a second hand 120 are touching the touch screen device 106. Only two fingers 111 and 112 of the first hand 110 touch the touch screen device at the positions indicated by A and B, respectively. All five fingers of the second hand 120 touch the touch screen device at the positions indicated by K, L, M, N and O, respectively. A pre-defined gesture is stored in the gesture storage unit 102. The pre-defined gesture indicates that the fingers that apply the gesture belong to one of the multiple hands. In this embodiment the pre-defined gesture is defined by a movement of the fingers of the hand over the touch screen device 106 towards a single point. The definition of the gesture includes that the number of fingers moving towards a single point may be two, three, four, or five fingers.
Two fingers of the first hand 110 and five fingers of second hand 120 touch the touch screen device 106. Thus, the touch screen device 106 receives multiple touch events. In the example shown, seven touch events are received, at positions where the fingers touch the touch screen device 106. The two hands 110, 120 in Fig. 1 may be of different users or of one user.
Subsequently, the movement detection unit 103 detects movement of the positions A, B; K, L, M, N and O. The information of the movement of positions is transferred to the determination unit 104. The determination unit 104 receives the information of the pre-defined gesture from the gesture storage unit 102. Based on this information, the determination unit 104 determines whether the movement of a subset of the positions matches with the definition of the pre-defined gesture.
In the example of figure 1, fingers 111 and 112 touch the touch screen device 106 initially at locations A and B, respectively. Both fingers 111 and 112 move into the direction of point C on the touch screen device 106. The movement detection unit 103 detects this movement and the determination unit 104 determines whether the movement belongs to the pre-defined gesture. This movement of the two fingers 111 and 112 to a common point resembles the pre-defined gesture, and thus the assignment unit 105 assigns the fingers 111 and 112 to a first group, associated with the first hand 110. Also the five fingers of the second hand 120 move on the touch screen device 106. The thumb touches the touch screen device 106 initially at position K and moves into the direction of P, the other four fingers touch the touch screen device 106 initially at positions L, M, N and O, respectively, and move also into the direction of P. The determination unit 104 determines that the movement of the positions of the five fingers matches with the pre-defined gesture because the movement of the five fingers is towards a single point, which is point P in the example, and thus the assignment unit 105 concludes that the five fingers belong to a second group, associated with the second hand 120.
Or said differently, in the case that the determination unit 104 determines that the movement of the subset of positions is in accordance with the pre-defined gesture of moving the fingers towards a single point, the assignment unit 105 assigns the touch events for which this was determined to the associated one of the multiple hands. In the example of figure 1 the touch events that were the result of the fingers 111 and 112 touching the touch screen device 106 are associated with the hand 110. The other five touch events, which were the result of the fingers of the second hand 120 touching the touch screen device 106 and moving on the touch screen device 106, are associated with the hand 120.
This results in two groups of touch events that subdivide the set of touch events in one group associated with the first hand 110 and one group associated with the second hand 120. By applying the pre-defined gesture on the touch screen device 106 both hands 110 and 120 identify their fingers as being a member of one of the multiple groups. After the identification of which fingers belong to which hand, the display apparatus is able to distinguish between the touch events that are the result of the fingers of the first hand 110 and of the fingers of the second hand 120.
It should be noted that the determination whether the movement of the subset of positions belongs to the pre-defined gesture may include a tolerance mechanism. The tolerance may be that the multiple fingers do not move exactly towards one single point P, but towards an area of limited size, for example an area formed by a circle with P as its centre and with a radius selected such that the fingers of the second hand 120 are within the area when they touch each other near the point P.

A further embodiment is shown in Fig. 2. The display apparatus 100 comprises the touch screen device 106 and the controller 101. The controller 101 comprises the gesture storage unit 102, the movement detection unit 103, the determination unit 104 and the assignment unit 105. Four styli 210, 211, 220 and 221 touch the touch screen device 106 at positions F, G, E and D, respectively. The styli 210 and 211 form a first group and the styli 220 and 221 form a second group.
The display apparatus of this embodiment stores two pre-defined gestures in the gesture storage unit 102. The first pre-defined gesture defines that if the movement of the positions of two or more objects is a translation in one direction along parallel lines, the touch events that are the result of the two or more objects touching the touch screen device belong to one group. The second pre-defined gesture defines that if the movement of the positions of two or more objects is a rotation around a single point, the touch events that are associated with the two or more objects belong to one group. In the example of Fig. 2, the styli 210 and 211 initially touch the touch screen device 106 at the positions F and G, respectively. Because of these two touches, the touch screen device 106 receives two touch events, each with an associated position. The movement detection unit 103 detects the movement of the positions F and G. The stylus 210 moves over the touch screen device 106 following an imaginary circle from position F to F', and a point H is the centre of the imaginary circle. The stylus 211 moves over the touch screen device 106 following a further imaginary circle from position G to G', and the point H is the centre of the further imaginary circle.
Subsequently, the determination unit 104 determines whether the movement of a first subset of positions corresponds to one of the pre-defined gestures. The movement of the styli 210 and 211 follows a pattern according to one of the two pre-defined gestures introduced earlier. Thus, the determination unit 104 concludes that the movement of the first subset of positions belongs to one of the pre-defined gestures. Subsequently, the assignment unit 105 assigns the two touch events that are the result of the styli 210 and 211 touching the touch screen device 106 to one group. The styli 220 and 221 initially touch the touch screen device 106 at positions E and D, respectively. These two touches result in the reception of two additional touch events by the touch screen device 106, each with an associated position. Stylus 220 moves over the touch screen device 106 from position E towards E' following a straight line. Stylus 221 moves over the touch screen device 106 from D towards D' following another straight line. As can be seen in the figure, the line from D to D' and the line from E to E' run in parallel. Therefore, the determination unit 104 concludes that the movement of a second subset of positions of the touch events, which movement is the result of the styli 220 and 221 touching the touch screen device 106, belongs to one of the two pre-defined gestures. Subsequently, the assignment unit 105 assigns the touch events that are associated with the styli 220 and 221 to one group.
Thus, the display apparatus 100 is able to subdivide the set of touch events into subsets because the styli 210 and 211 of a first group applied one of the pre-defined gestures to the touch screen device 106 and because the styli 220 and 221 of a second group applied one of the pre-defined gestures to the touch screen device 106.
It should be noted that determining whether the movement of the subset of positions belongs to the pre-defined gesture may involve a tolerance mechanism. In the case that the pre-defined gesture defines the rotation of the objects of one group around a single point, the tolerance may be that the centres of the imaginary circles are not exactly one single point, but lie within a small delta distance of one single point. Another tolerance may be that the objects do not follow a perfect imaginary circle. In the case that the pre-defined gesture defines the translation of the objects of one group, the tolerance may be that the lines followed by the objects are not perfectly parallel or not perfectly straight. Further, it should be noted that more than the two pre-defined gestures may be stored in the gesture storage unit 102. For example, the gestures discussed with respect to Fig. 1 may also be stored, such that the display apparatus is able to detect both groups of fingers belonging to a same hand and groups of styli belonging together. Alternatively or additionally, alternative movements may be stored for one or more pre-defined gestures. For example, several pre-defined gestures may be stored to detect a sloppy rotational movement of objects which belong to a same group.
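The two pre-defined gestures of this embodiment, translation along parallel lines and rotation around a single point, may each be sketched as a tolerance check. The sketch below is illustrative only; the angle and distance tolerances, the track representation and the assumption that the rotation centre is supplied are not part of the disclosed apparatus:

```python
import math

def is_parallel_translation(tracks, angle_tol=0.2, min_dist=5.0):
    """True when all tracks move in roughly the same direction.

    Each track moves at least `min_dist` and the headings differ from the
    first heading by less than `angle_tol` radians (the tolerance for lines
    that are not perfectly parallel).
    """
    headings = []
    for (sx, sy), (ex, ey) in tracks:
        dx, dy = ex - sx, ey - sy
        if math.hypot(dx, dy) < min_dist:
            return False
        headings.append(math.atan2(dy, dx))
    ref = headings[0]
    # wrap the angle difference into (-pi, pi] before comparing
    return all(abs(math.atan2(math.sin(h - ref), math.cos(h - ref))) < angle_tol
               for h in headings)

def is_rotation_about_common_point(tracks, centre, dist_tol=10.0):
    """True when every track keeps a roughly constant distance to `centre`,
    i.e. each object follows an (imperfect) circle around the same point."""
    cx, cy = centre
    for (sx, sy), (ex, ey) in tracks:
        r_start = math.hypot(sx - cx, sy - cy)
        r_end = math.hypot(ex - cx, ey - cy)
        if abs(r_end - r_start) > dist_tol:
            return False
    return True
```

In the example of Fig. 2, the tracks of the styli 220 and 221 would satisfy the first check and the tracks of the styli 210 and 211 (with centre H) the second.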
Fig. 3 shows another embodiment. A display apparatus 300 comprises a touch screen device 306 and a controller 301. A hand 310 touches the touch screen device 306 with two fingers 311 and 312 at positions close to point Q. The figure also shows two styli 320 and 321 that touch the touch screen device 306 at locations U and V, respectively. Two other styli 331 and 332 touch the touch screen device 306. Feedback indications 322, 323 and 333 are displayed on the touch screen device 306.
The touch screen device 306 receives touch events at respective positions as the result of the fingers 311 and 312 and of the styli 320, 321, 331 and 332. The controller 301 receives the touch events from the touch screen device 306 and detects the movement of the positions of the touch events on the touch screen device 306. The controller 301 determines whether the movement of a subset of positions belongs to the pre-defined gesture. The pre-defined gesture is defined by a movement of the positions of two or more objects of one group away from a single point. The fingers 311 and 312 initially touch the touch screen device 306 at positions close to point Q. Finger 311 moves in the direction of position R. Finger 312 moves in the direction of position S. This movement of positions matches the pre-defined gesture because the objects move away from the single point. As a result of the determination that the movement of the subset of positions belongs to the pre-defined gesture, the controller 301 assigns the touch events that correspond to the touches of the fingers 311 and 312 to a first group.
The styli 320 and 321 are shown in figure 3 at a location on the touch screen device 306 after they have moved from positions close to point T towards positions U and V, respectively. The movement of the styli 320 and 321 corresponded to the pre-defined gesture of moving away from a single point. The touch events of the styli 320 and 321 were therefore assigned by the controller 301 to a second group.
The touch screen device 306 of the display apparatus 300 displays an image generated by the controller 301. The image displays the feedback indications 322, 323 and 333. The displayed ellipse 323 is the feedback indication showing that the controller 301 has determined that the movement of the subset of positions belongs to the pre-defined gesture. In the example of the embodiment of Fig. 3, the ellipse 323 is drawn around the locations where the styli 320 and 321 touch the touch screen device 306. The feedback indication 323 is displayed immediately after the determination that the movement of the subset of positions belongs to the pre-defined gesture. It should be noted that other forms of feedback indications may be displayed as well. Other examples are other geometrical shapes around the positions of the touch events that are assigned to the second group, changing the foreground or background color in an area related to the positions of the touch events in the second group, changing the foreground or background color of the whole image for a short time just after the determination that the movement of positions belongs to the pre-defined gesture, or, for example, displaying at a location of the touch screen device 306 a text indicating that the pre-defined gesture has been determined.
Feedback indications 322 are displayed on the touch screen device 306 at the locations of the touch events that are assigned to the second group. The indications 322 of the example of figure 3 are blinking circles around the positions where the styli 320 and 321 touch the touch screen device 306. Other feedback indications, which show the user of the display apparatus 300 which touch events are assigned to a group, include displaying specific geometrical shapes in a specific color at the positions of the touch events of the group; the geometrical shapes may blink. For the convenience of the user, different geometrical shapes of different colors may be used for different groups. Another example of a feedback indication showing which touch events belong to a group is connecting the positions of the touch events with lines.
The feedback indications 333 indicate the positions where touch events are received by the touch screen device 306. At the positions where the styli 331 and 332 touch the touch screen device 306, the feedback indication 333 is an "X". Other examples of feedback indications at the location where a touch event is received are a cursor, a pointer, an arrow, a small circle, a local change in background color or, for example, a local change in foreground color. The feedback indications 333 differ from the feedback indications 322: the feedback indications 333 are displayed at locations of touch events that are not associated with one of the multiple groups, whereas the feedback indications 322 are displayed at locations of touch events that are associated with one of the multiple groups. For the convenience of the user, the feedback indications 322 and the feedback indications 333 should be different.

Fig. 4A shows a table 400 which comprises an electronic device to view and edit pictures. The electronic device comprises the display apparatus in accordance with the invention. The figure shows the touch screen device 406 of the display apparatus. The other parts of the electronic device, including the controller of the display apparatus, are assumed to be built into the table. The touch screen device 406 shows several small pictures 410 to the users of the table (the users are not drawn in the figure). A first large picture 411 is shown to a first user and a second large picture 412 is shown to a second user.
The users of the table 400 have to identify the fingers of their hands by applying a pre-defined gesture to the touch screen device 406. The pre-defined gesture of the embodiment of Fig. 4A is the gesture of moving the fingers of the hand away from a single point. Subsequently, the pre-defined gesture may be followed by a further pre-defined gesture for initiating a subsequent action. The further pre-defined gesture is rotating the fingers of the hand around a single point. The subsequent action is rotating the picture.
In Fig. 4A the sequence of the predefined gestures and the further pre-defined gesture is shown in the area of the second large picture 412. This part of Fig. 4A has been enlarged in Fig. 4B.
The second user initially touches the touch screen device 406 with three fingers of one hand at the positions W, X and Y, respectively. The touches of the fingers result in three touch events. Subsequently, the fingers move over the touch screen device 406 in the direction of W', X' and Y', respectively. All three fingers move away from the single point Z. The controller detects the movement of the positions, determines that the movement of the positions of the three fingers corresponds to the pre-defined gesture, and assigns the three touch events to one group that is associated with the hand of the user. In a further embodiment, the touch screen device 406 displays an image with a feedback indication that the pre-defined gesture has been recognized in the movement of the fingers.
The controller of the display apparatus is further constructed to determine whether the movement of the subset of positions of the touch events in the group belongs to a further pre-defined gesture. The second user continues the movement of the fingers on the touch screen device 406 by moving the respective fingers from W' to W", from X' to X" and from Y' to Y". This movement of the positions of the three fingers matches the further pre-defined gesture because it is a rotation of the fingers of the hand around the single point Z.
As a result of determining the further pre-defined gesture, the controller initiates the subsequent action. Because the further pre-defined gesture was applied to the touch screen device 406 inside the area of the second large picture 412, the initiated subsequent action is rotating that picture. The controller rotates the picture on the touch screen device 406 or, in an alternative embodiment, the controller sends an interrupt signal to another part of the electronic device, which executes the action of rotating the picture. The first picture 411 is shown to the first user. The first user can initiate actions related to the first picture 411 by applying the pre-defined gesture to identify the fingers of his hand, followed by the further pre-defined gesture. If the first user applies these gestures inside the area of the first picture 411, the subsequent action will be related to the first picture 411.
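The coupling between a recognized further gesture and the subsequent action on the picture under the gesture may, purely as an illustration, be represented by a lookup table. The gesture names, the action names and the two helper callbacks below are hypothetical and serve only to make the flow concrete:

```python
def handle_further_gesture(group_tracks, classify, picture_at):
    """Map a further pre-defined gesture, applied inside a picture's area,
    to a subsequent action on that picture (illustrative mapping only).

    group_tracks: ((sx, sy), (ex, ey)) pairs of an already-assigned group.
    classify:     callback returning a gesture name for the tracks.
    picture_at:   callback returning the picture under a given position.
    """
    ACTIONS = {
        "rotate": "rotate_picture",     # rotation around a single point
        "translate": "move_or_scroll",  # translation of the fingers
        "spread": "zoom_in",            # moving further away from the point
        "pinch": "zoom_out",            # moving back towards the point
    }
    gesture = classify(group_tracks)
    picture = picture_at(group_tracks[0][0])  # picture whose area was touched
    if gesture in ACTIONS and picture is not None:
        return ACTIONS[gesture], picture
    return None
```

With a classifier that recognizes the rotation around point Z inside the area of the second large picture, the sketch would return the rotate action paired with that picture.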
The electronic device to view and edit pictures is able to distinguish between the touch events of the first and the second user. Therefore, two or more people can sit around table 400 and concurrently use the electronic device to view and edit pictures.
It should be noted that the further pre-defined gesture is not limited to the rotation of the fingers of the hand around the single point, and that the subsequent action is not limited to rotating the first picture 411 or the second picture 412. Examples of other further pre-defined gestures are a translation of the fingers of the hand, moving the fingers further away from the single point, or moving the fingers back towards the single point. In the case of the translation, the initiated subsequent action may be moving the first picture 411 or the second picture 412 over the screen or, alternatively, in the case that only a part of the first picture 411 or the second picture 412 is shown, scrolling through the picture. In the case of moving further away from the single point, e.g. applied after the pre-defined gesture of moving away from the single point, the subsequent action may be zooming into a part of the first picture 411 or the second picture 412. In the case of moving back towards the single point, e.g. applied after the pre-defined gesture of moving away from the single point, the subsequent action may be zooming out of the picture.
It should be noted that the display apparatus according to the invention may be used in other electronic devices as well, like in organizers, laptops, e-book readers, or, for example, screens of computers.
Fig. 5 shows a block diagram of a display apparatus 500 comprising a touch screen device 506 and a controller 501. The controller 501 comprises a gesture storage unit 502, a movement detection unit 503, a determination unit 504 and an assignment unit 505.
The touch screen device 506 receives the touch events at respective positions as the result of objects that touch the touch screen device 506. The touch events are sent in the form of a signal to the movement detection unit 503. The position where a touch event is received on the touch screen device 506 is detected at regular moments. The movement detection unit 503 detects the movement of the positions of the touch events by keeping track of the changes in the positions at consecutive detection moments.
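The tracking of position changes at consecutive detection moments may be sketched as a nearest-neighbour association of new detections to existing tracks. This is an illustrative sketch only; the data layout, the greedy matching order and the maximum jump distance are assumptions, not part of the disclosed movement detection unit:

```python
import math

def update_tracks(tracks, detections, max_jump=30.0):
    """Associate the positions detected at the current moment with tracks.

    tracks:     list of lists of (x, y) positions, one list per touch event.
    detections: list of (x, y) positions at the current detection moment.
    Each track is extended with the nearest unmatched detection within
    `max_jump`; detections that match no track open new tracks (new touches).
    """
    unmatched = list(detections)
    for track in tracks:
        lx, ly = track[-1]
        best = None
        for p in unmatched:
            d = math.hypot(p[0] - lx, p[1] - ly)
            if d <= max_jump and (best is None or d < best[0]):
                best = (d, p)
        if best is not None:
            track.append(best[1])
            unmatched.remove(best[1])
    tracks.extend([p] for p in unmatched)  # remaining detections: new touches
    return tracks
```

The resulting per-event position sequences correspond to the sequences that the determination unit 504 receives from the movement detection unit 503.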
The movements of the positions of the touch events are transferred in the form of another signal to the determination unit 504. For every touch event, the determination unit 504 receives a sequence of positions that describes the movement of positions or, alternatively, a sequence of changes of the positions. The determination unit 504 also receives, in the form of a further signal, the pre-defined gesture from the gesture storage unit 502. The gesture storage unit 502 contains memory to store the pre-defined gesture. In the case of volatile memory, such as Random Access Memory (RAM), the pre-defined gesture is loaded into the memory when the display apparatus starts to operate, and the pre-defined gesture is possibly re-loaded with an updated version during a session of operation. In the case of non-volatile memory, the information of the pre-defined gesture is programmed into the memory during the fabrication of the gesture storage unit 502 (e.g. in the case of a Read Only Memory (ROM)), is programmed into the memory during installation (e.g. in the case of an EPROM), or is loaded into the memory during previous use of the display apparatus (which is possible, for example, when flash memory is used). The pre-defined gesture may be stored in the form of a description of a topology of points, together with, if required, changes in the topology and geometrical information related to the topology. A topology describes which point is left, right, below or above another point, and the geometrical information describes distances between some of the points. For example, in the gesture of two objects that move away from a single point, the topology is a point a that is left of a point b.
In this example, the topology does not change over time, but the geometrical information specifies that the distance between point a and point b is initially smaller than a threshold value and increases over time. Another form of describing and storing the pre-defined gesture may be the format of vectors which start at a specified location. For example, in the gesture of three objects moving away from a single point, there are three vectors which start at positions located close to a single central point and which are all oriented away from that central point.
The determination unit 504 is dedicated hardware that executes a form of pattern recognition. The pre-defined gesture describes a pattern of movement of positions, and this pattern must be matched with the movement of a subset of the positions of all the received touch events. The pattern recognition may be translation, scaling and rotation invariant and may optionally contain other fuzzy pattern recognition mechanisms. In the case that the determination unit 504 determines that the movement of a subset of the positions belongs to the pre-defined gesture, the determination unit 504 sends, in the form of signals, information about the touch events for which it was determined that the movement of the subset of positions belongs to the pre-defined gesture to the assignment unit 505.
The assignment unit 505 assigns the touch events for which it received a signal from the determination unit 504 to one of the multiple groups.
Fig. 6 shows a flowchart of the method of the invention. In step 601, the touch screen device 106 receives touch events. The touch events are the result of objects of multiple groups touching the touch screen device 106 at respective positions. In step 602, the movement of the positions of the touch events on the touch screen device 106 is detected. In step 603, it is determined whether the movement of a subset of positions belongs to a pre-defined gesture. The pre-defined gesture indicates that the objects associated with the subset of positions belong to one of the multiple groups. If the movement of a subset of positions corresponds to the pre-defined gesture, the touch events for which the correspondence is detected are assigned to the associated one of the multiple groups in step 604. If the movement of a subset of positions does not belong to the pre-defined gesture, no further steps are taken. In step 605, it is determined whether the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to a further pre-defined gesture. If so, a subsequent action is initiated in step 606. If not, no further steps are taken.
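The sequence of steps 603 to 606 of the flowchart may be sketched as follows, with the reception and tracking of steps 601 and 602 assumed to have produced the tracks already. The two classifier callbacks and the action callback are placeholders for whichever gesture definitions and subsequent actions are stored:

```python
def process_touch_events(tracks, is_group_gesture, is_further_gesture, act):
    """Illustrative sketch of steps 603-606 of the flowchart of Fig. 6.

    tracks:             movement of positions of the touch events (601-602).
    is_group_gesture:   step 603, does the movement match a pre-defined
                        gesture indicating membership of one group?
    is_further_gesture: step 605, does the movement match a further
                        pre-defined gesture?
    act:                step 606, callback initiating the subsequent action.
    Returns the list of assigned groups (step 604).
    """
    groups = []
    if is_group_gesture(tracks):          # step 603
        groups.append(list(tracks))       # step 604: assign to one group
        if is_further_gesture(tracks):    # step 605
            act(groups[-1])               # step 606: subsequent action
    return groups                         # no match: no further steps taken
```

A real controller would run this per candidate subset of positions rather than once over all tracks; the single-subset form is kept for brevity.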
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. The display apparatus may concurrently receive touch events from styli and fingers, only from styli, or only from fingers. Objects other than styli may be used as well, such as a pencil, a ballpoint pen, pieces for playing games such as pawns and chess pieces, or any other object that is detected by the touch screen device when the object touches the touch screen device. The above-mentioned embodiments are not limited to the disclosed pre-defined gestures, or to the disclosed combinations of pre-defined gestures. The display apparatus may determine whether the movement of the subset of positions belongs to any set of pre-defined gestures, of which the discussed gestures of moving away from a single point, moving towards a single point, translating the objects of the group and rotating the objects of the group around a single point are only an example.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. Display apparatus (100, 300, 500) comprising a touch screen device (106, 306, 406, 506) for receiving touch events as a result of objects touching the touch screen device (106, 306, 406, 506) at respective positions, the objects (111, 112, 210, 211, 220, 221, 311, 312, 320, 321, 331, 332) being grouped into multiple groups of objects belonging together, and a controller (101, 301, 501) constructed for detecting movement of positions of the touch events on the touch screen device (106, 306, 406, 506), determining whether the movement of a subset of positions belongs to a pre-defined gesture indicating that the objects (111, 112, 210, 211, 220, 221, 311, 312, 320, 321, 331, 332) associated with the subset of positions belong to one of the multiple groups, assigning the touch events for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture to the associated one of the multiple groups.
2. Display apparatus (100, 300, 500) according to claim 1, wherein the pre-defined gesture is one of the gestures of: rotating the objects (210, 211) of one of the multiple groups around a single point (H), translating the objects (220, 221) of one of the multiple groups, moving the objects (111, 112) of one of the multiple groups towards a single point (C, P) or moving the objects (311, 312, 320, 321) of one of the multiple groups away from a single point (Q, T, Z).
3. Display apparatus (100, 300, 500) according to claim 1 or 2, wherein the objects (111, 112, 311, 312) being grouped into multiple groups are fingers (111, 112, 311, 312) of multiple hands (110, 120, 310), and wherein the pre-defined gesture is a movement of the hand (110, 120, 310).
4. Display apparatus (100, 300, 500) according to claim 1 or 2, wherein the objects (111, 112, 311, 312) being grouped into multiple groups are fingers (111, 112, 311, 312) of multiple users, and wherein the pre-defined gesture is a movement of one hand or two hands (110, 120, 310) of one of the users.
5. Display apparatus (100, 300, 500) according to claim 1, wherein the controller (101, 301, 501) is further constructed for determining whether the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to a further pre-defined gesture for initiating a subsequent action.
6. Display apparatus (300) according to claim 1, wherein the touch screen device (306) is further arranged for displaying an image, and wherein the controller (301) is further constructed for generating the image displaying a feedback indication (333) at the position of the touch events.
7. A display apparatus (300) according to claim 1, wherein the touch screen device (306) is further arranged for displaying an image, and wherein the controller is further constructed for generating the image displaying a feedback indication (323) in the case that is determined that the movement of the subset of positions belongs to a pre-defined gesture.
8. A display apparatus (300) according to claim 7, wherein the feedback indication (322, 323) comprises an indication (322) at the positions of the touch events that are assigned to the associated one of the multiple groups.
9. A display apparatus (100, 500) according to claim 1, wherein the controller (101, 501) comprises: - a gesture storage unit (102, 502) for storing the pre-defined gesture, - a movement detection unit (103, 503) for detecting movement of positions of the touch events on the touch screen device, - a determination unit (104, 504) for determining whether the movement of the subset of positions belongs to a pre-defined gesture, and - an assignment unit (105, 505) for assigning the touch events for which it is determined that the movement of the subset of positions belongs to the pre-defined gesture to the associated one of the multiple groups.
10. An electronic device comprising the display apparatus according to claim 1.
11. A method of processing touch events comprising: receiving (601) the multiple touch events as a result of objects (111, 112, 210, 211, 220, 221, 311, 312, 320, 321, 331, 332) touching a touch screen device (106, 306, 406, 506) at respective positions, the objects being grouped into multiple groups of objects belonging together, detecting (602) movement of positions of the touch events on the touch screen device (106, 306, 406, 506), determining (603) whether the movement of a subset of positions belongs to a pre-defined gesture indicating that the objects (111, 112, 210, 211, 220, 221, 311, 312, 320, 321, 331, 332) associated with the subset of positions belong to one of the multiple groups, and assigning (604) the touch events for which it is detected that the movement of the subset of positions belongs to the pre-defined gesture to the associated one of the multiple groups.
12. A method according to claim 11, further comprising the steps of: determining (605) whether the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to a further pre-defined gesture, and initiating (606) a subsequent action if the movement of the subset of positions of the touch events of the associated one of the multiple groups belongs to the further pre-defined gesture.
PCT/IB2009/053901 2008-09-12 2009-09-08 Display apparatus for processing touch events WO2010029491A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP08164268.8 2008-09-12
EP08164268 2008-09-12

Publications (2)

Publication Number Publication Date
WO2010029491A2 true WO2010029491A2 (en) 2010-03-18
WO2010029491A3 WO2010029491A3 (en) 2010-07-01

Family

ID=41683077



Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6570557B1 (en) * 2001-02-10 2003-05-27 Finger Works, Inc. Multi-touch system and method for emulating modifier keys via fingertip chords
US20040239653A1 (en) * 2003-05-27 2004-12-02 Wolfgang Stuerzlinger Collaborative pointing devices
US20070262964A1 (en) * 2006-05-12 2007-11-15 Microsoft Corporation Multi-touch uses, gestures, and implementation
WO2007139484A1 (en) * 2006-05-26 2007-12-06 Touchtable Ab User identification for multi-user touch screens
WO2008014819A1 (en) * 2006-08-02 2008-02-07 Alterface S.A. Multi-user pointing apparatus and method
US20080065619A1 (en) * 2006-09-07 2008-03-13 Bhogal Kulvir S Method, system, and program product for sharing collaborative data among a plurality of authors


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WU M ET AL: "Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays" PROCEEDINGS OF THE 16TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY; [ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY], VANCOUVER, CANADA, vol. 5, no. 2, 2 November 2003 (2003-11-02), pages 193-202, XP002394716 ISBN: 978-1-58113-636-4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2508974A3 (en) * 2011-04-06 2016-01-27 Sony Corporation Information processing apparatus, information processing method, and computer-readable storage medium
EP2998838A1 (en) * 2014-09-19 2016-03-23 Samsung Electronics Co., Ltd. Display apparatus and method for controlling the same
CN105446586A (en) * 2014-09-19 2016-03-30 三星电子株式会社 Display apparatus and method for controlling the same

Also Published As

Publication number Publication date
WO2010029491A3 (en) 2010-07-01
TW201015402A (en) 2010-04-16

Similar Documents

Publication Publication Date Title
US10353570B1 (en) Thumb touch interface
US9268483B2 (en) Multi-touch input platform
CN102129311B (en) Information processing device, operation input method, and operation input program
US20120105367A1 (en) Methods of using tactile force sensing for intuitive user interface
US8413075B2 (en) Gesture movies
US20160342779A1 (en) System and method for universal user interface configurations
US9612736B2 (en) User interface method and apparatus using successive touches
CN105359083B (en) For the dynamic management of edge input of the user on touch apparatus
US20090183098A1 (en) Configurable Keyboard
WO2008138046A1 (en) Double touch inputs
WO2014127697A1 (en) Method and terminal for triggering application programs and application program functions
JP7233109B2 (en) Touch-sensitive surface-display input method, electronic device, input control method and system with tactile-visual technology
JP6004716B2 (en) Information processing apparatus, control method therefor, and computer program
US9489086B1 (en) Finger hover detection for improved typing
AU2015327573B2 (en) Interaction method for user interfaces
JP5374564B2 (en) Drawing apparatus, drawing control method, and drawing control program
EP4307096A1 (en) Key function execution method, apparatus and device, and storage medium
CN110413187B (en) Method and device for processing annotations of interactive intelligent equipment
US9619134B2 (en) Information processing device, control method for information processing device, program, and information storage medium
CN202133989U (en) Terminal unit and icon position exchanging device thereof
US10627991B2 (en) Device and control methods therefor
WO2010029491A2 (en) Display apparatus for processing touch events
US10860120B2 (en) Method and system to automatically map physical objects into input devices in real time
Uddin Improving Multi-Touch Interactions Using Hands as Landmarks
WO2021223546A1 (en) Using a stylus to modify display layout of touchscreen displays

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 09787120

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09787120

Country of ref document: EP

Kind code of ref document: A2