WO2016083671A1 - Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product - Google Patents

Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product Download PDF

Info

Publication number
WO2016083671A1
WO2016083671A1 PCT/FI2015/050828 FI2015050828W WO2016083671A1 WO 2016083671 A1 WO2016083671 A1 WO 2016083671A1 FI 2015050828 W FI2015050828 W FI 2015050828W WO 2016083671 A1 WO2016083671 A1 WO 2016083671A1
Authority
WO
WIPO (PCT)
Prior art keywords
vector
unit
velocity vector
normalised
display
Prior art date
Application number
PCT/FI2015/050828
Other languages
French (fr)
Inventor
Markus Halttunen
Gergely PATAI
Original Assignee
Small Giant Games Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Small Giant Games Oy filed Critical Small Giant Games Oy
Publication of WO2016083671A1 publication Critical patent/WO2016083671A1/en

Links

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Unit for controlling an object displayed on a display a method for controlling an object displayed on a display and a computer program product
  • the invention relates to a unit for controlling an object displayed on a display which display is a touch screen and the unit comprises a processor arranged to receive touch input data that indicates touch input on the touch screen, and said touch input data is interpreted as instructions for movement of said object by said unit.
  • the invention also relates to a method and a computer program product for controlling an object displayed on a display which display is a touch screen and the object has a velocity vector v and a direction vector d, which is a direction towards which the object is pointing.
  • Touch sensing systems are commonly used in a variety of applications. Typically touch systems are actuated by a touch object, such a finger or a stylus, either in direct contact or through proximity with a touch surface. Touch system that is overlaid or integrated in a display is known as a touch screen. The user can use the touch screen to react to what is displayed and to control how it is displayed for example by zooming the text size or moving some object on the screen. The touch screen enables the user to interact directly with what is displayed, rather than us- ing some additional devices such as mouse or keyboard. Touch screens are common in devices such as game consoles, personal computers, tablet computers, and smart phones.
  • the touch sensing technology gives the possibility to exchange hardware input devices with virtual equivalents.
  • Devices having a touch screen are known to have a virtual joystick where a visual representation of a joystick is displayed on a touch screen.
  • Touch input data from the touch screen is processed to determine when the touch data indicates user interaction with the virtual joystick.
  • Joystick data is generated based on the user interaction with the virtual joystick and the display of an object or an element on the touch screen can be adjusted based on the said joystick data. The user therefore can control the movement of the object by positioning a finger or some other touch object in relation to the virtual representation of the joystick.
  • the object displayed on the display of the touch screen can be moved or guided by touch gestures on the touch screen.
  • the user controls the movement of the object by positioning a finger or some other touch object on the touch screen producing a rest position and then moving said finger or some other touch object on the touch screen.
  • FIG. 1 a is presented an electric device 100 that comprises a touch screen 102.
  • a touch screen 102 On the touch screen is displayed an object 103.
  • a finger is moved on the touch screen 102 for producing a touch input data for the device for moving the object.
  • the starting point is called a rest position 104 and the end point is called a second position 107.
  • a second position can be the next rest position but there can be also a new rest position indicating that the finger is lifted from the surface of the touch screen and moved to some other position.
  • the device calculates the touch vector and based on the touch vector generates a movement vector 105.
  • the device then moves the object in the direction of the movement vector.
  • the finger is moved on the touch screen 102 thereby producing a touch input data from a rest position 108 which is the previous second position to a new second position 1 1 1 and there is a new touch vector 1 10.
  • the device calculates a new movement vector 109.
  • the device then moves the object in the direction of the movement vector 109. This produces unnatural looking movement and also the turning may happen too late.
  • This kind of object controlling leads easily to too large movements and thus too large trajectories.
  • the touch points move on the touch screen, they can reach the proximity of the border of the touch screen. This demands moving a finger to a new place farther from the border. This may cause a stop on the movement and misguidance.
  • the object of the invention is a solution by which the disadvantages and drawbacks of the prior art can be diminished.
  • the object of the invention is to provide a solution for moving an object on a touch screen in a way that the speed and direction are precisely controlled at the same time.
  • the objects according to the invention are achieved through a unit, a method, and a computer program product characterized by what is disclosed in the independent claims. Some preferred embodiments of the invention are disclosed in the dependent claims.
  • the main idea of the invention is to calculate the normalised object velocity vector from the input position change vector and the previous normalised object velocity vector.
  • the velocity vector of the object asymptotically closes the normalised object velocity vector. This allows precise controlling of the object.
  • the magnitude of the normal- ised object velocity vector is changed when the angle between the direction vector, i.e. the vector showing the facing direction of the object, and the input position change vector is small.
  • the calculated normalised object velocity vector is given the same magnitude of the normalised object velocity vector as the previous normalised object velocity vector.
  • the movement of the object on the display can be real and in that case the object is moved on different places on the display of the touch screen.
  • the movement of the object can be virtual and in that case the object is stationary and the background is moved thereby creating an illusion of movement.
  • the movement of the object can be a combination of both.
  • a processor arranged to receive touch input data that indicates touch input on the touch screen, and said touch input data is interpreted as instructions to move said object by said unit.
  • the movement of the object is real or virtual or a combination of both and touch input data comprises at least an input position vector p and an input change position vector ⁇ since last sampling of the touch input data and the object has a velocity vector v and a direction vector d which is a direction towards which the object is pointing.
  • the unit is arranged to calculate a normalised velocity vector u for the object, which nor- malised velocity vector has a magnitude U, based on the input change position vector ⁇ , sensitivity K, which is a proportion between the input change position vector ⁇ and velocity vector v, and the previous normalised velocity vector.
  • the unit is arranged to change the velocity vector v towards i , thus enhancing effective movement.
  • the unit is arranged to change the magnitude U of the i , and otherwise U of the Unext is retained from the u that was used for calculation of the i t-
  • the unit is arranged to change the velocity vector v towards i , the velocity vector asymptotically moves towards S*u ne xt, where S is the maximum speed allowed for the object. This provides that the velocity vector v is changed as much as possible. Fast changes in the velocity vector v allow fast response from the touch input data, i.e. touch on the touch screen.
  • the unit is arranged to calculate an object turning speed T based on the difference of directions between the velocity vector v and the calculated u ne xt, in a way that when the Unext is to the left of the v, the object turning speed T is positive, and when the u ne xt is to the right of the v, the object turning speed T is negative.
  • the calculated object turning speeds T therefore provides faster turns of the object.
  • the object has a preferable route and the unit is arranged to calculate an object turning speed T based on the difference of directions between the velocity vector v and the calculated Unext, and to choose the direction of the turning of the object to be clockwise or counterclockwise based on which direction is closer to the preferable route, i.e. to choose the sign of the object turning speed T.
  • the magnitude of the object turning speed T is proportional to the angle between the velocity vector v and the calculated u ne xt, in a way that T is small when the angle is small and grows when the said angle grows.
  • the unit is arranged to set the direction of the normalised velocity vector u the same as the direction of the velocity vector v when starting a new touch input data. In this way the path of the movement is not disturbed even if the object controlling user releases the touch screen momentarily.
  • the unit is arranged to change the magnitude U of the i when the angle between the input change position vector ⁇ and the direction vector d is between +35 and -35 degrees.
  • the unit when changing the magnitude U, the unit is arranged to increase the magnitude if the unit interprets from the touch input data that the object accelerates, and to decrease the magnitude if the unit interprets from the touch input data that the object brakes.
  • the unit when the angle between the direction vector d and the normalised velocity vector u is near 180 degrees, the unit is arranged to change the sign of the object turning speed T to be the same as the angular velocity W of the object. This prevents erratic turnings and possible switching between the directions and stuck events when quickly changing directions opposite or near opposite. This provides stable behaviour in big and fast turnings.
  • the unit is arranged to display a visual presentation of the velocity vector v or a combination of several velocity vectors on the display near the object indicating the direction to which the object is moving. This makes the user to quickly see how his or her touch movements affect the movements of the object.
  • the visual presentation is an arrow-like image that is transparent. In that way the visual presentation does not prevent the user from seeing the elements of the display.
  • the visual presentation is a spline that curves toward the direction of the velocity vector v starting from the current position of the object.
  • the spline curvature is determined by the maximum angular velocity W max of the object.
  • the sensitivity K is an arbitrary positive number and when K is bigger than an arbitrary value, the length of the object movement on the display is bigger than the length of the input position vector, and when K is smaller than the said arbitrary value, the length of the object movement on the display is smaller than the length of the input position vector.
  • the sen- sitivity K is a predetermined value that is set by a user of a device containing said unit or it is a set constant.
  • the object has a velocity vector v and a direction vector d, which is a direction towards which the object is pointing.
  • the method also comprises steps where the velocity vector v is changed towards i and if the angle between the input change position vector ⁇ and the direction vector d is small, the unit is arranged to change the magnitude U of the i , and otherwise U of the i is retained from the u that was used for calculation of the i t-
  • which computer program product when executed on a device having at least a processor and a touch screen, is adapted to perform the steps of the method for controlling an object displayed on a display which display is a touch screen, and the object has a velocity vector v and a direction vector d which is a direction towards which the object is pointing.
  • the computer program product also comprises computer program code means which are arranged to change the magnitude U of the i if the angle between the input change position vector ⁇ and the direction vector d is small, and otherwise retain the U of the Unext from the u that was used for calculation of the i , change the velocity vector v towards i , and give instructions to move the object on the display based on the velocity vector v.
  • An advantage of the invention is that it provides a precise controlling system for moving an object displayed on a touch screen.
  • the controlling system is also intu- itive because it interprets the touch input data in a way that resembles natural movements.
  • An advantage of the invention is that it can be applied to all applications where an object is moved by touch around the touch screen.
  • a further advantage of the invention is that it does not require any fixed graphical interface on the touch screen.
  • an advantage of the invention is that it is always responsive, meaning that there are no blind spots where a change in the control input would have no effect.
  • An advantage of the invention is also that it allows fully controlling velocity, i.e. both speed and direction simultaneously by a single touch as an input.
  • an advantage of the invention is that it is applicable to both 2D and 3D movement of an object.
  • the invention is also applicable to situations where the movement is virtual, i.e. the background or some elements of the background is/are moved and the object is stationary. Description of the figures
  • Figure 1 a shows a prior art example of moving an object on a touch screen
  • Figure 1 b shows a prior art example of moving an object on a touch screen
  • Figure 2 shows by way of example a device according to the invention
  • Figure 3 shows by way of example movements of the object
  • Figure 4 shows an example of the touch input data
  • Figure 5 shows an example of the calculation of the i
  • Figure 6 shows an example of the modification of the magnitude U when the angle between d and ⁇ is smaller than given threshold angle
  • Figure 7 shows an example of the modification of the magnitude U when the angle between d and ⁇ is bigger than given threshold angle
  • Figure 8 shows an example of a movement of an object on a touch screen according the invention
  • Figure 9 shows an example of a visual presentation of a velocity vector
  • Figure 10 shows an example of a method according the invention.
  • FIG 2 is shown by way of example a device 200 having a touch screen 201 , a unit 204 for controlling an object displayed on the touch screen, and a graphical unit 205 for updating the touch screen display.
  • the unit 204 comprises a processor 203 and a memory 202.
  • the processor is arranged to receive and interpret touch data from the touch screen and give instructions to the graphical unit 205 for moving or otherwise visually modifying at least one object on the touch screen.
  • the instructions about how to handle the touch input data and modify the movements of the object on the touch screen are stored on the memory 202.
  • the device having a unit according to the invention could be different than what is presented here.
  • the unit according to the invention does not necessarily need a memory but can use for example a memory that is outside the unit.
  • Figure 3 is described the movement of an object 301 on a display that is a touch screen.
  • the object 301 has a velocity vector v1 that describes the direction and the speed of the object.
  • the object has a direction vector d1 that describes the facing of the object, i.e. to what direction it is currently going.
  • the object is given a new velocity vector v2. Because the object does not change direction and speed instantly, a direction vector d can have different direction than the velocity vector.
  • the object 301 has the velocity vector v2 and a direction vector d2. The object receives a new velocity vector v3.
  • Figure 4 is described an input position vector p and an input change position vector ⁇ , and how they form touch input data.
  • the input position vector p starts from origin which is an arbitrary position on the screen.
  • the device where the touch screen is samples touch data and determines if there are changes in the input position vector p.
  • the input change position vector ⁇ is the actual movement of the input position on the touch screen. It is equal to the velocity of the movement of the finger or some other touching means multiplied by the sampling period of the touch screen.
  • Sensitivity describes the relation of the touch movement on the touch screen and the movement of the object on the screen. For example a finger touch on the screen is longer than the length of the movement of the object corresponding to said finger touch.
  • the normalised velocity vectors u are used to define the velocity vectors v of the object.
  • v is moved towards the said u.
  • the restricting factors are speed limitations and turning limitations, i.e. the v that is to be changed must not exceed these values.
  • the v asymptotically closes the normalised velocity vector u. In this process the velocity vector v tracks down the normalised velocity vectors u, but does not overshoot them.
  • closing the normalised velocity vector u the direction and the speed of the velocity vector v close corresponding values of the normalised velocity vector u. This provides sharper movements compared to the prior art.
  • the normalised velocity vector u is multiplied by the maximum speed S allowed for the object. This makes fast movements possible.
  • the object turning speed T is calculated based on the difference of directions between the velocity vector v and the calculated i , in a way that when the i is to the left of the v, the object turning speed T is positive, and when the i is to the right of the v, the object turning speed T is negative.
  • the sign of the object turning speed indicates the direction of the turn.
  • the velocity vector v closes towards the normalised velocity vector u by the speed of the object turning speed T.
  • the magnitude of T is proportional to the angle between the normalised velocity vector u and the velocity vector v. This means that the magnitude of the object turning speed T is zero or near zero if the input ( ⁇ ) dictates that the current direction is maintained.
  • the sign of the object turning speed T is changed to be the same as the angular velocity W of the object. This ensures that turning has a stable behaviour.
  • the sign of object turning speed T is changed in a way that the object steers away possible obstacles automatically if there is such a route. This makes turnings in the difficult places easier and the control method is still intuitive. This can be used also with the previously described situation where the direction vector d and the normalised velocity vector u were nearly in opposite directions.
  • the magnitude U of the normalised velocity vector u is modified.
  • the modification of the magnitude U is done based on the angle between the direction vector d of the object and the input change position vector ⁇ .
  • There is a threshold value for the angle Preferably this angle is small, i.e. the directions between the direction vector d and the input change position vector ⁇ do not differ much from each other. In an advantageous embodiment this angle is between +35 and - 35 degrees. If the angle between the direction vector d of the object and the input change position vector ⁇ is small or within said interval, the magnitude U of the Unext is to be changed. In this change the magnitude U could be increased or decreased depending on what kind of a movement is detected.
  • the magnitude is to be increased, and if it is deducted that the object brakes, the magnitude is to be decreased. Of course, due to slowing down, if the speed is attempted to be kept constant, the magnitude increases. If the angle between the direction vector d of the object and the input change position vector ⁇ is outside the threshold angle, the magnitude U of the i is replaced with the magnitude U of the normalised velocity vector u that was used in the calculation of the i t- In this way the speed of the velocity vector v stays essentially the same even if there are turns.
  • the object has a velocity vector v1.
  • the normalised velocity vector u1 is calculated according to the invention.
  • the velocity vector v1 changes towards u1 and the result is a new velocity vector v2.
  • a new input change position vector ⁇ 1 is read.
  • a new normalised velocity vector u2 is calculated.
  • the velocity vector v2 changes towards u2 and the result is a new velocity vector v3. Since the velocity vectors v track the new normalised velocity vectors u, it can be seen that a new velocity vector is between the previous velocity vector and the normalised velocity vector that was used for defining the velocity vector.
  • the velocity vector v2 is between the normalised velocity vector u1 and the velocity vector v1 and the velocity vector v3 is between the normalised velocity vector u2 and the velocity vector v2.
  • Figure 9 is described an example of a visual presentation 902 of a velocity vector.
  • an object 901 that is controlled by touch input that is described in this example by two input change position vec- tors ⁇ 1 and ⁇ 2.
  • These input change position vectors correspond to the movement of the object that is described by velocity vectors v1 and v2.
  • a visual presentation 902 is displayed near the object.
  • the visual presentation is a presentation of the velocity vector or several velocity vectors. In this example the visual presentation corresponds to the velocity vectors v1 and v2.
  • Figure 10 shows by way of example a flow chart of the method according to the invention. Hereby the method is described step by step.
  • Controlling an object displayed on a touch screen is started at 1001 .
  • an initial touch data input is read.
  • first velocity vector v and direction vector d are calculated based on the touch data.
  • a normalised velocity vector u is defined by the first velocity vector v.
  • the object is moved according to those values.
  • step 1004 touch data input is read. This is based on the sampling rate of the device from which the touch data is gathered. In that step an input change position vector ⁇ is calculated.
  • step 1005 a new normalised velocity vector u ne xt is calculated similarly as was described before by using previous u, sensitivity K, and an input change position vector ⁇ .
  • the direction vector d and the velocity vector v calculated in step 1003 are unchanged.
  • step 1006 it is checked if the angle between the input change position vector ⁇ and the direction vector d is small, i.e. within some predetermined fixed value which is small compared to 180 degrees. If the answer is YES, then in step 1008, the magnitude U of the calculated normalised velocity vector u ne xt is changed. Then in step 1009 the velocity vector v of the object displayed on the touch screen is changed towards the modified normalised velocity vector u ne xt-
  • step 1007 the magnitude U of the calculated normalised velocity vector i is the magnitude of the previous normalised velocity vector u, i.e. the normalised velocity vector that was used for calculating the normalised velocity vector u ne xt- Then in step 1009 the velocity vector v of the object displayed on the touch screen is changed towards the modified normalised velocity vector
  • step 1010 the object is moved on the touch screen based on the velocity vector v that was changed in step 1009.
  • step 1004 a touch data input is read. At this time also the direction vector d is detected or calculated. In step 1005 a new normalised velocity vector i is calculated. At this time the previous u is the u that was modified in step 1007 or in step 1008.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In the invention an object (301, 901) displayed on a display which display is a touch screen (201) is controlled. The main idea is to calculate the normalised object velocity vector from the input position change vector and the previous normalised object velocity vector. The velocity vector of the object (301, 901) asymptotically closes the normalised object velocity vector. This allows precise controlling the object (301, 901). However, to prevent the speed from slowing down due to turnings, the magnitude of the normalised object velocity vector is changed when the angle between the direction vector, i.e. the vector showing the facing direction of the object (301, 901), and the input position change vector is small. Otherwise the calculated normalised object velocity vector is given the same magnitude of the normalised object velocity vector as the previous normalised object velocity vector.

Description

Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product
The invention relates to a unit for controlling an object displayed on a display which display is a touch screen and the unit comprises a processor arranged to receive touch input data that indicates touch input on the touch screen, and said touch input data is interpreted as instructions for movement of said object by said unit. The invention also relates to a method and a computer program product for controlling an object displayed on a display which display is a touch screen and the object has a velocity vector v and a direction vector d, which is a direction towards which the object is pointing.
Prior art
Touch sensing systems are commonly used in a variety of applications. Typically touch systems are actuated by a touch object, such a finger or a stylus, either in direct contact or through proximity with a touch surface. Touch system that is overlaid or integrated in a display is known as a touch screen. The user can use the touch screen to react to what is displayed and to control how it is displayed for example by zooming the text size or moving some object on the screen. The touch screen enables the user to interact directly with what is displayed, rather than us- ing some additional devices such as mouse or keyboard. Touch screens are common in devices such as game consoles, personal computers, tablet computers, and smart phones.
The touch sensing technology gives the possibility to exchange hardware input devices with virtual equivalents. Devices having a touch screen are known to have a virtual joystick where a visual representation of a joystick is displayed on a touch screen. Touch input data from the touch screen is processed to determine when the touch data indicates user interaction with the virtual joystick. Joystick data is generated based on the user interaction with the virtual joystick and the display of an object or an element on the touch screen can be adjusted based on the said joystick data. The user therefore can control the movement of the object by positioning a finger or some other touch object in relation to the virtual representation of the joystick.
Of course, the object displayed on the display of the touch screen can be moved or guided by touch gestures on the touch screen. In that case the user controls the movement of the object by positioning a finger or some other touch object on the touch screen producing a rest position and then moving said finger or some other touch object on the touch screen. On the path of the finger or some other touch object there is a second position and the device with the touch screen cal- culates a touch vector between the rest position and the second position. The device then moves the object on the display of the touch screen to the direction based on the touch vector.
In Figure 1 a is presented an electric device 100 that comprises a touch screen 102. On the touch screen is displayed an object 103. A finger is moved on the touch screen 102 for producing a touch input data for the device for moving the object. The starting point is called a rest position 104 and the end point is called a second position 107. It must be noted that those mentioned terms are explanatory and the touch input data contains several said positions. A second position can be the next rest position but there can be also a new rest position indicating that the finger is lifted from the surface of the touch screen and moved to some other position. Between the rest position 104 and the second position 107 there is a touch vector 106. The device calculates the touch vector and based on the touch vector generates a movement vector 105. The device then moves the object in the direction of the movement vector. In Figure 1 b the finger is moved on the touch screen 102 thereby producing a touch input data from a rest position 108 which is the previous second position to a new second position 1 1 1 and there is a new touch vector 1 10. Based on the touch vector 1 10 the device calculates a new movement vector 109. The device then moves the object in the direction of the movement vector 109. This produces unnatural looking movement and also the turning may happen too late. This kind of object controlling leads easily to too large movements and thus too large trajectories. Also, when the touch points move on the touch screen, they can reach the proximity of the border of the touch screen. This demands moving a finger to a new place farther from the border. This may cause a stop on the movement and misguidance.
Using virtual joysticks may help with some of these problems but it demands lifting the finger and is not always intuitive. Also, making turns and maintaining speed is difficult with the virtual joysticks. Sometimes the user must stop, turn, and then start moving again. Summary
The object of the invention is a solution by which the disadvantages and drawbacks of the prior art can be diminished. Particularly, the object of the invention is to provide a solution for moving an object on a touch screen in a way that the speed and direction are precisely controlled at the same time.
The objects according to the invention are achieved through a unit, a method, and a computer program product characterized by what is disclosed in the independent claims. Some preferred embodiments of the invention are disclosed in the dependent claims. The main idea of the invention is to calculate the normalised object velocity vector from the input position change vector and the previous normalised object velocity vector. The velocity vector of the object asymptotically closes the normalised object velocity vector. This allows precise controlling of the object. However, to prevent the speed from slowing down due to turnings, the magnitude of the normal- ised object velocity vector is changed when the angle between the direction vector, i.e. the vector showing the facing direction of the object, and the input position change vector is small. Otherwise the calculated normalised object velocity vector is given the same magnitude of the normalised object velocity vector as the previous normalised object velocity vector. The movement of the object on the display can be real and in that case the object is moved on different places on the display of the touch screen. The movement of the object can be virtual and in that case the object is stationary and the background is moved thereby creating an illusion of movement. Naturally, the movement of the object can be a combination of both. According to an embodiment of the invention, in a unit for controlling an object displayed on a display, which display is a touch screen, there is a processor arranged to receive touch input data that indicates touch input on the touch screen, and said touch input data is interpreted as instructions to move said object by said unit. According to a preferred embodiment of the invention, the movement of the object is real or virtual or a combination of both and touch input data comprises at least an input position vector p and an input change position vector Δρ since last sampling of the touch input data and the object has a velocity vector v and a direction vector d which is a direction towards which the object is pointing. The unit is arranged to calculate a normalised velocity vector u for the object, which nor- malised velocity vector has a magnitude U, based on the input change position vector Δρ, sensitivity K, which is a proportion between the input change position vector Δρ and velocity vector v, and the previous normalised velocity vector. The previous normalised velocity vector is calculated using equation unext = (u + Κ*Δρ). The unit is arranged to change the velocity vector v towards i , thus enhancing effective movement. To prevent gradual slowing down, if the angle between the input change position vector Δρ and the direction vector d is small, the unit is arranged to change the magnitude U of the i , and otherwise U of the Unext is retained from the u that was used for calculation of the i t- In an embodiment of the unit according to the invention, the unit is arranged to change the velocity vector v towards i , the velocity vector asymptotically moves towards S*unext, where S is the maximum speed allowed for the object. This provides that the velocity vector v is changed as much as possible. Fast changes in the velocity vector v allow fast response from the touch input data, i.e. touch on the touch screen.
In a second embodiment of the unit according to the invention, the unit is arranged to calculate an object turning speed T based on the difference of directions between the velocity vector v and the calculated unext, in a way that when the Unext is to the left of the v, the object turning speed T is positive, and when the unext is to the right of the v, the object turning speed T is negative. The calculated object turning speeds T therefore provides faster turns of the object.
In a third embodiment of the unit according to the invention, the object has a preferable route and the unit is arranged to calculate an object turning speed T based on the difference of directions between the velocity vector v and the calculated Unext, and to choose the direction of the turning of the object to be clockwise or counterclockwise based on which direction is closer to the preferable route, i.e. to choose the sign of the object turning speed T.
In a fourth embodiment of the unit according to the invention, the magnitude of the object turning speed T is proportional to the angle between the velocity vector v and the calculated unext, in a way that T is small when the angle is small and grows when the said angle grows.
In a fifth embodiment of the unit according to the invention, the unit is arranged to set the direction of the normalised velocity vector u the same as the direction of the velocity vector v when starting a new touch input data. In this way the path of the movement is not disturbed even if the object controlling user releases the touch screen momentarily.
In a sixth embodiment of the unit according to the invention, the unit is arranged to change the magnitude U of the i when the angle between the input change position vector Δρ and the direction vector d is between +35 and -35 degrees.
In a seventh embodiment of the unit according to the invention, when changing the magnitude U, the unit is arranged to increase the magnitude if the unit interprets from the touch input data that the object accelerates, and to decrease the magnitude if the unit interprets from the touch input data that the object brakes. In an eighth embodiment of the unit according to the invention, when the angle between the direction vector d and the normalised velocity vector u is near 180 degrees, the unit is arranged to change the sign of the object turning speed T to be the same as the angular velocity W of the object. This prevents erratic turnings and possible switching between the directions and stuck events when quickly changing directions opposite or near opposite. This provides stable behaviour in big and fast turnings.
In a ninth embodiment of the unit according to the invention the unit is arranged to display a visual presentation of the velocity vector v or a combination of several velocity vectors on the display near the object indicating the direction to which the object is moving. This makes the user to quickly see how his or her touch movements affect the movements of the object. In a tenth embodiment of the unit according to the invention the visual presentation is an arrow-like image that is transparent. In that way the visual presentation does not prevent the user from seeing the elements of the display. In an eleventh embodiment of the unit accord- ing to the invention the visual presentation is a spline that curves toward the direction of the velocity vector v starting from the current position of the object. In a twelfth embodiment of the unit according to the invention the spline curvature is determined by the maximum angular velocity Wmax of the object.
In a thirteenth embodiment of the unit according to the invention the sensitivity K is an arbitrary positive number and when K is bigger than an arbitrary value, the length of the object movement on the display is bigger than the length of the input position vector, and when K is smaller than the said arbitrary value, the length of the object movement on the display is smaller than the length of the input position vector. In a fourteenth embodiment of the unit according to the invention the sen- sitivity K is a predetermined value that is set by a user of a device containing said unit or it is a set constant.
In a method according to an embodiment of the invention, for controlling an object displayed on a display which is a touch screen, the object has a velocity vector v and a direction vector d, which is a direction towards which the object is pointing. According to a preferred embodiment of the invention, the method comprises the steps where touch input data is read and touch input data comprises at least an input position vector p and an input change position vector Δρ since last sampling of the touch input data, and a normalised velocity vector u for the object is calcu- lated, which normalised velocity vector has a magnitude U, based on the input change position vector Δρ, sensitivity K, which is a proportion between the input change position vector Δρ and velocity vector v, and the previous normalised velocity vector by equation unext = (u + Κ*Δρ ). The method also comprises steps where the velocity vector v is changed towards i and if the angle between the input change position vector Δρ and the direction vector d is small, the unit is arranged to change the magnitude U of the i , and otherwise U of the i is retained from the u that was used for calculation of the i t-
In an embodiment of the computer program product according to the invention, which computer program product, when executed on a device having at least a processor and a touch screen, is adapted to perform the steps of the method for controlling an object displayed on a display which display is a touch screen, and the object has a velocity vector v and a direction vector d which is a direction towards which the object is pointing. According to a preferred embodiment of the invention, the computer program product comprises computer program code means which are arranged to read touch input data, which touch input data comprises at least an input position vector p and an input change position vector Δρ since last sampling of the touch input data, and to calculate a normalised velocity vector u for the object, which normalised velocity vector has a magnitude U, based on the input change position vector Δρ, sensitivity K, which is a proportion between the input change position vector Δρ and velocity vector v, and the previous normalised velocity vector by equation unext = (u + Κ*Δρ ). The computer program product also comprises computer program code means which are arranged to change the magnitude U of the i if the angle between the input change position vector Δρ and the direction vector d is small, and otherwise retain the U of the Unext from the u that was used for calculation of the i , change the velocity vector v towards i , and give instructions to move the object on the display based on the velocity vector v.
An advantage of the invention is that it provides a precise controlling system for moving an object displayed on a touch screen. The controlling system is also intu- itive because it interprets the touch input data in a way that resembles natural movements.
An advantage of the invention is that it can be applied to all applications where an object is moved by touch around the touch screen.
A further advantage of the invention is that it does not require any fixed graphical interface on the touch screen.
Further, an advantage of the invention is that it is always responsive, meaning that there are no blind spots where a change in the control input would have no effect.
An advantage of the invention is also that it allows fully controlling velocity, i.e. both speed and direction simultaneously by a single touch as an input.
Also, an advantage of the invention is that it is applicable to both 2D and 3D movement of an object. The invention is also applicable to situations where the movement is virtual, i.e. the background or some elements of the background is/are moved and the object is stationary. Description of the figures
In the following, the invention will be described in detail. In the description, reference is made to the enclosed drawings, in which
Figure 1 a shows a prior art example of moving an object on a touch screen, Figure 1 b shows a prior art example of moving an object on a touch screen, Figure 2 shows by way of example a device according to the invention, Figure 3 shows by way of example movements of the object, Figure 4 shows an example of the touch input data, Figure 5 shows an example of the calculation of the i , Figure 6 shows an example of the modification of the magnitude U when the angle between d and Δρ is smaller than given threshold angle,
Figure 7 shows an example of the modification of the magnitude U when the angle between d and Δρ is bigger than given threshold angle, Figure 8 shows an example of a movement of an object on a touch screen according the invention,
Figure 9 shows an example of a visual presentation of a velocity vector, and
Figure 10 shows an example of a method according the invention.
Detailed description of drawings The embodiments in the following description are given as examples only, and a person skilled in the art may realise the basic idea of the invention also in some other way than what is described in the description. Though the description may refer to a certain embodiment or embodiments in several places, this does not mean that the reference would be directed towards only one described embodi- ment or that the described characteristic would be usable only in one described embodiment. The individual characteristics of two or more embodiments may be combined and new embodiments of the invention may thus be provided.
Figures 1 a and 1 b were described when discussing the prior art.
In Figure 2 is shown by way of example a device 200 having a touch screen 201 , a unit 204 for controlling an object displayed on the touch screen, and a graphical unit 205 for updating the touch screen display. The unit 204 comprises a processor 203 and a memory 202. The processor is arranged to receive and interpret touch data from the touch screen and give instructions to the graphical unit 205 for moving or otherwise visually modifying at least one object on the touch screen. The instructions about how to handle the touch input data and modify the movements of the object on the touch screen are stored on the memory 202. It must be noted that the device having a unit according to the invention could be different than what is presented here. Also the unit according to the invention does not necessarily need a memory but can use for example a memory that is outside the unit.
In Figure 3 is described the movement of an object 301 on a display that is a touch screen. In Figure 3a the object 301 has a velocity vector v1 that describes the direction and the speed of the object. The object has a direction vector d1 that describes the facing of the object, i.e. to what direction it is currently going. The object is given a new velocity vector v2. Because the object does not change direction and speed instantly, a direction vector d can have different direction than the velocity vector. In Figure 3b the object 301 has the velocity vector v2 and a direction vector d2. The object receives a new velocity vector v3.
In Figure 4 is described an input position vector p and an input change position vector Δρ, and how they form touch input data. In Figure 4a the input position vector p starts from origin which is an arbitrary position on the screen. The device where the touch screen is, samples touch data and determines if there are changes in the input position vector p. The input change position vector Δρ is the actual movement of the input position on the touch screen. It is equal to the velocity of the movement of the finger or some other touching means multiplied by the sampling period of the touch screen. In Figure 4b there are several input change posi- tion vectors Δρ that describe the movement of the touching point, i.e. the input point on the touch screen.
It must be noted that the properties of the input position vectors and the input change position vectors depend on the sample frequency and not on the direction changes. In Figure 5 it is described how a new normalised velocity vector unext is calculated in the invention. In Figure 5a there is a normalised velocity vector u. This is obtained for example from previous calculations or it is an estimate of when movement is initiated, i.e. the touch data is started. In an advantageous embodiment, when starting a new touch input data the direction of the normalised velocity vec- tor u is the same as the direction of the velocity vector v.
A new normalised velocity vector, which is called i , is calculated using a previous normalised velocity vector u, the change in input position vector Δρ and the sensitivity K, by equation unext = (u + Κ*Δρ ). This is presented in Figure 5b. Sensitivity describes the relation of the touch movement on the touch screen and the movement of the object on the screen. For example a finger touch on the screen is longer than the length of the movement of the object corresponding to said finger touch.
In the invention the normalised velocity vectors u are used to define the velocity vectors v of the object. When the normalised velocity vector u is calculated, v is moved towards the said u. The restricting factors are speed limitations and turning limitations, i.e. the v that is to be changed must not exceed these values. The v asymptotically closes the normalised velocity vector u. In this process the velocity vector v tracks down the normalised velocity vectors u, but does not overshoot them. When closing the normalised velocity vector u, the direction and the speed of the velocity vector v close corresponding values of the normalised velocity vector u. This provides sharper movements compared to the prior art. In an advantageous embodiment the normalised velocity vector u is multiplied by the maximum speed S allowed for the object. This makes fast movements possible. To change the velocity vector v towards the normalised velocity vector u, the object turning speed T is calculated based on the difference of directions between the velocity vector v and the calculated i , in a way that when the i is to the left of the v, the object turning speed T is positive, and when the i is to the right of the v, the object turning speed T is negative. The sign of the object turning speed indicates the direction of the turn. The velocity vector v closes towards the normalised velocity vector u by the speed of the object turning speed T. The magnitude of T is proportional to the angle between the normalised velocity vector u and the velocity vector v. This means that the magnitude of the object turning speed T is zero or near zero if the input (Δρ) dictates that the current direction is maintained.
In an advantageous embodiment, if the angle between the direction vector d and the normalised velocity vector u is so big that they have nearly opposite directions, the sign of the object turning speed T is changed to be the same as the angular velocity W of the object. This ensures that turning has a stable behaviour. On the display of the touch screen there may be some elements or objects that the object that is controlled by the invention is meant to avoid. In that case in an advantageous embodiment the sign of object turning speed T is changed in a way that the object steers away possible obstacles automatically if there is such a route. This makes turnings in the difficult places easier and the control method is still intuitive. This can be used also with the previously described situation where the direction vector d and the normalised velocity vector u were nearly in opposite directions.
However, the speed of the controlled object may slow down after a few turns. To prevent this, the magnitude U of the normalised velocity vector u is modified. The modification of the magnitude U is done based on the angle between the direction vector d of the object and the input change position vector Δρ. There is a threshold value for the angle. Preferably this angle is small, i.e. the directions between the direction vector d and the input change position vector Δρ do not differ much from each other. In an advantageous embodiment this angle is between +35 and - 35 degrees. If the angle between the direction vector d of the object and the input change position vector Δρ is small or within said interval, the magnitude U of the Unext is to be changed. In this change the magnitude U could be increased or decreased depending on what kind of a movement is detected. If it is deducted that the object accelerates, the magnitude is to be increased, and if it is deducted that the object brakes, the magnitude is to be decreased. Of course, due to slowing down, if the speed is attempted to be kept constant, the magnitude increases. If the angle between the direction vector d of the object and the input change position vector Δρ is outside the threshold angle, the magnitude U of the i is replaced with the magnitude U of the normalised velocity vector u that was used in the calculation of the i t- In this way the speed of the velocity vector v stays essentially the same even if there are turns.
In Figures 6 and 7 it is described how a magnitude U of a normalised velocity vector u is modified. In Figure 6a there is a normalised velocity vector u, a direction vector d and an input change position vector Δρ. In this example the angle be- tween a direction vector d and an input change position vector Δρ is small, i.e. within a threshold angle. In Figure 6b there is an unmodified normalised velocity vector Unext- In Figure 6c the magnitude is changed to be smaller. In the example of Figure 7 there are the same vectors as in Figure 6. In Figure 7a the angle between a direction vector d and an input change position vector Δρ is now outside the threshold angle. In Figure 7b there is an unmodified velocity vector unext- In Figure 7c the magnitude of i is replaced by the magnitude of u as seen in Figure 7a. Because the magnitude of u is bigger than the magnitude of the i , the modified i is longer than the unmodified. This means that the speed of the object does not decrease. In Figure 8 there is an example on how normalised velocity vectors u and velocity vectors v relate.
The object has a velocity vector v1. The normalised velocity vector u1 is calculated according to the invention. The velocity vector v1 changes towards u1 and the result is a new velocity vector v2. A new input change position vector Δρ1 is read. Based on that, a new normalised velocity vector u2 is calculated. The velocity vector v2 changes towards u2 and the result is a new velocity vector v3. Since the velocity vectors v track the new normalised velocity vectors u, it can be seen that a new velocity vector is between the previous velocity vector and the normalised velocity vector that was used for defining the velocity vector. In this example the velocity vector v2 is between the normalised velocity vector u1 and the velocity vector v1 and the velocity vector v3 is between the normalised velocity vector u2 and the velocity vector v2.
In Figure 9 an example of a visual presentation 902 of a velocity vector is described. On a touch screen of a device 903 an object 901 is displayed that is controlled by touch input, which is described in this example by two input change position vectors Δp1 and Δp2. These input change position vectors correspond to the movement of the object, which is described by velocity vectors v1 and v2. To make the movements of the object easier to comprehend and anticipate for the user of the device 903, a visual presentation 902 is displayed near the object. The visual presentation is a presentation of the velocity vector or of several velocity vectors. In this example the visual presentation corresponds to the velocity vectors v1 and v2.
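One possible way to draw such a visual presentation is to sample a short path that bends from the object's heading towards the direction of the velocity vector, similarly to the curved spline described in claims 12 and 13. This is only a sketch building on the earlier examples; the constant-turn-rate model, the point count and the time step are assumptions, not details given in the patent.

def presentation_points(pos, heading, v, w_max, n_points=12, dt=0.05):
    # Sample points of a curved indicator starting at the object position.
    # The path turns from the current heading towards the direction of v,
    # limited by the maximum angular velocity w_max (cf. claim 13).
    speed = math.hypot(*v)
    target_heading = math.atan2(v[1], v[0])
    points = [pos]
    x, y, h = pos[0], pos[1], heading
    for _ in range(n_points):
        diff = (target_heading - h + math.pi) % (2 * math.pi) - math.pi
        h += max(-w_max * dt, min(w_max * dt, diff))
        x += speed * math.cos(h) * dt
        y += speed * math.sin(h) * dt
        points.append((x, y))
    return points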
Figure 10 shows by way of example a flow chart of the method according to the invention. Hereby the method is described step by step.
Controlling an object displayed on a touch screen is started at 1001. At 1002 an initial touch data input is read. In step 1003, a first velocity vector v and a direction vector d are calculated based on the touch data. A normalised velocity vector u is defined by the first velocity vector v. The object is moved according to those values.
In step 1004 touch data input is read. This is based on the sampling rate of the device from which the touch data is gathered. In that step an input change position vector Δp is calculated.
In step 1005 a new normalised velocity vector unext is calculated as described before, using the previous u, the sensitivity K and the input change position vector Δp. The direction vector d and the velocity vector v calculated in step 1003 are unchanged.
In step 1006 it is checked whether the angle between the input change position vector Δp and the direction vector d is small, i.e. within some predetermined fixed value which is small compared to 180 degrees. If the answer is YES, then in step 1008 the magnitude U of the calculated normalised velocity vector unext is changed. Then in step 1009 the velocity vector v of the object displayed on the touch screen is changed towards the modified normalised velocity vector unext.
If the answer is NO, then in step 1007 the magnitude U of the calculated normalised velocity vector unext is set to the magnitude of the previous normalised velocity vector u, i.e. the normalised velocity vector that was used for calculating the normalised velocity vector unext. Then in step 1009 the velocity vector v of the object displayed on the touch screen is changed towards the modified normalised velocity vector unext. In step 1010 the object is moved on the touch screen based on the velocity vector v that was changed in step 1009.
In step 1004 a touch data input is read again. At this time also the direction vector d is detected or calculated. In step 1005 a new normalised velocity vector unext is calculated. At this time the previous u is the u that was modified in step 1007 or in step 1008.
Then the controlling process is continued.
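Steps 1004-1010 can be condensed into a single per-sample update, building on the helpers from the earlier sketches. The state dictionary, the acceleration/braking flags and the way the position is advanced are assumptions made for illustration; the flow chart itself does not prescribe them.

def update(state, dp, K, max_speed, accelerating=False, braking=False):
    # One pass through steps 1004-1010 of the flow chart (sketch).
    u_prev = state["u"]
    # step 1005: unext = u + K * dp
    u_next = (u_prev[0] + K * dp[0], u_prev[1] + K * dp[1])
    # steps 1006-1008: change or retain the magnitude U of unext
    u_next = modify_magnitude(u_next, u_prev, state["d"], dp,
                              accelerating, braking)
    # step 1009: change the velocity vector v towards unext
    v_new = step_velocity(state["v"], u_next, max_speed)
    # step 1010: move the object based on the changed velocity vector
    state["position"] = (state["position"][0] + v_new[0],
                         state["position"][1] + v_new[1])
    state["u"], state["v"] = u_next, v_new
    return state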
It must be noted that this flow chart is a simplified example of the working of the invention. The object is, of course, moving, and touch screen data inputs are sampled constantly according to the instructions. Above, some preferred embodiments according to the invention have been described. The invention is not limited to the solutions described above, but the inventive idea can be applied in numerous ways within the scope of the claims.

Claims

1. Unit (204) for controlling an object (301, 901) displayed on a display which display is a touch screen (201), which unit (204) comprises a processor (203) arranged to receive touch input data that indicates touch input on the touch screen (201), and said touch input data is interpreted as instructions for movement of said object (301, 901) by said unit (204), characterized in that the movement of the object (301, 901) is real or virtual or a combination of both, and touch input data comprises at least an input position vector p and an input change position vector Δp since last sampling of the touch input data and the object (301, 901) has a velocity vector v and a direction vector d, which is a direction towards which the object (301, 901) is pointing, and the unit (204) is arranged to calculate a normalised velocity vector u for the object (301, 901), which normalised velocity vector has a magnitude U, based on the input change position vector Δp, sensitivity K, which is a proportion between the input change position vector Δp and velocity vector v, and the previous normalised velocity vector by the following equation unext = u + K*Δp, and the unit (204) is arranged to change the velocity vector v towards unext, and if the angle between the input change position vector Δp and the direction vector d is small, the unit (204) is arranged to change the magnitude U of the unext, and otherwise U of the unext is retained from the u that was used for calculation of the unext.
2. Unit (204) according to claim 1, characterized in that when the unit (204) is arranged to change the velocity vector v towards unext, the velocity vector asymptotically moves towards S*unext, where S is the maximum speed allowed for the object (301, 901).
3. Unit (204) according to one of the claims 1-2, characterized in that the unit (204) is arranged to calculate an object turning speed T based on the difference of directions between the velocity vector v and the calculated unext, in a way that when the unext is to the left of the v, the object turning speed T is positive, and when the unext is to the right of the v, the object turning speed T is negative.
4. Unit (204) according to one of the claims 1-2, characterized in that the object (301, 901) has a preferable route and the unit (204) is arranged to calculate an object turning speed T based on the difference of directions between the velocity vector v and the calculated unext, and to choose the direction of the turning of the object (301, 901) to be clockwise or counterclockwise based on which direction is closer to the preferable route, i.e. to choose the sign of the object turning speed T.
5. Unit (204) according to one of the claims 3-4, characterized in that the magnitude of the object turning speed T is proportional to the angle between the velocity vector v and the calculated unext, in a way that T is small when the angle is small and grows when the said angle grows.
6. Unit (204) according to one of the claims 1-5, characterized in that the unit (204) is arranged to set the direction of the normalised velocity vector u to be the same as the direction of the velocity vector v when a new touch input data is started.
7. Unit (204) according to one of the claims 1-6, characterized in that the unit (204) is arranged to change the magnitude U of the unext when the angle between the input change position vector Δp and the direction vector d is between +35 and -35 degrees.
8. Unit (204) according to one of the claims 1-7, characterized in that when changing the magnitude U, the unit (204) is arranged to increase the magnitude if the unit (204) interprets from the touch input data that the object (301, 901) accelerates, and to decrease the magnitude if the unit (204) interprets from the touch input data that the object (301, 901) brakes.
9. Unit (204) according to one of the claims 3-8, characterized in that when the angle between the direction vector d and the normalised velocity vector u is near 180 degrees, the unit (204) is arranged to change the sign of the object turning speed T to be the same as the angular velocity W of the object (301, 901).
10. Unit (204) according to one of claims 1-9, characterized in that the unit (204) is arranged to display a visual presentation (902) of the velocity vector v or a combination of several velocity vectors on the display near the object (301, 901) indicating the direction to which the object (301, 901) is moving.
11. Unit (204) according to claim 10, characterized in that the visual presentation (902) is an arrow-like image that is transparent.
12. Unit (204) according to one of the claims 10-11, characterized in that the visual presentation (902) is a spline that curves toward the direction of the velocity vector v starting from the current position of the object (301, 901).
13. Unit (204) according to claim 12, characterized in that the spline curvature is determined by the maximum angular velocity Wmax of the object (301, 901).
14. Unit (204) according to one of the claims 1-13, characterized in that the sensitivity K is an arbitrary positive number, and when K is bigger than an arbitrary value, the length of the object (301, 901) movement on the display is bigger than the length of the input position vector, and when K is smaller than the said arbitrary value, the length of the object (301, 901) movement on the display is smaller than the length of the input position vector.
15. Unit (204) according to claim 14, characterized in that the sensitivity K is a predetermined value that is set by a user of a device (200, 903) containing said unit (204) or it is a set constant.
16. Method for controlling an object (301, 901) displayed on a display which display is a touch screen (201), and the object (301, 901) has a velocity vector v and a direction vector d, which is a direction to which the object (301, 901) is pointing, characterized in that the method comprises the steps, where
- a touch input data is read (1004), and touch input data comprises at least an input position vector p and an input change position vector Δp since last sampling of the touch input data,
- a normalised velocity vector u for the object (301, 901) is calculated (1005), which normalised velocity vector has a magnitude U, based on the input change position vector Δp, sensitivity K, which is a proportion between the input change position vector Δp and velocity vector v, and the previous normalised velocity vector by equation unext = u + K*Δp,
- the velocity vector v is changed (1009) towards unext,
- if the angle between the input change position vector Δp and the direction vector d is small (1006), the unit (204) is arranged to change (1008) the magnitude U of the unext, and otherwise U of the unext is retained (1007) from the u that was used for calculation of the unext.
17. A computer program product which, when executed on a device (200, 903) having at least a processor (203) and a touch screen (201), is adapted to perform the steps of the method for controlling an object (301, 901) displayed on a display which display is a touch screen (201), and the object (301, 901) has a velocity vector v and a direction vector d, which is a direction towards which the object (301, 901) is pointing, characterized in that the computer program product comprises computer program code means which are arranged to
- read (1004) a touch input data, which touch input data comprises at least an input position vector p and an input change position vector Δp since last sampling of the touch input data,
- calculate (1005) a normalised velocity vector u for the object (301, 901), which normalised velocity vector has a magnitude U, based on the input change position vector Δp, sensitivity K, which is a proportion between the input change position vector Δp and velocity vector v, and the previous normalised velocity vector by equation unext = u + K*Δp,
- change (1008) the magnitude U of the unext if the angle between the input change position vector Δp and the direction vector d is small (1006), and otherwise retain (1007) the U of the unext from the u that was used for calculation of the unext,
- change (1009) the velocity vector v towards unext,
- give instructions to move (1010) the object (301, 901) on the display based on the velocity vector v.
PCT/FI2015/050828 2014-11-28 2015-11-27 Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product WO2016083671A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20146045 2014-11-28
FI20146045A FI127452B (en) 2014-11-28 2014-11-28 Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product

Publications (1)

Publication Number Publication Date
WO2016083671A1 true WO2016083671A1 (en) 2016-06-02

Family

ID=56073675

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2015/050828 WO2016083671A1 (en) 2014-11-28 2015-11-27 Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product

Country Status (2)

Country Link
FI (1) FI127452B (en)
WO (1) WO2016083671A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1726342A2 (en) * 2005-05-26 2006-11-29 Nintendo Co., Limited Image processing program and image processing device for moving display area
US20130217498A1 (en) * 2012-02-20 2013-08-22 Fourier Information Corp. Game controlling method for use in touch panel medium and game medium
US20130257807A1 (en) * 2012-04-03 2013-10-03 Apple Inc. System and method for enhancing touch input
US20140201666A1 (en) * 2013-01-15 2014-07-17 Raffi Bedikian Dynamic, free-space user interactions for machine control
US20140232669A1 (en) * 2013-02-15 2014-08-21 Flatfrog Laboratories Ab Interpretation of pressure based gesture

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117132595A (en) * 2023-10-25 2023-11-28 北京市肿瘤防治研究所 Intelligent light-weight processing method and system for DWI (discrete wavelet transform) images of rectal cancer and cervical cancer
CN117132595B (en) * 2023-10-25 2024-01-16 北京市肿瘤防治研究所 Intelligent light-weight processing method and system for DWI (discrete wavelet transform) images of rectal cancer and cervical cancer

Also Published As

Publication number Publication date
FI20146045A (en) 2016-05-29
FI127452B (en) 2018-06-15

Similar Documents

Publication Publication Date Title
EP3424208B1 (en) Movable user interface shutter button for camera
US20180373376A1 (en) Program and information processing method
JP6603059B2 (en) System and method for determining haptic effects for multi-touch input
EP2917818B1 (en) Mutli-stage gesture
CN107249706B (en) Game control program, game control method, and game control device
CN111135556B (en) Virtual camera control method and device, electronic equipment and storage medium
JP5932790B2 (en) Highlight objects on the display
KR101800182B1 (en) Apparatus and Method for Controlling Virtual Object
US20090058801A1 (en) Fluid motion user interface control
KR20160007634A (en) Feedback for gestures
KR102237363B1 (en) Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
JP2018187289A (en) Program and information processing device
KR102237452B1 (en) Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
JP6018474B2 (en) Program, information processing apparatus, information processing method, and information processing system
KR20160019449A (en) Disambiguation of indirect input
EP2362302B1 (en) Method for controlling motions of an object in a 3-dimensional virtual environment
WO2016083671A1 (en) Unit for controlling an object displayed on a display, a method for controlling an object displayed on a display and a computer program product
KR20180112785A (en) Information processing apparatus, information processing method, and program
US10444985B2 (en) Computing device responsive to contact gestures
CN113440835B (en) Virtual unit control method and device, processor and electronic device
US20160117075A1 (en) Advanced touch user interface
CN109847344A (en) Virtual reality exchange method and device, storage medium, electronic equipment
CN106843676A (en) For the method for toch control and touch control device of touch terminal
KR102224930B1 (en) Method of displaying menu based on depth information and space gesture of user
JP5618926B2 (en) Multipointing device control method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15862158

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15862158

Country of ref document: EP

Kind code of ref document: A1