US20150338974A1 - Definition and use of node-based points, lines and routes on touch screen devices


Info

Publication number: US20150338974A1
Authority: US (United States)
Prior art keywords: node, touch, user, segment, touch screen
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: US14/020,835
Inventor: Norman Michael Stone
Current assignee: STORMLIT Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: STORMLIT Ltd
Application filed by STORMLIT Ltd
Priority applications: US14/020,835; PCT/GB2014/051157; GB1517611.8A

Classifications

    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412: Digitisers structurally integrated in a display
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G06F 2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • In the prior art discussed below, no methods are node-based.
  • Nodes are defined here as user touch-defined points on a touch screen which may be moved by touch-and-drag without the background moving, and of which multiple can exist at the same time, either on the screen or in an off-screen area.
  • Nodes may also have data attached to them (in addition to their position on the touch screen), either defined by the application or by the user.
  • Nodes can have particular operations performed on them, such as deletion or association with other nodes. The length or routing of node-based lines and routes can be changed by moving one or more of their nodes.
  • Barkai et al., in U.S. Pat. No. 0,093,188 (2011), describe the different electronic mapping resources available, including Google Maps, OpenStreetMap and Wayfaring, when describing a means to visualise shared routes.
  • The points and routes described are from users and travellers who permit their location to be shared with the wider web community; the points of interest shared are therefore actual geographic locations measured from mobile devices (predominantly smartphones).
  • Barkai et al. describe only web-based applications (rather than maps embedded in mobile devices) through which to perform and customise this data sharing.
  • Mapping applications including Google Maps and Apple Maps have a feature whereby a touch screen user can define a point anywhere on the screen (typically marked by a pin symbol) which relates to the geographic point the user is effectively marking.
  • However, the location cannot be moved; the background map pans when an attempt is made to move the location.
  • The inventors of U.S. Pat. No. 0,181,620 (2011) also reference customisable map information built from different user-supplied information, which is not specifically web-based, but which nevertheless does not include point, line and route definition via touch screen.
  • Karaoguz, in U.S. Pat. No. 0,046,881 (2011), does discuss enhancement of points of interest, where the user associates points of interest derived by various means with photographs, related information and multimedia files. The user may also name the points and define categories of points of interest.
  • The user can display positions, historical positions and routes of other selected users, along with data and route information from those users. All selected information can be overlaid on an electronic map and stored.
  • However, the point of interest locations relate either to position (current or historical), a look-up from a street address, entered or received coordinates, or some other form of look-up from a network or database; there is no touch-screen definition of points.
  • The route information referred to is note- and information-based or historical coordinate-based, rather than node-based. Lines for map personalization are not referred to at all.
  • Tandon, in U.S. Pat. No. 0,307,512 (2011), describes a property-centric real estate map with personalised points of interest.
  • The realtor and seller can define points of interest relevant to the surroundings of the property, annotating these with defined graphical symbols, pictures and information.
  • However, the points of interest are defined via a look-up from the web-based application, based on address or multiple listing service database information, and are not defined on a map or earth model directly by a user on a touch screen.
  • Mincey et al., in U.S. Pat. No. 0,178,697 (2011), define several points of interest which may be relevant to a route being planned or followed. The route consists of several steps and alternative points of interest, and the points of interest have attributes, such as visibility, with which to make route decisions.
  • However, the points of interest are not user-defined but are the result of searches, and the routes relate to automobile navigation.
  • Schulte et al., in U.S. Pat. No. 8,380,366 (2013), add some highly relevant aspects, in that touches to touch screens approximate to nodes for defining routes for air navigation, together with other operations on these nodes and routes.
  • However, the nodes are discrete and equate to known entities underlying the map: known locations with predefined coordinates and other known attributes, such as the altitude above sea level of an airport.
  • The airways defined also equate to existing routes which must be followed until particular entry or exit points. Both nodes and routes are therefore generally limited to existing node and route positions, and are not intended to define waypoints in the middle of nowhere.
  • Various operations on nodes are similar to those defined in this application, including being able to select a node to see its attributes, the deletion of a waypoint resulting in a route of fewer segments, and the snapping of nodes together so that they are associated and become part of a single route. However, these operations are specific to the apparatus defined by Schulte et al., and especially to the involvement of an unspecified control device which is separate from the touch screen device itself.
  • A final area of prior art relevant to the current application is that of displaying distance information between user touches.
  • Britton, in U.S. Pat. No. 0,022,308 (2011), describes a means of defining the start point of a route via touch on a touch screen, recognising an approximate route traced by touch by the user, and then recognising the end point of the proposed route, also by touch. This method then calculates a distance from the start location to the final location along the route.
  • Britton's method is useful in defining start and final destinations by touch without having to enter locations manually or use points of interest from searches.
  • However, the intermediate touch and tracing step, which is obviously of use for calculating the distance likely to be travelled by road, is not appropriate where a minimum straight-line distance is required.
  • Further, the distance is not displayed to the user directly between the points, but is used indirectly for the calculation of route distance.
  • Finally, the method is claimed only for roadway routes.
  • In summary, no method or apparatus is available by which node-based points of definition, node-based lines, node-based routes and node-based corridors can be created or shared using gestures on a touch screen device.
  • Similarly, no method or apparatus exists or is otherwise prior art whereby node-based points of definition or node-based lines and routes can be directly applied to mapping, earth model and navigation applications so as to define and share any user-defined geographic location, boundary or route.
  • In embodiments of the present invention, multiple independent nodes on a mapping application are created by touch gestures on a touch screen, corresponding to multiple independent geographic points on the Earth's surface.
  • The nodes can be labelled by the user, moved by the user without the background map moving, and have attributes added by the user or application.
  • Multi-touch selections on the touch screen with two touches together are interpreted as line segments, while two touches in rapid sequence are interpreted as route segments, with direction implied by the order of touches.
  • The lines may exist for the duration of the creating touches or indefinitely, and may display to the user the distance or vector difference between the two touches. Line segments may be divided into several segments by the selection and movement of hidden (latent) nodes along existing line segments.
  • Line and route segments can be defined to exist together, and these can be joined to make complex, multi-segment lines and routes by joining the end nodes of line and route segments.
  • Lines and routes can be made into a two-dimensional corridor by the definition of a relevant width either side of a line or route segment, or of a composite, multi-segment line or route.
  • The gestures for node, line, route and corridor definition and management are entirely new in one embodiment, while in others the gestures used are adapted from existing gestures but used for an entirely new purpose or with a new human-computer interaction interface.
  • The points, lines, routes and corridors can be easily manipulated and edited, and the lines, routes and corridors can be efficiently stretched and re-sized.
  • The use of key nodes permits operations on a composite entity made up of multiple line or route segments, such as the movement or deletion of a complex route.
  • The node-based aspect also gives the advantage, in one embodiment, of sharing data, such as the remote transmission of geometric information between different devices programmed or enabled to process or interpret the information, for example for passing planned routes over a communications network.
  • Various applications of one or more embodiments include sharing planned wilderness hiking routes, defining flight plans, remote drawing, defining boundaries in real estate, maritime navigation, and obtaining representative distance information between two points on an x-ray or electron microscope image by touching the two required points on a touch screen.
  • FIG. 1 represents the definition of a node or point of definition by the user of a multi-touch enabled device according to an embodiment of the invention.
  • FIG. 2 is a potential result on a mapping application of a multi-touch defined point of definition according to an embodiment of the invention.
  • FIG. 3 shows how a node on the touch screen, representing a point of definition, may be moved according to an embodiment of the invention.
  • FIG. 4 shows how a point of definition on a map is moved in sympathy with the node being moved on the touch screen according to an embodiment of the invention.
  • FIG. 5A shows a flow-chart of a method to create and select a node or point of definition using multi-touch according to embodiments of the invention.
  • FIG. 5B shows a flow chart of normal node and margin node movements according to embodiments of the invention.
  • FIG. 5C shows normal node movement according to an embodiment of the invention.
  • FIG. 5D shows margin node movement according to an embodiment of the invention.
  • FIG. 6 demonstrates potential information related to a touch defined point of definition on a map according to embodiments of the invention.
  • FIG. 7 represents the definition of a line segment on a touch screen device according to embodiments of the invention.
  • FIG. 8 shows a result on a mapping application of a multi-touch defined line segment according to an embodiment of the invention.
  • FIG. 9 shows a flow-chart of a method to detect a multi-touch defined line segment according to embodiments of the invention.
  • FIG. 10 highlights potential information related to a multi-touch defined line segment according to an embodiment of the invention.
  • FIG. 11 demonstrates how latent nodes can be used to divide a line segment in two according to embodiments of the invention.
  • FIG. 12 presents a flow-chart of how latent nodes can be used with fractional or space division to divide a line segment into two or more segments according to embodiments of the invention.
  • FIG. 13 demonstrates how the act of creating a line segment can also be used to display representative or actual distances between the touch nodes, according to embodiments of the invention.
  • FIG. 14 illustrates how a sequenced tapping of a touch screen can be used to define direction of travel, and therefore also a vector or route according to embodiments of the invention.
  • FIG. 15 indicates how when defining a route or line in a node based fashion, a display of vector difference between the nodes can also be displayed according to embodiments of the invention.
  • FIG. 16 presents a flow-chart defining how sequenced touches result in a route or vector line, and how it is determined whether to display distance between the nodes, according to embodiments of the invention.
  • FIG. 17 illustrates how a composite, multi-segment line can be defined from multiple line segments according to embodiments of the invention.
  • FIG. 18 shows how a line can be used as a route, with directionality, according to an embodiment of the invention.
  • FIG. 19 presents a flow-chart defining how a composite multi-segment line or route can be formed from joining nodes of other lines or routes together, according to embodiments of the invention.
  • FIG. 20A shows how a node on a line or route can be deleted according to an embodiment of the invention.
  • FIG. 20B indicates a resulting composite, multi-segment line after the deletion of a node, according to an embodiment of the invention.
  • FIG. 21 represents the formation of a corridor from a line or route according to an embodiment of the invention.
  • FIG. 22A depicts a multi-segment route according to an embodiment of the invention.
  • FIG. 22B shows a multi-segment route with rectangles and a circle to show the creation of a corridor area, according to an embodiment of the invention.
  • FIG. 22C illustrates a complete corridor over a multi-segment route according to an embodiment of the invention.
  • FIG. 22D represents a process for the creation of a corridor area around a multi-segment line or route according to an embodiment of the invention.
  • FIG. 23A illustrates how route lines can be converted into non-route lines and vice versa, according to embodiments of the invention.
  • FIG. 23B illustrates how route lines can be reversed in directionality according to an embodiment of the invention.
  • FIG. 23C illustrates how a closed area can be converted between a route and area, and between a closed line and an area according to an embodiment of the invention.
  • FIG. 24 is a modular view of a multi-touch device capable of defining, editing and displaying node-based points, lines, routes and corridors according to embodiments of the invention.
  • FIG. 25 demonstrates how a network of multi-touch enabled devices, touch-screen devices and databases can be used together to share node-based points, lines, routes and corridors between each other according to embodiments of the invention.
  • Embodiments provide node-based point, line, route and corridor definition via a multi-touch surface.
  • The use of one finger in contact with a touch-sensitive area is termed a node, and one node on the touch screen can define location and information relating to a point of definition on a background such as a map or image.
  • Two touch screen nodes can be used to create line segments and route segments on a background. From these primitive entities, multi-segment, composite lines, routes and corridors can be created, and any primitive or composite entity can be manipulated through the movement of the nodes defining it.
  • Various combinations of the following node-based point, line, route and corridor definitions, manipulations and edits can be used in an embodiment.
  • Defining a point on a touch screen device is prior art, especially when implemented as a single touch held for a short duration.
  • In a mapping service such as Google Maps or Apple Maps, such a touch results in a point of definition, marked with an information bubble or pin symbol.
  • In one embodiment, a prolonged touch (typically between 0.3 seconds and 1.0 seconds) on a touch-screen device would also create a point of definition on a map, similar to the prior art method of Google Maps and Apple Maps implemented on touch screens, except that the point of definition would be marked with a symbol.
  • In one embodiment the symbol would be a filled, colored shape such as a square.
  • Here a node refers to a persistent defined touch point on the touch screen, which may be used to produce various entities, including points of definition, on a background application such as a map or image application. When used for defining points of definition, nodes are used in their creation, and effectively one node entity exists for each point of definition entity.
  • However, nodes are not specific to points of definition, since nodes are used in the creation of other entities. For example, two nodes are used in the definition of a line segment, and therefore two nodes are associated with every line segment. Multiple nodes can exist on a touch screen concurrently.
  • FIG. 1 illustrates the creation and selection of a node and point of definition on a touch-screen surface 102, via a prolonged touch with a single touch implement 104—in this case a finger.
  • FIG. 2 shows the result of a point of definition node creation touch on touch screen device 202 , with the created point represented by square 204 in the position on the map or background image at which it was created. In one embodiment no label will be given to the created point.
  • In another embodiment, a means to label the point is automatically given to the user, such as a virtual keyboard being provided on the touch screen display.
  • In a further embodiment, the application will provide an automatically generated number or label upon creation.
  • In yet another embodiment, a touch of longer duration than that required for node creation, of approximately two seconds, will initiate a means to label the point of definition.
  • The selection of an existing point of definition will also enable the user to perform a labelling or re-naming operation on that point.
  • Legend 206 is an example of a point of definition label defined by the user.
  • Multiple node-based points of definition can exist concurrently and be visible to the touch screen user at the same time, as depicted by additional points 208.
  • For prior art points of definition, such as those of Google Maps or Apple Maps, only one point of definition is permitted; trying to define a second results in the original point being deleted and the new point being created at the new position instead.
  • In one embodiment, the creation of a node-based point of definition also selects that node for immediate movement of the point of definition, as indicated in FIG. 3 by the motion arrow 304.
  • While the finger remains in contact, the node, and therefore the associated point of definition, will be moved along the surface of the touch screen 102 with the finger. In one embodiment, when a selected point of definition is moved, no panning of the screen or underlying map occurs; the node and associated point of definition move rather than the background, with the exception of node-based panning described below.
  • The result of a node-based point of definition creation and movement is shown in FIG. 4.
  • The created node-based point of definition representation 404 on the touch-screen device 202 is shown.
  • The direction arrow 402 represents movement of the point of definition, equal in direction and distance on the touch-screen to the causal finger motion.
  • The selection of a previously created node-based point of definition is similar to the selection during creation, except that it requires the existence of a node-based point of definition at the location of the touch (or within a number of pixels of the touch). Touching a node-based point of definition on the touch-screen and continuing to touch it will therefore select that point of definition for an operation, including moving the node around the touch screen.
  • The node-based point of definition is only required to be present in order to be selectable, and therefore the selection even of invisible points is possible.
  • In one embodiment, the appearance of the selected point of definition is changed to denote that it is selected.
  • For example, the contrast of a selected point of definition is inverted to denote a selected point, such as that shown in 404.
  • In another embodiment, the color of a node-based point of definition changes once selected.
  • FIG. 5A demonstrates how to create the functionality of node-based definition, selection and movement on the touch-screen for points of definition.
  • Process box 502 is a background task which monitors for a multi-touch operation; when a multi-touch operation is detected, the decision logic at 504 detects the specific multi-touch input of a touch device such as a finger touching the screen for a duration appropriate with the application (for example one second). If this node event is detected there is another decision point 506 which determines whether there is an existing node at the location of the touch on the screen. If there is no existing node, the node-based point of definition creation process 510 is initiated, which creates a node at the location being touched, and then selects the point of definition which has just been created.
  • If there is an existing node at, or close to, the location of the detected node multi-touch, that node will be selected, as shown in 508.
  • Once a node is created or selected, the user can move the node-based point of definition freely, as summarized by process 512 and further elaborated in the process description of FIG. 5B.
  • During movement, the node-based point of definition position on the touch-screen will repeatedly track the position of the finger performing the touch, as shown by process 518.
  • A decision 520 as to whether to perform normal node movement or margin-based node movement depends on whether the node, and associated point of definition, is within a screen margin.
  • A screen margin may be used, although it is not necessarily visible, in the situation where the background to a node, such as a map, occupies a larger area than can be seen on the touch-screen.
  • Within a margin, the node-based point of definition remains under the controlling finger, but the background moves in the opposite direction to the margin, as described by 524. Therefore if a node-based point of definition is moved into a defined margin area at the left of the touch-screen, the user's controlling finger may stop there, in which case the background will move to the right.
  • Typically margins will be relevant at the top, bottom, left and right sides of a rectangular touch-screen, although margins near the corners could act in two directions.
  • Such scrolling of a background can occur for as long as the user's finger is in contact with the screen and within a margin. If the finger is no longer in a margin, but still in contact with the touch-screen, normal node-based point of definition motion 522 will occur, with the node following the finger.
  • Decision logic 514 on FIG. 5A determines whether any other operation is performed on the node-based point of definition after movement; an immediate time-out occurs if the node controlling finger is removed from the touch-screen, in which case the node-based point of definition stays at the position it was last at—where the finger was removed from—and is deselected.
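A minimal Python sketch of the FIG. 5A flow described above. The class, function name, hold duration and hit radius are illustrative assumptions rather than the patent's reference implementation.

    HOLD_SECONDS = 1.0   # example node-gesture duration from the text
    HIT_RADIUS_PX = 22   # "within a number of pixels of the touch"

    class Node:
        def __init__(self, x, y):
            self.x, self.y = x, y
            self.selected = False

    def handle_long_press(x, y, held_for, nodes):
        """Select an existing node near the touch, or create and select one."""
        if held_for < HOLD_SECONDS:
            return None                      # not a node gesture (504)
        for node in nodes:                   # existing node at the location? (506)
            if (node.x - x) ** 2 + (node.y - y) ** 2 <= HIT_RADIUS_PX ** 2:
                node.selected = True         # select the existing node (508)
                return node
        node = Node(x, y)                    # create and select a new node (510)
        node.selected = True
        nodes.append(node)
        return node

    nodes = []
    handle_long_press(120, 80, 1.2, nodes)   # creates and selects a node
    handle_long_press(125, 83, 1.1, nodes)   # selects the same node again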
  • FIG. 5C shows normal node movement on a touch-screen surface 102 —in this case a satellite navigation system—with 528 representing the position of a user's finger, which is evidently outside the margin 530 .
  • the node, and associated point of definition if applicable, is moved with the finger, for example as shown by arrow 526 , without the background being panned.
  • FIG. 5D shows margin-based node movement, where finger position 528 is within the margin 530 —in this case the top margin.
  • Arrow 532 represents the movement or scrolling of the background as a consequence of the presence and position of the finger controlling the node being within the margin—in the opposite direction to the previous direction of travel across the screen by the node.
  • Margin-based node movement is possible where the area for which nodes are relevant is greater than the area shown by the screen. The effect is to move the node further along the background in the desired direction. In this case the direction of movement of the background is opposite to that of a panning multi-touch operation in the same direction, which is what would happen on an attempted movement of a point of definition pin in Apple Maps, for example.
  • Note that various embodiments relate to the movement of nodes underlying lines, routes and corridors. Therefore, for example, the movement of the end node of a multi-segment line can also use normal and margin-based node movement as described in FIGS. 5A, 5B, 5C and 5D, and as sketched below.
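A small sketch of margin-based background panning (FIGS. 5B to 5D), under the assumption that the pan direction is determined by which margin the finger is in; the margin size and pan speed per tick are illustrative values.

    MARGIN_PX = 40
    PAN_PX_PER_TICK = 8

    def pan_vector(finger_x, finger_y, screen_w, screen_h):
        """Background pan (dx, dy) for one tick; (0, 0) means normal node movement."""
        dx = dy = 0
        if finger_x < MARGIN_PX:                   # left margin: background moves right
            dx = PAN_PX_PER_TICK
        elif finger_x > screen_w - MARGIN_PX:      # right margin: background moves left
            dx = -PAN_PX_PER_TICK
        if finger_y < MARGIN_PX:                   # top margin: background moves down
            dy = PAN_PX_PER_TICK
        elif finger_y > screen_h - MARGIN_PX:      # bottom margin: background moves up
            dy = -PAN_PX_PER_TICK
        return dx, dy                              # corner margins act in two directions

    print(pan_vector(10, 300, 768, 1024))          # left margin: (8, 0)
    print(pan_vector(20, 15, 768, 1024))           # top-left corner: (8, 8)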
  • FIG. 6 shows some of the information which could be attributed to a node after creation and selection—particularly a node representing a geographic location on a map.
  • Latitude and longitude would be important for a location node; these would be received from a mapping or geographic model application once the position on the touch-screen has been established.
  • A start date and finish date could be useful for determining relevance.
  • Elevation or altitude, perhaps with minimum and maximum values, would allow a three-dimensional location definition. Therefore the altitude of a surveying point on a mountain could be usefully defined, an altitude above ground could be defined, or a depth below the sea could be added to latitude and longitude data.
  • A name would also be useful, especially when sharing the node on a network, as a shared reference for those users with permission to see specific nodes.
  • Certain information, if required, could be attributed to a node by the operating system, such as the user and time at which a node was created. Miscellaneous information or notes about a location could also be added by a user.
  • Visual information such as node icon and label color, size and shape could be defined by the user, or defined or defaulted by the application itself.
  • In one embodiment, any information the user wishes to enter would be available on a menu or form following a normal node selection of approximately 0.5 seconds. In another embodiment a menu or form would be presented to the user to complete following an extra-long selection period of more than one second.
  • FIG. 6 also shows that nodes and node data can be shared across a communication network, and that a node created on one device could be viewed either as a point of interest or an editable node on other devices.
  • Line segments are defined with the use of two fingers; in this case a thumb 704 and middle finger 702 touching a touch-screen 102 together, and for a minimum duration (for example one second) without being moved.
  • This action will create or select two nodes, at the locations of the two touches on the touch-screen, between which will be drawn a line segment which in one embodiment will be straight, continuous, blue and thick.
  • Other embodiments, relevant for different applications, would produce line segments combining a wavy, saw-tooth or straight type with a continuous, dotted or dashed style, in one of various common colors and thicknesses.
  • One of the two nodes created will be determined to be the key node, and indicated to the user as such by a specific appearance.
  • Once a key node for a line segment is defined, selection of the whole line segment, and operations on that line segment, are possible by selecting the key node first—for example movement of the line segment with both of its nodes, instead of movement of just one node.
  • The straight line segment shown as 806 in FIG. 8, between position nodes 802 and 804 on the touch-screen device 202, constitutes a line segment which can be further enhanced by the user, as illustrated in subsequent figures and paragraphs.
  • Node 804 is marked as the key node; various means can be used to distinguish a key node from other nodes, including shape, contrast, color or fill.
  • The creation method of a line segment by a touch-screen device user is shown in FIG. 9.
  • User inputs to the touch-screen will be monitored for multi-touch node gestures by process 502 .
  • The logic of 902 will determine whether two fingers are touching the touch-screen and remaining still for greater than a minimum duration (for example one second), and if so the process 904 will create a line.
  • Process 904 will create a line firstly by selecting the two nodes specified by the finger positions. If a node already exists at a finger position (or within a defined radius), that node will be selected. If there is no pre-existing node at the position, a node will be created at the given screen position and then selected.
  • A line segment will then be drawn on the touch-screen between the two selected nodes.
  • In one embodiment the line will be straight, but there are various possibilities with regard to line type, which may for example be a wave, a saw-tooth, an arc, a spline or another common line type, with the typical thickness and color possibilities found in drawing applications.
  • The allocation of the key node can vary according to application and user defaults and preferences, being for instance the first touch made during creation of the line, or the node highest and furthest left on the touch-screen.
  • At this point, line segment latent nodes may be automatically created for the purpose of line division, as described later with the assistance of FIG. 11.
  • The logic of 906 will detect whether the two fingers remain on the created nodes for a minimum time-out period after the creation of the line segment. If not (for instance if the fingers are removed immediately upon the drawing of the line segment), in one embodiment the line will be completed without any additional user information added at that time. In another embodiment the line segment will disappear if a minimum time-out period is not met with the creating fingers remaining substantially still. If the fingers do remain substantially still for an additional period following the time-out, for example 0.5 seconds, process 908 will allow the user to add additional information for the line segment via a user interface. This flow is sketched below.
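A compact sketch of the FIG. 9 gesture test: two fingers held substantially still for a minimum duration create (or select) two nodes with a segment between them. The thresholds, the tuple-based touch representation and the key-node rule (highest, then furthest left) are illustrative assumptions.

    HOLD_SECONDS = 1.0
    STILL_TOLERANCE_PX = 10

    def detect_line_gesture(touches):
        """touches: list of (x, y, held_seconds, drift_px) for current contacts."""
        if len(touches) != 2:
            return None
        if not all(t[2] >= HOLD_SECONDS and t[3] <= STILL_TOLERANCE_PX
                   for t in touches):
            return None
        (x1, y1, _, _), (x2, y2, _, _) = touches
        # one embodiment's key-node rule: highest on screen, then furthest left
        key = min(((x1, y1), (x2, y2)), key=lambda p: (p[1], p[0]))
        return {"nodes": [(x1, y1), (x2, y2)], "key_node": key,
                "style": {"type": "straight", "color": "blue", "thick": True}}

    print(detect_line_gesture([(100, 200, 1.2, 3), (300, 260, 1.1, 4)]))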
  • Selection of a whole line segment is possible after its initial creation; in one embodiment this will consist of touching and holding the line segment anywhere along its length (although this embodiment is not compatible with the use of latent nodes for line division). Other embodiments will select a whole line segment from the touching and holding, or double-tapping, of the key node of the line segment. Yet another embodiment will allow the selection of a whole line segment from a menu option once either of the end nodes of the line segment is selected.
  • FIG. 10 illustrates some of the information which could be added to, and relevant to a line segment.
  • Some information can be added automatically by the operating system, such as Creation User and Creation Time.
  • Other information in various embodiments can be graphical preferences from a user, or default values.
  • Some information may be defined by the user, such as Line Segment Name and Information.
  • Other information, including node positions on screen and node position representation (such as latitude/longitude/elevation in a mapping application), will be inherited from the node data relating to the nodes at either end of the line segment.
  • FIG. 13 denotes the touching of a multi-touch enabled touch screen device 202 with two fingers 1306 which are held still for a minimum amount of time as for the normal node-based creation of a line.
  • A representative distance 1302 is displayed next to line 1304, stating the straight-line horizontal distance between the points on the map defined by the nodes.
  • In one embodiment, the displayed distance is calculated by applying typical navigational distance algorithms to the latitudes and longitudes represented by the nodes, which allows accurate round-earth and great-circle calculations to be used.
  • In another embodiment, the displayed distance is calculated by multiplying the calculated touch screen distance by the known scale of the map or image relative to the screen representation. Both approaches are sketched below.
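A minimal sketch of the two distance calculations just described. The patent does not name a specific algorithm; the haversine formula is used here as one standard great-circle method, and the function names are illustrative.

    import math

    EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

    def great_circle_m(lat1, lon1, lat2, lon2):
        """Round-earth distance in metres between two lat/lon points (degrees)."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def screen_scaled_m(px_distance, metres_per_pixel):
        """The simpler alternative: screen distance multiplied by map scale."""
        return px_distance * metres_per_pixel

    print(round(great_circle_m(51.5074, -0.1278, 48.8566, 2.3522)))  # London to Paris
    print(screen_scaled_m(250, 12.5))                                # 250 px at 12.5 m/px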
  • In one embodiment, the distance remains displayed only while the fingers are in contact with the touch screen, and disappears when one or both fingers are removed; in other embodiments the distance can persist.
  • In a further embodiment, no line is displayed, and only the distance measurement is shown to the user.
  • The distance measurement and display method shown in FIG. 13 can be used in applications other than electronic mapping; it can be used with earth or other planet models such as Google Earth, and with image backgrounds such as satellite imagery, x-ray images, architectural plans, geographic information system images, radio telescope imagery, photographs, video feeds and electron microscopy.
  • In one embodiment, the distance calculated and displayed represents an angle or degree of arc rather than a scalar distance, which for example could be used in astronomy. This technology is therefore of potential use to town planners, engineers, architects, microbiologists, planetary scientists, radiographers, meteorologists, farmers, real estate agents, navigators and astronomers, to name a few, who are equipped with a suitable multi-touch enabled touch screen device.
  • A route segment may also be defined. Route segment definition is similar to line segment definition, except that the direction or vector is defined by the order of the finger touches and the creation order of the nodes.
  • FIG. 14 shows the creation of a route segment. Finger 1404 touches the touch screen and remains substantially motionless for a certain time before finger 1402 touches the touch screen; a typical value in use is estimated at approximately 0.5 seconds between touches. Once the second touch has been detected within the permitted time window, an arrow 1406 will be drawn from the node under the first touch 1404 in the direction of, and up to, the node under the second touch 1402. In one embodiment the arrow direction is towards the first touch instead of the second touch, and other embodiments provide for the use of the multiple line types, styles, thicknesses and colors described for line definition, including the drawing of a normal line without an arrowhead.
  • Scalar distances can be displayed next to a route segment as in FIG. 13.
  • Since a route segment gives direction as well as quantity, vector quantities or differences can be displayed to the user, as shown in FIG. 15.
  • A horizontal difference in touch screen pixels 1508 and a vertical difference in screen pixels 1510 are shown between the node under touch 1504 and the node under touch 1502. Since the direction of the vector is known, the horizontal x difference value and the vertical y difference value have polarity or direction, so that in effect the node under touch 1504 is a temporary origin, and the node under touch 1502 is a vector relative to it.
  • In a mapping application, the axis values of quantities 1508 and 1510 are Northings (distances to the North or South) and Eastings (distances to the East or West) respectively.
  • In another embodiment, the two dimensions for which distance is shown are based on two orthogonal axes relevant to an image.
  • In a further embodiment, the information presented is an angle and distance between the nodes, using measurement quantities and axes appropriate to the scale and application. A sketch of the vector display follows.
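A short sketch of the FIG. 15 vector display: signed pixel differences with the first-touched node as a temporary origin, plus an approximate Northings/Eastings conversion. The equirectangular local-scale conversion and its constant are illustrative assumptions.

    import math

    M_PER_DEG_LAT = 111_320  # approximate metres per degree of latitude

    def pixel_vector(first, second):
        """Signed (dx, dy) in screen pixels from the first touch to the second."""
        return second[0] - first[0], second[1] - first[1]

    def northing_easting_m(lat1, lon1, lat2, lon2):
        """Approximate Northing/Easting differences in metres (short distances)."""
        northing = (lat2 - lat1) * M_PER_DEG_LAT
        easting = ((lon2 - lon1) * M_PER_DEG_LAT
                   * math.cos(math.radians((lat1 + lat2) / 2)))
        return northing, easting

    print(pixel_vector((200, 500), (340, 420)))            # (140, -80)
    print(northing_easting_m(51.50, -0.12, 51.52, -0.10))  # roughly (2226 m, 1386 m)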
  • FIG. 16 shows a method of creating a route or vector segment, and differentiating it from a line or other multi-touch gesture.
  • Activity 1602 monitors for the first touch, and decision logic 1604 determines whether a second touch occurs within a specific time window. In this example a window of between 0.2 and 0.8 seconds is defined; more generally it is anticipated that the time between the two touches will have a minimum value of, for example, 0.1 seconds and a maximum value of, for example, two seconds for a route to be recognised, although other durations are possible.
  • Activity 1606 creates the vector line itself, for example a straight, black, thick arrow from the node under the first touch to the node under the second touch.
  • Decision logic 1608 determines whether vector distance is required to be displayed, and if this is the case it is displayed via process 1612 . If not, finger presence after the drawing of the route is used to decide in 1610 whether user information is required to be added in activity 1614 . Additional information which may be added in by the user includes name of segment and free text describing its significance.
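A minimal sketch of the FIG. 16 route gesture: two touches in rapid sequence, with direction implied by touch order. The 0.2 to 0.8 second window is the example from the text; the function name and return shape are illustrative assumptions.

    MIN_GAP_S, MAX_GAP_S = 0.2, 0.8

    def detect_route_gesture(t_first, t_second, first_xy, second_xy):
        """Return a directed route segment if the second touch is in the window."""
        gap = t_second - t_first
        if MIN_GAP_S <= gap <= MAX_GAP_S:
            return {"from": first_xy, "to": second_xy, "arrow": True}
        return None  # otherwise treat the touches as another gesture

    print(detect_route_gesture(10.00, 10.45, (120, 400), (260, 180)))  # route
    print(detect_route_gesture(10.00, 11.50, (120, 400), (260, 180)))  # too slow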
  • FIG. 11 demonstrates how a line may be sub-divided via node-based multi-touch.
  • In one embodiment, a created line or route segment will automatically have one or more latent nodes (as indicated by 1106) created along its length, in addition to the line-end defining nodes.
  • These extra latent nodes may be visible or invisible to the user, and may be regularly or irregularly spaced. If a user selects one of these latent nodes (as per the selection of any node, with a short touch typically of between 0.5 seconds and 1.0 second), the latent node can be moved relative to the line-end nodes 1104.
  • FIG. 12 illustrates a process by which node-based line division may be performed, which will result in a multi-segment line or route.
  • Decision logic 1202 identifies whether the use of latent nodes is valid with current defaults and user selections.
  • A second decision logic 1204 determines which method to use; in the process shown, two methods are possible, depending on defaults or pre-definition by the user.
  • In the first method, process 1206 becomes active and divides a line or route segment into N equal parts with N−1 equally spaced latent nodes, where N is a predefined integer value. For example, if N has the value 2, any line or route segment created will have one latent node halfway between the two end nodes.
  • In the second method, where a spacing interval is predefined, a latent node will be placed at each interval along the segment, starting from one end node. Both placements are sketched below.
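A small sketch of the two latent-node placements of FIG. 12. The function names are illustrative; positions are linear interpolations between the end nodes.

    import math

    def latent_nodes_equal(p1, p2, n):
        """N - 1 equally spaced latent nodes dividing the segment into n parts."""
        return [(p1[0] + (p2[0] - p1[0]) * k / n,
                 p1[1] + (p2[1] - p1[1]) * k / n) for k in range(1, n)]

    def latent_nodes_spaced(p1, p2, spacing):
        """Latent nodes every `spacing` units, starting from the p1 end node."""
        length = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
        return [(p1[0] + (p2[0] - p1[0]) * k * spacing / length,
                 p1[1] + (p2[1] - p1[1]) * k * spacing / length)
                for k in range(1, int(length // spacing) + 1)
                if k * spacing < length]  # avoid duplicating the far end node

    print(latent_nodes_equal((0, 0), (100, 0), 2))    # [(50.0, 0.0)]
    print(latent_nodes_spaced((0, 0), (100, 0), 30))  # nodes at 30, 60 and 90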
  • FIG. 17 illustrates a composite line created from line segments 1708 , 1710 and 1706 .
  • Line segments 1708 and 1710 are joined by a common node 1704
  • line segments 1710 and 1706 are joined by a common node 1702 .
  • Such a composite line could for example represent a border, boundary or road on a map, or a complex line in a drawing application. Since non-end nodes of a composite line are common to two line segments, the selection and moving of a node 1704, for example, would result in a change of angle, and possibly length, of both line segments 1708 and 1710.
  • FIG. 18 shows a composite route created through the combining of multiple line segments 1802 .
  • Such a composite line can be used to show flow or direction of travel.
  • A particular use of this type of line would be in defining a route which does not necessarily rely on existing geographic locations. This would be beneficial, for example, in defining a planned wilderness route, a journey on water, or the flight plan for an aircraft.
  • FIG. 19 shows a method for the joining of different line or route segments to make a composite line or route.
  • Decision logic 1902 verifies whether a new segment has been created, and if so, decision logic 1904 determines whether one or both of the end nodes correspond with existing nodes. Even if the user desires to exactly match the position of an existing node, it is unlikely that the same central pixel on the touch screen can be selected, especially with fingers as compared to a stylus or precision pointing device. Therefore said logic will accept close approximations to the position of an existing node as identical, and those nodes will be merged as defined in process 1906.
  • In one embodiment, the positions of the original node and the new node will be averaged, so that the new joining node of the two segments will be halfway between the precise centers of the nodes.
  • In another embodiment the joining node position is taken to be the position of the original node, and in yet another embodiment it is taken to be the position of the new node.
  • Where there is user-defined or default data attached to the nodes, for example the altitude above sea level of the point represented by the node, this will be merged.
  • In one embodiment, the data associated with the original node is used to populate the data fields of the joining node.
  • In another embodiment, newer non-default data, such as the creation date from the new node, will over-write the equivalent data of the existing node.
  • Process 1908 allows a key node for a composite line or route to be determined where neither or both segments have a key node already, since in one embodiment a composite, multi-segment line or route may not have more than one key node.
  • In one embodiment, the key node of the original line segment, route segment, composite line or composite route is retained as the key node for the new composite line or route.
  • With this node joining method, it is not just new nodes on line or route segments which may be joined to existing nodes, but also two nodes of existing segments, where the user has selected an end node of one segment and moved it into close proximity to an existing end node of another segment or an existing point node.
  • The required closeness of the centers of nodes when deciding whether they are to be joined depends somewhat on the application. In general, however, it is anticipated that nodes would not be joined unless the touch area under a typical finger touch overlapped with an equivalent radius around the second node. A sketch of the joining step follows.
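A hypothetical sketch of the FIG. 19 joining step, using the mean-position embodiment and the newer-data-wins attribute merge. The join radius and the dictionary data model are illustrative assumptions.

    import math

    JOIN_RADIUS_PX = 22  # roughly the radius of a fingertip touch area

    def try_merge(new_node, existing_nodes):
        """Merge new_node into a nearby existing node, else keep it separate."""
        for node in existing_nodes:
            centre_gap = math.dist((node["x"], node["y"]),
                                   (new_node["x"], new_node["y"]))
            if centre_gap <= 2 * JOIN_RADIUS_PX:              # touch areas overlap
                node["x"] = (node["x"] + new_node["x"]) / 2   # mean position
                node["y"] = (node["y"] + new_node["y"]) / 2
                for key, value in new_node.get("data", {}).items():
                    node.setdefault("data", {})[key] = value  # newer data wins
                return node
        existing_nodes.append(new_node)                       # no join: independent node
        return new_node

    nodes = [{"x": 100, "y": 100, "data": {"altitude_m": 120}}]
    print(try_merge({"x": 110, "y": 95, "data": {"name": "waypoint"}}, nodes))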
  • One operation is the deletion of a node, as shown in FIG. 20A , although the operation can also be performed on nodes which are not part of a composite line, composite route, line segment or route segment.
  • In FIG. 20A, a user selects a node using a finger 2004.
  • In one embodiment, a long press of more than approximately one second results in a node operation menu, one of whose operations is deletion of the node.
  • Upon selection of that operation, the node will be deleted and the symbol representing the node removed, as shown by area 2006 in FIG. 20B. If the node is an end node of a composite line or route, the end segment of the line or route which incorporated the node will also be deleted.
  • In one embodiment, all segments attached to the deleted node will be deleted.
  • In another embodiment, the composite line or route would be reformed by joining together the nodes on either side of the deleted node with a new line or route segment.
  • In a further embodiment, the deletion of a node would result in two entities, such as a node and a composite line, with no automatic replacement or substitution of line segments.
  • Where the deleted node belongs to a single line or route segment, in one embodiment the whole segment and both nodes will be deleted.
  • In another embodiment, the node not deleted will remain as a point of definition node. If a key node is deleted, in one embodiment another node will be made the key node; in another embodiment the whole entity that the key node represents will be deleted.
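A brief sketch of interior and end-node deletion on a composite line (FIGS. 20A and 20B), using the embodiment in which the neighbours of a deleted interior node are re-joined by a new segment. The list-of-positions model is an illustrative assumption.

    def delete_node(polyline, index):
        """polyline: ordered node positions; returns the reformed line."""
        if len(polyline) <= 2:
            return []  # a single segment: both nodes and the segment go
        # dropping an interior node implicitly re-joins its two neighbours;
        # dropping an end node removes the end segment that incorporated it
        return polyline[:index] + polyline[index + 1:]

    route = [(0, 0), (50, 40), (120, 30), (200, 90)]
    print(delete_node(route, 1))  # interior node: neighbours re-joined
    print(delete_node(route, 3))  # end node: final segment removed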
  • Any node can be defined as a key node by menu selection, which in one embodiment replaces the existing key node in that function.
  • If a labelling operation is selected from the menu, a labelling means such as a virtual keyboard on the touch screen becomes active.
  • Another option for selection in one embodiment illustrated in FIG. 23A and FIG. 23C is the conversion of a line segment 2304 or composite line 2308 into a route segment 2302 or composite route 2310 respectively, and vice versa.
  • Another option for selection, in one embodiment illustrated in FIG. 23B, is a means to reverse the direction of a composite route 2306.
  • The said reversal means is also applicable to route segments in various embodiments.
  • FIG. 21 illustrates the use of line segments to create not only a composite line, but a corridor.
  • A corridor is a central composite line or composite route, such as 2110, with associated parallel composite lines or composite routes, illustrated in FIG. 21 by 2106 and 2108, which represent desired limits related to the central line.
  • One use for this is the definition of air corridors or sea lanes on touch-screen devices used for navigation or navigation planning.
  • In one embodiment, the end of a corridor will be a semi-circle centered on the end node.
  • Corridors can be created by the user of a touch-screen device for a line segment by specifying distance offsets from a central line, as part of the user-added line information process 908 previously described in FIG. 9 .
  • Similarly, a distance offset can be defined for a route segment as part of the user-added route segment information process 1614 described in FIG. 16.
  • In one embodiment, the corridor is offset equally on both sides of the existing line or route segment.
  • To create a corridor around a multi-segment line or route, selection of the whole multi-segment line or route is first performed. In one embodiment selection is achieved by selecting the key node of the multi-segment line or route. In another embodiment, selection is achieved by a long press of over approximately one second anywhere on the line or route. In a third embodiment, selection is achieved by a long press of over approximately one second on any node of the multi-segment line or route.
  • Once the selection is made, the touch-screen device will present one or more options to the user, including the option to create a corridor. In one embodiment the user will subsequently provide numerical input to the touch-screen device representing the width of the corridor.
  • In another embodiment, a default or pre-selected value will automatically be used as the corridor width.
  • In a further embodiment, an active symbol is placed over the key node, or over all nodes of the multi-segment entity. When a finger touch is made to an active symbol, the active symbol can be moved away from the node it is over, for the user to indicate the width of corridor required.
  • In one embodiment, a dynamic circle will be created, centered on the node and with radius defined by the active symbol, to visually feed back to the user what the width of the corridor will be.
  • In one embodiment, the active symbol and circle will disappear upon the user removing their finger.
  • In another embodiment, the user's finger must remain substantially motionless for approximately 0.5 seconds before the corridor width is finalised and the active symbol is removed.
  • Once a width is established, a corridor will be drawn around the central multi-segment line or route in accordance with the selected width.
  • In one embodiment, the corridor area will be calculated as the union of segment rectangle area, as shown for one segment by 2210 in FIG. 22B, and node circle area, as shown for one node by 2208.
  • Segment rectangle area is the union of all rectangles with length given by the individual segment lengths and width given by the selected width.
  • Node circle area is defined by circles with a radius of the selected width, for all nodes in the multi-segment line.
  • The addition of node circles to the calculation of corridor area eliminates the discontinuities of corridor shape shown by 2212 in FIG. 22B.
  • The border of the final corridor area calculated will be displayed around the original multi-segment line or route, as shown by 2214 in FIG. 22C. A sketch of this construction follows.
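A compact sketch of the FIG. 22 construction. Buffering a polyline with round joins and caps is precisely the union of per-segment rectangles and per-node circles described above (including the semi-circular ends); the shapely library is used here as one convenient way to compute it, and the coordinates and width are illustrative.

    from shapely.geometry import LineString

    route = LineString([(0, 0), (50, 40), (120, 30), (200, 90)])  # FIG. 22A-style route
    corridor = route.buffer(15)  # 15-unit offset either side of the central route

    print(corridor.area)             # final corridor area
    print(corridor.boundary.length)  # border drawn around the route (2214)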
  • In some embodiments, corridors are not only created by the user of a touch screen device, but are defined on a remote computer or touch screen device and communicated to another computer or touch screen device for display.
  • The use of nodes facilitates the communication of corridors, since little data is required to be transmitted to define a corridor, as sketched below.
  • Navigation-restricted corridors can therefore be provided centrally, and overlaid on a touch-screen display together with local information such as the GPS position and planned route of the local user.
  • The key is the use of nodes to represent the required information between users and data sources.
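A hypothetical illustration of why node-based sharing is compact: a corridor is fully described by its ordered node list plus a width, so the message below is all a remote device needs to redraw it. The JSON field names are illustrative assumptions, not a protocol defined in the patent.

    import json

    corridor_message = {
        "type": "corridor",
        "name": "Glacier traverse",
        "width_m": 500,
        "nodes": [  # latitude, longitude for each node, in route order
            [48.7596, -113.7870],
            [48.7402, -113.7501],
            [48.7188, -113.7223],
        ],
    }

    payload = json.dumps(corridor_message)
    print(len(payload), "bytes to transmit")   # a few hundred bytes
    print(json.loads(payload)["nodes"][0])     # first node, recovered remotely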
  • In order to detect nodes, define points of definition, and create lines, routes and corridors from nodes, a touch screen module is required, as indicated by 2402 in FIG. 24.
  • The output signals from the touch screen module in response to user touches are fed to a control module 2404 for interpretation.
  • The control module will determine whether a multi-touch event relating to nodes, node-based lines, node-based routes or node-based corridors has occurred. If so, the control module will process the information to create or modify the relevant entity. Node, line, route or corridor data for storage will be routed to the memory module 2412, with the memory module also serving as the source of these data for the control module where required by a running application 2418 or operating system 2414.
  • The communications module 2410 sends or receives the point of definition, line, route or corridor node data, and supplementary information associated with that entity. This information is passed to or from the control module, which may also route the data to or from the touch screen module or the memory module.
  • FIG. 25 shows how node-based point of definition, line, route and corridor information may be exchanged between different devices and computers via a network.
  • a tablet computer 2502, a large touch screen device (such as a touch screen television) 2504, a personal computer or workstation 2506 (which does not have to have a touch screen or be multi-touch enabled), a smartphone 2508 and a satellite navigation system 2510 are shown communicating node-based point of definition, line, route and corridor information via a network.
  • the information being provided by some or all of the devices is processed and stored at central servers or databases 2512 .
  • the central servers or databases will share information as requested and required by the devices' applications and operating systems, including node-based point of definition, line, route and corridor information, for example a corridor area on a map.
  • the link 2514 represents a one-to-one communication of node-based point of definition, line, route and corridor information between two users with suitable apparatus, and shows that a centralized information distribution system is not necessarily required. Peer-to-peer and small clusters of users can also share node-based entity information.
  • a way of defining multiple points of definition by the user is provided, which means that places of relevance to her may be defined just by a touch at the applicable screen location over a background such as a map. Furthermore, those points of definition may be named as desired, remembered by the touch screen device, and shared with friends, social networks or databases. Points of definition could include favourite shops, parking spaces currently vacant and rendezvous points. Current mapping applications typically allow only one user-defined point or pin, which is not customizable or storable and may not be labelled.
  • routes can also be defined easily with two taps, showing direction as well as routing. Route segments may be quickly defined and joined by touch to create a composite route for navigators and pilots. Since routes—like all node-based entities—are easy to define and repeat, they are easily communicated via a communication network, which could have advantages, for example, in the remote filing of flight plans.
  • corridors are two dimensional extensions to lines and routes.
  • Corridors are easy to create by touch and user selection, and have application in navigation and control. Corridors can be defined centrally for example by air traffic control on a touch screen, and communicated to pilots.
  • the same two-touch method used for defining lines also lends itself to defining two points between which a distance is to be measured, with that distance displayed to the touch screen device user.
  • the distance can either be the screen distance, for example in horizontal and vertical pixels for a programmer, or the distance which the touches represent on a background image or map. The method is therefore useful for navigators to assess the distance between two points.
  • Other examples include use by radiographers to determine the size of bone fractures from an image on the touch screen, and by air traffic control to determine following distances between two aircraft by touching their symbols on a radar display on a touch screen workstation.
  • the node-based method lends itself to the efficient communication of the said entities.
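  • As an illustrative aside (not part of the described method), the rectangle-and-circle union above can be expressed as a simple membership test: a point lies in the corridor exactly when its distance to the nearest segment, with the perpendicular foot clamped to the segment ends, is at most the selected width. The minimal Python sketch below assumes the selected width acts as the lateral offset on each side of the line and that node positions are plain coordinate pairs; all names are illustrative.

```python
import math

def dist_to_segment(p, a, b):
    """Euclidean distance from point p to the closed segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0.0:                       # degenerate segment (a == b)
        return math.hypot(px - ax, py - ay)
    # Parameter of the perpendicular foot, clamped to the segment.
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    return math.hypot(px - (ax + t * abx), py - (ay + t * aby))

def in_corridor(p, nodes, width):
    """Corridor membership around a multi-segment line.

    An unclamped perpendicular distance would give only the segment
    rectangles; clamping t to [0, 1] adds a circle of radius `width`
    around every node, reproducing the rectangle-union-circle
    construction and removing the discontinuities at segment joints.
    """
    return any(dist_to_segment(p, a, b) <= width
               for a, b in zip(nodes, nodes[1:]))

# Example: a two-segment route with a corridor of width 1 on each side.
route = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0)]
print(in_corridor((2.0, 0.5), route, 1.0))   # True: inside a segment rectangle
print(in_corridor((4.7, -0.7), route, 1.0))  # True: inside a node circle only
print(in_corridor((2.0, 2.0), route, 1.0))   # False: outside the corridor
```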

Abstract

A method and system are presented which will detect combinations of user touches on a touch screen device as nodes, and will create points of definition, lines, routes and corridors from these nodes. Where the device has a communications capability, the locally defined points of definition, routes and corridors can be shared with remote users and databases, and similar entities created by remote users can be viewed on a local display. The method and system are of particular benefit to navigation applications, map customization on touch screen devices, real estate management, scientific measurement, and geographic information systems.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of provisional patent application Ser. No. 61/698,625, filed Sep. 8, 2012 by the present inventor.
  • FEDERALLY SPONSORED RESEARCH
  • None.
  • BACKGROUND
  • Prior Art
  • The following is a tabulation of some prior art that presently appears relevant:
  • U.S. PATENTS
  • Patent Number Kind Code Issue Date Patentee
    0,862,382 A1 2013 Apr. 13 Stone
    0,093,188 A1 2011 Apr. 21 Barkai et al.
    0,040,684 A1 2008 Feb. 14 Crump
    0,181,620 A1 2011 Jul. 28 Hung
    0,046,881 A1 2011 Feb. 24 Karaoguz
    0,149,109 A1 2010 Jun. 17 Elias
    0,307,512 A1 2008 Dec. 11 Tandon
    0,178,697 A1 2011 Jul. 21 Mincey et al.
    8,380,366 B1 2013 Feb. 19 Schulte et al.
    0,022,308 A1 2011 Jan. 27 Britton
  • FOREIGN PATENT DOCUMENTS
  • Patent Number Issue Date Patentee
    KR20090017462 (A) 2009 Feb. 18 Sang et al.
  • NON-PATENT SOURCES
    • 1. Google, Google Maps for Android (HTC), version 6.8.1, 2012
    • 2. Apple iOS 6.1.3, Maps
  • The numbers and types of touch-screen devices are rapidly increasing, and the user interfaces to some of these devices, and the ways in which they are used, are still evolving. There are already some typical uses of certain gestures with fingers or thumbs in order to perform selection, panning, scrolling and zooming functions. These gestures are often collectively referred to as multi-touch, since they rely on more than one digit being in contact with a touch-sensitive surface at a time, or the same digit being used to tap the surface more than once. Certain multi-touch gestures are now so common as to have become de-facto standards, for example pinch, zoom and pan in any mapping application on Apple's iOS and Google's Android.
  • However, with the exception of shape, area and window definition by the current author in U.S. Pat. No. 0,862,382 (Stone, 2013), no methods are node-based. Nodes are defined here as user touch-defined points on a touch screen which may be moved, via touch with movement, without the background moving, and of which multiple may exist at the same time either on the screen or in an off-screen area. Nodes may also have data attached to them (in addition to their position on the touch screen), either defined by the application or by the user. Nodes can have particular operations performed on them, such as deletion or association with other nodes. The length or routing of node-based lines and routes can be changed by the movement of one or more of their nodes.
  • One area of prior art which is pertinent to this application is that of defining and sharing points of interest on electronic map displays, as found on touch-screen devices. Barkai et al. in U.S. Pat. No. 0,093,188 (2011) describe the different electronic mapping resources available, including Google Maps, OpenStreetMap and Wayfaring, when describing a means to visualise shared routes. However it is significant that the points and routes which are described are from users and travellers who permit their location to be shared with the wider web community, and therefore the points of interest shared are actual geographic locations measured from mobile devices (predominantly smart-phones). Also, Barkai et al. only describe web-based applications (rather than maps embedded in mobile devices) through which to perform and customise this data sharing. In fact there is now prior art for similar position sharing via embedded applications such as Google Maps for Android (HTC, version 6.8.1, 2012), and specifically the Latitude feature which allows mobile users to share their positions with other chosen mobile users, and to represent these positions on the map display. Separately, modern mapping applications including Google Maps and Apple Maps have a feature whereby a touch screen user can define a point anywhere on the screen (typically marked by a pin symbol) which does relate to a geographic point which the user is effectively marking. However there is no means to define multiple new geographic points, lines or routes using touches against a point on a map on a touch screen, and then to share these with selected users or use them in a mapping application such as navigation or real estate management. Also the location cannot be moved; the background map pans when an attempt is made to move the location.
  • Crump, in U.S. Pat. No. 0,040,684 (2008) describes a method and apparatus for a user to customise their use of a map by means of map overlays, which together with user-interactive menus and dropdowns permit the display of points of interest of particular classes, with changeable icons, in relation to a known point of interest close to the desired travel destination. This enables features and businesses of interest to be displayed near a planned destination, although the invention is limited to web-based applications, and the graphical user interface is best suited to computers with mouse control. Sang and Hwa in Korean Pat. No. KR20090017462 (2009) and Hung in U.S. Pat. No. 0,181,620 (2011) also reference customisable map information from different user-supplied information, which is not specifically web-based, but which nevertheless does not include point, line and route definition via touch-screen. Karaoguz, in U.S. Pat. No. 0,046,881 (2011), does discuss enhancement of points of interest, where the user associates points of interest derived from various means with photographs, related information and multimedia files. The user may also name the points and define categories of points of interest. In addition the user can display positions and historical positions and routes of other selected users, along with data and route information of those other users. All selected information can be overlaid on an electronic map and stored. However, critically, the point of interest locations either relate to position (current or historical), a look-up from a street address, entered or received coordinates, or another form of look-up from a network or database; there is no touch-screen definition of points. In addition the route information referred to is note- and information-based or historical-coordinate-based, rather than node-based. Lines for map personalization are not referred to at all.
  • One proposal does promote a method to define lines via multi-touch gestures: Elias (2010) in U.S. Pat. No. 0,149,109 describes an innovative means of drawing a line on a touch screen device, and means for editing a line once drawn, in terms of lengthening, shortening and rotating. While these gestures are innovative, they rely on the detection of motion along or across a line, rather than the line being created and edited via two explicit, persistent nodes. The lack of a node-based approach may be suitable for certain drawing applications, but presents difficulties for complex editing, conversion to other graphical entities, sharing with remote users and databases, and the joining of line segments to create a multi-segment line. Currently, therefore, there is no method available for line definition on multi-touch enabled devices which supports complex editing, remote sharing or the explicit joining of multiple lines for use with applications such as map displays.
  • A further area of prior art which is of relevance to map personalization, but specific in its area of application, is that of real estate mapping. Tandon, in U.S. Pat. No. 0,307,512 (2008) describes a property-centric real estate map with personalised points of interest. Realtor and seller can define points of interest relevant to the surroundings of the property, annotating these with defined graphical symbols, pictures and information. The points of interest are defined via a look-up from the web-based application, based on address or multiple listing service database information, and not defined on a map or earth model directly by a user on a touch-screen.
  • Another area of prior art specifically applicable to this application is routes for navigation. Mincey et al. in U.S. Pat. No. 0,178,697 (2011) define several points of interest which may be relevant to a route being planned or followed, consisting of several steps and alternative points of interest, where these points of interest have attributes, such as visibility, with which to make route decisions. However the points of interest are not user defined, but are the result of searches, and the routes relate to automobile navigation. Schulte et al. in U.S. Pat. No. 8,380,366 (2013) add some highly relevant aspects, in terms of touches to touch screens approximating to nodes for defining routes for air navigation, and other operations on these nodes and routes. However it seems that the nodes are discrete and equate to known entities underlying the map—known locations with predefined coordinates and other known attributes such as the altitude above sea-level of an airport. The airways defined also equate to existing routes which must be followed until particular entry or exit points. Therefore both nodes and routes are generally limited to existing node and route positions, and are not intended to define arbitrary waypoints in the middle of nowhere. Various operations on nodes are similar to those defined in this application, including being able to select a node to see its attributes, the deletion of a waypoint resulting in a route of fewer segments, and the snapping of nodes together so that they are associated and become part of a single route. However these operations are specific to the apparatus defined by Schulte et al., and especially the involvement of an unspecified control device which is separate from the touch screen device itself.
  • A final area of prior art which is of relevance to the current application is that of displaying distance information between user touches. Britton in U.S. Pat. No. 0,022,308 (2011) describes a means of defining a start point of a route via touch on a touch screen, recognising an approximate route traced by touch by the user, and then recognising the end point of the proposed route, also by touch. This method then calculates a distance from the start location to the final location along the route. Britton's method is useful in defining start and final destinations by touch without having to enter locations manually or use points of interest from searches. However the intermediate touch and tracing step, which is clearly of use for calculating the distance likely to be travelled by road, is not appropriate where a minimum straight-line distance is required. Also the distance is not displayed to the user directly between the points, but is used indirectly for the calculation of route distance. Finally, the method is claimed only for roadway routes.
  • The current author in U.S. Pat. No. 0,862,382 exploited the use of concurrent nodes for the definition of shapes, areas and windows, but did not include the use of nodes as individual points of interest or definition, nor the use of more than one node to create lines and routes.
  • In conclusion, insofar as I am aware, no method or apparatus is available by which node-based points of definition, node-based lines, node-based routes, and node-based corridors can be created or shared using gestures on a touch screen device. In addition, no method or apparatus is in existence or is otherwise prior art whereby node-based points of definition or node-based lines and routes can be directly applied to mapping, earth model and navigation applications so as to define and share any user-defined geographic location, boundary or route.
  • SUMMARY
  • Currently there is no means to define and manage node-based points of definition, lines and routes rapidly and in a user-friendly manner using multi-touch gestures, nor to create a corridor from a line or route. I have produced such means through the creation and manipulation of nodes on a touch screen device.
  • In one embodiment multiple independent nodes on a mapping application are created by touch gestures on a touch screen, which correspond to multiple independent geographic points on the Earth's surface. The nodes can be labelled by the user, moved by the user without the background map moving, and have attributes added by the user or application. Multi-touch selections on the touch screen with two touches together are interpreted as line segments, and with two touches in rapid sequence are interpreted as route segments with direction implied by the order of touches. The lines may exist for the duration of the creating touches or indefinitely, and may display distance or vector difference between the two touches, to the user. Line segments may be divided into several segments by the selection and movement of hidden (latent) nodes along existing line segments. Multiple line and route segments can be defined to exist together, and these can be joined to make complex, multi-segment lines and routes by the joining of the end nodes of line and route segments. Lines and routes can be made into a two dimensional corridor by the definition of a width relevant either side of a line or route segment, or a composite, multi-segment line or route.
  • Some of the gestures for node, line, route and corridor definition and management are entirely new in one embodiment, while for others the gestures used are adapted from existing gestures, but used for an entirely new purpose or with a new human computer interaction interface.
  • Being node-based, once defined the points, lines, routes and corridors can be easily manipulated and edited, and the lines, routes and corridors can be efficiently stretched and re-sized. In some embodiments the use of key nodes permits operations on a composite entity made up of multiple line or route segments, such as the movement or deletion of a complex route. The node-based aspect also gives the advantage of sharing data in one embodiment such as the remote transmission of geometric information between different devices programmed or enabled to process or interpret the information, for example for passing planned routes over a communications network.
  • Various applications of one or more embodiments include sharing planned wilderness hiking routes, defining flight plans, remote drawing, defining boundaries in real estate, maritime navigation and getting representative distance information between two points on an x-ray or electron microscope image by touching the two required points on a touch screen.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The aforementioned and other aspects of the invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
  • FIG. 1 represents the definition of a node or point of definition by the user of a multi-touch enabled device according to an embodiment of the invention.
  • FIG. 2 is a potential result on a mapping application of a multi-touch defined point of definition according to an embodiment of the invention.
  • FIG. 3 shows how a node on the touch screen, representing a point of definition may be moved according to an embodiment of the invention.
  • FIG. 4 shows how a point of definition on a map is moved in sympathy with the node being moved on the touch screen according to an embodiment of the invention.
  • FIG. 5A shows a flow-chart of a method to create and select a node or point of definition using multi-touch according to embodiments of the invention.
  • FIG. 5B shows a flow chart of normal node and margin node movements according to embodiments of the invention.
  • FIG. 5C shows normal node movement according to an embodiment of the invention.
  • FIG. 5D shows margin node movement according to an embodiment of the invention.
  • FIG. 6 demonstrates potential information related to a touch defined point of definition on a map according to embodiments of the invention.
  • FIG. 7 represents the definition of a line segment on a touch screen device according to embodiments of the invention.
  • FIG. 8 shows a result on a mapping application of a multi-touch defined line segment according to an embodiment of the invention.
  • FIG. 9 shows a flow-chart of a method to detect a multi-touch defined line segment according to embodiments of the invention.
  • FIG. 10 highlights potential information related to a multi-touch defined line segment according to an embodiment of the invention.
  • FIG. 11 demonstrates how latent nodes can be used to divide a line segment in two according to embodiments of the invention.
  • FIG. 12 presents a flow-chart of how latent nodes can be used with fractional or space division to divide a line segment into two or more segments according to embodiments of the invention.
  • FIG. 13 demonstrates how the act of creating a line segment can also be used to display representative or actual distances between the touch nodes, according to embodiments of the invention.
  • FIG. 14 illustrates how a sequenced tapping of a touch screen can be used to define direction of travel, and therefore also a vector or route according to embodiments of the invention.
  • FIG. 15 indicates how, when defining a route or line in a node-based fashion, the vector difference between the nodes can also be displayed according to embodiments of the invention.
  • FIG. 16 presents a flow-chart defining how sequenced touches result in a route or vector line, and how it is determined whether to display distance between the nodes, according to embodiments of the invention.
  • FIG. 17 illustrates how a composite, multi-segment line can be defined from multiple line segments according to embodiments of the invention.
  • FIG. 18 shows how a line can be used as a route, with directionality, according to an embodiment of the invention.
  • FIG. 19 presents a flow-chart defining how a composite multi-segment line or route can be formed from joining nodes of other lines or routes together, according to embodiments of the invention.
  • FIG. 20A shows how a node on a line or route can be deleted according to an embodiment of the invention.
  • FIG. 20B indicates a resulting composite, multi-segment line after the deletion of a node, according to an embodiment of the invention.
  • FIG. 21 represents the formation of a corridor from a line or route according to an embodiment of the invention.
  • FIG. 22A depicts a multi-segment route according to an embodiment of the invention.
  • FIG. 22B shows a multi-segment route with rectangles and a circle to show the creation of a corridor area, according to an embodiment of the invention.
  • FIG. 22C illustrates a complete corridor over a multi-segment route according to an embodiment of the invention.
  • FIG. 22D represents a process for the creation of a corridor area around a multi-segment line or route according to an embodiment of the invention.
  • FIG. 23A illustrates how route lines can be converted into non-route lines and vice versa, according to embodiments of the invention.
  • FIG. 23B illustrates how route lines can be reversed in directionality according to an embodiment of the invention.
  • FIG. 23C illustrates how a closed area can be converted between a route and area, and between a closed line and an area according to an embodiment of the invention.
  • FIG. 24 is a modular view of a multi-touch device capable of defining, editing and displaying node-based points, lines, routes and corridors according to embodiments of the invention.
  • FIG. 25 demonstrates how a network of multi-touch enabled devices, touch-screen devices and databases can be used together to share node-based points, lines, routes and corridors between each other according to embodiments of the invention.
  • DETAILED DESCRIPTION & OPERATION
  • First Embodiment
  • One area which has not yet been served by efficient multi-touch gestures is that of node-based point, line, route or corridor definition via a multi-touch surface, and yet this could afford considerable utility to users of mobile touch-screens and even large-scale touch-sensitive surfaces. A point defined by one finger in contact with a touch-sensitive area is termed a node, and one node on the touch screen can define the location of, and information relating to, a point of definition on a background such as a map or image. Similarly, two touch screen nodes can be used to create line segments and route segments on a background. From these primitive entities, multi-segment, composite lines, routes and corridors can be created, and any primitive or composite entity can be manipulated through the movement of the nodes defining it. Various combinations of the following node-based point, line, route and corridor definitions, manipulations and edits can be used in an embodiment.
  • Node-Based Point Definition, Selection & Movement
  • The definition of a point on a touch screen device is prior art, especially when implemented as a single touch, held for a short duration. Such an operation on a mapping service such as Google Maps or Apple Maps results in a point of definition, marked with an information bubble or pin symbol. I term the created point a point of definition and not a point of interest, since the existence and location of the point is defined by the user, at will, anywhere on a map or geographic application; a point of interest is normally taken to mean a pre-existing geographic or commercial entity, the symbol or information of which may normally be hidden, that can be displayed to the user on request or when satisfying a search request. In one embodiment a prolonged touch (typically between 0.3 seconds and 1.0 seconds) to a touch-screen device would also create a point of definition on a map, similar to the prior art method of Google Maps and Apple Maps implemented on touch screens, except that the point of definition would be marked with a symbol. In one embodiment the symbol would be a filled, colored shape such as a square. As a note of clarification, a node refers to a persistent defined touch point on the touch screen, which may be used to produce various entities, including points of definition, on a background application such as a map or image application. Therefore nodes are used in the creation of points of definition, and effectively one node entity exists for each point of definition entity. However nodes are not specific to points of definition, since nodes are used in the creation of other entities. For example two nodes are used in the definition of a line segment, and therefore two nodes are associated with every line segment. Multiple nodes can exist on a touch screen concurrently.
  • FIG. 1 illustrates the creation and selection of a node and point of definition on a touch-screen surface 102, via a prolonged touch with a single touch implement 104—in this case a finger. The use of one or more fingers is assumed during the remainder of the detailed description and operation section, although other touch implements can be used, such as a stylus or gloves adapted to the purpose of multi-touch gesturing on touch screens. FIG. 2 shows the result of a point of definition node creation touch on touch screen device 202, with the created point represented by square 204 in the position on the map or background image at which it was created. In one embodiment no label will be given to the created point. In a second embodiment, directly after the creation of a point of definition a means to label the point is automatically given to the user, such as a virtual keyboard being provided on the touch screen display. In a third embodiment the application will provide an automatically generated number or label upon creation. In a fourth embodiment, a longer duration touch than required for node creation, of approximately two seconds, will initiate a means to label the point of definition. In a further embodiment, which is not mutually exclusive to the point labelling embodiments described above, the selection of an existing point of definition will enable the user to perform a labelling or re-naming operation on that point. Legend 206 is an example of a point of definition label defined by the user. One of the fundamental differences between prior art points of definition and node-based points of definition is that multiple node-based points of definition can exist concurrently, and be visible to the touch screen user at the same time, as depicted by additional points 208. For prior art points of definition such as those of Google Maps or Apple Maps, only one point of definition is permitted; trying to define a second one will result in the original point being deleted, and the new point being created at the new position instead.
  • The creation of a node-based point of definition also selects that node for immediate movement of the point of definition, as indicated in FIG. 3 by the motion arrow 304. When the finger that creates or selects the node is moved while still maintaining contact with the touch-screen, the node and therefore the associated point of definition will be moved along the surface of the touch screen 102 with the finger. Therefore in one embodiment when a selected point of definition is moved, no panning of the screen or underlying map typically occurs; the node and associated point of definition move rather than the background, with the exception of the margin-based panning described below. The result of a node-based point of definition creation and movement is shown in FIG. 4. The created node-based point of definition representation 404, on the touch-screen device 202, is shown. The direction arrow 402 represents movement of the point of definition, equal in direction and distance on the touch-screen to the causal finger motion.
  • The selection of a previously created node-based point of definition is similar to the selection during creation of a node-based point of definition, except that it requires the existence of a node-based point of definition at the location of the touch (or within a number of pixels of the touch). Therefore touching a node-based point of definition on the touch-screen and continuing to touch it will select that point of definition for an operation, including moving the node around the touch screen. The node-based point of definition is only required to be present to be selectable, and therefore the selection—even of invisible points—is possible. In one embodiment the appearance of the selected point of definition is changed to denote that it is selected. In a second embodiment the contrast of a selected point of definition is inverted to denote a selected point, such as that shown in 404, and in a third embodiment the color of a node-based point of definition changes once selected.
  • FIG. 5A demonstrates how to create the functionality of node-based definition, selection and movement on the touch-screen for points of definition. Process box 502 is a background task which monitors for a multi-touch operation; when a multi-touch operation is detected, the decision logic at 504 detects the specific multi-touch input of a touch device such as a finger touching the screen for a duration appropriate to the application (for example one second). If this node event is detected there is another decision point 506 which determines whether there is an existing node at the location of the touch on the screen. If there is no existing node, the node-based point of definition creation process 510 is initiated, which creates a node at the location being touched, and then selects the point of definition which has just been created. If there is a node at, or close to, the detected node multi-touch, that node will be selected, as shown in 508. Whether the node was just selected by the node multi-touch, or created and selected in a combined operation, the user can move the node-based point of definition freely, as summarized by process 512, and further elaborated in the process description of FIG. 5B. The node-based point of definition position on the touch-screen will repeatedly track the position of the finger performing the touch, as shown by process 518. However a decision 520 as to whether to perform normal node movement or margin-based node movement depends on whether the node, and associated point of definition, is within a screen margin. A screen margin may be used—although it is not necessarily visible—in the situation where the background to a node, such as a map, occupies a larger area than can be seen on the touch-screen. In this case the node-based point of definition remains under the controlling finger, but the background moves in the opposite direction to the margin, as described by 524. Therefore if a node-based point of definition is moved into a defined margin area at the left of the touch-screen, the user's controlling finger may stop there, in which case the background will move to the right. Typically, margins will be relevant at the top, bottom, left and right sides of a rectangular touch-screen, although margins near the corners could, for example, act in two directions. Such scrolling of a background can occur for as long as the user's finger is in contact with the screen and within a margin. If the finger is no longer in a margin, but still in contact with the touch-screen, normal node-based point of definition motion 522 will occur, with the node following the finger. Decision logic 514 on FIG. 5A determines whether any other operation is performed on the node-based point of definition after movement; an immediate time-out occurs if the node controlling finger is removed from the touch-screen, in which case the node-based point of definition stays at the position it was last at—where the finger was removed—and is deselected. However if the controlling finger is detected as staying in the same position, but still in contact with the touch-screen for a certain time—for example two seconds—a user interface will be brought up to enable the user to assign additional information to the point of definition, as shown in process 516.
  • FIG. 5C shows normal node movement on a touch-screen surface 102—in this case a satellite navigation system—with 528 representing the position of a user's finger, which is evidently outside the margin 530. The node, and associated point of definition if applicable, is moved with the finger, for example as shown by arrow 526, without the background being panned. By contrast, FIG. 5D shows margin-based node movement, where finger position 528 is within the margin 530—in this case the top margin. Arrow 532 represents the movement or scrolling of the background as a consequence of the finger controlling the node being within the margin—in the opposite direction to the node's previous direction of travel across the screen. Margin-based node movement is possible where the area for which nodes are relevant is greater than the area shown by the screen. The effect will be to move the node further along the background in the desired direction. In this case the direction of movement of the background will be opposite to a panning multi-touch operation in the same direction, as would happen with the attempted movement of a point of definition pin in Apple Maps, for example. Note that although the above functionality concentrates on node movement with respect to an associated point of definition, various embodiments relate to the movement of nodes underlying lines, routes and corridors. Therefore, for example, the movement of the end node of a multi-segment line can also use normal and margin-based node movement as described in FIGS. 5A, 5B, 5C and 5D.
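  • A minimal Python sketch of this create-or-select and movement flow is given below. It is an illustration only, not the described apparatus: the class structure, the pixel thresholds and the pan_background callback are all assumptions.

```python
import math

MARGIN_PX = 40      # assumed margin width
HIT_RADIUS_PX = 24  # assumed radius for selecting an existing node

class NodeLayer:
    """Sketch of the FIG. 5A/5B flow: a held touch creates or selects a
    node; the node then tracks the finger, and entering a screen margin
    scrolls the background instead of moving the node further."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.nodes = []        # node positions in screen pixels
        self.selected = None   # index of the node under the finger

    def on_hold(self, x, y):
        """Node event detected (504): select an existing node (508)
        or create and immediately select a new one (510)."""
        for i, (nx, ny) in enumerate(self.nodes):
            if math.hypot(nx - x, ny - y) <= HIT_RADIUS_PX:
                self.selected = i
                return
        self.nodes.append((x, y))
        self.selected = len(self.nodes) - 1

    def on_drag(self, x, y, pan_background):
        """Track the finger (518); decision 520 chooses normal node
        movement (522) or margin-based background scrolling (524)."""
        if self.selected is None:
            return
        self.nodes[self.selected] = (x, y)    # node follows the finger
        dx = dy = 0
        if x < MARGIN_PX:                     # left margin: pan right
            dx = MARGIN_PX - x
        elif x > self.width - MARGIN_PX:      # right margin: pan left
            dx = (self.width - MARGIN_PX) - x
        if y < MARGIN_PX:                     # top margin: pan down
            dy = MARGIN_PX - y
        elif y > self.height - MARGIN_PX:     # bottom margin: pan up
            dy = (self.height - MARGIN_PX) - y
        if dx or dy:                          # corners act in two directions
            pan_background(dx, dy)

    def on_release(self):
        """Finger lifted (514): the node stays put and is deselected."""
        self.selected = None
```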
  • FIG. 6 shows some of the information which could be attributed to a node after creation and selection—particularly a node representing a geographic location on a map. Latitude and longitude would be important for a location node—these would be received from a mapping or geographic model application once the position on the touch-screen has been established. Also for geographic nodes, start date and finish date could be useful for determining relevance. Significantly, elevation or altitude—perhaps with minimum and maximum values—would allow a three dimensional location definition. Therefore the altitude of a surveying point on a mountain could be usefully defined, an altitude above ground could be defined, or a depth below the sea could be added to latitude and longitude data. A name would also be useful—especially when sharing the node on a network—for a shared reference by those users with permissions to see specific nodes. Certain information, if required, could be attributed to a node by the operating system, such as the user who created the node and the time at which it was created. Miscellaneous information or notes about a location could also be added by a user. Finally, visual information such as node icon and label color, size and shape could be defined, or these could be defined or defaulted by the application itself. In one embodiment any information desired to be entered by the user would be available on a menu or form following a normal selection of the node of approximately 0.5 seconds. In another embodiment a menu or form would be presented to the user to complete following an extra-long selection period of more than one second. In some embodiments data associated with a node would have default values used if the user did not specify values. FIG. 6 also shows that nodes and node data can be shared across a communication network, and that a node created on one device could be viewed either as a point of interest or as an editable node on other devices.
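  • One plausible way to hold such node data is sketched below; the field names, types and defaults are illustrative assumptions rather than a definitive schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    # Screen position in pixels, plus its geographic representation.
    x: int
    y: int
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    # Elevation range permits three dimensional location definitions.
    elevation_m: Optional[float] = None
    min_elevation_m: Optional[float] = None
    max_elevation_m: Optional[float] = None
    # Relevance window and shared reference name.
    start_date: Optional[str] = None
    finish_date: Optional[str] = None
    name: Optional[str] = None
    # Attributed by the operating system at creation.
    created_by: Optional[str] = None
    created_at: Optional[str] = None
    # Free-text notes and visual presentation, defaulted by the application.
    notes: str = ""
    icon: str = "square"
    color: str = "blue"
    size_px: int = 12
```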
  • Line Segment Definition and Distance Display
  • Line segments—with reference to FIG. 7—are defined with the use of two fingers; in this case a thumb 704 and middle finger 702 touching a touch-screen 102 together, and for a minimum duration (for example one second), without being moved. This action will create or select two nodes, at the locations of the two touches on the touch-screen, between which will be drawn a line segment, which in one embodiment will be straight, continuous, blue and thick. Other embodiments relevant to different applications would produce line segment combinations of wavy, saw-tooth or straight type, with combinations of continuous, dotted or dashed style, in one of various common colors and thicknesses. In one embodiment, one of the two nodes created will be determined as the key node, and indicated to the user as the key node by a specific appearance. Where a key node for a line segment is defined, the selection of the whole line segment, and operations on that line segment, are possible by selection of the key node first—for example movement of the line segment with both of its nodes, instead of the movement of just one node. The straight line segment shown as 806 on FIG. 8 between position nodes 802 and 804, on the touch-screen device 202, constitutes a line segment which can be further enhanced by the user, as illustrated in subsequent figures and paragraphs. In this case node 804 is marked as the key node, although other means can be used to distinguish a key node from other nodes, including shape, contrast, color or fill.
  • The creation method of a line segment by a touch-screen device user is shown in FIG. 9. User inputs to the touch-screen will be monitored for multi-touch node gestures by process 502. The logic of 902 will determine whether two fingers are touching the touch-screen and remaining still for greater than a minimum duration (for example 1 second), and if so the process 904 will create a line. Process 904 will create a line firstly by selecting the two nodes specified by the finger positions. If a node already exists at a finger position (or within a defined radius), the node will be selected. If there is no pre-existing node at the position, a node will be created at the given screen position and the node will be selected. A line segment will be drawn on the touch-screen between the two selected nodes. Typically the line will be straight, but there are various possibilities with regard to line type, which may for example be a wave, a saw-tooth, an arc, a spline, or another common line type, with the typical thickness and color possibilities found in drawing applications. If a key node is required by an application, the allocation of a key node can vary according to application and user defaults and preferences, for instance the first touch to be made during creation of the line, or the node highest and furthest left on the touch-screen.
  • According to the application, line segment latent nodes may be automatically created for the purpose of line division, as described later with the assistance of FIG. 11. The logic of 906 will detect whether the two fingers remain on the created nodes for a minimum time-out period after the creation of the line segment. If not (for instance the fingers are removed immediately upon the drawing of the line segment), in one embodiment the line will be completed without any additional user information added to the line at that time. In another embodiment the line segment will disappear if a minimum time-out period is not met with the creating fingers remaining substantially still. If the fingers do remain substantially still for an additional period following the time-out, for example 0.5 seconds, process 908 will allow the user to add additional information for the line segment via a user interface. Selection of a whole line segment is possible after the initial creation of the line segment; in one embodiment this will consist of touching and holding the line segment anywhere along its length (although this embodiment is not compatible with the use of latent nodes for line division). Other embodiments will select a whole line segment from the touching and holding, or double-tapping, of the key node of the line segment. Yet another embodiment will allow the selection of a whole line segment from a menu option once either of the end nodes of the line segment is selected.
  • FIG. 10 illustrates some of the information which could be added to, and is relevant to, a line segment. In one embodiment, information can be added automatically by the operating system, such as Creation User and Creation Time. Other information in various embodiments can be graphical preferences from a user, or from default values. In various embodiments some information may be defined by the user, such as Line Segment Name and Information. Other information, including node positions on screen and node position representation (such as latitude/longitude/elevation in a mapping application), will be inherited from the node data relating to the nodes at either end of the line segment. This is advantageous to a touch-screen device user, since if one end of a line segment is not in the desired location, the user can select that node and move it, with the effect that the line segment will be stretched, contracted or rotated in accordance with the motion of the node.
  • Apart from, or in addition to, the creation of a line segment between two nodes, the same touching and holding of two node points on a touch screen can be used to show the distance or difference between the selected nodes. FIG. 13 denotes the touching of a multi-touch enabled touch screen device 202 with two fingers 1306, which are held still for a minimum amount of time as for the normal node-based creation of a line. However, in addition to line 1304 being created between the nodes under fingers 1306, a representative distance 1302 is displayed next to line 1304, which states the straight-line horizontal distance between the points on the map defined by the nodes. In one embodiment the displayed distance is calculated by performing typical distance-calculating navigational algorithms on the latitudes and longitudes represented by the nodes, which allows accurate round-earth and great-circle calculations to be used. In another embodiment the displayed distance is calculated by multiplying the calculated touch screen distance by the known scale of the map or image to the screen representation. In one embodiment the distance only remains while the fingers are in contact with the touch screen, and disappears when one or both fingers are removed, although in other embodiments the distance can persist. In another embodiment no line is displayed, and only the distance measurement is shown to the user.
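  • Both calculation approaches can be sketched briefly. The haversine formula below is one standard great-circle computation over a spherical earth model; the function names, the mean earth radius constant and the metres-per-pixel scale parameter are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean earth radius for a spherical model

def great_circle_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in metres between the geographic
    points represented by two nodes (decimal degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def scaled_screen_distance_m(p1, p2, metres_per_pixel):
    """Alternative embodiment: multiply the straight screen distance
    between the two nodes by the known map or image scale."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) * metres_per_pixel

# Two touched map points, e.g. London and Paris:
print(round(great_circle_m(51.5074, -0.1278, 48.8566, 2.3522) / 1000), "km")
```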
  • The distance measurement and display method shown in FIG. 13 can be used in applications other than electronic mapping; it can be used with earth or other planet models such as Google Earth, and with image backgrounds such as satellite imagery, x-ray images, architectural plans, geographic information system images, radio telescope imagery, photographs, video feeds and electron microscopy. The distance calculated and displayed in one embodiment represents an angle or degree of arc rather than a scalar distance, which for example could be used in astronomy. Therefore this technology is of potential use to town planners, engineers, architects, microbiologists, planet scientists, radiographers, meteorologists, farmers, real estate agents, navigators and astronomers, to name a few, who are equipped with a suitable multi-touch enabled touch screen device.
  • Route Segment Definition and Vector Display
  • If it is desired to show a vector or direction of travel rather than a scalar line, a route segment may be defined. Route segment definition is similar to line segment definition, except that the direction or vector is defined by the order of finger touches and the creation order of the nodes. FIG. 14 shows the creation of a route segment. Finger 1404 touches the touch screen and remains substantially motionless for a certain time before finger 1402 touches the touch screen. I estimate a typical value in use of approximately 0.5 seconds between touches. Once the second touch has been detected within the time window permitted, an arrow 1406 will be drawn from the node under the first touch 1404 in the direction of, and up to, the node under the second touch 1402. In one embodiment the arrow points toward the first touch instead of the second touch, and other embodiments provide for the use of multiple line types, styles, thicknesses and colors as described for line definition, including the drawing of a normal line without an arrowhead.
  • In a similar manner to how distance displays have been described as being displayable next to a line segment, scalar distances can be displayed next to a route segment as in FIG. 13. However, since a route segment gives direction as well as quantity, vector quantities or differences can be displayed to the user as shown in FIG. 15. In the example shown, a horizontal difference in touch screen pixels 1508 and a vertical difference in screen pixels 1510 are shown between the node under touch 1504 and the node under touch 1502. Since the direction of the vector is known, the horizontal x difference value and the vertical y difference value have polarity or direction, so that in effect the node under touch 1504 is a temporary origin, and the node under touch 1502 is a vector relative to the former. One specific application of this example is for touch screen graphic designers and webpage authors in determining relative positioning on a touch screen. However the areas of application are much broader. In one embodiment the axis values of quantities 1508 and 1510 are Northings (distances to North or South) and Eastings (distances to East or West) respectively in a mapping application. In another embodiment the two dimensions for which distance is shown are based on two orthogonal axes relevant to an image. In another embodiment the information presented is an angle and distance between the nodes, using measurement quantities and axes appropriate to the scale and application.
  • FIG. 16 shows a method of creating a route or vector segment, and differentiating it from a line or other multi-touch gesture. Activity 1602 monitors for the first touch, and decision logic 1604 determines whether a second touch occurs within a specific time window. In this example a window of between 0.2 and 0.8 seconds is defined; more generally it is anticipated that the time between the two touches will have a minimum value of, for example, 0.1 seconds and a maximum value of, for example, two seconds for a route to be recognised, although other durations are possible. Activity 1606 creates the vector line itself, for example a straight, black, thick arrow from the node under the first touch to the node under the second touch. Decision logic 1608 determines whether vector distance is required to be displayed, and if this is the case it is displayed via process 1612. If not, finger presence after the drawing of the route is used to decide in 1610 whether user information is required to be added in activity 1614. Additional information which may be added by the user includes the name of the segment and free text describing its significance.
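  • The timing decision of 1604 can be sketched as a small classifier. The route window below uses the 0.2 to 0.8 second example from the figure; the simultaneity threshold for recognising a line (two touches arriving together, which must then also remain still for the line-creation hold period) is an assumed value.

```python
ROUTE_MIN_GAP_S = 0.2   # example route window from FIG. 16
ROUTE_MAX_GAP_S = 0.8
TOGETHER_S = 0.05       # assumed: touches this close count as simultaneous

def classify_two_touches(t_first, t_second):
    """Classify a two-touch gesture by its inter-touch gap (1604).

    Returns 'line' for effectively simultaneous touches (subject to the
    separate hold-still check), 'route' for ordered touches within the
    window (direction implied by touch order), or None so that other
    multi-touch handlers can process the event.
    """
    gap = t_second - t_first
    if gap <= TOGETHER_S:
        return "line"
    if ROUTE_MIN_GAP_S <= gap <= ROUTE_MAX_GAP_S:
        return "route"
    return None

print(classify_two_touches(10.00, 10.02))  # line
print(classify_two_touches(10.00, 10.50))  # route
print(classify_two_touches(10.00, 12.50))  # None: too slow for a route
```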
  • Multi-Segment Line and Route Definition and Editing
  • FIG. 11 demonstrates how a line may be sub-divided via node-based multi-touch. In one embodiment, a created line or route segment will automatically have one or more latent nodes (as indicated by 1106) created along its length in addition to the line-end defining nodes. In different embodiments these extra latent nodes may be visible or invisible to the user, and may be regularly spaced or irregularly spaced. If a user selects one of these latent nodes (as per the selection of any node with a short touch typically of between 0.5 seconds and 1.0 second), the latent node can be moved relative to the line-end nodes 1104. This will bend the line, and create two line segments out of one, sharing a common end-line node (which was previously a latent node of the original line). In one embodiment new line segments created by line sub-division will have their own new latent nodes created.
  • FIG. 12 illustrates a process by which node-based line division may be performed, which will result in a multi-segment line or route. Decision logic 1202 identifies whether the use of latent nodes is valid with current defaults and user selections. A second decision logic 1204 determines which method to use. In the process shown, two methods are possible depending on default or pre-definition by the user. In the case of line division, process 1206 becomes active and divides a line or route segment into N equal parts with N−1 equally spaced latent nodes, where N is a predefined integer value. For example if N has the value of 2, any line or route segment created will have one latent node half way between the two end nodes. Alternatively, if a particular spacing value, such as 3 cm, has been decided as the basis for latent node spacing, a latent node will be placed at that interval, starting at one end node. In different embodiments there may be a choice between the line division method, the spacing method or irregularly-spaced latent node methods, or just one of these may be implemented. Once the number and position of latent nodes on a line or route segment has been determined, activity 1210 will monitor for a touch over one of the latent nodes. On selection, as shown in process 1212, a normal node will be created from the selected latent node, resulting in a multi-segment line or route, even if all nodes still lie in a straight line. In one embodiment of process 1212, once a latent node has been selected, a new node will not be created from the latent node unless the selected latent node is first moved.
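  • The two latent-node placement methods of process 1206 can be sketched as follows; coordinates are assumed to be screen positions, and the end nodes themselves are never returned as latent nodes.

```python
import math

def latent_nodes(a, b, n_parts=None, spacing=None):
    """Latent node positions along segment ab (sketch of process 1206).

    Fractional division: n_parts equal parts yields n_parts - 1 nodes.
    Spacing division: one node every `spacing` units measured from a.
    """
    (ax, ay), (bx, by) = a, b
    length = math.hypot(bx - ax, by - ay)
    if n_parts:
        ts = [k / n_parts for k in range(1, n_parts)]
    elif spacing and length > spacing:
        ts = [k * spacing / length
              for k in range(1, math.ceil(length / spacing))]
    else:
        ts = []
    return [(ax + t * (bx - ax), ay + t * (by - ay)) for t in ts]

print(latent_nodes((0, 0), (100, 0), n_parts=4))   # three equally spaced nodes
print(latent_nodes((0, 0), (100, 0), spacing=30))  # nodes at 30, 60 and 90
```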
  • Another means of creating a composite, multi-segment line or route is by moving line or route segments together so that they are joined at their end nodes. FIG. 17 illustrates a composite line created from line segments 1708, 1710 and 1706. Line segments 1708 and 1710 are joined by a common node 1704, and line segments 1710 and 1706 are joined by a common node 1702. Such a composite line could for example represent a border, boundary or road on a map, or a complex line in a drawing application. Since non-end nodes of a composite line are common to two line segments, the selection and movement of a node such as 1704 would result in a change of angle, and possibly length, of both line segments 1708 and 1710. FIG. 18 shows a composite route created through the combining of multiple line segments 1802. Such a composite line can be used to show flow or direction of travel. A particular use of this type of line would be in defining a route which does not necessarily rely on existing geographic locations. This would be beneficial, for example, to define a planned wilderness route, a journey on water or the flight plan for an aircraft.
  • FIG. 19 shows a method for the joining of different line or route segments to make a composite line or route. Decision logic 1902 verifies whether a new segment has been created, and if so, decision logic 1904 determines whether either or both of the end nodes correspond with existing nodes. It is unlikely, even if the user desires to exactly match the position of an existing node, that the same central pixel on the touch screen will be able to be selected, especially if fingers are used rather than a stylus or precision pointing device. Therefore said logic will accept close approximations to the position of an existing node as being identical, and those nodes will be merged as defined in process 1906. In one embodiment the positions of the original node and the new node will be averaged, so that the new joining node of the two segments will be halfway between the precise centers of the nodes. In another embodiment the joining node position is taken to be the position of the original node, and in yet another embodiment the joining node position is taken to be the position of the new node. In some embodiments, if there is user-defined or default data attached to the nodes, for example the altitude above sea level of the point represented by the node, this will be merged. In one embodiment the data associated with the original node is used to populate the data fields of the joining node. In another embodiment newer non-default data, such as the creation date from the new node, will over-write the equivalent data of the existing node. Process 1908 allows a key node for a composite line or route to be determined where neither or both segments have a key node already, since in one embodiment a composite, multi-segment line or route may not have more than one key node. In one embodiment the key node of the original line segment, route segment, composite line or composite route is retained as the key node for the new composite line or route.
  • In one embodiment of the node joining method described by FIG. 19, it is not just new nodes on line or route segments which may be joined to existing nodes, but also two nodes of existing segments, where the user has selected an end node from a segment and moved it into close proximity with an existing end node of another segment or an existing point node. How close the centers of nodes must be before they are joined depends somewhat on the application. However it is anticipated in general that nodes would not be joined unless the touch area under a typical finger touch on one node overlapped an equivalent radius around the second node.
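  • One reading of this joining test and of the merge embodiments of process 1906 is sketched below. The dictionary representation, the touch radius value and the choice of merge rules (position averaging plus newer-data overwrite) are assumptions for illustration.

```python
import math

TOUCH_RADIUS_PX = 22.0   # assumed radius of a typical finger touch area

def should_join(node_a, node_b):
    """Join only if the touch area around one node centre overlaps an
    equivalent radius around the other, i.e. centres within two radii."""
    (ax, ay), (bx, by) = node_a["pos"], node_b["pos"]
    return math.hypot(bx - ax, by - ay) <= 2 * TOUCH_RADIUS_PX

def merge_nodes(original, new):
    """Merge two nodes: average the positions (one embodiment) and let
    newer non-empty data over-write the original's fields (another)."""
    (ax, ay), (bx, by) = original["pos"], new["pos"]
    merged = dict(original)
    merged.update({k: v for k, v in new.items() if v is not None})
    merged["pos"] = ((ax + bx) / 2, (ay + by) / 2)
    return merged

a = {"pos": (100, 100), "name": "waypoint 1", "altitude_m": 1200}
b = {"pos": (118, 112), "name": None, "altitude_m": None}
if should_join(a, b):
    print(merge_nodes(a, b))   # joining node halfway between the centres
```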
  • Other operations are possible on multi-segment lines and routes, however said lines and routes were created. One operation is the deletion of a node, as shown in FIG. 20A, although the operation can also be performed on nodes which are not part of a composite line, composite route, line segment or route segment. A user selects a node using a finger 2004. A long press of more than approximately one second results in a node operation menu, one of whose operations is deletion of the node. Upon selection the node will be deleted and the symbol representing the node will be removed, as shown by area 2006 on FIG. 20B. If the node is an end node of a composite line or route, the end segment of the line or route which incorporated the node will be deleted also. In the case of one of the non-end nodes being deleted, both segments attached to the deleted node will be deleted. In one embodiment, shown by FIG. 20B, the composite line or route would be reformed by the joining together of the nodes either side of the deleted node with a new line or route segment. In another embodiment the deletion of a node would result in two entities, such as a node and a composite line, with no automatic replacement or substitution of line segments. On the selection of deletion of a node on a line or route segment which is not connected to any other nodes, in one embodiment the whole segment and both nodes will be deleted. In a second embodiment the node not deleted will remain as a point of definition node. If a key node is deleted, in one embodiment another node will be made the key node. In another embodiment the whole entity that the key node represents will be deleted.
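  • The deletion behaviour can be sketched over an ordered node list, which is an assumed representation in which segments exist implicitly between consecutive nodes.

```python
def delete_node(nodes, i, reconnect=True):
    """Delete node i from a composite line or route (FIG. 20A/20B).

    Removing an end node implicitly deletes the end segment. For an
    interior node, both attached segments go; the neighbours are then
    either rejoined with a new segment (one embodiment) or left as two
    separate entities (another embodiment).
    """
    if i in (0, len(nodes) - 1) or reconnect:
        return [nodes[:i] + nodes[i + 1:]]   # one entity remains
    return [nodes[:i], nodes[i + 1:]]        # entity is split in two

line = [(0, 0), (10, 0), (20, 5), (30, 5)]
print(delete_node(line, 2))                   # rejoined with a new segment
print(delete_node(line, 2, reconnect=False))  # split into two entities
```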
• Other operations on lines and routes are possible, and can be performed via a menu selection as shown in FIG. 20A. Firstly, any node can be defined as a key node by menu selection, which in one embodiment replaces the existing key node in that role. Secondly, a whole entity can be named, whereupon a labelling means such as a virtual keyboard on the touch screen becomes active. Another option for selection, in one embodiment illustrated in FIG. 23A and FIG. 23C, is the conversion of a line segment 2304 or composite line 2308 into a route segment 2302 or composite route 2310 respectively, and vice versa. In various embodiments, as shown by FIG. 23B, an option for selection is the means to reverse the direction of a composite route 2306. Said reversal means is also applicable to route segments in various embodiments.
  • Corridor Definition and Options
• FIG. 21 illustrates the use of line segments to create not only a composite line but a corridor. A corridor is a central composite line or composite route such as 2110 with associated parallel composite lines or composite routes, illustrated in FIG. 21 by 2106 and 2108, which represent desired limits relative to the central line. One use for this is the definition of air corridors or sea lanes on touch-screen devices used for navigation or navigation planning. In one embodiment the end of a corridor is a semi-circle centered on the end node. Corridors can be created by the user of a touch-screen device for a line segment by specifying distance offsets from a central line, as part of the user-added line information process 908 previously described in FIG. 9. Similarly, a distance offset can be defined for a route segment as part of the user-added route segment information process 1614 described in FIG. 16. In one embodiment there is a single offset line or route on one side of the selected segment. In another embodiment the line or route segment is offset equally on both sides of the existing line or route segment.
• In the case of multi-segment lines or routes, such as 2204 in FIG. 22A, selection of the whole multi-segment line or route is first performed. In one embodiment selection is achieved by selecting the key node of the multi-segment line or route. In another embodiment selection is achieved by a long press of over approximately one second anywhere on the line or route. In a third embodiment selection is achieved by a long press of over approximately one second on any node of the multi-segment line or route. After user selection of the multi-segment line or route, in one embodiment the touch-screen device presents one or more options to the user, including the option to create a corridor. In one embodiment the user subsequently provides numerical input to the touch-screen device representing the corridor width. In another embodiment a default or pre-selected value is automatically used as the corridor width. In other embodiments an active symbol is placed over the key node, or over all nodes of the multi-segment entity. When a finger touches an active symbol, the symbol can be dragged away from the node it overlies, allowing the user to indicate the required corridor width. In one embodiment a dynamic circle, centered on the node and with its radius defined by the active symbol, gives the user visual feedback on what the corridor width will be. In one embodiment the active symbol and circle disappear upon the user removing their finger. In another embodiment the user's finger must remain substantially motionless for approximately 0.5 seconds before the corridor width is finalised and the active symbol is removed.
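• A sketch of the drag-to-set-width embodiment follows. The approximately 0.5 second dwell comes from the description above; the class structure, event names and use of a monotonic clock are assumptions made for illustration.

```python
import math
import time

class CorridorWidthPicker:
    """Tracks an active symbol dragged away from a node; the radius of the
    dynamic feedback circle becomes the corridor width once the finger has
    been substantially motionless for approximately 0.5 seconds."""

    DWELL_S = 0.5

    def __init__(self, node_x: float, node_y: float):
        self.node_x, self.node_y = node_x, node_y
        self.radius = 0.0
        self.last_move = time.monotonic()
        self.finalised = False

    def on_drag(self, x: float, y: float) -> None:
        # The feedback circle is centered on the node, its radius set
        # by the current position of the active symbol.
        self.radius = math.hypot(x - self.node_x, y - self.node_y)
        self.last_move = time.monotonic()

    def on_tick(self) -> None:
        # Called periodically by the UI loop to check the dwell condition.
        if not self.finalised and time.monotonic() - self.last_move >= self.DWELL_S:
            self.finalised = True
```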
  • Once a corridor width has been defined graphically or numerically, a corridor will be drawn around the central multi-segment line or route in accordance with the selected width. In one embodiment the corridor area will be calculated by the union of segment rectangle area as shown for one segment by 2210 in FIG. 22B, and node circle area as shown for one node by 2208. Segment rectangle area is the union of all areas made up of rectangles with length given by individual segment lengths and width given by the selected width. Node circle area is defined by circles with a radius of the selected width, for all nodes in the multi-segment line. The addition of node circles for the calculation of corridor area eliminates discontinuities of corridor shape shown by 2212 in FIG. 22B. In one embodiment, the border of the final corridor area calculated will be displayed around the original multi-segment line or route, as shown by 2214 in FIG. 22C.
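• Because the corridor area is the union of segment rectangles and node circles of the same width, testing whether a point lies inside the corridor reduces to a distance-to-segment test against each segment of the central line. A minimal sketch, assuming the selected width is measured outward from the central line; the function names and tuple representation are assumptions.

```python
import math

def _dist_point_to_segment(px, py, ax, ay, bx, by) -> float:
    """Distance from point P to the line segment AB."""
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0.0:                       # degenerate zero-length segment
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    cx, cy = ax + t * abx, ay + t * aby  # closest point on the segment
    return math.hypot(px - cx, py - cy)

def in_corridor(point, nodes, width) -> bool:
    """True if point lies in the union of segment rectangles and node
    circles around the multi-segment line defined by nodes."""
    px, py = point
    return any(
        _dist_point_to_segment(px, py, a[0], a[1], b[0], b[1]) <= width
        for a, b in zip(nodes, nodes[1:])
    )

# Example: is a point within 50 units of a three-node composite line?
line = [(0.0, 0.0), (100.0, 0.0), (100.0, 80.0)]
print(in_corridor((60.0, 30.0), line, width=50.0))   # True
```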
• In several embodiments corridors are not only created by the user of a touch screen device, but are also defined on a remote computer or touch screen device and communicated to a computer or touch screen device for display. The use of nodes facilitates the communication of corridors, since little data needs to be transmitted to define a corridor. Navigation-restricted corridors can therefore be provided centrally and overlaid on a touch-screen display together with local information, such as the GPS position and planned route of the local user. The key is the use of nodes to represent the required information between users and data sources.
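• To illustrate how little data a node-based corridor requires on the wire, a hypothetical encoding might carry only the node coordinates, the entity type, the key node index and the width. All field names here are assumptions for illustration, not a format defined by the specification.

```python
import json

corridor = {
    "type": "corridor",
    "width_m": 1852.0,   # e.g. a width of one nautical mile
    "key_node": 0,       # index of the key node
    "nodes": [[51.4700, -0.4543], [51.8860, -0.4280], [52.4539, -1.7480]],
}
payload = json.dumps(corridor, separators=(",", ":")).encode("utf-8")
print(len(payload), "bytes")   # well under 200 bytes for the whole corridor
```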
  • Apparatus Detailed Description & Operation
• In order to detect nodes, define points of definition, and create lines, routes and corridors from nodes, a touch screen module is required, as indicated by 2402 in FIG. 24. The output signals from the touch screen module in response to user touches are fed to a control module 2404 for interpretation. The control module determines whether a multi-touch event relating to nodes, node-based lines, node-based routes or node-based corridors has occurred. If so, the control module processes the information to create or modify the relevant entity. Node, line, route or corridor data for storage is routed to the memory module 2412, with the memory module also serving as the source of these data for the control module whenever a running application 2418 or operating system 2414 requires the information. Where an application or operating system requires an interface with remote devices, networks, servers or databases, the communications module 2410 sends or receives the point of definition, line, route or corridor node data, together with supplementary information associated with that entity. This information is passed to or from the control module, which may also route the data to or from the touch screen module or the memory module.
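• The data flow of FIG. 24 can be sketched as a dispatcher that routes entity data between the modules. This skeleton is an assumption-laden illustration: the class, method and attribute names are invented for clarity and do not appear in the specification.

```python
class ControlModule:
    """Sketch of the FIG. 24 wiring: interprets touch events and routes
    entity data between the touch screen, memory and communications
    modules (2402, 2412 and 2410 respectively)."""

    def __init__(self, touch_screen, memory, comms):
        self.touch_screen = touch_screen
        self.memory = memory
        self.comms = comms

    def on_touch_event(self, event):
        entity = self.interpret(event)      # node, line, route or corridor?
        if entity is not None:
            self.memory.store(entity)       # persist for later reselection
            self.touch_screen.draw(entity)  # immediate visual feedback

    def on_remote_payload(self, payload):
        # e.g. a centrally defined corridor received via the comms module
        entity = self.memory.store(payload)
        self.touch_screen.draw(entity)

    def interpret(self, event):
        ...  # node position / touch sequence / motion recognition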
• FIG. 25 shows how node-based point of definition, line, route and corridor information may be exchanged between different devices and computers via a network. A single network is shown, but several networks combining different communications links and different servers and databases could be used, depending on the application. A tablet computer 2502, a large touch screen device (such as a touch screen television) 2504, a personal computer or workstation 2506 (which need not have a touch screen or be multi-touch enabled), a smartphone 2508 and a satellite navigation system 2510 are shown communicating node-based point of definition, line, route and corridor information via a network. The information provided by some or all of the devices is processed and stored at central servers or databases 2512. The central servers or databases share information as requested and required by the devices' applications and operating systems, including node-based point of definition, line, route and corridor information, for example a corridor area on a map. The link 2514 represents a one-to-one communication of node-based point of definition, line, route and corridor information between two users with suitable apparatus, and shows that a centralized information distribution system is not necessarily required. Peer-to-peer and small clusters of users can also share node-based entity information.
  • Advantages
  • Various applications of this new human interface to touch-device technology are foreseen.
• Firstly, a way of defining multiple points of definition by the user is provided, which means that places of relevance to her may be defined just by a touch at the applicable screen location over a background such as a map. Furthermore, those points of definition may be named as desired, remembered by the touch screen device, and shared with friends, social networks or databases. Points of definition could include favourite shops, parking spaces currently vacant and rendezvous points. Current mapping applications typically allow only one user-defined point or pin, which is not customizable or storable and may not be labelled.
• There is currently no easy way to draw lines on touch screen devices, especially lines which can be joined or shared remotely. Node-based line drawing allows lines to be drawn quickly with just two user touches at the desired points. This provides an efficient means to define borders of land for agriculture and real estate, for example.
  • Similarly to lines, routes can also be defined easily with two taps, which show direction as well as routing. Route segments may be quickly defined and joined by touch to create a composite route for navigators and pilots. Since routes—like all node-based entities—are easy to define and repeat, they are easily communicated via a communication network, which could have advantages for example in the remote filing of flight plans.
  • A further development of composite or multi-segment lines and routes is the definition of corridors, which are two dimensional extensions to lines and routes. Corridors are easy to create by touch and user selection, and have application in navigation and control. Corridors can be defined centrally for example by air traffic control on a touch screen, and communicated to pilots.
• The same two-touch method of defining lines lends itself to defining two points between which the distance is desired, with the result displayed to the touch screen device user. The distance can be either the screen distance, for example in horizontal and vertical pixels for a programmer, or the distance which the touches represent on a background image or map. The method is therefore useful for navigators assessing the distance between two points. Other examples include its use by radiographers to determine the size of bone fractures from an image on the touch screen, and by air traffic controllers to determine following distances between two aircraft by touching their symbols on a radar display on a touch screen workstation.
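• As a sketch of the two-touch distance measurement, with the scale factor of the underlying map applied when a representative distance is wanted; the function and parameter names are assumptions for illustration.

```python
import math

def touch_distance(p1, p2, metres_per_pixel=None):
    """Distance between two touch points, in screen pixels by default, or
    as a representative map distance when a scale (m per px) is supplied."""
    d_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return d_px if metres_per_pixel is None else d_px * metres_per_pixel

# Two touches roughly 340 px apart, on a map displayed at 5 m per pixel:
print(touch_distance((120, 80), (430, 220)))                        # ~340.1 px
print(touch_distance((120, 80), (430, 220), metres_per_pixel=5.0))  # ~1700.7 m
```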
• Finally, as well as making it simple to define points of definition, lines, routes and corridors using user-touched nodes, the node-based method lends itself to the efficient communication of said entities. Sending over a communication channel only the few key nodes which completely define a whole geometric entity, rather than a complete record of the entity, allows simple and low bandwidth communication of such data.
  • Although the description above contains several specificities, these should not be construed as limiting the scope of the embodiment, but as examples of several embodiments. For example the range of devices applicable is greater than tablet computers and smartphones, and finger operation referred to includes operation by a pointing device such as a stylus.
  • Thus the scope of the embodiments should be determined by the claims appended and their legal implications, rather than by the examples supplied.

Claims (34)

I claim:
1. A method for interpreting user touches on a touch screen device to create and edit points of definition, lines, routes and corridors on the display of said touch screen device, comprising:
a. recognizing single and double, concurrent user touches to a touch screen device,
b. interpreting said user touches as node positions, node touch sequences and associated node motions on the screen display of said touch screen,
c. interpreting said node positions, said node touch sequences and said node motions to determine the point, line segment or route segment entities to be drawn on the touch screen display,
d. retaining recognition and information of said entities persistently after said user touches to the touch screen device have ceased,
e. allowing reselection by a user of a previously defined entity for operation on that entity, and
f. allowing reselection by a user of any node of a previously defined entity for operation on that node.
2. The method of claim 1 wherein the number of said concurrent user touches is interpreted as one, and the node produced by said concurrent user touch remains substantially motionless for a predetermined length of creation time, thereby resulting in the creation of a point of definition and the drawing of a symbol on the touch screen to represent said point to the user.
3. The method of claim 2 wherein said user touch remains substantially motionless for an additional predetermined length of time after said creation time, thereby resulting in a means being provided to the user for adding and viewing alphanumeric name or identification information to the said point of definition.
4. The method of claim 1 wherein the number of said concurrent user touches is interpreted as two, and the nodes produced by said concurrent user touches remain substantially motionless for a predetermined length of creation time, thereby resulting in the creation of a line segment and the drawing of a line on the touch screen between positions of the two user touches.
5. The method of claim 4 wherein the said drawn line has a predetermined style, color and thickness.
6. The method of claim 4 wherein one or more latent nodes is automatically created at intervals along a line segment, allowing the user to identify, select and move any latent node whereby said latent node becomes a new node of the line segment which thereby becomes a multi-segment line.
7. The method of claim 4 wherein a node from one line segment is moved so that it is substantially at the same location on the touch screen as a second node of a different line segment, thereby resulting in the merging of the two nodes and the creation of a multi-segment line.
8. The method of claim 1 wherein the number of said user touches is interpreted as two and there is a detected said node touch sequence with the time between the first touch and the second touch being within a predetermined time value of each other, thereby resulting in the creation of a route segment and the drawing of an arrow from the point of the first touch in the direction of the second touch on the touch screen using a predetermined style, color and thickness.
9. The method of claim 8 wherein one or more latent nodes is automatically created at intervals along a route segment, allowing the user to identify, select and move any latent node whereby said latent node becomes a new node of the route segment which thereby becomes a multi-segment route.
10. The method of claim 8 wherein a node from one route segment is moved so that it is substantially at the same location on the touch screen as a second node of a different route segment, thereby resulting in the merging of the two nodes and the creation of a multi-segment route.
11. The method of claim 1 wherein the number of said concurrent user touches is interpreted as two, and the nodes produced by said concurrent user touches remain substantially motionless for a predetermined length of creation time, thereby resulting in the display of actual distance between the two user touches on the touch screen, to the user.
12. The method of claim 1 wherein the number of said concurrent user touches is interpreted as two, and the nodes produced by said concurrent user touches remain substantially motionless for a predetermined length of creation time, thereby resulting in the display to the user of representative distance between the two node points created by the two user touches on the underlying map or image, taking into account the scaling of said underlying map or image.
13. The method of claim 1 wherein the number of said user touches is interpreted as two and there is a detected said node touch sequence with the time between the first touch and the second touch being within a predetermined time value of each other, thereby resulting in the display of screen vector distance between the two user touches on the touch screen, to the user.
14. The method of claim 1 wherein the number of said user touches is interpreted as two and there is a detected said node touch sequence with the time between the first touch and the second touch being within a predetermined time value of each other, thereby resulting in the display to the user of representative two dimensional vector distance between the two node points created by the two user touches on the underlying map or image, taking into account the scaling of said underlying map or image.
15. The method of claim 1 wherein a said reselection by a user of a previously defined entity is performed and said operation on said entity is selected as corridor creation, whereby a bounded area around said entity is calculated and displayed to the user on the touch screen, defined by the logical union of circle area around all nodes of said entity and rectangle area around all line or route segments of said entity.
16. The method of claim 15 wherein the corridor width is predetermined and therefore the radius of the circles around said nodes is made equal to the predetermined corridor width and the width of the rectangles around said segments is also made equal to the predetermined corridor width.
17. The method of claim 15 wherein the corridor width is defined by touch by the user, and therefore the radius of the circles around said nodes is made equal to the user-specified corridor width and the width of the rectangles around said segments is also made equal to the user-specified corridor width.
18. The method of claim 1 wherein a said reselection by a user of a previously defined entity is performed through the means of the entity having one or more key nodes whereby operations specific to the whole entity such as the movement, deletion or addition of data is performed.
19. The method of claim 1 wherein said operation on a node is taken from the list including movement, deletion, labelling, addition of data and definition as a key node.
20. The method of claim 19 wherein said movement operation on the node is by the user dragging the node around within the perimeters of the multi-touch enabled input device without any background map or image being scrolled.
21. The method of claim 19 wherein said movement of the node is by the user maintaining a touch within a predetermined distance of a perimeter of the touch screen, thereby causing the node to stay at the position of the touch, but any background map or image being scrolled in the opposite direction of said perimeter.
22. The method of claim 19 wherein if the geometric location coordinates of the moved node become substantially the same as the geometric location coordinates of an existing node, the two nodes are equated as being the same, and the new single node inherits the properties of said existing node.
23. The method of claim 19 wherein said addition of data includes information taken from the list of start date, end date, elevation above sea level, planned altitude, depth below sea level, and free text information.
24. The method of claim 19 wherein said deletion operation removes the node, and also a point of definition associated with a node.
25. A distance measurement and display system graphical user interface for touch screen devices with a mapping, navigation or image background, comprising:
a. a detection of two concurrent user touches to a touch screen means that will permit a user to input two points of definition on a background map or image for which it is desired to know the distance between, and
b. a measurement means for calculating the representative distance between the two concurrent touches including scaling and conversion to the measurement units and axes of the background map or image, and
c. a means of display of the calculated representative distance between the two concurrent touches to the user of the touch screen device.
26. An apparatus, comprising:
a. a touch screen module incorporating a touch panel adapted to receiving user input in the form of multi-touch shape gestures including finger touches and finger movements, and a display surface adapted to present point of definition, line, route and corridor information to the user,
b. a control module which is operatively connected to said touch screen module to determine node and point of definition positions from said finger touches, to determine node motions and touch sequence from said finger movements, to recognize a line or route segment from combinations of said node positions and touch sequences, to create multi-segment lines and routes from individual segments by node position equivalence detection, to create multi-segment lines and routes from detection of latent node selection and movement on line and route segments, to detect a selection touch to a pre-existing entity from the list including point of definition, line segment, route segment, multi-segment line and multi-segment route, to control the editing of said pre-existing entity, to control other modules and to generate a continuous graphical image including said node positions and plurality of said pre-existing entities for display on the touch screen module,
c. a memory module logically connected to said control module which is able to store from and provide to said control module a logical element selected from the group consisting of operating systems, system data for said operating systems, applications which can be executed by the control module, data for said applications, node data, point of definition data, line segment data, route segment data, multi-segment line data, multi-segment route data and corridor data.
27. The apparatus of claim 26, wherein the apparatus is selected from the group consisting of a mobile telephone with a touch screen, a tablet computer with a touch screen, a satellite navigation device with a touch screen, an electronic book reader with a touch screen, a television with a touch screen, a desktop computer with a touch screen, a notebook computer with a touch screen, a touch screen display which interacts with medical and scientific image display equipment and a workstation computer of the type used in command and control operations centers such as air traffic control centers, but having a touch screen.
28. The apparatus of claim 26, wherein the node, point of definition, line segment, route segment, multi-segment line, multi-segment route and corridor information presented to the user includes symbols and lines currently detected or selected and those recalled from memory, previously defined, or received from a remote database, device or application.
29. The apparatus of claim 26, wherein a communications module is incorporated adapted to the transfer of node, point of definition, line segment, route segment, multi-segment line, multi-segment route and corridor information including node position and entity type, to and from other devices, networks and databases.
30. The apparatus of claim 29, wherein the control module will accept external said information from said communications module and pass them to the touch screen module for display to the user.
31. The apparatus of claim 29, wherein the control module will pass locally created said information to the communications module for communication to other devices, networks and databases.
32. The apparatus of claim 26, wherein the entities recognised by said control module from the said detected node positions, touch sequences and node motions include nodes, points of definition, line segments, route segments, multi-segment lines, multi-segment routes and corridors.
33. The apparatus of claim 26, wherein the said editing of pre-existing entities includes the movement of points of definition, the movement of entire entities, the deletion of entire entities, the stretching of lines by the movement of their individual nodes, the editing of a corridor width, the creation of multi-segment lines and routes by joining segments at common nodes, and the addition of a new node between two existing nodes of an entity.
34. The apparatus of claim 26, wherein the said nodes and points of definition recognised by said control module represent locations on a two dimensional image, map or surface having its own coordinate system which are readable by said control module from the memory module.
US14/020,835 2012-09-08 2013-09-07 Definition and use of node-based points, lines and routes on touch screen devices Abandoned US20150338974A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14/020,835 US20150338974A1 (en) 2012-09-08 2013-09-07 Definition and use of node-based points, lines and routes on touch screen devices
PCT/GB2014/051157 WO2014167363A1 (en) 2013-04-13 2014-04-14 Systems and methods for interacting with a touch screen
GB1517611.8A GB2527244B (en) 2013-04-13 2014-04-14 Systems and methods for interacting with a touch screen

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261698625P 2012-09-08 2012-09-08
US14/020,835 US20150338974A1 (en) 2012-09-08 2013-09-07 Definition and use of node-based points, lines and routes on touch screen devices

Publications (1)

Publication Number Publication Date
US20150338974A1 true US20150338974A1 (en) 2015-11-26

Family

ID=54556060

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/862,382 Expired - Fee Related US9268423B2 (en) 2012-09-08 2013-04-13 Definition and use of node-based shapes, areas and windows on touch screen devices
US14/020,835 Abandoned US20150338974A1 (en) 2012-09-08 2013-09-07 Definition and use of node-based points, lines and routes on touch screen devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/862,382 Expired - Fee Related US9268423B2 (en) 2012-09-08 2013-04-13 Definition and use of node-based shapes, areas and windows on touch screen devices

Country Status (1)

Country Link
US (2) US9268423B2 (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120005045A1 (en) * 2010-07-01 2012-01-05 Baker Scott T Comparing items using a displayed diagram
US10146428B2 (en) * 2011-04-21 2018-12-04 Inpris Innovative Products From Israel Ltd Device, system, and methods for entering commands or characters using a touch screen
JP6326855B2 (en) * 2013-03-15 2018-05-23 株式会社リコー Delivery control system, delivery system, delivery control method, and program
US9298740B2 (en) * 2013-09-25 2016-03-29 Corelogic Solutions, Llc System and method for enhancing the normalization of parcel data
US20150169531A1 (en) * 2013-12-17 2015-06-18 Microsoft Corporation Touch/Gesture-Enabled Interaction with Electronic Spreadsheets
US10365804B1 (en) * 2014-02-20 2019-07-30 Google Llc Manipulation of maps as documents
CN104503697B (en) * 2014-12-29 2018-08-07 联想(北京)有限公司 A kind of information processing method and electronic equipment
KR20160098700A (en) * 2015-02-11 2016-08-19 삼성전자주식회사 Apparatus for processing multi-touch input and method thereof
US20160260346A1 (en) * 2015-03-02 2016-09-08 Foundation For Exxcellence In Women's Healthcare, Inc. System and computer method providing customizable and real-time input, tracking, and feedback of a trainee's competencies
JP6643776B2 (en) * 2015-06-11 2020-02-12 株式会社バンダイナムコエンターテインメント Terminal device and program
US10386997B2 (en) * 2015-10-23 2019-08-20 Sap Se Integrating functions for a user input device
US9811926B2 (en) * 2016-01-21 2017-11-07 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Touch screen gesture for perfect simple line drawings
JP6682951B2 (en) * 2016-03-29 2020-04-15 ブラザー工業株式会社 Program and information processing device
CN107368465B (en) * 2016-05-13 2020-03-03 北京京东尚科信息技术有限公司 System and method for processing screenshot note of streaming document
US10558288B2 (en) 2016-07-07 2020-02-11 Samsung Display Co., Ltd. Multi-touch display panel and method of controlling the same
WO2018018378A1 (en) * 2016-07-25 2018-02-01 深圳市大疆创新科技有限公司 Method, device and system for controlling movement of moving object
TWI794812B (en) * 2016-08-29 2023-03-01 日商半導體能源研究所股份有限公司 Display device and control program
KR102560598B1 (en) * 2016-12-21 2023-07-28 삼성전자주식회사 Display Apparatus AND CONTROLLING METHOD THEREOF
US11449167B2 (en) 2017-06-26 2022-09-20 Inpris Innovative Products Fromisrael, Ltd Systems using dual touch and sound control, and methods thereof
US10552010B2 (en) * 2018-06-21 2020-02-04 International Business Machines Corporation Creating free-form contour regions on a display
US11106786B2 (en) * 2018-12-27 2021-08-31 Paypal, Inc. Emulator detection through user interactions
US10885796B2 (en) 2019-05-02 2021-01-05 Honeywell International Inc. Ground traffic aircraft management
US11093046B2 (en) 2019-12-16 2021-08-17 Microsoft Technology Licensing, Llc Sub-display designation for remote content source device
US11042222B1 (en) 2019-12-16 2021-06-22 Microsoft Technology Licensing, Llc Sub-display designation and sharing
US11487423B2 (en) * 2019-12-16 2022-11-01 Microsoft Technology Licensing, Llc Sub-display input areas and hidden inputs
US11404028B2 (en) 2019-12-16 2022-08-02 Microsoft Technology Licensing, Llc Sub-display notification handling
CN115576451A (en) * 2022-12-09 2023-01-06 普赞加信息科技南京有限公司 Multi-point touch device and system

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4555775B1 (en) 1982-10-07 1995-12-05 Bell Telephone Labor Inc Dynamic generation and overlaying of graphic windows for multiple active program storage areas
US5861886A (en) * 1996-06-26 1999-01-19 Xerox Corporation Method and apparatus for grouping graphic objects on a computer based system having a graphical user interface
US7840912B2 (en) 2006-01-30 2010-11-23 Apple Inc. Multi-touch gesture dictionary
US7028264B2 (en) 1999-10-29 2006-04-11 Surfcast, Inc. System and method for simultaneous display of multiple information sources
US8095879B2 (en) 2002-12-10 2012-01-10 Neonode Inc. User interface for mobile handheld computer unit
KR20070006477A (en) * 2005-07-08 2007-01-11 삼성전자주식회사 Method for arranging contents menu variably and display device using the same
US7933632B2 (en) 2005-09-16 2011-04-26 Microsoft Corporation Tile space user interface for mobile devices
US7864163B2 (en) 2006-09-06 2011-01-04 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
KR100720335B1 (en) * 2006-12-20 2007-05-23 최경순 Apparatus for inputting a text corresponding to relative coordinates values generated by movement of a touch position and method thereof
US8214768B2 (en) 2007-01-05 2012-07-03 Apple Inc. Method, system, and graphical user interface for viewing multiple application windows
US7752555B2 (en) 2007-01-31 2010-07-06 Microsoft Corporation Controlling multiple map application operations with a single gesture
US20080307512A1 (en) 2007-05-26 2008-12-11 Pankaj Tandon Property Centric Real Estate Maps with Personalized Points of Interest
US20100036750A1 (en) 2008-08-08 2010-02-11 John Whelan System and method for displaying real estate properties for sale, real estate properties wanted and/or areas in which properties are for sale and/or desired
US8749497B2 (en) 2008-12-12 2014-06-10 Apple Inc. Multi-touch shape drawing
US8219937B2 (en) 2009-02-09 2012-07-10 Microsoft Corporation Manipulation of graphical elements on graphical user interface via multi-touch gestures
KR101844366B1 (en) 2009-03-27 2018-04-02 삼성전자 주식회사 Apparatus and method for recognizing touch gesture
US9182854B2 (en) 2009-07-08 2015-11-10 Microsoft Technology Licensing, Llc System and method for multi-touch interactions with a touch sensitive screen
US8576182B2 (en) * 2009-09-01 2013-11-05 Atmel Corporation Methods and apparatuses to test the functionality of capacitive sensors
EP2325737B1 2009-10-28 2019-05-08 Orange Method and device for gesture-based input in a graphical user interface for displaying application windows
US8587532B2 (en) 2009-12-18 2013-11-19 Intel Corporation Multi-feature interactive touch user interface
US20120235923A1 (en) 2011-03-15 2012-09-20 Sony Corporation Electronic device system with notes and method of operation thereof
US8754861B2 (en) * 2011-07-06 2014-06-17 Google Inc. Touch-screen keyboard facilitating touch typing with minimal finger movement
US20130194173A1 (en) * 2012-02-01 2013-08-01 Ingeonix Corporation Touch free control of electronic systems and associated methods
US9041727B2 (en) * 2012-03-06 2015-05-26 Apple Inc. User interface tools for selectively applying effects to image
CN102662510B (en) 2012-03-24 2016-08-03 上海量明科技发展有限公司 The method realizing sectional drawing by multiple point touching

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060022956A1 (en) * 2003-09-02 2006-02-02 Apple Computer, Inc. Touch-sensitive electronic apparatus for media applications, and methods therefor
US8407606B1 (en) * 2009-01-02 2013-03-26 Perceptive Pixel Inc. Allocating control among inputs concurrently engaging an object displayed on a multi-touch device
US20120287071A1 (en) * 2010-01-20 2012-11-15 Nokia Corporation User input
US20110224896A1 (en) * 2010-03-09 2011-09-15 Nokia Corporation Method and apparatus for providing touch based routing services
US20150199073A1 (en) * 2011-09-27 2015-07-16 Timothy W. Kukulski Ordering of objects displayed by a computing device
US20130085669A1 (en) * 2011-09-30 2013-04-04 The Boeing Company Systems and Methods for Processing Flight Information
WO2013051051A1 (en) * 2011-10-03 2013-04-11 古野電気株式会社 Touch-panel-equipped device, radar device, plotter device, shipboard network system, information display method, and information display program
US20140250410A1 (en) * 2013-03-04 2014-09-04 Triology LLC Scheduling menu system and method having flip style graphical display

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10001898B1 (en) 2011-07-12 2018-06-19 Domo, Inc. Automated provisioning of relational information for a summary data visualization
US10726624B2 (en) 2011-07-12 2020-07-28 Domo, Inc. Automatic creation of drill paths
US10474352B1 (en) * 2011-07-12 2019-11-12 Domo, Inc. Dynamic expansion of data visualizations
US20150160819A1 (en) * 2013-12-06 2015-06-11 Microsoft Corporation Crane Gesture
US20150268828A1 (en) * 2014-03-18 2015-09-24 Panasonic Intellectual Property Management Co., Ltd. Information processing device and computer program
US9372563B2 (en) * 2014-05-05 2016-06-21 Adobe Systems Incorporated Editing on a touchscreen
US20150317004A1 (en) * 2014-05-05 2015-11-05 Adobe Systems Incorporated Editing on a touchscreen
US20170188290A1 * 2015-06-26 2017-06-29 M. Imran Hayee Message hopping on predefined routes using dsrc based vehicle-to-vehicle communication
USD831111S1 (en) 2016-03-02 2018-10-16 ACCO Brands Corporation Dry erase board
US10902148B2 (en) * 2017-12-07 2021-01-26 Verizon Media Inc. Securing digital content using separately authenticated hidden folders
US11501019B2 (en) * 2017-12-07 2022-11-15 Yahoo Assets Llc Securing digital content using separately authenticated hidden folders
CN110807807A (en) * 2018-08-01 2020-02-18 深圳市优必选科技有限公司 Monocular vision target positioning pattern, method, device and equipment
USD1013779S1 (en) 2020-08-19 2024-02-06 ACCO Brands Corporation Office panel with dry erase surface
US11460973B1 * 2022-04-11 2022-10-04 SAS Institute Inc. User interfaces for converting node-link data into audio outputs

Also Published As

Publication number Publication date
US20150338942A1 (en) 2015-11-26
US9268423B2 (en) 2016-02-23

Similar Documents

Publication Publication Date Title
US20150338974A1 (en) Definition and use of node-based points, lines and routes on touch screen devices
CN106463056B (en) Solution for the interactive moving map that height customizes
US11131559B2 (en) Linear visualization of a driving route
US10579226B2 (en) Time proximity based map user interactions
DE112013002803B4 (en) Method, system and device for providing a three-dimensional transition animation for a change in a map view
US9417777B2 (en) Enabling quick display transitions between indoor and outdoor map data
JP7032451B2 (en) Dynamically changing the visual properties of indicators on digital maps
US11132102B2 (en) System and method for geographic data layer management in a geographic information system
EP3552117B1 (en) Contextual map view
US10126913B1 (en) Interactive digital map including context-based photographic imagery
US20090040186A1 (en) Method and System for Displaying Multiple Synchronized Images
US20190316931A1 (en) Off-Viewport Location Indications for Digital Mapping
WO2014167363A1 (en) Systems and methods for interacting with a touch screen
JP2012242962A (en) Traveling object operation information system
CN105243469A (en) Method for mapping from multidimensional space to low-dimensional space, and display method and system
JP6044912B2 (en) Mobile operation information system

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION