WO2001071282A1 - Information systems having directional inference facility - Google Patents

Information systems having directional inference facility

Info

Publication number
WO2001071282A1
Authority
WO
WIPO (PCT)
Prior art keywords
reference
direction
system
arranged
inference module
Application number
PCT/US2001/005763
Other languages
French (fr)
Inventor
Thomas Ellenby
Peter Ellenby
Jeffrey Alan Jay
Original Assignee
Geovector Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US52680100A priority Critical
Priority to US09/526,801 priority
Application filed by Geovector Corporation filed Critical Geovector Corporation
Publication of WO2001071282A1 publication Critical patent/WO2001071282A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H04W64/006 Locating users or terminals or network equipment for network management purposes, e.g. mobility management, with additional information processing, e.g. for direction or speed determination

Abstract

An information system is arranged to provide information relating to objects at a user interface in response to objects being addressed by the system. An object is addressed by the system whenever a mobile portion of the device (1) is in a location near the object being addressed and an inferred reference direction forms an address vector which intersects a geometric descriptor associated with the object. Reference directions used to arrive at address indicators may be formed by inference rules contained in an inference module (7) which is part of the device. The inference module (7) may also receive hints from outside sources such as a clock, or via direct user inputs.

Description

Title "Information Systems Having Directional Inference Facility"

Specification for a Letters Patent

BACKGROUND OF THE INVENTION

Field

The field of these inventions described herefollowing may be characterized as information systems having position measuring capacity combined with attitude inference facility and more particularly by such information systems which further interact with a database of data elements having geometric descriptors associated therewith.

Prior Art

Applicants' copending U.S. patent applications having serial numbers 09/510,889; 09/110,572; 09/163,746; 09/384,470; 09/384,469; and 08/662,219, and U.S. patents numbered 5,815,411 and 5,682,332, each relate to information systems which interact with a database of data elements having geometric descriptors associated with the data elements; however, most of those systems rely on devices having a means of determining the pointing direction of a directional reference associated with the device.

While the systems and inventions of the art are designed to achieve particular goals and objectives, some of those being no less than remarkable, these inventions have limitations which prevent their use in new ways now possible. These inventions of the art are not used and cannot be used to realize the advantages and objectives of the present invention.

It should be understood that all of the herein referenced materials provide considerable definition of elements of the present invention. Therefore, those materials are incorporated herein by reference whereby the instant specification can rely upon them for enablement of the particular teachings of each.

SUMMARY OF THE INVENTION

Comes now, Tom Ellenby, Peter Ellenby, and Jeffrey Alan Jay with inventions of information systems having position measuring capacity including devices for and methods of presenting information to a user whereby the information presented relates to a measured position and an inferred directional reference.

The present inventions include devices and methods for presenting information relating to objects having an association with a particular geometry and location. A device which determines its position and infers a reference direction, responds by searching a database and determining which objects are being addressed; and further by recalling and presenting information which relates to those addressed objects.

A parameter herein referred to as an 'address indicator' is dependent upon a measured position reference and an inferred direction reference. An address indicator may be envisaged as a geometric construct which relates to point and directional references. An example is a vector with an endpoint which corresponds to the point reference and a direction corresponding to the direction reference. An address indicator serves as the criterion against which database searches are made. A database comprised of data records each including a 'geometric descriptor' may include records where the geometric descriptor forms an intersection with an address indicator. When an object's geometric descriptor forms an intersection with an address indicator, the object is said to be 'addressed' by the system. Database output for addressed objects may include information relating to addressed objects in many forms such as common multi-media types. The information relating to addressed objects is presented to a user via a user interface which may include graphical user interfaces such as a video display device, among others.

Objectives of the Invention

It is a primary objective of the invention to provide systems for presenting information.

It is further an objective to provide systems for addressing an object and presenting information relating to the object. It is further an objective to provide systems for addressing an object, identifying the object and presenting information relating to the object.

It is further an objective to provide systems for addressing an object, recalling information relating to the object by way of spatial references and presenting information relating to objects being addressed.

It is further an objective to provide systems for addressing an object via a measured position reference and an inferred direction reference.

A better understanding can be had with reference to the detailed description of preferred embodiments and with reference to the appended drawings. These embodiments represent particular ways to realize the invention and are not inclusive of all ways possible. Therefore, there may exist embodiments that do not deviate from the spirit and scope of this disclosure as set forth by the claims, but do not appear here as specific examples. It will be appreciated that a great plurality of alternative versions are possible.

BRIEF DESCRIPTION OF THE DRAWING FIGURES

These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims and drawings where:

Figure 1 is a schematic diagram of a first version of systems of the inventions;

Figure 2 is a schematic diagram of a second version of the inventions;

Figure 3 illustrates use of a handset in an environment of interest;

Figure 4 is a diagram of an environment in which systems may be used; and

Figures 5 and 6 are a similar diagram of environments in which devices may be used.

PREFERRED EMBODIMENTS OF THE INVENTION

In accordance with each preferred embodiment of these inventions, there is provided apparatus for and methods of presenting information relating to objects being addressed. It will be appreciated that each embodiment described may include both apparatus and methods and that an apparatus or method of one preferred embodiment may be different than an apparatus or method of another embodiment.

Throughout this disclosure, reference is made to some terms which may or may not be defined in popular dictionaries exactly as they are defined here. To provide a more precise disclosure, the following terms are presented with a view to clarity so that the true breadth and scope may be more readily appreciated. Although every attempt is made to be precise and thorough, it is a necessary condition that not all meanings associated with each term can be completely set forth. Accordingly, each term is intended to also include its common meaning which may be derived from general usage within the pertinent arts or by dictionary meaning. For purposes of this disclosure:

A Geometric Descriptor is a mathematical definition of a geometric body. A geometric descriptor is used in association with an object which may be addressed by systems of the invention.

An Information or Data Element is a database record which relates to a particular object of interest. An information element may comprise a plurality of forms of multi-media data including but not limited to: text, audio recordings, video streams, pictures, photographs, icons, Java applets, etc. In addition, each information element has associated therewith a geometric descriptor.
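The record shape described above can be sketched as a small data structure; the field names and the sphere-only descriptor are illustrative assumptions, not the patent's schema:

```python
from dataclasses import dataclass

@dataclass
class InformationElement:
    """Toy information element: multi-media payloads plus a spherical
    geometric descriptor (center, radius)."""
    name: str        # object of interest
    media: dict      # e.g. {'text': ..., 'photo_url': ...}
    center: tuple    # (x, y, z) center of the descriptor sphere
    radius: float    # extent of the descriptor sphere

shop = InformationElement(
    name="Explorer Dive and Travel",
    media={"text": "Scuba diving shop in La Jolla, California"},
    center=(2.0, 5.0, 9.0),
    radius=7.0,
)
```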

Address is a term used herein as a verb, most commonly with the gerund -ing, to indicate a relationship between a device of the invention and an object; the object being the subject of the address. The physical state of a device of the invention defines a particular address indicator; where that address indicator forms an intersection with the geometric descriptor of an object, the system is said to be 'addressing' the object.

An Address Indicator is a geometric construct (examples include vectors and cones) which has a pointing direction associated therewith. In addition to a reference point and a reference pointing direction, some address indicators, for example a cone, subtend a solid angle or otherwise have spatial extent.
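A cone-shaped address indicator such as the one just described can be tested with a simple angle check: a point falls inside the cone when the angle between the cone axis and the vector to the point is within the cone's half-angle. The function name and coordinates are illustrative, not from the patent:

```python
import math

def in_cone(apex, axis, half_angle_deg, point):
    """Return True if point lies inside the cone; axis must be a unit vector."""
    v = tuple(p - a for p, a in zip(point, apex))
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0.0:
        return True  # the apex itself is trivially addressed
    cos_angle = sum(a * b for a, b in zip(axis, v)) / dist
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

For example, a point five units ahead of the user and slightly off-axis sits inside a cone with a ten degree half-angle, while a point well to the side does not.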

Objects refer to any element which may be of interest to a user. An object may be a real tangible object or may be a simple element in space. The term 'object' should be read in a liberal sense. Although buildings and mountains suggest concrete forms of objects, objects for purposes of this disclosure include abstract forms as well. For example, the region of airspace over an airport which may be a 'restricted airspace' is considered an 'object'. Indeed any region of space may be considered an object whether it actually contains a tangible object therein or not.

APPARATUS In simplest terms, apparatus versions of the invention include the elements described herefollowing.

Geometric References

Devices of the inventions include a point reference. A point reference may be merely a structural construct. The actual point may or may not correspond to any tangible object or element of a device. Alternatively, it may correspond with the position of an actual physical element of a device. In either case, an important relationship is made between the point reference and a position determining means which is also included in systems of the invention. The position determining means is arranged to measure the position of the point reference in some spatial frame of reference.

Devices of the inventions also include a directional reference. A directional reference may be arranged to correspond to a natural axis of a device such as the longitudinal axis of a handheld member. However, the direction reference may also be a mere geometric construct without correspondence to a physical thing.

Position Determining Means

Apparatus of the inventions include a position determining means arranged to determine the position of the point reference. Since in some embodiments of the invention the position determining means is a global positioning system (GPS) receiver, the point reference lies at the center of a sphere which is defined by the resolution limits of the positioning system. For practical purposes, a handheld receiver which includes a GPS antenna may be said to have the point reference within the handheld unit. Because a handheld device is substantially smaller than the minimal resolution of a GPS, the position determining means is said to measure the position of the handheld unit. Many forms of alternate positioning systems may be used to accomplish a similar task. The particular positioning system employed may be chosen for a specific task at hand; for example, a global positioning system would not be appropriate for a small space such as a warehouse, so a radio triangulation technique may be preferred. The essence of the invention is not changed by the particular choice of positioning system. Therefore versions of the invention should not be limited to one particular type of positioning system. The limitation described by 'position determining means' is met when the position of the point reference is measured and made available to a computer processor. Therefore, by use of the term "position determining means" it is meant that any conceivable means for determining the position of a point reference and making that position known to a computer is anticipated. Experts will recognize that there are many thousands of possible ways of determining position and it will not serve a further understanding of the invention to attempt to catalogue them here. The reader will appreciate that the broadest possible definition of "position determining means" is intended.

Reference Direction Inference Module

A Reference Direction Inference Module may best be described as a rule set which is preferably implemented in software or firmware. Although hardware may be configured to provide a directional reference in some versions, software implementations provide considerable flexibility for change and update at low cost. An inference module may have no inputs or minimal inputs. An inference module provides a reference direction as an output. For example, where there are no inputs, a certain rule set may specify that 'North' be the reference direction by mere default. The inference module may receive inputs from system components. For example, direct user input may be received at a system keypad. In addition, generally available parameters such as time of day are available as input to an inference module. Sidereal time is a timing system from which considerable information relating to astronomical bodies may be deduced. For example, for any given position on earth, one may predict with great accuracy the location of the sun or moon in the sky. In addition, the constellations are precisely located in the night sky via sidereal time. Systems of the invention may be arranged to provide sidereal time as an input to the inference module such that time is used in a determination of a reference direction. A detailed example of this use of sidereal time is set forth below.
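A minimal sketch of such a rule set follows: with no inputs it falls back to the arbitrary default of North, while a direct user hint, when present, takes priority. The function and parameter names are assumptions for illustration, not the patent's API:

```python
def infer_reference_direction(user_bearing=None):
    """Return a compass bearing in degrees (0 = North, 90 = East)."""
    if user_bearing is not None:        # rule 1: honor a direct user hint
        return float(user_bearing) % 360.0
    return 0.0                          # rule 2: default to North
```

Richer rule sets would accept further hints (keypad entries, sidereal time) ahead of the default, but the priority-ordered structure stays the same.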

User Interfaces

User interfaces of the inventions serve to convey information to or receive input from a user of the device. A simple speaker driven by computer audio systems is operational for producing audio information and description to a user. Similarly, a display screen driven by video systems of a computer functions to present video or graphic information to a user. Tactile entry of commands may be received via a keypad or similar devices. Although a display screen and speakers are preferred devices for interfacing with a user, other systems include non-display type visual systems such as simple light emitting diodes, or non-speaker audio systems such as buzzers, tactile outputs such as vibrating systems, et cetera. User interfaces include a transducer which is electronically driven by the computer to produce some physical disturbance which can be detected by a user's senses.

User interfaces are arranged in preferred embodiments as a display in a handheld unit. A device such as a mobile telephone having advanced display capacity operates well to support devices of the inventions.

Computer Processor

In addition, systems of the invention include a computer programmed to execute specific routines. In particular, a computer is arranged to receive inputs from the position determining means and reference direction inference module. From these inputs, the computer determines an 'address indicator'. Against this address indicator definition, the computer performs a database search and determines if objects contained in the database have geometric descriptors which intersect the address indicator. Objects having geometric descriptors which are determined to intersect said address indicator have data associated therewith which may be recalled and played back to user interfaces as appropriate and in agreement with other criteria which may be selected.

Although a 'processor' is called out as structure in descriptions of the invention, an expert will recognize that processing is most conveniently done in a distributed fashion where portions of information processing are executed at the handheld unit, in the network nodes, and at the database. For purposes of this disclosure, a processor is meant to include all processing components in the entire distributed system in cooperation with each other.

Similarly, an inference module may be physically located in a handheld device or may be part of the distributed processing scheme.

Database

In systems of the invention a database is arranged to accommodate data relating to objects of interest. Data relating to objects is prepared and stored in a predetermined and well organized fashion. The data may be stored in many formats and configurations and may be of the nature sometimes referred to as 'multi-media'. A database of the invention is comprised of a plurality of information elements. Each information element relates to a particular object which may be of interest to users of devices of the inventions. Each information element contains a geometric descriptor which describes a particular geometry and location associated with a certain object for which the stored information pertains.

A geometric descriptor includes a definition set for a specific geometry including position, and sometimes changing position with respect to time. A simple time independent example in a Cartesian coordinate system includes a sphere having its center at a point (X, Y, Z) = (2, 5, 9) and a radius of 7 units. Thus the sphere and all of the points within the sphere's boundary are part of the geometric descriptor. A geometric descriptor may describe a geometry such as: a single point, a polygon which defines a planar region, a solid such as a sphere, or even a three dimensional object of arbitrary shape. Thus the rules which perfectly describe those geometries which are well known in the sciences are used in geometric descriptors of the invention. In all cases, a geometric descriptor includes at least one point and more frequently includes a set of many points.
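Using the spherical descriptor from the example above (center (2, 5, 9), radius 7), the intersection of a vector-type address indicator with a descriptor reduces to a standard ray-sphere test. This is a sketch of one workable coincidence test, not the patent's specified algorithm:

```python
import math

def ray_intersects_sphere(origin, direction, center, radius):
    """Return True if the ray (address indicator) from origin along
    direction hits the sphere (geometric descriptor)."""
    ox, oy, oz = origin; dx, dy, dz = direction; cx, cy, cz = center
    fx, fy, fz = ox - cx, oy - cy, oz - cz          # origin relative to center
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (fx * dx + fy * dy + fz * dz)
    c = fx * fx + fy * fy + fz * fz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False                                # line misses the sphere
    root = math.sqrt(disc)
    t1 = (-b - root) / (2 * a)
    t2 = (-b + root) / (2 * a)
    return t1 >= 0 or t2 >= 0                       # hit must lie ahead of the origin
```

A ray aimed from the coordinate origin directly at (2, 5, 9) intersects the example sphere; a ray starting far to the side and pointing away does not.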

WIRELESS NETWORKS A more complete understanding can be realized in consideration of the drawing figures with reference numerals as follows. Figure 1 illustrates major components of certain devices of the inventions. A handheld portable unit 1 which may be in the form of a common mobile telephone is in remote communication with a plurality of stations 2. In addition, the system may include ground stations 3 in communication with one or more satellites 4. In some versions, the handheld unit may communicate with orbiting satellites and omit need for ground based stations; in this case, the drawing of figure 1 reduces to exclude the ground stations without loss of generality. Ground stations, satellites, and handheld units form a network which is further in communication with information transaction elements as follows.

A database 5 is connected to the network whereby prerecorded information which relates to objects may be stored therein. A handheld portable unit may make requests by passing information relating to an address indicator to the database and receiving object information from the database.

A position determining means may include orbiting satellites such as GPS or alternatively e911 positioning means. In e911 systems, signals may be transmitted from the handheld unit 1 and analyzed for their arrival time at fixed ground stations to triangulate on a present position of the handheld unit. A point reference associated with the handheld device is said to be in the position so measured.

While some versions are implemented where control processing and database management occur at a fixed site connected to the network, as shown in Figure 1, other versions also anticipate having these functions or portions and subsets of these functions taken up at the handheld device. Figure 2 diagrams this alternative. A handheld device 21 may be configured for wireless communication with ground transceivers 2, satellite antennas 23, and satellites 24 in a similar fashion as described previously. However, a computing device 25 may be provided to perform database, position, inference and processing functions on board the handheld unit. In instances where speed is a priority, devices may be preferably arranged to execute functions locally at the handset, while in versions which are sensitive to overall cost, an arrangement may be better suited such that the device is merely in communication with the network which handles the bulk of processing. In the latter case, a protocol is worked out for the transfer of information such that a plurality of devices all in communication with the network each would receive information which is particular to those devices.

SELF CONTAINED SYSTEMS While preferred embodiments may be arranged as described above, alternative architectures are possible which do not deviate from the spirit of the inventions. Although phone networks are well established and provide an excellent basis upon which devices of the invention may be built, a self contained device which does not interact with a wireless network may also be configured as a device of the inventions. Such a device has a complete database, a complete processor, and all position and direction determining means integrated into a single handheld unit. The device, without communication to outside networks, is arranged to determine the position of a reference point, infer a direction reference, and provide information recalled from a database at a user interface.

METHODS Methods of the invention are best described as being comprised of the following steps.

Measure Position

In a first step, the position of the reference point is determined. A GPS may be employed locally at the device and operate to measure the global position of the reference point. Although convenient measurement units might be latitude, longitude and altitude, others similarly provide workable solutions. Alternatively, systems such as those known as 'e911', where triangulation on radio signals allows a position determination, are also useful in systems of the inventions. Data from the position determining step is passed to computer processors.

Infer Reference Direction

A reference direction is determined by inference in agreement with a rule set provided to systems of the invention. In advanced versions, hints such as user activity may be supplied as input to the inference module. Inputs to the inference module are processed in agreement with the rule set to provide as output a reference direction.

Form Address Indicator

In combination, a position value and a reference direction are used to form an address indicator. Although a simple vector may be used in some versions, a cone or other geometric construct having angular extent, including solid angle geometries, may be used as an address indicator to represent a region of space being addressed by devices of the invention at any time.

Search Database

A database search operation reads database records, more particularly the geometric descriptors of data records, and performs a coincidence test to see if a particular address indicator computed in agreement with position and direction references intersects any of the points described by the geometric descriptor. Items meeting those criteria are tagged for recall and further processing including presentation at a user interface.

Prepare and Display Results

Information relating to objects determined to be addressed objects in a database search is recalled and processed for presentation at a user interface.
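The method steps just listed can be sketched end to end. The flat East-North geometry, the toy records, and all names here are illustrative assumptions rather than the patent's specified implementation:

```python
import math

# Toy records: (name, sphere center (x, y, z), radius) serve as geometric descriptors.
DATABASE = [
    ("Schoolhouse", (0.0, 100.0, 0.0), 10.0),  # due North of the origin
    ("Dive shop",   (100.0, 0.0, 0.0), 10.0),  # due East of the origin
]

def bearing_to_unit(bearing_deg):
    """Compass bearing (0 = North, 90 = East) to a unit vector in a flat
    East-North plane; altitude is ignored in this sketch."""
    rad = math.radians(bearing_deg)
    return (math.sin(rad), math.cos(rad), 0.0)

def ray_hits(origin, direction, center, radius):
    """Coincidence test: does the address vector pierce the sphere?"""
    oc = tuple(c - o for c, o in zip(center, origin))
    t = sum(a * b for a, b in zip(oc, direction))  # projection onto the unit ray
    if t < 0:
        return False                               # object lies behind the user
    miss_sq = sum(a * a for a in oc) - t * t       # squared closest-approach distance
    return miss_sq <= radius * radius

def addressed_objects(position, bearing_deg):
    """Form the address vector from position and inferred bearing, then tag
    every record whose geometric descriptor it intersects."""
    direction = bearing_to_unit(bearing_deg)
    return [name for name, center, radius in DATABASE
            if ray_hits(position, direction, center, radius)]
```

With the device at the origin and North as the inferred reference direction, only the schoolhouse record is tagged for recall and presentation; East tags only the dive shop.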

It is useful to present in detail techniques which may be used to infer a reference direction. These are meant to be a few illustrative examples and not meant to be a comprehensive listing of all possible ways. One will surely appreciate that a great plurality of rule sets may be devised without varying from the spirit of the invention.

ARBITRARY DEFAULT

In devices having no ability to measure device attitude and where there are no hints regarding a preferred direction, a reference direction may be placed by arbitrary default. In a most rudimentary version, an arbitrary default may be chosen and set within the inference engine rule base. For example the direction associated with 'North' with respect to any given point on the earth's surface serves as a good starting point. Wherever a user of devices of the inventions carries a handheld unit, the system determines the unit's position and defines an address indicator to be the vector which has as an endpoint the determined position and a direction corresponding to North. From this definition of an address indicator, a database search can be performed. A user of such a simple device is made to understand that the search results do not depend upon the pointing direction of the handheld device or any other physical device but rather the results are set to correspond to the direction of zero degrees on the compass. A list of objects may be presented at a user interface which alerts a user to the objects in his immediate vicinity and more particularly, those near him and North of him.

In example, while standing at latitude N32° 49.649'; longitude W117° 16.513', the device set to have North as a default reference direction determines what database items lie on a line segment starting at the point described and extending in a direction North of that point. The 'Explorer Dive and Travel' scuba diving shop in La Jolla, California is displayed as an object meeting the criteria. Movement of the device to another location causes a new search which produces new results. Other database items lying on the line which begins at the new location and extends North are presented to a user in a list.

While listing objects lying in a line in a direction North of a user may be useful, most applications demand better flexibility. Although the direction North is a good starting point, many occasions require that a reference direction be used where the direction is other than north.

Merely Specifying a Reference Direction To 'explore' an environment about a user, inference module programming may be arranged to receive user specification of a reference direction between 0 and 360 degrees, the conventional divisions about a compass. To get a list of objects in any direction a user may simply specify that direction by way of keypad entry. Although a user may have no knowledge of how the direction specified lines up with real objects in the immediate environment, speculation may lead to a 'hit'; a known object. In addition, the user is enabled to arrive at a listing of all objects in the vicinity by serially making requests while incrementing the specified reference direction; for example by a five degree increase with each request.
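The serial-request scheme above amounts to sweeping the compass in fixed increments. A small sketch, with an assumed generator name, covers the full circle at the five degree step mentioned:

```python
def sweep_directions(step_deg=5.0):
    """Yield reference directions covering the full compass in fixed
    increments, one per serial request."""
    bearing = 0.0
    while bearing < 360.0:
        yield bearing
        bearing += step_deg
```

At five degrees per request, a complete sweep of the vicinity takes seventy-two requests.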

Although specification of a direction of interest is accomplished as described, a simplification includes using the natural spatial distribution of keys on a common keypad. A telephone keypad includes a matrix of numerals or symbols in a 3 by 4 arrangement of rows and columns. In a first preferred arrangement, numerals '0' and '5' and symbols '*' and '#' are ignored. A convention where North corresponds to the '2' key is adopted. A user having an interest in things lying on a line in the direction east of him may indicate that interest by stimulation of the keypad, in particular by depressing the '6' key after being prompted by a user interface. In response to the user pressing the '6' key a new database search may be initiated in view of the present position and the specified reference direction which may be deduced from the user input.

For simplicity, an inference engine may be arranged to translate a common mobile telephone keypad into directions about the compass rose where '2' is made to correspond to North. Accordingly, a user indicates a reference direction assignment of West by pressing the '4' key, South by pressing the '8' key, Northeast by pressing the '3' key, et cetera.
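The keypad-to-compass convention just described can be written out as a lookup over the eight directional keys ('0', '5', '*' and '#' are ignored); the table and function name are illustrative:

```python
# '2' = North; the remaining keys follow their positions on the 3x4 keypad.
KEYPAD_BEARINGS = {
    '1': 315.0, '2': 0.0,   '3': 45.0,
    '4': 270.0,             '6': 90.0,
    '7': 225.0, '8': 180.0, '9': 135.0,
}

def bearing_for_key(key):
    """Return the compass bearing for a directional key, or None for an
    ignored key."""
    return KEYPAD_BEARINGS.get(key)
```

Thus '4' yields West (270 degrees), '8' yields South, and '3' yields Northeast, matching the assignments described above.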

The careful observer will note that it remains a difficult task to determine which direction in the real world corresponds to a chosen direction. Accordingly, an arrangement is provided in some inference engines to receive user feedback and re-set a direction reference.

USER FEEDBACK In some versions of an inference engine, feedback from a user may be used to arrive at a revised reference direction. In particular, where the position of a handheld unit is determined, a list of objects in the immediate vicinity may be presented at a user interface. The user can select from those objects one which is recognized. By selecting a recognized object and orienting the handheld unit in alignment with that direction, the user sets an origin reference direction.

Then, while looking about the environment including objects unknown to the user, a user may have an interest in some object which is off in a direction other than the reference direction. To request information regarding that object, the user may provide indication to the processor by way of the keypad. An inference module rule set may be arranged so that the matrix of keys on the keypad correspond to various directions about the compass with the 'forward' direction corresponding to the '2' key when the keypad is aligned to the origin reference direction. Figure 3 illustrates this scenario. While pointing a handheld device 31 towards a recognized object, schoolhouse 32, the user sets an origin reference direction. The user thereafter makes an inquiry about nearby objects by providing feedback to the inference engine. For example the user may press the '3' key 33 which corresponds to a direction 34 toward a cityscape 35 of interest. In such arrangements, the inference engine determines a direction 34 to be the reference direction and defines an address indicator accordingly. In this way, a user can make requests for information relating to all objects around him by first providing hints to the inference module which is programmed to be initiated with an origin reference direction.
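The feedback scheme above combines the origin reference direction with the keypad offset of the pressed key. A sketch, with assumed names and the '2'-equals-forward convention from the text:

```python
# Offsets relative to the aligned 'forward' ('2') key on the 3x4 keypad.
KEY_OFFSETS = {'1': 315.0, '2': 0.0, '3': 45.0, '4': 270.0,
               '6': 90.0, '7': 225.0, '8': 180.0, '9': 135.0}

def reference_direction(origin_bearing, key):
    """Combine the origin bearing (toward the recognized object) with the
    keypad offset; pressing '2' reproduces the origin direction itself."""
    return (origin_bearing + KEY_OFFSETS[key]) % 360.0
```

For example, with the handset aligned on a recognized object at bearing 30 degrees, pressing '3' yields a reference direction of 75 degrees, i.e. 45 degrees clockwise of the alignment.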

The above technique requires that the user recognize something in the landscape in order to align the device and provide feedback to the device to request information about other things in a direction different than the origin reference direction. It is not always the case that a user could recognize objects in a list of things lying nearby. Therefore alternative means are anticipated for providing a reference direction in an inference module.

SUN POSITION
As many persons find it difficult to determine which direction is North while standing in a common environment, for example a cityscape, it may be useful to provide a technique which aids the user in setting a direction reference via reliance on readily available timing information. A user's position and the precise time of day are well known to systems of the invention via communication with standard clocks. From these, a point on the horizon corresponding to where the sun or moon (or other astronomical bodies such as constellations) is located may also be found by simple calculation. Therefore, a user may use that point as the basis from which an origin directional reference may be formed. A reference direction could then be defined as any direction in relation to that origin direction via user input on the keypad, which is properly aligned with the origin direction.

Figure 4 illustrates a user pointing a handheld device 41 in a direction different from North 42, a direction which may be completely unknown to her. The device points in a direction 43 towards the horizon 44, at a single point 45 immediately below the sun. As the day passes, the position of the sun along its path 46 changes in a precise fashion with respect to sidereal time. Although the user is unaware of which direction is North, and further unaware of the identity of the objects around her, she can point a natural axis (the longitudinal axis) of the device at the horizon just below the sun to set an origin reference direction. Now, with the keypad so aligned, the user may wish to request information about the buildings 47 ahead and to her right. Note that the '2' key no longer corresponds to North, as an arrangement which uses the sun's position aligns the keypad to a dynamic direction which depends upon the time of day.
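The calculation underlying this technique, finding the sun's compass bearing from position and time of day, can be sketched with standard low-precision solar-geometry formulas. This is an illustrative sketch, not the system of the invention: it uses Cooper's declination approximation and ignores the equation of time and atmospheric refraction, so it is accurate only to roughly a degree.

```python
import math

def solar_azimuth(lat_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Approximate azimuth of the sun, in degrees clockwise from north.

    lat_deg:     observer latitude in degrees (north positive)
    day_of_year: 1..365
    solar_hour:  local solar time in hours (12.0 = solar noon)

    Low-precision sketch: ignores the equation of time and refraction.
    """
    lat = math.radians(lat_deg)
    # Solar declination (Cooper's approximation).
    decl = math.radians(
        -23.45 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: the sun moves 15 degrees per hour from solar noon.
    h = math.radians(15.0 * (solar_hour - 12.0))
    # Solar elevation above the horizon.
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(h))
    el = math.asin(max(-1.0, min(1.0, sin_el)))
    # Azimuth from north, resolved east/west by the sign of the hour angle.
    cos_az = ((math.sin(decl) - math.sin(el) * math.sin(lat))
              / (math.cos(el) * math.cos(lat)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    return az if h <= 0 else 360.0 - az
```

At solar noon the sun lies due south for a northern mid-latitude observer, so the origin reference direction obtained by pointing at the horizon below the sun would then be roughly 180 degrees, drifting west through the afternoon.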

To address the buildings, a user simply presses the '3' key on the keypad to provide a hint to the inference module and indicate a direction reference. Figure 5 illustrates a handheld device 51 which is aligned in a direction 52 towards the horizon 53 where the sun is positioned. Buildings 54 are of interest to the user and lie in a direction 55 with respect to the user's present position. To indicate that direction as the direction reference for the purpose of defining an address vector, the user presses the '3' key 56 after first having set the origin direction. Similarly, if a user wishes to address a schoolhouse to the right, handheld unit 61 is pointed along direction 62 at horizon 63. Since schoolhouse 64 lies in a direction 65, the user presses the '6' key 66 to indicate that direction as the selected reference direction.

One will now fully appreciate how a system which measures position and infers a reference direction may present information relating to objects having an association with a particular geometry and location. Although the present invention has been described in considerable detail with clear and concise language and with reference to certain preferred versions thereof, including the best mode anticipated by the inventor, other versions are possible. Therefore, the spirit and scope of the invention should not be limited by the description of the preferred versions contained herein, but rather by the claims appended hereto.

Claims

What is claimed is:
1) An information system arranged to provide information about known objects in response to those objects being addressed, said system comprising: a point reference; a position determining means; a direction reference inference module; a data store; a processor; and a user interface, said position determining means in communication with said point reference whereby position measurements are made with respect to the point reference and position information is conveyed to the processor, said direction reference inference module is arranged to provide direction reference output and convey that output to the processor, said processor is arranged to recall information from the data store in accordance with said measured position of the point reference and the direction reference provided by the direction reference inference module, and said user interface is in communication with said processor whereby the recalled information may be presented to a system user.
2) A system of claim 1, said point reference being arranged within a handheld mobile unit.
3) A system of claim 2, said handheld mobile unit includes a mobile telephone.
4) A system of claim 1, said position determining means is a global positioning system having a receiving antenna arranged within a handheld mobile unit.
5) A system of claim 1, said position determining means is a radio triangulation apparatus.
6) A system of claim 1, said direction reference inference module is arranged as a generalized processor with stored programming operable to provide a reference direction.
7) A system of claim 6, said direction reference inference module is arranged with a preset direction as a default whereby the absence of inputs to the direction reference inference module results in an output of a reference direction.
8) A system of claim 1, said direction reference inference module is arranged to receive input signals from a keypad.
9) A system of claim 8, said keypad includes a matrix of keys in rows and columns, a first row having keys labeled: '1', '2', and '3'; a second row having keys labeled: '4', '5', and '6'; and a third row having keys labeled: '7', '8', and '9'.
10) A system of claim 9, said keypad is arranged such that the '2' key corresponds to the direction north.
11) A system of claim 9, said direction reference inference module is arranged to provide an output which is dependent upon a key press operation.
12) A system of claim 11, said direction reference inference module is arranged to provide an output dependent upon the spatial relationship of the particular key pressed with respect to the matrix of keys.
PCT/US2001/005763 2000-03-16 2001-02-22 Information systems having directional interference facility WO2001071282A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US52680100A true 2000-03-16 2000-03-16
US09/526,801 2000-03-16

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU3864901A AU3864901A (en) 2000-03-16 2001-02-22 Information systems having directional interference facility

Publications (1)

Publication Number Publication Date
WO2001071282A1 true WO2001071282A1 (en) 2001-09-27

Family

ID=24098844

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/005763 WO2001071282A1 (en) 2000-03-16 2001-02-22 Information systems having directional interference facility

Country Status (2)

Country Link
AU (1) AU3864901A (en)
WO (1) WO2001071282A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007031922A1 (en) * 2007-07-09 2009-01-15 Continental Automotive Gmbh Portable electronic instrument for tourists, with a compass and a computer, can be aligned on an object for information about it to be displayed
GB2470713A (en) * 2009-04-03 2010-12-08 Digiguide Ltd Handheld navigation device presenting recorded media content associated with user selected targets
US8218873B2 (en) 2000-11-06 2012-07-10 Nant Holdings Ip, Llc Object information derived from object images
US8224077B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Data capture and identification system and process
US8224079B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US8588527B2 (en) 2000-11-06 2013-11-19 Nant Holdings Ip, Llc Object information derived from object images
WO2014009691A1 (en) * 2012-07-11 2014-01-16 Ad Astra Forever Limited Mobile terminal viewing
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US9526658B2 (en) 2010-02-24 2016-12-27 Nant Holdings Ip, Llc Augmented reality panorama supporting visually impaired individuals
US9679414B2 (en) 2013-03-01 2017-06-13 Apple Inc. Federated mobile device positioning
US9928652B2 (en) 2013-03-01 2018-03-27 Apple Inc. Registration between actual mobile device position and environmental model
US10140317B2 (en) 2013-10-17 2018-11-27 Nant Holdings Ip, Llc Wide area augmented reality location-based services

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815411A (en) * 1993-09-10 1998-09-29 Criticom Corporation Electro-optic vision system which exploits position and attitude
US6127945A (en) * 1995-10-18 2000-10-03 Trimble Navigation Limited Mobile personal navigator
US6173239B1 (en) * 1998-09-30 2001-01-09 Geo Vector Corporation Apparatus and methods for presentation of information relating to objects being addressed



Also Published As

Publication number Publication date
AU3864901A (en) 2001-10-03


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA FI JP KP NZ

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP