WO1999042947A2 - Apparatus and methods for presentation of information relating to objects being addressed - Google Patents


Info

Publication number: WO1999042947A2
Authority: WIPO (PCT)
Application number: PCT/US1999/003688
Other languages: French (fr)
Other versions: WO1999042947A3 (en)
Inventor: Thomas Ellenby
Original Assignee: Geovector Corporation
Priority: US provisional application 60/075,047

Classifications

    • G06F3/04812 — Interaction techniques based on graphical user interfaces [GUI], with cursor appearance or behaviour affected by the presence of displayed objects
    • G06F3/0346 — Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/04815 — Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment

Abstract

Systems are arranged to provide a user with information relating to objects of interest. A user may point a device toward an object to address it. The device determines which objects are being addressed by reference to an internal database (7) containing preprogrammed information relating to objects. Information relating to objects being addressed can then be presented at a user interface (6). A device of the system may include a point reference (1), a direction reference, a position determining means (3), an attitude determining means (4), a computer processor (5) and database (7), and a user interface (6). A method of the system includes the steps of addressing an object, determining position and attitude, searching a database, and presenting information to a user.

Description

Title: "Apparatus and Methods for Presentation of Information Relating to Objects Being Addressed"

Specification for a Letters Patent

BACKGROUND OF THE INVENTION

This application claims benefit of provisional application no. 60/075,047, entitled "Apparatus and Methods For Acquiring and Displaying Geo-Coded Information Based Upon the Position and Attitude of the Apparatus", filed February 18, 1998 by inventor Thomas W. Ellenby. Provisional patent application no. 60/075,047 is hereby incorporated herein by this reference.

Field

The following invention disclosure is generally concerned with devices and techniques for presenting recorded information relating to objects, and specifically concerned with presenting recorded information relating to objects having an association with a particular location.

Prior Art

Systems have been devised to display images of objects which may be in the field-of-view of a vision system. Images may be formed in response to a determination of position and attitude of the vision system which locates the field-of- view with respect to objects being addressed. Details may be fully appreciated in consideration of U.S. Patents # 5,625,765; 5,682,332; and 5,742,521.

While these systems are highly useful and sophisticated, they may require complex imaging apparatus and techniques for forming composite images aligned to actual objects. While the systems and inventions of the prior art are designed to achieve particular goals and objectives, some of those being no less than remarkable, these inventions have limitations which prevent their use in new ways now possible. These prior art inventions are not used, and cannot be used, to realize the advantages and objectives of the present invention.

SUMMARY OF THE INVENTION

Comes now Thomas Ellenby with an invention of an information system including devices and methods of presenting information relating to an object being addressed.

The present invention includes devices and methods for presenting information relating to objects having an association with a particular geometry and location. Specifically, devices are arranged with a pointing reference and user interface. A device which is pointed toward an object known to a computer database responds by determining which objects are being addressed and presenting information relating to those objects at the interface.

After a device of the invention is pointed towards an object, the device makes a determination of the position and attitude of the device. A reference vector associated with the determined position and attitude is defined and used in a search of a database. The database is comprised of data elements having identifiers or descriptors associated with position and other spatial definitions; these geometries may form a geometric intersection with the reference vector. The search produces output which includes information about objects which are being addressed by the device. The information is presented to a user via a user interface.
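The addressing method summarized above can be sketched in a few lines of code. This is an illustrative model only: the names (`make_address_vector`, `search_database`, `wall_descriptor`) are invented for the sketch, and each geometric descriptor is reduced to a simple predicate over the ray.

```python
def make_address_vector(position, direction):
    """Bundle the measured position (vector origin) with the measured
    pointing direction."""
    return (position, direction)

def search_database(database, vector):
    """Return the information elements whose geometric descriptor
    intersects the address vector.  Descriptors are modeled here as
    predicates taking (origin, direction)."""
    origin, direction = vector
    return [element for element in database
            if element["descriptor"](origin, direction)]

# A toy 2-D descriptor: a "wall" at x == 10 spanning y in [0, 5].
def wall_descriptor(origin, direction):
    ox, oy = origin
    dx, dy = direction
    if dx == 0:
        return False              # ray never reaches the wall's plane
    t = (10 - ox) / dx            # ray parameter where x == 10
    if t < 0:
        return False              # wall is behind the device
    return 0 <= oy + t * dy <= 5  # does the crossing fall on the wall?

database = [{"name": "Hotel", "descriptor": wall_descriptor}]
hits = search_database(database, make_address_vector((0, 2), (1, 0)))
print([h["name"] for h in hits])  # ['Hotel']
```

A real database would store explicit geometric definitions rather than predicates; the point is only that the search reduces to testing each stored geometry against one ray.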

Objectives of the Invention

It is a primary objective of the invention to provide systems for presenting information.

It is further an objective to provide systems for addressing an object and presenting information relating to the object.

It is further an objective to provide systems for addressing an object, identifying the object and presenting information relating to the object.

It is further an objective to provide systems for addressing an object, recalling information relating to the object by way of a spatial reference, and presenting information relating to the object being addressed.

A better understanding can be had with reference to the detailed description of preferred embodiments and with reference to the appended drawings. These embodiments represent particular ways to realize the invention and are not inclusive of all ways possible. Therefore, there may exist embodiments that do not deviate from the spirit and scope of this disclosure as set forth by the claims, yet do not appear here as specific examples. It will be appreciated that a great plurality of alternative versions is possible.

BRIEF DESCRIPTION OF THE DRAWING FIGURES

These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims and drawings where:

Figure 1 is a block diagram illustrating major elements of a device of the invention;

Figure 2 is a block diagram showing the configuration of a database of the invention;

Figure 3 is a geometric construct of interest;

Figure 4 shows a similar geometric construct which illustrates an important geometry;

Figures 5 - 9 similarly show geometries of importance;

Figures 10 - 13 illustrate practical uses of devices of the invention.

PREFERRED EMBODIMENTS OF THE INVENTION

In accordance with each of the preferred embodiments of the invention, there is provided an apparatus for and method of presenting information relating to an object being addressed. It will be appreciated that each of the embodiments described includes both an apparatus and a method, and that the apparatus and method of one preferred embodiment may be different from the apparatus and method of another embodiment.

Throughout this disclosure, reference is made to some terms which may or may not be defined in popular dictionaries exactly as they are defined here. To provide a more precise disclosure, the following terms are presented with a view to clarity so that the true breadth and scope may be more readily appreciated. Although every attempt is made to be precise and thorough, it is a necessary condition that not all meanings associated with each term can be completely set forth. Accordingly, each term is intended to also include its common meaning which may be derived from general usage within the pertinent arts or by dictionary meaning. For purposes of this disclosure:

A geometric descriptor is a mathematical definition of a geometric body. A geometric descriptor of the invention is used in association with an object which may be addressed by systems of the invention.

An information element is a database record which relates to a particular object of interest. An information element comprises many forms of multi-media data including but not limited to: text, audio recordings, video streams, pictures, photographs, icons, Java applets, etc. In addition, each information element has associated therewith a geometric descriptor.

Address is a term used herein as a verb, most commonly with the gerund -ing, to indicate a relationship between a device of the invention and an object; the object being the subject of the address. A device of the invention which is pointing at an object is said to be 'addressing' the object.

An address vector has its origin at a device of the invention and points in the direction of a reference direction associated with the device by way of an attitude determining means.

A range gate is a line segment which is colinear with an address vector, having a first endpoint at some minimum distance from a point reference and a second endpoint at some maximum distance from the same point reference.

Objects refer to any element which may be of interest to a user. An object may be a real tangible object or may be a figurative element in space. The term 'object' should be read in a liberal sense. Although buildings and mountains suggest concrete forms of objects, objects for purposes of this disclosure include abstract forms as well. For example, the region of airspace over an airport which may be restricted is considered an 'object'. Indeed, any region of space may be considered an object whether it actually contains a tangible object therein or not.

In simplest versions of the invention, apparatus include the following elements as described hereafter.

Geometric References

Devices of the invention include a point reference and a directional reference. These may be mere structural constructs. The actual point and directional references may or may not correspond to any tangible object or element of the device. Alternatively, they may be collocated with actual physical elements of the device. In either case, an important relationship is made between them and the position and attitude determining means which are also included in devices of the invention.

Position Determining Means

A position determining means is arranged to measure the position of the point reference. Since in many embodiments of the invention the position determining means is a global positioning system (GPS) receiver, the point reference lies at the center of the sphere which is defined by the resolution limits of the positioning system. For practical purposes, a handheld receiver which includes a GPS antenna may be said to have the point reference within the handheld unit. The position determining means therefore measures the position of the handheld unit.

Many forms of alternate positioning systems may be used to accomplish the identical task. The particular positioning system employed may be chosen for a specific task at hand; for example, a global positioning system would not be appropriate for a small space such as a warehouse, so a radio triangulation technique may be preferred. The essence of the invention is not changed by the particular choice of positioning system. Therefore, versions of the invention should not be limited to one particular type of positioning system. The limitation described by 'position determining means' is met when the position of the point reference is measured and made available to a computer processor. Therefore, by use of the term "position determining means" it is meant that any conceivable means for determining the position of a point reference and making that position known to a computer is anticipated. Experts will recognize that there are many thousands of possible ways of determining position, and it will not serve a further understanding of the invention to attempt to catalogue them here. The reader will appreciate that the broadest possible definition of "position determining means" is intended here.

Attitude Determining Means

Systems of the invention also include an attitude determining means. An attitude determining means is arranged to determine the pointing direction or orientation of a directional reference. In simple versions, an electronic compass will serve as an attitude determining means. More sophisticated versions will include an attitude determining means which is operable for measuring inclination as well as direction in a horizontal plane. Although an electronic flux gate compass or laser gyroscope system may be used in certain versions of the invention, it does not improve the description to limit the attitude determining means to any particular device. Similar to the position determining means described above, the limitation described as 'attitude determining means' is fully met by any device or system which may be used to determine the attitude of a directional reference and make that information known to a computer processor.
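Assuming a compass-style attitude determining means that reports a heading and an inclination, the measurement can be turned into a pointing direction as follows. The east-north-up frame convention here is an assumption for illustration; the disclosure does not fix any coordinate convention.

```python
import math

def attitude_to_direction(heading_deg, inclination_deg):
    """Convert a compass heading (degrees clockwise from north) and an
    inclination (degrees above the horizontal) into a unit vector in a
    local east-north-up (ENU) frame.  The frame choice is illustrative."""
    h = math.radians(heading_deg)
    i = math.radians(inclination_deg)
    east = math.cos(i) * math.sin(h)
    north = math.cos(i) * math.cos(h)
    up = math.sin(i)
    return (east, north, up)

# Pointing due east, level with the horizon:
e, n, u = attitude_to_direction(90.0, 0.0)
print(round(e, 6), round(n, 6), round(u, 6))  # 1.0 0.0 0.0
```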

User Interface

A user interface of the invention serves to convey information to a user of the device. A simple speaker driven by computer audio systems is operational for producing audio information and description to a user. Similarly, a display screen driven by video systems of the computer functions to present video or graphic information to a user. Although a display screen and speakers are preferred devices for interfacing with a user, other systems include non-display type visual systems such as simple light emitting diodes, non-speaker audio systems such as buzzers, tactile outputs such as vibrating systems, et cetera. In all cases, a user interface includes a transducer, electronically driven by the computer, which produces some physical disturbance that can be detected by a user's senses.

Computer Processor

In addition, systems of the invention include a computer programmed to execute specific routines. In particular, a computer is arranged to receive inputs from the position and attitude determining means. From these inputs, the computer defines a vector having an endpoint at the device reference point and pointing direction identical to the directional reference. From this vector definition, the computer performs a database search and determines if any of the geometric objects described in the information element geometric descriptors intersects the vector. Data from those information elements which are determined to intersect is recalled and played back to the user interface as appropriate and in agreement with other criteria which may be selected.

Database

In systems of the invention a database is arranged to accommodate data relating to objects of interest. Data relating to objects is prepared and stored in a predetermined and well organized fashion. The data may be stored in many formats and configurations and may be of the nature sometimes referred to as 'multi-media'. A database of the invention is comprised of a plurality of information elements. Each information element relates to a particular object which may be of interest. Each information element contains a descriptor which describes a geometry and location relating to the object to which the stored information pertains.

A geometric descriptor is a definition set for a specific geometry including position. For example, in a Cartesian coordinate system, a sphere may have its center at a point (X, Y, Z) = (2, 5, 9) while having a radius of 7 units. Thus the sphere and all of the points within the sphere's boundary are completely described. A geometric descriptor may describe a geometry which includes a single point, alternatively a polygon which defines a planar region, a solid such as a sphere, or even a three dimensional object of arbitrary shape. Thus the rules which perfectly describe those geometries which are well known in the sciences are used in geometric descriptors of the invention. In all cases, a geometric descriptor includes at least one point and more frequently includes a set of many points.
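The sphere example above admits a standard closed-form coincidence test. The sketch below implements the usual ray-sphere quadratic; the function name is illustrative, not from the disclosure.

```python
import math

def ray_intersects_sphere(origin, direction, center, radius):
    """Test whether a ray (origin + t*direction, t >= 0) intersects a
    sphere, via the standard quadratic |o + t*d - c|^2 = r^2."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return False                      # no real intersection at all
    t2 = (-b + math.sqrt(disc)) / (2 * a) # farther of the two roots
    return t2 >= 0                        # at least one hit in front

# The sphere from the text: center (2, 5, 9), radius 7 units.
print(ray_intersects_sphere((0, 0, 0), (2, 5, 9), (2, 5, 9), 7))   # True
print(ray_intersects_sphere((0, 0, 0), (0, 0, -1), (2, 5, 9), 7))  # False
```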

Methods of the invention are best described as being comprised of the following steps.

In a first step, an object is addressed. To address an object, the device is merely pointed toward the object. The device, having a pointing reference, is necessarily pointing in some direction at all times. Although it is not a necessity that the device be pointing to an object known to the database, the device is always pointing at something and thus it is said that it is addressing something at all times.

In a step to be performed after the first step, the position of the device reference point is determined. A GPS employed locally at the device operates to measure the global position of the reference point. Although convenient measurement units might be latitude, longitude and altitude, others similarly provide workable solutions. Data from the position determining step is passed to the computer processor.

In a further step to be performed after the first step, the attitude of the device directional reference is determined. A compass may be used to determine which direction the device is being pointed. Data from the attitude determining step is similarly passed to the computer processor.

Data received at the computer processor from the position and attitude determining means is used to define an address vector. A search of database information elements is commenced. A search operation reads database geometric descriptors and performs a coincidence test to see if an address vector intersects any of the points in a geometry described. Items meeting that criterion are recalled for further processing and presentation at a user interface.

A more complete understanding can be realized in consideration of the drawing figures with reference numerals as follows. Figure 1 illustrates a block diagram of major components of devices of the invention. A point reference 1 is a geometric construct to which measurements of position are directed. The point may correspond to an actual device such as a GPS antenna or may alternatively be merely a point in space having a convenient location within a physical device. A directional reference 2 similarly forms a geometric construct at a device of the invention but is otherwise arbitrary with respect to any physical element or part of a device of the invention. A position determining means 3 is in communication with the point reference and is arranged to measure its position. The position determining means is further in communication with a computer. The position measurement is made without regard to any particular coordinate system in various versions of the invention but some versions using GPS devices preferably use a latitude, longitude and altitude scheme which allows one to define position anywhere on or above the Earth's surface. Determination of position within a coordinate system is the essence of the function performed by the device without regard for the coordinate system chosen.

An attitude determining means 4 similarly is in communication with a directional reference 2 and with a computer. The attitude determining means measures the pointing direction and reports that information to the computer processor. A computer processor 5 is coupled to and receives measurement data from position and attitude determining means. The computer is further connected to a user interface 6 and presents information to a user by way of the interface. The computer includes a database 7 which may contain preprogrammed information. Now, with reference to Figure 2 the database may be more precisely defined.

A database 21 of the invention has a special construction. The database may include a great plurality of basic units, each referred to throughout as an information element 22. An information element may contain stored information in various formats 23. Each information element contains a descriptor 24 which defines a geometric body of interest. Additional information elements 25, each having their own descriptors and stored information, further make up the database. The database is comprised of any number of information elements, the last element being the Nth element 26.
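The database structure of Figure 2 might be modeled as follows. The class and field names are invented for illustration; the comments map each field back to a reference numeral in the figure.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

@dataclass
class InformationElement:
    name: str
    descriptor: Callable[..., bool]          # geometric descriptor (24)
    media: Dict[str, Any] = field(default_factory=dict)  # stored formats (23)

@dataclass
class Database:
    # information elements 22, 25, ... up to the Nth element (26)
    elements: List[InformationElement] = field(default_factory=list)

    def add(self, element: InformationElement) -> None:
        self.elements.append(element)

db = Database()
db.add(InformationElement(
    name="Hotel Astoria",
    descriptor=lambda origin, direction: True,   # placeholder geometry
    media={"text": "Prices and description of the accommodation."},
))
print(len(db.elements))  # 1
```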

The above described elements when assembled as directed form a device of the invention which is preferably integrated into a small handheld machine. A sturdy case formed of durable plastic operates to contain the assembly and allows a user easy access to functions associated therewith.

In consideration of the above described arrangement and the following procedural description with reference to additional drawings, one will now better appreciate operation of some preferred devices of the invention. Drawing Figure 3 illustrates a simple geometric construction showing a point reference 31 , a directional reference 32, a rectangular cylinder 33 and a circular cylinder 34. A portion of space 35 indicated by a dotted line is shared by the rectangular cylinder and a vector 36 having an endpoint coincident with the point reference and colinear with the direction reference. Having full geometric definition of the vector, and the cylindrical objects, a computer routine is executed to determine which objects are intersected by the vector and which are not. In the case of Figure 3, the square cylinder is intersected by the vector but the circular cylinder is not. A device having a reference point 31 and directional reference 32 is said to be addressing the square cylinder. A computer having programmed information as to the location and shape of the cylinders can tell when a vector is intersecting the space of interest and when it is not. The computer only needs the preprogrammed information and a measurement of the device point and direction references. The computer does not require input from any real object which may be associated with the space of interest and does not need to detect or probe it in any way. For example if the square cylinder 33 is associated with a hotel known by the computer, the hotel is implicitly addressed whenever the device addresses the square cylindrical space. If a construction crew removes the hotel and the computer is not updated, the computer still assumes the presence of the building because it is a necessity that the space remain despite the actual presence, or lack of presence, of the building therein. 
Accordingly, devices of the invention merely determine what space is being addressed and imply that particular objects are being addressed by way of the association of objects to spatial definitions or geometric descriptors. The mere fact that information contained in the database is accurate suggests and implies the presence of the hotel. It is the geometric descriptor which is preprogrammed into the computer which dictates if an intersection exists or not. The actual presence of an object does not affect whether the device is addressing it or not.

It is useful to rely on this technique for most items of interest. For example, the Empire State Building presently occupies a space which is well defined. It is a very reliable fact that the Empire State Building will similarly occupy that same space tomorrow. A device of the invention which is pointed at the Empire State Building, the position of the device and attitude being measured and defining an address vector, can reasonably deduce that the building is there. In this way, devices of the invention 'know' what they are addressing simply by measuring their own position and attitude and comparing that information with information in a database.

For purposes of this disclosure, an intersection of only one point is sufficient to have the vector be coincident or to have an intersection with the geometric object. Figure 4 illustrates a scheme whereby the vector defined by the reference point 41 and the reference direction 42 is coincident with the square cylinder 43 at a single point 44. The circular cylinder 45 is not intersected by the vector and is not said to be coincident therewith.

It is not a requirement that an object be three dimensional; quite contrarily, a two dimensional or single dimensional object forms perfect basis for an intersection with a vector. Figure 5 illustrates a point reference 51 and a direction reference 52 forming a vector which intersects and is coincident with a plane 53 at a single point 54. One might envisage every advertising billboard as a plane having position information associated with it. When programmed properly, these geometric definitions allow a device of the invention to know of any billboard anywhere. When pointed at a billboard the device can identify the advertiser and be made to respond by playing back information such as a product jingle, text information, video clips, et cetera. The connection between the billboard (object) and the geometric descriptor is made via the database where real objects are associated with geometric descriptors in preprogrammed data storage schemes.
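A billboard modeled as a bounded plane admits a simple coincidence test: find where the ray crosses the plane, then check the crossing point against the billboard's extent. The sketch below fixes the plane at constant x purely for illustration; the function and parameter names are assumptions.

```python
def ray_hits_billboard(origin, direction, x_plane, y_range, z_range):
    """Test whether a ray hits a billboard modeled as a rectangle in
    the plane x == x_plane, bounded in y and z."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dx == 0:
        return False          # ray is parallel to the billboard plane
    t = (x_plane - ox) / dx   # ray parameter at the plane crossing
    if t < 0:
        return False          # billboard is behind the device
    y = oy + t * dy
    z = oz + t * dz
    return y_range[0] <= y <= y_range[1] and z_range[0] <= z <= z_range[1]

# Billboard occupying x == 20, 0 <= y <= 10, 3 <= z <= 8:
print(ray_hits_billboard((0, 0, 0), (1, 0.25, 0.25), 20, (0, 10), (3, 8)))  # True
```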

The shape does not necessarily have to be regular or "well-behaved". A geometric description is available for a complexly shaped element as well as those more readily described in simple terms. Figure 6 shows a reference point 61 and reference direction 62 which define a vector having an intersection with a spatial element 63 at line segment 64.

A geometric descriptor used in devices of the invention to associate object data with position and shape may change in time. Although the trains in Japan are moving objects, they move in a highly reliable way in accordance with a rigid schedule. Therefore, a geometric descriptor might include information about changes of position with respect to time. When a device of the invention is pointed at a moving train, inquiry to the database may yield an intersection with a 'moving' spatial element.
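A time-varying descriptor can be sketched by making the geometry a function of the query time. Everything concrete here — the train's speed, route, and spherical shape — is invented for the example; only the idea of evaluating the descriptor at a given time comes from the text.

```python
import math

def moving_train_descriptor(origin, direction, query_time):
    """Illustrative time-varying descriptor: the train is modeled as a
    sphere of radius 50 m whose center moves east along the x axis at
    30 m/s, starting from x == 0 at time 0."""
    center = (30.0 * query_time, 0.0, 0.0)   # scheduled position now
    # Distance from the ray to the center at the point of closest approach:
    oc = [c - o for c, o in zip(center, origin)]
    d_len = math.sqrt(sum(d * d for d in direction))
    unit = [d / d_len for d in direction]
    t = sum(u * v for u, v in zip(unit, oc))  # projection onto the ray
    if t < 0:
        t = 0.0                               # train is not ahead; test at origin
    closest = [o + t * u for o, u in zip(origin, unit)]
    dist = math.sqrt(sum((p - c) ** 2 for p, c in zip(closest, center)))
    return dist <= 50.0

# Device at the origin pointing north (+y).  At t == 10 s the center is
# at x == 300 m, far east of the line of sight; at t == 1 s it is at
# x == 30 m, within the 50 m sphere.
print(moving_train_descriptor((0, 0, 0), (0, 1, 0), 10.0))  # False
print(moving_train_descriptor((0, 0, 0), (0, 1, 0), 1.0))   # True
```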

Figure 7 shows one additional condition of interest. Although the term 'vector' implies a line segment with infinite extent in one direction, and indeed it is a true vector which is of primary interest for most examples presented, in some cases only a certain portion of the vector is of interest. Some operations described hereafter will refer to a "range gate". A range gate has two delimiters which define a portion of the vector which is of particular importance. Figure 7 shows a reference point 71, a reference direction 72, a first delimiter 73, a second delimiter 74, a cube body 75, a line segment 76, a circular cylinder 77, and a vector 78. Although the vector 78 intersects and passes through both the cube body and the circular cylinder, only the portion of the vector in the range gate, i.e. that portion between delimiters 73 and 74, forms an intersection with the cube body. Thus, in some instances, a range gate is created to designate which portions of the vector are of greatest interest.

Figure 8 shows another important range gate. A range gate may include all the points along the reference vector from the reference point to a selected maximum distance. For example, a user may specify all targets "within fifty meters of the device". Objects which are greater than fifty meters away from the user are not included in any recall effort. Figure 8 illustrates this concept. A reference point 81 and reference vector 82 form basis for a system having a range gate starting at the reference point and ending 83 at some predetermined distance from the reference point. A cubic object 84 has a portion 85 of the vector passing through it. Similarly, circular cylindrical object 86 also has a portion of the vector intersecting that object. Of course, the vector 87 continues on without limit.
The difference between the cubic object and the circular cylindrical object is that the cubic object lies within the range gate region of the reference vector and the circular cylindrical object does not. A computer search engine arranged to be responsive to criteria describing such a range gate is useful in restricting the objects which are presented as presently being addressed by the device.
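Given intersection distances along the address vector, applying a range gate is a simple filter. The object names and distances below are illustrative, not taken from the figures.

```python
def filter_by_range_gate(hits, gate_min, gate_max):
    """hits: list of (name, distance) pairs, where distance is how far
    along the address vector the intersection occurs.  Only hits whose
    distance falls inside [gate_min, gate_max] are kept."""
    return [name for name, distance in hits
            if gate_min <= distance <= gate_max]

# The situation of Figure 8: a cube at 20 m, a cylinder at 80 m,
# with a gate from the reference point out to 50 m:
hits = [("cube", 20.0), ("cylinder", 80.0)]
print(filter_by_range_gate(hits, 0.0, 50.0))  # ['cube']
```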

It is entirely possible that two objects fall within the limits of a particular range gate. Figure 9 illustrates a reference point 91 and a direction vector 92 which passes through 93 a first object, continues through space 94 and passes through a second object 95. In this case, both objects, a cubic object 96 and a circular cylindrical object 97, form an intersection with the vector and lie within a range which lies on the vector somewhere past the point indicated as 98. A search engine therefore identifies both objects as being addressed by the system. A display can handle this occurrence by listing all objects being addressed simultaneously. A list may include a scheme whereby closer objects are listed first while more distant objects appear nearer the end of the list. A user may select from the list an object of particular interest and request from the computer more information relating to that object.
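The closest-first listing described above amounts to a sort on intersection distance. A minimal sketch with invented names and distances:

```python
def rank_addressed_objects(hits):
    """Order simultaneously addressed objects closest-first, as the
    display scheme in the text suggests.  hits: (name, distance) pairs."""
    return [name for name, distance in sorted(hits, key=lambda h: h[1])]

# Figure 9: both the cube and the circular cylinder intersect the vector.
hits = [("circular cylinder", 95.0), ("cube", 93.0)]
print(rank_addressed_objects(hits))  # ['cube', 'circular cylinder']
```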

Now with reference to drawing Figures 10 - 13, one will more readily understand a very practical use of the invention. The photograph is of the central square of the village of Chamonix, France, with Mont Blanc in the background. A gentleman walking on the sidewalk and holding in one hand a device of the invention is positioned behind the cars slightly to the left of center of the photograph. By pointing the unit towards the statue shown in the left-hand portion of the photograph, the gentleman addresses the statue object. The device determines the position and attitude of the handheld unit and reports those values to the computer. The computer searches its database to learn that the statue occupies a portion of the same space as the vector defined in the object address step. The computer responds by displaying information to the user about the statue. A brief history of the events being represented and memorialized may be presented as text. In addition, sound clips and video recordings associated with the statue similarly provide content accessed by the computer and presented to a user who has pointed the device at the statue out of interest in it.
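The position-and-attitude lookup described above can be sketched in code. The following is an assumed, simplified model: attitude is reduced to a compass heading and a pitch angle, positions are expressed in a local east/north/up frame, and database records pair content with a bounding-sphere descriptor. None of these names or conventions come from the disclosure itself.

```python
import math

def pointing_vector(heading_deg, pitch_deg):
    """Convert measured attitude (compass heading and pitch, in degrees)
    into a unit direction vector in an east/north/up frame."""
    h, p = math.radians(heading_deg), math.radians(pitch_deg)
    return (math.sin(h) * math.cos(p),   # east component
            math.cos(h) * math.cos(p),   # north component
            math.sin(p))                 # up component

def lookup(position, heading_deg, pitch_deg, database):
    """Search a database of (info, center, radius) records for the nearest
    object whose descriptor occupies space along the pointing vector."""
    d = pointing_vector(heading_deg, pitch_deg)
    best, best_t = None, float("inf")
    for info, center, radius in database:
        # nearest approach of the vector to the object's center
        t = sum(di * (c - p) for di, c, p in zip(d, center, position))
        closest = [p + t * di for p, di in zip(position, d)]
        miss2 = sum((c - q) ** 2 for c, q in zip(center, closest))
        if t > 0 and miss2 <= radius ** 2 and t < best_t:
            best, best_t = info, t
    return best
```

Pointing the device due north at an object 30 meters north of the user retrieves that object's information element; pointing due south retrieves nothing.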

Desiring to identify lodging for the night, the gentleman might point the device at hotels in the area. Figure 11 shows a first building being addressed by the handheld unit. In response, the device queries its database, which contains information relating to the hotel building. Prices and a description of the accommodation can be displayed nearly instantly. Figure 12 shows the device being pointed toward another hotel, the Hotel Astoria. Data relating to the Hotel Astoria can similarly be presented to a user in real time. If the user wishes to address the mountain peak behind the Hotel Astoria shown in Figure 13, a range gate is formed to exclude objects within a few hundred meters and the hotel is ignored. A display then responds by showing information relating to Les Bossons, the world's largest glacier. Because a database can easily be created whereby information relating to particular objects is associated with a descriptor which describes the object's location and shape, an object can be addressed from any location and information relating to the object can be presented to an interested user.
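The exclusion step is simply the range-gate idea applied as a filter over addressed objects. A minimal sketch, assuming the hits are (distance, name) pairs produced by an earlier search step and using illustrative distances:

```python
def filter_range_gate(hits, near, far=float("inf")):
    """Keep only addressed objects whose distance along the pointing
    vector falls inside the gate. Setting `near` to a few hundred
    meters skips a nearby hotel so that the mountain behind it is
    reported instead."""
    return [(t, name) for t, name in hits if near <= t <= far]
```

For example, with a hotel 40 m away and a glacier several kilometers away on the same vector, a gate beginning at 300 m leaves only the glacier to be presented.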

One will now fully appreciate how a system which measures position and attitude may present information relating to objects having an association with a particular geometry and location. Although the present invention has been described in considerable detail with clear and concise language and with reference to certain preferred versions thereof including the best mode anticipated by the inventor, other versions are possible. Therefore, the spirit and scope of the invention should not be limited by the description of the preferred versions contained herein, but rather by the claims appended hereto.

Claims

1) An apparatus for the presentation of information relating to an object being addressed, the apparatus comprising: a directional reference; a point reference; a position determining means; an attitude determining means; a computer processor; a database of information elements; and a user interface, said position determining means being arranged to determine the position of the point reference and convey position information to said computer processor; said attitude determining means being arranged to determine the orientation of the directional reference and convey attitude information to said computer processor; said database being comprised of a plurality of information elements, each information element having a geometric descriptor associated therewith; and said user interface being in electronic communication with said computer processor.
2) A method of presenting information relating to an object being addressed, the method comprising the acts: addressing an object; determining position; determining attitude; searching a database; and presenting information, said addressing an object further defined as pointing a reference pointer towards an object; said determining position further defined as measuring the position of an apparatus having a point reference; said determining attitude further defined as measuring the orientation of an apparatus having a directional reference; said searching a database further defined as executing a computer routine to compare a vector definition to geometric descriptors of information elements; said presenting information further defined as reporting the results of the search where coincidence is found.
PCT/US1999/003688 1998-02-18 1999-02-17 Apparatus and methods for presentation of information relating to objects being addressed WO1999042947A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US7504798P true 1998-02-18 1998-02-18
US60/075,047 1998-02-18

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU28710/99A AU2871099A (en) 1998-02-18 1999-02-17 Apparatus and methods for presentation of information relating to objects being addressed

Publications (2)

Publication Number Publication Date
WO1999042947A2 true WO1999042947A2 (en) 1999-08-26
WO1999042947A3 WO1999042947A3 (en) 1999-12-02

Family

ID=22123200

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US1999/003531 WO1999042946A2 (en) 1998-02-18 1999-02-17 Apparatus and methods for presentation of information relating to objects being addressed
PCT/US1999/003688 WO1999042947A2 (en) 1998-02-18 1999-02-17 Apparatus and methods for presentation of information relating to objects being addressed

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US1999/003531 WO1999042946A2 (en) 1998-02-18 1999-02-17 Apparatus and methods for presentation of information relating to objects being addressed

Country Status (3)

Country Link
AU (2) AU2971499A (en)
CA (1) CA2321448A1 (en)
WO (2) WO1999042946A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6470264B2 (en) 1997-06-03 2002-10-22 Stephen Bide Portable information-providing apparatus
EP2202617A2 (en) 2000-05-16 2010-06-30 Nokia Corporation A method and apparatus to browse and access downloaded contextual information
US8218874B2 (en) 2000-11-06 2012-07-10 Nant Holdings Ip, Llc Object information derived from object images
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US8224077B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Data capture and identification system and process
US8588527B2 (en) 2000-11-06 2013-11-19 Nant Holdings Ip, Llc Object information derived from object images
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
EP2406971A4 (en) * 2009-03-09 2015-05-06 Microsoft Technology Licensing Llc Device transaction model and services based on directional information of device
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US9661468B2 (en) 2009-07-07 2017-05-23 Microsoft Technology Licensing, Llc System and method for converting gestures into digital graffiti
US9703385B2 (en) 2008-06-20 2017-07-11 Microsoft Technology Licensing, Llc Data services based on gesture and location information of device
US10057724B2 (en) 2008-06-19 2018-08-21 Microsoft Technology Licensing, Llc Predictive services for devices supporting dynamic direction information
US10140317B2 (en) 2013-10-17 2018-11-27 Nant Holdings Ip, Llc Wide area augmented reality location-based services

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002132806A (en) 2000-10-18 2002-05-10 Fujitsu Ltd Server system, and information providing service system and method
JP2002229991A (en) 2001-01-31 2002-08-16 Fujitsu Ltd Server, user terminal, system and method for providing information
JP2002229992A (en) 2001-01-31 2002-08-16 Fujitsu Ltd Server device for space information service and method for providing the same, and accounting device and method for space information service
DE10156832B4 (en) * 2001-11-20 2004-01-29 Siemens Ag Method and apparatus for displaying information
US7427980B1 (en) 2008-03-31 2008-09-23 International Business Machines Corporation Game controller spatial detection
WO2011106520A1 (en) 2010-02-24 2011-09-01 Ipplex Holdings Corporation Augmented reality panorama supporting visually impaired individuals
US9928652B2 (en) 2013-03-01 2018-03-27 Apple Inc. Registration between actual mobile device position and environmental model
US9679414B2 (en) 2013-03-01 2017-06-13 Apple Inc. Federated mobile device positioning

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5528518A (en) * 1994-10-25 1996-06-18 Laser Technology, Inc. System and method for collecting data used to form a geographic information system database
US5825480A (en) * 1996-01-30 1998-10-20 Fuji Photo Optical Co., Ltd. Observing apparatus
US5870741A (en) * 1995-10-20 1999-02-09 Fuji Xerox Co., Ltd. Information management device
US5902347A (en) * 1996-11-19 1999-05-11 American Navigation Systems, Inc. Hand-held GPS-mapping device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5631973A (en) * 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
JPH0863326A (en) * 1994-08-22 1996-03-08 Hitachi Ltd Image processing device/method
WO1996015517A2 (en) * 1994-11-02 1996-05-23 Visible Interactive Corporation Interactive personal interpretive device and system for retrieving information about a plurality of objects
US5796386A (en) * 1995-01-23 1998-08-18 International Business Machines Corporation Precise calibration procedure for sensor-based view point control system



Also Published As

Publication number Publication date
WO1999042946A3 (en) 1999-10-28
WO1999042947A3 (en) 1999-12-02
AU2971499A (en) 1999-09-06
CA2321448A1 (en) 1999-08-26
AU2871099A (en) 1999-09-06
WO1999042946A2 (en) 1999-08-26


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AU CA CH CN JP KR NZ

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

AK Designated states

Kind code of ref document: A3

Designated state(s): AU CA CH CN JP KR NZ

NENP Non-entry into the national phase in:

Ref country code: KR

122 Ep: pct application non-entry in european phase