US20140040252A1 - Providing Location and Spatial Data about the Physical Environment

Providing Location and Spatial Data about the Physical Environment

Info

Publication number
US20140040252A1
Authority
US
United States
Prior art keywords
tag
data
mobile device
location
spatial data
Prior art date
Legal status
Abandoned
Application number
US14/045,337
Inventor
Matthew J. Jarvis
Current Assignee
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date
Filing date
Publication date
Application filed by International Business Machines Corp
Priority to US14/045,337
Publication of US20140040252A1
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION (assignor: JARVIS, MATTHEW J.)
Legal status: Abandoned

Classifications

    • G06F17/3087
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • G01S5/0264 Hybrid positioning by combining or switching between positions derived from two or more separate positioning systems, at least one being a non-radio wave positioning system
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations using electromagnetic waves other than radio waves
    • G06F3/005 Input arrangements through a video camera
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • G09B21/006 Teaching or communicating with blind persons using audible presentation of the information
    • G09B21/007 Teaching or communicating with blind persons using both tactile and audible presentation of the information
    • G09B21/008 Teaching or communicating with blind persons using visual presentation of the information for the partially sighted
    • A61F9/08 Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • A61H2003/063 Walking aids for blind persons with electronic detecting or guiding means, with tactile perception
    • A61H2201/501 Control means, computer controlled, connected to external computer devices or networks
    • A61H2201/5012 Control means, computer controlled, connected to external computer devices or networks using the internet
    • A61H2201/5064 Position sensors
    • A61H2201/5092 Optical sensor
    • A61H2201/5097 Control means, wireless
    • G01S2205/09 Position-fixing specially adapted for tracking people
    • G01S2205/10 Elderly or infirm


Abstract

A method is provided for providing location and spatial data about the physical environment. The method includes: sensing a tag provided in the environment by a mobile device; receiving data from the tag at the mobile device, the data including location and spatial data relating to the physical environment of the tag; and representing the location and spatial data from the tag by the mobile device. In one embodiment, sensing a tag may include sensing a tag in the form of a machine-readable symbol; and receiving data from the tag receives data by processing an image of the machine-readable symbol by the mobile device. In another embodiment, sensing a tag includes sensing a tag in the form of a transmitter device; and receiving data from the tag receives data by receiving a signal by the mobile device.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This Application claims the benefit of priority to U.S. patent application Ser. No. 13/905,610, filed on May 30, 2013, which claims the benefit of priority to United Kingdom Patent Application Serial No. 1209585.7, filed on May 30, 2012, the contents of which are hereby incorporated by reference.
  • FIELD OF INVENTION
  • This invention relates to the field of providing location and spatial data about the physical environment. In particular, the invention relates to providing location and spatial data about the physical environment to aid the visually impaired.
  • BACKGROUND OF INVENTION
  • Signage in the physical environment may indicate specific hazards to specific groups. For example, edge of stair markings or braille signage may be provided for the visually impaired. However, such signage is often not viewable until the user is already very close. Also, such signs are generally shallow in terms of information exchange.
  • In addition, other hazard indicating signs are lacking in specific detail. For example, a sign may indicate an electrical hazard at the entrance to a space, but this does not give any detail on the exact location or nature of the hazard.
  • Machine readable symbols with data encoding already exist in many fields. For example, bar codes, or data matrix symbols for mobile phones. However, explicit action is usually required by the user to interpret and act on the symbols.
  • Therefore, there is a need in the art to address the aforementioned problems.
  • BRIEF SUMMARY OF THE INVENTION
  • According to a first aspect of the present invention there is provided a method for providing location and spatial data about the physical environment, comprising: sensing a tag provided in the environment by a mobile device; receiving data from the tag at the mobile device, the data including location and spatial data relating to the physical environment of the tag; and representing the location and spatial data from the tag by the mobile device.
  • According to a second aspect of the present invention there is provided a system for providing location and spatial data about the physical environment, in the form of a mobile device including: a sensing component for sensing a tag provided in the environment by a mobile device; a tag data receiver for receiving data from the tag at the mobile device, the data including location and spatial data relating to the physical environment of the tag; and a data representing component for representing the location and spatial data from the tag by the mobile device.
  • According to a third aspect of the present invention there is provided a computer program product for providing location and spatial data about the physical environment, the computer program product comprising a computer readable storage medium having computer-readable program code embodied therewith, the computer-readable program code configured to: sense a tag provided in the environment by a mobile device; receive data from the tag at the mobile device, the data including location and spatial data relating to the physical environment of the tag; and represent the location and spatial data from the tag by the mobile device.
  • According to a fourth aspect of the present invention there is provided a method substantially as described with reference to the figures.
  • According to a fifth aspect of the present invention there is provided a system substantially as described with reference to the figures.
  • The described aspects of the invention provide the advantage of providing spatial and locational information for a location via a mobile device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings.
  • Preferred embodiments of the present invention will now be described, by way of example only, with reference to the following drawings in which:
  • FIG. 1 is a block diagram of an example embodiment of a system in accordance with the present invention;
  • FIG. 2 is a block diagram of an example embodiment of a system in accordance with the present invention;
  • FIG. 3 is a block diagram of an example embodiment of a system in accordance with the present invention;
  • FIG. 4 is a block diagram of an embodiment of a computer system in which the present invention may be implemented; and
  • FIGS. 5 to 8 are flow diagrams of example embodiments of aspects of a method in accordance with the present invention.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numbers may be repeated among the figures to indicate corresponding or analogous features.
  • In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present invention.
  • A method and system are provided for encoding locational and spatial data about the physical environment in “tags” from which the information may be retrieved by a mobile device and represented to the user of the mobile device in a suitable manner, such as via auditory information, tactile information, or augmented reality techniques.
  • Locational and spatial data about the physical environment may be provided by tags, for example, in the form of machine readable symbols or low power local transmitters, with the information to be represented to a user through augmented sensory feedback using a mobile device. For example, trip hazards could be encoded so that a visually-impaired person approaching would be warned via sound or touch or augmented reality techniques, including augmented reality overlay on real time video.
  • The form of the locational information may be an exact encoded location of the object to which the tag refers. This could be in the form of Global Positioning System (GPS) positional data for locations where GPS can be used. For indoor locations, this may be a triangulation position dependent on the particular Indoor Positioning System (IPS) being used.
  • The form of the spatial data may be encoded data about the actual object to which the tag refers. This may be the dimensions in 3D, and other data relevant to the particular object.
  • For a set of stairs for example, this might include the number of stairs and their depth and orientation in space. For a barrier, for example, a fence or wall, this could include the height and thickness.
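  • As an illustration of how such locational and spatial data might be encoded together, the following sketch packs a hypothetical stairs tag into a compact JSON payload suitable for a machine readable symbol or a short broadcast. The field names (tag_id, spatial, next_tag, and so on) are assumptions made for illustration only and are not defined by this disclosure.

```python
import json

# Hypothetical payload for a tag attached to a flight of stairs.
# The location may be outdoor GPS coordinates or, indoors, a position
# expressed in whatever Indoor Positioning System (IPS) is in use.
stairs_tag = {
    "tag_id": "stairs-entrance-east",
    "kind": "stairs",
    "location": {
        "system": "GPS",                    # or "IPS"
        "lat": 51.5014, "lon": -0.1419,     # illustrative coordinates
    },
    "spatial": {
        "step_count": 12,
        "step_depth_m": 0.28,
        "step_height_m": 0.17,
        "orientation_deg": 90,              # compass bearing of descent
    },
    # Optional pointer to the next tag in the space.
    "next_tag": {"tag_id": "corridor-junction-1", "bearing_deg": 45, "distance_m": 20},
}

# Serialised form small enough for a QR-style symbol or a short radio packet.
encoded = json.dumps(stairs_tag, separators=(",", ":"))
print(len(encoded), "bytes:", encoded)
```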
  • The data may be presented in a variety of ways, including an auditory signal alert as the user approaches the location, haptic type vibrating alerts used in the same way, tool tip type overlays over the real time video containing additional information as text, 3D wire-frame overlays showing the shape and location of the information.
  • As an example, tool tip type overlays may appear over the video as the user approaches a set of stairs. The stairs may be identified in the video frame with a wire frame overlay. This overlay could be ‘clickable’, which would trigger further text information about the hazard to appear, or allow the user to zoom in on the area to view it in more detail before navigating it.
  • Referring to FIG. 1, a block diagram shows an example embodiment of the described system.
  • A mobile device 110 may be provided, for example, in the form of a mobile computing device such as a tablet or a mobile telephone device. The mobile device 110 may include all or some of the components of: GPS sensor 111, spatial sensor 112, signal receiver 113, camera 114, and display 115, etc.
  • In the described system, the mobile device 110 includes a physical environment data component 120 which provides location and spatial data about the physical environment to the mobile device 110 user.
  • The physical environment data component 120 may include a sensor component 121 for sensing a tag and a tag data receiver 122 for receiving and decoding data provided by tags 131-133 at physical locations which provide location and spatial data about the physical environment at the physical location of the tag 131-133. For example, tags 131-133 may be provided at physical hazards, potentially dangerous places, or places of interest. A tag 131-133 close to the mobile device 110 may be sensed and its data received by the tag data receiver 122.
  • The physical environment data component 120 may also include a data representing component 123 for representing the received tag data to the user of the mobile device 110. The tag data may be represented via audio, tactile feedback, or augmented reality techniques.
  • The physical environment data component 120 may also include a decoding component 124 for decoding location and spatial data received from a tag 131-133.
  • The physical environment data component 120 may also include a next tag component 125 for determining the location of a next tag 131-133 from the data supplied by a tag 131-133. The data may provide directions to the next tag 131-133.
  • The tag data receiver 121 may include a video camera which senses the tags 131-133 and augments the video image display in real time.
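  • A minimal sketch of how the physical environment data component 120 and its sub-components might be structured in code is given below. The class and method names are purely illustrative assumptions and do not correspond to any implementation described in this disclosure.

```python
import json
from dataclasses import dataclass
from typing import Optional, Protocol


class TagSensor(Protocol):
    """Sensing component: detects a nearby tag via camera or radio and returns its raw data."""
    def sense(self) -> Optional[bytes]: ...


class Representer(Protocol):
    """Data representing component: audio, tactile, or augmented reality output."""
    def represent(self, tag: dict) -> None: ...


@dataclass
class PhysicalEnvironmentDataComponent:
    """Ties together sensing, decoding, representation, and next-tag lookup."""
    sensor: TagSensor
    representer: Representer

    def poll(self) -> Optional[dict]:
        raw = self.sensor.sense()              # sense a tag in range
        if raw is None:
            return None
        tag = self.decode(raw)                 # decoding component
        self.representer.represent(tag)        # represent data to the user
        return self.next_tag(tag)              # next tag component

    @staticmethod
    def decode(raw: bytes) -> dict:
        return json.loads(raw.decode("utf-8"))

    @staticmethod
    def next_tag(tag: dict) -> Optional[dict]:
        return tag.get("next_tag")
```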
  • A first embodiment of a system is described in which machine readable visual symbols encoding spatial and locational information about the physical environment are provided. Such spatial and locational information may augment existing physical signs, and can be automatically identified and decoded by a video camera in a mobile device, which may then represent this data to a user in various ways in real time.
  • An example of the first embodiment is shown in FIG. 2. FIG. 2 shows the system 200 of FIG. 1 with the tags 231-233 in the form of machine readable visual symbols which may be read by the tag data receiver 121 of the mobile device 110. In this embodiment, the tag data receiver 121 may be a still or video camera for receiving the image of the symbol of the tags 231-233.
  • When using visual tags, the tags 231-233 may be identified by the camera using video analytics. This would work in a similar way to Quick Response (QR) tags. These may be of a size large enough to be identified using the video camera viewing the space.
  • Alternatively, tags 231-233 may be initially identified in the space using a primary tag, which can then trigger the video camera to zoom in to gather further details. A primary tag may take the form of an easily identifiable surround which can be seen at longer distances, which signifies that a tag 231-233 is present within it. For example, this could be a brightly colored sign edge. A video camera application may pick this up, and then may zoom to view the tag 231-233 and capture the information encoded in it.
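  • The two-stage approach of spotting a brightly coloured surround at a distance and then zooming in to decode the symbol could be sketched as follows using OpenCV. The colour range, the use of a QR-style symbol, and the function names are assumptions made only for illustration.

```python
import cv2

def find_primary_surrounds(frame_bgr, min_area=2000):
    """Locate brightly coloured sign surrounds (assumed bright orange here) in a wide view."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (5, 150, 150), (20, 255, 255))   # illustrative colour range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

def decode_tag_in_region(frame_bgr, rect):
    """'Zoom' into a candidate surround and try to decode a QR-style symbol inside it."""
    x, y, w, h = rect
    region = frame_bgr[y:y + h, x:x + w]
    text, _points, _raw = cv2.QRCodeDetector().detectAndDecode(region)
    return text or None

# Usage sketch:
# frame = cv2.imread("scene.jpg")
# for rect in find_primary_surrounds(frame):
#     payload = decode_tag_in_region(frame, rect)
#     if payload:
#         print("tag data:", payload)
```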
  • Many applications already exist to recognize real world elements using mobile devices and display additional data overlaid to the user. For example, The Layar Reality Browser (Layar is a trade mark of Layar) provides an augmented reality platform for web services serving geo-located points of interest in the vicinity of the user. Image detection and identification on mobile devices are also known in the art.
  • A second embodiment of a system is described in which low cost, low power wireless transmitters encoding and broadcasting spatial and locational information about the physical environment are provided. Such spatial and locational information may augment existing physical signs, and can be automatically identified and decoded by mobile devices equipped with the appropriate receivers, which may then re-present this data to a user in various ways in real time.
  • An example of the second embodiment is shown in FIG. 3. FIG. 3 shows the system 300 of FIG. 1 with the tags 331-333 in the form of transmitters whose signal may be received by the tag data receiver 121 of the mobile device 110. In this embodiment, the tag data receiver 121 may be a radio signal receiver. In this embodiment, the tag data receiver 121 may include both a signal receiver and a camera for receiving an image of the location of the tags 331-333 as well as their data.
  • Transmitters may take the form of low power, low cost wireless devices, using protocols such as Radio-frequency identification (RFID), Bluetooth (proprietary open wireless technology) or ZigBee (a wireless mesh network standard, ZigBee is a trade mark of ZigBee Alliance). Transmitters may also encode the location of other transmitters, enabling a chain of communication throughout a physical space. A mobile device, viewing the environment through a video camera, may identify or sense transmitters in its field of reception, then decode and represent the data to the user in a variety of ways.
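  • Because broadcast packets from such low power transmitters are typically small, the payload might be packed into a fixed binary layout rather than text. The byte layout below, including the pointer to the next transmitter in the chain, is a hypothetical example rather than a format defined by this disclosure.

```python
import struct

# Hypothetical 19-byte broadcast payload: tag id, hazard type, latitude and
# longitude as 4-byte floats, three dimensions in centimetres, and the id of
# the next tag in the chain through the space.
PAYLOAD_FORMAT = ">HBffHHHH"

HAZARD_TYPES = {1: "stairs", 2: "barrier", 3: "electrical", 4: "chemical"}

def pack_tag(tag_id, hazard, lat, lon, dims_cm, next_tag_id):
    code = {v: k for k, v in HAZARD_TYPES.items()}[hazard]
    return struct.pack(PAYLOAD_FORMAT, tag_id, code, lat, lon, *dims_cm, next_tag_id)

def unpack_tag(payload):
    tag_id, code, lat, lon, d0, d1, d2, next_tag_id = struct.unpack(PAYLOAD_FORMAT, payload)
    return {
        "tag_id": tag_id,
        "hazard": HAZARD_TYPES.get(code, "unknown"),
        "location": (lat, lon),
        "dimensions_cm": (d0, d1, d2),
        "next_tag_id": next_tag_id,
    }

raw = pack_tag(7, "stairs", 51.5014, -0.1419, (28, 17, 120), next_tag_id=8)
print(unpack_tag(raw))
```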
  • A physical space may be ‘tagged’ with wireless transmitters or machine readable symbols sited at areas of potential hazard or interest, replacing or augmenting traditional signage. For example, a set of stairs may be tagged and would contain encoded data giving the exact location and physical dimensions of the steps.
  • Once the environment is “tagged” in this way, a mobile device viewing the environment through a video camera would identify the symbols in its field of view using visual analytics, and then decode and re-present the data to the user in a variety of ways.
  • In some situations (for example, a busy indoor location), a combination of the visual and broadcast tags may be used. This may be required to ensure that all tags are read by the application, dependent on the orientation of possible signage. Many public spaces, for example, museums or galleries, have fairly defined flow patterns for visitors, so it is relatively easy to site signage where it can be seen from many different positions.
  • Emerging tablets, mobile telephones, and other mobile devices are now often fitted with highly accurate spatial and locational sensors, broadcast signal receivers, and high-resolution cameras. These technologies are rapidly improving in resolution, and the devices themselves are rapidly increasing in processing power, so true visual analytics on real-time video using mobile devices is possible.
  • Referring to FIG. 4, an exemplary system for the mobile device includes a data processing system 400 suitable for storing and/or executing program code including at least one processor 401 coupled directly or indirectly to memory elements through a bus system 403. The memory elements may include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
  • The memory elements may include system memory 402 in the form of read only memory (ROM) 404 and random access memory (RAM) 405. A basic input/output system (BIOS) 406 may be stored in ROM 404. System software 407 may be stored in RAM 405 including operating system software 408. Software applications 410 may also be stored in RAM 405.
  • The system 400 may also include a primary storage means 411 such as a magnetic hard disk drive and secondary storage means 412 such as a magnetic disc drive and an optical disc drive. The drives and their associated computer-readable media provide non-volatile storage of computer-executable instructions, data structures, program modules and other data for the system 400. Software applications may be stored on the primary and secondary storage means 411, 412 as well as the system memory 402.
  • The computing system 400 may operate in a networked environment using logical connections to one or more remote computers via a network adapter 416.
  • Input/output devices 413 may be coupled to the system either directly or through intervening I/O controllers. A user may enter commands and information into the system 400 through input devices such as a keyboard, pointing device, or other input devices (for example, microphone, joy stick, game pad, satellite dish, scanner, or the like). Output devices may include speakers, printers, etc. A display device 414 is also connected to system bus 403 via an interface, such as video adapter 415.
  • Referring to FIG. 5, a flow diagram 500 shows an embodiment of the described method. A mobile device, such as a mobile computing device or phone, may sense 501 a tag. For example, the tag may be a transmitting device or a machine-readable symbol. The mobile device may receive 502 data from the tag including location and spatial data relating to the physical environment of the tag.
  • Optionally, the data received from the tag may point 503 to the next tag or provide information regarding its location.
  • The mobile device may represent 504 the location and spatial data from the tag in a manner suitable for the user. For example, the user may be visually impaired in which case the data may be represented by audio means 505 or by tactile means 506 or by augmented reality 507.
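  • A compressed sketch of this flow, with hypothetical helper callables standing in for the sensing, receiving, and representation steps, might look like the following.

```python
def handle_tag(sense, receive, preference, audio_out, tactile_out, ar_out):
    """One pass of the FIG. 5 flow: sense (501), receive (502/503), represent (504-507)."""
    tag = sense()                              # 501: sense a tag (symbol or transmitter)
    if tag is None:
        return None
    data = receive(tag)                        # 502: location and spatial data from the tag
    next_tag_hint = data.get("next_tag")       # 503: optional pointer to the next tag

    if preference == "audio":                  # 505: audio means
        audio_out(data)
    elif preference == "tactile":              # 506: tactile means
        tactile_out(data)
    else:                                      # 507: augmented reality
        ar_out(data)
    return next_tag_hint
```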
  • Referring to FIG. 6, a flow diagram 600 shows an example embodiment of the described method in which the tags are machine-readable symbols.
  • The mobile device may sense 601 a tag in the form of a machine-readable symbol. The mobile device may receive 602 data from the tag by means of a camera including location and spatial data relating to the physical environment of the tag.
  • The mobile device may represent 603 the location and spatial data from the tag for the user.
  • Referring to FIG. 7, a flow diagram 700 shows an example embodiment of the described method in which the tags are transmitting devices.
  • The mobile device may sense 701 a tag in the form of a transmitting device when the mobile device is in the transmission range of the transmitter. The mobile device may receive 702 data from the tag by means of a signal receiver including location and spatial data relating to the physical environment of the tag.
  • The mobile device may represent 703 the location and spatial data from the tag for the user.
  • Referring to FIG. 8, a flow diagram 800 shows an example embodiment of the described method in which a mobile device views 801 live video of the physical environment of the mobile device.
  • The mobile device may sense 802 a tag and may receive 803 data from the tag including location and spatial data relating to the physical environment of the tag.
  • The mobile device may represent 804 the location and spatial data from the tag as augmented reality over the live video of the physical location to which the tag relates.
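  • The overlay step might be sketched as below using OpenCV drawing calls: a wire-frame outline plus a tool-tip style label drawn over the live frame. The projection of the tag's encoded location and dimensions into pixel coordinates is assumed to have been performed elsewhere, and the function names are illustrative.

```python
import cv2
import numpy as np

def draw_hazard_overlay(frame, corners_px, label):
    """Draw a wire-frame outline and a tool-tip style label over a tagged hazard.

    corners_px is a list of (x, y) pixel corners for the hazard outline, already
    projected from the tag's encoded location and spatial data.
    """
    pts = np.array(corners_px, dtype=np.int32).reshape(-1, 1, 2)
    cv2.polylines(frame, [pts], isClosed=True, color=(0, 0, 255), thickness=2)
    x, y = corners_px[0]
    cv2.rectangle(frame, (x, y - 22), (x + 9 * len(label), y - 4), (0, 0, 255), -1)
    cv2.putText(frame, label, (x + 2, y - 8), cv2.FONT_HERSHEY_SIMPLEX,
                0.5, (255, 255, 255), 1)
    return frame

# Usage sketch with a live capture loop:
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# if ok:
#     frame = draw_hazard_overlay(frame, [(100, 200), (300, 200), (300, 320), (100, 320)],
#                                 "Stairs: 12 steps down")
#     cv2.imshow("augmented view", frame)
```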
  • Before navigating through the space, a user could “view” a scene using the mobile device and access detailed information about all of the potential hazards in that space. Using the locational and positional sensors in the device, the user could then receive real time feedback, as a video overlay, audio commentary, or tactile feedback, indicating their proximity to such hazards.
  • For example, a visually impaired user may have some close vision, but lack detail in depth vision at any other depth. Using such a system, the user could view the space, and the mobile device would present all the hazards appropriate to them as an augmented reality overlay over the real time video, allowing zooming in to more closely examine the space before attempting to navigate.
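  • Proximity feedback of this kind could be driven by comparing the device's own positional fix with the location encoded in the tag. The sketch below uses the haversine great-circle distance and illustrative alert thresholds; nothing in it is prescribed by this disclosure.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two positional fixes (haversine formula)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def proximity_alert(device_fix, hazard_fix):
    """Map distance to an escalating alert level for audio or haptic feedback."""
    d = distance_m(*device_fix, *hazard_fix)
    if d < 2:
        return "urgent"    # e.g. continuous vibration or spoken warning
    if d < 10:
        return "warning"   # e.g. periodic tone
    return "none"

print(proximity_alert((51.50138, -0.14195), (51.5014, -0.1419)))
```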
  • Another example would be adding such transmitters to signage indicating a chemical or explosive risk. Whilst these signs indicate danger, they do not signal in any detail the type or exact location of the hazard. Emergency services entering a space could use a mobile device to access the transmitted signals or “scan” such signage, and retrieve data indicating the exact nature, amount, and physical location of the hazard.
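  • For this signage example, the data carried by the tag could be as simple as a structured record describing the hazard. The field names below are assumptions made for the sketch only.

```java
/** Illustrative hazard description a signage tag could carry; field names are assumptions. */
record HazardRecord(String hazardClass,    // e.g. "flammable liquid" or "explosive"
                    String substance,      // exact nature of the hazard
                    double quantityKg,     // amount present
                    double latitude,       // physical location of the hazard
                    double longitude,
                    String storageNote) {} // e.g. a free-text note on where within the space it is stored
```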
  • The invention can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements. In a preferred embodiment, the invention is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
  • The invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer usable or computer readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus or device.
  • The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk read only memory (CD-ROM), compact disk read/write (CD-R/W), and DVD.
  • As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system”. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • Improvements and modifications can be made to the foregoing without departing from the scope of the present invention.

Claims (12)

1. A method for providing location and spatial data about a physical environment, comprising:
sensing a tag provided in the environment by a mobile device;
receiving data from the tag at the mobile device, the data including location and spatial data relating to the physical environment of the tag; and
representing the location and spatial data from the tag by the mobile device.
2. The method as claimed in claim 1, wherein sensing the tag includes sensing the tag in a form of a machine-readable symbol; and receiving data from the tag receives data by processing an image of the machine-readable symbol by the mobile device.
3. The method as claimed in claim 2, wherein sensing the tag in the form of a machine-readable symbol includes identifying the tag using video analytics.
4. The method as claimed in claim 1, wherein sensing the tag includes sensing the tag in a form of a transmitter device; and receiving data from the tag receives data by receiving a signal by the mobile device.
5. The method as claimed in claim 1, wherein the location data is in a form of Global Positioning System (GPS) data or Indoor Positioning System (IPS) data.
6. The method as claimed in claim 1, wherein the spatial data defines the dimensions of an object at the location.
7. The method as claimed in claim 1, further comprising:
decoding data received from the tag to retrieve the location and spatial data.
8. The method as claimed in claim 1, further comprising:
determining from the data received from the tag, the location of another tag.
9. The method as claimed in claim 1, wherein representing the location and spatial data from the tag by the mobile device includes representing the data by audio means.
10. The method as claimed in claim 1, wherein representing the location and spatial data from the tag by the mobile device includes representing the data by tactile means.
11. The method as claimed in claim 1, wherein representing the location and spatial data from the tag by the mobile device includes representing the data by augmented reality means.
12. The method as claimed in claim 1, further comprising:
viewing a live video of a physical environment of a mobile device; and
wherein representing the location and spatial data from the tag by the mobile device includes representing the data by augmented reality over a view of the live video.
US14/045,337 2012-05-30 2013-10-03 Providing Location and Spatial Data about the Physical Environment Abandoned US20140040252A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/045,337 US20140040252A1 (en) 2012-05-30 2013-10-03 Providing Location and Spatial Data about the Physical Environment

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1209585.7 2012-05-30
GB1209585.7A GB2502549A (en) 2012-05-30 2012-05-30 Navigation system
US13/905,610 US9710564B2 (en) 2012-05-30 2013-05-30 Providing location and spatial data about the physical environment
US14/045,337 US20140040252A1 (en) 2012-05-30 2013-10-03 Providing Location and Spatial Data about the Physical Environment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/905,610 Continuation US9710564B2 (en) 2012-05-30 2013-05-30 Providing location and spatial data about the physical environment

Publications (1)

Publication Number Publication Date
US20140040252A1 true US20140040252A1 (en) 2014-02-06

Family

ID=46546169

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/905,610 Active 2034-02-01 US9710564B2 (en) 2012-05-30 2013-05-30 Providing location and spatial data about the physical environment
US14/045,337 Abandoned US20140040252A1 (en) 2012-05-30 2013-10-03 Providing Location and Spatial Data about the Physical Environment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/905,610 Active 2034-02-01 US9710564B2 (en) 2012-05-30 2013-05-30 Providing location and spatial data about the physical environment

Country Status (2)

Country Link
US (2) US9710564B2 (en)
GB (1) GB2502549A (en)


Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10024679B2 (en) * 2014-01-14 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9629774B2 (en) 2014-01-14 2017-04-25 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9578307B2 (en) 2014-01-14 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US9915545B2 (en) * 2014-01-14 2018-03-13 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10360907B2 (en) 2014-01-14 2019-07-23 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10248856B2 (en) 2014-01-14 2019-04-02 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US10024667B2 (en) 2014-08-01 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable earpiece for providing social and environmental awareness
US10024678B2 (en) * 2014-09-17 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable clip for providing social and environmental awareness
US9922236B2 (en) * 2014-09-17 2018-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
USD768024S1 (en) 2014-09-22 2016-10-04 Toyota Motor Engineering & Manufacturing North America, Inc. Necklace with a built in guidance device
CN104535960B (en) * 2014-12-29 2017-01-11 华南理工大学 Indoor rapid positioning method based on RFID
US9576460B2 (en) 2015-01-21 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable smart device for hazard detection and warning based on image and audio data
EP3047825A1 (en) * 2015-01-23 2016-07-27 ADI Access Limited Directional guidance apparatus
US10490102B2 (en) 2015-02-10 2019-11-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for braille assistance
US9586318B2 (en) 2015-02-27 2017-03-07 Toyota Motor Engineering & Manufacturing North America, Inc. Modular robot with smart device
US9811752B2 (en) 2015-03-10 2017-11-07 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable smart device and method for redundant object identification
US9677901B2 (en) 2015-03-10 2017-06-13 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing navigation instructions at optimal times
US9972216B2 (en) 2015-03-20 2018-05-15 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for storing and playback of information for blind users
US9898039B2 (en) 2015-08-03 2018-02-20 Toyota Motor Engineering & Manufacturing North America, Inc. Modular smart necklace
US10024680B2 (en) 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
US9958275B2 (en) 2016-05-31 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for wearable smart device communications
US10561519B2 (en) 2016-07-20 2020-02-18 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device having a curved back to reduce pressure on vertebrae
US10432851B2 (en) 2016-10-28 2019-10-01 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device for detecting photography
US10012505B2 (en) 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10521669B2 (en) 2016-11-14 2019-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing guidance or feedback to a user
US10172760B2 (en) 2017-01-19 2019-01-08 Jennifer Hendrix Responsive route guidance and identification system
US11321929B2 (en) 2018-05-18 2022-05-03 Purdue Research Foundation System and method for spatially registering multiple augmented reality devices
US11599194B2 (en) 2020-05-22 2023-03-07 International Business Machines Corporation Spatial guidance system for visually impaired individuals


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050113075A1 (en) 2003-10-31 2005-05-26 Haberman William E. Blind transmission of content to and storage in mobile device
US6992592B2 (en) * 2003-11-06 2006-01-31 International Business Machines Corporation Radio frequency identification aiding the visually impaired with sound skins
LU91115B1 (en) * 2004-10-26 2006-04-27 European Community Navigation system for disabled persons, in particular visually impaired persons
US7267281B2 (en) * 2004-11-23 2007-09-11 Hopkins Billy D Location, orientation, product and color identification system for the blind or visually impaired
US20060129308A1 (en) * 2004-12-10 2006-06-15 Lawrence Kates Management and navigation system for the blind
US20060226973A1 (en) 2005-03-30 2006-10-12 Ranco Incorporated Of Delaware Device, system, and method for providing hazard warnings
WO2009001991A1 (en) * 2007-06-22 2008-12-31 Core Global It Co., Ltd. Voice guidance apparatus for visually handicapped
US8180396B2 (en) 2007-10-18 2012-05-15 Yahoo! Inc. User augmented reality for camera-enabled mobile devices
FR2945635A1 (en) * 2009-05-12 2010-11-19 Christophe Bevilacqua System for positioning and guiding visually impaired person in e.g. building, has calculation unit for calculating current position of portable device based on reading information of reader unit and measurements of inertial platform
WO2011106520A1 (en) 2010-02-24 2011-09-01 Ipplex Holdings Corporation Augmented reality panorama supporting visually impaired individuals
US20110279446A1 (en) 2010-05-16 2011-11-17 Nokia Corporation Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device
US8798534B2 (en) 2010-07-09 2014-08-05 Digimarc Corporation Mobile devices and methods employing haptics
CN201912401U (en) * 2010-12-21 2011-08-03 长春大学 Local area positioning and navigation system and device for blind persons

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040068368A1 (en) * 2000-11-15 2004-04-08 International Business Machines Corporation Apparatus, system, and method for determining a user position and progress along a path
US20020149681A1 (en) * 2001-03-28 2002-10-17 Kahn Richard Oliver Automatic image capture
US20040155815A1 (en) * 2001-05-14 2004-08-12 Motorola, Inc. Wireless navigational system, device and method
US20040178894A1 (en) * 2001-06-30 2004-09-16 Holger Janssen Head-up display system and method for carrying out the location-correct display of an object situated outside a vehicle with regard to the position of the driver
US20030197612A1 (en) * 2002-03-26 2003-10-23 Kabushiki Kaisha Toshiba Method of and computer program product for monitoring person's movements
US20070125442A1 (en) * 2002-12-03 2007-06-07 Forhealth Technologies, Inc. Automated drug preparation apparatus including automated drug reconstitution
US20090285445A1 (en) * 2008-05-15 2009-11-19 Sony Ericsson Mobile Communications Ab System and Method of Translating Road Signs
US20110210167A1 (en) * 2010-02-28 2011-09-01 Lyon Geoffrey M System And Method For Determining Asset Location In A Rack

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kanbara, Registration for Stereo Vision-Based Augmented Reality Based on Extendible Tracking of Markers and Natural Features, 2002, IEEE, vol.2, 1045-1048 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9355316B2 (en) 2014-05-22 2016-05-31 International Business Machines Corporation Identifying an obstacle in a route
US9355547B2 (en) 2014-05-22 2016-05-31 International Business Machines Corporation Identifying a change in a home environment
US9613274B2 (en) 2014-05-22 2017-04-04 International Business Machines Corporation Identifying an obstacle in a route
US9978290B2 (en) 2014-05-22 2018-05-22 International Business Machines Corporation Identifying a change in a home environment
US9984590B2 (en) 2014-05-22 2018-05-29 International Business Machines Corporation Identifying a change in a home environment
US20170091607A1 (en) * 2015-09-25 2017-03-30 Ca, Inc. Using augmented reality to assist data center operators
US9838844B2 (en) * 2015-09-25 2017-12-05 Ca, Inc. Using augmented reality to assist data center operators
US9979473B2 (en) * 2015-10-29 2018-05-22 Plantronics, Inc. System for determining a location of a user

Also Published As

Publication number Publication date
GB201209585D0 (en) 2012-07-11
US20130332452A1 (en) 2013-12-12
US9710564B2 (en) 2017-07-18
GB2502549A (en) 2013-12-04

Similar Documents

Publication Publication Date Title
US9710564B2 (en) Providing location and spatial data about the physical environment
US11862201B2 (en) System and method for displaying objects of interest at an incident scene
Ivanov Indoor navigation system for visually impaired
US10014939B2 (en) Smart device performing LED-ID/RF communication through a camera, and system and method for providing location-based services using the same
US11029172B2 (en) Real scenario navigation method and apparatus, device and computer readable storage medium
US8941752B2 (en) Determining a location using an image
US20180089869A1 (en) System and Method For Previewing Indoor Views Using Augmented Reality
US9918296B2 (en) Method and device for positioning
US20130329061A1 (en) Method and apparatus for storing image data
US20160085518A1 (en) Systems and methods for imaging and generation of executable processor instructions based on ordered objects
JP2009245310A (en) Tag specifying apparatus, tag specifying method, and tag specifying program
US9774397B2 (en) Guidance display, guidance system, and guidance method
KR20150034896A (en) Apparatas and method for offering a information about search location in an electronic device
CN104077954A (en) Locating method and system, and code tag generation method and system
US20130088351A1 (en) System and method for notifying of and monitoring dangerous situations using multi-sensor
KR101702205B1 (en) Interior navigation apparatus and method
KR20210083822A (en) 5g based mixed reality underground utility maintenance syste
KR101954800B1 (en) Positioninng service system, method and providing service apparatus for location information, mobile in the system thereof
EP3280149B1 (en) Method for providing additional contents at terminal, and terminal using same
CN102947772A (en) Method and apparatus for determining input
KR20150070460A (en) User equipment for a safe operation of a bicycle on a bicycle road, control method thereof and service providing device
Kodama et al. A fine-grained visible light communication position detection system embedded in one-colored light using DMD projector
CN107360541A (en) A kind of exhibition section wisdom guide system
KR101623184B1 (en) Helpline method using nfc, system and nfc tag for performing the method
Ivanov A low-cost indoor navigation system for visually impaired and blind

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JARVIS, MATTHEW J.;REEL/FRAME:036850/0175

Effective date: 20130419

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION