WO2019161064A1 - Methods and apparatus for enhancing user-environment interactions - Google Patents


Info

Publication number
WO2019161064A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
data
environment
input
output
Application number
PCT/US2019/018023
Other languages
French (fr)
Inventor
Rose STERN
Original Assignee
Sourceamerica
Application filed by Sourceamerica
Publication of WO2019161064A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G 5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G 5/10 Parts, details or accessories
    • A61G 5/1051 Arrangements for steering
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H 3/02 Crutches
    • A61H 3/06 Walking aids for blind persons
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/01 Constructive details
    • A61H 2201/0157 Constructive details portable
    • A61H 2201/50 Control means thereof
    • A61H 2201/5007 Control means thereof computer controlled
    • A61H 2201/501 Control means thereof computer controlled connected to external computer devices or networks
    • A61H 2201/5012 Control means thereof computer controlled connected to external computer devices or networks using the internet
    • A61H 2201/5058 Sensors or detectors
    • A61H 2201/5064 Position sensors
    • A61H 2201/5082 Temperature sensors
    • A61H 2201/5084 Acceleration sensors
    • A61H 2201/5097 Control means thereof wireless
    • A61H 2230/00 Measuring physical parameters of the user
    • A61H 2230/04 Heartbeat characteristics, e.g. ECG, blood pressure modulation
    • A61H 2230/06 Heartbeat rate
    • A61H 2230/065 Heartbeat rate used as a control parameter for the apparatus
    • A61H 2230/50 Temperature
    • A61H 2230/505 Temperature used as a control parameter for the apparatus
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62J CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J 45/00 Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
    • B62J 45/20 Cycle computers as cycle accessories
    • B62J 45/40 Sensor arrangements; Mounting thereof
    • B62J 45/41 Sensor arrangements characterised by the type of sensor
    • B62J 45/416 Physiological sensors, e.g. heart rate sensors
    • B62K CYCLES; CYCLE FRAMES; CYCLE STEERING DEVICES; RIDER-OPERATED TERMINAL CONTROLS SPECIALLY ADAPTED FOR CYCLES; CYCLE AXLE SUSPENSIONS; CYCLE SIDE-CARS, FORECARS, OR THE LIKE
    • B62K 5/00 Cycles with handlebars, equipped with three or more main road wheels
    • B62K 5/003 Cycles with four or more wheels, specially adapted for disabled riders, e.g. personal mobility type vehicles with four wheels
    • B62K 5/007 Cycles with four or more wheels, specially adapted for disabled riders, power-driven
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G01C 21/26 Navigation specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G01C 21/3679 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G01C 21/3697 Output of additional, non-guidance related information, e.g. low fuel level
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the operation of medical equipment or devices for local operation
    • G16H 40/67 ICT specially adapted for the operation of medical equipment or devices for remote operation

Definitions

  • Various embodiments of the present disclosure relate to systems, devices, and methods for enhancing user-environment interactions. More specifically, embodiments of the present disclosure relate to systems, devices, and methods for collecting information about a user’s environment and providing the user with feedback, instructions, and/or directions in relation to that environment. In some aspects, embodiments of the present disclosure may assist a user in navigating his or her environment.
  • Individuals may have specific needs and circumstances affecting their capacity to interact with, and function within, their surroundings. For example, differently-abled individuals may require information, assistance, guidance, and/or cues in order to navigate, operate within, or function in environments ranging from the familiar (e.g., a home, school, or place of work) to the unfamiliar (e.g., a hotel, a city in a foreign country, a place of worship, a library, a convention center, or a transportation hub such as an airport, bus station, or train terminal). As another example, individuals traveling in unfamiliar locations, locations where no familiar language is spoken, locations where signage is sparse, or environments with poor or minimal visibility may benefit from additional information, guidance, or cues to operate in such locations and environments.
  • Navigational systems, including maps, global positioning system (GPS) devices, cell phone and other navigation software applications (“apps”), and guidance systems, often operate under the assumption that a user is abled (e.g., is not physically or mentally impaired, or is not differently-abled) and/or has the ability to rely on their sight.
  • For example, while a sighted individual may be able to look at a map or a set of directions in order to navigate through their environment, blind or partially-sighted individuals may have limited options for navigating, interacting with, or functioning in the same environments.
  • FIG. 1 depicts in schematic form an exemplary system for enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 2 depicts in schematic form a device for use in enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 3 depicts an exemplary cane for use in enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 4 depicts an exemplary garment for use in enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 5 depicts an exemplary rolling walker for use in enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 6 depicts an exemplary wheelchair for use in enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 7 depicts an exemplary case for use in enhancing user-environment interactions, according to aspects of the present disclosure.
  • FIG. 8 depicts steps in an exemplary method for enhancing user-environment interactions, according to aspects of the present disclosure.
  • Embodiments of the present disclosure relate to devices, systems, and methods for enhancing user-environment interactions.
  • Embodiments of the present disclosure may relate to, e.g., systems and methods for navigating an environment, interacting with aspects of an environment, reacting to an environment, or functioning in an environment by, e.g., guiding or coaching a user.
  • Embodiments of the present disclosure may include devices that send and/or receive information and/or transmissions between one or more guidance devices and one or more items, objects, sensors, beacons, data
  • embodiments of the present disclosure may relate to systems and methods for enhancing user-environment interactions without the need to see the environment.
  • embodiments of the present disclosure may relate to systems and methods for enhancing user-environment interactions for users having one or more physical or mental limitations.
  • embodiments of the present disclosure may relate to systems and methods for gathering and combining information about an environment and using the gathered and combined information for providing a user with feedback, guidance, navigation, coaching, or other directions.
  • Some embodiments of the present disclosure may include a mobile device and one or more separate sensors, beacons, or other devices that may act in concert to gather and combine information and provide a user with feedback, guidance, navigation, coaching, or other directions.
  • devices and methods of the present disclosure may be configured to enable disabled users, users with physical or cognitive impairments, or typically-abled users desiring enabling technology, to enhance the experience of their environment, to travel, move, work, etc. more efficiently.
  • Disabilities or impairments may be physical, emotional, or cognitive.
  • The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, device, system, or apparatus that comprises a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
  • The term “exemplary” is used in the sense of “example,” rather than “ideal.”
  • The terms “first,” “second,” and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish an element, a structure, a step, or a process from another.
  • The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of one or more of the referenced items.
  • FIG. 1 depicts an exemplary system 100 according to aspects of the present disclosure.
  • a device 110 may be associated with a user 120 in a location 115.
  • the device 110 may be connected to a routing device 130.
  • the routing device 130 may be connected to, e.g., a computer 140 local to the routing device 130 and/or a network 150.
  • Network 150 may, in turn, be connected to servers 160, a communication device 170, a satellite 180, and/or one or more services 190.
  • System 100 may include any collection of networking devices, computing devices, individuals, service providers, etc. that may be connected with one another via wired communication lines, wireless networks, signaling pathways, signaling networks, and the like.
  • System 100 may include the collection of elements depicted in FIG. 1, but may additionally or alternatively include fewer than or more than all of the depicted elements.
  • system 100 may include multiples of any of the elements depicted in FIG. 1 (e.g., multiple devices 110, multiple collections of servers 160, multiple communication devices 170, multiple satellites 180, or multiple services 190).
  • system 100 may be connected via any number of direct or indirect lines or connections that may be wired, wireless or otherwise.
  • System 100 may also include multiple elements beyond those depicted in FIG. 1.
  • aspects of system 100 may be owned, managed or controlled by a variety of individuals, companies, organizations, governments, or other entities. Interactions between the elements of system 100 may also be managed or controlled by a variety of individuals, companies, governments, organizations, or other entities. In some embodiments, a single entity may be permitted by other entities to use communication lines or connections between aspects of system 100 for a particular purpose, such as operating device 110.
  • Device 110 may be any device configured to gather information about the surroundings of the device and/or a user of the device, and to output feedback, guidance, navigation, coaching, or other directions to a user (e.g., user 120) of the device.
  • Device 110 may include any number of features or elements that assist in gathering information and/or outputting directions.
  • device 110 may include one or more internal or external inputs, sensors, and/or monitors, such as buttons, keys, a microphone, motion sensors, light sensors, proximity sensors, temperature sensors, or sensors or monitors configured to gather data about a user (e.g., user 120), such as heart rate monitors, blood sugar monitors, blood pressure monitors, and the like.
  • device 110 may include one or more networking, transmission, or communication elements, such as wireless networking capabilities, Bluetooth capabilities, cellular or other mobile capabilities, or radio frequency capabilities.
  • device 110 may be configured to move or travel along with a user (e.g., user 120) in an environment.
  • device 110 may be a hand-held device, such as a personal computer, a tablet, or a phone.
  • device 110 may be a hand-held device that is separate from a personal computer, tablet, or phone.
  • device 110 may have one or more straps, handles, sleeves, ties, buckles, or other features designed to improve its portability.
  • device 110 may include an armband, a belt, a holster, a harness, or other wearable element.
  • device 110 may include one or more wheels, treads, ball bearings, or other mobility-granting devices.
  • device 110 may be lightweight, such that it may be lifted up and down by an individual.
  • device 110 may be, or may be integrated with, other items, such as clothing, tools or vehicles for disabled individuals, tools for individuals in a military or police force, vehicles, jewelry, tools or accessories for hiking or camping, pet accessories (such as a harness or a leash for a service animal), backpacks, bags, or suitcases.
  • device 110 may be customizable or customized, in terms of its size, shape, appearance, inputs, outputs, or any other characteristics. In some embodiments, for example, device 110 may be customized for a particular user 120.
  • device 110 may be configured to relay data either continuously or intermittently to other elements of system 100, such as user 120, computer 140, network 150, servers 160, or services 190.
  • device 110 may also be configured to receive a variety of data from other elements of system 100 (e.g., user 120, computer 140, satellite 180, services 190), such as instructions, maps, directions, audio recordings, messages, advice, alerts, etc.
  • device 110 may include sensors (e.g., sensors 220, 222, 224 depicted with respect to device 200 in FIG. 2) to allow device 110 to receive a variety of data from other elements of system 100, or from markers, beacons, or other elements of location 115 configured to emit data or a signal.
  • device 110 may be configured to store such data.
  • FIG. 2 depicts elements of an exemplary device 110 in further detail, and is described further below.
  • FIGS. 3-6 depict further exemplary devices 110 or items incorporating devices 110 according to aspects of the present disclosure.
  • User 120 may be an individual, a computer, a group of individuals, a company, or other entity receiving information from device 110. In some embodiments, user 120 may be an individual needing or wanting assistance in navigating in or functioning in an environment. In some embodiments, user 120 may be a disabled or differently-abled individual. For example, in some embodiments, user 120 may have one or more physical or mental disabilities and/or limitations. In some embodiments, user 120 may be a blind or partially-sighted individual. In further embodiments, user 120 may be sighted. In further embodiments, user 120 may be an abled individual.
  • user 120 may be an individual who is serving in a military, police, humanitarian, search-and-rescue, or guide capacity.
  • user 120 may be a traveler, regardless of physical and/or mental capabilities.
  • user 120 may be an elderly individual.
  • user 120 may be robotic (e.g., a drone) or have robotic elements (e.g., may have or include one or more computer processors and/or mechanical parts).
  • user 120 may be an artificial intelligence, a computer, or a group of computers.
  • Location 115 may be any location or environment in which assistance is wanted or needed for user 120.
  • location 115 may be an indoor location or an outdoor location.
  • location 115 may be a part or all of a house, hotel, conference center, store, resort, residential community (e.g., a retirement community), transit center (e.g., an airport or bus port), stadium, shopping center, neighborhood, or town.
  • location 115 may be a park, a garden, a yard, a campground, or wilderness.
  • location 115 may be a vehicle, such as an airplane, train, boat, or automobile.
  • location 115 may be an environment with which user 120 is not familiar.
  • location 115 may be an environment with which user 120 is familiar (e.g., a workplace, home, or neighborhood of user 120).
  • location 115 may include, for example, one or more points of interest to a user (e.g., user 120). Such points of interest may include, e.g., locations at which user 120 may receive assistance or information, or may include hazards, events, and the like. In some embodiments, location 115 may have some method by which to identify such points of interest. In some embodiments, an administrator, owner, company, or other entity may provide identification of points of interest to either device 110 or to a repository of information such as a database in servers 160 (or otherwise), such that those points of interest may be accessible to user 120 of device 110.
  • Routing device 130 may be, for example, a computer, a wireless router, a modem, a Bluetooth receiver, a switchboard, or other device configured to relay signals, data, and/or information between elements of system 100.
  • routing device 130 may include a wireless router having a wireless connection to device 110, computer 140, and/or one or more networks, such as network 150.
  • routing device 130 may include a docking station and/or charging connection for device 110.
  • routing device 130 may be integrated into another element of system 100, such as computer 140. Some embodiments of system 100 may not include a routing device 130.
  • Computer 140 may be any type of computer that includes memory and/or one or more processors.
  • computer 140 may be a personal computer, a server computer, or a handheld computing device.
  • computer 140 may be a computer sharing a location (e.g., location 115) with device 110 and user 120.
  • computer 140 may be owned or controlled by user 120.
  • computer 140 may be owned or controlled by another individual or entity at location 115.
  • If, for example, location 115 is a hotel, computer 140 may be a hotel computer.
  • computer 140 may be integrated with device 110 and/or routing device 130.
  • Computer 140 may include information, data, or instructions useful to device 110 and/or user 120.
  • computer 140 may be used to configure device 110 by, e.g., providing device 110 with data, information, and/or instructions via a wired or wireless connection (e.g., by “pushing” data to device 110 directly or via routing device 130).
  • computer 140 may store data accessible to device 110 via a direct or indirect query from device 110 (e.g., a “pull” function of device 110).
  • computer 140 may be configured to connect directly to device 110 (e.g., via a wired or wireless connection) for exchanging data and/or power.
  • computer 140 may include a docking station and/or charging connection for device 110.
  • computer 140 may be a personal and/or portable computer owned and/or operated by user 120.
  • user 120 may use computer 140 simultaneously with device 110.
  • computer 140 may operate in tandem with device 110 to, e.g., record data, analyze data, and/or provide feedback to device 110 with respect to user 120, location 115, or other elements of system 100.
  • an individual or entity other than user 120 may operate computer 140 independently from device 110.
  • While user 120 and device 110 may be in proximity to, e.g., a routing device 130 or computer 140, in alternate embodiments, user 120 and device 110 may not be within any proximity or range of routing device 130 or computer 140. In some embodiments, system 100 may not include a computer 140.
  • Network 150 may be any wired or wireless electronic network, such as a local area network or wide area network (e.g., the internet).
  • network 150 may include various types of data connections, such as wired connections, fiber optic connections, wireless connections, satellite connections, cellular or other mobile connections, and the like.
  • Network 150 may also include any number of computers, digital storage devices and memory connected via one or more wired or wireless networks.
  • network 150 may include “cloud” storage.
  • Servers 160 may include one or more computers configured to send, receive, store, and process data transmitted between one or more computers, databases, networks, and the like.
  • servers 160 may receive, store, and modify data relevant to one or more devices, such as device 110.
  • servers 160 may store data that is potentially useful to a user 120 of device 110.
  • servers 160 may store data relevant to navigating a location of user 120, such as location 115.
  • Such data may include directions, maps and geographical data, cues, instructions, alerts, points of interest, descriptive data, coaching or assistive tools, etc.
  • Servers 160 may be configured to make such data available to device 110 via, for example, network 150, other authorized users, or automatically.
  • servers 160 may receive and store data collected by device 110. In some embodiments, servers 160 may store data collected by multiple devices, including device 110. In some embodiments, servers 160 may provide additional or backup storage for device 110. In further embodiments, servers 160 may provide a searchable database of information usable by device 110.
  • servers 160 may perform one or more processes using data collected by device 110 in location 115.
  • data may be collected or provided by one or more sensors, beacons, monitors, or other communication devices located in location 115.
  • servers 160 may perform one or more analyses to determine what instructions, cues, or other data should be returned to device 110, synthesize such instructions, cues, or other data, and/or return such data to device 110.
  • device 110 may provide servers 160 with its GPS coordinates, a plurality of images taken from the location, and a request for directions.
  • Servers 160 may use the GPS coordinates and the plurality of images to create (or find, in a database) a map of the location of device 110.
  • servers 160 may be configured to push such data to device 110.
  • servers 160 may await a cue to return such data to device 110, such as via a “pull” notification or other request.
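  • By way of illustration, such a device-to-server exchange might be carried in a simple structured payload; the helper function and field names below are hypothetical, a minimal sketch rather than an interface defined by this disclosure.

```python
import json

def build_guidance_request(gps_coords, image_refs, request_type="directions"):
    """Assemble a request from device 110 to servers 160: GPS coordinates,
    references to images captured at the location, and the kind of
    assistance wanted (e.g., directions or a map)."""
    latitude, longitude = gps_coords
    return json.dumps({
        "type": request_type,
        "latitude": latitude,
        "longitude": longitude,
        "images": image_refs,  # e.g., identifiers of separately uploaded images
    })

# A "pull"-style request for directions from the device's current fix.
payload = build_guidance_request((38.8977, -77.0365), image_refs=["img-001"])
```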
  • servers 160 may synthesize data and/or perform analyses using the data from the multiple devices. For example, servers 160 may generate maps, instructions, alerts, statistics, directions, points of interest, commands, etc. that may be relevant to one or more users, such as user 120, and may store the generated data and/or analyses. As another example, servers 160 may generate metrics using the data from multiple devices by, for example, performing statistical analyses using the data from multiple devices. As a further example, servers 160 may compare data received from one device to data received from another device to, e.g., calibrate one of the devices. In some embodiments, after performing a process, servers 160 may publish, push, or otherwise make available the results of such a process.
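  • As one sketch of the device-to-device comparison mentioned above, a constant calibration offset could be estimated from paired readings taken by a reference device and the device being calibrated; the numeric values here are invented for illustration.

```python
from statistics import mean

def calibration_offset(reference_readings, device_readings):
    """Estimate a constant offset between paired sensor readings from two
    devices; the offset can then be applied to correct the device being
    calibrated."""
    return mean(r - d for r, d in zip(reference_readings, device_readings))

# Illustrative temperature readings from two co-located devices.
offset = calibration_offset([20.1, 20.3, 20.2], [19.6, 19.9, 19.7])
corrected = [reading + offset for reading in [19.8, 20.0]]
```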
  • servers 160 may synthesize or analyze data received from device 110 and make such synthesized data or analyses available to a third party, such as an individual or an organization.
  • servers 160 may make syntheses or analyses available to a medical professional, a hospital, a family member of user 120, a nonprofit organization, a government organization, or other entity.
  • servers 160 may be equipped with applicable privacy safeguards to avoid the spread of sensitive personal health information.
  • servers 160 may include restricted access, or data on servers 160 may be encrypted.
  • Communication device 170 may be any device capable of interfacing with network 150, receiving one or more types of information regarding device 110 and/or user 120, and/or generating one or more alerts.
  • communication device 170 may be associated with user 120 (e.g., communication device 170 may be a personal computer, tablet, or phone associated with user 120).
  • communication device 170 may be a personal computer, tablet, or phone associated with user 120.
  • communication device 170 may be associated with one or more third parties.
  • a third party may be any party designated to receive information from device 110.
  • a third party may be designated to provide input to device 110 to assist a user 120, if needed.
  • a third party may be a medical professional, a hospital, a family member or friend of user 120, a company or member of a company, a member of a government body, a member of a military or police force, a colleague of user 120, a member of a helpline, etc.
  • a third party may be a companion to user 120, such as an aide, nurse, or technician.
  • Device 110 may be configured to contact communication device 170 on command by user 120 (e.g., via a telephonic call, other voice or video call, or by sending an alert to communication device 170 via network 150). In some embodiments, device 110 may be configured to contact communication device 170 automatically upon certain events. For example, if data collected by device 110 reflects abnormal patterns, such as a prolonged and unusual lack of movement or input by user 120 or abnormal health monitor readouts (e.g., an abnormal heart rate or other monitored symptom of user 120), a third party may be alerted to check the status of user 120.
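  • A minimal sketch of such automatic alerting logic appears below; the thresholds and function name are hypothetical and would in practice be configured per user.

```python
import time

# Hypothetical limits; a deployed device would configure these per user.
MAX_IDLE_SECONDS = 30 * 60          # prolonged lack of movement or input
NORMAL_HEART_RATE = range(40, 151)  # beats per minute treated as normal

def should_alert_third_party(last_activity_time, heart_rate, now=None):
    """Return True when collected data reflects an abnormal pattern, such
    as prolonged inactivity or an out-of-range heart rate."""
    now = time.time() if now is None else now
    idle_too_long = (now - last_activity_time) > MAX_IDLE_SECONDS
    abnormal_heart_rate = int(heart_rate) not in NORMAL_HEART_RATE
    return idle_too_long or abnormal_heart_rate
```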
  • communication device 170 may include a program, application, or other feature that allows it to connect to device 110 or servers 160 via network 150.
  • communication device 170 may include an application that allows user 120 or a third party to log in to a secure or encrypted system (e.g., on servers 160) and access current data regarding device 110, either from device 110 directly or from servers 160.
  • system 100 may include multiple communication devices, allowing for multiple third parties to contact device 110 or access servers 160 via a communication device.
  • device 110 or servers 160 may include a list of communication devices or third party information reflecting an approved group of devices and third parties that may contact device 110 or access servers 160.
  • Satellite 180 may be any communications satellite configured to collect, receive and/or broadcast data or information to and/or from device 110.
  • satellite 180 may be a GPS satellite configured to broadcast location information to a GPS receiver on device 110. Satellite 180 may, for example, broadcast data allowing a device 110 or other device to construct a map for use by device 110 with respect to user 120.
  • Services 190 may include, for example, individuals or institutions capable of providing a service to user 120 of device 110 upon request or alert.
  • services 190 may include emergency services, such as police services or emergency medical care.
  • services 190 may include navigational or troubleshooting services, which may be able to provide device 110 or user 120 with navigational, travel, coaching, monitoring, or technical assistance upon request.
  • services 190 may be reachable by placing a call to services 190 via device 110 or another communication device.
  • a call may be placed by any input method, such as voice command, dial-in, or by pressing a customized button on device 110. Alternatively, a call may be placed automatically.
  • services 190 may be alerted automatically if device 110 experiences an event. For example, if device 110 registers an abnormal heart rate of user 120, then device 110 may automatically alert some services 190 via network 150. As another example, if a sensor, input, or output on device 110 fails, then device 110, servers 160, or communication devices 170 may automatically alert services 190. In some embodiments, multiple alerts may be sent to multiple services 190, and/or a combination of services 190 and communication devices 170. In some embodiments, a record of such automatic alerts may be made either locally on device 110 or remotely, on, e.g., servers 160.
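  • One illustrative way to fan an alert out to multiple services 190 and communication devices 170 while keeping a local record is sketched below; the `send` argument is a placeholder for whatever transport (cellular, wi-fi, etc.) the device has available.

```python
import logging

# Local record of automatic alerts, as described above.
logging.basicConfig(filename="device_alerts.log", level=logging.INFO)

def dispatch_alerts(event, services, communication_devices, send):
    """Send an alert about an event to every registered service and
    communication device, logging each transmission locally."""
    for recipient in list(services) + list(communication_devices):
        send(recipient, event)
        logging.info("alert %r sent to %s", event, recipient)
```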
  • services 190 may be authorized to provide step-by-step directions to device 110 to assist user 120 with navigating in, or functioning in, location 115.
  • services 190 may be authorized to receive sensor data from device 110 in order to locate, orient, and guide user 120 to a particular destination.
  • Services 190 may be authorized to provide verbal directions to user 120 via network 150 or, alternately, a series of tactile instructions for user 120 to follow.
  • services 190 may be authorized to receive precise location and other data from device 110, in order to best provide assistance to user 120 of device 110.
  • the connections depicted by dotted lines in FIG. 1 may be any suitable connections known to those of ordinary skill in the art.
  • the connections may include any type of wireless connection (e.g., wi-fi, satellite signal, cellular, radio, Bluetooth) or wired connection (e.g., wired telephone or cabled connection).
  • FIG. 2 depicts, in schematic form, an exemplary device 200 according to aspects of the present disclosure.
  • Device 200 may include a processor 202, memory 204, a power source 206, several network-capable components such as a wireless network component 208, a cellular component 210, and a GPS component 212, and a plurality of outputs such as an audio output 214, a tactile output 216, and a visual output 218.
  • Device 200 may also include a plurality of sensors 220, 222, 224. Components may be grouped together into modules, such as module 250 and module 270.
  • Device 200 is a broad, exemplary generalization of the types of devices that are contemplated by the present disclosure.
  • device 200 may be any device suitable for use in system 100 to assist user 120.
  • device 110 in system 100 may be a device 200.
  • Any and/or all characteristics and features of device 110 described above may be characteristics and features of any version of device 200.
  • device 200 may have an expansive array of sizes, shapes, or collection of sizes and shapes (e.g., a combination of various-sized and -shaped modules).
  • FIGS. 3-6 depict several exemplary configurations.
  • the depiction of device 200 is intended to show one exemplary combination of elements that a device according to the present disclosure may have, but many more combinations of elements and characteristics are possible.
  • Although device 200 is depicted with only one processor 202 and only one power source 206, device 200 may, in some embodiments, have multiple processors and multiple power sources. This applies to each element of device 200.
  • Device 200 is schematically depicted as a single rectangular unit.
  • elements of device 200 may be divided into multiple units having any shape or size and being disposed in one housing, or in multiple different housings.
  • elements of device 200 may communicate with one another via wired connections, wireless connections, Bluetooth, cellular connections, or any other wired or wireless connections.
  • Parts of device 200 may be strapped, sewn, attached, or otherwise affixed together or to other objects, items, devices, or individuals, including placement in the user-environment in any form or manner that is useful to a user (e.g., user 120).
  • device 200 may include multiple parts arranged in a pattern throughout an environment.
  • connections between elements of device 200 are depicted with solid lines. Depending on the purpose of the connection (e.g., a connection to provide power or a connection for the exchange of signals or data), such connections may be either wired or wireless. Although a particular combination of connections is depicted, it is contemplated that connections may be made between any elements of device 200 to assist in device 200’s function of assisting a user.
  • power source 206 is depicted without any connection lines to any other elements. However, it is contemplated that power source 206 would be connected to any elements of device 200 requiring power.
  • Processor 202 may be any suitable processing unit that may assist in performing the functions of device 200.
  • processor 202 may be configured to receive data from one or more inputs, such as memory 204, sensors 220, 222, 224, wireless network component 208, cellular component 210, and GPS component 212, to process data from such inputs, and to output data to, e.g., audio output 214, tactile output 216, visual output 218, wireless network component 208, cellular component 210, GPS component 212, and memory 204.
  • Device 200 may include multiple processors.
  • processor 202 may be a central processor, and other processors may be specific to other components.
  • cellular component 210, GPS component 212, and one or more of sensors 220, 222, 224 and the outputs may have dedicated processors.
  • device 200 may include multiple processors 202 operating in parallel.
  • Memory 204 may be one or more types of digital memory, and may serve to store various types of data.
  • memory 204 may store one or more sets of instructions for processor 202, and/or data sent and received by processor 202 and other components of device 200.
  • Memory 204 may include storage memory and/or processing memory.
  • storage memory may store instructions for processor 202 (e.g., in the form of one or more programs), as well as data gathered from sensors 220, 222, 224 and other components of device 200.
  • storage memory may store data and/or instructions for outputting to elements of device 200, such as audio output 214, tactile output 216, and visual output 218.
  • storage memory may store one or more series of navigation directions for a user of device 200 (e.g., user 120 of device 110) in the form of a sequence of vibrations, turns, or other outputs to tactile output 216.
  • storage memory may store audio recordings or audio cues for outputting to audio output 214.
  • storage memory may store instructions for outputting visual information to a visual output 218, such as instructions for a display or a series of lights.
  • Storage memory may also store inputs received from sensors 220, 222, 224. For example, if one of sensors 220, 222, 224 is a movement sensor, then storage memory may store a sequence of movements that are sensed by the movement sensor.
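  • As an illustrative sketch of such a stored direction sequence, the vocabulary and pulse timings below are hypothetical; they simply show how a route might be kept in storage memory and expanded into cues for tactile output 216.

```python
# Hypothetical vocabulary mapping each stored direction to a vibration
# pattern (pulse durations in seconds) for playback on tactile output 216.
TACTILE_VOCABULARY = {
    "left": [0.2, 0.2],        # two short pulses
    "right": [0.6],            # one long pulse
    "stop": [0.2, 0.2, 0.2],   # three short pulses
}

# A series of navigation directions as it might sit in storage memory.
stored_route = ["left", "right", "stop"]

def to_pulse_sequence(route):
    """Expand a stored route into the pulse timings the tactile output
    would play back, one pattern per direction."""
    return [TACTILE_VOCABULARY[step] for step in route]

pulses = to_pulse_sequence(stored_route)
```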
  • Storage memory may also include preprogrammed data and information relevant to turning on and operating device 200, such as operating system information, basic input-output system (BIOS) information, and the like.
  • storage memory may be referred to as read-only memory (ROM).
  • storage memory may be an electronic storage drive, such as a hard disk drive (HDD) or solid state drive (SSD).
  • Processing memory may be memory used actively by a processor (e.g., processor 202) while running.
  • processing memory may be used by processor 202 to dynamically track and analyze inputs received from other elements of device 200 (such as sensors 220, 222, 224).
  • processing memory may be used by processor 202 to queue a series of instructions stored in storage memory, so that the series of instructions may be provided to elements of device 200 quickly.
  • processor 202 may be able to access, read, and write to processing memory more quickly than to storage memory.
  • processing memory may be referred to as random-access memory (RAM).
  • processing memory may include dynamic RAM, static RAM, or both.
  • Device 200 may include as much or as little memory as desired or needed for the functioning and operation of device 200.
  • device 200 may have relatively little storage memory, and may access instructions, data, and other information stored remotely over a network (e.g., network 150). Having relatively little memory may allow for greater portability, affordability, and power efficiency of device 200.
  • device 200 may include substantial storage and processing memory. This may allow for device 200 to have faster operating speeds and more immediate access to a greater variety of instructions and data.
  • Power source 206 may be any source or sources of energy or charge to power device 200 and its elements. Power source 206 may be connected, either directly or indirectly, to each element of device 200 requiring power to function. Power source 206 may include, for example, a battery, such as a rechargeable or replaceable battery. In some embodiments, power source 206 may include one or more sockets, plug points, or connectable elements capable of connecting device 200 to an external power source. In some embodiments, power source 206 may include a gas-fueled engine. In some embodiments, power source 206 may include a solar power source, or a mechanical power source.
  • power source 206 may include a mechanical power generator coupled to one or more of the mobile parts.
  • power source 206 may include an engine and/or a motor.
  • power source 206 may include a combination of different power sources, such as a gas-powered engine and a battery.
  • device 200 may include a single power source. In other embodiments, device 200 may include multiple independent power sources.
  • Device 200 may also include one or more components that allow device 200 to have networking capabilities over a wireless local or wide area network, a cellular network, a GPS network, or other network. While three particular components (wireless network component 208, cellular component 210, and GPS component 212) are depicted in FIG. 2, device 200 may also have other networking components, such as a Bluetooth-compatible component or a radio component. It is contemplated that any of the networking components may be turned on or off, or activated/deactivated, such that a user may connect device 200 to selected types of networks or disconnect it from all types of networks.
  • Wireless network component 208 may be a component that allows elements of device 200 (such as processor 202) to send and receive data over wireless networks, such as wireless local area networks (LANs) and/or wide area networks (WANs).
  • wireless network component 208 may include a wireless card and/or antenna that allows processor 202 and/or other elements of device 200 to communicate over wi-fi networks.
  • Cellular component 210 may be any mobile transmissions component.
  • cellular component 210 may include a modem and a radio frequency antenna configured to be compatible with one or more mobile network standards, such as GSM, CDMA, or iDEN.
  • cellular component 210 may be associated with a cellular account held with one or more providers.
  • cellular component 210 may include a SIM card or other element containing account-identifying information.
  • cellular component 210 may be a mobile phone, including a speaker or earphone, a microphone, user inputs/outputs (such as buttons, voice command features, and/or a screen), a processor, memory, integrated circuits, and the like.
  • cellular component 210 may be independent from other elements of device 200, but may be integrated with other elements of device 200 by, e.g., a wireless or a wired connection.
  • device 200 may include a dock or port where cellular component 210 may be plugged in to device 200.
  • a body of device 200 may be a mobile phone body.
  • features of device 200 may be attached to device 200 as one or more add-on accessories, or may be physically independent from a body of device 200, but connected to device 200 by way of one or more wireless connections (e.g., a wi-fi or Bluetooth connection).
  • GPS component 212 may be a component capable of receiving information from GPS satellites. In some embodiments, GPS component 212 may be configured to receive information from GPS satellites and subsequently calculate the position of device 200 using that information. In some embodiments, GPS component 212 may be configured to provide positional data to a user in the absence of connections to other networks (e.g., wireless networks or cellular networks).
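  • For example, once GPS component 212 has a position fix, the distance and initial bearing to a destination can be computed with the standard haversine formula; the sketch below is illustrative and assumes coordinates in decimal degrees.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees) from a
    GPS fix (lat1, lon1) to a destination (lat2, lon2), using the standard
    haversine formula."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    distance = 2 * r * math.asin(math.sqrt(a))
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return distance, bearing

# About 2 km at a bearing of roughly 356 degrees (illustrative coordinates).
dist_m, brg_deg = distance_and_bearing(38.8895, -77.0353, 38.9072, -77.0369)
```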
  • Device 200 may be equipped with various types of output components in order to communicate with a user (e.g., user 120), or other aspects of an environment. Such output components may provide output in visual formats and non-visual formats, such as in audio formats or tactile formats. Output components may be configured to provide cues, instructions, queries, responses, directions, advice, coaching, teaching, and the like.
  • output components of device 200 may be configured to work in combination with sensors of device 200 (e.g., sensors 220, 222, 224), processor 202, and memory 204 to translate environmental cues into output in a format that is helpful to a user.
  • processor 202 may be configured to receive such input, translate such input into a series of audio or tactile cues, and instruct audio output 214 or tactile output 216 to output those cues to a user.
  • processor 202 and/or memory 204 may be configured to construct audio or tactile cues, e.g., by using synthetic vocal noises or by using a tactile language such as Braille, or other tactile code.
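  • As a sketch of one such tactile code, the snippet below maps a handful of characters to standard six-dot braille patterns; the dictionary is deliberately abbreviated and the function name is hypothetical.

```python
# A few letters of the standard six-dot braille alphabet, expressed as
# raised-dot numbers (dots 1-3 down the left column, 4-6 down the right).
BRAILLE_DOTS = {
    "a": (1,),
    "b": (1, 2),
    "c": (1, 4),
    "d": (1, 4, 5),
    "l": (1, 2, 3),
}

def to_braille_cells(text):
    """Translate text into dot patterns that an electronic braille
    component could raise, one cell per recognized character."""
    return [BRAILLE_DOTS[ch] for ch in text.lower() if ch in BRAILLE_DOTS]

cells = to_braille_cells("cab")  # [(1, 4), (1,), (1, 2)]
```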
  • Audio output 214 may be one or more of a speaker, a headphone set or set of earbuds, and/or a jack for a wired or wireless headphone, ear bud, speaker, or other wired or wireless communications device. Audio output 214 may be configured to output audio cues, instructions, queries, directions, advice, or other audio content to a user of device 200. Audio output 214 may be configured to interface with any other element of device 200, such as processor 202, cellular component 210, or GPS component 212. In some embodiments, audio output 214 may be connected to device 200 via a wireless connection, such as via wireless network connection 208, or via a Bluetooth connection.
  • Tactile output 216 may be any output of device 200 that can be sensed by touch or by movement.
  • tactile output 216 may include motorized moving parts, such as a handle or other part of device 200 that may move, a component that outputs vibrating cues, an electronic braille component, etc.
  • Tactile output 216 may be configured to communicate directions, instructions, responses, cues, and the like to a user in a non-visual, non-auditory manner.
  • tactile output 216 may move device 200, e.g., by vibrating device 200 or moving one or more wheels or other mobile parts of device 200, to direct or suggest to a user a particular movement.
  • tactile output 216 may provide a user with physical guidance in an environment. To this and other ends, tactile output 216 may be configured to receive instructions from, e.g., processor 202 or memory 204.
  • Visual output 218 may include, for example, a display, such as a screen, and/or lights.
  • visual output 218 may include a tablet, mobile phone, or other type of screen.
  • visual output 218 may include lights in a variety of colors and locations on device 200. It is contemplated that a visual output 218 may assist either a user of device 200, or a companion or other individual accompanying the user, to interact with various aspects of device 200.
  • Sensors 220, 222, 224 may include any number or type of components configured to receive input from the surroundings of device 200 and transmit such input to one another, to processor 202, and/or to other components of device 200.
  • Sensors 220, 222, 224 may include, for example, cameras, thermometers, heat sensors, proximity sensors (e.g., hall effect sensors), motion sensors (e.g., accelerometers), microphones, speedometers, odometers, balance/orientation sensors, health sensors (e.g., heart rate monitors, blood pressure monitors, body temperature monitors), and others.
  • Sensors 220, 222, 224 may also include other types of input devices, such as keyboards, buttons, and touchpads. In some embodiments, one or more of sensors 220, 222, 224 may be configured to work in tandem.
  • processor 202 may be configured to receive various inputs from sensors 220, 222, 224 and translate those inputs into information regarding an environment in which device 200 is located. While three sensors are depicted on device 200, device 200 may be configured to have any number of sensors such as sensors 220, 222, 224.
  • Modules 250, 270 represent exemplary groupings of elements of device 200. It is contemplated that two or more elements of device 200 may be grouped together physically, in terms of their functions, or both. For example, sensors 220, 222 are shown as being grouped together in module 250, and processor 202, wireless network component 208, cellular component 210, GPS component 212, memory 204, and power source 206 are grouped together in module 270. In some embodiments, sensors 220, 222 may be two sensors grouped together in a software context as module 250, to operate in tandem. As such, sensors 220, 222 may exchange and combine their data prior to forwarding the data to processor 202. In another embodiment, module 250 may be an independently movable part of device 200.
  • module 250 may be removable, replaceable, or repositionable on device 200, independently of module 270.
  • module 250 may be located at a different part of device 200 than module 270.
  • Both module 250 and module 270 are exemplary, and one of ordinary skill in the art will understand that other combinations of elements of device 200 into modules are possible as well.
  • The elements of device 200 may be configured to work together in a variety of ways.
  • device 200 may be configured to receive input from sensors 220, 222, 224, use processor 202 and memory 204 to process such input into information regarding the surroundings of device 200, and use that information to provide guidance to a user.
  • Providing guidance may include using, e.g., wireless network component 208, cellular component 210, and/or GPS component 212 to retrieve additional data regarding an environment in which device 200 is located, such as one or more maps, sets of directions, notifications regarding hazards, points of interest, and the like, and providing such additional data or parts of such additional data as output from device 200 to a user, in order to assist the user.
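  • The following Python sketch illustrates one possible shape of such a guidance step, combining a local sensor summary with remotely retrieved map and hazard data. All names (provide_guidance, fetch_map_data, emit) are hypothetical stand-ins, not an implementation of the disclosure.

```python
def provide_guidance(summary, fetch_map_data, emit):
    """Hypothetical guidance step: merge local sensor data with remotely
    retrieved map/hazard data and emit advice to the user."""
    extra = fetch_map_data()  # stands in for wireless/cellular/GPS retrieval
    for hazard in extra.get("hazards", []):
        # Warn only when the nearest obstacle reading is within the hazard radius.
        if summary.get("proximity", float("inf")) < hazard["radius_m"]:
            emit(f"Caution: {hazard['label']} ahead.")
    if "next_turn" in extra:
        emit(f"Turn {extra['next_turn']} in {extra['distance_m']} meters.")

provide_guidance(
    {"proximity": 0.6},
    lambda: {"hazards": [{"label": "wet floor", "radius_m": 2.0}],
             "next_turn": "left", "distance_m": 10},
    print,
)
```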
  • Device 200 and other devices of the present disclosure may have a variety of additional features.
  • device 200 may have an emergency feature, by which emergency services may be contacted through, e.g., cellular component 210.
  • Device 200 may be configured to contact emergency services if data from one or more of sensors 220, 222, 224 meets certain parameters (e.g., a heart rate is above or below a certain threshold). It will be apparent to those of skill in the art that other features and conveniences known in the art may also be incorporated into device 200.
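  • A minimal sketch of such a threshold check follows; the thresholds and function name are illustrative assumptions only, not medical guidance or a prescribed implementation.

```python
def heart_rate_emergency(bpm, low=40, high=150):
    """Hypothetical check: flag a reading outside a configured range, as
    device 200 might before contacting emergency services via cellular
    component 210. Threshold values here are arbitrary examples."""
    return bpm < low or bpm > high

assert heart_rate_emergency(35) is True    # below range: alert
assert heart_rate_emergency(72) is False   # normal: no alert
```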
  • FIGS. 3-6 depict additional exemplary embodiments of devices according to the present disclosure. Generally, the devices of FIGS. 3-6 show some specific implementations of device 200, in order to demonstrate how the elements of device 200 may be incorporated into objects that may offer a user added assistance or convenience.
  • As FIGS. 3-6 are exemplary, it is to be understood that any of these devices may be modified to have more or fewer parts, sensors, inputs, outputs, or other attributes.
  • FIG. 3 depicts an exemplary cane 300 according to aspects of the present disclosure.
  • Cane 300 may include a staff 320, a handle 340, a guidance module 350, and a wheel or other movement technology or device 360.
  • Cane 300 may be equipped with various sensors 322, 324, 326, a speaker 342, and a jack 344.
  • Cane 300 may be, for example, a walking stick, a support cane, a probing cane (also referred to as a “white cane” or a “long cane”), or any other type of cane.
  • Cane 300 may have the approximate dimensions of a support cane or probing cane, with the addition of features to assist a user of cane 300.
  • Staff 320 may be sized and configured for use by a user. In some embodiments, staff 320 may have a height, width, and weight to allow a user to comfortably hold cane 300 while walking. In some embodiments, staff 320 may be collapsible for ease of storage or transport when not in use.
  • Staff 320 may include in its body several sensors (e.g., sensors 322, 324, 326), and wiring between such sensors and guidance module 350.
  • Handle 340 may be grippable by a user of cane 300. Handle 340 may be approximately perpendicular to staff 320, as shown. In alternative embodiments, handle 340 may extend coaxially from staff 320. In further embodiments, handle 340 may have any size, shape, or configuration that allows a user to comfortably use cane 300. In some embodiments, handle 340 may be equipped with one or more sensors, such as a heart rate monitor. Sensors in handle 340 may be connected to guidance module 350 via wires in handle 340 and staff 320.
  • Guidance module 350 may contain a variety of elements to assist in receiving, sending, and processing information to guide a user of cane 300.
  • Guidance module 350 may contain one or more elements found in, e.g., device 200, such as a power source (e.g., power source 206), processor (e.g., processor 202), memory (e.g., memory 204), wireless capabilities, GPS capabilities, Bluetooth capabilities, inputs, outputs, and the like.
  • Guidance module 350 may be configured to receive information from sources in/on cane 300 (e.g., sensors 322, 324, 326, handle 340, or wheel 360), process such information locally and/or transmit such information to a remote processor, and output directions, guidance, or cues to a user based on received information.
  • guidance module 350 may be programmed locally by, e.g., inputting commands, directions, or instructions into guidance module 350 via one or more buttons, keys, or other inputs (not shown).
  • guidance module 350 may be coupled or connected to another device, such as a personal computer, phone, or tablet, into which instructions, directions, or other information may be input and transferred to guidance module 350.
  • guidance module 350 may remain connected to a personal device via, e.g., a wireless connection, while cane 300 is in use.
  • Wheel 360 may be a multidirectional wheel or other movement technology at the base of staff 320. Wheel 360 may be rotatable in any direction, such that a user of cane 300 may push or tap cane 300 in any direction as the user moves. In some embodiments, wheel 360 may be rotatable or propellable by, for example, a motor controlled by a module internal to cane 300. In such a manner, cane 300 may be, to some extent, self-propelled. Wheel 360 may be equipped with or coupled to one or more sensors, such as an odometer or a speedometer. Such sensors may be connected to guidance module 350 via wires extending through staff 320. In some embodiments, wheel 360 may be replaced with, e.g., a ball bearing, a slider, or a pad. In some embodiments, cane 300 may not include a wheel 360 and may have a flat or pointed base.
  • Sensors 322, 324, 326 may be, for example, any of the sensors described with respect to device 200.
  • One or more of sensors 322, 324, 326 may include cameras, so as to gather visible data around cane 300. Although three sensors are depicted on staff 320, it is contemplated that more or fewer sensors may be disposed on staff 320.
  • Speaker 342 may be located on, e.g., handle 340, and may be configured to output audio generated by guidance module 350 in the general direction of a user of cane 300.
  • Jack 344 may be, for example, a standard headphone jack into which headphones or an earpiece may be plugged, such that a user wearing the headphones or earpiece may hear auditory cues from guidance module 350.
  • When headphones or an earpiece are plugged into jack 344, speaker 342 may be automatically turned off.
  • speaker 342 and jack 344 may double as headset elements for making mobile calls from, e.g., guidance module 350, using a cellular or other mobile component of guidance module 350.
  • jack 344 may be configured to receive input from a microphone, or may be configured to accept input from, and provide output to, a combination headphone-and-microphone.
  • jack 344 may be a jack for wired or wireless headphones, wireless transmission/reception devices, speakers, or other device or devices.
  • FIG. 4 depicts an exemplary garment 400 according to aspects of the present disclosure.
  • Garment 400 may include a body 410, a camera 420, external sensors 422, 424, 426, internal sensor 428, a speaker 442, and a jack 444.
  • Garment 400 may be a wearable device configured with guidance elements to assist a person wearing the garment. It is contemplated that garment 400 may be worn as a part of daily wear, as a part of a uniform, as a part of tactical gear, or in any other manner. Garment 400 may be made of any fabric or material known in the art that can support the elements affixed to garment 400. In some embodiments, garment 400 may be made of an outerwear fabric, such as a waterproof polyester or nylon. In other embodiments, garment 400 may be made of cotton, leather, woven metal, or any other wearable material. Although garment 400 is depicted as being a vest, garment 400 may be styled in any manner.
  • garment 400 may include sleeves, a collar, a zipper, cuffs, an inner shirt, or any other stylistic details.
  • garment 400 may be any type of garment (e.g., trousers, shorts, a belt, etc.). As described further herein, garment 400 may be equipped with a variety of sensors to receive input about the surroundings of a user wearing the garment.
  • Camera 420 may be a video camera configured to retrieve visual information about the surroundings of a wearer of garment 400.
  • External sensors 422, 424, 426 may be any sensors configured to gather data from the surroundings of garment 400, such as temperature sensors, accelerometers, and other sensors that have been described elsewhere herein.
  • Internal sensor 428 may be a sensor configured to gather data about a wearer of garment 400, such as body temperature data, heart rate data, and other information.
  • Speaker 442 may be positioned at or near a shoulder of garment 400, such that output from speaker 442 may be heard by a wearer of garment 400.
  • a guidance module may transmit audio instructions and cues to speaker 442 for output to a wearer of garment 400.
  • Speaker 442 and jack 444 may be configured to operate with respect to a guidance module in garment 400 in a manner similar to speaker 342 and jack 344 of cane 300 with respect to guidance module 350.
  • FIG. 5 depicts an exemplary walker 500 according to aspects of the present disclosure.
  • Walker 500 may include a frame 520, wheels 521, a cushioned rest 522, a handle 524, a brake 526, a brake cord 527, a footrest 528, and a guidance module 530.
  • Guidance module 530 may include speaker 532, and one or more sensors 534, 560, 580.
  • Walker 500 may also include sensors 590, 592, 594, 596 located on various parts of walker 500.
  • Walker 500 may be a knee walker, such that a user may rest a knee on, or sit on, walker 500 while walker 500 is stationary or in motion.
  • Frame 520 may be made of a material strong enough to support a user, and light enough to allow for mobility.
  • frame 520 may be made of metal or a synthetic polymer.
  • frame 520 may be made of multiple different materials.
  • Wheels 521 may be movable and steerable by, e.g., a user pushing walker 500, and/or by handle 524.
  • wheels 521 may include full or partial motorized propulsion to aid in walker movement.
  • walker 500 may be self-propelled.
  • wheels 521 may be replaced by other parts granting walker 500 mobility, such as treads, ball bearings, and the like.
  • Rest 522 may be a support rest on which a user may sit or rest a limb. In some embodiments, rest 522 may be padded for comfort. In some embodiments, rest 522 may be customized or molded to specifically fit the body of a user. Footrest 528 may support the foot or feet of a user sitting on walker 500.
  • Handle 524 may be configured to steer the front and/or back wheels of walker 500 when turned by a user.
  • handle 524 may be fully or partially guided by commands from guidance module 530.
  • handle 524 may have a fully or partially motorized turning mechanism connected to guidance module 530.
  • Brake 526 may be, e.g., a hand brake that may control whether one or more wheels of walker 500 may roll via, for example, brake cord 527.
  • brake 526 may be an electronically-controlled brake that may be controlled by, for example, guidance module 530. In such embodiments, brake 526 may be connected to guidance module 530.
  • Brake cord 527 may be configured to apply a braking force to a wheel of walker 500.
  • an inner wire in brake cord 527 may extend between brake 526 and a wheel (or other mobility part) of walker 500. When brake 526 is engaged, the inner wire may be pulled, thus engaging a clamp on a wheel of walker 500. In some embodiments of walker 500, brake 526 may not need a brake cord 527. In such embodiments, brake 526 may be located directly on one or more wheels (or other mobility parts) of walker 500 and be activated electronically or by another mechanism.
  • Guidance module 530 may have any of the characteristics that guidance module 350 may have, and/or any of the elements of device 200. Connections to various other elements of walker 500 may run through portions of frame 520; for example, connections between guidance module 530 and handle 524, brake 526, speaker 532, and sensors 590, 592, 594, 596 may run inside hollow portions of frame 520. Guidance module 530 may also include several sensors in or on it, such as sensors 534, 560, 580. Speaker 532 on guidance module 530 may have characteristics and capabilities similar to other speakers disclosed with respect to other embodiments herein. In some embodiments, guidance to a user of walker 500 may be primarily auditory.
  • guidance to a user of walker 500 may be provided by physical cues, such as by handle vibrations from a motorized handle 524, slight turns to the handle, application of brake 526, or other physical cues provided by walker 500.
  • Guidance module 530 may also include, e.g., safety-oriented programming, such that any physical cues provided by walker 500 to a user will not endanger the safety of the user.
  • Each sensor 534, 560, 580, 590, 592, 594, 596 may be affixed to, or of a piece with, walker 500.
  • each sensor may be affixed to or welded to frame 520, and may be of any sensor type described elsewhere herein (e.g., with respect to sensors 220, 222, 224 of device 200).
  • multiple different types of sensors may be affixed to, welded to, strapped to, or otherwise attached to frame 520 of walker 500, guidance module 530, the wheels, or other parts of walker 500.
  • Sensors may be configured to track the movement, speed, acceleration, and physical surroundings of walker 500.
  • sensors may be configured to receive voice and/or manual input from a user of walker 500.
  • sensors 534, 560, 580, 590, 592, 594, 596 may include a microphone.
  • sensors may also be incorporated into, for example, handle 524, in order to track physical characteristics of a user of walker 500 (e.g., heartrate).
  • FIG. 6 depicts an exemplary wheelchair 600 according to aspects of the present disclosure.
  • Wheelchair 600 may include one or more modules 620, 630, 638.
  • Module 630 may include a speaker 632.
  • Wheelchair 600 may include one or more sensors, such as sensors 636, 640.
  • Wheelchair 600 may be a motorized wheelchair, such that it is partially or fully self-propelled and/or self-steering, or a non-motorized wheelchair.
  • Modules 620, 630, 638 may be various modules making up a guidance module having characteristics similar to one or more modules of device 200.
  • one of modules 620, 630, 638 may include a guidance module, similar to, e.g., module 270 of device 200, or guidance module 350 of cane 300.
  • a guidance module may include, e.g., a processor, memory, wireless connection components, etc.
  • one or more of modules 620, 630, 638 may include a power source for providing motorized propulsion to wheelchair 600.
  • Modules 620, 630, 638 may include one or more sensors to track speed, acceleration, orientation, and other characteristics of wheelchair 600.
  • Such sensors, and sensors 636, 640 may be of any sensor type described elsewhere in the present disclosure.
  • one or more of the sensors may include proximity sensors, accelerometers, cameras, balance/orientation sensors, and the like. It is contemplated that the modules, sensors, and other elements of wheelchair 600 may be connected to one another by wired or wireless connections.
  • Speaker 632 may be positioned so as to provide audible output to a user of wheelchair 600.
  • speaker 632 may be on an arm of wheelchair 600.
  • speaker 632 may be disposed on, e.g., module 620, such that it would be positioned near the head of a user of wheelchair 600.
  • speaker 632 may be combined with, e.g., a headphone jack or other wireless communication device (e.g., a Bluetooth device) to allow for a user to receive audio cues through headphones or an ear piece.
  • one or more of modules 620, 630, 638 may include a microphone input, so as to allow a user to provide audio input to one or more modules of wheelchair 600.
  • FIG. 7 depicts an exemplary movable case 650 according to aspects of the present disclosure.
  • Case 650 may include a body 652 with wheels 674, an arm 654, and a handle 656.
  • Case 650 may include modules 658, 670, and one or more sensors 660, 666, 668, 672.
  • Module 658 may further include a speaker 662 and a jack 664.
  • Case 650 may be a self-propelling case, and/or may be configured to be pushed or pulled by a user, or move alongside the user.
  • Case 650 may be ruggedized such that it can climb inclines, stairs, etc., and such that it can survive a variety of weather and environmental conditions. Although depicted in an inclined position, it is contemplated that case 650 will be capable of standing, moving, and turning in an upright position (e.g., with all wheels or other forms of propulsion on the ground).
  • case 650 may be motorized or may include some other form of self-propulsion, such that it is partially or fully self-propelled and/or self-steering.
  • Case 650 may be made of lightweight, durable materials, such as plastic, metal, or woven material (e.g., fabric, woven nylon, polyester, etc.).
  • body 652 of case 650 may include a motor or other form of propulsion in, e.g., guidance module 670.
  • body 652 may be openable, such that modules inside (e.g., module 670) may be added or removed from its interior.
  • body 652 may include storage space.
  • body 652 may not be openable.
  • Body 652 may have any size or shape suitable for holding one or more modules.
  • Wheels 674 may be any type of wheels, treads, ball bearings, or any other type of part or technology that grants mobility to case 650. In some embodiments, wheels 674 may be retractable. Wheels 674 may be configured in an arrangement to allow case 650 to rest on wheels 674 stably, without assistance or support from a user. In some embodiments, wheels 674 may be operable if fewer than all of wheels 674 are contacting a surface (e.g., the ground).
  • wheels 674 may each be motorized or self-propelled, and may be configured to turn or self-propel upon direction from a user, or from a guidance module (e.g., module 658 or 670).
  • Arm 654 may be extendable or collapsible, and/or may be adjustable, including to a variety of lengths, widths, shapes, or sizes to suit and/or function with a particular user’s height, limbs, and the like.
  • arm 654 may be rotatable.
  • a user twisting handle 656 may be able to rotate arm 654.
  • a motor or other movement device or technology in body 652 may be configured to twist or rotate arm 654, to indicate to a user a direction in which the user should turn.
  • Arm 654 may be made of a durable material, such as metal or plastic.
  • arm 654 may be made of a flexible material, such as a woven cord or rope.
  • Handle 656 may be configured to be held by a user of case 650.
  • handle 656 may be vibrated or rotated via, e.g., an internal motor controlled by one or more modules, such as module 658 or module 670.
  • handle 656 may be fitted with buttons or other means of communication so that the user may send and receive communication, data, etc. to and from case 650 and its module 670.
  • Modules 658, 670 may be various modules making up a computer, artificial intelligence and/or guidance modules having characteristics similar to one or more modules of device 200.
  • one of modules 658, 670 may include a guidance module, similar to, e.g., module 270 of device 200, or guidance module 350 of cane 300.
  • a guidance module may include, e.g., a processor, memory, wireless connection components, etc.
  • one or more of modules 658, 670 may include a power source for providing motorized or other means of propulsion to case 650.
  • Modules 658, 670 may include one or more sensors to track speed, acceleration, orientation, and other characteristics of case 650.
  • Such sensors, and sensors 660, 666, 668, 672 may be of any sensor type described elsewhere in the present disclosure.
  • one or more of the sensors may include proximity sensors, accelerometers, cameras, balance/orientation sensors, and the like. It is contemplated that the modules, sensors, and other elements of case 650 may be connected to one another by wired or wireless connections.
  • Speaker 662 may be positioned so as to provide audible output to a user of case 650.
  • speaker 662 may be on module 658, in proximity to handle 656 of case 650.
  • speaker 662 may be combined with, e.g., jack 664, to allow for a user to receive audio cues through headphones, an ear piece, or other wired or wireless communications device.
  • one or more of sensors 660, 666, 668, 672 may include a microphone input, so as to allow a user to provide audio input to various parts of case 650.
  • FIG. 8 depicts steps in a method 700 for enhancing user-environment interactions according to aspects of the present disclosure. It should be noted that in some alternative implementations, the features and/or steps described may occur out of the order depicted in the figures or discussed herein. For example, two steps or figures shown in succession may instead be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
  • In step 702, data may be received regarding the surroundings of a device.
  • In step 704, memory of a device may be updated with the received data.
  • In step 706, one or more directions may be generated to a user based on the received data.
  • In step 708, additional data may be received regarding device surroundings, in response to a change in location.
  • In step 710, memory may be updated with additional received data.
  • In step 712, one or more additional directions may be generated to a user based on the additional received data.
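  • Read together, steps 702 through 712 form a sense-store-direct loop. The Python sketch below renders that loop in schematic form; every callable is a hypothetical stand-in for a device component, not part of the disclosure.

```python
def run_method_700(sense, update_memory, generate_directions, changed, cycles=1):
    """Schematic rendering of method 700 (steps 702-712)."""
    data = sense()                        # step 702: receive data
    update_memory(data)                   # step 704: update memory
    generate_directions(data)             # step 706: generate directions
    for _ in range(cycles):
        if changed():                     # e.g., a change in location
            data = sense()                # step 708: receive additional data
            update_memory(data)           # step 710: update memory again
            generate_directions(data)     # step 712: generate new directions

memory = []
run_method_700(
    sense=lambda: {"location": "hallway"},
    update_memory=memory.append,
    generate_directions=lambda d: print("Proceed through the", d["location"]),
    changed=lambda: True,
)
```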
  • In step 702, data may be received regarding the surroundings of a device.
  • This data may include, for example, information from sensors on the device (e.g., sensors 220, 222, 224 of device 200), information transmitted over a network to network components of a device (e.g., wireless network component 208, cellular component 210, GPS component 212 of device 200), or information from memory of a device (e.g., memory 204).
  • the data may be visual data, auditory data, data regarding motion, temperature, location, time, proximity to objects, locations, or people, data regarding the health or status of a user of the device, or data regarding a goal, program, instruction, or destination from the user or from a third party.
  • the data may be input into the device by, e.g., a user of a device, or may be automatically sensed, retrieved, or otherwise received.
  • the data may be received at the device.
  • the data may be received remotely— e.g., at a computer or a server bank in the same location as, or different location from, the device.
  • the data may be transmitted over one or more connections before being received.
  • the data may be pushed to the device or remote location by, e.g., an external signal, cue, or program.
  • the data may be pushed to the remote location from the device.
  • the data may be pushed to the device from a remote location, such as a remote computer or entity.
  • the device may be instructed to actively retrieve the data using one or more sensors (e.g., sensors 220, 222, 224 of device 200).
  • the data may be processed before or after being received. For example, data from one or more sources may be combined, synthesized, sorted, encrypted, decrypted, or interpreted. In some embodiments, a location, date, time, or other information may be associated with the data.
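  • As one concrete (and purely hypothetical) form of such pre-processing, a reading might be wrapped with a timestamp and location before storage or transmission:

```python
import time

def tag_reading(value, location=None):
    """Hypothetical pre-processing: associate a timestamp and optional
    location with raw data before it is stored or transmitted."""
    return {"value": value, "timestamp": time.time(), "location": location}

print(tag_reading(0.6, location=(38.8816, -77.0910)))
```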
  • In step 704, memory of a device may be updated with the received data.
  • the received data may be transmitted to memory 204 of device 200 (either processing memory or storage memory).
  • the received data may be transmitted to memory remote to a device, such as servers 160 in system 100.
  • In step 706, one or more directions may be generated to a user based on the received data.
  • This step may include parsing the received data to obtain information relevant to the user, analyzing the parsed information, and generating one or more directions.
  • the received data may include any type of data about the environment of a device or the user of the device.
  • the received data may include a series of images of an environment, which may be parsed or interpreted by, e.g., a program, to determine paths of travel, hazards, obstacles, and a destination.
  • the received data may include one or more maps, which may be interpreted to determine the user’s location on the map, the user’s destination(s), points of interest, and the like.
  • the received data may include one or more sound samples, health indicators (e.g., a heart rate, blood pressure rate, etc.), or other pieces of information that may indicate that a user is under stress or is in danger of becoming overstressed.
  • the information may be analyzed to determine one or more courses of action that might be communicated to the user of the device. For example, if the information includes a location and nature of a hazard, a course of action may be to report the hazard to the user, or to retain information about the hazard in order to report the hazard to the user if the user comes within a given distance of the hazard. If the information includes a destination, then a course of action may be to find and communicate to the user a route to the destination.
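  • The hazard-reporting course of action described above can be sketched as a simple distance rule. The sketch below is an assumption-laden illustration; the names and the 15-meter radius are invented for the example.

```python
import math

def hazard_course_of_action(user_xy, hazard_xy, report_radius_m=15.0):
    """Report a known hazard once the user comes within a given distance;
    otherwise retain it for later. Coordinates are in meters."""
    distance = math.dist(user_xy, hazard_xy)
    return "report" if distance <= report_radius_m else "retain"

print(hazard_course_of_action((0, 0), (3, 4)))    # distance 5.0 -> 'report'
print(hazard_course_of_action((0, 0), (30, 40)))  # distance 50.0 -> 'retain'
```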
  • a course of action may be to determine and communicate to the user one or more methods to regulate the user’s health status (e.g., by sitting down, taking calming breaths, or running through various physical, verbal, or mental exercises), or to communicate to a third party (e.g., a user’s medical professional, coach, family member, friend, or companion) an alert regarding the user’s health status.
  • a processor of the device (e.g., processor 202 of device 200) may interpret the information.
  • a remote processor or series of processors may interpret the information.
  • a processor may compare the information to information in one or more programs (e.g., medical, psychological, psychiatric, counseling, or coaching programs) to interpret the information.
  • Such programs may be provided by, for example, a medical professional, a psychologist or psychiatrist, a counselor, a coach, a companion, or a family member.
  • programs may be created, stored, and/or made available on one or more remote servers (e.g., servers 160 in system 100) or local computers (e.g., computer 140 in system 100).
  • Generating the directions may include outputting one or more signals, cues, or instructions to a user or other entity from a device. Instructions to output one or more signals, cues, or other outputs may be generated at the device, or may be communicated to the device from a remote location. For example, one or more instructions may be read out loud by an electronic voice from a speaker on a device. Instructions may include directions to physically navigate a space (e.g., to make a turn after a certain number of steps), or to change a user’s status (e.g., to stand up, sit down, perform exercises, take deep breaths, or assume a certain position).
  • Instructions may also include directions to perform one or more physical or mental routines (e.g., calming, focusing, or stress-relieving routines).
  • a device may output a tactile cue, such as a vibration, or may swivel a wheel to indicate a direction in which a user should move.
  • generating the directions may include changing a status of the device. For example, if the device is moving, then the direction may include causing the device to stop moving or slow down by, e.g., applying a brake, or causing the device to accelerate by, e.g., applying power to wheels of the device. As another example, if the device is traveling, then the direction may include changing a direction of travel of the device.
  • signals, cues, or outputs may be generated at the device in a visual manner (e.g., on a screen, or by one or more lights).
  • generating the directions may additionally or alternatively include contacting an entity other than the user.
  • generating the directions may include contacting the communication device of a doctor, aide, friend, or family member of a user (e.g., communication device 170 of system 100) via a cellular or other mobile connection, or contacting one or more services (e.g., services 190 of system 100).
  • generating the directions may include allowing an entity other than the user to communicate with the user (e.g., by establishing a voice call, video call, or one-way transmission of voice or video to or from the user, or to/from a doctor, aide, friend, or family member of the user).
  • In step 708, additional data may be received regarding device surroundings, in response to a change in location. Additional data may be received in any manner that data may be received according to step 702. A change in location may include a transplantation of the device from one set of surroundings to another, or may be a smaller change, such as a change in the device’s orientation, or a movement of one meter, one half of a meter, or even a few centimeters. Alternatively, additional data may be received regarding device surroundings in response to a change that is not location-based.
  • additional data may be received regarding device surroundings in response to sensor data showing a change in a user status (e.g., a change in data received from a heart rate or blood pressure monitor), in response to a change in goals or priorities programmed into the device, or other change.
  • additional data may be received in response to a change in a device’s orientation, acceleration, or environment (e.g., a change from a hot environment to a cold one, or vice versa).
  • additional data may be received regarding device surroundings independent of any change.
  • additional data may be received as a function of time (e.g., periodic updates to data).
  • a user or other entity may compel the retrieval of additional data.
  • the additional data received may be any type of data that was received according to step 702, or may be a different type of data.
  • the type of data received according to step 708 may depend on the directions generated to the user according to step 706. For example, if a user was directed to a door as a part of step 706, then the type of data received according to step 708 may be visual data, to confirm the appearance of a door in front of a device at the user’s location.
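  • One way to picture this dependency is a small lookup from the last direction issued to the kind of confirming data requested next; the table below is invented purely for illustration.

```python
def confirmation_data_type(last_direction):
    """Hypothetical mapping from a previously issued direction to the type
    of data requested in step 708 to confirm it (e.g., visual data to
    confirm a door is now in front of the device)."""
    table = {
        "door_ahead": "visual",
        "turn_left": "orientation",
        "slow_down": "speed",
    }
    return table.get(last_direction, "proximity")

print(confirmation_data_type("door_ahead"))  # visual
```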
  • In step 710, memory may be updated with additional received data. This may be accomplished in any of the manners described with respect to step 704. Updating memory may or may not include overwriting or erasing out-of-date data and replacing it with more current data. For example, if a map is stored in memory and the additional received data includes information that would update the map, updating the memory may include changing the map stored in memory to reflect new data.
  • In step 712, one or more additional directions may be generated to a user based on the additional received data. This may be accomplished in any manner described with respect to step 706.
  • the one or more additional directions may be similar to, or different from, the directions generated as a part of step 706.
  • one or more sets of ongoing directions may be paused, slowed down, or stopped based on the additional data.
  • Steps 702 through 712 may be repeated in or out of sequence to provide a user with a series of generated directions over time.
  • a user following the directions may receive guidance from the device generating the directions, where the directions are updated periodically (e.g., in real time, every second or few seconds, every minute or few minutes, or dynamically depending on user movement, physical status, mental status, or other factors).
  • users who are unable to take in data, information, or cues from the environment in certain formats may benefit from such guidance over time.
  • any of the above described steps may be performed by, e.g., a user, the user’s device, or an individual, computer, or entity authorized to communicate with the device.
  • a user may receive assistance both from the user’s device and from other individuals and entities authorized to provide assistance to the user.
  • any of a number of individuals and entities may contribute to determining what data is received by a device, how data should be interpreted, and what information may be relevant to a user.
  • a user, the user’s doctor or medical facility, and/or the user’s family may be able to provide parameters for what health data may affect a course of action taken by the device.
  • a user or other entity may set a heart rate or blood pressure range, and if data showing a heart rate or blood pressure outside of the range is received, the user may specify that the user, the user’s doctor, or the user’s family should be alerted.
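  • A minimal sketch of such a user-configured alert rule follows, assuming invented parameter names and an invented recipient list.

```python
def route_health_alert(bpm, hr_range=(50, 120), notify=print):
    """Hypothetical rule: if a heart-rate reading falls outside the range
    set by the user (or doctor/family), alert the designated parties."""
    low, high = hr_range
    if not (low <= bpm <= high):
        for party in ("user", "user's doctor", "user's family"):
            notify(f"Alert to {party}: heart rate {bpm} bpm is outside {low}-{high} bpm.")

route_health_alert(135)
```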
  • a user or other authorized entity may be able to specify a maximum range of motion or travel, thus affecting guidance provided by the device to the user.
  • a user or other authorized entity may be able to input into the device any number of goals, programs, and other information that may change what directions are provided by the device.
  • the plurality of goals, instructions, and/or parameters may be prioritized based on general principles to, e.g., keep a user safe, bring a user to a particular destination, ensure that the user remains within a certain area, and the like.
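  • Such prioritization could be as simple as a rank ordering over goal categories; the categories and ranking below are illustrative assumptions, not a prescribed scheme.

```python
def order_goals(goals):
    """Hypothetical prioritization: safety first, then destination,
    then area boundaries, then everything else."""
    rank = {"safety": 0, "destination": 1, "boundary": 2}
    return sorted(goals, key=lambda g: rank.get(g["category"], 3))

goals = [
    {"category": "destination", "text": "reach room 204"},
    {"category": "safety", "text": "avoid the stairs"},
    {"category": "boundary", "text": "stay on the conference floor"},
]
print([g["text"] for g in order_goals(goals)])
# ['avoid the stairs', 'reach room 204', 'stay on the conference floor']
```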
  • methods according to the present disclosure may be fully customized to a particular user or a particular set of priorities.
  • Example 1 A case is configured to assist a blind individual with navigating through a variety of environments.
  • the case includes a video camera, a microphone, a speedometer, an odometer, and proximity sensors configured to receive input from the user’s environment and forward the input to the processor.
  • the case is also motorized and capable of self-steering.
  • the case also includes a speaker.
  • the speaker is configured to output audible directions to the individual (in the form of either a synthesized voice or a series of sound, vibration, heat, or other cues).
  • the handle is configured to rotate or vibrate upon instruction by the processor.
  • the case is capable of guiding the user through and around obstacles using these and other methods of communication.
  • the case may also act as a means of physical support upon which the user may lean or otherwise transfer his or her weight.
  • the case also includes a guidance device having a processor and memory configured to receive input from, and send output to, the user’s environment and other locations using video technology, microphone, speedometer, odometer, proximity sensors, cellular or other mobile capabilities, processors, and wireless network capabilities.
  • the case processor is configured to send output to the speaker, a motor of the case, and a third party (such as a coach, physician, or family member).
  • the processor is also equipped with wireless network and cellular or other mobile capabilities.
  • the memory of the guidance device is programmed with a plurality of maps of locations of interest to the individual.
  • the maps may include at least one destination of interest to the individual, so that the device may guide the user to a destination that may be programmed before the time of travel.
  • the processor is configured to receive voice input from the microphone and translate the received input, via vocal recognition software, into an instruction to provide direction to a destination of interest at any time during the trip.
  • the processor is configured to identify the destination of interest and determine a route to the destination of interest using the maps stored in the memory of the guidance device. If the destination of interest is not identifiable using the stored maps, the processor is configured to use its cellular capabilities, other mobile capabilities, or wireless network capabilities to query a remote computer having a database of additional maps for the destination of interest.
  • the processor is configured to download the relevant additional map or maps containing the destination of interest.
  • the processor is configured to output audible directions via the speaker, and/or tactile directions via rotations, heat, or vibrations, in order to guide the individual to the destination of interest.
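  • The local-first, remote-fallback map lookup described in this example might look like the following sketch; find_route, query_remote, and the map structure are invented for illustration and do not reflect an actual implementation.

```python
def find_route(destination, local_maps, query_remote):
    """Search maps already in memory first; if the destination is not
    found, query a remote map database, cache the downloaded map, and
    route from it (mirroring the fallback described above)."""
    for m in local_maps:
        if destination in m["places"]:
            return m["places"][destination]
    remote_map = query_remote(destination)    # cellular/wireless query
    if remote_map is not None:
        local_maps.append(remote_map)         # cache the additional map
        return remote_map["places"][destination]
    return None

local_maps = [{"places": {"lobby": ["exit the room", "turn right"]}}]
fake_remote = lambda dest: {"places": {dest: ["take the elevator", "floor 2"]}}
print(find_route("cafe", local_maps, fake_remote))  # ['take the elevator', 'floor 2']
```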
  • the device is also configured to receive signals from any physically installed sensors, beacons, or transmitting devices that may be encoded to send data to the case processor.
  • the processor of the device may update one or more maps (e.g., in real time), and may simultaneously inform the user about the environment (e.g., guide the user to the destination of interest).
  • Example 2 The individual of Example 1 attends a conference in a hotel with the case of Example 1.
  • the hotel provides a computer on its premises, the computer being connected to a wireless network.
  • the hotel has also installed sensors and beacons with a variety of capabilities at various locations throughout the user environment, e.g., conference rooms with specified names, hotel rooms by number, or the restaurant and bar.
  • the breakfast bar may have sticker sensors that describe the food in each serving dish in the breakfast bar.
  • the hotel computer includes the user’s reservation and personalized electronic schedule for this individual user attending the conference, including a repository of maps containing potential destinations of interest corresponding to the electronic schedule, all of which are available for wireless download by the individual either before or during the trip.
  • the potential destinations of interest include locations of events within the conference (e.g., particular panels, plenaries, networking events, lunch locations, and seating assignments) that are also reflected in the repository of maps.
  • In addition to locations for the conference, information on local events, menus, ads, specials, etc. may be included to assist the user in enjoying his or her current or future stay.
  • the hotel also includes a plurality of signal sensors, beacons, etc. which are readable by, and provide location or other information to, the proximity sensors of the case. These sensors may signal the beginning of a hallway, the location of a desired room, or provide general information, e.g., that the user is approaching a particular location, such as a health club.
  • the case receives this kind of information in addition to obstacle data received by the device sensors (e.g., that another individual is approaching in the hallway or a suitcase or other navigable obstacle is in the user’s path).
  • Upon arriving at the hotel with the individual and being connected to the wireless network, the guidance device of the case automatically wirelessly downloads the electronic schedule and repository of maps from the hotel computer. Using the video camera, microphone, speedometer, odometer, and proximity sensors, the guidance device determines a current location of the case and the individual. Upon a vocal command input from the individual, the processor of the guidance device references the electronic schedule to find upcoming destinations of interest, and outputs an audible list of upcoming destinations of interest via the speaker. The individual selects one of the upcoming destinations of interest, either vocally or by other means of communication with the device. Then, the processor of the guidance device accesses the downloaded repository of maps, finds the destination of interest on the maps, and determines a route to the selected destination of interest.
  • the processor then outputs a series of audible directions via the speaker and tactile directions via the motorized handlebar to guide the individual to the selected destination of interest.
  • the processor receives input from the video camera, microphone, speedometer, odometer, downloaded data, and proximity sensors to determine the changing location of the user.
  • the guidance device outputs a signal to the individual upon arriving at the selected destination of interest.
  • Example 3 As a result of aging, a user has limited use of his or her right lower leg, and the user has cognitive disabilities, including anxiety and difficulty navigating in new environments. However, the user’s physician determines that exercise will benefit the user.
  • the user’s coach or family member installs sensors, beacons, or devices, which may be physical or digital (e.g., data points included in the maps downloaded to the user’s walker device, such as walker 500). The user may use the walker to navigate along the determined path or walking route, which may begin from the user’s home or another location.
  • the device may communicate with the coach, who may be located at a different location, such as the coach’s home or workplace.
  • the user may intentionally contact the coach for assistance through the device, or the device may be programmed to contact the coach for assistance if the user is off the designated path (i.e., the user is lost) or if the user’s vital signs are not within a designated range (e.g., the user has an accelerated heart rate or another condition indicating anxiety or a medical emergency).
  • the handlebar of the device may measure vital signs.
  • Example 4 A user has cognitive disabilities that prevent the user from making his or her own meals. The user is unable to follow written cooking directions and forgets safety measures, such as turning off the stove.
  • a garment according to the present disclosure (e.g., garment 400) is programmed with a step-by-step verbal coaching tool to guide the user through these or similar tasks. Step by step, the user is instructed what to do, including safety information such as a reminder to turn off the stove. The user acknowledges the action is complete, either verbally or by other means, such as a button. Similar to other examples, the device may be programmed to contact a coach or family member if the user fails to communicate acknowledgement, or if vital signs or other sensory data indicate an emergency requiring the coach’s assistance.
  • Example 5 A typically-abled individual is attending a conference at a large conference campus and wants to take advantage of the enabling technologies described above.
  • Using a hand-held device (e.g., device 110), the user may download hotel maps, schedules of events, guidance to rooms and sessions, etc.
  • the user may also gain information through audio or other methods about the event that will enhance the user’s experience. This is especially useful in large conference settings, university campuses, etc.


Abstract

A system for enhancing an environment interaction, including a server, a user assist device to receive an input from an environment and generate an output, the user assist device assisting a user to navigate the environment, and a network for connecting the server with the device. The device includes wheels, a sensor for receiving the input from the environment, a processor to control the device, and an output device configured to generate the output based on a control from the processor, including controlling the wheels to move automatically in a specified direction.

Description

METHODS AND APPARATUS FOR ENHANCING USER-ENVIRONMENT
INTERACTIONS
Cross-Reference to Related Application
[001] This application claims the benefit of U.S. Provisional Application No.
62/631,537, filed February 16, 2018, the disclosure of which is incorporated herein by reference in its entirety.
Field of the Disclosure
[002] Various embodiments of the present disclosure relate to systems, devices, and methods for enhancing user-environment interactions. More specifically, embodiments of the present disclosure relate to systems, devices and methods for collecting information about a user’s environment and providing a user with feedback, instructions, and/or directions in relation to the user’s environment. In some aspects, embodiments of the present disclosure may assist a user to navigate its environment.
INTRODUCTION
[003] Individuals may have specific needs and circumstances affecting their capacity to interact with, and function within, their surroundings. For example, differently-abled individuals may require information, assistance, guidance, and/or cues in order to navigate, operate within, or function in environments ranging from the familiar (e.g., a home, school, or place of work) to the unfamiliar (e.g., a hotel, a city in a foreign country, place of worship, library, a convention center, transportation hub (e.g., airport, bus station, train terminal) etc.). As another example, individuals traveling in unfamiliar locations, locations where no familiar language is spoken, locations where signage is sparse, or environments with poor or minimal visibility may benefit from additional information, guidance, or cues to operate in such locations and environments. Moreover, many navigational systems including maps, global positioning system (GPS) devices, cell phone and other navigating software applications (“apps”) and guidance systems often operate under the assumption that a user is abled (e.g., is not physically or mentally impaired, or is not differently-abled) and/or has the ability to rely on their sight. For example, while a sighted individual may be able to look at a map or a set of directions in order to navigate through their environment, blind or partially-sighted individuals may have limited options for navigating, interacting with, or functioning in the same environments.
[004] Most environments do not remain completely static— e.g., the locations of paths, roads, and obstacles may change— resulting in confusion for individuals who obtain insufficient contextual information about their environments. Moreover, current systems and methods for allowing differently-abled individuals to interact with environments may not provide such individuals with adequate amounts or types of feedback to allow such individuals to operate independently from the guidance, coaching, or assistance of other individuals.
BRIEF DESCRIPTION OF THE DRAWINGS
[005] The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate various exemplary embodiments and, together with the description, serve to explain the principles of the disclosed embodiments. The drawings show different aspects of the present disclosure and, where appropriate, reference numerals illustrating like structures, components, materials and/or elements in different figures are labeled similarly. It is understood that various combinations of the structures, components, and/or elements, other than those specifically shown, are contemplated and are within the scope of the present disclosure.
[006] There are many inventions described and illustrated herein. The described inventions are neither limited to any single aspect nor embodiment thereof, nor to any combinations and/or permutations of such aspects and/or embodiments. Moreover, each of the aspects of the described inventions, and/or embodiments thereof, may be employed alone or in combination with one or more of the other aspects of the described inventions and/or embodiments thereof. For the sake of brevity, certain permutations and combinations are not discussed and/or illustrated separately herein. Notably, an embodiment or implementation described herein as “exemplary” is not to be construed as preferred or advantageous, for example, over other embodiments or implementations; rather, it is intended to reflect or indicate that the embodiment(s) is/are “example” embodiment(s).
[007] FIG. 1 depicts in schematic form an exemplary system for enhancing user- environment interactions, according to aspects of the present disclosure.
[008] FIG. 2 depicts in schematic form a device for use in enhancing user- environment interactions, according to aspects of the present disclosure.
[009] FIG. 3 depicts an exemplary cane for use in enhancing user-environment interactions, according to aspects of the present disclosure.
[010] FIG. 4 depicts an exemplary garment for use in enhancing user-environment interactions, according to aspects of the present disclosure.
[011] FIG. 5 depicts an exemplary rolling walker for use in enhancing user- environment interactions, according to aspects of the present disclosure.
[012] FIG. 6 depicts an exemplary wheelchair for use in enhancing user- environment interactions, according to aspects of the present disclosure.
[013] FIG. 7 depicts an exemplary case for use in enhancing user-environment interactions, according to aspects of the present disclosure.
[014] FIG. 8 depicts steps in an exemplary method for enhancing user-environment interactions, according to aspects of the present disclosure.
DETAILED DESCRIPTION
[015] Embodiments of the present disclosure relate to devices, systems, and methods for enhancing user-environment interactions. Embodiments of the present disclosure may relate to, e.g., systems and methods for navigating an environment, interacting with aspects of an environment, reacting to an environment, or functioning in an environment by, e.g., guiding or coaching a user. Embodiments of the present disclosure may include devices that send and/or receive information and/or transmissions between one or more guidance devices and one or more items, objects, sensors, beacons, data collection/distribution devices, and/or physical or digital markers in an environment. For example, embodiments of the present disclosure may relate to systems and methods for enhancing user-environment interactions without the need to see the environment. As another example, embodiments of the present disclosure may relate to systems and methods for enhancing user-environment interactions for users having one or more physical or mental limitations. As a further example, embodiments of the present disclosure may relate to systems and methods for gathering and combining information about an environment and using the gathered and combined information for providing a user with feedback, guidance, navigation, coaching, or other directions. Some embodiments of the present disclosure may include a mobile device and one or more separate sensors, beacons, or other devices that may act in concert to gather and combine information and provide a user with feedback, guidance, navigation, coaching, or other directions.
[016] In some cases, devices and methods of the present disclosure may be configured to enable disabled users, users with physical or cognitive impairments, or typically-abled users desiring enabling technology, to enhance the experience of their environment, to travel, move, work, etc. more efficiently. Disabilities or impairments may be physical, emotional, or cognitive.
[017] As used herein, the terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, device, system, or apparatus that comprises a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. The term “exemplary” is used in the sense of “example,” rather than “ideal.” In addition, the terms “first,” “second,” and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish an element, a structure, a step or a process from another. Moreover, the terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of one or more of the referenced items.
[018] The terms “about” or “approximately” as used herein with respect to a value may refer to a variation of 10% above or below the stated value. Additionally, while a number of objects and advantages of the embodiments disclosed herein (and variations thereof) are described, not necessarily all such objects or advantages may be achieved in accordance with any particular embodiment. Thus, for example, those skilled in the art will recognize that the systems and techniques described herein may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
[019] Reference will now be made to the figures. Any features depicted or described with respect to one figure may be added to, combined with, or otherwise used with any other figure or embodiment.
[020] FIG. 1 depicts an exemplary system 100 according to aspects of the present disclosure. In system 100, a device 110 may be associated with a user 120 in a location 115. The device 110 may be connected to a routing device 130. The routing device 130 may be connected to, e.g., a computer 140 local to the routing device 130 and/or a network 150. Network 150 may, in turn, be connected to servers 160, a communication device 170, a satellite 180, and/or one or more services 190.
[021] System 100 may include any collection of networking devices, computing devices, individuals, service providers, etc. that may be connected with one another via wired communication lines, wireless networks, signaling pathways, signaling networks, and the like. System 100 may include the collection of elements depicted in FIG. 1, but may additionally or alternatively include fewer than or more than all of the depicted elements. In embodiments, system 100 may include multiples of any of the elements depicted in FIG. 1 (e.g., multiple devices 110, multiple collections of servers 160, multiple communication devices 170, multiple satellites 180, or multiple services 190). Furthermore, while the elements of system 100 are depicted as being connected in a particular pattern, it is contemplated that elements of system 100 may be connected via any number of direct or indirect lines or connections that may be wired, wireless or otherwise. System 100 may also include multiple elements beyond those depicted in FIG. 1.
[022] Aspects of system 100, including devices, sensors, monitors, or other technology that transmits or receives signals, communications, or data, may be owned, managed or controlled by a variety of individuals, companies, organizations, governments, or other entities. Interactions between the elements of system 100 may also be managed or controlled by a variety of individuals, companies, governments, organizations, or other entities. In some embodiments, a single entity may be permitted by other entities to use communication lines or connections between aspects of system 100 for a particular purpose, such as operating device 110.
[023] Device 110 may be any device configured to gather information about the surroundings of the device and/or a user of the device, and to output feedback, guidance, navigation, coaching, or other directions to a user (e.g., user 120) of the device. Device 110 may include any number of features or elements that assist in gathering information and/or outputting directions. For example, device 110 may include one or more internal or external inputs, sensors, and/or monitors, such as buttons, keys, a microphone, motion sensors, light sensors, proximity sensors, temperature sensors, or sensors or monitors configured to gather data about a user (e.g., user 120), such as heart rate monitors, blood sugar monitors, blood pressure monitors, and the like. As a further example, device 110 may include one or more networking, transmission, or communication elements, such as wireless networking capabilities, Bluetooth capabilities, cellular or other mobile capabilities, or radio frequency capabilities.
[024] In some embodiments, device 110 may be configured to move or travel along with a user (e.g., user 120) in an environment. For example, device 110 may be a hand-held device, such as a personal computer, a tablet, or a phone. In some embodiments, device 110 may be a hand-held device that is separate from a personal computer, tablet, or phone. In some embodiments, device 110 may have one or more straps, handles, sleeves, ties, buckles, or other features designed to improve its portability. For example, device 110 may include an armband, a belt, a holster, a harness, or other wearable element. In some embodiments, device 110 may include one or more wheels, treads, ball bearings, or other mobility-granting devices. In some embodiments, device 110 may be lightweight, such that it may be lifted up and down by an individual. In further embodiments, device 110 may be, or may be integrated with, other items, such as clothing, tools or vehicles for disabled individuals, tools for individuals in a military or police force, vehicles, jewelry, tools or accessories for hiking or camping, pet accessories (such as a harness or a leash for a service animal), backpacks, bags, or suitcases. In some embodiments, device 110 may be customizable or customized, in terms of its size, shape, appearance, inputs, outputs, or any other characteristics. In some embodiments, for example, device 110 may be customized for a particular user 120.
[025] In some embodiments, device 110 may be configured to relay data either continuously or intermittently to other elements of system 100, such as user 120, computer 140, network 150, servers 160, or services 190. In some embodiments, device 110 may also be configured to receive a variety of data from other elements of system 100 (e.g., user 120, computer 140, satellite 180, services 190), such as instructions, maps, directions, audio recordings, messages, advice, alerts, etc. For example, device 110 may include sensors (e.g., sensors 220, 222, 224 depicted with respect to device 200 in FIG. 2) to allow device 110 to receive a variety of data from other elements of system 100, or from markers, beacons, or other elements of location 115 configured to emit data or a signal. In some embodiments, device 110 may be configured to store such data. FIG. 2 depicts elements of an exemplary device 110 in further detail, and is described further below. FIGS. 3-6 depict further exemplary devices 110 or items incorporating devices 110 according to aspects of the present disclosure.
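By way of non-limiting illustration, the following Python sketch shows one way a device might buffer readings and relay them intermittently, as described above. It is a minimal sketch only: the names TelemetryRelay and transmit, and the interval and buffer values, are hypothetical assumptions and are not part of the disclosed design.

    import time
    from collections import deque

    class TelemetryRelay:
        """Buffer readings locally and relay them upstream at intervals."""
        def __init__(self, transmit, interval_s=30, max_buffer=1000):
            self.transmit = transmit      # callable forwarding data, e.g., toward servers 160
            self.interval_s = interval_s  # relay period; a small value approximates continuous relay
            self.buffer = deque(maxlen=max_buffer)
            self._last_sent = time.monotonic()

        def record(self, reading):
            self.buffer.append(reading)   # keep data locally between relays
            if time.monotonic() - self._last_sent >= self.interval_s:
                self.flush()

        def flush(self):
            if self.buffer:
                self.transmit(list(self.buffer))
                self.buffer.clear()
            self._last_sent = time.monotonic()

Because the relay interval is a tunable parameter, a single mechanism of this kind can cover both the continuous and the intermittent transmission modes described above.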
[026] User 120 may be an individual, a computer, a group of individuals, a company, or other entity receiving information from device 110. In some embodiments, user 120 may be an individual needing or wanting assistance in navigating in or functioning in an environment. In some embodiments, user 120 may be a disabled or differently-abled individual. For example, in some embodiments, user 120 may have one or more physical or mental disabilities and/or limitations. In some embodiments, user 120 may be a blind or partially-sighted individual. In further embodiments, user 120 may be sighted. In further embodiments, user 120 may be an abled individual.
[027] In some embodiments, user 120 may be an individual who is serving in a military, police, humanitarian, search-and-rescue, or guide capacity. In some embodiments, user 120 may be a traveler, regardless of physical and/or mental capabilities. In further embodiments, user 120 may be an elderly individual. In further embodiments, user 120 may be robotic (e.g., a drone) or have robotic elements (e.g., may have or include one or more computer processors and/or mechanical parts). For example, user 120 may be an artificial intelligence, a computer, or a group of computers.
[028] Location 115 may be any location or environment in which assistance is wanted or needed for user 120. For example, location 115 may be an indoor location or an outdoor location. In some embodiments, location 115 may be a part or all of a house, hotel, conference center, store, resort, residential community (e.g., a retirement community), transit center (e.g., an airport or bus port), stadium, shopping center, neighborhood, or town. In some embodiments, location 115 may be a park, a garden, a yard, a campground, or wilderness. In further embodiments, location 115 may be a vehicle, such as an airplane, train, boat, or automobile. In some embodiments, location 115 may be an environment with which user 120 is not familiar. In further embodiments, location 115 may be an environment with which user 120 is familiar (e.g., a workplace, home, or neighborhood of user 120).
[029] In some embodiments, location 115 may include, for example, one or more points of interest to a user (e.g., user 120). Such points of interest may include, e.g., locations at which user 120 may receive assistance or information, or may include hazards, events, and the like. In some embodiments, location 115 may have some method by which to identify such points of interest. In some embodiments, an administrator, owner, company, or other entity may provide identification of points of interest to either device 110 or to a repository of information such as a database in servers 160 (or otherwise), such that those points of interest may be accessible to user 120 of device 110.
[030] Routing device 130 may be, for example, a computer, a wireless router, a modem, a Bluetooth receiver, a switchboard, or other device configured to relay signals, data, and/or information between elements of system 100. In some embodiments, for example, routing device 130 may include a wireless router having a wireless connection to device 110, computer 140, and/or one or more networks, such as network 150. In some embodiments, routing device 130 may include a docking station and/or charging connection for device 110. In further embodiments, routing device 130 may be integrated into another element of system 100, such as computer 140. Some embodiments of system 100 may not include a routing device 130.
[031] Computer 140 may be any type of computer that includes memory and/or one or more processors. For example, computer 140 may be a personal computer, a server computer, or a handheld computing device. In some embodiments, computer 140 may be a computer sharing a location (e.g., location 115) with device 110 and user 120. For example, in some embodiments, computer 140 may be owned or controlled by user 120. In further embodiments, computer 140 may be owned or controlled by another individual or entity at location 115. For example, if location 115 is a hotel, then computer 140 may be a hotel computer. In yet further embodiments, computer 140 may be integrated with device 110 and/or routing device 130.
[032] Computer 140 may include information, data, or instructions useful to device 110 and/or user 120. In some embodiments, computer 140 may be used to configure device 110 by, e.g., providing device 110 with data, information, and/or instructions via a wired or wireless connection (e.g., by “pushing” data to device 110 directly or via routing device 130). In other embodiments, computer 140 may store data accessible to device 110 via a direct or indirect query from device 110 (e.g., a “pull” function of device 110). In some embodiments, computer 140 may be configured to connect directly to device 110 (e.g., via a wired or wireless connection) for exchanging data and/or power. In some embodiments, computer 140 may include a docking station and/or charging connection for device 110. [033] In some embodiments, computer 140 may be a personal and/or portable computer owned and/or operated by user 120. In some embodiments, user 120 may use computer 140 simultaneously with device 110. In further embodiments, computer 140 may operate in tandem with device 110 to, e.g., record data, analyze data, and/or provide feedback to device 110 with respect to user 120, location 115, or other elements of system 100. In still further embodiments, an individual or entity other than user 120 may operate computer 140 independently from device 110.
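The “push” and “pull” exchanges described above can be pictured with a short Python sketch. This is an assumption-laden illustration only; the names ConfigurableDevice, receive_push, and pull are invented for the example.

    class ConfigurableDevice:
        def __init__(self):
            self.data = {}

        def receive_push(self, key, value):
            # computer 140 "pushes" data (e.g., a map or instructions) to the device
            self.data[key] = value

        def pull(self, key, source):
            # the device "pulls" missing data by querying, e.g., computer 140
            if key not in self.data:
                self.data[key] = source.get(key)
            return self.data[key]

Here, source may be any mapping-like store (a dict in the simplest case) standing in for computer 140 or servers 160.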
[034] While user 120 and device 110 may be in proximity to, e.g., a routing device 130 or computer 140, in alternate embodiments, user 120 and device 110 may not be within any proximity or range of routing device 130 or computer 140. In some embodiments, system 100 may not include a computer 140.
[035] Network 150 may be any wired or wireless electronic network, such as a local area network or wide area network (e.g., the internet). In some embodiments, network 150 may include various types of data connections, such as wired connections, fiber optic connections, wireless connections, satellite connections, cellular or other mobile connections, and the like. Network 150 may also include any number of computers, digital storage devices and memory connected via one or more wired or wireless networks. In some embodiments, network 150 may include “cloud” storage.
[036] Servers 160 may include one or more computers configured to send, receive, store, and process data transmitted between one or more computers, databases, networks, and the like. In particular, servers 160 may receive, store, and modify data relevant to one or more devices, such as device 110. In some embodiments, servers 160 may store data that is potentially useful to a user 120 of device 110. For example, servers 160 may store data relevant to navigating a location of user 120, such as location 115. Such data may include directions, maps and geographical data, cues, instructions, alerts, points of interest, descriptive data, coaching or assistive tools, etc. Servers 160 may be configured to make such data available to device 110 via, for example, network 150 or other authorized users, or may make such data available automatically.
[037] In some embodiments, servers 160 may receive and store data collected by device 110. In some embodiments, servers 160 may store data collected by multiple devices, including device 110. In some embodiments, servers 160 may provide additional or backup storage for device 110. In further embodiments, servers 160 may provide a searchable database of information usable by device 110.
[038] In some embodiments, servers 160 may perform one or more processes using data collected by device 110 in location 115. In some embodiments, data may be collected or provided by one or more sensors, beacons, monitors, or other communication devices located in location 115. For example, using data collected and forwarded by device 110 from one or more sensors of device 110, servers 160 may perform one or more analyses to determine what instructions, cues, or other data should be returned to device 110, synthesize such instructions, cues, or other data, and/or return such data to device 110. For example, device 110 may provide servers 160 with its GPS coordinates, a plurality of images taken from the location, and with a request for directions. Servers 160 may use the GPS coordinates and the plurality of images to create (or find, in a database) a map of the location of device 110. In some embodiments, servers 160 may be configured to push such data to device 110. In other embodiments, servers 160 may await a cue to return such data to device 110, such as via a “pull” notification or other request.
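A server-side handler of the kind described, which accepts a GPS fix and images and returns directions, might be sketched in Python as follows. The function names and the placeholder map and route logic are hypothetical; a real service would substitute genuine mapping and routing components.

    def handle_direction_request(gps, images, destination, map_db):
        """Return directions, creating and caching a map for the location if needed."""
        key = (round(gps[0], 4), round(gps[1], 4))  # coarse key so nearby fixes share a map
        map_data = map_db.get(key)
        if map_data is None:
            map_data = build_map(gps, images)       # synthesize a map from the imagery
            map_db[key] = map_data                  # cache it for later requests
        return plan_route(map_data, gps, destination)

    def build_map(gps, images):
        # placeholder: a real implementation might run image matching or photogrammetry
        return {"origin": gps, "feature_count": len(images)}

    def plan_route(map_data, gps, destination):
        # placeholder: a single straight-line leg from the current fix to the destination
        return [{"from": gps, "to": destination}]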
[039] In embodiments where servers 160 store data collected by multiple devices, servers 160 may synthesize data and/or perform analyses using the data from the multiple devices. For example, servers 160 may generate maps, instructions, alerts, statistics, directions, points of interest, commands, etc. that may be relevant to one or more users, such as user 120, and may store the generated data and/or analyses. As another example, servers 160 may generate metrics using the data from multiple devices by, for example, performing statistical analyses using the data from multiple devices. As a further example, servers 160 may compare data received from one device to data received from another device to, e.g., calibrate one of the devices. In some embodiments, after performing a process, servers 160 may publish, push, or otherwise make available the results of such a process.
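Two of the multi-device operations mentioned above, fleet-wide statistics and pairwise calibration, can be illustrated with a brief Python sketch; the function names and the constant-bias calibration model are assumptions made for the example.

    import statistics

    def fleet_metrics(readings_by_device):
        """Pool readings from multiple devices and compute simple statistics."""
        pooled = [r for readings in readings_by_device.values() for r in readings]
        return {"mean": statistics.mean(pooled), "stdev": statistics.pstdev(pooled)}

    def calibration_offset(device_readings, reference_readings):
        # estimate a constant bias in one device by comparison against another
        return statistics.mean(a - b for a, b in zip(device_readings, reference_readings))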
[040] In some embodiments, servers 160 may synthesize or analyze data received from device 110 and make such synthesized data or analyses available to a third party, such as an individual or an organization. For example, servers 160 may make syntheses or analyses available to a medical professional, a hospital, a family member of user 120, a nonprofit organization, a government organization, or other entity. In some embodiments, servers 160 may be equipped with applicable privacy safeguards to avoid the spread of sensitive personal health information. For example, servers 160 may include restricted access, or data on servers 160 may be encrypted.
[041] Communication device 170 may be any device capable of interfacing with network 150, receiving one or more types of information regarding device 110 and/or user 120, and/or generating one or more alerts. In some embodiments, communication device 170 may be associated with user 120 (e.g., communication device 170 may be a personal computer, tablet, or phone associated with user 120). In further embodiments,
communication device 170 may be associated with one or more third parties.
[042] A third party may be any party designated to receive information from device 110. In some embodiments, a third party may be designated to provide input to device 110 to assist a user 120, if needed. For example, a third party may be a medical professional, a hospital, a family member or friend of user 120, a company or member of a company, a member of a government body, a member of a military or police force, a colleague of user 120, a member of a helpline, etc. In some embodiments, a third party may be a companion to user 120, such as an aide, nurse, or technician.
[043] Device 110 may be configured to contact communication device 170 on command by user 120 (e.g., via a telephonic call, other voice or video call, or by sending an alert to communication device 170 via network 150). In some embodiments, device 110 may be configured to contact communication device 170 automatically upon certain events. For example, if data collected by device 110 reflects abnormal patterns, such as a prolonged and unusual lack of movement or input by user 120, or abnormal health monitor readouts (e.g., an abnormal heart rate or other monitored symptom of user 120), then a third party may be alerted to check the status of user 120.
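One possible form of such an automatic check is sketched below in Python. The thresholds are illustrative assumptions only; the disclosure does not specify numeric values.

    import time

    HEART_RATE_RANGE = (40, 150)   # assumed bounds, in beats per minute
    MAX_IDLE_SECONDS = 30 * 60     # assumed window for "prolonged lack of movement"

    def check_for_alert(heart_rate, last_motion_time, now=None):
        """Return a reason for alerting a third party, or None if all is normal."""
        now = time.time() if now is None else now
        if not HEART_RATE_RANGE[0] <= heart_rate <= HEART_RATE_RANGE[1]:
            return "abnormal heart rate"
        if now - last_motion_time > MAX_IDLE_SECONDS:
            return "prolonged lack of movement or input"
        return None

When a non-None reason is returned, the device could notify communication device 170 over network 150 using any of the transports described above.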
[044] In some embodiments, communication device 170 may include a program, application, or other feature that allows it to connect to device 110 or servers 160 via network 150. For example, communication device 170 may include an application that allows user 120 or a third party to log in to a secure or encrypted system (e.g., on servers 160) and access current data regarding device 110, either from device 110 directly or from servers 160.
[045] It is contemplated that system 100 may include multiple communication devices, allowing for multiple third parties to contact device 110 or access servers 160 via a communication device. In some embodiments, device 110 or servers 160 may include a list of communication devices or third party information reflecting an approved group of devices and third parties that may contact device 110 or access servers 160.
[046] Satellite 180 may be any communications satellite configured to collect, receive and/or broadcast data or information to and/or from device 110. For example, satellite 180 may be a GPS satellite configured to broadcast location information to a GPS receiver on device 110. Satellite 180 may, for example, broadcast data allowing a device 110 or other device to construct a map for use by device 110 with respect to user 120. [047] Services 190 may include, for example, individuals or institutions capable of providing a service to user 120 of device 110 upon request or alert. For example, services 190 may include emergency services, such as police services or emergency medical care. As another example, services 190 may include navigational or troubleshooting services, which may be able to provide device 110 or user 120 with navigational, travel, coaching, monitoring, or technical assistance upon request. In some embodiments, services 190 may be reachable by placing a call to services 190 via device 110 or another communication device.
A call may be placed by any input method, such as voice command, dial-in, or by pressing a customized button on device 110. Alternatively, a call may be placed automatically.
[048] In some embodiments, services 190 may be alerted automatically if device 110 experiences an event. For example, if device 110 registers an abnormal heart rate of user 120, then device 110 may automatically alert some services 190 via network 150. As another example, if a sensor, input, or output on device 110 fails, then device 110, servers 160, or communication devices 170 may automatically alert services 190. In some embodiments, multiple alerts may be sent to multiple services 190, and/or a combination of services 190 and communication devices 170. In some embodiments, a record of such automatic alerts may be made either locally on device 110 or remotely, on, e.g., servers 160.
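The fan-out of one alert to several recipients, together with the record keeping described above, might look like the following sketch; the Recipient class and its notify method are stand-ins invented for the example.

    import datetime

    class Recipient:
        """Stand-in for a service 190 or communication device 170 on network 150."""
        def __init__(self, name):
            self.name = name
        def notify(self, event):
            print(f"[{self.name}] alert: {event}")  # placeholder for a real network call

    def raise_alert(event, recipients, log):
        """Send one event to every approved recipient and record the alert."""
        record = {"event": event,
                  "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                  "sent_to": [r.name for r in recipients]}
        for recipient in recipients:
            recipient.notify(event)
        log.append(record)  # the log could live locally on device 110 or on servers 160
        return record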
[049] In some embodiments, services 190 may be authorized to provide step-by- step directions to device 110 to assist user 120 with navigating in, or functioning in, location 115. For example, services 190 may be authorized to receive sensor data from device 110 in order to locate, orient, and guide user 120 to a particular destination. Services 190 may be authorized to provide verbal directions to user 120 via network 150 or, alternately, a series of tactile instructions for user 120 to follow. In the case of an emergency, services 190 may be authorized to receive precise location and other data from device 110, in order to best provide assistance to user 120 of device 110. [050] The connections depicted by dotted lines in FIG. 1 may be any suitable connections known to those of ordinary skill in the art. For example, the connections may include any type of wireless connection (e.g., wi-fi, satellite signal, cellular, radio, Bluetooth) or wired connection (e.g., wired telephone or cabled connection).
[051] FIG. 2 depicts, in schematic form, an exemplary device 200 according to aspects of the present disclosure. Device 200 may include a processor 202, memory 204, a power source 206, several network-capable components such as a wireless network component 208, a cellular component 210, and a GPS component 212, and a plurality of outputs such as an audio output 214, a tactile output 216, and a visual output 218. Device 200 may also include a plurality of sensors 220, 222, 224. Components may be grouped together into modules, such as module 250 and module 270.
[052] Device 200 is a broad, exemplary generalization of the types of devices that are contemplated by the present disclosure. In general, device 200 may be any device suitable for use in system 100 to assist user 120. For example, device 110 in system 100 may be a device 200. Any and/or all characteristics and features of device 110 described above may be characteristics and features of any other version of device 200. As will be apparent to one of ordinary skill in the art, device 200 may have an expansive array of sizes, shapes, or collection of sizes and shapes (e.g., a combination of various-sized and -shaped modules).
For example, FIGS. 3-6 depict several exemplary configurations. Thus, the depiction of device 200 is intended to show one exemplary combination of elements that a device according to the present disclosure may have, but many more combinations of elements and characteristics are possible. For example, while device 200 is depicted with only one processor 202 and only one power source 206, device 200 may, in some embodiments, have multiple processors and multiple power sources. This applies to each element of device 200. [053] Device 200 is schematically depicted as a single rectangular unit.
Optionally, elements of device 200 may be divided into multiple units having any shape or size and being disposed in one housing, or in multiple different housings. In cases where elements of device 200 are divided, elements of device 200 may communicate with one another via wired connections, wireless connections, Bluetooth, cellular connections, or any other wired or wireless connections. Parts of device 200 may be strapped, sewn, attached, or otherwise affixed together or to other objects, items, devices, or individuals, including placement in the user-environment in any form or manner that is useful to a user (e.g., user 120). In some embodiments, for example, device 200 may include multiple parts arranged in a pattern throughout an environment.
[054] Some connections between elements of device 200 are depicted with solid lines. Depending on the purpose of the connection (e.g., a connection to provide power or a connection for the exchange of signals or data), such connections may be either wired or wireless. Although a particular combination of connections is depicted, it is contemplated that connections may be made between any elements of device 200 to assist in device 200’s function of assisting a user. Notably, power source 206 is depicted without any connection lines to any other elements. However, it is contemplated that power source 206 would be connected to any elements of device 200 requiring power.
[055] Processor 202 may be any suitable processing unit that may assist in performing the functions of device 200. For example, processor 202 may be configured to receive data from one or more inputs, such as memory 204, sensors 220, 222, 224, wireless network component 208, cellular component 210, and GPS component 212, to process data from such inputs, and to output data to, e.g., audio output 214, tactile output 216, visual output 218, wireless network component 208, cellular component 210, GPS component 212, and memory 204. [056] Device 200 may include multiple processors. In such embodiments, processor 202 may be a central processor, and other processors may be specific to other components. For example, cellular component 210, GPS component 212, and one or more of sensors 220, 222, 224 and the outputs may have dedicated processors. In other embodiments, device 200 may include multiple processors 202 operating in parallel.
[057] Memory 204 may be one or more types of digital memory, and may serve to store various types of data. For example, memory 204 may store one or more sets of instructions for processor 202, and/or data sent and received by processor 202 and other components of device 200.
[058] Memory 204 may include storage memory and/or processing memory. In some embodiments, storage memory may store instructions for processor 202 (e.g., in the form of one or more programs), as well as data gathered from sensors 220, 222, 224 and other components of device 200. For example, storage memory may store data and/or instructions for outputting to elements of device 200, such as audio output 214, tactile output 216, and visual output 218. For example, storage memory may store one or more series of navigation directions for a user of device 200 (e.g., user 120 of device 110) in the form of a sequence of vibrations, turns, or other outputs to tactile output 216. As another example, storage memory may store audio recordings or audio cues for outputting to audio output 214. As a further example, storage memory may store instructions for outputting visual information to a visual output 218, such as instructions for a display or a series of lights. Storage memory may also store inputs received from sensors 220, 222, 224. For example, if one of sensors 220, 222, 224 is a movement sensor, then storage memory may store a sequence of movements that are sensed by the movement sensor. Storage memory may also include preprogrammed data and information relevant to turning on and operating device 200, such as operating system information, basic input-output system information (BIOS), and the like. In some embodiments, storage memory may be referred to as read-only memory (ROM). In some embodiments, storage memory may be an electronic storage drive, such as a hard disk drive (HDD) or solid state drive (SSD).
[059] Processing memory may be memory used actively by a processor (e.g., processor 202) while running. For example, processing memory may be used by processor 202 to dynamically track and analyze inputs received from other elements of device 200 (such as sensors 220, 222, 224). As another example, processing memory may be used by processor 202 to queue a series of instructions stored in storage memory, so that the series of instructions may be provided to elements of device 200 quickly. In some embodiments, processor 202 may be able to access, read, and write to processing memory more quickly than to storage memory. In some embodiments, processing memory may be referred to as random-access memory (RAM). In some embodiments, processing memory may include dynamic RAM, static RAM, or both.
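The division of labor between slower storage memory and faster processing memory, including the queueing of instructions noted above, can be pictured with a small Python sketch; the dict and deque below are stand-ins for the two kinds of memory, and the route name and cues are invented for the example.

    from collections import deque

    def preload_cues(storage, route_id, queue_size=16):
        """Copy the next few stored cues into a fast in-memory queue."""
        return deque(storage.get(route_id, [])[:queue_size])

    # the dict plays the role of storage memory; the deque plays the role of
    # processing memory from which cues can be emitted with minimal latency
    storage = {"route-1": ["forward 10 m", "turn left", "forward 5 m"]}
    cue_queue = preload_cues(storage, "route-1")
    next_cue = cue_queue.popleft()  # yields "forward 10 m"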
[060] Device 200 may include as much or as little memory as desired or needed for the functioning and operation of device 200. In some embodiments, for example, device 200 may have relatively little storage memory, and may access instructions, data, and other information stored remotely over a network (e.g., network 150). Having relatively little memory may allow for greater portability, affordability, and power efficiency of device 200. In other embodiments, device 200 may include substantial storage and processing memory. This may allow for device 200 to have faster operating speeds and more immediate access to a greater variety of instructions and data.
[061] Power source 206 may be any source or sources of energy or charge to power device 200 and its elements. Power source 206 may be connected, either directly or indirectly, to each element of device 200 requiring power to function. Power source 206 may include, for example, a battery, such as a rechargeable or replaceable battery. In some embodiments, power source 206 may include one or more sockets, plug points, or connectable elements capable of connecting device 200 to an external power source. In some embodiments, power source 206 may include a gas-fueled engine. In some embodiments, power source 206 may include a solar power source, or a mechanical power source. For example, in an embodiment of device 200 where device 200 includes one or more wheels, treads, ball bearings, or other mobile parts, power source 206 may include a mechanical power generator coupled to one or more of the mobile parts. In further embodiments, power source 206 may include an engine and/or a motor. In some embodiments, power source 206 may include a combination of different power sources, such as a gas-powered engine and a battery. In some embodiments, device 200 may include a single power source. In other embodiments, device 200 may include multiple independent power sources.
[062] Device 200 may also include one or more components that allow device 200 to have networking capabilities over a wireless local or wide area network, a cellular network, a GPS network, or other network. While three particular components (wireless network component 208, cellular component 210, and GPS component 212) are depicted in FIG. 2, device 200 may also have other networking components, such as a Bluetooth- compatible component or a radio component. It is contemplated that any of the networking components may be turned on or off, or activated/deactivated, such that a user may connect device 200 to selected types of networks or disconnect it from all types of networks.
[063] Wireless network component 208 may be a component that allows elements of device 200 (such as processor 202) to send and receive data over wireless networks, such as wireless local area networks (LANs) and/or wide area networks (WANs). In some embodiments, wireless network component 208 may include a wireless card and/or antenna that allows processor 202 and/or other elements of device 200 to communicate over wi-fi networks. [064] Cellular component 210 may be any mobile transmissions component. For example, cellular component 210 may include a modem and a radio frequency antenna configured to be compatible with one or more mobile network standards, such as GSM, CDMA, or iDEN. In some embodiments, cellular component 210 may be associated with a cellular account held with one or more providers. In some embodiments, cellular component 210 may include a SIM card or other element containing account-identifying information.
[065] In some embodiments, cellular component 210 may be a mobile phone, including a speaker or earphone, a microphone, user inputs/outputs (such as buttons, voice command features, and/or a screen), a processor, memory, integrated circuits, and the like. In such cases, cellular component 210 may be independent from other elements of device 200, but may be integrated with other elements of device 200 by, e.g., a wireless or a wired connection. For example, device 200 may include a dock or port where cellular component 210 may be plugged in to device 200.
[066] In some embodiments, a body of device 200 may be a mobile phone body.
In such embodiments, features of device 200 that may not be typically found in a mobile phone (such as one or more of sensors 220, 222, 224) may be attached to device 200 as one or more add-on accessories, or may be physically independent from a body of device 200, but connected to device 200 by way of one or more wireless connections (e.g., a wi-fi or Bluetooth connection).
[067] GPS component 212 may be a component capable of receiving information from GPS satellites. In some embodiments, GPS component 212 may be configured to receive information from GPS satellites and subsequently calculate the position of device 200 using that information. In some embodiments, GPS component 212 may be configured to provide positional data to a user in the absence of connections to other networks (e.g., wireless networks or cellular networks). [068] Device 200 may be equipped with various types of output components in order to communicate with a user (e.g., user 120), or other aspects of an environment. Such output components may provide output in visual formats and non-visual formats, such as in audio formats or tactile formats. Output components may be configured to provide cues, instructions, queries, responses, directions, advice, coaching, teaching, and the like.
[069] In some embodiments, output components of device 200 may be configured to work in combination with sensors of device 200 (e.g., sensors 220, 222, 224), processor 202, and memory 204 to translate environmental cues into output in a format that is helpful to a user. For example, if one or more of sensors 220, 222, 224 is capable of providing visual input to processor 202, processor 202 may be configured to receive such input, translate such input into a series of audio or tactile cues, and instruct audio output 214 or tactile output 216 to output those cues to a user. In some embodiments, processor 202 and/or memory 204 may be configured to construct audio or tactile cues, e.g., by using synthetic vocal noises or by using a tactile language such as Braille, or other tactile code.
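The translation of a visually sensed obstacle into paired audio and tactile cues might be sketched as follows; the cue format and the pulse-count encoding are assumptions made for the example, not a disclosed encoding.

    def translate_obstacle(distance_m, bearing_deg):
        """Map a visually detected obstacle to one audio cue and one tactile cue."""
        side = "left" if bearing_deg < 0 else "right"
        audio = f"Obstacle {distance_m:.0f} meters ahead on your {side}."
        # nearer obstacles produce more vibration pulses on the matching side
        pulses = max(1, 5 - int(min(distance_m, 5)))
        return audio, {"side": side, "pulses": pulses}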
[070] Audio output 214 may be one or more of a speaker, a headphone set or set of earbuds, and/or a jack for a wired or wireless headphone, ear bud, speaker, or other wired or wireless communications device. Audio output 214 may be configured to output audio cues, instructions, queries, directions, advice, or other audio content to a user of device 200. Audio output 214 may be configured to interface with any other element of device 200, such as processor 202, cellular component 210, or GPS component 212. In some embodiments, audio output 214 may be connected to device 200 via a wireless connection, such as via wireless network component 208, or via a Bluetooth connection.
[071] Tactile output 216 may be any output of device 200 that can be sensed by touch or by movement. For example, tactile output 216 may include motorized moving parts, such as a handle or other part of device 200 that may move, a component that outputs vibrating cues, an electronic braille component, etc. Tactile output 216 may be configured to communicate directions, instructions, responses, cues, and the like to a user in a non-visual, non-auditory manner. In some embodiments, tactile output 216 may move device 200, e.g., by vibrating device 200 or moving one or more wheels or other mobile parts of device 200, to direct or suggest to a user a particular movement. For example, tactile output 216 may provide a user with physical guidance in an environment. To this and other ends, tactile output 216 may be configured to receive instructions from, e.g., processor 202 or memory 204.
[072] Visual output 218 may include, for example, a display, such as a screen, and/or lights. In some embodiments, for example, visual output 218 may include a tablet, mobile phone, or other type of screen. In some embodiments, visual output 218 may include lights in a variety of colors and locations on device 200. It is contemplated that a visual output 218 may assist either a user of device 200, or a companion or other individual accompanying the user, in interacting with various aspects of device 200.
[073] Sensors 220, 222, 224 may include any number or type of components configured to receive input from the surroundings of device 200 and transmit such input to one another, to processor 202, and/or to other components of device 200. Sensors 220, 222, 224 may include, for example, cameras, thermometers, heat sensors, proximity sensors (e.g., hall effect sensors), motion sensors (e.g., accelerometers), microphones, speedometers, odometers, balance/orientation sensors, health sensors (e.g., heart rate monitors, blood pressure monitors, body temperature monitors), and others. Sensors 220, 222, 224 may also include other types of input devices, such as keyboards, buttons, and touchpads. In some embodiments, one or more of sensors 220, 222, 224 may be configured to work in tandem.
In some embodiments, processor 202 may be configured to receive various inputs from sensors 220, 222, 224 and translate those inputs into information regarding an environment in which device 200 is located. While three sensors are depicted on device 200, device 200 may be configured to have any number of sensors such as sensors 220, 222, 224.
[074] Modules 250, 270 represent exemplary groupings of elements of device 200. It is contemplated that two or more elements of device 200 may be grouped together physically, in terms of their functions, or both. For example, sensors 220, 222 are shown as being grouped together in module 250, and processor 202, wireless network component 208, cellular component 210, GPS component 212, memory 204, and power source 206 are grouped together in module 270. In some embodiments, sensors 220, 222 may be two sensors grouped together in a software context as module 250, to operate in tandem. As such, sensors 220, 222 may exchange and combine their data prior to forwarding the data to processor 202. In another embodiment, module 250 may be an independently movable part of device 200. For example, module 250 may be removable, replaceable, or repositionable on device 200, independently of module 270. As another example, in devices having large sizes (e.g., wheelchairs), or in devices that are integrated into other larger systems (e.g., a vehicular system), module 250 may be located at a different part of device 200 than module 270. Both module 250 and module 270 are exemplary, and one of ordinary skill in the art will understand that other combinations of elements of device 200 into modules are possible as well.
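The software grouping of sensors 220, 222 into module 250, with their data combined before forwarding, might be sketched as below; the averaging step is a deliberately trivial stand-in for whatever fusion a particular embodiment uses, and the names are invented for the example.

    class SensorModule:
        """Two grouped sensors whose readings are combined before forwarding."""
        def __init__(self, sensor_a, sensor_b, forward):
            self.sensor_a, self.sensor_b = sensor_a, sensor_b
            self.forward = forward  # e.g., an input handler associated with processor 202

        def sample(self):
            a, b = self.sensor_a(), self.sensor_b()
            self.forward((a + b) / 2)  # trivial fusion: average the paired readings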
[075] One of ordinary skill in the art will understand that the elements of device 200 may be configured to work together in a variety of ways. In particular, device 200 may be configured to receive input from sensors 220, 222, 224, use processor 202 and memory 204 to process such input into information regarding the surroundings of device 200, and use that information to provide guidance to a user. Providing guidance may include using, e.g., wireless network component 208, cellular component 210, and/or GPS component 212 to retrieve additional data regarding an environment in which device 200 is located, such as one or more maps, sets of directions, notifications regarding hazards, points of interest, and the like, and providing such additional data or parts of such additional data as output from device 200 to a user, in order to assist the user.
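A single guidance step of the kind just described, merging local sensor input with map data retrieved over a network, is sketched below. The MapSource class, its fetch_map method, and the hazard format are illustrative assumptions rather than disclosed interfaces.

    class MapSource:
        """Stand-in for map data fetched over a wireless, cellular, or GPS link."""
        def fetch_map(self, position):
            return {"hazards": [{"kind": "curb", "distance": 4.0}]}

    def guide_step(sensor_snapshot, position, map_source, output):
        """Merge local sensing with retrieved map data, then emit one cue."""
        tile = map_source.fetch_map(position)
        nearby = [h for h in tile.get("hazards", []) if h["distance"] < 10]
        if sensor_snapshot.get("obstacle") or nearby:
            output("Caution: hazard ahead; bear right.")
        else:
            output("Path clear; continue forward.")

For example, guide_step({"obstacle": False}, (0, 0), MapSource(), print) would print the caution cue, because the stand-in map reports a curb four meters away.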
[076] Device 200 and other devices of the present disclosure may have a variety of additional features. For example, device 200 may have an emergency feature, by which emergency services may be contacted through, e.g., cellular component 210. Device 200 may be configured to contact emergency services if data from one or more of sensors 220, 222, 224 meets certain parameters (e.g., a heart rate is above or below a certain threshold). It will be apparent to those of skill in the art that other features and conveniences known in the art may also be incorporated into device 200.
[077] FIGS. 3-6 depict additional exemplary embodiments of devices according to the present disclosure. Generally, the devices of FIGS. 3-6 show some specific
implementations of device 200, in order to demonstrate how the elements of device 200 may be incorporated into objects that may offer a user added assistance or convenience. As each of the devices in FIGS. 3-6 is exemplary, it is to be understood that any of these devices may be modified to have more or fewer parts, sensors, inputs, outputs, or other attributes.
[078] FIG. 3 depicts an exemplary cane 300 according to aspects of the present disclosure. Cane 300 may include a staff 320, a handle 340, a guidance module 350, and a wheel or other movement technology or device 360. Cane 300 may be equipped with various sensors 322, 324, 326, a speaker 342, and a jack 344.
[079] Cane 300 may be, for example, a walking stick, a support cane, a probing cane (also referred to as a “white cane” or a “long cane”), or any other type of cane. Cane 300 may have the approximate dimensions of a support cane or probing cane, with the addition of features to assist a user of cane 300. [080] Staff 320 may be sized and configured for use by a user. In some embodiments, staff 320 may have a height, width, and weight to allow a user to comfortably hold cane 300 while walking. In some embodiments, staff 320 may be collapsible for ease of storage or transport when not in use. Staff 320 may include in its body several sensors (e.g., sensors 322, 324, 326), and wiring between such sensors and guidance module 350.
[081] Handle 340 may be grippable by a user of cane 300. Handle 340 may be approximately perpendicular to staff 320, as shown. In alternative embodiments, handle 340 may extend coaxially from staff 320. In further embodiments, handle 340 may have any size, shape, or configuration that allows a user to comfortably use cane 300. In some
embodiments, handle 340 may be equipped with one or more sensors, such as a heart rate monitor. Sensors in handle 340 may be connected to guidance module 350 via wires in handle 340 and staff 320.
[082] Guidance module 350 may contain a variety of elements to assist in receiving, sending, and processing information to guide a user of cane 300. Guidance module 350 may contain one or more elements found in, e.g., device 200, such as a power source (e.g., power source 206), processor (e.g., processor 202), memory (e.g., memory 204), wireless capabilities, GPS capabilities, Bluetooth capabilities, inputs, outputs, and the like. Guidance module 350 may be configured to receive information from sources in/on cane 300 (e.g., sensors 322, 324, 326, handle 340, or wheel 360), process such information locally and/or transmit such information to a remote processor, and output directions, guidance, or cues to a user based on received information.
[083] In some embodiments, guidance module 350 may be programmed locally by, e.g., inputting commands, directions, or instructions into guidance module 350 via one or more buttons, keys, or other inputs (not shown). In some embodiments, guidance module 350 may be coupled or connected to another device, such as a personal computer, phone, or tablet, into which instructions, directions, or other information may be input and transferred to guidance module 350. In some embodiments, guidance module 350 may remain connected to a personal device via, e.g., a wireless connection, while cane 300 is in use.
[084] Wheel 360 may be a multidirectional wheel or other movement technology at the base of staff 320. Wheel 360 may be rotatable in any direction, such that a user of cane 300 may push or tap cane 300 in any direction as the user moves. In some embodiments, wheel 360 may be rotatable or propellable by, for example, a motor controlled by a module internal to cane 300. In such a manner, cane 300 may be, to some extent, self-propelled. Wheel 360 may be equipped with or coupled to one or more sensors, such as an odometer or a speedometer. Such sensors may be connected to guidance module 350 via wires extending through staff 320. In some embodiments, wheel 360 may be replaced with, e.g., a ball bearing, a slider, or a pad. In some embodiments, cane 300 may not include a wheel 360 and may have a flat or pointed base.
[085] Sensors 322, 324, 326 may be, for example, any of the sensors described with respect to device 200. One or more of sensors 322, 324, 326 may include cameras, so as to gather visible data around cane 300. Although three sensors are depicted on staff 320, it is contemplated that more or fewer sensors may be disposed on staff 320.
[086] Speaker 342 may be located on, e.g., handle 340, and may be configured to output audio generated by guidance module 350 in the general direction of a user of cane 300. Jack 344 may be, for example, a standard headphone jack into which headphones or an earpiece may be plugged, such that a user wearing the headphones or earpiece may hear auditory cues from guidance module 350. In some embodiments, if headphones or an earpiece is plugged into jack 344, then speaker 342 may be automatically turned off. In some embodiments, speaker 342 and jack 344 may double as headset elements for making mobile calls from, e.g., guidance module 350, using a cellular or other mobile component of guidance module 350. In such embodiments, jack 344 may be configured to receive input from a microphone, or may be configured to accept input from, and provide output to, a combination headphone-and-microphone. In some embodiments, jack 344 may be a jack for wired or wireless headphones, wireless transmission/reception devices, speakers, or other device or devices.
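The automatic hand-off from speaker 342 to jack 344 described above can be expressed in a few lines of Python; the callables are invented stand-ins for the actual audio hardware, and the class is hypothetical.

    class AudioRouter:
        """Route cues to headphones when plugged in; otherwise use the speaker."""
        def __init__(self, speaker_play, jack_play, jack_connected):
            self.speaker_play = speaker_play
            self.jack_play = jack_play
            self.jack_connected = jack_connected  # callable returning True when plugged in

        def play(self, cue):
            (self.jack_play if self.jack_connected() else self.speaker_play)(cue)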
[087] FIG. 4 depicts an exemplary garment 400 according to aspects of the present disclosure. Garment 400 may include a body 410, a camera 420, external sensors 422, 424, 426, internal sensor 428, a speaker 442, and a jack 444.
[088] Garment 400 may be a wearable device configured with guidance elements to assist a person wearing the garment. It is contemplated that garment 400 may be worn as a part of daily wear, as a part of a uniform, as a part of tactical gear, or in any other manner. Garment 400 may be made of any fabric or material known in the art that can support the elements affixed to garment 400. In some embodiments, garment 400 may be made of an outerwear fabric, such as a waterproof polyester or nylon. In other embodiments, garment 400 may be made of cotton, leather, woven metal, or any other wearable material. Although garment 400 is depicted as being a vest, garment 400 may be styled in any manner. In some embodiments, garment 400 may include sleeves, a collar, a zipper, cuffs, an inner shirt, or any other stylistic details. In further embodiments, garment 400 may be any type of garment (e.g., trousers, shorts, a belt, etc.). As is described further, garment 400 may be equipped with a variety of sensors to receive input about the surroundings of a user wearing the garment.
[089] Not depicted with regard to garment 400 is a central guidance module, similar to guidance module 350 depicted in FIG. 3. It is contemplated that garment 400 may have such a guidance module, and that such a device may be connected to the other elements of garment 400 by wired or wireless connections. [090] Camera 420 may be a video camera configured to retrieve visual information about the surroundings of a wearer of garment 400. External sensors 422, 424, 426 may be any sensors configured to gather data from the surroundings of garment 400, such as temperature sensors, accelerometers, and other sensors that have been described elsewhere herein. Internal sensor 428 may be a sensor configured to gather data about a wearer of garment 400, such as body temperature data, heart rate data, and other information.
[091] Speaker 442 may be positioned at or near a shoulder of garment 400, such that output from speaker 442 may be heard by a wearer of garment 400. In some
embodiments, a guidance module may transmit audio instructions and cues to speaker 442 for output to a wearer of garment 400. Speaker 442 and jack 444 may be configured to operate, with respect to a guidance module in garment 400, in a manner similar to speaker 342 and jack 344 of cane 300 with respect to guidance module 350.
[092] FIG. 5 depicts an exemplary walker 500 according to aspects of the present disclosure. Walker 500 may include a frame 520, wheels 521, a cushioned rest 522, a handle 524, a brake 526, a brake cord 527, a footrest 528, and a guidance module 530. Guidance module 530 may include speaker 532, and one or more sensors 534, 560, 580. Walker 500 may also include sensors 590, 592, 594, 596 located on various parts of walker 500.
[093] Walker 500 may be a knee walker, such that a user may rest a knee on, or sit on, walker 500 while walker 500 is stationary or in motion. Frame 520 may be made of a material strong enough to support a user, and light enough to allow for mobility. For example, frame 520 may be made of metal or a synthetic polymer. In some embodiments, frame 520 may be made of multiple different materials.
[094] Wheels 521 may be movable and steerable by, e.g., a user pushing walker 500, and/or by handle 524. In some embodiments, wheels 521 may include full or partial motorized propulsion to aid in walker movement. For example, walker 500 may be self-propelled. In some embodiments, wheels 521 may be replaced by other parts granting walker 500 mobility, such as treads, ball bearings, and the like. Rest 522 may be a support rest on which a user may sit or rest a limb. In some embodiments, rest 522 may be padded for comfort. In some embodiments, rest 522 may be customized or molded to specifically fit the body of a user. Footrest 528 may support the foot or feet of a user sitting on walker 500.
[095] Handle 524 may be configured to steer the front and/or back wheels of walker 500 when turned by a user. In some embodiments, handle 524 may be fully or partially guided by commands from guidance module 530. In such embodiments, handle 524 may have a fully or partially motorized turning mechanism connected to guidance module 530.
[096] Brake 526 may be, e.g., a hand brake that may control whether one or more wheels of walker 500 may roll via, for example, brake cord 527. In some embodiments, brake 526 may be an electronically-controlled brake that may be controlled by, for example, guidance module 530. In such embodiments, brake 526 may be connected to guidance module 530.
[097] Brake cord 527 may be configured to apply a braking force to a wheel of walker 500. In some embodiments, an inner wire in brake cord 527 may extend between brake 526 and a wheel (or other mobility part) of walker 500. When brake 526 is engaged, the inner wire may be pulled, thus engaging a clamp on a wheel of walker 500. In some embodiments of walker 500, brake 526 may not need a brake cord 527. In such
embodiments, brake 526 may be located directly on one or more wheels (or other mobility parts) of walker 500 and be activated electronically or by another mechanism.
[098] Guidance module 530 may have any of the characteristics that guidance module 350 may have, and/or any of the elements of device 200. Connections to various other elements of walker 500 may run through portions of frame 520; for example, connections between guidance module 530 and handle 524, brake 526, speaker 532, and sensors 590, 592, 594, 596 may run inside hollow portions of frame 520. Guidance module 530 may also include several sensors in or on it, such as sensors 534, 560, 580. Speaker 532 on guidance module 530 may have characteristics and capabilities similar to other speakers disclosed with respect to other embodiments herein. In some embodiments, guidance to a user of walker 500 may be primarily auditory. In further embodiments, guidance to a user of walker 500 may be provided by physical cues, such as by handle vibrations from a motorized handle 524, slight turns to the handle, application of brake 526, or other physical cues provided by walker 500. Guidance module 530 may also include, e.g., safety-oriented programming, such that any physical cues provided by walker 500 to a user will not endanger the safety of the user.
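The safety-oriented programming mentioned above could, in one simple form, clamp every motorized cue before it reaches the hardware, as in the Python sketch below. The numeric limits are invented for the sketch; any real embodiment would set them from testing and applicable safety requirements.

    MAX_TURN_DEG = 15.0        # assumed limit on a motorized handle turn
    MAX_BRAKE_FRACTION = 0.5   # assumed limit on automatically applied braking force

    def clamp_physical_cue(turn_deg, brake_fraction):
        """Bound motorized cues so guidance never jerks the handle or slams the brake."""
        safe_turn = max(-MAX_TURN_DEG, min(MAX_TURN_DEG, turn_deg))
        safe_brake = max(0.0, min(MAX_BRAKE_FRACTION, brake_fraction))
        return safe_turn, safe_brake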
[099] Each sensor 534, 560, 580, 590, 592, 594, 596 may be affixed to, or of a piece with, walker 500. For example, each sensor may be affixed or welded to frame 520, and may be of any sensor type described elsewhere herein (e.g., with respect to sensors 220, 222, 224 of device 200). In some embodiments, multiple different types of sensors may be affixed to, welded to, strapped to, or otherwise attached to frame 520 of walker 500, guidance module 530, the wheels, or other parts of walker 500. Sensors may be configured to track the movement, speed, acceleration, and physical surroundings of walker 500. In addition, sensors may be configured to receive voice and/or manual input from a user of walker 500. For example, one or more of sensors 534, 560, 580, 590, 592, 594, 596 may include a microphone. In some embodiments, sensors may also be incorporated into, for example, handle 524, in order to track physical characteristics of a user of walker 500 (e.g., heart rate).
[0100] FIG. 6 depicts an exemplary wheelchair 600 according to aspects of the present disclosure. Wheelchair 600 may include one or more modules 620, 630, 638. Module 630 may include a speaker 632. Wheelchair 600 may include one or more sensors, such as sensors 636, 640. Wheelchair 600 may be a motorized or non-motorized wheelchair, such that it is partially or fully self-propelled and/or self-steering.
[0101] Modules 620, 630, 638 may be various modules making up a guidance module having characteristics similar to one or more modules of device 200. For example, one of modules 620, 630, 638 may include a guidance module, similar to, e.g., module 270 of device 200, or guidance module 350 of cane 300. Such a guidance module may include, e.g., a processor, memory, wireless connection components, etc. In some embodiments, one or more of modules 620, 630, 638 may include a power source for providing motorized propulsion to wheelchair 600. Modules 620, 630, 638 may include one or more sensors to track speed, acceleration, orientation, and other characteristics of wheelchair 600. Such sensors, and sensors 636, 640 may be of any sensor type described elsewhere in the present disclosure. For example, one or more of the sensors may include proximity sensors, accelerometers, cameras, balance/orientation sensors, and the like. It is contemplated that the modules, sensors, and other elements of wheelchair 600 may be connected to one another by wired or wireless connections.
[0102] Speaker 632 may be positioned so as to provide audible output to a user of wheelchair 600. In some embodiments, speaker 632 may be on an arm of wheelchair 600. In further embodiments, speaker 632 may be disposed on, e.g., module 620, such that it would be positioned near the head of a user of wheelchair 600. In some embodiments, speaker 632 may be combined with, e.g., a headphone jack or other wireless communication device (e.g., a Bluetooth device) to allow for a user to receive audio cues through headphones or an ear piece. In some embodiments, one or more of modules 620, 630, 638 may include a microphone input, so as to allow a user to provide audio input to one or more modules of wheelchair 600. [0103] FIG. 7 depicts an exemplary movable case 650 according to aspects of the present disclosure. Case 650 may include a body 652 with wheels 674, an arm 654, and a handle 656. Case 650 may include modules 658, 670, and one or more sensors 660, 666,
668, 672. Module 658 may further include a speaker 662 and a jack 664.
[0104] Case 650 may be a self-propelling case, and/or may be configured to be pushed or pulled by a user, or move alongside the user. In some embodiments, case 650 may be ruggedized such that it can climb inclines, stairs, etc., and such that it can survive a variety of weather and environmental conditions. Although depicted in an inclined position, it is contemplated that case 650 will be capable of standing, moving, and turning in an upright position (e.g., with all wheels or other forms of propulsion on the ground). In some embodiments, case 650 may be motorized or may include some other form of self-propulsion, such that it is partially or fully self-propelled and/or self-steering. Case 650 may be made of lightweight, durable materials, such as plastic, metal, or woven material (e.g., fabric, woven nylon, polyester, etc.). In some embodiments, body 652 of case 650 may include a motor or other form of propulsion in, e.g., guidance module 670. In some embodiments, body 652 may be openable, such that modules inside (e.g., module 670) may be added or removed from its interior. In some embodiments, body 652 may include storage space. In further embodiments, body 652 may not be openable. Body 652 may have any size or shape suitable for holding one or more modules.
[0105] Wheels 674 may be any type of wheels, treads, ball bearings, or any other type of part or technology that grants mobility to case 650. In some embodiments, wheels 674 may be retractable. Wheels 674 may be configured in an arrangement to allow case 650 to rest on wheels 674 stably, without assistance or support from a user. In some
embodiments, wheels 674 may be operable if fewer than all of wheels 674 are contacting a surface (e.g., the ground). In some embodiments, wheels 674 may each be motorized or self-propelled, and may be configured to turn or self-propel upon direction from a user, or from a guidance module (e.g., module 658 or 670).
[0106] Arm 654 may be extendable or collapsible, and/or may be adjustable, including to a variety of lengths, widths, shapes, or sizes to suit and/or function with a particular user’s height, limbs, and the like. In some embodiments, arm 654 may be rotatable. For example, a user twisting handle 656 may be able to rotate arm 654. In some embodiments, a motor or other movement device or technology in body 652 may be configured to twist or rotate arm 654, to indicate to a user a direction in which the user should turn. Arm 654 may be made of a durable material, such as metal or plastic. In some embodiments, arm 654 may be made of a flexible material, such as a woven cord or rope. Handle 656 may be configured to be held by a user of case 650. In some embodiments, handle 656 may be vibrated or rotated via, e.g., an internal motor controlled by one or more modules, such as module 658 or module 670. In some embodiments, handle 656 may be fitted with buttons or other means of communication so that the user may send and receive communication, data, etc. to and from case 650 and its module 670.
[0107] Modules 658, 670 may be computer, artificial intelligence, and/or guidance modules having characteristics similar to one or more modules of device 200. For example, one of modules 658, 670 may include a guidance module, similar to, e.g., module 270 of device 200, or guidance module 350 of cane 300. Such a guidance module may include, e.g., a processor, memory, wireless connection components, etc. In some embodiments, one or more of modules 658, 670 may include a power source for providing motorized or other means of propulsion to case 650. Modules 658, 670 may include one or more sensors to track speed, acceleration, orientation, and other characteristics of case 650. Such sensors, and sensors 660, 666, 668, 672, may be of any sensor type described elsewhere in the present disclosure. For example, one or more of the sensors may include proximity sensors, accelerometers, cameras, balance/orientation sensors, and the like. It is contemplated that the modules, sensors, and other elements of case 650 may be connected to one another by wired or wireless connections.
[0108] Speaker 662 may be positioned so as to provide audible output to a user of case 650. In some embodiments, speaker 662 may be on module 658, in proximity to handle 656 of case 650. In some embodiments, speaker 662 may be combined with, e.g., jack 664, to allow a user to receive audio cues through headphones, an earpiece, or another wired or wireless communications device. In some embodiments, one or more of sensors 660, 666, 668, 672 may include a microphone input, so as to allow a user to provide audio input to various parts of case 650.
[0109] Reference will now be made to exemplary methods that may be applied using one or more of the above-described systems and devices.
[0110] FIG. 8 depicts steps in a method 700 for enhancing user-environment interactions according to aspects of the present disclosure. It should be noted that in some alternative implementations, the features and/or steps described may occur out of the order depicted in the figures or discussed herein. For example, two steps shown in succession may instead be executed substantially concurrently, or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0111] Generally, the steps of FIG. 8 may be applicable to a system with a user having a device according to the present disclosure (e.g., system 100). According to step 702, data may be received regarding the surroundings of a device. According to step 704, memory of a device may be updated with the received data. According to step 706, one or more directions may be generated to a user based on the received data. According to step 708, additional data may be received regarding device surroundings, in response to a change in location. According to step 710, memory may be updated with additional received data. According to step 712, one or more additional directions may be generated to a user based on the additional received data.
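By way of non-limiting illustration only, the overall flow of steps 702 through 712 may be viewed as a repeating sense-update-guide loop. The following Python sketch is not part of the disclosed apparatus; all names (Memory, generate_directions, guidance_loop) and the toy data format are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    """Toy stand-in for device memory (updated in steps 704 and 710)."""
    readings: list = field(default_factory=list)

    def update(self, data: dict) -> None:
        self.readings.append(data)  # retain each received reading

def generate_directions(data: dict) -> str:
    """Steps 706/712: turn received data into a user-facing direction."""
    if data.get("obstacle_ahead"):
        return "Obstacle ahead: turn left in three steps."
    return "Path clear: continue forward."

def guidance_loop(sensor_frames: list, memory: Memory) -> list:
    """Steps 702-712 repeated over a stream of sensor frames."""
    directions = []
    for frame in sensor_frames:        # steps 702/708: receive data
        memory.update(frame)           # steps 704/710: update memory
        directions.append(generate_directions(frame))
    return directions

# Two frames; the second arrives after a change in location.
print(guidance_loop([{"obstacle_ahead": False}, {"obstacle_ahead": True}], Memory()))
```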
[0112] According to step 702, data may be received regarding the surroundings of a device. This data may include, for example, information from sensors on the device (e.g., sensors 220, 222, 224 of device 200), information transmitted over a network to network components of a device (e.g., wireless network component 208, cellular component 210, GPS component 212 of device 200), or information from memory of a device (e.g., memory 204). The data may be visual data, auditory data, data regarding motion, temperature, location, time, proximity to objects, locations, or people, data regarding the health or status of a user of the device, or data regarding a goal, program, instruction, or destination from the user or from a third party. The data may be input into the device by, e.g., a user of a device, or may be automatically sensed, retrieved, or otherwise received.
[0113] In some embodiments, the data may be received at the device. In other embodiments, the data may be received remotely, e.g., at a computer or a server bank in the same location as, or a different location from, the device. The data may be transmitted over one or more connections before being received.
[0114] In some embodiments, the data may be pushed to the device or remote location by, e.g., an external signal, cue, or program. In embodiments where the data is received remotely from a device, for example, the data may be pushed to the remote location from the device. In embodiments where the data is received at the device, the data may be pushed to the device from a remote location, such as a remote computer or entity. In some embodiments, the device may be instructed to actively retrieve the data using one or more sensors (e.g., sensors 220, 222, 224 of device 200).
[0115] In some embodiments, the data may be processed before or after being received. For example, data from one or more sources may be combined, synthesized, sorted, encrypted, decrypted, or interpreted. In some embodiments, a location, date, time, or other information may be associated with the data.
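As a minimal sketch of the pre-processing contemplated in paragraph [0115], the fragment below tags raw readings with a timestamp and a location and merges readings from several sources. The field names and the merge policy are assumptions for illustration, not part of the disclosure.

```python
import time

def tag_reading(raw: dict, location: tuple) -> dict:
    """Associate a timestamp and a location with a raw sensor reading."""
    return {**raw, "timestamp": time.time(), "location": location}

def combine_sources(*readings: dict) -> dict:
    """Merge readings from several sensors into one record; on key
    collisions, later sources override earlier ones."""
    merged: dict = {}
    for reading in readings:
        merged.update(reading)
    return merged

camera = tag_reading({"obstacle_ahead": True}, location=(40.7128, -74.0060))
pulse = tag_reading({"heart_rate_bpm": 92}, location=(40.7128, -74.0060))
print(combine_sources(camera, pulse))
```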
[0116] According to step 704, memory of a device may be updated with the received data. For example, the received data may be transmitted to memory 204 of device 200 (either processing memory or storage memory). In some embodiments, the received data may be transmitted to memory remote to a device, such as servers 160 in system 100.
[0117] According to step 706, one or more directions may be generated to a user based on the received data. This step may include parsing the received data to obtain information relevant to the user, analyzing the parsed information, and generating one or more directions.
[0118] As discussed, the received data may include any type of data about the environment of a device or the user of the device. For example, the received data may include a series of images of an environment, which may be parsed or interpreted by, e.g., a program, to determine paths of travel, hazards, obstacles, and a destination. As another example, the received data may include one or more maps, which may be interpreted to determine the user’s location on the map, the user’s destination(s), points of interest, and the like. As a further example, the received data may include one or more sound samples, health indicators (e.g., a heart rate, blood pressure, etc.), or other pieces of information that may indicate that a user is under stress or is in danger of becoming overstressed.
[0119] Once information is obtained, the information may be analyzed to determine one or more courses of action that might be communicated to the user of the device. For example, if the information includes a location and nature of a hazard, a course of action may be to report the hazard to the user, or to retain information about the hazard in order to report the hazard to the user if the user comes within a given distance of the hazard. If the information includes a destination, then a course of action may be to find and communicate to the user a route to the destination. If the information includes data regarding the health or status of a user (e.g., a user’s heart rate, breathing rate, body temperature, or blood pressure), then a course of action may be to determine and communicate to the user one or more methods to regulate the user’s health status (e.g., by sitting down, taking calming breaths, or running through various physical, verbal, or mental exercises), or to communicate to a third party (e.g., a user’s medical professional, coach, family member, friend, or companion) an alert regarding the user’s health status. In some embodiments, a processor of the device (e.g., processor 202 of device 200) may interpret the information to determine the course of action. In other embodiments, a remote processor or series of processors may interpret the information. In some embodiments, a processor may compare the information to information in one or more programs (e.g., medical, psychological, psychiatric, counseling, or coaching programs) to interpret the information. Such programs may be provided by, for example, a medical professional, a psychologist or psychiatrist, a counselor, a coach, a companion, or a family member. In some embodiments, such programs may be created, stored, and/or made available on one or more remote servers (e.g., servers 160 in system 100) or local computers (e.g., computer 140 in system 100).
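The analysis described in paragraph [0119] is essentially a mapping from parsed information to candidate courses of action. One way such rules might look, purely as an illustrative sketch (the keys, thresholds, and messages are hypothetical):

```python
def choose_courses_of_action(info: dict, hr_range=(50, 110)) -> list:
    """Map parsed information to candidate courses of action."""
    actions = []
    if "hazard" in info:                      # report or remember a hazard
        actions.append(f"Report hazard: {info['hazard']}")
    if "destination" in info:                 # find and communicate a route
        actions.append(f"Determine and announce route to {info['destination']}")
    heart_rate = info.get("heart_rate_bpm")   # health-status regulation
    if heart_rate is not None and not hr_range[0] <= heart_rate <= hr_range[1]:
        actions.append("Suggest a calming routine; alert a designated third party")
    return actions

print(choose_courses_of_action(
    {"hazard": "wet floor", "destination": "Room 204", "heart_rate_bpm": 128}))
```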
[0120] Generating the directions may include outputting one or more signals, cues, or instructions to a user or other entity from a device. Instructions to output one or more signals, cues, or other outputs may be generated at the device, or may be communicated to the device from a remote location. For example, one or more instructions may be read out loud by an electronic voice from a speaker on a device. Instructions may include directions to physically navigate a space (e.g., to make a turn after a certain number of steps), or to change a user’s status (e.g., to stand up, sit down, perform exercises, take deep breaths, or assume a certain position). Instructions may also include directions to perform one or more physical or mental routines (e.g., calming, focusing, or stress-relieving routines). As another example, a device may output a tactile cue, such as a vibration, or may swivel a wheel to indicate a direction in which a user should move. In some embodiments, generating the directions may include changing a status of the device. For example, if the device is moving, then the direction may include causing the device to stop moving or slow down by, e.g., applying a brake, or causing the device to accelerate by, e.g., applying power to wheels of the device. As another example, if the device is traveling, then the direction may include changing a direction of travel of the device. In some embodiments, signals, cues, or outputs may be generated at the device in a visual manner (e.g., on a screen, or by one or more lights).
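Paragraph [0120] contemplates fanning a generated direction out to whatever output hardware a device has. A hedged sketch of such a dispatcher follows; the channel names and the brake-on-"stop" rule are illustrative assumptions only.

```python
def dispatch_output(direction: str, channels: dict) -> None:
    """Send one generated direction to each available output channel."""
    if "speaker" in channels:
        channels["speaker"](direction)   # audible cue (electronic voice)
    if "haptic" in channels:
        channels["haptic"]()             # tactile cue (e.g., handle vibration)
    if "stop" in direction.lower() and "brake" in channels:
        channels["brake"]()              # change the device's own status

dispatch_output("Stop: obstacle ahead", {
    "speaker": lambda msg: print(f"[speaker] {msg}"),
    "haptic": lambda: print("[handle] vibrate"),
    "brake": lambda: print("[wheels] brake applied"),
})
```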
[0121] In some embodiments, generating the directions may additionally or alternatively include contacting an entity other than the user. For example, generating the directions may include contacting the communication device of a doctor, aide, friend, or family member of a user (e.g., communication device 170 of system 100) via a cellular or other mobile connection, or contacting one or more services (e.g., services 190 of system 100). In further embodiments, generating the directions may include allowing an entity other than the user to communicate with the user (e.g., by establishing a voice call, video call, or one-way transmission of voice or video to or from the user, or to/from a doctor, aide, friend, or family member of the user).
[0122] According to step 708, additional data may be received regarding device surroundings, in response to a change in location. Additional data may be received in any manner that data may be received according to step 702. A change in location may include a transplantation of the device from one set of surroundings to another, or may be a smaller change, such as a change in the device’s orientation, or a movement of one meter, one half of a meter, or even a few centimeters. Alternatively, additional data may be received regarding device surroundings in response to a change that is not location-based. For example, additional data may be received regarding device surroundings in response to sensor data showing a change in a user status (e.g., a change in data received from a heart rate or blood pressure monitor), in response to a change in goals or priorities programmed into the device, or another change. As another example, additional data may be received in response to a change in a device’s orientation, acceleration, or environment (e.g., a change from a hot environment to a cold one, or vice versa). In further embodiments, additional data may be received regarding device surroundings independent of any change. For example, additional data may be received as a function of time (e.g., periodic updates to data). As another example, a user or other entity may compel the retrieval of additional data.
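Paragraph [0122] lists several triggers for collecting additional data: movement, a status change, or simple elapsed time. A toy predicate capturing that logic might read as follows; the thresholds and field names are hypothetical.

```python
def should_resense(prev: dict, curr: dict,
                   min_move_m: float = 0.5, period_s: float = 5.0) -> bool:
    """Return True when step 708 should run: the device moved, the
    user's status changed, or a periodic update is due."""
    moved = abs(curr["position_m"] - prev["position_m"]) >= min_move_m
    status_changed = curr["heart_rate_bpm"] != prev["heart_rate_bpm"]
    periodic = curr["time_s"] - prev["time_s"] >= period_s
    return moved or status_changed or periodic

prev = {"position_m": 0.0, "time_s": 0.0, "heart_rate_bpm": 70}
curr = {"position_m": 0.3, "time_s": 6.0, "heart_rate_bpm": 70}
print(should_resense(prev, curr))  # True: the 5-second period has elapsed
```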
[0123] The additional data received may be any type of data that was received according to step 702, or may be a different type of data. In some embodiments, the type of data received according to step 708 may depend on the directions generated to the user according to step 706. For example, if a user was directed to a door as a part of step 706, then the type of data received according to step 708 may be visual data, to confirm the appearance of a door in front of a device at the user’s location.
[0124] According to step 710, memory may be updated with additional received data. This may be accomplished in any of the manners described with respect to step 704. Updating memory may or may not include overwriting or erasing out-of-date data and replacing it with more current data. For example, if a map is stored in memory and the additional received data includes information that would update the map, updating the memory may include changing the map stored in memory to reflect new data.
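For the map-refresh case in paragraph [0124], updating memory amounts to overwriting stale entries with newer observations. A minimal sketch, treating a map as a dictionary of landmark statuses (real map formats would differ):

```python
def update_map(stored_map: dict, new_data: dict) -> dict:
    """Step 710 applied to a stored map: replace out-of-date entries
    with more current observations, leaving the rest untouched."""
    updated = dict(stored_map)   # keep the original intact
    updated.update(new_data)     # newer observations win
    return updated

stored = {"hallway_a": "clear", "door_3": "open"}
print(update_map(stored, {"door_3": "closed"}))  # door_3 status refreshed
```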
[0125] According to step 712, one or more additional directions may be generated to a user based on the additional received data. This may be accomplished in any manner described with respect to step 706. The one or more additional directions may be similar to, or different from, the directions generated as a part of step 706. In some embodiments, instead of one or more additional directions being generated based on the additional data, one or more sets of ongoing directions may be paused, slowed down, or stopped based on the additional data.
[0126] Steps 702 through 712 may be repeated in or out of sequence to provide a user with a series of generated directions over time. Thus, a user following the directions may receive guidance from the device generating the directions, where the directions are updated periodically (e.g., in real time, every second or few seconds, every minute or few minutes, or dynamically depending on user movement, physical status, mental status, or other factors). In particular, users who are unable to take in data, information, or cues from the environment in certain formats may benefit from such guidance over time.
[0127] Any of the above-described steps may be performed by, e.g., a user, the user’s device, or an individual, computer, or entity authorized to communicate with the device. Thus, a user may receive assistance both from the user’s device and from other individuals and entities authorized to provide assistance to the user. Moreover, any of a number of individuals and entities may contribute to determining what data is received by a device, how data should be interpreted, and what information may be relevant to a user. For example, a user, the user’s doctor or medical facility, and/or the user’s family may be able to provide parameters for what health data may affect a course of action taken by the device.
For example, a user or other entity may set a heart rate or blood pressure range and specify that the user, the user’s doctor, or the user’s family should be alerted if data showing a heart rate or blood pressure outside of that range is received. As another example, a user or other authorized entity may be able to specify a maximum range of motion or travel, thus affecting guidance provided by the device to the user. A user or other authorized entity may be able to input into the device any number of goals, programs, and other information that may change what directions are provided by the device.

[0128] In some embodiments, there may be a plurality of overall goals, instructions, and/or parameters specified to direct or limit the directions provided to a user from a device. In such cases, the plurality of goals, instructions, and/or parameters may be prioritized based on general principles to, e.g., keep a user safe, bring a user to a particular destination, ensure that the user remains within a certain area, and the like. In this way, methods according to the present disclosure may be fully customized to a particular user or a particular set of priorities.
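The user-configured ranges and prioritized goals of paragraphs [0127] and [0128] could be checked with logic along the following lines; the parameter names, priority numbers, and alert text are assumptions for illustration only.

```python
def evaluate_parameters(reading: dict, params: dict) -> list:
    """Check a reading against user-configured limits and return
    alerts ordered by priority (lower number = higher priority)."""
    alerts = []
    low, high = params["heart_rate_range_bpm"]
    if not low <= reading["heart_rate_bpm"] <= high:
        alerts.append((1, "Alert the user, doctor, or family: heart rate out of range"))
    if reading["distance_from_home_m"] > params["max_travel_m"]:
        alerts.append((2, "Guide the user back within the permitted area"))
    return [message for _, message in sorted(alerts)]

params = {"heart_rate_range_bpm": (50, 110), "max_travel_m": 800}
print(evaluate_parameters(
    {"heart_rate_bpm": 130, "distance_from_home_m": 950}, params))
```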
[0129] The above description and examples are illustrative, and are not intended to be restrictive. One of ordinary skill in the art may make numerous modifications and/or changes without departing from the general scope of the invention. For example, and as has been described, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. Additionally, portions of the above-described embodiments may be removed without departing from the scope of the invention. In addition,
modifications may be made to adapt a particular situation or material to the teachings of the various embodiments without departing from their scope. Many other embodiments will also be apparent to those of skill in the art upon reviewing the above description.
[0130] Reference will now be made to brief examples of embodiments of the present disclosure. These examples are illustrative, and do not limit the rest of the present disclosure in any way.
Example 1
[0131] A case is configured to assist a blind individual with navigating through a variety of environments. The case includes a video camera, a microphone, a speedometer, an odometer, and proximity sensors configured to receive input from the user’s environment and forward the input to a processor. The case is also motorized and capable of self-steering. The case also includes a speaker. The case is configured to output directions to the individual in the form of either a synthesized voice through the speaker or a series of sound, vibration, heat, or other cues. The handle is configured to rotate or vibrate upon instruction by the processor. The case is capable of guiding the user through and around obstacles using these and other methods of communication. The case may also act as a means of physical support upon which the user may lean or otherwise transfer his or her weight.
[0132] The case also includes a guidance device having a processor and memory configured to receive input from, and send output to, the user’s environment and other locations using video technology, microphone, speedometer, odometer, proximity sensors, cellular or other mobile capabilities, processors, and wireless network capabilities. In particular, the case processor is configured to send output to the speaker, a motor of the case, and a third party (such as a coach, physician, or family member). The processor is also equipped with wireless network and cellular or other mobile capabilities.
[0133] The memory of the guidance device is programmed with a plurality of maps of locations of interest to the individual. The maps may include at least one destination of interest to the individual, so that the device may guide the user to a destination programmed before the time of travel. In addition, the processor is configured to receive voice input from the microphone and translate the received input, via voice recognition software, into an instruction to provide direction to a destination of interest at any time during the trip. The processor is configured to identify the destination of interest and determine a route to the destination of interest using the maps stored in the memory of the guidance device. If the destination of interest is not identifiable using the stored maps, the processor is configured to use its cellular capabilities, other mobile capabilities, or wireless network capabilities to query a remote computer having a database of additional maps for the destination of interest. If the destination of interest is found in the additional maps, the processor is configured to download the relevant additional map or maps containing the destination of interest. Upon determining a route to the destination of interest, the processor is configured to output audible directions via the speaker, and/or tactile directions via rotations, heat, or vibrations, in order to guide the individual to the destination of interest.
The device is also configured to receive signals from any physically installed sensors, beacons, or transmitting devices that may be encoded to send data to the case processor.
Using such data, the processor of the device may update one or more maps (e.g., in real time), and may simultaneously inform the user about the environment (e.g., guide the user to the destination of interest).
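The local-then-remote map lookup of Example 1 can be summarized in a few lines. The sketch below is illustrative only: routes are toy lists of waypoints, and the "download" from the remote database is modeled as a dictionary lookup.

```python
def find_route(destination: str, local_maps: dict, remote_maps: dict):
    """Try the maps stored in the guidance device's memory first; if
    the destination is absent, 'download' the map from a remote
    database, then return the route (or None if unknown anywhere)."""
    if destination in local_maps:
        return local_maps[destination]
    if destination in remote_maps:
        local_maps[destination] = remote_maps[destination]  # cache the download
        return local_maps[destination]
    return None

local = {"pharmacy": ["exit front door", "turn left", "200 m ahead"]}
remote = {"library": ["exit front door", "turn right", "cross at the signal"]}
print(find_route("library", local, remote))
```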
Example 2
[0134] The individual of Example 1 attends a conference in a hotel with the case of Example 1. The hotel provides a computer on its premises, the computer being connected to a wireless network. In addition to points digitally identified on the map, the hotel has also installed sensors and beacons with a variety of capabilities at various locations throughout the user’s environment, e.g., conference rooms with specified names, hotel rooms by number, or the restaurant and bar. The breakfast bar may have sticker sensors that describe the food in each serving dish. The hotel computer includes the user’s reservation and a personalized electronic schedule for the conference, along with a repository of maps containing potential destinations of interest corresponding to the electronic schedule, all of which are available for wireless download by the individual either before or during the trip. The potential destinations of interest include locations of events within the conference (e.g., particular panels, plenaries, networking events, lunch locations, and seating assignments) that are also reflected in the repository of maps. In addition to locations for the conference, information on local events, menus, ads, specials, etc. may be included to assist the user in enjoying his or her current or future stay. The hotel also includes a plurality of signal sensors, beacons, etc., which are readable by, and provide location or other information to, the proximity sensors of the case. These sensors may signal the beginning of a hallway or the location of a desired room, or may provide general information, e.g., that the user is approaching a particular location, such as a health club. The case receives this kind of information in addition to obstacle data received by the device sensors (e.g., that another individual is approaching in the hallway, or that a suitcase or other navigable obstacle is in the user’s path).
[0135] Upon arriving at the hotel with the individual and being connected to the wireless network, the guidance device of the case automatically wirelessly downloads the electronic schedule and repository of maps from the hotel computer. Using the video camera, microphone, speedometer, odometer, and proximity sensors, the guidance device determines a current location of the case and the individual. Upon a vocal command input from the individual, the processor of the guidance device references the electronic schedule to find upcoming destinations of interest, and outputs an audible list of upcoming destinations of interest via the speaker. The individual selects one of the upcoming destinations of interest, either vocally or by other means of communication with the device. Then, the processor of the guidance device accesses the downloaded repository of maps, finds the destination of interest on the maps, and determines a route to the selected destination of interest. The processor then outputs a series of audible directions via the speaker and tactile directions via the motorized handle to guide the individual to the selected destination of interest. As the individual follows the directions and moves, being gently guided by the self-propelled case, the processor receives input from the video camera, microphone, speedometer, odometer, downloaded data, and proximity sensors to determine the changing location of the user. The guidance device outputs a signal to the individual upon arriving at the selected destination of interest.
Example 3

[0136] As a result of aging, a user has limited use of his or her right lower leg, and the user has cognitive disabilities including anxiety and difficulty navigating new environments. However, the user’s physician determines that exercise will benefit the user. The user’s coach or family member installs sensors, beacons, or devices, which may be physical or may be digital data points included in the maps downloaded to the user’s walker device (e.g., device 500). The user may use the walker to navigate along a determined path or walking route, which may begin from the user’s home or another location. The device may communicate with the coach, who may be located at a different location, such as the coach’s home or workplace. The user may intentionally contact the coach for assistance through the device, or the device may be programmed to contact the coach for assistance if the user is off the designated path (i.e., the user is lost), or if the user’s vital signs are not within a designated pattern, for example if the user has an accelerated heart rate or another condition indicating anxiety or a medical emergency. In this example, the handlebar of the device may measure vital signs.
Example 4
[0137] A user has cognitive disabilities that prevent the user from making his or her own meals. The user is unable to follow written cooking directions and forgets safety measures, such as turning off the stove. A garment according to the present disclosure (e.g., garment 400) is programmed with a step-by-step verbal coaching tool to guide the user through these or similar tasks. Step by step, the user is instructed what to do, including safety information such as a reminder to turn off the stove. The user acknowledges the action is complete either verbally or by other means, such as a button. Similar to other examples, the device may be programmed to contact a coach or family member if the user fails to communicate acknowledgement or if vital signs or other sensory data indicate an emergency requiring the coach’s assistance.
Example 5

[0138] A typically-abled individual is attending a conference at a large conference campus and wants to take advantage of the enabling technologies described above. Through a hand-held device (e.g., device 110), the user may download hotel maps, schedules of events, guidance to rooms and sessions, etc. The user may also gain information through audio or other methods about the event that will enhance the user’s experience. This is especially useful in large conference settings, university campuses, etc.

Claims

What is claimed is:
1. A system for enhancing an environment interaction, comprising:
a server;
a user assist device configured to receive an input from an environment and generate an output, the user assist device further configured to assist a user to navigate the environment; and
a network configured to connect the server with the user assist device,
wherein the user assist device includes:
one or more wheels configured to assist the user’s movement;
a sensor configured to receive the input from the environment;
a processor configured to receive the input and control the user assist device based at least in part on the input; and
an output device configured to generate the output based on a control from the processor,
wherein the output includes controlling the one or more wheels to move automatically in a specified direction.
2. The system according to claim 1, wherein the device is one of a suitcase, a scooter, or a cane.
3. The system according to claim 1, wherein the output includes a warning for avoiding one or more objects in the environment.
4. The system according to claim 1, wherein the sensor is configured to receive the input corresponding to one or more of a visual input, a touch input, or an audio input from the user’s environment.
5. The system according to claim 1, wherein the input further includes a health status of a user, and the output includes a notification sent to emergency services of the user’s health status.
6. The system according to claim 1, wherein the input includes data providing a location of the device.
7. The system according to claim 1, further comprising a plurality of the user assist devices, wherein the plurality of the user assist devices are configured to communicate and provide one or more of location data and data relating to the environment of the user assist device to each of the other user assist devices located in the environment.
8. A mobility assistance device configured to assist a user to navigate an environment, the device comprising:
one or more wheels configured to assist movement of the user;
a sensor configured to receive an input from the environment;
a transceiver configured to transmit and receive a signal, the signal including data related to the environment;
a processor configured to control the device according to one or more of the input and the signal; and
an output device configured to generate an output based on a control from the processor,
wherein the output includes controlling the device to move automatically in a specified direction.
9. The device according to claim 8, further comprising a memory configured to store one or more maps of locations that are of interest to the user.
10. The device according to claim 9, wherein the one or more maps are prestored in the memory and, in response to a point of interest not being on the one or more maps prestored in the memory, the transceiver is configured to retrieve a map having the point of interest from a remote server.
11. The device according to claim 8, wherein the output is a location of the device.
12. The device according to claim 8, wherein the output includes a warning for avoiding one or more objects in the environment.
13. The device according to claim 8, wherein the output includes one or more of an audio, a tactile, or a visual output related to the location data.
14. The device according to claim 8, wherein the input corresponds to one or more of a visual input, a touch input, or an audio input.
15. The device according to claim 8, wherein the signal corresponds to one or more instructions from another mobility assistance device in a location within a predetermined distance of the device.
16. The device according to claim 15, wherein the signal includes location data of the another mobility assistance device and the output controls the device to avoid the another mobility assistance device.
17. A method for enhancing a user-environment interaction using a mobility assistance device having one or more wheels, the method comprising:
receiving, by the mobility assistance device, data of a user’s surroundings, the data including one or more inputs that the user is unable to sense using biological senses;
inputting a location of the user;
determining a change in the location of the user;
receiving additional data in response to a change in the user’s location; and
generating an output to the user including a warning for avoiding one or more objects in the user’s environment.
18. The method according to claim 17, further comprising:
receiving an input corresponding to one or more of a visual input, a touch input, or an audio input.
19. The method according to claim 17, further comprising:
controlling one or more wheels of the mobility assistance device to automatically move the mobility assistance device in a specified direction.
20. The method according to claim 17, wherein the output includes instructions to the user to change a physical movement based on the additional data, including directions to avoid the one or more objects in the user’s environment.