US20170024010A1 - Guidance device for the sensory impaired - Google Patents

Guidance device for the sensory impaired

Info

Publication number
US20170024010A1
US20170024010A1 (Application US14/804,930)
Authority
US
United States
Prior art keywords
output
user
input
guidance device
guidance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/804,930
Inventor
Chananiel Weinraub
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to US14/804,930
Assigned to Apple Inc. (Assignment of assignors interest; assignor: Weinraub, Chananiel)
Publication of US20170024010A1
Priority to US15/900,728 (US10254840B2)
Priority to US16/287,883 (US10664058B2)
Status: Abandoned


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06K9/00671
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes

Definitions

  • the described embodiments relate generally to guidance devices. More particularly, the present embodiments relate to guidance devices for the sensory impaired.
  • Some sensory impaired people use guidance devices or relationships to assist them in navigating and interacting with their environments. For example, some blind people may use a cane in order to navigate and interact with an environment. Others may use a guide animal.
  • the present disclosure relates to guidance devices for sensory impaired users.
  • Sensor data may be obtained regarding an environment.
  • a model of the environment may be generated and the model may be mapped at least to an input/output touch surface.
  • Tactile output and/or other output may be provided to a user based at least on the mapping. In this way, a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device.
  • a guidance device for a sensory impaired user may include an input/output touch surface, a sensor data component that obtains data regarding an environment around the guidance device, and a processing unit coupled to the input/output touch surface and the sensor data component.
  • the processing unit may generate a model of the environment based at least on the data, map the model to the input/output touch surface and provide tactile output to a user based at least on the mapping via the input/output touch surface.
  • the tactile output may be an arrangement of raised portions of the input/output touch surface or other tactile feedback configured to produce a tactile sensation of bumps.
  • the tactile output may include a representation of an object in the environment and a region of the input/output touch surface where the representation is provided may correspond to positional information regarding the object.
  • the positional information regarding the object corresponding to the region may be first positional information when the tactile output includes a first positional information context indicator and second positional information when the tactile output includes a second positional information context indicator.
  • the shape of the representation may be associated with a detected shape of the object.
  • the sensor data component may receive at least a portion of the data from another electronic device.
  • the processing unit may provide at least one audio notification based at least on the model via an audio component of the guidance device or another electronic device.
  • an assistance device for a sensory impaired user may include a surface operable to detect touch and provide tactile output, a sensor that detects information about an environment, and a processing unit coupled to the surface and the sensor.
  • the processing unit may determine a portion of the surface being touched, select a subset of the information for output, and provide the tactile output to a user corresponding to the subset of the information via the portion of the surface.
  • the sensor may detect orientation information regarding the assistance device and the processing unit may provide the tactile output via the portion of the surface according to the orientation information.
  • the sensor may detect location information regarding the assistance device and the tactile output may include a direction indication associated with navigation to a destination.
  • the tactile output may include an indication of a height of an object in the environment.
  • the tactile output may include an indication that the object is traveling in a course that will connect with a user (which may be determined using real time calculations).
  • the tactile output may include an indication that the user is approaching the object and the object is below a head height of the user.
  • the tactile output may include a first representation of a first object located in a direction of travel of the assistance device and a second representation of a second object located in an opposite direction of the direction of travel.
  • the tactile output may include a representation of an object in the environment and a texture of the representation may be associated with a detected texture of the object.
  • regions of the portion of the surface where the first representation and the second representation are provided may indicate that the first object is located in the direction of travel and the second object is located in the opposite direction of the direction of travel.
  • the processing unit may provide an audio notification via an audio component upon determining that the assistance device experiences a fall event during use.
  • an environmental exploration device may include a cylindrical housing, a processing unit located within the cylindrical housing, a touch sensing device coupled to the processing unit and positioned over the cylindrical housing, a haptic device (such as one or more piezoelectric cells) coupled to the processing unit and positioned adjacent to the touch sensing device, and an image sensor coupled to the processing unit that detects image data about an area around the cylindrical housing.
  • the processing unit may analyze the image data using image recognition to identify an object (and/or analyze data from one or more depth sensors to determine distance to and/or speed of moving objects), create an output image representing the object and positional information regarding the object in the area, map the output image to the haptic device, and provide the output image as tactile output to a user via the haptic device.
  • the processing unit may provide an audio description of the object and the positional information via an audio component.
  • the processing unit may determine details of a hand of the user that is touching the touch sensing device and map the output image to the haptic device in accordance with whether the hand is a left hand of the user, a right hand of the user, has a large palm size, has a small palm size, has less than four fingers, or does not have a thumb.
  • the environmental exploration device may also include a weight component coupled to the cylindrical housing operable to alter an orientation of the environmental exploration device.
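  • As a rough illustration of the pipeline described for this environmental exploration device, the following Python sketch shows one way recognized objects might be turned into an "output image" for a haptic device. All names, dimensions, and thresholds here are hypothetical; the patent does not specify a data format or driver interface.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # e.g. "truck", produced by an image-recognition step
    bearing_deg: float  # angle around the housing, 0 = direction of travel
    distance_m: float   # e.g. from a depth sensor

def build_output_image(objects, width=32, height=8, max_range_m=30.0):
    """Create a boolean output image (True = raised bump) encoding each
    object's bearing as a column and its distance as a row."""
    image = [[False] * width for _ in range(height)]
    for obj in objects:
        col = int((obj.bearing_deg % 360.0) / 360.0 * width) % width
        row = min(int(obj.distance_m / max_range_m * height), height - 1)
        image[row][col] = True
    return image

def present(objects, haptic, audio=None):
    """Map the output image to an (assumed) haptic driver and optionally give
    an audio description of each object and its position."""
    image = build_output_image(objects)
    for r, row_vals in enumerate(image):
        for c, raised in enumerate(row_vals):
            haptic.set_bump(r, c, raised)   # set_bump is a hypothetical driver call
    if audio is not None:
        for obj in objects:
            audio.say(f"{obj.label}, {obj.distance_m:.0f} meters")
```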
  • FIG. 1 shows a user navigating an example environment using a guidance device.
  • FIG. 2 shows the user navigating another example environment using the guidance device.
  • FIG. 3 shows a flow chart illustrating a method for providing guidance using a guidance device.
  • FIG. 4A shows an isometric view of an example guidance device.
  • FIG. 4B shows a block diagram illustrating functional relationships of example components of the guidance device of FIG. 4A .
  • FIG. 4C shows a diagram illustrating an example configuration of the input/output touch surface of the guidance device of FIG. 4A .
  • FIG. 5A shows a cross-sectional view of the example guidance device of FIG. 4A , taken along line A-A of FIG. 4A .
  • FIG. 5B shows a cross-sectional view of another example of the guidance device of FIG. 4A in accordance with further embodiments of the present disclosure.
  • FIG. 6A shows a diagram illustrating an example of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface of the guidance device of FIG. 5A .
  • FIG. 6B shows a diagram illustrating another example of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface of the guidance device of FIG. 5A .
  • FIGS. 7-10 show additional examples of guidance devices.
  • Embodiments described herein may permit a sensory-impaired user to quickly and efficiently interact with his or her environment.
  • Sensory-impaired users may use a device that provides guidance by communicating information about the environment to aid the user's interaction therewith.
  • the device may detect information about the environment, model the environment based on the information, and present guidance output based on the model in a fashion detectable by the user.
  • Such guidance output may be tactile so the user can quickly and efficiently “feel” the guidance output while interacting with the environment.
  • This device may enable the sensory-impaired user to more quickly and efficiently interact with his or her environment than is possible with existing sensory-impaired guidance devices such as canes.
  • the present disclosure relates to guidance devices for sensory impaired users.
  • Sensor data may be obtained regarding an environment around a guidance device, assistance device, environmental exploration device, and/or other such device.
  • a model of the environment may be generated based on the data.
  • the model may be mapped at least to an input/output touch surface of the guidance device.
  • Tactile output may be provided to a user of the guidance device via the input/output touch surface based at least on the mapping.
  • Other output based on the model may also be provided.
  • a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device.
  • Such a guidance device may provide better assistance than and/or take the place of a cane, a guidance animal, and/or other guidance devices and/or relationships.
  • the guidance device may include a variety of different components such as sensors that obtain data regarding the environment, input/output mechanisms for receiving input from and/or providing output to the user, processing units and/or other components for generating the model and/or mapping the model to various input/output mechanisms, and so on. Additionally, the guidance device may cooperate and/or communicate with a variety of different electronic devices that have one or more such components in order to perform one or more of these functions.
  • FIG. 1 shows a user 102 navigating an example environment 100 using a guidance device 101, assistance device, environmental exploration device, and/or other such device.
  • a guidance device 101 may be a device that detects information about the user's environment 100 and presents that information to the user to aid the user's 102 interaction with the environment 100 .
  • the user 102 is holding the guidance device 101 in a hand 103 while walking down a street.
  • the user 102 also is wearing a wearable device 109 and has a smart phone 108 in the user's pocket.
  • a traffic signal 106 and a moving truck 107 are in front of the user 102 and another person 104 looking at a cellular telephone 105 is walking behind the user 102 .
  • the user 102 may be receiving tactile, audio, and/or other guidance output related to the environment 100 from the guidance device 101 (which may detect information regarding the environment 100 upon which the guidance output may be based) and/or the smart phone 108 and/or wearable device 109 .
  • the user 102 may be receiving output that the user 102 is approaching the traffic signal 106 , the truck 107 is approaching the user 102 , the other person 104 is approaching the user 102 from behind, and so on.
  • one or more of the shown devices may be wired and/or wirelessly transmitting and/or receiving in order to communicate with one or more of each other. Such devices may communicate with each other in order to obtain environmental or other sensor data regarding the environment, generate a model based on the sensor data, and provide the guidance output to the user 102 based on the model.
  • Although FIG. 1 is illustrated as providing tactile guidance output to the hand 103 of the user 102, it is understood that this is an example.
  • tactile guidance output may be provided to various different parts of a user's body, such as a shirt made of a fabric configured to provide tactile output.
  • FIG. 2 shows the user 102 navigating another example environment 200 using the guidance device 101 .
  • the user 102 is holding the guidance device 101 (which may detect information regarding the environment 200 upon which the guidance output may be based) in the user's hand 103 while walking through a room in a house.
  • the room has a display screen 208 connected to a communication adapter 209 on a wall and a navigation beacon 206 on a table.
  • Another person 204 looking at a smart phone 205 is also in the room.
  • one or more of the devices may be transmitting and/or receiving in order to communicate with one or more of each other.
  • the user 102 may be receiving tactile, audio, and/or other guidance output related to the environment 200 from the guidance device 101 .
  • the user 102 may be receiving tactile output of the layout of the room based on a map provided by the navigation beacon 206 .
  • the user 102 may be receiving tactile and/or audio direction indications associated with navigation from the user's 102 current location in the environment 200 to a destination input by the other person 204 on the smart phone 205 , the guidance device 101 , and/or the display screen 208 .
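  • A direction indication like the one just described might be derived roughly as follows; the eight-sector quantization and the flat-earth bearing math are illustrative assumptions, not details from the patent.

```python
import math

def direction_indication(user_lat, user_lon, dest_lat, dest_lon, heading_deg):
    """Return which of eight sectors the destination lies in relative to the
    user's current heading, suitable for driving a simple tactile arrow.
    Uses a flat-earth approximation, which is adequate over short distances."""
    dy = dest_lat - user_lat
    dx = (dest_lon - user_lon) * math.cos(math.radians(user_lat))
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0        # 0 = north, clockwise
    relative = (bearing - heading_deg) % 360.0
    sectors = ["ahead", "ahead-right", "right", "behind-right",
               "behind", "behind-left", "left", "ahead-left"]
    return sectors[int(((relative + 22.5) % 360.0) // 45.0)]
```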
  • FIG. 3 shows a flow chart illustrating a method 300 for providing guidance using a guidance device.
  • a guidance device may be one or more of the guidance devices 101, 701, 801, 901, or 1001 illustrated and described herein with respect to FIGS. 1, 2, and 4-10 .
  • environmental or other sensor data may be obtained regarding an environment around a guidance device.
  • the environmental data may be obtained from a variety of different kinds of sensors or other sensor data components such as image sensors (such as cameras, three-dimensional cameras, infra-red image sensors, lasers, ambient light detectors, and so on), positional sensors (such as accelerometers, gyroscopes, magnetometers, and so on), navigation systems (such as a global positioning system or other such system), depth sensors, microphones, temperature sensors, Hall effect sensors, and so on.
  • one or more such sensors may be incorporated into the guidance device.
  • one or more such sensors may be components of other electronic devices and the sensor of the guidance device may be a communication component that receives environmental data from such other sensors transmitted from another electronic device (which may communicate with the guidance device in one or more client/server configurations, peer-to-peer configurations, mesh configurations, and/or other communication network configurations).
  • Such sensors may obtain environmental data regarding any aspect of an environment such as the presence and/or position of objects, the movement of objects, weather conditions, textures, temperatures, and/or any other information about the environment.
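  • A minimal sketch of how such readings, from both on-board sensors and other communicably connected devices, might be gathered into a single snapshot before a model is generated. The field names and the `local_sensors`/`communication_unit` interfaces are assumptions made only for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EnvironmentSnapshot:
    images: List[bytes] = field(default_factory=list)     # frames from image sensors
    depths_m: List[float] = field(default_factory=list)   # readings from depth sensors
    heading_deg: float = 0.0                               # from positional sensors
    location: Tuple[float, float] = (0.0, 0.0)             # from a navigation system
    remote_data: list = field(default_factory=list)        # data received from other devices

def gather(local_sensors, communication_unit) -> EnvironmentSnapshot:
    snap = EnvironmentSnapshot()
    snap.images = [camera.capture() for camera in local_sensors.cameras]
    snap.depths_m = local_sensors.depth.read_all()
    snap.heading_deg = local_sensors.imu.heading()
    snap.location = local_sensors.gps.fix()
    # A phone, wearable, or navigation beacon may contribute additional data.
    snap.remote_data = communication_unit.poll_peers()
    return snap
```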
  • a model of the environment may be generated for guidance output based at least on the environmental data.
  • the model may include a subset of the obtained environmental data.
  • the environmental data may be processed in order to determine which of the environmental data is relevant enough to the user to be included in the model.
  • processing may include image or object recognition and/or other analysis of environmental data.
  • an environment may contain a number of objects, but only the objects directly in front of, to the sides, and/or behind a user and/or particularly dangerous objects such as moving automobiles may be included in a model.
  • the determination of whether to include environmental data in the model may be dependent on a current state of one or more output devices that will be used to present guidance output data for the model.
  • an input/output touch surface may be used to provide tactile output via a currently touched area. When a larger area of the input/output touch surface is being touched, more environmental data may be included in the generated model. Conversely, when a smaller area of the input/output touch surface is being touched, less environmental data may be included in the generated model.
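  • In code, that trade-off might look like the following sketch; the relevance weighting and the object budget are illustrative assumptions.

```python
def select_model_objects(objects, touched_area_fraction):
    """Keep the most relevant detected objects, allowing more detail when a
    larger area of the input/output touch surface is being touched.
    Each object is assumed to expose distance_m and is_moving attributes."""
    def relevance(obj):
        # Nearer and moving objects (e.g. an approaching truck) rank higher.
        return 1.0 / max(obj.distance_m, 0.1) + (2.0 if obj.is_moving else 0.0)

    # Scale the object budget with the touched area: roughly 3 objects for a
    # few fingertips, up to 12 when most of the surface is covered.
    budget = max(3, int(12 * touched_area_fraction))
    return sorted(objects, key=relevance, reverse=True)[:budget]
```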
  • the guidance device may generate the model. However, in various cases the guidance device may cooperate with one or more other electronic devices communicably connected (such as in various client/server configurations, peer-to-peer configurations, mesh configurations, and/or other communication network configurations) to the guidance device in order to generate the model. For example, the guidance device may generate the model by receiving a model generated by another device. By way of another example, another device may process the environmental data and provide the guidance device an intermediate subset of the environmental data which the guidance device then uses to generate the model.
  • guidance output based at least on the model may be provided.
  • Such guidance output may be tactile output (such as shapes of objects, indications of positions or motions of objects, and so on), audio output (such as audio notifications related to and/or descriptions of objects or conditions in the environment and so on), and/or any other kind of output.
  • Providing guidance output based on the model may include mapping the model to one or more output devices and providing the guidance output based at least on the mapping via the output device(s).
  • an input/output touch surface may be used to provide tactile guidance output via an area currently being touched by a user's hand.
  • the area currently being touched by the user's hand may be determined along with the orientation of the user's hand touching the area and the model may be mapped to the determined area and orientation.
  • Tactile output may then be provided via the input/output touch surface according to the mapping.
  • the tactile output may be mapped in such a way as to convey shapes and/or textures of one or more objects, information about objects (such as position in the environment, distance, movement, speed of movement, and/or any other such information) via the position on the input/output touch surface (such as where the output is provided in relation to various portions of the user's hand) and/or other contextual indicators presented via the input/output touch surface, and/or other such information about the environment.
  • By mapping guidance output to the area currently being touched by a user's hand, power savings may be achieved, as output may not be provided via areas of the input/output touch surface that are not capable of being felt by the user. Further, by mapping guidance output to the area currently being touched by a user's hand, assistance may be provided to the user to ensure that the output is provided to portions of the user's hand as expected by the user, as opposed to making the user discover how to touch the guidance device in a particular manner.
  • determination of the area currently being touched by a user's hand may include determining details about the user's hand and the guidance output may be mapped according to such details.
  • the details may include that the hand is a left hand of the user, a right hand of the user, has a large palm size, has a small palm size, has less than four fingers, or does not have a thumb.
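  • A sketch of how the mapping step might fit the model to the touched region and handedness. The model is represented as a 2-D grid of booleans, and the left-hand mirroring is one plausible way to honor the hand details mentioned above; none of this is prescribed by the patent.

```python
def map_model_to_touched_region(model_image, region_height, region_width,
                                left_hand=False):
    """Scale a model image (2-D list of booleans, True = raised bump) down to
    the region of the surface the user's hand is actually touching, so no
    output is produced where it cannot be felt. Mirrors the image for a left
    hand so that objects ahead stay under the same fingers."""
    in_h, in_w = len(model_image), len(model_image[0])
    mapped = [[False] * region_width for _ in range(region_height)]
    for r in range(region_height):
        for c in range(region_width):
            src_r = r * in_h // region_height
            src_c = c * in_w // region_width
            if left_hand:
                src_c = in_w - 1 - src_c
            mapped[r][c] = model_image[src_r][src_c]
    return mapped
```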
  • the guidance device may provide the guidance output via one or more output components of the guidance device such as one or more input/output touch surfaces, speakers, and/or other output devices.
  • the guidance device may cooperate with one or more other electronic devices communicably connected (such as in various client/server configurations, peer-to-peer configurations, mesh configurations, and/or other communication network configurations) to provide one or more kinds of output.
  • a guidance device may provide a pattern of raised bumps or other such protrusions to indicate objects in the environment, the identity of the objects, the position of those objects, and the distance of those objects to the user.
  • a wearable device on the user's wrist may also provide vibration output to indicate that one or more of the objects are moving towards the user.
  • the user's cellular telephone may output audio directions associated with navigation of the user from the user's current location (such current location information may be detected by one or more sensors such as a global positioning system or other navigation component) to a destination.
  • Although the example method 300 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
  • guidance output may be provided based on environmental data without generating a model.
  • the guidance device may detect that the guidance device was held and then dropped using one or more positional sensors, input/output touch surface, and/or other such components. In such a case, the guidance device may emit an audio alarm upon such detection without generation of a model to aid a user in locating the dropped guidance device.
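  • One common way to detect such a fall event with a positional sensor is to watch for a stretch of near-free-fall followed by an impact spike; the thresholds and the `accelerometer`/`speaker` objects below are assumptions used only to sketch the idea.

```python
import time

def watch_for_drop(accelerometer, speaker,
                   free_fall_g=0.3, impact_g=2.5, max_fall_s=2.0):
    """Emit an audio alarm if the device appears to have been dropped:
    acceleration magnitude near zero (free fall) followed by a large spike."""
    falling_since = None
    while True:
        g = accelerometer.magnitude_g()      # total acceleration in units of g
        now = time.monotonic()
        if g < free_fall_g:
            falling_since = falling_since or now
        elif falling_since is not None:
            if g > impact_g and (now - falling_since) < max_fall_s:
                speaker.play_alarm()         # helps the user locate the dropped device
            falling_since = None
        time.sleep(0.01)
```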
  • FIG. 4A shows an isometric view of an example guidance device 101 .
  • the guidance device 101 may include a cylindrical housing 410 that may be formed of aluminum, plastic, and/or any other suitable material.
  • An input/output touch surface 411 may be disposed on a surface of the cylindrical housing 410 .
  • the input/output touch surface 411 may be operable to detect touch input (such as touch, force, pressure and so on via one or more capacitive sensors, touch sensors, and/or any other kind of touch and/or force sensors).
  • the input/output touch surface 411 may also be operable to provide tactile output, such as one or more patterns of vibrations, raised bumps or other protrusions, and so on.
  • the input/output touch surface 411 may include one or more piezoelectric cells that can be electrically manipulated to create one or more patterns of raised bumps on the input/output touch surface 411 .
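  • A minimal sketch of driving such a grid of piezoelectric cells from a bump pattern; `PiezoGrid.energize` stands in for whatever hardware driver actually exists.

```python
class PiezoGrid:
    """Hypothetical driver for an array of piezoelectric cells; energizing a
    cell raises a bump at that position on the input/output touch surface."""
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols

    def energize(self, row, col, raised):
        # Hardware-specific: e.g. apply or remove a drive voltage for this cell.
        pass

def apply_pattern(grid, pattern):
    """pattern is a rows x cols 2-D list of booleans; True raises the bump."""
    for r in range(grid.rows):
        for c in range(grid.cols):
            grid.energize(r, c, pattern[r][c])
```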
  • the guidance device 101 may also include image sensors 413 and 414 .
  • the image sensors 413 and 414 may be any kind of image sensors such as one or more cameras, three dimensional cameras, infra-red image sensors, lasers, ambient light detectors, and so on.
  • the guidance device 101 may also include one or more protectors 412 that may prevent the input/output touch surface 411 from contacting surfaces with which the guidance device 101 comes into contact.
  • the protectors 412 may be formed of rubber, silicone, plastic, and/or any other suitable material. As illustrated, the protectors 412 may be configured as rings. In such a case, the guidance device 101 may include one or more weights and/or other orientation elements (discussed further below) to prevent the guidance device 101 from rolling when placed on a surface. However, in other cases the protectors 412 may be shaped in other configurations (such as with a flat bottom) to prevent rolling of the guidance device 101 on a surface without use of a weight or other orientation element.
  • the cylindrical housing 410 is shaped such that it may be held in any number of different orientations.
  • the guidance device 101 may utilize the input/output touch surface 411 , one or more position sensors, and/or other components to detect orientation information regarding the orientation in which the guidance device 101 is being held in order to map output to output devices such as the input/output touch surface 411 accordingly.
  • the guidance device 101 may have a housing shaped to be held in a particular manner, such as where a housing is configured with a grip that conforms to a user's hand in a particular orientation. In such an implementation, detection of how the guidance device 101 is being held may be omitted while still allowing the guidance device 101 to correctly map output to output devices such as the input/output touch surface 411 .
  • the guidance device 101 may be configured to operate in a particular orientation.
  • the image sensor 413 may be configured as a front image sensor and the image sensor 414 may be configured as a rear image sensor. This may simplify analysis of environmental and/or sensor data from the image sensors 413 and 414 . This may also allow for particular configurations of the image sensors 413 and 414 , such as where the image sensor 413 is a wider angle image sensor than the image sensor 414 as a user may be more concerned with objects in front of the user than behind.
  • the guidance device 101 may be configurable to operate in a variety of orientations.
  • the image sensors 413 and 414 may be identical and the guidance device 101 may use the image sensors 413 and 414 based on a currently detected orientation (which may be based on detection by input/output touch surface 411 , one or more positional sensors, and/or other such components).
  • FIG. 4B shows a block diagram illustrating functional relationships of example components of the guidance device 101 of FIG. 4A .
  • the guidance device 101 may include one or more processing units 424 , batteries 423 , communication units 425 , positional sensors 426 , speakers 427 , microphones 428 , navigation systems 429 , image sensors 413 and 414 , tactile input/output surfaces 411 , and so on.
  • the guidance device 101 may also include one or more additional components not shown, such as one or more non-transitory storage media (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on).
  • the processing unit 424 may be configured such that the guidance device 101 is able to perform a variety of different functions. One such example may be the method 300 illustrated and described above with respect to FIG. 3 .
  • the processing unit 424 may also be configured such that the guidance device 101 is able to receive a variety of different input and/or provide a variety of different output.
  • the guidance device 101 may be operable to receive input via the communication unit 425 , the positional sensors 426 (such as by shaking or other motion of the guidance device 101 ), the microphone 428 (such as voice or other audio commands), the tactile input/output touch surface 411 , and so on.
  • the guidance device 101 may be operable to provide output via the communication unit 425 , speaker 427 (such as speech or other audio output), the tactile input/output touch surface 411 , and so on.
  • the tactile input/output touch surface 411 may be configured in a variety of different ways in a variety of different implementations such that it is operable to detect touch (or force, pressure, and so on) and/or provide tactile output.
  • the tactile input/output touch surface 411 may include a touch sensing device layer and a haptic device layer. Such touch sensing device and haptic device layers may be positioned adjacent to each other.
  • FIG. 4C shows a diagram illustrating an example configuration of the input/output touch surface 411 .
  • the input/output touch surface 411 may be positioned on the housing 410 .
  • the input/output touch surface 411 may include a number of layers such as a tactile feedback layer 411 B (such as piezoelectric cells, vibration actuators, and so on) and a touch layer 411 C (such as a capacitive touch sensing layer, a resistive touch sensing layer, and so on).
  • the input/output touch surface 411 may also include a coating 411 A (which may be formed of plastic or other material that may be more flexible than materials such as glass), which may function to protect the input/output touch surface 411 .
  • the input/output touch surface 411 may include a display layer 411 D.
  • a vision impaired user may not be completely blind and as such visual output may be presented to the user via the display layer 411 D.
  • visual output may be presented via the display layer 411 D to another person who is assisting the user of the guidance device 101 , such as where the other person is being presented visual output so the other person can input a destination for the user to which the guidance device 101 may then guide the user.
  • the input/output touch surface 411 may include a backlight layer 411 E.
  • the input/output touch surface 411 (and/or other components of the guidance device 101 ) may be operable to detect one or more biometrics of the user, such as a fingerprint, palm print and so on.
  • a user's fingerprint may be detected using a capacitive or other touch sensing device of the input/output touch surface 411 .
  • Such a biometric may be used to authenticate the user.
  • entering a password or other authentication mechanism may be more difficult for a sensory impaired user than for other users.
  • using a detected biometric for authentication purposes may make authentication processes easier for the user.
  • FIG. 5A shows a cross-sectional view of the example guidance device 101 of FIG. 4A , taken along line A-A of FIG. 4A .
  • the guidance device 101 may include a printed circuit board 521 (and/or other electronic module) with one or more connected electronic components 522 disposed on one or more surfaces thereon.
  • Such electronic components 522 may include one or more processing units, wired and/or wireless communication units, positional sensors, input/output units (such as one or more cameras, speakers or other audio components, microphones, and so on), navigation systems, and/or any other electronic component.
  • the printed circuit board 521 may be electrically connected to the input/output touch surface 411 , the image sensors 413 and 414 , one or more batteries 423 and/or other power sources, and so on.
  • the battery 423 may be configured as a weight at a “bottom” of the guidance device 101 . This may operate to orient the guidance device 101 as shown when the guidance device 101 is resting on a surface instead of being held by a user. As such, the battery 423 may prevent the guidance device 101 from rolling.
  • Although FIG. 5A illustrates a particular configuration of components, it is understood that this is an example. In other implementations of the guidance device 101 , other configurations of the same, similar, and/or different components are possible without departing from the scope of the present disclosure.
  • the image sensor 413 may be a wide angle image sensor configured as a front image sensor.
  • alternatively, the image sensor 413 may be a narrow angle image sensor configured as a front image sensor.
  • one or more additional image sensors 530 may be used.
  • the additional image sensor 530 may be located at a bottom corner of the cylindrical housing 410 .
  • the additional image sensor 530 may be maneuverable via a motor 531 and/or other movement mechanism.
  • the additional image sensor 530 may be rotatable via the motor 531 such that it can be operated to obtain image sensor data of an area around the user's feet in order to compensate for a narrow angle image sensor used for the image sensor 413 .
  • a similar image sensor/motor mechanism may be located at a top corner of the cylindrical housing 410 in order to obtain image sensor data of an area above the user's head.
  • one or more ends of the cylindrical housing 410 may be configured with flanges and/or other structures that project from the ends of the cylindrical housing 410 .
  • Such flanges and/or other structures may protect the image sensors 413 and/or 414 from damage.
  • FIG. 6A shows a diagram illustrating an example 600 A of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface 411 of the guidance device of FIG. 5A .
  • the input/output touch surface 411 is shown as unrolled and is marked to indicate what the guidance device 101 may have detected as the top and front of the input/output touch surface 411 based on the current orientation in which a user is holding the guidance device 101 .
  • the hand 641 indicates an area of the input/output touch surface 411 where a user's hand has been detected as currently touching the input/output touch surface 411 .
  • the input/output touch surface 411 is providing tactile output indicating information about a model generated based on environmental data regarding the environment 100 shown in FIG. 1 .
  • the input/output touch surface 411 may include a number of bumps that can be raised or not raised to convey output, such as via piezoelectric cells. As illustrated, filled bumps indicate raised bumps and unfilled bumps indicate bumps that are not raised.
  • the input/output touch surface 411 may provide tactile output via the raised bumps to indicate shapes or textures of objects in the environment.
  • the raised bumps 644 indicate the shape of the truck 107 in the environment 100 of FIG. 1
  • the raised bumps 645 indicate the shape of the traffic signal 106 in the environment 100 of FIG. 1 .
  • the user may be able to feel the shapes of the raised bumps 644 and 645 and understand that the truck 107 and traffic signal 106 are present.
  • the region in which tactile output is provided on the input/output touch surface 411 via the raised bumps may correspond to positional information regarding objects in the environment. Further, the relationships between raised bumps in various regions may correspond to relationships in the positions of the corresponding objects in the environment. For example, the raised bumps 645 are illustrated as further to the left on the input/output touch surface 411 than the raised bumps 644 . This may correspond to the fact that the traffic signal 106 is closer to the user 102 in the environment 100 of FIG. 1 than the truck 107 .
  • the tactile output may also include a variety of different context indicators.
  • the regions in which tactile output is provided on the input/output touch surface 411 via the raised bumps may correspond to positional information regarding objects in the environment.
  • the positional information indicated by the regions may be dependent on a context indicator presented via tactile output via the input/output touch surface 411 and/or otherwise presented, such as where the positional information is first positional information when a first positional context indicator is provided and is second positional information when a second positional context indicator is provided.
  • FIG. 6A illustrates a series of ranges “Range 1 ,” “Range 2 ,” and “Range 3 .”
  • Each range maps an area of the input/output touch surface 411 touched by the user's hand, as indicated by the hand 641 , to a distance range.
  • Range 1 maps the pinky finger of the hand 641 to a distance of 0 meters, the ring finger to a distance of 1 meter, and the middle finger to a distance of 3 meters.
  • Range 2 maps the pinky finger of the hand 641 to a distance of 0 meters, the ring finger to a distance of 10 meters, and the middle finger to a distance of 30 meters.
  • Range 3 maps the pinky finger of the hand 641 to a distance of 0 meters, the ring finger to a distance of 100 meters, and the middle finger to a distance of 300 meters.
  • Regions 642 A-C may be range context indicators that indicate via tactile output which of the ranges is being currently presented.
  • 642 A may indicate Range 1
  • 642 B may indicate Range 2
  • 642 C may indicate Range 3 .
  • a bump is raised for 642 A. This indicates in this example that Range 1 is currently being used.
  • the traffic signal 106 is indicated by the raised bumps 645 as being 2 meters from the user and the truck 107 is indicated by the raised bumps 644 as being 3 meters from the user.
  • the raised bump(s) of the regions 642 A-C indicating the range currently being used may be alternatingly raised and lowered to create the sensation that the context indicator is being “flashed.” This may indicate that one or more of the objects are moving.
  • the raised bumps 644 and/or 645 may be similarly raised and lowered to create the sensation of flashing to indicate that the respective object is moving.
  • the speed of the flashing may correspond to the speed of the movement.
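  • The range selection, finger-to-distance mapping, and flash-speed cue described above could be computed roughly as follows. The 0/1/3, 0/10/30, and 0/100/300 meter stops come from the example; the discrete finger assignment and the flash-rate scaling are assumptions.

```python
RANGES = {            # range name -> distances mapped to the pinky, ring, and middle fingers
    "Range 1": (0.0, 1.0, 3.0),
    "Range 2": (0.0, 10.0, 30.0),
    "Range 3": (0.0, 100.0, 300.0),
}

def choose_range(object_distances_m):
    """Pick the smallest range that still covers the farthest relevant object."""
    farthest = max(object_distances_m, default=0.0)
    for name, stops in RANGES.items():
        if farthest <= stops[-1]:
            return name
    return "Range 3"

def finger_for_distance(distance_m, range_name):
    """Return the finger region (0 = pinky, 1 = ring, 2 = middle) whose labeled
    distance is closest below the object's distance under the chosen range."""
    stops = RANGES[range_name]
    for i in reversed(range(len(stops))):
        if distance_m >= stops[i]:
            return i
    return 0

def flash_period_s(speed_m_s, fastest_s=0.1, slowest_s=1.0):
    """Faster-moving objects flash faster; stationary objects stay steadily raised."""
    if speed_m_s <= 0:
        return None
    return min(slowest_s, max(fastest_s, 1.0 / speed_m_s))
```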
  • a zone 643 may present tactile output related to one or more alerts to which the user's attention is specially directed.
  • the zone 643 may present indications of objects above the user's head, objects at the user's feet, the height of an object in the environment, the fact that an object is traveling in a course that will connect with a user (which may be determined using real time calculations), the fact that the user is approaching the object and the object is below a head height of the user, and so on.
  • other alerts may be provided via the zone 643 , such as raising and lowering bumps in the zone 643 to indicate that an object is moving toward the user at high speed.
  • Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • one portion of the input/output touch surface 411 may correspond to objects located in the user's direction of travel while another portion corresponds to objects located in the opposite direction.
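  • One standard way to carry out the real-time calculation mentioned above for the alert zone is a closest-point-of-approach test on each tracked object's position and velocity relative to the user; the alert radius and time horizon below are illustrative.

```python
def on_collision_course(rel_pos_m, rel_vel_m_s,
                        alert_radius_m=1.0, horizon_s=10.0):
    """Return True if an object, with position and velocity given relative to
    the user as (x, y) tuples, will pass within alert_radius_m of the user in
    the next horizon_s seconds."""
    px, py = rel_pos_m
    vx, vy = rel_vel_m_s
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0.0:
        return (px * px + py * py) ** 0.5 <= alert_radius_m
    # Time at which the object is closest to the user, clamped to [0, horizon_s].
    t_cpa = max(0.0, min(horizon_s, -(px * vx + py * vy) / speed_sq))
    cx, cy = px + vx * t_cpa, py + vy * t_cpa
    return (cx * cx + cy * cy) ** 0.5 <= alert_radius_m
```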
  • FIG. 6B shows a diagram illustrating another example 600 B of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface 411 of the guidance device 101 of FIG. 5A .
  • an area 646 of the input/output touch surface 411 may provide tactile output related to detected speech in an environment.
  • a microphone or other sound component of the guidance device 101 may be used to detect one or more words spoken in the environment.
  • the guidance device 101 may perform voice to text speech recognition on the detected spoken words and provide a tactile output presenting the text in the area 646 .
  • the detected speech may be presented in braille via raised bumps in the area 646 .
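  • To make the braille presentation concrete, the sketch below turns recognized text into six-dot cell patterns for the raised bumps in area 646. Only a few letters are included, and the speech-to-text step itself is left to whatever recognizer the device uses.

```python
# Standard six-dot braille cells, written as sets of raised dot numbers
# (dots 1-3 run down the left column, dots 4-6 down the right).
BRAILLE_DOTS = {
    "s": {2, 3, 4},
    "t": {2, 3, 4, 5},
    "o": {1, 3, 5},
    "p": {1, 2, 3, 4},
}

def cell_pattern(letter):
    """Return a 3x2 grid of booleans (True = raised bump) for one braille cell."""
    dots = BRAILLE_DOTS.get(letter.lower(), set())
    return [[1 in dots, 4 in dots],
            [2 in dots, 5 in dots],
            [3 in dots, 6 in dots]]

def render_text(text):
    """Lay out one braille cell per recognized character, left to right."""
    return [cell_pattern(ch) for ch in text if ch.lower() in BRAILLE_DOTS]

# Example: render_text("stop") yields four 3x2 cells that area 646 could raise in sequence.
```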
  • FIG. 7 shows an additional example of a guidance device 701 .
  • the guidance device 701 may be a smart phone with a housing 710 , one or more cameras 713 or other sensors, an input/output touch display screen surface 711 (that may include a touch sensing device to detect touch and a haptic device to provide tactile output), and/or various other components.
  • the smart phone may provide guidance to a user by performing a method such as the method 300 illustrated and described above. A user may place a hand on the input/output touch display screen surface 711 in order to feel tactile output related to guidance.
  • FIG. 8 shows another example of a guidance device 801 .
  • the guidance device 801 may be a tablet computing device. Similar to the smart phone of FIG. 7 , the tablet computing device may include a housing 810 , one or more cameras 813 or other sensors, an input/output touch display screen surface 811 , and/or various other components.
  • the tablet computing device may provide guidance to a user by performing a method such as the method 300 illustrated and described above.
  • As the input/output touch display screen surface 811 is larger than the input/output touch display screen surface 711 , placement of a hand on the input/output touch display screen surface 811 in order to receive tactile output related to guidance may be more comfortable, and the guidance device 801 may be capable of providing more tactile information than the guidance device 701 .
  • FIG. 9 shows yet another example of a guidance device 901 .
  • the guidance device 901 may be an item of apparel. Similar to the smart phone of FIG. 7 and the tablet computing device of FIG. 8 , the item of apparel may include a housing 910 , one or more cameras 913 or other sensors, an input/output touch surface 911 , and/or various other components.
  • the item of apparel may provide guidance to a user by performing a method such as the method 300 illustrated and described above. As shown, the input/output touch surface 911 may be in contact with a user's back when the item of apparel is worn. Thus, the user may feel tactile output related to guidance provided by the item of apparel without other people being able to visibly detect that the user is receiving guidance.
  • FIG. 10 shows still another example of a guidance device 1001 .
  • the guidance device 1001 may be a smart watch and/or other wearable device. Similar to the smart phone of FIG. 7 , the tablet computing device of FIG. 8 , and the item of apparel of FIG. 9 , the smart watch may include a housing 1010 , one or more cameras 1013 or other sensors, an input/output touch surface 1011 , and/or various other components.
  • the smart watch may provide guidance to a user by performing a method such as the method 300 illustrated and described above. As shown, the input/output touch surface 1011 may be in contact with a user's wrist when the smart watch is attached. Thus, the user may feel tactile output related to guidance provided by the smart watch in a hands free manner.
  • the present disclosure relates to guidance devices for sensory impaired users.
  • Sensor data may be obtained regarding an environment around a guidance device, assistance device, environmental exploration device, and/or other such device.
  • a model of the environment may be generated based on the data.
  • the model may be mapped at least to an input/output touch surface of the guidance device.
  • Tactile output may be provided to a user of the guidance device via the input/output touch surface based at least on the mapping.
  • Other output based on the model may also be provided.
  • a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device.
  • Such a guidance device may provide better assistance than and/or take the place of a cane, a guidance animal, and/or other guidance devices and/or relationships.
  • the described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure.
  • a non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
  • the non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
  • the use of personal information data, including biometric data, in the present technology can be used to the benefit of users.
  • biometric authentication data can be used for convenient access to device features without the use of passwords.
  • user biometric data is collected for providing users with feedback about their health or fitness levels.
  • other uses for personal information data, including biometric data, that benefit the user are also contemplated by the present disclosure.
  • the present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices.
  • such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure, including the use of data encryption and security methods that meet or exceed industry or government standards.
  • personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users.
  • such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
  • the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data, including biometric data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data.
  • the present technology can be configured to allow users to optionally bypass biometric authentication steps by providing secure information such as passwords, personal identification numbers (PINS), touch gestures, or other authentication methods, alone or in combination, known to those of skill in the art.
  • users can select to remove, disable, or restrict access to certain health-related applications collecting users' personal health or fitness data.

Abstract

Sensor data is obtained regarding an environment around a guidance device. A model of the environment is generated based on the data. The model is mapped at least to an input/output touch surface of the guidance device. Tactile output is provided to a user of the guidance device via the input/output touch surface based at least on the mapping. Other output based on the model may also be provided. The guidance device may include a variety of different components such as sensors that obtain data regarding the environment, input/output mechanisms for receiving input from and/or providing output to the user, processing units and/or other components for generating the model and/or mapping the model to various input/output mechanisms, and so on. Additionally, the guidance device may cooperate and/or communicate with a variety of different electronic devices that have one or more such components in order to perform such functions.

Description

    FIELD
  • The described embodiments relate generally to guidance devices. More particularly, the present embodiments relate to guidance devices for the sensory impaired.
  • BACKGROUND
  • People use a variety of senses to navigate and interact with the various environments they encounter on a daily basis. For example, people use their senses of sight and sound to navigate in their homes, on the street, through workplaces and shopping centers, and so on. Such environments may be designed and configured under the assumption that people will be able to use senses such as sight and sound for navigation.
  • However, many people are sensory impaired in one way or another. People may be deaf or at least partially auditorily impaired, blind or at least partially visually impaired, and so on. By way of example, the World Health Organization estimated in April of 2012 that 285 million people were visually impaired. Of these 285 million people, 246 million were estimated as having low vision and 39 million were estimated to be blind. Navigation through environments designed and configured for those lacking sensory impairment may be challenging or difficult for the sensory impaired.
  • Some sensory impaired people use guidance devices or relationships to assist them in navigating and interacting with their environments. For example, some blind people may use a cane in order to navigate and interact with an environment. Others may use a guide animal.
  • SUMMARY
  • The present disclosure relates to guidance devices for sensory impaired users. Sensor data may be obtained regarding an environment. A model of the environment may be generated and the model may be mapped at least to an input/output touch surface. Tactile output and/or other output may be provided to a user based at least on the mapping. In this way, a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device.
  • In various embodiments, a guidance device for a sensory impaired user may include an input/output touch surface, a sensor data component that obtains data regarding an environment around the guidance device, and a processing unit coupled to the input/output touch surface and the sensor data component. The processing unit may generate a model of the environment based at least on the data, map the model to the input/output touch surface and provide tactile output to a user based at least on the mapping via the input/output touch surface.
  • In some examples, the tactile output may be an arrangement of raised portions of the input/output touch surface or other tactile feedback configured to produce a tactile sensation of bumps.
  • In various examples, the tactile output may include a representation of an object in the environment and a region of the input/output touch surface where the representation is provided may correspond to positional information regarding the object. The positional information regarding the object corresponding to the region may be first positional information when the tactile output includes a first positional information context indicator and second positional information when the tactile output includes a second positional information context indicator. The shape of the representation may be associated with a detected shape of the object.
  • In some examples, the sensor data component may receive at least a portion of the data from another electronic device.
  • In various examples, the processing unit may provide at least one audio notification based at least on the model via an audio component of the guidance device or another electronic device.
  • In some embodiments, an assistance device for a sensory impaired user may include a surface operable to detect touch and provide tactile output, a sensor that detects information about an environment, and a processing unit coupled to the surface and the sensor. The processing unit may determine a portion of the surface being touched, select a subset of the information for output, and provide the tactile output to a user corresponding to the subset of the information via the portion of the surface.
  • In some examples, the sensor may detect orientation information regarding the assistance device and the processing unit may provide the tactile output via the portion of the surface according to the orientation information. In various examples, the sensor may detect location information regarding the assistance device and the tactile output may include a direction indication associated with navigation to a destination.
  • In some examples, the tactile output may include an indication of a height of an object in the environment. In various examples, the tactile output may include an indication that the object is traveling in a course that will connect with a user (which may be determined using real time calculations). In some examples, the tactile output may include an indication that the user is approaching the object and the object is below a head height of the user.
  • In various examples, the tactile output may include a first representation of a first object located in a direction of travel of the assistance device and a second representation of a second object located in an opposite direction of the direction of travel. In some examples, the tactile output may include a representation of an object in the environment and a texture of the representation may be associated with a detected texture of the object. In various examples, regions of the portion of the surface where the first representation and the second representation are provided may indicate that the first object is located in the direction of travel and the second object is located in the opposite direction of the direction of travel.
  • In some examples, the processing unit may provide an audio notification via an audio component upon determining that the assistance device experiences a fall event during use.
  • In various embodiments, an environmental exploration device may include a cylindrical housing, a processing unit located within the cylindrical housing, a touch sensing device coupled to the processing unit and positioned over the cylindrical housing, a haptic device (such as one or more piezoelectric cells) coupled to the processing unit and positioned adjacent to the touch sensing device, and an image sensor coupled to the processing unit that detects image data about an area around the cylindrical housing. The processing unit may analyze the image data using image recognition to identify an object (and/or analyze data from one or more depth sensors to determine distance to and/or speed of moving objects), create an output image representing the object and positional information regarding the object in the area, map the output image to the haptic device, and provide the output image as tactile output to a user via the haptic device.
  • In some examples, the processing unit may provide an audio description of the object and the positional information via an audio component. In various examples, the processing unit may determine details of a hand of the user that is touching the touch sensing device and map the output image to the haptic device in accordance with whether the hand is a left hand of the user, a right hand of the user, has a large palm size, has a small palm size, has less than four fingers, or does not have a thumb.
  • In various examples, the environmental exploration device may also include a weight component coupled to the cylindrical housing operable to alter an orientation of the environmental exploration device.
BRIEF DESCRIPTION OF THE DRAWINGS
  • The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements.
  • FIG. 1 shows a user navigating an example environment using a guidance device.
  • FIG. 2 shows the user navigating another example environment using the guidance device.
  • FIG. 3 shows a flow chart illustrating a method for providing guidance using a guidance device.
  • FIG. 4A shows an isometric view of an example guidance device.
  • FIG. 4B shows a block diagram illustrating functional relationships of example components of the guidance device of FIG. 4A.
  • FIG. 4C shows a diagram illustrating an example configuration of the input/output touch surface of the guidance device of FIG. 4A.
  • FIG. 5A shows a cross-sectional view of the example guidance device of FIG. 4A, taken along line A-A of FIG. 4A.
  • FIG. 5B shows a cross-sectional view of another example of the guidance device of FIG. 4A in accordance with further embodiments of the present disclosure.
  • FIG. 6A shows a diagram illustrating an example of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface of the guidance device of FIG. 5A.
  • FIG. 6B shows a diagram illustrating another example of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface of the guidance device of FIG. 5A.
  • FIGS. 7-10 show additional examples of guidance devices.
DETAILED DESCRIPTION
  • Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
• Embodiments described herein may permit a sensory-impaired user to quickly and efficiently interact with his or her environment. Sensory-impaired users may use a device that provides guidance by communicating information about the environment to aid the user's interaction with it. The device may detect information about the environment, model the environment based on that information, and present guidance output based on the model in a fashion detectable by the user. Such guidance output may be tactile so the user can quickly and efficiently “feel” the guidance output while interacting with the environment. This device may enable the sensory-impaired user to interact with his or her environment more quickly and efficiently than is possible with existing guidance devices for the sensory impaired, such as canes.
• The present disclosure relates to guidance devices for sensory impaired users. Sensor data may be obtained regarding an environment around a guidance device, assistance device, environmental exploration device, and/or other such device. A model of the environment may be generated based on the data. The model may be mapped at least to an input/output touch surface of the guidance device. Tactile output may be provided to a user of the guidance device via the input/output touch surface based at least on the mapping. Other output based on the model may also be provided. In this way, a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device. Such a guidance device may provide better assistance than, and/or take the place of, a cane, a guidance animal, and/or other guidance devices and/or relationships.
• The guidance device may include a variety of different components, such as sensors that obtain data regarding the environment, input/output mechanisms for receiving input from and/or providing output to the user, processing units and/or other components for generating the model and/or mapping the model to various input/output mechanisms, and so on. Additionally, the guidance device may cooperate and/or communicate with a variety of different electronic devices that have one or more such components in order to perform one or more of these functions.
  • These and other embodiments are discussed below with reference to FIGS. 1-10. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these Figures is for explanatory purposes only and should not be construed as limiting.
• FIG. 1 shows a user 102 navigating an example environment 100 using a guidance device 101, assistance device, environmental exploration device, and/or other such device. A guidance device 101 may be a device that detects information about the user's environment 100 and presents that information to the user 102 to aid the user's interaction with the environment 100.
• As illustrated, the user 102 is holding the guidance device 101 in a hand 103 while walking down a street. The user 102 also is wearing a wearable device 109 and has a smart phone 108 in the user's pocket. A traffic signal 106 and a moving truck 107 are in front of the user 102, and another person 104 looking at a cellular telephone 105 is walking behind the user 102. The user 102 may be receiving tactile, audio, and/or other guidance output related to the environment 100 from the guidance device 101 (which may detect information regarding the environment 100 upon which the guidance output may be based) and/or the smart phone 108 and/or wearable device 109. For example, the user 102 may be receiving output that the user 102 is approaching the traffic signal 106, that the truck 107 is approaching the user 102, that the other person 104 is approaching the user 102 from behind, and so on. As illustrated, one or more of the shown devices may be transmitting and/or receiving over wired and/or wireless connections in order to communicate with one another. Such devices may communicate with each other in order to obtain environmental or other sensor data regarding the environment, generate a model based on the sensor data, and provide the guidance output to the user 102 based on the model.
• Although FIG. 1 is illustrated as providing tactile guidance output to the hand 103 of the user 102, it is understood that this is an example. In various implementations, tactile guidance output may be provided to various different parts of a user's body, such as via a shirt made of a fabric configured to provide tactile output.
• Similarly, FIG. 2 shows the user 102 navigating another example environment 200 using the guidance device 101. As illustrated, the user 102 is holding the guidance device 101 (which may detect information regarding the environment 200 upon which the guidance output may be based) in the user's hand 103 while walking through a room in a house. The room has a display screen 208 connected to a communication adapter 209 on a wall and a navigation beacon 206 on a table. Another person 204 looking at a smart phone 205 is also in the room. As illustrated, one or more of the devices may be transmitting and/or receiving in order to communicate with one another. The user 102 may be receiving tactile, audio, and/or other guidance output related to the environment 200 from the guidance device 101. For example, the user 102 may be receiving tactile output of the layout of the room based on a map provided by the navigation beacon 206. By way of another example, the user 102 may be receiving tactile and/or audio direction indications associated with navigation from the user's 102 current location in the environment 200 to a destination input by the other person 204 on the smart phone 205, the guidance device 101, and/or the display screen 208.
• FIG. 3 shows a flow chart illustrating a method 300 for providing guidance using a guidance device. By way of example, such a guidance device may be one or more of the guidance devices 101, 701, 801, 901, or 1001 illustrated and described herein with respect to FIGS. 1, 2, and 4-10.
• At 310, environmental or other sensor data may be obtained regarding an environment around a guidance device. The environmental data may be obtained from a variety of different kinds of sensors or other sensor data components, such as image sensors (such as cameras, three-dimensional cameras, infra-red image sensors, lasers, ambient light detectors, and so on), positional sensors (such as accelerometers, gyroscopes, magnetometers, and so on), navigation systems (such as a global positioning system or other such system), depth sensors, microphones, temperature sensors, Hall effect sensors, and so on. In some implementations, one or more such sensors may be incorporated into the guidance device. In other implementations, one or more such sensors may be components of other electronic devices, and the sensor of the guidance device may be a communication component that receives environmental data transmitted from another electronic device (which may communicate with the guidance device in one or more client/server configurations, peer-to-peer configurations, mesh configurations, and/or other communication network configurations). Such sensors may obtain environmental data regarding any aspect of an environment, such as the presence and/or position of objects, the movement of objects, weather conditions, textures, temperatures, and/or any other information about the environment.
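For illustration only, the data-gathering step described above might be organized along the following lines. This is a minimal sketch, not the disclosed implementation; the `SensorReading` and `EnvironmentSnapshot` types and the `read()`/`fetch_readings()` methods are hypothetical placeholders for whatever interfaces an actual device exposes.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class SensorReading:
    source: str          # e.g. "front_camera", "phone_gps" (illustrative names)
    kind: str            # e.g. "image", "position", "depth", "object"
    value: object        # raw payload; format depends on the sensor
    timestamp: float     # seconds since an arbitrary reference epoch


@dataclass
class EnvironmentSnapshot:
    readings: List[SensorReading] = field(default_factory=list)

    def add(self, reading: SensorReading) -> None:
        self.readings.append(reading)

    def by_kind(self, kind: str) -> List[SensorReading]:
        return [r for r in self.readings if r.kind == kind]


def gather_environment_data(local_sensors, remote_links) -> EnvironmentSnapshot:
    """Collect one snapshot from on-device sensors and any linked devices."""
    snapshot = EnvironmentSnapshot()
    for sensor in local_sensors:          # objects assumed to expose read()
        snapshot.add(sensor.read())
    for link in remote_links:             # e.g. a paired phone or wearable (assumed)
        for reading in link.fetch_readings():
            snapshot.add(reading)
    return snapshot
```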
• At 320, a model of the environment may be generated for guidance output based at least on the environmental data. As even the smallest environment may contain too much information to output, or too much to output in a way that a user can make sense of, the model may include a subset of the obtained environmental data. The environmental data may be processed in order to determine which of the environmental data is relevant enough to the user to be included in the model. Such processing may include image or object recognition and/or other analysis of the environmental data. For example, an environment may contain a number of objects, but only the objects directly in front of, to the sides of, and/or behind a user and/or particularly dangerous objects such as moving automobiles may be included in a model.
  • In some cases, the determination of whether to include environmental data in the model may be dependent on a current state of one or more output devices that will be used to present guidance output data for the model. For example, an input/output touch surface may be used to provide tactile output via a currently touched area. When a larger area of the input/output touch surface is being touched, more environmental data may be included in the generated model. Conversely, when a smaller area of the input/output touch surface is being touched, less environmental data may be included in the generated model.
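A minimal sketch of such subset selection, building on the hypothetical snapshot type above, might look like the following; the relevance heuristic and the area-to-item budget are illustrative assumptions only, not values taken from this disclosure.

```python
def build_model(snapshot, touched_area_cm2, max_items_per_cm2=0.5):
    """Keep only the most relevant detected objects, scaled to the touched area.

    A larger touched area allows more objects into the model; a smaller one
    allows fewer. Both the budget and the ranking rule are assumptions.
    """
    budget = max(1, int(touched_area_cm2 * max_items_per_cm2))
    objects = snapshot.by_kind("object")           # detected objects (assumed kind)
    # Rank by a simple relevance heuristic: dangerous objects first, then nearest.
    ranked = sorted(
        objects,
        key=lambda o: (not o.value.get("dangerous", False),
                       o.value.get("distance_m", 1e9)),
    )
    return ranked[:budget]
```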
  • The guidance device may generate the model. However, in various cases the guidance device may cooperate with one or more other electronic devices communicably connected (such as in various client/server configurations, peer-to-peer configurations, mesh configurations, and/or other communication network configurations) to the guidance device in order to generate the model. For example, the guidance device may generate the model by receiving a model generated by another device. By way of another example, another device may process the environmental data and provide the guidance device an intermediate subset of the environmental data which the guidance device then uses to generate the model.
  • At 330, guidance output based at least on the model may be provided. Such guidance output may be tactile output (such as shapes of objects, indications of positions or motions of objects, and so on), audio output (such as audio notifications related to and/or descriptions of objects or conditions in the environment and so on), and/or any other kind of output. Providing guidance output based on the model may include mapping the model to one or more output devices and providing the guidance output based at least on the mapping via the output device(s).
  • For example, an input/output touch surface may be used to provide tactile guidance output via an area currently being touched by a user's hand. The area currently being touched by the user's hand may be determined along with the orientation of the user's hand touching the area and the model may be mapped to the determined area and orientation. Tactile output may then be provided via the input/output touch surface according to the mapping. In such a case, the tactile output may be mapped in such a way as to convey shapes and/or textures of one or more objects, information about objects (such as position in the environment, distance, movement, speed of movement, and/or any other such information) via the position on the input/output touch surface (such as where the output is provided in relation to various portions of the user's hand) and/or other contextual indicators presented via the input/output touch surface, and/or other such information about the environment.
• By mapping guidance output to the area currently being touched by a user's hand, power savings may be achieved, as output need not be provided via areas of the input/output touch surface that are not capable of being felt by the user. Further, mapping guidance output to the area currently being touched by a user's hand ensures that the output is provided to the portions of the user's hand the user expects, as opposed to making the user discover how to touch the guidance device in a particular manner.
  • In some implementations, determination of the area currently being touched by a user's hand may include determining details about the user's hand and the guidance output may be mapped according to such details. For example, the details may include that the hand is a left hand of the user, a right hand of the user, has a large palm size, has a small palm size, has less than four fingers, or does not have a thumb.
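One way the mapping and hand-detail adjustment described above could be sketched is shown below; the taxel-grid layout, the normalized bearing and distance fields, and the left-hand mirroring rule are assumptions for illustration, not the disclosed implementation.

```python
def map_model_to_surface(model_objects, touch_region, hand_info):
    """Map modeled objects onto the touched region of the tactile surface.

    model_objects: dicts with normalized bearing and distance (assumed fields).
    touch_region: (row0, col0, rows, cols) of the taxel grid under the hand.
    hand_info: e.g. {"handedness": "left"} -- output is mirrored for a left hand.
    """
    row0, col0, rows, cols = touch_region
    frame = [[0] * cols for _ in range(rows)]        # 0 = lowered, 1 = raised taxel
    for obj in model_objects:
        # Place each object laterally by its bearing and vertically by distance.
        col = int(obj["bearing_norm"] * (cols - 1))   # 0.0 = far left, 1.0 = far right
        row = int(obj["distance_norm"] * (rows - 1))  # 0.0 = nearest, 1.0 = farthest
        if hand_info.get("handedness") == "left":
            col = cols - 1 - col                      # mirror for a left hand
        frame[row][col] = 1
    return row0, col0, frame
```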
• The guidance device may provide the guidance output via one or more output components of the guidance device, such as one or more input/output touch surfaces, speakers, and/or other output devices. However, in some cases the guidance device may cooperate with one or more other electronic devices communicably connected (such as in various client/server configurations, peer-to-peer configurations, mesh configurations, and/or other communication network configurations) to provide one or more kinds of output.
  • For example, a guidance device may provide a pattern of raised bumps or other such protrusions to indicate objects in the environment, the identity of the objects, the position of those objects, and the distance of those objects to the user. A wearable device on the user's wrist may also provide vibration output to indicate that one or more of the objects are moving towards the user. Further, the user's cellular telephone may output audio directions associated with navigation of the user from the user's current location (such current location information may be detected by one or more sensors such as a global positioning system or other navigation component) to a destination.
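The division of output across cooperating devices might be sketched as a simple dispatcher such as the following; the channel names, device callables, and message format are hypothetical.

```python
def dispatch_guidance(model_objects, channels):
    """Route guidance output to whichever connected devices can present it.

    channels: mapping of capability name to a callable on some device, e.g.
    {"tactile": device.raise_pattern, "vibrate": watch.pulse, "audio": phone.speak}.
    All of these names are assumptions for illustration.
    """
    if "tactile" in channels:
        channels["tactile"](model_objects)            # bump pattern for all objects
    approaching = [o for o in model_objects if o.get("approaching")]
    if approaching and "vibrate" in channels:
        channels["vibrate"]()                          # alert that something approaches
    if "audio" in channels:
        for obj in model_objects:
            channels["audio"](f"{obj['label']} about {obj['distance_m']:.0f} meters ahead")
```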
  • Although the example method 300 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
• For example, although the method 300 is illustrated and described as generating a model based on the environmental data and providing guidance output based on the model, it is understood that this is an example. In some implementations, guidance output may be provided based on environmental data without generating a model. In one example of such a scenario, the guidance device may detect that the guidance device was held and then dropped using one or more positional sensors, the input/output touch surface, and/or other such components. In such a case, the guidance device may emit an audio alarm upon such detection, without generation of a model, to aid a user in locating the dropped guidance device.
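A drop event of this kind could, for example, be approximated from accelerometer magnitudes as a period of free fall followed by an impact spike. The thresholds and the `play_alarm()` call below are illustrative assumptions, not the disclosed detection method.

```python
FREE_FALL_G = 0.3   # acceleration magnitude (in g) below which free fall is assumed
IMPACT_G = 2.5      # magnitude above which an impact is assumed; both thresholds illustrative


def detect_drop(accel_samples, was_held):
    """Return True if a held device appears to have been dropped.

    accel_samples: recent acceleration magnitudes in g, oldest first.
    A drop is approximated as free fall followed by an impact spike.
    """
    if not was_held:
        return False
    saw_free_fall = False
    for g in accel_samples:
        if g < FREE_FALL_G:
            saw_free_fall = True
        elif saw_free_fall and g > IMPACT_G:
            return True
    return False


# Example use (hypothetical speaker API):
# if detect_drop(samples, was_held=True):
#     speaker.play_alarm()
```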
  • FIG. 4A shows an isometric view of an example guidance device 101. The guidance device 101 may include a cylindrical housing 410 that may be formed of aluminum, plastic, and/or any other suitable material. An input/output touch surface 411 may be disposed on a surface of the cylindrical housing 410. The input/output touch surface 411 may be operable to detect touch input (such as touch, force, pressure and so on via one or more capacitive sensors, touch sensors, and/or any other kind of touch and/or force sensors). The input/output touch surface 411 may also be operable to provide tactile output, such as one or more patterns of vibrations, raised bumps or other protrusions, and so on. For example, the input/output touch surface 411 may include one or more piezoelectric cells that can be electrically manipulated to create one or more patterns of raised bumps on the input/output touch surface 411.
  • The guidance device 101 may also include image sensors 413 and 414. The image sensors 413 and 414 may be any kind of image sensors such as one or more cameras, three dimensional cameras, infra-red image sensors, lasers, ambient light detectors, and so on.
  • The guidance device 101 may also include one or more protectors 412 that may prevent the input/output touch surface 411 from contacting surfaces with which the guidance device 101 comes into contact. The protectors 412 may be formed of rubber, silicone, plastic, and/or any other suitable material. As illustrated, the protectors 412 may be configured as rings. In such a case, the guidance device 101 may include one or more weights and/or other orientation elements (discussed further below) to prevent the guidance device 101 from rolling when placed on a surface. However, in other cases the protectors 412 may be shaped in other configurations (such as with a flat bottom) to prevent rolling of the guidance device 101 on a surface without use of a weight or other orientation element.
  • As shown, the cylindrical housing 410 is shaped such that it may be held in any number of different orientations. To accommodate for such different holding orientations, the guidance device 101 may utilize the input/output touch surface 411, one or more position sensors, and/or other components to detect orientation information regarding the orientation in which the guidance device 101 is being held in order to map output to output devices such as the input/output touch surface 411 accordingly. However, in other implementations the guidance device 101 may have a housing shaped to be held in a particular manner, such as where a housing is configured with a grip that conforms to a user's hand in a particular orientation. In such an implementation, detection of how the guidance device 101 is being held may be omitted while still allowing the guidance device 101 to correctly map output to output devices such as the input/output touch surface 411.
  • In some cases, the guidance device 101 may be configured to operate in a particular orientation. For example, the image sensor 413 may be configured as a front image sensor and the image sensor 414 may be configured as a rear image sensor. This may simplify analysis of environmental and/or sensor data from the image sensors 413 and 414. This may also allow for particular configurations of the image sensors 413 and 414, such as where the image sensor 413 is a wider angle image sensor than the image sensor 414 as a user may be more concerned with objects in front of the user than behind.
  • However, in other cases the guidance device 101 may be configurable to operate in a variety of orientations. For example, the image sensors 413 and 414 may be identical and the guidance device 101 may use the image sensors 413 and 414 based on a currently detected orientation (which may be based on detection by input/output touch surface 411, one or more positional sensors, and/or other such components).
  • FIG. 4B shows a block diagram illustrating functional relationships of example components of the guidance device 101 of FIG. 4A. As shown, in various example implementations the guidance device 101 may include one or more processing units 424, batteries 423, communication units 425, positional sensors 426, speakers 427, microphones 428, navigation systems 429, image sensors 413 and 414, tactile input/output surfaces 411, and so on. The guidance device 101 may also include one or more additional components not shown, such as one or more non-transitory storage media (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on).
  • The processing unit 424 may be configured such that the guidance device 101 is able to perform a variety of different functions. One such example may be the method 300 illustrated and described above with respect to FIG. 3. The processing unit 424 may also be configured such that the guidance device 101 is able to receive a variety of different input and/or provide a variety of different output. For example, the guidance device 101 may be operable to receive input via the communication unit 425, the positional sensors 426 (such as by shaking or other motion of the guidance device 101), the microphone 428 (such as voice or other audio commands), the tactile input/output touch surface 411, and so on. By way of another example, the guidance device 101 may be operable to provide output via the communication unit 425, speaker 427 (such as speech or other audio output), the tactile input/output touch surface 411, and so on.
• The tactile input/output touch surface 411 may be configured in a variety of different ways in a variety of different implementations such that it is operable to detect touch (or force, pressure, and so on) and/or provide tactile output. In some implementations, the tactile input/output touch surface 411 may include a touch sensing device layer and a haptic device layer. Such touch sensing and haptic device layers may be positioned adjacent to each other.
  • For example, FIG. 4C shows a diagram illustrating an example configuration of the input/output touch surface 411. As shown, the input/output touch surface 411 may be positioned on the housing 410. The input/output touch surface 411 may include a number of layers such as a tactile feedback layer 411B (such as piezoelectric cells, vibration actuators, and so on) and a touch layer 411C (such as a capacitive touch sensing layer, a resistive touch sensing layer, and so on). The input/output touch surface 411 may also include a coating 411A (which may be formed of plastic or other material that may be more flexible than materials such as glass), which may function to protect the input/output touch surface 411.
• As shown, in some implementations the input/output touch surface 411 may include a display layer 411D. A vision impaired user may not be completely blind and, as such, visual output may be presented to the user via the display layer 411D. Further, in some cases visual output may be presented via the display layer 411D to another person who is assisting the user of the guidance device 101, such as where the other person is presented visual output so that the other person can input a destination for the user, to which the guidance device 101 may then guide the user.
  • In some implementations, such as implementations where the display layer 411D is a display that utilizes a backlight, the input/output touch surface 411 may include a backlight layer 411E.
  • In various implementations, the input/output touch surface 411 (and/or other components of the guidance device 101) may be operable to detect one or more biometrics of the user, such as a fingerprint, palm print and so on. For example, a user's fingerprint may be detected using a capacitive or other touch sensing device of the input/output touch surface 411.
  • Such a biometric may be used to authenticate the user. In some situations, entering a password or other authentication mechanism may be more difficult for a sensory impaired user than for other users. In such a situation, using a detected biometric for authentication purposes may make authentication processes easier for the user.
• FIG. 5A shows a cross-sectional view of the example guidance device 101 of FIG. 4A, taken along line A-A of FIG. 4A. As illustrated, the guidance device 101 may include a printed circuit board 521 (and/or other electronic module) with one or more connected electronic components 522 disposed on one or more surfaces thereon. Such electronic components 522 may include one or more processing units, wired and/or wireless communication units, positional sensors, input/output units (such as one or more cameras, speakers or other audio components, microphones, and so on), navigation systems, and/or any other electronic component. The printed circuit board 521 may be electrically connected to the input/output touch surface 411, the image sensors 413 and 414, one or more batteries 423 and/or other power sources, and so on.
  • As illustrated, the battery 423 may be configured as a weight at a “bottom” of the guidance device 101. This may operate to orient the guidance device 101 as shown when the guidance device 101 is resting on a surface instead of being held by a user. As such, the battery 423 may prevent the guidance device 101 from rolling.
  • Although FIG. 5A illustrates a particular configuration of components, it is understood that this is an example. In other implementations of the guidance device 101, other configurations of the same, similar, and/or different components are possible without departing from the scope of the present disclosure.
  • For example, in one implementation of the guidance device 101 of FIG. 5A, the image sensor 413 may be a wide angle image sensor configured as a front image sensor. However, in another implementation shown in FIG. 5B, image sensor 413 may be a narrow angle image sensor configured as a front image sensor. To compensate for the narrow angle of the image sensor 413, one or more additional image sensors 530 may be used.
  • As shown in this example, the additional image sensor 530 may be located at a bottom corner of the cylindrical housing 410. The additional image sensor 530 may be maneuverable via a motor 531 and/or other movement mechanism. In this example, the additional image sensor 530 may be rotatable via the motor 531 such that it can be operated to obtain image sensor data of an area around the user's feet in order to compensate for a narrow angle image sensor used for the image sensor 413. In other examples, a similar image sensor/motor mechanism may be located at a top corner of the cylindrical housing 410 in order to obtain image sensor data of an area above the user's head.
  • By way of another example, in various implementations, one or more ends of the cylindrical housing 410 may be configured with flanges and/or other structures that project from the ends of the cylindrical housing 410. Such flanges and/or other structures may protect the image sensors 413 and/or 414 from damage.
• FIG. 6A shows a diagram illustrating an example 600A of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface 411 of the guidance device of FIG. 5A. For purposes of clarity, the input/output touch surface 411 is shown unrolled and is marked to indicate what the guidance device 101 may have detected as the top and front of the input/output touch surface 411 based on the current orientation in which a user is holding the guidance device 101. The hand 641 indicates an area of the input/output touch surface 411 where a user's hand has been detected as currently touching the input/output touch surface 411. In this example, the input/output touch surface 411 is providing tactile output indicating information about a model generated based on environmental data regarding the environment 100 shown in FIG. 1.
• The input/output touch surface 411 may include a number of bumps that can be selectively raised to provide output, such as via piezoelectric cells. As illustrated, filled bumps indicate raised bumps and unfilled bumps indicate bumps that are not raised.
  • The input/output touch surface 411 may provide tactile output via the raised bumps to indicate shapes or textures of objects in the environment. For example, the raised bumps 644 indicate the shape of the truck 107 in the environment 100 of FIG. 1 and the raised bumps 645 indicate the shape of the traffic signal 106 in the environment 100 of FIG. 1. The user may be able to feel the shapes of the raised bumps 644 and 645 and understand that the truck 107 and traffic signal 106 are present.
  • The region in which tactile output is provided on the input/output touch surface 411 via the raised bumps may correspond to positional information regarding objects in the environment. Further, the relationships between raised bumps in various regions may correspond to relationships in the positions of the corresponding objects in the environment. For example, the raised bumps 645 are illustrated as further to the left on the input/output touch surface 411 than the raised bumps 644. This may correspond to the fact that the traffic signal 106 is closer to the user 102 in the environment 100 of FIG. 1 than the truck 107.
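As a rough illustration of how object shapes could be stamped onto a grid of raisable bumps at positions reflecting their relative locations, consider the following sketch; the glyphs, grid size, and placement rule are assumptions, not the patterns shown in FIG. 6A.

```python
def stamp_shape(frame, shape, top, left):
    """Raise the bumps of `shape` (a small 0/1 bitmap) onto `frame` at (top, left).

    Illustrative only: real output would be constrained to the touched region
    and to the taxel pitch of the actual surface.
    """
    for r, row in enumerate(shape):
        for c, cell in enumerate(row):
            rr, cc = top + r, left + c
            if cell and 0 <= rr < len(frame) and 0 <= cc < len(frame[0]):
                frame[rr][cc] = 1
    return frame


# Crude glyphs; a nearer object is stamped further left (an assumed convention).
SIGNAL = [[1], [1], [1]]
TRUCK = [[1, 1, 1, 1],
         [1, 0, 0, 1]]

frame = [[0] * 12 for _ in range(6)]
frame = stamp_shape(frame, SIGNAL, top=1, left=2)   # closer object, further left
frame = stamp_shape(frame, TRUCK, top=1, left=7)    # farther object, further right
```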
• The tactile output may also include a variety of different context indicators. As described above, the regions in which tactile output is provided on the input/output touch surface 411 via the raised bumps may correspond to positional information regarding objects in the environment. However, in some implementations the positional information indicated by the regions may be dependent on a context indicator presented via tactile output via the input/output touch surface 411 and/or otherwise presented, such as where the positional information is first positional information when a first positional context indicator is provided and is second positional information when a second positional context indicator is provided.
• For example, FIG. 6A illustrates a series of ranges “Range 1,” “Range 2,” and “Range 3.” Each range maps an area of the input/output touch surface 411 touched by the user's hand, as indicated by the hand 641, to a distance range. As illustrated, Range 1 maps the pinky finger of the hand 641 to a distance of 0 meters, the ring finger to a distance of 1 meter, and the middle finger to a distance of 3 meters. Range 2 maps the pinky finger of the hand 641 to a distance of 0 meters, the ring finger to a distance of 10 meters, and the middle finger to a distance of 30 meters. Range 3 maps the pinky finger of the hand 641 to a distance of 0 meters, the ring finger to a distance of 100 meters, and the middle finger to a distance of 300 meters. Regions 642A-C may be range context indicators that indicate via tactile output which of the ranges is currently being presented. In this example, 642A may indicate Range 1, 642B may indicate Range 2, and 642C may indicate Range 3. By making the interpretation of information corresponding to regions dependent on such context indicators, a wider variety of information may be presented via the input/output touch surface 411 while still being comprehensible to a user.
  • As shown, a bump is raised for 642A. This indicates in this example that Range 1 is currently being used. Thus, the traffic signal 106 is indicated by the raised bumps 645 as being 2 meters from the user and the truck 107 is indicated by the raised bumps 644 as being 3 meters from the user.
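A worked example of the range-dependent distance mapping just described might look like the following sketch; the finger-region names are assumptions, while the meter values mirror the ranges listed above.

```python
# Distance breakpoints per range context, keyed by the finger region they map to.
RANGES = {
    1: {"pinky": 0, "ring": 1, "middle": 3},
    2: {"pinky": 0, "ring": 10, "middle": 30},
    3: {"pinky": 0, "ring": 100, "middle": 300},
}


def region_for_distance(distance_m, active_range):
    """Pick the finger region whose breakpoint is closest at or below the distance."""
    best_region, best_value = "pinky", 0
    for region, value in RANGES[active_range].items():
        if value <= distance_m and value >= best_value:
            best_region, best_value = region, value
    return best_region


# Under Range 1, an object 2 meters away maps to the ring-finger region,
# and one 3 meters away maps to the middle-finger region.
assert region_for_distance(2, 1) == "ring"
assert region_for_distance(3, 1) == "middle"
```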
• In various implementations, a variety of other kinds of information may be presented. For example, the raised bump(s) of the regions 642A-C indicating the range currently being used may be alternatingly raised and lowered to create the sensation that the context indicator is being “flashed.” This may indicate that one or more of the objects are moving. In some implementations, the raised bumps 644 and/or 645 may be similarly raised and lowered to create the sensation of flashing to indicate that the respective object is moving. In some cases, the speed of the flashing may correspond to the speed of the movement.
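The speed-to-flash-rate relationship could, for instance, be sketched as a simple clamped mapping; the cadence bounds below are illustrative, not values from this disclosure.

```python
def flash_period_s(speed_m_s, min_period=0.2, max_period=1.5):
    """Map object speed to a raise/lower cadence: faster objects flash faster.

    Returns the period in seconds between raise/lower cycles, or None for a
    stationary object (no flashing). The clamping bounds are assumptions.
    """
    if speed_m_s <= 0:
        return None
    period = max_period / (1.0 + speed_m_s)
    return max(min_period, min(max_period, period))
```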
• By way of another example, a zone 643 may present tactile output related to one or more alerts to which the user's attention is specially directed. In some cases, the zone 643 may present indications of objects above the user's head, objects at the user's feet, the height of an object in the environment, the fact that an object is traveling in a course that will connect with the user (which may be determined using real time calculations), the fact that the user is approaching an object that is below a head height of the user, and so on. In other cases, other alerts may be provided via the zone 643, such as raising and lowering bumps in the zone 643 to indicate that an object is moving toward the user at high speed. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
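One form such a real-time calculation could take is a constant-velocity closest-approach estimate, sketched below; the one-meter contact margin is an illustrative assumption.

```python
def time_to_contact(rel_position_m, rel_velocity_m_s, eps=1e-6):
    """Estimate seconds until an object's course intersects the user.

    rel_position_m / rel_velocity_m_s: 2D (x, y) of the object relative to the
    user. Returns None if the object is not on a closing course. A sketch of
    the kind of real-time calculation mentioned above, not a disclosed algorithm.
    """
    px, py = rel_position_m
    vx, vy = rel_velocity_m_s
    speed_sq = vx * vx + vy * vy
    if speed_sq < eps:
        return None                       # effectively stationary relative to the user
    # Time at which the object is nearest the user, assuming constant velocity.
    t_closest = -(px * vx + py * vy) / speed_sq
    if t_closest <= 0:
        return None                       # moving away or already past
    # Distance at closest approach; flag a likely contact only if it is small.
    cx, cy = px + vx * t_closest, py + vy * t_closest
    miss_distance = (cx * cx + cy * cy) ** 0.5
    return t_closest if miss_distance < 1.0 else None   # 1 m margin is illustrative
```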
  • Although this example illustrates objects that are in front of the user without illustrating objects that are behind the user, it is understood that this is an example and that various depictions of an environment may be presented. For example, in some implementations one portion of the input/output touch surface 411 may correspond to objects located in the user's direction of travel while another portion corresponds to objects located in the opposite direction.
• FIG. 6B shows a diagram illustrating another example 600B of how a model of an environment generated based on environmental data may be mapped to the input/output touch surface 411 of the guidance device 101 of FIG. 5A. In this example, an area 646 of the input/output touch surface 411 may provide tactile output related to detected speech in an environment. For example, a microphone or other sound component of the guidance device 101 may be used to detect one or more words spoken in the environment. The guidance device 101 may perform speech-to-text recognition on the detected spoken words and provide tactile output presenting the text in the area 646. For example, as illustrated, the detected speech may be presented in braille via raised bumps in the area 646.
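A sketch of how recognized text could be turned into braille cell patterns for the area 646 follows; only a few letters are included, and the dictionary-based approach is an illustration rather than the disclosed method.

```python
# Braille cells use dots numbered 1-3 down the left column and 4-6 down the right.
# Only a handful of letters are listed here; a full table would cover the
# alphabet, numbers, punctuation, and contractions.
BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "h": {1, 2, 5}, "i": {2, 4}, "t": {2, 3, 4, 5},
}


def text_to_cells(text):
    """Convert recognized text to a list of raised-dot sets, one per character."""
    return [BRAILLE_DOTS.get(ch, set()) for ch in text.lower() if ch != " "]


# text_to_cells("hi") -> [{1, 2, 5}, {2, 4}]
# Each set would then drive the raised bumps of one braille cell in area 646.
```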
  • FIG. 7 shows an additional example of a guidance device 701. The guidance device 701 may be a smart phone with a housing 710, one or more cameras 713 or other sensors, an input/output touch display screen surface 711 (that may include a touch sensing device to detect touch and a haptic device to provide tactile output), and/or various other components. The smart phone may provide guidance to a user by performing a method such as the method 300 illustrated and described above. A user may place a hand on the input/output touch display screen surface 711 in order to feel tactile output related to guidance.
  • FIG. 8 shows another example of a guidance device 801. The guidance device 801 may be a tablet computing device. Similar to the smart phone of FIG. 7, the tablet computing device may include a housing 810, one or more cameras 813 or other sensors, an input/output touch display screen surface 811, and/or various other components. The tablet computing device may provide guidance to a user by performing a method such as the method 300 illustrated and described above. As the input/output touch display screen surface 811 is larger than the input/output touch display screen surface 711, placement of a hand on the input/output touch display screen surface 811 in order to receive tactile output related to guidance may be more comfortable and may be capable of providing more tactile information than the guidance device 701.
  • FIG. 9 shows yet another example of a guidance device 901. The guidance device 901 may be an item of apparel. Similar to the smart phone of FIG. 7 and the tablet computing device of FIG. 8, the item of apparel may include a housing 910, one or more cameras 913 or other sensors, an input/output touch surface 911, and/or various other components. The item of apparel may provide guidance to a user by performing a method such as the method 300 illustrated and described above. As shown, the input/output touch surface 911 may be in contact with a user's back when the item of apparel is worn. Thus, the user may feel tactile output related to guidance provided by the item of apparel without other people being able to visibly detect that the user is receiving guidance.
  • FIG. 10 shows still another example of a guidance device 1001. The guidance device 1001 may be a smart watch and/or other wearable device. Similar to the smart phone of FIG. 7, the tablet computing device of FIG. 8, and the item of apparel of FIG. 9, the smart watch may include a housing 1010, one or more cameras 1013 or other sensors, an input/output touch surface 1011, and/or various other components. The smart watch may provide guidance to a user by performing a method such as the method 300 illustrated and described above. As shown, the input/output touch surface 1011 may be in contact with a user's wrist when the smart watch is attached. Thus, the user may feel tactile output related to guidance provided by the smart watch in a hands free manner.
• As described above and illustrated in the accompanying figures, the present disclosure relates to guidance devices for sensory impaired users. Sensor data may be obtained regarding an environment around a guidance device, assistance device, environmental exploration device, and/or other such device. A model of the environment may be generated based on the data. The model may be mapped at least to an input/output touch surface of the guidance device. Tactile output may be provided to a user of the guidance device via the input/output touch surface based at least on the mapping. Other output based on the model may also be provided. In this way, a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device. Such a guidance device may provide better assistance than, and/or take the place of, a cane, a guidance animal, and/or other guidance devices and/or relationships.
  • The described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
  • The present disclosure recognizes that personal information data, including biometric data, in the present technology, can be used to the benefit of users. For example, the use of biometric authentication data can be used for convenient access to device features without the use of passwords. In other examples, user biometric data is collected for providing users with feedback about their health or fitness levels. Further, other uses for personal information data, including biometric data, that benefit the user are also contemplated by the present disclosure.
• The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for keeping personal information data private and secure, including the use of data encryption and security methods that meet or exceed industry or government standards. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
• Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data, including biometric data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of biometric authentication methods, the present technology can be configured to allow users to optionally bypass biometric authentication steps by providing secure information such as passwords, personal identification numbers (PINs), touch gestures, or other authentication methods, alone or in combination, known to those of skill in the art. In another example, users can select to remove, disable, or restrict access to certain health-related applications collecting users' personal health or fitness data.
• The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

Claims (21)

What is claimed is:
1. A guidance device for a sensory impaired user, comprising:
an input/output touch surface;
a sensor data component that obtains data regarding an environment around the guidance device; and
a processing unit, coupled to the input/output touch surface and the sensor data component, that:
generates a model of the environment based at least on the data;
maps the model to the input/output touch surface; and
provides tactile output to a user based at least on the mapping via the input/output touch surface.
2. The guidance device of claim 1, wherein the tactile output comprises an arrangement of raised portions of the input/output touch surface or other tactile feedback configured to produce a tactile sensation of bumps.
3. The guidance device of claim 1, wherein the tactile output includes a representation of an object in the environment and a region of the input/output touch surface where the representation is provided corresponds to positional information regarding the object.
4. The guidance device of claim 3, wherein the positional information regarding the object corresponding to the region is first positional information when the tactile output includes a first positional information context indicator and second positional information when the tactile output includes a second positional information context indicator.
5. The guidance device of claim 3, wherein a shape of the representation is associated with a detected shape of the object.
6. The guidance device of claim 1, wherein the sensor data component receives at least a portion of the data from another electronic device.
7. The guidance device of claim 1, wherein the processing unit provides at least one audio notification based at least on the model via an audio component of the guidance device or another electronic device.
8. An assistance device for a sensory impaired user, comprising:
a surface operable to detect touch and provide tactile output;
a sensor that detects information about an environment; and
a processing unit, coupled to the surface and the sensor, that:
determines a portion of the surface being touched;
selects a subset of the information for output; and
provides the tactile output to a user corresponding to the subset of the information via the portion of the surface.
9. The assistance device of claim 8, wherein the sensor detects orientation information regarding the assistance device and the processing unit provides the tactile output via the portion of the surface according to the orientation information.
10. The assistance device of claim 8, wherein the sensor detects location information regarding the assistance device and the tactile output includes a direction indication associated with navigation to a destination.
11. The assistance device of claim 8, wherein the tactile output includes an indication of at least one of:
a height of an object in the environment;
that the object is traveling in a course that will connect with a user; or
that the user is approaching the object and the object is below a head height of the user.
12. The assistance device of claim 8, wherein the tactile output includes a first representation of a first object located in a direction of travel of the assistance device and a second representation of a second object located in an opposite direction of the direction of travel.
13. The assistance device of claim 12, wherein regions of the portion of the surface where the first representation and the second representation are provided indicate that the first object is located in the direction of travel and the second object is located in the opposite direction of the direction of travel.
14. The assistance device of claim 8, wherein the tactile output includes a representation of an object in the environment and a texture of the representation is associated with a detected texture of the object.
15. The assistance device of claim 8, wherein the processing unit provides an audio notification via an audio component upon determining that the assistance device experiences a fall event during use.
16. An environmental exploration device, comprising:
a cylindrical housing;
a processing unit located within the cylindrical housing;
a touch sensing device, coupled to the processing unit, positioned over the cylindrical housing;
a haptic device, coupled to the processing unit, positioned adjacent to the touch sensing device; and
an image sensor, coupled to the processing unit, that detects image data about an area around the cylindrical housing;
wherein the processing unit:
analyzes the image data using image recognition to identify an object;
creates an output image representing the object and positional information regarding the object in the area;
maps the output image to the haptic device; and
provides the output image as tactile output to a user via the haptic device.
17. The environmental exploration device of claim 16, wherein the haptic device comprises piezoelectric cells.
18. The environmental exploration device of claim 16, wherein the processing unit provides an audio description of the object and the positional information via an audio component.
19. The environmental exploration device of claim 16, wherein the processing unit determines details of a hand of the user that is touching the touch sensing device and maps the output image to the haptic device in accordance with whether the hand is a left hand of the user, a right hand of the user, has a large palm size, has a small palm size, has less than four fingers, or does not have a thumb.
20. The environmental exploration device of claim 16, further comprising a weight component coupled to the cylindrical housing operable to alter an orientation of the environmental exploration device.
21. The environmental exploration device of claim 16, wherein the processing unit receives detected speech and provides a tactile representation of the detected speech via the haptic device.
US20050036603A1 (en) 2003-06-16 2005-02-17 Hughes David A. User-defined ring tone file
KR20050033909A (en) 2003-10-07 2005-04-14 조영준 Key switch using magnetic force
WO2005050683A1 (en) 2003-11-20 2005-06-02 Preh Gmbh Control element with programmable haptics
US7355305B2 (en) 2003-12-08 2008-04-08 Shin-Etsu Chemical Co., Ltd. Small-size direct-acting actuator
US20060209037A1 (en) 2004-03-15 2006-09-21 David Wang Method and system for providing haptic effects
JP2005301900A (en) 2004-04-15 2005-10-27 Alps Electric Co Ltd On-vehicle tactile force applying type input device
US7508382B2 (en) 2004-04-28 2009-03-24 Fuji Xerox Co., Ltd. Force-feedback stylus and applications to freeform ink
GB2414309B (en) 2004-05-18 2009-02-25 Simon Richard Daniel Spherical display and control device
US7392066B2 (en) 2004-06-17 2008-06-24 Ixi Mobile (R&D), Ltd. Volume control system and method for a mobile communication device
JP4363642B2 (en) * 2004-07-02 2009-11-11 富士フイルム株式会社 Map display system and digital camera
US20060017691A1 (en) 2004-07-23 2006-01-26 Juan Manuel Cruz-Hernandez System and method for controlling audio output associated with haptic effects
US8002089B2 (en) 2004-09-10 2011-08-23 Immersion Corporation Systems and methods for providing a haptic device
US8106888B2 (en) 2004-10-01 2012-01-31 3M Innovative Properties Company Vibration sensing touch input device
JP4799421B2 (en) 2004-11-09 2011-10-26 孝彦 鈴木 Haptic feedback controller
US7068168B2 (en) 2004-11-12 2006-06-27 Simon Girshovich Wireless anti-theft system for computer and other electronic and electrical equipment
DE102005009110A1 (en) 2005-01-13 2006-07-27 Siemens Ag Device for communicating environmental information to a visually impaired person
ATE358942T1 (en) 2005-01-31 2007-04-15 Research In Motion Ltd User hand recognition for wireless terminals
EP1871267B1 (en) 2005-02-22 2018-09-12 Mako Surgical Corp. Haptic guidance system
JP2006260179A (en) 2005-03-17 2006-09-28 Matsushita Electric Ind Co Ltd Trackball device
US20060223547A1 (en) 2005-03-31 2006-10-05 Microsoft Corporation Environment sensitive notifications for mobile devices
TWI260151B (en) 2005-05-06 2006-08-11 Benq Corp Mobile phone
US7825903B2 (en) 2005-05-12 2010-11-02 Immersion Corporation Method and apparatus for providing haptic effects to a touch panel
DE102005043587B4 (en) 2005-06-02 2009-04-02 Preh Gmbh Turntable with programmable feel
US7710397B2 (en) 2005-06-03 2010-05-04 Apple Inc. Mouse with improved input mechanisms using touch sensors
US7919945B2 (en) 2005-06-27 2011-04-05 Coactive Drive Corporation Synchronized vibration device for haptic feedback
US8981682B2 (en) 2005-06-27 2015-03-17 Coactive Drive Corporation Asymmetric and general vibration waveforms from multiple synchronized vibration actuators
US7234379B2 (en) 2005-06-28 2007-06-26 Ingvar Claesson Device and a method for preventing or reducing vibrations in a cutting tool
US7633076B2 (en) 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
WO2007049253A2 (en) 2005-10-28 2007-05-03 Koninklijke Philips Electronics N.V. Display system with a haptic feedback via interaction with physical objects
JP5208362B2 (en) 2005-10-28 2013-06-12 ソニー株式会社 Electronics
US20070106457A1 (en) 2005-11-09 2007-05-10 Outland Research Portable computing with geospatial haptic compass
JP5269604B2 (en) 2005-11-14 2013-08-21 イマージョン コーポレイション System and method for editing a model of a physical system for simulation
GB2433351B (en) 2005-12-16 2009-03-25 Dale Mcphee Purcocks Keyboard
KR100877067B1 (en) 2006-01-03 2009-01-07 삼성전자주식회사 Haptic button, and haptic device using it
US7667691B2 (en) 2006-01-19 2010-02-23 International Business Machines Corporation System, computer program product and method of preventing recordation of true keyboard acoustic emanations
US8405618B2 (en) 2006-03-24 2013-03-26 Northwestern University Haptic device with indirect haptic feedback
WO2007114631A2 (en) 2006-04-03 2007-10-11 Young-Jun Cho Key switch using magnetic force
US7708478B2 (en) 2006-04-13 2010-05-04 Nokia Corporation Actuator mechanism and a shutter mechanism
US8174512B2 (en) 2006-06-02 2012-05-08 Immersion Corporation Hybrid haptic device utilizing mechanical and programmable haptic effects
US7326864B2 (en) 2006-06-07 2008-02-05 International Business Machines Corporation Method and apparatus for masking keystroke sounds from computer keyboards
JP2008033739A (en) 2006-07-31 2008-02-14 Sony Corp Touch screen interaction method and apparatus based on tactile force feedback and pressure measurement
US7675414B2 (en) 2006-08-10 2010-03-09 Qualcomm Incorporated Methods and apparatus for an environmental and behavioral adaptive wireless communication device
US7890863B2 (en) 2006-10-04 2011-02-15 Immersion Corporation Haptic effects with proximity sensing
US20080084384A1 (en) 2006-10-05 2008-04-10 Immersion Corporation Multiple Mode Haptic Feedback System
US20080111791A1 (en) 2006-11-15 2008-05-15 Alex Sasha Nikittin Self-propelled haptic mouse system
GB2446428B (en) 2006-12-02 2010-08-04 Nanomotion Ltd Controllable coupling force
JP2008158909A (en) 2006-12-25 2008-07-10 Pro Tech Design Corp Tactile feedback controller
US9430042B2 (en) 2006-12-27 2016-08-30 Immersion Corporation Virtual detents through vibrotactile feedback
US7893922B2 (en) 2007-01-15 2011-02-22 Sony Ericsson Mobile Communications Ab Touch sensor with tactile feedback
US8378965B2 (en) 2007-04-10 2013-02-19 Immersion Corporation Vibration actuator with a unidirectional drive
US7956770B2 (en) 2007-06-28 2011-06-07 Sony Ericsson Mobile Communications Ab Data input device and portable electronic device
JP5602626B2 (en) 2007-06-29 2014-10-08 アーティフィシャル マッスル,インク. Electroactive polymer transducer for sensory feedback applications
US8154537B2 (en) 2007-08-16 2012-04-10 Immersion Corporation Resistive actuator with dynamic variations of frictional forces
KR101425222B1 (en) 2007-08-22 2014-08-04 삼성전자주식회사 Apparatus and method for vibration control in mobile phone
US7667371B2 (en) 2007-09-17 2010-02-23 Motorola, Inc. Electronic device and circuit for providing tactile feedback
US8084968B2 (en) 2007-09-17 2011-12-27 Sony Ericsson Mobile Communications Ab Use of an accelerometer to control vibrator performance
US20090085879A1 (en) 2007-09-28 2009-04-02 Motorola, Inc. Electronic device having rigid input surface with piezoelectric haptics and corresponding method
CN101409164A (en) 2007-10-10 2009-04-15 唐艺华 Key-press and keyboard using the same
US20090115734A1 (en) 2007-11-02 2009-05-07 Sony Ericsson Mobile Communications Ab Perceivable feedback
US9058077B2 (en) 2007-11-16 2015-06-16 Blackberry Limited Tactile touch screen for electronic device
SG186011A1 (en) 2007-11-21 2012-12-28 Artificial Muscle Inc Electroactive polymer transducers for tactile feedback devices
US7911328B2 (en) 2007-11-21 2011-03-22 The Guitammer Company Capture and remote reproduction of haptic events in synchronous association with the video and audio capture and reproduction of those events
US8253686B2 (en) 2007-11-26 2012-08-28 Electronics And Telecommunications Research Institute Pointing apparatus capable of providing haptic feedback, and haptic interaction system and method using the same
US8265308B2 (en) 2007-12-07 2012-09-11 Motorola Mobility Llc Apparatus including two housings and a piezoelectric transducer
US8836502B2 (en) 2007-12-28 2014-09-16 Apple Inc. Personal media device input and output control based on associated conditions
US20090166098A1 (en) 2007-12-31 2009-07-02 Apple Inc. Non-visual control of multi-touch device
US8373549B2 (en) 2007-12-31 2013-02-12 Apple Inc. Tactile feedback in an electronic device
US9857872B2 (en) 2007-12-31 2018-01-02 Apple Inc. Multi-touch display screen with localized tactile feedback
US20090167702A1 (en) 2008-01-02 2009-07-02 Nokia Corporation Pointing device detection
US20090174672A1 (en) 2008-01-03 2009-07-09 Schmidt Robert M Haptic actuator assembly and method of manufacturing a haptic actuator assembly
US8004501B2 (en) 2008-01-21 2011-08-23 Sony Computer Entertainment America Llc Hand-held device with touchscreen and digital tactile pixels
US20090207129A1 (en) 2008-02-15 2009-08-20 Immersion Corporation Providing Haptic Feedback To User-Operated Switch
GB2458146B (en) 2008-03-06 2013-02-13 Nanomotion Ltd Ball-mounted mirror moved by piezoelectric motor
KR100952698B1 (en) 2008-03-10 2010-04-13 한국표준과학연구원 Tactile transmission method using tactile feedback apparatus and the system thereof
US9513704B2 (en) 2008-03-12 2016-12-06 Immersion Corporation Haptically enabled user interface
US7904210B2 (en) 2008-03-18 2011-03-08 Visteon Global Technologies, Inc. Vibration control system
KR101003609B1 (en) 2008-03-28 2010-12-23 삼성전기주식회사 Vibrator and controlling method thereof and portable terminal having the same
US9274601B2 (en) 2008-04-24 2016-03-01 Blackberry Limited System and method for generating a feedback signal in response to an input signal provided to an electronic device
US20090267892A1 (en) 2008-04-24 2009-10-29 Research In Motion Limited System and method for generating energy from activation of an input device in an electronic device
US8217892B2 (en) 2008-05-06 2012-07-10 Dell Products L.P. Tactile feedback input device
EP2705989B1 (en) 2008-05-26 2016-06-29 Daesung Electric Co., Ltd Haptic steering wheel switch device and haptic steering wheel switch system including the same
US8345025B2 (en) 2008-06-05 2013-01-01 Dell Products, Lp Computation device incorporating motion detection and method thereof
US9733704B2 (en) 2008-06-12 2017-08-15 Immersion Corporation User interface impact actuator
FR2934066B1 (en) 2008-07-21 2013-01-25 Dav Haptic feedback control device
KR100973979B1 (en) 2008-08-22 2010-08-05 한국과학기술원 Electromagnetic Multi-axis Actuator
DE102008046102B4 (en) 2008-09-05 2016-05-12 Lisa Dräxlmaier GmbH Control element with specific feedback
US8749495B2 (en) 2008-09-24 2014-06-10 Immersion Corporation Multiple actuation handheld device
US20100116629A1 (en) 2008-11-12 2010-05-13 Milo Borissov Dual action push-type button
DE102008061205A1 (en) 2008-11-18 2010-05-20 Institut für Luft- und Kältetechnik gemeinnützige Gesellschaft mbH Electrodynamic linear vibration motor
EP2202619A1 (en) 2008-12-23 2010-06-30 Research In Motion Limited Portable electronic device including tactile touch-sensitive input device and method of controlling same
MX2011007670A (en) 2009-01-21 2011-08-08 Bayer Materialscience Ag Electroactive polymer transducers for tactile feedback devices.
US20100225600A1 (en) 2009-03-09 2010-09-09 Motorola Inc. Display Structure with Direct Piezoelectric Actuation
WO2010104953A1 (en) 2009-03-10 2010-09-16 Artificial Muscle, Inc. Electroactive polymer transducers for tactile feedback devices
DE102009015991A1 (en) 2009-04-02 2010-10-07 Pi Ceramic Gmbh Keramische Technologien Und Bauelemente Device for generating a haptic feedback of a keyless input unit
WO2010119397A2 (en) 2009-04-15 2010-10-21 Koninklijke Philips Electronics N.V. A foldable tactile display
KR101553842B1 (en) 2009-04-21 2015-09-17 엘지전자 주식회사 Mobile terminal providing multi haptic effect and control method thereof
JP5707606B2 (en) 2009-04-22 2015-04-30 株式会社フコク Rotary input device and electronic device
KR20120019471A (en) 2009-05-07 2012-03-06 임머숀 코퍼레이션 Method and apparatus for providing a haptic feedback shape-changing display
US9727790B1 (en) * 2010-06-04 2017-08-08 Masoud Vaziri Method and apparatus for a wearable computer with natural user interface
US20100313425A1 (en) 2009-06-11 2010-12-16 Christopher Martin Hawes Variable amplitude vibrating personal care device
US20100328229A1 (en) 2009-06-30 2010-12-30 Research In Motion Limited Method and apparatus for providing tactile feedback
US8378797B2 (en) 2009-07-17 2013-02-19 Apple Inc. Method and apparatus for localization of haptic feedback
KR101962081B1 (en) 2009-07-22 2019-03-25 임머숀 코퍼레이션 System and method for providing complex haptic stimulation during input of control gestures, and relating to control of virtual equipment
US8730182B2 (en) 2009-07-30 2014-05-20 Immersion Corporation Systems and methods for piezo-based haptic feedback
US8654524B2 (en) 2009-08-17 2014-02-18 Apple Inc. Housing as an I/O device
US8390594B2 (en) 2009-08-18 2013-03-05 Immersion Corporation Haptic feedback using composite piezoelectric actuator
FR2950166B1 (en) 2009-09-16 2015-07-17 Dav Rotary control device with haptic feedback
US9424444B2 (en) 2009-10-14 2016-08-23 At&T Mobility Ii Llc Systems, apparatus, methods and computer-readable storage media for facilitating integrated messaging, contacts and social media for a selected entity
CN201708677U (en) 2009-10-19 2011-01-12 常州美欧电子有限公司 Flat linear vibration motor
US8262480B2 (en) 2009-11-12 2012-09-11 Igt Touch screen displays with physical buttons for gaming devices
US20110115754A1 (en) 2009-11-17 2011-05-19 Immersion Corporation Systems and Methods For A Friction Rotary Device For Haptic Feedback
US20110132114A1 (en) 2009-12-03 2011-06-09 Sony Ericsson Mobile Communications Ab Vibration apparatus for a hand-held mobile device, hand-held mobile device comprising the vibration apparatus and method for operating the vibration apparatus
US8633916B2 (en) 2009-12-10 2014-01-21 Apple, Inc. Touch pad with force sensors and actuator feedback
US8773247B2 (en) 2009-12-15 2014-07-08 Immersion Corporation Haptic feedback device using standing waves
US9436280B2 (en) 2010-01-07 2016-09-06 Qualcomm Incorporated Simulation of three-dimensional touch sensation using haptics
JP5385165B2 (en) 2010-01-15 2014-01-08 ホシデン株式会社 Input device
US8493177B2 (en) 2010-01-29 2013-07-23 Immersion Corporation System and method of haptically communicating vehicle information from a vehicle to a keyless entry device
US8605141B2 (en) 2010-02-24 2013-12-10 Nant Holdings Ip, Llc Augmented reality panorama supporting visually impaired individuals
US9361018B2 (en) 2010-03-01 2016-06-07 Blackberry Limited Method of providing tactile feedback and apparatus
US9535500B2 (en) 2010-03-01 2017-01-03 Blackberry Limited Method of providing tactile feedback and apparatus
JP5847407B2 (en) 2010-03-16 2016-01-20 Immersion Corporation System and method for pre-touch and true touch
US8907661B2 (en) 2010-03-22 2014-12-09 Fm Marketing Gmbh Input apparatus with haptic feedback
WO2011129475A1 (en) 2010-04-16 2011-10-20 엘지이노텍 주식회사 Linear vibrator having a broad bandwidth, and mobile device
US20120327006A1 (en) 2010-05-21 2012-12-27 Disney Enterprises, Inc. Using tactile feedback to provide spatial awareness
US9086727B2 (en) 2010-06-22 2015-07-21 Microsoft Technology Licensing, Llc Free space directional force feedback apparatus
US8411874B2 (en) 2010-06-30 2013-04-02 Google Inc. Removing noise from audio
US8576171B2 (en) 2010-08-13 2013-11-05 Immersion Corporation Systems and methods for providing haptic feedback to touch-sensitive input devices
FR2964761B1 (en) 2010-09-14 2012-08-31 Thales Sa Haptic interaction device and method for generating haptic and sound effects
KR101259683B1 (en) 2010-10-27 2013-05-02 LG Innotek Co., Ltd. Horizontal vibration motor
US8878401B2 (en) 2010-11-10 2014-11-04 Lg Innotek Co., Ltd. Linear vibrator having a trembler with a magnet and a weight
CN103415833B (en) 2010-11-18 2017-10-24 Google Inc. Surfacing off-screen visual objects
US10120446B2 (en) 2010-11-19 2018-11-06 Apple Inc. Haptic input device
WO2012067370A2 (en) 2010-11-19 2012-05-24 (주)하이소닉 Haptic module using piezoelectric element
US8593420B1 (en) * 2011-03-04 2013-11-26 Amazon Technologies, Inc. Providing tactile output and interaction
KR101580022B1 (en) 2011-03-04 2015-12-23 애플 인크. Linear vibrator providing localized and generalized haptic feedback
US9448713B2 (en) 2011-04-22 2016-09-20 Immersion Corporation Electro-vibrotactile display
US9218727B2 (en) 2011-05-12 2015-12-22 Apple Inc. Vibration in portable devices
US8717151B2 (en) 2011-05-13 2014-05-06 Qualcomm Incorporated Devices and methods for presenting information to a user on a tactile output surface of a mobile device
US8681130B2 (en) 2011-05-20 2014-03-25 Sony Corporation Stylus based haptic peripheral for touch screen and tablet devices
US9710061B2 (en) 2011-06-17 2017-07-18 Apple Inc. Haptic feedback device
US20130016042A1 (en) 2011-07-12 2013-01-17 Ville Makinen Haptic device with touch gesture interface
US9256287B2 (en) 2011-08-30 2016-02-09 Kyocera Corporation Tactile sensation providing apparatus
KR102070612B1 (en) 2011-11-18 2020-01-30 센톤스 아이엔씨. Localized haptic feedback
US9286907B2 (en) 2011-11-23 2016-03-15 Creative Technology Ltd Smart rejecter for keyboard click noise
JP5942152B2 (en) 2012-01-20 2016-06-29 パナソニックIpマネジメント株式会社 Electronics
US8872448B2 (en) 2012-02-24 2014-10-28 Nokia Corporation Apparatus and method for reorientation during sensed drop
US9539164B2 (en) 2012-03-20 2017-01-10 Xerox Corporation System for indoor guidance with mobility assistance
US10108265B2 (en) 2012-05-09 2018-10-23 Apple Inc. Calibration of haptic feedback systems for input devices
DE112013002421T5 (en) 2012-05-09 2015-02-05 Apple Inc. Threshold for determining feedback in computing devices
WO2014066516A1 (en) 2012-10-23 2014-05-01 New York University Somatosensory feedback wearable object
KR102124297B1 (en) 2012-06-04 2020-06-19 홈 컨트롤 싱가포르 피티이. 엘티디. User-interface for entering alphanumerical characters
KR101242525B1 (en) 2012-07-19 2013-03-12 (주)엠투시스 Haptic actuator
US9466783B2 (en) 2012-07-26 2016-10-11 Immersion Corporation Suspension element having integrated piezo material for providing haptic effects to a touch screen
US9116546B2 (en) 2012-08-29 2015-08-25 Immersion Corporation System for haptically representing sensor input
US9122330B2 (en) 2012-11-19 2015-09-01 Disney Enterprises, Inc. Controlling a user's tactile perception in a dynamic physical environment
KR20140073398A (en) 2012-12-06 2014-06-16 삼성전자주식회사 Display apparatus and method for controlling thereof
EP2743798A1 (en) 2012-12-13 2014-06-18 BlackBerry Limited Magnetically coupling stylus and host electronic device
CN104364750B (en) 2013-01-06 2019-07-16 英特尔公司 The pretreated methods, devices and systems of distribution controlled for touch data and display area
US9024738B2 (en) 2013-02-01 2015-05-05 Blackberry Limited Apparatus, systems and methods for mitigating vibration of an electronic device
US9304587B2 (en) 2013-02-13 2016-04-05 Apple Inc. Force sensing mouse
US9442570B2 (en) 2013-03-13 2016-09-13 Google Technology Holdings LLC Method and system for gesture recognition
US9557830B2 (en) 2013-03-15 2017-01-31 Immersion Corporation Programmable haptic peripheral
US8867757B1 (en) 2013-06-28 2014-10-21 Google Inc. Microphone under keyboard to assist in noise cancellation
WO2015045063A1 (en) 2013-09-26 2015-04-02 富士通株式会社 Drive control apparatus, electronic device, and drive control method
US9921649B2 (en) 2013-10-07 2018-03-20 Immersion Corporation Electrostatic haptic based user input elements
US20150126070A1 (en) 2013-11-05 2015-05-07 Sony Corporation Apparatus for powering an electronic device in a secure manner
US9632583B2 (en) 2014-01-21 2017-04-25 Senseg Ltd. Controlling output current for electrosensory vibration
US9396629B1 (en) 2014-02-21 2016-07-19 Apple Inc. Haptic modules with independently controllable vertical and horizontal mass movements
US9594429B2 (en) 2014-03-27 2017-03-14 Apple Inc. Adjusting the level of acoustic and haptic output in haptic devices
US10133351B2 (en) 2014-05-21 2018-11-20 Apple Inc. Providing haptic output based on a determined orientation of an electronic device
DE102015209639A1 (en) 2014-06-03 2015-12-03 Apple Inc. Linear actuator
US9886090B2 (en) 2014-07-08 2018-02-06 Apple Inc. Haptic notifications utilizing haptic input devices
US9489049B2 (en) 2014-08-26 2016-11-08 Samsung Electronics Co., Ltd. Force simulation finger sleeve using orthogonal uniform magnetic field
US20160170508A1 (en) * 2014-12-11 2016-06-16 Toyota Motor Engineering & Manufacturing North America, Inc. Tactile display devices
US20170024010A1 (en) 2015-07-21 2017-01-26 Apple Inc. Guidance device for the sensory impaired
US20170249024A1 (en) 2016-02-27 2017-08-31 Apple Inc. Haptic mouse
US9829981B1 (en) 2016-05-26 2017-11-28 Apple Inc. Haptic output device
US9792501B1 (en) * 2016-12-31 2017-10-17 Vasuyantra Corp. Method and device for visually impaired assistance

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160171767A1 (en) * 2014-12-11 2016-06-16 Intel Corporation Facilitating dynamic non-visual markers for augmented reality on computing devices

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10261585B2 (en) 2014-03-27 2019-04-16 Apple Inc. Adjusting the level of acoustic and haptic output in haptic devices
US9886090B2 (en) 2014-07-08 2018-02-06 Apple Inc. Haptic notifications utilizing haptic input devices
US20230024866A1 (en) * 2014-07-28 2023-01-26 Ck Materials Lab Co., Ltd. Tactile information supply module
US10254840B2 (en) 2015-07-21 2019-04-09 Apple Inc. Guidance device for the sensory impaired
US10664058B2 (en) 2015-07-21 2020-05-26 Apple Inc. Guidance device for the sensory impaired
US10772394B1 (en) 2016-03-08 2020-09-15 Apple Inc. Tactile output for wearable device
US10890978B2 (en) 2016-05-10 2021-01-12 Apple Inc. Electronic device with an input device having a haptic engine
US10585480B1 (en) 2016-05-10 2020-03-10 Apple Inc. Electronic device with an input device having a haptic engine
US11762470B2 (en) 2016-05-10 2023-09-19 Apple Inc. Electronic device with an input device having a haptic engine
US9829981B1 (en) 2016-05-26 2017-11-28 Apple Inc. Haptic output device
US10649529B1 (en) 2016-06-28 2020-05-12 Apple Inc. Modification of user-perceived feedback of an input device using acoustic or haptic output
US10845878B1 (en) 2016-07-25 2020-11-24 Apple Inc. Input device with tactile feedback
US10372214B1 (en) 2016-09-07 2019-08-06 Apple Inc. Adaptable user-selectable input area in an electronic device
US10437359B1 (en) 2017-02-28 2019-10-08 Apple Inc. Stylus with external magnetic influence
US20180261055A1 (en) * 2017-03-08 2018-09-13 Winston Yang Tactile Feedback Guidance Device
US10431056B2 (en) * 2017-03-08 2019-10-01 Winston Yang Tactile feedback guidance device
US10679474B2 (en) * 2017-03-08 2020-06-09 Winston Yang Tactile feedback guidance device
US10775889B1 (en) 2017-07-21 2020-09-15 Apple Inc. Enclosure with locally-flexible regions
US11487362B1 (en) 2017-07-21 2022-11-01 Apple Inc. Enclosure with locally-flexible regions
US10768747B2 (en) 2017-08-31 2020-09-08 Apple Inc. Haptic realignment cues for touch-input displays
US11460946B2 (en) 2017-09-06 2022-10-04 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US11054932B2 (en) 2017-09-06 2021-07-06 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US11353984B2 (en) 2017-09-20 2022-06-07 Alex Hamid Mani Assistive device with a refreshable haptic feedback interface
US11726571B2 (en) * 2017-09-20 2023-08-15 Niki Mani Assistive device for non-visually discerning a three-dimensional (3D) real-world area surrounding a user
US20220026990A1 (en) * 2017-09-20 2022-01-27 Niki Mani Assistive device for non-visually discerning a three-dimensional (3d) real-world area surrounding a user
US20190087050A1 (en) * 2017-09-20 2019-03-21 Alex Hamid Mani Assistive device with a refreshable haptic feedback interface
US10503310B2 (en) * 2017-09-20 2019-12-10 Alex Hamid Mani Assistive device for non-visually discerning a three-dimensional (3D) real-world area surrounding a user
US10496176B2 (en) 2017-09-20 2019-12-03 Alex Hamid Mani Haptic feedback device and method for providing haptic sensation based on video
US10831311B2 (en) * 2017-09-20 2020-11-10 Alex Hamid Mani Assistive device for non-visually discerning a three-dimensional (3D) real-world area surrounding a user
US20190354227A1 (en) * 2017-09-20 2019-11-21 Alex Hamid Mani Assistive device for non-visually discerning a three-dimensional (3d) real-world area surrounding a user
US20220229511A1 (en) * 2017-09-20 2022-07-21 Alex Hamid Mani Assistive device with a refreshable haptic feedback interface
US20230341943A1 (en) * 2017-09-20 2023-10-26 Niki Mani Assistive device for non-visually discerning a three-dimensional (3d) real-world area surrounding a user
US20190087049A1 (en) * 2017-09-20 2019-03-21 Alex Hamid Mani Assistive Device for Non-Visually Discerning a Three-Dimensional (3D) Real-World Area Surrounding a User
US11561619B2 (en) 2017-09-20 2023-01-24 Niki Mani Haptic feedback device and method for providing haptic sensation based on video
US11656714B2 (en) * 2017-09-20 2023-05-23 Alex Hamid Mani Assistive device with a refreshable haptic feedback interface
US10747359B2 (en) 2017-09-20 2020-08-18 Alex Hamid Mani Assistive device with a refreshable haptic feedback interface
US10556252B2 (en) 2017-09-20 2020-02-11 Apple Inc. Electronic device having a tuned resonance haptic actuation system
US11635819B2 (en) 2017-09-20 2023-04-25 Alex Hamid Mani Haptic feedback device and method for providing haptic sensation based on video
US10754429B2 (en) 2017-09-20 2020-08-25 Alex Hamid Mani Haptic feedback device and method for providing haptic sensation based on video
US10281983B2 (en) 2017-09-20 2019-05-07 Alex Hamid Mani Haptic feedback device and method for providing haptic sensation based on video
US10275083B2 (en) * 2017-09-20 2019-04-30 Alex Hamid Mani Assistive device with a refreshable haptic feedback interface
US11513627B2 (en) 2017-09-20 2022-11-29 Niki Mani Assistive device with a refreshable haptic feedback interface
US10768738B1 (en) 2017-09-27 2020-09-08 Apple Inc. Electronic device having a haptic actuator with magnetic augmentation
US10380850B1 (en) * 2018-02-09 2019-08-13 Adam A. Zuber Virtual cane
US11549819B2 (en) * 2018-05-30 2023-01-10 International Business Machines Corporation Navigation guidance using tactile feedback implemented by a microfluidic layer within a user device
US10942571B2 (en) 2018-06-29 2021-03-09 Apple Inc. Laptop computing device with discrete haptic regions
US10936071B2 (en) 2018-08-30 2021-03-02 Apple Inc. Wearable electronic device with haptic rotatable input
US10613678B1 (en) 2018-09-17 2020-04-07 Apple Inc. Input device with haptic feedback
US10966007B1 (en) 2018-09-25 2021-03-30 Apple Inc. Haptic output system
US11805345B2 (en) 2018-09-25 2023-10-31 Apple Inc. Haptic output system
US11024135B1 (en) 2020-06-17 2021-06-01 Apple Inc. Portable electronic device having a haptic button assembly
US11756392B2 (en) 2020-06-17 2023-09-12 Apple Inc. Portable electronic device having a haptic button assembly
US11966512B2 (en) * 2020-09-29 2024-04-23 Alex Hamid Mani Assistive device for non-visually discerning a three-dimensional (3D) real-world area surrounding a user

Also Published As

Publication number Publication date
US20180181204A1 (en) 2018-06-28
US10254840B2 (en) 2019-04-09
US20190196594A1 (en) 2019-06-27
US10664058B2 (en) 2020-05-26

Similar Documents

Publication Publication Date Title
US10664058B2 (en) Guidance device for the sensory impaired
Poggi et al. A wearable mobility aid for the visually impaired based on embedded 3D vision and deep learning
US10083544B2 (en) System for tracking a handheld device in virtual reality
US10353478B2 (en) Hover touch input compensation in augmented and/or virtual reality
US10096216B1 (en) Activation of security mechanisms through accelerometer-based dead reckoning
US10254847B2 (en) Device interaction with spatially aware gestures
US20140134575A1 (en) Wearable device to represent braille and control method thereof
US10380914B2 (en) Imaging gloves including wrist cameras and finger cameras
Li et al. Leveraging proprioception to make mobile phones more accessible to users with visual impairments
EP3088996A1 (en) Systems and methods for tactile guidance
CN104748742A (en) Wearable product for the blind
CN104765461B (en) A kind of control method and electronic equipment
EP3732871B1 (en) Detecting patterns and behavior to prevent a mobile terminal drop event
US20160307561A1 (en) System for Providing Assistance to the Visually Impaired
Mattoccia et al. 3D glasses as mobility aid for visually impaired people
CN110446195A (en) Location processing method and Related product
Bouteraa Smart real time wearable navigation support system for BVIP
CN107479736A (en) Safety device with a force-based touch interface
Billah et al. Experimental investigation of a novel walking stick in avoidance drop-off for visually impaired people
KR20210158695A (en) Electronic device and operating method for detecting a plane in an image
JP2017078915A (en) Information specification device, method, and program
Hojjat Enhanced navigation systems in GPS-denied environments for visually impaired people: A survey
JP5605281B2 (en) Written information display system, written information display device, and written information display method
Hojjat Indoor navigation systems for blind people
WO2021137128A1 (en) A wearable device for assisting a visually impaired user

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WEINRAUB, CHANANIEL;REEL/FRAME:036145/0212

Effective date: 20150604

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE