WO2022043995A1 - Head-mounted guidance unit for blind persons - Google Patents

Head-mounted guidance unit for blind persons

Info

Publication number
WO2022043995A1
Authority
WO
WIPO (PCT)
Prior art keywords
actuator
user
processing unit
actuators
skin
Prior art date
Application number
PCT/IL2021/051028
Other languages
English (en)
Inventor
Anatoli Rapoport
Original Assignee
Anatoli Rapoport
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anatoli Rapoport filed Critical Anatoli Rapoport
Publication of WO2022043995A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 - Appliances for aiding patients or disabled persons to walk about
    • A61H 3/06 - Walking aids for blind persons
    • A61H 3/061 - Walking aids for blind persons with electronic detecting or guiding means
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 6/00 - Tactile signalling systems, e.g. personal calling systems
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 21/00 - Teaching, or communicating with, the blind, deaf or mute
    • G09B 21/001 - Teaching or communicating with blind persons
    • G09B 21/007 - Teaching or communicating with blind persons using both tactile and audible presentation of the information
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/01 - Constructive details
    • A61H 2201/0173 - Means for preventing injuries
    • A61H 2201/0184 - Means for preventing injuries by raising an alarm
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/16 - Physical interface with patient
    • A61H 2201/1602 - Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H 2201/1604 - Head
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 - Control means thereof
    • A61H 2201/5007 - Control means thereof computer controlled
    • A61H 2201/501 - Control means thereof computer controlled connected to external computer devices or networks
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 - Control means thereof
    • A61H 2201/5023 - Interfaces to the user
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 - Control means thereof
    • A61H 2201/5023 - Interfaces to the user
    • A61H 2201/5048 - Audio interfaces, e.g. voice or music controlled
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 - Control means thereof
    • A61H 2201/5058 - Sensors or detectors
    • A61H 2201/5084 - Acceleration sensors
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 - Control means thereof
    • A61H 2201/5058 - Sensors or detectors
    • A61H 2201/5092 - Optical sensor
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 - Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 - Control means thereof
    • A61H 2201/5097 - Control means thereof wireless

Definitions

  • This invention relates to devices for guiding the blind and visually impaired.
  • WO 2018/204745 discloses a system for aiding in navigation that includes a vibrating haptic device and a plurality of actuators, and a mobile electronic device coupled to the vibrating haptic device, wherein the mobile electronic device is configured to analyze data gathered by the vibrating haptic device and determine a geographic position and angle of a user.
  • CN209137266U discloses a device built into a spectacle frame for guiding the blind.
  • The device includes a single-chip processing module, an ultrasonic sensor, and various sensors for providing information indicative of the user’s immediate environment and potential obstacles.
  • The spectacle frame supports left and right vibrators on opposing arms; these alert the user by vibration to obstacles in front and to the left and right, and provide tactile feedback to inform the user in which direction to turn.
  • WO2016086440 discloses a wearable guiding system for the blind comprising a headset eyewear holder for wearing by a user; a sensor for detecting the surroundings; an image gathering device for recording surrounding images; a voice module for voice interaction; an information outputting device for alerting the wearer to related information; and a data processing center for overall control of the system.
  • US Patent No. 10,528,815 discloses a navigation aid wherein at least two computer-controlled haptic transducers are configured to be worn on parts of the body other than the user’s hands, such as the user’s arm, leg, or torso. The haptic transducers provide navigational prompts and are placed far enough away from each other so that the user can easily distinguish between haptic sensations from different haptic transducers.
  • US20180110672 discloses a navigation aid worn by a blind dog that has sensors and stimulators, which provide situational awareness to the dog regarding objects and sudden drops in its vicinity, helping it move about freely without the risk of collision or falling.
  • The stimulators may be based on vibration, sound, or ultrasound.
  • US2019125587 discloses a system for assisting the visually impaired, the system comprising inter alia an output component in the form of a tactile interface comprising a two-dimensional array of pins that actuate to communicate non-visual output through pressure and temperature.
  • Drawbacks of current devices include their dependency on hearing and touch. After sight, hearing and touch are the second and third major human senses, respectively; for the blind and visually impaired, they become the first and second principal senses. Blind people rely on hearing environmental cues for key tasks such as awareness, orientation, mobility and safety.
  • US20080120029 discloses a wearable tactile wayfinding device useful for people who are blind, for people who suffer from Alzheimer's disease, or wherever it is desirable to provide geographical information in tactile form as opposed to visual or auditory form.
  • Fig. 1 is a pictorial perspective view of a device according to an embodiment of the invention;
  • Fig. 2 is a pictorial plan view of the device;
  • Figs. 3a to 3c are pictorial perspective views of alternative mounting frames for the device;
  • Figs. 4a to 4d show, respectively, pictorial views of a 5-axis actuator and three forms of a 3-axis actuator suitable for use with the device;
  • Fig. 5 is a block diagram showing the functionality of a system according to the invention;
  • Fig. 6 shows pictorially an actuator whose default rest position is located on the side of the user's face;
  • Figs. 7a and 7b show schematically respective states of an actuator when at rest and when applying a dragging force to the skin surface;
  • Fig. 8a shows schematically a closed-loop control system for controlling actuator contact pressure;
  • Figs. 8b and 8c show pictorially the effect on the skin of controlling contact pressure during movement of the actuator;
  • Fig. 9a shows schematically a closed-loop control system for controlling actuator drag force;
  • Fig. 9b shows graphically actuator drag force as a function of time;
  • Figs. 10a to 10c show pictorially a vacuum pad mounted on the actuator for improving adhesion;
  • Figs. 11a to 11d show pictorially different stages of an initialization process for locating the actuators at an initial location;
  • Figs. 12a to 12d show pictorially different stages of a reset process for moving the actuators back to the initial location in the event of slippage;
  • Fig. 13 is a block diagram of an alternative implementation, where the device is interfaced to a separate navigation and obstacle avoidance system with a wireless communication port.
  • A device 10 for guiding a blind or visually impaired user comprises a head-mounted frame 11 providing left and right support arms 12, 12' on opposite sides of the user’s head.
  • Left and right actuators 13, 13' each supported by a respective one of the left and right support arms 12, 12' are configured to apply slight pressure against a skin surface on opposite sides of the user’s head.
  • The head-mounted frame 11 is a spectacle frame, whose opposing temple arms serve as the support arms 12, 12'.
  • Alternatively, the head-mounted frame may be any suitable support that is worn on the user’s head and is adapted to support the actuators on opposite sides thereof.
  • Figs. 3a and 3b show band-type mounting frames for supporting an actuator close to the eye and the ear, respectively, while Fig. 3c shows an ear-supported band.
  • A processing unit 15 (shown in Fig. 5) is coupled to both actuators 13, 13' and is responsive to environmental sensors 16 and to a known target destination. It provides navigation signals to at least one of the actuators, causing the actuator to exert lateral pressure along the skin surface in a direction that is felt by the user and is indicative of navigation instructions. These may include a required direction of movement toward the target destination or other actions to be taken, such as stop-start, turn left-right in place (without moving), slower-faster, upstairs-downstairs and so on.
  • Fig. 6 shows pictorially an actuator whose default rest position is located on the side of the user’s face and which applies an upward dragging force to indicate to the user that she should move upstairs.
  • A complementary actuator is located on the other side of the user’s face, and force is applied simultaneously to both actuators to direct the user upstairs or, when force is applied in the opposite direction, downstairs.
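  • The application does not specify the command encoding in code. Purely as an illustration, the following Python sketch shows one way a processing unit might map the command vocabulary described above to skin-drag vectors for the left and right actuators; all names, the axis convention and the numeric values are assumptions, not part of the disclosure.

```python
from typing import Dict, Tuple

# Illustrative sketch only. Assumed convention: each actuator receives a
# normalized (dx, dy) drag vector, where +x drags the skin forward (toward
# the front of the face) and +y drags it upward. Values are in [-1, 1].
Vector = Tuple[float, float]

COMMANDS: Dict[str, Tuple[Vector, Vector]] = {
    # Both actuators drag upward or downward (cf. Fig. 6).
    "upstairs":   ((0.0, 1.0), (0.0, 1.0)),
    "downstairs": ((0.0, -1.0), (0.0, -1.0)),
    # Complementary opposite drags give a mutually additive turning cue:
    # left forward plus right backward turns the head to the right.
    "turn_right": ((0.6, 0.0), (-0.6, 0.0)),
    "turn_left":  ((-0.6, 0.0), (0.6, 0.0)),
    # Joint forward/backward pulls for pace commands.
    "faster":     ((0.3, 0.0), (0.3, 0.0)),
    "slower":     ((-0.3, 0.0), (-0.3, 0.0)),
}

def actuate(command: str) -> None:
    """Dispatch a guidance command to the two actuators (print stub)."""
    left, right = COMMANDS[command]
    print(f"left drag={left}, right drag={right}")

if __name__ == "__main__":
    actuate("turn_right")  # left pulls forward, right pulls backward
```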
  • The sensors 16 may include a camera 17 mounted on the frame for recording images of the surrounding scene, and the processing unit 15 is coupled to the camera for receiving and analyzing the images to determine potential obstacles in close proximity to the user.
  • The sensors may also include lidar, proximity sensors, sonar, etc., or a combination of multiple sensors, the processing unit 15 being coupled to all the sensors for analyzing their respective signals and generating navigation and guidance signals.
  • An audio/alarm unit 18 is coupled to the processing unit 15 for alerting the user to an obstacle, typically audibly via earphones 19.
  • The processing unit 15 may also be coupled to a GPS navigation system 20 for obtaining map data of an area surrounding the user and may be responsive to the target destination for deriving the navigation signals.
  • The alarm unit provides an emergency alert when any failure or external threat is detected, e.g. to alert the user to stop in front of a pit or other obstacle. Another cause for emergency action might be an emergency signal from the sensors, a failure signal, etc.
  • The alarm unit may be configured to operate independently of the processing unit 15 and may have its own internal processor that operates in parallel with the processing unit 15 so as to provide faster processing and immediate alerts.
  • The audio/alarm unit 18 may also serve as an interface allowing the user to issue vocal commands to the processing unit 15, such as the target destination.
  • Figs. 7a and 7b show schematically respective states of an actuator when at rest and when applying a dragging force to the skin surface.
  • The actuator is anchored to the skin by a fixed arm 21 and has a moveable arm 22 that frictionally engages the skin surface.
  • In Fig. 7a, the vertical lines are evenly spaced and show the skin surface in its relaxed state.
  • In Fig. 7b, the moveable arm 22 has moved to the right, thereby stretching the skin surface to the left of the moveable arm 22 while puckering the skin to its right, between the fixed arm 21 and the moveable arm 22.
  • The processing unit may be coupled to an adaptive unit configured to learn frequent patterns, such as commonly taken routes, and is responsive to selection of a frequent pattern for operating the actuators.
  • In the event of a failure to receive one or more sensor signals, the processing unit may receive instructions directly from the adaptive unit.
  • The processing unit 15 may be a CPU mounted in the frame 11, or it may be a separate unit that is coupled to the sensors 16 and to the camera 17, when provided, either by wires or wirelessly using, for example, a Bluetooth™ connection.
  • The processing unit 15 may be configured to provide complementary navigation signals to both actuators 13, 13' such that the left actuator 13 rubs laterally along the skin surface in a first direction and the right actuator 13' rubs laterally along the skin surface in a second direction opposite the first direction. In this manner, the tactile impression is magnified, since the two actuators apply a mutually additive turning effect.
  • The manner of operation of the actuators 13, 13' is to apply lateral pressure along a surface of the skin. This may include actual movement of the actuator along the skin surface so as to rub along the skin in a desired direction. Alternatively, the actuator head may exert normal pressure against the skin surface and shift slightly to exert lateral pressure, which causes the skin to pucker or wrinkle slightly. In either case, the net effect is to push or pull the surface of the skin in a direction and manner that may be interpreted as a navigation or guidance instruction.
  • Actuator movement may encompass the following types of engagement with the skin surface and navigation commands: (a) pushing/pulling the skin (with controlled force); (b) an anti-slip mechanism (i. a vacuum sucker; ii. “restart-if-too-loose”); (c) knocking/pressing/pinching for additional information such as slow, fast, stop, go.
  • The left side need not act antisymmetrically to the right side: the actuators are not required to push and pull in opposite directions. They may push together, pull together, push or pull from one side only, push/pull up/down, knock, etc. This variety of movements accommodates different commands, e.g. stop-start, turn left-right in place (without moving), slower-faster, upstairs-downstairs.
  • Each of the left and right actuators 13, 13' is configured to rub along, or exert lateral pressure on, the skin surface along mutually perpendicular axes in different directions.
  • Fig. 4a is a pictorial view of a 5-axis actuator 16 suitable for use with the device 10.
  • Alternatively, the 3-axis actuators shown in Figs. 4b and 4c can be used.
  • Such actuators may be operated to induce lateral side-to-side movement, up-and-down movement, and to-and-fro movement toward and away from the user’s skin surface, thereby varying the pressure applied thereto.
  • The left and right actuators 13, 13' may be configured to apply sufficient pressure against the skin that, while rubbing along the skin surface or applying lateral pressure thereto, they guide the user’s head. Such actuators may also be controlled to rotate about an axis normal to the skin surface. The actuators may be operated independently of one another or may be operated together to reinforce their effect. For example, the left actuator 13 may apply a forward movement against the skin surface while the right actuator 13' applies a complementary backward movement, which has the effect of turning the user’s head to the right.
  • The 5-axis actuator includes a 5-axis articulating robotic arm.
  • The 3-axis formats require only three axes and, as detailed below, one axis may optionally be replaced with a constant-force spring or piston, so that only two fully operational control axes are needed.
  • Fig. 4b shows a rotary base axis onto which a lateral translation is mounted, and an additional contact forcing axis approximately normal to the skin exerts a contact force that pushes the contact pad against the skin.
  • The contact forcing axis may be implemented as a fixed uncontrolled axis, effected with a constant-force spring or pneumatic piston, so that only two fully controlled axes are required.
  • Fig. 4c shows a similar rotary base axis onto which two articulating axes are mounted.
  • The angle of the skin contact pad at the end of the articulating axes can be adjusted manually to accommodate the individual preferences of different users.
  • The second articulation axis, which is responsible for maintaining a constant force against the skin, may be replaced with a spring or pneumatic piston, so this arrangement can optionally be implemented with only two fully controlled axes.
  • Fig. 4d shows a three-axis arrangement comprising a rotary base axis supported on the head-mounted frame, an elevation axis, and a third axis realized as a rotating wheel supported by the elevation axis.
  • The transverse pulling of the skin is effected by the rotary base axis, with the motion of the wheel pulling or pushing the skin and the elevation axis applying force to ensure contact of the wheel with the skin.
  • This arrangement has two advantages. First, as in the other configurations, the elevation axis can be replaced with a constant-force spring or pneumatic piston, so that the implementation requires only two controlled axes. Second, no adjustment in the position of the actuator is required should the wheel slip; it resumes contact and pulling action immediately after traction with the skin resumes. Such repositioning adjustment may be required in certain implementations of the other forms of the actuators, as described below.
  • Contact pressure may vary during use as a result of physiological factors such as skin dryness and elasticity, external factors such as rain, humidity and cosmetics, and other factors. If, for any reason, contact pressure decreases to a level where lateral movement of the actuators becomes ineffective, this may be sensed and normal pressure may be increased as necessary. For example, upon applying lateral pressure, the actuator may slip, in which case recalibration of the applied contact pressure is initiated. This may be done by releasing the actuators so that there is no contact with the skin surface, and then applying pressure until a desired level of contact is achieved.
  • Constant contact pressure may be achieved using a closed-loop servo-control system 25 as shown schematically in Fig. 8a so that optimum contact pressure is applied by increasing or reducing pressure to achieve a preset value, which may itself change according to measured physiological and/or environmental parameters such as those mentioned above.
  • The servo-control system 25 includes a contact pressure sensor 26 that senses the contact pressure applied against the skin surface by the actuator pad. The contact pressure signal is fed back to a subtractor 27, which generates an error signal equal to the difference between the measured contact pressure and a desired set point; the resultant error signal is fed to the actuator motor 28.
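  • As a minimal, non-authoritative sketch of this closed loop (pressure sensor 26, subtractor 27, actuator motor 28), the Python fragment below implements a simple proportional-integral controller; the hardware interfaces, gain values and set point handling are hypothetical stubs rather than the disclosed implementation.

```python
import time

KP, KI = 0.8, 0.2   # proportional/integral gains (assumed values)
DT = 0.01           # control period in seconds (assumed)

def read_contact_pressure() -> float:
    """Stub for contact pressure sensor 26 (normalized units)."""
    return 0.45

def drive_motor(effort: float) -> None:
    """Stub for actuator motor 28; positive effort presses harder."""

def pressure_servo(set_point: float, cycles: int = 100) -> None:
    """Hold the pad at set_point contact pressure (cf. Fig. 8a)."""
    integral = 0.0
    for _ in range(cycles):
        error = set_point - read_contact_pressure()  # subtractor 27
        integral += error * DT
        drive_motor(KP * error + KI * integral)
        time.sleep(DT)
```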
  • Figs. 8b and 8c are intended to show pictorially that, regardless of the location of the actuator on the skin surface, the same contact pressure is maintained during movement of the actuator. Additionally or alternatively, other contact sensing methods may be used, including methods that monitor optical reflection from the skin, or conductivity between two electrodes for which skin contact closes the electrical circuit.
  • The actuators may be servo-controlled to achieve a desired movement of the skin depending, for example, on the extent of movement required and on the extent of the user’s reaction. So, for example, if the actuators operate to turn the user’s head in a specified direction but the user does not follow their guiding motion, additional lateral force may be applied. Likewise, the actuators are controlled to release pressure when the sensors determine that the user has moved sufficiently in the required direction. Between these two extremes, the actuators may be servo-controlled to exert less or more lateral pressure so as to urge the user in the required direction.
  • A suitable closed-loop servo-control system 30 for controlling actuator pressure is shown schematically in Fig. 9a.
  • The servo-control system 30 includes a dynamometer 31 that measures the force applied by the actuator pad. The measured force signal is fed back to a subtractor 32, which generates an error signal equal to the difference between the measured force and a desired set point; the resultant error signal is fed to the actuator motor 28. As seen in Fig. 9b, upon actuation the actuator force decays from an initial maximum to zero over an extended time period. This demonstrates a major departure of the invention, which applies continuous control of the actuator force within prescribed limits, as compared with prior-art approaches whose control force is binary, i.e. having only two force states: ‘on’ and ‘off’.
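  • The decay curve of Fig. 9b is not given analytically; assuming an exponential profile purely for illustration, a set-point generator feeding the force servo of Fig. 9a might look like this (f_max and the time constant tau are made-up values).

```python
import math

def drag_force_set_point(t: float, f_max: float = 1.0, tau: float = 2.0) -> float:
    """Commanded drag force t seconds after actuation onset: starts at
    f_max and decays continuously toward zero, in contrast to binary
    on/off control."""
    return f_max * math.exp(-t / tau)

# Example: sample the first five seconds of the force profile.
profile = [drag_force_set_point(0.1 * k) for k in range(50)]
```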
  • While the servo-systems 25 and 30 are typically electrical and the actuator motor 28 is electrically driven, in all embodiments the actuators may be hydraulically or pneumatically operated, and the servo-systems may employ suitable transducers to measure contact pressure and actuator force. Therefore, reference to “servo-system” in the claims is not intended to limit the scope of the claims to electrical servo-systems or to electrically operated actuators.
  • The amount of applied contact pressure and lateral force may itself be indicative of a navigation instruction. For example, stronger pressure could be suggestive of an obstacle in front of the user, urging him to move more carefully or more slowly. The same applies to lateral force, where a higher degree of lateral force informs the user to turn through a larger turn angle.
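  • As an illustration only, such proportional encoding might be expressed as below; the linear scaling and the saturation limit are assumed values, not taken from the disclosure.

```python
def lateral_force_for_turn(turn_angle_deg: float,
                           max_angle: float = 90.0,
                           max_force: float = 1.0) -> float:
    """Map a required turn angle to a drag-force magnitude: larger turns
    produce proportionally stronger lateral force, saturating at max_force."""
    return max_force * min(abs(turn_angle_deg) / max_angle, 1.0)
```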
  • Improved adhesive contact with the skin may be achieved by mounting at the distal ends of the actuators a vacuum pad 35 that contacts the skin and through which air may be drawn so as to apply suction as shown pictorially in Figs. 10a to 10c.
  • Figs. 11a to 11d show pictorially different stages of an initialization process for locating the actuators at an initial location.
  • Cameras 40, 40' are mounted at opposite ends on the rear surface of the head-mounted frame 11 so as to be directed backwards toward the user’s face, where they generate respective images over a fairly narrow area that overlaps the required initial location of the actuator arm.
  • The processing unit is configured to direct the cameras 40, 40' or other suitable imaging devices toward the image area and to process the resulting images to identify, in each image, the required initial location. It then applies suitable actuation signals to move each of the actuators to the respective initial location on opposite sides of the user’s head, corresponding to a rest position in which no active commands are applied to the actuators. One possible implementation is sketched after this passage.
  • The same approach may be applied to move the actuators to a preferred start position prior to applying lateral pressure.
  • The skin just below the eyes is particularly sensitive. Therefore, when it is required to signal upward movement to the user, the actuator may be moved to a starting position just below the eye prior to actuation.
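  • The disclosure does not name an image-processing algorithm for finding the initial location; a minimal sketch assuming normalized template matching (with a hypothetical motion stub) could look like this.

```python
import cv2
import numpy as np

def locate_rest_position(image: np.ndarray, template: np.ndarray):
    """Return the (x, y) pixel of the best match of a stored template of
    the required rest location within a rear-camera image."""
    scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_xy = cv2.minMaxLoc(scores)
    return best_xy

def move_actuator_to(xy) -> None:
    """Hypothetical stub: command the actuator arm toward pixel target xy."""

def initialize_actuator(camera_image: np.ndarray,
                        rest_template: np.ndarray) -> None:
    """One initialization step in the spirit of Figs. 11a to 11d."""
    move_actuator_to(locate_rest_position(camera_image, rest_template))
```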
  • Figs. 12a to 12d show pictorially different stages of a reset process for moving the actuators back to the initial location in the event of slippage.
  • In Fig. 12a, the actuator arm 22 is dragging the skin to the right, causing the skin to pucker.
  • Fig. 12b shows a subsequent stage where the actuator slips, unintentionally losing skin contact and allowing the skin to return to its initial relaxed position. The absence of contact pressure is sensed and the actuator arm 22 is immediately returned to its initial rest location, as shown in Fig. 12c, from where actuator contact pressure is restored to its required value and the actuator arm 22 is moved quickly back to its last location, as shown in Fig. 12d.
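  • The four stages of Figs. 12a to 12d suggest a simple recovery sequence, sketched below; the detachment threshold and all hardware calls are hypothetical stubs.

```python
CONTACT_THRESHOLD = 0.05  # assumed: below this the pad counts as detached

def read_contact_pressure() -> float:
    """Stub: normalized contact pressure from the pressure sensor."""
    return 0.0

def move_to_rest_position() -> None:
    """Stub: return the actuator arm to its initial rest location."""

def restore_contact_pressure() -> None:
    """Stub: re-apply the required normal contact force."""

def fast_move_to(position) -> None:
    """Stub: move the actuator pad quickly to the given position."""

def check_slip_and_recover(last_position) -> bool:
    """If contact was lost (Fig. 12b), run the reset of Figs. 12c and 12d."""
    if read_contact_pressure() >= CONTACT_THRESHOLD:
        return False                 # still engaged, nothing to do
    move_to_rest_position()          # Fig. 12c: back to the rest location
    restore_contact_pressure()      # re-apply the required contact force
    fast_move_to(last_position)      # Fig. 12d: resume the last position
    return True
```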
  • The actuators may be divided unevenly between opposite sides of the head. Different actuators may likewise be provided for conveying specific commands, and these can be located at respective rest positions that are optimized for the guidance command performed by the corresponding actuator. So, for example, one actuator can be initialized under the user’s eye for applying an upward dragging force, while another actuator can be initialized at a different location better suited to conveying a sideways motion.
  • Fig. 13 is a schematic block diagram showing an alternative implementation of the present invention as an add-on user interface to a separate navigation and obstacle avoidance guidance system.
  • The independent navigation system uses sensors, which can be limited to a single camera and a GPS receiver or can include the additional supporting sensors depicted in Fig. 13. These may optionally include lidars (laser-based direction and range sensors), ultrasonic sensors, or other proximity sensors.
  • The navigation system may be fully implemented on a commercial off-the-shelf smartphone incorporating suitable software to perform the different functions required, including path planning, navigation guidance along the planned path, obstacle identification, and audio awareness messaging including orientation indications (“road two meters ahead”) and various warning messages.
  • The current invention provides the guidance commands to the user through skin-pulling indications.
  • The commands and indications are generated by the navigation and obstacle detection system and are transmitted over the wireless communication link to the interface system, which translates these indications into skin-pulling actions.
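  • The message format on the wireless link is not specified in the application; the sketch below assumes a simple JSON payload purely to illustrate the translation step.

```python
import json

def apply_skin_pull(command: str, magnitude: float) -> None:
    """Hypothetical stub: drive the actuators for the given command."""
    print(f"skin-pull: {command} at {magnitude:.0%} strength")

def on_wireless_message(payload: bytes) -> None:
    """Translate a received guidance indication into a skin-pulling action."""
    msg = json.loads(payload)
    apply_skin_pull(msg["command"], float(msg.get("magnitude", 1.0)))

# Example: the navigation unit requests a gentle left turn.
on_wireless_message(b'{"command": "turn_left", "magnitude": 0.4}')
```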
  • The actuator device includes miniature 6-degree-of-freedom acceleration sensors. These feed back the rate of motion in each axis and indicate to the system that a required motion has been completed. For example, if the user is asked to turn, the acceleration sensors indicate when the turn is complete.
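  • One plausible (assumed) implementation integrates the gyro yaw rate until the commanded turn angle is reached; the IMU read-out is a hypothetical stub and the sampling scheme is illustrative only.

```python
import time

DT = 0.01  # sampling period in seconds (assumed)

def read_yaw_rate() -> float:
    """Stub: yaw angular rate in degrees/second from the 6-DOF IMU."""
    return 9.0

def wait_for_turn(target_deg: float, timeout: float = 10.0) -> bool:
    """Return True once the integrated heading change reaches the target,
    signalling that the actuators may release the turning cue."""
    heading = elapsed = 0.0
    while elapsed < timeout:
        heading += read_yaw_rate() * DT  # rectangular integration
        if abs(heading) >= abs(target_deg):
            return True
        time.sleep(DT)
        elapsed += DT
    return False
```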
  • The actuator device, in conjunction with the head-worn device, thereby implements an intuitive, convenient and quantitative interface for issuing directions and warnings to the user by touch. Such an arrangement carries an important commercial opportunity for the proposed invention, whereby it can be sold independently of the navigation devices, allowing the supplier direct access to the market (rather than having to sell devices to the system manufacturer).

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Pain & Pain Management (AREA)
  • Animal Behavior & Ethology (AREA)
  • Epidemiology (AREA)
  • Public Health (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A device for translating machine guidance indications into physical skin-pulling commands for guiding a blind or visually impaired user, comprising a head-mounted frame; miniature motion and acceleration sensors; and a communication port configured to interface with a navigation unit and receive guidance indications therefrom. A processing unit is coupled to the communication port and to the motion sensors and is responsive to the guidance indications for conveying actuation signals to at least one actuator for applying lateral pressure along a skin surface of the user indicative of a required action to be taken by the user.
PCT/IL2021/051028 2020-08-28 2021-08-23 Head-mounted guidance unit for blind persons WO2022043995A1 (fr)

Applications Claiming Priority (2)

Application Number   Priority Date  Filing Date  Title
IL276989             2020-08-28     2020-08-28   Navigation aid for the blind and visually impaired
PCT/IL2021/051028    2020-08-28     2021-08-23   Head-mounted guidance unit for blind persons

Publications (1)

Publication Number Publication Date
WO2022043995A1 (fr) 2022-03-03

Family

ID=80354841

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2021/051028 WO2022043995A1 (fr) 2020-08-28 2021-08-23 Head-mounted guidance unit for blind persons

Country Status (2)

Country Link
IL (1) IL276989A (fr)
WO (1) WO2022043995A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2966863A1 (es) * 2022-09-28 2024-04-24 Perez Gisela Terol Eyeglasses


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080120029A1 (en) 2006-02-16 2008-05-22 Zelek John S Wearable tactile navigation system
DE102013019080A1 (de) * 2013-11-17 2015-05-21 Peter Koppendorfer Device for converting environmental structures into setting values of the actuating elements of an actuator
FR3026638A1 (fr) * 2014-10-07 2016-04-08 Yannick Vaillant Assembly of an environment and a tactile stimulation interface for guidance along a trajectory in the environment
WO2016086440A1 (fr) 2014-12-04 2016-06-09 Shanghai Jiao Tong University Wearable guiding device for the blind
US20180110672A1 (en) 2016-10-26 2018-04-26 Kas Kasravi Blind Dog Navigation System
US10528815B2 (en) 2016-12-31 2020-01-07 Vasuyantra Corp. Method and device for visually impaired assistance
WO2018204745A1 (fr) 2017-05-04 2018-11-08 Wearworks Vibrating haptic device for the blind
US20190125587A1 (en) 2017-10-30 2019-05-02 Dylan Phan Assisting the visually impaired
WO2019156990A1 (fr) * 2018-02-09 2019-08-15 Vasuyantra Corp., A Delaware Corporation Remote perception of the depth and shape of objects and surfaces
CN209137266U (zh) 2018-11-07 2019-07-23 Nanjing University of Information Science and Technology Novel multifunctional blind-guiding glasses

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HEATHER CULBERTSON ET AL.: "Haptics: The Present and Future of Artificial Touch Sensation", Annu. Rev. Control Robot. Auton. Syst., 1 January 2018 (2018-01-01), pages 385-409, XP055701835, DOI: 10.1146/annurev-control-060117-105043, Retrieved from the Internet <URL:https://www.annualreviews.org/doi/pdf/10.1146/annurev-control-060117-105043> [retrieved on 20211118] *
RICKY JACOBS: "Integrating Haptic Feedback into Mobile Location Based Services", PhD thesis, Department of Computer Science, National University of Ireland, July 2013
TOMOHIRO AMEMIYA ET AL.: "Lead-Me Interface for a Pulling Sensation from Hand-Held Devices", 2008


Also Published As

Publication number Publication date
IL276989A (en) 2022-03-01

Similar Documents

Publication Publication Date Title
US9013264B2 (en) Multipurpose controller for electronic devices, facial expressions management and drowsiness detection
US10373452B2 (en) Targeted haptic projection
WO2015083183A1 (fr) Hand-held navigation device with haptic feedback
KR100934214B1 (ko) Compact tactile input/output motor and tactile input/output device using the same
US6862006B2 (en) Image processing apparatus and image processing method, and image processing program and recording medium of the same
KR101281806B1 (ko) Personal robot
JP6934618B2 (ja) Gesture input system and gesture input method
US11272283B2 (en) Rendering haptics on headphones with non-audio data
US11347312B1 (en) Ultrasonic haptic output devices
US10296093B1 (en) Altering feedback at an electronic device based on environmental and device conditions
US11474376B2 (en) Hinge designs in wearable electronic devices
US11150737B2 (en) Apparatus, system, and method for wrist tracking and gesture detection via time of flight sensors
JP2004096224A (ja) Power supply control method and head-mounted device
WO2022043995A1 (fr) Head-mounted guidance unit for blind persons
JP4048999B2 (ja) Image processing apparatus and image processing method
JP6822963B2 (ja) Fan-driven force device
US11009943B2 (en) On/off detection in wearable electronic devices
US11027430B2 (en) Systems and methods for latency compensation in robotic teleoperation
KR20170038461A (ko) Emotional robot system based on a smart device and method of controlling its operation mode
Mishra et al. IoT based automated wheel chair for physically challenged
US10560777B1 (en) Bone conduction designs in wearable electronic devices
Bellotto A multimodal smartphone interface for active perception by visually impaired
JP4063131B2 (ja) Image processing apparatus and tactile/force-sense presentation method
KR20140106309A (ko) Virtual reality input device with force feedback function
WO2023189425A1 (fr) Control device, control method, haptic feedback system, and program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21766240

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21766240

Country of ref document: EP

Kind code of ref document: A1