WO2021171291A1 - Self-navigating guide robot (Robot de guidage à navigation autonome) - Google Patents

Self-navigating guide robot (Robot de guidage à navigation autonome)

Info

Publication number
WO2021171291A1
WO2021171291A1 (PCT/IL2021/050210)
Authority
WO
WIPO (PCT)
Prior art keywords
self-navigating guide robot
guide robot
robot
Prior art date
Application number
PCT/IL2021/050210
Other languages
English (en)
Inventor
Amir NARDIMON
Omer WAXMAN
Original Assignee
Seamless Vision (2017) Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seamless Vision (2017) Ltd
Publication of WO2021171291A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H 3/04 Wheeled walking aids for patients or disabled persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H 3/06 Walking aids for blind persons
    • A61H 3/061 Walking aids for blind persons with electronic detecting or guiding means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0248 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H 3/04 Wheeled walking aids for patients or disabled persons
    • A61H 2003/043 Wheeled walking aids for patients or disabled persons with a drive mechanism
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H 3/04 Wheeled walking aids for patients or disabled persons
    • A61H 2003/046 Wheeled walking aids for patients or disabled persons with braking means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H 3/06 Walking aids for blind persons
    • A61H 3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A61H 2003/063 Walking aids for blind persons with electronic detecting or guiding means with tactile perception
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/01 Constructive details
    • A61H 2201/0173 Means for preventing injuries
    • A61H 2201/0176 By stopping operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/01 Constructive details
    • A61H 2201/0188 Illumination related features
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/01 Constructive details
    • A61H 2201/0192 Specific means for adjusting dimensions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/16 Physical interface with patient
    • A61H 2201/1602 Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H 2201/1635 Hand or arm, e.g. handle
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5007 Control means thereof computer controlled
    • A61H 2201/501 Control means thereof computer controlled connected to external computer devices or networks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5023 Interfaces to the user
    • A61H 2201/5043 Displays
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5023 Interfaces to the user
    • A61H 2201/5048 Audio interfaces, e.g. voice or music controlled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5058 Sensors or detectors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5058 Sensors or detectors
    • A61H 2201/5071 Pressure sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5058 Sensors or detectors
    • A61H 2201/5082 Temperature sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/50 Control means thereof
    • A61H 2201/5097 Control means thereof wireless
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 2230/00 Measuring physical parameters of the user
    • A61H 2230/04 Heartbeat characteristics, e.g. ECG, blood pressure modulation
    • A61H 2230/06 Heartbeat rate
    • A61H 2230/065 Heartbeat rate used as a control parameter for the apparatus

Definitions

  • TECHNOLOGICAL FIELD: The presently disclosed subject matter relates to a self-navigating guide robot for guiding a disabled user along a route.
  • According to an aspect of the presently disclosed subject matter, there is provided a self-navigating guide robot comprising: at least one sensor configured to receive sensory input relating to the environment; a handle comprising at least one input device configured to receive sensory input from a user and at least one output device configured to communicate with a user; and a controller configured, in use, to navigate the robot based on the sensory input from the at least one sensor and the at least one input device, and to communicate with a user via the at least one output device. Since the guide robot is provided with a handle, a user can retain a connection with the robot while in use, which can enable the user to maintain their stability and/or balance, and position relative to the robot.
  • Since the handle comprises at least one input device, a user can easily provide sensory input to the robot while holding the handle.
  • Since the robot is configured, in use, to navigate based on sensory input from both the at least one sensor and the at least one input device, optimized navigation can be provided which takes into account both the needs of the user and the features of the surroundings.
  • Since the robot is configured, in use, to communicate with a user via the at least one output device, the robot can provide feedback and adequate warnings and/or information to a user relating to details such as, but not limited to: the route; the surroundings, such as obstacles, the terrain, an impending step, or traffic when crossing roads; and power consumption, such as in the event that there is not enough power in the robot to complete a desired route.
  • the at least one output device can comprise at least one of a haptic feedback device, a visual display, a speaker, a braille pad, a mobile phone, or any network capable device.
  • Networks may, for example, include, but are not limited to, short wave ultra high frequency radio wave communication such as Bluetooth®, Wireless Local Area Networks (WLAN), and any other suitable protocol.
  • A haptic feedback device, also referred to as a haptic feedback mechanism, can comprise any combination of, but not limited to, a braille pad, moving or rotating parts such as joysticks, levers, rotating portions, vibrating portions, electro-tactile portions and thermal feedback portions.
  • Such haptic feedback devices, portions and mechanisms can be used to allow the robot to communicate with a user, for example, warning about static and/or moving obstacles, an impending left or right turn, crossing of a road or other busy intersection, a change of texture and/or friction level and/or evenness of the ground surface, a change of incline, i.e., uphill, downhill, flat, information related to location, such as beginning a guided route, arrival at a destination, warning of the starting or stopping of the robot, an indication of low power, etc.
  • the handle can comprise a gripping portion.
  • the gripping portion can be designed ergonomically to be comfortable for a user to hold for the duration of a route
  • the gripping portion can comprise an ergonomically designed shape or form, and can be comprised of a breathable material and/or a flexible material and/or an insulating material, all of which can make the use of the handle more comfortable for long-duration holding of the gripping portion of the handle by a user.
  • the gripping portion can comprise an arcuate form.
  • An arcuate form can help prevent injury due to sharp corners and additionally has a variety of directional portions to allow a user to grip in the most comfortable manner for that user.
  • the handle can be located at a rear side of the robot, wherein a central portion of the arcuate form of the gripping portion can coincide with a forward-aft longitudinal axis of the robot, and wherein side portions of the arcuate form can extend laterally and forwardly from the central portion of the arcuate form.
  • Since the handle is at the rear side of the robot, a user can easily walk behind the robot. Further, since a central portion of the arcuate form of the gripping portion coincides with a forward-aft longitudinal axis of the robot, a user can align themselves centrally behind the robot. Since side portions of the arcuate form extend laterally and forwardly from the central portion of the arcuate form, a user can align themselves directly behind the robot and hold onto each side portion with one hand, or can stand offset to one or the other side of the robot while holding on with a single hand to the respective side portion.
  • the at least one input device can be located on the gripping portion of the handle. With this arrangement, a user can provide sensory input without having to release their grip on the gripping portion, allowing more security and ease of use for the user. Further, in the event that the input device comprises a health sensor, for example, a pulse measurement device, a thermal sensor, and/or any other suitable device, the input can be provided automatically when the user holds the handle during normal use and does not require additional action or movement by the user.
  • the at least one output device can be located on the gripping portion of the handle.
  • the at least one input device can comprise one or more of a microphone, a camera, a pulse detector, a thermal sensor, a pressure sensor, a joystick, a braille pad, a keyboard, a button, a fingerprint sensor, a touch pad or track pad, or a torque sensor.
  • Such input devices can provide input relating to the identity of a user, route instructions from a user, real-time health information about a user, and/or any other suitable information.
  • the at least one sensor can comprise one or more of a LIDAR device, an RF Radar device, GPS system, ultrasonic range finder, temperature sensor, humidity sensor, stereo camera, color camera, magnetic field sensor, inertial measurement sensor, wheel encoder or audio sensor.
  • Such sensors can provide information such as the existence and position of, distance to, and speed of, obstacles, traffic, and/or any other suitable information, as well as information relating to the surroundings, such as information relating to the roughness of a surface, gradient of terrain, heat information such as in the case of an IR sensor detecting fire or the existence and location of human passersby; information relating to the existence of potholes; information relating to the location of the robot in order to allow optimized route planning and navigation, etc.
  • the at least one sensor can comprise a plurality of ultrasonic range finder sensors arranged around a portion of a perimeter of a front and sides of the self-navigating guide robot, so as to provide up to 135° field of view laterally on either side of a forward-aft longitudinal axis passing through the robot.
  • Alternatively, the field of view provided can be up to 45° or up to 90° laterally on either side of a forward-aft longitudinal axis passing through the robot.
  • the at least one sensor can comprise a plurality of cameras, comprising at least one of: a front camera facing a forward direction of the robot; at least one side camera oriented at least partially forward and at least partially to a side of the robot; and an upward camera oriented angled upward at an acute angle to the horizontal.
  • Front cameras can capture information relating to the route directly in front of the robot.
  • Side cameras oriented at least partially forward and at least partially to a side of the robot can capture information relating to the route in front and to the side of the robot.
  • Upward cameras can capture information relating to the surroundings above the robot, for example, the existence and location of low-hanging branches, signposts, and/or any other suitable information.
  • The plurality of cameras can be stereo cameras. Alternatively or additionally, the plurality of cameras can be color cameras. Stereo cameras enable simulation of human binocular vision and allow capture of three-dimensional images. Color cameras allow reproduction of a colored image, which may be easier for a user to interpret than a monochromatic image, in particular when it comes to differentiating different objects from one another.
  • the handle can comprise a low-frequency RFID reader; wherein, in use: the low-frequency RFID reader can be configured to read a low-frequency RFID tag of a user; and the controller can be configured to determine whether the read RFID tag is a recognized tag or an unrecognized tag; and i) if the RFID tag is a recognized tag, to operate the self-navigating guide robot; and ii) if the RFID tag is an unrecognized tag, to lock the self-navigating guide robot against operation.
  • By low-frequency RFID, what is meant is having a frequency in the range of 30kHz-300kHz, such as, for example, 125kHz and/or 134kHz.
  • Such a low-frequency RFID reader is capable of reading RFID tags at a range of a few centimeters.
  • Such low-frequency RFID readers can be used, for example, to identify a user for the purposes of security, such as to identify a source of instructions and to determine whether such instructions should be obeyed, and can be used in a security system to issue an alert should an unrecognized RFID tag provide instructions for use.
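  • Purely as an illustrative sketch (not part of the patent disclosure), the tag-gating behavior described above could look as follows; the tag values, class name and method names are hypothetical:

```python
# Illustrative sketch of the RFID gating logic described above.
# The tag registry and all names here are hypothetical, not from the patent.

RECOGNIZED_TAGS = {"0x1A2B3C", "0x4D5E6F"}  # enrolled low-frequency (125/134 kHz) tag IDs

class GuideRobotController:
    def __init__(self, recognized_tags):
        self.recognized_tags = set(recognized_tags)
        self.locked = True  # locked until a recognized tag is read

    def on_tag_read(self, tag_id: str) -> None:
        """Called by the handle's low-frequency RFID reader (few-cm range)."""
        if tag_id in self.recognized_tags:
            self.locked = False          # recognized tag: enable operation
        else:
            self.locked = True           # unrecognized tag: lock against operation
            self.raise_security_alert(tag_id)

    def raise_security_alert(self, tag_id: str) -> None:
        print(f"ALERT: unrecognized RFID tag {tag_id} attempted to operate the robot")

controller = GuideRobotController(RECOGNIZED_TAGS)
controller.on_tag_read("0x1A2B3C")    # unlocks the robot
controller.on_tag_read("0xDEADBEEF")  # locks the robot and raises an alert
```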
  • the self-navigating guide robot can comprise a power supply, such as a chemical battery, a capacitor, fuel and generator, or other power supply.
  • Such a power supply enables the robot to move freely without needing to be linked to a particular location. Batteries and capacitors can be charged and/or replaced at charging points, for example, when connected to the mains electricity, and fuel can be refilled, for example, at a refueling station.
  • the handle can be pivotally connected to a main body of the robot via a connecting rod; and the connecting rod can be rotatable relative to the main body of the robot about a pivot point.
  • Such pivotal connection allows the height of the handle to be raised and lowered in accordance with the height or other anthropometric measurements of a user, and/or based on a gradient of terrain.
  • the self-navigating guide robot can comprise an indexing plunger for locking the connecting rod in place against rotation about the pivot point. Such a locking can ensure that the connecting rod remains holding the handle at a desired height.
  • the self-navigating guide robot can comprise a torque sensor at the pivotal connection between the connecting rod and the main body of the robot.
  • a torque sensor can be used to control the robot, for example, to slow the robot down or speed up the robot depending on a torque applied at the pivotal connection by a user.
  • the torque sensor can also be used as a torque feedback device to communicate with a user.
  • the handle can be pivotally connected to the connecting rod.
  • a pivotal connection allows the handle to constantly lie in a certain directionality, i.e., in planes which are parallel to one another, irrespective of the angle of the connecting rod. It also allows more flexibility for a user to hold the handle at an angle comfortable for the user's grip.
  • the self-navigating guide robot can comprise a torque sensor at the pivotal connection between the handle and the connecting rod.
  • a torque sensor can be used to control the robot, for example, to slow the robot down or speed up the robot depending on a torque applied at the pivotal connection by a user.
  • the torque sensor can also be used as a torque feedback device to communicate with a user.
  • the connecting rod can be a telescopic rod so as to have a variable length.
  • Such a connecting rod allows conforming the dimensions of the robot for comfortable use by a variety of users with different anthropometric data. Further, such a connecting rod allows variability of length under select conditions, such as, but not limited to, the gradient of a particular terrain, a speed of a user, and/or any other suitable factors.
  • the controller can be configured to adjust at least one of a length of the connecting rod and an angle between the connecting rod and the main body of the robot, based on at least one of: anthropometric data of a user; a sensed inclination of the ground to be traversed by the robot; or a speed or intended speed of travel of the robot.
  • the controller can be configured to adjust an angle between the connecting rod and the handle based on at least one of: anthropometric data of a user; a sensed inclination of the ground to be traversed by the robot; or a speed or intended speed of travel of the robot.
  • Such adjusting can be automatic and during use, or can comprise instructing a user or helper to adjust the length of the connecting rod manually and/or electrically via one or more input device.
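  • The following is a minimal sketch of how such an adjustment policy might be expressed, assuming a simple linear model; the coefficients and clamping limits are invented for illustration and are not taken from the patent:

```python
# Hypothetical linear policy for the handle-geometry adjustment described above.
# Coefficients and limits are illustrative assumptions only.

def adjust_connecting_rod(user_height_m: float, ground_incline_deg: float,
                          speed_m_s: float) -> tuple[float, float]:
    """Return (rod_length_m, rod_angle_deg) for the current conditions."""
    length = 0.45 * user_height_m            # scale the rod to the user's height
    length += 0.02 * speed_m_s               # extend slightly at higher speeds
    length = min(max(length, 0.60), 1.10)    # respect the telescopic travel limits

    angle = 40.0 - 0.5 * ground_incline_deg  # lower the handle when going uphill
    angle = min(max(angle, 20.0), 60.0)      # respect the joint limits
    return length, angle

print(adjust_connecting_rod(1.75, 5.0, 1.2))  # e.g. (0.8115, 37.5)
```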
  • the self-navigating guide robot can comprise a ground-contacting portion. Such a portion can be in partial or full contact with the ground or terrain at any one time, depending on the type of terrain such as flat terrain, gradient, rough terrain, stairs, etc.
  • the ground-contacting portion can comprise at least four wheels, each wheel comprising a motor for driving the respective wheel.
  • Four wheels can provide stability. Further, since each wheel has its own motor, in the event that going over bumpy or otherwise uneven terrain causes one of the wheels to lose contact with the ground, the remaining wheels will be able to drive the robot onward.
  • the at least four wheels can comprise six wheels, wherein three of the six wheels can be spaced along the forward-aft direction on either side of the robot. Having three wheels on each side provides increased stability in the event that one of the wheels loses contact with the ground, such as when going over a bump, hole or other terrain feature.
  • the two wheels closest to the front of the robot can be connected to a pivoting member via struts projecting from the pivoting member at a relative angular orientation of greater than 30°, for example, 90°.
  • the pivoting member can be arranged to pivot to allow use of the front and middle pairs of wheels to climb stairs, for example.
  • the ground-contacting portion can comprise at least one caterpillar track.
  • Caterpillar tracks can allow a greater contact surface area with the ground.
  • the ground-contacting portion can comprise at least one caster. Casters can support the robot while allowing ease of movement in a range of directions.
  • the self-navigating guide robot can comprise a cover over a rear portion of the ground-contacting portion.
  • a cover can act as a mudguard or splashguard to prevent the ground contacting portion from throwing up liquid, dirt, stones, etc., onto the user who can be located to the rear of the robot.
  • the self-navigating guide robot can comprise a manual brake for locking the self- navigating guide robot against movement.
  • a manual brake allows a user to selectively lock and unlock the robot when not in use, for example, at the beginning and end of a route, while located on a gradient to prevent accidental rolling, and at any point during the route should the user wish.
  • the self-navigating guide robot can comprise an automatic brake configured to automatically lock the robot against movement in the event of a power loss.
  • an automatic brake can be a fail-safe mechanism, preventing accidental undesirable rolling of the robot in the event of a power loss, for example, rolling down a gradient, rolling into the path of oncoming traffic, etc.
  • the self-navigating guide robot can comprise a release mechanism for allowing free movement of the robot in the event of a power loss.
  • Such a release mechanism can act as a fail-safe, allowing a user to navigate out of dangerous situations, such as in the middle of crossing a road, or to complete the route unassisted in the event of a power loss where restoring power is time-consuming and/or not possible in that location.
  • the self-navigating guide robot can be self-driving. Self-driving ensures that a user need not push the robot, in the event that the user is weak or frail, and allows a user to concentrate on their own motion without worrying about the robot.
  • the self-navigating guide robot can be configured to adjust its speed in response to a measured torque about the handle.
  • Applied torque to the handle can be indicative that the robot is moving too fast or too slow for the user and so the robot can adjust its speed to suit the particular user, and/or particular gradients, for example, to go faster downhill than uphill.
  • the self-navigating guide robot can be configured to reduce its power consumption if a user provides a driving force via the handle. In the event that the user does push the robot, instead of the robot resisting such pushing by creating a resistive force, the robot can utilize the input driving force, thereby allowing energy savings.
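  • A hedged sketch of the torque-to-speed behavior described in the two preceding items might look as follows; the gains, limits and function names are illustrative assumptions, not values from the patent:

```python
# Sketch of the torque-to-speed behavior described above: forward torque on the
# handle (user pushing) raises the target speed and lets the motors back off;
# backward torque (user lagging) slows the robot down. Gains are illustrative.

CRUISE_SPEED = 1.0        # m/s, nominal guiding speed (assumed)
TORQUE_GAIN = 0.05        # (m/s) per N*m of sustained handle torque (assumed)
MAX_SPEED, MIN_SPEED = 1.6, 0.3

def target_speed(handle_torque_nm: float) -> float:
    """handle_torque_nm > 0: user pushes forward; < 0: user pulls back."""
    target = CRUISE_SPEED + TORQUE_GAIN * handle_torque_nm
    return min(max(target, MIN_SPEED), MAX_SPEED)

def motor_power_fraction(handle_torque_nm: float) -> float:
    """Reduce motor effort when the user supplies a driving force."""
    assist = max(handle_torque_nm, 0.0) * 0.1
    return max(1.0 - assist, 0.0)

print(target_speed(-4.0))         # 0.8: user lags, robot slows down
print(motor_power_fraction(6.0))  # 0.4: user pushes, robot saves power
```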
  • the self-navigating guide robot can comprise a transceiver configured for communication with a distant entity, for example, by Wireless Local Area Network (WLAN), cellular network, short wave ultra high frequency radio wave communication such as Bluetooth®, and/or any other suitable protocol.
  • In some examples, the transceiver communicates via a user's mobile phone.
  • the robot can be configured to communicate with a distant security company in the event that the robot is stolen or its location is unknown, or in the event that the user sends an alert due to being lost, sudden loss of power mid-route or needs other assistance.
  • the robot can be configured to allow communication by email, telephone or text message with a third party or contact of the user, and can additionally or alternatively be configured to allow communication of system updates, including map and route updates, GPS information, and/or any other suitable information, with a dedicated system external to the robot.
  • According to another aspect of the presently disclosed subject matter, there is provided a self-navigating guide robot comprising: a handle comprising a low-frequency RFID reader; and a controller; wherein, in use: the low-frequency RFID reader is configured to read a low-frequency RFID tag of a user; and the controller is configured to determine whether the read RFID tag is a recognized tag or an unrecognized tag; and i) if the RFID tag is a recognized tag, to operate the self-navigating guide robot; and ii) if the RFID tag is an unrecognized tag, to lock the self-navigating guide robot against operation.
  • By low-frequency RFID, what is meant is having a frequency in the range of 30kHz-300kHz, such as, for example, 125kHz and/or 134kHz.
  • Such a low-frequency RFID reader is capable of reading RFID tags at a range of a few centimeters.
  • Such low-frequency RFID readers can be used, for example, to identify a user for the purposes of security, such as to identify a source of instructions and to determine whether such instructions should be obeyed, and can be used in a security system to issue an alert should an unrecognized RFID tag provide instructions for use.
  • Fig. 1A shows a top rear perspective view of a guide robot according to the presently disclosed subject matter;
  • Fig. 1B shows a front view of the guide robot of Fig. 1A;
  • Fig. 1C shows a rear view of the guide robot of Fig. 1A;
  • Fig. 1D shows an underneath view of the guide robot of Fig. 1A, with a part of the underneath housing removed;
  • Fig. 1E shows a plan view of the guide robot of Fig. 1A;
  • Fig. 1F shows a left side view of the guide robot of Fig. 1A;
  • Fig. 1G shows a right side view of the guide robot of Fig. 1A;
  • Fig. 2A shows an enlarged top front perspective view of the handle of the guide robot of Fig. 1A;
  • Fig. 2B shows an enlarged bottom back perspective view of the handle of the guide robot of Fig. 1A;
  • Fig. 3A shows a front right perspective view of a portion of the guide robot of Fig. 1A;
  • Fig. 3B shows a side cross-sectional view along the plane B-B in Fig. 3A of the handle and connecting rod;
  • Fig. 3C shows an enlarged perspective cross-sectional view along the plane B-B in Fig. 3A of the connecting rod;
  • Figs. 3D-3G show an enlarged perspective view of a portion of the connecting rod of Figs. 3B and 3C, with each successive figure having an additional component of the connecting rod removed from view compared to the previous figure;
  • Fig. 4 shows an enlarged top front perspective view of an articulated joint of the guide robot of Fig. 1A;
  • Fig. 5 shows a plan view of the guide robot of Fig. 1A with a portion of the upper housing removed;
  • Fig. 6 shows an enlarged left side view of a portion of the guide robot of Fig. 1A;
  • Fig. 7 shows an enlarged left side view of a portion of the guide robot of Fig. 1A with a portion of the upper housing removed;
  • Fig. 8 shows a top view of an arrangement of sensors within the guide robot of Fig. 1A;
  • Fig. 9A shows an enlarged left side view of a portion of the ground-contacting portion of the robot of Fig. 1A with a part of the housing removed;
  • Fig. 9B shows a bottom perspective view of the portion of the ground-contacting portion of Fig. 9A.
  • Referring now to the figures, a guide robot generally designated 100 is provided.
  • the guide robot 100 comprises a handle 120, a ground-contacting portion 140, a sensing portion 160 and a control portion 180.
  • the sensing portion 160 and the control portion 180 are located in a main body portion 150.
  • the handle 120 of this particular example is located at the rear of the guide robot, and is shown in more detail in Fig. 2A.
  • the handle 120 comprises an arcuate gripping portion 122, having a central gripping area 122a and two side gripping areas 122b.
  • the central gripping area 122a allows a user to hold the handle 120 directly in front of him single-handedly, while the side-gripping areas 122b permit a user to stand on either side of the handle and hold the handle 120 single-handedly, or to hold the handle 120 directly in front of him with one hand placed on each of the side-gripping areas 122b.
  • the handle can comprise at least one input device for input of sensory information by a user, and at least one output device for communication to a user.
  • the input and/or output devices can be located on one or more of the gripping portion of the handle and a user interface portion of the handle which is not designed to be gripped by a user.
  • the handle comprises a user-interface portion 130, comprising a microphone array 132 and two speakers 134.
  • the microphone array can be used for one or more of recognizing the user (voice recognition), security, monitoring or sensing ambient noise, and for input of instructions to the robot 100 as described further below.
  • the handle 120 can comprise any one or more haptic feedback mechanisms, to provide haptic feedback to a user, such as any combination of, but not limited to, a braille pad, moving or rotating parts such as joysticks, levers, rotating portions, vibrating portions, electro-tactile portions and thermal feedback portions.
  • Such haptic feedback portions and mechanisms can be used to allow the robot to communicate with a user, for example, warning about static and/or moving obstacles, an impending left or right turn, crossing of a road or other busy intersection, a change of texture and/or friction level and/or evenness of the ground surface, a change of incline, i.e., uphill, downhill, flat, information related to location, such as beginning a guided route, arrival at a destination, warning of the starting or stopping of the robot, an indication of low power, etc.
  • Such haptic feedback can be provided on one or more of the gripping portion 122, and/or the user-interface portion 130, and/or in some examples, can be located in the form of a torque feedback mechanism in an articulated joint between the handle 120 and a first end 138a of a connecting rod 138 which connects the handle 120 to the main body portion 150 of the robot 100 (shown in Figs. 1A-1G, for example).
  • the user-interface portion 130, or indeed any portion of the handle 120 can comprise one or more low-frequency RFID reader, i.e., having a frequency in the range of 30kHz-300kHz, such as, for example, 125kHz and/or 134kHz, capable of reading RFID tags at a range of a few centimeters.
  • Such low-frequency RFID readers can be used, for example, to identify a user for the purposes of security, such as to identify a source of instructions and to determine whether such instructions should be obeyed, and can be used in a security system to issue an alert should an unrecognized RFID tag provide instructions for use.
  • the user-interface portion 130 can comprise one or more touch sensor to allow user-input instructions, such as one or more of any of a keypad, button, joystick, rotatable grip, lever, pressure sensor, heat sensor, torque sensor, and/or any other suitable input mechanisms.
  • Buttons 133 can also be provided for user input. Such buttons 133 may, for example, be volume control buttons, on-off switches, and/or any other suitable control devices, or may provide other forms of input to the robot 100, or even control output, such as turning a speaker on and off, for example.
  • In some examples, there can be provided a torque sensor in an articulated joint between the handle 120 and the connecting rod 138, capable of sensing an applied torque between the handle 120 and the connecting rod 138, indicative of a velocity differential between a user and the robot 100, which can be used to effect a change in speed of the robot 100 accordingly.
  • the user-interface portion 130 or indeed any portion of the handle 120, can comprise one or more "alert" input devices, such as a "call for help" button.
  • Such a device can allow a user to cause the robot to produce a local alert, such as a siren, voice message, and/or any other suitable alerts, and/or can allow the user to contact another person, such as by making a call via a cellphone network or sending an email or text message, and/or can allow a user to place an alert on a dedicated tracking system, using GPS or other location data, to enable an external operator to provide assistance.
  • the user-interface portion 130 can comprise one or more screens or displays, which can be adapted to emit light at frequencies visible to a specific user, and can provide data relating to location, route, distance to destination, instructions to start or stop, for example, when crossing roads, or other information.
  • screens or displays can be adapted to provide information about the user to third parties who could assist the user, such as route, location, requests such as "please help the user cross the road," information about the user, such as "the user is visually impaired," etc.
  • Such visual instructions can be accompanied by audio instructions, or alternatively, there might only be audio instructions.
  • the user-interface portion 130 can comprise one or more health sensors, such as pulse sensors, capable of allowing the robot to slow down or stop should a pulse be above a set threshold, for example, and/or to issue a health warning locally or distally, such as by telephone, text message, email or other alert, should a pulse be detected as being unhealthily high or low.
  • Health sensors can include, but are not limited to, any one or more of a stress-level sensor, a sweat sensor, and/or any other suitable sensor or sensors.
  • the user-interface portion 130 can comprise one or more grip-sensors capable of causing the robot to stop when a grip is released.
  • Such grip-sensors can comprise, for example, heat sensors, a button which must be depressed, or a pressure sensor, in order for the robot to operate.
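  • One possible way to combine the health-sensor and grip-sensor behaviors described above into a single safety check is sketched below; the thresholds and action names are illustrative assumptions, not values from the patent:

```python
# Combined sketch of the safety monitors described above: the robot stops when
# the grip is released, and slows or alerts when the pulse leaves a healthy band.
# The thresholds are illustrative assumptions.

PULSE_HIGH, PULSE_LOW = 140, 45   # beats per minute (assumed limits)

def safety_action(grip_held: bool, pulse_bpm: int) -> str:
    if not grip_held:
        return "stop"                    # dead-man behavior on grip release
    if pulse_bpm > PULSE_HIGH:
        return "slow_down_and_alert"     # pulse above the set threshold
    if pulse_bpm < PULSE_LOW:
        return "stop_and_alert"          # dangerously low pulse
    return "continue"

assert safety_action(False, 80) == "stop"
assert safety_action(True, 155) == "slow_down_and_alert"
assert safety_action(True, 72) == "continue"
```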
  • the user-interface portion 130 can comprise one or more cameras for facial recognition, for example, to recognize the presence of an authorized user for receipt of instructions, and conversely, to prevent acceptance of instructions from an unrecognized user, to monitor the user, etc.
  • the handle 120 further comprises a height adjustment button 135, arranged to allow disengagement of a locking mechanism, as described further below with reference to Figs. 3A-3G, to allow a height of the handle to be adjusted.
  • the robot 100 comprises the main body portion 150, which comprises, enclosed within a housing 152, a sensing portion 160 and a control portion 180.
  • the connecting rod 138, which is operatively connected at its first end 138a to the handle 120, has a longitudinal axis A along which it extends, from the first end 138a of the connecting rod to a second end 138b of the connecting rod 138.
  • the connecting rod 138 can be telescopic, comprising nested portions 137a, 137b, which can slide relatively axially along the longitudinal axis A.
  • the second end 138b of the connecting rod 138 is articulated to the main body portion 150 of the robot 100 at an articulated joint 139.
  • the nested portions 137a, 137b comprise a generally U-shaped cross-section in a plane perpendicular to the longitudinal axis A, and are arranged in opposing overlapping arrangement.
  • the nested portions 137a, 137b may be surrounded by housings 137c and 137d as shown in Figs. 3E and 3D respectively.
  • the nested portions 137a, 137b can be connected by a chain 137e for smooth relative sliding therebetween, and additionally to ensure that the connecting rod 138 is not over-extended such that the nested portions 137a, 137b separate from one another.
  • the relative axial adjustment along the longitudinal axis A of the nested portions 137a, 137b can, for example, be performed manually, and/or can be controlled by a user via a control on the handle, for example electrically, such as by a switch, dial, button, touch pad or any other input device.
  • nested portion 137b comprises a series of parallel slots 131a in the center-face of the U-shaped cross-section, each slot extending perpendicularly to the longitudinal axis A, and the slots 131a being spaced from one another in the direction of the longitudinal axis A.
  • Nested portion 137b is arranged to be fixed at one end thereof, which is the second end 138b of the connecting rod 138, to the articulated joint 139.
  • Nested portion 137a is arranged to translate relative to the nested portion 137b in the direction of the longitudinal axis A.
  • Disposed within nested portion 137a is a stepped locking lever 136.
  • Lever 136 comprises a first end portion 136a and a second end portion 136b and a central portion 136c therebetween.
  • the lever 136 extends generally parallel to the longitudinal axis A, and is stepped such that the first end portion 136a projects away from the central portion 136c towards a rear of the robot 100 and the second end portion 136b projects away from the central portion 136c towards a front of the robot.
  • the lever 136 is pivotally connected, at the central portion 136c thereof, to the nested portion 137a at a pivot 136d.
  • the first end portion 136a of the lever 136 comprises the height adjustment button 135, which is arranged to project through an aperture 131b in the nested portion 137a.
  • the nested portion 137a is configured to translate along the direction of the longitudinal axis A together with the lever 136.
  • the second end portion 136b comprises two projecting pins 136e arranged to engage within one of the slots 131a in order to lock the nested portions 137a, 137b against relative translation in the direction of the longitudinal axis A.
  • the mechanism comprises a biasing member arranged to bias the mechanism in the locked position.
  • the biasing member comprises a compression spring 136f, arranged to urge against the first end portion 136a, to cause the lever 136 to rotate about the pivot 136d in a direction such that the first end portion 136a protrudes at least partially through the aperture 131b, and so that the projecting pins 136e of the second end portion 136b of the locking lever 136 are located at least partially within one of the slots 131a.
  • In this position, the nested portions 137a, 137b are locked against relative translation in the direction of the longitudinal axis A.
  • an operator can manually press the height adjustment button 135 of the first end portion 136a of the lever 136 against the urging force of the compression spring 136f, to cause the lever 136 to rotate about the pivot 136d in a direction such that the height adjustment button 135 protrudes to a lesser extent (than when not urged by an operator) through the aperture 131b, and so that the projecting pins 136e of the second end portion 136b of the locking lever 136 are disengaged from within the slot 131a, and are located inside the nested portion 137b so as to be freely displaceable therealong.
  • the nested portions 137a, 137b can be translated relative to one another in an axial direction to extend or retract the telescoping connecting rod 138 as desired and within the limits defined by the lengths of the nested portions 137a, 137b as well as any accompanying housing 137c, 137d, connections and adjacent components.
  • an operator releases the pressure applied to the height adjustment button 135, allowing the compression spring 136f to urge the lever 136 to rotate about the pivot 136d in a direction such that the height adjustment button 135 protrudes to a greater extent (than when urged by an operator) through the aperture 131b, and so that the projecting pins 136e of the second end portion 136b of the locking lever 136 are engaged at least partially within one of the slots 131a.
  • Such translation and/or locking may alternatively or additionally be provided by electronic means, for example, using a worm drive arrangement, actuators such as pneumatic or hydraulic actuators for example, and/or any other suitable arrangement.
  • Such electronic operation may be by means of direct user input such as one or more buttons, dials, voice command, etc., and/or by indirect input from the user, for example, by sensing a force on the handle and/or on a torque sensor applied by a user which is indicative of the handle being too short or too long. For example, if the handle is too long, a downward force on the handle or first moment may be applied, and if the handle is too short, an upward force on the handle or second moment may be applied.
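  • As an illustrative sketch only, the indirect-input idea above (reading a sustained vertical force as "handle too long" or "handle too short") could be expressed as follows; the threshold and step size are assumptions:

```python
# Sketch of the indirect-input adjustment described above: a sustained vertical
# force on the handle drives an electric length adjustment of the telescopic rod.
# The threshold and step size are illustrative assumptions.

FORCE_THRESHOLD_N = 10.0   # ignore incidental loads below this (assumed)
STEP_M = 0.005             # adjustment increment per control cycle (assumed)

def length_adjustment(vertical_force_n: float) -> float:
    """vertical_force_n > 0 is downward (handle too long), < 0 upward (too short)."""
    if vertical_force_n > FORCE_THRESHOLD_N:
        return -STEP_M     # retract the telescopic rod
    if vertical_force_n < -FORCE_THRESHOLD_N:
        return +STEP_M     # extend the telescopic rod
    return 0.0             # within the dead band: leave the length unchanged

print(length_adjustment(15.0))   # -0.005: retract
print(length_adjustment(-12.0))  # +0.005: extend
```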
  • the articulated joint 139 is shown in more detail.
  • the articulated joint 139 at the second end 138b of the connecting rod 138 comprises a connecting rod portion 139a which is fixedly connected to the nested portion 137b and a main body portion 139b connected to a chassis of the main body portion 150 of the robot 100.
  • the connecting rod portion 139a and the main body portion 139b are pivotally connected by a pivot portion 139c.
  • the articulated joint can also comprise, as shown in this particular example, a locking member, such as an indexing plunger 139d, which can be arranged to lock or tighten the relative orientations of the connecting rod 138 and main body portion 150 in fixed pre-determined positions, such as a set position, or a position adjustable based on a height of a user, and/or can be released to allow folding, collapsing or otherwise providing a compact arrangement of the robot 100 when not in use.
  • the locking member can, for example, be manually adjusted, and/or can be controlled by a user via a control on the handle, for example electrically, such as by a switch, dial, button, touch pad or any other input device operated by a user.
  • the relative orientation of the connecting rod 138 and the main body portion 150 may be adjusted automatically based on biometric data of the user, for example, based on a height of the user.
  • the articulated joint 139 can, in addition or as an alternative to the torque sensor and torque feedback mechanisms described above, comprise one or more of a torque sensor and torque feedback mechanism as described. Further, the articulated joint can be arranged to adjust a relative orientation of the connecting rod 138 and main body portion 150 based on an angle of incline of a ground surface, sensed or recorded height of a user, length of the connecting rod 138, etc. One or more of the length of the telescopic connecting rod 138 and relative orientation of the connecting rod 138 and the main body portion 150, can be adjusted manually or automatically to account for the incline of the ground surface, velocity or acceleration of the robot, according to predetermined settings and/or input by the user. Such adjustments can be made, for example, based on data from an orientation sensor.
  • the sensing portion 160 can comprise any one or more of the sensors described below, and can additionally or alternatively comprise other sensors.
  • the sensing portion 160 comprises a laser range scanner 161, such as a light detection and ranging system (commonly referred to as "LIDAR"), capable of detecting in a 360° field of view.
  • the laser range scanner 161 is located in a portion of the main body 150 comprising a gap 153 between an upper housing portion 152 and lower housing portion 154, as shown in Fig. 6, for example. This allows the laser range scanner 161 to detect ranges in a broad angular field, including forward, out to the sides and, optionally, backwards, such as a 90°, 180°, or 270° angular field.
  • the sensing portion 160 comprises a GPS sensor 162, capable of detecting a location of the robot.
  • location data can be used, for example, to plan and follow routes, provide a "lost" alert capable of allowing an external service to locate the robot.
  • the GPS sensor 162 is located in the gap 153 between the upper housing portion 152 and the lower housing portion 154, as shown in Fig. 6, for example.
  • the sensing portion 160 comprises a plurality of cameras 163, which can provide depth and tracking sensors.
  • Such cameras can be used to detect objects in the environment, such as vehicles, pedestrians, changes in incline, changes in level such as steps or curbs, changes of ground texture, e.g., ice and smooth surfaces can be more reflective than rough surfaces, obstacles at ground level or other levels, etc.
  • the cameras can face different directions. For example, as shown in Figs. 5 and 7, there are four cameras 163, comprising an upward-facing camera 163a, two side-facing cameras 163b pointing forwards and out to each side, and a forward-facing camera 163c.
  • the upward-facing camera can be inclined by an angle γ to the horizontal, which can provide a field of view including low-hanging objects which can present an obstacle, such as low-hanging tree branches, for example, while the side-facing cameras 163b can detect wide and/or side obstacles, for example.
  • the angle γ can be between 10° and 80°, between 20° and 70°, between 30° and 60°, or between 40° and 50°, for example. In some examples, the angle γ can be 45°.
  • the forward and side-facing cameras 163b, 163c can cover an angular field of 90°, in the range 90°-180°, in the range 180°-270° or greater than 270°.
  • the cameras 163 can be stereo cameras. Alternatively or additionally, the cameras 163 can be color cameras. Stereo cameras enable simulation of human binocular vision and allow capture of three-dimensional images. Color cameras allow reproduction of a colored image, which may be easier for a user to interpret than a monochromatic image, in particular when it comes to differentiating different objects from one another.
  • the cameras 163 can be arranged to detect visible light, and/or can detect other frequency bands in the electromagnetic spectrum, for example, the cameras 163 can comprise infrared detectors.
  • the robot can further comprise a camera, range or depth sensor, and/or any other suitable arrangement for detecting an elevation of the ground, which can enable determination of unevenness, trip hazards, obstacles on the ground, holes, ditches, water such as puddles, floods, rivers, ground texture, etc.
  • the sensing portion comprises a plurality of ultrasonic sensors 164, which can comprise rangefinders.
  • In this example, there are seven ultrasonic sensors 164a-g, as shown in Figs. 5-8.
  • the ultrasonic sensors 164a-g are located at increasing relative angular orientations to a forward-aft direction FA of the robot 100.
  • sensor 164d is aligned with the forward-aft direction FA, facing forward, while the sensor 164c is orientated at an angle α clockwise; the sensor 164b is orientated at an angle 2α clockwise; the sensor 164e is orientated at an angle α anticlockwise; the sensor 164f is orientated at an angle 2α anticlockwise; the sensor 164a is orientated at an angle β clockwise; and the sensor 164g is orientated at an angle 2β anticlockwise.
  • this angular arrangement is merely provided as an example, and any angular arrangement may be provided without departing from the scope of the presently disclosed subject matter, mutatis mutandis.
  • the angular field covered by all of the sensors can be 90°, in the range 90°-180°, in the range 180°-270°, or greater than 270°.
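  • For illustration only, the fan-shaped sensor arrangement described above can be modeled as follows, converting each sensor's range reading into an obstacle position in the robot frame; the numeric values chosen for α and β are assumptions, as the patent leaves them open:

```python
import math

# Sketch of the fan-shaped ultrasonic layout described above. The angle values
# assigned to ALPHA and BETA are illustrative assumptions.

ALPHA, BETA = 15.0, 60.0   # degrees; hypothetical values for α and β above
SENSOR_BEARINGS = {        # positive = clockwise of the forward-aft axis FA
    "164a": +BETA, "164b": +2 * ALPHA, "164c": +ALPHA, "164d": 0.0,
    "164e": -ALPHA, "164f": -2 * ALPHA, "164g": -2 * BETA,
}

def obstacle_position(sensor: str, range_m: float) -> tuple[float, float]:
    """Return (x_forward, y_right) of a detected obstacle, in meters."""
    bearing = math.radians(SENSOR_BEARINGS[sensor])
    return range_m * math.cos(bearing), range_m * math.sin(bearing)

print(obstacle_position("164c", 1.5))  # obstacle 1.5 m away, 15° clockwise of FA
```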
  • sensors 165 at the front of the robot can be arranged to track and map the robot's environment.
  • the input from the sensors of the sensing portion 160, the user input via the handle 120 and the various torque sensors throughout, such as on the handle, articulated joints and even on the axles of the wheels, as described further below, can be used by the control portion 180 to control the output of the robot in terms of velocity, user feedback, e.g., haptic feedback, route planning, route alterations, stopping and starting, e.g., crossing roads, emergency stop, sending alerts, etc.
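  • A skeleton of such a fusion loop is sketched below; the input/output structure, thresholds and gains are illustrative assumptions about one possible design, not the patent's specified control law:

```python
# Skeleton of the fusion loop described above: sensor input, handle input and
# torque readings go in; velocity and haptic warnings come out. All values here
# are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Inputs:
    obstacle_distance_m: float   # nearest obstacle from LIDAR/ultrasonics
    handle_torque_nm: float      # from the handle/joint torque sensors
    user_command: str            # e.g. "go" or "stop" from buttons or voice

@dataclass
class Outputs:
    speed_m_s: float
    haptic_warning: bool

def control_step(inp: Inputs) -> Outputs:
    if inp.user_command == "stop" or inp.obstacle_distance_m < 0.5:
        return Outputs(speed_m_s=0.0, haptic_warning=True)   # emergency stop
    speed = 1.0 + 0.05 * inp.handle_torque_nm                # torque trims speed
    speed = min(max(speed, 0.3), 1.6)
    warn = inp.obstacle_distance_m < 1.5                     # early haptic warning
    return Outputs(speed_m_s=speed, haptic_warning=warn)

print(control_step(Inputs(2.0, 3.0, "go")))
```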
  • the control system 180 can comprise one or more integrated circuits, logic circuits, processors, and/or any other suitable arrangement.
  • the control system 180 can comprise a power source, such as a battery.
  • the control system 180 can comprise a transceiver in order to communicate with a support service, for example, in case of theft, when the call-for-help button is pressed, or to receive system updates, for example, updates to route planning software, GPS maps, etc.
  • the control system 180 can be arranged to control the ground-contacting portion 140 as described further below.
  • the ground-contacting portion comprises a plurality of wheels 142, although it will be appreciated that caterpillar tracks, casters, or any other suitable arrangement may alternatively or additionally be provided, mutatis mutandis.
  • In this example, there are provided six wheels 142a-142f, of which three wheels are provided on each side of the robot, with two front wheels 142a, 142f, two middle wheels 142b, 142e and two rear wheels 142c, 142d.
  • the housing of the main body portion 150 comprises a rear portion 156 arranged to house a portion of the two rear wheels 142c, 142d and a forward portion 157 to house a portion of the four front and middle wheels 142a, 142b, 142e and 142f, optionally acting as a mudguard to protect a user from splashing liquid and any dirt, mud, stones or grit, or liquids such as oil or water, which could be on the ground and could be thrown up by the wheels 142.
  • the wheels 142 can comprise all-terrain tires configured to provide grip on a range of surfaces, including on-road and off-road.
  • each wheel 142a-142f comprises its own dedicated motor 143a-143f.
  • the control system 180 can comprise one or more wheel controllers 190, such as the three wheel controllers 190a, 190b and 190c which control the front, middle and back wheels respectively.
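  • For illustration, a skid-steer style fan-out from a single body-velocity command to the six per-wheel motors via the three controllers might look as follows; the mixing law, track width, and the assignment of wheels to sides are assumptions, as the patent does not specify them:

```python
# Sketch of fanning out one drive command to six per-wheel motors through the
# three pairwise controllers described above. The skid-steer mixing and the
# left/right wheel assignment are illustrative assumptions.

WHEEL_PAIRS = {"190a": ("142a", "142f"),   # front controller: (left, right) assumed
               "190b": ("142b", "142e"),   # middle controller
               "190c": ("142c", "142d")}   # rear controller

def wheel_speeds(linear_m_s: float, yaw_rate_rad_s: float,
                 track_width_m: float = 0.5) -> dict[str, float]:
    """Skid-steer mix: per-wheel speeds from body velocity commands."""
    left = linear_m_s - yaw_rate_rad_s * track_width_m / 2.0
    right = linear_m_s + yaw_rate_rad_s * track_width_m / 2.0
    commands = {}
    for controller, (left_wheel, right_wheel) in WHEEL_PAIRS.items():
        commands[left_wheel] = left      # each wheel has its own motor 143x
        commands[right_wheel] = right
    return commands

print(wheel_speeds(1.0, 0.4))  # six wheel-speed commands for a gentle turn
```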
  • the front and middle wheels 142a, 142b and 142f, 142e on each side of the robot 100 each extend from a respective arm 144a, 144b, 144f, 144e, wherein each pair of arms 144a, 144b and 144f, 144e is connected at a right-angle about an articulating member 145 which is articulated to a connecting member 146 of the main-body portion 150 of the robot 100.
  • the arms 144b, 144e connected to the middle wheels 142b, 142e are shorter than the arms 144a, 144f connected to the front wheels 142a, 142f.
  • Each pair of wheels connected by the pair of arms to the articulating member 145 is configured to pivot so as to allow the front pair of wheels to be inclined so as to enable ascending and descending curbs, etc.
  • each of the arms 144a, 144b, 144e, 144f comprises a respective sliding contact member 147a, 147b, 147e, 147f arranged to slidingly contact a curved surface 146a of the connecting member 146.
  • the load transfer between the main body 150 of the robot 100 and the ground-contacting portion is not solely via the articulating member 145, but rather is transferred directly between the load-supporting members, i.e. the connecting member 146 and the arms 144a, 144b, 144e, 144f.
  • one or more of the wheels can additionally or alternatively be provided with a torque sensor of the type described above in relation to the articulated joints.
  • the robot 100 as described above enables a user who is a pedestrian, and who may additionally be visually impaired, elderly, or physically or mentally challenged, to be guided safely: avoiding obstacles, crossing roads, navigating desired routes, and reaching desired destinations easily and at a pace suitable for the user. In effect, the robot 100 acts as an extension of the user.
  • the robot 100 is provided with headlights 170 at a front portion thereof.
  • headlights can provide visibility at night, both for the benefit of the user and to improve the data sensed by the cameras, for example. Further, the provision of headlights 170 allows third parties, whether in vehicles or on foot, to be more aware of the robot and the user, and to see them more easily. This can be important, for example, when crossing roads.
  • the robot may comprise a dynamic center of mass (not illustrated), for example in the form of a translatable mass, to afford the robot better balance or a counterweight against a user pressing down on the handle, and/or so that the center of mass may be varied dynamically while the robot is in use (a worked counterweight example is given after this list).
  • a dynamic center of mass can aid the balance of the user and the robot, and thus provide greater stability than a static center of mass, which cannot be adapted in accordance with the user and/or the surroundings.
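The control fusion described above can be made concrete with a short sketch. The following Python fragment is illustrative only: the class names, field names and numeric thresholds (Inputs, Outputs, OBSTACLE_STOP_DISTANCE_M, etc.) are hypothetical assumptions, not taken from the application, and it shows only the velocity/haptic/stop arbitration of control portion 180, not route planning or alerting.

    from dataclasses import dataclass

    # Hypothetical thresholds; the application does not specify numeric values.
    OBSTACLE_STOP_DISTANCE_M = 0.5
    HANDLE_TORQUE_LIMIT_NM = 8.0

    @dataclass
    class Inputs:
        min_obstacle_distance_m: float  # from the sensing portion 160
        handle_torque_nm: float         # torque sensor on the handle 120
        user_speed_request: float       # 0.0..1.0 from the handle input device

    @dataclass
    class Outputs:
        velocity_mps: float
        haptic_alert: bool
        emergency_stop: bool

    def control_step(inp: Inputs, max_speed_mps: float = 1.4) -> Outputs:
        """One iteration of a control loop for control portion 180 (sketch)."""
        # Stop and alert if the user resists strongly on the handle...
        if abs(inp.handle_torque_nm) > HANDLE_TORQUE_LIMIT_NM:
            return Outputs(0.0, True, True)
        # ...or if an obstacle is closer than the stopping distance.
        if inp.min_obstacle_distance_m < OBSTACLE_STOP_DISTANCE_M:
            return Outputs(0.0, True, True)
        # Otherwise scale the commanded velocity to the user's requested pace.
        return Outputs(max_speed_mps * inp.user_speed_request, False, False)

In a full system this step would run at a fixed rate alongside route planning and alert generation; it is a sketch of the arbitration logic only.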
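The support-service link of the transceiver can likewise be sketched as a small message format. The message types and JSON framing below are assumptions for illustration; the application does not specify a wire format.

    import json
    import time
    from enum import Enum

    class MessageType(Enum):
        THEFT_ALERT = "theft_alert"     # e.g. robot moved without its user
        HELP_REQUEST = "help_request"   # call-for-help button pressed
        UPDATE_CHECK = "update_check"   # route-planning software, GPS maps

    def encode_message(msg_type: MessageType, robot_id: str) -> bytes:
        """Frame an outgoing support-service message for the transceiver (sketch)."""
        payload = {
            "type": msg_type.value,
            "robot_id": robot_id,
            "timestamp": time.time(),
        }
        return json.dumps(payload).encode("utf-8")

    # Example: pressing the help button triggers an outgoing frame.
    frame = encode_message(MessageType.HELP_REQUEST, robot_id="robot-100")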
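The pairing of the six wheels 142a-142f, their dedicated motors 143a-143f and the three wheel controllers 190a-190c can be represented as a simple table. A minimal sketch assuming a differential-drive split by side; the side assignment of the motors is an assumption, not stated in the application.

    # Each wheel controller drives one pair of motors: front, middle, rear.
    WHEEL_PAIRS = {
        "190a_front":  ("143a", "143f"),
        "190b_middle": ("143b", "143e"),
        "190c_rear":   ("143c", "143d"),
    }

    # Assumed: motors 143a-143c sit on one side of the robot.
    LEFT_MOTORS = {"143a", "143b", "143c"}

    def motor_commands(v_left_mps: float, v_right_mps: float) -> dict:
        """Distribute per-side velocity commands to all six motors (sketch)."""
        commands = {}
        for controller, pair in WHEEL_PAIRS.items():
            for motor in pair:
                commands[motor] = v_left_mps if motor in LEFT_MOTORS else v_right_mps
        return commands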
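The curb-climbing pivot about the articulating member 145 can be approximated with elementary trigonometry: if the front arm has effective length L and the front wheel must rise by curb height h, the extra pivot angle is roughly asin(h / L). The dimensions below are hypothetical examples, not taken from the application.

    import math

    def required_pivot_angle_deg(curb_height_m: float, front_arm_length_m: float) -> float:
        """Approximate extra pivot of the articulating member 145 needed for
        the front wheel to clear a curb of the given height (sketch)."""
        if curb_height_m > front_arm_length_m:
            raise ValueError("curb is higher than the arm can lift the wheel")
        return math.degrees(math.asin(curb_height_m / front_arm_length_m))

    # Example with assumed dimensions: a 0.12 m curb and a 0.35 m front arm
    # give a pivot of about 20 degrees.
    print(round(required_pivot_angle_deg(0.12, 0.35), 1))  # 20.1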
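Finally, the worked counterweight example for the dynamic center of mass: a static moment balance in which a downward handle force F applied at horizontal lever arm d is cancelled by translating a movable mass m through a distance x = F·d / (m·g). All numbers are assumptions for illustration.

    G_MPS2 = 9.81  # gravitational acceleration

    def counterweight_offset_m(handle_force_n: float,
                               handle_lever_arm_m: float,
                               movable_mass_kg: float) -> float:
        """Distance to translate the movable mass so that its moment about the
        wheelbase cancels the moment of the handle force (static sketch)."""
        handle_moment_nm = handle_force_n * handle_lever_arm_m
        return handle_moment_nm / (movable_mass_kg * G_MPS2)

    # Example: 40 N on the handle at a 0.3 m lever arm, with a 5 kg movable
    # mass, requires shifting the mass about 0.24 m.
    print(round(counterweight_offset_m(40.0, 0.3, 5.0), 2))  # 0.24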

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Automation & Control Theory (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a self-navigating guide robot comprising: at least one sensor configured to receive sensory input relating to the environment; a handle comprising at least one input device configured to receive sensory input from a user and at least one output device configured to communicate with a user; and a controller configured, in use, to navigate the robot on the basis of the sensory input from said at least one sensor and from the input device, and to communicate with a user via said at least one output device.
PCT/IL2021/050210 2020-02-27 2021-02-24 Self-navigating guide robot WO2021171291A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL272970A IL272970A (en) 2020-02-27 2020-02-27 A guiding robot that navigates independently
IL272970 2020-02-27

Publications (1)

Publication Number Publication Date
WO2021171291A1 true WO2021171291A1 (fr) 2021-09-02

Family

ID=77490770

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2021/050210 WO2021171291A1 (fr) 2020-02-27 2021-02-24 Self-navigating guide robot

Country Status (2)

Country Link
IL (1) IL272970A (fr)
WO (1) WO2021171291A1 (fr)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040015266A1 (en) * 2000-12-04 2004-01-22 Hans Skoog Robot system
US20110056018A1 (en) * 2005-10-07 2011-03-10 Patterson Richard A Patient lift and transfer device
US20130041507A1 (en) * 2010-07-30 2013-02-14 Toyota Motor Engineering & Manufacturing North America, Inc. Robotic cane devices
US20130332018A1 (en) * 2011-01-26 2013-12-12 Ji Hun Kim Road guidance system for visually impaired
US20170001656A1 (en) * 2015-07-02 2017-01-05 RT. WORKS Co., Ltd. Hand Cart
JP2017124125A * 2016-01-15 2017-07-20 ヨコキ株式会社 Wheelchair
US20170215666A1 (en) * 2005-12-02 2017-08-03 Irobot Corporation Modular robot
US20170326019A1 (en) * 2014-11-11 2017-11-16 Bow2Go Gmbh Mobile walking and transport aid device
JP2017222234A * 2016-06-14 2017-12-21 パイオニア株式会社 Assist cart
US20180088583A1 (en) * 2011-01-28 2018-03-29 Irobot Corporation Time-dependent navigation of telepresence robots
WO2019157511A1 * 2018-02-12 2019-08-15 Crosby Kelvin Robotic guide system for the visually impaired
US20190282433A1 (en) * 2016-10-14 2019-09-19 United States Government As Represented By The Department Of Veterans Affairs Sensor based clear path robot guide

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11886190B2 (en) 2020-12-23 2024-01-30 Panasonic Intellectual Property Management Co., Ltd. Method for controlling robot, robot, and recording medium
US11906966B2 (en) 2020-12-23 2024-02-20 Panasonic Intellectual Property Management Co., Ltd. Method for controlling robot, robot, and recording medium
US11960285B2 (en) 2020-12-23 2024-04-16 Panasonic Intellectual Property Management Co., Ltd. Method for controlling robot, robot, and recording medium
WO2023049976A1 * 2021-10-02 2023-04-06 N De Araujo Sellin Desenvolvimento De Sistemas Ltda Robot guide dog for assisting the movement of visually impaired or reduced-mobility persons, and method for mapping and sharing routes
CN114191268A (zh) * 2021-12-06 2022-03-18 中国矿业大学(北京) Intelligent guide behavior habit training method and device for a smart blind cane
CN114191269A (zh) * 2021-12-06 2022-03-18 中国矿业大学(北京) Intelligent reminder and guidance method and device in a smart blind cane

Also Published As

Publication number Publication date
IL272970A (en) 2021-08-31

Similar Documents

Publication Publication Date Title
WO2021171291A1 (fr) Self-navigating guide robot
US11716598B2 (en) Early boarding of passengers in autonomous vehicles
AU2018321472B2 (en) Context aware stopping for autonomous vehicles
JP3738694B2 (ja) Movable case
KR101049515B1 (ko) Road guidance system for the visually impaired
JP6387157B1 (ja) Autonomous vehicle and program for autonomous vehicle
US11983022B2 (en) Travel route creation system
JPWO2016199312A1 (ja) Autonomous movement system
KR20140128086A (ko) Variable-structure wheelchair robot for guiding persons with disabilities
US10752305B2 (en) Stair-climbing type driving device and climbing driving method
JP6638348B2 (ja) Mobile robot system
WO2017039546A1 (fr) System, method and apparatus for navigating one or more vehicles
KR20150027987A (ko) Apparatus and method for autonomous driving control of an electric wheelchair
CN109966068A (zh) Intelligent electric wheelchair for the elderly
Imadu et al. Walking guide interface mechanism and navigation system for the visually impaired
Hameed et al. Smart wheel chair
Langner Effort reduction and collision avoidance for powered wheelchairs: SCAD assistive mobility system
KR102511163B1 (ko) Method for detecting a drop in the ground behind an electric wheelchair and controlling the wheelchair
CN217118852U (zh) Intelligent guide robot for the blind
JP2001161756A (ja) Electric wheelchair for nursing care
CA3132143C (fr) Integrated systems for buses
WO2023276399A1 (fr) Electric mobility vehicle
US20210382498A1 (en) Mobility assist apparatus and method
KR102636313B1 (ko) Autonomous electric wheelchair
KR20230073496A (ko) Autonomous driving device capable of combining with an auxiliary driving body

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21760946

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21760946

Country of ref document: EP

Kind code of ref document: A1