EP3378461B1 - A smart guide device for visually impaired person - Google Patents

A smart guide device for visually impaired person

Info

Publication number
EP3378461B1
EP3378461B1 (application EP17162303.6A)
Authority
EP
European Patent Office
Prior art keywords
obstacle
height
guide device
hpl
laser beam
Prior art date
Legal status
Active
Application number
EP17162303.6A
Other languages
German (de)
French (fr)
Other versions
EP3378461A1 (en)
Inventor
Ahmet ÖZEL
Current Assignee
Vestel Elektronik Sanayi ve Ticaret AS
Original Assignee
Vestel Elektronik Sanayi ve Ticaret AS
Priority date
Filing date
Publication date
Application filed by Vestel Elektronik Sanayi ve Ticaret AS
Priority to EP17162303.6A
Priority to TR2017/04725A (TR201704725A2)
Publication of EP3378461A1
Application granted
Publication of EP3378461B1
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00: Appliances for aiding patients or disabled persons to walk about
    • A61H3/06: Walking aids for blind persons
    • A61H3/061: Walking aids for blind persons with electronic detecting or guiding means
    • A61H3/068: Sticks for blind persons


Description

  • This invention refers to a smart guide device for visually impaired person according to claim 1.
  • Background of the Invention
  • For a visually impaired person, walking alone without falling is a difficult task. It is hard for them to walk safely even with a stick. There are many devices available in the market. Such a stick can alert the person to an obstacle while walking, but it may not indicate whether the obstacle is a pit or a bump on the road. Moreover, these devices cannot give information about the dimensional features of the obstacle. That is, if the obstacle is a bump or any other object, the device is not capable of providing its height or depth.
  • In prior art KR20110078229A, a walking guide device for the blind is disclosed that comprises an ultrasonic sensor, which senses dynamic and non-dynamic obstacles using an ultrasonic signal, a sensor unit which is equipped with an RFID and acquires location guide information using an RFID signal, a vibrating motor which generates vibration if obstacles are sensed, a controller which controls the output of walk guiding information, and an output unit which outputs obstacle information by voice and indicates situation information by an LED.
  • In another prior art, CN103830071A, a walking stick provided with an infrared detector is shown. An infrared detection head is installed at the front end of the handle of the stick, and an internal circuit of the infrared detection head is connected to an alarm prompter on the handle. The handle is provided with a switch for connecting or disconnecting the infrared detection head. When the stick is used, the infrared detection head detects the condition of the road ahead in time, and the alarm prompter gives a sound prompt when an obstacle occurs, so that the blind can walk with the stick conveniently.
  • Document EP-A1-3088996 discloses a smart guide device for visually impaired person comprising all the technical features set out in the preamble of claim 1.
  • The prior art does not provide information to the visually impaired person about the height or the depth of an obstacle. Further, it may not provide the distance of the obstacle from the visually impaired person.
  • Object of the Invention
  • It is therefore the object of the present invention to provide a smart guide device for a visually impaired person to detect or measure the height or depth and distance of any obstacle on the walking direction.
  • Description of the Invention
  • The above-mentioned object is solved by providing a smart guide device [herein referred to as 'device'] for a visually impaired person to detect or measure the height or depth and the distance of any obstacle in the walking direction according to claim 1. According to an embodiment, the device for the visually impaired person comprises a laser source attached to the device for generating a laser beam, an ultrasonic sensor provided near the laser source for measuring a height of the laser source from the ground, and an output mechanism for providing an alert to the visually impaired person about any obstacle in the walking direction, characterized in that a gyroscope is attached to the laser source for maintaining an angle between the laser source and the ultrasonic sensor, a camera is provided for capturing an image of a refracted laser beam from the obstacle, an image processing unit is provided for processing the image captured by the camera, and an embedded system is configured to calculate a height or depth of the obstacle based on the image of the refracted laser beam from the obstacle, the height of the laser source, and the angle between the laser beam path and the ultrasonic sensor.
  • Further embodiments are subject-matter of the dependent claims and/or of the following specification parts.
  • According to an exemplary embodiment, the camera may be maintained with a fixed focal length and a fixed screen resolution. The image of the refracted laser beam can be captured for a reference object or the obstacle, according to claims 3-8. The image of the refracted laser beam includes a plurality of horizontal pixel lines (HPL), and the image processing unit is configured to measure the number of HPL between the laser line on the ground and the laser line on the reference object or the obstacle. The number of HPL can be transferred to the embedded system. The image processing unit is further configured to calculate one HPL height using the ratio between the number of HPL measured for the reference object and the height of the reference object, wherein the HPL height is stored in the embedded system. A vertical scaling factor is obtained from the height of the laser source from the ground and the angle between the laser beam path and the ultrasonic sensor.
  • According to another embodiment, the embedded system is further configured to calculate the height or depth of the obstacle based on one HPL height for the reference object, the number of HPL between the laser line on the ground and the laser line on the obstacle, and the vertical scaling factor, using the formula h = Lr × Ngo × V,
    wherein
    • h is the height or depth of the obstacle
    • Lr is one HPL height for the reference object;
    • Ngo is the number of HPL between laser line on the ground and laser line on the obstacle; and
    • V is the vertical scaling factor.
  • According to an exemplary embodiment, the embedded system is further configured to measure the distance of the obstacle in the walking direction from the device based on the height of the laser source from the ground and the angle between the laser beam path and the ultrasonic sensor. The obstacle can include, but is not limited to, a pit in the ground, a bump on the ground, and any object on the ground.
  • Further, the direction of the laser beam refraction can indicate whether the obstacle is a pit or a bump. The embedded system is further configured to calculate a depth of the obstacle in case the obstacle is a pit. The depth of the obstacle is determined based on the image of the refracted laser beam from the obstacle, the height of the laser source, and the angle between the laser beam path and the ultrasonic sensor.
  • Further benefits, objects, and features of the present invention will be described in the following specification of the attached figures, in which components of the invention are illustrated by way of example. Components of the devices and methods according to the invention which match at least essentially with respect to their function may be marked with the same reference sign; such components do not have to be marked or described in all figures.
  • The invention is described merely by way of example with respect to the attached figures in the following.
  • Brief Description of the Drawings
  • Fig. 1
    illustrates the block diagram of a smart guide device for visually impaired person according to the present invention;
    Fig. 2
    illustrates the graphical representation of function of the smart guide device for visually impaired person without obstacle, according to the present invention; and
    Fig. 3
    illustrates the graphical representation of function of the smart guide device for visually impaired person with obstacle, according to the present invention.
    Detailed Description of the Drawings
  • Fig. 1 illustrates the block diagram 100 of a smart guide device for visually impaired person, according to the present invention.
  • According to an exemplary embodiment, the smart guide device [herein referred to as "device"] for a visually impaired person comprises a laser source 101, an ultrasonic sensor 104, an output mechanism 108, a gyroscope 102, a camera 103, an image processing unit 105, an embedded system 106, and a power supply 107.
  • According to an embodiment, the laser source 101 is attached to the device and provided for generating a laser beam, wherein the generated laser beam is focused towards the walking path of the person. In an embodiment, the ultrasonic sensor 104 is provided near the laser source 101 for measuring a height of the laser source 101 from the ground. In an embodiment, the output mechanism 108 is capable of providing an alert to the visually impaired person about any obstacle 109 in the walking direction.
  • According to another embodiment, the gyroscope 102 is attached to the laser source 101 for maintaining an angle between the laser source 101 and the ultrasonic sensor 104. In an embodiment, the camera 103 is provided for capturing an image of a refracted laser beam from the obstacle 109. The image processing unit 105 is provided for processing the image captured by the camera 103.
  • According to a further embodiment, the embedded system 106 is configured to calculate a height or depth of the obstacle 109 based on the image of the refracted laser beam, the height of the laser source 101, and the angle between the laser beam path and the ultrasonic sensor 104. The power supply 107 is provided for supplying power to the device. In an embodiment, the output mechanism 108 may be a speaker-like device, a vibrator, or a combination of both to provide the alert to the visually impaired person.
  • Fig. 2 illustrates the graphical representation 200 of function of the device for visually impaired person without obstacle, according to the present invention.
  • According to an exemplary embodiment, the function of the device starts with the transmission of a laser beam from the laser source 101. When the laser beam touches the ground 202, the laser beam may be refracted due to a disturbance in the straight path 204. The camera 103 is configured to capture the image 207 of the refracted laser beam 206. The camera 103 is maintained with a fixed focal length and a fixed screen resolution; that is, the view angle 203 of the camera 103 is kept at a constant value. The angle 'α' 201 between the laser beam path 204 and the ultrasonic sensor 104 is maintained constant by keeping the laser source 101 on the gyroscope 102. The gyroscope 102 moves the laser source 101 so as to keep the angle 'α' 201 constant. Using trigonometry, the distance 204a of the laser beam path 204 from the source to the ground 202 can be calculated from the angle 'α' 201 between the laser beam path 204 and the ultrasonic sensor 104 and the height 'h' 205 of the laser source 101 from the ground 202.
  • That is, d = h / cos α
  • From the above equation, a vertical scaling factor 'V' can be calculated. That is, the vertical scaling factor is obtained from the height 205 of the laser source 101 from ground 202 and the angle 201 between the laser beam path 204 and the ultrasonic sensor 104.
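The geometry above can be sketched numerically. This is a minimal Python example, assuming the angle α is measured between the laser beam path and the vertical axis along the ultrasonic sensor; the function name and the example values are illustrative, not from the patent:

```python
import math

def laser_path_distance(h: float, alpha_deg: float) -> float:
    """Distance d along the laser beam path from the source to the ground,
    d = h / cos(alpha), where h is the height of the laser source above the
    ground and alpha is the beam angle measured from the vertical."""
    return h / math.cos(math.radians(alpha_deg))

# Example: source 1.0 m above the ground, beam tilted 60 degrees from vertical.
print(round(laser_path_distance(1.0, 60.0), 3))  # 2.0
```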
  • According to an exemplary embodiment, the image 207 of the refracted laser beam 206 that is captured by the camera 103 is processed to find the height or depth of the obstacle 109. The height or depth of the obstacle 109 is calculated by comparing the laser beam refraction for a reference object with that for the obstacle 109. The reference object may be an object with a known height or depth. The image 207 of the refracted laser beam 206 includes a plurality of horizontal pixel lines (HPL). The number of HPL between the laser line on the ground 202 and the laser line on the reference object is measured using the image processing unit 105 and transferred to the embedded system 106. From that, one HPL height from the ground 202 is calculated from the number of HPL and the height of the reference object. That is, Hrhpl = Hr / Nrhpl, where Nrhpl is the number of HPL, Hr is the height or depth of the reference object, and Hrhpl is one HPL height from the ground 202. The Hrhpl value is stored in the embedded system 106 for reference.
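The calibration step against a reference object can be expressed as a short sketch; the function name and the example values are illustrative, not taken from the patent:

```python
def one_hpl_height(reference_height: float, n_hpl_reference: int) -> float:
    """H_rhpl = H_r / N_rhpl: the real-world height covered by one horizontal
    pixel line (HPL), calibrated from a reference object of known height that
    spans a measured number of HPL in the camera image."""
    return reference_height / n_hpl_reference

# Example: a 0.30 m reference object spanning 150 HPL in the image.
print(round(one_hpl_height(0.30, 150), 6))  # 0.002
```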
  • Fig. 3 illustrates the graphical representation 300 of function of the device for visually impaired person with obstacle, according to the present invention.
  • According to an exemplary embodiment, the image 207 of the refracted laser beam 206 is processed for the obstacle 109. The number of HPL between laser line on the ground 202 and laser line on the obstacle 109 is measured using the image processing unit 105. The number of HPL between laser line on the ground 202 and laser line on the obstacle 109 is transferred to the embedded system 106. The embedded system 106 can be further configured for calculating the height or depth of the obstacle 109 based on one HPL height for the reference object, the number of HPL between laser line on the ground 202 and laser line on the obstacle 109, and the vertical scaling factor.
  • According to a preferred embodiment, the height or depth of the obstacle 109 is calculated using the formula ho = Hrhpl × Nohpl × V,
    wherein
    • ho is the height or depth of the obstacle 109;
    • Hrhpl is one HPL height for the reference object;
    • Nohpl is the number of HPL between laser line on the ground 202 and laser line on the obstacle 109; and
    • V is the vertical scaling factor.
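The formula can be evaluated directly as a worked example; the numeric values below are illustrative assumptions, not figures from the patent:

```python
def obstacle_height(h_rhpl: float, n_ohpl: int, v: float) -> float:
    """h_o = H_rhpl * N_ohpl * V: obstacle height (or pit depth) from the
    calibrated per-HPL height, the HPL count between the ground laser line
    and the laser line on the obstacle, and the vertical scaling factor V."""
    return h_rhpl * n_ohpl * v

# Example: 0.002 m per HPL, 60 HPL measured, scaling factor 1.25.
print(round(obstacle_height(0.002, 60, 1.25), 3))  # 0.15
```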
  • According to an exemplary embodiment, the embedded system 106 is further configured to measure the distance of the obstacle 109 in the walking direction from the device based on the height 'h' 205 of the laser source 101 from the ground 202 and the angle 'α' 201 between the laser beam path 204 and the ultrasonic sensor 104. The obstacle 109 includes a pit in the road, a bump on the road, and any object on the ground 202. The direction of the laser beam refraction provides information about whether the obstacle 109 is a pit or a bump, since for a bump and a pit the directions of the refracted laser beam 206 are opposite to each other. If the obstacle 109 is a pit, the embedded system 106 can be further configured to calculate the depth of the obstacle 109. The depth of the obstacle 109 is determined based on the image 207 of the refracted laser beam 206 from the obstacle 109, the height 205 of the laser source 101, and the angle 201 between the laser beam path 204 and the ultrasonic sensor 104. The method of calculating the depth of the obstacle 109 is similar to the calculation of the height of the obstacle 109. Further, the device can be worn on the body of the person, for example on a belt. Similarly, the device can also be incorporated in accessories such as a stick, a guide, an umbrella, and so on.
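The pit/bump discrimination and the distance measurement can be sketched as follows. The sign convention for the laser-line shift and the use of the tangent for the horizontal ground distance are assumptions for illustration, since the patent does not state them explicitly:

```python
import math

def classify_obstacle(line_shift_px: int) -> str:
    """Pit vs. bump from the direction of the laser-line shift in the image.
    Assumed convention: an upward shift (positive) indicates a bump, a
    downward shift (negative) indicates a pit."""
    if line_shift_px > 0:
        return "bump"
    if line_shift_px < 0:
        return "pit"
    return "flat"

def ground_distance(h: float, alpha_deg: float) -> float:
    """Horizontal distance from the device to the point where the beam meets
    the ground, d_g = h * tan(alpha), with alpha measured from the vertical."""
    return h * math.tan(math.radians(alpha_deg))

print(classify_obstacle(8))                  # bump
print(round(ground_distance(1.0, 45.0), 3))  # 1.0
```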
  • Thus, the present invention refers to a smart guide device for visually impaired person, wherein the device includes a laser source 101, an ultrasonic sensor 104 provided near the laser source 101, an output mechanism 108 for providing alert to the visually impaired person about any obstacle 109, a gyroscope 102 attached to the laser source 101, a camera 103 for capturing an image 207 of a refracted laser beam 206 from the obstacle 109, an image processing unit 105 for processing the image captured by the camera 103, an embedded system 106 configured to calculate a height of the obstacle 109 based on the image 207 of the refracted laser beam, the height 205 of the laser source 101 and the angle 201 between a laser beam path 204 and the ultrasonic sensor 104.
  • List of reference numbers
  • 100
    block diagram for the device
    101
    laser source
    102
    a gyroscope
    103
    a camera
    104
    an ultrasonic sensor
    105
    an image processing unit
    106
    an embedded system
    107
    power supply
    108
    an output mechanism
    109
    an obstacle
    200
    graphical representation of function of the device without obstacle
    201
    the angle between a laser beam path and the ultrasonic sensor
    202
    ground line
    203
    view angle of a camera
    204
    laser beam path
    204a
    distance of the laser beam path from source to ground (d)
    205
    height of the laser source from the ground (h)
    206
    refracted laser beam
    207
    image of the refracted laser beam
    300
    graphical representation of function of the device with the obstacle

Claims (14)

  1. A smart guide device for visually impaired person, comprising:
    a laser source (101) attached to the device for generating laser beam;
    an ultrasonic sensor (104) provided near the laser source (101) for measuring a height (205) of the laser source (101) from the ground (202);
    an output mechanism (108) for providing alert to the visually impaired person about any obstacle (109) on a walking direction;
    characterized in that
    a gyroscope (102) attached to the laser source (101) for maintaining an angle between the laser source (101) and the ultrasonic sensor (104);
    a camera (103) for capturing an image (207) of a refracted laser beam (206) from the obstacle (109);
    an image processing unit (105) for processing the image captured by the camera (103);
    an embedded system (106) configured to calculate a height or depth of the obstacle (109) based on the image (207) of the refracted laser beam, the height (205) of the laser source (101) and the angle (201) between a laser beam path (204) and the ultrasonic sensor (104).
  2. The smart guide device of claim 1, wherein the camera (103) is maintained with a fixed focal length and a fixed screen resolution.
  3. The smart guide device of claim 1, wherein the image (207) of the refracted laser beam (206) is captured for a reference object or the obstacle (109).
  4. The smart guide device of claim 3, wherein the image (207) of the refracted laser beam (206) includes a plurality of horizontal pixel lines (HPL); and wherein the image processing unit (105) is configured to measure the number of HPL between the laser line on the ground (202) and the laser line on the reference object or the obstacle (109), and to transfer the number of HPL to the embedded system (106).
  5. The smart guide device of claim 4, wherein the image processing unit (105) is further configured to calculate one HPL height using the ratio between the number of HPL measured for the reference object and the height or depth of the reference object; wherein the HPL height is stored in the embedded system (106).
  6. The smart guide device of claim 5, wherein a vertical scaling factor is obtained from the height (205) of the laser source (101) from the ground (202) and the angle (201) between the laser beam path (204) and the ultrasonic sensor (104).
  7. The smart guide device of claim 6, wherein the embedded system (106) is further configured to calculate the height or depth of the obstacle (109) based on the one HPL height for the reference object, the number of HPL between the laser line on the ground (202) and the laser line on the obstacle (109), and the vertical scaling factor.
  8. The smart guide device of claim 7, wherein the embedded system (106) is further configured to calculate the height or depth of the obstacle (109) using the formula

    h_o = H_rhpl × N_ohpl × V

    wherein
    h_o is the height or depth of the obstacle (109);
    H_rhpl is the one HPL height for the reference object;
    N_ohpl is the number of HPL between the laser line on the ground (202) and the laser line on the obstacle (109); and
    V is the vertical scaling factor.
  9. The smart guide device of claim 1, wherein the embedded system (106) is further configured to measure a distance of the obstacle (109) in the walking direction from the device, based on the height (205) of the laser source (101) from the ground (202) and the angle (201) between the laser beam path (204) and the ultrasonic sensor (104).
  10. The smart guide device of claim 1, wherein the obstacle (109) includes a pit in a road, a bump on the road, or any object on the ground (202).
  11. The smart guide device of claim 1, wherein the direction of refraction of the laser beam provides information about whether the obstacle (109) is a pit or a bump.
  12. The smart guide device of claim 1, wherein the output mechanism (108) is a speaker-like device, a vibrator, or a combination of both, for providing the alert to the visually impaired person.
  13. The smart guide device of claim 1, wherein the device is wearable on the body of the person.
  14. The smart guide device of claim 1, wherein the device is incorporated into accessories including a stick, a guide, an umbrella, or a shoe.
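The calculation chain of claims 5 and 8 can be sketched in a few lines. This is an illustrative reading, not the patent's implementation: the function names are invented, the interpretation that "one HPL height" is the reference object's known height divided by the HPL count it spans is an assumption, and the vertical scaling factor V is taken as a given input because the claims do not specify its exact derivation from the source height and beam angle.

```python
def hpl_height(n_rhpl: int, reference_height_m: float) -> float:
    """Claim 5 (as interpreted here): one-HPL height from the reference-object
    calibration, i.e. the reference object's known height divided by the
    number of horizontal pixel lines (HPL) it spans in the image."""
    return reference_height_m / n_rhpl

def obstacle_height(h_rhpl: float, n_ohpl: int, v_scale: float) -> float:
    """Claim 8 formula: h_o = H_rhpl * N_ohpl * V, where N_ohpl is the HPL
    count between the ground laser line and the line on the obstacle, and
    V is the vertical scaling factor of claim 6."""
    return h_rhpl * n_ohpl * v_scale
```

For example, if a 0.50 m reference object spans 100 HPL, one HPL corresponds to 5 mm; an obstacle spanning 60 HPL at unit scaling would then be reported as roughly 0.30 m high.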
EP17162303.6A 2017-03-22 2017-03-22 A smart guide device for visually impaired person Active EP3378461B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17162303.6A EP3378461B1 (en) 2017-03-22 2017-03-22 A smart guide device for visually impaired person
TR2017/04725A TR201704725A2 (en) 2017-03-22 2017-03-29 A SMART GUIDE DEVICE FOR VISUAL IMPAIRED PEOPLE

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP17162303.6A EP3378461B1 (en) 2017-03-22 2017-03-22 A smart guide device for visually impaired person

Publications (2)

Publication Number Publication Date
EP3378461A1 EP3378461A1 (en) 2018-09-26
EP3378461B1 true EP3378461B1 (en) 2019-10-23

Family

ID=58412897

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17162303.6A Active EP3378461B1 (en) 2017-03-22 2017-03-22 A smart guide device for visually impaired person

Country Status (2)

Country Link
EP (1) EP3378461B1 (en)
TR (1) TR201704725A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109875860A (en) * 2019-03-09 2019-06-14 华北水利水电大学 A kind of intelligent blind-guiding instrument
SE2150896A1 (en) * 2021-07-07 2022-07-07 Visoont Ab A device and a method for guiding a visually impaired person

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101105238B1 (en) 2009-12-30 2012-01-13 한국기술교육대학교 산학협력단 Apparatus and control method for walking guide for the blind
TWI538668B (en) * 2012-03-27 2016-06-21 鴻海精密工業股份有限公司 Electronic guide device
CN103830071A (en) 2012-11-21 2014-06-04 老河口市第四中学 Walking stick provided with infrared detector
CN103584982B (en) * 2013-11-05 2016-05-25 广东欧珀移动通信有限公司 Mobile communication equipment blind man's stick air navigation aid and mobile communication equipment
KR20160028891A (en) * 2014-09-04 2016-03-14 김경연 An objection recognition device for a blind person using a depth camera and a direction-detecting sensor
US20160321880A1 (en) * 2015-04-28 2016-11-03 Immersion Corporation Systems And Methods For Tactile Guidance
CN204814723U (en) * 2015-07-16 2015-12-02 深圳前海达闼科技有限公司 Lead blind system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
TR201704725A2 (en) 2018-10-22
EP3378461A1 (en) 2018-09-26

Similar Documents

Publication Publication Date Title
EP2629737B1 (en) White cane with integrated electronic travel aid using 3d tof sensor
EP1293184B1 (en) Walking auxiliary for person with dysopia
US9801778B2 (en) System and method for alerting visually impaired users of nearby objects
KR101898582B1 (en) A stick for the blind
US10304250B2 (en) Danger avoidance support program
EP3153146A1 (en) Rehabilitation assistance system
US20120092460A1 (en) System And Method For Alerting Visually Impaired Users Of Nearby Objects
JP2010533308A5 (en)
KR100759056B1 (en) A system for guiding an obstacle avoidance direction including senses for supersonic waves
EP3378461B1 (en) A smart guide device for visually impaired person
KR101893374B1 (en) A stick for the blind
CN107049718B (en) Obstacle avoidance device
AU2019292458A1 (en) Display control system, display control device, and display control method
KR101715472B1 (en) Smart walking assistance device for the blind and Smart walking assistance system using the same
KR20160028891A (en) An objection recognition device for a blind person using a depth camera and a direction-detecting sensor
Okayasu Newly developed walking apparatus for identification of obstructions by visually impaired people
KR20180097962A (en) Situation determining guide apparatus and method based image analysis
KR20130004646U (en) A safety supervision system of a blind end in a tunnel
KR101878263B1 (en) Walking Stick for Visually Impaired Person
KR20190104663A (en) Apparatus for measuring body size using drone
KR101777203B1 (en) Walking assistance system for blind person
JP2021163401A (en) Person detection system, person detection program, leaned model generation program and learned model
KR101605551B1 (en) Apparatus of detecting object and control method thereof
KR20190133887A (en) Wearable aids for the visually impaired
KR20220087624A (en) Navigating apparatus for blind

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190320

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20190516

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017007954

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1192845

Country of ref document: AT

Kind code of ref document: T

Effective date: 20191115

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20191023

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200123

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200124

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200123

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200224

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200224

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017007954

Country of ref document: DE

PG2D Information on lapse in contracting state deleted

Ref country code: IS

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200223

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1192845

Country of ref document: AT

Kind code of ref document: T

Effective date: 20191023

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

26N No opposition filed

Effective date: 20200724

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20200331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200322

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200331

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200322

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200331

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191023

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240320

Year of fee payment: 8

Ref country code: GB

Payment date: 20240320

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: TR

Payment date: 20240320

Year of fee payment: 8