GB2610630A - Obstacle detection apparatus - Google Patents

Obstacle detection apparatus

Info

Publication number
GB2610630A
GB2610630A GB2113023.2A GB202113023A
Authority
GB
United Kingdom
Prior art keywords
wheelchair
obstacle
obstacles
control unit
detection apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2113023.2A
Other versions
GB202113023D0 (en)
Inventor
Bantounos Kostantinos
Marland Jamie
Underwood Ian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Duchenne Uk
Original Assignee
Duchenne Uk
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Duchenne Uk filed Critical Duchenne Uk
Priority to GB2113023.2A priority Critical patent/GB2610630A/en
Publication of GB202113023D0 publication Critical patent/GB202113023D0/en
Priority to PCT/GB2022/052285 priority patent/WO2023037114A1/en
Publication of GB2610630A publication Critical patent/GB2610630A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61GTRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61GTRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven
    • A61G5/041Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven having a specific drive-type
    • A61G5/042Front wheel drive
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61GTRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/06Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs with obstacle mounting facilities, e.g. for climbing stairs, kerbs or steps
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L3/00Electric devices on electrically-propelled vehicles for safety purposes; Monitoring operating variables, e.g. speed, deceleration or energy consumption
    • B60L3/0007Measures or means for preventing or attenuating collisions
    • B60L3/0015Prevention of collisions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61GTRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00General characteristics of devices
    • A61G2203/10General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A61G2203/14Joysticks
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61GTRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00General characteristics of devices
    • A61G2203/10General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A61G2203/16Touchpads
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61GTRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00General characteristics of devices
    • A61G2203/70General characteristics of devices with special adaptations, e.g. for safety or comfort
    • A61G2203/72General characteristics of devices with special adaptations, e.g. for safety or comfort for collision prevention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/107Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2200/00Type of vehicle
    • B60Y2200/80Other vehicles not covered by groups B60Y2200/10 - B60Y2200/60
    • B60Y2200/84Wheelchairs
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data

Abstract

An obstacle detection apparatus for a wheelchair has a depth camera 50 to capture images and position information for objects in the images. The position information describes estimated positions of the objects relative to the camera 50. A control unit is configured to receive an image and associated position information from the camera 50 and identify one or more obstacles 360, such as kerbs, in the image. The control unit calculates, from the image and the position information, the distance Bd of each obstacle 360 from the wheelchair and each obstacle 360's height Bh. Based on the height Bh and distance Bd, the control unit determines whether any of the obstacles 360 are impassable to the wheelchair. If an impassable obstacle 360 is detected, the control unit triggers a pre-emptive action such as an audio or haptic alert. The control unit may consider only a particular region of interest when detecting obstacles 360, and may consider only the tallest or nearest obstacle 360 within this region.

Description

OBSTACLE DETECTION APPARATUS
Background
An issue commonly faced by wheelchair users is that of navigating around obstacles that the wheelchair is not able to pass over or through. An example of this can be found when mounting a pavement (also referred to as a sidewalk) from a road. In some places a dropped kerb (or curb) is provided to make it easier for wheelchair users to join the pavement. However, in locations where such dropped kerbs are not provided, the user may be faced with a full-height kerb which the wheelchair may or may not be able to mount.
Of particular concern to many wheelchair users is not knowing whether they will be able to scale or otherwise drive past/through a particular obstacle before they have attempted to pass that obstacle. For the example of climbing a kerb, the user may not know whether they will be able to climb the kerb in the wheelchair before they have tried to drive the wheelchair up the kerb.
Faced with this uncertainty, the user can either avoid attempting to pass the obstacle at all, which may be inconvenient as another route may need to be found to the user's destination, or the user may attempt to drive past the obstacle anyway. This can be dangerous as there is a risk that the wheelchair could topple (over-turn) and/or be damaged, and the user could be hurt or find themselves stuck in a toppled wheelchair, possibly unable to right themselves. Even if the wheelchair is able to climb the kerb or pass the obstacle, the user may nonetheless be hurt by the impact.
The inventors have therefore developed an arrangement for a wheelchair which is able to identify potential obstacles, determine whether the obstacles are passable to the wheelchair and, if not, trigger a pre-emptive action (e.g. to warn the user and/or prevent the user attempting to drive over the obstacle).
Summary of the Invention
Aspects of the invention are set out in the accompanying claims.
Viewed from a first aspect, there is provided an obstacle detection apparatus for a wheelchair comprising: a depth camera to capture images and position information for objects in the images, the position information indicative of estimated positions of the objects relative to the depth camera; and a control unit configured to: receive an image and associated position information from the depth camera; identify one or more obstacles in the image and for each of the obstacles, calculate from the image and the associated position information, a height of the obstacle and a distance of the obstacle from the wheelchair; determine, based on the height and the distance for each of the obstacles, whether any of the obstacles are impassable to the wheelchair; and in response to determining that an obstacle is impassable to the wheelchair, trigger a pre-emptive action.
To capture images for performing obstacle detection in the obstacle detection apparatus, a depth camera is provided. The depth camera captures images as well as position information for objects in the images. The position information is indicative of estimated positions of the objects relative to the depth camera. In some examples, the depth camera is a stereo depth camera having two or more optical image sensors. Based on comparing the output of the two or more optical image sensors, the depth camera may determine the distance of the objects in the image from the camera. Additionally, the depth camera may have an infra-red emitter to emit infra-red light having a known and possibly varying pattern. Based on observing the received pattern of infra-red light reflected from objects in the image, and comparing the patterns of light received between the two sensors, the depth camera may calculate the location of objects in the image.
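By way of illustration, the following is a minimal sketch of how images and per-pixel depth might be read from such a depth camera, assuming an Intel RealSense-style device accessed through the pyrealsense2 Python library (one of the example devices mentioned later in this description); the stream resolutions and frame rate are illustrative assumptions rather than requirements of the apparatus.

    # Sketch: acquiring a 2D image and per-pixel depth from a stereo depth camera.
    # Assumes an Intel RealSense-style device and the pyrealsense2 library.
    import numpy as np
    import pyrealsense2 as rs

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)   # depth stream
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)  # 2D image stream
    pipeline.start(config)
    try:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        color_frame = frames.get_color_frame()
        # 2D image as an array, plus a depth value in metres for every pixel.
        image = np.asanyarray(color_frame.get_data())
        depth_m = np.asanyarray(depth_frame.get_data()) * depth_frame.get_units()
    finally:
        pipeline.stop()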
In contrast to approaches in which ultra-sound is used to detect and measure the distance to objects, by using a depth camera as described herein, the obstacle detection apparatus is able to measure the height and width of obstacles rather than just a distance to an obstacle. This is useful not only in identifying possible obstacles themselves, but also for determining whether an obstacle will be passable to the wheelchair. Therefore, in contrast to systems making use of ultra-sound, the present apparatus is able to determine more accurately whether an obstacle will be an issue to a user, and so reduce the chance of the user attempting to climb obstacles for which the wheelchair is not suited, or of the user avoiding obstacles that could have been climbed by the wheelchair. Moreover, the use of a depth camera as described herein can provide a faster response speed, allowing an indication of whether an obstacle is passable or not to be produced more quickly and/or more obstacles to be assessed. Further, in contrast to systems in which ultra-sound sensors are used, depth cameras are less affected by dust and water, both of which may be encountered when driving a wheelchair. The use of a depth camera therefore renders the obstacle detection apparatus more resilient.
The obstacle detection apparatus is also provided with a control unit coupled to the depth camera. The control unit may comprise processing circuitry such as a microprocessor or microcontroller and is arranged to perform control functionality for the obstacle detection apparatus. The control unit is configured to receive from the depth camera an image and the associated position information for the objects in the image. The control unit then identifies one or more obstacles in the image. The identified obstacles may be all of the objects in the image or the control unit may perform an initial filtering process so that only some of the objects in the image are characterised as obstacles. For example, the control unit may have a region of interest defined such that only objects lying within the region of interest are considered obstacles and all other objects are ignored. In some examples, the control unit identifies only a tallest object in the image (or in the region of interest) and a nearest object as obstacles. This can reduce the processing load on the control unit, whilst a sensibly positioned region of interest will mean that only objects likely to lie on the wheelchair's path are designated obstacles and thus further considered.
For each of the obstacles in the image, the control unit calculates a height of the obstacle and a distance of the obstacle from the wheelchair. This calculation may be performed based on a mathematical relationship between the distance of the obstacle from the wheelchair and the obstacle's position and extent in the image, and the height and distance of the obstacle. Based on the calculated height and distance, the control unit is configured to determine whether any of the obstacles will be impassable to the wheelchair.
This may for example be done by comparing the height of the obstacle to a threshold height such that if the obstacle is above the threshold height, the obstacle is deemed impassable. In some examples, a similar comparison is done using the distance of the obstacle from the wheelchair and a threshold distance such that if the obstacle is beyond the threshold distance, the likelihood that the obstacle will prevent the passage of the wheelchair on its current course is deemed to be low and the obstacle deemed passable. The threshold or thresholds may be preset for the wheelchair. For example, the threshold height may be set to match the standard kerb height in a particular country (e.g., 10cm (4 inches) in the UK or 17cm in the US). However, in some examples, the control unit is configured to calculate the threshold height and/or threshold distance itself. In such examples, the control unit may be provided with dimensions of the wheelchair such as the ground clearance and/or wheel diameter and may calculate the maximum height of an obstacle that would be passable to the wheelchair. For example, the control unit may determine the threshold height of an obstacle to be a third of the wheel diameter, where a third of the diameter of the wheel represents an estimated maximum height of obstacle that can be climbed by a wheel.
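The decision logic described above (a region-of-interest filter, optional selection of only the tallest and nearest obstacles, and height and distance thresholds, with the height threshold optionally derived from the wheel diameter) can be expressed compactly. The following Python sketch is illustrative only; the field names and the example threshold values are assumptions, whereas the one-third-of-wheel-diameter rule is taken from the preceding paragraph.

    # Sketch: deciding whether any detected obstacle is impassable to the wheelchair.
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        height_m: float    # height of the obstacle above the ground
        distance_m: float  # distance of the obstacle from the wheelchair
        in_roi: bool       # whether the obstacle lies within the region of interest

    def height_threshold(wheel_diameter_m):
        # A third of the wheel diameter as an estimated maximum climbable height.
        return wheel_diameter_m / 3.0

    def find_impassable(obstacles, wheel_diameter_m=0.30, threshold_distance_m=2.0):
        candidates = [o for o in obstacles if o.in_roi]
        if not candidates:
            return None
        # Only the tallest and the nearest obstacles in the region of interest are
        # assessed; checking the same obstacle twice (if it is both) is harmless.
        tallest = max(candidates, key=lambda o: o.height_m)
        nearest = min(candidates, key=lambda o: o.distance_m)
        limit = height_threshold(wheel_diameter_m)
        for obstacle in (tallest, nearest):
            if obstacle.height_m > limit and obstacle.distance_m < threshold_distance_m:
                return obstacle  # impassable; the caller triggers the pre-emptive action
        return None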
In some examples, to assess whether an obstacle is impassable to the wheelchair, the control unit is configured to identify one or more points between the obstacle and the wheelchair and to determine the height of those points relative to the wheelchair based on the images and position information from the depth camera. Based on the height(s) of the one or more points, the control unit may calculate an incline of a surface leading to the obstacle.
Thus, the determination as to whether the obstacle is impassable to the wheelchair may be made on the basis of the incline. That is, for a sufficiently shallow slope, it may be determined that the wheelchair is able to get past/over the obstacle even though the obstacle may otherwise have been too high relative to the wheelchair had the incline not been present. On the other hand, the control unit may determine that the incline is too steep, or that the obstacle itself is too high even with an incline such that the obstacle is not passable to the wheelchair.
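As a concrete illustration of taking an approach incline into account, the sketch below estimates a slope from sampled ground heights between the wheelchair and the obstacle and folds it into the passability decision; the maximum-incline figure and the idea of subtracting the ramp rise from the obstacle height are simplifying assumptions made for illustration, not a defined algorithm of the apparatus.

    # Sketch: estimating the incline of the surface leading up to an obstacle
    # and using it in the passability decision.
    import math

    def approach_incline_deg(points):
        # points: list of (horizontal_distance_m, height_m) samples between the
        # wheelchair and the obstacle, heights relative to the wheelchair's ground level.
        if len(points) < 2:
            return 0.0
        points = sorted(points)
        run = points[-1][0] - points[0][0]
        rise = points[-1][1] - points[0][1]
        if run <= 0:
            return 0.0
        return math.degrees(math.atan2(rise, run))

    def passable_with_incline(obstacle_height_m, ramp_rise_m, incline_deg,
                              height_limit_m=0.10, max_incline_deg=8.0):
        # The slope itself must be climbable, and whatever step remains at the top
        # of the slope must still be within the height threshold.
        if incline_deg > max_incline_deg:
            return False
        remaining_step_m = max(0.0, obstacle_height_m - ramp_rise_m)
        return remaining_step_m <= height_limit_m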
If the control unit determines that any of the obstacles are impassable to the wheelchair, the control unit may trigger a pre-emptive action. The pre-emptive action may involve issuing an alert to the user of the wheelchair to let the user know that an obstacle has been detected that is impassable to the wheelchair. The user can then avoid driving the wheelchair over the obstacle, thereby avoiding risking tipping the wheelchair, injuring the user and/or damaging the wheelchair. The alert may take the form of a notification at a mobile device of the user where the control unit is in communication with such a mobile device. Additionally, or alternatively, the obstacle detection apparatus may comprise an output element by which the alert can be issued to the user. Thus, the need for a connected mobile device in order to issue the alert is avoided. Further, the output element may be more suited for attracting the user's attention. For example, the output element may be a speaker or a haptic feedback element (or both may be provided) to issue an audible alert or a vibration respectively.
In some examples, as well as or instead of issuing the alert, the control unit is configured to stop the wheelchair. This control unit may be coupled to a driving control apparatus of the wheelchair such that by sending an indication to the driving control apparatus, the control unit is able to cause the driving control apparatus to stop the wheelchair. Hence, the control unit may be able to prevent the user attempting to drive over an obstacle that has been determined to be impassable to the wheelchair.
Whilst the height and distance of the obstacles in the images from the depth camera may be calculated directly from the images with their position information, the control unit may be configured to first correct for a distortion in the position information arising due to the positioning of the depth camera relative to the wheelchair. In such examples, the control unit may store positioning information indicative of a camera height and tilt angle. Calculating the height and distance of the obstacles may therefore involve correcting for this camera height and tilt angle so that an accurate measurement of the obstacle can be made from which to determine whether the obstacle is passable to the wheelchair.
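One way to perform this correction is a simple transform from camera coordinates into ground-relative coordinates using the stored mount height and tilt angle. The sketch below assumes the camera reports, for each point, a forward distance along its optical axis and an offset along its own "up" axis, and that the mounting offset is a pure downward pitch; both are simplifying assumptions made for illustration.

    # Sketch: correcting a camera-frame measurement for the camera's mount height and tilt.
    import math

    def to_ground_frame(forward_m, up_m, camera_height_m, tilt_down_deg):
        # forward_m: distance along the camera's optical axis to the point.
        # up_m: offset of the point along the camera's own 'up' axis.
        # Returns (horizontal_distance_m, height_above_ground_m) relative to the wheelchair.
        t = math.radians(tilt_down_deg)
        horizontal_m = forward_m * math.cos(t) + up_m * math.sin(t)
        height_m = camera_height_m - forward_m * math.sin(t) + up_m * math.cos(t)
        return horizontal_m, height_m

    # Example: camera mounted 0.35 m above the ground and tilted 15 degrees downwards.
    distance_m, height_m = to_ground_frame(forward_m=1.2, up_m=-0.05,
                                           camera_height_m=0.35, tilt_down_deg=15.0)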
The control unit of the obstacle detection apparatus may also be responsive to detecting that an obstacle is impassable to the wheelchair to determine a location (e.g., using GPS or other satellite navigation system) of the wheelchair and store an indication of the impassable obstacle in association with that location. In this way, the obstacle detection apparatus can build up a map of impassable obstacles and their positions. The control unit may be configured to calculate routing information for the wheelchair to help a user to navigate. This information about impassable obstacles and their positions may therefore be used as a factor in determining the route that the control unit advises the user to take when providing such navigation information. Specifically, the control unit may provide the user with a route that avoids the obstacles identified as impassable.
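A minimal sketch of recording impassable obstacles against a location, so that route calculation can later avoid them, might look as follows; the JSON file format and field names are assumptions, and the coordinates are taken to come from whichever satellite navigation source is available (for example the connected user device).

    # Sketch: storing impassable obstacles keyed by location for later route planning.
    import json
    import time
    from pathlib import Path

    OBSTACLE_LOG = Path("impassable_obstacles.json")  # hypothetical file name

    def record_impassable(latitude, longitude, height_m):
        entries = json.loads(OBSTACLE_LOG.read_text()) if OBSTACLE_LOG.exists() else []
        entries.append({
            "lat": latitude,
            "lon": longitude,
            "height_m": height_m,
            "timestamp": time.time(),
        })
        OBSTACLE_LOG.write_text(json.dumps(entries, indent=2))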
The obstacle detection apparatus may be arranged to detect potholes. Potholes are a particular hazard to wheelchairs since driving into a pothole can lead to damaging the wheelchair, getting stuck and/or injuring the user. By identifying potholes using the images and position information from the depth camera and identifying potholes above a threshold size as an obstacle that is impassable to the wheelchair, the pre-emptive action can be triggered to avoid the user driving over the pothole. Additionally, or alternatively, the location of the pothole could be recorded for use in routing by the user to avoid the user driving over the pothole in future. In some examples, on detecting a pothole, the control unit stores the image of the pothole captured by the depth camera along with location data (e.g., determined by a GPS receiver on the user's mobile phone). This could be shared with other wheelchair users to aid those users in avoiding the pothole and/or reported to the local authority responsible for fixing the potholes.
Viewed from another aspect of an invention described herein, there is provided a wheelchair comprising the obstacle detection apparatus described above.
Viewed from yet another aspect, there is provided a method of detecting obstacles impassable to a wheelchair, the method comprising: capturing images and position information for objects in the images, the position information indicative of estimated positions of the objects relative to the depth camera; identifying one or more obstacles in an image; calculating, for each of the obstacles and from an image and associated position information, a height of the obstacle and a distance of the obstacle from the wheelchair; determining, based on the height and the distance for each of the obstacles, whether any of the obstacles are impassable to the wheelchair; and in response to determining that an obstacle is impassable to the wheelchair, triggering a pre-emptive action.
Figures
Aspects of the invention will now be described, by way of example only, with reference to the accompanying figures in which:
Figure 1 shows a wheelchair of the type with which one example of the invention may be used;
Figure 2A shows a schematic of selected components of the wheelchair of Figure 1;
Figure 2B shows a schematic of selected components of a user device that may be used with the wheelchair of Figure 1;
Figure 3 shows a worked example of the obstacle detection process performed by the obstacle detection apparatus according to an example;
Figure 4 shows a depth camera of an obstacle detection apparatus according to one example;
Figure 5 shows a worked example of a navigation process performed by the obstacle detection apparatus according to an example; and
Figure 6 is a flowchart illustrating a method of performing obstacle detection according to an example.
Any reference to prior art documents in this specification is not to be considered an admission that such prior art is widely known or forms part of the common general knowledge in the field. As used in this specification, the words "comprises", "comprising", and similar words, are not to be interpreted in an exclusive or exhaustive sense. In other words, they are intended to mean "including, but not limited to". The invention is further described with reference to the following examples. It will be appreciated that the invention as claimed is not intended to be limited in any way by these examples. It will also be recognised that the invention covers not only individual embodiments but also combinations of the embodiments described herein.
The various embodiments described herein are presented only to assist in understanding and teaching the claimed features. These embodiments are provided as a representative sample of embodiments only, and are not exhaustive and/or exclusive. It is to be understood that advantages, embodiments, examples, functions, features, structures, and/or other aspects described herein are not to be considered limitations on the scope of the invention as defined by the claims or limitations on equivalents to the claims, and that other embodiments may be utilised and modifications may be made without departing from the spirit and scope of the claimed invention. Various embodiments of the invention may suitably comprise, consist of, or consist essentially of, appropriate combinations of the disclosed elements, components, features, parts, steps, means, etc, other than those specifically described herein. In addition, this disclosure may include other inventions not presently claimed, but which may be claimed in future.
In the present application, the words "configured to..." are used to mean that an element of an apparatus has a configuration able to carry out the defined operation. In this context, a "configuration" means an arrangement or manner of interconnection of hardware or software. For example, the apparatus may have dedicated hardware which provides the defined operation, or a processor or other processing device may be programmed to perform the function. "Configured to" does not imply that the apparatus element needs to be changed in any way in order to provide the defined operation.
Detailed Description
Figure 1 shows a wheelchair 2 of the type with which one example of the invention may be used. In this example, the wheelchair 2 is a motorised wheelchair driven by a motor housed within the base 10 that is arranged to drive the wheelchair 2 based on inputs from a user at a gesture controller 40 and/or a secondary controller 70. However, it will be appreciated that the present techniques are also applicable in other forms of wheelchairs such as manual wheelchairs which are driven by a user turning the wheels directly or by another person pushing the wheelchair, or to mobility scooters. As shown in Figure 1, the wheelchair 2 has two front wheels 22 which are driven by the motor and can be operated independently by a control unit such that the wheelchair 2 is able to drive in a straight direction, turn left or right and drive at different speeds. To support the ability of the wheelchair 2 to make a turn in a small amount of space, the wheelchair 2 has rear omniwheels 24. The omni-wheels 24 are able to rotate about an axle running through the centre of the wheel (as is the case for a normal wheel) to enable motion in a forward/backward direction, and also comprise rollers mounted around the circumference of the wheel to enable smooth motion of the wheel in a direction parallel to the axle. This allows the wheelchair 2 to turn with a small turning circle which may be desirable when operating the wheelchair 2 in cramped environments.
Mounted on top of the base 10 is a seat 30 having a seat base 32 on which a user can sit and a seat back 34 to support the user's back when sitting. The seat base 32 and seat back 34 may be fixedly mounted to one another or may be pivotally mounted such that the angle formed between the seat base 32 and the seat back 34 can be altered to provide a comfortable sitting position for the user. For further support when sitting, the seat 30 comprises a headrest 36 to support the user's head when sitting in the wheelchair.
The seat 30 also has armrests 38 on which the user can rest his or her arms and footrests 28 extending from the seat base 32. The footrests 28, armrests 38 and headrest 36 are adjustable so that the position and angle of these components relative to the seat base 32 and seat back 34 can be altered and/or these components removed entirely.
Located at the end of one of the armrests 38 is a gesture controller 40. The gesture controller 40 allows the user to control the motion of the wheelchair 2. As shown in Figure 1, the gesture controller 40 is seated in a mandrel which enables the use of the gesture controller 40 as a joystick whereby pivoting of the gesture controller 40 in the mandrel is used to indicate a direction and speed in which the wheelchair 2 is to travel. The gesture controller 40 may also be removed from the mandrel whereupon control of the wheelchair 2 can be performed by moving the gesture controller 40 freely in three-dimensional space unconstrained by the attachment to the mandrel. This may provide a more comfortable and intuitive method of controlling the wheelchair 2, particularly where the user has limited range of motion or level of motor control, for example.
To display the detected input from the gesture controller 40 and any other information that may be useful for the user, a display 60 is provided at the end of one of the armrests 38. The display 60 is mounted at an angle relative to the armrest such that the display 60 is oriented for easy viewing by a user sitting in the seat 30. The detected input from the gesture controller 40 (e.g., a direction and speed in which the wheelchair 2 is to travel based on the manipulation of the gesture controller 40) is displayed on the display 60. The display 60 may also display other information such as a battery level for the wheelchair 2, navigation information and/or notifications from a connected user device (e.g. mobile phone).
Situated at the end of the other armrest 38 is a phone holder 70. The phone holder 70 is arranged to securely hold the user's phone. Further, the phone holder 70 is pivotally mounted such that the angle of the phone can be adjusted. This allows the user to position the mobile phone at an angle that allows for comfortable viewing.
In some examples, a secondary controller (not pictured) is also provided. This controller may be used to control functions of the wheelchair other than the movement as controlled by the gesture controller 40. For example, the seat 30 may be mounted such that the seat 30 can be elevated vertically from the base 10 by means of an extendable riser post. This would allow the user to adjust the seat height so as to position themselves at the eye line of a person with whom they were talking or to have a better view (e.g., over a counter in a shop). The secondary controller 70 may therefore be used to control the vertical motion of the seat 30. Similarly, the secondary controller 70 may be used to adjust the positioning of the armrests 38, footrests 28 and/or the headrest 36. In the example of Figure 1, such a secondary controller is not provided and this functionality is controlled via display 60.
As is also illustrated in Figure 1, a depth camera 50 is situated in the base 10 of the wheelchair 2. The depth camera 50 is configured to capture images as well as position information for the objects in those images. These can then be used to aid in navigation, detection of obstacles and the identification of obstacles that may be impassable to wheelchair 2 (e.g., kerbs above a height scalable by the wheelchair 2). Based on detection of such impassable obstacles, a pre-emptive action to warn the user and/or prevent the user from attempting to drive over the obstacle may be issued.
Figures 2A and 2B show schematics of selected components of the wheelchair 2 and of a user device that may be used with the wheelchair 2. It will be appreciated that not all features of the wheelchair 2 and the user device are included in these schematics and that these figures are used to explain certain features and their interactions in accordance with an illustrative example. Other examples may contain more or fewer components and the relationship between the components may differ from that shown in Figures 2A and 2B.
As shown in Figure 2A, the wheelchair 2 comprises a central control unit 200 that coordinates the functions of the wheelchair 2. The control unit 200 operates in conjunction with a driving control unit 210 that handles the control of the wheelchair's motion and calculates and issues control signals (e.g., to one or more motors) to cause the wheelchair 2 to move. The wheelchair 2 also has a seat control unit 230 that performs seat monitoring in order to maintain user comfort and/or reduce the risk of the user developing pressure sores by virtue of their position on the seat for an extended period of time. There is further an obstacle detection control unit 250 coupled to the control unit 200 which makes use of a depth camera 50 to identify obstacles and determine whether the obstacles are passable to the wheelchair 2 as discussed in more detail below. The control unit 200 is also coupled to a gesture controller control unit 260 which interprets the inputs from the gesture controller 40 to produce control signals that can be used to control the wheelchair 2.
The control unit 200 provides an interface by which the control units 210, 230, 250, 260 as well as the other components of the wheelchair 2 can interact. For example, as the gesture controller control unit 260 interprets the gestures and determines control signals for driving the wheelchair, the control unit 200 can pass these control signals to the driving control unit 210 whereupon the driving control unit 210 can control the wheelchair 2 to move in accordance with the gestures performed by the user. Similarly, if the obstacle detection control unit 250 detects an obstacle that is deemed to be impassable to the wheelchair 2 (e.g., a kerb that is too high for the wheelchair 2 to mount), the control unit 200 may receive this information and control the driving control unit 210 to stop the wheelchair 2. It will be appreciated that Figure 2A illustrates one possible arrangement of control logic within the wheelchair and that other arrangements could be used, for example, providing a single control unit to handle all the control functionality of the wheelchair.
The seat control unit 230 is coupled to a plurality of seat sensors including a temperature sensor 232, at least one pressure sensor 234 and a humidity sensor 236. The seat control unit 230 is responsive to measurements from these sensors to determine whether a discomfort condition is satisfied. The discomfort condition is set so as to be indicative of a set of temperature, pressure and humidity values at which the user is likely to be at risk of developing a pressure sore. Additionally, or alternatively, the discomfort condition may be set such that the condition is satisfied when a user is expected to begin feeling discomfort, and so by anticipating this discomfort (or equivalently the onset of pressure sores), the seat control unit 230 can take a pre-emptive action to prevent discomfort/pressure sores.
As shown in Figure 2A, the seat control unit 230 is connected to a fan 238, a heater 240, and an actuator 242 which is able to control a surface profile of the seat. Making use of these components, the seat control unit 230 is able to respond pre-emptively when the discomfort condition is satisfied to adjust the temperature at the seat surface (using the fan 238 or heater 240 as appropriate) or to adjust the user's sitting position using actuator 242. It will be appreciated that the seat monitoring apparatus may be provided with more, fewer, or different components, e.g., in some examples the seat comprises actuator 242 but does not have the fan 238 or heater 240.
As another example of a pre-emptive action that may be taken, and which may also be taken in response to detection of an impassable obstacle by the obstacle detection control unit 250, the wheelchair 2 may issue an audible alert via speaker 278 or issue haptic feedback via haptic feedback element 276, which may be located, for example, in the armrest 38, gesture controller 40, or the seat base 32. The wheelchair 2 further comprises a communication element 274 by which the control unit 200 can communicate with a user device. For example, the communication element 274 may be a BluetoothTM communication element by which the wheelchair 2 can communicate with a corresponding BluetoothTM communication element 282 of the user device. Although the example of BluetoothTM is provided, it will be appreciated that other forms of suitable communication could be used such as other personal area networks (e.g., ZigbeeTM, Wireless USB, WiFiTM or Near-Field Communication (NFC)). In some examples, wired communication between physical ports of the user device and the wheelchair 2 is used, employing a physical connection technology such as USBTM, a serial port, or FireWireTM.
The wheelchair 2 also comprises storage 272 to store data used by the wheelchair's control systems. The storage may for example be flash storage (e.g., a solid state drive or a memory card), a hard-disk drive, read-only memory (ROM), erasable programmable ROM (EPROM), and/or electrically erasable programmable ROM (EEPROM). The storage 272 may be used to store a database associating gestures of the gesture controller 40 with movements of the wheelchair, personal discomfort conditions set for the user, and/or information for use by the obstacle detection control unit such as a positioning of the depth camera 50 relative to the wheelchair 2.
Turning now to Figure 2B, there is illustrated a schematic of selected components of a user device that may be used with the wheelchair of Figures 1 and 2A. A user device such as a mobile phone or tablet computer may be used to extend the functionality of the wheelchair 2 and/or provide a more convenient means of interacting with the wheelchair 2.
The user device comprises a processor 280 to perform processing operations. The processor 280 is coupled to a communication element 282 which can communicate with the communication element 274 of the wheelchair 2. As discussed above, a number of possible communications protocols could be used and the form of the communication elements provided will be selected in accordance with the communication protocol by which the user device and the wheelchair 2 are to communicate.
The user device also comprises a speaker 284 which may be used to issue an alert to the user, for example, when the seat control unit 230 determines that the discomfort condition is satisfied or when the obstacle detection control unit 250 identifies an obstacle impassable to the wheelchair 2. The user device is also provided with storage 286, which as for the storage of the wheelchair 2 may for example be flash storage (e.g., a solid state drive or a memory card), a hard-disk drive, read-only memory (ROM), erasable programmable ROM (EPROM), and/or electrically erasable programmable ROM (EEPROM). The storage 286 may be used to store user settings such as gesture control profiles or a personalised discomfort condition which can be uploaded to the wheelchair 2.
The user device (e.g., a mobile phone) also comprises a location determining element 288 (such as a GPS receiver element) which can be used to determine the location of the user device and by association the wheelchair 2 and its user. The location may also be communicated via the communication element 282 (e.g., using Bluetooth) to the wheelchair 2. In this way, the user device can be used to provide location information for use in navigation of the wheelchair. Such navigation may take into account obstacles identified by the obstacle detection control unit 250 as impassable (and other impassable obstacles identified for example by other wheelchairs), to direct the user to follow a route suitable for the wheelchair 2. In some examples, the wheelchair 2 itself is provided with its own location determining element and is able to perform navigation without needing to be connected to a user device in order to provide the location determining functionality.
Figure 3 shows a worked example of the obstacle detection process performed by the obstacle detection apparatus according to an example. As shown in Figure 3, there is a depth camera 50 coupled to a control unit 250 (not shown in Figure 3) and positioned on a wheelchair 2 (also not shown in Figure 3). The depth camera 50 is able to capture images as with a conventional camera; however, the depth camera 50 is also able to determine position information for objects in the image. For example, the depth camera 50 may provide to the control unit 250 an image captured by an optical sensor of the depth camera 50 along with position measurements for each of a plurality of pixels in the image. The position measurements may identify the position using any suitable coordinate system; however, in the example of Figure 3, the position measurements indicate a straight line distance to the object (d), a horizontal offset (w) from a reference direction relative to the depth camera and a vertical offset from a reference direction (h).
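Where the per-pixel position information is delivered as a depth value plus camera intrinsics, as is typical for stereo depth cameras, measurements of this (d, w, h) form can be recovered by deprojecting a pixel into a three-dimensional point. The sketch below again assumes a pyrealsense2-style API; with a different camera the equivalent deprojection call of that camera's SDK would be used, and the sign conventions shown are assumptions for illustration.

    # Sketch: recovering (d, w, h)-style position measurements for a single pixel.
    import math
    import pyrealsense2 as rs

    def position_for_pixel(depth_frame, px, py):
        intrinsics = depth_frame.profile.as_video_stream_profile().get_intrinsics()
        depth_m = depth_frame.get_distance(px, py)
        # x: right of the optical axis, y: below it, z: forward along it (all in metres).
        x, y, z = rs.rs2_deproject_pixel_to_point(intrinsics, [px, py], depth_m)
        d = math.sqrt(x * x + y * y + z * z)  # straight-line distance to the object
        w = x                                 # horizontal offset from the reference direction
        h = -y                                # vertical offset (positive upwards)
        return d, w, h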
Having received the image and the associated position information from the depth camera 50, the control unit 250 identifies one or more obstacles in the image. In some examples, all of the objects in the image are considered as obstacles by the control unit 250. However, in some examples, an initial filter may be performed to identify, as a subset of the objects in the image, one or more obstacles. This may involve identifying only obstacles closer than a certain distance or only obstacles within a region of interest.
This process is depicted in Figure 3, in which the depicted example makes use of a region of interest shown by the dashed box. Thus, although the depth camera 50 captures position information for point C, for example, which lies on tree 380, this object (tree 380) lies outside of the region of interest and is not further considered. This reduces the processing load on the control unit 250, since not every object in the image needs to be assessed to determine whether it is, or could be, an impassable obstacle; only objects within the region of interest, where obstacles are likely to impede the motion of the wheelchair 2, are considered further.
Thus, the kerb 360 corresponding to point B in the image captured by the depth camera and the rock 370 on which point A is situated are identified as obstacles for further consideration by the control unit 250. The height and distance of the identified obstacles 360, 370 from the wheelchair 2 are then calculated using the position information from the depth camera 50. This can be seen in Figure 3, for which a height Bh of the kerb 360 and distance Bd of the kerb from the depth camera 50 are calculated. Similarly, a height Ah and distance Ad for the rock 370 are calculated.
In performing the calculation, the control unit 250 corrects for a positioning of the depth camera 50 relative to the wheelchair 2. The control unit 250 has access to a stored indication of how the depth camera 50 is positioned (e.g., the height at which the depth camera 50 is mounted and an angle relative to the horizontal at which the depth camera 50 is tilted). Based on this information, the control unit 250 is able to take account of the positioning of the depth camera 50 when calculating the height and distance information in order to determine a height and distance relative to the wheelchair 2 itself. This information may be more useful for making an assessment as to whether the wheelchair 2 will be able to get past/over the obstacle.
Having calculated the height and distance from the wheelchair 2 of each of the obstacles, the control unit 250 assesses whether any of the obstacles are impassable to the wheelchair 2. This determination may be based on a range of possible factors, but in the present example, the control unit 250 is arranged to consider only a closest obstacle and a tallest obstacle as the most likely obstacles to be impassable. The control unit 250 compares a height of the obstacles with a threshold height. If the height exceeds the threshold, then the control unit 250 is configured to determine that the obstacle is impassable. The control unit 250 may also compare the distance of the obstacle with a threshold distance such that the obstacle is only found to be impassable if it is above the threshold height and closer than the threshold distance.
In this example, the threshold height is set to be 10cm (4 inches) to coincide with a standard height of a kerb in the UK. Thus, a standard sized kerb will be considered passable to the wheelchair 2 whereas a larger kerb, or another obstacle greater than the height of a standard kerb will be considered impassable. In this way, the obstacle detection apparatus can warn the user/take steps to protect a user before attempting to climb a kerb or other obstacle that the wheelchair 2 is not able to mount. The threshold height may thus be set based on the standard height of the kerbs in the country in which the wheelchair is used (if the wheelchair 2 is suitable for climbing such kerbs).
In some examples, the obstacle detection apparatus is arranged to calculate the threshold height of obstacles that the wheelchair 2 can climb. This calculation could be based on dimensions of the wheelchair 2 stored in the storage 272. For example, the control unit 250 may calculate the threshold height as being a third of the height of the wheels of the wheelchair 2. In this way, the detection of obstacles can be tailored to the characteristics of the wheelchair being used and a more accurate determination as to whether the obstacle is passable or not obtained.
Returning to the example of Figure 3, the rock 370 is within the region of interest and so is identified as an obstacle. The height Ah of the rock 370 is less than the threshold height used by the control unit 250 and so the control unit 250 determines that the rock 370 is passable to the wheelchair 2. However, in the example of Figure 3, the kerb 360 is higher than a standard kerb and above the threshold height. Therefore, the control unit 250 determines that the kerb 360 is impassable to the wheelchair 2.
As such, the control unit 250 is configured to trigger a pre-emptive action. In this example, the pre-emptive action involves issuing an audible alert to the user via speaker 278 and issuing a notification to a user device using the communication elements 274, 282 of the wheelchair 2 and the user device. The user is therefore warned of an obstacle that is deemed to be impassable and can respond appropriately. Hence, the risk of the user attempting to climb a kerb (or otherwise mount an obstacle) that the wheelchair 2 is not able to climb can be reduced.
Figure 4 shows a depth camera 50 of an obstacle detection apparatus according to one example. The depth camera comprises an optical sensor 460 to capture a two-dimensional (2D) image (much like a standard camera). The depth camera 50 also comprises an infra-red (IR) emitter 450 to emit infra-red light. This IR light will be reflected from objects in the image and can be detected using the two offset sensors 442 and 444.
By comparing the pattern of IR light received at the optical sensors 442, 444 with the known pattern of emitted IR light, and by observing the difference in the IR light detected by the two sensors 442, 444, the depth camera 50 is able to infer information about the depth of objects in the image. The 2D image is then augmented with the depth information to produce a combined image with associated position information.
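The underlying relationship exploited by such a stereo pair is that depth is inversely proportional to the disparity between where the same feature appears in the two sensors. A minimal illustration follows, with the focal length and sensor baseline given as assumed example values rather than parameters of any particular camera.

    # Sketch: depth from stereo disparity, Z = f * B / disparity.
    def depth_from_disparity(disparity_px, focal_length_px=640.0, baseline_m=0.05):
        if disparity_px <= 0:
            return float("inf")  # no measurable disparity: effectively at infinity
        return focal_length_px * baseline_m / disparity_px

    # With these example parameters, a feature shifted 16 pixels between the two
    # sensors is roughly 2 m away.
    print(depth_from_disparity(16.0))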
A number of possible depth cameras could be used for this purpose; however, in some examples the depth camera is a RealSenseTM Depth Camera D345 from IntelTM or a KinectTM from MicrosoftTM. In some examples a light detection and ranging (LIDAR) or radio detection and ranging (RADAR) based camera is used.
Figure 5 shows a worked example of a navigation process performed by the obstacle detection apparatus according to an example. In this example, the obstacle detection apparatus is responsive to detecting an obstacle impassable to the wheelchair 2 to store an indication of the obstacle (e.g., in storage 272). Thus, routing information can be provided to the user for navigation with the wheelchair 2, taking into account the indications of impassable obstacles.
In this example, all of the obstacles are kerbs although it will be appreciated that these techniques are applicable to other forms of obstacle as well. Shown in Figure 5 is a map with several interconnected roads and a start point S and finish point F marked. Crossing points 502, 504, 506, 508 are marked on the map and correspond to information stored by the obstacle detection apparatus (e.g., based on having passed those points previously).
Crossing points 502, 504 are crossing points involving obstacles previously determined to be impassable. This may be due to the road having kerbs higher than can be mounted by the wheelchair 2 at that point. On the other hand, crossing points 506, 508 are known to be passable to the wheelchair 2. Therefore, in response to a request for navigation from the starting point S to the finish point F, a route is determined, avoiding the impassable crossing points 502 and 504 and instead making use of the crossing points 506, 508.
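The route selection illustrated here can be treated as a shortest-path search over a graph of path segments in which segments passing through a crossing point known to be impassable are removed. The graph below is a toy example whose node names simply mirror the reference numerals of Figure 5; the edge costs are arbitrary illustrative values.

    # Sketch: shortest path over a crossing graph, skipping impassable crossings.
    import heapq

    def shortest_route(edges, start, finish, impassable):
        # edges: list of (node_a, node_b, cost); impassable: set of blocked crossing points.
        graph = {}
        for a, b, cost in edges:
            if a in impassable or b in impassable:
                continue  # never route through a crossing known to be impassable
            graph.setdefault(a, []).append((b, cost))
            graph.setdefault(b, []).append((a, cost))
        queue, visited = [(0.0, start, [start])], set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == finish:
                return path, cost
            if node in visited:
                continue
            visited.add(node)
            for neighbour, edge_cost in graph.get(node, []):
                if neighbour not in visited:
                    heapq.heappush(queue, (cost + edge_cost, neighbour, path + [neighbour]))
        return None, float("inf")

    edges = [("S", "502", 1), ("S", "506", 2), ("502", "F", 1),
             ("504", "F", 1), ("506", "508", 1), ("508", "F", 1)]
    route, cost = shortest_route(edges, "S", "F", impassable={"502", "504"})
    # route -> ["S", "506", "508", "F"], avoiding the impassable crossings 502 and 504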
Figure 6 is a flowchart illustrating a method of performing obstacle detection according to an example. At step 622, the depth camera 50 captures images and position information for the objects in the images. These captured images and position information are then passed to the control unit 250, whereupon one or more obstacles in the image are identified at step 624. As discussed above, the obstacles may correspond to all of the objects in the image or may be a subset of those objects (e.g., those within a region of interest). For the identified obstacles, a height and distance is calculated at step 626 using the position information. This may also involve correcting for a positioning of the depth camera 50 where such positioning is likely to cause distortion of the image and/or the position information. On the basis of the height/distance information, a determination as to whether each obstacle is impassable is made at step 628. If the obstacle is passable by the wheelchair then the method returns to step 622 for further images to be captured and processed. However, if the obstacle is found to be impassable, a pre-emptive action is triggered at step 630, to alert the user before they attempt to drive over the obstacle or to prevent them from attempting to do so.
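Putting the steps of Figure 6 together, the overall detection loop might be organised as below. The helper names correspond to the hypothetical sketches given earlier in this description (capture, obstacle identification, height and distance calculation, passability check) and are not a defined interface of the apparatus.

    # Sketch: overall obstacle detection loop corresponding to the flowchart of Figure 6.
    def obstacle_detection_loop(depth_camera, control_unit):
        while True:
            image, positions = depth_camera.capture()                            # step 622
            obstacles = control_unit.identify_obstacles(image, positions)        # step 624
            measured = [control_unit.height_and_distance(o) for o in obstacles]  # step 626
            impassable = control_unit.find_impassable(measured)                  # step 628
            if impassable is not None:
                control_unit.trigger_preemptive_action(impassable)               # step 630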
Thus, there has been described an obstacle detection apparatus which is able to identify and assess obstacles in the vicinity of a wheelchair to determine whether the wheelchair will be able to drive over/past those obstacles. If it is determined that the wheelchair is not able to pass the obstacle, an appropriate action can be taken to warn the user and/or prevent the user attempting to continue over/towards the obstacle.

Claims (16)

  1. An obstacle detection apparatus for a wheelchair comprising: a depth camera to capture images and position information for objects in the images, the position information indicative of estimated positions of the objects relative to the depth camera; and a control unit configured to: receive an image and associated position information from the depth camera; identify one or more obstacles in the image and for each of the obstacles, calculate from the image and the associated position information, a height of the obstacle and a distance of the obstacle from the wheelchair; determine, based on the height and the distance for each of the obstacles, whether any of the obstacles are impassable to the wheelchair; and in response to determining that an obstacle is impassable to the wheelchair, trigger a pre-emptive action.
  2. The obstacle detection apparatus according to claim 1, wherein: the pre-emptive action comprises issuing an alert to a user of the wheelchair.
  3. The obstacle detection apparatus according to claim 2, wherein: the obstacle detection apparatus comprises a speaker and issuing the alert comprises issuing an audio alert.
  4. The obstacle detection apparatus according to claim 2 or claim 3, wherein: the obstacle detection apparatus comprises a haptic feedback element and issuing the alert comprises triggering a vibration using the haptic feedback element.
  5. The obstacle detection apparatus according to any preceding claim, wherein: the control unit is communicatively coupled to a driving control apparatus of the wheelchair and triggering the pre-emptive action comprises causing the driving control apparatus to stop the wheelchair.
  6. The obstacle detection apparatus according to any preceding claim, wherein: the control unit is configured to determine that the obstacle is impassable to the wheelchair based on determining that the height of the obstacle exceeds a threshold height and optionally that the distance of the obstacle from the wheelchair is below a threshold distance.
  7. The obstacle detection apparatus according to claim 6, wherein: the threshold height is one of 10cm and 17cm.
  8. The obstacle detection apparatus according to claim 6, wherein: the control unit is configured to store information indicative of dimensions of the wheelchair and to calculate, based on the information, a threshold height and optionally a threshold distance for the wheelchair.
  9. The obstacle detection apparatus according to any preceding claim, wherein: to identify the one or more obstacles in the image, the control unit is configured to consider as the one or more obstacles only objects in the image within a region of interest and to ignore any objects outside the region of interest.
  10. The obstacle detection apparatus according to claim 9, wherein: to identify the one or more obstacles in the image, the control unit is further configured to consider as the one or more obstacles only a tallest obstacle and a nearest obstacle, the tallest obstacle being an object within the region of interest having a maximum height and a nearest obstacle being an object within the region of interest having a minimum distance from the wheelchair.
  11. The obstacle detection apparatus according to any preceding claim, wherein: the control unit is configured to store positioning information of the depth camera relative to the wheelchair, the positioning information indicative of a camera height and camera tilt angle; and calculating the height and the distance of the obstacles comprises adjusting for the camera height and the camera tilt angle.
  12. The obstacle detection apparatus according to any preceding claim, wherein: the depth camera is a stereo depth camera comprising two or more optical image sensors, wherein the depth camera is configured to compare the output of the two or more optical image sensors to produce the images and position information.
  13. The obstacle detection apparatus according to any preceding claim, wherein: in response to determining that an obstacle is impassable to the wheelchair, the control unit is further configured to determine a location of the wheelchair and to store, in association with the determined location, an indication of the obstacle.
  14. The obstacle detection apparatus according to claim 13, wherein: the control unit is configured to calculate routing information for navigating with the wheelchair and to provide the routing information to a user of the wheelchair; wherein, when calculating routing information for a route near to the determined location, the control unit is configured to calculate routing information that avoids the obstacle.
  15. A wheelchair comprising the obstacle detection apparatus of any preceding claim.
  16. A method of detecting obstacles impassable to a wheelchair, the method comprising: capturing images and position information for objects in the images, the position information indicative of estimated positions of the objects relative to the depth camera; identifying one or more obstacles in an image; calculating, for each of the obstacles and from an image and associated position information, a height of the obstacle and a distance of the obstacle from the wheelchair; determining, based on the height and the distance for each of the obstacles, whether any of the obstacles are impassable to the wheelchair; and in response to determining that an obstacle is impassable to the wheelchair, triggering a pre-emptive action.
GB2113023.2A 2021-09-13 2021-09-13 Obstacle detection apparatus Pending GB2610630A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2113023.2A GB2610630A (en) 2021-09-13 2021-09-13 Obstacle detection apparatus
PCT/GB2022/052285 WO2023037114A1 (en) 2021-09-13 2022-09-08 Obstacle detection apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2113023.2A GB2610630A (en) 2021-09-13 2021-09-13 Obstacle detection apparatus

Publications (2)

Publication Number Publication Date
GB202113023D0 GB202113023D0 (en) 2021-10-27
GB2610630A true GB2610630A (en) 2023-03-15

Family

ID=78149255

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2113023.2A Pending GB2610630A (en) 2021-09-13 2021-09-13 Obstacle detection apparatus

Country Status (2)

Country Link
GB (1) GB2610630A (en)
WO (1) WO2023037114A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140095009A1 (en) * 2011-05-31 2014-04-03 Hitachi, Ltd Autonomous movement system
US20150348416A1 (en) * 2013-03-26 2015-12-03 Sharp Kabushiki Kaisha Obstacle detection device and electric-powered vehicle provided therewith
US20180222473A1 (en) * 2017-02-09 2018-08-09 GM Global Technology Operations LLC Collision avoidance for personal mobility devices
JP2020074816A (en) * 2018-11-05 2020-05-21 スズキ株式会社 Electric vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7259356B2 (en) * 2019-01-28 2023-04-18 スズキ株式会社 Control device and electric vehicle
KR102146206B1 (en) * 2019-12-13 2020-08-20 서한교 Safety Electric Wheelchair System

Also Published As

Publication number Publication date
WO2023037114A1 (en) 2023-03-16
GB202113023D0 (en) 2021-10-27

Similar Documents

Publication Publication Date Title
JP7150274B2 (en) Autonomous vehicles with improved visual detection capabilities
JP2017100490A (en) Speed control device
JP6047318B2 (en) Driver state detection device and driver state notification device
KR101646503B1 (en) Device, system and method for informing about 3D obstacle or information for blind person
US8094190B2 (en) Driving support method and apparatus
JP6814220B2 (en) Mobility and mobility systems
CN104890671B (en) Trailer lane-departure warning system
US20160025499A1 (en) Intelligent mobility aid device and method of navigating and providing assistance to a user thereof
EP1721782B1 (en) Driving support equipment for vehicles
US9801778B2 (en) System and method for alerting visually impaired users of nearby objects
JP2014191485A (en) Obstacle detection device and electrically-driven vehicle with the same
EP3851800A1 (en) Travel route creation system
US20180020952A1 (en) Systems and methods for warning of a protruding body part of a wheelchair occupant
JP2019087150A (en) Driver monitoring system
WO2021171291A1 (en) Self-navigating guide robot
JP2013254474A (en) Obstacle detector
JP5084756B2 (en) Autonomous mobile wheelchair
GB2610630A (en) Obstacle detection apparatus
KR20150027987A (en) Control device and method of electrical wheel-chair for autonomous moving
JP2022089857A (en) Work vehicle
US20140191489A1 (en) Early warning method and device for preventing wheelchair from tipping over
JP6775135B2 (en) Driver status detector
TW201620468A (en) Mobile electronic device, early warning method and early warning system
JP7295654B2 (en) self-propelled robot
JP2003245310A (en) Low-speed traveling vehicle and low-speed traveling vehicle geographical guidance system

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: DUCHENNE UK

Free format text: FORMER OWNER: THE MOVEMENT FOR NON-MOBILE CHILDREN (WHIZZ-KIDZ),