GB2610629A - Gesture controller - Google Patents

Gesture controller

Info

Publication number
GB2610629A
GB2610629A
Authority
GB
United Kingdom
Prior art keywords
wheelchair
gesture
controller
user
movement
Prior art date
Legal status
Granted
Application number
GB2113022.4A
Other versions
GB202113022D0 (en)
GB2610629B (en)
Inventor
Arnoldus Verhoeven Gerardus
Hosking Josh
Current Assignee
Duchenne Uk
Original Assignee
Duchenne Uk
Priority date
Filing date
Publication date
Application filed by Duchenne Uk
Priority to GB2113022.4A
Publication of GB202113022D0
Priority to PCT/GB2022/052287
Publication of GB2610629A
Application granted
Publication of GB2610629B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/06 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs with obstacle mounting facilities, e.g. for climbing stairs, kerbs or steps
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00 Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05G CONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G9/00 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously
    • G05G9/02 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only
    • G05G9/04 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously
    • G05G9/047 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks
    • G05G9/04737 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks with six degrees of freedom
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/10 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A61G2203/12 Remote controls
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05G CONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G9/00 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously
    • G05G9/02 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only
    • G05G9/04 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously
    • G05G9/047 Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

An apparatus for a wheelchair comprises a gesture controller 40 to detect a gesture performed by a user, where the gesture corresponds to manipulation of the controller (A-F, figure 3), and a control system having a database associating gestures with single or composite movements of the wheelchair. The control system is responsive to detection of a given gesture by the controller to reference the database and determine an associated movement for the given gesture, such that the wheelchair can perform the associated movement. Preferably the controller is configured to detect motion with six degrees of freedom (6DOF), using an accelerometer, a magnetometer and a gyroscope to do so. The composite movements might be turning the wheelchair around, moving in a particular direction, or climbing a kerb. The gestures may be reconfigurable at will by the user, for example through a mobile phone.

Description

GESTURE CONTROLLER
Background
The present invention is concerned with a gesture controller particularly, but not exclusively, for a wheelchair.
Motorised wheelchairs are traditionally controlled using a joystick that can be pivoted about a fixed point to indicate a direction of travel for the wheelchair where the direction in which the joystick is pivoted matches the indicated direction of travel. An amount by which the joystick is pivoted may also be used to indicate the speed at which the wheelchair is to move.
Whilst this form of control may be suitable for some users, other wheelchair users, particularly those with a limited range of motion and/or level of motor control, may have difficulty performing the required movements to pivot the joystick, may struggle to perform such movements with sufficient precision to control the wheelchair as intended, or may find operating a joystick uncomfortable. This may include users with ataxia, such as users with cerebral palsy, for whom a joystick may not be a comfortable or convenient form of input device.
The inventors have therefore developed an apparatus that may be used to control a wheelchair that provides a more convenient input mechanism for users and that can be tailored based on a user's range of motion or level of motor control. Additionally, composite movements of the wheelchair that may otherwise have required careful and accurate manipulation of a joystick may be triggered using motions that are more natural to the user.
Summary of the Invention
Aspects of the invention are set out in the accompanying claims.
Viewed from a first aspect, there is provided a wheelchair comprising: a gesture controller to detect a gesture performed by a user, the gesture corresponding to manipulation of the gesture controller; and a control system having a database associating gestures with single or composite movements of the wheelchair; wherein the control system is responsive to detection of a given gesture by the gesture controller to reference the database to determine an associated movement for the given gesture and to control the wheelchair to perform the associated movement.
The term 'gesture' is intended to refer to a particular combination or sequence of individual movements using the hand and/or arm. Specifically, this is intended to refer to complex movements combining movements/rotations and speeds of the hand or arm, and not simply a conventional Cartesian coordinate movement. The controller may advantageously be used in a complex way, combining both conventional Cartesian coordinate control and complex gesture control.
A gesture controller is provided in place of a traditional joystick for receiving the user input to control the wheelchair. The gesture controller is configured to detect gestures performed by the user beyond just pivoting of the controller about a fixed point. Thus, more complex physical manipulation of the gesture controller may be used to steer/drive the wheelchair. The gesture controller may be configured to detect manipulation of the gesture controller involving unconstrained motion of the gesture controller in three dimensions. In contrast to a joystick, therefore, the gesture controller can be moved freely by the user in a manner most comfortable or natural to the user. For example, instead of pivoting the controller to indicate a direction of movement, the user may shake the controller in a particular direction or tilt the controller. The user may therefore be able to use manipulation of the controller with six degrees of freedom (corresponding to translation of the controller in 3 orthogonal directions and rotation of the controller about 3 orthogonal axes). This provides more flexibility in how the user can operate the wheelchair, enabling the user to make use of convenient and comfortable gestures to control the wheelchair. Moreover, in examples where the gesture controller can be removed from the wheelchair, the gesture controller may be used to remotely control the wheelchair even when the user is no longer in the wheelchair. This may be useful, for example, when the user is in bed: the wheelchair can be driven away from the bed and later returned when the user wishes to get out of bed.
To detect such manipulation, the gesture controller may be provided with one or more sensors able to detect motion and/or the orientation of the controller. In some examples, the gesture controller is provided with an accelerometer, a magnetometer and a gyroscope to perform such detection.
In some examples, the gesture controller is configured so that the controller is also operable as a joystick. A mandrel or housing is provided in which the gesture controller can be placed, such that the mandrel supports the controller and provides a point about which the gesture controller can be pivoted. The gesture controller may be able to detect when the controller is placed in the mandrel and switch to a 'joystick mode' in which pivoting of the gesture controller is used to control the wheelchair. With the gesture controller removed from the mandrel, the gesture controller may be operated in 'gesture mode' in which gestures with the gesture controller are used to control the wheelchair.
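Purely as an illustration of this mode switching, the following Python sketch shows one way the docked/undocked decision might be organised in software; the names used (ControlMode, GestureController, is_docked) are assumptions for the sake of the example and are not taken from the disclosure.

    from enum import Enum, auto

    class ControlMode(Enum):
        JOYSTICK = auto()  # controller seated in the mandrel; pivot-style input
        GESTURE = auto()   # controller removed; free-space gestures

    class GestureController:
        def __init__(self, mandrel_sensor):
            # mandrel_sensor is assumed to report whether the controller is docked
            self.mandrel_sensor = mandrel_sensor

        def current_mode(self):
            # Use 'joystick mode' when docked in the mandrel, 'gesture mode' otherwise.
            if self.mandrel_sensor.is_docked():
                return ControlMode.JOYSTICK
            return ControlMode.GESTURE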
The wheelchair also comprises a control system to handle the gestures detected by the gesture controller and produce the output signals used to control the wheelchair. The control system may comprise processing circuitry such as a microprocessor or microcontroller and is arranged to perform control functionality relating to the gesture controller. The control system also has a database which may be stored in storage accessible to processing circuitry of the control system such as flash storage (e.g., a solid state drive or a memory card), a hard-disk drive, erasable programmable ROM (EPROM), and/or electrically erasable programmable ROM (EEPROM). The database is used to store information associating gestures that could be performed using the gesture controller with movements of the wheelchair. When a given gesture is detected by the gesture controller, the control system can reference the database to look up the gesture and determine an associated movement for the given gesture. The control system can then control the wheelchair to perform the associated movement.
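As a minimal sketch of the database lookup just described, a gesture identifier could map directly to its single or composite movement; the gesture names, movement tuples and the lookup_movement helper below are hypothetical.

    # Hypothetical association database: gesture identifier -> list of movements.
    # A one-element list is a single movement; a longer list is a composite movement.
    GESTURE_DB = {
        "tilt_forward": [("drive", "forward", 1.0)],
        "tilt_left":    [("turn", "left", 90.0)],
        "circle":       [("turn", "right", 90.0), ("turn", "right", 90.0)],  # turn around
    }

    def lookup_movement(gesture_id, database=GESTURE_DB):
        # Returns the associated movement(s), or None if the gesture is not mapped.
        return database.get(gesture_id)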
By maintaining a database associating gestures with movements of the wheelchair in this way, the associations between the gestures and the movements can be modified based on a user's preferences, e.g., to account for a user's level of motor control or range of motion. That is, a fixed mapping between inputs to the controller and the resulting movement of the wheelchair does not need to be used for all users. Rather, additional flexibility is provided to tailor the control of the wheelchair to an individual user.
The control system is operable with gestures associated with single or composite movements. A single movement can be expressed as a single action to be performed by the wheelchair such as driving in a straight line or turning in a particular direction, whereas a composite movement may relate to two or more single movements being performed successively or concurrently. For example, while turning in a particular direction may be effected by tilting the controller in the direction of the turn and driving either forwards or backwards may be indicated by tilting the gesture controller forwards or backwards, the control system may support the use of a circular motion of the gesture controller to control the wheelchair to perform, as a composite movement, a turning operation of the wheelchair through 180°. Other examples of composite movements that may be associated with gestures include driving the wheelchair along a preset route (e.g., to automatically control the wheelchair to drive from a user's house to a nearby bus stop) or performing an operation to automatically mount a kerb with the wheelchair. Thus, the user can trigger a series of movements of the wheelchair using a single gesture. This may be more convenient for the user since he/she does not need to perform a series of different gestures to effect the composite movement and may avoid the user needing to perform a series of fine movements with the gesture controller that may be difficult.
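To make the idea of a composite movement concrete, the sketch below replays an ordered list of single movements through a hypothetical drive interface; the decomposition of 'turn around' into two 90° turns and the drive.turn(...) call are illustrative assumptions only.

    # Illustrative composite movement: turn the wheelchair through 180 degrees,
    # expressed here as two successive 90-degree turns.
    TURN_AROUND = [
        ("turn", {"direction": "right", "degrees": 90}),
        ("turn", {"direction": "right", "degrees": 90}),
    ]

    def execute_composite(movement, drive):
        # 'drive' is assumed to expose one method per single movement,
        # e.g. drive.turn(direction=..., degrees=...) or drive.forward(speed=...).
        for action, params in movement:
            getattr(drive, action)(**params)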
In some examples, the database contains two or more different sets of associations between gestures and single or composite movements of the wheelchair. The movement triggered in response to a particular gesture may therefore be contingent on the mode of operation of the control system when the gesture was performed. For example, the user may have a 'home' mode of operation when at home in which a particular gesture is used to drive the wheelchair from the bathroom to the bedroom. The same gesture may be mapped in an 'outdoors' mode of operation to drive the wheelchair from the user's house to a nearby bus stop for example. Therefore, if there are certain particularly comfortable or intuitive gestures for a user to perform, these gestures can be used to trigger a number of different movements depending on the context in which they are performed.
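One plausible way of holding two or more sets of associations is to key the database on the active mode of operation, as sketched below; the 'home' and 'outdoors' mode names follow the example in the text, while the data structure itself is an assumption.

    MODE_KEYED_DB = {
        "home": {
            "push_pull": [("route", "bathroom_to_bedroom")],
        },
        "outdoors": {
            "push_pull": [("route", "house_to_bus_stop")],
        },
    }

    def lookup_for_mode(gesture_id, mode, database=MODE_KEYED_DB):
        # The same gesture resolves to different movements depending on the mode.
        return database.get(mode, {}).get(gesture_id)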
The control system may control the wheelchair to perform the movements determined as a result of referencing the database directly by issuing control signals to one or more motors driving the wheels of the wheelchair. However, in some examples, the control system provides an indication of the movements to be performed to a driving control system of the wheelchair which handles driving of the wheelchair. The indication from the gesture controller may be provided in accordance with a standard format of indication used by joysticks such that the gesture controller appears to the driving control system as a joystick. This may make using the gesture controller with existing driving control systems simpler since a new interface between the control system and the driving control system does not need to be provided and may make it easier to retrofit an apparatus comprising the gesture controller and the control system to an existing wheelchair.
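The idea of presenting the determined movement to the driving control system in a joystick-like format might look like the following sketch; the two-axis frame and the -1.0 to +1.0 ranges are assumed for illustration rather than taken from any particular joystick standard.

    from dataclasses import dataclass

    @dataclass
    class JoystickFrame:
        x: float  # -1.0 (full left) to +1.0 (full right)
        y: float  # -1.0 (full reverse) to +1.0 (full forward)

    def movement_to_joystick(direction, speed):
        # Encode a determined single movement as a joystick-style demand so that
        # an existing driving control system sees a familiar input.
        frames = {
            "forward":  JoystickFrame(0.0, speed),
            "backward": JoystickFrame(0.0, -speed),
            "left":     JoystickFrame(-speed, 0.0),
            "right":    JoystickFrame(speed, 0.0),
        }
        return frames[direction]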
In some examples, the database stores information identifying a neutral orientation for the gesture controller relative to which manipulation is measured in order to identify the movement of the wheelchair to perform. Thus, determining the associated movement comprises identifying the manipulation of the gesture controller relative to the neutral orientation. The neutral orientation may be set to coincide with a comfortable resting position for the user so that when in the resting position, no gesture is detected and it is movement from that resting position that is used to control the wheelchair. To adjust the neutral position, the gesture controller may be provided with a button such that in response to pressing the button, the control system updates the information in the database to identify, as a new neutral orientation, an orientation of the gesture controller when the button was pressed. This may be used as the user adjusts their seating position or finds a new comfortable position in which the gesture controller is held differently, so that by setting a new neutral orientation, the user can hold and operate the gesture controller from that comfortable position.
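A minimal sketch of this neutral-orientation behaviour, assuming orientation is represented as a (pitch, roll, yaw) tuple and that the database is a simple key-value store:

    class NeutralReference:
        def __init__(self, database):
            # 'database' is assumed to persist the stored neutral orientation.
            self.database = database

        def on_button_press(self, current_orientation):
            # Record the orientation at the moment of the press as the new neutral.
            self.database["neutral_orientation"] = current_orientation

        def relative_manipulation(self, current_orientation):
            # Manipulation is measured as the change from the neutral orientation.
            neutral = self.database.get("neutral_orientation", (0.0, 0.0, 0.0))
            return tuple(c - n for c, n in zip(current_orientation, neutral))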
Thus, the button on the gesture controller provides one mechanism by which information in the database may be updated. Additionally, or alternatively, the control system may be able to receive other forms of association information that associates gestures with single or composite movements of the wheelchair, with that information used to update the database. For example, the control system may be able to receive association information via a command line interface presented by the control system or remotely via a user device such as a mobile phone (e.g., by using an app on the user device).
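Received association information might then be merged into the database as in the sketch below; the dictionary format and the example gesture name are hypothetical stand-ins for whatever the app or command line interface would actually send.

    def apply_association_info(database, association_info):
        # Merge association information received, e.g., from a phone app or a
        # command line interface, into the gesture database.
        database.update(association_info)
        return database

    # Usage sketch: remap a hypothetical 'circle' gesture to a kerb-mounting operation.
    gesture_db = {"circle": [("turn", "right", 180.0)]}
    apply_association_info(gesture_db, {"circle": [("mount_kerb",)]})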
The information stored by the database associating gestures with single or composite movements may take a number of forms. One example of the information that may be stored by the database is a sensitivity for the gesture controller. In this case, determining the associated movement for the given gesture comprises determining a magnitude of the movement to make based on the sensitivity. Thus if the sensitivity is low, the movement of the wheelchair triggered by the control system may be smaller than that triggered for the same gesture when the sensitivity is high. The gesture controller can thereby be adapted to the particular user, for example, by using a high sensitivity for a user with a restricted range of motion so that more control functionality is available with less manipulation of the gesture controller, or by using a lower sensitivity for a user with limited motor control to avoid exaggerating any inaccuracies in the manipulation of the gesture controller.
Some users may find that their range of motion or level of motor control varies with the direction in which the gesture controller is being manipulated. For example, a motion of the gesture controller away from the user may be easier to perform than a manipulation towards the user. In this case, a direction-dependent sensitivity may be used whereby the sensitivity of the gesture controller varies in dependence on the direction in which the controller is manipulated. As another example of how this direction-dependent sensitivity may be used, a higher sensitivity may be applied to manipulations in forward, backward, left, and right directions than in intermediate directions between those positions.
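The sensitivity behaviour described in the last two paragraphs, including a direction-dependent sensitivity, might be stored and applied roughly as follows; the per-direction table and its values are purely illustrative.

    # Assumed per-direction sensitivity table: cardinal directions are more
    # sensitive than intermediate directions, as in the example above.
    SENSITIVITY = {
        "forward": 1.0, "backward": 1.0, "left": 1.0, "right": 1.0,
        "intermediate": 0.5,
    }

    def scaled_magnitude(direction, raw_magnitude, table=SENSITIVITY):
        # Scale the detected manipulation by the sensitivity stored for its direction,
        # so the same gesture produces a larger or smaller movement of the wheelchair.
        gain = table.get(direction, table["intermediate"])
        return raw_magnitude * gain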
A further way in which the database may be used to adjust how the movement for the wheelchair is determined based on the manipulation of the gesture controller is by remapping directions of manipulation of the gesture controller to different directions of the movement of the wheelchair. Whilst it may appear counter-intuitive for manipulation of the gesture controller to trigger movement of the wheelchair in a different direction, this may for example be done so that a full range of control of the wheelchair can be achieved using only manipulations that are comfortable for the user. So if the user prefers to operate the gesture controller with manipulations in a direction away from the user, the movement of the wheelchair may be associated with a compressed region of manipulation of the gesture controller away from the user such that movement of the wheelchair in a direction towards the user can also be mapped to manipulation of the gesture controller in directions away from the user. It will be appreciated that this provides only an illustrative example of how the directions may be remapped and other schemes of remapping directions may be used.
As a consequence of remapping directions of manipulation of the gesture controller, there may be gestures for which the database does not store an associated movement. This may be done deliberately in order to create deadzones of the gesture controller for example.
In response to detecting such a gesture, the control system may suppress movement of the wheelchair.
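As a sketch only, the remapping and deadzone behaviour could be expressed as a direction-to-direction table in which any unmapped manipulation suppresses movement; the sub-region names below are invented for the example.

    # Illustrative remapping: only manipulations away from the user are used, with
    # sub-regions of that forward range standing in for the four travel directions.
    DIRECTION_REMAP = {
        "forward_near":  "forward",
        "forward_far":   "backward",
        "forward_left":  "left",
        "forward_right": "right",
        # Any manipulation direction absent from this table is a deadzone.
    }

    def remap_direction(manipulation_direction):
        travel = DIRECTION_REMAP.get(manipulation_direction)
        if travel is None:
            return None  # deadzone: the control system suppresses movement
        return travel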
Since the single or composite movement of the wheelchair may not correspond in an obvious way to the gestures performed with the gesture controller, the wheelchair may comprise a display to show an indication of the movement being performed in response to a detected gesture. Thus, the user can tell whether the gesture has been interpreted as they intended.
Viewed from a second aspect of an invention described herein, there is provided an apparatus comprising: a gesture controller to detect a gesture performed by a user, the gesture corresponding to manipulation of the gesture controller; and a control system having a database associating gestures with single or composite movements of a wheelchair; wherein the control system is responsive to detection of a given gesture by the gesture controller to reference the database to determine an associated movement for the given gesture and to control the wheelchair to perform the associated movement.
According to the second aspect therefore, the gesture controller and control system may be provided separately from the wheelchair. Thus, the apparatus may be retrofitted to an existing wheelchair, for example in place of a joystick.
Viewed from a third aspect, there is provided an apparatus comprising: a gesture controller to detect a gesture performed by a user, the gesture corresponding to manipulation of the gesture controller; and a control system having a database associating gestures with single or composite movements; wherein the control system is responsive to detection of a given gesture by the gesture controller to reference the database to determine an associated movement for the given gesture and to output control signals to indicate the associated movement.
Although as described herein, the gesture controller and control system are used to control a wheelchair, it will be appreciated that the present techniques may also be applied to other control systems, for example to machinery (e.g., cranes, forklift trucks) or games controllers.
Viewed from a fourth aspect, there is provided a method of configuring a gesture controller for a wheelchair, the method comprising: identifying gestures to be performed by a user with a gesture controller based on the user's level of motor control; associating the gestures with single or composite movements of the wheelchair; and storing the association between the gestures and the single or composite movements of the wheelchair in a database.
Optional features of the first aspect are also optional features of the second, third, and fourth aspects.
Figures
Aspects of the invention will now be described, by way of example only, with reference to the accompanying figures in which:
Figure 1 shows a wheelchair of the type with which one example of the invention may be used;
Figure 2A shows a schematic of selected components of the wheelchair of Figure 1;
Figure 2B shows a schematic of selected components of a user device that may be used with the wheelchair of Figure 1;
Figure 3 shows a gesture controller according to an example;
Figures 4A-4F illustrate example gestures and associated movements of a wheelchair;
Figure 5 is a flowchart illustrating a method of controlling a wheelchair according to an example; and
Figure 6 is a flowchart illustrating a method of configuring a gesture controller for a wheelchair.
Any reference to prior art documents in this specification is not to be considered an admission that such prior art is widely known or forms part of the common general knowledge in the field. As used in this specification, the words "comprises", "comprising", and similar words, are not to be interpreted in an exclusive or exhaustive sense. In other words, they are intended to mean "including, but not limited to". The invention is further described with reference to the following examples. It will be appreciated that the invention as claimed is not intended to be limited in any way by these examples. It will also be recognised that the invention covers not only individual embodiments but also combinations of the embodiments described herein.
The various embodiments described herein are presented only to assist in understanding and teaching the claimed features. These embodiments are provided as a representative sample of embodiments only, and are not exhaustive and/or exclusive. It is to be understood that advantages, embodiments, examples, functions, features, structures, and/or other aspects described herein are not to be considered limitations on the scope of the invention as defined by the claims or limitations on equivalents to the claims, and that other embodiments may be utilised and modifications may be made without departing from the spirit and scope of the claimed invention. Various embodiments of the invention may suitably comprise, consist of, or consist essentially of, appropriate combinations of the disclosed elements, components, features, parts, steps, means, etc, other than those specifically described herein. In addition, this disclosure may include other inventions not presently claimed, but which may be claimed in future.
In the present application, the words "configured to..." are used to mean that an element of an apparatus has a configuration able to carry out the defined operation. In this context, a "configuration" means an arrangement or manner of interconnection of hardware or software. For example, the apparatus may have dedicated hardware which provides the defined operation, or a processor or other processing device may be programmed to perform the function. "Configured to" does not imply that the apparatus element needs to be changed in any way in order to provide the defined operation.
Detailed Description
Figure 1 shows a wheelchair 2 of the type with which one example of the invention may be used. The wheelchair 2 is a motorised wheelchair driven by a motor housed within the base 10 that is arranged to drive the wheelchair 2 based on inputs from a user at a gesture controller 40 and/or a secondary controller 70. As shown in Figure 1, the wheelchair 2 has two front wheels 22 which are driven by the motor and can be operated independently by a control unit such that the wheelchair 2 is able to drive in a straight direction, turn left or right and drive at different speeds. To support the ability of the wheelchair 2 to make a turn in a small amount of space, the wheelchair 2 has rear omni-wheels 24. The omni-wheels 24 are able to rotate about an axle running through the centre of the wheel (as is the case for a normal wheel) to enable motion in a forward/backward direction, and also comprise rollers mounted around the circumference of the wheel to enable smooth motion of the wheel in a direction parallel to the axle. This allows the wheelchair 2 to turn with a small turning circle which may be desirable when operating the wheelchair 2 in cramped environments.
Mounted on top of the base 10 is a seat 30 having a seat base 32 on which a user can sit and a seat back 34 to support the user's back when sitting. The seat base 32 and seat back 34 may be fixedly mounted to one another or may be pivotally mounted such that the angle formed between the seat base 32 and the seat back 34 can be altered to provide a comfortable sitting position for the user. For further support when sitting, the seat 30 comprises a headrest 36 to support the user's head when sitting in the wheelchair.
The seat 30 also has armrests 38 on which the user can rest his/her arms and footrests 28 extending from the seat base 32. The footrests 28, armrests 38 and headrest 36 are adjustable so that the position and angle of these components relative to the seat base 32 and seat back 34 can be altered and/or these components removed entirely.
Located at the end of one of the armrests 38 is a gesture controller 40. The gesture controller 40 allows the user to control the motion of the wheelchair 2. As shown in Figure 1, the gesture controller 40 is seated in a mandrel which enables the use of the gesture controller 40 as a joystick whereby pivoting of the gesture controller 40 in the mandrel is used to indicate a direction and speed in which the wheelchair 2 is to travel. The gesture controller 40 may also be removed from the mandrel, whereupon control of the wheelchair 2 can be performed by moving the gesture controller 40 freely in three-dimensional space, unconstrained by the attachment to the mandrel. This may provide a more comfortable and intuitive method of controlling the wheelchair 2, particularly where the user has a limited range of motion or level of motor control, for example. This motion of the gesture controller 40 is detected using a combination of an accelerometer 44, magnetometer 46 and gyroscope 48. In accordance with examples, the gesture controller 40 can be adapted to the preferences of the user to adjust how the gestures performed by the user are translated into the control of the wheelchair 2. In this way, more comfortable/easier to perform gestures can be mapped to common control commands. Similarly, the gesture controller 40 may be adjusted to account for a user's level of motor control by changing the sensitivity of the gesture controller 40 and/or remapping directions of travel of the wheelchair 2 to directions of manipulation of the gesture controller 40 that lie within the user's range of motion, for example.
To display the detected input from the gesture controller 40 and any other information that may be useful for the user, a display 60 is provided at the end of one of the armrests 38.
The display 60 is mounted at an angle relative to the armrest such that the display 60 is oriented for easy viewing by a user sitting in the seat 30. The detected input from the gesture controller 40 (e.g., a direction and speed in which the wheelchair 2 is to travel based on the manipulation of the gesture controller 40) is displayed on the display 60. The display 60 may also display other information such as a battery level for the wheelchair 2, navigation information and/or notifications from a connected user device (e.g. mobile phone).
Situated at the end of the other arm rest 38 is a phone holder 70. The phone holder 70 is arranged to securely hold the user's phone. Further, the phone holder 70 is pivotally mounted such that the angle of the phone can be adjusted. This allows the user to position their mobile phone at an angle that allows for comfortable viewing.
In some examples, a secondary controller (not pictured) is also provided. This controller may be used to control functions of the wheelchair other than the movement as controlled by the gesture controller 40. For example, the seat 30 may be mounted such that the seat 30 can be elevated vertically from the base 10 by means of an extendable riser post. This would allow the user to adjust the seat height so as to position themselves at the eye line of a person with whom they were talking or to have a better view (e.g., over a counter in a shop). The secondary controller may therefore be used to control the vertical motion of the seat 30. Similarly, the secondary controller may be used to adjust the positioning of the armrests 38, footrests 28 and/or the headrest 36. In the example of Figure 1, such a secondary controller is not provided and this functionality is controlled via the display 60.
As is also illustrated in Figure 1, a depth camera 50 is situated in the base 10 of the wheelchair 2. The depth camera 50 is configured to capture images as well as position information for the objects in those images. These can then be used to aid in navigation, detection of obstacles and the identification of obstacles that may be impassable to the wheelchair 2 (e.g., kerbs above a height scalable by the wheelchair 2). Based on detection of such impassable obstacles, a pre-emptive action may be taken to warn the user and/or prevent the user from attempting to drive over the obstacle.
Figures 2A and 2B show schematics of selected components of the wheelchair 2 and of a user device that may be used with the wheelchair 2. It will be appreciated that not all features of the wheelchair 2 and the user device are included in these schematics and that these figures are used to explain certain features and their interactions in accordance with an illustrative example. Other examples may contain more or fewer components and the relationship between the components may differ from that shown in Figures 2A and 2B.
As shown in Figure 2A, the wheelchair 2 comprises a central control unit 200 that coordinates the functions of the wheelchair 2. The control unit 200 operates in conjunction with a driving control unit 210 that handles the control of the wheelchair's motion and calculates and issues control signals (e.g., to one or more motors) to cause the wheelchair 2 to move. The wheelchair 2 also has a seat control unit 230 that performs seat monitoring in order to maintain user comfort and/or reduce the risk of the user developing pressure sores by virtue of their position on the seat for an extended period of time. There is further an obstacle detection control unit 250 coupled to the control unit 200 which makes use of a depth camera 50 to identify obstacles and determine whether the obstacles are passable to the wheelchair 2 as discussed in more detail below. The control unit 200 is also coupled to a gesture controller control system 260 which interprets the inputs from the gesture controller 40 to produce control signals that can be used to control the wheelchair 2.
The control unit 200 provides an interface by which the control units 210, 230, 250, 260 as well as the other components of the wheelchair 2 can interact. For example, as the gesture controller control system 260 interprets the gestures and determines control signals for driving the wheelchair, the control unit 200 can pass these control signals to the driving control unit 210 whereupon the driving control unit 210 can control the wheelchair 2 to move in accordance with the gestures performed by the user. Similarly, if the obstacle detection control unit 250 detects an obstacle that is deemed to be impassable to the wheelchair 2 (e.g., a kerb that is too high for the wheelchair 2 to mount), the control unit 200 may receive this information and control the driving control unit 210 to stop the wheelchair 2. It will be appreciated that Figure 2A illustrates one possible arrangement of control logic within the wheelchair and that other arrangements could be used, for example, providing a single control unit to handle all the control functionality of the wheelchair.
The seat control unit 230 is coupled to a plurality of seat sensors including a temperature sensor 232, at least one pressure sensor 234 and a humidity sensor 236. The seat control unit 230 is responsive to measurements from these sensors to determine whether a discomfort condition is satisfied. The discomfort condition is set so as to be indicative of a set of temperature, pressure and humidity values at which the user is likely to be at risk of developing a pressure sore. Additionally, or alternatively, the discomfort condition may be set such that the condition is satisfied when a user is expected to begin feeling discomfort, and so by anticipating this discomfort (or equivalently the onset of pressure sores), the seat control unit 230 can take a pre-emptive action to prevent discomfort/pressure sores.
As shown in Figure 2A, the seat control unit 230 is connected to a fan 238, a heater 240, and an actuator 242 which is able to control a surface profile of the seat. Making use of these components, the seat control unit 230 is able to respond pre-emptively when the discomfort condition is satisfied to adjust the temperature at the seat surface (using the fan 238 or heater 240 as appropriate) or to adjust the user's sitting position using actuator 242. It will be appreciated that the seat monitoring apparatus may be provided with more, fewer, or different components, e.g., in some examples the seat comprises actuator 242 but does not have the fan 238 or heater 240.
As another example of a pre-emptive action that may be taken, and which may also be taken in response to detection of an impassable obstacle by the obstacle detection control unit 250, the wheelchair 2 may issue an audible alert via speaker 278 or issue haptic feedback via haptic feedback element 276, which may be located, for example, in the arm rest 38, gesture controller 40, or the seat base 32. The wheelchair 2 further comprises a communication element 274 by which the control unit 200 can communicate with a user device. For example, the communication element 274 may be a Bluetooth™ communication element with which the wheelchair 2 can communicate with a corresponding Bluetooth™ communication element 282 of the user device. Although the example of Bluetooth™ is provided, it will be appreciated that other forms of suitable communication could be used, such as other personal area networks (e.g., Zigbee™, Wireless USB, Wi-Fi™ or Near-Field Communication (NFC)). In some examples, wired communication between physical ports of the user device and the wheelchair 2 is used, employing a physical connection technology such as USB™, a serial port, or FireWire™.
The wheelchair 2 also comprises storage 272 to store data used by the wheelchair's control systems. The storage may for example be flash storage (e.g., a solid state drive or a memory card), a hard-disk drive, read-only memory (ROM), erasable programmable ROM (EPROM), and/or electrically erasable programmable ROM (EEPROM). The storage 272 may be used to store a database associating gestures of the gesture controller 40 with movements of the wheelchair, personal discomfort conditions set for the user, and/or information for use by the obstacle detection control unit such as a positioning of the depth camera 50 relative to the wheelchair 2.
Turning now to Figure 2B, there is illustrated a schematic of selected components of a user device that may be used with the wheelchair of Figures 1 and 2A. A user device such as a mobile phone or tablet computer may be used to extend the functionality of the wheelchair 2 and/or provide a more convenient means of interacting with the wheelchair 2.
The user device comprises a processor 280 to perform processing operations. The processor 280 is coupled to a communication element 282 which can communicate with the communication element 274 of the wheelchair 2. As discussed above, a number of possible communications protocols could be used and the form of the communication elements provided will be selected in accordance with the communication protocol by which the user device and the wheelchair 2 are to communicate.
The user device also comprises a speaker 284 which may be used to issue an alert to the user, for example, when the seat control unit 230 determines that the discomfort condition is satisfied or when the obstacle detection control unit 250 identifies an obstacle impassable to the wheelchair 2. The user device is also provided with storage 286, which as for the storage of the wheelchair 2 may for example be flash storage (e.g., a solid state drive or a memory card), a hard-disk drive, read-only memory (ROM), erasable programmable ROM (EPROM), and/or electrically erasable programmable ROM (EEPROM). The storage 286 may be used to store user settings such as gesture control profiles or a personalised discomfort condition which can be uploaded to the wheelchair 2.
The user device (e.g., a mobile phone) also comprises a location determining element 288 (such as a GPS receiver element) which can be used to determine the location of the user device and by association the wheelchair 2 and its user. The location may also be communicated via the communication element 282 (e.g., using Bluetooth) to the wheelchair 2. In this way, the user device can be used to provide location information for use in navigation of the wheelchair. Such navigation may take into account obstacles identified by the obstacle detection control unit 250 as impassable (and other impassable obstacles identified for example by other wheelchairs), to direct the user to follow a route suitable for the wheelchair 2. In some examples, the wheelchair 2 itself is provided with its own location determining element and is able to perform navigation without needing to be connected to a user device in order to provide the location determining functionality.
Figure 3 shows a gesture controller 40 according to an example. As illustrated in Figure 3, the gesture controller 40 is removed from the mandrel shown in Figure 1. The gesture controller 40 is shaped to fit comfortably in the hand of a user such that the gesture controller 40 can be gripped securely for long periods of time and even by users with limited grip strength.
Although other combinations of sensors could be used, the gesture controller 40 comprises an accelerometer to detect acceleration of the gesture controller 40, and a magnetometer and gyroscope to measure the orientation of the gesture controller 40. With these sensors, the gesture controller 40 is able to detect movement of the gesture controller 40 with six degrees of freedom. These six degrees of freedom correspond to translation in three dimensions and reorientation of the controller about three orthogonal axes. That is, using the accelerometer, the gesture controller 40 can detect forward/backward motion as illustrated with arrow A in Figure 3, left/right motion as illustrated with arrow B, and up/down motion as illustrated with arrow C. It will be appreciated that movement in intermediate directions can be detected as a combination of such motions. Further, the gesture controller is able to measure the pitch (a forward/backward tilting) of the controller and changes therein as illustrated with arrow D, roll (a left/right tilting) as illustrated with arrow E, and yaw (twisting about a long axis of the controller) as illustrated with arrow F. Similarly, combinations of these reorientations, and combined motions involving reorientation of the controller and translation, can be detected as a gesture.
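For illustration, a 6DOF sample corresponding to arrows A-F might be gathered as in the sketch below; the sensor interface (read, read_tilt, read_heading) is an assumed driver API, and the fusion of gyroscope and magnetometer data is not shown.

    from dataclasses import dataclass

    @dataclass
    class SixDofSample:
        ax: float     # arrow A: forward/backward acceleration
        ay: float     # arrow B: left/right acceleration
        az: float     # arrow C: up/down acceleration
        pitch: float  # arrow D: forward/backward tilt
        roll: float   # arrow E: left/right tilt
        yaw: float    # arrow F: twist about the controller's long axis

    def read_sample(accelerometer, magnetometer, gyroscope):
        # The accelerometer supplies the translational components; orientation is
        # assumed to come from the gyroscope and magnetometer (fusion omitted).
        ax, ay, az = accelerometer.read()
        pitch, roll = gyroscope.read_tilt()
        yaw = magnetometer.read_heading()
        return SixDofSample(ax, ay, az, pitch, roll, yaw)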
By supporting gestures involving movement in this range of directions and orientations, the gesture controller 40 allows a user to use a large number of different possible movements to control the wheelchair 2. This may be particularly important for a user with limited motor control or for whom certain motions are otherwise difficult or tiring. In contrast to a standard joystick in which a central column has to be pivoted away from a central position to express the user's input, with the gesture controller 40, the user has a far greater range of possible manipulations that can be used to control the wheelchair 2.
As illustrated in Figure 2A, the gesture controller is coupled to a gesture controller control system 260. The gesture controller control system 260 is responsive to detection of gestures by the gesture controller to reference a database in order to translate the detected gesture into a single or composite movement of the wheelchair 2. The use of the database in this way allows the mapping between gestures and movements of the wheelchair 2 to be personalised to the user and updated where necessary. Hence, by changing the mapping stored in the database, the gestures associated with common movements of the wheelchair 2 can be set to gestures that are easier to perform or more intuitive for the user, for example. In some examples, where the user has limited motor control, the database can be set up to account for the user's level of motor control, e.g., by providing a lower sensitivity for gestures in some directions than in other directions. In this way, the gesture controller 40 can provide a more convenient and easier-to-use control device for the wheelchair 2.
Some examples of gestures and associated movements of the wheelchair 2 are explained below with reference to Figures 4A-4F.
As shown in Figure 3, the gesture controller 40 comprises a button 42. The button 42 may be used to reset a neutral position for the gesture controller 40. That is, the control system 260 is responsive to detecting a button press to update the database to indicate a current orientation/position of the gesture controller 40 as a neutral position relative to which manipulation should be measured when determining a gesture. This allows a user to readjust the position in which they are sitting or resting their arm (such that the most comfortable position for holding the gesture controller 40 changes, for example) whilst being able to reset the gesture controller 40 to measure manipulation from that new resting position rather than a previous position.
Figures 4A-4F illustrate example gestures and associated movements of a wheelchair 2. It will be appreciated that these mappings represent illustrative examples and further or different mappings may be used. In the left portion of each of Figures 4A-4F is represented a manipulation of the gesture controller 40 recognised as a gesture by the controller 40. The right portion of each of Figures 4A-4F represents a corresponding single or composite movement of the wheelchair 2 as would be interpreted by the control system 260.
In Figure 4A, tilting of the gesture controller 40 in a forward direction (as represented by the arrow on the left) is mapped to a forward motion of the wheelchair 2 (as represented by the arrow on the right). Thus, in response to manipulation of the gesture controller 40 to tilt the gesture controller 40 forwards, the control system 260 is configured to identify, as the associated movement, driving in a forwards direction and to issue control signals to the driving control unit 210 to cause the driving control unit 210 to drive the wheelchair 2 forwards.
Figure 4B illustrates an example in which a lower sensitivity is set. The database thus associates the same forward tilt of the gesture controller 40 with a slower motion of the wheelchair 2. By reducing the sensitivity in this way, larger manipulations (which may be easier to perform) can be used to achieve the same effect as if a higher sensitivity were used. Similarly, a higher sensitivity could be used for a user with a limited range of movement so that the user still has control over the functionality of the wheelchair 2 but using finer movements of the gesture controller 40. In some examples, the sensitivity is a direction-dependent sensitivity and the sensitivity varies in dependence on the direction of manipulation of the gesture controller 40.
Figure 4C illustrates an example in which a gesture corresponding to a manipulation of the gesture controller 40 in a particular direction is associated with a movement of the wheelchair 2 in a different direction. As shown in this example, a tilting of the gesture controller in a forwards direction is mapped to a movement involving the wheelchair 2 turning to the right. Although it may appear counter-intuitive for the direction of manipulation of the controller 40 not to align with the direction of motion of the wheelchair 2, this association may be used to ensure that the most comfortable gestures for the user to perform are mapped to the most commonly used movements of the wheelchair 2 or to ensure that, where the user has a limited range of motion, the full range of control of the wheelchair 2 can be performed within the user's range of motion.
An example of a composite movement of the wheelchair 2 is shown in Figure 4D. As illustrated in Figure 4D, in response to a movement of the gesture controller 40 in a circular motion, the control system 260 is configured to identify that the user is indicating that the wheelchair 2 should be turned around and so a turn through 180° is effected. Thus, the user is able to conveniently indicate a more complex motion such that the control system 260 can perform the motion automatically, rather than the user individually controlling the wheelchair 2 to turn and identifying themselves when to stop the turn.
Similarly, as shown in Figure 4E, the control system 260 is responsive to translation of the gesture controller 40 in a forwards direction followed by a backwards direction to identify that a particular composite set of movements of the wheelchair 2 is to be carried out. For example, based on detecting the forward/backward gesture, the control system 260 may identify that the database stores a series of movements to be carried out. For example, these movements may drive the wheelchair 2 from the user's bathroom to their bedroom. Thus, for a commonly performed route, the user does not have to control the wheelchair 2 to perform the whole journey each time, but rather can signal that a preset journey is to be carried out, with the wheelchair 2 then driving this route automatically.
The database may comprise more than one set of associations between gestures and movements of the wheelchair 2. In this case, each set of associations has a corresponding mode of operation, whereby the mode of operation in which the wheelchair 2 is operating determines which set of associations is used. Continuing with the example discussed in relation to Figure 4E therefore, the database may store a set of associations for when the user is at home with the journey from the bathroom to the bedroom associated with the forward/backward translation of the gesture controller 40 when in the 'at home' mode of operation.
However, when in a different mode of operation, e.g., an outside mode of operation, the same gesture may be associated with a different single or composite movement of the wheelchair 2. This is illustrated in Figure 4F in which the forward/backward translation of the gesture controller 40 is associated with a kerb-climbing operation when the wheelchair 2 is in the outside mode of operation. Thus, instead of performing a route from the bathroom to the bedroom in response to the forward/backward motion of the gesture controller 40, the control system 260 is arranged to cause the wheelchair 2 to perform a series of movements determined to provide a safe and effective way of climbing a kerb, without requiring the user to manually indicate such movements individually. By supporting such modes of operation, different movements can be mapped to the same gestures, allowing the most comfortable or easiest to perform gestures to be reused.
Figure 5 is a flowchart illustrating a method of controlling a wheelchair 2. At step 510, a gesture is detected by gesture controller 40. This gesture may correspond to manipulation of the gesture controller 40 by the user and may be detected using one or more sensors in the gesture controller 40.
In response to detecting the gesture, the control system 260 references a database to determine an associated single or composite movement for the wheelchair 2 at step 520. By providing a database storing such associations, more complicated relationships between gestures and the movements of the wheelchair 2 can be used as compared to a system for which gestures were mapped directly to a motion of the wheelchair 2 in a direction of the gesture. Thus, the control of the wheelchair 2 can be adapted to be more appropriate for the user and their level of motor control/range of motion.
Based on the single or composite movement associated with the detected gesture, the control system 260 is arranged to control the wheelchair 2 to perform the movement. This may involve, for example, issuing control signals to a driving control unit 210 in order to effect the determined motion.
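A minimal sketch of this final step is given below, assuming a simple velocity-style interface to the driving control unit 210; the DrivingControlUnit class and the signal values are illustrative assumptions rather than the patent's actual interface.

    class DrivingControlUnit:
        def set_velocity(self, linear: float, angular: float) -> None:
            # In a real system this would drive the motors; here it just logs.
            print(f"drive: linear={linear:+.2f}, angular={angular:+.2f}")

    def perform(movement: str, dcu: DrivingControlUnit) -> None:
        """Issue control signals corresponding to a single movement."""
        signals = {
            "drive_forward": (0.5, 0.0),
            "drive_backward": (-0.5, 0.0),
            "turn_left": (0.0, 0.5),
            "turn_right": (0.0, -0.5),
        }
        if movement in signals:
            dcu.set_velocity(*signals[movement])
        # unmapped movements are ignored here; a real system might also stop

    perform("drive_forward", DrivingControlUnit())
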
Figure 6 is a flowchart illustrating a method of configuring a gesture controller 40 for a wheelchair 2. This process allows the stored association between gestures and single or composite movements of the wheelchair to be updated or modified so as to provide a convenient and appropriate set of mappings for the user.
At step 610, the gestures to be performed are identified based on the user's level of motor control. This may involve identifying the user's level of motor control, their range of motion, positions in which a gesture can comfortably be held, and/or movements that are comfortable to make. Based on this information, a set of gestures may be selected that the user wishes to have associated with movements to control the wheelchair 2. Thus, the user is able to choose comfortable and convenient gestures, tailored for them, to control the wheelchair 2.
At step 620, the gestures are associated with the single or composite movements of the wheelchair 2. The association may be based on providing an intuitive mapping between the gestures and the movements of the wheelchair 2 and/or making sure that the most common movements are associated with the most comfortable/easiest to perform gestures.
At step 630, this association between gestures and movements is stored in the database such that the association can be referenced by the control system 260 when interpreting the user's gestures. This method may be used to initially provision the database with an association between gestures and movements of the wheelchair 2, to update/modify an existing association, or to add a new set of associations for an additional mode of operation of the wheelchair 2.
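By way of illustration only, the configuration method of Figure 6 could be realised as below, assuming the database is a simple JSON document keyed by mode of operation; the file name, gesture labels and movement names are hypothetical. The gestures passed in are taken to have already been chosen for the user's comfort and range of motion (step 610), so the function covers associating them with movements and storing the result (steps 620 and 630).

    import json

    def configure(mode: str, associations: dict, path: str = "gesture_db.json") -> None:
        """Add or replace the set of gesture-to-movement associations for one
        mode of operation and store the updated database."""
        try:
            with open(path) as f:
                db = json.load(f)
        except FileNotFoundError:
            db = {}
        db[mode] = associations
        with open(path, "w") as f:
            json.dump(db, f, indent=2)

    # Example: the user finds small left/right tilts easiest, so those gestures
    # carry the most common movements in the 'at home' mode.
    configure("at_home", {"tilt_left": "drive_forward",
                          "tilt_right": "drive_backward",
                          "circle": "turn_around"})
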

Claims (23)

  1. A wheelchair comprising: a gesture controller to detect a gesture performed by a user, the gesture corresponding to manipulation of the gesture controller; and a control system having a database associating gestures with single or composite movements of the wheelchair; wherein the control system is responsive to detection of a given gesture by the gesture controller to reference the database to determine an associated movement for the given gesture and to control the wheelchair to perform the associated movement.
  2. The wheelchair according to claim 1, wherein: the gesture controller is configured to detect manipulation comprising unconstrained motion in three dimensions.
  3. The wheelchair according to claim 1 or claim 2, wherein: the gesture controller is configured to detect manipulation comprising motion with six degrees of freedom.
  4. The wheelchair according to any preceding claim, wherein: the gesture controller comprises an accelerometer, magnetometer and gyroscope to detect gestures performed by the user.
  5. The wheelchair according to any preceding claim, wherein: the single movements comprise driving or turning the wheelchair in a particular direction.
  6. The wheelchair according to any preceding claim, wherein: the database contains, as a composite movement, a movement to turn the wheelchair around.
  7. The wheelchair according to claim 6, wherein: the database associates a circular motion of the gesture controller with the movement to turn the wheelchair around.
  8. The wheelchair according to any preceding claim, wherein: the database contains, as a composite movement, an operation to mount a kerb.
  9. The wheelchair according to any preceding claim, wherein: the database contains two or more sets of associations between gestures and single or composite movements of the wheelchair corresponding to different modes of operation; and at least one gesture in a first mode of operation is associated with a different movement than a movement with which that gesture is associated in a second mode of operation.
  10. The wheelchair according to any preceding claim, wherein: the database stores information identifying a neutral orientation for the gesture controller; and determining the associated movement for the given gesture comprises identifying manipulation of the gesture controller relative to the neutral orientation.
  11. The wheelchair according to claim 10, wherein: the gesture controller comprises a button; and the control system is responsive to detecting the button being pressed to update the information identifying the neutral orientation to store, as a new neutral orientation, a current orientation of the gesture controller.
  12. The wheelchair according to any preceding claim, wherein: the database stores information identifying a sensitivity for the gesture controller; and determining the associated movement for the given gesture comprises determining a magnitude of the movement based on the sensitivity.
  13. The wheelchair according to claim 12, wherein: the sensitivity is a direction-dependent sensitivity and the magnitude of the associated movement is based on the sensitivity for a direction in which the gesture controller was manipulated.
  14. The wheelchair according to claim 13, wherein: the direction-dependent sensitivity indicates a higher sensitivity in forward, backward, left and right directions than in intermediate directions between the forward, backward, left and right directions.
  15. The wheelchair according to any preceding claim, wherein: the database stores direction remapping information associating gestures having a particular direction with movements of the wheelchair having a different direction.
  16. The wheelchair according to any preceding claim, wherein: for unmapped gestures for which an association with a movement of the wheelchair is absent from the database, the control system is responsive to detection of an unmapped gesture to suppress movement of the wheelchair.
  17. The wheelchair according to any preceding claim, wherein: the wheelchair comprises a display; and the control system is responsive to detection of the given gesture to display an indication of the associated movement determined by the control system on the display.
  18. The wheelchair according to any preceding claim, wherein: the control system is responsive to receiving association information associating gestures with single or composite movements of the wheelchair to update the database with the association information.
  19. The wheelchair according to claim 18, wherein: the control system is configured to present a command line interface and to receive the association information via the command line interface.
  20. The wheelchair according to claim 18, wherein: the control system is configured to receive the association information from a user device.
  21. An apparatus comprising: a gesture controller to detect a gesture performed by a user, the gesture corresponding to manipulation of the gesture controller; and a control system having a database associating gestures with single or composite movements of a wheelchair; wherein the control system is responsive to detection of a given gesture by the gesture controller to reference the database to determine an associated movement for the given gesture and to control the wheelchair to perform the associated movement.
  22. An apparatus comprising: a gesture controller to detect a gesture performed by a user, the gesture corresponding to manipulation of the gesture controller; and a control system having a database associating gestures with single or composite movements; wherein the control system is responsive to detection of a given gesture by the gesture controller to reference the database to determine an associated movement for the given gesture and to output control signals to indicate the associated movement.
  23. A method of configuring a gesture controller for a wheelchair, the method comprising: identifying gestures to be performed by a user with a gesture controller based on the user's level of motor control; associating the gestures with single or composite movements of the wheelchair; and storing the association between the gestures and the single or composite movements of the wheelchair in a database.
GB2113022.4A 2021-09-13 2021-09-13 Gesture controller Active GB2610629B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2113022.4A GB2610629B (en) 2021-09-13 2021-09-13 Gesture controller
PCT/GB2022/052287 WO2023037116A1 (en) 2021-09-13 2022-09-08 Gesture controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2113022.4A GB2610629B (en) 2021-09-13 2021-09-13 Gesture controller

Publications (3)

Publication Number Publication Date
GB202113022D0 GB202113022D0 (en) 2021-10-27
GB2610629A (en) 2023-03-15
GB2610629B GB2610629B (en) 2024-03-06

Family

ID=78149305

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2113022.4A Active GB2610629B (en) 2021-09-13 2021-09-13 Gesture controller

Country Status (2)

Country Link
GB (1) GB2610629B (en)
WO (1) WO2023037116A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170189250A1 (en) * 2013-12-05 2017-07-06 Now Technologies Zrt. Personal Vehicle, And Control Apparatus And Control Method Therefore
US20200117184A1 (en) * 2017-06-29 2020-04-16 Munevo Gmbh Specialist control for a wheelchair

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8502641B2 (en) * 2000-03-09 2013-08-06 Intelpro Llc Rate-of-change switches and controllable apparatus
KR20180094218A (en) * 2017-02-15 2018-08-23 피플리안주식회사 Driving Equipment of Wheelchairs using Smartphone Body Motions

Also Published As

Publication number Publication date
GB202113022D0 (en) 2021-10-27
WO2023037116A1 (en) 2023-03-16
GB2610629B (en) 2024-03-06

Similar Documents

Publication Publication Date Title
AU2020200343B2 (en) Immersive three-dimensional display for robotic surgery
EP3076906B1 (en) Personal vehicle, and control apparatus and control method therefore
JP5035303B2 (en) MOBILE BODY, SYSTEM INCLUDING THE SAME, MOBILE BODY OPERATING METHOD, AND PROGRAM
US11687073B2 (en) Specialist control for a wheelchair
SE533876C2 (en) Control and control system for a mobile disability aid
EP3197414B1 (en) Drive control system for powered wheelchair
WO2018031533A1 (en) Self-balancing wheelchair
US9904449B2 (en) Method for producing a control profile to operate a mobility device
US10882478B2 (en) Movement-based comfort adjustment
WO2021178425A1 (en) Hybrid wheelchair
US9393165B2 (en) Method for producing or calibrating a control profile for a wheelchair
US9383751B2 (en) Self operable wheelchair
GB2610629A (en) Gesture controller
JP5883939B2 (en) Electric wheelchair with auxiliary power, control terminal device and computer program
KR20180094218A (en) Driving Equipment of Wheelchairs using Smartphone Body Motions
CN107804388B (en) Transport system
GB2619929A (en) Mobility aid device
GB2610630A (en) Obstacle detection apparatus
GB2610631A (en) Seat monitoring apparatus
JP2020196334A (en) Vehicular control device
WO2017105169A1 (en) Automated walker assisted by a voice and electromyography mechanism

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: DUCHENNE UK

Free format text: FORMER OWNER: THE MOVEMENT FOR NON-MOBILE CHILDREN (WHIZZ-KIDZ),