GB2610374A - Human machine interface device - Google Patents

Human machine interface device


Publication number
GB2610374A
GB2610374A (application GB2111312.1A / GB202111312A)
Authority
GB
United Kingdom
Prior art keywords
interaction
human
machine interface
interface device
drive mechanism
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2111312.1A
Other versions
GB2610374B (en)
Inventor
Matthew Lawrence Fletcher Henry
Edward Buckley James
Kassim Mohamad David
Valentine Burton Christian
Tomlinson Ian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tacyx Ltd
Original Assignee
Tacyx Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tacyx Ltd filed Critical Tacyx Ltd
Priority to GB2111312.1A priority Critical patent/GB2610374B/en
Priority to GB2209138.3A priority patent/GB2610266A/en
Priority to PCT/EP2022/071983 priority patent/WO2023012286A1/en
Publication of GB2610374A publication Critical patent/GB2610374A/en
Application granted granted Critical
Publication of GB2610374B publication Critical patent/GB2610374B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00 Tactile signalling systems, e.g. personal calling systems

Abstract

A human-machine interface device 1 is provided for communicating touch interaction between two users. The device includes a drive mechanism 5, an interaction element 3, a sensor 15, a network connection 13, and a controller 11. The drive mechanism applies a drive force 7 to the interaction element. The sensor measures interaction by a user 8 with the interaction element so as to obtain interaction data. The network connection transmits the interaction data to a corresponding human-machine interface device and receives remote interaction data from the corresponding human-machine interface device which is indicative of a second user's interaction with the corresponding device. The controller controls the drive mechanism such that a force is applied to the interaction element based on the remote interaction data, providing haptic/tactile communication between users.

Description

HUMAN MACHINE INTERFACE DEVICE
Field of the Invention
The present invention relates to a human-machine interface device for transmitting and receiving a human interaction over a network, and to a system of human-machine interface devices.
Background
People are increasingly relying on the internet to communicate with each other through text, video, and audio exchanges. However, these exchange formats do not allow for tactile communication between people.
Therefore, a human-machine interface device is desired which can transmit and receive force and displacement information to and from a corresponding device in a remote location to replicate a form of human tactile communication. For instance, a user may move a local object. Force and displacement information is transmitted to a remote object such that the force applied to the local object is replicated in near real time at the remote object.
Correspondingly, another user may move the remote object which results in a corresponding force being applied to the local object. Thus, a feeling is produced of the two users holding the same object.
The Internet of Things includes many devices that allow users to remotely connect to, control, and move devices. For instance, the movement of a joystick can be used to control a drone or a camera. However, these devices do not provide two-way communication of force and displacement information between users.
Moreover, humans are very good at sensing small forces and displacements such as vibrations or oscillations of objects they are holding. Movements or displacements that are a result of device mechanics rather than a force applied by the remote user would be sensed by the local user and the tactile communication would not feel authentic. Additionally, humans are very good at sensing small delays in touch. For instance, humans can sense touch inputs that are spaced approx. 10ms apart. Thus, system latencies which are too large may be sensed by the local user and the tactile communication would not feel authentic. Therefore, a human-machine interface device is desired which is able to measure, communicate, and produce small forces and displacements in line with the human sense of touch.
Summary
Accordingly, in embodiments of a first aspect of the invention a human-machine interface device for communicating touch interaction between two users comprises: an interaction element, a drive mechanism configured to apply a drive force to the interaction element, a sensor configured to measure interaction by a user with the interaction element so as to obtain interaction data, a network connection configured to transmit the interaction data to a corresponding human-machine interface device and receive remote interaction data from the corresponding human-machine interface device which is indicative of a second user's interaction with the corresponding device, and a controller configured to control the drive mechanism such that a drive force is applied to the interaction element based on the remote interaction data.
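The relationship between these components can be sketched in Python; all class, field, and method names here are illustrative and do not appear in the specification:

```python
from dataclasses import dataclass

@dataclass
class InteractionData:
    """Hypothetical payload exchanged between corresponding devices."""
    position_mm: float       # current position of the interaction element
    external_force_n: float  # external force applied by the local user

class HapticDevice:
    """Sketch of one human-machine interface device of the first aspect."""
    def __init__(self, sensor, drive, network):
        self.sensor = sensor    # measures the user's interaction
        self.drive = drive      # applies the drive force to the element
        self.network = network  # exchanges data with the corresponding device

    def tick(self):
        # Measure local interaction and transmit it to the corresponding device.
        local = InteractionData(self.drive.position(), self.sensor.read())
        self.network.send(local)
        # Receive the second user's interaction and drive the element from it.
        remote = self.network.receive()
        self.drive.apply_force_for(remote)
```

The `sensor`, `drive`, and `network` objects stand in for the force sensor, drive mechanism, and network connection of the claim; any concrete implementation would supply those.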
The sensor may be separate to the drive mechanism. Providing a sensor separate to, i.e. physically distinct from, the drive mechanism facilitates more sensitive measurement of the interaction by a user. This is because a separate sensor is less likely to measure mechanical interactions caused by the drive mechanism itself rather than by the user.
The invention includes the combination of the aspects and preferred features described except where such a combination is clearly impermissible or expressly avoided.
Optional features of the invention will now be set out. These are applicable singly or in any combination with any aspect of the invention.
The sensor may be a force sensor configured to measure an external force applied to the interaction element by the user, the interaction data comprising the external force. For example, the force sensor may comprise a load cell, strain gauge or force sensing resistor.
The remote interaction data may also comprise a force corresponding to an external force measured on the corresponding human-machine interface device. The remote interaction data may also comprise telemetry information including a position and/or velocity of the remote interaction element.
The interaction element may move at a velocity according to the net sum of the drive force and the external force applied by the user. For instance, if an external force is applied which is in opposition to and in excess of the drive force, the interaction element may be allowed to slip in the drive mechanism or manually drive the drive mechanism in reverse. Thus, the interaction element is moved in an opposite direction to the direction in which the drive mechanism is driving it.
Alternatively, the sensor may be a movement sensor configured to measure movement or displacement of the interaction element due to the drive force and an external force. For instance, the movement sensor may be an IMU (inertial measurement unit) or a range-finder. Conveniently, the remote interaction data may comprise movement or displacement information measured by a movement sensor on the corresponding human-machine interface device.
The sensor may be positioned on an external face of the interaction element. This can result in a more accurate measurement of the interaction by the user than if the sensor was located elsewhere, for instance proximal to the drive mechanism. For instance, if the sensor is a force sensor configured to measure an external force applied by a user, the user can apply the external force directly to the force sensor, for example by pushing on it.
The interaction element may be a pin configured to move reciprocally along its longitudinal axis. In other examples, the interaction element may be a panel, a member, a surface, or another geometry capable of being moved in response to both the local user and the remote interaction data.
The drive mechanism may be a stepper motor. Alternatively, the drive mechanism may be a brushless motor, a servo motor, a hydraulic piston, or a pneumatic piston. The drive mechanism may also control the interaction element using electromagnets. In this case, a received force from the corresponding human-machine interaction device may be replicated by controlling a variable resistance which resists the movement of the interaction element. The drive mechanism may comprise a separate position encoder to measure a current position of the interaction element.
Alternatively, the interaction element may comprise one or more flexible membranes encasing a fluid. In this case, the drive mechanism may be a pump which can increase or decrease the fluid pressure inside the flexible membrane by draining or adding fluid. The sensor may be a pressure sensor or, alternatively, the sensor may measure an amount of fluid which is entering or exiting the space defined by the flexible membrane via an exit conduit.
Conveniently, the interaction data may comprise a current position of the interaction element. Accordingly, the remote interaction data may also comprise a current position of an interaction element on the corresponding human-machine interface device.
The controller may comprise a memory device for storing a previous position and/or telemetry information of the interaction element. A position and/or telemetry information received as remote interaction data may also be stored.
The controller may be configured to: determine a target position for the interaction element based on the interaction data and the remote interaction data, and control the drive mechanism such that it attempts to move the interaction element towards the target position.
In this way, the system is sensitive to small adjustments and may be adjusted in a manner which feels authentic. Alternatively, a drive force or torque may be determined based on the interaction data and the remote interaction data. Thus, the controller may be configured to control the drive mechanism such that it attempts to move the interaction element according to a target force or torque. The controller may be configured to control the drive mechanism such that the drive force applied to the interaction element is adjusted to simulate a desired virtual mass of the interaction element. The drive mechanism may simulate the desired virtual mass by providing an additional or reduced drive force to simulate an additional inertial component of the interaction element. The additional drive force required to simulate the additional inertial component may be calculated based on: F(t) = Mv·a(t), where Mv is the desired virtual mass and a(t) is the current acceleration of the interaction element.
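As an illustrative sketch, the additional inertial force of the equation above could be computed as follows; the function name is an assumption, not taken from the specification:

```python
def additional_inertial_force(m_v: float, a: float) -> float:
    """F(t) = Mv * a(t): extra drive force added (or, if negative,
    subtracted) so the interaction element feels heavier or lighter
    than its real mass. Illustrative sketch only."""
    return m_v * a
```

A positive virtual mass resists acceleration (the element feels heavier); a negative value would assist it, making the element feel lighter.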
The controller may be configured to control the drive mechanism using a PID controller. Thus, the control of the interaction element can be tuned to optimise the movement of the interaction element. For instance, preference can be given to movements caused by human users and inconsistencies caused by mechanical elements can be filtered out.
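A minimal positional PID of the kind the controller might use can be sketched as follows; the gains, the 1 ms time step, and all names are illustrative rather than taken from the specification:

```python
class PID:
    """Minimal positional PID controller (illustrative sketch)."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float = 0.001):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt  # dt = 1 ms tick
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_pos: float, current_pos: float) -> float:
        """Return a drive command moving the element towards target_pos."""
        error = target_pos - current_pos
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Tuning the gains is how preference could be given to user-driven movements while mechanical inconsistencies are damped out, as the paragraph above suggests.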
The human-machine interface device may further comprise one or more additional interaction elements, sensors and drive mechanisms. For instance, an array of pins may be provided to sense and replicate more complicated interactions by the user.
In a second aspect, a system is provided comprising two or more human-machine interface devices of the first aspect wherein: the devices are linked by a network, and the interaction data from each device form the remote interaction data for the or each of the remaining devices.
The system may include a central processing unit linked to the devices by the network. The central processing unit may be configured in the same manner as the first aspect. For instance, the central processing unit may be configured to provide instructions to the controller of a given device such that the controller controls the drive mechanism such that a drive force is applied to the interaction element based on remote interaction data from another given device.
In a third aspect, a method is provided of using the human-machine interface device of the first aspect, comprising: obtaining interaction data from the sensor, transmitting the interaction data to the corresponding human-machine interface device via the network connection, receiving remote interaction data from the corresponding human-machine interface device via the network connection, and controlling the drive mechanism such that a drive force is applied to the interaction element based on the remote interaction data.
Brief Description of the Drawings
Embodiments of the invention will now be described by way of example with reference to the accompanying drawings in which: Figure 1 shows a schematic of a human-machine interface device; Figure 2 shows a schematic of two human-machine interaction devices connected by a network; Figures 3A and 3B show an external perspective view and an internal perspective view respectively of a human-machine interface device; Figures 4A to 4D show a perspective view, two different sectioned perspective views, and a sectioned side elevation respectively of the human-machine interface device of Figure 3A with a pin guard omitted; Figures 5A and 5B show perspective views of the drive mechanism and pin of the human-machine interface device of Figure 3A and Figure 5C shows a detailed view of region A of Figure 5B; Figures 6A and 6B show graphs of the time-varying position of interaction elements on local and remote human-machine interface devices when the devices are idling; the human-machine interface devices of Figure 6A comprising built-in force sensors in the drive mechanisms and the human-machine interface devices of Figure 6B comprising separate force sensors on the interaction elements; Figures 7A and 7B show graphs of the time-varying position of interaction elements on local and remote human-machine interface devices when a user is holding one of the interaction elements; the human-machine interface devices of Figure 7A comprising built-in force sensors in the drive mechanisms and the human-machine interface devices of Figure 7B comprising separate force sensors on the interaction elements; Figures 8A and 8B show graphs of the time-varying position of interaction elements on local and remote human-machine interface devices when one of the interaction elements encounters an obstacle; the human-machine interface devices of Figure 8A comprising built-in force sensors in the drive mechanisms and the human-machine interface devices of Figure 8B comprising separate force sensors on the 
interaction elements; Figure 9 shows a diagram of two interaction elements and parameters which are monitored to form interaction data and remote interaction data; and Figure 10 shows a flow chart of the control software.
Detailed Description and Further Optional Features
Aspects and embodiments of the present invention will now be discussed with reference to the accompanying figures. Further aspects and embodiments will be apparent to those skilled in the art.
Figure 1 shows a schematic of a human-machine interface device 1. The device comprises an interaction element 3 such as a pin or rod. As discussed above, other interaction elements may be used such as surfaces, panels, members, or other suitable geometries. In further embodiments, the interaction element comprises one or more flexible membranes encasing a fluid. In these embodiments, the drive mechanism may be a pump which can increase or decrease the fluid pressure inside the flexible membrane by draining or adding fluid. A drive mechanism in the form of a motor 5 provides a drive force 7 to the interaction element which causes it to move. The interaction element may also be moved manually by a user applying an external force 9 to the interaction element. Thus, the motion of the interaction element depends on the direction and magnitude of the sum of the drive force and the external force.
A controller 11 is provided to control the drive mechanism 5 and the drive force 7. A position encoder (not shown, but which may be located within the motors themselves) reports a current location of the interaction element 3 to the controller. Additionally, a sensor 15 is provided, physically separate to the drive mechanism, which measures a level of interaction by the user. Typically the sensor is a force sensor, such as a load cell, which directly measures the external force 9 applied by the user to the interaction element. For example, the force sensor may be a Richmond 210 in-line load cell which is capable of measuring forces of 0 to 100N. The measured external force is reported to the controller. In embodiments where the interaction elements comprise one or more flexible membranes encasing a fluid, the sensor may be a pressure sensor configured to sense the pressure of the fluid, or the sensor may measure an amount of fluid entering or exiting the space defined by the flexible membrane via an exit conduit.
The measured position and the measured external force form interaction data which the controller provides to a network socket 13. The network socket 13 also allows for remote interaction data to be received from a remote human-interface device. A network interface is provided to allow the user to input desired system parameters such as a virtual mass or friction information (discussed below). For example, the network interface may be an external computer which connects to the controller via the network or a data input device which is installed on the human-machine interface device itself.
Figure 2 shows a schematic of a system comprising a local human-machine interface device 1A connected to a remote human-machine interface device 1B by a network 21. Typically, the network is the internet. However, the network could also be a private wired or wireless connection. The interaction data provided, via the network socket 13B, to the controller 11B of the remote device form the remote interaction data received by the controller 11A, via the network socket 13A, of the local device. Thus, the remote interaction data contain a measured position of a remote interaction element 3B of the remote device and a measured external force 9B applied to the remote interaction element.
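The interaction data exchanged over the network sockets might be serialised as a small timestamped message; the field names and the JSON encoding below are assumptions for illustration only, as the specification only requires that position and measured external force are exchanged:

```python
import json
import time

def pack_interaction_data(position_mm: float, force_n: float) -> bytes:
    """Serialise local interaction data for the network socket (sketch)."""
    return json.dumps({
        "t": time.time(),  # timestamp, so the receiver can order samples
        "x": position_mm,  # measured position of the interaction element
        "f": force_n,      # external force measured by the force sensor
    }).encode("utf-8")

def unpack_interaction_data(payload: bytes):
    """Recover the remote position and external force from a message."""
    msg = json.loads(payload.decode("utf-8"))
    return msg["x"], msg["f"]
```

A compact binary encoding would likely be preferred in practice given the latency sensitivity discussed in the background section.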
The controller controls the drive mechanism 5A of the local device 1A such that the movement of the local interaction element 3A mirrors the movement of the remote interaction element 3B, informed by the remote interaction data. The interaction data and the remote interaction data are communicated and updated continuously so that a movement of the interaction element on one device results in a corresponding movement of the interaction element on the other device. In some scenarios, users may simultaneously apply external forces 9A, 9B to the interaction elements 3A, 3B of both devices 1A, 1B. In this case, the drive mechanisms 5A, 5B of each device would assert an opposing or assisting drive force 7A, 7B on each device which mimics the measured external force of the other device. Thus, the drive mechanisms provide a resistance to the users, replicating a feeling that both users are interacting with the same interaction element.
Additionally, in some scenarios a user may apply an external force to a local interaction element which is in opposition to and in excess of the drive force. In this case, the interaction element may be allowed to drive the drive mechanism in reverse. In this way, the position of the interaction element may be continuously monitored as it is pushed in the reverse direction, preventing measurement gaps or backlash in the system. A user would perceive backlash as a dead band in the system which would cause the touch interaction to feel less authentic. Therefore, preventing backlash in this way creates a more authentic user experience. Alternatively, the interaction element may be allowed to slip in the drive mechanism and its position may be remeasured to prevent backlash.
Figures 3A and 3B show an external perspective view and an internal perspective view, respectively, of a human-machine interface device. Additionally, Figures 4A to 4D show a perspective view, two different sectioned perspective views, and a sectioned side elevation, respectively, of the human-machine interface device of Figure 3A. Figures 5A and 5B show perspective views of the drive mechanism of the human-machine interface device of Figure 3A and Figure 5C shows a detailed view of region A of Figure 5B.
The drive mechanism 5 and the controller 11 are contained in a housing 23. The interaction element is a single pin 3 extending from the housing. The pin is protected by a guard 25. The drive mechanism 5 is arranged to drive the pin reciprocally along its longitudinal axis such that the pin moves in and out of the device housing. Thus, in this example, the user can interact with the device by pushing on the pin.
The drive mechanism 5 is typically a brushless DC motor housed in a motor unit 27, with an internal position encoder. However, alternative drive mechanisms are possible. For instance, the drive mechanism may be a stepper motor, a servo motor, a hydraulic piston, or a pneumatic piston. The drive mechanism may also control the interaction element using electromagnets. In this case, a received force from the remote human-machine interaction device may be replicated by controlling a variable resistance which resists the movement of the interaction element. Bearing carriers 29 support the pin 3 on either side of the drive mechanism 5. Inside the bearing carriers, a preload screw 34 suspends a preload spring 35 which in turn suspends a vertically floating bearing housing 36 above the interaction element. Fixed bearing housings 37 support the interaction element from below. The drive mechanism drives the pin using a drive wheel 33 below the interaction element and a drive wheel 32 above the interaction element which is directly fitted to the motor shaft. The drive wheels are held under tension via spring mounting holes 38 and the preload springs 35 in a manner similar to a locomotive drive system. This ensures good contact with the interaction element and prevents slippage of the interaction element. A flexural motor and drive bracket 39 is fixed at one end to the housing of the human-machine interaction device to allow for small deviations in the height of the pin 3. A soft-stop damper 31 is provided to limit the maximum displacement of the pin to prevent it moving further than the operational range of the drive mechanism.
A force sensor 15 is positioned on the external facing end of the pin 3. Therefore, when the user presses on the pin they apply an external force directly to the sensor. The force sensor is an inline load cell capable of sensing forces between 0N and 100N. However, other force sensors may be used.
Some motor units 27 may comprise a built-in force (or torque) sensor which may perform a similar force sensing function to the force sensor 15. However, providing a sensor which is separate to the drive mechanism 5 allows a more sensitive sensor to be used. Furthermore, the external forces can be measured independently of downstream mass and friction influences as a result of the drive mechanism 5 and bearing carriers 29. Thus, including a force sensor which is separate to the drive mechanism enables more accurate and precise measurements of the external force and a better user experience.
This effect is shown in the experimental results of Figures 6 to 8.
Figures 6A and 6B show graphs of the time-varying positions of pins on local and remote devices when the devices are idling with no external forces being applied. In Figure 6A the force sensor is a built-in torque sensor included in the drive mechanism itself. As shown in Figure 6A, the pin position slowly deviates without any user input. In addition to this deviation, there is a small (approx. 0.5 mm amplitude) oscillatory movement of the pin. In contrast, Figure 6B shows time-varying pin positions for devices installed with separate force sensors which are positioned on the external facing ends of the pins. Moreover, in Figure 6B low-level noise in the force measurements is eliminated by applying signal thresholding. In this case, the pins remain almost stationary when the devices are idling, without moving significantly from their initial positions.
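The signal thresholding mentioned above can be sketched as a simple dead band applied to the force readings; the 0.05 N noise floor is an illustrative value, not one from the specification:

```python
def threshold_force(raw_force_n: float, noise_floor_n: float = 0.05) -> float:
    """Suppress low-level sensor noise with a dead band (sketch).

    Readings whose magnitude is below the noise floor are treated as
    zero, so that idling devices do not drift in response to noise.
    """
    return raw_force_n if abs(raw_force_n) > noise_floor_n else 0.0
```

With this in place, only deliberate user forces propagate into the control loop, which is consistent with the near-stationary idle behaviour shown in Figure 6B.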
Similarly, Figures 7A and 7B show graphs of the time-varying positions of pins on local and remote devices when a user is holding one of the pins but is not intentionally applying a force. In Figure 7A significant oscillation and deviation of the pins is shown, whereas in Figure 7B the system is much more stable.
Figures 8A and 8B show the pin positions of local and remote pins (drive 0 and drive 1) when one pin (drive 1) encounters an obstacle. The resultant force measured with the force sensors is also shown. In Figure 8A the force sensor is built-in to the drive mechanism and both pins display large oscillations. However, in Figure 8B separate force sensors are provided on the external facing ends of the pins. Here, the system is much more stable and the pin (drive 1) which encounters the obstacle settles against the obstacle much more quickly than in Figure 8A. Separating the force sensors from the drive mechanism effectively decouples the drive system and the force sensing system, reducing the potential for undesirable feedback in the control algorithm (described below).
With reference back to Figure 2, the controller 11A controls the drive mechanism 5A according to the interaction data and the remote interaction data. Corresponding processing is performed by a controller 11B on the remote human-interface device to control the remote drive mechanism 5B. In some variants, a common controller may be provided which connects to both human-interface devices via the network and generates control instructions for the drive mechanisms of both devices. This common controller could, for example, be provided as a remote server or computing device which is arranged to receive interaction data from both network sockets 13A 13B.
The drive mechanism 5 in some examples comprises a motor controller (not shown) which is separate to the main controller 11. The motor controller may be, for example, a Faulhaber MC 5010 S motion controller which communicates with the main controller 11 over ethernet. An EtherCAT network (https://www.ethercat.org) may be used to exchange data between the main controller and the motor controller, where the main controller is a master device and the motor controllers are slave devices. The motor controller reports motor data to the controller and receives control instructions from the controller. The control instructions comprise a desired target position for the pin. The controller 11 updates the control instructions for the drive mechanism at regular intervals determined by a system tick. Typically, this happens every 1ms. However, this interval may be adjusted depending on a desired update rate and the processing capabilities of the controller.
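The periodic update driven by the system tick could be sketched as follows, assuming the 1 ms default tick mentioned above; the scheduling details and names are illustrative, not taken from the specification:

```python
import time

def control_loop(controller_step, tick_s: float = 0.001, ticks: int = 1000):
    """Run the controller update at a fixed system tick (sketch).

    controller_step() stands in for one update: sampling sensor and motor
    data and issuing a new target position to the motor controller.
    """
    next_deadline = time.monotonic()
    for _ in range(ticks):
        controller_step()
        # Sleep only for the remainder of the tick, so the update rate
        # stays fixed even when controller_step() takes variable time.
        next_deadline += tick_s
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)
```

A production controller would more likely use a real-time scheduler or the EtherCAT cycle itself as the timebase; this sketch only illustrates the fixed-interval structure.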
Figure 9 shows a diagram of two interaction elements (pin 1 and pin 2, which correspond to the local and remote interaction elements 3A, 3B in Figure 2). Also shown in Figure 9 are some of the system parameters which are measured to form interaction data and remote interaction data. The interaction data comprise the motor data reported to the controller 11A from the drive mechanism 5A, and sensor data reported by the force sensor 15A. The motor data include at least the current position x1 of the local interaction element 3A. The motor data may also include the current force or torque being exerted by the drive mechanism 5A, the current velocity v1 of the interaction element and the current acceleration a1 of the interaction element. The sensor data comprise the current external force 9A, F1, being applied to the interaction element, as measured by the force sensor. Typically, the external force is reported as an amplified analogue voltage between 0V and 10V.
Additionally, storage is provided to store previous telemetry information about the interaction element. The previous telemetry information is typically previously received interaction data and includes, at least, a previous position of the interaction element 3A which was cached during the previous system tick. This may also include the previous velocity of the interaction element, the acceleration of the interaction element and the calculated net force which was applied to the interaction element. As has been discussed previously, this local interaction data (comprising the motor data from the drive mechanism 5A) are transmitted to the remote human-interface device.
The remote interaction data, received at the local human-machine interface device, comprise at least the position x2 of the remote interaction element at a specified time. The remote interaction data may also comprise any of: the remote external force 9B, F2, applied to the remote interaction element 3B as measured by the remote sensor 15B, the previous position of the remote interaction element, the force or torque being exerted by the remote drive mechanism at the specified time, the velocity v2 of the remote interaction element at the specified time, the acceleration a2 of the remote interaction element at the specified time, and any other status information from the remote human-interface device. Alternatively, the position x2 may be used, along with the previous position, to calculate the velocity v2 and the acceleration a2 of the remote interaction element. This can give more accurate estimates of the velocity and acceleration than, for example, using built-in estimates of velocity and acceleration from the drive mechanism or position encoder.
Additionally, the user may input some configurable parameters to the system at start-up via the network interface. For instance, these may include real mass Mr, virtual mass Mv, dynamic friction, and static friction. Accounting for real mass and friction allows the controller to quantify and correct for the mass and friction of the interaction elements resulting in a higher fidelity system. The virtual mass is an intended simulated mass of the interaction element. This enables the user to choose how 'heavy' the interaction element should feel. To simulate a virtual mass the drive mechanism may resist or assist movement of the interaction element to simulate added or subtracted inertia.
On start-up, the controller runs an initialisation sequence to calibrate the parameters being received from the drive mechanisms 5 and the force sensors 15. During this initialisation sequence, the controller 11 assumes the interaction element 3 is at a zero position and no external force 9 is being applied. Thus, subsequent position changes of the interaction element are measured relative to the zero position. The external force measurements (if not zero) reported by the force sensors 15 during initialisation are designated as systematic offsets and are subtracted from subsequent external force measurements.
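The initialisation sequence amounts to capturing zero offsets and subtracting them from subsequent measurements, which can be sketched as follows (all names illustrative):

```python
def calibrate(force_samples, position_at_start):
    """Derive zero offsets from the initialisation sequence (sketch).

    During initialisation the element is assumed to sit at the zero
    position with no external force applied, so the mean force reading
    is treated as a systematic offset.
    """
    force_offset = sum(force_samples) / len(force_samples)
    zero_position = position_at_start
    return force_offset, zero_position

def corrected_force(raw_force, force_offset):
    """External force with the systematic offset subtracted."""
    return raw_force - force_offset

def relative_position(raw_position, zero_position):
    """Position measured relative to the calibrated zero position."""
    return raw_position - zero_position
```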
Figure 10 shows a flow chart of the control software which is run by the controller 11 every system tick to update the control instructions for the drive mechanism 7. In this example, a system controller is provided on a local human-machine interface device which comprises "motor controller 1" and "force sensor 1". Remote interaction data are provided from a remote human-machine interface device which comprises "motor controller 2" and "force sensor 2".
First, in steps 101 to 104 the motor controllers and the force sensors on each human-machine interface device report motor data and sensor data to the controller. The controller samples the incoming motor data and sensor data in steps 106 and 107. Simultaneously, the controller also collects previous pin telemetry information in step 105, including a previous pin position from the last system tick. Next, in steps 108 and then 109, the controller uses the current position x1 and the previous position of the pin to estimate the velocity v1 and acceleration a1 of the pin based on how far it moved since the previous system tick.
In step 110, an inertial component is calculated by combining the acceleration a1 with a parameter Mr which represents a real mass of the pin. The local and remote pins are considered as two ends of the same pin and are intended to move with the same velocity and acceleration. Therefore, the inertial component represents an inertial component of the same pin, comprising inertia from both the local and remote pin. The remote external force F2(t) and the local external force F1(t), as measured by the force sensors, are combined with the inertial component and an estimated friction component to produce a net system force FT. The net force FT is given by:

FT(t) = F1(t) + F2(t) - Mr a1(t) - v(t)/k = Mv a(t)

where F1(t) is the external force measured by the force sensor; F2(t) is the remote external force received as remote interaction data; Mr a1(t) represents the inertia of the combined local and remote pins and the motor assemblies, where a1(t) is the calculated acceleration of one or both of the pins and Mr is the actual mass of the combined local and remote pins; v(t)/k is a modelled friction component of the motor assemblies, where k is the coefficient of dynamic friction and v(t) is the velocity of the combined local and remote pins; and Mv is the virtual mass of the combined local and remote pins.
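Read literally, the net system force is the sum of the two measured external forces minus inertial and friction corrections. A one-line sketch (hypothetical function name; the signs of the correction terms and the v(t)/k friction form are one reading of the equation above):

```python
def net_force(f1, f2, a1, v, m_r, k):
    """Net system force: FT = F1 + F2 - Mr*a1 - v/k.

    f1, f2 : local and remote measured external forces
    a1     : calculated pin acceleration
    v      : pin velocity
    m_r    : real mass Mr of the combined pins
    k      : coefficient of dynamic friction (assumed v/k form)
    """
    return f1 + f2 - m_r * a1 - v / k
```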
In step 111, a target position for each pin is calculated by double integrating the net system force, accounting for modelled system parameters such as the virtual mass. The controller then dispatches updated telemetry information to the motor controllers instructing the drive mechanism to move the interaction element to the target position.
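The double integration can be discretised per system tick; semi-implicit Euler is one common choice, used here as an assumption since the disclosure does not specify an integration scheme.

```python
def target_position(x, v, f_t, m_v, dt):
    """Double-integrate a = FT/Mv over one tick (semi-implicit Euler).

    x, v : current pin position and velocity
    f_t  : net system force FT
    m_v  : virtual mass Mv
    dt   : tick duration in seconds
    Returns the new (position, velocity) pair.
    """
    a = f_t / m_v            # acceleration of the simulated virtual mass
    v_new = v + a * dt       # first integration: velocity
    x_new = x + v_new * dt   # second integration: target position
    return x_new, v_new
```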
In steps 112 and 113 the motor controller receives the updated telemetry information and instructs the drive mechanism, which then attempts to move the pin to the target position.
In step 114, the most recently sampled current position of the pin is cached in the storage to repeat this processing during the next system tick. Step 114 may be performed in parallel to step 113 (or may be performed after, or before, so long as the target position has been calculated).

Corresponding calculations are performed by a corresponding controller on the remote human-machine interface device such that the interaction element of the remote device is moved to a corresponding target position according to the net system force FT and the current position x2 and previous position of the remote interaction element. In this way, the target positions sent to each drive mechanism can be updated according to the applied external forces on both interaction elements and a desired simulated mass of the interaction elements.
Alternatively, an additional central processor may be provided that is connected, via the network, to both human-machine interaction devices. Here the central controller receives remote interaction data from both human-machine interaction devices and calculates updated telemetry information which is sent to the motor controller of both devices.
In some variants, a target torque or force for the drive mechanism may be calculated as opposed to a target position. In this case the target force or torque may be directly inferred from the net system force equation without the need to double integrate the equations.
In another variant, control of the drive mechanism to achieve the desired target position (or force or torque) may be handled by a software implemented PID controller. The PID controller can be tuned to continuously adjust and update the applied drive force to minimise the difference between the actual position of the interaction element and the target position of the interaction element.
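A software-implemented PID controller of the kind described might look like the following sketch; the class structure and gains are illustrative assumptions, not the patent's implementation.

```python
# Minimal PID controller: adjusts the applied drive force to minimise
# the difference between the actual and target positions of the
# interaction element. Gains kp, ki, kd would be tuned per device.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, actual, dt):
        """Return the drive-force correction for one tick of length dt."""
        error = target - actual
        self.integral += error * dt                    # accumulated error
        derivative = (error - self.prev_error) / dt    # error rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```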
Additional processing may be included to predict the motions of remote interaction elements in advance. This may limit the effects of network latency on the device response time to make the human-machine interactions feel more authentic. This may involve a machine learning algorithm, such as a neural network, trained to predict motion of the interaction elements based on past motions.
Alternative embodiments of the human-machine interface device may comprise a plurality of interaction elements and drive mechanisms. For instance, an array of pins may be provided which can sense and transmit more detail about a user's movements.
The features disclosed in the description, or in the following claims, or in the accompanying drawings, expressed in their specific forms or in terms of a means for performing the disclosed function, or a method or process for obtaining the disclosed results, as appropriate, may, separately, or in any combination of such features, be utilised for realising the invention in diverse forms thereof.
While the invention has been described in conjunction with the exemplary embodiments described above, many equivalent modifications and variations will be apparent to those skilled in the art when given this disclosure. Accordingly, the exemplary embodiments of the invention set forth above are considered to be illustrative and not limiting. Various changes to the described embodiments may be made without departing from the spirit and scope of the invention.
For the avoidance of any doubt, any theoretical explanations provided herein are provided for the purposes of improving the understanding of a reader. The inventors do not wish to be bound by any of these theoretical explanations.
Any section headings used herein are for organizational purposes only and are not to be construed as limiting the subject matter described.
Throughout this specification, including the claims which follow, unless the context requires otherwise, the words "comprise" and "include", and variations such as "comprises", "comprising", and "including" will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
It must be noted that, as used in the specification and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. Ranges may be expressed herein as from "about" one particular value, and/or to "about" another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by the use of the antecedent "about," it will be understood that the particular value forms another embodiment. The term "about" in relation to a numerical value is optional and means for example +/-10%.

Claims (15)

1. A human-machine interface device for communicating touch interaction between two users comprising: an interaction element, a drive mechanism configured to apply a drive force to the interaction element, a sensor, separate to the drive mechanism, configured to measure interaction by a user with the interaction element so as to obtain interaction data, a network connection configured to transmit the interaction data to a corresponding human-machine interface device and receive remote interaction data from the corresponding human-machine interface device which is indicative of a second user's interaction with the corresponding device, and a controller configured to control the drive mechanism such that a drive force is applied to the interaction element based on the remote interaction data.
2. The human-machine interface device of claim 1 wherein the sensor is separate to the drive mechanism.
3. The human-machine interface device of claim 1 or 2 wherein the sensor is a force sensor configured to measure an external force applied to the interaction element by the user, the interaction data comprising the external force.
4. The human-machine interface device of any preceding claim wherein the remote interaction data comprise a force.
5. The human-machine interface device of any of claims 2 to 4 wherein the sensor is positioned on an external face of the interaction element.
6. The human-machine interface device of any preceding claim wherein the interaction element is a pin configured to move reciprocally along its longitudinal axis.
7. The human-machine interface device of any preceding claim wherein the drive mechanism is a stepper motor.
8. The human-machine interface device of any preceding claim wherein the interaction data comprise a current position of the interaction element, and the remote interaction data comprise a current position of an interaction element on the corresponding human-machine interface device.
9. The human-machine interface device of any preceding claim wherein the controller comprises a memory device for storing a previous position and/or telemetry information of the interaction element.
10. The human-machine interface device of any preceding claim wherein the controller is configured to: determine a target position for the interaction element based on the interaction data and the remote interaction data, and control the drive mechanism such that it attempts to move the interaction element towards the target position.
11. The human-machine interface device of any preceding claim wherein the controller is configured to control the drive mechanism such that the drive force applied to the interaction element is adjusted to simulate a desired virtual mass of the interaction element.
12. The human-machine interface device of any preceding claim wherein the controller is configured to control the drive mechanism using a PID controller.
13. The human-machine interface device of any preceding claim further comprising one or more additional interaction elements, sensors and drive mechanisms.
14. A system comprising two or more human-machine interface devices of any preceding claim wherein: the devices are linked by a network, and the interaction data from each device form the remote interaction data for the or each of the remaining devices.
15. A method of using the human-machine interface device of claim 1 comprising: obtaining interaction data from the sensor, transmitting the interaction data to the corresponding human-machine interface device via the network connection, receiving remote interaction data from the corresponding human-machine interface device via the network connection, and controlling the drive mechanism such that a drive force is applied to the interaction element based on the remote interaction data.
GB2111312.1A 2021-08-05 2021-08-05 Human machine interface device Active GB2610374B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB2111312.1A GB2610374B (en) 2021-08-05 2021-08-05 Human machine interface device
GB2209138.3A GB2610266A (en) 2021-08-05 2022-06-22 Human machine interface device
PCT/EP2022/071983 WO2023012286A1 (en) 2021-08-05 2022-08-04 Human machine interface device for communicating touch interaction

Publications (2)

Publication Number Publication Date
GB2610374A true GB2610374A (en) 2023-03-08
GB2610374B GB2610374B (en) 2024-04-10

Family

ID=82705424

Family Applications (2)

Application Number Title Priority Date Filing Date
GB2111312.1A Active GB2610374B (en) 2021-08-05 2021-08-05 Human machine interface device
GB2209138.3A Pending GB2610266A (en) 2021-08-05 2022-06-22 Human machine interface device

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB2209138.3A Pending GB2610266A (en) 2021-08-05 2022-06-22 Human machine interface device

Country Status (1)

Country Link
GB (2) GB2610374B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5984880A (en) * 1998-01-20 1999-11-16 Lander; Ralph H Tactile feedback controlled by various medium
US20020082724A1 (en) * 2000-11-15 2002-06-27 Bernard Hennion Force feedback member control method and system
US6639582B1 (en) * 2000-08-10 2003-10-28 International Business Machines Corporation System for combining haptic sensory-motor effects from two separate input devices into resultant sensory-motor effects and for feedback of such resultant effects between the input devices
US20050235032A1 (en) * 2004-04-15 2005-10-20 Mason Wallace R Iii System and method for haptic based conferencing
US20120142416A1 (en) * 2010-06-01 2012-06-07 Joutras Frank E Simulated recreational, training and exercise system
US8508469B1 (en) * 1995-12-01 2013-08-13 Immersion Corporation Networked applications including haptic feedback
US20170193767A1 (en) * 2015-12-30 2017-07-06 Parihug Haptic communication device and system for transmitting haptic interaction
US20200257362A1 (en) * 2016-06-26 2020-08-13 Apple Inc. Wearable interactive user interface


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
5 September 2004, "What is Telehaptics", Interactivity Consultants, [online], Available from: URL: https://web.archive.org/web/20040905011754/http://www.interactivityconsultants.com/pages/telehaptics/telehaptics_defined.htm [Accessed 20 January 2022] *

Also Published As

Publication number Publication date
GB2610266A (en) 2023-03-01
GB2610374B (en) 2024-04-10
GB202209138D0 (en) 2022-08-10
