GB2579846A - Improvements in or relating to tactile sensing - Google Patents

Improvements in or relating to tactile sensing

Info

Publication number
GB2579846A
GB2579846A GB1820588.0A GB201820588A
Authority
GB
United Kingdom
Prior art keywords
tactile
events
pixel
taxel
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1820588.0A
Other versions
GB201820588D0 (en)
Inventor
Lepora Nathan
Ward-Cherrier Benjamin
Bamford Simeon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Bristol
Original Assignee
University of Bristol
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Bristol filed Critical University of Bristol
Priority to GB1820588.0A
Publication of GB201820588D0
Priority to PCT/GB2019/053483
Publication of GB2579846A
Legal status: Withdrawn (current)


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L1/00 Measuring force or stress, in general
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L5/00 Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes
    • G01L5/22 Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes for measuring the force applied to control members, e.g. control members of vehicles, triggers
    • G01L5/226 Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes for measuring the force applied to control members, e.g. control members of vehicles, triggers to manipulators, e.g. the force due to gripping
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L1/00 Measuring force or stress, in general
    • G01L1/24 Measuring force or stress, in general by measuring variations of optical properties of material when it is stressed, e.g. by photoelastic stress analysis using infrared, visible light, ultraviolet
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01L MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
    • G01L5/00 Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes
    • G01L5/0061 Force sensors associated with industrial machines or actuators
    • G01L5/0076 Force sensors associated with manufacturing machines
    • G01L5/009 Force sensors associated with material gripping devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Force Measurement Appropriate To Specific Purposes (AREA)

Abstract

A tactile sensor 1 comprises a sensing component 10 operable to deform in response to tactile inputs and an event-based vision device 20 positioned such that the sensing component is within the field of view of the imaging device. A threshold change in incident light is detected by a pixel of the device and a pixel event is output in response thereto. The detected events are filtered to identify events that are correlated spatiotemporally to other events. The correlated pixel events are integrated and compared to a predetermined threshold value, outputting a taxel event in response thereto. Taxel events are processed to determine the deformation of the sensing component and hence to infer tactile inputs. The sensing component may comprise tactile elements 12. The components may be resiliently deformable. The output pixel events may include time stamps and/or location stamps. The output taxel events may comprise time stamps and/or location stamps.

Description

IMPROVEMENTS IN OR RELATING TO TACTILE SENSING
Technical Field of the Invention
The present invention relates to tactile sensing. In particular the present invention relates to tactile sensors and methods of operating a tactile sensor, especially in relation to tactile sensors suitable for use with robot grippers.
Background to the Invention
In many applications robot grippers are utilised to pick, hold or manipulate objects. In order to successfully achieve such tasks, the gripper must be able to apply enough pressure to an object that it can be reliably held, without applying so much pressure that the object is damaged. In order to make such a determination, the gripper may be provided with one or more tactile sensors operable to monitor interaction between the gripper and the object.
If a gripper is used to manipulate a limited class of similar objects, it is relatively straightforward to provide tactile sensors that provide a sufficiently accurate indication of pressure over a required range to enable the gripper to operate. Where a gripper might be used to manipulate a wide variety of objects with differing surface properties, it is less straightforward to provide a tactile sensor with sufficient sensitivity and range.
There are many types of tactile sensor, most of which rely on arrays of electromechanical sensing elements to transduce deformation of the sensing surface into an electrical signal. A subtype of tactile sensor, known as an optical tactile sensor, uses a camera to capture image frames of the internal surface as it undergoes deformation; these frames can then be processed to infer details of the tactile input.
One example of an optical tactile sensor that has been designed to be accurate and sensitive with respect to gripping a diverse range of objects is the TacTip sensor, developed by Bristol Robotics Laboratory. The TacTip comprises a base for mounting the sensor to a gripper and a deformable surface provided over the base and defining a gel filled chamber between the surface and the base. The inner side of the surface is provided with a plurality of pins projecting from the surface towards the base, the pins provided with a bright or reflective tip. The base comprises a camera operable to capture a series of images of the pin tips. By analysing the captured images of the pin tips, information on the initial positions and subsequent displacement of the pins in response to deformation of the surface can be obtained. Analysis of this displacement pattern can be used to infer details of the tactile input including: object shape, object localisation, contact force, torque and shear.
Whilst a TacTip type sensor can produce very good results, these results rely upon processing every pixel in a series of captured images to detect pin tip locations and to monitor subsequent displacement of the pins. Accordingly, the practical sensitivity of such sensors is limited by the requirements for significant processing power and/or significant bandwidth for transferring the captured image data to a high power processor. The sensitivity may also be limited by the image capture rate of the camera.
It is therefore an object of the present invention to provide a tactile sensor and a method of tactile sensing that at least partially overcomes or alleviates the above problems.
Summary of the Invention
According to a first aspect of the present invention there is provided a method of operating a tactile sensor comprising a sensing component operable to deform in response to tactile input and an event based vision device positioned such that the sensing component is within the field of view of the imaging device, the method comprising: detecting a threshold change in incident light detected by a pixel of the device and outputting a pixel event in response thereto; filtering detected pixel events to identify pixel events that are correlated spatiotemporally to other pixel events; integrating the correlated pixel events; comparing the output of pixel event integration to a predetermined threshold value and in the event that the output of pixel event integration exceeds a predetermined threshold, outputting a taxel event in response thereto; and processing taxel events to determine the deformation of the sensing component and hence to infer tactile inputs.
By processing only pixel events corresponding to a change in incident light detected by a pixel of greater than a threshold value the pixel events will inherently comprise relevant information corresponding to change in deformation of the sensing component. Less processing power is required to process the pixel events than pixel output values from a full imaging array. Additionally, less bandwidth is required to transmit pixel event information than pixel output values from a full imaging array.
Whilst the use of pixel events can reduce bandwidth and processing power requirements, processing only individual pixel events does not provide complete information as to the identity and motion of the tactile elements. Accordingly, the present invention further proposes the filtering and integration of pixel events to generate taxel events. Such taxel events correlate to the identity and motion of the tactile elements and hence provide direct information as to tactile information input to the sensor. This also has the advantage of providing a more efficient output for calculating deformation of the sensing component in response to tactile input.
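By way of illustration only, the staged processing set out above can be sketched in software. In the following Python sketch the function name process_event_stream, the (x, y, t) event layout and the numerical values of radius, tau and threshold are all assumptions of the example, not features of the invention:

    import math
    from collections import defaultdict

    def process_event_stream(pixel_events, taxel_centres,
                             radius=5.0, tau=0.02, threshold=5.0):
        """Toy pixel-event to taxel-event pipeline.

        pixel_events : time-ordered iterable of (x, y, t) tuples, t in seconds.
        taxel_centres: {taxel_id: (x, y)} expected tactile element locations.
        """
        activation = defaultdict(float)  # leaky integrator per tactile element
        last_t = defaultdict(float)
        taxel_events = []

        for x, y, t in pixel_events:
            # Spatial filter: find the receptive field (if any) containing the event.
            tid = next((k for k, (cx, cy) in taxel_centres.items()
                        if math.hypot(x - cx, y - cy) <= radius), None)
            if tid is None:
                continue  # spatially uncorrelated event: excluded from processing

            # Temporal correlation: decay old evidence, then add the new event.
            activation[tid] *= math.exp(-(t - last_t[tid]) / tau)
            activation[tid] += 1.0
            last_t[tid] = t

            # Threshold comparison: emit a taxel event and reset the integrator.
            if activation[tid] >= threshold:
                taxel_events.append({"taxel": tid, "t": t})
                activation[tid] = 0.0

        return taxel_events

A caller would seed taxel_centres from a calibration step and feed the device's pixel event stream directly into this loop.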
The sensing component may comprise one or more tactile elements. The tactile elements may comprise contrasting features of the sensing component. For example, the tactile elements may comprise visually contrasting features of the sensing component. The tactile elements are preferably sufficiently contrasting relative to the rest of the sensing component to be detectable by the imaging device. Monitoring displacement of the tactile elements thus provides an indication of the displacement or deformation of the sensing component.
The sensing component may be resiliently deformable. This allows the sensing component to revert to a neutral configuration when no tactile input is applied. In the neutral configuration, each tactile element may have a particular expected location. Information on expected locations of tactile elements may be pre-set or may be obtained during calibration. In addition to expected locations for tactile elements, there may be expected connections between tactile elements. Typically, these expected connections may be between adjacent tactile elements.
The output pixel events may include time stamps and/or location stamps. The time stamps may comprise information relating to the time at which the pixel event occurred. The location stamps may comprise information relating to the location or address of the pixel which produced the event. The filtering may result in pixel events that are not correlated spatiotemporally to other pixel events being excluded from processing. Additionally or alternatively, the processing may result in pixel events occurring outside a specified time period being excluded from processing. This exclusion of pixel events from processing further limits the processing requirements in the present invention.
In embodiments where the sensor comprises one or more tactile elements, a receptive field may be defined for each tactile element. The receptive field may comprise an area within which the tactile element is expected to be located. The receptive field may comprise a substantially circular area centred on the expected location of the tactile element. The extent of or the shape of the receptive field may be varied. In such embodiments, the filtering may result in pixel events located outside the receptive field of a tactile element being excluded from processing. The receptive field takes advantage of knowledge of the resiliently deformable physical properties of the tactile elements and the fact that the likely positions of each tactile element can therefore be predicted. Exclusion from processing of pixel events outside the receptive field reduces processing load and increases accuracy by eliminating processing of false events.
The integration of pixel events may involve summing of pixel events. The summing may be a weighted sum. In such embodiments, the weighting may be related to the time stamp or location stamp of the pixel event. For example, a higher weighting may be assigned to more recent pixel events than to less recent pixel events. This allows for the relative weighting of pixel events to decline over time rather than just being excluded from the sum once a specified time period has elapsed.
Additionally or alternatively, pixel events may be weighted according to their relative location within the receptive field. In this manner events towards the edge of the receptive field may have a lower weighting.
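One plausible formalisation of such a weighted integration, using illustrative symbols that are not drawn from the patent, is

    A_j(t) = \sum_{i \in R_j} \exp(-(t - t_i)/\tau) \exp(-\lVert x_i - c_j \rVert^2 / 2\sigma^2)

where R_j is the set of pixel events falling within the receptive field of tactile element j, t_i and x_i are the time stamp and location stamp of pixel event i, c_j is the expected location of the element, \tau sets the rate of temporal decay and \sigma the spatial fall-off; a taxel event would then be emitted when A_j(t) exceeds the predetermined threshold.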
The expected location of the tactile element may be fixed. In alternative embodiments, the expected location of the tactile element may be updated over time in response to pixel events. In particular, the expected location may be updated over time in response to a weighted sum of the locations of pixel events. In this manner the method can compensate for displacement of the tactile element in response to tactile input. In particular, this allows the sensor to sense ongoing variation in tactile input.
In some embodiments, if the displacement between the initial expected location and the updated expected location exceeds a particular threshold value, the updated expected location may be deemed invalid. The threshold may be an absolute value or a relative value. In such circumstances, the expected location may be left unchanged, or may be updated only as far as the threshold displacement allows. This enables potentially erroneous or anomalous updates to be excluded or limited.
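A minimal sketch of such an update rule, assuming an exponential moving average and a hard displacement limit (the function name update_expected_location, the rate and the threshold value are illustrative assumptions; distances are in pixels):

    import math

    def update_expected_location(expected, neutral, event_xy,
                                 rate=0.1, max_displacement=8.0):
        """Nudge the expected location towards a pixel event, clamping the
        total displacement from the neutral location to a threshold."""
        # Exponential moving average of event locations.
        nx = expected[0] + rate * (event_xy[0] - expected[0])
        ny = expected[1] + rate * (event_xy[1] - expected[1])

        dx, dy = nx - neutral[0], ny - neutral[1]
        d = math.hypot(dx, dy)
        if d > max_displacement:
            # Potentially anomalous update: limit it to the threshold displacement.
            scale = max_displacement / d
            nx, ny = neutral[0] + dx * scale, neutral[1] + dy * scale
        return (nx, ny)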
As noted above, such a taxel event indicates that a number of pixel events correspond, with sufficient certainty, to a displacement of a tactile element.
The output taxel event may comprise time stamps and/or location stamps. The time stamps may comprise information relating to the time at which the taxel event occurred. The location stamps may comprise information relating to the location or address of the taxel which produced the event. The output taxel event may comprise details of the pixel events integrated to generate the taxel event.
The processing may involve filtering detected taxel events. The filter may be a spatio-temporal filter. The filtering may result in taxel events that are not correlated spatio-temporally to other taxel events being excluded from processing. Additionally or alternatively, the filtering may result in taxel events occurring outside a specified time period being excluded from processing.
Taxel events may be excluded from processing if the displacement of a tactile element exceeds a particular threshold.
In alternative embodiments, the expected location of the tactile elements may be updated over time in response to the location of taxel events. In particular, the expected locations of tactile elements may be updated to the location stamps of taxel events corresponding to the tactile elements. In this manner the method can compensate for displacement of the tactile element in response to tactile input.
Additionally or alternatively, taxel events may be subject to a connectivity filter. Such a filter may be based on a model of expected connections between tactile elements. In some examples, such a filter may be based on a spring model of expected connections between neighbouring tactile elements. The connectivity filter may be operable to exclude taxel events from processing if the displacement of a tactile element associated with the taxel event does not correlate with the displacement of a tactile element with an expected connection. In one embodiment, taxel events may be excluded from processing if no taxel events associated with neighbouring tactile elements have been output. Additionally or alternatively, taxel events that imply that the tactile element has overlapped or crossed over position with a connected tactile element may be excluded from processing.
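As an illustration of one such connectivity check, the sketch below excludes a taxel event when no connected neighbour is active, or when its displacement disagrees with the mean displacement of its neighbours by more than a tolerance, which serves as a crude stand-in for a spring constraint; the neighbour graph, the units and the tolerance are assumptions of the example:

    import math

    def passes_connectivity_filter(displacements, neighbours, tolerance=3.0):
        """Spring-model style check on taxel displacements.

        displacements: {taxel_id: (dx, dy)} implied by recent taxel events.
        neighbours:    {taxel_id: [ids]} expected connections between elements.
        Returns the set of taxel ids whose events survive filtering."""
        valid = set()
        for tid, (dx, dy) in displacements.items():
            linked = [n for n in neighbours.get(tid, []) if n in displacements]
            if not linked:
                continue  # no correlated neighbour activity: exclude
            # Require agreement with the mean neighbour displacement.
            mx = sum(displacements[n][0] for n in linked) / len(linked)
            my = sum(displacements[n][1] for n in linked) / len(linked)
            if math.hypot(dx - mx, dy - my) <= tolerance:
                valid.add(tid)
        return valid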
The processing may include the step of processing taxel events to infer tactile information. The processing of taxel events may be undertaken using any suitable algorithm or series of algorithms including but not limited to Bayesian methods such as sequential analysis and Bayesian networks, deep learning, random forests, Gaussian processes, dimensionality-reduction techniques such as principal component analysis, support vector machines, or the like. The tactile information output may relate to any or all of object shape, object localisation, contact force, torque, shear, slip, incipient slip, texture, vibration, compliance, or the like.
In some implementations the method may additionally include calibrating the tactile sensor. Calibration may involve determining expected tactile element locations for each tactile element when the sensing component is in a neutral configuration. In one embodiment, calibration can be achieved by applying a vibration to the sensing component and processing pixel events to determine the expected locations of each tactile element. As a result of an applied vibration the sensing component will deform at a corresponding frequency, causing the tactile elements to be displaced. The displacement will result in corresponding pixel events being generated. The processing may involve integrating the location stamps of the pixel events to determine the expected tactile element location. Applying a vibration to cause displacement of the tactile elements enables the displacement of the tactile elements to be detected by an event based vision device.
In alternative embodiments, calibration may involve upload of predetermined data relating to the neutral configuration of the sensing component or may involve capturing an image of the neutral configuration utilising a conventional image capture imaging device.
According to a second aspect of the present invention there is provided a tactile sensor comprising: a sensing component operable to deform in response to tactile input; an imaging device positioned such that the sensing component is within the field of view of the imaging device, wherein the imaging device is an event based vision device operable to output pixel events in response to a threshold change in incident light detected by a pixel of the imaging device; and a processing unit operable to filter detected pixel events to identify pixel events that are correlated spatiotemporally to other pixel events; integrate the correlated pixel events; compare the output of pixel event integration to a predetermined threshold value and in the event that the output of pixel event integration exceeds a predetermined threshold, outputting a taxel event in response thereto; and process the output taxel events to determine the deformation of the sensing component and hence to infer tactile inputs.
The method may be applied to a tactile sensor according to the second aspect of the present invention.
By using an event based vision device, the processing unit need only receive and process pixel events corresponding to a change in incident light detected of greater than a threshold value. As the pixel events will correspond to change in deformation of the sensing component, the pixel events will inherently comprise relevant information. Less processing power is required to process the pixel events than pixel output values from a full imaging array. Additionally, less bandwidth is required to transmit pixel event information than pixel output values from a full imaging array.
The tactile sensor of the second aspect of the present invention may include any or all features of the first aspect of the present invention as required or desired. In particular, the tactile sensor of the second aspect of the present invention may operate according to the method of the first aspect of the present invention.
The sensing component may comprise a surface. In a preferred embodiment, the sensing component may comprise a surface provided over the imaging device. The surface may take any desired form. In one preferred embodiment, the surface has a dome form projecting away from the imaging device. Other embodiments include flat surfaces or surfaces that conform to the integration of the sensing component within a device, such as a fingertip on a robot hand.
The surface may be resiliently deformable. In such circumstances, the surface may have a neutral configuration to which it resiliently reverts when no tactile input is applied.
The sensing component may define a chamber between the surface and the imaging device. The chamber may be filled with a suitable gas, liquid or gel. In a preferred embodiment, the chamber is filled with an optically clear gel. The gel filling may aid reversion to the neutral configuration and/or damp minor tactile events.
The surface may be provided with one or more tactile elements. The tactile elements may be adapted to enable deformation to be readily detected by the imaging device. The tactile elements may be provided in a contrasting colour to the bulk surface and/or may be wholly or partially reflective or fluorescent. The tactile elements may comprise a two dimensional pattern on the surface. In alternative embodiments, the tactile elements may comprise projections from the surface. In a preferred embodiment, the tactile elements may comprise one or more pins projecting from the surface towards the imaging device. In a preferred embodiment the tips of the pins may be bright or reflective.
The sensor may comprise illumination means operable to illuminate the sensing component. The illumination means may be provided alongside or around the imaging device. The illumination means may comprise one or more light emitting diodes (LEDs).
The sensor may comprise one or more lenses. The one or more lenses may be provided between the imaging device and the sensing component. The one or more lenses may help focus light from the sensing component onto the imaging device.
The event based vision device may comprise an array of pixels operable to detect incident light. Each pixel may be provided with a dedicated comparison circuit operable to determine the occurrence of a threshold change in the incident light detected by the pixel. Additionally or alternatively, the determination of the occurrence of a threshold change in the incident light detected by a pixel may be achieved by processing the output of each pixel using a suitable algorithm.
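For the algorithmic alternative, one common approach is to emulate event generation from conventional frames by thresholding changes in log intensity. The sketch below is illustrative only; the log-intensity formulation, the threshold value and the polarity field are assumptions rather than features of the invention:

    import numpy as np

    def frames_to_events(frames, times, log_threshold=0.15):
        """Emit (x, y, t, polarity) whenever the log intensity at a pixel has
        changed by more than the threshold since its last event."""
        ref = np.log(frames[0].astype(np.float64) + 1.0)
        events = []
        for frame, t in zip(frames[1:], times[1:]):
            cur = np.log(frame.astype(np.float64) + 1.0)
            diff = cur - ref
            ys, xs = np.nonzero(np.abs(diff) >= log_threshold)
            for x, y in zip(xs, ys):
                events.append((int(x), int(y), t, 1 if diff[y, x] > 0 else -1))
                ref[y, x] = cur[y, x]  # update the reference only where an event fired
        return events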
The imaging device may be provided within a housing. The housing may comprise a mounting bracket for retaining the imaging device in a desired position. The housing may be provided with a base. The base may comprise fixing elements or other cooperating features for facilitating the mounting of the sensor on a suitable device. In one embodiment, such a device is a robot gripper.
The processing unit may be integrated within the sensor. Alternatively, the processing unit may be provided remote from the sensor and in communication with the imaging device via a suitable data link. The link may be wired or wireless as desired or as required.
In some embodiments, the processing unit may comprise an integrated processing unit. In other embodiments, the processing unit may comprise a plurality of processing modules, each processing module dedicated to a specific processing task.
The sensor may comprise a vibration unit. The vibration unit may be operable to apply a vibration to the sensing component. Applying such vibrations when the sensing component is in neutral configuration and processing consequent pixel events enables the calibration of the sensor with respect to the neutral configuration.
According to a third aspect of the present invention there is provided a method of calibrating a tactile sensor of the type comprising a sensing component having a surface and one or more tactile elements, the sensing component operable to deform in response to tactile input; an imaging device positioned such that the sensing component is within the field of view of the imaging device; and a processing unit operable to process the output of the imaging device, the method comprising the steps of applying a vibration to the sensor and processing pixel events detected during the applied vibration to determine expected locations of the tactile elements.
The method of the third aspect of the present invention may incorporate any or all features of the previous aspects of the invention, as required or as desired.
According to a fourth aspect of the present invention there is provided a robot gripper comprising one or more sensors according to the second aspect of the present invention.
The gripper of the fourth aspect of the present invention may incorporate any or all features of the previous aspects of the invention, as required or as desired.
Detailed Description of the Invention
In order that the invention may be more clearly understood one or more embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, of which:
Figure 1 is (a) an exploded side view, (b) an exploded perspective view, (c) a side view of a tactile sensor tip according to the present invention and (d) an under view of a tactile sensor tip according to the present invention;
Figure 2 is a schematic block diagram of a tactile sensor according to the present invention;
Figure 3 is a schematic circuit diagram for an example pixel of an event based vision device for use in a tactile sensor according to the present invention;
Figure 4 illustrates schematically (a) initial tactile element locations, (b) receptive field areas corresponding to tactile element locations and (c) connections between neighbouring tactile elements in a tactile sensor according to the present invention;
Figure 5 schematically illustrates (a) a spatial filter based on the receptive field applied to detected pixel events for a single tactile element and (b) such a filter applied to multiple tactile elements in a tactile sensor according to the present invention; and
Figure 6 is a schematic flow chart illustrating the processing steps involved in operating a tactile sensor according to the present invention.
Turning now to figures 1 and 2 a tactile sensor 1 according to the present invention is shown. The sensor 1 comprises a sensing component 10 operable to deform in response to tactile inputs positioned within the field of view of an imaging device 20. For the sake of clarity, the imaging device 20 is omitted from figure 1.
The sensing component 10 is formed of a resiliently flexible material and comprises a surface 11 on an inner side of which are provided tactile elements 12 of contrasting appearance to the surface 11. In the present example, the surface 11 is darkly coloured, typically black. The tactile elements 12 comprise a series of pins projecting from the surface 11, the tips of the pins being brightly coloured, for example, white.
In a neutral configuration, without any tactile input, the surface 11 forms a dome over the imaging device 20. In response to a tactile input, the surface 11 is deformed. In consequence, the pins comprising the tactile elements 12 are displaced. As the brightly coloured pin tips provide a clear contrast to the darkly coloured surface 11, displacement of the pins can be readily detected using imaging device 20. Processing the output of imaging device 20 allows information about the nature of the tactile input to be inferred.
In order to improve tactile performance and to aid resilient return to the neutral configuration, the chamber defined between the surface 11 and the imaging device 20 can be filled with a suitable gel. So as not to impede imaging device 20 operation, the gel is an optically clear gel.
The sensor 1 is provided with a lens 14 to aid the focusing of light from the tactile elements 12 on the imaging device 20. The sensor 1 is also provided with illumination means 15 operable to illuminate the tactile elements 12. In the present example, the illumination means 15 comprises one or more LEDs mounted on a ring provided around the imaging device 20 aperture. The skilled man will appreciate that in other embodiments different combinations of lenses 14 and/or illumination means 15 may be provided.
The imaging device 20 is provided within a protective housing 21. The housing 21 comprises a tip 22, a main body 23 incorporating a mounting bracket 24 adapted to securely retain the imaging device 20. The housing is additionally provided with a base 25, which may be adapted to enable the sensor 1 to be readily affixed to a larger device, such as a robot gripper. The skilled man will appreciate that the housing 21 may be provided in alternative configurations and/or that the sensor 1 may be fitted to other devices where analysis of tactile input is required.
Within the housing 21 may be provided a processing unit 30 operable to process output of imaging device 20 and output information related to the inferred tactile inputs.
Alternatively, the housing 21 may be provided with a wired or wireless communication link (not shown) operable to communicate the output of the imaging device 20 to a remote processing unit 30.
To date, such sensors 1 have relied upon a conventional camera capturing images of the surface 11 and tactile elements 12. These images are then processed to identify the tactile elements 12 and their respective locations in each captured image.
Such operation requires significant processing power and/or significant bandwidth for transferring the captured image data to a high power processor.
Accordingly, the present invention substitutes an event based vision device 20 for a conventional image capturing camera. An example of an event based vision device is described in WO2006/128315. The event based vision device 20 comprises an array of pixels, each pixel operable to output a signal in response to a change in incident light detected of greater than a threshold value. Such changes in incident light in respect of an individual pixel are described as pixel events and comprise an associated time stamp and location stamp.
In view of the enclosed nature of the sensor 1, pixel events will correspond to a change in deformation of the sensing component, and thus will inherently comprise relevant information. By only processing pixel events rather than pixel output values from a full imaging array, less processing power is required. Similarly, less bandwidth is required to transmit pixel event data to an external processor than pixel output values from a full imaging array. Processing of the pixel events will be described in more detail below.
The event based vision device 20 may comprise a conventional pixel array and an array processor operable to implement a processing algorithm which extracts only changes in incident light detected by a pixel of greater than a threshold value. Alternatively, each pixel in the array may be provided with dedicated circuitry 100 operable to implement event based operation. An example of such circuitry is illustrated in figure 3.
Turning now to figure 3, the circuitry 100 shown comprises a sensing arrangement 110, an amplifying arrangement 120 and a comparison arrangement 130. In the sensing arrangement 110, incident light 101 is received by photodiode 111, which in combination with transistor 112 and inverting amplifier 113 generates a pixel signal. The pixel signal is amplified by the amplification arrangement 120 comprising capacitors 121, 122, inverting amplifier 123 and reset switch 124. The amplified signal output by the amplification arrangement 120 is then compared to the previous output of the amplification arrangement using a pair of differential amplifiers 131, 132 to generate an output signal indicating whether or not there has been a change of incident light detected by photodiode 111. If so, an on signal can be generated by differential amplifier 131. This on signal can result in the generation of a pixel event, the pixel event further including a location stamp corresponding to the pixel location and a time stamp related to the time of output of the on signal.
Whilst such arrangements efficiently identify pixel events, and thus offer the potential for more efficient processing and lower bandwidth requirements for transmitting the raw output, they do not immediately provide obvious information as to the identity and movement of individual tactile elements 12. Typical event based vision sensors are not readily equipped to process such output efficiently, being more concerned with identifying objects within a field of view.
Accordingly, the present invention makes use of spatiotemporal filtering to identify correlated pixel events. These correlated pixel events, when integrated and exceeding a threshold level, may be identified as taxel events, as is described in more detail below. This provides a way to simplify the processing of multiple pixel events generated in response to movement of a common tactile element 12.
The present invention can further add efficiency to processing by considering the expected location and physical properties of each tactile element 12. This can be used to eliminate from processing any cluster of pixel events that does not correspond to an expected location of a tactile element 12 and is described in more detail below.
In order to process pixel events to determine tactile inputs to the sensor, the processing unit 30 is provided with information as to the neutral configuration of the sensing component 10, in particular the expected locations of each of the tactile elements 12 of the sensing component 10. This information may be pre-set or may be derived from a calibration process.
Following implementation of a calibration process or upload of preset data, an example of the expected locations 41 of each of the tactile elements 12 of a sensing component 10 is illustrated in figure 4a. Once the expected location 41 of each tactile element 12 is known, the processing unit 30 can define receptive fields 42 around the expected locations 41 of each tactile element 12. As illustrated in figure 4b, the receptive fields 42 may be substantially circular and centred on the expected locations 41. The skilled man will appreciate that the shape and size of the receptive fields 42 may be varied if required or desired. The receptive field can include a weighting function (including but not limited to a Gaussian function) which assigns weights to events.
Figure 4c illustrates a network of expected connections 43 between the expected locations 41 of each tactile element 12. The expected connections 43 link neighbouring tactile elements 12 which would be expected to display corresponding displacements in response to a local tactile input. By analysing displacement of neighbouring tactile elements 12 in light of the expected connections 43, a better appreciation of the tactile inputs can be achieved.
Turning now to figure 6, a flow chart is shown illustrating the processing of pixel events to provide information on tactile inputs according to the present invention. Firstly, at step S1, an object contacts the exterior of the sensing component 10. The resultant deformation of the sensing component 10 causes displacement of one or more tactile elements 12 and the consequent detection and output of pixel events by the imaging device 20 at step S2.
The pixel events are passed to the processing unit 30. At step S3, the processing unit 30 is operable to apply spatio-temporal filtering to the detected pixel events based on their time stamps and location stamps. This filtering allows potentially relevant pixel events to be utilised in further processing and excludes potentially irrelevant events from further processing.
In a typical example, the spatial aspect of the filtering may involve excluding pixel events where the location stamp indicates that they occur outside the receptive field 42 of a tactile element 12. The temporal aspect of the filtering may exclude pixel events with a time stamp that does not correlate to the time stamp of other pixel events within the same receptive field 42.
The skilled man will appreciate that other filtering criteria may be implemented in alternative embodiments including, but not limited to: excluding pixel events from processing after a particular time interval, reducing the weight applied to pixel events after a particular time interval, or excluding pixel events with a time stamp that does not correlate to the time stamp of other pixel events generally.
The spatial aspect of filtering for a single tactile element 12 is illustrated in figure 5a. In this example, the expected location 41 and receptive field 42 of a tactile element 12 are illustrated along with the locations of pixel events derived from the location stamps. Following filtering, pixel events detected at locations 52 are excluded from processing as they occur outside the receptive field 42. Conversely, pixel events detected at locations 51 are included in further processing as they occur within the receptive field 42. Turning now to figure 5b, this illustrates how this processing is applied across the sensing component 10 as a whole. In this instance, the locations of pixel events are illustrated by white squares and reference numerals have only been applied to the locations of two exemplary pixel events for the sake of clarity. Pixel events such as that at location 51 are within the receptive field 42 of a tactile element 12 and are included in further processing. In contrast, the pixel event 52 is excluded from further processing due to it being located outside any of the receptive fields 42.
Turning now to step S4, the integrated pixel events detected within each receptive field 42 are compared to a threshold. If the threshold is exceeded for a receptive field 42, a taxel event may be output. If the threshold is separately exceeded for multiple receptive fields 42, multiple taxel events may be output. A taxel event is effectively an indication that a threshold displacement of a tactile element 12 has occurred. The taxel event may comprise a time stamp and a location stamp. In particular, the taxel event may comprise an indication of the number of pixel events that have occurred within the period for triggering a taxel event.
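By way of illustration, such a taxel event might be assembled as below, with the location stamp taken as the mean of the contributing pixel events and the count carried in the payload; the TaxelAccumulator structure and its threshold are assumptions of the example:

    from dataclasses import dataclass, field

    @dataclass
    class TaxelAccumulator:
        """Per-receptive-field accumulator that emits a taxel event once
        enough correlated pixel events have been gathered."""
        taxel_id: int
        threshold: int = 5
        pending: list = field(default_factory=list)  # (x, y, t) pixel events

        def add(self, x, y, t):
            self.pending.append((x, y, t))
            if len(self.pending) < self.threshold:
                return None
            n = len(self.pending)
            cx = sum(e[0] for e in self.pending) / n  # location stamp
            cy = sum(e[1] for e in self.pending) / n
            self.pending.clear()
            return {"taxel": self.taxel_id, "t": t, "x": cx, "y": cy,
                    "n_pixel_events": n}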
Following output of a taxel event, the processing unit 30 is operable at step S5 to apply spatio-temporal filtering to the output taxel events. This filtering allows potentially relevant taxel events to be utilised in further processing and to exclude potentially irrelevant taxel events from further processing.
In a typical example, the spatial aspect of the filtering may involve excluding taxel events which do not correlate to taxel events in any other tactile elements 12 with an expected connection. The temporal aspect of the filtering may exclude taxel events with a time stamp that does not correlate to the time stamp of other taxel events in any other tactile elements 12 with an expected connection. This allows processing to be concentrated on tactile events that exceed a threshold level.
The skilled man will appreciate that other filtering criteria may be implemented in alternative embodiments including, but not limited to: excluding taxel events from processing after a particular time interval, reducing the weight applied to taxel events after a particular time interval, or excluding taxel events with a time stamp that does not correlate to the time stamp of other taxel events generally. In particular, the filtering may also exclude taxel events that exceed a threshold displacement or exceed a threshold displacement relative to tactile elements 12 with an expected connection 43. This can help exclude from processing anomalous taxel events, or taxel events that imply an unlikely or impossible displacement of the sensing component 10. In particular, taxel events that imply that a tactile element 12 has overlapped or crossed over another tactile element 12 may be excluded.
At step S6, taxel events are used to calculate an updated expected location 41 of the corresponding tactile element 12 for use in subsequent filtering of pixel events and taxel events. This allows the expected location 41 to be updated as tactile inputs displace tactile elements 12. In recalculating expected locations, consideration can be given to the updated expected locations 41 of tactile elements 12 with an expected connection 43. In particular, where updated location displacement exceeds a threshold value or a threshold value relative to tactile elements 12 with an expected connection 43, the updated expected location 41 may be deemed invalid.
Turning now to step S7, the location stamps of the filtered taxel events correspond to the pattern of displacement of the tactile elements. This pattern can be processed to infer the tactile input that would correspond to such a displacement pattern and hence to output an indication of the tactile input received by the sensor. Analysis of this pattern can enable processing unit 30 to generate and output tactile information relating to any or all of object shape, object localisation, contact force, torque, shear, slip, incipient slip, texture, vibration, compliance, or the like.
The sensor 1 may be utilised in any circumstances where sensing of tactile input is required. In particular the sensor may be adapted for fitting to a robot gripper. Use of one or more such sensors 1 on the jaws of a robot gripper can enable improved sensitivity and range and/or reduced processing load. This will help to improve the performance and/or cost of robot grippers.
Prior to first use, or prior to a particular use of the sensor, a calibration process may be carried out. The calibration process may involve using a vibration unit 35 to apply a known vibration to the sensing component 10. The vibration will result in the detection and output of multiple pixel events corresponding to oscillatory displacement of the tactile elements in response to the applied vibration. Assuming that the vibration amplitude is sufficiently low, applying a filter based on spatial correlation to the pixel events will isolate groups of pixel events corresponding to separate tactile elements. Integrating the location stamps of each group of pixel events can be used to determine a mean location stamp and hence an expected location 41 of the tactile element 12.
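A sketch of this calibration step is given below, using a greedy spatial clustering whose radius, like the function name calibrate_expected_locations, is an assumption of the example rather than a feature of the invention:

    import math

    def calibrate_expected_locations(pixel_events, cluster_radius=4.0):
        """Estimate expected tactile element locations from pixel events
        recorded while the sensing component is vibrated.

        pixel_events: iterable of (x, y, t); returns a list of (x, y) means."""
        clusters = []  # each entry: [sum_x, sum_y, count, mean_x, mean_y]
        for x, y, _t in pixel_events:
            for c in clusters:
                if math.hypot(x - c[3], y - c[4]) <= cluster_radius:
                    c[0] += x; c[1] += y; c[2] += 1
                    c[3], c[4] = c[0] / c[2], c[1] / c[2]  # running mean
                    break
            else:
                clusters.append([x, y, 1.0, x, y])
        return [(c[3], c[4]) for c in clusters]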
The above embodiment is described by way of example only. Many variations are possible without departing from the scope of the invention as defined in the appended claims.

Claims (25)

  1. A method of operating a tactile sensor comprising a sensing component operable to deform in response to tactile input and an event based vision device positioned such that the sensing component is within the field of view of the imaging device, the method comprising: detecting a threshold change in incident light detected by a pixel of the device and outputting a pixel event in response thereto; filtering detected pixel events to identify pixel events that are correlated spatiotemporally to other pixel events; integrating the correlated pixel events; comparing the output of pixel event integration to a predetermined threshold value and in the event that the output of pixel event integration exceeds a predetermined threshold, outputting a taxel event in response thereto; and processing taxel events to determine the deformation of the sensing component and hence to infer tactile inputs.
  2. A method as claimed in claim 1 wherein the sensing component comprises one or more tactile elements.
  3. A method as claimed in claim 2 wherein the sensing component is resiliently deformable and reverts to a neutral configuration when no tactile input is applied, where each tactile element has a particular expected location.
  4. A method as claimed in any preceding claim wherein the output pixel events include time stamps and/or location stamps.
  5. A method as claimed in any one of claims 2 to 4 wherein a receptive field is defined for each tactile element, the receptive field comprises an area within which the tactile element is expected to be located.
  6. A method as claimed in claim 5 wherein the receptive field comprises a substantially circular area centred on the expected location of the tactile element.
  7. A method as claimed in claim 5 or claim 6 wherein the expected location of the tactile element is updated over time in response to pixel events.
  8. A method as claimed in any preceding claim wherein the output taxel event comprises time stamps and/or location stamps.
  9. A method as claimed in claim 8 wherein the processing includes filtering detected taxel events such that taxel events that are not correlated spatio-temporally to other taxel events are excluded from processing.
  10. A method as claimed in claim 8 or claim 9 wherein the expected location of the tactile elements is updated to the location stamps of taxel events corresponding to the tactile elements.
  11. A method as claimed in any preceding claim wherein taxel events are subject to a connectivity filter based on a model of expected connections between tactile elements.
  12. A method as claimed in claim 11 wherein the connectivity filter is operable to exclude taxel events from processing if the displacement of a tactile element associated with the taxel event does not correlate with the displacement of a tactile element with an expected connection.
  13. A tactile sensor comprising: a sensing component operable to deform in response to tactile input; an imaging device positioned such that the sensing component is within the field of view of the imaging device, wherein the imaging device is an event based vision device operable to output pixel events in response to a threshold change in incident light detected by a pixel of the imaging device; and a processing unit operable to filter detected pixel events to identify pixel events that are correlated spatiotemporally to other pixel events; integrate the correlated pixel events; compare the output of pixel event integration to a predetermined threshold value and in the event that the output of pixel event integration exceeds a predetermined threshold, outputting a taxel event in response thereto; and process the output taxel events to determine the deformation of the sensing component and hence to infer tactile inputs.
  14. A tactile sensor as claimed in claim 13 wherein the sensing component comprises a surface provided over the imaging device.
  15. A tactile sensor as claimed in claim 14 wherein the surface is resiliently deformable and has a neutral configuration to which it resiliently reverts when no tactile input is applied.
  16. A tactile sensor as claimed in claim 14 or claim 15 wherein the sensing component defines a chamber between the surface and the imaging device, the chamber filled with a suitable gas, liquid or gel.
  17. A tactile sensor as claimed in any one of claims 14 to 16 wherein the surface is provided with one or more tactile elements adapted to enable deformation to be readily detected by the imaging device.
  18. A tactile sensor as claimed in claim 17 wherein the tactile elements comprise one or more pins projecting from the surface towards the imaging device.
  19. A tactile sensor as claimed in any one of claims 13 to 18 wherein the event based vision device comprises an array of pixels operable to detect incident light.
  20. A tactile sensor as claimed in claim 19 wherein each pixel is provided with a dedicated comparison circuit operable to determine the occurrence of a threshold change in the incident light detected by the pixel.
  21. A tactile sensor as claimed in claim 19 wherein the determination of the occurrence of a threshold change in the incident light detected by a pixel is achieved by processing the output of each pixel using a suitable algorithm.
  22. A tactile sensor as claimed in any one of claims 13 to 21 wherein the imaging device is provided within a housing, the housing comprising a mounting bracket for retaining the imaging device and a base having fixing elements for facilitating the mounting of the sensor on a suitable device.
  23. A tactile sensor as claimed in any one of claims 13 to 22 wherein the processing unit is integrated within the sensor.
  24. A tactile sensor as claimed in any one of claims 13 to 23 wherein the processing unit is provided remote from the sensor and in communication with the imaging device via a suitable data link.
  25. A robot gripper comprising one or more sensors according to any one of claims 13 to 24.
GB1820588.0A 2018-12-18 2018-12-18 Improvements in or relating to tactile sensing Withdrawn GB2579846A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1820588.0A GB2579846A (en) 2018-12-18 2018-12-18 Improvements in or relating to tactile sensing
PCT/GB2019/053483 WO2020128430A1 (en) 2018-12-18 2019-12-10 Improvements in or relating to tactile sensing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1820588.0A GB2579846A (en) 2018-12-18 2018-12-18 Improvements in or relating to tactile sensing

Publications (2)

Publication Number Publication Date
GB201820588D0 GB201820588D0 (en) 2019-01-30
GB2579846A true GB2579846A (en) 2020-07-08

Family

ID=65147116

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1820588.0A Withdrawn GB2579846A (en) 2018-12-18 2018-12-18 Improvements in or relating to tactile sensing

Country Status (2)

Country Link
GB (1) GB2579846A (en)
WO (1) WO2020128430A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114500868B (en) * 2022-04-18 2022-07-12 深圳锐视智芯科技有限公司 EVS pixel working method and related device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1321753A1 (en) * 2000-08-31 2003-06-25 Center for Advanced Science and Technology Incubation, Ltd. Optical tactile sensor
US20080027582A1 (en) * 2004-03-09 2008-01-31 Nagoya Industrial Science Research Institute Optical Tactile Sensor, Sensing Method, Sensing System, Object Operation Force Controlling Method, Object Operation Force Controlling Device, Object Holding Force Controlling Method, and Robot Hand

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2538008A1 (en) * 2003-09-16 2005-03-31 Toudai Tlo, Ltd. Optical tactile sensor and method of reconstructing force vector distribution using the sensor
CN101204079B (en) 2005-06-03 2011-07-27 苏黎世大学 Photoarray for detecting time-dependent image data
CN108574793B (en) * 2017-03-08 2022-05-10 三星电子株式会社 Image processing apparatus configured to regenerate time stamp and electronic apparatus including the same


Also Published As

Publication number Publication date
GB201820588D0 (en) 2019-01-30
WO2020128430A1 (en) 2020-06-25


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)