GB2604195A - Object presence sensing - Google Patents

Object presence sensing

Info

Publication number
GB2604195A
Authority
GB
United Kingdom
Prior art keywords
finger
optical
light
control unit
fingers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2111502.7A
Inventor
Fras Jan
Mnyusiwalla Hussein
Sotiropoulos Panagioti
Del Sol Acero Enrique
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ocado Innovation Ltd
Original Assignee
Ocado Innovation Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ocado Innovation Ltd filed Critical Ocado Innovation Ltd
Publication of GB2604195A

Classifications

    • B25J15/0253 Gripping heads and other end effectors servo-actuated comprising parallel grippers
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081 Touching devices, e.g. pressure-sensitive
    • B25J13/082 Grasping-force detectors
    • B25J13/083 Grasping-force detectors fitted with slippage detectors
    • B25J13/088 Controls for manipulators with position, velocity or acceleration sensors
    • B25J15/08 Gripping heads and other end effectors having finger members
    • B25J19/021 Optical sensing devices
    • B25J19/025 Optical sensing devices including optical fibres
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • G01L5/16 Measuring several components of force
    • G05B2219/45063 Pick and place manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)

Abstract

A finger 111A, 111B for use in a manipulating apparatus 100’ comprises a light source 116, a plurality of optical receivers 118 and a face (114, figure 3) to engage with an object 200. The manipulating apparatus 100’, which may be of use in a picking system, comprises a pair of opposed fingers 111A, 111B, which can be moved to grip an object 200 between the fingers 111A, 111B. One or more light signals may be launched from one of the fingers 111A, 111B such that a gripped object 200 may block some, or all, of the light signals. The light signal, or signals, received by the other finger 111A, 111B can be processed by a control unit 130’ to determine the location of the object 200, and either or both of the fingers 111A, 111B may be moved by an actuator 120A, 120B in response to the received light signal(s) or the location of the object 200. The light source 116 and optical receivers 118 may comprise optical fibres.

Description

OBJECT PRESENCE SENSING
Technical Field
The present invention relates generally to the field of sensing and more specifically to an apparatus and method for sensing the position of an object.
Background
Automated and semi-automated picking systems require robotic picking stations which are able to select an object from a first receptacle, such as a tote or other storage unit, grip the object, and then move the object into a second receptacle, such as a further tote or a bag. The picking station may comprise a manipulating apparatus which needs to be able to pick objects which may vary in size, shape, weight and physical durability. If too much force is applied when an object is gripped then the object may be damaged. However, if too little force is applied when an object is gripped then the object may be dropped before it can be transferred into the second receptacle. During the pick and place operation the manipulating apparatus may incorporate a sensor that can detect the relative pose of the gripping elements and the object gripped in order for the system to be able to react accordingly and maintain the grasp. Currently available sensors have a number of drawbacks, for example, they are too bulky to integrate into the manipulating apparatus, are unreliable in operation, are expensive, etc.
Summary
In view of the problems in known sensing systems for use in a manipulating apparatus which are suitable for use in a picking system, the present invention aims to provide an optical sensor system which can be integrated into a manipulating apparatus.
In general terms, the invention introduces one or more optical sources within a first component and a plurality of optical receivers within a second component of a manipulating apparatus. In operation, the first component is in opposition to the second component such that an object can be gripped by the first and second components. Light from the light source may be obscured by the object, preventing light from being detected at one or more of the optical receivers. This information can be processed to determine the position of the object relative to the first and second components.
According to a first aspect of the present invention there is provided a finger for use in a manipulating apparatus, the finger comprising: a light source; a plurality of optical receivers; and a face configured, in use, to engage with an object. The face may comprise a pad and a frame, wherein the frame is arranged around the periphery of the pad.
The pad may comprise the light source. In particular, the pad may comprise a plurality of light emitting elements, each of the light emitting elements emitting light at a respective wavelength.
Alternatively, the pad may comprise a plurality of different regions, each of the pad regions having a different colour.
The finger may comprise a plurality of light sources, each of the plurality of light sources being located within the frame. One or more of the plurality of light sources may comprise an optical fibre. Alternatively, one or more of the plurality of light sources may comprise a light emitting element. The or each light emitting element may emit light at a respective wavelength.
Each of the plurality of optical receivers may be received within the frame. Each of the plurality of optical receivers may comprise an optical fibre.
According to a second aspect of the present invention there is provided a manipulating apparatus comprising: a first finger according to the first aspect of the invention; a second finger according to the first aspect of the invention, the first finger being arranged such that it is opposed to the second finger; one or more actuators and a control unit, wherein, in use, the control unit activates the one or more actuators to move the first finger relative to the second finger such that the first finger and the second finger engage an object, wherein light signals received by the second finger from the first finger are used by the control unit to determine the movement of the first finger relative to the second finger.
The light signals received by the first finger from the second finger may be used by the control unit to determine the movement of a gripped object relative to the first finger and the second finger. If the object is moving relative to the fingers then the fingers can be activated to increase the force applied to the object so that it is held more securely by the fingers. The apparatus may comprise one or more further sensors and data from the or each sensor may be used by the control unit in conjunction with the light signals received from the first finger and/or the second finger to determine the movement of the first finger relative to the second finger.
According to a third aspect of the present invention there is provided a method of manipulating an object, the method comprising the steps of: i) moving a first finger relative to a second finger in order to grip an object between the first finger and the second finger; ii) generating one or more optical signals at the first finger; iii) detecting the one or more optical signals at the second finger; and iv) determining the position of the object in accordance with the one or more detected optical signals. The method may comprise the further step of v) moving the first finger relative to the second finger in accordance with the position determined in step iv).
Brief Description of the Drawings
Embodiments of the invention will now be described by way of example only with reference to the accompanying drawings, in which like reference numbers designate the same or corresponding parts, and in which: Figure 1 shows a schematic depiction of a conventional manipulating apparatus incorporating two opposed fingers; Figure 2 shows a schematic depiction of a manipulating apparatus according to an embodiment of the present invention; Figure 3 shows a schematic depiction of the face of a finger according to a first specific embodiment of the present invention; Figure 4 shows a schematic depiction of a manipulating apparatus comprising a first finger 111A and a second finger 111B as shown in Figure 3; Figure 5 shows a schematic depiction of the face of a finger 112 according to a second specific embodiment of the present invention; Figure 6 shows an alternative to the embodiment of the finger of Figure 5; Figure 7 shows a schematic depiction of a manipulating apparatus comprising a first finger 112A and a second finger 112B as shown in Figure 5; and Figure 8 shows a schematic depiction of control unit 130'.
Detailed Description of Embodiments
Figure 1 shows a schematic depiction of a conventional manipulating apparatus 100. In particular, Figure 1a shows a schematic depiction of a conventional manipulating apparatus 100 incorporating two opposed fingers 110 and Figure 1b shows a schematic depiction of the face of one of the fingers 110 which is used to make contact with an object. The manipulating apparatus shown in Figure 1a further comprises a first finger 110A, a second finger 110B and a control unit 130. Each of the first and second fingers comprises a respective actuator 120A, 120B, which are connected to the control unit 130. The control unit may activate the first actuator 120A and/or the second actuator 120B to move or rotate the respective finger(s) 110A, 110B such that the fingers can co-operate to grip an object 200. Sensors may be provided within the fingers such that contact with an object may be detected. Information obtained from the sensors may then be supplied to the control unit such that the first actuator 120A and/or the second actuator 120B can be controlled in response to the detected contact. The face comprises a central pad 114 which is surrounded by a frame 113. The central pad may protrude from the plane of the frame to grip an object 200. This enables delicate and/or irregularly shaped objects to be held whilst reducing the probability that the object is damaged. The central pad may have a relatively high coefficient of friction, specifically so that it is higher than the coefficient of friction of the frame 113, to assist in the gripping of the object. The surface of the central pad may be textured or grooved to provide the increase in friction. Alternatively, or in addition, the material used to form the central pad may be selected so as to have a suitable coefficient of friction.
Figure 2 shows a schematic depiction of a manipulating apparatus 100' according to an embodiment of the present invention. The first finger 111A and the second finger 111B are similar to the first and second fingers 110A, 110B described above with reference to Figures la & 1 b.
Specifically, each of the first and second fingers 111A, 111B comprises a respective actuator 120A, 120B, which are connected to the control unit 130'. The control unit may activate the first actuator 120A and/or the second actuator 120B to move or rotate the respective finger(s) 111A, 111B such that the fingers can co-operate to grip an object 200. Sensors may be provided within the fingers such that contact with an object may be detected. Information obtained from the sensors may then be supplied to the control unit such that the first actuator 120A and/or the second actuator 120B can be controlled in response to the detected contact. The face comprises a central pad 114 which is surrounded by a frame 113. The central pad may protrude from the plane of the frame to grip an object 200. This enables delicate and/or irregularly shaped objects to be held whilst reducing the probability that the object is damaged. The central pad may have a relatively high coefficient of friction, specifically so that it is higher than the coefficient of friction of the frame 113, to assist in the gripping of the object. The surface of the central pad may be textured or grooved to provide the increase in friction. Alternatively, or in addition, the material used to form the central pad may be selected so as to have a suitable coefficient of friction.
The first finger 111A additionally comprises a plurality of optical sources 116 which are operable to send optical signals towards the second finger 111B. The second finger 111B additionally comprises a plurality of optical receivers 118 which are configured to receive the optical signals which are sent from the plurality of optical sources and to then route the received signals to the control unit 130'. It should be understood that if there is no object located in between the first finger and the second finger then each of the optical receivers 118 will receive light from the optical sources. However, as is shown in Figure 2, if an object 200 is present then it may block some of the light signals sent from the plurality of optical sources. Depending on the size, position and the composition of the object, some of the plurality of optical receivers may receive no signal from the plurality of optical sources, some of the plurality of optical receivers may receive an attenuated signal and some of the plurality of optical receivers may not experience any change in the received optical signal. The signals received at each of the plurality of optical receivers 118 can be sent to the control unit 130'. The control unit can process the optical signals and use the resulting data when determining the location of the object relative to the first finger and the second finger. This data may be used in conjunction with data generated from other sensor signals when determining the location of the object. The determined location information may then be used by the control unit 130' when sending signals to the first actuator and/or the second actuator to grip the object or to move the object from its present position to a further position. 
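By way of illustration only (this sketch is not part of the original disclosure), the processing described above might be approximated as follows: receivers whose intensity falls below a threshold are treated as occluded, and the span of occluded receivers gives a one-dimensional estimate of the object's extent along the finger face. The function name, threshold value, and normalised-intensity convention are all assumptions made for the example.

```python
def estimate_object_span(intensities, positions, blocked_threshold=0.2):
    """Return the (min, max) positions of receivers whose signal is blocked.

    intensities: received intensity at each receiver, normalised so that an
                 unobstructed path reads 1.0.
    positions:   position of each receiver along the finger face (e.g. in mm).
    """
    blocked = [p for p, i in zip(positions, intensities) if i < blocked_threshold]
    if not blocked:
        return None  # no object detected between the fingers
    return (min(blocked), max(blocked))

# Four receivers spaced 10 mm apart; the middle two are occluded by an object.
span = estimate_object_span([1.0, 0.05, 0.1, 0.95], [0, 10, 20, 30])
# span is (10, 20): the object lies between the 10 mm and 20 mm receivers.
```

In practice the control unit 130' could combine such an estimate with other sensor data, as the description notes.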
For example, if the data received from the optical receivers indicates that a grasped object is moving relative to the first and second fingers then this can be interpreted to mean that the object is weakly grasped and the first and second fingers should be brought closer together to establish a firmer hold on the object. This relative movement of the fingers may be applied until it is detected that the movement of the grasped object relative to the first and second fingers has ceased.
It should be understood that both the first finger and the second finger may comprise both a plurality of optical sources and a plurality of optical receivers and that during the process of using the apparatus to grip an object either or both fingers can be used to generate optical signals which can detected by the opposed finger to determine the size and/or location of the object to be gripped.
The first and second fingers may be manoeuvred such that an object is held between them. If the optical sensors detect that the object is moving relative to the fingers, that is, that the detected position of the object is changing, then the force applied by the first finger and/or the second finger may be increased. The increase in applied force provides a firmer grip on the object. The force applied by the fingers may be increased until no further movement of the object relative to the fingers is detected. Once it has been determined that the object is being held securely then the force applied by the fingers may be maintained at that level until the apparatus releases the object.
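The slip-compensation loop just described can be sketched as below. This is an illustrative simplification, not the disclosed implementation: the callables, step size, and tolerance are placeholders, and a real controller would also bound the force to avoid damaging delicate objects, as discussed in the Background.

```python
def tighten_until_stable(read_object_position, apply_force, initial_force,
                         step=0.5, max_force=20.0, tolerance=0.1):
    """Increase the grip force until the object stops moving relative to the fingers.

    read_object_position: callable returning the current object-position estimate
                          derived from the optical receivers.
    apply_force:          callable commanding the finger actuators with a force value.
    """
    force = initial_force
    apply_force(force)
    last = read_object_position()
    while force < max_force:
        current = read_object_position()
        if abs(current - last) <= tolerance:
            return force  # no slip detected: hold the object at this force level
        force = min(force + step, max_force)  # tighten and re-check for slip
        apply_force(force)
        last = current
    return force
```

The `max_force` ceiling reflects the trade-off the Background identifies: too much force damages the object, too little drops it.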
Figure 3 shows a schematic depiction of the face of a finger 111 according to a first specific embodiment of the present invention, which may be used in the manipulating apparatus 100' described above with reference to Figure 2. The finger 111 comprises a frame 113, which comprises a plurality of optical receivers 118. The frame is arranged around a central pad 114, which comprises one or more light emitting elements 117. When the or each light emitting element is activated then the entirety of the central pad is illuminated, sending light to an opposed finger.
Each of the plurality of optical receivers may comprise optical fibres which couple the received optical signal to the control unit 130' for processing. The number of optical receivers arranged within the finger will depend on the size of the finger and the resolution of the object size detection which is required. The central pad is shown as comprising two light emitting elements 117 but it should be understood that the central pad may comprise a single light emitting element or more than two light emitting elements. When the one or more light emitting elements are activated then the central pad 114 acts as the light source 116. If the central pad comprises two or more light emitting elements then each of the plurality of light emitting elements may emit light of a different wavelength (or colour). By having a pattern of light emitting elements, with each of the light emitting elements having a respective wavelength, the wavelength of the signals received at each of the optical receivers can also be used by the control unit 130' when determining the position and/or shape of an object. A similar effect may be obtained by forming the pad such that it has zones of different colours, such that the illumination provided by the one or more light emitting elements causes the light source to have zones having different colours.
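To illustrate the wavelength-coding idea (again as a hypothetical sketch, not the disclosed design), the colour observed at each receiver can be mapped back to the pad zone it came from: any zone whose wavelength is still detected must have an unobstructed path past the object. The region table, wavelengths, and matching window below are assumed values for the example.

```python
# Hypothetical wavelength-to-zone table: each emitting element (or coloured
# zone) on the pad uses its own wavelength, so the colour seen at a receiver
# indicates which part of the pad is still visible past the object.
EMITTER_ZONES = {470: "upper half", 630: "lower half"}  # centre wavelengths, nm

def visible_zones(received_wavelengths, table=EMITTER_ZONES, window=10):
    """Return the pad zones whose light reaches at least one receiver.

    received_wavelengths: dominant wavelengths (nm) measured at the receivers.
    window:               matching tolerance (nm) around each zone's centre.
    """
    seen = set()
    for wl in received_wavelengths:
        for centre, zone in table.items():
            if abs(wl - centre) <= window:
                seen.add(zone)
    return seen
```

Zones missing from the returned set are occluded, giving the control unit coarse shape information in addition to position.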
Figure 4 shows a schematic depiction of a first finger 111A and a second finger 111B according to this first specific embodiment of the present invention. It can be seen that the central pad acts as a single optical source 116, which radiates from the first finger 111A to the second finger 111B. In the absence of an object, each of the optical receivers in the second finger would receive the light transmitted from the central pad of the first finger. Figure 4 shows an object 200 between the first finger and the second finger and it can be seen that the object prevents light from reaching the middle two optical receivers shown in the second finger. However, light from the central pad of the first finger can be received at the uppermost and lowermost optical receivers which are shown in Figure 4. The control unit 130' can process the optical signals and use the resulting data when determining the location of the object relative to the first finger and the second finger. The data obtained from the processing of the optical signals may be used on its own.
Alternatively, this data may be used in conjunction with data received from other sensors comprised within one of the fingers. For the sake of clarity, the pad 114 of the second finger 111B is not shown in Figure 4.
Figure 5 shows a schematic depiction of the face of a finger 112 according to a second specific embodiment of the present invention, which may be used in the manipulating apparatus 100' described above with reference to Figure 2. The finger 112 comprises a frame 113 which is received around a central pad 114. The frame comprises a plurality of optical sources 116 and a plurality of optical receivers 118. One or more of the plurality of optical sources 116 may be activated, sending light to an opposed finger, where the optical signal(s) may be received by one or more of the plurality of optical receivers.
Figure 6 shows an alternative to the embodiment of the finger shown in Figure 5, with finger 112' comprising a frame 113 which is received around a central pad 114. The finger further comprises a plurality of optical fibres 119 which are received within the frame, located around the periphery of the central pad. The finger 112' shown in Figure 6 uses the optical fibres 119 as both the optical source and the optical receiver. An advantage of this arrangement is that it reduces the number of active components which are present in each finger. The optical fibres 119 connecting the finger to the control unit may be used as either an optical source or as an optical receiver. One or more of the plurality of optical fibres may function as an optical source when light is coupled into the fibre in response to an instruction from the control unit. Similarly, one or more of the plurality of optical fibres may function as an optical receiver when light enters into the fibre at the finger and is transmitted to an optical detector. It should be understood that the optical fibres 119 comprised in the finger 112' shown in Figure 6 could be assigned to form a group of optical sources 116 and a group of optical receivers 118. Whilst it would be possible for one or more of the optical fibres to operate simultaneously as an optical source and an optical receiver, this would require significant complexity in the design of the control unit and the optical interfaces.
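The role assignment for the shared fibres of Figure 6 could be organised as in the following sketch (an illustration under assumed conventions, not the disclosed scheme): on each sensing cycle alternate fibres are driven as sources while the remainder act as receivers, and the roles swap on the next cycle so that every face position is scanned in both directions without operating any fibre as source and receiver simultaneously.

```python
def fibre_roles(num_fibres, cycle):
    """Assign each fibre a role for the given sensing cycle.

    Even-indexed fibres emit on even cycles and receive on odd cycles; the
    odd-indexed fibres do the opposite, so the two groups alternate roles.
    """
    return ["source" if (i + cycle) % 2 == 0 else "receiver"
            for i in range(num_fibres)]

# Cycle 0: fibres 0 and 2 emit; cycle 1: fibres 1 and 3 emit instead.
roles_now = fibre_roles(4, cycle=0)   # ['source', 'receiver', 'source', 'receiver']
roles_next = fibre_roles(4, cycle=1)  # ['receiver', 'source', 'receiver', 'source']
```

Time-multiplexing the roles in this way keeps the control-unit optics simple, consistent with the passage's point that simultaneous source/receiver operation would add significant complexity.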
Figure 7 shows a schematic depiction of a manipulating apparatus comprising a first finger 112A and a second finger 112B according to this second specific embodiment of the present invention.
In this example, it may be seen that an optical signal is sent via one of the plurality of optical sources 116 of the first finger 112A (shown by the dashed line in Figure 7), which radiates from the first finger 112A to the second finger 112B. It can be seen that the optical signal would diverge and in the absence of an object, one or more of the optical receivers in the second finger would receive the light transmitted from the first finger. The number of receivers that receive the transmitted light may vary in accordance with the separation of the two fingers and the degree of divergence of the optical signal. Figure 7 shows an object 200 between the first finger and the second finger and it can be seen that the object prevents light from reaching the two lower optical receivers shown in the second finger. However, light from the first finger can be received at the two upper optical receivers which are shown in Figure 7 (shown by the long dashed lines). The control unit can process the optical signals and use the resulting data when determining the location of the object relative to the first finger and the second finger. For the sake of clarity, Figure 7 does not show the central pad 114 of either of the first finger or the second finger. The foregoing discussion with reference to Figure 7 shows a manipulating apparatus comprising fingers 112A & 112B. It should be understood that the manipulating apparatus may alternatively comprise fingers 112A' and 112B' (that is the fingers described above with reference to Figure 6).
Figure 8 shows a schematic depiction of control unit 130' which is a part of the manipulating apparatus described above with reference to Figure 2 and which may be used with the finger described above with reference to Figure 3 or with the finger described above with reference to Figure 5. The control unit comprises a processing unit 132, volatile data storage unit 133, nonvolatile data storage unit 134, actuator interface 136, first optical interface 138A and second optical interface 138B. The processing unit 132 is connected to the volatile data storage unit 133 and the non-volatile data storage unit 134. The processing unit 132 can access computer programs, instructions and other data stored on the non-volatile data storage unit and process them in order for the manipulating apparatus to operate. Furthermore, the processing unit may write data to, and read data from, the volatile data storage unit as required. The processing unit 132 is also in communication with the actuator interface 136 and the first and second optical interfaces 138A, 138B. The processing unit can send instructions to the first optical interface 138A which cause one or more of the optical sources 116 in the first finger to transmit an optical signal. Any signal(s) received at the optical receivers of the second finger may be received at the second optical interface 138B. These signal(s) may be sent to the processing unit which can analyse and interpret the received signal(s) to determine the location of an object between the first and second fingers. This location data may then be used to further activate the first actuator 120A and/or the second actuator 120B.
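The interaction between the processing unit, the optical interfaces, and the actuator interface described above can be summarised as one sensing/actuation cycle. The sketch below is a simplified illustration with assumed callables standing in for the interfaces 138A, 138B and 136; it is not the actual control-unit firmware.

```python
def control_cycle(fire_sources, read_receivers, locate, command_actuators):
    """One simplified sensing/actuation cycle of the control unit.

    fire_sources:     triggers the optical sources via the first optical interface.
    read_receivers:   returns receiver intensities via the second optical interface.
    locate:           maps receiver intensities to an object-location estimate
                      (None when no object is detected).
    command_actuators: sends the location estimate to the actuator interface.
    """
    fire_sources()
    readings = read_receivers()
    location = locate(readings)
    if location is not None:
        command_actuators(location)  # drive actuators 120A/120B toward the object
    return location
```

In a complete system this cycle would run repeatedly, with `locate` optionally fusing the optical data with the further sensor sub-systems mentioned below.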
The preceding discussion has described a manipulating apparatus which uses only the optical sensors described above. In practice, such an apparatus may incorporate further sensor sub-systems. In such a case, the optical sensor system described above may provide one input to the control unit which could be used when determining the position of the first and second actuators.
The processing unit may be, for example, one of a central processing unit (CPU), an adapted graphics processing unit (GPU) or an appropriately configured FPGA. The non-volatile data storage unit may comprise ROM data storage chips, solid state memory data storage or other optical or magnetic data storage technologies and the volatile data storage unit may comprise RAM data storage chips.
Modifications and Variations
Many modifications and variations can be made to the embodiments described above, without departing from the scope of the present invention.
The foregoing discussion has, for the sake of simplicity, described that the optical signals are generated in the first finger and received in the second finger. It should be understood that the signals may be sent in the reverse direction, that is from the second finger to the first finger. The finger which is used to generate the optical signals may be alternated periodically and/or both fingers may send signals at the same time. It should be understood that in such cases the first and second optical interfaces will need to be capable of both generating and receiving optical signals. The fingers described above comprise a central pad, with a frame received around it, such that the pad can engage with an object gripped between a pair of fingers. It should be understood that the present invention could be used in pairs of opposed fingers which incorporate alternative mechanisms to grip an object.
The first and second optical interfaces may comprise light emitting elements (for example, laser diodes, light emitting diodes (LEDs), etc.) and/or light detecting elements (for example photodiodes) as required. It may be convenient for the light emitting elements and the light detecting elements to be located outside of the control unit such that they are communicably connected to either or both of the first and second optical interfaces and to the light sources and light receivers. In such a case, a single optical interface 138 may be provided to enable signals to be sent from the processing unit to the light emitting elements and from the light detecting elements to the processing unit.
The optical signals may use wavelengths in the visible spectrum or alternatively the signals may use wavelengths in the infra-red spectrum so as to make use of emitters and receivers which are used in optical fibre communications systems. The light emitting elements may transmit at fixed wavelengths or may be capable of generating signals at a variable wavelength. Signals of differing wavelengths may be generated using a light emitting element which generates a wideband signal (such as a white light LED) and then using one or more filters to generate a signal having a narrow wavelength band. In one alternative, each of the plurality of light sources may use the same wavelength. In a further alternative, the plurality of light sources may be placed into one of a plurality of sub-groups. Different wavelengths could be allocated to each of the sub-groups to provide further information regarding the position of an object.
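As a hypothetical sketch of the sub-group idea above (the sub-group layout, wavelengths and detection threshold below are illustrative assumptions, not details taken from the embodiments), allocating a distinct wavelength to each sub-group would let a control unit infer which region of the frame is occluded simply by noting which wavelengths fail to arrive at the receivers:

```python
# Illustrative sketch only: sub-group layout, wavelengths (nm) and the
# threshold are assumptions, not taken from the specification.

# Each sub-group of light sources is assigned one wavelength.
SUB_GROUPS = {
    850: [0, 1, 2],  # e.g. sources along one edge of the frame
    905: [3, 4, 5],  # e.g. sources along the opposite edge
    940: [6, 7, 8],  # e.g. sources along the remaining edges
}

DETECTION_THRESHOLD = 0.5  # normalised received intensity

def blocked_sub_groups(received_intensity):
    """Return the wavelengths whose sub-group is (at least partly) occluded.

    received_intensity maps wavelength -> normalised intensity seen by the
    receivers tuned (or filtered) to that wavelength.
    """
    return [wl for wl, level in received_intensity.items()
            if level < DETECTION_THRESHOLD]

# Example: only the 850 nm group is occluded by a gripped object.
readings = {850: 0.1, 905: 0.9, 940: 0.8}
print(blocked_sub_groups(readings))  # [850]
```

Because each wavelength is tied to a known group of source positions, the list of missing wavelengths alone gives coarse information about where the object sits between the fingers, before any per-receiver processing is done.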
In one respect, the present invention relates to a manipulating apparatus, which may be of use in a picking system. The apparatus comprises a pair of opposed fingers, which can be moved to grip an object between the fingers. One or more light signals may be launched from one of the fingers such that a gripped object may block some, or all, of the light signals. The light signal, or signals, received by the other finger can be processed to determine the location of the object, and either or both of the fingers may be moved in response to the received light signal(s) or the location of the object.
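A minimal sketch of this processing step follows; the beam pitch, finger length and the simple centring rule are assumptions introduced for illustration, not features of the invention. The control unit maps the indices of the blocked beams to an extent along the finger and derives a corrective movement from it:

```python
# Illustrative sketch: beam spacing and the centring rule are assumptions.

BEAM_PITCH_MM = 5.0  # assumed spacing between adjacent beams

def object_extent(beam_received):
    """Estimate the object's span (mm) along the finger.

    beam_received[i] is True when beam i reaches the opposing finger.
    Returns (start_mm, end_mm) of the occluded region, or None when no
    beam is blocked (no object between the fingers).
    """
    blocked = [i for i, ok in enumerate(beam_received) if not ok]
    if not blocked:
        return None
    return (blocked[0] * BEAM_PITCH_MM, (blocked[-1] + 1) * BEAM_PITCH_MM)

def centring_offset(beam_received, finger_length_mm):
    """Signed distance (mm) to move the fingers so the object is centred."""
    extent = object_extent(beam_received)
    if extent is None:
        return 0.0
    object_centre = (extent[0] + extent[1]) / 2.0
    return finger_length_mm / 2.0 - object_centre

# Beams 3-5 of ten are blocked by a gripped object.
beams = [True, True, True, False, False, False, True, True, True, True]
print(object_extent(beams))          # (15.0, 30.0)
print(centring_offset(beams, 50.0))  # 2.5
```

A real control unit would feed an offset of this kind to the actuators, possibly fused with data from further sensors, rather than applying it open-loop as shown here.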
The foregoing description of embodiments of the invention has been presented for the purpose of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed.

Claims (16)

  1. A finger for use in a manipulating apparatus, the finger comprising: a light source; a plurality of optical receivers; and a face configured, in use, to engage with an object.
  2. A finger according to claim 1, wherein the face comprises a pad and a frame, wherein the frame is arranged around the periphery of the pad.
  3. A finger according to claim 2, wherein the pad comprises the light source.
  4. A finger according to claim 3, wherein the light source comprises a plurality of light emitting elements, each of the light emitting elements emitting light at a respective wavelength.
  5. A finger according to claim 3, wherein the pad comprises a plurality of different regions, each of the pad regions having a different colour.
  6. A finger according to claim 2, wherein the finger comprises a plurality of light sources, each of the plurality of light sources being located within the frame.
  7. A finger according to claim 6, wherein one or more of the plurality of light sources comprises an optical fibre.
  8. A finger according to claim 6, wherein one or more of the plurality of light sources comprises a light emitting element.
  9. A finger according to claim 8, wherein the or each light emitting element emits light at a respective wavelength.
  10. A finger according to claim 2, wherein each of the plurality of optical receivers is received within the frame.
  11. A finger according to claim 10, wherein each of the plurality of optical receivers comprises an optical fibre.
  12. A manipulating apparatus comprising: a first finger according to any of claims 1 to 11; a second finger according to any of claims 1 to 10, the first finger being arranged such that it is opposed to the second finger; one or more actuators; and a control unit, wherein, in use, the control unit activates the one or more actuators to move the first finger relative to the second finger such that the first finger and the second finger engage an object, wherein light signals received by the second finger from the first finger are used by the control unit to determine the movement of the first finger relative to the second finger.
  13. An apparatus according to claim 12, wherein light signals received by the first finger from the second finger are used by the control unit to determine the movement of the first finger relative to the second finger.
  14. An apparatus according to claim 12 or claim 13, wherein the apparatus comprises one or more further sensors and data from the or each sensor are used by the control unit in conjunction with the light signals received from the first finger and/or the second finger to determine the movement of the first finger relative to the second finger.
  15. A method of manipulating an object, the method comprising the steps of: i) moving a first finger relative to a second finger in order to grip an object between the first finger and the second finger; ii) generating one or more optical signals at the first finger; iii) detecting the one or more optical signals at the second finger; and iv) determining the position of the object in accordance with the one or more detected optical signals.
  16. A method according to claim 15, wherein the method comprises the further step of v) moving the first finger relative to the second finger in accordance with the position determined in step iv).
GB2111502.7A 2020-08-11 2021-08-10 Object presence sensing Withdrawn GB2604195A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB2012448.3A GB202012448D0 (en) 2020-08-11 2020-08-11 Object presence sensing

Publications (1)

Publication Number Publication Date
GB2604195A true GB2604195A (en) 2022-08-31

Family

ID=72519936

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB2012448.3A Ceased GB202012448D0 (en) 2020-08-11 2020-08-11 Object presence sensing
GB2111502.7A Withdrawn GB2604195A (en) 2020-08-11 2021-08-10 Object presence sensing


Country Status (2)

Country Link
GB (2) GB202012448D0 (en)
WO (1) WO2022034107A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024035432A1 (en) * 2022-08-10 2024-02-15 Siemens Corporation Sensor control system for coanda-based end effectors

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5313767A (en) * 1976-07-20 1978-02-07 Shin Meiwa Ind Co Ltd Gripper for industrial robot
WO1984004723A1 (en) * 1983-05-27 1984-12-06 American Telephone & Telegraph Robotic systems utilizing optical sensing
US4852928A (en) * 1984-02-16 1989-08-01 Multivisions Corporation Robotic end effectors
JPH0699389A (en) * 1992-09-18 1994-04-12 Toshiba Corp Handling device for work arm
JP2005257343A (en) * 2004-03-09 2005-09-22 Nagoya Industrial Science Research Inst Optical tactile sensor, and sensing method and system, object operation force control method and device, object gripping force control device, and robot hand using optical tactile sensor
US8629987B2 (en) * 2009-12-01 2014-01-14 Seiko Epson Corporation Optical-type position detecting device, hand apparatus, and touch panel
US20150298321A1 (en) * 2012-10-16 2015-10-22 Beckman Coulter, Inc. System and method including specimen gripper

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59353B2 (en) * 1980-07-24 1984-01-06 ファナック株式会社 gripping device
JPS6171302A (en) * 1984-09-14 1986-04-12 Toshiba Corp Access sensor for robot hand
US9120233B2 (en) * 2012-05-31 2015-09-01 Toyota Motor Engineering & Manufacturing North America, Inc. Non-contact optical distance and tactile sensing system and method


Also Published As

Publication number Publication date
GB202012448D0 (en) 2020-09-23
WO2022034107A1 (en) 2022-02-17


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)