WO2022034107A1 - Object presence sensing - Google Patents

Object presence sensing

Info

Publication number
WO2022034107A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
optical
light
control unit
fingers
Application number
PCT/EP2021/072311
Other languages
French (fr)
Inventor
Jan Fras
Hussein MNYUSIWALLA
Panagiotis Sotiropoulos
Enrique DEL SOL ACERO
Original Assignee
Ocado Innovation Limited
Application filed by Ocado Innovation Limited
Publication of WO2022034107A1

Classifications

    • B25J15/0253: Gripping heads and other end effectors, servo-actuated, comprising parallel grippers
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/082: Grasping-force detectors
    • B25J13/083: Grasping-force detectors fitted with slippage detectors
    • B25J13/088: Controls for manipulators with position, velocity or acceleration sensors
    • B25J15/08: Gripping heads and other end effectors having finger members
    • B25J19/021: Optical sensing devices
    • B25J19/025: Optical sensing devices including optical fibres
    • B25J9/1612: Programme controls characterised by the hand, wrist, grip control
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • G01L5/16: Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes for measuring several components of force
    • G05B2219/45063: Pick and place manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a manipulating apparatus, which may be of use in a picking system. The apparatus comprises a pair of opposed fingers, which can be moved to grip an object between the fingers. One or more light signals may be launched from one of the fingers such that a gripped object may block some, or all, of the light signals. The light signal, or signals, received by the other finger can be processed to determine the location of the object and either or both of the fingers may be moved in response to the received light signal(s) or the location of the object.

Description

OBJECT PRESENCE SENSING
Technical Field
The present invention relates generally to the field of sensing and more specifically to an apparatus and method for sensing the position of an object.
Background
Automated and semi-automated picking systems require robotic picking stations which are able to select an object from a first receptacle, such as a tote or other storage unit, grip the object, and then move the object into a second receptacle, such as a further tote or a bag. The picking station may comprise a manipulating apparatus which needs to be able to pick objects which may vary in size, shape, weight and physical durability. If too much force is applied when an object is gripped then the object may be damaged. However, if too little force is applied when an object is gripped then the object may be dropped before it can be transferred into the second receptacle. During the pick and place operation the manipulating apparatus may incorporate a sensor that can detect the relative pose of the gripping elements and the object gripped in order for the system to be able to react accordingly and maintain the grasp. Currently available sensors have a number of drawbacks, for example, they are too bulky to integrate into the manipulating apparatus, are unreliable in operation, are expensive, etc.
Summary
In view of the problems in known sensing systems for use in a manipulating apparatus which are suitable for use in a picking system, the present invention aims to provide an optical sensor system which can be integrated into a manipulating apparatus.
In general terms, the invention introduces one or more optical sources within a first component and a plurality of optical receivers within a second component of a manipulating apparatus. In operation, the first component is in opposition to the second component such that an object can be gripped by the first and second components. Light from the light source may be obscured by the object, preventing light from being detected at one or more of the optical receivers. This information can be processed to determine the position of the object relative to the first and second components.
According to a first aspect of the present invention there is provided a finger for use in a manipulating apparatus, the finger comprising: a light source; a plurality of optical receivers; and a face configured, in use, to engage with an object. The face may comprise a pad and a frame, wherein the frame is arranged around the periphery of the pad.
The pad may comprise the light source. In particular, the pad may comprise a plurality of light emitting elements, each of the light emitting elements emitting light at a respective wavelength. Alternatively, the pad may comprise a plurality of different regions, each of the pad regions having a different colour.
The finger may comprise a plurality of light sources, each of the plurality of light sources being located within the frame. One or more of the plurality of light sources may comprise an optical fibre. Alternatively, one or more of the plurality of light sources may comprise a light emitting element. The or each light emitting element may emit light at a respective wavelength.
Each of the plurality of optical receivers may be received within the frame. Each of the plurality of optical receivers may comprise an optical fibre.
According to a second aspect of the present invention there is provided a manipulating apparatus comprising: a first finger according to the first aspect of the invention; a second finger according to the first aspect of the invention, the first finger being arranged such that it is opposed to the second finger; one or more actuators and a control unit, wherein, in use, the control unit activates the one or more actuators to move the first finger relative to the second finger such that the first finger and the second finger engage an object, wherein light signals received by the second finger from the first finger are used by the control unit to determine the movement of the first finger relative to the second finger.
The light signals received by the first finger from the second finger may be used by the control unit to determine the movement of a gripped object relative to the first finger and the second finger. If the object is moving relative to the fingers then the fingers can be activated to increase the force applied to the object so that it is held more securely by the fingers. The apparatus may comprise one or more further sensors and data from the or each sensor may be used by the control unit in conjunction with the light signals received from the first finger and/or the second finger to determine the movement of the first finger relative to the second finger.
According to a third aspect of the present invention there is provided a method of manipulating an object, the method comprising the steps of: i) moving a first finger relative to a second finger in order to grip an object between the first finger and the second finger; ii) generating one or more optical signals at the first finger; iii) detecting the one or more optical signals at the second finger; and iv) determining the position of the object in accordance with the one or more detected optical signals. The method may comprise the further step of v) moving the first finger relative to the second finger in accordance with the position determined in step iv).
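By way of illustration only, the method of the third aspect could be sketched in Python as follows. The gripper interface used here (close_fingers, emit_optical_signals, read_optical_receivers, object_is_slipping, block_threshold) is a hypothetical abstraction introduced for the example and is not part of the disclosure.

```python
# Illustrative sketch of method steps i) to v); the gripper interface is hypothetical.
from typing import List, Optional


def manipulate_object(gripper, step_mm: float = 0.5, max_cycles: int = 40) -> Optional[List[int]]:
    """Grip an object and refine the grip using the optically determined position."""
    gripper.close_fingers()                          # i) move the first finger relative to the second
    blocked: List[int] = []
    for _ in range(max_cycles):
        gripper.emit_optical_signals()               # ii) generate optical signals at the first finger
        levels = gripper.read_optical_receivers()    # iii) detect the signals at the second finger
        blocked = [i for i, v in enumerate(levels) if v < gripper.block_threshold]
        if not blocked:
            return None                              # iv) no object detected between the fingers
        if gripper.object_is_slipping(blocked):      # iv) occlusion pattern (position) is changing
            gripper.close_fingers(step_mm)           # v) move the fingers in accordance with step iv)
        else:
            break
    return blocked                                   # occlusion pattern of the securely held object
```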
Brief Description of the Drawings
Embodiments of the invention will now be described by way of example only with reference to the accompanying drawings, in which like reference numbers designate the same or corresponding parts, and in which:
Figure 1 shows a schematic depiction of a conventional manipulating apparatus incorporating two opposed fingers;
Figure 2 shows a schematic depiction of a manipulating apparatus according to an embodiment of the present invention;
Figure 3 shows a schematic depiction of the face of a finger according to a first specific embodiment of the present invention;
Figure 4 shows a schematic depiction of a manipulating apparatus comprising a first finger 111A and a second finger 111B as shown in Figure 3;
Figure 5 shows a schematic depiction of the face of a finger 112 according to a second specific embodiment of the present invention;
Figure 6 shows an alternative to the embodiment of the finger of Figure 5;
Figure 7 shows a schematic depiction of a manipulating apparatus comprising a first finger 112A and a second finger 112B as shown in Figure 5; and
Figure 8 shows a schematic depiction of control unit 130’.
Detailed Description of Embodiments
Figure 1 shows a schematic depiction of a conventional manipulating apparatus 100. In particular, Figure 1a shows a schematic depiction of a conventional manipulating apparatus 100 incorporating two opposed fingers 110 and Figure 1b shows a schematic depiction of the face of one of the fingers 110 which is used to make contact with an object. The manipulating apparatus 100 shown in Figure 1a further comprises a first finger 110A, a second finger 110B and a control unit 130. Each of the first and second fingers comprises a respective actuator 120A, 120B, which are connected to the control unit 130. The control unit may activate the first actuator 120A and/or the second actuator 120B to move or rotate the respective finger(s) 110A, 110B such that the fingers can co-operate to grip an object 200. Sensors may be provided within the fingers such that contact with an object may be detected. Information obtained from the sensors may then be supplied to the control unit such that the first actuator 120A and/or the second actuator 120B can be controlled in response to the detected contact. The face comprises a central pad 114 which is surrounded by a frame 113. The central pad may protrude from the plane of the frame to grip an object 200. This enables delicate and/or irregularly shaped objects to be held whilst reducing the probability that the object is damaged. The central pad may have a relatively high coefficient of friction, specifically so that it is higher than the coefficient of friction of the frame 113, to assist in the gripping of the object. The surface of the central pad may be textured or grooved to provide the increase in friction. Alternatively, or in addition, the material used to form the central pad may be selected so as to have a suitable coefficient of friction.
Figure 2 shows a schematic depiction of a manipulating apparatus 100’ according to an embodiment of the present invention. The first finger 111A and the second finger 111B are similar to the first and second fingers 110A, 110B described above with reference to Figures 1a & 1b. Specifically, each of the first and second fingers 111A, 111B comprises a respective actuator 120A, 120B, which are connected to the control unit 130’. The control unit may activate the first actuator 120A and/or the second actuator 120B to move or rotate the respective finger(s) 111A, 111B such that the fingers can co-operate to grip an object 200. Sensors may be provided within the fingers such that contact with an object may be detected. Information obtained from the sensors may then be supplied to the control unit such that the first actuator 120A and/or the second actuator 120B can be controlled in response to the detected contact. The face comprises a central pad 114 which is surrounded by a frame 113. The central pad may protrude from the plane of the frame to grip an object 200. This enables delicate and/or irregularly shaped objects to be held whilst reducing the probability that the object is damaged. The central pad may have a relatively high coefficient of friction, specifically so that it is higher than the coefficient of friction of the frame 113, to assist in the gripping of the object. The surface of the central pad may be textured or grooved to provide the increase in friction. Alternatively, or in addition, the material used to form the central pad may be selected so as to have a suitable coefficient of friction.
The first finger 111A additionally comprises a plurality of optical sources 116 which are operable to send optical signals towards the second finger 111B. The second finger 111B additionally comprises a plurality of optical receivers 118 which are configured to receive the optical signals which are sent from the plurality of optical sources and to then route the received signals to the control unit 130’. It should be understood that if there is no object located in between the first finger and the second finger then each of the optical receivers 118 will receive light from the optical sources. However, as is shown in Figure 2, if an object 200 is present then it may block some of the light signals sent from the plurality of optical sources. Depending on the size, position and the composition of the object, some of the plurality of optical receivers may receive no signal from the plurality of optical sources, some of the plurality of optical receivers may receive an attenuated signal and some of the plurality of optical receivers may not experience any change in the received optical signal. The signals received at each of the plurality of optical receivers 118 can be sent to the control unit 130’. The control unit can process the optical signals and use the resulting data when determining the location of the object relative to the first finger and the second finger. This data may be used in conjunction with data generated from other sensor signals when determining the location of the object. The determined location information may then be used by the control unit 130’ when sending signals to the first actuator and/or the second actuator to grip the object or to move the object from its present position to a further position. For example, if the data received from the optical receivers indicates that a grasped object is moving relative to the first and second fingers then this can be interpreted to mean that the object is weakly grasped and the first and second fingers should be brought closer together to establish a firmer hold on the object. This relative movement of the fingers may be applied until it is detected that the movement of the grasped object relative to the first and second fingers has ceased.
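Purely as an illustration of the three receiver outcomes described above (no signal, attenuated signal, unchanged signal), the classification step might be sketched as follows. The baseline levels and threshold fractions are assumptions for the example, not values taken from the disclosure.

```python
# Illustrative sketch: label each receiver reading against its unobstructed baseline.
from typing import List


def classify_receivers(levels: List[float], baseline: List[float],
                       blocked_frac: float = 0.1, attenuated_frac: float = 0.8) -> List[str]:
    """Return 'blocked', 'attenuated' or 'unchanged' for each optical receiver 118."""
    labels = []
    for level, reference in zip(levels, baseline):
        ratio = level / reference if reference > 0 else 0.0
        if ratio <= blocked_frac:
            labels.append("blocked")        # receiver fully shadowed by the object
        elif ratio <= attenuated_frac:
            labels.append("attenuated")     # object partially obscures or scatters the light
        else:
            labels.append("unchanged")      # receiver still sees the source directly
    return labels


# Example: four receivers, the middle two partially or fully shadowed by a gripped object.
print(classify_receivers([0.95, 0.02, 0.4, 1.0], baseline=[1.0, 1.0, 1.0, 1.0]))
# -> ['unchanged', 'blocked', 'attenuated', 'unchanged']
```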
It should be understood that both the first finger and the second finger may comprise both a plurality of optical sources and a plurality of optical receivers and that during the process of using the apparatus to grip an object either or both fingers can be used to generate optical signals which can be detected by the opposed finger to determine the size and/or location of the object to be gripped. The first and second fingers may be manoeuvred such that an object is held between them. If the optical sensors detect that the object is moving relative to the fingers, that is that the detected position of the object is changing, then the force applied by the first finger and/or the second finger may be increased. The increase in applied force provides a firmer grip on the object. The force applied by the fingers may be increased until no further movement of the object relative to the fingers is detected. Once it has been determined that the object is being held securely then the force applied by the fingers may be maintained at that level until the apparatus releases the object.
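The grip-tightening behaviour described above (increase the applied force until the object stops moving relative to the fingers) might be sketched as follows. The gripper interface and the numeric limits are illustrative assumptions only.

```python
# Illustrative sketch: step up the grip force until the optical position estimate is stable.
import time


def tighten_until_stable(gripper, force_step_n: float = 0.2, settle_s: float = 0.05,
                         tolerance_mm: float = 0.5, max_force_n: float = 20.0) -> float:
    """Increase the grip force until no further movement of the object relative to the fingers is detected."""
    force = gripper.current_force()
    previous = gripper.read_object_centre()       # position estimate from the optical receivers
    while force < max_force_n:
        time.sleep(settle_s)                      # allow the grasp to settle between samples
        current = gripper.read_object_centre()
        if abs(current - previous) <= tolerance_mm:
            break                                 # object is held securely; hold this force level
        force = min(force + force_step_n, max_force_n)
        gripper.apply_force(force)                # firmer grip in response to the detected slip
        previous = current
    return force
```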
Figure 3 shows a schematic depiction of the face of a finger 111 according to a first specific embodiment of the present invention, which may be used in the manipulating apparatus 100’ described above with reference to Figure 2. The finger 111 comprises a frame 113, which comprises a plurality of optical receivers 118. The frame is arranged around a central pad 114, which comprises one or more light emitting elements 117. When the or each light emitting element is activated then the entirety of the central pad is illuminated, sending light to an opposed finger.
Each of the plurality of optical receivers may comprise optical fibres which couple the received optical signal to the control unit 130’ for processing. The number of optical receivers arranged within the finger will depend on the size of the finger and the resolution of the object size detection which is required. The central pad is shown as comprising two light emitting elements 117 but it should be understood that the central pad may comprise a single light emitting element or more than two light emitting elements. When the one or more light emitting elements are activated then the central pad 114 acts as the light source 116. If the central pad comprises two or more light emitting elements then each of the plurality of light emitting elements may emit light of a different wavelength (or colour). By having a pattern of light emitting elements, with each of the light emitting elements having a respective wavelength, the wavelength of the signals received at each of the optical receivers can also be used by the control unit 130’ when determining the position and/or shape of an object. A similar effect may be obtained by forming the pad such that it has zones of different colours, such that the illumination provided by the one or more light emitting elements causes the light source to have zones having different colours.
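As an illustration of how the received wavelength could carry position information, consider a hypothetical pad divided into three colour-coded zones. The zone names, wavelengths and tolerance below are assumptions chosen only for the example.

```python
# Illustrative sketch: map the dominant wavelength seen at a receiver back to a pad zone.
from typing import Dict, List, Optional

# Hypothetical layout: pad zone name -> dominant wavelength in nanometres
ZONE_WAVELENGTHS: Dict[str, float] = {"upper": 470.0, "middle": 530.0, "lower": 625.0}


def visible_zone(received_wavelength_nm: Optional[float], tolerance_nm: float = 15.0) -> Optional[str]:
    """Return the pad zone a receiver can still see, or None if it is fully blocked."""
    if received_wavelength_nm is None:            # receiver fully blocked by the object
        return None
    for zone, nominal in ZONE_WAVELENGTHS.items():
        if abs(received_wavelength_nm - nominal) <= tolerance_nm:
            return zone
    return None


def occluded_zones(readings: List[Optional[float]]) -> List[str]:
    """Zones that no receiver can see: these are shadowed by the object."""
    seen = {visible_zone(w) for w in readings} - {None}
    return [zone for zone in ZONE_WAVELENGTHS if zone not in seen]


# Example: two receivers see the blue zone, one sees red, none sees green.
print(occluded_zones([468.0, None, 627.0, 472.0]))   # -> ['middle']
```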
Figure 4 shows a schematic depiction of a first finger 111A and a second finger 111B according to this first specific embodiment of the present invention. It can be seen that the central pad acts as a single optical source 116, which radiates from the first finger 111A to the second finger 111B. In the absence of an object, each of the optical receivers in the second finger would receive the light transmitted from the central pad of the first finger. Figure 4 shows an object 200 between the first finger and the second finger and it can be seen that the object prevents light from reaching the middle two optical receivers shown in the second finger. However, light from the central pad of the first finger can be received at the uppermost and lowermost optical receivers which are shown in Figure 4. The control unit 130’ can process the optical signals and use the resulting data when determining the location of the object relative to the first finger and the second finger. The data obtained from the processing of the optical signals may be used on its own. Alternatively, this data may be used in conjunction with data received from other sensors comprised within one of the fingers. For the sake of clarity, the pad 114 of the second finger 111B is not shown in Figure 4.
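For the Figure 4 situation, the indices of the shadowed receivers could be turned into a rough centre and extent for the object, assuming the receivers are evenly spaced along the frame. The 10 mm pitch used below is an arbitrary assumption.

```python
# Illustrative sketch: estimate the shadowed span from the blocked receiver indices.
from typing import List, Optional, Tuple


def estimate_object(blocked: List[int], receiver_pitch_mm: float = 10.0) -> Optional[Tuple[float, float]]:
    """Return (centre_mm, extent_mm) of the occluded region along the finger."""
    if not blocked:
        return None
    first, last = min(blocked), max(blocked)
    centre = (first + last) / 2 * receiver_pitch_mm
    extent = (last - first + 1) * receiver_pitch_mm   # resolution limited by the receiver pitch
    return centre, extent


# Figure 4 example: the middle two of four receivers (indices 1 and 2) are in shadow.
print(estimate_object(blocked=[1, 2]))   # -> (15.0, 20.0)
```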
Figure 5 shows a schematic depiction of the face of a finger 112 according to a second specific embodiment of the present invention, which may be used in the manipulating apparatus 100’ described above with reference to Figure 2. The finger 112 comprises a frame 113 which is received around a central pad 114. The frame comprises a plurality of optical sources 116 and a plurality of optical receivers 118. One or more of the plurality of optical sources 116 may be activated, sending light to an opposed finger, where the optical signal(s) may be received by one or more of the plurality of optical receivers.
Figure 6 shows an alternative to the embodiment of the finger shown in Figure 5, with finger 112’ comprising a frame 113 which is received around a central pad 114. The finger further comprises a plurality of optical fibres 119 which are received within the frame, located around the periphery of the central pad. The finger 112’ shown in Figure 6 uses the optical fibres 119 as both the optical source and the optical receiver. An advantage of this arrangement is that it reduces the number of active components which are present in each finger. The optical fibres 119 connecting the finger to the control unit may be used as either an optical source or as an optical receiver. One or more of the plurality of optical fibres may function as an optical source when light is coupled into the fibre in response to an instruction from the control unit. Similarly, one or more of the plurality of optical fibres may function as an optical receiver when light enters into the fibre at the finger and is transmitted to an optical detector. It should be understood that the optical fibres 119 comprised in the finger 112’ shown in Figure 6 could be assigned to form a group of optical sources 116 and a group of optical receivers 118. Whilst it would be possible for one or more of the optical fibres to operate simultaneously as an optical source and an optical receiver, this would require significant complexity in the design of the control unit and the optical interfaces.
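One simple way to realise the source/receiver split described above would be to alternate the fibres around the frame between the two roles. This allocation scheme is an assumption made for illustration, not a requirement of the embodiment.

```python
# Illustrative sketch: alternate the peripheral fibres 119 between source and receiver roles.
from typing import List, Tuple


def assign_fibres(n_fibres: int) -> Tuple[List[int], List[int]]:
    """Split the fibres into a source group (116) and a receiver group (118)."""
    sources = [i for i in range(n_fibres) if i % 2 == 0]
    receivers = [i for i in range(n_fibres) if i % 2 == 1]
    return sources, receivers


print(assign_fibres(8))   # -> ([0, 2, 4, 6], [1, 3, 5, 7])
```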
Figure 7 shows a schematic depiction of a manipulating apparatus comprising a first finger 112A and a second finger 112B according to this second specific embodiment of the present invention. In this example, it may be seen that an optical signal is sent via one of the plurality of optical sources 116 of the first finger 112A (shown by the dashed line in Figure 7), which radiates from the first finger 112A to the second finger 112B. It can be seen that the optical signal would diverge and in the absence of an object, one or more of the optical receivers in the second finger would receive the light transmitted from the first finger. The number of receivers that receive the transmitted light may vary in accordance with the separation of the two fingers and the degree of divergence of the optical signal. Figure 7 shows an object 200 between the first finger and the second finger and it can be seen that the object prevents light from reaching the two lower optical receivers shown in the second finger. However, light from the first finger can be received at the two upper optical receivers which are shown in Figure 7 (shown by the long dashed lines). The control unit can process the optical signals and use the resulting data when determining the location of the object relative to the first finger and the second finger. For the sake of clarity, Figure 7 does not show the central pad 114 of either of the first finger or the second finger. The foregoing discussion with reference to Figure 7 shows a manipulating apparatus comprising fingers 112A & 112B. It should be understood that the manipulating apparatus may alternatively comprise fingers 112A’ and 112B’ (that is the fingers described above with reference to Figure 6).
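The effect of beam divergence and finger separation on which receivers are illuminated can be shown with a small geometric sketch. The dimensions and the half-angle below are assumptions, not values from the disclosure.

```python
# Illustrative sketch: which receivers on the opposed finger lie inside a diverging beam.
import math
from typing import List


def illuminated_receivers(source_y_mm: float, separation_mm: float,
                          half_angle_deg: float, receiver_y_mm: List[float]) -> List[int]:
    """Indices of receivers inside the cone of a single diverging source on the opposed finger."""
    spread = separation_mm * math.tan(math.radians(half_angle_deg))
    low, high = source_y_mm - spread, source_y_mm + spread
    return [i for i, y in enumerate(receiver_y_mm) if low <= y <= high]


# Example: source at 20 mm, fingers 60 mm apart, 15 degree half-angle,
# four receivers spaced 10 mm apart along the opposed finger.
print(illuminated_receivers(20.0, 60.0, 15.0, [0.0, 10.0, 20.0, 30.0]))   # -> [1, 2, 3]
```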
Figure 8 shows a schematic depiction of control unit 130’ which is a part of the manipulating apparatus described above with reference to Figure 2 and which may be used with the finger described above with reference to Figure 3 or with the finger described above with reference to Figure 5. The control unit comprises a processing unit 132, volatile data storage unit 133, nonvolatile data storage unit 134, actuator interface 136, first optical interface 138A and second optical interface 138B. The processing unit 132 is connected to the volatile data storage unit 133 and the non-volatile data storage unit 134. The processing unit 132 can access computer programs, instructions and other data stored on the non-volatile data storage unit and process them in order for the manipulating apparatus to operate. Furthermore, the processing unit may write data to, and read data from, the volatile data storage unit as required. The processing unit 132 is also in communication with the actuator interface 136 and the first and second optical interfaces 138A, 138B. The processing unit can send instructions to the first optical interface 138A which cause one or more of the optical sources 116 in the first finger to transmit an optical signal. Any signal(s) received at the optical receivers of the second finger may be received at the second optical interface 138B. These signal(s) may be sent to the processing unit which can analyse and interpret the received signal(s) to determine the location of an object between the first and second fingers. This location data may then be used to further activate the first actuator 120A and/or the second actuator 120B.
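A purely structural sketch of one sensing cycle handled by the control unit 130’ might look as follows. The class and method names are illustrative stand-ins for the processing unit 132 and the optical interfaces 138A, 138B and are not taken from the disclosure.

```python
# Illustrative sketch of a sensing cycle: drive the first optical interface, read the second.
from dataclasses import dataclass, field
from typing import List


@dataclass
class OpticalInterface:
    """Stands in for interface 138A/138B: drive sources or report receiver levels."""
    n_channels: int
    levels: List[float] = field(default_factory=list)

    def transmit(self, channels: List[int]) -> None:
        pass  # would energise the selected optical sources 116

    def read(self) -> List[float]:
        return self.levels or [1.0] * self.n_channels  # unobstructed levels by default


@dataclass
class ControlUnit:
    tx: OpticalInterface          # first optical interface 138A
    rx: OpticalInterface          # second optical interface 138B

    def sense_once(self, block_threshold: float = 0.1) -> List[int]:
        """One sensing cycle: transmit, read, and report which receivers are blocked."""
        self.tx.transmit(list(range(self.tx.n_channels)))
        readings = self.rx.read()
        return [i for i, level in enumerate(readings) if level < block_threshold]


unit = ControlUnit(tx=OpticalInterface(4), rx=OpticalInterface(4, levels=[1.0, 0.02, 0.03, 0.9]))
print(unit.sense_once())   # -> [1, 2]
```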
The preceding discussion has described a manipulating apparatus which uses only the optical sensors described above. In practice, such an apparatus may incorporate further sensor subsystems. In such a case, the optical sensor system described above may provide one input to the control unit which could be used when determining the position of the first and second actuators.
The processing unit may be, for example, one of a central processing unit (CPU), an adapted graphics processing unit (GPU) or an appropriately configured FPGA. The non-volatile data storage unit may comprise ROM data storage chips, solid state memory data storage or other optical or magnetic data storage technologies and the volatile data storage unit may comprise RAM data storage chips.
Modifications and Variations
Many modifications and variations can be made to the embodiments described above, without departing from the scope of the present invention.
The foregoing discussion has, for the sake of simplicity, described that the optical signals are generated in the first finger and received in the second finger. It should be understood that the signals may be sent in the reverse direction, that is from the second finger to the first finger. The finger which is used to generate the optical signals may be alternated periodically and/or both fingers may send signals at the same time. It should be understood that in such cases the first and second optical interfaces will need to be capable of both generating and receiving optical signals. The fingers described above comprise a central pad, with a frame received around it, such that the pad can engage with an object gripped between a pair of fingers. It should be understood that the present invention could be used in pairs of opposed fingers which incorporate alternative mechanisms to grip an object. The first and second optical interfaces may comprise light emitting elements (for example, laser diodes, light emitting diodes (LEDs), etc.) and/or light detecting elements (for example photodiodes) as required. It may be convenient for the light emitting elements and the light detecting elements to be located outside of the control unit such that they are communicably connected to either or both of the first and second optical interfaces and to the light sources and light receivers. In such a case, a single optical interface 138 may be provided to enable signals to be sent from the processing unit to the light emitting elements and from the light detecting elements to the processing unit.
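The variation in which the transmitting finger is alternated could be sketched as follows. The interface methods (transmit_all, read) are hypothetical, and both interfaces are assumed to be capable of transmitting and receiving, as noted above.

```python
# Illustrative sketch: swap the transmit/receive roles of the two fingers on each cycle.
def alternating_sense(interface_a, interface_b, cycles: int = 4):
    """Yield (transmitting_interface, receiver_readings) for each sensing cycle."""
    pair = (interface_a, interface_b)
    for i in range(cycles):
        tx, rx = pair if i % 2 == 0 else (pair[1], pair[0])
        tx.transmit_all()           # energise the optical sources of the transmitting finger
        yield tx, rx.read()         # sample the optical receivers of the opposed finger
```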
The optical signals may use wavelengths in the visible spectrum or alternatively the signals may use wavelengths in the infra-red spectrum so as to make use of emitters and receivers which are used in optical fibre communications systems. The light emitting elements may transmit at fixed wavelengths or may be capable of generating signals at a variable wavelength. Signals of differing wavelengths may be generated using a light emitting element which generates a wideband signal (such as a white light LED) and then using one or more filters to generate a signal having a narrow wavelength band. In one alternative, each of the plurality of light sources may use the same wavelength. In a further alternative, the plurality of light sources may be placed into one of a plurality of sub-groups. Different wavelengths could be allocated to each of the subgroups to provide further information regarding the position of an object.
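Allocating wavelengths to sub-groups of sources, as suggested above, could be sketched as follows. The group count and the wavelength values are assumptions chosen only for the example.

```python
# Illustrative sketch: split the light sources into contiguous sub-groups, one wavelength each.
from typing import Dict, List


def allocate_wavelengths(n_sources: int, group_wavelengths_nm: List[float]) -> Dict[int, float]:
    """Assign each source index a wavelength according to its sub-group."""
    groups = len(group_wavelengths_nm)
    per_group = -(-n_sources // groups)          # ceiling division
    return {i: group_wavelengths_nm[min(i // per_group, groups - 1)]
            for i in range(n_sources)}


# Eight sources split across three wavelengths (e.g. blue, green, red).
print(allocate_wavelengths(8, [470.0, 530.0, 625.0]))
```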
In one respect, the present invention relates to a manipulating apparatus, which may be of use in a picking system. The apparatus comprises a pair of opposed fingers, which can be moved to grip an object between the fingers. One or more light signals may be launched from one of the fingers such that a gripped object may block some, or all, of the light signals. The light signal, or signals, received by the other finger can be processed to determine the location of the object, and either or both of the fingers may be moved in response to the received light signal(s) or the location of the object.
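A minimal sketch of this principle is given below, assuming evenly spaced receivers along the receiving finger; the receiver pitch, threshold and function name are illustrative assumptions rather than details taken from the disclosure.

```python
# Minimal sketch of locating an object from the pattern of blocked beams,
# assuming evenly spaced receivers; spacing and threshold are assumptions.

def object_span_mm(received_levels, receiver_pitch_mm=5.0, threshold=0.5):
    """Return (start_mm, end_mm) of the shadow cast by a gripped object,
    or None if no beam is blocked."""
    blocked = [i for i, level in enumerate(received_levels) if level < threshold]
    if not blocked:
        return None
    return (min(blocked) * receiver_pitch_mm,
            (max(blocked) + 1) * receiver_pitch_mm)

# Example: receivers 2-4 see little light, so the object shadows roughly 10-25 mm.
print(object_span_mm([1.0, 0.9, 0.1, 0.0, 0.2, 0.95]))
```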
The foregoing description of embodiments of the invention has been presented for the purpose of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed.

Claims

1. A finger for use in a manipulating apparatus, the finger comprising: a light source; a plurality of optical receivers; and a face configured, in use, to engage with an object.
2. A finger according to claim 1, wherein the face comprises a pad and a frame, wherein the frame is arranged around the periphery of the pad.
3. A finger according to claim 2, wherein the pad comprises the light source.
4. A finger according to claim 3, wherein the light source comprises a plurality of light emitting elements, each of the light emitting elements emitting light at a respective wavelength.
5. A finger according to claim 3, wherein the pad comprises a plurality of different regions, each of the pad regions having a different colour.
6. A finger according to claim 2, wherein the finger comprises a plurality of light sources, each of the plurality of light sources being located within the frame.
7. A finger according to claim 6, wherein one or more of the plurality of light sources comprises an optical fibre.
8. A finger according to claim 6, wherein one or more of the plurality of light sources comprises a light emitting element.
9. A finger according to claim 8, wherein the or each light emitting element emits light at a respective wavelength.
10. A finger according to claim 2, wherein each of the plurality of optical receivers is received within the frame.
11. A finger according to claim 10, wherein each of the plurality of optical receivers comprises an optical fibre.
12. A manipulating apparatus comprising: a first finger according to any of claims 1 to 11; a second finger according to any of claims 1 to 10, the first finger being arranged such that it is opposed to the second finger; one or more actuators; and a control unit, wherein, in use, the control unit activates the one or more actuators to move the first finger relative to the second finger such that the first finger and the second finger engage an object, wherein light signals received by the second finger from the first finger are used by the control unit to determine the movement of the first finger relative to the second finger.
13. An apparatus according to claim 12, wherein light signals received by the first finger from the second finger are used by the control unit to determine the movement of the first finger relative to the second finger.
14. An apparatus according to claim 12 or claim 13, wherein the apparatus comprises one or more further sensors and data from the or each sensor are used by the control unit in conjunction with the light signals received from the first finger and/or the second finger to determine the movement of the first finger relative to the second finger.
15. A method of manipulating an object, the method comprising the steps of: i) moving a first finger relative to a second finger in order to grip an object between the first finger and the second finger; ii) generating one or more optical signals at the first finger; iii) detecting the one or more optical signals at the second finger; and iv) determining the position of the object in accordance with the one or more detected optical signals.
16. A method according to claim 15, wherein the method comprises the further step of v) moving the first finger relative to the second finger in accordance with the position determined in step iv).
PCT/EP2021/072311 2020-08-11 2021-08-10 Object presence sensing WO2022034107A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB2012448.3A GB202012448D0 (en) 2020-08-11 2020-08-11 Object presence sensing
GB2012448.3 2020-08-11

Publications (1)

Publication Number Publication Date
WO2022034107A1 true WO2022034107A1 (en) 2022-02-17

Family

ID=72519936

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/072311 WO2022034107A1 (en) 2020-08-11 2021-08-10 Object presence sensing

Country Status (2)

Country Link
GB (2) GB202012448D0 (en)
WO (1) WO2022034107A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5850836B2 (en) * 1976-07-20 1983-11-12 新明和工業株式会社 Grip devices for industrial robots, etc.
US4852928A (en) * 1984-02-16 1989-08-01 Multivisions Corporation Robotic end effectors
JPH0699389A (en) * 1992-09-18 1994-04-12 Toshiba Corp Handling device for work arm
JP4621827B2 (en) * 2004-03-09 2011-01-26 財団法人名古屋産業科学研究所 Optical tactile sensor, sensing method using optical tactile sensor, sensing system, object operation force control method, object operation force control device, object gripping force control device, and robot hand
JP5549204B2 (en) * 2009-12-01 2014-07-16 セイコーエプソン株式会社 Optical position detection device, hand device, and touch panel
EP2909635B1 (en) * 2012-10-16 2018-12-05 Beckman Coulter, Inc. Chute arrangement with strip-off feature

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4423998A (en) * 1980-07-24 1984-01-03 Fujitsu Fanuc Ltd. Gripping device
WO1984004723A1 (en) * 1983-05-27 1984-12-06 American Telephone & Telegraph Robotic systems utilizing optical sensing
US4766322A (en) * 1984-09-14 1988-08-23 Kabushiki Kaisha Toshiba Robot hand including optical approach sensing apparatus
US20130325181A1 (en) * 2012-05-31 2013-12-05 Toyota Motor Engineering & Manufacturing North America, Inc. Non-contact optical distance and tactile sensing system and method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024035432A1 (en) * 2022-08-10 2024-02-15 Siemens Corporation Sensor control system for coanda-based end effectors

Also Published As

Publication number Publication date
GB2604195A (en) 2022-08-31
GB202012448D0 (en) 2020-09-23

Similar Documents

Publication Publication Date Title
CN101996002B (en) For the apparatus and method of optical attitude identification
US20100157278A1 (en) Method and optical sensor for the detection of objects
WO2022034107A1 (en) Object presence sensing
US4605354A (en) Slip sensitive robot gripper system
US4766322A (en) Robot hand including optical approach sensing apparatus
KR101986342B1 (en) Article sorting system using picking robot
US4520932A (en) Stamp detection in a mail processing apparatus
NL8202496A (en) DEVICE FOR DETERMINING THE POSITION OF AN OBJECT.
EP3455705B1 (en) A touch and pressure sensing system with different upper layers
US20060163504A1 (en) Identification sensor
US20210370352A1 (en) Detecting non-handleable items
US20210023714A1 (en) Illuminated Surface as Light Source for In-Hand Object Location System
CN102541303A (en) Proximity sensor with motion detection
US6900451B2 (en) Mapping sensor system for detecting positions of flat objects
US9607911B2 (en) Optical programming of electronic devices on a wafer
US9898110B1 (en) Mouse pad, input system and pairing method thereof
CN107782354B (en) Motion sensor detection system and method
US9677914B2 (en) Optoelectronic sensor for recognizing object edges
SE457151B (en) DEVICE FOR OPTICAL SENSING OF TARGETS
CN110794836B (en) Empty cage vehicle warehouse returning control method, system and storage medium
US4672185A (en) Control system for semiconductor substrate process line
JP7302671B2 (en) optical sensor
US6023058A (en) Photosensitive detector and mosaic of photosensitive detectors for the detection of luminous flashes and applications
CN210142202U (en) SMT charging tray detection subassembly
EP0337670A2 (en) Signal transmission system for machine tools, inspection machines, and the like

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21759287

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21759287

Country of ref document: EP

Kind code of ref document: A1