EP3717180A1 - Modular system - Google Patents

Modular system

Info

Publication number
EP3717180A1
Authority
EP
European Patent Office
Prior art keywords
segment
robotic system
modular robotic
cell
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP18815295.3A
Other languages
German (de)
French (fr)
Inventor
James Stuart MARSAY
Garry LOFTHOUSE
Donald Lee RAYWOOD
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Applied Scientific Technologies Uk Ltd
Original Assignee
Applied Scientific Technologies Uk Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Applied Scientific Technologies Uk Ltd
Publication of EP3717180A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 - Manipulators not otherwise provided for
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 - Programme-controlled manipulators
    • B25J 9/08 - Programme-controlled manipulators characterised by modular constructions
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B01 - PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L - CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L 9/00 - Supporting devices; Holding devices
    • B01L 9/02 - Laboratory benches or tables; Fittings therefor
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23P - METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P 21/00 - Machines for assembling a multiplicity of different parts to compose units, with or without preceding or subsequent working of such parts, e.g. with programme control
    • B23P 21/004 - Machines for assembling a multiplicity of different parts to compose units, with or without preceding or subsequent working of such parts, e.g. with programme control the units passing two or more work-stations whilst being composed
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 21/00 - Chambers provided with manipulation devices
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N 35/0099 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor comprising robots or similar manipulators
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 - Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N 2035/00178 - Special arrangements of analysers
    • G01N 2035/00326 - Analysers with modular structure
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 - Program-control systems
    • G05B 2219/30 - Nc systems
    • G05B 2219/40 - Robotics, robotics mapping to robotics vision
    • G05B 2219/40304 - Modular structure

Definitions

  • Figure 4 provides a simplified schematic diagram depicting a modular robotic system 401 in accordance with certain embodiments of the invention.
  • the system 401 comprises five cells 402, 403, 404, 405, 406, each of which corresponds to the cell described with reference to Figure 1 and the cell segment described with reference to Figure 2.
  • the system 401 is provided with a continuous surface. Forming a system in this fashion means that more complex processes can be performed, or aspects of a process can be performed in parallel.
  • a first part of a GUI associated with the system shown in Figure 4 will typically provide a schematic representation of the various cells of the system as well as their constituent segments.
  • the second part of the GUI will enable parallel processes to be defined (e.g. processes being performed independently of each other on different cells), and collaborative processes to be performed across cells.
  • the relative position of each cell can be determined in any suitable way.
  • each cell has an identifying optical code (e.g. barcode) which can be imaged by a camera unit of the handler of an adjacent cell and communicated back to the control device.
  • Software running on the control device can thereby determine the relative position of each cell adjacent a given cell and thereby determine the relative position of all the cells.
  • the main bodies of the segments of each cell are substantially of uniform (the same) shape.
  • non-uniform segment configurations are possible.
  • Figures 5a to 5g provide schematic diagrams showing certain example segment configurations.
  • cells comprising segments which form shapes comprising more than one hexagon are possible.
  • Figure 5h depicts a cell comprising segments which are such that the cell comprises two hexagons.
  • the base unit can be separated into parts, for example into two halves.
  • the base unit can be separated into these parts when, for example, the cell is being moved or stored.
  • configuring the base unit so that it can be divided into parts can improve the ease with which the cell can be moved.
  • the base unit comprises a plurality of retractable leg units which can be moved up and down to level the cell.
  • each of the leg units includes a levelling actuator which operates in a corresponding manner to the levelling actuators described above (e.g. connected to an orientation sensor to automatically level the cell).
  • Embodiments comprising leg units rather than wheel units cannot be moved as readily; therefore, in such embodiments, a specially adapted moving apparatus, for example a wheeled apparatus (a “dolly”), can be used for moving parts of the base unit.
  • the base unit is separated into the parts and a lifting platform of the dolly is inserted under the base unit parts, which lifts the cell off the ground.
  • the dolly is then moved (for example by being pushed or pulled by a human operative via an arm connected to the dolly).
  • the lifting platform is lowered until the leg units make contact with the floor.
  • the dolly is then withdrawn.
  • the levelling actuators can then be used to level the cell, to account, for example, for any irregularities in the flatness of the floor at the new location.
  • the dolly is inserted under a base unit part and the leg units retract, lowering the base unit part onto the dolly.
  • when the base unit part is fully lowered and resting on the dolly, the leg units continue to retract.
  • a suitably adapted foot of each leg unit is received by specially adapted recesses in the dolly.
  • Figure 9 provides a schematic diagram depicting an example of this process.
  • Figure 9 provides a partial view of the dolly 901, which includes wheels 902, and a partial view of the base unit part 903 of the cell including one of the leg units 904 comprising a foot 905.
  • the base unit parts are separated and the dolly 901 is inserted under a base unit part 903.
  • the leg units 904 then retract and the base unit part 903 then rests on the dolly 901.
  • the leg units 904 continue retracting until the foot 905 of each leg unit 904 is received by a corresponding recess 906 of the dolly 901.
  • the base unit part 903 is secured to the dolly 901 and can be readily moved with a reduced risk that the base unit part 903 will become dislodged from the dolly 901.
  • the GUI associated with operation of the cell is displayed on a display associated with the control device.
  • the display may be in a location that is different to the location of the cell, for example in a control room. In such arrangements, it may not be possible to view the GUI and the cell at the same time.
  • the cell itself may include means to display information to a user, for example an array of display screens mounted on the cell.
  • Figure 6 provides a schematic diagram of such an embodiment.
  • Figure 6 shows a cell 601 corresponding to that described, for example, with reference to Figure 1.
  • the cell 601 includes an array of display units 602.
  • each display unit is provided by a conventional display screen.
  • the array of display screens 602 is mounted centrally with respect to the cell, for example on a central column 603 to which the handler unit is mounted.
  • the array of display screens 602 is such that each individual display screen is positioned relative (proximate) to a particular segment such that it is visually apparent from its position that each display screen is associated with a particular segment.
  • Information associated with a particular segment can be displayed on the display screen.
  • each display screen is connected to the control device via, for example, the data connection, and controlled by the software running on the control device.
  • the software running on the control device generates and stores information relating to the position of each segment within a cell along with information about the function performed by each cell.
  • the software also generates and stores information relating to the operational state of each segment.
  • the system can be connected to an augmented reality (AR) system enabling an augmented reality device to be used to ascertain information about the cell.
  • Figure 7 provides a simplified schematic diagram showing a modular robotic system of the type described with reference to Figure 1 where the control device 701 is connected via a suitable interface to an AR system 702.
  • the AR system 702 includes an API 703 allowing data communicated from the control device 701 relating to the cell to be processed and converted into data to display on an AR device 704 connected, via a suitable data connection, to the AR system 702.
  • the data communicated from the control device 701 to the AR system 702 can include, for example, information about the configuration of the cell (e.g. the relative position of the segments etc) and information about the operation of the cell (e.g. the operational status of each segment etc).
  • the AR device 704 has running thereon AR software which includes an image recognition component enabling the software to identify the cell within images captured by an imaging device (e.g. video camera) 705 of the AR device 704.
  • the image recognition component may typically further be able to determine the relative orientation of the cell using, for example, landmark markers positioned on the cell.
  • with the position and orientation of the cell determined within images captured by the AR device 704, the AR device 704 is arranged to generate an augmented reality display of the cell comprising images of the cell with graphical elements superimposed on the images, and to display this on a display 706.
  • the graphical elements typically comprise information relating to the cell received from the control device 701 via the AR system 702.
  • an operative can view the cell through the AR device 704 and be presented, visually, with information about the operation of the cell.
  • many AR systems are available, and any suitable AR system can be used.
  • a schematic diagram of an illustrative example of this is shown in Figure 8.
  • Figure 8 shows an AR device in the form of a smartphone 801.
  • the smartphone comprises a camera which is directed towards a cell 802.
  • the smartphone has running thereon AR software which controls the smartphone 801 to display on a display screen 803 images of the scene captured by the camera including the cell 802.
  • the AR software may provide functionality associated with the AR system 702 described with reference to Figure 7 and the control device may be arranged to communicate data to and from the smartphone via a suitable wireless data connection.
  • the AR software is further arranged to generate graphical elements 804a, 804b, 804c, 804d, 804e, 804f relating to the operation of the cell.
  • the graphical elements relate to the function performed by each segment. As can be seen from Figure 8, each graphical element is positioned on the display in a position corresponding to one of the segments.
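A minimal sketch of positioning such graphical elements, assuming the segment detection step has already produced pixel coordinates for each segment in the current camera frame (the function name, status text and offset below are assumptions of the sketch):

```python
def place_overlay_labels(segment_pixel_positions: dict[int, tuple[int, int]],
                         segment_status: dict[int, str],
                         y_offset_px: int = -40) -> list[dict]:
    """Compute where to draw each graphical element on the AR display.

    'segment_pixel_positions' maps segment numbers to the pixel coordinates at
    which each segment was detected in the current camera frame; how those
    positions are obtained (e.g. from landmark markers) is not shown here.
    """
    labels = []
    for segment, (x, y) in segment_pixel_positions.items():
        labels.append({
            "segment": segment,
            "text": segment_status.get(segment, "no data"),
            "position": (x, y + y_offset_px),  # draw slightly above the segment
        })
    return labels
```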

Abstract

A modular robotic system comprising one or more cells (101). Each cell (101) comprises a plurality of geometrically shaped segments (102 - 107). Each segment (102 - 107) comprises hardware (109a - 109f) for performing a processing function.

Description

MODULAR SYSTEM
Technical Field
The present invention relates to robotic systems, and in particular modular robotic systems.
Background
It is well known to use robotic systems for automation. For example, robotic systems that automate laboratory tasks are well known and widely available. Such systems can automate complex processes, often performing steps more quickly, consistently and accurately than a human operator.
In accordance with common practice in the field of automation, and in particular in the field of laboratory automation, if a system is required to automate a laboratory process, a customised system is commissioned. That is, a system is designed and built specifically for performing the process in question. This approach is often considered effective because a custom built automated system is optimised for undertaking a specific process. Such custom-built systems are often designed so that certain aspects can be modified to accommodate changes in requirements. However, substantial changes in the way in which the process is undertaken, or if the process needs to be scaled up or scaled down, often require any existing systems to be discarded and entirely new systems to be commissioned. This can be costly, particularly if it is impossible or impractical to recycle parts of the original system.
It is an object of certain embodiments of the invention to address such drawbacks.
Summary of the Invention
In accordance with a first aspect of the invention, there is provided a modular robotic system comprising one or more cells. Each cell comprises a plurality of geometrically shaped segments. Each segment comprises hardware for performing a processing function.
Advantageously, modular robotic systems in accordance with certain aspects of the invention comprise one or more cells, each of which comprises a number of geometric segments loaded with specific processing hardware. By physically configuring each segment in accordance with a geometric shape, cell design is easier and can be undertaken more quickly. Further, the exact space occupied by any given cell can be predicted irrespective of the processing hardware it provides. Moreover, as a result of the segments having a geometric shape, cells of the system themselves will typically have predictable geometric configurations, again irrespective of the processing functions they provide, making system design, and in particular system modification and system scaling, easier than conventional custom-designed systems.
Optionally, each segment comprises a geometrically shaped main body.
Optionally, the main body of each segment has a work surface part configured such that the work surface part of each of the segments together form a substantially planar work surface of the cell.
Advantageously, in certain embodiments, each segment is provided with a work surface part which, together with work surface parts from other segments of a cell, provides a substantially continuous work surface. This arrangement makes it easier for human operators to work with and visually supervise the system. Moreover, in certain embodiments, each cell can be configured so that the work surface is at conventional laboratory bench height, for example 1100mm, meaning that the system can be readily and conveniently integrated into a laboratory environment.
Optionally, the geometrically shaped main bodies of the segments substantially correspond in shape and size with each other.
In certain embodiments, the shape and size of the geometrically shaped main body of each segment is the same. This improves the ease with which systems can be designed because, irrespective of the function provided by a segment, it will always occupy a position within a cell in the same predictable way.
Optionally, the geometrically shaped main body of each segment is such that the cell forms a substantially hexagonal shape.
Advantageously, in certain embodiments, the main bodies of each segment are geometrically shaped so that the cell has a substantially hexagonal shape. As a result, multiple cells forming a system can be arranged to optimise the use of space because hexagons can be “tiled” without gaps. Further, a continuous work surface, substantially without gaps, can be provided for the system, allowing easy interaction with human operators and easy interaction between cells.
Optionally, the segments are interchangeable.
Optionally, each segment is connected to a control device for controlling the hardware of each segment.
Optionally, the control device is an external control device.
Optionally, the control device is adapted to detect an identity associated with each segment.
Optionally, the control device is adapted to determine from the detected identity a processing function provided by the hardware of the segment and to present configuration information corresponding to the processing function on a user interface associated with the control device.
Optionally, the user interface enables a process to be performed by the system to be configured using the configuration information.
Optionally, at least one of the one or more cells comprises an array of display units connected to the control device, and the control device is arranged to control the array of display units to display information relating to operation of the cell.
Optionally, each display unit of the array of display units is positioned proximate to a segment and is controlled to display information relating to operation of the segment.
Optionally, the system further comprises an augmented reality system connected to the control device and arranged to control an augmented reality device to generate an augmented reality display comprising graphical elements comprising information relating to operation of the system.
Optionally, the augmented reality system is operable to position the graphical elements on the augmented reality display in positions corresponding to detected positions of components of the system to which they relate.
Optionally, each cell comprises a base unit on which the geometrically shaped segments are disposed.
Optionally, the base unit includes a plurality of retractable leg units for lowering the base unit onto a moving apparatus.
Optionally, the leg units each comprise a foot adapted to engage with a corresponding recess of the wheeled apparatus when fully retracted to secure the base unit to the moving apparatus.
Optionally, the system further comprises a robot arm handler unit operable to handle items.
Optionally, a camera unit is positioned on the robot arm handler unit.
In accordance with a second aspect of the invention, there is provided a geometrically shaped segment for a modular robotic system according to the first aspect, comprising hardware for performing a processing function.
Various further features and aspects of the invention are defined in the claims.
Brief Description of the Drawings
Embodiments of the present invention will now be described by way of example only with reference to the accompanying drawings where like parts are provided with corresponding reference numerals and in which:
Figure 1 provides a simplified schematic diagram of a cell of a modular robotic system in accordance with certain embodiments of the invention;
Figure 2 provides a simplified schematic diagram of an example segment of a cell of a modular robotic system in accordance with certain embodiments of the invention;
Figure 3 provides a simplified schematic diagram of a graphical user interface in accordance with certain embodiments of the invention;
Figure 4 provides a simplified schematic diagram depicting a modular robotic system in accordance with certain embodiments of the invention;
Figures 5a to 5h provide schematic diagrams depicting various possible segment shapes and cell configurations in accordance with certain embodiments of the invention;
Figure 6 provides a simplified schematic diagram of a cell of a modular robotic system in accordance with certain embodiments of the invention in which the cell includes an array of display units;
Figure 7 provides a simplified schematic diagram showing a modular robotic system in accordance with certain embodiments of the invention, where the control device is connected to an augmented reality system;
Figure 8 provides a simplified schematic diagram showing an augmented reality device in use in accordance with certain embodiments of the invention, and Figure 9 provides a schematic diagram in accordance with certain embodiments of the invention in which a base unit part of a cell includes retractable leg units.
Detailed Description
Figure 1 shows a simplified schematic diagram of a cell 101 of a modular robotic system in accordance with certain embodiments of the invention. Modular robotic systems in accordance with examples of the invention can include one or more such cells.
The cell 101 comprises a number of discrete, geometrically shaped segments 102, 103, 104, 105, 106, 107. In the embodiment shown in Figure 1, the cell 101 comprises six such geometrically shaped segments.
More specifically, each segment comprises a geometrically shaped main body 108 and a hardware unit 109a - 109f for providing a processing function, such as a laboratory processing function.
The main body of each segment 102, 103, 104, 105, 106, 107 is configured to be the same size and shape as the main body of each other segment. More specifically, the main body of each segment 102, 103, 104, 105, 106, 107 is a geometric “wedge” shape such that when the six segments are positioned adjacent, a polyhedron with a uniform hexagon-shaped upper and lower face is formed. The upper surface of the cell forms a substantially planar work surface made from work surface parts of each segment - i.e. an upper surface of the main body of each segment.
In certain embodiments, the segments are configured so that the work surface is at conventional laboratory bench height, e.g. approximately 1100mm from the floor.
Typically, the shape of the main body of each segment is selected such that the cell forms a shape (e.g. a hexagon) which is such that when multiple cells are brought together, a continuous surface is formed.
In Figure 1, the hardware unit of each segment is depicted schematically as having the same shape and configuration (a simple cylinder shape is shown for clarity). However, as explained below, different hardware units may provide different functions and therefore will typically be of differing shape and configuration. Although of different vertical heights, the hardware associated with each segment will typically be substantially confined to the surface area defined by the upper surface of the segment.
The cell 101 further comprises a base 110 which corresponds in shape to the hexagon formed by the six wedge-shaped segments 102, 103, 104, 105, 106, 107. Typically, the base comprises a hexagonal bed on top of a correspondingly shaped pedestal. The bed comprises six predetermined segment positions into which segments are inserted.
The processing function hardware unit of each segment typically provides a different processing function. For example, the processing function hardware unit from the first segment 102 may provide a sample container loading and unloading function for loading and unloading sample containers from a sample rack; the processing function hardware unit from the second segment 103 may provide a shaker function for shaking a sample container; the processing function hardware unit from the third segment 104 may provide a centrifuge function; the processing function hardware unit from the fourth segment 105 may provide a heating function for heating a sample container; the processing function hardware unit from the fifth segment 106 may provide a separating function for separating supernatant from a sample container, and the processing function hardware unit from the sixth segment 107 may provide a sample taking function for taking a sample of supernatant liquid from a sample container.
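As a minimal sketch, the allocation of processing functions to segment positions described above could be captured in a simple software mapping; the enumeration names and position numbers below are assumptions of the sketch rather than anything prescribed by the system.

```python
from enum import Enum, auto

class ProcessingFunction(Enum):
    """Illustrative processing functions corresponding to segments 102-107."""
    LOAD_UNLOAD = auto()   # load/unload sample containers from a rack (segment 102)
    SHAKER = auto()        # shake a sample container (segment 103)
    CENTRIFUGE = auto()    # centrifuge a sample container (segment 104)
    HEATER = auto()        # heat a sample container (segment 105)
    SEPARATOR = auto()     # separate supernatant (segment 106)
    SAMPLER = auto()       # take a sample of supernatant liquid (segment 107)

# One possible mapping of the six wedge-shaped positions on the hexagonal bed
# to the functions they currently provide.
CELL_LAYOUT = {
    1: ProcessingFunction.LOAD_UNLOAD,
    2: ProcessingFunction.SHAKER,
    3: ProcessingFunction.CENTRIFUGE,
    4: ProcessingFunction.HEATER,
    5: ProcessingFunction.SEPARATOR,
    6: ProcessingFunction.SAMPLER,
}
```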
The cell 101 further comprises a handler unit 111 for handling items associated with the process being performed by the cell 101. For example, the handler unit 111 may be a robotic arm equipped with a servo-controlled gripper, configured to pass sample containers between processing function hardware units of the various segments. In certain embodiments, the gripper is adapted such that it has a modifiable grip size so that it can accommodate objects of different size.
Mounted on the handler unit 111 is a camera unit 115. The camera unit enables image data relating to the work surface of the cell 101 to be captured and used for control of the system. The cell 101 is configured so that the segments are interchangeable. That is, different segments can be inserted and removed from the cell 101. Each segment is secured in place in the cell by a suitable latching mechanism. In certain embodiments, a mechanical latching mechanism can be used. In other embodiments an electromechanical latching mechanism (e.g. using one or more solenoids) can be used.
Typically, for each segment, the cell 101 includes a number of service connection points for providing services necessary for the processing function hardware units to operate. The service connection points include, for example, a power service connection point for providing electrical power; a water supply supplying cold, demineralised water; a waste extraction connection for extracting waste liquids; a fume extraction connection for extracting fumes; an air supply providing clean, dry pressurised air, and a data network connection for connecting the segment to a computer network enabling data to be sent and received.
Correspondingly, each segment comprises a number of service connection points necessary for the functioning of its processing function hardware unit. The service connection points of each segment connect to corresponding service connection points of the cell.
Typically, the services enter the cell 101 by connecting a services umbilical to a services port. In certain embodiments, as shown in Figure 1, each face of the base 110 is provided with a services port 116. By providing a services port on each face, the most convenient services port can be used to connect services to the cell (for example the services port near an appropriate external services connection).
Typically, operation of the cell 101, specifically operation of the processing function hardware units and the handler unit 111, is controlled by an external control device 112. The control device is connected to the cell via a suitable data connection 113. Information from the cell, such as information from the hardware units and image data captured by the camera, is communicated to the control device via the data connection 113. Control information to the cell 101, e.g. control information for controlling specific hardware units and the handler unit 111, is communicated from the control device to the cell 101 via the data connection 113. Typically, the control device 112 is provided by a computing device, such as a personal computer (PC), comprising a processor unit, memory unit and suitable data input/output interface.
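The wire protocol used on the data connection 113 is not specified; as a minimal sketch, assuming a JSON-over-TCP message format with hypothetical fields, a command to a segment control unit might be sent as follows.

```python
import json
import socket

def send_segment_command(host: str, port: int, segment_id: int, command: str,
                         parameters: dict) -> dict:
    """Send one command to a segment control unit and return its status reply.

    The message format (segment_id/command/parameters fields) and the newline
    framing are assumptions for illustration only.
    """
    message = json.dumps({
        "segment_id": segment_id,
        "command": command,
        "parameters": parameters,
    }).encode("utf-8")
    with socket.create_connection((host, port), timeout=5.0) as conn:
        conn.sendall(message + b"\n")
        reply = conn.makefile("r", encoding="utf-8").readline()
    return json.loads(reply)

# Hypothetical usage (address and command names are assumptions):
# status = send_segment_command("10.0.0.20", 5000, segment_id=2,
#                               command="shake",
#                               parameters={"seconds": 60, "intensity": 2})
```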
Image data captured by the camera unit 115 can be processed by machine vision software enabling the system to track the location of items on the work surface of the cell. This enables the handler unit 111 to pass items, for example sample containers, between hardware units of different segments. This also allows the handler unit 111 to rectify problems, for example pick up and replace dropped sample containers, or correctly position sample containers incorrectly positioned by operators.
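A minimal sketch of how tracked item positions could drive corrective handling, assuming the machine vision step has already produced item identifiers and work-surface coordinates (the data structures, coordinate frame and tolerance are assumptions):

```python
from dataclasses import dataclass

@dataclass
class TrackedItem:
    item_id: str    # e.g. a sample container identifier
    x_mm: float     # detected position on the work surface (assumed frame)
    y_mm: float

def correction_moves(tracked: list[TrackedItem],
                     expected: dict[str, tuple[float, float]],
                     tolerance_mm: float = 5.0) -> list[tuple[str, tuple[float, float]]]:
    """Return (item_id, target_position) pairs for items the handler should reposition.

    'expected' maps item identifiers to their intended positions; items within
    the tolerance of their target are left alone.
    """
    moves = []
    for item in tracked:
        target = expected.get(item.item_id)
        if target is None:
            continue
        dx, dy = item.x_mm - target[0], item.y_mm - target[1]
        if (dx * dx + dy * dy) ** 0.5 > tolerance_mm:
            moves.append((item.item_id, target))
    return moves
```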
The cell 101 comprises wheel units 114 connected to the base 110 allowing the cell to be readily moved from place to place, for example around a laboratory or processing environment. In certain embodiments, each wheel unit 114 includes a levelling actuator operable to change the distance of the wheel unit from the base. The levelling actuators can be connected to an orientation sensor within the cell 101 arranged to detect any tilt present due, for example, to the cell being positioned on an uneven surface. Control signals, generated responsive to an orientation output of the orientation sensor, can be used to actuate the levelling actuators to ensure that the cell is level. In certain embodiments, this is an automated process, i.e. a cell levelling process occurs without any intervention from human operators.
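A minimal sketch of such an automated levelling loop, assuming hypothetical orientation sensor and actuator interfaces and an illustrative sign convention:

```python
def level_cell(orientation_sensor, actuators, step_mm: float = 0.5,
               tolerance_deg: float = 0.1, max_iterations: int = 100) -> bool:
    """Simple automatic levelling loop.

    'orientation_sensor' is assumed to expose pitch()/roll() in degrees and
    'actuators' to be a dict with 'front', 'back', 'left' and 'right' entries
    exposing extend(mm); both interfaces and the sign convention are
    assumptions of this sketch.
    """
    for _ in range(max_iterations):
        pitch, roll = orientation_sensor.pitch(), orientation_sensor.roll()
        if abs(pitch) <= tolerance_deg and abs(roll) <= tolerance_deg:
            return True  # cell is level within tolerance
        # Raise the low side by a small fixed step and re-measure.
        if pitch > tolerance_deg:
            actuators["front"].extend(step_mm)
        elif pitch < -tolerance_deg:
            actuators["back"].extend(step_mm)
        if roll > tolerance_deg:
            actuators["left"].extend(step_mm)
        elif roll < -tolerance_deg:
            actuators["right"].extend(step_mm)
    return False  # could not level within the iteration budget
```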
Figure 2 provides a simplified schematic diagram showing a more detailed view of an example segment 201 in accordance with certain embodiments of the invention.
The segment 201 includes a hardware unit which comprises a fume extractor unit 202 and a sample container shaker unit 203. The segment 201 further comprises a main body 204 which houses a control unit 205 for controlling the fume extractor unit 202 and the sample container shaker unit 203. The main body 204 of the segment 201 includes three service connection points: a first service connection point 206 providing a power supply connection for providing power to the control unit 205, the fume extractor unit 202 and the sample container shaker unit 203; a second service connection point 207 providing a data connection point for connecting the control unit 205 to an external control device, and a third service connection point 208 for removing fumes extracted by the fume extractor unit 202. Electrical power is provided to the control unit 205, the fume extractor unit 202 and the sample container shaker unit 203 via internal power connection lines 209, 210. Data is communicated to and from the control unit 205 and the data connection point 207 via a data connection line 211. Fumes extracted by the fume extractor unit 202 are removed from the segment 201 via the fume extraction connection point 208 and a fume conduit 212. Control signals are sent from the control unit 205 to the fume extractor unit 202 and the sample container shaker unit 203 via control lines 213, 214.
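A minimal sketch of how the control unit 205 might coordinate its two hardware units, assuming hypothetical start/stop and intensity interfaces on the fume extractor and shaker objects:

```python
import time

class ShakerSegmentController:
    """Illustrative control unit for the segment of Figure 2 (fume extractor + shaker).

    The fume_extractor and shaker objects are assumed to expose start()/stop()
    and, for the shaker, set_intensity(); these interfaces are hypothetical.
    """

    def __init__(self, fume_extractor, shaker):
        self.fume_extractor = fume_extractor
        self.shaker = shaker

    def shake(self, seconds: float, intensity: int) -> None:
        # Run fume extraction for the duration of the shaking operation.
        self.fume_extractor.start()
        try:
            self.shaker.set_intensity(intensity)
            self.shaker.start()
            time.sleep(seconds)
            self.shaker.stop()
        finally:
            self.fume_extractor.stop()
```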
Returning to Figure 1, typically, the external control device 112 controlling the system has stored thereon software for controlling the operation of the system.
In certain embodiments, the software is adapted to provide a user with a graphical user interface (GUI) which displays to a user a schematic representation of the system that corresponds to the physical configuration of the system. In other words, the GUI provides a schematic depiction of the configuration of the system including the relative location of segments within a cell (and, if the system comprises multiple cells, the relative location of different cells) along with the functions provided by the hardware units of the or each cell.
The GUI provides a user with controls allowing a process to be defined comprising a sequence of operations, for example, with the hardware units from different segments of the cell performing specific steps in a specific order.
Figure 3 provides a simplified schematic diagram of a GUI 301 generated by the software as described above enabling a user to define a process to be undertaken by a modular robotic system in accordance with certain embodiments of the invention.
A first part 302 of the GUI 301 shows a schematic representation of the system being controlled by the software. In this example, the system comprises a single cell with six segments: a first segment (1) provided with loading/unloading hardware; a second segment (2) provided with shaking hardware; a third segment (3) provided with centrifuge hardware; a fourth segment (4) provided with heating hardware; a fifth segment (5) provided with separating hardware, and a sixth segment (6) provided with sampling hardware.
To define a process, the GUI 301 provides a “drag and drop” interface allowing a user to “select” a segment from the first part 302 and “drag” it to a second part 303 of the GUI 301. Once a drag and drop operation has been performed by a user in this way, a processing step is shown in the second part 303 and the user is prompted to enter variables associated with the function performed by the hardware unit of the relevant segment. For example, when the “drag and drop” operation is performed for the second segment (2), a user is prompted to enter a period of time of the shaking operation and a shaking intensity.
The user performs the drag and drop operation on the various segments in the order in which the process is to be performed. For example, the vertical order in which the processing steps are shown in the second part 303 of the GUI determines the order in which the process is performed, the top processing step being performed first. An illustrative process shown in the second part 303 of the GUI in Figure 3 comprises the following steps in the following order (a simple software encoding of this sequence is sketched after the list):
1) Unload sample container from sample container rack
2) Shake sample container for 60 seconds with shake intensity level 2
3) Centrifuge sample container for 30 seconds at 300 rpm
4) Shake sample container for 10 seconds with shake intensity level 5
5) Heat sample container for 10 minutes at 200 degrees
6) Separate 10 ml of supernatant from sample container
7) Take and store sample supernatant from sample container
8) Return sample container to sample container rack
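The ordered sequence above lends itself to a straightforward software representation, for example an ordered list of steps each naming the segment, the operation and its variables. The Python sketch below is illustrative only; the key names, operation names and numeric values are assumptions rather than part of the disclosure. The vertical order of steps in the second part 303 of the GUI then maps directly onto the order of entries in such a list.

# Illustrative encoding of the ordered process shown above.
process = [
    {"segment": 1, "operation": "unload", "params": {}},
    {"segment": 2, "operation": "shake", "params": {"duration_s": 60, "intensity": 2}},
    {"segment": 3, "operation": "centrifuge", "params": {"duration_s": 30, "speed_rpm": 300}},
    {"segment": 2, "operation": "shake", "params": {"duration_s": 10, "intensity": 5}},
    {"segment": 4, "operation": "heat", "params": {"duration_min": 10, "temperature_c": 200}},
    {"segment": 5, "operation": "separate", "params": {"volume_ml": 10}},
    {"segment": 6, "operation": "sample", "params": {}},
    {"segment": 1, "operation": "return", "params": {}},
]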
Once the process has been defined by the user, the user can select an “execute” control. This control causes the software to control the segments of the system to perform the process specified in the second part 303 of the GUI 301. This is typically achieved by converting the process defined by the user into a series of control signals which are communicated to the control units of each segment. As will be understood, typically the conversion of the process specified by the user using the GUI into control signals “abstracts” the generation of control signals from the user. For example, the software may determine, from the order of steps defined by the user in the second part of the GUI, the requisite operation of the handler unit passing the sample container to and from hardware units. Further, the control units of each segment will typically generate feedback control information to monitor operation of the system and to identify and, where possible, rectify errors.
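One way the software might perform the “abstraction” described above is to expand the user-defined steps into low-level commands, inserting a handler transfer wherever consecutive steps use different segments. The function below is a hypothetical sketch under that assumption and is not the patented implementation; it operates on a process list of the form sketched earlier.

def expand_to_commands(process):
    """Hypothetical expansion of user-defined steps into low-level commands,
    inserting the handler transfers that are abstracted away from the user."""
    commands = []
    previous_segment = None
    for step in process:
        segment = step["segment"]
        if previous_segment is not None and segment != previous_segment:
            # The handler unit moves the sample container between segments.
            commands.append({"target": "handler", "action": "transfer",
                             "from": previous_segment, "to": segment})
        commands.append({"target": segment, "action": step["operation"],
                         "params": step["params"]})
        previous_segment = segment
    return commands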
In certain embodiments, as an alternative or in addition to defining a process for the system to perform using the GUI, a process can be defined by other means. In one example, an optical code, such as a machine-readable barcode, affixed for example to a sample container, can be imaged using the camera device. Encoded on the barcode is a process identifier which identifies a predetermined process to be performed by the system. The software running on the external control device is adapted to identify the process associated with the barcode, for example by retrieving it from memory, and then to control the segments of the system to perform the process. Alternatively, or additionally, an optical code such as a barcode may itself be used to encode process data, e.g. to specify a series of steps. The software is adapted to decode such data, extract the relevant process data and undertake the specified process.
In this way, a reduced amount of operator input is required as it is not necessary for an operator to specify a process and input an “execute” command. Instead, a sample container with a suitable optical code can simply be placed on the work surface of the cell and imaged by the camera unit, and the process is then performed.
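A minimal sketch of how software on the external control device might handle an imaged optical code is given below, assuming the code carries either a process identifier that is looked up in a stored registry or JSON-encoded process data. The registry name, identifiers and encoding are assumptions made for illustration only.

import json

# Hypothetical registry mapping process identifiers to stored process definitions.
PROCESS_REGISTRY = {
    "PROC-0001": [
        {"segment": 2, "operation": "shake", "params": {"duration_s": 60, "intensity": 2}},
    ],
}

def handle_scanned_code(code_payload):
    # If the optical code encodes a process identifier, retrieve the stored process.
    if code_payload in PROCESS_REGISTRY:
        return PROCESS_REGISTRY[code_payload]
    # Otherwise assume the code itself encodes process data, here as JSON text.
    return json.loads(code_payload)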
The software is typically arranged to detect if a segment has been inserted into the cell or removed from the cell and to update the GUI 301 accordingly. For example, the control unit of each segment may be arranged to communicate a segment identifier to the control device on power-up. The software may include a library of segment configuration information associated with different segment identifiers and present information on the GUI accordingly (e.g. the visual representation of the segment type and the variables that can be set for that segment type). The software may be able to determine a relative position of the segment within the cell by virtue of an identifier address associated with the data connection point within the cell to which the segment has been attached. Alternatively, the identity and position of segments may be determined by a marker (e.g. an optical code) affixed to a part of the segment that can be imaged by the camera unit. In certain embodiments, the segments may also include “landmark” markers in predetermined positions on each segment. The landmark markers show predetermined patterns, the parallax of which can be used to determine segment position and orientation. The landmark markers are imaged by the camera device. By detecting the landmark markers, the control device can identify the orientation of the segment to determine if it has been incorrectly installed and, for example, generate a corresponding error message to be displayed on the GUI 301.
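The segment detection described above could, for example, be implemented as a registration step in which each reported segment identifier is looked up in a configuration library and recorded against the address of the data connection point to which the segment is attached. The following sketch is hypothetical; the library contents and function signature are assumptions and not part of the disclosure.

# Hypothetical library of segment configuration information keyed by identifier.
SEGMENT_LIBRARY = {
    "SHAKER-V1": {"function": "shaking", "variables": ["duration_s", "intensity"]},
    "CENTRIFUGE-V1": {"function": "centrifuge", "variables": ["duration_s", "speed_rpm"]},
}

def register_segment(segment_identifier, connection_point_address, cell_layout):
    """Hypothetical registration of a segment reported at power-up.
    cell_layout maps connection point addresses to segment records."""
    config = SEGMENT_LIBRARY.get(segment_identifier)
    if config is None:
        raise ValueError("Unknown segment type: " + segment_identifier)
    cell_layout[connection_point_address] = {
        "segment": segment_identifier,
        "function": config["function"],
        "variables": config["variables"],
    }
    return cell_layout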
Figure 4 provides a simplified schematic diagram depicting a modular robotic system 401 in accordance with certain embodiments of the invention. The system 401 comprises five cells 402, 403, 404, 405, 406, each of which corresponds to the cell described with reference to Figure 1 and the cell segment described with reference to Figure 2.
As can be seen from Figure 4, by virtue of the shape of each cell, the system 401 is provided with a continuous surface. Forming a system in this fashion means that more complex processes can be performed, or aspects of a process can be performed in parallel.
A first part of a GUI associated with the system shown in Figure 4 will typically provide a schematic representation of the various cells of the system as well as their constituent segments. The second part of the GUI will enable parallel processes to be defined (e.g. processes performed independently of each other on different cells), as well as collaborative processes performed across cells. The relative position of each cell can be determined in any suitable way. In one example, each cell has an identifying optical code (e.g. a barcode) which can be imaged by a camera unit of the handler of an adjacent cell and communicated back to the control device. Software running on the control device can thereby determine the relative position of each cell adjacent a given cell and, in turn, the relative position of all the cells.
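As an illustration of how the relative positions of all cells might be derived from such pairwise observations, the sketch below walks outward from a reference cell using reported (observing cell, observed cell, direction) triples. It simplifies the hexagonal geometry to four compass directions, and all names are assumptions rather than part of the disclosure.

from collections import deque

def resolve_cell_positions(adjacency_reports, reference_cell):
    """Hypothetical resolution of relative cell positions from triples
    (observing_cell, observed_cell, direction), where direction is the face
    of the observing cell whose camera imaged the neighbour's optical code."""
    offsets = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0)}
    positions = {reference_cell: (0, 0)}
    queue = deque([reference_cell])
    while queue:
        cell = queue.popleft()
        for observer, observed, direction in adjacency_reports:
            if observer == cell and observed not in positions:
                dx, dy = offsets[direction]
                x, y = positions[cell]
                positions[observed] = (x + dx, y + dy)
                queue.append(observed)
    return positions

# Example: cell 403 observed to the east of cell 402, cell 404 east of cell 403.
# resolve_cell_positions([("402", "403", "E"), ("403", "404", "E")], "402")
# -> {"402": (0, 0), "403": (1, 0), "404": (2, 0)}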
In the embodiments described above, the main bodies of the segments of each cell are of substantially uniform (the same) shape. However, in certain embodiments, non-uniform segment configurations are possible. Figures 5a to 5g provide schematic diagrams showing certain example segment configurations. In certain embodiments, cells comprising segments which form shapes comprising more than one hexagon are possible. Figure 5h depicts a cell comprising segments which are such that the cell comprises two hexagons.
In certain embodiments, rather than being a single unit, the base unit can be separated into parts, for example into two halves.
The base unit can be separated into these parts when, for example, the cell is being moved or stored. Advantageously, configuring the base unit so that it can be divided into parts can improve the ease with which the cell can be moved.
In certain embodiments, for example embodiments in which the base unit can be separated into parts, the base unit comprises, rather than wheel units, a plurality of retractable leg units which can be moved up and down to level the cell.
In certain such embodiments, each of the leg units includes a levelling actuator which operates in a corresponding manner to the levelling actuators described above (e.g. connected to an orientation sensor to automatically level the cell).
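A closed-loop levelling routine of the kind alluded to above might, purely as a sketch, read the orientation sensor and drive the levelling actuators until the measured tilt falls within a tolerance. The interfaces read_orientation() and adjust_leg() are assumed for illustration and are not defined in the patent.

def level_cell(read_orientation, adjust_leg, tolerance_deg=0.1, max_iterations=50):
    """Hypothetical closed-loop levelling: read_orientation() returns
    (roll, pitch) in degrees; adjust_leg(axis, amount) drives the actuators."""
    for _ in range(max_iterations):
        roll, pitch = read_orientation()
        if abs(roll) <= tolerance_deg and abs(pitch) <= tolerance_deg:
            return True  # the cell is level within tolerance
        # Drive the actuators proportionally to the measured tilt.
        adjust_leg("roll", -0.5 * roll)
        adjust_leg("pitch", -0.5 * pitch)
    return False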
Embodiments comprising leg units rather than wheel units cannot be moved as readily; therefore, in such embodiments, a specially adapted moving apparatus, for example a wheeled apparatus (a “dolly”), can be used for moving parts of the base unit. In certain such embodiments, when the cell is to be moved, the base unit is separated into the parts and a lifting platform of the dolly is inserted under the base unit parts, lifting the cell off the ground.
The dolly is then moved (for example by being pushed or pulled by a human operative via an arm connected to the dolly). When the dolly reaches the intended destination of the cell, the lifting platform is lowered until the leg units make contact with the floor. The dolly is then withdrawn. The levelling actuators can then be used to level the cell, to account, for example, for any irregularities in the flatness of the floor at the new location.
In other embodiments, the dolly is inserted under a base unit part and the leg units retract, lowering the base unit part onto the dolly. When the base unit part is fully lowered and resting on the dolly, the leg units continue to retract. A suitably adapted foot of each leg unit is received by a specially adapted recess in the dolly. Figure 9 provides a schematic diagram depicting an example of this process.
Figure 9 provides a partial view of the dolly 901, which includes wheels 902, and a partial view of a base unit part 903 of the cell including one of the leg units 904 comprising a foot 905.
As can be seen from Figure 9, initially the base unit parts are separated and the dolly 901 is inserted under a base unit part 903. The leg units 904 then retract and the base unit part 903 then rests on the dolly 901. The leg units 904 continue retracting until the foot 905 of each leg unit 904 is received by a corresponding recess 906 of the dolly 901. In this way, the base unit part 903 is secured to the dolly 901 and can be readily moved with a reduced risk that the base unit part 903 will become dislodged from the dolly 901.
In embodiments described above, the GUI associated with operation of the cell is displayed on a display associated with the control device. The display may be in a location that is different to the location of the cell, for example in a control room. In such arrangements, it may not be possible to view the GUI and the cell at the same time.
In certain embodiments, the cell itself may include means to display information to a user, for example an array of display screens mounted on the cell.
Figure 6 provides a schematic diagram of such an embodiment. Figure 6 shows a cell 601 corresponding to that described, for example, with reference to Figure 1. However, the cell 601 includes an array of display units 602. Typically, each display unit is provided by a conventional display screen. As can be seen, the array of display screens 602 is mounted centrally with respect to the cell, for example on a central column 603 to which the handler unit is mounted.
Advantageously, as can be seen from Figure 6, the array of display screens 602 is arranged such that each individual display screen is positioned proximate to a particular segment, so that it is visually apparent from its position that each display screen is associated with a particular segment.
Information associated with a particular segment (such as the processing step/function undertaken by the segment and the particular variables associated with the processing step/function) can be displayed on the display screen.
Typically, each display screen is connected to the control device via, for example, the data connection, and is controlled by the software running on the control device.
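Pushing per-segment information to the display screen nearest each segment could be as simple as the sketch below, which assumes a cell layout record of the kind outlined earlier and an assumed display_write(screen_id, text) interface; both are illustrative assumptions rather than part of the disclosure.

def update_segment_displays(cell_layout, display_write):
    """Hypothetical update of the display screen nearest each segment.
    display_write(screen_id, text) is an assumed interface to a screen."""
    for position, record in cell_layout.items():
        text = record["function"] + " | variables: " + ", ".join(record["variables"])
        display_write(position, text)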
As described above, in certain embodiments, the software running on the control device generates and stores information relating to the position of each segment within a cell along with information about the function performed by each cell. Typically, the software also generates and stores information relating to the operational state of each segment. In certain embodiments, the system can be connected to an augmented reality (AR) system enabling an augmented reality device to be used to ascertain information about the cell.
Figure 7 provides a simplified schematic diagram showing a modular robotic system of the type described with reference to Figure 1 where the control device 701 is connected via a suitable interface to an AR system 702. The AR system 702 includes an API 703 allowing data communicated from the control device 701 relating to the cell to be processed and converted into data to display on an AR device 704 connected, via a suitable data connection, to the AR system 702.
The data communicated from the control device 701 to the AR system 702 can include, for example, information about the configuration of the cell (e.g. the relative position of the segments etc) and information about the operation of the cell (e.g. the operational status of each segment etc).
The AR device 704 has running thereon AR software which includes an image recognition component enabling the software to identify the cell within images captured by an imaging device (e.g. video camera) 705 of the AR device 704. The image recognition component may typically further be able to determine the relative orientation of the cell using, for example, landmark markers positioned on the cell.
With the position and orientation of the cell determined within images captured by the AR device 704, the AR device 704 is then arranged to generate an augmented reality display of the cell comprising images of the cell with graphical elements superimposed on the images, and to display this on a display 706. The graphical elements typically comprise information relating to the cell received from the control device 701 via the AR system 702.
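As a sketch of how such graphical elements might be positioned, the function below projects each segment's position in the cell frame (as recovered from the landmark markers) into image coordinates using a simple pinhole camera model. The data structures and the intrinsics format (fx, fy, cx, cy) are assumptions for illustration only and are not taken from the patent.

def project_overlays(segment_positions_3d, camera_intrinsics, annotations):
    """Hypothetical AR overlay placement: project each segment's position in
    the cell frame into pixel coordinates and attach the corresponding label."""
    fx, fy, cx, cy = camera_intrinsics
    overlays = []
    for segment_id, (x, y, z) in segment_positions_3d.items():
        if z <= 0:
            continue  # point is behind the camera; skip it
        u = fx * x / z + cx
        v = fy * y / z + cy
        overlays.append({"segment": segment_id, "pixel": (u, v),
                         "label": annotations.get(segment_id, "")})
    return overlays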
In this way, an operative can view the cell through the AR device 704 and be presented, visually, with information about the operation of the cell. As is known in the art, many AR systems are available, and any suitable AR system can be used.
A schematic diagram of an illustrative example of this is shown in Figure 8.
Figure 8 shows an AR device in the form of a smartphone 801 . The smartphone comprises a camera which is directed towards a cell 802. The smartphone has running thereon AR software which controls the smartphone 801 to display on a display screen 803 images of the scene captured by the camera including the cell 802. The AR software may provide functionality associated with the AR system 702 described with reference to Figure 7 and the control device may be arranged to communicate data to and from the smartphone via a suitable wireless data connection. The AR software is further arranged to generate graphical elements 804a, 804b, 804c, 804d, 804e, 804f relating to the operation of the cell. In the example shown in Figure 8, the graphical elements relate to the function performed by each segment. As can be seen from Figure 8, each graphical element is positioned on the display in a position corresponding to one of the segments.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
It will be appreciated that features from one embodiment may be appropriately incorporated into another embodiment unless technically unfeasible to do so.
It will be appreciated that various embodiments of the present disclosure have been described herein for purposes of illustration, and that various modifications may be made. Accordingly, the various embodiments disclosed herein are not intended to be limiting, with the scope being indicated by the following claims.

Claims

1. A modular robotic system comprising one or more cells, each cell comprising a plurality of geometrically shaped segments, each segment comprising hardware for performing a processing function.
2. A modular robotic system according to claim 1, wherein each segment comprises a geometrically shaped main body.
3. A modular robotic system according to claim 1, wherein the main body of each segment has a work surface part configured such that the work surface parts of each of the segments together form a substantially planar work surface of the cell.
4. A modular robotic system according to claim 3, wherein, in use, the planar work surface is substantially 1100mm high.
5. A modular robotic system according to any of claims 2 to 4, wherein the geometrically shaped main bodies of the segments substantially correspond in shape and size.
6. A modular robotic system according to any of claims 2 to 5, wherein the geometrically shaped main body of each segment is such that the cell forms a substantially hexagonal shape.
7. A modular robotic system according to any previous claim, wherein the segments are interchangeable.
8. A modular robotic system according to any previous claim, wherein each segment is connected to a control device for controlling the hardware of each segment.
9. A modular robotic system according to claim 8, wherein the control device is an external control device.
10. A modular robotic system according to claim 9, wherein the control device is adapted to detect an identity associated with each segment.
11. A modular robotic system according to claim 10, wherein the control device is adapted to determine, from the detected identity, a processing function provided by the hardware of the segment and to present configuration information corresponding to the processing function on a user interface associated with the control device.
12. A modular robotic system according to claim 11, wherein the user interface enables a process to be performed by the system to be configured using the configuration information.
13. A modular robotic system according to any of claims 8 to 12, wherein at least one of the one or more cells comprises an array of display units connected to the control device and the control device is arranged to control the array of display units to display information relating to operation of the cell.
14. A modular robotic system according to claim 13, wherein each display unit of the array of display units is positioned proximate to a segment and is controlled to display information relating to operation of the segment.
15. A modular robotic system according to any of claims 8 to 14, further comprising an augmented reality system connected to the control device and arranged to control an augmented reality device to generate an augmented reality display comprising graphical elements comprising information relating to operation of the system.
16. A modular robotic system according to claim 15, wherein the augmented reality system is operable to position the graphical elements on the augmented reality display in positions corresponding to detected positions of components of the system to which they relate.
17. A modular robotic system according to any previous claim wherein each cell comprises a base unit on which the geometrically shaped segments are disposed.
18. A modular robotic system according to claim 17, wherein the base unit includes a plurality of retractable leg units for lowering the base unit onto a moving apparatus.
19. A modular robotic system according to claim 18, wherein the leg units each comprise a foot adapted to engage with a corresponding recess of the moving apparatus when fully retracted to secure the base unit to the moving apparatus.
20. A modular robotic system according to any previous claim, further comprising a robot arm handler unit operable to handle items.
21. A modular robotic system according to claim 20, wherein a camera unit is positioned on the robot arm handler unit.
22. A geometrically shaped segment for a modular robotic system according to any of claims 1 to 21, comprising hardware for performing a processing function.
EP18815295.3A 2017-12-01 2018-11-29 Modular system Pending EP3717180A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1720031.2A GB2568932B (en) 2017-12-01 2017-12-01 Modular system
PCT/GB2018/053450 WO2019106364A1 (en) 2017-12-01 2018-11-29 Modular system

Publications (1)

Publication Number Publication Date
EP3717180A1 true EP3717180A1 (en) 2020-10-07

Family

ID=60950171

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18815295.3A Pending EP3717180A1 (en) 2017-12-01 2018-11-29 Modular system

Country Status (3)

Country Link
EP (1) EP3717180A1 (en)
GB (1) GB2568932B (en)
WO (1) WO2019106364A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7329625B2 (en) * 2020-01-07 2023-08-18 株式会社日立ハイテク Automatic analyzer, display system for automatic analyzer, and display method for automatic analyzer
WO2022002391A1 (en) * 2020-07-01 2022-01-06 Insys Industriesysteme Ag Assembly cell
WO2022070350A1 (en) * 2020-09-30 2022-04-07 株式会社牧野フライス製作所 Processing system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4835711A (en) * 1986-05-30 1989-05-30 Zymark Corporation Quickly reconfigurable robotic system
US5386762A (en) * 1992-09-14 1995-02-07 Gokey; Phillip E. Robotic chef
AT411886B (en) * 2002-05-10 2004-07-26 Electrovac PRODUCTION SYSTEM
US6920973B2 (en) * 2003-06-19 2005-07-26 The Regents Of The University Of Michigan Integrated reconfigurable manufacturing system
SE528350C2 (en) * 2005-03-09 2006-10-24 Animex Plast Ab Modular system for industrial robot
US8795593B2 (en) * 2006-03-29 2014-08-05 Michael J. Nichols Instrument docking station for an automated testing system
US7777155B2 (en) * 2007-02-21 2010-08-17 United Technologies Corporation System and method for an integrated additive manufacturing cell for complex components
JP4946523B2 (en) * 2007-03-07 2012-06-06 セイコーエプソン株式会社 General-purpose cell for production system and production system using the general-purpose cell
JP2010137339A (en) * 2008-12-12 2010-06-24 Olympus Corp Production device and production system
US9561590B1 (en) * 2013-06-24 2017-02-07 Redwood Robotics, Inc. Distributed system for management and analytics of robotics devices

Also Published As

Publication number Publication date
GB2568932A (en) 2019-06-05
GB201720031D0 (en) 2018-01-17
GB2568932B (en) 2022-07-13
WO2019106364A1 (en) 2019-06-06


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200625

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)