WO2022180179A1 - Material flow control with virtual sensors - Google Patents

Material flow control with virtual sensors

Info

Publication number
WO2022180179A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
conveyor
sensor
reference points
positions
Prior art date
Application number
PCT/EP2022/054703
Other languages
German (de)
English (en)
Inventor
Andreas Hintz
Original Assignee
Ssi Schäfer Automation Gmbh (At)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ssi Schäfer Automation Gmbh (At)
Priority to EP22712534.1A (EP4154219A1)
Publication of WO2022180179A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G43/00 Control devices, e.g. for safety, warning or fault-correcting
    • B65G43/08 Control devices operated by article or material being fed, conveyed or discharged
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Definitions

  • The present disclosure relates generally to the field of intralogistics, and more particularly to an intralogistics system that includes a camera and a virtual sensor to control a material flow in the intralogistics system using the virtual sensor, and to a method of using the virtual sensor in the intralogistics system.
  • In order to guide objects in a material flow, i.e. conveyed goods, through an intralogistics system, a very large number of sensors (kept technically as simple and cheap as possible), in particular photoelectric sensors, light barriers and scanners, have been used to date. These sensors are used everywhere within the system, especially at nodes (intersections, branches, junctions, etc.) of the material flow.
  • Tracing and tracking is the (internal) tracking of the conveyed goods.
  • Tracking means following the object from a source to a destination. Tracing means backtracking from the end customer to the point of original production.
  • The document WO 2018/211072 A1 relates to a device and a method for controlling a material flow at a material flow node.
  • A single camera is positioned at the node to identify the conveyed goods.
  • The camera data is merged with additional data that was obtained away from the node.
  • The camera data is also used to determine a current position and speed of items being transported on a continuous conveyor within the camera's field of view.
  • A material flow control using a video camera is also disclosed in DE 10 2006 015 689 A1.
  • Status images from the camera are evaluated in order to determine an actual position of the object to be transported in a previously defined area of the system.
  • The video camera is arranged vertically above a conveyor system.
  • The status images are evaluated in order to determine an actual position of the goods to be conveyed, which is then compared with a target position in order to generate corresponding conveyor control signals based on the target/actual comparison.
  • A video camera is also used to determine the positions of objects within a transport container, the transport containers being transported on a conveyor system.
  • DE 10 2011 053 547 A1 discloses the use of video cameras in order to monitor a transfer process of a conveyed item by a robot that is positioned between two conveyor systems arranged in parallel.
  • Identification marks and dimensions of the goods to be conveyed are determined using the camera data.
  • DE 10 2007 035 272 A1 discloses a method for identifying goods to be transported, in particular pieces of luggage, with a large number of sensors.
  • Material flow control generally requires a large number of sensors for monitoring the states of a conveyor system in order to actuate actuators arranged downstream and to track the conveyed goods sufficiently reliably (tracking). Every sensor costs money. Every sensor needs maintenance.
  • The wiring of the sensors is complex. Wiring takes a lot of time and is often difficult, especially when the sensors are subsequently integrated into an existing conveyor system. Installation locations can be difficult to access, which also makes cable routing from the installation site to a control computer difficult.
  • Disclosed is a method for defining a virtual sensor in a conveyor system comprising at least one conveyor section, the method comprising the steps of: providing a model of the conveyor system, the model having at least one position, and preferably an orientation and/or a dimension, for each of the conveyor sections in a reference frame of reference, in particular in the frame of reference of the conveyor system; defining at least one reference point, in particular by positioning preferably identical additional markings, and determining a position for each defined reference point in the reference system; positioning an image sensor within the conveyor system, preferably at any desired location, so that at least one of the conveyor sections and at least one of the reference points lie in a field of view of the image sensor; generating an image with the image sensor after the image sensor is positioned; identifying reference points in the image and determining positions of the identified reference points in a frame of reference of the generated image; determining a coordinate transformation, preferably by means of image processing, based on the positions of the reference points in both reference systems, so that each pixel of the generated image can be assigned a coordinate in the reference frame of reference; and defining a monitoring area, preferably in the image, which is to be monitored by the virtual sensor and which at least partially overlaps with, or is adjacent to, one of the conveyor sections lying in the field of view of the image sensor.
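  • By way of illustration only (this is not part of the disclosed method): in a planar, top-down setup, the pixel-to-coordinate assignment of the referencing step could be realized with a homography, as in the following Python sketch. It assumes OpenCV and NumPy; all marker and pixel coordinates are hypothetical placeholders.

```python
import cv2
import numpy as np

# Known positions of four reference points in the reference frame of the
# conveyor system (e.g. metres, taken from the stored model data).
world_pts = np.array([[2.0, 1.0], [5.5, 1.0], [5.5, 3.2], [2.0, 3.2]],
                     dtype=np.float32)

# Positions of the same reference points identified in the generated image
# (pixels), e.g. by pattern recognition.
image_pts = np.array([[310, 640], [980, 655], [1010, 210], [290, 195]],
                     dtype=np.float32)

# Four or more correspondences allow a homography to be estimated; RANSAC
# tolerates occasional misidentified reference points.
H, _ = cv2.findHomography(image_pts, world_pts, cv2.RANSAC)

def pixel_to_world(u: float, v: float) -> np.ndarray:
    """Assign a coordinate in the reference frame to the pixel (u, v)."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]

# Any pixel of the image, e.g. the endpoints of a drawn light-barrier line,
# can now be expressed in real-world coordinates:
print(pixel_to_world(400, 300), pixel_to_world(400, 500))
```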
  • A major advantage of the present development can be seen in the fact that many physical sensors that are required to control a material flow can be replaced by one, in particular a single, image sensor. Based on the data from the image sensor, information can be obtained that simulates the signals from the physical sensors. This also reduces cabling effort, because the virtual sensor no longer has to be physically connected to the controller.
  • A route of the conveyor system can be changed later without any problems.
  • The conveyor system can be retrofitted with a variety of sensors, especially in places that are inaccessible to an installer.
  • The reference points can be set at locations that are easily accessible.
  • The number of sensors required for material flow control is significantly reduced. This reduces costs.
  • The development can be retrofitted in existing systems by marking elements of the existing system with the markings, which can then be identified in the image of the image sensor in order to reference that image to reality.
  • The virtual sensors can be defined, for example, by a technician who positioned the camera, in the image displayed to the technician, for example on a mobile touch-sensitive screen that the technician carries with him. For example, the technician draws a line on his screen to define a light barrier at a location in relation to a conveyor line that can also be seen in the picture.
  • The line drawn on the screen is automatically assigned coordinates in the reference frame of reference, so that a location-dependent material flow controller can be informed about changes in the material flow, which can be recorded by the camera and evaluated using image processing.
  • The material flow control can then act accordingly on the conveyor system, even though there is no real sensor at the place where the line was drawn.
  • The area defined for the virtual sensor is preferably monitored, in a currently generated image, for a change in status based on a comparison with an image generated earlier.
  • The chronologically earlier image is recorded in a state that is free of conveyed goods. This means that at the time of recording there was no material to be conveyed in the field of view of the image sensor.
  • The other images, in particular the currently generated image, can contain conveyed goods. If these images are compared with the original image, a conveyed item can be automatically recognized using image recognition. If the detected item enters the area that is defined for the virtual sensor, the information is available that the item is in the area of the virtual sensor. This information can be converted into a corresponding status signal, on the basis of which the controller can in turn control actuators that are arranged downstream of the monitoring area. In this way, for example, a discharge device arranged immediately after the virtual sensor can be actuated. The conveyed goods can also simply be counted depending on the position, to give another example.
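  • A minimal sketch of such a comparison, assuming OpenCV; the file names, the monitored rectangle and the thresholds are hypothetical placeholders:

```python
import cv2

# Chronologically earlier image, recorded free of conveyed goods.
baseline = cv2.imread("image_36_empty.png", cv2.IMREAD_GRAYSCALE)
# Currently generated image, possibly containing conveyed goods.
current = cv2.imread("image_36_current.png", cv2.IMREAD_GRAYSCALE)

# Area defined for the virtual sensor, as a pixel rectangle (x, y, w, h).
x, y, w, h = 400, 220, 180, 12

# Pixel-wise difference inside the monitored area only.
diff = cv2.absdiff(current[y:y+h, x:x+w], baseline[y:y+h, x:x+w])
_, changed = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

# If enough pixels changed, a conveyed item has entered the area; this can
# be turned into a status signal for an actuator downstream of the area
# (e.g. a discharge device), or simply into a position-dependent count.
occupied = cv2.countNonZero(changed) > 0.2 * w * h
print("virtual sensor:", "occupied" if occupied else "clear")
```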
  • A control signal is therefore generated for an actuator which is arranged downstream of the monitored area when the change of state has occurred.
  • The change in status is caused in particular by a conveyed item that moves through the area defined for the virtual sensor along the at least one conveying path that lies in the field of view of the image sensor.
  • The reference points may also be implemented by inherent features of components of the conveyor system.
  • For example, the conveyor devices can be provided with a serial number by the manufacturer, which can be used as a "reference point" if it can be resolved and recognized in the image of the image sensor. In this case it is not necessary to subsequently provide the system to be monitored with markings.
  • Such reference points can be defined by selecting an area in the generated image where an identifier, such as a barcode label, is located. This area is assigned to the conveyor line and is characteristic of this conveyor line.
  • A sensor type, which is preferably a light barrier, a light sensor or a scanner, is also assigned to the monitored area.
  • The selection of a sensor type determines which state changes are to be monitored. If a light barrier or a light scanner is selected as the sensor type, it is sufficient to detect a change in status at a point or along a line. If a scanner is selected, the area to be monitored must be checked for previously defined identification features, such as barcodes. This type of monitoring can be automated using image processing, as sketched below.
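  • A sketch of how the selected sensor type could steer the monitoring routine; the barcode branch assumes the third-party pyzbar library, and all names are illustrative:

```python
import numpy as np
from pyzbar import pyzbar  # third-party barcode decoder (an assumption here)

def evaluate_virtual_sensor(sensor_type, roi, baseline_roi, thresh=30):
    """roi / baseline_roi: grayscale pixel arrays of the monitored area."""
    if sensor_type in ("light_barrier", "light_scanner"):
        # Point or line monitoring: any sufficient intensity change counts.
        delta = np.abs(roi.astype(int) - baseline_roi.astype(int))
        return bool((delta > thresh).any())
    if sensor_type == "scanner":
        # Area monitoring: search for previously defined identification
        # features, here barcodes, and return the decoded identifiers.
        return [code.data.decode("utf-8") for code in pyzbar.decode(roi)]
    raise ValueError(f"unknown sensor type: {sensor_type}")
```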
  • Also disclosed is a system for defining a virtual sensor for a conveyor system which comprises at least one conveyor section, the system having: an image sensor which can be positioned in the conveyor system, which defines a reference frame of reference, and which has a field of view, so that an image of at least one of the conveyor sections of the conveyor system and of at least one reference point can be generated; a plurality of markers which can be positioned fixedly and invariably as the reference points within the conveyor system at positions known in the reference frame; a device that is set up to store the positions of the reference points that are known in the reference frame of reference; an image processing device that is set up to determine positions of reference points that can be identified in the image, preferably by means of pattern recognition, in a reference system of the generated image; a device that is set up to determine a coordinate transformation based on the positions of the reference points in both reference systems, so that each pixel of the generated image can be assigned a coordinate in the reference frame of reference; and a device that is set up to define a monitoring area, preferably in the image, which is to be monitored by the virtual sensor and which at least partially overlaps with, or is adjacent to, one of the conveyor sections in the field of view of the image sensor.
  • FIG. 1 shows a block diagram of an exemplary intralogistics system;
  • FIG. 2 is an illustration of a conveyor system in reality and an illustration of a field of view and image captured by an image sensor of that reality;
  • FIG. 3 is an illustration of a captured image that is displayed on a screen of a tablet computer and that is manually editable;
  • FIG. 4 shows a prior art controller (FIG. 4A), a controller according to the present disclosure (FIG. 4B), and a retrofit of a conventional controller (FIG. 4C);
  • FIG. 5 shows a flow chart of a method for defining a virtual sensor in a conveyor system;
  • FIG. 6 shows an application in the context of discontinuous conveyors in an internal environment (Fig. 6A) and in the outside world (Fig. 6B); and
  • FIG. 7 is a block diagram of a system for defining a virtual sensor.
  • The present disclosure relates generally to the field of intralogistics, and in particular to the control of a material flow in an intralogistics system (storage and/or picking system) by means of real image sensors and virtual (monitoring) sensors.
  • The present development is based on the idea that a technician positions a camera at a freely selectable point within the intralogistics system.
  • The camera generates an image of a section of the intralogistics system that is of interest.
  • The image shows markers that the technician has previously attached to, for example, storage and/or conveying equipment.
  • The placement positions of the markers are determined by the technician in the layout of the intralogistics system, so the camera position can be determined using image processing based on the markers.
  • The technician can then draw monitoring areas in the image, which define virtual sensors.
  • Status changes in the camera image that correspond to the monitoring area can be processed and used to control the material flow without real monitoring sensors being positioned in the real world (intralogistics system).
  • FIG. 1 shows a block diagram of a storage and/or picking system 10, which represents an exemplary intralogistics system.
  • The system 10 comprises a conveyor system 12, an image sensor 14 (a camera 16 is treated below as an example), a virtual sensor 18, an image processing unit 20 (a computer that is set up for editing and processing images) and a controller 22 (e.g. a material flow computer, MFC, 24).
  • The system 10 is usually housed in a building (e.g. in a hall).
  • The term "intralogistics" includes the organization, control, implementation and optimization of the internal flow of materials, the flow of information and the handling of goods in industry, trade and public institutions.
  • The term "material flow" refers to all processes and their linkages in the manufacture, treatment and processing as well as the distribution of (conveyed) goods and objects within certain defined areas (e.g. goods receipt, storage, order picking and goods issue).
  • The material flow is controlled by the material flow computer (MFC) 24, which controls source-destination relationships and coordinates the sequence in which individual orders (e.g. transport orders) are processed.
  • The conveyor system 12 generally refers to the technical systems of the material flow, i.e. conveyor device(s) 26, which essentially cause internal changes of location, i.e. a transport, of (conveyed) goods.
  • The conveying devices 26 comprise two groups (not shown): continuous conveyors and discontinuous conveyors.
  • Continuous conveyors are, e.g., roller conveyors, belt conveyors, chain conveyors, overhead conveyors, etc.
  • Discontinuous conveyors are vehicles (mobile robots, flying drones, etc.) that convey and transport the conveyed material either freely, i.e. autonomously or independently, or track-guided or force-guided, along a conveyor route 28 (path between source and destination).
  • Conveyor routes 28 represent the paths along which the conveyed goods are transported from their source to a destination. If, for example, a roller conveyor is used as the conveyor device 26, then the associated conveyor section 28 essentially corresponds to the longitudinal extension of the roller conveyor. If a vehicle driving on the ground is used as the conveying device 26, then the associated conveying section 28 essentially corresponds to a route of the vehicle. If a flying drone is used as the conveying device 26, then the associated conveying section 28 essentially corresponds to a trajectory of the drone. In general, this means that the path covered by the conveyed goods coupled to the conveying device 26 during transport corresponds to the conveying section 28.
  • The movement of the conveyed goods is conventionally recorded with one or more sensors.
  • Exemplary sensors that are conventionally used in the conveyor system 12 are: light barriers, light sensors, cameras 16 and (barcode) scanners.
  • Light barriers and light sensors can be used to detect whether or not the conveyed goods are at a predetermined location.
  • Light barriers and light scanners work linearly, i.e. they monitor a point or a line.
  • Cameras 16 work two-dimensionally, i.e. they capture areas (images). Cameras can be used to record positions and movements, even of several conveyed goods at the same time. Cameras 16 and scanners can be used to identify the conveyed goods.
  • A "sensor" is therefore to be understood as a technical component that actively measures physical or chemical variables and converts the measured variables into corresponding electrical signals for further processing.
  • Sensors are also referred to as detectors, (measuring variable or measuring) sensors or (measuring) probes (source: Wikipedia on “sensor”).
  • A "virtual sensor 18" is to be understood as a sensor that is not physically present in the system 10 but nevertheless provides desired information, for example about a system status (e.g. whether or not conveyed goods are present at a specific location), on the basis of a variable that is recorded by one or more sensors that are actually present, such as a camera 16, and converted into the desired information by means of data processing.
  • The image sensor 14 represents the real sensor.
  • The image data generated by the image sensor 14 are processed by the image processing device 20.
  • The image data is analyzed using image recognition.
  • Image recognition is a branch of pattern recognition and image processing.
  • Image recognition in the context of image processing is the ability of software to identify objects, places, people, writing and actions in images.
  • Virtual sensors 18 therefore supply information equivalent to that of real sensors, except that the information has to be generated by processing data from other, real sensors.
  • This information is used, among other things, to control actuators 32, such as motors, accumulation conveyors, infeed devices, ejection devices and the like, at the right time.
  • Real sensors are installed at fixed locations, such as a light barrier immediately in front of a material flow node, i.e. a point where, for example, two conveyor sections 28 unite to form a single conveyor section 28. If this light barrier sends a signal, the information is available that a conveyed item has interrupted the light beam at this moment. There is therefore time- and location-dependent information that can be used to control the flow of material, e.g. to activate an ejection device located immediately downstream, which ejects the conveyed goods that are currently interrupting or have just interrupted the light barrier.
  • The installation location of the light barrier in a reference system 30, for example in the reference system 30 of the storage and/or picking system 10 or of the conveyor system 12, is predetermined and stored in corresponding model data.
  • A "model" is generally understood to mean a detailed description of a system (existing or to be manufactured), such as the conveyor system 12.
  • This description illustrates one or more characteristics of the system, such as a shape, a texture, a structure, an arrangement of components (conveyor devices 26 and/or conveyor lines 28), (relative) positions and orientations of the components, dimensions of the components, dimensional ratios and the like.
  • On paper, the description takes the form of a layout or a map. An installer uses this data to install the light barrier in the right place when setting up (installing) the system.
  • The electronic description is in the form of model data stored in a memory of a data processing device.
  • The installation locations and model data are related to a coordinate reference system (e.g. the coordinate system of the conveyor system).
  • A coordinate reference system is understood to mean a coordinate system or reference system 30 which is related to the real world by being linked to a datum (e.g. a point of origin). Coordinate systems are used to uniquely identify the positions (coordinates 34, cf. Fig. 1) of points and objects (e.g. conveyors 26) in a geometric space, such as in the conveyor system 12.
  • A coordinate 34 is one of several numbers with which one uniquely indicates the location of a point in a plane or in a space. Each of the dimensions required for this description is expressed by a coordinate 34.
  • In the following, two-dimensional coordinate and reference systems 30 are considered by way of example. It goes without saying that these systems can be expanded to any number of dimensions, for example to four dimensions (height, width, length and time). Also, the terms "place" and "position" mean the same thing here.
  • Reference systems 30 are therefore required in order to describe the behavior of location-dependent variables clearly and completely.
  • The positions and movements of objects can only be specified relative to the respective reference system 30.
  • "Referencing" generally expresses that things are related to one another, i.e. brought into a relationship with one another.
  • The conveyor system 12 of FIG. 2 has a reference system 30 which can be identical to the reference system 30 of the overall system 10.
  • The camera 16 has a reference system 30'.
  • The reference systems 30 and 30' can be referenced to one another, so that the coordinates 34 of any point in one of the reference systems 30', 30 can also be expressed in coordinates 34 of the same point in the other reference system 30, 30'.
  • This is done by means of a transformation function (coordinate transformation).
  • FIG. 2 shows a schematic partial view of a conveyor system 12 in a plan view.
  • The conveyor system 12 is shown in reality, i.e. in the reference system 30, of an exemplary storage and picking system 10 that includes the conveyor system 12.
  • The upper part of FIG. 2 shows an image 36 that a camera 16 has recorded.
  • The image 36 shows part of the real conveyor system 12 that is in the field of view 37 of the camera 16.
  • The (two-dimensional) image 36 is formed from pixels (not designated in more detail), which are built up in rows and columns.
  • The image 36 has its own frame of reference 30'.
  • The reference system 30 can be the same for the storage and picking system 10 and the conveyor system 12. This means that the systems 10 and 12 can be described in the same frame of reference 30.
  • The reference system 30 can be, for example, a (two- or three-dimensional) Cartesian coordinate system that has its origin (0/0/0) in, for example, the bottom left corner of a building (not illustrated here) in which the systems 10 and 12 are installed.
  • The reference system 30' of the image 36 can also be a (two-dimensional) Cartesian coordinate system.
  • The coordinate system of the conveyor system 12 could alternatively be made up of coordinates 34 associated, for example, with a location of a conveyor 26 within the system 12, such as "1st conveyor of the main line", "2nd conveyor of the main line", ..., "nth conveyor of the main line", "1st conveyor of the 1st branch", etc. Furthermore, these conveyors could be further subdivided into "input", "middle" and "output", so that a location where a sensor can be placed can be described in more detail, such as "1st main line conveyor, exit".
  • In FIG. 2 there are two conveyor lines 28-1 and 28-2, which are illustrated with dash-dotted lines.
  • The conveyor sections 28-1 and 28-2 each extend in a straight line, as an example.
  • The conveyor section 28-1 runs horizontally in FIG. 2 and the conveyor section 28-2 runs at an angle to it.
  • The conveyor sections 28-1 and 28-2 are implemented, as examples, by continuous conveyors. It goes without saying that discontinuous conveyors can also be used, alternatively or in addition, to convey goods (not shown) along the conveying routes 28.
  • The conveying sections 28-1 and 28-2 are of modular design, i.e. they comprise a plurality of conveying devices 26 which are arranged in such a way that they adjoin one another.
  • The routes 28 could also each be implemented by only a single conveyor.
  • The conveyor 26-1 is a linear roller conveyor and the conveyor 26-2 is a linear belt conveyor.
  • Non-linear conveyors, such as curved conveyors, can also be used in addition or as an alternative.
  • The conveyor devices 26 can also overcome height differences, for example by using vertical conveyors or ramps for vehicles. All of this is not shown in FIG. 2 in order to simplify the explanation, but is nevertheless possible.
  • In Fig. 2, three reference points M1 to M3 are shown as an example, which can be used for the desired coordinate transformation. It goes without saying that the entire system 10 or 12 is usually provided with many more reference points than are shown in FIG. 2. This aspect will be discussed in more detail below.
  • The three reference points are denoted by M1 to M3 in FIG. 2 because, in this case, the reference points are implemented by exemplary markings M, which were subsequently and additionally attached to at least some of the conveyor devices 26.
  • The marks M1 and M3 are attached fixedly and unchangeably to the conveyor 26-2.
  • The mark M2 is attached to the conveyor 26-4.
  • The markings M are components of the system 10 (cf. FIG. 1) or of the conveyor system 12. It is understood that the markings M can also be attached to other components of the system 10, such as shelves, the (building) floor, walls, the ceiling, posts or the like.
  • The locations and positions at which the markers M1 through M3 are attached to the conveyors 26-2 and 26-4 are to be determined in the frame of reference 30 of the conveyor system 12.
  • The coordinates 34 of the markings M1 to M3, i.e. of the reference points, in the reference system 30 are thus determined.
  • The markings M1 to M3 can also be attached to the corresponding conveyor devices 26 at predefined locations before the installation of the conveyor devices 26 within the system 10 or 12, i.e. by the manufacturer (e.g. markings are always placed at the entrance of a conveyor, on the left when looking in the conveying direction).
  • In this case, the operator of the system 10 only specifies the installation locations of the conveyor devices 26 in the overall system 10.
  • The locations of the markings M within the reference system 30 of the conveyor system 12 are then obtained automatically, in particular independently of whether the conveyor system 12 is positioned exactly in the overall system 10 or not.
  • The camera 16 can be positioned anywhere. It is only necessary to ensure that a sufficient number of reference points are contained in the image 36 so that the reference points in the image 36 can be identified by means of image recognition, i.e. can be transformed into the reference system 30. As soon as it is clear which reference points can be seen in the image 36, the coordinate transformation can be determined in order to clearly assign locations in the reference system 30 of the conveyor system 12 to the reference points contained in the image 36. In this way a location in the reference system 30 can be assigned to each pixel. The image 36 is then referenced to the reference system 30 of the conveyor system 12.
  • The referencing makes it possible to define one or more virtual sensors 18 in the image 36.
  • The virtual sensors 18-1 to 18-8 are defined, for example, as light barriers (cf. lines) and the virtual sensor 18-9 is defined as a barcode scanner (cf. rectangle).
  • The light barriers 18-1 to 18-3 are used on the conveyor 26-1 (a roller conveyor with several storage locations, segmentation not shown) for implementing storage locations, so that the MFC 24 can stop and store conveyed goods there.
  • The light barriers 18-4 and 18-5 are used on the conveyor (a continuous belt conveyor) to influence the speed of conveyed goods immediately before the merging point with the conveyor 26-5.
  • The light barriers 18-6 and 18-8 are also used to influence the speed before the merging point.
  • The light barrier 18-7 is oriented obliquely to the conveying direction (and thus not perpendicular to it like the other light barriers) in order to be able to carry out a length measurement.
  • The scanner 18-9 scans an area and is used to determine identity.
  • Areas to the side of the conveyor sections and/or conveyor devices 26 could also be monitored, e.g. to detect falling conveyed goods. These areas are preferably directly adjacent to the conveying devices 26, as indicated by hatched areas in FIG. 2.
  • First, an image 36 free of conveyed goods is generated, which serves as a basis for images 36 that are recorded later, while conveyed goods are conveyed through the field of view 37 of the image sensor 14.
  • In this way, signals are obtained that the MFC 24 can use to control actuators that are required for material flow control.
  • For example, the signal from a virtual light barrier can be used to switch off a drive that drives a conveyor 26 assigned to the same location as the virtual light barrier, in order to stop the conveyed goods (congestion function).
  • The virtual light barrier supplies a corresponding signal as soon as the conveyed goods enter the area assigned to the virtual light barrier.
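  • Sketched as a simple callback (the drive interface is purely hypothetical):

```python
def on_virtual_light_barrier(occupied: bool, drive) -> None:
    """Congestion function: hold the conveyed goods while the virtual
    light barrier assigned to the conveyor's location reports 'occupied'."""
    if occupied:
        drive.stop()   # stop the conveyed goods at this position
    else:
        drive.start()  # resume transport once the area is clear again
```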
  • FIG. 3 illustrates the (subsequent) definition of a virtual sensor 18 at any location within the image 36.
  • The virtual sensor 18 is assigned a (monitoring) area of the image 36 within the conveyor system 12, the location coordinates of which are known in the reference system 30.
  • FIG. 3 shows an exemplary tablet computer 40 with a touch-sensitive screen 42 on which the image 36 of the camera 16 of FIG. 2 is displayed to a user 44.
  • The virtual sensor 18-5 is already illustrated in the displayed image 36.
  • The (camera) image 36 is already referenced to reality, i.e. the coordinate transformation between the reference system 30' of the image 36 and the reference system 30 of the real world has already been determined using the identified reference points (M1 to M3).
  • The user 44 has drawn the virtual sensor 18-5 on the screen 42 manually, e.g. with a stylus 46.
  • The screen 42 records the position, orientation and/or size of the drawn sensor 18 in its frame of reference 30'.
  • The drawn position, orientation and/or size of the virtual sensor 18-5 can now be converted into corresponding data, in particular location data, in the real world in order to simulate the desired sensor data.
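  • A sketch of this conversion, reusing an image-to-world homography H as determined during referencing; the file name and the endpoint values are hypothetical:

```python
import numpy as np

# 3x3 image-to-world homography stored after the referencing step.
H = np.load("image_to_world_homography.npy")

def to_world(u: float, v: float) -> np.ndarray:
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]

# Endpoints of the line the user 44 drew on the screen 42 (image pixels).
start_px, end_px = (412, 388), (412, 512)

# The same line expressed in the reference system 30 of the conveyor
# system; these location data define the virtual light barrier 18-5 and
# can be registered with the material flow control.
sensor_line_world = (to_world(*start_px), to_world(*end_px))
print(sensor_line_world)
```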
  • The user 44 also defines a sensor type (e.g. light barrier) and integrates the sensor defined in this way into the material flow control 22 (cf. FIG. 1).
  • The virtual sensors 18 can also be defined only in the model data of the system 10 and/or 12, in which case the image sensor 14 must then be positioned in such a way that the field of view 37 captures the areas defined in this way (and a sufficient number of reference points).
  • FIG. 4A shows an old, prior-art material flow controller with real sensors, FIG. 4B shows a new controller with a virtual sensor 18 (not shown), and FIG. 4C shows the integration of a virtual sensor 18 (not shown) into an old controller.
  • Figure 4A shows that conventional controllers are physically connected to each real world sensor. In other words, this means that a large number of signal lines or a bus system is used to physically connect the real sensors to the controller.
  • FIG. 4B illustrates the same scenario as FIG. 4A, but the real sensors have been replaced by the (preferably single) image sensor 14, which in turn simulates a plurality of virtual sensors 18.
  • FIG. 4C also illustrates the same scenario as FIGS. 4A and 4B, in which the image sensor 14 or the virtual sensors 18 are coupled to an old controller via an intermediate controller 48.
  • The intermediate controller 48 converts the signal from the image sensor 14 into a corresponding number of signals from virtual sensors 18, which are supplied to the corresponding terminals of the old controller.
  • FIG. 4C thus shows the possibility of retrofitting, so that existing controllers can also be expanded to include virtual sensors 18.
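  • One way to picture the intermediate controller 48 is as a small adapter that maps virtual-sensor states onto the discrete terminals an existing controller expects; the I/O interface below is purely hypothetical:

```python
class IntermediateController:
    """Feeds virtual sensor states to the terminals of an old controller."""

    def __init__(self, write_output, terminal_map):
        self.write_output = write_output  # e.g. a digital-output/fieldbus call
        self.terminal_map = terminal_map  # virtual sensor id -> terminal no.

    def update(self, sensor_states):
        """sensor_states: dict mapping virtual sensor id to bool (occupied).
        Each state is written to the terminal where the old controller
        expects a real, wired sensor, so the controller needs no changes."""
        for sensor_id, occupied in sensor_states.items():
            self.write_output(self.terminal_map[sensor_id], occupied)

# Usage sketch with states derived from the image sensor 14:
controller = IntermediateController(write_output=print,
                                    terminal_map={"18-4": 3, "18-5": 4})
controller.update({"18-4": True, "18-5": False})
```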
  • FIG. 5 shows a flow chart of a method for defining a virtual sensor 18 in a conveyor system 12 whose model is known in the reference system 30 of the conveyor system 12. That is, the model of the conveyor system 12 is provided, the model including at least a position, and preferably an orientation and dimension, of each of the conveyor runs 28 in the frame of reference 30 of the conveyor system 12 (or of the overall system 10).
  • The conveyor devices 26 shown in FIG. 2 can form a (first modular) conveyor system 12, to which further modular systems (not illustrated) connect, which are constructed similarly to the conveyor system 12 of FIG. 2.
  • In a step S10, at least one reference point in the reference system 30 of the conveyor system 12 is defined.
  • The reference system 30 is the frame of reference to which everything is intended to relate.
  • The definition can take place, for example, by positioning additional markings within the conveyor system 12.
  • One, several or all conveyor devices 26 can be provided with the markings M (cf. M1 to M3) (see optional step S12), for example by sticking the markings M onto the conveyor devices 26.
  • The markers M can also be provided separately from the conveyors 26, but in this case at fixed and unchanging positions relative to the conveyors 26.
  • The markers M could, for example, be attached to stands (not shown) that are positioned separately from and spaced apart from the conveyors 26.
  • The markings M can also be attached to other elements in the environment (e.g. on shelves, doors, walls, the ceiling, the floor, etc.) which are in an unchangeable spatial relationship to the conveyor devices 26 or the conveyor sections 28. This is particularly advantageous when discontinuous conveyors are used.
  • The markings could be glued to the floor, for example, similar to guide tracks for mobile robots.
  • The markings M can, for example, be barcodes or other optically recognizable markings. In particular, they can be characteristic markings that are used only once in the conveyor system 12 in order to facilitate a clear assignment (identification) of the marking M to the desired conveyor route 28.
  • Alternatively, identical markings M could also be used, e.g. reflective dots, which are detected particularly well by the image sensor 14 and are easily recognizable in the image 36, but cannot be individually identified.
  • In that case the markings are not individually identifiable, but can only be clearly identified in groups, e.g. using pattern recognition.
  • The definition of the reference points in step S10 also includes determining the position of each defined reference point in the (reference) frame of reference 30 of the conveyor system 12. These positions are required to determine the coordinate transformation.
  • The image sensor 14 is then positioned within the conveyor system 12 such that at least one of the conveyor sections 28 and at least one of the previously defined reference points are in the field of view 37 of the image sensor 14.
  • In general, several reference points are required within the field of view 37 in order to reliably carry out the (in particular automated) identification of the reference points present there. This is particularly difficult if the reference points are realized by markings M that are difficult or impossible to distinguish, or if the differences in the markings are not visually recognizable (e.g. due to insufficient resolution). In these cases it is helpful to be able to access additional information, such as distances between the markings, attachment locations in relation to the conveying devices 26, or the like.
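  • As a toy illustration of such a disambiguation (not the disclosed procedure): visually identical markers could be assigned to the known reference points by comparing normalized pairwise-distance patterns, assuming an approximately distance-preserving (e.g. top-down) view:

```python
from itertools import permutations

import numpy as np

def match_markers(detected_px, known_world, tol=0.05):
    """Assign detected, visually identical markers (image pixels) to known
    reference points (reference-frame coordinates) by distance patterns."""
    def dist_pattern(pts):
        pts = np.asarray(pts, dtype=float)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        return d / d.max()  # normalization removes the pixel-vs-metre scale

    target = dist_pattern(known_world)
    best, best_err = None, float("inf")
    # Try every assignment of detected markers to known reference points.
    for perm in permutations(range(len(detected_px)), len(known_world)):
        err = np.abs(dist_pattern([detected_px[i] for i in perm])
                     - target).mean()
        if err < best_err:
            best, best_err = perm, err
    # best[k] is the index of the detected marker matched to the k-th
    # known reference point; None if no assignment fits well enough.
    return best if best_err < tol else None
```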
  • The image 36 is generated with the image sensor 14, preferably periodically, after the image sensor 14 has been positioned accordingly.
  • Correct positioning can be checked by the user 44 by moving the camera 16 back and forth until he sees a sufficient number of reference points (live) on his tablet computer 40 (cf. FIG. 3).
  • The reference points in the generated image 36 are identified, preferably by means of image recognition, and the positions of the identified reference points are determined in the reference system 30' of the image 36.
  • Identification means that the reference point in the image 36 must be recognized in order to decide which pixels of the image 36 represent a reference point and which do not.
  • The reference points can be recognized and identified by the user 44 in the image 36, for example.
  • The user 44 can use the stylus 46 to select the reference points he has visually recognized in the image 36 on the screen 42 and assign to them the corresponding reference point in the real world, i.e. its position in the reference frame of reference 30.
  • The identification can also take place automatically, by means of image recognition, in the image processing device 20.
  • For this purpose, the reference points in the image 36 are first recognized as reference points, i.e. detected, and then automatically identified based on further information, such as the distance between the recognized reference points, distances to other significant, easily recognizable features in the image 36, or the like. This means that it is automatically determined which of the reference points are actually contained in the image 36.
  • A coordinate transformation is then determined based on the positions of the mutually assigned reference points in both reference systems 30 and 30', so that each pixel of the generated image 36 can be assigned a coordinate 34 in the reference system 30 of the conveyor system 12.
  • Finally, a monitoring area 50 (in the image 36 or in the model) is defined, which is to be monitored by the virtual sensor 18 and which at least partially overlaps with one of the conveyor sections 28 that is in the field of view 37 of the image sensor 14.
  • Fig. 6 illustrates the possible uses of the virtual sensor 18 when discontinuous conveyors are used.
  • FIG. 6A shows a driverless transport vehicle (AGV) 52 and several drones 54 in the vicinity of a rack warehouse.
  • Two two-dimensional monitoring areas 50-1 and 50-2 are illustrated, which overlap the routes 28 (dash-dotted lines) or are penetrated by the routes 28.
  • Four markers M are shown by way of example, with markers M1 and M2 attached to a shelf and markers M3 and M4 attached to the top of a passageway.
  • The area 50-1 surrounds a front of the shelf and is monitored to determine whether the AGV 52 or one of the drones 54 is moving through the area 50-1.
  • The area 50-3 is defined three-dimensionally and can represent, for example, a safety zone which the vehicles 52 and 54 are not allowed to enter because people could be there.
  • FIG. 6B also shows vehicles 52 and 54 in an outdoor environment.
  • Vehicles 52 and 54 can be used to deliver parcels (goods).
  • The areas 50 represent waypoints, destinations or safety zones.
  • The monitoring of the areas 50 is illustrated in a step S22 in FIG. 5. Monitoring takes place, for example, by comparing images 36 that were recorded at different times.
  • A first basic image can show a state that is free of conveyed goods, so that a transport of conveyed goods can be automatically recognized at any time as a difference from this original image.
  • The controller 22 can act accordingly on the material flow in an optional step S24.
  • The vehicles 52 and 54 of FIG. 6 may be stopped, or alerting and warning signals may be generated for persons, such as residents.
  • The conveyors 26 of FIG. 3 could be stopped or accelerated.
  • The item to be conveyed could be identified in FIG. 3, to give just a few examples.
  • FIG. 7 shows a block diagram of a system 60 for defining virtual sensors 18 for a conveyor system 12 that includes at least one conveyor line.
  • The system 60 comprises the image sensor 14, the markings M, a (data) memory 62, the image processing device 20, a coordinate transformation device 64 and a device 66 for defining virtual sensors. With the exception of the markings M, these components are connected so as to exchange data with one another.
  • The system 60 can be coupled to the controller 22 of FIG. 1 and generates control signals for the actuators 32 of FIG. 1.
  • The image sensor 14 can in turn be implemented by a camera 16.
  • The positions of the reference points which are known in the reference frame of reference 30 are stored in the data memory 62.
  • The image processing device 20 is set up to determine positions of reference points, which can be identified in the image 36, preferably by means of pattern recognition, in the reference system 30' of the generated image 36.
  • The coordinate transformation device 64 is set up to determine a coordinate transformation based on the positions of the reference points in the two reference systems 30 and 30', so that each pixel of the generated image 36 can be assigned a coordinate 34 in the reference frame of reference 30.
  • The device 66 is set up to define a monitoring area 50, preferably in the image 36, which is to be monitored by the virtual sensor 18 and which at least partially overlaps with, or is adjacent to, one of the conveyor sections 28 in the field of view 37 of the image sensor 14.
  • Several image sensors/cameras can also be used simultaneously.
  • The fields of view of the cameras may overlap. If multiple cameras are used, 3D positions of moving objects can also be determined.
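  • A sketch of the multi-camera case, assuming two calibrated cameras with known 3x4 projection matrices; the toy values below are in normalized coordinates, not real calibration data:

```python
import cv2
import numpy as np

# Toy projection matrices: camera 2 is shifted 0.5 units along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# The same moving object observed in both overlapping fields of view,
# as 2x1 point arrays in each camera's normalized image coordinates.
pt1 = np.array([[0.32], [0.24]])
pt2 = np.array([[0.29], [0.24]])

# Linear triangulation returns a homogeneous 4-vector ...
X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)
# ... which dehomogenizes to the 3D position in the frame of camera 1.
X = (X_h[:3] / X_h[3]).ravel()
print("3D position:", X)
```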

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for defining a virtual sensor (18) in a conveyor system (12) having at least one conveyor section (28), the method comprising the following steps: a) providing a model of the conveyor system (12), the model having at least one position, and preferably an orientation and/or a dimension, of each of the conveyor sections (28) in a reference frame (30), in particular in the reference frame (30) of the conveyor system (12); b) defining (S10) at least one reference point, in particular by positioning preferably identical additional markings (M1, M2, M3), and determining a position for each defined reference point in the reference frame (30); c) positioning an image sensor (14) within the conveyor system (12), preferably at any desired location, so that at least one of the conveyor sections (28) and at least one of the reference points lie in a field of view (37) of the image sensor (14); d) generating an image (36) with the image sensor (14) once the image sensor (14) is positioned; e) identifying reference points in the image (36) and determining positions of the identified reference points in a reference frame (30') of the generated image (36); f) determining a coordinate transformation, preferably by means of image processing, on the basis of the positions of the reference points in the two reference frames (30, 30'), so that a coordinate (34) of the reference frame (30) can be assigned to each pixel of the generated image (36); and g) defining a monitoring area (50), preferably in the image (36), which is to be monitored by the virtual sensor (18) and which at least partially overlaps with, or is adjacent to, one of the conveyor sections (28) lying in the field of view (37) of the image sensor (14).
PCT/EP2022/054703 2021-02-26 2022-02-24 Material flow control with virtual sensors WO2022180179A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22712534.1A 2021-02-26 2022-02-24 Material flow control with virtual sensors

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021104623.8 2021-02-26
DE102021104623.8A 2021-02-26 2021-02-26 Material flow control with virtual sensors

Publications (1)

Publication Number Publication Date
WO2022180179A1 (fr)

Family

Family ID: 80937260

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/054703 WO2022180179A1 (fr) 2021-02-26 2022-02-24 Material flow control with virtual sensors

Country Status (3)

Country Link
EP (1) EP4154219A1 (fr)
DE (1) DE102021104623A1 (fr)
WO (1) WO2022180179A1 (fr)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110019877A1 (en) * 2006-01-19 2011-01-27 Martin Kasemann Method and Apparatus For Monitoring a Production Line
DE102006015689A1 2006-03-28 2007-10-04 SSI Schäfer PEEM GmbH System and method for controlling a conveyor system by means of computer-aided image recognition
DE102007035272A1 2007-07-27 2009-01-29 Siemens Ag Method for identifying transported goods, in particular items of luggage
JP2011195288A * 2010-03-19 2011-10-06 Toshiba Elevator Co Ltd Passenger conveyor image processing device
US8699758B2 (en) * 2010-11-18 2014-04-15 Axis Ab Object counter and method for counting objects
DE102011053547A1 2011-09-13 2013-03-14 Apologistics Gmbh Method and system for storing and picking articles, in particular pharmacy articles
DE102011055455A1 2011-11-17 2013-05-23 Apologistics Gmbh Arrangement and method for the automated packaging of products
US20150041281A1 (en) * 2013-08-07 2015-02-12 Solystic Conveyor system including tracking of the conveyed articles by using imaging and virtual sensors
US9365357B2 (en) 2013-08-07 2016-06-14 Solystic Conveyor system including tracking of the conveyed articles by using imaging and virtual sensors
US20180033151A1 (en) * 2015-02-25 2018-02-01 Panasonic Intellectual Property Management Co., Ltd. Monitoring device and monitoring method
WO2018211072A1 2017-05-18 2018-11-22 Ssi Schäfer Automation Gmbh (At) Device and method for controlling a material flow at a material flow node

Also Published As

Publication number Publication date
DE102021104623A1 (de) 2022-09-01
EP4154219A1 (fr) 2023-03-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22712534; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022712534; Country of ref document: EP; Effective date: 20221222)
NENP Non-entry into the national phase (Ref country code: DE)