WO2014028959A1 - Procédé et système d'assistance à un ouvrier dans une installation de manipulation de marchandises - Google Patents

Procédé et système d'assistance à un ouvrier dans une installation de manipulation de marchandises Download PDF

Info

Publication number
WO2014028959A1
Authority
WO
WIPO (PCT)
Prior art keywords
worker
camera
goods
detected
movements
Prior art date
Application number
PCT/AT2013/050166
Other languages
German (de)
English (en)
Inventor
Markus Winkler
Original Assignee
Tgw Logistics Group Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tgw Logistics Group Gmbh filed Critical Tgw Logistics Group Gmbh
Priority to EP13780049.6A priority Critical patent/EP2888185A1/fr
Publication of WO2014028959A1 publication Critical patent/WO2014028959A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02Storage devices
    • B65G1/04Storage devices mechanical
    • B65G1/137Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02Storage devices
    • B65G1/04Storage devices mechanical
    • B65G1/137Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • B65G1/1373Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses
    • B65G1/1378Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses the orders being assembled on fixed commissioning areas remote from the storage areas
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • the invention relates to a method and an assistance system for assisting a worker in a facility for manipulating goods.
  • the invention relates to a use of said method and / or said assistance system.
  • EP 1 487 616 B1 discloses an augmented reality system for a warehouse.
  • actual movements of a warehouse worker are compared using a camera with target movements and logged accordingly.
  • the necessary camera is mounted on the helmet of a worker.
  • additional information such as the location of a part to be detected, may be displayed in data glasses.
  • EP 2 161 219 B1 discloses another augmented reality system for a warehouse worker where the camera is also mounted on a worker's helmet.
  • a kind of navigation system can be realized with data glasses, in which a target movement of the worker is displayed by means of superimposed directional arrows.
  • WO 03/034397 A1 discloses an augmented reality system which can be used inter alia in a warehouse.
  • the camera is in turn mounted on the helmet of a worker.
  • these systems have narrow limits. For example, the movements of a worker can not be monitored independently of his line of sight.
  • the monitoring of a workflow is dependent on the worker and his helmet camera looking at his hands.
  • this cannot be assumed in every case, because experienced workers in particular do things "blindly" without necessarily looking at their hands; workers are frequently observed who perform their work without any problems despite lively conversations with their colleagues. Their gaze often wanders to the conversation partner and is rarely focused on the actual job.
  • Another disadvantage of helmet cameras is their weight, which sometimes seriously affects the productivity of the worker. Especially under aggravating working conditions such as high temperature and high humidity, wearing such a helmet is perceived as very uncomfortable. In any case, it increases sweating on the head. If the worker wipes sweat from his forehead, the camera is "blinded" for a few moments and cannot continuously monitor the workflow, so important steps can be "overlooked" by the camera. When the work is done in a cold environment, the worker's breathing air often causes the camera lens to fog up, which likewise makes monitoring of a workflow impossible. Finally, the weight of the camera and data glasses limits the maximum head acceleration that occurs, for example, when turning the head quickly. The worker therefore cannot work as fast as his physiology would otherwise allow.
  • meaningful monitoring of such an operation is then practically impossible. If the camera loses sight of the containers even for a single moment, reassigning the source and target containers when the view is restored, and thus reliably monitoring the work process, becomes extremely difficult. It cannot be determined whether the worker swapped the source and target containers during a "blind" phase of the camera to ease his working situation. If the camera's image area covers only one of the containers (for example when a lens with a relatively long focal length in relation to the size of the containers is used), unambiguously determining which container is currently being captured is likewise virtually impossible.
  • the known systems are primarily set up for operations performed by a single worker. If several people work together, meaningful monitoring of the workflow is very difficult and only possible with very high computational effort. If, for example, three people work together, the work area is covered by three different cameras whose coverage is influenced by the head movements of the persons concerned. The habit of people to look a conversation partner in the eye is also a major hindrance to monitoring such a workflow. If one of the three persons gives instructions to the other two, they (and thus their helmet cameras) inevitably look at the instructor. Although, in principle, three cameras are available for observation in this work situation, none of them captures the object that is actually being manipulated (e.g., jointly handled or handed over).
  • the object of the present invention is now to specify an improved method and an improved assistance system for assisting a worker in a facility for manipulating goods.
  • the above-mentioned disadvantages are to be overcome.
  • the object is achieved by a method for assisting a worker in a facility for manipulating goods, comprising the steps: a) detecting movements of the worker with at least one camera mounted on the installation side, b) comparing whether the detected movements of the worker lie within a target range of a first movement sequence, and c) if so, checking the execution of a further movement sequence, otherwise outputting a correction signal in order to bring about the execution of the desired movement sequence.
  • As a rule, a movement sequence comprises picking up and depositing at least one goods item; the detection range of the at least one camera is subdivided into segments, and a movement sequence contains an occupancy of the segments by the worker (a minimal sketch of this comparison loop is given below).
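  • The following is a minimal, hypothetical Python sketch of the comparison in steps b) and c); the segment identifiers, the `detect_occupied_segment` callback and the signalling function are illustrative assumptions, not part of the claimed method.

```python
# Hypothetical sketch of steps a)-c): compare the occupancy sequence detected by
# the installation-side camera with the target sequence of a movement, and emit
# a correction signal when the worker deviates from it.

def monitor_movement_sequence(target_segments, detect_occupied_segment,
                              emit_correction, max_frames=1000):
    """target_segments: ordered segment ids, e.g. ['pick:132', 'put:134'].
    detect_occupied_segment(): returns the segment id the worker's hand
    currently occupies, or None (assumed to be supplied by the vision system)."""
    expected = list(target_segments)
    for _ in range(max_frames):
        if not expected:
            return True                        # whole sequence executed -> step c) positive
        seg = detect_occupied_segment()
        if seg is None:
            continue                           # no occupancy detected in this frame
        if seg == expected[0]:
            expected.pop(0)                    # next waypoint of the sequence reached
        else:
            emit_correction(expected[0], seg)  # outside the target range -> correction signal
    return False                               # sequence not completed within the observation window


if __name__ == "__main__":
    frames = iter([None, "pick:132", None, "put:133", "put:134"])
    ok = monitor_movement_sequence(
        ["pick:132", "put:134"],
        detect_occupied_segment=lambda: next(frames, None),
        emit_correction=lambda want, got: print(f"correction: expected {want}, detected {got}"),
    )
    print("sequence completed:", ok)
```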
  • an assistance system for assisting a worker in a device for manipulating goods comprising:
  • At least one camera mounted on the system for detecting movements of the worker
  • By detecting the movements of a worker from an observation point remote from him, the system can be used very flexibly under changing conditions and captures the monitored workflow with high reliability. This significantly improves the monitoring of a workflow.
  • the following advantages result in particular:
  • the movements of a worker can be monitored independently of his line of sight.
  • the system can also be used for "experienced" workers who do not keep their eyes on their hands.
  • the worker is more efficient because he is not burdened by the weight of a helmet camera and can move freely. Fast head turns, for example, are also unproblematic for the presented system in terms of image processing. Fogging of the camera by the worker's breathing air is also unlikely, because the camera need not necessarily be mounted directly above the worker's head but can be placed at a more advantageous location, for example slightly away from him.
  • the position of the observing camera is always known and does not have to be laboriously calculated. The system therefore cannot fall "out of step". Work steps can thus be specified fluidly even with comparatively low computational effort.
  • the said method or the said assistance system can therefore be used to monitor a sequence of movements which relates to the picking and / or resorting of goods.
  • the said method or the said assistance system are used in a warehouse.
  • a warehouse is used for receiving, storing and usually distribution of goods.
  • a warehouse can be designed, for example, as a high-bay warehouse, small-parts warehouse, floor storage, high-level storage or paternoster storage and can comprise, for example, shelves of various types (such as shelf racks, pallet racks, flow racks and high-bay racks) as well as various types of conveyors (such as roller conveyors, belt conveyors, forklifts and driverless transport vehicles).
  • a grid can be laid over a work surface (worktable) and preferably also displayed by a laser projector. Of course, fixed markings on this work surface are also conceivable. Movements are evaluated as a sequence of occupied segments.
  • a “camera” is understood to mean a device or system which images a captured scene two-dimensionally or three-dimensionally.
  • a “camera” can image a captured scene by detecting the light generated and/or reflected by the scene, both in the visible and in the invisible wavelength range.
  • a “camera” within the meaning of the invention does not necessarily use light as an information carrier; additionally or alternatively, the imaging of a scene can also take place in other ways.
  • the movements of the goods manipulated by the worker are detected with the at least one camera before step b), and in step b) it is compared whether the detected movements of the worker and of the goods lie within a target range of a first movement sequence.
  • Because not only the worker but also the goods manipulated by him are detected by the camera, the monitoring of the movement is particularly reliable. In this embodiment it is not sufficient, for example, to move only the hand in a predetermined manner. Instead, the goods must be moved in a predetermined manner.
  • the detection of the goods by the camera can of course not be ruled out even if it is merely compared whether the detected movements of the worker lie within a desired range of a first movement sequence.
  • the goods may be detected "incidentally" and without further consequences, whereas in the case of the preceding embodiment the goods are recorded as planned and their movement is evaluated.
  • a product is classified as "picked up" by the worker if, between picking up and depositing, in particular during the entire movement sequence, it is at least partially covered by a hand of the worker or by a tool held by him (e.g. pliers or a gripper).
  • a goods item is at least partially obscured when it is held by a worker, and this circumstance is exploited in the present embodiment to classify a goods item as "picked up". If it is not completely recognizable between picking up (starting point of the movement sequence) and depositing (end point of the movement sequence), then it can be assumed with high probability that it is being moved by the worker in the predetermined manner and is not, for example, merely lying on a work table between the starting point and the end point. It is checked whether the product is at least partially covered during the entire movement sequence by a hand of the worker or by a tool held by him.
  • a product is not classified as grasped by the worker if, for example, a worker's hand is merely moved over a product lying on the work table.
  • data from a database in which 3D models of the goods are stored can be used for this. If a view of the relevant 3D model does not completely correspond to the view of the goods captured by the camera, and in particular the hidden portion corresponds (at least in part) to a hand of the worker or a tool held by him, then the goods can be classified as picked up.
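  • A hedged sketch of how such an occlusion test could look is given below; the masks, the 0.95 visibility threshold and the helper names are assumptions made only for illustration and are not prescribed by the description.

```python
import numpy as np

# Illustrative occlusion test: a product counts as "picked up" when the part of its
# expected silhouette (from a stored 3D-model view) that is NOT visible in the camera
# image overlaps the detected hand/tool mask.

def classified_as_grasped(expected_silhouette, visible_goods_mask, hand_mask,
                          visibility_threshold=0.95):
    expected = expected_silhouette.astype(bool)
    visible = visible_goods_mask.astype(bool) & expected
    hidden = expected & ~visible                      # part of the goods that is occluded
    visibility = visible.sum() / max(expected.sum(), 1)
    if visibility >= visibility_threshold:
        return False                                  # goods fully visible -> merely lying on the table
    overlap = (hidden & hand_mask.astype(bool)).sum()
    return overlap / max(hidden.sum(), 1) > 0.5       # occluded mainly by hand or tool


if __name__ == "__main__":
    expected = np.zeros((10, 10), bool); expected[2:8, 2:8] = True
    visible = expected.copy();           visible[2:8, 5:8] = False   # right half hidden
    hand    = np.zeros((10, 10), bool);  hand[2:8, 5:9] = True       # hand covers that half
    print(classified_as_grasped(expected, visible, hand))            # -> True
```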
  • before step a), at least the starting point and/or the end point of a movement sequence are displayed.
  • the worker is shown what to do.
  • the starting point and the end point may be provided with different visual markings to indicate to the worker what to do.
  • the start and end points can be marked with different colors.
  • the camera identifies persons and performs the display of at least the start and end points of a movement sequence on the basis of a personal setting (profile).
  • in this way, preferences but also restrictions of certain users can be taken into account. For example, the starting point and the end point may by default be marked red and green, but instead purple and orange if the worker has difficulty distinguishing red and green.
  • the starting point and end point can also be marked with different shapes, such as a circle and a cross.
  • indicator lamps or laser projectors, for example, are available for the optical marking of the start and end points; the latter can be used particularly flexibly for producing different markings by means of beam deflection.
  • the display of the start and end point can also take place with the help of a screen or a corresponding overlay in data glasses. Especially with screens and data glasses, even complex movement sequences can be displayed. For example, a worker may be instructed to rotate a picked-up item before it is deposited.
  • acoustic instructions are also possible, for example "T-shirt from the left stack, turn 90° to the left and place in shipping box number 1".
  • the start point and end point of a movement sequence are not displayed at the same time, but with a time offset.
  • the time offset can be fixed (e.g. one second) or individually calculated/estimated on the basis of the movement sequence. It is also possible to show only the starting point at first and the end point only when it is recognized that the goods have been taken from the starting point.
  • a product is automatically delivered and/or transported away after step c) if said comparison turns out to be positive.
  • goods to be manipulated are thus automatically delivered and / or transported away.
  • for this purpose, the result of the comparison of whether the movements of the worker and of the goods lie within a target range of a first movement sequence is evaluated. If the result is positive, the current movement sequence can be considered complete, and processed goods can be transported away and/or new goods to be processed can be delivered.
  • this process does not need to be explicitly requested by the worker, for example by pressing a corresponding key, but is done automatically. Also the execution of a certain gesture is not necessary. The worker can thus work more efficiently. Of course, it can also be provided to consciously control a delivery of goods and their removal, just by pressing a button or by making a gesture.
  • the workflow is stopped when it is detected that the worker has moved away from his workplace / work area, for example, to visit the toilet.
  • the worker need not make an explicit statement to stop or restart the workflow. Stopping or starting up occurs automatically as a consequence of the worker being or not present in the workplace / workspace.
  • as mentioned, it can also be provided to deliberately control the delivery of goods and their removal, for example by pressing a button or by making a gesture.
  • a loading aid is transported to a provisioning location of the installation, which may include a conveyor, the camera detects a position and/or orientation of the transported loading aid relative to the provisioning location, and at least one segment is assigned to it.
  • at least one segment is assigned to the loading aid (for example a source box or storage container), so that steps a) to c) can be carried out.
  • a loading aid can also be assigned multiple segments.
  • this is useful, for example, if the loading aid is subdivided and has several compartments. It is generally conceivable that the actual position and/or actual orientation of the loading aid is detected and used for the assignment of the at least one segment, so that the at least one segment can be well aligned with the loading aid. This can be done with the camera, for example.
  • the delivered loading aid may generally be empty or already contain goods if it is a destination box (order container), or be stocked with stock items if it is a source box (storage container). It is possible for the at least one segment to be variably defined in terms of its size and/or position and to be matched approximately to a contour of a peripheral edge of the loading aid located in its position and/or orientation at the provisioning location.
  • the term "a contour of a peripheral edge of the loading aid" is to be understood as meaning that the contour is defined by the outer walls of the loading aid (in particular a loading aid without compartment division) or that the contour is defined by the partition walls and outer walls of the loading aid (a loading aid with compartment division or individual compartments).
  • a loading aid may be provided sufficiently accurately in its conveying direction and/or transversely to the conveying direction on the conveyor, yet be rotated at the provisioning location. Thanks to this "dynamic adaptability", the segment can be changed individually and in accordance with the current situation, and the detection range of the camera can be adjusted. The detection of the actual position and/or actual orientation preferably takes place by means of the already existing camera, but it would also be conceivable to provide an additional sensor system on the installation that is suitable for detecting the actual position and/or actual orientation.
  • it is also advantageous if the at least one segment is matched in its size to the surface area defined by the contour of a peripheral edge of the loading aid, a surface area of the segment being preset at least 5% larger than the surface area of the loading aid (see the sketch below).
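  • By way of illustration only, a segment preset in this way could be derived from the detected contour of the loading aid as sketched below; the axis-aligned rectangle and the uniform scaling are simplifying assumptions.

```python
import math

# Illustrative presetting of a segment from the peripheral edge of a loading aid:
# the segment keeps the centre of the loading aid but its surface area is made
# at least 5 % larger (here: exactly 5 % larger) than the loading aid's footprint.

def preset_segment(contour_xy, area_margin=0.05):
    xs = [p[0] for p in contour_xy]
    ys = [p[1] for p in contour_xy]
    cx, cy = (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
    width, height = max(xs) - min(xs), max(ys) - min(ys)
    scale = math.sqrt(1.0 + area_margin)      # scaling both sides by sqrt(1.05) enlarges the area by 5 %
    w, h = width * scale, height * scale
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)


if __name__ == "__main__":
    box_edge = [(0.0, 0.0), (0.6, 0.0), (0.6, 0.4), (0.0, 0.4)]   # 600 x 400 mm box, in metres
    seg = preset_segment(box_edge)
    seg_area = (seg[2] - seg[0]) * (seg[3] - seg[1])
    print(seg, round(seg_area / (0.6 * 0.4), 3))                  # -> ... 1.05
```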
  • a contour of a peripheral edge of the loading aid is again understood to mean that the contour is defined by the outer walls of the loading aid (in particular a loading aid without compartment division) or by the partition walls and outer walls of the loading aid (a loading aid with compartment division or individual compartments).
  • a "static adaptability" of the segment is possible, for example, individually to different types of loading aids, wherein the detection range of the camera is preset to different types of loading aids.
  • a combination of "static adaptability” and “dynamic adaptability” of the segment would also be conceivable. It is also favorable if, when the comparison is positive in step c), a loading aid is transported away and an assignment of the at least one segment is resolved. In this way, system resources can be used economically and accumulation of segments can be avoided.
  • the removed load support may in turn contain goods or be empty.
  • a movement sequence contains only the starting point and end point of a movement. This makes it possible to keep the calculation effort for the said method low. For the successful execution of a movement sequence, it is sufficient that the starting point and then the end point is reached by the worker (or his hand) and / or a product.
  • the camera is positioned frontally to the worker and goods are rearranged between successive loading equipment.
  • ultrasonic sensors or laser scanners can be used.
  • so-called TOF (time-of-flight) cameras can also be used.
  • in this case the observed area is illuminated by means of a light pulse, and the camera measures, for each pixel, the time the light needs to travel to the object and back.
  • the time required is directly proportional to the distance.
  • the functions of the (conventional 2D) camera and of the spatial depth sensor are thus united in one device, namely the TOF camera.
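  • For illustration only, the per-pixel distance follows directly from the measured round-trip time; the pulse timing values below are invented example numbers.

```python
# Time-of-flight principle per pixel: the measured round-trip time of the light
# pulse is directly proportional to the distance, d = c * t / 2.

C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_time_s):
    return C * round_trip_time_s / 2.0


if __name__ == "__main__":
    # invented example values: a round trip of ~6.67 ns corresponds to roughly 1 m
    for t_ns in (3.3, 6.67, 13.3):
        print(f"{t_ns:5.2f} ns -> {tof_distance_m(t_ns * 1e-9):.2f} m")
```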
  • a spatial image of a workplace / workspace can of course also be obtained with the help of several 2D cameras. In general, this requires at least three 2D cameras.
  • a "segment” in the context of the invention is not necessarily two-dimensional, but may also be three-dimensional.
  • This three-dimensional segment can be understood as an arbitrarily shaped part of the three-dimensional "detection space" observed by the assistance system. While a two-dimensional segment is bounded by an envelope curve, a three-dimensional segment is surrounded by an enveloping surface. It is noted that these terms are used synonymously in the context of the application. This means that a teaching that refers to an envelope curve can be applied mutatis mutandis to enveloping surfaces and vice versa. Likewise, a teaching disclosed for 2D segments can be applied analogously to 3D segments and vice versa.
  • a segment may be classified as "occupied" if the envelope curve or enveloping surface of a segment is breached by a worker, in particular his hand or a tool held by him.
  • the occupancy of a three-dimensional segment can be determined both by calculations in three-dimensional space and by calculations in several two-dimensional views. A three-dimensional segment would then also be classified as "occupied" when the envelope curves of several of its projections in different planes are breached.
  • the projection planes or the axes of the observing cameras are orthogonal to one another. With lower requirements for detection reliability, observation from two different directions may be sufficient.
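  • A minimal sketch of the projection-based occupancy test described above is given below, assuming axis-aligned box segments and two orthogonal viewing directions; both assumptions are illustrative only.

```python
# Illustrative occupancy test for a three-dimensional segment via 2D projections:
# the segment counts as occupied when the hand point breaches the envelope of its
# projection in several (here: two orthogonal) planes.

def inside_interval(v, lo, hi):
    return lo <= v <= hi

def occupied_3d(segment_box, point, planes=("xy", "xz")):
    """segment_box: ((xmin, ymin, zmin), (xmax, ymax, zmax)); point: (x, y, z)."""
    (x0, y0, z0), (x1, y1, z1) = segment_box
    x, y, z = point
    checks = {
        "xy": inside_interval(x, x0, x1) and inside_interval(y, y0, y1),
        "xz": inside_interval(x, x0, x1) and inside_interval(z, z0, z1),
        "yz": inside_interval(y, y0, y1) and inside_interval(z, z0, z1),
    }
    return all(checks[p] for p in planes)


if __name__ == "__main__":
    segment = ((0.0, 0.0, 0.0), (0.4, 0.3, 0.2))        # box above a loading aid, in metres
    print(occupied_3d(segment, (0.2, 0.1, 0.1)))        # hand inside both projections -> True
    print(occupied_3d(segment, (0.2, 0.1, 0.5)))        # hand above the box: xy breached, xz not -> False
```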
  • the calculation effort for the said method can be further reduced by concentrating the segmentation on an area in which the predetermined motion sequences are executed. It is conceivable, in particular, that the said part of the detection area is adapted to the respective movement sequence to be executed, that is to say that a different part of the detection area is segmented per movement sequence.
  • the height of the worker is detected and a movement sequence is adjusted according to the detected height.
  • This makes it possible to design the movement with respect to ergonomic aspects and thus to increase the performance of the worker and / or to avoid long-term health damage due to poor ergonomics.
  • an arm length of the worker can also be taken into account. For example, if it is the job of the worker to re-sort goods between different loading aids, then a worker with a short arm length can be instructed to place the loading aids closer together than a worker with a long arm length.
  • movements of the worker can be avoided whose harmfulness the worker himself is not aware of, for example because the individual movement causes no pain and damage occurs only after a long time and frequent execution of the harmful movement.
  • An adaptation of a movement sequence corresponding to the detected height can also be understood to mean a distribution of the work within a group of workers. For example, activities requiring large arm length and / or high body height, such as operating a relatively tall rack, are given to workers who meet these criteria. Accordingly, small workers are entrusted with activities that are suitable for them. It is advantageous if a group of workers has different characteristics (diversification).
  • the body size of the worker is detected, and an area from which goods are removed and/or an area in which goods are deposited is/are adjusted in accordance with the detected body size.
  • said adjustment is to be understood in particular as an adaptation of the height of said areas.
  • for example, a work surface (a work table) or a conveyor for the delivery and/or removal of goods can be adjusted in height.
  • for this purpose, linear motors (such as spindle drives), but also other drives, can be used.
  • the height of a stack of goods and / or a loading aid can also be taken into account.
  • the area from which stacked goods are removed may be successively shifted upwards, so that the goods can always be taken out at the same height.
  • conversely, the area in which goods are deposited can be successively moved down so that the goods can always be deposited at the same height. If goods are to be removed from or deposited in a tall loading aid, it can be moved further downwards than a shallow loading aid, so that the upper edge of the loading aid is always at the optimum height.
  • the optimum working height can also be related to the bottom of the loading aid or to the upper edge of a stack therein.
  • depending on the position of the camera, the height of the stack located in a loading aid may not be visible to the camera. In this case it can be provided that the height of the stack is calculated on the basis of the heights of the manipulated objects (see the sketch below).
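  • Where the camera cannot see into the loading aid, the working height could be tracked roughly as sketched below; the item heights and container-bottom height are invented example values.

```python
# Illustrative bookkeeping of the stack height in a loading aid that the camera
# cannot look into: the height is accumulated from the heights of the handled items.

class StackHeightTracker:
    def __init__(self, container_bottom_z_m):
        self.bottom = container_bottom_z_m   # height of the loading-aid floor above ground
        self.stack = 0.0

    def deposit(self, item_height_m):
        self.stack += item_height_m          # each deposited item raises the stack

    def remove(self, item_height_m):
        self.stack = max(0.0, self.stack - item_height_m)

    def top_z(self):
        return self.bottom + self.stack      # reference height for ergonomic adjustment


if __name__ == "__main__":
    tracker = StackHeightTracker(container_bottom_z_m=0.20)
    for h in (0.05, 0.05, 0.08):             # invented item heights in metres
        tracker.deposit(h)
    print(f"current stack top at {tracker.top_z():.2f} m")   # -> 0.38 m
```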
  • the adaptation of the removal area and/or the depositing area to the worker is by no means limited to his body height.
  • an arm length of the worker can also be used to move the removal area and / or the deposition area laterally toward the worker or away from the same.
  • the considerations used for height adjustment are to be applied mutatis mutandis.
  • the removal area and/or the depositing area can also be successively moved sideways when goods are lined up laterally, i.e. when, as it were, a "horizontal stack" is formed.
  • the at least one camera is positioned in front of and / or above a workstation / workspace intended for the worker. In this way, the movements of the worker and the manipulation of the goods can be tracked very well.
  • the camera is movable.
  • a camera can be used more flexibly since, as needed, it can observe a workstation / work area from different angles or can also be used to monitor several workstations. It is also conceivable, for example, for several cameras to be arranged at a workstation / workspace or to be pulled together there when a particularly difficult process is to be observed.
  • the cameras can be moved by any means. For example, they can be mounted on articulated and motorized arms ("robot arms”) or moved along a rail system.
  • Fig. 1 shows a first schematically illustrated example of a system for the manipulation of goods;
  • Fig. 2 a second schematically illustrated example of a system for manipulating goods in an oblique view;
  • Fig. 3 an example of an image captured by a camera with a drawn-in movement sequence;
  • Fig. 4 an example of an image captured by a camera which is segmented;
  • Fig. 5 an example of an image captured by a camera with the start and end points of a movement sequence drawn in;
  • Fig. 6 an example of an image captured by a camera which is segmented only in regions;
  • Fig. 7 like Fig. 2, only with segments/envelopes drawn in;
  • Fig. 8 the scene shown in Fig. 7 from above, in the visual range;
  • Fig. 9 the scene of Fig. 8 as an image of the spatial depth;
  • Fig. 10 like Fig. 9, only with a hand of the worker lowered into the loading aid;
  • Figs. 11-13 an exemplary flowchart for assisting a worker in the installation;
  • Fig. 14 is an exemplary flowchart for monitoring "forbidden" segments.
  • Fig. 1 shows a worker 1 in a plant 2 for manipulating goods 3.
  • the installation 2 comprises a table 4, troughs in which loading aids 5 (e.g. boxes or containers) are set, display devices 6 (e.g. lamps or LEDs), input keys 7, a screen 8 and a camera 9.
  • the camera 9 serves for detecting movements of the worker 1 and optionally of movements of the goods 3, which is manipulated by the worker 1.
  • in addition, a conveyor 10 extending transversely to the table 4 is provided.
  • the installation 2 also comprises means for comparing whether the detected movements of the worker 1 and, where appropriate, of the goods 3 lie within a target range of a movement sequence, and for outputting a correction signal if this is not the case, wherein a movement sequence comprises picking up and depositing at least one goods item 3.
  • the said means can be formed by a computer integrated in the screen 8, in which appropriate software runs.
  • as a rule, a movement sequence comprises picking up and depositing at least one goods item 3.
  • the task of the worker 1 may be to pick goods 3 for an outgoing order.
  • goods 3 are to be taken from the left and middle boxes according to the specification of a picking system and placed in the right box 5.
  • for example, the following can be output on the screen 8: "Please place two items from the left box in the right box, then place three items from the middle box in the right box."
  • the output can also be acoustical.
  • the display devices 6 can also be used.
  • the instruction may read "Please put two items from the box marked in green into the box marked in red”. Synchronously with the output, the corresponding display devices 6 are activated. Before step a), in this variant, therefore, the starting point and the end point of a movement sequence are displayed.
  • it would also be conceivable, for example, for the green display device 6 to flash twice to signal that two objects are to be removed. A separate text instruction can then be omitted.
  • the starting point and the end point of the movement sequence can also be marked with different shapes, for example with a circle and a cross, or else with the number / type of goods 3 to be picked / put.
  • laser projectors and data glasses are also available for optical marking of start and end points.
  • with the presented method, on the one hand an explicit confirmation of each completed work step can be omitted; on the other hand, errors of the worker 1 are recognized.
  • the system can thus avert a looming error at its onset. If, for example, with the first instruction "Please put two items from the box marked in green into the box marked in red", it is recognized that the worker 1 is removing an item from the box 5 marked in red, then a warning can be issued to the worker 1 before he has actually sorted the goods 3 incorrectly. A warning may likewise be issued if he removes the goods 3 from the correct box 5 but then moves them in the wrong direction. If, on the other hand, he executes the operation correctly, the system jumps to the next instruction.
  • the active confirmation of a completed work by the worker 1 is not excluded in principle.
  • placing both hands at the left and right ends of the table edge may mean "operation completed - expect next operation.”
  • the system may also serve to sort goods 3 into the boxes 5. If, for example, mixed goods 3 in the right container 5 are to be sorted into the middle box 5 and the left box 5 (for example because several storage containers were knocked to the ground by clumsiness and, to keep the aisles in the warehouse clear, their contents were hastily filled into a single container), the system can render valuable services, since it recognizes which goods 3 from which source box 5 are placed in which destination box 5 and checks whether this is also correct.
  • the system can also provide more complex instructions. For example, a worker 1 may be instructed to rotate a detected item to a particular location before it is discarded.
  • a product 3 is automatically delivered before step a) and/or transported away after step c) if said comparison of whether the detected movements of the worker 1 and, where appropriate, of the product 3 lie within a target range of a first movement sequence turns out to be positive.
  • goods 3 to be manipulated can be delivered and / or removed via the conveyor 10.
  • for example, goods 3 may be picked from the boxes 5 into shipping cartons (not shown) standing on the conveyor 10.
  • the system automatically recognizes this; the full shipping carton is sent to the shipping department and a new empty shipping carton is brought in for the next order.
  • the reverse case is also conceivable, namely that goods 3 are transported via the conveyor 10 from a warehouse and are picked into the waiting boxes 5. Once the corresponding number of goods 3 has been removed from the storage container (not shown), it is transported via the conveyor 10 back to the warehouse.
  • provision may also be made for intentionally controlling a delivery of goods 3 and their removal, for example by pressing a button 7 or by making a gesture.
  • pivoting the arm to the right may mean that the conveyor 10 is transporting the next container.
  • the output of a correction signal or the delivery and removal of goods 3 is suppressed if it is determined that the worker 1 has moved away from his workstation / workspace.
  • the workflow is stopped when it is detected that the worker 1 has moved away from his workplace / work area, for example, to go to the bathroom.
  • the worker 1 does not need to make an explicit statement in order to stop or restart the workflow.
  • Stopping or starting up occurs automatically as a consequence of worker 1 being or not present at the workplace / work area.
  • it can also be deviatingly provided to intentionally control a delivery of goods 3 and their removal, for example, again by pressing a button 7 or by executing a gesture.
  • the camera 9, or image-processing software for the image captured by the camera 9, can capture the structure of the worker 1, that is, for example, the torso, limbs, head and in particular the hands. A kind of skeleton of the worker 1 is thereby formed, whereby his movements can be detected particularly accurately and in a differentiated manner. This makes it possible to detect whether a product 3 has actually been picked up. If, for example, a product 3 is only covered by the head or torso, it can be assumed that it has not been picked up. If, on the other hand, it is covered by one or both hands of the worker 1, then the probability is very high that the goods 3 have actually been picked up.
  • the "skeleton" of the worker 1 can also be extended by tools held by him (e.g. pliers, grippers, etc.), and a product 3 can then be classified as picked up when the goods 3 are covered by the tool. In that case, too, the likelihood is very high that the goods 3 have actually been picked up.
  • FIG. 2 now shows a system 2 for manipulating goods 3 which is very similar to the arrangement shown in FIG. 1. In contrast, however, the camera 9 is not mounted on a pillar but on a ceiling.
  • it can also be provided that it is movably mounted, for example, in a rail and driven by a motor.
  • in this way, cameras 9 can be used more flexibly. Since they move along defined paths, locating them is much easier than, for example, locating helmet cameras carried by workers 1.
  • system 2 shown in FIG. 2 also comprises a room depth sensor 11.
  • Fig. 3 shows an exemplary image captured by a camera 9 (note: the image does not show the arrangements of Figures 1 and 2 but another exemplary arrangement; in particular, the boxes 5 are not set in special troughs but are simply placed on the table 4).
  • an expected motion sequence 12 is shown.
  • the worker 1 is to put the goods / object 3 from the upper left box 5 into the lower left box 5.
  • by means of corresponding image processing it is now checked whether the hand of the worker 1 and the object 3 are moved along the path shown. If so, the requested action is considered to have been performed successfully. If not, an error signal is output and an attempt is made to bring about the requested action.
  • the movements of the goods 3 manipulated by the worker 1 are thus detected before step b) with the at least one camera 9 and the room depth sensor 11, and in step b) it is compared whether the detected movements of the worker 1 and of the goods 3 lie within a target range of a first movement sequence 12.
  • the monitoring of movement is particularly safe. In this variant, it is not sufficient, for example, merely to move the hand in a predetermined manner. Instead, the goods 3 must be moved in a predetermined manner.
  • a product 3 is classified as picked up by the worker 1 if, between picking up and depositing, in particular during the entire movement sequence 12, it is at least partially covered by a hand of the worker 1 or by a tool held by him (e.g. pliers, gripper, etc.).
  • a product 3 is, unlike in FIG. 3, at least partially hidden when it is held by the worker 1. If it is not completely recognizable between picking up (starting point of the movement sequence 12) and depositing (end point of the movement sequence 12), or during the entire movement sequence 12, then it can be assumed with high probability that it is being moved by the worker 1 in the predetermined manner and is not, for example, merely lying on the work table 4 between the starting point and the end point.
  • for this purpose, data from a database in which 3D models of the goods 3 are stored can be used. If a view of the relevant 3D model does not completely correspond to the view of the goods 3 detected by the camera 9 and/or the depth sensor 11, and in particular the hidden portion corresponds (at least in part) to a hand of the worker 1 or a tool held by him, the goods 3 can be classified as picked up.
  • the worker 1 has to turn the product 3 into a defined position relative to the camera 9 after picking up, so that it can be recognized, for example, on the basis of its contour, coloring, etc.
  • Fig. 4 shows the same scene as Fig. 3.
  • for this purpose, the image captured by the camera 9 is subdivided by a grid 13 into a plurality of segments 100..146 arranged like a matrix.
  • the detection range of the at least one camera 9 or the space depth sensor 11 is thus subdivided into segments 100..146, and a motion sequence 12 contains an occupancy of the segments 100..146 by the worker 1 and / or the goods 3.
  • the worker 1 is to put the object 3 from the upper left box 5 into the upper right box 5.
  • the calculation effort is kept low in this variant of the method, since it only has to be checked whether the hand of the worker 1 and the goods 3 occupy the segments 132, 133 and 134 in the stated sequence.
  • if the boxes 5 can be set up freely on a work table 4 (that is, if, for example, predetermined troughs are missing), it may be advantageous to divide the table 4 into a predetermined grid in which the boxes 5 are to be placed.
  • lines can be painted on the table 4 or projected onto it. In this way it is avoided that boxes 5 are placed in an unfavorable position, for example at the crossing points of several grid lines.
  • the rear boxes 5 from the perspective of the worker 1 are relatively unfavorable because they occupy comparatively many segments.
  • a movement sequence 12 contains only the starting point 132 and end point 134 of a movement, as shown in FIG. 5. In this case it is only checked whether the hand of the worker 1 and the goods 3 occupy the segments 132 and 134 in the order named. This makes it possible to keep the calculation effort for the said method even lower. For the successful execution of a movement sequence it is sufficient that the starting point and then the end point are reached by the worker 1 (or his hand) and/or a product 3 (a reduced check of this kind is sketched below).
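  • A reduced check of this kind could be expressed as follows; the frame-by-frame segment stream is an assumed interface of the vision system, not part of the description.

```python
# Illustrative reduced check: a movement sequence containing only a start and an
# end segment is fulfilled as soon as the start segment and afterwards the end
# segment are occupied, regardless of the path in between.

def start_end_sequence_fulfilled(occupied_segments_stream, start_seg, end_seg):
    start_seen = False
    for seg in occupied_segments_stream:     # e.g. one entry per camera frame
        if seg == start_seg:
            start_seen = True
        elif seg == end_seg and start_seen:
            return True                      # end point reached after the start point
    return False


if __name__ == "__main__":
    frames = [None, 132, 132, 118, None, 134]                        # segment ids per frame (invented)
    print(start_end_sequence_fulfilled(frames, 132, 134))            # -> True
    print(start_end_sequence_fulfilled([134, 118, 132], 132, 134))   # end before start -> False
```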
  • FIG. 6 shows a further variant of the method, in which only part of the detection range of the at least one camera 9 or of the room depth sensor 11 is subdivided into the mentioned segments 100..103.
  • the calculation effort for the said method can be reduced even further.
  • the check can then be limited to whether the expected movement sequence 12 has been executed, namely by occupancy of the segments required for the current movement sequence 12, i.e. the "active" segments 100 and 103, or optionally also by non-occupancy of the segments not required for the current movement sequence 12, i.e. the "inactive" segments 101 and 102. It is of course also conceivable that several segments are provided in the action area of the worker 1.
  • it is also conceivable, in particular, for the segmented part of the detection area to be adapted to the movement sequence 12 to be executed in each case, that is to say for a different part of the detection area to be segmented for each movement sequence 12.
  • an "envelope curve" is placed around a respective box 5.
  • a segment 100 or 103 can be classified as occupied if this envelope curve is breached by the worker 1, in particular his hand.
  • it can additionally be required that a segment 100 or 103 be occupied for a certain time (e.g. one second) before it is actually classified as occupied (see the sketch below).
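  • Such a dwell-time requirement could be implemented roughly as follows; the one-second value is the example from the description, while the rectangular envelope test and the frame rate are assumptions for this sketch.

```python
# Illustrative occupancy decision with a dwell-time requirement: a segment only
# counts as occupied once the hand has breached its envelope for a minimum time.

def point_in_rect(pt, rect):
    (x, y), (x0, y0, x1, y1) = pt, rect
    return x0 <= x <= x1 and y0 <= y <= y1

def segment_occupied(hand_positions, envelope_rect, fps=30, min_dwell_s=1.0):
    """hand_positions: hand coordinates per frame; envelope_rect: (x0, y0, x1, y1)."""
    needed = int(min_dwell_s * fps)
    run = 0
    for pt in hand_positions:
        run = run + 1 if point_in_rect(pt, envelope_rect) else 0
        if run >= needed:
            return True                      # envelope breached long enough -> occupied
    return False


if __name__ == "__main__":
    envelope = (0.0, 0.0, 0.3, 0.2)                          # envelope around a box, in metres
    brief = [(0.1, 0.1)] * 10 + [(0.5, 0.5)] * 50            # hand only passes through
    held  = [(0.1, 0.1)] * 40                                # hand stays > 1 s at 30 fps
    print(segment_occupied(brief, envelope), segment_occupied(held, envelope))  # False True
```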
  • a loading aid 5 (e.g. a box or container), for example a source box (storage container), is preferably transported in an automated manner to a provisioning station (for example a delivery point or buffer space), and at least one segment 100 or 103 is assigned to it, so that steps a) to c) can be carried out.
  • a loading aid 5 can also be assigned a plurality of segments 100 and 103, in particular if the loading aid 5 is subdivided. It is generally conceivable that the actual position and/or actual orientation of the loading aid 5 is detected and used for the assignment of the at least one segment 100 or 103, so that the at least one segment 100 or 103 and the loading aid 5 can be well aligned with each other. This can be done, for example, with the help of the camera 9. In this context it is also favourable, however, if the presence of the loading aid 5 at the provisioning location of the installation 2 is detected by means of a sensor other than the camera 9, for example a light barrier or a switch.
  • it is also conceivable that the actual position and/or actual orientation of the loading aid 5 is disregarded and that at least one segment 100 or 103 is assigned at a standard position when the presence of a loading aid 5 is detected.
  • the delivered loading aid 5 may generally be empty or already contain goods 3.
  • it is also favourable if, when the comparison in step c) is positive, a loading aid 5 is transported away and the assignment of the at least one segment 100 or 103 is released. In this way, system resources can be used economically and an accumulation of segments 100 and 103 can be avoided.
  • the transported-away loading aid 5 may in turn contain goods 3 or be empty.
  • FIG. 7 now shows the workplace already shown in FIG. 2, but now with segments or envelopes / enveloping surfaces 100... 102 above the boxes 5.
  • the segments / envelopes 100... 102 are not necessarily planar or two-dimensional but can also be spatial or three-dimensional. This is particularly advantageous when the workstation / workspace can be detected three-dimensionally with a room depth sensor 11 or an arrangement of several 2D cameras.
  • the teaching on two-dimensional segments / envelopes 100..146 disclosed above is also applicable without restriction to three-dimensional segments / envelopes 100... 102.
  • the segments / envelopes 100... 102 in FIG. 7 can of course also have a different shape, for example the shape of a hemisphere or even irregular shapes.
  • FIG. 8 shows the scenery of Fig. 7, now in fragmentary form from above. Specifically, the table 4, the loading aids 5, the goods 3, the display devices 6, the input keys 7 and a hand of the worker 1 are shown, the hand, in contrast to Fig. 7, now being located above the right loading aid 5.
  • the loading aids 5 are in turn assigned to the segments 100..102.
  • FIG. 8 shows the scene in the visual range, that is to say as it is perceived by the human eye.
  • FIG. 9 now shows the same scene as an image of the spatial depth, that is to say as perceived by a room-depth camera or a room depth sensor 11. Areas at different distances from the room-depth camera or the depth sensor 11 are hatched differently in FIG. 9.
  • different distances can of course also be assigned to different gray values or color values. For example, more distant objects are light gray, closer objects are shown in dark gray.
  • in FIG. 9 it can be seen from the hatching that the hand of the worker 1 is positioned above the loading aid 5, since the hand is hatched differently than the bottom of the loading aid 5 and also differently than the table 4. It can also be seen in FIG. 9 that the display devices 6 and the input keys 7 "disappear" because, in the concrete example shown, they are installed flush in the table 4.
  • FIG. 10 differs from FIG. 9 only in the position of the hand of the worker 1.
  • in this example, the spatial area relevant for the segment 100 corresponds to the area enclosed by a loading aid 5.
  • in FIG. 9, the hand of the worker 1 is positioned above this area; the segment 100 is therefore considered not occupied.
  • in FIG. 10, by contrast, the hand of the worker 1 is lowered into the loading aid 5.
  • segment 100 is therefore considered occupied. In this way it is avoided that a hand detected merely above a segment 100..102 is misinterpreted by the system as picking up or depositing a product 3. Only the occupation of a (spatial) segment 100..102 is regarded as picking up or depositing a product 3.
  • the space area in which the detection or deposition of a commodity 3 is detected corresponds to the area encompassed by a loading aid 5.
  • the said spatial area can also comprise only a partial area, in particular starting from the floor, and thus correspond to the area that a liquid filled into the loading aid 5 would occupy.
  • the space area can also be arranged at a distance from the floor, similar to the spatial area that an oil layer floating on water would occupy in the loading aid 5.
  • the spatial area can also protrude beyond the loading aid 5 (cf. FIG. 7) or virtually hover above the loading aid 5.
  • the camera 9 can therefore be designed as a room-depth camera and be provided in particular exclusively for detecting the spatial depth. A separate room depth sensor 11 is then no longer required.
  • the boundaries between the room-depth camera and the room depth sensor are fluid and, in principle, the variants explained in connection with FIGS. 1 to 7 and the resulting advantages are analogously applicable to an embodiment in which the camera 9 is provided exclusively for detecting the room depth.
  • a room-depth camera 9 can be positioned directly above the worker 1, so that it directly provides the images shown in FIG. 9 and FIG. 10. It is also conceivable, however, that it is positioned obliquely above the worker 1; it must then be taken into account that, for example, the loading aids 5 and also the segments 100..102 are not equidistant from the room-depth camera 9 because of their distance in the plane. The segments 100..102 are then considered occupied at different distances, which are also determined by the position of the respective segments 100..102 in the plane.
  • further parameters can be used to decide whether a segment 100... 102 is now occupied or not.
  • for example, the residence time of the hand of the worker 1 in a segment 100..102 can be used for the decision.
  • if this residence time is too short, the segment 100..102 can be considered not occupied.
  • Another parameter for said decision may be the speed of the hand of the worker 1. If a segment 100..102 is passed through too quickly, this can also be an indication that a product 3 was not properly picked up or deposited.
  • if a segment 100..102 is crossed at a constant speed, or no or only slight accelerations are measured, this can likewise be an indication that a product 3 has not been properly picked up or deposited.
  • as a rule, a hand is slowed down when picking up or depositing a product 3.
  • a detection of such a deceleration can in turn be a positive indication of the picking up or depositing of a product 3.
  • in general, a reversal of the movement in the vertical direction should also be detectable. If this is not the case, there may be an error in the evaluation (see the sketch below).
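  • The kinematic indications mentioned above (deceleration inside a segment and a reversal of the vertical movement) could be evaluated roughly as sketched here; the thresholds and the sample trajectories are illustrative assumptions.

```python
# Illustrative kinematic plausibility check for a pick/put: inside the segment the
# hand should slow down noticeably and the vertical movement should reverse.

def pick_or_put_plausible(z_positions_m, fps=30, decel_factor=0.5):
    """z_positions_m: vertical hand positions (in metres) while the hand is in the segment."""
    if len(z_positions_m) < 4:
        return False
    v = [(b - a) * fps for a, b in zip(z_positions_m, z_positions_m[1:])]    # vertical velocity per frame
    slowed = min(abs(s) for s in v) < decel_factor * max(abs(s) for s in v)  # clear deceleration inside the segment
    lowest = z_positions_m.index(min(z_positions_m))
    reversed_dir = 0 < lowest < len(z_positions_m) - 1                       # hand goes down, then up again
    return slowed and reversed_dir


if __name__ == "__main__":
    dip = [0.30, 0.22, 0.16, 0.14, 0.14, 0.16, 0.22, 0.30]   # hand dips into the box and comes back up
    pass_through = [0.30, 0.26, 0.22, 0.18, 0.14, 0.10]      # constant downward sweep, no reversal
    print(pick_or_put_plausible(dip), pick_or_put_plausible(pass_through))   # True False
```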
  • in general, an object can be deemed recognized when a region of a certain size (for example a certain number of pixels) is detected in the image. That is, in a simple embodiment it is only checked whether an area of a certain (minimum) size is detected. Additionally or alternatively, the shape of the recognized area can also be evaluated. For the correct execution of a work step it may also be required that the starting and ending segments, in which goods 3 are picked up or deposited, are occupied, but that certain intervening segments are not occupied, merely swept over. If, for example, a product 3 is to be relocated from the right loading aid 5 into the left loading aid 5, it can be provided that first the segment 100 must be occupied (picking up the goods 3), then the segment 101 must be swept over, i.e. the hand must merely be passed over it, and finally the segment 102 must be occupied (depositing the goods 3).
  • it can also be provided that "forbidden" segments 100..102 are defined. If it is required, for example, to relocate a product 3 from the middle loading aid 5 into the left loading aid 5, it can be provided that the segment 100 may not even be swept over, but that exclusively the segment 101 (picking up the goods 3) and the segment 102 (depositing the goods 3) must be occupied.
  • the body size of the worker 1 is detected and a movement sequence 12 adapted according to the detected body size.
  • an arm length of the worker 1 can be taken into account. If, as shown in FIG. 6, the task of the worker 1 is to reorder goods 3 between different loading aids 5, then a worker 1 with a short arm length can be instructed to place the loading aids 5 closer than a worker 1 with a long arm length.
  • This specification can be done, for example, via the screen 8 or implicitly also in that a grid projected onto the table 4 is narrower or wider. In this way, an area from which goods 3 are removed and / or an area where goods 3 are stored are adjusted according to the detected body size of the worker 1.
  • the said adaptation also means an adjustment of the height of said areas.
  • a worktable 4 or the conveying means 10 can be designed to be height-adjustable.
  • for this purpose, linear motors (such as spindle drives), but also other drives, can be used. If the camera 9 detects the presence of a tall worker 1, the work table 4 and/or the conveyor 10 are moved up; if a short worker 1 is detected, they are moved down accordingly.
  • the height of a stack of goods and / or a loading aid 5 can also be taken into account.
  • for example, the bottom surfaces of the troughs shown in Figs. 1 and 2 may be height-adjustable. If goods 3 are to be removed from or deposited in a tall loading aid 5, it can be moved further downwards than a shallow loading aid 5, so that the upper edge or the bottom of the loading aid 5 is always at the optimum height.
  • the area from which stacked goods 3 are removed may be successively shifted upward, so that the goods 3 are always taken out at the same height.
  • the area in which goods 3 are stored may be successively moved down, so that the goods 3 are always stored at the same height.
  • This method is also applicable when goods 3 are to be stacked in a load carrier 5. If the height of the stack located in a loading aid 5 can not be seen by the camera 9, the height of the stack can also be calculated on the basis of the height of the manipulated objects.
  • the adaptation of the removal area and/or the depositing area to the worker 1 is by no means limited to his body height.
  • an arm length of the worker 1 can be used to move the removal area and / or the deposit area accordingly laterally to the worker 1 or away from it.
  • the considerations used for height adjustment are to be applied mutatis mutandis.
  • the removal area and / or the deposit area can be successively moved laterally when goods 3 are lined up laterally, so as it were a "horizontal stack" is formed.
  • An adaptation of a movement sequence 12 corresponding to the detected body size can also be understood as a distribution of the work within a group of workers 1. For example, activities requiring large arm length and / or high body height, such as operating a relatively tall rack, are awarded to workers 1 who meet these criteria. Accordingly, small workers 1 are entrusted with activities that are suitable for them. It is advantageous if a group of workers 1 has different characteristics (diversification).
  • the method presented is, of course, not limited to use on a worktable 4 but can also be applied to other devices in a warehouse, such as working on a shelf.
  • such shelves can serve not only for storing goods 3, but also for picking.
  • FIGS. 1 and 2 it is of course also possible to use a plurality of cameras 9 and / or room depth sensors 11 whose detection ranges can also overlap.
  • a camera 9 and / or a room depth sensor 11 can also be designed to be movable.
  • the system can be used very flexibly under changing conditions and recognizes the monitored workflow with high reliability. This significantly improves the monitoring of a workflow.
  • the movements of a worker 1 can be monitored independently of the viewing direction.
  • the worker 1 is more efficient because he is not affected by a helmet camera.
  • the system can not fall "out of step” since the position of the observing camera 9 is always known.
  • Restrictions or preferences of a worker 1 given by his body structure may be taken into account.
  • FIGS. 11 to 13 now show an exemplary procedure for assisting a worker 1, in which, after an order has been issued, the presence of a loading aid 5 (LHM), in particular a source box and/or a destination box, is checked. If one is present, its position and/or orientation is detected and at least one segment 100..146 is assigned to the loading aid 5 accordingly. In the present example it is assumed that this is an enveloping surface; of course, it can also be an envelope volume. As mentioned above, a "static adaptability" or "dynamic adaptability" of the segment 100..146 is possible.
  • an object search is then started, in which it is examined whether a hand of the worker 1, a tool held by him, or the like is within a predetermined range. First, it is checked whether a detected object reaches or exceeds a certain size. If not, the object search is continued. If so, it is checked whether the object is located above a segment 100..146 that is assigned to a loading aid 5. If not, the object search is continued. If so, it is checked whether the relevant segment 100..146 is part of the given order, that is, whether it is a segment 100..146 expected in the specified movement sequence. If not, a warning is issued and the object search continues.
  • if it is part of the order, however, it is checked whether the object penetrates said enveloping surface for a predetermined time or is located in said envelope volume. If so, a goods item 3 is classified as picked up or deposited (i.e. a "pick" or a "put" is detected). If not, the object search continues. The sequence ends when the order is completed with the pick or the put; otherwise the object search continues.
  • in the monitoring of "forbidden" segments, the subsequent object search again checks whether a detected object has a certain minimum size. If not, the object search continues. If so, it is checked whether the object is in a forbidden segment 100..146. If it is, a warning is issued and the object search continues. If not, it is checked whether the order is completed. If not, the object search continues; otherwise the process ends (a rough code paraphrase of this object search follows below).
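  • The object-search loop of FIGS. 11 to 14 could be paraphrased in code roughly as follows; the data structures, the minimum size and the warning mechanism are assumptions made only for this sketch.

```python
# Rough paraphrase of the object search of FIGS. 11-14: detected objects are
# filtered by size, matched against the segments assigned to loading aids, and
# picks/puts or warnings are derived from the segment they occupy.

MIN_OBJECT_SIZE_PX = 500  # assumed minimum blob size in pixels

def object_search_step(detected_object, order, assigned_segments, forbidden_segments,
                       dwell_ok, warn):
    """detected_object: dict with 'size_px' and 'segment' (or None); returns 'pick/put',
    'continue' or 'warning' for one iteration of the search loop."""
    if detected_object is None or detected_object["size_px"] < MIN_OBJECT_SIZE_PX:
        return "continue"                              # too small -> keep searching
    seg = detected_object["segment"]
    if seg in forbidden_segments:
        warn(f"hand in forbidden segment {seg}")
        return "warning"
    if seg not in assigned_segments:
        return "continue"                              # not above any loading aid
    if seg not in order["expected_segments"]:
        warn(f"segment {seg} not part of the current order")
        return "warning"
    if dwell_ok(seg):                                  # envelope penetrated long enough
        return "pick/put"
    return "continue"


if __name__ == "__main__":
    order = {"expected_segments": [100, 102]}
    result = object_search_step({"size_px": 900, "segment": 101},
                                order, assigned_segments=[100, 101, 102],
                                forbidden_segments=[], dwell_ok=lambda s: True,
                                warn=print)
    print(result)   # -> "warning"
```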
  • the illustrated devices may also comprise more or fewer components than shown and are sometimes shown in greatly simplified form in the figures.
  • the illustrated devices as well as their components have also been shown partly out of scale and / or enlarged and / or reduced in size for a better understanding of their design.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Warehouses Or Storage Devices (AREA)

Abstract

The invention relates to a method and an assistance system for assisting a worker (1) in an installation (2) for manipulating goods (3). The movements of the worker (1) and, where applicable, of the goods (3) manipulated by the worker (1) are detected with the aid of at least one camera (9) mounted on the installation side and compared in order to determine whether the detected movements of the worker (1) and, where applicable, of the goods (3) lie within a target range of a movement sequence. If so, the execution of a further movement sequence is then checked. If not, a correction signal is output in order to bring about the execution of the desired movement sequence. For this purpose, the detection range of the camera(s) (9) is divided into segments, a movement sequence containing an occupancy of the segments by the worker (1). As a rule, a movement sequence comprises the picking up and depositing of at least one goods item (3).
PCT/AT2013/050166 2012-08-24 2013-08-23 Procédé et système d'assistance à un ouvrier dans une installation de manipulation de marchandises WO2014028959A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP13780049.6A EP2888185A1 (fr) 2012-08-24 2013-08-23 Procédé et système d'assistance à un ouvrier dans une installation de manipulation de marchandises

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ATA921/2012A AT513130B1 (de) 2012-08-24 2012-08-24 Verfahren und System zur Unterstützung eines Arbeiters in einer Anlage zur Manipulation von Waren
ATA921/2012 2012-08-24

Publications (1)

Publication Number Publication Date
WO2014028959A1 true WO2014028959A1 (fr) 2014-02-27

Family

ID=49474163

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AT2013/050166 WO2014028959A1 (fr) 2012-08-24 2013-08-23 Procédé et système d'assistance à un ouvrier dans une installation de manipulation de marchandises

Country Status (3)

Country Link
EP (1) EP2888185A1 (fr)
AT (1) AT513130B1 (fr)
WO (1) WO2014028959A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017126457A1 (de) * 2017-11-10 2019-05-16 ARS Computer und Consulting GmbH Vorrichtung zur Unterstützung eines Benutzers an einem Arbeitsplatz, Verfahren und Computerprogrammprodukt
DE102018100187A1 (de) * 2018-01-05 2019-07-11 ADT GmbH Verfahren zur Überwachung von Lagerarbeiten an Regalen auf Arbeitsflächen zur Erkennung von Fehllagerungsprozessen
EP3696772A3 (fr) * 2019-02-14 2020-09-09 Denso Wave Incorporated Dispositif et méthode d'analyse de l'état du travail manuel par le travailleur et programme d'analyse du travail
DE102019131235B4 (de) * 2019-11-19 2022-09-08 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Verfahren zum Verknüpfen von Information mit einem Werkstückdatensatz und Flachbettwerkzeugmaschine

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202011107932U1 (de) * 2011-11-16 2012-01-18 Knapp Ag Vorrichtung für eine halbautomatische Prüfstation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003034397A1 (fr) 2001-10-19 2003-04-24 Accenture Global Services Gmbh Realite amplifiee industrielle
EP1487616B1 (fr) 2002-03-20 2010-06-30 Volkswagen Aktiengesellschaft Commande de processus automatique
US20060104479A1 (en) * 2004-11-12 2006-05-18 Iss Technology Methods of unattended detection of operator's deliberate or unintentional breaches of the operating procedure and devices therefore.
DE102006057266A1 (de) * 2006-11-23 2008-05-29 SSI Schäfer Noell GmbH Lager- und Systemtechnik Sortier- und Verteilsystem
EP2161219B1 (fr) 2008-09-05 2011-11-16 KNAPP Systemintegration GmbH Procédé et dispositif de soutien visuel de procédés de commissionnement
DE202011004401U1 (de) * 2011-03-17 2012-06-18 SSI Schäfer Noell GmbH Lager- und Systemtechnik Steuerung und Überwachung einer Lager- und Kommissionieranlage durch Bewegung und Sprache

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2888185A1

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014113264B4 (de) * 2014-09-15 2020-10-15 IGZ Ingenieurgesellschaft für logistische Informationssysteme mbH Steuersystem mit Gestensteuerung für einen Kommissionier-Arbeitsplatz und ein Gestensteuerungs-Verfahren zum Kommissionieren
DE102014113264A1 (de) * 2014-09-15 2016-03-17 IGZ Ingenieurgesellschaft für logistische Informationssysteme mbH Steuersystem für einen Kommissionier-Arbeitsplatz und Verfahren zum Kommissionieren
DE102015200529A1 (de) * 2015-01-15 2016-08-04 Volkswagen Aktiengesellschaft Verfahren zur Herstellung eines Kraftfahrzeuges
US10110858B2 (en) * 2015-02-06 2018-10-23 Conduent Business Services, Llc Computer-vision based process recognition of activity workflow of human performer
US20160234464A1 (en) * 2015-02-06 2016-08-11 Xerox Corporation Computer-vision based process recognition
US10566771B2 (en) 2015-06-16 2020-02-18 Leibherr-Components Biberach GmbH Method for mounting electric switching systems and assembly support device for simplifying the assembly of such switching systems
DE102015216062A1 (de) * 2015-08-21 2017-02-23 Insystems Automation Gmbh Sensorvorrichtung zur Erfassung eines manuellen Eingriffs in einen Vorratsbehälter
JP2020506128A (ja) * 2016-10-21 2020-02-27 トルンプフ ヴェルクツォイクマシーネン ゲゼルシャフト ミット ベシュレンクテル ハフツング ウント コンパニー コマンディートゲゼルシャフトTrumpf Werkzeugmaschinen GmbH + Co. KG 加工片回収点ユニットおよび加工片の処理を支援するための方法
JP7003123B2 (ja) 2016-10-21 2022-01-20 トルンプフ ヴェルクツォイクマシーネン ゲゼルシャフト ミット ベシュレンクテル ハフツング ウント コンパニー コマンディートゲゼルシャフト 加工片回収点ユニットおよび加工片の処理を支援するための方法
WO2018073420A1 (fr) 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Procédé d'aide au tri, système de tri et machine-outil à banc plat
WO2018073418A1 (fr) 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Unité de points de collecte de pièces et procédé permettant de faciliter l'usinage de pièces
CN109863002A (zh) * 2016-10-21 2019-06-07 通快机床两合公司 工件收集点单元和用于辅助工件加工的方法
CN109891342A (zh) * 2016-10-21 2019-06-14 通快机床两合公司 在金属加工工业中基于室内人员定位的制造控制
EP3529675B1 (fr) * 2016-10-21 2022-12-14 Trumpf Werkzeugmaschinen GmbH + Co. KG Commande de fabrication, basée sur la localisation de personnes dans un espace intérieur, dans l'industrie de transformation des métaux
JP2020504696A (ja) * 2016-10-21 2020-02-13 トルンプフ ヴェルクツォイクマシーネン ゲゼルシャフト ミット ベシュレンクテル ハフツング ウント コンパニー コマンディートゲゼルシャフトTrumpf Werkzeugmaschinen GmbH + Co. KG 仕分け支援方法および平台機械工具
DE102016120132A1 (de) 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Werkstücksammelstelleneinheit und Verfahren zur Unterstützung der Bearbeitung von Werkstücken
WO2018073419A1 (fr) 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Procédé d'aide au triage et machine-outil à banc plat
DE102016120131B4 (de) 2016-10-21 2020-08-06 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Absortierunterstützungsverfahren und Flachbettwerkzeugmaschine
DE102017107357A1 (de) 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Absortierunterstützungsverfahren, Absortiersystem und Flachbettwerkzeugmaschine
US11520314B2 (en) 2016-10-21 2022-12-06 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Control of manufacturing processes in metal processing industry
US11361391B2 (en) 2016-10-21 2022-06-14 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Manufacturing control in the metal processing industry
US11358180B2 (en) 2016-10-21 2022-06-14 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Workpiece collecting point units and methods for supporting the processing of workpieces
US11009856B2 (en) 2016-10-21 2021-05-18 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Sorting support method and flatbed machine tool
US11059076B2 (en) 2016-10-21 2021-07-13 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Sorting support methods, sorting systems, and flatbed machine tools
DE102016120131A1 (de) 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Absortierunterstützungsverfahren und Flachbettwerkzeugmaschine
JP6993061B2 (ja) 2016-10-21 2022-01-13 トルンプフ ヴェルクツォイクマシーネン ゲゼルシャフト ミット ベシュレンクテル ハフツング ウント コンパニー コマンディートゲゼルシャフト 仕分け支援方法および平台機械工具
EP3929675A1 (fr) * 2018-01-29 2021-12-29 IGZ Ingenieurgesellschaft für logistische Informationssysteme mbH Système de surveillance et de commande pour un poste de travail de production et procédé de fabrication d'un produit ou d'un sous-produit
EP3518055A1 (fr) * 2018-01-29 2019-07-31 IGZ Ingenieurgesellschaft für logistische Informationssysteme mbH Système de surveillance et de fonctionnement d'un poste de travail de production et procédé de production d'un produit ou d'un produit partiel
EP3948725A4 (fr) * 2019-03-28 2022-06-29 Dematic Corp. Confirmation sans contact pour système et procédé de saisie et de mise en place
WO2021004744A1 (fr) * 2019-07-09 2021-01-14 Glatt Gmbh Système d'archivage et procédé pour archiver des données électroniques
ES2803083A1 (es) * 2019-07-22 2021-01-22 Nutrilife Int S L Dispositivo de asistencia al engomado de troqueles
US11488235B2 (en) 2019-10-07 2022-11-01 Oculogx Inc. Systems, methods, and devices for utilizing wearable technology to facilitate fulfilling customer orders
DE102019130154A1 (de) * 2019-11-08 2021-05-12 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Verfahren zum visuellen Unterstützen eines Handhabungsvorgangs und Flachbettwerkzeugmaschine
DE102019130154B4 (de) 2019-11-08 2023-12-21 TRUMPF Werkzeugmaschinen SE + Co. KG Verfahren zum visuellen Unterstützen eines Handhabungsvorgangs und Flachbettwerkzeugmaschine
WO2022218968A1 (fr) * 2021-04-14 2022-10-20 Wincor Nixdorf International Gmbh Terminal libre-service et procédé pour assurer une entrée sécurisée d'un numéro d'identification personnel au niveau d'un terminal libre-service

Also Published As

Publication number Publication date
AT513130A4 (de) 2014-02-15
AT513130B1 (de) 2014-02-15
EP2888185A1 (fr) 2015-07-01

Similar Documents

Publication Publication Date Title
AT513130B1 (de) Verfahren und System zur Unterstützung eines Arbeiters in einer Anlage zur Manipulation von Waren
EP2686254B1 (fr) Commande et surveillance d'une installation de stockage et de préparation des commandes par le mouvement et la voix
EP2161219B1 (fr) Procédé et dispositif de soutien visuel de procédés de commissionnement
DE102018109463C5 (de) Verfahren zur Benutzung einer mehrgliedrigen aktuierten Kinematik, vorzugsweise eines Roboters, besonders vorzugsweise eines Knickarmroboters, durch einen Benutzer mittels einer mobilen Anzeigevorrichtung
DE102013109220B4 (de) Robotervorrichtung und Verfahren zum Herausnehmen von Bulk-Ware aus einem Lager
DE102014102943B4 (de) Robotersystem mit Funktionalität zur Ortsbestimmung einer 3D- Kiste
DE102017128543B4 (de) Störbereich-einstellvorrichtung für einen mobilen roboter
DE102014016032B4 (de) Vorrichtung und Verfahren zum Aufnehmen von wahllos gehäuften Gegenständen mit einem Roboter
DE102014016072B4 (de) Vorrichtung und Verfahren zum Aufheben eines willkürlich aufgestapelten Gegenstands mittels eines Roboters
DE102018101375B4 (de) Artikelbeförderungsvorrichtung, die mindestens einen Sensor nutzt
DE10216023B4 (de) Verfahren und Vorrichtung zur kontrollierten Interaktion zwischen einer eigenbeweglichen Robotereinheit und einem Menschen
EP3049879A2 (fr) Véhicule de transport et procédé pour le transport sans perturbation de rayonnages de chargement dans des usines présentant un fonctionnement de roulage partiellement autonome
EP2561417B1 (fr) Procédé pour introduire une configuration spatiale de dispositifs de fabrication dans un programme de planification assisté par ordinateur et son optimisation
DE112017007392B4 (de) Steuervorrichtung, Greifsystem, Verteilersystem, Programm, Steuerverfahren und Herstellungsverfahren
DE102013220107A1 (de) Verfahren zur Anleitung und/oder Kontrolle von an einem Arbeitsplatz auszuführenden Montage- und Kommissionierungsprozessen
EP2607271B1 (fr) Procédé de préparation de commandes d'articles et dispositif de préparation de commandes conçu à cet effet
EP3759036A1 (fr) Dispositif de preparation de commandes comprenant une image se trouvant de manière virtuelle dans une zone de travail
EP3118706B1 (fr) Procede de commande d'un chariot de manutention pour la preparation de commande
DE202011004401U1 (de) Steuerung und Überwachung einer Lager- und Kommissionieranlage durch Bewegung und Sprache
DE102018101162B4 (de) Messsystem und Verfahren zur extrinsischen Kalibrierung
DE102014224843A1 (de) Aufbereitungssystem zum Reinigen und/oder Desinfizieren von medizinischen Geräten und Verfahren zum Betreiben desselben
DE102016125224A1 (de) Verfahren zur Navigation und Selbstlokalisierung eines sich autonom fortbewegenden Bearbeitungsgerätes
DE102014105506B4 (de) Robotsauger und Verfahren zum Betrieb eines solchen Robotsaugers
DE102013000964A1 (de) Verfahren zum Erfassen und Darstellen von dreidimensionalen Objekten und Vorrichtung zur Durchführung eines solchen Verfahrens
EP3378825A1 (fr) Chariot de manutention pourvu d'un objet volant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13780049

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2013780049

Country of ref document: EP