WO2019108117A1 - Systems, methods and computer programs for container space allocation monitoring - Google Patents


Info

Publication number
WO2019108117A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
container
camera
symbol
detected symbol
Prior art date
Application number
PCT/SE2018/051215
Other languages
French (fr)
Inventor
Cristian KLEIN
Barbara TEREBIENIEC
Original Assignee
Klein Cristian
Terebieniec Barbara
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Klein Cristian and Terebieniec Barbara
Publication of WO2019108117A1


Classifications

    • G06Q10/087 Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06Q10/08 Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06V20/00 Scenes; Scene-specific elements
    • G06V30/153 Segmentation of character regions using recognition of characters or words
    • G06V30/224 Character recognition of printed characters having additional code marks or containing code marks
    • G06T2207/30204 Marker (indexing scheme for image analysis)
    • G06T2207/30241 Trajectory (indexing scheme for image analysis)
    • G06T7/248 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V30/10 Character recognition

Definitions

  • the present disclosure relates to efficient identification of item location with respect to a container space.
  • the present disclosure relates to systems, methods and computer programs for container space allocation monitoring.
  • In many lab settings, a biological sample is repeatedly stored in and removed from a freezer, often with considerable time between insertion and removal of the biological sample.
  • the biological sample is typically held within a vial.
  • the vial is then placed in a box arranged to hold a plurality of vials.
  • the box may in turn be arranged in one of several racks of the freezer.
  • a typical freezer may contain on the order of five hundred boxes.
  • the location of the box in which a biological sample has been placed is tracked using pen and paper or an ad-hoc digital table to be filled in by the person responsible for the biological sample. Finding the right box again after having inserted a biological sample can be time consuming.
  • the present disclosure suggests systems, methods and computer programs where symbols, e.g. barcodes, are placed on an item, such as a box holding an organic sample, and the symbol is detected by one or more cameras. Based on trajectories and/or current content of the container, control circuitry of the system determines if the item is being inserted or removed.
  • the container may comprise storage components, such as racks, which may also be labelled with unique symbols, thereby enabling matching the symbol of the item with a symbol of a storage component in order to associate the item with a location within the container.
  • the present disclosure relates to a system for container space allocation monitoring.
  • the system comprises at least one camera placed to face a respective region in front of an opening of a container.
  • the at least one camera is placed outside the container and facing a region in front of the opening of the container to capture activities outside the container.
  • the at least one camera may be placed above, below, to the left and/or to the right of the opening.
  • the opening may be a side opening of the container.
  • the system further comprises a database configured to store information relating to items stored in the container and their respective positions within the container space.
  • the system also comprises control circuitry.
  • the control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera.
  • the control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol.
  • the control circuitry is also configured to determine a trajectory of the detected symbol based on a set of at least one determined position of the detected symbol.
  • the control circuitry is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory.
  • the control circuitry is yet further configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
  • the system thereby uses the trajectory of the determined symbol and the stored information in the database to derive a contextual relationship which is used to determine if the item carrying the symbol is being inserted or removed from the container and a position of the item relative to a position of the container space.
  • the position of the item relative to a position of the container space could be as simple as "outside" or "inside" the container space, but could in some examples use a symbol of a storage component of the container space to determine which storage component the item has been inserted in or removed from.
  • the position of the item's symbol, relative to the container space and/or relative to a symbol of a storage component may be recorded in the database to speed up retrieval of items stored in the container space.
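The decision logic described above (using the trajectory of the detected symbol together with the stored information to decide between insertion and removal, then updating the database) might be sketched as follows. All names here, `Observation`, `decide_event`, the `records` dict, and the 0.5 m "near" radius, are illustrative assumptions, not the claimed circuitry.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One detection of an item's symbol: (x, y) in a floor-plane frame, metres."""
    t: float
    x: float
    y: float

def decide_event(trajectory, container_xy, near=0.5):
    """Classify a finished trajectory as 'insertion', 'removal' or 'pass-by'.

    Heuristic sketch: if the last observed position is near the container
    opening (where the symbol then vanished), interpret insertion; if the
    first position was near and the last is far, interpret removal.
    """
    cx, cy = container_xy
    def dist(o):
        return ((o.x - cx) ** 2 + (o.y - cy) ** 2) ** 0.5
    first, last = trajectory[0], trajectory[-1]
    if dist(last) <= near:
        return "insertion"
    if dist(first) <= near and dist(last) > near:
        return "removal"
    return "pass-by"

# database stub: symbol ID -> currently-stored flag
records = {}

def update_db(symbol_id, event):
    if event == "insertion":
        records[symbol_id] = True
    elif event == "removal":
        records[symbol_id] = False

traj = [Observation(0.0, 2.0, 2.0), Observation(0.5, 1.0, 1.0),
        Observation(1.0, 0.1, 0.1)]          # moves toward the opening at (0, 0)
update_db("BOX-0042", decide_event(traj, (0.0, 0.0)))
print(records)                               # {'BOX-0042': True}
```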
  • At least part of the control circuitry and one or more of the at least one camera are comprised in a single logical unit such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit.
  • the single logical unit comprises a single-board computer.
  • the system further comprises a motion filter arranged to determine changes between consecutive images captured by the at least one camera.
  • the motion filter thereby facilitates real-time identification of symbols without the need for substantial computational resources.
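One common way to realise such a motion filter is simple frame differencing. The sketch below represents frames as nested lists of grayscale values and uses illustrative thresholds; frames in which too few pixels changed can be dropped before the (more expensive) symbol detection runs.

```python
def frame_changed(prev, curr, pixel_thresh=16, frac_thresh=0.01):
    """Motion filter sketch: compare two grayscale frames (lists of rows of
    0-255 ints) and report whether enough pixels changed to be worth passing
    downstream for symbol detection. Thresholds are illustrative only.
    """
    changed = total = 0
    for row_p, row_c in zip(prev, curr):
        for p, c in zip(row_p, row_c):
            total += 1
            if abs(p - c) >= pixel_thresh:
                changed += 1
    return changed / total >= frac_thresh

static = [[10] * 4 for _ in range(4)]
moved = [row[:] for row in static]
moved[1][2] = 200                      # a bright object entered one pixel
print(frame_changed(static, static))   # False - frame can be dropped
print(frame_changed(static, moved))    # True  - pass to symbol detection
```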
  • the system further comprises a video encoder and a video decoder.
  • the video encoder is configured to encode image data in a compressed format.
  • the video decoder is configured to decode/decompress the compressed format image data.
  • the use of video encoders and video decoders enables effective ways of separating the components associated with generating image data, such as cameras and motion filters (in addition to video encoders), from components associated with interpretation of the generated image data.
  • a video encoder enables efficient transfer of image data between the image data generating components and the image data interpreting components.
  • the at least one camera comprises the motion filter and/or the video encoder. The camera thereby not only captures images, but also performs local image processing in the camera.
  • control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images. This enables parallel processing of the image data associated with the images. If the control circuitry is able to process image data relating to different images in parallel, the processing of one image corresponding to a second time later than a first time may be finished before another image corresponding to the first time, which would cause the processed image data to appear out of order; by ensuring correct chronological order, such problems can be avoided.
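One way to ensure correct chronological order is a small reordering buffer keyed on each frame's sequence number: results from parallel workers are held back until every earlier frame has been emitted. The `Reorderer` class below is a hypothetical illustration, not the patent's mechanism.

```python
import heapq

class Reorderer:
    """Release separately processed images strictly in chronological order.

    Results may arrive out of order from parallel workers; each carries the
    sequence number of its source frame. push() returns every result that is
    now safe to emit downstream.
    """
    def __init__(self):
        self._heap = []
        self._next = 0

    def push(self, seq, result):
        heapq.heappush(self._heap, (seq, result))
        ready = []
        while self._heap and self._heap[0][0] == self._next:
            ready.append(heapq.heappop(self._heap)[1])
            self._next += 1
        return ready

r = Reorderer()
print(r.push(1, "frame-1"))   # [] - frame 0 not seen yet
print(r.push(0, "frame-0"))   # ['frame-0', 'frame-1']
print(r.push(2, "frame-2"))   # ['frame-2']
```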
  • the system further comprises a memory buffer configured to receive the compressed format image data from the video encoder and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.
  • the system thereby becomes more stable with respect to downstream processes, which often are more computationally expensive.
  • the buffer can ensure smooth real-time monitoring of the container.
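Such a buffer might be sketched with a bounded double-ended queue. The drop-oldest policy and the capacity below are assumptions chosen to keep monitoring real-time, not anything the disclosure prescribes.

```python
from collections import deque

class FrameBuffer:
    """Bounded buffer between the video encoder and the (slower) downstream
    pipeline. When full, the oldest compressed frame is silently evicted so
    that real-time monitoring keeps up.
    """
    def __init__(self, capacity=64):
        self._q = deque(maxlen=capacity)

    def put(self, compressed_frame):
        self._q.append(compressed_frame)   # evicts the oldest entry when full

    def get(self):
        """Called once the remaining pipeline is ready; None if empty."""
        return self._q.popleft() if self._q else None

buf = FrameBuffer(capacity=2)
buf.put(b"f0"); buf.put(b"f1"); buf.put(b"f2")   # b"f0" evicted
print(buf.get())   # b'f1'
print(buf.get())   # b'f2'
print(buf.get())   # None
```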
  • When the at least one camera comprises a motion filter and/or a video encoder, the processed image data may be transmitted to the memory buffer directly from the at least one camera.
  • the at least one camera also comprises the memory buffer. In such case, the at least one camera transmits image data from the buffer to downstream processes, such as a potential video decoder, when a remaining processing pipeline is ready to process the stored compressed format image data.
  • the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol to another symbol and/or an environmental marker.
  • the estimated relative position may be used to establish a contextual relationship between the container and the item. If the position of the container with respect to the other symbol or environmental marker is known, changes in the relative position of the detected symbol to the other symbol and/or the environmental marker can tell if the item is moving towards or away from the container.
  • If the database has no indication that the item is considered stored within the container, and a series of relative positions indicates a movement of the item towards the container, upon which the symbol of the item is lost from view of the at least one camera, the situation may be interpreted as the item being delivered to the container, as indicated by the series of relative positions, and placed inside the container, as indicated by the at least one camera no longer being able to track the symbol of the item.
  • the contextual relationship is thus one of insertion, and the database can be updated accordingly.
  • the relative position of the detected symbol to another symbol may further be used to determine a relative position of the item within the container space and/or within a storage component, typically a storage component comprising the other symbol.
  • the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera.
  • By determining the estimated distance and/or orientation, the positional accuracy of the item with respect to the container can be established or improved (if the position of the item is also determined in other ways).
  • the estimated distance and/or orientation may further facilitate establishment of a contextual relationship which enables determining if the item is being inserted or removed from the container.
  • the at least one camera is a single camera.
  • a single camera typically reduces cost with respect to solutions involving a plurality of cameras, as well as reducing space requirements and complexity of the system.
  • the at least one camera comprises a plurality of cameras placed to face different regions in front of the opening of the container.
  • the plurality of cameras is thereby more likely to detect a symbol which may be obscured from certain camera viewpoints at particular times.
  • the plurality of cameras further facilitates establishing a three-dimensional interpretation of the detected symbol as well as the space in which the item having the symbol moves.
  • the at least one camera comprises a stereo camera and/or a depth sensing camera.
  • the stereo and/or depth-sensing ability provides ways for single cameras to determine a position of the item with respect to the container based on the detected symbol.
  • the size and/or orientation of the detected symbol can be used to deduce a distance and/or orientation of the item with respect to the camera, and with knowledge of the camera with respect to the container, a position of the item with respect to the container.
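The size-based deduction can be illustrated with the pinhole-camera relation distance = focal length x real width / apparent width. The focal length and symbol size below are made-up values, not parameters from the disclosure.

```python
def symbol_distance(real_width_m, pixel_width, focal_px):
    """Pinhole-camera estimate: a fronto-parallel symbol of known physical
    width, appearing pixel_width pixels wide in an image taken with focal
    length focal_px (in pixels), lies at roughly
    focal_px * real_width / pixel_width metres from the camera.
    """
    return focal_px * real_width_m / pixel_width

# a 5 cm square barcode, 100 px wide, camera focal length 800 px
print(symbol_distance(0.05, 100, 800))   # ~0.4 metres
```

With the camera's own position relative to the container known, this per-symbol distance can be turned into a position of the item with respect to the container.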
  • the symbol of the item comprises one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition.
  • Barcodes and OCR numbers allow for large sets of different items and storage components to be assigned unique symbols. The barcodes and OCR numbers further enable the use of available methods for identifying the barcodes and OCR numbers.
  • control circuitry is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space.
  • the symbol of the item can be correlated to the symbol of the storage component in order to keep track of where, i.e. for which storage component, the item was inserted or removed.
  • By monitoring the relative positions it is possible to determine if and how the item is moving towards or away from the storage component, which may be interpreted as removal and insertion of the item from/into the storage component and possibly also where within the storage component the item was inserted or removed.
  • the symbol of the storage component may be used in combination with the symbol of the item to determine to/from where the item is inserted/removed, as well as assisting in establishing the context which enables determining if the item is being inserted or removed.
  • the present disclosure further relates to a method for container space allocation monitoring.
  • the method comprises detecting a symbol of an item based on processing one or more images captured by at least one camera.
  • the method further comprises determining a position of the detected symbol based on image processing of at least one image comprising the detected symbol.
  • the method also comprises determining a trajectory of the detected symbol based on a set of at least one determined position of the detected symbol.
  • the method additionally comprises determining insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory.
  • the method yet further comprises updating information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
  • The present disclosure also relates to a computer program for container space allocation monitoring, the computer program comprising computer program code which, when executed by a processor, causes the processor to carry out the method as described above and below.
  • the computer program thus has all the technical effects and advantages of the disclosed method and system, as described above and below.
  • Figure 1 illustrates a system for container space allocation monitoring;
  • Figure 2 is a box diagram of a system for container space allocation monitoring;
  • Figure 3 is a box diagram of a system for container space allocation monitoring; and
  • Figure 4 illustrates a method for container space allocation monitoring.
  • Figure 1 illustrates a system 100 for container space allocation monitoring.
  • Figure 1 further illustrates a container 110 having a container space 120 and an opening 112 for external access to the container space 120.
  • the container 110 may also comprise a closing mechanism (not shown), such as a door, configured to provide access to and close off the opening 112.
  • Figure 1 also illustrates an item 130 having a first symbol 132 and a storage component 140 of the container space, wherein the storage component 140 has a second symbol 142.
  • Neither one of the container 110, item 130, storage component 140 nor any of the first and second symbols 132, 142 are part of the claimed system 100, but are illustrated herein merely to facilitate understanding of different aspects of the system 100, since the system is intended to operate with respect to at least some of the container, item, storage component and the first and second symbols.
  • the basic idea of the system 100 is to act analogously to a traffic camera, and use the captured images to identify symbols and their movement in space and time in order to determine if items are being inserted into or removed from the storage space, and, to the extent possible, where with respect to the available positions within the storage space.
  • the information relating to insertion and removal of items to and from the storage space is recorded at a database, which may also be used to assist when determining if an item is being inserted or removed.
  • the system 100 comprises at least one camera 10a-c placed to face a respective region in front of an opening of a container.
  • the at least one camera is thereby arranged to capture images of items being inserted or removed and is able to register their associated symbols.
  • one or more cameras are arranged at a position above the opening of the container, facing straight down.
  • the one or more cameras thereby capture a bird's-eye view of what goes on in front of the opening of the container at any given moment.
  • the bird's-eye view is particularly suitable for capturing symbols, since most items inserted and removed will have a preferred side facing upwards, where an associated symbol will preferably be placed.
  • the at least one camera 10a-c is a single camera.
  • a single camera typically reduces cost with respect to solutions involving a plurality of cameras, as well as reducing space requirements and complexity of the system. If a bird's-eye view type of monitoring is desired, a single camera per container is typically enough to keep track of the items being inserted and removed from the respective containers.
  • the single camera is arranged to face a respective region in front of an opening of each container in a plurality of containers. In other words, the region monitored by the single camera covers all of the respective regions in front of the plurality of containers.
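Mapping a detection from such a shared camera to the right container can be as simple as a point-in-rectangle test over per-container regions in the image. The region layout and names below are a hypothetical example.

```python
def region_of(point, regions):
    """With a single bird's-eye camera covering several containers, decide
    which container's region (an axis-aligned rectangle x0, y0, x1, y1 in
    image coordinates) a detected symbol falls in; None if outside all.
    """
    x, y = point
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# illustrative layout: two freezers sharing one 640x480 camera view
regions = {"freezer-A": (0, 0, 320, 480), "freezer-B": (320, 0, 640, 480)}
print(region_of((100, 200), regions))   # freezer-A
print(region_of((500, 200), regions))   # freezer-B
```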
  • the at least one camera comprises a plurality of cameras placed to face different regions in front of the opening of the container.
  • the plurality of cameras is thereby more likely to detect a symbol which may be obscured from certain camera viewpoints at particular times.
  • the plurality of cameras further facilitates establishing a three-dimensional interpretation of the detected symbol as well as the space in which the item having the symbol moves.
  • a plurality of cameras can be distributed such that each container has one or more camera facing a region in front of the opening of each container. The system thereby extends the monitoring from one container to a plurality of containers.
  • the at least one camera comprises a stereo camera and/or a depth sensing camera.
  • the stereo and/or depth-sensing ability provides ways for single cameras to determine a position of the item with respect to the container based on the detected symbol.
  • the size and/or orientation of the detected symbol can be used to deduce a distance and/or orientation of the item with respect to the camera, and with knowledge of the camera with respect to the container, a position of the item with respect to the container.
  • the positions may be estimated using a perspective-n-point algorithm to determine a pose of the detected symbol with respect to the camera.
  • three-dimensional reconstruction from a stereo image may be used to determine positioning of detected symbols.
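For a rectified stereo pair, the standard relation depth = focal length x baseline / disparity applies, where the disparity is the horizontal pixel shift of the same symbol corner between the two images. The sketch below uses illustrative numbers and is not the patent's method.

```python
def stereo_depth(focal_px, baseline_m, x_left, x_right):
    """Rectified-stereo depth from the disparity of one matched point:
    depth = f * B / (x_left - x_right). A positive disparity is required.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a valid depth")
    return focal_px * baseline_m / disparity

# focal length 700 px, 6 cm baseline, symbol corner at x=420 (left), x=400 (right)
print(stereo_depth(700, 0.06, 420, 400))   # ~2.1 metres
```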
  • the system further comprises a database 14 configured to store information relating to items stored in the container and their respective positions within the container space.
  • each item may be assigned a unique ID associated with a symbol of the item.
  • Information relating to the position of the item may be as simple as an indicator of whether the item is currently stored within the container space or not, e.g. a binary number or an informative statement such as "inside" or "outside", but may also indicate where within the container space.
  • If the container has storage components, such as racks, and the storage components can be identified during insertion or removal of an item, e.g. by their own symbols, the stored information may comprise information configured to identify the storage component as well, such as a unique ID for the storage component, and possibly also where within the storage component. For each item, the stored information would then comprise a unique ID for the item, a unique ID for the storage component in which the item is stored or from which it has been removed, and an indicator of whether the item is currently stored within the container space or not.
  • the stored information may further comprise additional information considered useful by a user of the system, such as information relating to persons and/or projects associated with the item or things stored within the item.
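A minimal sketch of such a database, assuming SQLite and an invented schema (item ID, storage component ID, an inside flag, and free-form notes for persons or projects), could look like this; it is not the patent's actual schema.

```python
import sqlite3

# illustrative schema: one row per known item
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE items (
    item_id      TEXT PRIMARY KEY,
    component_id TEXT,
    inside       INTEGER NOT NULL,
    notes        TEXT)""")

def record_event(item_id, component_id, event, notes=None):
    # OR REPLACE keeps the sketch portable; note it rewrites the whole row,
    # so notes must be re-supplied on every event in this simple version.
    db.execute(
        "INSERT OR REPLACE INTO items VALUES (?, ?, ?, ?)",
        (item_id, component_id, 1 if event == "insertion" else 0, notes),
    )

record_event("VIAL-7", "RACK-3", "insertion", "project X")
record_event("VIAL-7", "RACK-3", "removal")
print(db.execute("SELECT inside FROM items WHERE item_id = 'VIAL-7'").fetchone())
# (0,)
```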
  • the system also comprises control circuitry 12.
  • the control circuitry 12 is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera.
  • the processing may comprise the use of image recognition software and/or applying filters or software modules for identifying predetermined features associated with the symbol.
  • the symbol may have known features for which special purpose filters and/or software modules are used to identify.
  • the symbol of the item may comprise one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition. Such barcodes and OCR numbers have known features which may be used for detection.
  • Barcodes and OCR numbers further enable assigning unique IDs to each item to be stored within the container.
  • the processing of the one or more images captured by the at least one camera comprises identifying at least one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition.
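As a first step of reading a one-dimensional barcode from an image row, decoders typically binarise the row and measure the widths of alternating bars and spaces; a symbology-specific pattern match then follows. The sketch below stops at that run-length stage and uses an invented threshold.

```python
def run_lengths(scanline, threshold=128):
    """Binarise one image row (0-255 grayscale values) and collect the widths
    of alternating dark/light runs (bars and spaces). Real decoders then
    match these widths against a symbology's patterns; that part is omitted.
    Returns a list of (value, width) pairs where value 1 means a dark bar.
    """
    bits = [1 if px < threshold else 0 for px in scanline]
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1
        else:
            runs.append([b, 1])
    return [(b, n) for b, n in runs]

# white quiet zone, wide bar, narrow space, narrow bar, quiet zone
row = [255] * 4 + [0] * 3 + [255] + [0] + [255] * 4
print(run_lengths(row))   # [(0, 4), (1, 3), (0, 1), (1, 1), (0, 4)]
```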
  • the system further comprises a motion filter 15 arranged to determine changes between consecutive images captured by the at least one camera 10a-c.
  • the motion filter 15 thereby facilitates real-time identification of symbols without the need for substantial computational resources.
  • the motion filter reduces the amount of image data that has to be processed by downstream processes.
  • the downstream processes may comprise both the detection of the symbol of the item, as discussed above, as well as further data processing steps in the form of determining if insertion or removal of the item is taking place. While local processing could be performed at the at least one camera in some examples, e.g. a camera may comprise a motion filter, it is typically desirable to transfer (possibly filtered) image data to a computational resource of the control circuitry particularly suitable for processing the downstream image data.
  • the system further comprises a video encoder 16 and a video decoder 17.
  • the video encoder 16 is configured to encode image data in a compressed format.
  • the video decoder 17 is configured to decode/decompress the compressed format image data.
  • the use of the video encoder 16 and the video decoder 17 enables efficient transfer of image data from one part in the system to another.
  • the video encoder 16 and the video decoder 17 facilitate splitting up the control circuitry 12 into logically and/or physically separate units, since image data transfer between the separate units of the control circuitry 12 can be performed more efficiently.
  • Encoding the image data may further be used to provide context and/or an optimal format for downstream image processing.
  • the context may facilitate interpretation of the image data.
  • the optimal format may facilitate the use of optimized algorithms to identify features of the image data.
  • the system 100 further comprises a memory buffer 18 configured to receive the compressed format image data from the video encoder 16 and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.
  • the system 100 thereby becomes more stable with respect to downstream processes, which often are more computationally expensive and may not be immediately ready to receive the image data.
  • the buffer 18 can ensure smooth real-time monitoring of the container 110.
  • the at least one camera 10a-c comprises a motion filter 15 and/or a video encoder 16.
  • the processed image data may be transmitted to the memory buffer 18 directly from the at least one camera.
  • the at least one camera 10a-c also comprises the memory buffer 18. In such case, the at least one camera transmits image data from the buffer 18 to downstream processes, such as a potential video decoder 17, when a remaining processing pipeline is ready to process the stored compressed format image data.
  • control circuitry 12 is further configured to ensure correct chronological order of a plurality of separately processed images.
  • the control circuitry 12 is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. As stated above, the determined positions form the basis for keeping track of the insertion and removal of items in the container. The position of the symbol may be derived from relationships with items in the environment and/or based on the image of the detected symbol itself.
  • the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol 132 to another symbol 142 and/or an environmental marker 152. With the position of the environmental marker 152 and/or the other symbol 142 being known, a spatial relationship between the detected symbol 132 and the other symbol 142 and/or the environmental marker 152 can be established.
  • the images of said camera may be used to determine a distance and direction between the detected symbol 132 and the other symbol 142 and/or the environmental marker 152 in a two-dimensional plane parallel to a floor on which the container 110 is arranged.
  • the estimated relative position of the detected symbol 132 to the other symbol 142 may be used to determine a relative position of the item 130 within the container space 120 and/or within a storage component 140 of the container space, e.g. a storage component 140 comprising the other symbol 142.
  • the environmental marker is arranged to provide a reference in a global coordinate system in which both the item 130 comprising the detected symbol 132 and the container 110 are arranged.
  • the other symbol 142 is arranged to provide a reference in a local coordinate system with respect to the container 110.
  • the other symbol 142 may comprise a symbol arranged on a storage component 140 of the container space, e.g. a rack for receiving the item 130.
  • the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera 10a-c. If the detected symbol is known to possess certain geometrical properties, such as being square- or rectangular-shaped with known side lengths, and/or having distinctive visual features, such as black and white squares or stripes, the geometrical properties and/or visual features may be used to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera. Once the symbol has been detected, the image of the symbol can be matched to a translation and/or rotation of a symbol from a reference plane with respect to the at least one camera.
  • One or more positions at known times can then be used to form a trajectory.
  • the trajectory may be used to keep track of where the item 130 is in relation to the container 110 as well as providing context as to whether an item that at one time instant appears or disappears in the vicinity of the opening 112 of the container 110 is being inserted into or removed from the container space 120. Therefore, the control circuitry 12 is also configured to determine a trajectory of the detected symbol 132 based on a set of determined at least one positions of the detected symbol.
  • the control circuitry 12 is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory. Movement of the detected symbol 132 in space and time provides a context which facilitates interpretation of whether the item 130 is being inserted into or removed from the container space 120. For example, a symbol of an item is detected within the vicinity of the opening 112 of the container space and the system has to decide if the item is being inserted or removed (or neither). The system may consult the information stored in the database 14 to see if the item 130 associated with the detected symbol 132 was previously stored in the container space 120 or not. If the item was registered as stored, the system has to determine if the item is to be considered removed.
  • a trajectory of the detected symbol 132 which leads away from the container may indicate that the item is considered removed from the container space.
  • a trajectory which makes a narrow U-turn in the vicinity of the opening 112 of the container or a symbol 142 of a storage component 140 of the container space, after which the detected symbol may disappear from view of the at least one camera 10a-c, may be interpreted as the item being taken out temporarily and then returned, thus still being stored within the container space. If two storage components having unique symbols are detected during the trajectory of the item, or rather the detected symbol 132 of the item 130, the trajectory may be used to indicate the item being moved from one storage component to another. Likewise, the trajectory may be used to indicate insertion of an item. If the trajectory of the detected symbol 132 approaches the container space and then disappears, this may be used to interpret the item 130 having the detected symbol 132 as being inserted.
  • the trajectory may further be used to indicate in which storage component the item is stored.
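A minimal sketch of such trajectory-based interpretation might look as follows. The use of distance-to-opening as the tracked quantity, the threshold value, and the three-way classification are illustrative assumptions, not the disclosed implementation:

```python
def classify_trajectory(distances_to_opening, was_stored, near=0.2):
    """Classify a trajectory given as successive distances (in metres)
    between the detected symbol and the container opening, ending when
    the symbol leaves the camera's view. `was_stored` reflects the
    database state for the item. Returns 'inserted', 'removed' or
    'unchanged'."""
    if len(distances_to_opening) < 2:
        return "unchanged"
    approached = distances_to_opening[-1] < distances_to_opening[0]
    ended_near_opening = distances_to_opening[-1] <= near
    if not was_stored and approached and ended_near_opening:
        # Symbol moved toward the opening and then vanished: insertion.
        return "inserted"
    if was_stored and not approached and not ended_near_opening:
        # Symbol moved away from the opening and then vanished: removal.
        return "removed"
    # e.g. a narrow U-turn: item taken out briefly and returned.
    return "unchanged"
```

A real gesture decoder would work on full positions and timestamps rather than scalar distances, but the contextual logic — combining the motion pattern with the stored state — is the same.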
  • the control circuitry 12 is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space.
  • the control circuitry 12 is yet further configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
  • At least part of the control circuitry 12 and one or more of the at least one camera 10a-c are comprised in a single logical unit (not shown) such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit.
  • the single logical unit comprises a single-board computer.
  • the single logical unit may further comprise the part of the control circuitry 12 configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
  • the single logical unit also comprises the motion filter 15.
  • the single logical unit further comprises the memory buffer 18.
  • the database 14 is comprised in the single logical unit.
  • control circuitry of the respective logical unit is configured to only update the stored information in the database if the information the control circuitry wants to store in the database differs from the currently stored information.
  • multiple logical units could share a single, external portion of the control circuitry configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
  • the single, external portion of the control circuitry configured to update the stored information in the database is comprised in the database.
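The write-suppression behaviour described above can be sketched as follows. The in-memory dictionary standing in for the database, and the record layout, are illustrative assumptions:

```python
class DatabaseWriter:
    """Updates a record only when the new value differs from what is
    already stored, reducing write traffic when many logical units
    share one database."""

    def __init__(self):
        self._store = {}   # item symbol -> (state, position)
        self.writes = 0    # number of writes actually performed

    def update(self, symbol, state, position):
        record = (state, position)
        if self._store.get(symbol) == record:
            return False   # unchanged: suppress the write
        self._store[symbol] = record
        self.writes += 1
        return True
```

For example, a second call with identical arguments returns False and leaves the write counter untouched, so repeated detections of a stationary item do not load the shared database.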
  • Figure 2 is a box diagram of a system 200 for container space allocation monitoring.
  • the illustrated system provides examples of how systems can be arranged to divide the computational tasks from the moment of image capture to updating a database between computational resources of different capacity.
  • the system is here illustrated having a low performance computational subsystem 202 and a high performance computational subsystem 204, as will be illustrated further below.
  • the system 200 comprises at least one camera 20 placed to face a respective region in front of an opening of a container (not shown). For reasons of convenience, the at least one camera 20 will be discussed in terms of a single camera, but it should be understood that the illustrated examples may comprise more than one camera of any type discussed herein.
  • the system further comprises a database 24 configured to store information relating to items stored in the container and their respective positions within the container space.
  • the system also comprises control circuitry.
  • the control circuitry handles the computational tasks and will be illustrated as distributed modules for clarity. Though the modules are illustrated as separate entities, it is to be understood that the modules may be integrated into one or more common logical unit(s).
  • the modules can be implemented in any suitable combination of software and hardware.
  • the control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera, as illustrated by a symbol detector and decoder module 221.
  • the control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol, as illustrated by a symbol positioner module 222.
  • the control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol as illustrated by a symbol tracker module 223.
  • the control circuitry is yet further configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, as illustrated by a gesture decoder module 224.
  • the control circuitry is additionally configured to update the stored information in the database 24 based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, as illustrated by a database writer module 225.
  • the system 200 preferably further comprises a motion filter 25 arranged to determine changes between consecutive images captured by the camera 20.
  • the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images, as illustrated by a reorder module 226.
  • the system further comprises a video encoder 26 and a video decoder 27, as illustrated by a video encoder module 26 and a video decoder module 27, respectively.
  • the video encoder 26 is configured to encode image data in a compressed format.
  • the video decoder 27 is configured to decode/decompress the compressed format image data.
  • the system may also comprise a memory buffer 28, as illustrated by a buffer module 28, configured to receive the compressed format image data from the video encoder 26 and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.
  • the camera 20 and any motion filter 25, video encoder 26 and memory buffer 28 are arranged together in a low performance computational subsystem 202. If present, any of the motion filter 25, video encoder 26 and memory buffer 28 may be arranged as part of or integrated with the camera 20. According to some aspects, the camera 20, the motion filter 25, the video encoder 26 and the memory buffer 28 are comprised in a single-board computer system.
  • the symbol detector and decoder module 221, the symbol positioner module 222, the symbol tracker module 223, the gesture decoder module 224, the database writer module 225, and, if present, the reorder module 226 and the video decoder 27 are arranged together in a high performance computational subsystem 204.
  • the high performance computational subsystem 204 may be implemented in a cloud virtual machine.
  • the database 24 is also comprised in the high performance computational subsystem 204.
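The division of labour between the two subsystems can be sketched as a producer/consumer pipeline connected by a bounded buffer. The thread-based setup and the mocked symbol detection below are illustrative assumptions, not the disclosed implementation:

```python
import queue
import threading

def run_pipeline(frames):
    """Sketch of the split pipeline: a low-performance producer pushes
    (sequence number, frame) pairs into a bounded queue, standing in
    for the encoder/buffer stage, while a high-performance consumer
    performs symbol detection (mocked here as a dictionary lookup)."""
    buf = queue.Queue(maxsize=8)   # the memory buffer between subsystems
    detections = []

    def low_perf_subsystem():
        for seq, frame in enumerate(frames):
            buf.put((seq, frame))  # blocks if the consumer lags behind
        buf.put(None)              # end-of-stream marker

    def high_perf_subsystem():
        while True:
            item = buf.get()
            if item is None:
                break
            seq, frame = item
            if "symbol" in frame:  # stand-in for real symbol detection
                detections.append((seq, frame["symbol"]))

    producer = threading.Thread(target=low_perf_subsystem)
    consumer = threading.Thread(target=high_perf_subsystem)
    producer.start()
    consumer.start()
    producer.join()
    consumer.join()
    return detections
```

The bounded queue gives the camera side backpressure when the interpretation side falls behind, which is the role the memory buffer 28 plays between the two subsystems.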
  • Figure 3 is a box diagram of a system 300 for container space allocation monitoring.
  • the illustrated system provides examples of how image processing and downstream aspects of the control circuitry may be integrated with one or more cameras in a single-board computer system.
  • Figure 3 aims at illustrating how the low performance computational subsystem 202 and the high performance computational subsystem 204 of Figure 2 above may both be comprised in a single-board computer system 303, along with potential advantages.
  • the system 300 comprises at least one camera 30 placed to face a respective region in front of an opening of a container (not shown). For reasons of convenience, the at least one camera 30 will be discussed in terms of a single camera, but it should be understood that the illustrated examples may comprise more than one camera of any type discussed herein.
  • the system further comprises a database 34 configured to store information relating to items stored in the container and their respective positions within the container space.
  • the system also comprises control circuitry.
  • the control circuitry handles the computational tasks and will be illustrated as distributed modules for clarity. Though the modules are illustrated as separate entities, it is to be understood that the modules may be integrated into one or more common logical unit(s).
  • the modules can be implemented in any suitable combination of software and hardware.
  • the control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera, as illustrated by a symbol detector and decoder module 321.
  • the control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol, as illustrated by a symbol positioner module 322.
  • the control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol as illustrated by a symbol tracker module 323.
  • the control circuitry is yet further configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, as illustrated by a gesture decoder module 324.
  • the control circuitry is additionally configured to update the stored information in the database 34 based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, as illustrated by a database writer module 325.
  • the system 300 preferably further comprises a motion filter 35 arranged to determine changes between consecutive images captured by the camera 30.
  • the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images, as illustrated by a reorder module 326.
  • the system may also comprise a memory buffer 38, as illustrated by a buffer module 38, configured to receive the compressed format image data either directly from the camera 30 or via the motion filter 35 (if present) and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.
  • the camera 30 and at least the part of the control circuitry comprising the symbol detector and decoder module 321, the reorder module 326, the symbol positioner module 322, the symbol tracker module 323, the gesture decoder module 324 and the database writer module 325 are comprised in a single-board computer system 303.
  • the database 34 is illustrated as a unit separate from the single-board computer system 303, but may in some examples be comprised in the single-board computer system 303 as well.
  • the database writer module 325 is configured to only update the stored information in the database if the information the database writer module 325 wants to store in the database 34 differs from the currently stored information.
  • the single-board computer systems 303 could share a single, external database writer module 325 (not shown).
  • the single, external database writer module is comprised in the database 34 (not shown).
  • Figure 4 illustrates a method for container space allocation monitoring.
  • the method comprises detecting S10 a symbol of an item based on processing one or more images captured by at least one camera.
  • the method further comprises determining S20 a position of the detected symbol based on image processing of at least one image comprising the detected symbol.
  • the method also comprises determining S30 a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol.
  • the method additionally comprises determining S40 insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory.
  • the method yet further comprises updating S50 information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
  • the method carries out the steps for which the disclosed system for container space allocation monitoring as described above and below is configured, and consequently has all the technical effects and advantages of the system for container space allocation monitoring.
  • the present disclosure also relates to a computer program for container space allocation monitoring.
  • the computer program comprises computer program code which, when executed by a processor, causes the processor to carry out the method for container space allocation monitoring as described above and below.

Abstract

The present disclosure relates to a system (100) for container space allocation monitoring. The system (100) comprises at least one camera (10a-c) placed to face a respective region in front of an opening (112) of a container (110). The system further comprises a database (14) configured to store information relating to items stored in the container and their respective positions within the container space (120). The system also comprises control circuitry (12). The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The control circuitry is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, and to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The present disclosure also relates to corresponding methods and computer programs.

Description

Systems, methods and computer programs for container space allocation monitoring TECHNICAL FIELD
The present disclosure relates to efficient identification of item location with respect to a container space. In particular, the present disclosure relates to systems, methods and computer programs for container space allocation monitoring.
BACKGROUND
In many lab settings, a biological sample is to be repeatedly stored in and removed from a freezer, often with considerable time between insertion and removal of the biological sample. The biological sample is typically held within a vial. The vial is then placed in a box arranged to hold a plurality of vials. The box may in turn be arranged in one of several racks of the freezer. A typical freezer may contain on the order of five hundred boxes. In many workplaces the location of the box in which a biological sample has been placed is tracked using pen and paper or an ad-hoc digital table to be filled in by the person responsible for the biological sample. Finding the right box again after having inserted a biological sample can be time consuming. One reason for this is that many workplaces naturally have high staff turnover, and information on where the departing staff kept their biological samples is often lost. Another source of trouble in recovering the right biological sample is that during freezer failures or maintenance, the person(s) responsible for handling the failure or maintenance are often different from the people working with the biological samples. Hence, it is common for the person(s) responsible for handling the failure or maintenance to inadvertently shuffle the racks holding the boxes with biological samples around, and sometimes even to move the boxes holding the biological sample vials as well to facilitate their own work. Since the persons working in lab environments are typically highly skilled workers with correspondingly high salaries, the additional time spent on identifying the location of a stored sample adds up over time and results in lost efficiency. A further potential cause of lost time is the need to identify which boxes are still in use and which are obsolete, since this would require going through all the boxes in the freezer one by one.
The above scenario is described as relating to storage of items in freezers in a lab environment. However, there are other scenarios in which similar difficulties occur. It can for example relate to libraries where the items are books. It may also be applicable to shopping malls or grocery stores wherein each item for sale may be monitored accordingly. Finally, it may concern post offices where items are parcels stored on shelves.
In each of the above scenarios, there is a need to decrease the amount of time spent looking for items which have been stored away in a storage arranged to hold a large quantity of items.
SUMMARY
One of the main problems faced when inserting and removing items in containers, where multiple users must keep track of the items in order to maintain a well-organized storage within the container, is that the process of monitoring insertion and removal, as well as keeping track of where every item is stored, may take up a lot of time. The present disclosure suggests systems, methods and computer programs where symbols, e.g. barcodes, are placed on an item, such as a box holding a biological sample, and the symbol is detected by one or more cameras. Based on trajectories and/or current content of the container, control circuitry of the system determines if the item is being inserted or removed. The container may comprise storage components, such as racks, which may also be labelled with unique symbols, thereby enabling matching the symbol of the item with a symbol of a storage component in order to associate the item with a location within the container.
More specifically, the present disclosure relates to a system for container space allocation monitoring. The system comprises at least one camera placed to face a respective region in front of an opening of a container. Thus, the at least one camera is placed outside the container and facing a region in front of the opening of the container to capture activities outside the container. The at least one camera may be placed above, below, to the left and/or to the right of the opening. The opening may be a side opening of the container.
The system further comprises a database configured to store information relating to items stored in the container and their respective positions within the container space. The system also comprises control circuitry. The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The control circuitry is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory. The control circuitry is yet further configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The system thereby uses the trajectory of the determined symbol and the stored information in the database to derive a contextual relationship which is used to determine if the item carrying the symbol is being inserted or removed from the container and a position of the item relative to a position of the container space. The position of the item relative to a position of the container space could be as simple as "outside" or "inside" the container space, but could in some examples use a symbol of a storage component of the container space to determine which storage component the item has been inserted in or removed from. The position of the item's symbol, relative to the container space and/or relative to a symbol of a storage component, may be recorded in the database to speed up retrieval of items stored in the container space.
According to some aspects, at least part of the control circuitry and one or more of the at least one camera are comprised in a single logical unit such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit. According to some further aspects, the single logical unit comprises a single-board computer. By integrating the camera(s) with the downstream processing performed by the control circuitry, encoding and decoding of image data can be omitted.
According to some aspects, the system further comprises a motion filter arranged to determine changes between consecutive images captured by the at least one camera. The motion filter thereby facilitates real-time identification of symbols without the need for substantial computational resources.
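A minimal motion filter of this kind might be sketched as follows, with frames represented as flat lists of pixel intensities. The difference metric (sum of absolute per-pixel differences) and the threshold are illustrative assumptions:

```python
def motion_filter(frames, threshold=10):
    """Keep only frames that differ sufficiently from the previously
    kept frame, so downstream symbol detection is run only when
    something is happening in front of the container opening."""
    kept = []
    previous = None
    for frame in frames:
        if previous is None:
            kept.append(frame)   # always keep the first frame
            previous = frame
            continue
        # Sum of absolute per-pixel differences against the last kept frame.
        change = sum(abs(a - b) for a, b in zip(frame, previous))
        if change > threshold:
            kept.append(frame)
            previous = frame
    return kept
```

Because static scenes are dropped early, the computationally expensive detection stages only see frames likely to contain a moving symbol.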
According to some aspects, the system further comprises a video encoder and a video decoder. The video encoder is configured to encode image data in a compressed format. The video decoder is configured to decode/decompress the compressed format image data. The use of video encoders and video decoders enables effective ways of separating the components associated with generating image data, such as cameras and motion filters (in addition to video encoders), from components associated with interpretation of the generated image data. In particular, a video encoder enables efficient transfer of image data between the image data generating components and the image data interpreting components. According to some further aspects, the at least one camera comprises the motion filter and/or the video encoder. The camera thereby not only captures images, but also performs local image processing in the camera.
According to some aspects, the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images. This enables parallel processing of the image data associated with the images. If the control circuitry is able to process image data relating to different images in parallel, the processing of one image corresponding to a second time later than a first time may be finished before another image corresponding to the first time, which would cause the processed image data to appear out of order; by ensuring correct chronological order, such problems can be avoided.
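The reordering of separately processed images can be sketched with a small hold-back buffer keyed on sequence numbers; the tagging scheme assumed here (each result arriving with the sequence number of its source image) is an illustrative assumption:

```python
import heapq

class Reorderer:
    """Re-emits results of parallel image processing in capture order.
    Each result arrives tagged with its sequence number; results are
    held back until all earlier sequence numbers have been emitted."""

    def __init__(self):
        self._pending = []   # min-heap of (seq, result)
        self._next = 0       # next sequence number to release

    def push(self, seq, result):
        heapq.heappush(self._pending, (seq, result))
        released = []
        while self._pending and self._pending[0][0] == self._next:
            released.append(heapq.heappop(self._pending)[1])
            self._next += 1
        return released
```

If the result for image 1 finishes before the result for image 0, it is simply held until image 0 arrives, after which both are released in capture order.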
According to some aspects, the system further comprises a memory buffer configured to receive the compressed format image data from the video encoder and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data. The system thereby becomes more stable with respect to downstream processes, which often are more computationally expensive. The buffer can ensure smooth real-time monitoring of the container. In case the at least one camera comprises a motion filter and/or a video encoder, the processed image data may be transmitted to the memory buffer directly from the at least one camera. According to some aspects, the at least one camera also comprises the memory buffer. In such a case, the at least one camera transmits image data from the buffer to downstream processes, such as a potential video decoder, when a remaining processing pipeline is ready to process the stored compressed format image data.
According to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol to another symbol and/or an environmental marker. In addition to providing information about where in space the item carrying the detected symbol is, i.e. its present location, the estimated relative position may be used to establish a contextual relationship between the container and the item. If the position of the container with respect to the other symbol or environmental marker is known, changes in the relative position of the detected symbol to the other symbol and/or the environmental marker can tell if the item is moving towards or away from the container. If, for instance, the database has no indication that the item is considered stored within the container and a series of relative positions indicates a movement of the item towards the container, upon which the symbol of the item is lost from view of the at least one camera, the situation may be interpreted as the item being delivered to the container, as indicated by the series of relative positions, and placed inside the container, as indicated by the at least one camera no longer being able to track the symbol of the item. The contextual relationship is thus one of insertion, and the database can be updated accordingly. The relative position of the detected symbol to another symbol may further be used to determine a relative position of the item within the container space and/or within a storage component, typically a storage component comprising the other symbol.
According to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera. By determining the estimated distance and/or orientation, positional accuracy of the item with respect to the container can be established or improved (if the position of the item is also determined in other ways). The estimated distance and/or orientation may further facilitate establishment of a contextual relationship which enables determining if the item is being inserted or removed from the container.
According to some aspects, the at least one camera is a single camera. A single camera typically reduces cost with respect to solutions involving a plurality of cameras, as well as reducing space requirements and complexity of the system.
According to some aspects, the at least one camera comprises a plurality of cameras placed to face different regions in front of the opening of the container. The plurality of cameras thereby is more likely to detect a symbol which may be obscured from certain camera viewpoints at particular times. The plurality of cameras further facilitates establishing a three-dimensional interpretation of the detected symbol as well as the space in which the item having the symbol moves.
According to some aspects, the at least one camera comprises a stereo camera and/or a depth sensing camera. The stereo and/or depth-sensing ability provides ways for single cameras to determine a position of the item with respect to the container based on the detected symbol. The size and/or orientation of the detected symbol can be used to deduce a distance and/or orientation of the item with respect to the camera, and with knowledge of the camera with respect to the container, a position of the item with respect to the container.
According to some aspects, the symbol of the item comprises one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition. Barcodes and OCR numbers allow for large sets of different items and storage components to be assigned unique symbols. The barcodes and OCR numbers further enable the use of available methods for identifying the barcodes and OCR numbers.
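As one concrete example of such machine-readable symbols, one-dimensional EAN-13 barcodes carry a check digit that lets a decoder reject misread symbols. The following sketch computes it from the first twelve digits; the example number in the test is an arbitrary illustration, not a value from the disclosure:

```python
def ean13_check_digit(first12):
    """Compute the EAN-13 check digit for a string of the first 12
    digits: digits in odd positions (1st, 3rd, ...) are weighted 1,
    digits in even positions are weighted 3, and the check digit
    brings the weighted sum up to a multiple of 10."""
    digits = [int(c) for c in first12]
    if len(digits) != 12:
        raise ValueError("expected 12 digits")
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits))
    return (10 - total % 10) % 10
```

A symbol detector can apply this check to every candidate read and discard decodes whose thirteenth digit does not match, which reduces false detections in cluttered camera views.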
According to some aspects, the control circuitry is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space. By assigning storage components, such as racks, their respective unique symbols, the symbol of the item can be correlated to the symbol of the storage component in order to keep track of where, i.e. for which storage component, the item was inserted or removed. By monitoring the relative positions it is possible to determine if and how the item is moving towards or away from the storage component, which may be interpreted as removal and insertion of the item from/into the storage component and possibly also where within the storage component the item was inserted or removed. In other words, the symbol of the storage component may be used in combination with the symbol of the item to determine to/from where the item is inserted/removed, as well as assisting in establishing the context which enables determining if the item is being inserted or removed.
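Correlating the item's symbol with a storage component's symbol can be sketched as a nearest-neighbour association in a shared coordinate frame. The (x, y) position representation and the rack naming below are illustrative assumptions:

```python
import math

def nearest_storage_component(item_pos, rack_positions):
    """Associate a detected item symbol with the closest detected
    storage-component symbol. Positions are (x, y) pairs in the same
    (e.g. floor-parallel) coordinate frame; rack_positions maps each
    rack's symbol to its last known position."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(rack_positions, key=lambda sym: dist(item_pos, rack_positions[sym]))
```

Combined with the insertion/removal decision, this yields not just that an item was inserted, but into which storage component, which is what gets recorded in the database.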
The present disclosure further relates to a method for container space allocation monitoring. The method comprises detecting a symbol of an item based on processing one or more images captured by at least one camera. The method further comprises determining a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The method also comprises determining a trajectory of the detected symbol based on a set of at least one determined position of the detected symbol. The method additionally comprises determining insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory. The method yet further comprises updating information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The disclosed method carries out the steps which the disclosed system is configured to carry out, and thus has all the technical effects and advantages of the disclosed system, as described above and below.
The present disclosure also relates to a computer program for container space allocation monitoring, the computer program comprising computer program code which, when executed by a processor, causes the processor to carry out the method as described above and below. The computer program thus has all the technical effects and advantages of the disclosed method and system, as described above and below.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates a system for container space allocation monitoring;
Figure 2 is a box diagram of a system for container space allocation monitoring;
Figure 3 is a box diagram of a system for container space allocation monitoring; and
Figure 4 illustrates a method for container space allocation monitoring.
DETAILED DESCRIPTION
Figure 1 illustrates a system 100 for container space allocation monitoring. Figure 1 further illustrates a container 110 having a container space 120 and an opening 112 for external access to the container space 120. The container 110 may also comprise a closing mechanism (not shown), such as a door, configured to provide access to and close off the opening 112. Figure 1 also illustrates an item 130 having a first symbol 132 and a storage component 140 of the container space, wherein the storage component 140 has a second symbol 142. None of the container 110, the item 130, the storage component 140 and the first and second symbols 132, 142 is part of the claimed system 100; they are illustrated herein merely to facilitate understanding of different aspects of the system 100, since the system is intended to operate with respect to at least some of the container, the item, the storage component and the first and second symbols.
The basic idea of the system 100 is to act analogously to a traffic camera, and use the captured images to identify symbols and their movement in space and time in order to determine whether items are being inserted into or removed from the storage space and, to the extent possible, at which of the available positions within the storage space. The information relating to insertion and removal of items to and from the storage space is recorded in a database, which may also be used to assist in determining whether an item is being inserted or removed.
The system 100 comprises at least one camera 10a-c placed to face a respective region in front of an opening of a container. The at least one camera is thereby arranged to capture images of items being inserted or removed and is able to register their associated symbols. In a preferred aspect, one or more cameras are arranged at a position above the opening of the container, facing straight down. The one or more cameras thereby capture a bird's-eye view of what goes on in front of the opening of the container at any given moment. The bird's-eye view is particularly suitable for capturing symbols, since most items inserted and removed will have a preferred side facing upwards, where an associated symbol will preferably be placed. The bird's-eye view further facilitates generating trajectories of symbols in a two-dimensional projection over the floor which the one or more cameras are recording. According to some aspects, the at least one camera 10a-c is a single camera. A single camera typically reduces cost with respect to solutions involving a plurality of cameras, as well as reducing the space requirements and complexity of the system. If a bird's-eye-view type of monitoring is desired, a single camera per container is typically enough to keep track of the items being inserted into and removed from the respective containers. According to some aspects, the single camera is arranged to face a respective region in front of an opening of each container in a plurality of containers. In other words, the region monitored by the single camera covers all of the respective regions in front of the plurality of containers.
According to some aspects, the at least one camera comprises a plurality of cameras placed to face different regions in front of the opening of the container. The plurality of cameras is thereby more likely to detect a symbol which may be obscured from certain camera viewpoints at particular times. The plurality of cameras further facilitates establishing a three-dimensional interpretation of the detected symbol as well as of the space in which the item having the symbol moves. In the case of several spatially distributed containers, a plurality of cameras can be distributed such that each container has one or more cameras facing a region in front of the opening of each container. The system thereby extends the monitoring from one container to a plurality of containers.
According to some aspects, the at least one camera comprises a stereo camera and/or a depth sensing camera. The stereo and/or depth-sensing ability provides ways for a single camera to determine a position of the item with respect to the container based on the detected symbol. The size and/or orientation of the detected symbol can be used to deduce a distance and/or orientation of the item with respect to the camera and, with knowledge of the position of the camera with respect to the container, a position of the item with respect to the container. The positions may be estimated using a perspective-n-point algorithm to determine a pose of the detected symbol with respect to the camera. According to some aspects, three-dimensional reconstruction from a stereo image may be used to determine positioning of detected symbols.
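By way of illustration only, the distance deduction from symbol size mentioned above can be sketched with a simple pinhole-camera model. All names and values in the following Python sketch are illustrative assumptions and not part of the disclosure; a full perspective-n-point solve over the four symbol corners would additionally recover orientation.

```python
def estimate_distance_m(symbol_side_px: float,
                        symbol_side_m: float,
                        focal_length_px: float) -> float:
    """Estimate camera-to-symbol distance with a pinhole model.

    Assumes the symbol faces the camera roughly head-on and that the
    symbol's physical side length and the camera's focal length
    (in pixels) are known from calibration.
    """
    return focal_length_px * symbol_side_m / symbol_side_px

# A 50 mm symbol imaged at 100 px width by a camera with an
# 800 px focal length lies about 0.4 m from the camera.
print(estimate_distance_m(100.0, 0.05, 800.0))  # 0.4
```

Combining such a camera-relative distance with the known mounting position of the camera relative to the container then yields the item's position with respect to the container.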
The system further comprises a database 14 configured to store information relating to items stored in the container and their respective positions within the container space. When storing the information, each item may be assigned a unique ID associated with a symbol of the item. Information relating to the position of the item may be as simple as an indicator of whether the item is currently stored within the container space or not, e.g. a binary number or an informative statement such as "inside" or "outside", but may also indicate where within the container space the item is located. If the container has storage components, such as racks, and the storage components can be identified during insertion or removal of an item, e.g. by detecting a symbol of the storage component in question, the stored information may comprise information configured to identify the storage component as well, such as a unique ID for the storage component, and possibly also where within the storage component the item is located. For each item, the stored information would then comprise a unique ID for the item, a unique ID for the storage component in which the item is stored or from which it has been removed, and an indicator of whether the item is currently stored within the container space or not. The stored information may further comprise additional information considered useful by a user of the system, such as information relating to persons and/or projects associated with the item or things stored within the item.
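One possible, purely illustrative shape of such a stored record is sketched below in Python; the field names and example values are assumptions for the sake of the example and not mandated by the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ItemRecord:
    item_id: str                              # unique ID decoded from the item's symbol
    inside: bool                              # True if currently within the container space
    storage_component_id: Optional[str] = None  # e.g. a rack's unique ID, if detected
    extra: dict = field(default_factory=dict)   # e.g. associated person or project

# Hypothetical record for an item stored in rack "RACK-07".
record = ItemRecord(item_id="ITEM-0042", inside=True,
                    storage_component_id="RACK-07",
                    extra={"project": "alpha"})
print(record.inside)  # True
```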
The system also comprises control circuitry 12. The control circuitry 12 is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera. The processing may comprise the use of image recognition software and/or applying filters or software modules for identifying predetermined features associated with the symbol. In other words, the symbol may have known features which special-purpose filters and/or software modules are configured to identify. The symbol of the item may comprise one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition. Such barcodes and OCR numbers have known features which may be used for detection. Barcodes and OCR numbers further enable assigning unique IDs to each item to be stored within the container. Thus, according to some aspects, the processing of the one or more images captured by the at least one camera comprises identifying at least one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition.
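As an illustrative aside, decoded one-dimensional barcodes commonly carry a built-in check digit that the control circuitry could use to reject misreads. The following Python sketch validates the check digit of one widely used symbology, EAN-13 (the example codes are arbitrary; the disclosure itself does not prescribe any particular symbology).

```python
def ean13_is_valid(code: str) -> bool:
    """Validate an EAN-13 check digit.

    Counting positions from 1 on the left, odd-position digits weigh 1
    and even-position digits weigh 3; the weighted sum of all 13 digits
    (check digit included) must be divisible by 10.
    """
    if len(code) != 13 or not code.isdigit():
        return False
    total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(code))
    return total % 10 == 0

print(ean13_is_valid("4006381333931"))  # True: check digit matches
print(ean13_is_valid("4006381333930"))  # False: corrupted read rejected
```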
According to some aspects, the system further comprises a motion filter 15 arranged to determine changes between consecutive images captured by the at least one camera 10a-c. The motion filter 15 thereby facilitates real-time identification of symbols without the need for substantial computational resources. The motion filter reduces the amount of image data that has to be processed by downstream processes. The downstream processes may comprise both the detection of the symbol of the item, as discussed above, and further data processing steps in the form of determining whether insertion or removal of the item is taking place. While local processing could be performed at the at least one camera in some examples, e.g. a camera may comprise a motion filter, it is typically desirable to transfer (possibly filtered) image data to a computational resource of the control circuitry particularly suitable for processing the downstream image data.
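A minimal sketch of such a motion filter, based on frame differencing, is given below. The thresholds and the flat pixel-list representation are illustrative assumptions only; a real deployment would operate on full camera frames (e.g. using OpenCV's absdiff on grayscale images).

```python
def frame_changed(prev, curr, pixel_thresh=25, frac_thresh=0.01):
    """Flag a frame for downstream processing only if enough pixels changed.

    prev/curr are equal-length sequences of 0-255 grayscale values.
    A frame passes the filter when more than frac_thresh of its pixels
    differ from the previous frame by more than pixel_thresh.
    """
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > pixel_thresh)
    return changed / len(curr) >= frac_thresh

still = [10] * 100                 # static scene: filtered out
moved = [10] * 95 + [200] * 5      # 5% of pixels changed: passed on
print(frame_changed(still, still))  # False
print(frame_changed(still, moved))  # True
```

Only frames for which the filter returns True would be forwarded to symbol detection, which is how the amount of downstream image data is reduced.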
Therefore, according to some aspects, the system further comprises a video encoder 16 and a video decoder 17. The video encoder 16 is configured to encode image data in a compressed format. The video decoder 17 is configured to decode/decompress the compressed format image data. The use of the video encoder 16 and the video decoder 17 enables efficient transfer of image data from one part of the system to another. In particular, the video encoder 16 and the video decoder 17 facilitate splitting up the control circuitry 12 into logically and/or physically separate units, since image data transfer between the separate units of the control circuitry 12 can be performed more efficiently. Encoding the image data may further be used to provide context and/or an optimal format for downstream image processing. The context may facilitate interpretation of the image data. The optimal format may facilitate the use of optimized algorithms to identify features of the image data.
According to some aspects, the system 100 further comprises a memory buffer 18 configured to receive the compressed format image data from the video encoder 16 and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data. The system 100 thereby becomes more stable with respect to downstream processes, which often are more computationally expensive and may not be immediately ready to receive the image data. The buffer 18 can ensure smooth real-time monitoring of the container 110. In case the at least one camera 10a-c comprises a motion filter 15 and/or a video encoder 16, the processed image data may be transmitted to the memory buffer 18 directly from the at least one camera. According to some aspects, the at least one camera 10a-c also comprises the memory buffer 18. In such a case, the at least one camera transmits image data from the buffer 18 to downstream processes, such as a potential video decoder 17, when a remaining processing pipeline is ready to process the stored compressed format image data.
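A bounded buffer of this kind can be sketched in a few lines of Python. The drop-oldest policy shown here is one illustrative choice (blocking the producer would be another); the class and capacity are assumptions for the example, not part of the disclosure.

```python
from collections import deque

class FrameBuffer:
    """Bounded buffer between the encoder and the remaining pipeline.

    When full, the oldest frame is dropped so that live monitoring of
    the container never stalls behind a slow downstream consumer.
    """
    def __init__(self, capacity: int = 64):
        self._frames = deque(maxlen=capacity)

    def push(self, frame) -> None:
        self._frames.append(frame)  # silently evicts the oldest when full

    def pop(self):
        return self._frames.popleft() if self._frames else None

buf = FrameBuffer(capacity=2)
for f in ("f1", "f2", "f3"):
    buf.push(f)
print(buf.pop())  # f2 -- f1 was dropped when capacity was exceeded
```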
It may sometimes be desirable to split up the processing of the image data over several separate computational resources. Since different computational resources may complete their respective tasks out of sync with the order in which the images associated with the image data were recorded, there may be a need to sort at least some of the separately processed images with respect to the chronological order in which they were captured. Therefore, according to some aspects, the control circuitry 12 is further configured to ensure correct chronological order of a plurality of separately processed images. The control circuitry 12 is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. As stated above, the determined positions form the basis for keeping track of the insertion and removal of items in the container. The position of the symbol may be derived from relationships with items in the environment and/or based on the image of the detected symbol itself.
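Such chronological reordering can be sketched as a small reorder buffer. The sketch below assumes, purely for illustration, that each frame carries a monotonically increasing sequence number assigned at capture time.

```python
import heapq

class Reorderer:
    """Re-emit separately processed frames in capture order.

    Frames may arrive out of order from parallel workers; each call to
    add() returns the (possibly empty) run of frames that is now ready
    to be released in strict sequence-number order.
    """
    def __init__(self):
        self._heap = []
        self._next = 0  # sequence number expected next

    def add(self, seq: int, frame):
        heapq.heappush(self._heap, (seq, frame))
        out = []
        while self._heap and self._heap[0][0] == self._next:
            out.append(heapq.heappop(self._heap)[1])
            self._next += 1
        return out

r = Reorderer()
print(r.add(1, "b"))  # [] -- frame 0 has not arrived yet
print(r.add(0, "a"))  # ['a', 'b'] -- both released in order
```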
Thus, according to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol 132 to another symbol 142 and/or an environmental marker 152. With the position of the environmental marker 152 and/or the other symbol 142 being known, a spatial relationship between the detected symbol 132 and the other symbol 142 and/or the environmental marker 152 can be established. For instance, if one camera 10a of the at least one camera 10a-c is arranged to obtain a bird's-eye view of a region in front of the opening of the container 110, the images of said camera may be used to determine a distance and direction between the detected symbol 132 and the other symbol 142 and/or the environmental marker 152 in a two-dimensional plane parallel to a floor on which the container 110 is arranged. Stated differently, the estimated relative position of the detected symbol 132 to the other symbol 142 may be used to determine a relative position of the item 130 within the container space 120 and/or within a storage component 140 of the container space, e.g. a storage component 140 comprising the other symbol 142. According to some aspects, the environmental marker is arranged to provide a reference in a global coordinate system in which both the item 130 comprising the detected symbol 132 and the container 110 are arranged. According to some aspects, the other symbol 142 is arranged to provide a reference in a local coordinate system with respect to the container 110. The other symbol 142 may comprise a symbol arranged on a storage component 140 of the container space, e.g. a rack for receiving the item 130.
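The bird's-eye distance-and-direction computation described above reduces to plane geometry once both points are expressed in floor coordinates. The following Python sketch is illustrative only; the coordinate values are assumptions for the example.

```python
import math

def relative_position(symbol_xy, marker_xy):
    """Bird's-eye relative position of a detected symbol with respect to
    a known reference marker.

    Both arguments are (x, y) floor-plane coordinates in metres; returns
    (distance in metres, bearing in degrees measured from the x-axis).
    """
    dx = symbol_xy[0] - marker_xy[0]
    dy = symbol_xy[1] - marker_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

# Hypothetical detected symbol at (3, 4) m relative to a marker at the origin.
dist, bearing = relative_position((3.0, 4.0), (0.0, 0.0))
print(dist)     # 5.0
print(bearing)  # 53.13... degrees
```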
According to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera 10a-c. If the detected symbol is known to possess certain geometrical properties, such as being square- or rectangular-shaped with known side lengths, and/or having distinctive visual features, such as black and white squares or stripes, the geometrical properties and/or visual features may be used to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera. Once the symbol has been detected, the image of the symbol can be matched to a translation and/or rotation of a symbol from a reference plane with respect to the at least one camera.
One or more positions at known times, e.g. obtained by associating each position with a time stamp, can then be used to form a trajectory. The trajectory may be used to keep track of where the item 130 is in relation to the container 110, as well as to provide context as to whether an item that at one time instant appears or disappears in the vicinity of the opening 112 of the container 110 is being inserted into or removed from the container space 120. Therefore, the control circuitry 12 is also configured to determine a trajectory of the detected symbol 132 based on a set of at least one determined position of the detected symbol.
The control circuitry 12 is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory. Movement of the detected symbol 132 in space and time provides a context which facilitates interpretation of whether the item 130 is being inserted into or removed from the storage space 120. For example, a symbol of an item is detected in the vicinity of the opening 112 of the storage space and the system has to decide if the item is being inserted or removed (or neither). The system may consult the information stored in the database 14 to see if the item 130 associated with the detected symbol 132 was previously stored in the container space 120 or not. If the item was registered as stored, the system has to determine if the item is to be considered removed. A trajectory of the detected symbol 132 which leads away from the container, e.g. after the detected symbol has disappeared from view of the at least one camera 10a-c, may indicate that the item is considered removed from the container space. A trajectory which makes a narrow U-turn in the vicinity of the opening 112 of the container or of a symbol 142 of a storage component 140 of the container space, after which the detected symbol may disappear from view of the at least one camera 10a-c, may be interpreted as the item being taken out temporarily and then returned, thus still being stored within the container space. If two storage components having unique symbols are detected during the trajectory of the item, or rather of the detected symbol 132 of the item 130, the trajectory may be used to indicate the item being moved from one storage component to another. Likewise, the trajectory may be used to indicate insertion of an item. If the trajectory of the detected symbol 132 approaches the storage space and then disappears, this may be used to interpret the item 130 having the detected symbol 132 as being inserted.
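The trajectory-based interpretation described above can be sketched as a toy classifier. Everything in the following Python sketch is an illustrative assumption (the one-dimensional distance coordinate, the 0.3 m vicinity threshold, the label names); it is not a definitive implementation of the disclosed gesture decoding.

```python
def classify_trajectory(points, was_inside=False):
    """Toy gesture decoder over a symbol trajectory.

    points: chronological (t, x) samples of a detected symbol, where x is
    the distance in metres from the container opening (0 = at the opening).
    was_inside: whether the database lists the item as currently stored.
    Returns "inserted", "removed", or "unchanged".
    """
    start, end = points[0][1], points[-1][1]
    approached = end < start and end < 0.3  # vanished close to the opening
    receded = end > start                   # moved away before vanishing
    if not was_inside and approached:
        return "inserted"
    if was_inside and receded:
        return "removed"
    return "unchanged"  # e.g. a narrow U-turn at the opening

# Symbol approaches the opening and disappears: interpreted as insertion.
print(classify_trajectory([(0.0, 2.0), (0.5, 1.0), (1.0, 0.1)]))  # inserted
# Stored item carried away from the opening: interpreted as removal.
print(classify_trajectory([(0.0, 0.1), (0.5, 1.0)], was_inside=True))  # removed
```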
If one or more symbols of corresponding one or more storage components are detected at the same time, the trajectory may further be used to indicate in which storage component the item is stored. In other words, according to some aspects, the control circuitry 12 is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space.
The control circuitry 12 is yet further configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
According to some aspects, at least part of the control circuitry 12 and one or more of the at least one camera 10a-c are comprised in a single logical unit (not shown) such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit. According to some further aspects, the single logical unit comprises a single-board computer. By integrating the camera(s) with the downstream processing performed by the control circuitry, encoding and decoding of image data can be omitted. The single logical unit may further comprise the part of the control circuitry 12 configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. According to some aspects, the single logical unit also comprises the motion filter 15. According to some aspects, the single logical unit further comprises the memory buffer 18.
In some examples the database 14 is comprised in the single logical unit.
It may be desirable to connect several logical units of the single-logical-unit type described above, e.g. several single-board computer systems, to the same database. In such cases it may be desirable to avoid performing the same type of updates in the database. In some examples the control circuitry of the respective logical unit is configured to only update the stored information in the database if the information the control circuitry wants to store at the database differs from the currently stored information.
Alternatively, the logical units could share a single, external portion of the control circuitry configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. According to some aspects, the single, external portion of the control circuitry configured to update the stored information in the database is comprised in the database.
Figure 2 is a box diagram of a system 200 for container space allocation monitoring. The illustrated system provides examples of how systems can be arranged to divide the computational tasks from the moment of image capture to updating a database between computational resources of different capacity. The system is here illustrated having a low performance computational subsystem 202 and a high performance computational subsystem 204, as will be illustrated further below.
The system 200 comprises at least one camera 20 placed to face a respective region in front of an opening of a container (not shown). For reasons of convenience, the at least one camera 20 will be discussed in terms of a single camera, but it should be understood that the illustrated examples may comprise more than one camera of any type discussed herein. The system further comprises a database 24 configured to store information relating to items stored in the container and their respective positions within the container space.
The system also comprises control circuitry. The control circuitry handles the computational tasks and will be illustrated as distributed modules for clarity. Though the modules are illustrated as separate entities, it is to be understood that the modules may be integrated into one or more common logical unit(s). The modules can be implemented in any suitable combination of software and hardware.
The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera, as illustrated by a symbol detector and decoder module 221. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol, as illustrated by a symbol positioner module 222. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of at least one determined position of the detected symbol, as illustrated by a symbol tracker module 223. The control circuitry is yet further configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, as illustrated by a gesture decoder module 224. The control circuitry is additionally configured to update the stored information in the database 24 based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, as illustrated by a database writer module 225.
The system 200 preferably further comprises a motion filter 25 arranged to determine changes between consecutive images captured by the camera 20.
According to some aspects, the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images, as illustrated by a reorder module 226.
According to some preferred aspects, the system further comprises a video encoder 26 and a video decoder 27, as illustrated by a video encoder module 26 and a video decoder module 27, respectively. The video encoder 26 is configured to encode image data in a compressed format. The video decoder 27 is configured to decode/decompress the compressed format image data.
The system may also comprise a memory buffer 28, as illustrated by a buffer module 28, configured to receive the compressed format image data from the video encoder 26 and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.
Possible aspects and capabilities of the at least one camera 20, the motion filter 25, the video encoder 26, the memory buffer 28, the video decoder 27 and the database 24, as well as implementations of the control circuitry, illustrated at least in part here by the symbol detector and decoder module 221, the reorder module 226, the symbol positioner module 222, the symbol tracker module 223, the gesture decoder module 224 and the database writer module 225, have been discussed in relation to Figure 1 above and apply in the illustrated example of Figure 2 as well. In the illustrated example the camera 20 and any motion filter 25, video encoder 26 and memory buffer 28 are arranged together in a low performance computational subsystem 202. If present, any of the motion filter 25, video encoder 26 and memory buffer 28 may be arranged as part of or integrated with the camera 20. According to some aspects, the camera 20, the motion filter 25, the video encoder 26 and the memory buffer 28 are comprised in a single-board computer system.
Likewise, the symbol detector and decoder module 221, the symbol positioner module 222, the symbol tracker module 223, the gesture decoder module 224, the database writer module 225 and, if present, the reorder module 226 and the video decoder 27 are arranged together in a high performance computational subsystem 204. The high performance computational subsystem 204 may be implemented in a cloud virtual machine. According to some aspects, the database 24 is also comprised in the high performance computational subsystem 204.
The low performance computational subsystem 202 and the high performance computational subsystem 204 are communicatively connected.

Figure 3 is a box diagram of a system 300 for container space allocation monitoring. The illustrated system provides examples of how image processing and downstream aspects of the control circuitry may be integrated with one or more cameras in a single-board computer system. In other words, Figure 3 aims to illustrate how the low performance computational subsystem 202 and the high performance computational subsystem 204 of Figure 2 above may both be comprised in a single-board computer system 303, along with potential advantages.
The system 300 comprises at least one camera 30 placed to face a respective region in front of an opening of a container (not shown). For reasons of convenience, the at least one camera 30 will be discussed in terms of a single camera, but it should be understood that the illustrated examples may comprise more than one camera of any type discussed herein. The system further comprises a database 34 configured to store information relating to items stored in the container and their respective positions within the container space.
The system also comprises control circuitry. The control circuitry handles the computational tasks and will be illustrated as distributed modules for clarity. Though the modules are illustrated as separate entities, it is to be understood that the modules may be integrated into one or more common logical unit(s). The modules can be implemented in any suitable combination of software and hardware.
The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera, as illustrated by a symbol detector and decoder module 321. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol, as illustrated by a symbol positioner module 322. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of at least one determined position of the detected symbol, as illustrated by a symbol tracker module 323. The control circuitry is yet further configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, as illustrated by a gesture decoder module 324. The control circuitry is additionally configured to update the stored information in the database 34 based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, as illustrated by a database writer module 325.
The system 300 preferably further comprises a motion filter 35 arranged to determine changes between consecutive images captured by the camera 30.
According to some aspects, the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images, as illustrated by a reorder module 326.
One of the advantages of the examples of Figure 3 with respect to the examples illustrated in relation to Figure 2 is that the video encoder and video decoder can be omitted.
The system may also comprise a memory buffer 38, as illustrated by a buffer module 38, configured to receive image data either directly from the camera 30 or via the motion filter 35 (if present) and store the image data until a remaining processing pipeline is ready to process the stored image data.
Possible aspects and capabilities of the at least one camera 30, the motion filter 35, the memory buffer 38 and the database 34, as well as implementations of the control circuitry, illustrated at least in part here by the symbol detector and decoder module 321, the reorder module 326, the symbol positioner module 322, the symbol tracker module 323, the gesture decoder module 324 and the database writer module 325, have been discussed in relation to Figure 1 above and apply in the illustrated example of Figure 3 as well.
The camera 30 and at least the part of the control circuitry comprising the symbol detector and decoder module 321, the reorder module 326, the symbol positioner module 322, the symbol tracker module 323, the gesture decoder module 324 and the database writer module 325 are comprised in a single-board computer system 303.
The database 34 is illustrated as a unit separate from the single-board computer system 303, but may in some examples be comprised in the single-board computer system 303 as well.
It may be desirable to connect several single-board computer systems 303 to the same database 34. In such cases it may be desirable to avoid performing the same type of updates in the database 34. In some examples the database writer module 325 is configured to only update the stored information in the database if the information the database writer module 325 wants to store at the database 34 differs from the currently stored information.
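The update-only-if-changed behaviour of the database writer module 325 can be sketched as follows. The class name, the dict-backed store and the record format are illustrative assumptions standing in for a real shared database.

```python
class DatabaseWriter:
    """Idempotent writer sketch for a database shared by several units.

    Skips the write when the new record equals what is already stored,
    so multiple single-board computer systems observing the same event
    do not repeat the same update.
    """
    def __init__(self, db: dict):
        self.db = db  # a dict standing in for the shared database

    def update(self, item_id: str, record: dict) -> bool:
        if self.db.get(item_id) == record:
            return False  # no change: no write issued
        self.db[item_id] = record
        return True

db = {}
writer = DatabaseWriter(db)
print(writer.update("ITEM-0042", {"inside": True}))   # True: first write
print(writer.update("ITEM-0042", {"inside": True}))   # False: duplicate suppressed
```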
Alternatively, the single-board computer systems 303 could share a single, external database writer module 325 (not shown). According to some aspects, the single, external database writer module is comprised in the database 34 (not shown).
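The "write only on change" behaviour described above can be sketched as follows; the key/value interface and the dict-backed store are assumptions made for illustration and do not reflect the actual database schema of the disclosed system.

```python
class DatabaseWriter:
    """Writes item positions to a shared store, skipping updates that
    would not change the stored information. Names and the dict-backed
    store are illustrative assumptions."""

    def __init__(self, store):
        self.store = store          # shared mapping: item id -> position
        self.writes_performed = 0   # lets us observe skipped duplicates

    def update(self, item_id, position):
        """Write only if the stored value actually differs."""
        if self.store.get(item_id) == position:
            return False            # no-op: this value is already stored
        self.store[item_id] = position
        self.writes_performed += 1
        return True
```

With several writers sharing one store, a writer that arrives second with the same observation performs no write, which is one way to avoid the duplicate updates discussed above.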
Figure 4 illustrates a method for container space allocation monitoring. The method comprises detecting S10 a symbol of an item based on processing one or more images captured by at least one camera. The method further comprises determining S20 a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The method also comprises determining S30 a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The method additionally comprises determining S40 insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory. The method yet further comprises updating S50 information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The method carries out the steps for which the disclosed system for container space allocation monitoring as described above and below is configured, and consequently has all the technical effects and advantages of the system for container space allocation monitoring.
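As a rough illustration of step S40, a trajectory of determined symbol positions can be classified as an insertion or a removal from its direction relative to the container opening. The one-dimensional coordinate convention below (opening at x = 0, region in front of the opening at x > 0) and the function name are assumptions for this sketch only, not the method's prescribed implementation.

```python
def classify_trajectory(positions):
    """Classify a detected symbol's trajectory as 'insertion', 'removal'
    or 'unknown'.

    positions: chronological x-coordinates of the detected symbol, with
    the container opening at x = 0 and the monitored region in front of
    the opening at x > 0 (an assumed convention).
    """
    if len(positions) < 2:
        raise ValueError("need at least two positions to form a trajectory")
    start, end = positions[0], positions[-1]
    if end < start:
        return "insertion"  # symbol moved toward the opening
    if end > start:
        return "removal"    # symbol moved away from the opening
    return "unknown"        # no net motion observed
```

A real system would work on 2D or 3D positions and more robust trajectory statistics, but the decision structure is the same.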
The present disclosure also relates to a computer program for container space allocation monitoring. The computer program comprises computer program code which, when executed by a processor, causes the processor to carry out the method for container space allocation monitoring as described above and below.

Claims

1. A system (100, 200, 300) for container space allocation monitoring, the system (100, 200, 300) comprising
at least one camera (10a-c, 20, 30), the at least one camera being placed outside a container and facing a region in front of an opening of the container to capture activities outside the opening of the container,
a database (14, 24, 34) configured to store information relating to items stored in the container and their respective positions within the container space (120), and control circuitry (12) configured to
• detect a symbol of an item based on processing one or more images captured by the at least one camera,
• determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol,
• determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol,
• determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, and
• update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.
2. The system according to claim 1, wherein at least part of the control circuitry and one or more of the at least one camera are comprised in a single logical unit such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit.
3. The system according to claim 1 or 2, wherein the system further comprises a motion filter (15, 25, 35) arranged to determine changes between consecutive images captured by the at least one camera (10a-c, 20, 30).
4. The system according to any of the preceding claims, wherein the control circuitry (12) is further configured to ensure correct chronological order of a plurality of separately processed images.
5. The system according to any of the preceding claims, wherein the system further comprises a video encoder (16, 26) and a video decoder (17, 27), wherein the video encoder (16, 26) is configured to encode image data in a compressed format, and wherein the video decoder (17, 27) is configured to decode/decompress the compressed format image data.
6. The system according to claim 5, further comprising a memory buffer (18, 28, 38) configured to receive the compressed format image data from the video encoder and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.
7. The system according to any of the preceding claims, wherein the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol to another symbol and/or an environmental marker.
8. The system according to any of the preceding claims, wherein the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera.
9. The system according to any of the preceding claims, wherein the at least one camera (10a-c, 20, 30) is a single camera.
10. The system according to any of claims 1-8, wherein the at least one camera (10a-c, 20, 30) comprises a plurality of cameras placed to face different regions in front of the opening of the container.
11. The system according to any of the preceding claims, wherein the at least one camera (10a-c, 20, 30) comprises a stereo camera and/or a depth-sensing camera.
12. The system according to any of the preceding claims, wherein the symbol of the item comprises one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition.
13. The system according to any of the preceding claims, wherein the control circuitry (12) is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space.
14. A method for container space allocation monitoring, the method comprising
detecting (S10) a symbol of an item based on processing one or more images captured by at least one camera,
determining (S20) a position of the detected symbol based on image processing of at least one image comprising the detected symbol,
determining (S30) a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol,
determining (S40) insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory, and
updating (S50) information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, wherein the one or more images are captured from a position outside the container and captures a region in front of an opening of the container to capture activities outside the opening of the container.
15. A computer program for container space allocation monitoring, the computer program comprising computer program code which, when executed by a processor, causes the processor to carry out the method according to claim 14.
PCT/SE2018/051215 2017-11-28 2018-11-26 Systems, methods and computer programs for container space allocation monitoring WO2019108117A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1751462-1 2017-11-28
SE1751462A SE542615C2 (en) 2017-11-28 2017-11-28 Systems, methods and computer programs for container space allocation monitoring

Publications (1)

Publication Number Publication Date
WO2019108117A1 true WO2019108117A1 (en) 2019-06-06

Family

ID=66664583

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2018/051215 WO2019108117A1 (en) 2017-11-28 2018-11-26 Systems, methods and computer programs for container space allocation monitoring

Country Status (2)

Country Link
SE (1) SE542615C2 (en)
WO (1) WO2019108117A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020202933A1 (en) 2020-03-06 2021-09-09 BSH Hausgeräte GmbH Storage device device

Citations (10)

Publication number Priority date Publication date Assignee Title
US20020141637A1 * 2001-03-28 2002-10-03 Philips Electronics North America Corporation Method and apparatus to distinguish deposit and removal in surveillance video
US7168618B2 (en) * 2004-08-12 2007-01-30 International Business Machines Corporation Retail store method and system
WO2010017531A2 (en) * 2008-08-08 2010-02-11 Snap-On Incorporated Image-based inventory control system using advanced image recognition
US20150029339A1 (en) * 2013-07-25 2015-01-29 Ncr Corporation Whole Store Scanner
US9129250B1 (en) * 2013-09-25 2015-09-08 Amazon Technologies, Inc. Automated inventory management
US20150262116A1 (en) * 2014-03-16 2015-09-17 International Business Machines Corporation Machine vision technology for shelf inventory management
US20160217417A1 (en) * 2015-01-23 2016-07-28 Samsung Electronics Co., Ltd. Object recognition for a storage structure
US20160364686A1 (en) * 2015-06-15 2016-12-15 Aethon, Inc. Safety, Management and Tracking of Hospital Pharmacy Trays
US20170286901A1 (en) * 2016-03-29 2017-10-05 Bossa Nova Robotics Ip, Inc. System and Method for Locating, Identifying and Counting Items
WO2017196822A1 (en) * 2016-05-09 2017-11-16 Grabango Co. System and method for computer vision driven applications within an environment

Non-Patent Citations (1)

Title
ASAKA S ET AL.: "Warehouse Management System with Monitoring Location of Goods", IP.COM JOURNAL, 1 September 1994 (1994-09-01), WEST HENRIETTA, NY, US, XP013101646, ISSN: 1533-0001 *

Also Published As

Publication number Publication date
SE1751462A1 (en) 2019-05-29
SE542615C2 (en) 2020-06-16

Similar Documents

Publication Publication Date Title
US10628648B2 (en) Systems and methods for tracking optical codes
US11049278B2 (en) System and method for visual identification, and system and method for classifying and sorting
Lim et al. Real-time image-based 6-dof localization in large-scale environments
US9204112B2 (en) Systems, circuits, and methods for efficient hierarchical object recognition based on clustered invariant features
JP2020030857A (en) Image processing apparatus and image processing method
EP2153409B1 (en) Camera pose estimation apparatus and method for augmented reality imaging
Younes et al. A survey on non-filter-based monocular visual SLAM systems
EP3079100B1 (en) Image processing apparatus, image processing method and computer readable storage medium
CN109791608A (en) Mapping abstract and localization
CN101853387A (en) Stereoscopic warehouse goods checking method and system
US9959651B2 (en) Methods, devices and computer programs for processing images in a system comprising a plurality of cameras
US10755422B2 (en) Tracking system and method thereof
CN109284653A (en) Slender body detection based on computer vision
WO2019108117A1 (en) Systems, methods and computer programs for container space allocation monitoring
JP2022546999A (en) PICKUP ROBOT, PICKUP METHOD, COMPUTER-READABLE STORAGE MEDIUM
EP3761228A1 (en) Computer-implemented method
CN109635688B (en) Method and system for managing books on bookshelf based on image recognition
Chou et al. Multi-image correspondence using geometric and structural constraints
CN105809690A (en) Data processing method, device and electronic device
US20230139490A1 (en) Automatic training data sample collection
US20220261578A1 (en) Method, System and Apparatus for Dynamic Inventory Guidance and Mapping
Xia et al. YOLO-Based Semantic Segmentation for Dynamic Removal in Visual-Inertial SLAM
Kim Object detection using RGBD data for interactive robotic manipulation
Brunetto et al. Interactive RGB-D SLAM on mobile devices
EP2509028B1 (en) Method and system for optically detecting and localizing a two-dimensional, 2D, marker in 2D scene data, and marker therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18883977; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10/09/2020))
122 Ep: pct application non-entry in european phase (Ref document number: 18883977; Country of ref document: EP; Kind code of ref document: A1)