

Title:
SYSTEMS, METHODS AND COMPUTER PROGRAMS FOR CONTAINER SPACE ALLOCATION MONITORING
Document Type and Number:
WIPO Patent Application WO/2019/108117
Kind Code:
A1
Abstract:
The present disclosure relates to a system (100) for container space allocation monitoring. The system (100) comprises at least one camera (10a-c) placed to face a respective region in front of an opening (112) of a container (110). The system further comprises a database (14) configured to store information relating to items stored in the container and their respective positions within the container space (120). The system also comprises control circuitry (12). The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The control circuitry is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, and to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The present disclosure also relates to corresponding methods and computer programs.

Inventors:
KLEIN CRISTIAN (SE)
TEREBIENIEC BARBARA (SE)
Application Number:
PCT/SE2018/051215
Publication Date:
June 06, 2019
Filing Date:
November 26, 2018
Assignee:
KLEIN CRISTIAN (SE)
TEREBIENIEC BARBARA (SE)
International Classes:
G06Q10/08; G06Q10/087; G06T7/246; G06V20/00; G06V30/148; G06V30/224; G06V20/52; G06V30/10
Domestic Patent References:
WO2010017531A22010-02-11
WO2017196822A12017-11-16
Foreign References:
US20160217417A12016-07-28
US20020141637A12002-10-03
US20160364686A12016-12-15
US20150262116A12015-09-17
US20170286901A12017-10-05
US20150029339A12015-01-29
US7168618B22007-01-30
US9129250B12015-09-08
Other References:
ASAKA S ET AL.: "Warehouse Management System with Monitoring Location of Goods", IP.COM JOURNAL, 1 September 1994 (1994-09-01), WEST HENRIETTA, NY, US, XP013101646, ISSN: 1533-0001
Attorney, Agent or Firm:
ZACCO SWEDEN AB et al. (SE)
Claims:
CLAIMS

1. A system (100, 200, 300) for container space allocation monitoring, the system (100, 200, 300) comprising

at least one camera (10a-c, 20, 30), the at least one camera being placed outside a container and facing a region in front of an opening of the container to capture activities outside the opening of the container,

a database (14, 24, 34) configured to store information relating to items stored in the container and their respective positions within the container space (120), and control circuitry (12) configured to

• detect a symbol of an item based on processing one or more images captured by the at least one camera,

• determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol,

• determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol,

• determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, and update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.

2. The system according to claim 1, wherein at least part of the control circuitry and one or more of the at least one camera are comprised in a single logical unit such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit.

3. The system according to claim 1 or 2, wherein the system further comprises a motion filter (15, 25, 35) arranged to determine changes between consecutive images captured by the at least one camera (10a-c, 20, 30).

4. The system according to any of the preceding claims, wherein the control circuitry (12) is further configured to ensure correct chronological order of a plurality of separately processed images.

5. The system according to any of the preceding claims, wherein the system further comprises a video encoder (16, 26) and a video decoder (17, 27), wherein the video encoder (16, 26) is configured to encode image data in a compressed format, and wherein the video decoder (17, 27) is configured to decode/decompress the compressed format image data.

6. The system according to claim 5, further comprising a memory buffer (18, 28, 38) configured to receive the compressed format image data from the video encoder and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.

7. The system according to any of the preceding claims, wherein the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol to another symbol and/or an environmental marker.

8. The system according to any of the preceding claims, wherein the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera.

9. The system according to any of the preceding claims, wherein the at least one camera (10a-c, 20, 30) is a single camera.

10. The system according to any of claims 1-8, wherein the at least one camera (10a-c, 20, 30) comprises a plurality of cameras placed to face different regions in front of the opening of the container.

11. The system according to any of the preceding claims, wherein the at least one camera (10a-c, 20, 30) comprises a stereo camera and/or a depth-sensing camera.

12. The system according to any of the preceding claims, wherein the symbol of the item comprises one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition.

13. The system according to any of the preceding claims, wherein the control circuitry (12) is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space.

14. A method for container space allocation monitoring, the method comprising

detecting (S10) a symbol of an item based on processing one or more images captured by at least one camera,

determining (S20) a position of the detected symbol based on image processing of at least one image comprising the detected symbol,

determining (S30) a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol,

determining (S40) insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory, and

updating (S50) information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, wherein the one or more images are captured from a position outside the container and capture a region in front of an opening of the container to capture activities outside the opening of the container.

15. A computer program for container space allocation monitoring, the computer program comprising computer program code which, when executed by a processor, causes the processor to carry out the method according to claim 14.

Description:
Systems, methods and computer programs for container space allocation monitoring

TECHNICAL FIELD

The present disclosure relates to efficient identification of item location with respect to a container space. In particular, the present disclosure relates to systems, methods and computer programs for container space allocation monitoring.

BACKGROUND

In many lab settings, a biological sample is to be repeatedly stored in and removed from a freezer, often with considerable time between insertion and removal of the biological sample. The biological sample is typically held within a vial. The vial is then placed in a box arranged to hold a plurality of vials. The box may in turn be arranged in one of several racks of the freezer. A typical freezer may contain on the order of five hundred boxes. In many workplaces the location of the box in which a biological sample has been placed is tracked using pen and paper or an ad-hoc digital table to be filled in by the person responsible for the biological sample. Finding the right box again after having inserted a biological sample can be time consuming. One reason for this is that many workplaces naturally have a high staff turnover, and information on where departing staff kept their biological samples is often lost. Another source of trouble in recovering the right biological sample is that during freezer failures or maintenance, the person(s) responsible for taking care of the failure or maintenance are often different from the people working with the biological samples. Hence, it is common for the person(s) responsible for taking care of the failure or maintenance to inadvertently shuffle the racks holding the boxes with biological samples around, and sometimes even to move the boxes holding the biological sample vials as well to facilitate their own work. Since the persons working in lab environments are typically highly skilled workers, with correspondingly high salaries, the additional time spent on identifying the location of a stored sample adds up over time and results in lost efficiency. A further potential cause for lost time is the need to identify which boxes are still in use and which boxes are obsolete, since this would require going through all the boxes in the freezer one by one.

The above scenario is described as relating to storage of items in freezers in a lab environment. However, there are other scenarios in which similar difficulties occur. It can for example relate to libraries where the items are books. It may also be applicable to shopping malls or grocery stores wherein each item for sale may be monitored accordingly. Finally, it may concern post offices where items are parcels stored on shelves.

In each of the above scenarios, there is a need to decrease the amount of time spent looking for items that have been stored away in a storage holding a large quantity of items.

SUMMARY

One of the main problems faced when inserting and removing items in containers requiring multiple users to keep track of the items in order to maintain a well-organized storage within the container is that the process of monitoring insertion and removal, as well as keeping track of where every item is stored, may take up a lot of time. The present disclosure suggests systems, methods and computer programs where symbols, e.g. barcodes, are placed on an item, such as a box holding an organic sample, and the symbol is detected by one or more cameras. Based on trajectories and/or current content of the container, control circuitry of the system determines if the item is being inserted or removed. The container may comprise storage components, such as racks, which may also be labelled with unique symbols, thereby enabling matching the symbol of the item with a symbol of a storage component in order to associate the item with a location within the container.

More specifically, the present disclosure relates to a system for container space allocation monitoring. The system comprises at least one camera placed to face a respective region in front of an opening of a container. Thus, the at least one camera is placed outside the container and facing a region in front of the opening of the container to capture activities outside the container. The at least one camera may be placed above, below, to the left and/or to the right of the opening. The opening may be a side opening of the container.

The system further comprises a database configured to store information relating to items stored in the container and their respective positions within the container space. The system also comprises control circuitry. The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The control circuitry is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory. The control circuitry is yet further configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The system thereby uses the trajectory of the detected symbol and the stored information in the database to derive a contextual relationship which is used to determine if the item carrying the symbol is being inserted into or removed from the container, and a position of the item relative to a position of the container space. The position of the item relative to a position of the container space could be as simple as "outside" or "inside" the container space, but could in some examples use a symbol of a storage component of the container space to determine which storage component the item has been inserted in or removed from. The position of the item's symbol, relative to the container space and/or relative to a symbol of a storage component, may be recorded in the database to speed up retrieval of items stored in the container space.

According to some aspects, at least part of the control circuitry and one or more of the at least one camera are comprised in a single logical unit such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit. According to some further aspects, the single logical unit comprises a single-board computer. By integrating the camera(s) with the downstream processing performed by the control circuitry, encoding and decoding of image data can be omitted.

According to some aspects, the system further comprises a motion filter arranged to determine changes between consecutive images captured by the at least one camera. The motion filter thereby facilitates real-time identification of symbols without the need for substantial computational resources.

According to some aspects, the system further comprises a video encoder and a video decoder. The video encoder is configured to encode image data in a compressed format. The video decoder is configured to decode/decompress the compressed format image data. The use of video encoders and video decoders enables effective ways of separating the components associated with generating image data, such as cameras and motion filters (in addition to video encoders), from components associated with interpretation of the generated image data. In particular, a video encoder enables efficient transfer of image data between the image data generating components and the image data interpreting components. According to some further aspects, the at least one camera comprises the motion filter and/or the video encoder. The camera thereby not only captures images, but also performs local image processing in the camera.

According to some aspects, the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images. This enables parallel processing of the image data associated with the images. If the control circuitry is able to process image data relating to different images in parallel, the processing of one image corresponding to a second time later than a first time may be finished before another image corresponding to the first time, which would cause the processed image data to appear out of order; by ensuring correct chronological order, such problems can be avoided.
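The reordering described above can be sketched as a small reassembly buffer keyed by frame sequence number: out-of-order results are held back until the next expected frame arrives. This is an illustrative sketch rather than the claimed implementation; the class name and the sequence-number convention are assumptions.

```python
import heapq

class FrameReorderer:
    """Reassembles separately processed frames into chronological order.

    Frames may finish processing out of order; results are held in a
    min-heap keyed by sequence number and released only when the next
    expected frame is available.
    """

    def __init__(self):
        self._heap = []          # (seq, result) pairs, min-heap by seq
        self._next_seq = 0       # sequence number expected next

    def push(self, seq, result):
        """Accept one processed frame; return all frames now releasable, in order."""
        heapq.heappush(self._heap, (seq, result))
        released = []
        while self._heap and self._heap[0][0] == self._next_seq:
            released.append(heapq.heappop(self._heap)[1])
            self._next_seq += 1
        return released

# Frames 0..3 finish parallel processing out of order:
r = FrameReorderer()
out = []
for seq, res in [(1, "b"), (0, "a"), (3, "d"), (2, "c")]:
    out.extend(r.push(seq, res))
# out is now ["a", "b", "c", "d"]
```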

According to some aspects, the system further comprises a memory buffer configured to receive the compressed format image data from the video encoder and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data. The system thereby becomes more stable with respect to downstream processes, which often are more computationally expensive. The buffer can ensure smooth real-time monitoring of the container. In case the at least one camera comprises a motion filter and/or a video encoder, the processed image data may be transmitted to the memory buffer directly from the at least one camera. According to some aspects, the at least one camera also comprises the memory buffer. In such case, the at least one camera transmits image data from the buffer to downstream processes, such as a potential video decoder, when a remaining processing pipeline is ready to process the stored compressed format image data.

According to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol to another symbol and/or an environmental marker. In addition to providing information of where in space the item carrying the detected symbol is, i.e. its present location, the estimated relative position may be used to establish a contextual relationship between the container and the item. If the position of the container with respect to the other symbol or environmental marker is known, changes in the relative position of the detected symbol to the other symbol and/or the environmental marker can tell if the item is moving towards or away from the container. If, for instance, the database has no indication that the item is considered stored within the container and a series of relative positions indicates a movement of the item towards the container, upon which the symbol of the item is lost from view of the at least one camera, the situation may be interpreted as the item being delivered to the container, as indicated by the series of relative positions, and placed inside the container, as indicated by the at least one camera no longer being able to track the symbol of the item. The contextual relationship is thus one of insertion, and the database can be updated accordingly. The relative position of the detected symbol to another symbol may further be used to determine a relative position of the item within the container space and/or within a storage component, typically a storage component comprising the other symbol.
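The contextual inference described above can be sketched as a simple heuristic that combines the trajectory (here reduced to a series of distances between the item's symbol and a container-side marker), whether the symbol was lost from view, and the database state. The function name, inputs and decision rule are illustrative assumptions, not the claimed algorithm.

```python
def infer_event(distances, lost_from_view, stored_in_db):
    """Interpret a symbol trajectory as insertion, removal, or neither.

    distances: successive distances (any units) between the item's symbol
      and a container-side environmental marker, oldest first.
    lost_from_view: True if the symbol disappeared from all camera views.
    stored_in_db: True if the database currently lists the item as inside.
    """
    if len(distances) < 2:
        return "unknown"               # no trajectory to reason about
    approaching = distances[-1] < distances[0]   # net movement toward container
    if approaching and lost_from_view and not stored_in_db:
        return "insertion"             # unstored item moved in, then vanished
    if not approaching and not lost_from_view and stored_in_db:
        return "removal"               # stored item reappeared, moving away
    return "unknown"

# An unstored item moves toward the container, then disappears from view:
event = infer_event([2.4, 1.8, 1.1, 0.4], lost_from_view=True, stored_in_db=False)
# event == "insertion"
```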

According to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera. By determining the estimated distance and/or orientation, positional accuracy of the item with respect to the container can be established or improved (if the position of the item is also determined in other ways). The estimated distance and/or orientation may further facilitate establishment of a contextual relationship which enables determining if the item is being inserted or removed from the container.

According to some aspects, the at least one camera is a single camera. A single camera typically reduces cost with respect to solutions involving a plurality of cameras, as well as reducing space requirements and complexity of the system.

According to some aspects, the at least one camera comprises a plurality of cameras placed to face different regions in front of the opening of the container. The plurality of cameras thereby is more likely to detect a symbol which may be obscured from certain camera viewpoints at particular times. The plurality of cameras further facilitates establishing a three-dimensional interpretation of the detected symbol as well as the space in which the item having the symbol moves.

According to some aspects, the at least one camera comprises a stereo camera and/or a depth-sensing camera. The stereo and/or depth-sensing ability provides ways for single cameras to determine a position of the item with respect to the container based on the detected symbol. The size and/or orientation of the detected symbol can be used to deduce a distance and/or orientation of the item with respect to the camera, and, with knowledge of the position of the camera with respect to the container, a position of the item with respect to the container.

According to some aspects, the symbol of the item comprises one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition. Barcodes and OCR numbers allow for large sets of different items and storage components to be assigned unique symbols. The barcodes and OCR numbers further enable the use of available methods for identifying the barcodes and OCR numbers.

According to some aspects, the control circuitry is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space. By assigning storage components, such as racks, their respective unique symbols, the symbol of the item can be correlated to the symbol of the storage component in order to keep track of where, i.e. for which storage component, the item was inserted or removed. By monitoring the relative positions it is possible to determine if and how the item is moving towards or away from the storage component, which may be interpreted as removal and insertion of the item from/into the storage component and possibly also where within the storage component the item was inserted or removed. In other words, the symbol of the storage component may be used in combination with the symbol of the item to determine to/from where the item is inserted/removed, as well as assisting in establishing the context which enables determining if the item is being inserted or removed.
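Associating an item with a storage component can be sketched as picking the detected storage-component symbol nearest to the item's symbol in the image plane. The IDs and the coordinate convention below are assumptions made purely for illustration.

```python
import math

def nearest_storage_component(item_pos, component_symbols):
    """Associate an item's detected symbol with the closest detected
    storage-component symbol (e.g. a rack label).

    item_pos: (x, y) position of the item's symbol in the image plane.
    component_symbols: mapping of component ID -> (x, y) symbol position.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    # Pick the component whose symbol lies closest to the item's symbol.
    return min(component_symbols, key=lambda cid: dist(item_pos, component_symbols[cid]))

# Item symbol seen at (2, 3); rack labels detected at two positions:
rack = nearest_storage_component((2, 3), {"RACK-A": (0, 0), "RACK-B": (2, 4)})
# rack == "RACK-B"
```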

The present disclosure further relates to a method for container space allocation monitoring. The method comprises detecting a symbol of an item based on processing one or more images captured by at least one camera. The method further comprises determining a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The method also comprises determining a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The method additionally comprises determining insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory. The method yet further comprises updating information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The disclosed method carries out the steps which the disclosed system is configured to carry out, and thus has all the technical effects and advantages of the disclosed system, as described above and below.

The present disclosure also relates to a computer program for container space allocation monitoring, the computer program comprising computer program code which, when executed by a processor, causes the processor to carry out the method as described above and below. The computer program thus has all the technical effects and advantages of the disclosed method and system, as described above and below.

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 illustrates a system for container space allocation monitoring;

Figure 2 is a box diagram of a system for container space allocation monitoring;

Figure 3 is a box diagram of a system for container space allocation monitoring; and

Figure 4 illustrates a method for container space allocation monitoring.

DETAILED DESCRIPTION

Figure 1 illustrates a system 100 for container space allocation monitoring. Figure 1 further illustrates a container 110 having a container space 120 and an opening 112 for external access to the container space 120. The container 110 may also comprise a closing mechanism (not shown), such as a door, configured to provide access to and close off the opening 112. Figure 1 also illustrates an item 130 having a first symbol 132 and a storage component 140 of the container space, wherein the storage component 140 has a second symbol 142. None of the container 110, the item 130, the storage component 140 and the first and second symbols 132, 142 are part of the claimed system 100; they are illustrated herein merely to facilitate understanding of different aspects of the system 100, since the system is intended to operate with respect to at least some of the container, item, storage component and the first and second symbols.

The basic idea of the system 100 is to act analogously to a traffic camera, using the captured images to identify symbols and their movement in space and time in order to determine if items are being inserted into or removed from the storage space, and, to the extent possible, at which of the available positions within the storage space. The information relating to insertion and removal of items to and from the storage space is recorded in a database, which may also be used to assist when determining if an item is being inserted or removed.

The system 100 comprises at least one camera 10a-c placed to face a respective region in front of an opening of a container. The at least one camera is thereby arranged to capture images of items being inserted or removed and is able to register their associated symbols. In a preferred aspect, one or more cameras are arranged at a position above the opening of the container, facing straight down. The one or more cameras thereby capture a bird's-eye view of what goes on in front of the opening of the container at any given moment. The bird's-eye view is particularly suitable for capturing symbols, since most items inserted and removed will have a preferred side facing upwards, where an associated symbol will preferably be placed. The bird's-eye view further facilitates generating trajectories of symbols in a two-dimensional projection over the floor area which the one or more cameras are recording. According to some aspects, the at least one camera 10a-c is a single camera. A single camera typically reduces cost with respect to solutions involving a plurality of cameras, as well as reducing space requirements and complexity of the system. If a bird's-eye view type of monitoring is desired, a single camera per container is typically enough to keep track of the items being inserted and removed from the respective containers. According to some aspects, the single camera is arranged to face a respective region in front of an opening of each container in a plurality of containers. In other words, the region monitored by the single camera covers all of the respective regions in front of the plurality of containers.
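Building per-symbol trajectories from the bird's-eye camera can be sketched as grouping per-frame detections by symbol ID and ordering them by frame index. The tuple format below is an assumed convention, not part of the disclosure.

```python
from collections import defaultdict

def build_trajectories(detections):
    """Group per-frame symbol detections into per-symbol 2-D trajectories.

    detections: list of (frame_index, symbol_id, (x, y)) tuples from the
    bird's-eye camera, in any order.  Returns {symbol_id: [(x, y), ...]}
    ordered by frame index.
    """
    per_symbol = defaultdict(list)
    for frame, symbol, pos in sorted(detections):   # sort by frame index
        per_symbol[symbol].append(pos)
    return dict(per_symbol)

# Three detections of the same symbol, delivered out of order:
traj = build_trajectories([
    (0, "ITEM-42", (5.0, 9.0)),
    (2, "ITEM-42", (5.2, 3.1)),
    (1, "ITEM-42", (5.1, 6.0)),
])
# traj == {"ITEM-42": [(5.0, 9.0), (5.1, 6.0), (5.2, 3.1)]}
```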

According to some aspects, the at least one camera comprises a plurality of cameras placed to face different regions in front of the opening of the container. The plurality of cameras thereby is more likely to detect a symbol which may be obscured from certain camera viewpoints at particular times. The plurality of cameras further facilitates establishing a three-dimensional interpretation of the detected symbol as well as the space in which the item having the symbol moves. In the case of several spatially distributed containers, a plurality of cameras can be distributed such that each container has one or more camera facing a region in front of the opening of each container. The system thereby extends the monitoring from one container to a plurality of containers.

According to some aspects, the at least one camera comprises a stereo camera and/or a depth-sensing camera. The stereo and/or depth-sensing ability provides ways for single cameras to determine a position of the item with respect to the container based on the detected symbol. The size and/or orientation of the detected symbol can be used to deduce a distance and/or orientation of the item with respect to the camera, and, with knowledge of the position of the camera with respect to the container, a position of the item with respect to the container. The positions may be estimated using a perspective-n-point algorithm to determine a pose of the detected symbol with respect to the camera. According to some aspects, three-dimensional reconstruction from a stereo image may be used to determine positioning of detected symbols.
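As a simplified stand-in for a full perspective-n-point pose estimate, the distance to a symbol of known physical size can be deduced from its apparent size using the pinhole camera model; the parameter names below are illustrative.

```python
def distance_from_apparent_size(focal_length_px, symbol_size_m, size_in_image_px):
    """Estimate camera-to-symbol distance with the pinhole camera model.

    A symbol of known physical size appears smaller the farther away it is:
        distance = focal_length * real_size / apparent_size
    """
    return focal_length_px * symbol_size_m / size_in_image_px

# A 5 cm barcode imaged at 100 px width with an 800 px focal length:
d = distance_from_apparent_size(800.0, 0.05, 100.0)   # 0.4 m from the camera
```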

The system further comprises a database 14 configured to store information relating to items stored in the container and their respective positions within the container space. When storing the information, each item may be assigned a unique ID associated with a symbol of the item. Information relating to the position of the item may be as simple as an indicator of whether the item is currently stored within the container space or not, e.g. a binary number or an informative statement such as "inside" or "outside", but may also indicate where within the container space. If the container has storage components, such as racks, and the storage components can be identified during insertion or removal of an item, e.g. by detecting a symbol of the storage component in question, the stored information may comprise information configured to identify the storage component as well, such as a unique ID for the storage component, and possibly also where within the storage component. For each item, the stored information would then comprise a unique ID for the item, a unique ID for the storage component in which the item is stored or from which it has been removed, and an indicator of whether the item is currently stored within the container space or not. The stored information may further comprise additional information considered useful by a user of the system, such as information relating to persons and/or projects associated with the item or things stored within the item.
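A minimal sketch of such stored information, using an in-memory SQLite table with one row per item; the table layout, column names and replace-on-update policy are illustrative assumptions, not part of the disclosure.

```python
import sqlite3

# One row per item: a unique item ID tied to the item's symbol, the ID of
# the storage component it was last inserted into or removed from, and a
# flag for whether it is currently inside the container space.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE items (
        item_id      TEXT PRIMARY KEY,   -- unique ID tied to the item's symbol
        component_id TEXT,               -- rack/storage component, if known
        inside       INTEGER NOT NULL,   -- 1 = inside container space, 0 = outside
        note         TEXT                -- optional user info (person, project)
    )
""")

def record_event(item_id, component_id, inserted, note=None):
    """Update the database after a determined insertion (inserted=True)
    or removal (inserted=False) of the item."""
    conn.execute(
        "INSERT OR REPLACE INTO items(item_id, component_id, inside, note) "
        "VALUES (?, ?, ?, ?)",
        (item_id, component_id, 1 if inserted else 0, note),
    )

record_event("ITEM-42", "RACK-B", inserted=True, note="project X")
record_event("ITEM-42", "RACK-B", inserted=False)   # later removed again
row = conn.execute(
    "SELECT component_id, inside FROM items WHERE item_id='ITEM-42'"
).fetchone()
# row == ("RACK-B", 0): last seen at RACK-B, currently outside
```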

The system also comprises control circuitry 12. The control circuitry 12 is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera. The processing may comprise the use of image recognition software and/or applying filters or software modules for identifying predetermined features associated with the symbol. In other words, the symbol may have known features which special-purpose filters and/or software modules are used to identify. The symbol of the item may comprise one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition. Such barcodes and OCR numbers have known features which may be used for detection. Barcodes and OCR numbers further enable assigning unique IDs to each item to be stored within the container. Thus, according to some aspects, the processing of the one or more images captured by the at least one camera comprises identifying at least one of a one-dimensional barcode, a two-dimensional barcode, an optical character recognition, OCR, number, and human-readable symbols, letters or digits that can be translated into machine code using optical character recognition.

According to some aspects, the system further comprises a motion filter 15 arranged to determine changes between consecutive images captured by the at least one camera 10a-c. The motion filter 15 thereby facilitates real-time identification of symbols without the need for substantial computational resources. The motion filter reduces the amount of image data that has to be processed by downstream processes. The downstream processes may comprise both the detection of the symbol of the item, as discussed above, as well as further data processing steps in the form of determining if insertion or removal of the item is taking place. While local processing could be performed at the at least one camera in some examples, e.g. a camera may comprise a motion filter, it is typically desirable to transfer (possibly filtered) image data to a computational resource of the control circuitry particularly suitable for processing the downstream image data.
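A simple way to realise such a motion filter is frame differencing: frames that barely differ from their predecessor are discarded. The threshold value and the flat-list frame representation below are assumptions made for the sake of a self-contained sketch:

```python
def motion_filter(frames, threshold=10):
    """Yield only frames that differ sufficiently from the previous frame.
    Frames are flat lists of pixel intensities; `threshold` is the minimum
    summed absolute difference. Static frames are discarded so downstream
    symbol detection sees far less data."""
    prev = None
    for frame in frames:
        if prev is None or sum(abs(a - b) for a, b in zip(frame, prev)) >= threshold:
            yield frame
        prev = frame

static = [5, 5, 5, 5]
moved = [5, 90, 90, 5]
kept = list(motion_filter([static, static, moved, moved]))
print(len(kept))  # 2: the first frame and the frame where motion appeared
```

Real motion filters typically also blur and threshold per-pixel differences, but the data-reduction principle is the same.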

Therefore, according to some aspects, the system further comprises a video encoder 16 and a video decoder 17. The video encoder 16 is configured to encode image data in a compressed format. The video decoder 17 is configured to decode/decompress the compressed format image data. The use of the video encoder 16 and the video decoder 17 enables efficient transfer of image data from one part in the system to another. In particular, the video encoder 16 and the video decoder 17 facilitate splitting up the control circuitry 12 into logically and/or physically separate units, since image data transfer between the separate units of the control circuitry 12 can be performed more efficiently. Encoding the image data may further be used to provide context and/or an optimal format for downstream image processing. The context may facilitate interpretation of the image data. The optimal format may facilitate the use of optimized algorithms to identify features of the image data.
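The encode/decode round trip can be illustrated with a toy run-length codec. This is purely a stand-in chosen so the example stays self-contained; a real system would use an actual video codec (e.g. H.264), not run-length encoding:

```python
def rle_encode(pixels):
    """Run-length encode a flat pixel list: a toy stand-in for the
    role of the video encoder (compress before transfer)."""
    out = []
    for p in pixels:
        if out and out[-1][0] == p:
            out[-1][1] += 1
        else:
            out.append([p, 1])
    return out

def rle_decode(runs):
    """Inverse of rle_encode: the role of the video decoder
    (decompress after transfer)."""
    return [p for p, count in runs for _ in range(count)]

frame = [0, 0, 0, 255, 255, 0]
assert rle_decode(rle_encode(frame)) == frame   # lossless round trip
print(rle_encode(frame))  # [[0, 3], [255, 2], [0, 1]]
```

The point of the sketch is only the division of labour: the encoder runs near the camera, the compact representation crosses the link, and the decoder runs where the heavy processing happens.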

According to some aspects, the system 100 further comprises a memory buffer 18 configured to receive the compressed format image data from the video encoder 16 and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data. The system 100 thereby becomes more stable with respect to downstream processes, which often are more computationally expensive and may not be immediately ready to receive the image data. The buffer 18 can ensure smooth real-time monitoring of the container 110. In case the at least one camera 10a-c comprises a motion filter 15 and/or a video encoder 16, the processed image data may be transmitted to the memory buffer 18 directly from the at least one camera. According to some aspects, the at least one camera 10a-c also comprises the memory buffer 18. In such case, the at least one camera transmits image data from the buffer 18 to downstream processes, such as a potential video decoder 17, when a remaining processing pipeline is ready to process the stored compressed format image data.
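Such a buffer is essentially a bounded producer/consumer queue between the camera side and the processing pipeline. The sketch below uses Python's standard thread-safe queue; the capacity of 64 frames and the function names are assumptions for illustration:

```python
import queue

# The camera side puts compressed frames in as they arrive; the
# processing pipeline takes them out when it is ready for them.
buffer = queue.Queue(maxsize=64)  # bound memory use; size is an assumption

def camera_side(compressed_frame):
    buffer.put(compressed_frame)  # blocks if the pipeline falls far behind

def pipeline_side():
    return buffer.get()           # blocks until a frame is available

camera_side(b"frame-1")
camera_side(b"frame-2")
print(pipeline_side())  # b'frame-1' (frames leave in arrival order)
```

Because `put` blocks when the queue is full, a bounded buffer also provides natural back-pressure instead of unbounded memory growth when the pipeline lags.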

It may sometimes be desirable to split up the processing of the image data over several separate computational resources. Since different computational resources may complete their respective tasks out-of-sync with the order in which the images associated with the image data were recorded, there may be a need to sort at least some of the separately processed images with respect to the chronological order in which they were captured. Therefore, according to some aspects, the control circuitry 12 is further configured to ensure correct chronological order of a plurality of separately processed images. The control circuitry 12 is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol. As stated above, the determined positions form the basis for keeping track of the insertion and removal of items in the container. The position of the symbol may be derived from relationships with items in the environment and/or based on the image of the detected symbol itself.
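The reordering step can be sketched as a small heap-based buffer that releases frames only once every earlier sequence number has arrived. The class name and sequence-number scheme are assumptions made for the example:

```python
import heapq

class Reorderer:
    """Restore capture order for frames processed out-of-sync.
    Each frame carries a sequence number assigned at capture time;
    a frame is released only when all earlier numbers have arrived."""
    def __init__(self):
        self._heap = []
        self._next = 0

    def push(self, seq, frame):
        heapq.heappush(self._heap, (seq, frame))
        released = []
        while self._heap and self._heap[0][0] == self._next:
            released.append(heapq.heappop(self._heap)[1])
            self._next += 1
        return released

r = Reorderer()
print(r.push(1, "B"))  # []: frame 0 not yet seen, so B is held back
print(r.push(0, "A"))  # ['A', 'B']: both can now go out in capture order
```

Timestamps could be used instead of sequence numbers, at the cost of needing a timeout to decide when a gap will never be filled.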

Thus, according to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated relative position of the detected symbol 132 to another symbol 142 and/or an environmental marker 152. With the position of the environmental marker 152 and/or the other symbol 142 being known, a spatial relationship between the detected symbol 132 and the other symbol 142 and/or the environmental marker 152 can be established. For instance, if one camera 10a of the at least one cameras 10a-c is arranged to obtain a bird's-eye view of a region in front of the opening of the container 110, the images of said camera may be used to determine, between the detected symbol 132 and the other symbol 142 and/or the environmental marker 152, a distance and direction in a two-dimensional plane parallel to a floor on which the container 110 is arranged. Stated differently, the estimated relative position of the detected symbol 132 to the other symbol 142 may be used to determine a relative position of the item 130 within the container space 120 and/or within a storage component 140 of the container space, e.g. a storage component 140 comprising the other symbol 142. According to some aspects, the environmental marker 152 is arranged to provide a reference in a global coordinate system in which both the item 130 comprising the detected symbol 132 and the container 110 are arranged. According to some aspects, the other symbol 142 is arranged to provide a reference in a local coordinate system with respect to the container 110. The other symbol 142 may comprise a symbol arranged on a storage component 140 of the container space, e.g. a rack for receiving the item 130.
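In the bird's-eye view case, the distance and direction in the floor plane reduce to plane geometry on pixel coordinates. The following sketch assumes pixel coordinates as input; converting pixels to metres would additionally need the camera's known height and field of view:

```python
import math

def relative_position(symbol_xy, reference_xy):
    """Distance and direction (in radians, measured from the reference
    toward the symbol) in the two-dimensional floor plane seen by a
    bird's-eye view camera. Inputs are (x, y) pixel positions."""
    dx = symbol_xy[0] - reference_xy[0]
    dy = symbol_xy[1] - reference_xy[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

# Detected symbol at pixel (300, 400); rack symbol at pixel (300, 100):
dist, angle = relative_position((300, 400), (300, 100))
print(round(dist), round(math.degrees(angle)))  # 300 90
```

The same computation works whether the reference is another symbol on a rack (local coordinates) or an environmental marker (global coordinates).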

According to some aspects, the image processing of the at least one image comprising the detected symbol is configured to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera 10a-c. If the detected symbol is known to possess certain geometrical properties, such as being square- or rectangular-shaped with known side lengths, and/or having distinctive visual features, such as black and white squares or stripes, the geometrical properties and/or visual features may be used to determine an estimated distance and/or orientation of the detected symbol relative to the at least one camera. Once the symbol has been detected, the image of the symbol can be matched to a translation and/or rotation of a symbol from a reference plane with respect to the at least one camera.
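For the distance part, the known side length can be combined with the apparent size in the image under a simple pinhole-camera model: the farther the symbol, the smaller it appears. The focal length value below is an assumed calibration result, not a disclosed parameter:

```python
def estimate_distance(focal_px, real_side_m, apparent_side_px):
    """Pinhole-camera estimate of symbol distance. `focal_px` is the
    camera's focal length expressed in pixels (obtained by calibration),
    `real_side_m` the symbol's known side length in metres, and
    `apparent_side_px` the measured side length in the image."""
    return focal_px * real_side_m / apparent_side_px

# A 5 cm square barcode seen 100 px wide by a camera with f = 800 px:
print(estimate_distance(800, 0.05, 100))  # 0.4 (metres)
```

Orientation can be estimated similarly by fitting the perspective distortion of the square's corners, which full pose-estimation routines in computer-vision libraries perform.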

One or more positions at known times, e.g. by associating each position with a time stamp, can then be used to form a trajectory. The trajectory may be used to keep track of where the item 130 is in relation to the container 110 as well as providing context as to whether an item that at one time instant appears or disappears in the vicinity of the opening 112 of the container 110 is being inserted or removed from the container space 120. Therefore, the control circuitry 12 is also configured to determine a trajectory of the detected symbol 132 based on a set of determined at least one positions of the detected symbol.

The control circuitry 12 is additionally configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory. Movement of the detected symbol 132 in space and time provides a context which facilitates interpretation of whether the item 130 is being inserted into or removed from the storage space 120. For example, when a symbol of an item is detected within the vicinity of the opening 112 of the storage space, the system has to decide if the item is being inserted or removed (or neither). The system may consult the information stored in the database 14 to see if the item 130 associated with the detected symbol 132 was previously stored in the container space 120 or not. If the item was registered as stored, the system has to determine if the item is to be considered removed. A trajectory of the detected symbol 132 which leads away from the container, e.g. after the detected symbol has disappeared from view of the at least one camera 10a-c, may indicate that the item is considered removed from the container space. A trajectory which makes a narrow U-turn in the vicinity of the opening 112 of the container or a symbol 142 of a storage component 140 of the container space, after which the detected symbol may disappear from view of the at least one camera 10a-c, may be interpreted as the item being taken out temporarily and then returned, thus still being stored within the container space. If two storage components having unique symbols are detected during the trajectory of the item, or rather the detected symbol 132 of the item 130, the trajectory may be used to indicate the item being moved from one storage component to another. Likewise, the trajectory may be used to indicate insertion of an item. If the trajectory of the detected symbol 132 approaches the storage space and then disappears, this may be used to interpret the item 130 having the detected symbol 132 as being inserted.
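The insertion / removal / U-turn interpretations above can be sketched as a very simplified classifier. The one-dimensional model (opening at x = 0, room at positive x) and the classification rules are illustrative assumptions; a real gesture decoder would work on full 2-D or 3-D trajectories with timestamps:

```python
def classify_trajectory(positions, opening_x=0.0):
    """Simplified gesture decoding: `positions` are x-coordinates of the
    detected symbol over time, with the container opening at x = 0 and
    the room at positive x, after which the symbol leaves the view."""
    start, end = positions[0], positions[-1]
    if start > opening_x and end <= opening_x:
        return "insertion"  # approached the opening and vanished inside
    if start <= opening_x and end > opening_x and end == max(positions):
        return "removal"    # came out and kept moving away
    if start <= opening_x and end <= opening_x and max(positions) > opening_x:
        return "u-turn"     # taken out briefly, then returned
    return "unknown"

print(classify_trajectory([2.0, 1.0, 0.2, -0.1]))   # insertion
print(classify_trajectory([-0.1, 0.5, 1.5, 3.0]))   # removal
print(classify_trajectory([-0.1, 0.8, 0.4, -0.2]))  # u-turn
```

Consulting the database as described above would then confirm the interpretation, e.g. a "removal" gesture for an item not registered as stored would be flagged as inconsistent.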

If one or more symbols of corresponding one or more storage components are detected at the same time, the trajectory may further be used to indicate in which storage component the item is stored. In other words, according to some aspects, the control circuitry 12 is further configured to detect a symbol of a storage component of the container, and to determine the insertion or removal of the item and the position of the item relative to a position of a detected symbol of the storage component of the container space.

The control circuitry 12 is yet further configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space.

According to some aspects, at least part of the control circuitry 12 and one or more of the at least one camera 10a-c are comprised in a single logical unit (not shown) such that detecting the symbol of an item, determining the position of the detected symbol, determining the trajectory of the detected symbol and determining insertion or removal of the item and a position of the item relative to a position of the container space are performed within the single logical unit. According to some further aspects, the single logical unit comprises a single board computer. By integrating the camera(s) with the downstream processing performed by the control circuitry, encoding and decoding of image data can be omitted. The single logical unit may further comprise the part of the control circuitry 12 configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. According to some aspects, the single logical unit also comprises the motion filter 15. According to some aspects, the single logical unit further comprises the memory buffer 18.

In some examples the database 14 is comprised in the single logical unit.

It may be desirable to connect several logical units of the type described above, e.g. several single-board computer systems, to the same database. In such cases it may be desirable to avoid performing the same type of updates in the database. In some examples the control circuitry of the respective logical unit is configured to only update the stored information in the database if the information the control circuitry wants to store in the database differs from the currently stored information.
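Such a conditional update can be sketched as a compare-before-write step. The dictionary standing in for the shared database and the function name are assumptions for the example:

```python
def write_if_changed(db, item_id, new_record):
    """Update the shared database only when the record actually differs,
    so several logical units watching the same container do not issue
    duplicate updates. `db` is a plain dict standing in for the real
    database; returns True if a write was performed."""
    if db.get(item_id) == new_record:
        return False  # identical record already stored: skip the write
    db[item_id] = new_record
    return True

db = {"ITEM-0042": {"inside": True, "rack": "RACK-07"}}
print(write_if_changed(db, "ITEM-0042", {"inside": True, "rack": "RACK-07"}))   # False
print(write_if_changed(db, "ITEM-0042", {"inside": False, "rack": "RACK-07"}))  # True
```

Note that with truly concurrent writers a real database would need an atomic compare-and-set rather than this read-then-write sequence.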

Alternatively, the logical units could share a single, external portion of the control circuitry configured to update the stored information in the database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. According to some aspects, the single, external portion of the control circuitry configured to update the stored information in the database is comprised in the database.

Figure 2 is a box diagram of a system 200 for container space allocation monitoring. The illustrated system provides examples of how systems can be arranged to divide the computational tasks from the moment of image capture to updating a database between computational resources of different capacity. The system is here illustrated having a low performance computational subsystem 202 and a high performance computational subsystem 204, as will be illustrated further below.

The system 200 comprises at least one camera 20 placed to face a respective region in front of an opening of a container (not shown). For reasons of convenience, the at least one camera 20 will be discussed in terms of a single camera, but it should be understood that the illustrated examples may comprise more than one camera of any type discussed herein. The system further comprises a database 24 configured to store information relating to items stored in the container and their respective positions within the container space.

The system also comprises control circuitry. The control circuitry handles the computational tasks and will be illustrated as distributed modules for clarity. Though the modules are illustrated as separate entities, it is to be understood that the modules may be integrated into one or more common logical unit(s). The modules can be implemented in any suitable combination of software and hardware.

The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera, as illustrated by a symbol detector and decoder module 221. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol, as illustrated by a symbol positioner module 222. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol as illustrated by a symbol tracker module 223. The control circuitry is yet further configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, as illustrated by a gesture decoder module 224. The control circuitry is additionally configured to update the stored information in the database 24 based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, as illustrated by a database writer module 225.

The system 200 preferably further comprises a motion filter 25 arranged to determine changes between consecutive images captured by the camera 20.

According to some aspects, the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images, as illustrated by a reorder module 226.

According to some preferred aspects, the system further comprises a video encoder 26 and a video decoder 27, as illustrated by a video encoder module 26 and a video decoder module 27, respectively. The video encoder 26 is configured to encode image data in a compressed format. The video decoder 27 is configured to decode/decompress the compressed format image data.

The system may also comprise a memory buffer 28, as illustrated by a buffer module 28, configured to receive the compressed format image data from the video encoder 26 and store the compressed format image data until a remaining processing pipeline is ready to process the stored compressed format image data.

Possible aspects and capabilities of the at least one camera 20, the motion filter 25, the video encoder 26, the memory buffer 28, the video decoder 27, the database 24 as well as implementations of the control circuitry, illustrated at least in part here by the symbol detector and decoder module 221, the reorder module 226, the symbol positioner module 222, the symbol tracker module 223, the gesture decoder module 224 and the database writer module 225, have been discussed in relation to Figure 1 above and apply in the illustrated example of Figure 2 as well. In the illustrated example the camera 20 and any motion filter 25, video encoder 26 and memory buffer 28 are arranged together in a low performance computational subsystem 202. If present, any of the motion filter 25, video encoder 26 and memory buffer 28 may be arranged as part of or integrated with the camera 20. According to some aspects, the camera 20, the motion filter 25, the video encoder 26 and the memory buffer 28 are comprised in a single-board computer system.

Likewise, the symbol detector and decoder module 221, the symbol positioner module 222, the symbol tracker module 223, the gesture decoder module 224, the database writer module 225, and, if present, the reorder module 226 and the video decoder 27 are arranged together in a high performance computational subsystem 204. The high performance computational subsystem 204 may be implemented in a cloud virtual machine. According to some aspects, the database 24 is also comprised in the high performance computational subsystem 204.

The low performance computational subsystem 202 and the high performance computational subsystem 204 are communicatively connected.

Figure 3 is a box diagram of a system 300 for container space allocation monitoring. The illustrated system provides examples of how image processing and downstream aspects of the control circuitry may be integrated with one or more cameras in a single-board computer system. In other words, Figure 3 aims at illustrating how the low performance computational subsystem 202 and the high performance computational subsystem 204 of Figure 2 above may both be comprised in a single-board computer system 303, along with potential advantages.

The system 300 comprises at least one camera 30 placed to face a respective region in front of an opening of a container (not shown). For reasons of convenience, the at least one camera 30 will be discussed in terms of a single camera, but it should be understood that the illustrated examples may comprise more than one camera of any type discussed herein. The system further comprises a database 34 configured to store information relating to items stored in the container and their respective positions within the container space.

The system also comprises control circuitry. The control circuitry handles the computational tasks and will be illustrated as distributed modules for clarity. Though the modules are illustrated as separate entities, it is to be understood that the modules may be integrated into one or more common logical unit(s). The modules can be implemented in any suitable combination of software and hardware.

The control circuitry is configured to detect a symbol of an item based on processing one or more images captured by the at least one camera, as illustrated by a symbol detector and decoder module 321. The control circuitry is further configured to determine a position of the detected symbol based on image processing of at least one image comprising the detected symbol, as illustrated by a symbol positioner module 322. The control circuitry is also configured to determine a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol as illustrated by a symbol tracker module 323. The control circuitry is yet further configured to determine insertion or removal of the item and a position of the item relative to a position of the container space based on the determined trajectory, as illustrated by a gesture decoder module 324. The control circuitry is additionally configured to update the stored information in the database 34 based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space, as illustrated by a database writer module 325.

The system 300 preferably further comprises a motion filter 35 arranged to determine changes between consecutive images captured by the camera 30.

According to some aspects, the control circuitry is further configured to ensure correct chronological order of a plurality of separately processed images, as illustrated by a reorder module 326.

One of the advantages of the examples of Figure 3 with respect to the examples illustrated in relation to Figure 2 is that the video encoder and video decoder can be omitted.

The system may also comprise a memory buffer 38, as illustrated by a buffer module 38, configured to receive image data either directly from the camera 30 or via the motion filter 35 (if present) and store the image data until a remaining processing pipeline is ready to process the stored image data.

Possible aspects and capabilities of the at least one camera 30, the motion filter 35, the memory buffer 38, the database 34 as well as implementations of the control circuitry, illustrated at least in part here by the symbol detector and decoder module 321, the reorder module 326, the symbol positioner module 322, the symbol tracker module 323, the gesture decoder module 324 and the database writer module 325, have been discussed in relation to Figure 1 above and apply in the illustrated example of Figure 3 as well.

The camera 30 and at least the part of the control circuitry comprising the symbol detector and decoder module 321, the reorder module 326, the symbol positioner module 322, the symbol tracker module 323, the gesture decoder module 324 and the database writer module 325 are comprised in a single-board computer system 303.

The database 34 is illustrated as a unit separate from the single-board computer system 303, but may in some examples be comprised in the single-board computer system 303 as well.

It may be desirable to connect several single-board computer systems 303 to the same database 34. In such cases it may be desirable to avoid performing the same type of updates in the database 34. In some examples the database writer module 325 is configured to only update the stored information in the database if the information the database writer module 325 wants to store at the database 34 differs from the currently stored information.

Alternatively, the single-board computer systems 303 could share a single, external database writer module 325 (not shown). According to some aspects, the single, external database writer module is comprised in the database 34 (not shown).

Figure 4 illustrates a method for container space allocation monitoring. The method comprises detecting S10 a symbol of an item based on processing one or more images captured by at least one camera. The method further comprises determining S20 a position of the detected symbol based on image processing of at least one image comprising the detected symbol. The method also comprises determining S30 a trajectory of the detected symbol based on a set of determined at least one positions of the detected symbol. The method additionally comprises determining S40 insertion or removal of the item and a position of the item relative to a position of a container space based on the determined trajectory. The method yet further comprises updating S50 information stored in a database based on the determined insertion or removal of the item and the determined position of the item relative to the position of the container space. The method carries out the steps for which the disclosed system for container space allocation monitoring as described above and below is configured, and consequently has all the technical effects and advantages of the system for container space allocation monitoring.

The present disclosure also relates to a computer program for container space allocation monitoring. The computer program comprises computer program code which, when executed by a processor, causes the processor to carry out the method for container space allocation monitoring as described above and below.