

Title:
IMAGE-BASED SYSTEM AND METHOD FOR SHIPPING CONTAINER MANAGEMENT WITH EDGE COMPUTING
Document Type and Number:
WIPO Patent Application WO/2021/258195
Kind Code:
A1
Abstract:
An image-based automated system for tracking and management of shipping containers in a terminal yard is provided. The system comprises a plurality of Edge Processing Devices and a Central Processing Device. The Edge Processing Devices receive images from yard security cameras, traffic circulation cameras and container handlers, the images being associated with position coordinates. The Edge Processing Devices and the Central Processing Device detect, using machine-learning algorithms, container codes from the images, associate the detected container codes with the position coordinates, and determine in real time the positions of the shipping containers imaged by the cameras at the terminal. During execution of the transit shipment plan, the positions of the shipping containers are compared with the transit shipment plan, and discrepancies are identified between a planned position of one of the shipping containers and the actual position of said shipping container, as previously determined.

Inventors:
IVENS JENNIFER (CA)
IVENS BRUCE (CA)
Application Number:
PCT/CA2021/050849
Publication Date:
December 30, 2021
Filing Date:
June 22, 2021
Assignee:
CANSCAN SOFTWARES AND TECH INC (CA)
International Classes:
G06Q10/08; B61L25/02; H04N7/18
Domestic Patent References:
WO2020124247A12020-06-25
Foreign References:
US20090108065A12009-04-30
US20030191555A12003-10-09
KR102206662B12021-01-22
CN212302516U2021-01-05
CN106203239A2016-12-07
CN110969054A2020-04-07
CN110348451A2019-10-18
IN201621028008A2017-03-24
CN109492449A2019-03-19
Other References:
ANONYMOUS: "OCR in Ports and Terminals", PEMA INFORMATION PAPER, 1 January 2013 (2013-01-01), pages 1 - 22, XP055892971, Retrieved from the Internet [retrieved on 20220217]
ANONYMOUS: "Artificial intelligence to improve terminal efficiencies", CAMCO TIMES, 1 June 2018 (2018-06-01), pages 3 - 23, XP055892974, Retrieved from the Internet [retrieved on 20220217]
ZHANG RAN; BAHRAMI ZHILA; WANG TENG; LIU ZHENG: "An Adaptive Deep Learning Framework for Shipping Container Code Localization and Recognition", IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, IEEE, USA, vol. 70, 13 August 2020 (2020-08-13), USA, pages 1 - 13, XP011820301, ISSN: 0018-9456, DOI: 10.1109/TIM.2020.3016108
Attorney, Agent or Firm:
ROBIC (CA)
Claims:
CLAIMS

1. An image-based automated method for tracking and management of shipping containers in a terminal yard, the method comprising:
receiving a transit shipment plan for shipping containers arriving soon or already at the terminal yard;
receiving images from yard security cameras and/or traffic circulation cameras, the images being associated with yard-camera-position coordinates, at least some of the images showing shipping containers provided with container identification codes;
receiving additional images from cameras mounted onto container handlers, the additional images being associated with container-handler-position coordinates, at least some of the images showing shipping containers provided with container identification codes;
detecting, using machine-learning algorithms, the container codes from the images received from at least one of the yard security cameras, the traffic circulation cameras and from the container handler cameras, and associating the detected container codes with the yard-camera-position and/or the container-handler-position coordinates, and deriving therefrom, in real time, respective positions of the shipping containers imaged by the cameras;
sending transit shipment instructions to operator-interfaces of the container handlers according to the transit shipment plan;
comparing, during execution of the transit shipment plan by the container handlers, the positions and the container codes of the shipping containers with the transit shipment plan and identifying discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as previously determined; and
sending an alert or a warning message to one of the operator-interfaces when a discrepancy has been identified.

2. The method according to claim 1, wherein at least some of the images received show railcars provided with railcar identification codes, the railcars transporting shipping containers that need to be loaded onto transport trucks by the container handlers, the method further comprising: detecting, using the machine-learning algorithms, the railcar codes; and identifying discrepancies between a planned position of a given container on a given one of the railcars, as per the transit shipment plan, and the actual position of said shipping container relative to the given railcar, as previously determined.

3. The method according to claim 1 or 2, wherein said images showing railcars provided with railcar codes are associated with a main rail track or with one of several spur rail tracks, the method further comprising: comparing, during execution of the transit shipment plan, the position of the railcars relative to the main and spur rail tracks of the terminal yard, with the transit shipment plan, and identifying discrepancies between a planned position of one of the railcars on one of the main and spur rail tracks, as per the transit shipment plan, and the actual position of said railcar.

4. The method according to claim 3, the method further comprising: comparing, during execution of the transit shipment plan, the position of a given railcar relative to other railcars on the main and/or spur rail tracks, with the transit shipment plan, and identifying discrepancies between a planned position of one of the railcars relative to said other railcars, as per the transit shipment plan, and the actual position of said railcar.

5. The method according to claims 3 or 4, comprising: detecting, using machine-learning algorithms, track identification codes and dimensions of the railcars imaged by the cameras; and logging, in real time, records associating: railcar codes with track identification codes, position of a given railcar on the main or spur rail tracks; and dimension of a given railcar; and shipping container codes with railcar identification codes and/or track identification codes; whereby a Terminal Operation System application can access the records to verify, confirm and report shipping container movements, in real time, versus the transit shipment plan.

6. The method according to any one of claims 1 to 5, wherein the images from the cameras mounted onto the container handlers are sent to Edge Processing Devices, the detection of the container codes being performed locally, by the Edge Processing Devices, or remotely, through Cloud-Based Processing Servers accessed via the Edge Processing Devices.

7. The method according to any one of claims 1 to 6, wherein the container handlers may comprise one or more intermodal container cranes, such as Rubber Tyre Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes, and one or more mobile intermodal container handlers, such as stackers and lifter vehicles.

8. The method according to any one of claims 1 to 7, comprising sending instructions to Programmable Logic Controllers (PLCs) of the container handlers to control operation thereof, based on the discrepancies identified between the planned position of the shipping container being handled, as per the transit shipment plan, and the actual position of said shipping container.

9. The method according to claim 8, wherein the instructions to PLCs comprise instructions that prevent clamping, lifting or moving said shipping container determined as incorrectly selected, following the comparison of the shipping container’s position with the transit shipment plan.

10. The method according to any one of claims 1 to 9, wherein comparing the positions of the shipping containers with the transit shipment plan is performed by a Central Processing Device and communicated to the Edge Processing Devices located at different locations in the terminal yard.

11. The method according to claim 5, comprising assessing a physical condition of the shipping containers by processing the images from the yard security cameras and/or the container-handler mounted cameras through convolutional neural network algorithms and logging said physical condition for access by the TOS.

12. An image-based automated system for tracking and management of shipping containers in a terminal yard, the system comprising:
a Terminal Operation System (TOS) interface for receiving the transit shipment plan for shipping containers arriving soon or already at the terminal yard;
a plurality of Edge Processing Devices, each comprising one or more processors, memory, a communication module and a position module, the Edge Processing Devices being configured for: receiving images from yard security cameras and/or from traffic circulation cameras, the images being associated with yard-camera-position coordinates, at least some of the images showing shipping containers provided with container identification codes; receiving additional images from cameras mounted onto container handlers, the additional images being associated with container-handler-position coordinates, at least some of the images showing other shipping containers provided with container codes;
a Central Processing Device comprising one or more processors, memory and a communication module, the Central Processing Device being in communication with the TOS and with the Edge Processing Devices;
the Edge Processing Devices being configured to detect, using machine-learning algorithms, the container codes from the images received from at least one of the yard security cameras, the traffic circulation cameras and from the container handler cameras, to associate the detected container codes with the yard-camera-position and/or the container-handler-position coordinates, to derive therefrom, in real time, respective positions of the shipping containers and to associate the detected container codes with the derived positions, thereby determining, in real time, the positions of the shipping containers imaged by the cameras;
the Central Processing Device being configured to: send transit shipment instructions to operator-interfaces of the container handlers according to the transit shipment plan; compare, during execution of the transit shipment plan by the container handlers, the positions of the shipping containers with the transit shipment plan and identify discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as previously determined; and send an alert or a warning message to one of the operator-interfaces when a discrepancy has been identified.

13. The system according to claim 12, wherein the Central Processing Device and/or the Edge Processing Devices comprise a Container Yard Video Management System interface, to access the images and/or image data from the yard security cameras.

14. The system according to claim 12 or 13, further comprising a cloud-based shipping container ID and Inspection module, the Edge Processing Devices accessing the cloud-based module to process the images received.

15. The system according to any of claims 12 to 14, wherein at least some of the Edge Processing Devices further comprise one or more video cameras within the same enclosure.

16. The system according to any of claims 12 to 15, wherein at least some of the Edge Processing Devices can communicate directly, via wired or wireless communication links, with the cameras mounted onto the container handlers.

17. The system according to any of claims 12 to 16, wherein the communication modules of the Edge Processing Devices include one or more of the following interfaces: a Wi-Fi interface, a Bluetooth interface, a 4G or 5G interface.

18. The system according to any of claims 12 to 17, wherein the Edge Processing Devices include a PLC interlock control module, to communicate instructions and statuses to PLCs of the container handlers.

19. The system according to any of claims 12 to 18, wherein the Central Processing Device and/or the Edge Processing Device comprise a shipping container status module to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers.

20. The system according to any of claims 12 to 19, comprising one or more databases to store records which associate: railcar identification codes with track identification codes, position of a given railcar on the main or spur rail tracks; and dimension of a given railcar; shipping container codes with railcar identification codes and/or track identification codes.

21. A non-transitory storage medium, for storing executable instructions for causing one or more processing devices to:
receive a transit shipment plan for shipping containers arriving soon or already at the terminal yard;
receive images from yard security cameras and from traffic circulation cameras, the images being associated with yard-camera-position coordinates, at least some of the images showing shipping containers provided with container codes;
receive images from cameras mounted onto container handlers, the images being associated with container-handler-position coordinates, at least some of the images showing other shipping containers provided with container codes;
detect, using machine-learning algorithms, container codes from the images received from the yard security cameras, the traffic circulation cameras and from the container handler cameras, and associate the detected container codes with the position coordinates, thereby determining, in real time, the position of the shipping containers imaged by the cameras;
send transit shipment instructions to operator-interfaces of the container handlers according to the transit shipment plan;
compare, during execution of the transit shipment plan by the container handlers, the positions of the shipping containers with the transit shipment plan and identify discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as previously determined; and
send an alert or a warning message to one of the operator-interfaces when a discrepancy has been identified.

Description:
IMAGE-BASED SYSTEM AND METHOD FOR SHIPPING CONTAINER MANAGEMENT WITH EDGE COMPUTING

RELATED APPLICATION

The present application claims the benefit of US Provisional Application No. 63/042,151 filed June 22, 2020, the entire disclosure of which is hereby incorporated by reference in its entirety.

TECHNICAL FIELD

[001] The present invention generally relates to the field of cargo shipping containers, and more particularly to an image-based system and method for managing transport and handling of shipping containers at terminals.

BACKGROUND

[002] With the rapidly growing world population, international trade and globalization have become the foundation of the global economy. Shipping container maritime transport is the primary means by which general cargo is transported throughout the world. With over 38 million twenty-foot equivalent containers in the global fleet, the shipping container is one of the most important assets of international trade.

[003] Most terminals use Terminal Operating Systems (referred to as TOS) to plan, manage and control the flow of containers at their yards, gates, rails and loading and unloading stations. However, such systems lack end-to-end tracking of the containers on site. Systems exist that use handheld OCR devices operated by terminal staff, or Radio Frequency tagging of containers, to track the containers within terminals. However, such systems are expensive to implement, poorly integrated with existing TOS and do not allow real-time tracking of intermodal displacement of shipping containers. Existing systems also fail to provide feedback, such as warnings and alerts, to operators and/or to handling equipment in the case of mishandled shipping containers.

[004] There is a need for improving the current methods and systems to allow for faster recognition and cycling of shipping containers within terminals.

SUMMARY

[005] According to an aspect, an image-based automated method is provided, for tracking and managing shipping containers in a terminal yard. The method comprises receiving a transit shipment plan for shipping containers arriving soon or already at the terminal yard. The method also comprises receiving images from at least one of yard security cameras and traffic circulation cameras. The images are associated with yard-camera-position coordinates. The method may also comprise receiving additional images from cameras mounted onto container handlers. At least some of the images received show shipping containers provided with container codes. The method comprises detecting, using machine-learning algorithms, the container codes from the images received from the yard security cameras, the traffic circulation cameras and from the container handler cameras. The detected container codes are associated with the position coordinates. The coordinates can correspond to the yard-camera-position and/or the container-handler-position coordinates. The method may comprise deriving, from the yard-camera-position and/or the container-handler-position coordinates, in real time, the respective positions of the shipping containers imaged by the cameras. Transit shipment instructions are sent to operator-interfaces of the container handlers, according to the transit shipment plan. During execution of the transit shipment plan by the container handlers, the positions of the shipping containers and the container codes are compared with the transit shipment plan. Discrepancies between the planned position of a given shipping container, as per the transit shipment plan, and the actual position of said shipping container, as previously determined, are identified and reported. Alerts and/or warning messages can be sent to one of the operator-interfaces when a discrepancy has been identified.
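
The discrepancy-identification step described above can be sketched as a comparison of two maps, one from the transit shipment plan and one built from camera observations. All names, the yard coordinate scheme and the container codes below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Position:
    row: str    # yard row label, e.g. "A" (assumed coordinate scheme)
    bay: int    # bay number along the row
    tier: int   # stacking height

def find_discrepancies(plan, observed):
    """Return (container_code, planned, actual) for each misplaced container."""
    discrepancies = []
    for code, planned in plan.items():
        actual = observed.get(code)
        # A container not yet imaged is not a discrepancy; a container imaged
        # at a position different from the plan is.
        if actual is not None and actual != planned:
            discrepancies.append((code, planned, actual))
    return discrepancies

plan = {"MSCU1234565": Position("A", 12, 1), "TGHU7654321": Position("B", 3, 2)}
observed = {"MSCU1234565": Position("A", 12, 1), "TGHU7654321": Position("C", 5, 1)}
alerts = find_discrepancies(plan, observed)
```

In a deployment, each entry of `alerts` would trigger the alert or warning message sent to the relevant operator-interface.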

[006] In possible implementations, some of the images received show railcars provided with railcar codes. The railcars transport shipping containers that need to be loaded onto transport trucks by the container handlers. In this case, the method further comprises the detection, using the machine-learning algorithms, of railcar codes and the identification of discrepancies between the planned position of a given container on a railcar, as per the transit shipment plan, and the actual position of said shipping container relative to the railcar, as previously determined.

[007] In possible implementations, images showing railcars provided with railcar codes are associated with a main rail track or with one of several spur rail tracks. In this case, during execution of the transit shipment plan, the position of the railcars relative to the main and spur rail tracks of the terminal yard are compared with the transit shipment plan, and discrepancies between the planned position of a railcar, as per the transit shipment plan, and its actual position are identified and reported.

[008] In possible implementations, track identification codes and dimensions of the railcars imaged by the cameras are detected, using machine-learning algorithms. Preferably, records associating railcar identification codes; rail track identification codes; the position of a given railcar on the main or spur rail tracks; the dimensions of a given railcar; and shipping container codes are continuously logged, in real time. The Terminal Operation System application can access the records to verify, confirm and report shipping container movements, in real time, versus the transit shipment plan.
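
A minimal sketch of one such logged record follows. The field names, the railcar code and the newline-delimited JSON transport are assumptions for illustration; the patent specifies only which associations the records hold.

```python
import json
import time

def make_record(railcar_code, track_code, position_m, length_m, container_codes):
    # One observation record associating a railcar with a track, its position
    # on that track, its measured dimensions and the container codes it carries.
    return {
        "timestamp": time.time(),        # when the observation was logged
        "railcar": railcar_code,
        "track": track_code,
        "position_m": position_m,        # distance along the track, in metres
        "railcar_length_m": length_m,
        "containers": container_codes,   # container codes detected on this railcar
    }

log = []
log.append(make_record("TTX 123456", "SPUR-2", 145.0, 18.3, ["CSQU3054383"]))

# A Terminal Operation System application could consume the log, for example
# as newline-delimited JSON:
serialized = "\n".join(json.dumps(r) for r in log)
```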

[009] Depending on the implementation, the images from the cameras mounted onto the container handlers are sent to Edge Processing Devices and the detection of the container codes is performed locally, by the Edge Processing Devices. Alternatively, image processing is performed remotely, through Cloud-Based Processing Servers, via the Edge Processing Devices. A combination of local and cloud-based processing is also possible.
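
The local/remote split above can be sketched as a per-frame routing decision. The queue-depth policy below is a hypothetical heuristic chosen for illustration; the patent only states that detection may run on the Edge Processing Devices, on Cloud-Based Processing Servers, or both.

```python
def route_inference(frame, local_queue_depth, max_local_depth=8):
    """Decide where container-code detection runs for one frame.

    Offload to the cloud only when the edge device's inference queue is
    saturated (assumed policy), so latency stays low in the common case.
    """
    if local_queue_depth < max_local_depth:
        return "edge"    # process on the Edge Processing Device itself
    return "cloud"       # offload to a Cloud-Based Processing Server
```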

[0010] In possible implementations, the container handlers may comprise one or more intermodal container cranes, such as Rubber Tyre Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes, and one or more mobile intermodal container handlers, such as stackers and lifter vehicles.

[0011] In possible implementations, the method comprises sending digital instructions to an on-machine controller of the container handlers, such as a Programmable Logic Controller (PLC), to control operation thereof based on the discrepancies identified between the planned position of the shipping container being handled, as per the transit shipment plan, and the actual position of said shipping container. For example, the instructions can prevent clamping, lifting or moving a given shipping container determined as incorrectly selected, following the comparison of the shipping container’s position with the transit shipment plan.

[0012] Depending on the implementation, comparing the positions of the shipping containers with the transit shipment plan can be performed by the terminal Central Processing Device and communicated to the Edge Processing Devices.
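
The PLC interlock described above can be sketched as follows. The message format and action names are hypothetical; a real deployment would use whatever command set the handler's PLC exposes.

```python
def interlock_command(detected_code: str, planned_code: str) -> dict:
    """Build the instruction sent to the handler's PLC for one pick attempt."""
    if detected_code == planned_code:
        return {"action": "ALLOW", "code": detected_code}
    # Wrong container under the spreader: inhibit clamping/lifting/moving
    # and report which container the plan expected.
    return {"action": "INHIBIT_CLAMP",
            "code": detected_code,
            "expected": planned_code}
```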

[0013] In possible implementations, the method may also comprise assessing a physical condition of the shipping containers by processing the images from the yard security cameras and/or the container-handler mounted cameras through convolutional neural network algorithms, to detect defects on the containers, such as rust, deformations, and the state of the container’s handles and security seals. Preferably, the physical condition of the containers is stored in a database, for access by the TOS and/or by the Central Processing Device.
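
One way the per-image defect predictions could be merged into a single condition record is sketched below. `classify_defects` is a stand-in for the convolutional-network inference, and the defect labels and record shape are assumptions for illustration.

```python
def classify_defects(image):
    # Placeholder for CNN inference; would return labels such as
    # "rust", "deformation", "damaged_handle", "missing_seal".
    raise NotImplementedError

def assess_condition(predictions_per_image):
    """Merge defect labels from several camera views into one record."""
    # Union of labels across views, sorted for a deterministic record.
    defects = sorted({d for preds in predictions_per_image for d in preds})
    return {"condition": "damaged" if defects else "good", "defects": defects}
```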

[0014] According to another aspect, a system for implementing the method described above is provided. The system comprises a Terminal Operation System (TOS) interface for receiving the transit shipment plan for shipping containers arriving soon or already at the terminal yard. The system also comprises a plurality of Edge Processing Devices, each comprising one or more processors, memory, a communication module and a position module. The Edge Processing Devices are configured to receive images from cameras mounted onto container handlers. The system also comprises a Central Processing Device with one or more processors, memory/storage means and a communication module. The Edge Processing Devices are configured to detect and recognize, using machine-learning algorithms, container identification codes from the images received from the yard security cameras, the traffic circulation cameras and from the container handler cameras, to associate the detected container codes with the yard-camera-position and/or the container-handler-position coordinates, to derive therefrom, in real time, respective positions of the shipping containers and to associate the recognized container codes with position coordinates, so as to determine, in real time, the position of the shipping containers imaged by the cameras. The Central Processing Device and/or the Edge Processing Devices are further configured to send transit shipment instructions to operator-interfaces of the container handlers according to the transit shipment plan; and to compare, during execution of the transit shipment plan by the container handlers, the positions of the shipping containers with the transit shipment plan and identify discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as determined.
The Central Processing Device sends, directly or via the Edge Processing Devices and/or the TOS, an alert or a warning message to one of the operator-interfaces when a discrepancy has been identified.

[0015] Depending on the implementation, the Central Processing Device and/or the Edge Processing Devices comprise a Container Yard Video Management System interface, to access the images from the yard security cameras.

[0016] In possible implementations, the system comprises a cloud-based shipping container ID and Inspection module. The Central Processing Device and/or the Edge Processing Devices have access to the cloud-based module, such that the images received can be processed remotely with trained machine-learning algorithms and models.

[0017] In possible implementations, the Edge Processing Devices may include one or more video cameras within their enclosures, or alternatively, the Edge Processing Devices can communicate, via wired or wireless communication links, with the cameras mounted onto the container handlers. The communication modules of the Edge Processing Devices may include one or more of the following interfaces: a Wi-Fi interface, a Bluetooth interface, a 4G or 5G interface. In possible implementations, the system is adapted to interface with video cameras having an embedded Graphics Processing Unit (GPU) and/or Tensor Processing Unit (TPU). An object detection algorithm (such as container detection) can be executed by the embedded GPU and/or TPU in the camera, which converts the video into frames. In such implementations, only the images having containers therein are then sent for container identification code recognition and/or defect detection.
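
The in-camera pre-filtering described above reduces to passing on only frames in which the embedded detector found a container. In this sketch, `detect_container` stands in for the GPU/TPU object-detection model, which is not specified here.

```python
def frames_with_containers(frames, detect_container):
    """Yield only frames where the embedded detector found a container.

    `detect_container` is assumed to return a truthy value (e.g. a list of
    bounding boxes) when a container is present in the frame.
    """
    for frame in frames:
        if detect_container(frame):
            yield frame   # forward for code recognition / defect detection
```

Downstream code recognition then runs on a fraction of the raw video stream, which is what makes the low-power edge deployment practical.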

[0018] In some implementations, the Edge Processing Devices include a PLC interlock control module, to communicate instructions and statuses to PLCs of the container handlers.

[0019] The Central Processing Device and/or the Edge Processing Devices preferably comprise a shipping container status module to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers.

[0020] Depending on the implementation, the system may include its own databases to store records associating the railcar identification codes, track identification codes, position of railcars, dimensions of railcars; and shipping container codes. Alternatively, the system can access a third party database to access, update and/or retrieve railcar and shipping container related records.

[0021] According to another aspect, a non-transitory storage medium is provided, for storing executable instructions for causing one or more processing devices to perform the steps of the method described above.

[0022] Other features and advantages of the embodiments of the present invention will be better understood upon reading of preferred embodiments thereof with reference to the appended drawings. It will be noted that the different steps of the process defined above can be combined and executed in a different order than the order defined above, and that some or all of the steps can be performed, depending on the implementation. Similarly, different embodiments of the system can include all or some of the components and modules defined above.

BRIEF DESCRIPTION OF THE DRAWINGS

[0023] FIG. 1 is a block diagram of possible components of an image-based system for shipping container management, and of their interactions, according to a possible implementation.

[0024] FIG. 2 is a schematic diagram of possible components of an image-based system for shipping container management, according to a possible implementation.

[0025] FIG. 3 is a schematic diagram showing components of the system of FIG. 2, for tracking railcars and shipping containers mounted thereon.

[0026] FIG. 4 is a schematic diagram, showing cameras positioned to capture images of railcars transporting shipping containers, as they are being shunted from the main rail track to the spur rail track.

[0027] FIG. 5 is a schematic diagram, showing cameras positioned on RTGs or other handling vehicles to capture images of railcars moving on terminal rail tracks, from which containers are moved to and from hauling trucks.

[0028] FIG. 6 is a table summarizing the different steps for mapping rail lines, rail cars and shipping containers in a terminal.

[0029] FIG. 7 is a schematic diagram of components of the system of FIG. 2, for a Rubber Tyred Gantry Crane, according to an exemplary implementation.

[0030] FIG. 8 is a schematic diagram of possible steps of the image-based shipping container management process, for the detection of shipping container codes, including an audio or visual operator interface within the container handling equipment.

[0031] FIG. 9A is a flow chart of an exemplary process for detecting a shipping container code displayed vertically on the side of the container.

[0032] FIG. 9B is a functional diagram of modules for shipping container code recognition and seal detection and condition assessment, according to an exemplary implementation.

[0033] FIG. 10 is a schematic diagram of components of the system of FIG. 1, for stacking equipment, according to an exemplary implementation.

[0034] FIG. 11 is a schematic diagram showing container code validation by a container handling vehicle operator, following the AI-based container code recognition by the shipping container management system.

[0035] It should be noted that the appended drawings illustrate only exemplary embodiments of the invention and are therefore not to be construed as limiting of its scope, for the invention may admit to other equally effective embodiments.

DETAILED DESCRIPTION

[0036] The proposed system and its associated method relate to an image-based, automated method for tracking and managing shipping containers in a terminal. The system and method use machine-learning algorithms to recognize and detect shipping container and railcar identification codes from images captured by yard security cameras, traffic circulation cameras and/or cameras mounted onto container handlers, and associate the detected codes with position coordinates, to track in real time, the position of the shipping containers imaged by the cameras. Tracking of the shipping containers can be compared with transit shipment plans, to make sure shipping containers transit within the terminal as planned.
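
Container identification codes follow ISO 6346 (a four-letter owner/category prefix, six digits, and a check digit). Validating the check digit is a standard way to reject misrecognized codes before they enter the tracking records; the patent does not prescribe this step, so the sketch below is supplementary.

```python
def iso6346_check_digit(code: str) -> int:
    """Compute the ISO 6346 check digit over the first 10 characters."""
    def char_value(ch):
        if ch.isdigit():
            return int(ch)
        i = ord(ch) - ord("A")
        # Letter values run 10..38, skipping multiples of 11 (11, 22, 33).
        return i + 10 + (i + 9) // 10
    total = sum(char_value(ch) * 2 ** pos for pos, ch in enumerate(code[:10]))
    return total % 11 % 10   # a remainder of 10 maps to check digit 0

def is_valid_container_code(code: str) -> bool:
    """True if an 11-character code's last digit matches its check digit."""
    return (len(code) == 11 and code[10].isdigit()
            and iso6346_check_digit(code) == int(code[10]))
```

For example, `CSQU3054383` (the worked example commonly used for this standard) validates, while the same prefix with any other final digit is rejected.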

[0037] The proposed system and method have been developed to provide organized large-scale management of container movement and storage on a container terminal site. The system comprises several compact mobile machine vision and artificial intelligence devices, also referred to as edge processing devices. The edge processing devices are designed for applications such as container handling vehicles. They detect and recognize, using AI algorithms, the alphanumeric characters which may be painted, stencilled or posted (as on warning signs) on surfaces of stationary or moving shipping containers and railcars. The proposed system also comprises a central processing device that may be integrated with other supervisory platforms, such as a Terminal Operating System (TOS) and Terminal Security Management (TSM) applications, to exchange data that will improve efficiency of movement and will assemble progress updates to ensure loading plans are being followed or corrected, as necessary.

[0038] Machine vision cameras produce images in video or single-frame format and feed the edge computing devices in an Edge Computing configuration. Location coordinates are derived from image computing and global positioning (GPS) devices, and digital information is delivered to local management systems or to web services for additional processing. Shipping container data can thereby be delivered to, and integrated in, a database system on the terminal site or on a preferred web-based site.
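
One simple way to turn a device's GPS fix into a yard location, as in the coordinate derivation above, is to snap the fix to a grid of storage slots. The origin, slot sizes and the flat lat/lon grid below are illustrative assumptions; a real terminal would use a calibrated projection into its own yard coordinate system.

```python
def gps_to_yard_slot(lat, lon, origin=(45.5000, -73.5500),
                     slot_size_deg=(0.00005, 0.00007)):
    """Snap a GPS fix to a yard grid cell (row, bay).

    `origin` is the assumed south-west corner of the yard and
    `slot_size_deg` the assumed angular extent of one slot.
    """
    row = int((lat - origin[0]) / slot_size_deg[0])
    bay = int((lon - origin[1]) / slot_size_deg[1])
    return row, bay
```

A container code recognized in a frame would then be logged against the slot returned for the capturing camera's (or handler's) fix at capture time.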

[0039] The proposed system is designed to use a minimal level of electrical power, which permits its application in remote installations that must operate using energy generated and stored locally. Possible implementations of the system include attachment of edge processing devices to heavy mobile equipment working in shipping container depots and to fixed installations monitoring rail traffic entering and parking on yard loading spur tracks. The detection and inspection capacity of the edge processing device can be selected to meet the specific needs of terminals and may vary from simple code recognition and container tracking to complex damage detection and classification tasks.

[0040] In possible implementations, the system comprises a combination of edge processing/computing devices and web service platforms. The selection of the system’s configuration will vary according to the terminal site’s infrastructure, opportunities for data communication, and interface needs with other installations and third-party platforms. The edge processing devices are provided with communication means, such as Wi-Fi or digital cellular communication, and where data is considered sensitive, encryption techniques can be applied.

[0041] A typical container depot or rail yard is managed through a central Terminal Operating System (TOS). The proposed system can be integrated with the TOS to provide a two-way exchange of data which permits the TOS to verify, confirm and report actual events versus original plans generated for rail car and container deliveries.

[0042] The proposed system can also be integrated with a site security management system. Video and image frames can be directed to the edge processing devices, to provide data that can be used to identify the position of containers and equipment in real time.

[0043] Referring to FIG.1 , a block diagram of possible components of the proposed automated system for tracking and management of shipping containers in a terminal yard 100 are shown. The system 100 comprises a central processing device 200 and a plurality of satellite edge processing devices 300, which may be mobile or fixed. The central processing device 200 can consist of one or more computers or servers, provided with processors, memory and communication interfaces. The central processing device may reside locally, at the terminal site, or remotely, in a cloud-based implementation. A software application runs on the central processing device. The software application communicates with and controls the plurality of edge processing devices 300.

[0044] The terminal staff receive various types of communication from freight companies, providing instructions related to the delivery of shipping containers scheduled to pass through the terminal. These instructions are combined and converted into an electronic file format, such as .csv, which is then entered into the TOS as a portion of the overall transit shipment plan (C). According to a possible implementation of the proposed system and method, execution of the plan within the boundaries of the terminal yard results in a series of simple single-container-move communications between the central processing device, the TOS and one of the field edge processing devices. An example could be an instruction, sent as a digital message from the TOS, via the central processing device, to an edge processing device, to find a specific container and move it to a defined coordinate near the dock. The individual shipping container moves, or displacements, are part of the overall transit shipment plan, which at this point is only concerned with the terminal yard. The shipping instructions relate to terminal yard operation only and include information such as the container identification code, related railcar and delivery coordinates within the yard boundaries. The TOS also typically has access to the Container Yard Inventory 16, which lists all shipping containers on site and the different container handling equipment and vehicles available. Under the management of the TOS and the central processing device 200, the coordinates of all onsite units (including, for example, shipping containers and container handlers) are logged and stored in a database or similar system, for quick reference (B). The central processing device 200 and/or the edge processing devices can also communicate with the Terminal Security Management application 20 (identified in FIG. 1 as a Container Yard Video Management System).
The application 20 receives, from various yard security cameras, images of different regions of the terminal yard, which may or may not comprise containers. In possible implementations, the central processing device comprises a Container Yard Video Management System interface, to access the images from the yard security cameras and/or additional data, such as the position and/or state (on/off, defective, etc.) of the cameras. The TOS and/or Container Yard Video Management System interfaces can, for example, include APIs and/or web services used to communicate and exchange data with the TOS and Video Management System. The TOS manages the distribution of work tasks to all site container transporting equipment, via the central processing device. The container transporting equipment is equipped with an edge processing device 300, which can communicate with the central processing device and, preferably, with the edge processing devices of other equipment. The container move instructions are sent to operator interfaces, and advantageously, alerts and messages are also sent to the operator interfaces, via the edge processing devices, to inform operators, in real time, when there are discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container.

[0045] The edge processing devices 300 are modules or units, provided with one or more processors, memory means and communication modules. The edge processing device 300 communicates with one or more cameras 380, which can include security yard cameras, traffic circulation cameras, cameras mounted onto container handlers, or its own dedicated camera(s), either directly or via the Video Management System. The communication modules may include one or more of the following interfaces: a Wi-Fi interface, a Bluetooth interface, or a 4G/5G interface, to communicate with the central processing device, with the cameras located close to the edge device and/or with other edge processing devices. The edge processing devices have access to image data from onsite cameras. The different types of cameras, including equipment-mounted cameras, image the containers and return the image data to the edge devices 300, which process the images and extract the container identification details (F). The edge processing devices 300 communicate with the TOS, either directly or via hubs or other processing devices which may act as hubs. The edge processing devices 300 can also return data relating to the progress and accuracy of the executed portions of the plan (E). Yard security cameras can also send video images to the video management system 20, and the yard security video images can be used by the central processing device to track shipping containers on site (G, H). The edge processing devices each comprise one or more processors, memory, a communication module and a position module. The edge processing devices are thus configured to receive images from different types of cameras, including yard security cameras, traffic circulation cameras and cameras mounted onto container handlers. The images are associated with yard-camera-position and/or container-handler-position coordinates.
This association is preferably performed by the edge processing devices, but depending on the implementation, the central processing device can also associate the images with positional coordinates in the yard, to derive the positions of the containers detected in the images. At least some of the images show the shipping containers provided with container codes. In possible implementations, the images can be analysed by the edge processing devices to first detect shipping containers in the images or video frames (as indicated in module 366 on the left side of FIG. 9, by recognizing the shape of containers), and then detect container codes (with container identification module 362). When a container code has been recognized by the machine learning algorithms executed at the edge processing devices, the container code is associated with the coordinates (such as GPS coordinates or another position indicator) of the camera that captured the image. The association of the shipping container code with a position in the terminal yard can be stored temporarily by the edge processing device and transmitted to the TOS, which can update its own database. The respective positions of all shipping containers imaged by the cameras, for which the container code has been successfully detected, can thus be derived from the captured images. Advantageously, in some implementations, no additional or specialized cameras are needed, as images captured by existing security and/or traffic cameras can be used and their content is leveraged for tracking, in real time, the positions of the shipping containers on terminal sites. In possible implementations, the edge processing devices and/or the cloud-based application can interface with video cameras having an embedded Graphical Processing Unit (GPU) and/or Tensor Processing Unit (TPU). TPUs are AI-accelerator application-specific integrated circuits (ASICs) specifically adapted and configured for neural network machine learning.
The system 100 can be designed and adapted to interface and/or integrate with such cameras. An object detection algorithm (such as container detection) is executed by the embedded GPU and/or TPU in the camera, which converts video to frames. When using such cameras, only the images having containers therein can be sent to the cloud-based container identification code module 362 and/or defect detection module 266 for further analysis.
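By way of illustration only, the two operations described above, tagging a recognized container code with the coordinates of the capturing camera, and camera-side filtering so that only frames containing a container are forwarded, can be sketched as follows. This is a minimal Python sketch; all names, the example code value and the coordinate format are assumptions, not part of the specification:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    container_code: str   # e.g. "MSCU1234567" (hypothetical ISO 6346-style code)
    confidence: float     # detector confidence in [0, 1]

@dataclass
class PositionedCode:
    container_code: str
    lat: float
    lon: float

def associate_code_with_position(detection, camera_lat, camera_lon):
    """Tag a recognized container code with the coordinates of the
    camera (or container handler) that captured the image."""
    return PositionedCode(detection.container_code, camera_lat, camera_lon)

def frames_to_forward(frames, has_container):
    """Mimic an embedded-GPU/TPU camera: only frames in which a
    container was detected are forwarded for code recognition."""
    return [f for f in frames if has_container(f)]
```

The positioned record would then be transmitted to the TOS database, as described above.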

[0046] In possible implementations, the central and/or the edge processing devices 200, 300 have access to cloud-based AI algorithms 400, which can process the images captured by the various cameras, proceed with container and/or railcar identification code recognition, and return the detected identification code and the associated position coordinate, based on either metadata tagged within the image or GPS coordinates provided by the edge processing devices (I). In other possible implementations, the container and railcar code detection can be performed entirely locally, at the terminal, by the edge processing devices, or partly by the edge processing devices and partly by the central processing device. The TOS can be configured to send the transit shipment instructions to operator-interfaces (e.g. graphical user interfaces) of the container handlers according to the transit shipment plan, via the central processing device 200. The central processing device 200 can be configured to provide text alarms during execution of the transit shipment plan by the container handlers. For example, should a handler move to the instructed coordinates but be unable to find the container number identified for that coordinate, a text alarm would be sent for assistance. The comparison of the shipping containers’ positions and codes with the transit shipment plan can be performed by the central processing device and/or by the edge processing devices. Discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as previously determined, can be identified by the central processing device 200 or an edge processing device. Discrepancies can include a difference between a container code in the transit shipment plan and the container code being handled or imaged.
Discrepancies can also include a difference between a container’s expected position as per the transit shipment plan and the actual position of the container derived from the yard-camera-position and/or the container-handler-position coordinates. The actual position of the container can correspond to one of the yard-camera-position and/or the container-handler-position coordinates, or to an average of the yard-camera-position and/or the container-handler-position coordinates.
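A minimal sketch of the position-discrepancy identification described above, assuming planned and observed positions are expressed as (x, y) yard coordinates in metres and using a hypothetical distance tolerance (none of the names or values come from the specification):

```python
import math

def find_discrepancies(plan, observed, tolerance_m=15.0):
    """Compare planned container positions (from the transit shipment
    plan) with observed positions derived from camera coordinates.

    `plan` and `observed` map container codes to (x, y) yard
    coordinates in metres. Returns (code, reason) discrepancy records."""
    issues = []
    for code, planned_xy in plan.items():
        actual_xy = observed.get(code)
        if actual_xy is None:
            # Container never imaged anywhere in the yard.
            issues.append((code, "not observed in yard"))
        elif math.dist(planned_xy, actual_xy) > tolerance_m:
            # Imaged, but too far from where the plan expects it.
            issues.append((code, "out of planned position"))
    return issues
```

In practice the tolerance would reflect camera positioning accuracy; averaging several camera observations, as mentioned above, would feed the `observed` map.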

[0047] Referring now to FIG. 2, a schematic diagram illustrates possible components of the system 100. The edge processing devices 300 are designed as secured enclosures, housing the one or more processors 310, memory means (RAM, ROM, flash, etc.) 320, communication interfaces 330 and a position module (GPS or DGPS) 352. Although the edge processing devices preferably connect to the terminal cameras and/or to the terminal’s video management system to process the images for container code recognition and physical condition assessment, in possible embodiments, at least some of the edge processing devices may comprise one or more video cameras within their enclosure. The edge processing devices 300 run a software application 360 that includes or accesses a container identification module 362 (and optionally, a railcar identification module), and preferably, an audio or graphical operator’s interface module 364, to be able to send container IDs and/or container positions thereto. The software application 360 comprises a shipping container status module to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers. The different shipping container statuses are logged and stored in a database.

[0048] The cameras 380 are either fixed cameras, positioned on posts at strategic locations in the terminal yard, such as near the main rail track and spur rail tracks; on crane structures, such as Rubber Tyred Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes 42; or on container handling vehicles, such as lifters and stackers 44. The edge processing devices can communicate, via wired or wireless communication links, with the cameras mounted onto the container handlers. As will be explained in greater detail below, the container handler vehicles and/or cranes are typically provided with Programmable Logic Controllers (PLCs) 46, and, in some implementations, the edge processing device comprises a PLC interlock control module 350 to communicate with the PLCs. Operators of the container handlers (cranes, stackers, lifters, trucks, etc.) can have access to an operator interface, controlled by the TOS, the central processing device and/or the edge processing devices, that can display or provide audio indications regarding shipping container statuses (such as their actual positions and codes) and discrepancies between the transit shipment plans and the actual positions of the shipping containers. The central processing device 200 and/or the edge processing devices 300 may therefore comprise, in possible implementations, a software module with processor-executable instructions to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers.

[0049] The central processing device 200 can be provided in a computer room, with other servers of the terminal, hosting the TOS 10 and Security Management System 20, but as explained previously, the central processing device may also be part of a cloud-based server farm, remote from the terminal. The central processing device comprises one or more processors 210, data storage and memory means 220, and communication modules 230, as well as the TOS and TSM interfaces 250, 256. The central processing device runs a shipping container management software application, and may include a loading and unloading planning module, a container and railcar position tracking and management module 264, a container health assessment module 266 and a graphical user interface 268.

Container Loading and Unloading Management System at Rail-Side

[0050] Referring to FIG. 3, the proposed system can be used at intermodal container terminals to track and control the selection and transport of shipping containers between railcars sitting on terminal rail spur tracks, the depot yard storage piles and the city/yard trucks. By detecting the presence of arriving railcars, creating an identification and location database for all railcars sitting on tracks at a terminal, and interfacing with the TOS where container loading plans are stored, the loading and unloading of railcars becomes more organized, accurate, efficient and safe, and achieves a higher rate of transfer.

[0051] Still referring to FIG. 3, as an exemplary embodiment, the edge processing device 300 communicates with first and second video cameras 380i, 380N, where one is oriented such that its field of view captures moving rail cars 70 and the rail car identification code 72, while the second camera is oriented to detect shipping containers 60 with the shipping container identification code 62. Of course, other camera configurations are possible, including for example a single camera capturing both the rail cars and the shipping containers transported thereon, as in FIG. 4. The images are sent from the cameras to the edge processing device, via a wired and/or wireless connection, such as a Wi-Fi connection, for example. The edge processing device 300 can access, via its communication module, remote machine-learning algorithms 400 that have been previously trained to detect rail cars, shipping containers on rail cars, shipping containers, and shipping container codes and rail car codes. Alternately, where cloud-based access is not allowed from the terminal site for security reasons, it is possible to store and execute the trained AI algorithms on the edge processing device 300, or on the central processing device (not shown in FIG. 3).

[0052] When the degree of confidence in the detected rail car and/or shipping container codes is low, such as below a given threshold, a confirmation request can be sent to an operator’s interface, such as a rail operations checker’s interface (on a tablet, smart phone, laptop or the like), to confirm the rail car and/or the shipping container code. This feedback can be used to further train the AI algorithms, so as to improve the accuracy of the AI code-detection algorithms. The final code determination is sent to the TOS’ database, such as the terminal yard inventory database, via the central processing device. If the central processing device detects a discrepancy between a planned position of a given container on a railcar, as per the import transit shipment plan, and the actual position of said shipping container relative to a railcar, as previously determined, an alert can be sent to the TOS system, and/or to operators’ interfaces, via the central processing device 200 or the edge processing devices 300.
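The confidence-threshold gating described above can be sketched as follows. The threshold value and callback name are illustrative assumptions; the confirmed value returned by the operator would, as described, also be fed back to retrain the models:

```python
CONFIDENCE_THRESHOLD = 0.85  # assumed value; would be tuned per terminal

def resolve_code(detected_code, confidence, request_confirmation):
    """Accept a high-confidence code reading as-is; otherwise route it
    to an operator interface for confirmation.

    Returns (final_code, was_confirmed_by_operator)."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return detected_code, False
    # Low confidence: ask the checker/operator to confirm or correct.
    confirmed = request_confirmation(detected_code)
    return confirmed, True
```

Operator-confirmed pairs (image, corrected code) form the labelled feedback set mentioned in the paragraph above.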

[0053] In possible embodiments, at least some of the images received at the edge processing devices 300 show railcars provided with railcar codes. The railcars transport shipping containers that need to be loaded onto transport trucks by the container handlers. The edge processing devices 300 can recognize, using the machine-learning algorithms, the railcar codes. The railcar codes, along with the shipping container codes, can be stored locally, in cloud-based servers, or in a central database accessible to the central processing device 200. Positions of the railcars can also be determined by the edge processing devices 300. Similar to the comparison made for shipping containers, the edge processing devices and/or the central processing device are configured to identify discrepancies between a planned position of a given container on a railcar, as per the transit shipment plan, and the actual position of said shipping container relative to a railcar, as previously determined. In other words, a software application, preferably executed by the central processing device, compares the matching of containers and railcars, as per the transit shipment plan, with the actual position of containers on railcars, and detects any difference, either in the container identification or the railcar identification. An alert or message can be sent to a graphical user interface, accessible via the central processing device, the edge processing devices or other devices, such as smart phones or tablets carried by terminal checkers or container handler operators. The comparison of the positions of the shipping containers with the transit shipment plan is thus preferably performed by the central processing device 200 and communicated to the edge processing devices 300, which can send the information to operators’ and/or terminal checkers’ user interfaces.

[0054] Referring now to FIG. 4, the first step of container loading management begins with fixed cameras 380i and 380N, positioned to capture rail cars 70 and shipping containers 60 moving along the main rail track 74 (or main line) as they are shunted into positions on the terminal rail spur tracks 76. The edge processing device (not shown in FIG. 4) is designed and configured to contribute to a rail cars management database, which can be accessed by the central processing device 200 and/or by the TOS. The edge processing device detects the presence of moving rail cars 70, and recognizes the identification code 72 of the rail cars. A log is created of each rail car 70 and its position on one of the multiple spur tracks.

[0055] In shunting railcars from the main rail line 74 and onto the terminal property, a single track is typically employed which then, by way of a switch-track, directs rail cars onto any of the several spur tracks 76. At a position along this stretch of single track 74, the edge processing device 300 dedicated to rail car identification is positioned with one camera 380i imaging in profile all railcars and containers passing, and the second camera 380N facing the multi-track layout and imaging the cars as they pass onto one of the spur tracks 76. Where yard and track layouts have obstacles to viewing, additional cameras may be required to circumvent data loss. The car numbers 72, car sequence and track selection are held in the database so that all cars can be identified by their rail car code and position on a track.

[0056] Shipping containers 60 arriving and departing on rail cars have their identification numbers 62 detected, identified and recorded along with that of the carrying railcar 72. A function of the identification data collection is recognition of the sequence of cars as they have been parked along each of the multiple spur tracks 76. This data becomes important with the onboarding of import containers as domestic transportation plans have been developed which require the placement of specific containers on assigned railcars and in specific positions on the car.
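The per-track sequence logging described above can be modelled, purely for illustration (names and structure are assumptions, not part of the specification), as:

```python
from collections import defaultdict

class SpurTrackLog:
    """Keep, per spur track, the sequence of railcars as they are
    shunted in, and which container sits on each car."""

    def __init__(self):
        self.tracks = defaultdict(list)  # track id -> ordered railcar codes
        self.loads = {}                  # railcar code -> container code

    def record_railcar(self, track_id, railcar_code):
        # Cars are appended in the order they pass onto the spur track.
        self.tracks[track_id].append(railcar_code)

    def record_load(self, railcar_code, container_code):
        self.loads[railcar_code] = container_code

    def position_on_track(self, track_id, railcar_code):
        """1-based position of the car in the line on its track."""
        return self.tracks[track_id].index(railcar_code) + 1
```

A loading plan requiring "container X on car Y, position N of spur 2" can then be checked directly against this log.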

[0057] When shunting operations bring the newly loaded train from the spur lines and onto the mainline, those import containers have profile camera images processed and the data compared with terminal loading plans, thereby verifying that containers are on the correct rail car and in the correct position according to the loading plan. Deviations are reported to the terminal operator.

[0058] The following is an example of what the overall record may show:

- Rail-car identification numbers for all railcars parked on the rail spurs in the terminal. The record will also show the track number on which the railcar sits;

- Position of the rail car in the line of rail cars;

- Dimensions of each rail car; and

- Container identification number of each container loaded, along with the corresponding rail car number.

[0059] In possible embodiments, the images showing railcars provided with railcar codes are associated with a main rail track or with one of several spur rail tracks. The central processing device and/or the edge processing devices can be configured to compare, during execution of the transit shipment plan, the position of the railcars relative to the main and spur rail tracks of the terminal yard, with the transit shipment plan, and identify discrepancies between a planned position of one of the railcars, as per the transit shipment plan, and the actual position of said railcar. In addition, the position of the railcars relative to other railcars on the main and/or spur rail tracks can also be detected, and discrepancies can be identified, including for example the order of the railcars on the main and/or spur tracks.
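The overall record enumerated in paragraph [0058] can be modelled, for illustration only, as a simple data structure (all field names are hypothetical):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RailcarRecord:
    railcar_code: str                 # rail-car identification number
    track_number: str                 # spur track on which the car sits
    position_in_line: int             # order of the car on its track
    length_m: float                   # railcar dimension
    container_codes: List[str] = field(default_factory=list)  # loaded containers
```

One such record per parked railcar, kept current by the edge processing devices, is what the TOS would consult to verify movements against the plan.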

[0060] In possible embodiments, the edge processing devices can detect, using machine learning algorithms (executed locally or via cloud-based servers), track identification codes and dimensions of the railcars imaged by the cameras. The central processing device 200 logs, in real time, records associating railcar identification codes with track identification codes, the position of a given railcar on the main or spur rail tracks, and the dimensions of a given railcar; and shipping container codes with railcar identification codes and/or track identification codes. The Terminal Operating System application can access the records to verify, confirm and report shipping container movements, in real time, versus the transit shipment plan.

[0061] Referring to FIG. 5, the process can also be performed when shipping containers are being unloaded from hauling trucks 80 to terminal rail tracks (or rail lines). Video camera 380iii captures images of the truck loading station and images the arriving hauling trucks, while video camera 380iv captures images of the different terminal lines 78. Again, rail car codes and shipping container codes are continuously compared to the transit shipment plans, and discrepancies can be identified and notified in real time, via graphical or audio interfaces intended for terminal operators.

[0062] FIG. 6 shows a table summarizing the different steps of the rail car, shipping container and hauling truck mapping process, where export shipping containers and rail car numbers are identified, and where the rail cars are associated to a rail line. Using positional coordinates from the edge processing devices and/or from the cameras, the position of the rail cars within the terminal can be determined.

Container Loading and Unloading Management System for Shipping Container Handling Vehicles and Cranes

[0063] Shipborne containers arriving at a maritime port may contain valuable import goods originating in a distant foreign country. The shipping containers are offloaded and parked in the terminal yard until arrangements can be made for their delivery to the final customer. For rail shipments, the arrangements require that the rail transporter prepare a transit shipment plan that schedules rail car deliveries, and the containers onboard, to a depot/terminal conveniently located for the final miles of the delivery. It is therefore important that the selection of containers by the terminals’ machine operators be executed exactly as planned, otherwise the container will be delivered to the wrong destination.

[0064] Referring to FIG. 7, for rail car loading, a camera 380i is mounted to the crane structure (in this example, an RTG structure) in a position where a container, delivered by terminal trailer from its parked position in the yard, sits beneath the crane waiting to be lifted and loaded onto a rail car. From that position, the container identification code can be imaged, and the loading positioning details relating to the container are retrieved by (or pushed to) the edge processing device 300, via a communication module 330, such as a compact industrial cellular router, for example. A camera 380N is also placed on the RTG structure in a position to image the area across all of the multiple rail tracks, which provides the edge processor 300 with real-time data relating to the efforts of the RTG operator to load containers onto a railcar sitting on one of the multiple tracks. Housed within a suitable environment on the crane is the edge processing device 300, provided with an enclosure including a power module, one or more processors, a GPS module, a 4G, 5G or next-generation module and a Wi-Fi communication module. The edge processing device may also include a human interface screen. This edge processing device 300 can direct the movement of containers, using data stored in the database, to maximize container throughput volume, by sending corresponding instructions to operators, via the operator-interfaces of the container handlers. The container handlers may thus encompass not only trucks and railcars, but also one or more intermodal container cranes, such as Rubber Tyred Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes, and mobile intermodal container handlers, such as stackers and lifter vehicles. As such, some of the edge processing devices may have a fixed location, but they can also be “mobile” edge processing devices, when provided on moving vehicles.
[0065] Control of the loading operation relies on two sets of data, the first being the loading instructions, electronically delivered to the operating RTG’s edge processing device 300 by the TOS system, directly or via the central processing device 200, and which may appear on the operator-interface of the crane control panel (as shown in FIG. 8), in the elevated control booth. This edge processing device 300 supervises the loading of the rail cars, referring to the conveyed loading instructions and observing, via the captured images, the actual efforts of the RTG operator.

[0066] The second data set required is generated by the edge processing device located at the rail verification station, which has recorded the positions of all rail cars on the terminal tracks. The crane edge processing device then compares the loading instructions of the transit shipment plans with real-time events and warns the operators, via the control booth interface, when attempts are being made to load containers onto an unassigned rail car. In possible implementations, in case of a mismatch between the shipping plan and the container code identification, control instructions can be sent from the edge processing device 300 at the crane, to the PLCs of the cranes, to block operations thereof. At this stage, it is also possible to request confirmation of the shipping container code detected by the AI algorithms via the crane operator’s interface, either via audio or touch input, when the level of confidence in the container code detection is low, i.e. below a given confidence threshold. The tracking and management of shipping containers can therefore entail that the edge processing devices send instructions to a given Programmable Logic Controller (PLC) of the container handlers to control its operation. The instructions sent by an edge processing device to a given PLC can comprise instructions that prevent clamping, lifting, or moving a given shipping container determined as incorrectly selected, following the comparison of the shipping container’s position with the transit shipment plan.
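The PLC interlock decision described above (blocking clamping, lifting or moving when the imaged code does not match the plan) can be sketched as follows. This is a hedged illustration; the message format is an assumption, and a real implementation would speak the PLC's own protocol:

```python
def interlock_command(planned_container, imaged_container):
    """Decide whether the edge device should let the crane proceed or
    send a blocking instruction to the handler's PLC.

    A mismatch between the code in the shipment plan and the code read
    from the image blocks clamping/lifting/moving."""
    if imaged_container == planned_container:
        return {"action": "allow"}
    return {
        "action": "block",
        "reason": f"expected {planned_container}, imaged {imaged_container}",
    }
```

The "block" case would also trigger the operator-interface warning and, optionally, the low-confidence confirmation request described above.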

[0067] Still referring to FIG. 7, the edge processing device 300 can detect in real time that a railcar has entered the camera field of view, record the images of rail cars and associated containers, execute edge computing applications for object detection and identification, transmit data to an on- or off-site central processing device 200 and, if so designed, transmit the data to a secure web service 400, store data related to containers, railcars and locations, and interface and integrate with the site Terminal Operating System, either directly or via the central processing device 200.

[0068] This is an end-to-end solution that begins with the system receiving the details related to a series of container handling movements, planned through a resident computer management system and based on planned daily transportation arrangements. The electronic messages are distributed by the system to available container handling equipment and supervisory performance monitoring stations. The execution of the plan is monitored by the system in real time, allowing the system to intercede at planned moments to prevent errors or to initiate changes to the plan. Confirmation of the completed tasks, including accepted changes to the plan, is returned to the resident computer management system for archival updates.

Visual imaging

[0069] Referring to FIGs. 8, 9A and 9B, the edge processing device and/or the remote cloud-based platform can comprise a shipping container code recognition module 362 and a defect detection module 266. Also referring to FIG. 7 and FIG. 10, smart mobile devices 300 can be used to capture and process images. Image analysis produces character identification of container- and equipment-mounted serial numbers visibly scribed on surfaces of the equipment. The size, location and detail of these numbers follow codes particular to the equipment, which includes, but is not limited to, shipping containers, rail cars, rubber-tyred gantry cranes, top handlers, and other container handling mobile equipment. In possible implementations, the image-based automated system comprises and/or has access to a cloud-based shipping container ID module 362 and inspection module 266 (as illustrated in FIG. 9B). The Central Processing Device and/or the Mobile Edge Processing Devices can access the cloud-based modules to process the images received.

[0070] With reference to FIG. 1, security cameras installed in equipment working areas of the facility continuously return streaming video of activities to a Yard Video Management System 20. These video streams may be simultaneously delivered to the Central Processing Device 200, where the serial numbers of passing equipment and containers are identified and their locations signalled to the resident computer management system (Terminal Operating System) 10.

[0071] The Central Processing Device 200 and satellite Mobile Edge Processing Devices 300 use graphics processing units (GPUs) to perform high-speed character analysis. The deep neural network (DNN)-based architecture is run using TensorFlow or TensorFlow Lite. The processor identifies image frames of interest using object detection models such as Faster R-CNN. Further processing identifies areas of interest within the frames, localizing text and physical deviations such as damage. Machine learning models such as Faster R-CNN and selected non-machine-learning techniques are used to select the areas.
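The area-selection step described in paragraph [0071] typically reduces the raw detector output with a confidence threshold followed by greedy non-maximum suppression, as is standard with Faster R-CNN-style detectors. A minimal, framework-free sketch of that filtering (the box format and thresholds are assumptions for illustration):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0, ix2 - ix1), max(0, iy2 - iy1)
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def select_regions(detections, score_thresh=0.5, iou_thresh=0.5):
    """Keep high-scoring, non-overlapping detections (greedy NMS).

    detections: list of (box, score) with box = (x1, y1, x2, y2).
    Thresholds are illustrative defaults, tuned per application.
    """
    kept = []
    for box, score in sorted(detections, key=lambda d: d[1], reverse=True):
        if score < score_thresh:
            break  # sorted descending, so all remaining scores are lower
        if all(iou(box, k) < iou_thresh for k, _ in kept):
            kept.append((box, score))
    return kept
```

The surviving regions are then cropped and passed to the text-localization and damage-detection stages.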

[0072] Referring to FIG. 9A, text may be scribed on equipment surfaces in both horizontal and vertical orientations, each with particular characteristics such as shape and character spacing. Using an algorithm such as arbitrarily-oriented text recognition (AON), the text orientation can be determined in an unsupervised manner. Data is formulated using methods and modules such as a directional encoded mask (DEM), a selective attention network (SAN) or a baseline, thereby reducing the learning time and improving accuracy.

[0073] Various text detection models, text recognition models and modifiers are available and are selected based on the application environment. These include Rosetta, a two-step software model for the detection and recognition of text, and StarNet, a trainable classifier that reduces the amount of training data required for new visual domains. Also available is the Efficient and Accurate Scene Text detection pipeline (EAST), a trainable algorithm that offers the advantage, for this application, of direct text detection without the typical requirement for additional sub-algorithms to aggregate and partition text. Also available for this application is VGG, a deep convolutional neural network (CNN) that incorporates small convolutional filters, allowing multiple filter layers and resulting in improved performance.

[0074] Referring to FIG. 9B, a rating of the container’s condition can also be determined at different stages of the container handling process by the shipping container inspection module 266. A reference database may be included as part of the back-end system 300 and/or cloud-based platform 400 to store baseline container codes, types of damage, condition ratings, and other information on standard undamaged containers for comparison purposes. The printed information provided on the shipping containers, e.g. codes, labels, seals, and signs, is detected and recognized automatically with intelligent customized software modules and algorithms, executed locally or in the cloud. Different types of shipping container defects can be detected, including cracks, deformations and corrosion, as well as the integrity of handles and security seals. The inspection module can identify the type and extent of damage to corner fittings, door headers, top end rails and forklift pockets. The processing steps include the detection of the damaged area and the use of an adaptive image threshold method to isolate the damaged portion at the pixel level through segmentation and area outlining within the bounding box. In possible implementations, the tracking and management process of shipping containers in a terminal can not only identify discrepancies in the execution of the transit shipment plan: it can also assess the physical condition of the shipping containers by processing the images from the security yard cameras and/or the container-handler-mounted cameras using machine learning algorithms, such as convolutional neural networks.
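The adaptive-threshold step mentioned above can be illustrated with a minimal local-mean variant: a pixel inside a detected bounding box is flagged as damaged when it is darker than its neighbourhood mean by more than a fixed offset. The block size and offset below are assumed values, and this deliberately unoptimized sketch is not the actual inspection module:

```python
import numpy as np

def adaptive_threshold(gray, block=5, offset=10):
    """Local-mean adaptive threshold on a grayscale image.

    Flags a pixel when it is darker than the mean of its
    (block x block) neighbourhood by more than `offset` grey levels.
    """
    pad = block // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    h, w = gray.shape
    means = np.empty((h, w), dtype=float)
    # Plain sliding window for clarity; real code would vectorize this.
    for y in range(h):
        for x in range(w):
            means[y, x] = padded[y:y + block, x:x + block].mean()
    return gray < means - offset
```

The resulting binary mask gives the segmented damage region, whose outline and area feed the condition rating.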

[0075] An overall damage type and severity estimate is provided according to the detected area. The terminal operators can be offered access to this information from deployed web or mobile applications and interface 364, via the edge processing devices 300, or the information can be uploaded to the TOS database.

[0076] Referring now to FIGs. 10 and 11, mobile shipping container vehicles 44 are used to lift containers from a position in the yard or on a truck and transport them to a new location, which may be a container pile as much as six containers high. Each of these types of mobile vehicles is designed to perform duties similar to those previously described for the RTG/RMG crane operation. Cameras 380 are placed in strategic positions where the container identification details can be best imaged, and the on-board electronics of the edge processing devices 300 perform object identification, recognition and communication services.

[0077] Where the design of the terminal mobile vehicle permits, cameras will be placed to capture multiple images of container identification details. The container code detection module 362 analyzes the images and determines a shipping container code. As per FIG. 11, container codes detected by the container code detection module can be confirmed by operators when the level of confidence is too low, and feedback from the operators can be sent back to the machine learning algorithms for further training. Also, logic may be used and, with an edge computer interface to the onboard programmable logic controller (PLC), equipment functions may be controlled via a PLC-control module 350. This may include system refusal to clamp onto, lift or move a container which the edge computing station has determined has been incorrectly selected.
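One inexpensive sanity check that can precede operator confirmation is validating the recognized code against the ISO 6346 check digit, which every real container number carries as its 11th character. The sketch below implements that standard algorithm; it is offered as an illustration, not as the system’s actual detection module:

```python
# ISO 6346: letters map to 10..38 (skipping multiples of 11), digits
# keep their face value; each of the first 10 characters is weighted
# by 2**position, and the sum modulo 11 (with 10 -> 0) must equal the
# 11th character.

_LETTER_VALUES = dict(zip(
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
    [10, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23, 24, 25, 26,
     27, 28, 29, 30, 31, 32, 34, 35, 36, 37, 38]))

def iso6346_check_digit(code10):
    """Check digit for the first 10 characters of a container code."""
    total = sum(
        (_LETTER_VALUES[c] if c.isalpha() else int(c)) * (2 ** i)
        for i, c in enumerate(code10))
    return total % 11 % 10

def code_is_valid(code11):
    """True when an 11-character code's check digit is consistent."""
    return (len(code11) == 11 and code11[-1].isdigit()
            and iso6346_check_digit(code11[:10]) == int(code11[-1]))
```

A recognized code that fails this check is almost certainly a misread, so operator confirmation can be requested regardless of the model’s reported confidence.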

[0078] Through digital communication with the TOS 10, the edge processing device, via the operator’s interface 364, can direct the mobile vehicle operator to specific coordinates for container retrieval or storage. Positional instructions originate through the GPS module 352 and are verified through relative markers such as the previously recorded container identification and last-position inventory data.
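The positional verification described here can be sketched as a simple distance check between the current GPS fix and the last recorded position of the target container; the 15-metre tolerance below is an assumed value, not a claimed parameter:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def position_confirmed(reported, last_recorded, tolerance_m=15.0):
    """True when the GPS fix agrees with the last recorded position
    of the container to within the stated tolerance."""
    return haversine_m(*reported, *last_recorded) <= tolerance_m
```

When the check fails, the device can fall back to the relative markers mentioned above (recorded container identification and inventory data) before accepting the fix.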

[0079] Vehicle traffic patterns can be established and modified in real time with integration of yard security cameras through recognition applications and interface with the terminal operating system.

Shipping container data management at terminal sites

[0080] Referring to all FIGs. 1 to 11, the proposed system 100 may advantageously be integrated with other operating platforms and can serve to exchange data for the purpose of directing terminal container movements and ensuring that the terminal records are kept accurate and complete. The system 100 comprises a central processing device 200 with multiple smaller satellite processing devices 300 located on fixed or moving container handling equipment 42, 44. Multiple cameras 380 on the equipment provide vision data to the satellite processors 300. A variety of camera types are available and are selected based on the application. Communication between processor devices is primarily wireless, using any of Wi-Fi, 4G or 5G cellular. Communications between systems are essential to the good functioning of the entire operation and must therefore be specifically designed to meet the individual needs of each terminal and the data management system.

[0081] In possible implementations, terminals that do not operate high-level management systems can use the data management system 100 to run loading and unloading operations primarily in real time, with instructions provided moment to moment by supervisory input. Historical records may be kept, but bulk loading and unloading instructions would not be necessary.

[0082] In other implementations, the proposed system interfaces with both the terminal operating system (TOS) 10 and the terminal security management (TSM) system 20. The TOS can bulk-download the loading and unloading plans to the system 100, based on the operator’s chosen criteria, such as by 4-hour operating periods or by individual delivery manifest. The system 100 can, from that point, coordinate the operation by delivering work instructions to equipment 42, 44 in the field and retrieving field feedback to update records.

[0083] The edge processing devices 300 receive work instructions from the central processing device 200 and manage the operation of the attached equipment by display (or audio instructions) to the equipment operator and by interlock control with the equipment programmable logic controller 46 (PLC). The cameras 380 provide vision of the details of the operator activity and image the identification details on the sides and tops of containers being manipulated by the equipment lifting devices 44. The edge processing devices 300 run algorithms to recognize these details and determine the container identification numbers 62, which in turn are compared with the shipping plan. Should the container being manipulated not match the plan, or should other electronic instructions not be followed correctly by the machine operator, the edge processing interlocking instructions can prevent completion of the activity while notifying the machine operator and the central processing device 200 of the error. Upon completion of the activity, the edge processing device 300 will notify the central processing device 200 and will proceed to the next task. The central processing device 200 will in turn update the activity progress logs.
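The interlock behaviour in this paragraph — allow on match, block and notify on mismatch, advance to the next task on completion — might be sketched as a small state machine on the edge device; the class, queue layout and log strings are illustrative assumptions only:

```python
# Hypothetical edge-device interlock: compare the recognized container
# number with the head of the work queue before letting the PLC
# complete the lift. Names are assumptions for illustration.

class EdgeInterlock:
    def __init__(self, work_queue):
        self.queue = list(work_queue)   # container codes, in plan order
        self.log = []                   # events reported to the central device

    def on_container_detected(self, detected_code):
        if not self.queue:
            self.log.append("NO_TASK")
            return "NO_TASK"
        expected = self.queue[0]
        if detected_code != expected:
            # Wrong container: block the PLC, notify operator and central.
            self.log.append(f"BLOCKED {detected_code} != {expected}")
            return "BLOCK"
        # Match: allow completion, report it, advance to the next task.
        self.queue.pop(0)
        self.log.append(f"COMPLETED {expected}")
        return "ALLOW"
```

Each log entry corresponds to a notification sent to the central processing device, which maintains the activity progress logs.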

[0084] The GPS modules 352 provide the edge processing devices with coordinates on a continuous basis, and at appropriate moments the coordinates are captured and logged. Such is the case when the edge-device-equipped mobile vehicles are placing containers in the terminal yard for storage. The machine PLC controls are monitored by the edge processing device and, upon completion of the setting down of the container for storage, the GPS coordinates are captured and uploaded to the central processing device 200 along with all related data.
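The capture-on-event pattern described here — track the latest fix continuously, log it only when the PLC reports a set-down — could look like the following sketch; the class, method and event names are assumptions for illustration:

```python
class GpsLogger:
    """Tracks the latest GPS fix and captures it when the PLC reports
    that a container has been set down (names assumed)."""

    def __init__(self):
        self.last_fix = None   # most recent (lat, lon) from GPS module
        self.captured = []     # fixes uploaded to the central device

    def on_gps(self, lat, lon):
        # Called continuously as the GPS module streams coordinates.
        self.last_fix = (lat, lon)

    def on_plc_event(self, event):
        # Called when the monitored PLC reports a state change.
        if event == "CONTAINER_SET" and self.last_fix is not None:
            self.captured.append(self.last_fix)
```

Only the captured fixes, together with the related container data, need to be uploaded; the continuous stream stays local to the edge device.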

[0085] A plurality of security cameras 380 are normally installed at terminals, given the value of the goods and national border security issues. The video images of these cameras are typically brought to a single vendor management system, normally situated in an IT room. The shipping container data management system 100 can be interfaced with the video management system 20, allowing the movement and positioning of containers to be followed through the processing of the video images and the identification of container identification details. The data management system 100 can, in possible implementations, be used to provide and assemble the field data required to prepare a terminal site location plan of containers transiting at the terminal site. Residing in the TOS, the location plan can be updated in real time from the edge processing devices, allowing other activities, such as container servicing, to be directed to the container location.

[0086] While the above description provides examples of the embodiments, it will be appreciated that some features and/or functions of the described embodiments are susceptible to modification without departing from the principles of the operation of the described embodiments. Accordingly, what has been described above has been intended to be illustrative and non-limiting and it will be understood by persons skilled in the art that other variants and modifications may be made without departing from the scope of the invention as defined in the claims appended hereto.