Title:
SYSTEM AND METHOD FOR DETECTING REAL-TIME LOCATION OF SHIPPING CONTAINERS
Document Type and Number:
WIPO Patent Application WO/2023/017445
Kind Code:
A1
Abstract:
A system for identifying the real-time location of shipping containers comprises an edge unit that includes a Real Time Kinematic (RTK) GNSS rover configured to collect GNSS data to identify the physical stack location of a shipping container with high precision, a distance sensor configured to measure height from the ground, and cameras mounted on a spreader configured to capture a camera feed used to infer the unique identification number of the shipping container being locked. A twist-lock detection module is configured to monitor the twist-lock lock-on mechanism to detect the transition state of the spreader and deliver the detected transition state to a sensor data reading module. The sensor data reading module is configured to read the GNSS data, the distance sensor data and the camera sensor data. A BSRT derivation module is configured to obtain the BSRT location from the GNSS data and the measured height using the GNSS data of the extreme corner locations of the corresponding block. A container number recognition module is configured to detect and recognize the unique container number of the container attached to the spreader.

Inventors:
ALLU LOVARAJU (IN)
ANDHAVARAPU KRISHNA KISHORE (IN)
GOLLU MANMADH (IN)
GUNDA SATISH CHANDRA (IN)
ARUMILLI KISHOR (IN)
GUDE GANGADHAR (IN)
Application Number:
PCT/IB2022/057474
Publication Date:
February 16, 2023
Filing Date:
August 10, 2022
Assignee:
ATAI LABS PRIVATE LTD (IN)
International Classes:
G08B21/02; G01S5/02; G06Q10/08; G08B13/24; G08G1/00
Foreign References:
US20110010005A12011-01-13
US20170220989A12017-08-03
US20200104790A12020-04-02
Claims:
CLAIMS

1. A system for identifying real-time location of shipping containers in a container yard, comprising: an edge unit affixed to at least one of a container handling equipment, the edge unit comprising at least one of: a Real Time Kinematic (RTK) Global Navigation Satellite System (GNSS) rover configured to collect GNSS data to identify a physical stack location of the shipping container with a high precision; and a distance sensor configured to measure height of the shipping container from a ground; at least one of: a first camera; a second camera; and a third camera mounted on a spreader configured to capture camera feed to infer a unique container number of the shipping container being locked; a first depth sensor and a second depth sensor configured to measure first and second depth sensor data; and one or more wireless transmitters configured to read one or more wireless transmitters data; at least one of: the GNSS rover 112; the distance sensor 116; and the at least one of: the first camera 114a; the second camera 114b; and the third camera 114c; the first depth sensor 130a; the second depth sensor 130b; the one or more wireless transmitters 128a, 128b...128n; configured to deliver at least one of: the GNSS data; the measured height; the camera feed; the first and second depth sensor data; and the beacons data; to at least one of: a first computing device; a second computing device; and a cloud server over a first network 104a, the first computing device, the second computing device, and the cloud server comprising a twist-lock detection module, a sensor data reading module, a block stack row and tier (BSRT) derivation module and a container number recognition module, the twist-lock detection module configured to monitor a twist-lock lock-on mechanism to detect a transition state of the spreader of the at least one of container handling equipment and deliver the detected transition state to the sensor data reading module, the sensor data reading module configured to read at least one of: the GNSS data; the measured height; the camera feed; the first and second depth sensor data; and the one or more wireless transmitters data of the shipping container; the BSRT derivation module configured to obtain a BSRT location of the shipping container from the GNSS data and the measured height using the GNSS data of extreme corner locations of the corresponding block of shipping container, the container number recognition module configured to detect and recognize the unique container number of the one or more shipping containers being attached to the spreader of the at least one of container handling equipment.

2. The system of claim 1, wherein the edge unit comprising the first camera configured to capture a twist-lock lights panel to monitor the transition state of the twist-lock lock-on mechanism.

3. The system of claim 1, wherein the edge unit comprising the second camera and the third camera mounted on the spreader configured to capture the unique container number of the shipping container from a top-view.

4. The system of claim 1, wherein the cloud server comprising a triangulation technique is used at a block level to derive a current location of the shipping container in a BSRT format using four corner locations of the block as reference.

5. The system of claim 4, wherein the database is configured to store at least four reference points of the block and a reference ground height of the block (altitude at the ground level).

6. The system of claim 1, wherein the second camera is mounted at the bottom-center of the left extreme of the spreader and the third camera is mounted at the bottom-center of the right extreme of the spreader.

7. The system of claim 1, wherein the BSRT derivation module comprising a data reading module configured to read the GNSS data received from the GNSS rover, the measured height received from the distance sensor (LiDAR), the first depth sensor data and the second depth sensor data received from the first depth sensor and the second depth sensor, and the camera feed received from the at least one of: the first camera; the second camera; and the third camera; and the one or more wireless transmitters data from the one or more wireless transmitters.

8. The system of claim 1, wherein the first depth sensor data comprising a measured horizontal distance of an operator cabin from one extreme of the at least one of container handling equipment.

9. The system of claim 1, wherein the second depth sensor data comprising a measured vertical distance of the spreader from the second depth sensor.

10. The system of claim 1, wherein the BSRT derivation module comprising a data calculating module configured to calculate X and Y directions from the longitude and latitude and to calculate the height of the shipping container from the ground (Hg) from the altitude.

11. The system of claim 1, wherein the BSRT derivation module comprising a data processing module is configured to decode at least one of: the GNSS data; the camera feed; the measured height; the first depth sensor data; the second depth sensor data; and the one or more wireless transmitters data.

12. The system of claim 11, wherein the data processing module configured to extract the required location information from at least one of: the GNSS rover; the distance sensor; the first camera; the second camera; and the third camera; the first depth sensor; the second depth sensor; and the one or more wireless transmitters, and deliver it to a data deriving module.

13. The system of claim 1, wherein the BSRT derivation module comprising a data deriving module is configured to derive the location of the shipping container in the BSRT format using at least one of: the GNSS data; the camera feed; the measured height; the first depth sensor data; the second depth sensor data; and the one or more wireless transmitters data.

14. The system of claim 12, wherein the data deriving module is configured to deliver the BSRT location along with the unique container number to the cloud server.

15. The system of claim 1, wherein the GNSS data comprising a location data, one or more location co-ordinates, one or more GNSS co-ordinates, a longitude, a latitude, and an altitude.

16. The system of claim 1, wherein the container handling equipment comprising a reach stacker (RST), a rubber-tired gantry crane (RTG), and a rail mounted gantry crane (RMG).

17. The system of claim 1, wherein the GNSS rover is mounted on top of an operator cabin on at least one of: the rubber-tired gantry crane (RTG); and the rail mounted gantry crane (RMG).

18. The system of claim 1, wherein the GNSS rover is mounted on the top-centre of the spreader on the reach stacker (RST).

19. A method to derive a BSRT real-time location of shipping containers on a rubber-tired gantry crane and reach stacker (RST), comprising: monitoring a transition state of a spreader by at least one of: a twist-lock lights voltage detector; and a first camera; reading GNSS data using a GNSS (Global Navigation Satellite System) rover and measuring height using a distance sensor; transmitting the GNSS data and the measured height to a cloud server through a first network; processing the GNSS data and the measured height by a BSRT derivation module; calculating X and Y from the longitude and latitude and calculating the height of a container from the ground (Hg) by the BSRT derivation module; deriving a real-time BSRT location of the shipping container using X, Y and Hg; and displaying the real-time BSRT location (precise location) on a first computing device and a second computing device by the BSRT derivation module.
20. A method to derive a BSRT real-time location of shipping containers on a rubber-tired gantry crane and reach stacker (RST), comprising: monitoring a transition state of a spreader by at least one of: a twist-lock lights voltage detector; and a first camera; reading wireless transmitters data of a shipping container from one or more wireless transmitters and first and second depth sensor data of the shipping container from a first depth sensor and a second depth sensor; transmitting the wireless transmitters data and the first and second depth sensor data to a cloud server over a first network; decoding the wireless transmitters data and the first and second depth sensor data by a BSRT derivation module; deriving a BSRT location of the shipping container from the wireless transmitters data and the first and second depth sensor data by the BSRT derivation module; and displaying the real-time BSRT location on a first computing device and a second computing device by the BSRT derivation module.

Description:
SYSTEM AND METHOD FOR DETECTING REAL-TIME LOCATION OF SHIPPING CONTAINERS

TECHNICAL FIELD

[001] The disclosed subject matter relates generally to a container management system for shipping and storage yards. More particularly, it relates to a system and method for detecting the real-time location of shipping containers in a container yard.

BACKGROUND

[002] Shipping container storage yards contain thousands of shipping containers arranged in blocks. The shipping containers include intermodal containers, cargo or freight containers, ISO containers, sea or ocean containers, Conex boxes, and the like. These shipping containers are moved from one location to another to pick and place the containers using container handling equipment (CHE) and internal transfer vehicles. There is often a need for a system to inspect the location of containers and/or create long lasting records of the containers at the time of transfer. Due to the high volume of the shipping containers, it is very critical to assign a unique address to each of the locations and to record the locations of the shipping containers being placed for efficient movement of containers.

[003] The identification and movement of existing containers and the assignment of locations to incoming containers are difficult. It is also difficult to locate the physical location of the containers in the yard by using GNSS coordinates (longitude, latitude, and altitude) in three-dimensional space. In large yards, it is difficult to compare the GNSS coordinates of the current container with the coordinates of every container slot in a database to derive the current container location. Prior art systems and methods have experimented with attaching marks, markers, bugs, radios, GPS equipment, and other devices to the shipping containers. These devices then ride along through the entire trip. However, putting such devices on each container is expensive, and the devices are often blocked or otherwise inaccessible. Hence, there is a need for a system that assigns a unique address to a container position within the yard, which is addressed herein by using a BSRT (Block Stack Row and Tier) format.

[004] In the light of the aforementioned discussion, there exists a need for a system for detecting the real-time location of shipping containers.

SUMMARY

[005] The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.

[006] An objective of the present disclosure is directed towards a system that assigns a unique address to a shipping container position within the shipping yard by using a BSRT (Block Stack Row and Tier) format.

[007] Another objective of the present disclosure is directed towards a system that uses a triangulation technique at a block level to derive current container location in the BSRT format.

[008] Another objective of the present disclosure is directed towards a system that collects a GNSS location for extreme corners of each block of the shipping containers.

[009] Another objective of the present disclosure is directed towards a system that provides an easy identification and movement of existing containers and assignment of location to incoming containers.

[0010] Another objective of the present disclosure is directed towards a system that records all the shipping containers located in the shipping yard and their respective locations.

[0011] Another objective of the present disclosure is directed towards a system that eliminates manual errors and the huge loss of time in searching for misplaced containers using sensor fusion and artificial intelligence.

[0012] In an embodiment of the present disclosure, the system comprises an edge unit affixed to at least one container handling equipment.

[0013] In another embodiment of the present disclosure, the edge unit is affixed to at least one of a container handling equipment, the edge unit comprising at least one of: a Real Time Kinematic (RTK) Global Navigation Satellite System (GNSS) rover configured to collect GNSS data to identify a physical stack location of the shipping container with a high precision.

[0014] In another embodiment of the present disclosure, a distance sensor is configured to measure the height of the shipping container from a ground, and at least one of: a first camera; a second camera; and a third camera mounted on a spreader is configured to capture camera feed to infer a unique container number of the shipping container being locked.

[0015] In another embodiment of the present disclosure, a first depth sensor and a second depth sensor are configured to measure first and second depth sensor data, and one or more wireless transmitters are configured to read one or more wireless transmitters data; at least one of: the GNSS rover 112; the distance sensor 116; the first camera 114a; the second camera 114b; the third camera 114c; the first depth sensor 130a; the second depth sensor 130b; and the one or more wireless transmitters 128a, 128b...128n is configured to deliver at least one of: the GNSS data; the measured height; the camera feed; the first and second depth sensor data; and the beacons data to at least one of: a first computing device; a second computing device; and a cloud server over a first network 104a.

[0016] In another embodiment of the present disclosure, the first computing device, the second computing device, and the cloud server comprising a twist-lock detection module, a sensor data reading module, a block stack row and tier (BSRT) derivation module and a container number recognition module, the twist-lock detection module configured to monitor a twist-lock lock-on mechanism to detect a transition state of the spreader of the at least one of container handling equipment and deliver the detected transition state to the sensor data reading module.

[0017] In another embodiment of the present disclosure, the sensor data reading module configured to read at least one of the GNSS data; the measured height; the camera feed; the first and second depth sensor data; and the one or more wireless transmitters data of the shipping container; the BSRT derivation module configured to obtain a BSRT location of the shipping container from the GNSS data and the measured height using the GNSS data of extreme corner locations of the corresponding block of shipping container.

[0018] In another embodiment of the present disclosure, the container number recognition module configured to detect and recognize the unique container number of the one or more shipping containers being attached to the spreader of the at least one of container handling equipment.

BRIEF DESCRIPTION OF THE DRAWINGS

[0019] In the following, numerous specific details are set forth to provide a thorough description of various embodiments. Certain embodiments may be practiced without these specific details or with some variations in detail. In some instances, certain features are described in less detail so as not to obscure other aspects. The level of detail associated with each of the elements or features should not be construed to qualify the novelty or importance of one feature over the others.

[0020] FIG. 1, FIG. 1 is a block diagram representing a system in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of the system for tracking and identifying real-time location of shipping containers, in accordance with one or more exemplary embodiments.

[0021] FIG. 2A, FIG. 2A is a block diagram depicting a schematic representation of the BSRT derivation module shown in FIG. 1, in accordance with one or more exemplary embodiments.

[0022] FIG. 2B, FIG. 2B is an example diagram depicting architecture of containers block with reference points, in accordance with one or more exemplary embodiments.

[0023] FIG. 3, FIG. 3 is an example diagram depicting architecture of a block with 14 stacks and 4 rows, in accordance with one or more exemplary embodiments.

[0024] FIG. 4, FIG. 4 is an example flow diagram depicting a method for deriving the BSRT location on RTG and RST, in accordance with one or more exemplary embodiments.

[0025] FIG. 5, FIG. 5 is another block diagram representing a system in which aspects of the present disclosure can be implemented. Specifically, FIG. 5 depicts a schematic representation of the system for identifying real-time BSRT location of shipping containers, in accordance with one or more exemplary embodiments.

[0026] FIG. 6A and 6B, FIG. 6A and 6B are example diagrams representing a placement of Wireless transmitters, a first depth sensor and a second depth sensor on rubber-tired gantry, in accordance with one or more exemplary embodiments.

[0027] FIG. 6C, FIG. 6C is an example diagram representing a placement of Wireless transmitters in accordance with one or more exemplary embodiments.

[0028] FIG. 7, FIG. 7 is another example flow diagram depicting a method for deriving the BSRT location on the rubber-tired gantry, in accordance with one or more exemplary embodiments.

[0029] FIG. 8, FIG. 8 is another example flow diagram depicting a method for identifying twist-lock transition states and deriving the real-time BSRT location on the container handling equipment, in accordance with one or more exemplary embodiments.

[0030] FIG. 9, FIG. 9 is an example flow diagram depicting a method for identifying transition states of the spreader when twist-lock moves from unlocked to locked state, in accordance with one or more exemplary embodiments.

[0031] FIG. 10, FIG. 10 is an example flow diagram depicting a method for identifying transition states of the spreader when twist-lock moves from locked state to unlocked, in accordance with one or more exemplary embodiments.

[0032] FIG. 11, FIG. 11 is a block diagram illustrating the details of digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.

[0033] Furthermore, the objects and advantages of this invention will become apparent from the following description and the accompanying annexed drawings.

REFERENCE NUMERALS IN THE DRAWINGS

[0034] FIG. 1, 100 discloses a system configured for deriving a real-time BSRT location of shipping containers

102 Container Handling Equipment

104a First Network

104b Second Network

104c Third Network

106 Cloud Server

108a First Computing Device

108b Second Computing Device

110 Edge unit

112 GNSS rover

114a First Camera

114b Second Camera

114c Third Camera

116 Distance Sensor

118 Twist-Lock Detection Module

120 Sensor Data Reading Module

122 BSRT derivation module

124 Container Number Recognition Module

126 Database

136 GNSS Base

138 Twist-lock Lights Voltage Detector

[0035] FIG. 2A, 200 discloses the BSRT derivation module 122 of the system 100

201 Bus

202 data reading module

204 data calculating module

206 data processing module

208 data deriving module

[0036] FIG. 3, 300 discloses an architecture of a block with 14 stacks and 4 rows of containers

302a Block

304a First Reference Point (R1)

304b Second Reference Point (R2)

304c Third Reference point (R3)

304d Fourth Reference Point (R4)

306 Width (W)

308 Container Location (L)

310a First Distance (D1)

310b Second Distance (D2)

[0037] FIG. 4, 400 discloses method for deriving the BSRT location on RTG and RST

402 Monitoring twist-lock transition states of a spreader by a first camera or by the voltage of the twist lock lights

404 Determining whether a BSRT location of a container is derived on a rubber-tired gantry crane (RTG) or rail mounted gantry (RMG)?

If 404 is YES (RTG/RMG), 406 Reading GNSS data using a GNSS rover and measuring height using a distance sensor

408 Determining whether to process data on the second computing device?

If 408 No 410 Transmitting the GNSS data and the measured height to a first computing device or a cloud server through a first network

412 Processing the GNSS data and the measured height by a BSRT derivation module

If 408 Yes, the method continues at step 412

414 Calculating X and Y from longitude, latitude and calculating height of the container from ground (Hg)

416 Deriving the BSRT location using X, Y and Hg by the BSRT derivation module

418 Displaying the real-time BSRT location (precise location) on the first computing device and the second computing device

If 404 is No (RST), 420 Reading the GNSS data using the GNSS rover

422 Determining whether to process data on the second computing device?

If 422 is No, 424 Transmitting the GNSS data to the first computing device or the cloud server through the first network

426 Processing the GNSS data by the BSRT derivation module

428 Calculating X and Y from the longitude, latitude and calculating height of the container from the ground (Hg) from altitude

If 422 is Yes, the method continues at step 426

Thereafter at step 428, the method reverts at step 418

[0038] FIG. 5, 500 discloses another example diagram depicting a system for identifying real-time BSRT location of shipping containers

102 Container Handling Equipment

104a First Network

104b Second Network

106 Cloud Server

108a First Computing Device

108b Second Computing Device

110 Edge unit

114a First Camera

114b Second Camera

114c Third Camera

118 Twist-Lock Detection Module

120 Sensor Data Reading Module

122 BSRT derivation module

124 Container Number Recognition Module

126 Database

128a, 128b, 128c and 128n Wireless transmitters (Bluetooth beacons)

130a first depth sensor

130b second depth sensor

132 Wireless receiver (Beacon receiver)

138 Twist-lock Lights Voltage Detector

[0039] FIG. 6A, FIG. 6B, 600a, 600b are example diagrams representing a placement of Wireless transmitters, a first depth sensor and a second depth sensor on rubber-tired gantry

602 the rubber-tired gantry

604 Spreader

128a wireless transmitters (Bluetooth beacons)

130a first depth sensor

130b second depth sensor

[0040] FIG. 6C is an example diagram representing a placement of Wireless transmitters

602 reach stacker

604 Spreader

128a Wireless transmitters

132 Wireless receiver

[0041] FIG. 7 is a method for deriving the real-time BSRT location on the rubber- tired gantry

702 Monitoring twist-lock transition states of a spreader by a first camera or by voltage of the twist lock lights

704 Determining whether a container holding equipment is a rubber-tired gantry crane (RTG) or rail mounted gantry (RMG)?

If 704 is Yes (RTG/RMG), 706 Reading bluetooth beacons data from bluetooth beacons and first and second depth sensor data from a first depth sensor and a second depth sensor

708 Determining whether to process data on the second computing device?

If 708 is No, 710, Transmitting the bluetooth beacons data and the first and second depth sensor data to the first computing device or cloud server over a first network

712 Decoding the bluetooth beacons data and the first and second depth sensor data by a sensor data reading module

714 Deriving BSRT location of the shipping container from the bluetooth beacons data and the first and second depth sensor data by the BSRT derivation module

716 Displaying the real-time BSRT location on a first computing device and a second computing device

If 708 is Yes, the method continues at step 712

If 704 is No (RST), 718 Reading the bluetooth beacons data from the Bluetooth beacons and container handling equipment sensor data from container handling equipment sensors

720 Determining whether to process data on the second computing device?

If 720 is No, 722 Transmitting the bluetooth beacons data and the container handling equipment sensor data to the first computing device or cloud server over the first network

724 Decoding the bluetooth beacons data and the container handling equipment sensor data by a sensor data reading module

726 Deriving BSRT location from the bluetooth beacons data and the container handling equipment sensor data

Thereafter at step 726, the method reverts at step 716

If 720 is Yes, the method continues at step 724

[0042] FIG. 8 is another example flow diagram depicting a method for identifying twist-lock transition states and deriving the real-time BSRT location on the container handling equipment

802 Monitoring the transition state of the spreader using the first camera or by voltage of the twist lock lights

804 Identifying the twist-lock is in idle/unlocked position after illuminating red light

806 Identifying the twist-lock is ready to switch after illuminating yellow-red lights

808, Switching the twist-lock from idle to locking state and the transition state is identified after illuminating yellow light

810 Identifying the twist-lock is ready to lock after illuminating yellow-green lights

812 Determining whether transition state of the spreader is locked?

812 is Yes, 814, Reading the GNSS data using GNSS rover and measuring height of the shipping containers using distance sensor

816 Deriving the BSRT location of the shipping container using GNSS data and the measured height

818 Updating and storing the BSRT location of the container in the cloud server

820 Displaying the BSRT location of the shipping container on the first computing device and the second computing device

812 is Yes, 822 Identifying the twist-lock transition of the spreader is ready to switch after illuminating the yellow-green lights

824, Identifying the switching of the twist-lock transition state after illuminating the yellow lights

826 Identifying the twist-lock is ready to unlock after illuminating the yellow-red lights

828 Determining whether the transition state of the spreader is unlocked?

828 is Yes, 830, Capturing the shipping container by the second camera and the third camera from the top view

832 Decoding the unique container number using the container number recognition module

834 Updating and Storing the unique container number in the cloud server

828 is No, the method reverts to step 806

828 is Yes, the method continues at step 814


812 is No, the method reverts at step 810

[0043] FIG. 9 is an example flow diagram depicting a method for identifying transition states of the spreader when twist-lock moves from unlocked to locked state

902 Monitoring a transition state of a spreader by a twist-lock detection module

904 Identifying the transition state of the spreader is unlocked after illuminating a red light

906 Determining whether the twist-lock of the spreader is ready to switch for locking a container?

If 906 is Yes, 908 Identifying the twist-lock is ready to switch after illuminating yellow-red lights

910 Switching the twist-lock from idle to locking state and the transition state is identified after illuminating the yellow light

912 Identifying the twist-lock is ready to lock after illuminating the yellow-green lights

914 Identifying the twist-lock is locked after illuminating the green light

If 906 No the method reverts at step 902

[0044] FIG. 10 is an example flow diagram depicting a method for identifying transition states of the spreader when twist-lock moves from locked to unlocked state

1002 Monitoring a transition state of a spreader by a twist-lock detection module

1004 Identifying the transition state of the spreader is locked after illuminating the green light

1006 Determining whether the twist-lock is ready to switch for unlocking the container?

If 1006 is Yes, 1008 Identifying the twist-lock is ready to switch after illuminating the yellow-green lights

1010 Switching the twist-lock from locked state to unlocked state and the transition state is identified after illuminating the yellow light

1012 Identifying the twist-lock is ready to unlock after illuminating the yellow-red lights

1014 Identifying the twist-lock is unlocked after illuminating the red light

If 1006 is No, the method reverts at step 1002

[0045] FIG. 11 discloses the digital processing system corresponding to the computing device

1110 CPU

1120 Random Access Memory (RAM)

1125 Shared Environment of RAM 1120

1126 User Programs of RAM 1120

1130 Secondary Memory

1135 Hard Drive of secondary Memory 1130

1136 Flash Memory of secondary Memory 1130

1137 Removable Storage Drive of secondary Memory 1130

1140 Removable Storage Unit

1150 Communication Path

1160 Graphics Controller

1170 Display Unit

1180 Network Interface

1190 An Input Interface

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

[0046] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.

[0047] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.

[0048] Referring to FIG. 1, FIG. 1 is a block diagram 100 representing a system in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of the system for tracking and identifying the real-time location of shipping containers, in accordance with one or more exemplary embodiments. The system 100 includes a container handling equipment 102, a first network 104a, a cloud server 106, a first computing device 108a, a database 126, a third network 104c, and a GNSS base 136. The container handling equipment 102 may include, but is not limited to, a rubber-tired gantry crane (RTG), a reach stacker (RST), a rail-mounted gantry crane (RMG), and the like. The container handling equipment 102 may be configured to pick and place the shipping container (shown in FIG. 2B) for transport by land, water, air, and the like. The first computing device 108a, the second computing device 108b and the cloud server 106 may include a twist-lock detection module 118, a sensor data reading module 120, a BSRT (Block Stack Row and Tier) derivation module 122, and a container number recognition module 124.

[0049] An edge unit 110 may be affixed to the container handling equipment 102. The edge unit 110 includes a GNSS rover 112, a first camera 114a, a second camera 114b, a third camera 114c, a distance sensor 116, a second network 104b, a second computing device 108b, and a twist-lock lights voltage detector 138. The cloud server 106 includes a cluster of servers configured to obtain the real-time location of the shipping container and a container number from the edge unit 110 over the first network 104a. The first network 104a, the second network 104b and the third network 104c may include, but are not limited to, an Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth low energy network, a ZigBee network, a Controller Area Network (CAN bus), a WIFI communication network (e.g., wireless high-speed internet), a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, a RFID module, a NFC module, wired cables such as the world-wide-web based Internet, or other types of networks that may use Transport Control Protocol/Internet Protocol (TCP/IP) or device addresses (e.g., network-based MAC addresses, or those provided in a proprietary networking protocol such as Modbus TCP, or obtained by using appropriate data feeds from various web services, including retrieving XML data from an HTTP address and then traversing the XML for a particular node), and the like, without limiting the scope of the present disclosure.

[0050] The first computing device 108a may be located in a backroom at multiple locations. The multiple locations may include, but are not limited to, a warehouse, a shipping yard, a container yard, and the like. The multiple locations may include one or more programmed computers that are in wired, wireless, direct, or networked communication (over the first network and the second network) with the first camera 114a, the second camera 114b, and the third camera 114c. Although the first and second computing devices 108a, 108b are shown in FIG. 1, an embodiment of the system 100 may support any number of computing devices. The first computing device 108a may be operated by a first user. The first camera 114a, the second camera 114b and the third camera 114c may include, but are not limited to, three-dimensional cameras, thermal image cameras, infrared cameras, night vision cameras, varifocal cameras, and the like.

[0051] The second computing device 108b may be installed inside the container handling equipment 102 for identifying the precise location of the shipping container (shown in FIG. 2B) and is operated by a second user. The second computing device 108b may be configured to perform all the computations on the edge unit 110 and transmit the GNSS data and measured height to the cloud server 106 through the first network 104a for further processing. Alternatively, the second computing device 108b may be configured to perform the minimal computations needed on the edge unit 110 and transmit the required sensor data for processing on the cloud server 106.

[0052] The computing devices 108a, 108b may include, but are not limited to, a desktop computer, a personal mobile computing device such as a tablet computer, a laptop computer, or a netbook computer, a smartphone, a server, an augmented reality device, a virtual reality device, a digital media player, a piece of home entertainment equipment, backend servers hosting database 126 and other software, and the like. Each computing device 108a, 108b supported by the system 100 is realized as a computer-implemented or computer-based device having the hardware or firmware, software, and/or processing logic needed to carry out the intelligent messaging techniques and computer-implemented methodologies described in more detail herein.

[0053] The edge unit 110 may be configured to deliver the GNSS data and the measured height to the cloud server 106 through the first network 104a. The system 100 is preferably realized as a computer-implemented system in that the first and second computing devices (108a, 108b) are configured as computer-based electronic devices.

[0054] The cloud server 106 may be operatively connected to the first camera 114a, the second camera 114b, and the third camera 114c over the first network 104a and the second network 104b. The cloud server 106 may preferably be a local computer or a remote cloud server. The cloud server 106 may include a wireless RF transceiver that communicates with the first camera 114a, the second camera 114b, and the third camera 114c. The first camera 114a, the second camera 114b, and the third camera 114c may also be configured to operate under variable environmental conditions, such as bright or dim lighting, cold or hot temperatures, clean or dusty air, dry or moist climate, and the like.

[0055] The GNSS rover 112 may be mounted on the top-centre of a spreader 604 (as shown in FIG. 6A) on the reach stacker (RST). The GNSS rover 112 may be mounted on top of the operator cabin in case of the rubber-tired gantry crane (RTG), or the rail-mounted gantry crane (RMG). The GNSS rover 112 may also be represented as a Real Time Kinematic (RTK) GNSS. The real Time Kinematic (RTK) GNSS rover 112 may be configured to identify the physical stack location of the shipping container (shown in FIG. 2B) with high precision. The base station of the GNSS rover 112 may be positioned at a fixed location in the shipping yard. The GNSS rover 112 may be mounted on the container handling equipment 102 to obtain a precise GNSS location of the shipping container (shown in FIG. 2B) when the shipping container (shown in FIG. 2B) is locked or unlocked for a move. The GNSS data may include, but not limited to, location data, location co-ordinates, GNSS co-ordinates, longitude, latitude and altitude. The GNSS rover 112 may be configured to obtain the precise location of the shipping container (shown in FIG. 2B) in the form of longitude, latitude, and altitude.

[0056] The GNSS rover 112 may be configured to collect the GNSS data for extreme corners of each block of the shipping container (shown in FIG. 2B). The GNSS data for extreme corners of each block may include four reference points. Each reference point may include, but is not limited to, longitude, latitude and altitude/distance, and the like. The GNSS rover 112 may be configured to deliver the GNSS data to the cloud server 106 over the first network 104a and/or the second network 104b.

[0057] The distance sensor 116 may be a light detection and ranging sensor. The distance sensor (LiDAR) 116 may be mounted below the operator cabin of the rubber-tired gantry crane (RTG) and is positioned towards the spreader 604. The distance sensor 116 may be configured to measure the height of the shipping container (shown in FIG. 2B) from the ground. The distance sensor 116 may also be configured to measure the distance between the distance sensor 116 and the shipping container.

[0058] The cloud server 106 may be configured to receive the GNSS data and the measured height to translate the GNSS data and the measured height to a user-defined physical location in BSRT format using the BSRT derivation module 122.

[0059] The first camera 114a may be mounted near the twist-lock lights panel (not shown). The first camera 114a may be configured to capture the twist-lock lights panel (not shown) to monitor the transition states of the lock-on mechanism. The transition states may include, but are not limited to, an Idle/Unlocked state, a Ready to switch state, a Switching state, a Ready to lock state, a Locked state, a Locked to Unlocked state, an Unlocked to Locked state, and the like. The transition states of the lock-on mechanism are indicated by the twist-lock lights panel (not shown). The twist-lock lights panel (not shown) includes multiple colored twist-lock lights. The multiple colored twist-lock lights may include, but are not limited to, a red light, a green light, and two yellow lights. The twist-lock lights may include, but are not limited to, light emitting diodes, and the like. The Idle/Unlocked state may be observed after illuminating the red light on the twist-lock lights panel (not shown) when the spreader 604 is in an idle state/unlocked from the container. The Ready to switch state may be observed after illuminating the yellow-red lights on the twist-lock lights panel (not shown) when the spreader is placed on the container for locking and the twist-locks are ready to switch from unlocking to lock.

[0060] The Switching state may be observed after illuminating the yellow light on the twist-lock lights panel (not shown) when the twist-locks are switching from lock to unlock or from unlock to lock. The Ready to lock state may be observed after illuminating the yellow-green lights on the twist-lock lights panel (not shown) when the twist-locks are ready to lock. The Locked state may be observed after illuminating the green light on the twist-lock lights panel (not shown) when the spreader 604 is locked to the container. The twist-lock lights are illuminated automatically as per the transition state of the twist-locks, and the transition state of the twist-locks is monitored by the twist-lock detection algorithm. The twist-lock lights voltage detector 138 may be configured to identify the transition states by monitoring the voltage levels of the colored twist-lock lights.

[0061] Further, the transition states may also be identified by monitoring the voltage levels of the colored twist-lock lights. The second camera 114b and the third camera 114c may be mounted on the spreader 604 and are configured to capture the camera feed of the particular shipping container which is locked for a move. The camera feed may include, but is not limited to, one or more images of the shipping container locked for a move, one or more images of the twist-lock lights panel, and so forth. The second camera 114b may be mounted at the bottom-center of the left extreme of the spreader 604 and the third camera 114c may be mounted at the bottom-center of the right extreme of the spreader 604. The orientation of the second camera 114b and the third camera 114c may be adjusted such that a unique container number on top of the shipping container (shown in FIG. 2B) is captured.

[0062] The first computing device 108a and the second computing device 108b may include a twist-lock detection module 118, and a BSRT derivation module 122. The twist-lock detection module 118 may be configured to monitor the twist-lock lock-on mechanism and identifies the transition states of the spreader 604. The twist-lock detection module 118 may include vision based algorithms to detect the transition states of the spreader 604. The BSRT derivation module 122 may be configured to receive the GNSS data from the GNSS rover 112 and the measured height from the distance sensor 116, the camera feed from the first camera 114a, the second camera 114b and the third camera 114c to derive the real-time location (precise location) of the container in the BSRT format. The first user may locate the shipping container easily when the sensor data is translated to a user defined physical location (BSRT format) in the container yard. The sensor data may include the GNSS data, measured height, and the camera feed.

[0063] The first computing device 108a and the second computing device 108b may include the sensor data reading module 120 configured to receive the GNSS data from the GNSS rover 112, the measured height from the distance sensor 116, and the camera feed from the first camera 114a, the second camera 114b and the third camera 114c based on the type of container handling equipment 102. The sensor data reading module 120 may be configured to send the GNSS data, the measured height and the camera feed to the BSRT derivation module 122.

[0064] The cloud server 106 may include a cluster of servers configured to process the GNSS data, the measured height, and the camera feed received from one or more edge units 110. The cloud server 106 may include the BSRT derivation module 122 configured to obtain the real-time location (precise location) of the container and the unique container number from the sensor data received from the container handling equipment 102 through the first network 104a. The system 100 supports all shipping container sizes. The sizes of the shipping containers (shown in FIG. 2B) may include, but are not limited to, 8ft (2.43m) wide and 8.5ft (2.59m) high, in lengths of 10ft, 20ft (6.06m), 40ft (12.2m) and 45ft; high-cube containers are 9.5ft high; and so forth.

[0065] The cloud server 106 includes a container number recognition module 124 configured to recognize the unique container number of the shipping container that is locked for a move. The top view of the container is captured from the second camera 114b and the third camera 114c mounted on the spreader 604. The container number recognition module 124 may be configured to process the top-view container images and recognize the unique container number. Furthermore, the full container number may be captured when the second camera 114b and the third camera 114c are positioned at a minimum distance from the shipping container. Only half of the unique container number may be captured when the container is in the locked state, as the second camera 114b and the third camera 114c are then very close to the shipping container. The full container number may be captured when the spreader 604 approaches the shipping container for a lock or when the spreader 604 moves away from the shipping container after unlock. If the container number is inferred from the frames captured after unlocking the shipping container, some processing delay is added and it is difficult to obtain the container number in real time. If the container number is inferred from the frames captured before the lock, there may not be any delay. The frames are stored in the cloud server 106 in a circular queue until the lock transition state. Once the spreader 604 is locked to the shipping container, the frames in the queue are processed using the container number recognition module 124 to infer container numbers.
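To make the buffering scheme concrete, the following Python sketch shows one minimal way to keep the most recent spreader-camera frames in a circular queue and run recognition only at the lock transition. It is an illustration under stated assumptions, not the patent's implementation: the buffer size and the recognize_container_number callback are hypothetical.

```python
from collections import deque

# Minimal sketch (assumptions noted above): keep only the most recent frames and
# run container number recognition once the twist-lock reports a locked state.
FRAME_BUFFER_SIZE = 30  # assumed: roughly the last second of frames
frame_buffer = deque(maxlen=FRAME_BUFFER_SIZE)  # oldest frames are dropped automatically


def on_new_frame(frame):
    """Buffer every frame captured by the spreader cameras while approaching a container."""
    frame_buffer.append(frame)


def on_twist_lock_transition(state, recognize_container_number):
    """On the 'locked' transition, scan the buffered pre-lock frames for a readable number."""
    if state != "LOCKED":
        return None
    # Frames captured just before the lock tend to show the full container number,
    # so scan from the newest buffered frame backwards until a confident read is found.
    for frame in reversed(frame_buffer):
        number = recognize_container_number(frame)  # hypothetical OCR callback
        if number is not None:
            return number
    return None
```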

[0066] In an exemplary method for identifying the transition states of the spreader, the first camera 114a may be positioned near the twist-lock lights panel (not shown) to monitor the colored lights. The first camera 114a may be configured to capture the colored lights on the twist-lock lights panel (not shown) and deliver the images of the captured colored lights to the cloud server 106 to monitor the twist-lock lock-on mechanism and the transition states. The second computing device 108b may be configured to identify the transition states using the twist-lock detection module 118, and the identified transition state may be displayed on the first computing device 108a and the second computing device 108b. The transition states may be identified by observing the ON/OFF status of the multiple colored twist-lock lights. The cloud server 106 may also be configured to identify the transition states using the twist-lock detection module 118.

[0067] In another exemplary method for identifying the transition states of the spreader 604, the transition states may be identified by tapping the input voltage of the colored lights. The voltage of a colored light is a non-zero value when it is ON and zero when it is OFF. These voltage values are converted to binary values and then translated to the respective transition states. The GNSS base 136 may be mounted on a pole or a building at a fixed location in the container yard.
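A short Python sketch of this voltage-to-state translation is given below. It is illustrative only: the ON/OFF voltage threshold, the state names, and the simplification of treating the two yellow lights as a single signal are assumptions, and the table covers the lock-on sequence described above.

```python
# Minimal sketch (assumptions noted above): convert tapped light voltages to
# binary ON/OFF values and look up the corresponding twist-lock transition state.
ON_THRESHOLD_VOLTS = 1.0  # assumed: any voltage above this means the light is ON

# (red, yellow, green) ON/OFF pattern -> transition state of the lock-on sequence
STATE_TABLE = {
    (1, 0, 0): "IDLE/UNLOCKED",    # red only
    (1, 1, 0): "READY_TO_SWITCH",  # yellow-red lights
    (0, 1, 0): "SWITCHING",        # yellow only
    (0, 1, 1): "READY_TO_LOCK",    # yellow-green lights
    (0, 0, 1): "LOCKED",           # green only
}


def lights_to_state(red_volts, yellow_volts, green_volts):
    """Translate the tapped voltages into a transition state name."""
    pattern = tuple(int(v > ON_THRESHOLD_VOLTS) for v in (red_volts, yellow_volts, green_volts))
    return STATE_TABLE.get(pattern, "UNKNOWN")


# Example: red off, yellow on, green on -> the spreader is ready to lock.
print(lights_to_state(0.1, 3.2, 3.3))  # prints READY_TO_LOCK
```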

[0068] Referring to FIG. 2A, FIG. 2A is a block diagram 200a depicting a schematic representation of the BSRT derivation module 122 shown in FIG. 1, in accordance with one or more exemplary embodiments. The BSRT derivation module 122 includes a bus 201, a data reading module 202, a data calculating module 204, a data processing module 206, and a data deriving module 208. The data reading module 202 may be configured to read the GNSS data received from the GNSS rover 112 and the measured height received from the distance sensor (LiDAR) 116. The data calculating module 204 may be configured to calculate the X and Y directions from the longitude and latitude and to calculate the height of the container from the ground (Hg) from the altitude. The data processing module 206 may be configured to decode the GNSS data and the measured height. The data processing module 206 may be configured to extract the required location information from the GNSS rover 112 and the distance sensor 116 and deliver the location information to the data deriving module 208. The data deriving module 208 may be configured to derive the current container location in the BSRT format using the GNSS data and the measured height. The data deriving module 208 may be configured to deliver the BSRT location along with the container number to the cloud server 106.

[0069] Referring to FIG. 2B, FIG. 2B is an example diagram 200b depicting the architecture of a containers block with reference points, in accordance with one or more exemplary embodiments. The diagram 200b includes blocks 210a, 210b, 210c and 210d, reference points 212a, 212b, 212c and 212d, and the container handling equipment 102. The GNSS rover 112 may be configured to collect the GNSS data for the extreme corners of each block of the shipping containers (shown in FIG. 2B). The GNSS data for the extreme corners of each block (210a, for example) may include four reference points 212a, 212b, 212c and 212d. Each reference point 212a, 212b, 212c and 212d may include, but is not limited to, longitude, latitude and altitude/distance, and the like.

[0070] According to an exemplary embodiment of the present disclosure, the GNSS rover 112 may be mounted on the center-top of the spreader on the reach stacker (RST). The RST can lift or release containers around the block. The RST cannot lift a container in the middle of a block because of its operating limitations. Therefore, the GNSS rover 112 obtains a clear satellite view whenever the RST lifts the container and the GNSS rover 112 obtains the location with high precision. The Tier ID may be derived directly from the altitude. The GNSS rover 112 may provide the container location with high precision only when it has a clear sky view to communicate with the satellites.

[0071] According to an exemplary embodiment of the present disclosure, the rubber-tired gantry crane (RTG) may be capable of lifting the container from the middle of the block. For example, if the GNSS rover 112 is mounted on the spreader of the RTG, it is very difficult to get a clear satellite view in a well-like scenario of containers. To overcome this issue, the GNSS rover 112 is mounted on top of the operator cabin instead of the spreader on the RTG to obtain a clear satellite view. The Block ID, Stack ID and Row ID may be derived from the longitude and latitude of the GNSS location. The Tier ID may not be derived from the altitude of the GNSS location because the GNSS rover 112 is mounted at a fixed location and does not move with the spreader. The distance sensor (LiDAR) 116 may be mounted below the operator cabin such that it is facing the spreader. The distance sensor 116 may be configured to measure the distance between the distance sensor 116 and the spreader, and this distance is translated to the Tier ID.

[0072] The distance sensor (LiDAR) 116 may be mounted below the operator cabin of the rubber-tired gantry crane (RTG) and is positioned towards the spreader. The distance sensor 116 may be configured to measure height of the shipping container (shown in FIG. 2B) from the ground. The distance sensor 116 may also be configured to measure the distance between the distance sensor 116 and the shipping container (shown in FIG. 2B).

[0073] The cloud server 106 includes a triangulation technique. The triangulation technique may be used at the block level to derive current location of the shipping container in the BSRT format. The GNSS location for extreme corners of each block is collected. These locations are called reference points 212a, 212b, 212c and 212d. Each block contains four reference points 212a, 212b, 212c and 212d. Each reference point 212a, 212b, 212c and 212d contains longitude, latitude and altitude/LiDAR distance. The database 126 may be configured to store four reference points 212a, 212b, 212c and 212d of the block 210a and the reference ground height of the block (Altitude at the ground level).

[0074] Referring to FIG. 3, FIG. 3 is an example diagram 300 depicting the architecture of a block 210a with 14 stacks and 4 rows, in accordance with one or more exemplary embodiments. The diagram 300 includes the block 210a, a first reference point (R1) 304a, a second reference point (R2) 304b, a third reference point (R3) 304c and a fourth reference point (R4) 304d, a width (W) 306, a container location (L) 308, a first distance (D1) 310a, and a second distance (D2) 310b.

[0075] The width 306 of the block 210a may be the distance between the first reference point 304a and the second reference point 304b. The first distance 310a is the distance between the first reference point 304a and the current container location (L) 308, the second distance 310b is the distance between the second reference point 304b and the current container location (L) 308.

X = (W² + D1² - D2²) / (2W)

Y = √(D1² - X²)

[0076] W, D1 and D2 may be calculated by using the longitude and latitude of the reference points (R1, R2) and the current location (L). X and Y are calculated by using W, D1 and D2.
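As an illustration of how W, D1 and D2 can be obtained from the GNSS coordinates, the Python sketch below uses an equirectangular approximation, which is adequate over the size of a container block; the approximation, the sample coordinates and the metre units are assumptions, not values from the patent.

```python
import math

# Minimal sketch (assumptions noted above): compute W, D1, D2 from the
# longitude/latitude of reference points R1, R2 and the current location L,
# then apply the X and Y formulas given above.
EARTH_RADIUS_M = 6_371_000.0


def planar_distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in metres between two nearby lat/lon points."""
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    return math.hypot(dx, dy)


# Hypothetical block corners and container location, as (latitude, longitude).
r1 = (17.68680, 83.21850)   # reference point R1
r2 = (17.68680, 83.21970)   # reference point R2
loc = (17.68710, 83.21900)  # current container location L

W = planar_distance_m(*r1, *r2)
D1 = planar_distance_m(*r1, *loc)
D2 = planar_distance_m(*r2, *loc)

X = (W**2 + D1**2 - D2**2) / (2 * W)
Y = math.sqrt(max(D1**2 - X**2, 0.0))
print(f"W={W:.1f} m, D1={D1:.1f} m, D2={D2:.1f} m, X={X:.1f} m, Y={Y:.1f} m")
```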

[0077] Each block in the container yard comprises four corner reference points 304a, 304b, 304c and 304d. If the current location lies within the rectangle formed by the reference points 304a, 304b, 304c and 304d of a particular block 302a, then the ID of that particular block is determined as the Block ID.

[0078] The Stack ID may be calculated by dividing X by the unit length (10ft).

Stack ID = X / unit length

[0079] The Row ID may be calculated by dividing Y by unit width. The width of each container is 8 ft

Row ID = Y / unit width

[0080] The GNSS data (altitude) may be used to calculate the height of the container from the ground (Hg) by subtracting the reference ground height from the altitude. The Tier ID may be derived by dividing Hg by the container height.

Hg = Altitude - Reference ground height

Tier ID = Hg / container height

[0081] The distance sensor 116 may be used to calculate the height of the container from the ground (Hg) by subtracting the distance of the sensor from the container (Dc) from the distance of the sensor from the ground (Dg). The Tier ID may be derived by dividing Hg by the container height.

Hg = Dg - Dc

Tier ID = Hg / container height
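A minimal Python sketch of the Stack, Row and Tier derivation follows, using the unit dimensions given above (10 ft slot length, 8 ft container width and a standard 8.5 ft container height). The 1-based ID convention, the metre-to-foot conversion and the Hg helper functions are illustrative assumptions rather than the patent's exact arithmetic.

```python
# Minimal sketch (assumptions noted above): map X, Y and the height from ground
# Hg to Stack, Row and Tier IDs using the unit sizes from the description.
FT_PER_M = 3.28084
UNIT_LENGTH_FT = 10.0       # slot length per stack
UNIT_WIDTH_FT = 8.0         # container width per row
CONTAINER_HEIGHT_FT = 8.5   # assumed standard container height


def stack_row_tier(x_m, y_m, hg_m):
    """Derive 1-based Stack, Row and Tier IDs from X, Y and Hg given in metres."""
    x_ft, y_ft, hg_ft = (v * FT_PER_M for v in (x_m, y_m, hg_m))
    stack_id = int(x_ft // UNIT_LENGTH_FT) + 1
    row_id = int(y_ft // UNIT_WIDTH_FT) + 1
    tier_id = int(hg_ft // CONTAINER_HEIGHT_FT) + 1
    return stack_id, row_id, tier_id


def height_from_ground_gnss(altitude_m, reference_ground_altitude_m):
    """Hg from the GNSS altitude (RST case): Hg = Altitude - Reference ground height."""
    return altitude_m - reference_ground_altitude_m


def height_from_ground_lidar(sensor_to_ground_m, sensor_to_container_m):
    """Hg from the distance sensor (RTG case): Hg = Dg - Dc."""
    return sensor_to_ground_m - sensor_to_container_m


# Example: a container 36 m along the block, 5 m into it, resting on one container.
print(stack_row_tier(36.0, 5.0, height_from_ground_lidar(18.0, 15.4)))  # (12, 3, 2)
```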

[0082] Referring to FIG. 4, FIG. 4 is an example flow diagram 400 depicting a method for deriving the BSRT location on RTG and RST, in accordance with one or more exemplary embodiments. The method 400 may be carried out in the context of the details of FIG. 1, FIG. 2 and FIG. 3. However, the method 400 may also be carried out in any desired environment.

Further, the aforementioned definitions may equally apply to the description below.

[0083] The method commences at step 402, monitoring the transition states of a spreader by the first camera or by the voltage of the twist-lock lights. Determining whether the BSRT location of the shipping container is derived on the rubber-tired gantry crane (RTG) or rail mounted gantry (RMG), at step 404. If the answer at step 404 is Yes (RTG/RMG), reading the GNSS data using the GNSS rover and measuring the height of the shipping containers using the distance sensor, at step 406. Determining whether to process data on the second computing device, at step 408. If the answer at step 408 is No, transmitting the GNSS data and the measured height to the first computing device or the cloud server through the first network, at step 410. Thereafter at step 412, processing the GNSS data and the measured height by the BSRT derivation module. If the answer at step 408 is Yes, the method continues at step 412. Thereafter at step 414, calculating X and Y from the longitude and latitude and calculating the height of the container from the ground (Hg) from the measured height. Thereafter at step 416, deriving the BSRT location using X, Y and Hg by the BSRT derivation module. Thereafter at step 418, displaying the real-time BSRT location (precise location) on the first computing device and the second computing device. If the answer at step 404 is No (RST), reading the GNSS data using the GNSS rover, at step 420. Determining whether to process data on the second computing device, at step 422. If the answer at step 422 is No, transmitting the GNSS data to the first computing device or the cloud server through the first network, at step 424. Thereafter at step 426, processing the GNSS data by the BSRT derivation module. If the answer at step 422 is Yes, the method continues at step 426. Thereafter at step 428, calculating X and Y from the longitude and latitude and calculating the height of the container from the ground (Hg) from the altitude. Thereafter at step 428, the method reverts to step 418.

[0084] Referring to FIG. 5, FIG. 5 is another block diagram 500 representing a system in which aspects of the present disclosure can be implemented. Specifically, FIG. 5 depicts a schematic representation of the system for identifying real-time location of shipping containers (shown in FIG. 2B), in accordance with one or more exemplary embodiments. The system 500 includes the container handling equipment 102, the first network 104a, the cloud server 106, the first computing device 108a, and the database 126. The container handling equipment 102 may include, but is not limited to, a rubber-tired gantry crane (RTG), a reach stacker (RST) or a rail mounted gantry crane (RMG). The container handling equipment 102 may be configured to pick and place the shipping containers (shown in FIG. 2B) for transport by land, water, air, and the like. The edge unit 110 may be affixed to the container handling equipment 102. The edge unit 110 includes the wireless transmitters 128a, 128b, 128c and 128n (Bluetooth beacons), a first depth sensor 130a, a second depth sensor 130b, a wireless receiver 132 (beacon receiver), the second network 104b, the second computing device 108b, the first camera 114a, the second camera 114b, the third camera 114c, and the twist-lock lights voltage detector 138. The wireless transmitters 128a, 128b, 128c and 128n may be configured to obtain the beacon data of the container. The Bluetooth beacon data may include, but is not limited to, the strength of the beacon signal, the particular location, and the like. The first depth sensor 130a may be configured to measure the distance to the operator cabin from one extreme of the rubber-tired gantry, and the second depth sensor 130b may be configured to measure the height of the shipping container from the ground. The measured horizontal distance of the operator cabin from one extreme of the rubber-tired gantry may be the first depth sensor data. The measured vertical distance (height) of the shipping container from the ground may be the second depth sensor data. The first depth sensor 130a may be mounted between two vertical girders of an RTG pointing towards the operator cabin. The second depth sensor 130b may be mounted on the operator cabin pointing down towards the spreader 604.

[0085] The cloud server 106 comprising the BSRT derivation module 122 may be configured to receive the measured horizontal distance and the measured vertical distance from the first depth sensor 130a and the second depth sensor 130b positioned on the rubber-tired gantry to derive the real-time location of the container in the BSRT format.

[0086] The BSRT derivation module 122 may be configured to receive the wireless transmitters data from the wireless transmitters 128a, 128b, 128c and 128n, and the internal sensor data from the internal sensors of the container handling equipment 102 positioned on the reach stacker, to derive the location of the shipping container in the BSRT format.

[0087] The container location in the Block Stack Row and Tier (BSRT) format may be derived using the GNSS rover 112 and the distance sensor 116. The container location in the Block Stack Row and Tier (BSRT) format may also be derived using the BLE beacons, the distance sensors and the internal sensor data of the container handling equipment 102.

[0088] Referring to FIG. 6A and 6B, FIG. 6A and 6B are example diagrams 600a and 600b representing a placement of wireless transmitters, a first depth sensor and a second depth sensor on the rubber-tired gantry, in accordance with one or more exemplary embodiments. The diagram 600a depicts the rubber-tired gantry 602, the wireless transmitters 128a, the first depth sensor 130a and the second depth sensor 130b. The operator cabin along with the spreader 604 moves horizontally across the rows to handle containers in each row. The spreader 604 moves vertically from the top tier to the bottom tier to handle containers in each tier. The first depth sensor 130a may be mounted between the two vertical girders of the rubber-tired gantry (RTG) 602 pointing towards the operator cabin to measure the horizontal distance of the operator cabin from one extreme of the rubber-tired gantry 602. The second depth sensor 130b may be mounted on the operator cabin pointing down towards the spreader to measure the vertical distance of the spreader from the second depth sensor 130b. The wireless transmitters 128a, 128b, 128c and 128n may be installed on the ground for each stack in the block. The wireless receiver 132 may be mounted at the bottom of the rubber-tired gantry (RTG) 602 to read the signal from the wireless transmitters 128a, 128b, 128c and 128n. Each wireless transmitter 128a, 128b, 128c and 128n may be mapped to a particular stack in the block.

[0089] According to an exemplary embodiment of the present disclosure, deriving the location of the container in the BSRT format on the rubber-tired gantry (RTG) comprises:

[0090] The Block ID and Stack ID on the rubber-tired gantry may be derived by using the encoded packet information in the closest beacon detected. If there are multiple beacons detected at a particular location, the ambiguity can be resolved by using the Angle of Arrival (AoA) and the strength of the beacon signal.

[0091] The operator cabin distance (Dc) from one extreme of the rubber-tired gantry may be measured using the first depth sensor 130a and/or the second depth sensor 130b. The Row ID on the rubber-tired gantry may be derived by dividing the distance Dc by the container width:

Row ID = (Dc + offset) / container width

[0092] The spreader distance from the cabin (Ds) may be measured by using the first depth sensor 130a and/or the second depth sensor 130b. The Tier ID on the rubber-tired gantry may be derived by dividing the distance Ds by the container height:

Tier ID = (Ds + offset) / container height
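
A minimal sketch of the RTG derivation described in paragraphs [0090] to [0092] is given below, assuming a beacon payload that already encodes the Block ID and Stack ID and assuming illustrative default dimensions and offsets; the function and field names are hypothetical and not taken from the disclosure.

def derive_bsrt_on_rtg(beacons, cabin_distance_dc, spreader_distance_ds,
                       container_width=8.0, container_height=8.5,
                       row_offset=0.0, tier_offset=0.0):
    """Derive (Block, Stack, Row, Tier) on a rubber-tired gantry.

    `beacons` is a list of dicts with 'block_id', 'stack_id' and 'rssi' keys;
    the closest beacon is approximated as the one with the strongest signal
    (Angle of Arrival disambiguation is omitted for brevity).
    """
    closest = max(beacons, key=lambda b: b["rssi"])
    block_id = closest["block_id"]
    stack_id = closest["stack_id"]
    # Row ID = (Dc + offset) / container width
    row_id = int((cabin_distance_dc + row_offset) // container_width)
    # Tier ID = (Ds + offset) / container height
    tier_id = int((spreader_distance_ds + tier_offset) // container_height)
    return block_id, stack_id, row_id, tier_id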

[0093] Referring to FIG. 6C, FIG. 6C is an example diagram 600c representing a placement of wireless transmitters, in accordance with one or more exemplary embodiments. The diagram 600c depicts the reach stacker 602, the wireless transmitters 128a, 128b, 128c and 128n, the wireless receiver 132, and the container handling equipment (reach stacker) 102.

[0094] The internal sensors of the reach stacker (RST) 602 may collect the container handling equipment sensor data, such as the height of the spreader 604 and the reach of the spreader. The container handling equipment sensor data may include, but is not limited to, the height of the container from the ground, the height of the spreader, the stretch of the spreader, and the like. The container handling equipment sensor data may be transmitted to the second computing device 108b mounted inside the operator cabin through the second network 104b. The container location in the BSRT format may be derived by using the container handling equipment sensor data and the wireless transmitters data from the wireless transmitters installed on the ground for each stack in the block.

[0095] According to an exemplary embodiment of the present disclosure, deriving the location of the container in the BSRT format on the reach stacker comprises:

[0096] The Block ID and Stack ID on the reach stacker may be derived by using encoded packet information in the closest beacon detected. If there are multiple beacons detected at a particular location, the ambiguity may be resolved by using Angle of Arrival (AoA) and strength of the BLE beacon signal.

[0097] The received signal strength indication (RSSI) information from the wireless receiver 132 may deliver the current location of the reach stacker cabin with respect to the locations of the wireless transmitters 128a, 128b, 128c, and 128n. The distance from the cabin to the spreader may be calculated by using the stretch of the spreader captured from the container handling equipment sensor data. The Row ID on the reach stacker may be derived by using the received signal strength indication (RSSI) information from the wireless receiver 132 and the calculated distance from the cabin to the spreader. The Tier ID on the reach stacker may be derived by using the height of the spreader captured from the container handling equipment sensor data.
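
Again as an illustrative sketch only, under the assumption that the cabin position is estimated from the strongest beacon's RSSI and that all names and default values below are hypothetical, the reach stacker derivation in paragraphs [0096] and [0097] might be expressed as:

def derive_bsrt_on_rst(beacons, spreader_stretch, spreader_height,
                       cabin_offset_from_rssi=0.0,
                       container_width=8.0, container_height=8.5):
    """Derive (Block, Stack, Row, Tier) on a reach stacker.

    `beacons` is a list of dicts with 'block_id', 'stack_id' and 'rssi' keys
    read by the wireless receiver; `spreader_stretch` and `spreader_height`
    come from the container handling equipment's internal sensors, and
    `cabin_offset_from_rssi` stands in for the RSSI-derived cabin position.
    """
    closest = max(beacons, key=lambda b: b["rssi"])   # strongest beacon wins
    block_id = closest["block_id"]
    stack_id = closest["stack_id"]
    # Row: RSSI-derived cabin position plus the cabin-to-spreader distance
    # (taken from the stretch of the spreader), expressed in container widths.
    row_id = int((cabin_offset_from_rssi + spreader_stretch) // container_width)
    # Tier: height of the spreader expressed in container heights.
    tier_id = int(spreader_height // container_height)
    return block_id, stack_id, row_id, tier_id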

[0098] Referring to FIG. 7, FIG. 7 is another example flow diagram 700 depicting a method for deriving the BSRT location on the rubber-tired gantry, in accordance with one or more exemplary embodiments. The method 700 may be carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6A, and FIG. 6B. However, the method 700 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[0099] The method commences at step 702, monitoring transition states of the spreader by the first camera or by the voltage of the twist-lock lights. At step 704, determining whether the container handling equipment is a rubber-tired gantry crane (RTG) or rail mounted gantry (RMG). If the answer at step 704 is Yes (RTG/RMG), the method continues at step 706, reading the Bluetooth beacons data from the Bluetooth beacons and the first and second depth sensor data from the first depth sensor and the second depth sensor. At step 708, determining whether to process the data on the second computing device. If the answer at step 708 is No, the method continues at step 710, transmitting the Bluetooth beacons data and the first and second depth sensor data to the first computing device or the cloud server over the first network. Thereafter at step 712, decoding the Bluetooth beacons data and the first and second depth sensor data by the sensor data reading module. Thereafter at step 714, deriving the BSRT location of the shipping container from the Bluetooth beacons data and the first and second depth sensor data by the BSRT derivation module. Thereafter at step 716, displaying the real-time BSRT location on the first computing device and the second computing device. If the answer at step 708 is Yes, the method continues at step 712. If the answer at step 704 is No (RST), reading the Bluetooth beacons data from the Bluetooth beacons and the container handling equipment sensor data from the container handling equipment sensors, at step 718. At step 720, determining whether to process the data on the second computing device. If the answer at step 720 is No, the method continues at step 722, transmitting the Bluetooth beacons data and the container handling equipment sensor data to the first computing device or the cloud server over the first network. Thereafter at step 724, decoding the Bluetooth beacons data and the container handling equipment sensor data by the sensor data reading module. Thereafter at step 726, deriving the BSRT location from the Bluetooth beacons data and the container handling equipment sensor data. From step 726, the method reverts to step 716. If the answer at step 720 is Yes, the method continues at step 724.

[00100] Referring to FIG. 8, FIG. 8 is another example flow diagram 800 depicting a method for identifying transition states and deriving the BSRT location on the container handling equipment, in accordance with one or more exemplary embodiments. The method 800 may be carried out in the context of the details of FIG. 1A, FIG. 1B, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6A, FIG. 6B, and FIG. 7. However, the method 800 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[00101] The method commences at step 802, monitoring the transition state of the spreader using the first camera or by the voltage of the twist-lock lights. Thereafter at step 804, identifying that the twist-lock is in the idle/unlocked position after illuminating the red light. Thereafter at step 806, identifying that the twist-lock is ready to switch after illuminating the yellow-red lights. Thereafter at step 808, switching the twist-lock from the idle to the locking state, the transition state being identified after illuminating the yellow light. Thereafter at step 810, identifying that the twist-lock is ready to lock after illuminating the yellow-green lights. At step 812, determining whether the transition state of the spreader is locked. If the answer at step 812 is Yes, reading the GNSS data using the GNSS rover and measuring the height using the depth sensor, at step 814. Thereafter at step 816, deriving the BSRT location of the container using the GNSS data and the measured height. Thereafter at step 818, updating and storing the BSRT location of the container in the cloud server. Thereafter at step 820, displaying the BSRT location of the container on the first computing device and the second computing device. Thereafter at step 822, identifying that the twist-lock transition of the spreader is ready to switch after illuminating the yellow-green lights. Thereafter at step 824, identifying the switching of the transition state after illuminating the yellow light. Thereafter at step 826, identifying that the twist-lock is ready to unlock after illuminating the yellow-red lights. At step 828, determining whether the transition state of the spreader is unlocked. If the answer at step 828 is Yes, capturing the shipping container by the second camera and the third camera from the top view, at step 830. Thereafter at step 832, decoding the unique container number using the container number recognition module. Thereafter at step 834, updating and storing the unique container number in the cloud server, and the method continues at step 814. If the answer at step 828 is No, the method reverts to step 806. If the answer at step 812 is No, the method reverts to step 810.

[00102] Referring to FIG. 9, FIG. 9 is an example flow diagram 900 depicting a method for identifying transition states of the spreader when the twist-lock moves from an unlocked to a locked state, in accordance with one or more exemplary embodiments. The method 900 may be carried out in the context of the details of FIG. 1A, FIG. 1B, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6A, FIG. 6B, FIG. 7, and FIG. 8. However, the method 900 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[00103] The method commences at step 902, monitoring a transition state of a spreader by a twist-lock detection module. Thereafter at step 904, identifying that the transition state of the spreader is unlocked after illuminating a red light. At step 906, determining whether the twist-lock of the spreader is ready to switch for locking a container. If the answer at step 906 is Yes, identifying that the twist-lock is ready to switch after illuminating the yellow-red lights, at step 908. Thereafter at step 910, switching the twist-lock from the idle to the locking state, the transition state being identified after illuminating the yellow light. Thereafter at step 912, identifying that the twist-lock is ready to lock after illuminating the yellow-green lights. Thereafter at step 914, identifying that the twist-lock is locked after illuminating the green light. If the answer at step 906 is No, the method reverts to step 902.

[00104] Referring to FIG. 10, FIG. 10 is an example flow diagram 1000 depicting a method for identifying transition states of the spreader when the twist-lock moves from a locked to an unlocked state, in accordance with one or more exemplary embodiments. The method 1000 may be carried out in the context of the details of FIG. 1A, FIG. 1B, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6A, FIG. 6B, FIG. 7, FIG. 8, and FIG. 9. However, the method 1000 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[00105] The method commences at step 1002, monitoring a transition state of a spreader by a twist-lock detection module. Thereafter at step 1004, identifying that the transition state of the spreader is locked after illuminating the green light. At step 1006, determining whether the twist-lock is ready to switch for unlocking the container. If the answer at step 1006 is Yes, identifying that the twist-lock is ready to switch after illuminating the yellow-green lights, at step 1008. Thereafter at step 1010, switching the twist-lock from the locked state to the unlocked state, the transition state being identified after illuminating the yellow light. Thereafter at step 1012, identifying that the twist-lock is ready to unlock after illuminating the yellow-red lights. Thereafter at step 1014, identifying that the twist-lock is unlocked after illuminating the red light. If the answer at step 1006 is No, the method reverts to step 1002.
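
For illustration only, the light-panel sequences described with reference to FIG. 9 and FIG. 10 can be captured in a small lookup and transition check; the data structures and function below are assumptions and do not appear in the disclosure.

# Twist-lock light patterns mapped to spreader transition states, as described
# with reference to FIG. 9 (locking) and FIG. 10 (unlocking).
LOCKING_SEQUENCE = [
    ("red", "idle / unlocked"),
    ("yellow-red", "ready to switch"),
    ("yellow", "switching"),
    ("yellow-green", "ready to lock"),
    ("green", "locked"),
]

UNLOCKING_SEQUENCE = [
    ("green", "locked"),
    ("yellow-green", "ready to switch"),
    ("yellow", "switching"),
    ("yellow-red", "ready to unlock"),
    ("red", "unlocked"),
]

def detect_transition(previous_lights, current_lights):
    """Return 'locked' or 'unlocked' when a full transition completes, else None.

    A purely illustrative twist-lock detection sketch; in the system described
    above the lights are read from the twist-lock lights voltage detector or
    inferred from the first camera feed.
    """
    if previous_lights == "yellow-green" and current_lights == "green":
        return "locked"      # end of the FIG. 9 sequence
    if previous_lights == "yellow-red" and current_lights == "red":
        return "unlocked"    # end of the FIG. 10 sequence
    return None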

[00106] Referring to FIG. 11, FIG. 11 is a block diagram illustrating the details of digital processing system 1100 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. Digital processing system 1100 may correspond to the first computing device 108a and the second computing device 108b (or any other system in which the various features disclosed above can be implemented).

[00107] Digital processing system 1100 may contain one or more processors such as a central processing unit (CPU) 1110, random access memory (RAM) 1120, secondary memory 1130, graphics controller 1160, display unit 1170, network interface 1180, and input interface 1190. All the components except display unit 1170 may communicate with each other over communication path 1150, which may contain several buses as is well known in the relevant arts. The components of FIG. 11 are described below in further detail.

[00108] CPU 1110 may execute instructions stored in RAM 1120 to provide several features of the present disclosure. CPU 1110 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 1110 may contain only a single general-purpose processing unit.

[00109] RAM 1120 may receive instructions from secondary memory 1130 using communication path 1150. RAM 1120 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 1125 and/or user programs 1126. Shared environment 1125 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 1126.

[00110] Graphics controller 1160 generates display signals (e.g., in RGB format) to display unit 1170 based on data/instructions received from CPU 1110. Display unit 1170 contains a display screen to display the images defined by the display signals. Input interface 1190 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 1180 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in FIG. 1, the first network 104a and the second network 104b) connected to the network.

[00111] Secondary memory 1130 may contain hard drive 1135, flash memory 1136, and removable storage drive 1137. Secondary memory 1130 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 1100 to provide several features in accordance with the present disclosure.

[00112] Some or all of the data and instructions may be provided on the removable storage unit 1140, and the data and instructions may be read and provided by removable storage drive 1137 to CPU 1110. Floppy drive, magnetic tape drive, CD-ROM drive, DVD Drive, Flash memory, a removable memory chip (PCMCIA Card, EEPROM) are examples of such removable storage drive 1137.

[00113] The removable storage unit 1140 may be implemented using medium and storage format compatible with removable storage drive 1137 such that removable storage drive 1137 can read the data and instructions. Thus, removable storage unit 1140 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).

[00114] In this document, the term "computer program product" is used to generally refer to the removable storage unit 1140 or hard disk installed in hard drive 1135. These computer program products are means for providing software to digital processing system 1100. CPU 1110 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.

[00115] The term "storage media/medium" as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 1130. Volatile media includes dynamic memory, such as RAM 1120. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, or any other memory chip or cartridge.

[00116] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1150. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[00117] In an embodiment of the present disclosure, a system for identifying real-time location of shipping containers in container yard, comprising: the edge unit 110 affixed to at least one of the container handling equipment 102, the edge unit 110 comprises a real time Kinematic (RTK) Global navigation satellite system (GNSS) rover 112 configured to collect GNSS data to identify a physical stack location of the shipping container with a high precision.

[00118] In another embodiment of the present disclosure, the distance sensor 116 configured to measure height of the shipping container from a ground, at least one of: the first camera 114a; the second camera 114b; and the third camera 114c mounted on the spreader configured to capture camera feed to infer a unique container number of the shipping container being locked.

[00119] In another embodiment of the present disclosure, the first depth sensor and the second depth sensor configured to measure the first and second depth sensor data, one or more wireless transmitters configured to read beacons data.

[00120] In another embodiment of the present disclosure, at least one of: the GNSS rover 112, the distance sensor 116 and the at least one of: the first camera 114a; the second camera 114b; and the third camera 114c, the first depth sensor 130a, the second depth sensor 130b, the one or more wireless transmitters 128a, 128b..128n configured to deliver the GNSS data, measured height, the camera feed, the first and second depth sensor data, and the beacons data to at least one of: the first computing device 108a; the second computing device 108b; and the cloud server 106 over the first network 104a.

[00121] In another embodiment of the present disclosure, the first computing device 108a, the second computing device 108b, and the cloud server 106 comprise the twist-lock detection module 118, the sensor data reading module 120, the block stack row and tier (BSRT) derivation module 122 and the container number recognition module 124.

[00122] In another embodiment of the present disclosure, the twist-lock detection module 118 configured to monitor twist-lock lock-on mechanism to detect the transition state of the spreader of the at least one of container handling equipment 102 and deliver the detected transition state to the sensor data reading module 120, the sensor data reading module 120 configured to read at least one of: the GNSS data, the measured height, the camera feed, the first and second depth sensor data, and the beacons data of the shipping container, the BSRT derivation module configured to obtain a BSRT location of the shipping container from the GNSS data and the measured height using the GNSS data of extreme corner locations of the corresponding block of shipping container, the container number recognition module configured to detect and recognize the unique container number of the one or more shipping containers being attached to the spreader of the at least one of container handling equipment.

[00123] In another embodiment of the present disclosure, the edge unit 110 comprising the first camera 114a configured to capture a twist-lock lights panel to monitor the transition state of the twist-lock lock-on mechanism. The second camera 114b and the third camera 114c mounted on the spreader 604 configured to capture the unique container number of the shipping container from a top view. The cloud server 106 comprising a triangulation technique used at a block level to derive a current location of the shipping container in the BSRT format using four corner locations of the block as reference. The database 126 is configured to store at least four reference points of the block and a reference ground height of the block (altitude at the ground level). The BSRT derivation module 122 comprising a data reading module 202 configured to read the GNSS data received from the GNSS rover 112, the measured height received from the distance sensor (LiDAR) 116, the first depth sensor data and the second depth sensor data received from the first depth sensor 130a and the second depth sensor 130b, the camera feed received from the at least one of: the first camera 114a; the second camera 114b; and the third camera 114c; and the one or more wireless transmitters data from the one or more wireless transmitters 128a, 128b... 128n.

[00124] In another embodiment of the present disclosure, the first depth sensor data comprising a measured horizontal distance of an operator cabin from one extreme of the at least one of container handling equipment. The second depth sensor data comprising a measured vertical distance of the spreader from the second depth sensor. The BSRT derivation module 122 comprising a data calculating module 204 configured to calculate X and Y from the longitude and latitude and to calculate the height of the shipping container from the ground (Hg) from the measured height or from the altitude. A data processing module 206 is configured to decode at least one of: the GNSS data; the camera feed; the measured height; the first depth sensor data; the second depth sensor data; and the one or more wireless transmitters data. The data processing module is configured to extract the required location information from at least one of: the GNSS rover; the distance sensor; the first camera; the second camera; the third camera; the first depth sensor; the second depth sensor; and the one or more wireless transmitters, and delivers it to a data deriving module. The data deriving module 208 is configured to derive the location of the shipping container in the BSRT format using at least one of: the GNSS data; the camera feed; the measured height; the first depth sensor data; the second depth sensor data; and the one or more wireless transmitters data. The data deriving module 208 is configured to deliver the BSRT location along with the unique container number to the cloud server 106.

[00125] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

[00126] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.

[00127] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.