

Title:
A METHOD AND SYSTEM FOR OBJECT TRACKING
Document Type and Number:
WIPO Patent Application WO/2023/152763
Kind Code:
A9
Abstract:
In accordance with the embodiments of this disclosure, a system for object tracking is disclosed. The system includes an object, a rover, and a server. The rover is configured to receive correction data and a tag identifier associated with a tag mounted on the object. The rover is further configured to determine a location of the identified object and transmit the determined location to a server. Further, the server is configured to identify the object based on the tag identifier and an object identifier associated with the object, create a 3-dimensional (3d) representation of the object based on the determined location and a predetermined geofence, and display the 3d representation of the object.

Inventors:
KUMAR UDDHAV (IN)
Application Number:
PCT/IN2023/050130
Publication Date:
September 21, 2023
Filing Date:
February 08, 2023
Assignee:
LYNKIT SOLUTIONS PRIVATE LTD (IN)
International Classes:
G01S13/06; G01S19/07; G05D1/10
Attorney, Agent or Firm:
KEDARAM, Vineesh (IN)
Claims:
We claim:

1. A method for object tracking, comprising: receiving, by a rover, correction and proximity data and a tag identifier associated with a tag mounted on a movable object; determining a location of an antenna connected to the rover based on the received correction data and one or more real-time kinematic (RTK) measurements of a movement of the rover; transmitting the determined location to a server; determining, by the server, a location of the object based on the location of the antenna; identifying the object based on the tag identifier and an object identifier associated with the object; and creating a location-based 3-dimensional (3d) representation of the object based on the determined location of the object and a predetermined geofence.

2. The method according to claim 1, wherein the tag comprises a radio frequency identification (RFID) tag, further wherein, the object comprises an industrial container.

3. The method according to claim 1, wherein receiving the tag identifier comprises receiving the tag identifier from an RFID reader mounted on a crane configured to move the object.

4. The method according to claim 1, wherein the identifying comprises identifying the object based on a stored mapping between the tag identifier and the object identifier.

5. The method according to claim 1, wherein the location is a most recent location of the object as well as one or more previous locations of the object.

6. The method according to claim 1, further comprising correcting the determined location of the antenna connected to the rover based on one or more additional real-time kinematics (RTK) correction data received from the base station via a real-time server using communication methods such as cellular networks, LoRaWAN, or WiFi.

7. The method according to claim 1, further comprising creating the location-based 3d representation based on a mapping between the object and the determined geofence.

8. The method of claim 7, wherein the determined geofence is at a predetermined scale with respect to the object and represents the actual location in terms of the latitude, longitude, and altitude, i.e., the x, y, and z coordinates respectively, of where the object is actually placed in real-life space.

9. The method according to claim 1, further comprising generating one or more inferences based on the 3d and locational representation of the object, wherein the one or more inferences are associated with a storage management of the object in a warehouse.

10. The method according to claim 1, further comprising displaying the 3d representation of the object in a facility such as a container yard.

11. The method of claim 1, further comprising: creating a base map, i.e., a digital replica of the space, using the data from the rover; and displaying the 3d representation of the object on the created base map and allowing any user to search for the object in the application via the object identifier.

12. The method of claim 1, further comprising correlating the transmitted location of a GNSS antenna in communication with the rover to the actual location of the object.
13. The method of claim 1, further comprising allocating one or more tasks to an object mover associated with the rover, either automatically via the application logic or via a manual job creation by an application user.

14. The method of claim 1, further comprising providing one or more recommendations indicating an optimal location to place the object.

15. The method of claim 1, further comprising: receiving data from a proximity sensor; and identifying a timestamp associated with picking, movement, and dropping of the object.

16. An object tracking system, comprising: a movable object; a rover configured to: receive correction data, proximity sensor data, and a tag identifier associated with a tag mounted on the object, determine a location of an antenna connected to the rover based on the received correction data and one or more real-time kinematic (RTK) measurements of a movement of the rover, and transmit the determined location to a server; wherein the server is configured to: identify the object based on the tag identifier and an object identifier associated with the object, create a base map and a location-based 3-dimensional (3d) representation of the object on the base map based on the location of the antenna and a predetermined geofence, and display the location-based 3-dimensional (3d) representation of the object.

17. The system of claim 16, wherein the application hosted on the server is further configured to: allocate tasks to an object mover; maintain a record of actions and productivity of the object movers; record and create analytical data; optimize the placement of boxes to reduce the moves required to find, pick, and place the objects to limit fuel and energy consumption; and store historical movements and locations of one or more objects configured to be moved by the object mover.

18. An object tracking system, comprising: a processor; a memory storing computer-executable instructions, which when executed, cause the processor to: receive a set of tag identifiers, wherein each tag identifier from the set corresponds to one of a set of objects located in a perimeter; identify each object from the set of objects based on a corresponding received tag identifier from the set of received tag identifiers and a corresponding object identifier; determine a location of an antenna associated with each identified object; and create a 3-dimensional (3d) representation of each identified object corresponding to the determined locations and one or more predetermined geofences.

Description:
A METHOD AND SYSTEM FOR OBJECT TRACKING

FIELD OF THE INVENTION

[0001] The embodiments discussed in the present disclosure are generally related to object tracking. In particular, the embodiments discussed are related to object tracking using radio frequency identification (RFID) and real-time kinematics (RTK).

BACKGROUND OF THE INVENTION

[0002] In an industrial storage facility such as a warehouse or a container yard, objects such as industrial containers are generally stored in large quantities. To subsequently locate a particular object, one conventional approach is to manually record the details of each object such as its identification number (e.g., a container number) and location with respect to other objects (e.g., a grid/bay address within a perimeter). Therefore, there is a substantial manual effort required to maintain the record of such details and subsequently, retrieve a particular object using the recorded details.

[0003] Another conventional approach to locate an object is to implement a Global Positioning System (GPS)-based tracking mechanism in combination with RFID-based identification of the objects. However, a challenge associated with this approach is that the location of an object determined using this approach is accurate only up to a few meters. This may result in an incorrect object being tracked because of inaccuracy in the precise location, which is counterintuitive and undesirable.

[0004] Therefore, there exists a need to overcome the challenges associated with the conventional approaches.

SUMMARY OF THE INVENTION

[0005] Embodiments of a method and a corresponding system for object tracking are disclosed that address at least some of the above-mentioned challenges and issues.

[0006] In accordance with the embodiments of this disclosure, a method for object tracking is disclosed. The method includes generating correction data at a base station and receiving the correction data, by a rover mounted on an object mover, from the base station. The method further includes receiving, by the rover, a tag identifier associated with a tag mounted on a movable object. The method further includes identifying the object based on the received tag identifier and an object identifier associated with the object. The method further includes determining a location of an antenna connected to the rover based on the received correction data and one or more real-time kinematics (RTK)-augmented measurements of a movement of the rover, which is associated with the object mover. The method additionally includes transmitting the determined location to a server where an application is hosted. The method further includes determining, by the application, a location of the object based on the location of the antenna, and creating a 3-dimensional (3d) representation of the object based on the determined location of the object and a determined geofence, which helps the user identify the object more easily.

[0007] In accordance with the embodiments of this disclosure, a system for object tracking is disclosed. The system includes an object, a rover associated with an object mover, an application, a base station, and a server. The rover is configured to receive a tag identifier associated with a tag mounted on the object. The rover is further configured to identify the object based on the received tag identifier, wherein a GNSS-based RTK module is configured to determine the location of the antenna connected to the rover and transmit the determined location to the application hosted on the server. In an embodiment, a proximity sensor indicates the times of pickup and drop-off of the object, which are correlated with the location of the antenna to determine the locations of object pickup and drop-off. In an alternate embodiment, the server may identify the object based on the received tag identifier and object identifier. Further, an application hosted at the server is configured to determine the location of the object based on the location of the antenna, create a precise location-based 3-dimensional (3d) representation of the object based on the determined location of the object and a geofence, display the location-based 3d representation of the object, and allow any user to search for the object in the application via the object identifier.

[0008] In accordance with the embodiments of this disclosure, a system for object tracking is disclosed. The system includes a processor and a memory storing computer-executable instructions that, when executed, cause the processor to receive a set of tag identifiers, wherein each tag identifier from the set corresponds to one of a set of objects located in a perimeter. The instructions further cause the processor to identify each object from the set of objects based on a corresponding received tag identifier from the set of received tag identifiers and a corresponding object identifier. The instructions further cause the processor to determine a location of an antenna associated with each identified object, and provide a location-based 3-dimensional (3d) representation of each identified object relative to the remaining objects from the set of objects, or simply in one or more predetermined geofences, a grid, or free space, based on the corresponding determined locations.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] Further advantages of the invention will become apparent by reference to the detailed description of preferred embodiments when considered in conjunction with the drawings:

[0010] FIGS. 1A, 1B, and 1C illustrate a side view, a front view, and a 3-dimensional (3d) view, respectively, of a storage perimeter to store objects, according to an embodiment.

[0011] FIG. 2 illustrates various components of an object tracking system, according to an embodiment.

[0012] FIG. 3 illustrates a method for object tracking, according to an embodiment.

[0013] FIGS. 4A, 4B, and 4C illustrate a side view, front view, and 3d view of exemplary objects and their respective locations in the storage perimeter, according to an embodiment.

[0014] FIG. 5 illustrates an exemplary 3d representation of objects, according to an embodiment.

[0015] FIG. 6 illustrates a graphical representation of a movement of an object, according to an embodiment.

[0016] FIG. 7 illustrates a geographical representation of a movement of an object, according to an embodiment.

[0017] FIGS. 8A-8C illustrate printed circuit board (PCB) designs of the object tracking system, in accordance with an embodiment.

[0018] FIGS. 9A-9M illustrate various user interfaces associated with the object tracking system, in accordance with an embodiment.

DETAILED DESCRIPTION

[0019] The following detailed description is presented to enable any person skilled in the art to make and use the invention. For purposes of explanation, specific details are set forth to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that these specific details are not required to practice the invention. Descriptions of specific applications are provided only as representative examples. Various modifications to the preferred embodiments will be readily apparent to one skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the scope of the invention. The present invention is not intended to be limited to the embodiments shown but is to be accorded the widest possible scope consistent with the principles and features disclosed herein.

[0020] The various embodiments throughout the disclosure will be explained in more detail with reference to figures.

[0021] FIGS. 1A and 1B illustrate a side view and a front view, respectively, of an area to store objects, according to an embodiment. FIG. 1C is a 3d view of the area. The area may be a storage facility such as a warehouse, an industrial container yard, or even a parking yard or a construction site. A storage facility may include several such areas, and each such area may store objects in the form of rows and columns, stacked vertically on other objects, or simply placed randomly across the area.

[0022] In accordance with the embodiments presented herein, an object may refer to a container, a box, a pallet, or any item which is unable to move on its own and requires an object mover or relocator, such as a crane, a forklift, or a droid, to move it. In one example, an object such as an industrial container may be cuboid in shape and have dimensions of 20/40/45/52 feet x 8 feet x 8.5 feet (length x width x height). As illustrated in FIG. 1A, several containers may be arranged as described above. For example, a matrix 102 may represent a side view of this arrangement. Each grid in the matrix may represent a container. For example, a container 106 may be stacked on top of another container 104. Another container 110 may be stacked on top of the container 106. Another container 108 may be placed adjacent to the container 104. The front of this arrangement is illustrated in FIG. 1B.

[0023] A person skilled in the art would understand that the object is not limited to industrial containers but may include any object of any shape or size that may be required to be placed or stored in any space or environment.

[0024] FIG. 2 illustrates various components of an object tracking system 200, according to an embodiment. In one example, the object tracking system 200 may be installed in a storage facility as described in the context of FIG. 1. In an embodiment, the object tracking system 200 may include a base station 202, which may include an RTK module 204 (also referred to as a GNSS RTK module 204), a micro-controller unit (MCU) 206, and a communication module 208, which may be based on cellular mobile data such as 4G, or on LoRaWAN or WiFi communication platforms, as known in the art. The base station 202 may be implemented as a stationary electronic device mounted in proximity to the objects to be tracked. The base station 202 thus acts as a reference point for the generation of more accurate data related to tracking of the objects. The RTK module 204 may include an RTK chipset, which may be configured to create correction data and transmit it to a real-time server (e.g., an NTRIP server), which may further be configured to transmit the correction data to an RTK chipset included in a rover.

The RTK module 204 may, alternatively, communicate the correction data directly to a rover using another communication layer, such as LoRaWAN, without an NTRIP server. Here, RTK refers to a real-time kinematics-based system by which greater accuracy of a moving GNSS module (e.g., a rover) may be achieved by comparing its position in relation to a stationary reference point (e.g., a base station) whose location is known.
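The base-station comparison described above can be illustrated with a minimal sketch. This is not the chipset's actual carrier-phase algorithm: the simple vector-offset model, the function name, and the local metric frame are assumptions for illustration only.

```python
# Illustrative sketch of the RTK correction idea: the base station's
# measured position is compared against its surveyed (known) position,
# and the resulting offset is applied to the rover's raw GNSS fix.
# All coordinates are (x, y, z) tuples in an assumed local metric frame.

def rtk_correct(rover_raw, base_measured, base_known):
    """Apply the base station's measurement error to the rover's raw fix."""
    correction = tuple(k - m for k, m in zip(base_known, base_measured))
    return tuple(r + c for r, c in zip(rover_raw, correction))

# Example: the base station reads 1.2 m east of its surveyed position,
# so the rover's raw fix is shifted 1.2 m back toward the west.
corrected = rtk_correct((100.0, 50.0, 10.0), (1.2, 0.0, 0.0), (0.0, 0.0, 0.0))
```

In the real system this comparison happens on carrier-phase measurements inside the RTK chipset; the sketch only conveys why a stationary reference with a known location improves the moving receiver's accuracy.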

[0025] The base station 202 may be configured to communicate with one or more rovers 210, 212, and 214 operating in the facility. Each of these rovers may be mounted on a corresponding object mover such as a crane (not shown), which may be configured to move (lift, displace, and drop off) the objects within the premises of the facility. In an embodiment, the rover may be a box mounted on the crane, connected with one or more antennas located on the crane in proximity to the rover or on the rover itself. The antenna(s) communicate with the satellites and then send data to the RTK module 216. The location (latitude and longitude) determined in accordance with the embodiments presented herein is that of the antenna, not of the rover or the object. This location is later correlated with time data from a proximity sensor located in proximity to the rover to determine the exact location of the object.

[0026] In an exemplary scenario, the rover 210 may include an RTK module 216 (e.g., an RTK chipset), an MCU 218, a communication module 220 (e.g., a 4G LTE module) or any other communication system such as WiFi or LoRaWAN, and a tag reader such as an RFID reader 222. The rover 210 may additionally include a back-up battery, a proximity sensor, a temperature sensor (for internal device health monitoring), an accelerometer/magnetometer/gyroscope for detection of the equipment's movement and direction, and a Bluetooth module, which are not explicitly illustrated but may be configured to perform their respective functions, as known in the art. For instance, the Bluetooth module may be configured to update an operating system/firmware of the rover 210 and may also be configured to act as a central processing unit (CPU) for the rover 210. The back-up battery may be configured to provide a secondary power source to the rover 210. The proximity sensor, on the other hand, helps determine the exact times the object was picked up and dropped off. The RTK module 216, i.e., the RTK chipset included in the rover 210, may be configured to receive the correction data transmitted by the base station 202 and subsequently transmit corrected data to another location server (e.g., the tracking server 230).

[0027] The other rovers 212 and 214 may also include equivalent internal components as described above and function in a similar manner as rover 210, as will be described in the following excerpts of this disclosure.

[0028] The tag reader in each rover may be configured to read an RFID tag, as known in the art. In an embodiment, the RFID tag may include an ultra-high frequency RFID tag, which may be placed on the container illustrated in the context of FIG. 1 (or any other object, as applicable). In an embodiment, each RFID tag may have a unique tag identifier (TID), such as, but not limited to, a Tag Identification ID, which is a 24-digit universally unique number. The solution presented in the illustrated embodiments can also identify the tag by reading the IC's (integrated circuit chip's) EPC (electronic product code) instead of the TID. For instance, an RFID tag may be associated with the EPC instead of the TID. The EPC or the TID may later be used to identify the RFID tag that is being picked up or dropped off by a crane. The objective of using the EPC and the TID is the same, that is, identification of the RFID tag; however, the EPC may be configurable by the manufacturer or operator of the RFID tag, whereas the TID is randomly generated.
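Identifying an object by either its EPC or its TID, as described above, amounts to a registry lookup. The sketch below is illustrative only: the registry table, the sample identifier values, and the function name are assumptions, not part of the disclosed system.

```python
# Hypothetical registry mapping a tag's EPC or TID to the object identifier
# (e.g., a container number). Values are invented for illustration.
TAG_REGISTRY = {
    "urn:epc:id:sgtin:0614141.107346.2017": "MSKU1234565",  # an EPC
    "E28011700000020A1B2C3D4E": "TGHU7654321",              # a TID
}

def identify_object(epc=None, tid=None):
    """Return the object identifier mapped to the tag's EPC or TID, if any."""
    for identifier in (epc, tid):
        if identifier and identifier in TAG_REGISTRY:
            return TAG_REGISTRY[identifier]
    return None  # unknown tag: no mapping has been created yet
```

Either identifier resolves to the same object, mirroring the point above that EPC and TID serve the same identification purpose.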

[0029] Further, the tag reader (e.g., RFID reader 222) may be a module included in the rover (e.g., rover 210) and is connected to an external antenna located on the crane, in accordance with the disclosed embodiments. The tag reader may be configured to convert one or more frequencies recorded from the tag into a TID or EPC associated with the tag, which is further used to identify the object that has been picked up, dropped off, or moved. Here, the antenna located on the crane may be an electronic device connected to the tag reader and configured to emit power to the tag such that the tag can scatter back one or more frequencies to enable the tag reader to identify the tag. In one example of the disclosed embodiments, the RFID reader 222 may be configured to read the RFID tag 224 mounted on an object described in the context of FIG. 1. Similarly, a tag reader mounted on another rover (e.g., rover 212) may be configured to read another RFID tag mounted on another object, and so on.

[0030] Further, the communication module 220 (based on cellular networks, WiFi, or LoRaWAN) may be connected with one or more subscriber identity module (SIM) cards (in the case of cellular networks) and may be implemented as the primary connectivity tool for receiving data from, or transmitting data to, the tracking server 230 when cellular data is used. Where LoRaWAN or WiFi is used, the communication module may still serve as an alternate mode of communication. The device is capable of selecting the best possible mode of communication based on the network strengths of all systems.
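The selection of the best communication mode based on network strengths can be sketched as follows. The normalized signal scores, the usability threshold, and the function name are assumptions for illustration; the device's actual selection logic is not disclosed.

```python
# Minimal sketch: pick the strongest available link above a usability
# threshold. Scores are assumed to be normalized to [0, 1].

def select_comm_mode(strengths, minimum=0.2):
    """Return the name of the strongest usable link, or None if none qualify.

    strengths: mapping such as {"cellular": 0.7, "wifi": 0.4, "lorawan": 0.9}
    """
    usable = {mode: s for mode, s in strengths.items() if s >= minimum}
    if not usable:
        return None
    return max(usable, key=usable.get)
```

A real device would also weigh factors such as power draw and payload size; the threshold simply prevents falling back to a link too weak to carry data at all.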

[0031] The accelerometer, which is a part of the rover, also detects the movement and direction of the rover to identify the angle at which the object has been dropped, for a more accurate rendering on the 3-dimensional interface.

[0032] As illustrated in FIG. 2, the base station 202 and the rovers 210, 212, and 214 may be in communication with a satellite system 226, which may include one or more satellites. In an embodiment, the base station 202 and the rovers 210, 212, and 214 may also be in communication with a user device 228. In an embodiment, the base station 202 may further be in communication with a tracking server 230, which may also be in communication with the user device 228.

[0033] A person skilled in the art would understand that the base station may be capable of performing any known functions performed by conventional base stations in a telecommunications environment. Further, the user device 228 may include any device with the display and processing capability to implement the embodiments of this invention. For instance, the user device 228 may include, but is not limited to, a smartphone, a laptop, a desktop, a tablet, a smartwatch, a smart television, an augmented reality (AR) or virtual reality (VR) device, and so on. The tracking server 230 may be hybrid, cloud-based, or on-premise. The tracking server 230 may include any server with known or conventional functionalities, such as hosting a website and communicating within or outside an associated network using known communication protocols. In the embodiments illustrated herein, the server may be implemented to facilitate communication between the base station(s) and the rover(s). The same or a different tracking server may also be implemented to connect the rover(s) to an application that hosts the user interface for users.

[0034] FIG. 3 illustrates a method for object tracking, according to an embodiment. In step 302, the RFID reader 222 of the rover 210 may receive a tag identifier (e.g., TID) associated with a tag mounted on an object by scanning the object. In one example, when an object such as object 104 enters the storage facility, an RFID tag may be temporarily or permanently mounted on the object. In case of temporary installation, this RFID tag may be removed from the object when the object exits the facility. In an embodiment, when the object enters the storage facility, the operators of the facility or users of the application map the TID (or EPC) of the RFID tag and an object identifier that identifies the object. This mapping may be manually created by the operator, for instance, by using a mobile application or a website or automatically (unmanned operations) using vision/OCR technology to detect the object identifier and mapping it with the RFID. Further, a crane may be used to lift the object and drop it off to a desired storage location in the storage facility. The crane may either be manually selected by the operator or automatically selected to perform a given task (e.g., lifting the container) based on a task allocation. The task allocation may be performed manually by the operator or automatically by the application hosted in the tracking server 230 based on one or more productivity-related metrics or rules implemented in the application in the tracking server 230.
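The automatic task-allocation option mentioned above can be sketched minimally. Assigning a job to the crane with the shortest pending queue is an assumed, illustrative rule; the application's actual productivity-based rules are not disclosed, and the names below are hypothetical.

```python
# Hedged sketch: allocate a task to the crane with the fewest pending tasks.

def allocate_task(pending_tasks, task):
    """Assign `task` to the crane with the smallest queue; return its id.

    pending_tasks: mapping of crane id -> list of queued task descriptions.
    """
    crane_id = min(pending_tasks, key=lambda c: len(pending_tasks[c]))
    pending_tasks[crane_id].append(task)
    return crane_id

queues = {"crane-1": ["lift C104"], "crane-2": []}
assigned = allocate_task(queues, "lift C110")  # crane-2 has the empty queue
```

The same entry point could equally accept a manual assignment from the operator, matching the manual/automatic choice described above.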

[0035] When the crane lifts the object, a proximity sensor mounted on the crane adjacent to the rover may detect the presence of the object in the vicinity of the proximity sensor. The proximity sensor may then provide an input to the rover 210 indicating that the object has been lifted by the crane. Based on this input, the RFID reader 222 may scan the tag mounted on the object to read the tag identifier (e.g., TID or EPC). For example, the RFID reader 222 may scan the tag 224 mounted on the object 104 to read the EPC of the tag 224. Further, when the crane drops the object off, the reading of the tag identifier is stopped. For instance, when the proximity sensor detects that it has moved a predetermined distance away from the object and is no longer in proximity to the object, the sensor may indicate to the rover 210 that the crane has now dropped off the object. This data is used by the tracking server 230 to record the timestamp associated with pickup or drop-off of the object. This timestamp is correlated with the location of the antenna during movement, picking, and dropping to determine where the object was picked, moved, and dropped. Based on the indication from the proximity sensor, the RFID reader 222 of the rover 210 may stop scanning the RFID tag 224. In an embodiment, the timestamp associated with the pickup and drop-off of the object may also be determined manually (by the operator) or automatically (by the tracking server) based on the above-described detections performed by the proximity sensor.

[0036] Further, in step 304, the rover 210 sends data (e.g., the determined location) to the server-hosted application to identify the object, or it may itself identify the object using its local memory, based on the mapping between the received tag identifier and an object identifier that identifies the object.
For example, the rover 210 may identify that the object dropped off by the corresponding crane is object 104 based on the mapping between the tag number of the tag 224 and an alphanumeric object number (e.g., a container number) of the object 104. In the embodiments where the rover 210 identifies the object, the MCU 218 of the rover 210 may perform steps 304-308, described as follows. For instance, step 304 may be performed by the rover 210 based on receiving the tag identifier from the RFID reader 222. This is termed local storage or "on edge" computing. The data may, however, still be transmitted to the server for viewing by the user; only the matching is done locally.

[0037] In an alternative implementation, the rover may merely transmit the tag identifier (EPC or TID) to the tracking server 230 (step 304), which may then identify the object based on the stored mapping, as described above. In this alternative implementation, the tracking server 230 may perform the subsequent steps 304-314. In an embodiment, the mapping may be previously stored in an external database hosted on the tracking server 230. For instance, when a tag is mounted on an object, an operator may create a mapping between the object identifier and the corresponding tag identifier, and store this mapping. In an embodiment, the mapping may be stored in the server (e.g., the tracking server 230 or any other server), and the data (object identifier and tag identifier) would be matched at the server to identify the corresponding objects that the tag readers scan.

[0038] In step 306, the rover 210 may record the times of picking, movement/displacement, and drop-off of the object based on data received from the proximity sensor in the case of "on edge" computing, whereas in the other method, the rover would simply transmit the proximity sensor's readings to the application, which would then determine the times of picking and dropping. Once the time (e.g., timestamp) is recorded, the rover 210 may determine a location of the antenna, in step 308. In an embodiment, the determined location is a single latitude-longitude location that is determined using RTK measurements received from the base station 202 via the real-time/NTRIP server 230.
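Deriving pickup and drop-off timestamps from the proximity readings described in step 306 can be sketched as an edge detector over the sensor's state. The boolean reading stream and the function name are illustrative assumptions.

```python
# Sketch: emit an event at each transition of the proximity state.
# A rising edge (far -> near) is a pickup; a falling edge is a drop-off.

def pick_drop_events(readings):
    """readings: list of (timestamp, near) pairs in time order.

    Returns a list of ("pickup" | "dropoff", timestamp) events.
    """
    events = []
    previous = False
    for ts, near in readings:
        if near and not previous:
            events.append(("pickup", ts))
        elif previous and not near:
            events.append(("dropoff", ts))
        previous = near
    return events
```

A real sensor stream would need debouncing and the predetermined-distance threshold mentioned earlier; the sketch shows only the timestamping logic.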

[0039] In an embodiment, the base station 202 may continuously and periodically transmit correction data, computed from carrier-phase measurement signals received from one or more satellites in the satellite system 226, to the rover 210 via the real-time/NTRIP server 230. The RTK module 216 on the rover 210 may receive the correction data via its communication module 220 and then compare its own phase measurements with the correction data received from the base station 202. When the tag reader of the rover scans an object, the rover 210 then uses this comparison and the location of the base station to determine its own accurate location. This location is the actual location of the GNSS antenna on the crane, which is later transmitted in the form of latitude, longitude, and mean altitude above sea level to the application hosted on the tracking server. By correlating the relative position of the GNSS antenna with respect to the object, the last known location of the object 104 can be recorded.
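The correlation between the antenna's location timeline and a pickup or drop-off timestamp, as described above, can be sketched as a nearest-fix lookup. The matching rule, data shapes, and sample coordinates are illustrative assumptions, not the disclosed algorithm.

```python
# Sketch: match an event timestamp against the antenna's location timeline
# by picking the fix nearest in time.

def location_at(fixes, event_ts):
    """fixes: list of (timestamp, (lat, lon, alt)) pairs.

    Returns the (lat, lon, alt) of the fix closest in time to event_ts.
    """
    ts, loc = min(fixes, key=lambda f: abs(f[0] - event_ts))
    return loc

track = [(10, (28.61, 77.20, 216.0)), (20, (28.62, 77.21, 216.5))]
drop_location = location_at(track, 19)  # nearest fix is the one at t=20
```

Interpolating between the two neighboring fixes would be a natural refinement when the antenna moves quickly between samples.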

[0040] Further, the RTK module 216 of the rover 210 may optionally, correct the determined location of the object 104 based on the correction data being continuously received from the base station 202.

[0041] In step 310, the rover 210 may transmit the determined (and optionally, corrected) location to the tracking server 230. In one example, the rover 210 may transmit to the tracking server 230 a latitude, a longitude, and an altitude as RTK data representing the location of the antenna attached to the rover. The rover 210 may additionally transmit information such as, but not limited to, the RFID tag number of the scanned tag, a signal strength associated with signals received from the RFID tag, a battery level of the rover 210, a timestamp received from the satellite system 226, an International Mobile Equipment Identity (IMEI) associated with the rover 210, proximity sensor data, accelerometer data, and power connection status (mains-connected or back-up-battery powered).
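The fields enumerated in step 310 could be carried in a message like the following. All field names, values, and the use of JSON are illustrative assumptions; the disclosure lists only the kinds of data sent, not a wire format.

```python
import json

# Hypothetical rover-to-server message covering the fields listed in step 310.
payload = {
    "imei": "356938035643809",             # rover identity (sample value)
    "lat": 28.6139, "lon": 77.2090,        # RTK-corrected antenna position
    "alt": 216.4,                          # mean altitude above sea level (m)
    "tag_epc": "E28011700000020A1B2C3D4E", # scanned RFID tag identifier
    "rssi_dbm": -54,                       # RFID signal strength
    "battery_pct": 87,                     # rover battery level
    "gnss_timestamp": "2023-02-08T10:15:02Z",
    "proximity_near": True,                # proximity sensor data
    "on_mains_power": False,               # back-up battery powered
}
message = json.dumps(payload)  # serialized for transmission
```

Accelerometer data, omitted here for brevity, would travel in the same message per the list above.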

[0042] In step 312, the tracking server 230 may then cross-reference the latitude, longitude and altitude of the position of the GNSS antenna when the object was dropped, to determine the actual location of the object. The tracking server 230 may also create a precise, location-based spatial (3d) representation of the object based on the determined location of the GNSS antenna and a predetermined geofence inside a base map. The base map may be a digital replica of the actual yard/warehouse and indicates the position of the various objects that are tracked. This algorithm takes into account (i) the ground level altitude of the area where the container was dropped, (ii) the horizontal offset between the antenna and the center of the object’s top (i.e., the 2D difference), (iii) the vertical offset, i.e., the difference between the altitude of the top of the container and the height of the antenna, and (iv) the dimensions of the object being identified. A pictorial geofence is a 3d area linked with a location (i.e., a space defined by multiple x,y,z coordinates, where x represents the latitude, y the longitude and z the altitude) which indicates a position of an object within the 3d area/space. The predetermined pictorial geofence may be separate for each object and may be applied to the determined location of each object on the basis of the antenna’s location, to create a representation of the corresponding object on a predetermined scale with respect to the real-world object.

[0043] In an exemplary scenario of step 312, FIGS. 4A, 4B, and 4C illustrate a side view, front view, and 3d view, respectively, of three containers C1’, C2’, and C3’ dropped off by a crane 402. The illustrated boxes including dots, C1, C2, and C3, represent the determined and corrected single-coordinate locations (i.e., the actual antenna locations) during placement of the objects, which are then correlated to containers C1’, C2’, and C3’.
In an exemplary scenario, the containers C1’, C2’, and C3’ may be similar to the objects 104, 106, and 110, respectively. The process of creating the spatial representation of these containers begins by creating a base map, i.e., demarcating the latitude, longitude and altitude of the actual grid where the containers are to be placed, and then applying predetermined 3-dimensional geofences (or sections) to each of these locations to create a representation of the corresponding real-world containers. Alternately, there may also be no need to create a predefined base grid; the objects may simply be plotted in free space by using the correlation algorithm presented herein. Therefore, the larger-sized boxes C1’, C2’, and C3’ also represent visible geofenced locations of the containers C1’, C2’, and C3’, respectively. For easy understanding, an exemplary RFID tag 404 is shown as mounted on the geofenced image C3’ of container C3.
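The correlation of factors (i)–(iv) above can be sketched as follows; the local metric coordinate frame and all offset values are assumptions made for this example.

```python
# Sketch of the antenna-to-object correlation described in the text: the
# object's position is derived from the antenna fix by applying (ii) the
# horizontal offset, (iii) the vertical offset, and (i) the ground-level
# altitude. Values and the local grid frame are illustrative only.
def object_position(antenna, horiz_offset, vert_offset, ground_alt):
    """antenna: (x, y, altitude) in metres in a local grid frame.
    Returns (x, y, height above ground) of the object's top centre."""
    x, y, alt = antenna
    dx, dy = horiz_offset
    return (x + dx, y + dy, alt - vert_offset - ground_alt)

top = object_position(
    antenna=(12.0, 30.5, 218.7),   # corrected antenna fix (assumed)
    horiz_offset=(0.4, -0.2),      # antenna to centre of object's top (ii)
    vert_offset=1.9,               # antenna height above container top (iii)
    ground_alt=214.2,              # ground-level altitude of the area (i)
)
# top[2] is the container top's height above ground; dividing it by the known
# container height (iv) would give the stacking tier.
```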

[0044] Referring back to FIG. 3, in step 314, the application hosted on tracking server 230 may display the 3d representation on the user device 228.

[0045] FIG. 5 illustrates an exemplary scenario of the 3d representation displayed on the user device 228. As illustrated, FIG. 5 represents a view of multiple containers in a storage facility.

[0046] Further, although the above-described steps have been explained in the context of rover 210, any rover illustrated above may be used in addition to or instead of rover 210 to implement the above-described method. For instance, the base station 202 and the tracking server 230 may communicate with any combination or subset of rovers operating in the storage facility to implement the method described in the context of FIG. 3. There may also be multiple base stations communicating with multiple rovers being used to cover a larger area.

[0047] Following is an illustrative exemplary scenario in accordance with the presented embodiments. Several stacking areas in the storage facility may be divided into grids of 20-feet equivalent units. In this example, the containers to be stored may be of two types/sizes: 20 feet and 40 feet. A 20 feet container occupies one of these grids, and a 40 feet container occupies two of these grids.

[0048] The tracking server 230 (or an equivalent server) may process the latitude and longitude and place the container into one or two of the 20-feet grids depending on its size. Further, the tracking server 230 needs to determine the tier or level on which these boxes were stacked. When the tracking server 230 receives the altitude above sea level, it may first subtract the ground level height of the grid the boxes were placed on. Alternatively, the rover may be configured to directly send the height above the ground, in which case no subtraction at the application level is required.

[0049] Further, the height of each of the boxes is already known to the tracking server 230. Based on this height, the tracking server may follow simple multiplication algorithms to determine the level or tier the box was dropped on. For each grid and tier/level decision, there is a stored range in the tracking server 230, i.e., a lower and an upper limit within which the single point can fall to correlate it to the grids. This may be done keeping in consideration the margin of error and the size of the object. This range is made in such a way that it covers 3-dimensional space, i.e., height as well as latitude/longitude margins.

[0050] FIG. 6 illustrates a graphical representation of experimental data related to a movement of an object, according to an embodiment. The three axes in the illustrated graph represent the movement of the object along the length, width, and height of the perimeter where the object is stored. Each axis denotes the corresponding values in centimeters, which indicates a substantially higher accuracy in determining the location of the object compared to conventional approaches, which merely determine the location up to a few meters.
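The grid and tier decision described above can be sketched as follows; the container height, tolerance, and grid-indexing convention are assumptions made for this example, not values from the disclosure.

```python
# Sketch of the grid/tier decision: grids are 20 ft wide along the row, and
# the tier follows from the height above ground divided by the known box
# height. The tolerance check mirrors the stored lower/upper range per grid
# and tier. BOX_HEIGHT_M and TOLERANCE_M are assumed values.
GRID_FT = 20
BOX_HEIGHT_M = 2.59   # standard container height (assumed)
TOLERANCE_M = 0.5     # allowed margin of error per tier (assumed)

def grid_and_tier(x_ft, height_above_ground_m):
    """Return (grid index, tier) for a drop at x_ft along the row."""
    grid = int(x_ft // GRID_FT)                        # which 20 ft slot
    tier = round(height_above_ground_m / BOX_HEIGHT_M)  # nearest stacking level
    residual = abs(height_above_ground_m - tier * BOX_HEIGHT_M)
    if residual > TOLERANCE_M:
        raise ValueError("fix outside the stored range for any tier")
    return grid, tier

# A drop recorded 45 ft along the row, with the box top 5.1 m above ground:
slot = grid_and_tier(45, 5.1)
```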

[0051] FIG. 7 illustrates a geographical representation of experimental data related to a movement of an object, according to an embodiment. As illustrated, the movement of the object is represented by the indicated path with a centimeter-level accuracy.

[0052] FIGS. 8A-8C illustrate printed circuit board (PCB) designs of the object tracking system, in accordance with an embodiment. FIG. 8A has the main MCU controller. FIG. 8B has the communication module, which is connected with the two other modules: the RFID reader module and the RTK module. FIG. 8C is a representation of a main base board to implement the embodiments presented herein.

[0053] FIG. 9A illustrates an exemplary user interface on an operator’s device (e.g., user device 228) to display details of all cranes operating in the storage facility. As the objects are brought for placement, picking, shifting, examination, stuffing, destuffing or dropping, the tracking server 230 may allocate the task to one of the object movers (which is connected to a rover). The algorithm that allocates the task is based on various factors such as (i) whether the operator is carrying out a task, (ii) the productivity of that operator, (iii) whether the object mover is allowed to handle an object with certain attributes such as size or process type (gated in vs. gated out, or import/export/empty boxes, etc.), (iv) the distance between the object movers and the objects, and other similar conditions. FIG. 9A represents the screen of an individual who can also override this logic and allocate the task to an operator of his choice.
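A minimal sketch of an allocation function combining factors (i)–(iv) above is shown below; the scoring scheme, data fields, and weights are illustrative assumptions, not the allocation algorithm of the disclosure.

```python
# Illustrative task-allocation sketch: exclude busy cranes (i) and cranes not
# authorized for the object's size (iii), then rank the rest by productivity
# (ii) minus distance to the task (iv). All fields and weights are assumed.
def allocate(cranes, task):
    def score(crane):
        if crane["busy"] or task["size"] not in crane["allowed_sizes"]:
            return float("-inf")                   # factors (i) and (iii)
        distance = abs(crane["position"] - task["position"])
        return crane["productivity"] - distance    # factors (ii) and (iv)
    best = max(cranes, key=score)
    return best["id"] if score(best) > float("-inf") else None

cranes = [
    {"id": "CR1", "busy": False, "allowed_sizes": {20, 40}, "position": 10, "productivity": 8},
    {"id": "CR2", "busy": True,  "allowed_sizes": {20, 40}, "position": 2,  "productivity": 9},
    {"id": "CR3", "busy": False, "allowed_sizes": {20},     "position": 4,  "productivity": 7},
]
chosen = allocate(cranes, {"size": 40, "position": 5})
```

An operator's manual override, as described for FIG. 9A, would simply bypass this scoring and assign the task directly.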

[0054] FIG. 9B illustrates an exemplary user interface indicating a status of objects that are required to exit the storage facility (indicated under a ‘gate out’ tab) and their location, movement status and the status of the object mover (in this case a crane) with respect to that object. For instance, a first object (e.g., container number FDSF1234567) may be ‘loading’ on to a trailer while a second object (e.g., container number ABCF1234123) may have already been dropped-off by a crane.

[0055] FIG. 9C illustrates an exemplary user interface indicating a status of objects that have entered the storage facility and need to be stored (indicated under a ‘gate in’ tab). This tab may include similar details for objects, as described above.

[0056] FIG. 9D illustrates an exemplary user interface on another user device used by an application user who can see the list of all objects on a web/desktop or mobile interface, their last record position, size and attributes of the object, current status, its location history and other such attributes.

[0057] FIG. 9E illustrates an exemplary user interface on another user device used by a gate operator.

[0058] FIG. 9F illustrates an exemplary graphical representation of analytics of the objects whose movement is recorded based on the location data and the actions performed by the object movers on the various objects in the storage facility. The embodiments of this invention enable the application (and/or the tracking server 230) to receive the location of all objects in the storage facility. Therefore, the application may perform various analytical operations to determine one or more inferences.

[0059] In one example, the application may determine the following inferences:

1. Count of objects inside a space, as in FIG. 9D.

2. Count of objects arriving and exiting at any point of time or during any time interval.

3. Analysis on volume trends moving in and out of the storage facility.

4. Analysis on how to better utilize space and position objects in a more efficient manner.

5. Crane/forklift operator efficiency, by allocating them tasks and jobs, and displaying/recording their performance.

6. Reduce the utilization and consumption of fuel/energy by suggesting the optimal way to pick/drop and arrange the boxes to limit the number of moves that the object mover is required to make for any given picking/dropping task (as boxes are stored on top of one another, finding a box in a stack of boxes utilizes a lot of time and energy; by better arranging the boxes to ensure the boxes which are placed on top are the ones which are lifted first, huge savings can be achieved).

[0060] FIG. 9G illustrates yet another user interface for displaying the total number of objects in the storage facility and their associated details.

[0061] FIG. 9H illustrates a user interface for populating details associated with an object when the object enters the storage facility. These details may be populated by the operator on the user device 228. These details are later stored in a memory/database associated with the object tracking system 200 for implementing the embodiments described above.

[0062] FIG. 9I shows the representation seen by the operator who has picked the object to be placed. He can see the object identifier, the identification number of the carrier that brought said object into the vicinity, as well as the recommended optimal location where this container should be placed, which is based on logic in the application code that compares (i) who that object belongs to, (ii) the expected re-picking of that object, (iii) the vacant spaces, and (iv) what spaces are authorized to store an object of a certain quality or attribute.
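The slot-recommendation comparison of criteria (i)–(iv) above might be sketched as follows; the ranking order, field names, and example data are assumptions made for illustration, not the application's actual logic.

```python
# Illustrative slot-recommendation sketch: filter by vacancy (iii) and
# authorization for the object's attribute (iv), then prefer slots reserved
# for the object's owner (i) and, if the object is expected to be re-picked
# soon (ii), slots nearer the gate. All fields and the ranking are assumed.
def recommend_slot(slots, obj):
    candidates = [
        s for s in slots
        if s["vacant"] and obj["attribute"] in s["authorized"]
    ]
    if not candidates:
        return None
    def rank(s):
        owner_match = 0 if s.get("owner") == obj["owner"] else 1
        travel = s["gate_distance"] if obj["repick_soon"] else 0
        return (owner_match, travel)
    return min(candidates, key=rank)["id"]

slots = [
    {"id": "A1", "vacant": True,  "authorized": {"export"}, "owner": "ACME", "gate_distance": 40},
    {"id": "B2", "vacant": True,  "authorized": {"export"}, "owner": "ACME", "gate_distance": 15},
    {"id": "C3", "vacant": False, "authorized": {"export"}, "owner": "ACME", "gate_distance": 5},
]
best = recommend_slot(slots, {"owner": "ACME", "attribute": "export", "repick_soon": True})
```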

[0063] FIG. 9J illustrates another user interface for individuals connected to the application, which allows them to search for any object in the area, see the activity of any object mover connected to our rover, and view other metrics such as the movement of objects within that space between any two periods of time.

[0064] FIGS. 9K and 9L illustrate interfaces in which the application also allows for demarcating certain areas in the space for specific uses only. The application also allows the operator to mark the highest possible stacking limits of objects in those areas and defines the number of grids available.

[0065] FIG. 9M illustrates an interface in which the application allows linking a rover’s device ID with the name of the equipment it is placed on, and also allows configuring the icons/colors to be visible on the interface when viewing said equipment. The application also allows us to allocate the associated equipment to the certain and specific areas of operation referred to above.

[0066] In accordance with the embodiments of this disclosure, the terms “comprising,” “including,” and “having,” as used in the claims and specification herein, shall be considered as indicating an open group that may include other elements not specified. The terms “a,” “an,” and the singular forms of words shall be taken to include the plural form of the same words, such that the terms mean that one or more of something is provided. The term “one” or “single” may be used to indicate that one and only one of something is intended. Similarly, other specific integer values, such as “two,” may be used when a specific number of things is intended. The terms “preferably,” “preferred,” “prefer,” “optionally,” “may,” and similar terms are used to indicate that an item, condition, or step being referred to is an optional (not required) feature of the invention.

[0067] The present disclosure has been described with reference to various specific and preferred embodiments and techniques. However, it should be understood that many variations and modifications may be made while remaining within the spirit and scope of the invention. It will be apparent to one of ordinary skill in the art that methods, devices, device elements, materials, procedures, and techniques other than those specifically described herein can be applied to the practice of the invention as broadly disclosed herein without resort to undue experimentation. All art-known functional equivalents of methods, devices, device elements, materials, procedures, and techniques described herein are intended to be encompassed by this invention. Whenever a range is disclosed, all subranges and individual values are intended to be encompassed. This invention is not to be limited by the embodiments disclosed, including any shown in the drawings or exemplified in the specification, which are given by way of example and not of limitation. Additionally, it should be understood that the various embodiments described herein contain optional features that can be individually or together applied to any other embodiment shown or contemplated here, to be mixed and matched with the features of that embodiment.

[0068] While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this disclosure, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as disclosed herein.