Title:
SYSTEM AND METHOD FOR NAVIGATING A SELF-DRIVING DEVICE TO A USER
Document Type and Number:
WIPO Patent Application WO/2024/013076
Kind Code:
A1
Abstract:
The system for navigating a movable device, such as a self-driving golf trolley, to a user includes a server in communication with the movable device, which in turn is in communication with a remote carried by the user. The server monitors the position of the movable device and extracts a satellite image of the surrounding area of interest (AOI). The movable device monitors the distance of the user from the trolley based on an interaction between the communication units of the remote and the movable device. When the monitored distance between the movable device and the remote/user exceeds a predefined distance, grids of a given resolution are superimposed on the satellite image of the AOI, which enables the movable device to track and navigate to the user by following a path passing through the center points of the respective grids, thereby preventing zig-zag movement of the trolley.

Inventors:
CHAUDHARY NITIN (GB)
Application Number:
PCT/EP2023/069011
Publication Date:
January 18, 2024
Filing Date:
July 10, 2023
Assignee:
INOVOCADDY LTD (GB)
International Classes:
G05D1/02
Domestic Patent References:
WO2021246169A1 (2021-12-09)
Foreign References:
US20120182392A1 (2012-07-19)
US20120163662A1 (2012-06-28)
Attorney, Agent or Firm:
BRYERS LLP et al. (GB)
Claims:
CLAIMS

What is claimed is:

1. A system for navigating a self-driving movable device to a user, the system comprising: a server in communication with the self-driving movable device that is in communication with a positioning device carried by the user, wherein the server is configured to receive real-time position of the movable device and correspondingly extract a satellite image of an area of interest (AOI) surrounding the real-time position of the movable device; and wherein the movable device comprises a processor coupled to a memory storing instructions executable by the processor, and configured to: monitor a distance between the movable device and the user, and correspondingly determine real-time position of the user in the AOI, wherein the position of the user pertains to the position of the positioning device; enable the server to create and superimpose grids of a first predefined resolution on the satellite image of the AOI when the monitored distance between the movable device and the positioning device/user exceeds a predefined distance, and correspondingly receive the created grid map from the server; and track and navigate to the user by following a first path passing through center points of the respective grids between the movable device and the user.

2. The system as set forth in claim 1, wherein when the monitored distance between the movable device and the positioning device/user changes while following the first path, the movable device sub-divides the grids into sub-grids of a second predefined resolution greater than the first predefined resolution, and superimposes the sub-grids on the satellite image of the AOI, and wherein the movable device tracks and navigates to the user by following a second path passing through center points of the respective sub-grids between the movable device and the user.

3. The system as set forth in claim 1, wherein based on the distance between the movable device and the positioning device/user, the movable device repeats the grid-based tracking every predetermined distance until the distance between the movable device and the positioning device/user comes within the predefined distance.

4. The system as set forth in claim 1, wherein when the movable device comes to a halt at a stationary position while tracking the positioning device/user and the movable device identifies the monitored position of the movable device to be scattered in a radius around the stationary position, the position of the movable device is fixed in the AOI at a last position of the movable device in the corresponding grid to mitigate the scattering.

5. The system as set forth in claim 1, wherein when the distance between the movable device and the positioning device/user is within the predefined distance, the movable device follows a path traveled by the positioning device/user to navigate to the positioning device/user.

6. The system as set forth in claim 1, wherein the predefined distance pertains to a distance between the movable device and the positioning device of the user up to which a communication remains established therebetween.

7. The system as set forth in claim 1, wherein the predefined distance up to which the communication between the movable device and the positioning device remains established is 10-12 meters.

8. The system as set forth in claim 1, wherein the movable device comprises: an inertial measurement unit (IMU) comprising a gyroscope, and an accelerometer, the IMU is configured to monitor velocity, angular velocity, orientation, and linear acceleration of the movable device; a global positioning system (GPS) module, wherein the IMU and GPS are configured to enable the server to monitor the real-time 2D position of the movable device in the AOI; an altitude sensor configured to monitor altitude of the movable device in the AOI, wherein the altitude sensor enables conversion of the 2D position of the movable device into a 3D position based on the monitored altitude, and correspondingly facilitates the server to determine and monitor the real-time 3D position of the movable device in the AOI; and at least one communication unit configured at predetermined positions on the movable device.

9. The system as set forth in claim 8, wherein the positioning device is in the form of a remote adapted to be carried by the user, the positioning device comprising: an inertial measurement unit (IMU) comprising a gyroscope, and an accelerometer, the IMU is configured to monitor velocity, angular velocity, orientation, and linear acceleration of the remote; a communication unit, the communication unit of the remote configured to operatively communicate with the communication unit of the movable device, which facilitates the movable device to determine a distance of the positioning device from the movable device, wherein the determined distance of the positioning device from the movable device and the real-time 3D position of the movable device in the AOI enables the movable device to monitor the real-time 2D position of the positioning device and the user in the AOI; and an altitude sensor configured to monitor altitude of the positioning device in the AOI, wherein the altitude sensor enables the movable device to convert the 2D position of the positioning device and the user into a 3D position based on the monitored altitude of the positioning device and the 3D position of the movable device.

10. The system as set forth in claim 9, wherein the data collected by the communication unit, and the altitude sensor of the movable device and the communication unit, and the altitude sensor of the positioning device are denoised and stabilized.

11. The system as set forth in claim 10, wherein the movable device and positioning device are configured with a calibration engine to enable self-calibration of the IMUs associated with the movable device and the positioning device, wherein the calibration engine enables the movable device and positioning device to: receive raw data monitored by the respective IMUs of the movable device and the positioning device and correspondingly create a rotation matrix; extract Euler angles comprising roll, pitch, and yaw associated with the corresponding movable device, and the positioning device, based on the rotation matrix; and stabilize the extracted Euler angles to provide stabilized values of roll, pitch, and yaw associated with the movable device and the positioning device, which facilitates the movable device and the positioning device to determine true north and yaw angles without manual calibration.

12. The system as set forth in claim 9, wherein the system comprises a plurality of the movable devices, and a plurality of the positioning devices associated with a plurality of users in the AOI, such that there is one movable device in communication with one positioning device associated with each of the users in a mesh topology; wherein each of the positioning devices is configured to operatively communicate with the set of communication units associated with each of the movable devices, which facilitates determining a distance between each of the movable devices and a distance between the movable devices and the positioning devices, which correspondingly enables each of the movable devices to determine the position of their respective positioning devices/users in the AOI and navigate the movable devices to the respective positioning devices/users.

13. The system as set forth in claim 1, wherein the self-driving movable device is a self-driving caddy, the user is a golfer, and the AOI is a golf course.

14. The system as set forth in claim 1, wherein the self-driving movable device is selected from a group comprising any or a combination of an unmanned aerial vehicle, a self-driving trolley, and a self-driving vehicle.

15. The system as set forth in claim 1, wherein the movable device is configured to follow the positioning device when the positioning device is in front of the movable device.

16. A system for navigating a self-driving movable device to a user, the system comprising: a server in communication with the self-driving movable device that is in communication with a positioning device carried by the user, wherein the server is configured to receive real-time position of the movable device and correspondingly extract a satellite image of an area of interest (AOI) surrounding the real-time position of the movable device; and wherein the movable device is configured to: monitor a distance between the movable device and the user, and correspondingly determine real-time position of the user in the AOI, wherein the position of the user pertains to the position of the positioning device; enable the server to create and superimpose grids of a first predefined resolution on the satellite image of the AOI when communication between the movable device and the positioning device of the user is interrupted, and correspondingly receive the created grid map from the server; and track and navigate to the user by following a first path passing through center points of the respective grids between the movable device and the user.

17. The system as set forth in claim 13, wherein when the communication between the movable device and the positioning device of the user is interrupted, the movable device operates using the grid-based tracking.

18. A method for navigating a self-driving movable device to a user, the method comprising the steps of: receiving, by a server that is in communication with the movable device that is in communication with a positioning device carried by the user, real-time position of the movable device; extracting, by the server, a satellite image of an area of interest (AOI) surrounding the received position of the movable device; monitoring, by the movable device, a distance between the movable device and the user; creating and superimposing, by the server, grids of a first predefined resolution on the satellite image of the AOI when the monitored distance between the movable device and the positioning device/user exceeds a predefined distance; and tracking and navigating to, by the movable device, the user by following a first path passing through center points of the respective grids between the movable device and the user.

19. The method as set forth in claim 18, wherein when the monitored distance between the movable device and the positioning device/user changes while following the first path, the method comprises the steps of: sub-dividing, by the movable device, the grids into sub-grids of a second predefined resolution greater than the first predefined resolution; superimposing, by the movable device, the sub-grids on the satellite image of the AOI; and tracking and navigating to, by the movable device, the user by following a second path passing through center points of the respective sub-grids between the movable device and the user.

20. The method as set forth in claim 18, wherein based on the distance between the movable device and the positioning device/user, the method comprises the step of repeating, by the movable device, the grid-based tracking every predetermined distance until the distance between the movable device and the positioning device/user comes within the predefined distance.

21. The method as set forth in claim 18, wherein when the movable device comes to a halt at a stationary position while tracking the positioning device/user and the server identifies the monitored position of the movable device to be scattered in a radius around the stationary position, the method comprises the step of fixing the position of the movable device at a last position of the movable device in the corresponding grid to mitigate the scattering.

22. The method as set forth in claim 18, wherein when the distance between the movable device and the positioning device/user comes within the predefined distance, the method comprises the step of following, by the movable device, a path traveled by the positioning device/user to navigate to the positioning device/user.

23. The method as set forth in claim 18, wherein the method of self-calibrating IMUs associated with the movable device and the positioning device comprises the steps of: receiving raw data monitored by the IMUs of the movable device and the positioning device and correspondingly creating a rotation matrix; extracting Euler angles comprising roll, pitch, and yaw associated with the movable device and the positioning device, based on the created rotation matrix; and stabilizing the extracted Euler angles using a filter to provide stabilized values of roll, pitch, and yaw associated with the movable device and the positioning device, which facilitates the movable device and the positioning device to determine true north and yaw angle without manual calibration.

24. The method as set forth in claim 18, wherein the method comprises the steps of: communicatively coupling a plurality of the movable devices, and a plurality of the positioning devices associated with a plurality of users, with each other, in a mesh topology in the AOI, such that there is one movable device in communication with one positioning device associated with each of the users; wherein each of the positioning devices is configured to operatively communicate with the set of communication units associated with each of the movable devices; and determining a distance between each of the movable devices, which correspondingly facilitates determining the position of each of the movable devices from the respective positioning devices/users and navigating the movable devices to the respective positioning devices/users.

25. The method as set forth in claim 18, wherein when communication between the movable device and the positioning device of the user is interrupted, the method comprises the steps of: creating and superimposing, by the server, grids of the first predefined resolution on the satellite image of the AOI; and tracking and navigating to, by the movable device, the user by following the first path passing through center points of the respective grids between the movable device and the user.

26. The method as set forth in claim 24, wherein when the communication between the movable device and the positioning device of the user is interrupted, the method comprises the step of notifying the communication interruption to the server, which enables the movable device to operate using the grid-based tracking method.

27. A vehicle configured to follow a positioning device, the vehicle comprising: a receiving means arranged to obtain at least one radio signal emitted from the positioning device, and satellite location data; and a controller configured to: identify the global location of the vehicle based on the obtained satellite location data; determine a relative position of the positioning device with respect to the vehicle, based on the at least one radio signal; determine the global location of the positioning device based on the global location of the vehicle and the relative position of the positioning device; and provide a control signal to direct the vehicle towards the positioning device based on the determined global position of the positioning device.

28. The vehicle as set forth in claim 27, wherein the controller is configured to determine the relative position of the positioning device based on the difference between the time of arrival of the signal received by a first receiver among the receiving means and the time of arrival of the signal received by at least one other receiver among the receiving means.

29. The vehicle as set forth in claim 27, wherein each of the receivers is arranged over the vehicle to receive a plurality of signals emitted from the positioning device; wherein the controller is configured to: calculate a plurality of possible relative positions of the positioning device based on the plurality of received signals; determine a plurality of possible global positions of the positioning device corresponding to the plurality of calculated possible relative positions; calculate the midpoint of the plurality of possible global positions; and select the midpoint as the global position of the positioning device.

30. The vehicle as set forth in claim 27, wherein the controller is configured to: obtain gridded map data representing a first plurality of tiles overlaying a map of a geographical location, and determine the global location of the vehicle by selecting one of the tiles based on the obtained global location data.

31. The vehicle as set forth in claim 27, wherein the controller is configured to sub-divide the obtained gridded map data into a second plurality of tiles overlaying the geographic location, and determine the global location of the vehicle by selecting one of the second plurality of tiles based on the obtained global location data.

Description:
SYSTEM AND METHOD FOR NAVIGATING A SELF-DRIVING DEVICE TO A USER

BACKGROUND

[0001] The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.

[0002] Since the invention of golf in the 15th century, golf clubs, putters, and golf balls have been the basic components a golfer needs to play a round of golf. As the game progressed, golfers began using multiple clubs and putters depending on the distance, ground conditions, and wind, and therefore had to carry several clubs and putters with them. Different mechanisms were later invented to motorize the caddy to help golfers, and this motorized caddy subsequently evolved into a remote-controlled trolley. A trolley remains a near necessity, and many golfers look for a suitable one. With the advancement of technology, motorized trolleys controlled by a remote were introduced so that golfers no longer need to push or pull the trolley around; with just a few presses of a button, the golfer can bring the trolley to the required location. More recently, trolleys have been introduced that can follow the golfer without any user interaction.

[0003] The existing remote-controlled trolleys involve a straightforward way of finding the global position of the golfer using a global positioning system (GPS) so that the trolley can automatically navigate to the position of the golfer. However, one cannot rely on GPS alone because it is not accurate all the time, particularly at stationary points. So, to attain an accurate global position of the golfer, IMU sensors are usually integrated along with the GPS data. However, the IMU sensors need calibration, which requires rotating the IMU sensor in all directions. As a result, the IMU may produce inaccurate measurements if not calibrated properly. Also, the calibration needs to be repeated every time the IMU sensor is turned on again, which increases the cost and effort for both the consumer and the support team.

[0004] In the existing prior art, the trolley always tries to track the remote carried by the golfer. However, at some positions, the distance between the trolley and the remote becomes too great, which leads to interruption of the connection between the trolley and the remote. Thus, because of hardware restrictions, the existing trolley cannot correctly find the direction of the remote to follow. This may also happen when the remote is turned on while it is far from the trolley, in which case the trolley cannot find the direction of the remote, as well as in other circumstances. In this situation, the trolley tracks the remote in a zigzag path, which is not desirable.

[0005] Moreover, even with the combination of IMU and GPS data fusion in existing technologies, GPS coordinates can scatter around in a radius when the trolley is stationary. The scatter radius of a GPS receiver is typically around 10 meters, for example, which can be reduced to some extent but cannot be completely mitigated. In addition, existing technologies fail to work accurately under real-world difficulties such as the lack of a GPS fix in cloudy weather, the absence of a line of sight from the trolley to the user, loss of GPS due to vegetation coverage, and the presence of unexpected obstacles.

[0006] Besides, the above issues faced by the self-driving caddy and golfers are also faced by other unmanned self-driving devices such as unmanned aerial vehicles (UAV), unmanned water vehicles (UWV), self-driving vehicles, automated luggage and containers, and the like, which rely on GPS and IMU data fusion for tracking a user carrying the remote and navigating the unmanned self-driving devices to the user.

[0007] There is, therefore, a need to overcome the drawbacks, shortcomings, and limitations associated with existing navigation and tracking systems for self-driving caddies, by providing an improved, accurate, and efficient solution to enable automated tracking of the golfer (user) and navigation of the self-driving caddy (or other unmanned self-driving devices) to the golfer (user) in difficult real-world conditions and also when the connection between the self-driving caddy (unmanned self-driving device) and the remote of the golfer (user) is interrupted. Further, there is a need to overcome the calibration restrictions associated with IMU sensors by enabling self-calibration of the IMU sensors without any human intervention.

SUMMARY OF THE INVENTION

[0008] The present disclosure relates to an improved, accurate, and efficient solution to enable automated tracking of the golfer (user) and navigation of a self-driving caddy (or similar unmanned self-driving device) to the golfer (user) in difficult real-world conditions and also when the connection between the self-driving caddy (unmanned self-driving device) and the remote of the golfer (user) is interrupted.

[0009] According to an aspect, the present disclosure provides a system and method for accurate and efficient tracking of the positions of a user (golfer) carrying a positioning device (remote) and a movable device (golf trolley) in an area of interest (golf course), which enables automated navigation of the movable device to the user without any human interaction. The movable device and the positioning device have a self-calibration engine, so they do not need calibration in a factory or thereafter during use in the AOI. The proposed system and method involve a fusion-based stable tracking system using communication units (also referred to as CUs, herein) including but not limited to ultra-wideband (UWB) modules, radio frequency (RF) modules, Bluetooth (BLE) modules, image sensors/cameras, infrared (IR) sensors, ultrasonic sensors, time of flight (TOF) sensors, and other technologies available in the art. One or more CUs are positioned on the movable device/trolley side, along with an altitude sensor, a global positioning system (GPS) module, and an inertial measurement unit (IMU), to determine the exact 3D position of the trolley in the AOI. Further, one CU is on the remote, which remains in communication with the CUs of the trolley, and an altitude sensor and IMU are also provided on the positioning device/remote, which allows the trolley to determine the distance of the remote/user from the trolley. The proposed system and method involve a grid-based path-finding or grid-based tracking system, which alleviates fluctuations (zig-zag motion) of the trolley when communication between the trolley and the remote is interrupted or the distance between the trolley and the user is beyond 10-12 meters, as sketched below. The proposed system and method further provide an adaptive mesh network between multiple trolleys, which can use other trolleys in the vicinity to help locate users' positions better.
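As a purely illustrative aid to the grid-based path-finding mentioned above, the following minimal Python sketch shows one way a path through grid-cell centre points could be constructed between the trolley and the user. The grid resolution, local coordinate frame, and names such as `Grid` and `grid_path` are assumptions made for illustration only and are not taken from the claimed implementation:

```python
# Illustrative sketch (not the patented implementation): superimpose a grid of a
# given resolution on the AOI and build a path through the centre points of the
# grid cells lying between the trolley and the user.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Grid:
    origin_x: float   # west edge of the AOI (metres, local frame)
    origin_y: float   # south edge of the AOI
    cell_size: float  # first predefined resolution (metres per cell)

    def cell_of(self, x: float, y: float) -> Tuple[int, int]:
        """Index of the grid cell containing point (x, y)."""
        return (int((x - self.origin_x) // self.cell_size),
                int((y - self.origin_y) // self.cell_size))

    def centre_of(self, col: int, row: int) -> Tuple[float, float]:
        """Centre point of cell (col, row)."""
        return (self.origin_x + (col + 0.5) * self.cell_size,
                self.origin_y + (row + 0.5) * self.cell_size)

def grid_path(grid: Grid, trolley_xy, user_xy) -> List[Tuple[float, float]]:
    """Waypoints through the centres of the cells between trolley and user."""
    c0, r0 = grid.cell_of(*trolley_xy)
    c1, r1 = grid.cell_of(*user_xy)
    steps = max(abs(c1 - c0), abs(r1 - r0), 1)
    cells = []
    for i in range(steps + 1):               # walk cell indices towards the user
        col = round(c0 + (c1 - c0) * i / steps)
        row = round(r0 + (r1 - r0) * i / steps)
        if (col, row) not in cells:
            cells.append((col, row))
    return [grid.centre_of(col, row) for col, row in cells]

# Example: a 5 m grid; the trolley steers toward successive cell centres.
waypoints = grid_path(Grid(0.0, 0.0, 5.0), trolley_xy=(2.0, 3.0), user_xy=(43.0, 27.0))
```

In this sketch the trolley only ever steers towards the next cell centre, which reflects how the grid-based tracking described above avoids zig-zag motion when the user's estimated position fluctuates.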

[0010] According to another aspect, the present disclosure also overcomes the calibration restrictions associated with IMU sensors by enabling self-calibration of the IMU sensors without any human intervention. This self-calibration of the IMU has wide application in every sector involving IMU sensors.

[0011] Further aspects and embodiments of the present invention are found in the following numbered paragraphs.

[0012] 1. A system for navigating a self-driving movable device to a user, the system comprising: a server in communication with the self-driving movable device that is in communication with a positioning device carried by the user, wherein the server is configured to receive real-time position of the movable device and correspondingly extract a satellite image of an area of interest (AOI) surrounding the real-time position of the movable device; and wherein the movable device comprises a processor coupled to a memory storing instructions executable by the processor, and configured to: monitor a distance between the movable device and the user, and correspondingly determine real-time position of the user in the AOI, wherein the position of the user pertains to the position of the positioning device; enable the server to create and superimpose grids of a first predefined resolution on the satellite image of the AOI when the monitored distance between the movable device and the positioning device/user exceeds a predefined distance, and correspondingly receive the created grid map from the server; and track and navigate to the user by following a first path passing through center points of the respective grids between the movable device and the user.

[0013] 2. The system as set forth in paragraph 1, wherein when the monitored distance between the movable device and the positioning device/user changes while following the first path, the movable device sub-divides the grids into sub-grids of a second predefined resolution greater than the first predefined resolution, and superimposes the sub-grids on the satellite image of the AOI, and wherein the movable device tracks and navigates to the user by following a second path passing through center points of the respective sub-grids between the movable device and the user.

[0014] 3. The system as set forth in paragraph 1, wherein based on the distance between the movable device and the positioning device/user, the movable device repeats the grid-based tracking every predetermined distance until the distance between the movable device and the positioning device/user comes within the predefined distance.

[0015] 4. The system as set forth in paragraph 1, wherein when the movable device comes to a halt at a stationary position while tracking the positioning device/user and the movable device identifies the monitored position of the movable device to be scattered in a radius around the stationary position, the position of the movable device is fixed in the AOI at a last position of the movable device in the corresponding grid to mitigate the scattering.

[0016] 5. The system as set forth in paragraph 1, wherein when the distance between the movable device and the positioning device/user is within the predefined distance, the movable device follows a path traveled by the positioning device/user to navigate to the positioning device/user.

[0017] 6. The system as set forth in paragraph 1, wherein the predefined distance pertains to a distance between the movable device and the positioning device of the user up to which a communication remains established therebetween.

[0018] 7. The system as set forth in paragraph 1, wherein the predefined distance up to which the communication between the movable device and the positioning device remains established is 10-12 meters.

[0019] 8. The system as set forth in paragraph 1, wherein the movable device comprises: an inertial measurement unit (IMU) comprising a gyroscope, and an accelerometer, the IMU is configured to monitor velocity, angular velocity, orientation, and linear acceleration of the movable device; a global positioning system (GPS) module, wherein the IMU and GPS are configured to enable the server to monitor the real-time 2D position of the movable device in the AOI; an altitude sensor configured to monitor altitude of the movable device in the AOI, wherein the altitude sensor enables conversion of the 2D position of the movable device into a 3D position based on the monitored altitude, and correspondingly facilitates the server to determine and monitor the real-time 3D position of the movable device in the AOI; and at least one communication unit configured at predetermined positions on the movable device.

[0020] 9. The system as set forth in paragraph 8, wherein the positioning device is in the form of a remote adapted to be carried by the user, the positioning device comprising: an inertial measurement unit (IMU) comprising a gyroscope, and an accelerometer, the IMU is configured to monitor velocity, angular velocity, orientation, and linear acceleration of the remote; a communication unit, the communication unit of the remote configured to operatively communicate with the communication unit of the movable device, which facilitates the movable device to determine a distance of the positioning device from the movable device, wherein the determined distance of the positioning device from the movable device and the real-time 3D position of the movable device in the AOI enables the movable device to monitor the real-time 2D position of the positioning device and the user in the AOI; and an altitude sensor configured to monitor altitude of the positioning device in the AOI, wherein the altitude sensor enables the movable device to convert the 2D position of the positioning device and the user into a 3D position based on the monitored altitude of the positioning device and the 3D position of the movable device.

[0021] 10. The system as set forth in paragraph 9, wherein the data collected by the communication unit, and the altitude sensor of the movable device and the communication unit, and the altitude sensor of the positioning device are denoised and stabilized.

[0022] 11. The system as set forth in paragraph 10, wherein the movable device and positioning device are configured with a calibration engine to enable self-calibration of the IMUs associated with the movable device and the positioning device, wherein the calibration engine enables the movable device and positioning device to: receive raw data monitored by the respective IMUs of the movable device and the positioning device and correspondingly create a rotation matrix; extract Euler angles comprising roll, pitch, and yaw associated with the corresponding movable device, and the positioning device, based on the rotation matrix; and stabilize the extracted Euler angles to provide stabilized values of roll, pitch, and yaw associated with the movable device and the positioning device, which facilitates the movable device and the positioning device to determine true north and yaw angles without manual calibration.

[0023] 12. The system as set forth in paragraph 9, wherein the system comprises a plurality of the movable devices, and a plurality of the positioning devices associated with a plurality of users in the AOI, such that there is one movable device in communication with one positioning device associated with each of the users in a mesh topology; wherein each of the positioning devices is configured to operatively communicate with the set of communication units associated with each of the movable devices, which facilitates determining a distance between each of the movable devices and a distance between the movable devices and the positioning devices, which correspondingly enables each of the movable devices to determine the position of their respective positioning devices/users in the AOI and navigate the movable devices to the respective positioning devices/users.

[0024] 13. The system as set forth in paragraph 1, wherein the self-driving movable device is a self-driving caddy, the user is a golfer, and the AOI is a golf course.

[0025] 14. The system as set forth in paragraph 1, wherein the self-driving movable device is selected from a group comprising any or a combination of an unmanned aerial vehicle, a self-driving trolley, and a self-driving vehicle.

[0026] 15. The system as set forth in paragraph 1, wherein the movable device is configured to follow the positioning device when the positioning device is in front of the movable device.

[0027] 16. A system for navigating a self-driving movable device to a user, the system comprising: a server in communication with the self-driving movable device that is in communication with a positioning device carried by the user, wherein the server is configured to receive real-time position of the movable device and correspondingly extract a satellite image of an area of interest (AOI) surrounding the real-time position of the movable device; and wherein the movable device is configured to: monitor a distance between the movable device and the user, and correspondingly determine real-time position of the user in the AOI, wherein the position of the user pertains to the position of the positioning device; enable the server to create and superimpose grids of a first predefined resolution on the satellite image of the AOI when communication between the movable device and the positioning device of the user is interrupted, and correspondingly receive the created grid map from the server; and track and navigate to the user by following a first path passing through center points of the respective grids between the movable device and the user.

[0028] 17. The system as set forth in paragraph 13, wherein when the communication between the movable device and the positioning device of the user is interrupted, the movable device operates using the grid-based tracking.

[0029] 18. A method for navigating a self-driving movable device to a user, the method comprising the steps of: receiving, by a server that is in communication with the movable device that is in communication with a positioning device carried by the user, real-time position of the movable device; extracting, by the server, a satellite image of an area of interest (AOI) surrounding the received position of the movable device; monitoring, by the movable device, a distance between the movable device and the user; creating and superimposing, by the server, grids of a first predefined resolution on the satellite image of the AOI when the monitored distance between the movable device and the positioning device/user exceeds a predefined distance; and tracking and navigating to, by the movable device, the user by following a first path passing through center points of the respective grids between the movable device and the user.

[0030] 19. The method as set forth in paragraph 18, wherein when the monitored distance between the movable device and the positioning device/user changes while following the first path, the method comprises the steps of: sub-dividing, by the movable device, the grids into sub-grids of a second predefined resolution greater than the first predefined resolution; superimposing, by the movable device, the sub-grids on the satellite image of the AOI; and tracking and navigating to, by the movable device, the user by following a second path passing through center points of the respective sub-grids between the movable device and the user.

[0031] 20. The method as set forth in paragraph 18, wherein based on the distance between the movable device and the positioning device/user, the method comprises the step of repeating, by the movable device, the grid-based tracking every predetermined distance until the distance between the movable device and the positioning device/user comes within the predefined distance.

[0032] 21. The method as set forth in paragraph 18, wherein when the movable device comes to a halt at a stationary position while tracking the positioning device/user and the server identifies the monitored position of the movable device to be scattered in a radius around the stationary position, the method comprises the step of fixing the position of the movable device at a last position of the movable device in the corresponding grid to mitigate the scattering.

[0033] 22. The method as set forth in paragraph 18, wherein when the distance between the movable device and the positioning device/user comes within the predefined distance, the method comprises the step of following, by the movable device, a path traveled by the positioning device/user to navigate to the positioning device/user.

[0034] 23. The method as set forth in paragraph 18, wherein the method of self-calibrating IMUs associated with the movable device and the positioning device comprises the steps of: receiving raw data monitored by the IMUs of the movable device and the positioning device and correspondingly creating a rotation matrix; extracting Euler angles comprising roll, pitch, and yaw associated with the movable device and the positioning device, based on the created rotation matrix; and stabilizing the extracted Euler angles using a filter to provide stabilized values of roll, pitch, and yaw associated with the movable device and the positioning device, which facilitates the movable device and the positioning device to determine true north and yaw angle without manual calibration.

[0035] 24. The method as set forth in paragraph 18, wherein the method comprises the steps of: communicatively coupling a plurality of the movable devices, and a plurality of the positioning devices associated with a plurality of users, with each other, in a mesh topology in the AOI, such that there is one movable device in communication with one positioning device associated with each of the users; wherein each of the positioning devices is configured to operatively communicate with the set of communication units associated with each of the movable devices; and determining a distance between each of the movable devices, which correspondingly facilitates determining the position of each of the movable devices from the respective positioning devices/users and navigating the movable devices to the respective positioning devices/users.

[0036] 25. The method as set forth in paragraph 18, wherein when communication between the movable device and the positioning device of the user is interrupted, the method comprises the steps of: creating and superimposing, by the server, grids of the first predefined resolution on the satellite image of the AOI; and tracking and navigating to, by the movable device, the user by following the first path passing through center points of the respective grids between the movable device and the user.

[0037] 26. The method as set forth in paragraph 24, wherein when the communication between the movable device and the positioning device of the user is interrupted, the method comprises the step of notifying the communication interruption to the server, which enables the movable device to operate using the grid-based tracking method.

[0038] 27. A vehicle configured to follow a positioning device, the vehicle comprising: a receiving means arranged to obtain at least one radio signal emitted from the positioning device, and satellite location data; and a controller configured to: identify the global location of the vehicle based on the obtained satellite location data; determine a relative position of the positioning device with respect to the vehicle, based on the at least one radio signal; determine the global location of the positioning device based on the global location of the vehicle and the relative position of the positioning device; and provide a control signal to direct the vehicle towards the positioning device based on the determined global position of the positioning device.

[0039] 28. The vehicle as set forth in paragraph 27, wherein the controller is configured to determine the relative position of the positioning device based on the difference between the time of arrival of the signal received by a first receiver among the receiving means and the time of arrival of the signal received by at least one other receiver among the receiving means.

[0040] 29. The vehicle as set forth in paragraph 27, wherein each of the receivers is arranged over the vehicle to receive a plurality of signals emitted from the positioning device; wherein the controller is configured to: calculate a plurality of possible relative positions of the positioning device based on the plurality of received signals; determine a plurality of possible global positions of the positioning device corresponding to the plurality of calculated possible relative positions; calculate the midpoint of the plurality of possible global positions; and select the midpoint as the global position of the positioning device.

[0041] 30. The vehicle as set forth in paragraph 27, wherein the controller is configured to: obtain gridded map data representing a first plurality of tiles overlaying a map of a geographical location, and determine the global location of the vehicle by selecting one of the tiles based on the obtained global location data.

[0042] 31. The vehicle as set forth in paragraph 27, wherein the controller is configured to subdivide the obtained gridded map data into a second plurality of tiles overlaying the geographic location, and determine the global location of the vehicle by selecting one of the second plurality of tiles based on the obtained global location data.

BRIEF DESCRIPTION OF THE DRAWINGS

[0043] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.

[0044] In the drawings, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.

[0045] FIG. 1 illustrates an exemplary block diagram of the proposed system comprising the positioning device (remote carried by the user) in communication with the movable device (trolley) according to an embodiment of the invention.

[0046] FIG. 2 illustrates an exemplary overview of the tracking engine of the proposed system according to an embodiment of the invention.

[0047] FIG. 3 illustrates an exemplary overview of the tracking engine, calibration engine, and global positioning engine being configured on the trolley of the proposed system according to an embodiment of the invention.

[0048] FIG. 4 illustrates an exemplary overview of the self-calibration engine or mechanism of the proposed system according to an embodiment of the invention.

[0049] FIG. 5 illustrates an exemplary overview depicting the usage of sensors in the remote of the user according to an embodiment of the invention.

[0050] FIG. 6 illustrates an exemplary representation depicting the working of the gridding engine of the proposed system according to an embodiment of the invention.

[0051] FIGs. 7A and 7B illustrate exemplary representations depicting the working of the gridding engine in the case of stationary gridding according to an embodiment of the invention.

[0052] FIG. 8A illustrates an exemplary representation depicting multiple independent trolleys of the proposed system in a star topology network without any communication therebetween according to an embodiment of the invention.

[0053] FIG. 8B illustrates an exemplary representation depicting multiple trolleys of the proposed system in a mesh topology network and in communication with each other according to an embodiment of the invention.

[0054] FIG. 9 illustrates exemplary functional blocks/components involved in the proposed system according to an embodiment of the invention.

[0055] FIG. 10 illustrates an exemplary flow diagram of the proposed method according to an embodiment of the invention.

[0056] FIG. 11 illustrates an exemplary flow diagram depicting the working of the gridding engine according to an embodiment of the invention.

[0057] FIG. 12 illustrates an exemplary flow diagram depicting the working of the calibration engine according to an embodiment of the invention.

DETAILED DESCRIPTION

[0058] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.

[0059] In the specification, reference may be made to the spatial relationships between various components and to the spatial orientation of various aspects of components as the devices are depicted in the attached drawings. However, as will be recognized by those skilled in the art after a complete reading of the present application, the devices, members, etc. described herein may be positioned in any desired orientation. Thus, the use of terms such as “above,” “below,” “upper,” “lower,” “first”, “second” or other like terms to describe a spatial relationship between various components or to describe the spatial orientation of aspects of such components should be understood to describe a relative relationship between the components or a spatial orientation of aspects of such components, respectively, as the device described herein may be oriented in any desired direction.

[0060] FIG. 1 illustrates an exemplary block diagram of the proposed system 100 for tracking a user 108 and navigating a self-driving movable device 104 (or vehicle 104) to the user 108. The system 100 includes a remote 106 (also referred to as a positioning device 106, herein) that can be carried by the user 108. The remote 106 is configured to communicatively interact with the movable device 104 (also referred to as trolley 104, herein) to determine a distance between them, which enables the movable device 104 of the system to determine the exact positions of the remote 106/user 108 and the movable device 104 in an area of interest (AOI) and further enables the movable device 104 to track and navigate to the remote 106/user 108. In an exemplary embodiment, the AOI may be a golf course, but is not limited thereto, and the movable device 104 may be a self-driving golf caddy (also referred to as the trolley 104, herein) that is configured to carry golfing tools and accessories for the user 108. In another exemplary embodiment, the AOI may be an airfield and the movable device 104 may be an unmanned aerial vehicle (UAV). In yet another embodiment, the AOI may be a water body and the movable device 104 may be an unmanned water vehicle (UWV). Those skilled in the art will appreciate that, while various embodiments and figures of the present application are elaborated in terms of a golf caddy/trolley as the movable device and a golfer as the user 108 on a golf course as the AOI for the sake of simplicity and easier explanation, the teachings of the present application are equally implementable for UAVs, UWVs, self-driving vehicles, and the like, and all such embodiments are well within the scope of the present application without any limitation.

[0061] The trolley 104 (movable device) includes a global positioning system (GPS) module 104-1, an inertial measurement unit (IMU) 104-2, an altitude sensor 104-3, one or more communication units (CUs) or receiving means (for example, but not limited to, CU-1, CU-2, or CU-3, or a combination thereof), a miniprocessor, and a display 104-C comprising a monitor 104-5. The CUs can be selected from any or a combination of ultra-wideband (UWB) modules, radio frequency (RF) modules, Bluetooth (BLE) modules, transceivers, image sensors/cameras, infrared (IR) sensors, ultrasonic sensors, time of flight (TOF) sensors, and other known wireless communication modules available in the art. Further, an encoder 104-4 is configured on the trolley 104 to encrypt the data captured by the sensors before transmitting the data to the server 102 or other trolleys 104, and a decoder is configured to decrypt the data received from the server 102 or multiple trolleys. The remote 106 includes an inertial measurement unit 106-1, an altitude sensor 106-2, and one communication unit (CU-4). In an exemplary embodiment, one CU can be configured on the trolley 104, which can communicate with the CU of the remote 106. In another exemplary embodiment, two CUs can be configured on the trolley 104, wherein both CUs of the trolley 104 can communicate with the CU of the remote 106. In yet another exemplary embodiment, as shown in FIG. 1, the CUs can be configured at the left wheel 104-A, right wheel 104-B, and front display 104-C of the trolley 104. While various embodiments and figures of the present disclosure elaborate upon the use of three CUs (CU-1 to CU-3) on the trolley for the sake of exemplary explanation, the number of CUs on the trolley 104 can be one, two, or more than three, based on the type of CU being employed in the system and the requirements, and all such embodiments are well within the scope of the present disclosure without any limitation.

[0062] System 100 includes a server 102 in communication with the movable device or trolley 104 through a network. Further, the trolley 104 remains in communication with the remote 106 using the CUs of the trolley 104 and remote 106. The IMU 104-2 and GPS 104-1 of the trolley 104 enable server 102 to determine and monitor the exact 2D position of the trolley 104 in the AOI. The altitude sensor 104-3 further helps determine the altitude of the trolley 104, which helps convert the determined 2D position of the trolley 104 into a 3D position. Further, the interaction between the CU of the remote 106 and the CUs of the trolley 104, and the IMU 106-1 of the remote 106, enables the trolley 104 to determine a distance between the trolley 104 and the remote 106. The movable device 104 then determines the exact 2D position of the remote 106/user 108 in the AOI based on the position of the trolley 104 and the distance between the trolley 104 and the remote 106. Later, the altitude sensor 106-2 of the remote 106 helps determine the altitude of the remote 106/user 108 and enables the movable device 104 to convert the 2D position of the remote 106/user 108 into a 3D position. Accordingly, the movable device 104 is actuated to move forward and navigate to the identified 3D position of the remote 106/user 108 in the AOI, efficiently and accurately. In yet another embodiment, the movable device 104 is also configured to follow the remote 106/user 108 from in front of the remote 106/user 108, when the movable device 104 is detected to be in front of the remote 106/user 108 in the AOI.

[0063] In an embodiment, the movable device or vehicle 104 can include a controller configured to identify the global location of the movable device or vehicle 104 based on the obtained satellite location data, and further determine a relative position of the positioning device 106 to the movable device or vehicle 104, based on at least one radio signal communicated between the CUs of the movable device 104 and the positioning device 106. The controller can determine the global location of the positioning device 106 based on the global location of the movable device or vehicle 104 and the relative position of the positioning device 106, and can correspondingly provide a control signal to direct the movable device or vehicle 104 towards the positioning device 106 based on the determined global position of the positioning device 106.
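As a hypothetical illustration of the last step in paragraph [0063], the Python sketch below turns the determined global position of the positioning device into a simple differential-drive control signal that points the trolley towards it. The gains, wheel-speed convention, and the name `control_signal` are assumptions for illustration and are not taken from the disclosure:

```python
# Illustrative sketch only: derive left/right wheel speeds that steer the
# trolley towards the determined global position of the positioning device.
import math

def control_signal(trolley_xy, trolley_heading_rad, remote_xy,
                   base_speed=1.0, turn_gain=1.5):
    """Return (left_wheel_speed, right_wheel_speed) steering towards remote_xy."""
    dx = remote_xy[0] - trolley_xy[0]
    dy = remote_xy[1] - trolley_xy[1]
    bearing = math.atan2(dy, dx)                       # direction to the remote
    error = math.atan2(math.sin(bearing - trolley_heading_rad),
                       math.cos(bearing - trolley_heading_rad))  # wrap to [-pi, pi]
    turn = turn_gain * error
    return base_speed - turn, base_speed + turn        # differential wheel speeds

# Example: trolley at (10, 5) m heading due "north" (90 deg), remote at (20, 25) m.
left, right = control_signal((10.0, 5.0), math.radians(90), (20.0, 25.0))
```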

[0064] It would be obvious to a person skilled in the art that one cannot rely only on GPS for determining the accurate position of the trolley 104, because GPS is not accurate all the time, particularly at stationary points, as in the case of the existing technologies. So, to attain an accurate global position, the IMU sensor is integrated into the proposed system 100/trolley 104 along with the GPS module 104-1. The IMU sensor is configured on a stable platform in the remote 106 or the trolley 104. The IMU measures acceleration and rotation using its accelerometer, gyroscope, and magnetometer. The coupled GPS and IMU data enable sensor fusion, which processes the GPS and IMU data to stabilize the location, velocity, and acceleration for both the remote 106 and the trolley 104. Although the positions of both the trolley 104 and the remote 106 are needed in the proposed system 100 for efficient and accurate tracking and navigation, a GPS module cannot be configured within the remote 106 because of various technical reasons and hardware restrictions associated with GPS. For instance, a GPS module provided on the remote would remain ON in searching mode and consume more electrical power; as a result, frequent charging of the remote 106 would be required, or a higher-capacity battery would be needed, making the remote 106 bulkier and heavier and unpleasant for the user to carry. In addition, as GPS generally has an accuracy of only about 5 meters, a GPS module implemented in the remote 106 may not be able to provide an accurate location of the remote 106 to the trolley 104, which would make the overall navigation process inaccurate. Further, GPS also fails to work properly in cloudy/rainy conditions and when the line of sight is disturbed due to vegetation coverage, the presence of unexpected obstacles, or when the remote remains in the pocket of the user, thereby making it unreliable. To overcome this, the proposed system 100 determines the local position of the remote 106 with respect to the trolley 104 based on the interaction between the communication units of the remote 106 and the trolley 104. Then, by adding this to the global position data of the trolley 104, the global position of the remote 106 is also determined.
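A minimal sketch of this local-to-global step, assuming a local east-north frame in metres and a known trolley heading; the function name and example values are hypothetical and not part of the disclosure:

```python
# Illustrative sketch (assumed east-north-up frame, metre units): the remote's
# position measured relative to the trolley is rotated by the trolley's heading
# and added to the trolley's GPS/IMU-derived global position, with altitude
# taken from the remote's altitude sensor, as described above.
import math

def remote_global_position(trolley_east, trolley_north, trolley_heading_rad,
                           remote_local_x, remote_local_y, remote_alt):
    """Global (east, north, altitude) of the remote from trolley state + local fix."""
    cos_h, sin_h = math.cos(trolley_heading_rad), math.sin(trolley_heading_rad)
    # Rotate the trolley-frame offset into the global frame, then translate.
    east  = trolley_east  + remote_local_x * cos_h - remote_local_y * sin_h
    north = trolley_north + remote_local_x * sin_h + remote_local_y * cos_h
    return east, north, remote_alt

# Example: trolley at (512.3, 208.7) m, heading 30 deg; remote 6 m ahead, 2.5 m left.
pos = remote_global_position(512.3, 208.7, math.radians(30), 6.0, 2.5, 42.1)
```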

[0065] The IMU includes gyroscope and accelerometer sensors. Gyroscopes and accelerometers are motion-sensing devices that measure the rate of rotation (angular velocity) and linear acceleration, respectively. The velocity, position, and orientation of an object are tracked by integrating acceleration and angular velocity over time. The IMU suffers from error propagation in its measurements. The accumulated error, known as drift, grows rapidly and makes the IMU output unreliable for navigation purposes. Thus, the IMU is usually fused with the GPS for improved results. But the IMU needs calibration, which is done by rotating the sensor in all directions for accurate measurements. Also, the calibration needs to be repeated once the IMU is turned on again, which is time-consuming and tedious. The present application overcomes the manual calibration required by the IMU by providing a self-calibration engine for the proposed system 100, which is discussed in detail in FIGs. 4 and 12.

[0066] Referring to FIGs. 2 and 3, the tracking engine 200 or tracking system 302 of the proposed system 100 involves any or a combination of time of flight (TOF) data, time of arrival (TOA) data, time difference of arrival (TDOA) data, or angle of arrival (AOA) data coming from the communication units of the trolley 104 and remote 106 at block 202. In an implementation, one communication unit CU-4 can be on the remote 106, one communication unit CU-1 can be on the left wheel 104-A of the trolley (left anchor), one communication unit CU-2 can be on the right wheel 104-B of the trolley (right anchor), and one communication unit CU-3 can be on the front display 104-C of the trolley, as shown in FIGs. 1 and 8A, such that triangulation between the CUs is achieved, which helps determine the TOF data and correspondingly enables the tracking engine to determine the distance between the trolley 104 and the remote 106. In another implementation, one CU can be configured on the remote 106, and two communication units can be configured on the trolley 104, wherein one BLE module (first CU) is used for AOA data and one UWB module (second CU) is used for TOF data, which correspondingly enables the tracking engine to determine the distance between the trolley 104 and the remote 106. In yet another implementation, a directional antenna (for example a patch antenna) having multiple CUs can be configured around the trolley 104 such that complete 360° coverage around the trolley is achieved, and one CU can be configured on the remote 106, which helps determine the AOA data and correspondingly enables the tracking engine to determine the distance between the trolley 104 and the remote 106.
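As an illustration of how ranges from several anchor CUs can be turned into a local 2D position, the following Python sketch applies a standard linearised trilateration; the anchor coordinates, range values, and function name are assumed examples, and the actual tracking engine may use a different method:

```python
# Illustrative sketch only: estimate the remote's local (x, y) from the ranges
# measured between its CU and three anchor CUs on the trolley (e.g. left wheel,
# right wheel, front display). Subtracting the first range equation from the
# others linearises the problem into a 2x2 system solved by Cramer's rule.
from typing import List, Tuple

def trilaterate(anchors: List[Tuple[float, float]], ranges: List[float]):
    """Solve |p - anchor_i| = range_i for p = (x, y), linearised about anchor 0."""
    (x0, y0), r0 = anchors[0], ranges[0]
    a11 = 2 * (anchors[1][0] - x0); a12 = 2 * (anchors[1][1] - y0)
    a21 = 2 * (anchors[2][0] - x0); a22 = 2 * (anchors[2][1] - y0)
    b1 = r0**2 - ranges[1]**2 + anchors[1][0]**2 - x0**2 + anchors[1][1]**2 - y0**2
    b2 = r0**2 - ranges[2]**2 + anchors[2][0]**2 - x0**2 + anchors[2][1]**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Example anchors (metres): left wheel, right wheel, front display of the trolley.
anchors = [(-0.3, 0.0), (0.3, 0.0), (0.0, 0.5)]
print(trilaterate(anchors, ranges=[6.43, 6.24, 5.85]))   # remote roughly at (2, 6)
```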

[0067] These communication unit outputs (AOA data and/or TOA data and/or TOF data) are generally noisy. The data is further processed at block 204, which involves denoising using denoising techniques available in the art to clean the data. After this, the clean data is further processed to estimate the 2D position of the trolley 104 in the AOI and determine the distance between the trolley 104 and the remote 106. The resulting 2D position may again be noisy, so filtration is performed using known filters to stabilize the estimated positions of the trolley 104 and the remote 106, which helps the trolley 104 determine the 2D location of the remote 106 with respect to the trolley 104 at block 206. This prevents the estimated position from jumping or behaving unexpectedly. Further, the GPS and IMU data collected by the GPS module 104-1 and the IMU 104-2 of the trolley 104 at block 208, and the 2D location of the trolley estimated at block 210, are fused with the altitude data collected by the altitude sensor 104-3 of the trolley 104 at block 214 to determine the 3D position of the trolley 104 in the AOI at block 212. Further, the altitude data collected by the altitude sensor 106-2 of the remote 106 at block 214 is fused with the 3D global position of the trolley 104 determined at block 212 to estimate the 3D global position of the remote 106 in the AOI, which can be further filtered to stabilize it, yielding a clean and accurate 3D global position of the remote 106 in the AOI at block 216. Accordingly, the wheels of the trolley 104 can be controlled to track and navigate to the remote 106 in real time. Thus, by adding the GPS and altitude sensor data of the remote 106 and the trolley 104, the 3D positions of both the remote 106 and the trolley 104 can be found.
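
As a purely illustrative aid to the filtration step at block 206, the following Python sketch shows a simple exponential-smoothing filter with outlier rejection of the kind that could stabilize a noisy 2D position estimate. The smoothing factor and jump threshold are hypothetical tuning values, not parameters of the claimed system.

```python
# Minimal sketch (assumption): exponential-moving-average smoothing of noisy 2D
# position estimates, with rejection of sudden jumps. Values are illustrative.

class PositionSmoother:
    def __init__(self, alpha=0.3, max_jump=2.0):
        self.alpha = alpha          # smoothing factor: lower = smoother, laggier
        self.max_jump = max_jump    # reject single-sample jumps larger than this (m)
        self.state = None           # last stabilized (x, y)

    def update(self, x, y):
        if self.state is None:
            self.state = (x, y)
            return self.state
        px, py = self.state
        # Discard outliers that would make the estimate "jump or be unexpected".
        if ((x - px) ** 2 + (y - py) ** 2) ** 0.5 > self.max_jump:
            return self.state
        self.state = (px + self.alpha * (x - px), py + self.alpha * (y - py))
        return self.state

smoother = PositionSmoother()
for raw in [(1.0, 0.2), (1.1, 0.1), (9.0, 9.0), (1.2, 0.3)]:  # third sample is noise
    print(smoother.update(*raw))
```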

[0068] Thus, the system 100 uses communication units, altitude sensors, and IMUs to perform triangulation between the remote 106 and the trolley 104. The system 100 measures the AOA of the transmitting signal from the communication unit to determine the angle, local position, and global position of the remote 106/user 108. For instance, the resulting 2D position means the location in (x, y), which can be local or global. For local positioning, denoising, filtration, and stabilization of sensor data can be used, whereas for the global position the GPS 104-1 and IMU 104-2 are used. In the tracking system, one of the CUs on the trolley 104 is considered the origin of the local position system, and the other CUs have relative (x, y) coordinates which are constant and known. However, the position of the CU-4 of the remote 106 is not known and changes over time as the golfer (user 108) moves. The main purpose of the tracking system 302 is to localize the CU-4 of the remote. Once the tracking system 302 finds the location of the remote 106 (note that this position is in local (x, y), whose origin is the first CU at the trolley), the wheels of the trolley 104 can be controlled to navigate the trolley towards the user. For the 3D global position of the trolley 104 in (x, y, z), the GPS 104-1 (fused with the IMU 104-2) and the altitude sensor 104-3 are used. Since the positions of the trolley 104 and the remote 106 are known in the local position system, and the GPS and altitude sensor data of the trolley 104 are known, the 3D position of the remote 106 can be determined by adding the (x, y) data of the remote 106 to the (x, y, z) of the trolley 104.
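
By way of illustration only, the following Python sketch shows how a remote's local (x, y) offset, expressed in the trolley's CU-origin frame, can be combined with the trolley's global (x, y, z) to yield a 3D global position for the remote. The rotation of the local offset by the trolley's yaw is an added assumption; the specification simply describes adding the remote's (x, y) to the trolley's (x, y, z).

```python
# Minimal sketch (assumption): local (x, y) offset of the remote -> global 3D
# position, using the trolley's global position and (assumed) heading from yaw.

import math

def remote_global_position(trolley_xyz, trolley_yaw_rad, remote_local_xy, remote_altitude=None):
    tx, ty, tz = trolley_xyz
    lx, ly = remote_local_xy
    # Rotate the local offset from the trolley frame into the global frame.
    gx = tx + lx * math.cos(trolley_yaw_rad) - ly * math.sin(trolley_yaw_rad)
    gy = ty + lx * math.sin(trolley_yaw_rad) + ly * math.cos(trolley_yaw_rad)
    # Use the remote's own altitude reading when available, otherwise the trolley's.
    gz = remote_altitude if remote_altitude is not None else tz
    return gx, gy, gz

# Example: trolley at (100, 200, 35) m, heading 30 degrees, remote 8 m ahead-left.
print(remote_global_position((100.0, 200.0, 35.0), math.radians(30), (8.0, 2.0), 35.4))
```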

[0069] Referring to FIGs. 3-5, an exemplary overview of the calibration engine of the proposed system 100 is disclosed. All IMU sensors generally need calibration, which requires rotating the IMU sensor in all directions. If not calibrated properly, the IMU may produce inaccurate measurements. Also, the calibration needs to be repeated every time the IMU sensor is turned on again, which increases the cost and effort for both the consumer and the support team. To overcome this, the proposed system 100 provides a self-calibrating IMU in the form of an eCompass 306 for the trolley, as shown in FIG. 3, and an eCompass 504 for the remote, as shown in FIG. 5, which uses raw IMU data from a magnetometer and an accelerometer to build a rotation matrix. Based on the rotation matrix, the Euler angles are extracted. As the extracted angles are not stable and are extremely sensitive to motion, known filtration techniques can be used to overcome this. The output of the filter is stable Euler angles at block 404.

[0070] The motion of the trolley and the remote has rotation angles with respect to the Cartesian system. These angles are roll, pitch, and yaw: roll is the rotation around the X-axis, pitch is the rotation around the Y-axis, and yaw is the rotation around the Z-axis. These angles are referred to herein as Euler angles. The yaw is interpreted as the rotation to the left and right, which is very important, as the yaw helps find the heading of the user and the trolley for better tracking. The pitch is interpreted as the rotation of the user and the trolley up and down. Both roll and pitch help in understanding the slope of the ground in the AOI for better wheel-slip management and tracking. The eCompasses 306, 504, based on the IMUs 104-2, 106-1 as shown in FIGs. 3 and 5, are used to measure the orientation or Euler angles of the remote 106 and the trolley 104.
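
As a purely illustrative aid, the following Python sketch shows one common eCompass convention for obtaining roll and pitch from the accelerometer's gravity vector and a tilt-compensated yaw (heading) from the magnetometer. It is offered as a generic sketch of Euler-angle extraction, not as the claimed self-calibration engine, and the axis conventions are assumptions.

```python
# Minimal sketch (one common convention, offered as an assumption): Euler angles
# from accelerometer + magnetometer raw data, i.e. a tilt-compensated eCompass.

import math

def ecompass_euler(accel, mag):
    ax, ay, az = accel          # accelerometer reading (gravity dominant), any unit
    mx, my, mz = mag            # magnetometer reading, any unit
    roll = math.atan2(ay, az)                            # rotation about X
    pitch = math.atan2(-ax, math.hypot(ay, az))          # rotation about Y
    # Rotate the magnetic field back to the horizontal plane (tilt compensation).
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-myh, mxh)                          # heading about Z
    return roll, pitch, yaw

# Example: device lying flat, magnetic field pointing roughly north.
print([math.degrees(a) for a in ecompass_euler((0.0, 0.0, 9.81), (0.3, 0.0, -0.4))])
```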

[0071] The tracking engine of the proposed system 100 is implemented using denoising techniques and filters, and communication units such as UWB, RF, BLE, image sensors/cameras, IR sensors, ultrasonic sensors, time of flight (TOF) sensors, and the like, to determine the user position in 2D space, and an altitude sensor to determine the location in 3D space building on the determined 2D position, as shown in FIGs. 2 to 5. Even though this communication-unit-based tracking gives very accurate tracking of the user 108, its accuracy is limited to a range of less than about 10-12 meters. If and when the distance between the remote 106 and the trolley 104 is greater than a predefined distance (for example, but not limited to, 10 meters), or when communication between the CU of the remote 106 and the CU(s) of the trolley 104 is interrupted, the tracking engine starts fluctuating, which leads to zig-zag movement of the trolley 104 while following and navigating to the remote 106/user 108, which is not ideal and highly undesirable.

[0072] The trolley 104 always tries to track the remote 106 carried by the user 108. However, there are positions where the distance between the trolley and the remote exceeds the predefined distance (for example, but not limited to, 10 m) and, because of hardware restrictions, the tracking engine cannot correctly find the direction of the remote 106 to follow. This may also happen when the remote 106 is turned ON while far from the trolley 104, so that the trolley 104 cannot find the direction of the remote 106, or in other situations where the trolley 104 starts zigzag movement irrespective of distance. To overcome this zigzag movement, the proposed system 100 involves a grid-based tracking technique with a gridding engine configured with the server 102 and the trolley 104. If the flag indicating a long distance between the trolley 104 and the remote 106 is turned ON, the gridded direction-of-arrival (DOA) method is implemented. At close distances between the trolley 104 and the remote 106, both the left and right communication units (CU-1, CU-2) of the trolley 104 see the AOAs correctly, but at long distances both left and right communication units (CU-1, CU-2) see the AOA equally, so effectively only one AOA is seen. Because of this, a flag is raised on equal left and right AOA measurements. If the flag turns ON, it means a long distance is detected between the trolley 104 and the remote 106, and/or a fluctuation in communication between the trolley 104 and the remote 106. In this situation, the trolley 104 generally tracks the remote 106 along a zigzag path, which is not desirable. To overcome this, a grid of DOA (for example, a grid spanning [-10, 10] degrees) is used instead of a thin line of DOA, and the trolley 104 is forced to follow the remote 106 at the middle (center point) of the DOA grid (for example, at zero degrees) using the two AOAs. The trolley 104 continues this way until the long-distance flag turns OFF (the AOAs of the left and right CUs are no longer equal). This prevents the trolley 104 from fast zigzag tracking, and at close distances all three AOAs are used. Depending on the distance between the trolley 104 and the remote 106/user 108, the grid-based tracking repeats itself every predetermined distance (for example, but not limited to, 2 meters), which means the trolley 104 course-corrects itself every 2 meters until the distance between the trolley 104 and the user 108 is less than the predefined distance (for example, but not limited to, 10 m). In that case, the trolley 104 stops following the user 108 using the grid-based tracking and starts following the user 108 with the tracking engine, which solely uses the communication units, until the user 108 stops moving or the distance between the user 108 and the trolley 104 again exceeds the predefined distance (for example, 10 meters).
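
By way of illustration only, the following Python sketch shows logic of the kind that could raise the long-distance flag on equal left and right AOA measurements and then steer toward the center of a DOA grid rather than a single thin DOA line. The tolerance, grid span, and steering gain are hypothetical values, not parameters of the claimed system.

```python
# Minimal sketch (assumption): raise a long-distance flag when the left and right
# anchors report (near-)equal AOAs, then steer toward the center of a DOA grid
# instead of a single thin DOA line. Tolerance, grid span, and gain are hypothetical.

EQUAL_AOA_TOL_DEG = 2.0        # left/right AOAs within this tolerance look "equal"
DOA_GRID_DEG = (-10.0, 10.0)   # DOA grid; its center (0 deg) is the target bearing

def long_distance_flag(aoa_left_deg, aoa_right_deg):
    """Flag ON when both anchors see essentially the same AOA (long range or fluctuation)."""
    return abs(aoa_left_deg - aoa_right_deg) <= EQUAL_AOA_TOL_DEG

def steering_command(aoa_left_deg, aoa_right_deg, gain=0.05):
    """Turn-rate command while the flag is ON; None when close-range CU tracking
    (using all three AOAs) should handle steering instead."""
    if not long_distance_flag(aoa_left_deg, aoa_right_deg):
        return None
    measured = 0.5 * (aoa_left_deg + aoa_right_deg)        # the single AOA seen
    target = 0.5 * (DOA_GRID_DEG[0] + DOA_GRID_DEG[1])     # grid center, 0 degrees
    return gain * (target - measured)

print(steering_command(12.1, 11.4))   # flag ON: gentle correction toward grid center
print(steering_command(25.0, -5.0))   # flag OFF: regular CU-based tracking takes over
```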

[0073] The controller of the movable device or vehicle 104 is configured to calculate a plurality of possible relative positions of the positioning device 106 based on communication between the CUs of the positioning device 106 and the movable device 104, determine a plurality of possible global positions of the positioning device 106 corresponding to the plurality of calculated possible relative positions, calculate the midpoint of the plurality of possible global positions, and select the midpoint as the global position of the positioning device 106. Further, the controller is configured to obtain gridded map data representing a first plurality of tiles overlaying a map of a geographical location, and determine the global location of the movable device or vehicle 104 by selecting one of the tiles based on the obtained global location data. Furthermore, the controller is configured to sub-divide the obtained gridded map data into a second plurality of tiles overlaying the geographic location; and determine the global location of the vehicle by selecting one of the second plurality of tiles based on the obtained global location data.
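
As a purely illustrative aid, the following Python sketch shows taking the midpoint of several candidate global positions of the positioning device 106 and snapping a global position to a tile of a gridded map at two resolutions. The planar coordinates, tile sizes, and helper names are hypothetical assumptions.

```python
# Minimal sketch (assumption): midpoint of candidate global positions, then tile
# selection on a gridded map at a coarse and a finer resolution (values illustrative).

def midpoint(candidates):
    """Midpoint (mean) of a list of candidate (x, y) global positions."""
    xs, ys = zip(*candidates)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def select_tile(position, origin, tile_size):
    """Index (column, row) of the tile containing a position, for a grid anchored at origin."""
    (x, y), (ox, oy) = position, origin
    return int((x - ox) // tile_size), int((y - oy) // tile_size)

# Two candidate global positions of the positioning device (e.g. mirror solutions).
cands = [(105.2, 203.8), (106.0, 204.6)]
pos = midpoint(cands)
print(pos, select_tile(pos, origin=(100.0, 200.0), tile_size=5.0))   # first resolution
print(select_tile(pos, origin=(100.0, 200.0), tile_size=1.0))        # finer subdivision
```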

[0074] In an embodiment, when the monitored distance between the trolley 104 and the remote 106/user 108 exceeds a predefined distance, the gridding engine enables the server 102 to extract satellite images 602 of the AOI and create and superimpose grids 604 of a first predefined resolution on the satellite image 602 of the AOI as shown in FIG. 6. The trolley 104 can then correspondingly track and navigate to the user 108 by following a first path passing through the center points of the respective grids 604 between the trolley 104 and the user 108. Further, when the monitored distance between the trolley 104 and the remote 106/user 108 changes while following the first path, the trolley 104 further sub-divides the grids into sub-grids of a second predefined resolution greater than the first predefined resolution and superimposes the sub-grids on the satellite image of the AOI. Accordingly, the trolley 104 tracks and navigates to the user 108 by following a second path passing through the center points of the respective sub-grids between the trolley 104 and the user 108. Based on the distance between the trolley 104 and the remote 106/user 108, the trolley 104 repeats the grid-based tracking every predetermined distance (for example, but not limited to 2-meters) until the distance between the trolley 104 and the remote 106/user 108 comes within the predefined distance.
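
By way of illustration only, the following Python sketch shows how a first path could be assembled as the sequence of center points of the grid cells crossed between the trolley's cell and the user's cell. The grid origin and the 5-meter resolution are hypothetical, and a practical implementation would also honor hazard information encoded in the grid by the server.

```python
# Minimal sketch (assumption): build the first path as a sequence of grid-cell
# center points between the trolley's cell and the user's cell.

def cell_of(point, origin, res):
    return int((point[0] - origin[0]) // res), int((point[1] - origin[1]) // res)

def cell_center(cell, origin, res):
    return origin[0] + (cell[0] + 0.5) * res, origin[1] + (cell[1] + 0.5) * res

def grid_path(trolley_pos, user_pos, origin=(0.0, 0.0), res=5.0):
    """Center points of the cells crossed by the straight segment trolley -> user."""
    start, goal = cell_of(trolley_pos, origin, res), cell_of(user_pos, origin, res)
    steps = max(abs(goal[0] - start[0]), abs(goal[1] - start[1]))
    path, seen = [], set()
    for i in range(steps + 1):
        t = i / steps if steps else 1.0
        cell = (round(start[0] + t * (goal[0] - start[0])),
                round(start[1] + t * (goal[1] - start[1])))
        if cell not in seen:                      # avoid duplicate waypoints
            seen.add(cell)
            path.append(cell_center(cell, origin, res))
    return path

print(grid_path((2.0, 3.0), (43.0, 27.0)))   # waypoints roughly every 5 m toward the user
```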

[0075] In an exemplary embodiment, when the distance between the trolley 104 and the user 108 is greater than the predefined distance (for example, but not limited to, 10 meters), the approximate positions of the remote 106/user 108 and the trolley 104 are sent to the server. Based on those GPS positions, the backend server 102 creates a grid on the satellite image of the surrounding area. The server 102 can also include important information about hazards, paths, and other relevant details in these grids, and provides the grid back at the scale of the first predefined resolution in order to reduce data transfer limitations and latency. This first-resolution grid information is then sent back to the trolley 104. Further, during the navigation of the trolley 104 towards the user 108, as the distance between the trolley 104 and the remote 106/user 108 changes, the trolley 104 subdivides the grid information further to a second predefined resolution, which can be dynamic, to achieve very precise, tight travel paths. Once the distance between the trolley 104 and the user 108 becomes less than the predefined distance, the trolley 104 switches back to the CU-based tracking mode by following the path traveled by the user 108.
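
As a purely illustrative aid, the following Python sketch shows sub-dividing one first-resolution grid cell into finer sub-cells and returning their center points, as the trolley would do to obtain a second, finer resolution. The 5 m and 1 m resolutions are examples only.

```python
# Minimal sketch (assumption): sub-divide one coarse grid cell into finer sub-cells
# and return the sub-cell center points. Resolutions (5 m -> 1 m) are illustrative.

def subdivide(cell_origin, coarse_res=5.0, fine_res=1.0):
    """Center points of the fine sub-cells tiling one coarse cell."""
    n = int(coarse_res // fine_res)
    ox, oy = cell_origin
    return [(ox + (i + 0.5) * fine_res, oy + (j + 0.5) * fine_res)
            for j in range(n) for i in range(n)]

# Sub-divide the 5 m cell whose lower-left corner is at (40.0, 25.0).
centers = subdivide((40.0, 25.0))
print(len(centers), centers[:3])   # 25 sub-cell centers at 1 m resolution
```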

[0076] Further, even with the combination of the IMU calibration engine and the GPS engine, the GPS coordinates can still scatter in a radius around the trolley 104 on the satellite image 602 when the trolley 104 is stationary, as shown in FIG. 7A. Usually, the scatter radius of a GPS receiver is around 10 meters, for example, which can be reduced to around 3-5 meters using the proposed system, as shown in FIG. 7A. However, complete mitigation of this scattering of the GPS location is required. To completely mitigate the GPS scattering, the proposed system provides a solution wherein, when the trolley 104 comes to a halt at a stationary position while tracking the remote 106/user 108 and the trolley 104 identifies the monitored position of the trolley 104 to be scattered in a radius around the stationary position, the position of the trolley 104 is fixed in the AOI at the last position of the trolley 104 in the corresponding grid 604, as shown in FIG. 7B, to mitigate the scattering. As a result, when the trolley 104 is tracking/moving, the trolley 104 keeps track of where in the grid 604 it currently is. When the trolley 104 comes to a halt, the trolley 104 location is fixed on the grid 604 at its last position. The size of the grids 604 can range from 50 cm up to 1 m. This increases the reliability of the data that is going to be processed in the backend server 102.
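
By way of illustration only, the following Python sketch shows guard logic of the kind that could pin the trolley's reported position to its last in-grid location once the trolley halts, so that stationary GPS scatter is not forwarded to the backend server. The speed threshold and grid size are hypothetical assumptions.

```python
# Minimal sketch (assumption): hold the last in-grid position while the trolley is
# halted so stationary GPS scatter is suppressed. Threshold and grid size illustrative.

GRID_M = 1.0          # grid cell size, e.g. 50 cm to 1 m
STOP_SPEED = 0.05     # below this speed (m/s) the trolley is considered halted

class ScatterGuard:
    def __init__(self):
        self.last_moving_fix = None

    def report(self, gps_xy, speed):
        if speed > STOP_SPEED:
            # Moving: trust the live fix and remember where in the grid we are.
            self.last_moving_fix = gps_xy
            return gps_xy
        # Halted: ignore scattered fixes and hold the last in-grid position.
        return self.last_moving_fix if self.last_moving_fix else gps_xy

guard = ScatterGuard()
print(guard.report((12.4, 7.9), speed=0.8))   # moving: live position
print(guard.report((14.1, 6.2), speed=0.0))   # halted: scatter suppressed
```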

[0077] In another embodiment, when communication between the trolley 104 and the remote 106 of the user 108 is interrupted, the trolley 104 notifies the server 102 and operates using the grid-based tracking. The gridding engine enables the server 102 to create and superimpose grids of a first predefined resolution on the satellite image of the AOI, and the trolley 104 correspondingly tracks and navigates to the user 108 by following a first path passing through the center points of the respective grids between the trolley 104 and the user 108.

[0078] In an embodiment, as shown in FIG. 8B, multiple navigation trolley systems 100-1, 100-2 can be present in the AOI 800 and may be in communication with each other. The combined system 800 may involve multiple trolleys 104-1, 104-2 and multiple remotes 106-1, 106-2 associated with one or more users in the AOI, such that there is one trolley in communication with one remote associated with each of the users. Generally, when it comes to an individual trolley as shown in FIG. 8A, the system 100-1 or 100-2 behaves in a star topology, wherein the remote 106-1 or 106-2 acts as the parent node and the left anchor (CU on the wheel), right anchor, and display anchor of the trolley 104-1 or 104-2 act as child nodes. In this configuration there is no communication between the trolleys 104-1 and 104-2, which is nevertheless sufficient to perform the triangulation of the user.

[0079] However, to further optimize the triangulation of the remote/user and to add more user-friendly and safety features, an interconnection network between multiple trolleys 104-1, 104-2 is introduced in a mesh topology 800 as shown in FIG. 8B, wherein each of the trolleys 104-1, 104-2 is configured to operatively communicate with the communication units associated with the other trolleys. This facilitates each of the trolleys 104-1, 104-2 to determine a distance between them and also to determine a distance between each trolley and its respective remote/user. This data enables determining the position of each trolley 104-1, 104-2 relative to its respective remote 106-1, 106-2 and navigating the trolley 104-1, 104-2 to the respective remote 106-1, 106-2. Having this kind of mesh network, where the individual trolleys communicate with each other, generates a self-healing and self-correcting network, which can be used to improve the precision and accuracy of the grid and pathing information provided by the proposed system. Besides, as each trolley 104-1, 104-2 in the combined system 800 is aware of its own position as well as the position of its remote, having a communication pipeline between two or more trolleys 104-1, 104-2 means that they are aware of each other's position, which in turn makes all the trolleys self-aware of live obstacles in real time to avoid collisions. This feature enables the proposed combined system 800 to accurately operate the trolleys 104-1, 104-2 under real-world difficulties such as no GPS fix in cloudy weather, no line of sight from the trolley to the user, no GPS due to vegetation coverage, the presence of unaware obstacles, and the like.

[0080] FIG. 9 illustrates exemplary functional blocks involved in the proposed system 100 according to an embodiment of the invention.

[0081] In an aspect, the system 100 may involve one or more processor(s) 902. The one or more processor(s) 902 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 902 are configured to fetch and execute computer-readable instructions stored in a memory 904 of the server 102. The memory 904 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 904 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.

[0082] System 100 may also comprise an interface(s) 906. The interface(s) 906 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 906 may facilitate communication of server 102, trolley 104, and remote 106. The interface(s) 906 may also provide a communication pathway for one or more components of the server 102, trolley 104, and remote 106. Examples of such components include, but are not limited to, processing engine(s) 908 and database 910.

[0083] The processing engine(s) 908 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 908. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 908 may be processor-executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 908 may comprise a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s). In such examples, system 100 may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to server 102 and the processing resource. In other examples, the processing engine(s) 908 may be implemented by electronic circuitry.

[0084] The database 910 may comprise data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 908 or the server 102 or the trolley 104 or the remote 106.

[0085] In an exemplary embodiment, the processing engine(s) 908 may include a global positioning engine 912, a tracking engine 914, a calibration engine 916, a gridding engine 918, and other engine(s) 920. The other engine(s) 920 can supplement the functionalities of the processing engine(s) or the server 102.

[0086] In an embodiment, the global positioning engine 912 and tracking engine 914 enable the processor 902 associated with the trolley 104 to actuate the GPS and IMU of the trolley 104 to allow the server 102 to monitor the real-time 2D position of the trolley 104 in the AOI and further actuate the communication unit(s) of the trolley 104 to communicate with the CU of the remote 106 in order to determine the distance between the trolley 104 and remote 106/user 108. The global positioning engine 912 and tracking engine 914 can further enable the processor 902 associated with the trolley 104 to actuate the altitude sensor of the trolley 104 to determine the altitude of the trolley 104 in the AOI. Accordingly, server 102 can process the 2D position of the trolley 104 and the altitude data of the trolley 104 to determine the 3D position of the trolley 104 in the AOI.

[0087] The global positioning engine 912 and tracking engine 914 can further enable the processor 902 associated with the trolley 104 to actuate the altitude sensor of the remote 106, via the trolley 104, which further enables the trolley 104 to determine the altitude of the remote 106/user 108 in the AOI. Further, the trolley 104 can process the determined distance between the trolley 104 and the remote 106/user 108 and its 3D position in the AOI to determine the 2D position of the remote 106/user 108 in the AOI. Furthermore, the extracted altitude data of the remote 106/user 108 enables the trolley 104 to determine a 3D position of the remote 106/user 108 in the AOI. Accordingly, the tracking engine 914 enables the processor 902 to actuate the wheels of the trolley 104 to move forward and navigate the trolley 104 to the 3D position of the user 108.

[0088] In an embodiment, the calibration engine 916 enables the trolley 104 to receive the raw data being monitored by the IMUs of the trolley 104 and correspondingly create a rotation matrix. The calibration engine 916 enables the extraction of Euler angles comprising roll, pitch, and yaw associated with the trolley 104, based on the created rotation matrix. Accordingly, the calibration engine 916 stabilizes the extracted Euler angles using known filtration techniques to provide stabilized values of roll, pitch, and yaw associated with the trolley 104. This allows the trolley 104 to determine true north and the yaw angle without manual calibration.

[0089] In another embodiment, the calibration engine 916 enables the remote 106 to receive the raw data being monitored by the IMU of the remote 106 and correspondingly create a rotation matrix. The calibration engine 916 enables the processor 902 to extract Euler angles comprising roll, pitch, and yaw associated with the remote 106, based on the created rotation matrix. Accordingly, the calibration engine 916 stabilizes the extracted Euler angles using known filtration techniques to provide stabilized values of roll, pitch, and yaw associated with the remote 106. This allows the remote 106 to determine true north and the yaw angle without manual calibration.

[0090] In an embodiment, when the monitored distance between the trolley 104 and the remote 106/user 108 exceeds a predefined distance (for example, but not limited to, 10 m) or communication between the CUs of the trolley 104 and the remote 106 is interrupted, the gridding engine 918 enables the server 102 to create and superimpose grids of a first predefined resolution (for instance, 5 m x 5 m) on the satellite image of the AOI and transmit the created grid map to the trolley 104. Accordingly, the wheels of the trolley 104 are actuated to move forward and navigate the trolley 104 to the user 108 by following a first path passing through the center points of the respective grids between the trolley 104 and the user 108.

[0091] Further, when the monitored distance between the trolley 104 and the remote 106/user 108 changes while following the first path, the gridding engine 918 enables the trolley 104 to sub-divide the grids into sub-grids of a second predefined resolution (for instance, 1 m x 1 m or 50 cm x 50 cm) greater than the first predefined resolution and to superimpose the sub-grids on the satellite image of the AOI. Accordingly, the tracking engine 914 and the gridding engine 918 enable the processor 902 associated with the trolley 104 to actuate the corresponding wheels to move forward and navigate the trolley 104 to the user 108 by following a second path passing through the center points of the respective sub-grids between the trolley 104 and the user 108. In an embodiment, the gridding engine 918 can enable the trolley 104 to repeat the grid-based tracking every predetermined distance until the distance between the trolley 104 and the remote 106/user 108 comes within the predefined distance.

[0092] In another embodiment, when the trolley 104 comes to a halt at a stationary position while tracking the remote 106/user 108 and the trolley 104 identifies the monitored position of the trolley 104 to be scattered in a radius around the stationary position, the gridding engine 918 enables the processor 902 to fix the position of the trolley 104 in the AOI at the position where the trolley 104 was last located in the corresponding grid, thereby mitigating the GPS scattering issue.

[0093] FIG. 10 illustrates an exemplary flow diagram of the proposed method 1000 for navigating a movable device (trolley) to a user (golfer). FIG. 11 illustrates an exemplary flow diagram 1100 depicting the working of the gridding engine.

[0094] The proposed method 1000 for navigating a movable device (trolley) to a user (golfer) includes step 1002 of receiving, by a server, real-time positions of the trolley. Further, the trolley remains in communication with the remote carried by the user, which allows the trolley to determine the relative position/distance of the remote with respect to the trolley. Method 1000 further includes step 1004 of extracting, by the server, a satellite image of an area of interest (AOI) surrounding the position of the trolley received at step 1002. The method 1000 further includes step 1006 of monitoring, by the movable device, a distance between the movable device and the user.

[0095] Further, the method 1000 includes step 1008 of creating and superimposing, by the server, grids of a first predefined resolution on the satellite image of the AOI when the monitored distance between the trolley and the remote/user exceeds a predefined distance. Accordingly, the method 1000 further includes step 1010 of tracking and navigating to, by the trolley, the user by following a first path passing through center points of the respective grids between the trolley and the user.

[0096] Furthermore, when the monitored distance between the trolley and the remote/user changes while following the first path, the method 1100 further includes step 1102 of sub-dividing, by the trolley, the grids into sub-grids of a second predefined resolution greater than the first predefined resolution, followed by another step 1104 of superimposing, by the trolley, the sub-grids on the satellite image of the AOI. Accordingly, the method 1100 includes step 1106 of tracking and navigating to, by the trolley, the user by following a second path passing through center points of the respective sub-grids between the trolley and the user.

[0097] In an embodiment, based on the distance between the trolley and the remote/user, the method 1000, 1100 includes the step of repeating, by the trolley, the grid-based tracking every predetermined distance until the distance between the trolley and the remote/user comes within the predefined distance.

[0098] In another embodiment, when the trolley comes to a halt at a stationary position while tracking the remote/user and the system identifies the monitored position of the trolley to be scattered in a radius around the stationary position, the method 1000 includes the step of fixing the position of the trolley at the last position of the trolley in the corresponding grid to mitigate the scattering.

[0099] In yet another embodiment, when the system includes multiple trolleys and multiple remotes associated with one or more users in the AOI, such that there is one trolley in communication with one remote associated with each of the users, then, in order to further optimize the triangulation of the remote/user and to add more user-friendly and safety features, the method can involve a step of creating an interconnection network between the multiple trolleys in a mesh topology as shown in FIG. 8B, such that each of the trolleys is configured to operatively communicate with the communication units associated with the other trolleys. This enables each trolley to determine a distance to the other trolleys and also to determine a distance between the trolley and its respective remote/user. Besides, this data facilitates determining the position of each trolley relative to its respective remote/user and navigating the trolley to the respective remote/user.

[00100] It is to be appreciated by a person skilled in the art that the above mesh network, where the trolleys communicate with each other, generates a self-healing and self-correcting network that can be used to improve the precision and accuracy of the grid and pathing information provided by the backend server. Besides, as each trolley is aware of its own position as well as the position of its remote, having a communication pipeline between two or more trolleys means that they are aware of each other's position, which in turn makes all the trolleys self-aware of live obstacles in real time to avoid collisions. This feature enables the proposed system to accurately operate the trolleys under real-world difficulties such as no GPS fix in cloudy weather, no line of sight from the trolley to the user, no GPS due to vegetation coverage, the presence of unaware obstacles, and the like.

[00101] FIG. 12 illustrates an exemplary flow diagram 1200 depicting the working of the calibration engine. Method 1200 of implementing self-calibration in the IMU includes step 1202 of receiving raw data monitored by the IMUs of the trolley and the remote and correspondingly creating a rotation matrix. Further, the method 1200 includes step 1204 of extracting Euler angles comprising roll, pitch, and yaw associated with the trolley and the remote, based on the rotation matrix created in step 1202. Furthermore, the method 1200 includes step 1206 of stabilizing the extracted Euler angles using known filtration techniques to provide stabilized values of roll, pitch, and yaw associated with the trolley and the remote, which facilitates the trolley and the remote in determining true north and the yaw angle without manual calibration.

[00102] Thus, the present invention (proposed system and method) overcomes the drawbacks, shortcomings, and limitations associated with existing navigation and tracking systems for self-driving trolleys by providing an improved, accurate, and efficient solution that enables automated tracking of the golfer (user) and navigation of the self-driving caddy (or similar unmanned self-driving devices) to the golfer (user) in difficult real-world conditions, and also when the connection between the self-driving caddy (unmanned self-driving device) and the remote of the golfer (user) is interrupted. Further, the present invention also overcomes the calibration restrictions associated with IMU sensors by enabling self-calibration of the IMU sensors without any human intervention.

[00103] In one implementation, a network can be a wireless network, a wired network, or a combination thereof. The network can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), the internet, and the like. Further, the network may either be a dedicated network or a shared network. A shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network can include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like. In another implementation, the network can be a cellular network or a mobile communication network based on various technologies, including but not limited to, Global System for Mobile (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Long Term Evolution (LTE), WiMAX, and the like.

[00104] Various terms are used herein. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.

[00105] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.

[00106] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein.

[00107] All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.

[00108] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.

[00109] Embodiments of the present invention include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware, and/or by human operators.

[00110] Embodiments of the present invention may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable media suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).

[00111] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program product.

[00112] In interpreting the specification, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.

[00113] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.