Title:
METHOD AND CONTROLLER FOR CONTROLLING FLUID EJECTION BY AN AERIAL ROBOT
Document Type and Number:
WIPO Patent Application WO/2022/035380
Kind Code:
A1
Abstract:
There is provided a method of controlling fluid ejection by an aerial robot. The aerial robot includes: a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle. The method includes: obtaining sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determining a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; controlling the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtaining ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determining an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and controlling the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter. There is also provided a corresponding controller for controlling fluid ejection by an aerial robot, and a corresponding aerial robot configured to perform fluid ejection including the controller.

Inventors:
FOONG SHAOHUI (SG)
LEE SHAWNDY MICHAEL (SG)
LEE MENG (SG)
TANG EMMANUEL (SG)
LIM RYAN (SG)
Application Number:
PCT/SG2021/050472
Publication Date:
February 17, 2022
Filing Date:
August 12, 2021
Assignee:
UNIV SINGAPORE TECHNOLOGY & DESIGN (SG)
International Classes:
B05B12/08; B64D1/18; G05D1/10; B64C39/02; B64D47/08
Foreign References:
JP 6721098 B1 (2020-07-08)
US 2017/0359943 A1 (2017-12-21)
CN 110180839 A (2019-08-30)
US 7859655 B2 (2010-12-28)
Other References:
LEE SHAWNDY MICHAEL; CHIEN JER LUEN; TANG EMMANUEL; LEE DENZEL; LIU JINGMIN; LIM RYAN; FOONG SHAOHUI: "Hybrid Kinematics Modelling for an Aerial Robot with Visual Controllable Fluid Ejection", 2020 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), IEEE, 6 July 2020 (2020-07-06), pages 832 - 838, XP033807591, DOI: 10.1109/AIM43001.2020.9158941
Attorney, Agent or Firm:
VIERING, JENTSCHURA & PARTNER LLP (SG)
Claims:
CLAIMS

What is claimed is:

1. A method of controlling fluid ejection by an aerial robot, the aerial robot comprising a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle, the method comprising: obtaining sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determining a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; controlling the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtaining ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determining an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and controlling the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

2. The method according to claim 1, wherein the kinematics model comprises a rigid kinematics modeling in relation to position vectors corresponding to the body, the actuator and the nozzle, and a fluid modelling in relation to a position vector corresponding to the point of contact of the fluid when ejected from the nozzle.

3. The method according to claim 2, wherein the rigid kinematics modeling comprises a first link parameter and a first joint parameter relating to a link between a reference point and the body, a second link parameter and a second joint parameter relating to a link between the body and the actuator, a third link parameter and a third joint parameter relating to a link between the actuator and the nozzle, and a fourth link parameter relating to a link between the nozzle and the point of contact of the fluid modeled based on a fluid dynamics function in relation to the fluid motion of the fluid.

4. The method according to claim 3, wherein the fluid dynamics function comprises a first component in relation to a horizontal component of the distance travelled by the fluid from the nozzle to the point of contact, and a second component in relation to a vertical component of the distance travelled by the fluid from the nozzle to the point of contact.

5. The method according to claim 4, further comprising modelling the first link parameter, the first joint parameter, the second link parameter, the second joint parameter, the third link parameter, the third joint parameter, the first component and the second component using forward kinematics.

6. The method according to claim 4 or 5, wherein the multi-component objective function comprises the first component, the second component, and a third component, the third component being the first link parameter.

7. The method according to claim 6, wherein the multi-component objective function comprises a first weight associated with the first component, a second weight associated with the second component and a third weight associated with the third component, wherein the first weight and the second weight are configured based on the amount of energy to be delivered onto the point of contact of the fluid to be ejected, and the third weight is configured based on an amount of a fluid connector that the aerial robot has to drag to obtain an optimal position of the aerial robot.

8. The method according to any one of claims 2 to 7, wherein the multi-component objective function comprises constraints on a desired relative position of the aerial robot and the point of contact.

9. The method according to claim 8, wherein the constraints comprise a first threshold in relation to the horizontal component of the distance travelled by the fluid from the nozzle and a second threshold in relation to the vertical component of the distance travelled by the fluid from the nozzle.

10. The method according to any one of claims 1 to 9, wherein the control parameter comprises an angle of the nozzle with respect to a vertical axis.

11. The method according to any one of claims 1 to 10, wherein the sensor system comprises an image capture device configured to capture image data, and said obtaining ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid ejected from the nozzle comprises obtaining image data from the image capture device, the image data in relation to the fluid ejected from the nozzle; and performing detection of the point of contact of the fluid ejected from the nozzle based on the image data.

12. The method according to claim 11, wherein said determining an adjustment parameter based on the position of the point of contact of the fluid ejected from the nozzle further comprises determining an offset distance between the point of contact and the target area.

13. The method according to any one of claims 1 to 12, wherein said obtaining sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot comprises performing detection of the target area based on a machine learning model.

14. The method according to any one of claims 1 to 13, wherein said determining a control parameter for controlling the fluid ejection by the aerial robot using the distance information is further based on a force estimation model.

15. A controller for controlling fluid ejection by an aerial robot, the aerial robot comprising a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle, the controller comprising: a memory; and at least one processor communicatively coupled to the memory and configured to: obtain sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determine a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtain ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determine an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

16. The controller according to claim 15, wherein the kinematics model comprises a rigid kinematics modeling in relation to position vectors corresponding to the body, the actuator and the nozzle, and a fluid modelling in relation to a position vector corresponding to the point of contact of the fluid when ejected from the nozzle.

17. The controller according to claim 16, wherein the rigid kinematics modeling comprises a first link parameter and a first joint parameter relating to a link between a reference point and the body, a second link parameter and a second joint parameter relating to a link between the body and the actuator, a third link parameter and a third joint parameter relating to a link between the actuator and the nozzle, and a fourth link parameter relating to a link between the nozzle and the point of contact of the fluid modeled based on a fluid dynamics function in relation to the fluid motion of the fluid.

18. The controller according to claim 17, wherein the fluid dynamics function comprises a first component in relation to a horizontal component of the distance travelled by the fluid from the nozzle to the point of contact, and a second component in relation to a vertical component of the distance travelled by the fluid from the nozzle to the point of contact.

19. The controller according to claim 18, further comprising modelling the first link parameter, the first joint parameter, the second link parameter, the second joint parameter, the third link parameter, the third joint parameter, the first component and the second component using forward kinematics.

20. The controller according to claim 18 or 19, wherein the multi-component objective function comprises the first component, the second component, and a third component, the third component being the first link parameter.

21. The controller according to claim 20, wherein the multi-component objective function comprises a first weight associated with the first component, a second weight associated with the second component and a third weight associated with the third component, wherein the first weight and the second weight are configured based on the amount of energy to be delivered onto the point of contact of the fluid to be ejected, and the third weight is configured based on an amount of a fluid connector that the aerial robot has to drag to obtain an optimal position of the aerial robot.

22. The controller according to any one of claims 16 to 21, wherein the multi-component objective function comprises constraints on a desired relative position of the aerial robot and the point of contact.

23. The controller according to claim 22, wherein the constraints comprise a first threshold in relation to the horizontal component of the distance travelled by the fluid from the nozzle and a second threshold in relation to the vertical component of the distance travelled by the fluid from the nozzle.

24. The controller according to any one of claims 15 to 23, wherein the control parameter comprises an angle of the nozzle with respect to a vertical axis.

25. The controller according to any one of claims 15 to 24, wherein the sensor system comprises an image capture device configured to capture image data, and said obtaining ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid ejected from the nozzle comprises obtaining image data from the image capture device, the image data in relation to the fluid ejected from the nozzle; and performing detection of the point of contact of the fluid ejected from the nozzle based on the image data.

26. The controller according to claim 25, wherein said determining an adjustment parameter based on the position of the point of contact of the fluid ejected from the nozzle further comprises determining an offset distance between the point of contact and the target area.

27. The controller according to any one of claims 15 to 26, wherein said determining a control parameter for controlling the fluid ejection by the aerial robot using the distance information is further based on a force estimation model.

28. An aerial robot configured to perform fluid ejection, the aerial robot comprising: a body; a nozzle mounted on the body and configured to eject fluid; an actuator mounted on the body and configured to control an orientation of the nozzle to perform controlled fluid ejection; a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle; and the controller for controlling the actuator to control the orientation of the nozzle to perform fluid ejection according to any one of claims 15 to 27.

29. A computer program product, embodied in one or more non-transitory computer-readable storage mediums, comprising instructions executable by at least one processor to perform the method of controlling fluid ejection by an aerial robot according to any one of claims 1 to 14.

Description:
METHOD AND CONTROLLER FOR CONTROLLING FLUID EJECTION BY AN AERIAL ROBOT

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of priority of Singapore Patent Application No. 10202007706S, filed on 12 August 2020, the content of which is hereby incorporated by reference in its entirety for all purposes.

TECHNICAL FIELD

[0002] The present invention generally relates to a method of controlling fluid ejection by an aerial robot, a controller thereof, and an aerial robot configured to perform fluid ejection including the controller.

BACKGROUND

[0003] Facilities maintenance has always been laborious and tedious. It encompasses indoor and outdoor maintenance, which requires a substantial amount of manpower and resources to complete the entire process. Although there are many commercial maintenance devices on the market, most of them are catered for indoor usage. The challenge lies in outdoor facilities maintenance such as facade cleaning, water tank cleaning, road distress detection, river-bank monitoring, and many more. Given the complexity of the environment and the amount of area to operate in, sheltered link-ways, for example, are considered to be among the more complex outdoor maintenance tasks, and hence a difficult process to automate.

[0004] In recent years, there have been notable past works on cleaning robots capitalising on visual servoing and jet stream compensation concepts. For example, S.M. Uddin, M.R. Hossain, M.S. Rabbi, M.A. Hasan, M.S.R. Zishan, “Unmanned Aerial Vehicle for Cleaning the High Rise Buildings”, International Conference on Robotics, Electrical, and Signal Processing Techniques (ICREST), Bangladesh, 2019, describes facade cleaning with no means of visual aid. In another study, N. Strisciuglio, R. Tylecek, N. Petkov, P. Biber, J. Hemming, E.J.V. Henten, T. Sattler, M. Pollefeys, T. Gevers, T. Brox and R.B. Fisher, “TrimBot2020: an outdoor robot for automatic gardening”, International Symposium on Robotics, [Online], Available: arXiv:1804.01792 [cs.RO], 2018, presents a robot that reconstructs the environment for precision gardening, exploiting object segmentation and tracking for its end-effector. W. Boyd, Z. Hood, J. Lomi, C. St. Laurent, K. Young, “Fire Containment Drone”, Degree thesis, Worcester Polytechnic Institute, Worcester, Massachusetts, 2017, and C. Yuan, Z. Liu, Y. Zhang, “Vision-based forest fire detection in aerial images for firefighting using UAVs”, International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 2016, describe respective UAVs used for fire detection and the delivery of extinguisher balls. All of these notable examples involve subsets of the visual servoing concept.

[0005] Most facade cleaning robots generally sweep through a fixed pattern (perpendicularly) towards the area of interest (e.g., walls and panels), do not account for projectile drop, and are easily influenced by situational conditions such as sudden wind gusts or sensor drift.

[0006] A need therefore exists to provide fluid ejection by an aerial robot that seeks to improve, or at least ameliorate, one or more deficiencies of conventional approaches in providing controlled fluid ejection by an aerial robot. It is against this background that the present invention has been developed.

SUMMARY

[0007] According to a first aspect of the present invention, there is provided a method of controlling fluid ejection by an aerial robot, the aerial robot comprising a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle, the method comprising: obtaining sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determining a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; controlling the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtaining ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determining an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and controlling the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

[0008] According to a second aspect of the present invention, there is provided a controller for controlling fluid ejection by an aerial robot, the aerial robot comprising a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle, the controller comprising: a memory; and at least one processor communicatively coupled to the memory and configured to: obtain sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determine a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtain ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determine an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

[0009] According to a third aspect of the present invention, there is provided an aerial robot configured to perform fluid ejection, the aerial robot comprising: a body; a nozzle mounted on the body and configured to eject fluid; an actuator mounted on the body and configured to control an orientation of the nozzle to perform controlled fluid ejection; a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle; and the controller for controlling the actuator to control the orientation of the nozzle to perform fluid ejection as described according to the above-mentioned second aspect of the present invention.

[0010] According to a fourth aspect of the present invention, there is provided a computer program product, embodied in one or more non-transitory computer-readable storage mediums, comprising instructions executable by at least one processor to perform the method of controlling fluid ejection by an aerial robot as described according to the above-mentioned first aspect of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] Embodiments of the present invention will be better understood and readily apparent to one of ordinary skill in the art from the following written description, by way of example only, and in conjunction with the drawings, in which:

FIG. 1 depicts a schematic flow diagram of a method of controlling fluid ejection by an aerial robot, according to various embodiments of the present invention;

FIG. 2 depicts a schematic block diagram of a controller for controlling fluid ejection by an aerial robot, according to various embodiments of the present invention, such as corresponding to the method of controlling fluid ejection by the aerial robot as described with reference to FIG. 1;

FIG. 3 depicts a schematic drawing of an aerial robot configured to perform fluid ejection, according to various embodiments of the present invention;

FIG. 4 shows a hybrid kinematics block diagram according to various embodiments of the present invention;

FIG. 5 shows a free-body diagram corresponding to various links of the aerial robot including fluid motion, according to various embodiments of the present invention;

FIG. 6 shows a diagram illustrating velocity components broken down into its X and Z components, according to various example embodiments of the present invention;

FIG. 7 shows a visual sensor’s standard lens FOV with respect to the upper bound of the nozzle angle such that the fluid can still be seen in the FOV, according to various example embodiments of the present invention;

FIG. 8 shows a masked water stream from an original 2-D image frame, according to various example embodiments of the present invention;

FIG. 9 shows various fluid trajectory with and without resistance according to various example embodiments of the present invention;

FIG. 10 shows a surface plot for the relationship between the three variables x_poc, z_poc and α, according to various example embodiments of the present invention;

FIG. 11 and FIG. 12 present the respective surface plots that describe the velocity and the angle of the POC as it hits the surface, according to various example embodiments of the present invention;

FIGS. 13A-13E show example implementations of the aerial robot according to various example embodiments of the present invention;

FIG. 14 illustrates a static system test performed, according to various first example embodiments of the present invention;

FIG. 15 shows the best fit plots for various α angles, according to various example embodiments of the present invention;

FIGS. 16A-16B show a static test performed on the UAV platform, according to various embodiments of the present invention;

FIG. 17A shows a table illustrating a list of variables and parameters that were used for the Convolutional Neural Network (CNN) model architecture, according to various example experiments;

FIG. 17B depicts detection of a target area on a surface, according to various example embodiments of the present invention;

FIGS. 18A-18B depict images and corresponding plots for visual compensation with water, according to various example embodiments of the present invention;

FIGS. 19A-19B show a combined test (Combined Visual Compensation Test) for debris and POC detection, according to various example embodiments of the present invention;

FIG. 20 shows the three full kinematics link positions in the world frame for visualization, according to various example embodiments of the present invention;

FIG. 21 shows a graph illustrating hybrid kinematics validations, according to various example embodiments of the present invention;

FIG. 22 shows a table illustrating the results from three experiment test sets from different positions, according to various example embodiments of the present invention;

FIGS. 23A-23B show visual compensation with Hybrid Kinematics estimation, according to various example embodiments of the present invention;

FIG. 24 shows a table illustrating results when calculated with the hybrid kinematics model, according to various example embodiments of the present invention;

FIG. 25 shows a Hybrid Kinematic Model plot for an experiment, according to various example embodiments of the present invention;

FIG. 26 shows another control system block diagram of the Hybrid Model, according to various example embodiments;

FIGS. 27A-27C show the corresponding free body diagram as a series of rotary and static joints to represent the various links of the aerial robot, according to various example embodiments;

FIG. 28 illustrates a concept of Model from inlet to outlet, according to various example embodiments;

FIG. 29 shows a table illustrating a Computational Fluid Dynamics setup, according to various example embodiments;

FIG. 30 illustrates a sample of FLUENT Flow Fields, according to various example embodiments;

FIGS. 31A-31B show surface plots for an example experiment, according to various example embodiments;

FIG. 31C illustrates a configuration space for bulk stream, according to various example embodiments;

FIG. 32 shows a table illustrating hybrid kinematics simulation results, according to various example embodiments;

FIG. 33A illustrates an example aerial robot for high pressure, while FIG. 33B illustrates an example aerial robot for low pressure, according to various example embodiments;

FIG. 33C illustrates an example littered sheltered walkway;

FIG. 34 shows the experimental setup for force characterisation, according to various example embodiments;

FIG. 35 illustrates force characterisations: low pressure vs high pressure, according to various example embodiments;

FIG. 36 shows the typical setup for an experiment, according to various example embodiments;

FIG. 37 shows a table illustrating experimentation data in relation to flight position, according to various example embodiments;

FIG. 38A shows low pressure jet impingement, while FIG. 38B shows high pressure jet impingement, according to various example embodiments;

FIG. 39 shows a table illustrating experimentation data in relation to low pressure and high pressure fluid ejection, according to various example embodiments; and

FIG. 40 illustrates a tether providing a fluid source link between a motorized ground system and the aerial robot according to various example embodiments.

DETAILED DESCRIPTION

[0012] Various embodiments of the present invention provide a method of controlling fluid ejection by an aerial robot, a controller thereof, and an aerial robot configured to perform fluid ejection including the controller.

[0013] FIG. 1 depicts a schematic flow diagram of a method 100 of controlling fluid ejection by an aerial robot (which may also be interchangeably referred to as an unmanned aerial vehicle (UAV) herein), according to various embodiments of the present invention. The aerial robot comprises a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle. The method 100 of controlling fluid ejection by the aerial robot comprises: obtaining (at 102) sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determining (at 104) a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; controlling (at 106) the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtaining (at 108) ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determining (at 110) an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and controlling (at 112) the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

[0014] Accordingly, various embodiments describe controlling fluid ejection by the aerial robot which includes a hybrid kinematics modelling method, where hybrid kinematics is employed for modelling of rigid and fluidic entities of the aerial robot. In other words, the hybrid kinematics includes rigid and fluid modelling, allowing the fluidic projectile to resemble an entity of the UAV. The projection of the fluid stream at the target area is performed by modelling it (for precision) as an entity of the aerial robot.
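
As a rough illustration only, the closed-loop flow of steps 102 to 112 may be sketched in code as below. This is a minimal sketch and not an implementation of the claimed method; the sensor_system, actuator and model objects and all of their method names are hypothetical placeholders.

```python
# Minimal sketch of the closed-loop flow of FIG. 1 (steps 102-112).
# The sensor_system, actuator and model interfaces are hypothetical.

def control_fluid_ejection(sensor_system, actuator, model):
    # 102: sensor data with distance information between the target area and the robot
    distance = sensor_system.read_target_distance()

    # 104: control parameter (e.g., nozzle angle) from the multi-component
    #      objective function over the hybrid kinematics model
    alpha = model.solve_control_parameter(distance)

    # 106: orient the nozzle and perform controlled fluid ejection
    actuator.set_nozzle_angle(alpha)

    # 108: observed point of contact of the ejected fluid on the surface
    poc = sensor_system.read_point_of_contact()

    # 110: adjustment parameter from the observed point of contact
    delta = model.solve_adjustment(poc, distance)

    # 112: re-orient the nozzle based on the adjustment parameter
    actuator.set_nozzle_angle(alpha + delta)
```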

[0015] In relation to the target area on the surface, in various embodiments, the target area may be, or include, object(s) or residual(s)/mark(s) (e.g., debris) thereon.

[0016] In relation to the distance information between the target area and the aerial robot, the distance information may include Euclidean distances between the target area and the aerial robot.
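
For illustration only, the Euclidean distance between the aerial robot and the target area can be computed from their positions in a common frame; the coordinate values below are arbitrary examples.

```python
import math

# Straight-line (Euclidean) distance between the aerial robot and the
# detected target area, given (x, y, z) positions in a common frame.
def euclidean_distance(p_robot, p_target):
    return math.dist(p_robot, p_target)

# Example: robot hovering 2.5 m above the origin, debris offset 1.2 m in x.
print(euclidean_distance((0.0, 0.0, 2.5), (1.2, 0.4, 0.0)))
```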

[0017] In various embodiments, the fluid ejection by the aerial robot is performed using the nozzle mounted on the body of the aerial robot, the orientation of which is adjustable by the actuator. The nozzle may be coupled to a fluid connector which is connected to a fluid source. The fluid connector may, for example, be a hose. The fluid ejection may be performed to impinge a targeted area of the surface with high velocity fluid from the onboard nozzle. It will be understood by a person skilled in the art that the present invention is not limited to any particular application, as long as the fluid ejection from the nozzle of the aerial robot is controlled with respect to the target area.

[0018] In various embodiments, the kinematics model comprises a rigid kinematics modeling in relation to position vectors corresponding to the body, the actuator and the nozzle, and a fluid modelling in relation to a position vector corresponding to the point of contact of the fluid when ejected from the nozzle.

[0019] In various embodiments, the rigid kinematics modeling comprises a first link parameter and a first joint parameter relating to a link between a reference point (or origin) and the body, a second link parameter and a second joint parameter relating to a link between the body and the actuator, a third link parameter and a third joint parameter relating to a link between the actuator and the nozzle, and a fourth link parameter relating to a link between the nozzle and the point of contact of the fluid modeled based on a fluid dynamics function in relation to the fluid motion of the fluid.

[0020] In various first embodiments, the fluid dynamics function comprises a first component in relation to a horizontal component of the distance travelled by the fluid from the nozzle to the point of contact, and a second component in relation to a vertical component of the distance travelled by the fluid from the nozzle to the point of contact.
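
A minimal sketch of such a fluid dynamics function is given below, assuming a drag-free projectile model and a known exit velocity at the nozzle; the patent also considers trajectories with resistance (FIG. 9), which this sketch does not capture.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def fluid_travel(v0, alpha, dz):
    """Horizontal and vertical components of the distance travelled by the
    fluid from the nozzle to the point of contact (drag-free sketch).

    v0    : fluid exit velocity at the nozzle (m/s), an assumed input
    alpha : nozzle angle measured from the -Z (downward) axis (rad)
    dz    : vertical drop from the nozzle to the surface (m), positive down
    """
    vx = v0 * math.sin(alpha)   # horizontal velocity component
    vz = v0 * math.cos(alpha)   # downward velocity component
    # time of flight from dz = vz*t + 0.5*G*t^2, taking the positive root
    t = (-vz + math.sqrt(vz**2 + 2.0 * G * dz)) / G
    dx = vx * t                 # first component: horizontal travel
    return dx, dz               # second component: vertical travel
```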

[0021] In various first embodiments, the method 100 further comprises modelling the first link parameter, the first joint parameter, the second link parameter, the second joint parameter, the third link parameter, the third joint parameter, the first component and the second component using forward kinematics.
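
The forward-kinematics composition can be sketched in the X-Z plane as below. The angle convention (measured from the -Z axis, with z pointing up), the link naming and the appending of the fluid travel as a final "link" are assumptions for illustration, not the patented formulation.

```python
import math

def forward_kinematics(p_origin, links, joints, fluid_dx, fluid_dz):
    """Accumulate 2-D (x, z) positions of the body, servo, nozzle and POC.

    links  : [l_ob, l_bs, l_sn] link lengths (origin-body, body-servo, servo-nozzle)
    joints : [q_ob, q_bs, q_sn] joint angles of the corresponding links,
             measured from the -Z axis (assumed convention)
    fluid_dx, fluid_dz : horizontal/vertical travel of the ejected fluid
    """
    x, z = p_origin
    angle = 0.0
    points = []
    for l, q in zip(links, joints):
        angle += q
        x += l * math.sin(angle)
        z -= l * math.cos(angle)
        points.append((x, z))                     # p_b, p_s, p_n in turn
    points.append((x + fluid_dx, z - fluid_dz))   # fluid "link" to p_poc
    return points
```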

[0022] In various first embodiments, the multi-component objective function comprises the first component, the second component, and a third component, the third component being the first link parameter. The first link parameter may be a length of the fluid connector between the reference point and the body.

[0023] In various first embodiments, the multi-component objective function comprises a first weight associated with the first component, a second weight associated with the second component and a third weight associated with the third component, wherein the first weight and the second weight are configured based on the amount of energy to be delivered onto the point of contact of the fluid to be ejected, and the third weight is configured based on an amount of a fluid connector that the aerial robot has to drag to obtain an optimal position of the aerial robot (or the center of gravity of the aerial robot).

[0024] In various first embodiments, the multi-component objective function comprises constraints on a desired relative position of the aerial robot and the point of contact. In various first embodiments, the constraints comprise a first threshold in relation to the horizontal component of the distance travelled by the fluid from the nozzle and a second threshold in relation to the vertical component of the distance travelled by the fluid from the nozzle.
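
As a hedged sketch of how the weighted components and thresholds might be combined numerically, the snippet below minimises an example three-term cost over the nozzle angle and the vertical drop using scipy. The cost terms, weights, bounds, exit velocity and thresholds are all placeholder assumptions; the patent does not publish this formulation or these values.

```python
import math
from scipy.optimize import minimize

G, V0 = 9.81, 8.0            # gravity (m/s^2) and an assumed nozzle exit speed (m/s)
W1, W2, W3 = 1.0, 1.0, 0.5   # example weights for the three components
X_MAX, Z_MAX = 2.0, 3.0      # example thresholds on horizontal/vertical travel

def fluid_dx(alpha, dz):
    # drag-free horizontal travel of the fluid for a given angle and drop
    vz = V0 * math.cos(alpha)
    t = (-vz + math.sqrt(vz**2 + 2.0 * G * dz)) / G
    return V0 * math.sin(alpha) * t

def cost(x):
    alpha, dz = x            # nozzle angle, vertical drop to the surface
    dx = fluid_dx(alpha, dz)
    hose = dz                # crude proxy for the dragged fluid connector length
    return W1 * dx + W2 * dz + W3 * hose

result = minimize(
    cost,
    x0=[0.4, 1.5],
    bounds=[(0.0, math.pi / 3), (0.5, Z_MAX)],
    constraints=[{"type": "ineq", "fun": lambda x: X_MAX - fluid_dx(x[0], x[1])}],
)
print(result.x)  # candidate nozzle angle (rad) and vertical drop (m)
```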

[0025] In various first embodiments, the control parameter comprises an angle of the nozzle with respect to a vertical axis.

[0026] In various first embodiments, the sensor system comprises an image capture device configured to capture image data. The image capture device may include visual and depth sensors. The above-mentioned obtaining ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid ejected from the nozzle comprises obtaining image data from the image capture device, the image data in relation to the fluid ejected from the nozzle; and performing detection of the point of contact of the fluid ejected from the nozzle based on the image data.

[0027] In various embodiments, the above-mentioned determining an adjustment parameter based on the position of the point of contact of the fluid ejected from the nozzle further comprises determining an offset distance between the point of contact and the target area.
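
One possible realisation of the image-based detection and offset computation is sketched below with OpenCV: the water stream is colour-masked (compare the masked stream of FIG. 8), the lowest masked pixel is taken as the point of contact, and its pixel offset from the target area is returned. The colour bounds and the lowest-pixel heuristic are assumptions for illustration.

```python
import cv2
import numpy as np

def detect_poc_offset(frame_bgr, target_px,
                      lower=(90, 40, 40), upper=(130, 255, 255)):
    """Return the POC pixel and its (dx, dy) pixel offset from the target."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       np.array(lower, dtype=np.uint8),
                       np.array(upper, dtype=np.uint8))   # masked water stream
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                      # stream not visible in this frame
    i = int(np.argmax(ys))               # lowest masked pixel ~ point of contact
    poc_px = (int(xs[i]), int(ys[i]))
    offset = (poc_px[0] - target_px[0], poc_px[1] - target_px[1])
    return poc_px, offset
```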

[0028] In various embodiments, the above-mentioned obtaining sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot comprises performing detection of the target area based on a machine learning model.
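
A machine learning model for target-area (debris) detection could, for instance, be a small convolutional network used as a patch classifier, as sketched below in PyTorch. The architecture, input size and class layout are illustrative assumptions; the parameters actually used in the example experiments are listed in FIG. 17A and are not reproduced here.

```python
import torch
import torch.nn as nn

class DebrisNet(nn.Module):
    """Toy CNN that classifies 64x64 image patches as debris / background."""

    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):                 # x: (N, 3, 64, 64)
        f = self.features(x)
        return self.head(f.flatten(1))    # per-patch class scores

# Usage sketch: score a batch of candidate patches (here, zero-filled dummies).
model = DebrisNet().eval()
with torch.no_grad():
    scores = model(torch.zeros(4, 3, 64, 64))
```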

[0029] In various embodiments, the above-mentioned determining a control parameter for controlling the fluid ejection by the aerial robot using the distance information is further based on a force estimation model.

[0030] FIG. 2 depicts a schematic block diagram of a controller 200 for controlling fluid ejection by an aerial robot, according to various embodiments of the present invention, such as corresponding to the above-mentioned method 100 of controlling fluid ejection by the above- mentioned aerial robot as described hereinbefore according to various embodiments of the present invention with reference to FIG. 1. Accordingly, similarly, the aerial robot comprises a body on which an actuator and a nozzle are mounted, the actuator being configured to control an orientation of the nozzle to perform controlled fluid ejection; and a sensor system configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle. The controller 200 comprises: a memory 202; and at least one processor communicatively coupled to the memory 202 and configured to: obtain sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot; determine a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multicomponent objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; obtain ejected fluid data from the sensor system, the ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; determine an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle; and control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

[0031] It will be appreciated by a person skilled in the art that the at least one processor 204 may be configured to perform various functions or operations through set(s) of instructions (e.g., software modules) executable by the at least one processor 204 to perform various functions or operations. Accordingly, as shown in FIG. 2, the controller 200 may comprise: a sensor data module (or a sensor data circuit) 206 configured to obtain sensor data from the sensor system, the sensor data comprising distance information between the target area and the aerial robot, and ejected fluid data comprising a position of the point of contact of the fluid with respect to the surface when ejected from the nozzle; a control parameter determining module (or a control parameter determining circuit) 208 configured to determine a control parameter for controlling the fluid ejection by the aerial robot using the distance information based on a multi-component objective function relating to kinematic parameters of a kinematics model associated with the body, the actuator, the nozzle and a fluid motion of the fluid when ejected from the nozzle to a point of contact; a fluid ejection controlling module (or a fluid ejection controlling circuit) 210 configured to control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the control parameter; determine an adjustment parameter based on the position of the point of contact of the fluid when ejected from the nozzle and control the actuator to control the orientation of the nozzle to perform controlled fluid ejection based on the adjustment parameter.

[0032] It will be appreciated by a person skilled in the art that the above-mentioned modules are not necessarily separate modules, and two or more modules may be realized by or implemented as one functional module (e.g., a circuit or a software program) as desired or as appropriate without deviating from the scope of the present invention. For example, two or more of the sensor data module 206, the control parameter determining module 208 and the fluid ejection controlling module 210 may be realized (e.g., compiled together) as one executable software program (e.g., software application or simply referred to as an “app”), which for example may be stored in the memory 202 and executable by the at least one processor 204 to perform the corresponding functions or operations as described herein according to various embodiments.

[0033] In various embodiments, the controller 200 for controlling fluid ejection corresponds to the method 100 of controlling fluid ejection as described hereinbefore with reference to FIG. 1; therefore, various functions or operations configured to be performed by the at least one processor 204 may correspond to various steps of the method 100 described hereinbefore according to various embodiments, and thus need not be repeated with respect to the controller 200 for clarity and conciseness. In other words, various embodiments described herein in the context of methods (e.g., the method 100 of controlling fluid ejection) are analogously valid for the corresponding systems or devices (e.g., the controller 200 for controlling fluid ejection), and vice versa. For example, in various embodiments, the memory 202 may have stored therein the sensor data module 206, the control parameter determining module 208 and/or the fluid ejection controlling module 210, each corresponding to one or more steps of the method 100 of controlling fluid ejection as described hereinbefore according to various embodiments, which are executable by the at least one processor 204 to perform the corresponding functions or operations as described herein.

[0034] A computing system, a controller, a microcontroller or any other system providing a processing capability may be provided according to various embodiments in the present invention. Such a system may be taken to include one or more processors and one or more computer-readable storage mediums. For example, the controller 200 described hereinbefore may include a processor (or controller) 204 and a computer-readable storage medium (or memory) 202 which are for example used in various processing carried out therein as described herein. A memory or computer-readable storage medium used in various embodiments may be a volatile memory, for example a DRAM (Dynamic Random Access Memory) or a non-volatile memory, for example a PROM (Programmable Read Only Memory), an EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), or a flash memory, e.g., a floating gate memory, a charge trapping memory, an MRAM (Magnetoresistive Random Access Memory) or a PCRAM (Phase Change Random Access Memory).

[0035] In various embodiments, a “circuit” may be understood as any kind of a logic implementing entity, which may be special purpose circuitry or a processor executing software stored in a memory, firmware, or any combination thereof. Thus, in an embodiment, a “circuit” may be a hard-wired logic circuit or a programmable logic circuit such as a programmable processor, e.g., a microprocessor (e.g., a Complex Instruction Set Computer (CISC) processor or a Reduced Instruction Set Computer (RISC) processor). A “circuit” may also be a processor executing software, e.g., any kind of computer program, e.g., a computer program using a virtual machine code, e.g., Java. Any other kind of implementation of various functions or operations may also be understood as a “circuit” in accordance with various other embodiments. Similarly, a “module” may be a portion of a system according to various embodiments in the present invention and may encompass a “circuit” as above, or may be understood to be any kind of a logic-implementing entity therefrom.

[0036] Some portions of the present disclosure are explicitly or implicitly presented in terms of algorithms and functional or symbolic representations of operations on data within a computer memory. These algorithmic descriptions and functional or symbolic representations are the means used by those skilled in the data processing arts to convey most effectively the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities, such as electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated.

[0037] Unless specifically stated otherwise, and as apparent from the following, it will be appreciated that throughout the present specification, discussions utilizing terms such as “obtaining”, “controlling”, “computing”, “determining”, “sending”, “performing” or the like, refer to the actions and processes of a computer system or electronic device, that manipulates and transforms data represented as physical quantities within the computer system into other data similarly represented as physical quantities within the computer system or other information storage, transmission or display devices.

[0038] The present specification also discloses a system (e.g., which may also be embodied as a device or an apparatus), such as the controller 200, for performing various operations or functions of the method(s) described herein. Such a system may be specially constructed for the required purposes, or may comprise a general purpose computer or other device selectively activated or reconfigured by a computer program stored in the computer. Algorithms that may be presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose machines may be used with computer programs in accordance with the teachings herein. Alternatively, the construction of more specialized apparatus to perform the required method steps may be appropriate.

[0039] In addition, the present specification also at least implicitly discloses a computer program or software/functional module, in that it would be apparent to the person skilled in the art that individual steps of various methods described herein may be put into effect by computer code. The computer program is not intended to be limited to any particular programming language and implementation thereof. It will be appreciated that a variety of programming languages and coding thereof may be used to implement the teachings of the disclosure contained herein. Moreover, the computer program is not intended to be limited to any particular control flow. There are many other variants of the computer program, which can use different control flows without departing from the scope of the present invention. It will be appreciated by a person skilled in the art that various modules described herein (e.g., the sensor data module 206, the control parameter determining module 208 and/or the fluid ejection controlling module 210) may be software module(s) realized by computer program(s) or set(s) of instructions executable by a computer processor to perform various functions or operations, or may be hardware module(s) being functional hardware unit(s) designed to perform various functions or operations. It will also be appreciated that a combination of hardware and software modules may be implemented.

[0040] Furthermore, one or more of the steps of a computer program/module or method described herein may be performed in parallel rather than sequentially. Such a computer program may be stored on any computer readable medium. The computer readable medium may include storage devices such as magnetic or optical disks, memory chips, or other storage devices suitable for interfacing with a general purpose computer. The computer program when loaded and executed on such a general-purpose computer effectively results in an apparatus that implements various steps of methods described herein.

[0041] In various embodiments, there is provided a computer program product, embodied in one or more computer-readable storage mediums (non-transitory computer-readable storage medium), comprising instructions (the sensor data module 206, the control parameter determining module 208 and/or the fluid ejection controlling module 210) executable by one or more computer processors to perform a method 100 of controlling fluid ejection by an aerial robot, as described hereinbefore with reference to FIG. 1. Accordingly, various computer programs or modules described herein may be stored in a computer program product receivable by a system therein, such as the controller 200 as shown in FIG. 2, for execution by at least one processor 204 of the system 200 to perform various functions or operations.

[0042] Various software or functional modules described herein may also be implemented as hardware modules. More particularly, in the hardware sense, a module is a functional hardware unit designed for use with other components or modules. For example, a module may be implemented using discrete electronic components, or it can form a portion of an entire electronic circuit such as an Application Specific Integrated Circuit (ASIC). Numerous other possibilities exist. Those skilled in the art will appreciate that the software or functional module(s) described herein can also be implemented as a combination of hardware and software modules.

[0043] FIG. 3 depicts a schematic drawing of an aerial robot 300 configured to perform fluid ejection, according to various embodiments. The aerial robot 300 comprises: a body 302 configured to accommodate components of the aerial robot; a nozzle 304 mounted on the body 302 and configured to eject fluid; an actuator 306 mounted on the body 302 and configured to control an orientation of the nozzle to perform controlled fluid ejection; a sensor system 308 configured to detect a surface below the aerial robot during flight and a target area on the surface desired to be impinged by the fluid when ejected from the nozzle; and the controller 200 for controlling the actuator 306 to control the orientation of the nozzle 304 to perform controlled fluid ejection as described hereinbefore with reference to FIG. 2 according to various embodiments.

[0044] It will be understood by a person skilled in the art that various types of aerial robot are known in the art and the present invention is not limited to any particular type of aerial robot. Various configurations and operating mechanisms or principles of such an aerial robot are known in the art and thus need not be described herein for clarity and conciseness, and the present invention is not limited to any particular configuration or operating mechanism of an aerial robot. For example, it is understood that the aerial robot 300 comprises thruster units being mounted on the body and/or wing members attached to the body.

[0045] It will be appreciated by a person skilled in the art that the terminology used herein is for the purpose of describing various embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

[0046] Any reference to an element or a feature herein using a designation such as “first”, “second” and so forth does not limit the quantity or order of such elements or features, unless stated or the context requires otherwise. For example, such designations may be used herein as a convenient way of distinguishing between two or more elements or instances of an element. Thus, a reference to first and second elements does not necessarily mean that only two elements can be employed, or that the first element must precede the second element. In addition, a phrase referring to “at least one of’ a list of items refers to any single item therein or any combination of two or more items therein.

[0047] In order that the present invention may be readily understood and put into practical effect, various example embodiments of the present invention will be described hereinafter by way of examples only and not limitations. It will be appreciated by a person skilled in the art that the present invention may, however, be embodied in various different forms or configurations and should not be construed as limited to the example embodiments set forth hereinafter. Rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present invention to those skilled in the art.

[0048] Various example embodiments provide a method of controlling fluid ejection by an aerial robot which includes visual compensation and hybrid kinematics modelling. Various example embodiments describe the projection of a high velocity fluid stream at a point target (corresponding to the target area as described hereinbefore) by modelling it (for precision) as an entity of the aerial robot or unmanned aerial vehicle (UAV). The ‘point-and-shoot’ capability may be homogeneous to a visual servo system in robotics. Accordingly, the hybrid kinematics modelling is employed for the aerial platform when performing fluidic jetting. The hybrid kinematics includes rigid and fluid modelling, allowing the fluidic projectile to resemble an entity of the UAV. Alongside the kinematics model is a visual compensator, which describes how the system can employ Computer Vision (CV) techniques and a Convolutional Neural Network (CNN) to accurately track the object in view and manipulate the nozzle control for an active closed-loop visual compensation. Various example embodiments address how hybrid kinematics can be employed for precision nozzle control. The results are evaluated by means of simulations and experiments with actual and synthetic data. Various example embodiments may be employed for other UAVs with a similar concept.

[0049] An example physical device was deployed on a public sheltered link-way with staged debris. Two approaches are employed: a visual-based compensation, where the debris (corresponding to the target area as described hereinbefore) is in the field-of-view (FOV), and a kinematics-based actuation, where the estimated distance (i.e., from range sensor(s)) to the shelter (corresponding to the surface as described hereinbefore) is provided such that the model determines the UAV’s altitude and angle of servo rotation to ensure that the water stream reaches the shelter. In other words, the distance information (e.g., Euclidean distances) between the target area and the aerial robot is obtained from the range sensor(s). The distance information is the input to the optimization function, which returns the optimal positions of the respective links of the aerial robot. Unlike many past works which treat the water stream as a fluidic payload, various example embodiments model the water stream as part of the system: an end-effector.
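
The active closed-loop visual compensation can be pictured as a simple feedback correction of the servo-driven nozzle angle from the observed pixel offset, as in the hedged sketch below; the gain, travel limits and sign convention are illustrative assumptions rather than values from the example deployment.

```python
K_P = 0.002                        # assumed gain: rad of nozzle rotation per pixel
ALPHA_MIN, ALPHA_MAX = 0.0, 1.0    # assumed servo travel limits (rad)

def compensate_nozzle(alpha, offset_px):
    """Proportional correction of the nozzle angle from the horizontal pixel
    offset (POC minus target) observed in the camera frame."""
    alpha_new = alpha - K_P * offset_px
    return max(ALPHA_MIN, min(ALPHA_MAX, alpha_new))
```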

[0050] This type of task presents a multitude of solutions; three aspects are described below. First, the rigid modelling is described. Next, the analysis of fluid modelling with resistance is presented, and accordingly the system kinematics (rigid + fluid modelling) for precise actuation is provided. Lastly, (visual) sensing is described, which includes the active closed-loop visual compensation of the actuation system, for example, an active closed-loop control for visual compensation of the trajectory with respect to the location of debris. Simulations and experiments are further demonstrated. According to various example embodiments, a unique physical device that is compact, tether-friendly and features a single-axis servo system appropriate for an extended projection range is demonstrated.

Modelling and Sensing

[0051] FIG. 4 shows an illustration of the system according to various example embodiments. More particularly, FIG. 4 shows a hybrid kinematics block diagram 400 according to various embodiments of the present invention. The diagram 400 describes two portions: Hybrid Modelling and Visual Compensation. The hybrid model governs the rigid kinematics and the fluid dynamics modelling, which will be described in detail in the following section. The Visual Compensation portion addresses the path by which the initial estimated p_poc, with error, is corrected with a closed-loop control, which will be described later.

Kinematics

i) Rigid modelling

[0052] Much like a robotic arm for a pick-and-place task, the links between the specific points of the UAV system resemble those of a robotic arm, where the point-of-contact (POC) is analogous to the end-effector of the arm. The uniqueness of the modelling lies in the last link, where the water (or fluid) trajectory, in effect, takes the form of an end-effector.

[0053] In various example embodiments, since it is easier for the UAV to sweep through the length of the shelter in its Y axis due to the tether, the analysis describes the system in its X-Z axes. FIG. 5 shows a free-body diagram corresponding to various links of the aerial robot including fluid motion, according to various example embodiments. More particularly, FIG. 5 shows the kinematics of a UAV setup. With FIG. 5, forward kinematics (FK) equations may be obtained first such that inverse kinematics (IK) may be applied to find out the corresponding (optimized) angles (e.g., γ_ob and α) between the respective links. These angles represent the amount of rotation the current link needs to make to ensure that the corresponding point (i.e., p_b, p_s or p_n) is at its supposed position. p_b denotes the position vector of the UAV body (body of the aerial robot), p_s denotes the position vector of the servo, and p_n denotes the position vector of the nozzle. In various example embodiments, the UAV is assumed to be straight and level, with the debris (at a Z-distance above and an X-distance away) in the camera field-of-view (FOV). Since only the kinematics of the system was investigated, these terms are assumed to be zero (0).

[0054] In various example embodiments, the POC may be described as a function of the pump pressure and the nozzle angle as follows, where p_poc denotes the position vector of the point of contact, p_p is the pump pressure and α is the nozzle angle from the -Z axis.

[0055] After which, inverse kinematics (IK) was employed to solve for the joint parameters. For the equations that govern the Z-components of this system, the UAV’s altitude may be decomposed into various parts (where o is the origin, h is the hose (corresponding to the fluid connector as described hereinbefore), b is the body, s is the servo and n is the nozzle) as follows:

Equation (2) represents the altitude at which the UAV is hovering with respect to its center of gravity (CG). For example, l_ob corresponds to the first link parameter, and γ_ob corresponds to the first joint parameter as described hereinbefore.

[0056] Equation (3) represents the vertical distance between p_b and p_s. γ_bs is an intrinsic angle of the UAV. For example, l_bs corresponds to the second link parameter, and γ_bs corresponds to the second joint parameter as described hereinbefore.

[0057] Equation (4) represents the distance between p_s and p_n. α denotes the nozzle angle (e.g., the angle between the -Z axis and the midline of the nozzle). For example, the link length between the servo and the nozzle corresponds to the third link parameter, and α corresponds to the third joint parameter or control parameter as described hereinbefore.

[0058] Equation (5), as follows, is unique in that it relates to fluid, unlike the previous equations.

Equation (5) denotes the vertical distance travelled by the water stream.

[0059] With that, the forward kinematics (FK) in the Z axis, from the origin p_o to p_poc, is given by Equation (6).

[0060] Similarly, the X-components may be decomposed as follows:

Equation (7) is the length of the hose decomposed along the X axis. l_ob refers to the length of the fluid connector (e.g., hose) from the origin to the body of the UAV (in this regard, the body of the UAV may be taken with reference to the center of gravity of the UAV).

Equation (8) represents the X-distance from p_b to p_s.

[0061] Equation (9) represents the X-distance from p_s to p_n.

[0062] Equation (10) represents the water projection in the X axis. For example, Equations (5) and (10) correspond to the fourth link parameter relating to a link between the nozzle and the point of contact of the fluid, modelled based on a fluid dynamics function in relation to the fluid motion of the fluid as described hereinbefore.

[0063] FK in the X axis, from the origin, is given by:

[0064] Although Equation (5) and Equation (10) are accurate for a kinematics model, the fluid projectile may be best modelled dynamically.
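For illustration purposes only and without limitation, the following Python sketch shows one plausible planar decomposition of the hybrid forward kinematics described above. Since the bodies of Equations (2)-(11) are not reproduced herein, the trigonometric forms, the default link lengths and angles, the assumed exit velocity and the drag-free fluid link used below are illustrative assumptions rather than the exact formulation; a drag-aware fluid link is sketched in the fluid modelling subsection.

import math

def fluid_link_x(v_exit, alpha, z_drop, g=9.81):
    """Assumed drag-free fluid link: horizontal distance travelled by the stream
    while it falls z_drop metres after leaving the nozzle at angle alpha
    (measured from the -Z axis)."""
    vx = v_exit * math.sin(alpha)      # horizontal velocity component
    vz = v_exit * math.cos(alpha)      # downward velocity component
    t = (-vz + math.sqrt(vz * vz + 2.0 * g * z_drop)) / g   # time to fall z_drop
    return vx * t

def fk_poc(gamma_ob, alpha, l_ob, l_bs=0.06, l_sn=0.19, gamma_bs=math.pi / 2,
           v_exit=20.0, z_target=0.0):
    """Assumed planar chain o -> b -> s -> n -> POC in the X-Z plane.
    Returns the (x, z) coordinates of the point of contact."""
    # Rigid links (assumed decomposition in the spirit of Equations (2)-(4), (7)-(9)).
    z_b = l_ob * math.sin(gamma_ob)           # hose lifts the body to its altitude
    x_b = l_ob * math.cos(gamma_ob)
    z_s = z_b - l_bs * math.cos(gamma_bs)     # body -> servo offset (intrinsic angle)
    x_s = x_b + l_bs * math.sin(gamma_bs)
    z_n = z_s - l_sn * math.cos(alpha)        # servo -> nozzle offset (nozzle angle)
    x_n = x_s + l_sn * math.sin(alpha)
    # Fluid link (Equations (5)/(10) analogue): stream falls from nozzle to surface.
    x_poc = x_n + fluid_link_x(v_exit, alpha, z_n - z_target)
    return x_poc, z_target

if __name__ == "__main__":
    x_poc, z_poc = fk_poc(gamma_ob=math.radians(40.0), alpha=math.radians(20.0), l_ob=3.0)
    print(f"POC at x = {x_poc:.2f} m on the surface (z = {z_poc:.2f} m)")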

[0065] With reference to FIG. 5, there exist multiple solutions for the joint configuration. Hence, Equation (12) describes an objective function with constraints on the desired relative position of the UAV and POC. Optimization-based methods for nonlinear constrained equations were employed to find out the relative angles (γ_ob and α) and the length l_ob, where Q and R are the weights responsible for penalizing high values of their respective terms, and S is responsible for penalizing the length l_ob.

ii) Fluid modelling

[0066] The uniqueness of this subsection is in exploiting the water stream projection as a link of a robotic manipulator. It is understood that any projectile travelling through air is subjected to resistance (i.e. gravity and air).

[0067] In this subsection, the projectile motion was further investigated to find out how the trajectory is affected by the angle of projection, the stream velocity and the position of the UAV body, especially in the presence of air resistance, where α is the nozzle angle with respect to the -Z axis.

[0068] In various example embodiments, this is useful for the kinematics modelling so that the end-effector’s POC can be accurately determined. The stream can be visualized as a channel of water droplets or particles jetting together, which is important as it governs the projectile motion equations. To analyse the fluid trajectory, the stream is assumed to consist of individual particles falling through air from the tip of the nozzle onto the shelter. The velocity can be broken down into its X and Z components, according to FIG. 6. More particularly, FIG. 6 illustrates the fluid trajectory modelling.

[0069] Since the individual particles are only subjected to drag force and gravity, the Equations of Motion (EOM) then become Equations (16) and (17), where c is the drag constant and m is the mass (kg).

[0070] The drag term here is expressed as a linear function of velocity, i.e., the drag force increases if the velocity of the particle increases. With some initial velocity v_x0 and v_z0, Equations (16) and (17) can be analytically integrated twice to obtain Equations (18) and (19), where a_x is the drag term in the x-component, a_z is the drag term in the z-component, c is the drag coefficient, g is gravity, and m is the mass of the particle.

[0071] The magnitudes of the velocity at the POC can be estimated along its trajectory, which can be solved by finding the gradient at the specific point. The velocity (or velocity vectors) at the POC, v_poc, is a function of the nozzle angle, the height above the shelter and the pressure of the pump.

Since the pressure of the pump is constant, the function simplifies to the following, which also comprises the (v_x, v_z) equations:

[0072] From Equations (22) and (23), the angle at which the water jet comes into contact with the shelter can be determined as follows.
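For illustration purposes only and without limitation, the following Python sketch shows one standard closed-form solution for a particle subject to gravity and a drag force linear in velocity, which is one plausible reading of Equations (16)-(19), together with the velocity components and impingement angle β referenced by Equations (20)-(24). The drag-to-mass ratio, exit velocity and time-marching step are placeholders; the exact expressions used in the various example embodiments are not reproduced herein.

import math

G = 9.81  # gravitational acceleration (m/s^2)

def drag_trajectory(vx0, vz0, k, t):
    """Closed-form state at time t for a particle under gravity and drag linear
    in velocity (drag force = -c*v), with k = c/m.  vz0 is positive upwards."""
    vx = vx0 * math.exp(-k * t)
    vz = (vz0 + G / k) * math.exp(-k * t) - G / k
    x = (vx0 / k) * (1.0 - math.exp(-k * t))
    z = ((vz0 + G / k) / k) * (1.0 - math.exp(-k * t)) - (G / k) * t
    return x, z, vx, vz

def poc_state(alpha, v_exit, drop, k=1.125, dt=1e-4):
    """March the analytic solution until the stream has fallen `drop` metres
    below the nozzle, then report the range, POC speed and impingement angle
    beta.  The default k = 1.125 reuses, as an assumption, the single empirical
    drag/mass value reported in the static system test below."""
    vx0 = v_exit * math.sin(alpha)    # alpha measured from the -Z axis
    vz0 = -v_exit * math.cos(alpha)   # stream initially points downwards
    t = 0.0
    while True:
        t += dt
        x, z, vx, vz = drag_trajectory(vx0, vz0, k, t)
        if z <= -drop:                                         # fallen far enough
            beta = math.degrees(math.atan2(abs(vz), abs(vx)))  # angle w.r.t. surface
            return x, math.hypot(vx, vz), beta

if __name__ == "__main__":
    x_range, v_poc, beta = poc_state(alpha=math.radians(30.0), v_exit=15.0, drop=1.25)
    print(f"range = {x_range:.2f} m, |v_poc| = {v_poc:.2f} m/s, beta = {beta:.1f} deg")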

Visual sensing

i) Sensor

[0073] According to various example embodiments, the visual sensor may use a standard/normal lens so that the image has as little distortion as possible, in order to obtain better accuracy. FIG. 7 shows the visual sensor’s standard lens FOV with respect to the upper bound of the nozzle angle such that the fluid can still be seen in the FOV. More particularly, FIG. 7 illustrates the camera placement in relation to FOV versus servo limit.

ii) Algorithm (Debris Detection)

[0074] Detection by the manipulation of color spaces, as described in C. Yuan, Z. Liu, Y. Zhang, “Vision-based forest fire detection in aerial images for firefighting using UAVs”, International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 2016, has been tested in the past; however, it does not provide a robust detector that can accurately identify the debris under different lighting conditions and scenarios. Tiny YOLOv3, as described in J. Redmon, A. Farhadi, “YOLOv3: An Incremental Improvement”, Computer Vision and Pattern Recognition, [Online]. Available: arXiv:1804.02767 [cs.CV], 2018, was employed due to its accuracy and speed. Each frame of the video capture is parsed through the neural network to obtain the occurrence of the debris with respect to its location (x, y) in the image coordinate system.
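For illustration purposes only and without limitation, the following Python sketch shows how each video frame may be parsed through a Tiny YOLOv3 network using OpenCV's DNN module to obtain debris bounding boxes and centroids. The configuration/weight file names, the 416x416 input size, the confidence and NMS thresholds, and the assumption of a single 'debris' class are illustrative placeholders and not the trained network of the example experiments.

import cv2
import numpy as np

# Hypothetical file names for a Tiny YOLOv3 network trained on a debris class.
net = cv2.dnn.readNetFromDarknet("yolov3-tiny-debris.cfg", "yolov3-tiny-debris.weights")
layer_names = net.getUnconnectedOutLayersNames()

def detect_debris(frame, conf_thresh=0.5, nms_thresh=0.4):
    """Return [(cx, cy, w, h, confidence), ...] debris detections in pixel
    coordinates for one video frame."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(layer_names)

    boxes, scores = [], []
    for output in outputs:
        for det in output:               # det = [cx, cy, bw, bh, objectness, class scores...]
            confidence = float(det[4]) * float(det[5:].max())
            if confidence < conf_thresh:
                continue
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            scores.append(confidence)

    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thresh, nms_thresh)
    results = []
    for i in np.array(keep).flatten():
        x, y, bw, bh = boxes[i]
        results.append((x + bw // 2, y + bh // 2, bw, bh, scores[i]))  # centroid + size
    return results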

[0075] As the targeted object is largely made up of dried leaves, the key focus for this dataset mainly relates to different shades of clustered leaves. The detection returns a bounding box with a centroid that is used as the reference for the visual compensation.

iii) Algorithm (POC Detection)

[0076] Stereo morphology is a method employed extensively in medical and material studies to examine microstructures, as described in W. Long, G.F. Cheng, T. Liu, J.B. Yang, “Exploring on Stereo Morphology and Habit Plane of Martensite in Ferrous Alloy”, IOP Conference Series: Materials Science and Engineering, vol. 381, no. 1, 10.1088/1757-899X/381/1/012168, 2018. Morphological methods such as described in J.S. Zhu, W. Li, “Real-Time Monitoring of Jet Trajectory during Jetting Based on Near-Field Computer Vision”, Sensors, vol. 19, no. 3, pp. 690, 10.3390/s19030690, 2019, are also used for water stream prediction.

[0077] A similar technique is explored to ‘extract’ the water stream from the 2-Dimensional (2D) scene of the image plane. Since the on-board camera is a depth camera, a depth frame can be retrieved for processing. In the depth frame, the scene is picked up by the IR sensors (e.g., 850 nm ± 10 nm) and is visualised by its 1-channel intensity. The steps to recover the water stream are as follows (a minimal sketch is given after the steps):

1) Retrieve the depth to color image streams

2) Sample specific areas to retrieve the depth threshold, d_t

3) Extract the region of interest (ROI) by eliminating all pixels with depth smaller than d_t, where px is the pixel count, T is the image frame, px_d is the pixel depth, px_y is the pixel’s y coordinate, and y_t and y_u are the lower and upper bounds of the image frame’s y axis.

4) Apply a morphological method such as erosion to remove outliers and small contours, as seen in the right image of FIG. 8.

5) Mask out the stream from the 2-D scene and apply an offset to the tip of the masked POC.

[0078] FIG. 8 shows the masked water stream from the original 2-D image frame. This figure essentially only reflects the 3-D water stream projected onto the image plane. More particularly, the left image shows the 2-D image stream, the center image illustrates the masked ROI, and the right image illustrates the filtered masked stream.
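For illustration purposes only and without limitation, the following Python sketch implements steps 2) to 5) above on an aligned 1-channel depth frame using NumPy and OpenCV. The sampling window used to estimate d_t, the erosion kernel size, the tip offset and the choice of the largest contour as the stream are illustrative assumptions.

import cv2
import numpy as np

def extract_stream_poc(depth_frame, sample_box=(200, 240, 300, 340), tip_offset_px=5):
    """Recover the water stream mask from a 1-channel depth frame and estimate
    the masked POC pixel.

    depth_frame : HxW array of per-pixel depth values.
    sample_box  : (y0, y1, x0, x1) region sampled to estimate the depth
                  threshold d_t (step 2).
    """
    y0, y1, x0, x1 = sample_box
    d_t = float(np.median(depth_frame[y0:y1, x0:x1]))             # step 2

    # Step 3: eliminate all pixels whose depth is smaller than d_t.
    roi = np.where(depth_frame >= d_t, 255, 0).astype(np.uint8)

    # Step 4: erosion removes speckle noise and small contours.
    mask = cv2.erode(roi, np.ones((3, 3), np.uint8), iterations=2)

    # Step 5: take the largest remaining contour as the stream and offset the
    # lowest point of that contour to approximate the masked POC tip.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask, None
    stream = max(contours, key=cv2.contourArea)
    tip_x, tip_y = stream[stream[:, 0, 1].argmax(), 0]
    return mask, (int(tip_x), int(tip_y) + tip_offset_px)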

[0079] The system also adopts a Proportional-Integral-Derivative (PID) controller, bounded by u_low ≤ u ≤ u_upp (Equation (26)), where u is the command for the nozzle angle, and u_low and u_upp are the lower and upper limits of the nozzle angle that keep the POC within the FOV.
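For illustration purposes only and without limitation, the following Python sketch shows a minimal PID controller whose output is clamped to the bounds of Equation (26). The gains, limits and time step are illustrative placeholders; the error signal is assumed to be the pixel offset between the detected debris centroid and the detected POC.

class BoundedPID:
    """PID controller whose output is clamped to [u_low, u_upp] so that the
    commanded nozzle angle keeps the POC within the camera FOV (Equation (26))."""

    def __init__(self, kp, ki, kd, u_low, u_upp):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.u_low, self.u_upp = u_low, u_upp
        self.integral = 0.0
        self.prev_error = None

    def update(self, reference, measurement, dt):
        error = reference - measurement          # e.g. debris centroid vs. POC pixel
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(max(u, self.u_low), self.u_upp)   # clamp: u_low <= u <= u_upp

# Illustrative use with placeholder gains and a 30 Hz update rate.
pid = BoundedPID(kp=0.05, ki=0.001, kd=0.01, u_low=-20.0, u_upp=45.0)
command_deg = pid.update(reference=320.0, measurement=290.0, dt=1.0 / 30.0)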

Simulations

(Hybrid) Forward Kinematics

[0080] In various example embodiments, the projectile motion is simulated with and without resistance (gravity and air) to understand how the presence of resistance affects the motion. FIG. 9 shows the various trajectories with and without resistance. Here the origin, (0, 0), of the plots is situated at p_s. For example, the nozzle angle α may take reference from the servo, because α is taken with respect to the servo frame (not the tip of the nozzle). Accordingly, the projectile motion is simulated with respect to the servo. More particularly, FIG. 9 shows the projectile motion with and without resistance.

[0081] It can be seen that, in the presence of resistance, the projectile motion descends faster, which leads to smaller projectile ranges for all angles, and the trajectory becomes more linear at smaller angles (e.g., 20°-45°). FIG. 10 shows a surface plot 1000 for the relationship between the three variables X_poc, Z_poc and α. This simulation result gives good knowledge about the projectile modelling, as at smaller nozzle angles the end-effector link can be assumed to be linear.

(Hybrid) Inverse Kinematics

[0082] With reference to Equation (12), tuning the individual (Q, R and S) weights affects the priorities of the individual components. Q and R affect the amount of energy delivered onto the POC, while S reduces the amount of hose the UAV has to drag to achieve the optimal position vector of the body p_b (or the center of gravity of the UAV). The equation is then minimised to find the optimal decision variables.
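For illustration purposes only and without limitation, and since the exact terms of Equation (12) are not reproduced herein, the following Python sketch poses one plausible reading of the weighted inverse kinematics problem with SciPy: the decision variables (γ_ob, α, l_ob) are penalised with the weights Q, R and S, while an equality constraint forces a compressed version of the planar forward kinematics chain (sketched in the rigid modelling subsection) onto the desired POC. The constants, bounds and initial guess are illustrative assumptions.

import math
import numpy as np
from scipy.optimize import minimize

L_BS, L_SN, GAMMA_BS, V_EXIT, G = 0.06, 0.19, math.pi / 2, 20.0, 9.81  # assumed constants

def fk_poc_x(gamma_ob, alpha, l_ob):
    """Compressed version of the assumed planar FK chain: returns the POC
    x-coordinate for a target surface at z = 0."""
    z_n = l_ob * math.sin(gamma_ob) - L_BS * math.cos(GAMMA_BS) - L_SN * math.cos(alpha)
    x_n = l_ob * math.cos(gamma_ob) + L_BS * math.sin(GAMMA_BS) + L_SN * math.sin(alpha)
    vx, vz = V_EXIT * math.sin(alpha), V_EXIT * math.cos(alpha)   # drag-free fluid link
    t = (-vz + math.sqrt(vz * vz + 2.0 * G * max(z_n, 0.0))) / G
    return x_n + vx * t

def solve_ik(x_target, Q=1.0, R=1.0, S=0.1):
    """Minimise Q*gamma_ob^2 + R*alpha^2 + S*l_ob^2 subject to the FK chain
    reaching the desired POC x-coordinate (one plausible reading of Eq. (12))."""
    objective = lambda v: Q * v[0] ** 2 + R * v[1] ** 2 + S * v[2] ** 2
    constraint = {"type": "eq", "fun": lambda v: fk_poc_x(v[0], v[1], v[2]) - x_target}
    bounds = [(0.1, math.pi / 2 - 0.1), (0.0, math.radians(45.0)), (0.5, 10.0)]
    res = minimize(objective, x0=np.array([0.7, 0.2, 2.0]), method="SLSQP",
                   bounds=bounds, constraints=[constraint])
    gamma_ob, alpha, l_ob = res.x
    return math.degrees(gamma_ob), math.degrees(alpha), l_ob

if __name__ == "__main__":
    print(solve_ik(x_target=3.0))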

Properties of stream at POC

[0083] Following the range of z and α values computed from the previous section, the projectile parameters can be obtained using Equations (22) and (23). FIG. 11 and FIG. 12 present the respective surface plots that describe the velocity and the angle, β, of the POC as it hits the surface. More particularly, FIG. 11 shows the surface plot relating to the velocity at the POC versus α, while FIG. 12 shows the corresponding surface plot relating to the angle β.

[0084] FIGS. 13A-13E show example implementations of the aerial robot according to various example embodiments of the present invention. More particularly, FIG. 13A illustrates the aerial robot performing fluid ejection to clean a linkway. FIGS. 13B-13C illustrate front and isometric views of the aerial robot according to various example embodiments of the present invention. FIG. 13D illustrates a tethering system to which the aerial robot may be tethered. For example, the tethering system may be a Titan tethering system.

Experiments

[0085] The modelling derived in the Modelling and Sensing section above was tested on a physical UAV device which is designed such that the center of gravity (CG) is situated slightly above the origin of the servo (corresponding to the actuator as described hereinbefore); the rest of the electronics are arranged into tiers and strategically packed such that the UAV is compact, as illustrated in FIG. 13E. More particularly, for illustration purposes and without limitation, FIG. 13E depicts an example implementation of the aerial robot according to various example embodiments of the present invention, with a number of key components shown. The UAV is fitted with a single-axis servo system that allows for a range of extended motion for the nozzle actuation. From the assembly model of the design, some parameters can be determined.

Kinematics

i) Static System Test

[0086] FIG. 14 shows a static test that was performed to validate the fluidic model at different distances and to obtain certain unknown variables: the drag and mass coefficients. In this experiment, various Z_water values were tested to observe the X_water with respect to the nozzle angle α. A Karcher BP4 garden pump was used; this garden pump delivers a constant pressure to the system. At specific height increments, various α angles were sampled. More particularly, FIG. 14 illustrates the static system test.

[0087] This experiment addresses the effects of the drag on the fluid projectile at different instances as it descends. FIG. 15 shows the best-fit plots for various α angles. FIG. 15 has a coefficient of determination r² = 0.9953. It can also be seen that the experimental data reproduce a trajectory identical to the simulated plots for their respective α angles as shown in FIG. 9. With these results, the drag and mass coefficients can be combined into a single variable, and the empirical value that fits all three models is 1.125.

[0088] FIGS. 16A-16B validate the hybrid modelling portion of FIG. 4 for any given X_poc and Z_poc on a static rig with respect to the change in X_ob. More particularly, FIGS. 16A-16B show a static test performed on the UAV platform.

Visual sensing

i) Debris Detection Evaluation

[0089] FIG. 17A shows a table illustrating a list of variables and parameters that were used for the Convolutional Neural Network (CNN) model architecture, according to various example experiments. The debris detection was tested using actual specimens of dry, dead leaves (corresponding to the target area as described hereinbefore). The dataset includes various shades ranging from greenish-yellow to dark brown. The leaves are mixed and randomly formed into clusters with sizes ranging between 20 cm and 30 cm in diameter, and a sample of the detection results is illustrated in FIG. 17B.

ii) POC Detection Evaluation

[0090] The POC detection was tested on a 1 m aluminium extrusion rig which allows the drone to be moved around without powering its motors while still being able to carry out its auxiliary features: Visual Jetting and Compensation.

[0091] A water test was conducted to test the controller under actual jetting conditions. The plots in FIGS. 18A-18B show that the controller is able to allow the POC to track the reference point within its controllable range and that it will never exceed the camera FOV. Additionally, this test also features a reference point random generator that automates the next location of the reference point after the POC tracks the reference point successfully. The controller is able to keep the system stable through the entire process. FIGS. 19A-19B show the combined test (Combined Visual Compensation Test) for debris and POC detection.

Outdoor Test

[0092] An outdoor flight test was conducted to validate the hybrid model described in the earlier sections. In this flight test, three different experiments were conducted. Declaring the same POC point for the three different hovering positions, the hybrid model calculates and returns the optimal angle for the nozzle to be actuated such that the POC always hits the same target point (corresponding to the target area as described hereinbefore). FIG. 20 shows the three full kinematics link positions in the world frame for visualization. FIG. 21 shows a graph illustrating the hybrid kinematics validations.

[0093] FIG. 22 shows a table illustrating the results from three experiment test sets from different positions. FIGS. 23A-23B show the visual compensation of the system during an actual flight test. More particularly, FIGS. 23A-23B show the visual compensation with the hybrid kinematics estimation. Note that the debris is located 0.4783 m vertically below p_n when α is 0°.

[0094] FIG. 24 shows a table illustrating the results when calculated with the hybrid kinematics model. The dot 2310 in FIG. 23B is the X_water estimation obtained from the hybrid model. Although there is some discrepancy, which could be a result of external factors like wind or sensor drift, the hybrid model is able to compute a point substantially close to the actual POC location with 95.66% accuracy. With the aid of the visual compensation node, the actual POC (X_water, Z_water) always falls in the region of the detected debris.

[0095] FIG. 25 shows a Hybrid Kinematic Model plot for experiment 4, in relation to the experiments conducted with respect to FIG. 22 and experimental results recorded in the table shown in FIG. 24.

[0096] According to various example embodiments, the implementation of a hybrid kinematics model adjoining a learned detector for a cleaning UAV is described. This hybrid model provides a more robust actuation control for the aforementioned example implementation. The kinematics and visual aspects were studied, modelled and tested in a controlled environment before experimenting on an actual shelter with staged debris. Although various example embodiments focused on the hovering state of the UAV, it will be appreciated that additional forms of mechanics (UAV dynamics) can also be modelled for a more accurate representation. In various example embodiments, the relationship between the amount of energy delivered by the system and the optimal weight carried may be configured to allow for more powerful pressure and sensing systems. In various example embodiments, the system may be configured for machine learning-based autonomous control.

Fluid Force Estimation of an Aerial Robot with 3D Hybrid Kinematics-Force Model

[0097] According to various example embodiments, the aerial robot may be configured to eject a precise high-velocity fluid stream to remove stubborn targets off impact-sensitive, high and large surfaces in high-risk maintenance. Certain impact-sensitive surfaces include aircraft radomes, glass facades, creeper plants on structures/super trees and even solar panels. To estimate the fluid force at the Point-of-Contact (POC), various example embodiments provide a model capable of positioning a pressure nozzle on a stable, quasi-static UAV at various altitudes to deliver a highly-energised fluid of a known force. In view of the large stand-off distance, as described in C. P. Gindinceanu, “De-icing and maintenance of wind turbines with drones”, M.Sc. Thesis, Aalborg University, Denmark, 2019, Accessed on: May 13, 2020, and the ballistic trajectory of the bulk stream as it exits the nozzle, various example embodiments present two sub-models that govern the precise pressurised ejection: a (3D) Hybrid Kinematics Model and a Force Estimation Model, which together make up the 3D Hybrid Kinematics Force (HKF) Model. As there is a combination of rigid and fluid links, the Hybrid Kinematics Model allows for the positioning of the aerial robot and the end-effector: the bulk stream’s POC. For a large stand-off distance, the Force Estimation Model is formulated to maintain an expected force at the POC. The results are verified through simulations and experiments. The models are validated by flying the physical devices (aerial robots) to replicate actual deployment scenarios.

[0098] In recent years, customised aerial vehicles have been increasingly disrupting and modernizing many traditional methods in the industries of basic maintenance (cleaning) and fire rescue missions. Aerial robots can benefit these industries in terms of operational safety and productivity for high-risk and labour-intensive operations. For the former, many of these industries require the maintenance of sensitive and delicate objects; namely aircraft cleaning, structure/facade cleaning, stagnant water removal in creeper plants/super trees and even cleaning solar panels. While these objects need to be periodically maintained for safety and efficacy reasons, the cleaning process requires careful attention due to the operating height and delicate surfaces.

[0099] Most of the fluid-ejecting UAVs are mainly used to supplement high-risk maintenance. These operations usually describe scenarios that are hard-to-reach or inaccessible by human-operators without the use of assistive equipment like cherry-pickers, boom lifts or erected structures. For an unmanned aerial vehicle (UAV) to assist in these types of tasks, its operation encompasses the delivery of high-velocity fluid at the target surface with a controlled impact force.

[00100] The maintenance operations require the UAV to be fitted with a hose and nozzle for supplying fluid and a power cable for the UAV propulsion. Since the on-board nozzle has a fixed orifice diameter, in order to vary the fluid’s impact force, these maintenance operations would require strategic UAV positioning. By optimizing the position of the UAV, it can deliver high-velocity fluid with a controlled force at the POC. While the force required at the POC for target removal should increase with the increase in target size/weight, in various example embodiments, the target (corresponding to the target area as described hereinbefore) may be classified based on surface or stubborn types as large targets are not necessarily heavy and vice versa. Hence, it is necessary to quantify the force information at the POC in order to ascertain the amount of fluid force that can be delivered onto the target at a height.

[00101] Various example embodiments describe extensively the parameters at the POC through hydrodynamic analysis. Since the placement of the fluid end-effector and the nozzle position affect the force at the POC, a Force Estimation Model for the fluid ejection may be formulated to determine the force at various target POCs. Various example embodiments describe the parameters at the POC, which are not typically addressed or are difficult to attain. Additionally, it is known that the kinetic energy of a ballistic projectile decreases with the increase in height. One might argue that placing the UAV closer to the target will increase the force delivered. However, positioning the UAV too close to the surface will subject it to a significant amount of aerodynamic ground effect, rendering it unstable. For example, P. Sanchez-Cuevas, G. Heredia, A. Ollero, “Characterization of the Aerodynamic Ground Effect and Its Influence in Multirotor Control”, International Journal of Aerospace Engineering, vol. 2017, pp. 1-17, 10.1155/2017/1823056, 2017 suggests that for a stable thrust increment due to ground effect (about 1), the ratio of distance-to-ground to propeller radius should be 5.5.

[00102] Since the last link (manipulating the end-effector) is fluid and might not be linear, the employment of fluid dynamics is key in the modelling of the last link. Therefore, a 3D HKF Model is provided to model the positioning of the various links as well as the force estimation of the bulk stream; giving knowledge on the amount of energy that can be safely transferred to the target.

[00103] Accordingly, various example embodiments may improve current high-risk maintenance methods by positioning the quasi-static UAV at an optimal distance and remotely ejecting high-velocity fluid with a known force. The HKF model may advantageously provide more informed and stable positioning of the UAV for better control of its bulk stream up to the POC. In various example embodiments, the HKF Model may be addressed as follows: a section describing the Hybrid Model (kinematics and fluid dynamics) of the aerial robot and the Force Estimation Model for the bulk stream from the Conservation of Momentum theory, followed by sections describing the Simulations and Experiments respectively.

[00104] Accordingly, various example embodiments provide:

1) 3D HKF Model that includes:

1.1) A 3D Hybrid Model (rigid kinematics and fluid dynamics) for POC placement

1.2) A Force Estimation Model for magnitude and direction of the bulk stream at the POC

2) Unique physical devices that are capable of ejecting fluid; used for verification and validation of experimentation results

Modelling

[00105] FIG. 26 shows another control system block diagram 2600 of the Hybrid Model, according to various example embodiments. The block diagram governs the 3D HKF Model which determines the placement of the end-effector (POC).

3D Hybrid Model

[00106] The UAV may move forward towards the surface of the target, but since its tether hose dangles out from the rear of the UAV, it restricts the amount of forward flight, as shown in FIG. 26. In order to overcome this restriction, in various example embodiments, a single DOF servo for nozzle actuation may be implemented, which permits the end-effector to extend into the UAV’s longitudinal (X) axis. Nozzle actuation for roll and yaw is not considered as the UAV is not restricted about those axes.

[00107] The Hybrid Model plays a vital role in governing the properties of the force vector of the bulk stream since it enables the positioning of the UAV. FIGS. 27A-27C show the corresponding free body diagram as a series of rotary and static joints to represent the various links of the aerial robot. More particularly, FIGS. 27A-27C show the kinematics of a UAV setup. FIG. 27A relates to the x-z axes, FIG. 27B relates to the x-y axes, and FIG. 27C relates to a 3D perspective (where o is the origin, h is the hose, b is the body’s center of gravity (CG), s is the servo and n is the nozzle). γ_ob represents the angle of the hose to the ground, φ_ob represents the radial change in heading from the origin to the body, α represents the actuation angle of the servo system, θ_b represents the pitch of the aerial robot and l_ob represents the length of hose carried from the point of reference (origin) to the body of the UAV.

[00108] The bounding box 1 in FIG. 27A represents the region which is located within the UAV’s body, and bounding box 2 represents the region that is pseudo-fixed on the craft’s body.

[00109] The position of the bulk stream at the POC in the local frame (L) can be described as a relationship (f) of the pump pressure (p), the nozzle angle (α) and the 3D position of the robot, where T_b^L is the transformation matrix from the UAV body frame (b) to the local reference frame (L).

[00110] Inverse Kinematics (IK) was employed to solve for the variables γ_ob, α and l_ob shown in FIGS. 27A-27C. FIGS. 27A-27C illustrate the various links of the tethered UAV system ejecting fluid at a target (POC). The respective transformation matrices aid in transforming one point to the next in 3D, encompassing rotation and translation information.

[00111] The kinematics model is governed by a series of homogeneous transformation matrices from each respective part of the quasi-static UAV system such that each point comprises the 3D coordinates in the local frame relative to the origin.

[00112] Hence, the 3D coordinates for the link between p_o and p_b are represented by Equation (28), which includes the altitude of the UAV and the lateral and forward distances between the robot’s center of gravity (b) and the origin (o), where c represents cos and s represents sin.

[00113] Because the UAV is assumed to be quasi-static during ejection, the ideal heading should be forward-facing with minimal rotational (roll and yaw) motion acting on the UAV. Hence, the Y-component of the body-servo link and thereafter would retain the same transform as the previous link. Equation (29) represents the body-servo link.

[00114] The link between the servo (s) and the nozzle (n) defines the position of the nozzle orifice relative to the servo, considering the nozzle actuation angle (α).

[00115] The final segment is represented by a fluidic link, which is the path that the bulk stream takes from the nozzle orifice to the end-effector at the target (POC). Thus, this portion is vital in the estimation of the force at the POC. In order to model this link, the trajectory is best represented by the fluid dynamics of the stream throughout the stand-off distance (orifice to the POC).
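For illustration purposes only and without limitation, the following Python sketch composes a chain of 4x4 homogeneous transformation matrices for the rigid links o → b → s → n using NumPy, in the spirit of Equations (28)-(30). The choice of rotation axes (heading φ_ob about Z, hose elevation γ_ob and servo actuation α about Y), the fixed body-servo offset and the link lengths are illustrative assumptions, as the exact matrices are not reproduced herein.

import numpy as np

def rot_z(phi):
    c, s = np.cos(phi), np.sin(phi)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def rot_y(theta):
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[0, 0], T[0, 2], T[2, 0], T[2, 2] = c, s, -s, c
    return T

def trans(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def nozzle_position(phi_ob, gamma_ob, l_ob, alpha, l_bs=0.05, l_sn=0.12):
    """Position of the nozzle orifice p_n in the local frame, obtained by
    composing assumed origin->body, body->servo and servo->nozzle transforms."""
    # Origin -> body: heading change phi_ob about Z, hose elevation gamma_ob
    # about Y, then translation l_ob along the (rotated) hose direction.
    T_ob = rot_z(phi_ob) @ rot_y(-gamma_ob) @ trans(l_ob, 0.0, 0.0)
    # Body -> servo: fixed offset below the body CG (pseudo-fixed region).
    T_bs = trans(0.0, 0.0, -l_bs)
    # Servo -> nozzle: actuation angle alpha about Y, then offset along -Z.
    T_sn = rot_y(alpha) @ trans(0.0, 0.0, -l_sn)
    T_on = T_ob @ T_bs @ T_sn
    return T_on[:3, 3]

if __name__ == "__main__":
    p_n = nozzle_position(phi_ob=np.radians(10.0), gamma_ob=np.radians(35.0),
                          l_ob=3.0, alpha=np.radians(15.0))
    print("p_n =", np.round(p_n, 3))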

[00116] A key information needed to model this trajectory is the fluid’s exit velocity at the orifice. To achieve this, Bernoulli’s equation from the Conservation of Energy law was employed to derive the velocity of the fluid flow at various stages within the system.

[00117] Taking into consideration that the pressure nozzle is mounted to a single DOF actuator, the fluid stream that exits the nozzle only comprises the x and z velocity components. In order to analyse the fluid trajectory from a quasi-static UAV, the average velocity of the fluid (v_fl) passing through air, starting from the orifice of the nozzle, can be broken down into its respective x-z components (assuming no wind effect acting on the stream) as follows, where fl represents the fluid, v_e is the exit velocity from the orifice and α is the actuation angle.

[00118] Since the flow, after exiting the nozzle, is subjected to drag force and gravity, the fluid projectile is best modelled dynamically. The governing equation of the fluid is given by Equation (32), where the left-hand term is the acceleration of a fluid parcel within the fluid stream, c is the drag constant, g is gravity and m is the mass.

[00119] The drag term is expressed as a linear function of velocity, i.e., drag increases as the velocity of the flow increases. With some initial exit velocity, Equation (32) can be analytically integrated to obtain the fluid link as expressed in Equation (33).

[00120] In practice, the variables are nearly impossible to determine theoretically. Hence, the variables may be obtained empirically through simulations and experiments, which will be discussed later. Thus, the Forward Kinematics (FK) in its respective axes, from the origin p_o to p_poc, is given by Equation (34):

[00121] With reference to the foregoing, multiple solutions exist. A symbolic-based nonlinear optimization tool may be employed to solve for the relative angles and the length l_ob. Equation (35) describes an objective function with constraints on the desired relative position of the aerial robot and the POC. This objective function attempts to minimise the values of its four (4) terms. By minimising the objective function, the optimal values for γ_ob, α and l_ob are generated to formulate the shortest path for the bulk stream to travel from the nozzle orifice to the POC, resulting in the highest fluid force that can be achieved with respect to the specific input parameters. This path will ensure that the bulk stream is able to transfer as much energy as possible to the target at the POC.

Force at POC

[00122] With the Hybrid Model, the POC can be accurately placed for any target position (corresponding to the target area as described hereinbefore) at any given altitude. This subsection will address the parameters that can be obtained at the POC. Estimating the parameters at the POC will essentially aid in force estimation at the target. By understanding the relationship (f) between the UAV’s position, the nozzle angle (α) and the velocity of the fluid at the POC, the force at the POC can be estimated.

[00123] The magnitudes of the velocity throughout the bulk stream can be estimated along its trajectory by finding the gradient at the specific altitude, as well as by using computer-aided methods to simulate and study the bulk stream with greater accuracy, as will be described in the following sections.

[00124] Since the servo is only capable of a single degree (x-z) of actuation, the bulk stream’s velocity components will comprise the velocity dynamic equations:

[00125] In order to determine v_e, the extended Bernoulli’s equation was applied from the pump (Pt 1) to the nozzle orifice (Pt 2) as shown in FIG. 28. More particularly, FIG. 28 illustrates a concept of the model from inlet (Pt 1) to outlet (Pt 2). With that, v_e in Equation (31) is derived from the extended Bernoulli’s theorem, where dp is the differential pressure between the inlet and the outlet, and the altitude term is the difference in altitude between the nozzle orifice and the ground (altitude of the pump). A cylindrical nozzle diameter of 1 mm and a Darcy friction factor of 0.0311, obtained by Newton-Raphson iteration of the Colebrook-White equation as described in C. F. Colebrook, C. M. White, “Experiments with Fluid Friction in Roughened Pipes”, Proceedings of The Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 161, pp. 367-381, 1937, were used.
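For illustration purposes only and without limitation, the following Python sketch shows a Newton-Raphson iteration on the Colebrook-White equation for the Darcy friction factor, together with a simplified extended-Bernoulli estimate of the nozzle exit velocity v_e that accounts for the altitude difference and Darcy-Weisbach friction loss in the hose. The hose length and diameter, the roughness, the assumed hose-flow Reynolds number and the neglect of minor losses are illustrative assumptions; the exact form of Equation (31) is not reproduced herein.

import math

def darcy_friction_factor(reynolds, rel_roughness, f0=0.02, tol=1e-10, max_iter=50):
    """Newton-Raphson iteration on the Colebrook-White equation
    1/sqrt(f) = -2*log10(eps/(3.7*D) + 2.51/(Re*sqrt(f)))."""
    A = rel_roughness / 3.7
    B = 2.51 / reynolds
    f = f0
    for _ in range(max_iter):
        inv_sqrt = f ** -0.5
        g = inv_sqrt + 2.0 * math.log10(A + B * inv_sqrt)
        dg = -0.5 * f ** -1.5 * (1.0 + (2.0 / math.log(10)) * B / (A + B * inv_sqrt))
        f_next = f - g / dg
        if abs(f_next - f) < tol:
            return f_next
        f = f_next
    return f

def exit_velocity(dp, dz, f, hose_len, hose_dia, nozzle_dia, rho=998.0, g=9.81):
    """Simplified extended-Bernoulli estimate from pump (Pt 1) to orifice (Pt 2):
    dp/rho = (v_e^2 - v_hose^2)/2 + g*dz + f*(L/D)*v_hose^2/2, with continuity
    v_hose = v_e*(d_nozzle/d_hose)^2 and minor losses neglected."""
    r4 = (nozzle_dia / hose_dia) ** 4
    denom = 1.0 - r4 + f * (hose_len / hose_dia) * r4
    return math.sqrt(2.0 * (dp / rho - g * dz) / denom)

if __name__ == "__main__":
    # Illustrative numbers only: ~1.5 m/s flow in a smooth 10 m x 10 mm hose,
    # 120 bar differential pressure, nozzle 4 m above the pump.
    Re = 998.0 * 1.5 * 0.010 / 1.0e-3
    f = darcy_friction_factor(Re, rel_roughness=1.5e-6 / 0.010)
    v_e = exit_velocity(dp=120e5, dz=4.0, f=f, hose_len=10.0, hose_dia=0.010,
                        nozzle_dia=0.001)
    print(f"f = {f:.4f}, v_e = {v_e:.1f} m/s")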

[00126] The force at the POC can be described with the Conservation of Momentum law. To extrapolate the force at any point throughout the trajectory, the force from momentum can be defined in Equation (40), where ρ is the density of the dispersed fluid, A_roi is the area of the water at the region of interest (roi), and Δv_p is the change in the fluid’s velocity along the path from the exit orifice (v_e) to any region of interest (v_p,roi).

[00127] These are vital variables in Equation (40). With increased v_e (or pump pressure), decreased α and/or decreased z_p, the higher the linearity of the bulk stream, as described in N. Rajaratnam, S. A. H. Rizvi, P. M. Steffler, P. R. Smy, “Experimental study of very high velocity circular waterjets in air”, Journal of Hydraulic Research, vol. 32, no. 3, pp. 461-470, 10.1080/00221689409498746, 1994 and N. Rajaratnam, C. Albers, “Water distribution in very high velocity waterjets in air”, Journal of Hydraulic Engineering, vol. 124, no. 6, pp. 647-650, 10.1061/(ASCE)0733-9429(1998)124:6(647), 1998, and hence the higher the force that can be delivered. While the bulk stream appears to be linear, J. D. Thomas, C. Liu, F. A. Flachskampf, J. P. O’Shea, R. Davidoff, A. E. Weyman, “Quantification of Jet Flow by Momentum Analysis”, Circulation, vol. 81, no. 1, pp. 247-259, 10.1161/01.CIR.81.1.247, 1990 states that for flow regions beyond the potential core, it is impractical to model the profile as a plug flow. In order to determine the bulk velocity at a specific point from the orifice, the following equation holds, where v_m is the midline velocity at the given distance, and r is the radius of the bulk stream.

[00128] Additionally, with Equations (38)-(39), the angle at which the bulk stream comes into contact with the target can be determined. This angle will give the fluid stream the ‘brushing/pushing’ effect onto the target(s) to remove it from the surface.

[00129] Since the variables in Equation (40) are theoretically impossible to retrieve, they may be obtained through empirical means via simulation models and experiments, which will be demonstrated later.
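For illustration purposes only and without limitation, the following Python sketch uses the standard jet-impingement momentum flux, F = ρ·A·|v|², as a stand-in for the exact Equation (40), which is not reproduced herein, together with the impingement angle β computed from the velocity components. The stream radius, the 30% effective area fraction (following the CFD observation in the next subsection) and the velocities are illustrative placeholders.

import math

RHO_WATER = 998.0  # density of water (kg/m^3)

def poc_force(v_x, v_z, stream_radius, area_fraction=0.30):
    """Estimate the fluid force and impingement angle at the POC using the
    standard momentum-flux relation F = rho * A_roi * |v|^2 (a stand-in for
    Equation (40)), where only `area_fraction` of the dispersed stream's
    cross-section is assumed to carry significant momentum."""
    speed = math.hypot(v_x, v_z)
    a_roi = area_fraction * math.pi * stream_radius ** 2
    force = RHO_WATER * a_roi * speed ** 2                 # Newtons
    beta = math.degrees(math.atan2(abs(v_z), abs(v_x)))    # angle w.r.t. the surface
    return force, beta

if __name__ == "__main__":
    # Illustrative values: a ~4 mm radius bulk stream arriving at roughly 30 m/s.
    force, beta = poc_force(v_x=8.0, v_z=-29.0, stream_radius=0.004)
    print(f"F = {force:.1f} N, beta = {beta:.1f} deg")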

Computational Fluid Dynamics (CFD)

[00130] Noting that the cross-sectional area of interest of a dispersed waterjet and its change in velocity aft of the nozzle orifice in Equation (40) are impossible to measure physically, a CFD simulation was conducted to obtain these parameters empirically with the setup described in the table shown in FIG. 29.

[00131] FIG. 30 illustrates a sample of FLUENT Flow Fields: a-c: 0°, 10°, 20°. From FIG. 30, a reduction in water phase velocity was observed towards the water-air boundary from the bulk stream axis. This is represented by the phenomenon of momentum and mass transfer between two fluid phases due to air entrainment breaking up the continuous medium into dispersed droplets. Within the main region, the initially continuous stream is disintegrated into droplets of decreasing velocity and momentum from the bulk stream axis (white midline), where droplets near the water-air boundary are of negligible momentum as illustrated by the water velocity contours. As such, an approximate region of interest (about 30%) of the bulk stream was inferred as the appropriate cross-sectional area for the closest approximation in Equation (40).

Simulations

(Hybrid) Inverse Kinematics (IK)

[00132] As derived in Equations (28)-(34), these expressions define the model that governs the positioning of the 3D coordinates of the POC in the local frame. With that, solving the IK would allow the end-effector position to be selected and the joint angles between each link to be obtained as a set of solutions. This means that the input parameters for the IK would be the 3D position of the end-effector; thereafter, it solves for the joint angles and the specific link lengths. Equation (35) describes the objective function that solves for the optimal parameters for the aforementioned hybrid system. The non-linear programming (NLP) solver employed in various example embodiments is the Interior Point OPTimizer (IPOPT), which is commonly used for nonlinear optimization of continuous systems. FIGS. 31A-31B show respective surface plots for the x and z POC points for Exp c) as discussed with respect to the table illustrated in FIG. 32. More particularly, FIGS. 31A-31B show surface plots for example Exp c) (4.0, 0.0, 6.0).

[00133] More particularly, the table illustrated in FIG. 32 shows the hybrid kinematics simulation results. The table shows three (3) sets of simulation results for various (x, y) end-effector positions; the z-coordinate for the end-effector may always be kept at the predetermined height of the target. The IK solutions demonstrate that the simulation is able to compute the optimal joint angles and link length automatically, almost instantaneously, by just providing the desired 3D end-effector coordinates. The time taken to compute each set of optimal solutions was less than 10 ms.

[00134] Since the crux of the hybrid IK lies in the end-effector link, the region where the POC is considered effective is mapped by the configuration space, which has a delta-like or curved conical shape as demarcated by dotted lines in FIG. 31C. More particularly, FIG. 31C illustrates a configuration space for the bulk stream. The accentuated curved edges of the conical shape are a result of the non-linearity of the bulk stream at the extremities of the nozzle actuation angle (α).

Experiments

[00135] The aerial robots may be physical UAV devices that are designed to deliver pressurised fluid at the target through a commercial pump system. As illustrated in FIGS. 33A-33B, these UAVs are designed such that each UAV has an undercarriage that carries the high-pressure nozzle connected to a hose supplying fluid. For example, FIG. 33A illustrates an example aerial robot for high pressure, while FIG. 33B illustrates an example aerial robot for low pressure, according to various example embodiments. With this tethered configuration, the UAV does not need to carry the entire volume of water while still being able to continuously jet pressurised fluid at the target. The high-pressure nozzle is also given a single degree-of-freedom servo system, for an extended range of motion for the nozzle actuation. The pre-defined parameters for the high-pressure system are: 90°, 0.06 m and 0.19 m respectively. A similar configuration is designed for a lower-pressure system and its pre-defined parameters are: 73.34°, 0.05 m and 0.12 m. The high and low pressure systems operate at 120 bar and 45 bar respectively. FIG. 33C illustrates an example littered sheltered walkway.

Force Characterisation (Verification)

[00136] The following experiments were conducted to analyse the force of the fluid at the POC as it strikes the target. The actual scenario describes sheltered walkways which stand at about 2.75 m above the ground, littered with twigs, leaves, tissues, bird droppings, etc. For example, sheltered walkways in Singapore are critical structures that facilitate public commuting and hence require periodic maintenance for public safety. Since the target is 2.75 m above the ground, the UAV was positioned at a certain height above the surface to prevent any ricochet/splash-back of the fluid as it strikes the target surface. In this case, 1.25 m was chosen, which is a considerably safe distance away from the target surface, rounding the total altitude to 4 m above the ground; this prevents “splashbacks” and updrafts from any ground effect that can render the UAV unstable.

[00137] The indoor experiment setup shown in FIG. 34 illustrates the force characterisation process at a defined altitude for various actuation angles. More particularly, FIG. 34 shows the experimental setup for force characterisation. The UAV is mounted at the top of the static rig where the fluid jet strikes a load scale, placed 1.25 m below, on the base plate of the rig to record the impact force at the POC. The load scale is made up of a combination of four (4) single-point load cells that produce the force reading when there is any loading on the acrylic plate. The entire experimental data set is illustrated as the various histograms shown in FIG. 35. More particularly, FIG. 35 illustrates the force characterisations: low pressure vs high pressure. Since jet impingement occurs about 1.25 m away from the nozzle, the free jet contains a mixture of air and water. The histograms describe the average spread of load occurrences of the respective actuation angles for each experiment set. The predicted values (vertical line 3510) are computed with Equations (38)-(41), with a percentage error within 2-7% of the respective measurements (underestimated). This could be due to several factors including, but not limited to, air entrapment before ejection and individual load cell sensitivity tolerance.

3D Kinematics with Fluid Ejection

i) System Flight Test

[00138] Various example experiments demonstrate the entire system in a flight test with fluid ejection to validate the feasibility and accuracy of the system model. FIG. 36 shows the typical setup for a single experiment with the relevant key information labelled. The aim of the experiment was to test the accuracy of the models formulated in the modelling section above. In order to understand the properties at the POC, a reservoir is placed at the far end of the setup with a load scale on it; this doubles as the target point while measuring the force of the fluid at the POC, i.e., measuring the force away from the main system.

[00139] The various flight tests are recorded in a table illustrated in FIG. 37; the target positions [x, y, z]^T for the first, second and third experiments are [3.6, 1.1, 1.8]^T m, [3.5, 0.1, 1.5]^T m and [3.5, 1.8, 1.6]^T m, respectively.

[00140] FIG. 38A shows low-pressure jet impingement, while FIG. 38B shows high-pressure jet impingement. As shown in FIGS. 38A-38B, for the force characterisation flight test, it is assumed that the aerial robot is in the quasi-static position with negligible bodily translations. FIG. 39 shows a table which records the various experiment data for the different altitudes, nozzle angles (α), pressures and the estimated bulk stream impingement angles (β). The results showcase the relationship between the altitude of the aerial robot and the nozzle angle against the force exerted on the load scale. It can be clearly seen that the z distance affects the force the most as compared to increasing the actuation angle (α) of the nozzle, with an error of < 7% for 2.0 m and < 20% for 2.5 m. The 3D HKF Model was able to provide the means for an aerial robot to position and deliver fluid with a desired force (magnitude and direction), for precise fluid ejection.

[00141] Accordingly, various example embodiments describe the implementation of a 3D HKF model (Hybrid Kinematics Model coupled with a Force Estimation Model) for an aerial robot. The various mechanics and force estimations were studied, modelled and simulated in a controlled environment before verification in actual flight tests. The results have shown that the UAV and POC can be positioned with the impact force at the POC estimated to a certain degree of accuracy. Since air entrainment increases with a larger stand-off distance, it is safer and more practical to manoeuvre the nozzle between 1.5 m and 2.0 m from the target surface with the nozzle actuation angle between -5° and 5° from the POC. This subjects the UAV to less induced ground effect and reaction force from the ejection, ensuring the stability of the UAV during position flight and more accurate POC parameters during ejection.

[00142] From the table illustrated in FIG. 39, the force estimation is more accurate at lower nozzle actuation angles. This suggests the possibility of better force control at the POC in the event of different types of surface. The β angle is also more accurate at lower altitude and with higher flow velocity. Various example embodiments may be configured for force control and machine learning-based optimization for computer vision and control applications. Various example embodiments may include the measurement of force at the actuation point (source) instead, which may allow better knowledge to handle the reaction force upon triggering.

[00143] According to various example embodiments, a tethered UAV system capable of autonomous fluidic payload delivery with visual-based closed-loop control is provided. The system may, for example, be used for cleaning surfaces at low-altitude, hard-to-reach places, which can be employed for aviation or maritime cleaning. Various example embodiments employ an object detector for debris detection; this detector is trained to identify (specific) potential debris over a uniform surface (corresponding to the target area on the surface as described hereinbefore). After which, the position of the debris in the image frame is translated into an actuation angle for fluidic payload delivery. Various example embodiments of the invention comprise a secondary detector that identifies the point-of-contact of the jet stream with the surface; the simultaneous detections are fed to a compensator node as inputs for a visual-based closed-loop control for the actuation system. This smart visual-based system can be deployed on any platform with sufficient endurance and processing power. Various example embodiments of the invention may be employed at low-to-medium altitude in the aviation or maritime sector, as well as for cleaning and inspection services in the aviation, maritime and agriculture sectors.

[00144] The UAV, with its inherent ability to ascend and hover over the sheltered walkway, will give operators on the ground an undistorted view of the roof of the sheltered walkway. According to various example embodiments, a high-resolution camera on-board the UAV will provide real-time view of the roof and is documented along with GPS coordinates for further offline analysis. If the roof is obscured by debris, it can be cleared, and the roof cleaned using a water jet that is controlled by a nozzle on-board the UAV and pump on the ground. As illustrated in FIG. 40, a tether provides a water source link (which can be a nearby water tap or provided externally with a water truck) between the motorized ground system and UAV. This tether also provides power to the UAV so that it can remain operational indefinitely. Power to the motorized ground unit and to the UAV is provided through a battery system or electrical generator so that it can be operated in remote locations. The ground unit can be motorized so that it can be manually commanded to traverse along the length of the walkway to allow continuous operation of the UAV.

[00145] Various example embodiments advantageously allow tasks to be carried out with fewer human operators, more quickly as cumbersome scaffolding does not need to be erected, and much more safely as personnel no longer need to operate at elevated positions. The tethered unmanned aerial vehicles (UAVs) may be employed to provide visibility of the roofs of walkways for real-time analysis and documentation. Such information and analysis may include defect classification and GPS coordinates of suspected defects so that pictures can be retrieved for offline verification. The UAV may be fitted with a waterjet that will allow it to clear leaves, twigs and light debris that accumulate on the roof of the walkways. Advanced sensing systems on the UAVs will allow it to track the location of the walkway and allow it to automatically follow the path of the walkway using a multitude of on-board advanced sensing systems such as GPS, UWB, IMU, vision and LIDAR. Various example embodiments may be implemented for rapid, efficient and safe inspection of infrastructure, for example.

[00146] Various example embodiments comprise a tethered aerial drone to gain access to the top of sheltered walkways without the need for elevating any humans (humans are not placed at height). This tethering allows the drone to be powered from the ground so that the drone is smaller and lighter, since batteries are no longer required, and the flight endurance can be significantly extended. The drone also does not possess a container to hold water but instead channels the water up using a tethering water hose (corresponding to the fluid connector as described hereinbefore). Various embodiments may be compact, lightweight and able to operate in conditions where sheltered walkways are present. Various example embodiments comprise a controllable nozzle on the drone which allows the water channelled up to the drone to be controlled as it is directed down onto the sheltered walkway. This allows the drone to eject water at debris and dirt at an inclined angle so that it can be pushed off the walkway. It also provides a means to reach areas which are blocked by obstacles such as trees. Onboard the UAV, a multitude of sensors detect the sheltered walkway surface so that the drone can automatically follow the walkway. Vision-based sensors also actively track the sheltered walkway to automatically classify and locate dirt/debris and defects. This sensing system will provide feedback to the drone flight controller and nozzle controller, so the water is targeted at the correct area. AI and machine learning techniques may be used to estimate the contact point when the water stream contacts the surface of the sheltered walkway. Various example embodiments allow real-time and automated processing of data instead of post-processing via data tethering, as well as the ability to do cleaning and inspection in one setting. The system automatically identifies and targets areas of the walkways requiring cleaning.

[00147] While embodiments of the invention have been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced.