Title:
WEED MAPPING
Document Type and Number:
WIPO Patent Application WO/2021/062459
Kind Code:
A1
Abstract:
A sensing unit mountable to an aerial vehicle comprises a frame to mount the sensing unit to the aerial vehicle, a gimbal coupled to the frame allowing pitch and roll of a body mounted to the gimbal relative to the frame, and one or more cameras mounted to the body. The sensing unit comprises a pair of cameras mounted to the body at an angle to the vertical and angled away from each other to minimise an overlap of images captured by the cameras. Combined locational data is matched with image capture times and height data captured by the aerial vehicle. The image data is analysed to determine the pixels representing weeds, remove false positives, calculate a centroid and radius of each remaining area representing weeds, and combine a list of centroids and radii representing weeds with the locational data. The list is converted into a format, such as a shapefile, usable by selected treatment equipment, such as a sprayer control system.

Inventors:
SINGLE JOHN ANTHONY (AU)
SINGLE MARY LOUISA (AU)
SINGLE BENJAMIN VALLACK (AU)
SINGLE ANTHONY VALLACK (AU)
Application Number:
PCT/AU2019/051079
Publication Date:
April 08, 2021
Filing Date:
October 04, 2019
Assignee:
SINGLE AGRICULTURE PTY LTD (AU)
International Classes:
G06T7/50; G01C11/00; G01S17/89; G06F15/00; G06V10/25
Domestic Patent References:
WO2017077543A1 2017-05-11
Foreign References:
US20190304120A1 2019-10-03
US20160280397A1 2016-09-29
US10028426B2 2018-07-24
US20180259496A1 2018-09-13
US20180129210A1 2018-05-10
US20190205610A1 2019-07-04
US20020022929A1 2002-02-21
Attorney, Agent or Firm:
SPRUSON & FERGUSON (AU)
Claims:
CLAIMS

1. A sensing unit mountable to an aerial vehicle, the sensing unit comprising: a frame to mount the sensing unit to the aerial vehicle; a gimbal coupled to the frame allowing pitch and roll of a body mounted to the gimbal relative to the frame; and one or more cameras, in particular, a pair of cameras mounted to the body at an angle to the vertical and angled away from each other to minimise an overlap of images captured by the cameras.

2. The sensing unit of claim 1, comprising a light detection and ranging (LiDAR) unit mounted to the body between the cameras.

3. The sensing unit of claim 1 or 2, comprising a processor mounted to, or housed within the body, wherein the processor is in communication with: the gimbal to control the orientation of the gimbal; and/or the cameras to receive image data from the cameras; and/or the LiDAR unit to receive ranging data from the LiDAR unit.

4. The sensing unit of claim 3, wherein the processor is in communication with an inertial measurement unit (IMU), digital compass and GPS unit, optionally provided in a combined unit, and mounted to the sensing unit or to the aerial vehicle.

5. The sensing unit of claim 3 or 4, wherein the processor is in communication with a light sensor mounted to the sensing unit or to the aerial vehicle to measure ambient light levels.

6. The sensing unit of claim 3, 4 or 5, wherein the processor is in communication with a memory, such as a USB SSD, optionally accommodated in a housing mounted to, or housed within the body.

7. The sensing unit of any preceding claim, comprising a power distribution board, optionally mounted to, or housed within the body, and in communication with one or more of the gimbal, cameras, LiDAR unit, processor and light sensor.

8. The sensing unit of any preceding claim, comprising a timing board, optionally mounted to, or housed within the body, and in communication with one or more of the cameras, LiDAR unit, processor, IMU, digital compass, GPS unit and light sensor.

9. An aerial vehicle comprising the sensing unit of any preceding claim.

10. The aerial vehicle of claim 9, wherein the aerial vehicle is an unmanned, remotely controlled aerial vehicle (UAV), such as a fixed wing, multicopter, helicopter or hybrid thereof, or a manned aerial vehicle.

11. A method of processing image data and locational data captured by an aerial vehicle to identify a location and size of a ground feature, in particular weeds, the method comprising: combining data captured by a GPS unit, an inertial measurement unit (IMU) and a digital compass mounted to the aerial vehicle with data captured by a ground based GPS base station to generate combined locational data; matching the combined locational data with image capture times for images captured by one or more cameras mounted to the aerial vehicle and height data captured by the aerial vehicle; analysing the image data to determine whether or not each pixel represents the ground feature, in particular weeds; performing area analysis on the image data to remove false positives; calculating a centroid and radius of each remaining area representing the ground feature, in particular weeds; combining a list of centroids and radii representing the ground feature, in particular weeds, with the combined locational data; and generating a list of longitudes, latitudes and radii representing the ground feature, in particular weeds.

12. The method of claim 11, further comprising converting the list of longitudes, latitudes and radii representing the ground feature, in particular weeds, to a format usable by selected treatment equipment, such as a sprayer control system.

13. The method of claim 11 or 12, wherein the image data and the locational data are captured by the sensing unit of any one of claims 1 to 8.

14. A method of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment, the method comprising a processor: creating a background array based at least on a bounding box representing an area to be treated; setting elements representing weeds in the background array to zero; creating a polygon array equal to the background array; analysing the background array for elements representing weeds and elements representing non-weeds until all elements in the background array are set to zero and modifying the polygon array correspondingly until all non-weed elements in the polygon array are equal to a value of 1 and individual polygon elements have the same value; creating polygons comprising points based on the elements of the modified polygon array; and converting the points of the polygons to longitudes and latitudes.

15. The method of claim 14, comprising writing a shapefile based on the longitudes and latitudes converted from the points of the polygons.

16. The method of claim 14 or 15, wherein creating polygons comprises the following steps for each different set of values in the polygon array: a) creating a list of the coordinates of the maximum column values (e.g. from left to right) and a list of the minimum coordinates; b) creating a new list; c) working through the top list from left to right, appending the first set of coordinates to the new list, and then if the row is not equal to the previous row, appending one set of coordinates at the current column and old row and then appending the current coordinates to the new list and continuing until the last value in the list is reached and appending that coordinate to the new list; d) repeating step c) for the bottom list, but processing from right to left (i.e. maximum column to minimum column) and adding one to the y (column) value (i.e. moving it down one square) and appending these to the same list as the top section; and e) appending the first value in the new list to the end of the new list, thus closing the polygon.

17. The method of claim 14, comprising analysing the array using a connected elements analysis comprising grouping blobs of weeds together.

18. The method of claim 17, comprising calculating a list of the centroids of each blob of weeds based on the connected elements analysis to determine the shortest distance between each centroid.

19. The method of claim 18, comprising generating a spray path independently for each blob of weeds to ensure each area is sprayed.

20. The method of claim 19, comprising combining the spray path for each blob of weeds into a list of all the spray paths in an order to be sprayed based on a shortest distance between each blob.

Description:
TITLE

WEED MAPPING

FIELD OF THE INVENTION

The present invention relates to weed mapping. In particular, embodiments of the present invention relate to systems, methods and apparatus for weed mapping, but mapping of other features is also envisaged.

BACKGROUND TO THE INVENTION

The detection and treatment of weeds is an ongoing requirement in agriculture. One known solution is a boom-mounted selective spray system comprising a plurality of sensors and a herbicide spraying system that can be mounted to a ground vehicle, such as a tractor, or to a dedicated guided vehicle. The sensors endeavour to distinguish weeds from desired plants, crops or stubble and selectively spray the weeds and not the desired plants, crops or stubble. This approach aims to target the weeds, avoid unnecessary treatment of the desired plants, crops or stubble and reduce the use and associated cost of herbicides and the like. However, the added weight, cost and power requirements, as well as dust created by the vehicle, compromise and limit what can be achieved by such systems.

One proposed solution to some of these problems is to mount the sensors to an aerial vehicle, such as a fixed wing, multicopter (e.g. quadcopter or octocopter etc.), helicopter or hybrids of these aerial vehicles. Typically, they are unmanned, remotely controlled aerial vehicles (UAVs), but in some cases the aerial vehicles are manned. Such aerial vehicles comprise an array of separate infra-red (IR), red (R) and green (G) digital cameras to capture image data and a location measuring device, e.g. GPS, to produce georeferenced images. Typically, photogrammetry, via software, is used to combine the image data with the locational data to generate 3D images. A file is generated detailing the location of the weeds and the file is used by a ground based vehicle or aerial vehicle to spray the weeds. Digital cameras used for capturing images from an aerial vehicle vary greatly in capacity, arrangement and features. Typically, they are either RGB cameras based on consumer, mass produced variants, e.g. the Sony A6000, or a specifically designed aerial vehicle camera system comprising an array of ~5 monochrome cameras capturing images in different light spectra paired with a light sensor.

Locational data is captured typically via a GPS device only. The GPS accuracy can be increased using real-time kinematic (RTK) or post-processing kinematic (PPK) methods which require a base station. Accuracy can be further enhanced by recording the yaw, pitch and roll of the vehicle via an inertial measurement unit (IMU) and the heading of the vehicle via a digital compass. The height above ground can also be captured via a LiDAR or radar system. Surveyed ground control points (GCP) can also be used to further increase locational accuracy.

One known apparatus comprises an array of five individual cameras covering blue, green, red, red-edge and near infrared spectra, a light sensor and a GPS unit mounted on an aerial vehicle. To geolocate pixels, the apparatus processes the captured data via a methodology called photogrammetry. This process includes capturing overlapping images with a record of the estimated position from on-board GPS. Features are identified in overlapping images and are meshed together. The position, orientation and location of the images are determined via software using triangulation and trigonometry. Ground control points (GCP), with known locations, are placed throughout the surveyed area to increase positional accuracy to an accuracy of ~1m or better. The output of the process is a georeferenced image which captures the locational data of the pixels in an image. The georeferenced images are processed to identify the weeds and convert the data to weed maps. The primary focus of this system is plant health in crops rather than the identification of individual plants or plant types. This is typically undertaken on comparatively small areas (e.g. under 100ha) and only requires low resolution images. Generally, the aforementioned apparatus and other known apparatus involve a large overlap of images with large computational processing requirements. There are three main issues with the aforementioned apparatus relating to photogrammetry, the cameras and the creation of shapefiles.

In the photogrammetry process, the images must substantially overlap both horizontally and longitudinally. Typically, this is a 70% overlap. At 70% overlap, the coverage rate efficiency is 9%, i.e. only 9% of each image covers an additional area. Processing is extremely computationally demanding and is typically undertaken in server “farms”, i.e. in the cloud. This requires the data to be uploaded to the internet, which is problematic even with a dedicated, full speed internet connection. Ideally, weed maps should be available the same day or within 24 hours of surveying the area. This issue is further exacerbated in rural areas (i.e. where this system operates) which typically use 4G mobile or satellite internet connections, both of which are poorly placed to upload this quantity of data. If standard GPS (L1 band) is being used, the GCPs must be placed throughout the survey area to ensure positional accuracy. Each GCP must be surveyed in and be identifiable from the air, which requires the GCPs to be maintained.

Regarding the cameras, identification of plants via software in a standard colour image is difficult to achieve, which makes off-the-shelf RGB cameras unviable. The best spectra to use are red and near infrared. Typically, this is achieved via multiple individual monochrome cameras. When combined with a light sensor, highly accurate reflectance data is captured which is excellent for measuring plant health. The downside is the cost and complexity of multiple individual cameras, which are typically limited to low resolutions and require substantial post-processing to combine the individual spectra. The aforementioned apparatus achieves a ground sample distance (GSD) of 5.2cm at 120m. However, to detect small plants, e.g. under 5cm in diameter, resolutions of ~1cm GSD, or more than 25 times higher, are required. This can be achieved by flying the aerial vehicle lower, but this compromises coverage rate.

The outcome of the above issues relating to photogrammetry and the cameras is that, at resolutions of 1cm GSD, the aforementioned apparatus can only capture ~10ha per hour, excluding non-productive time, such as changing batteries etc. Another issue is that the creation of the shapefiles used by spraying equipment is not typically an output of the aforementioned apparatus and other known apparatus and the shapefiles must be generated via third party software. Also, to be commercially viable, the shapefiles should cover large areas, typically 200ha+, which results in shapefiles having a size which often exceeds the memory of the spray systems and hence the shapefiles are not usable by the spray systems.

OBJECT OF THE INVENTION

A preferred object of the present invention is to provide a system and/or a method and/or an apparatus that addresses or at least ameliorates one or more of the aforementioned problems and/or provides a useful commercial alternative.

SUMMARY OF THE INVENTION

One aspect of the present invention is directed to an apparatus in the form of a sensing unit mountable to an aerial vehicle to capture images of the ground to detect ground features, in particular weeds.

Another aspect of the present invention is directed to an apparatus in the form of an aerial vehicle comprising the sensing unit.

A further aspect of the present invention is directed to methods of processing images and locational data to identify locations and sizes of ground features, in particular weeds.

A yet further aspect of the present invention is directed to methods of generating shapefiles for use by weed treatment equipment.

According to one aspect, but not necessarily the broadest aspect, the present invention is directed to a sensing unit mountable to an aerial vehicle, the sensing unit comprising: a frame to mount the sensing unit to the aerial vehicle; a gimbal coupled to the frame allowing pitch and roll relative to the frame; a body mounted to the gimbal; and one or more cameras, in particular, a pair of cameras mounted to the body at an angle to the vertical and angled away from each other to minimise an overlap of images captured by the cameras.

Preferably, the sensing unit comprises a light detection and ranging (LiDAR) unit mounted to the body, and in particular between the cameras.

Preferably, the sensing unit comprises a processor mounted to, or housed within the body, wherein the processor is in communication with: the gimbal to control the orientation of the gimbal; the cameras to receive image data from the cameras; the LiDAR unit to receive ranging data from the LiDAR unit.

Preferably, the processor is in communication with an inertial measurement unit (IMU), digital compass and GPS unit, optionally provided in a combined unit, and mounted to the sensing unit or to the aerial vehicle.

Preferably, the processor is in communication with a light sensor mounted to the sensing unit or to the aerial vehicle.

Preferably, the processor is in communication with a memory, such as a USB SSD, optionally accommodated in a housing mounted to, or housed within the body.

Preferably, the sensing unit comprises a power distribution board, optionally mounted to, or housed within the body, and in communication with one or more of the gimbal, cameras, LiDAR unit, processor and light sensor.

Preferably, the sensing unit comprises a timing board, optionally mounted to, or housed within the body, and in communication with one or more of the cameras, LiDAR unit, processor, IMU, digital compass, GPS unit and light sensor.

According to another aspect, but not necessarily the broadest aspect, the present invention is directed to an aerial vehicle comprising the sensing unit. The aerial vehicle may be an unmanned, remotely controlled aerial vehicle (UAV), such as a fixed wing, multicopter, helicopter or hybrid thereof, or a manned aerial vehicle.

According to a further aspect, but not necessarily the broadest aspect, the present invention is directed to a method of processing image data and locational data captured by an aerial vehicle by a processor to identify a location and size of a ground feature, in particular weeds, the method comprising the processor: combining data captured by a GPS unit, an inertial measurement unit (IMU) and a digital compass mounted to the aerial vehicle with data captured by a ground based GPS base station to generate combined locational data; matching the combined locational data with image capture times for images captured by a pair of cameras mounted to the aerial vehicle and height data captured by the aerial vehicle; analysing the image data to determine whether or not each pixel represents the ground feature, in particular weeds; performing area analysis on the image data to remove false positives; calculating a centroid and radius of each remaining area representing the ground feature, in particular weeds; combining a list of centroids and radii representing the ground feature, in particular weeds, with the combined locational data; and generating a list of longitudes, latitudes and radii representing the ground feature, in particular weeds.

Preferably, the method further includes converting the list of longitudes, latitudes and radii representing the ground feature, in particular weeds, to a format usable by selected treatment equipment, such as a sprayer control system.

Preferably, the image data and the locational data are captured by the aforementioned sensing unit mounted to the aerial vehicle.

According to a yet further aspect, but not necessarily the broadest aspect, the present invention is directed to a method of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment, the method comprising a processor: creating a background array based at least on a bounding box representing an area to be treated; setting elements representing weeds in the background array to zero; creating a polygon array equal to the background array; analysing the background array for elements representing weeds and elements representing non-weeds until all elements in the background array are set to zero and modifying the polygon array correspondingly until all non-weed elements in the polygon array are equal to a value of 1 and individual polygon elements have the same value; creating polygons comprising points based on the elements of the modified polygon array; and converting the points of the polygons to longitudes and latitudes.

The method may include writing a shapefile based on the longitudes and latitudes converted from the points of the polygons.

Creating polygons may comprise the following steps for each different set of values in the polygon array: a) creating a list of the coordinates of the maximum column values (e.g. from left to right) and a list of the minimum coordinates; b) creating a new list; c) working through the top list from left to right, appending the first set of coordinates to the new list, and then if the row is not equal to the previous row, appending one set of coordinates at the current column and old row and then appending the current coordinates to the new list and continuing until the last value in the list is reached and appending that coordinate to the new list; d) repeating step c) for the bottom list, but processing from right to left (i.e. maximum column to minimum column) and adding one to the y (column) value (i.e. moving it down one square) and appending these to the same list as the top section; and e) appending the first value in the new list to the end of the new list, thus closing the polygon.

The method may comprise analysing the array using a connected elements analysis comprising grouping blobs of weeds together.

The method may comprise calculating a list of the centroids of each blob of weeds based on the connected elements analysis to determine the shortest distance between each centroid.

The method may comprise generating a spray path independently for each blob of weeds to ensure each area is sprayed.

The method may comprise combining the spray path for each blob of weeds into a list of all the spray paths in an order to be sprayed based on a shortest distance between each blob.

Further features and/or aspects of the present invention will become apparent from the following detailed description.

BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present invention will now be described by way of example only with reference to the accompanying drawings in which like reference numerals refer to like features. In the drawings:

FIG 1 is a front view of a sensing unit according to an embodiment of the present invention;

FIG 2 is a side view of the sensing unit shown in FIG 1;

FIG 3 is a front view of the sensing unit shown in FIG 1 mounted to an aerial vehicle;

FIG 4 is a side view of the sensing unit mounted to the aerial vehicle shown in FIG 3;

FIG 5 is a block diagram of elements of the sensing unit shown in FIG 1;

FIG 6 shows the sensing unit mounted to the aerial vehicle shown in FIG 3 flying above an area of ground being imaged;

FIG 7 is a general flow diagram illustrating methods of processing image data and locational data to identify a location and size of a ground feature, in particular weeds, according to embodiments of the present invention;

FIG 8 is an example image in which plant pixels have been identified by methods of the present invention and highlighted in a specific colour;

FIG 9 shows the result of using thresholds across each image channel in the L*a*b* colour space for the same source image used in the example shown in FIG 8;

FIG 10 shows the thresholds used for each image channel for the image in FIG 9;

FIG 11 shows the result of using thresholds in different colour spaces - RGB, HSV, YCbCr and L*a*b* for the same source image used in the example shown in FIG 8;

FIG 12 shows the result of using the normalised difference vegetation index (NDVI) for the same source image used in the example shown in FIG 8; and

FIG 13 is a general flow diagram illustrating methods of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment.

Skilled addressees will appreciate that elements in the drawings are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the relative dimensions of some of the elements in the drawings may be distorted and/or some elements may be omitted to help improve understanding of embodiments of the present invention.

DETAILED DESCRIPTION

Generally, some embodiments of the present invention are directed to a sensing unit mountable to an aerial vehicle to capture images of the ground to detect ground features, and in particular to detect weeds. Other embodiments of the present invention are directed to an aerial vehicle comprising the sensing unit.

Further embodiments of the present invention are directed to methods of processing images and locational data to identify locations and sizes of ground features, in particular weeds. Yet further embodiments of the present invention are directed to methods of generating shapefiles for use by weed treatment equipment, such as manned or autonomous spraying systems.

Whilst examples of the present invention are described with reference to the detection of weeds for the treatment thereof, it is envisaged that modifications to the present invention could be made by one skilled in the art where necessary such that other ground features could be located.

Reference is made to FIGS 1 and 2, which show a sensing unit 100 according to embodiments of the present invention, which is mountable to an aerial vehicle 200, as shown in FIGS 3 and 4. The sensing unit 100 comprises a frame 102 to mount the sensing unit 100 to the aerial vehicle 200. In some embodiments, the frame 102 comprises a pair of substantially vertical, spaced apart arms 104, a pair of inclined arms 106 extending from the arms 104 and a substantially horizontal cross member 108 extending between the inclined arms 106. The frame 102 can be made of any suitably strong, lightweight material, such as aluminium or plastic and can be formed as a single element or multiple elements joined together by any suitable means known in the art. The frame 102 comprises one or more fasteners 110 for coupling the sensing unit 100 to the aerial vehicle 200. The one or more fasteners 110 can be in the form of hooks, clips, zip ties, clamps, bolts or the like, or a combination thereof to securely attach the sensing unit 100 to the aerial vehicle 200. In some embodiments, the frame 102 can comprise a gusset or brace 112 attached between the pair of inclined arms 106 to provide additional strength and rigidity.

The sensing unit 100 comprises a gimbal 114 rotatably coupled to the arms 104 of the frame 102 and a body 116 mounted to the gimbal 114 allowing pitch and roll of the body 116 relative to the frame 102. The sensing unit 100 comprises a pair of cameras 118 mounted to the body 116 at an angle to the vertical and angled away from each other to minimise an overlap of images captured by the cameras 118. The body 116 can comprise a plurality of surfaces 120A, 120B, 120C angled relative to each other to facilitate mounting the cameras 118 at an angle to the vertical and angled away from each other. In preferred embodiments, the sensing unit 100 comprises a light detection and ranging (LiDAR) unit 122 mounted to the body 116 between the cameras 118. In the embodiment shown in FIGS 1-4, the body 116 comprises angled surfaces 120A and 120B, to, or through which cameras 118 are mounted, separated by substantially horizontal surface 120C, to, or through which LiDAR unit 122 is mounted.

In some embodiments, the sensing unit 100 comprises one or more cameras 118 mounted to the body 116. In some embodiments, the sensing unit 100 comprises a single camera 118 mounted to the body 116. In other embodiments, the sensing unit 100 comprises three or more cameras 118 mounted to the body 116.

With reference to the system block diagram shown in FIG 5, in preferred embodiments, the sensing unit 100 comprises one or more processors 124 mounted to, or housed within the body 116. In some embodiments, the one or more processors 124 can be in the form of an industrial single board computer (SBC) and/or a microcontroller. The processor 124 is in communication with the gimbal 114 to provide data thereto and control the orientation of the gimbal 114. The processor 124 is in communication with the cameras 118 to receive image data from the cameras. The processor 124 is in communication with the LiDAR unit 122 to receive ranging data from the LiDAR unit.

In preferred embodiments, the processor 124 is in communication with an inertial measurement unit (IMU), digital compass and GPS unit, optionally provided in a combined unit 126, and mounted to the sensing unit 100 or to the aerial vehicle 200. A GPS antenna 138 mounted to the sensing unit 100 or to the aerial vehicle 200 is coupled to the unit 126. In preferred embodiments, the processor 124 is in communication with a light sensor 128 mounted to the sensing unit 100 or to the aerial vehicle 200. In preferred embodiments, the processor 124 is in communication with a memory 130, such as a USB SSD, optionally accommodated in a housing 132 mounted to, or housed within the body 116.

In preferred embodiments, the sensing unit 100 comprises a power distribution board 134, optionally mounted to, or housed within the body 116, and in communication with one or more of the gimbal 114, cameras 118, LiDAR unit 122, processor 124 and light sensor 128 to provide power thereto.

In preferred embodiments, the sensing unit 100 comprises a timing board 136, optionally mounted to, or housed within the body 116, and in communication with one or more of the cameras 118, LiDAR unit 122, processor 124, IMU, digital compass, GPS unit 126 and light sensor 128 to provide timing data thereto.

Some embodiments of the present invention are directed to an aerial vehicle 200 comprising the sensing unit 100, as shown in FIGS 3 and 4. The aerial vehicle 200 can be an unmanned, remotely controlled aerial vehicle (UAV), such as the multicopter shown in FIGS 3 and 4, a fixed wing aerial vehicle or a hybrid thereof. Alternatively, the aerial vehicle 200 can be a manned aerial vehicle, such as a helicopter. It will be appreciated that some variations to the sensing unit 100 may be made, such as to the fasteners 110 and/or the shape of the frame 102, depending on the type of aerial vehicle 200 to which the sensing unit 100 is to be attached.

FIG 6 shows the sensing unit 100 mounted to the aerial vehicle 200 flying above an area of ground 140 being imaged comprising a feature to be detected, such as a plant, such as a weed 142. In some embodiments, the aerial vehicle 200 is approximately 60-80m above the ground 140. The dotted lines in FIG 6 indicate the field of view (FOV) of the cameras 118 of the sensing unit 100 illustrating the minimal overlap of the FOVs due to the angled mounting of the cameras 118.

It will be appreciated that light spectra are an important consideration and depend on the ground feature being detected. They form the basis for being able to accurately and consistently distinguish the feature of interest, e.g. distinguish plants from background pixels. For distinguishing weeds, the spectral response of plants and the ability of cameras to capture different spectra of light must be understood. Generally, plants have a high reflectance in the near infrared (NIR) range versus soil. Plants have a low reflectance in the visible spectrum with soil being relatively consistent across the visible spectrum. Within the visible spectrum, plants reflect more green light than red and blue light, hence why plants generally appear green to human eyes which only see the visible spectrum of light.

Standard cameras are designed to mimic human eyesight and therefore only detect light in the visible spectrum. Regular cameras typically comprise three different light detecting diodes, each diode representing a single pixel. Blue diodes detect light of wavelength roughly 400 to 500nm, green diodes at 500nm to 580nm and red at 580nm to 650nm. This is achieved via coloured dyes applied directly to the diodes to differentiate between colours and then the use of cut filters. Glass effectively filters out UV light below 400nm and the cut filter eliminates light above 650nm in the NIR to infra-red (IR) spectrum. However, without the cut filter, the diodes will detect NIR light.

According to some embodiments, the sensor unit 100 comprises two 12MP industrial cameras 118, 4000 pixels wide, which have been modified to sense NIR light. Modification of the cameras 118 involves replacing the IR cut filter with another filter. Light is blocked from 580nm to 680nm (red light), light is allowed through from 680 to 750nm and light higher than 750nm is blocked allowing the red diodes of the camera to capture NIR. This results in a camera which captures blue, green and NIR light allowing for comparison between these three spectra.

In preferred embodiments of the present invention, the one or more cameras 118 employ a global shutter, which captures the entire image simultaneously. This is in contrast to a rolling shutter, which captures the image line by line over a period of time. The one or more cameras 118 are continuously moving forward. With a rolling shutter system, each line would be exposed for a period of time, e.g. 1/2000th of a second, but not at the same time as the other lines in the image because the camera moves between capturing each line. A typical image capture time is 10ms for a rolling shutter, over which time the camera would have moved ~14cm forward at 50km/h. Therefore, a global shutter is the preferred shutter type.

According to some embodiments, the sensor unit 100 comprises the light sensor 128 to measure ambient light conditions to facilitate continual calibration of the cameras 118 as ambient lighting changes. The sensor unit 100 comprises the GPS, IMU (inertial measurement unit) and digital compass unit 126 which allow the position of the sensor unit to be determined extremely accurately. The sensor unit 100 comprises the LiDAR (light detection and ranging) device 122 which measures the height of the sensor unit 100 above the ground (as opposed to the GPS which measures height above sea level). According to some embodiments, the sensor unit 100 comprises the on-board computer 124, including several custom built electronic daughter boards, timing microcontrollers and USB sticks to collate and store data. In some embodiments, the sensor unit 100 has a mass of approximately 3kg and can generate and store over 2TB of data in a full day of flying.

Generally, the apparatus and methods according to embodiments of the present invention combine location information and image data captured from the sensor unit 100 mounted to the aerial vehicle 200 to generate a map of ground features, and in particular a weed map for broadacre farming. The aerial vehicle 200 comprising the sensor unit 100 flies over a defined area of ground in swaths or sweeps and captures images of the ground via cameras 118. In some embodiments, the pair of cameras 118 point downwards, are positioned side by side and are angled slightly away from vertical to capture images with minimal overlap and to increase the width and area of ground captured per swath or sweep. The swath width is the ground sampling distance (GSD) multiplied by the number of pixels in the width of the image. According to some embodiments, the GSD is 1cm and with images having a pixel width of 4000 pixels, this equates to a swath width of 78.4m.
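
As a worked illustration of the swath width figure quoted above, the short sketch below (with illustrative variable names) reproduces the 78.4m value from the stated parameters of a pair of 4000-pixel-wide cameras at 1cm GSD with ~2% image overlap, as described later in this specification.

```python
# Swath width sketch: GSD x pixel width per camera, two cameras, ~2% overlap.
gsd_m = 0.01            # ground sampling distance in metres (1cm)
image_width_px = 4000   # pixels across each camera image
num_cameras = 2         # pair of angled cameras
overlap = 0.02          # ~2% overlap assumed for direct georeferencing

swath_width_m = gsd_m * image_width_px * num_cameras * (1 - overlap)
print(swath_width_m)    # 78.4
```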

The light sensor 128 positioned, for example, on top of the aerial vehicle 200 measures ambient light conditions to facilitate continual calibration of the cameras 118 as ambient lighting changes. The combined GPS, IMU and digital compass unit 126 of the sensor unit 100 enables the position, direction and angle of the sensor unit 100 to be known accurately in 3D space. The single point LiDAR unit 122 measures the height of the sensor unit 100 above the ground 140 enabling direct calculation of GPS coordinates of locations within the captured images via simple triangulation. One or more processors 124 tie the elements of the sensing unit 100 together. For example, the sensing unit 100 can comprise a custom microcontroller, which synchronises the capture of data between the different elements of the sensing unit 100 on the nanosecond scale. The sensing unit 100 can comprise the industrial single board computer (SBC) coupled to the high-speed USB memory device 130 to capture all the required data.
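
The "simple triangulation" mentioned above can be sketched roughly as follows; this is a minimal illustration only, assuming a nadir-pointing camera, a GSD already derived from the LiDAR height, and a heading-only rotation, and it ignores lens distortion and the pitch, roll and camera tilt corrections a real implementation would apply.

```python
import math

def pixel_to_ground_offset(px, py, image_w, image_h, gsd_m, heading_deg):
    """Approximate east/north offset (m) of a pixel from the point directly
    below the camera, assuming a nadir-pointing camera with a known GSD.

    px, py      : pixel column and row of the detected feature
    image_w/h   : image dimensions in pixels
    gsd_m       : ground sampling distance in metres (height dependent,
                  derived from the LiDAR height above ground)
    heading_deg : heading from the digital compass, clockwise from north
    """
    # Offsets from the image centre in metres, camera frame (x right, y forward).
    dx = (px - image_w / 2.0) * gsd_m
    dy = (image_h / 2.0 - py) * gsd_m
    # Rotate the camera-frame offset into east/north using the heading.
    h = math.radians(heading_deg)
    east = dx * math.cos(h) + dy * math.sin(h)
    north = -dx * math.sin(h) + dy * math.cos(h)
    return east, north
```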

Separate to the sensor unit 100 is a GPS base station 144, which is positioned within, for example, 15km of the area in which the aerial vehicle 200 comprising the sensor unit 100 is operating. The GPS base station 144 logs raw GPS data (RINEX) for use in post-processing of the GPS data captured by the sensor unit 100.

In preferred embodiments, direct georeferencing is employed wherein the measurements of the height of the aerial vehicle above ground via the LiDAR device 122 and the location and orientation of the one or more cameras 118 on the aerial vehicle (via the high accuracy GPS and IMU) allow the location of any individual pixel on the ground in an image to be calculated. The benefits of direct georeferencing compared to aerial triangulation include a highly efficient area coverage rate because direct georeferencing only requires an image overlap of ~2% to ensure complete coverage of the designated area. Also, the location of a pixel only needs to be calculated if the feature of interest, in particular a weed, is positively identified, thus significantly reducing processing requirements. The reduced processing requirements allow, in particular, weed maps to be produced on the same day of flying with a standard laptop in the field without the need to transfer large amounts of data over the internet.

The images produced by the cameras 118 need to be high contrast and low blur. An image with high contrast enhances the distinction between background and weeds, which makes the detection of weeds more reliable. To enhance the contrast of an image, more light needs to be captured by the camera for each pixel. Increasing contrast can be achieved by using a bigger image sensor because more light can be captured by the camera, but this increases mass (linearly) and cost (exponentially). Contrast can be increased with a lower pixel count: the portion of the sensor for each pixel is bigger, but the coverage rate is directly decreased if the ground sampling distance (GSD) or area per pixel stays the same. Contrast can be increased with longer exposure times: capturing the images over a longer period allows more light to be captured, but because the camera is moving, this increases the area each pixel is exposed to, i.e. increases blur.

Blur, or sharpness, refers to the area on the ground each pixel is exposed to during the capture of an image with a moving camera. Blur increases the more the camera moves during the exposure time. As blur increases, the distinction between background and weed decreases. A standard camera with an exposure time of 1/50th of a second travelling at 36km/h will move 20cm during the exposure time. If the camera was operating at a GSD of 1cm, each pixel would be exposed to a 21x1cm area and be represented in the image as a 1x1cm area. This would result in a significantly blurred image that would not be suitable for detecting a weed. Blur can be reduced by increasing the shutter speed. A camera with a shutter speed of 1/2000th of a second travelling at 36km/h would travel 0.5cm during the exposure time, significantly reducing blur. However, as shutter speed increases, contrast decreases. Blur can be reduced by decreasing camera velocity with the result of lower coverage rates.
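
The blur figures quoted above follow directly from distance travelled = velocity x exposure time; a minimal sketch with the same illustrative numbers:

```python
# Ground distance travelled during the exposure, i.e. the motion blur.
def blur_m(speed_kmh, exposure_s):
    return speed_kmh / 3.6 * exposure_s

print(blur_m(36, 1 / 50))    # 0.2   (20cm - too blurred at 1cm GSD)
print(blur_m(36, 1 / 2000))  # 0.005 (0.5cm - acceptable)
```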

Coverage rate is inversely proportional to the size of the ground feature, in particular weeds, that can be detected and to the cost per area, e.g. cost/ha, of the system. Coverage rate is dependent on the Ground Sampling Distance (GSD), the swath width, the forward velocity of the aerial vehicle and an efficiency factor. The GSD is the size of each pixel on the ground with the camera stationary; 1cm GSD refers to each pixel being 1x1cm in a square array on the ground. The size of the smallest weed consistently detectable is determined by the GSD. As a base standard to consistently detect a certain size weed, at least one pixel must completely view the weed. To reliably detect the target weed size of 3cm, the GSD needs to be 1cm to ensure at least one pixel is filled by the image of the weed. As stated herein, the swath width is the GSD (e.g. 1cm) multiplied by the image width in the number of pixels, e.g. 4000. Additionally, the overlap between the images must be taken into account, which for direct georeferencing is ~2%. As stated herein, in embodiments where two 12MP cameras are used having image widths of 4000 pixels each, this equates to a swath width of 78.4m at 1cm GSD.

The forward velocity of the aerial vehicle has three potential limitations. First, the aerial vehicle should be chosen so that the speed of the aerial vehicle is not the limit. Second, the image capture rate needs to be considered: where 12MP cameras are used in the sensor unit 100 with images 3000 pixels in length, capturing one image per second with a GSD of 1cm and 2% overlap, a maximum forward velocity of 29.4m/s (~106km/h) is achieved. This is close to the top speed of most fixed wing aircraft and is not the limitation in a direct georeferencing system. Third, the shutter speed and blur limitations of the cameras need to be considered. This is a balance between image quality and coverage rate; however, this is expected to be 50 to 60km/h, which becomes the limit in a direct georeferencing system.

The efficiency factor covers the time not productively flying and covers set up time, flight to and from the area, turning at each end of a swath and changing batteries and swapping memory storage, such as USBs. For some embodiments, the efficiency factor is expected to be ~80% based on 30 minutes endurance for the aerial vehicle.
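
Putting the above factors together, a coverage-rate estimate can be sketched as below, using the figures quoted in this description (78.4m swath, ~50km/h blur-limited forward speed, ~80% efficiency factor); the result is an estimate under these assumptions, not a measured value.

```python
# Coverage rate = swath width x forward velocity x efficiency factor.
swath_m = 78.4
speed_ms = 50 / 3.6      # ~50km/h shutter/blur-limited forward speed
efficiency = 0.8         # allowance for turns, battery swaps, ferrying

ha_per_hour = swath_m * speed_ms * 3600 * efficiency / 10_000
print(round(ha_per_hour))  # ~314 ha per hour
```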

Once the area is flown, the data captured by the cameras 118 of the sensor unit 100 and the GPS base station 144 is combined using custom built software. Generally, the first step is to upload the data from the GPS base station 144 and the GPS, IMU and digital compass unit 126 through 3rd party software to increase the accuracy of the data from the cameras 118 and the GPS, IMU and digital compass unit 126. The next step is to identify the features of interest, e.g. weeds 142 within the images and ultimately determine an area centred on each weed, such as the radius and the centre of a circle centred on the weed. In some embodiments, this process uses blurring, normalised difference vegetation index (NDVI) thresholding and value thresholding from hue, saturation and value (HSV) thresholding to eliminate non-weed pixels. The longitude and latitude of each weed centre is then calculated using the post-processed GPS, IMU and digital compass information and height-above-ground data from the LiDAR unit 122. The output is now a list of weed centres and radii.

The last stage of the processing method is to convert the list of weed centres to a file, such as a shapefile or weed map, compatible with the selected treatment equipment, such as a sprayer control system with which the file is to be used. The area occupied by the weed is designated “spray” and the areas without weeds are designated “do not spray”. The file is then transferred to the sprayer’s controller where the onboard equipment sprays the designated “spray” areas only. The sensor unit 100 with its flyover data and processing methods generates an accurate reporting tool for land owners and agronomists that greatly assists with weed detection and recognition.

Further embodiments of the present invention are therefore directed to methods of processing image data and locational data captured by an aerial vehicle to identify a location and size of a ground feature, in particular weeds. In preferred embodiments, the image data and the locational data are captured by the sensing unit 100 mounted to the aerial vehicle 200. Generally, processing of the data includes post-processing the locational data, processing the image data, integrating the output of the processed images and the locational data to produce a list of weed locations and sizes and creating a file compatible with equipment for treating the ground feature, in particular weeds, such as spraying equipment, and in some embodiments creating shapefiles.

With reference to the general flow diagram in FIG 7, according to some embodiments, post-processing of the locational data in the method 300 comprises at 302 combining data captured by the GPS unit, inertial measurement unit (IMU) and digital compass unit 126 mounted to the aerial vehicle 200 with data captured by the ground based GPS base station 144 to generate combined locational data to increase the accuracy of the locational data. In some embodiments, this can be done by known 3rd party software.

At 304, the method 300 comprises matching the combined locational data with image capture times for images captured by the pair of cameras 118 mounted to the aerial vehicle 200 and height data captured by the aerial vehicle 200, in particular height data captured by the LiDAR unit 122.
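
One straightforward way to perform this matching is to interpolate the post-processed locational stream at each image capture timestamp; the numpy sketch below is a minimal illustration assuming all timestamps share a common time base, and the parameter names are illustrative only.

```python
import numpy as np

def positions_at_capture_times(capture_times, loc_times, lats, lons, heights):
    """Interpolate post-processed position and LiDAR height at image times.

    capture_times : image capture timestamps (s)
    loc_times     : timestamps of the combined locational data samples (s)
    lats, lons    : post-processed latitude/longitude samples (degrees)
    heights       : LiDAR height-above-ground samples (m)
    Returns one row of (time, lat, lon, height) per captured image.
    """
    lat = np.interp(capture_times, loc_times, lats)
    lon = np.interp(capture_times, loc_times, lons)
    hgt = np.interp(capture_times, loc_times, heights)
    return np.column_stack([capture_times, lat, lon, hgt])
```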

Processing of the image data is performed in parallel with the processing of the locational data. The output required from image processing is the location and size, in terms of the number of pixels, for each ground feature of interest, in particular weeds. At 306, the method 300 comprises analysing the image data to determine whether or not each pixel represents the ground feature, in particular weeds. There are a multitude of different ways that this can be achieved by combining multiple different image processing techniques.

According to some embodiments, the method 300 comprises at 306A image pre-processing which typically includes one or more of sharpness adjustment, contrast adjustment and one or more noise reduction processes.

In the processing and analysis of the image data, each pixel undergoes a series of calculations which it must pass to qualify or be categorised as a “weed” pixel. At 306B, the method 300 comprises one or more thresholding processes. According to some embodiments, the primary calculation is normalised difference vegetation index (NDVI) with a minimum value. There are multiple variations to NDVI which could be used. Other processes that can be employed use thresholds based on different colour spaces, e.g. minimum and maximum red, green and/or blue values, or converting to HSV colour space and using set thresholds. More complex calculations can be employed by modifying thresholds based on light readings from the light sensor 128, or based on image wide averages, e.g. brightness levels.
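
As one hedged illustration of the primary NDVI calculation with a minimum value, the sketch below computes NDVI per pixel and applies a threshold; how the near-infrared and red values map onto the modified cameras' channels, and the threshold value itself, are assumptions for illustration only.

```python
import numpy as np

def ndvi_mask(nir, red, min_ndvi=0.3):
    """Return a boolean mask of candidate weed pixels using NDVI.

    nir, red : per-pixel near-infrared and red reflectance arrays
    min_ndvi : illustrative minimum NDVI value for a "weed" pixel
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)  # avoid divide-by-zero
    return ndvi > min_ndvi
```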

FIG 8 is an example image in which plants in the form of weeds are shown in a specific colour that contrasts well with non-plant (non-weed) areas. In this example, the specific colour is blue (highlighted with arrows), but other colours can be used. In the example shown in FIG 8, the weeds are shown in blue as a result of using custom light filters on the cameras 118 so the cameras detect primarily red and near infrared light. Blue light is strongly blocked by the filter, but as the blue receivers on the camera also see near infrared, which plants highly reflect, plants appear blue.

FIG 9 shows a thresholded version of the same source image used in the example shown in FIG 8 using thresholds across each image channel in the L*a*b* colour space. The thresholds used for each image channel are shown in the graphs in FIG 10. FIG 11 shows the same source image used in the example shown in FIG 8 using some different colour spaces, namely RGB, HSV, YCbCr and L*a*b*. FIG 12 shows the same source image used in the example shown in FIG 8 using the NDVI method with a minimum NDVI value. This image demonstrates how effective the NDVI method can be. White is a high NDVI value, representing the vegetation of interest, i.e. weeds, and black is a low NDVI value, i.e. non-weeds.

Returning to the general flow diagram shown in FIG 7, at 306C, the method 300 comprises generating a black and white image, with black indicating that a pixel is not a plant (weed) and white indicating that a pixel is a plant (weed).

Typically, a single true pixel, i.e. a pixel indicating a weed, isolated from other true pixels, cannot be verified as a weed without an unacceptable level of false positives. Therefore, at 308, the method 300 comprises performing area analysis on the image data to remove false positives. The removal of false positive pixels can be based on groups of pixels and/or the proximity of pixels and some different options can be used. In some embodiments, an erosion process is used comprising removing pixels around the perimeter of an area or blob of pixels to a defined depth. In some embodiments, a connected elements process is used comprising calculating the number of pixels that are connected allowing for a minimum threshold to be set. In other embodiments, a blurring process is used comprising analysing an area around a pixel. For example, if 50% of pixels within a 3-pixel radius are positive, the centre pixel passes as a positive pixel and is still true. Multiple different options for the blurring process can be used.
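
As a sketch of the connected elements option described above (one of several possibilities, not necessarily the method used), the example below labels connected groups of candidate pixels with scipy.ndimage and discards groups smaller than an illustrative minimum size.

```python
import numpy as np
from scipy import ndimage

def remove_small_blobs(mask, min_pixels=4):
    """Keep only connected groups of candidate weed pixels of at least min_pixels."""
    labels, num = ndimage.label(mask)      # group connected true pixels into blobs
    if num == 0:
        return mask
    sizes = np.bincount(labels.ravel())    # pixel count per label (label 0 = background)
    keep = sizes >= min_pixels
    keep[0] = False                        # never keep the background
    return keep[labels]                    # boolean mask with small blobs removed
```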

Once false positives are eliminated, at 310, the method 300 comprises calculating a centroid and radius of each remaining area, or blob representing the ground feature, in particular a weed, and in preferred embodiments is output in a list. At 312, the method 300 comprises combining the list of centroids and radii representing the ground feature, in particular weeds, with the combined locational data.

At 314, the method 300 comprises generating a list of longitudes, latitudes and radii representing the ground feature, in particular weeds. In some embodiments, the list of plant centroids and radii is combined with the locational data and converted to longitude and latitude for the centroid and radius in metres via 3D trigonometry. The output is a list of the longitude, latitude and radius of the detected plants (weeds).

At 316, in preferred embodiments, the method 300 comprises converting the list of longitudes, latitudes and radii representing the ground feature, in particular weeds, to a format usable by selected treatment equipment, such as a sprayer control system to treat weeds. In some embodiments, this involves conversion to a shapefile format. In particular, for some equipment guidance systems there are significant memory limitations that need to be overcome to allow for the use of shapefiles. The most basic method is to create a circle for each plant to be treated, but this creates too many polygons and exceeds the memory limitations of the guidance systems.

Another aspect of the present invention is a method for generating shapefiles to stay within the memory limit of the equipment guidance systems. In particular, another aspect of the present invention is a method of converting a list of longitudes, latitudes and radii representing a ground feature to be treated, in particular weeds, to a format usable by treatment equipment. The method requires a number of inputs. One input is the aforementioned list of weeds comprising the coordinates (longitudes, latitudes) and size (radius) of the weeds. The list can be obtained from one or more flights of the aerial vehicle 200. Another input relates to accuracy, i.e. how much extra around each weed is required to allow for inaccuracies of the entire system. This may be, for example, 0.2-0.5m. A further input is a square size (SS), which can typically be 1m. The square size is effectively the resolution. The smaller the SS, the less chemical is wasted, but the more polygons are required. Another input is a bounding box for the entire area under analysis. In some embodiments, the bounding box can be derived from current shapefiles, or from bounding an area in a map program, such as Google Maps, or the like. In some embodiments, the bounding box is a saveable variable because it is likely to be re-used, e.g. particular areas of relevance, such as paddocks or fields on a property, can be selected from a drop-down list.

With reference to FIG 13, the method 400 comprises at 402 a processor creating a background array based at least on the bounding box representing an area to be treated. In some embodiments, creating the background array comprises determining a width and height (m) of the bounding box for the treatment area and optionally adding a safety margin around the whole area, which can be, for example, 20m. In some embodiments, creating the background array comprises assigning a value of one to all elements of an array whose dimensions are the width and height (inclusive of the safety margin) divided by the square size SS, with point 0,0 at, for example, the north-western point of the bounding box.

At 404, the method 400 comprises setting elements representing weeds in the background array to zero. This effectively removes weeds from the background array. In some embodiments, this step can comprise converting the coordinates representing weeds into the Universal Transverse Mercator (UTM) coordinate system. Based on the weed coordinates, the weed radius and the accuracy allowance, elements representing squares to be treated are set to zero.
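
A simplified sketch of steps 402 and 404 is given below, assuming the weed coordinates have already been converted to UTM eastings and northings; the margin, square size and accuracy values are illustrative, and each weed is cleared as a square block of cells rather than a circle for brevity.

```python
import numpy as np

def build_background_array(bbox, weeds, square_size=1.0, accuracy=0.3, margin=20.0):
    """Create the background array (step 402) and zero the weed squares (step 404).

    bbox  : (min_east, min_north, max_east, max_north) of the treatment area (UTM, m)
    weeds : iterable of (east, north, radius) for each detected weed (m)
    Returns a 2-D array of ones with zeros where treatment is required.
    """
    min_e, min_n, max_e, max_n = bbox
    cols = int(np.ceil((max_e - min_e + 2 * margin) / square_size))
    rows = int(np.ceil((max_n - min_n + 2 * margin) / square_size))
    bg = np.ones((rows, cols), dtype=np.uint8)

    for east, north, radius in weeds:
        grow = (radius + accuracy) / square_size          # weed footprint in squares
        col = (east - min_e + margin) / square_size
        row = (max_n - north + margin) / square_size      # row 0 at the north-western corner
        c0, c1 = int(np.floor(col - grow)), int(np.ceil(col + grow))
        r0, r1 = int(np.floor(row - grow)), int(np.ceil(row + grow))
        bg[max(r0, 0):min(r1 + 1, rows), max(c0, 0):min(c1 + 1, cols)] = 0
    return bg
```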

At 406, the method 400 comprises creating a polygon array equal to the background array. At 408, the method 400 comprises analysing the background array for elements representing weeds and elements representing non-weeds until all elements in the background array are set to zero and modifying the polygon array correspondingly until all non-weed elements in the polygon array are equal to a value of 1 and individual polygon elements in the polygon array have the same value.

In some embodiments, analysing the background array for elements representing weeds and elements representing non-weeds comprises analysing the background array in sections. For example, a top section of the background array can be analysed first. Processing can comprise starting at 0,0 and working down the columns in the background array until a weed, i.e. a 0 value is found, or one less than the length of the column is reached. All elements in the polygon array covered in this process are set to 1 and all elements in the background array covered in this process are set to 0. Processing comprises moving to the top of the next column and repeating the previous processing steps.

A bottom section of the background array can be analysed next. Processing can comprise starting at the bottom left of the background array, i.e. the maximum row, first column and working up the columns in the background array until a 0 value is found. Processing comprises setting all elements covered in this process to 2 and setting all elements covered in the background array to 0. Processing comprises moving to the bottom of the next column and repeating the previous step.

The remainder of the background array can be analysed next. Processing can comprise setting a counter starting at 3. Starting at 0,0 and working down columns in the background array until a non-weed element, i.e. a value of 1, is found. If the bottom of the column is reached without finding a weed, processing comprises moving to the top of the second column, 0,1 and continuing down that column etc.

Once a 1 value is found in the background array, processing comprises setting this element as the global x and y coordinate and setting this position to the local x,y position. If this value is a 1 in the background array, processing comprises working back up the column (towards row 0) until a zero is found, setting this column position to the local y variable and working down the column setting elements to zero and the polygon array equal to the counter. Once a zero is encountered, processing comprises moving to the next column and repeating this process. If this value is a 0 in the background array, processing comprises moving down the column until a 1 is found and checking that the value in the polygon array in the previous column, same row is equal to the counter value. If a 1 is found and the previous column, same row is equal to the counter, processing comprises setting this column position to the local y variable, moving down this column, setting the background array to 0 and the polygon array to the counter value until a zero in the background array is encountered and the process is repeated. If a 1 cannot be found in the background array with the previous column, same row equal to the counter value, then this polygon is finished. The counter value is increased by 1 and processing is repeated from the 0,0 point, but from the global x,y position. This is repeated until all elements of the background array are 0. At this stage, all non-weed locations in the polygon array equal 1 or greater with individual polygons having the same counter value.
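
The column-sweep described above is intricate; the simplified sketch below illustrates only the first two passes, labelling the weed-free runs at the top and bottom of each column as polygon values 1 and 2 and zeroing the background array as it goes. The counter-based pass over the remaining regions is omitted for brevity.

```python
import numpy as np

def label_top_and_bottom(background):
    """First two passes of the polygon-array fill (a simplified sketch).

    background : 2-D array where 1 = no treatment required and 0 = weed square.
    Returns (background, polygon) after the top and bottom passes; the remaining
    unlabelled regions would be handled by the counter-based pass (values 3+).
    """
    bg = background.copy()
    poly = np.zeros_like(bg, dtype=np.int32)
    rows, cols = bg.shape

    for c in range(cols):              # top section: work down each column to the first 0
        r = 0
        while r < rows and bg[r, c] == 1:
            poly[r, c] = 1
            bg[r, c] = 0
            r += 1

    for c in range(cols):              # bottom section: work up each column to the first 0
        r = rows - 1
        while r >= 0 and bg[r, c] == 1:
            poly[r, c] = 2
            bg[r, c] = 0
            r -= 1
    return bg, poly
```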

At 410, the method 400 comprises creating polygons comprising points based on the elements of the modified polygon array. In some embodiments, creating polygons comprises the following steps for each distinct set of values in the polygon array (i.e. for each polygon), with an illustrative sketch of the tracing given after the list:

1. Creating a list of the coordinates of the uppermost element of the polygon in each column (the top list, e.g. from left to right) and a list of the coordinates of the lowermost element in each column (the bottom list).

2. Creating a new list.

3. Working through the top list from left to right: appending the first set of coordinates to the new list; then, whenever the row is not equal to the previous row, appending one set of coordinates at the current column and old row and then appending the current coordinates to the new list; and continuing until the last value in the list is reached and appending that coordinate to the new list.

4. Repeating step 3 for the bottom list, but processing from right to left (i.e. from the maximum column to the minimum column), adding one to the y (row) value of each coordinate (i.e. moving it down one square), and appending these coordinates to the same list as the top section.

5. Appending the first value in the new list to the end of the new list, thus closing the polygon.
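Purely as an illustration of steps 1 to 5, the tracing could be sketched in Python as follows, assuming row indices increase downwards and that each labelled polygon occupies a single vertical run per column, as produced by the labelling above. Coordinates are handled as (column, row) pairs, collinear points along the top and bottom edges are kept for simplicity, and the names are illustrative only.

import numpy as np

def trace_polygon(polygon_array, label):
    # Step 1: uppermost (top) and lowermost (bottom) cell of the label per column.
    rows, cols = np.nonzero(polygon_array == label)
    top, bottom = {}, {}
    for r, c in zip(rows, cols):
        top[c] = min(top.get(c, r), r)
        bottom[c] = max(bottom.get(c, r), r)
    columns = sorted(top)
    if not columns:
        return []
    outline = []                      # Step 2: the new list
    # Step 3: top edge, left to right, adding a corner whenever the row changes.
    prev_row = None
    for c in columns:
        r = top[c]
        if prev_row is not None and r != prev_row:
            outline.append((c, prev_row))
        outline.append((c, r))
        prev_row = r
    # Step 4: bottom edge, right to left, shifted down one square.
    prev_row = None
    for c in reversed(columns):
        r = bottom[c] + 1
        if prev_row is not None and r != prev_row:
            outline.append((c, prev_row))
        outline.append((c, r))
        prev_row = r
    # Step 5: close the polygon.
    outline.append(outline[0])
    return outline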

At 412, the method 400 comprises converting the points of the polygons to longitudes and latitudes. The method includes writing a shapefile based on the longitudes and latitudes converted from the points of the polygons. The shapefile can then be used to treat the weeds with the selected treatment equipment, whether that be spraying, mechanical removal, steam treatment or another method.
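As one possible illustration of step 412, assuming the polygon points have already been converted to (longitude, latitude) pairs using the georeferencing described earlier, a shapefile could be written with the open-source pyshp package (imported as shapefile) along the following lines; the choice of package and the names used are an assumption for this sketch, not a requirement of the specification.

import shapefile  # the pyshp package

def write_weed_shapefile(polygons_lonlat, target="weed_map"):
    # Each entry in polygons_lonlat is a closed ring of (longitude, latitude)
    # pairs; each ring is written as one polygon record with a numeric ID so
    # the file can be loaded by sprayer control software.
    w = shapefile.Writer(target, shapeType=shapefile.POLYGON)
    w.field("ID", "N")
    for i, ring in enumerate(polygons_lonlat):
        w.poly([ring])
        w.record(i)
    w.close()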

In some embodiments, following step 404 above, the method 400 comprises analysing the array using a connected elements analysis to group blobs of weeds together. A list of the centroids of the blobs is calculated from the connected elements analysis, which is then used to determine the shortest route between the centroids. A spray path is then generated independently for each blob to ensure each area is sprayed. The spray paths for the blobs are then combined into a single list in the order dictated by the shortest-route solution. The output is a list of waypoints that a vehicle can follow to spray all the weeds in the area over the shortest possible distance. Reductions in travel distance of 80% to 90% are achievable with this methodology compared to covering the whole area, as is typically the case with ground-based sprayers.
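A minimal Python sketch of this alternative is given below, assuming a binary weed array in which weed pixels are 1, using SciPy's connected-component labelling as the connected elements analysis and a simple nearest-neighbour ordering as a stand-in for whatever shortest-route solver is used; the per-blob spray paths themselves are not shown and the names are illustrative only.

import numpy as np
from scipy import ndimage

def blob_visit_order(weed_array):
    # Group weed pixels into blobs and compute each blob's centroid.
    labels, n_blobs = ndimage.label(weed_array)
    if n_blobs == 0:
        return []
    centroids = [np.array(c) for c in
                 ndimage.center_of_mass(weed_array, labels, range(1, n_blobs + 1))]
    # Nearest-neighbour heuristic: start at the first blob and repeatedly hop
    # to the closest unvisited centroid to keep the overall travel short.
    remaining = list(range(1, len(centroids)))
    order = [0]
    while remaining:
        last = centroids[order[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(centroids[i] - last))
        remaining.remove(nxt)
        order.append(nxt)
    return [centroids[i] for i in order]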

Hence, embodiments of the present invention address or at least ameliorate the aforementioned problems of the prior art. Embodiments of the present invention are able to detect small plants, e.g. under 5 cm in diameter, at resolutions of about 1 cm GSD, and are able to out-perform existing boom-mounted technology in weed identification while identifying fewer false positives. Some embodiments can achieve over 200 ha per hour including non-productive time. In other embodiments, a coverage rate of over 300 ha per hour can be achieved based on an aerial vehicle velocity of 50 km/h and 1 cm GSD. Per hour of application, the existing prototype covers approximately 2.5 times the area covered by 36 m boom-mounted spot sprayers, at around one third of the capital cost of such sprayers. This is achievable, at least in part, due to the minimal overlap of the fields of view of the inclined cameras of the sensing unit, which allows larger areas to be covered with each sweep.
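For orientation only, the effective swath width implied by the figures quoted above can be back-calculated as follows; the 60 m result is derived from those figures and is not itself stated in the specification.

# Back-of-envelope check of the quoted coverage rate.
speed_m_per_h = 50 * 1000        # 50 km/h expressed in metres per hour
area_m2_per_h = 300 * 10000      # 300 ha expressed in square metres per hour
swath_m = area_m2_per_h / speed_m_per_h
print(swath_m)                   # 60.0 -> an effective swath of about 60 m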

Embodiments of the present invention enable easier quality control: once weeds have been identified and spot sprayed, the field can simply be scouted for escaped weeds, which is especially valuable given that these weeds may be herbicide resistant. A double knockdown application can be applied to browned-out weeds after an initial spot spray application by using the same weed map. Embodiments of the present invention have the ability to target weeds by size; where a blanket application is required, larger weeds could receive a heavier rate of herbicide or a different herbicide, although individual weeds growing in clumps would be seen as a single large weed. Fields flown before or shortly after a rain event would result in weed maps excluding germination from that rain event, but spot spraying of the earlier weeds requiring a higher rate or a different herbicide could be incorporated with a blanket spray through prescription mapping or a separate spray line.

According to some embodiments of the present invention, where a weed is identified, a radius around that weed could be built into the weed map to reflect where seeds have fallen from that plant, allowing very small, undetectable weeds to be sprayed in the process of spraying the larger weed and/or pre-emergent herbicides to be selectively applied. Weed maps captured over a season can be accumulated to provide a seasonal map, which could then be used to selectively apply pre-emergent herbicides before the next season’s weeds germinate.

Fields can be scouted for isolated weeds that may have escaped a blanket application; as it is probable that these weeds are herbicide resistant, they can be controlled before they become significant problems. In row crops, weeds growing out of place can be identified and treated.

Weed maps produced in accordance with some embodiments of the present invention can include a digital elevation model, which provides a map of obstacles in a field from which autonomous vehicles could be guided on an efficient path from weed to weed, using small, light equipment to spot spray fields autonomously.

As data captured from flights is post-processed, machine learning or leaf-shape analysis could be incorporated to identify some weeds in crop (green on green). After an autumn flush, winter weed patches such as black oats could be mapped and those patches treated in crop, or pre-emergent chemistry applied to those areas. Weed maps could also be used to trigger alternative selective weed control, for example mechanical treatment, where tines selectively engage the ground, or microwave weed control.

Flights of the aerial vehicle comprising the sensing unit are pre-planned, and therefore the operator only needs to manage the power supply, e.g. batteries, launch the aerial vehicle and comply with CASA regulations. With petrol-powered drones and a CASA exemption from line-of-sight requirements, extended flight times make set-and-forget flights possible.

It will be appreciated that at least some of the features of at least some of the embodiments described herein can be combined in various combinations with at least some of the features of at least some of the other embodiments described herein within the scope of the present invention.

In this specification, the terms, “first”, “second” etc. are intended to differentiate between different features of the present invention and are not intended to limit the present invention to a particular order of implementation unless the context indicates otherwise.

In this specification, the terms “comprises”, “comprising” or similar terms are intended to mean a non-exclusive inclusion, such that an apparatus that comprises a list of elements does not include those elements solely but may well include other elements not listed.

The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement or any form of suggestion that the prior art forms part of the common general knowledge. It will be appreciated that the present invention is not limited to the specific embodiments described herein. Skilled addressees will identify variations from the specific embodiments described herein that will nonetheless fall within the scope of the present invention, which is determined by the following claims.




 