

Title:
A SYSTEM AND A METHOD FOR GENERATING A SPECTRAL IMAGE FOR A PLOT OF LAND
Document Type and Number:
WIPO Patent Application WO/2011/160159
Kind Code:
A1
Abstract:
A system (1) for generating a spectral image (2) in the form of an NDVI image for a plot of land in the form of a paddock (3) is provided. System (1) includes a data gathering unit (4) for progressively obtaining spectral values (8) for a plurality of respective regions (5) within paddock (3). Each region (5) has a surface area of at least 16 m2. System (1) also has a processor (6) for building a data set for each region (5) that includes both location data (7) for that region and the respective spectral value (8). Processor (6) is responsive to the data set for generating spectral image (2) for paddock (3).

Inventors:
MEADOWS NEIL (AU)
HODGE GREG (AU)
Application Number:
PCT/AU2010/001563
Publication Date:
December 29, 2011
Filing Date:
November 22, 2010
Assignee:
CAMBIUM LAND & WATER MAN PTY LTD (AU)
MEADOWS NEIL (AU)
HODGE GREG (AU)
International Classes:
G01J3/00; G06K9/46
Domestic Patent References:
WO2001033505A2 (2001-05-10)
Foreign References:
EP1178283A1 (2002-02-06)
US5467271A (1995-11-14)
US7068816B1 (2006-06-27)
Attorney, Agent or Firm:
MEADOWS, Neil (Lenah Valley, Tasmania 7008, AU)
Claims:
CLAIMS

1. A method for generating a spectral image for a plot of land, the method including:

(a) progressively obtaining at least one spectral value for a plurality of respective regions within the plot, where each region has a surface area of at least 5 m2;

(b) building a data set for each region that includes both location data for that region and the respective spectral value;

(c) being responsive to the data set for generating the spectral image for the plot; and

(d) being responsive to a map of a subset of the regions for assessing differences between the subset of regions and corresponding regions of an earlier generated image.

2. A method according to claim 1 including the step of being responsive to a map of all the regions for assessing differences between those regions and corresponding regions of an earlier generated image.

3. A method according to any one of the preceding claims wherein the spectral values are obtained at a height of about 3 to 100 metres above the respective region.

4. A method according to claim 3 wherein the spectral image values are obtained at a height of about 3 to 60 metres above the respective region.

5. A method according to claim 4 wherein the spectral image values are obtained at a height of about 3 to 30 metres above the respective region.

6. A method according to claim 5 wherein the spectral image values are obtained at a height of about 3 to 10 metres above the respective region.

7. A method according to claim 6 wherein the spectral image values are obtained at a height of about 3 to 6 metres above the respective region.

8. A method according to any one of the preceding claims wherein the spectral image values are obtained by a data gathering unit that is mounted to a vehicle.

9. A method according to claim 8 wherein the vehicle is a crop-duster type aircraft.

10. A method according to claim 8 wherein the vehicle is a ground operated unit.

11. A method according to any one of claims 8 to 10 wherein the vehicle is operated to apply one or more treatments to the plot and the data gathering unit is responsive to the application of the one or more treatments for obtaining the spectral values.

12. A method according to any one of claims 8 to 10 wherein the vehicle includes a control that is actuated by an operator to apply the one or more treatments, and the method includes the further step of the data gathering unit being responsive to the control for selectively obtaining the spectral values.

13. A method according to any one of claims 8 to 10 wherein the vehicle includes a control that is automatically actuated in response to the vehicle entering a predefined geofence, and the method includes the further step of the data gathering unit being responsive to the control for selectively obtaining the spectral values.

14. A method according to any one of claims 8 to 13 wherein steps (b) to (d) are carried out, at least in part, in a location external to the vehicle.

15. A method according to any one of the preceding claims wherein the spectral values are derived from one or more of the visible and infrared spectrum.

16. A method according to claim 15 wherein the spectral values are used to calculate a normalised difference vegetation index (NDVI) for the respective regions.

17. A method according to claim 16 including the further step of analyzing the NDVI with respect to predetermined information on the region for providing crop growth information.

18. A method according to claim 15 wherein the spectral values are used to calculate a red edge inflection point (REIP) for respective regions.

19. A method according to claim 18 including the further step of analyzing the REIP with respect to predetermined information on the region for providing crop health information.

20. A method according to claim 17 or claim 19 wherein the predetermined information is a crop growth function.

21. A method according to claim 20 wherein the crop growth information includes amounts of one or more additives to be applied to the region for promoting crop health and growth.

22. A method according to any one of the preceding claims including the step of being responsive to the image for assessing differences between regions.

23. A method according to claim 22 including the step of being responsive to a map of at least one preselected region for assessing differences between the at least one preselected region and a corresponding region on an earlier generated image.

24. A method according to any one of the preceding claims including the step of being responsive to the location data for applying one or more geographic or climatic indicators to the image.

25. A method according to claim 24 wherein the geographic or climatic indicators are selected from the group including: contours; bearings; hours of sunshine; rainfall; water courses; and fence lines.

26. A method according to any one of the preceding claims wherein the surface area of the region is at least 20 m2.

27. A method according to any one of the preceding claims including the step of publishing the image as an electronic file that is available for remote access.

28. A method according to claim 27 including the step of providing a portal through which authorised users are able to remotely access the electronic file.

29. A method according to any one of the preceding claims wherein the step of progressively obtaining at least one spectral value for a plurality of respective regions within the plot includes obtaining only one spectral value for respective regions.

30. A method according to any one of the preceding claims wherein the step of being responsive to the data set for generating the spectral image for the plot includes being responsive to only one spectral value for each region.

31. A method according to any one of the preceding claims including the step of being responsive to a map of a subset of the regions for assessing differences between the subset of regions and corresponding regions of an earlier generated image.

32. A method according to any one of the preceding claims including the step of being responsive to a map of all the regions for assessing differences between those regions and corresponding regions of an earlier generated image.

33. A system for generating a spectral image for a plot of land, the system including: a data gathering unit for progressively obtaining spectral values for a plurality of respective regions within the plot, where each region has a surface area of at least 5 m2; and

a processor: for building a data set for each region that includes both location data for that region and the respective spectral value; being responsive to the data set for generating the spectral image for the plot; and being responsive to a map of a subset of the regions for assessing differences between the subset of regions and corresponding regions of an earlier generated image.

34. A system according to claim 33 including being responsive to a map of all the regions for assessing differences between those regions and corresponding regions of an earlier generated image.

Description:
A SYSTEM AND A METHOD FOR GENERATING A SPECTRAL IMAGE FOR A PLOT OF LAND

FIELD OF THE INVENTION

[0001] The present invention relates to a system and a method for generating a spectral image for a plot of land.

[0002] Embodiments of the invention have been particularly developed for generating a spectral image for a plot of land for the purpose of evaluating vegetation characteristics of a crop growing in and from the land. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.

BACKGROUND

[0003] Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.

[0004] It is known to use spectral imaging for identifying the presence of green vegetation on a plot of land - such as a potato crop in a paddock - and for evaluating vegetation characteristics such as Normalised Difference Vegetation Index (NDVI) data. NDVI data provides a numerical indicator that can be used to assess whether plants contain live green vegetation or not. NDVI data of a crop in a paddock can be used to indicate areas of stress in the crop well before that stress can be perceived in the visible light spectrum.

[0005] NDVI data may infer conditions such as soil moisture and/or the presence of pests and diseases. A spatial image of whole-of-paddock NDVI data can be used as, amongst other things, a tool to:

• Enhance the efficiency of existing agronomic management, such as where to site soil moisture meters.

• Provide a general timeframe as to when to apply a certain herbicide.

• Indicate when and where to irrigate.

• Indicate when a crop is at such a level of health that specialist advice should be sought.

• Show where in a paddock to check for crop performance indicators (for example sap flow and leaf nitrogen tests, tuber formation and number, soil borne diseases).

• Assess against crop yield.

• Identify where to sample soil for future cropping operation.

• Show how the paddock is/was performing for insurance purposes.

[0006] High NDVI measures equate to an abundance of functioning plant chlorophyll. Low NDVI measures equate to an absence or low density of functioning plant chlorophyll.
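The NDVI referred to above is conventionally computed as a normalised difference of a region's near-infrared (NIR) and visible red reflectances, giving a value between -1 and 1. A minimal sketch of that calculation (the band values used are illustrative, not taken from the specification):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index for a single region.

    nir and red are band reflectances (0.0 to 1.0); the result lies in
    [-1.0, 1.0], with high values indicating live green vegetation.
    """
    if nir + red == 0:
        return 0.0  # guard against division by zero on dark surfaces
    return (nir - red) / (nir + red)

# A dense green canopy reflects strongly in NIR and absorbs red, so its
# NDVI is high; bare soil reflects the two bands more evenly.
canopy = ndvi(0.50, 0.08)
soil = ndvi(0.30, 0.25)
```

This is why stress can be detected before it is visible to the eye: the NIR reflectance of a canopy falls as cell structure degrades, lowering the index well before the red band changes noticeably.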

[0007] Present methods of collecting data for generating a spectral image for a plot of land include the use of:

• Data collected by satellites.

• Data collected by specialized flights.

• Terrestrial data capture, for example by data captured on tractors, all-terrain vehicles, utility vehicles and the like.

[0008] Following the collection of this data, it must then be processed using specialized data processing software to generate the spectral image and then produce the relevant NDVI data.

[0009] This present method of generating a spectral image for a plot of land is expensive and time-consuming to undertake. Furthermore, the post data capture computer processing requires separate computer software and/or other computer operations to produce an image and to produce an index. This is particularly undesirable for a number of reasons, including the fact that separate computer software requires specialized learning of the software to produce an index, which is generally a time-consuming process.

SUMMARY OF THE INVENTION

[0010] It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.

[0011] According to an embodiment of the invention there is provided a method for generating a spectral image for a plot of land, the method including:

progressively obtaining at least one spectral value for a plurality of respective regions within the plot, where each region has a surface area of at least 5 m2;

building a data set for each region that includes both location data for that region and the respective spectral value; and

being responsive to the data set for generating the spectral image for the plot.
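The data set described above pairs location data with a spectral value for each region. A minimal sketch of that structure, using hypothetical field names and coordinates:

```python
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass
class RegionRecord:
    lat: float       # location data for the region
    lon: float
    spectral: float  # the region's spectral value, e.g. an NDVI

def build_data_set(samples: Iterable[Tuple[float, float, float]]) -> List[RegionRecord]:
    """Pair each progressively obtained spectral value with the
    location at which it was captured."""
    return [RegionRecord(lat, lon, value) for lat, lon, value in samples]

# Two regions sampled on successive passes over the plot:
records = build_data_set([(-41.450, 147.130, 0.71), (-41.450, 147.131, 0.38)])
```

Generating the spectral image then amounts to placing each record's spectral value at its recorded location, so no separate image-processing pass over raw imagery is required.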

[0012] In an embodiment the method includes the step of being responsive to a map of a subset of the regions for assessing differences between the subset of regions and corresponding regions of an earlier generated image.

[0013] In an embodiment each region is adjacent to at least one other region.

[0014] In an embodiment the regions are contiguous and collectively cover the plot.

[0015] In an embodiment at least one of the regions is spaced apart from an adjacent region to define at least one intermediate zone.

[0016] In an embodiment one or more of the regions overlap.

[0017] In an embodiment all the regions overlap.

[0018] In an embodiment the zones are relatively small compared to the regions.

[0019] In an embodiment the zones are relatively large compared to the regions.

[0020] In an embodiment the step of generating the spectral image includes the sub-step of deriving a spectral value for at least one intermediate zone.

[0021] In an embodiment the step of generating the spectral image includes the sub-step of deriving a spectral value for all respective intermediate zones.

[0022] In an embodiment the spectral value for one or more of the intermediate zones is derived from a respective spectral value from one or more regions adjacent to the respective zone.

[0023] In an embodiment the spectral values are obtained from directly above the respective regions.

[0024] In an embodiment the spectral values are obtained at a height of about 3 to 100 metres above the respective region.

[0025] In an embodiment the spectral image values are obtained at a height of about 3 to 60 metres above the respective region.

[0026] In an embodiment the spectral image values are obtained at a height of about 3 to 30 metres above the respective region.

[0027] In an embodiment the spectral image values are obtained at a height of about 3 to 10 metres above the respective region.

[0028] In an embodiment the spectral image values are obtained at a height of about 3 to 6 metres above the respective region.

[0029] In an embodiment the spectral image values are obtained by a data gathering unit.

[0030] In an embodiment the data gathering unit is mounted to a crop-duster type aircraft.

[0031] In an embodiment the crop-dusting type aircraft is operated to apply one or more treatments to the plot and the data gathering unit is responsive to the application of the one or more treatments for obtaining the spectral values.

[0032] In an embodiment the aircraft includes a controller that is actuated by an operator to apply the one or more treatments, and the method includes the further step of the data gathering unit being responsive to the controller for selectively obtaining the spectral values.

[0033] In an embodiment the aircraft includes a controller that is automatically actuated in response to the aircraft entering a predefined geofence, and the method includes the further step of the data gathering unit being responsive to the controller for selectively obtaining the spectral values.

[0034] In an embodiment the data gathering unit is mounted to a tractor.

[0035] In an embodiment the data gathering unit is mounted to a trailing implement.

[0036] In an embodiment the spectral values are derived from one or more of the visible and infrared spectrum.

[0037] In an embodiment the spectral values are used to calculate a normalised difference vegetation index (NDVI) for respective regions.

[0038] In an embodiment the method includes the further step of analyzing the NDVI with respect to predetermined information on the region for providing crop growth information.

[0039] In an embodiment the spectral values are used to calculate a red edge inflection point (REIP) for respective regions.
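The specification does not state how the REIP is to be calculated. One widely used estimate is the four-band linear interpolation method (after Guyot and Baret), sketched here with hypothetical band reflectances at 670, 700, 740 and 780 nm:

```python
def reip_nm(r670: float, r700: float, r740: float, r780: float) -> float:
    """Red edge inflection point in nanometres, estimated by linear
    interpolation between the 700 nm and 740 nm bands.

    The inflection is taken where reflectance reaches the midpoint
    between the red minimum (670 nm) and the NIR plateau (780 nm).
    """
    r_edge = (r670 + r780) / 2.0  # reflectance midway up the red edge
    return 700.0 + 40.0 * (r_edge - r700) / (r740 - r700)
```

Healthy vegetation shifts the inflection point towards longer wavelengths, so comparing the REIP of a region against the predetermined information gives the crop health indication described above.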

[0040] In an embodiment the method includes the further step of analyzing the REIP with respect to predetermined information on the region for providing crop health information.

[0041] In an embodiment the predetermined information is a crop growth function.

[0042] In an embodiment the crop growth information includes amounts of one or more additives to be applied to the region for promoting crop health and growth.

[0043] In an embodiment the method includes the step of being responsive to the image for assessing differences between regions.

[0044] In an embodiment the method includes the step of being responsive to a map of at least one preselected region for assessing differences between the at least one preselected region and a corresponding region of an earlier generated image.

[0045] In an embodiment the method includes the step of being responsive to a map of a subset of the regions for assessing differences between the subset of regions and corresponding regions of an earlier generated image.

[0046] In an embodiment the method includes the step of being responsive to a map of all the regions for assessing differences between those regions and corresponding regions of an earlier generated image.

[0047] In an embodiment the method includes the step of being responsive to the location data for applying one or more geographic or climatic indicators to the image.

[0048] In an embodiment the geographic or climatic indicators are selected from the group including: contours; bearings; hours of sunshine; rainfall; water courses; and fence lines.

[0049] In an embodiment the surface area of the region is at least 10 m2.

[0050] In an embodiment the surface area of the region is at least 16 m2.

[0051] In an embodiment the surface area of the region is at least 20 m2.

[0052] In an embodiment the method includes the step of publishing the image as an electronic file that is available for remote access.

[0053] In an embodiment the method includes the step of providing a portal through which authorised users are able to remotely access the electronic file.

[0054] In an embodiment the groundspeed of the aircraft during the application of the one or more treatments is between about 100 and 200 km/h.

[0055] In an embodiment the groundspeed of the aircraft during the application of the one or more treatments is between about 200 and 240 km/h.

[0056] In an embodiment the groundspeed of the aircraft during the application of the one or more treatments is about 200 km/h.

[0057] In an embodiment the groundspeed of the aircraft during the application of the one or more treatments is about 150 km/h.

[0058] In an embodiment the data gathering unit captures about 8 to 16 images per second.

[0059] In an embodiment the data gathering unit captures about 10 to 12 images per second.

[0060] In an embodiment the data gathering unit captures about 15 images per second.

[0061] In an embodiment the data gathering unit captures about 30 to 40 images per second.

[0062] In an embodiment the data gathering unit captures more than 80 images per second.
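The groundspeed and capture-rate ranges above jointly determine how far apart successive captures fall on the ground. A small worked check of that relationship:

```python
def frame_spacing_m(groundspeed_kmh: float, frames_per_second: float) -> float:
    """Along-track distance the vehicle travels between consecutive
    captures: convert km/h to m/s, then divide by the capture rate."""
    return (groundspeed_kmh / 3.6) / frames_per_second

# At the quoted 200 km/h and 15 images per second the aircraft advances
# roughly 3.7 m between frames, which is of the same order as the side
# length of a region of around 16 m2, so successive captures can tile
# the flight line without large gaps.
spacing = frame_spacing_m(200.0, 15.0)
```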

[0063] In an embodiment the regions have a respective region surface area, and not all region surface areas are equal.

[0064] In an embodiment the step of progressively obtaining at least one spectral value for a plurality of respective regions within the plot includes obtaining only one spectral value for respective regions.

[0065] In an embodiment the step of being responsive to the data set for generating the spectral image for the plot includes being responsive to only one spectral value for each region.

[0066] According to a further embodiment of the invention there is provided a method for building a data set for a plot of land, the method including the steps of:

mounting a data gathering unit to a crop-duster type aircraft;

operating the aircraft to apply one or more treatments to the plot while simultaneously operating the data gathering unit to progressively obtain at least one spectral value for a plurality of respective regions within the plot; and

building a data set for each region that includes both location data for that region and the respective spectral value.

[0067] In an embodiment the method includes the further step of being responsive to the data set for generating the spectral image for the plot.

[0068] In an embodiment the image is converted to an electronic file having format data and image file data, wherein the image file data is less than 1 MB per hectare of the plot of land.

[0069] In an embodiment the image file data is less than 0.1 MB per hectare of the plot of land.

[0070] In an embodiment the image file data is between 50 and 100 KB per hectare of the plot of land.

[0071] In an embodiment the image file is one of the following file types: a .DAT file and a .CSV file.
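The 50 to 100 KB per hectare figure can be sanity-checked against the region sizes discussed elsewhere in the specification; the per-row byte count assumed below is illustrative, corresponding roughly to one CSV line of coordinates plus a spectral value per region:

```python
def csv_kb_per_hectare(region_m2: float, bytes_per_row: float) -> float:
    """Estimated CSV size per hectare when each region contributes one
    row of location data plus a spectral value."""
    rows_per_hectare = 10_000.0 / region_m2  # 1 ha = 10,000 m2
    return rows_per_hectare * bytes_per_row / 1024.0

# 16 m2 regions give 625 rows per hectare; at an assumed ~120 bytes per
# row that is on the order of 73 KB/ha, inside the 50-100 KB range.
estimate = csv_kb_per_hectare(16.0, 120.0)
```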

[0072] According to a further embodiment of the invention there is provided a system for generating a spectral image for a plot of land, the system including:

a data gathering unit for progressively obtaining spectral values for a plurality of respective regions within the plot, where each region has a surface area of at least 5 m2; and

a processor: for building a data set for each region that includes both location data for that region and the respective spectral value; and being responsive to the data set for generating the spectral image for the plot.

[0073] In an embodiment the system includes an image capture device having an available field of view from which data images are able to be taken, wherein the data gathering unit is responsive to the data images for obtaining the spectral values.

[0074] In an embodiment the image capture device includes an actual field of view for each data image which is one of: the same as the available field of view; and a subset of the available field of view.

[0075] In an embodiment the image capture device is responsive to a sensor signal generated by the data gathering unit for defining the actual field of view.

[0076] In an embodiment the data gathering unit and the image capture device are mounted in an aircraft and the data gathering unit is responsive to the height of the aircraft above ground for generating the sensor signal.

[0077] In an embodiment the image capture device includes one or more photodiodes.

[0078] According to a further embodiment of the invention there is provided a system for building a data set of a plot of land, the system including:

a data gathering unit for mounting to a vehicle for travelling across the plot; a controller for the vehicle that is operable to selectively apply one or more treatments to the plot while simultaneously operating the data gathering unit to progressively obtain a spectral value for a plurality of respective regions within the plot; and

a processor for building a data set for each region that includes both location data for that region and the respective spectral value.

[0079] In an embodiment the vehicle is a crop duster type aircraft.

[0080] In an embodiment the vehicle is a ground operated unit.

[0081] In an embodiment the vehicle is a tractor.

[0082] In an embodiment steps (b) to (d) are carried out, at least in part, in a location external to the vehicle.

[0083] According to a further embodiment of the invention there is provided a method for building a data set for a plot of land, the method including:

progressively obtaining at least one spectral value for a plurality of respective regions within the plot, where each region has a surface area of at least 5 m2;

building a data set for each region that includes both location data for that region and the respective spectral value; and

being responsive to the data set for generating the spectral image for the plot.

[0084] Reference throughout this specification to "one embodiment", "some embodiments" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment", "in some embodiments", "in embodiments", "in alternate embodiments" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may be. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.

[0085] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.

[0086] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.

BRIEF DESCRIPTION OF THE DRAWINGS

[0087] Preferred embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:

Figure 1 is a schematic representation of a system according to an embodiment of the invention;

Figure 2 is a top view of a plot of land;

Figure 3 is a side view of an aircraft flying over the plot of land of Figure 2;

Figure 4 is a top view of a region within the plot of land;

Figure 5 is an alternate embodiment of a top view of a plot of land;

Figure 6 is a schematic representation of an on board camera unit and processor for the aircraft of Figures 1 to 3;

Figure 7 is a schematic representation of an alternate image capture device in the form of a pair of on board photodiode sensors for the aircraft in communication with a processor located remotely from the aircraft;

Figure 8 is a schematic top view of an on board camera unit and processor for trailing equipment for a tractor;

Figure 9 is a side view of one of the photodiode sensors; and

Figure 10 is an underside view of the sensor of Figure 9.

DETAILED DESCRIPTION

[0088] Described herein are systems and methods for generating a spectral image for a plot of land.

[0089] Referring initially to Figures 1 and 2, there is schematically illustrated a system 1 for generating a spectral image 2 in the form of an NDVI image for a plot of land, that is shown in Figure 2 in the form of a paddock 3. System 1 includes a data gathering unit 4 for progressively obtaining spectral values 8 for a plurality of respective regions 5 within paddock 3. Each region 5 has a surface area of at least 16 m2. System 1 also has a processor 6 for building a data set for each region 5 that includes both location data 7 for that region and the respective spectral value 8. Processor 6 is responsive to the data set for generating spectral image 2 for paddock 3. Processor 6 is also responsive to a geographical map of a subset of the regions 5 for assessing differences between the subset of regions and corresponding regions of an earlier generated image.

[0090] Referring in more detail to Figure 2, each region 5 is adjacent to at least one other region. All the regions 5 are spaced apart from the adjacent regions to define an intermediate zone 10 that extends between the regions. The zone 10 is relatively large compared to the regions 5, in that the surface area of zone 10 is much greater than the combined surface areas of regions 5. In other embodiments, zone 10 is relatively small compared to the regions 5.

[0091] In the embodiment of Figure 2, regions 5 have a respective region surface area, and all the respective region surface areas are substantially equal. In other embodiments, the region surface area is not equal for all regions.

[0092] The region surface area is different in different embodiments. For example, in a preferred embodiment, region 5 is about 10 to 15 m2. In some embodiments, the surface area of region 5 is at least about 5 m2. In yet other embodiments, the surface area of region 5 is about 10 m2. In yet other embodiments, the surface area of region 5 is about 20 m2.

[0093] However, the size of region 5 will be dependent on the type of lens used. For example, as the angle of the lens gets wider, the region gets bigger. The region size would also be dependent on the type of crop being assessed. For example, for crops such as lettuce, a relatively smaller field of view would be best.

[0094] It is also noted that in various embodiments, region 5 is positioned relative to an adjacent region 5 in one of the following configurations: contiguous; slightly overlapping; or with a small gap in between individually assessed regions.

[0095] In the embodiments disclosed in this specification, the surface area of region 5 is a horizontal planar two dimensional view from directly above the region. No account is taken for contour in the calculation of the surface area figures provided above.

[0096] In an alternate embodiment shown in Figure 5, some regions 5 are contiguous to adjacent regions. As such there are two separate and spaced apart intermediate zones 10. In other embodiments there are more than two zones 10. In further embodiments the regions 5 are contiguous and collectively cover paddock 3 and zone 10 is null. In further embodiments, one or more or all of regions 5 overlap with at least one adjacent region. In other embodiments, selected combinations of the above arrangements are possible.

[0097] It will be appreciated that only a portion of paddock 3 is illustrated in Figure 2 and Figure 5.

[0098] Referring now to Figure 3, system 1 further includes an image capture device in the form of a camera unit 20 having an available field of view 21 from which data images are able to be taken. Unit 4 is responsive to the data images for obtaining spectral values 8. Unit 20 includes an actual field of view 22 for each data image which is a subset of view 21. In other embodiments, view 22 is the same as the available field of view 21.

[0099] Unit 20 is responsive to a control signal generated by unit 4 for defining the view 22.

[00100] Data gathering unit 4 and camera unit 20 are mounted in an aircraft 40 configured for crop dusting.

[00101] In embodiments the image capture device includes one or more sensors having one or more photodiodes. For example, in the embodiment illustrated in Figure 7 the image capture device includes a pair of photodiodes 59. The photodiodes collectively provide a 1-pixel sensor that is responsive to two separate non-overlapping predetermined spectra. That is, the quantum of light that is sensed directly by the photodiodes determines the magnitude of the analogue electrical signal sent to the data logger within unit 4. The two separate non-overlapping predetermined spectra are sent to processor 6, which combines them into a single signal value. An analogue/digital converter in the data logger converts this signal to derive, for each zone, a spectral value 8. In other embodiments, only raw captured data is processed by unit 4. This is then sent to an office computer (located at a base remote from the image capture device), whereby locations of the raw data and a subsequent spectral value are then calculated and further processed into an image (NDVI and/or REIP).
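The combination of the two photodiode channels into a single signal value can be sketched in outline. The following fragment is illustrative only: the function name, the assumption of 8-bit ADC counts, and the use of an NDVI-style ratio scaled onto 0-255 as the combined value are assumptions, not details from the specification.

```python
def spectral_value(red_count: int, nir_count: int) -> int:
    """Combine two photodiode ADC readings into a single spectral value.

    Illustrative sketch: assumes 8-bit ADC counts for the red and
    near-infrared channels, and scales the NDVI ratio onto 0-255
    to give the single signal value described above.
    """
    if red_count + nir_count == 0:
        return 0  # no detected light in either band
    ndvi = (nir_count - red_count) / (nir_count + red_count)  # -1.0 .. 1.0
    return round((ndvi + 1.0) / 2.0 * 255)                    # 0 .. 255

# A healthy crop reflects strongly in NIR relative to red,
# so the combined value is high:
value = spectral_value(red_count=40, nir_count=200)
```

In a real data logger this combination would operate on the digitized output of the analogue/digital converter described above.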

[00102] In other embodiments, the predetermined spectra overlap with each other.

[00103] Each photodiode has a filtration coating for providing the photodiodes with the required responsiveness to the respective predetermined spectra. Advantageously, the coating negates the need for a separate filter being placed in front of the lens. The sensors include a machined hollow aluminium cylinder to house the photodiodes; the length of the cylinder defines a field of view (similar to that denoted by reference number 22) and therefore defines the size (or footprint) of the zone of land that is being assessed, thereby negating the need for the lens. A change in the width of the cylinder also changes the size of the assessed area.

[00104] The photodiodes are mounted within the cylinder such that, at a height of about 3 to 5 metres, the assessed surface area is about 16 m². At greater heights this assessed land area will be larger. Alternatively, in other embodiments, the assessed area can be altered by changing the mounting position of the photodiodes relative to the cylinder. This assessed land area will in part form at least a portion of a region 5. In embodiments, the assessed land area forms the entirety of region 5.

[00105] In other embodiments, the sensors include a relatively inexpensive lens, which is positioned inside an aluminium casing.

[00106] In an alternate embodiment, the photodiode remains fixed relative to aircraft 40 and the cylinder is adjustably mounted to aircraft 40. The position of the cylinder is remotely adjustable so that a predetermined assessed area size can be set for a certain height. Furthermore, in some embodiments, the adjustment of the position of the cylinder is automated to adjust in correspondence with the height of aircraft 40 such that a substantially equivalent area size is always assessed. In other embodiments, the cylinder is manually adjustable. In yet other embodiments, the photodiode is adjustably mounted to aircraft 40 and the cylinder remains fixed relative to aircraft 40.

[00107] In yet other embodiments, the image capture device includes one or more heat sensors which provide an analogue response in an upper waveband region of the NIR spectrum. For example, the gathering of input from heat sensors allows the monitoring of certain plant stresses. In sustained higher ambient temperatures, water-related stresses on a plant are known to cause the closing of the stomata in those plants, which results in the plant retaining more heat. In non-water-stressed plants, by contrast, the stomata remain relatively open and, as a result, those plants are less susceptible to heat retention. Accordingly, the gathering of data of this kind is especially useful for planning irrigation programs.

[00108] Referring back to Figure 5, processor 6 derives the spectral value for one or more of zones 10 from a respective spectral value from one or more regions 5 adjacent to the respective zone. In other words the spectral values for zones 10 are generally inferred from the adjacent regions 5.
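The specification leaves the inference method open ("generally inferred from the adjacent regions"). One minimal, one-dimensional way to realise it is to fill each unmeasured zone with the average of its nearest measured neighbours; the function below is a hypothetical sketch under that assumption, not the method of the specification.

```python
def fill_zone_values(values):
    """Fill None entries (intermediate zones 10) with the average of the
    nearest non-None neighbours (regions 5) along one pass.

    A simplified 1-D sketch; the actual inference used for zones 10
    is not specified and could be more elaborate."""
    filled = list(values)
    for i, v in enumerate(filled):
        if v is None:
            left = next((filled[j] for j in range(i - 1, -1, -1)
                         if filled[j] is not None), None)
            right = next((values[j] for j in range(i + 1, len(values))
                          if values[j] is not None), None)
            neighbours = [n for n in (left, right) if n is not None]
            filled[i] = sum(neighbours) / len(neighbours) if neighbours else None
    return filled

# Regions with measured spectral values, a zone 10 as None between them:
print(fill_zone_values([120, None, 140]))  # [120, 130.0, 140]
```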

[00109] Only one spectral value 8 is taken for a respective region 5. Referring to Figure 4, this is achieved by taking a plurality of spectral sub-values corresponding to an array of pixels 30 and averaging them to get a single spectral value 8 for each region 5. It will be appreciated that the single spectral value is able to be derived from the plurality of sub-values by other than the average of those sub-values. For example, in one embodiment, a selected single sub-value is used as the value 8.
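The reduction of the pixel sub-values to a single value 8 can be sketched as follows; the function names are illustrative, not from the specification.

```python
def region_value(pixels):
    """Average the 0-255 sub-values of the array of pixels 30 into one
    spectral value 8 for a region 5, as described above."""
    pixels = list(pixels)
    return round(sum(pixels) / len(pixels))

def region_value_single(pixels, index=0):
    """Alternative mentioned above: use a single selected sub-value."""
    return list(pixels)[index]

print(region_value([100, 110, 120, 130]))  # 115
```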

[00110] Referring to Figure 6, there is illustrated a unit 20 that comprises two cameras 23 that each capture data from two distinct wavelength bands. The data captured from each camera 23 is combined to generate spectral value 8. In alternate embodiments, there are other than two cameras. In use, the cameras are mounted adjacent to each other and are operated in synchronism or quick succession to optimize the accuracy of the spectral values, and the correspondence of the spectral values with the geo-referencing information contained within the location data for a given region.

[00111] Cameras 23 each capture two distinct parts of the total spectrum. Those two parts of the spectrum are referred to as 'red' - a small window around 695 nm - and 'near-infrared' - a small window around 950 nm. It is noted that these wavebands are particularly suited to assessing potato crops. In other embodiments where other types of crops are of interest, other optimal target wavebands are captured as would be appreciated by those skilled in the art.

[00112] The capture of the two distinct parts of the total spectrum is achieved by placing a filter 24 in front of the camera 23 that filters out the unwanted wavelengths above and below the required wavelength window. The target wavelengths then hit a CMOS sensor (not shown) at the back of camera 23 and are converted into an electrical signal. Depending upon the intensity (that is, the number of photons) of the wavelength that hits each pixel, a value is then assigned. A pixel average of 0 indicates no intensity and that no target photons have been detected as having entered the camera 23. A pixel average of 255 indicates maximum intensity and that many photons have entered the camera 23. In this embodiment, use is made of an 8-bit signal, which equates to 256 values. In other embodiments, a 9-bit signal is used, which would give a wider intensity range of 512 values and a finer, more complex image. In other embodiments, a 4-bit signal is used, which allows for 16 values and thus a coarser image. The spectral values are used to calculate a variety of indices including an NDVI for the respective regions 11. In embodiments utilizing sensors having one or more photodiodes, the sensors are mounted in pairs. For the monitoring of potato crop paddocks, one of the pair of photodiodes senses the red spectral waveband and the other of the pair senses the NIR spectral waveband.
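Given co-registered red and NIR pixel values of the kind described above, the NDVI named at the end of the paragraph is conventionally computed per pixel as (NIR − red)/(NIR + red). A sketch follows; the zero-division guard and the nested-list data layout are assumptions for illustration.

```python
def ndvi_image(red, nir):
    """Per-pixel NDVI from two co-registered frames (one per camera 23).

    red, nir: equally sized 2-D lists of 0-255 pixel averages.
    Returns a 2-D list of NDVI values in the range -1.0 .. 1.0."""
    out = []
    for red_row, nir_row in zip(red, nir):
        row = []
        for r, n in zip(red_row, nir_row):
            # Guard against pixels with no detected photons in either band.
            row.append(0.0 if r + n == 0 else (n - r) / (n + r))
        out.append(row)
    return out

# Vigorous vegetation (high NIR, low red) gives NDVI near +0.7;
# bare or dark pixels give NDVI near 0.
result = ndvi_image([[40, 60], [200, 0]], [[200, 180], [200, 0]])
```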

[00113] It will be appreciated that in other embodiments, the two parts of the spectrum captured by cameras 23 are other than the 950 nm and 695 nm wavelength windows and will be specified depending on the crop of interest and algorithm of interest. For example, the wavelength of 880 nm is often used and embodiments of the invention are able to be adapted for obtaining measurements at that wavelength.

[00114] Unit 20 captures about 10 to 12 images per second. In other embodiments, unit 20 captures about 8 to 16 images per second. It will be appreciated by those skilled in the art, from the teaching herein, that other image capture rates are utilized where appropriate. The camera unit captures light reflected from paddock 3 in the visible and near-infrared wavebands. It will be appreciated that the above image capture rates are also applicable to embodiments including one or more sensors having one or more photodiodes. In an embodiment, unit 20 or the photodiode sensors capture about 15 images per second. In a preferred embodiment, unit 20 or the photodiode sensors capture about 30 to 40 images per second. In other embodiments, unit 20 or the photodiode sensors capture about 20 to 30 images per second. In other embodiments, unit 20 or the photodiode sensors capture about 40 to 50 images per second. In other embodiments, unit 20 or the photodiode sensors capture about 50 to 60 images per second. In other embodiments, unit 20 or the photodiode sensors capture more than about 60 images per second.

[00115] In alternate embodiments, more than one image capture device may be mounted to aircraft 40. For example, in an embodiment, two image capture devices are mounted to aircraft 40, with one on each wing of the aircraft. In this embodiment, the image capture devices would collectively capture more than 80 images per second. This would produce an image with much greater pixel density.

[00116] The raw data set captured by unit 20 comprises one position fix per second and about 35 to 40 spectral values per second.

[00117] Referring back to Figure 3, the spectral values are obtained from directly above each respective region 5. Similarly, in embodiments including one or more sensors each having one or more photodiodes, the one or more sensors are mounted in aircraft 40. It is appreciated that in alternate embodiments, unit 4 is not mounted in aircraft 40, but located remotely from aircraft 40, for example at the same base as the office computer. This allows unit 20 to be programmed remotely.

[00118] Aircraft 40 is operated to apply one or more treatments to paddock 3 either simultaneously or sequentially. These treatments include the dusting of crops with various fertilizers and/or chemicals such as pesticides or herbicides. However, in other embodiments, these treatments are for other than dusting crops. It is mentioned that the treatment is generally a process that is done at regular time intervals, for example once every 8 to 10 days during the growing season of a potato crop being grown in paddock 3. In other embodiments coverage occurs once a month.

[00119] Aircraft 40 includes a controller (not shown) that is actuated by an operator to apply the one or more treatments. In this embodiment the controller is a button located within the cockpit of aircraft 40 and which is manually progressed by the pilot into a depressed configuration to actuate fluid pumps (not shown). These pumps direct the relevant fertilizer or other chemicals through nozzles (not shown) that are spaced apart along the rear of the wings of aircraft 40. The pilot maintains the button in the depressed configuration during each pass over paddock 3, and releases the button at the end of the pass - at which time it moves from the depressed configuration to a steady state configuration - to discontinue the passage of the treatment through the nozzles. In other embodiments, the controller is one or more of a mechanical lever, a touch panel, a keypad, or other such interface.

[00120] It is noted that in other embodiments, a ground vehicle (such as a tractor) is used to apply one or more treatments to paddock 3. In this case, the ground vehicle has the image capture device mounted to it, as described in the embodiments above.

[00121] In another embodiment, the controller is automatically actuated in response to the aircraft entering a predefined geofence to apply the one or more treatments. That is, the device providing the georeferencing information for allowing the derivation of the location information is also able to contribute to the automated gathering of the spectral data and/or the application of the treatment to the crop in the paddock 3.

[00122] In one embodiment where automated gathering of spectral data occurs, the data-gathering unit 4 is pre-programmed to automatically capture spectral values within the geofence. This is achieved by identifying a plurality of coordinates defining the perimeter of a predetermined region and programming a geofence that defines this predetermined region, within which spectral values are always captured. This takes away the need for data-gathering unit 4 to be manually actuated to capture spectral values. Aircraft 40 includes an on-board GPS device that communicates the location of aircraft 40 to unit 4. Unit 4 includes a time-out function that determines if the captured spectral image data should be sent to processor 6. Specifically, the time-out function will determine if:

• Aircraft 40 was being used to dust a crop within that plot, in which case the data is sent to processor 6; or

• Aircraft 40 was simply traversing the plot of land to get to another location, in which case the data is discarded.
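The specification does not state how the geofence test is implemented. A common technique is a ray-casting point-in-polygon check against the programmed perimeter coordinates; the sketch below is illustrative only, and the function name and coordinates are hypothetical.

```python
def inside_geofence(point, fence):
    """Ray-casting point-in-polygon test.

    point: (lon, lat) of the aircraft; fence: list of (lon, lat)
    perimeter coordinates programmed for the predetermined region.
    Returns True when the point lies inside the geofence."""
    x, y = point
    inside = False
    j = len(fence) - 1
    for i in range(len(fence)):
        xi, yi = fence[i]
        xj, yj = fence[j]
        # Toggle on each polygon edge the horizontal ray crosses.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

fence = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]  # square paddock
print(inside_geofence((2.0, 2.0), fence))  # True
print(inside_geofence((5.0, 2.0), fence))  # False
```

In practice the result of this test would be combined with the time-out function above to decide whether captured data is sent to processor 6 or discarded.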

[00123] In other embodiments, the time-out function is incorporated into processor 6. In this case, the captured spectral image data is sent to processor 6 and certain data is discarded if aircraft 40 was traversing the plot of land to get to another location.

[00124] In embodiments, the geofence feature can be used with mapping software or georeferenced aerial photo software such as Google Earth. That is, the areas of data capture (the paddocks) are identified using Google Earth in association with the software used to manage the data received from the aircraft. This allows for:

• A greater control of the paddocks that are to be surveyed by the aircraft or ground operated units.

• Simpler integration of processor 6 into the aircraft or ground operated unit as there is no need for wiring into the existing electronic systems, other than just power.

• No need for the pilot to actuate, or indeed conclude and send, the capture of data. This reduces the chance of human error and the chance of writing over GPRS data that has not yet been sent to the computer network when a subsequent paddock is then surveyed.

• The ability for the user to identify and name target paddocks themselves, and to therefore register these names into the software component of the system. In embodiments, this feature is only available to subscribers for a predetermined fee. In yet other embodiments, only an agreed number of paddock images per season is provided.

[00125] As the captured values for the regions 5 can be correlated with a geographical map of a subset of the regions, differences can be assessed between the subset of regions and corresponding regions of an earlier generated image. This allows for a comparison to be made, over time, for a particular region. This is especially advantageous to observe patterns and to help to maintain the general health of crops in a particular region.

[00126] In other embodiments, differences can be assessed between all the regions.

[00127] It will be appreciated that, in normal use, aircraft 40 makes a plurality of parallel passes in a predetermined direction over paddock 3 to ensure that substantially all of the paddock is subject to the treatment. The centres of the adjacent passes are typically about 16 m apart, although this is dependent upon the nature of the treatment and the aircraft. In this embodiment, the regions are square and have a surface area of about 16 m² and, as such, the distance in the predetermined direction between adjacent edges of the regions in adjacent passes is about 12 m. The distance between adjacent regions in the same pass is about 0.2 m due to the groundspeed of aircraft 40 being about 150 km/h and the image capture rate about 10 images per second. In practice, however, the groundspeed varies with time and often an increased capture rate of 12 images per second is used to provide some overlap between adjacent images in the same pass.
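The spacing figures above follow directly from the quoted groundspeed, region size and capture rate; a quick arithmetic check:

```python
# Geometry of image capture along a pass, using the figures quoted above:
# square regions of 16 m² (4 m sides), groundspeed ~150 km/h, 10 images/s.
groundspeed_ms = 150 * 1000 / 3600       # ≈ 41.7 m/s
capture_rate_hz = 10
region_side_m = 16 ** 0.5                # 4.0 m

spacing_m = groundspeed_ms / capture_rate_hz  # centre-to-centre ≈ 4.17 m
gap_m = spacing_m - region_side_m             # ≈ 0.17 m, i.e. "about 0.2 m"
print(round(gap_m, 2))  # 0.17

# At the increased rate of 12 images/s the spacing drops below the
# region side, so adjacent images in the same pass overlap:
overlap_m = groundspeed_ms / 12 - region_side_m  # negative -> overlap
```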

[00128] Unit 4 is responsive to the button being in the depressed configuration for selectively obtaining spectral values 8 during normal operation of aircraft 40. In other words, during the dusting of crops, the same control that is used to activate the release of chemicals will also actuate unit 4. In turn, unit 4 directs unit 20 to capture images of regions 5 at the selected image capture rate.

[00129] Spectral values 8 are obtained from the same height at which aircraft 40 is flying. This height, denoted on Figure 3 by the letter 'H', is about 3 to 10 metres above the respective region 11. It will be appreciated that this will depend upon a variety of factors including the nature of the treatment, the prevailing weather conditions, the skill of the pilot, the terrain of paddock 3 and the terrain surrounding paddock 3, any manmade structures in or near paddock 3, amongst others. Accordingly, in other embodiments, H is about 3 to 30 metres above the respective region. In yet other embodiments, H is about 3 to 6 metres above the respective region. In yet other embodiments, H is about 3 to 100 metres above the respective region. In yet other embodiments, H is about 3 to 60 metres above the respective region.

[00130] Aircraft 40 is operated to apply one or more treatments to paddock 3 and unit 4 is responsive to the application of the one or more treatments for sequentially obtaining the spectral values 8. This is advantageous as it allows the spectral image values to be obtained by aircraft 40 during an action that was to occur in any event. This allows for time and cost effective data gathering for the subsequent generation of the spectral image 2 for paddock 3. This is similarly an advantage for embodiments where a ground vehicle is used.

[00131] As mentioned above, the notional groundspeed of aircraft 40 during the application of the one or more treatments is about 150 km/h. However, it will be appreciated by those skilled in the art that the groundspeed of aircraft 40 is different in other embodiments. For example, other groundspeeds of aircraft 40 typically fall between about 200 and 240 km/h. In an embodiment the groundspeed of aircraft 40 during the application of the one or more treatments is about 200 km/h. In other embodiments, groundspeeds fall within the range of about 100 to 200 km/h. In other embodiments, groundspeeds fall within the range of about 200 to 300 km/h. The latter occurs, for example, with aircraft such as the Air Tractor and Dromader turbine aircraft, which are used primarily for applying selective herbicides to broad acre crops such as cotton.

[00132] It is appreciated that, in embodiments where data gathering unit 4 is mounted to an irrigator, the notional groundspeed of an irrigator is much less than that of an aircraft. For example, a typical irrigator moves at notional groundspeeds of about 15 to 145 meters per hour. While the speed is much lower, this embodiment also offers the same advantage of the spectral image values being obtained by the irrigator during an action that was to occur in any event.

[00133] As briefly mentioned above, in embodiments, the data gathering unit 4 is mounted on ground equipment such as tractors and their trailing implements. As will be understood by those skilled in the art, the normal groundspeed of a tractor and/or an associated implement, is much less than that of an aircraft although typically greater than that of an irrigator. Data gathering unit 4 and camera unit 20 are mounted in aircraft 40 and unit 4 is responsive to the height of aircraft 40 above ground for generating a sensor signal. More particularly, unit 4 derives the sensor signal from the aircraft's navigation system (not shown) as to the height of aircraft 40. In other embodiments the sensor signal is derived from a GPS receiver or a combination of inputs. In other embodiments, unit 4 is responsive to other than the height of aircraft 40 above ground.

[00134] In other embodiments, as mentioned above, the data capture device (camera unit 20 or the sensors including one or more photodiodes) and data gathering unit 4 are mounted to an overhead irrigator. In embodiments including the sensors (having one or more photodiodes) mounted to the irrigator, up to four pairs of sensors are used. In further embodiments, the overhead irrigator includes a plurality of data capture devices, each spaced equidistant from each adjacent device and mounted to the irrigator. In other embodiments, the data capture units are not all equidistant from one another. The data capture devices include a primary data capture device that is in communication with data gathering unit 4. The data capture devices, aside from the primary data capture device, each include an output in the form of an electrical plug and an input in the form of an electrical socket, whereby the plug and socket are configured to fit together. The primary data capture device includes a plug only. This arrangement is such that a plug from one data capture device can fit into a socket of an adjacent data capture device. Therefore, the desired data capture devices can be linked together via their respective plugs and sockets so that each of them is used to capture spectral image data. The primary data capture device will always be used to capture spectral image data.

[00135] In other embodiments, data unit 4 is not mounted to the overhead irrigator and is located elsewhere and the primary capture device is in wireless communication with data unit 4.

[00136] In a further embodiment, unit 4 uses an array of sensors and is responsive to the sensor signal to calculate the portion of the available field of view 21 that will constitute the actual field of view 22. In an example of such an embodiment, view 21, the available view, is made up of an array of 256 × 256 pixels within unit 20. View 22, however, is intended to be of a predetermined surface area which, in this embodiment, is 16 m². Therefore, unit 20 is responsive to the sensor signal for determining the relevant subset of pixels for defining view 22. That is, if the height of aircraft 40 increases, the number of pixels used to define view 22 will reduce, and vice versa. The centre pixels are always used and, as the height decreases, progressively more pixels outwardly of the centre pixels are used. In other embodiments, view 21 is made up of an array of other than 256 × 256 pixels within unit 20. In embodiments where the photodiode sensors are used, pixel manipulation is not required.
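One way to realise the height-dependent pixel subset is to scale the side length of the centred subset inversely with height, so that the assessed surface area stays constant. The calibration height and the linear-scaling assumption below are illustrative, not from the specification.

```python
def active_pixels(height_m, array_side=256, ref_height_m=5.0):
    """Side length (in pixels) of the centred subset defining view 22.

    Assumes the ground footprint of each pixel grows linearly with
    height, so halving the pixels per side when the height doubles
    keeps the assessed surface area constant. ref_height_m is an
    assumed calibration height at which the full array covers
    exactly the target area (e.g. 16 m²)."""
    side = min(array_side, round(array_side * ref_height_m / height_m))
    return max(side, 1)

print(active_pixels(5.0))   # 256  (full array at the calibration height)
print(active_pixels(10.0))  # 128  (higher flight -> fewer centre pixels)
```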

[00137] Referring to Figures 1 and 6, processor 6 in this embodiment takes the form of an industrial computer 41 that is removably mounted within aircraft 40 and that is operably coupled to unit 20 and a GPS system on aircraft 40. In other embodiments processor 6 takes the form of other than an industrial computer, for example, a laptop computer. Computer 41 provides a series of prompt signals at the desired image capture rate to unit 20 to obtain the series of spectral values 8 in a given pass. In parallel, computer 41 provides a corresponding series of prompt signals to the GPS system to obtain location data 7. Both the data 7 and values 8 are stored in processor memory 49.

[00138] In other embodiments such as that illustrated in Figure 7, processor 6 is a Linux single board 60 operably coupled with a GPS and GPRS system 61. Processor 6 is preprogrammed to capture images at defined rates (photodiode sensors at about 15 per second and GPS at 1 per second). In embodiments where the camera or sensor is mounted to an irrigator the defined rates are much slower. For example, the captured images are stored on processor 6 until the irrigation or crop dusting event is completed (that is, the entire paddock has been dusted or irrigated, or both) whereby the file is sent via GPRS to the office computer. The office computer then computes a location position to each spectral value by dividing the number of spectral values captured between location seconds.
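The division of spectral values between one-second location fixes amounts to a linear interpolation of position across each GPS interval. The function name and data shapes below are assumptions for illustration.

```python
def georeference(values, fixes):
    """Assign a position to each spectral value by dividing the values
    captured between 1 Hz GPS fixes, as described above.

    values: spectral values in capture order (about 15 per second).
    fixes: (lon, lat) tuples, one per second, len(fixes) >= 2.
    Returns (lon, lat, value) triples via linear interpolation."""
    per_fix = len(values) / (len(fixes) - 1)  # values per GPS interval
    out = []
    for i, v in enumerate(values):
        t = i / per_fix                       # fractional fix index
        j = min(int(t), len(fixes) - 2)
        frac = t - j
        (x0, y0), (x1, y1) = fixes[j], fixes[j + 1]
        out.append((x0 + frac * (x1 - x0), y0 + frac * (y1 - y0), v))
    return out

# Four values captured between two fixes 4 units of longitude apart:
pts = georeference([10, 20, 30, 40], [(0.0, 0.0), (4.0, 0.0)])
print(pts[0][:2], pts[2][:2])  # (0.0, 0.0) (2.0, 0.0)
```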

[00139] In other embodiments, different wireless communications protocols are used, such as GSM, 3G, WiFi, or other localised networks. In still further embodiments, the file is loaded to an intermediate memory device and physically transported before being uploaded to the office computer either directly or via a network to which that computer is connected or connectable.

[00140] Once values 8 and data 7 are obtained by processor 6, spectral data is able to be derived for zones 10. Processor 6 uses the respective values 8 of adjacent regions 5 to derive estimates of the likely values within the zones. In other embodiments, this derivation of estimates is done remotely, for example, in a remote office housing the office computer.

[00141] Processor 6 synchronizes data 7 with the series of spectral values 8 and the values in adjacent zones 10. The computer is responsive to the spectral values and the location data for generating spectral image 2.

[00142] Image 2 is then converted to an electronic file in the form of a relatively compact PDF file 50. In other embodiments, the file may be other than a PDF file, for example a JPEG, GeoTIFF or GIF file. In other embodiments, image 2 is converted to an electronic file in a remote office. The resultant image is then placed onto a website over a georeferenced aerial photo, such as Google Earth.

[00143] File 50 includes formatting data and the spectral image data. Typically, file 50 has a size of less than 1 MB per hectare, and more preferably the spectral image data is less than 1 MB per hectare. In other embodiments file 50 has a size of less than 0.1 MB per hectare, and more preferably the spectral image data is less than 0.1 MB per hectare. In other embodiments, image 2 is published as other than as an electronic file.

[00144] On the basis of the above figures, for a typical paddock with a potato crop having an area of about 25 hectares, a typical file size is about 2.5 MB. It is also noted that paddocks vary in size depending on the crop that is grown on the paddock and the available space for the paddock. Therefore, in alternate embodiments, the size of the paddock will be other than 25 hectares. For example, in one embodiment, paddocks are 100 hectares or more.

[00145] Once file 50 is generated, processor 6 will send the file via wireless communication (for example GSM) to a central computer network 51 which includes a communications interface 52, a processor 53 and a database 54. It will be appreciated that network 51 includes additional components, which have been omitted for the sake of clarity.

[00146] File 50 is received at interface 52 and stored by processor 53 in database 54. Database 54 includes a plurality of such files, which are date-stamped. Moreover, it will be appreciated that each file includes at least some location information - typically at least three points - for allowing additional functionality. For example, to facilitate one or more of: the overlay on a given spectral image of contours or other geographic or manmade features; the combination of spectral images for adjacent paddocks; the production of a time sequenced overlay of chronologically generated spectral images for a given paddock; and others.

[00147] In other embodiments, the raw data from unit 4 is sent wirelessly (for example via GSM) to computer network 51 where it is processed by processor 53.

[00148] The stored file 50, or a file derived from file 50, is accessed via a website (not shown) that is hosted by processor 53. This allows for remote access for one or more preselected authorized users. The website takes the form of a web portal that is configured to allow authorized users remote access to one or more preselected files 50. The remote access is achieved via internet or email ready devices, such as a personal computer 55, a laptop computer 56, a mobile telephone 57, a PDA 58, hybrid devices (not shown), smartphones, and iPhones, amongst others.

[00149] In embodiments, the stored file 50, or a file derived from file 50, is used with a predetermined function of plant growth to determine, via correlation, the correct amount of plant input to be subsequently applied to that region. That is, the information from the spectral data (for example the NDVI and REIP) is analyzed and compared with predetermined information on the region (in particular the crop) for providing crop growth information. This information includes various health parameters of the crop. This is then used to assess what type of additives or chemicals, and how much, should be applied to promote crop health and growth, that is, to increase the general health and growth of that crop. In an embodiment, this analysis and comparison is done in a relatively short time period such that the application of chemicals or additives is carried out in the same pass as the collection of spectral data.

[00150] In an embodiment the crop growth information includes amounts of one or more additives to be applied to the region for promoting crop growth.

[00151] For example, NDVI is strongly correlated with the application of foliar nitrogen to optimize potato yield. This data is used to produce a crop growth function for potato yield. Variable rates of nitrogen could be applied to each sensed region following NDVI assessment.

[00152] Prescriptions for variable rate applications of crop inputs (for example fertilizer, herbicide, water, pesticide) can be correlated from the NDVI spectral image. In embodiments described above whereby a tractor implement is used, it would be possible, for example, to capture NDVI from the front of the tractor, apply a correlation function to the captured data to supply a variable rate of crop input from the trailing implement in the one pass.

[00153] In embodiments that utilize crop-duster type aircraft, the NDVI captured from the first pass over a region would allow the chemicals and additives to be applied during the second pass of a single trip.

[00154] This is especially advantageous for embodiments utilising over-head irrigators whereby captured NDVI/REIP would be correlated with a crop growth function and an application rate would be made available, via the web, to a spray contractor who would download the relevant data (such as a .SHP file format) and load this directly into their hardware so that the crop would be sprayed accordingly.

[00155] A crop growth function includes a simulation of ideal growth for a particular crop, including ideal levels of certain chemicals (as seen, for example, in paragraph [00149]). This simulation is done by processor 53. In other embodiments, this simulation is done in an external processor.

[00156] An example of a method of producing and utilizing a crop growth function is outlined below:

• A crop is scanned to obtain an NDVI value for a region.

• The captured region values are processed in processor 53 using a correlation equation (the linear correlation a = b + c*d, in which the slope c describes the change in additive required with change in NDVI). The equation is applied to the NDVI value, giving the optimal amount of additive that is to be applied to that crop in that region.

• This amount is transferred into a spray rate.

[00157] In embodiments where a tractor is used and the trailing implement includes a spray function, this method is done in the one pass. The crop growth function is defined as the ordinary least squares regression equation or model a = b + c*d, whereby the dependent variable a is the required additive (for example foliar nitrogen) and the independent variable d is NDVI or REIP or some other crop sensed variable. Variables b and c are the intercept and slope, respectively, of the linear regression model.
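The regression model a = b + c*d just defined can be fitted by ordinary least squares from paired NDVI and additive observations. The sketch below uses invented toy data purely for illustration; the function names are not from the specification.

```python
def fit_growth_function(ndvi, additive):
    """Ordinary least squares fit of a = b + c*d, as defined above.

    ndvi: sensed values d; additive: required input a (e.g. foliar N).
    Returns (b, c) = (intercept, slope)."""
    n = len(ndvi)
    mean_d = sum(ndvi) / n
    mean_a = sum(additive) / n
    c = (sum((d - mean_d) * (a - mean_a) for d, a in zip(ndvi, additive))
         / sum((d - mean_d) ** 2 for d in ndvi))
    b = mean_a - c * mean_d
    return b, c

def required_additive(b, c, d):
    """Amount of additive for a sensed value d; transferred to a spray rate."""
    return b + c * d

# Invented, perfectly linear data: a = 100 - 80*d
# (lower NDVI -> more nitrogen required).
b, c = fit_growth_function([0.2, 0.5, 0.8], [84.0, 60.0, 36.0])
print(round(b, 1), round(c, 1))  # 100.0 -80.0
```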

[00158] Processor 53 is responsive to file 50, which includes image 2, and other spectral images of the same region 5 at other times for allowing a user to assess differences between regions at those different times. In other words, the user can access a number of images 2 for the same area over time and is then able to compare and contrast those images. The server processor of network 51 is also responsive to location data 7 for applying one or more geographic or climatic indicators to the image, such as overlaying the resultant computed spectral value surface over a georeferenced aerial photograph of the area, such as Google Earth, for example. The geographic or climatic indicators are one or more of: contours; bearings; hours of sunshine; rainfall; water courses; and fence lines, amongst others.

[00159] The files 50 - which define the images 2 - are stored within database 54 for the particular region and are able to be accessed by an authorised user at any time as:

• Time-sequenced replays using applications that are commonly available on most modern 3G-type "smart" phones.

• An integrated data layer with any other farm management software.

• A tool for crop modeling and agribusiness projections.

[00160] In overview, spectral values 8 are progressively obtained for a plurality of respective regions 5 within paddock 3. A dataset is built for each region 5 that includes both location data 7 for that region and the respective spectral value 8. The spectral value, in this embodiment, is a single value between 0 and 255. This value is derived from an average of the intensity of the radiation at the required frequency falling upon the pixels of unit 20 that are being used when the image is captured. In other embodiments, algorithms other than averaging are used.
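The derivation of a region's single 0-to-255 spectral value by averaging pixel intensities, and the building of the per-region data set, can be sketched as follows (the record layout is an illustrative assumption):

```python
def spectral_value(pixel_intensities):
    """Derive a single 0-255 spectral value for a region by averaging the
    intensity of the radiation falling on the pixels used for the capture.
    Intensities are assumed to already be 8-bit (0-255) samples."""
    if not pixel_intensities:
        raise ValueError("no pixels captured for this region")
    avg = sum(pixel_intensities) / len(pixel_intensities)
    # Clamp and round to keep the result in the 0-255 range.
    return max(0, min(255, round(avg)))

def region_record(lat, lon, pixel_intensities):
    """Build the per-region data set: location data plus the spectral value."""
    return {"lat": lat, "lon": lon, "value": spectral_value(pixel_intensities)}
```

As the text notes, other embodiments substitute a different aggregation algorithm for the simple average.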

[00161] A spectral value for zone 10 is derived from the dataset and then spectral image 2 for paddock 3 is generated from the data and values. The image 2 is converted to a file 50 which is then sent to processor 53 to be stored in database 54 where file 50 is made available for remote access by one or more preselected authorized users.

[00162] In other embodiments, where paddock 3 includes more than one zone 10, a spectral value is derived for all respective intermediate zones by processor 6.

[00163] It will be appreciated that in other embodiments image 2 is other than an NDVI image. For example, in one embodiment, image 2 is a plant cell density (PCD) image. In another embodiment the image is for indicating the red edge inflection point (REIP) of the respective regions. In further embodiments, image 2 allows for the indication of more than one of the above, or other, measures.
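The measures named above can each be derived from per-band reflectances. The sketch below uses the standard NDVI definition, a simple NIR/red ratio for PCD, and the Guyot-Baret linear-interpolation estimate for REIP, which is one common method and not necessarily the one the embodiments use:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index, in the range [-1, 1]."""
    return (nir - red) / (nir + red)

def pcd(nir, red):
    """Plant Cell Density index: a simple NIR/red reflectance ratio."""
    return nir / red

def reip(r670, r700, r740, r780):
    """Red Edge Inflection Point (nm), estimated by linear interpolation
    between the 700 nm and 740 nm bands (Guyot-Baret method)."""
    r_edge = (r670 + r780) / 2.0
    return 700.0 + 40.0 * (r_edge - r700) / (r740 - r700)
```

An NDVI in [-1, 1] would then be rescaled to the document's 0-255 representation before being written into image 2.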

[00164] In other embodiments, the plot of land is other than a paddock. In yet other embodiments, processor 6 and unit 20 are mounted in other than an aircraft. For example, in one embodiment, processor 6 and unit 20 are mounted to an existing overhead irrigator and data is captured as the irrigator moves over the plot of land.

[00165] Furthermore, in some embodiments, an illumination source is used to provide additional light of the required wavelength or wavelengths to enhance the quality of the images captured by unit 20.

[00166] Reference is now made to Figures 8 to 10, where corresponding features are denoted by corresponding reference numerals. In this embodiment, a system 71 is configured for building a data set for a plot of land in the form of paddock 3. The data set includes spectral values 8 for a plurality of respective regions 5 within paddock 3 (not explicitly shown in Figure 8), together with location data 7 for those regions.

[00167] It will be appreciated that paddock 3 is the site of a crop that is systematically segmented into cropped segments 72, 73 and 74, and intermediate un-cropped segments 75, 76, 77 and 78. To avoid unnecessary complication, only three cropped segments are illustrated. In practice, the generally uniform crop segments extend over substantially the entirety of paddock 3 with the exception of the peripheral edge.

[00168] A tractor 80 is illustrated in Figure 8, and is progressing along cropped segment 72 in the direction indicated by arrow 79. The tractor 80 includes four wheels that are travelling along adjacent un-cropped segments 76 and 77, while an implement - which in the illustrated embodiment is a spray applicator 74 - is coupled to tractor 80 and includes two spaced apart wheels that are travelling in the direction of arrow 79. The wheels of the applicator 74 are travelling along un-cropped segments 75 and 78. Accordingly, the tractor and applicator are able to traverse paddock 3 to allow the required cultivation or treatment of the crop to occur - in this case the application of one or more of a selective herbicide, pesticide or fertilizer - without the wheels of the tractor or applicator damaging the crop.

[00169] System 71 includes a data gathering unit 81 for mounting to a vehicle in the form of tractor 80, and for remaining with the tractor as it travels across paddock 3. Tractor 80 travels at an average speed of about 25 km per hour across paddock 3. In other embodiments, tractor 80 travels at average speed of between about 20 and 30 km per hour. In yet other embodiments, tractor 80 travels at average speed in the range of about 10 to 20 km per hour. In yet other embodiments, tractor 80 travels at average speed of less than about 10 km per hour. In yet other embodiments, tractor 80 travels at average speed in the range of about 30 to 50 km per hour.

[00170] Typically, the controller is contained within a protective casing and, if required, ventilated. The operator of the tractor selectively actuates a control lever to apply the one or more treatments to the crop. In this embodiment, the actuation of the control lever also simultaneously operates the data gathering unit 81 to progressively obtain a spectral value for a plurality of respective regions within the paddock that pass under the applicator. A processor within unit 81 builds a data set for each region, where that data set includes both location data for that region and the respective spectral value. The location data is obtained from a GPS unit 82 that is also mounted to tractor 80. In other embodiments, unit 81 simply collects the location and spectral data, which is subsequently provided to the processor, which is remote from unit 81.

[00171] System 71 further includes an image capture device in the form of a pair of spaced apart sensor units 83 and 84 that are mounted to applicator 74 and maintained above the plants in the respective underlying cropped segments. In other embodiments further sensor units are used instead of or in addition to sensor units 83 and 84. As best shown in the underside view of Figure 10, sensor unit 84 includes a casing 85, from the underside of which is mounted a square array of four closely spaced sensors 87, 88, 89 and 90. These sensors are each configured to provide respective electrical signals indicative of the intensity of the sensed radiation, but at different wavelengths. In different embodiments, this radiation is of different types. For example, in one embodiment, the radiation is IR radiation. Unit 84 receives a control signal from unit 81 that is relayed via a wired communication link 91 that connects unit 81 to unit 84. This control signal is used as a timing or synchronization signal for sampling the detected intensities at sensors 87 to 90. The sampled signals from the sensors are combined and communicated from unit 84 to unit 81, also via link 91.
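The sampling cycle just described - one control signal triggering a synchronous read of all four sensors, with the readings combined into one message for return over link 91 - can be sketched as follows. The sensor objects, wavelengths and readings are hypothetical stand-ins for the hardware:

```python
def sample_on_control_signal(sensors):
    """On receipt of the control (timing) signal, sample each sensor's
    intensity once and combine the four readings into a single record
    keyed by wavelength, ready to be sent back over the wired link."""
    return {s["wavelength_nm"]: s["read"]() for s in sensors}

# Hypothetical four-sensor array, each tuned to a different wavelength.
sensors = [
    {"wavelength_nm": 690, "read": lambda: 42},
    {"wavelength_nm": 740, "read": lambda: 57},
    {"wavelength_nm": 780, "read": lambda: 63},
    {"wavelength_nm": 920, "read": lambda: 88},
]
frame = sample_on_control_signal(sensors)
```

Sampling all four sensors on the same timing edge is what lets unit 81 pair each combined frame with a single synchronized GPS fix.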

[00172] The control signal is also used by unit 81 to obtain a synchronized location signal from GPS unit 82.

[00173] It will be appreciated that these sensors, at a macro level, are similar and, being closely located, have a similar field of view of the underlying cropped segment. This field of view is illustrated by way of example in Figure 9 by the two downwardly directed, diverging arrows that radiate from unit 84.

[00174] Units 83 and 84 also include an illumination source in the form of respective downwardly directed xenon lights 92. This is especially suitable for the application, as xenon light most closely matches the spectral characteristics of natural sunlight. The object of using such a light is to negate any error due to periodic shading of the sensor unit (due to, for example, cloud cover and shadows from implements) and to supply a constant source of reflected light, so that images of crop vigour captured under different environmental conditions can be compared.

[00175] In other embodiments different illumination sources are used.

[00176] In embodiments, sensors 87, 88, 89 and 90 are removable from units 83 and 84 for facilitating quick replacement of these units. It is noted that this replacement of parts does not affect the data stream that is initially processed in the Linux single board 60.

[00177] As sensors 87, 88, 89 and 90 can be replaced quickly and easily, this allows for rapid assessment of different target wavelengths, as well as different implement heights (focal length), that are best for assessing the growth characteristics of certain crops. For example, for an aircraft primarily used for assessing potatoes, the best wavebands to use are 50 nm windows centered at 920 nm (NIR) and 690 nm (red). Such an aircraft applies fungicide at 3 to 5 metre heights and fertilizer at about a 10 metre height. When altering height or altitude, sensors 87, 88, 89 and 90 can be quickly replaced with a unit of a different focal length, depending upon the desired field of view and altitude. For example, in embodiments, at 3 to 5 metres in height the focal length is 8 mm, which equates to a ground field of view of approximately 20 m2. At 10 metres in height the focal length is 12 mm, to yield the same field of view.
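The trade-off between height and focal length follows simple pinhole-camera geometry. The sketch below assumes a square sensor of a given width, a parameter the text does not specify, so it illustrates the scaling (footprint grows with height and shrinks with focal length) rather than reproducing the exact footprints quoted:

```python
def ground_footprint_m2(height_m, focal_length_mm, sensor_width_mm):
    """Approximate square ground footprint under the pinhole model:
    side = sensor_width * height / focal_length. The sensor width is an
    assumed parameter, not a value given in the specification."""
    side_m = sensor_width_mm * height_m / focal_length_mm  # mm * m / mm = m
    return side_m * side_m

# With an assumed 8 mm-wide sensor: at 4 m height and 8 mm focal length
# the footprint is 16 m2; doubling the focal length at the same height
# quarters the footprint.
low_pass = ground_footprint_m2(4, 8.0, 8.0)
narrow = ground_footprint_m2(4, 16.0, 8.0)
```

This is why a higher operating altitude calls for a longer focal length to keep the same field of view.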

[00178] A possible real life example would be if a party wished to assess banana, sweet corn and sugar cane. Each crop has a different target waveband and ideal operating height. To carry out the assessment of the crops, the assessor only needs to select the correct lens to be mounted as one of sensors 87, 88, 89 and 90 to quickly replace and fit prior to the commencement, for example, of the aircraft taking off.

[00179] Alternatively, the current Linux SBC board 60 has the potential for eight input data streams (four paired inputs). Therefore, in an embodiment, four different sensors 87, 88, 89 and 90 are fitted to applicator 74. The sensor that is used can then be selected using a cockpit mounted switch. In other embodiments, each lens will collect spectral data and the processor will process only the input from a pre-determined one of the sensors 87, 88, 89 and 90. It is noted that the same apparatus and methods are also applicable not only to aircraft but also to ground operated units including spreaders, spray operators and over-head irrigation equipment.

[00180] In an embodiment, the data capture can be configured remotely. That is, the Linux SBC board data capture rate can be configured remotely via the office based computer. As such, the configuration can be changed for different uses dependent upon altitude, velocity (SBC board) and wavelengths (lens mounts). In embodiments, the same unit can be removed from an agricultural aircraft and placed onto a tractor, then reconfigured via the website interface. In this embodiment, the Linux SBC board 60 would reconfigure when booted up via GPRS connection to the website.

[00181] The control signal is provided by unit 81 at a rate of about 1 per second, and as such there are four sensor signals per second communicated to unit 81 from unit 84. As unit 83 also includes four similar sensors, unit 81 receives eight sensor signals per second as tractor 80 travels over paddock 3 and applies the relevant treatment to the underlying crop.

[00182] Sensors 87, 88, 89 and 90 mounted to applicator 74 are typically positioned at a height of about 3 to 6 metres above the crop for the purpose of spraying. For the purpose of applying fertilizer, sensors 87, 88, 89 and 90 are typically positioned at a height of about 10 metres above the crop.

[00183] In embodiments whereby a tractor 80 is used, sensors 87, 88, 89 and 90 are typically positioned at a height of about 2 to 3 metres above the crop. A spray contractor typically has a spray implement with arms that fold out when spraying and fold in when moving between jobs. One of sensors 87, 88, 89 and 90 (lens mount) would be located in the centre region slightly in front of the implement on each side, with a field of view of about 8 to 20 m2.

[00184] Unit 81 is responsive to the movement and rate of movement of tractor 80 for issuing the timing signals, and for recording synchronized - or at least substantially synchronized - sensor signals and location data. Accordingly, as the speed of tractor 80 varies, so too does the rate at which the control signals are issued. If the tractor stops, either momentarily or for an extended period, the control signals will be suspended until movement is once again detected. This reduces the risk of a build-up within unit 81 of redundant or unnecessary data, and simplifies the transmission and processing of the data. In embodiments, tractor 80 includes a pause button that is actuatable by a tractor operator during operation of tractor 80. This pause button allows the tractor operator to pause data capture when data is not required, which is the case when navigating around obstacles. In embodiments where aircraft 40 is used, aircraft 40 also includes a similar pause button feature.
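The movement-dependent triggering described above can be sketched as follows; the minimum-speed threshold and the per-region along-track length are assumed values, not figures from the specification:

```python
def should_trigger(speed_m_per_s, paused, min_speed_m_per_s=0.1):
    """Decide whether unit 81 should issue a control signal. Triggering is
    suppressed while the vehicle is (near-)stationary or while the
    operator's pause button is active, avoiding a build-up of redundant
    data within unit 81."""
    return (not paused) and speed_m_per_s >= min_speed_m_per_s

def trigger_interval_s(speed_m_per_s, region_length_m=4.0):
    """Interval between control signals so that one sample is taken per
    region of the given along-track length (an assumed value): the faster
    the tractor moves, the higher the trigger rate."""
    return region_length_m / speed_m_per_s
```

For example, at 8 m/s the interval is 0.5 s, and a stopped or paused tractor produces no triggers at all.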

[00185] In this embodiment, unit 81 processes the raw data to provide a data file that is transmitted wirelessly - in this embodiment by the GPRS protocol - to a remote computer.

[00186] In embodiments where greater accuracy is required, the GPS data is corrected by unit 81 or the remote computer to correct for the location of units 83 and 84 relative to unit 82.

[00187] The remote computer processes the data file to provide a spectral image having both NDVI and REIP images, together with time and date information. This is then made available for authorized users to selectively access via a web server, together with earlier generated like images for the paddock. Authorized users can also identify new areas of interest (paddocks 3) via a web portal. These new areas are then registered by the Linux SBC board 60 when the aircraft or ground operated equipment is next used.

[00188] It will be appreciated that the disclosure above provides a spectral image of a paddock that is quick and inexpensive to obtain, which provides an indication of one or more characteristics of the land vegetation in the paddock, and which is sized for the purpose of allowing quick and easy processing, communication and analysis.

[00189] The primary advantages of one or more of the embodiments described above include:

• The spectral values are obtained during an activity that would otherwise have to occur. No additional operations, treatments, flights or services need be scheduled.

• The spectral values and location data are processed by a computer mounted to the aircraft and the NDVI image or images are constructed as the crop dusting activity is underway.

• Once generated, the spectral image is sent immediately to a website (for example, to www.terrapix.com.a ) and is made available at low cost to subscribed customers.

• Often the spectral image is available to the customers while the aircraft is still in flight.

• The spectral image is included in a file of small size to facilitate quick transmission to the central computer network and subsequent downloading to remote subscribers.

• The capture of the images occurs simultaneously with the application of the treatment, and does not require any additional actions by the pilot during that application. A relatively low-tech, commercially available industrial or laptop computer mounted to the aircraft is able to coordinate the capture and processing of the data required to produce the spectral image. That is, the need for significant post-capture processing has been obviated.

• Furthermore, as the need for significant post-capture processing has been obviated, separate processing software is also not required. Therefore, there is no need to learn how to use complex processing software, which saves time and costs.

• Speed of processing and delivery is aided by the use of relatively low resolution image capture, and the "filling" of the intermediate zones. It has been found that in cropping applications the resolution provided by the embodiments is sufficiently accurate to allow identification of possible risk regions within the paddock without having to incur the cost of far more expensive and detailed image capture.

• The captured NDVI images can be used in an algorithm to predict yield (or some other factor). Over time it is expected that an association between NDVI and crop yield, or an association with another crop health limiting parameter, would become apparent. An example of such an association is foliar nitrogen application rates for potato crops.

• NDVI images can be used to predict within-paddock gross margins.

• Storage of NDVI and yield/gross margin data can be used to assess the efficacy of crop management practices.

• NDVI can be used to inform existing crop growth models.

• NDVI can be used to determine within-paddock plant growth and, therefore, the costs and benefits of within-paddock variable rate management, such as variable rate irrigation and fertilizer applications.

• There is flexibility to undertake some processing during or immediately post collection of the sensor information to reduce the amount of data that actually need be sent to the remote computer. However, post-processing also occurs at the remote computer, so the computer located on the vehicle need not be high powered or highly specified.

• The entire photodiode unit or lens mount can be quickly removed and replaced with another unit to facilitate quick data capture of different target reflected light wavelengths from different crops.

• The on-board processor unit can be accessed remotely to identify new geofences and to change data capture configuration. This allows for the best combination to be used for different crops under different particular conditions in different geographic locations using the same data capture unit and same method of data capture.

• Capture of data at a very low altitude negates atmospheric effects that often affect spectral reflectance measurements from high altitude aircraft. Accordingly, this is advantageous as it makes obtaining a spectral image from a plot of land much cheaper. Higher altitude data capture (for example at thousands of feet above ground) is particularly susceptible to atmospheric effects, which can affect the sensing of wavelengths from crops. Typically, great effort (in terms of computing power, manual labour and the need for specialised sensors) is required to measure, correct and standardise these varying atmospheric effects. This is negated by data capture from relatively close to ground level.

• Captured NDVI data from agricultural aircraft is, in an embodiment, at about an 80 megapixel scale. This closely matches the mode of variable rate application from those same aircraft. High altitude NDVI data capture is at about a 1 megapixel scale, which requires more data analysis and interpretation to manipulate those images to a coarser level.

[00190] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," "analyzing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.

[00191] In a similar manner, the term "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A "computer" or a "computing machine" or a "computing platform" may include one or more processors.

[00192] The methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein. Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included. Thus, one example is a typical processing system that includes one or more processors. Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit. The processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM. A bus subsystem may be included for communicating between the components. The processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, for example a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth. The term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit. The processing system in some configurations may include a sound output device, and a network interface device.
The memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein. Note that when the method includes several elements, e.g., several steps, no ordering of such elements is implied, unless specifically stated. The software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system. Thus, the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.

[00193] The LCD screen displays some basic features such as GPS signal strength, power status, data input rates from lens mounts, and GPRS signal strength, amongst others. This configuration can facilitate efficient set up and can also identify any problems encountered during set up.

[00194] The Linux SBC board 60 has a data port, in the form of a USB port, for more advanced identification of the status of the system and to facilitate more significant software upgrades that might be required. This thereby negates removal of the data logger from the aircraft or ground operated unit and allows for more efficient maintenance of the system (such as a software rebooting-type of function). In other embodiments, the data port is other than a USB port.

[00195] Furthermore, a computer-readable carrier medium may form, or be included in a computer program product.

[00196] In alternative embodiments, the one or more processors operate as a standalone device or may be connected, e.g., networked, to other processor(s). In a networked deployment, the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.

[00197] Note that while some diagrams only show a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described in order not to obscure the inventive aspect. For example, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.

[00198] Thus, one embodiment of each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program that is for execution on one or more processors, e.g., one or more processors that are part of web server arrangement. Thus, as will be appreciated by those skilled in the art, embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium, e.g., a computer program product. The computer-readable carrier medium carries computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method. Accordingly, aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.

[00199] The software may further be transmitted or received over a network via a network interface device. While the carrier medium is shown in an exemplary embodiment to be a single medium, the term "carrier medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "carrier medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention. A carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical, magnetic disks, and magneto-optical disks. Volatile media includes dynamic memory, such as main memory. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media also may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. 
For example, the term "carrier medium" shall accordingly be taken to include, but not be limited to: solid-state memories; a computer product embodied in optical and magnetic media; a medium bearing a propagated signal detectable by at least one processor of the one or more processors and representing a set of instructions that, when executed, implement a method; a carrier wave bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions; and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.

[00200] It will be understood that the steps of methods discussed are performed in one embodiment by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.

[00201] Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.

[00202] Similarly it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, Figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.

[00203] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.

[00204] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.

[00205] In the description provided herein, numerous specific details are set forth.

However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.

[00206] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

[00207] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as fall within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.




 