

Title:
CLOUD-FREE ANALYTICS FROM SATELLITE INPUT
Document Type and Number:
WIPO Patent Application WO/2022/254211
Kind Code:
A1
Abstract:
A method for analysing an image of a field in which cloud obscures part of the field, and a device for performing the method, are provided. The method comprises the steps of detecting, by a first artificial neural network using a classification algorithm, that there is cloud in the image and that cloud obscures at least part of the field; determining, by the first artificial neural network using the classification algorithm, a percentage area of the field obscured by the cloud; determining that the percentage area of the field obscured by the cloud is less than or equal to a first threshold; determining, by a second artificial neural network using a segmentation algorithm, positions within the image at which the field is obscured by the cloud; applying a mask to the image, the mask corresponding to the positions within the field obscured by the cloud, to allocate cloud-free area of the field; and analysing the cloud-free area of the field.

Inventors:
SIDIROPOULOS PANAGIOTIS (GB)
Application Number:
PCT/GB2022/051394
Publication Date:
December 08, 2022
Filing Date:
June 01, 2022
Assignee:
HUMMINGBIRD TECH LIMITED (GB)
International Classes:
G06V20/13; G06K9/62; G06V10/26; G06V10/82; G06V20/17
Foreign References:
US20200250427A1 (2020-08-06)
US20170161584A1 (2017-06-08)
CN111898503B (2021-02-26)
Other References:
BERRY, P. M.; SPINK, J. H.: "A physiological analysis of oilseed rape yields: Past and future", THE JOURNAL OF AGRICULTURAL SCIENCE, vol. 144, 2006, pages 381-392
ROUSE, J. W.; HAAS, R. H.; SCHEEL, J. A.; DEERING, D. W.: "Monitoring Vegetation Systems in the Great Plains with ERTS", PROCEEDINGS, 3RD EARTH RESOURCE TECHNOLOGY SATELLITE (ERTS) SYMPOSIUM, vol. 1, 1974, pages 48-62
Attorney, Agent or Firm:
DUNLOP, Hugh (GB)
Claims

1. A method for analysing an image of a field in which cloud obscures part of the field, the method comprising: detecting, by a first artificial neural network using a classification algorithm, that there is cloud in the image and that cloud obscures at least part of the field; determining, by the first artificial neural network using the classification algorithm, a percentage area of the field obscured by the cloud; determining that the percentage area of the field obscured by the cloud is less than or equal to a first threshold; determining, by a second artificial neural network using a segmentation algorithm, positions within the image at which the field is obscured by the cloud; applying a mask to the image, the mask corresponding to the positions within the field obscured by the cloud, to allocate cloud-free area of the field; and analysing the cloud-free area of the field.

2. The method of claim 1, wherein the first threshold is between 40% and 70%.

3. The method of claim 1 or 2, wherein an average value characteristic of the cloud-free area of the field is produced by the analysis of the cloud-free area of the field, and that average value is used to characterise the positions of the field obscured by the cloud.

4. The method of claim 1 or 2, wherein the analysis of the cloud-free area of the field is used to interpolate values characteristic of positions of the field obscured by the cloud based on values characteristic of the cloud-free area of the field.

5. The method of any preceding claim, further comprising, prior to the detecting, extracting the image of the field from an image captured by a satellite.

6. The method of claim 5, wherein extracting further comprises applying field boundaries of the field to the image captured by a satellite, labelling the section of the image within the field boundaries as the image of the field, and discarding sections of the image outwith the field boundaries of the field.

7. The method of claim 6, further comprising resizing the section of the image within the field boundaries to a predefined image size.

8. The method of any one of claims 5 to 7, wherein extracting comprises retaining red, green, blue and NIR bands of the image captured by a satellite and discarding remaining bands.

9. The method of any preceding claim, wherein the segmentation algorithm labels each pixel of the image as cloud or not cloud.

10. The method of any preceding claim, wherein the first artificial neural network and the second artificial neural network are trained using training samples consisting of images of agricultural areas of land.

11. The method of claim 10, wherein the training of the first artificial neural network and the second artificial neural network comprises supervised deep learning training.

12. The method of any preceding claim, wherein the first artificial neural network and the second artificial neural network are convolutional neural networks.

13. An apparatus for analysing an image of a field in which cloud obscures part of the field, the apparatus comprising: means for detecting, by a first artificial neural network using a classification algorithm, that there is cloud in the image and that cloud obscures at least part of the field; means for determining, by the first artificial neural network using the classification algorithm, a percentage area of the field obscured by the cloud; means for determining that the percentage area of the field obscured by the cloud is less than or equal to a first threshold; means for determining, by a second artificial neural network using a segmentation algorithm, positions within the image at which the field is obscured by the cloud; means for applying a mask to the image, the mask corresponding to the positions within the field obscured by the cloud, to allocate cloud-free area of the field; and means for analysing the cloud-free area of the field.

14. The apparatus of claim 13, wherein the first threshold is between 40% and 70%.

15. The apparatus of claim 13 or 14, wherein an average value characteristic of the cloud-free area of the field is produced by the analysis of the cloud-free area of the field, and that average value is used to characterise the positions of the field obscured by the cloud.

16. The apparatus of claim 13 or 14, wherein the analysis of the cloud-free area of the field is used to interpolate values characteristic of positions of the field obscured by the cloud based on values characteristic of the cloud-free area of the field.

17. The apparatus of any one of claims 13 to 16, further comprising, prior to the means for detecting, means for extracting the image of the field from an image captured by a satellite.

18. The apparatus of claim 17, wherein the means for extracting further comprises means for applying field boundaries of the field to the image captured by a satellite, means for labelling the section of the image within the field boundaries as the image of the field, and means for discarding sections of the image outwith the field boundaries of the field.

19. The apparatus of claim 18, further comprising means for resizing the section of the image within the field boundaries to a predefined image size.

20. The apparatus of any one of claims 17 to 19, wherein the means for extracting comprises means for retaining red, green, blue and NIR bands of the image captured by a satellite and means for discarding remaining bands.

21. The apparatus of any one of claims 13 to 20, wherein the segmentation algorithm labels each pixel of the image as cloud or not cloud.

22. The apparatus of any one of claims 13 to 21, wherein the first artificial neural network and the second artificial neural network are trained using training samples consisting of images of agricultural areas of land.

23. The apparatus of claim 22, wherein the training of the first artificial neural network and the second artificial neural network comprises supervised deep learning training.

24. The apparatus of any one of claims 13 to 23, wherein the first artificial neural network and the second artificial neural network are convolutional neural networks.

Description:
Cloud-free analytics from satellite input

Field

[001] This invention relates to cloud-free analytics of aerial and satellite images.

Background

[002] Aerial images of the earth's surface may be taken using satellites orbiting in space. Such images usually comprise a plurality of frequency bands, including bands for the visible spectrum. Clouds are often present in such images, and these obscure the earth's surface in the visible (red, green, and blue) and near infra-red (NIR) bands. ESA's Sentinel-2 satellites aim to image the whole of the earth's surface at least once per week in thirteen different bands (including bands for red, green, blue, and NIR). Satellites also image in other bands, for example ultraviolet and short-wave infrared (SWIR) bands.

[003] Aerial images (e.g. images taken by satellites) may be used to analyse vegetation cover, for example of forests and agricultural land. Analysis of crops in agricultural fields may be used to identify crop cover, maturity, and health. Collection of crop cover and/or maturity data over time allows the early identification of several different types of stress in the crop, provides valuable insights on the expected yield, and may enable prediction of when the crop will be ready to harvest. The analysis of crops may involve an index, for example the green area index (Berry P M, Spink J H. 2006. A physiological analysis of oilseed rape yields: Past and future. The Journal of Agricultural Science 144:381-392) or the normalised difference vegetation index (NDVI) (Rouse, J.W., Haas, R.H., Scheel, J.A., and Deering, D.W. (1974) 'Monitoring Vegetation Systems in the Great Plains with ERTS.' Proceedings, 3rd Earth Resource Technology Satellite (ERTS) Symposium, vol. 1, p. 48-62).

[004] However, clouds frequently obscure aerial images of crops. Cloud may cover all or part of a particular field. Usually, aerial images in which at least part of the field is obscured by cloud are discarded. If the field is obscured by cloud (i.e. a part of a cloud, parts of plural clouds, or one or plural full clouds) each time aerial images are taken, then no data is gathered regarding that field.

[005] It is often simply not possible to wait for a cloud-free image (particularly if satellite imaging is used). It is known to use frequency bands that penetrate cloud, for example radio waves (radar data), which are available in some satellite images, but such bands may not be directly relevant to agricultural features, introduce a new set of challenges (e.g. speckle noise), or may generate less accurate analytics than visible and NIR imagery.

Summary of the invention

[006] In accordance with a first aspect of the invention, a method for analysing an image of a field in which cloud obscures part of the field is provided. The method comprises: detecting, by a first artificial neural network using a classification algorithm, that there is cloud in the image and that cloud obscures at least part of the field; determining, by the first artificial neural network using the classification algorithm, a percentage area of the field obscured by the cloud; determining that the percentage area of the field obscured by the cloud is less than or equal to a first threshold; determining, by a second artificial neural network using a segmentation algorithm, positions within the image at which the field is obscured by the cloud; applying a mask to the image, the mask corresponding to the positions within the field obscured by the cloud, to allocate cloud-free area of the field; and analysing the cloud-free area of the field.

[007] According to an aspect of the invention, there is provided an apparatus for analysing an image of a field in which cloud obscures part of the field. The apparatus comprises: means for detecting, by a first artificial neural network using a classification algorithm, that there is cloud in the image and that cloud obscures at least part of the field; means for determining, by the first artificial neural network using the classification algorithm, a percentage area of the field obscured by the cloud; means for determining that the percentage area of the field obscured by the cloud is less than or equal to a first threshold; means for determining, by a second artificial neural network using a segmentation algorithm, positions within the image at which the field is obscured by the cloud; means for applying a mask to the image, the mask corresponding to the positions within the field obscured by the cloud, to allocate cloud-free area of the field; and means for analysing the cloud-free area of the field.

[008] The method and apparatus enable analysis using only cloud-free areas of a field. The classification algorithm is faster, less computationally demanding and more accurate than the segmentation algorithm. Utilising the classification algorithm to determine which images to analyse with the segmentation algorithm allows the method to generate results faster and be more accurate than if the segmentation algorithm were used for all images. The method and apparatus allow for identification and analysis of cloud-free areas of fields that are sufficiently large to allow statistically significant analytics thereof (cloud-free areas may be areas, or locations, of the image at which vegetation is expected). This enables analysis of (and therefore data from) an increased proportion of aerial images, and increased data for time-series analyses. As a result, the accuracy of analytics of fields (crops growing in fields) is improved. The image of a field usually comprises red, green, and blue (RGB) and near infra-red (NIR) bands. Thus, unlike cloud-penetrating methods (e.g. radar), the cloud obscures part of the field and the method enables determination of the extent and location of cloud. Cloud may mean partial, singular and plural clouds in the image; in other words, cloud may refer to at least a part of at least one cloud.
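
A minimal sketch of this two-stage flow, assuming hypothetical helpers: classify_cloud (the first network), segment_cloud (the second network), and analyse (the analytics step). None of these names come from the application itself.

```python
import numpy as np

CLOUD_THRESHOLD = 0.5  # first threshold; the text suggests 40%-70%

def process_field_image(image, classify_cloud, segment_cloud, analyse):
    """Run the fast classifier first; run the costly per-pixel segmenter
    only when cloud is present but covers no more than the threshold."""
    has_cloud, cloud_fraction = classify_cloud(image)  # first ANN
    if not has_cloud:
        return analyse(image)             # no cloud: analyse the whole field
    if cloud_fraction > CLOUD_THRESHOLD:
        return None                       # too much cloud: discard the image
    cloud_mask = segment_cloud(image)     # second ANN: boolean per-pixel mask
    cloud_free = np.where(cloud_mask[..., None], 0, image)
    return analyse(cloud_free)            # analyse only the cloud-free area
```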

[009] Preferably, the first threshold is between 40% and 70%. In other words, the threshold area obscured by cloud is between 40% and 70%. More preferably, the threshold is between 45% and 60%. Still more preferably, the threshold is 50%. The threshold balances retaining data points (by analysing images with partial cloud coverage that still have sufficient visible field to give a reasonably accurate data point) against the speed and computational requirements of the overall method, because the segmentation algorithm is costly and so is undesirable to use for every image.

[0010] Preferably, an average value characteristic of the cloud-free area of the field is produced by the analysis of the cloud-free area of the field, and that average value is used to characterise the positions of the field obscured by the cloud. This avoids using areas obscured by cloud in the analysis. Use of the cloud-free areas of the field allows for analysis in visible (RGB) and NIR bands, which is typically more accurate than use of cloud-penetrating instruments.

[0011] Alternatively, the analysis of the cloud-free area of the field is used to interpolate values characteristic of positions of the field obscured by the cloud based on values characteristic of the cloud-free area of the field. This enables derivation of values for positions covered by cloud.
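
A sketch of this interpolation alternative, assuming a per-pixel value map (e.g. an index image) and a boolean cloud mask; both inputs are illustrative assumptions, and scipy's griddata is one possible implementation choice.

```python
import numpy as np
from scipy.interpolate import griddata

def fill_obscured(values, cloud_mask):
    """Interpolate values at cloud-obscured pixels from cloud-free ones."""
    h, w = values.shape
    yy, xx = np.mgrid[0:h, 0:w]
    known = ~cloud_mask
    filled = values.astype(float)
    filled[cloud_mask] = griddata(
        np.column_stack([yy[known], xx[known]]),  # cloud-free coordinates
        values[known],                            # cloud-free values
        (yy[cloud_mask], xx[cloud_mask]),         # obscured coordinates
        method="linear",  # linear may leave NaN outside the convex hull
    )
    return filled
```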

[0012] Preferably, the analysis of the cloud-free area of the field comprises analysing vegetation. This may comprise comparing reflectance of two bands of light over the cloud-free area of the field to determine vegetation maturity or health. The analysis may involve an index, for example the green area index or the normalised difference vegetation index (NDVI), as will be familiar to the person skilled in the art.

[0013] Preferably, the method comprises, prior to the detecting, extracting the image of the field from an image captured by a satellite. Preferably, the apparatus comprises, prior to the means for detecting, means for extracting the image of the field from an image captured by a satellite. This allows the method and apparatus to identify the field from a larger image, for example a full-strip satellite image.

[0014] Preferably, the extracting comprises applying field boundaries of the field to the image captured by a satellite, labelling the section of the image within the field boundaries as the image of the field, and discarding sections of the image outwith the field boundaries of the field. Preferably, the means for extracting comprises means for applying field boundaries of the field to the image captured by a satellite, means for labelling the section of the image within the field boundaries as the image of the field, and means for discarding sections of the image outwith the field boundaries of the field. This enables identification of the field from the image captured by the satellite.

[0015] Preferably, the section of the image within the field boundaries is resized to a predefined image size. Preferably, the apparatus comprises means for resizing the section of the image within the field boundaries to a predefined image size.

[0016] Preferably, the extracting comprises retaining red, green, blue and NIR bands of the image captured by a satellite and discarding remaining bands. For example, Sentinel-2 has 13 bands in total. Sentinel-2 and other imaging instruments on board satellites are capable of capturing images in visible and NIR bands. Preferably, the means for extracting comprises means for retaining red, green, blue and NIR bands of the image captured by a satellite and means for discarding remaining bands.

[0017] Preferably, the segmentation algorithm labels each pixel of the image as cloud or not cloud. In other words, the segmentation algorithm identifies each part of the field as obscured by a cloud or as cloud-free (e.g. each pixel labelled as cloud or not cloud). This enables production of a cloud-mask to apply to the image.
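
A sketch of applying such per-pixel labels, assuming the segmenter outputs a boolean array of the same height and width as the image (True where a pixel is labelled cloud); the function name is hypothetical.

```python
import numpy as np

def allocate_cloud_free(field_image, pixel_is_cloud):
    """Apply the mask: keep cloud-free pixels, exclude cloud pixels."""
    masked = np.where(pixel_is_cloud[..., None], 0, field_image)
    cloud_free_fraction = 1.0 - pixel_is_cloud.mean()
    return masked, cloud_free_fraction
```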

[0018] Preferably, the first artificial neural network and the second artificial neural network are convolutional neural networks. Preferably, the artificial neural networks are trained using training samples consisting of images of agricultural areas of land. Preferably, the agricultural areas used for training are in the same country as the field, or in a neighbouring country.

[0019] Preferably, the training of the first artificial neural network and the second artificial neural network comprises supervised deep learning training.

[0020] Preferably, if the percentage area of the field obscured by cloud is greater than the first threshold, the method comprises discarding the image. Preferably, if the percentage area of the field obscured by cloud is greater than the first threshold, the apparatus comprises means for discarding the image. Discarding that image (i.e. not analysing it further) means, for example, not determining, by the second artificial neural network using the segmentation algorithm, positions within the image at which the field is obscured by the cloud.

[0021] Preferably, if the first artificial neural network using the classification algorithm detects that there is no cloud in the image, the method (and apparatus) proceeds to the step of analysing the cloud-free area of the field and analyses the whole area of the field, without use of the segmentation algorithm.

[0022] According to an aspect of the invention, the classification algorithm is executed using an artificial neural network and the segmentation algorithm is executed using the same artificial neural network. In that case, the invention provides a method for analysing an image of a field in which cloud obscures part of the field. The method comprises: detecting, by an artificial neural network using a classification algorithm, that there is cloud in the image and that cloud obscures at least part of the field; determining, by the artificial neural network using the classification algorithm, a percentage area of the field obscured by the cloud; determining that the percentage area of the field obscured by the cloud is less than or equal to a first threshold; determining, by the artificial neural network using a segmentation algorithm, positions within the image at which the field is obscured by the cloud; applying a mask to the image, the mask corresponding to the positions within the field obscured by the cloud, to allocate cloud-free area of the field; and analysing the cloud-free area of the field.

[0023] Other features and advantages of the invention will become apparent after review of the entire application, including the following sections: brief description of the drawings, detailed description and claims.

Brief description of the drawings

[0024] The accompanying drawings illustrate exemplary aspects of the invention, and, together with the general description given above and the detailed description given below, serve to explain features of the invention.

[0025] Fig. 1 is a schematic of a satellite view of an agricultural area of the earth's surface.

[0026] Fig. 2 is a flow diagram representing a method for analysing an image of a field.

[0027] Fig. 3 is a flow diagram representing the image download and extraction step of Fig. 2.

[0028] Fig. 4 is a diagram showing a computer system.

Detailed description

[0029] The various aspects of the invention will be described in detail with reference to the accompanying drawings by way of example only. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.

[0030] The invention uses one or more artificial neural networks (ANNs) to analyse images. An ANN is based on a collection of connected artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons. An artificial neuron that receives a signal then processes it and can signal neurons connected to it. Neurons and edges (connections) typically have a weight that adjusts as learning proceeds. The weight increases or decreases the strength of the signal at a connection. Neurons may have a threshold such that a signal is sent only if the aggregate signal crosses that threshold. Typically, neurons are aggregated into layers. Different layers may perform different transformations on their inputs. Signals travel from the first layer (the input layer), to the last layer (the output layer), possibly after traversing the layers multiple times.
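
By way of illustration only (this sketch is not taken from the application), the weighted-sum behaviour described above can be written in a few lines of numpy: each layer multiplies its input by a weight matrix, adds a bias, and applies a nonlinearity before signalling the next layer.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)  # signal passes only above the threshold

def forward(x, layers):
    """layers is a list of (weights, bias) pairs; the signal travels from
    the input layer through each layer in turn to the output layer."""
    for weights, bias in layers:
        x = relu(weights @ x + bias)
    return x
```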

[0031] Convolutional neural networks are a type of feed-forward artificial neural network. Convolutional neural networks may include collections of neurons that each have a receptive field and that collectively tile an input space.

[0032] Deep learning architectures, such as deep belief networks and deep convolutional networks, may comprise layered neural network architectures in which the output of a first layer of neurons becomes an input to a second layer of neurons, the output of a second layer of neurons becomes an input to a third layer of neurons, and so on. Such networks are often referred to as sequential networks. The skilled person will understand, however, that other deep learning architectures, e.g. functional, may be used. Deep neural networks may be trained to recognize a hierarchy of features. Like convolutional neural networks, computation in these deep learning architectures may be distributed over a population of processing nodes, which may be configured in one or more computational chains. These multi-layered architectures may be trained one layer at a time and may be fine-tuned using back propagation.
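
A minimal sketch of such a sequential convolutional network, assuming Keras as the framework (the application does not name one); the layer sizes and the 128 x 128 four-band input are illustrative assumptions.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 4)),             # RGB + NIR bands
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # e.g. cloud / not cloud
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```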

[0033] FIG. 1 represents a satellite image 100 of an agricultural area of the earth's surface. Four fields (105, 110, 115, 120) are shown: a first field 105 having first field boundaries 106, a second field 110 having second field boundaries 111, a third field 115 having third field boundaries (not labelled, for clarity of the figure), and a fourth field 120 having fourth field boundaries (not labelled, for clarity of the figure). Four rectangular fields are shown, but the skilled person would recognise that the fields may be any shape and size, so long as the boundaries are defined. Only four fields are shown in a rectangular image, but it will be appreciated that this is a simplification: the image may be any shape (defined by the particular method by which the aerial image is taken), and further fields or other features (e.g. roads, buildings) will also be present. The fields 105, 110, 115 in the image 100 are at least partly obscured by cloud cover (clouds 125, 130, 135).

[0034] Fig. 2 is a flow diagram representing a method 200 for analysing an image of a field. The method 200 may be used to analyse the image 100 of Fig. 1 and will be described with reference to the fields depicted in Fig. 1.

[0035] At step 205 an aerial image is acquired. The aerial image may be collected by a satellite, for example Sentinel-2, and may be a full strip consisting of all bands (including red, green, blue, and NIR). The aerial image typically encompasses a larger area than a single field. At step 210 field boundaries for a particular field are acquired, for example from a database. The field boundaries of a given field form a closed loop enclosing, or defining, the field. The field boundaries are predefined and stored (e.g. in the database), and may follow walls, fences, and/or hedgerows on the ground. The image acquired at step 205 and the field boundaries acquired at step 210 are input to a step 215 of image download and extraction.

[0036] At step 215, the field boundaries are applied to the aerial image to extract an aerial image of the field (the area encompassed by the field). For example, the field boundaries 106 are applied to image 100 to result in an extracted image of the first field 105. Visible (red, green, blue - RGB) and near infra-red (NIR) bands are retained, and other bands are typically discarded. The image of the first field 105 comprises visible parts of the first field 105 and areas obscured by first cloud 125 and second cloud 130. Step 215 will be explained in more detail with reference to Fig. 3, below.
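
A simplified sketch of this extraction step, under stated assumptions: the satellite strip is already a (height, width, bands) array, the field boundary is supplied as a boolean footprint of the same height and width, and the band order is known. A real pipeline would use geospatial tooling; this only shows the retain-and-crop logic.

```python
import numpy as np

RGB_NIR = [0, 1, 2, 3]  # assumed positions of the red, green, blue, NIR bands

def extract_field(strip, field_footprint):
    bands = strip[..., RGB_NIR]            # retain RGB + NIR, discard the rest
    ys, xs = np.nonzero(field_footprint)   # pixels inside the field boundary
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    crop = bands[y0:y1, x0:x1]
    inside = field_footprint[y0:y1, x0:x1]
    return np.where(inside[..., None], crop, 0)  # zero pixels outside the field
```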

[0037] The image of the field is output from step 215 and used as input to cloud detection step 220. In cloud detection step 220 an artificial neural network employing a classification algorithm analyses the image of the field. The artificial neural network employing the classification algorithm determines whether there is cloud in the image of the field (in other words, determines whether any part of the field is obscured by cloud) and classifies the image of the field as having cloud or not having cloud. The classification algorithm also determines the extent of cloud in the image of the field as a percentage of the total area of the field obscured by cloud. In the example of Fig. 1, the first field 105, second field 110, and third field 115 would be classified as having cloud obscuring at least part of the respective field, and the fourth field 120 would be classified as having no cloud obscuring that field. The term "cloud" is used to refer to any portion or number of clouds obscuring a field.
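
A sketch of a first network with the two outputs step 220 requires: a cloud / not-cloud classification and the percentage of the field that is obscured (paragraph [0050] notes a regression may generate the extent). The architecture itself is an illustrative assumption, written with the Keras functional API.

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(128, 128, 4))          # RGB + NIR field image
x = tf.keras.layers.Conv2D(16, 3, activation="relu")(inputs)
x = tf.keras.layers.MaxPooling2D()(x)
x = tf.keras.layers.Conv2D(32, 3, activation="relu")(x)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
has_cloud = tf.keras.layers.Dense(1, activation="sigmoid", name="has_cloud")(x)
cloud_pct = tf.keras.layers.Dense(1, activation="sigmoid", name="cloud_pct")(x)
model = tf.keras.Model(inputs, [has_cloud, cloud_pct])
model.compile(optimizer="adam",
              loss={"has_cloud": "binary_crossentropy", "cloud_pct": "mse"})
```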

[0038] The artificial neural network in step 220 takes the image of the field from step 215 as its input to the first layer of the neural network and outputs, at the last layer of the neural network, the results of classification by the neural network (that is, whether the image includes cloud and, if so, the percentage area of the field obscured by the cloud).

[0039] The method 200 continues at decision step 225, wherein the classification of the image at step 220 is used to determine if any cloud was detected in the image of the field. If the classification indicates that no cloud was detected, then the method 200 proceeds to step 230. If the classification indicates that cloud was detected in the image of the field, then the method 200 proceeds to step 235.

[0040] At decision step 235, the percentage area of the field obscured by the cloud (from the classification of the image at step 220) is compared to a threshold. If the percentage area of the field obscured by the cloud is less than or equal to the threshold, then the method proceeds to step 240, and the cloud-free area of the image of the field is subsequently analysed. If the percentage area of the field obscured by the cloud is greater than the threshold, then the method proceeds to step 245, and the image of the field is not subsequently analysed. The threshold is set to a value that enables images that do not have extensive cloud cover to be used in subsequent analysis. The visible or cloud-free areas of such images allow the field to be viewed in those areas, and vegetation in those areas to be analysed. The threshold is therefore set low enough that analysis of the vegetation in the cloud-free areas is statistically reliable. Conversely, the threshold is also set high enough that as few images as possible are discarded, so that as much data as possible about the field is collected and analysed. This may enable time-series data, and more valuable insights on the condition of the crop. The threshold may be set to a value between 40% and 70%. Preferably, the first threshold is set to a value between 45% and 60%, more preferably 50%. Returning to Fig. 1, it is likely that the first field 105 would exceed the threshold area of cloud and the image of the first field would be discarded (via step 245), whereas the second field 110 and third field 115 would likely meet the threshold requirement and images of said fields would proceed to step 240 for further analysis.

[0041] At cloud localisation step 240, an artificial neural network determines positions within the image at which cloud is present. For each position (in other words, for each pixel of the image) a segmentation algorithm is used to determine whether that location is obscured by cloud. Given that each position (pixel) is analysed, the segmentation algorithm is necessarily slower than the classification algorithm used at step 220. However, computational and time costs for the method 200 are improved by use of the classification algorithm at step 220 to eliminate unnecessary (because there are no clouds) and unsuitable (because there is too much cloud cover) images before determination of cloud locations using the segmentation algorithm. In the former case, the image is retained for analysis (see steps 230 and 255) and in the latter case the image is discarded without further determination of cloud position at cloud localisation step 240 (and so is not used at steps 230 and 255). A mask, corresponding to the positions at which cloud is determined, is produced by the cloud localisation step 240.

[0042] Depending upon the output of determination 235, a user interface may display an indication to a user that there is full cloud (step 245), because cloud extent is greater than the threshold, or may display an indication to the user that there is local, or partial, cloud (step 250), because cloud is classified but its extent is less than or equal to the threshold.

[0043] At step 230, the area of the image to be analysed (i.e. the area that does not encompass any cloud) is confirmed, and the cloud-free area is passed to analysis step 255. If no cloud was detected at step 225, then the area to be analysed is set, at step 230, to be the whole area of the field, and this is passed to step 255. If cloud was detected at step 225 (and the cloud extent is less than or equal to the threshold at 235), then the mask determined using the segmentation algorithm at step 240 is used to exclude from further analysis areas of the image determined to have cloud. At step 230, the mask is applied to the image of the field and only the cloud-free area (or positions within the field/image) is allocated for analysis and passed to step 255. In the example of Fig. 1, the fourth field 120 has no cloud obscuring the field and so the image of the fourth field is passed to step 230 directly from step 225; at step 230 the area to be analysed is set as the whole image, and this is passed to the analysis step 255. The image of the second field 110 (and likewise the image of the third field 115) has cloud obscuring less than the threshold area, and so a mask is set by cloud localisation step 240 and used by the area-to-be-analysed step 230 to allocate the cloud-free area of the image of the second field. This cloud-free area is passed to the analysis step 255.

[0044] Step 255 receives the image of the cloud-free area of the field output from step 230 and analyses that cloud-free area as required. In some examples the analysis produces an average value of a parameter for the cloud-free positions of the field, and uses this also for the positions of the field obscured by cloud. In other examples, the analysis uses values of the parameter for cloud-free positions and uses these values to interpolate values for the parameter for the positions of the field that are obscured by cloud. These examples ensure a data point for the field (and positions within the field) even when the field is partially obscured by cloud.

[0045] The analysis may include analysis of vegetation (in the cloud-free positions) in the field. Analysis may involve a comparison of reflectance in two different bands of light. The analysis may involve determination of an index for the field, or a part thereof. Common indexes include, for example: (a) canopy coverage, which is the percentage of the field covered by crop, often used to determine how much fertiliser (e.g. nitrogen) to use on the crop; and (b) the normalised difference vegetation index (NDVI), which is equal to [NIR reflectance minus Red reflectance] divided by [NIR reflectance plus Red reflectance], used to determine vegetation health.
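
A direct transcription of the NDVI formula given above, averaged over cloud-free pixels only; the band arrays and the boolean cloud mask are assumed inputs.

```python
import numpy as np

def mean_ndvi(nir, red, cloud_mask):
    """NDVI = (NIR - Red) / (NIR + Red), over cloud-free pixels."""
    ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon guards divide-by-zero
    return float(ndvi[~cloud_mask].mean())
```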

[0046] Results of analysis at step 255, an indication of the area to be analysed (as determined at step 230), and indications of local cloud (step 250) and full cloud (step 245) may be output for display to a user at step 260.

[0047] The method 200 may be performed by an apparatus, for example a user apparatus. Each step of the method 200 may be implemented as a module of computer code performed by a processor, having memory that stores parameters and training data. A user apparatus may include a display. The area to be analysed and any mask may be displayed on the display of the user apparatus at output step 260. Results of the analysis step 255 may also be displayed on the display at output step 260. The full cloud graphic (step 245) and local cloud graphic (step 250) may be displayed on the display of the user apparatus at output step 260.

[0048] Fig. 3 provides further details of image download and extraction step 215 of Fig. 2. At step 305, an aerial image encompassing the field is downloaded. For example, a Sentinel-2 full-strip image, in all 13 bands, is downloaded. At step 310, bands of interest are extracted from the downloaded aerial image. In the example of Fig. 3, the red, green, blue, and near infra-red image bands are retained and remaining bands are discarded. At step 315, the field boundaries (acquired at step 210) are accessed and used to extract an image of an individual field (or to extract individual images of each field of interest present in the downloaded aerial image, saving the images of individual fields to respective files). This ensures that only the area of interest (the field) is used in subsequent steps. At step 320, the image is re-sized to a pre-defined image size for use in the subsequent steps. The pre-defined image size is typically a number of pixels, for example 128 x 128 pixels. For example, if the original image size of the field is larger than the pre-defined image size, then the image is re-sized to the pre-defined image size (i.e. its size is reduced). If the original image size of the field is smaller than the pre-defined image size (e.g. because it is a small field), then the image is re-sized by (a) resizing the image of the field to force it to the correct size (increasing its size); or (b) including neighbouring pixels (outside the field boundaries but close to the field) to create an image having the pre-defined image size, with part of the image being the field and part of the image being the areas outside the field.
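
A sketch of the re-sizing step under option (a), forcing the field crop to the pre-defined size by scaling; the 128 x 128 target is the example given in the text, and OpenCV's resize is an assumed implementation choice.

```python
import cv2

TARGET = (128, 128)  # pre-defined image size (width, height)

def resize_field(field_image):
    """Scale the field crop down (or up, for small fields) to the target."""
    return cv2.resize(field_image, TARGET, interpolation=cv2.INTER_AREA)
```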

[0049] The artificial neural network using the classification algorithm to detect clouds and their extent used at step 220, and the artificial neural network using the segmentation algorithm to assign position of cloud in the image used at step 240 are typically two separate neural networks: a first network using the classification algorithm at step 220 and a second neural network using the segmentation algorithm at step 240. Alternatively, a single artificial neural network may use both the classification algorithm (at step 220) and the segmentation algorithm (at step 240).

[0050] The classification algorithm detects patterns in the images in order to classify the images as including cloud or not including cloud (in other words, "cloudy" or "not cloudy"), and to classify the extent of the area of cloud. A regression may be used to generate the percentage area (extent) of cloud. The segmentation algorithm identifies the areas of the field that are covered by clouds and generates a pixel-level cloud-mask. A cloud-mask is an image, which may have the same image size as the pre-defined image size above. Each pixel of the cloud-mask may take one of two values, in this case (a) cloud pixel = maximum colour, and (b) no-cloud pixel = zero colour (for example, in a digital 8-bit per colour channel scale, a cloud pixel may be assigned RGB colour 255,255,255 (i.e. white) and a no-cloud pixel may be assigned RGB colour 0,0,0 (i.e. black, or absence of colour)). When the mask is applied to the image of the field, it removes from the analysis the positions obscured by cloud and does not affect the positions not obscured by cloud (cloud-free). The network or networks may be convolutional neural networks trained by supervised deep learning. The training may use training samples with positions labelled as cloud and not cloud. The trained neural network or networks are able to identify cloud based on colour and texture within the analysed images. The training samples may consist of images of agricultural areas (i.e. not encompassing towns and cities), and may be from the same country, or area of a country, as the field(s) analysed in method 200. Haze is treated as not being cloud, as would be understood by the skilled person.

[0051] The functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in hardware, an example hardware configuration may comprise a processing system in a device. The processing system may be implemented with a bus architecture. For certain aspects, a user interface (e.g., keypad, display, mouse, joystick, etc.) may also be connected to the bus. The processor may be responsible for managing the bus and general processing, including the execution of software stored on machine-readable media. The machine-readable media may be embodied in a computer-program product. In a hardware implementation, the machine-readable media may be part of the processing system separate from the processor.

[0052] The machine-readable media may comprise a number of software modules. The software modules include instructions that, when executed by the processor, cause the processing system to perform various functions. Each software module may reside in a single storage device or be distributed across multiple storage devices.

[0053] Referring now to Fig. 4, an example computer system 400 is shown in which the present invention, or portions thereof, can be implemented as computer-readable code to program processing components of the computer system 400. Various embodiments of the invention are described in terms of this example computer system 400. For example, the methods illustrated by the flowcharts of Figures 2 and 3 can be implemented in such a system 400. After reading this description, it will become apparent to a person skilled in the relevant art how to implement the invention using other computer systems and/or computer architectures. At least one input to the computer system 400 must be a satellite or aerial image.

[0054] Computer system 400 includes one or more processors, such as processor 402. Processor 402 can be a special purpose or a general-purpose processor. Processor 402 is connected to a communication infrastructure 401 (for example, a bus, or network). Computer system 400 also includes a user input interface 403 connected to one or more input devices 404 and a display interface 405 connected to one or more displays 406, which may be integrated input and display components. Input devices 404 may include, for example, a pointing device such as a mouse or touchpad, a keyboard, a touchscreen such as a resistive or capacitive touchscreen, etc. A computer display 407 (not shown in Fig. 4), in conjunction with display interface 405, can be used as a display and can display the outputs 260 described with reference to Fig. 2 (including the local cloud graphic 250, the full cloud graphic 245, the results of the cloud localisation 240 (including the mask), the image of the field and areas to be analysed 230, and the results of analysis 255). Alternatively, the outputs 260 can be printed on paper using printer 409 through printer interface 408.

[0055] Computer system 400 also includes a main memory 410, preferably random access memory (RAM), and may also include a secondary memory 411. Secondary memory 411 may include, for example, a hard disk drive 412 (not shown in Fig. 4), a removable storage drive 413 (not shown in Fig. 4), flash memory, a memory stick, and/or any similar non-volatile storage mechanism or cloud memory. Either or both of main memory 410 and secondary memory 411 may include means for allowing computer programs or other instructions to be loaded into computer system 400. Examples of such means may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) or the like.

[0056] Computer system 400 may also include a communications interface 414. Communications interface 414 allows software and data to be transferred between computer system 400 and external devices 415. Communications interface 414 may include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, or the like.

[0057] Various aspects of the present invention can be implemented by software and/or firmware (also called computer programs, instructions or computer control logic) to program programmable hardware, or hardware including special-purpose hardwired circuits such as application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc. of the computer system 400, or a combination thereof. Computer programs for use in implementing the techniques introduced here may be stored on a machine-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors.

[0058] Computer programs, model parameters and training data are stored in main memory 410 and/or secondary memory 411. It will also be appreciated that the model stored in these memories can be trained (and fixed) or adaptive (and susceptible to further training). Computer programs may also be received via communications interface 414. Such computer programs, when executed, enable computer system 400 to implement the present invention as described herein. In particular, the computer programs, when executed, enable processor 402 to implement the processes of the present invention, such as the steps in the methods illustrated by the flowcharts of Figures 2 and 3. Accordingly, such computer programs represent controllers of the computer system 400. Where the invention is implemented using software, the software may be stored in a computer program product and loaded into computer system 400 using removable storage drive 413, interface 403, hard drive 412, or communications interface 414.

[0059] Embodiments of the invention employ any computer useable or readable medium, known now or in the future. Examples of computer useable media include, but are not limited to, primary storage devices (e.g., any type of random access memory), secondary storage devices (e.g., hard drives, floppy disks, CD ROMs, ZIP disks, tapes, magnetic storage devices, optical storage devices, MEMS, nano-technological storage devices, etc.), and communication media (e.g., wired and wireless communications networks, local area networks, wide area networks, intranets, etc.).

[0060] It will be understood that embodiments of the present invention are described herein by way of example only, and that various changes and modifications may be made without departing from the scope of the invention.