

Title:
A METHOD FOR VALIDATING GEOLOGICAL MODEL DATA OVER CORRESPONDING ORIGINAL SEISMIC DATA
Document Type and Number:
WIPO Patent Application WO/2018/229469
Kind Code:
A1
Abstract:
The present invention describes a computer implemented method for generating a validated geological model from three-dimensional (3D) seismic data and legacy data, the method comprising the steps of (a) receiving said legacy data from a first data source; (b) receiving said 3D seismic data from a second data source; (c) based on receiving said legacy data and said 3D seismic data, generating an adaptive geological model from said 3D seismic data, said adaptive geological model comprising at least one characteristic geological property; (d) generating at least one synthetic seismic data from at least a first region of interest of said adaptive geological model, the synthetic seismic data being adapted to determine a qualitative similarity value between at least said first region of interest of said adaptive geological model and a corresponding region of interest of said received 3D seismic data; (e) comparing said qualitative similarity value to a corresponding value within said legacy data; (f) adjusting said at least one characteristic geological property until said qualitative similarity value is within a predetermined threshold region of said corresponding value from said legacy data, and (g) automatically generating said validated geologic model including said at least one characteristic geological property that has been modified to be within said predetermined threshold region of said corresponding value from said legacy data.

Inventors:
ECKERSLEY ADAM (GB)
LOWELL JAMES (GB)
HENDERSON JONATHAN (GB)
PATON GAYNOR SUZANNE (GB)
Application Number:
PCT/GB2018/051589
Publication Date:
December 20, 2018
Filing Date:
June 11, 2018
Assignee:
FOSTER FINDLAY ASS LTD (GB)
International Classes:
G01V1/28; G01V1/30
Domestic Patent References:
WO2016124878A1, 2016-08-11
Foreign References:
US20020013661A1, 2002-01-31
US4679174A, 1987-07-07
US20130118736A1, 2013-05-16
EP0745870A2, 1996-12-04
EP2867703A2, 2015-05-06
EP2867705A2, 2015-05-06
EP3047311A1, 2016-07-27
EP3001866B1, 2016-11-23
Other References:
BILL GOODWAY: "Elastic-wave AVO methods", 2002, PANCANADIAN ENERGY CORPORATION
Attorney, Agent or Firm:
MURGITROYD & CO (GB)
Claims:
CLAIMS

1. A computer implemented method for generating a validated geological model from three-dimensional (3D) seismic data and legacy data, the method comprising the steps of:

(a) receiving said legacy data from a first data source;

(b) receiving said 3D seismic data from a second data source;

(c) based on receiving said legacy data and said 3D seismic data, generating an adaptive geological model from said 3D seismic data, said adaptive geological model comprising at least one characteristic geological property;

(d) generating at least one synthetic seismic data from at least a first region of interest of said adaptive geological model, the synthetic seismic data being adapted to determine a qualitative similarity value between at least said first region of interest of said adaptive geological model and a corresponding region of interest of said received 3D seismic data;

(e) comparing said qualitative similarity value to a corresponding value within said legacy data;

(f) adjusting said at least one characteristic geological property until said qualitative similarity value is within a predetermined threshold region of said corresponding value from said legacy data, and

(g) automatically generating said validated geologic model including said at least one characteristic geological property that has been modified to be within said predetermined threshold region of said corresponding value from said legacy data.

2. A computer implemented method according to claim 1, wherein said adaptive geological model is based on a grid derived from said 3D seismic data.

3. A computer implemented method according to any one of the preceding claims, wherein said legacy data includes at least one characteristic geological property from any one of a well-log data and at least one predefined library.

4. A computer implemented method according to any one of the preceding claims, wherein said at least one characteristic geological property is assigned from and modified in accordance with at least one probable data pattern determined utilising machine-learning.

5. A computer implemented method according to claim 4, wherein said at least one characteristic geological property is assigned utilising any one of 'Kriging', 'Co-Kriging', stochastic modelling, a data from a predefined library, or 'machine-learning'.

6. A computer implemented method according to any one of claims 4 and 5, wherein said probable data pattern is determined using at least one predetermined training data.

7. A computer implemented method according to any one of the preceding claims, wherein said at least one characteristic geological property is at least one rock characteristic.

8. A computer implemented method according to any one of the preceding claims, wherein said adaptive geological model comprises a plurality of geological objects.

9. A computer implemented method according to claim 8, wherein all of said plurality of geological objects are adaptively linked to each other.

10. A computer implemented method according to any one of claims 8 and 9, wherein said plurality of geological objects includes any one of or any combination of a geobody, a horizon, a fault, and any other suitable planar geological feature derivable from said 3D seismic data.

11. A computer implemented method according to any one of the preceding claims, wherein in step (e) said at least one synthetic data is compared to said 3D seismic data utilising any one of visual comparison, a trace-by-trace correlation, a difference between predetermined correlating volumes, spectral comparison.

12. A computer system for generating a validated geological model from 3D seismic data and legacy data by a method according to any one of the preceding claims.

13. A computer readable storage medium having embodied thereon a computer program which, when executed by a computer, is configured to perform the method of any one of claims 1 to 11.

Description:
A METHOD FOR VALIDATING GEOLOGICAL MODEL DATA OVER CORRESPONDING ORIGINAL SEISMIC DATA

The present invention relates generally to the field of oil and gas exploration, and in particular to the field of computer aided exploration for hydrocarbons using geophysical data, such as for example seismic data, of the earth. Even more particularly, the present invention relates to geological modelling from 3D seismic data and to optimising the conceptual model against the original corresponding 3D seismic data, so as to minimise the uncertainty between the "real" properties and the interpreted properties derived from the geological model.

Introduction

In the oil and gas industry, geological data surveys such as, for example, seismic prospecting and other similar techniques are commonly used to aid in the search for and evaluation of subterranean hydrocarbon deposits. Furthermore, production geologists are increasingly using computer-based geological models or geomodels (e.g. built using 3D geocellular modelling packages) to represent a reservoir's geology.

One of the challenges for the interpreter(s) is to replicate known or conceptual geological features that are presumed to be present in the subsurface. The geomodel may then help to more accurately predict whether the implied geological features could prove to be hydrocarbon reserves of commercial value. For example, seismic facies interpretation can play a significant role in initial basin exploration, prospect evaluation, reservoir characterisation and, ultimately, field development.

A typical geological model may be used to control the distribution of relevant petro-physical well data to provide a basis for accurate volumetric assessment. However, most of the known conventional approaches to geological and property modelling rely on well data to drive the results, as (until very recently) the ability to see and derive information from seismic data was very limited. In particular, high-resolution geological models may be built upon 3D mathematical meshes that provide the numerical architecture for building the structural stratigraphic framework. The model(s) are then generally constructed and parameterized through software products that allow professional geoscientists to approximate the static state of the reservoir by interpolating or simulating, for example, geologic facies and their petro-physical properties within a 3D volume. This process is "model driven" from a scenario comprising the conceptual geological model. The interpolation and simulation algorithms used to fill, for example, the inter-well space, are performed using workflows based on these conceptual models and attempt to bind (or link) the results to logical rules derived from underlying geologic principles.

While the workflows can vary based on individual interpretation of the data, the results are generally obligated to respect the observed data. Within a given scenario, it is the interpolation algorithm that is responsible for providing the best estimate at every grid location and the simulation algorithm that is responsible for capturing the inherent variability, providing the basis for any uncertainty analysis. For example, in the known art, the current geological modelling workflow may be as follows:

• Build a structural framework (horizons and faults)

• Use the horizons and faults to build a geo-cellular grid

• Interpret log data to determine facies types within different geological layers and zones

• Perform data analysis to identify trends in data

• Upscale this information to populate the geo-cellular grid

• Apply modelling algorithms to extrapolate property values away from wells - various options

• Tie to any additional data source such as seismic attributes or trends

This process can take weeks or months to complete, and usually involves a simplification of interpreted horizons and/or faults, and may require a reduction in lateral resolution to create the geocellular grid. Some of the geological variations that may be seen in well log data are too complex to attempt to model. Often, there is additional degradation of the data due to the averaging methods used to assign values to grid cells.

Furthermore, away from the well, petro-physical well data may be distributed to all cells within the geological model using geostatistics. In this process, extrapolation of property values away from a well is often tied to seismic attributes such as, for example, the acoustic impedance.

As can be seen, current methods and modelling techniques involve a significant number of assumptions and approximations made during the geological modelling workflow, and none of these methods or techniques is capable of providing any kind of quality assurance of the modelled data and interpreted properties. That is, the interpreter has no way of knowing how closely the generated geological model matches the original data. Furthermore, none of the available techniques / methods is able to (quickly and easily) assess different assumptions and approximations, so as to ensure the best fit of the modelled data.

Summary of the Invention

Preferred embodiment(s) of the invention seek to overcome one or more of the above disadvantages of the prior art.

According to a first aspect of the invention there is provided a computer implemented method for generating a validated geological model from three- dimensional (3D) seismic data and legacy data, the method comprising the steps of:

(a) receiving said legacy data from a first data source;

(b) receiving said 3D seismic data from a second data source;

(c) based on receiving said legacy data and said 3D seismic data, generating an adaptive geological model from said 3D seismic data, said adaptive geological model comprising at least one characteristic geological property;

(d) generating at least one synthetic seismic data from at least a first region of interest of said adaptive geological model, the synthetic seismic data being adapted to determine a qualitative similarity value between at least said first region of interest of said adaptive geological model and a corresponding region of interest of said received 3D seismic data;

(e) comparing said qualitative similarity value to a corresponding value within said legacy data;

(f) adjusting said at least one characteristic geological property until said qualitative similarity value is within a predetermined threshold region of said corresponding value from said legacy data, and

(g) automatically generating said validated geologic model including said at least one characteristic geological property that has been modified to be within said predetermined threshold region of said corresponding value from said legacy data.

Advantageously, said adaptive geological model may be based on a grid derived from said 3D seismic data. Preferably, said legacy data may include at least one characteristic geological property from any one of a well-log data and at least one predefined library.

Advantageously, said at least one characteristic geological property may be assigned from and modified in accordance with at least one probable data pattern determined utilising machine-learning.

Preferably, said at least one characteristic geological property may be assigned utilising any one of 'Kriging', 'Co-Kriging', stochastic modelling, a data from a predefined library, or 'machine-learning'. Even more preferably, said probable data pattern may be determined using at least one predetermined training data. Even more preferably, said at least one characteristic geological property may be at least one rock characteristic.

Advantageously, said adaptive geological model may comprise a plurality of geological objects.

Advantageously, all of said plurality of geological objects may be adaptively linked to each other. Advantageously, said plurality of geological objects may include any one of or any combination of a geobody, a horizon, a fault, and any other suitable planar geological feature derivable from said 3D seismic data.

Advantageously, in step (e) said at least one synthetic data is compared to said 3D seismic data utilising any one of visual comparison, a trace-by-trace correlation, a difference between predetermined correlating volumes, spectral comparison.

According to a second aspect of the invention there is provided a computer system for generating a validated geological model from 3D seismic data and legacy data by a method according to any one of the preceding claims.

According to a third aspect of the invention there is provided a computer readable storage medium having embodied thereon a computer program which, when executed by a computer, is configured to perform the method of the first aspect of the invention.

Alternatively, there is provided a method for validating a geological model to a corresponding 3D seismic data, comprising the steps of:

(a) generating an adaptive geological model from said original 3D seismic data comprising at least one first characteristic geological property;

(b) generating at least one synthetic seismic data from at least a first region of interest of said adaptive geological model that is adapted to determine a qualitative similarity value to a corresponding region of interest of said original 3D seismic data;

(c) comparing said qualitative similarity value to a predetermined reference value;

(d) modifying said at least one first characteristic geological property until said qualitative similarity value is within a predetermined threshold region from said predetermined reference value.

Advantageously, in the alternative method, in step (d) said at least one first characteristic geological property is modified based on at least one probable data pattern determined using machine learning algorithms. Preferably, said at least one probable data pattern is determined utilising a predetermined training set of said at least one first characteristic geological property. Even more preferably, said at least one first characteristic geological property is at least one rock characteristic.

Brief Description of the Drawings

Preferred embodiments of the present invention will now be described, by way of example only and not in any limitative sense, with reference to the accompanying drawings, in which:

Figure 1 shows an illustration of the workflow of the present invention "closing the loop" between the generated geomodel and the original seismic data through validation;

Figure 2 shows (a) an example illustration of adaptive horizons and faults, and (b) an example illustration of a watertight model created using an adaptive framework;

Figure 3 shows (a) an illustration of additional layers that are automatically created between horizons, and (b) an illustration of the additional layering between horizon pairs being displayed as a volume;

Figure 4 shows an illustration of the distribution of rock properties within each geological layer;

Figure 5 shows an illustration of sub-seismic sample layers;

Figure 6 shows an illustration of a geological model at seismic scale representing seismic data;

Figure 7 shows (a) an illustration of variable density modelling the seismic, and (b) an illustration of an RGB frequency blend produced directly from the variable density;

Figure 8 shows a simplified illustration of a seismic wave hitting a layer boundary at an angle θ1, resulting in reflected and transmitted P- and S-waves;

Figure 9 shows an illustration of rock boundaries, as well as wavelet extraction;

Figure 10 shows an illustration of an example trace and corresponding extracted wavelet;

Figure 11 shows an illustration of a geological model that is split into sections / regions;

Figure 12 shows (a) a close-up of the top-right region of the geological model in Figure 11, and (b) its underlying seismic data;

Figure 13 shows an illustration of an example learning algorithm utilised to classify lithology;

Figure 14 shows an illustration of a sliding window for visual comparison of the synthetic seismic reflectivity data and corresponding image from the real data;

Figure 15 shows examples (a) and (b) of single layer correlation between synthetic and RGB frequency blend, and

Figure 16 shows an example of a multilayer correlation between synthetic and RGB frequency blend.

Detailed description of the preferred embodiment(s)

The exemplary embodiments of this invention will be described in relation to geological modelling described in any one of EP2867703A, EP2867705A, EP3047311A, EP3001866B and WO2016/124878. However, it should be appreciated that, in general, the method and system of the present invention will work equally well with any other geological model created from 2D or 3D seismic data. For purposes of explanation, it should be appreciated that the terms 'determine', 'calculate' and 'compute', and variations thereof, as used herein are used interchangeably and include any type of methodology, process, mathematical operation or technique. The terms 'generating' and 'adapting' are also used interchangeably, describing any type of computer modelling technique for visual representation of a subterranean environment from geological survey data, such as 3D seismic data. In addition, the terms 'vertical' and 'horizontal' refer to the angular orientation with respect to the surface of the earth, i.e. a seismic data volume is orientated such that 'vertical' means substantially perpendicular to the general orientation of the ground surface of the earth (assuming the surface is substantially flat), and 'horizontal' means substantially parallel to the general orientation of the ground surface of the earth. In other words, a seismic data volume is therefore in alignment with respect to the surface of the earth so that the top of the seismic volume is towards the surface of the earth and the bottom of the seismic volume is towards the centre of the earth. In addition, the term 'time domain' may also define the vertical direction of the seismic traces with regard to the surface of the earth, whereas the term 'lateral' may refer to a horizontal displacement with regard to the surface of the earth. Furthermore, the term 'atom' is generally known by the person skilled in the art and refers to an adapted wavelet from a dictionary of wavelets used to generate an analytical model function.

As illustrated in Figure 1, the computer system with improved functionality from the present invention "closes the loop" between the 3D geological model and the original seismic data. The new concept enables both the seismic interpretation and the geological concepts to be validated against the original data and allows different geological hypotheses suggested by the data to be examined.

In particular, the purpose of the invention is to make it much easier to Quality Control (QC) and therefore validate any 3D geological model and its underpinning geological concepts.

A preferred method of building the initial geological model is now discussed.

Model Creation

As shown in Figure 2, a "watertight" geological model may be created using an adaptive framework as described in any or all of EP2867703A, EP2867705A, EP3047311A, EP3001866B and WO2016/124878. Within the adaptive framework, the interpreted adaptive horizons, faults and geobodies are linked to one another, and are configured to update "on the fly" when any one of these adaptive surfaces (i.e. horizons, faults and geobodies) is modified (e.g. by the interpreter), or the interpretation is refined. In many real applications, the geocellular grid (corner point grid) used to model petrophysical and dynamic properties in the reservoir does not coincide with the seismic grid. In particular, geocellular grid cells are usually larger than the bin size of the seismic survey. In the present invention, the automatic refinement / adaptation of the surfaces occurring on the seismic grid removes the need to generate a geocellular grid. As is known from the currently available prior art, the creation of a geocellular grid can take a significant amount of time, as the surfaces are adjusted / modified to fit, for example, a coarser grid. In the present invention, the automatic refinement is based on the seismic grid, making the process of interpretation faster and more efficient.

As illustrated in Figures 3 (a) and (b), within the "watertight" model, additional layers between interpreted horizons and faults can be defined manually or automatically. That is, the manually and automatically detected layers can be edited so that boundaries can be shifted or layers can be combined or split. This provides a very efficient method to divide the data into compartments representing different geological units. As is shown in Figure 4, the geometries defined by the interpreted and automatically generated layers may form zones with assigned rock properties. Layers can be based on user-defined horizons, data-driven sub-horizons or iso-proportional slicing. The advantage of user-defined layers or horizons is that each layer is manually QC'ed (quality controlled), though at the expense of increased time. Iso-proportional slicing may be advantageous in really noisy data or complex data (where signals are patchy), using other horizons to guide the intermediate layers through the noisy data. Also, as the vertical resolution of well log data is finer than that of seismic data, within each zone intermediate subsample layers are defined to be more closely spaced than the seismic sampling. Rock properties are then assigned to each of these units or groups of units. Figure 5 illustrates a preferred method for subsampling the layers using iso-proportional slicing (see the sketch below). The rock properties can then be assigned to each layer directly from any one of well data, a pre-defined library of common rock properties from different regions of the world, or by using machine learning. Once a geological model has been created (or a previous geological model has been provided), the hypotheses represented by that model are validated. During the validation, any one of the whole volume, a sub-volume or a slice of the geological model may be examined.
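As an illustration only, the iso-proportional sub-layering described above can be sketched as a simple proportional interpolation between a top and a base horizon; the function name, grid size and horizon values below are hypothetical and not taken from the patent.

```python
import numpy as np

def iso_proportional_layers(top, base, n_layers):
    """Insert (n_layers - 1) intermediate surfaces proportionally spaced
    between a top and a base horizon (2D arrays of two-way time or depth)."""
    fractions = np.linspace(0.0, 1.0, n_layers + 1)
    return [top + f * (base - top) for f in fractions]

# Hypothetical example: split a zone into 8 sub-seismic layers between
# two mapped horizons on a 100 x 100 grid (values in ms TWT, invented).
top = np.full((100, 100), 1200.0)
base = top + 80.0 + 10.0 * np.random.rand(100, 100)
surfaces = iso_proportional_layers(top, base, 8)   # 9 surfaces bounding 8 layers
```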

Validation

The validation step provides a fast, practical workflow that enables cross-checking at every stage of the interpretation and 3D modelling workflow, making it easier to QC and validate both a 3D static model and the geological concepts that underpin that model.

Here, a synthetic model is created from the geological model using forward modelling techniques, so as to enable a comparison back to the original data, and also to allow different ideas and concepts to be tested. Various forward modelling techniques may be used to generate the synthetic model(s). A preferred forward modelling method of the present invention is as follows:

Forward Modelling

As is known, complex interference patterns occurring in seismic data and RGB frequency colour blends can be related to thickness and impedance variations. Although variation in bed thickness is a dominant factor controlling seismic amplitude and RGB blended frequency decomposition colour responses, subtle lithological changes (presented as differences in acoustic impedance contrast) can be differentiated as a second order effect.

Referring now to Figures 6 and 7, forward modelling is used to model the complex interference patterns occurring in seismic data. Forward modelling generates synthetic variable density volumes that are derived from earth models or other geological models. Although a variety of forward modelling algorithms are suitable for generating synthetic variable density volumes, the preferred algorithm used with the present invention is the Aki-Richards approximation to the Zoeppritz equations.

Aki-Richards approximation for Forward Modelling

Modelling seismic wave energy as it partitions at the interface of two different rock layers is a fundamental part of forward modelling. The use of the Zoeppritz equations to model rock boundaries, as well as a number of linearized approximations to the Zoeppritz equations, has long been accepted.

An overview of the Zoeppritz equations is now given, together with a description of the Aki-Richards approximation.

The Zoeppritz equations calculate the amplitudes of transmitted and reflected components of the P- (longitudinal) and S- (transverse) waves for a seismic wave incident on a boundary between two different rock layers. The full Zoeppritz system comprises four equations in four unknowns. The equations can be solved, but they do not provide an intuitive understanding of the relationship between the coefficients and the rock properties (e.g. density (ρ), P-wave velocity (Vp), S-wave velocity (Vs)) either side of the boundary. Of the four results calculated by these equations, the amplitude of the reflected P-wave is the most important characteristic for the forward modelling implementation within the method of the present invention. In practice, there are numerous approximations to the Zoeppritz equations that are simpler and more intuitive to understand in terms of the rock properties. 'Elastic-wave AVO methods' by Bill Goodway, PanCanadian Energy Corporation (2002), provides the Aki-Richards (1980) approximation. Again, this is the preferred approach in the method of the present invention. It is a three-term equation in terms of the ratios of density, P-wave velocity and S-wave velocity between the two layers on either side of the boundary to which the equation is applied.

Figure 8 shows a simplified illustration of a wave hitting a boundary at an angle θ1, resulting in reflected and transmitted P- and S-waves. The reflection coefficient calculated using the Aki-Richards approximation is that of the reflected P-wave (Rpp(θ1)), given as:

Rpp(θ) ≈ (1/2)(ΔVp/Vp + Δρ/ρ) − 2 (Vs/Vp)² (2 ΔVs/Vs + Δρ/ρ) sin²θ + (1/2)(ΔVp/Vp) tan²θ   [Eq. 1]

where Δx = x2 − x1 and x = (x1 + x2)/2, for x representing each of ρ, Vp and Vs.

The other three results that can be obtained from both the Zoeppritz and Aki-Richards equations are the amplitudes of the transmitted P-wave and of the reflected and transmitted S-waves.

The angle of the transmitted P-wave is determined by Snell's law as:

sin(θ2) / Vp2 = sin(θ1) / Vp1, i.e. θ2 = arcsin((Vp2 / Vp1) sin(θ1))   [Eq. 2]

Snell's law is used in ray tracing algorithms, which take into account the impact of different layers in a multi-layered model. In such algorithms, the transmitted P-wave from the first boundary encountered would determine the angle at which the wave with an initial angle of θ1 hits the second boundary.

However, the method of the present invention does not account for this, but instead considers each boundary independently of the rock layers that lie above it. Whilst this provides a less accurate model, it does have the property that a boundary between the same two rock types will always yield the same reflection coefficient, irrespective of where it sits in the model, so that changing the upper layers of the model will not affect the synthetic seismic result in the lower part of the model.

Where multiple angles of incidence are used, Rpp(θ) is calculated for θ at 1° intervals from the minimum to the maximum angle in the range. The resulting values are averaged with a mean function to yield a single reflection coefficient, Rpp(θ{min,max}), for an angle range from θmin to θmax.
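A minimal Python sketch of this calculation (Eq. 1 plus the 1° averaging described above) is given below; the function names and the rock property values in the example are illustrative assumptions, not values from the patent.

```python
import numpy as np

def aki_richards_rpp(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Reflected P-wave coefficient Rpp(theta) from the three-term
    Aki-Richards approximation (Eq. 1), using interface averages and
    contrasts of the layer properties either side of the boundary."""
    theta = np.radians(theta_deg)
    vp, vs, rho = (vp1 + vp2) / 2.0, (vs1 + vs2) / 2.0, (rho1 + rho2) / 2.0
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2
    return (0.5 * (dvp / vp + drho / rho)
            - 2.0 * k * (2.0 * dvs / vs + drho / rho) * np.sin(theta) ** 2
            + 0.5 * (dvp / vp) * np.tan(theta) ** 2)

def averaged_rpp(vp1, vs1, rho1, vp2, vs2, rho2, theta_min, theta_max):
    """Mean of Rpp over 1-degree steps, giving Rpp(theta_{min,max})."""
    angles = np.arange(theta_min, theta_max + 1.0, 1.0)
    return float(aki_richards_rpp(vp1, vs1, rho1, vp2, vs2, rho2, angles).mean())

# Example with invented shale-over-sand properties (velocities in m/s, density in kg/m3)
r = averaged_rpp(2700.0, 1200.0, 2400.0, 3000.0, 1600.0, 2250.0, 0.0, 30.0)
```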

A seismic trace s(θ) for an incidence angle θ is defined from the reflectivity Rpp(θ) and the wavelet ω by:

s(θ) = Rpp(θ) * ω   [Eq. 3]

The Aki-Richards approximation to the Zoeppritz equations given earlier is used to compute the reflectivity (Rpp) from the rock properties.
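As a hedged illustration of Eq. 3, the sketch below convolves a sparse reflectivity series with a generic Ricker wavelet (one of the wavelet types mentioned below); the reflection coefficients, sample count, sample interval and frequency are invented for the example.

```python
import numpy as np

def ricker(freq, dt=0.002, length=0.128):
    """Generic zero-phase Ricker wavelet (frequency in Hz, times in s)."""
    t = np.arange(-length / 2.0, length / 2.0, dt)
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_trace(reflectivity, wavelet):
    """Eq. 3: the synthetic trace is the reflectivity series convolved with the wavelet."""
    return np.convolve(reflectivity, wavelet, mode='same')

# Invented example: two boundaries in a 251-sample reflectivity series
r = np.zeros(251)
r[100], r[160] = 0.08, -0.05           # reflection coefficients at two boundaries
s = synthetic_trace(r, ricker(30.0))   # 30 Hz Ricker, purely illustrative
```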

Generic wavelets such as a Ricker or Gabor can then be used to generate a synthetic trace; however, to more accurately model the seismic, extracted wavelets may be used. Figure 9 shows an illustration of three rock layers and their boundaries, as well as wavelet extraction.

Wavelet Extraction

Wavelets are required to model the seismic wave energy as it partitions at the interface of two different rocks. The type of wavelet (such as Ricker, Gabor or any other suitable one) and its dominant frequency greatly influence the seismic reflectivity data produced from forward modelling.

To obtain more accurate models, extracting wavelets directly from the seismic data is the preferred method. For example, wavelet extraction uses the data from a predetermined number of in-lines and cross-lines within a region of interest (ROI). A variety of techniques are available for wavelet extraction, including, for example, autocorrelation. Under this approach, the autocorrelation of the signal is treated as a rescaled version of the autocorrelation of the wavelet. This technique uses the Wiener-Khinchin theorem and extracts a zero phase wavelet. The spectrum of the wavelet can be computed from the spectrum of the autocorrelation of the signal:

|F(w)| = √|F(Cww)| = B √|F(Css)|   [Eq. 4]

for some constant B. The wavelet is then computed by the inverse Fourier transform:

w = B F⁻¹( √|F(Css)| )   [Eq. 5]

The final wavelet can be normalised by dividing it by its maximum value. Figure 10 shows an illustration of an example trace and its extracted wavelet.
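The wavelet-extraction route described above (average trace autocorrelation, Wiener-Khinchin spectrum, zero-phase inverse transform, normalisation by the maximum) can be sketched as follows; the window length and the reshaping of the input volume are assumptions made for illustration, not details from the patent.

```python
import numpy as np

def extract_zero_phase_wavelet(traces, n_samples=101):
    """Zero-phase wavelet estimate from a set of traces (2D array, one trace
    per row), following the autocorrelation / Wiener-Khinchin route above.
    The output window length is an assumption."""
    traces = np.asarray(traces, dtype=float)
    # average autocorrelation over all traces in the region of interest
    acfs = [np.correlate(tr - tr.mean(), tr - tr.mean(), mode='full')
            for tr in traces]
    mean_acf = np.mean(acfs, axis=0)
    # Eq. 4: wavelet amplitude spectrum is proportional to the square root
    # of the spectrum of the signal autocorrelation
    amp = np.sqrt(np.abs(np.fft.fft(mean_acf)))
    # Eq. 5: zero-phase wavelet via the inverse transform, centred in the window
    w = np.fft.fftshift(np.real(np.fft.ifft(amp)))
    mid, half = len(w) // 2, n_samples // 2
    w = w[mid - half: mid + half + 1]
    # normalise by the maximum value, as described above
    return w / np.abs(w).max()

# e.g. wavelet = extract_zero_phase_wavelet(volume.reshape(-1, volume.shape[-1]))
```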

Parameters

Control over the properties used to generate the synthetic data, such as the frequency and phase of the wavelet, the range of angles contributing to the model and the layer thicknesses, allows the validity of the model to be examined. Multiple realisations of the synthetic data can be produced to examine the effect of changing the imaging parameters, but more importantly, to compare the effect of changing the layer properties. In the present invention, an example-driven approach is applied to easily compare the effect of changing parameters.

Similarly, frequency decomposition based RGB blends may be generated from the synthetic data, therefore allowing the impact of changing the frequency decomposition parameters on the result to be examined, i.e. how the frequency response differs if the rock layer properties are changed, or how the frequency response may change if the parameters used to generate the synthetic data are changed. Comparisons can then be made between (i) what is observed in the synthetic model and (ii) the results generated using the same frequency decomposition parameters on the original seismic data.
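Purely as an illustration of producing an RGB frequency blend from a synthetic (or real) trace, the sketch below uses a simple band-pass and envelope decomposition; the band limits and sampling rate are invented, and this stands in for, rather than reproduces, the frequency decomposition described in the patent.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def rgb_frequency_blend(trace, fs=250.0, bands=((10, 20), (25, 35), (40, 50))):
    """Simple band-pass / envelope frequency decomposition of one trace,
    normalised per band so the three responses can be displayed as R, G, B.
    The sampling rate and band limits are illustrative assumptions."""
    channels = []
    for lo, hi in bands:
        b, a = butter(4, [lo, hi], btype='band', fs=fs)
        band = filtfilt(b, a, trace)          # isolate the frequency band
        env = np.abs(hilbert(band))           # envelope of the band response
        channels.append(env / (env.max() + 1e-12))
    return np.stack(channels, axis=-1)        # shape (n_samples, 3): R, G, B

# e.g. blend = rgb_frequency_blend(s)   # 's' being a synthetic or real trace
```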

This level of interaction provides a unique and very easy to apply methodology for determining whether or not the geological model, and the hypotheses used in defining it, really are supported by the original seismic data.

Rock Property Propagation

Assigning rock properties within the model layers can be achieved using any one of the following techniques:

a) Kriging:

Kriging is an interpolation method that allows for the creation of a model based on sparse data from, for example, multiple wells within a region. The aim is to "fill out" (i.e. populate) a 3D model using the known well data, each point weighted using spatial covariance to account for the distance of the target from each known data point. Kriging provides the best interpolation of the available points, as measured by minimising its associated error variance. The general formula for Kriging, used to estimate a value at an unknown point (s0), can be given as:

Z(s0) = Σ(i=1 to N) λi Z(si)   [Eq. 6]

where Z(si) are the measured values at N known locations and λi are unknown weights, to be determined, summing to 1. Kriging varies from other simpler methods in that these weights are based on the spatial arrangement of the known data points, rather than just their distance from the target. This allows for isolated known data points to be weighted more than clusters of points, which may contribute more similar and sometimes redundant information.
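A minimal ordinary-Kriging sketch of Eq. 6 is given below, assuming an exponential covariance model with illustrative sill and range parameters; none of these values, coordinates or names come from the patent.

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, rng=500.0):
    """Ordinary-Kriging estimate at an unsampled point (Eq. 6), using an
    assumed exponential covariance C(h) = sill * exp(-h / rng); the
    covariance parameters here are illustrative only."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)

    def cov(a, b):
        return sill * np.exp(-np.linalg.norm(a - b) / rng)

    # Kriging system: covariances between the known points, plus a
    # Lagrange row/column enforcing that the weights sum to 1
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            A[i, j] = cov(coords[i], coords[j])
    b = np.ones(n + 1)
    b[:n] = [cov(c, np.asarray(target, dtype=float)) for c in coords]
    lam = np.linalg.solve(A, b)[:n]      # weights lambda_i, summing to 1
    return float(lam @ values)           # Z(s0) = sum_i lambda_i * Z(s_i)

# Invented example: three wells with measured porosity, estimate at (250, 250)
z0 = ordinary_kriging([(0, 0), (400, 100), (150, 600)],
                      [0.18, 0.22, 0.15], (250, 250))
```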

Variations of the Kriging technique allow for additional information to be used to determine the weights λi.

b) Co-Kriging:

Co-Kriging is an extension of Kriging which uses multiple attributes in a multivariate system to produce an estimate of unknown points. Secondary abundant data sources (such as seismic data) that are highly correlated with the primary data can be incorporated into the measure. The secondary data is then utilised to guide the interpolation of the primary sparse well data.

c) Stochastic Modelling:

The mathematically 'best' interpolation path is not always the most suitable as a realistic solution. Stochastic modelling creates multiple equally probable realisations of a model, each of which fits the sparse well data but interpolates between the data points in different ways. Creating many such realisations allows for a greater choice and investigation of solutions which would never be reached by so-called 'safe' methods, such as Kriging, which always seek to minimise the error.

The average of many realisations should produce a result very similar to that of Kriging, smoothing out all the differences between the individual models. Error estimates and uncertainty analysis may also be derived from these multiple realisations. Each individual model, however, will be noisier than the "Kriged" result, but these can be smoothed after extracting the required statistics.

d) Base Library:

In exploration regions where no well data is available, assumptions about the rock properties may be made. A base library may be provided with common rock properties retrieved from different regions of the world. The common rock properties may be assigned to each model layer (e.g. facies) using multi-attribute volumes (including frequency magnitude volumes) and facies classification. The validity of the rock property assignment may then be assessed by QC'ing (Quality Controlling) the geomodel back against the original seismic data.

e) Machine learning:

Machine Learning is a process whereby computer algorithms "find" patterns in the data, enabling the computer to predict probable outcomes. In an aspect of the present invention, Machine Learning is utilised to investigate complex / high-dimensional parameter spaces and determine the most probable rock properties observed in the seismic data, and then to assign the predicted rock properties to each geological layer.

Machine Learning is therefore used to predict the rock properties observed in the seismic data by forward modelling each geological layer model with a range of common rock properties (within each layer), forming a training set of known labelled results.

Effectively, each variable density generated within the training set forms a "ground truth" in which information such as frequency decomposition magnitude volumes, facies classification volumes and attribute volumes (such as Quadrature, Envelope, Instantaneous Phase) is used as input features to train learning algorithms, such as, for example, a deep learning neural network. To reduce the complexity and variability of the classification problem, the geological model that is compared back to the seismic can be split into smaller regions. These regions could be divided into blocks (sub-volumes) and/or slices as shown in Figures 11 and 12.

Additionally, the classification problem can be simplified even further by isolating the variability of the rock properties (Density, Vp, Vs) from other elements of the synthetic seismic data generation, such as layer boundary and wavelet characteristics.

To this end, it is assumed that the model layers in the geological model define the rock lithology boundaries and not the seismic reflectivity, and that an appropriate wavelet is used, preferably an extracted wavelet from the seismic region of interest.

As the selected wavelet and the thickness between layer boundaries significantly influence the generation of the seismic data, the degrees of freedom (DOF) of a synthetic model are reduced by fixing two out of the three forward modelling components, i.e. the wavelet and the model layer thickness (for each region of interest).

Furthermore, for each layer within the geological model, a training set is built "on the fly", comprising, inter alia, an extracted seismic wavelet, the layer boundaries from the geological model, and varying rock properties (Density, Vp, Vs).

Different rock property sequences are then used to create the training set for each zone. These sequences are defined using empirical knowledge of common rock sequencing (such as shale / sand / shale) from different parts of the world. Within each sequence, the full range of expected Density, Vp and Vs values for each rock within the sequence is used to generate a variable density for each combination. This process may be repeated to include fluid substitution. As illustrated in Figure 13, using the labelled training set of common lithologies within a geological model, mimicking the lithology boundaries observed within the seismic, a learning algorithm, such as a deep learning neural network, is used to extrapolate new features from the limited set of features contained in the training set (such as frequency decomposition magnitude volumes, facies classification volumes and attribute volumes) to classify lithology types (sand, shale etc.).

Having now trained a learning algorithm to classify rock lithology within a particular zone, features extracted from the corresponding real data are fed into the learning algorithm to obtain a lithology prediction. The lithology prediction is then used to populate the appropriate layer within the geological model.
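A hedged sketch of this train-then-predict step is shown below, using a small scikit-learn neural network as a stand-in for the deep learning network mentioned above; the feature arrays, label encoding and network size are placeholders invented for the example, not values from the patent.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# X_train: attribute features (e.g. frequency decomposition magnitudes,
# envelope, instantaneous phase) sampled from the forward-modelled training
# volumes; y_train: the lithology label of each training sample.
# The arrays below are random placeholders for illustration only.
rs = np.random.default_rng(0)
X_train = rs.normal(size=(5000, 6))
y_train = rs.integers(0, 3, size=5000)        # e.g. 0 = sand, 1 = shale, 2 = silt

# Small fully connected network standing in for the deep learning network
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# Features extracted from the corresponding real seismic region are then fed
# to the trained classifier to obtain a per-sample lithology prediction,
# which is used to populate the matching layer of the geological model.
X_real = rs.normal(size=(1000, 6))
lithology_prediction = clf.predict(X_real)
```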

QC Step:

To validate a geological model with assigned rock properties against the original seismic data, an assessment method is required. The assessment method takes the geological model to which the lithologies have been assigned, whether by geostatistical, Base Library or Machine Learning propagation or any other method of assignment, and compares it back to the original data.

The synthetic model generated from the geological model can now be compared back to the original seismic data using any one of a number of methods, such as, for example:

a) Visual comparison

Referring now to Figure 14, using an interactive display such as a sliding window, a visual comparison of the synthetic seismic reflectivity data (or derived volumes such as frequency decomposition blends or other seismic attributes) and the corresponding imagery from the real data can be reviewed and compared.

b) Independent trace-by-trace correlation

A full trace-length measure of similarity is calculated between the original seismic data and the synthetic model, giving per-trace correlation with lateral granularity/localisation but no vertical localisation.
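A minimal sketch of such an independent trace-by-trace correlation is given below, assuming both volumes are numpy arrays with the same (inline, crossline, sample) shape; the function name and array layout are illustrative assumptions.

```python
import numpy as np

def trace_by_trace_correlation(real, synthetic):
    """Per-trace Pearson correlation between two volumes shaped
    (n_inlines, n_crosslines, n_samples); returns a 2D similarity map
    with lateral but no vertical localisation."""
    ni, nx, _ = real.shape
    corr = np.zeros((ni, nx))
    for i in range(ni):
        for j in range(nx):
            a = real[i, j] - real[i, j].mean()
            b = synthetic[i, j] - synthetic[i, j].mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            corr[i, j] = (a * b).sum() / denom if denom > 0 else 0.0
    return corr
```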

c) Volume difference

The original seismic data and synthetic data are compared as a whole, or within a series of sub-volumes. The comparison could be based directly on the seismic reflectivity data (i.e. the synthetic model), or on any other seismic attribute, such as envelope. This is a simple but noisy indicator, as the differences are per-voxel and very localised.

d) Spectral comparison

A Fast Fourier Transform is applied over both the seismic and the synthetic model, over full volumes, sub-volumes or slices, using different levels of granularity and averaging to compare the spectral content of the model and the seismic data. This produces a global measure of spectral content differences, but without localisation within the model.
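The global spectral comparison described above can be sketched as follows; the 4 ms sample interval and the averaging over all traces of a 3D array are assumptions made for illustration.

```python
import numpy as np

def spectral_difference(real, synthetic, dt=0.004):
    """Compare the average amplitude spectra of two volumes (or sub-volumes /
    slices) shaped (n_inlines, n_crosslines, n_samples). Returns the frequency
    axis and the per-frequency difference in mean spectral amplitude."""
    n_samples = real.shape[-1]
    freqs = np.fft.rfftfreq(n_samples, d=dt)
    spec_real = np.abs(np.fft.rfft(real, axis=-1)).mean(axis=(0, 1))
    spec_syn = np.abs(np.fft.rfft(synthetic, axis=-1)).mean(axis=(0, 1))
    return freqs, spec_real - spec_syn
```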

e) Layer based

Referring now to Figures 15 and 16, an event or layer based measure of similarity could be formed by breaking down the full volume, sub-volume or slice into its layers, and computing an overall correlation factor from the correlation between the original seismic and synthetic seismic data for each point within each layer. The correlation may be performed for a single layer, as shown in Figure 15, or for multiple layers, as shown in Figure 16.

f) Wavelet based

Matching pursuit decomposition techniques (e.g. GeoTeric™'s High Definition Frequency Decomposition), which characterise a signal by matching it to a set of wavelets, can also be used to provide estimates of the correlation between the original seismic data and the synthetic data created from the model. The wavelet matching uses, for example, a dictionary of wavelets that are matched to an input signal by varying the wavelet scale, modulation, amplitude and phase. Correlations can then be made between the original seismic data and the synthetic seismic data by comparing these parameters. The correlation measurements may be done on a point-by-point basis, by comparing sections of a trace or by comparing sections of multiple traces. Most importantly, good correlation between some of the parameters but not others can be used as a diagnostic to determine which parameters of the model might be incorrect; e.g. if the scale, modulation and amplitude of the matched wavelets show a good correlation, but the phase of the matched wavelets does not correlate well, then the problem is likely to be an error in the thickness value that has been assigned to the layer in the model.

Machine Learning:

Taking metrics from one or more of the aforementioned QC (Quality Control) approaches, Machine Learning techniques may be used to classify good matches between the original seismic data and the synthetic data created from the geological model, classifying the fit by comparing sections of a trace or sections of multiple traces.

It will be appreciated by persons skilled in the art that the above embodiment has been described by way of example only and not in any limitative sense, and that various alterations and modifications are possible without departing from the scope of the invention as defined by the appended claims.