


Title:
METHOD AND DEVICE FOR ESTIMATING A COMBUSTION EFFICIENCY VALUE DURING FLARING
Document Type and Number:
WIPO Patent Application WO/2024/057051
Kind Code:
A1
Abstract:
The invention relates to a computing device and method for estimating (100) a combustion efficiency value during flaring, over a time period, said method comprising the following steps: - Acquiring (140) a video stream of the flare flame (61) over the time period; - Segmenting (150) the video stream into several video segments, each video segment being associated with a video segment duration; - Analyzing (160) the video segments, using a correlation model, so as to classify each video segment in at least one flame state category; and - Computing (170) the combustion efficiency value, said computing (170) step using the video segment durations and a plurality of unburned reduction index values, each of said unburned reduction index values being specific to one of the flame state categories, specific to the industrial plant and calculated using computational fluid dynamics.

Inventors:
CUIF-SJOSTRAND MARIANNE (FR)
BONNASSE-GAHOT MARIE (FR)
MAUNOURY ABEL (FR)
DUGUE JACQUES (FR)
Application Number:
PCT/IB2022/000536
Publication Date:
March 21, 2024
Filing Date:
September 15, 2022
Assignee:
TOTALENERGIES ONETECH (FR)
International Classes:
F23G7/08; F23G5/50; F23N5/08; F23N5/16; F23N5/20
Foreign References:
EP2122253A1 (2009-11-25)
US20210191365A1 (2021-06-24)
EP2309186A2 (2011-04-13)
Attorney, Agent or Firm:
A.P.I. CONSEIL (FR)
Claims:
1. A method for estimating (100) a combustion efficiency value during flaring by a flare flame (61) in an industrial plant, over a time period, said method being implemented by one or more processors (10) and comprising the following steps:

- Acquiring (140) a video stream of the flare flame (61) over the time period;

- Segmenting (150) the video stream into several video segments, each video segment being associated with a video segment duration;

- Analyzing (160) the video segments, using a correlation model, so as to classify each video segment in at least one flame state category; and

- Computing (170) the combustion efficiency value accordingly: said computing (170) step using the video segment durations and a plurality of unburned reduction index values, each of said unburned reduction index values being specific to one of the flame state categories, specific to the industrial plant and calculated using computational fluid dynamics.

2. The method according to claim 1, wherein the combustion efficiency value is computed less than five minutes after the acquisition of the video stream.

3. The method according to claim 1 or claim 2, wherein the unburned reduction index values are correlated to a quantity of unburned gas released.

4. The method according to any one of claims 1 to 3, wherein it comprises a step of initial calibration (120), said initial calibration (120) comprising the following steps:

- Acquisition (121) of video streams of the flare flame of the industrial plant over several flame behaviors;

- Segmentation (123) of the video streams in a plurality of video segments each associated with a flame state category; and

- Computation (126) of an unburned reduction index value for each of at least four flame state categories, using computational fluid dynamics.

5. The method according to any one of claims 1 to 3, wherein it comprises a step of initial calibration (120), said initial calibration (120) comprising the following steps:

- Acquisition (121) of video streams of the flare flame of the industrial plant over several flame behaviors;

- Acquisition (122) of a weather parameter value, a parameter value of the flare flame, a parameter value of the flare structure and/or a flare processing parameter value;

- Segmentation (123) of the video streams in a plurality of video segments each associated with a flame state category; and

- Computation (126) of an unburned reduction index value for each of at least four flame state categories, using computational fluid dynamics with the weather parameter value, the parameter value of the flare flame, the parameter value of the flare structure and/or the flare processing parameter value.

6. The method according to claim 4 or 5, wherein the step of initial calibration (120) further comprises, after the segmentation (123), a selection (124) of the most representative flame state categories, each of said representative flame state categories being associated with a representative weather parameter value and/or a representative flare processing parameter value, said representative weather parameter value and/or representative flare processing parameter value being calculated from the acquired weather parameter and/or flare processing parameter values; and wherein the computation of the unburned reduction index values is done for the most representative flame state categories using computational fluid dynamics with the representative weather parameter value and/or the representative flare processing parameter value.

7. The method according to any one of claims 1 to 6, wherein it comprises a step of calibrating (110) the correlation model, said correlation model being configured to classify segments of a video stream of a flare flame according to a reduced number of flame states based on the flame behavior as monitored by a camera; preferably, the correlation model is configured to classify segments of a video stream according to parameter value(s) of the flare flame.

8. The method according to the previous claim, wherein the correlation model is configured to classify segments of a video stream also according to flare processing parameter value(s), parameter value(s) of the flare structure and/or weather parameter value(s).

9. The method according to any one of claims 1 to 8, wherein it further comprises a step of recalibration (180), said recalibration comprising the measurement of unburned gases released by the industrial plant by a dedicated means, and a comparison of the measured quantity of unburned gases released and a computed quantity of unburned gases released derived from the computed combustion efficiency value.

10. The method according to the previous claim, wherein, when there is an inconsistency between the measured quantity of unburned gases released and the computed quantity of unburned gases released, the step of recalibration (180) further comprises a step of modifying the unburned reduction index values, preferably according to said inconsistency.

11. The method according to any one of claims 1 to 10, wherein the segmentation of the video stream is based on a predetermined duration value or wherein the segmentation of the video stream is based on a duration value which is calculated according to flare processing parameter value(s), parameter value(s) of the flare flame, parameter value(s) of the flare structure and/or weather parameter value(s).

12. The method according to any one of claims 1 to 10, wherein the segmentation of the video stream uses at least one non-image-based value, such as a weather parameter value, a flare processing parameter value and/or a parameter value of the flare structure.

13. The method according to any one of claims 1 to 12, wherein the correlation model comprises a supervised, unsupervised or reinforcement-based machine learning model, such as a convolutional neural network.
14. The method according to any one of claims 1 to 13, wherein the correlation model is configured to compute a value of at least one parameter of the flare flame such as: size of the flame, flame to smoke ratio, temperature of the flare, colorimetry, a soot build-up, a flame detachment, smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume, coloration of flame, coloration of smoke and/or a soot content.

15. The method according to any one of claims 1 to 14, wherein the step of analyzing (160) the video segments comprises a combination of computing a value of at least one parameter of the flare flame such as: a size of the flame, a flame to smoke ratio, a temperature of the flare, colorimetry, a soot build-up, a flame detachment, a smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume; and using a machine learning model.

16. The method according to any one of claims 1 to 15, wherein the step of segmenting (150) the video stream comprises using the correlation model.

17. The method according to any one of claims 1 to 16, wherein the step of segmenting (150) and the step of analyzing (160) the video segments are performed simultaneously and each comprises the use of the correlation model.

18. The method according to any one of claims 1 to 17, wherein the step of analyzing (160) the video segments further comprises a sub-step of differentiating flame and smoke from the environment.

19. The method according to any one of claims 1 to 18, wherein it comprises the use of at most one hundred flame state categories, for example at most fifty, preferably at most forty flame state categories, more preferably at most thirty flame state categories and even more preferably at most twenty flame state categories.

20. The method according to any one of claims 1 to 19, wherein it comprises the use of at least four flame state categories, preferably at least five flame state categories, more preferably at least ten flame state categories and even more preferably at least fifteen flame state categories.

21. The method according to any one of claims 1 to 20, wherein, when a video segment is classified in several flame state categories, preferably each of the flame state categories associated with this video segment is associated with a percentage, the sum of the percentages being equal to 100%; said percentage being preferably a percentage of duration of each flame state category in the video segment.

22. The method according to any one of claims 1 to 21, wherein the unburned reduction index values have been calculated through computational fluid dynamics, preferably reactive computational fluid dynamics large eddy simulation.

23. The method according to any one of claims 1 to 22, wherein, during computing (170) the combustion efficiency value, only a part of the video segments are associated with one or several unburned reduction index values.

24. The method according to any one of claims 1 to 23, wherein computing (170) the combustion efficiency value further comprises the use of at least one non-image-based value, said non-image-based value being selected among a weather parameter value, a plant value, or a current process condition information.

25. The method according to any one of claims 1 to 24, wherein the unburned reduction indexes have been modified according to a measured quantity of unburned gases released, obtained through a sampling campaign, for example with drones, of the unburned gas quantities released into the atmosphere.
26. The method according to any one of claims 1 to 25, wherein it further comprises obtaining an audio stream of the flare flame corresponding to the video stream, in particular the audio stream comprising a vibration recording of a flare burner generating the flare flame, and wherein computing (170) the combustion efficiency value further uses the obtained audio stream.

27. The method according to any one of claims 1 to 26, wherein flame state categories are associated with an unburned reduction index value based on a quantity of unburned gases previously calculated for a similar flame state category with computational fluid dynamics.

28. The method according to any one of claims 1 to 27, wherein it further comprises a step of capturing a video stream of a flare emitted by a flare burner of the industrial plant with at least one camera.

29. The method according to any one of claims 1 to 28, wherein it further comprises a step of storing the video stream of the flare flame, with the computed combustion efficiency value along with a date and time stamp.

30. Method of operating a flare burner (62) comprising a modification of at least one flare processing parameter value based on the combustion efficiency value computed according to any one of claims 1 to 29.

31. The method of operating a flare burner (62) according to the previous claim, wherein the modification of at least one flare processing parameter value is also based on at least one weather parameter value.

32. A computer program product having computer-executable instructions which, when carried out on a computer system, perform the method according to any one of the preceding claims.

33. A computing device (1) for estimating a combustion efficiency value during flaring by a flare flame (61) in an industrial plant, over a time period, said computing device comprising:

- a memory component (20) configured to store a correlation model configured to classify each video segment in at least one flame state category;

- a communication interface (30) configured to acquire a video stream of the flare flame;

- one or more processors (10) configured to:

o Acquire a video stream of the flare flame (61) over the time period;

o Segment the video stream into several video segments, each video segment being associated with a video segment duration;

o Analyze the video segments, using a correlation model, so as to classify each video segment in at least one flame state category; and

o Compute the combustion efficiency value, said computing (170) step being based on the video segment durations and a plurality of unburned reduction indexes, each of said unburned reduction indexes being specific to one of the flame state categories, specific to the industrial plant and calculated using computational fluid dynamics.

34. A computing system comprising a computing device according to claim 33 and at least one image capturing device (40) arranged to generate a video stream of a flare flame in an industrial plant.

35. The computing system according to claim 34, wherein the image capturing device (40) is oriented toward at least one flare burner (62) and configured for obtaining a video stream of a flare flame (61) emitted by the industrial plant, preferably said image capturing device (40) being a digital video camera, a high-definition digital video camera, or a 3D video camera.

36. The computing system according to claim 34 or 35, wherein the at least one image capturing device (40) includes a visible camera, and optionally an infrared camera, a near-infrared camera and/or a broad-spectrum infrared camera.

37. The computing system according to any one of claims 34 to 36, wherein it further comprises an audio recorder or a vibration recorder (53) configured to measure the vibration of the flare burner during flaring.

Description:
METHOD AND DEVICE FOR ESTIMATING A COMBUSTION EFFICIENCY VALUE DURING FLARING

FIELD OF THE INVENTION

[0001] The present invention relates to the field of greenhouse gas emission monitoring. In particular, the invention relates to the field of greenhouse gas emission monitoring during flaring. This invention provides a method for estimating a combustion efficiency value during flaring in an industrial plant, over a time period. It also provides a method for operating a flare burner of a plant comprising a modification of at least one flare processing parameter value based on the computed quantity of unburned gases released. It also provides a computing system implementing such methods.

DESCRIPTION OF RELATED ART

[0002] Flaring is a high-temperature oxidation process used to burn waste gases containing combustible components such as volatile organic compounds (VOCs), natural gas (such as methane), carbon monoxide (CO), and hydrogen (H2). The waste gases are piped to a remote location and burned in an open flame in ambient air using a flare burner. Flaring has been considered as promoting a nearly complete (e.g., > 98%) conversion of the combustible components in the waste gas. In practice, however, the conversion can be significantly worse.

[0003] The global oil and gas industry is trending toward improved environmental safety and compliance throughout the various phases of oil & gas production. The conversion (e.g., 90-98%) of the combustible components can vary drastically between industrial plants and over time. Hence, to report oil and gas sector methane emissions accurately and transparently, there is a need for a monitoring solution capable of estimating a combustion efficiency value during flaring. There is also a need for a better understanding of flaring behavior in order to propose appropriate mitigation solutions.

[0004] Regulatory authorities (e.g., the U.S. Environmental Protection Agency - EPA; Oil and Gas Methane Partnership 2.0 - OGMP 2.0) impose environmental regulations in terms of monitoring and controlling the flare flame performance. Accordingly, the operators may track and record: the presence and absence of smoke, the presence and absence of the flare flame, the flow rate and composition of flare gas, the flow rate and composition of assist fuel gas, purge gas, steam and/or air, calculated values for net heating value at regular intervals in time based on the above listed input parameters, and the time duration of flaring and smoking events. For example, systems based on thermocouples, infrared (IR) sensors, or cameras can be used for indicating the presence or absence of smoke and to some extent the quantity of smoke. They can also be used to make a real-time assessment of the combustion zone. However, operators aren’t confronted with an obligation to control a quantity (measured or calculated) of unburned gas discharged into the atmosphere.

[0005] When the burner flame becomes unstable, significant increases in NO, CO, volatile organic compounds, polycyclic aromatic hydrocarbons, particulates, and/or other emissions from the burner can occur. Solutions have been proposed to predict and prevent the occurrence of burner flame-out events to improve the safety of the combustion system. Such systems, based on flame sensors, are capable of distinguishing between a real flame condition, a false flame condition or an unstable condition. A false flame condition can be produced by the detection of UV or IR light radiating, for example, from hot refractory or metal materials around the burner. Systems to identify unstable flame conditions which are likely to lead to a burner flame-out event can be configured to notify the system operator ahead of time so that the unstable conditions can be rectified before the flame-out occurs. It has also been proposed to use multispectral imagers to define a combustion efficiency of a flare burner. These multispectral imagers can report a flare's smoke index. They are mainly based on image analysis and on combinations of heat release, flame footprint and flame stability. Moreover, such solutions can be combined with a control of the operating condition of the flare burner.

[0006] However, such systems cannot compute a quantity (measured or calculated) of unburned gases discharged into the atmosphere. Multispectral imagers are complex cameras that make it possible to estimate the quantity of unburned matter released.

[0007] However, these solutions do not consider in a detailed way the variations associated with the turbulence of a three-dimensional and unsteady flow. Solutions that can generate high-precision data in real time at a reasonable price are still lacking.

[0008] Hence, there is a need for solutions capable of generating a prediction of combustion efficiency over time by a flare flame, with simple equipment.

SUMMARY OF THE INVENTION

[0009] The following sets forth a simplified summary of selected aspects, embodiments and examples of the present invention for the purpose of providing a basic understanding of the invention. However, the summary does not constitute an extensive overview of all the aspects, embodiments and examples of the invention. The sole purpose of the summary is to present selected aspects, embodiments, and examples of the invention in a concise form as an introduction to the more detailed description of the aspects, embodiments and examples of the invention that follow the summary.

[0010] The invention aims to overcome the disadvantages of the prior art. In particular, the invention proposes a method for estimating a combustion efficiency value during flaring by a flare flame in an industrial plant, over a time period, said method being implemented by one or more processors and comprising the following steps:

- Acquiring a video stream of the flare flame over the time period;

- Segmenting the video stream into several video segments, each video segment being associated with a video segment duration;

- Analyzing the video segments, using a correlation model, so as to classify each video segment in at least one flame state category; and

- Computing the combustion efficiency value accordingly: said computing step using the video segment durations and a plurality of unburned reduction index values, each of said unburned reduction index values being specific to one of the flame state categories, specific to the industrial plant and calculated using computational fluid dynamics.

[0011] A method according to the invention opens the way to considering in real time the variations associated with the turbulence of a three-dimensional and unsteady flow. Hence, a method according to the invention can generate high-precision data in real time without necessarily using multispectral imagers. A method according to the invention can thus be used in a predictive emission monitoring system based on a visible imager.
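
As a purely illustrative sketch of the computing step described above, the snippet below combines video segment durations with per-category unburned reduction index values into a single duration-weighted estimate. All names and numerical index values (FlameSegment, UNBURNED_REDUCTION_INDEX, the category labels) are invented for the example; in practice they would come from the plant-specific CFD calibration.

```python
# Minimal sketch (hypothetical names): duration-weighted combustion efficiency
# from classified video segments and per-category unburned reduction index values.
from dataclasses import dataclass, field
from typing import Dict

# Hypothetical table: unburned reduction index per flame state category,
# assumed to have been pre-computed offline with CFD for this specific plant.
UNBURNED_REDUCTION_INDEX: Dict[str, float] = {
    "stable_clean": 0.99,
    "slightly_smoky": 0.96,
    "detached_flame": 0.90,
    "heavily_smoky": 0.82,
}

@dataclass
class FlameSegment:
    duration_s: float  # video segment duration in seconds
    categories: Dict[str, float] = field(default_factory=dict)  # category -> share, sums to 1.0

def combustion_efficiency(segments: list[FlameSegment]) -> float:
    """Duration-weighted average of per-category index values over the time period."""
    weighted_sum = 0.0
    total_duration = 0.0
    for seg in segments:
        for category, share in seg.categories.items():
            weighted_sum += seg.duration_s * share * UNBURNED_REDUCTION_INDEX[category]
        total_duration += seg.duration_s
    return weighted_sum / total_duration if total_duration else float("nan")

if __name__ == "__main__":
    period = [
        FlameSegment(30.0, {"stable_clean": 1.0}),
        FlameSegment(20.0, {"slightly_smoky": 0.6, "detached_flame": 0.4}),
    ]
    print(f"Estimated combustion efficiency: {combustion_efficiency(period):.3f}")
```

A segment classified in several categories carries duration percentages that sum to 100%, as described among the optional features below.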

[0012] According to other optional features of the method for estimating a combustion efficiency, the latter may optionally include one or more of the following features, alone or in combination:

- the combustion efficiency value is computed less than five minutes after the acquisition of the video stream. Preferably, the combustion efficiency value is computed less than five minutes after the generation of the video stream by a camera.

- the unburned reduction index values are correlated to a quantity of unburned gas released. In particular, this correlation permitted by the use of CFD can be enhanced based on the gas composition and flow rate of the flare before the combustion.

- it comprises a step of initial calibration, said initial calibration comprising the following steps:
o Acquisition of video streams of the flare flame of the industrial plant over several flame behaviors;
o Segmentation of the video streams in a plurality of video segments each associated with a flame state category; and
o Computation of an unburned reduction index value for each of at least four flame state categories, using computational fluid dynamics.

- This step of initial calibration can further comprise an acquisition of a weather parameter value, a parameter value of the flare flame, a parameter value of the flare structure and/or a flare processing parameter value; the computation of an unburned reduction index value for each of at least four flame state categories is thus made using computational fluid dynamics with the weather parameter value, the parameter value of the flare flame, the parameter value of the flare structure and/or the flare processing parameter value. This can improve the accuracy of the computed combustion efficiency value.

- the step of initial calibration further comprises, after the segmentation, a selection of the most representative flame state categories, each of said representative flame state categories being associated with a representative weather parameter value and/or a representative flare processing parameter value, said representative weather parameter value and/or representative flare processing parameter value being calculated from the acquired weather parameter and/or flare processing parameter values; and the computation of the unburned reduction index values is done for the most representative flame state categories using computational fluid dynamics with the representative weather parameter value and/or the representative flare processing parameter value. This can improve the accuracy of the computed combustion efficiency value.

- it comprises a step of calibrating the correlation model, said correlation model being configured to classify segments of a video stream of a flare flame according to a reduced number of flame states based on the flame behavior as monitored by a camera; preferably, the correlation model is configured to classify segments of a video stream according to parameter value(s) of the flare flame.

- the correlation model is configured to classify segments of a video stream also according to flare processing parameter value(s), parameter value(s) of the flare structure and/or weather parameter value(s). This can improve the accuracy of the computed combustion efficiency value.

- it further comprises a step of recalibration, said recalibration comprising the measurement of unburned gases released by the industrial plant by a dedicated means, and a comparison of the measured quantity of unburned gases released and a computed quantity of unburned gases released derived from the computed combustion efficiency value. This can improve the accuracy of the computed combustion efficiency value.

- when there is an inconsistency between the measured quantity of unburned gases released and the computed quantity of unburned gases released, the step of recalibration further comprises a step of modifying the unburned reduction index values, preferably according to said inconsistency.

- the segmentation of the video stream is based on a predetermined duration value or wherein the segmentation of the video stream is based on a duration value which is calculated according to flare processing parameter value(s), parameter value(s) of the flare flame, parameter value(s) of the flare structure and/or weather parameter value(s).

- the segmentation of the video stream uses at least one non-image-based value, such as a weather parameter value (e.g. preferably wind force), a flare processing parameter value (e.g. gas flow) and/or a parameter value of the flare structure (e.g. mechanical design, angle of the nozzle, diameter of the nozzle...).

- the correlation model comprises a supervised, unsupervised or reinforcement-based machine learning model, such as a convolutional neural network.

- the correlation model is configured to compute a value of at least one parameter of the flare flame such as: size of the flame, flame to smoke ratio, temperature of the flare, colorimetry, a soot built up, a flame detachment, smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume, coloration of flame, coloration of smoke and/or a soot content.

- the step of analyzing the video segments comprises a combination of computing a value of at least one parameter of the flare flame such as: a size of the flame, a flame to smoke ratio, a temperature of the flare, colorimetry, a soot built up, a flame detachment, a smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume; and using a machine learning model. This can improve the accuracy of the computed combustion efficiency value.

- the step of segmenting the video stream comprises using the correlation model.

- the step of segmenting and the step of analyzing the video segments, are performed simultaneously and each comprises the use of the correlation model.

- the step of analyzing the video segments further comprises a sub step of differentiating flame and smoke from environment (e.g. clear skies, cloudy skies, rain, snow, dust, fog, humidity).

- it comprises the use of at most one hundred flame state categories, for example at most fifty, preferably at most forty flame state categories, more preferably at most thirty flame state categories and even more preferably at most twenty flame state categories.

- it comprises the use of at least four flame state categories, preferably at least five flame state categories, more preferably at least ten flame state categories and even more preferably at least fifteen flame state categories.

- when a video segment is classified in several flame state categories, preferably each of the flame state categories associated with this video segment is associated with a percentage, the sum of the percentages being equal to 100%; said percentage being preferably a percentage of duration of each flame state category in the video segment.

- the unburned reduction index values have been calculated through computational fluid dynamics, preferably reactive computational fluid dynamics large eddy simulation. This can improve the accuracy of the computed combustion efficiency value.

- during computing the combustion efficiency value, only a part of the video segments are associated with one or several unburned reduction index values.

- computing the combustion efficiency value further comprises the use of at least one non-image-based value, said non-image-based value being selected among a weather parameter value (e.g. preferably wind force), a plant value (e.g. torch head geometry), or a current process condition information (e.g. gas flow rate).

- the unburned reduction indexes have been modified according to a measured quantity of unburned gases released, obtained through a sampling campaign (for example with drones) of the unburned gas quantities released into the atmosphere.

- it further comprises obtaining an audio stream of the flare flame corresponding to the video stream, in particular an audio stream comprising a vibration recording of a flare burner generating the flare flame, and computing the combustion efficiency value further uses the obtained audio stream.

- flame state categories are associated with an unburned reduction index value based on a quantity of unburned gases previously calculated for a similar flame state category with computational fluid dynamics.

- it further comprises a step of capturing a video stream of a flare emitted by a flare burner of the industrial plant with at least one camera.

- it further comprises a step of storing the video stream of the flare flame, with the computed combustion efficiency value along with a date and time stamp.

[0013] According to another aspect, the invention can also relate to a method of operating a flare burner comprising a modification of at least one flare processing parameter value based on the combustion efficiency value computed according to the invention. Preferably, the modification of at least one flare processing parameter value is also based on at least one weather parameter value.

[0014] According to another aspect, the present invention can also relate to one or more tangible non-transitory computer-readable media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform a method according to the invention. In particular, it relates to a computer program product having computer-executable instructions which, when carried out on a computer system, perform the method according to anyone of the preceding claims.

[0015] According to another aspect of the present invention, it is provided a computing device for estimating a combustion efficiency value during flaring by a flare flame in an industrial plant, over a time period, said computing device comprising:

- a memory component configured to store a correlation model configured to classify each video segment in at least one flame state category;

- a communication interface configured to acquire a video stream of the flare flame;

- one or more processors configured to:

o Acquire a video stream of the flare flame over the time period;

o Segment the video stream into several video segments, each video segment being associated with a video segment duration;

o Analyze the video segments, using a correlation model, so as to classify each video segment in at least one flame state category; and

o Compute the combustion efficiency value, said computing step being based on the video segment durations and a plurality of unburned reduction indexes, each of said unburned reduction indexes being specific to one of the flame state categories, specific to the industrial plant and calculated using computational fluid dynamics.

[0016] According to other optional features of the computing device, the latter may optionally include one or more of the following features, alone or in combination:

- it further comprises at least one image capturing device oriented toward at least one flare burner and configured for obtaining a video stream of a flare flame emitted by the industrial plant, preferably said image capturing device being a digital video camera, a high-definition digital video camera, or a 3D video camera.

- the at least one image capturing device includes a visible camera, and optionally an infrared camera, a near-infrared camera and/or a broad-spectrum infrared camera.

[0017] According to another aspect of the present invention, it is provided a computing system comprising a computing device according to the invention and at least one image capturing device arranged to generate a video stream of a flare flame in an industrial plant. The image capturing device can be oriented toward at least one flare burner and configured for obtaining a video stream of a flare flame emitted by the industrial plant, preferably said image capturing device being a digital video camera, a high-definition digital video camera, or a 3D video camera. The at least one image capturing device can include a visible camera, and optionally an infrared camera, a near-infrared camera and/or a broad-spectrum infrared camera.

The computing system can further comprise an audio recorder or a vibration recorder configured to measure the vibration of the flare burner during flaring.

[0018] According to another aspect of the present invention, it is provided a trained machine-learning model for estimating a combustion efficiency value of a flare flame in an industrial plant, the combustion efficiency being representative of a concentration of unburned gas released during flaring by a flare flame, the trained machine-learning model being obtained via the following steps:

- Acquiring video streams of the flare flame of the industrial plant over several flame states;

- Optionally, acquiring a weather parameter value and/or a flare processing parameter value associated with the video streams;

- Segmentation of the video streams in a plurality of video segments, each classified according to a flame state category;

- Computing, for each of at least four flame state categories, a plurality of flare flame parameters;

- Computing a combustion efficiency of the flare flame for each of the at least four flame state categories, using computational fluid dynamics, optionally with the weather parameter value and/or the flare processing parameter value; and

- Training a machine-learning model to:

- classify video streams in a plurality of video segments associated with at least one flame state category using a machine-learning algorithm, the plurality of flare flame parameters and preferably the weather parameter value and/or the flare processing parameter value, and

- predict for a future time period, using the combustion efficiencies computed using computational fluid dynamics, a combustion efficiency value of the flare flame.

[0019] Preferably, the invention relates to one or more tangible non-transitory computer-readable media storing the trained machine-learning model for estimating a combustion efficiency value according to the invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0020] The foregoing and other objects, features and advantages of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings in which:

- FIG. 1 is a schematic view of a method of estimating a combustion efficiency value according to the present invention. A step framed by a dotted line is optional.

- FIG. 2 is a schematic view of a calibration of the correlation model according to the present invention.

- FIG. 3 is a schematic view of an initial calibration according to the present invention.

- FIG. 4A & 4B are graphical illustrations of some steps of a method of estimating a combustion efficiency value according to the present invention.

- FIG. 5 is a graphical illustration of a system of estimating a combustion efficiency value according to the present invention.

[0021] Several aspects of the present invention are disclosed with reference to flow diagrams and/or block diagrams of methods, devices and computer program products according to embodiments of the invention.

[0022] On the figures, the flow diagrams and/or block diagrams show the architecture, the functionality and possible implementation of devices or systems or methods and computer program products, according to several embodiments of the invention.

[0023] For this purpose, each box in the flow diagrams or block diagrams may represent a system, a device, a module or code which comprises several executable instructions for implementing the specified logical function(s).

[0024] In some implementations, the functions associated with the box may appear in a different order than indicated in the drawings.

[0025] For example, two boxes successively shown, may be executed substantially simultaneously, or boxes may sometimes be executed in the reverse order, depending on the functionality involved.

[0026] Each box of flow diagrams or block diagrams and combinations of boxes in flow diagrams or block diagrams may be implemented by special systems that perform the specified functions or actions or perform combinations of special equipment and computer instructions.

DETAILED DESCRIPTION

[0027] Below, we describe a summary of the invention and the associated vocabulary, before presenting the drawbacks of the prior art, then finally showing in more detail how the invention overcomes them. In particular, a description of example embodiments of the invention follows.

[0028] As used herein, “video stream” can relate to a video transmitted in the form of a video stream but also a video transmitted in the form of video sequences. A video stream is generally generated by a camera.

[0029] By “time period” is meant, within the meaning of the invention, a delimited part of time. A time period according to the invention can have any duration, for example from a few milliseconds to several hours or days.

[0030] By “unburned gas” is meant, within the meaning of the invention, compounds released by the industrial plant which have not been burned during the flare combustion. The unburned gas released can refer to some or all the unburned gases discharged in the atmosphere such as methane, nitrous oxide, or ethane.

[0031] “Unburned reduction index” can relate, within the meaning of the invention, to values of combustion efficiency for an industrial plant flare burner, each value being specific to a flame state category. In particular the values of combustion efficiency, being specific to a flame state category, can be based on a balance on carbon atoms (either mass or molar), on a CO2 mass balance produced versus combustible mass burned; or considered as its inverse (i.e. destruction efficiency). For example, it can relate to a percentage of fuel that has been partially or completely destroyed by combustion. In particular, it can also correspond to the percentage of CO2 created versus the percentage of fuel burned.

[0032] “Combustion efficiency” can relate, within the meaning of the invention, to the measured efficiency of converting organic carbon compounds to carbon dioxide:

CE = [CO2] / ([CO2] + [CO] + Σ n_f [C_x H_y O_z] + [soot])

where [CO2] is the CO2 volume concentration produced by the flare, [CO] the CO volume concentration exiting the flare, Σ n_f [C_x H_y O_z] the volume concentration of total hydrocarbons exiting the flare, and [soot] is the volume concentration of any soot present exiting the flare (often neglected). Usually, volume concentrations are considered to be equivalent to molar concentrations.
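
As a quick numerical illustration of this definition, the sketch below uses invented concentrations; hydrocarbon species are expressed as (carbon number, concentration) pairs so that they are counted on a per-carbon basis, an assumption of the example rather than a statement from the text.

```python
# Minimal sketch of the combustion efficiency formula quoted above.
# Inputs are volume (approximately molar) concentrations; hydrocarbons are given as
# (carbon_atoms, concentration) pairs so each species is weighted by its carbon number.
def combustion_efficiency(co2: float, co: float,
                          hydrocarbons: list[tuple[int, float]],
                          soot: float = 0.0) -> float:
    unburned_carbon = sum(n_c * conc for n_c, conc in hydrocarbons)
    return co2 / (co2 + co + unburned_carbon + soot)

# Example: mostly CO2 with a little CO and residual methane (1 carbon atom).
print(combustion_efficiency(co2=0.095, co=0.002, hydrocarbons=[(1, 0.001)]))
```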

[0033] By “weather parameter value” is preferably meant, within the meaning of the invention, value of standard weather parameter such as temperature, humidity, atmospheric pressure, clouds, precipitation, wind, such as wind force and wind direction.

[0034] By “parameter value of the flare flame” is preferably meant, within the meaning of the invention, value of parameters relating to the behavior of the flame such as a size of the flame, a flame to smoke ratio, a temperature of the flare, colorimetry, a soot built up, a flame detachment, a smoke of the flare, an angle of inclination of the flame, a flame opening angle, or a length of visible plume.

[0035] By “parameter value of the flare structure” is preferably meant, within the meaning of the invention, value of parameters relating to the flare tip geometry such as a mechanical design, an angle of the nozzle, or a diameter of the nozzle.

[0036] By “flare processing parameter value” is preferably meant, within the meaning of the invention, a value of parameters relating to the operating of the flare burner such as a gas flow, the use of an assist medium (steam, air, or fuel gas) or a gas composition.

[0037] By “process ”, “compute", “determine ”, “display ”, “extract ”, “compare” or more broadly “executable operation” is meant, within the meaning of the invention, an action performed by a computing device, computing system or a processor unless the context indicates otherwise. In this regard, the operations relate to actions and/or processes of a data processing system, for example a computing system or an electronic computing device, which manipulates and transforms the data represented as physical (electronic) quantities in the memories of the computing system or other devices for storing, transmitting or displaying information. In particular, calculation operations are carried out by the processor of the device, the produced data are entered in a corresponding field in a data memory and this field or these fields can be returned to a user for example through a Human Machine Interface formatting such data. These operations may be based on applications or software.

[0038] The terms or expressions “application ”, “software ”, “algorithm", “program code ”, and “executable code” mean any expression, code or notation, of a set of instructions intended to cause a data processing to perform a particular function directly or indirectly (for example after a conversion operation into another code). Exemplary program codes may include, but are not limited to, a subprogram, a function, an executable application, a source code, an object code, a library and/or any other sequence of instructions designed for being performed on a computing system.

[0039] By “processor” is meant, within the meaning of the invention, at least one hardware circuit configured to perform operations according to instructions contained in a code. The hardware circuit may be an integrated circuit. Examples of a processor include, but are not limited to, a central processing unit, a graphics processor, an application-specific integrated circuit (“ASIC” according to Anglo-Saxon terminology), and a programmable logic circuit. A single processor or several other units may be used to implement the invention.

[0040] By “coupled” is meant, within the meaning of the invention, connected, directly or indirectly, with one or more intermediate elements. Two elements may be coupled mechanically, electrically or linked by a communication channel.

[0041] The expression “human-machine interface”, within the meaning of the invention, corresponds to any element allowing a human being to communicate with a computer, in particular and without that list being exhaustive, a keyboard and means allowing in response to the commands entered on the keyboard to perform displays and optionally to select with the mouse or a touchpad item displayed on the screen. Another embodiment is a touch screen for selecting directly on the screen the elements touched by the finger or an object and optionally with the possibility of displaying a virtual keyboard.

[0042] By “computing device”, it should be understood any device comprising a processing unit or a processor, for example in the form of a microcontroller cooperating with a data memory, possibly a program memory, said memories possibly being dissociated. The processing unit cooperates with said memories by means of internal communication bus. Usually, a computing system comprises several computing devices.

[0043] By "correlation model", it is to be understood in the sense of the invention a finite sequence of operations or instructions allowing to calculate an output value from one or more input values. The implementation of this finite sequence of operations makes it possible, for example, to assign a value Y, such as a label Y, to an observation described by a set of characteristics or parameters X thanks, for example, to the implementation of a function f, capable of reproducing Y by having observed X. Y = f (X) + e, where e symbolizes the noise or measurement error.

[0044] By "supervised machine learning model" is meant in the sense of the invention a correlation model generated automatically from data, called observations, which have been labeled.

[0045] By "unsupervised machine learning model" is meant, in the sense of the invention, a correlation model generated automatically from data, called observations, which have not been labeled.

[0046] By " reinforcement machine learning model" is meant, in the sense of the invention, a correlation model generated automatically from data, called observations, through a process involving rewards and penalties based on the goal to be achieved.

[0047] The term "about" as used herein can allow for a degree of variability in a value or range, for example, within 10%, within 5%, or within 1% of a stated value or of a stated limit of a range.

[0048] The term "substantially" as used herein refers to a majority of, or mostly, as in at least about 50%, 60%, 70%, 80%, 90%, 95%, 96%, 97%, 98%, 99%, 99.5%, 99.9%, 99.99%, or at least about 99.999% or more.

[0049] As mentioned, solutions disclosed in the prior art use multispectral cameras which are complex and expensive. They are not adapted for a broad adoption of the flaring efficiency monitoring when considering maintenance on such systems, in particular in oil & gas operation. There is a need for simple and robust devices capable of producing accurate estimation of flare combustion efficiency.

[0050] A method based on the use of computational fluid dynamics has been developed. This method in particular combines a fast classification of video segments of a flare combustion, capable of a fine discretization of flame states, with unburned gas reduction indexes obtained through computational fluid dynamics. This makes it possible to predict combustion efficiency in real time and with high accuracy.

[0051] According to a first aspect, the invention relates to a method 100 for estimating a combustion efficiency value during flaring. In particular, it relates to a method for estimating a combustion efficiency value during flaring by a flare flame 61.

[0052] A flare flame 61 is generally generated during operation of a gas combustion device such as a flare burner. Although this is not illustrated in figure 5, a flare flame 61 may generate smoke. In the context of the invention, the flaring flame will preferably be associated with an industrial plant such as an offshore / onshore oil & gas asset such as an oil & gas production unit or a petroleum refinery, a chemical plant or a natural gas processing plant. In a preferred embodiment, the flare flame 61 is generated by a gas combustion device 62 of an oil & gas installation.

[0053] As mentioned, this gas combustion device 62 is designed to modify the composition of the released gas and reduce the concentration of harmful gas released in the atmosphere. One object of the invention is particularly to monitor the released gases by computing a predicted value of a combustion efficiency value during flaring.

[0054] Preferably, the combustion efficiency value can be used to calculate an estimation of a quantity of unburned gas released in the atmosphere during flaring. The unburned gas can comprise process gas in general for example: hydrocarbons such as methane or ethane, hydrogen; nitrous oxide; hydrogen sulfide or a combination thereof.

[0055] It is possible that the composition of the gas to be flared is modified as a function of time or that meteorological parameters undergo modifications that could have an impact on the combustion efficiency value and thus the quantity of unburned gas released into the atmosphere. Hence, the estimation is preferably made for a given time period. In a preferred embodiment, the combustion efficiency value is estimated in real time. Preferably, it is estimated less than 10 minutes after the acquisition of the video stream, preferably less than 5 minutes, more preferably less than 1 minute, even more preferably less than 10 seconds.

[0056] In particular, as illustrated in figure 1, a method for estimating a combustion efficiency value according to the invention will comprise the following steps: acquiring 140 a video stream of the flare flame; segmenting 150 the video stream into several video segments; analyzing 160 the video segments so as to classify each video segment in at least one flame state category; and computing 170 the combustion efficiency value, said computing 170 step being based on the video segment durations and a plurality of unburned reduction index values. Preferably, these steps are performed in real time and executed in parallel or substantially in parallel.
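
The following sketch shows one way steps 140 to 170 could be chained into a continuous monitoring loop. It is an illustration under stated assumptions, not the actual implementation: the helpers are trivial stand-ins for the real acquisition, segmentation, classification and computation stages, and only the overall control flow is meaningful.

```python
# Hypothetical orchestration of steps 140-170 as a continuous monitoring loop.
# Every helper below is a trivial stand-in; names and values are invented.
import random
import time

def acquire_chunk(seconds: float) -> dict:                 # step 140 stand-in
    return {"duration_s": seconds, "frames": []}

def segment_stream(chunk: dict) -> list[dict]:             # step 150 stand-in
    half = chunk["duration_s"] / 2
    return [{"duration_s": half}, {"duration_s": half}]

def classify_segment(segment: dict) -> str:                # step 160 stand-in
    return random.choice(["stable_clean", "slightly_smoky"])

def compute_ce(segments: list[dict], index_table: dict) -> float:   # step 170
    total = sum(s["duration_s"] for s in segments)
    return sum(index_table[s["category"]] * s["duration_s"] for s in segments) / total

def monitor_flare(index_table: dict, chunk_seconds: float = 60.0, iterations: int = 3) -> None:
    for _ in range(iterations):                            # a real system would loop indefinitely
        chunk = acquire_chunk(chunk_seconds)
        segments = segment_stream(chunk)
        for seg in segments:
            seg["category"] = classify_segment(seg)
        ce = compute_ce(segments, index_table)
        print(f"{time.strftime('%H:%M:%S')} combustion efficiency ~ {ce:.3f}")

monitor_flare({"stable_clean": 0.99, "slightly_smoky": 0.96})
```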

[0057] The above steps are preferably executed in the form of a loop. The method is repeated for a plurality of video streams, which can be a plurality of video files, so as to generate in real time estimations of a combustion efficiency value during the flaring. Hence, thanks to the invention, a continuous monitoring of the unburned gas released by flaring can be easily implemented.

[0058] A method according to the invention can also comprise the following steps: calibration 110 of the correlation model; initial calibration 120 of the unburned reduction index; capturing 130 a video stream; recalibration 180 of the unburned reduction index values; and accuracy estimation 190 of the computed quantity of unburned gases released.

[0059] As shown in figure 1, a method 100 according to the invention can comprise a step of calibrating 110 the correlation model. This step is preferably implemented by one or more processors 10. It can be performed before the implementation of the continuous monitoring of the unburned gas released by flaring, and it can be performed by different processors 10, for example processors of specialized servers.

[0060] This step of calibrating 110 is in particular designed to define the rules and/or parameters that will be implemented to assign one or several flame state categories to a video segment.

[0061] As mentioned, one advantage of the invention is to combine an extremely accurate method of estimating a combustion efficiency value with a quick and efficient way to classify video segments. Over time, a flame can assume an almost infinite number of flame states. Advantageously, the correlation model will be configured to discretize and significantly reduce the number of flame states to be considered. This classification is preferably based, at least partially, on the flare flame behavior as monitored by a camera.

[0062] In a preferred embodiment, the correlation model is calibrated so as to classify the segments of a video stream of a flare flame according to a reduced number of flame state categories. For example, the correlation model is calibrated so as to be configured to classify the segments of a video stream of a flare flame according to at most 200 flame state categories, preferably at most 150 flame state categories, more preferably at most 100 flame state categories, even more preferably at most 50 flame state categories.

[0063] Moreover, in a preferred embodiment, the correlation model is calibrated so as to be configured to classify segments of a video stream of a flare flame according to a minimal number of flame states. For example, the correlation model is calibrated so as to be configured to classify segments of a video stream of a flare flame according to at least 2 flame state categories, preferably at least 5 flame state categories, more preferably at least 10 flame state categories, even more preferably at least 15 flame state categories.

[0064] The correlation model comprises or consists of a machine learning model. Preferably, it can comprise a supervised, unsupervised or reinforcement-based machine learning model. Preferably, the correlation model comprises an unsupervised machine learning model such as a hidden Markov model, a convolutional neural network, clustering, regression, clustered linear regression, a self-organizing map or a combination thereof.

[0065] The correlation model can also be based on predetermined rules. For example, the correlation model can be configured to compute a value of at least one parameter of the flare flame such as: size of the flame, flame to smoke ratio, temperature of the flare, colorimetry, a soot build-up, a flame detachment, smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume, intensity of the flame (e.g. through RGB, Red-Green-Blue, values), color of the flame, spectral analysis on several pixels located in the plume and around the flame, or a combination thereof. The calibration of the model can here also comprise the use of predetermined threshold values for one or more parameters of the flare flame, stored in a memory. The calibration of the model can use statistical analysis of these parameters (e.g. mean, variance, standard deviation).
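
As an illustration of such predetermined rules, the sketch below derives a few of the listed flame parameters from one RGB frame using simple intensity thresholds, including a crude differentiation of flame and smoke pixels from the environment. The threshold values, heuristics and category names are arbitrary examples, not values taken from the invention.

```python
# Illustrative rule-based analysis of one RGB video frame (numpy array, H x W x 3, uint8).
# Thresholds and category names are arbitrary examples, not values from the invention.
import numpy as np

def frame_features(frame: np.ndarray) -> dict:
    r, g, b = (frame[..., 0].astype(float),
               frame[..., 1].astype(float),
               frame[..., 2].astype(float))
    brightness = (r + g + b) / 3.0
    # Bright, red/orange-dominant pixels are treated as flame; dark grey pixels as smoke.
    flame_mask = (brightness > 180) & (r > g) & (g > b)
    smoke_mask = (brightness < 90) & (np.abs(r - g) < 15) & (np.abs(g - b) < 15)
    flame_area = int(flame_mask.sum())
    smoke_area = int(smoke_mask.sum())
    # Very rough flame inclination heuristic from the spread of flame pixel coordinates.
    ys, xs = np.nonzero(flame_mask)
    tilt_deg = float(np.degrees(np.arctan2(xs.std(), ys.std()))) if flame_area else 0.0
    return {
        "flame_size": flame_area,
        "flame_to_smoke_ratio": flame_area / max(smoke_area, 1),
        "tilt_deg": tilt_deg,
    }

def categorize(features: dict) -> str:
    if features["flame_size"] == 0:
        return "no_visible_flame"
    if features["flame_to_smoke_ratio"] < 2.0:
        return "heavily_smoky"
    return "stable_clean" if features["tilt_deg"] < 30.0 else "wind_bent_flame"
```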

[0066] In a preferred embodiment, the correlation model comprises the use of a machine learning model and of predetermined rules. Hence, preferably, the correlation model combines the use of computed values of at least one parameter of the flare flame, such as: a size of the flame, a flame to smoke ratio, a temperature of the flare, colorimetry, a soot build-up, a flame detachment, a smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume; with the use of a machine learning model.
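
A minimal sketch of this combination is given below: handcrafted per-segment features are fed to a machine learning classifier. A random forest is used only as a convenient stand-in (the description also mentions convolutional neural networks), and the training data are synthetic.

```python
# Illustrative combination of handcrafted flame features with a machine learning
# classifier. A random forest stands in here for simplicity; labels and data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Feature vectors per video segment: [flame_size, flame_to_smoke_ratio, tilt_deg]
X_train = rng.random((200, 3)) * [5000, 10, 90]
y_train = rng.integers(0, 4, size=200)        # 4 example flame state categories

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# At run time, each segment's features yield one (or several weighted) categories.
segment_features = np.array([[3200.0, 6.5, 12.0]])
print("category:", clf.predict(segment_features)[0])
print("category probabilities:", clf.predict_proba(segment_features)[0])
```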

[0067] Calibrating 110 the correlation model can also benefit from flare processing parameter value(s), parameter value(s) of the flare flame, parameter value(s) of the flare structure and/or weather parameter value(s). Hence, preferably, the calibration of the correlation model comprises the use of at least one flare processing parameter value (e.g. gas flow), of at least one parameter value of the flare flame, of at least one parameter value of the flare structure and/or of at least one weather parameter value (e.g. wind force).

[0068] A flare processing parameter can for example be selected among: gas flow, gas composition or combination thereof. A flare structural parameter can for example relate to the flare tip geometry.

[0069] A weather parameter can for example be selected among: wind force, wind orientation, rain or combination thereof.

[0070] A particular embodiment of calibrating 110 the correlation model is illustrated in the figure 2.

[0071] As illustrated in figure 2, the calibration 110 of the correlation model can comprise an acquisition 111 of reference videos of the flare flame, preferably from a predetermined industrial plant, over several flame behaviors. As mentioned, on the one hand, the classification of the video segments is preferably based, at least partially, on the flare flame behavior as monitored by a camera and, on the other hand, the unburned gases quantity can be correlated to the flare flame behavior. These videos can be considered as reference videos used to select the threshold values and/or train the machine learning models.

[0072] The calibration of the correlation model can also comprise an acquisition 112 of a weather parameter value, a parameter value of the flare flame, parameter value of the flare structure and/or a flare processing parameter value corresponding to the reference videos. Indeed, these values may be used for the classification of video segments. Also, they can be of particular interest in the initial calibration 120 step to calculate 125 a quantity of unburned gases released for a flame state category with computational fluid dynamics.

[0073] The calibration of the correlation model can also comprise a selection 113 of threshold values and/or model parameters to classify video segments of the reference videos in a plurality of flame state categories. Such threshold values and/or model parameters can be selected in order to reduce the number of flame state categories and homogenize each flame state category.
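
As a purely illustrative sketch of the selection 113, the parameter vectors computed on the reference video segments could be clustered so as to obtain a reduced and homogeneous set of flame state categories; K-means is used below as an assumed example, the invention not being limited to this particular algorithm.

    import numpy as np
    from sklearn.cluster import KMeans

    def select_flame_state_categories(segment_features: np.ndarray, n_categories: int = 10):
        """segment_features: (n_segments, n_parameters) array of flame parameters
        computed on the reference videos (size, flame to smoke ratio, inclination, ...)."""
        model = KMeans(n_clusters=n_categories, n_init=10, random_state=0)
        labels = model.fit_predict(segment_features)
        # The cluster centres can serve as the representative parameter values
        # associated with each flame state category.
        return labels, model.cluster_centers_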

[0074] As shown in figure 1, a method 100 according to the invention can comprise a step of initial calibration 120. This step is preferably implemented by one or more processors 10. This step can be performed before the continuous implementation of the estimation method, and it can be performed by different processors 10, since it can be executed on dedicated servers.

[0075] This step is in particular designed to determine unburned reduction index values for a plurality of flame state categories. As mentioned, the unburned reduction index values can be correlated to a combustion efficiency value and thus to a quantity of unburned gas released during the flaring by the flare flame. In particular, the flame state categories are associated with an unburned reduction index value based on a combustion efficiency previously calculated for a similar flame state category with computational fluid dynamics.

[0076] Advantageously, the present invention uses computational fluid dynamics, preferably reactive computational fluid dynamics, to calculate a combustion efficiency value during the flaring by the flare flame for a plurality of flame state categories. In particular, the present invention uses computational fluid dynamics, preferably reactive computational fluid dynamics, to calculate unburned reduction index values for a plurality of flame state categories. The use of computational fluid dynamics, and in particular large eddy simulation, advantageously takes into account the large scales of the turbulence of a three-dimensional and unsteady flow. This allows a higher accuracy than could be achieved with other calculation methods. However, considering the computational resources required, these methods cannot be used as such to estimate a combustion efficiency value in real time.

[0077] As described, the correlation model is preferably configured to identify flame state categories, in particular the most characteristic flame state categories, in a video of a flare flame. As stated, the quantity of unburned gases is highly dependent on the design of the flare burner. Hence, the initial calibration 120 of the method according to the invention refers in particular to the determination, for a specific industrial plant, of an unburned reduction index value, or of a combustion efficiency value.

[0078] A particular embodiment of an initial calibration 120 according to the invention is illustrated in the figure 3. As illustrated, the initial calibration 120 can comprise an acquisition 121 of reference videos of the flare flame of the industrial plant over several flame behaviors.

[0079] The initial calibration 120 can comprise an acquisition 122 of at least one weather parameter value, at least one parameter value of the flare flame, at least one parameter value of the flare structure and/or at least one flare processing parameter value, preferably over the duration of the video stream. The acquired parameter value(s) relate to events occurring at the same time as the capture of the video stream.

[0080] The initial calibration 120 can comprise a segmentation 123 of the reference videos in a plurality of video segments each associated with a flame state category. As it will be described hereafter, the segmentation can be mainly based on flame behavior.

[0081] The initial calibration 120 can comprise a selection 124 of the most representative flame state categories, each of said flame state categories being associated with a representative weather parameter value, a representative parameter value of the flare flame, a representative parameter value of the flare structure and/or a representative flare processing parameter value, said representative values being calculated from the acquired weather parameter and/or flare processing parameter values.

[0082] Once a flame state category has been selected, preferably associated with a representative weather parameter value, a representative parameter value of the flare flame, a representative parameter value of the flare structure and/or a representative flare processing parameter value, the initial calibration 120 can comprise a step of calculating 125 a quantity of unburned gases released for this flame state category with computational fluid dynamics.

[0083] The initial calibration 120 can comprise a computation 126, with computational fluid dynamics, of unburned reduction index values for at least four flame state categories, preferably among the most representative flame state categories.

[0084] In addition to a calculation by computational fluid dynamics, the method according to the invention may include a modification of the calculated unburned reduction index values based on other inputs. For example, the other inputs can comprise a measured quantity of unburned gases released by the flare flame. This measured quantity can be obtained through a sampling campaign, for example with drones, of the quantities of unburned gases released into the atmosphere. This can for example imply the use of online gas chromatography.

[0085] As shown in figure 1 , a method 100 according to the invention can comprise a step of capturing 130 a video stream. The captured video stream is preferably a video stream of the flare flame over a time period. In particular, it can comprise a flare flame emitted by an industrial plant, in particular a flare burner of the industrial plant.

[0086] The video stream has preferably been generated by at least one camera. For example, it can be a video stream generated by several cameras. More preferably, the video stream has been generated by one camera. Several cameras can nevertheless be used so as to combine data from various acquisition angles (diversity of positions).

[0087] The video stream can comprise visible images, ultraviolet images and/or infrared images. However, more preferably, the camera is a visible camera (i.e. an RGB camera), for example using wavelengths of light from 400 nm to 700 nm. The camera can be a camera configured to generate depth data. Hence, the acquisition step can comprise an acquisition of ultraviolet images and/or infrared images.

[0088] The video stream generally comprises a plurality of images comprising a combustion zone, preferably a flare combustion zone. The flare combustion zone can comprise the flame and the smoke.

[0089] As shown in figure 1 , a method 100 according to the invention comprises a step of acquiring 140 a video stream of the flare flame. This step is preferably implemented by one or more processors 10. Moreover, it can imply the use of a communication device when acquiring the video stream generated by a distant capturing device.

[0090] The capture of the video can be carried out upstream of the implementation of the invention. In this case, the method comprises only an acquisition of the video stream. In such an embodiment, the invention can be implemented on remote servers.

[0091] The figures 4A and 4B illustrate a particular embodiment of the invention including the main steps of the method according to the invention.

[0092] As illustrated in the figure 4A, in addition to the acquisition of a video stream 210, the method 100 according to the invention can comprise a step of acquiring at least one non-image-based parameter value 215, such as a weather parameter value (e.g. preferably wind force, wind direction), a flare processing parameter value (e.g. gas flow, composition), a parameter value of the flare flame (e.g. inclination) or a parameter value of the flare structure (e.g. flare tip geometry). In the figure 4A, the levels of gray illustrate variations in the values of the non-image-based parameter.

[0093] The method of the invention can also comprise obtaining an audio stream of the flare flame 61 corresponding to the video stream; in particular, the audio stream comprises a vibration recording of a flare burner 62 generating the flare flame. The audio stream can for example be produced by an audio recorder or by a vibration recorder 53 configured to measure the vibration of the flare burner during flaring.

[0094] As shown in figure 1 , a method 100 according to the invention can comprise a step of segmenting 150 the video stream in several video segments. This step is preferably implemented by one or more processors 10.

[0095] This step is in particular configured to generate several video segments 220 from a video stream 210. Each video segment is preferably associated with a video segment duration. As illustrated in the figure 4A, the durations of the video segments 220a, 220b, 220c, 220d may vary over the duration 230 of the video stream. For example, a video segment is constituted of a plurality of images over a duration of at least 10 ms. Preferably, the video segment duration is at most one minute, more preferably at most thirty seconds, even more preferably at most one second.

[0096] A video segment can be constituted of one image (e.g. one frame). When a video segment is constituted of only one image, the video segment duration can correspond to the acquisition frequency of the capturing device 40.

[0097] The segmentation of the video stream can use at least one non-image-based parameter value, such as a weather parameter value (e.g. preferably wind force), a parameter value of the flare flame, a parameter value of the flare structure, or a flare processing parameter value (e.g. gas flow). Indeed, it may be useful for the segmentation to also take into account the wind measured at a given instant and as a function of the flow at a given instant.

[0098] In an embodiment, the segmentation of the video stream is based on a predetermined duration value. The predetermined duration value can be used as a setting and can be changed if necessary. For example, the segmentation can generate a plurality of video segments of at least 10 ms. The segmentation can also generate a plurality of video segments each consisting of one image.
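
The following sketch illustrates, under the assumption that the video stream is read with OpenCV, a segmentation based on a predetermined duration value; the one-second setting is only the example value mentioned above, and the function name is hypothetical.

    import cv2

    def segment_video(path: str, segment_duration_s: float = 1.0):
        """Split a video file into consecutive segments of frames of fixed duration."""
        cap = cv2.VideoCapture(path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 25.0   # fall back to 25 fps if metadata is missing
        frames_per_segment = max(int(round(fps * segment_duration_s)), 1)
        segments, current = [], []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            current.append(frame)
            if len(current) == frames_per_segment:
                segments.append((current, len(current) / fps))  # (frames, duration in seconds)
                current = []
        if current:                                             # last, possibly shorter, segment
            segments.append((current, len(current) / fps))
        cap.release()
        return segments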

[0099] In another embodiment and advantageously, the segmentation can be implemented according to the behavior of the flare flame on an image or on a plurality of images. This segmentation can be done with or without further using non-image-based parameter value.

[0100] The step of segmenting 150 the video stream in several video segments can comprise the use of the correlation model. The correlation model will be described in detail in relation with the step of analyzing 160 the video segments. This is particularly relevant when the segmentation is implemented at least partially according to the behavior of the flare flame. In particular, the step of segmenting 150 and the step of analyzing 160 the video segments are performed simultaneously and each comprises the use of the correlation model.

[0101] As shown in figure 1, a method 100 according to the invention comprises a step of analyzing 160 the video segments. This step is preferably implemented by one or more processors 10, for example the same processor(s) implementing the step of segmenting 150 the video stream. In particular, the one or more processors 10 are configured to load a correlation model.

[0102] An advantage of the present invention is that the proposed solution can reduce the complexity of the video stream and generate discretized data useable in further treatments.

[0103] This step is in particular designed to classify each video segment in at least one flame state category. In the figure 4A, the video segments are classified in four flame state categories 220a, 220b, 220c, 220d, each represented by a different level of grey, from black to white.

[0104] As mentioned, the performance of the invention can be improved by using a reduced number of flame state categories. Preferably, the step of analyzing 160 the video segments comprises the use of at most one hundred flame state categories, preferably at most fifty flame state categories, more preferably at most forty flame state categories and even more preferably at most thirty flame state categories.

[0105] As mentioned, the performance of the invention can be improved by using a minimal number of flame state categories. Preferably, the step of analyzing 160 the video segments comprises the use of at least four flame state categories, preferably at least five flame state categories, more preferably at least ten flame state categories and even more preferably at least fifteen flame state categories.

[0106] As illustrated in the figure 4B, in a preferred embodiment, at least some of the video segments 220e are classified in several flame state categories. Indeed, in some embodiments, one video segment may represent the flare flame under several flare flame state categories. Alternatively, one video segment may represent the flare flame under one flare flame state which does not correspond exactly to a flare flame state category of the correlation model. This may happen when the flare flame state observed in an analyzed video stream was not present in the reference videos used during the calibration phase.

[0107] The table 1 below illustrates the categories identified in a 1000 ms video stream and the cumulated associated durations.

[0108] Table 1

[0109] The classification of a video segment 220e in several flame state categories can be particularly advantageous when the video segment 220e does not correspond significantly to any single flare flame state category. In order to reduce the occurrence of video segments with an undetermined flame state category, the method can comprise the assignment of several flare flame state categories to one video segment. Thus, an approximation can be made even when the particular state of the flare flame on this video segment cannot be associated with a given flare flame state category. In the figure 4B, the video segment 220d is assigned by the correlation model to the flare flame category 1, whereas the video segment 220f is assigned by the correlation model to the flare flame category 2. The video segment 220e is assigned to several flare flame state categories (i.e. flare flame states 1, 2 and 3).

[0110] In a preferred embodiment, when a video segment is classified in several flame state categories, each of the flame state categories associated with this video segment is associated with a percentage, the sum of the percentages being equal to 100%; said percentage being preferably a percentage of duration of each flame state category in the video segment. Alternatively, the percentages can correspond to a percentage of homology of the segment with each of the flare flame state categories. Also, a maximum likelihood technique can be used to select the most suitable flame state category.

[0111] In a preferred embodiment, when a video segment is classified in several flame state categories, each of the flame state categories associated with this video segment is associated with a duration.
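
As a minimal illustration of these preferred embodiments, the percentages, and therefore the durations, of the flame state categories within one segment can be obtained by classifying each frame of the segment individually; this per-frame scheme is an assumption, other weighting schemes remaining possible.

    from collections import Counter

    def segment_category_percentages(frame_categories: list) -> dict:
        """frame_categories: one flame state category label per frame of the segment."""
        counts = Counter(frame_categories)
        total = len(frame_categories)
        return {cat: 100.0 * n / total for cat, n in counts.items()}  # sums to 100 %

    # Example: a 10-frame segment split between two flame states.
    print(segment_category_percentages(["state_1"] * 7 + ["state_2"] * 3))
    # {'state_1': 70.0, 'state_2': 30.0}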

[0112] In some embodiments, not all of the video segments will be classified in a flare flame state category. Alternatively, some video segments will be classified in an "undetermined" flare flame state category. Indeed, as the flare flame can adopt a multitude of flare flame states, a method according to the invention can focus on the characterization of the principal flare flame state categories in order to improve the overall performance of the method.

[0113] As mentioned, the analysis of the video segments according to the invention is advantageously done using a correlation model. This correlation model is preferably specific to the flare flame of the studied industrial plant. Indeed, the flare burner and, more broadly, the industrial plant have a strong influence on the venting, so the correlation model is industrial plant dependent.

[0114] The correlation model comprises or consists of a machine learning model. Preferably, it can comprise a supervised, unsupervised or reinforcement-based machine learning model. Preferably, the correlation model comprises an unsupervised machine learning model such as a Hidden Markov Model, a convolutional neural network, clustering, regression, clustered linear regression, or a combination thereof.
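
Purely as an illustrative sketch, an unsupervised correlation model of this kind can be fitted on the parameter vectors of the reference segments and then provide, for each analyzed segment, a probability per flame state category, usable as the percentage of homology mentioned above; the Gaussian mixture model, the number of components and the feature dimensions below are assumptions, not the claimed model.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    reference_features = rng.normal(size=(500, 6))       # stand-in for calibrated flame parameters
    model = GaussianMixture(n_components=4, random_state=0).fit(reference_features)

    new_segment = rng.normal(size=(1, 6))                 # parameters of one analyzed segment
    probabilities = model.predict_proba(new_segment)[0]   # one value per flame state category
    best_category = int(np.argmax(probabilities))         # maximum likelihood selection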

[0115] In particular, the step of analyzing 160 the video segments comprises the computation of a value of at least one parameter of the flare flame such as: size of the flame, flame to smoke ratio, temperature of the flare, colorimetry, a soot build-up, a flame detachment, smoke of the flare, an angle of inclination of the flame, a flame opening angle, and a length of visible plume.

[0116] Preferably, the step of analyzing 160 the video segments comprises a combination of using a computed value of at least one parameter of the flare flame, such as: a size of the flame, a flame to smoke ratio, a temperature of the flare, colorimetry, a soot build-up, a flame detachment, a smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume; and using a machine learning model.

[0117] For example, the method can comprise a step of computing a value of at least one parameter of the flare flame, the value of this parameter then being used as input data of a machine learning model. Alternatively, the machine learning model can use at least a part of the images of the video segment and the value of at least one parameter of the flare flame as input data.

[0118] Preferably, the step of analyzing 160 the video segments comprises the use of predetermined threshold values. This is particularly relevant when the correlation model is also based on predetermined rules. For example, the correlation model can be configured to compute a value of at least one parameter of the flare flame such as: size of the flame, flame to smoke ratio, temperature of the flare, colorimetry, a soot build-up, a flame detachment, smoke of the flare, an angle of inclination of the flame, a flame opening angle, a length of visible plume, or a combination thereof. The calibration of the model can here comprise storing in a memory the predetermined threshold values for one or more parameters of the flare flame.

[0119] The step of analyzing 160 the video segments can further comprise a sub-step of differentiating the flame and/or the smoke from the environment: clear skies, cloudy skies, rain, snow, dust, and/or sea.

[0120] The step of analyzing 160 the video segments can further comprise the use of an audio stream of the flare flame. As already mentioned, this audio stream preferably comprises a vibration recording of the flare burner generating the flare flame, corresponding to the video stream. It can be used to segment the video stream and/or classify the video segments among the flame state categories.

[0121] As shown in figure 1 , a method 100 according to the invention comprises a step of computing 170 the combustion efficiency value. This step is preferably implemented by one or more processors 10. In particular, the one or more processors 10 are configured to load unburned reduction index values.

[0122] Once the flame state categories have been identified, the step of computing 170 the combustion efficiency value is in particular designed to transform the acquired and generated data into a combustion efficiency value.

[0123] Preferably, the unburned reduction index values define a relationship between a flame state category and a combustion efficiency value.

[0124] An advantage of the present invention is that the proposed solution can thus combine a rapid way of reducing the complexity of the behavior of a flare flame and an accurate way of determining a combustion efficiency value that can be used also to determine a quantity of unburned gas released.

[0125] As mentioned, the unburned reduction index values used in this step have been calculated using computational fluid dynamics. In particular, as described in relation to the initial steps 110 and 120, the flame state categories are associated with unburned reduction index values based on the quantity of unburned gases previously calculated for similar flame state categories with computational fluid dynamics.

[0126] Preferably, the unburned reduction index values have been calculated through computational fluid dynamics, including Reynolds-averaged simulation, unsteady Reynolds-averaged simulation and/or large eddy simulation.

[0127] Advantageously, a combustion efficiency value is computed based on a plurality of unburned reduction index values. The unburned reduction index values are preferably specific to one of the flame state categories. Hence, each flame state category will have a specific unburned reduction index value. However, a same video stream is generally associated with several flame state categories. Also, a same video segment can be associated with several flame state categories.

[0128] Preferably, the unburned reduction index values are also specific to the industrial plant that implements the combustion device producing the flare flame.

[0129] Preferably, the combustion efficiency value is computed based also on the video segment durations.

[0130] Advantageously, the computing step 170 uses the video segment durations and a plurality of unburned reduction index values.
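
By way of illustration, and under the assumption (one possible reading of the computing step 170, not an explicit statement of the text) that the combustion efficiency over the time period is the duration-weighted combination of the per-category values, the computation can be sketched as follows; the index values and category names are hypothetical.

    def combustion_efficiency(segments, unburned_reduction_index):
        """segments: list of (duration_s, {category: percentage}) tuples from steps 150-160.
        unburned_reduction_index: per-category value, plant-specific, pre-computed with CFD."""
        weighted, total = 0.0, 0.0
        for duration, categories in segments:
            for category, pct in categories.items():
                if category not in unburned_reduction_index:  # undetermined segments are skipped
                    continue
                w = duration * pct / 100.0
                weighted += w * unburned_reduction_index[category]
                total += w
        return weighted / total if total else None

    # Hypothetical plant-specific index values calculated offline with CFD.
    uri = {"state_1": 0.995, "state_2": 0.98, "state_3": 0.91}
    segs = [(30.0, {"state_1": 100.0}), (10.0, {"state_2": 60.0, "state_3": 40.0})]
    print(f"Combustion efficiency: {combustion_efficiency(segs, uri):.1%}")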

[0131] The table 2 below illustrates the data generated in an embodiment of the present invention. In a video stream captured over a time period 230 of 100 minutes, the method according to the invention associated several video segments of variable durations with flare flame state categories. These categories are each associated with an unburned reduction index value computed through computational fluid dynamics.

[0132] Table 2

[0133] The combustion efficiency value can be expressed as a percentage.

[0134] In one embodiment, the combustion efficiency value is used to compute a quantity of unburned gases released. Moreover, when calculated, the quantity of unburned gas released can be expressed as a mass concentration of unburned gases per unit volume, as a mass quantity released per minute, or as a percentage.

[0135] As has been mentioned, in some embodiments, some video segments may not be assigned to a category or may be assigned to an undetermined flare flame state category. Hence, during the computing 170 of the combustion efficiency value, in some embodiments, only a part of the video segments is associated with one or several unburned reduction index values.
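
As a hypothetical numerical illustration, a mass of unburned gas released can be derived from the computed combustion efficiency value and from a flare processing parameter such as the gas mass flow; the flow value and duration used below are purely illustrative.

    def unburned_gas_mass(combustion_efficiency: float,
                          gas_mass_flow_kg_per_min: float,
                          duration_min: float) -> float:
        """Mass of unburned gas released over the period, in kg."""
        return (1.0 - combustion_efficiency) * gas_mass_flow_kg_per_min * duration_min

    # Example: 98.5 % efficiency, 120 kg/min of flared gas, over 100 minutes.
    print(unburned_gas_mass(0.985, 120.0, 100.0), "kg released unburned")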

[0136] In some embodiments, the step of computing 170 the combustion efficiency value can use at least one non-image-based value. For example, the non-image-based value can be selected among a weather parameter value (e.g. preferably wind force), a parameter value of the flare flame (e.g. inclination angle), a parameter value of the flare structure (e.g. flare tip geometry), and/or a flare processing parameter value (e.g. gas flow rate, gas composition).

[0137] As mentioned, the method according to the invention can comprise a recalibration step. Hence, in some embodiments, the unburned reduction index values have been modified according to a calculated combustion efficiency value or to a measured quantity of unburned gases released, obtained through a sampling campaign, for example with drones, of the quantities of unburned gases released into the atmosphere.

[0138] As shown in figure 1 , a method 100 according to the invention can further comprise a step of recalibration 180. This step is preferably implemented by one or more processors 10.

[0139] This step can be used to validate performance or optimize parameters to improve the accuracy of the estimation. This step is in particular designed to modify the unburned reduction index values associated with the flame state categories.

[0140] The unburned reduction index values have preferably been calculated with computational fluid dynamics. However, in preferred embodiments they can be confirmed, and/or adjusted with other sources of information.

[0141] Preferably, the recalibration 180 step comprises the measurement of the unburned gases released by the industrial plant by a dedicated means. The dedicated means can be selected for example among: an online gas analyzer or a drone configured to measure the unburned gases released by the industrial plant. Also, the recalibration 180 step can comprise a comparison of the measured values of unburned gases released with the computed combustion efficiency values.

[0142] When the measured unburned gases released differ from the computed quantity of unburned gases, or are inconsistent with the computed combustion efficiency values, the recalibration 180 step can comprise a step of modification of the unburned reduction index values based on the measured values of unburned gases released.
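
A minimal sketch of such a modification, assuming a simple multiplicative correction of the unburned fraction (one minus the index value) by the ratio of measured to computed quantities, is given below; the tolerance and the correction scheme are illustrative assumptions and not the claimed recalibration.

    def recalibrate(uri: dict, computed_unburned_kg: float,
                    measured_unburned_kg: float, tolerance: float = 0.10) -> dict:
        """uri: per-category unburned reduction index values (here treated as efficiencies)."""
        if computed_unburned_kg <= 0.0:
            return uri
        ratio = measured_unburned_kg / computed_unburned_kg
        if abs(ratio - 1.0) <= tolerance:
            return uri  # measurements confirm the computed values, no modification
        # Scale the unburned fraction of every category by the measured/computed ratio.
        return {cat: max(0.0, 1.0 - (1.0 - eff) * ratio) for cat, eff in uri.items()}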

[0143] As shown in figure 1 , a method 100 according to the invention can comprise a step of estimating 190 an accuracy level for the computed combustion efficiency values. This step is preferably implemented by one or more processors 10.

[0144] This step is in particular designed to estimate an accuracy level of the computed combustion efficiency values.

[0145] Also, the method according to the invention can further comprise a step of storing the video stream of the flare flame, with the computed combustion efficiency values along with a date and time stamp. The data from the other sensors can also be stored and time stamped.

[0146] In another aspect, the invention relates to a method of operating a flare burner. In particular, a method of operating a flare burner arranged to generate a flare flame in an industrial plant. This method is preferably implemented by one or more processors 10.

[0147] Preferably, the method of operating a flare burner is a method of operating a flare burner which has released into the atmosphere, during flaring by a flare flame in an industrial plant over a time period, the quantity of unburned gas estimated according to the method of the invention.

[0148] This method can comprise a modification of at least one flare processing parameter value based on the computed combustion efficiency values according to the invention. In particular, such a modification can use at least one weather parameter value.

[0149] In particular, this method comprises a modification of at least one flare burner processing parameter value, such as the use of an assist medium (steam, air, or fuel gas), to improve the combustion efficiency. Hence, the method can be used to automatically adjust supplemental fuel additions as well as any assist source, such as gas, steam or air, based on the computed combustion efficiency value according to the invention. In particular, such a modification can use at least one weather parameter value. This modification can involve operating valves 54 controlling all components entering the combustion zone, such as the valves connecting the flare burner to a steam source 64 or to a fuel gas source 63.

[0150] Advantageously, as the method of computing the combustion efficiency value is preferably operated in real time, the method according to the invention can comprise operating a flare burner also in real time.
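
The following sketch illustrates, under assumed set-points and step sizes that are not part of the invention, how a steam assist valve opening could be adjusted automatically from the computed combustion efficiency value.

    def adjust_steam_assist(current_opening: float, efficiency: float,
                            target: float = 0.98, step: float = 0.05) -> float:
        """Return a new steam valve opening in [0, 1]."""
        if efficiency < target:
            return min(1.0, current_opening + step)  # more steam assist to improve combustion
        return max(0.0, current_opening - step)      # relax assist to avoid over-steaming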

[0151] Such an application can ultimately be used to significantly reduce the unburned gases released into the atmosphere.

[0152] As it will be appreciated by the one skilled in the art, aspects of the present invention may be embodied as a device, system, method or computer program product. Accordingly, aspects of the present invention may take the form of a fully hardware embodiment, a fully software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects. In addition, aspects of the present invention may take the form of a computer program product incorporated into one or more computer-readable media having a computer readable program code embedded therein.

[0153] Thus, in another aspect, the invention relates to one or more computer-readable media storing computer-readable instructions that when executed by one or more processors 10 cause the one or more processors 10 to perform a method according to the invention. Preferably, the computer-readable media is a tangible non-transitory computer-readable media.

[0154] For the purposes of this disclosure, computer-readable media may include any instrumentality or aggregation of instrumentalities that may retain data and/or instructions for a period of time. Computer-readable media may include, for example, without limitation, storage media such as a direct access storage device (e.g. a hard disk drive or floppy disk drive), a sequential access storage device (e.g. a tape disk drive), compact disk, CD-ROM, DVD, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), and/or flash memory; as well as communications media such as wires, optical fibers, microwaves, radio waves, and other electromagnetic and/or optical carriers; and/or any combination of the foregoing.

[0155] In particular, any combination of one or more computer-readable media may be used. In the context of this document, a computer-readable medium may be any tangible medium that may contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include: a hard disk, a random-access memory (RAM).

[0156] Computer program code for performing operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or similar, the programming language "C" or similar programming languages, a scripting language such as Perl, or similar languages, and/or functional languages such as Meta Language. Program code can run entirely on a user's computer, partly on a user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter scenario, the remote computer can be connected to a user's computer by any type of network, including a local area network (LAN) or a wide area network (WAN).

[0157] These computer program instructions may be stored on a computer-readable medium that can direct a computing device (e.g. a computer, a server, etc.) or a computing system, so that the instructions stored in the computer-readable medium produce a computing device configured to implement the invention.

[0158] Hence, in another aspect, the invention relates to a computing device 1 for estimating a combustion efficiency value.

[0159] This computing device 1 for estimating a combustion efficiency value is preferably configured to implement a method according to the invention. Preferably, the computing device 1 for estimating a combustion efficiency value can implement the embodiments, whether preferred or not, described earlier in the description of a method according to the invention.

[0160] For purposes of this disclosure, a computing device 1 according to the invention may include any instrumentality or aggregate of instrumentalities operable to compute, classify, process, transmit, receive, retrieve, originate, switch, store, display, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data.

[0161] For example, a computing device 1 may be a personal computer, a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price. The computing device 1 may include random-access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory. Additional components of the computing device may include one or more disk drives, one or more network ports for communication with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, and a video display. The computing device may also include one or more buses operable to transmit communications between the various hardware components.

[0162] Figure 5 is a schematic block diagram illustrating various hardware components that may be utilized in the computing device 1 according to the invention.

[0163] In particular, as illustrated in figure 5, the computing device 1 comprises: one or more memory components 20 configured to store a correlation model, unburned reduction index values and instructions for the processor(s), one or more communication interfaces 30 configured to acquire a video stream; and one or more processors 10 configured to process the video stream to generate an estimation of the combustion efficiency value and in particular of the quantity of unburned gas released during flaring by a flare flame. It can also comprise one or more user interfaces 70.

[0164] A computing device 1 for estimating a combustion efficiency value may comprise one or more processors 10. A processor 10 may be operably coupled to the memory component 20 to execute instructions, encoded in programs, for carrying out the presently disclosed techniques, more particularly to perform the method according to the invention.

[0165] In particular, the processor 10 is configured to segment a video stream in several video segments. Preferably, each video segment is associated with a video segment duration.

[0166] The processor 10 can also be configured to analyze the video segments so as to classify each video segment in at least one flame state category. Preferably, the classification is done using a correlation model as described above.

[0167] The processor 10 can also be configured to compute the combustion efficiency value. The processor 10 can in particular be configured to compute the combustion efficiency value using the video segment durations and a plurality of unburned reduction index values. Preferably, each of said unburned reduction index values is specific to one of the flame state categories, specific to the industrial plant and calculated using computational fluid dynamics, as described above.

[0168] The encoded instructions may be stored in any suitable article of manufacture (such as the memory component 20) that includes at least one tangible non-transitory, computer-readable medium that at least collectively stores these instructions or routines. In this manner, the memory component 20 may contain a set of instructions that, when executed by the processor 10, performs the method of the invention.

[0169] Hence, a computing device 1 for estimating a combustion efficiency value may comprise a memory component 20. Preferably, the memory component 20 is configured to store a correlation model configured to classify each video segment in at least one flame state category, and unburned reduction index values associated with a flame state category.

[0170] The memory component 20 may comprise any computer readable medium known in the art including, for example, a volatile memory, such as a static random-access memory (SRAM) and a dynamic random-access memory (DRAM), and/or a non-volatile memory, such as read-only memory, flash memories, hard disks, optical disks and magnetic tapes. The memory component 20 may include a plurality of instructions or modules or applications for performing various functions. Thus, the memory component 20 can implement routines, programs, or matrix-type data structures. Preferably, the memory component 20 may comprise a medium readable by a computing system in the form of a volatile memory, such as a random-access memory (RAM) and/or a cache memory. The memory component 20, like the other modules, can for example be connected with the other components of the computing device 1 via a communication bus and one or more data carrier interfaces.

[0171] Moreover, the memory component 20 is preferably configured to store instructions capable of implementing the method according to the invention, according to its preferred or non-preferred embodiments. The memory component 20 may include any number of databases or similar storage media that can be queried from the processor 10 as needed to perform the method of the invention.

[0172] Furthermore, the computing device 1 can also comprise a communication interface 30.

[0173] The communication interface 30 is configured to receive or emit data on at least one communication network R1 and may implement a wired or wireless communication. Preferably, the communication is operated via a wireless protocol such as Wi-Fi, 3G, 4G, 5G and/or Bluetooth. These data exchanges may take the form of sending and receiving files. For example, the communication interface 30 may be configured to transmit a printable file. The communication interface 30 may in particular be configured to allow the communication with a remote terminal, including a client. The client is generally any hardware and/or software capable of communication with the computing device 1. A communication interface 30 according to the invention is, in particular, configured to exchange data with third-party devices or systems.

[0174] The communication interface 30 is preferably configured to acquire a video stream of the flare flame. It can also be configured to transmit a combustion efficiency value. The computing device 1 can communicate with other devices or computing systems and in particular with clients thanks to the communication interface 30.

[0175] These different modules or components are separated in Figure 5, but the invention may provide various types of arrangement, for example a single module cumulating all the functions described here. Similarly, these modules or components may be divided into several electronic boards or gathered on a single electronic board.

[0176] A computing device 1 according to the invention can be incorporated into a computing system and able to communicate with one or several external devices such as a keyboard, a pointer device, a display, or any device allowing a user to interact with the computing device 1 .

[0177] The computing device 1 may also be configured to communicate with or via a human-machine interface. Thus, in one embodiment of the present invention, the computing device 1 can be coupled to a human-machine interface (HMI). The HMI may be used to allow the transmission of parameters to the devices or, conversely, to make available to the user the values of the data measured or calculated by the device.

[0178] In general, the HMI is communicatively coupled to a processor 10 and includes a user output interface and a user input interface. The user output interface may include an audio and display output interface and various indicators such as visual indicators, audible indicators and haptic indicators.

[0179] The user input interface may include a keyboard, a mouse, or another navigation module such as a touch screen, a touchpad, a stylus input interface, and a microphone for inputting audible signals such as a user speech, data and commands that can be recognized by the processor.

[0180] According to another aspect, the invention can relate to a system for estimating a combustion efficiency value during flaring by a flare flame in an industrial plant, over a time period. Preferably, said system comprises a device according to the invention.

[0181] As illustrated in the figure 5, in particular in a system 2 according to the invention, the computing device 1 can be coupled with at least one image capturing device 40, preferably oriented toward the flare flame and configured to obtain a video stream of a flare flame emitted by a flare burner. The image capturing device includes a visible camera, and optionally an infrared camera 41, a near-infrared camera and/or a broad-spectrum infrared camera.

[0182] The computing device 1 can also be coupled with at least one weather monitoring device 51 configured to generate weather parameter values.

[0183] The system 2 can also comprise an audio recorder or a vibration recorder 53 configured to measure the vibration of the flare burner during flaring.

[0184] The invention can be the subject of numerous variants and applications other than those described above. In particular, unless otherwise indicated, the different structural and functional characteristics of each of the implementations described above should not be considered as combined and/or closely and/or inextricably linked to each other, but on the contrary as simple juxtapositions. In addition, the structural and/or functional characteristics of the various embodiments described above may be the subject, in whole or in part, of any different juxtaposition or any different combination.