
Title:
MONITORING OF WELLSITE WITH ARTIFICIAL INTELLIGENCE
Document Type and Number:
WIPO Patent Application WO/2023/028337
Kind Code:
A1
Abstract:
Various embodiments of the present disclosure include a method for monitoring a wellsite. In some embodiments, the method can include receiving data from a wellsite sensor. In some embodiments, the method can include analyzing the data using artificial intelligence to determine a characteristic associated with the wellsite.

Inventors:
ARYANPUR RAMEEN MATTHEW (US)
SCHWACH JORY (US)
Application Number:
PCT/US2022/041735
Publication Date:
March 02, 2023
Filing Date:
August 26, 2022
Assignee:
ANDIUM INC (US)
International Classes:
E21B44/00
Foreign References:
US20160282508A12016-09-29
US20200157929A12020-05-21
US20210123784A12021-04-29
Attorney, Agent or Firm:
SEPPELT, Jon A. et al. (US)
Claims:
What is claimed is:

1. A method for monitoring a wellsite, comprising: receiving data from a wellsite sensor; and analyzing the data using artificial intelligence to determine a characteristic associated with the wellsite.

2. The method of claim 1, wherein receiving data from the wellsite sensor includes receiving image data from the wellsite sensor.

3. The method of claim 2, wherein: the wellsite sensor includes a thermal imaging camera that monitors a tank located at the wellsite; and the image data includes image data captured with the thermal imaging camera.

4. The method of claim 3, wherein the characteristic associated with the wellsite includes a level of liquid in the tank located at the wellsite.

5. The method of claim 1, wherein analyzing the data using artificial intelligence includes using computer vision to analyze the data.

6. The method of claim 2, wherein: the wellsite sensor includes a camera that monitors a flare associated with the wellsite; and the data includes image data of the flare captured with the camera.

7. The method of claim 6, wherein determining the characteristic associated with the wellsite includes determining a flare characteristic associated with the flare.

8. The method of claim 7, wherein determining the flare characteristic includes determining the flare characteristic based on at least one of a color and size of the flare, using computer vision.

9. The method of claim 8, further comprising using artificial intelligence to determine a type of notification to generate, based on the flare characteristic.

10. The method of claim 2, wherein the image data includes an image of a methane leak.

11. The method of claim 10, wherein the method includes using computer vision to determine a type of characteristic associated with the methane leak.

12. The method of claim 11, further comprising determining, with artificial intelligence, a type of notification to generate based on the type of characteristic.

13. A system for monitoring a wellsite, comprising: a processor; and a memory storing non-transitory computer-readable instructions that are executable by the processor to: receive data from a wellsite sensor, wherein the wellsite sensor includes a camera and the data includes image data; analyze the data using computer vision to determine a characteristic associated with the wellsite; and record data associated with the characteristic in a data log.

14. The system of claim 13, wherein the data log is accessible to a user located remote from the wellsite.

15. The system of claim 14, further comprising instructions that are executable by the processor to determine a type of notification to send to the user.

16. A system for monitoring a wellsite, comprising: a camera; a processor; and a memory storing instructions that are executable by the processor to: receive image data from the camera; analyze the image data using computer vision to determine a characteristic associated with the wellsite; and determine, with artificial intelligence, a type of notification to generate, based on the characteristic associated with the wellsite.

17. The system of claim 16, further comprising instructions executable by the processor to send the notification to a remote user device.

18. The system of claim 17, wherein the notification includes a notification that a leak is occurring on the wellsite.

19. The system of claim 17, wherein the notification includes a notification of a level of liquid in a tank disposed on the wellsite.

20. The system of claim 17, wherein the notification includes a notification of a health of a flare disposed on the wellsite.

Description:
MONITORING OF WELLSITE WITH ARTIFICIAL INTELLIGENCE

BACKGROUND

[0001] Control and/or monitoring of equipment on remote oil and gas wellsites can be challenging. Oftentimes, the wellsites remain unoccupied by engineers and technicians, and are only occasionally visited for routine maintenance and/or for offloading gas and/or oil. Thus, levels of storage tanks and the functioning of equipment are oftentimes either left unmonitored or monitored by rudimentary sensors that are only able to obtain basic information. Furthermore, the output of those sensors generally needs to be interpreted by an individual, which requires additional manpower and introduces inefficiencies into the extraction of oil and gas. Embodiments of the present disclosure can reduce many of the inefficiencies associated with an oil and gas wellsite and provide for more automated control over the wellsite.

SUMMARY

[0002] Various embodiments of the present disclosure include a method for monitoring a wellsite. In some embodiments, the method can include receiving data from a wellsite sensor. In some embodiments, the method can include analyzing the data using artificial intelligence to determine a characteristic associated with the wellsite.

[0003] Various embodiments of the present disclosure include a system for monitoring a wellsite. In some embodiments, the system can include a processor. In some embodiments, the system can include a memory storing non-transitory computer-readable instructions that are executable by the processor. In some embodiments, the computer-readable instructions can be executable by the processor to receive data from a wellsite sensor, wherein the wellsite sensor includes a camera and the data includes image data. In some embodiments, the computer-readable instructions can be executable by the processor to analyze the data using computer vision to determine a characteristic associated with the wellsite. In some embodiments, the computer-readable instructions can be executable by the processor to record data associated with the characteristic in a data log.

[0004] Various embodiments of the present disclosure include a system for monitoring a wellsite. In some embodiments, the system can include a camera. In some embodiments, the system can include a processor. In some embodiments, the system can include a memory storing non-transitory computer-readable instructions that are executable by the processor. In some embodiments, the computer-readable instructions can be executable by the processor to receive image data from the camera. In some embodiments, the computer-readable instructions can be executable by the processor to analyze the image data using computer vision to determine a characteristic associated with the wellsite. In some embodiments, the computer-readable instructions can be executable by the processor to determine, with artificial intelligence, a type of notification to generate, based on the characteristic associated with the wellsite.

BRIEF DESCRIPTION OF DRAWINGS

[0005] Fig. 1 depicts a system for monitoring a wellsite with artificial intelligence, in accordance with embodiments of the present disclosure.

[0006] Fig. 2 depicts a method for monitoring a wellsite with artificial intelligence, in accordance with embodiments of the present disclosure.

[0007] Fig. 3 depicts a diagram of an example of a computing device for monitoring a wellsite with artificial intelligence, in accordance with various embodiments of the present disclosure.

DETAILED DESCRIPTION

[0008] Embodiments of the present disclosure are described below with reference to the accompanying figures. The features and advantages which are explained are illustrated by way of example and not by way of limitation. One of ordinary skill in the art will recognize that there are additional features and advantages provided by embodiments of the present disclosure beyond those described herein.

[0009] Fig. 1 depicts a system 100 for monitoring a wellsite with artificial intelligence, in accordance with embodiments of the present disclosure. As depicted in Fig. 1, in some embodiments, the wellsite can include a number of pieces of process equipment. In an example, the process equipment can include one or more of a storage tank 102, a flare 104, a well head 106, a separator 108, and/or other equipment not shown. As will be appreciated, the wellsite can include additional equipment not shown, such as pumps, valves, distillation columns, pump jacks, etc.

In some embodiments, as discussed herein, wellsites can be located in remote regions and may not be regularly manned by personnel. As such, it can be beneficial to have ways to monitor the status of, and control the day-to-day operations of, the wellsite, which can allow for selective escalation of incidents to personnel and/or to a central command and control center.

[0010] Accordingly, embodiments of the present disclosure can utilize a monitoring and/or control system that can utilize artificial intelligence to monitor and/or provide control over the system 100. As depicted in Fig. 1, the system can include a number of sensors, which can monitor various aspects of the system 100. In some embodiments, the sensors can include cameras 110-1, 110-2, 110-3, hereinafter referred to in the plural as cameras 110. In some embodiments, the cameras 110 can include an image sensor and/or a thermal sensor and can be a still camera and/or a video camera. In some embodiments, a camera can include an optical gas imager, which can include a cooled or uncooled microbolometer with an appropriate spectral response range and sensor sensitivity to target specific hydrocarbon gases.

[0011] As depicted, the first camera 110-1 can be directed towards the storage tank 102, such that at least a portion of the storage tank 102 is within view of the camera 110-1. As further depicted, a second camera 110-2 can be directed towards the flare 104, such that camera 110-2 captures a picture of a plume 112 of the flare 104. In some embodiments, a third camera 110-3 can be directed towards a cap 106 associated with a wellbore, and/or associated piping 114-1, 114-2, ... , 114-5. Although embodiments are discussed in relation to a wellsite, embodiments of the present disclosure can also be used at other types of sites. For example, embodiments of the present disclosure can be used at a transfer station for natural gas, a waste management site and/or landfill, a mine, etc.

[0012] With respect to the storage tank 102, in some embodiments, the whole storage tank 102 may be within view of the camera 110-1 and/or a select portion of the storage tank 102 may be within view of the camera 110-1. In some embodiments, the first camera 110-1, and/or another camera 110, depicted in Fig. 1 can be directed towards multiple pieces of equipment that are included in the system 100. For example, in some embodiments, one or more of the cameras 110 can be directed towards one or more pieces of equipment, such that data collected from the cameras 110 (e.g., in an image and/or video) captures one or more pieces of equipment. In some embodiments, a single camera can be used to monitor one or more pieces of equipment. Furthermore, one or more cameras can monitor other aspects associated with the wellsite, such as vehicles, personnel, intruders, etc.

[0013] In some embodiments, the system can include a computer 116. In some embodiments, the computer 116 may be located at the wellsite. In some embodiments, the computer 116 may not be physically located at the wellsite. For example, in some embodiments, the computer 116 can be a cloud based computing system and/or can be located at a remote site, such as a control center. In some embodiments, the computer 116 can include a processor 118 and a memory 120, which include memory resources (e.g., volatile memory and/or non-volatile memory) for executing instructions stored in a tangible non-transitory medium (e.g., volatile memory, non-volatile memory, and/or machine readable medium) and/or an application specific integrated circuit (ASIC) including logic configured to perform various examples of the present disclosure. A machine (e.g., a computing device) can include and/or receive a tangible non-transitory machine readable medium storing a set of machine readable instructions (MRI) (e.g., software) via an input device.

[0014] As used herein, processor 118 can include one or a plurality of processors such as in a parallel processing system. Memory resources 120 can include memory addressable by the processor resources 118 for execution of machine readable instructions. The machine readable medium can include volatile and/or non-volatile memory such as random access memory (RAM), magnetic memory such as a hard disk, floppy disk, and/or tape memory, a solid state drive (SSD), flash memory, phase change memory, etc. In some examples, the non-volatile memory can be a database including a plurality of physical non-volatile memory devices. In various examples, the database can be local to a particular system or remote (e.g., including a plurality of non-volatile memory devices).

[0015] The processor resources 118 can control the overall operation of the system 100. The processor resources 118 can be connected to a memory controller, which can read and/or write data from and/or to volatile memory (e.g., RAM). The memory controller can include an ASIC and/or a processor with its own memory resources (e.g., volatile and/or non-volatile memory). The volatile memory can include one or a plurality of memory modules (e.g., chips). A basic input-output system (BIOS) for the system 100 may be stored in non-volatile memory or other non-volatile memory not specifically illustrated, but associated with the processor resources 118. The BIOS can control a start-up or boot process and control basic operation of the system 100.

[0016] The processor resources 118 can be connected to a bus to provide for communication between the processor resources 118 and other portions of the system 100. For example, the bus may operate under a standard protocol such as a variation of the Peripheral Component Interconnect (PCI) bus standard, or the like. The bus can connect the processor resources 118 to the non-volatile memory, graphics controller, input device, and/or the network connection, among other portions of the system 100. The non-volatile memory (e.g., hard disk, SSD, etc.) can provide persistent data storage for the system 100. The graphics controller can connect to a display device, which can provide an image to a user based on activities performed by the system 100. In some embodiments, although not depicted, the system can include an edge device, which can be a cell phone and/or computer with a network enabled signal (e.g., cell signal), which can communicate information from the wellsite to a remote computing device for further analysis.

[0017] In some embodiments, the memory can include non-transitory computer- readable instructions that are executable by the processor to receive data from a wellsite sensor. In some embodiments, as discussed herein, the wellsite sensor can include a camera 110, which can include a standard image sensor and/or a thermal sensor. In some embodiments, the data obtained from the wellsite sensor can include image data that is captured with one or more of the cameras 110.

[0018] In some embodiments, the data can include image data from one or more images of the storage tank 102. As depicted in Fig. 1, phantom line 122 represents a transition (e.g., interface) between oil stored in the tank and a gas (e.g., air). Oil is located below the line 122 and the gas is located above the line 122. In some embodiments, the image data obtained from the camera 110 can be used to determine a level of fluid in the storage tank 102, as discussed in U.S. Application no. 16/617,234, which is hereby incorporated by reference as though fully set forth herein.

[0019] In some embodiments, the camera 110-1 can be a thermal camera that is positioned such that the oil storage tank 102 is within a field of view of the thermal camera. In some embodiments, the thermal camera can be connected to the computer 116, which can analyze a feed from the thermal camera and determine a thermal gradation of the oil storage tank 102 that indicates a level of oil in the oil storage tank 102. For example, the portion of the oil storage tank 102 filled with oil can have a different temperature profile than an empty portion of the oil storage tank 102, due to the oil that fills the filled portion of the oil storage tank having a different temperature and/or heat capacity than air filling the empty portion of the oil storage tank 102.

[0020] In an example where the oil storage tank 102 is half full, the bottom half containing oil may appear to have a different temperature (e.g., be colder) than the top half containing air, which can be represented in the feed received from the thermal camera by the computer 116. Through processing of the feed received from the thermal camera via computer vision, as discussed herein, a determination can be made by the computer 116 and/or a central computer that the oil storage tank is half full. Likewise, a determination can be made of how full the oil storage tank 102 is, at any level (e.g., 42% full, 59% full, 77% full, etc.). In some embodiments, computer vision can include algorithmic methods for processing image data for the purpose of extracting salient/contextually-relevant insights from that data. Generally, computer vision requires no “training data” for the specific problem domain (e.g., increasing contrast on an image and then analyzing pixels to find edges).
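The interface-finding step described above can be sketched, for illustration only, as follows. This is not the patent's actual implementation; the function name, the single-column simplification, and the synthetic temperature data are all assumptions made for the example.

```python
# Hypothetical sketch: given per-pixel temperatures read down the tank wall,
# locate the row with the largest vertical temperature jump (the thermal
# interface) and report the fraction of the tank that lies below it.
import numpy as np

def fill_fraction(column_temps: np.ndarray) -> float:
    """Estimate how full the tank is from a top-to-bottom temperature profile.

    column_temps[0] is the top of the tank; column_temps[-1] is the bottom.
    """
    gradient = np.abs(np.diff(column_temps))      # temperature jump between rows
    interface_row = int(np.argmax(gradient)) + 1  # row just below the largest jump
    # Fraction of the tank height that lies below the interface.
    return (len(column_temps) - interface_row) / len(column_temps)

# Synthetic profile: warm air (30 C) over the top 30% of rows, cooler oil
# (22 C) below, mimicking the thermal interface at line 122.
temps = np.concatenate([np.full(30, 30.0), np.full(70, 22.0)])
print(round(fill_fraction(temps), 2))  # -> 0.7
```

In practice the same idea would be applied across many columns of the image and combined with the tank outline discussed below, rather than to a single idealized column.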

[0021] In an example, the camera 110-1 can be a thermal video sensor, which can be in communication with the computer 116 via a wired or wireless connection. The thermal camera 110-1 can capture a thermal image of a storage tank 102. As can be seen in Fig. 1, the storage tank 102 can be partially filled with one or more liquids and/or a gas. In some embodiments, the liquid(s) stored in the storage tank 102 can have a higher heat capacity than the air that fills an empty portion of the tank. Accordingly, as a temperature in an environment in which the storage tank 102 is located varies, the portion of the storage tank 102 filled with the liquid can be more resistant to a temperature change, due to its higher heat capacity. Thus, in an example where the environment in which the storage tank 102 is located fluctuates in temperature, a temperature of an empty portion of the storage tank 102 (e.g., the portion of the tank located above line 122) can be different than a temperature of a filled portion of the storage tank (e.g., the portion of the tank located below line 122).

[0022] Additionally, the portion of the tank filled with liquids can contain liquids with differing heat capacities, which can produce different thermal profiles detectable by the camera 110-1. For example, where the portion of the tank located between lines 122 and 126 contains oil and the portion of the tank located below line 124 contains water, the two portions (e.g., oil containing portion between lines 122 and 126 and water containing portion below line 124) can have different thermal profiles due to the varying heat capacities of the two different liquids.

[0023] For instance, as a temperature increases throughout the day, a temperature of an empty portion of the tank can increase at a faster rate than a temperature of the filled portion, because a heat capacity of the liquid filling the filled portion is greater than a heat capacity of the air filling the empty portion. Accordingly, a thermal interface 122 can form between the filled portion and the empty portion. In an example, the thermal interface 122 can be a temperature differential between the filled portion and the empty portion.
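The heat-capacity reasoning above can be illustrated with rough, back-of-the-envelope numbers. The density and specific-heat values below are approximate textbook figures assumed for illustration only; they are not from the disclosure.

```python
# Why a thermal interface forms: for the same heat input per unit volume of
# tank contents, air warms far faster than oil because its volumetric heat
# capacity (density x specific heat) is orders of magnitude smaller.
rho_air, c_air = 1.2, 1005.0    # kg/m^3, J/(kg K), rough values for air
rho_oil, c_oil = 900.0, 2000.0  # kg/m^3, J/(kg K), rough values for crude oil

q = 50_000.0  # J absorbed per cubic metre of contents over some interval

dT_air = q / (rho_air * c_air)  # temperature rise of the empty portion
dT_oil = q / (rho_oil * c_oil)  # temperature rise of the filled portion
print(round(dT_air, 1), round(dT_oil, 3))  # -> 41.5 0.028
```

The three-orders-of-magnitude gap in temperature response is what makes the interface 122 visible to a thermal camera even when both portions receive similar heating.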

[0024] In some embodiments, the thermal camera 110-1 can capture the thermal image of the storage tank 102. In an example, the data associated with the thermal image of the storage tank 102 can be provided to the computer 116, as discussed herein.

[0025] The computer 116 can analyze the data and can determine a profile of the storage tank included in the thermal image of the storage tank 102. For example, an outline of the storage tank 102 can be determined in relation to the surrounding environment, which can be at a different temperature than the storage tank 102 in the thermal image. The profile of the storage tank 102 can be defined as an outline of the storage tank, which can be used in calculations made by the computer 116 that relate to how full the storage tank 102 is.

[0026] In some embodiments, artificial intelligence can be used to make determinations associated with characteristics of the tank 102. In some embodiments, artificial intelligence and/or machine learning can include specifying a mathematical model to characterize a process of interest and using algorithmic techniques to fit training data to that model for the purposes of extracting salient/contextually-relevant insights in a domain of interest (e.g., training an object detection model to recognize a tank and associated thermal interfaces 122, 124, 126). For example, in some embodiments, data received from the camera 110-1 can be analyzed using computer vision to determine a characteristic associated with the wellsite 100. In some embodiments, the data received from the camera 110-1 can be analyzed using computer vision to determine a characteristic associated with the tank 102.

[0027] In an example, computer vision can be used to determine where the thermal interface 122, 124 is located between the filled portion of the storage tank 102 and the empty portion of the storage tank 102. In an example, the thermal interfaces 122, 124 can be caused as a result of a changing temperature in an environment in which the tank 102 is located. In an example, as the tank heats up or cools off throughout changing temperatures during the day or night, the thermal interfaces 122, 124 can be caused due to the differing heat capacities of the fluids (e.g., air and liquid in the tank). For instance, a temperature of a liquid in the filled portion of the storage tank 102 can change at a slower rate than air in the empty portion of the storage tank 102.

[0028] Based on where the thermal interfaces 122, 124 are located, the level of fluid in the storage tank 102 can be determined. In an example, a level of fluid in the storage tank 102 can be determined based on a location of the thermal interfaces 122, 124 between the filled portion of the storage tank 102 and the empty portion of the storage tank 102. For instance, as depicted in Fig. 1, the thermal interface 124 occurs at a level that is roughly 0.1 of an overall height of the storage tank 102 and the thermal interface 122 occurs at a level that is roughly 0.7 of an overall height of the storage tank 102. Thus, a determination can be made that liquid filling the filled portion of the tank 102 occupies roughly 0.7 of the total tank volume (e.g., 70 percent of the total tank volume). Furthermore, a determination can be made that 60 percent of the total tank volume is filled with oil and 10 percent of the total tank volume is filled with water.
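The volume arithmetic in the paragraph above is simply a subtraction of interface heights, under the assumption (implicit in the example) that the tank has a uniform cross-section so that height fractions map directly to volume fractions:

```python
# Height fractions of the two thermal interfaces from Fig. 1.
water_interface = 0.1  # thermal interface 124 (water/oil boundary)
oil_interface = 0.7    # thermal interface 122 (liquid/air boundary)

liquid_fraction = oil_interface                 # everything below line 122
water_fraction = water_interface                # everything below line 124
oil_fraction = oil_interface - water_interface  # between lines 124 and 122

print(liquid_fraction, water_fraction, round(oil_fraction, 1))  # -> 0.7 0.1 0.6
```

For a non-cylindrical or horizontally mounted tank, the height-to-volume mapping would instead go through the tank's geometry.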

[0029] In some embodiments, a determination of the level of fluid in the storage tank 102 can be based on an area occupied by the liquid in a profile thermal image of the tank 102. For example, from a side profile view (e.g., two-dimensional) thermal image of the storage tank 102, the area occupied by the liquid (e.g., the filled portion of the storage tank 102) and/or by the air (e.g., the empty portion of the storage tank 102) can be determined. A proportion between the area occupied by the liquid and the area occupied by the air can then be determined. For example, as depicted in Fig. 1, more than twice as much of the storage tank 102 is occupied by the liquid as by the air. Accordingly, a determination can be made that the storage tank is 70 percent full.

[0030] In some embodiments, the level of the tank can be computed by the computer 116 and can be conveyed to a cloud computer or central computer. In some embodiments, data associated with the image taken by the camera 110-1 can be transmitted to the computer 116, a cloud computer, or a central computer located offsite, where one or more characteristics associated with the wellsite can be determined. As further discussed herein, in some embodiments, the image can include additional data. For example, the image can be labeled with additional data in some embodiments. In some embodiments, the image can be labeled with data that includes calculated levels associated with each of the thermal interfaces. For example, the image can be labeled with data that indicates that the thermal interface 124 is located at 0.1 of a height of the storage tank 102 and the thermal interface 122 is located at 0.7 of a height of the storage tank 102. In some embodiments, the labeled image can be transmitted, as further discussed herein. The labeled image can be used for training purposes with respect to the use of artificial intelligence, as further discussed herein.
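One possible shape for such a labeled-image record is sketched below: a reference to the raw frame plus the computed interface levels, usable both as a data-log entry and as a training example. Every field name and value here is a hypothetical assumption for illustration, not a format defined in the disclosure.

```python
# Hypothetical labeled-image record combining the frame reference with the
# calculated interface levels (fractions of tank height).
import datetime
import json

record = {
    "site_id": "wellsite-001",   # illustrative identifiers, not real ones
    "camera_id": "110-1",
    "captured_at": datetime.datetime(2022, 8, 26, 14, 30).isoformat(),
    "image_ref": "frames/2022-08-26T14-30.tiff",
    "labels": {
        "interface_124_height_fraction": 0.1,  # water/oil interface
        "interface_122_height_fraction": 0.7,  # liquid/air interface
    },
}
print(json.dumps(record, indent=2))
```

Serializing the labels alongside the image reference keeps each training example self-describing, so the same log can later be replayed to retrain or audit the model.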

[0031] In some embodiments, as discussed herein, computer vision can be used to identify one or more characteristics associated with the storage tank 102 through the associated image data thereof. In some embodiments, computer vision can include one or more algorithms that can be initially trained and continually updated to recognize the one or more characteristics associated with the storage tank 102. In some embodiments, computer vision can detect a change in pixel color between the interfaces 122, 124. For example, because of the differing heat capacities of the fluids stored in the storage tank, the interfaces 122, 124 can generally be represented as a change in color, represented by the thermal image data.

[0032] In some embodiments, computer vision can be used to determine the location of the interfaces 122, 124, even when there is not a clear demarcation of where the interface is. In an example, in some embodiments, the interfaces 122, 124 may not be defined lines that extend across the thermal image data. For instance, in some embodiments, the interfaces can be depicted on the thermal image as slow transitions from one color to another, creating a type of fuzzy transition and not a defined interface. In some embodiments, computer vision can be used and trained to identify transitions between the fluids, which are less than defined (e.g., are gradual transitions of color). In some embodiments, labeled images, as discussed herein, can be used for training to identify transitions between the fluids.
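For a fuzzy transition of the kind described above there is no single sharp jump to detect, so one simple (illustrative, assumed) approach is to smooth the temperature profile and take the gradient-weighted centre of the transition band as the interface location:

```python
# Sketch of locating a gradual (fuzzy) thermal transition. The smoothing
# window and centroid method are illustrative choices, not the disclosed
# implementation.
import numpy as np

def fuzzy_interface_row(column_temps: np.ndarray, window: int = 5) -> float:
    kernel = np.ones(window) / window
    smoothed = np.convolve(column_temps, kernel, mode="valid")
    grad = np.abs(np.diff(smoothed))
    if grad.sum() == 0:
        return float("nan")  # uniform profile: no interface visible
    rows = np.arange(len(grad))
    # Gradient-weighted centre of the transition band, offset for the window.
    return float((rows * grad).sum() / grad.sum() + window / 2)

# Gradual 10-row ramp from 30 C down to 22 C, centred near row 34.5.
ramp = np.concatenate([np.full(30, 30.0),
                       np.linspace(30.0, 22.0, 10),
                       np.full(60, 22.0)])
print(round(fuzzy_interface_row(ramp), 1))
```

The estimate lands near the middle of the ramp even though no single row contains a sharp edge, which is the behavior a trained detector would also need to reproduce for gradual color transitions.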

[0033] In some embodiments, the slow transition of color on the thermal image at the interfaces 122, 124 can be caused by the dissipation of heat in the wall of the storage tank 102. For example, the fluids can transfer heat through a wall of the storage tank 102. As the heat is transferred through the wall of the storage tank, the heat can be dispersed in the material of the tank, which in some embodiments can be metal and/or plastic. In some embodiments, the model can be initially trained with human-labeled images and can be further refined with automated training as more data and images are processed.

[0034] In some embodiments, the interface 124, which can be the interface between oil and water, may not be a clear line due to an emulsion that exists between the two fluids (e.g., water and oil). In an example, computer vision can be trained to detect instances where emulsions exist and interpret the thermal image data in a way that accounts for the emulsion in relation to the two fluids forming it. In an example, as depicted in Fig. 1, an emulsion exists between the interfaces 124, 126. In a physical sense, the interface between the oil and water can appear as a thicker band on the thermal image taken by the camera 110-1, since the emulsion can have a heat capacity that is different from both oil and water. As depicted, the emulsion can exist between the interface lines 124, 126. In some embodiments, computer vision can be used to recognize that the emulsion exists and where the emulsion is located in order to accurately identify where the transitions between water, emulsion, and oil are located.

[0035] In some embodiments, computer vision can be used to detect extraneous data associated with the storage tank 102. For example, in some embodiments, changes in temperature along the surface of the tank can be caused by external factors, not associated with the differing heat capacities of fluids stored in the storage tank 102. For instance, in some embodiments, the sun can shine on select portions of the storage tank 102, causing some portions of the storage tank to be heated and some portions of the storage tank 102 to not be heated.

[0036] In instances where the sun is rising or setting, other equipment on the well site and/or geographical features can cause portions of the storage tank 102 to be shaded, while other portions of the storage tank 102 are exposed to the sun. Accordingly, the portions of the storage tank 102 located in the sun can be heated and the portions of the storage tank 102 that are located in the shade may not be heated. Thus, the thermal image of the tank 102 can depict a transition between the heated portion of the storage tank 102 and the non-heated portion of the storage tank 102. In some embodiments, computer vision can determine the transition between the heated and non-heated portion of the storage tank 102 and can recognize that the transition is a transition due to a temperature change caused by the sun and not a transition between fluids stored in the tank 102. Accordingly, embodiments of the present disclosure can classify and deal with extraneous data related to environmental conditions.

[0037] In an example, artificial intelligence can interpret the data obtained from the camera 110-1 and can make a determination based on the data. In some embodiments, additional information can be provided to the system. In some embodiments, the additional information can include location information associated with a geographic position of the wellsite. In some embodiments, the additional information can include data associated with surrounding geography, vegetation, structures, etc., which may cast a shadow on the storage tank 102. In some embodiments, the additional information can include data associated with sun charts with respect to the times at which the sun sets and rises. In some embodiments, the additional information can include weather information associated with an area in which the wellsite is located. For example, the weather information can include wind speed, wind direction, temperature, humidity, amounts of precipitation, etc. In some embodiments, the additional information can include information with respect to a location of equipment on the wellsite, which can, for example, cause shadows to be cast on the storage tank 102. In some embodiments, the additional information can include a location of the camera and a direction in which the camera is facing, as well as a location and direction of the storage tank 102 with respect to the camera 110-1.

[0038] In some embodiments, the additional information can be used by artificial intelligence to identify features that might otherwise be interpreted as one or more false interfaces between fluids in the storage tank.
For example, in some embodiments, where a shadow is cast on the storage tank, artificial intelligence can be used to consider some or all of the additional information alongside the data obtained from the camera 110-1 to determine whether a detected interface is actually a transition between fluids rather than a shadow being cast on the tank.

[0039] In some embodiments, computer vision can learn from previous patterns of data and utilize those patterns when making determinations of characteristics on the wellsite. For example, if a shadow is cast on the storage tank 102 every day at a particular time, computer vision can be used to make a determination that data obtained as a result of the shadow being cast is not representative of an actual level of fluid in the storage tank 102.

[0040] For example, in some embodiments, the additional data can include a time of day and a time the sun rises, as well as the location of equipment that casts a shadow on the storage tank. As mentioned, in some embodiments, artificial intelligence can consider this data and can determine that a thermal transition is due to a shadow being cast on the storage tank and not an interface existing between fluids.

[0041] In some embodiments, artificial intelligence can determine how many interfaces are present and can determine that a detected thermal transition line is not an interface between fluids, based on the total number of thermal transition lines. For example, if four thermal transition lines are present when only three should be present, a determination can be made that one of the thermal transition lines is a false interface between fluids. In some embodiments, the computer vision can factor out the false data due to shadows on the tank, based on a rate of movement of the false shadow data relative to the other transition lines. For example, where the false shadow data represents a line moving in a direction opposite to the other transition lines present on the tank and/or moving at a much more rapid pace, due to movement of the shadow, computer vision can disqualify that data from further analysis.
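The count and motion checks described in paragraph [0041] can be sketched in Python. This is a hypothetical illustration only, not an implementation from the disclosure; the line-tracking stage is assumed to exist upstream, and the data layout and the `speed_factor` threshold are assumptions.

```python
# Hypothetical sketch of disqualifying false thermal transition lines.
# Each candidate line is assumed to be tracked as a dict with 'y' (row
# position) and 'v' (signed velocity in pixels per minute, positive = down).

def filter_false_transitions(lines, expected_count, speed_factor=5.0):
    """Return lines likely to be true fluid interfaces.

    A shadow edge typically moves much faster than a fluid level,
    and often in the opposite direction of the true interfaces.
    """
    if len(lines) <= expected_count:
        return list(lines)
    speeds = sorted(abs(l["v"]) for l in lines)
    median_speed = speeds[len(speeds) // 2]
    kept = []
    for line in lines:
        others = [l for l in lines if l is not line]
        # Prevailing direction of the other candidate lines.
        prevailing = sum(1 if l["v"] >= 0 else -1 for l in others)
        opposite = (line["v"] > 0) != (prevailing > 0) and prevailing != 0
        too_fast = abs(line["v"]) > speed_factor * max(median_speed, 1e-6)
        if not (opposite or too_fast):
            kept.append(line)
    return kept
```

A shadow line sweeping rapidly upward while three slow, downward-drifting fluid interfaces are present would be dropped by both tests.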

[0042] In some embodiments, artificial intelligence can be used to determine that one of the thermal transition lines is a false liquid interface, based on a geometry of the thermal transition line. For example, if the thermal transition line detected by computer vision is at an angle that is not horizontal, a determination can be made that the thermal transition line is not due to a liquid interface. In some embodiments, computer vision can look for patterns in data associated with the liquid interfaces and use those patterns to disqualify extraneous image data that could be categorized as liquid interfaces by an untrained system. For example, if the camera 110-1 is located below a vertical level of a liquid interface, the thermal image can have a slight upward (e.g., convex) curve where the liquid interface is located. If the camera 110-1 is located at a same vertical level as the liquid interface, the thermal image can depict a straight line where the liquid interface is located. If the camera 110-1 is located above a vertical level of the liquid interface, the thermal image can have a slight downward (e.g., concave) curve.
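The horizontality test in paragraph [0042] can be sketched as a least-squares line fit over the detected transition's edge pixels. This is an illustrative example, not taken from the disclosure; the point format and the angle tolerance are assumptions.

```python
# Hedged sketch: reject a thermal transition line whose fitted angle
# deviates too far from horizontal (suggesting a shadow edge, not a
# liquid level). Points are assumed to be (x, y) pixel coordinates.
import math

def is_plausible_liquid_interface(points, max_angle_deg=5.0):
    """Fit a line through edge points and test its angle from horizontal."""
    n = len(points)
    mean_x = sum(p[0] for p in points) / n
    mean_y = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mean_x) ** 2 for p in points)
    sxy = sum((p[0] - mean_x) * (p[1] - mean_y) for p in points)
    slope = sxy / sxx if sxx else float("inf")
    angle = abs(math.degrees(math.atan(slope)))
    return angle <= max_angle_deg
```

Note this simple check would also reject the slight convex/concave curvature described above if the tolerance were set too tight, so in practice the tolerance would need to accommodate perspective effects.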

[0043] In some embodiments, artificial intelligence can be used to generate various notifications associated with the data collected from the camera 110-1. In some embodiments, artificial intelligence can be used to generate a notification that a level of fluid in the storage tank 102 is nearing a capacity of the storage tank 102. In some embodiments, artificial intelligence can be used to generate a notification that includes an indication of an amount of oil versus an amount of water in the storage tank. In some embodiments, an alert can be generated when a ratio of water to oil exceeds a particular ratio. In some embodiments, the alert can be indicative of a problem with the separator 108, indicating that the separator 108 is not effectively removing water from the oil-water mix passing through the separator from a wellhead.
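The water-to-oil ratio alert in paragraph [0043] can be sketched as follows. The interface positions are assumed to be pixel rows already extracted by computer vision; the function name, message text, and default ratio are hypothetical.

```python
# Illustrative sketch of a separator alert from detected interface rows.
# Image rows increase downward, so the water column spans from the
# water/oil interface down to the tank bottom, with oil above it.

def separator_alert(tank_bottom, water_oil_interface, oil_surface,
                    max_water_ratio=0.5):
    """Return an alert string when the water fraction is too high."""
    water_height = tank_bottom - water_oil_interface
    oil_height = water_oil_interface - oil_surface
    if oil_height <= 0:
        return "no oil layer detected"
    ratio = water_height / oil_height
    if ratio > max_water_ratio:
        return (f"water/oil ratio {ratio:.2f} exceeds "
                f"{max_water_ratio}: check separator")
    return "ok"
```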

[0044] As further depicted in Fig. 1, a second camera 110-2 can be configured to provide image data associated with a flare 104. Although the cameras 110 are generally discussed herein as being directed towards a single piece of equipment and capturing image data of that single piece of equipment, cameras 110 of the present disclosure can be directed towards one or more pieces of equipment, allowing them to capture image data associated with the one or more pieces of equipment.

[0045] In some embodiments, computer vision can be used to recognize particular features associated with the image data of the flare 104. In some embodiments, the camera 110-2 can include an image sensor and/or a thermal sensor that is configured to capture image data of a plume 112 (e.g., smoke plume) generated by the flare 104 and/or a flame (not depicted) generated by the flare 104. In some embodiments, computer vision can be used to determine a color and/or size of the plume 112 and/or flame generated by the flare 104. In some embodiments, the color of the plume 112 generated by the flare 104 can be indicative of a health of the flare 104. For example, a plume that is black can indicate that the flare is running rich. In some embodiments, computer vision can be used to determine characteristics associated with the flare, even when extraneous data is included in the image captured with the camera 110-2. For example, in some embodiments, a background of the image can include clouds, which can make it difficult to distinguish between a plume 112 associated with the flare and the clouds. In some embodiments, the computer vision can be trained to recognize differences between the plume 112 and clouds based on motion associated with the plume versus clouds located in a background of the image. In an example, the plume 112 can move with a faster apparent velocity than background clouds, since the plume 112 can be located closer to the camera 110-2 than the clouds.

[0046] In some embodiments, the camera 110-2 can include image data sensing functionality as well as thermal sensing functionality. Alternatively, the camera 110-2 can include more than one camera (e.g., two cameras), one of which includes an image data sensor and one of which includes a thermal sensor.
In some embodiments, where clouds exist in a background of an image obtained with the camera 110-2, a combination of a thermal image and a visible-light image can be used to determine characteristics associated with the flare. For example, where a color of the plume 112 matches a color of the clouds, a combination of thermal mapping and image mapping with the two different types of cameras can be used to determine a size of the plume 112. The thermal data can first be used to locate a position and size of the plume 112, and the image data can then be used to determine a health of the plume 112.
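The apparent-velocity cue in paragraph [0045] can be sketched as a simple labeling step. The per-region speeds are assumed to come from an upstream optical-flow or tracking stage, and the speed ceiling for clouds is an illustrative assumption, not a value from the disclosure.

```python
# Hedged sketch: a plume near the camera shows a larger frame-to-frame
# displacement than distant background clouds, so regions can be labeled
# by apparent speed (pixels per frame).

def classify_moving_regions(regions, cloud_speed_ceiling=2.0):
    """regions: dict of region id -> apparent speed in pixels per frame.

    Returns a dict of region id -> 'plume' or 'cloud'.
    """
    return {
        rid: ("plume" if speed > cloud_speed_ceiling else "cloud")
        for rid, speed in regions.items()
    }
```

In a fielded system the ceiling would likely be learned from labeled footage rather than fixed.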

[0047] In some embodiments, artificial intelligence can be used to determine whether to generate a notification associated with the flare 104 health, as further discussed herein. In some embodiments, artificial intelligence can be used to make determinations associated with characteristics of the flare 104 (e.g., flare health). In some embodiments, artificial intelligence and/or machine learning can include specifying a mathematical model to characterize a process of interest and using algorithmic techniques to fit training data to that model for the purposes of extracting salient/contextually-relevant insights in a domain of interest (e.g., training an object detection model to recognize flares and black smoke from live camera streams given labeled training samples). For example, in some embodiments, data received from the camera 110-2 can be analyzed using computer vision to determine a characteristic associated with the wellsite. In some embodiments, the data received from the camera 110-2 can be analyzed using computer vision to determine a characteristic associated with the flare 104.

[0048] In some embodiments, a third camera 110-3 can be positioned such that an image of one or more portions of piping is captured. For example, as depicted in Fig. 1, a portion of piping adjacent to the well head 106 is within view of the camera 110-3, as well as an associated leak (e.g., methane leak). In some embodiments of the present disclosure, computer vision can be used to analyze the image to determine that a methane leak exists. In some embodiments, the third camera can include an image sensor and/or thermal sensor. In some embodiments, the image sensor can be configured to detect a wavelength of light associated with the methane leak. Methane absorbs throughout the infrared spectral range but has four fundamental absorption zones centered around 2.3 µm, 3.3 µm, 6.5 µm, and 7.7 µm, with the 3.3 µm and 7.7-7.8 µm zones showing the highest intensity. Although embodiments of the present disclosure are discussed in relation to a wellsite, embodiments of the present disclosure can also detect methane at other types of sites. For example, embodiments of the present disclosure can detect methane at a transfer station for natural gas, a waste management site, and/or a landfill. Further embodiments of the present disclosure can detect other types of fluids (e.g., gases and/or liquids) associated with other types of sites, such as ammonium nitrate vapor at a mine.

[0049] In further relation to Fig. 1, in some embodiments, the camera can be positioned such that a portion of the piping 114-1 of the wellsite is within view of the camera 110-3. In some embodiments, the camera 110-3 can be positioned such that a substantial portion of the piping 114-1, 114-2, . . . , 114-5 associated with the wellsite can be within view of the camera. Accordingly, the camera 110-3 can detect a leak among a majority of the piping 114-1, 114-2, . . . , 114-5 associated with the wellsite. In some embodiments, a plurality of cameras can be disposed on the wellsite, such that each camera monitors a portion of the piping 114-1, 114-2, . . . , 114-5 associated with the wellsite and together all or a substantial portion of the piping 114-1, 114-2, . . . , 114-5 of the wellsite can be monitored.

[0050] In some embodiments, computer vision can be used to analyze a signal obtained from the camera 110-3. In an example, computer vision can be used to look for characteristics of the image that fit characteristics associated with a fluid leak. In some embodiments, computer vision can detect leaks based on a difference in pixel color and/or intensity. For example, computer vision can be programmed with a baseline that does not include any leaks along the piping associated with the wellsite. When a leak is present, in some embodiments, computer vision can be used to detect the leak, based on a change in pixels of the image captured by the camera 110-3. In some embodiments, a fluid leak can be associated with a cluster of pixels that differ in color and/or intensity versus those pixels in a baseline image.
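The baseline comparison in paragraph [0050] can be sketched with plain nested lists standing in for grayscale frames. A real system would use OpenCV or similar; the intensity threshold and the minimum cluster size here are assumptions, not values from the disclosure.

```python
# Hedged sketch: flag pixels whose intensity shifted past a threshold
# relative to a leak-free baseline image, and only report a leak when
# enough changed pixels are present.

def detect_leak_pixels(baseline, current, threshold=30, min_cluster=5):
    """baseline/current: 2-D lists of equal shape with 0-255 intensities.

    Returns the (row, col) coordinates of changed pixels, or an empty
    list when too few pixels changed to suggest a leak.
    """
    changed = [
        (r, c)
        for r, row in enumerate(current)
        for c, val in enumerate(row)
        if abs(val - baseline[r][c]) > threshold
    ]
    return changed if len(changed) >= min_cluster else []
```

The returned pixel count could also feed the leak-size estimate mentioned in paragraph [0052].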

[0051] In some embodiments, computer vision can be used to detect false positives associated with an image captured with the camera 110-3. For example, computer vision can be used to detect the difference between a fluid leak and other types of anomalies, such as pooled water on the ground due to rain, etc.

[0052] In some embodiments, artificial intelligence can be used to determine what type of notification to generate. In some embodiments, artificial intelligence can be used to decide whether to generate a notification and what type of notification to generate. In some embodiments, artificial intelligence can be used to make a determination of a size of the leak, based on a number of pixels associated with the leak.

[0053] Although some embodiments of the present disclosure are discussed for use in relation to a wellsite, some embodiments can be used in relation to other types of sites. For example, some embodiments of the present disclosure can be used in relation to a transfer station for natural gas, a waste management site, and/or a landfill for the detection of methane gas. Some embodiments of the present disclosure can be used in relation to a mining site. For example, embodiments of the present disclosure can be used to monitor for vapors released from ammonium nitrate. In an example, embodiments of the present disclosure can be used to monitor for vapors released from ammonium nitrate stored in a silo at a mine. For example, in some embodiments, one or more cameras (e.g., optical gas imagers) can be disposed in proximity to the silo and/or around the mine, such that a field of view of the cameras can capture any areas where vapors released from ammonium nitrate would be likely to appear. In some embodiments, the camera can include an image sensor and/or thermal sensor. In some embodiments, the image sensor can be configured to detect a wavelength of light associated with the ammonium nitrate leak. The image sensor can be configured to detect wavelengths centered around 3.3 µm and 7.4 µm.

[0054] Some embodiments of the present disclosure can be used in relation to a natural gas transfer station and can detect the presence of methane gas at and/or around the natural gas transfer station. For example, a camera, which can include an image sensor and/or a thermal sensor, can be disposed at a location at which the camera can provide a field of view of the natural gas transfer station. In some embodiments, based on a signal obtained from the camera, a methane leak can be detected based on a wavelength of light associated with the methane leak.

[0055] Some embodiments of the present disclosure can be used in relation to a waste management site and/or landfill and can detect the presence of methane gas at and/or around the waste management site and/or landfill. For example, a camera, which can include an image sensor and/or a thermal sensor, can be disposed at a location at which the camera can provide a field of view of the waste management site and/or landfill. In some embodiments, based on a signal obtained from the camera, a methane leak can be detected based on a wavelength of light associated with the methane leak.

[0056] Some embodiments of the present disclosure can be used in relation to a mine and can detect the presence of methane gas at and/or around the mine. For example, a camera, which can include an image sensor and/or a thermal sensor, can be disposed at a location at which the camera can provide a field of view of the mine. In some embodiments, based on a signal obtained from the camera, a methane leak can be detected based on a wavelength of light associated with the methane leak.

[0057] Fig. 2 depicts a method 130 for monitoring a wellsite with artificial intelligence, in accordance with embodiments of the present disclosure. In some embodiments, the method 130 can include receiving 132 data from a wellsite sensor. In some embodiments, receiving the data from the wellsite sensor can include receiving image data from the wellsite sensor. For example, the wellsite sensor can include a camera in some embodiments, which can include an image and/or thermal sensor.

[0058] In some embodiments, the characteristic can be associated with a wellsite. For example, as discussed herein, in some embodiments, the camera can be disposed on a wellsite such that a field of vision of the camera is directed towards one or more pieces of process equipment located on the wellsite. In some embodiments, the piece of process equipment can include a storage tank in which liquid is stored. For example, the liquid can include a petroleum or other type of liquid associated with the oil and gas industry. In some embodiments, the storage tank can be associated with another type of industry. For example, the storage tank can be a silo, which houses a material, such as ammonium nitrate.

[0059] As previously discussed, as a result of differing heat capacities of fluids stored in the storage tank, a thermal image of the storage tank taken with the thermal imaging camera can be analyzed with computer vision to determine one or more interfaces located between various fluids stored in the storage tank. Thus, based on a location of one or more liquid interfaces, a level of the various fluids stored in the storage tank can be determined.

[0060] In some embodiments, the method can include analyzing 134 the data using artificial intelligence to determine a characteristic associated with the wellsite. In some embodiments, data received from the camera can be analyzed using artificial intelligence to make various determinations with respect to the data. For example, in some embodiments, computer vision can initially be used to interpret the data. For instance, computer vision can be used to determine particular patterns and/or features associated with the data, which can be associated with particular characteristics. For instance, as discussed herein, computer vision can interpret multiple liquid interfaces existing in the storage tank due to emulsions or thermal dissipation of heat through a material that the storage tank is made from, and/or can interpret artifacts due to weather-related phenomena, such as shadows cast on the storage tank from local geographic features and/or other process equipment located on the wellsite.
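The interface-finding step in paragraph [0059] can be sketched over a single vertical temperature profile: fluids with differing heat capacities produce a temperature step on the tank wall, and the step's row gives the liquid level. The per-row profile is an assumed input; a real system would operate on full thermal frames.

```python
# Hedged sketch: locate a fluid interface as the row with the largest
# temperature jump in a vertical thermal profile of the tank wall.

def find_interface_row(column_temps):
    """column_temps: list of per-row temperatures, top to bottom.

    Returns the row index where the largest step occurs.
    """
    best_row, best_jump = 0, 0.0
    for r in range(1, len(column_temps)):
        jump = abs(column_temps[r] - column_temps[r - 1])
        if jump > best_jump:
            best_row, best_jump = r, jump
    return best_row
```

Multiple interfaces (e.g., air/oil and oil/water) would correspond to multiple large jumps, which is where the count and geometry checks of paragraphs [0041] and [0042] come in.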

[0061] In some embodiments, the method can include monitoring a flare associated with the wellsite with the camera. As previously discussed herein, in some embodiments, the wellsite sensor can include a camera that monitors a flare associated with the wellsite. In some embodiments, a dedicated camera can monitor the flare. In other embodiments, a single camera can monitor a number of different pieces of process equipment on the wellsite. For example, a single camera can monitor a storage tank, along with the flare, among other types of process equipment. Accordingly, a single image can be taken that includes multiple pieces of equipment. Alternatively, a camera can be located on a gimbal and can be adjusted to take pictures of individual pieces of equipment.

[0062] In some embodiments, determining the characteristic associated with the wellsite can include determining a flare characteristic associated with the flare. In some embodiments, the flare characteristic can be determined using computer vision. For example, determining the flare characteristic can include determining the flare characteristic based on at least one of a color and size of the flare, using computer vision.

[0063] In some embodiments, artificial intelligence can be used to determine a type of notification to generate, based on the flare characteristic. In some embodiments where the characteristic includes that the flare is generating black smoke, artificial intelligence can be used to determine, first, that an indication should be sent to personnel to alert them that the flare is generating black smoke. Second, in some embodiments, artificial intelligence can be used to determine a cause for the flare producing black smoke, such as that the flare is running rich. Further alerts can be determined and sent to the personnel via artificial intelligence with respect to a reason that the flare is running rich.
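The escalation described in paragraph [0063] can be sketched as simple decision logic. The categories, messages, and plume-area limit are hypothetical; the disclosure does not specify the notification format.

```python
# Illustrative sketch: build a list of alerts from flare characteristics
# already extracted by computer vision (smoke color and plume area).

def flare_notifications(smoke_color, smoke_area_px, area_limit=500):
    """Return notification strings for the detected flare state."""
    alerts = []
    if smoke_color == "black":
        alerts.append("flare generating black smoke")
        alerts.append("possible cause: flare running rich")
    if smoke_area_px > area_limit:
        alerts.append("smoke plume unusually large")
    return alerts
```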

[0064] In some embodiments, the image data can include an image of a fluid leak. In some embodiments, image data as generally discussed herein can be in the form of a video, which includes numerous images compiled together. In some embodiments, the image data can include an image of a methane leak, an ammonium nitrate leak, among other types of leaks. In some embodiments, as discussed herein, a type of characteristic associated with the methane leak can be determined using computer vision. In some embodiments, a pattern of image data can be analyzed using computer vision, thus determining particular learned patterns associated with a methane leak and enabling those patterns to be used in future determinations associated with methane leaks. Although a methane leak is discussed herein, other types of leaks can be detected, such as those associated with ammonium nitrate.

[0065] In some embodiments, the image data can be labeled using artificial intelligence. For example, in some embodiments, one or more labels can be included on the image data based on one or more characteristics associated with the data and/or analysis of the data. For example, where artificial intelligence is used to make a determination with respect to a characteristic associated with the image data, a label can be included in the image data with respect to the type of determination made. For example, artificial intelligence can be used to determine the presence of a fluid (e.g., gas, liquid), which in some embodiments can be a fluid leak. In some embodiments, based on analysis of the image data associated with the fluid, the image can be labeled with data associated with characteristics of the fluid. In an example, in some embodiments, the image can be labeled with location data that indicates where the fluid is located. For instance, where the fluid is associated with a leak (e.g., gas leak), the location of the fluid can be indicated on the image.

[0066] In some embodiments, a determination associated with the image data can be transmitted. In an example, the image data can be transmitted to a server for review by a service technician who can be responsible for overseeing the location where the image data was acquired. In some embodiments, the image data may only be transmitted when a particular determination is made with respect to the image data. In an example, a plurality of images can be captured, however only a subset of those images may be transmitted for review, based on determinations associated with those images. For instance, where a determination is made that a fluid leak exists, one or more images that include the fluid leak can be transmitted, while images that do not include the fluid leak can be filtered from those that are transmitted. In some embodiments, the images can be labeled with data associated with the detected fluid (e.g., detected fluid leak). The data can then be reviewed by a service provider and used for mitigation of the detected fluid, in some embodiments.

[0067] In some embodiments, processing of the image can be performed on-site (e.g., in proximity to the site where the camera is located). Oftentimes, such sites (e.g., wellsites) can be located remotely and can have little cellular service and/or internet for use in transmitting data. Thus, it can be beneficial to process data on-site to avoid transmitting large amounts of data to a remote server. By processing the data on-site, only relevant images and data associated with the site can be transmitted to a remote server. In some embodiments, the data may be transmitted to a mobile device that is located within proximity to the wellsite. For example, in some embodiments the mobile device can be a mobile user device such as a tablet, cell phone, etc. When the mobile device nears the location of the site, the data may be transmitted to the mobile device.
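The bandwidth-saving filtering of paragraphs [0066] and [0067] can be sketched as follows. The frame/detection pairing is an assumed data layout; the disclosure does not specify one.

```python
# Hedged sketch: after on-site analysis, queue only frames that carry a
# detection for transmission, filtering out the rest to save bandwidth
# at remote sites with limited connectivity.

def frames_to_transmit(frames):
    """frames: list of (frame_id, detections) tuples.

    Returns the ids of frames whose detection list is non-empty.
    """
    return [fid for fid, detections in frames if detections]
```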

[0068] In some embodiments, the data associated with the fluid can include a volume of fluid, determined as discussed herein. For example, in some embodiments, a plurality of cameras can provide one or more images of a fluid escaping a site, such as a wellsite, natural gas transfer station, waste management site, landfill, and/or mine, among other types of sites. The images from the plurality of cameras can be compiled to generate a three-dimensional image, which can be used to determine a volume of a fluid depicted in the three-dimensional image.
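Once the multi-view images of paragraph [0068] have been fused into a three-dimensional representation, the volume computation itself is straightforward. This sketch assumes an occupancy voxel grid as the fused representation, which the disclosure does not specify; the multi-view fusion step is omitted.

```python
# Hedged sketch: given a 3-D voxel grid marking where the fluid was
# reconstructed (1 = occupied, 0 = empty), the fluid volume is the
# occupied-voxel count times the volume of one voxel.

def plume_volume(voxel_grid, voxel_edge_m):
    """voxel_grid: 3-D nested lists of 0/1; voxel_edge_m: edge length in meters."""
    occupied = sum(v for layer in voxel_grid for row in layer for v in row)
    return occupied * voxel_edge_m ** 3
```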

[0069] Fig. 3 depicts a diagram of an example of a computing device 140 for monitoring a wellsite with artificial intelligence, in accordance with various embodiments of the present disclosure. The computing device 140 can utilize software, hardware, firmware, and/or logic to perform a number of functions described herein. In an example, the computing device 140 can be representative of a remote transmitting unit, computer 116, etc.

[0070] The computing device 140 can be a combination of hardware and instructions 142 to share information. The hardware, for example can include a processing resource 144 and/or a memory resource 146 (e.g., computer-readable medium (CRM), database, etc.). A processing resource 144, as used herein, can include a number of processors capable of executing instructions 142 stored by the memory resource 146. Processing resource 144 can be integrated in a single device or distributed across multiple devices. The instructions 142 (e.g., computer readable instructions (CRI)) can include instructions 142 stored on the memory resource 146 and executable by the processing resource 144 to implement a desired function (e.g., monitoring a wellsite, as discussed herein).

[0071] The memory resource 146 can be in communication with the processing resource 144. The memory resource 146, as used herein, can include a number of memory components capable of storing instructions 142 that can be executed by the processing resource 144. Such a memory resource 146 can be a non-transitory CRM. Memory resource 146 can be integrated in a single device or distributed across multiple devices. Further, memory resource 146 can be fully or partially integrated in the same device as processing resource 144, or it can be separate but accessible to that device and processing resource 144. Thus, it is noted that the computing device 140 can be implemented on a support device and/or a collection of support devices, on a mobile device and/or a collection of mobile devices, and/or a combination of the support devices and the mobile devices.

[0072] The memory resource 146 can be in communication with the processing resource 144 via a communication link 148 (e.g., path). The communication link 148 can be local or remote to a computing device associated with the processing resource 144. Examples of a local communication link 148 can include an electronic bus internal to a computing device where the memory resource 146 is one of a volatile, non-volatile, fixed, and/or removable storage medium in communication with the processing resource 144 via the electronic bus.

[0073] Link 148 (e.g., local, wide area, regional, or global network) represents a cable, wireless, fiber optic, or remote connection via a telecommunication link, an infrared link, a radio frequency link, and/or other connectors or systems that provide electronic communication. That is, the link 148 can, for example, include a link to an intranet, the Internet, or a combination of both, among other communication interfaces. The link 148 can also include intermediate proxies, for example, an intermediate proxy server (not shown), routers, switches, load balancers, and the like.

[0074] In some embodiments, the computer readable instructions 142 can include instructions executable by the processing resource to receive image data from a camera that is configured to obtain images of one or more pieces of process equipment disposed on an oil and gas site. As discussed herein, the one or more pieces of process equipment can include a storage tank, a flare, piping, etc.

[0075] In some embodiments, the computer readable instructions can include instructions executable by the processing resource to analyze the image data using computer vision to determine a characteristic associated with the wellsite. As discussed herein, particular features of the image data, such as pixel color, pixel location, pixel intensity, etc., can be analyzed to determine particular patterns, which are associated with characteristics of the wellsite. For example, the characteristics of the wellsite can include image data associated with a liquid interface associated with a storage tank, image data associated with an amount of smoke generated by a flare, and/or image data associated with a methane leak coming from piping associated with the wellsite.

[0076] In some embodiments, a composition of the flame associated with the flare can be determined. In an example, a camera can include an image sensor that can be configured to detect a wavelength of light associated with the visualization of a fluid, such as methane. In some embodiments, based on the data associated with the image obtained by the camera, an amount of methane in the gas associated with the flare can be determined. In some embodiments, a plurality of cameras (e.g., three cameras) can be disposed at different locations around the flare. Accordingly, the image data associated with each one of the cameras can be combined into a three-dimensional image. Based on the three-dimensional image of the fluid (e.g., methane), a volume of methane existing in the flare combustion gases can be analyzed. For example, a size of a plume of methane can be determined based on the multiple perspectives associated with the three-dimensional image and thereby a volume associated with the methane.

[0077] In some embodiments, the volume of methane in the flare combustion gases can be quantified such that a volume of methane exiting the flare as unburned methane can be determined. In some embodiments, governmental reporting requirements may require that the particular amount of methane escaping into the atmosphere be recorded. Such reporting requirements may be associated with a fee that is levied based on the amount of escaping methane. Accordingly, embodiments of the present disclosure can determine the amount of methane escaping into the atmosphere, such that an appropriate tax can be levied based on the amount of methane.

[0078] In some embodiments, an efficiency of methane burn can be determined based on a determined amount of unburned methane exiting the flare. In some embodiments, a sensor can be included in a supply conduit of the flare, which can detect the amount (e.g., volume) of particular gases being supplied to the flare. As discussed above, a plurality of cameras (e.g., three cameras) can be disposed at different locations around the flare. Accordingly, the image data associated with each one of the cameras can be combined into a three-dimensional image. Based on the three-dimensional image of the fluid (e.g., methane), a volume of methane existing in the flare combustion gases can be analyzed.

[0079] In some embodiments, the volume of methane in the flare combustion gases can be quantified such that a volume of methane exiting the flare as unburned methane can be determined. Accordingly, the volume of unburned methane can be compared to methane that is provided to the flare via the conduit in order to determine an efficiency of methane burn.
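The efficiency comparison in paragraphs [0078] and [0079] reduces to a simple ratio, sketched below; the volumes are assumed to be expressed in the same units.

```python
# Sketch: fraction of supplied methane consumed by the flare, from the
# supply-conduit sensor volume and the camera-derived unburned volume.

def methane_burn_efficiency(supplied_volume, unburned_volume):
    """Return burn efficiency as a fraction in [0, 1]."""
    if supplied_volume <= 0:
        raise ValueError("supplied volume must be positive")
    return 1.0 - unburned_volume / supplied_volume
```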

[0080] In some embodiments, an amount of methane can be determined at a waste management site and/or landfill. In an example, as discussed above, taxes can be levied based on an amount of methane escaping from a waste management site and/or landfill. Many times, waste management sites and/or landfills can be backfilled to cover up any refuse at the site. Oftentimes a liner is disposed over the refuse to prevent methane from escaping the site. In some instances, vents may be installed to allow methane to escape, as the refuse decomposes. Some embodiments of the present disclosure can include detecting a methane leak from the site. In an example, a camera can be disposed on the site, such that a field of view of the camera includes the site. In some embodiments, the camera can include an image sensor that can be configured to detect a wavelength of light associated with the visualization of a fluid, such as methane.

[0081] Some embodiments of the present disclosure can include monitoring an amount of methane escaping the waste management site and/or landfill. In an example, when it is desired to detect an amount of methane escaping from the waste management site and/or landfill, a plurality of cameras can be disposed about the site and can be configured to detect methane. For example, in some embodiments, a plurality of cameras can be disposed about the site and can include image sensors that can be configured to detect a wavelength of light associated with the visualization of a fluid, such as methane. In some embodiments, based on the data associated with the image obtained by the camera, an amount of methane in the gas escaping the site can be determined. In some embodiments, a plurality of cameras (e.g., three cameras) can be disposed at different locations around the waste management site and/or landfill. Accordingly, the image data associated with each one of the cameras can be combined into a three-dimensional image. Based on the three-dimensional image of the fluid (e.g., methane), a volume of methane generated by the waste management site and/or landfill can be analyzed.

[0082] In some embodiments, the volume of methane escaping the waste management site and/or landfill can be quantified, such that a total volume of methane exiting the site over a reporting period can be determined. In some embodiments, governmental reporting requirements may require that the particular amount of methane escaping into the atmosphere be recorded. Such reporting requirements may be associated with a fee that is levied based on the amount of escaping methane. Accordingly, embodiments of the present disclosure can determine the amount of methane escaping into the atmosphere, such that an appropriate tax can be levied based on the amount of methane.
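The volume-to-fee conversion implied by paragraph [0082] could be illustrated as below. The methane density is an approximate physical value at ambient conditions; the per-tonne fee rate is a placeholder, not an actual statutory figure.

```python
def methane_fee(volume_m3, density_kg_per_m3=0.657, fee_per_tonne=900.0):
    """Convert an estimated escaped-methane volume to a mass-based fee.

    Methane density at roughly 25 C and 1 atm is about 0.657 kg/m^3;
    the fee rate here is illustrative only.
    """
    tonnes = volume_m3 * density_kg_per_m3 / 1000.0  # kg -> metric tonnes
    return tonnes * fee_per_tonne
```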

[0083] In some embodiments, the computer readable instructions 142 can include instructions executable by the processing resource to determine, with artificial intelligence, a type of notification to generate, based on the characteristic associated with the wellsite. In some embodiments, computer vision can determine data associated with, for example, a particular location of a liquid interface in relation to a height of a storage tank. In some embodiments, computer vision can determine a number of liquid interfaces located in the storage tank. In some embodiments, artificial intelligence can be used to interpret the data analyzed by the computer vision to determine that the level of oil is nearing a level at which the storage tank needs to be emptied and/or that an amount of water in the storage tank is greater than an allowed amount. Thus, artificial intelligence can be used to generate a notification that includes a level of liquid in the storage tank, that the storage tank needs to be emptied, and/or that a level of water in the storage tank is above an allowed amount and thus a problem may exist with the separator 108, as depicted in Fig. 1.
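The interface-finding step in paragraph [0083] could be sketched as follows for a thermal image: liquid and vapor (and oil versus water) conduct heat differently, so interfaces appear as abrupt temperature jumps down the tank wall. This is a minimal sketch assuming a single vertical 1-D temperature profile; the function names and thresholds are illustrative.

```python
import numpy as np

def tank_levels(thermal_column, jump_threshold=5.0):
    """Locate liquid interfaces in a 1-D thermal profile down a tank wall.

    `thermal_column` holds temperatures from tank top (index 0) to
    bottom. Returns a fill fraction (0 = empty, 1 = full) per interface.
    """
    diffs = np.abs(np.diff(thermal_column))
    rows = np.flatnonzero(diffs > jump_threshold)  # abrupt jumps
    height = len(thermal_column)
    return [1.0 - r / height for r in rows]

def tank_notification(levels, high_level=0.9):
    """Generate a notification when the topmost interface is too high."""
    if levels and max(levels) >= high_level:
        return "ALERT: tank nearing capacity, schedule emptying"
    return None
```

Multiple detected interfaces would indicate stratified liquids (e.g., water under oil), which the disclosure uses to flag a possible separator problem.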

[0084] In some embodiments, computer vision can be used to determine an amount and/or type of smoke associated with a flare. For example, based on the location, color, and/or intensity of pixels associated with the image data, an identification of the quantity and/or type of smoke can be made. Based on the identification, artificial intelligence can make a determination with respect to any notifications that should be sent to personnel, as discussed herein.
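A pixel-based smoke classification like the one described in paragraph [0084] could be sketched as below: smoke is low in color saturation, and its brightness separates black smoke (a rich burn) from white smoke or steam. The thresholds and function name are illustrative assumptions, not the disclosed algorithm.

```python
import numpy as np

def classify_smoke(rgb_image, smoke_area_threshold=0.05):
    """Classify flare smoke in an RGB frame by pixel color and intensity.

    Returns "black", "white", or "none" depending on which class of
    low-saturation (gray) pixels covers enough of the frame.
    """
    img = rgb_image.astype(float) / 255.0
    brightness = img.mean(axis=2)
    saturation = img.max(axis=2) - img.min(axis=2)
    gray = saturation < 0.15            # low color content: smoke candidate
    black = gray & (brightness < 0.35)  # dark smoke: rich combustion
    white = gray & (brightness > 0.75)  # bright smoke or steam
    total = brightness.size
    if black.sum() / total > smoke_area_threshold:
        return "black"
    if white.sum() / total > smoke_area_threshold:
        return "white"
    return "none"
```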

[0085] In some embodiments, based on image data of a fluid (e.g., gas) leak that has been analyzed by computer vision, artificial intelligence can be used to determine a type of a notification that should be sent to personnel. For example, computer vision can be used to determine that a gas leak is occurring and/or how large the gas leak is, based on analysis of the image data, including a location, color, and/or intensity of pixels located in the image. According to the identification made by computer vision, artificial intelligence can determine whether and what type of a notification to send to personnel.
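The decision described in paragraph [0085], whether and what type of notification to send based on leak size, could be sketched as a simple severity mapping. The tiers, thresholds, and channel names below are hypothetical placeholders.

```python
def leak_notification(leak_pixel_fraction):
    """Map a computer-vision leak estimate (fraction of the frame
    covered by plume pixels) to a notification tier.

    Returns None when no leak is detected, else (channel, message).
    """
    if leak_pixel_fraction <= 0.0:
        return None
    if leak_pixel_fraction < 0.01:
        return ("log", "possible small gas leak detected")
    if leak_pixel_fraction < 0.10:
        return ("email", "gas leak detected, inspection recommended")
    return ("page", "large gas leak detected, immediate response required")
```

A trained model could replace the fixed thresholds, but the output contract (a channel plus a message, or nothing) would stay the same.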

[0086] In some embodiments, external information can be referenced by the system to determine the likelihood that a puddle of liquid on the ground is due to a leak. In some embodiments, weather information can be referenced to determine if precipitation has recently occurred. If precipitation has occurred, in some embodiments, the system can be trained to perform further investigation. For example, the system can wait to see if the liquid evaporates, which would indicate the possibility that the liquid is water. Furthermore, in some embodiments, the system can identify if any pipes or process equipment are located nearby and analyze the image for any accumulation of liquid on the process equipment located adjacent to the puddle of liquid.
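The investigation logic of paragraph [0086] could be sketched as the heuristic below, combining recent-precipitation data, persistence of the puddle (rainwater should evaporate), and proximity of process equipment. The time windows and labels are illustrative assumptions.

```python
import datetime as dt

def assess_puddle(detected_at, last_rain_at, still_present_after_hours=None,
                  equipment_nearby=False, rain_window_hours=24):
    """Heuristic leak assessment for a detected ground puddle.

    Returns one of "likely_leak", "possible_leak", "monitor",
    or "probably_rainwater".
    """
    rained_recently = (detected_at - last_rain_at) < dt.timedelta(
        hours=rain_window_hours)
    if not rained_recently:
        # No rain to explain the puddle: nearby equipment raises suspicion.
        return "likely_leak" if equipment_nearby else "possible_leak"
    if still_present_after_hours is None:
        # Rain could explain the puddle: schedule a follow-up observation
        # and wait to see whether the liquid evaporates.
        return "monitor"
    if still_present_after_hours > 48 and equipment_nearby:
        return "likely_leak"
    return "probably_rainwater"
```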

[0087] In some embodiments, data associated with a determined characteristic associated with the wellsite can be recorded in a data log. In some embodiments, levels associated with liquid in a storage tank can be recorded in the data log. In some embodiments, the data log can be interactive, such that a user can click on data associated with the determination that a tank is full and can then be shown a picture of the tank from which the determination was made. Accordingly, this data can be used to further train the system with respect to the determination of tank level. In some embodiments, particular characteristics associated with a flare can be recorded in the data log, such as a color of a plume associated with the flare and a time stamp associated with an image. Furthermore, determinations made based on an image of the flare can be recorded in the data log, such as a determination that the flare is running rich. In some embodiments, the data log can be interactive, such that a user can click on data associated with the determination that the flare is running rich and can then be shown a picture of the flare from which the determination was made. Accordingly, this data can be used to further train the system with respect to the determination of flare health. Furthermore, determinations made based on an image of a leak can be recorded in the data log, such as a determination that a leak is currently occurring. In some embodiments, the data log can be interactive, such that a user can click on data associated with the determination that the leak is occurring and can then be shown a picture of the piping and suspected leak from which the determination was made. Accordingly, this data can be used to further train the system with respect to the determination of a leak. Although some embodiments of the present disclosure are discussed in relation to a wellsite, embodiments of the present disclosure can be used at other types of sites, as discussed herein.
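A data-log record of the kind described in paragraph [0087] pairs each determination with the image it was made from, so that a reviewer can click through to the source picture and confirmed entries can become labeled training data. The record layout and field names below are one hypothetical sketch.

```python
import datetime as dt
from dataclasses import dataclass
from typing import Optional

@dataclass
class LogEntry:
    """One interactive data-log record: a determination plus the image
    behind it, so the pair can later be reviewed and used for retraining."""
    timestamp: dt.datetime
    site: str
    characteristic: str           # e.g. "tank_level", "flare_rich", "leak"
    determination: str            # e.g. "tank 93% full"
    image_path: str               # source frame behind the determination
    reviewer_label: Optional[str] = None  # human confirmation, if any

def confirmed_training_pairs(log):
    """Return (image_path, label) pairs a reviewer has confirmed,
    suitable for further training of the vision model."""
    return [(e.image_path, e.reviewer_label)
            for e in log if e.reviewer_label is not None]
```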

[0088] In some embodiments, the data log can be accessible to a user that is remote to a wellsite. In some embodiments, the data log can be made accessible via a user's mobile device, such that the user can access the data log whether remote from or present at the wellsite.

[0089] Embodiments are described herein of various apparatuses, systems and/or methods. Numerous specific details are set forth to provide a thorough understanding of the overall structure, function, manufacture, and use of the embodiments as described in the specification and illustrated in the accompanying drawings. It will be understood by those skilled in the art, however, that the embodiments may be practiced without such specific details. In other instances, well-known operations, components, and elements have not been described in detail so as not to obscure the embodiments described in this specification. Those of ordinary skill in the art will understand that the embodiments described and illustrated herein are non-limiting examples, and thus it can be appreciated that the specific structural and functional details disclosed herein may be representative and do not necessarily limit the scope of the embodiments.

[0090] Reference throughout the specification to “various embodiments”, “some embodiments”, “one embodiment”, or “an embodiment”, or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment(s) is included in at least one embodiment. Thus, appearances of the phrases “in various embodiments”, “in some embodiments”, “in one embodiment”, or “in an embodiment”, or the like, in places throughout the specification, are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments. Thus, the particular features, structures, or characteristics illustrated or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or characteristics of one or more other embodiments without limitation given that such combination is not illogical or nonfunctional.