

Title:
AGRICULTURAL SYSTEMS AND PLATFORMS
Document Type and Number:
WIPO Patent Application WO/2021/081428
Kind Code:
A1
Abstract:
Described herein are platforms, methods, software, systems and devices for agricultural product management. In some embodiments, a platform as described herein includes a map configured to display a plurality of visual layers, a user interface to display the map and at least one user portal configured to access the user interface. The plurality of visual layers may represent components relating to a grow of an agricultural product in an agricultural facility or display data based on a grow of the agricultural product in the agricultural facility.

Inventors:
HAUK JAMES (US)
HARTNELL JACOB (US)
GRUNEBERG DANIEL (US)
STANEK MOLLY (US)
Application Number:
PCT/US2020/057195
Publication Date:
April 29, 2021
Filing Date:
October 23, 2020
Assignee:
SENSEI AG HOLDINGS INC (US)
International Classes:
G06T19/00; G01C3/08
Foreign References:
US20190124855A12019-05-02
US20150227297A12015-08-13
Attorney, Agent or Firm:
DENG, QiMing (US)
Claims:
CLAIMS

WHAT IS CLAIMED IS:

1. A system platform comprising: (a) a map of at least a portion of an agricultural facility, wherein the map is configured to display a plurality of visual layers; (b) a user interface configured to display the map; and (c) at least one user portal configured to access the user interface, wherein a plurality of visual layers are configured to (i) represent actuators relating to a grow of an agricultural product in the agricultural facility and (ii) display data based on the grow of the agricultural product in the agricultural facility.

2. The system platform of claim 1, wherein the map is a virtual reality map, an augmented reality map, or a mixed reality map.

3. The system platform of claim 2, wherein the map is the augmented reality map.

4. The system platform of any one of claims 1-3, wherein a visual layer of the plurality of visual layers is a virtual reality layer, an augmented reality layer, or a mixed reality layer.

5. The system platform of any one of claims 1-4, wherein a visual layer of the plurality of visual layers represents an actuator of the agricultural facility.

6. The system platform of any one of claims 1-5, wherein a visual layer of the plurality of visual layers is individually addressable by a user accessing a user portal of the at least one user portal.

7. The system platform of claim 6, wherein the visual layer is selectable by a user for visual enlargement or review of the data.

8. The system platform of any one of claims 1-7, wherein user access to a visual layer is controlled by a user portal.

9. The system platform of any one of claims 1-8, wherein the map is displayed on a single graphical screen.

10. The system platform of any one of claims 1-9, wherein the user interface is a graphical user interface.

11. The system platform of any one of claims 1-10, wherein the system platform is accessed on a smart phone.

12. The system platform of any one of claims 1-11, wherein the visual layer displays the data based on a plurality of grows.

13. The system platform of any one of claims 1-12, wherein the data comprises real-time data.

14. The system platform of any one of claims 1-13, wherein the data comprises data from a database.

15. The system platform of any one of claims 1-14, wherein the data is updated during the grow.

16. The system platform of claim 15, wherein at least a part of the data is continuously updated during the grow.

17. The system platform of any one of claims 1-16, wherein at least part of the data is received as an input from a sensor.

18. The system platform of any one of claims 1-17, wherein the data comprises data collected from a previous grow of the agricultural product.

19. The system platform of claim 17, wherein the sensor is a camera.

20. The system platform of any one of claims 1-19, wherein at least part of the data is received as an input from a user.

21. The system platform of claim 20, wherein the user is an agricultural product grower.

22. The system platform of any one of claims 1-21, wherein the data comprises a calculated result.

23. The system platform of claim 22, wherein the calculated result comprises a prediction of an agricultural product yield, a nutritional content of the agricultural product, a time to complete the grow, or any combination thereof.

24. The system platform of claim 22, wherein the calculated result comprises a threshold value of a water usage, an electrical usage, a remaining stock amount or any combination thereof.

25. The system platform of any one of claims 1-24, wherein the system platform includes a trained algorithm, and wherein a calculated result of the data is determined by the trained algorithm.

26. The system platform of any one of claims 1-25, wherein the data includes a plurality of operating parameters, wherein the plurality of operating parameters are (i) input by a user or (ii) received from an actuator or sensor.

27. The system platform of claim 26, wherein the plurality of operating parameters comprises a soil pH, a temperature, a humidity, or any combination thereof.

28. The system platform of any one of claims 1-27, wherein an actuator comprises a fan, a valve, a duct, a light source, a water source, a fertilizer source, a nutrient source, or any combination thereof.

29. The system platform of any one of claims 1-28, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.

30. The system platform of any one of claims 1-29, wherein the agricultural product comprises an animal-based product.

31. The system platform of any one of claims 1-30, wherein the agricultural product comprises a plant-based product.

32. The system platform of any one of claims 1-31, wherein the user is a consumer of the agricultural product, a business entity that sells the agricultural product to a consumer, an agricultural grower, or an agricultural manager.

Description:
AGRICULTURAL SYSTEMS AND PLATFORMS

CROSS-REFERENCE

[0001] This application claims the benefit of U.S. Provisional Patent Application No. 62/925,750, filed October 24, 2019, which is entirely incorporated herein by reference.

BACKGROUND

[0002] Agriculture management is an important function that oversees all aspects of running farms and other growing facilities that produce agricultural products. Agriculture management also involves farmers and landowners addressing profitability, fertility, and conservation. These management functions are essential to a successful farm business and to ensuring sufficient, nutrient-rich food for a population of food consumers.

BRIEF SUMMARY

[0003] An aspect of the present disclosure provides for a system platform. In some cases, the system platform may comprise: (a) a map of at least a portion of an agricultural facility, wherein the map is configured to display a plurality of visual layers; (b) a user interface configured to display the map; and (c) at least one user portal configured to access the user interface, wherein a plurality of visual layers are configured to (i) represent actuators relating to a grow of an agricultural product in the agricultural facility and (ii) display data based on the grow of the agricultural product in the agricultural facility. In some cases, the map may be a virtual reality map, an augmented reality map, or a mixed reality map. In some cases, the map may be the augmented reality map. In some cases, a visual layer of the plurality of visual layers may be a virtual reality layer, an augmented reality layer, or a mixed reality layer. In some cases, a visual layer of the plurality of visual layers may represent an actuator of the agricultural facility. In some cases, a visual layer of the plurality of visual layers may be individually addressable by a user accessing a user portal of the at least one user portal. In some cases, the visual layer may be selectable by a user for visual enlargement or review of the data. In some cases, user access to a visual layer may be controlled by a user portal. In some cases, the map may be displayed on a single graphical screen. In some cases, the user interface may be a graphical user interface. In some cases, the system platform may be accessed on a smart phone. In some cases, the visual layer may display the data based on a plurality of grows. In some cases, the data may comprise real-time data. In some cases, the data may comprise data from a database. In some cases, the data may be updated during the grow. In some cases, at least a part of the data may be continuously updated during the grow. In some cases, at least part of the data may be received as an input from a sensor. In some cases, the data may comprise data collected from a previous grow of the agricultural product. In some cases, the sensor may be a camera. In some cases, at least part of the data may be received as an input from a user. In some cases, the user may be an agricultural product grower. In some cases, the data may comprise a calculated result. In some cases, the calculated result may comprise a prediction of an agricultural product yield, a nutritional content of the agricultural product, a time to complete the grow, or any combination thereof. In some cases, the calculated result may comprise a threshold value of a water usage, an electrical usage, a remaining stock amount or any combination thereof. In some cases, the system platform may include a trained algorithm, and wherein a calculated result of the data is determined by the trained algorithm. In some cases, the data may include a plurality of operating parameters, wherein the plurality of operating parameters are (i) input by a user or (ii) received from an actuator or sensor. In some cases, the plurality of operating parameter may comprise a soil pH, a temperature, a humidity, or any combination thereof. In some cases, an actuator may comprise a fan, a valve, a duct, a light source, a water source, a fertilizer source, a nutrient source, or any combination thereof. 
In some cases, the agricultural facility may be a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof. In some cases, the agricultural product may comprise an animal-based product. In some cases, the agricultural product may comprise a plant-based product. In some cases, the user may be a consumer of the agricultural product, a business entity that sells the agricultural product to a consumer, an agricultural grower, or an agricultural manager.

INCORPORATION BY REFERENCE

[0004] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference. To the extent publications and patents or patent applications incorporated by reference contradict the disclosure contained in the specification, the specification is intended to supersede and/or take precedence over any such contradictory material.

BRIEF DESCRIPTION OF THE DRAWINGS

[0005] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also “figure” and “FIG.” herein), of which:

[0006] FIG. 1A and FIG. 1B show an example of a visual system platform comprising a map configured to display a plurality of visual layers.

[0007] FIG. 2 shows a non-limiting exemplary schematic diagram of the system for providing a VR-enhanced map of an agricultural facility to a user.

[0008] FIG. 3 shows an embodiment of a system such as used in an agricultural or visual system platform as described herein.

DETAILED DESCRIPTION

[0009] While various embodiments of the invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed.

[0010] Systems as described herein may include visual systems. A system may comprise a map of at least a portion of an agricultural facility, a user interface configured to display the map, and at least one user portal configured to access the user interface. The map may be configured to display one or more layers. A layer may be configured to (i) represent one or more components relating to a grow of an agricultural product in an agricultural facility, (ii) display data based on the grow of the agricultural product in the agricultural facility, or (iii) a combination thereof.

[0011] Visual representations of systems (such as an agricultural system) are described herein. Systems may include a map, such as an augmented reality map. The map may comprise an overlay of sensor data (such as light amount or water amount) and agricultural product data (such as growth rate or pest presence). At least a portion of the map (such as an augmented reality map) may be visible to a user (such as a consumer). This visualization may be important, for example, as verification to avoid food fraud or to increase trust in the producer/consumer relationship. A system as described herein may include software. The software may include a mobile app. The mobile app may utilize a camera to map out the general space and area of a growing environment or to map actuators such as valves, plumbing, electrical components, water sources, or fans. Automated scanning and development of new software/hardware deployments may be based on Wi-Fi signal strength mapping of IoT devices onto an augmented reality map.

[0012] In some cases, the system may comprise a high-level graphical interface that can show several streams of data concurrently in a format appropriate for the user viewing the information. Each system, location, input, output, combination thereof or others can be highlighted and viewed separately, selectable as a “layer” from the high-level system overview “map” that can be effectively experienced from a single screen.

[0013] Graphical layouts of an agricultural facility system (such as a 2D or 3D graphical layout) may provide an overhead view of the agricultural facility as a network of components (such as actuators or sensors). Individual components may be represented as graphical objects/images, interconnected via lines (such as color-coded lines) which represent flows of assets on the high-level control map. A graphical representation of pipelines in a given agricultural facility may show (a) one or more physical connections (i.e. pipes, wires, conveyors, etc.) between components, (b) real-time data related to physical I/O quantities/qualities or sensor data, or (c) a combination thereof, each of which may tie into location-specific outcomes (e.g. nutrition content or nutritional profile in harvested produce, time spent in each row, etc.) as it pertains to an agricultural product or research facility. A graphical screen may be akin to the primary GUI. The system may permit zooming in on a part of the whole system. For example, one or more views of finer levels of detail (i.e. a particular component or subsystem) may be visualized.

[0014] If the user is a farm owner, the user portal may provide access to view a plurality of information from a plurality of systems or components within a system. In some cases, a user portal, such as a user portal for a farm owner, may be configured to permit viewing all information of all systems. Scope of information or level of summary/detail may be switchable or may be configurable based on user input or user portal controls. For example, a user may specify: (i) to visualize electrical systems, (ii) to visualize fertilizer consumption from a portion of the agricultural facility, (iii) to visualize a location of the agricultural facility or a system of a plurality of systems with the highest labor cost or highest nutritional profile, or (iv) any combination thereof, or others. A status of an agricultural facility may be reduced down to a quick-view dashboard/list of uniquely identified (e.g. color-coded) system status indicators (e.g. green = each system is good, yellow = becoming out of spec, red = requires attention, etc.), or a more fully-featured graphical representation of the agricultural facility such as a bird’s eye view with each system or component showing its status across the map (with configurable detail).

[0015] If the user is an agricultural worker, they may be able to view a subset of systems relevant or related to their position. An agricultural worker may also enter or input information into a system via a user portal in various contexts throughout the facility, relevant to their training, position, and actual work.
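
The color-coded, quick-view status dashboard described above can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical example (the system names, thresholds, and readings are invented for illustration) of reducing per-system readings to green/yellow/red indicators:

```python
# Minimal sketch: reduce per-system readings to color-coded status indicators.
# System names, thresholds, and readings below are hypothetical examples.

SPEC = {
    "irrigation_flow_lpm": (40.0, 60.0),   # acceptable range (low, high)
    "greenhouse_temp_c":   (18.0, 27.0),
    "nutrient_ph":         (5.5, 6.5),
}

def status(name: str, value: float, margin: float = 0.10) -> str:
    """Return 'green' when in spec, 'yellow' when within a 10% margin, else 'red'."""
    low, high = SPEC[name]
    if low <= value <= high:
        return "green"
    span = (high - low) * margin
    if (low - span) <= value <= (high + span):
        return "yellow"   # becoming out of spec
    return "red"          # requires attention

readings = {"irrigation_flow_lpm": 44.2, "greenhouse_temp_c": 27.5, "nutrient_ph": 7.4}
dashboard = {name: status(name, value) for name, value in readings.items()}
print(dashboard)  # {'irrigation_flow_lpm': 'green', 'greenhouse_temp_c': 'yellow', 'nutrient_ph': 'red'}
```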

[0016] A user may be a produce consumer. A consumer may view a portion of the system with information such as nutritional data from a harvest, an amount of electrical power an agricultural facility may consume, an amount of resources (light, water) used for an agricultural product, or any combination thereof. Amounts or other information may be updated for a user in real time, such as during a grow. Immutable video recordings of one or more levels of detail from the agricultural facility, along with one or more forms of sensor data, may be relayed to the consumer to increase trust/confidence, and provide an auditable record of one or more conditions of produce and other assets as they pass through various parts of the agricultural facility/operation. Consumers may visualize data via a company website or be linked directly from a QR code or hyperlink on an agricultural produce display or on the agricultural produce itself.
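
As an illustration of the QR-code linking described above, the sketch below builds a per-lot URL and encodes it as a QR image. The domain, path scheme, and identifiers are hypothetical, and the example assumes the third-party qrcode package is available:

```python
# Minimal sketch: link a harvested lot to its consumer-facing data page via a QR code.
# The URL scheme and identifiers are hypothetical examples.
import qrcode  # third-party package: pip install qrcode[pil]

def product_url(facility_id: str, lot_id: str) -> str:
    """Build a deep link to the consumer-facing view for one harvest lot."""
    return f"https://example-farm-platform.com/facilities/{facility_id}/lots/{lot_id}"

url = product_url(facility_id="greenhouse-07", lot_id="2020-10-23-basil-A")
img = qrcode.make(url)          # returns a PIL image of the QR code
img.save("lot_label_qr.png")    # printed on the produce display or package
```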

[0017] Sensors, including image capture devices, may be positioned throughout an agricultural facility. Sensors may capture data of a grow, such as identifying a presence of a pest or a presence of a disease. Such data may permit correlation between the existence of the pest or disease and one or more environmental conditions under which the pest or disease may be permitted to invade the grow. Sensors, such as image capture devices, may provide input to a graphical interface to display an agricultural facility to a user. The input may be real-time input. The input may show images of components of the agricultural facility or images of a grow or images of a pest or disease, or any combination thereof. The input may show a specific sub-location in the agricultural facility where a problem may be located to aid in modifying a grow and/or directing the appropriate resource to address the problem, such as a presence of a pest or disease.

[0018] A graphical layout with high resolution or fine detail may be derived from camera imagery, such as from a digital camera. Images may be acquired by a drone or unmanned flying object. Drones may be configured to collect images from one or more sub-locations of a grow during one or more time points of a grow. A 3D map of a growing environment can be created from a number of inputs, such as aerial imagery or panoramas stitched together from the camera (such as a camera from a drone or from a mobile device). The combination of forms of metadata linked to this imagery, such as GPS, compass, and accelerometer data, may give the system sufficient information to create a virtual map of the actual environment as experienced from a point of view capable of seeing all the necessary components. This imagery can be reconstructed in a virtual environment, potentially abstracted for more focused visual detail, and annotated with features that may not be readily apparent from the imagery itself, e.g. real-time contents and flow rates of pipes shown or buried, discrete fertilizer content, etc. The additional information can be overlaid onto 2D/3D images/maps, for viewing from any number of physical interfaces - desktop, mobile, virtual reality (VR), augmented reality (AR), others, or any combination thereof. This may allow the user to view multiple systems together, to quickly see which parts of a given system may be having problems, and to quickly infer the cause, due to the user’s ability to see the co-located parts of multiple systems concurrently.
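
To make the metadata-driven mapping concrete, the sketch below converts the GPS position attached to a captured image into local planar coordinates on a facility map using an equirectangular approximation. The reference origin and the image record are hypothetical values, not data from the disclosure:

```python
# Minimal sketch: place geotagged imagery onto a local facility map.
# Uses an equirectangular approximation, adequate over the span of a single facility.
# The reference origin and image metadata are hypothetical examples.
import math

EARTH_RADIUS_M = 6_371_000.0
ORIGIN = (20.7967, -156.3319)  # (lat, lon) of a facility corner, hypothetical

def to_local_xy(lat: float, lon: float, origin=ORIGIN):
    """Convert latitude/longitude to meters east (x) and north (y) of the origin."""
    lat0, lon0 = origin
    x = math.radians(lon - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

# One drone image with its GPS/compass metadata (hypothetical values).
image_meta = {"file": "row12_0007.jpg", "lat": 20.79702, "lon": -156.33145, "heading_deg": 93.0}
x, y = to_local_xy(image_meta["lat"], image_meta["lon"])
print(f"{image_meta['file']} maps to ({x:.1f} m east, {y:.1f} m north) of the facility origin")
```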

[0019] Compiled map layers, similar to a GIS (geographic information system), may be generated on top of a 2D map or blueprint-type documents and user input from various types of cameras. The basic real imagery may be enriched with secondary layers of information and metadata, both from blueprints having to do with the systems themselves (i.e. virtual labels, hyperlinks to datasheets, connection diagram layers, etc.) and from real-time information about the status of facility systems and their contents, derived from sensor inputs, recent harvest/yield/nutrition analysis, and other information about the status of internal automation routines.

[0020] Once visual and spatial mapping of the agricultural facility is complete, the system may be used to automatically capture the location data (and display location-specific information) of one or more types of existing industrial equipment or IoT devices within the agricultural facility. The system may generate (such as automatically generate) recommendations for placement and type of future equipment/devices, based on information observed (such as a temperature gradient or data inconsistency). The system can automatically infer the location and necessity of existing and prospective equipment via Wi-Fi and/or Bluetooth signal triangulation, covering Wi-Fi dead zones observed by mobile devices moving throughout the space, given the relative proximity of other equipment, performance over time, and gradients observed through mobile sensors and/or sensor arrays, or any combination thereof.
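
The signal-strength positioning described above can be sketched as follows: convert received signal strength (RSSI) from known access points to approximate distances with a log-distance path-loss model, then solve for the device position by least squares. The access-point positions, RSSI values, and model constants below are hypothetical, and a real deployment would require calibration:

```python
# Minimal sketch: estimate an IoT device's position from Wi-Fi RSSI readings.
# Log-distance path-loss model plus least-squares trilateration.
# Access-point coordinates, RSSI values, and model constants are hypothetical.
import numpy as np
from scipy.optimize import least_squares

def rssi_to_distance(rssi_dbm: float, rssi_at_1m: float = -40.0, path_loss_exp: float = 2.2) -> float:
    """Invert the log-distance path-loss model: rssi = rssi_1m - 10*n*log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

# Known access-point positions (meters, facility coordinates) and observed RSSI (dBm).
aps = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 20.0], [30.0, 20.0]])
rssi = np.array([-55.0, -68.0, -60.0, -71.0])
dists = np.array([rssi_to_distance(r) for r in rssi])

def residuals(pos):
    """Difference between modeled AP-to-device distances and RSSI-derived distances."""
    return np.linalg.norm(aps - pos, axis=1) - dists

fit = least_squares(residuals, x0=np.array([15.0, 10.0]))
print("estimated device position (m):", fit.x.round(2))
```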

Definitions

[0021] Unless otherwise indicated, open terms for example “contain,” “containing,” “include,” “including,” and the like mean comprising.

[0022] The singular forms “a”, “an”, and “the” are used herein to include plural references unless the context clearly dictates otherwise. Accordingly, unless the contrary is indicated, the numerical parameters set forth in this application are approximations that may vary depending upon the desired properties sought to be obtained by the present invention.

[0023] Unless otherwise indicated, some instances herein contemplate numerical ranges. When a numerical range is provided, unless otherwise indicated, the range includes the range endpoints. Unless otherwise indicated, numerical ranges include all values and subranges therein as if explicitly written out. Unless otherwise indicated, any numerical ranges and/or values herein, following or not following the term “about,” can be at 85-115% (i.e., plus or minus 15%) of the numerical ranges and/or values.

[0024] As used herein, the term “user” may mean a food consumer, an agricultural grower, an agricultural manager, a business entity in the food consumer industry, or any combination thereof. A user may be a person that produces or assists in at least one aspect of producing the agricultural product. A user may be a farmer, a planter, a breeder, a stockman, an agriculturist, an agronomist, a rancher, a producer, a cropper, a harvester, a gardener, an orchardist, a horticulturist, a hydroponist, a pomologist, a viticulturist, or any combination thereof. A user may be a person in the farm business or agriculture business. A user may be an agricultural manager that oversees an aspect of the business. A user may be a CEO, CSO, or CFO of an agriculture facility. A user may be a person that purchases an agricultural product from a farmer or a food producer. A user may be a person that sells the agricultural product to a consumer. A user may be a consumer, a person who eats the agricultural product or who buys the agricultural product.

[0025] As used herein, the term “agricultural product” may mean a product produced for consumption by a person or animal. An agricultural product may include a plant or portion thereof, an animal or portion thereof, an animal product, or any combination thereof. An agricultural product may include one or more crops, a food, a nutrient, a consumable, a livestock, a plant, an animal, an animal product (such as dairy, milk, eggs, cheese), a plant product, or any combination thereof.

[0026] As used herein, the term “agricultural facility” may mean a facility for producing one or more types of an agricultural product. An agricultural facility may include a rural or urban facility or both. An agricultural facility may include an outdoor or indoor facility or both. An agricultural facility may include multiple different geographical locations or a singular geographical location. An agricultural facility may include a farm, a dairy farm, a livestock farm, a crop farm, a commercial farm, a fish farm, a meat farm, a poultry farm, a greenhouse, an orchard, a hydroponic farm, an urban farm, or any combination thereof. An agricultural facility may utilize natural growing elements such as sunlight, soil, and weather conditions of a geographical outdoor location. An agricultural facility may utilize artificial elements such as artificial light, artificial soil, artificial heat, or combinations thereof. An agricultural facility may utilize direct sun, indirect sun (such as from solar panels), or artificial light to grow crops.

[0027] As used herein, the term “farming practice” may mean a practice performed by one or more farms. A farming practice may include growing an agricultural product under organic food standards or not. A farming practice may include growing an agricultural product under non-GMO standards or not. A farming practice may include growing an agricultural product under hormone-free conditions or not. A farming practice may include growing an agricultural product under antibiotic-free conditions or not. A farming practice may include growing an agricultural product under environmentally sustainable practices or not. A farming practice may include growing an agricultural product under a reduced carbon footprint standard or not. A farming practice may include growing an agricultural product under fair-trade standards or not. A farming practice may include growing an agricultural product under a particular level of animal welfare standards or not. A farming practice may include growing an agricultural product under farm-raised conditions or raised in the wild. A farming practice may include growing an agricultural product under indoor conditions, open access conditions, or free range conditions. A farming practice may include growing an animal-based product on a grass-fed diet or not. A farming practice may include any of the foregoing examples or any combination thereof.

[0028] As used herein, the term “nutritional profile” may mean an agricultural product having one or more nutritional attributes. One or more nutritional attributes of an agricultural product may be quantified and communicated to a user, such as an agricultural manager, agricultural grower, food consumer, or any combination thereof. The platforms as described herein may execute a recipe to grow an agricultural product, the recipe having been optimized to maximize one or more nutritional attributes. A nutritional profile of an agricultural product may be compared across one or more farms growing the agricultural product. A nutritional attribute may include a taste or flavor, a color, a texture, a ripeness, a freshness, a vitamin content or amount, a mineral content or amount, a fat content or amount, a sugar content or amount, a carbohydrate content or amount, a pesticide content or amount, an anti-oxidant content or amount, or any combination thereof.

[0029] In general, the term “software” as used herein comprises computer readable and executable instructions that may be executed by a computer processor. In some embodiments, a “software module” comprises computer readable and executable instructions and may, in some embodiments described herein, make up a portion of software or may, in some embodiments, be a stand-alone item. In various embodiments, software and/or a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.

Layers of a map

[0030] Layers may be configured to be component specific. For example, the map may display one or more plumbing components in a first layer and one or more electrical components in a second layer. Layers may be configured to be data specific. For example, a first layer may comprise data of an agricultural product such as yield and a second layer may comprise data of a component such as water flow rate of a pump or temperature of the facility. Layers may be configured to be location specific. For example, a first layer may comprise data from a first sub-location of an agricultural facility and a second layer may comprise data from a second sub-location of the agricultural facility. Information or images displayed in a layer may be configured by a user. Access to one or more layers may be restricted based on a user type, such as a consumer or agricultural worker.
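
A minimal sketch of component-, data-, and location-specific layers with per-user-type access restriction might look like the following; the layer names, user roles, and access rules are hypothetical examples rather than the platform's actual schema:

```python
# Minimal sketch: map layers tagged by component type, data kind, and sub-location,
# filtered by user type. Layer names, roles, and access rules are hypothetical.
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Layer:
    name: str
    kind: str             # e.g. "plumbing", "electrical", "yield", "environment"
    sublocation: str      # e.g. "row-3", "greenhouse-2"
    visible_to: Set[str]  # user types allowed to view this layer

LAYERS = [
    Layer("Plumbing overview", "plumbing", "greenhouse-2", {"grower", "manager"}),
    Layer("Electrical panel", "electrical", "greenhouse-2", {"manager"}),
    Layer("Harvest yield", "yield", "row-3", {"grower", "manager", "consumer"}),
    Layer("Air temperature", "environment", "row-3", {"grower", "manager", "worker"}),
]

def layers_for(user_type: str, sublocation: Optional[str] = None):
    """Return the layers a given user type may view, optionally filtered by sub-location."""
    return [layer for layer in LAYERS
            if user_type in layer.visible_to
            and (sublocation is None or layer.sublocation == sublocation)]

print([l.name for l in layers_for("consumer")])          # ['Harvest yield']
print([l.name for l in layers_for("grower", "row-3")])   # ['Harvest yield', 'Air temperature']
```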

[0031] A layer may be a visual layer projected to a user via a graphical user interface. A layer may be interactive with a user. A layer may be adjustable in visual size. A layer may overlap another layer. A plurality of layers may overlap to form the map. A layer may be positioned adjacent another layer. A layer may comprise an imagery component, a data component, or a combination thereof. Information of a layer may be updated, such as updated in real-time.

[0032] A layer may comprise a projection or predicted value of data, such as an agricultural yield, a time to complete a grow, a total volume of water utilized for a grow, or any combination thereof, or others. A predictive value may be based on comparing data from a current grow to data collected from a previous grow or data from a database. A predictive value may be based on comparing data from a current grow to a training set of data. A predictive value may be determined by a trained algorithm of the system.

[0033] A map may be a virtual reality map, an augmented reality map, or a mixed reality map. A map may permit a user to survey an agricultural facility virtually or from a remote location. A map may permit a user (such as an agricultural manager) to make a business decision or a modification to an agricultural grow from a remote location. A map may permit a user (such as a consumer) to tour an agricultural facility such as prior to or after purchasing of an agricultural product produced in the agricultural facility. A map may provide a user greater transparency of agricultural grows of a particular facility. A visual layer may be a virtual reality layer, an augmented reality layer, or a mixed reality layer.

[0034] A layer of a plurality of layers may be individually addressable by a user. A layer may be selected from a plurality of layers. A layer may be selected by a user. Access to a layer may be controlled via a user portal such that different users may gain access to different layers. A layer may be visually expanded or enlarged. A layer may be visualized adjacent to at least a second layer. A layer may be visualized superimposed onto at least a second layer. Data from a layer may be exported or downloaded from the system. A layer may be modified, such as the type of data displayed in the layer. A layer may include a virtual image of a portion of the agricultural facility, an icon representing a portion or component of the agricultural facility (such as an icon of a pump or valve), data collected from a sensor or actuator of the facility, a movie of virtual images collected over time, a graph of data collected over time, a predicted value, or any combination thereof.

[0035] A map may be displayed on a single graphical screen. A map may be displayed in an augmented or virtual reality. A map may be displayed on a portable device, such as a tablet or smart phone. A map may be displayed at multiple locations, on multiple devices, to multiple users. A map may be displayed on a website.

Data

[0036] Data may be displayed in one or more layers of a map. Data may be collected from a sensor (such as a camera, temperature sensor, or moisture sensor) or an actuator of one or more agricultural facilities. Data may be intermittently collected, continuously collected, collected based on a user request, or any combination thereof. Data may be updated one or more times during a grow. A user may be notified when data is updated or when data reaches a particular cutoff or threshold value. Data may comprise data from a database of the system or imported from another database system. Data may comprise data collected during a previous grow. Data may comprise data collected during a grow of the same agricultural product or during similar growing conditions. Data may comprise data input to the system by a user, such as a visual assessment of an agricultural product as determined by a grower. Data may comprise a result, such as a result calculated by an algorithm. The result may be displayed in one or more layers, may be displayed in the map, may be provided to a user via a user portal, may be stored in a database, or any combination thereof. A result may comprise, for example, a threshold value of a water usage, an electrical usage, a remaining stock amount, or any combination thereof. A result may comprise a prediction of an agricultural product yield, a nutritional content of the agricultural product, a time to complete the grow, or any combination thereof. Data may include a plurality of operating parameters, such as a soil pH or nutrient content, a soil or air temperature, an air humidity, or any combination thereof.

Sensors

[0037] A sensor may collect data. A sensor may collect a single type of data. A sensor may collect more than one type of data. A data type that one or more sensors may collect may include: an image (such as an image of an agricultural product or portion thereof or an image within the agricultural facility), a temperature, a humidity, a pH, a light level, a light spectrum, an air flow, an air circulation level, an air composition, an air exchange rate, a fertilizer content, a fertilizer concentration, a nutrient content, a nutrient concentration, or any combination thereof. A sensor may collect data related to a climate or microclimate within an agricultural facility. A sensor may collect data on a disease type or disease level. A sensor may collect data on an agricultural product yield, size of product, amount of product, rate of growth, or any combination thereof. A sensor may collect data on the amount of resources utilized or rate of resources utilized, such as water, fertilizer, soil, nutrients, sunlight, heat, or any combination thereof. A sensor may collect data automatically. Automated data collection may include continuous collection, discrete collection (such as when a given threshold may be reached), incremental collection (such as collection at timed intervals), or any combination thereof. A sensor may collect data when a user (grower or farm manager) prompts the sensor. A sensor may be a user (such as a grower). In some embodiments, a user may provide a sensory assessment of an agricultural product and may input the data into the processor, such as a user-based estimate of product number.
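
The automated collection modes mentioned above (continuous, threshold-triggered, and interval-based) can be sketched as a small polling loop; the sensor stub, threshold, and interval below are hypothetical examples rather than any specific driver interface:

```python
# Minimal sketch: three automated data-collection modes for a sensor.
# The read_sensor() stub, threshold, and interval below are hypothetical.
import random
import time

def read_sensor() -> float:
    """Stand-in for a real driver call, e.g. a greenhouse temperature probe."""
    return 24.0 + random.uniform(-3.0, 3.0)

def collect(mode: str, samples: int = 5, threshold: float = 26.0, interval_s: float = 1.0):
    """Yield readings continuously, only past a threshold, or at timed intervals."""
    for _ in range(samples):
        value = read_sensor()
        if mode == "continuous":
            yield value
        elif mode == "discrete" and value >= threshold:   # threshold-triggered
            yield value
        elif mode == "incremental":                       # timed intervals
            time.sleep(interval_s)
            yield value

print(list(collect("discrete")))     # only readings at or above 26.0 degrees
print(list(collect("continuous")))   # every reading
```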

[0038] A sensor may be a camera, such as a digital camera. A camera may be a camera on a smart phone (such as an iPhone) or on a smart watch (such as an iWatch). A sensor may be a pH sensor, a temperature sensor, a light sensor, a humidity sensor, an air sensor, a turbidity sensor, a chemical sensor, or any combination thereof.

[0039] An array of sensors may include n x n sensors, such as 1x1 sensors, 2x2 sensors, 3x3 sensors, or more. An array of sensors may include n x m sensors, such as 1x3 sensors, 2x6 sensors, 3x9 sensors, or more. An array of sensors may include a plurality of sensors. A sensor in an array of sensors may be individually addressable by a user or by the processor. For example, a subset of sensors may collect data based on a given parameter - such as time or temperature.

Actuators

[0040] Adjusting an actuator may adjust one or more operating parameters of the agricultural facility. For example, adjusting a vent may adjust the temperature at one or more locations in the agricultural facility. An array of sensors in communication with an array of actuators may facilitate adjustments of the temperature in discrete locations of the agricultural facility that may be insufficiently heated compared to the remaining locations that are sufficiently heated. An adjustment in an actuator may be determined by the processor and based on data received from one or more sensors, users, or a combination thereof.
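
A minimal sketch of this sensor-to-actuator feedback, in which a processor decides an actuator adjustment per zone from a sensed temperature, is shown below; the setpoint, deadband, and zone readings are hypothetical values:

```python
# Minimal sketch: adjust a heating vent based on a per-zone temperature reading.
# Setpoint, deadband, and the zone readings below are hypothetical examples.

SETPOINT_C = 24.0
DEADBAND_C = 1.0   # no action while within +/- 1 degree of the setpoint

def control_step(zone: str, measured_c: float) -> str:
    """Decide an actuator command for one zone from one temperature reading."""
    if measured_c < SETPOINT_C - DEADBAND_C:
        return f"{zone}: open heating vent"      # zone is under-heated
    if measured_c > SETPOINT_C + DEADBAND_C:
        return f"{zone}: close vent / run fan"   # zone is over-heated
    return f"{zone}: hold"                       # within tolerance

# One reading per discrete location in the facility (hypothetical values).
readings = {"row-1": 21.5, "row-2": 24.3, "row-3": 26.1}
for zone, temp in readings.items():
    print(control_step(zone, temp))
```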

[0041] An actuator may include a vent, a shutter, a louver, a sprayer, a pump, a valve, a mixer, a heater, a fan, a light, a humidifier, a dehumidifier, an air exchanger, a gas pump, a water source, a wind source, a food source, a fertilizer source, a pest control source, an evaporative cooler, a gas generator (such as a CO2 generator), or any combination thereof.

[0042] An array of actuators may include n x n actuators, such as 1x1 actuators, 2x2 actuators, 3x3 actuators, or more. An array of actuators may include n x m actuators, such as 1x3 actuators, 2x6 actuators, 3x9 actuators, or more. An array of actuators may include a plurality of actuators. An actuator in an array of actuators may be individually addressable by a user or by the processor. For example, a subset of actuators may be adjusted based on a given parameter - such as time or temperature.

Customizable Interface

[0043] Farm management, growers, and farm workers constitute distinct user groups. The platform may be customized to each user group’s needs but built from a shared centralized data source. Managers and growers may each have access to a powerful customized dashboard and admin interface giving them a complete view of everything happening on the farm. Farm workers may have access to task management tools, barcode scanners, and applications for workstations such as harvesting, seeding, and packaging, as well as access to Standard Operating Procedure documentation. The platform may provide a plurality of discrete user interfaces. The platform may provide separate user interfaces for agricultural managers, agricultural growers, and food consumers. The platform may provide at least 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, 30, 40, 50, 100, 200, 500, 1,000, 10,000, 100,000, 1,000,000 discrete user interfaces or more. The platform may provide at least 50 discrete user interfaces. The platform may provide at least 500 discrete user interfaces. The platform may provide at least 5,000 discrete user interfaces. The platform may provide at least 50,000 discrete user interfaces. The platform may provide at least 500,000 discrete user interfaces. A discrete user interface may limit the data a user can access, whether a user can input data into the platform, what type of request a user can enter into the interface, or any combination thereof. For example, data entry may be reserved for agricultural grower or manager interfaces.
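
As an illustration of how a discrete user interface might gate data entry and request types per user group, consider the following sketch; the permission table and request names are hypothetical examples:

```python
# Minimal sketch: per-user-group permissions for a discrete user interface.
# The permission table and request types are hypothetical examples.
PERMISSIONS = {
    "manager":  {"can_input_data": True,  "requests": {"view", "export", "configure"}},
    "grower":   {"can_input_data": True,  "requests": {"view", "export"}},
    "worker":   {"can_input_data": False, "requests": {"view", "task_update"}},
    "consumer": {"can_input_data": False, "requests": {"view"}},
}

def is_allowed(user_group: str, request: str) -> bool:
    """Check whether a user group's interface accepts a given request type."""
    entry = PERMISSIONS.get(user_group, {"can_input_data": False, "requests": set()})
    if request == "input_data":
        return entry["can_input_data"]
    return request in entry["requests"]

print(is_allowed("consumer", "export"))     # False
print(is_allowed("grower", "input_data"))   # True: data entry reserved for growers/managers
```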

Machine Learning

[0044] A platform as described herein may comprise a machine learning module, such as a trained algorithm. A machine learning module may be trained on one or more training data sets. A machine learning module may be trained on at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 data sets or more. A machine learning module may be trained on from about 50 to about 200 data sets. A machine learning module may be trained on from about 50 to about 1,000 data sets. A machine learning module may be trained on from about 1,000 to about 5,000 data sets. A machine learning module may be trained on from about 5 to about 500 data sets. A machine learning module may generate a training data set from data acquired or extracted from a sensor or user, such as during an agricultural grow. A machine learning module may be validated with one or more validation data sets. A validation data set may be independent from a training data set. A training data set may comprise data provided by a sensor, data provided by a user, or any combination thereof. A training data set may be stored in a database of the platform. A training data set may be uploaded to the machine learning module from an external source. A training data set may be generated from data acquired from an agricultural grow. A training data set may be updated continuously or periodically. A training data set may comprise data from at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 different agricultural grows. A training data set may comprise data from about 50 to about 200 different agricultural grows. A training data set may comprise data from about 50 to about 1,000 different agricultural grows. A training data set may comprise data from about 1,000 to about 5,000 different agricultural grows. A training data set may comprise data from about 5 to about 500 different agricultural grows.

[0045] In some embodiments, the sensed parameter(s) described herein are received as an input by a processor, which outputs a correlation. In some embodiments, the correlation is received as an input to a machine learning algorithm configured to output guidance or instructions for future agricultural grows.

[0046] The systems, methods, and media described herein may use machine learning algorithms for training prediction models and/or making predictions for an agricultural grow. Machine learning algorithms herein may learn from and make predictions on data, such as data obtained from a sensor or user. Data may be any input, intermediate output, previous output, or training information, or otherwise any information provided to or by the algorithms.

[0047] A machine learning algorithm may use a supervised learning approach. In supervised learning, the algorithm can generate a function or model from training data. The training data can be labeled. The training data may include metadata associated therewith. Each training example of the training data may be a pair consisting of at least an input object and a desired output value. A supervised learning algorithm may require a user to determine one or more control parameters. These parameters can be adjusted by optimizing performance on a subset, for example a validation set, of the training data. After parameter adjustment and learning, the performance of the resulting function/model can be measured on a test set that may be separate from the training set. Regression methods can be used in supervised learning approaches.
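
A short sketch of the train/validation/test workflow described above, using scikit-learn on a synthetic table of grow records (the feature names, target, and control parameter grid are invented for illustration), might look like:

```python
# Minimal sketch: split grow records into training, validation, and test sets,
# tune one control parameter on the validation set, and report test performance.
# The synthetic data, feature names, and parameter grid are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform([5.5, 18.0, 40.0], [6.5, 27.0, 80.0], size=(300, 3))   # pH, temp, humidity
y = 2.0 * X[:, 1] - 5.0 * abs(X[:, 0] - 6.0) + rng.normal(0, 1.0, 300)  # yield proxy

X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

best_alpha, best_err = None, float("inf")
for alpha in (0.01, 0.1, 1.0, 10.0):                       # control parameter to tune
    model = Ridge(alpha=alpha).fit(X_train, y_train)
    err = mean_absolute_error(y_val, model.predict(X_val))
    if err < best_err:
        best_alpha, best_err = alpha, err

final = Ridge(alpha=best_alpha).fit(X_train, y_train)
print("chosen alpha:", best_alpha,
      "test MAE:", round(mean_absolute_error(y_test, final.predict(X_test)), 3))
```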

[0048] A machine learning algorithm may use an unsupervised learning approach. In unsupervised learning, the algorithm may generate a function/model to describe hidden structures from unlabeled data (i.e., a classification or categorization that cannot be directly observed or computed). Since the examples given to the learner are unlabeled, there is no evaluation of the accuracy of the structure that is output by the relevant algorithm. Approaches to unsupervised learning include: clustering, anomaly detection, and neural networks.

[0049] A machine learning algorithm is applied to agricultural data to generate a prediction model. In some embodiments, a machine learning algorithm or model may be trained periodically. In some embodiments, a machine learning algorithm or model may be trained non-periodically.

[0050] As used herein, a machine learning algorithm may include learning a function or a model. The mathematical expression of the function or model may or may not be directly computable or observable. The function or model may include one or more parameter(s) used within a model. For example, a linear regression model having a formula Y = C0 + C1x1 + C2x2 has two predictor variables, x1 and x2, and coefficients or parameters C0, C1, and C2. The predicted variable in this example is Y. After the parameters of the model are learned, values can be entered for each predictor variable in the model to generate a result for the dependent or predicted variable (e.g., Y).
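
As a worked illustration of the linear regression form above, fitting and then using Y = C0 + C1x1 + C2x2 might look like the following; the synthetic data, "true" coefficients, and feature meanings are hypothetical:

```python
# Minimal sketch: fit Y = C0 + C1*x1 + C2*x2 and use the learned parameters.
# The synthetic data and underlying coefficients are hypothetical examples.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
x1 = rng.uniform(18, 27, 200)          # e.g. mean growing temperature (deg C)
x2 = rng.uniform(12, 18, 200)          # e.g. daily light hours
Y = 1.5 + 0.8 * x1 + 2.0 * x2 + rng.normal(0, 0.5, 200)   # yield proxy with noise

model = LinearRegression().fit(np.column_stack([x1, x2]), Y)
C0, (C1, C2) = model.intercept_, model.coef_
print(f"learned parameters: C0={C0:.2f}, C1={C1:.2f}, C2={C2:.2f}")

# Enter values for each predictor variable to generate the predicted variable Y.
print("predicted Y at x1=24, x2=16:", round(model.predict([[24.0, 16.0]])[0], 2))
```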

[0051] In some embodiments, a machine learning algorithm comprises a supervised or unsupervised learning method such as, for example, support vector machine (SVM), random forests, gradient boosting, logistic regression, decision trees, clustering algorithms, hierarchical clustering, K-means clustering, or principal component analysis. Machine learning algorithms may include linear regression models, logistic regression models, linear discriminant analysis, classification or regression trees, naive Bayes, K-nearest neighbor, learning vector quantization (LVQ), support vector machines (SVM), bagging and random forest, boosting and Adaboost machines, or any combination thereof.

[0052] Data input into a machine learning algorithm may include data obtained from an individual, data obtained from a practitioner, or a combination thereof. Data input into a machine learning algorithm may include data extracted from a sensor, from a user, or a combination thereof. Data input into a machine learning algorithm may include a product yield, an environmental condition, a pest resilience, a nutrient profile, a farming practice used, or any combination thereof.

[0053] Data obtained from one or more grows can be analyzed using feature selection techniques including filter techniques which may assess the relevance of one or more features by looking at the intrinsic properties of the data, wrapper methods which may embed a model hypothesis within a feature subset search, and embedded techniques in which a search for an optimal set of features may be built into a machine learning algorithm. A machine learning algorithm may identify a set of parameters that may provide an optimized grow.

[0054] A machine learning algorithm may be trained with a training set of samples. The training set of samples may comprise data collected from a grow, from different grows, or from a plurality of grows. A training set of samples may comprise data from a database.

[0055] A training set of samples may include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more data types. A training set of samples may comprise a single data type. A training set of samples may include different data types. A training set of samples may comprise a plurality of data types. A training set of samples may comprise at least three data types. A training set of samples may include data obtained from about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more grows. A training set of samples may include data from a single grow. A training set of samples may include data from different grows. A training set of samples may include data from a plurality of grows.

[0056] Iterative rounds of training may occur to arrive at a set of features to classify data. Different data types may be ranked differently by the machine learning algorithm. One data type may be ranked higher than a second data type. Weighting or ranking of data types may denote significance of the data type. A higher weighted data type may provide an increased accuracy, sensitivity, or specificity of the classification or prediction of the machine learning algorithm. For example, an input parameter of growing temperature may significantly increase crop yield, more than any other input parameter. In this case, growing temperature may be weighted more heavily than other input parameters in increasing crop yield. The weighting or ranking of features may vary from grow to grow. The weighting or ranking of features may not vary from grow to grow.

[0057] A machine learning algorithm may be tested with a testing set of samples. The testing set of samples may be different from the training set of samples. At least one sample of the testing set of samples may be different from the training set of samples. The testing set of samples may comprise data collected from before a grow, during a grow, after a grow, from different grows, or from a plurality of grows. A testing set of samples may comprise data from a database.

[0058] A training set of samples may include different data types - such as one or more input parameters and one or more output parameters. A testing set of samples may include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more data types. A testing set of samples may comprise a data type. A testing set of samples may include different data types. A testing set of samples may comprise a plurality of data types. A testing set of samples may comprise at least three data types. A testing set of samples may include data obtained from 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more grows. A testing set of samples may include data from a single grow. A testing set of samples may include data from different grows. A testing set of samples may include data from a plurality of grows.

[0059] A machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% accuracy. A machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% sensitivity. A machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% specificity. For example, a machine learning algorithm may classify with 90% accuracy that an agricultural yield will not succumb to pest infestation. A machine learning algorithm may classify a grow as having at least 90% likelihood of producing an agricultural product with superior nutritional profile as compared to a control. A machine learning algorithm may predict at least 95% likelihood of an agricultural yield under a range of growing temperatures.
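
For reference, accuracy, sensitivity, and specificity as used above can be computed from a confusion matrix as in this short sketch; the true and predicted labels are hypothetical (1 = pest infestation occurred, 0 = it did not):

```python
# Minimal sketch: accuracy, sensitivity, and specificity from a confusion matrix.
# The true/predicted labels below are hypothetical (1 = pest infestation, 0 = none).
y_true = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy    = (tp + tn) / len(y_true)   # fraction of all predictions that are correct
sensitivity = tp / (tp + fn)            # fraction of true infestations that were caught
specificity = tn / (tn + fp)            # fraction of clean grows correctly identified
print(accuracy, round(sensitivity, 2), round(specificity, 2))  # 0.8 0.75 0.83
```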

[0060] An independent sample may be independent from the training set of samples, the testing set of samples or both. The independent sample may be input into the machine learning algorithm for classification. An independent sample may not have been previously classified by the machine learning algorithm.

[0061] A trained algorithm (such as a machine learning software module) as described herein is configured to undergo at least one training phase wherein the trained algorithm is trained to carry out one or more tasks including data extraction, data analysis, and generation of output or result, such as a recipe for growing an agricultural product with maximal yield or with maximal nutritional benefit.

[0062] In some embodiments of the agricultural platform described herein, the agricultural platform comprises a training module that trains the trained algorithm. The training module is configured to provide training data to the trained algorithm, said training data comprising, for example, a data set from an agricultural facility or a data set from a previous grow.

[0063] In some embodiments, a trained algorithm is trained using a data set and a target in a manner that might be described as supervised learning. In these embodiments, the data set is conventionally divided into a training set, a test set, and, in some cases, a validation set. A target is specified that contains the correct classification of each input value in the data set. For example, a data set from one type of agricultural product is repeatedly presented to the trained algorithm, and for each sample presented during training, the output generated by the trained algorithm is compared with the desired target. The difference between the target and the generated output is calculated, and the trained algorithm is modified to cause the output to more closely approximate the desired target value, such as a maximized yield of the agricultural product. In some embodiments, a back-propagation algorithm is utilized to cause the output to more closely approximate the desired target value. After a large number of training iterations, the trained algorithm output will closely match the desired target for each sample in the input training set. Subsequently, when new input data, not used during training, is presented to the trained algorithm, it may generate an output classification value indicating which of the categories the new sample is most likely to fall into. The trained algorithm is said to be able to “generalize” from its training to new, previously unseen input samples. This feature of a trained algorithm allows it to be used to classify almost any input data which has a mathematically formulatable relationship to the category to which it should be assigned.
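
The iterative compare-and-adjust training described above can be sketched with plain gradient descent on a small linear model; the synthetic grow data, target, learning rate, and iteration count are hypothetical choices for illustration only:

```python
# Minimal sketch: repeatedly compare model output with the desired target and
# nudge the parameters to reduce the difference (gradient descent).
# The synthetic grow data, target, and learning rate are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(100, 2))         # two normalized input features
target = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5     # desired output (e.g. yield proxy)

weights, bias, lr = np.zeros(2), 0.0, 0.1
for step in range(2000):
    output = X @ weights + bias                  # current model output
    error = output - target                      # difference from the desired target
    weights -= lr * (X.T @ error) / len(X)       # adjust parameters toward the target
    bias -= lr * error.mean()

print("learned weights:", weights.round(2), "bias:", round(bias, 2))
# After many iterations the output closely matches the target for the training set.
```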

[0064] Unsupervised learning is used, in some embodiments, to train a trained algorithm to use input data such as, for example, agricultural product data and output, for example, a maximized yield or disease detection. Unsupervised learning, in some embodiments, includes feature extraction which is performed by the trained algorithm on the input data. Extracted features may be used for visualization, for classification, for subsequent supervised training, and more generally for representing the input for subsequent storage or analysis. In some cases, each training case may consist of a plurality of agricultural product data.

[0065] Trained algorithms that are commonly used for unsupervised training include k-means clustering, mixtures of multinomial distributions, affinity propagation, discrete factor analysis, hidden Markov models, Boltzmann machines, restricted Boltzmann machines, autoencoders, convolutional autoencoders, recurrent neural network autoencoders, and long short-term memory autoencoders. While there are many unsupervised learning models, they all have in common that, for training, they require a training set consisting of a data set of grows of an agricultural product, without associated labels.

[0066] A trained algorithm may include a training phase and a prediction phase. The training phase is typically provided with data in order to train the machine learning algorithm. Non-limiting examples of types of data inputted into a trained algorithm for the purposes of training include an agricultural product yield, an amount and type of raw materials, an environmental condition during a grow, a length of grow, a nutritional profile of the agricultural product, a soil composition, or any combination thereof. Data that is inputted into the trained algorithm is used, in some embodiments, to construct a hypothesis function to determine the presence of an abnormality. In some embodiments, a trained algorithm is configured to determine if the outcome of the hypothesis function was achieved and, based on that analysis, make a determination with respect to the data upon which the hypothesis function was constructed. That is, the outcome tends to either reinforce the hypothesis function with respect to the data upon which the hypothesis function was constructed or contradict the hypothesis function with respect to the data upon which the hypothesis function was constructed. In these embodiments, depending on how close the outcome tends to be to an outcome determined by the hypothesis function, the machine learning algorithm will either adopt, adjust, or abandon the hypothesis function with respect to the data upon which the hypothesis function was constructed. As such, the machine learning algorithm described herein dynamically learns through the training phase what characteristics of an input (e.g. data) are most predictive in optimizing a crop yield, minimizing pest infestation or carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.

[0067] For example, a trained algorithm is provided with data on which to train so that, for example, it is able to determine the most salient features of received agricultural product data to operate on. The trained algorithms described herein are trained in how to analyze the agricultural product data, rather than analyzing the agricultural product data using pre-defined instructions. As such, the trained algorithms described herein dynamically learn through training which characteristics of an input signal are most predictive in optimizing a crop yield, minimizing pest infestation or the carbon footprint of a product, maximizing the profit or nutritional value of a product, or any combination thereof.

[0068] In some embodiments, the trained algorithm is trained by repeatedly presenting it with agricultural product data spanning a range of successful and unsuccessful grows. The trained algorithm may be presented with data from grows having a high yield and data from grows that produced no product. The trained algorithm may be presented with data from grows having a high carbon footprint and data from grows having a minimized carbon footprint. A trained algorithm may receive heterogeneous data conveying the range and variability of data that the trained algorithm may encounter in a future grow. Agricultural product data may also be generated by computer simulation.

[0069] In some embodiments, training begins when the trained algorithm is given agricultural product data and asked to optimize a crop yield, minimize pest infestation or the carbon footprint of a product, maximize the profit or nutritional value of a product, or any combination thereof. The predicted output is then compared to the true data that corresponds to the agricultural product data. An optimization technique, such as gradient descent with backpropagation, is used to update the weights in each layer of the trained algorithm so as to produce closer agreement between the output predicted by the trained algorithm and the true data. This process is repeated with new agricultural product data until the accuracy of the network has reached the desired level.
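
By way of non-limiting illustration, the sketch below shows the iterative loop described above: the predicted output is compared with the true data, gradient descent updates the weights, and the process repeats until the error reaches a desired level. A single linear layer stands in for the trained algorithm, and the data are synthetic assumptions for the example.

```python
# Hedged sketch of the gradient-descent training loop described above.
# A single linear layer stands in for the trained algorithm; all data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
X = rng.random((300, 3))                   # grow features (synthetic)
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(0, 0.01, 300)  # "true" yields

w = np.zeros(3)                            # weights of the trained algorithm
learning_rate, target_mse = 0.1, 1e-3

for step in range(10_000):
    pred = X @ w
    error = pred - y                       # compare prediction with the true data
    mse = float(np.mean(error ** 2))
    if mse < target_mse:                   # stop once the desired accuracy is reached
        break
    grad = 2.0 * X.T @ error / len(y)      # gradient of the loss w.r.t. the weights
    w -= learning_rate * grad              # update weights toward closer agreement

print(f"stopped at step {step}, mse={mse:.5f}, weights={np.round(w, 2)}")
```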

[0070] In general, a machine learning algorithm may be trained using a large database of measurements and/or any features or metrics computed from such data, together with the corresponding ground-truth values. The training phase constructs a transformation function for predicting outcomes such as an optimized crop yield, minimized pest infestation or carbon footprint of a product, maximized profit or nutritional value of a product, or any combination thereof. The machine learning algorithm dynamically learns through training which characteristics or features of an input signal are most predictive in optimizing the features of an agricultural product, such as its nutritional profile. A prediction phase uses the constructed and optimized transformation function from the training phase to predict the optimization of the grow and the product yield.

Virtual Environments

[0071] As shown in FIG. 2, in some embodiments, disclosed herein is a system 100 for use with a visual system platform. In some embodiments, the system 100 comprises: a digital display 101 configured to display a virtual environment comprising a plurality of virtual images to a user 104. The system can include one or more sensors (e.g., movement sensors) 102 configured to sense a plurality of parameters of said individual (e.g., a body movement); and a processor or a digital processing device 103 configured to correlate said virtual environment or at least one virtual image of said plurality of virtual images with at least one parameter of said plurality of parameters.

[0072] In some embodiments, the digital display 101 is head-mounted. In some embodiments, the digital display is a liquid crystal display (LCD). In further embodiments, the display is a thin film transistor liquid crystal display (TFT-LCD). In some embodiments, the display is an organic light emitting diode (OLED) display. In various further embodiments, an OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display. In some embodiments, the display is a plasma display. In other embodiments, the display is a video projector. In yet other embodiments, the display is a head-mounted display in communication with the digital processing device, such as a VR headset or AR headset. In further embodiments, suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headset, and the like. In still further embodiments, the display is a combination of devices such as those disclosed herein.

[0073] In some embodiments, the virtual environment herein is a VR environment. In some embodiments, the virtual environment is an AR environment. In some embodiments, the virtual environment herein is a MxR environment. In some embodiments, the virtual environment comprises an image captured from a camera in an agricultural facility. In some embodiments, the virtual environment comprises a scene that is not in the actual environment that the individual is in. In some embodiments, each of the plurality of virtual images comprises a portion of the virtual environment. In some embodiments, the virtual environment does not include any element that is in the actual environment of the individual or a virtual representation of any element of the actual environment of the individual. In some embodiments, the virtual environment does not include a virtual representation of the individual. In some embodiments, the virtual environment includes a virtual representation of the individual, e.g., an avatar or an image of the individual.

[0074] In some embodiments, the sensor(s) 102 herein include one or more sensors. In some embodiments, the sensor comprises a movement sensor. In some embodiments, the movement sensor is configured to interface with the map, such as with one or more of the plurality of layers of the map, to manipulate, select, or visualize the map or layers.
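
By way of non-limiting illustration, the sketch below routes a recognized body-movement gesture from a movement sensor to a manipulation of one map layer, as described above. The gesture names and the layer attributes are hypothetical assumptions for the example and do not correspond to a specific interface of the disclosed platform.

```python
# Hedged sketch: routing a movement-sensor gesture to a map-layer action.
# Gesture names and layer attributes are hypothetical assumptions only.
from dataclasses import dataclass

@dataclass
class MapLayer:
    name: str
    visible: bool = True
    zoom: float = 1.0

def handle_gesture(layer: MapLayer, gesture: str) -> MapLayer:
    # Translate a recognized body movement into a layer manipulation.
    if gesture == "pinch_out":
        layer.zoom *= 1.25          # enlarge the selected layer
    elif gesture == "pinch_in":
        layer.zoom /= 1.25
    elif gesture == "swipe_left":
        layer.visible = False       # hide the layer
    elif gesture == "swipe_right":
        layer.visible = True        # show the layer
    return layer

irrigation = MapLayer("irrigation actuators")
print(handle_gesture(irrigation, "pinch_out"))
```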

[0075] In some embodiments, the sensor, e.g., one or more cameras, can be mounted overhead on the ceiling or on any fixed structural element above the individual. In some embodiments, the sensor is attached to a movable element, for example a movable arm which is mounted to a table or a transportable cart.

[0076] In some embodiments, the system 100 further comprises an output device 105 configured to provide a plurality of outputs to the user. In some embodiments, the plurality of outputs corresponds to or is related to at least one of the plurality of virtual images of the virtual environment.

[0077] In some embodiments, the output device includes one or more of, but is not limited to, a speaker, an earphone, and a headset.

[0078] In some embodiments, the system 100 herein includes a processor 103. The processor can be in communication with one or more of the digital display 101, the sensors 102, and the output device 105. Such communication can be wired or wireless communication. Such communication can be uni-directional or bi-directional so that data and/or commands can be communicated therebetween. In some embodiments, the processor 103 herein is configured to execute code or software stored on an electronic storage location of a digital processing device such as, for example, on the memory. In some embodiments, the processor herein includes a central processing unit (CPU).

Systems & Platforms

[0079] FIG. 3 shows an embodiment of a system such as used in a visual system or an agricultural platform as described herein, comprising a digital processing device 301. The digital processing device 301 includes a software application configured for agriculture management. Alternatively or in combination, the digital processing device 301 is configured to generate a trained algorithm (e.g., machine learning algorithm) such as by training the algorithm with a training data set. The digital processing device 301 may include a central processing unit (CPU, also “processor” and “computer processor” herein) 305, which can be a single core or multi-core processor, or a plurality of processors for parallel processing. The digital processing device 301 also includes either memory or a memory location 310 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 315 (e.g., hard disk), communication interface 320 (e.g., network adapter, network interface) for communicating with one or more other systems, and peripheral devices, such as cache. The peripheral devices can include storage device(s) or storage medium 365 which communicate with the rest of the device via a storage interface 370. The memory 310, storage unit 315, interface 320 and peripheral devices are configured to communicate with the CPU 305 through a communication bus 325, such as a motherboard. The digital processing device 301 can be operatively coupled to a computer network (“network”) 330 with the aid of the communication interface 320. The network 330 can comprise the Internet and/or a local area network (LAN). The network 330 can be a telecommunication and/or data network.

[0080] The digital processing device 301 includes input device(s) 345 to receive information from a user, the input device(s) in communication with other elements of the device via an input interface 350. Alternatively or in combination, the input device(s) include a remote device such as a smartphone or tablet that is configured to communicate remotely with the digital processing device 301. For example, a user may use a smartphone application to access sensor data, current actuator instructions, the smart recipe, or other information stored on the digital processing device 301. The digital processing device 301 can include output device(s) 355 that communicate with other elements of the device via an output interface 360.
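
By way of non-limiting illustration, the sketch below shows how a remote client such as a smartphone application might request sensor data from the digital processing device 301 over HTTP. The endpoint URL, host, and JSON fields are hypothetical assumptions; the disclosure does not prescribe a particular protocol or API.

```python
# Hedged sketch of a remote client requesting sensor data from the processing
# device. The endpoint path, host, and JSON fields are hypothetical assumptions.
import json
from urllib.request import urlopen

def fetch_sensor_data(host: str, sensor_id: str) -> dict:
    # Query the processing device for the latest reading of one sensor.
    with urlopen(f"http://{host}/sensors/{sensor_id}/latest") as resp:
        return json.load(resp)

# Example usage, assuming such an endpoint is exposed on the local network:
# reading = fetch_sensor_data("192.168.1.50:8080", "temp-01")
# print(reading["value"], reading["unit"], reading["timestamp"])
```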

[0081] The CPU 305 is configured to execute machine-readable instructions embodied in a software application or module. The instructions may be stored in a memory location, such as the memory 310. The memory 310 may include various components (e.g., machine-readable media) including, but not limited to, a random access memory component (e.g., RAM, such as a static RAM "SRAM" or a dynamic RAM "DRAM") or a read-only component (e.g., ROM). A basic input/output system (BIOS), including basic routines that help to transfer information between elements within the digital processing device, such as during device start-up, may also be stored in the memory 310.

[0082] The storage unit 315 can be configured to store files, such as sensor data, smart recipe(s), etc. The storage unit 315 can also be used to store an operating system, application programs, and the like. Optionally, the storage unit 315 may be removably interfaced with the digital processing device (e.g., via an external port connector (not shown)) and/or via a storage unit interface. Software may reside, completely or partially, within a computer-readable storage medium within or outside of the storage unit 315. In another example, software such as the software application and/or module(s) may reside, completely or partially, within the processor(s) 305.

[0083] Information and data can be displayed to a user through a display 335. The display is connected to the bus 325 via an interface 340, and transport of data between the display and other elements of the device 301 can be controlled via the interface 340.

[0084] Platforms, systems, and methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the digital processing device 301, such as, for example, on the memory 310 or electronic storage unit 315. The machine executable or machine readable code can be provided in the form of a software application or software module. During use, the code can be executed by the processor 305. In some cases, the code can be retrieved from the storage unit 315 and stored on the memory 310 for ready access by the processor 305. In some situations, the electronic storage unit 315 can be precluded, and machine-executable instructions are stored on memory 310.

[0085] In some embodiments, one or more remote devices 302 are configured to communicate with and/or receive instructions from the digital processing device 301, and may comprise any sensor, actuator, or camera as described herein. For example, in some cases, the remote device 302 is a temperature sensor that is configured to gather temperature data and send the data to the digital processing device 301 for analysis according to a smart recipe. The sensor can provide information such as sensor data, type of data, sensor ID, sensor location, metadata, or other data. In some cases, the remote device 302 is an actuator configured to perform one or more actions based on instructions received from the digital processing device 301.
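
By way of non-limiting illustration, the sketch below models the exchange described above: a remote temperature sensor reports a reading (value, type, ID, location, metadata), the reading is evaluated against a smart-recipe setpoint, and an instruction for an actuator is produced. The message fields, setpoint, and actuator identifier are hypothetical assumptions for the example.

```python
# Hedged sketch of the remote-device exchange described above. Message fields,
# the setpoint, and the actuator identifier are hypothetical assumptions only.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class SensorReading:
    sensor_id: str
    sensor_type: str
    location: str
    value: float
    metadata: Dict[str, str] = field(default_factory=dict)

def evaluate_against_recipe(reading: SensorReading, setpoint_c: float) -> dict:
    # Compare the reading with a smart-recipe setpoint and emit an actuator
    # instruction when the value drifts above the setpoint.
    if reading.sensor_type == "temperature" and reading.value > setpoint_c:
        return {"actuator_id": "fan-03", "action": "on"}
    return {"actuator_id": "fan-03", "action": "off"}

reading = SensorReading("temp-01", "temperature", "row 4, rack 2", 27.8,
                        metadata={"unit": "C"})
print(evaluate_against_recipe(reading, setpoint_c=26.0))
```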

Specific Embodiments

[0086] A number of methods and systems are disclosed herein. Specific exemplary embodiments of these methods and systems are disclosed below.

[0087] Embodiment 1. A system platform comprising: (a) a map of at least a portion of an agricultural facility, wherein the map is configured to display a plurality of visual layers; (b) a user interface configured to display the map; and (c) at least one user portal configured to access the user interface, wherein a plurality of visual layers are configured to (i) represent actuators relating to a grow of an agricultural product in the agricultural facility and (ii) display data based on the grow of the agricultural product in the agricultural facility.

[0088] Embodiment 2. The system platform of embodiment 1, wherein the map is a virtual reality map, an augmented reality map, or a mixed reality map.

[0089] Embodiment 3. The system platform of embodiment 2, wherein the map is the augmented reality map.

[0090] Embodiment 4. The system platform of any one of embodiments 1-3, wherein a visual layer of the plurality of visual layers is a virtual reality layer, an augmented reality layer, or a mixed reality layer.

[0091] Embodiment 5. The system platform of any one of embodiments 1-4, wherein a visual layer of the plurality of visual layers represents an actuator of the agricultural facility.

[0092] Embodiment 6. The system platform of any one of embodiments 1-5, wherein a visual layer of the plurality of visual layers is individually addressable by a user accessing a user portal of the at least one user portal.

[0093] Embodiment 7. The system platform of embodiment 6, wherein the visual layer is selectable by a user for visual enlargement or review of the data.

[0094] Embodiment 8. The system platform of any one of embodiments 1-7, wherein user access to a visual layer is controlled by a user portal.

[0095] Embodiment 9. The system platform of any one of embodiments 1-8, wherein the map is displayed on a single graphical screen.

[0096] Embodiment 10. The system platform of any one of embodiments 1-9, wherein the user interface is a graphical user interface.

[0097] Embodiment 11. The system platform of any one of embodiments 1-10, wherein the system platform is accessed on a smart phone.

[0098] Embodiment 12. The system platform of any one of embodiments 1-11, wherein the visual layer displays the data based on a plurality of grows.

[0099] Embodiment 13. The system platform of any one of embodiments 1-12, wherein the data comprises real-time data.

[00100] Embodiment 14. The system platform of any one of embodiments 1-13, wherein the data comprises data from a database.

[00101] Embodiment 15. The system platform of any one of embodiments 1-14, wherein the data is updated during the grow.

[00102] Embodiment 16. The system platform of embodiment 15, wherein at least a part of the data is continuously updated during the grow.

[00103] Embodiment 17. The system platform of any one of embodiments 1-16, wherein at least part of the data is received as an input from a sensor.

[00104] Embodiment 18. The system platform of any one of embodiments 1-17, wherein the data comprises data collected from a previous grow of the agricultural product.

[00105] Embodiment 19. The system platform of embodiment 18, wherein the sensor is a camera.

[00106] Embodiment 20. The system platform of any one of embodiments 1-19, wherein at least part of the data is received as an input from a user.

[00107] Embodiment 21. The system platform of embodiment 20, wherein the user is an agricultural product grower.

[00108] Embodiment 22. The system platform of any one of embodiments 1-21, wherein the data comprises a calculated result.

[00109] Embodiment 23. The system platform of embodiment 22, wherein the calculated result comprises a prediction of an agricultural product yield, a nutritional content of the agricultural product, a time to complete the grow, or any combination thereof.

[00110] Embodiment 24. The system platform of embodiment 22, wherein the calculated result comprises a threshold value of a water usage, an electrical usage, a remaining stock amount or any combination thereof.

[00111] Embodiment 25. The system platform of any one of embodiments 1-24, wherein the system platform includes a trained algorithm, and wherein a calculated result of the data is determined by the trained algorithm.

[00112] Embodiment 26. The system platform of any one of embodiments 1-25, wherein the data includes a plurality of operating parameters, wherein the plurality of operating parameters are (i) input by a user or (ii) received from an actuator or sensor.

[00113] Embodiment 27. The system platform of embodiment 26, wherein the plurality of operating parameters comprises a soil pH, a temperature, a humidity, or any combination thereof.

[00114] Embodiment 28. The system platform of any one of embodiments 1-27, wherein an actuator comprises a fan, a valve, a duct, a light source, a water source, a fertilizer source, a nutrient source, or any combination thereof.

[00115] Embodiment 29. The system platform of any one of embodiments 1-28, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.

[00116] Embodiment 30. The system platform of any one of embodiments 1-29, wherein the agricultural product comprises an animal-based product.

[00117] Embodiment 31. The system platform of any one of embodiments 1-30, wherein the agricultural product comprises a plant-based product.

[00118] Embodiment 32. The system platform of any one of embodiments 1-31, wherein the user is a consumer of the agricultural product, a business entity that sells the agricultural product to a consumer, an agricultural grower, or an agricultural manager.

[00119] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. It is not intended that the invention be limited by the specific examples provided within the specification. While the invention has been described with reference to the aforementioned specification, the descriptions and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. Furthermore, it shall be understood that all aspects of the invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is therefore contemplated that the invention shall also cover any such alternatives, modifications, variations or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.