
WO2021067847A1 - Agricultural platforms - Google Patents

Agricultural platforms

Info

Publication number
WO2021067847A1
WO2021067847A1
Authority
WO
WIPO (PCT)
Prior art keywords
platform
agricultural
data
agricultural product
sensors
Prior art date
Application number
PCT/US2020/054120
Other languages
French (fr)
Inventor
Molly STANEK
Jacob HARTNELL
James HAUK
Jay MONTONI
Original Assignee
Sensei Ag Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sensei Ag Holdings, Inc.
Publication of WO2021067847A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining

Definitions

  • Agriculture management is an important function to oversee all aspects of running farms and other growing facilities that produce agricultural products. Agriculture management also helps farmers and landowners address profitability, fertility, and conservation. These types of management functions are essential to a successful farm business and to ensuring sufficient and nutrient-rich food for a population of food consumers.
  • a platform for agriculture management comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield.
  • the agricultural product comprises an animal-based product.
  • the agricultural product comprises a plant-based product.
  • the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • the at least five agricultural products comprise different agricultural products.
  • the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • the array of sensors comprise at least 5 different types of sensors.
  • the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • the location comprises two or more locations. In some cases, the location is remote.
  • the location is within the agricultural facility.
  • the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • the data comprises data collected from previous grows of the agricultural product.
  • a platform for agriculture management comprising: (a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and (c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield.
  • the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. In some cases, the array of sensors comprise at least 5 different types of sensors. In some cases, the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof. In some cases, the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof. In some cases, the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • the processor directs a change in at least three operating parameters based on receipt of the data.
  • the location comprises two or more locations. In some cases, the location is remote. In some cases, the location is within the agricultural facility.
  • the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof. In some cases, the data comprises data collected from previous grows of the agricultural product.
  • a platform for agriculture management comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and (c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof.
  • the platform further comprises a database, wherein the database comprises a data set received from a plurality of agricultural facilities.
  • the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result.
  • the plurality of agricultural facilities is at least 5.
  • the plurality of agricultural facilities are located in different geographical locations.
  • the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof.
  • the user is a food consumer.
  • the user is a business entity that sells the agricultural product to a consumer.
  • the location comprises two or more locations.
  • the location is remote. In some cases, the location is within the agricultural facility.
  • the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. In some cases, the plurality of sensors comprise at least 5 different types of sensors.
  • the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • the agricultural product comprises an animal-based product. In some cases, the agricultural product comprises a plant-based product.
  • the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof. In some cases, the data comprises data collected from previous grows of the agricultural product.
  • a platform for agriculture management comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility; (c) a processor configured to receive data from at least one sensor of the plurality of sensors; and (d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer.
  • the plurality of discrete user interfaces is at least three.
  • a second user interface is configured for an agricultural grower.
  • a second user interface is configured for an agricultural manager.
  • FIG. 1 shows an embodiment of a system such as used in an agricultural platform as described herein.
  • a platform that includes a processor and user interface for management of production of an agricultural product.
  • a platform as described herein includes a plurality of sensors, such as a plurality of individual sensors and/or one or more arrays of sensors.
  • a platform as described herein includes a plurality of actuators, such as a plurality of individual actuators and/or one or more arrays of actuators.
  • a processor may receive data from at least one sensor and may direct a change in an operating parameter of at least one actuator.
  • a platform as described herein includes a database that comprises a data set. The data set may comprise data from a plurality of agricultural facilities.
  • the processor may include a trained algorithm.
  • the trained algorithm may be trained with a data set.
  • the trained algorithm may be trained to compare data input to the processor to produce a result.
  • the result may be prompted by a request from a user.
  • the platforms, systems, media, and methods described herein include a cloud computing environment.
  • a cloud computing environment comprises one or more computing processors.
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term “about” may mean the referenced numeric indication plus or minus 15% of that referenced numeric indication.
  • the term “user” may mean a food consumer, an agricultural grower, an agricultural manager, a business entity in the food consumer industry or any combination thereof.
  • a user may be a person that produces or assists in at least one aspect of producing the agricultural product.
  • a user may be a farmer, a planter, a breeder, a stockman, an agriculturist, an agronomist, a rancher, a producer, a cropper, a harvester, a gardener, an orchardist, a horticulturist, a hydroponist, a pomologist, a viticulturist, or any combination thereof.
  • a user may be a person in the farm business or agriculture business.
  • a user may be an agricultural manager that oversees an aspect of the business.
  • a user may be a CEO, CSO, or CFO of an agriculture facility.
  • a user may be a person that purchases an agricultural product from a farmer or a food producer.
  • a user may be a person that sells the agricultural product to a consumer.
  • a user may be a consumer, a person who eats the agricultural product or who buys the agricultural product.
  • an agricultural product may mean a product produced for consumption by a person or animal.
  • An agricultural product may include a plant or portion thereof, an animal or portion thereof, an animal product, or any combination thereof.
  • An agricultural product may include one or more crops, a food, a nutrient, a consumable, a livestock, a plant, an animal, an animal product (such as dairy, milk, eggs, cheese), a plant product, or any combination thereof.
  • an agricultural facility may mean a facility for producing one or more types of an agricultural product.
  • An agricultural facility may include a rural or urban facility or both.
  • An agricultural facility may include an outdoor or indoor facility or both.
  • An agricultural facility may include multiple different geographical locations or a singular geographical location.
  • An agricultural facility may include a farm, a dairy farm, a livestock farm, a crop farm, a commercial farm, a fish farm, a meat farm, a poultry farm, a greenhouse, an orchard, a hydroponic farm, an urban farm, or any combination thereof.
  • An agricultural facility may utilize natural growing elements such as sunlight, soil, and weather conditions of a geographical outdoor location.
  • An agricultural facility may utilize artificial elements such as artificial light, artificial soil, artificial heat, or combinations thereof.
  • An agricultural facility may utilize direct sun, indirect sun (such as from solar panels), or artificial light to grow crops.
  • a farming practice may mean a practice performed by one or more farms.
  • a farming practice may include growing an agricultural product under organic food standards or not.
  • a farming practice may include growing an agricultural product under non-GMO standards or not.
  • a farming practice may include growing an agricultural product under hormone-free conditions or not.
  • a farming practice may include growing an agricultural product under antibiotic-free conditions or not.
  • a farming practice may include growing an agricultural product under environmental-sustainable practices or not.
  • a farming practice may include growing an agricultural product under a reduced carbon footprint standard or not.
  • a farming practice may include growing an agricultural product under fair-trade standards or not.
  • a farming practice may include growing an agricultural product under a particular level of animal welfare standards or not.
  • a farming practice may include growing an agricultural product under farm raised conditions or raised in the wild.
  • a farming practice may include growing an agricultural product under in-door conditions, open access conditions, or free range conditions.
  • a farming practice may include growing an animal-based product on a grass-fed diet or not.
  • a farming practice may include any of the foregoing examples or any combination thereof.
  • the term “recipe” may mean a collection of parameters used in planting, growing, and/or maintaining an agricultural product.
  • parameters that might be found in a recipe include geographical location of the agricultural product, elevation of the agricultural product, environmental temperature, environmental humidity, environmental air quality, frequency of watering, amount of watering, frequency of application of fertilizer, amount of fertilizer used, frequency of pesticide applied, amount of pesticide used, soil composition, soil pH, agricultural product seed characteristics, and timing with respect to any actions carried out with respect to an agricultural product and/or seed of an agricultural product.
  • the term “nutritional profile” may mean an agricultural product having one or more nutritional attributes.
  • One or more nutritional attributes of an agricultural product may be quantified and communicated to a user, such as an agricultural manager, agricultural grower, food consumer, or any combination thereof.
  • the platforms as described herein may execute a recipe to grow an agricultural product, the recipe having been optimized to maximize one or more nutritional attributes.
  • a nutritional profile of an agricultural product may be compared across one or more farms growing the agricultural product.
  • a nutritional attribute may include a taste or flavor, a color, a texture, a ripeness, a freshness, a vitamin content or amount, a mineral content or amount, a fat content or amount, a sugar content or amount, a carbohydrate content or amount, a pesticide content or amount, an anti-oxidant content or amount, or any combination thereof.
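The cross-farm comparison described above can be sketched as follows. This is an illustrative sketch only, not the platform's actual implementation; the farm names, attribute keys, and values are hypothetical.

```python
# Illustrative sketch: comparing one nutritional attribute of the same
# agricultural product across several farms. Data are hypothetical.

def compare_attribute(profiles, attribute):
    """Rank farms by a single nutritional attribute, highest first."""
    return sorted(
        ((farm, attrs[attribute]) for farm, attrs in profiles.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

profiles = {
    "farm_a": {"vitamin_c_mg": 52.0, "sugar_g": 10.4},
    "farm_b": {"vitamin_c_mg": 61.5, "sugar_g": 9.8},
    "farm_c": {"vitamin_c_mg": 47.3, "sugar_g": 11.1},
}

ranking = compare_attribute(profiles, "vitamin_c_mg")
# ranking[0] is the farm with the highest vitamin C content
```

The same ranking could be surfaced to an agricultural manager, grower, or food consumer through their respective user interfaces.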
  • a “software” as used herein comprises computer readable and executable instructions that may be executed by a computer processor.
  • a “software module” comprises computer readable and executable instructions and may, in some embodiments described herein make up a portion of software or may in some embodiments be a stand-alone item.
  • software and/or a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof.
  • a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
  • the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application.
  • software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
  • a sensor may collect data.
  • a sensor may collect a single type of data.
  • a sensor may collect more than one type of data.
  • a data type that one or more sensors may collect may include: an image (such as an image of an agricultural product or portion thereof or an image within the agricultural facility), a temperature, a humidity, a pH, a light level, a light spectrum, an air flow, an air circulation level, an air composition, an air exchange rate, a fertilizer content, a fertilizer concentration, a nutrient content, a nutrient concentration or any combination thereof.
  • a sensor may collect data related to a climate or microclimate within an agricultural facility.
  • a sensor may collect data on a disease type or disease level.
  • a sensor may collect data on an agricultural product yield, size of product, amount of product, rate of growth, or any combination thereof.
  • a sensor may collect data on amount of resources utilized or rate of resources utilized, such as water, fertilizer, soil, nutrients, sun light, heat, or any combination thereof.
  • a sensor may collect data automatically. Automated data collection may include continuous collection, discrete collection (such as when a given threshold may be reached), incremental collection (such as collection as timed intervals), or any combination thereof.
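The three automated collection modes above (continuous, discrete threshold-triggered, and incremental interval-based) can be sketched as a single decision function. This is a hypothetical API written for illustration, not part of the described platform.

```python
# Illustrative sketch (hypothetical API): deciding whether to record a
# sensor reading under each of the three automated collection modes.

def should_record(mode, reading, *, threshold=None, elapsed_s=None, interval_s=None):
    """Decide whether to record this reading under a given collection mode."""
    if mode == "continuous":
        return True                      # record every reading
    if mode == "discrete":
        return reading >= threshold      # record only once a threshold is reached
    if mode == "incremental":
        return elapsed_s >= interval_s   # record on a timed interval
    raise ValueError(f"unknown mode: {mode}")

# Example: a temperature reading of 91 degrees Fahrenheit
record_now = should_record("discrete", 91.0, threshold=90.0)   # True
```

In practice a platform might mix modes per sensor, e.g. continuous imaging alongside interval-based pH sampling.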
  • a sensor may collect data when a user (grower or farm manager) prompts the sensor.
  • a sensor may be a user (such as a grower).
  • a user may provide a sensory assessment of an agricultural product and may input the data into the processor, such as a user-based estimate of product number.
  • a sensor may be a camera, such as a digital camera.
  • a camera may be a camera on a smart phone (such as an iPhone) or on a smart watch (such as an iWatch).
  • a sensor may be a pH sensor, a temperature sensor, a light sensor, a humidity sensor, an air sensor, a turbidity sensor, a chemical sensor, or any combination thereof.
  • An array of sensors may include nxn sensors, such as 1x1 sensors, 2x2 sensors, 3x3 sensors or more.
  • An array of sensors may include nxm sensors, such as 1x3 sensors, 2x6 sensors, 3x9 sensors or more.
  • An array of sensors may include a plurality of sensors.
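An nxm array of individually addressable sensors can be sketched as below. The grid layout, sensor IDs, and readings are hypothetical, written only to illustrate addressing a subset of the array.

```python
# Illustrative sketch (hypothetical layout): a 2x3 sensor array whose
# sensors are individually addressable by (row, column), with a subset
# selected on a parameter such as temperature.

sensor_grid = {
    (r, c): {"id": f"S{r}{c}", "temp_f": 60.0 + 5 * r + c}
    for r in range(2) for c in range(3)      # a 2x3 array
}

def select_sensors(grid, predicate):
    """Address the subset of sensors whose latest reading matches a predicate."""
    return [cell["id"] for pos, cell in sorted(grid.items()) if predicate(cell)]

cold_sensors = select_sensors(sensor_grid, lambda cell: cell["temp_f"] < 63.0)
```

A processor could then direct data collection, or actuator changes, only at the addressed subset.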
  • a sensor in an array of sensors may be individually addressable by a user or by the processor. For example, a subset of sensors may collect data based on a given parameter - such as time or temperature.

Actuators
  • Adjusting an actuator may adjust one or more operating parameters of the agricultural facility. For example, adjusting a vent may adjust the temperature at one or more locations in the agricultural facility.
  • An array of sensors in communication with an array of actuators may facilitate temperature adjustments in discrete locations of the agricultural facility that are insufficiently heated compared to the remaining, sufficiently heated locations.
  • An adjustment in an actuator may be determined by the processor and based on data received from one or more sensors, users, or a combination thereof.
  • An actuator may include a vent, a shutter, a louver, a sprayer, a pump, a valve, a mixer, a heater, a fan, a light, a humidifier, a dehumidifier, an air exchanger, a gas pump, a water source, a wind source, a food source, a fertilizer source, a pest control source, an evaporative cooler, a gas generator (such as a CO2 generator) or any combination thereof.
  • An array of actuators may include nxn actuators, such as 1x1 actuators, 2x2 actuators, 3x3 actuators or more.
  • An array of actuators may include nxm actuators, such as 1x3 actuators, 2x6 actuators, 3x9 actuators or more.
  • An array of actuators may include a plurality of actuators.
  • An actuator in an array of actuators may be individually addressable by a user or by the processor. For example, a subset of actuators may be actuated based on a given parameter - such as time or temperature.
  • An operating parameter may include at least in part an ingredient or raw material needed to grow an agricultural product.
  • An operating parameter may include a seed composition to grow a particular type of agricultural product, a water amount to provide to the agricultural product, a light amount or light spectrum to provide to the agricultural product, a soil composition to provide to the agricultural product, or any combination thereof.
  • an operating parameter may include a feed type, feed amount, feed frequency, water type, water amount, water frequency, or any combination thereof for optimal growth, health, or nutrition of an animal product.
  • An operating parameter may include an amount or type of raw ingredient.
  • An operating parameter may include an environmental condition to provide a nutritionally optimized product, a maximized product yield, a shortened growth cycle to achieve the agricultural product, a product yield that reduces an amount of raw ingredients needed or any combination thereof.
  • An operating parameter may include an acceptable range, such as a growing temperature from about 65 degrees Fahrenheit to about 90 degrees Fahrenheit.
  • An operating parameter may include a suggested starting value, such as a growing temperature of about 75 degrees Fahrenheit, a value that may be updated or modified during the course of a grow cycle.
  • An operating parameter may be modified by a user or by a control system.
  • An operating parameter may be modified based at least in part from data received from one or more sensors.
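Modifying an operating parameter from sensor data while respecting its acceptable range (the 65 to 90 degrees Fahrenheit growing temperature mentioned above) can be sketched as a simple clamped adjustment. The gain value and readings are hypothetical; a real control system would be considerably more involved.

```python
# Illustrative sketch: nudging a temperature setpoint toward a target
# based on a sensor reading, clamped to the acceptable 65-90 F range.

def adjust_setpoint(current, sensor_reading, target, lo=65.0, hi=90.0, gain=0.5):
    """Move the setpoint toward the target, clamped to the acceptable range."""
    proposed = current + gain * (target - sensor_reading)
    return max(lo, min(hi, proposed))

# Start at the suggested 75 F; the facility reads 70 F against a 75 F target.
new_setpoint = adjust_setpoint(75.0, 70.0, 75.0)   # 75 + 0.5 * (75 - 70) = 77.5
```

The clamp ensures that, however far the reading drifts, the directed actuator change stays inside the acceptable range.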
  • a platform may utilize a processor comprising a trained algorithm.
  • the trained algorithm may be trained on one or more agricultural products.
  • the trained algorithm may be trained to identify features in the data received from the one or more sensors and, based on the data received, predict outputs such as yield. Further, the trained algorithm may be trained to identify features in the data received from the one or more sensors and suggest changes in operating parameters of one or more actuators to enhance or optimize an output, such as yield production.
  • a trained algorithm may be trained to maximize yield production, minimize disease, minimize weed growth, or optimize agricultural nutritional content, animal welfare, water conservancy, soil conservancy, or any combination thereof.
  • a maximized yield production may be the most agricultural product produced in the smallest square footage of agricultural facility.
  • a maximized yield production may be the most agricultural product produced for the least amount of seed, water, or resource input.
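The "most product for the least resource input" notion above can be sketched as a simple yield-efficiency metric compared across grow cycles. The metric, weighting, and cycle data are hypothetical and purely illustrative.

```python
# Illustrative sketch: yield per combined unit of resource input,
# compared across grow cycles (hypothetical data and weighting).

def yield_efficiency(yield_kg, water_l, seed_g):
    """Yield per combined unit of water and seed input."""
    return yield_kg / (water_l + seed_g)

cycles = {
    "cycle_1": yield_efficiency(120.0, 900.0, 100.0),   # 120 / 1000 = 0.12
    "cycle_2": yield_efficiency(110.0, 700.0, 100.0),   # 110 / 800  = 0.1375
}
best = max(cycles, key=cycles.get)
# cycle_2 is more efficient despite the lower absolute yield
```

A trained algorithm optimizing a recipe could use such a metric (rather than absolute yield alone) as its objective.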
  • a Farm Management System may be a complete software solution for managing an agricultural facility, such as a farm.
  • the platform may integrate task management, enterprise resource planning (ERP) solutions, crop planning, computer vision, supply chain management, as well as sensors and automation into a singular powerful platform: the one application an agricultural manager may need to manage a data driven 21st century farm.
  • Collecting multiple functionalities into one platform may allow a user to find correlations amongst the data and take action or deliver useful insights with machine learning. This approach may give growers the feedback they need to maximize output and minimize inputs, all while optimizing for growing environmental conditions, nutritional value, yields, food safety, or any combination thereof.
  • a user may be able to collect, maintain, and compare data from multiple agricultural facilities at similar or different geographical locations. This ability can deliver the end consumer more insight into how one agricultural facility compares to another agricultural facility in practices, sustainability, food safety, nutrition, or any combination thereof.
  • the summary covers the modules that may comprise the farm management system individually, and also how they can be brought together to enable new functionality.
  • a series of sensors may be distributed at carefully chosen locations throughout the agricultural facility, collecting ongoing data on temperature, humidity, fertilizer content, fertilizer concentration, nutrient content, nutrient concentration, pH, light levels, light spectrums, air flow, air circulation, or any combination thereof.
  • Sensor arrays may open visibility into patterns in environmental variables such as temperature. For example, given a greenhouse with a wet wall for cooling at one end of it, there are differences in both temperature and humidity from the near side to the far side. Sensors placed in a grid system can identify, monitor and measure the various microclimates in the growing environments. This monitoring of microclimates in real-time can be input into a processor of the platform and the processor can direct an adjustment in an operating parameter of one or more actuators to correct an undesirable microclimate in a specific location in the agricultural facility. Real-time monitoring and adjustments to correct for undesirable microclimates can maximize agricultural product yield.
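Spotting microclimates in a grid of readings, such as the temperature gradient from a wet wall to the far side of a greenhouse, can be sketched as below. The grid values, setpoint, and tolerance are hypothetical.

```python
# Illustrative sketch: flagging grid cells whose temperature deviates
# from the setpoint, so actuators can be directed at those cells only.

readings = [
    [68.0, 71.5, 74.0],   # row nearest the wet wall (coolest corner)
    [70.0, 72.0, 76.5],   # far row (warmest corner)
]

def deviating_cells(grid, setpoint, tolerance):
    """Return (row, col) of cells outside setpoint +/- tolerance."""
    return [
        (r, c)
        for r, row in enumerate(grid)
        for c, temp in enumerate(row)
        if abs(temp - setpoint) > tolerance
    ]

hot_or_cold = deviating_cells(readings, 72.0, 2.0)
# only the corners outside 70-74 F are flagged
```

A processor could then direct, for example, a vent or heater serving only the flagged locations rather than the whole facility.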
  • Data collected from one or more sensors may be processed in real-time, triggering actuators (such as farming equipment) to make changes in the growing environment; e.g., modifications to temperature, adding nutrients to fertigation supply, powering off lights, or any combination thereof.
  • actuator arrays may consist of vents, heaters, fans, lights, humidifiers, dehumidifiers, other devices, or any combination thereof that can act on the environment or a subset of the environment (e.g. microclimate of a sub-location of the agricultural facility).
  • Processors with built-in machine learning or Artificial Intelligence (AI) can utilize data from one or more sensors of a sensor array and learn how best to use one or more actuators of an array of actuators to achieve an outcome in a specific environment. For example, if the desired temperature is 72 degrees Fahrenheit, a network of fans and directional vents can be configured to correctly circulate the air in an even distribution pattern.
  • Camera arrays can capture data from a portion of a growing area or an entire growing area.
  • the primary use of imaging technology may be correlating images with outcomes for growing plants, but it can also be used to monitor for pests, both insect and animal. If there is a pest detected, a user can know exactly which plants within the agricultural facility might be affected. This feature may be important for food safety assurances, research purposes, improving smart recipes, or any combination thereof.
  • Camera arrays may provide multiple perspectives of an agricultural product (such as different angles of a growing plant), data that can be utilized to determine agricultural product size (e.g. fruit size), leaf orientation, color, stage of disease, plant/animal stress, or any combination thereof.
  • camera biofeedback may be incorporated so that the processor can adjust the operating parameters (such as environmental conditions) based on data (such as leaf orientation, color, reproductive status, plant health, or any combination thereof).
  • Two-dimensional camera technology may be incorporated in a variety of spectrums. Three-dimensional camera technology may unlock live modeling functionality and additional correlation possibilities.
  • the farm management system may integrate a subset of ERP functionality with the focus being capturing data and utilizing that data to run an agricultural facility with maximum efficiency.
  • Features include but are not limited to: sales management, order management, customer relationship management, task management, standard operating procedures, user training, inventory, cost analysis, planning or any combination thereof.
  • the platform may be tailored specifically to the needs of growers and domain expertise may be incorporated into the platform.
  • the platform may comprise features allowing growers to collect information relevant for food safety compliance, to generate reports for certification, or a combination thereof.
  • Features of the platform may include:
  • IPM: Integrated Pest Management
  • Nutrition certification: May provide data on agricultural product provenance to show that a batch of food meets a certain level of nutrition based on laboratory analysis of batch samples.
  • Sustainability certification: May provide insight to a food consumer into how their food was grown based on data collected by the platform.
  • Data collected that may be provided to a food consumer may include the amount of water used, the amount of energy used, the amount of soil square footage used, other resources used, metadata of how an agricultural product may be grown (for example, open field vs. soil greenhouse vs. hydroponic greenhouse, etc.), or any combination thereof.
  • Supply chain integration: May provide integration with supply chain data using REST and GraphQL APIs. This may be important for food safety and for efficiently executing a food recall.
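As a sketch of the kind of REST integration described above, the following builds a lot-trace query URL for a food recall. The endpoint path (`/v1/supply-chain/trace`) and parameter names are hypothetical, not part of any actual FMS API.

```python
from urllib.parse import urlencode, urljoin

def build_recall_query(base_url: str, lot_id: str, product: str) -> str:
    """Build a REST query URL for tracing a product lot through the
    supply chain. Endpoint path and parameter names are illustrative."""
    params = urlencode({"lot": lot_id, "product": product})
    return urljoin(base_url, "/v1/supply-chain/trace") + "?" + params

url = build_recall_query("https://fms.example.com", "LOT-2020-0412", "tomato")
```

The same query parameters could equally be posted to a GraphQL endpoint; only the URL-building step is shown here.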
  • the platform may provide integration with other ERP products such as NetSuite and Oracle ERP.
  • REST: Representational State Transfer
  • Farm management, growers, and farm workers constitute distinct user groups.
  • the platform may be customized to each user group’s needs but built from a shared centralized data source. Managers and growers may each have access to a powerful customized dashboard and admin interface giving them a complete view of everything happening on the farm. Farm workers may have access to task management tools, barcode scanners, and applications for workstations such as harvesting, seeding, and packaging as well as access to Standard Operating Procedure documentation.
  • the platform may provide a plurality of discrete user interfaces.
  • the platform may provide separate user interfaces for agricultural managers, agricultural growers, and food consumers.
  • the platform may provide at least 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, 30, 40,
  • the platform may provide at least 50 discrete user interfaces.
  • the platform may provide at least 500 discrete user interfaces.
  • the platform may provide at least 5,000 discrete user interfaces.
  • the platform may provide at least 50,000 discrete user interfaces.
  • the platform may provide at least 500,000 discrete user interfaces.
  • a discrete user interface may limit the data a user can access, whether a user can input data into the platform, the type of request a user can enter into the interface, or any combination thereof. For example, data entry may be reserved for agricultural grower or manager interfaces.
  • agricultural growers using their mobile devices to access an application to input data may represent a food safety risk.
  • work stations can be set up ahead of time by following a sterilization standard operating procedure: for example, a harvesting station with a tablet computer and a smart scale can be sterilized and used with gloves. Smart watches can be sterilized before each use and used with sterile gloves.
  • Voice recognition systems may capture agricultural growers' logs, and Bluetooth-enabled scanners can scan QR codes or RFID chips.
  • Plant Recipes may be a format for encoding information about an organism's environmental requirements to achieve a desired phenotype, as well as any instructions or protocols needed to achieve a certain outcome.
  • the information in a recipe can be serialized in JSON, XML, or JSON-LD.
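A Plant Recipe serialized in JSON might look like the following; the field names (`cultivar`, `stages`, `setpoints`) are illustrative assumptions rather than a published schema.

```python
import json

# A minimal Plant Recipe as a Python dict; the schema is hypothetical.
recipe = {
    "cultivar": "cherry tomato",
    "stages": [
        {"name": "seedling", "days": 14,
         "setpoints": {"temp_c": 24.0, "humidity_pct": 70, "photoperiod_h": 16}},
        {"name": "vegetative", "days": 28,
         "setpoints": {"temp_c": 22.0, "humidity_pct": 65, "photoperiod_h": 18}},
    ],
}

encoded = json.dumps(recipe)   # serialize for storage or transport
decoded = json.loads(encoded)  # round-trip back to a dict
```

XML or JSON-LD serializations would carry the same information; JSON is shown because it is the simplest of the three formats named above.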
  • JSON Web Token or “JWT Token” is a JSON-based open standard (RFC 7519) for creating access tokens that assert some number of claims and may include user information including encrypted user information.
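The JWT structure described by RFC 7519 can be sketched with the Python standard library. This minimal HS256 example is for illustration only (production code would use a vetted JWT library), and the claim names and secret are made up.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # base64url encoding without padding, as RFC 7519 requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    """Create a signed HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

token = make_jwt({"sub": "grower-42", "role": "grower"}, b"server-secret")
```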
  • a Plant Recipe may be interpreted by a reactive module that parses a stream of event data (sensors, actuators, etc.) according to rules defined in the Plant Recipe.
  • This reactive module may emit its own stream of event data including messages, alerts, and new actions (such as adjusting an operating parameter by changing a setting on an actuator - turning on a heater or pump).
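A minimal sketch of such a reactive module follows, with an assumed rule format (`sensor`, `min`, `max`, and `actuator` keys) that is not from the source: rules parsed from a Plant Recipe are applied to each sensor event, and actuator actions are emitted.

```python
# Apply Plant Recipe rules to one sensor event, emitting actuator actions.
# Rule and event field names are illustrative assumptions.
def react(event: dict, rules: list) -> list:
    actions = []
    for rule in rules:
        value = event.get(rule["sensor"])
        if value is None:
            continue  # this event carries no reading for the rule's sensor
        if value < rule.get("min", float("-inf")):
            actions.append({"actuator": rule["actuator"], "command": "on"})
        elif value > rule.get("max", float("inf")):
            actions.append({"actuator": rule["actuator"], "command": "off"})
    return actions

rules = [{"sensor": "temp_c", "min": 18.0, "max": 26.0, "actuator": "heater"}]
cold = react({"temp_c": 15.2}, rules)  # below setpoint: turn heater on
ok = react({"temp_c": 22.0}, rules)    # in range: no action emitted
```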
  • Sensor data from successful grows can be used to create new recipes, in essence recording environmental variables for playback later. For example, a particular harvest of tomatoes may have a higher nutrient content than another harvest of tomatoes. Sensor data taken from the environment for the duration of the grow may be encoded into the recipe format to recreate that grow in a different controlled environment or in the next grow. With greater data inflow, recipes may become increasingly targeted and intelligent through the development of machine learning algorithms.
  • a smart recipe may be provided by a user.
  • a smart recipe may be at least in part an output provided by a trained algorithm from a previous grow cycle.
  • a smart recipe may be a combination of an initial recipe provided by a user or database that may be additionally modified or updated by a trained algorithm, such as updated during a grow cycle.
  • a smart recipe may be updated based on a subtype of agricultural product that may be grown.
  • a smart recipe may be updated based on a geographical location that the agricultural product may be grown.
  • a smart recipe may be updated based on a user feedback, such as a request from a food consumer for an agricultural product having a particular set of nutritional elements.
  • An agricultural product may be evaluated or ranked against a second agricultural product for one or more nutritional elements.
  • a nutrition element may include a presence of a mineral, an amount of a mineral, a presence of a vitamin, an amount of a vitamin, an amount of calories, an amount of sugar, an amount of salt, an amount of fat, a type of fat, or any combination thereof.
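Ranking one agricultural product against another over a panel of nutritional elements can be sketched as follows; the element names, values, and the simple additive score are all illustrative assumptions.

```python
# Rank two product batches by a panel of nutritional elements.
# Element names and measured values are hypothetical.
def nutrition_score(product: dict, panel: list) -> float:
    """Sum of the product's values for each element in the panel."""
    return sum(product.get(element, 0.0) for element in panel)

batch_a = {"vitamin_c_mg": 23.0, "iron_mg": 0.5, "sugar_g": 3.9}
batch_b = {"vitamin_c_mg": 18.0, "iron_mg": 0.8, "sugar_g": 2.6}
panel = ["vitamin_c_mg", "iron_mg"]

ranked = sorted([("A", batch_a), ("B", batch_b)],
                key=lambda item: nutrition_score(item[1], panel),
                reverse=True)
```

A real comparison would likely normalize each element and weight it by dietary relevance; the unweighted sum here only illustrates the ranking step.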
  • a trained algorithm may receive data from an array of sensors, determine one or more nutritional elements of the agricultural product, and direct a modification of one or more operating parameters of one or more actuators, a modification to a recipe, or a modification to a raw material to optimize the nutritional element or a panel of nutritional elements as compared to a control agricultural product.
  • machine learning models can be created to help growers improve their operation.
  • the purpose of the FMS may be to collect structured data that can be used to power machine learning services such as:
  • Crop Planning: Sensor data and a database of plant recipes may be used to determine which plants would grow well in a particular growing environment and to give a user a recommendation on how to optimize any environment for a given cultivar.
  • sales data may drive recommendations on what crops should be grown in a given geographical location.
  • Pest detection, prediction, and mitigation: Using data collected from the Integrated Pest Management portion of the FMS ERP alongside historical data, the platform can not only recognize individual pests from images but may also predict pest occurrence based on growing seasons, environmental conditions, other factors, or any combination thereof.
  • Biofeedback may also be incorporated via camera image-based collection, so that the processor can direct adjustment of one or more environmental conditions based on the biofeedback, such as leaf orientation, color, reproductive status, plant health, or any combination thereof.
  • the platform may provide a singular software solution to run a profitable, data-driven modern agricultural facility.
  • a user may have access to control of monitoring, automation, and farm operations, which ultimately may empower the user to run a more profitable and effective farm.
  • plant recipes such as smart recipes
  • SOPs: proven Standard Operating Procedures
  • the platform may be implemented in farming systems of many different types (plant factory, greenhouse, hydroponic, aeroponic).
  • One of the key assets of the platform may be the data stream and database.
  • the platform may collect data (including location, types of grow systems already in place, sales and orders, market data, historical data for a particular environment, or any combination thereof) and the platform structures the data to deliver useful insights and accountability.
  • API and export features disclosed herein may allow access to all of this structured data, allowing users (such as scientists) to use virtually any data analysis tools. Importantly, this may allow the user to link rich datasets to their publications, increasing research transparency and ultimately allowing for better peer review and science.
  • the platform may allow these users to collect structured data from experiments and provide insight into the state of past, present, and future experiments.
  • the platform uses one or more sensors in an n x n array (or grid) or alternative geometric configuration to collect data at n discrete locations over a portion of an agricultural facility, which in some embodiments is digitized using pickup electronics and in some embodiments is connected to a computer for recording and displaying this data. It should be understood, however, that the platform is suitable for measuring data associated with any type of agricultural product.
  • a sensor is configured to sense data associated with, for example, a plant, an animal, an environment, a climate, a parameter of a location associated with an agricultural facility, or any combination thereof.
  • the platform comprises a mobile base unit that may be movable and may house one or more sensors.
  • the platform comprises a mobile base unit, one or more sensors, and one or more actuators.
  • a mobile base unit includes wheels or a track upon which the mobile base unit is moved on a surface.
  • a trained algorithm may provide an output.
  • the output may comprise a yield prediction of at least one agricultural product, a disease prediction of at least one agricultural product, a weed detection in a grow cycle of at least one agricultural product, a crop quality of a grow cycle of at least one agricultural product, a species recognition of at least one agricultural product, an animal welfare rating of at least one agricultural product that is an animal-based product, a water management result of a grow of at least one agricultural product, a livestock production of at least one animal-based product, a soil management of a grow of at least one agricultural product, or any combination thereof.
  • a platform may comprise a machine learning module, such as a trained algorithm.
  • a machine learning module may be trained on one or more training data sets.
  • a machine learning module may be trained on at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 data sets or more.
  • a machine learning module may be trained on from about 50 to about 200 data sets.
  • a machine learning module may be trained on from about 50 to about 1,000 data sets.
  • a machine learning module may be trained on from about 1,000 to about 5,000 data sets.
  • a machine learning module may be trained on from about 5 to about 500 data sets.
  • a machine learning module may generate a training data set from data acquired or extracted from a sensor or user.
  • a machine learning module may be validated with one or more validation data sets.
  • a validation data set may be independent from a training data set.
  • a training data set may comprise data provided by a sensor, data provided by a user, or any combination thereof.
  • a training data set may be stored in a database of the platform.
  • a training data set may be uploaded to the machine learning module from an external source.
  • a training data set may be generated from data acquired from a grow cycle.
  • a training data set may be updated continuously or periodically.
  • a training data set may comprise data from at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40,
  • a training data set may comprise data from about 50 to about 200 different grows.
  • a training data set may comprise data from about 50 to about 1,000 different grows.
  • a training data set may comprise data from about 1,000 to about 5,000 different grows.
  • a training data set may comprise data from about 5 to about 500 different grows.
  • the sensed parameter(s) herein are received as an input by a processor, which outputs a correlation.
  • the correlation herein is received as an input to a machine learning algorithm configured to output guidance or instructions for future grows.
  • the systems, methods, and media described herein may use machine learning algorithms for training prediction models and/or making predictions for a grow.
  • Machine learning algorithms herein may learn from and make predictions on data, such as data obtained from a sensor or user.
  • Data may be any input, intermediate output, previous outputs, or training information, or otherwise any information provided to or by the algorithms.
  • a machine learning algorithm may use a supervised learning approach.
  • the algorithm can generate a function or model from training data.
  • the training data can be labeled.
  • the training data may include metadata associated therewith.
  • Each training example of the training data may be a pair consisting of at least an input object and a desired output value.
  • a supervised learning algorithm may require the individual to determine one or more control parameters. These parameters can be adjusted by optimizing performance on a subset, for example a validation set, of the training data. After parameter adjustment and learning, the performance of the resulting function/model can be measured on a test set that may be separate from the training set. Regression methods can be used in supervised learning approaches.
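The validation-set tuning described above can be sketched as follows, with a smoothing window standing in for the control parameter being adjusted; the data and the candidate parameter values are synthetic.

```python
# Choose a control parameter (here, a moving-average window) by
# minimizing error on a held-out validation set. Data is synthetic.
def moving_average(xs, window):
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

truth = [20.0] * 10                                       # steady 20 degrees C
noisy = [20.0 + (1 if i % 2 else -1) for i in range(10)]  # alternating noise

errors = {w: mse(moving_average(noisy, w), truth) for w in (1, 2, 4)}
best_window = min(errors, key=errors.get)  # parameter with lowest error
```

After a parameter is chosen this way, its final performance would be reported on a separate test set, as the text notes.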
  • a machine learning algorithm may use an unsupervised learning approach.
  • the algorithm may generate a function/model to describe hidden structures from unlabeled data (e.g., a classification or categorization that cannot be directly observed or computed). Since the examples given to the learner are unlabeled, there is no evaluation of the accuracy of the structure that is output by the relevant algorithm.
  • Approaches to unsupervised learning include: clustering, anomaly detection, and neural networks.
  • a machine learning algorithm may use a semi-supervised learning approach.
  • Semi-supervised learning can combine both labeled and unlabeled data to generate an appropriate function or classifier.
  • a machine learning algorithm may use a reinforcement learning approach.
  • the algorithm can learn a policy of how to act given an observation of the world. Every action may have some impact in the environment, and the environment can provide feedback that guides the learning algorithm.
  • a machine learning algorithm may use a transduction approach.
  • Transduction can be similar to supervised learning, but does not explicitly construct a function. Instead, it tries to predict new outputs based on training inputs, training outputs, and new inputs.
  • a machine learning algorithm may use a “learning to learn” approach. In learning to learn, the algorithm can learn its own inductive bias based on previous experience.
  • a machine learning algorithm is applied to grow data to generate a prediction model.
  • a machine learning algorithm or model may be trained periodically.
  • a machine learning algorithm or model may be trained non-periodically.
  • a machine learning algorithm may include learning a function or a model.
  • the mathematical expression of the function or model may or may not be directly computable or observable.
  • For example, a model of the form Y = C0 + C1x1 + C2x2 has two predictor variables, x1 and x2, and coefficients or parameters C0, C1, and C2.
  • the predicted variable in this example is Y. After the parameters of the model are learned, values can be entered for each predictor variable in the model to generate a result for the dependent or predicted variable (e.g., Y).
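A minimal sketch of evaluating a linear model of this form, Y = C0 + C1·x1 + C2·x2, with made-up coefficients and predictor values:

```python
# Evaluate the two-predictor linear model with learned (here, made-up)
# parameters C0, C1, C2.
def predict(c0, c1, c2, x1, x2):
    return c0 + c1 * x1 + c2 * x2

# e.g. yield as a function of temperature (x1) and light hours (x2);
# all numbers are illustrative assumptions.
y = predict(1.5, 0.2, 0.1, x1=22.0, x2=16.0)
```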
  • a machine learning algorithm comprises a supervised or unsupervised learning method such as, for example, support vector machine (SVM), random forests, gradient boosting, logistic regression, decision trees, clustering algorithms, hierarchical clustering, K-means clustering, or principal component analysis.
  • Machine learning algorithms may include linear regression models, logistical regression models, linear discriminant analysis, classification or regression trees, naive Bayes, K-nearest neighbor, learning vector quantization (LVQ), support vector machines (SVM), bagging and random forest, boosting and Adaboost machines, or any combination thereof.
  • Data input into a machine learning algorithm may include data obtained from an individual, data obtained from a practitioner, or a combination thereof.
  • Data input into a machine learning algorithm may include data extracted from a sensor, from a user, or a combination thereof.
  • Data input into a machine learning algorithm may include a product yield, an environmental condition, a pest resilience, a nutrient profile, a farming practice used, or any combination thereof.
  • Data obtained from one or more grows can be analyzed using feature selection techniques, including filter techniques, which may assess the relevance of one or more features by looking at the intrinsic properties of the data; wrapper methods, which may embed a model hypothesis within a feature subset search; and embedded techniques, in which a search for an optimal set of features may be built into a machine learning algorithm.
  • a machine learning algorithm may identify a set of parameters that may provide an optimized grow.
  • a machine learning algorithm may be trained with a training set of samples.
  • the training set of samples may comprise data collected from a grow, from different grows, or from a plurality of grows.
  • a training set of samples may comprise data from a database.
  • a training set of samples may include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70,
  • a training set of samples may comprise a single data type.
  • a training set of samples may include different data types.
  • a training set of samples may comprise a plurality of data types.
  • a training set of samples may comprise at least three data types.
  • a training set of samples may include data obtained from about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70,
  • a training set of samples may include data from a single grow.
  • a training set of samples may include data from different grows.
  • a training set of samples may include data from a plurality of grows.
  • Iterative rounds of training may occur to arrive at a set of features to classify data.
  • Different data types may be ranked differently by the machine learning algorithm.
  • One data type may be ranked higher than a second data type.
  • Weighting or ranking of data types may denote significance of the data type.
  • a higher weighted data type may provide an increased accuracy, sensitivity, or specificity of the classification or prediction of the machine learning algorithm.
  • an input parameter of growing temperature may significantly increase crop yield, more than any other input parameter. In this case, growing temperature may be weighted more heavily than other input parameters in increasing crop yield.
  • the weighting or ranking of features may vary from grow to grow.
  • the weighting or ranking of features may not vary from grow to grow.
  • a machine learning algorithm may be tested with a testing set of samples.
  • the testing set of samples may be different from the training set of samples. At least one sample of the testing set of samples may be different from the training set of samples.
  • the testing set of samples may comprise data collected from before a grow, during a grow, after a grow, from different grows, or from a plurality of grows.
  • a testing set of samples may comprise data from a database.
  • a training set of samples may include different data types - such as one or more input parameters and one or more output parameters.
  • a testing set of samples may include 1, 2, 3, 4,
  • a testing set of samples may comprise a data type.
  • a testing set of samples may include different data types.
  • a testing set of samples may comprise a plurality of data types.
  • a testing set of samples may comprise at least three data types.
  • a testing set of samples may include data obtained from 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000,
  • a testing set of samples may include data from a single grow.
  • a testing set of samples may include data from different grows.
  • a testing set of samples may include data from a plurality of grows.
  • a machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% accuracy.
  • a machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% sensitivity.
  • a machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% specificity.
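These three metrics follow directly from confusion-matrix counts (true/false positives and negatives); the counts below are made up for illustration.

```python
# Accuracy, sensitivity, and specificity from confusion-matrix counts.
def metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy, sensitivity, specificity

# e.g. a pest-infestation classifier evaluated on 200 labeled grows
acc, sens, spec = metrics(tp=90, fp=5, tn=95, fn=10)
```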
  • a machine learning algorithm may classify with 90% accuracy that an agricultural yield will not succumb to pest infestation.
  • a machine learning algorithm may classify a grow as having at least 90% likelihood of producing an agricultural product with superior nutritional profile as compared to a control.
  • a machine learning algorithm may predict at least 95% likelihood of an agricultural yield under a range of growing temperatures.
  • An independent sample may be independent from the training set of samples, the testing set of samples or both.
  • the independent sample may be input into the machine learning algorithm for classification.
  • An independent sample may not have been previously classified by the machine learning algorithm.
  • a classifier may be employed to determine or to predict a set of growing conditions to be executed during a grow.
  • a classifier may provide real-time feedback and guided adjustments of the one or more growing conditions - such as during a grow.
  • One or more growing conditions may be adjusted real-time during a grow.
  • a machine learning algorithm may identify an ‘ideal’ or ‘optimized’ input parameter for each grow.
  • An ‘ideal’ or ‘optimized’ input parameter may remain constant or may change over time.
  • An ‘ideal’ or ‘optimized’ input parameter may be specific or unique for each grow or agricultural product.
  • Feedback from a machine learning algorithm may be continuous such as feedback during a grow, episodic such as at the end of a grow, or roll-back such as cumulative changes over several different grows, or any combination thereof. Feedback from a machine learning algorithm may result in one or more changes in a recipe or operating parameter.
  • a trained algorithm (such as a machine learning software module) as described herein is configured to undergo at least one training phase wherein the trained algorithm is trained to carry out one or more tasks including data extraction, data analysis, and generation of an output or result, such as a recipe for growing an agricultural product with maximal yield or with maximal nutritional benefit.
  • the agricultural platform comprises a training module that trains the trained algorithm.
  • the training module is configured to provide training data to the trained algorithm, said training data comprising, for example, a data set from an agricultural facility or a data set from a previous grow.
  • a trained algorithm is trained using a data set and a target in a manner that might be described as supervised learning.
  • the data set is conventionally divided into a training set, a test set, and, in some cases, a validation set.
  • a target is specified that contains the correct classification of each input value in the data set. For example, a data set from one type of agricultural product is repeatedly presented to the trained algorithm, and for each sample presented during training, the output generated by the trained algorithm is compared with the desired target. The difference between the target and the actual output is calculated, and the trained algorithm is modified to cause the output to more closely approximate the desired target value, such as a maximized yield of the agricultural product.
  • a back-propagation algorithm is utilized to cause the output to more closely approximate the desired target value.
  • the trained algorithm output will closely match the desired target for each sample in the input training set.
  • when new input data not used during training is presented to the trained algorithm, it may generate an output classification value indicating which of the categories the new sample is most likely to fall into.
  • the trained algorithm is said to be able to “generalize” from its training to new, previously unseen input samples. This feature of a trained algorithm allows it to be used to classify almost any input data which has a mathematically formulatable relationship to the category to which it should be assigned.
  • the trained algorithm utilizes an individual learning model.
  • An individual learning model is based on the trained algorithm having trained on data from a single individual and thus, a trained algorithm that utilizes an individual learning model is configured to be used on a single individual on whose data it trained.
  • the trained algorithm utilizes a global training model.
  • a global training model is based on the trained algorithm having trained on data from multiple individuals and thus, a trained algorithm that utilizes a global training model is configured to be used on multiple individuals.
  • the trained algorithm utilizes a simulated training model.
  • a simulated training model is based on the trained algorithm having trained on a data set obtained from a grow of an agricultural product.
  • a trained algorithm that utilizes a simulated training model is configured to be used on multiple grows of an agricultural product.
  • Unsupervised learning is used, in some embodiments, to train a trained algorithm to use input data (such as agricultural product data) and produce an output (such as a maximized yield or a disease detection).
  • Unsupervised learning, in some embodiments, includes feature extraction, which is performed by the trained algorithm on the input data. Extracted features may be used for visualization, for classification, for subsequent supervised training, and more generally for representing the input for subsequent storage or analysis. In some cases, each training case may consist of a plurality of agricultural product data.
  • Trained algorithms that are commonly used for unsupervised training include k-means clustering, mixtures of multinomial distributions, affinity propagation, discrete factor analysis, hidden Markov models, Boltzmann machines, restricted Boltzmann machines, autoencoders, convolutional autoencoders, recurrent neural network autoencoders, and long short-term memory autoencoders. While there are many unsupervised learning models, they all have in common that, for training, they require a training set consisting of a data set of grows of an agricultural product, without associated labels.
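As one example of the unsupervised methods listed above, here is a minimal 1-D k-means sketch applied to unlabeled (synthetic) sensor readings.

```python
# Minimal 1-D k-means: assign each reading to its nearest center, then
# move each center to the mean of its assigned readings, and repeat.
def kmeans_1d(xs, centers, iters=10):
    for _ in range(iters):
        groups = {c: [] for c in centers}
        for x in xs:
            nearest = min(centers, key=lambda c: abs(x - c))
            groups[nearest].append(x)
        # keep a center unchanged if no points were assigned to it
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

# synthetic temperature readings drawn from two regimes (~18 and ~26)
readings = [18.1, 18.4, 18.0, 25.9, 26.2, 26.0]
centers = kmeans_1d(readings, centers=[17.0, 27.0])
```

A production clustering step would use a library implementation and handle multi-dimensional data; this sketch only shows the assign/update loop common to the listed methods.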
  • a trained algorithm may include a training phase and a prediction phase.
  • the training phase is typically provided with data in order to train the machine learning algorithm.
  • types of data inputted into a trained algorithm for the purposes of training include an agricultural product yield, an amount and type of raw materials, an environmental condition during a grow, a length of grow, a nutritional profile of the agricultural product, a soil composition, or any combination thereof.
  • Data that is inputted into the trained algorithm is used, in some embodiments, to construct a hypothesis function to determine the presence of an abnormality.
  • a trained algorithm is configured to determine if the outcome of the hypothesis function was achieved and based on that analysis make a determination with respect to the data upon which the hypothesis function was constructed.
  • the outcome tends to either reinforce the hypothesis function with respect to the data upon which the hypothesis functions was constructed or contradict the hypothesis function with respect to the data upon which the hypothesis function was constructed.
  • the machine learning algorithm will either adopt, adjust, or abandon the hypothesis function with respect to the data upon which the hypothesis function was constructed.
  • the machine learning algorithm described herein dynamically learns through the training phase what characteristics of an input (e.g., data) are most predictive in optimizing a crop yield, minimizing pest infestation or carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
  • a trained algorithm is provided with data on which to train so that it, for example, is able to determine the most salient features of a received agricultural product data to operate on.
  • the trained algorithms described herein train as to how to analyze the agricultural product data, rather than analyzing the agricultural product data using pre-defined instructions.
  • the trained algorithms described herein dynamically learn through training what characteristics of an input signal are most predictive in optimizing a crop yield, minimizing pest infestation or carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
  • the trained algorithm is trained by repeatedly presenting the trained algorithm with agricultural product data across a range of successful and non-successful grows.
  • the trained algorithm may be presented with data from grows having a high yield and data from grows producing no product.
  • the trained algorithm may be presented with data from grows having a high carbon footprint and data from grows having a minimized carbon footprint.
  • a trained algorithm may receive heterogeneous data conveying the range and variability of data that the trained algorithm may encounter in a future grow.
  • Agricultural product data may be generated by computer simulation.
  • training begins when the trained algorithm is given agricultural product data and asked to optimize a crop yield, minimize pest infestation or carbon footprint of a product, maximize profit or nutritional value of a product, or any combination thereof.
  • the predicted output is then compared to the true data that corresponds to the agricultural product data.
  • An optimization technique such as gradient descent with backpropagation is used to update the weights in each layer of the trained algorithm so as to produce closer agreement between the probability predicted by the trained algorithm and the optimized result. This process is repeated with new agricultural product data until the accuracy of the network has reached the desired level.
  • An optimization technique is used to update the weights in each layer of the trained algorithm so as to produce closer agreement between the data predicted by the trained algorithm, and the true data. This process is repeated with new agricultural product data until the accuracy of the network has reached the desired level.
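The weight-update loop described above can be sketched as plain gradient descent on a one-parameter linear model; the data is synthetic, and the learning rate and step count are arbitrary choices.

```python
# Fit y = w * x by gradient descent: repeatedly move the weight w
# against the gradient of the mean squared error.
def fit(xs, ys, lr=0.01, steps=500):
    w = 0.0
    for _ in range(steps):
        # d/dw of mean((w*x - y)^2)
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # true relationship: y = 2x
w = fit(xs, ys)
```

A neural network repeats the same idea per layer, with backpropagation supplying each layer's gradient; only the single-weight case is shown here.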
  • a machine learning algorithm may be trained using a large database of measurements and/or any features or metrics computed from the above data, together with the corresponding ground-truth values.
  • the training phase constructs a transformation function for optimizing a crop yield, minimizing the pest infestation or carbon footprint of a product, maximizing the profit or nutritional value of a product, or any combination thereof.
  • the machine learning algorithm dynamically learns through training what characteristics or features of an input signal are most predictive in optimizing the features of an agricultural product, such as nutritional profile.
  • a prediction phase uses the constructed and optimized transformation function from the training phase to predict the optimization of the grow and product yield.
  • the trained algorithm may be used to maximize, for example, the agricultural product yield on which the system was trained using the prediction phase.
  • the system can predict in an independent grow cycle the optimized product yield.

Data Filtering

  • data that is received by a machine learning algorithm software module from a sensor as an input may comprise agricultural product data that has been filtered and or modified.
  • filtering comprises a removal of noise or artifact from sensed data, such as noise perturbations or temperature fluctuations caused by a grower entering or exiting a facility.
  • Artifact or noise may comprise, for example, ambient signals that are sensed together with data sensed in an agricultural facility.
  • sensed agricultural product data is filtered prior to and/or after transmission of said data to a processor.
  • Filtering of sensed agricultural product data may, for example, comprise the removal of ambient signal noise from a sensed agricultural product data.
  • Signal noise may comprise, for example, ambient agricultural product data generated by electronic devices, electrical grids, or other devices.
  • sensed agricultural product data is converted to another form of data or signal which then undergoes a signal filtering process.
  • a device or system includes a processor including software that is configured to convert sensed agricultural product data to another form of data or signal.
  • the process of converting sensed agricultural product data to another form of data or signal typically comprises an encoding process, wherein a first form of data is converted into a second form of data or signal. Once filtered, the filtered data may be transmitted to a machine learning algorithm for analysis.
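A minimal sketch of such a filtering step, assuming a simple moving-average filter and a synthetic temperature trace; the specific values, window size, and artifact shape are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

# Synthetic sensed temperature trace: a slow ambient variation plus a
# brief spike representing a grower entering/exiting the facility.
t = np.linspace(0.0, 1.0, 200)
clean = 22.0 + 0.5 * np.sin(2 * np.pi * t)   # slow ambient variation
sensed = clean.copy()
sensed[100:105] += 4.0   # transient artifact from the door opening

# Moving-average filter: pad with edge values so the output keeps the
# same length, then average each sample with its neighbors.
window = 15
padded = np.pad(sensed, window // 2, mode="edge")
filtered = np.convolve(padded, np.ones(window) / window, mode="valid")

# The artifact's deviation from the clean signal shrinks substantially.
print(np.max(np.abs(sensed - clean)))          # 4.0 before filtering
print(np.max(np.abs(filtered - clean)) < 2.0)  # True after filtering
```

A moving average is only one choice; a median filter or band-stop filter could equally serve where the noise has a known, narrow character.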
  • FIG. 1 shows an embodiment of a system such as used in an agricultural platform as described herein, comprising a digital processing device 101.
  • the digital processing device 101 includes a software application configured for agriculture management. Alternatively or in combination, the digital processing device 101 is configured to generate a trained algorithm (e.g., machine learning algorithm) such as by training the algorithm with a training data set.
  • the digital processing device 101 may include a central processing unit (CPU, also “processor” and “computer processor” herein) 105, which can be a single core or multi-core processor, or a plurality of processors for parallel processing.
  • the digital processing device 101 also includes memory or a memory location 110 (e.g., random-access memory, read-only memory, flash memory), an electronic storage unit 115 (e.g., hard disk), a communication interface 120 (e.g., network adapter, network interface) for communicating with one or more other systems, and peripheral devices, such as cache.
  • the peripheral devices can include storage device(s) or storage medium 165 which communicate with the rest of the device via a storage interface 170.
  • the memory 110, storage unit 115, interface 120 and peripheral devices are configured to communicate with the CPU 105 through a communication bus 125, such as a motherboard.
  • the digital processing device 101 can be operatively coupled to a computer network (“network”) 130 with the aid of the communication interface 120.
  • the network 130 can comprise the Internet and/or a local area network (LAN).
  • the network 130 can be a telecommunication and/or data network.
  • the digital processing device 101 includes input device(s) 145 to receive information from a user, the input device(s) in communication with other elements of the device via an input interface 150.
  • the input device(s) includes a remote device such as a smartphone or tablet that is configured to communicate remotely with the digital processing device 101.
  • a remote device such as a smartphone or tablet that is configured to communicate remotely with the digital processing device 101.
  • a user may use a smartphone application to access sensor data, current actuator instructions, the smart recipe, or other information stored on the digital processing device 101.
  • the digital processing device 101 can include output device(s) 155 that communicates to other elements of the device via an output interface 160.
  • the CPU 105 is configured to execute machine-readable instructions embodied in a software application or module.
  • the instructions may be stored in a memory location, such as the memory 110.
  • the memory 110 may include various components (e.g., machine-readable media) including, but not limited to, a random-access memory component (e.g., RAM) (e.g., a static RAM “SRAM”, a dynamic RAM “DRAM”, etc.), or a read-only component (e.g., ROM).
  • the memory 110 can also include a basic input/output system (BIOS), including basic routines that help to transfer information between elements within the digital processing device, such as during device start-up.
  • the storage unit 115 can be configured to store files, such as sensor data, smart recipe(s), etc.
  • the storage unit 115 can also be used to store an operating system, application programs, and the like.
  • storage unit 115 may be removably interfaced with the digital processing device (e.g., via an external port connector (not shown)) and/or via a storage unit interface.
  • Software may reside, completely or partially, within a computer-readable storage medium within or outside of the storage unit 115.
  • software such as the software application and/or module(s) may reside, completely or partially, within processor(s) 105.
  • Information and data can be displayed to a user through a display 135.
  • the display is connected to the bus 125 via an interface 140, and transport of data between the display and other elements of the device 101 can be controlled via the interface 140.
  • Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the digital processing device 101, such as, for example, on the memory 110 or electronic storage unit 115.
  • the machine executable or machine readable code can be provided in the form of a software application or software module.
  • the code can be executed by the processor 105.
  • the code can be retrieved from the storage unit 115 and stored on the memory 110 for ready access by the processor 105.
  • the electronic storage unit 115 can be precluded, and machine-executable instructions are stored on memory 110.
  • one or more remote devices 102 are configured to communicate with and/or receive instructions from the digital processing device 101, and may comprise any sensor, actuator, or camera as described herein.
  • the remote device 102 is a temperature sensor that is configured to gather temperature data and send the data to the digital processing device 101 for analysis according to a smart recipe.
  • the sensor can provide information such as sensor data, type of data, sensor ID, sensor location, metadata, or other data.
  • the remote device 102 is an actuator configured to perform one or more actions based on instructions received from the digital processing device 101.
  • the remote device is a camera configured to provide a camera feed or imaging data to the digital processing device 101. The camera may be configured to receive and respond to instructions to perform an action such as, for example, turning on/off, rotating or moving, and/or zooming in or out.
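As a concrete illustration of the kind of message a remote device (102) might send to the digital processing device (101), the sketch below serializes one sensor reading carrying the information named above (sensor data, type, ID, location, metadata). Every field name and value is a hypothetical example, not part of the disclosure.

```python
import json
from dataclasses import dataclass, asdict, field

# Hypothetical shape of one reading from a remote sensor; the field
# names are illustrative only, not a defined wire format.
@dataclass
class SensorReading:
    sensor_id: str
    sensor_type: str
    location: str
    value: float
    metadata: dict = field(default_factory=dict)

reading = SensorReading(
    sensor_id="temp-07",
    sensor_type="temperature",
    location="greenhouse-3/row-2",
    value=21.4,
    metadata={"units": "C"},
)

# Serialize for transmission over the network (130) to the processor,
# which can then analyze the data according to a smart recipe.
payload = json.dumps(asdict(reading))
print(payload)
```

On the processor side, `json.loads(payload)` recovers the same fields for analysis or for routing to a trained algorithm.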
  • Embodiment 1 A platform for agriculture management, the platform comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield.
  • Embodiment 2 The platform of embodiment 1, wherein the agricultural product comprises an animal-based product.
  • Embodiment 3 The platform of any one of embodiments 1-2, wherein the agricultural product comprises a plant-based product.
  • Embodiment 4 The platform of any one of embodiments 1-3, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • Embodiment 5 The platform of any one of embodiments 1-4, wherein the at least five agricultural products comprise different agricultural products.
  • Embodiment 6 The platform of any one of embodiments 1-5, wherein the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • Embodiment 7 The platform of any one of embodiments 1-6, wherein the array of sensors comprise at least 5 different types of sensors.
  • Embodiment 8 The platform of any one of embodiments 1-7, wherein the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • Embodiment 9 The platform of any one of embodiments 1-8, wherein the location comprises two or more locations.
  • Embodiment 10 The platform of any one of embodiments 1-9, wherein the location is remote.
  • Embodiment 11 The platform of any one of embodiments 1-10, wherein the location is within the agricultural facility.
  • Embodiment 12 The platform of any one of embodiments 1-11, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • Embodiment 13 The platform of any one of embodiments 1-12, wherein the data comprises data collected from previous grows of the agricultural product.
  • Embodiment 14 A platform for agriculture management, the platform comprising: (a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and (c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield.
  • Embodiment 15 The platform of embodiment 14, wherein the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • Embodiment 16 The platform of any one of embodiments 14-15, wherein the array of sensors comprise at least 5 different types of sensors.
  • Embodiment 17 The platform of any one of embodiments 14-16, wherein the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof.
  • Embodiment 18 The platform of any one of embodiments 14-17, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • Embodiment 19 The platform of any one of embodiments 14-18, wherein the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • Embodiment 20 The platform of any one of embodiments 14-19, wherein the processor directs a change in at least three operating parameters based on receipt of the data.
  • Embodiment 21 The platform of any one of embodiments 14-20, wherein the location comprises two or more locations.
  • Embodiment 22 The platform of any one of embodiments 14-21, wherein the location is remote.
  • Embodiment 23 The platform of any one of embodiments 14-22, wherein the location is within the agricultural facility.
  • Embodiment 24 The platform of any one of embodiments 14-23, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • Embodiment 25 The platform of any one of embodiments 14-24, wherein the data comprises data collected from previous grows of the agricultural product.
  • Embodiment 26 A platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and (c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof.
  • Embodiment 27 The platform of embodiment 26, further comprising a database, wherein the database comprises a data set received from a plurality of agricultural facilities.
  • Embodiment 28 The platform of any one of embodiments 26-27, wherein the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result.
  • Embodiment 29 The platform of any one of embodiments 26-28, wherein the plurality of agricultural facilities is at least 5.
  • Embodiment 30 The platform of any one of embodiments 26-29, wherein the plurality of agricultural facilities are located in different geographical locations.
  • Embodiment 31 The platform of any one of embodiments 26-30, wherein the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof.
  • Embodiment 32 The platform of any one of embodiments 26-31, wherein the user is a food consumer.
  • Embodiment 33 The platform of any one of embodiments 26-32, wherein the user is a business entity that sells the agricultural product to a consumer.
  • Embodiment 34 The platform of any one of embodiments 26-33, wherein the location comprises two or more locations.
  • Embodiment 35 The platform of any one of embodiments 26-34, wherein the location is remote.
  • Embodiment 36 The platform of any one of embodiments 26-35, wherein the location is within the agricultural facility.
  • Embodiment 37 The platform of any one of embodiments 26-36, wherein the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • Embodiment 38 The platform of any one of embodiments 26-37, wherein the plurality of sensors comprise at least 5 different types of sensors.
  • Embodiment 39 The platform of any one of embodiments 26-38, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • Embodiment 40 The platform of any one of embodiments 26-39, wherein the agricultural product comprises an animal-based product.
  • Embodiment 41 The platform of any one of embodiments 26-40, wherein the agricultural product comprises a plant-based product.
  • Embodiment 42 The platform of any one of embodiments 26-41, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • Embodiment 43 The platform of any one of embodiments 26-42, wherein the data comprises data collected from previous grows of the agricultural product.
  • Embodiment 44 A platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility; (c) a processor configured to receive data from at least one sensor of the plurality of sensors; and (d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer.
  • Embodiment 45 The platform of embodiment 44, wherein the plurality of discrete user interfaces is at least three.
  • Embodiment 46 The platform of any one of embodiments 44-45, wherein a second user interface is configured for an agricultural grower.
  • Embodiment 47 The platform of any one of embodiments 44-46, wherein a second user interface is configured for an agricultural manager.
  • a food consumer values sourcing animal-based products from farms that operate with high animal welfare standards.
  • the food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more animal welfare standards reported by one or more farms that supply the animal-based product that the food consumer is considering purchasing. Based on the food consumer’s review of the animal welfare standards reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an animal-based product.
  • a food consumer values sourcing agricultural-based products from farms that operate with high water conservation and soil preservation standards. The food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more water conservation or soil preservation standards reported by one or more farms that supply the agricultural-based product that the food consumer is considering purchasing. Based on the food consumer’s review of the water conservation or soil preservation standards reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an agricultural-based product.
  • a food consumer values sourcing food products that contain high nutritional content.
  • the food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more nutritional profiles (comprising one or more nutrition elements) of an agricultural product reported by one or more farms that supply the agricultural product that the food consumer is considering purchasing. Based on the food consumer’s review of the nutritional profiles reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an agricultural product.
  • Example 4 A processor of an agriculture management platform initiates a grow of an agricultural product based on a recipe. During the grow cycle, a drought initiates in the location of the grow, significantly reducing the amount of rainfall received by the agricultural product. Data from one or more sensors will be provided to the processor - data that will be related to the significant reduction of rainfall. The processor will then direct a change in one or more actuators during the grow cycle to increase the amount of water provided to the agricultural product.
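The drought scenario of Example 4 amounts to a simple sensor-to-actuator control step, sketched below. The setpoint, gain, and function names are hypothetical choices for illustration; the disclosure does not prescribe a particular control law.

```python
# Illustrative control step: if sensed soil moisture falls below the
# recipe's setpoint, the processor raises the irrigation actuator's
# output. All thresholds and units here are hypothetical.
RECIPE_MOISTURE_SETPOINT = 0.35   # target volumetric water content

def adjust_irrigation(sensed_moisture: float, current_rate: float) -> float:
    """Return an updated irrigation rate (liters/hour) for one zone."""
    deficit = RECIPE_MOISTURE_SETPOINT - sensed_moisture
    if deficit <= 0:
        return current_rate            # recipe satisfied; no change
    # Proportional correction: larger deficits get larger increases.
    return current_rate + 10.0 * deficit

# During a drought, sensed moisture drops well below the setpoint,
# so the directed irrigation rate increases.
print(adjust_irrigation(sensed_moisture=0.15, current_rate=2.0))  # 4.0
```

The same structure inverts for Example 5: there the processor would reduce, rather than increase, moisture within the affected sub-location.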
  • a processor of an agriculture management platform initiates a grow of an agricultural product based on a recipe.
  • a moisture-based pest infestation initiates in the particular sub-location of the grow, significantly reducing the amount of product yield in that sub-location.
  • Data from one or more sensors will be provided to the processor - data that will be related to the significant reduction of product yield in that sub-location.
  • the processor will then direct a change in one or more actuators within the sub-location (a subset of actuators in the array) during the grow cycle to reduce a moisture content within the sub-location to eradicate or reduce damage by the moisture-based pest infestation to the agricultural product.
  • Moisture data collected from the sub-location during the grow cycle will be incorporated into a trained algorithm of the processor to inform future grows of the agricultural product or to minimize risk of moisture-based pest infestations in future grows of the agricultural product.
  • An agricultural management platform will have three distinct user portals.
  • a first user portal will be configured for a food consumer.
  • the first user portal for the food consumer will permit access to nutritional information of the agricultural product, a geographical location of a grow, a farming practice (such as organic grow, non-GMO grow, hormone-free grow, antibiotic- free grow, animal welfare standards, wild or farm raised, caged or open access or free range) of an agricultural product, or any combination thereof.
  • the first user portal will permit a food consumer to provide a feedback to a farm or to another food consumer, a rating of an agricultural product, a question to a farm or to another food consumer, or any combination thereof.
  • a second user portal will be configured for an agricultural grower.
  • the second user portal will permit the agricultural grower to input, review or modify one or more operating parameters, outputs, data, recipes, or any combination thereof.
  • a third user portal will be configured for an agricultural manager.
  • the third user portal will permit the agricultural manager to input, review, or modify one or more operating parameters, outputs, data, recipes, or any combination thereof.
  • the agricultural manager will communicate with the agricultural grower via the individual portals by providing feedback, comments, questions or any combination thereof.
  • a food consumer, an agricultural manager, or agricultural grower will communicate with each other via the user portals.


Abstract

Described herein are platforms, methods, software, systems and devices for agricultural product management. In some embodiments, a platform as described herein includes a plurality of sensors, such as an array of sensors. In some embodiments, a platform as described herein includes a plurality of actuators, such as an array of actuators. A processor may receive data from at least one sensor and may direct a change in an operating parameter of at least one actuator. In some embodiments, a platform as described herein includes a database that comprises a data set, such as data set including data from a plurality of agricultural facilities.

Description

AGRICULTURAL PLATFORMS
CROSS-REFERENCE
[0001] This application claims the benefit of U.S. Provisional Patent Application No. 62/910,346, filed October 3, 2019, which is entirely incorporated herein by reference.
SUMMARY
[0002] Agriculture management is an important function that oversees all aspects of running farms and other growing facilities that produce agricultural products. Agriculture management also enables farmers and landowners to address profitability, fertility, and conservation. These types of management functions are essential to a successful farm business and to ensure sufficient and nutrient-rich food for a population of food consumers.
[0003] Disclosed herein is a platform for agriculture management, the platform comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield. In some cases, the agricultural product comprises an animal-based product. In some cases, the agricultural product comprises a plant-based product. In some cases, the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof. In some cases, the at least five agricultural products comprise different agricultural products. In some cases, the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. In some cases, the array of sensors comprise at least 5 different types of sensors. In some cases, the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof. In some cases, the location comprises two or more locations. In some cases, the location is remote. In some cases, the location is within the agricultural facility. In some cases, the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
In some cases, the data comprises data collected from previous grows of the agricultural product. [0004] Disclosed herein is a platform for agriculture management, the platform comprising: (a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and (c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield. In some cases, the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. In some cases, the array of sensors comprise at least 5 different types of sensors. In some cases, the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof. In some cases, the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof. In some cases, the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof. In some cases, the processor directs a change in at least three operating parameters based on receipt of the data. In some cases, the location comprises two or more locations. In some cases, the location is remote. In some cases, the location is within the agricultural facility. In some cases, the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof. 
In some cases, the data comprises data collected from previous grows of the agricultural product.
[0005] Disclosed herein is a platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and (c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof. In some cases, the platform further comprises a database, wherein the database comprises a data set received from a plurality of agricultural facilities. In some cases, the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result. In some cases, the plurality of agricultural facilities is at least 5. In some cases, the plurality of agricultural facilities are located in different geographical locations. In some cases, the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof. In some cases, the user is a food consumer. In some cases, the user is a business entity that sells the agricultural product to a consumer. In some cases, the location comprises two or more locations. In some cases, the location is remote. In some cases, the location is within the agricultural facility. In some cases, the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. 
In some cases, the plurality of sensors comprise at least 5 different types of sensors. In some cases, the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof. In some cases, the agricultural product comprises an animal-based product. In some cases, the agricultural product comprises a plant-based product. In some cases, the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof. In some cases, the data comprises data collected from previous grows of the agricultural product.
[0006] Disclosed herein is a platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility; (c) a processor configured to receive data from at least one sensor of the plurality of sensors; and (d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer. In some cases, the plurality of discrete user interfaces is at least three. In some cases, a second user interface is configured for an agricultural grower. In some cases, a second user interface is configured for an agricultural manager.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also “figure” and “FIG.” herein), of which:
[0008] FIG. 1 shows an embodiment of a system such as used in an agricultural platform as described herein.
DETAILED DESCRIPTION
[0009] Described herein is a platform that includes a processor and user interface for management of production of an agricultural product. In some embodiments, a platform as described herein includes a plurality of sensors, such as a plurality of individual sensors and/or one or more arrays of sensors. In some embodiments, a platform as described herein includes a plurality of actuators, such as a plurality of individual actuators and/or one or more arrays of actuators. A processor may receive data from at least one sensor and may direct a change in an operating parameter of at least one actuator. In some embodiments, a platform as described herein includes a database that comprises a data set. The data set may comprise data from a plurality of agricultural facilities. The processor may include a trained algorithm. The trained algorithm may be trained with a data set. The trained algorithm may be trained to compare data input to the processor to produce a result. The result may be prompted by a request from a user.
[0010] In various embodiments, the platforms, systems, media, and methods described herein include a cloud computing environment. In some embodiments, a cloud computing environment comprises one or more computing processors.
[0011] While various embodiments are shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. It should be understood that, in some embodiments, various alternatives to the embodiments described herein are employed.
Definitions
[0012] Unless otherwise defined, all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Any reference to “or” herein is intended to encompass “and/or” unless otherwise stated.
[0013] As used herein, the phrases “at least one,” “one or more,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
[0014] As used herein, the term “about” may mean the referenced numeric indication plus or minus 15% of that referenced numeric indication.
[0015] As used herein, the term “user” may mean a food consumer, an agricultural grower, an agricultural manager, a business entity in the food consumer industry or any combination thereof. A user may be a person that produces or assists in at least one aspect of producing the agricultural product. A user may be a farmer, a planter, a breeder, a stockman, an agriculturist, an agronomist, a rancher, a producer, a cropper, a harvester, a gardener, an orchardist, a horticulturist, a hydroponist, a pomologist, a viticulturist, or any combination thereof. A user may be a person in the farm business or agriculture business. A user may be an agricultural manager that oversees an aspect of the business. A user may be a CEO, CSO, or CFO of an agriculture facility. A user may be a person that purchases an agricultural product from a farmer or a food producer. A user may be a person that sells the agricultural product to a consumer. A user may be a consumer, a person who eats the agricultural product or who buys the agricultural product.
[0016] As used herein, the term “agricultural product” may mean a product produced for consumption by a person or animal. An agricultural product may include a plant or portion thereof, an animal or portion thereof, an animal product, or any combination thereof. An agricultural product may include one or more crops, a food, a nutrient, a consumable, a livestock, a plant, an animal, an animal product (such as dairy, milk, eggs, cheese), a plant product, or any combination thereof.
[0017] As used herein, the term “agricultural facility” may mean a facility for producing one or more types of an agricultural product. An agricultural facility may include a rural or urban facility or both. An agricultural facility may include an outdoor or indoor facility or both. An agricultural facility may include multiple different geographical locations or a singular geographical location. An agricultural facility may include a farm, a dairy farm, a livestock farm, a crop farm, a commercial farm, a fish farm, a meat farm, a poultry farm, a greenhouse, an orchard, a hydroponic farm, an urban farm, or any combination thereof. An agricultural facility may utilize natural growing elements such as sunlight, soil, and weather conditions of a geographical outdoor location. An agricultural facility may utilize artificial elements such as artificial light, artificial soil, artificial heat, or combinations thereof. An agricultural facility may utilize direct sun, indirect sun (such as from solar panels), or artificial light to grow crops.
[0018] As used herein, the term “farming practice” may mean a practice performed by one or more farms. A farming practice may include growing an agricultural product under organic food standards or not. A farming practice may include growing an agricultural product under non-GMO standards or not. A farming practice may include growing an agricultural product under hormone-free conditions or not. A farming practice may include growing an agricultural product under antibiotic-free conditions or not. A farming practice may include growing an agricultural product under environmentally sustainable practices or not. A farming practice may include growing an agricultural product under a reduced carbon footprint standard or not. A farming practice may include growing an agricultural product under fair-trade standards or not. A farming practice may include growing an agricultural product under a particular level of animal welfare standards or not. A farming practice may include growing an agricultural product under farm-raised conditions or raised in the wild. A farming practice may include growing an agricultural product under indoor conditions, open access conditions, or free range conditions. A farming practice may include growing an animal-based product on a grass-fed diet or not. A farming practice may include any of the foregoing examples or any combination thereof.
[0019] As used herein, the term “recipe” may mean a collection of parameters used in planting, growing, and/or maintaining an agricultural product. Non-limiting examples of parameters that might be found in a recipe include geographical location of the agricultural product, elevation of the agricultural product, environmental temperature, environmental humidity, environmental air quality, frequency of watering, amount of watering, frequency of application of fertilizer, amount of fertilizer used, frequency of pesticide applied, amount of pesticide used, soil composition, soil pH, agricultural product seed characteristics, and timing with respect to any actions carried out with respect to an agricultural product and/or seed of an agricultural product.
[0020] As used herein, the term “nutritional profile” may mean an agricultural product having one or more nutritional attributes. One or more nutritional attributes of an agricultural product may be quantified and communicated to a user, such as an agricultural manager, agricultural grower, food consumer, or any combination thereof. The platforms as described herein may execute a recipe to grow an agricultural product, the recipe having been optimized to maximize one or more nutritional attributes. A nutritional profile of an agricultural product may be compared across one or more farms growing the agricultural product. A nutritional attribute may include a taste or flavor, a color, a texture, a ripeness, a freshness, a vitamin content or amount, a mineral content or amount, a fat content or amount, a sugar content or amount, a carbohydrate content or amount, a pesticide content or amount, an anti-oxidant content or amount, or any combination thereof.
[0021] In general, the term “software” as used herein comprises computer readable and executable instructions that may be executed by a computer processor. In some embodiments a “software module” comprises computer readable and executable instructions and may, in some embodiments described herein make up a portion of software or may in some embodiments be a stand-alone item. In various embodiments, software and/or a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
Sensors
[0022] A sensor may collect data. A sensor may collect a single type of data. A sensor may collect more than one type of data. A data type that one or more sensors may collect may include: an image (such as an image of an agricultural product or portion thereof or an image within the agricultural facility), a temperature, a humidity, a pH, a light level, a light spectrum, an air flow, an air circulation level, an air composition, an air exchange rate, a fertilizer content, a fertilizer concentration, a nutrient content, a nutrient concentration or any combination thereof. A sensor may collect data related to a climate or microclimate within an agricultural facility. A sensor may collect data on a disease type or disease level. A sensor may collect data on an agricultural product yield, size of product, amount of product, rate of growth, or any combination thereof. A sensor may collect data on amount of resources utilized or rate of resources utilized, such as water, fertilizer, soil, nutrients, sun light, heat, or any combination thereof. A sensor may collect data automatically. Automated data collection may include continuous collection, discrete collection (such as when a given threshold may be reached), incremental collection (such as collection at timed intervals), or any combination thereof. A sensor may collect data when a user (grower or farm manager) prompts the sensor. A sensor may be a user (such as a grower). In some embodiments, a user may provide a sensory assessment of an agricultural product and may input the data into the processor, such as a user-based estimate of product number.
[0023] A sensor may be a camera, such as a digital camera. A camera may be a camera on a smart phone (such as an iPhone) or on a smart watch (such as an iWatch). A sensor may be a pH sensor, a temperature sensor, a light sensor, a humidity sensor, an air sensor, a turbidity sensor, a chemical sensor, or any combination thereof.
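The sensor readings described above might be organized in software as a small grid of individually addressable sensors. The sketch below is illustrative only; the class and method names are assumptions, not part of the platform described in this disclosure:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_type: str   # e.g. "temperature", "pH", "humidity"
    value: float

class SensorGrid:
    """A minimal n x m grid of individually addressable sensors."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self._readings = {}  # (row, col) -> latest SensorReading

    def record(self, row, col, reading):
        # Each sensor is addressed by its grid coordinates
        if not (0 <= row < self.rows and 0 <= col < self.cols):
            raise IndexError("sensor address out of range")
        self._readings[(row, col)] = reading

    def read(self, row, col):
        return self._readings.get((row, col))

    def subset(self, sensor_type):
        """Address only the sensors reporting a given data type."""
        return {addr: r for addr, r in self._readings.items()
                if r.sensor_type == sensor_type}

# A 2x3 grid mixing temperature and pH sensors
grid = SensorGrid(2, 3)
grid.record(0, 0, SensorReading("temperature", 74.2))
grid.record(0, 1, SensorReading("pH", 6.1))
grid.record(1, 2, SensorReading("temperature", 71.8))
temps = grid.subset("temperature")
```

Addressing by grid coordinates is what lets a processor act on one sensor, or one subset of sensors, without touching the rest of the array.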
[0024] An array of sensors may include nxn sensors, such as 1x1 sensors, 2x2 sensors, 3x3 sensors or more. An array of sensors may include nxm sensors, such as 1x3 sensors, 2x6 sensors, 3x9 sensors or more. An array of sensors may include a plurality of sensors. A sensor in an array of sensors may be individually addressable by a user or by the processor. For example, a subset of sensors may collect data based on a given parameter - such as time or temperature.
Actuators
[0025] Adjusting an actuator may adjust one or more operating parameters of the agricultural facility. For example, adjusting a vent may adjust the temperature at one or more locations in the agricultural facility. An array of sensors in communication with an array of actuators may facilitate adjustments of the temperature in discrete locations of the agricultural facility that may be insufficient compared to the remaining locations that are sufficiently heated. An adjustment in an actuator may be determined by the processor and based on data received from one or more sensors, users, or a combination thereof.
[0026] An actuator may include a vent, a shutter, a louver, a sprayer, a pump, a valve, a mixer, a heater, a fan, a light, a humidifier, a dehumidifier, an air exchanger, a gas pump, a water source, a wind source, a food source, a fertilizer source, a pest control source, an evaporative cooler, a gas generator (such as a CO2 generator) or any combination thereof.
[0027] An array of actuators may include nxn actuators, such as 1x1 actuators, 2x2 actuators, 3x3 actuators or more. An array of actuators may include nxm actuators, such as 1x3 actuators, 2x6 actuators, 3x9 actuators or more. An array of actuators may include a plurality of actuators. An actuator in an array of actuators may be individually addressable by a user or by the processor. For example, a subset of actuators may be adjusted based on a given parameter - such as time or temperature.
Operating Parameters
[0028] An operating parameter may include at least in part an ingredient or raw material needed to grow an agricultural product. An operating parameter may include a seed composition to grow a particular type of agricultural product, a water amount to provide to the agricultural product, a light amount or light spectrum to provide to the agricultural product, a soil composition to provide to the agricultural product, or any combination thereof. For animal products, an operating parameter may include a feed type, feed amount, feed frequency, water type, water amount, water frequency, or any combination thereof for optimal growth, health, or nutrition of an animal product.
[0029] An operating parameter may include an amount or type of raw ingredient. An operating parameter may include an environmental condition to provide a nutritionally optimized product, a maximized product yield, a shortened growth cycle to achieve the agricultural product, a product yield that reduces an amount of raw ingredients needed, or any combination thereof. An operating parameter may include an acceptable range, such as a growing temperature from about 65 degrees Fahrenheit to about 90 degrees Fahrenheit. An operating parameter may include a suggested starting value, such as a growing temperature of about 75 degrees Fahrenheit, a value that may be updated or modified during the course of a grow cycle. An operating parameter may be modified by a user or by a control system. An operating parameter may be modified based at least in part on data received from one or more sensors.
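An operating parameter with an acceptable range and a suggested starting value might be enforced in software as follows. This is a minimal sketch using the 65-90 degrees Fahrenheit example from the text; the function name is an assumption:

```python
def clamp_parameter(value, low, high):
    """Keep an operating parameter inside its acceptable range.

    Any proposed update (from a user or a control system) that falls
    outside [low, high] is pulled back to the nearest boundary.
    """
    return max(low, min(high, value))

# Suggested starting value of 75 F inside the 65-90 F acceptable range
setpoint = clamp_parameter(75.0, 65.0, 90.0)  # within range, unchanged
# A mid-cycle update that exceeds the range is clamped to the boundary
too_hot = clamp_parameter(95.0, 65.0, 90.0)
```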
Optimized Output
[0030] A platform may utilize a processor comprising a trained algorithm. The trained algorithm may be trained on one or more agricultural products. The trained algorithm may be trained to identify features in the data received from the one or more sensors and, based on the data received, determine predicted outputs such as a yield prediction. Further, the trained algorithm may be trained to identify features in the data received from the one or more sensors and suggest changes in operating parameters of one or more actuators to enhance or optimize an output, such as yield production.
[0031] A trained algorithm may be trained to maximize yield production, minimize disease, minimize weed growth, maximize agricultural nutritional content, or promote animal welfare, water conservancy, soil conservancy, or any combination thereof.
[0032] A maximized yield production may be the most agricultural product produced in the smallest square footage of agricultural facility. A maximized yield production may be the most agricultural product produced for the least amount of seed, water, or resource input.
Platform
[0033] A Farm Management System may be a complete software solution for managing an agricultural facility, such as a farm. The platform may integrate task management, enterprise resource planning (ERP) solutions, crop planning, computer vision, supply chain management, as well as sensors and automation into a singular powerful platform: the one application an agricultural manager may need to manage a data driven 21st century farm.
[0034] Collecting multiple functionalities into one platform may allow a user to find correlations amongst the data and take action or deliver useful insights with machine learning. This approach may give growers the feedback they need to maximize output and minimize inputs, all while optimizing for growing environmental conditions, nutritional value, yields, food safety, or any combination thereof.
[0035] By creating a cloud-based application a user may be able to collect, maintain, and compare data from multiple agricultural facilities at similar or different geographical locations. This ability can deliver the end consumer more insight into how one agricultural facility compares to another agricultural facility in practices, sustainability, food safety, nutrition, or any combination thereof.
[0036] As there are many elements to the holistic farm management system as described herein, this summary covers the modules that may comprise the farm management system individually, as well as how they can be brought together to enable new functionality.
Sensors, Actuators, and Camera Arrays
[0037] A series of sensors may be distributed at carefully chosen locations throughout the agricultural facility, collecting ongoing data on temperature, humidity, fertilizer content, fertilizer concentration, nutrient content, nutrient concentration, pH, light levels, light spectrums, air flow, air circulation, or any combination thereof.
[0038] Sensor arrays may open visibility into patterns in environmental variables such as temperature. For example, given a greenhouse with a wet wall for cooling at one end of it, there are differences in both temperature and humidity from the near side to the far side. Sensors placed in a grid system can identify, monitor and measure the various microclimates in the growing environments. This monitoring of microclimates in real-time can be input into a processor of the platform and the processor can direct an adjustment in an operating parameter of one or more actuators to correct an undesirable microclimate in a specific location in the agricultural facility. Real-time monitoring and adjustments to correct for undesirable microclimates can maximize agricultural product yield.
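The grid-based microclimate monitoring described above can be sketched in a few lines. The tolerance threshold and the sample readings below are illustrative assumptions, not values from the platform:

```python
def find_microclimates(temps, tolerance=2.0):
    """Flag grid cells whose temperature deviates from the grid mean.

    `temps` maps (row, col) grid addresses to degrees Fahrenheit;
    cells deviating by more than `tolerance` mark a microclimate
    that an actuator adjustment could then correct.
    """
    mean = sum(temps.values()) / len(temps)
    return {addr: t for addr, t in temps.items()
            if abs(t - mean) > tolerance}

# Greenhouse with a wet wall on one side: the far corner runs hot
readings = {(0, 0): 72.0, (0, 1): 72.5, (1, 0): 71.5, (1, 1): 78.0}
hotspots = find_microclimates(readings)
```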
[0039] Data collected from one or more sensors may be processed in real-time, triggering actuators (such as farming equipment) to make changes in the growing environment; e.g., modifications to temperature, adding nutrients to fertigation supply, powering off lights, or any combination thereof.
[0040] Similar to sensor arrays, actuator arrays may consist of vents, heaters, fans, lights, humidifiers, dehumidifiers, other devices, or any combination thereof that can act on the environment or a subset of the environment (e.g. microclimate of a sub-location of the agricultural facility). Processors with built-in machine learning or Artificial Intelligence (AI) can utilize data from one or more sensors of a sensor array and learn best how to use one or more actuators of an array of actuators to achieve an outcome in a specific environment. For example, if the desired temperature is 72 degrees Fahrenheit, a network of fans and directional vents can be configured to correctly circulate the air in an even distribution pattern.
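A minimal sketch of how a processor might map per-zone temperatures onto actuator actions toward the 72-degree target mentioned above. The zone names, deadband, and action strings are assumptions for illustration; a learned controller would also account for airflow interactions between zones:

```python
def plan_actuator_changes(zone_temps, setpoint=72.0, deadband=1.0):
    """Map each zone's temperature to a fan/vent/heater action.

    Zones within `deadband` degrees of the setpoint are left alone;
    others get a cooling or heating action.
    """
    actions = {}
    for zone, temp in zone_temps.items():
        if temp > setpoint + deadband:
            actions[zone] = "open vent / increase fan speed"
        elif temp < setpoint - deadband:
            actions[zone] = "close vent / enable heater"
        else:
            actions[zone] = "hold"
    return actions

plan = plan_actuator_changes({"north": 75.5, "center": 72.3, "south": 69.8})
```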
[0041] Camera arrays can capture data from a portion of a growing area or an entire growing area. The primary use of imaging technology may be correlated with outcomes for growing plants, but it can also be used to monitor for pests, both insect and animal. If there is a pest detected, a user can know exactly which plants within the agricultural facility might be affected. This feature may be important for food safety assurances, research purposes, improving smart recipes, or any combination thereof. Camera arrays may provide multiple perspectives of an agricultural product (such as different angles of a growing plant), data that can be utilized to determine agricultural product size (e.g. fruit size), leaf orientation, color, stage of disease, plant/animal stress, or any combination thereof. Additionally, with cameras biofeedback may be incorporated so that the processor can adjust the operating parameters (such as environmental conditions) based on data (such as leaf orientation, color, reproductive status, plant health or any combination thereof). Two-dimensional camera technology may be incorporated in a variety of spectrums. Three-dimensional camera technology may unlock live modeling functionality and additional correlation possibilities.
Enterprise Resource Planning (ERP) functionality
[0042] The farm management system (FMS) may integrate a subset of ERP functionality with the focus being capturing data and utilizing that data to run an agricultural facility with maximum efficiency. Features include but are not limited to: sales management, order management, customer relationship management, task management, standard operating procedures, user training, inventory, cost analysis, planning or any combination thereof.
[0043] The platform may be tailored specifically to the needs of growers and domain expertise may be incorporated into the platform. For example, the platform may comprise features allowing growers to collect information relevant for food safety compliance, to generate reports for certification, or a combination thereof.
[0044] Features of the platform may include:
[0045] Integrated Pest Management (IPM). May use computer vision to collect detailed information on pests and organisms in the environment or agricultural facility.
[0046] Food safety traceability, documentation and provenance. May allow for consumers to have transparency in where their food comes from and what was done to it.
[0047] Nutrition certification. May provide data on agricultural product provenance to show that a batch of food meets a certain level of nutrition based on laboratory analysis of batch samples.
[0048] Sustainability certification. May provide insight to a food consumer into how their food was grown based on data collected by the platform. Data collected that may be provided to a food consumer may include the amount of water used, the amount of energy used, the amount of soil square footage used, other resources used, metadata of how an agricultural product may be grown (for example, open field vs. soil greenhouse vs. hydroponic greenhouse, etc.), or any combination thereof.
[0049] Supply chain integration. May provide integration with Supply Chain data using REST and GraphQL APIs. This may be important for food safety and efficiently executing a food recall.
[0050] Additionally, the platform may provide integration with other ERP products such as NetSuite and Oracle ERP.
[0051] “Representational State Transfer” or “REST” is an architectural style that defines a set of constraints to be used for creating web services and provides interoperability between computer systems on the internet.
Customizable Interface
[0052] Farm management, growers, and farm workers constitute distinct user groups. The platform may be customized to each user group’s needs but built from a shared centralized data source. Managers and growers may each have access to a powerful customized dashboard and admin interface giving them a complete view of everything happening on the farm. Farm workers may have access to task management tools, barcode scanners, and applications for workstations such as harvesting, seeding, and packaging as well as access to Standard Operating Procedure documentation. The platform may provide a plurality of discrete user interfaces. The platform may provide separate user interfaces for agricultural managers, agricultural growers, and food consumers. The platform may provide at least 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, 30, 40, 50, 100, 200, 500, 1,000, 10,000, 100,000, 1,000,000 discrete user interfaces or more. The platform may provide at least 50 discrete user interfaces. The platform may provide at least 500 discrete user interfaces. The platform may provide at least 5,000 discrete user interfaces. The platform may provide at least 50,000 discrete user interfaces. The platform may provide at least 500,000 discrete user interfaces. A discrete user interface may limit the data a user can access, whether a user can input data into the platform, the type of request a user can enter into the interface, or any combination thereof. For example, data entry may be reserved for agricultural grower or manager interfaces.
Capturing Data from Workers
[0053] In general, agricultural growers using their mobile devices to access an application to input data may represent a food safety risk. To avoid potential food safety issues associated with employees using their own devices, work stations can be set up ahead of time by following a sterilization standard operating procedure: for example, a harvesting station with a tablet computer and a smart scale can be sterilized and used with gloves. Smart watches can be sterilized before each use and used with sterile gloves. Voice recognition systems can capture agricultural growers’ logs, and Bluetooth enabled scanners can scan QR codes or RFID chips.
Plant Recipes (Grow files)
[0054] Plant Recipes may be a format for encoding information about an organism's environmental requirements to achieve a desired phenotype, as well as any instructions or protocols needed to achieve a certain outcome. The information in a recipe can be serialized in JSON, XML, or JSON-LD. “JSON Web Token” or “JWT Token” is a JSON-based open standard (RFC 7519) for creating access tokens that assert some number of claims and may include user information including encrypted user information.
[0055] A Plant Recipe may be interpreted by a reactive module that parses a stream of event data (sensors, actuators, etc.) according to rules defined in the Plant Recipe. This reactive module may emit its own stream of event data including messages, alerts, and new actions (such as adjusting an operating parameter by changing a setting on an actuator - turning on a heater or pump).
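The JSON serialization and the reactive rule evaluation described above might look like the following. The field names in the JSON are assumptions for illustration, not a published recipe schema:

```python
import json

# A hypothetical Plant Recipe: target environmental ranges plus one rule
recipe_json = """
{
  "cultivar": "tomato",
  "environment": {"temperature_f": [65, 90], "humidity_pct": [50, 70]},
  "rules": [
    {"when": {"sensor": "temperature_f", "above": 90},
     "action": "reduce heater output"}
  ]
}
"""

def evaluate(recipe, event):
    """Parse one sensor event against the recipe's rules.

    Emits the list of actions (e.g. actuator adjustments) whose
    conditions the event satisfies, mimicking the reactive module
    that consumes the sensor/actuator event stream.
    """
    actions = []
    for rule in recipe["rules"]:
        cond = rule["when"]
        if event["sensor"] == cond["sensor"] and event["value"] > cond["above"]:
            actions.append(rule["action"])
    return actions

recipe = json.loads(recipe_json)
actions = evaluate(recipe, {"sensor": "temperature_f", "value": 93.0})
```

Serializing the recipe as plain JSON is what makes a successful grow replayable: the same file can drive the reactive module in a different controlled environment.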
[0056] Sensor data from successful grows can be used to create new recipes, in essence recording environmental variables for playback later. For example, a particular harvest of tomatoes may have higher nutrient content than another harvest of tomatoes. By taking sensor data from the environment for the duration of the grow, it may be encoded into the recipe format to recreate that grow in a different controlled environment or the next grow. With greater data inflow, recipes may become increasingly targeted and intelligent through the development of machine learning algorithms.
[0057] A smart recipe may be provided by a user. A smart recipe may be at least in part an output provided by a trained algorithm from a previous grow cycle. A smart recipe may be a combination of an initial recipe provided by a user or database that may be additionally modified or updated by a trained algorithm, such as updated during a grow cycle. A smart recipe may be updated based on a subtype of agricultural product that may be grown. A smart recipe may be updated based on a geographical location that the agricultural product may be grown. A smart recipe may be updated based on a user feedback, such as a request from a food consumer for an agricultural product having a particular set of nutritional elements.
Nutritional Elements
[0058] An agricultural product may be evaluated or ranked against a second agricultural product for one or more nutritional elements. A nutritional element may include a presence of a mineral, an amount of a mineral, a presence of a vitamin, an amount of a vitamin, an amount of calories, an amount of sugar, an amount of salt, an amount of fat, a type of fat, or any combination thereof. A trained algorithm may receive data from an array of sensors, determine one or more nutritional elements of the agricultural product, and direct a modification of one or more operating parameters of one or more actuators, a modification to a recipe, or a modification to a raw material to optimize the nutritional element or a panel of nutritional elements as compared to a control agricultural product.
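The evaluation against a control product described above might be expressed as a per-element ratio. The element names, units, and sample values below are illustrative placeholders:

```python
def rank_against_control(product, control):
    """Compare a product's nutritional elements to a control product.

    Returns, for each element present in both profiles, the ratio
    product/control; a ratio above 1.0 means the product exceeds
    the control for that element.
    """
    return {k: round(product[k] / control[k], 2)
            for k in control if k in product}

# Hypothetical harvest compared against a control harvest
harvest_a = {"vitamin_c_mg": 18.0, "sugar_g": 3.2}
control = {"vitamin_c_mg": 12.0, "sugar_g": 4.0}
ratios = rank_against_control(harvest_a, control)
```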
Machine Learning Powered by Growing
[0059] Using data captured from the ERP section, the IoT section, or both as well as market data, machine learning models can be created to help growers improve their operation. The purpose of the FMS may be to collect structured data that can be used to power machine learning services such as:
[0060] Crop Planning. Sensor data and a database of plant recipes may be used to determine what plants would grow well in a particular growing environment and to give a user recommendation on how to optimize any environment for a given cultivar. In addition, sales data may drive recommendations on what crops should be grown in a given geographical location.
[0061] Pest detection, predictions, and mitigations: Using data collected from the Integrated Pest Management portion of the FMS ERP alongside historical data, the platform can not only recognize individual pests from images but may also predict pest occurrence based on growing seasons, environmental conditions, other factors, or any combination thereof.
[0062] Advanced environment automation: Precise control of environmental variables based on sensor, actuator, and camera arrays may be possible with the platform. Biofeedback may also be incorporated via camera image-based collection, so that the processor can direct adjustment of one or more environmental conditions based on the biofeedback - leaf orientation, color, reproductive status, plant health, or any combination thereof.
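The biofeedback loop above can be sketched as a rule that maps image-derived plant metrics to setpoint adjustments. The metric names, thresholds, and adjustment sizes below are illustrative assumptions, not values from the platform.

```python
def adjust_setpoints(setpoints, biofeedback):
    """Return updated environmental setpoints given camera-derived signals."""
    updated = dict(setpoints)
    # A low leaf-color index (pale leaves) may indicate excess light;
    # dim the supplemental lighting.
    if biofeedback.get("leaf_color_index", 1.0) < 0.5:
        updated["light_ppfd"] = max(0, setpoints["light_ppfd"] - 50)
    # A steep leaf angle (wilting) may indicate heat stress;
    # lower the air temperature setpoint.
    if biofeedback.get("leaf_angle_deg", 0.0) > 60.0:
        updated["air_temp_c"] = setpoints["air_temp_c"] - 1.0
    return updated

setpoints = {"light_ppfd": 400, "air_temp_c": 26.0}
feedback = {"leaf_color_index": 0.4, "leaf_angle_deg": 70.0}
updated = adjust_setpoints(setpoints, feedback)
```

A deployed system would replace these hand-written rules with the trained algorithm's output, but the actuator-facing interface could look much the same.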
[0063] The platform may provide a singular software solution to run a profitable, data-driven modern agricultural facility. By utilizing the platform, a user may have access to control of monitoring, automation, and farm operations, which ultimately may empower the user to run a more profitable and effective farm. Moreover, plant recipes (such as smart recipes) and proven Standard Operating Procedures (SOPs) allow users to source and replicate proven methods to get results regardless of their growing experience. Uniquely, the platform may be implemented in farming systems of many different types (plant factory, greenhouse, hydroponic, aeroponic).
[0064] One of the key assets of the platform may be the data stream and database. All of the data points that may be gathered from sensors, activities, markets, plant health, harvest data, or any combination thereof may be leveraged to create machine learning models for advanced growing tools. The platform may collect data (including location, types of grow systems already in place, sales and orders, market data, historical data for a particular environment, or any combination thereof), and the platform structures the data to deliver useful insights and accountability.
[0065] Entirely new categories of products may be built on top of this data, including financial tools for the CEA industry, such as crop futures or crop insurance, as well as decentralized distribution for farms: local growers selling directly to local buyers. Finally, data marketplaces may emerge where expert growers sell access to proven plant recipes and SOPs.
Research Sector
[0066] In addition to production environments, these innovations can also be used for agricultural research environments. Controlled growing environments equipped with sensor, actuator, and camera arrays can run experiments and capture structured data at levels of efficiency that may exceed any efforts to date. This may allow users (such as scientists) to maintain adequate control groups when running experiments, and may even allow them to run experiments in an attempt to optimize environmental conditions and achieve a unique phenotype (such as a higher nutrition profile). Moreover, plant recipe files may allow for the experiments to be recorded and replicated, allowing users (such as scientists) to run crop trials in a consistent and controlled way, which may lead to reproducibility in agricultural science.
[0067] The API and export features disclosed herein may allow for access to all of this structured data, allowing users (such as scientists) to use virtually any data analysis tools. Importantly, this may allow the user to link rich datasets to their publications, increasing research transparency and ultimately allowing for better peer review and science.
[0068] For research institutions running multiple experiments, the platform may allow these users to collect structured data from experiments and provide insight into the state of past, present, and future experiments.
[0069] In some embodiments, the platform uses one or more sensors in an n x n array (or grid) or alternative geometric configuration to collect data at discrete locations over a portion of an agricultural facility, which in some embodiments is digitized using pickup electronics and in some embodiments is connected to a computer for recording and displaying this data. It should be understood, however, that the platform is suitable for measuring data associated with any type of agricultural product.
[0070] In some embodiments, a sensor is configured to sense data associated with, for example, a plant, an animal, an environment, a climate, a parameter of a location associated with an agricultural facility, or any combination thereof. In some embodiments, the platform comprises a mobile base unit that may be movable and that houses one or more sensors. In some embodiments, the platform comprises a mobile base unit, one or more sensors, and one or more actuators. In some embodiments, a mobile base unit includes wheels or a track upon which the mobile base unit is moved on a surface.
[0071] A trained algorithm may provide an output. The output may comprise a yield prediction of at least one agricultural product, a disease prediction of at least one agricultural product, a weed detection in a grow cycle of at least one agricultural product, a crop quality of a grow cycle of at least one agricultural product, a species recognition of at least one agricultural product, an animal welfare rating of at least one agricultural product that is an animal-based product, a water management result of a grow of at least one agricultural product, a livestock production of at least one animal-based product, a soil management of a grow of at least one agricultural product, or any combination thereof.
Machine Learning
[0072] A platform may comprise a machine learning module, such as a trained algorithm. A machine learning module may be trained on one or more training data sets. A machine learning module may be trained on at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 data sets or more. A machine learning module may be trained on from about 50 to about 200 data sets. A machine learning module may be trained on from about 50 to about 1,000 data sets. A machine learning module may be trained on from about 1,000 to about 5,000 data sets. A machine learning module may be trained on from about 5 to about 500 data sets. A machine learning module may generate a training data set from data acquired or extracted from a sensor or user. A machine learning module may be validated with one or more validation data sets. A validation data set may be independent from a training data set. A training data set may comprise data provided by a sensor, data provided by a user, or any combination thereof. A training data set may be stored in a database of the platform. A training data set may be uploaded to the machine learning module from an external source. A training data set may be generated from data acquired from a grow cycle. A training data set may be updated continuously or periodically. A training data set may comprise data from at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 different grows. A training data set may comprise data from about 50 to about 200 different grows. A training data set may comprise data from about 50 to about 1,000 different grows. A training data set may comprise data from about 1,000 to about 5,000 different grows. A training data set may comprise data from about 5 to about 500 different grows.
[0073] In some embodiments, the sensed parameter(s) herein are received as an input by a processor, which outputs a correlation. In some embodiments, the correlation herein is received as an input to a machine learning algorithm configured to output guidance or instruction for future grows.
[0074] The systems, methods, and media described herein may use machine learning algorithms for training prediction models and/or making predictions for a grow. Machine learning algorithms herein may learn from and make predictions on data, such as data obtained from a sensor or user. Data may be any input, intermediate output, previous outputs, or training information, or otherwise any information provided to or by the algorithms.
[0075] A machine learning algorithm may use a supervised learning approach. In supervised learning, the algorithm can generate a function or model from training data. The training data can be labeled. The training data may include metadata associated therewith. Each training example of the training data may be a pair consisting of at least an input object and a desired output value. A supervised learning algorithm may require the user to determine one or more control parameters. These parameters can be adjusted by optimizing performance on a subset, for example a validation set, of the training data. After parameter adjustment and learning, the performance of the resulting function or model can be measured on a test set that may be separate from the training set. Regression methods can be used in supervised learning approaches.
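The supervised workflow above - labeled pairs, a control parameter tuned on a validation subset, and a held-out test set - can be sketched as follows. The data is synthetic, and the simple k-nearest-neighbour classifier stands in for whatever model the platform might use; the "healthy grow" labeling rule is an assumption for the example.

```python
import random

def knn_predict(train, x, k):
    """Majority label among the k nearest training examples (1-D inputs)."""
    nearest = sorted(train, key=lambda ex: abs(ex[0] - x))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

def accuracy(model_k, train, data):
    return sum(knn_predict(train, x, model_k) == y for x, y in data) / len(data)

random.seed(0)
# Labeled pairs: (sensed temperature, 1 = healthy grow, 0 = failed grow).
data = [(t, 1 if 18 <= t <= 26 else 0)
        for t in (random.uniform(10, 34) for _ in range(120))]
train, val, test = data[:80], data[80:100], data[100:]

# Adjust the control parameter k by optimizing performance on the
# validation subset only, then measure on the separate test set.
best_k = max([1, 3, 5, 7], key=lambda k: accuracy(k, train, val))
test_acc = accuracy(best_k, train, test)
```

Only the validation set is consulted while tuning, so the test accuracy remains an unbiased estimate of performance on unseen grows.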
[0076] A machine learning algorithm may use an unsupervised learning approach. In unsupervised learning, the algorithm may generate a function/model to describe hidden structures from unlabeled data (e.g., a classification or categorization that cannot be directly observed or computed). Since the examples given to the learner are unlabeled, there is no evaluation of the accuracy of the structure that is output by the relevant algorithm. Approaches to unsupervised learning include: clustering, anomaly detection, and neural networks.
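Clustering, the first approach named above, can be illustrated with a minimal one-dimensional k-means over unlabeled sensor readings; the humidity values are invented for the example, and no ground-truth labels are used.

```python
def kmeans_1d(values, k, iters=20):
    """Cluster unlabeled 1-D readings into k groups (minimal k-means)."""
    # Spread initial centroids across the sorted data.
    srt = sorted(values)
    centroids = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # Assign each reading to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Two clearly separated bands of humidity readings (% RH).
readings = [40.1, 41.0, 39.5, 40.7, 79.8, 80.4, 81.1, 79.2]
centroids, clusters = kmeans_1d(readings, k=2)
```

The algorithm discovers the two bands without ever being told which reading belongs where, which is the defining property of the unsupervised setting.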
[0077] A machine learning algorithm may use a semi-supervised learning approach. Semi-supervised learning can combine both labeled and unlabeled data to generate an appropriate function or classifier.
[0078] A machine learning algorithm may use a reinforcement learning approach. In reinforcement learning, the algorithm can learn a policy of how to act given an observation of the world. Every action may have some impact in the environment, and the environment can provide feedback that guides the learning algorithm.
[0079] A machine learning algorithm may use a transduction approach. Transduction can be similar to supervised learning, but does not explicitly construct a function. Instead, it tries to predict new outputs based on training inputs, training outputs, and new inputs.
[0080] A machine learning algorithm may use a “learning to learn” approach. In learning to learn, the algorithm can learn its own inductive bias based on previous experience.
[0081] A machine learning algorithm is applied to agricultural product data to generate a prediction model. In some embodiments, a machine learning algorithm or model may be trained periodically. In some embodiments, a machine learning algorithm or model may be trained non-periodically.
[0082] As used herein, a machine learning algorithm may include learning a function or a model. The mathematical expression of the function or model may or may not be directly computable or observable. The function or model may include one or more parameter(s) used within a model. For example, a linear regression model having the formula Y = C0 + C1x1 + C2x2 has two predictor variables, x1 and x2, and coefficients or parameters, C0, C1, and C2. The predicted variable in this example is Y. After the parameters of the model are learned, values can be entered for each predictor variable in the model to generate a result for the dependent or predicted variable (e.g., Y).
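A worked instance of the linear model above makes the roles of the parameters concrete. The coefficient values and the choice of predictors (growing temperature and daily light integral) are assumptions for illustration, not learned values from the platform.

```python
def predict(c0, c1, c2, x1, x2):
    """Y = C0 + C1*x1 + C2*x2 with learned coefficients."""
    return c0 + c1 * x1 + c2 * x2

# Hypothetical learned parameters: a base yield plus contributions from
# growing temperature (x1, deg C) and daily light integral (x2, mol/m^2/day).
c0, c1, c2 = 2.0, 0.5, 0.25
y = predict(c0, c1, c2, x1=22.0, x2=16.0)  # 2.0 + 11.0 + 4.0 = 17.0
```

Once C0, C1, and C2 are learned, any new pair of predictor values can be entered to obtain the predicted variable Y.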
[0083] In some embodiments, a machine learning algorithm comprises a supervised or unsupervised learning method such as, for example, support vector machine (SVM), random forests, gradient boosting, logistic regression, decision trees, clustering algorithms, hierarchical clustering, K-means clustering, or principal component analysis. Machine learning algorithms may include linear regression models, logistic regression models, linear discriminant analysis, classification or regression trees, naive Bayes, K-nearest neighbor, learning vector quantization (LVQ), support vector machines (SVM), bagging and random forest, boosting and Adaboost machines, or any combination thereof.
[0084] Data input into a machine learning algorithm may include data obtained from an individual, data obtained from a practitioner, or a combination thereof. Data input into a machine learning algorithm may include data extracted from a sensor, from a user, or a combination thereof. Data input into a machine learning algorithm may include a product yield, an environmental condition, a pest resilience, a nutrient profile, a farming practice used, or any combination thereof.
[0085] Data obtained from one or more grows can be analyzed using feature selection techniques, including filter techniques, which may assess the relevance of one or more features by looking at the intrinsic properties of the data; wrapper methods, which may embed a model hypothesis within a feature subset search; and embedded techniques, in which a search for an optimal set of features may be built into a machine learning algorithm. A machine learning algorithm may identify a set of parameters that may provide an optimized grow.
[0086] A machine learning algorithm may be trained with a training set of samples. The training set of samples may comprise data collected from a grow, from different grows, or from a plurality of grows. A training set of samples may comprise data from a database.
[0087] A training set of samples may include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more data types. A training set of samples may comprise a single data type. A training set of samples may include different data types. A training set of samples may comprise a plurality of data types. A training set of samples may comprise at least three data types. A training set of samples may include data obtained from about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more grows. A training set of samples may include data from a single grow. A training set of samples may include data from different grows. A training set of samples may include data from a plurality of grows.
[0088] Iterative rounds of training may occur to arrive at a set of features to classify data. Different data types may be ranked differently by the machine learning algorithm. One data type may be ranked higher than a second data type. Weighting or ranking of data types may denote significance of the data type. A higher weighted data type may provide an increased accuracy, sensitivity, or specificity of the classification or prediction of the machine learning algorithm. For example, an input parameter of growing temperature may significantly increase crop yield, more than any other input parameter. In this case, growing temperature may be weighted more heavily than other input parameters in increasing crop yield. The weighting or ranking of features may vary from grow to grow. The weighting or ranking of features may not vary from grow to grow.
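The weighting idea above - one data type (e.g., growing temperature) contributing more to the prediction than another - can be sketched by scoring each input parameter by the absolute correlation of its values with crop yield and ranking the results. The feature names and numbers are invented for the example, and correlation is only one of many possible weighting schemes.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Hypothetical input parameters recorded over five grows.
features = {
    "growing_temp_c": [20, 22, 24, 26, 28],
    "substrate_ph":   [6.1, 5.9, 6.0, 6.2, 6.0],
}
yield_kg = [3.0, 3.6, 4.1, 4.8, 5.2]  # crop yield per grow

# Rank data types by how strongly they track yield.
ranking = sorted(features, key=lambda f: abs(pearson(features[f], yield_kg)),
                 reverse=True)
```

Here temperature tracks yield almost perfectly while pH barely does, so temperature would be weighted more heavily, mirroring the example in the paragraph above.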
[0089] A machine learning algorithm may be tested with a testing set of samples. The testing set of samples may be different from the training set of samples. At least one sample of the testing set of samples may be different from the training set of samples. The testing set of samples may comprise data collected from before a grow, during a grow, after a grow, from different grows, or from a plurality of grows. A testing set of samples may comprise data from a database.
[0090] A training set of samples may include different data types - such as one or more input parameters and one or more output parameters. A testing set of samples may include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more data types. A testing set of samples may comprise a data type. A testing set of samples may include different data types. A testing set of samples may comprise a plurality of data types. A testing set of samples may comprise at least three data types. A testing set of samples may include data obtained from 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 or more grows. A testing set of samples may include data from a single grow. A testing set of samples may include data from different grows. A testing set of samples may include data from a plurality of grows.
[0091] A machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% accuracy. A machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% sensitivity. A machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% specificity. For example, a machine learning algorithm may classify with 90% accuracy that an agricultural yield will not succumb to pest infestation. A machine learning algorithm may classify a grow as having at least 90% likelihood of producing an agricultural product with superior nutritional profile as compared to a control. A machine learning algorithm may predict at least 95% likelihood of an agricultural yield under a range of growing temperatures.
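The three performance measures named above can be computed from predicted versus true binary outcomes. The pest-infestation labels below are invented for the example (1 = infestation occurred).

```python
def metrics(true, pred):
    """Accuracy, sensitivity, and specificity for binary classifications."""
    tp = sum(t == 1 and p == 1 for t, p in zip(true, pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(true, pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(true, pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(true, pred))
    return {
        "accuracy": (tp + tn) / len(true),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
    }

true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
m = metrics(true, pred)
```

Reporting all three matters: a classifier that always predicts "no infestation" would score high accuracy on mostly-healthy grows while having zero sensitivity.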
[0092] An independent sample may be independent from the training set of samples, the testing set of samples, or both. The independent sample may be input into the machine learning algorithm for classification. An independent sample may not have been previously classified by the machine learning algorithm.
[0093] A classifier may be employed to determine or to predict a set of growing conditions to be executed during a grow. A classifier may provide real-time feedback and guided adjustments of the one or more growing conditions - such as during a grow. One or more growing conditions may be adjusted in real time during a grow.
[0094] Use of a machine learning algorithm may optimize agricultural product yield or nutritional benefit. A machine learning algorithm may identify an ‘ideal’ or ‘optimized’ input parameter for each grow. An ‘ideal’ or ‘optimized’ input parameter may remain constant or may change over time. An ‘ideal’ or ‘optimized’ input parameter may be specific or unique for each grow or agricultural product. Feedback from a machine learning algorithm may be continuous such as feedback during a grow, episodic such as at the end of a grow, or roll-back such as cumulative changes over several different grows, or any combination thereof. Feedback from a machine learning algorithm may result in one or more changes in a recipe or operating parameter.
[0095] A trained algorithm (such as a machine learning software module) as described herein is configured to undergo at least one training phase wherein the trained algorithm is trained to carry out one or more tasks including data extraction, data analysis, and generation of an output or result, such as a recipe for growing an agricultural product with maximal yield or with maximal nutritional benefit.
[0096] In some embodiments of the agricultural platform described herein, the agricultural platform comprises a training module that trains the trained algorithm. The training module is configured to provide training data to the trained algorithm, said training data comprising, for example, a data set from an agricultural facility or a data set from a previous grow.
[0097] In some embodiments, a trained algorithm is trained using a data set and a target in a manner that might be described as supervised learning. In these embodiments, the data set is conventionally divided into a training set, a test set, and, in some cases, a validation set. A target is specified that contains the correct classification of each input value in the data set. For example, a data set from one type of agricultural product is repeatedly presented to the trained algorithm, and for each sample presented during training, the output generated by the trained algorithm is compared with the desired target. The difference between the target and the generated output is calculated, and the trained algorithm is modified to cause the output to more closely approximate the desired target value, such as maximized yield of the agricultural product. In some embodiments, a back-propagation algorithm is utilized to cause the output to more closely approximate the desired target value. After a large number of training iterations, the trained algorithm output will closely match the desired target for each sample in the input training set. Subsequently, when new input data, not used during training, is presented to the trained algorithm, it may generate an output classification value indicating which of the categories the new sample is most likely to fall into. The trained algorithm is said to be able to "generalize" from its training to new, previously unseen input samples. This feature of a trained algorithm allows it to be used to classify almost any input data which has a mathematically formulatable relationship to the category to which it should be assigned.
[0098] In some embodiments of the trained algorithm described herein, the trained algorithm utilizes an individual learning model. An individual learning model is based on the trained algorithm having trained on data from a single individual and thus, a trained algorithm that utilizes an individual learning model is configured to be used on a single individual on whose data it trained.
[0099] In some embodiments, the trained algorithm utilizes a global training model. A global training model is based on the trained algorithm having trained on data from multiple individuals and thus, a trained algorithm that utilizes a global training model is configured to be used on multiple individuals.
[0100] In some embodiments, the trained algorithm utilizes a simulated training model. A simulated training model is based on the trained algorithm having trained on a data set obtained from a grow of an agricultural product. A trained algorithm that utilizes a simulated training model is configured to be used on multiple grows of an agricultural product.
[0101] Unsupervised learning is used, in some embodiments, to train a trained algorithm to take input data, such as agricultural product data, and produce an output, such as a maximized yield or a disease detection. Unsupervised learning, in some embodiments, includes feature extraction which is performed by the trained algorithm on the input data. Extracted features may be used for visualization, for classification, for subsequent supervised training, and more generally for representing the input for subsequent storage or analysis. In some cases, each training case may consist of a plurality of agricultural product data.
[0102] Trained algorithms that are commonly used for unsupervised training include k-means clustering, mixtures of multinomial distributions, affinity propagation, discrete factor analysis, hidden Markov models, Boltzmann machines, restricted Boltzmann machines, autoencoders, convolutional autoencoders, recurrent neural network autoencoders, and long short-term memory autoencoders. While there are many unsupervised learning models, they all have in common that, for training, they require a training set consisting of a data set of grows of an agricultural product, without associated labels.
[0103] A trained algorithm may include a training phase and a prediction phase. The training phase is typically provided with data in order to train the machine learning algorithm. Non-limiting examples of types of data inputted into a trained algorithm for the purposes of training include an agricultural product yield, an amount and type of raw materials, an environmental condition during a grow, a length of grow, a nutritional profile of the agricultural product, a soil composition, or any combination thereof. Data that is inputted into the trained algorithm is used, in some embodiments, to construct a hypothesis function to determine the presence of an abnormality. In some embodiments, a trained algorithm is configured to determine if the outcome of the hypothesis function was achieved and, based on that analysis, make a determination with respect to the data upon which the hypothesis function was constructed. That is, the outcome tends to either reinforce the hypothesis function with respect to the data upon which the hypothesis function was constructed or contradict the hypothesis function with respect to that data. In these embodiments, depending on how close the outcome tends to be to an outcome determined by the hypothesis function, the machine learning algorithm will either adopt, adjust, or abandon the hypothesis function with respect to the data upon which the hypothesis function was constructed. As such, the machine learning algorithm described herein dynamically learns through the training phase what characteristics of an input (e.g., data) are most predictive in optimizing a crop yield, minimizing pest infestation or the carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
[0104] For example, a trained algorithm is provided with data on which to train so that it, for example, is able to determine the most salient features of received agricultural product data to operate on. The trained algorithms described herein are trained as to how to analyze the agricultural product data, rather than analyzing the agricultural product data using pre-defined instructions. As such, the trained algorithms described herein dynamically learn through training what characteristics of an input signal are most predictive in optimizing a crop yield, minimizing pest infestation or the carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
[0105] In some embodiments, the trained algorithm is trained by repeatedly presenting the trained algorithm with agricultural product data across a range of successful and non-successful grows. The trained algorithm may be presented with data from grows having high yield and data from grows having no product produced. The trained algorithm may be presented with data from grows having a high carbon footprint and data from grows having a minimized carbon footprint. A trained algorithm may receive heterogeneous data conveying the range and variability of data that the trained algorithm may encounter in a future grow. Agricultural product data may be generated by computer simulation.
[0106] In some embodiments, training begins when the trained algorithm is given agricultural product data and asked to optimize a crop yield, minimize pest infestation or the carbon footprint of a product, maximize profit or nutritional value of a product, or any combination thereof. The predicted output is then compared to the true data that corresponds to the agricultural product data. An optimization technique such as gradient descent with backpropagation is used to update the weights in each layer of the trained algorithm so as to produce closer agreement between the result predicted by the trained algorithm and the optimized result. This process is repeated with new agricultural product data until the accuracy of the network has reached the desired level.
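The weight-update loop described above can be reduced to a minimal gradient-descent sketch: compare the prediction to the true value, update the weight, and repeat until the error falls below a desired level. A single-weight model on synthetic yield data stands in for the multi-layer network; the learning rate and tolerance are arbitrary example values.

```python
def train(data, lr=0.01, tol=1e-6, max_iters=10_000):
    """Fit w in y_hat = w * x by gradient descent on mean squared error."""
    w = 0.0
    for _ in range(max_iters):
        # Gradient of mean((w*x - y)^2) with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad          # update the weight toward lower error
        if abs(grad) < tol:     # stop once the desired accuracy is reached
            break
    return w

# Synthetic (input, true yield) pairs generated with a true weight of 1.5.
data = [(x, 1.5 * x) for x in (1.0, 2.0, 3.0, 4.0)]
w = train(data)
```

In a real network, backpropagation computes this same kind of gradient for every weight in every layer, but the compare-update-repeat structure is identical.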
[0107] In general, a machine learning algorithm may be trained using a large database of measurements and/or any features or metrics computed from the above said data with the corresponding ground-truth values. The training phase constructs a transformation function for optimizing a crop yield, minimizing pest infestation or the carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof. The machine learning algorithm dynamically learns through training what characteristics or features of an input signal are most predictive in optimizing the features of an agricultural product, such as nutritional profile. A prediction phase uses the constructed and optimized transformation function from the training phase to predict the optimization of the grow and product yield.
[0108] Following training, the trained algorithm may be used to maximize, for example, the agricultural product yield on which the system was trained using the prediction phase. With appropriate training data, the system can predict, in an independent grow cycle, the optimized product yield.
Data Filtering
[0109] In some embodiments of the devices, systems, software, and methods described herein, data that is received by a machine learning algorithm software module from a sensor as an input may comprise agricultural product data that has been filtered and/or modified. In some embodiments, filtering comprises a removal of noise or artifact from sensed data, such as noise perturbations or temperature fluctuations from a grower entering or exiting a facility. Artifact or noise may comprise, for example, ambient signals that are sensed together with data sensed in an agricultural facility.
[0110] In some embodiments of the devices, systems, software, and methods described herein, sensed agricultural product data is filtered prior to and/or after transmission of said data to a processor. Filtering of sensed agricultural product data may, for example, comprise the removal of ambient signal noise from sensed agricultural product data. Signal noise may, for example, comprise ambient agricultural product data generated by, for example, electronic devices, electrical grids, or other devices.
[0111] In some embodiments, sensed agricultural product data is converted to another form of data or signal which then undergoes a signal filtering process. In some embodiments, a device or system includes a processor including software that is configured to convert sensed agricultural product data to another form of data or signal. The process of converting sensed agricultural product data to another form of data or signal typically comprises an encoding process, wherein a first form of data is converted into a second form of data or signal. Once filtered, the filtered data may be transmitted to a machine learning algorithm for analysis.
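One simple filtering step of the kind described above is a moving average that smooths a short perturbation (such as a temperature spike from a grower opening a door) out of a sensed stream before it reaches the learning algorithm. The window size and readings are illustrative; real pipelines might use median or low-pass filters instead.

```python
def moving_average(readings, window=3):
    """Replace each reading with the mean of a centered window."""
    half = window // 2
    out = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        out.append(sum(readings[lo:hi]) / (hi - lo))
    return out

# Temperature stream with a spike from a door opening (deg C).
raw = [22.0, 22.1, 27.5, 22.0, 21.9]
filtered = moving_average(raw)
```

The spike is attenuated while the baseline readings are nearly unchanged, so the downstream model sees the grow-room trend rather than the transient artifact.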
[0112] It should be understood that any device, system, and/or software described herein is configured for use in or is captured by one or more steps of a method.
Drawings
[0001] FIG. 1 shows an embodiment of a system such as used in an agricultural platform as described herein, comprising a digital processing device 101. The digital processing device 101 includes a software application configured for agriculture management. Alternatively or in combination, the digital processing device 101 is configured to generate a trained algorithm (e.g., machine learning algorithm) such as by training the algorithm with a training data set. The digital processing device 101 may include a central processing unit (CPU, also "processor" and "computer processor" herein) 105, which can be a single core or multi-core processor, or a plurality of processors for parallel processing. The digital processing device 101 also includes either memory or a memory location 110 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 115 (e.g., hard disk), communication interface 120 (e.g., network adapter, network interface) for communicating with one or more other systems, and peripheral devices, such as cache. The peripheral devices can include storage device(s) or storage medium 165 which communicate with the rest of the device via a storage interface 170. The memory 110, storage unit 115, interface 120 and peripheral devices are configured to communicate with the CPU 105 through a communication bus 125, such as a motherboard. The digital processing device 101 can be operatively coupled to a computer network ("network") 130 with the aid of the communication interface 120. The network 130 can comprise the Internet and/or a local area network (LAN). The network 130 can be a telecommunication and/or data network.
[0002] The digital processing device 101 includes input device(s) 145 to receive information from a user, the input device(s) in communication with other elements of the device via an input interface 150. Alternatively or in combination, the input device(s) includes a remote device such as a smartphone or tablet that is configured to communicate remotely with the digital processing device 101. For example, a user may use a smartphone application to access sensor data, current actuator instructions, the smart recipe, or other information stored on the digital processing device 101. The digital processing device 101 can include output device(s) 155 that communicate to other elements of the device via an output interface 160.
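The remote-access pattern described above (a smartphone client requesting sensor data, actuator instructions, or the smart recipe from the processing device) can be sketched as a minimal request handler. The resource names and stored values below are invented for illustration and are not an API defined in this disclosure.

```python
# Illustrative-only sketch of a request handler on the processing device.
# A remote client asks for a named resource; the device returns the stored
# value or an error marker for unknown resources. All keys are hypothetical.

DEVICE_STATE = {
    "sensor_data": {"temp-01": 21.4},
    "actuator_instructions": {"valve-01": "open"},
    "smart_recipe": {"temp_range_c": (18.0, 24.0)},
}

def handle_request(resource):
    """Return the requested resource, or an error marker if unknown."""
    return DEVICE_STATE.get(resource, {"error": "unknown resource"})

reply = handle_request("sensor_data")
```

A real deployment would put a network protocol and authentication in front of such a handler; the sketch only shows the lookup shape of the interaction.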
[0003] The CPU 105 is configured to execute machine-readable instructions embodied in a software application or module. The instructions may be stored in a memory location, such as the memory 110. The memory 110 may include various components (e.g., machine-readable media) including, but not limited to, a random-access memory component (e.g., RAM) (e.g., a static RAM “SRAM”, a dynamic RAM “DRAM”, etc.), or a read-only component (e.g., ROM). A basic input/output system (BIOS), including basic routines that help to transfer information between elements within the digital processing device, such as during device start-up, may also be stored in the memory 110.
[0004] The storage unit 115 can be configured to store files, such as sensor data, smart recipe(s), etc. The storage unit 115 can also be used to store the operating system, application programs, and the like. Optionally, the storage unit 115 may be removably interfaced with the digital processing device (e.g., via an external port connector (not shown)) and/or via a storage unit interface. Software may reside, completely or partially, within a computer-readable storage medium within or outside of the storage unit 115. In another example, software such as the software application and/or module(s) may reside, completely or partially, within the processor(s) 105.
[0005] Information and data can be displayed to a user through a display 135. The display is connected to the bus 125 via an interface 140, and transport of data between the display and other elements of the device 101 can be controlled via the interface 140.
[0006] Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the digital processing device 101, such as, for example, on the memory 110 or the electronic storage unit 115. The machine-executable or machine-readable code can be provided in the form of a software application or software module. During use, the code can be executed by the processor 105. In some cases, the code can be retrieved from the storage unit 115 and stored on the memory 110 for ready access by the processor 105. In some situations, the electronic storage unit 115 can be precluded, and machine-executable instructions are stored on the memory 110.
[0007] In some embodiments, one or more remote devices 102 are configured to communicate with and/or receive instructions from the digital processing device 101, and may comprise any sensor, actuator, or camera as described herein. For example, in some cases, the remote device 102 is a temperature sensor that is configured to gather temperature data and send the data to the digital processing device 101 for analysis according to a smart recipe. The sensor can provide information such as sensor data, type of data, sensor ID, sensor location, metadata, or other data. In some cases, the remote device 102 is an actuator configured to perform one or more actions based on instructions received from the digital processing device 101. In some cases, the remote device is a camera configured to provide a camera feed or imaging data to the digital processing device 101. The camera may be configured to receive and respond to instructions to perform an action such as, for example, turning on/off, rotating or moving, and/or zooming in or out.
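The sensor-to-actuator round trip described above (a remote temperature sensor reports a reading, the processing device checks it against a smart-recipe setpoint, and an actuator is directed to act) can be sketched as one control step. The class, field names, and setpoint values are assumptions for illustration, not an interface defined in this disclosure.

```python
# Hypothetical sketch of one control step: compare a sensor reading to a
# smart-recipe temperature range and direct a heater actuator accordingly.

class HeaterActuator:
    """Stand-in for a remote actuator that accepts on/off instructions."""
    def __init__(self):
        self.state = "off"

    def apply(self, instruction):
        self.state = instruction  # e.g., "on" or "off"

def control_step(reading, recipe, actuator):
    """Direct the actuator based on one sensor reading and the recipe."""
    low, high = recipe["temp_range_c"]
    if reading["value_c"] < low:
        actuator.apply("on")      # too cold: turn the heater on
    elif reading["value_c"] > high:
        actuator.apply("off")     # too warm: turn the heater off
    return actuator.state

recipe = {"temp_range_c": (18.0, 24.0)}  # hypothetical smart-recipe entry
heater = HeaterActuator()
state = control_step({"sensor_id": "temp-01", "value_c": 15.5}, recipe, heater)
```

In practice such a step would run repeatedly as sensor data arrives, with the recipe supplying the target ranges for each controlled parameter.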
Specific Embodiments
[0113] A number of methods and systems are disclosed herein. Specific exemplary embodiments of these methods and systems are disclosed below.
[0114] Embodiment 1. A platform for agriculture management, the platform comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield.
[0115] Embodiment 2. The platform of embodiment 1, wherein the agricultural product comprises an animal-based product.
[0116] Embodiment 3. The platform of any one of embodiments 1-2, wherein the agricultural product comprises a plant-based product.
[0117] Embodiment 4. The platform of any one of embodiments 1-3, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
[0118] Embodiment 5. The platform of any one of embodiments 1-4, wherein the at least five agricultural products comprise different agricultural products.
[0119] Embodiment 6. The platform of any one of embodiments 1-5, wherein the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
[0120] Embodiment 7. The platform of any one of embodiments 1-6, wherein the array of sensors comprise at least 5 different types of sensors.
[0121] Embodiment 8. The platform of any one of embodiments 1-7, wherein the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
[0122] Embodiment 9. The platform of any one of embodiments 1-8, wherein the location comprises two or more locations.
[0123] Embodiment 10. The platform of any one of embodiments 1-9, wherein the location is remote.
[0124] Embodiment 11. The platform of any one of embodiments 1-10, wherein the location is within the agricultural facility.
[0125] Embodiment 12. The platform of any one of embodiments 1-11, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
[0126] Embodiment 13. The platform of any one of embodiments 1-12, wherein the data comprises data collected from previous grows of the agricultural product.
[0127] Embodiment 14. A platform for agriculture management, the platform comprising: (a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and (c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield.
[0128] Embodiment 15. The platform of embodiment 14, wherein the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
[0129] Embodiment 16. The platform of any one of embodiments 14-15, wherein the array of sensors comprise at least 5 different types of sensors.
[0130] Embodiment 17. The platform of any one of embodiments 14-16, wherein the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof.
[0131] Embodiment 18. The platform of any one of embodiments 14-17, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
[0132] Embodiment 19. The platform of any one of embodiments 14-18, wherein the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
[0133] Embodiment 20. The platform of any one of embodiments 14-19, wherein the processor directs a change in at least three operating parameters based on receipt of the data.
[0134] Embodiment 21. The platform of any one of embodiments 14-20, wherein the location comprises two or more locations.
[0135] Embodiment 22. The platform of any one of embodiments 14-21, wherein the location is remote.
[0136] Embodiment 23. The platform of any one of embodiments 14-22, wherein the location is within the agricultural facility.
[0137] Embodiment 24. The platform of any one of embodiments 14-23, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
[0138] Embodiment 25. The platform of any one of embodiments 14-24, wherein the data comprises data collected from previous grows of the agricultural product.
[0139] Embodiment 26. A platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and (c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof.
[0140] Embodiment 27. The platform of embodiment 26, further comprising a database, wherein the database comprises a data set received from a plurality of agricultural facilities.
[0141] Embodiment 28. The platform of any one of embodiments 26-27, wherein the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result.
[0142] Embodiment 29. The platform of any one of embodiments 26-28, wherein the plurality of agricultural facilities is at least 5.
[0143] Embodiment 30. The platform of any one of embodiments 26-29, wherein the plurality of agricultural facilities are located in different geographical locations.
[0144] Embodiment 31. The platform of any one of embodiments 26-30, wherein the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof.
[0145] Embodiment 32. The platform of any one of embodiments 26-31, wherein the user is a food consumer.
[0146] Embodiment 33. The platform of any one of embodiments 26-32, wherein the user is a business entity that sells the agricultural product to a consumer.
[0147] Embodiment 34. The platform of any one of embodiments 26-33, wherein the location comprises two or more locations.
[0148] Embodiment 35. The platform of any one of embodiments 26-34, wherein the location is remote.
[0149] Embodiment 36. The platform of any one of embodiments 26-35, wherein the location is within the agricultural facility.
[0150] Embodiment 37. The platform of any one of embodiments 26-36, wherein the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
[0151] Embodiment 38. The platform of any one of embodiments 26-37, wherein the plurality of sensors comprise at least 5 different types of sensors.
[0152] Embodiment 39. The platform of any one of embodiments 26-38, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
[0153] Embodiment 40. The platform of any one of embodiments 26-39, wherein the agricultural product comprises an animal-based product.
[0154] Embodiment 41. The platform of any one of embodiments 26-40, wherein the agricultural product comprises a plant-based product.
[0155] Embodiment 42. The platform of any one of embodiments 26-41, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
[0156] Embodiment 43. The platform of any one of embodiments 26-42, wherein the data comprises data collected from previous grows of the agricultural product.
[0157] Embodiment 44. A platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility; (c) a processor configured to receive data from at least one sensor of the plurality of sensors; and (d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer.
[0158] Embodiment 45. The platform of embodiment 44, wherein the plurality of discrete user interfaces is at least three.
[0159] Embodiment 46. The platform of any one of embodiments 44-45, wherein a second user interface is configured for an agricultural grower.
[0160] Embodiment 47. The platform of any one of embodiments 44-46, wherein a second user interface is configured for an agricultural manager.
EXAMPLES
Example 1
[0161] A food consumer values sourcing animal-based products from farms that operate with high animal welfare standards. The food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more animal welfare standards reported by one or more farms that supply the animal-based product that the food consumer is considering purchasing. Based on the food consumer’s review of the animal welfare standards reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an animal-based product.
Example 2
[0162] A food consumer values sourcing agricultural-based products from farms that operate with high water conservation and soil preservation standards. The food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more water conservation or soil preservation standards reported by one or more farms that supply the agricultural-based product that the food consumer is considering purchasing. Based on the food consumer’s review of the water conservation or soil preservation standards reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an agricultural-based product.
Example 3
[0163] A food consumer values sourcing food products that contain high nutritional content.
The food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more nutritional profiles (comprising one or more nutrition elements) of an agricultural product reported by one or more farms that supply the agricultural product that the food consumer is considering purchasing. Based on the food consumer’s review of the nutritional profiles reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an agricultural product.
Example 4
[0164] A processor of an agriculture management platform initiates a grow of an agricultural product based on a recipe. During the grow cycle, a drought begins in the location of the grow, significantly reducing the amount of rainfall received by the agricultural product. Data from one or more sensors will be provided to the processor; this data will be related to the significant reduction of rainfall. The processor will then direct a change in one or more actuators during the grow cycle to increase the amount of water provided to the agricultural product.
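The compensating adjustment in this example can be sketched as a single function: when the rainfall reported by sensors falls below what the recipe expects, irrigation is increased to make up the deficit. The function name, thresholds, and units below are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of a drought-compensation step: compare sensed
# rainfall against the recipe's expected water amount and raise the
# irrigation actuator's output by the shortfall.

def adjust_irrigation(rainfall_mm, recipe_target_mm, current_irrigation_mm):
    """Return a new irrigation amount that compensates for a rain deficit."""
    deficit = recipe_target_mm - rainfall_mm
    if deficit <= 0:
        return current_irrigation_mm        # adequate rainfall: no change
    return current_irrigation_mm + deficit  # drought: increase water provided

new_amount = adjust_irrigation(rainfall_mm=5.0,
                               recipe_target_mm=30.0,
                               current_irrigation_mm=10.0)
```

The same shape generalizes to other operating parameters in the recipe (light, nutrients, temperature): sense the shortfall, then direct the corresponding actuator to close it.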
Example 5
[0165] A processor of an agriculture management platform initiates a grow of an agricultural product based on a recipe. During the grow cycle, a moisture-based pest infestation begins in a particular sub-location of the grow, significantly reducing the amount of product yield in that sub-location. Data from one or more sensors will be provided to the processor; this data will be related to the significant reduction of product yield in that sub-location. The processor will then direct a change in one or more actuators within the sub-location (a subset of actuators in the array) during the grow cycle to reduce a moisture content within the sub-location to eradicate or reduce damage by the moisture-based pest infestation to the agricultural product. Moisture data collected from the sub-location during the grow cycle will be incorporated into a trained algorithm of the processor to inform future grows of the agricultural product or to minimize risk of moisture-based pest infestations in future grows of the agricultural product.
Example 6
[0166] An agricultural management platform will have three distinct user portals. A first user portal will be configured for a food consumer. The first user portal for the food consumer will permit access to nutritional information of the agricultural product, a geographical location of a grow, a farming practice (such as organic grow, non-GMO grow, hormone-free grow, antibiotic-free grow, animal welfare standards, wild or farm raised, caged or open access or free range) of an agricultural product, or any combination thereof. The first user portal will permit a food consumer to provide feedback to a farm or to another food consumer, a rating of an agricultural product, a question to a farm or to another food consumer, or any combination thereof. A second user portal will be configured for an agricultural grower. The second user portal will permit the agricultural grower to input, review, or modify one or more operating parameters, outputs, data, recipes, or any combination thereof. A third user portal will be configured for an agricultural manager. The third user portal will permit the agricultural manager to input, review, or modify one or more operating parameters, outputs, data, recipes, or any combination thereof. The agricultural manager will communicate with the agricultural grower via the individual portals by providing feedback, comments, questions, or any combination thereof. A food consumer, an agricultural manager, or an agricultural grower will communicate with each other via the user portals.
[0167] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims

WHAT IS CLAIMED IS:
1. A platform for agriculture management, the platform comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield.
2. The platform of claim 1, wherein the agricultural product comprises an animal-based product.
3. The platform of claim 1, wherein the agricultural product comprises a plant-based product.
4. The platform of claim 1, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
5. The platform of claim 1, wherein the at least five agricultural products comprise different agricultural products.
6. The platform of claim 1, wherein the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
7. The platform of claim 1, wherein the array of sensors comprise at least 5 different types of sensors.
8. The platform of claim 1, wherein the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
9. The platform of claim 1, wherein the location comprises two or more locations.
10. The platform of claim 1, wherein the location is remote.
11. The platform of claim 1, wherein the location is within the agricultural facility.
12. The platform of claim 1, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
13. The platform of claim 1, wherein the data comprises data collected from previous grows of the agricultural product.
14. A platform for agriculture management, the platform comprising:
(a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility;
(b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and
(c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield.
15. The platform of claim 14, wherein the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
16. The platform of claim 14, wherein the array of sensors comprise at least 5 different types of sensors.
17. The platform of claim 14, wherein the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof.
18. The platform of claim 14, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
19. The platform of claim 14, wherein the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
20. The platform of claim 14, wherein the processor directs a change in at least three operating parameters based on receipt of the data.
21. The platform of claim 14, wherein the location comprises two or more locations.
22. The platform of claim 14, wherein the location is remote.
23. The platform of claim 14, wherein the location is within the agricultural facility.
24. The platform of claim 14, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
25. The platform of claim 14, wherein the data comprises data collected from previous grows of the agricultural product.
26. A platform for agriculture management, the platform comprising:
(a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility;
(b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and
(c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof.
27. The platform of claim 26, further comprising a database, wherein the database comprises a data set received from a plurality of agricultural facilities.
28. The platform of claim 26, wherein the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result.
29. The platform of claim 26, wherein the plurality of agricultural facilities is at least 5.
30. The platform of claim 26, wherein the plurality of agricultural facilities are located in different geographical locations.
31. The platform of claim 26, wherein the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof.
32. The platform of claim 26, wherein the user is a food consumer.
33. The platform of claim 26, wherein the user is a business entity that sells the agricultural product to a consumer.
34. The platform of claim 26, wherein the location comprises two or more locations.
35. The platform of claim 26, wherein the location is remote.
36. The platform of claim 26, wherein the location is within the agricultural facility.
37. The platform of claim 26, wherein the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
38. The platform of claim 26, wherein the plurality of sensors comprise at least 5 different types of sensors.
39. The platform of claim 26, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
40. The platform of claim 26, wherein the agricultural product comprises an animal-based product.
41. The platform of claim 26, wherein the agricultural product comprises a plant-based product.
42. The platform of claim 26, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
43. The platform of claim 26, wherein the data comprises data collected from previous grows of the agricultural product.
44. A platform for agriculture management, the platform comprising:
(a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility;
(b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility;
(c) a processor configured to receive data from at least one sensor of the plurality of sensors; and
(d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer.
45. The platform of claim 44, wherein the plurality of discrete user interfaces is at least three.
46. The platform of claim 44, wherein a second user interface is configured for an agricultural grower.
47. The platform of claim 44, wherein a second user interface is configured for an agricultural manager.
PCT/US2020/054120 2019-10-03 2020-10-02 Agricultural platforms WO2021067847A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962910346P 2019-10-03 2019-10-03
US62/910,346 2019-10-03

Publications (1)

Publication Number Publication Date
WO2021067847A1 true WO2021067847A1 (en) 2021-04-08

Family

ID=75338598


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220156603A1 (en) * 2020-11-17 2022-05-19 International Business Machines Corporation Discovering farming practices

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060282274A1 (en) * 2005-06-10 2006-12-14 Bennett Michael S Monitoring and managing farms
US20170023193A1 (en) * 2015-05-18 2017-01-26 Biological Innovation & Optimization Systems, LLC Grow Light Embodying Power Delivery and Data Communications Features
US20180262571A1 (en) * 2016-03-04 2018-09-13 Sabrina Akhtar Integrated IoT (Internet of Things) System Solution for Smart Agriculture Management
US20180263171A1 (en) * 2014-04-21 2018-09-20 The Climate Corporation Generating an agriculture prescription
US20190133026A1 (en) * 2016-04-04 2019-05-09 Freight Farms, Inc. Modular Farm Control and Monitoring System




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20870948; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20870948; Country of ref document: EP; Kind code of ref document: A1)