Instructions for using bharatgenai/AgriParam with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

## Libraries

### Transformers

How to use bharatgenai/AgriParam with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="bharatgenai/AgriParam", trust_remote_code=True)
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load the model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("bharatgenai/AgriParam", trust_remote_code=True, dtype="auto")
```

## Notebooks
- Google Colab
- Kaggle
## Local Apps
### vLLM

How to use bharatgenai/AgriParam with vLLM:

Install from pip and serve the model:

```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "bharatgenai/AgriParam"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "bharatgenai/AgriParam",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```

Or use Docker:

```shell
docker model run hf.co/bharatgenai/AgriParam
```
### SGLang

How to use bharatgenai/AgriParam with SGLang:

Install from pip and serve the model:

```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "bharatgenai/AgriParam" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "bharatgenai/AgriParam",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```

Or use the Docker images:

```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "bharatgenai/AgriParam" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "bharatgenai/AgriParam",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```
### Docker Model Runner

How to use bharatgenai/AgriParam with Docker Model Runner:

```shell
docker model run hf.co/bharatgenai/AgriParam
```
# AgriParam
BharatGen introduces AgriParam, a domain-specialized large language model fine-tuned from Param-1-2.9B-Instruct on a high-quality, India-centric agriculture dataset.
AgriParam is designed to understand and generate contextually rich responses for agricultural queries, farmer advisories, policy information, research insights, and rural knowledge dissemination.
## Motivation
Agriculture is the backbone of India's economy, yet existing language models lack deep domain knowledge tailored to Indian contexts, languages, and cultural nuances.
AgriParam bridges this gap by combining Param-1's bilingual capabilities with a meticulously curated agricultural knowledge base.
## Model Architecture
AgriParam inherits the architecture of Param-1-2.9B-Instruct:
- Hidden size: 2048
- Intermediate size: 7168
- Attention heads: 16
- Hidden layers: 32
- Key-value heads: 8
- Max position embeddings: 2048
- Activation: SiLU
- Positional Embeddings: Rotary (RoPE, theta=10000)
- Attention Mechanism: Grouped-query attention
- Precision: bf16-mixed
- Base model: Param-1-2.9B-Instruct
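The head counts above fix the grouped-query attention layout. A quick sketch of the per-layer projection shapes, derived from the listed config (it assumes the usual `head_dim = hidden_size / num_heads` convention, which is not stated explicitly in the card):

```python
# Derive the grouped-query attention layout from the config listed above.
hidden_size = 2048
num_attention_heads = 16
num_key_value_heads = 8

head_dim = hidden_size // num_attention_heads                      # 128
queries_per_kv_head = num_attention_heads // num_key_value_heads   # 2 query heads share each KV head

# Per-layer projection weight shapes (ignoring biases):
q_proj_shape = (hidden_size, num_attention_heads * head_dim)   # (2048, 2048)
k_proj_shape = (hidden_size, num_key_value_heads * head_dim)   # (2048, 1024)
v_proj_shape = (hidden_size, num_key_value_heads * head_dim)   # (2048, 1024)

print(head_dim, queries_per_kv_head, q_proj_shape, k_proj_shape)
```

Halving the key/value heads relative to the query heads halves the KV-cache footprint per layer compared with full multi-head attention.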
## Data Preparation

AgriParam's training corpus was carefully crafted to ensure deep agricultural knowledge, cultural relevance, and bilingual (English-Hindi) accessibility.
Steps involved:

1. Source gathering
   - 17k open-source, India-focused agricultural news and information passages.
2. Question generation
   - Generated 5 curated Q&A pairs per passage using an open-source LLM.
3. Domain taxonomy & personas
   - Built an exhaustive, India-specific agricultural taxonomy.
   - Defined farmer, policy-maker, scientist, and agri-business personas.
4. Dataset construction
   - 2M Q&A pairs grounded in the taxonomy and personas.
   - Complete dataset translated into Hindi.
   - 6M multi-turn conversation samples created.
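The question-generation step can be sketched as a prompt-construction helper; the wording and function name below are illustrative assumptions, not the actual AgriParam pipeline:

```python
# Illustrative sketch of the question-generation step: build a prompt asking
# an LLM for 5 Q&A pairs grounded in a given passage. The prompt wording and
# helper name are assumptions, not the pipeline actually used for AgriParam.
def build_qa_prompt(passage: str, n_pairs: int = 5) -> str:
    return (
        f"Read the following passage about Indian agriculture and write "
        f"{n_pairs} question-answer pairs grounded strictly in it.\n\n"
        f"Passage:\n{passage}"
    )

prompt = build_qa_prompt("Wheat sowing in Punjab typically begins in early November.")
print(prompt)
```

Each passage's prompt would then be sent to the open-source LLM, and the returned pairs curated before inclusion in the dataset.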
## Training Setup
- Base model: Param-1-2.9B-Instruct
- Training framework: Hugging Face + `torchrun` multi-node setup
- Prompt template: Custom-designed for agricultural inference
- Scheduler: Linear with warmup
- Epochs: 3
- Total training samples: 12M
- Test samples: 1.2M
- Base learning rate: 5e-6
- Minimum learning rate: 0
- Additional tokens: `<user>`, `<assistant>`, `<context>`, `<system_prompt>`
- Vocab size: 256k + 4
- Global batch size: 1024
- Micro batch size: 4
- Gradient accumulation steps: 32
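The batch-size figures above are internally consistent, as a quick check shows; note that the data-parallel world size is inferred from the other numbers, not stated in the card:

```python
# Sanity-check the batch-size arithmetic from the training setup above.
micro_batch_size = 4
gradient_accumulation_steps = 32
global_batch_size = 1024

# Samples contributed by one data-parallel rank per optimizer step:
samples_per_rank_per_step = micro_batch_size * gradient_accumulation_steps  # 4 * 32 = 128

# Implied number of data-parallel ranks (inferred, not stated in the card):
inferred_world_size = global_batch_size // samples_per_rank_per_step        # 1024 / 128 = 8

print(samples_per_rank_per_step, inferred_world_size)
```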
## Inference Example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_name = "bharatgenai/AgriParam"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=False)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16 if torch.cuda.is_available() else torch.float32,
    device_map="auto"
)

# Example agricultural query
user_input = "What are the best practices for organic wheat farming in Uttar Pradesh?"

# Three prompt formats are supported:
# 1. Generic QA:        <user> ... <assistant>
# 2. Context-based QA:  <context> ... <user> ... <assistant>
# 3. Multi-turn conversation (up to 5 turns):
#    <user> ... <assistant> ... <user> ... <assistant>
# Pick the format that matches your use case (see the commented examples below).
prompt = f"<user> {user_input} <assistant>"
# prompt = f"<context> {user_context} <user> {user_input} <assistant>"
# prompt = f"<user> {question_1} <assistant> {answer_1} <user> {question_2} <assistant>"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=300,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        temperature=0.6,
        eos_token_id=tokenizer.eos_token_id,
        use_cache=False
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
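The three prompt formats can be wrapped in small helpers. A minimal sketch — the helper names are illustrative, and the exact whitespace around the special tokens is an assumption based on the example prompt above:

```python
# Small helpers for the three prompt formats. The special tokens
# (<user>, <assistant>, <context>) come from the model card; the helper
# names and exact spacing are assumptions.
def generic_qa(question: str) -> str:
    return f"<user> {question} <assistant>"

def context_qa(context: str, question: str) -> str:
    return f"<context> {context} <user> {question} <assistant>"

def multi_turn(turns) -> str:
    # turns: list of (user_message, assistant_reply) pairs; pass None as the
    # reply for the final, unanswered user message (max 5 turns per the card).
    parts = []
    for user_msg, reply in turns:
        parts.append(f"<user> {user_msg} <assistant>")
        if reply is not None:
            parts.append(f" {reply} ")
    return "".join(parts).rstrip()

prompt = multi_turn([
    ("Which rabi crops suit sandy loam soils?", "Wheat, mustard and chickpea do well."),
    ("What sowing window would you suggest?", None),
])
print(prompt)
```

The string returned by any of the helpers can be passed to `tokenizer(...)` exactly as in the inference example above.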
## Evaluation

AgriParam was evaluated along four dimensions:

- Crop-specific Q&A
- Policy & scheme awareness
- Rural advisory & extension services
- Bilingual (English/Hindi) capability
### BhashaBench-Krishi (BBK)
| Model | BBK | BBK_English | BBK_Hindi |
|---|---|---|---|
| Llama-3.2-1B | 28.91 | 29.71 | 25.21 |
| Llama-3.2-1B-Instruct | 28.65 | 29.16 | 26.33 |
| Llama-3.2-3B | 31.96 | 32.68 | 28.69 |
| granite-3.1-3b-a800m-base | 32.17 | 33.36 | 26.70 |
| sarvam-2b-v0.5 | 27.68 | 28.14 | 25.57 |
| sarvam-1 | 30.24 | 30.82 | 27.57 |
| AgriParam | 32.18 | 33.10 | 27.97 |
### Subject Domain Performance
| Subject Domain | Llama-3.2-1B | Llama-3.2-1B-Instruct | Llama-3.2-3B | granite-3.1-3b-a800m-base | sarvam-2b-v0.5 | sarvam-1 | AgriParam |
|---|---|---|---|---|---|---|---|
| Agri-Environmental & Allied Disciplines | 31.82 | 32.95 | 25.00 | 36.93 | 29.55 | 30.11 | 27.27 |
| Agricultural Biotechnology | 31.11 | 28.63 | 34.35 | 43.13 | 30.34 | 36.64 | 36.64 |
| Agricultural Chemistry & Biochemistry | 27.05 | 22.78 | 31.32 | 35.94 | 27.05 | 34.52 | 34.16 |
| Agricultural Economics & Policy | 29.98 | 25.52 | 35.09 | 34.77 | 27.75 | 30.78 | 32.54 |
| Agricultural Engineering & Technology | 27.46 | 26.23 | 32.79 | 30.33 | 27.46 | 29.51 | 27.87 |
| Agricultural Extension Education | 30.88 | 29.46 | 32.30 | 29.84 | 28.17 | 29.97 | 34.50 |
| Agricultural Microbiology | 34.23 | 36.04 | 31.53 | 34.23 | 17.12 | 26.13 | 34.23 |
| Agriculture Communication | 33.07 | 28.35 | 29.53 | 34.25 | 25.59 | 33.07 | 32.68 |
| Agriculture Information Technology | 30.53 | 31.58 | 44.21 | 36.84 | 27.89 | 32.11 | 27.89 |
| Agronomy | 27.92 | 28.77 | 31.84 | 31.51 | 28.67 | 29.60 | 32.49 |
| Animal Sciences | 25.68 | 34.46 | 36.49 | 37.84 | 35.14 | 29.05 | 40.54 |
| Crop Sciences | 31.15 | 26.41 | 29.87 | 35.15 | 26.59 | 29.33 | 32.42 |
| Dairy & Poultry Science | 35.96 | 31.46 | 30.34 | 44.94 | 33.71 | 32.58 | 29.21 |
| Entomology | 29.02 | 27.59 | 35.49 | 29.31 | 27.59 | 27.87 | 31.75 |
| Fisheries and Aquaculture | 29.41 | 41.18 | 38.24 | 26.47 | 20.59 | 14.71 | 23.53 |
| General Knowledge & Reasoning | 28.44 | 27.53 | 33.13 | 32.38 | 26.17 | 30.56 | 31.92 |
| Genetics and Plant Breeding | 30.59 | 30.08 | 28.02 | 29.05 | 26.99 | 31.62 | 29.82 |
| Horticulture | 27.05 | 28.60 | 31.21 | 32.17 | 27.00 | 29.76 | 31.40 |
| Natural Resource Management | 28.50 | 26.42 | 29.02 | 32.64 | 26.42 | 26.94 | 27.46 |
| Nematology | 22.83 | 28.26 | 28.26 | 27.17 | 21.20 | 24.46 | 23.91 |
| Plant Pathology | 28.97 | 30.48 | 27.96 | 29.97 | 25.44 | 33.50 | 25.44 |
| Plant Sciences & Physiology | 28.68 | 31.78 | 37.98 | 26.36 | 20.93 | 30.23 | 31.01 |
| Seed Science and Technology | 29.70 | 28.71 | 27.72 | 29.21 | 29.70 | 34.65 | 27.23 |
| Soil Science | 31.25 | 29.92 | 31.69 | 29.99 | 27.49 | 30.21 | 34.93 |
| Veterinary Sciences | 27.08 | 14.58 | 37.50 | 39.58 | 20.83 | 41.67 | 43.75 |
### Question-Level Difficulty
| Difficulty | Llama-3.2-1B | Llama-3.2-1B-Instruct | Llama-3.2-3B | granite-3.1-3b-a800m-base | sarvam-2b-v0.5 | sarvam-1 | AgriParam |
|---|---|---|---|---|---|---|---|
| Easy | 29.43 | 30.22 | 36.44 | 36.08 | 28.26 | 32.20 | 36.94 |
| Hard | 27.72 | 26.37 | 25.61 | 26.02 | 28.01 | 27.54 | 25.91 |
| Medium | 28.68 | 27.69 | 29.17 | 29.88 | 27.03 | 28.99 | 29.09 |