# Lumi Mobile - On-Device AI Assistant
Fine-tuned language model for classifying user instructions into tasks, notes, and reflections. Optimized for mobile deployment with PyTorch Mobile.
## What it does
Converts natural language into structured data (a client-side parsing sketch follows the examples):

- Tasks: "call mom tomorrow" → `{"task": "call mom tomorrow"}`
- Notes: "this book is great" → `{"note": "this book is great", "tag": "personal"}`
- Reflections: "feeling grateful today" → `{"reflection": "feeling grateful today"}`
## Mobile Usage

### Download Files
```javascript
const modelUrl = 'https://huggingface.co/yourusername/lumi-mobile/resolve/main/lumi_mobile.ptl';
const vocabUrl = 'https://huggingface.co/yourusername/lumi-mobile/resolve/main/vocab.json';
```
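Below is a minimal sketch of pulling both files onto the device using the URLs above. It assumes the `MobileModel.download` helper from react-native-pytorch-core (installed in the next section) is available in your version of the library, and that `vocab.json` is a flat token-to-id map; any other file-download utility works just as well.

```typescript
import { MobileModel } from 'react-native-pytorch-core';

// Download and cache the .ptl model (returns a local file path) and fetch the
// vocabulary as ordinary JSON. Assumption: MobileModel.download exists in your
// react-native-pytorch-core version; swap in your own downloader if it does not.
export async function fetchAssets(
  modelUrl: string,
  vocabUrl: string,
): Promise<{ modelPath: string; vocab: Record<string, number> }> {
  const modelPath = await MobileModel.download(modelUrl);
  const response = await fetch(vocabUrl);
  const vocab = (await response.json()) as Record<string, number>;
  return { modelPath, vocab };
}
```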
### React Native Integration

```bash
npm install react-native-pytorch-core
```
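Once the package is installed, loading the .ptl file and generating the JSON output could look roughly like the sketch below. It assumes the `torch.jit._loadForMobile` API exposed by recent react-native-pytorch-core releases; `encodeText`, `pickNextToken`, and `decodeIds` are hypothetical helpers standing in for the tokenizer built from `vocab.json` and for logits post-processing, neither of which is defined by this model card.

```typescript
import { torch } from 'react-native-pytorch-core';

// Hypothetical helpers: the real tokenization and logits handling depend on
// vocab.json and the model's generation settings, which are not defined here.
declare function encodeText(text: string, vocab: Record<string, number>): number[];
declare function pickNextToken(logits: unknown): number;
declare function decodeIds(ids: number[], vocab: Record<string, number>): string;

const EOS_TOKEN_ID = 2; // placeholder; use the real end-of-sequence id from the tokenizer

export async function generateJson(
  modelPath: string,
  vocab: Record<string, number>,
  instruction: string,
  maxNewTokens = 64,
): Promise<string> {
  // Load the TorchScript Lite (.ptl) module on the device.
  const model = await torch.jit._loadForMobile(modelPath);

  const promptIds = encodeText(instruction, vocab);
  const generated: number[] = [];

  // Greedy decoding: run a forward pass over prompt + generated tokens, pick
  // the most likely next token, and repeat until EOS or the token budget.
  for (let step = 0; step < maxNewTokens; step++) {
    const inputTensor = torch.tensor([[...promptIds, ...generated]], {
      dtype: torch.int64,
    });
    const logits = await model.forward(inputTensor);
    const nextId = pickNextToken(logits);
    if (nextId === EOS_TOKEN_ID) break;
    generated.push(nextId);
  }

  return decodeIds(generated, vocab);
}
```

Re-running the full prompt on each step is the simplest approach for a sketch; a production app would typically cache intermediate state, which is beyond the scope of this example.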
## Files
- `lumi_mobile.ptl` - PyTorch Mobile model (350M params)
- `vocab.json` - Vocabulary for tokenization (see the loader sketch below)
- `mobile_config.json` - Mobile settings
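If you need a starting point for the `encodeText`/`decodeIds` helpers used above, the sketch below shows one way to build them from `vocab.json`, assuming it is a flat token-to-id map. The whitespace split is only a placeholder; use the tokenizer the base model was actually trained with.

```typescript
type Vocab = Record<string, number>; // assumed shape of vocab.json

function buildCodec(vocab: Vocab) {
  // Reverse map for turning generated ids back into text.
  const idToToken = new Map(
    Object.entries(vocab).map(([tok, id]) => [id, tok] as [number, string]),
  );

  // Placeholder tokenizer: lowercase + whitespace split, unknown tokens -> 0.
  const encodeText = (text: string): number[] =>
    text
      .toLowerCase()
      .split(/\s+/)
      .filter(Boolean)
      .map((tok) => vocab[tok] ?? 0);

  const decodeIds = (ids: number[]): string =>
    ids
      .map((id) => idToToken.get(id) ?? '')
      .join(' ')
      .trim();

  return { encodeText, decodeIds };
}
```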
## Features

- Runs completely offline
- No server required
- Privacy-first (data never leaves the device)
- ~700MB model size
## Base Model

Fine-tuned from `unsloth/LFM2-350M`.

Built for on-device AI inference.