🌟 Lumi Mobile - On-Device AI Assistant

Fine-tuned language model for classifying user instructions into tasks, notes, and reflections. Optimized for mobile deployment with PyTorch Mobile.

🚀 What it does

Converts natural language into structured data:

  • Tasks: "call mom tomorrow" → {"task": "call mom tomorrow"}
  • Notes: "this book is great" → {"note": "this book is great", "tag": "personal"}
  • Reflections: "feeling grateful today" → {"reflection": "feeling grateful today"}
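The three output shapes above can be sketched as a small helper. This is illustrative only: the model predicts the label, and `toStructured` (a hypothetical name, not part of this repo) just assembles the corresponding JSON:

```javascript
// Illustrative: map a predicted label + input text to the structured
// shapes shown above. The model supplies the label; this helper only
// builds the JSON object.
function toStructured(label, text, tag) {
  switch (label) {
    case 'task':
      return { task: text };
    case 'note':
      return tag ? { note: text, tag } : { note: text };
    case 'reflection':
      return { reflection: text };
    default:
      throw new Error(`Unknown label: ${label}`);
  }
}

console.log(JSON.stringify(toStructured('task', 'call mom tomorrow')));
// → {"task":"call mom tomorrow"}
```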

📱 Mobile Usage

Download Files

```javascript
const modelUrl = 'https://huggingface.co/Taru/lumi-mobile/resolve/main/lumi_mobile.ptl';
const vocabUrl = 'https://huggingface.co/Taru/lumi-mobile/resolve/main/vocab.json';
```

React Native Integration

```shell
npm install react-native-pytorch-core
```

📦 Files

  • lumi_mobile.ptl - PyTorch Mobile model (350M params)
  • vocab.json - Vocabulary for tokenization
  • mobile_config.json - Mobile settings
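On-device inference must tokenize input exactly as it was tokenized at training time, using the mapping in `vocab.json`. The sketch below assumes a simple `{token: id}` vocabulary with an `<unk>` fallback; the inline vocab and the `tokenize` helper are illustrative, not the shipped tokenizer:

```javascript
// Toy whitespace tokenizer over a vocab.json-style {token: id} map.
// The shipped vocab.json is far larger; this inline vocab is a stand-in.
const vocab = { '<unk>': 0, call: 1, mom: 2, tomorrow: 3 };

function tokenize(text, vocab) {
  return text
    .toLowerCase()
    .split(/\s+/)
    .filter(Boolean)               // drop empty strings from extra spaces
    .map((tok) => vocab[tok] ?? vocab['<unk>']); // unknown words → <unk>
}

console.log(tokenize('Call mom tomorrow', vocab));
// → [ 1, 2, 3 ]
```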

🎯 Features

  • ✅ Runs completely offline
  • ✅ No server required
  • ✅ Privacy-first (data never leaves device)
  • ✅ ~700MB model size

🔧 Base Model

Fine-tuned from unsloth/LFM2-350M, which is itself based on LiquidAI/LFM2-350M.


Built for on-device AI inference 📱
