strudel-coder-0.5B-ONNX

ONNX export of amhinson/strudel-coder-0.5B for use with transformers.js.

This is a fine-tuned Qwen2.5-Coder-0.5B-Instruct model specialized for Strudel REPL live coding — a browser-based music environment that ports TidalCycles' pattern language to JavaScript.
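To give a sense of the model's target output, here is a small illustrative pattern in Strudel's style (hand-written for this card, not a sample of the model's output): a four-on-the-floor kick layered with offbeat hi-hats, using the $: multi-pattern prefix, mini-notation, and .bank():

$: s("bd*4").bank("RolandTR909")
$: s("~ hh ~ hh").bank("RolandTR909").gain(0.7)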

Usage with transformers.js

Install Transformers.js from NPM:

npm i @huggingface/transformers

Then load the model and generate Strudel code:

import { pipeline } from "@huggingface/transformers";

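// Load the ONNX model with 4-bit quantized weights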
const generator = await pipeline(
  "text-generation",
  "amhinson/strudel-coder-0.5B-ONNX",
  { dtype: "q4" }
);

const messages = [
  {
    role: "system",
    content: `You are a Strudel live coding assistant. Strudel is a music live coding environment for the browser that ports the TidalCycles pattern language to JavaScript. You help write and modify Strudel REPL code.

Strudel uses:
- Mini-notation for rhythmic patterns in double-quoted strings (spaces for sequences, [] for sub-sequences, <> for alternation, * for speed, ~ for rests, , for parallel patterns)
- Euclidean rhythms like (3,8) to distribute beats evenly
- Chained functions: .s() for sound source, .note() for pitch, .gain() for volume, .lpf()/.hpf() for filters, .delay(), .room(), .pan(), .vowel(), .speed(), .rev(), .every(), .sometimes()
- .bank() to select drum machine sample banks (e.g., RolandTR909, RolandTR808)
- $: prefix for multi-pattern composition (each $: line is an independent pattern)
- n() to select sample variations, stack() to layer patterns
- Signals like sine, saw, rand for continuous modulation with .range()

Given an instruction and the current code context, produce the complete updated code.`
  },
  {
    role: "user",
    content: 'Instruction: add a hi-hat pattern\nContext:\n$: s("bd(5,8)").bank("RolandTR909")'
  }
];

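// Greedy decoding (no sampling) for deterministic output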
const output = await generator(messages, {
  max_new_tokens: 256,
  do_sample: false
});

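// generated_text holds the full chat history; the last entry is the model's reply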
console.log(output[0].generated_text.at(-1).content);
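For the prompt above, the model should return the complete updated code, i.e. the original kick line plus a new hi-hat pattern. A hypothetical shape of that output (the actual generation may differ):

$: s("bd(5,8)").bank("RolandTR909")
$: s("hh*8").bank("RolandTR909").gain(0.6)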

Training

Fine-tuned with QLoRA (rank 64, alpha 128) on 2000 Strudel REPL examples for 4 epochs. Base model: Qwen/Qwen2.5-Coder-0.5B-Instruct.
