Crustly - TUI AI Coding Assistant in Rust

Hi everyone,

I started Crustly, a new open-source project: a blazingly fast, memory-efficient, terminal-based AI assistant written in Rust. It's a CLI code assistant designed to bring a seamless, local-first coding experience to your terminal, with a TUI interface similar to Claude Code or Crush.

Why this project?

  • Performance: Built in Rust for speed and minimal resource usage.
  • Multi-provider: set up the AI provider of your choice, or run it locally.
  • Privacy: use it with no cloud dependency; everything can run locally with LM Studio.
  • Integration: Works natively in your terminal, alongside your favorite tools.
  • Extensibility: modular design for easy customization and multi-command tools.

Current Features:

  • Context-aware code suggestions, and file writes gated by a permission system.
  • Interactive TUI (powered by [ratatui/cursive/other]).
  • Lightweight, and much more.
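To give a concrete sense of the file-write permission feature above, here is a hypothetical sketch (not Crustly's actual implementation; the `Permission` enum and `can_write` helper are illustrative names): the idea is that every write is checked against a per-session grant before it touches the disk.

```rust
// Hypothetical sketch of a file-access permission gate, not Crustly's
// actual implementation: the assistant checks a per-session permission
// level before any write reaches the disk.
#[derive(Clone, Copy, PartialEq, Debug)]
enum Permission {
    Denied,    // no file access at all
    ReadOnly,  // may read files, never write
    ReadWrite, // the user has explicitly granted writes
}

// Returns true only when the user has granted write access.
fn can_write(perm: Permission) -> bool {
    perm == Permission::ReadWrite
}

fn main() {
    let session = Permission::ReadOnly;
    if can_write(session) {
        println!("writing file...");
    } else {
        println!("write blocked: ask the user for permission first");
    }
}
```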

How to Contribute:

  • Code: PRs are welcome! Check out the open issues for ideas.
  • Feedback: Report bugs or suggest improvements via GitHub issues.
  • Testing: Try it out and share your experience—especially feedback on the TUI/UX!

Hello. Just this morning I woke up thinking that I'd like to put together a TUI-based chat for GenAI, with support for various models (local and remote). But I thought, "I must be the last one to think of this." So I searched and found your post.

I'm looking at Crustly as well as chabeau. A lot of overlap, but not exactly the same. One thing I found in both projects, though: Claude is a main contributor! Can you say more about this?

Maybe, to put it another way, would you say that Crustly is "self-hosted"?

Hi @detro — thanks a lot for checking out Crustly!

Great to hear you were thinking about building something similar. I had the same feeling at the beginning (“surely someone already made this”), but it turns out there’s still a lot of room for experimentation in TUI-based AI tooling.

About “Claude” as contributor

The GitHub activity you’re seeing comes from the fact that I’ve been using Claude Desktop and Cursor / Codestral-powered tools a lot while working on Crustly, so many commits or PRs may show up as “authored with Claude.”
But to be clear:

There is no official involvement from Anthropic (Claude) or any other company.
Crustly is fully independent and community-driven.

It's just me plus whoever decides to contribute. Any AI co-author metadata in the commits is a side effect of the tooling, not a sign of outside involvement.

Is Crustly “self-hosted”?

Yes — that’s exactly the idea.

Crustly is built to be:

Local-first — you can run it entirely offline using LM Studio or any local model that exposes an OpenAI-style API.

Provider-agnostic — if you want to use remote APIs (OpenAI, Anthropic, Groq, etc.), you can configure them, but nothing is hard-coded or required.

Terminal-native — the TUI is meant to make AI assistance feel like another tool in your dev environment, not a separate app.

So in short:

Crustly can be fully self-hosted, fully cloud-based, or a hybrid. It’s up to the user.
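To make "local-first with an OpenAI-style API" concrete: LM Studio's local server accepts the OpenAI chat-completions request format, typically at an endpoint like `http://localhost:1234/v1/chat/completions`, so a client only needs to build a body like the one below and POST it there. A minimal sketch follows; it uses plain string formatting to stay dependency-free, whereas a real client would serialize with a JSON library so the prompt gets properly escaped.

```rust
// Sketch of an OpenAI-style chat-completions request body, the wire
// format that LM Studio's local server (and remote providers) accept.
// Plain string formatting keeps this example free of external crates;
// a real client would use a JSON library to escape the prompt safely.
fn build_chat_request(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{model}","messages":[{{"role":"user","content":"{prompt}"}}],"stream":false}}"#
    )
}

fn main() {
    // "local-model" stands in for whatever model name LM Studio exposes.
    let body = build_chat_request("local-model", "Explain lifetimes in Rust");
    println!("{body}");
}
```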

On overlap with other projects

You’re right — there’s overlap with chabeau and a few other emerging TUI assistants. That’s a good sign, I think. Each project explores slightly different ideas in UX, context management, local model tooling, etc. Crustly’s focus is on:

Speed & low memory usage (Rust)

Simple plug-in architecture for commands

A real TUI conversation flow (Claude-Code–style)

Strict permission system for file read/write
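On the "simple plug-in architecture for commands" point: one natural Rust shape for this is a trait object per command, looked up by name in a registry. This is a hypothetical sketch; the `Command` trait and `dispatch` function are illustrative names, not Crustly's actual API.

```rust
// Hypothetical sketch of a trait-based command plug-in system; the
// `Command` trait and `dispatch` function are illustrative, not
// Crustly's actual API.
trait Command {
    fn name(&self) -> &str;
    fn run(&self, args: &[&str]) -> String;
}

// An example plug-in: echoes its arguments back.
struct Echo;

impl Command for Echo {
    fn name(&self) -> &str {
        "echo"
    }
    fn run(&self, args: &[&str]) -> String {
        args.join(" ")
    }
}

// Looks up a command by name and runs it, if registered.
fn dispatch(commands: &[Box<dyn Command>], name: &str, args: &[&str]) -> Option<String> {
    commands
        .iter()
        .find(|c| c.name() == name)
        .map(|c| c.run(args))
}

fn main() {
    let commands: Vec<Box<dyn Command>> = vec![Box::new(Echo)];
    // prints "hello world"
    println!("{}", dispatch(&commands, "echo", &["hello", "world"]).unwrap());
}
```

The payoff of this shape is that adding a command means writing one new type and registering it, with no changes to the dispatch loop.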

I’d be really happy to exchange ideas or see how our projects can complement each other.

If you do experiment with building your own, or try Crustly, I'd love to hear your thoughts!

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.