A real AI studio with image generation, editing, and agentic coding — versus a chat app. Both free, both local.
MLX Studio is a complete AI studio — generate images, edit images, chat, and write code with 20+ agentic tools. LM Studio is a solid chat and API app with cross-platform support. MLX Studio runs natively on Apple Silicon via MLX with prefix caching, paged KV cache, and continuous batching. LM Studio uses llama.cpp and runs on Windows, Linux, and macOS. Both are free.
| Feature | MLX Studio | LM Studio |
|---|---|---|
| Image Generation | Flux Schnell, Dev, Kontext, Z-Image, Klein | No |
| Image Editing | Qwen Image Edit, Flux Fill, Kontext | No |
| Agentic Coding Tools | 20+ built-in via MCP | None |
| MCP (Model Context Protocol) | Native + external server support | No |
| Framework | MLX / vMLX (Apple-native) | llama.cpp |
| Prefix Caching | Yes | No |
| Paged KV Cache | Multi-context, persistent | Single-slot (evicts on switch) |
| KV Cache Quantization | q4 / q8 | No |
| Continuous Batching | Up to 256 sequences | Limited |
| Persistent Disk Cache | Yes | No |
| JANG Mixed-Precision Quantization | Yes — 74% MMLU on 230B at 2-bit (82.5 GB) vs MLX 4-bit 26.5% (119.8 GB) | No (standard GGUF only) |
| Speculative Decoding | 20–90% faster generation | No |
| API Endpoints | 11 (Anthropic + OpenAI-compatible) | 1 (OpenAI-compatible) |
| Vision Models + Full Cache | Yes | Partial |
| Mamba / SSM / Hybrid Support | Nemotron-H, Jamba, GatedDeltaNet | No |
| Voice Chat | Kokoro TTS + Whisper STT | No |
| Model Converter | JANG + standard + GGUF-to-MLX | GGUF download only |
| Platform | macOS (Apple Silicon) | macOS, Windows, Linux |
| Price | Free | Free (Pro: $7.99/mo) |
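The speculative-decoding row above refers to draft-then-verify generation: a small draft model proposes several tokens cheaply, and the large model validates them in a single forward pass, so accepted tokens cost far less than generating each one from scratch. Here is a minimal sketch of the accept loop with a toy verifier — not MLX Studio's implementation; real speculative decoding also resamples a corrected token from the large model when a draft token is rejected:

```python
def speculative_step(draft_tokens, verify):
    """Draft-then-verify: keep the longest prefix of draft_tokens
    that the (large-model) verifier accepts, stopping at first reject."""
    accepted = []
    for t in draft_tokens:
        if verify(accepted, t):
            accepted.append(t)
        else:
            break
    return accepted

# Toy verifier standing in for the large model: accept even token IDs.
keep = speculative_step([2, 4, 7, 8], lambda ctx, t: t % 2 == 0)
print(keep)  # → [2, 4]
```

The speedup comes from the verifier checking all drafts in one batched pass instead of one pass per token; acceptance rate determines where in the quoted 20–90% range a given model pair lands.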
MLX Studio is not just a chat app — it is a full creative studio. Generate images locally with Flux Schnell (fast), Flux Dev (quality), Z-Image Turbo, and Klein. Edit existing images with Qwen Image Edit, Flux Fill (inpainting/outpainting), and Flux Kontext (style transfer and character consistency).
LM Studio does not include any image generation or editing capabilities. It is designed exclusively for text-based LLM inference. If you need visual AI workflows on your Mac, MLX Studio is the only local option with a complete image pipeline.
MLX Studio includes 20+ built-in agentic coding tools through native MCP (Model Context Protocol) integration. Models can autonomously read, write, and edit files, execute shell commands, search the web, and run multi-step workflows — all locally with zero cloud dependency. LM Studio has no built-in tool support.
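MCP frames tool calls as JSON-RPC 2.0 messages using a `tools/call` method, with the tool's name and arguments in the params. A minimal sketch of that message shape — the `read_file` tool name and its `path` argument are hypothetical stand-ins, not MLX Studio's actual tool set:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, the message shape
    MCP uses to ask a server to invoke a named tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical file-reading tool; real names depend on the MCP server.
msg = make_tool_call(1, "read_file", {"path": "src/main.py"})
print(msg)
```

Because the framing is plain JSON-RPC, any external MCP server that speaks this shape can expose additional tools to the model alongside the built-in ones.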
MLX Studio runs on the vMLX engine — purpose-built for Apple Silicon using Apple's MLX framework. It features a 5-layer caching stack that delivers dramatically faster performance than llama.cpp-based apps at long contexts.
LM Studio uses llama.cpp with a single-slot KV cache. Switching conversations evicts cached state and requires full re-processing. No prefix caching, no cache quantization, no persistent disk cache.
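Prefix caching is the key difference here: when a new request shares a prompt prefix with an earlier one (a system prompt, prior conversation turns), only the unseen suffix needs a forward pass. A minimal sketch of the lookup, with a plain dict and toy token IDs standing in for real KV tensors — an illustration of the idea, not MLX Studio's actual data structures:

```python
def longest_cached_prefix(cache: dict, tokens: list) -> int:
    """Return the length of the longest prompt prefix whose KV state
    is already cached, so only the remaining tokens are recomputed."""
    for n in range(len(tokens), 0, -1):
        if tuple(tokens[:n]) in cache:
            return n
    return 0

cache = {}
system_prompt = [101, 7, 42, 9]           # toy token IDs for a shared system prompt
cache[tuple(system_prompt)] = "kv-state"  # stand-in for cached KV tensors

request = system_prompt + [55, 63]        # new request reusing the same prefix
hit = longest_cached_prefix(cache, request)
print(f"reuse {hit} cached tokens, recompute {len(request) - hit}")
# → reuse 4 cached tokens, recompute 2
```

With a single-slot cache, switching conversations discards this state entirely, so the whole prompt is reprocessed; a multi-context, persistent cache keeps each conversation's prefix warm.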
MLX Studio includes a built-in model converter with JANG mixed-precision quantization — a technique that assigns different bit widths to different layers based on their sensitivity. This preserves model quality at aggressive compression levels that standard uniform quantization cannot match.
The result: 74% MMLU on a 230B model at 2-bit (82.5 GB), versus 26.5% for standard MLX 4-bit (119.8 GB), and 86% MMLU on a 122B model at 4-bit. LM Studio relies on pre-quantized GGUF models with standard uniform quantization, offering no built-in conversion or mixed-precision capability.
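The core idea of mixed precision can be sketched in a few lines: rank layers by a sensitivity score and give the most sensitive ones more bits. Everything below — the layer names, parameter counts, sensitivity scores, and the keep-the-top-fraction heuristic — is invented for illustration; JANG's actual sensitivity measurement and bit-allocation algorithm are not shown here:

```python
def mixed_precision_plan(layer_params, sensitivity,
                         hi_bits=8, lo_bits=2, hi_fraction=0.25):
    """Toy plan: the most sensitive hi_fraction of layers keep hi_bits;
    every other layer is compressed to lo_bits."""
    ranked = sorted(sensitivity, key=sensitivity.get, reverse=True)
    keep_hi = set(ranked[:max(1, int(len(ranked) * hi_fraction))])
    return {name: (hi_bits if name in keep_hi else lo_bits)
            for name in layer_params}

# Invented layer sizes (parameter counts) and sensitivity scores.
layer_params = {"embed": 50_000_000, "attn.0": 30_000_000,
                "mlp.0": 80_000_000, "lm_head": 50_000_000}
sensitivity = {"embed": 0.9, "attn.0": 0.3, "mlp.0": 0.1, "lm_head": 0.8}

plan = mixed_precision_plan(layer_params, sensitivity, hi_fraction=0.5)
size_gb = sum(layer_params[n] * bits for n, bits in plan.items()) / 8 / 1e9
print(plan, f"{size_gb:.4f} GB")
```

Because the bulk of the parameters can sit at the low bit width while the few sensitive layers stay wide, the average bits per weight (and hence file size) lands well below uniform quantization at the same quality.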
LM Studio is a solid app and the right choice in certain scenarios, most notably cross-platform support: it runs on Windows and Linux, which MLX Studio does not. If you are on a Mac with Apple Silicon and want a complete AI studio — generate images, edit images, chat with agentic tools, and serve APIs — MLX Studio is the clear choice.
Generate images. Edit images. Chat. Code with 20+ agentic tools. All local, all free.
Download MLX Studio
Free · macOS 15+ · Apple Silicon (M1 or later) · Code-signed & notarized