Changelog

All major updates and releases.

April 14, 2025

Beta Patch #4

v1.2.1
  • Added checkpoints that let you jump between LLM edits.
  • Agent mode can now fix lint errors after editing files.
  • Upgraded tool-calling implementation; any model can now run in Agent mode (R1, Gemma 3, Quasar Alpha, etc.).
  • Dynamic context squashing when the context window is not sufficiently large.
  • Added SSH support, WSL support, and Linux support.
  • Void now comes with auto-updates, so we'll be pushing smaller and more frequent changes.
  • GPT 4.1, Gemini 2.5 Pro, OpenHands LM, DeepSeek V3, Phi4, and Quasar Alpha support.

April 7, 2025

Beta Patch #3

v1.0.3
  • Experimental version of Void pushed to our Discord members.
  • Initial version of v1.2.1 above.

March 22, 2025

Beta Patch #2

v1.0.2
  • Added Agent mode, Gather mode, and Chat mode.
  • Agent mode can control your terminal, read/write files, and search your codebase.
  • Chat and Gather mode now make suggestions to edit specific files.
  • Void auto-detects when a model supports tools (Agent/Gather), FIM (Autocomplete), and thinking (Reasoning).
  • New slider to set a model's reasoning level preference.
  • Autocomplete has been re-enabled, and you can use it with any FIM model.
  • Added a Fast Apply option, enabled by default.
  • Chat mode now creates links to symbols in your code.
  • Rebase from VSCode 1.99.0.
  • Claude 3.7, DeepSeek V3, Gemini 2.0, and QwQ support.

January 23, 2025

Beta Patch #1

v1.0.1
  • New default theme.
  • Added support for DeepSeek.
  • Fixed system prompt errors when using o1 and Gemini.
  • Improved prompts for fast-apply and quick edit.
  • Performance improvements for chat.
  • Temporarily disabled autocomplete (needs better model-selection guardrails).
  • Minor updates to diff streaming algorithm and FIM output parsing algorithm.

January 19, 2025

Beta Release

v1.0.0
  • Added quick edits (Ctrl+K)! This includes FIM-prompting, output parsing, and history management.
  • Added autocomplete!
  • Void now lives natively in the VSCode codebase; no more extension API.
  • Added new UI to smoothly show the LLM's streamed changes in VSCode.
  • New settings page with one-click switch, model selection, and even more providers.
  • Added auto-detection of Ollama models by default.
  • Fixed CORS and CSP errors for many local models by originating LLM calls from 'node/' with IPC.
  • Native UI for Accept/Reject buttons, streaming, and interruptions.
  • File suggestions in the chat based on history.
  • Switched from the MIT License to the Apache 2.0 License for a little more protection on our OSS initiative.

October 1, 2024

Early Launch

  • Initialized Void's website and GitHub repo.
  • A lot of early setup work not recorded in this changelog.
  • Basic features like LLM streaming in the editor, custom history, and a custom editor UI.
  • Users could build Void to try an early version of Ctrl+L and fast-apply.