Changelog

All major updates and releases.

January 23, 2025

Beta Patch #1

v1.0.1
  • New default theme.
  • Added support for DeepSeek.
  • Fixed system prompt errors when using o1 and Gemini.
  • Improved prompts for fast-apply and quick edit.
  • Performance improvements for chat.
  • Temporarily disabled autocomplete (it needs better model-selection guardrails).
  • Minor updates to diff streaming algorithm and FIM output parsing algorithm.
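The FIM (fill-in-the-middle) prompting and output parsing mentioned above can be sketched roughly as below. This is a minimal illustration, not Void's actual implementation: the `<PRE>`/`<SUF>`/`<MID>`/`<EOT>` tokens follow the CodeLlama convention and are an assumption here; the real tokens depend on the autocomplete model selected.

```typescript
// Sketch of fill-in-the-middle (FIM) prompting and output parsing.
// Token names follow the CodeLlama convention and are illustrative only.

interface FimTokens {
  pre: string;
  suf: string;
  mid: string;
  eot: string; // stop marker the model emits when the completion is done
}

const CODELLAMA_TOKENS: FimTokens = {
  pre: "<PRE>",
  suf: "<SUF>",
  mid: "<MID>",
  eot: "<EOT>",
};

// Build the prompt from the text before and after the cursor.
function buildFimPrompt(prefix: string, suffix: string, t: FimTokens): string {
  return `${t.pre} ${prefix} ${t.suf}${suffix} ${t.mid}`;
}

// Strip the stop token (and anything after it) from the raw completion.
function parseFimOutput(raw: string, t: FimTokens): string {
  const end = raw.indexOf(t.eot);
  return (end === -1 ? raw : raw.slice(0, end)).trimEnd();
}

const prompt = buildFimPrompt("function add(a, b) {", "}", CODELLAMA_TOKENS);
const completion = parseFimOutput("  return a + b;\n<EOT>", CODELLAMA_TOKENS);
```

In practice the parsing side also has to handle streamed partial chunks, where the stop token may arrive split across chunk boundaries.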

January 19, 2025

Beta Release

v1.0.0
  • Added quick edits (Ctrl+K)! This includes FIM-prompting, output parsing, and history management.
  • Added autocomplete!
  • Void now lives natively in the VSCode codebase instead of relying on the extension API.
  • Added new UI to smoothly show the LLM's streamed changes in VSCode.
  • New settings page with one-click switch, model selection, and even more providers.
  • Added auto-detection of Ollama models by default.
  • Fixed CORS and CSP errors for many local models by originating LLM calls from 'node/' with IPC.
  • Native UI for Accept/Reject buttons, streaming, and interruptions.
  • Added file suggestions in the chat based on history.
  • Switched from the MIT License to the Apache 2.0 License for a little more protection for our OSS initiative.
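The CORS/CSP fix above works because requests from the browser-based renderer are subject to cross-origin and content-security policies, while the node-side process is not; routing LLM calls over IPC sidesteps both. A minimal sketch of the routing decision is below; the channel name "void:llm-request" and the helper are hypothetical, for illustration only.

```typescript
// Sketch: route LLM calls for local model servers through the node process.
// Flow: renderer (browser context) -> IPC -> 'node/' process -> model server.

// Fetches to local model servers (e.g. Ollama on its default port) hit
// CORS/CSP walls in the renderer, so those are proxied over IPC instead.
const LOCAL_HOSTS = new Set(["localhost", "127.0.0.1", "0.0.0.0"]);

function needsNodeProxy(endpoint: string): boolean {
  try {
    return LOCAL_HOSTS.has(new URL(endpoint).hostname);
  } catch {
    return false; // malformed URL: let the caller surface the error
  }
}

// Renderer side (pseudocode): ipcRenderer.invoke("void:llm-request", body)
// Node side (pseudocode):     ipcMain.handle("void:llm-request",
//                               (_event, body) => fetch(body.endpoint, ...))
```

The same pattern also keeps provider API keys out of the renderer, since only the node side ever talks to the model server.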

October 1, 2024

Early Launch

  • Initialized Void's website and GitHub repo.
  • A lot of early setup work not recorded in this changelog.
  • Added basic features like LLM streaming in the editor, custom history, and custom editor UI.
  • Users could build Void to try an early version of Ctrl+L and fast-apply.