Smart Core, Simple Form
🎉 Coming Soon to Mac App Store!
Fully compliant with Apple's security & privacy standards. Currently under review. Your support and stars ⭐️ keep us going!
WitNote is a local-first AI writing companion for macOS, Windows, and Linux. It lets you switch freely between Ollama, WebLLM, and Cloud API engines, pairs them with an ultra-minimalist native card interface, and works right out of the box. No continuous cloud dependency, no privacy concerns: intelligence made lightweight.
- Smart: Three engines in one, freedom to choose
  - WebLLM: Lightweight model; downloads on first run, then works offline
  - Ollama: Powerful local models, strong performance, completely offline
  - Cloud API: Connect to cloud intelligence, infinite possibilities
- Simple: No complexity
  - iOS-style card management, drag to organize
  - Smart focus mode: when the window narrows, the editor simplifies
- Secure: Data sovereignty
  - Privacy Policy: 100% local storage. Your thoughts belong only to you.
- ✨ Enhanced Autocomplete — 3 selectable levels (Lite/Standard/Full) for different model sizes.
- ⌨️ Smart Tab — Press Tab to accept sentence by sentence, or segments, for precise control.
- 🛠️ Customizable Prompts — Edit system prompts directly in settings with a one-click restore option.
- 🌓 Dual-Pane Preview — WYSIWYG, edit on the left, real-time preview on the right
- ⚡️ Quick Edit — Enhanced floating menu for styling selected text
- 📝 Pure Local Notes — Choose any folder as your notes vault; supports `.txt` and `.md`
- 🤖 Three-in-One Engine — Switch freely between WebLLM (Light), Ollama (Local Power), or Cloud API (Custom Connection)
- 🎭 Rich Role Library — Built-in 10+ selected role prompts (Writer, Translator, Polisher, etc.), one-click switch & custom support
- 🌍 Global Communication — Supports 8 Languages:
- English, 简体中文, 繁體中文, 日本語, 한국어, Français, Deutsch, Español
- Interface and AI responses automatically adapt to your language
- 🔒 Privacy First — All AI inference is local (when using local engines), no data upload, Apple Notarized
- 💬 Highly Customizable — Freely edit system prompts to craft your exclusive AI assistant
- 🎨 Multiple Themes — Light / Dark / Zen Tea, fully optimized dark mode
- 🗂️ Card Grid View — iOS-style with drag-and-drop sorting, polished context menus
- 🔍 Context Aware — AI can directly read your current article or folder contents
- 🎯 Focus Mode — Auto-switches to distraction-free editing when window narrows
Download the latest installer from Releases:
| Platform | File | Note |
|---|---|---|
| 🍎 macOS | WitNote-1.3.2.dmg | Apple Silicon (M1/M2/M3/M4/M5) only |
| 🪟 Windows (x64) | WitNote-1.3.2-setup-x64.exe | Standard PC (Intel/AMD) |
| 🪟 Windows (ARM64) | WitNote-1.3.2-setup-arm64.exe | Snapdragon PCs (e.g. Surface Pro X) |
| 🐧 Linux (AppImage) | WitNote-1.3.2-x86_64.AppImage | x64 universal (ARM64 available) |
| 📦 Linux (Deb) | WitNote-1.3.2-amd64.deb | Ubuntu/Debian x64 (ARM64 available) |
| Item | Minimum | Recommended |
|---|---|---|
| OS Version | macOS 12.0+ | macOS 13.0+ |
| Chip | Apple Silicon (Intel chips not supported) | Apple Silicon (M1/M2/M3/M4/M5) |
| RAM | - | 16GB+ |
| Storage | - | SSD, 4GB+ free space |
❌ Important Note for Intel Macs:
This application does not support Mac computers with Intel chips. Even if you force it to run, the experience will be extremely poor, for the following reasons:
- Architectural Incompatibility: The built-in local inference engines (WebLLM/Ollama) deeply rely on the ARM64 architecture and NPU/Metal hardware acceleration of Apple Silicon.
- Lack of Hardware Acceleration: Intel Macs lack Unified Memory Architecture. Running quantized models is extremely slow (generating a single token may take seconds) and causes severe device heating.
- Architectural Trade-off: To ensure the best experience and minimal package size, we have removed support for the x86_64 architecture.
We strongly recommend using Mac devices equipped with Apple Silicon (M-series) chips.
| Item | Minimum | Recommended |
|---|---|---|
| OS Version | Windows 10 (64-bit) | Windows 11 |
| Processor | Intel Core i5 / AMD Ryzen 5 | Intel Core i7 / AMD Ryzen 7 |
| RAM | 8GB | 16GB+ |
| Storage | 2GB free space | SSD, 4GB+ free space |
| GPU | Integrated graphics | Discrete GPU with Vulkan support |
⚠️ Note: Windows ARM64 devices (e.g. Surface Pro X) are now natively supported!
| Item | Minimum | Recommended |
|---|---|---|
| OS Version | Ubuntu 20.04+ / Debian 11+ | Latest Mainstream Distro |
| Arch | x64 / ARM64 | x64 / ARM64 |
| RAM | 8GB | 16GB+ |
⚠️ Note: The Linux version is newly released. Feedback welcome!
- Download the `.dmg` file
- Double-click to open the DMG
- Drag the app to the Applications folder
- Launch from Applications
🎉 Great News!
This app is now Apple Notarized! No more "unverified developer" warnings!
😅
The developer bravely took out a loan to afford the $99 Apple Developer account... *(Yes, this actually happened. Thanks to all users for your support!)*
- Download the `.exe` installer
- Run the setup wizard
- Choose the installation path (customizable)
- Complete installation, then launch from the Desktop or Start Menu
AppImage (Universal):
- Download the `.AppImage` file
- Right-click Properties -> Allow executing file as program (or run `chmod +x WitNote*.AppImage`)
- Double-click to run
Deb (Ubuntu/Debian):
- Download the `.deb` file (e.g., `WitNote-1.3.2-amd64.deb`)
- Install via terminal (dependencies are handled automatically):

```bash
sudo apt install ./WitNote-1.3.2-amd64.deb
```
📝 Important Notes for Windows Users:
Because this app is built by an individual developer without an expensive EV Code Signing Certificate, you might encounter the following:
- SmartScreen: If you see "Windows protected your PC" (Unknown Publisher), please click "More info" -> "Run anyway".
- Antivirus Warning: Windows Defender or other AV software might flag the installer. The project is open-source and safe. If the installer is blocked, please try adding an exclusion or temporarily disabling the AV software.
The app includes a built-in WebLLM engine with the qwen2.5:0.5b model (macOS only; Windows users are recommended to use Ollama instead).
- Pros: No extra software installation needed, works completely offline after initial model download.
- Best for: Quick Q&A, simple text polishing, low-end devices.
Supports connecting to a locally running Ollama service.
- Pros: Runs 7B, 14B, or even larger models; powerful performance, completely offline.
- Usage: Install Ollama first, then download more models in Settings (e.g., qwen2.5:7b, llama3, etc.).
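As a sketch, a typical Ollama setup looks like this (the install command is the macOS Homebrew route; see ollama.com for other platforms, and treat the model name as an example):

```bash
# Install Ollama (macOS via Homebrew; Linux/Windows installers are on ollama.com)
brew install ollama

# Start the local service (listens on http://localhost:11434 by default)
ollama serve

# In another terminal, pull a model for WitNote to use
ollama pull qwen2.5:7b
```

Once the service is running and a model is pulled, it should appear as a selectable local engine in WitNote's settings.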
Supports connecting to OpenAI-compatible Cloud APIs.
- Pros: Access the most powerful models on Earth with just an API Key.
- Best for: Top-tier logical reasoning, or when local hardware cannot support large models.
- Config: Enter API URL and Key in Settings (Supports OpenAI, Gemini, DeepSeek, Moonshot, etc).
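For reference, the OpenAI-compatible request shape that this engine targets looks like the following (the URL, model name, and key are placeholders, not values shipped with the app):

```bash
# Substitute your provider's base URL and your own API key
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $YOUR_API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Polish this sentence."}]
  }'
```

Any provider exposing this `/v1/chat/completions` shape should work with the same two settings fields (API URL and Key).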
```bash
# Clone the repository
git clone https://github.com/hooosberg/WitNote.git
cd WitNote

# Install dependencies
npm install

# Start development server
npm run dev

# Build macOS version
npm run build

# Build Windows version
npm run build -- --win
```

MIT License
hooosberg
🔗 https://github.com/hooosberg/WitNote
Smart Core, Simple Form