A powerful, self-hosted Personal Knowledge Management system that rivals Obsidian. Built with Next.js and TypeScript, MarkItUp transforms markdown editing into a comprehensive second brain for organizing, linking, and discovering your thoughts.
- User Authentication & Authorization - Secure multi-user support with JWT sessions
- Unified AI Settings - Centralized configuration across all AI features
- Enhanced Security - Improved CSP headers and Docker/Ollama compatibility
- AI Robustness - Better JSON parsing and proxy diagnostics
- Optional Volume Mounts - Simplified deployment for managed hosting scenarios
Breaking Change: Authentication is now required by default. See docs/MIGRATION_4_0.md for the migration guide, or set `DISABLE_AUTH=true` for single-user mode.
- WYSIWYG Editor - Rich text editing with TipTap
- Intelligent Link Suggester - AI-powered wikilink suggestions with real-time analysis
- Spaced Repetition - Scientific flashcard system with the FSRS algorithm
- Universal AI Support - OpenAI, Anthropic, Gemini, or Ollama (100% local/private)
- Semantic Search - Browser-based ML for meaning-based discovery
- Real-time Collaboration - Multi-user editing with WebSocket sync
See CHANGELOG.md for complete version history.
```shell
git clone https://github.com/xclusive36/MarkItUp.git
cd MarkItUp
npm install
npm run dev
```

Visit http://localhost:3000 and start building your knowledge base!
- Bidirectional Linking - Connect notes with `[[wikilinks]]` and automatic backlinks
- Knowledge Graph - Visual exploration with 3D force-directed layouts
- Advanced Search - Full-text search with operators (`tag:`, `folder:`, exact phrases)
- Semantic Search - AI-powered vector search finds related content by meaning, not just keywords:
  - Browser-Based ML - Client-side embeddings using Transformers.js (no server required!)
  - Similarity Discovery - Find related notes based on semantic similarity
  - Progress Tracking - Real-time indexing with visual progress indicators
  - IndexedDB Storage - Efficient persistent storage for embeddings
  - Fast Search - Sub-100ms queries for typical collections
  - Mode Toggle - Switch between keyword and semantic search instantly
  - 100% Private - All processing happens locally in your browser
  - Configurable - Batch size, auto-indexing, and more in settings
- Smart Tagging - Organize with `#tags` and automatic indexing
- Real-time Analytics - Track your knowledge growth
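To make "meaning, not just keywords" concrete, here is a simplified TypeScript sketch of the scoring step behind semantic ranking. It is illustrative only: MarkItUp's actual pipeline computes embeddings with Transformers.js and persists them in IndexedDB, while the vectors below are hypothetical placeholders.

```typescript
// Simplified sketch of semantic ranking. Real note embeddings come from a
// browser-side ML model; here we only show the core cosine-similarity scoring.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank notes against a query embedding, best match first
function rankNotes(
  query: number[],
  notes: { id: string; embedding: number[] }[]
): { id: string; score: number }[] {
  return notes
    .map(n => ({ id: n.id, score: cosineSimilarity(query, n.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```

Two notes that never share a keyword can still rank next to each other if their embeddings point in similar directions.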
All AI features work with your choice of provider (OpenAI, Anthropic, Gemini, or Ollama):
- Intelligent Link Suggester - AI finds connection opportunities in your knowledge base:
  - Context-aware wikilink suggestions with confidence scoring
  - Batch orphan analysis for vault-wide optimization
  - Visual context preview shows where links will be inserted
  - Real-time suggestions while typing (optional)
  - Pattern learning adapts to your preferences
  - Bridge note suggestions to connect isolated clusters
  - Knowledge base health visualization
- Context-Aware AI Chat - AI assistant that understands your knowledge graph
- Writing Assistant - Advanced content analysis and improvement suggestions
- Knowledge Discovery - AI-powered gap analysis and content recommendations
- Research Assistant - Semantic search with intelligent note creation
- Multi-Provider Support - Choose once, use everywhere: OpenAI (GPT-4, GPT-3.5), Anthropic (Claude 3.5), Google Gemini (1.5 Pro/Flash), or Ollama (local AI)
MarkItUp features the most comprehensive Ollama integration in any PKM system:
- Real-Time Streaming - Progressive response display with Server-Sent Events
- Model Management - Download, delete, and refresh models directly from the UI
- Connection Testing - One-click server verification with detailed status (no CORS errors!)
- Advanced Parameters - Fine-tune context window, GPU layers, repeat penalty, and more
- Enhanced Model Details - View size, parameter count (7B/13B/70B), and quantization levels
- Zero Cost - No API keys, no usage limits, completely free local AI
- Complete Privacy - All data stays on your machine, no external API calls
- Model Library - Browse and install popular models (Llama 3.2, Mistral, CodeLlama, etc.)
- Custom Servers - Connect to remote Ollama instances with saved presets
- Performance Metrics - Track tokens/second, response times, and model efficiency
- Seamless Integration - Built-in CORS proxy for error-free connections (no Ollama config needed!)
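As a rough illustration of how streamed output arrives: a Server-Sent Events stream delivers `data:` lines that the client appends to the response as they come in. The sketch below parses generic SSE framing; the exact event payload MarkItUp receives from each provider may differ.

```typescript
// Sketch of client-side SSE handling: extract the payload of each `data:` line
// from a raw event-stream chunk. (Generic SSE framing, not any provider's
// exact wire format; "[DONE]" is a common end-of-stream sentinel.)
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split('\n')
    .filter(line => line.startsWith('data:'))
    .map(line => line.slice('data:'.length).trim())
    .filter(payload => payload !== '' && payload !== '[DONE]');
}
```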
Getting Started with Ollama:
- Install Ollama: `brew install ollama` (macOS) or visit ollama.ai
- Start the server: `ollama serve`
- Pull a model: `ollama pull llama3.2` or use MarkItUp's UI
- Select "Ollama (Local)" in AI Settings - no API key needed!
Why Choose Ollama?
- 100% private - ideal for sensitive knowledge bases
- Offline capable after model download
- No monthly costs or token limits
- Full control over AI infrastructure
- Supports 50+ open-source models
- Multi-user Editing - Real-time collaborative editing with conflict resolution
- Live Presence - See other users' cursors and selections
- WebSocket Sync - Low-latency synchronization via Socket.IO
- Session Sharing - Easy document sharing with secure links
- Rich Plugin API - Custom functionality, commands, and processors
- Settings Management - Configurable plugin settings with persistence
- Content Processors - Transform content during import/export
- Event System - Plugin communication through events
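To make these concepts concrete, here is a purely hypothetical sketch of how a content processor and event system could fit together. The names and interfaces below are invented for illustration; MarkItUp's real Plugin API is documented in the Plugin System guide.

```typescript
// Hypothetical illustration only: PluginHost and the Plugin shape are invented
// for this sketch, not MarkItUp's actual API.
interface Plugin {
  id: string;
  // Content processors transform note text during import/export
  processContent?: (content: string) => string;
  // Plugins can react to events emitted by the app or other plugins
  onEvent?: (event: string, payload: unknown) => void;
}

class PluginHost {
  private plugins: Plugin[] = [];

  register(plugin: Plugin): void {
    this.plugins.push(plugin);
  }

  // Run every registered content processor in registration order
  process(content: string): string {
    return this.plugins.reduce(
      (text, p) => (p.processContent ? p.processContent(text) : text),
      content
    );
  }

  // Broadcast an event to every plugin that listens
  emit(event: string, payload: unknown): void {
    for (const p of this.plugins) p.onEvent?.(event, payload);
  }
}

// Example processor: rewrite bare #tags into [[wikilinks]] on export
const host = new PluginHost();
host.register({
  id: 'tag-linker',
  processContent: text => text.replace(/#(\w+)/g, '[[$1]]'),
});
```

The chain-of-processors design means plugins compose: each processor receives the previous one's output, so ordering is the only coordination they need.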
- WYSIWYG Editor - Rich text editing with TipTap, seamless markdown conversion
- Multi-mode Editor - Edit, preview, split-view, or WYSIWYG modes
- LaTeX Math - Inline `$math$` and block `$$math$$` equations
- TikZ Diagrams - Create vector graphics directly in markdown
- GitHub Flavored Markdown - Tables, task lists, and more
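For instance, an illustrative note mixing math and GFM (assuming the standard `$math$` and `$$math$$` delimiters listed above) might contain:

```markdown
Euler's identity inline: $e^{i\pi} + 1 = 0$.

A block equation on its own lines:

$$
\int_0^1 x^2 \, dx = \frac{1}{3}
$$

- [x] GFM task lists work alongside tables and more
```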
| Topic | Description |
|---|---|
| Features | Complete feature overview and capabilities |
| Installation | Detailed setup and configuration guide |
| User Guide | How to use PKM features effectively |
| AI Features | AI setup and advanced capabilities |
| Collaboration | Real-time editing and sharing |
| Plugin System | Using and developing plugins |
| Deployment | Docker and production hosting |
| API Reference | Technical documentation |
| Comparison | MarkItUp vs Obsidian and others |
Web-Native: Access from any device with a browser - no app installation required

Complete Privacy: Self-hosted solution means your data never leaves your control

Built for Connection: Native wikilinks, backlinks, and graph visualization

Modern Stack: Built with Next.js 16 and TypeScript for speed and reliability

Truly Free: Open-source with no licensing fees or feature restrictions

Extensible: Modern plugin architecture for unlimited customization
MarkItUp offers the most flexible AI integration in any PKM system:
| Feature | OpenAI | Anthropic | Gemini | Ollama |
|---|---|---|---|---|
| Streaming | Yes | Yes | Yes | Yes |
| API Key Required | Yes | Yes | Yes | None |
| Cost | $0.002-0.03/1K tokens | $0.003-0.015/1K tokens | $0.000075-0.00125/1K tokens | Free |
| Context Window | 128K tokens | 200K tokens | 1M tokens | Varies by model |
| Model Management | No | No | No | Yes |
| Advanced Options | No | No | No | Yes (8+ params) |
| Privacy (Local) | Cloud | Cloud | Cloud | 100% Local |
| Connection Health | No | No | No | Yes |
| Offline Mode | No | No | No | Yes |
| Performance Tracking | No | No | No | Yes |
| Usage Limits | Pay per use | Pay per use | Pay per use | Unlimited |
Choose the right provider for your needs:
- OpenAI - Most capable general-purpose AI (GPT-4)
- Anthropic - Long context windows, ethical AI (Claude 3.5)
- Gemini - Most cost-effective with massive 1M token context (Gemini 1.5 Pro/Flash)
- Ollama - Complete privacy, zero cost, offline capable (50+ models)
```shell
# Generate JWT_SECRET
openssl rand -base64 32

# Generate ENCRYPTION_KEY
openssl rand -hex 16
```

Create docker-compose.yml:
```yaml
version: "3.8"
services:
  markitup:
    image: ghcr.io/xclusive36/markitup:latest
    container_name: markitup
    ports:
      - 3000:3000
    # volumes:
    #   Optional: Uncomment to persist markdown files on the host filesystem
    #   Note: Requires chmod 777 ./markdown for write permissions
    #   For managed hosting or ephemeral deployments, leave commented
    #   - ./markdown:/app/markdown
    environment:
      - PORT=3000
      - HOSTNAME=0.0.0.0
      - NODE_ENV=production
      # REQUIRED: Replace with your generated secrets
      - JWT_SECRET=your-generated-jwt-secret-here
      - ENCRYPTION_KEY=your-generated-encryption-key-here
      # Optional: Disable auth for testing (NOT FOR PRODUCTION)
      # - DISABLE_AUTH=false
    restart: unless-stopped
```

Start the container:

```shell
docker compose up -d
```

Alternatively, run the image directly with `docker run`:

```shell
# Generate secrets
JWT_SECRET=$(openssl rand -base64 32)
ENCRYPTION_KEY=$(openssl rand -hex 16)

# Run without persistent storage (default)
docker run --name markitup -p 3000:3000 \
  -e JWT_SECRET="$JWT_SECRET" \
  -e ENCRYPTION_KEY="$ENCRYPTION_KEY" \
  --restart unless-stopped \
  ghcr.io/xclusive36/markitup:latest

# OR with persistent storage
docker run --name markitup -p 3000:3000 \
  -v ./markdown:/app/markdown \
  -e JWT_SECRET="$JWT_SECRET" \
  -e ENCRYPTION_KEY="$ENCRYPTION_KEY" \
  --restart unless-stopped \
  ghcr.io/xclusive36/markitup:latest
```

By default, data lives in the container. To persist data on your host:
- Uncomment the volume mount in docker-compose.yml, or add `-v ./markdown:/app/markdown` to `docker run`
- Set permissions:

```shell
# Easy method (all users can write)
chmod -R 777 ./markdown

# Secure method (container user only)
sudo chown -R 1001:1001 ./markdown
```

For managed hosting, skip volume mounts and implement alternative storage (S3, database, etc.)
MarkItUp supports both multi-user authenticated mode (default, recommended) and single-user unauthenticated mode for personal self-hosted deployments.
Features:

- Secure user isolation - each user has their own `/markdown/user_{id}/` directory
- User management with signup, login, and session handling
- Per-user quotas and usage tracking
- Audit logs showing who modified what
- Collaborative features (sharing, permissions)
Requirements:

- Set the `JWT_SECRET` and `ENCRYPTION_KEY` environment variables
- Users must sign up and log in
For personal self-hosted instances where you're the only user:
```yaml
# docker-compose.yml
environment:
  - DISABLE_AUTH=true
```

What This Does:

- No signup/login required - instant access
- All files stored in the `/markdown/` root directory
- Simpler setup for single-user scenarios

Trade-offs:

- No user isolation - all data is shared
- No access control - anyone with network access can read/write
- No audit trail - can't track who changed what
- Limited features - collaboration and sharing disabled
Warning: DISABLE_AUTH=true means:

- Anyone who can reach your server can access ALL your notes
- No password protection whatsoever
- No way to restrict who can create, edit, or delete files
- All AI Chat file operations affect the shared directory
- Should ONLY be used for:
  - Personal single-user deployments
  - Localhost-only access
  - Development/testing
  - Networks you completely trust
When to Use Each Mode:
| Scenario | Recommended Mode |
|---|---|
| Personal use, only you access it | Either (DISABLE_AUTH is fine) |
| Exposing to the internet | Multi-user ONLY |
| Sharing with family/team | Multi-user ONLY |
| Untrusted network | Multi-user ONLY |
| Development/testing | DISABLE_AUTH is convenient |
Best Practices for DISABLE_AUTH=true:
- Network Isolation: Only expose on localhost or trusted LAN
- Firewall Rules: Block external access at network level
- Reverse Proxy: Use nginx/Caddy with basic auth if internet-facing
- Regular Backups: No audit log means you can't recover from mistakes
- Single User Only: Never use for multi-user scenarios
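If you do expose a `DISABLE_AUTH=true` instance through a reverse proxy, a minimal Caddy sketch could look like the following. Treat it as an illustrative starting point, not a hardened config: the hostname and username are placeholders, and the bcrypt hash must be generated with `caddy hash-password`.

```
notes.example.com {
    basicauth {
        # Placeholder: paste the hash printed by `caddy hash-password`
        admin <bcrypt-hash>
    }
    reverse_proxy localhost:3000
}
```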
Migration: You can switch between modes anytime. Existing data in /markdown/ will work in both modes, though user-specific folders (/markdown/user_*/) are only used in multi-user mode.
We welcome contributions! Please see our contribution guidelines for details.
- Bug Reports: Open an issue
- Feature Requests: Start a discussion
- Pull Requests: Fork, develop, and submit PRs
MIT License - see LICENSE file for details.
Star this repo if you find MarkItUp useful!

Built with ❤️ by the MarkItUp community
