An AI-powered interview assistant application that helps solve coding problems using Ollama models.

## Prerequisites

- Node.js (v14 or higher)
- npm or yarn
- Ollama installed and running
### Installing Ollama

macOS (Homebrew):

```bash
brew install ollama
```

Linux:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Windows: download the installer from ollama.com.
## Setting Up Models

After installing Ollama, you need to pull the required models:

```bash
ollama pull deepseek-coder:6.7b
ollama pull llava:7b
```

Verify that the models are installed:

```bash
ollama list
```

## Installing Dependencies

Install dependencies for both the server and the application:
```bash
# Install all dependencies at once
npm run install:all
```

Or install manually:

```bash
# Install server dependencies
cd server
npm install

# Install application dependencies
cd ../app
npm install
```

## Starting Ollama

In a separate terminal, start the Ollama server:
```bash
ollama serve
```

The Ollama server will run on http://localhost:11434 by default.

Verify that Ollama is running:

```bash
curl http://localhost:11434/api/tags
```

## Quick Start

Use the provided startup script to launch everything automatically:
```bash
./start.sh
```

Or using npm:

```bash
npm start
```

This script will:
- Check if Ollama is installed and running (start it if needed)
- Verify required models are installed (pull them if missing)
- Start the Node.js server
- Start the Electron application
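The model check in the steps above can be sketched in Node.js. This is a hypothetical helper, not the script's actual code; it assumes the `{ models: [{ name: ... }] }` shape of the response from Ollama's `GET /api/tags` endpoint:

```javascript
// Hypothetical sketch of the model check performed at startup (the real
// logic lives in start.sh). Assumes Ollama's /api/tags response shape:
// { models: [{ name: "deepseek-coder:6.7b", ... }, ...] }
const REQUIRED_MODELS = ["deepseek-coder:6.7b", "llava:7b"];

// Return the required models that do not appear in the /api/tags response.
function missingModels(tagsResponse, required = REQUIRED_MODELS) {
  const installed = new Set((tagsResponse.models || []).map((m) => m.name));
  return required.filter((name) => !installed.has(name));
}

// Usage against a running Ollama instance:
// const res = await fetch("http://localhost:11434/api/tags");
// const missing = missingModels(await res.json());
// // for each missing name, run `ollama pull <name>`
```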
## Manual Start

If you prefer to start services manually:

### Start the server

Open a terminal and navigate to the server directory:

```bash
cd server
npm start
```

The server will start on port 3000 (or the port specified in the PORT environment variable).
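The PORT fallback follows the usual Node.js pattern; a sketch, assuming the real logic in the server code looks something like this:

```javascript
// Sketch of the conventional Node.js port-resolution pattern (the actual
// code in server/ may differ). Falls back to 3000 when PORT is unset or
// not a valid port number.
function resolvePort(env = process.env) {
  const port = Number.parseInt(env.PORT ?? "", 10);
  return Number.isInteger(port) && port > 0 ? port : 3000;
}
```

Run with `PORT=8080 npm start` to override the default.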
### Start the application

Open another terminal and navigate to the app directory:

```bash
cd app
npm start
```

This will launch the Electron application.
## Usage

```bash
./start.sh
# or
npm start
```

Or manually:

- Make sure Ollama is running (`ollama serve`)
- Start the Node.js server (`cd server && npm start`)
- Start the application (`cd app && npm start`)
- Click "Start Session" in the application
- Send LeetCode problems or coding questions
## Project Structure

```
interviews/
├── app/            # Electron application (frontend)
├── server/         # Node.js server (backend)
├── start.sh        # Startup script for all services
├── stop.sh         # Stop script for all services
├── package.json    # Root package.json with npm scripts
└── README.md       # This file
```
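The root `package.json` ties the pieces together with the npm scripts used throughout this README (`install:all`, `start`, `stop`). A hypothetical sketch; the actual file may define these differently:

```json
{
  "name": "interviews",
  "private": true,
  "scripts": {
    "install:all": "cd server && npm install && cd ../app && npm install",
    "start": "./start.sh",
    "stop": "./stop.sh"
  }
}
```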
## Stopping Services

To stop all running services, use the provided stop script:

```bash
./stop.sh
```

Or using npm:

```bash
npm run stop
```

This script will:
- Stop all Ollama processes
- Stop the Node.js server
- Stop all Electron application processes
- Clean up stale lock files
Or stop them individually:

- Ollama: `pkill -f 'ollama serve'` or press `Ctrl+C` in the terminal where it's running
- Server: `pkill -f 'node.*server.js'` or press `Ctrl+C` in the server terminal
- App: Close the Electron application window
If processes don't stop normally, force kill them:

```bash
pkill -9 -f 'ollama serve'
pkill -9 -f 'node.*server.js'
pkill -9 -f 'electron.*app'
pkill -9 -f 'electron-forge'
```

## Troubleshooting

### Ollama connection issues

- Make sure `ollama serve` is running
- Check that port 11434 is not occupied
- Verify Ollama installation: `ollama --version`
### Missing models

- Pull the required models: `ollama pull deepseek-coder:6.7b` and `ollama pull llava:7b`
- Check available models: `ollama list`
### Server issues

- Ensure the server is running on port 3000
- Check that no other application is using port 3000
- Verify server logs for error messages

### Application issues

- Make sure all dependencies are installed (`npm install` in both `app` and `server` directories)
- Check Node.js version: `node --version` (should be v14 or higher)
- Review application logs for specific errors
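The Node.js version check can also be scripted; a small sketch (not part of the repository):

```javascript
// Scripted version of the `node --version` check above (a sketch, not part
// of the repository). process.version looks like "v14.17.0".
function meetsMinimum(version, minMajor = 14) {
  const major = Number.parseInt(version.replace(/^v/, ""), 10);
  return major >= minMajor;
}

console.log(meetsMinimum(process.version) ? "Node.js version OK" : "Node.js too old");
```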
### LevelDB lock errors

If you see an error like `Failed to open LevelDB database: File currently in use`:

This usually means another instance of the application is already running, or a previous instance didn't close properly. The startup script (`start.sh`) handles this automatically, but if you're starting manually:
macOS:

```bash
# Kill any existing Electron processes
pkill -f "electron.*app"

# Remove stale lock file (if needed)
rm -f ~/Library/Application\ Support/app/IndexedDB/file__0.indexeddb.leveldb/LOCK
```

Linux:

```bash
pkill -f "electron.*app"
rm -f ~/.config/app/IndexedDB/file__0.indexeddb.leveldb/LOCK
```

Windows:

```
taskkill /F /IM electron.exe
# Then manually delete: %APPDATA%\app\IndexedDB\file__0.indexeddb.leveldb\LOCK
```

## Models

The application uses the following Ollama models:

- `deepseek-coder:6.7b` - For code generation and problem solving
- `llava:7b` - For vision/image analysis tasks
You can use other models by modifying the configuration in the server code.
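For example, a model-selection block could look like the following. The helper name and the `CODE_MODEL`/`VISION_MODEL` environment variables are hypothetical; check the server source for where the model names actually live:

```javascript
// Hypothetical example of swapping in different models. This helper and the
// CODE_MODEL / VISION_MODEL environment variables are illustrative only;
// the server code may configure models differently.
function modelConfig(env = process.env) {
  return {
    code: env.CODE_MODEL || "deepseek-coder:6.7b", // code generation
    vision: env.VISION_MODEL || "llava:7b",        // image analysis
  };
}
```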
## License

GPL-3.0