BNU4KA/interviews
Interviews - AI-Powered Interview Assistant

An AI-powered interview assistant application that helps solve coding problems using Ollama models.

Prerequisites

  • Node.js (v14 or higher)
  • npm or yarn
  • Ollama installed and running

Installation Guide

Step 1: Install Ollama

macOS

brew install ollama

Linux

curl -fsSL https://ollama.com/install.sh | sh

Windows

Download the installer from ollama.com

Step 2: Install Ollama Models

After installing Ollama, you need to pull the required models:

ollama pull deepseek-coder:6.7b
ollama pull llava:7b

Verify that the models are installed:

ollama list

Step 3: Install Dependencies

Install dependencies for both the server and the application:

# Install all dependencies at once
npm run install:all

# Or install manually:
# Install server dependencies
cd server
npm install

# Install application dependencies
cd ../app
npm install

Step 4: Start Ollama Server

In a separate terminal, start the Ollama server:

ollama serve

The Ollama server will run on http://localhost:11434 by default.

Verify that Ollama is running:

curl http://localhost:11434/api/tags

Running the Application

Quick Start (Recommended)

Use the provided startup script to launch everything automatically:

./start.sh

Or using npm:

npm start

This script will:

  1. Check if Ollama is installed and running (start it if needed)
  2. Verify required models are installed (pull them if missing)
  3. Start the Node.js server
  4. Start the Electron application
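The four steps above could be sketched roughly as follows. This is a minimal sketch, not the shipped start.sh: the helper names and the two-second startup wait are illustrative assumptions.

```shell
#!/usr/bin/env bash
# Rough sketch of the start.sh logic described above; the shipped script
# may differ. Helper names here are illustrative, not from the repo.

have() { command -v "$1" >/dev/null 2>&1; }      # is a command on PATH?

ollama_running() {                               # is the daemon answering?
  curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1
}

ensure_model() {                                 # pull a model if missing
  ollama list | grep -q "$1" || ollama pull "$1"
}

main() {
  # 1. Check that Ollama is installed and running (start it if needed)
  have ollama || { echo "Error: ollama is not installed" >&2; return 1; }
  ollama_running || { ollama serve & sleep 2; }

  # 2. Verify required models are installed (pull them if missing)
  ensure_model deepseek-coder:6.7b
  ensure_model llava:7b

  # 3. Start the Node.js server in the background
  (cd server && npm start) &

  # 4. Start the Electron application in the foreground
  cd app && npm start
}

main "$@"
```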

Manual Start

If you prefer to start services manually:

Start the Node.js Server

Open a terminal and navigate to the server directory:

cd server
npm start

The server will start on port 3000 (or the port specified in the PORT environment variable).
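The "PORT or 3000" fallback can be mirrored in shell when scripting around the server; the one-off override shown in the comment assumes the server reads process.env.PORT, as stated above.

```shell
# One-off override, run from the server directory (assumes the server
# reads process.env.PORT):
#   PORT=4000 npm start

# The same "PORT or 3000" fallback, expressed as shell parameter expansion:
port="${PORT:-3000}"
echo "server will listen on port $port"
```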

Start the Application

Open another terminal and navigate to the app directory:

cd app
npm start

This will launch the Electron application.

Usage

Using the startup script:

./start.sh
# or
npm start

Manual steps:

  1. Make sure Ollama is running (ollama serve)
  2. Start the Node.js server (cd server && npm start)
  3. Start the application (cd app && npm start)
  4. Click "Start Session" in the application
  5. Send LeetCode problems or coding questions

Project Structure

interviews/
├── app/          # Electron application (frontend)
├── server/       # Node.js server (backend)
├── start.sh      # Startup script for all services
├── stop.sh       # Stop script for all services
├── package.json  # Root package.json with npm scripts
└── README.md     # This file

Stopping Services

To stop all running services, use the provided stop script:

./stop.sh

Or using npm:

npm run stop

This script will:

  • Stop all Ollama processes
  • Stop the Node.js server
  • Stop all Electron application processes
  • Clean up stale lock files
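The cleanup above could be sketched as follows. This is a rough sketch, not the shipped stop.sh; the lock-file path shown is the macOS one from the troubleshooting section below.

```shell
#!/usr/bin/env bash
# Rough sketch of the stop.sh logic described above; the shipped script
# may differ.

stop_all() {
  pkill -f 'ollama serve'    || true   # Ollama daemon (|| true: ok if none)
  pkill -f 'node.*server.js' || true   # Node.js backend
  pkill -f 'electron.*app'   || true   # Electron frontend
  pkill -f 'electron-forge'  || true

  # Remove a stale IndexedDB lock file (macOS path; adjust for your OS)
  rm -f ~/Library/Application\ Support/app/IndexedDB/file__0.indexeddb.leveldb/LOCK
}

stop_all
echo "all services stopped"
```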

Manual Stop

Alternatively, stop each service individually:

  • Ollama: pkill -f 'ollama serve' or press Ctrl+C in the terminal where it's running
  • Server: pkill -f 'node.*server.js' or press Ctrl+C in the server terminal
  • App: Close the Electron application window

Force Stop

If processes don't stop normally, force kill them:

pkill -9 -f 'ollama serve'
pkill -9 -f 'node.*server.js'
pkill -9 -f 'electron.*app'
pkill -9 -f 'electron-forge'

Troubleshooting

Ollama server is not available

  • Make sure ollama serve is running
  • Check that port 11434 is not occupied
  • Verify Ollama installation: ollama --version

Model not found

  • Pull the required models:
    ollama pull deepseek-coder:6.7b
    ollama pull llava:7b
  • Check available models: ollama list

Server connection errors

  • Ensure the server is running on port 3000
  • Check that no other application is using port 3000
  • Verify server logs for error messages
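One way to check whether port 3000 is already taken, using bash's built-in /dev/tcp pseudo-device so no extra tools are needed (the helper name is illustrative):

```shell
# Returns success if something is listening on the given local TCP port.
# Uses bash's /dev/tcp pseudo-device, so no lsof/ss/netstat is required.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

if port_in_use 3000; then
  echo "port 3000 is occupied"
else
  echo "port 3000 is free"
fi
```

If the port is occupied by a stale server instance, stop it (see Stopping Services) before restarting.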

Application won't start

  • Make sure all dependencies are installed (npm install in both app and server directories)
  • Check Node.js version: node --version (should be v14 or higher)
  • Review application logs for specific errors

LevelDB/IndexedDB lock error

If you see an error like Failed to open LevelDB database: File currently in use:

This usually means another instance of the application is already running or a previous instance didn't close properly. The startup script (start.sh) automatically handles this, but if you're starting manually:

macOS:

# Kill any existing Electron processes
pkill -f "electron.*app"

# Remove stale lock file (if needed)
rm -f ~/Library/Application\ Support/app/IndexedDB/file__0.indexeddb.leveldb/LOCK

Linux:

pkill -f "electron.*app"
rm -f ~/.config/app/IndexedDB/file__0.indexeddb.leveldb/LOCK

Windows:

taskkill /F /IM electron.exe
# Then manually delete: %APPDATA%\app\IndexedDB\file__0.indexeddb.leveldb\LOCK

Available Models

The application uses the following Ollama models:

  • deepseek-coder:6.7b - For code generation and problem solving
  • llava:7b - For vision/image analysis tasks

You can use other models by modifying the configuration in the server code.
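To check a model independently of the app, you can talk to Ollama's HTTP API directly. The /api/generate endpoint is Ollama's standard generation API; the wrapper function here is illustrative.

```shell
# Query a model directly over Ollama's HTTP API. Falls back to an error
# message when the daemon is not reachable on :11434.
ask_ollama() {
  curl -s http://localhost:11434/api/generate \
       -d "{\"model\": \"$1\", \"prompt\": \"$2\", \"stream\": false}" \
    || echo '{"error": "ollama not reachable on localhost:11434"}'
}

ask_ollama deepseek-coder:6.7b "Write a function that reverses a string."
```

If this call works but the app does not, the problem is in the Node.js server or the Electron app rather than in Ollama itself.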

License

GPL-3.0
