NVRLite is a cross-platform (Win/Linux), lightweight, Qt‑based Network Video Recorder designed to handle multiple RTSP video streams with:
- Low‑latency preview via OpenCV
- MP4 recording without re‑encoding (H.264 passthrough)
- Pre‑record buffer (N seconds before trigger)
- Auto‑reconnect with “NO SIGNAL” overlay when streams drop
- HTTP (REST) control for starting/stopping recordings per camera
It targets a developer‑friendly workflow similar to well‑known NVRs on the market, but in a lightweight C++ application.
NVRLite is built around three main components:

1. Capture layer – `RtspCaptureThread`
   - One `QThread` per RTSP stream
   - Opens RTSP with FFmpeg, decodes frames for display, and emits encoded H.264 packets for recording
   - Auto‑retries when RTSP is down (every 5 seconds)
   - Emits NO SIGNAL frames when offline

2. Recording layer – `Mp4RecorderWorker`
   - One recorder instance per stream (can be run in its own `QThread`)
   - Receives encoded packets and writes MP4 files without re‑encoding
   - Supports pre‑buffering: when recording starts, it also writes the last N seconds of video

3. Control / HTTP layer – `HttpDataServer`
   - Runs an embedded HTTP server using cpp‑httplib
   - Exposes APIs to start and stop recording for each `stream_id`
   - Tracks the last output file per stream and returns it on `/record/stop`
Display uses OpenCV windows (no Qt GUI), with a grid layout that shows all active cameras in a single window.
The core application uses QCoreApplication (no QMainWindow, no QtGui/QtWidgets).
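The components are typically wired together at startup with Qt signal/slot connections. The sketch below is only a rough outline: the class names and the `startRecordingRequested` / `stopRecordingRequested` signals come from this README, while the header names, constructor arguments, and slot names (`packetReady`, `onPacket`, `startRecording`, `stopRecording`) are assumptions for illustration.

```cpp
// Rough wiring sketch (illustrative; header names, constructors and slot
// names are assumptions, not the exact NVRLite source).
#include <QCoreApplication>
#include <QObject>
#include "RtspCaptureThread.h"   // assumed header names
#include "Mp4RecorderWorker.h"
#include "HttpDataServer.h"

int main(int argc, char* argv[])
{
    QCoreApplication app(argc, argv);   // no QtWidgets/QtGui, OpenCV does the display

    // One capture thread and one recorder per configured stream.
    RtspCaptureThread capture("stream_1", "rtsp://camera/stream");
    Mp4RecorderWorker recorder("stream_1");
    HttpDataServer    http(8090);

    // Encoded H.264 packets flow from capture to recorder (no re-encoding).
    QObject::connect(&capture, &RtspCaptureThread::packetReady,
                     &recorder, &Mp4RecorderWorker::onPacket);

    // REST requests are translated into Qt signals that drive the recorder.
    QObject::connect(&http, &HttpDataServer::startRecordingRequested,
                     &recorder, &Mp4RecorderWorker::startRecording);
    QObject::connect(&http, &HttpDataServer::stopRecordingRequested,
                     &recorder, &Mp4RecorderWorker::stopRecording);

    capture.start();
    return app.exec();
}
```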
You need:
- Qt5 core: `qtbase5-dev`
- FFmpeg and dev headers: `ffmpeg libavcodec-dev libavformat-dev libavutil-dev libswscale-dev`
- OpenCV: `libopencv-dev`
- Build tools: `build-essential cmake pkg-config`
Install (example):

```bash
sudo apt update
sudo apt install -y \
    qtbase5-dev \
    libavcodec-dev libavformat-dev libavutil-dev libswscale-dev \
    libopencv-dev \
    ffmpeg \
    build-essential cmake pkg-config
```

- On Windows: use the OpenCV and Qt5 installers from their websites and provide the install paths in cmake-gui.
Linux:

```bash
mkdir build
cd build
cmake ..
make -j8
```

Windows:
- Use cmake-gui to generate the Visual Studio solution.
- Launch the solution in Visual Studio and select the Release build.
Open a terminal (cmd on Windows) and launch the executable with the config JSON as a parameter.

Linux:

```bash
./NVRLite --config config.json
```

Windows:

```bash
NVRLite.exe --config config.json
```

See (6) for the JSON configuration.
Routes:
- Start Stream: POST /stream/start
- Stop Stream: POST /stream/stop
- Start Record: POST /record/start
- Stop Record: POST /record/stop
- Get Status: GET /stream/status[?stream_id=<id>]
Endpoint:

`POST /stream/start`
`Content-Type: application/json`

Request body:

```json
{
  "stream_id": "stream_1"
}
```

Behavior:
- Parses the JSON body.
- If `stream_id` is missing or not a string, returns 400 and `{"status":"error","message":"Missing or invalid 'stream_id'"}`.
- Otherwise:
  - Emits the Qt signal `emit startStreamRequested(streamId);`
  - Returns `{ "status": "ok", "stream_id": "stream_1" }`
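This parse/validate/emit/respond pattern maps directly onto a cpp‑httplib handler. The sketch below is illustrative, assuming Qt's JSON classes and a `registerRoutes` method on `HttpDataServer`; only the `startStreamRequested` signal and the response shapes come from this README.

```cpp
// Illustrative sketch of the /stream/start handler (method name and JSON
// handling are assumptions, not the exact NVRLite source).
#include <httplib.h>
#include <QJsonDocument>
#include <QJsonObject>

void HttpDataServer::registerRoutes(httplib::Server& svr)
{
    svr.Post("/stream/start", [this](const httplib::Request& req, httplib::Response& res) {
        QJsonParseError err{};
        const QJsonDocument doc = QJsonDocument::fromJson(
            QByteArray::fromStdString(req.body), &err);

        const QJsonValue id = doc.object().value("stream_id");
        if (err.error != QJsonParseError::NoError || !id.isString()) {
            res.status = 400;
            res.set_content(R"({"status":"error","message":"Missing or invalid 'stream_id'"})",
                            "application/json");
            return;
        }

        const QString streamId = id.toString();
        emit startStreamRequested(streamId);   // picked up by the capture layer

        const QByteArray ok = QJsonDocument(QJsonObject{
            {"status", "ok"}, {"stream_id", streamId}}).toJson(QJsonDocument::Compact);
        res.set_content(ok.toStdString(), "application/json");
    });
}
```

The other POST routes follow the same structure with their respective signals.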
Endpoint:

`POST /stream/stop`
`Content-Type: application/json`

Request body:

```json
{
  "stream_id": "stream_1"
}
```

Behavior:
- Parses the JSON body.
- If `stream_id` is missing or not a string, returns 400 and `{"status":"error","message":"Missing or invalid 'stream_id'"}`.
- Otherwise:
  - Emits the Qt signal `emit stopStreamRequested(streamId);`
  - Returns `{ "status": "ok", "stream_id": "stream_1" }`
Endpoint:

`POST /record/start`
`Content-Type: application/json`

Request body:

```json
{
  "stream_id": "stream_1"
}
```

Behavior:
- Parses the JSON body.
- If `stream_id` is missing or not a string, returns 400 and `{"status":"error","message":"Missing or invalid 'stream_id'"}`.
- Otherwise:
  - Emits the Qt signal `emit startRecordingRequested(streamId);`
  - Returns `{ "status": "ok", "stream_id": "stream_1" }`
Endpoint:

`POST /record/stop`
`Content-Type: application/json`

Request body:

```json
{
  "stream_id": "stream_1"
}
```

Behavior:
- Parses the JSON body.
- Emits the Qt signal `emit stopRecordingRequested(streamId);`
- Looks up the last known recording file for this `streamId` via `m_lastRecordingFile` (filled in `onRecordingStarted`).

Responses:
- If a filename is known:
  `{ "status": "ok", "stream_id": "stream_1", "file": "rec_stream_1_2025-12-04_11-43-27.mp4" }`
- If no recording file is known yet (e.g. recording never started):
  `{ "status": "warning", "stream_id": "stream_1", "file": null, "message": "No known recording file for this stream (maybe never started?)" }`
- On JSON parse error or malformed body (HTTP 400):
  `{ "status": "error", "message": "JSON parse error: ..." }`
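One way to keep `m_lastRecordingFile` up to date is to store the filename whenever a recorder reports that it has started. The sketch below uses the member and slot names mentioned above; the `lastRecordingFile` accessor is a hypothetical helper added only for illustration.

```cpp
// Illustrative sketch: remembering the last output file per stream so that
// /record/stop can return it. m_lastRecordingFile and onRecordingStarted
// follow this README; the accessor below is hypothetical.
#include <QHash>
#include <QString>

// Slot connected to the recorder's "recording started" notification.
void HttpDataServer::onRecordingStarted(const QString& streamId, const QString& filePath)
{
    m_lastRecordingFile[streamId] = filePath;   // QHash<QString, QString>
}

// Used by the /record/stop handler to build the response.
QString HttpDataServer::lastRecordingFile(const QString& streamId) const
{
    // Empty when no recording was ever started for this stream, which maps
    // to the "warning" response with "file": null.
    return m_lastRecordingFile.value(streamId);
}
```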
Endpoint:

`GET /stream/status`
`Content-Type: application/json`

Behavior:
- For all streams (no filtering), returns:

```json
{
  "status": "ok",
  "streams": [
    { "stream_id": "stream_1", "recording": true,  "streaming": true,  "file": "rec_stream_1_....mp4" },
    { "stream_id": "stream_2", "recording": false, "streaming": true,  "file": null },
    { "stream_id": "stream_3", "recording": false, "streaming": false, "file": null }
  ]
}
```

- For a single stream (endpoint: `?stream_id=<id>`), returns:

```json
{
  "status": "ok",
  "stream": {
    "stream_id": "stream_1",
    "recording": true,
    "streaming": true,
    "file": "rec_stream_1_....mp4"
  }
}
```
- A single OpenCV window (e.g. "NVRLite") shows all cameras in a grid.
- For each frame, capture threads emit `emit frameReady(streamId, bgrMat);`
- A display manager collects frames from all `streamId`s, arranges them into a grid (`cv::hconcat` + `cv::vconcat`), and shows the result (see the sketch below).
- When a stream is offline, the capture thread periodically emits a NO SIGNAL frame instead.
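A minimal sketch of such a grid composition is shown below; the helper name, column count, and the map of latest frames are assumptions for illustration, not the display manager's actual code.

```cpp
// Illustrative grid composition: tile the latest frame of each stream into one
// image shown in a single cv::imshow window. Names here are hypothetical.
#include <opencv2/opencv.hpp>
#include <map>
#include <string>
#include <vector>

cv::Mat composeGrid(const std::map<std::string, cv::Mat>& latestFrames,
                    int cols, const cv::Size& tileSize)
{
    std::vector<cv::Mat> rows;
    std::vector<cv::Mat> currentRow;

    for (const auto& [streamId, frame] : latestFrames) {
        cv::Mat tile;
        cv::resize(frame, tile, tileSize);           // normalize tile size
        currentRow.push_back(tile);
        if ((int)currentRow.size() == cols) {
            cv::Mat row;
            cv::hconcat(currentRow, row);            // stitch one row
            rows.push_back(row);
            currentRow.clear();
        }
    }
    // Pad the last, partially filled row with black tiles so hconcat works.
    while (!currentRow.empty() && (int)currentRow.size() < cols)
        currentRow.push_back(cv::Mat::zeros(tileSize, CV_8UC3));
    if (!currentRow.empty()) {
        cv::Mat row;
        cv::hconcat(currentRow, row);
        rows.push_back(row);
    }

    cv::Mat grid;
    if (!rows.empty())
        cv::vconcat(rows, grid);                     // stack rows vertically
    return grid;
}

// Usage (e.g. from the display manager's refresh loop):
//   cv::imshow("NVRLite", composeGrid(frames, 2, {640, 360}));
//   cv::waitKey(1);
```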
The config file must be formatted the following way:

```json
{
  "streams": [
    { "id": "<name_of_camera>", "url": "<url>" },
    { "id": "<name_of_camera>", "url": "<url>" }
  ],
  "http_port": 8090,
  "autostart": 0,
  "display_mode": 0,
  "pre_buffering_time": 5.0,
  "post_buffering_time": 0.5,
  "rec_base_folder": "/home/user/recordings/"
}
```

- `streams` contains the list of RTSP streams and their associated names
- `http_port` defines the REST API port to contact (0 - 65535)
- `autostart` defines whether the streams must start at launch
- `display_mode` defines whether the display grid is visible (0 = off, 1 = on)
- `pre_buffering_time` defines, in seconds, how much of the packet stream is buffered before a start call (i.e. the last N seconds are saved in the MP4 when the start call is made). This is used to compensate latency.
- `post_buffering_time` defines, in seconds, how long recording continues after a stop call (i.e. N more seconds are saved in the MP4 when the stop call is made)
- `rec_base_folder` defines the folder where MP4 files are saved. It will be created if it does not exist.
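To make `pre_buffering_time` concrete, here is a minimal sketch of a rolling pre-record buffer. The class and member names are hypothetical; it only illustrates the idea of keeping the last N seconds of encoded packets in memory so they can be written first when recording starts.

```cpp
// Illustrative rolling pre-record buffer (hypothetical names, not the exact
// NVRLite classes). Encoded packets are kept for the configured window so
// that, when recording starts, the last N seconds can be flushed to the MP4.
#include <cstdint>
#include <deque>
#include <vector>

struct EncodedPacket {
    int64_t ptsMs;                 // presentation time in milliseconds
    bool keyFrame;                 // true for IDR frames
    std::vector<uint8_t> data;     // encoded H.264 payload
};

class PreRecordBuffer {
public:
    explicit PreRecordBuffer(int64_t windowMs) : m_windowMs(windowMs) {}

    void push(EncodedPacket pkt) {
        m_packets.push_back(std::move(pkt));
        // Drop packets that fall outside the rolling window.
        while (!m_packets.empty() &&
               m_packets.back().ptsMs - m_packets.front().ptsMs > m_windowMs)
            m_packets.pop_front();
    }

    // On /record/start: flush these packets into the muxer, ideally starting
    // from the oldest key frame so the MP4 decodes cleanly.
    const std::deque<EncodedPacket>& packets() const { return m_packets; }

private:
    int64_t m_windowMs;
    std::deque<EncodedPacket> m_packets;
};
```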
Note: the time granularity inside the app is milliseconds.

See the provided example.json.
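Since the JSON expresses the buffering times in seconds while the app works in milliseconds internally, a config loader can convert them at load time. The sketch below uses Qt's JSON classes; the struct and function names are assumptions for illustration.

```cpp
// Illustrative config loader (hypothetical struct/function names).
// Buffering times are converted from seconds (JSON) to milliseconds (internal).
#include <QFile>
#include <QJsonArray>
#include <QJsonDocument>
#include <QJsonObject>
#include <QString>
#include <QVector>

struct StreamConfig { QString id; QString url; };

struct AppConfig {
    QVector<StreamConfig> streams;
    int httpPort = 8090;
    bool autostart = false;
    bool displayMode = false;
    qint64 preBufferingMs = 0;
    qint64 postBufferingMs = 0;
    QString recBaseFolder;
};

bool loadConfig(const QString& path, AppConfig& out)
{
    QFile f(path);
    if (!f.open(QIODevice::ReadOnly))
        return false;

    const QJsonObject root = QJsonDocument::fromJson(f.readAll()).object();
    for (const QJsonValue& v : root.value("streams").toArray()) {
        const QJsonObject s = v.toObject();
        out.streams.push_back({ s.value("id").toString(), s.value("url").toString() });
    }
    out.httpPort        = root.value("http_port").toInt(8090);
    out.autostart       = root.value("autostart").toInt() != 0;
    out.displayMode     = root.value("display_mode").toInt() != 0;
    out.preBufferingMs  = qint64(root.value("pre_buffering_time").toDouble() * 1000.0);
    out.postBufferingMs = qint64(root.value("post_buffering_time").toDouble() * 1000.0);
    out.recBaseFolder   = root.value("rec_base_folder").toString();
    return true;
}
```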
- Start NVRLite (QCoreApplication).
- Capture threads start and either:
  - connect to RTSP and start sending frames and packets, or
  - go into "retry every 5 s" + NO SIGNAL display mode.
- The OpenCV window shows all streams in a grid (if autostart is enabled).
- To start reading/streaming for `stream_1`:

  ```bash
  curl -X POST http://<ip>:<port>/stream/start \
    -H "Content-Type: application/json" \
    -d '{"stream_id":"stream_1"}'
  ```

- To start recording for `stream_1`:

  ```bash
  curl -X POST http://<ip>:<port>/record/start \
    -H "Content-Type: application/json" \
    -d '{"stream_id":"stream_1"}'
  ```

- To stop recording and get the output file path:

  ```bash
  curl -X POST http://<ip>:<port>/record/stop \
    -H "Content-Type: application/json" \
    -d '{"stream_id":"stream_1"}'
  ```
If correct, the response will be:

```json
{
  "status": "ok",
  "stream_id": "<camera_name>",
  "file": "<name_of_the_mp4>"
}
```

MP4 files are written as:

`rec_<streamId>_YYYY-MM-DD_HH-MM-SS.mp4`
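For reference, a filename in this pattern can be produced with Qt's date/time formatting; the helper below is a sketch, not necessarily how NVRLite builds the path internally.

```cpp
// Illustrative helper producing rec_<streamId>_YYYY-MM-DD_HH-MM-SS.mp4
#include <QDateTime>
#include <QDir>
#include <QString>

QString makeRecordingPath(const QString& recBaseFolder, const QString& streamId)
{
    const QString stamp =
        QDateTime::currentDateTime().toString("yyyy-MM-dd_HH-mm-ss");
    // e.g. /home/user/recordings/rec_stream_1_2025-12-04_11-43-27.mp4
    return QDir(recBaseFolder).filePath(QString("rec_%1_%2.mp4").arg(streamId, stamp));
}
```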
- Ensure your RTSP source is sending SPS/PPS in-band so that FFmpeg can capture `extradata` and the MP4 is playable.
- If you see "dimensions not set" or invalid MP4s:
  - Make sure `StreamInfo.width/height` are updated after the first decoded frame (as in the current code).
  - Ensure you pass the correct `codec_id` and `extradata` into the MP4 muxer (see the sketch below).
- For heavy loads (many cameras), make sure:
  - Each `RtspCaptureThread` runs in its own thread.
  - Each `Mp4RecorderWorker` is in a separate `QThread` to avoid blocking.
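As an illustration of that last point about `codec_id` and `extradata`, here is a minimal FFmpeg muxer-setup sketch for H.264 passthrough; the function name is hypothetical and error handling is omitted, so it is not a drop-in replacement for the NVRLite recorder code.

```cpp
// Illustrative MP4 muxer setup for H.264 passthrough (simplified, no error
// handling); extradata must hold the SPS/PPS captured from the RTSP input.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/mem.h>
}
#include <cstring>

AVFormatContext* openMp4Muxer(const char* filename,
                              int width, int height,
                              const uint8_t* extradata, int extradataSize)
{
    AVFormatContext* out = nullptr;
    avformat_alloc_output_context2(&out, nullptr, "mp4", filename);

    AVStream* st = avformat_new_stream(out, nullptr);
    st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    st->codecpar->codec_id   = AV_CODEC_ID_H264;     // passthrough, no re-encoding
    st->codecpar->width      = width;                // avoids "dimensions not set"
    st->codecpar->height     = height;

    // Copy SPS/PPS so the MP4 gets valid decoder configuration.
    st->codecpar->extradata = static_cast<uint8_t*>(
        av_mallocz(extradataSize + AV_INPUT_BUFFER_PADDING_SIZE));
    std::memcpy(st->codecpar->extradata, extradata, extradataSize);
    st->codecpar->extradata_size = extradataSize;

    avio_open(&out->pb, filename, AVIO_FLAG_WRITE);
    avformat_write_header(out, nullptr);
    return out;
}
```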
- Stream Start/Stop control and autostart
- MQTT controls
- ONVIF device discovery
- Metadata overlay (timestamps, camera name, etc.)
- Export / share segments via HTTP