A high-performance, feature-rich implementation of the classic wget utility, built with Rust for speed, safety, and concurrency.
- Single file downloads with progress tracking
- Multiple file downloads from command-line arguments
- Batch downloads from input files (`-i` flag)
- Sequential processing for command-line URLs (wget-compatible)
- Concurrent processing for file-based URLs (performance-optimized)
- Custom output names (`-O` flag)
- Directory specification (`-P` flag)
- Background downloads (`-B` flag) with logging
- Rate limiting (`--rate-limit`) with `k`/`M` suffixes
- Website mirroring (`--mirror`) with filtering options
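The `k`/`M` suffixes on `--rate-limit` map to binary multipliers, following GNU wget's convention. A minimal sketch of how such parsing might look (the function name `parse_rate` is illustrative, not this project's actual API):

```rust
/// Parse a rate string like "500k" or "2M" into bytes per second.
/// Hypothetical helper; the real parsing lives in the `cli/` module.
fn parse_rate(s: &str) -> Option<u64> {
    let s = s.trim();
    if s.is_empty() {
        return None;
    }
    // Split off an optional single-letter suffix (1024-based, as in GNU wget).
    let (digits, multiplier) = match s.chars().last() {
        Some('k') | Some('K') => (&s[..s.len() - 1], 1024),
        Some('m') | Some('M') => (&s[..s.len() - 1], 1024 * 1024),
        Some(c) if c.is_ascii_digit() => (s, 1),
        _ => return None,
    };
    digits.parse::<u64>().ok().map(|n| n * multiplier)
}
```

So `--rate-limit=500k` would cap throughput at 512,000 bytes per second.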
- Asynchronous I/O using Tokio for high performance
- Streaming downloads for memory efficiency
- Multi-progress bars for concurrent download tracking
- Two-phase download process for clean user experience
- Comprehensive error handling with detailed reporting
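The two-phase process means every request is sent and its status reported before any progress bar appears, so status lines and bars never interleave. The real implementation uses Tokio tasks and reqwest; this std-thread sketch with simulated responses only illustrates the two-phase shape:

```rust
use std::thread;

// Phase 1: "send" all requests up front and collect responses
// sequentially, so status lines print in order.
// Phase 2: only then run every download concurrently.
fn two_phase(urls: Vec<String>) -> Vec<String> {
    // Phase 1: one simulated response per URL.
    let responses: Vec<String> = urls
        .iter()
        .map(|u| format!("200 OK for {u}"))
        .collect();

    // Phase 2: spawn all downloads at once, then join in order.
    let handles: Vec<_> = responses
        .into_iter()
        .map(|r| thread::spawn(move || format!("downloaded ({r})")))
        .collect();

    handles.into_iter().map(|h| h.join().unwrap()).collect()
}
```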
The project is organized into focused modules:
- `cli/` — Command-line argument parsing and validation
- `http/` — HTTP client and network operations
- `io/` — File I/O and URL input processing
- `download/` — Concurrent download management and progress tracking
- `utils/` — Utility functions and helpers
```shell
git clone <repository-url>
cd wget-rs
cargo build --release
```

```shell
# Download a single file
./wget https://example.com/file.zip

# Download with custom name
./wget -O myfile.zip https://example.com/file.zip

# Download to specific directory
./wget -P ~/Downloads/ https://example.com/file.zip

# Download multiple files from input file
./wget -i downloads.txt

# Background download with rate limiting
./wget -B --rate-limit=500k https://example.com/largefile.zip
```

| Flag | Description | Example |
|---|---|---|
| `-O <file>` | Save as specific filename | `./wget -O image.jpg <url>` |
| `-P <dir>` | Save to directory | `./wget -P ~/Downloads/ <url>` |
| `-i <file>` | Read URLs from file | `./wget -i urls.txt` |
| `-B` | Download in background | `./wget -B <url>` |
| `--rate-limit=<rate>` | Limit download speed | `./wget --rate-limit=200k <url>` |
| `--mirror` | Mirror entire website | `./wget --mirror <url>` |
| `-R <suffixes>` | Reject file types | `./wget --mirror -R=jpg,gif <url>` |
| `-X <dirs>` | Exclude directories | `./wget --mirror -X=/tmp,/cache <url>` |
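The file passed to `-i` is expected to hold one URL per line, matching GNU wget's input-file convention. For example:

```shell
# Create a downloads.txt with one URL per line
cat > downloads.txt <<'EOF'
https://example.com/file1.zip
https://example.com/file2.zip
EOF
```

Running `./wget -i downloads.txt` then fetches both files concurrently.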
- Concurrent file downloads from input files (original wget is sequential)
- Modern progress bars with multi-download support
- Two-phase download process prevents UI conflicts
- Better error reporting with detailed summaries
- True wget compatibility for single downloads
- Rust performance and safety benefits
- Async/await architecture for efficiency
- Modular design for maintainability
```shell
./wget url1 url2 url3
# Downloads: url1 → url2 → url3 (one after another)
```

```shell
./wget -i urls.txt
# Phase 1: Send all requests, collect responses
# Phase 2: Download all files simultaneously
```

```
start at 2024-01-15 10:30:45
sending request, awaiting response... status 200 OK
content size: 1048576 [~1.00MB]
saving file to: ./file.zip
1.00 MiB / 1.00 MiB [████████████████████] 100.00% 2.5 MiB/s 0s
Downloaded [https://example.com/file.zip]
finished at 2024-01-15 10:30:47
```
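The `content size` and progress lines render raw byte counts in human-readable units. A hypothetical formatter (illustrative only, not the project's actual code) could look like:

```rust
/// Format a byte count in binary units, e.g. 1048576 -> "1.00 MiB".
fn human_bytes(bytes: u64) -> String {
    const UNITS: [&str; 5] = ["B", "KiB", "MiB", "GiB", "TiB"];
    let mut value = bytes as f64;
    let mut unit = 0;
    // Divide by 1024 until the value fits under the next unit.
    while value >= 1024.0 && unit < UNITS.len() - 1 {
        value /= 1024.0;
        unit += 1;
    }
    if unit == 0 {
        format!("{bytes} B")
    } else {
        format!("{value:.2} {}", UNITS[unit])
    }
}
```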
```
start at 2024-01-15 10:30:45
Read 3 URLs from file: downloads.txt
Processing file URLs concurrently...
Phase 1: Sending requests...
sending request to https://example.com/file1.zip, awaiting response...
status 200 OK for https://example.com/file1.zip
sending request to https://example.com/file2.zip, awaiting response...
status 200 OK for https://example.com/file2.zip
Phase 2: Starting 2 concurrent downloads...
file1.zip 512 KiB / 1.0 MiB [████████████████████] 50.0% 1.2 MiB/s 1s
file2.zip 256 KiB / 2.0 MiB [██████████          ] 25.0% 800 KiB/s 3s
Concurrent Download Summary:
Successful: 2
Failed: 0
Total bytes: 3145728 (3.00 MB)
finished at 2024-01-15 10:30:52
```
```shell
cargo test
```

```shell
# Test single download
./wget https://httpbin.org/bytes/1024

# Test concurrent downloads
echo -e "https://httpbin.org/bytes/1024\nhttps://httpbin.org/json" > test.txt
./wget -i test.txt

# Test with directory
./wget -P /tmp/ https://httpbin.org/uuid

# Test background mode
./wget -B https://httpbin.org/bytes/2048
cat wget-log
```

- Rust 1.70+
- Tokio runtime
- Internet connection for testing
- `clap` — Command-line parsing
- `reqwest` — HTTP client
- `tokio` — Async runtime
- `indicatif` — Progress bars
- `chrono` — Date/time handling
- `futures-util` — Stream utilities
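In `Cargo.toml`, the dependency section would look roughly like this (version numbers and feature flags are illustrative; check the repository for the real ones):

```toml
[dependencies]
clap = { version = "4", features = ["derive"] }
reqwest = { version = "0.12", features = ["stream"] }
tokio = { version = "1", features = ["full"] }
indicatif = "0.17"
chrono = "0.4"
futures-util = "0.3"
```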
```shell
# Debug build
cargo build

# Release build (optimized)
cargo build --release

# Run with logging
RUST_LOG=debug cargo run -- <url>
```

Each module contains detailed documentation:
- CLI Module — Argument parsing and validation
- HTTP Module — Network operations and client
- I/O Module — File operations and URL reading
- Download Module — Concurrent download management
- Utils Module — Utility functions and helpers
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Original GNU wget team for the inspiration
- Rust community for excellent async ecosystem
- Contributors and testers
Built with Rust