This is a Python script that converts database files (SQLite and plist) into JSON. It is useful in digital forensics: you can run it against a folder or a single file and dump everything to JSON, which makes the information much easier to read, search, and understand.

For concrete use cases, see the detailed examples from the MeiyaCup 2024 Individual Competition.

Some of the test cases were generated with AI, and the code does not yet pass all of them, so there may still be bugs. If you find a bug or have a suggestion, please open an issue or a pull request. Thank you!
- Python 3.7 or higher
- Windows, macOS, or Linux operating system
```bash
git clone <repository-url>
cd dbs2json
pip install -r requirements-prod.txt   # runtime dependencies only
# or:
pip install -r requirements.txt        # includes development dependencies
# or install the single runtime dependency directly:
pip install loguru
python main.py --help
```

If the help information is displayed, the installation was successful.
- loguru (>=0.7.0) - Used to provide structured logging functionality
The main Python standard library modules used in the project:
- `pathlib` - File path handling
- `sqlite3` - SQLite database operations
- `plistlib` - Plist file handling
- `argparse` - Command-line argument parsing
- `json` - JSON data processing
- `csv` - CSV data processing
- `os`, `sys`, `tempfile`, `shutil` - System operations
- `typing` - Type hint support
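To illustrate how these standard-library modules fit together, here is a minimal sketch of a SQLite-to-JSON conversion. The function name `sqlite_to_json` and the `default=repr` fallback for binary columns are illustrative choices, not the tool's actual internals.

```python
import json
import sqlite3
from pathlib import Path


def sqlite_to_json(db_path: str, out_path: str) -> None:
    """Dump every table of a SQLite database into one JSON file."""
    con = sqlite3.connect(db_path)
    con.row_factory = sqlite3.Row  # rows become dict-like objects
    try:
        # Enumerate user tables from the schema catalog.
        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        dump = {}
        for table in tables:
            rows = con.execute(f'SELECT * FROM "{table}"').fetchall()
            dump[table] = [dict(r) for r in rows]
    finally:
        con.close()
    # default=repr keeps BLOB columns from crashing the JSON encoder.
    Path(out_path).write_text(json.dumps(dump, indent=2, default=repr))
```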
- pytest - Testing framework
- pytest-mock - Testing mocking tool
- pytest-cov - Test coverage
Basic usage examples:
```bash
# Process all database files in the current directory
python main.py

# Process a single file
python main.py -i database.db

# Process a specified directory
python main.py -i /path/to/evidence

# Output in CSV format
python main.py -f csv

# Verbose output mode
python main.py -v

# Specify the output directory
python main.py -o /path/to

# Use multiple threads for faster processing
python main.py -t 4

# Use the auto-detected optimal thread count
python main.py -t 0
```

Run tests:

```bash
# install development dependencies
pip install -r requirements.txt

# run tests
python run_tests.py

# or use pytest directly
pytest
```

The tool supports multi-threaded processing for improved performance when handling large numbers of database files.
- Single-threaded (default): `python main.py` or `python main.py -t 1`
- Custom thread count: `python main.py -t 4` (uses 4 threads)
- Auto-detection: `python main.py -t 0` (automatically detects the optimal thread count)
- Faster processing: Multiple files are processed in parallel
- Smart resource management: Thread count is automatically capped to prevent system overload
- Thread-safe operations: All data structures are properly synchronized
- Progress tracking: Real-time progress updates in verbose mode
- Large directories: 10+ database files
- SSD storage: Fast I/O benefits from parallel processing
- Multi-core systems: Utilizes available CPU cores efficiently
- Time-critical analysis: Faster results for large evidence sets
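The threading behaviour described above can be sketched with the standard library's `ThreadPoolExecutor`. The function names, the cap of 8 workers, and the progress-printing format below are illustrative assumptions, not the tool's actual implementation.

```python
import os
from concurrent.futures import ThreadPoolExecutor, as_completed


def resolve_thread_count(requested: int) -> int:
    """Map the -t argument to an actual worker count.

    0 means auto-detect from the CPU count; the result is capped
    (the cap of 8 here is an illustrative choice) to avoid
    oversubscribing the machine.
    """
    if requested == 0:
        requested = os.cpu_count() or 1
    return max(1, min(requested, 8))


def process_all(paths, process_one, threads=1, verbose=False):
    """Process database files in parallel, collecting per-file results."""
    results = {}
    workers = resolve_thread_count(threads)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(process_one, p): p for p in paths}
        for done, fut in enumerate(as_completed(futures), start=1):
            results[futures[fut]] = fut.result()
            if verbose:  # real-time progress in verbose mode
                print(f"[{done}/{len(futures)}] {futures[fut]}")
    return results
```

Because each future is keyed by its input path, results stay associated with the right file regardless of completion order.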
```
-t, --threads    Number of threads for parallel processing
                 (default: 1, use 0 for auto-detection)
```
- Convert a SQLite file to a JSON file or folder.
- Convert a plist file to a JSON file or folder.
- Nested files embedded in plists can be extracted.
- Encrypted SQLite files are detected.
- Exporting data to CSV is supported.
- Multi-threading support for faster processing (experimental).
- Some tests were generated by AI; not all of them pass yet.
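One common way to implement the encrypted-SQLite detection mentioned above is to check the 16-byte header magic: every plain SQLite database starts with the string `SQLite format 3\0`, while SQLCipher-style encryption scrambles the header as well. The sketch below is a heuristic under that assumption, not necessarily how this tool does it.

```python
SQLITE_MAGIC = b"SQLite format 3\x00"


def is_encrypted_sqlite(path: str) -> bool:
    """Heuristic: a database file whose first 16 bytes are not the
    SQLite magic string is treated as encrypted (or not plain SQLite).
    """
    with open(path, "rb") as f:
        header = f.read(16)
    return header != SQLITE_MAGIC
```

Note that this cannot distinguish an encrypted database from an arbitrary non-SQLite file, so it is best paired with an extension or context check.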