A lightweight numerical computing and machine learning library with its core components implemented in C++. Designed for learners and small-scale projects, Hahaha provides efficient tensor operations, a foundation for automatic differentiation, and essential ML components.
- Tensor Module: Comprehensive tensor data structure with shape management (`TensorShape`), stride control (`TensorStride`), and nested data handling (`NestedData`) for efficient numerical operations (a conceptual sketch of the shape/stride bookkeeping follows this list)
- Automatic Differentiation Ready: Foundation for computational graphs through `ComputeGraph.h`
- High-Performance Computing: Optimized for modern C++23 standards with CUDA 13.0 acceleration support
- Complete Development Environment: Docker-based development container with `.devcontainer` configuration for consistent setup across platforms
- Professional Tooling: Integrated code formatting (`clang-format`), static analysis (`clang-tidy`), and unit testing (Google Test)
- Example Applications: Complete MNIST handwritten digit recognition example demonstrating network layers, activation functions, optimizers, and training workflow
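To make the shape/stride terminology above concrete, here is a small standalone sketch of row-major stride computation and index-to-offset mapping. It does not use Hahaha's API; the function names (`rowMajorStrides`, `flatOffset`) are hypothetical and only illustrate the kind of bookkeeping that `TensorShape` and `TensorStride` encapsulate.

```cpp
// Conceptual sketch only; not Hahaha's API.
#include <cstddef>
#include <iostream>
#include <vector>

// Row-major strides: stride[i] is the product of all dimensions to the right of i.
std::vector<std::size_t> rowMajorStrides(const std::vector<std::size_t>& shape) {
    std::vector<std::size_t> strides(shape.size(), 1);
    for (std::size_t i = shape.size(); i-- > 1;) {
        strides[i - 1] = strides[i] * shape[i];
    }
    return strides;
}

// Map an N-dimensional index to a flat offset into contiguous storage.
std::size_t flatOffset(const std::vector<std::size_t>& index,
                       const std::vector<std::size_t>& strides) {
    std::size_t offset = 0;
    for (std::size_t i = 0; i < index.size(); ++i) {
        offset += index[i] * strides[i];
    }
    return offset;
}

int main() {
    const std::vector<std::size_t> shape = {2, 3, 4};
    const auto strides = rowMajorStrides(shape);                    // {12, 4, 1}
    std::cout << flatOffset({1, 2, 3}, strides) << "\n";            // 1*12 + 2*4 + 3 = 23
}
```

Keeping shape and stride metadata separate from the flat storage is what makes operations such as transposition and slicing cheap: they can be expressed as a new stride table over the same buffer.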
To build and use Hahaha you will need:

- Docker (for containerized development)
- Git
- CMake ≥ 3.10
- NVIDIA GPU with CUDA 13.0 support (optional, for GPU acceleration)
Then, to set up the development container and build the project:

```bash
# Clone the repository
git clone https://github.com/Napbad/Hahaha.git && cd Hahaha

# Build the development image
docker build -t hahaha-dev .

# Run the development container
docker run --gpus all -it --rm -v $(pwd):/workspace hahaha-dev

# Compile the project
mkdir build && cd build
cmake .. && make
```

The project includes full `.devcontainer` support:
- Open this folder in VS Code
- Click "Reopen in Container" when prompted
- The development environment will be automatically configured
#include "math/ds/TensorData.h"
// Create tensors with initializer lists
hahaha::math::TensorData<int> tensor1({{1, 2}, {3, 4}});
hahaha::math::TensorData<double> tensor2({{1.0, 2.0, 3.0}, {4.0, 5.0, 6.0}});
// Single value tensor
hahaha::math::TensorData<float> scalar(42.0f);Contributions are welcome! Please follow these guidelines:
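The `ComputeGraph.h` foundation mentioned above targets reverse-mode automatic differentiation over a graph of operations. The following standalone sketch is not Hahaha's API (every type and function name in it is hypothetical); it only illustrates the core idea a computational graph builds on: each operation records a callback that pushes gradients back to its inputs.

```cpp
// Conceptual sketch of reverse-mode autodiff; not Hahaha's API.
#include <functional>
#include <iostream>
#include <memory>

// A node stores its value, its accumulated gradient, and a callback that
// propagates the gradient to the nodes it was computed from.
struct Node {
    double value = 0.0;
    double grad = 0.0;
    std::function<void()> backward = [] {};
};

using NodePtr = std::shared_ptr<Node>;

NodePtr constant(double v) {
    auto n = std::make_shared<Node>();
    n->value = v;
    return n;
}

// z = x * y; dz/dx = y, dz/dy = x
NodePtr mul(const NodePtr& x, const NodePtr& y) {
    auto z = std::make_shared<Node>();
    z->value = x->value * y->value;
    Node* out = z.get();  // raw pointer avoids a shared_ptr cycle in the capture
    z->backward = [x, y, out] {
        x->grad += y->value * out->grad;
        y->grad += x->value * out->grad;
    };
    return z;
}

// z = x + y; dz/dx = dz/dy = 1
NodePtr add(const NodePtr& x, const NodePtr& y) {
    auto z = std::make_shared<Node>();
    z->value = x->value + y->value;
    Node* out = z.get();
    z->backward = [x, y, out] {
        x->grad += out->grad;
        y->grad += out->grad;
    };
    return z;
}

int main() {
    // f(a, b) = a * b + a
    auto a = constant(2.0);
    auto b = constant(3.0);
    auto prod = mul(a, b);
    auto f = add(prod, a);

    // Seed the output gradient, then walk the graph in reverse order.
    f->grad = 1.0;
    f->backward();
    prod->backward();

    std::cout << "f = " << f->value << "\n";     // 8
    std::cout << "df/da = " << a->grad << "\n";  // b + 1 = 4
    std::cout << "df/db = " << b->grad << "\n";  // a = 2
}
```

Running the recorded callbacks in reverse topological order accumulates df/da = 4 and df/db = 2; a full computational graph generalizes this pattern to tensors and many operations.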
Contributions are welcome! Please follow these guidelines:

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a pull request
Please ensure your code follows the project's coding standards by running `format.sh` before submitting.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.