A comprehensive, lightweight Java library that provides developers with simple, consistent, and secure APIs for reading, writing, transforming, and processing multiple file formats including JSON, CSV, XML, Excel, and YAML.
- Multi-Format Support: Read and write JSON, CSV, XML, Excel (XLS/XLSX), and YAML files
- Format Transformation: Convert between any supported formats seamlessly
- Streaming Support: Memory-efficient processing for large files
- Security Features: Built-in encryption/decryption with AES and RSA support
- Compression: GZIP and ZIP compression/decompression capabilities
- Validation: Schema validation for JSON, XML, and CSV formats
- Performance Tracking: Optional performance metrics and logging
- Thread-Safe: Concurrent operations support
- Extensible: Plugin architecture for custom format handlers
- Zero Configuration: Works out-of-the-box with sensible defaults
Maven:

```xml
<dependency>
    <groupId>com.diyawanna</groupId>
    <artifactId>universal-file-toolkit</artifactId>
    <version>1.0.0</version>
</dependency>
```

Gradle (Groovy DSL):

```groovy
implementation 'com.diyawanna:universal-file-toolkit:1.0.0'
```

Gradle (Kotlin DSL):

```kotlin
implementation("com.diyawanna:universal-file-toolkit:1.0.0")
```

Requirements:

- Java 8+ (tested with Java 8, 11, 17, 21)
- Maven 3.6+ or Gradle 6.0+ for building from source
```java
import java.io.File;
import java.nio.file.Paths;
import java.util.List;
import java.util.Map;
import java.util.stream.Stream;

import com.diyawanna.uft.UniversalFileToolkit;
import com.diyawanna.uft.ToolkitConfig;
import com.diyawanna.uft.model.FileFormat;

// Create toolkit instance
ToolkitConfig config = ToolkitConfig.builder()
        .enableLogging(true)
        .enablePerformanceTracking(true)
        .build();
UniversalFileToolkit toolkit = new UniversalFileToolkit(config);

// Read a JSON file
Map<String, Object> data = toolkit.read(
        new File("data.json"),
        FileFormat.JSON,
        Map.class
);

// Write data to CSV
toolkit.write(data, new File("output.csv"), FileFormat.CSV);

// Transform JSON to Excel
toolkit.transform(
        new File("input.json"), FileFormat.JSON,
        new File("output.xlsx"), FileFormat.EXCEL,
        TransformOptions.builder().preserveHeaders(true).build()
);
```

```java
// Read CSV as table
List<Map<String, Object>> rows = toolkit.readAsTable(
        new File("employees.csv"),
        FileFormat.CSV
);

// Stream large files efficiently
try (Stream<Map<String, Object>> stream = toolkit.streamAsTable(
        Paths.get("large-dataset.csv"), FileFormat.CSV)) {
    stream.filter(row -> (Integer) row.get("age") > 25)
          .forEach(System.out::println);
}
```
```java
// Encrypt a file
EncryptionOptions encryptOptions = EncryptionOptions.builder()
        .type(EncryptionType.AES)
        .password("mySecretPassword".toCharArray())
        .build();
File encryptedFile = toolkit.encrypt(new File("sensitive.json"), encryptOptions);

// Decrypt the file
File decryptedFile = toolkit.decrypt(encryptedFile, encryptOptions);
```

```java
// Compress a file
File compressedFile = toolkit.compress(
        new File("large-data.json"),
        CompressionType.GZIP
);

// Decompress
File decompressedFile = toolkit.decompress(compressedFile);
```

```java
ToolkitConfig config = ToolkitConfig.builder()
        .enableLogging(true)
        .enablePerformanceTracking(true)
        .defaultEncryption(EncryptionType.AES)
        .defaultCharset(StandardCharsets.UTF_8)
        .externalLogger(LoggerFactory.getLogger("MyApp"))
        .build();
```
```java
// Validate JSON against schema
ValidationOptions options = ValidationOptions.builder()
        .schemaFile(new File("schema.json"))
        .failFast(false)
        .build();
ValidationResult result = toolkit.validate(
        new File("data.json"),
        FileFormat.JSON,
        options
);
if (!result.isValid()) {
    result.getErrors().forEach(error ->
        System.err.println(error.getPath() + ": " + error.getMessage())
    );
}
```

```java
// Async file writing
CompletableFuture<Void> writeTask = toolkit.writeAsync(
        data,
        Paths.get("output.json"),
        FileFormat.JSON
);

// Async reading
CompletableFuture<List<Map<String, Object>>> readTask =
        toolkit.readAsTableAsync(Paths.get("data.csv"), FileFormat.CSV);

// Handle completion
readTask.thenAccept(rows -> {
    System.out.println("Read " + rows.size() + " rows");
});
```

```java
OperationResult<List<Map<String, Object>>> result =
        toolkit.readWithPerformance(file, FileFormat.CSV, List.class);
PerformanceReport report = result.getPerformanceReport();
System.out.println("Operation took: " + report.getElapsedMillis() + "ms");
System.out.println("Bytes read: " + report.getBytesRead());
System.out.println("Peak memory: " + report.getPeakMemoryBytes());
```

```java
public class CustomFormatPlugin implements FormatPlugin {

    @Override
    public boolean supports(FileFormat format) {
        return format == FileFormat.CUSTOM;
    }

    @Override
    public Reader<?> createReader() {
        return new CustomFormatReader();
    }

    @Override
    public Writer createWriter() {
        return new CustomFormatWriter();
    }
}

// Register plugin
ToolkitConfig config = ToolkitConfig.builder()
        .addPlugin(new CustomFormatPlugin())
        .build();
```

| Format | Read | Write | Streaming | Validation | Notes |
|---|---|---|---|---|---|
| JSON | ✅ | ✅ | ✅ | ✅ (JSON Schema) | Jackson-based |
| CSV | ✅ | ✅ | ✅ | ✅ (Header/Type) | RFC 4180 compliant |
| XML | ✅ | ✅ | ✅ | ✅ (XSD) | DOM/SAX support |
| Excel | ✅ | ✅ | ✅ | ✅ (Basic) | XLS and XLSX |
| YAML | ✅ | ✅ | ✅ | ✅ (Structure) | SnakeYAML-based |
- AES Encryption: 128, 192, 256-bit keys
- RSA Encryption: Public/private key encryption
- Password-based Encryption: PBKDF2 with configurable iterations
- File Hashing: MD5, SHA-1, SHA-256, SHA-512
- Secure Key Handling: Automatic memory cleanup for sensitive data
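As a point of reference only, the list above maps onto standard JDK cryptography primitives. The sketch below shows a PBKDF2 key derivation feeding AES encryption in plain `javax.crypto`; the iteration count, GCM mode, and key size are illustrative assumptions, not the toolkit's documented internals.

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

public class Pbkdf2AesSketch {
    public static void main(String[] args) throws Exception {
        char[] password = "mySecretPassword".toCharArray();
        byte[] salt = new byte[16];
        byte[] iv = new byte[12];
        SecureRandom random = new SecureRandom();
        random.nextBytes(salt);
        random.nextBytes(iv);

        // Derive a 256-bit AES key from the password (iteration count is illustrative)
        PBEKeySpec spec = new PBEKeySpec(password, salt, 65_536, 256);
        byte[] keyBytes = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                .generateSecret(spec).getEncoded();
        SecretKeySpec key = new SecretKeySpec(keyBytes, "AES");

        // Encrypt with AES-GCM (authenticated encryption)
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("sensitive data".getBytes(StandardCharsets.UTF_8));

        // Decrypt with the same key and IV
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] plaintext = cipher.doFinal(ciphertext);
        System.out.println(new String(plaintext, StandardCharsets.UTF_8));

        // Zero the derived key material, in the spirit of the "secure key handling" bullet
        Arrays.fill(keyBytes, (byte) 0);
        spec.clearPassword();
    }
}
```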
- Streaming: Use `streamAsTable()` for files larger than 100 MB
- Memory Efficient: Lazy loading for large datasets
- Thread Safety: All public APIs are thread-safe
- Resource Management: Auto-closeable resources with try-with-resources
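The streaming guidance above is the standard lazy-stream pattern; as a generic illustration with plain JDK streams (not toolkit code), `Files.lines` reads lines on demand and try-with-resources closes the underlying handle:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class LazyStreamSketch {
    public static void main(String[] args) throws IOException {
        // A small sample file standing in for a large dataset
        Path file = Files.createTempFile("rows", ".csv");
        Files.write(file, List.of("name,age", "ann,30", "bob,22", "cam,41"));

        // Files.lines is lazy: rows are read as the stream is consumed,
        // never loaded into memory all at once.
        long over25;
        try (Stream<String> lines = Files.lines(file)) {
            over25 = lines.skip(1) // skip header row
                    .filter(line -> Integer.parseInt(line.split(",")[1]) > 25)
                    .count();
        }
        System.out.println(over25); // 2 rows have age > 25

        Files.delete(file);
    }
}
```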
Check out the examples directory for comprehensive usage examples:
- Basic Operations
- Format Transformations
- Security Features
- Streaming Large Files
- Validation Examples
- Async Operations
The toolkit uses a comprehensive exception hierarchy:
```java
try {
    toolkit.read(file, FileFormat.JSON, Map.class);
} catch (ValidationException e) {
    // Handle validation errors
} catch (SecurityException e) {
    // Handle encryption/decryption errors
} catch (IOProcessingException e) {
    // Handle I/O errors
} catch (ToolkitException e) {
    // Handle general toolkit errors
}
```

```shell
# Clone the repository
git clone https://github.com/Diyawanna/java-universal-file-toolkit.git
cd java-universal-file-toolkit

# Build with Gradle
./gradlew build

# Run tests
./gradlew test

# Generate documentation
./gradlew javadoc
```

```shell
# Run all tests
./gradlew test

# Run integration tests
./gradlew integrationTest

# Generate coverage report
./gradlew jacocoTestReport
```

Coverage reports are available at `build/reports/jacoco/test/html/index.html`.
Complete API documentation is available at: JavaDoc
We welcome contributions! Please see our Contributing Guidelines for details.
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and add tests
- Ensure tests pass: `./gradlew test`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
- Follow Google Java Style Guide
- Use meaningful variable and method names
- Add comprehensive JavaDoc comments
- Maintain test coverage above 80%
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- Documentation: API Docs
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Email: tech@diyawanna.com
- Streaming Excel writing support
- Additional compression algorithms (BZIP2, LZ4)
- JSON Schema Draft 2019-09 support
- Performance optimizations for large files
- Parquet format support
- Database integration (JDBC)
- Cloud storage providers (S3, Azure Blob)
- Advanced transformation functions
- Jackson for JSON/YAML processing
- Apache POI for Excel file support
- Apache Commons CSV for CSV processing
- BouncyCastle for cryptographic functions
- Language: Java
- Build Tool: Gradle
- Dependencies: Minimal, well-established libraries
- Test Coverage: 85%+
- Documentation: 100% API coverage
Made with ❤️ by Diyawanna
⭐ Star this project if you find it useful!