
Migrates to use module federation for plugins #61

Merged
elliotBraem merged 10 commits into main from fix/docker-build-native-modules
Feb 24, 2025

Conversation


@elliotBraem elliotBraem commented Feb 24, 2025

Summary by CodeRabbit

  • New Features

    • Enhanced content processing and distribution via streamlined transformation and remote plugin integrations, including AI-assisted and object-based transforms.
  • Documentation

    • Expanded user and developer guides covering plugin configuration, system architecture, and streamlined content sourcing features.
  • Refactor

    • Consolidated application startup, improved service orchestration, and upgraded asynchronous database handling for greater reliability.
  • Tests

    • Introduced comprehensive test suites for core processing, ensuring robust error handling and system stability.
  • Chores

    • Updated build configurations, dependency versions, and production environment settings for improved deployment.

elliotBraem and others added 9 commits February 23, 2025 16:37
…55)

* adds memory bank and plugin loader

* adds plugin service

* wip

* working distributor plugins w/ module federation, some TODOs

* working, nice, clean plugin service

* fmt

* addresses comments

* set the correct remotes

* update memory bank and documentation

* implements transform plugins

* update memories

* fmt

* fmt

* fix docs for ai-transform

* fmt

* remove broken link

* fix docs

* fmt

* passing tests

* adds tests

* adds multi item error
@vercel

vercel bot commented Feb 24, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name | Status | Preview | Comments | Updated (UTC)
curatedotfun-app | ✅ Ready | Visit Preview | 💬 Add feedback | Feb 24, 2025 6:22pm
curatedotfun-docs | ✅ Ready | Visit Preview | 💬 Add feedback | Feb 24, 2025 6:22pm
curatedotfun-landing | ✅ Ready | Visit Preview | 💬 Add feedback | Feb 24, 2025 6:22pm

@coderabbitai

coderabbitai bot commented Feb 24, 2025

Walkthrough

This pull request implements extensive modifications across the project. Build scripts and dependency management in the Dockerfile and package.json have been updated, while several test suites, configuration files, and database queries have been refactored for asynchronous operations with better error handling. New services for plugin management, processing, and transformation are introduced (including the PluginService and ProcessorService), while outdated external plugin files have been removed. Documentation and configuration files have been revised to reflect updated plugin URLs, module federation, and enhanced transformation pipelines.

Changes

File(s) Change Summaries
Dockerfile Updated stage names (lowercase → uppercase), added backend build dependencies, refined package-copy commands, updated production base image, added secure environment variables, and included additional LiteFS tools.
backend/package.json Removed "type": "module", changed build scripts from bun to rspack build, updated start/dev scripts, added new devDependencies (@module-federation/node, @rspack/cli, concurrently, wait-on, better-sqlite3), updated dotenv, and removed @libsql/client.
Test Suites (backend/src/tests/*) Added a comprehensive test suite for DistributionService, ProcessorService, TransformationService; introduced new mocks and updated test assertions to reflect ProcessorService usage.
Configuration & Plugin Files Introduced new files (rspack.config.js, plugin.service.ts), updated ConfigService with new methods, added TransformationService (in one file) and removed legacy transformer files, and refactored database queries and operations to use better-sqlite3 with async patterns.
App and Route Files (app.ts, index.ts, routes/test.ts) Refactored application initialization using createApp/getInstance, replaced legacy main function with startServer, and enhanced request body type validations.
External & Removed Files Deleted external plugin files for GPT, Notion, RSS, Simple Transform, and Telegram to deprecate outdated integrations.
Documentation & Config Files (curate.config.json, docs/*, sidebars.ts, rsbuild.config.ts)* Updated plugin URLs to remote entries on unpkg, added new documentation on plugin types (sources, distributors, transformers), and restructured sidebars for enhanced navigation.
Memory Bank & Project Context Files Added new documents detailing active context, product brief, progress, system patterns, and technical context for improved internal tracking and planning.
Database & Query Files (db/index.ts, operations.ts, queries.ts, twitter queries) Refactored to use better-sqlite3 with asynchronous operations, updating method signatures and improving error handling.
package.json Added new dependency @curatedotfun/types and updated the production entry point from dist/index.js to dist/main.js.

Sequence Diagram(s)

sequenceDiagram
    participant C as Client
    participant S as Server (Bun.serve)
    participant A as App (createApp / getInstance)
    participant P as ProcessorService
    participant PS as PluginService
    participant DB as Database

    C->>S: HTTP Request
    S->>A: startServer()
    A->>P: Process submission
    P->>PS: Request plugin instance
    PS-->>P: Return plugin instance
    P->>DB: Execute async DB queries
    DB-->>P: Return query results
    P->>S: Processed response
    S-->>C: HTTP Response

Poem

I’m a little rabbit hopping through code,
With new plugins and tests that lighten our load.
Dependencies updated and errors tamed,
Each change pens a verse in our digital frame.
Hop along, dear coder, in this joyful ode!
🐰💻


🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Generate unit testing code for this file.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai generate unit testing code for this file.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and generate unit testing code.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
    • @coderabbitai help me debug CodeRabbit configuration file.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR. (Beta)
  • @coderabbitai resolve resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • Please see the configuration documentation for more information.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 15

🔭 Outside diff range comments (3)
curate.config.json (1)

1298-1433: ⚠️ Potential issue

Fix duplicate feed ID.

The arbitrum feed ID appears twice in the configuration (lines 1329-1342 and lines 1389-1402).

Remove one of the duplicate feed configurations to avoid conflicts.
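A quick standalone check like the following can catch this class of problem before deployment. This is a hypothetical helper, not part of the PR; it assumes each feed entry carries an `id` field, which should be adjusted to the real config schema:

```typescript
// Hypothetical helper: detect duplicate feed IDs in a curate.config.json-style
// feeds array. The `id` field name is assumed from context.
interface FeedConfig {
  id: string;
}

function findDuplicateFeedIds(feeds: FeedConfig[]): string[] {
  const seen = new Set<string>();
  const duplicates = new Set<string>();
  for (const feed of feeds) {
    if (seen.has(feed.id)) {
      duplicates.add(feed.id);
    }
    seen.add(feed.id);
  }
  return [...duplicates];
}

// Example: "arbitrum" appears twice, mirroring the issue flagged above.
const feeds: FeedConfig[] = [
  { id: "ethereum" },
  { id: "arbitrum" },
  { id: "arbitrum" },
];
console.log(findDuplicateFeedIds(feeds)); // → [ "arbitrum" ]
```

Running a script like this in CI would turn the duplicate into a build failure rather than a runtime conflict.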

backend/src/services/db/index.ts (1)

77-205: ⚠️ Potential issue

Ensure consistency between synchronous and async calls
This class mixes synchronous calls (.execute()) and async calls (await queries.*). Make sure that you handle all methods consistently, and that your code or framework can handle partial commits if an async operation fails after a synchronous operation succeeded.

backend/src/services/db/queries.ts (1)

20-43: 🛠️ Refactor suggestion

Transaction boundaries
Within upsertFeeds, you apply a transaction for multiple insert/upserts, which is beneficial for atomic commits. Ensure you handle potential concurrency issues if other transactions are inserting or updating feeds simultaneously. You may need row-level locking or conflict resolution logic for large-scale concurrency.

🧹 Nitpick comments (69)
memory-bank/systemPatterns.md (1)

112-118: Nitpick: Consider Compound Adjective Hyphenation

On line 113, the bullet item "High performance" could be considered a compound adjective modifying a noun. For consistency and clarity, consider hyphenating it to "High-performance."

Proposed diff:

-   - High performance
+   - High-performance
🧰 Tools
🪛 LanguageTool

[uncategorized] ~113-~113: If this is a compound adjective that modifies the following noun, use a hyphen.
Context: ...isions 1. Elysia.js Framework - High performance - Built-in TypeScript support - M...

(EN_COMPOUND_ADJECTIVE_INTERNAL)

docs/docs/plugins/distributors/rss.md (2)

32-37: Add validation rules for configuration options.

Consider adding validation rules or constraints for each configuration option. For example:

  • What characters are allowed in the title?
  • Is there a maximum length for the description?
  • What URL formats are accepted for feedUrl and siteUrl?
  • What language codes are supported?

55-57: Add more development tips and troubleshooting guide.

The development section could be enhanced with:

  • Common issues and their solutions
  • Best practices for RSS feed configuration
  • Examples of valid and invalid configurations
docs/docs/plugins/distributors/supabase.md (1)

69-79: Add schema validation and constraints documentation.

The schema configuration section should include:

  1. Data type constraints and limitations
  2. Required vs optional fields
  3. Examples of custom field types
  4. Performance implications of different schema designs
docs/docs/plugins/distributors/index.md (2)

30-33: Add version information to plugin URLs.

The plugin URL should include a version to ensure consistent behavior.

-            "plugin": "@curatedotfun/telegram",
+            "plugin": "@curatedotfun/telegram@1.0.0",

76-78: Add error handling and retry configuration example.

The tip about error handling and retries should include a configuration example.

Add this example after the tip:

{
  "retries": {
    "maxAttempts": 3,
    "backoff": {
      "type": "exponential",
      "initialDelay": 1000
    }
  },
  "errorHandling": {
    "ignoreErrors": ["RateLimitError"],
    "fallbackValue": null
  }
}
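The exponential backoff in that suggested config could be computed as follows. This is purely illustrative; the config shape above is itself a proposal, not an existing API:

```typescript
// Illustrative: compute the delay schedule for the suggested retry config.
// The config shape is hypothetical, mirroring the JSON example above.
interface BackoffConfig {
  type: "exponential" | "fixed";
  initialDelay: number; // milliseconds
}

function backoffDelays(maxAttempts: number, backoff: BackoffConfig): number[] {
  const delays: number[] = [];
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    delays.push(
      backoff.type === "exponential"
        ? backoff.initialDelay * 2 ** attempt // double the delay each attempt
        : backoff.initialDelay,
    );
  }
  return delays;
}

console.log(backoffDelays(3, { type: "exponential", initialDelay: 1000 }));
// → [ 1000, 2000, 4000 ]
```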
memory-bank/progress.md (2)

88-90: Fix formatting in Performance section.

Add periods at the end of each bullet point for consistency.

-Need to optimize transform chain execution
-Plugin loading could be more efficient
-Caching strategy needs improvement
+Need to optimize transform chain execution.
+Plugin loading could be more efficient.
+Caching strategy needs improvement.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~90-~90: A period might be missing here.
Context: ...more efficient - Caching strategy needs improvement ### Error Handling - Some edge cases i...

(AI_EN_LECTOR_MISSING_PUNCTUATION_PERIOD)


93-95: Fix grammar in Error Handling section.

Add "the" where appropriate and ensure consistent formatting.

-Some edge cases in transform chain might need better handling
-Need more comprehensive error reporting
-Better error recovery strategies needed
+Some edge cases in the transform chain might need better handling.
+Need more comprehensive error reporting.
+Better error recovery strategies needed.
🧰 Tools
🪛 LanguageTool

[uncategorized] ~93-~93: You might be missing the article “the” here.
Context: ...### Error Handling - Some edge cases in transform chain might need better handling - Need...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)

docs/docs/plugins/transformers/simple-transform.md (1)

7-7: Add security considerations for template-based transformations.

While the documentation comprehensively covers usage, it should include security considerations when using Mustache templates, especially for user-provided content.

Consider adding a security section that covers:

  • Template injection risks
  • Input validation best practices
  • Safe handling of user-provided templates
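To make the injection concern concrete: Mustache conventionally HTML-escapes double-brace slots while triple-brace slots pass content through raw, so user-provided content should never flow through an unescaped slot. A standalone sketch of the difference (not the plugin's actual code):

```typescript
// Standalone sketch of why unescaped interpolation is risky with
// user-provided content. Not the plugin's implementation.
function escapeHtml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

const userInput = '<script>alert("xss")</script>';

// Escaped slot ({{content}} in Mustache terms): payload is neutralized.
const safe = `<p>${escapeHtml(userInput)}</p>`;

// Unescaped slot ({{{content}}}): the payload survives intact.
const unsafe = `<p>${userInput}</p>`;

console.log(safe.includes("<script>"));   // → false
console.log(unsafe.includes("<script>")); // → true
```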
docs/docs/plugins/sources/twitter.md (1)

144-167: Add language specifiers to code blocks.

The error code blocks are missing language specifiers, which affects syntax highlighting.

Add language specifiers to the code blocks:

-   ```
+   ```text
    Error: Rate limit exceeded
    ```

-   ```
+   ```text
    Error: Authentication failed
    ```

-   ```
+   ```text
    Error: Invalid tweet format
    ```
🧰 Tools
🪛 markdownlint-cli2 (0.17.2)

145-145: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


153-153: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


161-161: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

docs/docs/plugins/transformers/ai-transform.md (1)

99-100: Add example system prompts.

Consider adding more examples of system prompts to help users understand how to write effective prompts for different use cases.

For example:

  • News summarization: "You are a news editor that summarizes articles in a concise and engaging way. Focus on key facts, maintain journalistic tone, and highlight the most newsworthy aspects."
  • Technical documentation: "You are a technical writer that explains complex concepts in clear, accessible language. Break down technical details into digestible sections and provide relevant examples."
  • Social media: "You are a social media manager that creates engaging content. Transform long-form content into short, attention-grabbing posts while maintaining key information and using appropriate tone."
docs/docs/plugins/distributors/notion.md (1)

73-77: Fix punctuation in configuration reference.

Add a colon after "You need to specify" to maintain consistency with the documentation style.

Apply this diff to fix the punctuation:

-You need to specify
+You need to specify:

- `token`: Notion Internal Integration Token
- `databaseId`: Your database ID extracted from the URL
🧰 Tools
🪛 LanguageTool

[uncategorized] ~75-~75: Loose punctuation mark.
Context: ...ference You need to specify: - token: Notion Internal Integration Token - `da...

(UNLIKELY_OPENING_PUNCTUATION)

backend/src/app.ts (1)

195-196: Restrict invalid submission statuses.

Currently, the code filters submissions by any provided status string. If an invalid status is passed, no error is thrown, which can be confusing. Consider enforcing a well-defined subset of valid statuses (e.g., "approved", "rejected", etc.), returning an error for unrecognized statuses.
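One way to implement that restriction is an explicit allow-list at the parsing boundary. A sketch, with status names assumed from context:

```typescript
// Sketch: validate a status query parameter against an explicit allow-list,
// rather than silently filtering on any string. Status names are illustrative.
const VALID_STATUSES = ["pending", "approved", "rejected"] as const;
type SubmissionStatus = (typeof VALID_STATUSES)[number];

function parseStatus(raw: string): SubmissionStatus {
  if (!(VALID_STATUSES as readonly string[]).includes(raw)) {
    throw new Error(
      `Invalid status "${raw}". Expected one of: ${VALID_STATUSES.join(", ")}`,
    );
  }
  return raw as SubmissionStatus;
}

console.log(parseStatus("approved")); // → approved
// parseStatus("bogus") throws instead of silently returning an empty result set.
```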

backend/src/__tests__/processor.service.test.ts (1)

103-131: Expose partial distribution errors to improve debugging.

When one distributor fails but others succeed, the test confirms the partial success scenario. However, consider capturing and returning partial failures, so the caller is aware of which distributors encountered errors.

backend/src/routes/test.ts (1)

54-60: Simplify body destructuring in the route handler.

Instead of re-defining the types in lines 54-60, you can rely on the already specified body validation in lines 73-79 or directly destructure body parameters in the handler’s signature for cleaner code:

-async ({
-  body,
-}: {
-  body: {
-    id: string;
-    text: string;
-    username: string;
-    ...
-  };
-}) => {
-  const { id, text, username, inReplyToStatusId, hashtags } = body as {
-    ...
-  };
+async ({ body: { id, text, username, inReplyToStatusId, hashtags } }) => {
docs/docs/plugins/transformers/index.md (3)

23-50: Configuration example needs enhancement.

The configuration example should include the plugin type for better clarity and type safety.

Update the configuration example to include plugin type:

 {
   "feeds": [{
     "outputs": {
       "stream": {
         "transform": [
-          // Global stream transforms
+          {
+            "plugin": "@curatedotfun/simple-transform",
+            "type": "transform",
+            "config": {
+              // Global stream transforms configuration
+            }
+          }
           // Applied to all content in this feed's stream
         ],
         "distribute": [{
           "transform": [
-            // Per-distributor transforms
+            {
+              "plugin": "@curatedotfun/object-transform",
+              "type": "transform",
+              "config": {
+                // Per-distributor transforms configuration
+              }
+            }
             // Applied only to content going to this distributor
           ]
         }]
       },
       "recap": {
         "transform": [
-          // Recap transforms
+          {
+            "plugin": "@curatedotfun/ai-transform",
+            "type": "transform",
+            "config": {
+              // Recap transforms configuration
+            }
+          }
           // Applied to content in scheduled recaps
         ]
       }
     }
   }]
 }

58-69: Type safety section needs more details.

The type safety section should include examples of type constraints and error handling.

Add more detailed examples:

// Example of type constraints
interface ContentInput {
  title: string;
  content: string;
}

interface EnhancedOutput {
  title: string;
  content: string;
  summary: string;
}

interface TransformerPlugin<TInput, TOutput, TConfig> {
  transform(args: { input: TInput, config: TConfig }): Promise<TOutput>;
}

// Example implementation
class ContentEnhancer implements TransformerPlugin<ContentInput, EnhancedOutput, { prompt: string }> {
  async transform(args: { input: ContentInput, config: { prompt: string } }): Promise<EnhancedOutput> {
    // Implementation
  }
}

84-89: Mermaid diagram could be more detailed.

The transform flow diagram could better illustrate error handling and optional transforms.

Consider updating the diagram:

 graph TD
-    Input[Content Input] --> Global[Global Transform]
-    Global --> Distributor[Per-Distributor Transform]
-    Distributor --> Output[Final Output]
+    Input[Content Input] --> Global[Global Transform]
+    Global --> GlobalError{Error?}
+    GlobalError -->|Yes| ErrorHandler[Error Handler]
+    GlobalError -->|No| Distributor[Per-Distributor Transform]
+    Distributor --> DistributorError{Error?}
+    DistributorError -->|Yes| ErrorHandler
+    DistributorError -->|No| Output[Final Output]
+    ErrorHandler --> FallbackOutput[Fallback Output]
backend/src/services/transformation/transformation.service.ts (3)

15-33: Consider optimizing object merging.

The combineResults method could be optimized for performance and type safety.

Consider this optimization:

-  private combineResults(prevResult: unknown, newResult: unknown): unknown {
+  private combineResults<T extends Record<string, unknown>>(
+    prevResult: T | unknown,
+    newResult: T | unknown
+  ): T | unknown {
     // If both are objects (not arrays), merge them with new values taking precedence
     if (
       typeof prevResult === "object" &&
       prevResult !== null &&
       !Array.isArray(prevResult) &&
       typeof newResult === "object" &&
       newResult !== null &&
       !Array.isArray(newResult)
     ) {
-      return {
-        ...(prevResult as Record<string, unknown>),
-        ...(newResult as Record<string, unknown>),
-      };
+      return Object.assign({}, prevResult as T, newResult as T);
     }
 
     // Otherwise return the new result (string will just return)
     return newResult;
   }
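To make the merge semantics concrete, here is the behavior inferred from the snippet above in runnable form: plain objects shallow-merge with new keys winning, while any other pairing returns the newer value wholesale.

```typescript
// Minimal restatement of the combineResults behavior shown above:
// plain objects shallow-merge (new keys win); anything else is replaced.
function combineResults(prev: unknown, next: unknown): unknown {
  const isPlainObject = (v: unknown): v is Record<string, unknown> =>
    typeof v === "object" && v !== null && !Array.isArray(v);
  if (isPlainObject(prev) && isPlainObject(next)) {
    return { ...prev, ...next };
  }
  return next;
}

console.log(combineResults({ title: "a", body: "b" }, { title: "c" }));
// → { title: 'c', body: 'b' }
console.log(combineResults("old text", "new text")); // → new text
```

Note the merge is shallow: nested objects under the same key are replaced, not deep-merged, which is worth documenting for transform authors.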

38-93: Add performance optimization and testing hooks.

The applyTransforms method could benefit from performance optimization and testing capabilities.

Consider these enhancements:

   async applyTransforms(
     content: any,
     transforms: TransformConfig[] = [],
     stage: TransformStage = "global",
+    options: {
+      timeout?: number;
+      onTransformStart?: (transform: TransformConfig, index: number) => void;
+      onTransformComplete?: (transform: TransformConfig, index: number, duration: number) => void;
+    } = {}
   ) {
     let result = content;
 
     for (let i = 0; i < transforms.length; i++) {
       const transform = transforms[i];
       try {
+        options.onTransformStart?.(transform, i);
+        const startTime = Date.now();
+
+        // Add timeout protection
+        const transformPromise = this.pluginService.getPlugin(transform.plugin, {
+          type: "transform",
+          config: transform.config,
+        });
+
+        const plugin = options.timeout
+          ? await Promise.race([
+              transformPromise,
+              new Promise((_, reject) =>
+                setTimeout(() => reject(new Error('Transform timeout')), options.timeout)
+              )
+            ])
+          : await transformPromise;

         const args: ActionArgs<any, Record<string, unknown>> = {
           input: result,
           config: transform.config,
         };
 
         logger.debug(
           `Applying ${stage} transform #${i + 1} (${transform.plugin})`,
         );
         const transformResult = await plugin.transform(args);
 
         // Validate transform output
         if (transformResult === undefined || transformResult === null) {
           throw new TransformError(
             transform.plugin,
             stage,
             i,
             "Transform returned null or undefined",
           );
         }
 
         // Combine results, either merging objects or using new result
         result = this.combineResults(result, transformResult);
+
+        const duration = Date.now() - startTime;
+        options.onTransformComplete?.(transform, i, duration);
+
       } catch (error) {
         // If it's already a TransformError, rethrow it
         if (error instanceof TransformError) {
           throw error;
         }
 
         // Otherwise wrap it in a TransformError
         throw new TransformError(
           transform.plugin,
           stage,
           i,
           error instanceof Error ? error.message : "Unknown error",
           error instanceof Error ? error : undefined,
         );
       }
     }
 
     return result;
   }

95-97: Add cleanup error handling.

The shutdown method should handle cleanup errors gracefully.

   async shutdown(): Promise<void> {
-    await this.pluginService.cleanup();
+    try {
+      await this.pluginService.cleanup();
+    } catch (error) {
+      logger.error('Error during transformation service shutdown:', error);
+      throw error;
+    }
   }
memory-bank/activeContext.md (2)

31-37: Add version information for transformation pipeline features.

Consider adding version information to track when features were introduced.

 New transformation pipeline features:
-Multiple transformations can be chained
-Global transforms apply to all distributions
-Per-distributor transforms for customization
-Graceful error handling and recovery
-Type-safe transformation flow
+- Multiple transformations can be chained (v1.0.0)
+- Global transforms apply to all distributions (v1.0.0)
+- Per-distributor transforms for customization (v1.0.0)
+- Graceful error handling and recovery (v1.0.0)
+- Type-safe transformation flow (v1.0.0)

57-75: Add priority levels to next steps.

Consider adding priority levels and estimated timelines to next steps.

 ## Next Steps
 1. Testing Infrastructure
+   Priority: High
+   Timeline: Q1 2025
    - Add unit tests for new services
    - Add integration tests for pipeline
    - Create test fixtures for transforms
    - Document testing patterns

 2. Plugin Development
+   Priority: Medium
+   Timeline: Q2 2025
    - Create transform plugin templates
    - Document new transform capabilities
    - Add plugin validation tools
    - Create example transformers

 3. Documentation
+   Priority: High
+   Timeline: Q1 2025
    - Update plugin documentation
    - Add transformation guides
    - Document error handling
    - Add configuration examples
backend/src/index.ts (3)

13-25: Consider concurrency safeguards when creating the singleton instance.
If multiple calls to getInstance occur concurrently, you could inadvertently create multiple instances before instance is set. A simple boolean lock or a memoized promise can help.

 let instance: AppInstance | null = null;
+let creatingInstance = false;

 async function getInstance(): Promise<AppInstance> {
+  if (creatingInstance) {
+    // Potentially wait for the instance creation to complete or handle concurrency as desired.
+    // e.g., a separate 'pendingPromise' could be returned.
+  }
   if (!instance) {
     try {
       creatingInstance = true;
       instance = await createApp();
     } catch (error) {
       logger.error("Failed to create app instance:", error);
       throw new Error("Failed to initialize application");
     } finally {
+      creatingInstance = false;
     }
   }
   return instance;
 }
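The memoized-promise variant mentioned above avoids the boolean flag entirely: concurrent callers share one in-flight creation. A sketch, with `AppInstance` as a stand-in for the real type:

```typescript
// Sketch of the memoized-promise singleton: the promise is cached
// synchronously, so concurrent callers share one createApp() call.
// AppInstance and createApp are stand-ins for the real implementations.
interface AppInstance {
  name: string;
}

let instancePromise: Promise<AppInstance> | null = null;
let createCalls = 0;

async function createApp(): Promise<AppInstance> {
  createCalls++;
  return { name: "app" };
}

function getInstance(): Promise<AppInstance> {
  if (!instancePromise) {
    // A failed creation clears the cache so a later call can retry.
    instancePromise = createApp().catch((err) => {
      instancePromise = null;
      throw err;
    });
  }
  return instancePromise;
}

// Two concurrent callers resolve to the same instance from one createApp().
Promise.all([getInstance(), getInstance()]).then(([a, b]) => {
  console.log(createCalls, a === b); // → 1 true
});
```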

64-85: Consider handling SIGTERM in addition to SIGINT.
In many containerized environments, SIGTERM is the primary shutdown signal, so catching it promotes cleaner shutdowns.

 process.on("SIGINT", async () => {
+// Also handle SIGTERM for Docker or Kubernetes:
+process.on("SIGTERM", async () => {
   startSpinner("shutdown", "Shutting down gracefully...");
   try {
     const shutdownPromises = [];
     if (context.twitterService)
       shutdownPromises.push(context.twitterService.stop());
     // ...

88-95: Graceful fallback or retry
Force-closing the process with process.exit(1) upon startup errors might be too abrupt. Add a fallback or a retry mechanism if partial services are mandatory.

backend/src/types/errors.ts (1)

1-30: Potential enhancement for native error causes.
These custom error classes define a cause property, but modern Node supports the built-in cause in ErrorOptions. Using that may preserve stack traces more elegantly.

-super(message);
+super(message, { cause });
 this.name = "PluginError";
backend/src/utils/plugin.ts (1)

27-29: Mismatch between comment and actual string slicing.
Your comment indicates returning the first 8 hex characters, but the code slices up to 16. Update the comment or reduce the slice for consistency.

-// Return first 8 chars of hex digest for a reasonably short but unique key
+// Return first 16 chars of hex digest for a reasonably short but unique key
 return hash.digest("hex").slice(0, 16);
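For context, the pattern in question can be reproduced standalone with node:crypto (this is a sketch mirroring the snippet above, not the file itself):

```typescript
// Standalone reproduction of the cache-key pattern: SHA-256 of the plugin
// config, truncated to 16 hex characters (matching the code, not the
// stale "8 chars" comment).
import { createHash } from "node:crypto";

function configCacheKey(config: Record<string, unknown>): string {
  const hash = createHash("sha256");
  hash.update(JSON.stringify(config));
  // Return first 16 chars of the hex digest for a short but unique key.
  return hash.digest("hex").slice(0, 16);
}

const key = configCacheKey({ plugin: "@curatedotfun/telegram" });
console.log(key.length); // → 16
```

Note that `JSON.stringify` is key-order sensitive, so two semantically equal configs with different key order would produce different keys; that may or may not matter for this cache.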
backend/src/services/rss/queries.ts (1)

51-68: Error handling in deleteOldRssItems
While this function correctly selects items to keep and deletes the rest, it lacks any error handling. If the delete operation fails, the error will propagate to the caller uncaught. Consider adding a try/catch to log or handle potential issues.

backend/package.json (1)

32-36: Pin versions for new devDependencies
You introduced several dependencies (@module-federation/node, @rspack/cli, concurrently, wait-on) without pinning exact versions (e.g., "latest" for @rspack/cli). This can cause unpredictable builds. Consider pinning or at least using caret versions to avoid breaking changes.

Also applies to: 42-43

backend/src/services/processor/processor.service.ts (2)

13-17: Constructor dependency injection
Injecting both TransformationService and DistributionService is a solid approach for modularity. Consider adding an interface or type alias for these if they ever need to be swapped out or mocked in testing.


117-224: Batch processing approach
Your processBatch method effectively processes items individually, then applies batch transforms before distribution. This design is flexible, but watch out for large batch sizes, as transformations might become memory-intensive. Consider streaming or chunking if you expect very large inputs.

docs/docs/plugins/sources/index.md (2)

21-61: Great Use of JSON Examples

Lines 21-34 and 36-61 do an excellent job of showing both a minimal and a more complete JSON configuration snippet for the sources array. Including realistic placeholders (e.g., "username": "curatedotfun") helps demonstrate how end users might configure a plugin in real projects.

If you’d like to further enhance clarity, consider adding a brief explanation of each field in the JSON (e.g., "monitorMentions", "monitorHashtags").


90-101: Comprehensive Best Practices

Lines 90-101 provide a concise set of best practices (error handling, rate limiting, retries, etc.). Suggest adding a code snippet or real-world scenario to illustrate error-handling or rate-limiting configurations in future updates. This can give users a deeper understanding of how to implement these recommendations.

Overall, this section is thorough.

backend/src/services/submissions/submission.service.ts (2)

299-303: Processing Stream Output

Lines 299-303 call this.processorService.process(existingSubmission || submission!, feed.outputs.stream). This is a clear asynchronous flow for approved feeds. Consider logging success/failure more distinctly around this call to track individual feed completions.


338-341: Processing Stream Output (Second Occurrence)

Lines 338-341 duplicate the same logic for feed processing. If future expansions complicate this logic, refactoring to a shared helper function (e.g., processFeedSubmission(...)) might help maintain DRY principles.

backend/src/__tests__/distribution.service.test.ts (1)

78-93: Consider adding error assertion.

While the test correctly verifies that the method doesn't throw, it would be more robust to assert that the error was properly handled.

-      // Should complete without throwing
-      await distributionService.distributeContent(distributor, mockSubmission);
-      // Test passes if we reach here
+      const result = await distributionService.distributeContent(distributor, mockSubmission);
+      expect(result).toBeUndefined();
+      // Optionally verify error was logged if you have a mock logger
docs/docs/plugins/build-plugin.md (1)

127-135: Consider adding deployment checklist.

The deployment section could benefit from a pre-deployment checklist to ensure all necessary steps are completed.

memory-bank/techContext.md (2)

80-96: Remove duplicate heading.

Static analysis detected word repetition in the "Development" heading.

-### Plugin Development
+### Plugin Tools and Features
🧰 Tools
🪛 LanguageTool

[duplication] ~80-~80: Possible typo: you repeated a word.
Context: ...: Post monitoring (planned) ### Plugin Development - Development Tools * Plugin development kit * Ty...

(ENGLISH_WORD_REPEAT_RULE)


127-132: Consolidate environment-related requirements.

Static analysis detected word repetition. Consider consolidating environment-related requirements.

-### Requirements
-- Node.js environment
-- Environment variables configuration
-- Plugin dependencies
-- Frontend build artifacts
+### System Requirements
+- Node.js runtime setup
+- Configuration variables
+- Plugin dependencies
+- Frontend build artifacts
🧰 Tools
🪛 LanguageTool

[duplication] ~128-~128: Possible typo: you repeated a word.
Context: ... Deployment ### Requirements - Node.js environment - Environment variables configuration - Plugin depend...

(ENGLISH_WORD_REPEAT_RULE)

backend/src/services/config/config.service.ts (2)

84-90: Consider caching the plugin registry.

Since the plugin registry is frequently accessed and doesn't change during runtime, consider caching it to improve performance.

 private config: AppConfig | null = null;
+private pluginRegistry: PluginsConfig | null = null;
 
 public getPluginRegistry(): PluginsConfig {
   if (!this.config) {
     throw new Error("Config not loaded. Call loadConfig() first.");
   }
-  const config = this.getConfig();
-  return config.plugins;
+  if (!this.pluginRegistry) {
+    this.pluginRegistry = this.getConfig().plugins;
+  }
+  return this.pluginRegistry;
 }

100-108: Optimize feed lookup performance.

For better performance with large feed lists, consider using a Map for O(1) lookup instead of Array.find().

+private feedConfigMap: Map<string, FeedConfig> | null = null;
+
+private initFeedConfigMap(): void {
+  if (!this.config) {
+    throw new Error("Config not loaded. Call loadConfig() first.");
+  }
+  this.feedConfigMap = new Map(
+    this.config.feeds.map(feed => [feed.id.toLowerCase(), feed])
+  );
+}
+
 public getFeedConfig(feedId: string): FeedConfig | undefined {
   if (!this.config) {
     throw new Error("Config not loaded. Call loadConfig() first.");
   }
-  const config = this.getConfig();
-  return config.feeds.find(
-    (feed) => feed.id.toLowerCase() === feedId.toLowerCase(),
-  );
+  if (!this.feedConfigMap) {
+    this.initFeedConfigMap();
+  }
+  return this.feedConfigMap!.get(feedId.toLowerCase());
 }
docs/docs/plugins/index.md (1)

76-77: Fix duplicate word in transformer description.

Remove the duplicate word "Transform" in the heading and description.

-#### [🤖 AI Transform](./transformers/ai-transform.md)
-Transform content using OpenRouter's GPT models for AI-powered content enhancement.
+#### [🤖 AI Transform](./transformers/ai-transform.md)
+Use OpenRouter's GPT models for AI-powered content enhancement.
🧰 Tools
🪛 LanguageTool

[duplication] ~76-~76: Possible typo: you repeated a word.
Context: ...atabase. ### Transformers #### 🤖 AI Transform Transform content using OpenRouter's GPT models f...

(ENGLISH_WORD_REPEAT_RULE)

backend/src/services/distribution/distribution.service.ts (1)

39-43: Caution logging the entire error object.
Consider whether sensitive data may leak through the logs.
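One hedged mitigation is to redact likely-sensitive keys before handing the error context to the logger; the key list below is an illustrative assumption:

```typescript
// Shallow redaction of known-sensitive keys before logging.
const SENSITIVE_KEYS = new Set(["password", "apiKey", "token", "secret"]);

function redact(obj: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    out[key] = SENSITIVE_KEYS.has(key) ? "[REDACTED]" : value;
  }
  return out;
}
```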

Dockerfile (1)

8-8: Consider using a specific Node.js version for better reproducibility.

While using node:20 works, it's better to pin to a specific version (e.g., node:20.11.1) to ensure consistent builds.

-FROM node:20 AS frontend-builder
+FROM node:20.11.1 AS frontend-builder
curate.config.json (1)

57-87: Consider extracting common transform configuration.

The transform configuration is duplicated across multiple feeds. Consider extracting it into a reusable template.

Create a common transform configuration at the root level:

{
  "global": {
    ...
  },
+ "transformTemplates": {
+   "default": [
+     {
+       "plugin": "@curatedotfun/object-transform",
+       "config": {
+         "mappings": {
+           "source": "https://x.com/{{username}}/status/{{tweetId}}",
+           "content": "{{content}}",
+           "author": "{{username}}",
+           "notes": "{{curator.notes}}",
+           "submittedAt": "{{submittedAt}}"
+         }
+       }
+     },
+     {
+       "plugin": "@curatedotfun/ai-transform",
+       "config": {
+         "prompt": "Summarize the content into a concise news flash, incorporating relevant details from the curator's notes. Maintain a neutral, third-person tone. Mention the author if relevant, or simply convey the information.",
+         "apiKey": "{OPENROUTER_API_KEY}",
+         "schema": {
+           "title": {
+             "type": "string",
+             "description": "Title derived from summary of content"
+           },
+           "summary": {
+             "type": "string",
+             "description": "Summary of content influenced by curator notes"
+           }
+         }
+       }
+     }
+   ]
+ }

Then reference it in each feed:

"transform": [
-  {
-    "plugin": "@curatedotfun/object-transform",
-    ...
-  },
-  {
-    "plugin": "@curatedotfun/ai-transform",
-    ...
-  }
+  "@transformTemplates/default"
]

Also applies to: 138-168, 219-248, 312-342, 376-406, 585-614, 841-871, 904-934, 1009-1039, 1123-1153

backend/src/services/plugins/sources/twitter/index.ts (4)

6-11: Consider adding validation for required config fields.

The TwitterConfig interface should enforce validation for required fields.

Consider using a validation library like Zod:

import { z } from 'zod';

const TwitterConfigSchema = z.object({
  username: z.string().min(1),
  password: z.string().min(1),
  email: z.string().email(),
  twoFactorSecret: z.string().optional()
});

type TwitterConfig = z.infer<typeof TwitterConfigSchema>;

47-68: Consider using exponential backoff for error handling.

The monitoring interval continues at a fixed rate even after errors. Consider implementing exponential backoff.

private backoffInterval = 5 * 60 * 1000; // 5 minutes
private maxBackoffInterval = 30 * 60 * 1000; // 30 minutes

async startMonitoring(): Promise<void> {
  // ... existing checks ...

  const scheduleNextCheck = (interval: number) => {
    this.monitoringInterval = setTimeout(async () => {
      try {
        await this.checkMentions();
        this.backoffInterval = 5 * 60 * 1000; // Reset on success
        scheduleNextCheck(this.backoffInterval);
      } catch (error) {
        logger.error("Error checking mentions:", error);
        this.backoffInterval = Math.min(this.backoffInterval * 2, this.maxBackoffInterval);
        scheduleNextCheck(this.backoffInterval);
      }
    }, interval);
  };

  scheduleNextCheck(this.backoffInterval);
}

92-97: Add implementation details for checkMentions.

The checkMentions method is incomplete with just a placeholder comment.

Would you like me to help implement the checkMentions method based on the submission service requirements?


99-131: Consider adding rate limiting to API endpoints.

The API endpoints lack rate limiting which could lead to abuse.

Consider adding rate limiting middleware:

import rateLimit from 'express-rate-limit';

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100 // limit each IP to 100 requests per windowMs
});

getEndpoints(): PluginEndpoint[] {
  return [
    {
      path: "/last-tweet-id",
      method: "GET",
      middleware: [limiter],
      handler: async () => {
        // ... existing handler ...
      }
    },
    // ... other endpoints ...
  ];
}
backend/src/services/plugins/plugin.service.ts (9)

1-19: Consider concurrency safeguards for remote configuration re-initialization
The initial imports and declarations look well-structured. However, repeated calls to performReload(true) in a multi-threaded or multi-instance environment can cause race conditions if multiple threads attempt re-initialization at the same time.
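Within a single process, one mitigation is to coalesce concurrent reload requests onto one in-flight promise. A sketch, where `doReload` stands in for the actual `performReload(true)` body (multi-instance coordination would still need an external lock):

```typescript
// Serialize re-initialization: concurrent callers await the same in-flight
// promise instead of racing to re-initialize.
class ReloadGuard {
  private inflight: Promise<void> | null = null;

  constructor(private doReload: () => Promise<void>) {}

  reload(): Promise<void> {
    if (!this.inflight) {
      this.inflight = this.doReload().finally(() => {
        // Allow a fresh reload once the current one settles
        this.inflight = null;
      });
    }
    return this.inflight;
  }
}
```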


21-32: Clarify type constraints in PluginCache
The PluginCache interface is effective, but note that it ties a plugin’s config strictly to one type. If you have advanced use cases mixing plugin types or specialized caching logic, consider clarifying the type constraints with additional documentation or generics for future extensibility.


34-44: Namespace or relocate PluginEndpoint and PluginWithEndpoints
Both interfaces might benefit from being placed in a dedicated types file if shared across services. This can make them easier to discover and reuse, especially if you plan to expand or unify plugin endpoint definitions further.


46-65: Validate remote configuration
RemoteConfig and RemoteState rely on external URLs and dynamic modules. Consider additional validation steps – for instance, checking if the URL domain is in an allowlist – to mitigate security risks from untrusted sources.


337-374: Confirm correctness of error aggregation
In cleanup, aggregated errors are thrown as a single AggregateError. This is good for visibility but ensure that upstream callers properly handle AggregateError objects so that partial cleanup failures don’t mask critical errors.


376-410: Unregistering endpoints in Elysia
The comment notes that “Elysia doesn’t provide a direct way to unregister routes.” This may confuse maintainers: consider linking to official Elysia docs or clarifying the fallback mechanism for removing plugin endpoints to avoid misinterpretation.


412-432: Endpoints might remain partially active if re-registered
Since you're overriding or ignoring existing routes on re-registration, race conditions could occur if endpoints from a prior plugin instance remain accessible. Document that the framework overwrites old endpoints or implement a manual “routing registry” in the app to avoid stale routes in ephemeral states.


434-448: Extend plugin interface validation
validatePluginInterface checks only one required method for each plugin type. If plugins have multiple mandatory methods or events (e.g., error handlers, config updaters), consider an expanded validation process or a typed approach verifying all expected life-cycle hooks.


458-465: Reserved characters
packageToRemoteName strips “@” and replaces “/” with underscores, which is sensible. But consider whether other special or platform-reserved characters might need sanitizing, especially if plugin names contain colons or additional punctuation.
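An illustrative sanitizer extending the described "@"-strip and "/"-to-underscore mapping to all other non-alphanumeric characters; the exact character class is an assumption, not the plugin service's actual rule:

```typescript
// Map a package name to a module-federation-safe remote name,
// sanitizing any character outside [a-zA-Z0-9_].
function packageToRemoteName(pkg: string): string {
  return pkg
    .replace(/^@/, "")
    .replace(/\//g, "_")
    .replace(/[^a-zA-Z0-9_]/g, "_");
}
```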

backend/src/services/db/index.ts (1)

37-46: Improve error reporting
If Database or drizzle(...) fails, you log a generic error. Include environment details or partial DB path details (redacted if necessary) to help with debugging.

backend/src/services/db/queries.ts (9)

45-82: Pre-insert existence checks
saveSubmissionToFeed checks for submission and feed existence before insertion. This is good, but guard against race conditions: if another process deletes the submission or feed between the check and the upsert, you might get unexpected states. Using foreign keys with references/constraints enabled is a robust option.


104-121: Minor optimization
saveSubmission is an async function, but it doesn’t use await. You could either remove the async keyword or convert to await db.insert(...), so the function’s return type is unambiguous.


136-172: getModerationHistory ordering
You order by moderationHistory.createdAt. If multiple moderation actions happen simultaneously with the same timestamp, consider tie-breaking logic or consistent ordering of feed IDs to ensure stable results.


195-267: Implement stricter data validation
getSubmissionByCuratorTweetId merges submission and moderation data. If any fields are malformed or invalid, the final merged object might omit crucial data or cause runtime errors. Consider validation or fallback defaults for partially missing fields.


269-341: Likely duplication in submission queries
getSubmission and getSubmissionByCuratorTweetId contain nearly identical query logic, only varying in the WHERE clause. If extending these queries, consider refactoring to share a common function to reduce duplication and ensure consistent data mapping.


343-425: Bulk retrieval can be heavy
getAllSubmissions loads multiple submissions plus their moderation information. For large data sets, performance might degrade significantly. You might implement pagination or indexing at the database level (in addition to the offset/limit approach) to keep queries manageable.
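A sketch of keyset (cursor) pagination as an alternative to offset/limit, which slows down as offsets grow. Shown in-memory for brevity with an assumed `Row` shape; in SQL this corresponds to `WHERE id > ? ORDER BY id LIMIT ?`:

```typescript
// Return up to `limit` rows whose id is strictly greater than `cursor`,
// in ascending id order. A null cursor starts from the beginning.
interface Row {
  id: number;
}

function pageAfter(rows: Row[], cursor: number | null, limit: number): Row[] {
  const sorted = [...rows].sort((a, b) => a.id - b.id);
  const filtered =
    cursor === null ? sorted : sorted.filter((r) => r.id > cursor);
  return filtered.slice(0, limit);
}
```

The caller passes the last id of the previous page as the next cursor, so each page query touches only `limit` rows regardless of how deep pagination goes.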


495-509: Potential race with feed plugins
getFeedPlugin fetches plugin data. If your system frequently updates plugin config, you might cause stale reads or partial writes. Consider explicit “versioning” or updated timestamps to ensure the retrieved plugin config is up to date.


510-530: Extend plugin config schema
upsertFeedPlugin just JSON-encodes any config object. If you need schema validation or backward compatibility, consider storing structured fields or version numbers to handle migrations if the plugin’s config format evolves.


532-620: Potential indexing needed
getSubmissionsByFeed can return multiple large results with all moderation data joined. For heavily used feeds or large volumes of submissions, you may benefit from appropriate indexes on submissionFeeds(feedId, submissionId) or moderationHistory(feedId, createdAt), which keep queries responsive under load.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 00ee01f and 22d1314.

⛔ Files ignored due to path filters (3)
  • backend/bun.lockb is excluded by !**/bun.lockb
  • bun.lock is excluded by !**/*.lock
  • bun.lockb is excluded by !**/bun.lockb
📒 Files selected for processing (62)
  • Dockerfile (3 hunks)
  • backend/package.json (2 hunks)
  • backend/rspack.config.js (1 hunks)
  • backend/src/__tests__/distribution.service.test.ts (1 hunks)
  • backend/src/__tests__/mocks/processor-service.mock.ts (1 hunks)
  • backend/src/__tests__/processor.service.test.ts (1 hunks)
  • backend/src/__tests__/submission.service.test.ts (13 hunks)
  • backend/src/__tests__/transformation.service.test.ts (1 hunks)
  • backend/src/app.ts (1 hunks)
  • backend/src/external/gpt-transform.ts (0 hunks)
  • backend/src/external/notion/index.ts (0 hunks)
  • backend/src/external/rss/index.ts (0 hunks)
  • backend/src/external/rss/rss.service.ts (0 hunks)
  • backend/src/external/simple-transform.ts (0 hunks)
  • backend/src/external/telegram.ts (0 hunks)
  • backend/src/index.ts (4 hunks)
  • backend/src/routes/test.ts (2 hunks)
  • backend/src/services/config/config.service.ts (2 hunks)
  • backend/src/services/db/index.ts (5 hunks)
  • backend/src/services/db/operations.ts (2 hunks)
  • backend/src/services/db/queries.ts (21 hunks)
  • backend/src/services/distribution/distribution.service.ts (1 hunks)
  • backend/src/services/plugins/plugin.service.ts (1 hunks)
  • backend/src/services/plugins/sources/twitter/index.ts (1 hunks)
  • backend/src/services/processor/processor.service.ts (1 hunks)
  • backend/src/services/rss/queries.ts (4 hunks)
  • backend/src/services/submissions/submission.service.ts (7 hunks)
  • backend/src/services/transformation/transformation.service.ts (1 hunks)
  • backend/src/services/transformers/transformation.service.ts (0 hunks)
  • backend/src/services/twitter/queries.ts (4 hunks)
  • backend/src/types/config.ts (2 hunks)
  • backend/src/types/errors.ts (1 hunks)
  • backend/src/types/plugin.ts (0 hunks)
  • backend/src/utils/plugin.ts (1 hunks)
  • curate.config.json (22 hunks)
  • curate.config.test.json (4 hunks)
  • docs/docs/developers/configuration.md (3 hunks)
  • docs/docs/plugins/build-plugin.md (1 hunks)
  • docs/docs/plugins/distributors/_category_.json (1 hunks)
  • docs/docs/plugins/distributors/index.md (1 hunks)
  • docs/docs/plugins/distributors/notion.md (1 hunks)
  • docs/docs/plugins/distributors/rss.md (1 hunks)
  • docs/docs/plugins/distributors/supabase.md (1 hunks)
  • docs/docs/plugins/index.md (4 hunks)
  • docs/docs/plugins/sources/_category_.json (1 hunks)
  • docs/docs/plugins/sources/index.md (1 hunks)
  • docs/docs/plugins/sources/twitter.md (1 hunks)
  • docs/docs/plugins/transformers/_category_.json (1 hunks)
  • docs/docs/plugins/transformers/ai-transform.md (1 hunks)
  • docs/docs/plugins/transformers/gpt-transform.md (0 hunks)
  • docs/docs/plugins/transformers/index.md (1 hunks)
  • docs/docs/plugins/transformers/object-transform.md (1 hunks)
  • docs/docs/plugins/transformers/simple-transform.md (1 hunks)
  • docs/sidebars.ts (2 hunks)
  • frontend/rsbuild.config.ts (2 hunks)
  • memory-bank/activeContext.md (1 hunks)
  • memory-bank/productContext.md (1 hunks)
  • memory-bank/progress.md (1 hunks)
  • memory-bank/projectbrief.md (1 hunks)
  • memory-bank/systemPatterns.md (1 hunks)
  • memory-bank/techContext.md (1 hunks)
  • package.json (1 hunks)
💤 Files with no reviewable changes (9)
  • backend/src/external/simple-transform.ts
  • docs/docs/plugins/transformers/gpt-transform.md
  • backend/src/external/rss/rss.service.ts
  • backend/src/external/rss/index.ts
  • backend/src/services/transformers/transformation.service.ts
  • backend/src/external/gpt-transform.ts
  • backend/src/types/plugin.ts
  • backend/src/external/telegram.ts
  • backend/src/external/notion/index.ts
✅ Files skipped from review due to trivial changes (5)
  • docs/docs/plugins/sources/_category_.json
  • docs/docs/plugins/transformers/_category_.json
  • docs/docs/plugins/distributors/_category_.json
  • memory-bank/productContext.md
  • memory-bank/projectbrief.md
🧰 Additional context used
🪛 LanguageTool
docs/docs/plugins/transformers/simple-transform.md

[typographical] ~50-~50: A comma is not needed after quotations ending in either question marks or exclamation points.
Context: ... - If curator.notes is "Great thread!", outputs: 📝 Note: Great thread! - If ...

(COMMA_AFTER_QUESTION_QUOTE)

memory-bank/progress.md

[uncategorized] ~90-~90: A period might be missing here.
Context: ...more efficient - Caching strategy needs improvement ### Error Handling - Some edge cases i...

(AI_EN_LECTOR_MISSING_PUNCTUATION_PERIOD)


[uncategorized] ~93-~93: You might be missing the article “the” here.
Context: ...### Error Handling - Some edge cases in transform chain might need better handling - Need...

(AI_EN_LECTOR_MISSING_DETERMINER_THE)

docs/docs/plugins/sources/twitter.md

[grammar] ~125-~125: You’ve repeated a verb. Did you mean to only write one of them?
Context: ...sely - Monitor API usage 2. Error Handling - Handle Twitter API rate limits - Implement ...

(REPEATED_VERBS)

docs/docs/plugins/distributors/notion.md

[uncategorized] ~75-~75: Loose punctuation mark.
Context: ...ference You need to specify: - token: Notion Internal Integration Token - `da...

(UNLIKELY_OPENING_PUNCTUATION)

docs/docs/plugins/index.md

[duplication] ~76-~76: Possible typo: you repeated a word.
Context: ...atabase. ### Transformers #### 🤖 AI Transform Transform content using OpenRouter's GPT models f...

(ENGLISH_WORD_REPEAT_RULE)

docs/docs/plugins/sources/index.md

[grammar] ~65-~65: Did you mean “Twitter” (= microblogging service, capitalized)?
Context: ...``` ## 🔌 Available Sources - Twitter - Monitor tweets, mentions, replies...

(TWITTER)

memory-bank/systemPatterns.md

[uncategorized] ~113-~113: If this is a compound adjective that modifies the following noun, use a hyphen.
Context: ...isions 1. Elysia.js Framework - High performance - Built-in TypeScript support - M...

(EN_COMPOUND_ADJECTIVE_INTERNAL)

memory-bank/techContext.md

[duplication] ~80-~80: Possible typo: you repeated a word.
Context: ...: Post monitoring (planned) ### Plugin Development - Development Tools * Plugin development kit * Ty...

(ENGLISH_WORD_REPEAT_RULE)


[duplication] ~128-~128: Possible typo: you repeated a word.
Context: ... Deployment ### Requirements - Node.js environment - Environment variables configuration - Plugin depend...

(ENGLISH_WORD_REPEAT_RULE)

🪛 markdownlint-cli2 (0.17.2)
docs/docs/plugins/sources/twitter.md

145-145: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


153-153: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)


161-161: Fenced code blocks should have a language specified
null

(MD040, fenced-code-language)

🔇 Additional comments (90)
docs/docs/developers/configuration.md (5)

37-37: Updated Telegram Plugin URL.
The URL for the @curatedotfun/telegram plugin has been updated to load the latest version directly from unpkg. This change ensures the plugin is retrieved via module federation using an absolute URL.


41-41: Updated RSS Plugin URL.
The RSS plugin now points to the latest version hosted on unpkg. This absolute URL update aligns with the migration toward module federation.


43-45: Renamed and Updated AI Transform Plugin.
The plugin formerly known as @curatedotfun/gpt-transform has been renamed to @curatedotfun/ai-transform and its URL has been updated accordingly. This change ensures consistency in naming and retrieval via the new absolute URL.


70-75: Updated Stream Transform Plugin Reference.
The stream configuration now correctly references the new @curatedotfun/ai-transform plugin. This update aligns the transformation step with the new plugin naming and URL convention.


96-101: Updated Recap Transform Plugin Reference.
The recap configuration has been updated to use the @curatedotfun/ai-transform plugin. This adjustment is essential for consistency in transformation processing across both stream and recap outputs.

memory-bank/systemPatterns.md (8)

1-2: Document Title and Header Clarity

The title is clear and succinctly introduces the document as the "System Patterns" documentation for the project.


3-4: Introduction of Architecture Overview

The "Architecture Overview" section header is well placed and sets the stage for the detailed content that follows.


5-13: Core Components – Server Layer Description

The description of the Server Layer is concise and informative, listing key aspects like REST API endpoints, static file serving, CORS/security middleware, health monitoring, and a high-performance /process endpoint. This gives a good quick overview of the server responsibilities.


14-22: Service Layer Breakdown

The Service Layer is clearly outlined, enumerating several services (ConfigService, SubmissionService, ProcessorService, TransformationService, DistributionService, Database Service, PluginLoaderService). This clear bullet-listing aids in understanding responsibilities and dependencies.


23-42: Plugin System Detailing with Module Federation

The Plugin System section is comprehensive—listing source, distributor, and transformer plugins, along with crucial features like runtime loading, hot-reloading, and type-safe configuration. The inclusion of module federation in the Plugin Pattern aligns well with the PR objectives.


43-73: Design Patterns Section – Clear and Organized

This section neatly categorizes and describes various design patterns (Singleton, Plugin, Service, Observer, Pipeline) used in the system. The detailed bullet points help communicate the architecture choices and responsibilities effectively.


74-109: Component Relationships and Mermaid Diagrams

The relationships among components are well illustrated using Mermaid diagrams. The Configuration Flow, Content Flow, and Error Handling Flow diagrams are clear and provide valuable visual aids to complement the written description. They help in understanding the data and process flows between various system parts.


110-151: Key Technical Decisions and Strategic Choices

The "Key Technical Decisions" section is thorough, detailing choices related to the Elysia.js framework, Plugin Architecture, configuration-based operations, service boundaries, error handling, and task scheduling. This level of detail is very useful for both developers and stakeholders in understanding the system’s design rationale.

🧰 Tools
🪛 LanguageTool

[uncategorized] ~113-~113: If this is a compound adjective that modifies the following noun, use a hyphen.
Context: ...isions 1. Elysia.js Framework - High performance - Built-in TypeScript support - M...

(EN_COMPOUND_ADJECTIVE_INTERNAL)

frontend/rsbuild.config.ts (2)

27-27: LGTM! Port configuration is correctly set.

The server port is set to 5173, which aligns with the standard development port used by Vite and other modern development servers.


37-39: LGTM! Write to disk configuration is essential for module federation.

Setting writeToDisk: true in dev configuration is crucial for module federation to work correctly, as it ensures that the federated modules are properly written to disk during development.

package.json (1)

5-5:

Details

❓ Verification inconclusive

Consider pinning the types package to a stable version.

The @curatedotfun/types package is at version ^0.0.2, which is a very early version. Consider:

  1. Pinning to an exact version to avoid unexpected changes
  2. Waiting for a stable release (1.0.0+)

🌐 Web query:

What is the latest version of @curatedotfun/types package?

💡 Result:

The latest version of the @curatedotfun/types package cannot be determined from the provided search results[1][2][3][4][5][6][7][8][9][10]. None of the sources mention this specific package or its version history.

For authoritative information about this package:

  1. Check directly on npm using npm view @curatedotfun/types version
  2. Visit its npm registry page at https://www.npmjs.com/package/@curatedotfun/types
  3. Verify if the package name is spelled correctly (the search results show examples like @types/node and @types/react, but not @curatedotfun/types)

If this is a private or unpublished package, you'll need to consult your organization's internal registry or package maintainers for version information.

Citations:


Action: Pin the version of @curatedotfun/types and verify its stability

The current declaration in package.json uses "@curatedotfun/types": "^0.0.2", which may lead to unexpected updates given the early (0.0.2) release status. Since our web query didn't return a definitive stable version, please consider the following:

  • Pin the version: Change the dependency to an exact version (e.g., "@curatedotfun/types": "0.0.2") to ensure that no unintended updates occur.
  • Verify stability: Confirm via the npm registry or with the package maintainers whether there are plans for a stable release (≥ 1.0.0) or if it’s intended to remain in early release status before long-term use.
backend/rspack.config.js (1)

40-42:

Details

❓ Verification inconclusive

Verify the frontend dist path.

The path ../frontend/dist assumes a specific directory structure. Ensure this path is correct and exists during production builds.


🏁 Script executed:

#!/bin/bash
# Description: Verify the frontend dist path exists in production.

# Test: Check if the frontend dist directory exists
if [ "$NODE_ENV" = "production" ]; then
  if [ ! -d "../frontend/dist" ]; then
    echo "Error: Frontend dist directory not found"
    exit 1
  fi
fi

Length of output: 25


Action Required: Verify the Production Build Environment

The current verification script did not trigger the production condition because NODE_ENV was empty. Please ensure that during production builds:

  • The environment variable NODE_ENV is correctly set to "production" so that the frontend path check is executed.
  • The relative path ../frontend/dist correctly points to the built frontend distribution. Manual confirmation that this directory exists and contains your production assets is recommended if this hasn't already been verified.
backend/src/services/db/operations.ts (1)

1-1:

Details

✅ Verification successful

Verify the database migration impact.

The switch from BunSQLiteDatabase to BetterSQLite3Database is a significant change that might affect dependent services.

Let's verify the impact:

Also applies to: 11-11


🏁 Script executed:

#!/bin/bash
# Search for any remaining references to BunSQLiteDatabase
rg "BunSQLiteDatabase"

# Search for database transaction patterns that might need updates
ast-grep --pattern 'db.$_()'

Length of output: 515


Database migration verified — No remnants of BunSQLiteDatabase detected

I searched the codebase for any occurrences of BunSQLiteDatabase and examined the database operation patterns. The results confirm:

  • No references to BunSQLiteDatabase were found anywhere in the repository.
  • Database queries in dependent services (e.g., in backend/src/services/twitter/queries.ts, backend/src/services/rss/queries.ts, and backend/src/services/db/queries.ts) continue to operate using the generic db interface and remain consistent with the new BetterSQLite3Database implementation.

This suggests the migration is complete and the impact on dependent services is minimal. Please ensure integration tests validate these changes at runtime.

docs/docs/plugins/transformers/object-transform.md (1)

7-8: Well-structured documentation with clear plugin differentiation!

The documentation effectively explains the Object Transform plugin's purpose and clearly distinguishes it from the Simple Transform plugin. The examples and tips provide excellent guidance for users.

Also applies to: 194-196

backend/src/__tests__/submission.service.test.ts (1)

1-960: LGTM! The test cases are comprehensive and well-structured.

The test suite thoroughly covers various scenarios for the SubmissionService, including:

  • Curator submissions and auto-approval
  • Moderation responses and handling
  • Resubmissions to different feeds
  • Case-insensitive handling of hashtags and moderator names
  • Blacklisted user handling
backend/src/__tests__/processor.service.test.ts (1)

49-65: Tests thoroughly cover the transformation pipeline.

The “should process content through transformation pipeline” test successfully verifies that global transforms and distributors receive transformed content. No issues found here—great coverage of the happy path scenario.

backend/src/routes/test.ts (1)

34-39: Good restriction of test routes in production.

Returning a 404 for the /api/test routes in production is a solid way to protect test endpoints from unintended access.

backend/src/index.ts (3)

2-2: Imports look correct.
This updated import structure neatly centralizes the app creation logic.


27-62: Server startup logic is well-structured.
Using Bun.serve with a clear health check route and then delegating to the app.fetch method keeps the code modular and readable.


98-98: Automatic server start looks appropriate.
Calling startServer() at the end ensures the service is up immediately on module import.

backend/src/types/errors.ts (3)

32-33: TransformStage definition is straightforward.
No issues observed.


34-47: TransformError class is well-structured.
This class provides detailed context about the failing transformation step.


49-58: ProcessorError definition is clear.
It cleanly names and contextualizes the error scenario.

backend/src/utils/plugin.ts (2)

31-61: Recursive key sorting is well-implemented.
No concerns here; the function efficiently catches non-serializable data.


63-95: Plugin configuration validation is comprehensive.
It properly checks for required fields and valid URL formats.

backend/src/services/rss/queries.ts (3)

3-3: Use of BetterSQLite3Database import
Good job importing BetterSQLite3Database to replace the old database adapter. This change appears consistent with the updated approach across the codebase.


14-17: Confirm synchronous vs. asynchronous usage
The saveRssItem function returns the result of db.insert(...).values(...) without await. Ensure this is intentional: Drizzle's better-sqlite3 driver executes statements synchronously, so no await is required. If this function ever needs error handling or logging, consider wrapping it in a try/catch block.
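The try/catch suggestion can be sketched as follows. This is a hedged illustration only: `SyncDb` and its `insert` signature are stand-ins for the Drizzle better-sqlite3 driver, not the project's actual API.

```typescript
// Sketch only: `SyncDb` is a stand-in for Drizzle's better-sqlite3 driver,
// whose statements execute synchronously (no await required).
type RssItem = { guid: string; title: string };

interface SyncDb {
  insert(table: string, row: Record<string, unknown>): void;
}

function saveRssItem(db: SyncDb, feedId: string, item: RssItem): void {
  try {
    db.insert("rss_items", { feedId, ...item });
  } catch (err) {
    // Surface which item/feed failed instead of a bare driver error
    throw new Error(
      `Failed to save RSS item ${item.guid} for feed ${feedId}: ${err}`,
    );
  }
}
```

Wrapping at this level keeps the caller's interface unchanged while attaching item/feed context to failures.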


28-39:

Details

❓ Verification inconclusive

Check if .all() is truly asynchronous
You've declared getRssItems as async and are using await db ... .all(). BetterSQLite3 is generally synchronous by default. If Drizzle for BetterSQLite3 indeed provides an async method, this is fine; otherwise, double-check that .all() is asynchronous.

Run the following script to confirm whether .all() is an async or synchronous method in your codebase:


🏁 Script executed:

#!/bin/bash
ast-grep --pattern $'db.$_().all()' 

Length of output: 36


ACTION: Confirm asynchronous behavior of .all() in getRssItems

The current use of await with db.select(...).all() raises concerns because BetterSQLite3 is typically synchronous. Our initial search using the pattern db.$_().all() produced no output, which means we haven't gathered conclusive evidence from the codebase regarding whether Drizzle’s integration with BetterSQLite3 provides an asynchronous version of .all().

Please manually verify whether:

  • The .all() method in your Drizzle/BetterSQLite3 integration is truly asynchronous.
  • If it’s synchronous, consider updating the function (e.g., removing async/await or using an appropriate asynchronous API) to avoid any misleading async behavior.

File to check: backend/src/services/rss/queries.ts (lines 28-39)

backend/package.json (2)

6-8: Consider orchestrating build and start scripts carefully
You switched to rspack for bundling, and the new scripts run rspack build before executing dist/main.js. Be mindful of race conditions with watchers. Also confirm that dist/main.js is the correct output path from rspack, ensuring your dev flow works correctly.


52-54: Confirm environment support for better-sqlite3
You added better-sqlite3 and updated dotenv. Remember that better-sqlite3 requires native compilation. Confirm that your CI environment or Docker setup includes the required build dependencies (python, make, g++).

backend/src/services/processor/processor.service.ts (1)

19-115: Robust error handling in process
Nice job handling various transform failures and fallback logic. A few observations:

  1. All distributor failures trigger a ProcessorError, which is good.
  2. Consider logging or returning partial success data if at least one distributor succeeds.
  3. The global transform fallback is especially helpful for resiliency.
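The partial-success idea in point 2 can be sketched with Promise.allSettled. The `Distributor` shape and the thrown error below are simplified stand-ins for the service's actual types (including ProcessorError), not the real implementation.

```typescript
// Sketch: run all distributors, report per-distributor outcomes, and only
// fail hard when every one of them failed. Types are simplified stand-ins.
interface Distributor {
  name: string;
  distribute(content: string): Promise<void>;
}

interface DistributionReport {
  succeeded: string[];
  failed: { name: string; error: string }[];
}

async function distributeAll(
  distributors: Distributor[],
  content: string,
): Promise<DistributionReport> {
  const results = await Promise.allSettled(
    distributors.map((d) => d.distribute(content)),
  );
  const report: DistributionReport = { succeeded: [], failed: [] };
  results.forEach((r, i) => {
    if (r.status === "fulfilled") {
      report.succeeded.push(distributors[i].name);
    } else {
      report.failed.push({ name: distributors[i].name, error: String(r.reason) });
    }
  });
  if (report.succeeded.length === 0 && distributors.length > 0) {
    throw new Error("All distributors failed"); // analogous to ProcessorError
  }
  return report; // partial success is surfaced instead of swallowed
}
```

Returning the report lets callers log which distributors failed while still treating one success as a delivered item.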
docs/docs/plugins/sources/index.md (2)

1-19: Well-Structured Introduction & Architecture Overview

The introductory lines (1-19) effectively outline the purpose and flow of the source system. The YAML front matter is valid, and the hierarchical breakdown (Source System → Architecture → Source Pipeline) is organized and clear.

No immediate issues. Nice job setting the context for new users!


63-88: "Available Sources" & "Type Safety"

  1. Lines 63-68: The list of upcoming platforms is helpful. The reference to Twitter is already capitalized properly, so the static analysis tool's suggestion can be safely ignored.
  2. Lines 69-88: Excellent explanation of usage of TypeScript interfaces. This fosters clarity regarding plugin design and ensures new contributors know how to implement or extend these interfaces.

No changes needed.

🧰 Tools
🪛 LanguageTool

[grammar] ~65-~65: Did you mean “Twitter” (= microblogging service, capitalized)?
Context: ...``` ## 🔌 Available Sources - Twitter - Monitor tweets, mentions, replies...

(TWITTER)

backend/src/services/submissions/submission.service.ts (8)

12-13: New Import for ProcessorService

Importing ProcessorService (line 12) and naming it in the constructor is a straightforward change that clarifies the new architecture for processing submissions. No concerns here.


20-20: Added Injection of ProcessorService

Line 20 injects the processorService dependency. This creates clear separation of concerns, offloading content distribution or transformation to ProcessorService. Implementation looks good.


26-26: Asynchronous Cache Retrieval

Line 26 refactors the call to retrieve "admin_ids" from the Twitter cache asynchronously. This is consistent with the existing async/await usage. No concerns.


203-207: Retrieving Existing Submission & Feeds

Lines 203-207 retrieve an existing submission and its feeds if present. Good defensive checks (i.e., existingSubmission ? ... : []). Consider validating the response or handling the possibility that getFeedsBySubmission might return null or an unexpected format.

Do these DB calls safely handle null or empty values from the underlying storage? Please confirm.
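One defensive pattern for the question above is normalizing a possibly null or malformed return before use. A hedged sketch — the real getFeedsBySubmission signature may differ, and the `SubmissionFeed` shape here is illustrative:

```typescript
// Sketch: coerce null/undefined/non-array results to a safe empty list
// before downstream code iterates over feeds.
interface SubmissionFeed { feedId: string; status: string }

function normalizeFeeds(raw: unknown): SubmissionFeed[] {
  if (!Array.isArray(raw)) return [];
  return raw.filter(
    (f): f is SubmissionFeed =>
      typeof f === "object" && f !== null && "feedId" in f,
  );
}
```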


212-212: Daily Submission Count Check

Line 212 fetches the daily submission count - well-placed to enforce submission rate limits. No issues.


382-382: Fetching Submission by Curator Tweet ID

Line 382 adds a DB query for retrieving a submission by the curator tweet ID. This is crucial for the new moderation flow. Looks straightforward, though ensure robust coverage in unit tests for edge cases (e.g., no matching submission).


405-407: Retrieving Submission Feeds for Moderation

Lines 405-407 fetch submission feeds to determine which are pending and can be moderated by the current user. Check if the system handles large feed counts gracefully (pagination, etc.) if there's a high volume of submissions.
[performance]
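For the high-volume concern, a fixed-size paging loop keeps memory bounded regardless of feed count. The `fetchPage` function below is a stub standing in for a real DB query, not the service's storage API:

```typescript
// Sketch: page through submission feeds in fixed-size batches instead of
// loading the full set at once. `fetchPage` stands in for a real DB query.
async function* paginateFeeds(
  fetchPage: (offset: number, limit: number) => Promise<string[]>,
  pageSize = 100,
): AsyncGenerator<string[]> {
  let offset = 0;
  while (true) {
    const page = await fetchPage(offset, pageSize);
    if (page.length === 0) return; // no more rows
    yield page;
    offset += page.length;
  }
}
```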


469-473: Asynchronous Processing Upon Approval

Lines 469-473 again showcase calling this.processorService.process(...) for newly approved feeds. The pattern is consistent with the other changes. No immediate issues.

backend/src/__tests__/transformation.service.test.ts (9)

1-21: Test Setup & MockPluginService

Lines 1-21 define the test suite imports, error classes, and a helpful MockPluginService. This helps isolate the TransformationService from actual plugin logic. No issues.


23-65: Test: Apply Transforms in Sequence

These lines (23-65) verify that multiple transforms apply in the correct order and merge results. Good coverage. The approach exemplifies best practices in unit testing by mocking each plugin transform function separately.


66-74: Test: Handle Empty Transform Array

Lines 66-74 confirm that providing no transforms returns the original input as-is. Straightforward and essential for edge-case coverage.


76-90: Test: Throw TransformError for Invalid Output

Lines 76-90 check for null outputs and verify an exception is thrown. This matches the domain logic of the TransformationService and ensures error handling remains robust.


92-156: Test: Different Transform Stages

Lines 92-156 test the interplay of “global” vs. “distributor” transforms. This helps confirm transformations can be sequentially chained across different contexts. Good demonstration of multi-phase processing.


158-174: Test: Propagate Plugin Errors as TransformError

Lines 158-174 confirm the service rethrows plugin errors as TransformError. Well-structured approach for capturing plugin-specific exceptions in a uniform manner.


176-199: Test: Merging Array Results Correctly

Lines 176-199 ensure arrays from multiple transforms are merged. The thorough approach covers how existing array data is appended by subsequent plugins. Nicely done.


201-224: Test: Handle Transform Config

Lines 201-224 confirm a plugin can access the provided config. This is important for flexible transformations, and the test ensures each plugin can adapt based on user-defined settings.


226-252: Test: Preserve Non-Object Results

Lines 226-252 confirm that transforms returning primitive values (e.g., string, number) are preserved. This is beneficial for transformations that may yield single-value outputs. Good coverage!

docs/sidebars.ts (3)

43-58: LGTM! Well-organized documentation structure for Sources.

The new Sources category with Overview and Twitter documentation improves navigation and aligns with the plugin system architecture.


63-67: LGTM! Good addition of Overview for Distributors.

Adding an overview document for distributors enhances documentation completeness.


84-103: LGTM! Improved organization of Transformer plugins.

The restructuring of transformer plugins with Overview, Object Transform, and AI Transform provides better clarity and organization.

backend/src/__tests__/distribution.service.test.ts (2)

7-10: LGTM! Well-defined interface for distribution arguments.

The DistributeArgs interface provides clear typing for plugin distribution parameters.


12-27: LGTM! Well-implemented mock plugin service.

The MockPluginService provides good test isolation and tracking of cleanup calls.

docs/docs/plugins/build-plugin.md (2)

7-7: LGTM! Clear introduction and plugin type categorization.

Good explanation of module federation and clear categorization of plugin types.

Also applies to: 11-15


100-111: LGTM! Well-documented environment variable configuration.

Clear explanation of environment variable syntax and setup in the Plugin Manager.

memory-bank/techContext.md (1)

5-20: LGTM! Clear technology stack overview.

Well-organized breakdown of backend, frontend, and external service dependencies.

backend/src/services/config/config.service.ts (1)

12-12: LGTM! Good practice to extract environment check.

The constant improves code clarity and maintainability by centralizing the environment check logic.

curate.config.test.json (2)

77-103:

Details

❓ Verification inconclusive

Validate AI transform schema configuration.

The AI transform configuration includes a complex schema. Ensure that:

  1. The OpenRouter API supports all specified fields
  2. Character limits are enforced by the AI model
  3. The schema aligns with the template usage

🌐 Web query:

What are the limitations of OpenRouter's GPT models regarding output schema validation and character limits?

💡 Result:

OpenRouter's GPT models inherit limitations from their respective base architectures rather than imposing new ones directly. Here are key considerations based on available information:

Schema Validation Limitations

  1. Model-Dependent Support
    Schema validation capabilities depend entirely on the underlying model provider (e.g., OpenAI). For example:

    • Structured Outputs: Newer OpenAI models like gpt-4o-mini support strict JSON Schema adherence with constraints:
      • Max 5 nesting levels in objects[2]
      • 100 total object properties per schema[2]
      • 15k-character limit across all string-based schema elements[2]
    • Legacy Models: Older versions (e.g., gpt-3.5-turbo) only support basic JSON mode without schema enforcement[2][6].
  2. Third-Party Tools
    Libraries like ZodGPT[6] can enforce schemas across any model through function calling wrappers but add latency through validation retries.

Character/Token Limits

  1. Token-Based Restrictions
    All responses are constrained by per-model token ceilings:

    • Standard gpt-4/gpt-3.5-turbo: 4k output tokens[5]
    • Token-to-character ratios vary by model architecture (e.g., ~1 token ≈ 4 characters for most English text)[9].
  2. Truncation Risks
    Responses exceeding configured max_tokens terminate early with finish_reason="length", potentially cutting off content mid-sentence[8][7].

  3. Indirect Character Control
    While no direct character-limiting exists:

    # Approximate method using max_tokens
    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[...],
        max_tokens=500  # ~2000 characters
    )

    Trial-and-error adjustments are needed due to variable tokenization[7][9].

Platform-Specific Notes

OpenRouter adds these considerations:

  • Cost Variability: Identical prompts/outputs may incur different costs across models due to divergent tokenization schemes[9].
  • No Native Validation: Schema enforcement must be implemented via client-side code or third-party libraries[6][9].

For precise control over structured outputs when using OpenAI-based models through OpenRouter, newer snapshots like gpt-4o-2024-08-06 with native JSON Schema support are recommended[2][10].

Citations:


AI Transform Schema Configuration Validation

  • API Model Compatibility: Confirm that the OpenRouter model in use (e.g., a newer snapshot like gpt-4o-2024-08-06) fully supports strict JSON schema enforcement. Older or legacy models (e.g., gpt-3.5-turbo) may only offer basic JSON output without enforcing all schema constraints.
  • Character Limit Enforcement: Be aware that character limits (such as max 80 characters for "title" and 280 for "summary") are indirectly managed via token limits rather than precise character counting. If strict character enforcement is required, consider adding client-side validations.
  • Schema Alignment: Ensure that the complete schema, as defined below, aligns with the intended template usage and any third-party validation tools (e.g., libraries similar to ZodGPT), which might be used to enforce constraints beyond the API’s native capabilities.
                  "plugin": "@curatedotfun/ai-transform",
                  "config": {
                    "prompt": "Create a structured summary of this content",
                    "schema": {
                      "title": {
                        "type": "string",
                        "description": "A catchy headline summarizing the content (max 80 chars)"
                      },
                      "summary": {
                        "type": "string",
                        "description": "A concise summary of the content and curator notes (max 280 chars)"
                      },
                      "keyPoints": {
                        "type": "string",
                        "description": "3-5 bullet points highlighting main ideas, one per line"
                      },
                      "category": {
                        "type": "string",
                        "description": "The inferred topic category (e.g., DeFi, NFTs, Layer2)"
                      },
                      "sentiment": {
                        "type": "string",
                        "description": "The overall sentiment (positive, neutral, or negative)"
                      }
                    },
                    "outputFormat": "json"
                  }

Please verify that the deployment environment uses a compatible model and that any necessary post-processing is in place to handle character limits and schema validation.
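Since the model enforces limits only indirectly via tokens, a small post-processing step can apply the schema's character limits client-side. A hedged sketch — the field limits mirror the config above, but the truncation policy and function name are assumptions, not part of the plugin:

```typescript
// Sketch: enforce the max-character hints from the schema after the model
// responds, truncating rather than rejecting. Limits mirror the config.
const LIMITS: Record<string, number> = { title: 80, summary: 280 };

function enforceCharLimits(
  output: Record<string, string>,
): Record<string, string> {
  const result: Record<string, string> = { ...output };
  for (const [field, max] of Object.entries(LIMITS)) {
    const value = result[field];
    if (typeof value === "string" && value.length > max) {
      // Truncate with an ellipsis so downstream templates stay within bounds
      result[field] = value.slice(0, max - 1) + "…";
    }
  }
  return result;
}
```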


13-13:

Details

❓ Verification inconclusive

Verify unpkg URLs and consider version pinning.

Using @latest tag in production can lead to unexpected behavior if a plugin releases a breaking change. Consider pinning to specific versions.

Also applies to: 17-17, 21-21, 25-25, 29-29


🏁 Script executed:

#!/bin/bash
# Check if the unpkg URLs are valid and get latest versions
for plugin in telegram notion simple-transform object-transform ai-transform; do
  echo "Checking @curatedotfun/${plugin}"
  curl -s "https://unpkg.com/@curatedotfun/${plugin}/package.json" | jq -r '.version'
done

Length of output: 1503


Heads Up: Verify unpkg Endpoints and Consider Version Pinning

  • The current configuration in curate.config.test.json (lines 13, 17, 21, 25, 29) uses the @latest tag in unpkg URLs. This practice can introduce instability if a plugin releases a breaking change.
  • Your initial verification script returned parse errors when fetching package metadata. The improved script below uses the @latest specifier in the URL to check if a valid package.json is returned.
  • Please manually verify that these unpkg endpoints (e.g., the URL for @curatedotfun/telegram@latest/package.json) return valid JSON metadata and extract a correct version number.
  • If valid output cannot be confirmed, consider pinning to a specific version to avoid unpredictable issues.

---

🏁 Script executed:

#!/bin/bash
# Improved script to verify that package.json for each plugin is fetched correctly using the @latest specifier
for plugin in telegram notion simple-transform object-transform ai-transform; do
  echo "Checking @curatedotfun/${plugin}@latest package.json"
  url="https://unpkg.com/@curatedotfun/${plugin}@latest/package.json"
  http_response=$(curl -s -o /tmp/tmpfile -w '%{http_code}' "$url")
  echo "HTTP Code: $http_response"
  if [ "$http_response" = "200" ]; then
    version=$(jq -r '.version' /tmp/tmpfile 2>/dev/null)
    if [ -n "$version" ] && [ "$version" != "null" ]; then
      echo "Version: $version"
    else
      echo "Warning: Valid HTTP response but could not extract version."
    fi
  else
    echo "Error: Failed to fetch package.json for @curatedotfun/${plugin}@latest"
  fi
  rm -f /tmp/tmpfile
  echo "-----"
done

Length of output: 3394


Attention: Verify Unpkg Redirect Handling and Version Pinning

  • The current configuration in curate.config.test.json (lines 13, 17, 21, 25, 29) uses the @latest tag, which, as observed, results in HTTP 302 redirects when attempting to fetch package metadata.
  • This redirection prevents the automated retrieval of a valid package.json (and its .version field) using tools like curl without additional options (e.g., -L to follow redirects).
  • Please manually verify that the unpkg endpoints (e.g., for @curatedotfun/telegram) resolve correctly when following redirects, or consider updating the retrieval approach.
  • Independently of redirection handling, pinning to specific plugin versions is advisable in production to avoid unexpected breaking changes.
backend/src/services/distribution/distribution.service.ts (10)

1-3: Imports for error handling look properly introduced.
No issues spotted with these changes.


5-6: Imports for PluginService and DistributorConfig look consistent.
The shift toward modular plugin handling is clear.


9-9: Constructor injection is well-structured.
This ensures testability and modularity.


11-14: Signature for distributeContent is clean.
Verify that higher-level callers properly supply a DistributorConfig and TwitterSubmission.


15-15: Destructuring distributor fields.
Confirm that config is always defined or handled gracefully when undefined.


17-23: Plugin lookup logic is straightforward.
Ensure that any null or undefined plugin returns are handled or mitigated before distribution.


25-31: Distribution via plugin.distribute appears correct.
No functional or syntax concerns identified.


32-37: Custom PluginExecutionError usage.
Throwing a specialized error helps differentiate plugin failures from other errors.


45-48: Conditionally rethrowing non-plugin errors is logical.
This pattern preserves plugin-specific failures while surfacing unexpected system issues.


53-53: Cleanup call is correct.
Ensures plugin resources are freed on shutdown.

backend/src/services/twitter/queries.ts (2)

3-3: Migration to BetterSQLite3Database.
This aligns with the shift to the new library—ensure the dependency is installed and properly configured.


6-75:

Details

❓ Verification inconclusive

Synchronous DB calls remain unchanged aside from the new DB type.
Verify references across the codebase to confirm that all related usage now expects a BetterSQLite3Database.


🏁 Script executed:

#!/bin/bash
# Searches for references to the old BunSQLiteDatabase usage to ensure complete migration.

rg 'BunSQLiteDatabase' -A 2 -B 2

Length of output: 32


Ensure complete migration to BetterSQLite3Database across the codebase.

The synchronous DB calls in backend/src/services/twitter/queries.ts correctly use the new DB type. However, since the initial search for BunSQLiteDatabase didn’t provide conclusive output, please perform manual verification (or run a more detailed search for BetterSQLite3Database) to ensure no legacy references remain and all related usage has been updated.

backend/src/types/config.ts (2)

13-13: Optional config in PluginConfig broadens usage.
Looks good; this provides additional flexibility for plugin setups.


24-24: TransformConfig now supports more generic config.
Allows a wider range of transformation parameters.

Dockerfile (4)

27-34: LGTM! Good practice in installing build dependencies.

The installation of build dependencies is well-structured with:

  • Clear package list
  • --no-install-recommends flag to minimize image size
  • Proper cleanup of apt cache

47-48: LGTM! Essential step for module federation.

Copying the frontend dist to the backend build context is crucial for the module federation setup to work correctly.


59-70: LGTM! Comprehensive runtime dependencies setup.

The installation of runtime dependencies is well-structured with:

  • Essential packages for LiteFS and runtime
  • --no-install-recommends flag
  • Proper cleanup

96-99: LGTM! Good security practice.

Setting secure environment defaults helps ensure consistent and secure runtime behavior.

curate.config.json (1)

13-30:

Details

❓ Verification inconclusive

Verify the unpkg.com URLs and consider version pinning.

While using @latest tag works, it's better to pin to specific versions for better reproducibility and to avoid unexpected breaking changes.


🌐 Web query:

What are the latest versions of @curatedotfun plugins on npm?

💡 Result:

The provided search results do not contain specific information about plugins under the @curatedotfun namespace on npm. None of the sources mention this specific package scope or its associated plugins.

To determine the latest versions of @curatedotfun plugins:

  1. Use npm CLI: Run npm view @curatedotfun/<plugin-name> version for each plugin
  2. Check npm registry: Visit npmjs.com/org/curatedotfun (if the organization exists)
  3. Programmatic check: Use latest-version ([7]) with code:
import latestVersion from 'latest-version';

console.log(await latestVersion('@curatedotfun/example-plugin'));

For automated version tracking in development environments, consider tools like:

  • package-info.nvim for Neovim users ([5])
  • Backstage's npm plugin for centralized tracking ([3])

Without direct access to current registry data or specific package mentions in the provided sources, these methods represent standard approaches to retrieve version information from npm.

Citations:


Ensure plugin version stability by pinning their versions

The current configuration in curate.config.json (lines 13-30) uses the @latest tag in the unpkg.com URLs. While this guarantees that you always use the most recent release, it can lead to unexpected breaking changes in the future. Since pinpointing the exact latest versions from the npm registry wasn’t possible via the web query, please consider the following actions:

  • Use the npm CLI (e.g., npm view @curatedotfun/telegram version) for each plugin to retrieve their current versions.
  • Replace the @latest tag in each URL with the specific version obtained. For example:
        "@curatedotfun/telegram": {
          "type": "distributor",
          "url": "https://unpkg.com/@curatedotfun/telegram@1.2.3/dist/remoteEntry.js"
        }
  • Repeat the process for all plugins to ensure reproducibility and stability across deployments.
backend/src/services/plugins/plugin.service.ts (2)

285-335: Handle partial load states more robustly
During loadModule, if a plugin partially loads but fails mid-initialize, subsequent calls may hold a stale or inconsistent remote module reference. Setting the remote’s status to “failed” eliminates some confusion, but consider a more atomic approach to revert or roll back partial states to ensure no stale references remain.
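One way to make the load atomic, per the comment: stage the module in a local variable and publish it to the registry only after initialize succeeds. The registry and loader shapes below are simplified stand-ins for the actual PluginService, not its real API:

```typescript
// Sketch: never store a half-initialized plugin in the shared registry.
interface RemotePlugin { initialize(): Promise<void>; }

type Loader = () => Promise<RemotePlugin>;

class PluginRegistry {
  private plugins = new Map<string, RemotePlugin>();

  async load(name: string, loader: Loader): Promise<RemotePlugin> {
    // Stage locally; the registry is untouched until init succeeds.
    const candidate = await loader();
    try {
      await candidate.initialize();
    } catch (err) {
      // Nothing was published, so no rollback of stale references is needed.
      throw new Error(`Plugin ${name} failed to initialize: ${err}`);
    }
    this.plugins.set(name, candidate); // publish only on full success
    return candidate;
  }

  get(name: string): RemotePlugin | undefined {
    return this.plugins.get(name);
  }
}
```

Because publication happens last, a failed initialize leaves no partial state to revert.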


450-456: Check time boundaries in isStale
The code uses the difference between Date.now() and loadedAt.getTime(); watch out for system clock issues or changes (NTP drift). If the server’s clock drastically adjusts, a plugin might be prematurely marked stale or remain active too long.
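A monotonic clock sidesteps the NTP-drift issue, since performance.now() is unaffected by wall-clock adjustments. A sketch under that assumption — the field names are illustrative, not the actual plugin.service.ts shape:

```typescript
// Sketch: track load time with a monotonic clock so wall-clock jumps
// (NTP corrections, manual changes) cannot mark plugins stale early.
import { performance } from "node:perf_hooks";

interface LoadedRemote { loadedAtMono: number; }

function markLoaded(): LoadedRemote {
  return { loadedAtMono: performance.now() };
}

function isStale(remote: LoadedRemote, ttlMs: number): boolean {
  return performance.now() - remote.loadedAtMono > ttlMs;
}
```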

backend/src/services/db/index.ts (2)

23-34: Validate fallback path
When DATABASE_URL is unset, the code defaults to file:${join(process.cwd(), ".db", "submissions.sqlite")}. This is convenient, but ensure that ephemeral containers or serverless environments have a stable location for .db. Otherwise, data might be lost on restarts.


54-75: Consider returning async results
Several methods (like saveSubmission or saveModerationAction) use synchronous calls on the underlying DB object. This is perfectly fine for small computations, but if heavy writes occur, consider verifying that these won't block the event loop if your architecture requires concurrency.

backend/src/services/db/queries.ts (1)

427-454: Cleanup strategy
cleanupOldSubmissionCounts uses a date comparison in the WHERE clause. If your system’s clock changes drastically, old entries might not get cleaned up or might be prematurely removed. Consider whether a time-based job or stronger temporal logic is needed if you anticipate clock instability.

"dev": "bunx turbo run dev",
"build": "bunx turbo run build",
"start": "cd backend && NODE_ENV=production bun run dist/index.js",
"start": "cd backend && NODE_ENV=production bun run dist/main.js",

💡 Verification agent

🧩 Analysis chain

Verify the new entry point path.

The start script now points to dist/main.js instead of dist/index.js. Ensure that:

  1. The build process generates this file
  2. All imports are correctly updated

🏁 Script executed:

#!/bin/bash
# Check if the new entry point exists and verify its imports
cd backend && \
  echo "Checking if dist/main.js exists:" && \
  [ -f dist/main.js ] && echo "✓ File exists" || echo "✗ File not found" && \
  echo "Checking for any remaining references to index.js:" && \
  rg -l "index\.js" .

Length of output: 289


Action Required: Verify and Fix the Entry Point

  • The automated check indicates that "backend/dist/main.js" was not found.
  • Please ensure the build process is updated to produce "dist/main.js". If the file isn't generated or is located elsewhere, update the start script accordingly.
  • Also, confirm that all references to "index.js" have been removed.

Comment on lines +32 to +34
"url": "{SUPABASE_URL}",
"key": "{SUPABASE_KEY}",
"table": "curated_content",

🛠️ Refactor suggestion

Enhance security guidance for API credentials.

The configuration example shows API credentials directly in the JSON. Consider:

  1. Adding a warning about never committing these values
  2. Providing examples of secure credential management
  3. Adding guidance on API key permissions and scope
       "plugin": "@curatedotfun/supabase",
       "config": {
-        "url": "{SUPABASE_URL}",
-        "key": "{SUPABASE_KEY}",
+        "url": "${process.env.SUPABASE_URL}",
+        "key": "${process.env.SUPABASE_KEY}",
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"url": "{SUPABASE_URL}",
"key": "{SUPABASE_KEY}",
"table": "curated_content",
"url": "${process.env.SUPABASE_URL}",
"key": "${process.env.SUPABASE_KEY}",
"table": "curated_content",

Comment on lines +46 to +57
// new rspack.container.ModuleFederationPlugin({
// name: "host",
// runtimePlugins: [
// require.resolve("@module-federation/node/runtimePlugin"),
// ],
// shared: {
// "@curatedotfun/types": {
// singleton: true,
// eager: true
// },
// }
// })

⚠️ Potential issue

Complete the module federation configuration.

The commented-out module federation configuration needs to be uncommented and completed to align with the PR objectives of migrating to use module federation for plugins.

Uncomment and update the configuration:

-    //   new rspack.container.ModuleFederationPlugin({
-    //     name: "host",
-    //     runtimePlugins: [
-    //       require.resolve("@module-federation/node/runtimePlugin"),
-    //     ],
-    //     shared: {
-    //       "@curatedotfun/types": {
-    //         singleton: true,
-    //         eager: true
-    //       },
-    //     }
-    //   })
+    new rspack.container.ModuleFederationPlugin({
+      name: "host",
+      runtimePlugins: [
+        require.resolve("@module-federation/node/runtimePlugin"),
+      ],
+      remotes: {
+        // Add remote entries for plugins
+      },
+      shared: {
+        "@curatedotfun/types": {
+          singleton: true,
+          eager: true
+        },
+      }
+    })
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
// new rspack.container.ModuleFederationPlugin({
// name: "host",
// runtimePlugins: [
// require.resolve("@module-federation/node/runtimePlugin"),
// ],
// shared: {
// "@curatedotfun/types": {
// singleton: true,
// eager: true
// },
// }
// })
new rspack.container.ModuleFederationPlugin({
name: "host",
runtimePlugins: [
require.resolve("@module-federation/node/runtimePlugin"),
],
remotes: {
// Add remote entries for plugins
},
shared: {
"@curatedotfun/types": {
singleton: true,
eager: true
},
}
})

Comment on lines +1 to +11
export class MockProcessorService {
public processedItems: { content: any; config: any }[] = [];

async process(content: any, config: any) {
this.processedItems.push({ content, config });
}

async processBatch(items: any[], config: any) {
this.processedItems.push({ content: items, config });
}
}

🛠️ Refactor suggestion

Improve type safety in the mock service.

The mock service uses any types which reduces type safety. Consider using proper types from the actual ProcessorService.

Apply this diff to improve type safety:

+import type { ProcessorConfig, ProcessorInput, ProcessorResult } from '../../services/processor/processor.service';

 export class MockProcessorService {
-  public processedItems: { content: any; config: any }[] = [];
+  public processedItems: { content: ProcessorInput; config: ProcessorConfig }[] = [];

-  async process(content: any, config: any) {
+  async process(content: ProcessorInput, config: ProcessorConfig): Promise<ProcessorResult> {
     this.processedItems.push({ content, config });
+    return { success: true };
   }

-  async processBatch(items: any[], config: any) {
+  async processBatch(items: ProcessorInput[], config: ProcessorConfig): Promise<ProcessorResult[]> {
     this.processedItems.push({ content: items, config });
+    return items.map(() => ({ success: true }));
   }
 }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
export class MockProcessorService {
public processedItems: { content: any; config: any }[] = [];
async process(content: any, config: any) {
this.processedItems.push({ content, config });
}
async processBatch(items: any[], config: any) {
this.processedItems.push({ content: items, config });
}
}
import type { ProcessorConfig, ProcessorInput, ProcessorResult } from '../../services/processor/processor.service';
export class MockProcessorService {
public processedItems: { content: ProcessorInput; config: ProcessorConfig }[] = [];
async process(content: ProcessorInput, config: ProcessorConfig): Promise<ProcessorResult> {
this.processedItems.push({ content, config });
return { success: true };
}
async processBatch(items: ProcessorInput[], config: ProcessorConfig): Promise<ProcessorResult[]> {
this.processedItems.push({ content: items, config });
return items.map(() => ({ success: true }));
}
}
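To make the suggestion concrete, here is a self-contained sketch of the typed mock; the ProcessorConfig/ProcessorInput/ProcessorResult shapes below are stand-ins for illustration, not the real exports from processor.service:

```typescript
// Stand-in types; the real ones live in services/processor/processor.service.
type ProcessorConfig = Record<string, unknown>;
interface ProcessorInput { id: string; text: string; }
interface ProcessorResult { success: boolean; }

class MockProcessorService {
  public processedItems: {
    content: ProcessorInput | ProcessorInput[];
    config: ProcessorConfig;
  }[] = [];

  async process(content: ProcessorInput, config: ProcessorConfig): Promise<ProcessorResult> {
    this.processedItems.push({ content, config });
    return { success: true };
  }

  async processBatch(items: ProcessorInput[], config: ProcessorConfig): Promise<ProcessorResult[]> {
    this.processedItems.push({ content: items, config });
    return items.map(() => ({ success: true }));
  }
}
```

With concrete types in place, tests can assert on processedItems without casts, and the compiler catches mismatched call sites.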

Comment on lines 14 to 16
saveRssItem(feedId: string, item: RssItem): void {
saveRssItem(this.db, feedId, item);
}

🛠️ Refactor suggestion

Consider making all database operations async for consistency.

The getRssItems method is async while saveRssItem and deleteOldRssItems remain sync. This inconsistency could lead to confusion and potential issues with transaction handling.

Consider making all database operations async for consistency:

-  saveRssItem(feedId: string, item: RssItem): void {
+  async saveRssItem(feedId: string, item: RssItem): Promise<void> {
-    saveRssItem(this.db, feedId, item);
+    await saveRssItem(this.db, feedId, item);
   }

-  deleteOldRssItems(feedId: string, limit: number): void {
+  async deleteOldRssItems(feedId: string, limit: number): Promise<void> {
-    deleteOldRssItems(this.db, feedId, limit);
+    await deleteOldRssItems(this.db, feedId, limit);
   }

Also applies to: 22-24
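As a sketch of the uniform-async shape — using an in-memory map as a stand-in for the real database handle, so this is not the repository's actual code — every method returns a Promise even when the underlying driver (e.g. better-sqlite3) is synchronous:

```typescript
interface RssItem { guid: string; title: string; }

class RssStore {
  // Stand-in for the real db handle; a sync driver call would go in each method body.
  private items = new Map<string, RssItem[]>();

  async saveRssItem(feedId: string, item: RssItem): Promise<void> {
    const list = this.items.get(feedId) ?? [];
    list.push(item);
    this.items.set(feedId, list);
  }

  async getRssItems(feedId: string): Promise<RssItem[]> {
    return this.items.get(feedId) ?? [];
  }

  async deleteOldRssItems(feedId: string, limit: number): Promise<void> {
    const list = this.items.get(feedId) ?? [];
    this.items.set(feedId, list.slice(-limit)); // keep only the newest `limit` items
  }
}
```

Callers can then `await` every store method uniformly, which keeps transaction-style call sites consistent regardless of the driver underneath.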

Comment on lines +51 to +59
if (isProduction) {
twitterService = new TwitterService({
username: process.env.TWITTER_USERNAME!,
password: process.env.TWITTER_PASSWORD!,
email: process.env.TWITTER_EMAIL!,
twoFactorSecret: process.env.TWITTER_2FA_SECRET,
});
await twitterService.initialize();
} else {

💡 Verification agent

🧩 Analysis chain

Validate required environment variables for Twitter credentials.

In production mode, the app attempts to initialize TwitterService with credentials from environment variables, but it doesn't validate their presence or correctness before usage. This might lead to runtime failures or ambiguous errors in case of missing or malformed environment variables.

You can run the following script to confirm that all required environment variables (TWITTER_USERNAME, TWITTER_PASSWORD, TWITTER_EMAIL) are defined in production builds:


🏁 Script executed:

#!/bin/bash
# Description: Check for presence of Twitter-related env vars in production.

if [[ "$NODE_ENV" == "production" ]]; then
  for var in TWITTER_USERNAME TWITTER_PASSWORD TWITTER_EMAIL; do
    if [[ -z "${!var}" ]]; then
      echo "Missing environment variable: $var"
      exit 1
    fi
  done
  echo "All Twitter environment variables are set."
else
  echo "Non-production environment. Skipping checks."
fi

Length of output: 133


Action Required: Implement Runtime Validation for Twitter Credentials in Production

The current implementation instantiates TwitterService in production using critical environment variables without verifying that they're set. This lack of validation may lead to ambiguous runtime failures if any variable is missing or malformed. Note that the provided verification script uses the NODE_ENV variable to detect production mode, while the code relies on the isProduction flag. These two mechanisms should be aligned to accurately reflect production conditions.

  • Ensure runtime checks: Before initializing TwitterService in production, add explicit runtime checks to validate that process.env.TWITTER_USERNAME, TWITTER_PASSWORD, and TWITTER_EMAIL are defined.
  • Align production detection: Either update the verification script or adjust the production detection logic in the code so they are consistent (e.g., use NODE_ENV === 'production' in both cases or defer entirely to the isProduction flag).
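A minimal runtime guard along these lines (the helper name is illustrative; the env var names come from the snippet above) fails fast with a clear message instead of passing undefined into TwitterService:

```typescript
// Throws if any required environment variable is unset or empty.
function requireEnv(names: string[]): Record<string, string> {
  const missing = names.filter((n) => !process.env[n]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
  const out: Record<string, string> = {};
  for (const n of names) out[n] = process.env[n] as string;
  return out;
}

// Hypothetical usage at production startup, before constructing TwitterService:
// const creds = requireEnv(["TWITTER_USERNAME", "TWITTER_PASSWORD", "TWITTER_EMAIL"]);
```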

# 🔌 Plugins

curate.fun supports various plugins that extend its functionality, particularly for content distribution. Each plugin enables you to distribute curated content to different platforms and channels.
Curate.fun has a unique plugin pattern that uses [module federation](https://module-federation.io/), which allows the bot to load and use remote plugins without needing to install them or redeploy. These plugins extend its functionality, particularly for content ingestion, transformation, and distribution.

🛠️ Refactor suggestion

Add security considerations for module federation.

While module federation offers great flexibility, consider adding security implications and best practices:

  • Risks of using remote code
  • HTTPS requirement
  • Integrity checks
  • Version pinning recommendations
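The last three bullets can be enforced mechanically at load time. The sketch below is illustrative only — the field names are not the project's actual remote-plugin schema:

```typescript
// Hypothetical remote-plugin descriptor illustrating the practices above.
interface RemotePluginConfig {
  name: string;
  url: string;        // must be HTTPS
  version: string;    // pin an exact version, not "latest"
  integrity?: string; // optional subresource-integrity style hash
}

// Rejects configs that would load remote code insecurely.
function validateRemote(cfg: RemotePluginConfig): void {
  if (!cfg.url.startsWith("https://")) {
    throw new Error(`Refusing to load plugin ${cfg.name} over a non-HTTPS URL`);
  }
  if (cfg.version === "latest") {
    throw new Error(`Pin an exact version for plugin ${cfg.name} instead of "latest"`);
  }
}
```

Running such a check before the federation runtime fetches the remote entry keeps misconfigured plugins from ever executing.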

Comment on lines +30 to +42
transform?: TransformConfig[]; // Per-distributor transforms
}

export interface StreamConfig {
enabled: boolean;
transform?: TransformConfig;
transform?: TransformConfig[]; // Global transforms
distribute?: DistributorConfig[];
}

export interface RecapConfig {
enabled: boolean;
schedule: string;
transform: TransformConfig;
transform?: TransformConfig[]; // Global transforms for recap

🛠️ Refactor suggestion

Enhanced transform arrays in DistributorConfig, StreamConfig, and RecapConfig.
This enables more sophisticated processing pipelines. However, there is a potential type mismatch between DistributorConfig.config: Record<string, string> (line 29) and the plugin service usage accepting Record<string, unknown>.

Below is a proposed fix to align types for distributor config:

 export interface DistributorConfig {
   plugin: string;
-  config: Record<string, string>;
+  config: Record<string, unknown>;
   transform?: TransformConfig[];
 }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
transform?: TransformConfig[]; // Per-distributor transforms
}
export interface StreamConfig {
enabled: boolean;
transform?: TransformConfig;
transform?: TransformConfig[]; // Global transforms
distribute?: DistributorConfig[];
}
export interface RecapConfig {
enabled: boolean;
schedule: string;
transform: TransformConfig;
transform?: TransformConfig[]; // Global transforms for recap
export interface DistributorConfig {
plugin: string;
- config: Record<string, string>;
+ config: Record<string, unknown>;
transform?: TransformConfig[];
}
export interface StreamConfig {
enabled: boolean;
transform?: TransformConfig[]; // Global transforms
distribute?: DistributorConfig[];
}
export interface RecapConfig {
enabled: boolean;
schedule: string;
transform?: TransformConfig[]; // Global transforms for recap
}
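For context, a feed configuration using these arrayed transforms might look like the following. The plugin names and config keys are illustrative stand-ins, not the project's exact schema:

```typescript
interface TransformConfig { plugin: string; config: Record<string, unknown>; }
interface DistributorConfig {
  plugin: string;
  config: Record<string, unknown>;
  transform?: TransformConfig[]; // per-distributor transforms
}
interface StreamConfig {
  enabled: boolean;
  transform?: TransformConfig[]; // global transforms
  distribute?: DistributorConfig[];
}

// Global transform runs first, then the per-distributor chain.
const stream: StreamConfig = {
  enabled: true,
  transform: [{ plugin: "@curatedotfun/object-transform", config: { mapping: {} } }],
  distribute: [
    {
      plugin: "@curatedotfun/telegram",
      config: { botToken: "{BOT_TOKEN}", channelId: "{CHANNEL_ID}" },
      transform: [{ plugin: "@curatedotfun/ai-transform", config: { prompt: "Summarize" } }],
    },
  ],
};
```

Widening config to Record<string, unknown> is what lets the nested objects above (rather than only string values) type-check.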

Comment on lines +81 to +120
export class PluginService {
private static instance: PluginService;
private remotes: Map<string, RemoteState> = new Map();
private instances: Map<string, InstanceState<PluginType>> = new Map();
private endpoints: Map<string, PluginEndpoint[]> = new Map();
private app: Elysia | null = null;
private configService: ConfigService;

// Time in milliseconds before cached items are considered stale
private readonly instanceCacheTimeout: number = 7 * 24 * 60 * 60 * 1000; // 7 days (instance of a plugin with config)
private readonly moduleCacheTimeout: number = isProduction
? 30 * 60 * 1000
: 10 * 1000; // 30 minutes in production (module loaded from remote), 10 seconds in development
private readonly maxAuthFailures: number = 2; // one less than 3 to avoid locking
private readonly retryDelays: number[] = [1000, 5000]; // Delays between retries in ms

private constructor() {
this.configService = ConfigService.getInstance();
}

/**
* Gets the singleton instance of PluginService
*/
public static getInstance(): PluginService {
if (!PluginService.instance) {
PluginService.instance = new PluginService();
}
return PluginService.instance;
}

/**
* Sets the Elysia app instance for endpoint registration
*/
public setApp(app: Elysia) {
this.app = app;
// Register any pending endpoints
for (const [name, endpoints] of this.endpoints) {
this.registerEndpoints(name, endpoints);
}
}

⚠️ Potential issue

Introduce locks or synchronization for shared mutable structures
PluginService relies on Maps (this.remotes, this.instances, etc.) that can be mutated by multiple callers. If this service is accessed concurrently (e.g., multiple requests arrive simultaneously), there is a risk of data races or partially initialized plugin states. A synchronization mechanism (mutexes or atomic operations) might be prudent in high-concurrency scenarios.
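One common single-process pattern is a promise-chain mutex keyed by instance id, so concurrent getPlugin calls for the same key serialize instead of racing. This is a sketch of the technique, not the service's actual code:

```typescript
// Serializes async work per key: each call runs after the previous call
// for the same key has settled, then cleans up its map entry.
class KeyedMutex {
  private tails = new Map<string, Promise<unknown>>();

  runExclusive<T>(key: string, fn: () => Promise<T>): Promise<T> {
    const prev = this.tails.get(key) ?? Promise.resolve();
    const run = prev.then(fn, fn); // start fn once the previous task settles
    const tail = run.catch(() => {}); // keep the chain alive even if fn rejects
    this.tails.set(key, tail);
    tail.then(() => {
      if (this.tails.get(key) === tail) this.tails.delete(key);
    });
    return run;
  }
}
```

Wrapping the load-and-initialize section of getPlugin in runExclusive(instanceId, …) would prevent two callers from double-loading the same remote module.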

Comment on lines +122 to +283
/**
* Gets or creates a plugin instance with the specified configuration
*/
public async getPlugin<
T extends PluginType,
TInput = unknown,
TOutput = unknown,
TConfig extends Record<string, unknown> = Record<string, unknown>,
>(
name: string,
pluginConfig: { type: T; config: TConfig },
): Promise<PluginTypeMap<TInput, TOutput, TConfig>[T]> {
try {
// Get plugin metadata from app config
const pluginMeta = this.configService.getPluginByName(name);

if (!pluginMeta) {
throw new PluginLoadError(
name,
"",
new Error(`Plugin ${name} not found in app configuration`),
);
}

// Create full config with URL from app config
const config: PluginConfig<T, TConfig> = {
type: pluginConfig.type,
url: pluginMeta.url,
config: pluginConfig.config,
};

const normalizedName = this.packageToRemoteName(name);
const instanceId = createPluginInstanceKey(normalizedName, config);

// Check existing instance
const instance = this.instances.get(instanceId);
if (instance) {
if (instance.authFailures >= this.maxAuthFailures) {
throw new PluginError(`Plugin ${name} disabled due to auth failures`);
}

if (!this.isStale(instance.loadedAt, this.instanceCacheTimeout)) {
return instance.instance as PluginTypeMap<
TInput,
TOutput,
TConfig
>[T];
}
}

// Get or initialize remote
let remote = this.remotes.get(normalizedName);
if (!remote) {
remote = {
config: { name: normalizedName, entry: config.url },
status: "active",
};
this.remotes.set(normalizedName, remote);
}

// Create and initialize instance with retries
let lastError: Error | null = null;
for (let attempt = 0; attempt <= this.retryDelays.length; attempt++) {
try {
// Load module if needed
if (
!remote.module ||
!remote.loadedAt ||
this.isStale(remote.loadedAt, this.moduleCacheTimeout)
) {
remote.status = "loading";
await this.loadModule(remote);
}

if (remote.status === "failed") {
throw remote.lastError || new Error("Module loading failed");
}

// Create and initialize instance
const newInstance = new remote.module() as PluginTypeMap<
TInput,
TOutput,
TConfig
>[T];
await newInstance.initialize(config.config);

// Validate instance implements required interface
if (!this.validatePluginInterface(newInstance, config.type)) {
throw new PluginInitError(
name,
new Error(
`Plugin does not implement required ${config.type} interface`,
),
);
}

// Register endpoints if available
if (this.app && (newInstance as PluginWithEndpoints).getEndpoints) {
const endpoints = (newInstance as PluginWithEndpoints)
.getEndpoints!();
this.registerEndpoints(normalizedName, endpoints);
}

// Cache successful instance
const instanceState: InstanceState<T> = {
instance: newInstance as PluginTypeMap<
unknown,
unknown,
Record<string, unknown>
>[T],
config: config as PluginConfig<T, Record<string, unknown>>,
loadedAt: new Date(),
authFailures: 0,
remoteName: normalizedName,
};
this.instances.set(instanceId, instanceState);

return newInstance;
} catch (error) {
lastError = error as Error;

// Track auth failure
if (instance) {
instance.authFailures += 1;

if (instance.authFailures >= this.maxAuthFailures) {
logger.error(`Plugin ${name} disabled due to auth failures`);
// Clean up endpoints before disabling
this.unregisterEndpoints(normalizedName);
throw new PluginError(
`Plugin ${name} disabled after ${instance.authFailures} auth failures`,
);
}
}

// If we have more retries, wait and try again
if (attempt < this.retryDelays.length) {
logger.warn(
`Plugin ${name} initialization failed, retrying in ${this.retryDelays[attempt]}ms`,
{ error },
);
await new Promise((resolve) =>
setTimeout(resolve, this.retryDelays[attempt]),
);
}
}
}

// If we get here, all retries failed
// Clean up failed remote
this.unregisterEndpoints(normalizedName);
throw lastError || new PluginError(`Failed to initialize plugin ${name}`);
} catch (error) {
logger.error(`Plugin error: ${name}`, { error });
throw error instanceof PluginError
? error
: new PluginError(
`Unexpected error with plugin ${name}`,
error as Error,
);
}
}

🛠️ Refactor suggestion

Potential confusion handling recoverable vs. auth failures
When errors occur in getPlugin, the code increments authFailures only if an instance already exists, potentially conflating authentication vs. initialization (e.g., network fetch) failures. Consider tracking error subtypes (network/timeouts vs. auth) for more precise error handling and plugin disabling logic.
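One way to make that distinction explicit is with error subclasses that the retry loop can inspect; the class and helper names below are illustrative, not part of the existing codebase:

```typescript
// Distinct subtypes let the loop count only genuine auth failures toward
// disabling a plugin, while transient network errors just consume a retry.
class PluginAuthError extends Error {}
class PluginNetworkError extends Error {}

function shouldCountAuthFailure(error: unknown): boolean {
  return error instanceof PluginAuthError;
}

function shouldRetry(error: unknown): boolean {
  return error instanceof PluginNetworkError;
}
```

In the catch block, instance.authFailures would then increment only when shouldCountAuthFailure(error) is true, instead of on every failure path.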

@elliotBraem elliotBraem merged commit 6c48325 into main Feb 24, 2025
elliotBraem added a commit that referenced this pull request Feb 24, 2025
elliotBraem added a commit that referenced this pull request Feb 24, 2025
elliotBraem added a commit that referenced this pull request Feb 27, 2025
* Migrates to use module federation for plugins (#61)

* Adds module federation support for distributor + transformer plugins (#55)

* adds memory bank and plugin loader

* adds plugin service

* wip

* working distributor plugins w/ module federation, some TOODs

* working, nice, clean plugin service

* fmt

* addresses comments

* set the correct remotes

* update memory bank and documentation

* implements transform plugins

* update memories

* fmt

* fmt

* fix docs for ai-transform

* fmt

* remove broken link

* fix docs

* fmt

* passing tests

* adds tests

* adds multi item error

* add necessary dependencies for better-sqlite-3

* remove libsql and copy over frontend dist

* install @libsql/client

* adds externals

* fmt

* use node for building

* convert to better-sqlite-3

* improve docker image

* fmt

* Revert "Migrates to use module federation for plugins (#61)" (#62)

This reverts commit 6c48325.

* Migrates plugins to use module federation (#63)

* Adds module federation support for distributor + transformer plugins (#55)

* adds memory bank and plugin loader

* adds plugin service

* wip

* working distributor plugins w/ module federation, some TOODs

* working, nice, clean plugin service

* fmt

* addresses comments

* set the correct remotes

* update memory bank and documentation

* implements transform plugins

* update memories

* fmt

* fmt

* fix docs for ai-transform

* fmt

* remove broken link

* fix docs

* fmt

* passing tests

* adds tests

* adds multi item error

* add necessary dependencies for better-sqlite-3

* remove libsql and copy over frontend dist

* install @libsql/client

* adds externals

* fmt

* use node for building

* convert to better-sqlite-3

* improve docker image

* fmt

* migrate to hono and node

* clean up

* nodemon

* fmt

* replace bun

* fix bun command

* fix npm command

* fix scripts

* npx

* correct command

* node module hoisting

* fix packages

* fmt

* package.json

* wip for good deploy

* fmt

* normalizes db async

* fmt

* fix dist

* working plugins

* fmt
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
