Llamaestro is built on top of Maestro. It provides users with the following benefits:
- Cheaper - Maestro was pricey (Opus is a baller; Llama 3 is a penny pincher)
- Easier to run - you don't have to manage packages
- Faster - Llamaestro uses the Groq API and is so fast I had to build in a sleep timer to avoid hitting rate limits.
- More user-friendly - you don't have to look at code
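The sleep timer mentioned above can be sketched as a simple throttle that spaces out API calls to stay under a requests-per-minute cap. This is an illustrative sketch, not Llamaestro's actual code; the class name and the rate limit value are assumptions.

```python
import time

class Throttle:
    """Sleep-based rate limiter: enforces a minimum gap between calls."""

    def __init__(self, max_calls_per_minute: int):
        self.min_interval = 60.0 / max_calls_per_minute
        self.last_call = 0.0  # monotonic timestamp of the previous call

    def wait(self):
        """Block just long enough to respect the rate limit, then record the call."""
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

# Call throttle.wait() before each Groq API request.
throttle = Throttle(max_calls_per_minute=30)  # hypothetical limit, not Groq's actual quota
```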
- Go to the GroqCloud dashboard and create a new API key.
- Copy this Google Colab notebook.
- Click "Secrets" (the key icon) in the left sidebar, then click "Add secret".
- Name the secret "GROQ_API_KEY" and paste your GroqCloud API key as the value.
- Click "Runtime" in the top navigation bar and select "Run all" (or just hit Cmd+Enter a few times).
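Once the secret is set, the notebook can read it and use it to build requests to the Groq API. The sketch below shows the general shape only: the `build_chat_request` helper is hypothetical, and the environment-variable fallback is an assumption for running outside Colab.

```python
import os

try:
    # Inside Colab, secrets added via the Secrets panel are read through userdata.
    from google.colab import userdata
    GROQ_API_KEY = userdata.get("GROQ_API_KEY")
except ImportError:
    # Outside Colab, fall back to an environment variable (assumption).
    GROQ_API_KEY = os.environ.get("GROQ_API_KEY", "")

def build_chat_request(prompt: str, model: str = "llama3-70b-8192") -> dict:
    """Assemble a chat-completion request body for the Groq API (hypothetical helper)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

request = build_chat_request("Break this objective into sub-tasks: write a short blog post")
```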
Maestro is a multi-agent framework designed to break down complex objectives into smaller sub-tasks and coordinate the execution of these sub-tasks using different AI agents. The framework consists of three main components:
- Llamaestro (Llama 3): The primary AI agent responsible for decomposing the objective into smaller sub-tasks and coordinating the overall workflow.
- Little Llama: An AI agent that handles the execution of individual sub-tasks generated by Llamaestro.
- Big Llama: An AI agent that reviews and refines the results produced by Little Llama to ensure the final output meets the original objective.
The Maestro framework is particularly useful for tackling complex, multi-step objectives that require a systematic approach and the coordination of multiple AI agents.
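The three-agent workflow above can be sketched as a simple loop: decompose, execute each sub-task, then refine. The function names and bodies below are placeholders standing in for actual LLM calls, not Llamaestro's real implementation.

```python
def llamaestro_decompose(objective: str) -> list[str]:
    """Orchestrator: split the objective into sub-tasks (stub for a Llama 3 call)."""
    return [f"sub-task 1 of: {objective}", f"sub-task 2 of: {objective}"]

def little_llama_execute(sub_task: str) -> str:
    """Worker: carry out one sub-task (stub for a Llama 3 call)."""
    return f"result for {sub_task}"

def big_llama_refine(objective: str, results: list[str]) -> str:
    """Reviewer: combine and polish sub-results into the final output (stub)."""
    return f"final output for '{objective}' from {len(results)} sub-results"

def run(objective: str) -> str:
    sub_tasks = llamaestro_decompose(objective)
    results = [little_llama_execute(t) for t in sub_tasks]
    return big_llama_refine(objective, results)
```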
- Research & Summaries
- Coding small to medium apps
- Writing a book
If you have thoughts, connect with me on X or LinkedIn.
Don't forget to leave a star on GitHub!
