Preact Codegen is a package that helps developers quickly set up Preact applications without needing a complex build configuration. Users can describe their application requirements in plain text, including desired routing and signal management features. The package processes this input to generate a structured, ready-to-use Preact application code snippet.
- Generates Preact application code from plain-text user input
- Supports routing and signal management features
- Easy to integrate and extend
- Simple and fast setup
Installation:

```bash
pip install preact_codegen
```

Usage:

```python
from preact_codegen import preact_codegen

user_input = "Describe your Preact application requirements here..."
api_key = "your_api_key_here"  # Optional; if not provided, the default LLM7 is used

response = preact_codegen(
    user_input=user_input,
    api_key=api_key,
)

print(response)
```

Parameters:

- `user_input`: The user input text to process (type: `str`)
- `llm`: The `langchain` LLM instance to use (optional, type: `Optional[BaseChatModel]`), defaults to `ChatLLM7` from `langchain_llm7`
- `api_key`: The API key for LLM7 (optional, type: `Optional[str]`), defaults to `os.getenv("LLM7_API_KEY")` or `None`
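Assuming the return value is the generated code as a plain string (which the `print(response)` call above suggests), it can be written straight to a file. The requirements text and the output filename in the sketch below are purely illustrative.

```python
from preact_codegen import preact_codegen

# A plain-text description of the desired app, including routing and signals.
requirements = (
    "A small Preact todo app with two routes (/ and /done) "
    "and a signal-backed task list."
)

response = preact_codegen(user_input=requirements)

# Save the generated snippet for review (the filename is arbitrary).
with open("generated_app.jsx", "w", encoding="utf-8") as f:
    f.write(response)
```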
The package uses `ChatLLM7` from `langchain_llm7` by default. You can safely pass your own LLM instance (based on `langchain`) if you want to use another LLM.

Example for `ChatOpenAI`:

```python
from langchain_openai import ChatOpenAI
from preact_codegen import preact_codegen

llm = ChatOpenAI()  # reads OPENAI_API_KEY from the environment

response = preact_codegen(
    user_input=user_input,
    llm=llm,
)
```
Example for `ChatAnthropic`:

```python
from langchain_anthropic import ChatAnthropic
from preact_codegen import preact_codegen

# langchain_anthropic requires a model name; the one below is just an example
# (reads ANTHROPIC_API_KEY from the environment).
llm = ChatAnthropic(model="claude-3-5-sonnet-latest")

response = preact_codegen(
    user_input=user_input,
    llm=llm,
)
```
Example for `ChatGoogleGenerativeAI`:

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from preact_codegen import preact_codegen

# langchain_google_genai requires a model name; the one below is just an example
# (reads GOOGLE_API_KEY from the environment).
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")

response = preact_codegen(
    user_input=user_input,
    llm=llm,
)
```
- The default rate limits for the LLM7 free tier are sufficient for most use cases of this package. If you want higher rate limits, you can pass your own API key via the `LLM7_API_KEY` environment variable or directly, e.g. `preact_codegen(api_key="your_api_key")`, as shown in the sketch below.
- You can get a free API key by registering at LLM7.
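As a minimal sketch of the two ways to supply the key described above (the key values are placeholders):

```python
import os

from preact_codegen import preact_codegen

user_input = "A Preact counter app using signals"

# Option 1: set the environment variable once and omit the argument;
# the package falls back to os.getenv("LLM7_API_KEY").
os.environ["LLM7_API_KEY"] = "your_api_key"
response = preact_codegen(user_input=user_input)

# Option 2: pass the key explicitly per call.
response = preact_codegen(user_input=user_input, api_key="your_api_key")
```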
- Bug reports and feature requests: GitHub Issues
- Author name: Eugene Evstafev
- Author email: hi@eugene.plus