Stars
A collection of projects designed to help developers quickly get started with building deployable applications using the Claude API
Collection of awesome LLM apps with AI Agents and RAG using OpenAI, Anthropic, Gemini and open-source models.
📚 《从零开始构建智能体》 (Building Agents from Scratch) — a from-scratch tutorial on agent principles and practice
A book for learning the foundations of LLMs
🐙 Guides, papers, lessons, notebooks and resources for prompt engineering, context engineering, RAG, and AI Agents.
[WIP] Resources for AI engineers. Also contains supporting materials for the book AI Engineering (Chip Huyen, 2025)
Get up and running with Kimi-K2.5, GLM-4.7, DeepSeek, gpt-oss, Qwen, Gemma and other models.
[Book-2021] Practical MLOps O'Reilly Book
A Python toolbox that loads 172 public time series datasets for machine/deep learning with a single line of code. Datasets span multiple domains, including healthcare, finance, power, traffic, weather,…
Free MLOps course from DataTalks.Club
The globally designated official GitHub of "Run-ology" (润学), collecting its purpose, principles, theories, and real-world examples of "running" (emigrating); it addresses the three big questions of why to run, where to run, and how to run, and aims to become the core creed and belief of new Chinese people.
Code the Transformer neural network components piece by piece
The GitHub repository for the paper "Informer", accepted at AAAI 2021.
Simple web app example serving a PyTorch model using streamlit and FastAPI
Chinese translation of Causal Inference for the Brave and True. All code is in Python, applicable to econometrics, quantitative sociology, policy evaluation, and related fields. Original English author: Matheus Facure
A Library for Advanced Deep Time Series Models for General Time Series Analysis.
An annotated implementation of the Transformer paper.
AISystem covers AI systems: the full-stack underlying technologies of AI, including AI chips, AI compilers, and AI inference and training frameworks
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
Reading the source code of Python requests to learn more Pythonic ways of writing Python code.
Hung-yi Lee Deep Learning Tutorial (《李宏毅深度学习教程》, recommended by Prof. Hung-yi Lee 👍, the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
