
Developer Resources 📚

Curated resources I reference while exploring AI engineering and building with LLMs. This is a living collection—updated as I discover new tools, providers, and learning resources.

Work in Progress

This page is actively being developed as I explore new tools and platforms. Expect frequent updates as I test and evaluate different services.


Model Providers & APIs

Commercial Providers

Open Source Models

  • Meta Llama: Llama 3.1, Llama 3.2 (via Hugging Face, Ollama; loading sketch after this list)
  • Mixtral: 8x7B, 8x22B mixture-of-experts
  • Qwen: Alibaba's multilingual models
  • DeepSeek: Code-focused models
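
One way to try these open weights is through the Hugging Face transformers pipeline. This is a minimal sketch, not a recommended setup: the model ID and prompt are illustrative (a small, non-gated Qwen model is used here), and gated models such as Llama require accepting the license and authenticating with a Hugging Face token first.

# Minimal text-generation sketch with Hugging Face transformers (pip install transformers torch)
# Model ID and prompt are illustrative; gated models (e.g. Llama) need prior license acceptance.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # small, openly downloadable model for a quick test
)

print(pipe("Explain mixture-of-experts in one sentence:", max_new_tokens=60)[0]["generated_text"])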

Local LLM Development

Running Models Locally

Ollama (macOS, Linux, Windows)

# Install
curl -fsSL https://ollama.ai/install.sh | sh

# Popular models
ollama run llama3.1
ollama run codellama
ollama run mistral
ollama run phi3
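
Once a model is pulled, Ollama also exposes a local HTTP API (port 11434 by default), so you can script against it from Python. A minimal sketch, assuming the server is running and llama3.1 has already been pulled; the prompt and timeout are illustrative.

# Query a locally running Ollama server (default port 11434) with the requests library.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1",           # any model previously pulled with `ollama run` / `ollama pull`
        "prompt": "Explain RAG in two sentences.",
        "stream": False,               # return one JSON response instead of a token stream
    },
    timeout=120,
)
print(resp.json()["response"])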

LM Studio

  • GUI application for running local models
  • Supports GGUF format models
  • Built-in model browser

GPT4All

  • Cross-platform desktop app
  • Privacy-focused local inference
  • No GPU required


Development Tools

Python Libraries & Frameworks

Core AI Frameworks

  • LangChain: Agent workflows, chains, memory
  • LangGraph: Stateful multi-actor applications
  • LlamaIndex: RAG and document indexing
  • OpenAI Python SDK: Official OpenAI client
  • Anthropic Python SDK: Official Claude client
  • LiteLLM: Unified interface for calling 100+ LLM APIs using the OpenAI format (sketch after this list)
  • Mem0: Adaptive memory layer that enables LLMs to remember user preferences and context across conversations
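
As a rough sketch of LiteLLM's unified interface: the model names below are illustrative, and the relevant provider key (for example OPENAI_API_KEY) is assumed to be set in the environment.

# LiteLLM routes one completion() call to many providers using OpenAI-style messages.
# Assumes the matching provider API key (e.g. OPENAI_API_KEY) is set in the environment.
from litellm import completion

response = completion(
    model="gpt-4o-mini",  # swap to e.g. "claude-3-5-sonnet-20240620" or "ollama/llama3.1" without changing the call
    messages=[{"role": "user", "content": "Give me one tip for structuring a RAG pipeline."}],
)
print(response.choices[0].message.content)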

Vector Databases

  • Pinecone: Managed vector database
  • Chroma: Open-source embeddings database (sketch after this list)
  • Qdrant: Vector search engine
  • Weaviate: AI-native vector database
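
A minimal local Chroma sketch, assuming the chromadb package is installed; the collection name and documents are illustrative, and Chroma's default bundled embedding function is used.

# In-memory Chroma collection: add a few documents and run a similarity query.
import chromadb

client = chromadb.Client()                        # in-memory; use chromadb.PersistentClient(path=...) to persist
collection = client.create_collection("notes")    # illustrative collection name

collection.add(
    ids=["1", "2"],
    documents=["Ollama runs models locally.", "Pinecone is a managed vector database."],
)

results = collection.query(query_texts=["local model runner"], n_results=1)
print(results["documents"])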

Utilities

  • tiktoken: Token counting for OpenAI models (example after this list)
  • pydantic: Data validation and structured outputs
  • httpx: Async HTTP client
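
For example, counting tokens with tiktoken before sending a prompt; the model name and prompt are illustrative, and older tiktoken releases may not recognize newer model names.

# Count how many tokens a prompt would use for an OpenAI model.
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-4o")   # raises KeyError on tiktoken versions that predate the model
prompt = "Summarize the main trade-offs between local and hosted LLM inference."
print(len(encoding.encode(prompt)), "tokens")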

IDEs & Code Editors

VS Code

  • GitHub Copilot: AI code completion and chat
  • Python: Linting, debugging, IntelliSense
  • Jupyter: Notebook support
  • Continue: Open-source AI assistant
  • Pylance: Fast Python language server
  • Docker: Container management

IntelliJ IDEA

  • GitHub Copilot: AI code completion
  • Python Plugin: Python/Django support
  • AWS Toolkit: AWS service integration
  • Database Tools: Built-in SQL support

Jupyter & Notebooks

  • JupyterLab: Interactive development environment
  • Jupyter AI: Generative AI extension for JupyterLab
  • Google Colab: Free cloud notebooks with GPU
  • Deepnote: Collaborative data science notebooks

CLI Tools

AI Development

  • aider: AI pair programming in the terminal
  • llm: Simon Willison's LLM CLI tool
  • fabric: AI workflow automation

General Development

  • gh: GitHub CLI
  • httpie: API testing
  • jq: JSON processing


Cloud Providers

AWS

  • Bedrock: Managed access to foundation models (Claude, Llama, Titan); invocation sketch after this list
  • SageMaker: Custom model training and deployment
  • Lambda: Serverless compute for AI workflows
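
A minimal sketch of calling a Bedrock-hosted model through boto3, assuming AWS credentials are configured and access to the chosen model has been enabled in the account; the region and model ID are illustrative.

# Call a Bedrock-hosted model through boto3's bedrock-runtime Converse API.
# Assumes AWS credentials are configured and model access is enabled in the account.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")    # illustrative region

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",              # illustrative model ID
    messages=[{"role": "user", "content": [{"text": "List two use cases for Bedrock."}]}],
)
print(response["output"]["message"]["content"][0]["text"])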

Google Cloud

  • Vertex AI: Unified ML platform with Gemini models
  • Cloud Run: Containerized AI deployments

Azure

  • Azure OpenAI Service: Enterprise access to OpenAI models (sketch after this list)
  • Azure AI Studio: End-to-end AI development platform
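
A minimal sketch using the official openai Python SDK's Azure client; the endpoint, API version, and deployment name are placeholders for values from your own Azure resource.

# Call an Azure OpenAI deployment via the openai SDK's AzureOpenAI client.
# Endpoint, API version, and deployment name are placeholders for your own Azure resource.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",   # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",                                   # illustrative API version
)

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",   # the *deployment* name, not the model family name
    messages=[{"role": "user", "content": "One sentence on when to choose Azure OpenAI."}],
)
print(response.choices[0].message.content)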


AI Companies & Startups

Infrastructure & Platforms

Agent & Workflow Tools

  • LangSmith: LLM application observability
  • Flowise: Low-code LLM app builder
  • n8n: Workflow automation with AI nodes

Voice AI


Learning Resources

Online Courses

Documentation & Guides

Communities

