Early-adopter / Power user topics
with LLMs
MCP + Prompt Engineering + RAG
MCP
Model Context Protocol is a standardized way for AI models to securely access external tools, data, and services. It defines how context is shared between models and systems, enabling interoperable, real-time integration without hard-coding connections.
Estimated percentile: Top 5–10% of all ChatGPT users
If you are thinking about tooling, connectors, agent-style workflows, and structured context passing, you are a power user.
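To make "structured context passing" concrete, here is a minimal, illustrative sketch of the idea behind MCP: a server advertises tools with machine-readable schemas, and a model invokes them by name with structured arguments. This is not the official MCP SDK or wire protocol; the tool name, schema shape, and handlers below are hypothetical stand-ins.

```python
import json

# Hypothetical tool registry. Real MCP servers expose tools over JSON-RPC;
# this sketch only shows the discover-then-call pattern.
TOOLS = {
    "get_weather": {
        "description": "Return current weather for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
        # Stubbed handler standing in for a real data source.
        "handler": lambda args: {"city": args["city"], "temp_c": 21},
    },
}

def list_tools():
    """Advertise tools (name, description, schema) so a model can discover them."""
    return [
        {"name": name, "description": t["description"], "input_schema": t["input_schema"]}
        for name, t in TOOLS.items()
    ]

def call_tool(name, arguments):
    """Dispatch a model-issued tool call and return a structured result."""
    result = TOOLS[name]["handler"](arguments)
    return {"content": json.dumps(result)}

print(list_tools()[0]["name"])
print(call_tool("get_weather", {"city": "Oslo"}))
```

The key design point is that the model never hard-codes the integration: it discovers what is available from the schema list and calls tools through one uniform interface.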
Prompt Engineering
Prompt engineering is the practice of designing and refining inputs to guide AI models toward accurate, useful, and consistent outputs. It involves structuring instructions, examples, and constraints to improve reasoning, control behavior, and optimize performance for specific tasks.
If you write structured, multi-constraint prompts and iterate to refine your intent, you are among the top 2–5% of users globally.
Add Velocity recommends thinking in terms of workflows rather than one-off questions. Focus on reproducibility and precision.
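One way to get that reproducibility is to build prompts programmatically instead of retyping them. The sketch below assembles a structured, multi-constraint prompt from a role, explicit rules, few-shot examples, and the task; the wording and section order are illustrative choices, not a prescribed format.

```python
def build_prompt(task, constraints, examples=()):
    """Assemble a structured prompt: role, explicit rules, few-shot examples, task."""
    lines = ["You are a precise assistant.", "", "Rules:"]
    lines += [f"- {c}" for c in constraints]
    if examples:
        lines += ["", "Examples:"]
        for inp, out in examples:
            lines += [f"Input: {inp}", f"Output: {out}"]
    lines += ["", f"Task: {task}"]
    return "\n".join(lines)

prompt = build_prompt(
    task="Summarize the report in exactly three bullet points.",
    constraints=[
        "Answer in plain English.",
        "Keep each bullet under 15 words.",
        "Do not invent figures not present in the report.",
    ],
    examples=[("Q3 sales rose 12% on strong EU demand.", "- Sales up 12%, driven by EU demand")],
)
print(prompt)
```

Because the template is code, every run of the workflow sends the same structure, and refining a constraint means editing one list entry rather than rewriting the whole prompt.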
RAG
Retrieval-Augmented Generation combines information retrieval with text generation. An AI model first retrieves relevant documents from external data sources, then uses them as context to generate more accurate, up-to-date, and grounded responses.
RAG is used mostly by machine-learning (ML) engineers, startup builders, data engineers, and research-oriented developers.
Working with RAG architecture, embeddings, document chunking, or memory systems likely puts you in the top 10% of global LLM users.
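The retrieve-then-generate loop can be sketched in a few lines. Production RAG systems use learned embeddings and vector databases; this toy version substitutes bag-of-words cosine similarity so it runs with the standard library alone, and the documents and query are made up for illustration.

```python
import math
import re
from collections import Counter

# Tiny stand-in corpus; a real system would chunk and embed source documents.
DOCS = [
    "MCP standardizes how models access external tools and data.",
    "RAG retrieves documents and feeds them to the model as context.",
    "Prompt engineering structures instructions for consistent outputs.",
]

def vectorize(text):
    """Bag-of-words vector (a crude stand-in for a learned embedding)."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    denom = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / denom if denom else 0.0

def retrieve(query, k=1):
    """Rank documents by similarity to the query and keep the top k."""
    qv = vectorize(query)
    return sorted(DOCS, key=lambda d: cosine(qv, vectorize(d)), reverse=True)[:k]

def build_rag_prompt(query):
    """Ground the generation step by prepending retrieved documents as context."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context above."

print(build_rag_prompt("What retrieves documents as context for the model?"))
```

The two stages are visible here: `retrieve` finds relevant text, and `build_rag_prompt` hands it to the model as grounding, which is what keeps answers accurate and up to date.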
