APPENDIX A — The 12 Core Concepts of AI
(Beginner-Friendly Guide for the AI Learning Roadmap + AI Income Lab)
Learning AI isn’t hard, but the flood of jargon, buzzwords, and hype can make it feel that way.
This appendix cuts through that noise.
These are the 12 Core Concepts every beginner must understand before building real AI-powered systems — especially the type of arbitrage automations, agents, and creative micro-systems we introduce in the AI Income Lab.
Each concept includes:
- Plain-English definition
- Where it fits in the AI Roadmap
- How it applies to your real-world income experiments (AI Flip, product creation, automation, research tasks, etc.)
1. GenAI (Generative AI)
What it is:
AI that creates new content, such as words, images, sounds, videos, or code.
Where it fits in the AI Roadmap:
This is Level 1: Understanding how GenAI thinks so you can mirror that learning process.
Where you’ll use it in the Income Lab:
- Writing listings for marketplace flips
- Drafting product descriptions
- Generating artwork
- Creating scripts, titles, captions
- Rapid brainstorming and prototyping
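To make this concrete, here is a minimal sketch of asking a GenAI model to draft a marketplace listing. It assumes the OpenAI Python SDK and an API key in your environment; the model name, system prompt, and item are illustrative placeholders, not a required setup.

```python
# Minimal sketch: generate a product listing with a GenAI model.
# Assumes the OpenAI Python SDK (`pip install openai`) and an
# OPENAI_API_KEY environment variable; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # swap in whichever model you have access to
    messages=[
        {"role": "system", "content": "You write short, persuasive marketplace listings."},
        {"role": "user", "content": "Write a 3-sentence listing for a lightly used Herman Miller office chair."},
    ],
)

print(response.choices[0].message.content)
```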
2. LLM (Large Language Model)
What it is:
A model trained on enormous amounts of text so it can understand and generate language; it is the technology behind tools like ChatGPT and Claude.
Where it fits:
LLMs are the “brain” behind everything in the AI Roadmap because they run your agents, help automate tasks, and serve as the foundation of every module.
Where you’ll use it:
- ChatGPT, Claude, Gemini, DeepSeek
- Building your own mini-agents
- Business planning
- Research and data extraction
3. Foundational Model
What it is:
A massive pre-trained model (like GPT-5) used as the base for more specialized tools.
Where it fits:
This is the starting point for fine-tuning, RAG systems, personal agents, and niche automations.
Income Lab relevance:
You’ll later choose which foundational model to build your agentic system on.
4. Fine-Tuning
What it is:
Training a general model on your data so it speaks your language.
Where it fits:
This helps students understand how models become specialists.
Income Lab relevance:
- Creating your own “brand voice” model
- Building custom customer-service bots
- Teaching AI to mimic your workflows
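As a sketch of what “training a model on your data” looks like in practice, here is how a few brand-voice examples might be written to a JSONL file in the chat-style fine-tuning format OpenAI uses. The example conversations are made up, and a real fine-tune needs far more data than this.

```python
# Sketch: prepare a tiny "brand voice" dataset in the JSONL chat format
# commonly used for fine-tuning (each line is one training example).
# The example conversation is an invented placeholder.
import json

examples = [
    {
        "messages": [
            {"role": "system", "content": "You write listings in my brand voice: short, playful, no hype."},
            {"role": "user", "content": "Describe a vintage Polaroid camera."},
            {"role": "assistant", "content": "A pocket-sized time machine. Click, whirr, and the 70s fall out."},
        ]
    },
    # ...add dozens to hundreds of real examples before actually fine-tuning
]

with open("brand_voice.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```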
5. RAG (Retrieval-Augmented Generation)
What it is:
A system where the LLM pulls in fresh, external data to stay accurate.
Where it fits:
This is the next evolutionary step after prompt engineering, as it forms the architecture of real AI products.
Income Lab relevance:
- Building your own knowledge-base search engine
- Personal research assistant
- Automated marketplace scraper + analyzer
- Up-to-date product pricing agents
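The sketch below shows the RAG pattern end to end: retrieve the most relevant note, then hand it to the model as context. A real system would use embedding-based vector search (see concept 9); here a toy word-overlap score stands in for the retriever, and the notes, question, and model name are placeholders.

```python
# Sketch of the RAG pattern: retrieve relevant context, then generate.
# A toy word-overlap retriever stands in for a real vector search;
# the notes and model name are illustrative. Requires the OpenAI SDK.
from openai import OpenAI

client = OpenAI()

notes = [
    "Herman Miller Aeron chairs in good condition resell for $400-700 locally.",
    "Polaroid SX-70 cameras sell fast when the listing includes a test photo.",
    "Shipping oversized furniture usually erases the profit margin.",
]

def retrieve(question: str, docs: list[str]) -> str:
    """Return the doc sharing the most words with the question (toy retriever)."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

question = "Is flipping an Aeron chair worth it?"
context = retrieve(question, notes)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```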
6. Prompt Engineering
What it is:
The skill of giving AI clear, structured instructions so it performs well.
Where it fits:
Prompt Engineering is Week 1’s core skill, because it unlocks everything downstream.
Income Lab relevance:
- Writing precise prompts for arbitrage
- Creating “prompt recipes” for repeatable mini-businesses
- Structuring prompts for image generation
- Building Chain-of-Thought workflows
- Teaching AI how you think
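Here is one way to turn “clear, structured instructions” into a reusable prompt recipe. The role / task / constraints / output-format structure is a common pattern, and the field values below are only illustrations.

```python
# Sketch of a reusable "prompt recipe": role, task, constraints, output format.
# The field values are illustrative; reuse the same template across tasks.
PROMPT_TEMPLATE = """\
Role: {role}
Task: {task}
Constraints:
- {constraints}
Output format: {output_format}
"""

prompt = PROMPT_TEMPLATE.format(
    role="You are an experienced eBay reseller.",
    task="Evaluate whether this item is worth flipping: 'Aeron chair, $150, small tear in mesh'.",
    constraints="Assume a $50 budget for repairs and fees; be honest about risk.",
    output_format="Verdict (flip / skip), expected profit range, top 2 risks.",
)

print(prompt)  # paste into any chat model, or send it via an API call
```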
7. Context Window
What it is:
The limit on how much information an AI model can “hold in working memory” at once, measured in tokens.
Where it fits:
Understanding context windows explains why AI sometimes forgets earlier parts of a conversation.
Income Lab relevance:
- Feeding AI longer documents
- Running batch spreadsheets through an AI agent
- Designing multi-step workflows
- Troubleshooting agent failures (“it forgot step 3”)
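A quick way to see the context window in practice is to count a document’s tokens and split it into chunks that fit. This sketch uses the tiktoken library; the 8,000-token budget and the sample document are assumptions you would replace with your model’s real limit and your real file.

```python
# Sketch: check whether a document fits a model's context window and,
# if not, split it into token-sized chunks. Uses tiktoken
# (`pip install tiktoken`); the 8,000-token budget is an assumed limit.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
TOKEN_BUDGET = 8_000  # pretend this is your model's usable window

# Stand-in for a long research document you actually care about.
document = "Aeron chair pricing research, local listings, repair costs. " * 2000

tokens = enc.encode(document)
print(f"Document is {len(tokens)} tokens")

if len(tokens) > TOKEN_BUDGET:
    chunks = [
        enc.decode(tokens[i : i + TOKEN_BUDGET])
        for i in range(0, len(tokens), TOKEN_BUDGET)
    ]
    print(f"Split into {len(chunks)} chunks that each fit the window")
```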
8. Hallucinations
What it is:
When AI “makes something up” but sounds confident.
Where it fits:
Critical for Week 1 → “AI Literacy: Trust but Verify.”
Income Lab relevance:
- Avoiding misinformation in product listings
- Ensuring your pricing data is correct
- Sanity-checking business analysis
- Reducing risk in your agent automations
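One simple “trust but verify” habit is a second pass that asks the model to flag anything in its own answer that should be checked against a real source. The sketch below assumes the OpenAI SDK; the prompts and model name are illustrative, and this is a sanity check, not a guarantee of accuracy.

```python
# Sketch of a "trust but verify" second pass: generate an answer, then
# ask the model to flag claims that need checking against real sources.
# This reduces risk but does not guarantee accuracy. Requires the OpenAI SDK.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

answer = ask("What do Herman Miller Aeron chairs typically resell for?")
review = ask(
    "List every specific number or factual claim in the text below that I "
    f"should verify against a real source before using it:\n\n{answer}"
)
print(answer)
print("\nVerify before trusting:\n" + review)
```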
9. Embeddings
What they are:
Numerical vectors that represent the meaning of text, images, or concepts, so similar things end up mathematically close together.
Where it fits:
This is the core of search and categorization.
Income Lab relevance:
- Building your own semantic search (for products, ideas, notes)
- Matching product images to trending memes
- Recommendation systems
- Filtering misinformation
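A tiny sketch of how “meaning vectors” make semantic search work: embed a few product notes and a query, then rank by cosine similarity. It assumes the OpenAI embeddings endpoint and numpy; the model name and texts are placeholders.

```python
# Sketch: semantic search with embeddings and cosine similarity.
# Assumes the OpenAI SDK and numpy; the model name and texts are illustrative.
import numpy as np
from openai import OpenAI

client = OpenAI()

texts = [
    "Ergonomic office chair, mesh back, adjustable arms",
    "Vintage instant film camera, tested and working",
    "Standing desk converter, barely used",
]
query = "comfortable seating for a home office"

resp = client.embeddings.create(model="text-embedding-3-small", input=texts + [query])
vectors = [np.array(item.embedding) for item in resp.data]
doc_vecs, query_vec = vectors[:-1], vectors[-1]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The office chair should score highest even though no words overlap with the query.
for text, vec in zip(texts, doc_vecs):
    print(f"{cosine(query_vec, vec):.3f}  {text}")
```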
10. Tokens
What they are:
The small chunks of text AI models process instead of whole words. (“Elephant” ≈ 1 token; “antidisestablishmentarianism” ≈ 7 tokens, depending on the model.)
Where it fits:
Understanding tokens helps with cost, context limits, and model performance.
Income Lab relevance:
- Knowing why long prompts cost more
- Designing efficient prompt templates
- Structuring AI-friendly documents
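To see tokens for yourself, the tiktoken library can show exactly how a word gets split; keep in mind that the split (and the count) depends on which model’s tokenizer you use.

```python
# Sketch: inspect how words split into tokens. Uses tiktoken
# (`pip install tiktoken`); counts vary between models' tokenizers.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by several OpenAI models

for word in ["elephant", "antidisestablishmentarianism"]:
    token_ids = enc.encode(word)
    pieces = [enc.decode([t]) for t in token_ids]
    print(f"{word!r} -> {len(token_ids)} tokens: {pieces}")
```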
11. Multimodal AI
What it is:
AI that can handle text, images, audio, and video together.
Where it fits:
This represents a major leap toward Agentic AI.
Income Lab relevance:
- Image-to-text product analysis
- Screenshot → spreadsheet agents
- Voice-activated research assistants
- Video analysis for marketing and product creation
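Here is a minimal sketch of sending an image and a question in the same request, which is the building block behind “screenshot → spreadsheet” style agents. It assumes the OpenAI SDK with a vision-capable model; the model name and image URL are placeholders.

```python
# Sketch: image + text in one request (multimodal input).
# Assumes the OpenAI SDK and a vision-capable model; the model name
# and image URL are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # any vision-capable model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is this product, and what condition does it look like?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/listing-photo.jpg"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```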
12. Zero-Shot Learning
What it is:
AI doing tasks it was never explicitly trained to do.
Where it fits:
This concept primes learners for how Agentic AI handles never-seen-before tasks.
Income Lab relevance:
- Instant categorization of products
- Evaluating deals without prior examples
- On-the-fly problem solving for unexpected tasks
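Zero-shot in practice often just means asking the model to do a task with no examples at all, as in this small classification sketch; the categories, item, and model name are illustrative.

```python
# Sketch: zero-shot classification, i.e. no examples given, just the task.
# Assumes the OpenAI SDK; the categories and model name are illustrative.
from openai import OpenAI

client = OpenAI()

categories = ["furniture", "electronics", "collectibles", "clothing", "other"]
item = "1998 Pokemon Base Set Charizard holo card, lightly played"

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": f"Classify this item into exactly one of {categories}: {item}. "
                   "Reply with the category only.",
    }],
)
print(response.choices[0].message.content)
```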