How to become a context engineer in 2025 (the $100K skill no one's teaching)
The shift from prompt collecting to context building is creating a new class of AI professionals
A founding member of OpenAI just validated something I've been saying about AI writing for months. Even though he was talking about app development, the principle is the same: context engineering beats prompt engineering.
But here's what Andrej Karpathy's tweet about "context engineering" doesn't tell you:
This shift is about to create an entirely new category of AI professionals who understand BOTH context and prompting. And the window to position yourself is closing fast.
Whether you're a ghostwriter with one client or many, a CEO writing your own content, or a creator trying to scale, what I'm about to share will fundamentally change how you work with AI.
Everyone's still hoarding generic prompts like Pokemon cards
Every week, I see someone sharing their big prompt collection or library.
Meanwhile, I'm creating unique prompts that reference deep context. Because I learned something different ghostwriting for CEOs and leading AI content at startups.
Here's the disconnect: We've been trained to think prompts alone are the solution. Like if we just find the perfect combination of words, AI will finally "get" us.
But that's like thinking you can build a house with just a hammer. Sure, it's an essential tool. But where's your blueprint? Your materials? Your foundation?
Context is the foundation that makes your prompts actually work.
The best AI writers use way fewer prompts—but better ones
I know AI writers/content leads charging $10K/month retainers. CEOs who've completely replaced their content teams. Creators scaling to 6 figures with AI.
They're not collecting prompts. They're not copying templates from newsletters.
Instead, they're building "context architecture"—complete knowledge systems that change how prompts work entirely.
Here's the difference: When you have rich context, your prompts become specific to that context. They're not shareable as single prompts. They're not generic. They're powerful precisely because they tap into YOUR unique knowledge base.
That's context engineering. And it's what separates the professionals from the people still trading generic prompts like baseball cards.
Context engineering changes the entire game
Here's the shift that's happening:
Context engineers ask: "What context do I need to build so my prompts can reference exactly what matters?"
It's the difference between using generic formulas and creating unique, powerful prompts that only work with your system.
When you become a context engineer, you transform how prompts work. Instead of hoarding templates, you build three things:
Your knowledge architecture (what your AI knows)
Your context hierarchy (how information is organized)
Your prompt strategy (how to reference your unique context for maximum impact)
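To make the first piece concrete, here's a minimal sketch of what a "knowledge architecture" can look like in practice: a folder of context files per client, stitched into one block your prompts can reference. The folder and file names are hypothetical examples, not a prescribed structure.

```python
from pathlib import Path

# Hypothetical context library for one client.
# The layer names below are illustrative, not a required format.
LAYERS = [
    "voice_samples.md",   # how they actually sound
    "frameworks.md",      # their strategic frameworks
    "perspectives.md",    # their real opinions and takes
]

def build_system_context(context_dir: Path) -> str:
    """Concatenate whichever context layers exist into one
    system-prompt block, with a heading per layer."""
    sections = []
    for name in LAYERS:
        path = context_dir / name
        if path.exists():
            sections.append(f"## {name}\n{path.read_text()}")
    return "\n\n".join(sections)
```

The point isn't the code; it's that the knowledge lives in files you own and maintain, so every prompt can lean on the same foundation instead of restating it.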
This isn't just more efficient. It's transformational. Your prompts become intellectual property, not copy-paste templates.
I learned this the hard way
When I was ghostwriting for executives and leading AI content at startups, I had a problem.
The output often sounded the same. Despite different prompts for each client or company, everything came out generic.
It wasn't until I started building complete knowledge systems for each client that everything clicked.
Instead of managing hundreds of generic prompts, I built comprehensive context libraries. Voice samples. Strategic frameworks. Their actual thoughts and perspectives.
Suddenly, my prompts became unique to each client's context. Any one of those prompts is worthless to anyone else. But with the right context? It produces gold.
This system is what I used to quickly build my own newsletter to 8,000+ subscribers. It's how I charge higher monthly retainers and work faster.
The NEW 80/20 rule nobody talks about
A while back, I told you about the 80/20 rule of AI writing—80% systematic prompting, 20% actual collaboration and writing.
Well, that's shifted dramatically in just a few months. I have a new 80/20 split.
Here's what I've learned after building these systems for myself and others:
80% of your results come from context setup. 20% comes from how you prompt/talk with the LLM.
The game has completely changed. Yet everyone's still obsessing over that 20%.
But here's the thing: That 20% only works when you have the 80%. Without context, even the best prompts often (not always) produce generic garbage. With context, your prompts become precision instruments.
Context engineers are creating prompts that reference their specific frameworks, their unique knowledge, their proprietary methods. Prompts that would be useless to anyone else but are incredibly powerful for them.
Your first context engineering project starts today
Forget downloading another prompt pack. Here's what actually matters:
→ Create your knowledge architecture
What are your frameworks? Your principles? Your unique perspectives? Get them out of your head and into your AI's brain. Not writing for yourself? Capture your client's or company's knowledge instead.
→ Design your context hierarchy
Learn what goes in project knowledge vs. custom instructions vs. prompts. This is where the magic happens.
→ Develop your interaction patterns
Stop using AI like a vending machine. Start treating it like a thought partner who knows your brain.
For ghostwriters, this becomes your $10K service. For CEOs and creators, this becomes your competitive advantage.
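One way to think about the hierarchy above is as layers that change at different speeds: project knowledge rarely, custom instructions occasionally, the prompt every time. This sketch shows that layering in generic chat-API terms; the function and message shape are illustrative, not any specific vendor's API.

```python
def compose_request(project_knowledge: str,
                    custom_instructions: str,
                    task_prompt: str) -> list[dict]:
    """Layer the context hierarchy: stable knowledge first,
    standing instructions next, the task prompt last."""
    system = (
        "# Project knowledge (changes rarely)\n"
        + project_knowledge
        + "\n\n# Custom instructions (changes occasionally)\n"
        + custom_instructions
    )
    return [
        {"role": "system", "content": system},
        # Only this part changes per request.
        {"role": "user", "content": task_prompt},
    ]
```

Knowing which layer a piece of information belongs in is the actual skill; the code just makes the split visible.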
The window is smaller than you think
Every major AI tool has already built context-first architecture:
ChatGPT Projects
Claude Projects
Gemini Gems
The tools are here. The market's ready. The only question is whether you'll be ready.
In 6 months, pure "prompt engineering" will be outdated for writers and creators. But context engineers who can build knowledge systems AND create powerful prompts that reference them?
Those are the people who become irreplaceable.
—Alex
Founder of AI Disruptor (and context engineer)
PS: If you didn't catch my live stream last week, I showed how to create AI tools with Claude using your own (or client) context.