DeepSeek’s AI Revolution? A Week Later, Reality Sets In
My AI conversations - my own personal view.
What you make of open-source AI, or DeepSeek's leap into the market last week, really depends on what you believe the S-curve in AI represents (and, of course, where you stand politically and what you think about monopolies). Is this a new S-curve, one defined by lowest cost, efficiency, software, and data? Or is it our current one, accelerating the established paradigm of building generative AI infrastructure for AGI and many more applications?
A week ago, DeepSeek stunned the AI world with a claim that seemed too good to be true: training a GPT-4-level model at a fraction of the cost. Investors panicked, wiping billions off AI market value, and tech giants scrambled to reassess their AI strategies. Was this the moment that reshaped AI economics forever?
Fast forward seven days—perspective is settling in.
Yes, DeepSeek's open-source approach and efficiency claims are significant. But the broader picture suggests this isn't a fundamental revolution in AI cost structures - not a new S-curve. Instead, DeepSeek appears to fit squarely within the existing AI infrastructure race, rather than rewriting its rules.
1. The $6M AI Claim—What’s Missing?
DeepSeek reported training its model for under $6 million, a fraction of what OpenAI and Google typically spend. But analysts have started poking holes in that number:
Excludes R&D & Iteration Costs: Training runs aren’t one-shot events. OpenAI’s GPT models go through months of iterative refinement, requiring substantial compute before the final training phase even begins. DeepSeek’s reported cost likely covers only the last training stage, not the full development cycle.
Hardware Access Matters: DeepSeek leveraged thousands of high-end GPUs (reportedly NVIDIA H800s) - already a constrained resource in China. If other firms tried replicating this model, they would struggle to source similar compute at the same price.
Distillation or Duplication? Reports suggest DeepSeek may have trained its model using techniques that distill knowledge from existing high-performing LLMs. While common in AI research, this raises ethical and competitive questions - if DeepSeek benefited from OpenAI's models, does its cost advantage really hold as it grows and scales?
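As a rough sanity check, the headline figure can be reproduced with back-of-envelope arithmetic. The GPU-hour count and the $2/GPU-hour rental rate below are as publicly reported in DeepSeek's own technical report for the final training run; the point is what the arithmetic leaves out, not the multiplication itself.

```python
# Back-of-envelope check of DeepSeek's reported training cost.
# Figures are as publicly reported (DeepSeek-V3 technical report);
# the $2/GPU-hour rental rate is that report's own assumption.
gpu_hours = 2_788_000        # reported H800 GPU-hours, final run only
rate_per_gpu_hour = 2.00     # assumed rental rate, USD

final_run_cost = gpu_hours * rate_per_gpu_hour
print(f"Final training run: ${final_run_cost / 1e6:.2f}M")  # ≈ $5.58M

# What this figure excludes (categories, not reported numbers):
# prior experiments, ablations, failed runs, data pipelines, salaries.
```

The multiplication checks out, which is exactly why the number is persuasive and misleading at once: it prices a single successful run, not the full development cycle.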
2. The Real AI Cost Curve: Still Climbing
Even if DeepSeek optimized its approach, AI remains a compute-intensive industry.
AI costs aren't just about training - deployment, fine-tuning, and inference all scale with usage as models serve millions of users.
Future AI capabilities—autonomous agents, multimodal models, and real-time reasoning—will demand even more compute, ensuring NVIDIA’s dominance isn’t fading anytime soon.
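To make the inference point concrete, here is a minimal sketch of a linear serving-cost model. Every parameter (user counts, query volumes, token prices) is hypothetical and illustrative, not drawn from DeepSeek's or any vendor's actual figures; the sketch only shows how falling per-token prices are swamped by growing usage.

```python
# Hypothetical serving-cost sketch: all parameters are illustrative.
def monthly_inference_cost(users, queries_per_user_per_day,
                           tokens_per_query, usd_per_million_tokens):
    """Linear cost model: serving cost grows directly with usage."""
    tokens_per_month = (users * queries_per_user_per_day
                        * 30 * tokens_per_query)
    return tokens_per_month / 1_000_000 * usd_per_million_tokens

# A 10x cheaper per-token price is swamped by 100x more users.
small = monthly_inference_cost(1_000_000, 5, 1_000, 5.00)
large = monthly_inference_cost(100_000_000, 5, 1_000, 0.50)
print(f"1M users   @ $5.00/Mtok: ${small:,.0f}/month")   # $750,000
print(f"100M users @ $0.50/Mtok: ${large:,.0f}/month")   # $7,500,000
```

This is why cheaper training does not by itself bend the overall cost curve: success at scale shifts the bill from training to inference.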
3. Open-Source AI: Important, but Not a Game-Changer (Yet)
Open-source models have merits. Companies like Meta and Mistral are betting on decentralized AI, but the economic reality is clear:
Open-source lowers barriers but doesn’t remove infrastructure needs.
Most enterprise AI applications still rely on large-scale cloud compute, meaning hardware providers remain in control.
Regulatory concerns may soon tighten the leash on open-source AI, especially in the U.S. and EU.
Conclusion: A Reset, Not a Revolution
DeepSeek is impressive. It highlights the growing role of software efficiency in AI. But the AI cost curve is still dominated by compute power, scale, and infrastructure economics.
DeepSeek is part of the AI race, not the end of it.