English is the new programming language
Jensen Huang says English is the new programming language. Andrej Karpathy coined "vibe coding" and told everyone to just give in to the vibes. Suno tells you that you can make music without knowing a single chord. Every few weeks, another AI company rolls out the same pitch: you don't need to understand the thing anymore, just describe what you want. It sounds liberating. And honestly, for a quick prototype or a weekend hack, it kind of is. But there's a question nobody selling this vision wants you to sit with: if you don't know what you're doing, what's happening underneath the stack, or why something works the way it does, are you really building anything?
The pitch is always the same
The narrative has become remarkably consistent across industries. Suno says you can make professional-sounding music without music theory, years of practice, or even a basic understanding of song structure. Vibe coding tools say you can ship production apps without reading a single line of code. AI image generators say you can be a designer without understanding composition, color theory, or typography. The underlying promise is that the barrier to entry was the problem all along. That the years people spent learning fundamentals were wasted effort, a tax imposed by primitive tools. Now that AI can handle the hard parts, anyone can do anything.

Jensen Huang put it bluntly at a 2025 keynote: "Why program in Python? So weird." He argued that human language is the new programming interface, and that everyone is now a programmer.

Karpathy's original vibe coding tweet was more self-aware. He described it as something fun for "throwaway weekend projects," where you accept all suggestions, skip reading diffs, and just see what happens. But the internet took the concept and ran with it as a legitimate engineering methodology.
Fast is not the same as far
There's an old saying that applies here: if you want to go fast, go alone. If you want to go far, go together. I think there's a version of this for AI-assisted work: if you want to go fast, vibe code. If you want to go far, understand what you're building.

The limitations of vibe coding don't show up on day one. They show up when the system needs to scale, when a subtle bug appears that the AI can't pattern-match its way out of, or when you need to make an architectural decision that requires understanding tradeoffs the model has never been trained on.

Developers who have tried building real applications this way keep hitting the same walls. Debugging becomes guesswork because you never had a mental model of the code in the first place. Small changes produce cascading failures because nobody planned the architecture. Security vulnerabilities slip through because you never reviewed what was generated. The Stack Overflow blog captured this perfectly: vibe coding without code knowledge produces "a new worst coder."

The same pattern plays out in AI music. Suno can generate a catchy track in seconds, but professional musicians and songwriters point out that it misses chord changes, mangles melodies, and produces vocals that sound uncanny. The output is impressive until you compare it to the work of someone who actually understands what they're doing. As one reviewer put it, "It's the meme generator of music. Sleep well. Your craft is safe."
The abstraction trap
Every generation of tools adds a new layer of abstraction. Assembly gave way to C, which gave way to Python, which gave way to frameworks and no-code tools. Each layer hides complexity and makes the previous layer's skills less necessary for day-to-day work. This is genuinely good. Progress means not having to manually manage memory for a web form. But there's a critical difference between abstraction and ignorance. A senior developer using Python doesn't need to write assembly, but they understand concepts like memory, concurrency, and data structures well enough to make good decisions. A designer using Figma doesn't need to hand-draw every pixel, but they understand visual hierarchy and user psychology. The current wave of AI tools is tempting people to skip the understanding entirely. And that works right up until it doesn't. When your vibe-coded app has a data leak, you can't vibe your way through a security audit. When your AI-generated song sounds hollow after the first 30 seconds, you can't prompt your way to musicality.
What English actually gets you
Let me be clear: natural language as an interface to powerful tools is a genuine advancement. Being able to describe what you want and get a working first draft is incredible. The productivity gains are real. I use AI tools constantly.

But English is a terrible programming language. It's ambiguous by design. It lacks precision. Two people can read the same sentence and extract completely different meanings. That's a feature for poetry and a bug for software engineering. The reason programming languages exist in the first place is that natural language was too imprecise to reliably tell a machine what to do.

What's actually happening is that AI models are getting better at interpreting ambiguous instructions and producing reasonable outputs. That's not the same as English becoming a programming language. It's a translator getting smarter, not a language becoming more rigorous.

The people getting the most out of these tools are the ones who already understand the fundamentals. They can evaluate the AI's output, catch mistakes, steer the direction, and know when to intervene. They're using AI as a force multiplier, not a replacement for knowledge. As one engineer put it in response to Huang's claim: "AI is not replacing programmers. AI is replacing the typing. The thinking? Still human. The architecture? Still human. The 'why are we building this?' Still human."
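The ambiguity point is easy to make concrete. Here's a hypothetical sketch (the names and data are invented for illustration): the single word "and" in the request "show me users who ordered in January and February" admits two defensible readings, and only code pins down which one you meant.

```python
# The English request "users who ordered in January and February" is ambiguous:
# does "and" mean users who ordered in BOTH months, or in EITHER month?
# Two defensible readings, two different results.
orders = [
    ("ada", 1),  # (user, month ordered)
    ("ada", 2),
    ("bob", 1),
    ("cat", 2),
]

jan = {user for user, month in orders if month == 1}
feb = {user for user, month in orders if month == 2}

# Reading 1: "and" as intersection -- ordered in both months
both = jan & feb       # {"ada"}

# Reading 2: "and" as union -- ordered in January or in February
either = jan | feb     # {"ada", "bob", "cat"}
```

A programming language forces you to choose one reading before anything runs; English happily leaves the choice to whoever, or whatever, is interpreting you.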
The real skill is knowing what to ask for
The irony of "English is the new programming language" is that the people who are best at prompting AI are usually the ones with deep domain expertise. A senior developer writes better prompts than a beginner because they know the right concepts, patterns, and constraints to specify. A trained musician gets more out of Suno because they can describe what they want in precise musical terms. The fundamentals didn't become less important. They became the differentiator between people who can use AI tools effectively and people who are just along for the ride. This is the part that gets lost in the hype. When someone like Karpathy vibe codes a weekend project, he's bringing decades of computer science knowledge to the table. He knows what good software looks like, even if he's not writing every line himself. When a complete beginner tries the same approach, they don't have that foundation. They can't tell when the AI is producing garbage because they don't know what quality looks like.
The cosplay developer problem
There's a growing category of people building things with AI who present themselves as developers, designers, or musicians. They've shipped something that looks impressive on the surface, and they're getting recognition for it. But when you dig into the details, the code is fragile, the design breaks on edge cases, and the music falls apart on repeat listens. This isn't gatekeeping. It's acknowledging that there's a meaningful difference between producing output and understanding what you've produced. A person who uses AI to write a novel without understanding narrative structure hasn't become a novelist. They've become someone who has a novel. Those are different things. The risk isn't that these tools exist. The risk is that people mistake the output for the skill. And that companies mistake the demo for the product.
Where this actually goes
The future isn't "English replaces Python." The future is that AI handles more of the mechanical work while humans focus on judgment, architecture, and intent. That's not a world where fundamentals don't matter. It's a world where fundamentals are the only thing that matters, because the mechanical execution is handled.

Think of it this way: if AI can generate code, design, music, and writing at a baseline level, then the baseline has no value. What has value is the ability to direct, evaluate, and refine that output. And you can only do that if you understand the domain.

The smartest take on vibe coding I've seen comes from the concept of "intent-driven engineering," where developers specify what they want with precision and review everything the AI produces. It's collaborative, not passive. It treats AI as a powerful junior colleague, not as a replacement for your brain.

So yes, learn to use AI tools. Use them aggressively. Let them handle the tedious parts. But don't skip the fundamentals. Don't confuse going fast with going far. And don't let anyone tell you that understanding what you're building is optional. Because the moment something breaks, and it will, the only thing that matters is whether you actually know what's going on underneath.
References
- Andrej Karpathy's original "vibe coding" tweet (February 2025), x.com/karpathy/status/1886192184808149383
- Jensen Huang's keynote on human language as the new programming interface, CNBC: "Nvidia's Huang says programming AI is now like training a person"
- Stack Overflow Blog, "A new worst coder has entered the chat: vibe coding without code knowledge" (January 2026), stackoverflow.blog
- Pratik Chaudhari, "Vibe Coding Is Over: Why Intent-Driven Engineering Is Becoming the New Standard" (April 2026), medium.com
- Production Expert, "Suno Review: We Tested It. Here's the Honest Verdict for Working Musicians," production-expert.com
- Vibe coding, Wikipedia, en.wikipedia.org/wiki/Vibe_coding