Vibe Coding Is About to Blow Up (entire industries)
The democratization of software development has been promised for decades. With vibe coding, we might finally be approaching that reality.

Can You Feel the Vibe?
Have you tried vibe coding your own app yet? Vibe coding is the practice of creating software by describing its "vibe" or intended look, feel, and functionality to an AI, which then builds it for you. The term was coined in a tweet by Andrej Karpathy, a founding member of OpenAI. In concept it's pretty compelling. You just tell the AI what you want and it figures everything out for you.
For the past ten years UX designers have been told that they really need to learn how to code. Suddenly that advice seems antiquated. No longer do you need to know how to write code to be able to build a website or launch a mobile app. You don't even need to know anything about UX design or how to use Figma. Forget intense bootcamps, lengthy university degrees, or menial internships. If you can dream it, you can make it!
At least that's the promise. As with most things related to AI, the promise is still a ways off from reality, but it's getting closer every few months.
Your AI Programming Twin

The latest wave of apps represents a class of agent-based platforms that combine chat-based large language models (LLMs) with chain-of-thought reasoning capabilities and AI-powered agentic frameworks that can either call on other tools (i.e., software) or take action themselves. This means they can do things like set up and configure a database, download and install code packages from GitHub, and set up and modify code written in languages and frameworks such as JavaScript and React. In other words, they can do what would previously have required a developer or programmer.
But what's even more interesting is that they can do much more than just write and install code. They can figure out what features your app will need, including integrations with other online services like payment processors; they can come up with the structure and layout for all the screens, and determine what labels and buttons they need; they can even come up with a logo and colour palette, and create placeholder content including text and images—everything you'd expect to see in a working prototype.
Some of the more popular vibe coding apps are Replit, Lovable, and Bolt (from StackBlitz), and there are more coming out all the time. Google just launched one called Firebase Studio, and the folks at Rabbit who make the cute orange r1 AI device just launched their version called rabbitOS intern. Rumor has it that Figma is working on one too.
It's Already Happening

We're already seeing impressive examples of what non-developers can create. A music composer recently used Lovable to build an online ambient meditation experience that features his original music. An entrepreneur used Bolt to go from an idea he had four months ago to launching a slick online financial management app with both personal and business offerings. Many more examples like these can be found on Discord and X, with sponsored hackathons offering generous prizes to encourage adoption.
To say that this is a watershed moment in tech is not an exaggeration. This technology has the potential to empower billions of non-coders to build apps and websites using natural language. It could also significantly transform the job market for developers. While the number of entry-level coding positions might decrease, experienced developers are likely to shift toward becoming "AI pilots," essentially experts who know how to guide these tools to create more complex applications (and troubleshoot them if and when they break).
The Hidden Complexity of App Development

Before you quit your job to become a full-time viber, it needs to be said that these platforms are not yet anywhere near good enough to replace dev shops and experienced coders. Modern app development is complex for good reasons: apps need to handle different screen sizes, be accessible to people with disabilities, remain secure against attacks, connect to various data sources, and perform well under different network conditions. These are challenging technical problems that AI is still learning to solve consistently.
And keep in mind, these tools are still powered by the same LLMs that routinely hallucinate (i.e., they make mistakes and sometimes just make stuff up). As a result, they will often claim they've done something that they clearly didn't do, or did incorrectly. This can become quite maddening, especially when you're trying to make a small change or troubleshoot something that isn't working.
For example, I recently had a frustrating back-and-forth with Replit's Agent when I asked it to update a button's text colour. Despite repeated attempts and numerous confident assertions that it had fixed the issue, the label is still invisible. Eventually I decided to move on, because life is short and it was costing me credits every time I asked it to try again.
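For context, bugs like this are often trivial for a human to spot. Here's a minimal sketch (selectors and colour values hypothetical, not taken from the actual app) of one common culprit, CSS specificity, where a generic rule an agent adds silently loses out to a more specific existing rule:

```css
/* A framework or earlier rule styles the button with a specific selector: */
button.btn-primary { color: #ffffff; }

/* An agent "fixes" the text colour with a less specific selector,
   which loses the specificity contest, so nothing visibly changes: */
button { color: #1a1a1a; }

/* The actual fix is matching (or beating) the original specificity: */
button.btn-primary { color: #1a1a1a; }
```

A human developer checks the browser's style inspector, sees which rule wins, and fixes it in seconds; an agent that can't see the rendered page keeps asserting success.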
Little things like this that might take a human programmer minutes if not seconds to fix can easily become costly wastes of time when working with an over-confident and clearly hallucinating chatbot. A quick scan of the Discord community servers for these tools reveals a steady stream of people getting stuck and/or losing patience.
The Developer Perspective

"Vibe coding totally changed things for me, especially as a senior dev. It helps me move faster with implementation, and I don't get stuck overthinking small stuff. I can focus more on how the system fits together, the architecture, the flow. It doesn't always give perfect results right away, but when I'm in the zone, it really clicks and things just work."
— alexsh24 on Reddit
Working with these first-generation tools can feel like working with a junior full-stack developer who only recently completed a basic internship. They know some things, and they are pretty good at doing some complex things, as long as they've done them before. But as soon as something goes wrong or they're asked to do something truly novel, their incompetence shows.
"Vibe coding. 10 minutes. Fixing vibe coding bugs. 10 hours. You choose."
— SimulationV2018 on Reddit
Part of this is due to limitations in the LLMs that these tools are leveraging, which are much better at answering common coding questions than at generating original working code. The other challenge is the sheer amount of complexity and number of options involved in building a modern app. This is why good developers are able to earn high salaries. It's still optimistic to think that you can get the same performance from a $20-$50/month tool as you might get from a developer who earns $10K-$12K/month.
What the Future Offers

Despite these shortcomings, the promise and the potential are clearly there. Frontier LLMs are becoming more powerful and capable almost monthly, and combined with the ability to call upon other tools when needed, we will see the day very soon, possibly even this year, when AI is able to do the job of a reasonably competent full-stack developer. Dario Amodei, CEO of Anthropic, recently predicted that artificial intelligence could write 90% of software code within 3 to 6 months and potentially all code within a year. While that may seem bullish, it's worth noting that Anthropic makes Claude, the LLM preferred by many programmers. So he might know something about this.
What does this mean for you? Well, if you've had an app idea but have been held back by technical constraints, there's never been a better time to experiment. Start small, be patient with the limitations, and you might be surprised by what you can create. And if you're a developer, rest assured that your expertise is far from obsolete, though your value and role may evolve.
The democratization of software development has been promised for decades. With vibe coding, we might finally be approaching that reality—imperfect as it may currently be. Just keep in mind that today's AI is the worst it's ever going to be.
Research and editing assistance provided by Perplexity, ChatGPT, and Claude. All images generated with Midjourney v7.