There seems to be a disconnect between the breathless AI hype and reality. I tend to be on team-LLM (having exposure to it in my daily work), and my stance is that it's going to revolutionize some narrow domains and features of products, but the idea that it will replace white-collar labor generally seems poorly substantiated by what we know so far.
What we see ChatGPT being good at is largely boilerplate. The folks who are like "I am 5x more productive now with ChatGPT" raise alarm bells in my head - it suggests that they were typing the programming equivalent of pablum for most of their day.
Yeah, if you ask ChatGPT to author some simple HTML it will do so with a refreshing level of accuracy (though it's far from being able to do so without extensive supervision) - but who is writing plain HTML? Likewise, it's cool that it can generate a sort function out of whole cloth... but who is writing their own sort functions?
The trick with boilerplate is that the industry has already spent the past two decades automating large amounts of it away via frameworks and baking much of the functionality into programming languages themselves. If you're writing boilerplate for so much of your day that a boilerplate-spitting AI massively increases your productivity, I'd argue you've got bigger problems!
The trend - across both backend and frontend dev - is to make things more declarative and close the gap between what the developer wants and the amount of code needed to implement it. Use these technologies - they have made people dramatically more productive, and didn't require an LLM!
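As a toy illustration of that declarative trend (the function names and markup here are my own, not from any particular framework), compare spelling out every step against simply describing the output:

```typescript
// Imperative: build the markup step by step, one mutation at a time.
function renderListImperative(items: string[]): string {
  let html = "<ul>";
  for (const item of items) {
    html += "<li>" + item + "</li>";
  }
  html += "</ul>";
  return html;
}

// Declarative: state the shape of the result directly; the "how"
// disappears into map/join, much as frameworks hide DOM plumbing.
const renderListDeclarative = (items: string[]): string =>
  `<ul>${items.map((item) => `<li>${item}</li>`).join("")}</ul>`;

console.log(renderListDeclarative(["Home", "About"]));
// → <ul><li>Home</li><li>About</li></ul>
```

Both produce identical markup; the declarative version just has far less ceremony - the same compression that frameworks apply at a much larger scale.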
If you find yourself spending a significant portion of your day writing boilerplate, stop doing that! There are many options for you that allow you to more directly express what you want!
90% of frontend development is gathering the requirements. The other 90% is trying to get CSS to center a div.
Let them believe it and choose new careers
For those of us who know we won't be replaced, we only benefit from this FUD
These demos all give me the same vibe as "we prevented aging in mice". It's really impressive and interesting but I don't think we should be worried about our jobs disappearing any more than what living to 200 will be like.
I see posts like this comparing GPT to past attempts - in this article, even 1990s Homestead - and I immediately think this person does not get it.
Additionally, all the statements that follow along the lines of "GPT today has a hard time with xyz" are problematic too. Today is not tomorrow, and what it can do tomorrow is what we should be talking about.
The fact is, in just 3 months we've gone from fairly basic code reviews to super-helpful code reviews with code examples and intelligent, context-aware comments on the code.
Comparing GPT to Homestead or any number of early attempts to translate mockup to HTML+JS is unhelpful. This is wildly different technology. Talking about what GPT can do today and landing at "it's not good enough" is also missing the point, we are nowhere near the limits of this tech and it's evolving so fast that these assessments are not insightful in the slightest.
Talking about GPT as being merely a probability evaluator is also demonstrating the author's limited insight. WE ARE ALL probability evaluators. Humans work from prior experience and employ heuristics to make decisions based on it. GPT - indeed all AI - is more or less based on that paradigm. The difference is that an AI can hold the whole of its learning in a perfectly memorized model. It doesn't forget what's in that model and can draw on billions of data points to make its probabilistic determinations. We pull on bias, false memories, misunderstandings, and so on. Now, that's not to say AIs won't also be afflicted by these comprehension errors, but that just doubles down on AI doing what humans do, which is astounding either way.
Jobs are going to be lost, it's a fact. Even if it's just that junior who was learning code reviews and supporting basic development, that job will be around for maybe a year or two at the rate we are moving.
I've been a web developer - among other hats - for 25+ years, and I have the impression ChatGPT gives me superpowers.
I'm a dev using ChatGPT and CoPilot. I work more efficiently as a result.
Currently I don't think LLMs represent a risk to my job in that I work for a small company and personally handle many aspects of the development process on my own. However, at a larger company where those roles are spread out I could see some of them going away soon.
I work on a large WordPress theme. If an LLM were trained on that theme it could possibly have the global scope I need to make changes and understand their impact. It won't be long before we have software that allows you to easily train an LLM for your use. That could threaten my job. But then, my company would still need someone to manage and QA the results.
So I see fewer jobs, but not the end of the profession.
This is an exciting time, but also topically exhausting. I can see how past generations of technological change may have taken being in the midst of it all for granted in between the hype and the volley of takes for and against the latest groundbreaking innovation of the time.
Side note: I’m looking forward to a viable OpenAI competitor coming along and introducing their own chatbot…right now all of the speculation in the LLM area of AI is concentrated on the work of one company…what else can be out there?
At this point we could even deviate from software development and say this about any job that requires thinking and decision making around a set of rules.
You have a camp that keeps saying it will replace jobs (me, and it already does) and another that says it does not. If you want a good future, you don't care about either camp and just use this as another tool. I don't like frontend, but now I don't need to ask someone else (and in that way it is replacing people we now mostly no longer need); I can just do it myself with ChatGPT. Don't get me wrong, I know how to use HTML, CSS, React, Vue, etc., but I hate doing it and find it all pretty much a waste of time; now ChatGPT does it so I can do something else. I would say most people can do the same with the stuff they don't want to do. Who cares if it's frontend, backend, or whatever. Use it as a tool to gain an advantage; you will be ahead in many possible futures.
That doll popping out is creepy as hell.
I say it a lot, but I'll say it again:
"The code isn't the hard part of the job"
> "Fundamentally, LLMs are super-powerful text predictors. Given a prompt, they use machine learning to try and come up with the most likely set of characters that follow the prompt."
I've heard this tossed around and in the old days it used to be true.
BUT in the process of learning to predict text, just like with CNNs that learned to predict image types, it seems GPT learned some very interesting functions that bring it very close to "cognition".
I would not be at all surprised if we found those neurons in GPT, the same as I saw on fast.ai the building blocks of neurons that predicted images back in the day.
It's not that AI is going to replace certain jobs, it's that the people /not using AI/ will be out of a job because GPT et al are merely tools to get things done faster.
Just want to say, I'm really enjoying your site - thanks for all the tuts, they're timely! I just decided to ditch WordPress because my content is static, and WordPress is ridiculous bloat for that. With CSS grid and flexbox I can replicate it easily in plain HTML. The only reason I adopted WP, ~2014, was to get a sidebar and escape from the dreaded float. Comments are my only dynamic content, easily transferred to a forum.
Anyway most appreciated, thank you.
I look at the ChatGPT movement as similar to the era of the switchboard operator (not specifically frontend coding, but rather the impact):
https://www.history.com/news/rise-fall-telephone-switchboard...
As a FED, I embrace my AI overlords.
There's a lot I hate about my job (learning/using the latest silly thing React's rolling out, configuring new projects, Webpack, etc.). If AI can handle this garbage while I solve actual problems, I'll have it made in the shade.
ChatGPT has way more of a probability to hit the singularity first, thus rendering all coding efforts futile & useless, whilst it ushers in the machine apocalypse. So write that FE framework before you get taken away and used as a bio-battery for the great machine king.
It's definitely going to impact simple frontends that don't need complex app logic. Arguably that is already the case with so many of the existing frameworks taking over, but now, if you don't need a framework, chances are the frontend can be easily generated.
In my opinion, the dev role will slowly shift towards debugging and writing logic-heavy code.
It’s not the “it’ll build websites” part that worries me, it’s the “it’ll browse the web for people” part.
When it starts doing that, the relevancy of interactivity and design on the web starts plummeting, as fewer and fewer humans will use it.
"augmenting, not replacing" has been my go-to line for people who are anxious. There is no real meaningful discussion taking place about replacing people, it's just about augmenting people.
Just describe an average website and see if ChatGPT can reproduce it.
I think the description as such will be the difficult task, followed by fine-tuning your description if ChatGPT's result isn't correct.
Time for the AI hype-men to touch some grass. Anyone claiming they have 10x super powers now has most likely only dealt with hello world problems in their current job thus far.
Does the argument hold for the next, much better versions that are inevitably coming very soon? There is no point in making a temporal argument about today.
You would guess that all you'll need in the future is a code base and a video of the website working in action.
The bigger issue is that it takes more time and skill to become hireable. In the early days of the web, there were plenty of people buying a book on HTML and getting hired a few weeks later. That isn't happening in a world where AI can do all but the most complicated programming tasks.
> The End of Front-End *Web* Development
T,FTFY
> A small JS app like this blog has ~65k lines of code, across 900+ files. That doesn't include the written content, only the JavaScript and TypeScript.
More or less sums up why I think FE dev has reached peak ridiculous.
Good luck ChatGPT.
At this point I don't think anyone can predict where we will be in 5 years, because it's changing very quickly.
From my view, ChatGPT already codes a lot like a junior developer, but much faster.
AI won't take over front-end dev jobs until it can start creating a new framework every week, constantly break backwards compatibility, then abandon the project and start a new one.
Yet another blog post about how LLM can't replace people for X task. I wish there was less noise about this and more discussions on how we will (all) benefit from LLM replacing people. Like UBI, paying people to pursue their passions, etc.
I personally would be fine if LLM replaced my $DAYJOB if it meant I could work on learning violin, gardening, or creating videogames.
With a decade of experience as a front end dev in this industry, I have to agree with the premise, although not for the reasons stated. AI is, and will be, hugely disruptive for all fields of programming. But it won't be taking jobs any time soon.
The reason front end development as a skillset is coming to an end is that we've solved all of the problems that front end devs were responsible for. Front end used to be a very specific skill set that dealt with taming the complexities of cross browser support, and shoehorning the browser into a proper application platform. Those concerns are no more. Browsers have all converged on chromium-as-a-standard. Things like Tailwind/CSS-in-JS are quickly making traditional CSS skills obsolete; gone are the days of needing someone who knows how to cast arcane incantations of CSS rules to get an image centered on a page. And with modern frameworks and development techniques, all the stuff about semantic HTML and whatever else is being thrown out in favor of DevX and rendering everything as a pile of <div>s.
There has been a massive shift in the last 3-4 years where practically all job listings are now "full stack", with the expectation that UI work is "just some silly thing we have to deal with" on top of the real work. Perhaps some huge companies will maintain the separation but most have already given it up.
Frontend seems to involve visual design, which backend obviously doesn't (usually). Other than that, I don't see the difference, as soon as you're working on things more complex than simple static pages. Both will gradually become more efficiently done through smart use of AI tech.
But replacement? ChatGPT and GPT-4 can't solve real-world problems without a heavy time investment from an engineer to prompt-engineer a semi-usable (if lucky) solution that still needs a lot of debugging (so coding it yourself is still way faster for sufficiently complex things) and adjustment to your codebase.
Will people get left behind that don't adapt? For sure. That's always the case. If productivity is significantly increased with use of such AI tools, then those that won't use it due to whatever reason, including incompetence, will be let go eventually. All others will be fine, as always. This fear-mongering is actually good for us that can adapt, makes us more valuable.
Let's talk in 2030 again. Then in 2035. Then in 2040. And so on until we have AGI that generates you a new revenue stream just by reading the thoughts of a non-technical middle-manager, by which time we will probably have completely different problems.
People afraid of AI automating things just don't know how it works. I mean, no one does really, but if you went to University and studied AI, statistics, machine learning, deep learning and some of the variations then you'd know that that shit isn't intelligent or magic.
It predicts the most likely next tokens + some randomness based on what's in the training data. It can't do more than that. That it can conjure up well known algorithms and some variations of them that help solve some specific problems that "look complex" is not a surprise, but alas, a mostly useless party trick that scares some people apparently.
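For what it's worth, the "most likely next tokens + some randomness" mechanism described above can be sketched in a few lines. This is a toy with a made-up three-word vocabulary and made-up scores - a real model computes its logits with billions of parameters - but the sampling step looks roughly like this:

```typescript
// Softmax with a temperature knob: temperature > 1 flattens the
// distribution (more randomness), temperature near 0 sharpens it.
function softmax(logits: number[], temperature: number): number[] {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled);
  const exps = scaled.map((l) => Math.exp(l - max)); // subtract max for stability
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Draw one token from the resulting distribution.
function sampleNextToken(
  vocab: string[],
  logits: number[],
  temperature = 1.0,
  rand: () => number = Math.random
): string {
  const probs = softmax(logits, temperature);
  let r = rand();
  for (let i = 0; i < vocab.length; i++) {
    r -= probs[i];
    if (r <= 0) return vocab[i];
  }
  return vocab[vocab.length - 1];
}

// At very low temperature the distribution collapses onto the
// highest-scoring token, i.e. plain greedy prediction.
console.log(sampleNextToken(["cat", "dog", "mat"], [1.0, 0.5, 3.0], 0.01));
// → "mat"
```

At low temperature this is greedy pick-the-most-likely; at high temperature the randomness dominates - exactly the "likely tokens + some randomness" trade-off.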