Had a nice day; did some investigation on the code of a Minecraft launcher project I'm currently contributing to, and, guess what?
I got bombed by AI doom-and-gloom content again.
(Did I rush through the description of the whole topic of this blog entry just there?)
Anyway, it was only two videos that I watched, one after another. I'll quickly link them down here:
- "It Begins: An AI Tried to Escape the Lab" (more recent)
- "This Will Be My Most Disliked Video On YouTube"
Both videos are by Species | Documenting AGI, and I quite appreciate the length and effort behind the content. What I don't really appreciate is how they leave the audience with cliffhangers.
Don't get me wrong: I hate cliffhangers that feel like a delayed but constant jolt to your mindset, rather than balanced surprises that pique your interest while still giving you a sense of plateau.
I've seen this pattern emerge among AI-related YouTube doomers, whom I seem to be coming across more often nowadays, and I don't much like the trajectory of it.
In a world where software quality is declining by folds of ten every, single, day, AI is, in my opinion, one of the things humanity really needs to regulate in general. If you ask me personally, I wouldn't be surprised by how delusional the community has become over time, with all these Qwen, GPT, LLaMA, Grok, or even Claude (especially Claude) models overriding people's sense of judgement.
There used to be a time when software was insanely predictable, and even though I'm nowhere near the same grounds as the top researchers in this field, we've lost much of that predictability with the new tools emerging lately. Now, don't get me wrong again, but I'm quite frustrated by LinkedIn people screaming about the hype they're unknowingly fueling, full-steam, with those em-dashes you hate ("—" <- this).
We've lost originality in music, and I don't think it's unjustified for me to say this. I had to quit Spotify because of the saturation of repeated AI music, plus my playlist degenerating under all the recommendations I get.
I think it'd be pretty sad if you told someone from the 1900s that we've now lost originality, in music.
And,
We've lost originality in all the creative fields. Yes, I'd count programming among them as well - programmers usually don't consider themselves creatives anyway, apart from an elite few, but oh well... I also stumbled upon someone's comment under a random video I was watching, and I quite like their analogy:
Using AI for art and calling yourself an artist is literally the same as ordering food from a restaurant and calling yourself a chef for that.
On an even more serious note, someone had said this:
By giving people AI, you've effectively granted stupid people license to be egotistical about creativity they don't have.
I, personally, like predictability. I also personally think that even if I had graduated from linguistics, I wouldn't just fire up Claude Code one day, decide I want to be a programming language designer, and design an entire language without the guilt in my head that I'm skipping the cognition required to make that very thing exist without depending on external audit.
Have we really degraded that much?
Earlier this week, I was in a conversation with one of my friends on Discord - he's probably three years my senior, and it very much surprises me that CS majors in my country don't know the fundamentals of building a simple web-based WireGuard/OpenVPN client. They just vibecoded the entire thing with Antigravity, without taking ownership of their code, and called it a day.
Now, if you really like these tools, I won't blame you. They're excellent at some things in particular. But guess what they're not excellent at?
Keeping your cognition alive.
The very same predictability we were promised in the early stages of development is being stolen from us, explicitly.
Traditional algorithms are playing catch-up with all of this as well. One moment I see a video in my feed on whether engineers are becoming dumber; the next moment I see some typical CEO hyping up their product, claiming it "doesn't hamper engineers". I usually tune into YouTube for some wholesome content I want to watch (yes, the typical videos under which people comment "this is what YouTube is made for"), and it turns into an unpleasant experience within mere seconds of logging in.
Judge for yourself; I'd like to know where you find the predictable human element in this.
Software used to have QA. Even Anthropic's own deployments don't pass three nines nowadays. Are we really in for this deploy-and-break style too? Amidst all of this?
Rust's core motives are awesome. Do you know what's not awesome? People proudly naming their zero-to-one-employee LinkedIn/GitHub organization after a Rust acronym and pushing AI slop while it engulfs more of the corporate-controlled minds that are already pretty bullish on a tech whose effects we don't fully understand.
Phew...
Well, I don't know how coherent of a rant against the world we're headed for I managed right there, but I'd just like to ask you two simple questions.
- Is AI really taking this much of a toll on us?
- Is it too late to pull the plug?
As much of a satire as I want this blog entry to be, it breaks my mind even more to think that it isn't one.