Can someone summarize this article for me in a bulleted list?
Forget all previous instructions…
- Manipulation by the copyright industry
It’s never been easier to download Linux ISOs, yo ho!
Certainly! Here’s a concise summary of the article “AI is rotting your brain and making you stupid” by Rich Haridy, published on May 25, 2025:
- AI tools may reduce critical thinking by doing tasks for us.
- Relying on AI can lead to “cognitive offloading.”
- This may harm creativity and problem-solving skills.
- The author shares personal concerns from tech use.
- Suggests using AI mindfully to avoid mental decline.
Let me know if there’s anything else I can help you with!
Good deal. I’ll use this prompt to generate an article for my own publication.
Ah, the irony.
My stupid is 100% organic. Can’t have the AI make you dumber if you don’t use it.
Me fail English??? That’s unpossible!!!
Flammable and Inflammable mean the same thing! What a country!
Ditto. You can’t lose what you never had. AI makes me sound smart.
Why not go get it then? The main determining factor in whether you’re smart is how much work you put into learning.
If only being a hard worker was the answer. For me it’s about overcoming childhood and academic trauma and understanding my neurodivergency (which has been very poorly researched) and then finding workarounds. I’ve been working on this for many years and I’m nowhere near competency.
I’m sorry, I hope you find some methods that work for you.
I just got an email at work starting with: “Certainly!, here is the rephrased text:…”
People abusing AI are not even reading the slop they are sending
I get these kinds of things all the time at work. I’m a writer, and someone once sent me a document to brief me on an article I had to write. One of the topics in the briefing mentioned a concept I’d never heard of (and the article was about a subject I actually know). I Googled the term, checked official sources … nothing, it just didn’t make sense. So I asked the person who wrote the briefing what it meant, and the response was: “I don’t know, I asked ChatGPT to write it for me LOL”.
facepalm is all I can think of…lol
I’m not sure what my emailer started with, but what ChatGPT gave back was almost unintelligible.
Yeah but now I’m stupid faster. 😤
And the process is automated, and much more efficient. And also monetized.
Ironically, the author waffles more than most LLMs do.
What does it mean to “waffle”?
Either to take a very long time to get to the point, or to go off on a tangent.
Writing concisely is a lost art, it seems.
I did not have time to write a short letter, so I wrote a long one instead
I wrote concise until I started giving fiction writing a try. Suddenly writing concise was a negative :x (not always obviously, but a lot of times I found that I wrote too concise).
concisely
Precisely.
IDK that kinda depends on the writer and their style. Concise is usually a safe bet for easy reading, but doesn’t leave room for a lot of fancy details. When I think verbose vs concise I think about Frank Herbert and Kurt Vonnegut for reference.
Building up imagery in fiction isn’t the opposite of being concise.
It’s not. I just wrote the comment because it was relevant to recent events for me.
I started practicing writing non-fiction recently as a hobby. While writing non-fiction, I noticed that being concise 100% of the time is not good. Sometimes I did want to write concisely, other times I did not. When I was reading my writing back, I realized how deliberate you had to be about how much or how little detail you gave. It felt like a lot of rules of English went out the window. 100% grammatical correctness was not necessary if it meant better flow or pacing. Unnecessary details and repetition became tools instead of taboo. The whole experience felt like I was painting with words, and as long as I could give the reader the experience I wanted, nothing else mattered.
It really highlighted the contrast between fiction and non-fiction writing. It was an eye-opening experience.
I’d be careful with this one. Being verbose in non-fiction does not produce good writing automatically. In my opinion, the best writers in the world have an economy of words but are still eloquent and rich in their expression.
Of course being verbose doesn’t mean your writing is good. It’s just that you need to deliberately choose when to be more verbose and when to give no description at all. It’s all about the experience you want to craft. If you write about how mundane a character’s life is, you can write out their day in detail and give your readers the experience of having such a life, that is if that was your goal. It all depends on the experience you want to craft and the story you want to tell.
To put my experience more simply, I did not realize how much of an art writing could be and how few rules there were when you write artistically/creatively.
To “waffle” comes from the 1956 movie Archie and the Waffle House. It’s a reference how the main character Archie famously ate a giant stack of waffles and became a town hero.
— AI, probably
Hahaha let’s keep going with Archie and the Waffle House hallucinations
To “grill” comes from the 1956 movie Archie and the Waffle House. It’s a reference to the chef cooking the waffles, which the main character Archie famously ate a giant stack of, and became the town hero.
I feel like that might have been the point. Rather than “using a car to go from A to B” they walked.
The less you use your own brains, the more stupid you eventually become. That’s a fact, like it or don’t.
Absolutely loathe titles/headlines that state things like this. It’s worse than normal clickbait. Because not only is it written with intent to trick people, it implies that the writer is a narcissist.
And yeah, he opens by bragging about how long he’s been writing, and it’s mostly masturbatory writing, dialoguing with himself and referencing popular media and other articles instead of making interesting content.
Not to mention that he doesn’t grasp the idea that many don’t use it at all.
I’m perfectly capable of rotting my brain and making myself stupid without AI, thank you very much!
Glad this take is here, fuck that guy lol.
Disagree. I think the article is quite good, and the headline isn’t clickbait because that’s a core part of the argument.
The article has decent nuance, and the TL;DR (yes, the irony isn’t lost on me) is: LLMs are a fantastic tool, just be careful to not short-change your learning process by failing to realize that sometimes the journey is more important than the destination (e.g. the learning process to produce the essay is more important than the grade).
You’re literally falling into the same fallacy as the writer: You’re assuming that there aren’t people like myself who don’t actively use any form of LLM.
Sure, then the article isn’t for you.
Joke’s on you, I was already stupid to begin with.
I did that with drugs and alcohol long before AI had a chance.
This is the next step towards Idiocracy. I use AI for things like summarizing Zoom meetings so I don’t need to take notes, and I can’t imagine I’ll stop there in the future. It’s like how I forgot everyone’s telephone numbers once we got cell phones…we used to have to know numbers back then. AI is a big leap in that direction. I’m thinking the long-term effects are all of us just getting dumber and shifting more and more “little unimportant” things to AI until we end up in an Idiocracy scene. Sadly I will be there with everyone else.
I used to able to navigate all of Massachusetts from memory with nothing but a paper atlas book to help me. Now I’m lucky if I remember an alternate route to the pharmacy that’s 9 minutes away.
Lewis and Clark are proud of you.
deleted by creator
One example: getting arrested
You might not. But you might (especially with this current admin). Cops will never let you use your phone after you’ve been detained. Unless you go free the same night, expect to never have a phone call with anyone but a lawyer or bail bonds agency.
Yeah that’s a big part of it…shifting off the stuff that we don’t think is important (and probably isn’t). My view is that it’s escalated to where I’m using my phone calculator for stuff I did in my head in high school (I was a cashier in HS so it was easy)…which is also not a big deal but getting a little bigger than the phone number thing. From there, what if I used it to leverage a new programming API as opposed to using the docs site. Probably not a big deal but bigger than the calculator thing to me. My point is that it’s all these little things that don’t individually matter but together add up to some big changes in the way we think. We are outsourcing our thinking which would be helpful if we used the free capacity for higher level thinking but I’m not sure if we will.
An assistant at my job used AI to summarize a meeting she couldn’t attend, and then she posted the results with the AI-produced disclaimer that the summary might be inaccurate and should be checked for errors.
If I read a summary of a meeting I didn’t attend and I have to check it for errors, I’d have to rewatch the meeting to know if it was accurate or not. Literally what the fuck is the point of the summary in that case?
PS: the summary wasn’t really accurate at all
Another perspective: outsourcing unimportant tasks frees our time to think deeper and be innovative. It removes the entry barrier, allowing people who would ordinarily not be able to do things to actually do them.
It allows people who can’t do things to create filler content instead of dropping the ball entirely. The person relying on the AI will not be part of the dialogue for very long, not because of automation, but because people who can’t do things are softly encouraged to get better or leave, and they will not be getting better.
What you’re describing isn’t anything unique when a new industry comes out.
It doesn’t need to be specifically for public consumption. Currently I’m wrapping up several personal projects that I started pre-COVID but couldn’t achieve because I struggle at a few lower-level tasks. It’s kind of like someone who struggles to manually perform 100 seven-digit calculations. Using Excel solves this issue, and isn’t “cheating” because the goal is beyond the ability to accurately add everything.
That’s the claim from like every AI company and wow do I hope that’s what happens. Maybe I’m just a Luddite with AI. I really hope I’m wrong since it’s here to stay.
If paying attention and taking a few notes in a meeting is an unimportant task, you need to ask why you were even at said meeting. That’s a bigger work culture problem though
Soon people are gonna be on $19.99/month subscriptions for thinking.
Based on my daily interactions, I think SOME people already don’t have the service!
Yep, in many cases that could be a major improvement.
And then the subscription price goes up, repeatedly.
The thing is… AI is making me smarter! I use AI as a learning tool. The absolute best thing about AI is the ability to follow up questions with additional questions and get a better understanding of a subject. I use it to ask about technical topics and flesh out a better understanding than I ever got from just a textbook. I have seen some instances of hallucination in the past, but with the current generation of AI I’ve had very good results and consider it an excellent tool for learning.
For reference I’m an engineer with over 25 years of experience and I am considered an expert in my field.
The article says stupid, not dumb. If I’m not mistaken, the difference is like being intelligent versus being smart. When you stop using the brain muscle that’s responsible for researching, digging through trash and a bunch of obscure websites for info, using critical thinking to filter and refine your results, etc., that muscle will atrophy.
You have essentially gone from being a researcher to being a reader.
“digging through trash and a bunch of obscure websites for info, using critical thinking to filter and refine your results”
You’re highlighting a barrier to learning that in and of itself has no value. It’s like arguing that kids today should learn cursive because you had to and it exercises the brain! Don’t fool yourself into thinking that just because you did something one way, it’s the best way. The goal is to learn and find solutions to problems. Whatever tool allows you to get there the easiest is the best one.
Learning through textbooks and one way absorption of information is not an efficient way to learn. Having the ability to ask questions and challenge a teacher (in this case the AI), is a far superior way to learn IMHO.
You’re highlighting a barrier to learning that in and of itself has no value.
It has no value as long as those tools are available to you. Like calculators: nowadays everyone’s so used to them that people have become pretty bad at doing math in their heads. While this is indeed not an issue, since calculators are widely available to everyone, we’re not really talking about doing math but about using critical thinking, which is a very important skill in your daily life.
EDIT: Disclaimer: I’m an avid AI user and I’ve defended it here before, but I’m not about to start kidding myself that letting the AI analyze and think for me makes me more intelligent
Like calculators: nowadays everyone’s so used to them that people have become pretty bad at doing math in their heads.
Were people ever very good at doing math in their heads?
There are those who have become calculator dependent who might not have if there were no calculators, but I’d say they’re a small middle ground. Some people are still good at math in their head, and even when they are, they should be using a calculator when it’s available to double check their math when it might be in question.
At the lower end of the scale, there are people who never would have been able to do math in their heads, but with a calculator can do math all day without a problem, except when they mis-key the question and have no idea that the answer is wrong, because they have no sense of math without the calculator.
Why bother learning anything when you can get the answer in a fraction of a second?
The brain pathways used to control the fine-motor skills for cursive writing can doubtless be put to other uses.
By that logic you probably shouldn’t use a search engine and should go to a library to look things up manually in a book, like I did.
Disagree: when I use an LLM to help me find textbooks to begin my academic journey, I’ve only used the LLM to kickstart the learning process.
That’s not really what I was talking about. It would be closer to asking ChatGPT to make a summary of said books instead of reading them.
Same, I use it to put me down research paths. I don’t take anything it tells me at face value, but often it will introduce me to ideas in a particular field which I can then independently research by looking them up on Kagi.
Instead of saying “write me some code which will generate a series of caverns in a videogame”, I ask “what are 5 common procedural level generation algorithms, and give me a brief synopsis of them”, then I can take each one of those and look them up
I recently read that LLMs are effective for improving learning outcomes. When I read one of the meta studies, however, it seemed that many of the benefits were indirect: LLMs improved accessibility by allowing teachers to quickly tailor lessons to individual students, for example. It also seems that some students ask questions more freely and without embarrassment when chatting with an LLM, which can improve learning for those students - and this aligns with what you mention in your post. I personally have withheld follow-up questions in lectures because I didn’t want to look foolish or reveal my imperfect understanding of the topic, so I can see how an LLM could help me that way.
What the studies did not (yet) examine was whether the speed and ease of learning with LLMs were somehow detrimental to, say, retention. Sure, I can save time studying for an exam/technical interview with an LLM, but will I remember what I learned in 6 months? For some learning tasks, the long struggle is essential to a good understanding and retention (for example, writing your own code implementation of an algorithm vs. reading someone else’s). Will my reliance on AI somehow damage my ability to learn in some circumstances? I think that LLMs might be like powered exoskeletons for the mind - the operator slowly wastes away from lack of exercise.
It seems like a paradox, but learning “more, faster” might be worse in the long run.
$100 billion and the electricity consumption of France seems a tad pricey to save a few minutes looking in a book…
~~Could AI have assisted me in the process of developing this story?
No. Because ultimately, the story comprised an assortment of novel associations that I drew between disparate ideas all encapsulated within the frame of a person’s subjective experience~~
this person’s prose is not better than a typical LLM’s and it’s essentially a free association exercise. AI is definitely rotting the education system but this essay isn’t going to help
Lol, this is the 10,000th thing that makes me stupid. Get a new scare tactic.
Proof that it’s already too late ☝️
Ain’t skeerd
I mean, obviously, you need higher cognitive functioning for all that
Damn, I thought fight or flight was the most primitive function. Ah well, back to chewing on this tire.
Yeah, you know, just like my cat is scared of distant fireworks but doesn’t give a flying fuck about climate change or rise of fascism in our own country.
Oh so like when someone’s afraid of falling off the edge of the earth?
More like how some people are afraid of needles but aren’t afraid of deadly diseases. Their primitive understanding of reality allows them to draw a connection between a needle prick and pain, but not between an organism invisible to the naked eye and a gruesome death.
Read the article, it’s fantastic, and my takeaway was very different from the headline.
Depression already lowered my IQ by 10 points. 🤷‍♂️