Study: Using AI reduces Cognitive Activity

I know you're talking about kids, but I use ChatGPT daily in creative ways to solve problems at my job and I've never been this imaginative before.
I used ChatGPT for the first time at work today, and it's scarily efficient. However, I wouldn't ever confuse that with being imaginative.

I needed to update some stats in about 80 web pages, 28 stats for each page. The accepted way to do this annual task has always been to copy and paste each stat individually, so the job is divided between several people and takes a couple of days alongside regular work.

This year it was my turn, so I thought why not automate it? I'm not a programmer, so I searched for ways to do that on places like Stack Exchange, but ultimately ChatGPT wrote me a Python script that does it in seconds. It even suggested how to improve the results, and after a few steps of ChatGPT following its own ideas, it gave me a script that will take the spreadsheet and output code for all 80 webpages in one go.
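For anyone curious what such a script might look like: here is a minimal sketch, assuming the stats live in a CSV export of the spreadsheet (one row per page, one column per stat) and each page is an HTML template with `{stat_N}` placeholders. The column names, placeholder scheme, and file layout are illustrative guesses, not the poster's actual setup.

```python
# Hedged sketch: fill one HTML template per spreadsheet row.
# Assumes a CSV with a "page_id" column plus one column per stat,
# and templates using Python str.format placeholders like {stat_1}.
import csv
import io


def render_pages(csv_text, template):
    """Return {page_id: rendered_html}, one entry per spreadsheet row."""
    pages = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        page_id = row.pop("page_id")      # remaining columns are the stats
        pages[page_id] = template.format(**row)
    return pages


# Toy data standing in for the real spreadsheet and page templates.
CSV_TEXT = """page_id,stat_1,stat_2
about-us,42%,1200
pricing,17%,350
"""

TEMPLATE = "<p>Satisfaction: {stat_1}, customers: {stat_2}</p>"

if __name__ == "__main__":
    for page_id, html in render_pages(CSV_TEXT, TEMPLATE).items():
        print(page_id, html)
```

In a real version you would read the CSV from disk and loop over 80 template files, but the core idea is the same: one `str.format` call per page instead of 28 manual copy-pastes.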

Solving the problem was good, but I had very little input. And rather than learning how to do it myself, I now have a script I can pass on to whoever gets the job next time, and nobody will ever understand how it works.

That's the opposite of imagination. It feels like cheating, and I don't know if I dare tell anyone what I did next week. If this is the way forward, there's really no point in hiring humans, and no point in me trying to learn something new. I can feel my brain atrophying already!
 
No one in your job had the idea to automate it with ChatGPT, but you did. So give yourself some credit. You've improved the efficiency of your whole team with ChatGPT's help.

"And rather than learning how to do it myself". My friend, learning how to code isnt easy. Its something that you have to dedicate thousands of hours to get good at (and its still hard). You would dedicate this much only to have the satisfaction to say that you've made it yourself? Despite not being your job?

And most of the time, it's not as simple as saying "create the solution for me" and you're done. You have to understand your work deeply to tell it exactly what you need and how to do it. If your job can be replaced just by saying "ChatGPT, generate this", then yes, you should improve your skills or you'll get replaced.

And about "not learning something new". Well, now that you dont have to worry about getting those stats, you can choose how you want to spend that time. So choose to learn what you want, not what you need in order to do your job.
 
I've seen people basically lose the ability to write an email on their own. They're just so used to having ChatGPT do it for them. It is kind of worrying to see how quickly people have developed a reliance on it.

Maybe it's just my job, but I can't have an LLM write an email for me: each email has different content that I need to be as accurate as possible. That content is frequently confidential as well, so LLMs haven't been trained on it and shouldn't have access to it.

I'd spend more time checking the output than I'd save, compared with just firing off emails that are occasionally poorly worded or have a few typos and grammatical mistakes.

I guess LLMs for emails are fine if you're a 'scrum leader' or highlighter girl and you're just pointlessly sending around business jargon soup.
 
It's the MIT Media Lab, generally people with good engineering credentials, not a social pseudoscience team. Sounds like their methods were reasonable. Also 100% logically follows that outsourcing all of your critical thinking to a third party makes your brain worse off.
It's not about who did the research, more that these sorts of studies require a lot more data to be conclusive.

The study is thorough, but it only applies to essay writing, not general use.
 
It's not making generalized claims.

Abstract

This study explores the neural and behavioral consequences of LLM-assisted essay writing.
 