The Art of Brainstorming Is Dead in the Age of AI
As artificial intelligence becomes a default study partner on college campuses, students risk trading critical thinking, creativity and journalistic integrity for convenience – and it may be costing us more than we realize.
Written by Morgan Alten

There was a time when brainstorming meant staring at a blank Google Doc until your brain hurt. When ideas came from half-written notes, messy margins and conversations that spiraled into something unexpected. Now, too often, brainstorming means opening ChatGPT and waiting for a polished answer to appear.
On college campuses, artificial intelligence has shifted from a tool to a crutch for many students. What was once marketed as help has quietly turned into a replacement – a replacement for thinking, for struggling and for learning how to ask the right questions in the first place. And for journalism students especially, this change feels less like progress and more like a slow erosion of the very skills the field depends on.
AI is increasingly being used not to enhance ideas, but to generate ideas for us. Students rely on it to create topic pitches, outlines, ledes and even full drafts. The result is content that is technically correct (sometimes), neatly structured and yet completely hollow. Journalism, a craft built on curiosity, observation and human instinct, becomes flattened into something optimized, but soulless.

The problem is not that AI exists. The problem is that thinking is becoming optional.
Brainstorming is supposed to be uncomfortable. Research is supposed to take time. There is value in the mental strain that comes from wrestling with an idea until it finally clicks. That moment – when your brain aches because it is stretching – is where learning actually happens. When AI steps in too early, that moment disappears.
Instead of pushing through confusion, students outsource it. Instead of asking why something matters, they ask a machine to decide for them. The more this happens, the weaker our critical thinking muscles become. Like any unused muscle, the brain atrophies when we stop challenging it.
This shift is especially dangerous in journalism-centered education. Reporters are trained to think skeptically, verify information, notice nuance and connect dots others miss. These skills cannot be automated without losing their integrity. A journalist who cannot think independently is not a journalist – they are an editor of machine output.

Anup Kumar, Ph.D., an associate professor in the CSU School of Communications, sees artificial intelligence as unavoidable – but potentially dangerous without clear limits. He describes AI as a “freight train” that is not going to stop, regardless of whether educators and journalists resist it. According to Kumar, the real question is not whether AI will be used in journalism and education, but how guardrails can be put in place to prevent it from replacing human thinking rather than supporting it.
Kumar argues that AI can be acceptable as a productivity tool once foundational skills are learned, but harmful when introduced too early. He likens reliance on generative AI to teaching arithmetic with calculators before students understand how addition and subtraction work.
“Once you know it, you can use it as a tool to scale up your productivity,” he said. “But if you never learn it in the first place, there is cognitive loss.”
He warns that allowing AI to brainstorm, write or think for students removes the mental strain where learning actually happens.
In journalism, Kumar draws a firm line between assistance and substitution. He supports using AI for tasks such as copy editing or searching large volumes of information, but not for reporting or writing stories. When AI-generated content replaces human journalists, he said, writing becomes “very structured” and “wooden,” lacking nuance, sourcing and accountability. He pointed to newsrooms that use AI to convert press releases directly into stories – content that may be efficient, but stripped of original reporting and human perspective.
Kumar also emphasized that plagiarism concerns have not disappeared simply because the author is a machine. “Read a sentence and ask yourself, who is saying it – you or somebody else?” he said.
Passing off AI-generated writing as original work, he noted, still violates ethical and academic standards. For students, he requires a pledge acknowledging that AI may be used for research or grammar support, but not to write assignments on their behalf.

While Kumar does not support banning AI outright, he agrees that unchecked dependence carries long-term consequences. He compares reliance on AI to overusing GPS: when people stop planning routes themselves, they eventually lose their sense of direction altogether.
The same risk applies to thinking. “Don’t use your cognitive ability to process, analyze and plan,” he warned, “and you lose that skill.”
Beyond writing, the issue extends into daily life. We’re choosing shortcuts instead of stimulation. Answers instead of questions. Efficiency instead of effort. It is no surprise that attention spans are shrinking and creativity feels harder to access. We’re training ourselves not to think critically.
The solution is not banning AI outright, but reintroducing friction into our intellectual lives. Play The New York Times games. Do the Crossword, Wordle, Connections. Pick up a Scrabble board. Learn a language on Duolingo and struggle through it. Sit with confusion long enough for it to teach you something. Bring back board games, research sessions, highlighted textbooks and the kind of studying that leaves you mentally exhausted, but genuinely smarter.
Education should not feel effortless. It should feel earned.
If college becomes a place where students graduate having mastered prompt-writing instead of thinking, the consequences will extend far beyond the classroom. We will enter the workforce less curious, less capable and more dependent on machines to tell us what to say and how to say it.
AI may be advancing rapidly, but our brains still need exercise. The art of brainstorming does not need to be dead – but it needs to be defended. And sometimes, that means closing the chatbot, opening your mind and letting your head hurt a little.
That pain is called learning.
