before you search
thu, apr 24, 2025

I get a nagging feeling when I try to solve my problems in a search bar rather than on paper. Depending on the situation, I find myself readily delegating my thinking to Google and the like.
I’m saving time in the short term, but in the long term I’m opting out of critical thinking. I’m forgoing the hard-earned experience that comes with analyzing a problem, forming an opinion, testing hypotheses, and so on. It’s like asking for the answers to your homework: yes, you’ll save time, but you may be deprived of the learning. These technologies shorten the time from question to answer, but they don’t absolve us of the effort required to actually learn a concept.
Of course, getting instant answers to our questions isn’t a wholly bad thing. Maybe you’re learning a new skill, seeking an outside perspective, or forgot how to convert Celsius to Fahrenheit (guilty, every single time). But there are also times when we know enough to take on the task at hand and still opt for speed. Sometimes we’re low on time, sometimes we’re chasing perfect information, and sometimes we do it simply out of habit. After quitting Facebook, it took me years to stop typing “fa” into my browser.
This over-reliance on technology only worsens with the rapid adoption of generative AI. We no longer need to feed a search engine delicately phrased queries with quotes and filters. Instead, our fingers dance across the keyboard as we aim a firehose of thoughts into a makeshift messaging thread, watching as our all-knowing, ever-present participant distills our ramblings and tells us how the world really works. At this point, I don’t even need to formulate a good question - it’s enough to say “why doesn’t this work” or “fix this”.
This problem is naturally pervasive across disciplines. As programmers, we used to write every line of code by hand (when we weren’t copying from Stack Overflow or running code generators). Now we can say “build me an app” and a network of “agents” does the heavy lifting: scaffolding directories, wiring together dependencies, automatically resolving compilation errors. The newly materialized codebase is rife with errors and complexity in proportion to how little we thought about the problem and how loosely we specified the approach. With every major iteration of generative AI models, we distance ourselves further from having to think.
Thinking is a muscle: we can build it with exercise, but it withers when underused. If you’re strength training, are you going to get stronger by watching your coach lift heavy things?
This nagging feeling - the flash of “have I spent even 30 seconds thinking about this?” - is a call to intentionality. It’s a built-in notification that I’m phoning it in, and a reminder that it doesn’t sit right with me. To that end, a notebook now lives beside my computer, reserved for the moments when I’m on the verge of blindly consulting a collective of AI agents.