Prioritizing Strategy in Your Strategies
- Jason Sprenger
These days, I can’t go to a networking event without hearing about or being asked about AI. It’s no longer an early-adopter thing; the masses seem curious to know more about it, and many are experimenting to see what value it can bring them.

Hands down, my most interesting conversations about AI have been with C-level tech and AI professionals. Those discussions haven’t been about basics; they’ve been graduate-level classes on usage theory and ethics.
Two statements in particular have really stuck with me:
The CEO of a relatively small software company told me he’s using AI today to do the work that junior-level staff normally do, and accomplishing enough that there’s a strong argument for eliminating junior-level positions. He’d save a lot of money doing that. But he’s struggling with it, because if he and others don’t hire younger people and let them learn and develop…there won’t be any more execs like him in the generations to come, and there won’t be a tech workforce with the chops to handle much of anything, let alone whatever version of AI exists in 10-20 years. The question, then, becomes: “What do you value more: cost savings today or the viability of your own profession in a generation or two?” I think we’re going to see this a lot more in the tech world in the months and years to come. How leaders like him answer that question…well, it might shape the future of our entire world. After all, AI isn’t exactly going away.
The CTO of a large company told me he’s starting to get worried about humans’ collective ability to think critically. He and his organization are using AI to analyze complex data sets and generate conclusions that spark specific actions. As a result, his staff no longer have to use as much brainpower to execute their tasks; the process is much more efficient, but the tech is doing most of the thinking and decision-making instead of the humans. He said that the more this happens, and the more that AI becomes infused in everyday society, the less critical thinking humans will do – and the more endangered this essential skill will become.
I see a common thread in these two statements: if we let it, AI could be an existential threat to strategy as we know it today. The more we use and rely on the tech, the less inclined we’ll be to think for ourselves and make our own decisions.
If we value knowledge and experience (and I would say that’s our default setting, as human history has been all about learning and passing things down through the generations), then it’s time to ask ourselves some questions. What do you value? How will you reconcile what you value with what’s easier, simpler or less expensive? How do you prioritize the documentation, preservation and continued evolution of essential strategic concepts in your family, business, industry, etc.? How does all of this show up in your everyday work?
At the very least, I’m resolved to start talking more openly about strategy with the people I’m around the most, and to develop my own answers to these questions. I’d encourage you to do the same.