You’re tapping into something I’ve been wrestling with too: what happens when we disrupt not just labor, but the human mind itself?
The question beneath all the surface innovation is deeper. What happens to critical thought, memory, and meaning when knowledge is replaced by instant answers?
I’m working on a codex that explores how conscience, restraint, and memory could (and should) be preserved as AI becomes more capable, not just to protect society, but to protect the human mind from atrophy.
Would you be open to collaborating on a reflection or conversation around this?
I think our perspectives would complement each other well, especially with the questions you’re raising about economic structures and what kind of society we’re building (or losing) through AI.
Either way, glad to have found your writing. Subscribed, and looking forward to seeing where your work goes.
https://substack.com/@aiconscience
Hey Engelken, you may find my latest post interesting:
https://stephenbarnez.substack.com/p/will-a-separate-agrarian-society
I would be open to a conversation around this. Your articles show deep reflection on AI and morality. I feel I would be lost at that depth. But these are things we as a society should consider before we lose control of AI, or lose control of those who control AI (it reminds me of Mission: Impossible – The Final Reckoning).
All great questions. I am skeptical about AI taking over everything and expect many of the companies leaning into layoffs will be looking to hire people back who can direct the AI and monitor its responses. In another scenario I imagine everyone playing video games that control various farming and production... kind of a "get paid to play" scheme...
I think there will be a transition period, similar to what you described, where companies will hire people who can direct and oversee the use of AI. However, AI will eventually exceed acceptable performance thresholds, performing at least on par with humans (as we are already seeing). At that point, companies will be willing to take more risks by scaling back human resources in favor of AI (as we are seeing now), and this trend will become even more pronounced in the future. I'm envisioning a new organizational structure with an executive team, where each executive has a few AI experts reporting to them, plus front-facing staff for some human interaction. In between, AI will manage, analyze, report, and execute the work middle management used to do (except that there will be no middle managers).
What do you see happening to the people who are currently middle management? Do they end up getting promoted or demoted?
Hey Freeman, you may find my latest post interesting:
https://stephenbarnez.substack.com/p/will-a-separate-agrarian-society
I think some of their roles will be merged and replaced by human AI operators. For example, if a typical department needed 5 managers and 15 analysts, they would all likely be replaced by 2 to 3 human AI operators.
You’ve captured what so many of us are quietly (or not-so-quietly) reckoning with. What started as a fun experiment quickly turned into a fundamental shift in how we work, create, and even define value. I felt the same: energized by the possibilities, but also sensing a deeper unease about the pace of change and what it might mean for individuals and industries.
Thanks for taking the time to read my article. I hope to expound more on these sentiments to help people better position themselves for the disruptions ahead.
Keep writing, Stephen.
Hey Ruba, you may find my latest post interesting:
https://stephenbarnez.substack.com/p/will-a-separate-agrarian-society