A fireside reflection on growing up
Before we talk about Artificial Intelligence—before we talk about ethics, technology, or the future …
I want to tell you a story. Not a technical one. Not a political one.
A human one.
Because the most important questions facing us right now aren’t really about machines at all.
They’re about who we’ve been. Who we’ve become. And whether we’re finally ready for the next chapter of growing up.
This is a story about humanity—told not through data or forecasts, but through experience.
So let me take you back to a Saturday night, many years ago.
It starts like this …
Like it was yesterday, I remember a late Saturday night in high school in the 1970s.
Four of my friends and I were crammed into an older brother’s high-powered, cooler-than-cool GTO. None of us could drive yet, so Rob’s brother Mike offered to be our lead-footed chauffeur for the night.
Out on a long, straight stretch of highway in the Missouri countryside, a Porsche 911 appeared on the horizon. In a split second of teenage bravado, Mike decided to show his passengers what that GTO could really do.
The Porsche driver instantly took the challenge.
What followed was five minutes of engine-revving blur—speed, adrenaline, panic, recklessness … and almost no real control at all.
No one got hurt that night.
But five decades later, it’s hard not to see that late-night ride as a foreshadowing of what American culture—and increasingly global culture—has felt like ever since.
At the time, it felt thrilling. Scary. Exhilarating, even.
Looking back, it’s obvious what it really was: a group of teenagers riding in a machine far more powerful than any of us were ready to handle—egged on by ego, adrenaline, and a complete lack of foresight.
That’s adolescence.
And for most of recorded history, humanity has behaved much the same way.
For thousands of years, humanity lived in something like childhood.
We learned how to survive. We made sense of the world through story, myth, ritual, and religion. We feared the unknown, honored authority, and depended deeply on structures larger than ourselves. Childhood isn’t foolish—it’s formative. It’s how we learn what the world is and where we fit inside it.
As civilizations grew, humanity entered adolescence.
Empires rose and fell. Philosophy flourished. Science advanced. Power expanded faster than wisdom. We challenged authority before learning how to govern ourselves. We chased identity, dominance, novelty, and reward. We built extraordinary things—and often used them recklessly.
Humanity wasn’t evil. It was just immature … a natural state of development.
Then came the Industrial Revolution.
Suddenly, we weren’t just stronger—we were amplified. Machines extended our reach. Energy multiplied our force. Speed collapsed distance. We left the village. We left the land. We left our “parents,” culturally speaking.
Humanity got its sports car.
And like most teenagers with newfound power, we pushed the accelerator before we understood the brakes.
The twentieth century tried to warn us.
World wars. Nuclear weapons. Environmental damage. Existential anxiety. Runaway chronic illness. Psychology and philosophy whispering that something inside us was unfinished.
Thinkers like Carl Jung spoke about two phases of life: the first devoted to achievement, expansion, and ego … the second devoted to integration, meaning, and responsibility.
Humanity sensed that a second phase existed. But sensing isn’t the same as crossing the threshold.
Insight alone doesn’t mature a system. Consequences do. And the consequences kept piling up.
Which brings us to now.
Artificial Intelligence is *not* just another invention.
For the first time in history, humanity has externalized intelligence itself. Not muscle. Not energy. Not speed.
Intelligence.
And suddenly, a question we’ve avoided for millennia can no longer be dodged:
If intelligence is no longer uniquely human, what is our responsibility?
This is not primarily a technical moment. It’s a developmental one.
AI doesn’t force us to ask what machines can do. It forces us to ask who we are—and who we’re becoming.
There’s a clear fork in the road here.
One path looks familiar … and somewhat comfortable.
It treats AI like a faster engine in the same old car. More power. More scale. More profit. More dominance. More speed—still driven by short-term incentives, adolescent impulses, and outsourced responsibility.
That road doesn’t end well. Teenage recklessness, amplified to civilizational scale, never does.
The other path is quieter … and harder … and perhaps less comfortable.
It requires humanity to grow up.
- To pair intelligence with wisdom.
- Capability with conscience.
- Power with restraint.
- Speed with discernment.
This is where The New Intelligence enters the story. Not as smarter machines alone—but as a new relationship between human maturity and machine capability.
The New Intelligence recognizes a simple truth:
Intelligence without moral formation is not progress—it’s danger.
If there had been consequences that night in the 1970s, they wouldn’t have been gentle.
That’s why ethical guardrails can’t be bolted on after the fact. They have to be foundational. Embedded. Lived.
This is the role of an Ethical Operating System—an EOS—not as a rulebook or control mechanism, but as a moral compass. A way to ensure that as intelligence accelerates, wisdom doesn’t fall behind.
EOS exists for one reason: to help humanity finally do what adolescence never requires—take responsibility.
Adulthood isn’t glamorous.
It’s slower. Heavier. More accountable.
Responsible adults don’t confuse freedom with license. They don’t chase every impulse. They don’t outsource responsibility and hope for the best.
They steward.
Most people don’t like to admit it, but our world is no longer business as usual. This moment in history is asking humanity a very grown-up question:
Are we ready to become the kind of adults who deserve the intelligence we’re creating?
That’s our fork in the road and the real choice before us.
Not AI versus humanity. Not progress versus tradition.
But prolonged adolescence … or conscious adulthood.
The engine is already running.
The only question left is whether we finally learn how to drive.
FACTORS Digital Intelligence … “Building Smarter Machines for Wiser Humans”
From Story to Library
If this reflection resonated, it’s because it points to something larger than a single moment or memory. The Tale of Humanity is one way into a broader body of work exploring what this moment in history is truly asking of us. In The New Intelligence Library, these ideas are examined more fully—through essays, doctrines, and frameworks that grapple directly with intelligence, responsibility, ethical formation, and human flourishing in an age of accelerating machines.
The story opens the door. The Library explores what lies beyond it. For more information, email:
library@factors-di.com
Library Attribution
This essay is part of The New Intelligence Library — a growing body of thought exploring the human, ethical, and civilizational implications of the era of Digital Intelligence.
The Library serves as a shared canonical resource stewarded independently of any single product, platform, or organization.
Written by Richard Hoffmann, Founding Steward, The New Intelligence Library