The Stoics believed that character is fate. If that is true for individuals, then for organizations, culture is destiny. While AI is undeniably a transformative force shaping our civilization, it is our character and culture that will ultimately define its impact. AI matters, but character and culture matter more.
But what about AI? Isn’t AI the future, if not the present, of our civilization? And if it is, doesn’t that mean AI matters more than anything?
As Peter Voss once told me, “There is nothing more important and exciting than building AGI.”
But there really is something more important, something at both the personal and the collective level: character and culture.
These two will determine the nature of AI, for AI is not created in a vacuum. It is created by people within a culture, so what AI is and does carries our personal and collective intellectual DNA: our character and culture. They shape its design and fuel it in the form of "our collective data," which is another way of saying "our culture." Character and culture will, in turn, decide how AI is best used. So, from conception to implementation and application, AI is a magnifying mirror, a reflection of who we are.
For example, a military AI will have its own specific character and culture, likely viewing humans as friends or enemies, to be protected or destroyed. A commercial AI will see us as clients or products, to be sold to and monetized. In each case, the realm of what's desirable, possible, and permissible, as well as the kind of AI designed to thrive in it, is predetermined by the character and culture behind it.
Culture is our collective social data, and AI needs data. But our cultural data is imperfect; its reality doesn't match our aspirations for justice, fairness, equal opportunity, and safety. That is why Cory Doctorow notes that AI is conservative about who we are and skewed towards recreating the past in the future. Because AI makes suggestions and predictions from static versions of the past, there is a need for prompt engineering and reinforcement learning, where AI is directed not solely by past data but also by our aspiration to do better than the data suggests.
In the relationship between AI and humans, talent sets the floor, but character sets the ceiling. So AI's floor, its talent, can be higher than that of the average human, but its ceiling, its character, will be lower than that of the best of us. Thus, regarding character and culture, we cannot learn from AI; it learns from us.
But why do I keep on emphasizing character and culture?
Because, as C.S. Lewis once said:
Each new power won by man is a power over man as well. Each advance leaves him weaker as well as stronger.
One example is Frank Herbert's 1965 cautionary tale in Dune about the people who turned their thinking over to machines, hoping that this would set them free, only to find themselves enslaved by other people with machines. That is why each new power of humanity is also a power over humanity: it can weaken us just as much as it can strengthen us.
Silver-bullet thinking, like that espoused by Peter Voss, is attractive but utopian. No single technological breakthrough can ever solve all of our problems. As futurist Nikolas Badminton once said to me, "AI is NOT a strategy." But character and culture can be.
AI drives change, but character and culture define it, because they determine what kind of change AI brings.
Thus, the people who have been raising the alarm about "controlling AI" have it backward: humanity is the only species out of control on our planet. So, our greatest challenge is not mastering and controlling AI. It is mastering and controlling ourselves, which means mastering our character and culture. We need to win the conflict within and among ourselves. This is our greatest challenge and opportunity. This is our most significant project. This is how we will be measured in the future. This is how we survive or die. This is why AI matters, but character and culture matter more.