In September of 52 B.C., Vercingetorix ("victorious one") had to eat his-own-name-pie because a Roman legion, personally and literally spearheaded by Caesar (cf. Plutarch's Life of Caesar), broke through the remaining Gallic forces gathered by this great Arverni-Gallic chieftain—a rare barbarian leader, brazen and foresighted enough to see who and what Caesar was and what he was about to do to Gallic life as a whole, forever. At Alesia, the place I knew as a child only as the mysterious beginning-of-the-end never mentioned by the Gauls in the Asterix comics, Vercingetorix finally surrendered to the might of Rome—well, really to the might of Julius Caesar, who was a one-man force of technology and clever, inventive thinking along with his celebrated celeritas, speed. Caesar is portrayed by Plutarch, and inadvertently by himself in the autobiographical Gallic Wars, as a cartoon of imperiousness, talking about himself in the third person, a detail that always strikes me as telling, just as telling as the perfectly arranged hair that Cicero observed Caesar always kept (for Cicero, a sure sign of tyranny). More essentially, Caesar was Technology, in a sense, and Vercingetorix was the Personal, in a sense, and the surrender at Alesia was, in a sense, when Technology Won.
According to Plutarch, in an episode that Caesar in The Gallic Wars conveniently sent down his third-person memory hole, Vercingetorix, when he saw through the columns of smoldering death-sighs that it was over, thundered out of Alesia on his mammoth white war-horse. It was the loss of everything he knew, and he knew he'd be dead soon enough, so he cut a circle around Caesar and the Roman legion with massive hooves and speed, flaming plume and bloody armor flashing, as if he had to extinguish himself and the fire of his Celtic heart. It seemed as if he would charge the Great Roman commanding from his makeshift high seat, whose red robes were like a wreath of mixed Roman and Gallic blood, all gathered up around a leathered, sinewed frame, accentuating a square, muscular jaw.
Did Caesar's jaw flinch as Vercingetorix charged right for him, close enough that he could clearly see the whites of the war-horse's eyes? Or did his quick guards set up a forest of spears? Or did Caesar put out a staying hand, steeling himself and the wall of soldiers? It must have been something to see; had I been there, I don't know if I would have hoped that the young and wild chieftain would just keep going and mow down a man almost solely responsible, like George W. Bush, for a million deaths, or if I would have been glad to see the sun set on that frothing, wild, personal Gallic world in favor of the pax that served as an iron, organized, civilizing peace, if one can call bureaucracy that.
Vercingetorix had the last free ride round the last breathless bastion of his culture, a culture created around family, a twisted, tangled mass of laws unwound only if one understood the complex social ties; academic sources and even Caesar himself describe the foundation of Gallic law and culture as a mixture of kin and religion in a web of covenants (the sources call these "contracts," but "covenant," indicating blood ties, whether of sacrifice or relationship, seems a better term to me). It was a messy world, but in some ways a more human world: the law followed the way people lived and took into account the place of the individual within a real community. It was based on what Hannah Arendt calls the reality of the "space between us," that complex, living trinity of tradition, culture, and true authority, life-giving, the facilitation of flourishing based on natures, on the person. Roman law, on the other hand, had become impersonal, fundamentally bureaucratic. But it was still efficient and effective; it allowed for empire and leisure. It seems an easy choice: safety and surety over scrapping; engineering feats and roads over chieftain-councils breaking and mending over love-and-war feuds; the celeritas of information over Celtic creative backwardness. The Technological Man over the smoldering and erratic flame on the war-horse.
Vercingetorix did stop, just before his horse landed on Caesar's lap; from what I know of Caesar's character, he probably did not flinch and allowed the Gaul to extinguish his own pride (Caesar was, at his best, a man's man and therefore understood); Vercingetorix dismounted, the steed disappeared into the Roman ranks, probably destined for some great house in the Roman countryside, and the young man sank onto the ground before Caesar. Later, the vanquished warrior would be paraded through the bellowing, sycophantic streets of Rome behind Caesar, property now, and would end his life "ceremonially strangled" (a particularly horrifying phrase—the Roman precision and calculation of it is as great an argument against paganism as any) as a human sacrifice in the temple of Jupiter Optimus Maximus, Rome's greatest god. Technology Man himself was killed only two years after Vercingetorix, ceremonially stabbed twenty-three times by his fellows in front of a statue of Pompey in the Curia of Pompey in Rome, but the Roman Empire of technological prowess kept expanding like a great Google digital cloud over much of the known world. It must have seemed like the end of the primitive and ancient world, and in a way, it was.
I think educators today feel a bit like Vercingetorix at Alesia, riding a last circle around the classroom as the celeritas of AI straddles our desks, waiting for us to sink at its feet, to be slated later for ceremonial strangulation as our students disappear into its digital mind. There's no blood, though: AI is much, much colder and much more subtle than Caesar; moreover, like the Roman bureaucrats who came trundling into view along Roman roads, a middle-man cultural invasion, the tentacles of AI are already all around us as I write.
Or are our students Vercingetorix, making that last circle round the inexorable march of progress, with this new algorithmic entity seated on the high chair in the center?
The writer Conrad Flynn recently asserted that, per his research into the various Jupiters Optimi of Silicon Valley, some of those deepest in the center of it admit they do not know exactly how AI works: it is beyond them already, and so they will become like Caesar, passing into history as the Empire grows on, unfettered by its creators, over the known world.
I've taught at various levels for decades, from middle school to college, in classical and mainstream institutions; my specialty is the Trivium, the three classical liberal arts—grammar, logic, and rhetoric—as the foundation for all other arts and sciences. My classroom has always been something of a tangled skein of threads, an artist's workshop with a foundation of community: like the world of Vercingetorix, it is colorful and somewhat unpredictable, moving from carefully calibrated activities to free-flowing discussion, but it is all based on "kin," or kindness, particular rules created, in a sense, by the community that each class becomes. It is a human place: not perfectly orchestrated and simplified, but, hopefully, reaching depth, reaching the whole human being: mind, soul, emotions. Fundamentally, though, it is a place in which reality, and the mind in accord with reality (truth), are prized, searched for, hoped for. In this, we also search for God, though this is often more implicit than explicit.
It is certainly more Gallic than Roman, but nevertheless Caesar now seems to sit in the middle of the room as I wrestle with how, and whether, students will still learn to write—and think—for themselves, and with whether the institution around me is still Gallic or already Roman. I think we are all teetering at that Rubicon, unsure, some of us freely using AI and others still holding the wall of Alesia. But it seems that Alesia will fall again, crashing into a heap of human logs, petrified inside their screens, nursing reality second-hand from curated algorithms. Yes, again, there will not even be blood spilled: it will be a silent crash, a quiet retreat into the woods of the mind, divorced from fundamental contemplation and articulation of the ground, the sea, the sun, real people, real teachers, real classmates, real interlocutors.
After all, as Professor Jane Sloane Peters and D. C. Schindler have argued, AI is more than just a glorified calculator: at its most dangerous and Caesar-like, it has potential as an idol. This, as Schindler suggests, is because it masks itself as intelligent, when fundamentally, it is not: it is a deep fake producing deep fakes. Why is it not intelligent? Why can't you have "artificial intelligence"?
Intelligence, and the words that are inherent to it as both its means and its expression, is by definition in a relationship with reality: intelligence only exists in a mind able to perceive, and AI does not perceive in this way. Professor Peters, in her lecture "ChatGPT and the Foolishness of Speech," explains that the explorative, sometimes awkward nature of human speech is fundamentally a creative act, a relationship with reality that D. C. Schindler, drawing on Plato and Aristotle, explains as an act of unity. We unify with the substantial forms and the properties and accidents of the beings around us, because our concepts and the substantial forms are the same thing; Schindler explains further:
Only a living thing can be intelligent, because only a being with an interiority, with an internal principle capable of gathering its many parts into a per se unity, which entails mediating the parts to each other so that they are intrinsically interdependent, can understand. Understanding, in other words, is a deepening of the kind of unity that constitutes life. Plato describes it as the fruit of the soul's coupling with reality, wherein the deepest core of each becomes one.
It becomes obvious, then, that AI is not actually intelligent: it is a glorified calculator. However, as Schindler observes, it isn't made so that we naturally treat it as the calculator it is: it is made to simulate intelligence, and when we begin to treat it as something with intelligence, we are willingly involving ourselves in a deception. Then, as the technocrats, the Jupiters Optimi, begin to tout AI as more intelligent than humans simply because it can calculate faster, we may be primed to treat it as a higher being, even though it has no eyes and therefore cannot see. Can we expect children, teenagers, and young people to somehow see the danger and respond appropriately, to resist the temptation to live in a deception and to use AI ethically as the tool it is? Will they become Roman bureaucrats, or be ceremonially strangled, or will they stay human and search for the True God?
OK. Maybe I'm waxing into overdrive. Maybe I'm a hysterical Gallic woman inside Alesia, unaware that my descendants will drive BMWs and have a much higher standard of living, will be educated and civilized. Maybe AI is just another step in technology, one we will make choices about, for which we will find a place, a thing that will, in the end, kneel before reality in the dirt.
For answers, I look not to my colleagues but to my students: they are the generation that will, ultimately, decide; I'm already too old, in a sense, because I wasn't born into a digital world. They, native to this new incursion into reality, have in the last year given me two lenses through which I can muse about the future of education in this new Roman Age of AI: one is a sort of case study, a teacher I was meant to mentor (and largely failed; he left the school after one year); the other is a reflection a current student of mine just handed me.
The case study: I'll call this teacher Sam, after Sam Altman, because this teacher was totally into AI. Always interested in "technology in the classroom" (a public-school buzz-phrase), he made each of his lessons a Luna Park of Canvas, Padlets, Google Forms, Kahoots, and a dizzying (and, I thought, nauseating) array of online collaborative tools where students could connect with each other online, even though they might be sitting right across a classroom aisle. When AI hit, suddenly all this became even more dizzyingly easy to produce, and he off-loaded quiz and prompt creation to AI; up to this point, his students followed. I got the sense, when I observed, that I was actually part of a TED Talk, as if he were orchestrating things from a stage, perhaps even behind teenager-proof glass; his students, meanwhile, sat diligently in factory rows at their screens, learning through pixels how to persuade people (it was partly a rhetoric class). I intuited that perhaps Sam was actually afraid of interacting with real people, most especially the sixteen- and seventeen-year-olds in their khakis and plaid skirts, with their cherubic faces and the pools of intelligence in their eyes; perhaps he wasn't sure if or when they might actually ask him a philosophical question about reality, about the world beyond, or, God forbid, about how he felt. I admit that teenagers can be quite disarming, and even sometimes a bit Gallic, ready to gallop a war horse right up to the desk, but they are just people, after all: aren't we made to know others?
I suggested that he get to know their names and get them off screens and into some discussion every once in a while. I saw some improvement—but then things seemed to go truly off the rails, and I couldn't figure out why, exactly. Why did they come to me complaining, with a level of contempt I rarely saw?
It wasn't until much too late that I realized that Sam had begun to use AI to grade, to provide guidance and comments for his students. When this line was crossed, they lost all respect for Sam, and he was Kahoots. After he left, I kept wondering about this line, and the following year I slowly began to discuss it, obliquely of course, with my students. What I found was that this generation of students, oddly enough, seems to value something beyond all the convenience and fascination that the digital world into which they were born offers. They value reality; they want authenticity. They are, in a sense, beyond jading: I began to see that, perhaps, long after the Roman invasion, sprouts of Gallic culture were resurrecting in that deep desire for authenticity. Of course, this desire is not new for the youth of the world; it is an ever-bubbling spring deep in the Alps that reminds us older ones of our own humanity. I had just begun to be afraid that the digital bureau-bots had completely colonized our youth—and I don't imagine I'd be blamed for thinking so, given the movement from Vines to TikToks to Snaps to the absurdity of whatever memes are floating around our heads in the ether.
Sam's students were willing to follow him, albeit hesitantly and not joyfully, as long as he was relating to them in some scrap-authentic way: they told me in many ways that they wanted to be taught by an authentic person; they knew, in the core of their beings, that knowledge was bound up with relationship, real relationship. It was, in the end, the same reason why, after the totalitarianism of Covid, we didn't all stay at home and teach on Zoom, or let AI begin to take over schooling. Learning is much deeper than information or acquiring skills; if that were all it was, we'd be disciples of YouTube.
Learning is play in the natural world and in the trinity of tradition, authority, and culture, or religion. We learn many things simply because we can, or because they are fascinating, or beautiful; not, at the deepest levels, because they are useful. And much of the time, we want to play with others; regardless, though, we are always learning, whether formally or not; it is the most fundamental part of relationship, of friendship and even romance: we are always contemplating logos, because that is how we are made, made for reality. AI, on the other hand, does not apprehend or create concepts; it is not in touch with reality, only with a digital highway; it "thinks" only according to the algorithms available to it, not according to the true, the good, and the beautiful; it has no real conception of anything beyond the electrical. It also tends to synthesize to the point of simplification: that is both its strength and its terrible weakness; it does not create depth, and it is therefore often wrong on the nuances. For all these reasons, a human being cannot relate to AI; we can only use it.
Sam's students intuited this from long experience with video games and social media algorithms; they already knew AI before it came. Sure, some young people, lonely and depressed and anxious, will become dependent on it, and there have been and will be casualties; for this alone I would send AI back to Rome to be stabbed. But it will not go back now, and I think we must go through to get beyond. My apprehension for the future, therefore, is more about the people who will become its subjects or use it for tyrannical ends than about the technology itself. Nevertheless, it does carry an inherent danger, that of deception, because, again, AI is itself a deep fake, as Schindler claims. I don't know how we will navigate this, but I think this navigation will become a critical part of K-12 education.
Our schools will need a return to material logic, the basic study of the three acts of the mind, and, as I do in my Trivium course, we will have to open students' minds to the inner workings of intelligence, to enable them to see for themselves that AI is neither intelligent nor alive. We will have to help our young people understand again what it means to be human, and I have had some real success in this respect: I now put student work through the latest LLM checker and then have a conversation with any student who has fallen into AI plagiarism (plagiarism up-leveled to letting another "think" and "create" for you). The conversation goes something like this:
"David, the checker indicates that this is mostly AI. What's up?"
"Yeah, Mrs. K, I was just stressed out; I did just have it go through and improve my ideas, though."
"So you are letting it change your thinking?" Silence.
"Yeah, I guess so."
"Can you see where this might go for you?"
"I might lose my ability to think for myself, and I'm not really learning."
"Yep. OK--so how might you use this as a tool and not a controlling mechanism that stunts you?"
"I could ask it for some outline structures based on my ideas, or some help with grammar and formatting?"
"Yes--as long as you know that you must be the master of the tool, and that you must choose, intelligently, the best option--or your own option--based only in part on its algorithmic suggestions. You are the only one in touch with reality in this situation, and so you are the only one responsible for the ideas."
The student usually agrees with me, in the sense that none of them wants to give over their God-given intelligence to a bot; in fact, what they love most are the sometimes messy, funny, deep incursions into reality that are a hallmark of my classroom. A wonderful sign. Nevertheless, I had been feeling quite down about this whole thing, even as I gained hope that this generation indeed cares more about truth and that most are quite savvy about counterfeits. And then I received a reflection from a student, which taught me and grounded me back in firm hope: this young man grasped the essence of teaching and of his own education, and in doing so, he revealed to me once again the dignity, the essence, of what it means to be imago Dei, something AI will never achieve or destroy in those who desire to keep it alive.
This student, a poet, an old soul, already a philosopher, is nevertheless steeped in the air of nihilism; he is teetering, within his own soul, on the edge between deep encounter with Being and an eternity with Nietzsche. When he entered my class last year, he was purposely dumbing his writing down to fit in; as the course and I encouraged him to exercise his mind and soul and his deep intelligence, he began to reveal himself to himself. As part of his personal statement for college applications, he wrote about the nature of education; what follows is true of all teachers, and is less about me and more about the power of real education, that deep relationship of struggle and play, which can never be faked by AI, much less replaced by it. This is from the mind, soul, and heart of a teenager, riding out beyond Caesar and into reality on a great white war horse, expressing irrepressible human desire; this encounter, this lens, is enough to firmly ground our hope:
"Great teachers awaken something inside of us. For me, that came in [my Trivium] class, where philosophy met enlightenment, and where my tangled thought on truth, morality, and meaning found both a mirror and a guide. With intellectual rigor and profound empathy, my teacher didn't just teach me how to write or think critically; she taught me how to trust my own mind. She helped me dismantle self-doubt, amplified my voice, and set me on a path where questioning the world became not just a belief, but a calling. The classroom was more than a place of learning; it was the catalyst that turned my restless curiosity into a lifelong pursuit of deeper understanding . . . it wasn't just a space for analysis; it was a proving ground for perspectives. [My teacher] pushed me into moral dilemmas not to "correct" answers, but to teach us how to wrestle with uncertainty. And in that struggle, something in me shifted.
"I've always considered myself an old soul, drawn to philosophy's big questions, but before [this] class, those questions felt isolating. At home, I'd spiral into frustrations about truth, ethics, and meaning only to arrive in the classroom and find those very topics alive in our discussion. [My teacher] didn't hand me answers; she gave me tools. She talked her approach to each student, and for me, that meant recognizing the depth of my curiosity and challenging me to articulate it . . . Learning how my thought mattered not just to me, but to someone whose judgement I revered, ignited my confidence. . . her pride [in me] wasn't just about [my] achievement[s]; it was about the person I'd become . . . someone unafraid to question, to write with vulnerability, to chase ideas into the dark.
"And that's the mark of a great educator: they don't just light the path, they make you believe you belong on it.
"My experience connecting with [my teacher] is proof of a quiet truth: that the greatest education is not transactional, but transformational. A single human soul, armed with nothing but curiosity and care, can alter the trajectory of another, not though grand gestures, but through the steady insistence that I am capable of more than I know . . . Real learning begins with genuine interest, the kind that cannot be forced or faked, only kindled, And now having been seen and inspired . . . I carry forward a question of my own: How can I, in my way, become the kind of presence for others that [my teacher] was for me? Whether through words, actions, or simply the act of listening deeply, I want to pass on the gift she gave me, the courage to think boldly, to question relentlessly, and to trust that even in the dark, there are minds to help you find your way."




