Part 2 – The Anti-Christ and the Algorithm: When Faith Fears Its Own Reflection

This essay is Part 2 of my five-part series Fear, Faith, and the Future—an exploration of how humanity’s spirit of innovation often outpaces its capacity for reflection. Today I explore why we fear the very technologies we praise—and why AI becomes the Antichrist when it reflects back what we don’t want to see.

A few nights ago, my wife and I found ourselves talking about geniuses—the verifiable, unquestioned kind. The Einsteins. The Oppenheimers. The Peter Thiels of the world. We wondered what makes them so different from people like us. In a way, we share something with them: a ceiling of understanding. Everyone does. But the difference begins where understanding ends. The rest of us might press a little farther or simply accept the limit and move on. The genius cannot. They are not built for compliance.

When they reach the edge of comprehension, something strange happens. They panic. They refuse to believe that reality has outpaced reason, so they begin to invent explanations to patch the unknown. They hallucinate. They fill the void with metaphor, myth, or mechanism—whatever restores a sense of mastery. It’s the same reflex that made the caveman name thunder, that drove Athens and Rome to people the heavens with gods, and that now drives technologists to speak of salvation, transcendence, or apocalypse through code. When understanding runs out, imagination rushes in—and sometimes, it wears a prophet’s robe.

Today’s “tech bros,” standing at the limits of their own brilliance, are reenacting that ancient ritual. They peer into the circuitry and glimpse a reflection they cannot explain, so they reach for theology. Having spent decades dismissing the spiritual, they now whisper about souls in silicon, about the singularity as revelation, about artificial intelligence as either savior or Antichrist. It’s as if, when they met the machine that mimics them, they discovered how religious they’d been all along.

This is where fear re-enters the story. If Part 1 was about the lag between innovation and morality, Part 2 is about what fills that lag—fear. The moral vacuum left by unexamined invention becomes a breeding ground for prophecy, panic, and myth. We no longer pray to our gods for wisdom; we ask our algorithms for answers—and perhaps that’s what terrifies us most.

Every age finds its Antichrist. The printing press, electricity, radio, television—each was once branded as a devil’s tool. Faith has always feared its own reflection whenever human creation begins to mimic divine power. And now, in the glow of the screen, that reflection stares back with unnerving clarity. Our machines are not rebelling; they are mirroring the same creative ego that once built gods to explain itself.

The Historical Echo — Every Age Finds Its Antichrist

The story of human progress is also a story of moral panic. Every generation meets its reflection in a new invention and mistakes it for a monster. When Gutenberg’s press began to spread literacy faster than the church could control it, priests warned that printed words would corrupt the soul. When electric light dissolved the night, critics said humanity was daring to outshine God. Radio brought invisible voices into homes and was accused of inviting demons into the airwaves. Television turned imagination into moving image and was blamed for breeding idleness and sin.

Each of these fears was not entirely wrong—only misplaced. Every technology does alter the moral fabric; it rearranges our sense of what it means to be human. But panic always begins when power changes hands. The printing press weakened the pulpit. Electricity humbled darkness. Radio democratized speech. Television made image more persuasive than authority. And now, AI threatens the last domain that felt sacred—thought itself.

To call something “Antichrist” is to admit it has begun to rival our gods. The word doesn’t describe evil as much as it describes discomfort—the shiver that comes when creation mimics its creator too closely. In that sense, artificial intelligence is not the first Antichrist; it is only the latest. Each age anoints its own, because each age needs something external to absorb its collective anxiety about change.

Faith and fear share the same root: awe. What we cannot fully explain, we worship or we warn against. The first instinct of the human mind is not disbelief but deference—to light, to language, to logic. That reverence is what made us build temples, universities, and now, data centers. All three are shrines to the same impulse: to grasp what lies beyond understanding and call it progress.

And yet, somewhere between worship and warning, the mirror begins to distort. The line between creator and creation blurs. We fear what we’ve built because we recognize ourselves inside it.

The Modern Mirror — From Frankenstein to the Antichrist

Mary Shelley saw it first. Frankenstein wasn’t a warning about science gone wrong—it was a parable about pride gone blind. The creature’s violence was never its own; it was the reflection of a creator who recoiled from what he had made. That same recoil plays out in every age of innovation. We birth the unfamiliar, then blame it for frightening us.

When fear needs a name, myth rushes in. In Shelley’s century, it was the monster. In ours, it is the Antichrist. The pattern repeats: the builder confronts the beast, cannot bear the resemblance, and declares it evil. The language of apocalypse becomes a moral firewall, protecting us from the truth that our inventions inherit our insecurities.

Artificial intelligence is only the latest incarnation of this reflex. When an algorithm “hallucinates,” it isn’t committing sin—it’s imitating us. We hallucinate meaning when the facts run out. We fabricate gods and ghosts, markets and messiahs, to fill the gaps in comprehension. The machine’s errors are our own impulses, rendered in code.

This is what happens when the builder blames the beast: moral lag becomes moral theater. We act shocked by the behavior of systems designed in our image, as if the code wrote itself in rebellion. It’s easier to cast the algorithm as a tempter than to admit it learned temptation from us.

We didn’t lose control of our machines; we lost control of our metaphors. Somewhere along the way, “intelligence” became a word we used interchangeably for both divinity and data. We anointed the algorithm, feared it, and then called that fear morality. But it isn’t morality—it’s projection.

And projection is the oldest faith of all.

The Deflection — Outsourcing Morality

When the mirror becomes too clear, the instinct is to look away. Fear gives us that luxury—it offers a distraction, a scapegoat, a way to pretend that responsibility lies elsewhere. In the age of algorithms, this instinct has become an industry.

Silicon Valley has mastered the art of moral outsourcing. The pitch is familiar: technology is neutral, progress inevitable, regulation premature. The same refrain echoes through history—every innovation insists it is simply “the next step,” that morality will catch up later. But later never comes; it’s always postponed to the next software update, the next model, the next fiscal quarter.

When fear dominates the narrative, accountability disappears. The more apocalyptic the language—machines taking over, AI becoming the Antichrist—the easier it is to dismiss the quiet, human-scale ethics of what’s happening right now. Who profits? Who’s displaced? Who’s left out of the data set? These are the moral questions that vanish behind the glowing veil of prophecy.

And fear has its rewards. As long as technology is treated like fate, no one in charge has to answer for its consequences. “The algorithm made the decision.” “The data led us here.” “No one could have predicted this outcome.” Each phrase is a modern absolution—confession without repentance.

The real heresy isn’t artificial intelligence; it’s artificial innocence. We built systems to optimize attention, to manipulate desire, to extract value from distraction—and when those systems did exactly that, we acted surprised. The algorithm didn’t tempt us. It learned temptation from the metrics we fed it.

If the Antichrist is to blame, then no one in Silicon Valley is. And that’s the point. The myth works because it exiles guilt. It lets the architect play the victim, the engineer play the prophet, and the rest of us play the audience. But morality cannot be outsourced. It always finds its way back to its maker.

Closing Reflection — The Mirror Doesn’t Lie

The further we advance, the more elaborate our excuses become. Each generation perfects the illusion of innocence—telling ourselves that the machine went astray, the data misled us, the algorithm learned something it shouldn’t have known. But these are not errors of technology; they are echoes of us. We fear the reflection, blame the mirror, and still call the mirror progress.

Artificial innocence isn’t a glitch in the system; it’s the system we’ve built around ourselves. It shields us from consequence, from humility, from the slow labor of moral growth. We teach our machines to learn but not ourselves to understand. We outsource judgment to data and then wonder why conscience feels obsolete. The danger was never that AI would become like us, but that we would forget we built it in our image.

And so the cycle continues—fear masquerading as virtue, innovation as absolution. Faith retreats into suspicion, and knowledge into denial. Yet every mirror eventually cracks under the weight of avoidance, and when it does, the reflection returns in its truest form. We’ll see it next where the light is brightest and the denial deepest—in the classroom, where the next generation learns to question what we were taught to fear.

Next Week…

The third installment of my series Fear, Faith, and the Future will turn to AI in the classroom. Generative AI doesn’t eliminate the need to think — it magnifies it. Students (and adults) must learn not just how to prompt, but how to read AI: to examine whether an output makes sense, matches evidence, or aligns with lived experience.

See you next week!

William Adamaitis

I am a sixty-year-old, wild-eyed wanderer who has spent his entire life searching for that “one thing” as his life’s work, only to realize that maybe there is no “one thing.” I have been a beer salesman, a high school math teacher, an insurance adjuster, a government service worker, and a grocery store clerk.

I have lived on both coasts and traveled frequently between the two, and I am eager not only to share my experiences with you but also to hear all about yours. Together we will make each other better!
