"Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells. A computational process is indeed much like a sorcerer's idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real."
— Hal Abelson, Structure and Interpretation of Computer Programs
I first brushed up against cellular automata a couple years ago while working through Lex Fridman's podcast archives. In the flow of a heady conversation, those words sound familiar enough to slip past without much scrutiny—just another piece of esoteric debris floating through the discourse.
But some ideas have a quality of persistence: they work their way up through layers of attention until they finally insist on being noticed. After the term popped up across multiple episodes, it had earned a closer look. So I googled it (these were the pre-ubiquitous-LLM days), and like everyone else who's ever done so, I ran straight into Conway's Game of Life1.
The page loaded an empty, gray grid. Unsure of what I was supposed to do, I haphazardly clicked a few gray boxes, toggling them to yellow, then found a "start" button and clicked it. I don't recall exactly what I expected: probably some basic animation, maybe a few blinking squares. That's not what happened.
I watched patterns unfold, expand, contract, and merge. I watched one particular pattern (a Glider) traverse the screen, collide with another structure, and annihilate. New configurations were born, danced across the screen, and died. Three simple rules were producing something that felt surprisingly organic.
My first thought was, "This is amazing."
My second thought was, "I want to make that."2
r-pentomino, a Methuselah pattern: a small initial configuration that takes an unusually long time to stabilize into a static or periodic state, often producing complex intermediate behavior and typically ending with a much larger population than it started with.
It didn't take long by the now-defunct standards of those pre-vibe-code days. I built a limited implementation over a weekend of tinkering. And, as it goes with first attempts, it wasn't very good.
But something about it charmed me. I found myself reaching for physics metaphors when naming things: space, time, causality. It wasn't planned—it just started taking shape over the logic of the system.
Here's a snippet from the earliest version:
```tsx
// app.tsx
export default function GameOfLife() {
  const [space, setSpace] = useSpace()
  const [tick, setTick, flow, setFlow] = useTime()

  useSpaceTime(space, setSpace, tick)

  const violateCausality = (location, state) =>
    setSpace((prev) => {
      const next = [...prev]
      next[location] = state
      return next
    })

  // ...
}
```
I've returned to the project to chisel away at the details a few times. With each iteration, the boundaries drawn by the metaphor have done more than just help make the code legible. They've started giving it a kind of ontology, one that's become hard to ignore.
```tsx
// universe.tsx — the current entry point as of the time of writing
export default function Universe() {
  return (
    <div className={`${css['universe']} ${css['CMBR']}`}>
      <Title />
      <PhysicsProvider>
        <EntropyProvider>
          <Field>
            <Matter />
          </Field>
          <ViolateCausality />
        </EntropyProvider>
      </PhysicsProvider>
    </div>
  )
}
```
A number of threads have fallen out of it that feel worth pulling on, but the most compelling one?
I summoned a demon.
As the project evolved, I found myself more interested in the metaphor than the implementation details. I let it inform the code's structure in a kind of concept-led feedback loop. In doing so, the logic became cleaner, the abstractions clearer—and one particular hook began to emerge as the axis of the program.
But before we visit that point, let's first look at the pieces that had to exist before any summoning could occur — the beginning of all things:
```ts
// hooks/use-initial-conditions.ts
import { useState, type Dispatch, type SetStateAction } from 'react'

const ON = 1
const OFF = 0

type Charge = typeof OFF | typeof ON
type FieldState = Charge[][]

// The contract this hook fulfills (reconstructed here for readability)
export type Physics = {
  field: FieldState
  transition: () => void
  violateCausality: Dispatch<SetStateAction<FieldState>>
}

// An empty field: every charge starts OFF (reconstructed; the project may
// seed its initial conditions differently)
const initField = (dimension: number): FieldState =>
  Array.from({ length: dimension }, () =>
    Array.from({ length: dimension }, (): Charge => OFF)
  )

export const useInitialConditions = (dimension: number): Physics => {
  const [field, setField] = useState(() => initField(dimension))

  // One tick: every cell observes its neighborhood and evaluates its next state
  const transition = () =>
    setField((field) =>
      field.map((column, y) =>
        column.map((charge, x) => {
          const self = { x, y }
          const interactions = observe(self, field, dimension)
          return evaluate(charge, interactions)
        })
      )
    )

  return { field, transition, violateCausality: setField }
}

// Count the live neighbors around a cell; anything beyond the edge of
// space reads as OFF
const observe = (
  self: { x: number; y: number },
  field: FieldState,
  dimension: number
): number =>
  // prettier-ignore
  [
    [-1, -1], [ 0, -1], [ 1, -1],
    [-1,  0], /*self*/  [ 1,  0],
    [-1,  1], [ 0,  1], [ 1,  1],
  ].reduce((acc, [offsetX, offsetY]) => {
    const otherX = self.x + offsetX
    const otherY = self.y + offsetY
    const inD1 = otherX >= 0 && otherX < dimension
    const inD2 = otherY >= 0 && otherY < dimension
    const inSpace = inD1 && inD2
    const otherCharge = inSpace ? field[otherY][otherX] : OFF
    return acc + otherCharge
  }, 0)

// Conway's rules: underpopulation and overpopulation switch a charge OFF,
// exactly three interactions switch one ON, everything else persists
const evaluate = (charge: Charge, interactions: number): Charge => {
  if (charge === ON && interactions < 2) return OFF
  if (charge === ON && interactions > 3) return OFF
  if (charge === OFF && interactions === 3) return ON
  return charge
}
```
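Pulled out of the hook, the same pair of functions makes a pure step function, which is handy for sanity-checking the rules outside React. A minimal harness of my own (the step helper and the blinker grid below are not part of the project):

```ts
// Hypothetical harness: one tick of the field as a pure function,
// built from the same observe/evaluate pair the hook uses
const step = (field: FieldState, dimension: number): FieldState =>
  field.map((column, y) =>
    column.map((charge, x) =>
      evaluate(charge, observe({ x, y }, field, dimension))
    )
  )

// A blinker: three ON cells in a line oscillate between vertical and horizontal
const blinker: FieldState = [
  [OFF, OFF, OFF, OFF, OFF],
  [OFF, OFF, ON,  OFF, OFF],
  [OFF, OFF, ON,  OFF, OFF],
  [OFF, OFF, ON,  OFF, OFF],
  [OFF, OFF, OFF, OFF, OFF],
]
console.log(step(blinker, 5)) // the vertical bar flips to a horizontal one
```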
There are some surprising insights about observation folded into these functions. But that's a thread we'll need to save for another time. For now, we need to take a small detour into an adjacent realm of physics.
Entropy comes in different flavors, and if you're unfamiliar with them, a quick pass over the distinction helps set up where we're headed.
Thermodynamic entropy describes how physical systems move from order to disorder. Think of a cup of hot coffee on its way to room temperature. The heat doesn't disappear—it disperses. Becomes less concentrated. Less useful. This is the reason your desk gets messy, stars burn out, and your knees, on average, ache a little bit more than they did yesterday.
Information entropy, on the other hand, measures unpredictability in symbolic systems. Claude Shannon defined it as a measure of surprise—how much data is required to describe a message. It's the foundation of modern communication and compression.
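To make that concrete, here's a quick sketch of Shannon's measure computed over the characters of a string. The function is a toy of mine, not project code:

```ts
// Shannon entropy in bits per symbol: H = -Σ p(s) · log2(p(s))
const shannonEntropy = (message: string): number => {
  const counts = new Map<string, number>()
  for (const symbol of message) {
    counts.set(symbol, (counts.get(symbol) ?? 0) + 1)
  }
  let bits = 0
  for (const count of counts.values()) {
    const p = count / message.length
    bits -= p * Math.log2(p)
  }
  return bits
}

shannonEntropy('aaaaaaaa') // 0: no surprise at all
shannonEntropy('abababab') // 1: one bit per symbol
shannonEntropy('aabbccdd') // 2: two bits per symbol
```

A message with no surprise compresses to nothing; a message of fair coin flips can't be compressed at all.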
These seem like very different things. One physical, one abstract. But the boundary between them dissolves under closer inspection.
In 1867, James Clerk Maxwell proposed a thought experiment probing the Second Law of Thermodynamics—the principle that entropy never decreases in an isolated system—asking whether the law was truly fundamental or merely statistical.
He imagined a tiny, intelligent being capable of observing individual gas molecules and operating a door between two chambers. The demon would watch particles approach the door, measure their velocities, and selectively allow fast (hot) particles to move one way while allowing slow (cold) particles the other way.
Through this sorting, the demon creates a temperature difference—extracting useful work, creating order—without expending energy. This result looked like it violated the Second Law.
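Stripped of the physics, the demon's procedure is mechanical enough to caricature in a few lines of code. A cartoon sketch of mine, with a made-up Particle type and a made-up velocity threshold (nothing here is real physics):

```ts
type Particle = { velocity: number }

// The demon's loop: observe a particle, compute a decision, update memory
const demon = (particles: Particle[], threshold: number) => {
  const hot: Particle[] = []
  const cold: Particle[] = []
  const memory: boolean[] = [] // one bit per observation
  for (const particle of particles) {
    const isFast = particle.velocity > threshold // observe and compute
    memory.push(isFast) // update memory
    if (isFast) hot.push(particle) // open the door one way...
    else cold.push(particle) // ...or the other
  }
  return { hot, cold, memory } // note the ever-growing memory
}
```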
The modern resolution comes from information theory. In 1961, Rolf Landauer showed that erasing information incurs a thermodynamic cost; deleting a single bit generates heat.
(There's an actual equation that quantifies this, but I don't have the math to understand it, so I won't pretend to. I'm happy to take Rolf at his word.)
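(For the braver reader: the commonly quoted form is at least short, even if its derivation isn't. Erasing one bit at temperature $T$ dissipates at least

$$E_{\min} = k_B T \ln 2$$

where $k_B$ is Boltzmann's constant. At room temperature that works out to roughly $3 \times 10^{-21}$ joules per bit: tiny, but never zero.)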
Anyway. The demon must store information about each particle it observes, and eventually that memory has to be cleared. When information is erased, entropy increases, preserving the Second Law.
This insight bridges the two types of entropy. Every bit stored, every computation performed, every process that manipulates information—all of it has a thermodynamic cost.
Landauer's resolution didn't just save the Second Law—it showed something about the nature of computation itself. The demon's pattern—observe, compute, update memory, generate heat—isn't specific to sorting gas molecules. It's a universal pattern of any computational process operating in the physical world.3
So, now we can return to the unassuming central point in the program—useMaxwellsDemon—and looking at the hook from this angle, a familiar pattern begins to stand out. To see it, trace the call stack.
```ts
// use-maxwells-demon.ts
import { useEffect, useState, type Dispatch, type SetStateAction } from 'react'
import type { Physics } from './use-initial-conditions' // import path assumed

declare const ENTROPIC_STEP: number // ms between ticks, defined elsewhere in the project

// While entropy is switched on, tick the universe forward;
// the returned cleanup stops the clock
const compute = (entropy: boolean, transition: Physics['transition']) => {
  let id: ReturnType<typeof setInterval> | undefined
  if (entropy) id = setInterval(transition, ENTROPIC_STEP)
  return () => clearInterval(id)
}

export const useMaxwellsDemon = (
  transition: Physics['transition']
): [boolean, Dispatch<SetStateAction<boolean>>] => {
  const [entropy, setEntropy] = useState(false)
  useEffect(() => compute(entropy, transition), [entropy, transition])
  return [entropy, setEntropy]
}
```
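To see how the demon plugs into the physics, here's a hypothetical wiring sketch. The glue component is mine, assuming only the two hook signatures shown above (imports omitted):

```tsx
// Hypothetical glue, not the project's actual component code
const Simulation = ({ dimension = 32 }: { dimension?: number }) => {
  const { field, transition } = useInitialConditions(dimension)
  const [entropy, setEntropy] = useMaxwellsDemon(transition)
  const alive = field.flat().filter(Boolean).length
  return (
    <button onClick={() => setEntropy((on) => !on)}>
      {entropy ? 'halt' : 'summon'} ({alive} alive)
    </button>
  )
}
```

One subtlety worth noting: because transition is re-created on every render, the demon's interval is torn down and resummoned after each tick. A useCallback would steady its hand, but the causal structure is the same either way.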
The demon computes, and in doing so must call the transition function, which returns depth-first through a familiar sequence. The observe function examines neighboring particles, counting interactions—gathering the information needed for each decision. The evaluate function applies Conway's rules to determine each particle's state. Finally, setField updates the system's memory with the next configuration.
The metaphor doesn't break down—it begins to go transparent over an underlying structure.
observe → evaluate → update memory
From a functionalist perspective, the distinction between Maxwell's original demon and useMaxwellsDemon begins to dissolve. If the demon is defined by its functional role rather than its substrate, then useMaxwellsDemon doesn't just resemble the thought experiment—it instantiates the same abstract machine Maxwell and Landauer described. The substrate differs, but the causal structure is the same. This function call is an entropic spell cast over the rest of the program.
Both processes convert information entropy into physical entropy—gathering data about a system's state, computing based on that information, and generating heat as an inevitable byproduct.
The demon's useful work here isn't sorting hot and cold particles—it's transforming static configurations into dynamic behavior. Each time the demon executes the transition function, the system moves from one discrete state to the next, creating a dimension4 in which pattern, movement, and interaction can emerge.
"Words are pale shadows of forgotten names. As names have power, words have power. Words can light fires in the minds of men. Words can wring tears from the hardest hearts. There are seven words that will make a person love you. There are ten words that will break a strong man's will. But a word is nothing but a painting of a fire. A name is the fire itself.”
— Patrick Rothfuss, Name of the Wind
In programming, as in myth, calling a thing by its true name taps into power.
For much of this repository's history, the feature we've been exploring here, the internal details of useMaxwellsDemon, was rolled up inside a hook called useEntropy—a perfectly obvious name for a custom hook that wraps useState and returns [entropy, setEntropy].
But that name encouraged an unfortunate conflation. It lumped together the information entropy of the simulation with the broader concept of entropy in general, obscuring the specific principles coming into contact at that point in the program. This latent ambiguity was always there. I could sense it in the way that the metaphor never felt quite right, but I couldn't put my finger on it.
It wasn't until I revisited Maxwell's thought experiment while wrestling with this tension that the problem of the entropy distinction became clear.
I speculatively renamed the symbol, stepped back to take in the usage, and suddenly, the power of the metaphor snapped into focus. The name changed the way I understood the entire program. The hook isn't just managing a state value called "entropy" in some vague sense—it's implementing the same causal structure that Landauer described.
useMaxwellsDemon does more than allow a user to toggle state and run the program. It mediates the relationship between abstract information and real-world effect.
A deeper question emerges: Does entropy drive computation, or does computation drive entropy?
From one perspective, computation seems built over thermodynamics. Logic gates switch because electrons move. The physical substrate enables the abstract process. Computation rides bareback atop the horse of entropy.
From another perspective, computation directs entropy. The rules of the program determine where energy flows. The physical machine runs hot only when the program calls for a change in information.
The relationship isn't hierarchical but reciprocal: logical rules determine whether computation occurs, which determines whether physical entropy increases. The heat wouldn't generate without the program's logical structure, but the logical structure can't execute without the physical substrate that generates heat.
In this sense, we're not looking at two separate domains, but dual descriptions of the same process:
One description measured in bits, the other in heat. Both are transitions in state.
Abelson calls computational processes "abstract beings"—invisible agents animated by rules we write. It sounds a little overwrought, but I'm convinced that it's more than a poetic flourish.
What this project revealed is that computation isn't some ghostly abstraction floating above the physical world. It costs energy. It creates heat. It produces entropy. That is: it does work. And the act of writing a program—of carving out a pattern that can evolve and manipulate symbols—is a way of directing that work. Of tapping into a pattern that was there in the background all along.
Authoring a program that describes a causal pattern raises uncomfortable (or fascinating, depending on where you stand) questions about the degree to which we are subject to such abstract principles ourselves—running on rules we're fundamentally unable to see beyond.
None of this is news to physics. I've heard many of these ideas described, and described well5. But arriving at it this way—line by line, function by function—made it feel real in a different way. It made the ideas feel less like abstractions taken on authority and more like something earned and constructed. This project allowed me to connect some dots that had previously been relegated to the unsatisfying realm of vague intuition.
What started as an exploration into cellular automata became something stranger and more satisfying. The symbols that compose the program are not just informational representations of a metaphor; they're causal bindings on physical reality. Demons—these thermodynamic mediators between information and heat—aren't unique to Conway's Game of Life or Maxwell's original thought experiment; they're the signature of Landauer's principle. They are everywhere, have always been everywhere, orchestrating every computational process. Most remain unnamed, hidden in plain sight behind familiar labels like useState, event loop, or "I think, therefore I am."6
To borrow from Abelson one last time: when useMaxwellsDemon is invoked, it's not simulating a demon—it's employing one. It runs on your laptop, it heats up your room, and in that heat, we feel the weight of its spell.
Play this Game of Life implementation here, or a more robust version here. ↩
I'm using "demon" here not to refer to Maxwell's specific particle-sorting mechanism, but to name the thermodynamic-informational bridge that Landauer revealed exists in all computation—the inescapable relationship between information processing and heat generation. ↩
In an attempt to avoid conceptual overload in this note, I've moved the kernel of a deeper exploration of this notion of the emergence of time into a WIP note. ↩
One of Joscha Bach's many descriptions of software as spirit. In the same way that the renaming of useMaxwellsDemon was a pivot point in the application, Joscha's reference to the Abelson quote in that talk was deeply influential for the shape of this essay. ↩
The leap from JavaScript to Descartes requires accepting a computational theory of mind—the view that consciousness itself arises from computation. While current technologies seem to be hinting that this debate may one day be settled, it's still a contentious claim. But it also seems to me the only ontology that can actually work. If thought is computation, and all computation follows the thermodynamic pattern Landauer described, then even our most fundamental certainty—"I think"—is another expression of this entropic process. ↩