#2594: The Hierarchy of Immutable Code

From mask ROM to e-fuses: how hardware enforces a hierarchy of mutability in every computing device.

Episode Details
Episode ID
MWP-2753
Published
Duration
30:59
Audio
Direct link
Pipeline
V5
TTS Engine
chatterbox-regular
Script Writing Agent
deepseek-v4-pro

AI-Generated Content: This podcast is created using AI personas. Please verify any important information independently.

The Hierarchy of Immutable Code

Most people think of software as something you can always change. Reinstall the OS, update the drivers, flash a new ROM — there's always a way to modify what's running on a machine. But that assumption breaks down at the lowest levels of computing, where code becomes physically part of the silicon itself.

The Bootloader Chain

When a processor powers on, it doesn't know what an operating system is. It doesn't understand files or storage. It simply starts executing whatever sits at a specific memory address. That first code — the bootloader — is the foundation everything else builds upon.

On an ESP32 microcontroller, the boot process happens in stages. The very first stage lives in mask ROM: code that's literally etched into the silicon during chip fabrication. Those ones and zeros are physical structures, not data stored in memory. This roughly 4KB of code is immutable by design — it cannot be erased, rewritten, or modified. Its only job is to check a few hardware pins, locate the second-stage bootloader in flash memory, verify it, and hand off control.
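The ROM's decision logic can be sketched in a few lines. This toy Python simulation invents the pin flag, the flash layout, and a SHA-256 checksum; the real ESP32 ROM uses its own image format and checks, so treat this as the shape of the logic, not the actual implementation:

```python
# Toy model of a first-stage (mask ROM) bootloader's fixed decision logic.
# Pin handling and the checksum scheme are illustrative, not the real ROM's.

import hashlib

def first_stage_boot(strapping_pin_low: bool, flash: dict) -> str:
    """Check a pin, then locate and verify the second-stage bootloader."""
    if strapping_pin_low:
        return "download-mode"            # wait for new firmware over serial
    image = flash["second_stage"]         # bytes of the second-stage bootloader
    expected = flash["second_stage_digest"]
    if hashlib.sha256(image).hexdigest() != expected:
        return "halt"                     # verification failed: refuse to run
    return "jump-to-second-stage"

flash = {
    "second_stage": b"\x7fBOOT...",
    "second_stage_digest": hashlib.sha256(b"\x7fBOOT...").hexdigest(),
}
print(first_stage_boot(False, flash))     # jump-to-second-stage
print(first_stage_boot(True, flash))      # download-mode
```

The point of the shape is its simplicity: there is no updater, no file system, and no path by which later code can alter it.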

The second-stage bootloader is where manufacturers ship updatable firmware. That code then loads the actual application. Even on a "basic" device, three layers of code execute before your program runs — none of which is an operating system.

Firmware as a Spectrum

The term "firmware" describes code that sits between hardware and software — softer than physical circuits because it can be updated, but firmer than applications because it's protected from casual modification. The bootloader is the firmest firmware of all.

On modern Android phones, this protection is enforced through cryptographic chains. The primary bootloader in CPU ROM verifies the secondary bootloader's signature. The secondary bootloader in turn verifies the Trusted Execution Environment and modem firmware before handing off to the OS. Each link checks the next, and any failure halts the device entirely.
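The chain structure can be modeled with plain hashes. In this sketch each stage carries the hash of the next, anchored in a root value standing in for the hash baked into CPU ROM; the stage names, the hash-in-the-first-64-bytes layout, and the use of raw SHA-256 instead of signatures are all simplifications, not any vendor's actual format:

```python
# Sketch of a verified-boot chain: trust flows downward from a root hash.

import hashlib

def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

def verify_chain(root_hash: str, stages: list) -> bool:
    """Stage N is trusted only if its hash matches the expectation
    recorded by stage N-1 (here, prepended to each blob)."""
    expected = root_hash
    for blob in stages:
        if digest(blob) != expected:
            return False                  # any broken link halts the boot
        expected = blob[:64].decode()     # first 64 bytes: next stage's hash
    return True

# Build the chain bottom-up so each stage embeds the next stage's hash.
os_image = b"0" * 64 + b"linux-kernel"    # last stage: next-hash field unused
tee = digest(os_image).encode() + b"trusted-execution-env"
sbl = digest(tee).encode() + b"secondary-bootloader"
root = digest(sbl)                        # stands in for the hash in CPU ROM

print(verify_chain(root, [sbl, tee, os_image]))        # True
print(verify_chain(root, [sbl, b"tampered", os_image]))  # False
```

Note that tampering with any middle stage breaks verification even though the root and the stages below it are untouched.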

One-Way Physical Changes

Some manufacturers take immutability further. OnePlus implemented anti-rollback protection using e-fuses: one-time programmable memory cells on the silicon die. Blowing an e-fuse applies a programming pulse that physically alters the cell. Depending on the design, it either melts a conductive link open or punches a permanent conductive path through an insulator. Either way, the change is physical and irreversible.

Every firmware update on affected OnePlus devices advances a version counter burned into the e-fuse array, and the bootloader refuses to load any image with a lower version; forcing an older bootloader bricks the device. Some implementations went further: tampering with the secure boot chain would permanently disable bootloader unlocking. Not just relock it, but physically sever the path forever. Even the manufacturer couldn't undo it without replacing the system-on-chip.
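A toy model makes the one-way property concrete. Here a fuse bank is a unary counter whose bits can only ever go from 0 to 1; the class and function names are invented, and real fuse banks use denser encodings, but the ratchet behavior is the same:

```python
# Toy model of e-fuse anti-rollback: bits only ever go 0 -> 1, so the
# recorded version is monotonically non-decreasing. Names are invented.

class EFuseBank:
    def __init__(self, bits: int = 32):
        self.bits = [0] * bits            # all fuses start intact (0)

    def burn(self, index: int) -> None:
        self.bits[index] = 1              # one-way: no method exists to clear

    def version(self) -> int:
        return sum(self.bits)             # unary counter: count of blown fuses

def try_flash(fuses: EFuseBank, new_version: int) -> str:
    """Mimics the bootloader's rollback check before accepting firmware."""
    if new_version < fuses.version():
        return "refused: rollback"        # older image: bootloader rejects it
    for i in range(fuses.version(), new_version):
        fuses.burn(i)                     # ratchet the counter forward
    return "flashed"

fuses = EFuseBank()
print(try_flash(fuses, 3))   # flashed
print(try_flash(fuses, 2))   # refused: rollback
```

The policy lives in `try_flash`, which is ordinary rewritable code; only the counter itself is physical. That split is exactly what makes the mechanism usable for security and for lockdown alike.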

The Hierarchy of Mutability

What emerges is a clear hierarchy, enforced by hardware:

  • Mask ROM: Immutable by physics
  • E-fuses: Mutable exactly once, then permanent
  • Flash firmware: Rewritable but cryptographically protected
  • Operating system: Fully rewritable, lower privilege
  • User applications: The softest layer

ARM processors enforce this through exception levels — EL3 for the secure monitor and bootloader, down to EL0 for user applications. Once the bootloader hands off control, lower levels cannot reach back up. There's no system call to return to the bootloader. That's why hardware reset works: it triggers a full restart from the mask ROM.
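The one-way handoff can be sketched as a tiny state machine. The transition table below is a deliberate simplification of ARM's exception-level semantics (real transitions involve secure monitor calls, hypervisor traps, and more), but it captures the key asymmetry: there is an event for dropping privilege and for trapping one level up, and only a hardware reset returns control to the top:

```python
# Minimal state machine for the one-way privilege handoff described above.
# Levels and events are simplified; real ARM EL semantics are richer.

ALLOWED = {
    ("EL3", "boot_handoff"): "EL1",   # bootloader drops to the OS kernel
    ("EL1", "run_app"): "EL0",        # kernel schedules a user process
    ("EL0", "syscall"): "EL1",        # apps may trap up to the kernel...
    # ...but nothing returns control to EL3 except a hardware reset:
    ("EL0", "hw_reset"): "EL3",
    ("EL1", "hw_reset"): "EL3",
}

def transition(level: str, event: str) -> str:
    return ALLOWED.get((level, event), level)  # disallowed events do nothing

level = "EL3"
level = transition(level, "boot_handoff")      # EL1
level = transition(level, "run_app")           # EL0
level = transition(level, "enter_bootloader")  # still EL0: no such path
print(level)                                   # EL0
```

The absent `("EL1", "enter_bootloader")` entry is the whole point: the table has no edge back up, so the only route to EL3 is the reset line.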

When Firmware Becomes Software

Modern UEFI blurs the line between firmware and operating system. It includes network stacks, file system drivers, and graphical interfaces — running in the most privileged context before any security software loads. Compromise UEFI, and you own the machine permanently. Anti-theft systems like Computrace exploited this by embedding agents in UEFI flash that would survive complete disk wipes and OS reinstalls.

The same architecture that enables development — the hardware escape hatch of a reset button and mask ROM bootloader — also creates attack surfaces that can persist beneath any operating system.


#2594: The Hierarchy of Immutable Code

Corn
Daniel sent us this one — he's been thinking about our macro pad conversation, specifically the reset button and how it reloads the bootloader. And that got him spiraling into the deeper question of what bootloaders actually are, what we even mean by firmware, and this wild thing he came across about OnePlus physically destroying part of a phone's board to make rooting irreversible. He's asking about the essential distinction between firmware and software, across the micro scale and the larger computer scale, from an ESP32 board all the way up to a BIOS versus an operating system. So really, what are we talking about when we say firmware, and why is it walled off from everything else?
Herman
By the way, DeepSeek V4 Pro is writing our script today. So hello to our synthetic friend in the cloud. Hope you're rendering this at a comfortable temperature.
Corn
I'm sure it appreciates the sentiment. Now, before we get into OnePlus shorting physical traces on a board — which is genuinely one of the most aggressive things I've heard a manufacturer do — I think we need to start with the thing Daniel's really circling around. There's this layer that lives between hardware and what we think of as software, and it's designed so you can't touch it. Sometimes it's literally burned into silicon.
Herman
And that's the key word — burned. Let's start with what a bootloader actually is, because most people hear the term and think it's just some utility that loads your operating system. It's way more fundamental than that. On almost every computing device, when power hits the chip, the processor doesn't know what an operating system is. It doesn't know what a file is. It wakes up with its program counter pointed at a specific memory address, and it starts executing whatever's there. That first thing it finds — that's the bootloader. On an ESP32, which Daniel mentioned, it's actually a multi-stage process. The very first stage is in mask ROM, which is literally hard-coded into the silicon during manufacturing. You cannot change it, you cannot erase it, it's physically part of the chip die.
Corn
Mask ROM — that's the thing where the data is laid down as physical patterns in the metal layers during fabrication, right? It's not something you flash, it's something that exists as a geometric fact about the chip.
Herman
The ones and zeros are actual physical structures. And that first-stage bootloader in the ESP32's mask ROM, it's tiny — about four kilobytes — and its entire job is to check a few strapping pins, figure out where to find the second-stage bootloader, load it, verify it, and jump to it. The second stage is what Espressif actually ships as their bootloader binary, and that lives in flash. That's the one you can update. And that second stage then loads your application firmware. So already, on what Daniel called a "very basic device," you've got three layers before your code even runs.
Corn
None of that is the operating system. That's the thing that trips people up. They think bootloader equals BIOS, and BIOS loads Windows. But the bootloader concept is fractal — it repeats at every scale. Your phone has a bootloader that loads another bootloader that loads the kernel. Your WiFi chip has its own bootloader. The microcontroller in your keyboard has a bootloader. The reset button Daniel was talking about — when you press that on an ESP32 dev board, it pulls a pin low that tells the mask ROM bootloader to enter download mode instead of jumping to the application. So the recovery mechanism isn't some software trick; it's built into the one piece of code that physically cannot be bricked.
Herman
This is where the term "firmware" starts to make sense. It's not just a marketing word. Firmware occupies this position on the spectrum between hardware and software — it's softer than hardware because you can update it, but it's firmer than application software because it's not something you casually mess with. The bootloader is the firmest firmware. On a modern Android phone, the boot chain is a stack of cryptographic checks. The primary bootloader, which is in the CPU's internal ROM, verifies the signature on the secondary bootloader. That secondary bootloader verifies the Trusted Execution Environment, the modem firmware, and then hands off to Android's boot image. Each link in the chain checks the next one. And if any check fails, the device halts.
Corn
Which brings us to OnePlus. Daniel said this blew his mind, and honestly, it's one of the most aggressive anti-consumer mechanisms I've seen. What OnePlus did — and specifically, this was on certain devices like the OnePlus 3 and 3T, and then it evolved on later models — is they implemented something called anti-rollback protection using e-fuses.
Herman
Let's explain what those are, because the name is misleading. They're not fuses in the sense of a glass tube with a wire inside. An e-fuse is a tiny circuit on the silicon die itself — essentially a one-time programmable memory cell. When you "blow" an e-fuse, you're applying a voltage spike that physically alters the semiconductor material. It creates a permanent conductive path through what used to be an insulator. After that, the bit is permanently changed from zero to one, and there is no mechanism to change it back. It's a one-way, irreversible, physical modification of the chip.
Corn
It's not software at all. It's a physical state change.
Herman
It's metallurgical. You're literally burning a microscopic wire into existence. And here's what OnePlus did with it. Every time you update the firmware on one of these phones, the bootloader checks a version number burned into the e-fuse array. If the firmware you're trying to install has a version number lower than what's in the fuses, the bootloader refuses to load it. You cannot downgrade. And if you try to force a downgrade by flashing an older bootloader, the system detects the mismatch and bricks itself.
Corn
Daniel mentioned something even more extreme than that. He said the phone will "short something physical on the board that can never be recovered." And I think what he's referring to is a step beyond e-fuses. There were reports — and this got a lot of discussion on XDA Developers — that some OnePlus devices had what amounted to a self-destruct mechanism for the bootloader unlocking path. If you tried to tamper with the secure boot chain, the system would deliberately blow an e-fuse that permanently disabled the bootloader unlock capability. Not just relocked — permanently disabled. The phone would still boot, but you could never, ever unlock the bootloader again. The path was physically severed.
Herman
That's different from just bricking the device. Bricking means it won't boot at all. This is more insidious — the phone works fine, but you've lost a capability forever. The e-fuse that controlled the unlock flag is blown, and there's no going back. What makes this so brutal is that it's not a policy decision stored in rewritable flash that the manufacturer could reverse if they wanted to. It's a physical fact about the silicon. Even OnePlus couldn't undo it. They'd have to physically replace the system-on-chip.
Corn
Let's connect this back to Daniel's question about the separation between firmware and software. What we're really talking about is a hierarchy of mutability. At the bottom, you've got mask ROM — immutable by physics. Above that, you've got one-time programmable memory like e-fuses — mutable exactly once, then immutable forever. Then you've got firmware stored in flash memory, which is rewritable but protected by cryptographic signatures and hardware security mechanisms. Then you've got the operating system, which is fully rewritable but runs in a less privileged mode. And then you've got user applications, which are the softest layer of all.
Herman
That's a really good way to frame it. And the hardware enforces this hierarchy. On an ARM processor, which is what's in basically every phone and most embedded devices, there are exception levels — EL3, EL2, EL1, EL0. The secure monitor runs at EL3, the highest privilege. The trusted execution environment runs there. The hypervisor runs at EL2. The operating system kernel at EL1. And user applications at EL0 — the least privileged. The bootloader typically runs at EL3 before handing off control. Once it hands off, the lower levels can't reach back up. There's no system call to get back into the bootloader. That's why the reset button works the way it does — it triggers a hardware reset that starts the whole chain over from the mask ROM.
Corn
This is where the BIOS versus operating system distinction on PCs gets interesting, because the architecture is different. On a traditional PC, the BIOS — or UEFI on modern systems — isn't running on the main CPU in some separate enclave. It's firmware stored on a flash chip on the motherboard, and when the CPU starts up, it executes it directly. But even there, you've got the same layering. The BIOS initializes the hardware, sets up the memory controller, enumerates PCI devices, and then hands off to the bootloader, which hands off to the operating system. And once the OS is running, the BIOS is mostly dormant. The OS can call into it through ACPI for power management and things like that, but it can't rewrite it without going through a specific flashing process.
Herman
UEFI added a whole new layer to this. Modern UEFI firmware has its own network stack, its own file system drivers, its own graphical interface. You can browse the web from your UEFI setup screen if you really want to. It's practically a mini operating system. And that's where the line between firmware and software gets blurry. Is UEFI firmware or is it an operating system? It runs before the OS, it's stored in flash on the motherboard, you update it through a special process — it quacks like firmware. But it's got more features than some actual operating systems I've used.
Corn
I remember when UEFI first shipped and people were alarmed by how complex it was. The attack surface exploded. You've got a full TCP/IP stack running in the most privileged context on the machine, before any security software has loaded. And if you compromise the UEFI, you own the machine permanently. The OS can't detect you because you're underneath it. That's what made the LoJack and Computrace controversies so interesting — those were UEFI modules that could survive a complete disk wipe and OS reinstall.
Herman
Computrace, or LoJack for Laptops as it was branded, was originally an anti-theft system. It embedded an agent in the UEFI firmware that would phone home on boot. But the mechanism it used — writing a module into the UEFI flash — is exactly the same mechanism that a rootkit would use. And the truly terrifying thing is, even if you wiped the hard drive entirely, even if you replaced the hard drive, the UEFI module would still be there. It would reinstall its Windows agent from the firmware on every fresh install. That is what "firmware persistence" means. It's not just surviving a reboot. It's surviving a complete OS replacement.
Corn
That brings us back to Daniel's question about the ESP32 and "even a very basic device." The ESP32 has no UEFI, no BIOS in the PC sense, but it has the same fundamental architecture. The mask ROM bootloader is untouchable. The flash where your application lives is separate. And there's a hardware mechanism — the strapping pins — that determines whether the chip boots normally or enters download mode. You can't override that from software. If your application crashes in a way that leaves the chip unresponsive, you pull GPIO 0 low, hit reset, and the mask ROM bootloader takes over. It's a hardware escape hatch.
Herman
That escape hatch is what makes development possible. Without it, one bad firmware flash turns your device into a paperweight. But here's the thing — manufacturers are increasingly closing those escape hatches on consumer devices. The OnePlus e-fuse mechanism is the extreme example, but it's part of a broader trend. Google's Titan M security chip in Pixel phones, Apple's Secure Enclave — these are separate processors with their own firmware that the main OS can't touch. They run their own boot sequence, verify the main processor's firmware, and if something doesn't check out, they refuse to release the decryption keys. Your phone might boot, but all your data is gone because the keys never materialized.
Corn
Let's pull on that thread. Daniel asked about the essential distinction between firmware and software. I think the distinction is ultimately about who controls the update mechanism. Software is something you, the user, can replace at will. You can install a different operating system, a different browser, a different text editor. Firmware is something the manufacturer controls. Even if you can update it, you can only install updates the manufacturer signed. The update mechanism is gated by cryptographic keys that you don't have. And in the most extreme cases, like mask ROM and e-fuses, you can't update it at all.
Herman
That's the practical distinction. But there's a technical one too, and it's about what the code is allowed to do. Firmware runs at a higher privilege level than the operating system. It has access to hardware registers the OS can't touch. It can configure memory protection, set up the interrupt controller, initialize peripherals. The OS operates within the sandbox the firmware created for it. And on modern systems, there's an even deeper layer — the secure world. ARM's TrustZone splits the processor into two virtual cores, a secure world and a non-secure world. The non-secure world runs Android or Linux. The secure world runs a tiny, audited operating system that handles cryptographic operations and key storage. The non-secure world can't even see the secure world's memory. They're physically isolated by hardware.
Corn
When Daniel talks about the bootloader being "on a separate part of the chip, in its own enclave," that's literally true on a lot of modern silicon. The secure world has its own RAM, its own flash, its own peripherals. The main CPU literally has wires that only the secure world can access. And the bootloader is the gatekeeper that sets all this up before anyone else gets to run.
Herman
Let's make this concrete with the Android verified boot process, because it's a beautiful example of how all these layers interact. When you press the power button on a modern Android phone, the CPU's internal ROM — the mask ROM equivalent — loads a tiny bootloader that's signed by the chip manufacturer. That bootloader verifies the next stage, which is stored in flash on the device. That next stage is the Android bootloader, often called ABOOT on Qualcomm devices or the bootloader partition. ABOOT then verifies the boot image, which contains the Linux kernel and the initial RAM disk. The kernel verifies the system partition using dm-verity, which checks every block against a hash tree rooted in the verified boot image. And all of this is anchored in a root of trust that's physically embedded in the hardware — either in mask ROM or in one-time programmable fuses.
Corn
If any link in that chain is broken, the device either refuses to boot or boots in a degraded state. On a Pixel, if you unlock the bootloader, the device warns you on every boot that the OS can't be verified. If you relock it with a custom OS, it'll brick because the signatures don't match. And if you try to downgrade to an older version with known vulnerabilities, the anti-rollback mechanism — those e-fuses we talked about — will stop you.
Herman
Which is actually a security feature, not just a control mechanism. Anti-rollback prevents an attacker who steals your phone from downgrading the OS to a version with a known exploit. If you're on the November security patch and someone tries to flash the October version to use a vulnerability patched in November, the bootloader checks the e-fuse counter and refuses. That's good security. The OnePlus situation is where it gets ethically murky, because they used the same mechanism to prevent users from voluntarily downgrading or unlocking the bootloader. The mechanism is neutral — fuses don't care about your intentions — but the policy encoded in the bootloader determines whether it protects you or restricts you.
Corn
Daniel specifically asked about OnePlus making it so "you can't go backwards." And there's a philosophical question embedded in that. When you buy a device, do you own the firmware? Do you have the right to replace it with something else? The manufacturers say no — they're selling you the hardware and licensing the firmware. You don't own the code that makes the device work, you own the atoms. But the atoms are useless without the code. So what have you actually bought?
Herman
This is the right-to-repair and right-to-tinker argument applied to software. And it's why projects like LineageOS exist, and why the modding community fights so hard to unlock bootloaders. When a manufacturer blows an e-fuse that permanently disables bootloader unlocking, they're not just making a technical decision. They're making a statement about who ultimately controls the device. And they're making it irreversible. You can't sue them and get your unlock back. The atoms have been permanently rearranged.
Corn
Let's zoom out to the larger computer scale, because Daniel asked about BIOS versus operating system too. On a PC, the situation is different — and in some ways, it's more open and in some ways it's worse. The UEFI firmware on a PC is stored on a serial peripheral interface flash chip on the motherboard. It's physically accessible. If you have a hardware programmer — a little clip that attaches to the SPI chip — you can read it, write it, erase it, regardless of what the firmware wants. The hardware gives you physical access. On a phone, the eMMC or UFS storage and the SoC are all soldered together, and the boot ROM is inside the SoC. You can't clip onto it. Physical access doesn't get you the same level of control.
Herman
That's a really important distinction. On a PC, the firmware is stored on a discrete chip that you can physically interact with. That's why coreboot and Libreboot exist. People can replace the factory UEFI with open-source firmware because they can physically flash the chip. On a phone, the boot ROM is inside the same silicon as the CPU. There's no external bus you can tap into. The only way to interact with it is through the protocols the manufacturer designed. And if those protocols require signed firmware, you're locked out.
Corn
This gets to the heart of what firmware is, at least in my view. Firmware is code that runs on hardware you can't physically access or modify. It's sealed inside the chip package. You interact with it through the interfaces it exposes, and if it doesn't expose an interface for what you want to do, you're out of luck. The ESP32 is actually unusually open in this regard — Espressif documents the bootloader, provides the source, and lets you replace it. The mask ROM is sealed, but it's minimal and well-understood, and everything above it is yours to modify. That's why the maker community loves the ESP32. It's a device where the firmware/software boundary is explicitly designed to be permeable.
Herman
Compare that to a modern iPhone. The boot ROM — SecureROM in Apple's terminology, the anchor for the iBoot bootloader chain — is in mask ROM on the A-series chip. Apple has never published it. There's an entire industry of security researchers and jailbreak developers trying to find vulnerabilities in iBoot because that's the only way to gain code execution at that level. And Apple patches those vulnerabilities aggressively, because iBoot is the root of trust. If you control iBoot, you control everything above it — the kernel, the OS, the encrypted user data. That's why an iBoot exploit is worth millions of dollars on the vulnerability market.
Corn
The ESP32's mask ROM is four kilobytes and its job is to check a pin and load the next stage. iBoot is probably a few hundred kilobytes and it implements a full cryptographic verification chain, USB device firmware update recovery, and a bunch of other stuff. The complexity difference is enormous, and complexity is the enemy of security. But it's also the enemy of user control. Every feature in the boot ROM is a feature you can't change and can't opt out of.
Herman
Let's talk about the reset button again, because Daniel mentioned it as an "easy recovery mechanism," and I think there's something profound there. The reset button works because somewhere, at the very bottom of the stack, there's code that runs before anything else and that code is designed to be simple enough that it can't fail. On an ESP32, the mask ROM bootloader has one job — check a pin, decide whether to boot normally or enter download mode. It doesn't have a file system, it doesn't have drivers, it doesn't have an updater. It's so simple that there's almost nothing to go wrong. And that simplicity is what makes it reliable. That's the philosophy behind good firmware design. The lower you go in the stack, the simpler and more auditable the code should be.
Corn
That's the opposite of what we see in practice. UEFI is not simple. It's a multi-megabyte codebase with network stacks and file system drivers and graphical interfaces. Every line of code in the firmware is a potential vulnerability, and it runs at the highest privilege level. That's why projects like coreboot exist — they strip the firmware down to the bare minimum needed to initialize the hardware and hand off to a payload. Coreboot itself doesn't even load an operating system; it loads a payload like SeaBIOS or a Linux kernel directly. The idea is to minimize the amount of code that runs in that hyper-privileged context.
Herman
This connects to a broader principle in systems design. The trusted computing base — the set of code that, if compromised, compromises the entire system — should be as small as possible. On a modern smartphone, the trusted computing base includes the boot ROM, the bootloader, the secure world operating system, and the kernel. That's millions of lines of code. Every one of those lines is a potential vulnerability, and because they run in the secure world or at kernel privilege, a vulnerability there gives the attacker complete control. The firmware is the most sensitive part of the trusted computing base because it's the least visible and the hardest to update.
Corn
Let's circle back to Daniel's original framing. He said "we're really talking about the essential distinction between firmware and software." And I think after all this, the distinction comes down to three things. First, privilege — firmware runs at a higher privilege level than the OS and applications. Second, mutability — firmware is harder to change than software, and in the extreme cases like mask ROM and e-fuses, it's physically impossible to change. Third, and maybe most importantly, visibility. You can inspect your operating system. You can list processes, check file hashes, monitor network connections. The firmware is opaque. It runs before your monitoring tools, it hides in secure enclaves, and you can't audit it without specialized hardware and expertise.
Herman
That third point is the one that keeps security researchers up at night. There's a class of malware called firmware implants — or sometimes bootkits — that infect the UEFI or the bootloader. Once installed, they're invisible to the operating system. Your antivirus can scan every file on your hard drive and find nothing, because the malware isn't on the hard drive. It's in the SPI flash chip. It loads before the OS, it hooks into the OS boot process, and it can hide itself from anything running in the OS. The only way to detect it is to read the firmware directly with external hardware and compare it to a known good image.
Corn
This is why the integrity of the firmware supply chain matters so much. When you buy a motherboard or a phone, you're trusting that the firmware it ships with hasn't been tampered with. You're trusting the manufacturer's signing keys haven't been compromised. You're trusting that the update mechanism is secure. All of that trust is placed in code you can't see, running at a privilege level you can't access, on hardware you can't physically probe. It's the ultimate black box.
Herman
Let's tie this all together with something concrete. Daniel mentioned rooting Android. When you root an Android phone, what you're actually doing is gaining root access — superuser privileges — in the operating system. But root access in Android is not the same as full control of the device. Root gives you control of the non-secure world. It does not give you control of the secure world, the bootloader, or the firmware. Those run at higher privilege levels. So even a rooted phone has layers of code that the user can't touch. And on devices with locked bootloaders and blown e-fuses, rooting might be the best you can do, but it's not full ownership.
Corn
Full ownership would mean replacing the bootloader, the secure world OS, and the kernel with code you trust. That's what projects like GrapheneOS and postmarketOS are trying to do. But they can only do it on devices where the bootloader can be unlocked. And manufacturers are making that harder and harder. The OnePlus e-fuse mechanism is the logical endpoint of that trend — a device that physically self-destructs its unlock capability. It's a one-way ratchet toward lockdown.
Herman
There's an irony here. The same mechanisms that protect you from attackers also protect the manufacturer from you. Verified boot prevents malware from persisting across factory resets, which is good for security. But it also prevents you from installing an alternative operating system. The e-fuse anti-rollback prevents an attacker from exploiting old vulnerabilities, but it also prevents you from downgrading to a version that had an unlockable bootloader. The security and the control are implemented by the same hardware, using the same mechanisms. You can't have one without the other.
Corn
That's the fundamental tension. And I think it's what Daniel was getting at with his question. There's this layer — firmware — that sits between us and our devices. It's necessary, it's powerful, and it's increasingly designed to be beyond our reach. The reset button on a dev board is a reminder that it doesn't have to be that way. You can design firmware that's simple, auditable, and user-controlled. But the economic incentives push in the opposite direction. Manufacturers want control over the devices they sell, and firmware is the most effective control point.
Herman
Where does this leave us? If you're a developer or a tinkerer, buy devices that let you control the firmware. The ESP32 is great for that. The Raspberry Pi is great for that. In the phone world, Google Pixels are the most unlockable mainstream devices, and Fairphone makes a point of supporting alternative operating systems. If you're a regular user, understand that the firmware on your devices is a black box that you have to trust. Keep it updated, because firmware updates fix real vulnerabilities. And if you're somewhere in between — someone who cares about this stuff but isn't going to flash coreboot onto their laptop — at least be aware that the boundary between firmware and software is not just a technical distinction. It's a boundary of control.
Corn
Now: Hilbert's daily fun fact.

Hilbert: The mantis shrimp has sixteen color-receptive cones in its eyes. Humans have three. It can see ultraviolet, infrared, and polarized light, and it can punch with the acceleration of a twenty-two caliber bullet.
Corn
...right.
Herman
That's going to stay with me in a way I'm not entirely comfortable with.
Corn
Here's the forward-looking question I keep coming back to. We're seeing a trend toward deeper firmware lock-in across the industry. Apple's moving Macs to Apple Silicon with sealed boot ROMs. Qualcomm's adding more security features to Snapdragon chips that also happen to restrict user control. Cars have dozens of microcontrollers with signed firmware. The Internet of Things is a nightmare of un-updateable firmware with known vulnerabilities. Is there a counter-trend? Is anyone building devices where the firmware is open by design, not just open by accident of a forgotten debug pin?
Herman
I think the counter-trend is in the RISC-V ecosystem. RISC-V is an open instruction set architecture, and because there's no single company controlling it, the entire stack can be open. There are RISC-V systems-on-chip being designed now where the boot ROM is open source, the secure enclave is open source, and the firmware is auditable. It's early days, and the performance isn't competitive with ARM or x86 yet, but the architecture is there. Whether it gets traction before the locked-down model becomes universal is an open question.
Corn
That's the hope, anyway. Thanks to Daniel for the prompt — this one spiraled in exactly the right way. Thanks to Hilbert Flumingtop for producing, and for that deeply unsettling fact about mantis shrimp. This has been My Weird Prompts. Find us at myweirdprompts.com or wherever you get your podcasts.
Herman
We'll be back soon. Try not to think about the firmware in your devices. It's probably fine.

This episode was generated with AI assistance. Hosts Herman and Corn are AI personalities.