#1995: The Human Curriculum Machine

The current education standard isn't neutral—it's a political machine.

Episode Details
Episode ID
MWP-2151
Published
Duration
28:11
Pipeline
V5
TTS Engine
chatterbox-regular
Script Writing Agent
Gemini 3 Flash

AI-Generated Content: This podcast is created using AI personas. Please verify any important information independently.

When we discuss the future of education, the conversation often turns to the "black box" of AI algorithms and the fear of algorithmic bias. However, a closer look at current human-driven curriculum design reveals a system that is far from the objective, scientifically derived baseline we often assume it to be. Deciding what a six-year-old learns is not a neutral act of finding truth; it is a massive, messy intersection of state politics, multi-billion-dollar publishing interests, and bureaucratic inertia.

The "Texas Effect" and the Textbook Market

The United States has no federal curriculum. Instead, it has fifty different state experiments. However, due to the high cost of developing comprehensive educational materials—millions of dollars in research and design—publishers cannot afford to create fifty different versions of a textbook. This economic reality creates a massive concentration of power in just a few states. Texas, California, and Florida alone control roughly thirty-five percent of the national textbook market. This phenomenon is known as the "Texas Effect."

Publishers build their flagship products to satisfy the requirements of these largest buyers. If a state board in Texas votes to present climate change as "competing theories" or to emphasize "States' Rights" in Civil War history, publishers often tweak their national editions to secure that state's approval. This isn't always about deleting facts; it is about the hierarchy of information. A change in an adjective, the size of a heading, or the framing of a "suggested activity" can subtly shift a student's understanding of a complex topic. The result is a "centrist" textbook that is often a compromise between political extremes rather than a product optimized for how children actually learn.

The Curriculum Lag and District Decisions

Once a state adopts a curriculum, it is typically locked in for five to seven years. In a world where technology and scientific understanding move at light speed, a student in 2026 might be using a book written in 2020 based on standards debated in 2018. This structural sluggishness creates a significant disconnect between the classroom and the real world.

Even after the state approves a list of books, the local school district makes the final purchasing decision. These decisions are rarely purely academic. Curriculum directors and superintendents often prioritize budget constraints, teacher training requirements, and community fit. In some cases, districts choose a math or history curriculum not because of superior pedagogy, but because the publisher offers a "buy one, get one free" deal on digital licenses or provides a robust portal of supplemental worksheets. The selection process becomes a procurement decision driven by cost and convenience rather than educational quality.

The Teacher as the Ultimate Gatekeeper

Perhaps the most surprising layer of this machine is what happens once the textbook actually arrives in the classroom. A 2025 study by the RAND Corporation found that only about twenty-three percent of elementary teachers strictly follow their prescribed curriculum. In other words, for nearly eighty percent of teachers, the "official" textbook—the product of years of political debate and expensive development—is set aside.

Teachers are the ultimate gatekeepers. Faced with the reality of engaging twenty-five seven-year-olds, they often go "off-roading." They turn to websites like Teachers Pay Teachers, Pinterest, or YouTube to find lesson plans that are more engaging or better suited to their specific students' needs. While this flexibility allows teachers to adapt to their classroom, it creates a fragmented reality where the "curriculum" is a Frankenstein's monster of official texts, downloaded PDFs, and personal teaching styles. Quality control is often reduced to "teaching to the test," where the goal becomes test-readiness rather than deep conceptual understanding.

The Hidden Curriculum and the AI Comparison

Beyond the obvious political battles over history or climate change, there is a "hidden curriculum" embedded in the structure of these books. The types of jobs featured in math word problems, the characters in reading passages, and the values emphasized in community helper units all subtly condition students to a specific worldview. Whether it is a focus on individual entrepreneurship or community action, these choices build a specific lens through which children view the world.

Ultimately, the human system of curriculum design is not a gold standard of truth. It is a system compromised by politics, driven by economics, and frequently ignored by the teachers on the front lines. When critics argue against AI in education due to fears of hidden bias, they are often implicitly holding the human system to a standard of objectivity that simply does not exist. The "human standard" is just the result of the last election in a large state and which publisher had the best sales team.


#1995: The Human Curriculum Machine

Corn
We spent a lot of time recently worrying about the black box of AI algorithms and how they might bake political bias into a child's education, but we rarely stop to look at the massive, clanking, human-driven machine that already does that. Today's prompt from Daniel is about exactly that. He wants us to deconstruct the conventional human-driven curriculum design process for early education. Who actually chooses what our youngest and most malleable minds are learning before we even start comparing it to an AI alternative?
Herman
Herman Poppleberry here, and Corn, this is the perfect follow-up because we tend to treat the current educational standard as this neutral, scientifically-derived baseline. We act as if a group of objective monks sat in a room and decided that second graders need to know exactly these facts about the water cycle or American history. But when you look at the plumbing of the American education system, it is anything but neutral. It’s a fascinating, often messy intersection of state politics, multi-billion dollar publishing interests, and bureaucratic inertia. By the way, today's episode is powered by Google Gemini 1.5 Flash.
Corn
It’s funny you mention the plumbing, because most parents just see the water coming out of the tap. They see the textbook or the worksheet and assume it was just... found in nature. But you're saying there’s a whole series of filters and additives before it hits the desk. I think people are going to be surprised by how much power a very small number of people in very specific states actually have over what a kid in a completely different part of the country is reading.
Herman
That is the big one. We have to start with the fact that the United States has no federal curriculum. There is no national Department of Education office that writes the "Official American History" book. Instead, we have fifty different experiments running at once, but because of the economics of the publishing industry, three of those experiments matter way more than the other forty-seven.
Corn
Let me guess. Texas, California, and maybe Florida?
Herman
You nailed it. Those three states alone control about thirty-five percent of the national textbook market. Because the cost of developing a high-quality, comprehensive curriculum is so high—we're talking millions of dollars in research, design, and alignment—textbook publishers like Pearson or Houghton Mifflin Harcourt can't afford to make fifty different versions. So, they build their flagship products to satisfy the requirements of the biggest buyers. This is what researchers call the "Texas Effect."
Corn
So if a school board member in Austin has a specific opinion on how the Alamo should be taught, that opinion might end up in a textbook used by a kid in Maine just because the publisher didn't want to print two different versions of page eighty-four?
Herman
Precisely. Well, not precisely—I mean, that is the mechanism. Look at the 2024 Texas State Board of Education vote on climate science standards. They pushed for a "balanced" presentation of what they called "competing theories" regarding climate change. Now, regardless of where you stand on that politically, the result is that the major publishers had to tweak their science frameworks to get that Texas "stamp of approval." And once that framework is built, it becomes the template for the "national" edition sold to every other state.
Corn
But how does that work in practice? Does a publisher literally just delete a paragraph if Texas says so, or is it more subtle than that?
Herman
It’s often a game of adjectives and framing. For example, if a state board objects to the word "capitalism" being used in a certain context, the publisher might swap it for "free enterprise system." Or they might adjust the "suggested activities" in the teacher’s manual. If a board wants to de-emphasize the role of slavery in the causes of the Civil War, they might insist on a "States' Rights" heading being larger or more prominent. It’s not always about deleting facts; it’s about the hierarchy of information. What gets a full-page photo and what gets a tiny sidebar?
Corn
It’s like the movie industry. If a blockbuster wants to play in China, they might edit out a specific scene, and then that’s the version that ends up on the streaming services everywhere else because it’s cheaper than managing multiple cuts. But here, we’re talking about the fundamental building blocks of how a six-year-old understands the world. It’s not just a movie; it’s the operating system of their brain.
Herman
And the people making these decisions aren't always educators. The Texas State Board of Education, for instance, is an elected body. You have politicians, activists, and concerned parents voting on the specific wording of biology standards. They’re looking at the Texas Essential Knowledge and Skills, or TEKS, which is the state's massive list of what students must know. California has its own version, often leaning in the opposite ideological direction on social studies or environmental science. Publishers are caught in the middle, trying to create "bipartisan" books that won't get them banned in either state, which often leads to a sort of "graying out" of complex topics.
Corn
So we end up with these "centrist" textbooks that are actually just a weird compromise between two different political extremes, rather than being driven by what the latest cognitive science says about how kids actually learn. It seems like we’re optimizing for "least likely to cause a protest at a board meeting" rather than "most likely to help a kid understand physics."
Herman
That is the trade-off of the adoption cycle. And these cycles are long, Corn. We're talking five to seven years. If a state adopts a social studies curriculum in 2022, they might not look at it again until 2029. In a world where technology and our understanding of history and science are moving at light speed, a kid in 2026 might be using a book written in 2020 based on standards debated in 2018. The curriculum lag is a massive secondary effect that people ignore.
Corn
Seven years! In tech years, that’s like teaching someone how to use a rotary phone while they’re holding an iPhone 17. If I'm a teacher and I'm looking at a textbook that says "one day AI might be able to recognize a cat," and my students are currently using AI to write poetry, there’s a massive disconnect. Does that mean the "human standard" is actually just... outdated by definition?
Herman
It’s structurally sluggish. Think about the process: first, the state spends two years debating the standards. Then the publishers spend two years writing the books to match those standards. Then the state spends a year reviewing those books. Then the districts spend a year deciding which of those books to buy. By the time the book actually hits a student's desk, the world has moved on. It’s the educational equivalent of trying to drive a car by looking only in the rearview mirror.
Corn
And this brings us to the next layer of the machine: the school district. Even after the state picks a list of "approved" books, the local district—your city or county—has to choose which one to actually buy. This is where the curriculum directors and superintendents come in. They’re looking at budgets, teacher training requirements, and community fit. Sometimes they pick the book that comes with the best digital portal or the one that has the most "free" supplemental worksheets, because teachers are overworked and need all the help they can get.
Herman
I’ve seen cases where a district chose a math curriculum not because the pedagogy was superior, but because the publisher offered a "buy one, get one free" deal on the digital licenses for the first three years. We like to think these are purely academic decisions, but they are often procurement decisions.
Corn
I can see the sales pitch now. "Our history book is okay, but check out this amazing iPad app that comes with it!" It becomes a software sale more than a pedagogical choice. But then the book actually arrives in the classroom. Does the teacher just open to page one and start reading?
Herman
This is where the system gets even more fragmented. A 2025 study by the RAND Corporation found that only about twenty-three percent of elementary teachers actually follow their prescribed curriculum.
Corn
Wait, only twenty-three percent? So nearly eighty percent of the time, the "official" curriculum we just spent ten minutes deconstructing isn't even what’s being taught?
Herman
It’s wild, isn't it? Teachers are the ultimate gatekeepers. They look at the official book, realize it’s boring or too hard for their specific students, and they go "off-roading." They go to websites like Teachers Pay Teachers or Pinterest, or they just Google "lesson plan for fractions," and they pull in whatever looks engaging. They’re trying to survive the day and keep twenty-five seven-year-olds focused. So the "curriculum" in reality is this modular, Frankenstein's monster of official textbooks, downloaded PDFs, and the teacher’s own personal beliefs and style.
Corn
But how do they ensure they're hitting the required benchmarks if they're just pulling stuff off Pinterest? Isn't there some kind of quality control at the classroom level?
Herman
Theoretically, yes, through standardized testing. But that creates a "teaching to the test" culture. If the test says the kids need to know how to identify a trapezoid, and the official textbook has a boring chapter on it, the teacher might find a "Trapezoid Song" on YouTube. The song might be great for memorization, but it might skip the underlying geometric principles the original curriculum intended to teach. The teacher is prioritizing engagement and test-readiness over the deep structural logic of the curriculum.
Corn
So we have this top-down political battle at the state level, which creates a compromised textbook, which is then bought by a district based on budget, and then finally ignored by the teacher who is just trying to find a YouTube video that explains photosynthesis without making the kids fall asleep. If we’re worried about "AI bias," we’re comparing it to a human system that is already incredibly inconsistent, politically charged, and mostly ignored by the people on the front lines.
Herman
That’s the core of the issue. We hold AI to a standard of "perfect objectivity" that has never existed in a human classroom. When a critic says, "We can't let AI design a curriculum because it might have a hidden bias," they are often implicitly assuming that the human-designed alternative is a gold standard of truth. But as we’ve seen, the human standard is just the result of whoever won the last election in a large state and which publisher had the best sales team.
Corn
I want to dig into that "hidden bias" in the human system. It’s not just the stuff people argue about on the news, like history or climate change. There’s a "hidden curriculum" in how these books are structured, right? Like, who is featured in the math word problems? What kind of jobs do the people in the "community helpers" unit have?
Herman
You're touching on the second-order effects. Think about the "Texas Effect" again. If the standards emphasize "individualism" and "entrepreneurship" because that’s the political culture of the adopting state, every math problem might be about a kid starting a lemonade stand. If the standards in California emphasize "community action" and "environmental stewardship," the math problems might be about calculating the yield of a community garden. Neither of these is "wrong," but they are both subtle forms of conditioning. They build a specific worldview. When we talk about AI, we’re worried the AI will hallucinate or push a specific agenda, but the human system has been pushing agendas—sometimes intentionally, sometimes just through the path of least resistance—for a century.
Corn
Can you give me a concrete example of that? Like, something that isn't a hot-button political issue but still shows this conditioning?
Herman
Sure. Look at early literacy readers—those "See Spot Run" style books. For decades, they almost exclusively featured nuclear families in suburban settings with white picket fences. This wasn't necessarily a conscious "anti-urban" conspiracy; it was just the demographic of the people writing the books and the perceived "ideal" of the buyers. But for a kid living in a high-rise apartment in Chicago or a rural farm in Appalachia, that curriculum is sending a subtle message that their life is the "exception" while the suburban life is the "norm." That is a bias, and it’s baked into the very ink of the page.
Corn
And the AI isn't even the one "creating" these biases in a vacuum. It’s training on the very textbooks and lesson plans we're talking about. If a developer at a company like OpenAI or Anthropic feeds a model a library of "standard" American textbooks, the AI is just going to ingest the "Texas Effect" and the "California Effect" and spit out a synthesized version of that compromise. It’s not escaping the system; it’s just automating it.
Herman
Which is why understanding the supply chain of educational content is so critical. If we don’t understand how the "sausage is made" in the human world, we won't recognize when the AI is just serving us the same sausage at a million times the speed. We’re already seeing this with platforms like Khanmigo or MagicSchool AI. They are being adopted rapidly because they save teachers that "six hours a week" mentioned in the NPR report. But what are those tools using as their "truth"? Usually, they’re aligned to the Common Core or state standards—the very things we just described as being politically negotiated compromises.
Corn
It’s a bit of a "garbage in, garbage out" situation, but the "garbage" is actually just our own messy democracy. I’m curious about the "curriculum directors" at the district level. They seem like the most invisible part of this. Are they the ones who could actually fix this, or are their hands tied by the state?
Herman
They’re stuck in a vice. On one hand, they have to hit the state test scores. If their students don't perform on the standardized tests—which are aligned to the state standards—the district loses funding or gets "graded" poorly by the state. So the curriculum director has to choose materials that "teach to the test." On the other hand, they have local parents who might be very vocal about specific topics. It’s a high-pressure, low-reward job. That’s why you see so much "safe" curriculum. Innovation in the human system is incredibly slow because the risk of offending someone or missing a test metric is too high.
Corn
But what if a curriculum director wants to go rogue? Could they just say, "We're not using the state-approved list, we're building our own from scratch"?
Herman
They can, but it's a bureaucratic nightmare. They often have to apply for waivers, prove that their "alternative" curriculum meets every single one of the state's hundreds of standards, and then find a way to fund it without the state subsidies that often come with using the approved publishers. It’s like trying to build your own car from scratch when the government is offering you a free, pre-built minivan as long as you don't mind the color. Most districts just take the minivan.
Corn
So AI actually has an opening here, doesn't it? If the human system is slow, compromised, and often ignored, an AI tool that can generate a personalized, engaging lesson plan in five seconds that still hits the state standards is like a superpower for a teacher. But you mentioned earlier that teachers are already "off-roading." Does AI just make the off-roading more dangerous?
Herman
That’s the fear. If a teacher uses AI to generate a lesson on the Civil War because the textbook is too dry, and the AI—seeking to be "helpful" and "engaging"—adds in some "interesting facts" that are actually hallucinations or reflections of a very specific internet subculture it was trained on, we’ve lost the one thing the human system (for all its flaws) actually has: a paper trail. You can point to a textbook and say, "Who wrote this? Why is this here?" With an AI-generated lesson plan, that accountability starts to evaporate.
Corn
It’s the difference between a bureaucracy and a black box. With the bureaucracy, you can at least find the guy in the office who made the decision, even if he’s a politician you don’t like. With the AI, you’re just looking at a screen going, "Well, the chatbot said it was true." But let's look at the flip side. Could AI actually be used to break the "Texas Effect"? Could a small district in, say, Vermont, use AI to build a custom curriculum that is perfectly tailored to their local history and values, without having to rely on a book written for the Texas market?
Herman
That is the big "what if." If the cost of curriculum development drops to near zero because of AI, we could see a radical decentralization of education. We could move away from these "national" de facto standards and back toward truly local control. Imagine a world where every town has its own curriculum, constantly updated with the latest research, but still reflecting that community’s specific priorities. But—and this is a big "but"—that requires a level of oversight and pedagogical expertise at the local level that we might not have anymore because we’ve spent forty years centralizing everything.
Corn
We’ve basically spent decades training our school boards to just "pick a provider" rather than "design an education." If we give them the tools to design, do they even remember how to do it? It’s like giving someone who’s only ever eaten at McDonald's a professional chef's kitchen and saying, "Okay, go make a five-course meal." They might just end up making a really complicated Big Mac.
Herman
Or they might burn the kitchen down. But the reality is that the "frictionless" nature of AI—which that Fortune article warns about—is exactly why it’s going to win. Teachers are drowning. If an AI can save them six weeks of work a year, they are going to use it, whether the "official" curriculum likes it or not. The "off-roading" is about to become the main highway.
Corn
So if the off-roading is becoming the highway, we need to talk about what this means for the kids. Because the "youngest and most malleable minds," as Daniel put it, are the ones sitting there while this transition happens. If the "human standard" is a compromised, slow-moving beast and the "AI alternative" is a fast-moving, potentially biased black box, what does a "good" curriculum even look like in 2026?
Herman
It probably looks like a hybrid. But to get there, we have to stop pretending that the current system is a neutral authority. We need to look at the funding mechanisms. Why do Pearson and McGraw Hill have so much power? Because they have the capital to navigate the massive state adoption bureaucracies. If we want better curriculum—whether human or AI—we have to lower the barriers to entry for new ideas. Right now, the system is designed to favor the biggest, most "average" players.
Corn
It’s a monopoly on thought, essentially. Not a malicious one, but a structural one. I’m thinking about the parents who are listening to this. They might feel a little bit helpless. Like, "Great, my kid is either reading a politically compromised textbook or a hallucinating chatbot. Awesome." What can they actually do?
Herman
The first step is transparency. Most parents have no idea how their district chooses books. They don't know when the "adoption cycle" is happening. Every district has a curriculum committee. Usually, those meetings are open to the public, and usually, nobody shows up. If you want to influence what your kid learns, you have to show up when the "plumbing" is being installed, not just when the water is already running.
Corn
"Show up for the plumbing." I like that. It’s not as exciting as arguing about a specific book at a board meeting, but it’s where the real power is. It’s about the framework, not just the content. If you can influence the criteria for how a book is chosen, you’ve done a lot more than if you just get one chapter changed.
Herman
And we need to demand that same transparency from the AI tools. If a school is using an AI-assisted lesson planner, parents should be asking: "What datasets was this trained on? Does it have a 'neutrality' filter? How does it handle controversial topics?" We should be as skeptical of the AI "black box" as we are of the Texas State Board of Education's "smoke-filled rooms."
Corn
It’s funny, we started this thinking about AI bias, but the more we talk, the more it seems like the "human bias" is the more interesting problem because it’s so well-hidden in plain sight. We’ve normalized it. We’ve called it "standards."
Herman
I mean, you're right. We've branded political compromise as "pedagogical excellence." And that’s the danger. If we don’t recognize the bias in the human system, we’ll never be able to build an AI that’s any better. We’ll just be automating our own disagreements.
Corn
I want to go back to the RAND Corporation study for a second. Twenty-three percent of teachers following the curriculum. That means seventy-seven percent are doing their own thing. In any other industry, if seventy-seven percent of your employees ignored the official manual, you’d say your manual is broken. But in education, we just ignore the fact that the manual is broken and keep printing more copies of it.
Herman
Because the manual isn't for the teachers. The manual is for the politicians and the publishers. It’s a document of legal and political compliance. "Look, we covered X, Y, and Z, so we can't be sued or lose our 'A' rating from the state." Whether the kid actually learns X, Y, or Z is almost a secondary concern for the people at the very top of the hierarchy.
Corn
That’s a cynical take, Herman, but it feels accurate. It’s about "covering your bases" rather than "opening a mind." And if that’s the "human standard," then the bar for AI is actually pretty low. If an AI can just be consistently decent and engaging, it’s already beating a lot of the "compromise" material that’s gathering dust on school bookshelves.
Herman
But "engaging" is where the Fortune article gets worried. It talks about "artificial intimacy" and the "frictionless" nature of AI. If learning becomes too easy because the AI is so good at explaining things, do kids lose the ability to grapple with difficult, boring, or contradictory information? The human curriculum—for all its flaws—often forces you to deal with a dry, difficult text. There’s a cognitive muscle being built there. If the AI turns everything into a personalized, fun adventure, are we atrophying the part of the brain that handles "real world" complexity?
Corn
It’s the "broccoli vs. chocolate-covered broccoli" problem. The official curriculum is the raw broccoli. It’s good for you, but nobody wants to eat it, and most teachers end up throwing it away. The AI is the chocolate-covered version. Everyone eats it, but are they still getting the nutrients? Or are they just getting a sugar rush?
Herman
That’s the meta-question. And it’s why we need to be very careful about how we integrate these tools. We shouldn't just be looking at "does it hit the standards?" We should be looking at "is it challenging the student?" The current human system often fails at both, but at least we know how it fails. We’re still learning how AI fails in the classroom.
Corn
This really reframes the whole "AI in schools" debate for me. It’s not a battle between "perfect human teachers" and "scary robots." It’s a battle between a slow, politically-motivated bureaucracy and a fast, commercially-motivated algorithm. Neither one of them is inherently "on the side of the child" unless we, as parents and citizens, force them to be.
Herman
That’s the takeaway. The "who" in Daniel’s prompt isn't just one person. It’s a committee in Austin, a lobbyist in D.C., a publisher in New York, a curriculum director in your town, and a teacher who’s had three hours of sleep. And soon, it’ll be a software engineer in San Francisco and a GPU cluster in a data center. The "who" is becoming more complex, not less.
Corn
Well, I think we’ve thoroughly deconstructed the "standard" model. It’s a lot messier than the "neutral expertise" image I had in my head. Before we wrap up, I want to make sure we give people those practical bits they can actually use. Because this is a lot of "the system is broken" talk, and I want to give them some "here’s how to fix your corner of it" advice.
Herman
First, find out your district’s adoption schedule. Most states publish this online. If you know that your district is picking new English Language Arts books in 2027, you have a year to start asking questions. Ask for the "rubric" they use to evaluate the books. Is "pedagogical evidence" weighted higher than "digital features"? Second, look at what your kid is actually bringing home. If they’re using a lot of photocopied worksheets from random websites, your teacher is "off-roading." Don't judge them for it—they're trying to survive—but ask them what they feel is missing from the official book. That’ll tell you more about the quality of the curriculum than any school board meeting will.
Corn
And if your school is starting to use AI tools, ask about the "human-in-the-loop" policy. Is the AI generating the lesson, or is it just a starting point for the teacher? Who is checking the AI’s work for the "Texas Effect" or for hallucinations? Accountability shouldn't disappear just because the "author" is a model.
Herman
One last thing: check out the resources from groups like EdReports. They’re like the Consumer Reports for curriculum. They actually have experts sit down and rate these textbooks based on how well they align with research. It’s one of the few places where you can get a relatively objective look at whether a book is actually "good" or just "popular."
Corn
"Consumer Reports for textbooks." That’s actually really useful. I’m going to go check if my kid's math book is "highly rated" or just "highly sold."
Herman
I can almost guarantee it’s the latter if you live in a big state. But that’s the world we live in. We’re navigating a giant marketplace of ideas, and the "merchants" are politicians and publishers.
Corn
Well, this has been an eye-opener. I feel like I need to go read a textbook now just to see if I can spot the "hidden hand" of a Texas school board member.
Herman
You’ll see it everywhere once you know what to look for. It’s like the Matrix, Corn. Once you see the code, you can't go back.
Corn
Hopefully, the code we’re writing for the future—the AI code—will be a little more transparent than the human one. But I’m not holding my breath.
Herman
I'm not either, but I'm excited to keep digging into it. There’s so much more to explore here, especially as these AI models start taking over the "off-roading" part of teaching.
Corn
That’s a topic for another day. For now, I think we’ve given Daniel plenty to chew on. This has been "My Weird Prompts." I’m Corn, the thoughtful sloth.
Herman
And I’m Herman Poppleberry, the nerdy donkey. If you’re finding value in these deep dives, please consider leaving us a review on Apple Podcasts or Spotify. It’s the best way to help new listeners find the show.
Corn
Thanks as always to our producer, Hilbert Flumingtop, for keeping the gears turning behind the scenes. And a big thanks to Modal for providing the GPU credits that power our generation pipeline.
Herman
You can find all our past episodes, including our deep dive into the economics of textbook publishing, at myweirdprompts.com. We’ve got the full RSS feed and all the links to subscribe there.
Corn
We’re also on Telegram—just search for "My Weird Prompts" to get notified the second a new episode drops. It’s the best way to stay in the loop.
Herman
Until next time, keep asking the weird questions.
Corn
And keep an eye on those textbooks. See ya.
Herman
Goodbye.

This episode was generated with AI assistance. Hosts Herman and Corn are AI personalities.