#2255: Typst vs. LaTeX: The AI-Ready Document Engine

Can Typst succeed LaTeX as the go-to tool for programmatic typesetting, especially for AI agents? We compare the two and explore what makes a document engine great.

Episode Details
Episode ID
MWP-2413
Published
Duration
43:21
Audio
Direct link
Pipeline
V5
TTS Engine
chatterbox-regular
Script Writing Agent
deepseek-chat

AI-Generated Content: This podcast is created using AI personas. Please verify any important information independently.

The landscape of programmatic document generation is undergoing a significant shift, driven by the convergence of modern tooling and AI automation. At the center of this discussion is Typst, a new typesetting system designed to offer the power of LaTeX with a drastically improved developer and user experience. This comparison isn't just about syntax; it's about which system is better suited for a future where AI agents are frequent document authors.

Typst's Modern Approach
Typst 1.0 presents itself as a direct successor to LaTeX, built in Rust for performance and designed with clarity in mind. Its core innovation lies in its abstraction model. Instead of LaTeX's often-opaque macro expansion, Typst uses declarative "set" and "show" rules for styling—conceptually similar to CSS for documents. This makes controlling appearance more predictable and composable. Furthermore, Typst features incremental compilation for speed and, critically, human-readable error messages. This clarity is a major advantage not just for new users, but for automated systems that need to understand and correct mistakes.
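To make the set/show-rule model concrete, here is a minimal sketch of a shared Typst stylesheet emitted from Python. The Typst syntax follows the rule model described above but is illustrative; verify the exact forms against the current Typst documentation before relying on them.

```python
from pathlib import Path
from textwrap import dedent

# Illustrative Typst style rules held as a string. "set" rules apply
# properties; the "show" rule transforms every heading's appearance.
STYLE_RULES = dedent("""\
    // set rules: apply styling properties document-wide
    #set text(size: 11pt)
    #set heading(numbering: "1.")
    // show rule: render every heading inside a shaded block
    #show heading: it => block(fill: luma(240), inset: 8pt, it)
""")

def write_stylesheet(path: str = "style.typ") -> str:
    """Write the shared style rules to a file and return the source."""
    Path(path).write_text(STYLE_RULES)
    return STYLE_RULES
```

Because the rules live in one place, changing a single `set` or `show` line restyles every document that imports the stylesheet, which is the CSS-like property the article refers to.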

The LaTeX Incumbent
LaTeX remains the undisputed champion for high-quality, precise typesetting, especially in academic and technical fields. Its vast, 40-plus-year ecosystem of packages can handle incredibly niche formatting needs, from musical notation to complex diagrams. However, this strength is also a weakness: its learning curve is steep, error messages can be cryptic, and its capabilities are a patchwork of packages that don't always interoperate smoothly. This "incidental complexity" makes it a challenging environment for AI agents to operate in reliably.

The AI Integration Layer
The discussion moves beyond human use to focus on AI readiness. The key is not just having an AI generate code for a typesetting system, but allowing the AI to use the system as a tool. This is where protocols like the Model Context Protocol (MCP) come in. An MCP server can expose document templates, brand assets, and compilation tools to an AI agent, enabling a true feedback loop: generate, compile, check for errors, and iterate.

For this agent-centric workflow, Typst's design offers clear advantages. Its clean API and deterministic error handling make building a robust MCP server more straightforward. An agent is more likely to produce a correct, compilable Typst document on the first try compared to a fragile LaTeX one.
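The feedback loop described above can be sketched as follows. `compile_typst` here is a toy stand-in that only checks bracket balance; a real implementation would invoke the Typst compiler and parse its diagnostics, and a real agent would feed the error text back to the model rather than apply a mechanical fix.

```python
def compile_typst(source: str) -> list[str]:
    """Toy 'compiler': reports an error if content brackets are unbalanced."""
    errors = []
    if source.count("[") != source.count("]"):
        errors.append("unclosed content block: expected ']'")
    return errors

def agent_loop(source: str, max_iterations: int = 3) -> tuple[str, bool]:
    """Generate, compile, inspect errors, and retry with a fix."""
    for _ in range(max_iterations):
        errors = compile_typst(source)
        if not errors:
            return source, True  # clean compile: done
        # Stand-in for "send the error back to the agent and revise":
        if any("expected ']'" in e for e in errors):
            source += "]"
    return source, False

fixed, ok = agent_loop("#figure[image goes here")
```

The point is structural: clear, machine-readable errors are what make the retry branch tractable for an automated author.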

Blueprint for a Great Typesetting System
The analysis leads to a set of first principles for a great, automation-ready typesetting program:

  1. Declarative, Predictable Styling: A coherent system like Typst's rules, not a macro soup.
  2. Clear Errors & Determinism: Essential for both debugging and automated correction.
  3. Strong Data Ingestion: Native handling of JSON, CSV, or YAML to seamlessly integrate dynamic data.
  4. Headless Operation: A CLI or API for scriptable, GUI-free compilation.
  5. Extensible but Constrained Styling: Power balanced with guardrails to maintain consistency.
  6. Professional Output Quality: Non-negotiable, print-ready PDFs.

The Verdict and Path Forward
For new projects or those prioritizing automation, Typst presents a compelling case. Its shallower learning curve and cleaner integration story make it a strong candidate for AI-driven document pipelines. The ideal toolchain involves exposing a Typst template—with brand guidelines encoded in its style rules—via an MCP server, allowing an AI agent to inject data and generate polished, consistent documents at scale.

While LaTeX's ecosystem remains unmatched for specialized tasks, the future of automated, high-quality document generation appears to be leaning toward modern, designed systems like Typst that treat AI as a first-class user.


#2255: Typst vs. LaTeX: The AI-Ready Document Engine

Corn
So Daniel sent us this one. He's asking about Typst and the whole world of programmatic typesetting. The core question is whether Typst is the easier successor to LaTeX, which we all know is powerful but famously... let's say user-hostile. He wants to know which tools have the deepest integration with protocols like MCP to let AI agents actually generate beautiful documents, not just text dumps. And while we're at it, what actually makes a great typesetting program great? What features should we be looking for, whether it's cloud-based or local?
Herman
That's a fantastic set of questions. And the timing is perfect. Typst one point zero just hit stable release in March, the MCP ecosystem is having its breakout moment, and we're hitting this inflection point where AI can do the thinking but still struggles with the final, polished output. The document is kind of the last frontier.
Corn
It's the difference between an agent that can analyze a dataset and one that can produce a board-ready report you'd actually hand to someone. Most of what comes out of a chatbot still looks like it was assembled by a very smart intern who's never heard of margins.
Herman
Or kerning.
Corn
Or the concept that pages should be numbered. So, where do we even start with this? The obvious anchor is LaTeX.
Herman
Right. And by the way, today's episode is being written by DeepSeek V three point two chat. Fitting, given the topic.
Corn
A little meta. I like it. Okay, LaTeX. The eight-hundred-pound gorilla that's been around since the eighties. It's the benchmark for programmatic, high-quality typesetting, especially in academia. But Daniel's prompt hits the nail on the head—it's notoriously difficult. The error messages are cryptic, the package management is a dark art, and the learning curve is a cliff.
Herman
And that's the tension. We have this incredibly powerful, precise system that produces beautiful output, but it's surrounded by what feels like decades of incidental complexity. The promise of something like Typst is to preserve that power and precision while stripping away the pain. But is that what it actually does? Or is it just putting a nicer facade on a fundamentally similar model?
Corn
That's the first layer. Then we have the AI agent layer. An MCP server, for those who haven't been following the jargon, is essentially a standardized way for an AI to interact with a tool or a resource. So the question becomes: which of these typesetting systems is most 'agent-ready'? Which one can an AI most reliably and effectively use to generate a document that doesn't look like a chatbot... well, you said it.
Herman
Precisely. And that leads us to the third, more philosophical layer: what are the first principles of a great typesetting program? If we were to design the ideal system for this era—where automation and AI are first-class citizens—what features would it have? It's not just about prettier syntax.
Corn
So we've got a three-part investigation. The Typst versus LaTeX reality check. The AI integration landscape. And the blueprint for what greatness actually looks like. Should we start with the newcomer?
Herman
Let's start with the newcomer. Typst one point zero. The marketing pitch is clear: LaTeX, but simpler, faster, and designed for the modern toolchain. It's written in Rust, which gives it performance and safety characteristics that a forty-year-old codebase in Pascal and C can't match. But the interesting part isn't the language it's written in—it's the model of computation.
Corn
Which is?
Herman
LaTeX is essentially a macro expansion system on top of TeX's primitive box-and-glue model. You write commands, they expand, they manipulate boxes, and eventually you get a PDF. The control is immense, but the flow is often opaque. Typst introduces a different core abstraction: show rules and set rules.
Corn
That sounds suspiciously like CSS for documents.
Herman
It's in that conceptual family, yes. A set rule applies styling properties—make all headings this font, set the page margins to that. A show rule is more powerful; it can transform content. You can say 'show heading: it goes in a blue box with this spacing.' The key is that these rules are scoped and composable in a way that LaTeX's macro soup often isn't. And the compiler is incremental—it only recompiles what changed, which is why it feels blisteringly fast compared to a full LaTeX run.
Corn
So the claim of 'easier' isn't just about cleaner syntax for fractions. It's about a more predictable, declarative system for controlling appearance. The pitfall with LaTeX is that you can achieve any layout you want, but you might have to descend into package internals and fight with counter registers for three days to do it.
Herman
That's the common way this goes wrong, yes. With Typst, the styling system is built-in and coherent from the start. Need a grid layout? There's a built-in grid function. Need to style a table? You use set and show rules on the table element itself. It's a unified model, whereas LaTeX's capabilities are a patchwork of packages that sometimes cooperate and sometimes wage war in your document preamble.
Corn
And the error messages?
Herman
Human-readable. That's a huge deal for adoption, and an even bigger deal for AI integration. A LaTeX error might say 'undefined control sequence' on line two hundred and forty-five of some package file you've never heard of. A Typst error will typically tell you what you were trying to do, what went wrong, and often suggest a fix. That determinism and clarity is catnip for an automated system.
Corn
So on paper, Typst sounds like the heir apparent. But what's the trade-off? It can't just be strictly better. Nothing is.
Herman
The ecosystem. LaTeX has forty-plus years of packages for every conceivable niche: from typesetting chess notation to drawing phylogenetic trees. Typst's package ecosystem is growing rapidly, but it's young. If your workflow depends on a hyper-specialized LaTeX package, you might be waiting a while for a Typst equivalent. The other trade-off is mindset. If you're deeply invested in the LaTeX way of thinking—the specific commands, the workarounds, the tribal knowledge—switching to Typst's model requires unlearning some of that.
Corn
It's the classic innovator's dilemma. The new system is better on the dimensions that new users care about—ease of use, clarity, speed—but can't immediately match the incumbent on the long tail of specialized features. The question is whether that long tail matters for the majority of use cases, and for the new use case of AI-driven generation.
Herman
And that's where we pivot to the second layer. Let's say I'm building an AI agent that needs to generate a beautiful, data-driven report. I could tell it 'use LaTeX.' But the probability of it producing a compilable document on the first try, given LaTeX's fragility, is low. I could tell it 'use Typst.' The probability goes up significantly because the language is cleaner and the errors are clearer. But the real leap is giving the agent a dedicated tool via MCP.
Corn
Explain that. What does an MCP server for typesetting actually do?
Herman
Instead of just prompting the AI with 'write Typst code,' you run a small server that exposes 'resources' and 'tools' to the AI. The resources might be your company's document templates, your brand color palette defined as variables, your approved image assets. The tools would be things like 'compile this Typst source to PDF' or 'render a preview of this snippet.' The AI interacts with these through a structured protocol. It's not just generating text; it's using a toolchain.
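A protocol-agnostic sketch of the server surface Herman describes. The resource and tool names (`company_report_template`, `compile_typst`) are invented for illustration; a real server would implement the MCP wire protocol rather than a plain dict dispatch.

```python
# Resources: things the agent can read (templates, brand assets).
RESOURCES = {
    "company_report_template": "#set text(size: 11pt)\n= Quarterly Report",
    "brand_palette": {"primary": "#1a3c6e", "accent": "#e8a020"},
}

def compile_tool(source: str) -> dict:
    """Stand-in for a 'compile to PDF' tool; returns a result envelope."""
    return {"ok": True, "pdf_bytes": len(source)}  # placeholder payload

# Tools: actions the agent can invoke through the protocol.
TOOLS = {"compile_typst": compile_tool}

def call_tool(name: str, **kwargs) -> dict:
    """Dispatch a tool call the way an agent would via the protocol."""
    if name not in TOOLS:
        return {"ok": False, "error": f"unknown tool: {name}"}
    return TOOLS[name](**kwargs)
```

The separation matters: the agent discovers resources and tools at runtime instead of having the toolchain hard-coded into its prompt.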
Corn
So the AI becomes a user of the typesetting system, not just a code generator. It can iterate: generate, compile, see an error, fix it, recompile. That feedback loop is impossible if you're just dumping code into a prompt and hoping.
Herman
And this is where Typst's design pays dividends. A clean, well-defined API for compilation and introspection makes building that MCP server straightforward. With LaTeX, you'd likely need a thick wrapper layer to tame the compilation process and parse the logs. There's a reason the first wave of MCP servers for documentation are targeting tools like Mintlify or Archbee—cloud-native, API-first systems. Typst, with its local CLI and web assembly build, fits nicely into both cloud and local agent workflows.
Corn
So we're building a feature matrix for the 'great' typesetting program. Number one: a declarative, predictable styling system. Typst's show rules versus LaTeX's macros. Number two: human-readable errors and deterministic output. Critical for automation.
Herman
Number three: strong data ingestion. The system should natively handle JSON, YAML, CSV—not as an afterthought. You need to be able to plug in a dataset and loop over it to generate tables or charts. Number four: headless operation. It must have a CLI or API you can call without a GUI. Number five: an extensible styling system that balances power with constraint. And number six: output quality that meets professional standards for typography and PDF generation.
Corn
That last one is non-negotiable. If the PDFs look cheap, nothing else matters. LaTeX clears that bar effortlessly. Typst, from what I've seen, does too. The question is about the cloud versus local divide. Overleaf made LaTeX accessible in the cloud. Typst has its own web app. But for an AI pipeline, is the future in managed cloud services or local, containerized toolchains?
Herman
I think it's both, for different reasons. A cloud service is great for collaboration and for eliminating setup—just point an agent at an API endpoint. But for sensitive data, for air-gapped environments, for reproducible builds, you need a local toolchain you can containerize. The great tool will offer both. Typst's architecture, with a single Rust compiler that can run anywhere, suggests they get this.
Corn
Let's make this concrete. Walk me through a case study. Say I have an AI agent that needs to generate one hundred personalized sales proposals. Each one pulls client data from a CRM, product specs from a database, and needs to adhere to strict brand guidelines. What does the ideal toolchain look like?
Herman
You'd have an MCP server that exposes your proposal template—written in Typst—as a resource. The template would have placeholders for client data, product features, pricing. The agent's tool would be 'generate proposal,' which takes a client ID as input. The agent fetches the data, injects it into the template, calls the Typst compiler via the server, and outputs a PDF. The brand guidelines are encoded in the template's set and show rules, so they're enforced automatically. No manual formatting, no copy-pasting into Word.
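The data-injection step of that pipeline, sketched with invented field names and a toy inline template. In practice the template would be a `.typ` file exposed as a server resource, and each rendered source would be handed to the compile tool to produce one PDF.

```python
from string import Template

# Hypothetical proposal template; $client, $product, $price are
# placeholders filled from one CRM record.
PROPOSAL_TEMPLATE = Template(
    "= Proposal for $client\n"
    "Product: $product\n"
    "Price: $price\n"
)

def render_proposal(record: dict) -> str:
    """Fill the Typst template with one CRM record."""
    return PROPOSAL_TEMPLATE.substitute(record)

crm_rows = [
    {"client": "Acme Corp", "product": "Widget Pro", "price": "4,200 USD"},
]
sources = [render_proposal(row) for row in crm_rows]
# Each entry in `sources` would then go to the compiler for its own PDF.
```

Brand enforcement stays out of this loop entirely: it lives in the template's set and show rules, exactly as described above.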
Corn
Versus the alternative: the agent writes a Python script that uses a library like ReportLab to manually position every text box and line. It's possible, but the moment you need to change a font or adjust a margin, you're rewriting code, not adjusting a style rule.
Herman
That's the essence of programmatic typesetting. The document is a compiled artifact, like a binary. The source code is your content and your style rules. Change a rule, recompile, every document updates. It's a powerful model that's been hiding in academia for decades. The convergence of modern tools like Typst and AI agent protocols is finally bringing it to the mainstream.
Corn
So our takeaway for someone listening might be: if you're starting a new project where you control the document pipeline, give Typst a serious look. The learning curve is shallower, and the integration story for automation is clearer.
Herman
And if you want to make your documents AI-ready, don't just think about chat interfaces. Think about exposing your typesetting engine as a tool via a protocol like MCP. The agent needs to be able to compile, preview, and iterate. That's what turns a text generator into a document producer.
Corn
We've only scratched the surface on the competitor landscape—tools like Pandoc, Quarto, SILE—and the deep technical differences in how these systems handle layout. But the framework is there. A great typesetting program for the AI era is one built for both humans and machines, with clarity, determinism, and a separation of concerns between content and style that actually works.
Herman
And with that, we should probably dive into the nitty-gritty of how Typst's compiler actually works, because that's where the real magic—and the real trade-offs—are hiding.
Corn
Lead on, walking encyclopedia. Just try to keep the Rust jargon to a minimum. Some of us are still recovering from the last package manager debate.
Herman
No promises. But I'll start with the incremental compilation model. That's the secret sauce for the speed. It's what makes the web editor feel instantaneous.
When you type a character, it doesn't recompile the whole document from scratch. The Rust compiler tracks dependencies between parts of your code—if you change a styling rule on page ten, it knows which page layouts are affected and only recomputes those. It's the same architecture that powers modern IDEs.
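A toy model of that dependency tracking: record which pages depend on which style rules, and recompute only the invalidated ones. Typst's real engine works at a much finer granularity (memoized computations, not whole pages), so this is only the shape of the idea.

```python
# Hypothetical dependency map: rule name -> pages that use it.
DEPENDENCIES = {
    "heading-style": {1, 2, 10},                    # pages with headings
    "figure-style": {3, 10},                        # pages with figures
    "body-font": set(range(1, 11)),                 # every page
}

def pages_to_recompile(changed_rule: str) -> set[int]:
    """Return only the pages invalidated by a rule change."""
    return DEPENDENCIES.get(changed_rule, set())

# Editing the figure style touches 2 pages, not all 10.
affected = pages_to_recompile("figure-style")
```

A full LaTeX run, by contrast, corresponds to recompiling every page on every keystroke.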
Corn
So it's not just that the compiler is fast because it's written in Rust. It's fast because it's smart about what to recompile. That's a fundamental architectural advantage over LaTeX, which typically runs a full compilation pass every time, right?
Herman
For most workflows, yes. There are some LaTeX packages that enable partial previews, but they're add-ons, not core to the system. With Typst, it's built-in. This has a subtle but important effect on the development loop. You tweak a style, you see the result in under a second. That tight feedback encourages experimentation in a way that waiting for a multi-second LaTeX compile does not.
Corn
That feeds directly into the AI agent use case. An agent trying to refine a document through trial and error would be crippled by a slow compile cycle. Speed isn't just a convenience; it's an enabler for iterative automation.
Herman
And the compiler's architecture influences the language design. Because the system understands the structure of your document—this is a heading, this is a figure caption—it can provide better errors and enable those show rules we talked about. It's a virtuous cycle.
Corn
Let's talk about the package manager, because that's where LaTeX's legendary complexity often manifests. CTAN is a vast, uncurated repository. Finding the right package, dealing with conflicts, managing installs across different operating systems… it's a rite of passage.
Herman
Typst has a built-in package manager that's more like npm or Cargo. You declare dependencies in your document metadata. The compiler fetches them. They're versioned and namespaced. It's a centralized, curated ecosystem, which has pros and cons. The pro is no more 'package not found' errors because you forgot to install something with your system's package manager. The con is that you're reliant on Typst's central repository, and the selection is, as we said, younger.
Corn
It's the walled garden versus the wild frontier. LaTeX's CTAN is the wild frontier—incredible breadth, but you might get bitten by a snake. Typst's package system is safer and more manageable, but the garden hasn't grown all the exotic plants yet.
Herman
For the core use cases—academic papers, reports, letters, books, slides—Typst's built-in features and growing package set are more than sufficient. It's when you venture into highly specialized domains that you might hit a wall. But for the new wave of automated business document generation, those specialized domains aren't the target.
Corn
So we've defined programmatic typesetting. We've contrasted the old and new guard. The core questions Daniel is asking become sharper. Is Typst the easier successor? For new users and automated systems, the evidence points to yes. How do we connect AI to these tools? Through structured protocols like MCP that treat the typesetter as a tool, not a mystery box. And what features actually matter? We've started that list: predictability, clarity, speed, and a separation of content from style that doesn't leak.
Herman
The 'what is this really about' question is key. This isn't about choosing a better word processor. It's about building an infrastructure layer for high-quality, automated document generation. Moving past the limitations of markdown, which lacks precise layout control, and past the fragility of templated Word documents, which break when you look at them funny. It's about treating the document as a serious computational artifact.
Corn
Which means the stakes are higher than just prettier PDFs. It's about reliability, scalability, and maintainability in document pipelines. If your company's contracts, reports, or proposals are generated this way, the choice of typesetting engine becomes a foundational platform decision.
Herman
And with that foundation laid, we should look at the specific mechanics. How does Typst actually handle something fundamental, like page layout or floating figures? That's where many document systems show their seams.
The floating figure problem is a perfect example. In LaTeX, you place a figure environment, and the engine decides where it goes based on complex placement algorithms—'float specifiers' like h, t, b, p. It's powerful but opaque. You often end up with figures appearing pages away from their references, or you fight with the system using placement overrides.
Corn
And the logs are no help. 'LaTeX Warning: Float too large for page' is about as descriptive as a fortune cookie.
Herman
Typst takes a different approach. It has a concept called 'placement' that gives you more direct control. You can say 'place this figure at the top of the page' or 'make it span both columns,' but the default behavior is more predictable. The system tries to keep the figure close to where you wrote it in the source, but it also has a grid-based layout system that makes it easier to reason about where things will end up.
Corn
So it's trading some of LaTeX's automatic, 'we'll figure it out for you' magic for more predictability and control. For an automated pipeline, predictability is king. You can't have an AI agent generating a document where the figures jump around randomly between compiles.
Herman
That's the key. Deterministic output. If you compile the same Typst source twice, you get the same PDF, byte for byte. With LaTeX, because of floating placement and some internal timing, you can sometimes get minor differences. That's a nightmare for version control or automated testing.
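That determinism claim is directly testable in a pipeline: build twice and compare content hashes. Here `build` is any callable producing the output bytes; a real check would wrap the Typst CLI, while the stand-in below just makes the sketch runnable.

```python
import hashlib

def digest(data: bytes) -> str:
    """Content hash of a build artifact."""
    return hashlib.sha256(data).hexdigest()

def is_deterministic(build, source: str) -> bool:
    """Build the same source twice and require byte-identical output."""
    return digest(build(source)) == digest(build(source))

def fake_build(source: str) -> bytes:
    """Trivially deterministic stand-in for a real compiler call."""
    return source.encode("utf-8")

result = is_deterministic(fake_build, "#set text(size: 11pt)")
```

A check like this belongs in CI for any document pipeline treated as a build system.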
Corn
Let's get concrete. Show me what a simple document with a figure looks like in both. Nothing fancy, just a title, a paragraph, and an image with a caption.
Herman
Okay. In LaTeX, you'd have your document class, you'd import the graphicx package. Your figure environment would look something like this. Backslash begin figure, brackets h t b p, backslash centering, backslash includegraphics width equals backslash textwidth, then backslash caption, backslash label, backslash end figure. The syntax is dense with backslashes and curly braces.
Corn
And if you forget to include the graphicx package, you get an incomprehensible error about an undefined control sequence. Fun.
Herman
In Typst, the equivalent is more concise. You use the image function, you can set its width relative to the page, and you wrap it in a figure function that automatically handles the numbering and caption. The syntax is closer to a modern programming language. Figure bracket, image bracket "chart dot png", width: eighty five percent, caption: "Sales growth over the quarter." And that's it. The styling of the figure—the border, the spacing, the caption font—is controlled by set and show rules elsewhere, not mixed with the content.
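Written down, the two snippets Herman spells out look roughly like this. Both are held as Python strings so the comparison stays self-contained; check each against its own engine's documentation, since the exact forms here are reconstructed from the spoken description.

```python
from textwrap import dedent

# LaTeX version: requires \usepackage{graphicx} in the preamble.
LATEX_FIGURE = dedent(r"""
    \begin{figure}[htbp]
      \centering
      \includegraphics[width=\textwidth]{chart.png}
      \caption{Sales growth over the quarter.}
      \label{fig:sales}
    \end{figure}
""")

# Typst version: figure() handles numbering and the caption itself.
TYPST_FIGURE = dedent("""
    #figure(
      image("chart.png", width: 85%),
      caption: [Sales growth over the quarter.],
    )
""")
```

Note what is absent from the Typst snippet: no placement specifiers, no centering command, no label plumbing mixed into the content. That styling lives in set and show rules elsewhere.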
Corn
That separation is what makes it programmatic. The content is just the data—the image path, the caption text. The presentation is a rule that says 'all figures have this styling.' Change the rule once, every figure in every document updates. You can't do that in LaTeX without creating a custom macro or hacking the document class, which is its own rabbit hole.
Herman
And this brings us to the core architectural difference. LaTeX is built on TeX's macro expansion system. It's a programming language where you manipulate tokens. It's incredibly flexible, but that flexibility is why errors are so cryptic—you're debugging a macro that has expanded fifty times. Typst is built around a structured document model. The compiler knows what a figure is, semantically. So its error messages can say 'you forgot to close the caption argument in your figure on line twenty-four,' not 'runaway argument?'
Corn
So the 'easier' claim isn't just about nicer syntax. It's about a fundamentally different model of computation that's easier for humans and machines to reason about. Which leads to the big question: is Typst a true successor, meant to replace LaTeX, or is it a complementary tool for a different use case?
Herman
I think it's aiming to be a successor for a large swath of LaTeX's use cases, particularly in scientific and technical publishing where the pain points are highest. But it's also carving out a new niche: automated business document generation. LaTeX was never designed for that. Its roots are in high-quality typesetting for academia. Typst is designed from the ground up for both.
Corn
The competitor landscape shows there's demand for this middle ground. Pandoc is the universal document converter—it can take markdown and output LaTeX, Word, or HTML. But it's not a typesetting engine itself; it relies on those backends. Quarto builds on Pandoc and R Markdown, adding reactive notebooks and publishing workflows, but again, for PDF output, it often uses LaTeX or ConTeXt under the hood.
Herman
Right. Quarto is a fantastic tool for data science reporting, but it's a layer on top. SILE is another interesting one—a typesetting engine inspired by LaTeX and written in Lua, designed to be more scriptable. But it hasn't reached critical mass. Typst is unique in being a modern, from-scratch engine with a bundled package manager and a focus on both interactive and batch use.
Corn
Let's test this with a case study. A researcher needs to generate one hundred data-driven report variants. Each report uses the same template but pulls in different datasets, generates different charts, and has customized commentary. Which toolchain is more maintainable? LaTeX with Python scripts, or Typst?
Herman
With LaTeX, you'd likely write a Python script that uses a templating engine like Jinja to generate a hundred different .tex files, then shell out to pdflatex to compile each one. You have to manage the LaTeX package dependencies on the system, handle the log files for errors, and pray that a floating figure doesn't cause a page break that ruins a layout.
Corn
And when you need to update the template's style, you're editing that Jinja template, which is now mixing LaTeX syntax with template logic. It becomes a brittle house of cards.
Herman
With Typst, the template is a Typst file with variables. You write a small script—or an agent—that loads the data, calls the Typst compiler's API with the data fed as a JSON dictionary, and gets a PDF back. The styling is all in the Typst file. The logic is cleaner. And because Typst's errors are better, if a dataset has a malformed value, you get a clear error pointing to the problem, not a LaTeX crash deep in a package.
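The batch step of that workflow might look like the following: one compile per dataset, with the data passed as a JSON string on the command line. Recent Typst CLI versions accept `--input KEY=VALUE` pairs readable from the document via `sys.inputs`, but verify that interface against the docs for the version you pin; `report.typ` and the field names are invented for illustration.

```python
import json
import subprocess

def build_command(dataset: dict, out_path: str) -> list[str]:
    """Assemble one Typst CLI invocation with the dataset inlined."""
    return [
        "typst", "compile", "report.typ", out_path,
        "--input", "data=" + json.dumps(dataset),
    ]

def compile_report(dataset: dict, out_path: str) -> None:
    # check=True raises on a non-zero exit, surfacing Typst's
    # (readable) error message to the calling script or agent.
    subprocess.run(build_command(dataset, out_path), check=True)

cmd = build_command({"quarter": "Q2", "revenue": 1200000}, "report-q2.pdf")
```

Compare this with the Jinja-over-.tex approach: no generated intermediate source files to manage, and the template never mixes two syntaxes.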
Corn
The trade-off, as always, is ecosystem maturity. If the researcher needs a specific bibliography style mandated by a journal, or a complex chemical structure diagram, LaTeX has a package for that, battle-tested over decades. Typst might not yet. So the answer depends on the specific requirements. For greenfield projects, especially those leaning into automation, Typst's architecture is a compelling advantage.
Herman
That's the pragmatic take. It's not about declaring a winner. It's about matching the tool to the job. For the job of 'creating beautiful, automated documents in a reproducible, maintainable way,' Typst's design makes it a top contender. Its integration with modern development practices—version control, CI/CD, API access—is just smoother.
Corn
Which brings us back to the AI agent angle. An LLM can more reliably generate correct Typst code than LaTeX code. The syntax is cleaner, the error messages are instructive. If the agent makes a mistake, the feedback loop is tighter. That's not a minor feature; it's an enabling one.
Herman
And that's why the MCP integration story is so promising. A Typst MCP server can expose not just a 'compile' tool, but also 'validate syntax,' 'list available templates,' or 'apply brand style.' The agent can interact with the typesetting system intelligently. With LaTeX, you'd be building a whole abstraction layer to hide the complexity. With Typst, the complexity is already lower.
Corn
So we've dived into the architecture, compared it directly, looked at the ecosystem. The 'easier' claim holds water, but with the caveat that 'easier' is about the total cost of ownership—writing, debugging, maintaining, and integrating the system, not just typing the first draft.
Herman
And that total cost is exactly what matters when you're building automated pipelines. You're not just writing a document; you're writing a program that writes documents. The typesetting engine is a critical library in that program. You want one with a clean API, good diagnostics, and predictable performance.
Corn
Speaking of performance, we touched on the incremental compiler. But what about the output itself? The PDF. Does Typst's modern approach sacrifice any of the typographic quality that TeX is famous for? The micro-typography, the spacing, the hyphenation?
Herman
That's the next layer down. And honestly, it's where the rubber meets the road. A typesetting system can have the best developer experience in the world, but if the final pages look amateurish, none of it matters.
So that typographic quality is crucial. And the short answer is: Typst holds its own. It uses the same high-quality hyphenation algorithms as TeX—the classic patterns developed by Frank Liang. It has support for OpenType font features: ligatures, old-style numerals, proper kerning. The PDF it produces is standards-compliant, with embedded fonts and correct metadata.
Corn
But TeX has forty-plus years of refinement on things like line breaking and paragraph justification. Knuth's algorithms are legendary. Can a new system really match that out of the gate?
Herman
It doesn't have to match it identically; it has to achieve a result that is, for all practical purposes, equally good. And from everything I've seen and tested, it does. The average reader, or even a typography nerd, would be hard-pressed to spot a quality difference in a typical academic paper. Where you might see a gap is in extreme edge cases: complex multi-line mathematical equations with nested fractions, or intricate tabular layouts with vertical rules. LaTeX's packages have had decades to solve those corner cases.
Corn
So for the ninety-five percent use case, the output is beautiful. For the five percent bleeding-edge typesetting challenge, you might still need LaTeX's bag of tricks. That's a reasonable trade-off for the developer experience gains.
Herman
And this brings us to the AI-agent integration layer we hinted at. Because if the goal is for an AI to generate beautiful documents, the typesetting engine needs to be 'agent-ready.' It needs to expose itself not as a black-box compiler, but as a set of tools an agent can understand and use. That's where the Model Context Protocol comes in.
Corn
Define MCP for us. Not the full technical spec, but the conceptual role.
Herman
The Model Context Protocol is a standardization layer, pioneered by Anthropic, that lets any tool or resource expose itself to an AI agent in a structured way. Think of it like a USB port for AI. A tool becomes an 'MCP server' that offers 'resources'—like templates, style guides, or document fragments—and 'tools'—like compile, validate, or render. The agent discovers these capabilities and can use them without needing custom code for every single application.
Corn
So a typesetting tool becomes an MCP server. What would that look like?
Herman
A Typst MCP server could expose a resource called 'company_report_template.typ'. The agent can read that template to understand the structure. It could expose tools like 'compile_typst', taking source text and data as input and returning a PDF. Or 'validate_typst', which checks syntax and returns clear errors. The agent's job becomes assembling the content and calling the right tools; it doesn't need to know the intricacies of Typst's command syntax, because the tools handle that.
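The tool surface Herman describes can be sketched in a few lines. This is a conceptual sketch, not a real MCP server: the tool name `compile_typst` is hypothetical, and a production server would use an actual MCP SDK. The `typst compile --input key=value` flag is real Typst CLI behavior (values surface as `sys.inputs.key` inside the template), though you should verify it against your installed Typst version.

```python
import json
import shlex

# Conceptual sketch of one tool a Typst MCP server might expose.
# The agent supplies a template path and a data dict; the server turns
# that into a `typst compile` invocation rather than raw source code.

def build_compile_command(source_path: str, output_path: str, data: dict) -> list[str]:
    """Build a `typst compile` invocation, passing data via --input flags."""
    cmd = ["typst", "compile", source_path, output_path]
    for key, value in data.items():
        # Each --input key=value is readable as sys.inputs.key in the template.
        cmd += ["--input", f"{key}={json.dumps(value)}"]
    return cmd

# A registry mapping tool names to callables, as an agent would discover them.
TOOLS = {"compile_typst": build_compile_command}

cmd = TOOLS["compile_typst"]("report.typ", "report.pdf", {"client": "Acme"})
print(shlex.join(cmd))
```

The point of the sketch is the shape of the abstraction: the agent never writes Typst syntax, it just names a template and hands over data.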
Corn
That's a cleaner abstraction than what we have today, where you'd typically prompt an LLM with 'write Typst code for a report,' and it has to generate the raw source code, errors and all. The MCP model lets the agent work at a higher level of intent: 'create a sales report for client X using the Q2 data.'
Herman
And this is where we can evaluate integration depth. Which tools are most 'agent-ready'? Typst is a frontrunner because of its design. It has a clean CLI and compiler interface: you can feed it a dictionary of variables as inputs, and its error messages are structured and human-readable. Building an MCP server for it is straightforward. LaTeX, by contrast, would need a thick wrapper layer. You'd have to build a server that hides the package management, parses the arcane log files, reruns compilations when cross-references and floats need another pass, and decides which overfull hbox warnings actually matter. It's possible, but it's fighting the system.
Corn
What about cloud-native tools? Not programmatic typesetting engines per se, but document platforms like Archbee or Mintlify that have APIs.
Herman
Those are interesting. They often have rich REST APIs for creating and updating documents. An agent could use those APIs to assemble content. But there's a key difference: they're usually WYSIWYG editors under the hood. The agent is manipulating a content tree in a proprietary system, not working with a declarative source file. You lose the reproducibility, the version control friendliness, and often the fine-grained typographic control. You're trading capability for convenience.
Corn
So for true programmatic, high-quality output, you want a dedicated typesetting engine exposed via a protocol like MCP. Which leads to the bigger question: what makes a great typesetting program great for this era? We need a feature matrix.
Herman
I've been thinking about this. Let's propose six core criteria. First, declarative, predictable output. The document should be a function of its source and data. Compile twice, get the same PDF. No random floating.
Corn
Second, programmatic control. Variables, loops, conditionals. The ability to generate content dynamically, not just static text.
Herman
Third, strong data ingestion. Native support for pulling in JSON, YAML, CSV. Not as an afterthought, but as a first-class citizen. The typesetting system should be a data-to-document compiler.
Corn
Fourth, headless operation. A CLI and a well-documented API. It has to run in a CI pipeline, on a server, without a GUI.
Herman
Fifth, an extensible styling system. A way to define global styles—colors, fonts, spacing—and apply them consistently across elements. This is where Typst's set and show rules excel, and where LaTeX's style packages can become a tangled mess.
Corn
And sixth, the non-negotiable: quality of output. Excellent typography, adherence to PDF standards, proper font embedding, accessibility tagging. The document has to be beautiful and professional.
Herman
That's the feature matrix. And you can score systems against it. Typst scores high on all six, though its styling system is powerful but still evolving. LaTeX scores high on output quality and programmability, but low on predictability and friendliness to headless operation. A generic HTML-to-PDF renderer scores high on headless operation, but low on programmatic control and often on typographic quality.
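The scoring exercise above can be made concrete as a small matrix. The numbers below are purely illustrative, invented to encode the qualitative judgments from the discussion (3 = strong, 1 = weak); they are not a formal benchmark.

```python
# Illustrative scores (1-3) against the six criteria discussed above.
# These numbers are invented for demonstration, not measured results.
CRITERIA = [
    "declarative_output", "programmatic_control", "data_ingestion",
    "headless_operation", "extensible_styling", "output_quality",
]

SCORES = {
    "Typst":       [3, 3, 3, 3, 2, 3],
    "LaTeX":       [1, 3, 1, 1, 1, 3],
    "HTML-to-PDF": [2, 1, 2, 3, 2, 1],
}

# Rank systems by total score, highest first.
for system, row in sorted(SCORES.items(), key=lambda kv: -sum(kv[1])):
    detail = " ".join(f"{c}={s}" for c, s in zip(CRITERIA, row))
    print(f"{system:12} total={sum(row):2}  {detail}")
```

Even with toy numbers, the structure is useful: writing the matrix down forces you to state which criteria you are actually weighting when you pick a tool.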
Corn
Let's make this concrete with a case study. Imagine an AI agent using a Typst MCP server to generate a personalized sales proposal. It pulls client data from a CRM, product specs from a database, and pricing from a live quote engine. It needs to apply strict brand guidelines: specific fonts, colors, logo placement.
Herman
With a Typst MCP setup, the agent fetches the 'proposal_template' resource, which defines the structure and the show rules for branding. It injects the client data as a JSON object into the template. It calls the 'compile' tool. The Typst engine handles the typesetting, ensuring the brand rules are followed automatically—the logo is in the right place, the headers are the correct blue, the fonts are embedded. The output is a print-ready PDF, identical in quality to one a human designer would produce, but generated in seconds.
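The agent side of that proposal pipeline might look like the sketch below. The CRM and quote-engine fetches are stubbed, and `proposal_template.typ` is a hypothetical template that would load the data file with Typst's `json()` function; only the overall shape of the flow is the point.

```python
import json
from pathlib import Path

# Sketch of the agent-side proposal pipeline from the case study.
# fetch_client_data is a stub standing in for real CRM / pricing calls.

def fetch_client_data(client_id: str) -> dict:
    return {"name": "Acme Corp", "quarter": "Q2", "total": 48200}

def generate_proposal(client_id: str, template: str = "proposal_template.typ") -> list[str]:
    data = fetch_client_data(client_id)
    # Write the merged data where the template can read it:
    # inside the template, `#let data = json("proposal_data.json")`.
    Path("proposal_data.json").write_text(json.dumps(data))
    # In production this command would be run with subprocess.run(cmd, check=True).
    cmd = ["typst", "compile", template, f"proposal_{client_id}.pdf"]
    return cmd

print(generate_proposal("acme-001"))
```

Notice where the branding lives: nowhere in this script. The template's set and show rules own the fonts, colors, and logo placement, so the agent only ever handles content and data.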
Corn
Compare that to an alternative: using a generic 'generate PDF' API, like what you'd get from a headless browser rendering an HTML page. The agent would have to generate the entire HTML and CSS from scratch, or use a fragile template. The typography will be limited to web fonts, the page breaks might be awkward, the print margins might be wrong. The difference is between a typeset document and a webpage printed to PDF.
Herman
That distinction is everything for professional use. A contract, a data sheet, an annual report—these have to withstand scrutiny. Poor kerning, rivers of white space, widowed lines… they scream 'automated' in the worst way. A great typesetting program eliminates those issues by design.
Corn
This also ties into the cloud versus local tooling debate. The promise of cloud services is obvious: no installation, real-time collaboration, managed updates. Overleaf is the classic example for LaTeX. Typst has its own cloud editor. For a team working on a document together, that's powerful.
Herman
But for automated, agent-driven workflows, local tooling often wins. Control, privacy, offline capability, and no latency to an external service. You want the typesetting engine running in your own environment, close to your data. The sweet spot is a tool that offers both: a cloud option for collaboration and a local CLI/API for automation. Typst does that. You can use the web app to design a template, then download the .typ file and compile it locally in your pipeline.
Corn
And that's where the ecosystem lock-in risk appears. With a cloud-only service, your documents are trapped in their format. With a local tool like Typst or LaTeX, your source files are plain text. They're future-proof. You can switch rendering engines if you need to, or archive them for decades.
Herman
That's a profound advantage. The document as a compiled artifact from a plain text source is a powerful, resilient model. It's why LaTeX has lasted for four decades. It's why Typst is built on the same principle. The cloud is a convenient front-end, not a prison.
Corn
So the implication is clear. If you're building systems that need to generate high-quality documents at scale—whether it's personalized reports, automated invoices, or dynamic manuals—your foundation should be a programmatic typesetting engine that scores high on our six criteria. The AI agent is the composer, but the typesetter is the orchestra that actually produces the music.
Herman
And the MCP layer is the conductor's score, telling the composer what instruments are available and how to write for them. Without that structured protocol, you're just hoping the AI guesses the right syntax and the compiler doesn't choke. With it, you have a reliable, industrial-grade document factory.
Corn
Right, a reliable factory. So for someone listening who has a stack of Word templates and a dream of automation, what's the practical takeaway? Do they start with Typst if they're going greenfield?
Herman
If you control the stack and you're starting a new project, yes. The learning curve is demonstrably shallower. You can go from zero to a functional, styled document in an afternoon, not a week. And the integration story is more modern—that JSON API we mentioned, the clear error formats. It's built for pipelines.
Corn
But the counter-argument is always ecosystem. What if you need a specific LaTeX package for, I don't know, typesetting chess notation or phylogenetic trees?
Herman
Then you use LaTeX. Or you write the Typst package yourself, which is easier than writing a LaTeX package, but it's still work. The actionable insight isn't 'abandon LaTeX everywhere.' It's that for the vast middle—reports, proposals, letters, manuals, data sheets—Typst is now the better default. Its package ecosystem is growing fast. And for truly novel needs, you can always drop back to LaTeX for that one component and include it as a rendered image. The point is to stop assuming LaTeX is the only serious tool.
Corn
The second insight, on the AI front, is more architectural. To make your documents agent-ready, you need to expose the typesetting engine through a structured protocol like MCP, not just a chat window.
Herman
That's the key. Giving an AI a prompt that says 'write Typst code' is like giving a human a blank text editor and the TeXbook. What you want is to give the AI a set of tools: 'use the Q3 report template,' 'fill in this data table,' 'compile with the brand theme.' The MCP model forces you to think in terms of capabilities, not just syntax. It turns the typesetter from a compiler into a service.
Corn
So what can a listener actually do this week? You're not saying rewrite your entire documentation suite.
Herman
No. First, audit your document pipelines. Look for any recurring document that a human touches purely for formatting. The monthly status report that someone copies last month's and swaps out numbers. The invoice that gets generated from data but then adjusted in Acrobat. Those are your candidates.
Corn
Pick one. The ugliest, most manual one. Prototype converting it to a programmatic system. Use Typst, or Quarto, or even Pandoc with a good CSS theme. The goal isn't perfection. It's to see what breaks, and what you gain. You'll likely find that the hardest part isn't the typesetting—it's cleaning and structuring your input data. Which is a valuable discovery all by itself.
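A first prototype of that conversion can be tiny. The sketch below turns a CSV of line items into a Typst source string with a couple of set rules; the font name and margins are placeholder brand values, and the CSV is inline only to keep the example self-contained.

```python
import csv
import io

# Minimal sketch of "pick one manual report and make it programmatic":
# render a CSV of line items as a Typst source string with set rules.
CSV_DATA = "item,qty,price\nWidget,4,9.50\nGadget,2,24.00\n"

def rows_to_typst(csv_text: str) -> str:
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    header = (
        '#set text(font: "Inter", size: 10pt)\n'  # placeholder brand font
        '#set page(margin: 2cm)\n'
        '= Monthly Status Report\n'
    )
    # Each row becomes three table cells: [item], [qty], [price].
    cells = ", ".join(f'[{r["item"]}], [{r["qty"]}], [{r["price"]}]' for r in rows)
    table = f'#table(columns: 3, [*Item*], [*Qty*], [*Price*], {cells})\n'
    return header + table

print(rows_to_typst(CSV_DATA))
```

Compile the printed source with `typst compile` and you have the skeleton of the pipeline; the real work, as noted above, is usually getting the input data this clean in the first place.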
Herman
And that process teaches you the criteria. You'll quickly learn why 'predictable output' matters when your figure suddenly jumps to page four and ruins a batch of five hundred generated reports. You'll appreciate 'clear errors' when your CI job fails at three a.m. and the log tells you 'missing closing brace on line twenty-two' versus 'undefined control sequence \fig@capsule.'
Corn
The final recommendation, then, is that the 'great' typesetting tool is one that balances expressiveness with constraint. It gives you enough control to make something beautiful, but enough structure and rules that an automated process can reliably produce that beauty every single time. LaTeX is all expressiveness, with constraint only through extreme expertise. A WYSIWYG editor is all constraint, with little expressiveness for automation. The sweet spot is in the middle.
Herman
And that's where the modern tools are aiming. Typst, Quarto, even newer systems like SILE. They provide a box of high-level, predictable tools. You can build a beautiful house with them, and you can teach a machine to build the same beautiful house. That's the shift. The document as a compiled artifact, where the source code is logic and data, and the compiler enforces the style. That's the superpower.
Corn
That superpower—the document as a compiled artifact—raises the open question hanging over this whole discussion. Where does the killer app come from? Is it going to be something built from the ground up to be AI-native, or is AI just going to be another client, another user, of these established tools like Typst?
Herman
I think we'll see both, but the real leverage is in the MCP layer connecting them. We might end up with a split in the market. On one side, you have 'designer' typesetting tools—think Figma for documents, where the focus is on visual layout and collaboration. On the other, you have 'engineer' tools like Typst and LaTeX, where the focus is on programmatic generation and precision. The synergy happens when an AI agent can use MCP to pull resources from the designer tool—a style guide, a component library—and feed them into the engineer tool to compile a thousand perfect instances.
Corn
The document as a compiled artifact is a powerful idea that's been around since TeX. But it feels like its time has finally come, precisely because of AI. Mastering the tools that treat documents that way—as a function of logic and data—is a genuine superpower now. It turns document generation from a chore into a competitive advantage.
Herman
That's the final pitch. Don't think of this as swapping one markup language for another. Think of it as upgrading your entire philosophy of what a document is. When you do that, the tools choose themselves.
Corn
And on that note, we have to thank our producer, Hilbert Flumingtop, for keeping the whole operation running. Our thanks as always to Modal for providing the serverless GPU muscle that powers our pipeline—different workload, same reliable infrastructure.
Herman
If you got something out of this deep dive, leave us a review wherever you listen. It makes a real difference. For the full archive of over two thousand episodes, visit myweirdprompts.com.
Corn
This has been My Weird Prompts. I'm Corn.
Herman
And I'm Herman Poppleberry. Until next time.

This episode was generated with AI assistance. Hosts Herman and Corn are AI personalities.